Hey there, maybe someone knows what's up here. I'm trying to use my own dataset with the TensorFlow Object Detection API and get the following error:
Traceback (most recent call last):
  File "D:\Work\Python Stuff\models-master\research\object_detection\model_main.py", line 109, in <module>
    tf.app.run()
  File "D:\Work\Anaconda\envs\vehicle-detection\lib\site-packages\tensorflow\python\platform\app.py", line 125, in run
    _sys.exit(main(argv))
  File "D:\Work\Python Stuff\models-master\research\object_detection\model_main.py", line 71, in main
    FLAGS.sample_1_of_n_eval_on_train_examples))
  File "D:\Work\Anaconda\envs\vehicle-detection\lib\site-packages\object_detection-0.1-py3.5.egg\object_detection\model_lib.py", line 589, in create_estimator_and_inputs
    model_config=model_config, predict_input_config=eval_input_configs)
IndexError: list index out of range
I'm not sure what is wrong here.
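If it helps, here is a minimal sketch of what I suspect the failure mode is (an assumption on my part, since it depends on the pipeline config): model_lib indexes the first eval input config, so if the pipeline config has no eval_input_reader block, the list of eval input configs is empty and indexing it raises exactly this error:

```python
# Minimal reproduction of the suspected failure mode (assumption: the
# pipeline config is missing its eval_input_reader block, so model_lib
# receives an empty list of eval input configs).
eval_input_configs = []          # what an absent eval_input_reader would yield
try:
    eval_input_configs[0]        # model_lib indexes the first eval config
except IndexError as exc:
    print("IndexError:", exc)    # IndexError: list index out of range
```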
Hi guys, I'm currently attempting to use TFLite from C++ to load an existing .pb file. After compiling successfully with bazel, I encounter this error at the following code:
tflite::MutableOpResolver resolver;
std::unique_ptr<tflite::Interpreter> interpreter;
tflite::InterpreterBuilder(*model, resolver)(&interpreter);
Didn't find op for builtin opcode 'ADD' version '1' Registration failed.
What does this mean, and how can I resolve it? I checked whether I had included all the necessary header files, but to no avail so far.
I'm loading the model with FlatBufferModel::BuildFromFile, if that matters to anyone.
flags_dict = FLAGS._flags()
keys_list = [keys for keys in flags_dict]
for keys in keys_list:
    FLAGS.__delattr__(keys)  # clear stale flag definitions before redefining them below
flags.DEFINE_float("learning_rate", default=0.0001, help="Initial learning rate.")
flags.DEFINE_integer("epochs", default=700, help="Number of epochs to train for.")
flags.DEFINE_integer("batch_size", default=128, help="Batch size.")
flags.DEFINE_integer("eval_freq", default=400, help="Frequency at which to validate the model.")
flags.DEFINE_float("kernel_posterior_scale_mean", default=-0.9, help="Initial kernel posterior mean of the scale (log var) for q(w).")
flags.DEFINE_float("kernel_posterior_scale_constraint", default=0.2, help="Posterior kernel constraint for the scale (log var) for q(w).")
flags.DEFINE_float("kl_annealing", default=50, help="Epochs to anneal the KL term (anneals from 0 to 1).")
flags.DEFINE_integer("num_hidden_layers", default=4, help="Number of hidden layers.")
# NOTE: the flag name below was truncated in the original paste; "num_monte_carlo" is inferred from the matching TFP example.
flags.DEFINE_integer("num_monte_carlo", default=50, help="Network draws to compute predictive probabilities.")
tf.compat.v1.app.flags.DEFINE_string('f', '', 'kernel')
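The loop over FLAGS._flags() above is, I presume, the standard notebook workaround: delete every previously defined flag so re-running the cell doesn't raise a DuplicateFlagError. Here is a stdlib-only sketch of that delete-then-redefine pattern, using a dict-backed stand-in class so it runs without absl installed (the real object is absl.flags.FLAGS, and the real deletion call is FLAGS.__delattr__):

```python
class FakeFlags:
    """Dict-backed stand-in for absl.flags.FLAGS (illustration only)."""
    def __init__(self):
        self._store = {}

    def _flags(self):
        # Like FLAGS._flags(): a mapping of flag name -> flag.
        return dict(self._store)

    def define_float(self, name, default, help):
        if name in self._store:
            raise ValueError(f"duplicate flag: {name!r}")
        self._store[name] = default

    def delete(self, name):
        # Stands in for FLAGS.__delattr__(name).
        del self._store[name]


FLAGS = FakeFlags()
FLAGS.define_float("learning_rate", default=0.0001, help="Initial learning rate.")

# Re-running the cell: clear every existing flag first, then redefine.
for key in list(FLAGS._flags()):
    FLAGS.delete(key)
FLAGS.define_float("learning_rate", default=0.0001, help="Initial learning rate.")
print(sorted(FLAGS._flags()))  # ['learning_rate']
```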
Hello everyone - I'm trying to generate a SavedModel that does some initialization work after loading a graph. When using the tf.saved_model.Builder class, I can pass an operation to the init_op parameter for this purpose.
Some of the initialization I would like to happen is for some tf.contrib.lookup.HashTable objects to be initialized. I know I can get these objects' initializer operations by invoking table_var.initializer. What I would additionally like to do, however, is, after initializing these tables, add them to a collection via tf.add_to_collection(...). I want to do this because I want to be able to access a table from some other point in the same session/graph, but at that point I won't have a Python reference to it, so my solution was to use the collection to store the object and get it back by its key. As far as I can tell, though, tf.add_to_collection(...) isn't an operation, and even if it were, I don't know how I could create an operation that is the sequence of that function call and the table initializer. I know tf.control_dependencies is a thing, but that returns a context manager, not an op, which still only gives me the ability to call tf.add_to_collection(...) eagerly again...
Ideally, I would wish something like the following would work:
table = tf.contrib.lookup.HashTable(
    initializer=tf.contrib.lookup.TextFileIdTableInitializer(
        filename='file.csv',
        key_column_index=0,
        value_column_index=1,
        delimiter=','),
    name='my_table',
    default_value=-1)
init_op = tf.Operation(
    node_def=lambda: tf.add_to_collection(table.name, table),
    g=tf.get_default_graph(),
    control_inputs=table.initializer)

...

# Elsewhere in the universe
key = '1'
with tf.Session() as sess:
    # This is only for demonstration - I want init_op to be an actual
    # tf.Operation because I want to pass it to the init_op parameter of
    # the tf.saved_model.Builder class's __init__ method...
    sess.run(init_op)
    [table_handle] = tf.get_collection('my_table')
    value = table_handle.lookup(tf.constant(value=key))
    sess.run(value)
Does anyone know a way I might be able to achieve the above?
Alternatively, when resources like tables are initialized, are they automatically placed somewhere I could retrieve them, and if so, how? I think the table initializer itself is stored in tf.GraphKeys.TABLE_INITIALIZERS, but I don't want to initialize the table in the "elsewhere" code; I just want to fetch the existing, initialized table...
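To illustrate what I mean about collections being graph-construction-time state, here is a stdlib-only sketch (stand-in classes of my own, not real TensorFlow code): a collection is just a Python-side registry on the graph object, mutated immediately when tf.add_to_collection is called, while ops execute later at session.run time - which is why I can't see how to defer the call into an op:

```python
# Stand-in for a TF graph's collection registry (illustration only).
class Graph:
    def __init__(self):
        self._collections = {}  # collection name -> list of Python handles

    def add_to_collection(self, name, value):
        # Mutates the registry right now, in Python; no graph op is created.
        self._collections.setdefault(name, []).append(value)

    def get_collection(self, name):
        return list(self._collections.get(name, []))


graph = Graph()
table_handle = object()  # stand-in for the HashTable's Python handle
graph.add_to_collection("my_table", table_handle)  # happens eagerly

# Later, "elsewhere in the universe", the handle comes back by key:
[fetched] = graph.get_collection("my_table")
print(fetched is table_handle)  # True
```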