These are chat archives for arita37/tensorflow

30th
Nov 2016
serteckian
@serteckian
Nov 30 2016 15:01
Hi, one basic question from a newbie, please
I am training a network and it works great, but after some time being stable at the maximum performance, it goes crazy.
I want to store the best network parameters in a new variable, i.e.,
if current_performance > best_performance:
    best_params = current_params
how can I copy the parameters?
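To make it concrete, something like this is what I have in mind (just a sketch, assuming I already have a session sess open and the performance numbers in plain Python variables):

import tensorflow as tf

# ... network already built above ...
params = tf.trainable_variables()

# inside the training loop, with the session sess open:
if current_performance > best_performance:
    best_performance = current_performance
    # I guess sess.run on the variables returns plain numpy arrays (real value copies)?
    best_params = sess.run(params)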
serteckian
@serteckian
Nov 30 2016 15:36
solved! :-)
serteckian
@serteckian
Nov 30 2016 16:06

oh! nope... I have created another network:
best_inputs, best_out, best_scaled_out = create_network()
best_network_params = tf.trainable_variables()

Then I create this op:
update_best_network_params = [best_network_params[i].assign(original_network_params[i])
                              for i in range(len(original_network_params))]

Later in my program I invoke:
actor.update_best_network()

And the problem is that it seems to be assigning a reference rather than making a deep copy of the values.
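Maybe what I should be doing is something like this instead (just a sketch with my own create_network helper; I am assuming the problem is that tf.trainable_variables() now contains the variables of both networks, so the list has to be sliced, and that the assign ops only copy values when they are actually run):

import tensorflow as tf

# original ("live") network first
inputs, out, scaled_out = create_network()            # my helper
original_network_params = tf.trainable_variables()    # only the live network's variables so far

# snapshot network second
best_inputs, best_out, best_scaled_out = create_network()
# tf.trainable_variables() now returns the variables of BOTH networks,
# so keep only the ones created by the second call
best_network_params = tf.trainable_variables()[len(original_network_params):]

# one assign op per matching pair of variables
update_best_network_params = [best.assign(orig)
                              for best, orig in zip(best_network_params, original_network_params)]

# later, inside the session: the values are copied only when the ops are run
sess.run(update_best_network_params)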

serteckian
@serteckian
Nov 30 2016 16:29
Learning a new library/language by imitation, I feel like a monkey writing a book by hitting random keys! :-D
serteckian
@serteckian
Nov 30 2016 17:41
I am struggling to duplicate a network when some condition is met (e.g., performance > threshold) in order to get a snapshot. This way, even if the training gets worse, I can recover an intermediate snapshot of the network. Please, any idea how to achieve this?
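The only other thing I can think of is saving a checkpoint with tf.train.Saver whenever the condition is met and restoring it later, roughly like this (just a sketch; train_one_step, num_steps and the checkpoint path are made-up placeholders):

import tensorflow as tf

saver = tf.train.Saver(max_to_keep=1)   # keep only the single best checkpoint

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    best_performance = float('-inf')
    for step in range(num_steps):
        performance = train_one_step(sess)          # hypothetical training helper
        if performance > best_performance:
            best_performance = performance
            saver.save(sess, './best_model.ckpt')   # snapshot the current variables to disk
    # if training degrades later, roll back to the best snapshot
    saver.restore(sess, './best_model.ckpt')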