These are chat archives for deeplearning4j/deeplearning4j

Sep 2018
Alex Black
Sep 26 2018 00:07
@ioannisbaptista I've replied in your issue

@C4N4D4M4N I think that's a bad idea, but here's how you'd do it for ComputationGraph:

((OutputLayer)cg.getOutputLayer(0).conf().getLayer()).setLossFn(new LossMSE());

MultiLayerNetwork is the same, but using getOutputLayer() instead (i.e., no index arg)
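For context, a hedged sketch of both variants of the snippet above; it assumes the DL4J classes named in the original message, and `cg` / `net` are hypothetical, already-initialized networks (this is an API usage fragment, not a runnable program):

```java
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.lossfunctions.impl.LossMSE;

// ComputationGraph: the index selects which output layer to modify
((OutputLayer) cg.getOutputLayer(0).conf().getLayer()).setLossFn(new LossMSE());

// MultiLayerNetwork: single output layer, so getOutputLayer() takes no index
((OutputLayer) net.getOutputLayer().conf().getLayer()).setLossFn(new LossMSE());
```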

Alex Black
Sep 26 2018 00:15
@gordoncaleb no fix yet (no full auto-broadcasting implemented yet), but I've replied in the issue with an option/workaround
João Batista
Sep 26 2018 00:28
@AlexDBlack , just replied
Jiří Vahala
Sep 26 2018 00:49
Can I somehow print the topology of my model? It really doesn't work for me, but I can't find the error
João Batista
Sep 26 2018 01:17
@AlexDBlack , new post :smile:
Alex Black
Sep 26 2018 01:48
@snurkabill MultiLayerNetwork/ComputationGraph have .summary() methods (and .summary(InputType))
which show the layers, input/output sizes, and part of the configuration
Jiří Vahala
Sep 26 2018 02:19
@AlexDBlack thank you sir
Sotiria Bampatzani
Sep 26 2018 09:37
Hello all,
I've been getting a "wrong"(?) confusion matrix... I'm very sorry in advance if I'm not being very clear. By adding up all the predictions in a column, we're supposed to get the size of the class, right? I mean, all of my classes in /test have 113975 examples each, so each column in the confusion matrix should add up to 113975?
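(Editor's note: a minimal illustration of the usual convention, assuming rows = actual class and columns = predicted class, as DL4J's Evaluation prints it. Under that convention each ROW sums to the number of test examples of that class; the COLUMN sums depend on the model's predictions, so they need not match the class size.)

```java
public class ConfusionSums {
    // Sum of row r: total actual examples of class r
    static int rowSum(int[][] cm, int r) {
        int s = 0;
        for (int v : cm[r]) s += v;
        return s;
    }

    // Sum of column c: total times class c was predicted
    static int colSum(int[][] cm, int c) {
        int s = 0;
        for (int[] row : cm) s += row[c];
        return s;
    }

    public static void main(String[] args) {
        // Made-up 2-class example with 10 actual examples of each class
        int[][] cm = {
            {8, 2},   // actual class 0: 8 correct, 2 predicted as class 1
            {3, 7}    // actual class 1: 3 predicted as class 0, 7 correct
        };
        System.out.println(rowSum(cm, 0)); // 10 (actual count of class 0)
        System.out.println(colSum(cm, 0)); // 11 (times class 0 was predicted)
    }
}
```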
Sep 26 2018 10:58
Hi @AlexDBlack, a quick question on OptimizationAlgorithms: I see in the model JSONs that they are included once per layer and once globally for the net/comp graph, but from the builder you can only set one globally for the whole net. Is it not possible to set them per layer too? Thanks!
Raman Gupta
Sep 26 2018 14:44
The dl4j site has changed a lot -- I'm trying to find that table that used to have a list of the DL architectures supported by dl4j, with some suggested uses for each. Does that still exist?
Sep 26 2018 19:13
One thing I've noticed a lot of people do with their neural nets is that they have hidden layers that gradually decrease in size: input = 128, hidden = 64, 32, 16, 8, 4, 2, output = 1
is there any actual usefulness to this, or is it just a gimmick?
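(Editor's note: a hedged sketch of the "funnel" layout described above, using DL4J's builder API as of the beta releases from this period; the activation functions and loss are made-up placeholders, not anything from the conversation.)

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Hidden layers halve in size from 64 down to 2, ending in a single output
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .list()
    .layer(new DenseLayer.Builder().nIn(128).nOut(64).activation(Activation.RELU).build())
    .layer(new DenseLayer.Builder().nIn(64).nOut(32).activation(Activation.RELU).build())
    .layer(new DenseLayer.Builder().nIn(32).nOut(16).activation(Activation.RELU).build())
    .layer(new DenseLayer.Builder().nIn(16).nOut(8).activation(Activation.RELU).build())
    .layer(new DenseLayer.Builder().nIn(8).nOut(4).activation(Activation.RELU).build())
    .layer(new DenseLayer.Builder().nIn(4).nOut(2).activation(Activation.RELU).build())
    .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
        .nIn(2).nOut(1).activation(Activation.IDENTITY).build())
    .build();
```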
Sep 26 2018 20:50
@rocketraman do you mean this?