These are chat archives for deeplearning4j/deeplearning4j/earlyadopters

Jun 2016
Alex Black
Jun 19 2016 00:39
@crockpotveggies not sure if you are still looking at Spark or not; just a heads up that this exists: deeplearning4j/deeplearning4j#1713 deeplearning4j/dl4j-spark-cdh5-examples#19
Justin Long
Jun 19 2016 00:49
@AlexDBlack oh snap! Yea I've been looking at it but met with one of the Spark committers to figure out a good angle of attack. Glad to see this though nice work! Anything I can do to help test?
Alex Black
Jun 19 2016 00:54
so far it's basically refactoring there and some slight improvements to parameter averaging
couple of new features (one being collecting timing/stats info - should help with debugging/tuning)
also added in data prefetching, and better control over averaging frequencies etc
main thing was to get the design better so we can add other approaches later (async stuff for updates and the like)
at this point it all runs and seems to be correct
soon I'll be looking into performance issues a bit more - i.e., analyzing bottlenecks etc
so that's definitely something you could help out with if you are up for it
I'll also DM you some docs/notes on things
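For readers following along: the parameter averaging Alex mentions can be sketched very roughly as follows. This is a hypothetical, framework-free illustration of the general idea (each Spark worker trains a local copy of the model, and the driver periodically averages the workers' parameter vectors element-wise), not DL4J's actual implementation.

```java
// Hypothetical sketch of synchronous parameter averaging (not DL4J's actual code).
// Each worker trains locally for some number of minibatches; the driver then
// averages all parameter copies element-wise and broadcasts the result back.
public class ParameterAveraging {

    // Average the workers' parameter vectors element-wise.
    static double[] average(double[][] workerParams) {
        int n = workerParams[0].length;
        double[] avg = new double[n];
        for (double[] params : workerParams) {
            for (int i = 0; i < n; i++) {
                avg[i] += params[i] / workerParams.length;
            }
        }
        return avg;
    }

    public static void main(String[] args) {
        double[][] workers = {
            {1.0, 2.0, 3.0},   // parameters after local training on worker 0
            {3.0, 4.0, 5.0}    // parameters after local training on worker 1
        };
        double[] averaged = average(workers);
        System.out.println(java.util.Arrays.toString(averaged)); // [2.0, 3.0, 4.0]
    }
}
```

The "averaging frequency" mentioned above corresponds to how many minibatches each worker processes before an averaging step; averaging less often reduces network overhead at the cost of letting local copies drift further apart.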
Alex Black
Jun 19 2016 01:17
I should also note that I've only got to MultiLayerNetwork so far - going to do ComputationGraph today
vjanand
Jun 19 2016 04:15
how do I handle it if one of my inputs is itself a vector? For example, if I have 3 inputs and the second input is a vector, like [0.2, [1,0,0,1], 0.003] - the second vector may represent all the web sites one visited during a time period
Alex Black
Jun 19 2016 05:03
@vjanand just concatenate them into one feature vector
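Alex's suggestion can be sketched with plain arrays (a hypothetical illustration; in DL4J you would build the feature array the same way before wrapping it in an INDArray): flatten the nested vector and concatenate everything into a single flat feature vector.

```java
// Hypothetical sketch: flatten [0.2, [1,0,0,1], 0.003] into one feature vector.
public class FeatureConcat {

    // Concatenate a leading scalar, a vector, and a trailing scalar
    // into a single flat feature vector.
    static double[] concat(double scalarA, double[] vector, double scalarB) {
        double[] features = new double[vector.length + 2];
        features[0] = scalarA;                                   // first scalar input
        System.arraycopy(vector, 0, features, 1, vector.length); // site-visit vector
        features[features.length - 1] = scalarB;                 // last scalar input
        return features;
    }

    public static void main(String[] args) {
        double[] features = concat(0.2, new double[]{1, 0, 0, 1}, 0.003);
        System.out.println(java.util.Arrays.toString(features));
        // [0.2, 1.0, 0.0, 0.0, 1.0, 0.003]
    }
}
```

The network then sees a single 6-element input vector; the fact that elements 1-4 came from a nested vector is irrelevant to the model.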
vjanand
Jun 19 2016 17:24
what is the default activation? - if we do not specify the activation during the configuration of the NN, what is the default used?
Adam Gibson
Jun 19 2016 17:29
@vjanand Could you please move to the tuninghelp channel?