These are chat archives for FreeCodeCamp/DataScience
discussion on how we can use statistical methods to measure and improve the efficacy of http://freeCodeCamp.com
Do you think these machines would be good for OpenNMT and any extensive machine learning work? I think the price is affordable. What do you think?
@mzeidhassan Not an expert, but I would buy one if I could, for sure, especially if you prefer to use your own workstation instead of the cloud.
I am not sure they will handle OpenNMT well, though; I guess it depends. They have just two Nvidia Quadro GPUs (although 48 cores in the Z6!), which is not bad but might not be enough for big models. I would suggest going to the OpenNMT forum and asking there.
REALLY interesting discussions in the Deep Learning course of Andrew Ng.
Here is the question:
Found an interesting post where the author suggests learning the fundamentals and how to apply them rather than just learning the tools (TensorFlow, Hadoop, Theano). I kind of agree. You can find the article here.
What's your opinion? Will doing this course make us future-proof for the next 2-3 years?
Here is an answer by one of the students commenting on the article:
What the article says is that deep learning and its affiliated algorithms rely on a relatively simple technique--gradient descent--that's a classic method in statistics. What the article claims is that in the future other statistical methods will be rediscovered and applied to develop new machine learning techniques.
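Since gradient descent is the method being singled out here, a minimal sketch may help; this fits a one-parameter least-squares line, with the data, learning rate, and variable names being arbitrary toy choices of mine:

```python
# Minimal gradient descent sketch: learn w so that y ~= w * x,
# by minimizing mean squared error. Toy data with true w = 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

w = 0.0    # parameter to learn
lr = 0.01  # learning rate (arbitrary choice)

for _ in range(1000):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step against the gradient

print(round(w, 3))  # converges toward 2.0
```

The same loop, with the scalar `w` replaced by weight matrices and the gradient computed by backpropagation, is essentially what deep learning frameworks automate.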
The article also lists those classic methods:
the EM algorithm
unsupervised learning with linear Gaussian systems
slow feature analysis
Aapo Hyvarinen's work on ICA, pseudolikelihood.
this seminal deep belief network paper
So, yes, I agree: if you want to develop new Machine Learning techniques, it makes sense to become more knowledgeable in statistics. But Machine Learning is not just statistical modeling, and I guess one of the novelties of Machine Learning with respect to statistics is that it provides a new framework for using statistics (I leave this vague because I'm not an expert).
You ask "Will doing this course make us future proof for next 2-3 years?"
This is an introductory course that should allow us to use and configure deep neural networks in an informed way, so it should definitely make us present-proof. Not sure about the next 2-3 years, because who knows how fast ML technologies will evolve.
I already finished the videos for the first course of the Deep Learning Specialization (all 4 weeks, at 1.75x speed with some exceptions), finished some exams, and am now going through the Programming Assignments.
Guess... implementing NNs from scratch.
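The "from scratch" part looks roughly like this: a one-hidden-layer network trained with plain backpropagation. This is my own toy sketch (XOR data, layer sizes, learning rate, and iteration count are all arbitrary choices, not the assignment's):

```python
import numpy as np

# Toy "NN from scratch": one hidden layer, sigmoid activations,
# trained by hand-derived backpropagation on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 4))  # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))  # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: gradients of squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel()))  # typically recovers the XOR pattern 0, 1, 1, 0
```

The course assignments do the same kind of thing but build it up step by step (initialization, forward propagation, cost, backward propagation, update).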
If you have done Coursera courses before, you will find some innovations: before, you had to prepare the assignment on your own machine. Now you open a Notebook in the cloud, complete the assignment, and submit that Notebook, and you get evaluated right there, very much like working in a Databricks account. Smooth, and no need to move between platforms.
This is THE course, IMO.
@erictleung - you were the one who mentioned the course here in DSR; not sure if you are interested.
I would say this course, combined with the one by Google on Udacity.
In that course you are absolutely on your own, very much like taking the fCC web development course: you have to come up with the solutions without any advice except from the community.
I didn't really finish it all, but I learned a couple of things. Planning to finish this one by Andrew Ng and then revisit the other one soon after.
shobhit1610 sends brownie points to @evaristoc :sparkles: :thumbsup: :sparkles:
mzeidhassan sends brownie points to @mstellaluna :sparkles: :thumbsup: :sparkles: