These are chat archives for FreeCodeCamp/DataScience

25th Nov 2018
Eric Leung
@erictleung
Nov 25 2018 10:30

Just doing some casual reading on cross-entropy and found these two links useful for understanding it: https://stackoverflow.com/a/41990932/ (quick answer) and http://rdipietro.github.io/friendly-intro-to-cross-entropy-loss/ (longer-form answer).

In sum, cross-entropy is another way to quantify the difference between two probability distributions, which makes it useful as a cost function in machine learning algorithms from an information-theoretic perspective.
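A minimal sketch of the idea (my own illustration, not from the linked answers): cross-entropy H(p, q) = -Σ p_i log(q_i) penalizes a predicted distribution q more heavily the further it strays from the true distribution p, which is why it works as a classification loss.

```python
import numpy as np

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i).

    p: true distribution (e.g. a one-hot label vector)
    q: predicted distribution (all entries must be > 0)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

# Hypothetical example: true class is index 1 (one-hot p),
# compared against a good and a bad set of predicted probabilities.
p = [0.0, 1.0, 0.0]
q_good = [0.1, 0.8, 0.1]  # most mass on the correct class
q_bad = [0.6, 0.2, 0.2]   # most mass on a wrong class

print(cross_entropy(p, q_good))  # lower loss
print(cross_entropy(p, q_bad))   # higher loss
```

Note the design choice: because p is one-hot, only the log-probability assigned to the true class contributes, so minimizing this loss pushes the model to assign high probability to the correct label.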

Mirkan Çalışkan
@mirkancal
Nov 25 2018 15:20
@honmanyau send me an example!