These are chat archives for FreeCodeCamp/DataScience

Nov 2018
Eric Leung
Nov 25 2018 10:30

Just doing some casual reading on cross-entropy and found these two links useful for understanding it (quick answer) and (more long form answer).

In sum, cross-entropy is another way to quantify the difference between two probability distributions, which makes it useful as a cost function in machine learning algorithms from an information-theoretic perspective.
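To make that concrete, here is a minimal sketch of discrete cross-entropy in plain Python (the function name and example distributions are my own, not from the linked answers). Cross-entropy H(p, q) is lowest when the predicted distribution q matches the true distribution p, where it equals the entropy of p — which is why minimizing it as a loss pushes predictions toward the true distribution:

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i), in nats.

    p: the true distribution, q: the predicted distribution.
    Terms with p_i == 0 contribute nothing and are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]

# Perfect prediction: cross-entropy equals the entropy of p (ln 2 here).
print(cross_entropy(p, p))

# A mismatched prediction incurs a strictly larger value.
print(cross_entropy(p, [0.9, 0.1]))
```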

Mirkan Çalışkan
Nov 25 2018 15:20
@honmanyau send me an example!