
Cross entropy

A concept from information theory, commonly used in machine learning, particularly in classification problems. It measures the difference between two probability distributions over the same random variable or set of events, and often serves as a loss function for evaluating model performance. Minimizing cross entropy drives a model's predicted distribution toward the true label distribution, which is why it is the standard training objective for classifiers.
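As a minimal sketch of the definition above: cross entropy between a true distribution p and a predicted distribution q is H(p, q) = -Σ p(x) · log q(x). The function and variable names below are illustrative, not from any particular library.

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum(p_i * log(q_i)), in nats.

    p: true probability distribution (e.g. a one-hot label)
    q: predicted probability distribution
    Terms with p_i == 0 contribute nothing, so they are skipped.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# One-hot true label (class 1) vs. a model's predicted probabilities:
p = [0.0, 1.0, 0.0]
q = [0.1, 0.7, 0.2]
loss = cross_entropy(p, q)  # equals -log(0.7), about 0.357
```

With a one-hot p, the sum collapses to -log of the probability assigned to the correct class, which is the familiar negative log-likelihood loss.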
Volume: 18.1K
Growth: +6%