
Commit e618611

Update binary_cross_entropy.py
1 parent 747e8cc commit e618611


machine_learning/losses/binary_cross_entropy.py

Lines changed: 4 additions & 6 deletions
@@ -1,18 +1,16 @@
 """
 Binary Cross-Entropy (BCE) Loss Function
 
-This script defines the Binary Cross-Entropy (BCE) loss function, which is commonly used for binary classification problems.
+This script defines the Binary Cross-Entropy (BCE) loss function, commonly used for binary classification.
 
 Description:
-Binary Cross-Entropy (BCE), also known as log loss or logistic loss, is a popular loss function for binary classification tasks.
-It quantifies the dissimilarity between the true binary labels (0 or 1) and the predicted probabilities produced by a model.
-Lower BCE values indicate better alignment between predicted probabilities and true labels.
+BCE quantifies dissimilarity between true binary labels (0 or 1) and predicted probabilities.
+It's widely used in binary classification tasks.
 
 Formula:
 BCE = -Σ(y_true * log(y_pred) + (1 - y_true) * log(1 - y_pred))
 
-Source:
-- [Wikipedia - Cross entropy](https://en.wikipedia.org/wiki/Cross_entropy)
+Source: [Wikipedia - Cross entropy](https://en.wikipedia.org/wiki/Cross_entropy)
 """
 
 import numpy as np
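
For context, here is a minimal sketch of how the BCE formula from the docstring could be computed with NumPy. This is an illustrative example only, not the file's actual implementation (the function body is not shown in this diff); the function name, the epsilon clipping, and the sample mean are assumptions.

# Illustrative sketch only; names and the epsilon safeguard are assumptions, not from this commit.
import numpy as np


def binary_cross_entropy(y_true: np.ndarray, y_pred: np.ndarray, epsilon: float = 1e-15) -> float:
    # Clip predicted probabilities away from 0 and 1 so log() never receives 0 (assumed safeguard).
    y_pred = np.clip(y_pred, epsilon, 1 - epsilon)
    # Per-sample loss: -(y_true * log(y_pred) + (1 - y_true) * log(1 - y_pred)),
    # matching the docstring formula before the summation.
    losses = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
    # The docstring formula sums the terms; averaging over samples is a common variant.
    return float(np.sum(losses))


# Example usage (hypothetical values):
# binary_cross_entropy(np.array([1, 0, 1]), np.array([0.9, 0.1, 0.8]))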
