Commit a423eda

Merge pull request #177 from eli-b/formatting: Formatting
2 parents: 01d0953 + b37c2a7

File tree

1 file changed: +1 −1 lines changed


Chapter5_LossFunctions/LossFunctions.ipynb (+1 −1)
@@ -61,7 +61,7 @@
 "Other popular loss functions include:\n",
 "\n",
 "- $ L( \\theta, \\hat{\\theta} ) = \\mathbb{1}_{ \\hat{\\theta} \\neq \\theta } $ is the zero-one loss often used in machine learning classification algorithms.\n",
-"- $ L( \\theta, \\hat{\\theta} ) = -\\hat{\\theta}\\log( \\theta ) - (1-\\hat{ \\theta})\\log( 1 - \\theta ), \\; \\; \\hat{\\theta} \\in {0,1}, \\; \\theta \\in [0,1]$$, called the *log-loss*, also used in machine learning. \n",
+"- $ L( \\theta, \\hat{\\theta} ) = -\\hat{\\theta}\\log( \\theta ) - (1-\\hat{ \\theta})\\log( 1 - \\theta ), \\; \\; \\hat{\\theta} \\in {0,1}, \\; \\theta \\in [0,1]$, called the *log-loss*, also used in machine learning. \n",
 "\n",
 "Historically, loss functions have been motivated from 1) mathematical convenience, and 2) they are robust to application, i.e., they are objective measures of loss. The first reason has really held back the full breadth of loss functions. With computers being agnostic to mathematical convenience, we are free to design our own loss functions, which we take full advantage of later in this Chapter.\n",
 "\n",
