Minimize model errors with cost functions
The learning process repeatedly alters a model until it can make high-quality estimates. To determine how well a model is performing, the learning process uses mathematics in the form of a cost function.
What is another name for a cost function?
Error, cost, and loss
In supervised learning, error, cost, and loss all refer to the magnitude of the mistakes that a model makes when predicting one or more labels.
These terms can be loosely categorized by usage: loss usually describes the error on a single example, cost usually describes the aggregate error across many examples, and error is an informal catch-all for both.
These three terms are used loosely in machine learning, which can cause some confusion. For the sake of simplicity, we use them interchangeably here. Cost is calculated through mathematics; it isn't a qualitative judgment.

Cost Calculation Example
If a model predicts a daily temperature of 40°F, but the actual value is 35°F, the error is 5°F: the difference between the prediction and the actual value.
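This calculation can be sketched in a couple of lines (the variable names below are illustrative, not from a particular library):

```python
# Illustrative sketch: the error is the difference between the
# predicted value and the actual value.
predicted = 40  # predicted daily temperature (°F)
actual = 35     # actual temperature (°F)

error = abs(predicted - actual)
print(error)  # → 5
```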
Goal of Cost Functions
What is a cost function?
In supervised learning, a cost function is a small piece of code that calculates cost from:

- the model's prediction (the estimate), and
- the expected label (the correct answer).
Training Process
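During training, the model makes predictions, the cost function scores how wrong they are, and an optimizer adjusts the model to lower that score. A minimal sketch of this loop, using plain gradient descent on a single weight (the setup below, one training example with squared-error cost, is an assumed illustration rather than a specific framework's API):

```python
# Assumed setup: train a single weight w so that prediction = w * x
# matches the label, using squared-error cost and gradient descent.
x, label = 2.0, 10.0   # one training example; the ideal weight is 5.0
w = 0.0                # start with an uninformed model
learning_rate = 0.01

for _ in range(1000):
    prediction = w * x
    cost = (prediction - label) ** 2         # the cost function
    gradient = 2 * (prediction - label) * x  # slope of cost w.r.t. w
    w -= learning_rate * gradient            # optimizer step: reduce cost

print(round(w, 2))  # → 5.0
```

Each pass nudges the weight in whichever direction lowers the cost, so the model gradually converges on the value that makes its predictions match the label.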
Final Considerations
During training, different cost functions can change how long training takes and how well it works. Consider these key points:

- If the cost function states that errors are small, the optimizer makes small changes.
- If the cost function returns large values for certain mistakes, the optimizer works to avoid those mistakes.
- There isn't a single best cost function.
- Which cost function is best depends on what we're trying to achieve.
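To see why the choice matters, compare two common cost functions on the same set of errors: squared error punishes the one large mistake far more heavily than absolute error does, so an optimizer using it would prioritize eliminating that mistake. (The functions below are hand-rolled illustrations, not a particular library's API.)

```python
# Illustrative comparison of two common cost functions.
def mean_absolute_error(errors):
    """Average of the absolute errors: treats all mistakes linearly."""
    return sum(abs(e) for e in errors) / len(errors)

def mean_squared_error(errors):
    """Average of the squared errors: amplifies large mistakes."""
    return sum(e ** 2 for e in errors) / len(errors)

errors = [1, 1, 10]  # two small mistakes and one large one
print(mean_absolute_error(errors))  # → 4.0
print(mean_squared_error(errors))   # → 34.0
```

Under mean absolute error the large mistake contributes 10 of the total; under mean squared error it contributes 100, dominating the cost entirely.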
Remember: We often need to experiment with cost functions to get a result we're happy with.