Mean Square Loss

The mean square error is defined as \(l = \frac{1}{n}\sum_i (y_i-\hat{y_i})^2\), where \(y_i\) is the prediction and \(\hat{y_i}\) is the corresponding ground truth. Since the loss is the last node of the network, the only derivative we need here is \(\frac{\partial l}{\partial y_i}\), the gradient with respect to the predictions. Let \(g(y_i)=y_i-\hat{y_i}\); then \(\frac{\partial g}{\partial y_i}=1\), and by the chain rule:

\[\frac{\partial l}{\partial y_i}=\frac{\partial l}{\partial g}\times \frac{\partial g}{\partial y_i}=\frac{2}{n}(y_i-\hat{y_i})\]
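As a quick numerical check of the formula, take \(n=2\), \(y=(1,3)\) and \(\hat{y}=(0,1)\):

\[l=\frac{(1-0)^2+(3-1)^2}{2}=2.5,\qquad \frac{\partial l}{\partial y}=\frac{2}{2}\bigl(1-0,\;3-1\bigr)=(1,\,2)\]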

The implementation of the mean square error loss in tinyml is shown below:

def mse_loss(predicted, ground_truth):
    '''
    Compute the mean square error loss and its gradient
    with respect to the predictions.
    '''
    # Reshape the ground truth so the element-wise difference is well defined.
    diff = predicted - ground_truth.reshape(predicted.shape)
    # Return the scalar loss and the gradient of the loss w.r.t. `predicted`.
    return (diff**2).mean(), 2 * diff / diff.shape[1]
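
To verify the gradient numerically, the analytic result returned by mse_loss can be compared against a central finite-difference estimate. The sketch below is only an illustration, not part of tinyml: it assumes a (1, n)-shaped prediction array so that dividing by diff.shape[1] matches the \(\frac{1}{n}\) factor in the formula above, and the array values are made up.

import numpy as np

def mse_loss(predicted, ground_truth):
    # Same function as the snippet above, repeated so this check runs standalone.
    diff = predicted - ground_truth.reshape(predicted.shape)
    return (diff**2).mean(), 2 * diff / diff.shape[1]

# Made-up (1, n) example values.
predicted = np.array([[0.2, -1.0, 3.5]])
ground_truth = np.array([[0.0, -1.5, 3.0]])

loss, grad = mse_loss(predicted, ground_truth)   # loss == 0.18, grad == 2/3 * diff

# Central finite-difference estimate of dl/dy_i for every prediction.
eps = 1e-6
numeric_grad = np.zeros_like(predicted)
for i in range(predicted.shape[1]):
    plus, minus = predicted.copy(), predicted.copy()
    plus[0, i] += eps
    minus[0, i] -= eps
    numeric_grad[0, i] = (mse_loss(plus, ground_truth)[0] -
                          mse_loss(minus, ground_truth)[0]) / (2 * eps)

print(np.allclose(grad, numeric_grad))   # True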