Black Swans

bdhammel

I've got a question about problem 1.2.

The analogous form of Eq. (1.122) that I get for the regularized sum-of-squares error is:

$$A_{ij}= \sum_{n=1}^N \left ( \lambda+(x_n)^{i+j} \right )$$

where

$$T_i=\sum_{j=0}^M A_{ij}w_j$$

To me, this suggests that regularization in the loss function is akin to a constant offset in $x$; intuitively, though, that doesn't make any sense...

Could someone shed some light on this, or point out the flaw in my interpretation?
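Since Eq. (1.122) comes from the polynomial curve-fitting setup, the two candidate forms of $A_{ij}$ can be checked numerically against the gradient of the regularized error. A minimal sketch, assuming $\Phi_{nj} = x_n^j$, the error $E(\mathbf{w}) = \tfrac{1}{2}\sum_n (y(x_n,\mathbf{w}) - t_n)^2 + \tfrac{\lambda}{2}\lVert\mathbf{w}\rVert^2$, and made-up data (all names here are mine, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
M = 3                                        # polynomial order
lam = 0.5                                    # regularization coefficient
x = rng.uniform(-1, 1, size=10)              # sample inputs
t = np.sin(np.pi * x) + 0.1 * rng.normal(size=x.size)

Phi = np.vander(x, M + 1, increasing=True)   # Phi[n, j] = x_n**j

# A_ij = sum_n x_n^(i+j) is exactly (Phi.T @ Phi)_ij
A_plain = Phi.T @ Phi
T = Phi.T @ t

# Candidate 1: lambda added only on the diagonal (lambda * delta_ij)
A_diag = A_plain + lam * np.eye(M + 1)

# Candidate 2: lambda added inside the sum over n, i.e. N*lambda in every entry
A_all = A_plain + lam * x.size * np.ones((M + 1, M + 1))

w_diag = np.linalg.solve(A_diag, T)
w_all = np.linalg.solve(A_all, T)

def grad(w):
    """Gradient of the regularized sum-of-squares error at w."""
    return Phi.T @ (Phi @ w - t) + lam * w

print(np.abs(grad(w_diag)).max())   # ~0: w_diag stationarizes the error
print(np.abs(grad(w_all)).max())    # nonzero: w_all does not
```

The check works because $\nabla E = \Phi^\top(\Phi\mathbf{w} - \mathbf{t}) + \lambda\mathbf{w}$, so whichever candidate system has a solution with vanishing gradient is the one that actually corresponds to minimizing the regularized error.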

Hi! Did a list of exercises ever get finalized?