
The Jacobian is first derivatives, but for a function mapping N to M dimensions. It's the first derivative of every output wrt every input, so it will be an M x N matrix: one row per output, one column per input.

The gradient is a special case of the Jacobian for functions mapping N to 1 dimension, such as loss functions. The Jacobian there is a 1 x N row vector; the gradient is its transpose, an N x 1 column vector (usually just treated as an N-vector).
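This is easy to check numerically. A minimal finite-difference sketch (the function names and the example f are my own, not from the comment): the Jacobian's shape comes out M x N, and a scalar-valued function just gives the M = 1 case.

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Forward-difference Jacobian of f: R^N -> R^M at x.
    Rows index outputs (M), columns index inputs (N)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(f(x), dtype=float)
    J = np.zeros((y.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (np.asarray(f(x + dx)) - y) / eps
    return J

# f: R^2 -> R^3, so the Jacobian is 3 x 2
f = lambda x: np.array([x[0]**2, x[0] * x[1], np.sin(x[1])])
J = jacobian(f, [1.0, 2.0])
print(J.shape)  # (3, 2)

# A scalar loss is the M = 1 case: a 1 x N row; its transpose is the gradient
loss = lambda x: np.array([np.sum(x**2)])
g = jacobian(loss, [1.0, 2.0])
print(g.shape)  # (1, 2)
```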




