I recently hosted a presentation for the fastai community with Zhewei Yao and Amir Gholami, authors of the AdaHessian optimizer paper.

AdaHessian is a promising new optimizer, and one of the first second-order optimizers practical enough for everyday training. I've been working on a port of it to fastai, which I hope to release shortly; I'll post here once it's out. Until then, I hope you enjoy the presentation and the Q&A afterwards!
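The key idea that makes AdaHessian practical is that it never forms the full Hessian: it estimates only the Hessian diagonal using Hutchinson's method (random Rademacher vectors and Hessian-vector products), which costs roughly one extra backward pass. A minimal PyTorch sketch of that estimator is below; the function name and signature are my own illustration, not the paper's code.

```python
import torch

def hutchinson_diag_hessian(loss, params, n_samples=1):
    """Estimate diag(H) as E[z * (H z)] with Rademacher vectors z (+/-1).

    This is the Hutchinson estimator that AdaHessian builds its
    second-moment term from; one Hessian-vector product per sample.
    """
    # First-order gradients, with the graph kept for a second backward pass
    grads = torch.autograd.grad(loss, params, create_graph=True)
    diag_estimates = [torch.zeros_like(p) for p in params]
    for _ in range(n_samples):
        # Rademacher vectors: entries drawn uniformly from {-1, +1}
        zs = [torch.randint_like(p, high=2) * 2.0 - 1.0 for p in params]
        # Hessian-vector products H z via a second backward pass
        hvs = torch.autograd.grad(grads, params, grad_outputs=zs,
                                  retain_graph=True)
        for d, z, hv in zip(diag_estimates, zs, hvs):
            d += z * hv / n_samples  # accumulate z * (H z)
    return diag_estimates
```

For a quadratic loss like `(a * x**2).sum()` the true Hessian is diagonal, so a single sample already recovers `2a` exactly; for real networks the estimate is noisy, which is why AdaHessian also applies spatial averaging and momentum to it.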

(Click here if the embedded link below doesn't work)

Paper Citation

@article{yao2020adahessian,
  title={ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning},
  author={Yao, Zhewei and Gholami, Amir and Shen, Sheng and Keutzer, Kurt and Mahoney, Michael W},
  journal={arXiv preprint arXiv:2006.00719},
  year={2020}
}

Thanks for Reading 😃

As always, I would love to hear any comments, thoughts, or criticisms. You can find me on Twitter at @mcgenergy.