061: Interpolation, Extrapolation and Linearisation (Prof. Yann LeCun, Dr. Randall Balestriero)
Length:
200 minutes
Released:
Jan 4, 2022
Format:
Podcast episode
Description
We are now sponsored by Weights and Biases! Please visit our sponsor link: http://wandb.me/MLST
Yann LeCun thinks it is specious to say that neural network models are interpolating, because in high dimensions everything is extrapolation. Recently, Dr. Randall Balestriero, Dr. Jerome Pesenti and Prof. Yann LeCun released their paper "Learning in High Dimension Always Amounts to Extrapolation". This discussion has completely changed how we think about neural networks and their behaviour.
[00:00:00] Pre-intro
[00:11:58] Intro Part 1: On linearisation in NNs
[00:28:17] Intro Part 2: On interpolation in NNs
[00:47:45] Intro Part 3: On the curse
[00:48:19] LeCun
[01:40:51] Randall Balestriero
YouTube version: https://youtu.be/86ib0sfdFtw