Next #PyTorchCon keynote is "Foundations for AI + Science" from Anima Anandkumar.
Mathematical equations govern the world at all scales, and there's a need for physical understanding that lies outside the language-based domain of modern LLMs.
We need new neural operators that can handle physical and mathematical domains better.
Using lower-resolution translations and models is considered harmful! Think: hurricanes.
#PyTorchCon
The team has developed Physics-Informed Neural Operators (PINO).
These can more accurately predict phenomena like fluid flows (very cool demo that's hard to capture in text).
NeuralOperator is the project this work is based on; check them out on GitHub. There will also be a poster at #PyTorchCon.
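If you want to poke at neural operators yourself, here's a minimal training-step sketch assuming the neuralop library's FNO interface; the shapes, hyperparameters, and random data are my own illustrative stand-ins, not anything from the talk:

```python
# Minimal sketch: one training step with a Fourier Neural Operator
# from the neuralop library (github.com/neuraloperator/neuraloperator).
# All sizes below are illustrative assumptions, not values from the talk.
import torch
from neuralop.models import FNO

# Toy 2D problem: map a 1-channel input field to a 1-channel output field.
model = FNO(n_modes=(16, 16), hidden_channels=32,
            in_channels=1, out_channels=1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

# Random stand-in data: a batch of 4 fields on a 64x64 grid.
x = torch.randn(4, 1, 64, 64)
y = torch.randn(4, 1, 64, 64)

optimizer.zero_grad()
pred = model(x)  # operators act on whole discretized functions, not tokens
loss = loss_fn(pred, y)
loss.backward()
optimizer.step()
```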
Now on to success stories.
They've built a weather forecasting model that's 45,000 times faster than current models (not sure how that's measured).
The model is Apache-licensed #PyTorch code.
Beyond modeling phenomena, these neural operators are able to do hardware design at the quantum gate level.
(I absolutely do not know enough about the science here, but it looks cool!)
Neural networks need to be able to understand mathematical reasoning; Lean / LeanDojo are critical steps on this path.
The team built an LLM on top of Lean; they're hoping to open-source it soon.
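For flavor, this is the kind of artifact an LLM-based prover targets: a goal stated in Lean with a machine-checkable proof. This toy example is mine, not from the talk:

```lean
-- A toy Lean 4 theorem of the sort an LLM prover would try to produce:
-- state a goal, then supply a proof term the kernel can check.
theorem add_comm' (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```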
Wrapping up, we need more hardware-efficient AI training.
"Hardware should be part of the constraint of algorithm design"
The true bottleneck right now is memory; it's a bigger constraint than compute.
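As a concrete illustration of trading compute for memory (my example, not from the talk), stock PyTorch already gives you two such levers, activation checkpointing and mixed precision:

```python
# Sketch: two standard PyTorch levers for easing the memory bottleneck.
# The model and shapes are illustrative stand-ins.
import torch
from torch.utils.checkpoint import checkpoint

block = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024), torch.nn.ReLU(),
    torch.nn.Linear(1024, 1024), torch.nn.ReLU(),
)

x = torch.randn(8, 1024, requires_grad=True)

# Mixed precision shrinks activation memory; checkpointing drops
# intermediate activations and recomputes them during backward.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = checkpoint(block, x, use_reentrant=False)

loss = y.float().sum()
loss.backward()
```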