
Your one-stop shop for all Changelog podcasts. Weekly shows about software development, developer culture, open source, building startups, artificial intelligence, shipping code to production, and the people involved. Yes, we focus on the people. Everything else is an implementation detail.
Stellar inference speed via AutoNAS (Practical AI #148)
Yonatan Geifman of Deci makes Daniel and Chris buckle up and takes them on a tour of the ideas behind his new inference platform. It enables AI developers to build, optimize, and deploy blazing-fast deep learning models on any hardware. Don’t blink or you’ll miss it!
Changelog++ members save 2 minutes on this episode because they made the ads disappear. Join today!
Sponsors
- RudderStack – The smart customer data pipeline made for developers. Connect your whole customer data stack. Warehouse-first, open source Segment alternative.
- SignalWire – Build what’s next in communications with video, voice, and messaging APIs powered by elastic cloud infrastructure. Try it today at signalwire.com and use code SHIPIT for $25 in developer credit.
- Fastly – Our bandwidth partner. Fastly powers fast, secure, and scalable digital experiences. Move beyond your content delivery network to their powerful edge cloud platform. Learn more at fastly.com
Featuring
- Yonatan Geifman – Twitter, GitHub, Website
- Chris Benson – Twitter, GitHub, LinkedIn, Website
- Daniel Whitenack – Twitter, GitHub, Website
Notes and Links
- Deci
- An Introduction to the Inference Stack and Inference Acceleration Techniques
- Deci and Intel Collaborate to Optimize Deep Learning Inference on Intel’s CPUs
- DeciNets: A New Efficient Frontier for Computer Vision Models
- White paper
Something missing or broken? PRs welcome!