Embedded AI - Intelligence at the Deep Edge
“Intelligence at the Deep Edge” is a podcast exploring the fascinating intersection of embedded systems and artificial intelligence. Dive into the world of cutting-edge technology as we discuss how AI is revolutionizing edge devices, enabling smarter sensors, efficient machine learning models, and real-time decision-making at the edge.
Discover more on Embedded AI (https://medium.com/embedded-ai) — our companion publication where we detail the ideas, projects, and breakthroughs featured on the podcast.
Help support the podcast - https://www.buzzsprout.com/2429696/support
Most Neurons Do Nothing and That's the Point!
This episode explores why biological neural networks are inherently sparse, with only 1 to 5 percent of cortical neurons active at any moment, and why this silence is a feature rather than a limitation. We trace the evolutionary pressures that drove the brain toward sparse coding, from the metabolic cost of each spike to the fixed energy budget per neuron, and examine the computational advantages that follow: greater memory capacity, more efficient representations, and robust generalisation. The discussion then turns to what this means for artificial intelligence, covering the Lottery Ticket Hypothesis, dynamic sparse training, Mixture of Experts architectures, and spiking neural networks. For engineers building at the deep edge, the conclusion is clear: strategic sparsity is not a constraint to work around but a design principle to build on.
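The idea of only a small fraction of units being active at once can be sketched in a few lines. This is not from the episode itself, just a minimal illustration (using NumPy, with a hypothetical `sparsify_topk` helper) of top-k activation sparsity, where roughly 5 percent of units stay active and the rest are silenced, mirroring the 1 to 5 percent cortical activity discussed above:

```python
import numpy as np

def sparsify_topk(activations: np.ndarray, active_fraction: float = 0.05) -> np.ndarray:
    """Keep only the top `active_fraction` of activations by magnitude; zero the rest."""
    k = max(1, int(len(activations) * active_fraction))
    # Indices of the k largest-magnitude activations
    top_idx = np.argpartition(np.abs(activations), -k)[-k:]
    sparse = np.zeros_like(activations)
    sparse[top_idx] = activations[top_idx]
    return sparse

rng = np.random.default_rng(42)
acts = rng.normal(size=1000)
sparse_acts = sparsify_topk(acts, active_fraction=0.05)
print(np.count_nonzero(sparse_acts))  # 50 of 1000 units remain active
```

The same select-then-silence pattern underlies the architectures mentioned in the episode: Mixture of Experts routes each input to a few experts, and magnitude pruning (as in Lottery Ticket experiments) applies the analogous idea to weights rather than activations.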
If you are interested in learning more, please subscribe to the podcast or head over to https://medium.com/@reefwing, where there is lots more content on AI, IoT, robotics, drones, and development. To support us in bringing you this material, you can buy me a coffee or simply provide feedback. We love feedback!