Wednesday, December 6, 2023 6:00-7:00pm
IN PERSON Location: 3175 Bowers Ave, Santa Clara, CA 95054
Doors open at 5:30
Virtual Link at 6:00pm
[button link="https://calendar.ucsc.edu/event/the_uc_santa_cruz_kraw_lecture_series_presents_bridging_the_gap_between_artificial_intelligence_and_natural_intelligence" class="short_code" type="big"]Register[/button]
The brain is the perfect place to look for inspiration to develop more efficient neural networks. Indeed, the inner workings of our synapses and neurons offer a glimpse at what the future of deep learning might look like. Our brains constantly adapt: our neurons process everything we know, the mistakes we have made, and our failed predictions, all to anticipate what will happen next with incredible speed. Our brains are also amazingly efficient. Training a large-scale neural network can cost more than $10 million in energy, yet the human brain does remarkably well on a power budget of 20 watts.
We can apply the computational principles that underpin the brain and use them to engineer more efficient systems that adapt to ever-changing environments. There is an interplay between neural-inspired algorithms, how they can be deployed on low-power microelectronics, and how the brain provides a blueprint for this process.
Our Speaker
Jason K. Eshraghian is an assistant professor in the Department of Electrical and Computer Engineering at UC Santa Cruz. He received Bachelor of Engineering (Electrical and Electronic) and Bachelor of Laws degrees from the University of Western Australia in 2016, where he also received a Ph.D. in 2019. From 2019 to 2022, he was a postdoctoral research fellow at the University of Michigan, MI. He serves as the Secretary of the Neural Systems and Applications Technical Committee. He is the recipient of a Fulbright Fellowship (Australian-American Fulbright Commission), a Forrest Research Fellowship (Forrest Research Foundation), and the Endeavour Research Fellowship (Australian Government). His research interests include neuromorphic computing, spiking neural networks, and memory circuits, and he is the developer of snnTorch, a widely used Python library with over 100,000 downloads for training and modeling spiking neural networks.
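To give a flavor of the spiking neurons that libraries like snnTorch model, here is a minimal sketch of leaky integrate-and-fire (LIF) dynamics in plain Python. The function name, parameter values, and reset scheme are illustrative assumptions for this sketch, not snnTorch's actual API.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative assumptions only: names, parameter values, and the
# soft-reset scheme are not taken from snnTorch's API.

def lif_step(mem, input_current, beta=0.9, threshold=1.0):
    """One discrete time step of a leaky integrate-and-fire neuron.

    mem: membrane potential carried over from the previous step
    input_current: weighted input arriving this step
    beta: membrane decay factor (the "leak"), 0 < beta < 1
    threshold: potential at which the neuron emits a spike
    """
    mem = beta * mem + input_current   # leaky integration of input
    spike = mem >= threshold           # fire when threshold is crossed
    if spike:
        mem -= threshold               # soft reset after spiking
    return spike, mem

# Drive the neuron with a constant input and record its spike train.
mem = 0.0
spikes = []
for t in range(10):
    spk, mem = lif_step(mem, input_current=0.3)
    spikes.append(int(spk))
print(spikes)  # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The sparse, event-driven spike train is the key to the efficiency discussed in the talk: between spikes, a neuromorphic processor can stay largely idle rather than performing dense multiply-accumulates every step.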