A Clear Vision: How Animals and Robots See the Physical World

Sensing the world around us might feel effortless, but how does the brain succeed in the complex task of interpreting what we see? Could understanding the neuroscience of vision be the key to creating robots that perceive, interact with, and learn from their surroundings as well as we can -- or even better? In this pair of talks by two experts in distinct but related fields, our speakers will explore vision, from biology to technology, and discuss how the domains of science and engineering can inspire each other in tackling these fascinating questions.

Rudy Behnia, PhD, Assistant Professor of Neuroscience and Principal Investigator at Columbia University’s Zuckerman Institute, will open our event by sharing her research on how the brain processes the dynamic information coming in from our surrounding environment. Using the sophisticated yet compact brains of fruit flies as a model, she will bring us into the world of animal behavior and discuss the neuroscience behind color perception.

Shuran Song, PhD, Assistant Professor in the Department of Computer Science at Columbia University, will then talk about her work building computer vision algorithms at the forefront of robotics that enable machines to see better. Drawing from examples in her lab, she will discuss how robots can sense and perceive more than ever before by actively exploring the physical world and learning from these interactions to carry out complex tasks.

Following the two talks, Vassiki Chauhan, PhD, Postdoctoral Fellow at Barnard College, will moderate a discussion and Q&A with the speakers. Audience questions are welcome, either submitted during registration or asked live during the event.

RSVP by Wednesday, February 8, 2023

Registration is required.
Closed captioning will be made available via Zoom.