“With the ubiquity of sensors and mobile computers, information about our surroundings is ever abundant. AIDA embodies a new effort to make sense of these great amounts of data, harnessing our personal electronic devices as tools for behavioral support,” comments professor Carlo Ratti, director of the SENSEable City Lab. “In developing AIDA we asked ourselves how we could design a system that would offer the same kind of guidance as an informed and friendly companion.”
AIDA communicates with the driver through a small robot embedded in the dashboard. “AIDA builds on our long experience in building sociable robots,” explains professor Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab. “We are developing AIDA to read the driver’s mood from facial expression and other cues and respond in a socially appropriate and informative way.”
AIDA communicates in an immediate way: with the seamlessness of a smile or the blink of an eye. Over time, the project envisions, a kind of symbiotic relationship will develop between the driver and AIDA, in which both parties learn from each other and establish an affective bond.
To identify the goals the driver would like to achieve, AIDA analyzes the driver’s mobility patterns, keeping track of common routes and destinations. AIDA draws on an understanding of the city beyond what can be seen through the windshield, incorporating real-time event information and knowledge of environmental conditions, as well as commercial activity, tourist attractions, and residential areas.
“When it merges knowledge about the city with an understanding of the driver’s priorities and needs, AIDA can make important inferences,” explains Assaf Biderman, associate director of the SENSEable City Lab. “Within a week AIDA will have figured out your home and work locations. Soon afterwards the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam. On the way, AIDA might recommend a stop to fill up your tank, upon noticing that you are getting low on gas. AIDA can also give you feedback on your driving, helping you achieve greater energy efficiency and safer behavior.”
AIDA was developed in partnership with Audi, a premium brand of the Volkswagen Group, and the Volkswagen Group of America’s Electronics Research Lab. The AIDA team is directed by professors Cynthia Breazeal and Carlo Ratti, together with Assaf Biderman. The SENSEable City Lab team is led by Giusy di Lorenzo and includes Francisco Pereira, Fabio Pinelli, Pedro Correia, E Roon Kang, Jennifer Dunnam, and Shaocong Zhou. The Personal Robots Group’s technical and aesthetic team includes Mikey Siegel, Fardad Faridi, and Ryan Wistort, as well as videographers Paula Aguilera and Jonathan Williams. Chuhee Lee and Charles Lee represent the Volkswagen Group of America’s Electronics Research Lab.