A guided tour of the new MIT Museum

LIGO prototype

Developed in the 1970s by Professor Emeritus Rainer Weiss ’55, PhD ’62, and his students, this prototype led to the Laser Interferometer Gravitational-wave Observatory (LIGO), a massive physics experiment that eventually detected gravitational waves, as predicted by Einstein’s general theory of relativity. The work earned Weiss the 2017 Nobel Prize in Physics.

“The experiments that LIGO has facilitated feel like magic to me, as a non-physicist,” Nuñez said. “Can you imagine what it felt like to be there when they found out it worked? What a great moment for humanity!”

Kismet

One of the first robots designed to simulate social interactions, Kismet was created in the 1990s by Cynthia Breazeal, SM ’93, ScD ’00, now MIT’s dean for digital learning and head of the Personal Robots Research Group at the MIT Media Lab. Originally controlled by 15 different computers, Kismet used 21 motors to create facial expressions and body postures.

“I have a lot of connection with that particular artifact,” said Nuñez, who studied with Breazeal at the Media Lab. “It’s a charismatic thing; it’s one of the museum’s Instagram moments.”

Interactive robotics

Developed by Julie Shah ’04, SM ’06, PhD ’11, IRGO is an interactive robot that museum visitors help train through artificial-intelligence demonstrations. “Our guests are participating in real robotics research,” Nuñez said. “That’s a rare and special opportunity.”

Today Shah is the H.N. Slater Professor of Aeronautics and Astronautics at MIT and head of the Interactive Robotics Group within the Computer Science and Artificial Intelligence Laboratory. She shares her thoughts on AI in the nearby audio gallery. Other alumni featured in the gallery include Professor Rosalind Picard, SM ’86, ScD ’91, director of the Media Lab’s Affective Computing Research Group, and Media Lab PhD students Matt Groh, SM ’19, and Pat Pataranutaporn, SM ’20.

“We want to expose the fact that there are communities of people behind everything you see,” Nuñez said.

Coded gaze

Visitors to the AI gallery can see the mask used by Joy Buolamwini, SM ’17, PhD ’22, to present a white face, rather than her own Black one, to facial recognition software, which she found to be less accurate for people with darker skin. In her doctoral thesis, Buolamwini coined the term “coded gaze” to describe algorithmic bias.