July 20, 1969, Mission Time: 102:45:40. One hundred and twenty feet
above the surface of the moon and with less than thirty seconds of fuel
remaining in the tanks, Neil Armstrong sees a sea of boulders covering the
Lunar Module Eagle’s landing spot. He does what any wide-awake driver does when
a pothole appears on the freeway just ahead: stomping on the gas, he swerves
hard to the side and drives that puppy full-bore toward the nearest open spot.
After Eagle finally – softly – touches down on the moon’s surface, Apollo 11
mission control can only say, “You got a bunch of guys about to turn blue. We're
breathing again. Thanks a lot.”
Next time we land on the moon, Mary “Missy” Cummings is
going to make sure it won’t be anywhere near that hairy. Not that Missy isn’t
used to hairy landings: as one of the first women Naval aviators to be cleared
for combat flight, Missy landed her A-4 Skyhawk countless times on heaving
carrier decks. Now that she runs MIT’s Humans and Automation Lab, she gets the
chance to put her academic career (Ph.D. in Systems Engineering with a focus on
Cognitive Engineering) and first-hand piloting experience into practice. Her
most recent area of focus: designing the visual displays that the next lunar astronauts
will use when they land on the moon.
As Missy told me: “As instrumentation designers, one of our
big challenges is deciding how much information not to show, and how to best trick people into perceiving what we
most want them to see. We do this through multivariate instrument optimization,
which is a fancy way of describing the process of layering many visual inputs
together to create a single, rapidly-perceived display. In a time-critical task
like a moon landing, the key is combining precise information about height and
vertical speed in such a way that the pilot senses their position and speed
before they even know that they’ve thought about it.”
1960s-era Apollo astronauts’ eyes had to jump repeatedly
across many instruments to build this situational awareness. They burned several
“cognitive cycles” merging those separate visual inputs and mentally
performing the calculations they needed to land. Eliminating such
time-consuming cognitive tasks (even when the time is counted in
tenths of a second) is one of the main goals of new cockpit instrumentation
designs.
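To make that concrete, here is a minimal sketch of the kind of mental arithmetic a well-designed display can absorb: fusing two separate readouts, altitude and vertical speed, into one derived cue such as time to touchdown. (This is a hypothetical illustration of the general idea; the source does not describe the VAVI’s actual computations.)

```python
def time_to_touchdown(altitude_ft, vertical_speed_fps):
    """Seconds until ground contact at the current descent rate.

    altitude_ft: height above the surface, in feet.
    vertical_speed_fps: vertical speed in feet per second
        (negative means descending).

    Returns None when the craft is level or climbing, since no
    touchdown is implied at the current rate.
    """
    if vertical_speed_fps >= 0:  # hovering or climbing
        return None
    return altitude_ft / -vertical_speed_fps

# Eagle at 120 ft, descending at 3 ft/s: 40 seconds to contact.
print(time_to_touchdown(120.0, -3.0))
```

A display that renders this single number (or encodes it graphically) spares the pilot the division that Apollo crews had to do in their heads while also watching fuel and attitude.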
The new VAVI (Vertical Altitude and Velocity Indicator)
Missy’s team designed is a perfect example. By combining “ecological perception”
(seeing both the environment and its embedded clues that show what actions to
take) with “emergent features” (features produced by the interaction of
individual graphical elements), this instrument’s “waving arms” make
the astronaut visually feel how
quickly they are going up or down. Her team has tested the VAVI in a Harrier
jump-jet with great success and hopes to bring it to the commercial aviation
market.
Missy knows the VAVI design works: “In academia, many tools
are invented purely based on theoretical research, which can be great. But in
this case, I can literally say that with all my flying that I’ve ‘been there,
done that’… and wished I’d had something like this instrument when I did.”