Mind-Controlled Airplanes?

The future is not only coming, it might already be here

Pulleys. Pushrods. Electrical connectors swabbed with stabilant goo. Aircraft have untold connecting points that translate an input to an action. And, as those long-ago games of elementary school "telephone" demonstrated, every time a command moves from one node to the next, there's potential for corruption or failure.

Of all these connection types, one holds an outsized potential for havoc. That's the interface between the pilot's brain and the avionics. Whether the controls are knobs, switches, push buttons or touch screens, there's endless room for us, as pilots, to screw up our manipulation of them. Let's see: Big knob, chapter; small knob, page!

Advances today in user interface technology have less to do with what information is shown than with how we interact with the aircraft. New systems give pilots touch and voice control options, and researchers say that mind control isn't far behind.

Someday, we'll jack our minds right into the panel, and the plane will become an extension of our mind. Or, wait: Maybe we're already further into the aviation Matrix than you think. Take the red pill with me, and let's start our investigation all the way at its logical end. Bob Witwer, Honeywell Aerospace's Vice President of Advanced Technology, and his colleagues are wiring actual human brains into the avionics on their testbed King Air.

No kidding. In their experiments, pilots actually maneuver aircraft with their thoughts. Honeywell calls its project Thought-enabled Hands-free Interaction using Neural Kinetics, or THINK for short.

The idea is that as different areas of the brain fire, they emit faint electrical signals that can be sensed with the same contacts used in electroencephalograms (EEGs). Concentrate on a special light-up square on the left of the panel, and the plane rolls left. Focus on the box to the right, and back you go the other way. And, whatever you do, don't fixate on that argument you had with your boss.
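Under the hood, that light-up-square trick depends on the visual cortex locking onto the flicker rate of whichever target the pilot stares at. Honeywell hasn't published its implementation, so here is only a loose sketch of what the decoding step might look like in Python; every sample rate, frequency, threshold and command name below is an assumption for illustration, not THINK's actual design:

```python
import numpy as np

# All values here are illustrative assumptions, not Honeywell's design.
SAMPLE_RATE_HZ = 256                 # assumed EEG sampling rate
TARGETS = {"ROLL_LEFT": 10.0,        # hypothetical flicker rate, left square
           "ROLL_RIGHT": 15.0}       # hypothetical flicker rate, right square
SNR_THRESHOLD = 4.0                  # how dominant a signal must be to act

def decode_command(eeg_window: np.ndarray) -> str | None:
    """Map a second or two of EEG samples to a roll command, or None."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / SAMPLE_RATE_HZ)
    noise_floor = float(np.median(spectrum))  # crude background estimate
    scores = {}
    for command, flicker_hz in TARGETS.items():
        in_band = np.abs(freqs - flicker_hz) <= 0.5
        scores[command] = float(spectrum[in_band].mean()) / noise_floor
    best = max(scores, key=scores.get)
    # In the air, "no command" is far safer than a wrong one, so act only
    # when one target's frequency clearly stands out from the background.
    return best if scores[best] >= SNR_THRESHOLD else None
```

The interesting engineering is in that last line: the system's default answer has to be "do nothing."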


"It's unbelievable in a way, but it's not that out there," says Witwer. He explains that Honeywell's next series of tests will ditch the visual targets and try to read the weaker signals associated with imagined movement. If that works, the pilot will only have to think about raising his right arm and dipping his left to command that movement.

It's all laboratory work right now, but it works. So, someday, we may be able to command our planes in the same way we fly in our dreams.

While we wait for the Matrix in our Piper Matrix, the current connections between pilot and panel are changing radically. Not so long ago, the only means of commanding an aircraft was spinning knobs and pushing buttons.


Now, for instance, it's possible to just tell the avionics what you want. Who hasn't wanted to "Mute Passengers" during a particularly stressful moment? Garmin's audio panels with the Telligence voice command system respond to just that without requiring the pilot to move eyes or hands from the controls. The system allows many basic commands, like switching COM or NAV radios, squelching, and playing back clearances. But that's just the beginning, according to Garmin's Bill Stone, Senior Business Development Manager for the product. "It's a fairly limited set of commands," he says, very much by design. "You don't want to ask for COM 2 and have the gear go down. You really want to bring value and not just inundate the pilot with all kinds of features."
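Stone's point about restraint is easy to sketch. A deliberately closed grammar, however the speech is recognized upstream, refuses anything it doesn't know rather than guessing; the phrases below are invented stand-ins, not Garmin's actual Telligence command set:

```python
# Illustrative only: these phrases are stand-ins, not Garmin's real grammar.
COMMANDS = {
    "select com one": "AUDIO_SELECT_COM1",
    "select com two": "AUDIO_SELECT_COM2",
    "mute passengers": "MUTE_PAX",
    "play back clearance": "PLAYBACK_CLEARANCE",
}

def interpret(recognized_phrase: str) -> str | None:
    """Exact-match the recognizer's output against a closed whitelist."""
    # A near-miss is discarded, never "corrected": better to make the pilot
    # repeat a phrase than to drop the gear when COM 2 was requested.
    return COMMANDS.get(recognized_phrase.strip().lower())
```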

What seemed like science fiction 20 years ago is commonplace today. Garmin's latest multifunction navigators can be controlled via touch---you can pinch and zoom in Apple-like fashion. With a compatible audio panel, you can even issue voice commands, with reliability that would make Siri envious.

That awareness of need and context is driving avionics makers to choose when voice, touch panels, gestures or physical knobs and buttons work best.

In addition to its work with neural control, Honeywell has demonstrated cockpits that respond to gestures like waving in the air to scroll a page. And, with digital displays instead of fixed gauges, those cockpits can quickly adapt what they show to the phase of flight.

So, Honeywell's Smart Taxi airport moving map automatically switches the primary flight display to an external view of the plane on landing rollout. Gone are the altitude and horizon indications---replaced by oversized signs for intersections, a magenta taxi line showing the path to the gate, ADS-B ground traffic and even 3D virtual barricades at hold short points.

Accelerate for takeoff again, and the view melts back to an inside-the-cockpit field, complete with flight instruments.
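The switching itself can be thought of as a tiny state machine keyed to phase of flight. Honeywell hasn't described its triggers, so the weight-on-wheels signal and speed threshold below are pure assumptions for the sake of illustration, not Smart Taxi's actual logic:

```python
# Hypothetical phase-of-flight switch; the signals and the threshold are
# assumed for illustration, not Honeywell's actual Smart Taxi logic.
TAKEOFF_ROLL_SPEED_KT = 40.0  # assumed cutoff between taxiing and takeoff

def display_mode(weight_on_wheels: bool, groundspeed_kt: float) -> str:
    """Choose what the primary flight display should show right now."""
    if not weight_on_wheels:
        return "FLIGHT"       # airborne: attitude, altitude, airspeed
    if groundspeed_kt >= TAKEOFF_ROLL_SPEED_KT:
        return "FLIGHT"       # takeoff roll: melt back to flight instruments
    return "SMART_TAXI"       # rollout and taxi: external view, taxi line
```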


"This can't be just about putting gadgets in the cockpit," says Witwer. "We have a duty, as avionics and cockpit designers, to provide the best possible situational awareness and ease of operation. So, we have come up with a few mantras. Give pilots what they need. Only what they need. Only when they need it. And give it to them in a way that's intuitive, unambiguous and easy to understand."

In other words, don't have the gear go down when you call for COM 2. And think hard about matching input method to context: It's no coincidence that Garmin's first Telligence products were developed for the helicopter market, where pilots hold collective in one hand and cyclic in the other.

Of course, even as the OEMs talk about slow, incremental progress, there's no question the interface mobs are storming the gates. Why is it that you can ask your phone for the score on the Giants game, but there's no avionics equivalent?

IBM is one company investigating how natural language queries might make their way into the cockpit. It has already found ways to apply its Watson technology---the computer that beat Jeopardy's all-time human champs---to various industries. One of those is airlines, where maintenance technicians and dispatch agents must answer complex questions involving reams of data and reference material.

What if that same intelligence were available to the pilot? Say, a sensor onboard the aircraft begins to report an anomaly that might indicate a failure. Watson can look at all the possibilities and help the pilot prioritize.

"Based on the severity of the analysis, based on a number of factors, here's the suggested course of action and the confidence we have with that decision," says Steve Peterson, IBM's Global Travel & Transportation Leader at the Institute For Business Value. "But recognizing that pilots are often making decisions while they're flying an aircraft and they could be flying through turbulence, you want to deliver these sorts of suggestions in a consumable way. And that's where it starts to make sense to have a natural language interface layered on top so the pilot can speak to this brain of the operations."

Peterson cautions that IBM's work is still experimental and isn't yet targeted to commercial products. But it's easy to imagine a day when a pilot might ask for the best route to a destination that minimizes fuel burn and encounters no more than light chop. Just don't give us the answer in the form of another question while we're flying an approach to minimums.

In the end, the only thing better than getting an instant answer to every aviation question may be not having to ask in the first place. So we return to mind control: It's the 22nd-century---or maybe still the 21st-century---solution to the pilot-aircraft interface.

"I don't think that's even 50 years out," Honeywell's Witwer says. "I think the industry and the world will be making tons of progress on neurotechnology. To do something in an airplane that's neurally controlled in 20 years? I think that's something I can easily get my brain around."

A commercial pilot with instrument privileges, Grant Opperman is a writer and business strategy consultant who flies himself to more than 20 states across the U.S. for business and pleasure.
