Will Your Airliner Get Hacked?
Meet the people who are making sure it won’t.
Modern aircraft have sometimes been called computers with wings, and as far back as 1994, long before hacking ranked high on society’s list of everyday worries, Boeing engineers were having discussions about how to keep malicious software from being introduced into the data network of their newest marvel, the 777 airliner. So far, the protections devised by those engineers—and the ones who came after them—have worked. No hacker has ever penetrated the computers of an airliner’s flight control system or any part of its avionics. The not-so-shocking news is that hackers have tried. Mike Vanguardia is one of an army of engineers who work to keep them out.
Vanguardia is a cybersecurity engineer with Boeing, the company that gave the world its first “e-Enabled” commercial airplane, the 787 Dreamliner. In 2006, as the Dreamliner was being born, the idea of an e-Enabled airplane—one that uses commercial off-the-shelf electronics and Internet protocols for in-flight entertainment and some communications systems, and lets passengers connect to the Internet—wasn’t as scary as it is today. Back then, cybercrime was rare. Today, it is anything but. “The cyberthreat is always moving,” says Vanguardia, who is part of a team that studies the electronic connections on Boeing airliners as they’re being designed and tries to make sure that protections are built into them and into the airplanes’ software. His team, he says, is constantly asking, “What things have happened in the news that we haven’t thought about as we design?”
A recent challenge to Vanguardia’s team came in August 2019. At a computer security conference known as Black Hat, Ruben Santamarta, principal security consultant with the Seattle information security services firm IOActive, reported “attack paths” into the avionics network of the Boeing 787. According to his white paper, these paths could be accessed from “non-critical domains, such as the Passenger Information and Entertainment Services….”
This wasn’t Santamarta’s first warning that airliners are vulnerable to cyberattack. At Black Hat 2018, he reported having found a way to remotely access satellite communications (satcom) equipment on board hundreds of in-flight aircraft. His report drew a strong rebuttal from the Aviation Information Sharing and Analysis Center (A-ISAC), a trade group composed of more than 80 aviation companies and airports that share vulnerability reports and tactics for cybersecurity. The A-ISAC statement in essence said: We looked at this very thing years ago and put the necessary safeguards in place. Still, at least one provider of aviation satcom equipment installed a patch to thwart potential hacks.
Santamarta’s 2019 claim was more serious. Having found on the Internet a Boeing page showing some of the computer code that the 787 uses to run a package known as the Crew Information System/Maintenance System, Santamarta “reverse engineered” the code, he explained, to find attack paths. He admitted that he could not follow the paths because he did not have access to an airplane, but he theorized that these vulnerabilities could allow hackers into the onboard avionics system.
To understand Santamarta’s work—and Boeing’s subsequent refutation—you need to know a little about the electronic networks aboard e-Enabled aircraft. Airlines and manufacturers—through the Radio Technical Commission for Aeronautics (RTCA), a consortium that develops engineering standards—agree to organize onboard networks into three isolated domains. The Aircraft Control Domain, which includes avionics and flight controls, is the holy of holies. Only the pilots can access it. The Airline Information Service Domain, which includes the information and management systems Santamarta claimed he could reach, provides information to the pilots—in some aircraft models, through something known as an electronic flight bag, a digital edition of all the old paper sectional charts, terminal procedures, and flight-regime calculations that pilots used to lug around in black leather flight bags. This domain includes a system that reports engine health measurements as well as system performance problems to an airline’s maintenance crews, by satellite link in flight, before the airplane lands. The third domain is the Passenger Information and Entertainment Services Domain, which contains the videos and ground tracks that passengers watch from their seats in the cabin. By industry standards and by Federal Aviation Administration regulation, these three domains must not share data pathways, and all three must be hardened against cyberattack.
Aircraft makers are continually updating and upgrading onboard systems. They have to if they are to stay ahead of the bad guys, says Joel Otto, an executive at Collins Aerospace, a supplier of avionics and integrated systems for airliners. Otto, who has spent more than 30 years designing and building avionics for aircraft, says the avionics on commercial airliners are constantly being hardened against cyberattack. Data pathways, for example, are made one-way: Data can reach the passenger cabin, but data from the cabin can’t be transmitted in the other direction. Onboard networks also have firewalls and layered defenses; to infiltrate, an attacker must get through concentric shells of protection, each using different security controls.
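As a very rough illustration of the one-way rule Otto describes, here is a minimal sketch in Python; the domain names, the trust ordering, and the policy function are simplifications invented for this example, not an actual onboard implementation:

```python
from enum import IntEnum

# Illustrative only: the three isolated domains, ordered by how much they are
# trusted. The names and ordering are simplifications for this sketch, not a
# Boeing or RTCA specification.
class Domain(IntEnum):
    PASSENGER_ENTERTAINMENT = 1   # seat-back video, moving map
    AIRLINE_INFORMATION = 2       # electronic flight bag, maintenance reports
    AIRCRAFT_CONTROL = 3          # avionics and flight controls

def flow_permitted(source: Domain, destination: Domain) -> bool:
    """One-way rule: data may move from a more-trusted domain toward a
    less-trusted one, never the other way."""
    return source >= destination

# Position data can reach the cabin displays...
assert flow_permitted(Domain.AIRCRAFT_CONTROL, Domain.PASSENGER_ENTERTAINMENT)
# ...but nothing originating in the cabin is accepted by the control domain.
assert not flow_permitted(Domain.PASSENGER_ENTERTAINMENT, Domain.AIRCRAFT_CONTROL)
```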
The direction of data flow is also controlled by hardware such as routers and switches, and some automated functions are gated by physical switches that determine when they can be activated. A recent U.S. Government Accountability Office report on aircraft cybersecurity gives an example: “Airplanes use a weight switch in the wheels to verify that an airplane is on the ground before it will allow software changes to be uploaded to an airplane’s avionics systems. Such a system prevents software changes while an airplane is in flight.”
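A toy version of that interlock, with invented names, might look like the sketch below; a real data loader would layer many more checks (signatures, part numbers, configuration) on top of the on-the-ground check:

```python
class SoftwareLoadRejected(Exception):
    """Raised when the loading conditions are not met."""

def upload_avionics_software(weight_on_wheels: bool, package: bytes) -> str:
    # The weight-on-wheels switch must confirm the airplane is on the ground
    # before any software load is even considered.
    if not weight_on_wheels:
        raise SoftwareLoadRejected("aircraft is not on the ground")
    # On the ground, a real loader would now verify digital signatures,
    # part numbers, and configuration before installing anything.
    return f"accepted {len(package)} bytes for installation"

print(upload_avionics_software(weight_on_wheels=True, package=b"\x00" * 1024))
```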
“We have begun to think of security as a process parallel to safety,” Otto says. “This process spans the entire lifecycle of a product.”
Before Santamarta went public with the attack paths into Boeing’s onboard networks in 2019, IOActive reported his findings to Boeing and its suppliers, A-ISAC, and several federal agencies. “We take all these claims seriously,” says Vanguardia. “We spent several months in our labs really digging into this, and we did testing on an aircraft. We’re design engineers, and we know the system really well. Ruben didn’t have access to all the aircraft systems that we do.” Boeing concluded that the paths Santamarta theorized were not exploitable.
“We presented our findings community-wide to get a second level of check,” says Vanguardia. “We wanted third-party attestation to make sure that we weren’t missing anything.” Boeing also reported its test results—and its suppliers’ test results—to the FAA and the Department of Homeland Security (DHS), and it shared its conclusion with IOActive before the Black Hat conference, but Santamarta went forward with his presentation. “IOActive disagreed with Boeing’s conclusion, which gave no details of controls that would prevent exploitation,” says John Sheehy, IOActive’s senior vice president for research and strategy.
Santamarta’s bombshell was costly for the airplane maker. “That was a big disruption,” says Vanguardia. “We pulled so many people off of design projects. Three or four months of airplane and lab testing. It was a multi-million-dollar effort.”
Claims of vulnerabilities in Boeing systems increased after the well-publicized FBI investigation of Chris Roberts, who in 2015 tweeted from a 737 in flight that he was tampering with its systems. Roberts says that before this incident, he had alerted the airlines to these flaws and gotten nowhere. Since that time, cyberattacks on airports, hospitals, businesses, and government systems have grown more frequent and more serious. The Center for Strategic and International Studies tallied 34 significant cyber events in 2015. (The Washington, D.C.-based think tank uses the term “significant” to describe attacks “on government agencies, defense and high-tech companies, or economic crimes with losses of more than a million dollars.”) In each of the three years beginning in 2017, CSIS counted more than 100. In March 2018, Boeing itself was the victim of ransomware that exploited a vulnerability in a version of Microsoft Windows. A spokesperson told news outlets that remediation was quickly applied, limiting the damage to a few affected systems at the company’s South Carolina production facility. (The attack was not serious enough to interrupt production.)
“When something big happens due to a lapse of cybersecurity, people will be asking how we got there,” says Carl Herberger, a former B-52 pilot and vice president of the cybersecurity firm Radware. “I really believe it is kind of a 9/11 thing.”
As cyberattacks increase, so do the defensive actions taken by government and industry, notably within aviation. In October 2019, the International Civil Aviation Organization (ICAO) published its first Aviation Cybersecurity Strategy, which recommended planning for cyber incidents, information sharing as a means of prevention, and increased vigilance. “It is critically important that the civil aviation sector takes tangible steps to increase the number of personnel that are qualified and knowledgeable in both aviation and cybersecurity,” the ICAO stated.
Last year in the United States, the administration established an Aviation Cybersecurity Initiative task force with representatives from DHS, the FAA, and the Department of Defense (DoD). The task force focuses not just on airliners but on what its three chairpersons call “the aviation ecosystem.” It includes airports, air traffic control, airlift and cargo operations, airlines, and the employees who work in all these areas.
Alan Burke, the task force DoD chairperson, says that in 2017, there was a change among security experts in the perception of risk. The change came in the wake of NotPetya, an encryption virus that denied access to everything on the computers it infected. NotPetya first targeted Ukrainian infrastructure—power companies, airports, and public transportation—but, as a Wired reporter wrote, “its blast radius was the entire world.” It shut down the port terminals of Danish shipping concern Maersk for two days, costing the company $300 million. What was most frightening about NotPetya was that it wasn’t ransomware, as its forerunner Petya was. When Petya encrypted a computer’s files, the user could pay a ransom in bitcoin to have them decrypted. NotPetya gave its victims no remedy. Its only purpose was to cause havoc.
“The NotPetya attack on the maritime sector was a big eye opener for many in the Department of Defense,” says Burke. “I know of more than one senior leader who [uses] the example of the Maersk attack to demonstrate that the cyberthreat is real.”
To bolster their lines of defense against the cyberthreat, some in the aviation industry are taking a new approach. At the Aviation-ISAC virtual summit last September, Mike Vanguardia called Boeing’s 2019 experience with Santamarta a turning point. In a presentation he called “Operation Reverse Thrust,” Vanguardia described Boeing’s new Security Researcher Technical Council, which brings researchers together with Boeing engineers to cooperate on strategies to defend against evolving cyberthreats.
“By working together we’re hoping the researchers will give us a unique perspective on attacks into our platforms,” he says. “We’ve also seen that a lot of these folks just really want to help make our industry better, safer, and more secure.”
Boeing has also instituted a vulnerability disclosure program and a website for security researchers to report potential vulnerabilities directly to the company. Randy Talley is the DHS chairperson of the aviation cybersecurity task force. Noting Boeing’s new programs, Talley says he’s seen a “tremendous change over the past two years” in the attitudes of airlines, manufacturers, and suppliers. “At the Aviation-ISAC Summit in Barcelona last year,” says Alan Burke, “[we learned] that less than 10 percent of the companies at the summit had a program of cyber vulnerability disclosure. We can encourage avionics companies to work together and report vulnerabilities when they are discovered. It helps strengthen the herd.”
Reporting software vulnerabilities requires trust; designs and data are proprietary. But the threat from outside the aviation industry is forcing companies to focus less on the threats from within it.
Boeing continues the industry-wide practice of wargaming, both to rehearse responses in case the worst happens and to run exercises aimed at defending aircraft from cyberattack. In these exercises, a “blue team” defends an asset while a “red team” tries to invade it. In one of the exercises Vanguardia conducted, he put avionics subject matter experts and pilots on the blue team. “We studied the robustness of the airplane’s position, navigation, and timing systems. We looked at points of weakness: Is it a network interface? Is it [data traveling] over radio frequency? How would I attempt to send bad data?” Bad data could make a pilot believe her aircraft is somewhere other than its actual position, at a higher or lower altitude on its landing approach, for example. The defenders look for the systems that would protect the airplane in that scenario. “But we would search also to see if an attack would be discoverable and to see how the pilot would react,” says Vanguardia.
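One simple defense against bad position data is a plausibility check: reject any reported fix that the airplane could not physically have reached since the last good one. The sketch below is a deliberately crude illustration of that idea, with invented thresholds and a flat-earth distance approximation:

```python
from dataclasses import dataclass

@dataclass
class Fix:
    lat_deg: float
    lon_deg: float
    time_s: float

MAX_SPEED_M_S = 350.0          # generous speed ceiling for an airliner
METERS_PER_DEG = 111_000.0     # coarse approximation near the equator

def position_is_plausible(previous: Fix, reported: Fix) -> bool:
    """Flag any fix that implies an impossible speed since the last good fix."""
    dt = reported.time_s - previous.time_s
    if dt <= 0:
        return False
    dist_m = ((reported.lat_deg - previous.lat_deg) ** 2 +
              (reported.lon_deg - previous.lon_deg) ** 2) ** 0.5 * METERS_PER_DEG
    return dist_m / dt <= MAX_SPEED_M_S

last_good = Fix(47.45, -122.31, time_s=0.0)
spoofed = Fix(47.95, -122.31, time_s=10.0)   # a ~55 km jump in ten seconds
print(position_is_plausible(last_good, spoofed))   # False: flag it for the crew
```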
Another safeguard against a cyberattack on U.S. airliners is the oversight of the FAA. An October 2020 report by the U.S. Government Accountability Office, however, concluded that the FAA “is not providing sufficient oversight to guard against evolving cybersecurity risks facing avionics systems in commercial airplanes.” The FAA has not been absent, and the GAO acknowledges that the agency has established cybersecurity requirements for airframe manufacturers as part of the certification process and that it continues to monitor airlines to make sure they adhere to security programs. But those security programs are ones the airlines themselves have created, based on the manufacturers’ recommendations. “Industry stakeholders across the aviation sector expressed concern that FAA lacks personnel with cybersecurity expertise,” the GAO reported; it recommended that the FAA institute training programs and develop guidance for the industry on the periodic testing of airliners, among other measures.
At the same time, one of the Aviation Cybersecurity Initiative task force’s goals is to improve the “confidentiality and integrity” of aviation transponder data; in other words, to decide how to deal with ADS-B.
The Automatic Dependent Surveillance-Broadcast system, ADS-B, is the satellite-based successor to the ground-based radar that has been the foundation of air traffic control for at least half a century. As of last year, aircraft flying in most controlled U.S. airspace are required to continually broadcast their position and speed—information gleaned from GPS satellites—so that controllers and other aircraft with ADS-B receiving equipment (which is not required on all aircraft) know where they are. The problem is that anyone with a receiver can know where they are—as well as who they are, how big they are, and how fast they’re traveling.
The DoD has obtained waivers from the FAA so that many military aircraft can fly without transmitting this data. (It would, after all, defeat the purpose of the almost $70 billion the Pentagon spent to develop and purchase a fleet of F-22 stealth fighters if the Raptors were out there on every training mission screaming “Here I am!”) All other aircraft, however, are left exposed.
The concern over signal integrity is that there is no way to authenticate the signals sent by aircraft transponders; therefore, someone could “spoof” the system by sending a signal that purports to come from an aircraft transponder. In a recent email statement to Air & Space, the FAA noted that the ADS-B equipment itself protects against false information: “The FAA has worked to ensure that … onboard avionics equipment using ADS-B Out information” validates received data before displaying it to the pilot. Airliners are also equipped with collision avoidance systems. The FAA considers this system an additional verification of the received picture of aircraft flying in the vicinity, stating, “This allows the airline pilot to have confidence in the traffic information seen on their displays.”
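The cross-checking the FAA describes can be thought of as requiring agreement between independent sources before a target is displayed. The sketch below is a simplified illustration of that idea; the field names, the TCAS-style range comparison, and the half-mile tolerance are assumptions made for this example, not how any particular avionics suite works:

```python
from dataclasses import dataclass

@dataclass
class AdsbReport:
    icao_address: str
    range_nm: float        # distance computed from the broadcast position

@dataclass
class TcasTrack:
    icao_address: str
    range_nm: float        # range measured independently by interrogation

def display_target(adsb: AdsbReport, tcas: TcasTrack,
                   tolerance_nm: float = 0.5) -> bool:
    """Show the traffic symbol only if an independent measurement agrees."""
    if adsb.icao_address != tcas.icao_address:
        return False
    return abs(adsb.range_nm - tcas.range_nm) <= tolerance_nm

genuine = display_target(AdsbReport("A1B2C3", 8.2), TcasTrack("A1B2C3", 8.0))
spoofed = display_target(AdsbReport("A1B2C3", 2.0), TcasTrack("A1B2C3", 8.0))
print(genuine, spoofed)   # True False
```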
As the FAA begins to implement the GAO recommendations, industry red and blue teams continue to wage battle in the labs, hackers search for vulnerabilities to present at Black Hat and other conferences, A-ISAC updates its 84 industry members at an annual summit, and thousands of engineers test and probe aircraft systems so that passengers can rest easy. How easy are those engineers resting? They are not letting their guard down.
The Disturbing Case of Chris Roberts
While sitting in a passenger seat of an airliner traveling high above the United States sometime between 2011 and 2014, cybersecurity expert Chris Roberts commanded one of the engines to climb, according to an affidavit filed by the Federal Bureau of Investigation. What brought Roberts to the bureau’s attention was a tweet he sent on April 15, 2015: “Find myself on a 737/800, let’s see Box-IFE-ICE-SATCOM,? Shall we start playing with EICAS messages? ”
The affidavit supported the bureau’s request for a search warrant of digital devices seized from Roberts on April 15, 2015, months after the purported hacking of an aircraft control system.
“The FBI never really came out and said [the hack] happened,” said Dan Katz, a senior solutions engineer at the IT security firm Anomali.
Roberts, a hero in the hacker community, founded the Denver cybersecurity firm One World Labs. (The firm later filed for bankruptcy protection.) He has been called “brilliant” by both hackers and aviation professionals. Today he is an information security consultant and hacker for the research firm HillBilly Hit Squad.
In the affidavit, the FBI quotes Roberts’ statement that he “wiggled and squeezed” the In-Flight Entertainment (IFE) electronics box on the floor of the seat in front of him to get the cover off, cabled his laptop computer to the inside electronics, then “overwrote code on the airplane’s Thrust Management Computer.”
John Craig, Boeing’s chief engineer for cabin and network systems, stated emphatically that Roberts could not have taken over the aircraft in the way the agent described, noting that the IFE is an “untrusted network,” and consequently the flight control system would not have accepted any commands from it.
In the affidavit, the FBI reported that Roberts issued a climb command to one of the engines, causing the aircraft to fly “in a lateral or sideways movement.” Fixed-wing aircraft cannot travel sideways, and Roberts later said the affidavit lacked context for that statement.
Still, Craig explained why such claims are not believable by likening the IFE channel to a one-way street. The channel allows the flow of information from the navigation system to displays that show passengers where the aircraft is located. Data cannot flow in the opposite direction.
In the affidavit, the FBI stated that Roberts volunteered the information to the bureau, because he “would like the vulnerabilities to be fixed.” Indeed, Roberts says today, he had been trying to get the attention of the airlines and manufacturers since 2013.