
Hacking Airplanes…Let's Think About This

Recent news about airplane security, and whether or not someone took control of an airplane during a flight, is scattered across the web. There are lots of opinions on whether the in-flight entertainment systems and the airplane control systems are connected. I haven't tested an airline system, so I can't say for sure, and it may differ depending on the type of plane. One glaring issue here is that we don't know, and there are a lot of people who don't know either while acting as if they do. Is airplane security a concern? Of course it is; what security isn't a concern? So what is the right approach to having it tested?

United Airlines recently announced a bug bounty program. For those who may not know, a bug bounty program is set up by a company to recognize or reward security testers for identifying security bugs in its applications. Some of the big names like Google, Facebook and Twitter have been doing this for a while now. While not something every organization is prepared for, it can help identify security bugs in your applications, although many of these flaws should be identified internally by developers and QA before release to production. Any average person can participate in most bug bounties, no skills required (we won't dig into that for this piece).

What seems to be interesting with the United program, at least from what we see on Twitter, is the concern that the airplane and in-flight systems are out of scope. This means that while you can test United's external applications, they are NOT giving permission for anyone to test the airplane systems during a flight. Airline security has been propelled into the spotlight recently with stories like "GAO: Newer aircraft vulnerable to hacking" and Chris Roberts tweeting about it on a plane and then getting questioned by authorities for hours upon arrival.

Does United have it right by banning hacking on the plane? But what about the children, you say? First off, without permission, you shouldn't be security testing something that isn't yours. I know there is lots of debate around this topic, but let's just get the permission thing out of the way. I understand: if the systems are not safe, then the issue should be addressed. Many will tell you that the only way to know if a system is safe is to have any Joe Blow out there firing away at it, and that if telling the airline about it doesn't get them to fix it, then doing something a bit more rash is needed "for your safety". Be prepared: when it comes to public disclosure of flaws with working exploits that are not patched, "YOU" are the collateral damage.

Let's get real here for just a moment and recognize that things that happen on computers DO have real consequences. Messing around on a website that exposes sensitive information is bad enough, but to think that allowing anyone to attempt hacking a plane to look for security vulnerabilities at 30,000 feet is a good idea is just ludicrous. You are directly and immediately putting the lives of everyone on that plane at risk. Maybe you should hold a vote to see who is OK with you attempting this. After events such as 9/11, I don't think you want to announce you are hacking the plane… you may find yourself duct-taped to a chair and bruised up a bit for the remainder of the diverted flight.

In the professional world of security, when we want to test the security of something like this, we seek out the vendor and get a contract that outlines what testing will be done. Obviously this requires the vendor to agree to the contract and the testing. In this scenario, the testing would most likely be done in an airplane in a hangar at the airport, not at 30,000 feet with other passengers on board. If you are unable to get the vendor to commit to a contract for testing, then hopefully making people aware of the potential issue and the risks they assume by using that vendor could be enough to force the vendor into it. In our market, when people stop using a service, the vendor starts listening.

In the case of United, and hopefully any other airline that decides to open a bug bounty, I think they are making a good decision in not opening up a bounty on the airplane systems. Of course these systems are critical, especially since they keep the plane safe in the air, but we need to make intelligent decisions about how things get tested. This decision by United does not seem to be an attempt to silence "researchers" about potential security vulnerabilities in the airplane; it is a move to keep people safe during a flight. We have ways to test, as mentioned earlier, under contract in a controlled environment; we don't have to do it in the air with other passengers. It is also a smart decision not to open a bug bounty on those systems, because with critical systems like this you want to ensure that only trained experts are assessing them: someone who understands the fragility of the environment, the way it works, and the things that shouldn't be done. If you start letting John in 34C, who just learned what Metasploit is, fire exploits at a system ad hoc, you are asking for a world of hurt.

If you really want to test the security of an airplane and its flight controls, pony up and buy a plane to do the testing. We see this with the researchers testing the security of cars: they get funded or pay out of their own pocket for a vehicle whose security they can test. Look at some of what they have done; it doesn't always go as planned. They are not hopping on a city bus and hacking it. They are not hopping on a train and attempting to hack it. They are doing their best to test in a controlled, safe environment.

Everything has security issues. There will never be a time when we don't have some security issue still around in a system. We should be glad that, despite recent events, the airlines have not banned electronic devices on airplanes… yet. If we keep making decisions that put people at risk with this type of "research", we will probably learn what "chilling security research" really means.