Threat and Error Avoidance System (TEAS)

Introduction

In recent years, the fatalities caused by aircraft malfunctions have drawn attention more than ever before, and more specifically the measures pilots should have taken when dealing with those malfunctions or mishaps.

Our idea is to mitigate this problem by using artificial intelligence (AI) technology as an advisory fallback for pilots. The system we are proposing is called the Threat and Error Avoidance System (TEAS). The system provides three facilities: analysis of the history of problems faced by other aircraft; analysis of external sources such as weather, media, and current events; and interaction with other systems inside and outside the aircraft.

TEAS acts as an advisory system that feeds the pilot quick and reliable solutions to enable faster and smarter decision making. Our AI technology continuously analyses all input variables to calculate the best possible outcome for any given situation, lessening the likelihood of a wrong decision being made. In this report, we discuss the problem in greater detail and show how our artificial intelligence technology can make substantial advances in the aircraft industry.

Problem Identified

History

One of the most reputable insurance companies recently published its findings on aviation accidents: an astonishing 85% of aircraft accidents are caused solely by pilot error. As a result, much more emphasis is now being placed on pilots needing better qualifications, training, and focus (Hope, 2015).

These figures are not mere statistics: they correspond to some of the most devastating crashes in history, which have claimed hundreds of lives. Notable examples include Air France Flight AF447, TransAsia Flight 235, Eastern Air Lines Flight 401, United Airlines Flight 173, KLM Flight 4805, the Polish Air Force Tu-154, Adam Air Flight 574, Helios Airways Flight 522, American Airlines Flight 587, and Asiana Airlines Flight 214 (Kebabjian, n.d.).

Every one of these aircraft had a very real chance of landing safely, but the stress a pilot is under when choosing the next best step, often within a 10-20 second decision window, is what causes most of these crashes. That said, hardware failure remains a factor: faulty equipment, weather-related flying conditions, and similar causes account for roughly 20% of all accidents (Haq, 2013).

Classifications of Pilot Error

We are all well aware that pilots work in a complex environment and are regularly exposed to situational stress during their time in the air. While they are highly efficient and excellent decision makers, they often cannot sustain those attributes during a critical situation.

Pilot error is commonly defined as a decision, action, or inaction by a pilot or their crew that contributes to an accident or incident (Rural and Regional Affairs and Transport References Committee, 2013). Pilot errors are mostly classified in terms of threats (external or internal), errors (physiological or psychological), and decision making (Boone, 2014).

Current Management of the Problem

Threat and Error Management

Maurino (2005) defines threats as “events or errors that occur beyond the influence of the air traffic controller, increase operational complexity, and which must be managed to maintain the margins of safety”. Threats, like human errors, can arise unexpectedly. The current framework, Threat and Error Management (TEM), recognises that threats cannot be mitigated by operational personnel: they are considered unavoidable and can only be managed. Air traffic controllers can anticipate and react to threats using countermeasure strategies.

Errors, on the other hand, are spontaneous; they can be linked to threats or form part of an error chain. If a threat is not resolved in the threat segment, it travels down to the error segment of the system. There are three error categories: equipment handling errors, procedural errors, and communication errors. Error types are classified by the air traffic controller’s primary interaction at the moment the error is committed: incorrectly executing a procedure is a procedural error, while miscommunication with a pilot or another controller is a communication error. If the problem is still not resolved, undesired states are activated within the system; these result from ineffective threat and/or error management and may lead to compromised situations and reduced margins of safety in ATC operations (Maurino, 2005).
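To make the chain the TEM framework describes more concrete, the minimal sketch below renders the threat → error → undesired state progression as a small state machine in Python. This is our own illustrative rendering: the class names and the single `managed` flag are simplifying assumptions, not part of Maurino's published framework.

```python
from enum import Enum

class ErrorCategory(Enum):
    """The three TEM error categories."""
    EQUIPMENT_HANDLING = "equipment handling error"
    PROCEDURAL = "procedural error"
    COMMUNICATION = "communication error"

class TEMState(Enum):
    THREAT = 1           # beyond the controller's influence; can only be managed
    ERROR = 2            # an unmanaged threat propagates into the error segment
    UNDESIRED_STATE = 3  # unmanaged errors reduce the margins of safety

def progress(state: TEMState, managed: bool) -> TEMState:
    """Advance one step of the TEM chain: an unmanaged threat becomes an
    error, and an unmanaged error becomes an undesired state."""
    if managed or state is TEMState.UNDESIRED_STATE:
        return state
    return TEMState(state.value + 1)

# A managed threat stays a threat; an unmanaged one moves down the chain.
assert progress(TEMState.THREAT, managed=True) is TEMState.THREAT
assert progress(TEMState.THREAT, managed=False) is TEMState.ERROR
```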

Our Solution

Artificial Intelligence

Artificial Intelligence (AI) has been defined as “the art of creating machines that perform functions requiring intelligence when performed by people” (Kurzweil, 1990); however, there are multiple definitions, as various AI researchers interpret the field differently. “AI strives to build intelligent entities as well as understand them” (Russell & Norvig, 1995). From these we take a general definition: AI is the study of designing and developing intelligent systems that analyse the best possible solution for a given situation. In aviation history, research by the National Aeronautics and Space Administration has found that 70% of aviation accidents involve human error. Errors can arise from many factors, including fatigue, workload, imperfect communication, flawed decision making, and psychological factors (Helmreich, 2000).

There have been several previous attempts to enhance situational assessment. The Cockpit Assistant System (CASSY) was developed to mimic a co-pilot’s responsibilities: it maintains an assessment of the situation and makes decisions based on the flight information (Onken, 1997). However, a significant flaw of CASSY is that it only makes decisions based on flight information; it does not incorporate pilot data or analyse and re-evaluate the history of past accidents. CASSY can be thought of as an add-on: it does not interact with all the other existing systems within the aircraft and therefore cannot fully understand the whole system.

AI Techniques used

We propose to use various AI techniques, including fuzzy logic, neural networks, and expert systems. Fuzzy logic deals with propositions that are not simply true or false but hold to a degree. In the context of our problem, pilot factors such as tiredness, which degrade general decision making, are fuzzy, and so are the environmental effects on problem solving. By using TEAS, we can reduce the amount of fuzziness involved in making decisions: the system provides the pilot with a set of instructions given the circumstances, and the pilot then adopts the choice they judge best. Another technique under consideration is neural networks, which model a brain-like structure of interconnected neurones and are used to solve problems that require mapping input data to output data. We plan to look at robotics and swarm automation technologies if TEAS is eventually allowed to take control, but at this stage we know too little about how reliably our AI behaves. We are also looking into other systems and techniques to make our AI work better with existing aircraft controls, namely fly-by-wire (electrically signalled control systems) and inference engines (reasoning over a knowledge base) (Aplak & Türkbey, 2013).
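To make the fuzzy-logic idea concrete, here is a minimal sketch that grades the decision-making risk posed by pilot fatigue using triangular membership functions. The variable names, thresholds, and the min-combination rule are our own illustrative assumptions, not figures from any certified avionics system.

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def fatigue_risk(hours_awake: float, workload: float) -> float:
    """Fuzzy estimate (0..1) of decision-making risk from fatigue.
    hours_awake: hours since the pilot last slept (assumed input).
    workload: subjective workload on a 0-10 scale (assumed input)."""
    tired = triangular(hours_awake, 10, 18, 26)  # 'tired' peaks at 18 h awake
    busy = triangular(workload, 4, 8, 12)        # 'high workload' peaks at 8/10
    # A simple fuzzy AND (minimum) combines the two memberships.
    return min(tired, busy)

# 16 hours awake under heavy workload yields an elevated risk score.
print(fatigue_risk(hours_awake=16, workload=9))  # -> 0.75
```

A real implementation would defuzzify many such memberships into a concrete instruction list; the point here is only that graded inputs like tiredness need not be forced into a true/false mould.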

Our System – TEAS

Currently, the biggest reason aircraft around the world crash is pilot error. From the 1950s to the 2000s, pilot error made up 53% of all aircraft crashes (Statistics, n.d.). This figure is hugely concerning, especially because it could so easily be reduced. What allows the Threat and Error Avoidance System to address human error is its ability to produce calculated decision outputs based on historical data, such as issues pilots have dealt with in the past and challenges that are likely to cause an aircraft to crash. For example, given external elements such as weather, the system can recommend the speed a pilot should take off at, outputting a desired speed, altitude, and route.
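As a sketch of how such a historically informed advisory might be produced, the rules and numbers below are invented for illustration; in a real system they would be distilled from the aircraft's performance tables and the incident-history database.

```python
from dataclasses import dataclass

@dataclass
class Advisory:
    speed_kts: float    # recommended take-off speed
    altitude_ft: float  # recommended initial altitude
    route: str          # recommended departure route (names are hypothetical)

def takeoff_advisory(headwind_kts: float, runway_wet: bool) -> Advisory:
    """Return a take-off advisory from simple, illustrative rules."""
    speed = 150.0                # hypothetical baseline rotation speed
    if runway_wet:
        speed += 10.0            # assume wet-runway history favours more speed
    speed -= 0.5 * headwind_kts  # headwind lowers the required ground speed
    route = "STANDARD-1" if headwind_kts < 25 else "ALTERNATE-2"
    return Advisory(speed_kts=speed, altitude_ft=5000.0, route=route)

print(takeoff_advisory(headwind_kts=12, runway_wet=True))
# Advisory(speed_kts=154.0, altitude_ft=5000.0, route='STANDARD-1')
```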

Another facility is the timely incorporation of media and current events into the system. Had this facility already existed, Malaysia Airlines Flight MH17, shot down in Ukrainian airspace, could have been saved, as the pilots would have been well aware that this was an off-limits flying zone. TEAS would have knowledge of, and access to, external factors. External factors include events such as hijackings, which occur from time to time. Having access to and knowledge of these factors means the AI system can alter its calculated possible outcomes based on the likelihood of such events occurring. This could have been very useful when aircraft were hijacked and crashed into the World Trade Center: with such knowledge and access, the AI could possibly have taken control to stop sudden descents or departures from the flight path, thereby helping avoid one of the most catastrophic events in history, which claimed hundreds of lives.

TEAS also continually interacts with external systems such as weather stations and satellites to get up-to-date information for presenting the best possible flight paths or steps to take in an emergency. Combining data on the drag experienced by an aircraft during a storm with information about the storm itself could possibly help avoid crashes. TEAS also analyses every move a pilot makes in specific situations, building historical data on ideal moves for a given situation, and it holds analysed data on past crashes and the steps a pilot could have taken to avoid them, giving it more information from which to derive the best possible outcomes. Most of the data is stored centrally, allowing other aircraft to gain vital information for their own safety and for the safety of the aircraft they draw information from; the systems act in sync, giving each pilot a significant advantage. TEAS also interacts well with existing systems on board, one good example being ACAS, the Airborne Collision Avoidance System.
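The sketch below shows one way TEAS might periodically fold an external feed into its advisory loop. The endpoint URL and the response fields are placeholders we invented; a deployed system would integrate certified aviation weather services and the on-board data buses.

```python
import json
import time
import urllib.request

# Placeholder endpoint, not a real service.
WEATHER_URL = "https://example.com/weather?lat={lat}&lon={lon}"

def fetch_weather(lat: float, lon: float) -> dict:
    """Fetch current conditions for a position (response fields assumed)."""
    with urllib.request.urlopen(WEATHER_URL.format(lat=lat, lon=lon)) as resp:
        return json.load(resp)

def advisory_loop(position: tuple, interval_s: int = 60) -> None:
    """Re-evaluate the advisory each time fresh external data arrives."""
    while True:
        weather = fetch_weather(*position)
        if weather.get("storm_ahead"):  # assumed field name
            # A real system would combine live weather with stored drag and
            # incident history to propose an alternative path.
            print("Advise: deviate around storm cell")
        time.sleep(interval_s)
```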

Finally, this incorporation of external systems keeps TEAS up to date with every element that could affect a pilot’s decision making, substantially mitigating the problem altogether. We estimate the cost of our system at around 4-5 million for development, with operating costs of 10-15 million per year to start with.

Possible Implications

We could face a few adverse effects from implementing such a system. Firstly, as with any technology, when information is stored in a system and care is not taken, security can be breached by external parties in the form of hacking (Daly, Jeziorski, & Sedbon, 1994). Such a violation could be extremely dangerous if the information is used by the wrong people, for example in acts of terrorism. This problem can only be mitigated by having the strongest security measures in place: encryption, constant monitoring, and updates by an in-house security team.

A second implication concerns the system’s accuracy and reliability. Before implementing it in the real world, multiple simulations need to take place beforehand. By doing this, we can see how well the system works, what faults arise from it, and how those faults can be fixed. Finally, pilots also need to undergo training to fully comprehend this technology.

Since the majority of airlines would be on board with this idea of preventing crashes, we are likely to face privacy concerns, because all airline data is unified and used for processing. We hope to overcome this privacy issue through contracts that heavily penalise us for breaches, such as non-disclosure agreements. Another safeguard is encrypting the data with the latest technologies and the strongest keys: even if the data falls into the wrong hands, it is of little or no use, owing to the encryption and the time required to break a long key.
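As a minimal sketch of the encryption safeguard, the snippet below uses the Fernet recipe from the widely used Python `cryptography` package; the record contents are invented, and real key management (hardware security modules, rotation, access control) is the hard part we omit here.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would live in a key-management service or HSM,
# never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"flight": "NZ0123", "pilot_inputs": "..."}'  # illustrative record
token = cipher.encrypt(record)           # ciphertext is safe to store centrally
assert cipher.decrypt(token) == record   # only key holders can recover it
```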

Last but not least is the idea of the computer taking over control with the wrong intentions. Since this is an AI system, there is a very real possibility of the system thinking like a human. To avoid real-life Terminator situations, we have placed TEAS in an advisory role only at this stage. This means that TEAS cannot alter a flight path or take the controls from the pilot; it can only offer the best possible steps for the pilot to take. If the pilot chooses to ignore a solution provided by TEAS, TEAS will recalculate another solution for the altered course of action. We are sure more implications will emerge as we dig deeper into building a prototype, but at this stage these were the only ones we could identify that could cause significant damage if not dealt with correctly.

Conclusion

As discussed above, pilot errors account for an astonishing 85% or so of air crashes. We saw that industry experts categorise these into environmental and airline threats, physiological and psychological mistakes, and errors in decision making. We also looked at the current method of dealing with pilot error, the extensive use of TEM and other systems, and found it inadequate: there is a need to develop a better system. We aim to solve this problem by developing our Threat and Error Avoidance System (TEAS), which should assist in resolving the majority of pilot errors in all the areas listed above. We are using AI techniques such as expert systems, neural networks, fuzzy logic, and other systems to help build TEAS. We hope to reduce that 85% of errors made by pilots in their highly stressful environment by making it easier to reach decisions based on millions of variables that a human brain could not have analysed in time.

References:

Aplak, H. S., & Türkbey, O. (2013). Fuzzy logic based game theory applications in multi-criteria decision-making process. Journal of Intelligent & Fuzzy Systems, 25(2), 359-371. doi:10.3233/IFS-2012-0642

Arnoult, S. (2009). The Problem That Won’t Go Away. Air Transport World, 46(8), 26-31.

Boone, P. (2014, March). Pilot errors from a pilot perspective. Retrieved April 5, 2016.

Daly, K., Jeziorski, A., & Sedbon, G. (1994). Intelligent Conversation. Retrieved from https://www.flightglobal.com/pdfarchive/view/1994/1994%20-%202003.html

Donnelly, S. B. (2002). Is Your Airline Pilot Ready for Surprises? Time, 160(16), 11.

Haq, H. (2013, May 22). How human error can cause a plane crash. Retrieved April 6, 2016, from http://www.bbc.com/travel/story/20130521-how-human-error-can-cause-a-plane-crash

Helmreich, R. L. (2000). On error management: Lessons from aviation. BMJ, 320(7237), 781-785. doi:10.1136/bmj.320.7237.781

Hope, S. (2015, January 7). 85% of aviation accidents are caused by pilot error. AvBuyer. Retrieved April 2, 2016, from http://www.avbuyer.com/articles/business-aviation-insurance/85-of-aviation-accidents-are-caused-by-pilot-error/

Howard, C. E. (2016). Common technologies for manned and unmanned aircraft. Military & Aerospace Electronics, 27(2), 15-22.

Hui, Y., & Qin, Y. (1996). An artificial intelligence system of trouble diagnosis for aircraft engines. Computers & Industrial Engineering, 31(3/4), 797.

Kebabjian, R. (n.d.). Accident Statistics. Retrieved April 5, 2016, from http://www.planecrashinfo.com/cause.htm

Kilingaru, K., Tweedale, J. W., Thatcher, S., & Jain, L. C. (2013). Monitoring pilot ‘Situation Awareness’. Journal of Intelligent & Fuzzy Systems, 24(3), 457-466.

Kurzweil, R. (1990). The Age of Intelligent Machines. Cambridge, MA: MIT Press.

Maurino, D. (2005, April). Threat and error management (TEM). Paper presented at the Canadian Aviation Safety Seminar, Vancouver, Canada. Available from flightsafety.org/archives-and-resources/threat-and-error-management-tem

Onken, R. (1997, June). The cockpit assistant system CASSY as an on-board player in the ATM environment. In Proceedings of the First Air Traffic Management Research and Development Seminar, Saclay, France. Retrieved from http://www.atmseminar.org/seminarContent/seminar1/papers/p_024_ASSP.pdf

Poole, D., Mackworth, A., & Goebel, R. (1998). Computational Intelligence: A Logical Approach. New York: Oxford University Press. ISBN 0-19-510270-3.

Rural and Regional Affairs and Transport References Committee (2013, May). Aviation Accident Investigations. Retrieved April 6, 2016, from http://www.aph.gov.au/Parliamentary_Business/Committees/Senate_Committees?url=rrat_ctte/completed_inquiries/2010-13/pel_air_2012/report/report.pdf

Russell, S., & Norvig, P. (1995). Artificial Intelligence: A Modern Approach. Englewood Cliffs, NJ: Prentice Hall. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.259.8854&rep=rep1&type=pdf

Russell, S. J., & Norvig, P. (2003). Artificial Intelligence: A Modern Approach (2nd ed.). Upper Saddle River, NJ: Prentice Hall. ISBN 0-13-790395-2.

Statistics. (n.d.). Retrieved April 14, 2016, from http://www.planecrashinfo.com/cause.htm

Author

Roshan Roy Jonnalagadda – University of Auckland – ISOM Student