Regulatory Authorities and Air Safety - the case of the Boeing 737 Max 8

This article deals with the above topic using the two Boeing 737 Max disasters as real-life examples of inadequate supervision.

The regulators FAA (USA) and EASA (Europe) will soon receive articles of their own, as their lack of activity extends to other areas as well.

Here, the failure of the FAA and EASA is reconstructed only in the context announced above.


Systems of (ir)responsibility

Flying is considered one of the safest ways to travel. The figures bear this out: in 2019, 1.35 million people worldwide died in road traffic, while aircraft disasters killed 240 (2018: 523).

If, for the sake of better comparability, the number of deaths is related to the distance travelled, the picture is the same: for every billion kilometres travelled by air, 0.003 deaths occur. The figure is 2.9 for cars, 30 for bicycles and 53 for motorbikes, i.e. considerably more. Travelling by train (0.03) is somewhat more dangerous than flying. Ships fare best: 0.00001 deaths per billion kilometres travelled. These are the calculations of the portal Flüge.de.
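How large these differences really are becomes clearer if each mode of transport is expressed as a multiple of the risk of flying. The following short calculation is a purely illustrative sketch that uses only the figures quoted above; it is not part of the original source.

    # Deaths per billion passenger-kilometres, as quoted above (source: Fluege.de)
    deaths_per_billion_km = {
        "ship": 0.00001,
        "plane": 0.003,
        "train": 0.03,
        "car": 2.9,
        "bicycle": 30,
        "motorbike": 53,
    }

    baseline = deaths_per_billion_km["plane"]
    for mode, rate in deaths_per_billion_km.items():
        print(f"{mode:10s} {rate / baseline:12.3f} x the risk of flying")

Per kilometre travelled, the car is thus roughly a thousand times more dangerous than the plane, and the motorbike more than seventeen thousand times.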

The fact that flying does so well is due to the principle of "learning from mistakes". Over the decades, countless pilots and others have pointed out weaknesses, and the aviation industry and regulatory authorities have (mostly) reacted. We have reconstructed this in another context in the chapter "Meldesysteme in der Luftfahrt - Whistleblower auch hier unverzichtbar" (reporting systems in aviation - whistleblowers are indispensable here too; in German).

However, this quality and safety principle was not and is not always observed. Many disasters could have been avoided. And every death is one too many.

Sometimes it is the aircraft manufacturers, sometimes the airlines, and above all the supervisory authorities who bear responsibility for the deaths of those killed in crashes (pilots, flight attendants, passengers). For it is they who (in principle) decide what is and is not permitted.

In the USA, the supervisory authority for civil aviation is the Federal Aviation Administration, FAA for short. In Europe it is EASA, the European Union Aviation Safety Agency.

Examples: Boeing 737 Max 8

Recent cases (as of October 2020) of irresponsible behaviour: the crashes of two Boeing 737 Max 8 aircraft

  •     29 October 2020 marks the second anniversary of the first disaster: 189 deaths.
  •     The second crash, only a few months later on 10 March 2019, killed 157 people.
  •     Together, 346 human lives were lost, not counting the human suffering and the subsequent financial hardship of the bereaved.

The Boeing 737 Max 8 had been in service for only a short time. The ensuing disaster appears typical of the fatal pattern: profit considerations and cost pressure at the manufacturer and the airlines, a blind eye and a failure to act at the supervisory authorities FAA and EASA. Added to this: disinterest on the part of most politicians, even though everyone regularly claims otherwise. If it were as they pretend, some of the people who now lie beneath masses of sea water, or whose bodies were torn apart and hurled by the enormous force of the impact as the aircraft drilled into the ground, would still be alive.

And many more people would not have been exposed to the risk of a fatal crash all this time.

The causes of the two Boeing 737 Max 8 disasters

The fact that an aircraft type crashes twice within a short period is unusual, and it contributed to the US FAA, which is responsible for all Boeing approvals in the first place, convening an international committee of aviation experts to examine the causes of the crashes. When 'outsiders' are on board, there is less of a circle-the-wagons mentality and therefore, as is well known, less scope for cover-ups. And obviously that is exactly what the FAA (this time) wanted to avoid. After all, according to Boeing's plans the "Max 8" was to become a revenue and profit driver for the US aircraft manufacturer, in order to counter the competing European model, the Airbus "A320neo". Since a double disaster, for essentially the same reasons, amounts to a total loss of image and, above all, of confidence in an aircraft type, the FAA (this time) apparently wanted to be on the safe side. Belatedly.

These are the most important findings of the 71-page final report:

  • Competitive thinking hinders thoroughness and precision, and promotes haste and sloppiness

    The European joint venture Airbus had long since scored a success with the "A320neo" and left Boeing behind in this prestigious competition; the US manufacturer was forced to catch up. Developing a completely new model would have taken too long and cost too many billions of dollars. The engineers and management therefore came up with the idea of remodelling an existing but ageing successful model, the Boeing 737.

    In particular, the old type was to be given new, larger engines. The larger their diameter and fan blades, the more fuel can be saved. Because the now larger and heavier engines no longer fitted under the existing wings, it was not only necessary to flatten the usual circular nacelle shape into an oval, but above all to mount the engines further forward on the wing. The result: changed aerodynamics.

    In concrete terms: when taking off, the nose of the jet now tends to pitch up far too steeply, so that in the worst case a stall can occur; the aircraft then loses lift and crashes.

    To prevent this, the developers created a new piece of software, "MCAS", short for "Maneuvering Characteristics Augmentation System". It is supposed to counter-steer automatically, pushing the nose of the aircraft back down.

    Contrary to another standard principle, namely that all safety-relevant systems in an aircraft must be redundant, Boeing made MCAS rely on a single sensor, which signals to the computer whether the aircraft is pitched too steeply and needs to be counter-steered. If this one sensor fails or delivers an incorrect value, there is a problem.

    This was exactly the problem in the two Boeing 737 Max 8 disasters. A false reading sent to the on-board computer and the cockpit pushed the aircraft into a steep descent; the pilots countered manually, several times, but the software kept the upper hand: in both cases the aircraft plunged nose-first (a simplified sketch of this single-sensor logic follows after this list of findings).

  • Concealment of safety information

    When pilots are to fly a new type of aircraft, they have to learn a great deal. If an existing model is remodelled, with changed aerodynamics and flight behaviour, pilots must know what is different and how it works.

    This is exactly what was not communicated, or was deliberately concealed in the aircraft's manual: neither the peculiarities of the new software, nor the fact that it relies on a single sensor, nor that the changed flight characteristics should sensibly have been practised in the simulator. Simulator hours cost time and, in real terms, money for the airlines.

  • Deregulation of supervisory duties: supervision of Boeing by Boeing instead of by the FAA

    We know from many examples what can happen when highly sensitive institutions do not permit a culture of criticism and error, but instead evade any kind of supervision. Take Fukushima (Japan), for example, where a nuclear power plant exploded in 2011. In the "Nuclear Village" (as the sector calls itself internally) there was a constant exchange of personnel: nuclear power plant employees became state supervisors and official inspectors switched to the nuclear company, a constant back and forth. Another example: the car manufacturer VW (Germany), whose name is synonymous worldwide with a huge environmental fraud. Internal whistleblowers had given early warnings, but they were stopped 'from the top'. VW's own compliance department failed completely because it apparently had (or wanted) no direct access to the board of management and/or the supervisory board.

    Boeing, too, did not think much of a culture of criticism and error, or of neutral supervision. The company's long-serving boss, Dennis MUILENBURG, was chairman of the board of directors and at the same time chief executive until October 2019, i.e. until the publication of the expert report (see above). MUILENBURG supervised himself, so to speak.

    Things were not much different at the supervisory authority FAA. As early as 2009 it had transferred supervision almost completely to Boeing. Around 1,500 Boeing employees, engineers and technicians, were now responsible for supervisory and approval procedures: on behalf of the FAA, but paid by Boeing.

    And at Boeing, criticism and safety thinking were not on the agenda.

  • No whistleblowers. Instead: a climate of intimidation and fear

    Some 50,000 engineers work for Boeing. Only one (one in fifty thousand) dared to point out problems early on.

    That was back in 2014, four years before the first disaster, when the whistleblower Martin BICKEBÖLLER, an American with German roots and a former student at LMU Munich, contacted the FAA and pointed out quality problems in aircraft production. Although the FAA noted that "FAA regulations regarding airline safety had been violated", not much more happened.

    For the whistleblower, however, something did happen. Boeing rated his work worse and worse, and after 20 years BICKEBÖLLER was pushed out of his post into an insignificant one. A clear signal to the other 50,000 engineers. The supervisory authority FAA did not react to this and simply accepted the whistleblower's professional fate.

    Martin BICKEBÖLLER was not satisfied and took action again. This time, out of disappointment with the FAA, he turned to the European Union Aviation Safety Agency EASA, and at the same time to the US Congress. This concerned the "Dreamliner", the Boeing 787. We do not know what became of it.

    But we do know from internal correspondence, which Boeing had to publish in response to public pressure, how the management thought and acted:

    "We will take on any regulator" who imposes (further) conditions.

    This kind of corporate culture can be seen in the transparency initiative that Boeing itself published in early 2020: 117 pages of "Internal Boeing communications about the 737 Max".

    But even the people actually building the aircraft, the thousands of engineers and technicians, were apparently well aware of the problems: "Would you put your family in an aircraft whose pilots were trained on a MAX simulator? I would not." Or: the Max 8 was "designed by clowns, who in turn are supervised by monkeys" (e.g. page 84 of the email compilation). This is how, or in similar terms, those directly involved in the production of the Max 8 communicated.

    Among the approximately 1,200 aircraft experts who inspected and approved the aircraft on behalf of the FAA, there was obviously not a single person who informed the supervisory authority that was actually responsible. But that is what happens when everyone just does his or her own small-scale job and does not think beyond it. Obviously this is standard practice at Boeing and at the FAA.
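
To make the single-sensor problem described in the first finding above more tangible, here is a deliberately simplified, purely illustrative sketch. It is not Boeing's actual MCAS software; all names, thresholds and values are invented. It merely shows why logic that trusts a single angle-of-attack sensor keeps commanding the nose down when that one sensor is stuck, while a cross-check of two sensors can at least detect the disagreement and hand control back to the pilots.

    # Simplified, invented illustration of the single-sensor problem described above.
    # NOT Boeing's real MCAS code; all names, thresholds and values are assumptions.

    STALL_ANGLE = 15.0     # hypothetical angle of attack (degrees) treated as "too steep"
    DISAGREE_LIMIT = 5.0   # hypothetical maximum allowed difference between two sensors

    def trim_single_sensor(aoa: float) -> str:
        # Trusts one sensor: a sensor stuck at a high value triggers nose-down trim forever.
        if aoa > STALL_ANGLE:
            return "trim nose down"
        return "no action"

    def trim_cross_checked(aoa_left: float, aoa_right: float) -> str:
        # With two sensors, a large disagreement can be detected and the automation disabled.
        if abs(aoa_left - aoa_right) > DISAGREE_LIMIT:
            return "sensor disagreement: disable automatic trim, alert crew"
        if (aoa_left + aoa_right) / 2 > STALL_ANGLE:
            return "trim nose down"
        return "no action"

    # A defective sensor stuck at 74 degrees while the aircraft is actually flying normally:
    print(trim_single_sensor(74.0))        # -> "trim nose down", again and again
    print(trim_cross_checked(74.0, 2.0))   # -> disagreement detected, automation disabled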

The role of EASA: knowing, but not warning

 "Your safety is our mission" is the nimble slogan that EASA itself communicates. The FAA does not do it that directly.

EASA was aware of the serious shortcomings of the MCAS system shortly after the first disaster. And warned no one. Not the European airlines, not the pilots, not the cabin crews, not the citizens, not the politicians. No one.

Except EASA's own management. Its representatives would probably not have boarded such an aircraft themselves.

That was already before the second disaster.

And only after that did the first countries issue a flight ban on the Boeing 737 Max 8. The USA was one of the last countries to show this aircraft the red card.

The fact that EASA had been aware of the software problems and the inadequate training of pilots immediately after the first disaster is something it had to admit to a committee of the European Parliament after the second crash, that of Ethiopian Airlines.

"An air safety authority which only classifies a software error as a risk when two aircraft have already crashed represents a risk for the citizen himself", Markus FERBER, the transport policy spokesman of the European CSU, stated in the EU Parliament.

Whether this statement had any effect on EASA, whether EASA itself saw a need for action and, if so, whether it drew the necessary consequences, i.e. introduced and implemented changes, is what we want to find out with an inquiry to EASA. And we are curious to hear the response of the European aviation supervisory authority, which, as it says and writes, is concerned about our safety.

The FAA's calculation: 15 fatal events

The only thing the first crash on 29 October 2018 prompted the FAA to do was an analysis, a kind of calculation, of how many crashes would have to be expected over the life cycle of a future fleet of 4,800 Boeing 737 Max 8 aircraft. The result of this "Random Transport Aircraft Risk Analysis (R-TARA)": 15 "fatal events".
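Expressed as simple arithmetic, and using only the two figures named here (4,800 aircraft, 15 expected "fatal events"), the FAA's own projection amounts to roughly one fatal crash per 320 aircraft lifetimes. A minimal sketch of this back-of-the-envelope reading follows; the FAA's actual risk model and its per-flight assumptions are not reproduced here.

    # Back-of-the-envelope reading of the two figures quoted above; the FAA's
    # actual risk model and its per-flight assumptions are not reproduced here.
    fleet_size = 4800            # projected future fleet of Boeing 737 Max 8 aircraft
    expected_fatal_events = 15   # result of the FAA risk analysis

    print(expected_fatal_events / fleet_size)   # ~0.003 expected fatal events per aircraft lifetime
    print(fleet_size / expected_fatal_events)   # i.e. roughly one fatal crash per 320 aircraft lifetimes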

The only reaction: a circular to all airlines pointing out that a faulty sensor reading ("faulty angle-of-attack (AOA) indicator") could cause the nose of the aircraft, and with it the entire aircraft, to be pushed down. There was no mention of the counter-steering MCAS software.

On 10 March 2019 the second "fatal event" occurred.

And only then were all aircraft of this type grounded.

For the time being.


Notes

You can access this text directly and link to it at www.ansTageslicht.de/Boeing737Max8.

We will document EASA's answers in a separate text ("chapter") in this ABC on fume events, under the heading "EASA".

At the Hamburg University of Applied Sciences (HAW), to which the ansTageslicht.de project is also linked, technical research is carried out as well: in the Aircraft Design and Systems Group (AERO), headed by Prof. Dr.-Ing. Dieter SCHOLZ. In September 2020 he gave a lecture at the German Aerospace Congress that deals with the fundamental problem: "Aviation Ethics - Growth, Gain, Greed, and Guilt"; its contents can be read here in PowerPoint form.

(JL)