Before this incident, at least three or four accidents occurred where the pilot was at fault: they were pulling the stick back too hard, the stall warning was blaring "Stall! Stall!", but the pilots were too overloaded or incompetent to notice, and ultimately crashed the plane.<p>I remember cries here on HN asking why the automation didn't take over and save lives: "Pilots should never be allowed to stall the plane."<p>Well, Boeing's new software did exactly that: it corrected the situation when the pilot didn't. And that crashed the plane.<p>Interesting indeed. Lesson: there's a lot to consider before making a decision, even when the obvious decision is right in front of everyone.
This is part of an unsettling trend where companies prefer to keep users in the dark about the underlying tech of their products. For normal consumer tech, I can kind of understand, but what astonishes me is that this mindset has extended to aircraft, where the "users" are pilots of commercial aircraft!<p>The article's quote from a high-ranking Boeing official sums it up: <i>the company had decided against disclosing more details to cockpit crews due to concerns about inundating average pilots with too much information—and significantly more technical data—than they needed or could digest.</i>
> <i>Boeing is working on a software fix, according to industry and government officials, that would likely mitigate risks. On Saturday, the company went further than before in spelling out dangers pilots can face if they misinterpret or respond too slowly to counter automated commands.</i><p>So the issue seems to be that Boeing didn't even tell pilots and airlines that the auto-stall-prevention system had been added to new variants. I wonder if this software mitigation is something as simple as a warning screen or dialog box. If they're writing software, at this point, to fix/patch how the system actually functions, that implies they released a flawed system/heuristic, given that such a patchable flaw was discovered so soon after the Lion Air crash.
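One plausible shape for such a software mitigation is a cross-check between redundant sensors, with the automatic trim inhibited when they disagree. The sketch below is purely illustrative, not Boeing's actual fix; the function name, thresholds, and command strings are all assumptions for the sake of the example.

```python
# Illustrative sketch only -- NOT Boeing's actual control logic.
# Assumption: two redundant angle-of-attack (AoA) sensors. If their
# readings disagree beyond a threshold, the automatic nose-down trim
# is inhibited and the crew is alerted instead of the system acting
# on possibly bad data.

AOA_DISAGREE_LIMIT_DEG = 5.5   # hypothetical disagreement threshold
STALL_ONSET_AOA_DEG = 14.0     # hypothetical stall-onset angle

def trim_command(aoa_left_deg: float, aoa_right_deg: float) -> str:
    """Decide whether automated nose-down trim may activate."""
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        # Sensors disagree: suspect a fault, hand control back to pilots.
        return "INHIBIT_AUTO_TRIM_ALERT_CREW"
    if max(aoa_left_deg, aoa_right_deg) > STALL_ONSET_AOA_DEG:
        return "NOSE_DOWN_TRIM"
    return "NO_ACTION"
```

The point of the sketch is that a disagreement check is cheap to add in software, which is why a patch of this kind is credible so soon after the crash.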
Did I understand correctly that, basically, the angle-of-attack sensor was faulty, and because of that the flight-control software decided the plane was about to stall and pushed the nose down, all the way to the ground/sea?
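To make that failure mode concrete, here is a minimal sketch (hypothetical values, not the real control law) of how a single stuck-high sensor can drive an automated trim system nose-down on every control cycle, since the logic has no way to tell a real stall from a bad reading:

```python
# Hypothetical illustration of a single-sensor failure mode.
# A stuck-high angle-of-attack reading makes the anti-stall logic
# command nose-down trim on every cycle; the commanded trim keeps
# accumulating for as long as the bad reading persists.

STALL_ONSET_AOA_DEG = 14.0   # hypothetical stall-onset angle
TRIM_STEP_DEG = 0.6          # hypothetical trim applied per cycle

def run_cycles(sensor_readings_deg):
    """Total nose-down trim commanded from a single AoA sensor."""
    total_nose_down_trim = 0.0
    for aoa in sensor_readings_deg:
        if aoa > STALL_ONSET_AOA_DEG:   # logic trusts the lone sensor
            total_nose_down_trim += TRIM_STEP_DEG
    return total_nose_down_trim

# A sensor stuck at a falsely high 22 degrees for 10 cycles:
stuck_sensor = [22.0] * 10
print(round(run_cycles(stuck_sensor), 1))  # 6.0 degrees of unwanted trim
```

With a healthy sensor the same loop commands no trim at all; the danger comes entirely from trusting one faulty input.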
The book "Aviation Psychology: Practice and Research" by Klaus-Martin Goeters explains exactly this kind of situation, where the flight control system and the pilot are out of sync and don't know each other's intentions.
I fly out of Renton, where these aircraft are made, and the lot is overflowing. The faulty software may be an indicator of quality being sacrificed to roll the newer aircraft out fast enough to please shareholders. I hope that is not the case. It's not OK that Boeing didn't disclose the feature, but it's even more concerning that it wasn't caught during testing as a potential flaw.<p><a href="https://www.seattletimes.com/business/boeing-aerospace/737-problems-have-grown-in-renton-despite-boeings-reassurances/" rel="nofollow">https://www.seattletimes.com/business/boeing-aerospace/737-p...</a>
Pilots are expensive to train, expensive to maintain in terms of salary, and cost a lot afterwards with regard to pensions. I am sure airlines are doing everything they can to reduce these costs, and as a result competence is suffering. It used to be the case that air force pilots retired to become commercial pilots; now commercial pilots are trained straight out of high school. (I'm not sure this is entirely true, but bear with me, as it supports the point I want to make.)<p>Airlines are the final customers of Boeing, Airbus, etc. I am sure they want as much automation on a plane as possible to reduce training requirements and so decrease the cost of having pilots on the balance sheet.<p>The problem, I think, is that the abstraction that is the modern flight deck is not quite up to the job of dealing with poorly trained pilots or pilots with little experience of unusual situations. That gap was nicely covered by having military-trained people in the cockpit, for whom unusual situations are somewhat more "routine".<p>So what we are seeing is the mismatch of cost-constrained customers and the failings of technology in a situation where failures are less forgiving. It's the same story with automation that is being played out everywhere. The only difference, if you exclude x-ray machines, is that the stakes are higher.
I'm going to get a little meta here.<p>After the crash, reading the online comments about it (and the things said about Lion Air and the pilots) was pretty interesting given how things have turned out. It is also interesting how much play the initial discussion received relative to the follow-up stories about the safety bulletin and now the criticism from within the industry.<p>And when we finally do get an article on the safety issue, the top comment is focused on the pilots' supposed failings instead:<p><a href="https://news.ycombinator.com/item?id=18409041" rel="nofollow">https://news.ycombinator.com/item?id=18409041</a><p>Or trying to continue to blame Lion Air:<p><a href="https://news.ycombinator.com/item?id=18408540" rel="nofollow">https://news.ycombinator.com/item?id=18408540</a><p>I guess what I am saying is: there seems to be a deep unwillingness to criticize Boeing. This isn't recent or specific to this accident; Boeing is a very challenging topic to discuss without people getting tribal. Why is that?
Alternative story link: <a href="https://www.marketwatch.com/story/boeing-withheld-crucial-safety-information-on-new-737-models-experts-say-2018-11-12" rel="nofollow">https://www.marketwatch.com/story/boeing-withheld-crucial-sa...</a>
I would have thought Scandinavian Airlines Flight 751 was well enough known to make manufacturers think twice about sneaking in features without telling the pilots who fly them or the CAAs who certify them.