Thread by @www_ora_tion_ca: "This is wildly disingenuous, I speak as a flight instructor and major IT incident investigator. Modern software authors have the professiona […]"


Airplanes *are* under constant attack: by gravity, weather, system complexity, human factors, delivery processes, training consistency, traffic congestion, and these days even by software developers

But every time a disaster happens, we learn from it, publicly, and we share. We're still learning from crashes decades ago. Software developers? Bullshit.

Software is authored by an organization, including programmers, architects, technical writers, quality assurance, UX, and most importantly, management. They're all authors with a responsibility to implement and improve standards of professional discipline.

The PMP credential exists for a reason: even management can grasp the value of a shared body of knowledge for constructing and improving workflows and processes. Don't blame "management," they're trying.

Here's an excellent example - 35 years ago, an airline bought its first metric airliner, and management cancelled the project to update all the ground paperwork to metric. The plane ran out of fuel and its engines shut down in the air. 200-page report: data2.collectionscanada.gc.ca/e/e444/e011083…

Where's the detailed 200-page public report from Facebook on how their management failed to prevent major disinformation campaigns in the US election? There isn't one, because they're just not that mature.

And "try to write a standards document on how to win at chess"? Give me a break, my dude: there is one, it is software, it works, and you know that.

Sadly, @erratarob is making the same mistakes here. The objective in aviation is "safe transportation," not "preventing accidents" - a subtle wording difference, but an entirely different mindset at a much higher level. blog.erratasec.com/2018/08/that-x…
Similarly, the objective in elections is "Confidence in democracy" and not "stopping attackers," which the CSE clearly lays out as one of many fronts: cse-cst.gc.ca/sites/default/…

Simplistic focus on the machine and loss of perspective on the bigger system & society is the hubris that keeps the technology industry trapped in the footgun cycle.

I mean "Airplanes and elevators are designed to avoid accidental failures" come on have you never heard of fail-safe design? Elevators and planes fail *all the time* but they fail SAFE.

Since this is picking up steam, I want to be clear that it's not "engineering standards" or "way more money" that gives the aviation industry the edge -- it's the constant, daily, global, organized and disciplined continuous improvement.

That's what gave rise to crew resource management and human factors analysis. Even doctors are years behind on this stuff.

But the medical community is learning now, and Microsoft even brought in a surgeon to lecture them on lessons the medical community has learned from the aviation industry:

Surgeons didn't want to use checklists because they were too full of themselves - but then accidental deaths fell by 30-50% in hospitals that adopted them. Know who else often suffers from the same hubris? Programmers.

So many programmers are feeling defensive because they just think I'm talking about bugs. I'm not. There were no "bugs" exploited in the theft of Podesta's emails, there were no "bugs" exploited in the 2016 Facebook disinformation campaign.

Google and Facebook need to get absolutely spanked around because they keep pretending that they are software companies when they are not. They're platforms, environments, ecosystems, societies, whatever.

I know @zeynep has been talking about this stuff for ages -- so long as these companies think that their product is software, and don't get held accountable by society, humanity will increasingly suffer.
More amplification of smarter voices than mine: "How Complex Systems Fail," Cognitive Technologies Laboratory, web.mit.edu/2.75/resources…
IT Practitioners can't even start to make things better unless they start with a baseline of psychological safety: usenix.org/system/files/l…
For the record: Checklists are not the solution, they are a first baby step towards maturity. The aviation industry has already moved well beyond just checklists to a full SMS model. en.wikipedia.org/wiki/Safety_ma…

So many programmers with fake twitter names trying to explain to me how planes work right now. [zero empathy expected from you, ladies, feel free to chuckle]

For folks late to this thread, this is about **voting machine software**, and how the software industry (Google and Facebook, specifically) hasn't earned the trust required to manage democracy securely.

Two heavyweights from Google (Chrome "engineer") and Facebook's former CISO saw a cartoon and attacked the cartoonist: xkcd.com/2030/

They called the cartoonist a "non-practitioner" (though he had programming experience at NASA), and a nihilist, and belittled the relative maturity of risk management in the aviation industry (they appear to have been "non-practitioners" of aviation).

There's nothing that the software industry can gain by belittling the life-safety industries that they can learn so much from, just because a comic strip hurt their feelings.
