Discussion about this post

Vaishnav Sunil:

I agree directionally. But there are cases where it's best NOT to be sanguine about markets being a sufficient antidote to DEI.

E.g., industries or domains with loose feedback loops. The most obvious examples are places where markets aren't the primary resource-allocation mechanism - bureaucracies, the military, academia - where job protection and the lack of consumer feedback mean it can take quite a while for underperformance to feed back into hiring processes. Wokeness in public health is a case in point (it arguably already caused thousands of incremental deaths, and many more in expectation when the next pandemic comes around). But even within for-profit entities you can have loose feedback loops, especially for junior employees who don't directly affect the bottom line (all the more so if it's also culturally harder to fire minorities). Then again, junior employees are less likely to have real-world impact anyway, positive or negative. Over time it can create a talent bottleneck as incompetent junior employees rise up the ranks, but I'm not sure the problem is quite that bad anywhere yet.

More importantly, I think it's a mistake to treat DEI as a static problem. DEI is much more like terrorism than malaria. Malaria doesn't get worse if we don't take action; it isn't encouraged by our inaction. Both terrorism and DEI are. So an EA-style utilitarian analysis will underweight its importance.

Having said that, I agree that conservatives spend way too much time on this, and I'd be surprised if, even after accounting for the dynamic nature of the problem, it featured in my top 5. The biggest negative impact is probably going to come via damaging the credibility of signalling tools (like universities), but I'm not sure that'll be entirely bad.

Carter Williams:

I worked at Boeing for many years, in a senior role. Every person took safety seriously. I recall a meeting in 2003 where we forecast that by 2020 we would have solved for every redundancy and failure mode except the very edge of human failure, and that failures in the 2020 time frame would have a probability on the order of one in 10 billion. Our question in 2003 was how to engineer around even those errors.

I raise this as an example of how deeply Boeing thinks about its product and safety. No other industry does the same. 250k people die each year in healthcare from medical errors. Imagine if we left airplane safety to human error.

I believe the 737 MAX MCAS accidents would likely not have happened with a US pilot in the left seat; US pilots simply have more training. It is an engineer's job, though, to make sure flight safety is not subject to human frailty. Engineering a solution to be immune to human error is likely easier in healthcare, which is abysmal, than in aerospace, which is exceptionally safe.

The recent error with the plug door is likely a failure to follow process. The procedure is pretty clear: the mechanic stamps the manufacturing log that they tightened the bolts, and QA stamps that they inspected it. The system is designed for an IQ of 90, with redundancy. Because Boeing, Spirit, and the FAA care about safety, they will figure out what happened and correct it. The duty in aerospace is to engineer systems to be safe no matter what humans do.

108 more comments...
