Completely predictable in principle
Of course, we're still in "Boeing-knocking" season, but this is not specifically a Boeing problem.
When any organisation grows to a critical level of scale and complexity, control is unwittingly lost for several reasons: chains of communication become over-extended; the growing disparity of power between the individual and the "organisation" encourages people to keep their heads below the parapet; and tying specific responsibilities to identifiable individuals becomes increasingly difficult.
In any human endeavour, normalised deviance (entropy, in effect) causes processes and controls to weaken imperceptibly over time - imperceptibly because the drift is gradual, and each small departure quietly resets the cultural norm against which performance is judged. Typically, only when an accident happens is attention drawn to how far practice has drifted from the original intent.
This largely explains most major incidents and the internal cultural responses to them. Both shuttle disasters, for example, were precipitated by the culture at NASA, and I can well believe the former CIO of Equifax when he told the enquiry that he did "not think Equifax could have done anything differently". In each case, what the organisation did was driven by its prevailing cultural norms and was therefore assumed to be adequate.