Middleware
In the beginning (of computing, anyway), there was just data. Computing as a business (and therefore "social"; a lot of academic writing becomes more straightforward if you search-replace) practice precedes computers by a baker's half-century. The first machines were fast tabulators that kept counts and shuttled around data. These machines performed the minimal, simplest tasks that were valuable enough to warrant capital investment and R&D. What made them valuable was scale: big data has been driving innovation since Sumerian tablets precipitated the dawn of writing. But what made this scalability possible, given the mechanical-clockwork quality of then-available technology, was the punched-card representation of data. In the beginning, there was just data.
Middleware bridges over a gap. The party line goes that we live in a connected world, but there's much work required to produce this effect. Even brain cells are air-gapped and rely on chemical cascades to run us. But as the value of connectedness grows, so does the demand for protocols, intermediaries, consolidators. This is perhaps more evident in the trenches of logistics or in modern "microservices" software, but can easily be experienced by manipulating any physical object at hand: your watch, almost useless without a wristband; a book, rendered incoherent but for the glue that binds it.
The thesis that there's growing value in connectedness isn't exactly self-evident. Taken to its limits, it implies that nothing is valuable in isolation, and therefore that there's no agalma (Plato's term for the ultimate source of value hidden inside Socrates' hideous form, and by extension the innermost value of things). Value becomes a topological concept.
A modern-day Jack Welch who believed this proposition might fire the bottom-10%-performing experts who wrote their theses on Itô finance (the stochastic calculus of assumed-connected spaces) and hire some people whose CVs say something about cohomology and simplicial complexes. Given the scarcity of high-end mathematicians willing to go into non-research roles, a passable substitute might be found in philosophy graduates saying something more or less incomprehensible that features the word "Heidegger" somewhere. You'd have to teach them businesspeak -- but maybe that's something I can help you with. Middleware.
On the other hand: the great success of computer networks shouldn't blind us to how radical this "death of the agalma" thesis is. If connectedness is growing in value, it implies it was less valuable in the past. We remember the history of wars better than that of trade agreements. Modern economics teaches us about comparative advantage, the agalma of which rests on the heterogeneity of parties in a trade situation. Maybe at the dawn of humanity we were homogeneous hunter-foragers, always narrowly avoiding starvation, and now we're algebraic topologists and Heidegger scholars, who can't really trade services with each other and must instead rely on the rest of the economy. We become extremely narrow, and thereby connectedness becomes brutally essential. This process has been a boon to humanity, but will it remain so indefinitely?
This critique of atomizing hyperspecialization is not new, but it's not always recognized as a critique of connectivity as an unexamined good. Critics of connectedness usually come off as atavistic -- meaning, they sound like they'd like to roll back some of the past -- but their grievances may be more subtle. To be clear, I'm not here to assume either position (about trade, or "monoliths versus microservices", and so on), but to frame problems in interesting terms. This, too, is middleware: restating (apparently incommensurable) positions in terms that are sufficiently complex to allow theoretical exploration and even strategizing.
This particular lens can be applied to theory itself: the principle of sufficient complexity projects an "agalmic" aura (it seems to stand for the value of making things more abstract and removed from practical concerns), but as the world grows more connected, complexity emerges on its own. What this tension illustrates is the inadequacy of the word "complex" (despite its richness and breadth) in a sufficiency principle. But this theorizing-about-theorizing is well beyond this particular newsletter. Instead, the way theories about tensions in specific domains echo across so many others (from data processing to global trade to software architecture to theory itself) is but a glimmer of what pointed and specific theoretical analysis can achieve.