All cohesion mechanisms impose constraints and reduce autonomy. However, standards and protocols are unique. When well-designed or evolved, they can increase the overall autonomy and agency of the coordinated participants.
This essay is part of the Autonomy and Cohesion series.
As social systems become more complex, conventions, norms and rituals emerge to absorb the excess of variety. When complexity increases further, they become more sophisticated and get codified as laws, standards and protocols.
Laws, standards and protocols are cohesion and coordination mechanisms. But they have significant differences. Laws are coercive mechanisms. We even speak about laws using coercive language: a law comes into force. Standards and protocols, on the other hand, are voluntary. When compliance with a standard is mandatory, that can only come from a law referring to the standard, not from the standard itself. The same is true for protocols.
Standards and protocols are now ubiquitous. Almost everything we do relies on layers of standards and protocols.1 They have another unique feature: when standards and protocols work, they are invisible. We become aware of them only when they fail.
Standards and protocols are similar—in fact, many protocols are standards—but they are not the same.
Protocols are ancient. Just think about how old greeting and parting protocols are. Standards are way more recent. They date from the beginning of the 19th century.2
The second difference is that, unlike standards, protocols are not necessarily related to technology.
The third and most prominent difference between standards and protocols is related to time. Standards are static. Protocols are always about some sequence of actions. Standards may be related to time, as is the case with UTC (Coordinated Universal Time), the ISO 8601 time format, the ISO 19235:2015 standard for analog clocks, or the W3C time ontology. However, the focus of standards is on what and how, not on when. Protocols, on the other hand, are themselves temporal species. An outstretched hand invites a handshake, an HTTP request triggers a response, and SMTP specifies timeouts.
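To make that temporal nature concrete, here is a minimal sketch, using only Python's standard library, of the request-and-response sequence HTTP prescribes. The toy handler, the loopback address and the "hello" payload are illustrative choices, not part of any real service:

```python
# Sketch: an HTTP response exists only as a reaction to a request.
# The protocol prescribes a sequence of actions, not a static artifact.
import http.client
import http.server
import threading

class HelloHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to an ephemeral port on the loopback interface and serve in the background
server = http.server.HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1], timeout=5)
conn.request("GET", "/")        # step 1: the request
response = conn.getresponse()   # step 2: the response it triggers
body = response.read().decode()
print(response.status, body)    # prints: 200 hello
server.shutdown()
```

Note that nothing in the exchange depends on what the resource is; the protocol only fixes the order and the form of the steps.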
Some emergent social protocols standardize well across cultures. In most parts of the world, a raised hand with a signing gesture reliably brings the waiter with the bill. Other protocols, even those with many instantiations a day, may not get unified even within a single country. Take, for example, cheek kissing3, a popular greeting protocol in some parts of Europe. The number of kisses differs between the Netherlands, France and Belgium. In the Netherlands, three kisses are exchanged for birthdays, while in Belgium, cheek kisses are not only for special occasions but the daily greeting norm. Their number, however, depends on the region: one kiss in Flanders, up to three in Wallonia. In France, the standard in most regions is two, but a few have one or three, and in some places it even goes up to four.
Since protocols are coordination mechanisms, they rely on shared expectations. With so many flavors of the protocol, it is not surprising to see an occasional kiss left hanging in the air, a cheek-kissing faux pas. In some cities like Brussels, where a big expat community is added to an already rich cultural mix, such mishaps happen so often that the exchange of excuses gets ritualized as well and somehow becomes part of the protocol.
Like protocols, standards also manage expectations, but in a different, indirect way, mediated by the artifacts they are applied to. The expectations can be about safety, quality, and performance. There is a fourth category: interoperability. Interoperability standards come closest to protocols, and the two can interplay to bring a special kind of liberating cohesion.
Let's take HTML and HTTP, two popular interplaying standards and protocols. Individually and together, they provide cohesion that reduces the autonomy of the interacting agents only insofar as they must comply with the rules; in every other respect, it brings them more autonomy and agency. The result is today's world, thoroughly transformed by the web.
When the internet appeared in 1969, it was a small network used by a few, and it did not change much for a couple of decades. One possible explanation is that back then, it required expert knowledge. Indeed, to get a file from another computer in the network, the user had to know the right commands to connect via a terminal to a remote computer, list the directories, find the needed file, and download it to the local computer. Only then could she read that file. On the web, in comparison, the only thing needed to read another file on the network is a click on a hyperlink. Was hypertext, then, the game changer?
Not really. The web was not the first hypertext system. Moreover, many earlier hypertext systems, such as HES and FRESS from the 1960s, ZOG and EDS from the 1970s, and HyperTies and Intermedia from the 1980s, were in some ways superior to the early version of HTML.4 FRESS offered more advanced text manipulation, HyperTies allowed interlinking different document parts, and Intermedia supported transclusion. But none of these changed the world, while HTML did. Why?
One of the reasons was that early hypertext systems were proprietary, while HTML is an open standard.5 That, together with its simplicity and its appearance at the right time, made HTML a liberating cohesion mechanism.
It reduces the autonomy of agents, but only as far as compliance goes. You can't publish content any way you want; you need to respect the HTML conventions. There are constraints, and all constraints reduce autonomy. The same constraints apply to creating a rendering tool. However, while direct autonomy is reduced by the need to comply, adopting the standard increases the overall autonomy and agency of all participants (readers, writers, and developers) and fosters an ecosystem in which all kinds of new participants appear and proliferate.
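As an illustration of how shared compliance pays off, here is a minimal sketch of an HTML consumer, using only Python's standard library. The sample document is made up; the point is that any tool respecting the standard's conventions can extract the hyperlinks from any compliant document, whoever produced it:

```python
# Sketch: a standard-compliant HTML consumer. The producer and the
# consumer never coordinate directly; the HTML standard coordinates them.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":                      # the standardized hyperlink element
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# An illustrative document; any compliant HTML would do
document = """
<html><body>
  <p>See the <a href="https://www.w3.org/">W3C</a> and
  <a href="https://html.spec.whatwg.org/">the HTML spec</a>.</p>
</body></html>
"""

extractor = LinkExtractor()
extractor.feed(document)
print(extractor.links)
```

The constraint (use `<a href="…">` for links) is exactly what lets independently built readers, crawlers, and renderers proliferate.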
The second reason is that the HTML standard interplays with a very good protocol, HTTP. HTML standardized content creation and consumption, while HTTP standardized content exchange and participant coordination.
As a cohesion mechanism, HTTP imposes constraints on how to request and respond, but these are enabling constraints. The request and response are independent6 of the type of resource, the underlying infrastructure, and the physical location of the participants.
This facilitated innovation on both the client and server side. New web browsers, crawlers, and later mobile apps were developed, which, as long as they adhered to the protocol specification, offered more choices and created new kinds of experiences for the users. The technology-driven freedom of the protocols induced social freedom for the participants and fostered more innovations.
Standards and protocols evolve like organic species, following their natural drift.7 Not only do good design decisions stabilize, but so do good development principles, which enhance the adaptability of these engineering artifacts even further. A prominent example is the orthogonality principle, according to which the components described in each specification should evolve independently. This is another example of decoupling by design.
Standards and protocols make competitors cooperate. That became a norm in web standards. It is common for representatives of competing companies to work together in W3C working groups to develop a new standard or improve an existing one. The Web Applications Working Group, for example, has representatives from Microsoft, Google, Apple, Intel, Adobe, Sony, Alibaba, Mozilla, and many other organizations. It is also happening in the automotive industry. Competing companies adopt a de facto standard established by a competitor, as is the case with electric vehicle charging. They also sit together and produce new standards to increase interoperability. A recent example is the Catena-X network.
From all this, it may look like good standards and protocols bring only virtuous cycles. That's not always the case.
HTTP, while enabling a decentralized Web, came with a client/server architecture that, in combination with other technologies, attracted business models based on extreme centralization and tight coupling between data and application. That’s the case for any SaaS platform today.
If we compare the web with previous media like radio and television, it is not only the interactive nature and the diversity that make the web so different. The feedback and feedforward loops are greater in number and run much faster. Compare a city and a village. A city is not only a bigger settlement than a village but a socio-technical system of an entirely different order. Cities create conditions for more jobs, higher salaries, more entertainment, more ideas, and a higher rate of innovation. However, the same dynamics also bring a faster spread of diseases,8 corruption, crime and all kinds of anti-social behavior.9 In like manner, the web standards and protocols increase autonomy and agency for everyone, to be used for everything, good and bad.
Whenever the abuse of such autonomy exceeds a certain tolerance level, regulation steps in. But could it be that some future improved standards and protocols can use technology to mitigate a large part of that risk without the need for regulation?
The Industrial Revolution lasted from 1760 to 1840. The mass production of machines during that period triggered the need for unification. A prominent example is the unification of screw threads, initiated by a paper that Joseph Whitworth presented to the Institution of Civil Engineers. The paper was presented in 1841, as if to announce the start of a new era.
The importance of such cohesion mechanisms cannot be overstated. Whenever they get banned, other cohesion mechanisms emerge to keep communities together, as we saw recently during the pandemic.
By clicking on this link, you will run a federated query that returns a coherent result: a list of hypertext systems with links to the respective resources on Wikipedia, Wikidata, and DBpedia. This is yet another demonstration of the power of interplaying standards and protocols, since SPARQL is both a standard query language and a protocol working on top of HTTP to get data expressed in RDF, which is another W3C standard.
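For readers who have not seen one, a federated query roughly takes the following shape. This is a hedged sketch, not the query behind the link: the class identifier `Qxxxxxxx` is a deliberate placeholder, and the triple patterns are illustrative; only `wdt:P31` ("instance of"), the DBpedia endpoint URL, and the `SERVICE` keyword from SPARQL 1.1 Federated Query are real:

```python
# Sketch of a federated SPARQL query, built as a plain string.
# P31 is Wikidata's "instance of" property; Qxxxxxxx stands in for
# the class of hypertext systems (a placeholder, not a real id).
federated_query = """
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX wd:  <http://www.wikidata.org/entity/>

SELECT ?system ?dbpediaResource WHERE {
  # Evaluated by the endpoint that receives the query (e.g. Wikidata)
  ?system wdt:P31 wd:Qxxxxxxx .
  # The SERVICE clause forwards this sub-pattern over HTTP
  # to a second endpoint and merges the results
  SERVICE <https://dbpedia.org/sparql> {
    ?dbpediaResource ?p ?system .
  }
}
"""
print("SERVICE" in federated_query)  # prints: True
```

The `SERVICE` clause is the interplay in miniature: one standard (SPARQL) uses a protocol (HTTP) to stitch independently maintained datasets into one coherent answer.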
The first versions of HTML existed only as drafts, but in 1995, HTML 2.0 became an official standard, maintained initially by the Internet Engineering Task Force and later by the World Wide Web Consortium. Sometime later, in the year 2000, HTML became an ISO standard.
Another important independence comes from the so-called "statelessness" of the protocol: each request is handled independently of the previous ones, which is an important scalability factor.
Natural drift refers to systems' inherent tendency to evolve in response to ongoing interactions with their surroundings. Chapter 5 of The Tree of Knowledge provides an accessible description and comparison with natural selection.
A notable exception is malaria, which spreads better in rural areas. A mosquito needs time to digest, so being presented with a potential host too soon, which is what a densely populated area provides, does not accelerate the spread the way it does with other transmissible diseases.