Tech News

The Diffusion Dilemma


On the sun-baked plains of the American Midwest in 1892, a revolution was loudly sputtering to life: the tractor, a machine that signaled the end of the era of animal power and the beginning of the age of machine power. The tractor was not just a piece of equipment; it was the manifestation of an exponential shift in energy density, from animal metabolism to coal burning, enabled by discoveries in thermodynamics. But the diffusion of the tractor, screeching across the horizon, took much longer than expected.

At the beginning of the 20th century, 40% of American workers were farmers, and agriculture comprised 15% of the economy. The tractor dramatically reduced the marginal cost of producing food, freeing approximately three acres of cropland per horse previously needed for feed. Yet by 1920, only 4% of American farms had a tractor. This is the ‘diffusion deficit’: the gap between the availability of an innovation and its diffusion throughout its potential markets. Once the tractor did diffuse, however, the effects were enormous: agricultural mechanisation would eventually raise American GDP by 8%. Similar patterns have shown up time and time again in the spread of technologies through the economy, and as we sit on the precipice of yet another industrial shift, we may yet learn from the past.

Diffusion Deficits

Technologists and scientists often equate technological innovation with immediate social and economic change. But just because a technology exists does not mean it will necessarily have a strong effect on society in the short run––even if the technology is profitable and sensible to adopt. The tractor was one of the most impactful innovations in recent centuries, but it took over 70 years for the potential of this technology to be realised at scale.

If we understand an “innovation” to be an idea, practice, or technology that is perceived as new, then “diffusion” is the process by which an innovation spreads throughout a population or social system over time. Everett Rogers first popularized the notion of “technological diffusion” in 1962, when he observed that adoption of a new innovation follows a predictable pattern across populations. Rogers proposed a bell curve with five distinct adopter categories: innovators (2.5%), early adopters (13.5%), early majority (34%), late majority (34%), and laggards (16%). This distribution of adopters remained remarkably consistent regardless of the particular innovation being studied, from agricultural practices to technological advancements.
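Rogers derived these five categories by cutting a normal (bell) curve of adoption times at one and two standard deviations from the mean. A minimal sketch, using only the Python standard library, recovers his rounded percentages from those cut points (the cut points follow Rogers' model; the code itself is illustrative):

```python
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Rogers cuts the adoption-time bell curve at -2, -1, 0, and +1
# standard deviations from the mean adoption time.
cuts = [float("-inf"), -2.0, -1.0, 0.0, 1.0, float("inf")]
labels = ["innovators", "early adopters", "early majority",
          "late majority", "laggards"]

shares = {
    label: normal_cdf(hi) - normal_cdf(lo)
    for label, lo, hi in zip(labels, cuts, cuts[1:])
}

for label, share in shares.items():
    print(f"{label:15s} {share:5.1%}")
```

The exact shares come out as 2.3%, 13.6%, 34.1%, 34.1%, and 15.9%, which Rogers rounds to the familiar figures above.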

For a given technology, Rogers identified five crucial factors that influence adoption rates: relative advantage (the perceived benefits of using a new technology over existing alternatives), compatibility (alignment of the technology with existing social values and needs), complexity (the ease of understanding and using the technology), trialability (the ability of new users to experiment with the technology), and observability (the visibility of positive results of others using the technology). Rogers’ theory also emphasizes the importance of communication channels and social systems in the diffusion process, noting that innovations rarely spread through pure technical merit alone, but rather through social networks and interpersonal influence. The importance of word-of-mouth recommendations for diffusion explains why many superior technologies fail while inferior ones sometimes achieve widespread adoption—for diffusion, the social mechanisms of technological recommendation often matter more than the intrinsic qualities of the innovation itself.

A classic example of how Rogers' diffusion theory played out in a real-world scenario is the VHS vs. Betamax format war of the late 1970s and early 1980s. Despite Betamax offering superior video quality, VHS ultimately dominated the market due to its better compatibility with user needs—specifically, VHS allowed users to record two hours of television, compared to Betamax's one hour. Home video recording had existed for some time, but this seemingly minor advantage spurred adoption and triggered powerful network effects. As more consumers purchased VHS recorders, more video rental stores stocked VHS tapes, creating a positive feedback loop in which the availability of VHS content reinforced VHS's dominant market position. The Betamax failure shows that even with better technology, compatibility and network effects can lead to an inferior product dominating the market.

General-purpose technologies (GPTs) take even longer to be widely adopted by society, but their effects are transformational once they are. Exactly what counts as a “general purpose” technology has been the subject of some debate among economic historians, but roughly: a general purpose technology is one which initially has much scope for improvement, has many uses, eventually comes to be widely used, and has many social spillover effects. Writing, agriculture, steam power, electricity, railroads, and information technology are all paradigmatic examples. GPTs face even sharper diffusion challenges than ordinary technologies, even though they follow the same adoption curve. Crucially, GPTs require specialized skills and training to be useful in proportion to their potential, and the diffusion of those skills necessarily lags behind the invention itself.

In many cases, productivity initially drops when a GPT is introduced because workers and organizations face a learning curve. For example, firms needed operators and engineers to implement long-range telegraphs and telephones––skills that were scarce in the 1870s. After the invention of the telephone, it took time to develop educational programs to upskill workers and for ordinary people to gain experience with the new systems. But once this infrastructure was in place, the telephone took off. Inertia also plays a role in slowing diffusion of GPTs––people comfortable with older solutions may be slow to trust or learn a radically new tool.

Cutting-edge invention often speeds ahead of what organizations and society can absorb––creating a gap between what could be done with the available technology and what is done in practice. As a result, we cannot learn much about whether a new technology will be impactful just by looking at productivity charts soon after its invention: the full effect of a technology may only be felt years, decades, or even a full century after the fact.
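One common way to see why early productivity charts mislead is to model cumulative adoption as a logistic S-curve: the curve spends a long time nearly flat before its rapid middle phase. A small sketch, with a growth rate chosen purely for illustration (not fitted to tractor data), shows how long the slow early stretch can last:

```python
from math import log

def years_between(p_start: float, p_end: float, growth: float) -> float:
    """Time for logistic adoption f(t) = 1 / (1 + e^(-g*(t - t0)))
    to move from share p_start to share p_end.
    Inverting gives t = t0 + ln(p / (1 - p)) / g, so t0 cancels
    when we take a difference."""
    logit = lambda p: log(p / (1.0 - p))
    return (logit(p_end) - logit(p_start)) / growth

# Illustrative growth rate of ~15% per year (an assumption).
g = 0.15
print(f"4% -> 50% adoption: {years_between(0.04, 0.50, g):.0f} years")
print(f"50% -> 90% adoption: {years_between(0.50, 0.90, g):.0f} years")
```

With these illustrative numbers, climbing from 4% to 50% adoption takes about 21 years, longer than the subsequent climb from 50% to 90%. A technology can therefore look like a failure on aggregate statistics for decades while it is, in fact, on the early flat stretch of its S-curve.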
