In 1855, the world was in the midst of dramatic technological transformation. The first commercial telegraph had gone into operation 16 years earlier, allowing near-instant communication across vast distances. The railroads were expanding rapidly, driving down travel times and transportation costs. Gas lighting, cheaper and more convenient than earlier oil lamps, was making nighttime illumination much more common, leading to night shifts at factories and evening musical theater shows. So it's strange that the new Chair of Technology at the University of Edinburgh had to begin his 1855 inaugural address by explaining what the word “technology” meant. He didn't do this as a rhetorical flourish; he did it because the audience was genuinely unfamiliar with the term:
Educated people in the 19th century were, of course, familiar with individual inventions and crafts like the railroad or metalworking, and they used the terms “useful arts” and “industrial arts” to refer to the collection of practical crafts such as weaving, furniture making, and agriculture (reference, p. 47). What they lacked was the notion that tools, machines, techniques, weapons, communication devices, and other things formed a coherent whole known as “technology”. In his book What Technology Wants, Kevin Kelly points out that this conceptual void existed long before the 19th century:
In fact, technology was not widely discussed as an important societal force until surprisingly recent times, despite early work by Karl Marx and others. A plot of word frequency in English books (at the top of this post) shows that even though science, industry, and invention have long been discussed, discourse on technology was all but silent before the middle of the 1900s. Technology was not included as a major theme in the Encyclopedia Britannica until 1974 (reference, p. 47), it was first mentioned in a Presidential State of the Union address in 1939 (1,2), and the Society for the History of Technology was not founded until 1958 (reference, p. 3).
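If you want to recreate a word-frequency plot like the one above, here is a minimal sketch. It assumes Google's unofficial Ngram Viewer JSON endpoint at books.google.com/ngrams/json and an "en-2019" corpus label; neither is officially documented, so treat the details as illustrative rather than definitive.

```python
# Illustrative sketch (not from the original post): querying Google's Ngram
# Viewer through its unofficial JSON endpoint to plot the relative frequency
# of "technology", "science", "industry", and "invention" in English books.
# The endpoint, the "en-2019" corpus label, and the response format are
# assumptions; the endpoint is undocumented and may change or rate-limit.
import requests
import matplotlib.pyplot as plt

words = ["technology", "science", "industry", "invention"]
year_start, year_end = 1800, 2019

resp = requests.get(
    "https://books.google.com/ngrams/json",
    params={
        "content": ",".join(words),
        "year_start": year_start,
        "year_end": year_end,
        "corpus": "en-2019",
        "smoothing": 3,
    },
    timeout=30,
)
resp.raise_for_status()

years = list(range(year_start, year_end + 1))
for series in resp.json():
    # Each entry carries the ngram name and a "timeseries" of relative
    # frequencies, one value per year in the requested range.
    plt.plot(years, series["timeseries"], label=series["ngram"])

plt.xlabel("year")
plt.ylabel("relative frequency in English books")
plt.legend()
plt.show()
```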
Today technology is widely recognized as a powerful force that shapes our lives, but because of this conceptual late start, there is still much that we don’t understand about it. To start with, our cultural understanding of technology is mostly wrong. The common story is that individual inventors create new technology out of whole cloth in isolated workshops. Their inventions are acts of creative genius that are not anticipated by society at large and that quickly transform the status quo. This idea is enshrined in our education and legal systems. For example, the 8th grade social studies teaching standards in California instruct students to “Name the significant inventors and their inventions and identify how they improved the quality of life” (standard 8.12.9), and the patent system is based on the idea that “a lone genius can solve problems that stump the experts, and that the lone genius will do so only if properly incented” (reference, p. 1).
But in practice, things just don’t happen this way. New technologies are developed incrementally, by many people, over decades, and individual inventors build upon the developments of others. Thomas Edison is credited with the invention of the light bulb, but his main contribution was developing a good material for the light bulb filament. Others before him developed the concept of electrically heating a resistive filament inside an evacuated glass bulb, and others after him further improved the filament material. An interesting consequence is that almost all major inventions were made independently by multiple people at nearly the same time.
What’s more, invention is not the end of the story. Schoolchildren memorize the dates of important inventions, but the real impact on society comes later, when the technology is widely adopted, and that takes decades. Technologies first need to be redesigned to fit the needs of users, and industries need to reorganize to take advantage of the new way of doing things (see the section titled “technology and the economy” in this post). For example, personal computers were widely available in the 1980s, but by 2013 almost half of American doctors were still using paper records and filing cabinets.
We even have trouble defining technology. Scientists and engineers commonly define technology as “applied science”, but the relationship between science and technology is more complicated than that. For many technologies, such as the steam engine, working devices and scientific understanding co-evolved. Furthermore, science depends on new technology as much as technology depends on science. Galileo discovered Jupiter’s moons because of the powerful telescopes (a new technology) that he built, and many modern physics experiments would not be possible without powerful computers and high-speed electronics.
What we do know about technology mostly comes from historical case studies about how certain technologies and industries evolved in the past. This body of research provides fascinating insight into how technology gets developed and evolves, but the field is very new. Even as late as 1983, the economist Nathan Rosenberg wrote: “The study of the history of technology is still largely (although by no means entirely) neglected.” Since then there has been much work, as Brian Arthur summarizes in his book The Nature of Technology:
So the raw material for understanding technology is abundant, but it’s only been available for the last decade or two, and it has not yet been summarized and distilled. It hasn’t been integrated into a coherent theory. Again, Brian Arthur:
For centuries, no one spoke of technology, and it drew sustained attention from historians only after World War II. As a result, researchers have just begun to develop a theory of technology. Some fascinating recent books address the subject (1,2,3), and the Santa Fe Institute is holding a large workshop this August. It is surprising that such a powerful phenomenon evaded study for so long, and exhilarating that we may soon understand much more about it. I’ll leave you with some final words from Brian Arthur: