The Great Reset: Technology

The King is Dead. Long Live the King. In 1958 and 1959, two engineers, Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor, independently invented the integrated circuit and changed the world in the most profound way since the internal combustion engine. The microchip had been born, and with it every aspect of military and consumer technology would be changed in ways no one prior to that moment could have imagined. In the 62 years that have followed, the microchip has been the absolute center of innovation, from guided missiles, GPS, digital cameras, personal computers, and smartphones to every service enabled by the internet. It is an enabling technology, the Core Technology of this era. It has changed our entire world and brought all human knowledge into our pockets... and it is not going to get significantly better for the foreseeable future. The microchip has become a mature technology, like the internal combustion engine or electric lighting, and innovation will come much more slowly from here on out.

This is going to be the most bitter pill for entrepreneurs and investors to swallow in the coming decade: the days of digital computing being high-tech are over. Transistors have been shrunk to near their theoretical limit; with that, innovation in software has slowed to a crawl, and productivity growth in the United States has declined over the last decade and will continue to do so for at least the remainder of the 2020s. The iPhone was the last significant innovation in digital computing, and it is now more than a decade old. New innovations have been little more than improved services competing for existing markets, and genuine attempts to push the envelope of digital technology have largely hit dead ends. Does anyone seriously think the Apple Watch, the Oculus Rift, or "Internet of Things" products are anything more than novelties? We are at the beginning of a period of consumer technology stagnation, not unlike what the late 1960s and '70s were for internal combustion. The futility of trying to push the envelope of digital innovation won't be felt by most people until Silicon Valley creates for the microchip what the Concorde and the Cadillac Eldorado were for internal combustion. By that point, we should hopefully see whatever will be the next revolution's Altair 8800.

So the question is: What's next? What is the next revolution in technology that will rewrite our perspective of the world? Such a question seems impossible to answer. Trying to imagine the next revolution in 2020 is not unlike someone in 1970 trying to understand how the room-sized computer their bank uses to transfer account information to other banks will one day let them watch videos on a device small enough to fit in a pocket. To have any hope of seeing where the next revolution will come from, we must look to past cycles of innovation and see how previous Core Technologies emerged.

Core Technology

Core Technologies are innovations that sit at the center of every innovation of their era. The microchip didn't just bring you Twitter; it brought you the Human Genome Project, the moon landing, and virtually every innovation in the service industry. Each core technology followed the same cycle of development, and each fundamentally changed the nature of human existence. Puddled iron created the market revolution by making inexpensive, high-quality tools readily available to the masses, including the complex machines that would create the 18th-century textile industry, and with it the concept of a "job" as the main source of personal income. High-pressure steam engines defined much of the innovation of the 19th century and created a world where humans could traverse great distances without the aid of wind or livestock, and with that came more interconnected societies that in turn gave rise to nationalism and large colonial empires. Electricity took back the night, made instantaneous communication across the world possible, and made cities more livable by eliminating coal-burning heat sources. Internal combustion brought the world closer together, allowed humans to conquer the air, compelled us to divide our societies into suburbs, and created everything from the tourist industry to McDonald's. All of these technologies completely revolutionized our world, and each was radically different from the last (though by no means possible without the innovations that came before). And all shared a common cycle that propelled them forward.

The Core Technology Cycle

The cycle from development to implementation and maturation tends to follow a familiar pattern: from the laboratory, to heavy investment and R&D by the government out of national interest, to the consumer market. The time between each phase varies between innovations, but in general the following is a good guideline for understanding the cycle:
  • Lab - A core technology often starts out as little more than theory, or is simply impossible to implement because other technologies have not yet been developed. Basic research often begins centuries before the technology has any practical application.
  • Government - Technology often finds its first applications in matters of national interest. This does not just mean military applications; NASA was just as much an early source of R&D funding for the microchip as the Defense Department. At this point rapid development occurs, often with near-exclusive use by the government. For much of modern history, this phase in a new technology's evolution has lasted about 20 years, with limited implementation by non-military users able to marshal the capital costs required of early adopters, usually financial institutions and the super-rich.
  • Consumer - After about a generation of use by the military and a handful of private institutions, technology transfer inevitably occurs, be it through university grants, government tech-transfer programs, or collaborative R&D projects with private businesses. At this point we can see smaller phases within the Consumer phase:
    • Market Entry - An explosion of applications from private businesses and universities typically lasting around 20-30 years as a new generation of entrepreneurs and investors takes advantage of the market opportunities afforded by the new core technology. This phase includes a significant degree of private sector R&D that leads to steady improvements to stay competitive. 
    • Commoditization - Eventually improvements begin to slow down as the technology reaches its limits within the laws of physics and the constraints of cost. Those innovations that do come forward are less groundbreaking than in the previous phase, and less efficient innovators begin to fall away as the technology reaches a point of commoditization.
    • Maturation - Once the technology has reached maturity, usually ten to twenty years after the end of the boom, productivity begins to decline. Investments in new companies yield diminishing returns, if any return at all for entrepreneurs. New applications seldom reach a mass market. It is at this point you get a "Concorde": something that solves a problem only for the wealthy, and even then is in no way worth the money. After the Concorde moment comes a point of genuine maturation, in which a more practical product emerges that sets the tone for a less innovative era, like the Chrysler K-Car.

The Next Revolution

In each turn of the Core Technology Cycle, we typically see military development of the new technology begin around 20 years before the current core technology reaches maturity. The microchip was introduced 17 years before the "No Replacement for Displacement" era of the automobile in the US, for example. Thus, if the iPhone was the point where maturation began, and 2020 marks the dead end for innovation, we can surmise that the new core technology began its life as a government project of the 1990s and 2000s, distinct from the current core technology. And only one project of that era is currently on the crest of market entry: genetic engineering.

In 1990 the US government launched the Human Genome Project, the largest and most concerted effort yet to map the human genome. A working draft of the human genome was first published in 2001, and a more complete draft in 2003, formally concluding the project. It wasn't until May of 2021 that the last gaps, largely repetitive ribosomal DNA, were sequenced and published, albeit with errors in roughly 0.3% of the sequence, and the Y chromosome still hadn't been fully sequenced. Despite these setbacks, the work completed in 2003 launched a wave of biotech projects either by or on behalf of the US and other major governments. In the 18 years since the working draft of the human genome was published, gene sequencing technologies have gotten faster and cheaper with each passing year. What took 13 years and $2.7 billion now takes about two days and costs about $200. CRISPR-Cas9 has revolutionized genetic research and engineering techniques, while mRNA technology allowed Moderna to design a COVID vaccine in just two days, with government backing carrying it through trials and production.
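Taking the figures above at face value (a rough back-of-envelope exercise, not a precise cost model), we can estimate just how steep that cost curve is: roughly one price halving every nine months, considerably faster than the classic Moore's law doubling period.

```python
import math

# Figures from the text: the first human genome took 13 years and ~$2.7 billion;
# by 2021, sequencing a genome took ~2 days and ~$200.
cost_2003 = 2.7e9    # USD, Human Genome Project (completed 2003)
cost_2021 = 200.0    # USD, commercial sequencing circa 2021
years = 2021 - 2003  # 18 years between the two price points

fold_reduction = cost_2003 / cost_2021     # how many times cheaper
halvings = math.log2(fold_reduction)       # number of successive price halvings
years_per_halving = years / halvings       # implied time per halving

print(f"{fold_reduction:,.0f}-fold cheaper: "
      f"cost halved every {years_per_halving:.2f} years "
      f"(~{years_per_halving * 12:.0f} months)")
```

That works out to about 13.5 million-fold cheaper, or a halving roughly every nine months; Moore's law, by comparison, implies a halving in cost-per-transistor only every 18 to 24 months.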

Today biotech is a lot like computers in the late 1960s and early 1970s: the first products bearing a resemblance to what will be commonplace in 20 years are already entering service, but they're still used almost entirely by a few major corporations and the government. Thus, I think we can confidently say that in the 2020s we'll see the emergence of the genetic engineering equivalent of the Altair 8800: the first genuine biotech products available to the mass market, but still used primarily by enthusiasts and a handful of early adopters. What will eventually follow will be the biotech equivalent of the Apple II, a general-purpose product that will enable mass-market adoption by people with little to no formal education or training in the technology. This will lead to biotech being attached to virtually every product and industry as a solution or improvement to existing technology. During this time most people will still see biotech as either novel or mysterious, but not really approachable, and in many cases a sign of the class divide. Think of how cars and airplanes were seen in the 1910s, or computers in the 1980s, or indeed steam engines in the 1840s. It won't be until about the 2040s that we'll see a true democratization of biotech and genuinely new services and applications, as the generation of tinkerers and workers of the 2020s and 2030s matures into managers and project leads. By then the list of applications will be truly transformative, and I could not hope to list even half of the new products and services that will emerge.

New technology impacts more than just what products are available; it also shapes society and culture. The automobile defined economic growth from 1915 to about the 1960s in the US, but it also defined our image of labor, the changing ways we lived, and of course how we waged war. The same can be said of the microchip: it began as a revolutionary piece of technology that entered the mass market in the late 1970s and has since defined virtually every innovation and aspect of market growth and culture. Cyberpunk fiction would not exist if computers had not been viewed with curiosity and suspicion in the 1970s and '80s. "Hacking" would not have become the deus ex machina of every action movie of the last 40 years without the microchip. Even the basic image of labor shifting from somebody in a greasy shirt to a person in a cubicle, or at home staring at a monitor, owes its birth to the microchip. Biotech will be at least as transformative, initially impacting the healthcare industry much as the microchip impacted the financial sector. Where computers made our modern idea of the stock market possible, biotech will disrupt everything we know about healthcare, right down to the idea of having to go to a "hospital" for illness. The same innovations that made the COVID vaccines possible will continue to mature to the point where small clinics will be able to diagnose disease with extreme accuracy and provide tailor-made vaccines, either in-house or via companies of maybe a few dozen people.

A World Without Meat

Before I close, I offer one scenario outlining what I believe will be the most transformative early use case for biotech: engineered cyanobacteria (algae). Several startups have already developed products made using engineered algae, all fairly simple alternatives to chemically derived products, but the next phase is just around the corner. Imagine a world where proteins are derived from algae grown in huge greenhouses in the desert, fed only CO2 and salt water. The algae are dried out, and the resulting plant-based protein powder, engineered to taste like beef, pork, or chicken, can be stored indefinitely, then shipped, rehydrated, and formed into processed rolls or printed cuts of meat. Ranches sell off their land, unable to keep up with the new synthetic meat industry, and farms on the Great Plains similarly sell out as animal feed becomes nearly worthless (expect this to be a defining part of any social unrest we might face in the 2030s). The Great Plains and other areas devoted primarily to producing animal feed either switch to cereals consumed by humans (a fairly small portion of agriculture compared to animal feed) or are left to be reclaimed by nature. Our image of the Plains changes from one of endless rows of corn to one of meadows and small stands of trees. Political issues of the subsequent years surround the reintroduction of predators to Kansas and Nebraska to control the booming deer and feral pig populations. Even dairy cows eventually go the way of the dodo as engineered algae replicate the chemical processes that produce milk. Eventually animal rights activists get their wish, and it becomes a cultural taboo to eat an animal. The few breeds of cow that don't go extinct from lack of demand for beef are kept either as pets or by religious communities that see synthetic meat as sinful.
Coastal land, particularly in the Sunbelt, becomes the agricultural hub of the world as companies take advantage of easy access to cheap energy (both for cooling and heating their buildings and for driving photosynthesis in the algae themselves) to produce proteins and ship them over water with little to no travel over road or rail. Huge grain silos are partially repurposed to dump tonnes of protein powder onto ships, and nations once dismissed as lifeless deserts become agricultural hubs (if they aren't experiencing political instability). For America, Turkey, Israel, Egypt, and Mexico, this means their deserts become some of the most productive farmland in the world. For Europe and East Asia, it means either import dependency or setting up colonies in North Africa and Arabia (which the Turks will have something to say about).

The biotech age is going to be upon us, sooner than you might think. And the applications will absolutely change everything about how we live. There will be overhyped solutions, there will be people scared or ignorant of basic details of the technology for a time, but rest assured it is coming.