This post was originally published at Medium.
The remarkable advance of Infotech and Biotech during those decades provided the foundation for The Transformation as seen from the year 2100
Editor’s Note: We continue with the second installment of our interview with Stuart Rand to mark the arrival of the year 2100. The esteemed journalist and author started out with an overview of the critical years from 2020 to 2050 in the first installment and here he first focuses on the huge advances in what was then called infotech that led to an unexpected explosion of innovation. He then moves to the breakthroughs in biotech that provided new pathways to sustainable everything. The interview has been lightly edited to bring out just the narrative from Stuart.
I was born a California kid, so I’ll start there and mostly tell you about what happened in America and more generally in the liberal democracies of Europe and the West. I assume you are also talking to someone from China for their take on the era, since what happened there was just as instrumental and amazing.
California had always played an outsized role as innovator and early adopter in modern American history. But by the early 21st century, the state took on added importance. Northern California was ground zero for the digital revolution and the related technological breakthroughs that just kept building off that foundation. The San Francisco Bay Area acted like a magnet for ambitious technologists and entrepreneurs from all over America and really all over the world.
I grew up in that cultural zeitgeist and so developed a deep appreciation for the importance of understanding new technologies in order to understand the economies and societies and cultures that build off them. Every historical era has a certain level of technological capability that defines what is and is not possible for humans to do. I like to start by looking at the era’s technologies in order to understand everything else that comes about.
When I was born in 2000 only about one-third of Americans were on the internet and most regions of the world had pretty much no one online. By the time I was 10, Steve Jobs of Apple had introduced the iPhone, the device that made the internet-connected smartphone mainstream, and about 75 percent of Americans were online, along with about a third of the rest of the world. By the time I was 20 just more than half the world was online, mostly wirelessly connected through their smartphones, after huge leaps in China and Asia.
By the time I was 30 the entire population of the world had been brought onto the internet. The 2020s saw many different efforts to connect the developing world and remote regions, particularly through small satellites blanketing the Earth in low orbit. And smartphones kept getting cheaper, so almost everyone could afford them. The poorest were given them free by then because we needed everyone monitored for new virus contagions.
Think about that. We went from nobody being connected to everybody on the planet being connected in just 30 years. That was an extraordinary feat of world-historical significance. Its implications took decades to sink in.
Universal Connectivity: A Media Infrastructure for the Entire World
Of course, the introduction of a fundamentally new communication and media infrastructure that connected 8 billion people for the first time in history was going to take some time to sort out and get working right. To their credit, many in the developed world — from governments to the tech companies themselves — were fully engaged in dealing with the many problems by 2020 and beginning to come up with solutions that continued to roll out for another decade. Some solutions had to do with treating the new environment as a powerful commercial medium that needed to be properly regulated through nimble government oversight to prevent abuses by businesses. And some had to do with thinking of the internet as an essential infrastructure that the military needed to protect from cyber-attacks.
Other solutions had to do with treating the new environment, from high bandwidth internet connections to key platforms for essential services like search, as public utilities that all citizens needed to function in society. The pandemic showed how critical that infrastructure was to stay informed and carry out work and education. The big tech and telecom companies, under threat of breakups, eventually let go of their monopolistic platforms and let them be run more like utility companies. They then spun off the innovative parts of the original companies to compete on the merits against up-and-coming competitors working on next generation products.
One of the early casualties of the digital revolution had been the business of journalism, which, in the 20th century, lived well off its own information delivery monopolies. The first couple of decades of the new century were very difficult as the internet blasted apart those monopolies and the advertising models they were based on. But by the 2020s the value of accurate information in the cacophony of the online world rose, and so did the value of journalism. During a deadly pandemic, you wanted to avoid fake news. The industry rebuilt around new subscription models that could scale up audiences and so thrive once again.
A parallel rebirth came to the television and movie industries. By the 2020s they had made the full transition to streaming all content online, which worked well in a more virtualized world. That, in turn, vastly expanded the commercial and creative possibilities. Next generation television series could go on forever and go into infinite detail — and they did, on almost all topics. And audiences for niche, long-tail content could go global, aggregating sufficient numbers to sustain it.
Interactive Video Virtualized Much of Knowledge Work & Education
The world of interactive group video itself came into its own during the Coronavirus Crisis. The lockdowns forced almost all businesses to virtualize as much as possible of what they previously did. The early days were chaotic and unproductive, but soon the work processes became more efficient and organizations of all sorts realized there were advantages to this approach too. After the virus threat passed, many decentralized work habits remained. Employees realized they liked being able to avoid long commutes and frequent business travel. Business leaders realized they might not need as much expensive downtown real estate. In the decades that followed this new approach to work impacted everything from home construction to transportation to regional development.
The crisis blew apart higher education and finally pushed it into the digital transformation that had already reworked almost every other business sector and dramatically driven down costs. The costs of higher education in physical settings had grown to exorbitant levels that made it extremely difficult for the vast majority of young people to afford — despite the need for such training in a high-tech knowledge economy. The lockdown forced schools to shift to the online learning environments they had resisted for so long. Once there, innovation exploded among teachers and students alike. The post-pandemic college experience was changed forever, and the costs finally came down.
Elementary and high school education learned the opposite lessons from the isolation that came in the crisis. Parents realized how valuable teachers were in the hands-on development of their children. Everyone realized how much learning for kids involved socialization. So the physical experience of neighborhood schools remained relatively unchanged, though better resourced. The changes for them came soon enough with new media like virtual reality.
Almost everyone in my generation and the Millennial generation had grown up gaming and interacting with our friends in what we thought at the time were immersive digital worlds. The arrival of virtual reality in the 2020s took those worlds and the sophistication of online interaction to a whole other level. Our kids went to school and didn’t just open textbooks but entered worlds of the past and the future that we could only imagine before. They could explore the micro level of the body and the macro level of the cosmos. And kids around the world benefited from these educational breakthroughs too.
The new companies that ended up dominating the new video and VR spaces ultimately grew to disrupt and even supersede some of the tech behemoths of my youth. What goes around, comes around.
Ubiquitous AI: Supercharged Human Capabilities Just in Time
When I was a kid the specter of artificial intelligence freaked out almost everybody even before the actual impact of AI arrived. One strain of fear had to do with AI becoming intellectually superior to humans and ending up as our overlords. The other strain of fear centered on the displacement of human jobs as AI and advanced robotics spread into many different fields. Manual workers feared the robots and knowledge workers feared AI.
We know now in 2100 that both fears were misplaced, but for different reasons. We also can see now how important the arrival of AI was. The timing was perfect given the multiple challenges that people faced. Humans needed to significantly augment their capabilities and accelerate their ability to innovate — and AI made that possible. AI could only have emerged after all the digital groundwork had been laid over the previous 40 years: making all information digital, accessible through the cloud, etc. Once that threshold was crossed in the 2020s, there was an explosion of thousands of new AI startups focused on applying what was then called Deep Learning to the data being collected in every corner of every industry and field, in the private and public sectors alike. China became dominant in this dramatic expansion of the application of AI — the first of the high-tech industries where they led the West and we fast-followed.
The pandemics of that era brought the need to keep tabs on where all individuals went and whom they met, so that if someone in their orbit caught the virus all those connections could be traced back. This would have been impossible without half the planet carrying smartphones and AI constantly monitoring everything. China, which had fewer qualms and much more experience with monitoring its citizens’ data at scale, was quick to export what it learned. The West tried as best it could to maintain higher privacy standards, but the benefits of open access to data largely outweighed the concerns even there.
By the 2030s AI was a cheap ubiquitous resource that was available everywhere on the planet and could be applied to almost everything humans did. Another way to look at it was that almost every physical thing in the world — which was trillions of things — became smart. No matter where you went at that time you could use augmented reality to draw information from sensors embedded in the physical and natural world around you. AI was just part of the background tech infrastructure like electricity in a previous era. People just assumed AI was always available and being used around them to enrich their physical and virtual experiences.
By 2050 AI was still far from achieving generalized intelligence that would be comparable with the extremely versatile intelligence of human beings. It turned out that the Deep Learning path of AI could not lead to that kind of intelligence. This became widely recognized in the 2020s and new paths of research into next generation AI started up — primarily in the US. Those efforts made much progress and had many spin-offs, but generalized AI still eludes us.
The Massive Resorting of Work Between Humans & Machines
The worries about AI disruption to jobs and the economy were more grounded in reality. The global economy in the 2020s and 2030s went through a great sorting process. Pretty much any job, manual or cognitive, that was based on routine tasks became vulnerable to the application of AI or advanced robotics — particularly after the pandemic. Essential services that needed to keep operating through health scares or climate disruptions were especially likely to be handed over to machines.
Non-routine tasks remained the purview of humans — and that left quite a few jobs. The other way to think about the sorting was that humans were best suited for work that required creativity, dexterity or empathy — things machines could not do. That left many traditional jobs in high-touch human services as well as many new ones to come. For example, the great retrofit of all housing and commercial real estate for clean energy needed the versatile judgment of humans.
To be sure there was much economic dislocation as this giant sorting process took place. Whole categories of work that humans had once done were turned over to machines. More often machines augmented and scaled the work of smaller numbers of humans who remained employed in a category. In that way the dislocation was similar to other technological transitions in the past like mechanizing agriculture.
What people did not foresee was how much we ultimately ended up needing the robots. The populations of all Western countries kept shrinking because of low fertility rates even as the number of old people kept mounting through the 21st century. For some countries, robots were more palatable than dramatically increased immigration. The same phenomenon hit China and even the more advanced sectors of the developing world. World population peaked at just about 10 billion in the middle of the century. We ended up lucky to have our machines.
Accelerated Innovation: The Outcome that Surprised Us All
One of the most unanticipated developments of the 2020s was the unprecedented explosion of innovation that happened all around the world. Very few people back then could see accelerated innovation coming, though it seems obvious in retrospect. At the time we simply underestimated the potential of our human resources and new tools. The lesson is to never underestimate the ingenuity of humans and their ability to maximize a new tool.
For the previous 10,000 years, humans had lived in a Tower of Babel. That started changing in the 2020s as computers became able to outperform humans in simultaneous language translation by precisely following spoken conversations in real time. Soon any video, audio, text, or in-person connection could be supplemented with simultaneous translation. Travelers with earbuds in one ear could easily interact with anyone they met. All articles, books, scientific papers, documents of any kind could be accurately accessed by anyone — whether they spoke English, Chinese, Farsi, or Swahili.
Innovation is often a product of conceptual cross-fertilization. New ideas often emerge from the spark that comes when experts from different fields collide. The advent of simultaneous language translation ramped up that multidisciplinary cross-fertilization to a degree never seen before.
That new AI capability arrived at a time when innovation also went from an art to a science. Over the previous several decades a body of rigorous research had been done inside the business world and academia on understanding just how innovation works. “Design thinking” emerged as a powerful framework for creative problem-solving. You set an ambitious goal to solve a big problem. Then you engaged the people affected by the problem to understand their reality and what they truly needed and wanted. Then you set aside the old ideas of legacy systems, and the ideological constraints that limited options, and worked through a process to figure out what was actually possible given current technologies and resources. Only then did you settle on a new way forward to address the problem. By the 2020s the design thinking concept was fully developed, well understood by legions of university graduates, and ready to scale.
Armed with new tools and new processes to accelerate innovation, people throughout the world faced up to the massive challenges all around them — starting with the global pandemic. Scientists and technologists jump-started, almost overnight, work that would normally have taken months or even years to move through academic or institutional channels. The speed with which the coronavirus spread and the extent of the damage it caused forced informal sharing of research and new levels of global coordination. We had long talked about the fast pace of “internet time,” but we now had an even faster form of “coronavirus time.”
In almost every field there were innovators who had been thinking deeply about fundamentally reinventing systems for the 21st century but had not been taken seriously or widely recognized. The Coronavirus Crisis created the demand for new ideas and the search for this advance team. In the social media environment of that time they became tagged as the #Ateam showing the #NewWayForward. The emergent network of innovators found each other, and the world found their big ideas outside the 20th century formal bureaucracies and institutions. They were a welcome addition in the tumultuous decades to come.
Genetic Understanding: Reinvented Human Healthcare
Right around when I was born scientists cracked the first human genome at a cost of nearly $3 billion (a lot of money back then). In two decades the cost fell to $1,000, and by 2030 sequencing an individual genome pretty much cost nothing. So it shouldn’t be surprising that by today in 2100 the average lifespan of an American is roughly 100 years and a guy like me — lucky to be healthier than most centenarians — can possibly live up to a couple decades more.
With that Human Genome Project, humans crossed a major threshold that heralded a new age. The door was opened to genetic understanding, the harnessing of nature’s operating system to ultimately remake the living world. This achievement really marked the arrival of Biotech as a potentially world-historical technology comparable in importance to Infotech. It would take a few more decades of breakthroughs and scaling to reach the impact of Infotech, but for most of the 21st century they both were equally transformative.
First off, our understanding of genetics and biological systems opened up very different ways to provide health care more efficiently and effectively for less. By combining new biotech tools with the artificial intelligence that was mining the big data being accumulated on all individuals, a whole new paradigm of personalized healthcare took shape.
Again, the Coronavirus Crisis provided the impetus for driving real change. Americans had long been talking about how to cover all citizens with healthcare like their European counterparts — but that debate often looked back to the past and focused on expanding old programs from the 1960s like Medicare. The pandemic forced everyone to shift their sights to the future. It was reasonable to expect more pandemics to arise in coming decades given our highly globalized world. We certainly needed all citizens covered by healthcare and not dependent on companies that could disappear overnight in the economic carnage. The more information we had on each individual the better — and why not start with their foundational genetic code, off which we could hang all other information for the rest of their lives?
We launched an effort that combined the best of the public and private sectors to figure out how to reinvent health care through what we called the Healthcare Moonshot. We combined aspects of the original moonshot, in which the federal government provided significant funding, with aspects of subsequent private-sector moonshots that spurred competition among teams of companies.
This project was carried out like a state-of-the-art design process: What goals do we want to achieve? Well, we needed a resilient system that could dramatically expand during a pandemic and efficiently serve everyone remotely during a lockdown. What new tools are now available? Well, we’d be crazy not to take advantage of our new understanding of genetics and biology. In the end two factions emerged from the competition, each with its own approach to fundamentally reworking the national healthcare system. Both were vetted through the political process and one emerged as the new way forward. The transition caused quite a bit of disruption, and took years to fully play out, but in the end the American health system was taking full advantage of the biological revolution. Europeans and other developed countries, which in the 20th century had superior healthcare systems, were scrambling to catch up.
The Capability to Edit Human Genes Forced Global Accords
The second step in the genetics revolution was more complicated. CRISPR technology arrived on the scene in practical form around 2015, enabling easy and cheap gene editing. Now humans could not only read the genome but also write to it. The age of genetic engineering had arrived.
The ethics were murky: If we can identify a gene related to a disease, why not edit that gene out to avoid the disease? Within a few years doctors were trying out this technique using CRISPR on adults in ways that would not be inherited by their offspring. But soon rogue doctors were editing embryos, producing babies whose modified genes could be passed on to their offspring, altering the human germline. This jump-started a debate over how to treat human enhancement that could affect our shared germline or give modified individuals unfair social advantages.
Meanwhile, genetic engineering raised the specter of new forms of bio-warfare between nations, or even worse, bioterrorism. It was possible to imagine an enemy using readily available kits to create a form of pathogen that proved resistant to vaccines or treatments. These fears prompted massive investment from the military and national governments into deeper understanding of biological systems. The best defense against these widespread capabilities turned out to be cultivating a deeper and more comprehensive understanding of how all cells fully work. If disaster hit, scientists could quickly respond. Yet again military spending ended up accelerating new knowledge and technologies that became a boon to human health.
The worries around genetic engineering as applied to humans led to the Shanghai Accords of 2032. I was at that gathering, which brought together private and public sector experts from countries around the world and ended up forging guidelines on human genetic engineering applied to peace and war. All nations in the early 20th century had banned the use of chemical weapons, which also were relatively easy to manufacture, and those accords held with very few violations. We did it again in the early 21st century with bioweapons and genetic modification of humans and those accords held with few violations until well after 2050.
Biological Engineering: Made Sustainable Everything Possible
The first half of the 21st century saw a dramatic leap in our ability to manipulate living things for our benefit. Think of it this way: In the previous 4 billion years life on Earth evolved over eons through random mutations and natural selection. By the 2020s humans became able to immediately evolve any living thing through non-random mutation and un-natural selection. The age of biological engineering had arrived, with immense consequences.
Take food. For the first 200,000 years of human existence, we had been hunter gatherers who roamed around and stumbled across food we could eat. For the last 10,000 years, we were farmers who slowly domesticated plants and animals, and then painstakingly bred strains of both that optimized desired traits over decades of trial and error. By the 2020s we had domesticated the cell and could produce pretty much exactly what we wanted on the spot.
The early 21st century skepticism about the safety of genetically modified organisms fell away under two pressures: the crying need for more sustainable crops in the midst of climate change, and the impossible-to-deny improvements in taste, nutrition, and cost.
Climate change also played a part in the rapid adoption of alternatives to meat, particularly beef. Cattle were an extremely inefficient way to produce calories for human consumption. Only about 3 percent of the calories eaten by cattle were converted into the meat that humans ate; the other 97 percent was wasted. Cattle also emitted methane, a greenhouse gas roughly 25 times as potent as CO2 over a century. Yet beef was delicious, and the growing middle classes in Asia and throughout the developing world wanted to eat it just as much as their Western counterparts did.
Luckily two forms of meat alternatives emerged. One was plant-based; it emerged in the 2010s and just kept building in popularity. Proteins were extracted from plants, broken down, and then combined with other nutrients, fats, and flavors to create a product that came close to the experience of tasting and chewing actual meat.
The other alternative emerged in the 2020s and was known as cell-based meat. The process started with an adult stem cell extracted from a cow, pig, chicken, or fish. One cell was sufficient, and the animal could live on. The cell was then placed in a culture and grown exponentially in a bioreactor fed with the amino acids, sugars, and salts that go into meat. The cells eventually were structured into tissues that recreated the experience of chewing meat. This product was meat, and so tasted like meat. It just grew much more efficiently in a vat than in an animal. About 70 percent of the plant calories that went into the vat process were converted into cell-based meat.
By the middle of the 2020s plant-based meat replacements had captured 10 percent of the American meat market. By 2030 cell-based meat had 10 percent of the market, and combined alternative meats had 30 percent. But by 2040 alternative meats were 60 percent and conventional meat was only 40 percent of the American market. By 2050 conventional meat was too expensive compared to the alternatives and so considered a rare treat. The obvious beneficiaries of this shift off conventional meat were the animals. No need to be slaughtered. And today, in our time, the idea of the mass slaughter of sentient animals to eat seems very strange, if not cruel.
The Biological Revolution in Materials Superseded the Industrial
One more front in the synthetic biology revolution made an impact on the environment — materials. The majority of things that a person in the early 21st century touched each day were biological — food, clothing, buildings. The actual materials they touched were grown, whether vegetables, cotton or wood. The biological toolkit that opened up in the 2020s allowed innovators to rapidly improve all those materials. Over time that just made organic material better and better. And anything that grows organically also pulls CO2 out of the atmosphere, so the growth of synthetic biology scaled up carbon sequestration.
The rest of the non-organic materials that surrounded a person in the early 21st century could also begin to be reappraised and possibly replaced through the wonders of synthetic biology. Plastics had been developed by the petrochemical industry in the first half of the 20th century and mass-produced from around World War II on; by 2010 annual production of plastics throughout the world had reached more than 300 million metric tons. The world was awash with plastic bottles, bags and packaging of all sorts. Landfills, lakes, and oceans were filling up with material that could take centuries, even a thousand years, to biodegrade.
Biological material, by contrast, is built to biodegrade quickly. By the 2020s companies had zeroed in on whole classes of molecules that broke down after extended exposure to sunlight or seawater. New materials were soon developed that began to make inroads on the plastic packaging front: they stayed strong and coherent when needed but decomposed over time once discarded. Yet another new way forward.
Editor’s Note from the Medium package marking the arrival of the year 2100: This interview with Stuart Rand will continue in next week’s story.
Ending note from the world of 2020: This story is the third in The Transformation series that tells the largely positive story of America and the world from 2020 until 2050 from the perspective of Stuart Rand, a journalist and author born into Generation Z, looking back at the end of his life in 2100. You can get an introduction to the entire series in the voice of author Peter Leyden writing in 2020 here. You can get an overview of the overarching story of the series from Stuart’s perspective in 2100 here. The initial series of 6 stories will continue to come out each week, and all that have been published can always be found on the series landing page here. Follow Peter Leyden to get notified when future stories come out.