How AI Will Define New Industries
If you were a brilliant artificial intelligence (AI) expert just graduating from a doctoral program at a prestigious school, would you pursue that startup you’ve been thinking about, join a company that wants to build cutting-edge AI applications, or use your expertise to help scientists in other fields conduct basic research?
The opportunities presented by the first two options are outrageous, and growing more outrageous by the day.
With more than 2,000 startups absorbing much of the top-tier AI talent, a pool estimated by some to be just 10,000 individuals worldwide, the combination of great scarcity and even greater demand is driving salaries through the roof. Some businesses offer seven-figure compensation packages for elite AI talent.
Plus, there has never been a better time to launch an AI startup. Investment in AI-focused ventures has grown 1,800% in just five years, from $282 million in 2011 to more than $5 billion in 2016, according to CB Insights.
The rationale behind these numbers comes, in part, from the fact that companies expect AI to allow them to move into new business segments or to maintain their competitive advantage in their industry. Driving this expectation is the idea that AI will enable them to dramatically improve the efficiency or effectiveness of their operations and offerings.
Despite the lucrative financial opportunities in these first two career paths, the alternative choice, to collaborate with scientists in other fields, may be more pivotal to how nations and businesses compete over the long term.
Conventional wisdom, based on no small amount of research, holds that AI-driven automation will create technological unemployment in a variety of sectors in the next five to 10 years.
Optimists believe that this won’t be a big deal for the labor sector as a whole because corporate adoption of AI (and other digital trends) will create new industries and new job categories that will replace whatever AI-driven losses occur in labor sectors of the current economy.
But there are two open questions about this hypothesis:
1. How will these new industries be created?
2. How soon will they come, if they come at all?
The creation of new industries frequently depends on dramatic advances in science and technology that can take decades to move from discovery to commercial application to new industry.
The industry around AI is already 60 years in the making, and we’re still not there yet. A cursory look at the history of three technologies crucial to modern life shows the lengthy, complex path to new industry creation.
One is the Global Positioning System (GPS). Stephen Hawking, in his book A Brief History of Time, contends that GPS would not be possible were it not for Einstein’s 1915 theory of general relativity (GR). GR explains why clocks aboard orbiting satellites run at a slightly different rate from chronometers on Earth; correcting for that relativistic difference is what allows GPS satellites to work together to precisely track movements on Earth.
Precision agriculture, autonomous cars, Waze, and Uber are just some of the many applications of GPS. According to a 2015 government estimate, GPS contributes at least 0.4% to the US economy, a number that omits several sectors and indirect benefits.
Another example is the Internet, which was first conceived, arguably, by J.C.R. Licklider of MIT in August 1962, in a series of memos that described a “Galactic Network,” a globally interconnected set of computers through which people could quickly access data and programs from any site.
It took many inventions (such as packet-switching technology) and investment (by the Defense Advanced Research Projects Agency of the US government) to develop the foundations for commercial applications related to this idea.
An industry around the Internet emerged only after the creation of the World Wide Web (1989) and commercial browsers (circa 1994). The Internet bubble occurred nearly four decades after the Internet was first conceived.
A third example is recombinant DNA and other gene-related technologies, which have become tremendous sources of economic value in terms of jobs and business opportunity; the market value of recombinant DNA technology alone is expected to reach $844 billion by 2025.
The “genes industry” owes much to the discovery of DNA structure in the early 1950s by James Watson, Francis Crick, and Maurice Wilkins (and Rosalind Franklin). Their discovery of the double-helix structure of DNA revolutionised scientific understanding of genetic code, making possible advances in agriculture, cancer treatment, and personalised health care.
Given how long it takes to develop new industries (and the basic research breakthroughs that enable them), there is a real question: even if new industries based on AI and other digital advances eventually create new job categories, will those jobs emerge fast enough to maintain sufficiently high employment levels in the economy in the interim?
While a recent study by Gartner indicates that AI may create more jobs than it eliminates, the study also concedes that AI will destroy “millions of middle- and low-level positions.”
Creating these new industries, and retraining the labor force to fill the jobs they create, in a timely way is a significant challenge for at least two reasons. One is that labor unrest and/or economic downturns become more likely if unemployment stays too high for too long. Even retraining is no panacea if the new jobs from new industries have yet to be created.
The other is that climate change-related economic effects are on the rise, with a disproportionately negative impact on lower- and middle-income populations.
If new job categories are not created before climate-related effects wreak havoc on infrastructure and private property, the entire economic system becomes less stable.
Today, AI expertise is focused more on developing commercial applications that optimise efficiencies in existing industries, and focused less on developing scientific applications that could give rise to new industries.
These efficiencies accelerate sectoral consolidation and convergence; they do little to create new industries. Platform companies in the sharing economy, such as Uber, Lyft, and Airbnb, are disrupting existing industries, not creating new ones.
Amazon’s moves into groceries and pharmaceutical distribution are shaking up incumbents, but not creating new industries. Squeezing ever more efficiency out of business models is usually great for profitability, but rarely a boon for the labor force.
We don’t want to minimise the effect of digital trends on existing labor markets. The web is filled with lists of new job titles and job types, and, by and large, jobs have yet to be lost as a result of AI. For now, managers remain much more optimistic about what AI can do for them in their jobs.
However, AI’s most potent, long-term economic use may just be to augment the discovery and pursuit of basic scientific advances that could be the foundations of new industry. Few companies have a long-term interest in using AI in this way. It’s simply not in their near-term commercial interests.
But promoting this potential is a role that can be played by government. And if the history of the Internet and GPS is a guide, the real winners in an economic era dependent on AI technologies will be the countries or regions that align private and public sectors to allocate scarce AI expertise to augment basic scientific discovery (as well as commercial applications).
If AI can supercharge discovery, commercial applications, and the creation of new industries, it is short-sighted from a social perspective to focus scarce AI talent on developing only commercial applications and disrupting current industries.
Accelerating the pace of scientific discovery may be the most important societal use of AI. It’s time for business and government to work together to promote that potential.