In the decade now drawing to a close, much of the world’s business and everyday life became fully digital. But innovation is far from over.
The 2020s will see further refinement of established technologies and the practical rollout of newer ones, such as quantum computing, that are still at the experimental stage.
In December 2029, we’ll no doubt be remarking on inventions that today we still can’t imagine. But for now, here is IBM’s preview of the year and decade ahead.
Artificial intelligence technology probably won’t take your job. But it will change how you work.
Recent research on the future of work, from the MIT-IBM Watson AI Lab, shows that AI will increasingly help us with tasks that can be automated, such as scheduling. But it will have a less direct impact on jobs that require human skills such as design expertise and industrial strategy.
Even newsrooms have adopted AI for mundane tasks like writing quarterly earnings stories, freeing reporters to focus on explaining the bigger trends affecting people and society.
We can expect more workers in 2020 to begin seeing these effects as AI makes its way into all sorts of workplaces around the world.
Workers will benefit as tasks that complement AI solutions take on greater value, with increased compensation to match. But moving into those more highly valued tasks will require new skills and new career paths. That's a challenge for employers and employees alike.
Governments, in turn, will need to devote more resources to support training and education in the private sector—and possibly adjust tax policies to enable businesses to support training and reskilling.
Martin Fleming, VP and Chief Economist at IBM, and author of that MIT-IBM Watson future-of-work report, notes that AI is a new technology different from anything that preceded it because of its ability to draw on vast pools of unstructured data, solve nonlinear optimization problems, and use fast, inexpensive computing power.
"AI technology," Fleming says, "has the potential to increase the productivity of workers as well as productivity in all walks of life."
Embracing the Flexible Freedom of the Hybrid Cloud
Cloud computing – the ability to run applications and access data anywhere, anytime – has moved into the mainstream over the past decade. That's because cloud computing offers major advantages in scalability, resiliency and cost savings, compared with the traditional corporate computing model that relies on costly on-premises hardware situated in centralized data centers.
But as many organizations have found, putting all the enterprise’s data on a single cloud provider’s platform often isn’t workable. There are simply too many technically disparate, geographically far-flung data systems to coordinate. And so, more recently, large organizations are adopting a hybrid multicloud approach – a mix of public cloud, private cloud and on-premises resources from different vendors. This allows them to use the best solution for the tasks at hand while avoiding being locked in to any single vendor.
This hybrid multicloud approach has become a viable path for enterprises, particularly as the public cloud services within hybrid environments have proven the ability to support the security, data protection and compliance requirements that businesses demand.
While hybrid cloud computing provides the ultimate flexibility, it works only if it’s based on open standards so that software developers can build an application once and run it anywhere. That’s the reason IBM in 2019 completed its acquisition of Red Hat, the leading provider of open-standards hybrid cloud technologies.
Now that the wisdom of hybrid cloud computing has become widely recognized, 2020 is the year it will go mainstream—setting a model for corporate data that will continue well into the decade. Analysts now see hybrid cloud as a $1.2 trillion market opportunity, and nearly 80 percent of IT decision makers see it in their future.
"Hybrid multicloud is becoming more accepted by enterprises," says Bala Rajaraman, IBM Fellow and VP Cloud. "So, in 2020 there's going to be a lot of focus on execution."
Quantum Continues on Path to Practicality
These are still early days for one of the future’s most promising technologies, quantum computing. But in 2020 and the decade beyond, quantum will move beyond the realm of theory into the world of practical experiments and applications.
The technology moves beyond the bits and transistors of traditional chip-based computing to harness the behavior of matter at the subatomic scale. But it's not necessary to be a quantum physicist to grasp the main point: quantum computing seeks to solve complex problems that have been considered unsolvable using classical computers alone.
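One way to get an intuition for the difference is a toy classical simulation of a single qubit. The sketch below is illustrative only — real quantum hardware does not store amplitudes explicitly the way this code does — and all the function names are our own:

```python
import math

# Toy classical simulation of one qubit, represented as a pair of
# amplitudes for the basis states |0> and |1>. Illustrative only.

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Born rule: measurement probabilities are the squared amplitudes."""
    a, b = state
    return (a * a, b * b)

qubit = (1.0, 0.0)          # start in |0>
qubit = hadamard(qubit)     # now an equal superposition of |0> and |1>
p0, p1 = probabilities(qubit)
print(round(p0, 3), round(p1, 3))  # the two outcomes are equally likely
```

Classically simulating n qubits this way takes 2^n amplitudes, which is precisely why certain problems are intractable for classical machines but natural for quantum hardware.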
Since putting the first quantum computer on the cloud in 2016, IBM has focused on making the technology available to industry, academia and anyone else inspired by quantum computing’s potential.
Science and industry have begun to experiment with the first applications that combine quantum and classical computers to explore problems in chemistry, financial services and machine learning. In the coming years, those pioneering organizations will be the first to see the technology's true potential—provided their workers have the right skill set to take advantage of quantum computing.
As the next decade unspools, students must be “quantum ready” when they enter the workforce. That means quantum computing should be an integral part of any university technical curriculum. Every student who’s taking computer science courses or chemistry or business classes should become familiar with this new technology and consider career paths rooted in quantum theory, information science and programming.
Blockchain Becomes a Business Basic
In 2019 blockchain proved itself in the corporate mainstream. Next year the digital-ledger technology will continue to mature and expand as the way businesses can trust and verify every transaction, even when dealing with the largest of global supply chains.
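The "trust and verify" property rests on a simple mechanism: each record carries a cryptographic hash of its contents plus the hash of the previous record, so any tampering breaks the chain. Here is a minimal stdlib sketch of that idea (illustrative only — enterprise platforms layer consensus, permissioning and smart contracts on top, and the field names below are invented):

```python
import hashlib
import json

# Minimal hash-linked ledger. Each block stores its data, the previous
# block's hash, and a hash over both.

def make_block(data, prev_hash):
    block = {"data": data, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps({"data": data, "prev": prev_hash}, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain):
    """Recompute every hash and check the links; tampering breaks both."""
    for i, block in enumerate(chain):
        expected = hashlib.sha256(
            json.dumps({"data": block["data"], "prev": block["prev"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = [make_block({"lot": "A17", "event": "harvested"}, "0" * 64)]
ledger.append(make_block({"lot": "A17", "event": "shipped"}, ledger[-1]["hash"]))
print(verify_chain(ledger))              # True
ledger[0]["data"]["event"] = "repacked"  # quietly rewrite history...
print(verify_chain(ledger))              # False -- the chain no longer verifies
```

Because every participant can run this verification independently, no single party has to be trusted with the authoritative record — the basis of supply-chain applications like those described below.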
A major proof point this year was IBM Food Trust. That blockchain network is being used by a growing number of retailers including Walmart, Albertsons and Carrefour. Major food suppliers like Dole and Raw Seafoods, as well as a variety of supply-chain partners, are also using the technology to track sourcing, transport and delivery. Among other capabilities, IBM Food Trust—which is now 200 members strong—can enable retailers to quickly and precisely track down the source of specific lots of food in the event of a safety-related recall.
Another well-established example is TradeLens, a blockchain network for the global shipping supply chain that was jointly created by A.P. Moller-Maersk and IBM. TradeLens now carries data on more than half the world’s shipping cargo.
As blockchain networks like Food Trust and TradeLens grow and mature, they are capturing millions of data points, opening the door to new possibilities for machine learning and AI-based capabilities.
For industry groups, blockchain creates a need for new forms of governance to cover issues that include decision-making, permissions for data access, and payment systems among parties. As a potential model, Food Trust members pioneered a comprehensive governance model for the network to help ensure that the rights and information of all participants will be managed and protected appropriately.
With blockchain becoming a business basic for more and more industries, the technology focus will shift toward interoperability – establishing open-system standards so that existing blockchain networks can communicate and integrate with legacy systems and other external blockchain networks. Open standards, in turn, will speed blockchain adoption across industries.
"Blockchain is no longer a 'what if?' proposition," says IBM Blockchain General Manager Marie Wieck. "Today, enterprise blockchain platforms are already beginning to solve decades-long problems in industries ranging from food to international trade."
Computing Is About to Get a Lot More Edgy
Today's consumer electronics, cars, factory-floor machines and all sorts of other digital devices are equipped with sensors that collectively generate petabytes of data.
There are already an estimated 15 billion of these increasingly intelligent devices operating at the outer edges of networks. And their numbers are expected to reach 55 billion by 2022.
For businesses, the proliferation of such edge devices offers an opportunity to open a real-time window into everything from how street traffic is flowing to whether a factory robot needs maintenance. The way to achieve this is through what’s called edge computing—a way of analyzing all that sensor-generated data by performing computations as close to the source as possible, instead of sending it all back to a data center via the cloud.
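The pattern described above — compute near the source and send back only results — can be sketched in a few lines. In this toy example (the threshold, field names and readings are invented for illustration), a batch of raw sensor readings is reduced locally to a single summary record, which is all that needs to travel upstream:

```python
import statistics

# Edge-side summarization: collapse a batch of raw sensor readings into
# one compact digest instead of shipping every reading to the cloud.

def summarize(readings, alarm_threshold=90.0):
    """Reduce a batch of raw readings to a small summary record."""
    return {
        "count": len(readings),
        "mean": round(statistics.fmean(readings), 2),
        "max": max(readings),
        "alarm": max(readings) > alarm_threshold,
    }

raw = [71.2, 70.8, 72.5, 95.3, 71.0]   # e.g. one minute of vibration data
digest = summarize(raw)
print(digest)   # five readings collapse to one record, with the alarm flag set
```

The bandwidth saved is modest here, but the same pattern applied to millions of devices sampling many times per second is what makes edge deployments practical — and the locally raised alarm flag arrives without a round trip to a distant data center.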
The next decade will see a surge in edge computing, aided by the telecom industry's rollout of 5G technology – a high-speed, low-latency wireless format well suited to the close-to-the-source needs of edge computing. Compact, efficient computer servers located at the network's edges can put the processing power where it can best be used.
Whether the data comes from sensors on a factory floor, consumers shopping in a retail environment, or mobile devices connected to a company’s enterprise network, edge computing is a “transformative technology," says Ryan Anderson, IBM platform strategist, CTO group for edge development.
Retailers will benefit from faster updates on consumer buying trends; factories will be able to perform predictive maintenance on equipment that's about to fail; and cellphone carriers will be able to support mobile gaming and augmented reality.
"Edge is an enabling technology to deliver things that do work to the places that work needs to get done," Anderson says. "Think of edge as the technology to get the workloads to the places where it makes the most sense to get the work done."
AI Goes Greener
Data centers are the linchpins of our modern world. We rely on them for everything from enterprise computing to social media to streaming our favorite movies.
They also consume massive amounts of electricity, accounting for as much as 2 percent of the world’s energy use. With growing demand for cloud computing and data-intensive AI applications, big server-packed data centers are becoming even more energy ravenous. AI workloads are on pace to double every three to four months.
To support AI and cloud workloads while reducing supercomputing's carbon footprint, technologists in 2020 will be working to make these energy-intensive forms of computing more sustainable. Rather than relying only on renewable sources of electricity, these emerging techniques aim to make the data processing itself more energy efficient.
One approach, for example, is "approximate computing," which relaxes numerical precision where it isn't needed and can allow a four-fold reduction in energy consumed. Researchers have applied new algorithmic approaches to achieve those performance and efficiency gains without accuracy loss, even for the most complex AI models.
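A common form of this idea is reduced-precision arithmetic: compute with far fewer bits and accept a small, bounded error. The toy sketch below (our own illustration, with an arbitrary scale factor — not IBM's method) quantizes values to 8-bit integers before a dot product, the core operation in neural networks, and compares the result to the full-precision answer:

```python
# Reduced-precision sketch of approximate computing: quantize floats in
# [-1, 1] to 8-bit integers, compute a dot product, then rescale.

def quantize(xs, scale=127.0):
    """Map floats in [-1, 1] to signed 8-bit integer levels."""
    return [round(x * scale) for x in xs]

def approx_dot(a, b, scale=127.0):
    """Dot product carried out entirely in small-integer arithmetic."""
    qa, qb = quantize(a, scale), quantize(b, scale)
    return sum(x * y for x, y in zip(qa, qb)) / (scale * scale)

a = [0.12, -0.5, 0.33, 0.9]
b = [0.7, 0.1, -0.25, 0.4]

exact = sum(x * y for x, y in zip(a, b))
approx = approx_dot(a, b)
print(round(exact, 4), round(approx, 4))  # close, despite using far fewer bits
```

Integer multiplies of this width cost far less energy and chip area than floating-point ones, which is where the efficiency gain comes from; the research challenge the text describes is keeping the accumulated error negligible across the billions of such operations in a large AI model.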
Pushing AI acceleration beyond the limits of digital technology, new efforts include building analog AI hardware devices with new, non-volatile memory materials acting as resistors. By putting memory and computation in the same place, analog AI hardware speeds up AI model development and deployment while cutting power consumption. New materials for memory systems have been introduced and tested to optimize these analog AI devices.
The overall goal: Let more of the world benefit from the growing capabilities of AI and other computing technologies in the coming decade—and to do it in increasingly sustainable ways.
"We must develop technologies which will sustainably advance and scale AI," says Mukesh Khare, Vice President of Systems Research at IBM. "It's essential to develop energy efficient AI systems, and we'll see some truly innovative software and hardware solutions for meeting these massive energy needs in the years ahead."