Why Artificial Intelligence Won’t Replace CEOs
An MBA’s instinct is increasingly vital in the age of information overload
Peter Drucker was prescient about most things, but the computer wasn’t one of them. “The computer ... is a moron,” the management guru asserted in a McKinsey Quarterly article in 1967, calling the devices that now power our economy and our daily lives “the dumbest tool we have ever had.”
Drucker was hardly alone in underestimating the unfathomable pace of change in digital technologies and artificial intelligence (AI). AI builds on the computational power of vast neural networks sifting through massive digital data sets, or “big data,” to achieve outcomes analogous to, and often superior to, those produced by human learning and decision-making. Careers as varied as advertising, financial services, medicine, journalism, agriculture, national defense, environmental sciences, and the creative arts are being transformed by AI.
Computer algorithms gather and analyze thousands of data points, synthesize the information, identify previously undetected patterns and create meaningful outputs—whether a disease treatment, a face match in a city of millions, a marketing campaign, new transportation routes, a crop harvesting program, a machine-generated news story, a poem, painting, or musical stanza—faster than a human can pour a cup of coffee.
A recent McKinsey study suggests that 45 percent of all on-the-job activities can be automated by deploying AI. That ranges from file clerks, whose jobs can become 80 percent automated, to CEOs, whose jobs can be 20 percent automated because AI systems radically simplify and target their reading of reports, risk detection, and pattern recognition.
AI has been one of those long-hyped technologies that hasn’t transformed our whole world yet, but it will. Now that AI appears ready for prime time, there is consternation, even among technologists, about the unbridled power that machines may have over human decision-making. Elon Musk has called AI “our biggest existential threat,” echoing Bill Joy’s 2000 warning in Wired magazine that “the future doesn’t need us.” On the other side, of course, are enthusiasts eager for smart machines to improve our lives and the health of the planet.
I’m on the side of Microsoft CEO Satya Nadella, who says that we should be preparing for the promise of ever-smarter machines as partners to human decision-making, focusing on the proper role, and limitations, of AI tools. For business school educators like me who believe the future will indeed need us, the expanding power of AI, or deep learning, poses both a challenge and an opportunity: How do we prepare students for the coming decades so that they embrace the power of AI and understand its advantages for management and leadership in the future?
It would be a mistake to force every MBA graduate to become a data scientist. The challenge for business schools is to update our broadly focused curricula while giving our MBAs a greater familiarity and comfort level with data analytics. Tomorrow’s CEOs will need a better sense of what increasingly abundant and complex data sets within organizations can, and cannot, answer.
The sophistication and volume of data may be increasing, but history affords models of a decision maker’s proper relationship to data analytics.
Take D-Day. General Dwight D. Eisenhower sought as much data as possible to inform his decision on when to land hundreds of thousands of Allied forces on the beaches of Normandy in that fateful late spring of 1944. As Antony Beevor’s book on the battle and other accounts make clear, Eisenhower especially craved reliable meteorological data, back when weather forecasting was in its infancy. The general cultivated Dr. James Stagg, his chief meteorologist, and became adept not just at analyzing Stagg’s reports, but also at reading Stagg’s own level of confidence in any report.
For months before the fateful decision to “embark upon the Great Crusade,” Eisenhower developed a keen appreciation for what meteorological forecasts could and could not deliver. In the end, as history knows, Stagg convinced him to postpone the invasion from June 5 to June 6, even as the predicted storm raged over the English Channel and many others questioned Stagg’s call that it would soon clear.
No one would argue that Eisenhower should have become an expert meteorologist himself. His job was to oversee and coordinate all aspects of the campaign by collecting pertinent information, and assessing the quality and utility of that information to increase the invasion’s probability of success. Today, big data and the advent of AI expand the information available to corporate decision-makers. However, the role of a CEO in relation to data echoes the absorptive and judgmental function exercised by General Eisenhower in reading probabilities into his meteorologist’s weather reports.
It’s noteworthy that today, amidst all the talk of technological complexity and specialization across so much of corporate America, a Deloitte report prepared for our school found that employers looking to hire MBA graduates value prospective employees’ “soft skills” more than any others. They want to hire people with cultural competence and stronger communication skills, who can work collaboratively in diverse teams, and be flexible in adapting continuously to new opportunities and circumstances in the workplace and market.
This isn’t just about intolerance for jerks in the office. It’s about a leader’s need to synthesize, negotiate, and arbitrate among competing and conflicting environments, experts, and data. If there was once a time when corporate leaders were paid to make “gut check” calls even when essential information was lacking, today’s CEOs will increasingly have to make tough, interpretive judgment calls (a different type of “gut check”) in the face of excessive, often conflicting, information.
Those in the driver’s seat of institutions have access to an expanding universe of empirically derived insights about widely varying phenomena, such as optimal models for unloading ships in the world’s busiest ports in various weather conditions, parameters of loyalty programs that generate the “stickiest” customer response, or talent selection models that yield both the most successful and the most diverse employment pools.
Corporate leaders will need to be discerning in their use of AI tools. They must judge the source of the data streams before them, ascertain their validity and reliability, detect less-than-obvious patterns in the data, probe the “what ifs” that remain, and ultimately make inferences and judgment calls that are more informed, more sensitive to context, and more valid and useful because they are improved by intelligent machines. Flawed judgments built on flawed or misinterpreted data could be even more harmful than uninformed flawed judgments, because the aura of data lends them an illusion of quasi-scientific authority.
As a project management tool, AI might prescribe optimal work routines for different types of employees, but it won’t have the sensitivity to translate those prescriptions into nuanced choices of one organizational outcome (e.g., equity in employee assignments) over another (e.g., family values). AI might pinpoint the best location for a new restaurant or power plant, but it will be limited in mapping the political and social networks that need to be engaged to bring the new venture to life.
Machines also lack whimsy. Adtech programs have replaced human ad buyers, but the ability to create puns or design campaigns that pull at our heartstrings will remain innately human, at least for the foreseeable future.
A new level of questioning and integrative thinking is required of MBA graduates. As educators, we must foster learning approaches that develop these skills: teaching keen data management and inferential skills, developing advanced data simulations, and practicing how to probe and question the as-yet unknown.
In parallel with the ascendancy of machine power, the importance of emotional intelligence, or EQ, looms larger than ever in preserving the human connectivity of organizations and communities. While machines are expected to advance to the point of reading and interpreting emotions, they won’t have the capacity to inspire followers, the wisdom to make ethical judgments, or the savvy to make connections.
That’s still all on us.
Judy D. Olian is dean of the UCLA Anderson School of Management.