In Silicon Valley, as in many parts of the world, we have long prioritized the promise of technology over the risks. After all, no founder purposefully builds a company that aspires “to do evil” or seeks to “destroy the world’s information.” Technology founders are, by and large, thoughtful, responsible individuals who genuinely believe that they’re making the world a better place.
So why, then, do we find ourselves in a world where robots threaten our jobs, algorithms determine our news and curate the communication of our friends, self-driving cars cause fatalities, and misuse of data undermines democracy?
We’ve too long, too falsely believed in the monotheism of technology, of learning to code. If we simply get enough people studying science, technology, engineering, and math, we will progress to a fruitful world of low unemployment and high productivity—or so the argument goes. But what we forget in endless hashtags and coding bootcamps is that without an appreciation for our own humanity, this optimization is mechanistic.
We certainly need more diversity in STEM and more people believing that they can be part of the innovation economy. But this also means broadening diversity in whom we hire and reconsidering the real skills the humanities and liberal arts provide.
If science is the study of the natural world and how it works, the humanities are the study of our place in it. Anthropology is the study of humanity, and history is our written record. Politics and sociology teach us about our collaborative and interdependent nature. Psychology and philosophy are the introspection into our fears, passions, and motivations. They are the study of how we persuade and how we can be manipulated, a contemplation not just of what we can do, but of the more complex how and why we actually do it.
Anyone who has sat through a VC pitch knows that your charisma, energy, authenticity, and communication matter as much as any technology you’ve built. Most investors analyze motivations, passion, and grit alongside the tech. Does this entrepreneur authentically want to solve this problem? Will this founder be able to sell this vision one thousand more times to win customers, hire a team, and raise more money? The evaluation of any startup is far more philosophical and psychological than it is technical.
I recall speaking to investors who saw Jack Dorsey’s presentation of Square. “That was a seduction,” they told me. Sure, the tech looked good, but the personality looked better.
And when entrepreneurs or product managers whiteboard late into the night about customer acquisition and onboarding flows, the debates center on user behavior. These are anthropology and philosophy conversations. On a one-to-many platform, will users abuse the right to post publicly? What are their rights to free speech versus the platform’s obligation to protect their safety? These are civil liberty issues and questions of privacy and security.
One look at Facebook today and it’s clear that the gravest challenges are ethical, editorial, and philosophical. They’re issues of culture and communication, of politics and law. While they certainly face technical challenges daily, these algorithmic tweaks in how they store, sort, and retrieve data are less interesting than the sociological and political ramifications they take on when they become public.
People don’t understand or question technology in the abstract; they question the outcomes it generates. We forget that it is the application of technology that matters—its touch point, or friction, with human beings.
After all, all technology is human; it is created of, by, and for the people.
Fei-Fei Li (Google Cloud’s head of artificial intelligence) recently wrote a piece in The New York Times in which she called for “human-centered AI.” Along with Melinda Gates, she founded an organization called AI4All, a non-profit with the mission of broadening the diversity of those who participate in shaping our AI.
She has made the argument that there is nothing “artificial” about AI, but that it is made for and produced by people, each of whom has their own interests and biases. She is one of the world’s experts in AI, and also one of the strongest proponents of bringing more philosophers, anthropologists, and psychologists into the conversation. There is no way to build effectively for human beings without engaging in the study of human nature.
As we think about technology and what makes it successful, we might move beyond the singular drumbeat of STEM and instead consider a blending of the sciences and humanities in how we teach and hire. At Stanford University, some of the most successful founders have come out of a major called “Symbolic Systems.” The program gives graduates an appreciation for technology while at the same time teaching philosophy, psychology, linguistics, and logic.
At Google, the philosophy of “20-percent time” allows high-performing employees to spend part of their time working on side projects that can develop into new products or passions. In considering how to blend our teams, we might consider the concept of “20-percent talent.” While a sales or operations team might need mostly high-energy extroverts to find and convert sales leads, they’d do well to have a few techies optimizing processes. And while a data science team might need mostly statisticians, 20-percent talent in philosophy or anthropology would help them better frame problems and ask questions.
Technology is not monolithic, nor is it made up only of engineers and coders. The sooner we recognize that the humanities and the sciences contribute equally to solving our gravest challenges, the faster we’ll be able to put new tools to work effectively.
The best leaders and companies are those that blend the fuzzy and the techie.