Is humanity doomed? Are we one of the last generations of Homo sapiens — soon to be supplanted by engineered cyberbeings bearing only a distant resemblance to their creators (us)?

On Jan. 24, historian and international best-selling author Yuval Noah Harari presented his view of the future at the World Economic Forum in Davos, Switzerland. Harari is the author of Sapiens: A Brief History of Humankind and Homo Deus: A Brief History of Tomorrow.

In a riveting 25-minute presentation, Harari painted a gloomy — but plausible — view of the future, based on his thesis that we are now in our third grand revolution: the control of data, following the control of land (the Agrarian Revolution) and the control of machinery (the Industrial Revolution). The point of no return, Harari contends, will come when technology is able to extract high-precision biometric data from people and report it back to a centralized decision-making system owned by governments or by corporations — or both. By biometric, he means your pulse, blood pressure, sweat composition, the dilation of your pupils, and so on: a kind of lie detector on steroids.

For example, Harari said, if the North Korean government forces its citizens to wear a bracelet that transmits biometric data to government data centers, the government will be able to monitor how people feel about their leader and about pretty much everything else in their lives. It may come to know more about them than they know about themselves, given that we are often unaware of what's really going on inside us.

Harari's vision echoes Ray Kurzweil's "singularity," without Kurzweil's somewhat romantic expectation that technology will bring us immortality. (Not really "us," but our essence captured as digital data.) The interested reader can watch Transcendent Man, a documentary about Kurzweil and his vision.

The main idea is that we are close to being able to hack life itself: If we think of organisms as algorithms, then it becomes just a matter of computing power and enough biometric data before we can engineer any kind of creature from scratch. After all, if life is like a computer program (the software) running on biochemistry (the hardware), all we have to do is collect enough information to write our own algorithms for living things. That, coupled with the rise of machine learning and artificial intelligence, will seal our doom as a species; or, seen another way, it will put the future of evolution in our hands rather than leaving it to the randomness of natural selection.

The main question, then, Harari asks, is: Who is going to control this data? How is this wealth going to be regulated? We have laws that regulate land and machines. What are the laws that regulate data and the privacy of individuals? Will people willingly give away their privacy, their biometric information, to a centralized data-processing system? Possibly, Harari argues, if your health depends on it. Or if you live in a dictatorship and have no choice. Or if you trade that data for services offered by corporations, something that is already beginning to happen. ("Get your deliveries and other offers for free if you allow us to monitor your smartwatch data.")

When asked when this is going to happen, Harari is vague — as he should be: decades, possibly a century. But in his view, as in Kurzweil's, this future is pretty much a done deal.

Of course, no one can predict the future. What we can do is look at current trends and extrapolate as best we can. There is no question that computing power will continue to increase, and that our knowledge of genetics and bioengineering will as well. Data science, which mostly serves the interests of governments and corporations, will also grow savvier as machine learning progresses. Market forces and investor greed will keep steamrolling the data revolution forward. Are there no balancing forces to this onslaught?

I think there are.

First, there is the rise of corporate ethics. A growing number of companies are realizing that if they don't align their values with those of their customers, they will lose them. A current example is the boycott of the NRA by many companies, from Delta Air Lines to Best Western. Reluctant corporations are being pressured by their customers to change. Consumers have power — and they can be heard when they mobilize. Corporations or institutions perceived to have low ethical standards may be forced to change or to close their doors.

Another point is that there is only so much information we can gather about any natural system, including ourselves. The sort of extremely precise and complete mapping of all human metabolic functions and brain activity that Harari envisions is a dream. Science cannot be omniscient; that is reserved for supernatural gods. There are limits to what technology can do: Every instrument has a finite precision and is blind to what goes on beyond what it can probe. Monitoring the activity of about 85 billion neurons and the flow of neurotransmitters across trillions of synapses seems highly implausible, even with my science-fiction nerd hat on. What we can achieve (and will) is an incomplete mapping of the human brain and body.

In being so definitive about our doomed future, Harari seems to be confusing the map with the territory. The map (how science describes reality) is not the territory (reality in its totality). We have no way of knowing what reality in its totality even means, given how our scientific narrative depends on our limited experience of reality. Humans perceive and describe the world in a uniquely human way. Science cannot eradicate doubt. Quite the opposite: As science learns more about the world, it also opens new questions we couldn't have anticipated.

Granted, governments and corporations may not need this level of fantastic detail about our bodies and brains to do a lot of damage. That, to me, is a more realistic outlook. I do agree with Harari that we need to start a conversation as soon as possible about our collective future — and that this conversation cannot be relegated exclusively to politicians. Who will control the ownership of data? What limits and safeguards should be imposed so that there is no rise of data dictatorships?

To tackle the upcoming changes, we need a plurality of views: Scientists, humanists, and business and community leaders must all be part of the conversation. The real danger is that we do nothing. If history is any guide, trouble starts whenever the ownership of land or of machinery is concentrated in the hands of too few. With data, we face the same issue, but with a more challenging and fluid commodity.

Meanwhile, be aware of how much biometric data from your Fitbit or smartwatch you share on the Web.


Marcelo Gleiser is a theoretical physicist and writer — and a professor of natural philosophy, physics and astronomy at Dartmouth College. He is the director of the Institute for Cross-Disciplinary Engagement at Dartmouth, co-founder of 13.7 and an active promoter of science to the general public. His latest book is The Simple Beauty of the Unexpected: A Natural Philosopher's Quest for Trout and the Meaning of Everything. You can keep up with Marcelo on Facebook and Twitter: @mgleiser

Copyright 2018 NPR. To see more, visit http://www.npr.org/.
