In 1837, a blacksmith from Grand Detour, Illinois, came across a broken steel saw blade at a mill. He took the blade home (steel was scarce at the time) and fashioned it into a plow perfectly suited to the region's dense soil. He began mass-producing the tool, and soon the name John Deere became synonymous with farm equipment. Today the company increasingly builds autonomous tractors and combines, driverless machines that collect and process data to optimize yields. "Agricultural work like this is not just coming," Mustafa Suleyman marvels. "It is already here."
Mr. Suleyman knows something about new technologies. In 2010, together with Demis Hassabis and Shane Legg, he founded DeepMind with the immodest aim of "solving intelligence," that is, replicating human cognition with computers, and "making the world a better place." DeepMind, which Google acquired in 2014 and which develops a form of artificial intelligence known as deep learning, has produced a string of stunning achievements. One of its programs, AlphaFold, was able to predict the three-dimensional structure of proteins from their amino-acid sequences, a scientific breakthrough. Not surprisingly, Mr. Suleyman believes deeply in the promise of artificial intelligence. But something in the future he envisions so vividly, after many hours spent thinking about it and working to create it, has shaken him. In "The Coming Wave," he offers us both hope and alarm.
The good news, Mr. Suleyman tells us, is that AI, along with synthetic biology (technology aimed at engineering organisms), is poised to "usher in a new dawn for humanity." He foresees a universe of favorable prospects. AI, he says, is on track to "help run our businesses, treat our ailments and fight our battles." "There seems to be no obvious upper limit to what is possible."
But he is alarmed that the spread of these technologies will allow "many bad actors to unleash disruption, instability and even catastrophe." His fear stems from the conviction that technologies tend to become "cheaper and easier to use, and in the end they spread everywhere." (The car is one example, the smartphone another.) New technologies, Mr. Suleyman argues, are amplifiers of fragility, potentially handing "a lever with global consequences" to others. He writes that deepfakes, manipulated imitations of a person's image or voice, are already spreading rapidly, poisoning local elections in India and enabling banking fraud in Hong Kong. Soon, he fears, the ability to synthesize biological weapons or deploy armed drones will be widespread, and these are only a few of the many alarming scenarios.
Beyond the unruly nature of the technology itself, it is hard to know how best to temper our appetite for its expansion and use. Any such effort runs up against a combination of powerful forces: nations, for example, competing fiercely for strategic advantage or desperately trying to solve shared problems such as food supply, climate change and disease. Mr. Suleyman suggests that technology alone cannot answer such fears, yet it remains the "most powerful" tool we have. Containment is further complicated by the profit motive, especially given that the coming wave of technologies represents "the greatest economic prize in history."
But technology is difficult to restrain for another reason: our own volatile mixture of curiosity and ego, our hard-wired inability to "leave the fruit on the tree," as Mr. Suleyman puts it. He recognizes in his colleagues (and in himself) the trait voiced by J. Robert Oppenheimer, director of the atomic-bomb laboratory at Los Alamos, New Mexico: "When you see something that is technically sweet, you go ahead and do it and you argue about what to do about it only after you have had your technical success."
So what is to be done? To begin with, Mr. Suleyman suggests, we must overcome our first reflexive reactions: denying the problem or turning immediately to regulation. The scale of the problem, he says, can trigger what he calls the "pessimism-aversion trap": we simply look the other way. Or we may demand legislative fixes. But effective regulation takes a long time to craft, and the technology is simply moving too fast.
Our goal, Mr. Suleyman says, should be not so much to stop new technologies from emerging as to "shape" them and ensure that they adapt to our needs. He proposes subjecting them to oversight and making sure the rules, including audits and external review, foster a "more licensed environment" in which the most sophisticated artificial-intelligence systems are built only by "responsible certified developers." He favors applying pressure at critical choke points, limiting the export of advanced computer chips, for example, or restricting large-scale fiber-optic capacity, to slow the pace of development. Yet he concedes that even if his long wish list were adopted, it might not be enough.
The futurist Roy Amara once observed that "we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run," and Mr. Suleyman's view that we are teetering between nirvana and ruin seems to bear this out. AI and synthetic biology may well advance faster than many of us expect, but not as fast as he thinks. In describing progress in biology in particular, he is inclined to leap from an early success (the creation of a synthetic bacterial genome, for example) to what he regards as an inevitable endpoint: say, "a world of cheap personalized medicines" or food produced on demand. In fact, that future is much further away than he imagines.
But even if Mr. Suleyman's timing is off, his vision of the future seems right: artificial intelligence and synthetic biology offer transformative opportunities and pose existential dangers. As he argues, now is the time to reckon with both. How can we govern technology so that we benefit from its extraordinary promise without being undone by its exceptional power?