Bruce Sterling is a prominent science fiction writer and a pioneer of the cyberpunk genre. His cyberpunk novels Heavy Weather (1994), Islands in the Net (1988), Schismatrix (1985), and The Artificial Kid (1980) earned him the nickname “Chairman Bruce”. Apart from his writing, Sterling is also a professor of internet studies and science fiction at the European Graduate School. He has contributed to several projects in futurist theory, founded an environmental aesthetic movement, edited anthologies, and continues to write for several magazines, including Wired, Discover, Architectural Record and The Atlantic.
In the interview below, we had the honor of hosting Bruce Sterling at our Next Nature Network headquarters to talk about the concept of the convergence of humans and machines. Sterling weighs in on the issue with a rather challenging perspective.
Lots of people are talking about, and investing a lot of money in, this idea of a convergence of humans and machines. What are your thoughts on this?
The result is the unbundling of those metaphysical ideas and their replacement by actual products and services
That convergence will not happen, because the ambition is basically metaphysical. It will recede over the horizon like a heat mirage; we are never going to get there. It works like this: first come far-fetched metaphysical propositions. Then an academic computer scientist tries to build one in the lab. Then some aspect of that can actually be commercialized, distributed and sold.
This is the history of artificial intelligence. We do not have Artificial Intelligence today, but we do have other things: computer vision systems, robots that can move around, gripper systems. We have bits and pieces of the grand idea, and those pieces are big industries, but they do not fit together to form one super thing. Siri can talk, but she cannot grip things. There are machines that grip and manipulate, but they do not talk. You end up with this unbundling of the metaphysical ideas and their replacement by actual products and services. Those products exist in the marketplace like most other artifacts we have: potato chips, bags, shoes, Hollywood products, games.
They should be seen in that context. You should not dress these commercial products up and say, “Someday soon we will have the Artificial Intelligence Super Ghost.” I know there are guys in the business who are into that vision, but I do not think the real captains of the industry, the ones making multimillion-dollar investments, believe any of it.
Yesterday we had quite an interesting discussion about the future of humanity, which is also related to the concept of artificial intelligence. It is argued that 2025 will be the singularity point. What is your take on that?
You do not want Siri to be more like Alan Turing. You want Siri to be more like Apple. Because Apple owns Siri
There will not be a Singularity. I think that artificial intelligence is a bad metaphor; it is not the right way to talk about what is happening. So I like to use the terms “cognition” and “computation”. Cognition is something that happens in brains, physical, biological brains. Computation is something that happens with software strings on electronic tracks inscribed in silicon and mounted on fiberboard.