Neal Stephenson – legendary author of speculative fiction – on Elon Musk and geek culture, the NSA revelations of Edward Snowden, how negative cultural narratives are killing big science – and the upbringing that made him the writer he is.
IN LATE 2013 I had the opportunity to interview the author Neal Stephenson. Some Remarks, Stephenson’s collected non-fiction writing, was due for release in the UK and I was fascinated to talk to the author of Snow Crash and Cryptonomicon about his wider views of science, technology and contemporary culture. It happened that the interview came just at the time that CLANG, the innovative sword-fighting game that Neal had championed to successful Kickstarter funding, hit a few kinks in its development. Our interview took a few twists and turns, but came out full of interesting insights into the author’s thoughts and creative development. But, as sometimes happens with interviews, our discussion didn’t quite match the focus the commissioning technology publication had been looking for. And so, after some consideration, I’ve rescued the interview from editorial limbo to publish here in full. I hope you enjoy reading it.
Damien Walter, 2014
“I grew up in an environment that seemed utterly normal at the time and that in retrospect was almost unbelievably weird.” ~ Neal Stephenson
DW – Your non-fiction writing collected in Some Remarks displays the same fascination with technology and social change as your novels, I think that’s fair to say? Where did this fascination begin?
NS – One of the items in Some Remarks is a foreword to the posthumous re-issue of David Foster Wallace’s book Everything and More, in which I try to make the case that DFW’s work is informed by a particular sensibility peculiar to what I call the Midwestern American College Town, or MACT. I won’t try to recapitulate that argument here, but the gist of it is that I grew up in an environment that seemed utterly normal at the time and that in retrospect was almost unbelievably weird. I suppose we all have such insights when we move away from the place of our upbringing. My ancestors had been ministers, professors – or ministers and professors – for several generations back. That’s in the paternal line. On the maternal side, they were reasonably well-to-do farmers with a direct and recent connection to Geraldine Jewsbury, a very complicated Victorian author. By the way, I didn’t know about any of that when I was young, I only became aware of it in my twenties and thirties. But one assumes it has an effect.
Anyway, during the 20th century they all made a turn toward science and technology and so I ended up with a lot of academic scientists and engineers in my family. I grew up in a MACT, dominated by a university of science and technology, wherein our neighbors, the people we saw at church, the parents of my friends, etc. all tended to have (or to be studying for) Ph.Ds. Some of my friends’ fathers had worked on the Manhattan Project, and as a teenager I worked summers as a research assistant in an old Manhattan Project lab. I developed a fairly typical nerdy fascination with computers and programming, which showed up in my fiction, particularly Snow Crash; and when that book became popular among high-tech people, I ended up knowing many such people.
DW – How did this upbringing contribute to your talent for seeing the “big picture” of technology?
NS – To the extent that I have any talent for it, it presumably arises from the fact that I never recognized any meaningful division or conflict between science and technology on the one hand, and any other aspect of culture (literature, religion) on the other. The typical MACT is too small to allow for specialization, and so if the professors are going to have cultural events they must organize them themselves, rather than delegating the work to a separate cultural elite. Again, all of this was simply the air I breathed, and I didn’t become conscious of it until later in life.
DW – The MACT sounds like much the kind of place where many young science fiction fans came of age. Today scifi and “geek culture” are arguably the new mainstream culture of the internet connected generation. How do you rate its influence on your work?
NS – Re scifi/geek culture, this is something that I grew up with, just as a historical accident. I can still remember seeing The Hobbit for the first time, in the hands of an older boy at my school when I was in the sixth grade. This was at about the same time that I was obsessing over the original Star Trek series and watching Astro Boy cartoons. Today, of course, we would identify all of these as being touchstones of geek culture, but at the time, nothing of the sort had even been imagined. So I was left with a fascination for these strange found objects on the periphery of our culture. I could say similar things about D & D and even Star Wars. People who were fans of one of these things tended to be fans of the others, and so geek culture evolved, I think, out of a lot of random encounters in dorm rooms and subway cars, and began to snowball as the geeks got better at networking.
“when Snow Crash popped up on the radar of geek culture and became a popular book, it took me by surprise”
When the Internet came along and made networking easy, the whole phenomenon just exploded and has now become a dominant force in our culture. I never partook of it as heavily as some others, in the sense that I didn’t go to SF cons, have never visited Comic-Con, and haven’t really been involved in the relevant Internet discussion groups. Consequently, when Snow Crash popped up on the radar of geek culture and became a popular book, it took me by surprise, and in fact I wasn’t really aware that anything had happened until people began to reach me via the then-new medium of email and to address me as if I were some kind of significant person.
Its main influence on my work has been that I have felt confident that I need not keep writing the same book over and over again. I have tried to make each book different from the last. I’ve always felt confident that this would work, which is to say, that the community of readers would accept this sort of random-walk approach, and so far I have never been disappointed. From time to time I will hear from a reader who is startled by the fact that my latest book isn’t very much like the one previous, but those people seem to be outnumbered by the ones who don’t care at all, supposing they even notice.
DW – In your 2011 essay Innovation Starvation you question whether we still have the capacity to get big things done, citing the kind of technological innovation that went into the Apollo programme. Have we lost our faith in technology to bring progress, or is there good reason to retreat from the disruption that comes with it?
NS – The particular events that set me off were the Deepwater Horizon disaster and Fukushima, both of which were examples of what I would consider old technologies that became ensconced within our system and took on permanence wildly in excess of their technical merits. The Fukushima reactors are technology from the 1960s, constructed in the 1970s. Look under the hood of a 1960s automobile, if you can find one that is still running, and compare it to a new Tesla, or even a Buick, and you can get a sense (as if you needed one) of how crazy it is to have a plant of that vintage under the control of a bureaucracy as catatonic as Tepco.
“So yeah, we’ve definitely lost our faith in technology to bring progress.”
So, I would consider the state of the nuclear power industry to be a case in which an early, faulty embodiment of a new technology was pushed out into the market, leading to a quite understandable backlash from the general public as most people discarded their rose-colored glasses and created barriers to adoption of new tech. The two main barriers that were created were legal/regulatory, and cultural. I won’t elaborate on the former.
The cultural barrier is somewhat more in my bailiwick, and I’ll talk about it in a moment, but the point is that these barriers were set up too late to solve the problems that inspired their creation, and so they had the unintended consequence of locking in all of the bad stuff that had come before–grandfathering it into place, in effect–while making it impossible to build newer and better stuff. So Fukushima and many other reactors of that vintage are still there, while people trying to construct modern replacements for them can’t make headway. Even wind turbines and solar farms are difficult to build because of regulatory barriers that were put into place to control much more baleful technologies.
A lot could be said about the cultural barriers, but maybe the most succinct thing I can say is that I was browsing on my Apple TV the other day, looking for a movie to watch, and was confronted with an entire category of films labeled “Dystopian Futures.” I am old enough to remember when some of the very first dystopian SF movies came out. They wouldn’t have been called that at the time, other than by film critics writing for an elite audience. At the time it was refreshing, and extremely hip, to see depictions of futures that were not as clean and simple as Star Trek. Now, the dystopian future is the only future that is allowed to be presented in new SF films and television, and it has become so ubiquitous, and so tired, that Apple TV is deploying it as a mass marketing term right up there with “Romantic Comedies” and “Superheroes.” So yeah, we’ve definitely lost our faith in technology to bring progress.
Is there “good reason to retreat from the disruption?” Well, there’s a buried premise in the question I don’t agree with. The presumption is that the world is static–and basically hospitable–until we do something and thereby disrupt it. Which I don’t agree with at all. We live in an environment almost all aspects of which were engineered by our ancestors. The continents of Australia and the Americas, when discovered by Europeans, had been made over by systematic hunting, burning and gardening over tens of thousands of years, and didn’t exist in anything like a pristine state of nature. We live, and have always lived, in a completely manufactured environment. All we’re left with is the ability to choose between different technological strategies. It’s incoherent to point at one thing and call it a technology in contradistinction to the [implicitly non-technological] status quo ante.
Other things being equal, and speaking very broadly, newer tech tends to work better than older, which is why Apple keeps getting us to buy the latest and greatest iPhone. So, at the mass-market consumer level, we have a strange state of affairs in which people are eager to vote with their dollars, pounds and Euros for the latest tech but they flock to movies depicting a relentlessly depressing view of the future, and resist any tech deployed on a large scale, in a centralized way, such as wind turbine farms.
DW – We seem to have a lot of these negative cultural narratives about technology – the apocalypse of course, environmental collapse, but also the most negative assessment of our economic situation, that capitalism has reached its end game and technology won’t power it any further. Do we face a hard limit on our current development? What comes next?
NS – It is worth pointing out that the narratives are just that: narratives. We should begin by asking ourselves where those narratives come from and why they are that way; there’s no prima facie evidence that they have any connection whatsoever to how the future’s actually going to play out. Except, of course, insofar as they might make people so discouraged and skeptical that they become self-fulfilling prophecies.
For practical purposes, the only narratives that matter are the ones we see on screens in video games, TV series, and movies (much as I would like to believe in the power of the written word to sway the imagination, it just doesn’t have the same ability to swerve the zeitgeist as the screen-based media).
In the budget of a video game or a movie, writing is a very small wedge of the pie. The money all goes into other wedges. In both games and movies the production of visuals is very expensive, and the people responsible for creating those visuals hold sway in proportion to their share of the budget.
I hope I won’t come off as unduly cynical if I say that such people (or, barring that, their paymasters) are looking for the biggest possible bang for the buck. And it is much easier and cheaper to take the existing visual environment and degrade it than it is to create a new vision of the future from whole cloth. That’s why New York keeps getting destroyed in movies: it’s far easier to take an iconic structure like the Empire State Building or the Statue of Liberty and knock it over than it is to design a future environment from scratch. A few weeks ago I think I actually groaned out loud when I was watching OBLIVION and saw the wrecked Statue of Liberty sticking out of the ground. The same movie makes repeated use of a degraded version of the Empire State Building’s observation deck. If you view that in strictly economic terms–which is how studio executives think–this is an example of leveraging a set of expensive and carefully thought-out design decisions that were made in 1930 by the ESB’s architects and using them to create a compelling visual environment, for minimal budget, of a future world.
“…entertainment executives basically don’t care about narrative at all.”
As a counter-example, you might look at AVATAR, in which they actually did go to the trouble of creating a new planet from whole cloth. This was far more creative and visually interesting than putting dirt on the Empire State Building, but it was also quite expensive, and it was a project that very few people are capable of attempting. Only James Cameron has the clout to combine such a large budget with so much creative independence; he was able to turn Rick Carter loose on the design and create magic. But in basically every other movie, game, and TV show, the creators of the visual environment are caught in a trap where their work is expensive enough to draw scrutiny from executives who are, by and large, unwilling to take chances on anything new, and will always steer in the direction of something that is cheaper to produce and that they have seen before. And this ends up being the degraded near-future environment seen in so many dystopian movies.
That environment also works well with movie stars, who make a fine impression in those surroundings and the inevitable plot complications that arise from them. Again, the AVATAR counter-example is instructive. The world was so fascinating and vivid that it tended to draw attention away from the stars.
Compared to all of these considerations, the things that matter to literary people (character and story) are entirely secondary and are generally pasted on as an afterthought. So, what you are characterizing as “negative cultural narratives about technology” are, in my view, just an epiphenomenon of decisions made by entertainment executives who basically don’t care about narrative at all. Taking those narratives seriously is kind of like looking at a Rolls-Royce and assuming that it is made entirely out of a giant block of paint.
The “hard limit” and “what comes next?” parts of your question are where you ask me to be way more oracular than I’m comfortable attempting. There are plenty of people with money and vision who would like to build a future more interesting than “Empire State Building covered with dirt” and I don’t really see any reason in principle why this couldn’t happen. To me it seems to be largely about institutions and whether they are capable of adapting. It is easy to fall into a trap where existing institutions are productive enough to funnel money to vested interests who’d rather keep milking them in their current form than take a risk on transforming them.
DW – I want to ask about some of the revelations in recent weeks about privacy. The NSA’s Prism programme and Palantir being employed by government and major corporations has made people wonder just what else is out there that we don’t know about. The early promise of the internet seemed to be greater liberty, but as the technologies have evolved they appear to be concentrating immense power in a few hands. Which direction do you think we are heading in, and what should we be doing to affect the course of these technologies?
NS – I don’t claim to be an expert on this sort of thing, if indeed I ever was–it has been a long time since I wrote Cryptonomicon. But just on general principles, what impresses me is how easily leaked this information is. That’s not to understate the difficulties Edward Snowden is facing, but the fact is that the NSA is going to find it quite difficult to keep a lid on such activities. Much of the shock and dudgeon expressed over what Snowden revealed seems disingenuous to me. Every techno-thriller movie and TV show that I have watched in the last twenty years has assumed that the intelligence agencies had all of these surveillance capabilities and much more. And there was much indignation in the US about the FBI’s failure to predict the Boston Marathon bombings. One can’t be indignant about all of these things at once. Deep layers of cant must be scraped off of this discourse before we can even begin talking about it in any useful way.
DW – As we’ve been conducting this interview Elon Musk – who seems to me a little like a flesh and blood Tony Stark – announced his Hyperloop project. It raises the obvious question, why haven’t we already done this?
NS – According to a widespread meme that is not devoid of truth, the track gauge used by modern railroads is derived from that used by horse-drawn vehicles, such as Roman chariots.
A similar point can be made about petroleum-based fuel. This had its origins in the practice of sailing around the ocean hurling pointed sticks at sperm whales and boiling their heads to make lamp fuel. When we ran out of whales, kerosene was developed as a synthetic whale oil substitute. One thing led to another and we ended up with the modern petroleum industry.
“I would urge people to consider the Hyperloop not only as a technical proposal but (…) as a question that we need to address as a technological society.”
It is a bit facile to talk this way, since there are many technical reasons why petroleum makes an excellent fuel, but it does help to illustrate the idea of technological lock-in.
Now let us consider the problem of moving humans quickly, safely, and cheaply between LA and San Francisco. The proposal least likely to get anyone fired, or publicly mocked, is to take existing rail technology and make it a little faster, and so that is the sort of plan that tends to make headway.
Elon Musk is simply pointing out that this isn’t the best way of doing it. To that point, it’s a strictly technological argument. But he’s implicitly making a more interesting point, which is that two cities such as LA and San Francisco ought to be capable of doing much, much better than that. He’s asking what happened to us as a civilization that we are unwilling to even think about doing something that is quite doable on a technical level but sufficiently different from existing technology as to pose a serious challenge to engineers, regulators, financiers, and insurers. His Hyperloop proposal is almost a kind of performance art, in that sense.
I would urge people to consider the Hyperloop not only as a technical proposal but in the way that I think Elon Musk actually intended it: as a question that we need to address as a technological society. Even if your answer is “I’m fine with Victorian railway technology, thank you very much” it’s worth musing over.
DW – The “proposal least likely to get someone fired” works as a good shorthand for many of the systematic problems that get in the way of new technology. Crowdfunding has surged forward arguably because it provides a route around some of those problems. Is this ushering in a more creative era for tech, or are there limitations to consider? Also, why sword-fighting?
NS – Crowdfunding is a thrilling development. It’s useful to keep some of its limitations in mind. There is a fairly hard upper bound on how much it’s possible to raise that way–somewhere in the low seven digits for extraordinarily successful campaigns. No one is going to build a Hyperloop with that. Preparing and running a large campaign is a full-time job for at least one person. Even if such people aren’t being paid, you have to, in some sense, subtract their opportunity cost from the amount raised, and also factor in taxes and the cost of shipping out the donor rewards.
Once you have accepted donors’ money to do a particular thing, you actually have to do that thing, and not some other thing you thought of in the meantime. This is fine if the objective is, say, to make a film or construct a house (i.e. some project with a well-defined objective that is unlikely to evolve in the making) but if the objective is to undertake some sort of business enterprise, it can lead to a certain loss of flexibility. Most businesses adapt continuously as circumstances change. But it would be difficult to launch a Kickstarter around the premise of “here’s a team of smart people who want to do something that we’ll largely make up as we go along” because Kickstarter is oriented toward clearly definable, specific goals.
This isn’t meant to be discouraging; I’m just pointing out that, for many types of projects, it is not a replacement for a motivated, visionary investor.
In spite of this, some people go the Kickstarter route anyway just because it is a fine way to get attention for one’s project and build up a community around it. Presumably that is why Richard Garriott used Kickstarter to fund his game Shroud of the Avatar.
Typically there is an awkward gap between the size of project easily fundable by crowdsourcing, and one large enough to attract VCs. Some efforts are underway to fill that gap with Kickstarter-like schemes that actually reward contributors with equity, but this is very difficult because of complexities entailed in securities regulation.
Why swordfighting? Because I enjoy it enough to keep pursuing it, which is not true of any other sport activity I have ever tried, and so it keeps me physically active. In the end, the only real justification for any sport is to improve health by inducing one to get up and move around in a way that isn’t strictly necessary in modern technological society. I hate to reduce it to such arid terms, because in the case of swordfighting there is so much that I could say in a historical and romantic vein, but that really is the bottom line.