Zadie Smith, Facebook, and social coding

The current New York Review of Books, in its belated fashion, carries novelist Zadie Smith’s remarkable and troubling review of The Social Network, Aaron Sorkin’s absorbing film adaptation of Ben Mezrich’s execrable Facebook book, The Accidental Billionaires. Smith’s dissection of the film is smart and illuminating. But when she turns to the broader question of the cultural and human dimensions of social networking, things get complicated. Smith makes a mistake common among critics of social media, presuming that our online personalities become our personalities tout court:

When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears. It reminds me that those of us who turn in disgust from what we consider an overinflated liberal-bourgeois sense of self should be careful what we wish for: our denuded networked selves don’t look more free, they just look more owned.

In a lengthy and thoughtful rejoinder at The Atlantic, technology editor Alexis Madrigal offers an alternative take:

When (Zadie Smith) has a bout of existential angst on the way to see The Social Network, calculating her physical age, she wonders, “Can you have that feeling, on Facebook?” Her implicit reply is no. Which, to me, is to miss the point. Twice, she tries to say we “live” online now….

But we will never live on the Internet in the way we (do) other places. Let’s not reify our online meanderings. The angst of a body slowly dying doesn’t go away no matter how many times you type something into a box and then hit return. And that is a good thing.

Smith wants to say, “You are who you appear to be on Facebook.” But who believes that of themselves or anyone else? She makes the drastic overstatement only to serve as her grounds for outright rejection of the service…. Should Facebook be responsible for making humans better friends, better lovers, more magnanimous, more prone to checking in on grandma?

There’s a necessary gap between our online avatars and our lives offline, and it’s up to us to shape and control that gap. Social-media anthropologist danah boyd, who spends her time talking to teens about their lives online, offers an example:

Mikalah uses Facebook but when she goes to log out, she deactivates her Facebook account. She knows that this doesn’t delete the account; that’s the point. She knows that when she logs back in, she’ll be able to reactivate the account and have all of her friend connections back. But when she’s not logged in, no one can post messages on her wall or send her messages privately or browse her content. But when she’s logged in, they can do all of that. And she can delete anything that she doesn’t like.

To exert control over her social self, Mikalah is exploiting Facebook’s most fundamental framework in ways unanticipated by its programmers. She may not know a line of JavaScript when she sees it, but she’s coding, hacking even, nonetheless.
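Mikalah’s trick works because deactivation only hides the profile; the underlying data survives intact until she logs back in. A toy sketch of that pattern (purely illustrative — the class, method names, and behavior here are invented for this post, not Facebook’s actual system):

```python
# Toy model of the "deactivate on logout" pattern danah boyd describes.
# Deactivation hides the profile without deleting its data, so
# reactivating restores every connection exactly as it was.

class Account:
    def __init__(self, friends):
        self.friends = list(friends)  # persists across deactivation
        self.wall = []                # also persists; just frozen
        self.active = False

    def log_in(self):
        self.active = True            # reactivate: visible, writable again

    def log_out(self):
        self.active = False           # Mikalah's move: deactivate, don't delete

    def post_to_wall(self, message):
        if not self.active:
            raise PermissionError("profile deactivated: no posting, no browsing")
        self.wall.append(message)

mikalah = Account(friends=["amy", "jo"])
mikalah.log_in()
mikalah.post_to_wall("hi")
mikalah.log_out()  # friends and wall survive; nobody can touch them
```

The point of the sketch is that the control lives in the toggle, not the data: nothing is destroyed, only gated.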

It’s social coding, not computing per se. And it’s something that, in one form or another, we’ve been doing all along amid complicated modern lives. As in computing, the codes keep changing. The challenge, online and off, is put best by media thinker Doug Rushkoff: program or be programmed. It’s a modern problem, not a Facebook problem, and at some level it’s not a bug but a feature.

About Mohit


  1. We’ve been using technology to extend, limit, model, and understand the self since well before computers. Even before knapped arrowheads and cave paintings. I want to say since the first Narcissus of the Australopithecus gazed into that clear lake surface, or since the first Echo caught her sonic reflection on the rebound; but it was probably even before that.

    Yes, to take a piece of us (the augmented hand of an arrowhead, the interior image of a wolfman rubbed onto a cave wall, the pocket-sized connection-network of a smartphone) and remodel it in technology, we leave out a lot. And then when we use that as a habit, or even an explanation, those missing bits play a role by their negative space. But technologies also extend (I can bag that mammoth and we’ll eat for a month, I can talk to my friends as a regular thing despite the fact we’re in different cities, or might as well be). And importantly, they extend the *explanations as well as the actions. Often by stripping away “inessentials” (I use that term cautiously; terming something inessential is only ever relative and temporary) not only can we see what we’re focusing on much more clearly, but often, see further. And then we re-incorporate what we’ve learned back into our ongoing and flexible model of the self, and the cycle begins again.

    To lament this shows not only a too-heavy investment in previous technological and conceptual limitations, but a profound ignorance of the plasticity of what it means to be human. And at least part of what *that means is joy in the experimentation. The great experiment that is always already us.

    And it’s not an old versus young thing here, or even one of format. There are many authors of the past who have relevant things to say to the future. I’ll quote one in closing:

    “Your old road is/Rapidly agin’./Please get out of the new one/If you can’t lend your hand/For the times they are a-changin’.”

  2. There is one element of Zadie Smith’s critique that is spot on, something that seems to get glossed over in all the reactions I have read to her piece. It is how, in this movie, Zuckerberg is reduced to a Citizen Kane-style character, someone who can be explained as being “driven” by a girl who dumped him. This is perhaps one of the most powerful substitutions of a real-life person for a Hollywood person in our time. Zuckerberg says it best here:
    But I think this reduction is relevant to the discussion of social media and Zadie Smith’s fear that our Facebook identities are becoming who we are. Hollywood has done a spectacular job over the years of reducing personhood. But it seems Facebook does this sort of thing even better. Of course there are movies that present real human beings, but those are not the movies that drive the industry. I think Zadie Smith is right to fear the facebooked self – but, that said, if she can imagine a time, in the not-too-distant future, when Facebook will matter as much as Friendster, then she shouldn’t be that scared!

  3. “Hollywood has done a spectacular job of reducing personhood.” That’s on the money, Benjamin; and I’m willing to go nearly as far with Facebook. It’s especially useful to remember that many cultural things one might associate with Smith’s Person 1.0 model were corrosive long before social media came along. (I’ve said this on Twitter: the trouble with Smith’s Person 1/2.0 metaphor is that it assumes there was only one other kind of person before the internet kind.) Only with this caveat: I think Facebook gives us tools that can be used to flatten and cheapen ourselves, and it’s up to us to figure out whether we can use them without reducing ourselves thereby.

    I’m with you on Smith’s critique of the characterization of Zuckerberg, too. It was a problem with Mezrich’s book, which was much more curious about coke-and-supermodel parties than it was about code.

  4. And thanks, too, Peggy: as Kevin Kelly points out, literature is part of the “technium,” the ever-burgeoning domain of technology. Without tech, we don’t get the novel. There’s no turning it back; only working out how to use it without cutting ourselves to ribbons.

  5. Yes, I am also looking forward to more of your blogging of the Kelly book here. Avoiding ribbons will need to be, as usual, managed in a multi-edged space: we need to scrutinize not only our use of technology, but our use of technological metaphors. The metaphors of IT are ascetic in the service and/or guise of efficiency, sometimes even more so than the tech itself: input/output, online/offline, bandwidth, processing “power,” reload, interface, icon, and of course the general hyper-abbreviations of code itself (and of “code” as yet another metaphor!). How we think about technology is in the tech of language (and the imagery arising from it), what language we use is central to how we manage any of our potential expansions or limitations, or even recognize them.

    And *as we recognize them, I’m going to bet that many, if not most, of the features and bugs coming into better focus under the lens are going to be things that go back, at least in potential, a long, long way.
