Right from the first page, Douglas Rushkoff’s book Program or Be Programmed1 reminded me of Nicholas Carr’s The Shallows2 — only with a broader scope, more buzzwords, and a less gloomy appraisal of the subject. I read The Shallows last year, and though it was interesting, it was also overly dramatic and too timid in its speculations — and thus it failed to draw fully-baked conclusions or make substantive predictions. We walk away with Carr’s Neural Doomsday:
The price we pay to assume technology’s power is alienation.
Rushkoff dives into a lot of the same territory as Carr. They both discuss (and not wholly favorably) the optimistic futurists who long for the infinite memory of their “outboard brain[s]”, those same futurists who assume that our cybernetic evolution will (through technology) give us powers that are indistinguishable from telepathy. On the flip side of super-human memory and super-human emotion/intelligence-sharing, both Carr and Rushkoff talk about the flavor of hyper-facile “breadth-only/depth-never” searches that are encouraged by the very design of systems like Google and Wikipedia. This is where we start to see differences in their approaches to the subject, though: Carr sees us as being “reprogrammed” by those systems to think in a specific and narrow way; meanwhile, Rushkoff points to those systems and says that what’s happening is us bending to the bias of the machine, instead of taking advantage of those machine biases to do for us what is otherwise difficult or repetitive or time-consuming. Rushkoff’s argument is similar to Carr’s but subtly and importantly different — he is not quick to cast off these powerful and seductive tools, but instead urges us to remember that they are simply a means through which to achieve our ultimate goals, which are really about meaningful contact with other human beings. If they went head-to-head, I’m sure that Carr would cite McLuhan and accuse Rushkoff of making David Sarnoff’s argument, placing all of the blame on the consumer. On the surface, this would seem true; after all, isn’t Rushkoff imploring us in the title to take control by learning the fundamental means of production for digital content?
As I disagreed with Carr on this before, I disagree with him now. Rushkoff is not naïve in invoking neuroplasticity3 here. He wisely points out that the reason we assume the shape of “the machine’s” biases is because it is convenient to do so, and in large part it is convenient because the masters of those machines have made it that way. Rushkoff cites how American pedagogy looks at computer literacy through the lens of usage and consumption — “how do you enter data into last year’s version of Excel?” instead of “how would you go about designing a data aggregation and analytics engine on your own?” Rushkoff goes beyond that to point out that even the language around the simple act of installing software (“the Wizard” in Windows) is constructed to mystify and obfuscate it behind abstractions — and that is to say nothing of the mechanism itself. He does not damn all creators of software4, but he does point the finger in that direction. So what Rushkoff is saying is not that those machine biases are bad5 — but that our approach to learning and interacting with those systems is flawed, and in part that is an incidental conspiracy on the part of those creators to feed what they want into those systems. But… re-enter neuroplasticity — the brain mechanism that causes us to take the shape of those machine biases is the same one responsible for the kind of technological re-appropriation that William Gibson often talks about6 — and that’s enough of an argument to say that we can and often do “snap out of it” and shape the tools to our desires and needs.
That technological re-appropriation is in the spirit of the type of New Media Literacy that Rushkoff would have us learn, and which Carr seems to mention only obliquely and incompletely and perhaps a bit timorously. To Rushkoff, “the new literacy” — as mentioned above — is woefully insufficient. Learning “spreadsheet skills”7 like data-entry and copy/paste and sorting/filtering ultimately just cranks out more consumers (albeit spreadsheet consumers) and encourages neither creativity nor thoughtfulness. As a consequence, the lessons learned from our un-fun software become the same lessons for our fun/social8 software — we graze, we engage shallowly with those systems, and since we use those systems to mediate our social connections, those interactions become increasingly shallow as well.
Once again, we have Rushkoff’s theses dovetailing with Carr’s. They both assert that taking the shape of the machine’s bias puts you at a disadvantage, that you wind up fetishizing the gadgets themselves instead of putting them to work for you. But Carr offers us his ditch-digger analogy9 and stops coyly and obliquely short — abstaining from any speculation on how we might save ourselves. Meanwhile, Rushkoff comes right out and delivers a proposed salvation in the form of an ultimatum: “Program or be programmed.”10 But that ultimatum is just a stand-in or metaphor for something else: “Think, synthesize, and create — don’t just consume.”
There is a great deal more than just the above going on in Rushkoff’s book. I’ve focused on these items because they make the book a great (and significantly more positive) companion piece to Nicholas Carr’s book.11 But Rushkoff discusses more than just “machine biases” and “spreadsheet skills”; he talks about identity and anonymity, about factuality and openness, about nuance… He talks coherently and passionately about a great many things in the span of 150 pages.12 And he delivers these points in such a way that anyone can read them, process them, and act on them. He wants you to act on these “commands”. And for all of my minor criticisms13, I would want you to read and act on these “commands” as well.
- Buy a copy on Amazon (affiliate link). [↩]
- My review is here on this blog. And/or: buy it on Amazon (affiliate link). [↩]
- Carr also invokes neuroplasticity in his text, but he sees it as dooming us to forever mutate into impulse-driven click-hungry meat-terminals for machine masters. (Okay, that is maybe going a little too far into what I perceive to be the spirit of his text…) [↩]
- Mostly Rushkoff is just damning the commercial creators. He seems to have kind words for free/open source software (FOSS) developers, and the FOSS movement on the whole. That said, I was a little surprised that he didn’t jump in and link this “abstractions” business up with how developers are (by and large) “lazy” — inasmuch as “lazy” developers are “lazy” because they are not interested in re-solving solved problems unless those problems are worth re-solving. (Did that make sense?) [↩]
- In a way, he argues that these biases are essential — that the machines are designed to compensate for things that we (as human beings) do not do well, and/or do not like to do. [↩]
- Check out William Gibson’s remarks about pagers in this interview in The Paris Review. [↩]
- My term, not his — though I wish it was his. [↩]
- Though I almost didn’t stick “social” in there, since Rushkoff believes that all software is social — that “the point” of all software is to connect users to other users, people to other people, to enable sharing between them and strengthen social bonds. Like the digital equivalent of primates grooming each other? [↩]
- In case you didn’t read it yourself, I’ll summarize the ditch-digger analogy as follows: “Is it better to dig a longer and wider ditch in half the time with your steam shovel if it means that your muscles atrophy as a consequence?” [↩]
- Although, let’s be honest here — there isn’t much real/actual discussion of programming until the very end of the text. And even then, it’s really only a few pages in the last chapter and then a page or two of references in the bibliography. [↩]
- …which I recommend despite despising it. [↩]
- Screw it, here are the ten “commands” from the table of contents:
- Time – do not be always on
- Place – live in person
- Choice – you may always choose none of the above
- Complexity – you are never completely right
- Scale – one size does not fit all
- Identity – be yourself
- Social – do not sell your friends
- Fact – tell the truth
- Openness – share, don’t steal
- Purpose – program or be programmed
And as a brief side note there: after reading the chapter on “Choice”, I was surprised that Rushkoff’s “Essential Reading” section did not include Sheena Iyengar’s The Art of Choosing (affiliate link). But I suppose that they did come out at about the same time… [↩]
- And there were a few… I could have done without some of the lurid buzz-wordy passages; the book could have used another editorial pass (some of the sentences seemed to be missing… an important verb or two); and he sometimes flubbed certain scientific details… but it’s all water under the bridge in light of his central thesis and commitment to the subject matter. [↩]