The PlayStation 3's security wall has fallen. As of this week, the disc has been slipped in, Tron-style, by a few hackers' combined efforts, and the game console's "private keys"—336 seemingly random letters and numbers—are public. Other recent game systems have taken shots from hackers, particularly the Nintendo DS and Wii, but none as thoroughly as this.
The story has some fascinating angles, from the system's complete, public dissection at a recent conference, to the hackers' stance that Sony brought this upon itself. Yet the most interesting angle is neither technical nor militant. Once we understand the hack, the weird thing becomes evident: that digital "security walls" have become commonplace and tolerated, due in no small part to the history of videogames.
Regarding the hack itself, I turn to a fine guide at hacking enthusiast site GBATemp:
The reason game systems will only run official content is because the company in question (e.g. Sony, Nintendo, or Microsoft) builds the system so that it will only accept digitally "signed" content. This "signature" takes the form of a key used to encrypt/lock the game/program. If the system is presented with a program that doesn't have the key incorporated into it, it refuses to run it.
As the guide goes on to say, this feature prevents piracy, because those secret keys are impossible to fully copy. The keys also attempt to ensure "quality": hypothetically, the maker of, say, BoobMaster VII has to get Sony's approval; otherwise his porn-violence opus will never have those secret keys printed on its discs by Sony itself. Sony happily charges for this approval service (as do Nintendo and Microsoft for their respective boxes).
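The mechanism the guide describes can be sketched in a few lines. This is textbook RSA with toy numbers, purely as an illustration of the idea—it is not Sony's actual scheme (the PS3 used ECDSA with vastly larger keys), and every value and name here is made up for demonstration:

```python
# Toy sketch of "signed content only" checking, NOT the PS3's real scheme.
# Textbook RSA with tiny, insecure demo numbers (p=61, q=53).
p, q = 61, 53
n = p * q          # 3233, the public modulus
e = 17             # public exponent, baked into every console
d = 2753           # PRIVATE exponent, held only by the platform maker

def sign(message: int) -> int:
    # Only whoever knows d can produce a valid signature.
    return pow(message, d, n)

def console_will_run(message: int, signature: int) -> bool:
    # The console needs only the public values (e, n) to verify.
    return pow(signature, e, n) == message

official_game = 1234                # stand-in for a hash of the disc contents
sig = sign(official_game)
print(console_will_run(official_game, sig))   # True: approved, signed content
print(console_will_run(4321, sig))            # False: unsigned content refused

# And the crux of the hack: once d leaks, anyone can sign anything.
homebrew = 999
print(console_will_run(homebrew, sign(homebrew)))   # True
```

The last two lines are the whole story in miniature: the console's refusal to run unsigned content depends entirely on the private exponent staying private.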
Take out the private keys, and you take out the middleman. The brains behind the misunderstood BoobMaster empire get a second chance, as do thousands of people who want to write (or copy) all manner of program, game, or virus without asking Sony's permission. The PlayStation 3 becomes—gasp—a plain ol' computer, willing to run any program that you ask it to.
That's a lot of hacking to use a game system however you please, which illuminates a troubling fact: In our public computer consciousness, we're okay with our devices telling us, "I'm afraid I can't do that, Dave." The iPhone is the most well-known example, its App Store controlling nearly every installation via username, password, and Apple's content control. E-book readers and other telephones have followed suit.
But videogames have led the charge. As GBATemp points out, Nintendo was the first company to pursue widespread digital protection in the '80s. The Nintendo Entertainment System's huge cartridges had lockout chips, which prevented just anybody from making and releasing Nintendo games. If you wanted in, you needed Nintendo's approval—and its lockout-chip cartridges, which the company, yes, happily charged for.
In terms of marketing, this shook out as Nintendo intended. The earlier Atari and ColecoVision eras were riddled with too many buggy, lousy games, while the NES's selection was relatively well pruned. But Nintendo may also have done this to reduce competition and control prices; among its rules was a limit on how many NES games a single company could release in a year.
In spite of some legal attention sent Nintendo's way, the basic issue of lockout chips remained unchallenged. Reason being? It wasn't a computer; it was a toy. Game systems continued employing them, and the DVD Consortium followed suit in the mid-'90s with encryption meant mostly to prevent piracy but also, for example, to keep American DVDs from playing in other markets. The public didn't react. It wasn't a computer; it was a movie player.
Now, almost every major digital device comes with a form of digital lockdown. And each of these gizmos is powerful enough to be our computer, movie player, toy, Internet-enabled device, and so much more. Hell of a paradox there—that the world's fab gadgets, the equivalent of Q's best inventions, can only run "approved" software. There's an entire Internet of industrious programmers and coders who'd love to make our devices more useful and cool, if only they didn't have to deal with portals (App Store, Xbox Live, PlayStation Network) that erect barriers like cost and country of origin.
Why is it okay for device makers to decide how we can and can't use their circuit boards, power supplies, screens, and buttons? Relatedly, why have videogame fans reacted negatively to the PlayStation 3's unlock?
One answer I've found to both: Because, in Tron-like fashion, many digital devices appeared in our culture as childish playthings. You only play Hungry Hungry Hippos one way, right? Well, same with a Nintendo Entertainment System. Or a DVD player. Or a PlayStation 3.
The other concern is piracy—but let's look at the flipside of the money issue. If I can only buy official discs, or download games from one official web outlet, how can I trust that the marketplace has allowed for free, unfettered competition? Why is that a more economically valid way of doing business? Even the capitalist dogs think the market should decide, right?
This kind of antitrust blind spot has persisted since the Nintendo days. If the PlayStation 3 hack does anything, I'd rather it free our collective, antiquated perspective on computer devices than free the PS3's security system.
Photograph by David Boni