Universities and museums recognize the cultural value of video games, but the question of whether to preserve the actual devices—and how—is more divisive.
Since their birth as a science-fair curiosity at Brookhaven National Laboratory in the late 1950s, video games have moved inexorably towards higher and more central cultural ground, much like film did in the first half of the 20th century.
Games were confined at first to the lowbrow carnival of the arcade, but they soon spread to the middlebrow sphere of the living room, overran this private space, and burst out and upwards into the public spheres of art and academia. With prestigious universities like NYU and USC now offering graduate-level programs in game design, and major museums like MoMA, the Museum of Arts and Design, and SFMOMA beginning to acquire games and curate game exhibitions, preserving the early history of the medium appears more important than ever. But what exactly does it mean to preserve a digital game?
The answer is surprisingly simple: It means, first and foremost, preserving a record of how the game was played and what it meant to its player community. Ensuring continued access to a playable version, whether by maintaining the original hardware or by emulation, is less important, if it matters at all.