When the renowned cinematographer Emmanuel Lubezki began planning to shoot the wilderness drama The Revenant, he decided that to capture the stark, frozen beauty of a Canadian winter, he would use no artificial light, instead relying on sunlight, moonlight, and fire. He also planned to use traditional film cameras for most of the shooting, reserving digital cameras for low-light scenes. He quickly realized, though, that film “didn’t have the sensitivity to capture the scenes we were trying to shoot, especially the things we shot at dawn and dusk,” as he told an interviewer.
The digital footage, by contrast, had no noise or graininess, and the equipment held up much better in the extreme cold. The crew soon switched over to digital cameras exclusively. “I felt this was my divorce from film—finally,” Lubezki said. The film, released in December 2015, earned him an Academy Award for cinematography two months later.
Lubezki’s late-breaking discovery of digital is one that other filmmakers the world over have been making since the first digital cameras came to market in the late 1990s. Back then, digital moviemaking was virtually unheard of; according to the producer and popular film blogger Stephen Follows, none of the top-grossing U.S. films in 2000 were recorded digitally.
These days, nearly all of the films from all of the major studios are shot and edited digitally. Like Lubezki, filmmakers have switched to digital because it allows a far greater range of special effects, filming conditions, and editing techniques. Directors no longer have to wait for film stock to be chemically processed in order to view it, and digital can substantially bring down costs compared with traditional film. Distribution of films is likewise entirely digital, feeding not only the digital cinema projectors in movie theaters but also the streaming video services run by the likes of Netflix and Hulu. The industry’s embrace of digital has been astonishingly rapid.
Digital technology has also radically altered the way that movies are preserved for posterity, but here the effect has been far less salutary. These days, the major studios and film archives largely rely on a magnetic tape storage technology known as LTO, or Linear Tape-Open, to preserve motion pictures. When the format first emerged in the late 1990s, it seemed like a great solution. The first generation of cartridges held an impressive 100 gigabytes of uncompressed data; the latest, LTO-7, can hold 6 terabytes uncompressed and 15 TB compressed. Housed properly, the tapes can have a shelf life of 30 to 50 years. While LTO is not as long-lived as polyester film stock, which can last for a century or more in a cold, dry environment, it’s still pretty good.
The problem with LTO is obsolescence. Since the beginning, the technology has been on a Moore’s Law–like march that has resulted in a doubling in tape storage densities every 18 to 24 months. As each new generation of LTO comes to market, an older generation of LTO becomes obsolete. LTO manufacturers guarantee at most two generations of backward compatibility. What that means for film archivists with perhaps tens of thousands of LTO tapes on hand is that every few years they must invest millions of dollars in the latest format of tapes and drives and then migrate all the data on their older tapes—or risk losing access to the information altogether.
That costly, self-perpetuating cycle of data migration is why Dino Everett, film archivist for the University of Southern California, calls LTO “archive heroin—the first taste doesn’t cost much, but once you start, you can’t stop. And the habit is expensive.” As a result, Everett adds, a great deal of film and TV content that was “born digital,” even work that is only a few years old, now faces rapid extinction and, in the worst case, oblivion.
To understand how the movie studios and archives got into this predicament, it helps to know a little about what came before LTO. Up until the early 1950s, filmmakers shot on nitrate film stock, which turned out to be not just unstable but highly flammable. Over the years, entire studio collections went up in flames, sometimes accidentally and sometimes on purpose, to avoid the costs of storage. According to the Film Foundation, a nonprofit founded by director Martin Scorsese to restore and preserve important films, about half of the U.S. films made before 1950 have been lost, including an astounding 90 percent of those made before 1929.
It wasn’t just that film was difficult to preserve, however. Studios didn’t see any revenue potential in their past work. They made money by selling movie tickets; absent the kind of follow-on markets that exist today, long-term archiving didn’t make sense economically.
In the 1950s, nitrate film was eclipsed by more stable cellulose acetate “safety film” and polyester film, and it became practical for studios to start storing film reels. And so they did. The proliferation of television around the same time created a new market for film. Soon the studios came to view their archives not as an afterthought or a luxury but as a lucrative investment—and as an essential part of our collective cultural heritage, of course.
The question then became: What’s the best way to store a film? For decades, the studios took a “store and ignore” approach: Put the film reels on shelves, placed horizontally rather than vertically, at a constant cool temperature and 30 to 50 percent humidity. Ideally, they’d have redundant copies of each work in two or more of these climate-controlled vaults. Remarkably, the industry still uses film archiving, even for works that are born digital. A master copy of the finished piece will be rendered as yellow-cyan-magenta separations on black-and-white polyester film stock and then preserved just like a traditional celluloid master.
“We know how long film lasts,” says the USC archivist Everett. “And archives were designed to store things. They’re cool, they’re dry, and they have shelves. Put the film on the shelf, and it will play in a hundred years.”
One big problem with this approach is that to preserve the work, you must disturb it as little as possible. Dust, fingerprints, and scratches will obviously compromise the integrity of the film. Archive staff periodically check the stored masters for signs of degradation; occasionally, a master will be used to make a duplicate for public release, such as a showing at a repertory cinema or film festival. But otherwise, the archive remains pristine and off-limits. It’s like having a museum where none of the art is ever on display.
Maintaining such a facility isn’t cheap. And as chemical film stock becomes obsolete, along with the techniques used to create and manipulate it, relying on a film-based archive will only grow more difficult and more costly.
“The sad truth is that film images are ephemeral in nature, kept alive only by intensive effort,” David Walsh, the head of digital collections at London’s Imperial War Museum, has written. “Apart from anything else, if you are storing film in air-conditioned vaults or running digital mass-storage systems, your carbon footprint will be massive and may one day prove to be politically or practically unsustainable.”
The movie industry executives I interviewed would argue that the current system for digital archiving is already unsustainable. And yet when LTO storage first came along 20 years ago, it seemed to offer so much more than traditional film. Magnetic tape storage for computer data had been around since the 1950s, so it was considered a mature technology. LTO, as an open-standard alternative to proprietary magnetic tape storage, meant that companies wouldn’t be locked into a single vendor’s format; instead they could buy tape cartridges and tape drives from a variety of vendors, and the competition would keep costs down. Digital works could be kept in digital format. Tapes could be easily duplicated, and the data quickly accessed.
And manufacturers promised that the cartridges would last for 30 years or more. In an interview, Janet Lafleur, a product manager at Quantum Corp., which makes LTO cartridges and drives, said that LTO tape may still be “perfect” after 50 years. LTO came to be widely used for data backup in the corporate world, the sciences, and the military.
But the frequency of LTO upgrades has film archivists over a barrel. Already there have been seven generations of LTO in the 18 years of the product’s existence, and the LTO Consortium, which includes Hewlett Packard Enterprise, IBM, and Quantum, has a road map that specifies generations 8, 9, and 10. Given the short period of backward compatibility—just two generations—an LTO-5 cartridge, which can still be read on an LTO-7 drive, won’t be readable on an LTO-8 drive. So even if that tape is still free from defects in 30 or 50 years, all those gigabytes or terabytes of data will be worthless if you don’t also have a working drive that can read them.
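That compatibility rule, as the article states it, amounts to a one-line check. The sketch below models the stated policy only, not any vendor’s actual firmware:

```python
# The backward-compatibility rule as stated: a drive reads tapes from
# its own generation and the two before it. This is a sketch of the
# policy described in the text, not of any vendor's firmware.

def drive_can_read(drive_gen, tape_gen):
    return drive_gen - 2 <= tape_gen <= drive_gen

assert drive_can_read(7, 5)       # an LTO-5 tape still reads on an LTO-7 drive
assert not drive_can_read(8, 5)   # ...but not on an LTO-8 drive
```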
Steven Anastasi, vice president of global media archives and preservation services at Warner Bros., therefore puts the practical lifetime of an LTO cartridge at approximately 7 years. Before that time elapses, you must begin migrating to a newer generation of LTO, because moving the data from one format to the next itself takes time. While LTO data capacities have been steadily doubling, tape speeds have not kept pace: the first generation, LTO-1, had a maximum transfer rate of 20 megabytes per second, while LTO-7’s top rate is 750 MB/s, a 37.5-fold speedup set against a 60-fold growth in capacity. Then you need technicians to operate and troubleshoot the equipment and ensure that the migrated data is error free. Migrating a petabyte (a thousand terabytes) of data can take several months, says Anastasi.
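Anastasi’s timeline is easy to sanity-check. Assuming a single drive running at LTO-7’s 750-MB/s top rate, with a hypothetical 50 percent efficiency factor covering tape loads, seeks, and verification, a rough model looks like this:

```python
# Back-of-the-envelope estimate of LTO migration time. The 50 percent
# efficiency factor for tape loads, seeks, and verification is an
# assumption for illustration, not a measured or vendor figure.

PETABYTE_MB = 1_000_000_000  # 1 petabyte in megabytes (decimal units)

def migration_days(rate_mb_per_s, drives=1, efficiency=0.5):
    """Days to read and rewrite 1 PB at a given per-drive transfer rate."""
    effective_rate = rate_mb_per_s * drives * efficiency
    return PETABYTE_MB / effective_rate / 86_400  # 86,400 seconds in a day

# One drive at LTO-7's 750-MB/s top rate:
print(round(migration_days(750), 1))  # -> 30.9 days per petabyte per drive
```

At those assumed rates, a single drive needs about a month per petabyte, so a full migration with verification passes across a limited pool of drives plausibly stretches into months, as Anastasi says.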
And how much does it cost to migrate from one LTO format to the next? USC’s Everett cited a recent project to restore the 1948 classic The Red Shoes. “It was archived on LTO-3,” Everett says. “When LTO-5 came out, the quote was US $20,000 to $40,000 just to migrate it.” Now that the film is on LTO-5, it will soon have to be migrated again, to LTO-7.
For a large film archive, data migration costs can easily run into the millions. A single LTO-7 cartridge goes for about $115, so an archive that needs 50,000 new cartridges will have to shell out $5.75 million, or perhaps a little less with volume discounts. LTO drives aren’t cheap either. An autoloader for LTO-6 can be had for less than $3,000; an equivalent for LTO-7 is double that. And archivists are compelled to maintain and service each new generation of LTO drive along with preserving the LTO cartridges.
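The cartridge arithmetic above is straightforward; in the sketch below, the unit price is the article’s figure, while the volume-discount parameter is purely illustrative:

```python
# Cartridge cost for one migration cycle. The $115 unit price is the
# article's figure for LTO-7; the discount parameter is illustrative.

def migration_cost(cartridges, price_per_cartridge=115, volume_discount=0.0):
    return cartridges * price_per_cartridge * (1 - volume_discount)

print(migration_cost(50_000))                        # -> 5750000.0 ($5.75 million)
print(migration_cost(50_000, volume_discount=0.10))  # with a hypothetical 10% discount
```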
Lee Kline, technical director at Janus Films’ Criterion Collection, regards data migration as an unavoidable hassle: “Nobody wants to do it, but you have to.” Archivists like Kline at least have the budgets to maintain their digital films. Independent filmmakers, documentarians, and small TV producers don’t. These days, an estimated 75 percent of the films shown in U.S. theaters are considered independent. From a preservation standpoint, those digital works might as well be stored on flammable nitrate film.
Meanwhile, the motion-picture studios are churning out content at an ever-increasing rate. The head of digital archiving at one major studio, who asked not to be identified, told me that it costs about $20,000 a year to digitally store one feature film and related assets such as deleted scenes and trailers. All told, the digital components of a big-budget feature can total 350 TB. Storing a single episode of a high-end hour-long TV program can cost $12,000 per year. Major studios like Disney, NBCUniversal, Sony, and Warner each have archives of tens of thousands of TV episodes and features, and they’re adding new titles all the time.
Meanwhile, the use of higher-resolution digital cameras and 3D cameras has caused the amount of potentially archivable material to skyrocket. “We went from standard definition to HD and then from HD to UHD,” Peter Schade, NBCUniversal’s vice president of content management, said in an interview. Pixel resolutions have gone from 2K to 4K and, soon, 8K, he adds. Codecs—the software used to compress and decompress digital video files—keep changing, as do the hardware and software for playback. “And the rate of change has escalated,” Schade says.
Computer-animation studios like Pixar have their own archiving issues. Part of the creative process in a feature-length animated film is developing the algorithms and other digital tools to render the images. It’s impossible to preserve those software assets in a traditional film vault or even on LTO tape, and so animation and visual effects studios have had to develop their own archival methods. Even so, the sheer pace of technological advancement means those digital tools become obsolete quickly, too.
When Pixar wanted to release its 2003 film Finding Nemo for Blu-ray 3D in 2012, the studio had to rerender the film to produce the 3D effects. The studio by then was no longer using the same animation software system, and it found that certain aspects of the original could not be emulated in its new software. The movement of seagrass, for instance, had been controlled by a random number generator, but there was no way to retrieve the original seed value for that generator. So animators manually replicated the plants’ movements frame by frame, a laborious process. The fact that the studio had lost access to its own film after less than a decade is a sobering commentary on the challenges of archiving computer-generated work.
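The seagrass anecdote turns on a basic property of pseudorandom number generators: given the same seed, a generator reproduces the same sequence exactly, and without the seed the sequence is effectively unrecoverable. A toy illustration, with a function invented purely for this sketch:

```python
# Why a lost seed matters: a pseudorandom generator is deterministic,
# so procedural animation can be replayed exactly -- but only with the
# original seed. The function here is a toy invented for illustration.
import random

def seagrass_sway(seed, frames=5):
    """Per-frame sway angles for a hypothetical procedural plant."""
    rng = random.Random(seed)
    return [round(rng.uniform(-1.0, 1.0), 3) for _ in range(frames)]

assert seagrass_sway(42) == seagrass_sway(42)  # same seed: identical motion
assert seagrass_sway(42) != seagrass_sway(7)   # lose the seed, lose the motion
```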
Another problem for archivists is that digital camera technology has allowed productions to shoot essentially everything. In the past, the ratio of what’s shot to what’s eventually used for a feature film was typically 10 to 1. These days, says Warner archive chief Anastasi, films can go as high as 200 to 1. “On some sets, they’re simply not turning the camera off,” he says.
All that material will typically get saved and stored for a while. But at some point, somebody will have to decide how much of that excess really needs to be preserved for posterity. Given the huge expense of film preservation, archivists are being ruthless about what they choose to store. “There’s no way we can store it all,” says USC archivist Everett. “We’re just going to store the bare minimum.”
At Warner, Anastasi has taken a triage approach. Four years ago, when he took over the studio’s archives, he faced two distinct challenges: First he had to “stop the bleeding” by figuring out how to save those assets that were most vulnerable to being lost. Those on two-inch videotape, the medium of choice for network TV shows in the 1970s and 1980s, “were the most at risk. We captured that material on digital as uncompressed JPEG 2000 files.” That part of the triage is now nearly complete.
The second challenge was finding a way to affordably maintain the studio’s archive for more than a generation. He set the goal at “50-plus years.” He also decided that rather than operating an in-house archive, the problem would be better handled by outsourcing it. And so in 2014, Warner signed a long-term contract with USC Libraries to maintain the studio’s archives.
Sam Gustman, associate dean of the USC Libraries, says that the Warner archives are now part of 50 petabytes of archived data at USC, which also includes nearly 54,000 video interviews with Holocaust survivors gathered by the USC Shoah Foundation. For 20 years of storage, including power, supervision, and data migration every 3 years, USC charges $1,000 per terabyte, or $1,000,000 per petabyte. That works out to a relatively affordable $2.5 million per year for its current 50-PB holdings. It’s not a money-making business, Gustman adds.
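USC’s quoted pricing works out as stated. The function below simply restates the article’s numbers; its name and structure are illustrative, not USC’s own cost model:

```python
# USC's quoted pricing, per the article: $1,000 per terabyte covering
# 20 years of storage, power, supervision, and periodic migration.
# The function name and breakdown are illustrative, not USC's model.

def annual_cost(petabytes, dollars_per_tb=1_000, contract_years=20):
    terabytes = petabytes * 1_000
    return terabytes * dollars_per_tb / contract_years

print(annual_cost(50))  # -> 2500000.0, i.e., $2.5 million per year for 50 PB
```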
The USC archive maintains four copies of each tape: Two are held at USC, one at Clemson University in South Carolina, and one at Charles University in Prague. The aim is to “touch every tape” every six months, using an automated system, Gustman explains. A robotic arm selects a tape from a rack and loads it into a reader, which plays it back while a computer checks for aberrations. Any tape that isn’t perfect is immediately trashed, and the archive makes a replacement from one of its remaining copies of the tape. The archive migrates to the latest version of LTO as it becomes available, so no tape is more than three years old.
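The scrubbing cycle Gustman describes can be sketched as follows; the data layout and function names are hypothetical stand-ins for the archive’s actual automation:

```python
# Sketch of the scrubbing cycle described: check every copy of every
# tape, discard any that shows errors, and reclone it from a surviving
# copy. The data layout and names are hypothetical stand-ins.

def scrub_cycle(archive):
    """archive maps tape_id -> list of copies, each {'site', 'healthy'}.

    Assumes at least one healthy copy of each tape survives.
    Returns the number of copies replaced.
    """
    replaced = 0
    for tape_id, copies in archive.items():
        for copy in copies:
            if not copy["healthy"]:
                good = next(c for c in copies if c["healthy"])
                copy["healthy"] = True            # recloned from a good copy
                copy["cloned_from"] = good["site"]
                replaced += 1
    return replaced

archive = {
    "tape-001": [{"site": s, "healthy": s != "clemson"}
                 for s in ("usc-a", "usc-b", "clemson", "prague")],
}
print(scrub_cycle(archive))  # -> 1 (the failed copy is replaced)
```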
Warner also began classifying its 8,000 feature films and 5,000 TV shows into two categories: those it will “manage”—that is, preserve for the long term—and those it deems “perishable.” Managed assets include not just the finished work but also marketing materials and some deleted scenes. Perishable material may include dailies for features or unused footage; it will be stored for some time in the archive but may not be migrated. To decide what’s perishable and what’s not, the studio considers things like how successful the film has been, how popular its stars are, and whether the film could have enduring (or cult) appeal.
The manage-or-perish scheme is by no means perfect, Anastasi admits, but he sees it as buying the studio a little time until a truly long-term digital storage technology comes along. If one ever does.
For now, he says, “We’ll keep it, and there’ll be time to rethink the strategy. But after 10 years, we can’t guarantee access” to any of the material that hasn’t been migrated to managed storage.
Everett says Warner’s strategic thinking about digital archiving is pioneering. All of the studios, he notes, “are in a realm where there is no policy.” Meanwhile, they’re waiting for an archival technology that is better than LTO. “Originally, we went all digital because it’s so much cheaper,” Everett notes. “But is it? Really? We haven’t solved the storage problem.”
If technology companies don’t come through with a long-term solution, it’s possible that humanity could lose a generation’s worth of filmmaking, or more. Here’s what that would mean: tens of thousands of motion pictures, TV shows, and other works would quietly cease to exist at some point in the foreseeable future. The cultural loss would be incalculable, because these works have significance beyond their aesthetics and entertainment value. They are major markers of the creative life of our time.
Most of the archivists I spoke with remain—officially at least—optimistic that a good, sound, post-LTO solution will eventually emerge. But not everyone shares that view. The most chilling prediction I heard came from a top technician at Technicolor.
“There’s going to be a large dead period,” he told me, “from the late ’90s through 2020, where most media will be lost.”
This article appears in the May 2017 print issue as “The Lost Picture Show.”
About the Author
Marty Perlmutter is based in Southern California and has worked in interactive video and new media for four decades, including early work designing immersive technology hardware, building exhibits, and exploring artistic and commercial uses of image control.