The New York Times: Even Digital Memories Can Fade

Anonymous Patron writes "The article 'Even Digital Memories Can Fade' says no one has figured out how to preserve electronic materials from the nation's 115 million home computers for the next decade, much less for the ages. 'To save a digital file for, let's say, a hundred years is going to take a lot of work,' said Peter Hite, president of Media Management Services, a consulting firm in Houston. 'Whereas to take a traditional photograph and just put it in a shoe box doesn't take any work.' Already, half of all photographs are taken by digital cameras, with most of the shots never leaving a personal computer's hard drive."

Comments

learning curve for technologists as well

This article was covered in a Slashdot entry yesterday. I posted a few replies (my id over there is ragnar) to clarify some points, and I find the responses of this slice of the tech community interesting. For example, many of the geeks (I'm one of 'em) confuse backup with archival. Also, many don't see past the initial lossless bit-for-bit copy to the longer-term issues with digital archival.

Naturally the public at large is unfamiliar with these issues, but technologists need to understand them at some level. They will be the ones either perpetuating or solving the problem.

I saw this headline and shuddered

You all are probably thinking more about the archival properties of digital information as it relates to libraries, but all I could think about were the moments I'd captured on my digital camera...some very unimportant, but others of people near and dear to my heart...and here they abide on the hard drive of my computer.

This article sent a chill down my Luddite spine, because I want my daughter to be able to look at old pictures from the 1990s and 2000s and be able to remember when... I'm going to load up my little 35mm point-and-shoot and my OM-1 for the sake of familial posterity.

Re:learning curve for technologists as well

Could you elaborate on this: "many don't see past the initial loss-less bit for bit copy to the longer term issues with digital archival"

Re:learning curve for technologists as well

What I mean to say is that IT people see the ability to copy an image file from one medium to another as prima facie evidence that digital sources have no archival challenges. They see the problems posed by analog sources, which lose quality when copies are made (think of copying a videotape), and extrapolate this single issue to the whole of archival tasks.

As many here may know, one of the biggest long-term archival issues with digital media is the interface through which it is rendered. I'll offer an example that is close to my way of thinking these days. I'm doing some work at the University of Virginia, where we are exploring options for housing scholarly digital editions in the Library. The UVa Library has a mature effort underway with the Fedora project; however, as we discuss levels of collection and imagine future migrations, it is evident that some content will be lost.

In this case, I use the word "content" very broadly. For example, we contemplate how to preserve the search interface long term, as well as the appearance. Your question spurred me to write a more detailed example of the scenario: http://faustroll.clas.virginia.edu/nines/node/view/49
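To make the copy-verification point concrete, here is a minimal sketch in Python (the file names are hypothetical) of checking that a copy is bit-identical to its original. Passing this check is exactly the "lossless bit-for-bit copy" guarantee, and it says nothing about whether any future software can still render the format:

    import hashlib

    def sha256_of(path):
        """Compute a SHA-256 digest of a file, reading in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Verify that a migrated copy is bit-identical to the original.
    # "original.tif" and "copy.tif" are hypothetical file names.
    if sha256_of("original.tif") == sha256_of("copy.tif"):
        print("Copy is bit-perfect, but that alone says nothing about")
        print("whether the format can still be rendered decades from now.")

The digest guards against degradation in transit and storage; the rendering interface, search behavior, and appearance are a separate preservation problem entirely.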

Re:learning curve for technologists as well

I thought you were referring to some kind of degradation when you copy the electronic file. So you're not talking about just archiving pictures and documents, you're talking about archiving the technology. Yes?

Re:learning curve for technologists as well

Correct. Your description (archiving the technology) is a good way of putting it.

Re:learning curve for technologists as well

Well, that said, while I understand the desire and the need, I'm not sure how practical it is. You can set aside a machine that only runs Windows 3.1 (or whatever was pre-95), but trying to archive code just seems mind-boggling. Code would have to be the equivalent of dialects, branches of a language. In the computer world, some of these dialects and languages are the equivalent of a thousand years old. A hundred years from now, people are just going to be scratching their heads.

I'm not sure I'm saying this very well. The point is that while an infinite library is a pleasant dream, in reality it would take an equally infinite consciousness to access and make use of such a library.

Re:I saw this headline and shuddered

Aw, just put it on a CD-ROM and it will last forever! YAY!

Re:I saw this headline and shuddered

Dear GrumpyOldMan, would that that were true, but I hear tell it ain't so - and somehow I never get around to it either. ;-)

Even Digital Memories Can Fade

Isn't Google working on something related to this? I hear they are developing a tool that indexes the contents of your computer's hard drive.
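The basic idea behind such a tool, walking the drive and cataloguing metadata for every file it finds, can be sketched in a few lines of Python. This is a toy illustration only, not any actual product's implementation, and the starting path is just an example:

    import os

    def index_files(root):
        """Walk a directory tree and record basic metadata for each file."""
        index = {}
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    stat = os.stat(path)
                except OSError:
                    continue  # skip files that vanish or can't be read
                index[path] = {"size": stat.st_size, "mtime": stat.st_mtime}
        return index

    # Example: index the user's home directory (path is illustrative).
    catalog = index_files(os.path.expanduser("~"))
    print(f"indexed {len(catalog)} files")

A real desktop search tool would also extract and index the text inside files, but even this skeleton shows how much of a personal archive sits unexamined on a hard drive.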
