CNN, among others, reports that an earthquake measuring between 5.6 and 5.8 struck Chino Hills, outside Los Angeles. It will be days before a final magnitude is settled upon. MSNBC's report notes that location is key to whether damage occurs.
Much of this raises a point we don't think about much in librarianship. If we rely on a remote server that gets hit by a natural disaster, what do we do? Do we have local backups? Is there something we can fail over to?
A prime example of the problem is Twitter. The majority of Twitter's servers sit in one of the most geologically active areas of North America. If an earthquake hit, Twitter would probably be toast without a backup outside San Francisco.
Centralization may be convenient, but it is deeply vulnerable. During the Cold War, one recognized way to disrupt the Soviet economy was to blow up a single factory, since production of a given good was typically concentrated in one plant. Hit the shoe factory, and there might not be shoes for a while. Hit the radio factory, and folks might have to turn to smugglers from Western Europe to bring in Telefunken devices and other such things.
While there are Web 2.0 sites with great promise, the biggest worry is excessive centralization. If a site goes down, what do you do? If you had important documents saved only to Google Docs, what happens when it goes away? The recent Amazon S3 outage showed just how fragile cloud computing can be: it requires a near-perfect, disruption-free world in which to operate effectively.
There are some moves afoot toward decentralization. The PGP web of trust is one great example of decentralizing the backbone of a public-key encryption system. identi.ca runs on Laconica, software licensed under the GNU Affero General Public License that allows decentralized microblogging across multiple servers. Decentralization is a major step toward resilience.
When the one big server blows up, where will you go for your data?