Submitted by Bibliofuture on March 5, 2010 - 8:47pm
A recent blog post by Craig Mod, a self-described computer programmer, book designer and book publisher, offers a thoughtful and distinctive perspective on the move of books from paper to interactive devices like Apple’s iPad.
Mr. Mod summarizes his argument in the subtitle of his post: “Print is dying. Digital is surging. Everyone is confused. Good riddance.”
Mr. Mod divides content broadly into two categories: content whose meaning depends on its form, such as poetry or text with graphics, and content whose words are independent of any particular layout, which he says applies to most novels and non-fiction.
This kind of thinking makes a key point: instead of arguing about pixels versus paper, as many book lovers tend to do, it is more useful to focus on whether the technology is a good match for the content.
Full article at the NYT Bits Blog
Submitted by Blake on February 2, 2010 - 7:53am
You may know Peter Morville from such books as Information Architecture for the World Wide Web or Ambient Findability, from any number of library conferences, or from his sites semanticstudios.com and findability.org. Well, he's back with a new book, Search Patterns: Design for Discovery, and a companion site, http://searchpatterns.org/ .
"Search is among the most disruptive innovations of our time. It influences what we buy and where we go. It shapes how we learn and what we believe. This provocative and inspiring book explores design patterns that apply across the categories of web, e-commerce, enterprise, desktop, mobile, social, and real time search and discovery. Using colorful illustrations and examples, the authors bring modern information retrieval to life, covering such diverse topics as relevance ranking, faceted navigation, multi-touch, and mixed reality. Search Patterns challenges us to invent the future of discovery while serving as a practical guide to help us make search applications better today."
Submitted by Bibliofuture on January 31, 2010 - 1:18pm
Book: Print Is Dead: Books in Our Digital Age
For over 1,500 years, books have weathered numerous cultural changes remarkably unaltered. Through wars, paper shortages, radio, TV, computer games, and fluctuating literacy rates, the bound stack of printed paper has, somewhat bizarrely, remained the most robust and culturally relevant way to communicate ideas. Now, for the first time since the Middle Ages, all that is about to change.
Newspapers are struggling for readers and relevance; downloadable music has consigned the album to the format scrap heap; and the digital revolution is now about to leave books on the high shelf of history. In Print Is Dead, Gomez explains how authors, producers, distributors, and readers must not only acknowledge these changes, but drive digital book creation, standards, storage, and delivery as the first truly transformational thing to happen in the world of words since the printing press.
Submitted by Bibliofuture on January 22, 2010 - 10:58am
Sharing what we see on the Web has reduced the chance that we are missing something important. Call it "controlled serendipity."
NYT Bits Blog
Submitted by Bibliofuture on December 26, 2009 - 12:20am
Google sponsored research to detect differences in how children and adults search and to identify barriers children face when seeking information.
When Benjamin Feshbach was 11 years old, he was given a brainteaser: Which day would the vice president’s birthday fall on the next year?
Benjamin, now 13, said he typed the question directly into the Google search box, to no avail. He then tried Wikipedia, Yahoo, AOL and Ask.com, also without success. “Later someone told me it was a multistep question,” said Benjamin, a seventh grader from North Potomac, Md.
“Now it seems quite obvious because I’m older,” he said. “But, eventually, I gave up. I didn’t think the answer was important enough to be on Google.” Benjamin is one of 83 children, ages 7, 9 and 11, who participated in a study on children and keyword searching. Sponsored by Google and developed by the University of Maryland and the Joan Ganz Cooney Center, the research was aimed at discerning the differences between how children and adults search and identify the barriers children face when trying to retrieve information.
Full article in the NYT
Submitted by Bibliofuture on December 10, 2009 - 2:25pm
A report published Wednesday by the University of California, San Diego, calculates that American households collectively consumed 3.6 zettabytes of information in 2008. The paper, titled "How Much Information?", explores all forms of American communication and consumption and aims to create a census of the information we consume.
I’ll be honest: this is the first time I’ve ever used the word zettabyte. I’ve heard of petabytes and even exabytes, but zettabytes are a whole new level of bytes. If a zettabyte is beyond your comprehension, too, it’s essentially one billion trillion bytes: a 1 with 21 zeros at the end. To put that into perspective, one exabyte — which equals 1/1000 of a zettabyte or 1 billion gigabytes — is roughly equivalent to the capacity of 5.1 million computer hard drives, or all the hard drives in Minnesota.
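The unit arithmetic above is easy to check. A minimal sketch (the unit table and function name here are my own, not the report's), using decimal SI prefixes:

```python
# Decimal (SI) byte units; the report's 3.6 ZB figure uses these prefixes.
UNITS = {"GB": 10**9, "TB": 10**12, "PB": 10**15, "EB": 10**18, "ZB": 10**21}

def convert(value, from_unit, to_unit):
    """Convert a quantity between byte units, e.g. zettabytes to exabytes."""
    return value * UNITS[from_unit] / UNITS[to_unit]

print(convert(3.6, "ZB", "EB"))  # 2008 household consumption, in exabytes
print(convert(1, "EB", "GB"))    # one exabyte expressed in gigabytes
```

Running the conversion confirms the article's figures: 3.6 zettabytes is 3,600 exabytes, and one exabyte is a billion (10^9) gigabytes.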
Full piece here.
Submitted by Bibliofuture on November 27, 2009 - 4:02pm
Submitted by Bibliofuture on November 26, 2009 - 11:36am
Hype around augmented reality, a technology that can superimpose graphics or information over the real world in your phone’s viewfinder, is at a fever pitch. But can it deliver the revenues?
Full article in the NYT
Submitted by Bibliofuture on October 8, 2009 - 4:02pm
A new company called SkyRiver has launched a bibliographic utility, directly challenging the long-dominant OCLC. Over the last 18 years, strategic acquisitions by OCLC have narrowed competition, but SkyRiver, founded by Jerry Kline, the owner and co-founder of Innovative Interfaces, aims to expand the market with an alternative bibliographic utility for cataloging that could cut libraries' spending on bibliographic services by up to 40 percent.
Full article here.
Submitted by Bibliofuture on September 16, 2009 - 12:56pm
On "Talk of the Nation"
Google stands to be the single repository for millions of the world's books. Advocates applaud the organization and the access a digital library can afford. But critics worry about monopoly and profit motives, and what it means for readers' privacy.
Submitted by Bibliofuture on August 25, 2009 - 1:30am
Officials at the Wikimedia Foundation, the nonprofit in San Francisco that governs Wikipedia, say that within weeks, the English-language Wikipedia will begin imposing a layer of editorial review on articles about living people.
The new feature, called “flagged revisions,” will require that an experienced volunteer editor for Wikipedia sign off on any change made by the public before it can go live. Until the change is approved — or in Wikispeak, flagged — it will sit invisibly on Wikipedia’s servers, and visitors will be directed to the earlier version.
Story in the NYT
Submitted by Great Western Dragon on August 2, 2009 - 11:00pm
CDs, tapes, external drives, off-site backup through Amazon S3: all of these are viable options for backing up precious data.
But what about paper?
Crazy? Well, not really. A programme called PaperBack renders files as machine-readable patterns on standard paper. Simply print and file the pages. To recover the files, scan the paper back in. Still, what's the advantage?
Well, one big one is that technology comes and goes. We have had Zip drives, tape drives, and all kinds of formats that aren't used anymore. Meanwhile TWAIN, the standard protocol for scanners, has been around for almost two decades and isn't likely to go anywhere soon.
Sure, you wouldn't want to back up, say, your ILS database this way. But how about important circulation data? Passwords for those days when an act of God wipes your data centre from the face of the earth? You could send updates to rural areas with limited internet access. And in the end, it uses a medium that has been with us for thousands of years.
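To make the idea concrete, here is a toy sketch of the round trip PaperBack performs. The real program prints dense dot bitmaps with compression and error correction; this simplified version (the function names and the '#'/'.' symbols are my own illustration) just turns bytes into a printable grid and back:

```python
# Toy version of the PaperBack idea: the real tool prints dense dot bitmaps
# with compression and error correction; this sketch only shows the round
# trip from bytes to a printable grid of symbols and back again.

def bytes_to_grid(data, width=64):
    """Render bytes as rows of '#' (1) and '.' (0) suitable for printing."""
    bits = "".join(f"{byte:08b}" for byte in data)
    bits += "0" * (-len(bits) % width)  # pad the final row to full width
    return [bits[i:i + width].replace("1", "#").replace("0", ".")
            for i in range(0, len(bits), width)]

def grid_to_bytes(rows, length):
    """Recover the original bytes from the grid (a stand-in for scanning)."""
    bits = "".join(rows).replace("#", "1").replace(".", "0")
    return bytes(int(bits[i:i + 8], 2) for i in range(0, length * 8, 8))

secret = b"circulation backup, May 2009"
for row in bytes_to_grid(secret):
    print(row)
assert grid_to_bytes(bytes_to_grid(secret), len(secret)) == secret
```

The final assertion is the whole point: print-and-scan is only a backup if the bytes survive the round trip intact, which is why the real tool layers error correction on top of a scheme like this.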
Submitted by StephenK on June 4, 2009 - 2:17pm
Submitted by Bibliofuture on June 1, 2009 - 2:16pm
Wikipedia’s arbitration committee ruled to permanently block contributions and edits to Scientology articles from Internet addresses originating from the Church of Scientology’s headquarters.
The decision follows six months of debate among administrators of the user-edited encyclopedia, who found conflicts between Wikipedia editors who were Scientology enthusiasts and those who disliked the religion. Some 430 Scientology entries on Wikipedia resulted in constant battles over revisions between the two camps. User accounts were created for the sole purpose of deleting or adding information on Scientology, a practice seen as harmful to Wikipedia’s neutrality principles.
Full story in the WSJ
Submitted by StephenK on May 13, 2009 - 1:45pm
Tuesday was a unique day. May 12 was the month's second Tuesday, and I had appointments to keep within civil society. While I was out and about, interacting with other human beings in person, Twitter launched a change. Download Squad reported that Twitter had changed part of its core functioning. UX specialist Whitney Hess railed against the change. Gregory Pittman linked on Twitter to a blog post in which Twitter explained that the change was due to engineering limitations related to system stability.
This presents a core problem in the Twitter debates. Twitter may be where people hang out, but is it structurally capable of handling the load? Are there reasonable assurances of consistent system behavior? Twitter's blog post dances around the problem of scalability, treating it as the 800-pound elephant in the room.
Twitter, at its core, is a fairly limited service. External bolt-ons like TwitPic, Twibes, and others were created to make the service do more than was originally intended. Re-tweets, "Follow Friday", and other community conventions are now more limited, which practically prevents serendipitous discovery. And unless a library has contracted with Twitter for service, there is no guaranteed service level, which could annoy patrons seeking help via Twitter.
Twitter is not the only game in town for microblogging, though. In December 2008, LISTen talked to Evan Prodromou, a principal designer of the Laconica software platform. Identi.ca is the flagship site for the Laconica service, while others like TWiT Army and Dungeon Twitter also exist. The group functionality that Twibes bolts onto Twitter is integrated into Laconica itself. TwitPic, Twitterfeed, and more can now interact with Laconica-based sites just as easily as they can interact with Twitter.
It seems a technically superior choice to Twitter already exists. Given the weeping and gnashing of teeth observed Tuesday over changes in functionality, the question arises: what is the bright line that has to be crossed before someone will switch services? At the least, you can control your own local Laconica installation far more readily than you can influence engineering decisions at Twitter. With federation possible through the OpenMicroBlogging protocol, there is less need than before for a monolithic microblogging platform.
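Laconica's interoperability with Twitter tools rested on its Twitter-compatible REST API. A minimal sketch of how a client might target a self-hosted instance; the hostname, credentials, and helper function here are illustrative assumptions, not a tested install:

```python
import base64
import urllib.parse
import urllib.request

# Hypothetical self-hosted Laconica/StatusNet instance; the hostname,
# account, and password are placeholder assumptions. Because Laconica
# mirrored Twitter's REST API, Twitter clients could often be pointed at
# a new base URL like this one with no other changes.
BASE_URL = "https://microblog.example.org/api"

def build_update_request(status, user, password):
    """Build (but do not send) a status-update POST using HTTP Basic auth."""
    data = urllib.parse.urlencode({"status": status}).encode()
    req = urllib.request.Request(f"{BASE_URL}/statuses/update.json", data=data)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

req = build_update_request("Hello from a self-hosted microblog",
                           "librarian", "s3cret")
print(req.full_url, req.get_method())
```

Swapping providers, in this model, is a matter of changing one base URL, which is exactly the kind of leverage a library running its own installation would have.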
The biggest question seems to be, though, what the next move is for Twitter users.
On The Twitter Brouhaha by Stephen Michael Kellat is licensed under a Creative Commons Attribution-Share Alike 3.0 Unported License.
Submitted by Bibliofuture on May 7, 2009 - 12:27am
Over the weekend, on a mailing list associated with new media transformations, there emerged a debate on the inherent utility of Google Book Search (GBS). Involving Paul Duguid of the Information School at UC Berkeley, Danny Sullivan from Search Engine Land, Tim O’Reilly from O’Reilly Media, and Donald Waters of The Andrew W. Mellon Foundation (as well as a few others not excerpted here), the debate drew out many of the tensions surrounding GBS.
Full blog entry here.
Submitted by Bibliofuture on May 3, 2009 - 10:46pm
THIS is the end of the line for Encarta, the encyclopedia that Microsoft introduced in 1993 and still describes boastfully on its Web site as “the No. 1 best-selling encyclopedia software brand for the past eight years.” Microsoft recently announced that sales would soon cease and that the Encarta Web site, supported by advertising, would be shut down later this year.
It’s hard to look at the end of the Encarta experiment without the free and much larger Wikipedia springing immediately to mind. But Encarta arguably would have failed even without that competition. The Google-indexed Web forms a virtual encyclopedia that Encarta never had a chance of competing against.
Full article in the New York Times
Submitted by Bibliofuture on April 28, 2009 - 7:51pm
Publicly funded research doesn't seem so public when taxpayers must pay to read the results in a journal. A new law may help publishing companies preserve their business models, but it will limit public access to the research.
Story on Marketplace
Submitted by Bibliofuture on April 14, 2009 - 2:57pm