Abstract
There has been widespread excitement in recent years about the emergence of large-scale digital initiatives (LSDIs) such as Google Book Search. Many have welcomed the prospect of a digital recreation of the Library of Alexandria, but these projects have also attracted great controversy. This article examines one such controversy: the suggestion that mass digitization is creating a virtual rubbish dump of our cultural heritage. It discusses some of the quantitative methods being used to analyse the big data that these initiatives have created, and two major concerns that have arisen as a result. First, there is the concern that quantitative analysis has inadvertently fed a culture that favours information over traditional research methods. Second, little is known about how LSDIs are used for research beyond these quantitative methods. These problems have helped to fuel the idea that digitization is destroying the print medium, even though in many respects it still closely remediates the bibliographic codes of the Gutenberg era. The article concludes that more work must be done to understand the impact of mass digitization on all researchers in the humanities, rather than just the early adopters, and briefly outlines the author's ongoing work in this area.
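The quantitative methods the abstract refers to typically begin with simple corpus statistics such as token frequencies over digitized text. As a purely illustrative sketch (not drawn from the article itself), the following Python snippet counts word frequencies in a hypothetical stand-in for a page of OCR'd text from an LSDI corpus; the sample string and function name are assumptions introduced for illustration.

```python
import re
from collections import Counter

def token_frequencies(text: str) -> Counter:
    """Count lowercase alphabetic tokens in a digitized text."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

# Hypothetical stand-in for a page of OCR'd text from an LSDI corpus.
page = (
    "The library of Alexandria was the great library of antiquity; "
    "mass digitization promises a library of everything."
)

# The five most frequent tokens: the kind of raw count that underpins
# n-gram style quantitative analysis of digitized corpora.
print(token_frequencies(page).most_common(5))
```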
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 425–431 |
| Number of pages | 7 |
| Journal | Literary and Linguistic Computing |
| Volume | 28 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2012 |