It is a commonplace view that the use of computing and its associated technologies has a profound effect on the practice of disciplines that have been associated with the humanities. At base, the production and dissemination of documents has been greatly augmented by computer technology. The development of multimedia and hypertext methods has, furthermore, allowed for multimodal presentation of information that has enhanced our cognitive appraisal and assimilation of materials produced by scholars working in these “liberal arts” fields. Much of the discussion concerning the impact of computer technology upon humanities disciplines has taken place under the rubric of “digital humanities”. This article argues that a new methodological paradigm is emerging in fields such as English and history that accommodates not only what commonly falls under “digital humanities”, but also includes endeavors that encompass cognitive and algorithmic/computational approaches. For example, theorists (such as David Herman, Lisa Zunshine, Patrick Colm Hogan and Ellen Spolsky) who work in departments of English have taken inspiration from research currently done on human cognition to help interpret literary texts from the standpoint of psychology and neuroscience. From a computational perspective, an interpretive approach that uses statistical and graph-theoretic methods to analyze texts has been advocated by Franco Moretti and his colleagues at Stanford University. I will argue that these three approaches—digital humanities, cognitive humanities and algorithmic/computational humanities—have distinctive claims that can also complement and intersect with each other. Taken together they represent an emerging methodological paradigm for the humanities.
Keywords: Digital Humanities, Computers and the Humanities, Emerging Trends, Cognition and the Humanities, Computational Humanities
Director, CCIT, Department of Computer Science, State University of New York at Oswego, Oswego, USA