icm2re logo. icm2re (I Changed My Mind Reviewing Everything) is an ongoing web column by Brunella Longo

This column deals with aspects of the change management processes experienced in almost any industry impacted by the digital revolution: how to select, create, gather, manage, interpret and share data and information, whether for internal and usually incremental purposes - such as learning, educational and re-engineering processes - or because of external forces, like mergers and acquisitions, restructuring goals, new regulations or disruptive technologies.

The title - I Changed My Mind Reviewing Everything - is a tribute to authors and scientists from different disciplinary fields who have illuminated my understanding of intentional change and decision making processes over the last thirty years, explaining how we think - or how we think about the way we think. The logo is a bit of a divertissement, from the Latin divertere, which means to turn in separate directions.


Digital Humanities: a long time coming?

About a millenary change in education

How to cite this article?
Longo, Brunella (2021). Digital Humanities: a long time coming? About a millenary change in education. icm2re [I Changed my Mind Reviewing Everything ISSN 2059-688X (Print)], 10.9 (August). http://www.icm2re.com/2021-9.html

It's been a long
A long time coming
But I know a change gonna come
Oh, yes it will
Sam Cooke, 1964

London, 30 October 2021 - After my (sad) stroll in the woods of human computer interaction (see icm2re 10.4), during the pandemic lockdowns I felt compelled to keep up with another discipline, digital humanities, that is in a state of fascinating turmoil, with various streams of ongoing development and, at the same time, a growing pressure to consolidate methods and scope within the arts and the humanities areas.

Born as computational linguistics back in the 1940s, and entrenched in the domain of theological studies, digital humanities remained for many years a field of application of computer power to text analysis, with little need for abstract methodological reflection of its own. It was just a niche of practical research and application of information retrieval techniques to language and documents, and it remained as such - substantially unchanged except for adopting new software tools as soon as they became available - until the internet came along in the mid 1990s, bringing both more demand for and more fear of integration and collaboration between humanists, computer scientists and engineers.

Over the decades, informatics accumulated an impressive body of evidence on the practical utility of the discipline for libraries, databases and electronic publishing, no less than for social and political scientists and operational research, and not just for counting words and measuring their frequency in literary and religious texts: in the 1980s it was common practice for universities to offer courses in informatics within humanities degrees (following pioneering examples in library and information sciences that date back to the 1960s).

I was an enthusiastic adopter, student and advocate of informatics in the 1980s and early 1990s.

I learned a lot from its history of unstoppable micro-changes. It was a window opened onto the infinite creative possibilities that computer science could bring into libraries, archives and, in general, into data and information management practices.

With the internet and the world wide web, its applicability proliferated to the point that informatics seemed ubiquitous and, above all, necessary everywhere, not just in cultural heritage institutions and R&D functions.

Scholars started to talk about digital humanities rather than informatics, with the aim of distancing themselves from a past perhaps too colonised by Jesuits' expectations.

So I took a plunge into a couple of dozen English-language books, academic journals and course-books on the subject published in the last twenty years, and selected a few as representative of the immense literature produced in America, in the UK and in continental Europe. I reviewed some of these books for various publications and met some researchers to ask questions about their current work and views.

I identified three waves of studies that seem to have marked the evolution of the discipline over the last two decades and that are also recognisable as synchronic trends - you can still see some countries and communities dealing with problems of the second wave while, in others, the third has reached maturity.

At least until the mid 2000s, practical projects dominated the field: they aimed at the creation of electronic archives of digitised texts, with febrile enthusiasm for the democratisation of informatics that had come with the internet. The construction of digital repositories was, in fact, seen as the first goal of the digital scholar or "computational humanist", concerned with the long-term survival and preservation of electronic records and with questions of access. The perception that the digital revolution would require an immense cultural change and a different humanists' mindset was so overwhelming that, like a self-fulfilling prophecy, it perpetuated invisible but thick and irrational generational barriers, conceiving of the digital as a playground for "young professionals". The advances in networked computation, visualisation and information retrieval, and above all the possibility of sharing access to databases on a global scale, created many more opportunities for humanists to engage with digital artefacts than in previous decades. Choosing standards like TEI (Text Encoding Initiative), then HTML or XML, was the first practical technicality researchers had to handle - although this type of problem progressively lost prominence, thanks to the increased availability of easier-to-use, standardised and automated software solutions.

Following up on the optimistic, inspirational and creative journeys of the previous generations, the scholars of a second wave of research organised symposia and published papers, books and anthologies of limited and perhaps ephemeral interest, mostly concerning the future of the discipline and above all its "legacy" problem: after the explosion of studies and projects promoted within the field, what to do with the huge number of websites, digital archives and databases, paperwork, theses and books of very limited or no interest at all except to their authors? The legacy problem showed a wider audience - often sceptical about the place of arts and humanities in the modern world - not the results achieved by the discipline but its lack of commercial and business value. In fact, since the financial crisis of 2008, with the contraction of public funding and private investment, the legacy problem of digital humanities has increasingly become the main concern of many institutions, particularly the smaller ones. The race (and for some a countdown) to find new value propositions started, with researchers pointing in various directions: a sort of revival of 1950s information policies, imagining digital humanists as crafters and "makers" of 3D objects printed in library spaces, or a sort of U-turn in which the career of the digital humanist revolves around traditional teaching roles. But there are other interesting ideas, like cultural analytics (analytics applied to the world of the arts, from archaeological objects to reading habits) and, of course, the hermeneutics of user generated content - though this last, which has invaded the social media shores like jellyfish, creating debates as well as disinformation, is perhaps more a creative pastime than a profession.

A third wave I identified in the more recent literature is characterised by the critical consolidation (or at least the attempt to achieve it) of the domain knowledge, from which evidence of practical utility can be sifted and re-examined among teachers, scholars and experts. There seems to be a growing tension towards a mature attitude regarding the inevitability of integrating "the digital" back into the main curriculum, to prepare people to work - and not just survive - in the gig economy of the creative industries. Everywhere digital humanists see themselves as communities of practice: but this does not in itself mean being able to earn a living, or acquiring the skills needed for a career as a digital humanist outside the academic world. It is also true that positions in large organisations often require a sophisticated humanistic background together with additional specialisations, such as media law, epidemiology or medicine, journalism and public relations.


It may or may not be long before a radical reform of secondary and higher education - it may happen very slowly, over a hundred years - but it seems to me that "the digital" urgently calls for a once-in-a-thousand-years upheaval in how curricula for further education and university are designed, and not only in the humanities. I am referring here to societal and institutional changes on the scale of what happened in education with the invention of the universities in the High Middle Ages and, several centuries later, with the invention of compulsory public schools.

I do not know if "open degrees" (in which subjects from several disciplines can be combined to create original and sometimes unique curricula, self-designed by students often while working) are the solution, or if more private universities could accelerate the transition towards more equitable educational systems. Everybody would like more articulate and standard solutions for lifelong learning problems, and more computer skills embedded everywhere. But, for now, I see the entire world of education still deeply influenced by the ancient and medieval distinction, cemented over centuries, between the liberal and the mechanical arts (le arti del trivio e del quadrivio).

Little more than a scratch, the so-called "competence based method" has introduced some levers of change into the system over the last thirty years. Yet its effectiveness seems embroiled in a fragmentation of knowledge and abilities that is sometimes absolutely ridiculous to reconcile and assess according to current standards, or that simply conflicts with academic views.

Digital humanities is made of the two halves of a walnut, potentially treasuring invaluable lessons about societal, cultural and economic change as long as they remain deeply intertwined: the history of the book on one side - the bibles, the data, the documents - and the technologies of information and communication on the other.

It is perhaps the only discipline in which the arti del trivio e del quadrivio are, however unintentionally, muddled together. That still means troublemaking stuff for academics.