Notes on current issues in Digital Humanities

In July 2011, the Open University held a colloquium called ‘Digital technologies: help or hindrance for the humanities?’, in part to celebrate the launch of the Thematic Research Network for Digital Humanities at the OU. A full multi-author report on the colloquium (titled 'Colloquium: Digital Technologies: Help or Hindrance for the Humanities?') will appear soon in the 'Digital Futures' special issue of Arts and Humanities in Higher Education. A workshop was also held at the OU's Milton Keynes campus on Thursday to discuss some of the key ideas that came out of the colloquium and to consider the agenda for the thematic research network. I was invited to present at the workshop, and I've shared my notes and some comments below (though of course the spoken version varied slightly).

To help focus the presentations, Professor John Wolffe (who was chairing) suggested we address the following points:

  1. What, for you, were the two most important insights arising from last July’s colloquium?
  2. What should be the two key priorities for the OU’s DH thematic research network over the next year, and why?

Notes on the colloquium and current issues in the Digital Humanities
 

Introduction – who I am as context for how I saw the colloquium
Before I started my PhD, I was a digital practitioner – a programmer, analyst, bearer of Zeitgeisty made-up modern job titles – situated in an online community of technologists loosely based in academia, broadcasting, libraries, archives, and particularly, in public history and museums. That's really only interesting in the context of this workshop because my digital community is constituted by the very things that challenge traditional academia – ad hoc collaboration, open data, publicly sharing and debating thoughts in progress.

For people who happily swim in this sea, it's hard to realise how new and scary it can be, but just yesterday I was reminded how challenging the idea of a public identity on social media is for some academics, let alone the thought of finding time to learn and understand yet another tool. As a humanist-turned-technologist-turned-humanist, I have sympathy for the perspective of both worlds.

The two most important insights arising from last July’s colloquium?
John Corrigan's introduction made it clear that the answer to the question 'what is digital humanities' is still very open, and has perhaps as many different answers as there are humanists. That's both exciting and challenging – it leaves room for the adaptation (and adoption) of DH by different humanities disciplines, but it also makes it difficult to develop a shared language for collaboration, for critiquing and peer reviewing DH projects and outputs… [I've also been wondering whether 'digital humanities' would eventually devolve into the practices of disciplines – digital history, etc – and how much digital humanities really works across different humanities disciplines in a meaningful way, but that's a question for another day.]

In my notes, the discussion around Chris Bissell's paper on 'Reality and authenticity', Google Earth and archaeology also stood out – the questions about what's lost and gained in the digital context are important but, as a technologist, I'd ask us to be wary of false dichotomies. There's a danger in conflating the materiality of a resource – the seductive aura of an original document, the difficulties in accessing it, in getting past the gatekeepers – with the quality of the time spent with it, and with the intrinsic complexity of access, context and interpretation… The sometimes difficult physical journey to an archive, or the smell of old books, is not the same as earned access to knowledge.

What should be the two key priorities for the OU’s DH thematic research network over the next year?
[I don't think I did a very good job answering this, perhaps because I still feel too new to know what's already going on and what could be added. Also, I'm apparently unable to limit myself to two.]
I tend to believe that the digital humanities will eventually become normalised as just part of how the humanities work, but we need to be careful about how that actually happens.

The early adopters have blazed their trails and lit the way but, in their wake, they've left the non-early adopters – the ordinary humanists – blinking and wondering how to thrive in this new world. I have a sense that digital humanities is established enough, or at least that the impact of digitisation projects has been broad enough, that the average humanist is now expected to take on the methods of the digital humanist in their grant and research proposals and in their teaching – but have ordinary humanists been equipped with the skills, the training and the access to technologists and collaborators they need to thrive? Do we need to give everyone access to DH101?

We need to deal with the challenges of interdisciplinary collaboration, particularly publication models, peer review and the inescapable REF. We need to understand how to judge the processes as well as the products of research projects, and to find better ways to recognise new forms of publication, particularly as technology is also disrupting the publication models that early career researchers used to rely on to get started.

Much of the critique of digital working was about what it lets people get away with, or how it risks misleading the innocent researcher. As with anything on a screen, there's an illusion of accuracy, completeness and neatness. We need shared practices for critiquing visualisations and for discussing what's really available in database searches, the representativeness of digital repositories, the quality of transcriptions and metadata, and the context in which data was created and knowledge produced… Translating the slipperiness of humanities data and research questions into a digital world is a juicy challenge, but it's necessary if the potential of DH is to be exploited, whether by humanities scholars or by the wider public who now have access to humanities content.

Digitality is no excuse to let students (or other researchers) get away with sloppy practice. The ability to search across millions of records is important, but you should treat the documents you find as rigorously as you'd treat something uncovered deep in the archives. Slow, deep reading, considering the pages or documents adjacent to the one that interests you, the serendipitous find – these are all still important. But we also need to help scholars find ways to cope with the sheer volume of data now available and the probably unrealistic expectations of complete coverage of all potential sources this may create. So my other key priority is working out and teaching the scholarly practices we need to ensure we survive the transition from traditional to digital humanities.

In conclusion, the same issues – trust, authority, the context of knowledge production – are important for my digital and my humanities communities, but these concepts are expressed very differently in each. We need to work together to build bridges between the practices of traditional academia and those of the digital humanities.

Notes from EuropeanaTech 2011

Some very scrappy notes from the EuropeanaTech conference held in Vienna this week, as I prepare a short talk for the Open data in cultural heritage (LODLAM-London) event tonight… For a different perspective there's an overview post at 'EuropeanaTech – är det här framtidens kulturarv?' ('EuropeanaTech – is this the cultural heritage of the future?'), and I'll link to any others I find. I've also put up some photos of ten questions attendees asked about Europeana, with written answers from the break-out exercise. I'll tidy up and post my keynote notes in a few days, and I'll probably summarise things a bit more then.

Max Kaiser: Europeana is like a cruise ship with limited room to move; hackathons inject Europeana with a bit more agility… Building real stuff for real people with real business requirements is different to building prototypes and proofs of concept – it requires a different project culture.

Bill Thompson: pulling the analogue past into the digital future… We don't live in a digital world and never will – the physical world is not going to vanish. We'll remain embodied minds; we'll have co-existing analogue and digital worlds. Digital technologies shape the possibilities we decide to embrace. … Can't have a paradigm shift in the humanities because there's no basic set of beliefs to argue with… But maybe the shift to digital is so fundamental that it could be called a paradigm shift. … Even if you don't engage online, you'll still live in a world shaped by the digital. Those who are online will come to define the norms. … Revolutionary vanguard in our midst – hope lies with the programmers, the coders – the only weapon that matters is running code. Have to build on technologies that are open; it's the only way to build a diverse online culture that allows all voices to be heard. … That means open data in a usable form – properly formulated so it can be interpreted by anyone, or any program, that wants it – and integrated into the broader cultural space. Otherwise it's just disconnected islands.

Two good reasons to endorse linked open data. We're the first generation that's capable of doing this – we have the tools, network, storage and processes; it's within our power to digitise everything and make it findable. We may also be the only generation that wants to do it – later generations won't value things that aren't visible on a screen in the same way; they'll forget the importance of the non-digital. So we'd better get on with it, and do it properly. LOD is a foundation that allows us to build in the future.
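[An aside while tidying these notes: Thompson's 'open data in a usable form' is easy to demonstrate. Here's a minimal sketch in Python using rdflib – the institutions, URIs and records are invented for illustration – of how two openly published, well-formed records that share an identifier stop being disconnected islands: a generic tool can query across them with no bespoke integration work.]

```python
# Two hypothetical institutions publish records about works by the same
# artist, using a shared URI for that artist. All identifiers are invented.
from rdflib import Graph

museum_record = """
@prefix dc: <http://purl.org/dc/elements/1.1/> .
<http://example.org/museum/painting1>
    dc:creator <http://example.org/artist/vermeer> ;
    dc:title "Girl with a Pearl Earring" .
"""

library_record = """
@prefix dc: <http://purl.org/dc/elements/1.1/> .
<http://example.org/library/letter7>
    dc:creator <http://example.org/artist/vermeer> ;
    dc:title "Letter concerning a commission" .
"""

# Merge both sources into one graph – possible because both are open and
# properly formulated, not locked behind a bespoke format or licence.
g = Graph()
g.parse(data=museum_record, format="turtle")
g.parse(data=library_record, format="turtle")

# One query now spans both collections: no longer disconnected islands.
query = """
    PREFIX dc: <http://purl.org/dc/elements/1.1/>
    SELECT ?title WHERE {
        ?work dc:creator <http://example.org/artist/vermeer> ;
              dc:title ?title .
    }
"""
for row in g.query(query):
    print(row.title)
```

Nothing here depends on who published the data or what software they run – which is Thompson's point about openness in a usable form.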

Panel discussion…

Qu: how does the open theme fit with orgs facing budget cuts and the need to make more money?
BT: when you need to make money from assets, openness is a real challenge. There are ways of making assets available to people that are unlikely to have commercial impact but could raise awareness, e.g. low-res images for public access, high-res for commercial use [a model adopted by many UK museums].

Jill Cousins: there's a reputational need to put decent-resolution images online to counter the poor-quality versions already circulating.

Max: be clever – don't sign an exclusive contract with digitisation partners; make sure you can also give free access to the digitised material.
Jill Cousins: the user has always been central to Europeana, though that got slightly lost along the way while everyone was busy getting data. … Big stumbling block – licences. It's not just about commercial reasons; it's also about reputational risk, loss of future earnings, and the fear of giving away something that might be valuable in future. Without a CC licence, the data can't be published as linked open data; commercial providers like INA can't take the API; it can't be used on blogs that carry advertising; it couldn't be put on Wikipedia, or ArtFinder. … The New Renaissance report [from the EU's Comité des Sages] – metadata related to objects digitised by cultural heritage orgs should be widely and freely available for re-use.
Workshops with content holders: Risks – loss of quality, loss of control, attribution, brand value, potential income ('phantom income'), unwanted spillover effects – misuse/juxtaposition of data. Rewards: increasing relevance, increasing channels to end users, data enrichment, brand value, specific funding opportunities, discoverability, new customers, public mission, building expertise, desired spillover effects. … With attribution, you are reliant on the user doing the right thing…
Main risks: unwanted spillover effects, loss of attribution, loss of potential income. Main rewards: new customers, increasing relevance, public mission. But the risks diminished as the rewards gained more prominence – overall, the rewards outweighed the risks. Those three areas of risk still need addressing, though.
What next? Operationalise some of the applications developed. The 'Yellow Milkmaid' paper on the business of open data. Working together on the difficulties institutions face in licensing open data.
[notes from day 2 to follow!]

Ten questions about Europeana…
10 questions (and one general question)
The general question was: what can the community-building work package (with domain experts, developers and researchers / R&D / innovation) in Europeana 2.0 do? (Something like that, anyway – it was all a bit confusing by that point.)
You had to pick a question and go into a group to try and answer it – I've uploaded photos of the answer sheets.
  1. Open source – if Europeana is using open source software and is itself open software, should it also become a community-driven development project?
  2. Open source – are doubts about whether OSS provides quality services justified? What should be done to ensure quality?
  3. Aggregation and metadata quality – what will be the role of aggregators, and what is the role of Europeana in a LOD future?
  4. What can Europeana do that search engines can't, to justify the extra effort of creating and managing structured metadata?
  5. Is EDM [Europeana Data Model] still too complicated? If so, what should be simplified? (See the sketch after this list for a sense of what a minimal EDM record involves.)
  6. What is the actual value of semantic contextualisation, and could it not be produced by search engines?
  7. Enhance the experience of exploring and discovering [see photo – it was too long to type in time!]
  8. How important is multilingual access for discovery in Europeana? Which elements are the most important?
  9. Can Europeana drive end-user engagement on the distributed sites and services of contributing archives?
  10. How can we benefit from existing (local, international) communities in enriching the user experience on Europeana?
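[Since question 5 comes up a lot, here's a rough illustration of why EDM can feel complicated. This is a minimal sketch in Python using rdflib; the item URIs and institution name are invented, but the classes and properties (edm:ProvidedCHO, edm:WebResource, ore:Aggregation) are at the core of the real model: even a single painting is described as three linked resources.]

```python
# A minimal EDM-shaped record: one object needs three linked resources.
# URIs and the institution name are invented; classes/properties are EDM's.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DC, RDF

EDM = Namespace("http://www.europeana.eu/schemas/edm/")
ORE = Namespace("http://www.openarchives.org/ore/terms/")

g = Graph()
g.bind("edm", EDM)
g.bind("ore", ORE)
g.bind("dc", DC)

# 1. The 'provided cultural heritage object' – the painting itself.
cho = URIRef("http://example.org/item/milkmaid")
g.add((cho, RDF.type, EDM.ProvidedCHO))
g.add((cho, DC.title, Literal("The Milkmaid")))
g.add((cho, DC.creator, Literal("Johannes Vermeer")))

# 2. A web resource – the digital image of the painting.
image = URIRef("http://example.org/image/milkmaid.jpg")
g.add((image, RDF.type, EDM.WebResource))

# 3. An aggregation tying object, image and data provider together.
agg = URIRef("http://example.org/aggregation/milkmaid")
g.add((agg, RDF.type, ORE.Aggregation))
g.add((agg, EDM.aggregatedCHO, cho))
g.add((agg, EDM.isShownBy, image))
g.add((agg, EDM.dataProvider, Literal("Example Museum")))

print(g.serialize(format="turtle"))
```

That's three resources and eight triples for a single object before any enrichment – whether that's necessary structure or over-engineering is exactly what question 5 is asking.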

Usability: the key that unlocks geeky goodness

This is a quick pointer to three posts about some usability work I did for the JISC-funded Pelagios project, and a reflection on the process. Pelagios aims to 'help introduce Linked Open Data goodness into online resources that refer to places in the Ancient World'. The project has already done a lot of great work with its various partners to bring different data sources together, but they wanted to find out whether the various visualisations (particularly the graph explorer) let users discover the full potential of the linked data sets.

On the project blog, I described how I worked out a testing plan to encourage user-centred design and set up the usability sessions in 'Evaluating Pelagios' usability', set out how a test session runs (with sample scripts and tasks) in 'Evaluating usability: what happens in a user testing session?', and finally posted some early Pelagios usability testing results. The results come from a very small sample of potential users, but they were consistent in the issues and positive findings they uncovered.

The wider lesson for LOD-LAM (linked open data in library, archives, museums) projects is that user testing (and/or a strong user-centred design process) helps general audiences (including subject specialists) appreciate the full potential of a technically-led project – without thoughtful design, the results of all those hours of code may go unloved by the people they were written for. In other words, user experience design is the key that unlocks the geeky goodness that drives these projects. It's old news, but the joy of user testing is that it reminds you of what's really important…