Usability: the key that unlocks geeky goodness

This is a quick pointer to three posts about some usability work I did for the JISC-funded Pelagios project, and a reflection on the process. Pelagios aims to ‘help introduce Linked Open Data goodness into online resources that refer to places in the Ancient World’. The project has already done great work with its various partners to bring many different data sources together, but they wanted to find out whether the various visualisations (particularly the graph explorer) let users discover the full potential of the linked data sets.

I posted on the project blog about how I worked out a testing plan to encourage user-centred design and set up the usability sessions in Evaluating Pelagios’ usability, set out how a test session runs (with sample scripts and tasks) in Evaluating usability: what happens in a user testing session? and finally I posted some early Pelagios usability testing results. The results are from a very small sample of potential users but they were consistent in the issues and positive results uncovered.

The wider lesson for LOD-LAM (linked open data in library, archives, museums) projects is that user testing (and/or a strong user-centred design process) helps general audiences (including subject specialists) appreciate the full potential of a technically-led project – without thoughtful design, the results of all those hours of code may go unloved by the people they were written for. In other words, user experience design is the key that unlocks the geeky goodness that drives these projects. It’s old news, but the joy of user testing is that it reminds you of what’s really important…

Museums and iterative agility: do your ideas get oxygen?

Re-visiting the results of the survey I ran about issues facing museum technologists has inspired me to gather together some great pieces I’ve read on museum projects moving away from detailed up-front briefs and specifications toward iterative and/or agile development.

In ‘WaterWorx – our first in-gallery iPad interactive at the Powerhouse Museum’, Seb Chan writes:

“the process by which this game was developed was in itself very different for us. … Rather than an explicit and ‘completed’ brief be given to Digital Eskimo, the game developed using an iterative and agile methodology, begun by a process that they call ‘considered design’. This brought together stakeholders and potential users all the way through the development process with ‘real working prototypes’ being delivered along the way – something which is pretty common for how websites and web applications are made, but is still unfortunately not common practice for exhibition development.”

I’d also recommend the presentation ‘Play at Work: Applying Agile Methods to Museum Website Development’ given at the 2010 Museum Computer Network Conference by Dana Mitroff Silvers and Alon Salant for examples of how user stories were used to identify requirements and prioritise development, and for an insight into how games can be used to get everyone working in an agile way.  If their presentation inspires you, you can find games to play with people to help everyone understand various agile, scrum and other project management techniques and approaches at tastycupcakes.com.

I’m really excited by these examples, as I’m probably not alone in worrying about the mismatch between industry-standard technology project management methods and museum processes. In a ‘lunchtime manifesto’ written in early 2009, I hoped the sector would be able to ‘figure out agile project structures that funders and bid writers can also understand and buy into’ – maybe we’re finally at that point.

And from outside the museum sector, a view on why up-front briefs don’t work for projects where user experience design is important.  Peter Merholz of Adaptive Path writes:

“1. The nature of the user experience problems are typically too complex and nuanced to be articulated explicitly in a brief. Because of that, good user experience work requires ongoing collaboration with the client. Ideally, client and agency basically work as one big team.

2. Unlike the marketing communications that ad agencies develop, user experience solutions will need to live on, and evolve, within the clients’ business. If you haven’t deeply involved the client throughout your process, there is a high likelihood that the client will be unable to maintain whatever you produce.”

Finally, a challenge to the perfectionism of museums.  Matt Mullenweg (of WordPress fame) writes in ‘1.0 Is the Loneliest Number’: ‘if you’re not embarrassed when you ship your first version you waited too long’. OK, so that might be a bit difficult for museums to cope with, but what if it was OK to release your beta websites to the public?  Mullenweg makes a strong case for iterating in public:

“Usage is like oxygen for ideas. You can never fully anticipate how an audience is going to react to something you’ve created until it’s out there. That means every moment you’re working on something without it being in the public it’s actually dying, deprived of the oxygen of the real world.

By shipping early and often you have the unique competitive advantage of hearing from real people what they think of your work, which in best case helps you anticipate market direction, and in worst case gives you a few people rooting for you that you can email when your team pivots to a new idea. Nothing can recreate the crucible of real usage.

You think your business is different, that you’re only going to have one shot at press and everything needs to be perfect for when Techcrunch brings the world to your door. But if you only have one shot at getting an audience, you’re doing it wrong.”

* The Merholz article above is great because you can play a fun game with the paragraph below – in your museum, what job titles would you put in place of ‘art director’ and ‘copywriter’?  Answers in a comment, if you dare!  I think it feels particularly relevant because of the number of survey responses that suggested museums still aren’t very good at applying the expertise of their museum technologists.

“One thing I haven’t yet touched on is the legacy ad agency practice where the art director and copywriter are the voices that matter, and the rest of the team exists to serve their bidding. This might be fine in communications work, but in user experience, where utility is king, this means that the people who best understand user engagement are often the least empowered to do anything about it, while those who have little true understanding of the medium are put in charge. In user experience, design teams need to recognize that great ideas can come from anywhere, and are not just the purview of a creative director.”


If you liked this post, you may also be interested in Confluence on digital channels; technologists and organisational change? (29 September 2012) and A call for agile museum projects (a lunchtime manifesto) (10 March 2009).

Edward Tufte on ‘Beautiful Evidence’

Tonight I went to see Edward Tufte lecture on ‘visual thinking and analytical design’ at the Royal Geographical Society. The room was packed, perhaps because lots of people were in town for UX London already. The event was organised by Intelligence Squared, who are apparently ‘dedicated to creating knowledge through contest’, which is an interesting goal in its own right.

It’s been a long week, so I suspect my notes are pretty sketchy, but I’m posting them in case they’re useful. Let me know if you spot any errors. On with the talk…

[His basic thesis:] The method of production interferes with production [of knowledge?].

We do most of our serious visual thinking inside, on unreal flatland [2D] screens, looking at representations of real things, instead of being outside looking at real things. In the real world, most of our thinking would be about way-finding, rather than deep analysis.

Evidence is evidence. Information doesn’t care what it is. The intellectual tasks remain constant regardless of the mode of production/consumption – we need evidence to understand and reason about the materials to hand. We might care about mode of production but we shouldn’t segregate the information by modes of production. We need content-oriented design.

In manuscripts, the hand directly integrates words and image. When the marks all come from the same source, e.g. the hand, the material is integrated. When technology has different modes of production, e.g. type and drawings, the content is segregated by mode of production.

[He showed a 9th-century centaur – the image was made of Latin words, a unification of text and image.] The visual meaning of the Latin centaur is still clear despite the language. The universality of images, the stupefying locality of languages. Images are cosmopolitan; words are local and parochial. When the language is unknown to the viewer, image and language separate into comprehensible image and incomprehensible language.

‘Content indifference’ is the result of teaching that only design matters. The essential test is how well each assists the understanding of the content, not how stylish it is. You must know the meaning of the content to design it.

The point of information display is to assist analytical thinking. One common task is to make comparisons. Take the intellectual task (e.g. ‘make smart comparisons’) and turn it into design principles. Tasks become instructions to the design. Otherwise it’s just based on fashion or the latest technology.

[He then had the lights turned up so people could view copies of Minard’s map of Napoleon’s Russian campaign in 1812. He used this to illustrate his Six Grand Principles of Analytical Design.]

  1. First Grand Principle of Analytical Design: show comparisons, contrasts, differences e.g. difference between those who left and those who returned.  Principles guide the design but also the content. A lot of his work is secretly about analytical thinking.
  2. Second Grand Principle of Analytical Design – show causality, dynamics, mechanisms, explanation. For policy thinking, intervention thinking – to produce an effect, you need to know about and govern the cause – show causality.  Derived this way, these are producer commandments. But as they’re derived from fundamental intellectual tasks they’re also consumer tasks – you should be asking, as an audience, what is your task?  Minard’s map shows causality with temperature.
  3. Third Grand Principle of Analytical Design – show multivariate data. Three or more factors or variables, i.e. show more than two variables. Reality is inherently multivariable. Minard showed six dimensions – the size of the army, direction, temperature, dates and location (latitude, longitude). It’s not about the method of display. The design is so good that it’s invisible. Good web design has to be enormously self-effacing. The task is for users to understand information, not admire the interface. People should be too busy going about their business to notice the design.
  4. Fourth Grand Principle of Analytical Design – the principle of mode indifference – completely integrate all content. No segregation by mode of content. Videos and tables should be embedded in text, not set aside with captions elsewhere. Information doesn’t care what it is; cognitive tasks don’t care what the mode of production is. Minard had paragraphs of text, statistical bits, annotations all over.  Cognitive styles in approaching material – really good analysts are indifferent to the mode of evidence; the spirit is ‘whatever it takes to explain it’. Driven by explanation. There is an enormous difference between process-driven and content-driven explanation. Academics make an industry from process-driven design, but it should be whatever it takes to explain it – not what’s lying around, not what I’m good at.
  5. Fifth Grand Principle of Analytical Design – document everything and tell people about it. Document sources, scales, missing data. It’s the credibility argument. There are two things you need to get across in a presentation – what the story is and why the audience should believe you. The audience has two tasks – trying to figure out the story and whether they can believe the presenter. An important way to have credibility is to have care and craft with respect to the data. Minard’s two paragraphs are about documentation – assumptions, scales of measurement. Minard was a great engineer who designed bridges and canals – this has the facticity of an engineer. Minard didn’t want quibbles about the facts – he wanted people to appreciate the disaster of war. It was meant as an anti-war poster right from the start. It was meant to show the horrors of war. Precision and accuracy in evidence help his credibility.  Documentation is part of the fundamental quality control mechanism for the preservation of credibility. As a consumer, you should be sceptical if people don’t say where the data came from, e.g. with URLs or full data sets. Very few people look at the original data, but making sure it’s available is important.  Cherry-picking is a big threat to the credibility of presenters – am I seeing the results of evidence, or of evidence selection? Providing sources is a way of showing you’re not a cherry-picker.  Overall, incompetence is more likely to be an explanation than conspiracy. Intimacy with the evidence helps convince of credibility.
  6. Sixth Grand Principle of Analytical Design – serious presentations largely stand or fall on the quality, relevance or integrity of the content. ‘Just fancy that.’ ‘If that’s not true of presentations where you work, maybe you want to work somewhere else.’  His great insight into design from his books is that content matters, but ‘it’s a shame we live in a world where that counts as an insight’. The best way you can improve your presentation is to improve the content.
  7. [Then I think a Seventh Grand Principle of Analytical Design snuck in:] We want, as much as we can, to show information adjacent in space rather than stacked in time. If information is stacked in time, e.g. one slide after another, you have to try to make comparisons between something that’s gone away and something you’re not seeing. Important comparisons should be made within the eye span. This seventh principle has all kinds of consequences for design. Power users have multiple monitors – trying to get more content real estate adjacent in space.

Digression: ‘strange word, users’ – only two industries describe their customers as ‘users’: the illegal drug industry and the computer industry.

The principles make you think about how you display information to viewers – put the data in front of the user, show the comparison. The human eye is really good at comparing things, so have content-rich screens and use just about every pixel you can to carry content.

Trust the users’ optical capacity. The New York Times has 400 links on its homepage. When looking for a model site for reporting transparency, he suggested news sites.

User testing is not ‘having some temps come in and look your screen over’ but rather how your website performs in the wild. We know the New York Times and Google News work because of the sheer number of visitors. Websites that are very successful in the wild provide models – look at the people who report, at first-rate news websites.

[He then turned to the sparklines handout.]

Bonus real live ‘Powerpoint sucks’ comment.
Sparklines – intense, high resolution display.
Graphics should have resolution of typography [not thickness of pencil]. ‘Graphics are no longer a special occasion.’ They can be anywhere that a word, image or number can be.

The grey band [in the handout] is based on the types of tasks that clinicians want to do – they don’t care about normal values, only exceptions. [The whole visualisation is based on the task, the user and the context.]

‘We want to be approximately right rather than accurately wrong’. An approximate answer to the right question is better than an accurate answer to the wrong question. [I think I misheard ‘accurately’ for ‘exactly’, judging by people’s tweets]

We’re subject to the recency effect – sparklines show most recent change but put it in context – patterns in change, other similarly shaped changes.

Data analytical design is about showing reality in flatland.

In response to questions: a lot of websites get corrupted because they’re pitching, bringing the ethics of the marketplace into the design. As important as the intensity of a good news website is its spirit – a reporting spirit, not a pitching one.

Question (on the current trend for data visualisation, e.g. artists animating visualisations): ‘I think they can do exactly what they please.’ One reason it’s alright is that they’re not making other claims about the content, just using it as a found object.

[Finally, out of interest – you could compare these written notes to this sketched version of his talk by @lucyjspence.]

Get thee to a wiki – the great API challenge in action

Help us work on an informal, lightweight way of devising shared data and API standards for museum and cultural heritage organisations – museum-api.pbwiki.com is open for business.

You could provide examples of APIs you’ve used or produced, share your experience as a consumer of web services, tell us about your collections.

Commenting on other people’s queries and content is an easy way to get started.  I’d particularly love to hear from curators and collections managers – we should be working together to enable greater access to collections.  If you check it out and none of it makes any sense – be brave and say so!  We should be able to explain what we’re doing clearly, or we’re not doing it right.

Some background: as announced on the nascent museumdev blog, the Science Museum is looking at releasing an API soon – it’ll be project-specific to start with, but we’re creating it with the intention of using that as an iterative testing and learning process to design an API for wider use. We could re-invent the wheel, but we’d rather make it easy for people to use what they’ve learnt using other APIs and other museum collections – the easiest way to do that is to work with other museums and developers. The Science Museum’s initial public-facing collections API will be used for a ‘mashup competition’ based on object metadata from our ‘cosmos and culture’ gallery.

Speaking of museumdev, I started it as somewhere I could ask questions, point people to discussions, and keep collections of links and stuff in development.  It’s also got random technical bits like ‘Tip of the Day: saving web.config as Unicode’ because I figure I might as well share my mistakes^H^H^H^H^H^H^H^H learning experiences in the hope that someone, somewhere, benefits.

A call for agile museum projects (a lunchtime manifesto)

Yet another conversation on Twitter about the NMOLP/Creative Spaces project led to a discussion of the long lead times for digital projects in the cultural heritage sector. I’ve worked on projects that were specced, and their goals agreed with funders, five years before delivery and two years before any technical or user-focussed specification or development began, and I wouldn’t be surprised if something similar happened with NMOLP.

Five years is obviously a *very* long time in internet time, though it’s a blink of an eye for a museum. So how do we work with that institutional reality? We need to figure out agile, adaptable project structures that funders and bid writers can also understand and buy into…

The initial project bid must be written to allow for implementation decisions that take the current context into account, and ideally a major goal of the bid-writing process should be finding points where existing infrastructure could be re-used. The first step for any new project should be a proper study of the needs of current and potential users in the context of the stated goals of the project. All schema, infrastructure and interface design decisions should have a link to one or more of those goals. Projects should be built around usability goals, not object counts or interface milestones set in stone three years earlier.

Taking institutional parameters into account is of course necessary, but letting them drive the decision making process leads to sub-optimal projects, so projects should have the ability to point out where institutional constraints are a risk for the project. Constraints might be cultural, technical, political or collections-related – we’re good at talking about the technical and resourcing constraints, but while we all acknowledge the cultural and political constraints it often happens behind closed doors and usually not in a way that explicitly helps the project succeed.

And since this is my lunchtime dream world, I would like plain old digitisation to be considered sexy without the need to promise funders more infrastructure they can show their grandkids.

We also need to work out project models that will get buy-in from contractors and third-party suppliers. As Dan Zambonini said, ‘“Usability goals” sounds like an incredibly difficult thing to quantify’, so existing models like Agile/sprint-esque ‘user stories’ might be easier to manage.

We, as developers, need to create a culture in which ‘failing intelligently’ is rewarded. I think most of us believe in ‘failing faster to succeed sooner’, at least to some extent, but we need to think carefully about the discourse around public discussions of project weaknesses or failures if we want this to be a reality. My notes from Clay Shirky’s ICA talk earlier this year say that the members of the Invisible College (a society which aimed to ‘acquire knowledge through experimental investigation’) “went after alchemists for failing to be informative when they were wrong” – ” it was ok to be wrong but they wanted them to think about and share what went wrong”. They had ideas about how results should be written up and shared for maximum benefit. I think we should too.

I think the MCG and Collections Trust could both have a role to play in advocating more agile models to those who write and fund project bids. Each museum also has a responsibility to make sure projects it puts forward (whether singly or in a partnership) have been reality checked by its own web or digital specialists as well as other consultants, but we should also look to projects and developers (regardless of sector) that have managed to figure out agile project structures that funders and bid writers can also understand and buy into.

So – a blatant call for self-promotion – if you’ve worked on projects that could provide a good example, written about your dream project structures, know people or projects that’d make a good case study – let me know in the comments.

Thanks also to Mike, Giv and Mike, Daniel Evans (and the MCG plus people who listened to me rant at dev8D in general) for the conversations that triggered this post.


If you liked this post, you may also be interested in Confluence on digital channels; technologists and organisational change? (29 September 2012) and Museums and iterative agility: do your ideas get oxygen? (21 November 2010).

Social Media Statistics

One of those totally brilliant and obvious-in-hindsight ideas. I’d like to see stronger guidelines on citing sources as it grows, and clear differentiation by region/nation, because it’s easy for vague figures and rumour to become universal ‘fact’ – but it’s a great idea and will hopefully grow. In its own words, Social Media Statistics is:

A big home for all facts and figures around social media – because I’m fed up of trawling around for them and I’m also sure that I’m not the only one who gets asked ‘how many users does Facebook have?’ every hour of every day. … I’m hoping that this wiki will not only include usage stats, but also behaviour and attitude stats. It’s a bit of a skeleton at the moment, with v few of my stats having stated sources, but be patient – and help where you can!

Please add in any juicy stats as you come across them, and do cite your references and link to them where possible.

I’ll put my money where my mouth is and add information I find. I find wikis a really useful tool for lightweight documentation – it’s really easy to add some information while it’s in your brain, and the software doesn’t get in the way of your flow.

For a while now I’ve wanted a repository of museum and cultural heritage audience evaluation – this could be a good model. Speaking of which, I really must write up my notes from the MCG Autumn meeting.

[Edit to add: Social Media Statistics also links to Measurementcamp, which might be of interest to cultural heritage organisations wondering how they can ‘measure their social media communications online and offline’ (and how they can work with project sponsors and funders to define suitable metrics for an API’d, social media world).]

WCAG 2.0 is coming!

That’d be the ‘Web Content Accessibility Guidelines 2.0’ – a ‘wide range of recommendations for making Web content more accessible’ with success criteria ‘written as testable statements that are not technology-specific’ (i.e. possibly including JavaScript or Flash as well as HTML and CSS, but the criteria are still sorted into A, AA and AAA).

Putting that in context, a blog post on webstandards.org, ‘WCAG 2 and mobileOK Basic Tests specs are proposed recommendations’, says:

It’s possible that WCAG 2 could be the new accessibility standard by Christmas. What does that mean for you? The answer: it depends. If your approach to accessibility has been one of guidelines and ticking against checkpoints, you’ll need to do some reworking of your test plans as the priorities, checkpoints and surrounding structures have changed from WCAG 1. But if your site was developed with an eye to real accessibility for real people rather than as a compliance issue, you should find that there is little difference.

How to Meet WCAG 2.0 (currently a draft) provides a ‘customizable quick reference to Web Content Accessibility Guidelines 2.0 requirements (success criteria) and techniques’, and there are useful guidelines on Accessible Forms using WCAG 2.0, with practical advice on, for example, associating labels with form inputs. More resources are listed at WCAG 2.0 resources.
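As a concrete (and deliberately simplified) illustration of the ‘associating labels with form inputs’ advice, here’s the kind of automated check you could sketch over an HTML fragment using nothing but Python’s standard library. The function name and sample markup are my own illustration, not part of WCAG 2.0 itself, and real conformance testing covers far more than this single technique:

```python
# Minimal sketch: flag <input> elements in an HTML fragment that have no
# associated <label>, either via a matching label for="..."/input id pair
# (explicit association) or by being nested inside a <label> (implicit).
# Illustrative only -- not a substitute for full WCAG conformance testing.
from html.parser import HTMLParser


class LabelChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.label_targets = set()  # ids referenced by <label for="...">
        self.inputs = []            # (id or None, was inside a <label>)
        self._label_depth = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "label":
            self._label_depth += 1
            if "for" in attrs:
                self.label_targets.add(attrs["for"])
        elif tag == "input" and attrs.get("type") != "hidden":
            self.inputs.append((attrs.get("id"), self._label_depth > 0))

    def handle_endtag(self, tag):
        if tag == "label":
            self._label_depth -= 1


def unlabelled_inputs(html):
    """Return the ids of visible inputs with no label association."""
    checker = LabelChecker()
    checker.feed(html)
    return [
        input_id
        for input_id, nested in checker.inputs
        if not nested and input_id not in checker.label_targets
    ]
```

In use, an input paired with a label via `for`/`id`, or wrapped inside a `<label>`, passes; `unlabelled_inputs('<input id="email" type="text">')` would flag `email`.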

I’m impressed with the range and quality of documentation – they are working hard to make it easy to produce accessible sites.

UKOLN’s one-stop shop ‘Cultural Heritage’ site

I’ve been a bad blogger lately (though I do have some good excuses*), so to make up for it, here’s an interesting new resource from UKOLN – their Cultural Heritage site provides a single point of access to ‘a variety of resources on a range of issues of particular relevance to the cultural heritage sector’.

Topics currently include ‘collection description, digital preservation, metadata, social networking services, supporting the user experience and Web 2.0’. Usefully, the site includes IntroBytes – short briefing documents aimed at supporting use of networked technologies and services in the cultural heritage sector and an Events listing. Most sections seem to have RSS feeds, so you can subscribe and get updates when new content or events are added.

* Excuses include: (offline) holidays, Virgin broadband being idiots, changing jobs (I moved from the Museum of London to an entirely front-end role at the Science Museum) and I’ve also just started a part-time MSc in Human-Centred Systems at City University’s School of Informatics.

20% time – an experiment (with some results)

A company called Atlassian have been experimenting with allowing their engineers 20% of their time to work on free or non-core projects (a la Google). They said:

You see, while everyone knows about Google’s 20% time and we’ve heard about all the neat products born from it (Google News, GMail etc) – we’ve found it extremely difficult to get any hard facts about how it actually works in practice.

So they started with a list of questions they wanted to answer through their experiment, and they’ve been blogging about it at http://blogs.atlassian.com/developer/20_percent_time/. It makes for interesting reading, and it’s great to see some real evidence starting to emerge.

Hat tip: Tech-Ed Collisions.