More on Web 2.0 post-CAA

Some more quick thoughts as conversations I had at and after CAA UK settle into my brain. This doesn't really apply to anyone I talked to there, but as a general rule I think it's worth saying:

Don't chase the zeitgeist. It's not a popularity contest and it's not a race to see who can cram the most buzzwords into their site.

Also, here's a link to the blog of the AHRC-funded Semantic Web Think Tank I mentioned, and the original announcements about the SWTT.

Finally, what's hopefully a quite useful link for those considering official institutional blogs: Sample guidelines for institutional blog authors.

Via the Museums Computer Group list, the Emerging Technologies Initiative 2007 Horizon Report has just been released. It "highlights six technologies that the underlying research suggests will become very important to higher education over the next one to five years. A central focus of the discussion of each technology is its relevance for teaching, learning, and creative expression. Live weblinks to example applications are provided in each section, as well as to additional readings."

CAA UK 2007 Chapter Meeting

Last week I went to the Computer Applications and Quantitative Methods in Archaeology (CAA) UK 2007 Chapter Meeting in Southampton. There was a range of interesting papers, and it was really exciting to talk to people with similar passions.

I managed to overrun and didn't get to the last few slides of my paper, which were some random suggestions for cultural heritage organisations looking to get started with Web 2.0. They're based on the assumption that resources are limited, so the basic model I've suggested is to think about why you're doing it and who you're doing it for, then start with something small. I'd also suggest matching the technology to your content, using applications that meet existing standards to avoid lock-in, backing up your data regularly (including user-generated content), and taking advantage of existing participation models, particularly from commercial sites that have user interface and information architecture specialists.

  • Start small, monitor usage and build on the response
  • Design for extensibility
    • Sustainable
    • Interoperable
    • Re-usable
  • Use existing applications, services, APIs, architectures, design patterns wherever possible
  • Embrace your long tail
  • It's easy and free (or cheap) to create a blog or a Flickr account to test the waters
  • Investigate digitising and publishing existing copyright-free audio or video content as a podcast or on YouTube
  • Add your favourite specialist sites to a social bookmarking site
  • Check out MySpace or Second Life to see where your missing users hang out
  • Publish your events data in the events microformat so your events can be included in social event sites (see the sketch after this list)
  • Geotag photos and publish them online
  • Or just publish photos on Flickr and watch to see if people start creating a folksonomy for you
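
To make the microformat suggestion a little more concrete, here's a minimal sketch of an event marked up with the hCalendar microformat, generated by a small Python helper. The event details (name, date, venue, URL) are invented for illustration; only the class names (vevent, summary, dtstart, location, url) come from the microformat itself.

```python
# Minimal sketch: wrap basic event details in hCalendar microformat markup
# so aggregators and social event sites can pick them up.
# The event details below are invented for illustration.

def hcalendar_event(summary, start_iso, start_human, location, url):
    """Return an HTML snippet describing one event as an hCalendar 'vevent'."""
    return (
        '<div class="vevent">\n'
        f'  <a class="url summary" href="{url}">{summary}</a>\n'
        f'  <abbr class="dtstart" title="{start_iso}">{start_human}</abbr>,\n'
        f'  <span class="location">{location}</span>\n'
        '</div>'
    )

print(hcalendar_event(
    summary="Late view: behind the scenes tour",
    start_iso="2007-03-30T18:30",
    start_human="30 March 2007, 6.30pm",
    location="Example Museum, London",
    url="http://www.example.org/events/late-view",
))
```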

Some of the history of the Catalhoyuk database

I was going to post this on the Catalhoyuk blog but authentication isn't working right now. So, I'll post it here and move it over when it's working again.

Just in case you thought nothing happened during the off-season…

A lot of this information is contained in the Archive Reports, but as the audience for those is probably more specialised than the average reader of this blog, I thought it might be interesting to cover some of it here.

When MoLAS first became involved with the project, there were lots of isolated Microsoft Access 2000 databases for excavation, finds and specialist data. The original database design was sound, and much valuable work had already been done on it. However, some problems had arisen over the years as the database grew and different specialists brought their own systems, based on a mixture of applications and platforms.

It was difficult for specialist databases to use live field or excavation data because it wasn't available in a single central source. It had also become almost impossible to run queries across excavation seasons or areas, or to produce multi-disciplinary analyses, as there were disparate, unrelated databases for each area of study. Within many specialisms the data set had been broken up into many different files – for example, the excavation database was split by team, and some teams were creating separate files for different years.

In many cases, referential integrity was not properly enforced in the interface or the database structure. While the original database structures included tables to supply lists of values for controlled vocabularies, the data entry interfaces used static rather than dynamic menus. Primary and/or foreign keys were not implemented in some databases, leading to the possibility of duplicate entries, anomalous data or incorrect codes being recorded. There was little or no validation on data entry.
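
For readers less familiar with those terms, here's a minimal sketch of what database-level referential integrity buys you: a lookup table supplies the controlled vocabulary, and a foreign key stops anything else being recorded. It uses Python's built-in sqlite3 module and invented table names purely for illustration; the real project used Access front ends over SQL Server, not SQLite.

```python
# Minimal sketch of database-level referential integrity: a lookup table
# supplies the controlled vocabulary, and a foreign key rejects anything else.
# Table and column names are invented for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per connection

conn.executescript("""
CREATE TABLE unit_category (
    code TEXT PRIMARY KEY            -- controlled vocabulary of allowed codes
);
CREATE TABLE excavation_unit (
    unit_number INTEGER PRIMARY KEY, -- no duplicate unit records
    category    TEXT NOT NULL REFERENCES unit_category(code)
);
INSERT INTO unit_category (code) VALUES ('cluster'), ('fill'), ('skeleton');
""")

# Valid entry: the category exists in the lookup table.
conn.execute("INSERT INTO excavation_unit VALUES (1001, 'fill')")

# Invalid entry: an unknown code is rejected by the database itself,
# not just by a menu on the data entry form.
try:
    conn.execute("INSERT INTO excavation_unit VALUES (1002, 'flil')")
except sqlite3.IntegrityError as err:
    print("Rejected:", err)
```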

IBM generously donated two new servers, one for use on site and the other for the Cambridge office, so we were able to install Microsoft SQL Server 2000 as a single backend database and start re-centralising the databases: re-combining the disparate datasets into a single, central database, and reconfiguring the Access forms to use the new centralised backend.

Centralising and cleaning the data and interfaces was a bit of a slog (covered in more detail in the archive reports), and even now there are still bits and pieces to be done. I guess this shows the importance of proper database design and documentation, even when you think a project is only going to be small. I'm sure there was documentation originally, so I guess this also shows the importance of a good archiving system!

Unfortunately, the 'business logic' of the database applications wasn't documented (or, if it had been, the documentation was lost over time), so we couldn't re-build the existing forms in another application (such as web forms) without losing all the validation and data entry rules that had been built up in response to the specialists' requirements. As usual in the world of archaeology, limited resources meant re-capturing those rules wasn't possible at that stage. A lot of the application logic was held in the interfaces rather than in the relationships between tables, which meant a lot of data cleaning had to be done when centralising the databases and enforcing relationships.
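
To make that last point a little more concrete, here's a hedged sketch of the typical first step in that cleaning: finding the 'orphaned' rows whose codes have no matching parent record, which have to be fixed before a relationship can be enforced. Again this uses Python's sqlite3 and invented names rather than the project's actual schema.

```python
# Sketch of a typical pre-cleaning check: before a foreign key can be
# enforced, find child rows whose codes don't match any parent record.
# Table and column names are invented; the real schema differed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE unit_category (code TEXT PRIMARY KEY);
CREATE TABLE excavation_unit (unit_number INTEGER, category TEXT);
INSERT INTO unit_category VALUES ('cluster'), ('fill');
INSERT INTO excavation_unit VALUES (1001, 'fill'), (1002, 'flil'), (1003, NULL);
""")

orphans = conn.execute("""
    SELECT u.unit_number, u.category
    FROM excavation_unit AS u
    LEFT JOIN unit_category AS c ON u.category = c.code
    WHERE c.code IS NULL          -- no matching vocabulary entry (or a NULL code)
""").fetchall()

for unit_number, category in orphans:
    print(f"Needs cleaning before the constraint can be added: {unit_number} ({category!r})")
```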

As the 2004 Archive Report says, "The existing infrastructure was Microsoft Access based, and after consideration for minimal interruption to existing interfaces, and for the cost to the project of completely redeveloping the forms on another platform, these applications were retained."

Luckily, we're not tied to Access for new application development, and new and future database applications are being built as web (HTML) interfaces, eliminating platform and operating system compatibility issues.

This means that we can get on with more exciting things in the future! I'll post about some of those ideas soon.

In the meantime, check out the public version of the web interface to the Çatalhöyük database.

[Originally published on http://www.catalhoyuk.com/blog/, January 24, 2007]

Notes on usability testing

Further to my post about the downloadable usability.gov guidelines, I've picked out the bits from the chapter on 'Usability Testing' that are relevant to my work, but it's worth reading the whole chapter if you're interested. My comments or headings are in square brackets below.

"Generally, the best method is to conduct a test where representative participants interact with representative scenarios.

The second major consideration is to ensure that an iterative approach is used.

Use an iterative design approach

The iterative design process helps to substantially improve the usability of Web sites. One recent study found that the improvements made between the original Web site and the redesigned Web site resulted in thirty percent more task completions, twenty-five percent less time to complete the tasks, and sixty-seven percent greater user satisfaction. A second study reported that eight of ten tasks were performed faster on the Web site that had been iteratively designed. Finally, a third study found that forty-six percent of the original set of issues were resolved by making design changes to the interface.

[Soliciting comments]

Participants tend not to voice negative reports. In one study, when using the 'think aloud' [as opposed to retrospective] approach, users tended to read text on the screen and verbalize more of what they were doing rather than what they were thinking.

[How many user testers?]

Performance usability testing with users:
– Early in the design process, usability testing with a small number of users (approximately six) is sufficient to identify problems with the information architecture (navigation) and overall design issues. If the Web site has very different types of users (e.g., novices and experts), it is important to test with six or more of each type of user. Another critical factor in this preliminary testing is having trained usability specialists as the usability test facilitator and primary observers.
– Once the navigation, basic content, and display features are in place, quantitative performance testing … can be conducted

[What kinds of prototypes?]

Designers can use either paper-based or computer-based prototypes. Paper-based prototyping appears to be as effective as computer-based prototyping when trying to identify most usability issues.

Use inspection evaluation [and cognitive walkthroughs] results with caution.
Inspection evaluations include heuristic evaluations, expert reviews, and cognitive walkthroughs. It is a common practice to conduct an inspection evaluation to try to detect and resolve obvious problems before conducting usability tests. Inspection evaluations should be used cautiously because several studies have shown that they appear to detect far more potential problems than actually exist, and they also tend to miss some real problems.

Heuristic evaluations and expert reviews may best be used to identify potential usability issues to evaluate during usability testing. To improve somewhat on the performance of heuristic evaluations, evaluators can use the 'usability problem inspector' (UPI) method or the 'Discovery and Analysis Resource' (DARe) method.

Cognitive walkthroughs may best be used to identify potential usability issues to evaluate during usability testing.

Testers can use either laboratory or remote usability testing because they both elicit similar results.

[And finally]

Use severity ratings with caution."

Useful background on usability testing

I came across www.usability.gov while looking for some background information on usability testing to send to the colleagues I'm planning some user evaluation with. It looks like a really useful resource for all stages of a project, from planning to deployment.

Their guidelines are available to download in PDF form, either as the entire book or as specific chapters.