While there’s good news for the Museum of London, overall the cultural heritage sector in the UK is about to suffer. As the 24 Hour Museum puts it:
While the round of government grants, now in its sixth year, is welcomed by the country’s museums and galleries, trepidation still hangs in the air as severe cuts are due to come into force in Heritage Lottery and Arts Council funding.
The main reason for this is money being diverted for the Olympics – the HLF has lost £233m to the Olympic fund up to 2012.
Lottery grants for projects exceeding £5 million have been slashed from £80 million in 2006/07 to half that in 2007/08, and to £20 million in 2008/09. Big handouts have helped projects like the £10m York Minster restoration this year, but commentators say hard decisions will have to be made over future applications of the same significance. The HLF budget for smaller projects has also been reduced.
From DCMS Wolfson Fund Announces £4m For Museums And Galleries.
Data Visualization: Modern Approaches presents the “most interesting modern approaches to data visualization” for displaying mind maps, news, data, connections, websites, articles and resources and tools and services.
This will only be relevant to the archaeologists, I guess, but it has occurred to me to ask – what would you like to see in the Catalhoyuk archive reports? What information would either be useful or satisfy your curiosity?
In a wider sense, what can we (as IT geeks in the cultural heritage sector) learn from each other? What are we too scared to ask in case it’s a stupid question, or because it seems too obscure? What don’t we share because we assume that everyone else knows it already?
A recent Alertbox talked about Banner Blindness: Old and New Findings:
The most prominent result from the new eyetracking studies is not actually new. We simply confirmed for the umpteenth time that banner blindness is real. Users almost never look at anything that looks like an advertisement, whether or not it’s actually an ad.
The heatmaps also show how users don’t fixate within design elements that resemble ads, even if they aren’t ads.
I guess the most interesting thing about the post is that it acknowledges that unethical methods attract the most eyeballs:
In addition to the three main design elements that occasionally attract fixations in online ads, we discovered a fourth approach that breaks one of publishing’s main ethical principles by making the ad look like content:
- The more an ad looks like a native site component, the more users will look at it.
- Not only should the ad look like the site’s other design elements, it should appear to be part of the specific page section in which it’s displayed.
This overtly violates publishing’s principle of separating “church and state” — that is, the distinction between editorial content and paid advertisements should always be clear. Reputable newspapers don’t allow advertisers to mimic their branded typefaces or other layout elements.
I think it’s particularly important that we don’t allow commercial considerations to damage our users’ trust in cultural heritage institutions as repositories of impartial* knowledge. We’ve developed models for differentiating user- and museum-generated content and hopefully quelled fears about user-generated content somehow damaging or diluting museum content; it would be a shame if we lost that trust over funding agreements.
* insert acknowledgement of the impossibility of truly impartial cultural content.
In a post titled What is Web 3.0?, Nicholas Carr said:
“Web 3.0 involves the disintegration of digital data and software into modular components that, through the use of simple tools, can be reintegrated into new applications or functions on the fly by either machines or people.”
And recently I went to a London Geek Girl Dinner, where Paul Amery from Skype (who hosted the event) said:
“the next big step forward in software is going to be providing the plumbing, to provide people what they want, where they want …start thinking about plumbing all this software together, joining solutions together… mashups are just the tip of the iceberg”.
So why does that matter to us in the cultural heritage sector? Without stretching the analogy too far, we have two possible roles: one is to provide the content that flows through the pipes, ensuring we use plumbing-compatible tubes so that other people can plumb our content into new applications; the other is to build applications ourselves, using both our own data and others’. I think we’re brilliant content producers, and we’re getting better at providing re-usable data sources – but we often don’t have the resources to do cool things with them ourselves.
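As a minimal sketch of that first role – exposing collection content in a re-usable, machine-readable form that others can plumb into their own applications – something like the following would do. Everything here is hypothetical: the records, identifiers and URLs are invented for illustration, not taken from any real museum dataset.

```python
import json

# Hypothetical collection records - purely illustrative, not a real dataset.
records = [
    {"id": "OBJ-001", "title": "Roman oil lamp", "period": "Roman"},
    {"id": "OBJ-002", "title": "Medieval pilgrim badge", "period": "Medieval"},
]

def to_feed(records, base_url="http://example.org/objects/"):
    """Serialise records as a simple JSON feed that third parties
    could 'plumb' into their own mashups and applications."""
    return json.dumps(
        {"items": [dict(r, url=base_url + r["id"]) for r in records]},
        indent=2,
    )

print(to_feed(records))
```

The point isn’t the format (it could as easily be RSS or OAI-PMH); it’s that once the content is addressable and machine-readable, other people can do the cool things with it even when we can’t.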
Maybe what I’m advocating is giving geeks in the cultural heritage sector the time to spend playing with technology and supplying the tools for agile development. Or maybe it’s just the perennial cry of the backend geek who never gets to play with the shiny pretty things. I’m still thinking about this one.
This post on ‘What comes after post-processualism’ caught my eye, I guess because I have a fascination with the ways in which archaeological theory affects database design and digitisation strategies. I work both with contract archaeologists and on a post-processual site, and the structural requirements are quite different, though both fundamentally rely on single context recording.
We have to face the fact that archaeological theory is quite simply no longer at the heart of archaeology, as it perhaps was from the 1960s until the end of the 1980s.
Instead we have seen over the last few decades an enormous expansion of commercial archaeology, now controlling far more funding than the Universities and responsible for the lion’s share of archaeological research. We may or may not like that fact and what it led to in terms of research results, but commercial archaeology is undeniably today a far bigger player in the discipline than its poor sibling, University-based research.
I’m sure it’ll be eons before it trickles down into the museum sector, but it’s an interesting change:
Nielsen/NetRatings to use total time spent by users of a site as its primary measurement metric
In a nod to the success of emerging Web 2.0 technologies like AJAX and streaming media, one of the country’s largest Internet benchmarking companies will no longer use page views as its primary metric for comparing sites.
Nielsen/NetRatings will announce Tuesday that it will immediately begin using total time spent by users of a site as its primary measurement metric.
Nielsen/NetRatings will still report page views as a secondary metric, and it will continue to reevaluate its primary metric as technology continues to evolve, Ross added. “For the foreseeable future, we will champion minutes if you are comparing two sites. Going forward, we’ll see what that equates to in terms of true advertising opportunity,” he said.
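To see why the switch matters, here’s a toy comparison with invented session numbers: an AJAX-heavy site can come bottom by page views but top by minutes, because interaction happens without fresh page loads.

```python
from collections import defaultdict

# Invented session logs: (site, page_views, minutes_spent).
sessions = [
    ("classic-site", 12, 3),  # lots of page reloads, brief visits
    ("classic-site", 8, 2),
    ("ajax-site", 2, 15),     # few page loads, long engagement
    ("ajax-site", 3, 20),
]

def totals(sessions):
    """Aggregate both metrics per site."""
    views, minutes = defaultdict(int), defaultdict(int)
    for site, pv, mins in sessions:
        views[site] += pv
        minutes[site] += mins
    return dict(views), dict(minutes)

views, minutes = totals(sessions)
top_by_views = max(views, key=views.get)        # 'classic-site' (20 vs 5 views)
top_by_minutes = max(minutes, key=minutes.get)  # 'ajax-site' (35 vs 5 minutes)
```

Which metric you champion completely changes which site ‘wins’ – hence Nielsen/NetRatings’ move.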
I’ve been wondering how long it would take for a meta-social networking site to emerge (or whether I should create one, thereby making millions), allowing you to maintain active accounts on Facebook, MySpace, etc., with one single interface to read and post messages and comments, but of course Wired got there first. Sorta.
And yes, I did mean to post that many months ago! But it’s still relevant because interoperability is only going to become more important in the social networking world.
This post on the Gartner “Hype Cycle for Emerging Technologies 2007” report includes the familiar Gartner Hype Cycle diagram, updated for 2007, which is more than you’ll get from the Gartner site (for free, anyway).
Common Craft have produced videos on RSS in Plain English, Social Bookmarking in Plain English, Wikis in Plain English and Social Networking in Plain English (via Groundswell).
Also worth a look, Google Code for Educators “provides teaching materials created especially for CS educators looking to enhance their courses with some of the most current computing technologies and paradigms”. They say, “[w]e know that between teaching, doing research and advising students, CS educators have little time to stay on top of the most recent trends. This website is meant to help you do just that” and it looks like it might also be useful for busy professionals who want to try new technologies they don’t get time to play with in their day jobs (via A Consuming Experience).
Also from A Consuming Experience, a report on a talk on “5 secrets of successful Web 2.0 businesses” at the June London Geek Dinner.
On a random note, I noticed that the BBC have added social bookmarking to their news site:
I wonder if this marks the ‘mainstreaming’ of social bookmarking.
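Under the hood, these ‘bookmark this’ links are usually nothing more exotic than a URL with the story’s address and title passed as query parameters. A rough sketch – the service endpoint and story URL below are made up, and each real bookmarking service has its own endpoint and parameter names:

```python
from urllib.parse import urlencode, parse_qs, urlparse

def bookmark_link(service_base, page_url, title):
    """Build a 'post to your bookmarking service' link.
    service_base is a placeholder; real services each have their own
    endpoint and parameter names."""
    return service_base + "?" + urlencode({"url": page_url, "title": title})

link = bookmark_link(
    "http://bookmarks.example.org/post",  # made-up endpoint
    "http://news.example.org/story/42",   # made-up story URL
    "Example story title",
)
```

Simple enough that any site could add it, which is presumably why the BBC did.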
EU OKs German Online Search-Engine Grant
The European Union on Thursday authorized Germany to give $165 million for research on Internet search-engine technologies that could someday challenge U.S. search giant Google Inc.
The Theseus research project — the German arm of what the French call Quaero — is aiming to develop the world’s most advanced multimedia search engine for the next-generation Internet. It would translate, identify and index images, audio and text.
Fragmented European research efforts are one of the reasons blamed for the region lagging behind the United States in information technology. European companies in general spend far less on research than those based in other parts of the world, and the EU said the project should help change that.
I wonder how they’ll identify and weight or rank European content. And will it be tied in with the European Digital Library?