It is widely acknowledged that connecting science with the public is a must, and many organisations put significant resources into doing so, but how can we know when these efforts are successful? This article looks at the European Space Agency’s outreach activities for the Hubble Space Telescope to give some guidelines on how best to evaluate the success of science communication activities.
Outreach for ESA/Hubble takes a number of forms. The primary channels are press releases (covering both new science results and newly processed images), the spacetelescope.org website, video podcasts, and social networks for community engagement. Because information flows through modern society in complex ways, the dividing lines between these channels are often indistinct: press releases are read by the public on our website; videos from podcasts are widely used by broadcasters; social media reaches opinion leaders and journalists.
Different types of communication have to be assessed in different ways, and it is important to understand that evaluation is not an exact science, but rather a mechanism to direct us towards what is most effective. The impact of a web page can easily be measured through visitor numbers, while a video podcast which gets shared and re-posted virally is much harder to assess. This is not a weakness, but simply a recognition that we have to do our job with imperfect information at our disposal. Nevertheless, with this broad range of indicative data we can still get a good handle on what works and what doesn’t.
Monitoring the effectiveness of the ESA/Hubble website at www.spacetelescope.org using Google Analytics is arguably the most straightforward element of our evaluation procedures. This gives a wealth of information: not just reader numbers for individual pages but also the paths readers take through the website, their geographical location, technical details, and many other metrics. We monitor the impact of press releases both quantitatively and qualitatively. The quantitative approach combines three measures: the number of times a story’s keywords appear in Meltwater, an online news monitoring service (a proxy for how often a story has been picked up); page views on our website as measured by Google Analytics (a proxy for public interest); and hits in EurekAlert, a science press release service (a proxy for journalists’ interest in a story).
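As an illustration, the three proxies could be rolled into a single comparative score per release. The Python sketch below is a minimal example assuming the figures have been exported by hand; the weights and the log-scaling are illustrative choices, not ESA/Hubble’s actual formula.

```python
import math

def impact_score(meltwater_mentions, ga_page_views, eurekalert_hits,
                 weights=(0.4, 0.4, 0.2)):
    """Combine three impact proxies into one comparable number.

    Log-scaling damps the effect of a single runaway metric, so a
    release with balanced pickup scores well against one viral spike.
    The weights are illustrative assumptions.
    """
    proxies = (meltwater_mentions, ga_page_views, eurekalert_hits)
    return sum(w * math.log10(1 + p) for w, p in zip(weights, proxies))

# Hypothetical figures for two releases:
print(impact_score(120, 45000, 30))  # broad pickup across all channels
print(impact_score(15, 90000, 5))    # web-heavy, little press interest
```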
These figures vary considerably between press releases, and the impact is not necessarily immediate or predictable. While interest typically peaks in the few days following a release, a story can sometimes enjoy a second burst of life, for example when its images are reused later in a documentary film or book, something we encourage with our liberal copyright policy.
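Such delayed spikes can be flagged automatically in a release’s daily page-view series. The following sketch is a rough illustration, assuming daily figures exported from Google Analytics; the seven-day launch window and the three-times-baseline threshold are made-up parameters.

```python
def late_peaks(daily_views, launch_window=7, threshold=3.0):
    """Return (day_index, views) for days after the launch window whose
    traffic exceeds `threshold` times the post-launch median."""
    tail = sorted(daily_views[launch_window:])
    if not tail:
        return []
    baseline = tail[len(tail) // 2]  # median of post-launch traffic
    return [(i, v) for i, v in enumerate(daily_views)
            if i >= launch_window and v > threshold * max(baseline, 1)]

# A release that spikes again weeks later, e.g. after a documentary airs:
views = [9000, 7000, 4000, 2000, 800] + [150] * 34 + [1200, 900]
print(late_peaks(views))  # -> [(39, 1200), (40, 900)]
```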
Qualitative monitoring of press release impact is by its nature less complete, but it gives a complementary view. We use Meltwater’s online news monitoring, together with monitoring of print magazines and newspapers, to build up a library of press clippings. This serves both as an archive and as a form of feedback on the effectiveness of our press releases.
Since 2007, ESA/Hubble has produced the Hubblecast, one of the first HD video podcasts. Hubblecasts are distributed through the iTunes store and on video sharing websites. Rankings from iTunes are easy to get hold of — the software offers a ‘most popular’ selection both for podcast series and for individual podcast episodes in any of the store’s subject areas. Hubblecast, along with several other astronomy podcasts, exists in several versions (full HD, HD and SD), which are counted separately in iTunes. While this means we cannot produce a single headline figure, the improved service to the end user far outweighs this disadvantage for evaluation.
These iTunes statistics give an idea of relative popularity, but not of absolute numbers. While we can in principle track these via Google FeedBurner, its subscriber figures are based on the number of users whose client software checks our feed each day, not on a list of actual subscribers. FeedBurner’s measure of ‘reach’ (downloads per day) is better, but still problematic, as podcast software automatically downloads new episodes regardless of whether they are ever watched.
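Any headline figure derived from ‘reach’ therefore needs discounting. The short sketch below makes that correction explicit; the 60% watch-through rate is purely an assumption for illustration, not a measured value.

```python
def estimated_viewers(feedburner_reach, watch_rate=0.6):
    """Discount raw daily downloads by an assumed watch-through rate,
    since podcast clients fetch episodes whether or not they are watched."""
    return int(feedburner_reach * watch_rate)

print(estimated_viewers(25000))  # hypothetical daily reach -> 15000
```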
Performance of the Hubblecast on YouTube can give a clearer idea of public reception, partly through viewing figures, but also through user comments. Here, though, our copyright policy, which encourages reuse of our materials, is a hindrance to analysing impact: the vast majority of views of Hubblecasts on YouTube come not through the official channel, but through users who legally upload the videos to their own accounts and share them themselves.
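One workaround is to sum the views of the official upload with those of known re-uploads. The sketch below uses the YouTube Data API v3 (via the google-api-python-client library); the list of video IDs is assumed to have been compiled by hand, since finding every re-post automatically is precisely the hard part.

```python
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # placeholder
# Hypothetical IDs: the official upload plus two known re-uploads.
VIDEO_IDS = ["officialVideoId", "reupload1Id", "reupload2Id"]

def total_views(video_ids):
    """Sum view counts across official and re-uploaded copies of a video."""
    youtube = build("youtube", "v3", developerKey=API_KEY)
    response = youtube.videos().list(
        part="statistics", id=",".join(video_ids)  # up to 50 IDs per call
    ).execute()
    return sum(int(item["statistics"]["viewCount"])
               for item in response["items"])

print(total_views(VIDEO_IDS))
```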
This complex interaction with users who edit and re-post our videos, together with the instant (if ambiguous or throwaway) feedback on YouTube, is typical of ESA/Hubble’s community interaction, which takes place mainly through social media. This is the primary mechanism for public dialogue in our outreach operations.
Social media has opened the door for science communicators to a world where “the public” resolves into individuals with personal opinions. Each individual connects with hundreds of others and creates their own sphere of influence. The potential for information to spread virally is enormous and hard to track, but not completely opaque. Even if the impact of social media is hard to grasp, a number of indicators tell us whether our inputs have results.
For our Facebook page we track the number of fans. However, since everything we post on the page is publicly available to everyone, not only to our fans, people do not have to “Like” the page in order to see our posts. For this reason there are other, more relevant indicators, prime among them Facebook’s “Insights” service, which provides interesting statistics: in March 2011, for example, we had three times as many ‘post views’ on Facebook as visits to our website. We can also measure impact indirectly. Google Analytics tells us that Facebook is the second-largest source of visitors to spacetelescope.org, right after Google, evidence of a large community reading news on Facebook and then looking for more on our website.
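The indirect check is easy to reproduce from a Google Analytics traffic-sources export. The sketch below assumes a hypothetical hand-exported CSV with ‘source’ and ‘visits’ columns rather than a live API connection.

```python
import csv
from collections import Counter

def top_referrers(path, n=5):
    """Rank traffic sources to the website by total visits."""
    totals = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["source"]] += int(row["visits"])
    return totals.most_common(n)

# e.g. [('google', 210000), ('facebook.com', 85000), ...]
print(top_referrers("ga_referrals_march_2011.csv"))
```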
We also look at numbers of monthly active users, and numbers of people responding to or posting on our page, for example with questions or comments. This gives us an indication of the degree to which we succeed in engaging people rather than simply informing them in a one-way process. Facebook also gives data about the gender and age of Facebook friends, as well as the countries and cities where a page is most popular, which can be an important variable if you have a local impact you need to justify.
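A simple way to express engagement, as opposed to raw audience size, is interactions per active user. All figures and field names below are hypothetical stand-ins for numbers read off Facebook’s Insights pages.

```python
def engagement_rate(comments, fan_posts, likes, monthly_active_users):
    """Interactions per monthly active user: a rough proxy for dialogue
    rather than one-way broadcasting."""
    interactions = comments + fan_posts + likes
    return interactions / monthly_active_users if monthly_active_users else 0.0

# Hypothetical month: 450 comments, 120 fan posts, 3,200 likes,
# 40,000 monthly active users -> about 0.09 interactions per user.
print(round(engagement_rate(450, 120, 3200, 40000), 3))
```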
In the case of Twitter, evaluation is harder because, unlike Facebook, it does not have a well-established monitoring service. This has meant relying on simple indicators such as numbers of followers or website visitors arriving from Twitter. However, HootSuite, a software platform used to manage social media accounts, has recently launched a tool that generates reports on activity and impact. In future, this should allow for more detailed insights into how best to engage with social networks.
Analysing the success of outreach efforts is not just a simple matter of unambiguous statistics. It relies on educated guesses and common sense, because we work with an imperfect data set. However, the purpose of evaluating science outreach is not to produce detailed, accurate statistics for publication, but to assess whether what we do actually works. Effective evaluation holds a paradoxical position: it should be both tightly integrated with science communication work and clearly secondary to it. If there is one thing worse than not measuring output at all, it is undermining your output (for example by enforcing strict copyright terms) in the name of measuring it properly.
O. Usher, O. Sandu and L. L. Christensen
Featured image credit: Vadim Sadovski via Shutterstock