The Internet, Policy & Politics Conferences

Oxford Internet Institute, University of Oxford

Tom Nicholls, Simon Gray: Crowdsourcing measurement of e-government usability

Tom Nicholls, Oxford Internet Institute

Simon Gray, 

Background
==========
In an age of austerity, the cost-effectiveness of government has become an increasing concern for policy-makers. As part of this agenda, the UK government is once again promoting effective digital service delivery as one way of achieving cost savings while preserving the quality of public services. As a result of this push, and of other pressures, the quality and ease of use of governments’ online offerings have become increasingly important in recent years. However, the drive to put services online has not been matched by a corresponding effort to develop the measures needed to assess whether these websites are doing a good job. The problem is particularly acute in local government, where the number of discrete services delivered by a single organisation is high, and the websites are correspondingly complex and difficult to navigate.

The leading method for measuring councils’ online performance is Socitm’s annual Better Connected survey series (Socitm, 2013), which uses expert assessors to judge the quality of particular transactions. However, Better Connected has limitations. The scale of the problem and the limited availability of reviewers mean that only a small number of “top task” services are judged in any one year; because of the cost of the exercise, the detailed results are restricted to paid subscribers to Socitm’s Insight service; and the judgements of the assessors are necessarily those of interested experts rather than of typical council website users, who are local citizens with widely varying levels of online skill. As a result, Better Connected risks measuring the professional consensus on how websites ought to behave, rather than performance and usability as experienced by users. It may or may not be a coincidence that, as councils’ Better Connected scores have risen, overall citizen satisfaction as measured by user surveys has instead fallen.

Against this background, we describe and analyse an alternative approach based on crowdsourced measurement of website usability and online service delivery. We focus on the features of the council website usability dashboard that lead citizens to participate in it. Since one attraction of crowdsourcing in this context is that it draws judgements from a broader pool of citizens than web professionals and local government managers alone, we also report on a survey of participants in the crowdsourcing project, to understand their motivations for taking part and to assess the value added by this new approach.

Research questions
==================
How can data from the dashboard improve our understanding of the current usability of local authority web provision?

Are the results of Socitm (2013) valid and robust when compared against data gathered on a different basis?

Does this example provide insights into, or suggest further opportunities for, crowdsourcing the evaluation of policy and service delivery in other areas of government?

Approach, methods and data
==========================
We situate the dashboard’s approach to crowdsourcing within a wider body of theory dealing with co-production of public policy (including Parks et al., 1981; Benkler, 2002; Linders, 2011), in this case through user involvement in service evaluation. The present case is differentiated from the body of literature on service user co-creation: participants in this service are not necessarily users of services from the local authority being rated (although all are, of course, users of local government services somewhere).

The effectiveness of the dashboard is analysed by using its ratings to test, critique and validate the results in the Better Connected data series. This includes the computation of inter-coder reliability measures (sketched below) to assess the performance of the dashboard’s volunteer raters. A randomly-sampled survey of raters provides deeper data on sustainability, on participants’ motivations for taking part and on the value added by crowdsourced judgement. Full data are available from the dashboard, allowing analysis at the level of individual ratings, with each set of judgements attached to an individual-level record for the rater involved. This permits a breakdown of both reliability and summary outcome measures by age, location, disability and level of expertise.
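
As a sketch of the kind of inter-coder reliability computation referred to above, the following Python fragment computes Fleiss’ kappa, one plausible choice of measure when many raters assign categorical ratings. The three-point scale and the example rating matrix are hypothetical illustrations, not the dashboard’s actual rating scheme or data format.

```python
# A minimal sketch, assuming categorical ratings and an equal number of
# raters per item. Fleiss' kappa measures agreement among many raters,
# corrected for the agreement expected by chance.
from collections import Counter

def fleiss_kappa(ratings):
    """`ratings` is a list of items; each item is the list of categorical
    ratings (e.g. 1-3 usability scores) it received. Every item must have
    the same number of ratings."""
    n = len(ratings[0])                               # ratings per item
    N = len(ratings)                                  # number of items
    categories = sorted({r for item in ratings for r in item})
    counts = [Counter(item) for item in ratings]      # n_ij per item i

    # Observed per-item agreement P_i, and marginal category
    # proportions p_j used for the chance-agreement term.
    P_i = [(sum(c[cat] ** 2 for cat in categories) - n) / (n * (n - 1))
           for c in counts]
    p_j = [sum(c[cat] for c in counts) / (N * n) for cat in categories]

    P_bar = sum(P_i) / N              # mean observed agreement
    P_e = sum(p ** 2 for p in p_j)    # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: four council web pages, each rated by three
# volunteers on a 1-3 usability scale.
sample = [[1, 1, 2], [3, 3, 3], [2, 2, 1], [1, 2, 3]]
print(f"Fleiss' kappa: {fleiss_kappa(sample):.3f}")
```

In practice, a measure such as Krippendorff’s alpha would additionally accommodate missing ratings and varying numbers of raters per item, both of which a live crowdsourced dashboard is likely to produce.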

Policy implications
===================
Greater responsiveness is possible by using a year-round crowdsourced system as a replacement for, or supplement to, a once-a-year expert evaluation. We suggest that this may shorten decision timescales for online services, allowing development work to proceed more quickly, while offering fewer opportunities for focused gaming of the measures by council staff. Free access to the full dashboard dataset also gives policymakers timely data and opens up possibilities for further crowdsourcing of analysis and of recommendations for development.

We offer suggestions as to how this crowdsourced approach to measuring service quality might be taken up by policymakers developing their online service offerings. We also explore the limitations of the approach, suggesting that triangulating the new crowdsourced data with existing measures is likely to prove the most reliable way forward. Finally, we briefly assess the future prospects of the dashboard, noting the challenges to ongoing participation inherent in its design and incentives, and suggest ways of keeping the dashboard service sustainable.

References
==========
Yochai Benkler. Coase’s penguin, or, Linux and the nature of the firm. Yale Law Journal, 112:369–446, December 2002.

Dennis Linders. We-government: an anatomy of citizen coproduction in the information age. In Proceedings of the 12th Annual International Digital Government Research Conference: Digital Government Innovation in Challenging Times, dg.o ’11, pages 167–176, New York, NY, USA, 2011. ACM.

Roger B. Parks, Paula C. Baker, Larry Kiser, Ronald Oakerson, Elinor Ostrom, Vincent Ostrom, Stephen L. Percy, Martha B. Vandivort, Gordon P. Whitaker, and Rick Wilson. Consumers as coproducers of public services: Some economic and institutional considerations. Policy Studies Journal, 9(7):1001–1011, 1981.

Socitm. Better connected 2013: a snapshot of all local authority websites. Technical report, Socitm, Northampton, UK, February 2013.
