Developing electoral registration performance indicators. Consultation responses paper


April 2007

Translations and other formats
For information on obtaining this publication in another language or in a large-print or Braille version, please contact the Electoral Commission.
Tel:

We are an independent body that was set up by the UK Parliament. Our mission is to foster public confidence and participation by promoting integrity, involvement and effectiveness in the democratic process.

Contents

1 Introduction
2 Summary of key responses
3 Summary of key themes
4 Responses to the indicators
5 The Commission's response to the consultation
6 Alternative indicators
7 Piloting process

Appendices
Appendix A Extract from the Political Parties, Elections and Referendums Act 2000
Appendix B List of consultation questions
Appendix C Breakdown of responses to specific indicators (questions 1–9)

1 Introduction

1.1 Section 67 of the Electoral Administration Act 2006 (EAA) inserted new sections 9A, 9B and 9C into the Political Parties, Elections and Referendums Act 2000 (PPERA). These sections allow the Electoral Commission to set and monitor performance standards for electoral services and to collect information on the costs of electoral services from Electoral Registration Officers, Returning Officers and Referendum Counting Officers. This power is not directly applicable to Northern Ireland1 and does not apply to Scottish local government elections.2

1.2 The new PPERA powers are broad and allow the Commission the scope either to focus efforts on a specific area of electoral administration, or to use a broad-brush approach and set indicators across the spectrum in the first instance.3

1.3 There is increasing comment and concern as to the accuracy and completeness of electoral registers in the UK. While it is accepted that registration is the foundation of our electoral process, neither the Commission, the Government nor Electoral Registration Officers (EROs) themselves currently have a clear idea of the level of electoral registration, the determinants of differing levels of registration across the UK, or a consensus on an acceptable level of electoral registration.

1.4 The Commission therefore intends to focus on electoral registration in the first 18 months of the performance standards programme. In so doing, the Commission will seek to exploit existing data sources, e.g. the Office for National Statistics (ONS) and the General Register Office for Scotland (GROS), where possible and appropriate, and set tight, specific indicators. We see this as a key test of whether standards can be set that will allow us to answer key questions about the electoral process, as well as provide a blueprint for improvement and change.

1.5 Parallel to this process, the Commission will develop a model for costing electoral registration services across the UK.
It is envisaged that data collection and analysis can be designed to have a low impact on EROs in 2007, while still enabling the Commission to make early estimates of registration levels, as well as test the indicators and cost model. The financial data would be that from , and the Commission would seek to get registration data from the same period in order for the information to be complementary. The Commission will work towards a full system of collection and assessment by 1 December.

1.6 Electoral event standards remain very important, but are not likely to be as comparable by year or event as registration. There is also a limited prospect of status quo elections over the next three years, given the nature and rate of proposed change.

1 The Secretary of State for Northern Ireland has indicated an intention to issue a direction to the Chief Electoral Officer of Northern Ireland to have regard to the performance standards set by the Electoral Commission in Great Britain.
2 The Local Electoral Administration and Registration Services (Scotland) (LEARS) Bill enables Scottish Ministers to set performance standards for Returning Officers at local government elections. Scottish Ministers have stated their view that these performance standards should be compatible with the standards the Commission will produce for Parliamentary elections.
3 An extract from the legislation is attached at Appendix A.

1.7 The Commission is committed to having a full system of electoral event indicators in place for 2009, a year that includes the European Parliamentary elections and possibly the next UK Parliamentary general election. This timescale will allow for piloting of the indicators at the scheduled elections in May.

1.8 The Commission's approach is very much one of measuring performance to drive improvement. A comprehensive programme of support and assistance will be integral to the roll-out of the regime. Our preparatory work has concluded that a great deal of data is currently collected or obtained, but it is neither uniform nor complete across the country. Accordingly, our first stage will be focused on the collection of accurate and uniform data, as well as the design of support mechanisms for improvement and development.

1.9 The development of performance standards for EROs will be informed by data collected and analysed in the first year of the project.

Consultative process

1.10 In addition to the consultation paper (see paragraph 1.15 onwards), the Commission arranged meetings with the Stakeholder Advisory Panel (SAP) in Belfast, Edinburgh and London.

1.11 The event in Northern Ireland was held on 9 November 2006 and included representatives from the Electoral Office for Northern Ireland (EONI) and political parties. Attendees commented that they would like to see the performance standards project concentrate on performance improvement rather than setting UK minimum standards alone. It was also the consensus of the attendees that it would be necessary to ensure that the specific indicators are appropriate for Northern Ireland.

1.12 The event in Edinburgh was held on 8 December 2006 and included electoral practitioners, the Scottish Assessors Association (SAA), Society of Local Authority Lawyers and Administrators in Scotland, Society of Local Authority Chief Executives and Senior Managers (SOLACE), Scotland Office, Scottish Executive and Political Parties Panel representatives.
It was a useful event with many valuable points raised, such as the fact that the financial costs model will have to take into account that the joint valuation boards cover a number of local authority population areas. It was also suggested that additional indicators could be considered that are not registration or electoral event specific, e.g. generic participation work, polling place locations, and other general electoral management (but not event) activities.

1.13 The Commission held meetings of SAP on 2 November and 9 November 2006 and 15 February 2007 in London. Attendees included representatives from UK and devolved governments, electoral administrators, the Association of Electoral Administrators (AEA), the SAA, and political parties. This group has provided significant input into the development of the indicators, and allowed the Commission to host valuable debates on various issues among the stakeholders of the project.

1.14 The Commission has also held numerous meetings with government departments and organisations such as the Department for Constitutional Affairs (DCA), Welsh Assembly Government (WAG), Local Government Association, the Office for National Statistics and the Audit Commission. The Electoral Commission is keen to work with the existing performance management regime, and will continue to consult with the relevant bodies. The Commission has also discussed the development of the indicators at the Electoral Leadership Forum (ELF) meetings.

Purpose of the consultation paper

1.15 The objective for the consultation paper was to obtain views on a basket of proposed performance indicators for the electoral registration function. The consultation was launched on 4 December 2006 and closed on 26 January 2007.

1.16 The paper discussed a proposed basket of electoral registration performance indicators, asking for comment on the nature of each individual indicator and separate views on the overall mix of proposed indicators. The indicators included in Chapter 7 are those that will be piloted from March to July 2007.

1.17 The indicators need to be considered in the context of the proposed electoral services performance standards framework and the relevant performance standards regimes in Great Britain.

1.18 The Commission wished to address the following questions in the consultation period:
- Do the indicators address the key issues relating to non- or under-registration?
- Are they appropriate across the whole of the UK electoral community?
- Do the indicators appear to propose any additional electoral registration activity which may present a potential new burden or cost implication?
- Are the proposed indicators robust and fit for purpose?
- What will be required to support improvement in performance if this is indicated by the initial collection exercises?
1.19 The paper asked 16 questions of the respondents, the first nine of which were relevant for each indicator, and a further seven that asked general questions about the proposed basket of indicators. A detailed breakdown of the responses to the first nine questions for each indicator is available at Appendix C, while the responses to the latter seven questions are outlined in Chapter 4. The questions used were based on recognised assessment criteria for performance measures in the local government performance community.

1.20 The Commission sent the consultation document to a large number of stakeholders including the following people and organisations:
- all EROs (via Commission circular)
- all local authority performance managers

- all political parties on the Westminster, Northern Ireland, Scotland and Wales political parties panels
- Department for Constitutional Affairs
- Welsh Assembly Government
- Northern Ireland Office
- Scotland Office
- Scottish Executive
- ONS, GROS
- Audit Commission, Audit Scotland, Audit Wales
- AEA, central and branches
- SOLACE electoral matters panel
- a series of NGOs (such as accessibility and community charities)

1.21 It was also publicly available on the Commission's website.

1.22 The Commission received responses to the consultation paper from the following people and organisations:

Electoral Registration Officers and the local authorities that employ them
- Aylesbury District Council
- Basingstoke and Deane Borough Council
- Blaby District Council
- London Borough of Brent
- Carlisle City Council
- Colchester Borough Council
- East Devon District Council
- Gateshead Council
- Great Yarmouth Borough Council
- London Borough of Greenwich
- London Borough of Hammersmith and Fulham
- Hart District Council
- Royal Borough of Kensington and Chelsea
- Kirklees Council
- Maidstone Borough Council
- London Borough of Merton
- Milton Keynes Council
- Oldham Metropolitan Borough Council
- South Tyneside Council
- London Borough of Southwark ERO
- London Borough of Southwark Performance team
- Torfaen County Borough Council
- Wealden District Council
- West Berkshire Council
- West Dorset District Council
- Westminster City Council
- Weymouth and Portland Borough Council
- Wigan Metropolitan Borough Council

- Chief Electoral Officer of Northern Ireland
- Scottish Assessors Association
Total: 30

Political parties, candidates and agents
- Liberal Democrats
- Scottish National Party (SNP)
- Roger Stephens
Total: 3

Association of Electoral Administrators (AEA)
- AEA corporate response
- East Midland AEA
- Essex AEA
- London AEA
- North West AEA
- South West AEA
- Southern AEA
- West Midlands AEA
Total: 8

Government
- Welsh Assembly Government (WAG)
- Department for Constitutional Affairs (DCA)
- General Register Office for Scotland (GROS)
Total: 3

Local Government Association
- Local Government Association Lifting the Burdens Task Force
- Local Government Association (LGA)
Total: 2

Other
- Experian
Total: 1

Individuals
- Malcolm Nicholson
Total: 1

Total number of responses: 48

2 Summary of key responses

Department for Constitutional Affairs

2.1 The DCA was supportive of the overall range of indicators, and of the planned approach to implementing them. The DCA also sympathised with the intention of beginning with electoral registration, and considered the use of quantitative indicators an important step in introducing a performance culture for electoral registration. Their response also outlined a desire for the indicators not to be burdensome on EROs, and acknowledged that much of the information required can be extracted from existing data sources.

2.2 The DCA's response wished to see the indicators linked to legislative requirements, such as the new duties in the Electoral Administration Act 2006. Another suggestion was to use the forthcoming minimum data standard, against which all registers will be assessed for the Co-ordinated Online Record of Electors, as a performance indicator. In addition, the DCA would prefer to see the qualitative indicators merged into one, and for it to include reference to anonymous registration and the registration of service voters.

Association of Electoral Administrators (corporate response)

2.3 The AEA response began by supporting the need for performance standards but also stated disappointment at the general approach taken in the consultation document. It argued that there are too many proposed indicators, with too much focus on data collection, and that the use of ONS data is misguided. The AEA's response also stated a desire to see more qualitative indicators, and fewer quantitative ones. It was concerned about the proposed indicators representing a burden to electoral administrators, and would like to see a more explicit link to performance improvement.

2.4 The AEA would also like to see closer links to related schemes such as National Occupational Standards and the Beacon Scheme.
Welsh Assembly Government

2.5 The response from WAG was also endorsed by the Welsh Local Government Association, the Local Government Data Unit Wales, the Wales Audit Office, SOLACE Cymru, and local authority performance specialists. They supported the introduction of performance standards for electoral services, and stated a desire to incorporate the final indicators into the overall performance management framework for Wales. The response also noted the important difference between performance indicators (which capture data) and performance standards (which define the required level of performance). The response tacitly accepted the need to undertake the former initially, before moving to the latter following the first year of the project.

Chief Electoral Officer of Northern Ireland

2.6 The formal response from the Chief Electoral Officer of Northern Ireland welcomed the progress made by the Commission, and suggested that the standards will be a valuable mechanism by which to drive up performance. The response continued by pointing to the pre-existing legislative performance standards in Northern Ireland, and the significant difference between the electoral systems of Great Britain and Northern Ireland. The Chief Electoral Officer noted that the Commission's powers to implement performance standards do not extend to Northern Ireland, and doubted the relevance of many of the proposed indicators if introduced to Northern Ireland.

Scottish Assessors Association

2.7 The SAA provided a comprehensive review of each consultation question, and of each indicator. As with many responses, the SAA cited concern with the population data to be used in Indicators 1 and 2. The SAA was also wary of the utility of the qualitative indicators, stating that it is hard to measure and compare textual data.

2.8 The SAA doubted the point of Indicator 3, as it felt that the number of errors discovered would be so small that they would not be a viable statistic. In addition, they were clear that the definitions used in the indicators would need to be very rigorously stated to ensure that the information received would be directly comparable.

General Register Office for Scotland

2.9 The response from GROS focused on the population data they are able to supply. They explained the process by which they calculate the mid-year population estimates, and included a link to a recent Parliamentary question answered by the National Statistician, Karen Dunnell. The statement from the National Statistician highlights the difficulty of reaching a figure for the eligible electorate rather than the population: It is not possible to split estimates of the usual resident population in order to give estimates of the population entitled to vote.
We do not hold data (e.g. populations of non-EU citizens) that enable us to produce such a split.

Local Government Association

2.10 The response from the LGA noted that they share the Commission's desire to see all local authorities possess efficient electoral services that command the public's confidence. However, the response was sceptical over the need for external performance indicators and raised the Government's pledge to reduce the number of performance indicators for local authorities.

Lifting the Burdens Task Force

2.11 The response from the Lifting the Burdens Task Force was sceptical of the need to introduce performance standards in a sector where no previous standards

had existed. The response argued that, in light of the desire to reduce the number of performance indicators for local government, the Commission had not made a sufficient case for the introduction of new burdens. The response went on to say that electoral registration had been carried out perfectly well for more than 100 years and that there was therefore no need to introduce a performance framework to raise standards. The response ended by noting that Indicators 3, 8, 9, 10 and 11 would be particularly burdensome.

3 Summary of key themes

3.1 Perhaps the most significant theme coming out of the consultation responses was the lack of consensus between the various groups involved on almost all of the issues. EROs and the local authorities that employ them had widely differing views on all aspects of the consultation paper, and this disparity was reflected in AEA branch responses.

The number of proposed indicators

3.2 Comments that there were too many indicators were often backed up by reference to the Government's public stance of attempting to reduce enforced top-down performance information collection from local government, replacing it with local targets and risk-based inspection regimes. Most responses stated a desire to see fewer indicators, but respondents were divided on whether these should be quantitative, qualitative or a mixture of the two.

The number of quantitative indicators

3.3 This was a popular comment from some AEA branches and EROs. The justification for such a comment came from the concern that quantitative indicators would not show the full picture and that the substantial resource and demographic differences between local authorities would not be sufficiently taken into account. There was also a desire, stated in the AEA corporate response, to mirror the qualitative Beacon Scheme format.

The number of qualitative indicators

3.4 This was a comment that came more from the performance management professionals and government than electoral practitioners. However, there was also significant concern from some EROs over the number of qualitative indicators suggested. As the Royal Borough of Kensington and Chelsea stated in their response: It is noted that the majority of indicators are quantitative and this is to be welcomed since local analysis and local comparisons will be facilitated.
The justification for such a comment comes from the view that qualitative indicators, while providing useful contextual background, are neither sufficiently verifiable (without a thorough inspection regime) nor fully comparable, due to the number of variables involved in any such responses.

The accuracy of population data

3.5 This was a popular comment from the AEA branches and some EROs. The justification for such a comment came from the concern that the census data was wrong at the time of collection and becomes no more accurate (and potentially less accurate) at each mid-year readjustment. Therefore, any attempt to use it to judge performance will be inaccurate at best and misleading at worst. There was also an issue with the timeliness of the provision of data (i.e. the mid-year readjustment is not available on a local authority area basis for 18 months).

3.6 However, it must be said that there were many responses from EROs that did not address this issue. The DCA, while acknowledging the concerns, stated in their response: This measure will be a useful proxy for the registration rate. The data available will give an indicative figure on the number of adults registered in a given area and is an internationally accepted standard.

Importance of contextualising the data

3.7 Many of the respondents from the electoral community stated that any information collected by the Commission would need to be contextualised by further information relevant to that local authority area. Many EROs were very concerned by the thought that all quantitative (or qualitative) information would simply be compared, and some would inevitably (and unfairly) be found wanting. For example, the London Borough of Merton stated: We have concerns that the specific problems of areas such as London with its transient population, with high turnover at properties and a multi-ethnic/cultural dimension will not be sufficiently considered.

3.8 A common specific comment concerned with this issue was the varying levels of funding available to electoral registration departments. The response from South Tyneside Council stated: Any standards should consider the level of funding available to electoral services units. It would be unfair to judge any particular electoral service unit to be failing if they are under-staffed and/or under-resourced generally.

Links to performance improvement

3.9 Another key theme arising from the consultation responses was the desire to see a closer link between the content of the indicators and a plan for performance improvement. East Devon District Council stated: The paper is virtually silent on how the indicators will be used or how they will contribute to improving standards.
There is no need for it

3.10 Although this comment was not widely made by respondents, the Commission considers it appropriate to acknowledge the fact that it was made. The response from the Local Government Association's Lifting the Burdens Task Force was the most explicit on this point: The consultation document adduces no evidence of significant incompetence, irregularity or inefficiency in the electoral registration function of local government, a function it has been carrying out very well for well over 100 years. The response from Malcolm Nicholson agreed with this position, stating: There is no evidence that these PIs will tangibly improve performance, particularly as, in general terms, the work of elections offices is scrutinised through an existing legislative framework. If registers weren't published and elections not run, then this would be immediately evident.
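The registration-rate measure debated above (and proposed as Indicators 1 and 2) is simple arithmetic: electors on the register as a percentage of the mid-year adult population estimate. The following is a minimal illustrative sketch, using entirely hypothetical authority figures; as paragraphs 3.5–3.6 note, ONS/GROS estimates cannot isolate the eligible electorate, so the result is only a proxy.

```python
# Illustrative sketch of Indicator 1 (hypothetical figures throughout).
# The denominator is the mid-year adult population estimate, not the
# eligible electorate, so the percentage is a proxy only.

def registration_rate(register_count: int, adult_population: int) -> float:
    """Electors on the register as a percentage of the estimated adult population."""
    return 100.0 * register_count / adult_population

# Hypothetical local authorities: (electors on register, mid-year adult estimate)
authorities = {
    "Authority A": (68_400, 75_100),
    "Authority B": (51_200, 61_900),
}

for name, (on_register, adults) in authorities.items():
    print(f"{name}: {registration_rate(on_register, adults):.1f}%")
```

The same division underpins Indicator 2, restricted to the 16- and 17-year-old (attainer) age band; the comparability concerns raised by respondents apply to the denominator in both cases.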

4 Responses to the indicators

Responses to specific indicators

Indicator 1: Percentage of adult population included on the register of electors

4.1 This indicator provoked a near-universal reaction, particularly among the EROs, over concerns about the accuracy of the ONS population figures. Many stated that the population figures created by ONS were not endorsed by the local authorities, so they would be unwilling to have their performance judged by reference to such figures. Another point raised by EROs was the difference between a population figure in a given area and an eligible electorate figure. As ONS do not collect nationality data, it is only possible to compare the register figure to that of the overall population. But, as the response from Westminster City Council notes: Simply using adult population without discounting non-eligible adults results in significantly overstating the unregistered electorate. These concerns were noticeably less prevalent among the responses from those other than EROs.

4.2 Despite these concerns, some of the responses from EROs accepted the motivation for including an indicator such as this. One went as far as describing it as a valid and useful indicator, and the response from the London branch of the AEA stated that it was accepted that this indicator was one that Parliament would be most interested in, and is relatively easy to produce. The DCA concurred, stating: This measure will be a useful proxy for the registration rate. The data available will give an indicative figure on the number of adults registered in a given area and is an internationally accepted standard.

4.3 Of those commenting on the collection of the data involved, respondents unanimously agreed that the data would be easily obtainable and would not present additional cost burdens on the electoral registration services (i.e. due to the RPF 29 form).
Although some respondents said the resulting data would be comparable, they were outnumbered by those who said that the value of such comparisons would be poor due to the inaccurate base data.

Indicator 2: Percentage of 16- and 17-year-olds (attainers) included on the register as a percentage of the total population of that age group

4.4 The responses were very similar to those for Indicator 1, as this indicator is simply a breakdown of the overall percentage collected in Indicator 1. The accuracy of the population figures and the question of the eligibility of 16- and 17-year-olds were both raised by EROs again. Similarly, some EROs did view it as a valuable indicator, but they were in the minority.

Indicator 3: Recorded errors on the register (number and nature of clerical errors recorded at the last major election/annually)

4.5 The utility of this indicator was questioned by most EROs. Many responses noted that the new system allows for errors to be rectified up until 9pm on election day, meaning that very few electors could be disenfranchised by a clerical error.

Several EROs also stated that errors would be more likely to be noticed if an election was held, so those authorities with more frequent elections may appear to have a greater number of errors; the only errors that could be recorded would be those that have been discovered. A minority of EROs stated that they thought this indicator would help reveal inconsistencies in clerical activities, and this attitude was more common among responses from those other than EROs. It is also clear from a wide range of responses that this indicator would require the term error to be clearly and precisely defined.

Indicator 4: Percentage of entries on the register of electors that are over 12 months old

4.6 There was debate among respondents regarding the actual meaning of this indicator. Most assumed that it referred to the number of entries that had been carried forward on the register for 12 months following non-response. Those that did not entertain this reading thought the indicator pointless, noting that the accuracy of the entries is the key, not their age.

4.7 Among EROs there was a lack of consensus over support for this indicator. Some felt it added little, while others agreed with the DCA's statement that: This will demonstrate how effective the annual canvass has been in producing a comprehensive and accurate register.

Indicator 5: Percentage of electoral canvass forms returned during the canvass period, or percentage of properties returning canvass forms and the number of electronic transactions

4.8 The responses from EROs regarding this indicator fell into three roughly equal fields. The first group disapproved of the indicator as they did not feel that it took into account work undertaken by the ERO at other times. The second group valued the indicator as they felt it directly linked to the accuracy of the register, but wanted the term return tightly defined in order for the results to be genuinely comparable.
The response from the London Borough of Brent stated: In fact this is perhaps the most reliable indicator of an accurate register. The third group valued it and saw it as an easily collectable figure that they largely already used for their own purposes. One respondent, the ERO from Westminster City Council, felt that due to rolling registration, it is no longer appropriate to measure the accuracy of the register just as at the December publication date.

4.9 The DCA stated that it would like to see the responses to this indicator broken down by registration types (e.g. internet, phone, post, etc.) and thought it could include the steps EROs are to undertake under Section 9 of the EAA.

Indicator 6: Percentage turnover of changes to the electoral register: total number of additions, total number of deletions, amendments (e.g. surname)

4.10 Some EROs took issue with this indicator in their responses as they did not feel that it was one that measured their performance. The West Midlands AEA branch noted that the percentage of changes is totally out of our control, and varies massively from area to area, so could not be compared easily. However, others

considered it valuable as it identifies those authorities with more transient populations. It may be argued that, due to this, the indicator would act more as an information-gathering exercise than an actual measure of the performance of the ERO. The DCA saw the gathering of such data as useful, as it could help give a picture of the accuracy of the register and show that it is being maintained.

4.11 The general consensus was that the precise wording of the indicator would need to be tightened considerably to ensure that each ERO used the same criteria in recording the data.

Indicator 7: Timely publication and supply of the revised register of electors/absent voter lists

4.12 Almost all respondents considered the data involved to be easily obtainable in a cost-effective manner, although data regarding the supply of the register was thought to be somewhat more onerous to capture. There was widespread scepticism from the electoral registration community over the point of the indicator. It was frequently stated that EROs have a statutory duty to publish the register on 1 December, and not to do so would be a breach of the regulations. Therefore, since the ERO has a legal obligation in this situation, many respondents considered it superfluous to add a performance indicator as well.

4.13 However, the response was quite different from that of the DCA, which stated, regarding publication dates, that feedback from users concludes that this is not always met and that this indicator will therefore be important to that group. The political parties and WAG were other stakeholders who did not argue for the removal of this indicator, suggesting that there is considerable difference of opinion on it.

4.14 There was greater unanimity over the fact that the definition of supply would need to be clearly stated.
Weymouth and Portland Borough Council put the issue as follows: Authorities with large numbers of requests for registers and absent vote lists, for example districts with high numbers of parish councillors, all of whom are entitled to copies of registers, are likely to take longer to provide these lists than other authorities who deal with limited (but probably more vociferous) requests for registers and lists from fewer candidates, but a greater number of political parties. Since the supply of the register is based on demand, many respondents considered it unfair to judge performance on this.

Indicator 8: Customer satisfaction rating of the electoral registration service

4.15 All the qualitative indicators provoked widely varying answers from the respondents, but Indicator 8 produced particularly disparate views. Most respondents from the electoral registration community were concerned about the potential costs and resources required to undertake an annual customer satisfaction survey. There were also concerns with making sure it is comparable, and with deciding on the questions and sample size. Those EROs with such concerns stated that they hoped the Commission would be able to provide guidance in these areas.

4.16 Although there was widespread concern over the potential resources involved, the EROs were split over general support for the indicator. Many of the ERO

responses, and those from the AEA branches, noted the potential resource implications but considered such surveys worthwhile, as they would provide a measure of the customer's experience. Some of the more negative comments expressed concerns about the comparability of such data, arguing that it is too localised to be used as a success measure.

4.17 The response from the Performance Team at the London Borough of Southwark noted that: satisfaction with the electoral process will be hard to quantify, as it is a hygiene indicator; people will rarely be very satisfied because it is something which only happens once a year, often requires minimal time, and so long as it is hassle free, they don't mind! Therefore authorities may not achieve better than OK no matter how hard they try. Other responses pointed to the fact that it may be hard to get a statistically valid response given the level of public awareness of the process. For this reason Maidstone Borough Council proposed surveying candidates and agents as well.

4.18 The response from the DCA considered a mandated survey too onerous for local authorities to carry out and suggested that all the proposed qualitative indicators should be grouped together into one narrative report.

Indicator 9: Activity carried out by the ERO to validate the accuracy of the register (notification of deaths, other authority registrations, etc.)

4.19 This indicator also split the respondents between those who thought that it could be a valuable tool in promoting and disseminating best practice across the UK, and others who thought that its qualitative nature meant that the data would be essentially incomparable.

4.20 The London Borough of Greenwich supported the former point of view, stating: This indicator is welcomed, and is likely to throw up a lot of good practice. Milton Keynes Council articulated the latter view clearly: Any qualitative indicator based on narrative information will not be comparable between authorities.
East Devon District Council suggested that a tick-box approach should be used in order to make it more comparable among authorities. Most thought that this indicator would result in extra costs and resources being used, but this was by no means unanimous.

Indicator 10: Activity carried out by the ERO to encourage participation/registration (especially in traditionally low-registration areas and communities)

4.21 This indicator followed the pattern of the other qualitative indicators, with division over their potential measurability. The Southern AEA branch's response was echoed by several others and stated: It will be difficult to establish a national protocol that provides for the collection of qualitative data in a narrative format that could be measured in a comparable way across authorities. However, the DCA's response considered that this indicator is a clear driver in raising awareness of the importance of registration amongst the electorate as required by Section 69 of the EAA.

4.22 The concern over the comparability of such data stemmed from a belief among the electoral registration community that the local environment and resources were in large part responsible for the ability of the ERO to encourage registration. Wealden District Council was one of the authorities that stated this, but it also suggested the following in an attempt to resolve the problem: The Electoral Commission might be better advised to construct an indicator around a checklist of best practice comprising ingredients known to be likely to ensure improvements in registration. The elements can then be weighted to produce percentages which can be benchmarked.

Indicator 11: Activity carried out by the ERO to prevent fraud in the electoral registration and absent vote application process

4.23 The responses to this indicator were very similar to those on the previous qualitative indicators. Most respondents again commented that such qualitative data would be unlikely to be measurable across all authorities. However, a significant number of responses also noted the value of the indicator in promoting best practice in this area. The DCA supported the indicator, stating: The Government wants to ensure that the integrity of registration is maintained and this indicator goes some way to achieving this by placing an onus on the ERO to prevent fraud.

4.24 Maidstone Borough Council noted that this indicator is different from the previous qualitative indicators as it deals with an unverifiable activity: It is impossible to ascertain how much fraud has been detected as the total amount of fraud is unknown. If the same exercise is carried out in another local authority and no fraud is detected it may be that none exists or simply that none is detected. It is impossible to analyse results that do not have firm base data. Hence, although such an indicator may identify the most proactive ERO, that ERO may not necessarily be the best performing.
General questions from the consultation paper

4.25 Most respondents did not engage with the general questions, but a significant number did. The comments made are detailed below.

Q10 Are there any additional assessment criteria which the Commission could use to assess potential electoral registration indicators?

4.26 The most common answer to this referred to the need to take the resources of the local authority into account. There was a widespread belief, especially among EROs, that the ability to financially contextualise the data should be key to assessing potential electoral registration indicators. The response from WAG provided a series of interesting suggestions, including:

Do they measure local authority/ERO performance rather than wider demographic, contextual or other issues beyond a local authority's control?

Do they measure the actual outcomes or outputs of an activity, as distinct from inputs or business processes?

Do they measure discrete outputs or outcomes, avoiding duplication of/overlap with other indicators?

Are they applicable and relevant to all or nearly all local authorities which will be collecting them, and will the data they generate be reliably comparable?

Are they and the data they generate sufficiently clear to act as a basis for scrutiny and accountability, i.e. would they be readily comprehensible by an informed member of the public?

Q11 Is the range of electoral registration indicators appropriate for and applicable to the whole UK electoral community?

4.27 There was widespread concern among the EROs that answered this question over the differing demographic characteristics and available resources among local authorities. The Royal Borough of Kensington and Chelsea wrote: Comparisons between an inner London Borough and, for example, a rural district council may not be appropriate. Interestingly, it was Maidstone Borough Council that pointed to the significant differences in the electoral registration process in Northern Ireland.

Q12 Do you agree that the electoral registration performance indicators, taken as a group, will adequately measure all the relevant aspects of the electoral registration process?

4.28 Concerns over ONS population data were repeated in the relatively few answers to this question, along with the now familiar concerns over taking the differences between authorities into account.

Q13 Looking at the overall basket of indicators, is the mix of quantitative and qualitative indicators appropriate?

4.29 Although most respondents did not engage with this question, those that did generally considered the mix reasonable. The concern over the measurability and comparability of the qualitative indicators was raised again by some respondents, such as the London Borough of Merton, which stated: We have doubts on the measurability of qualitative indicators.
Q14 Are there any additional qualitative or quantitative electoral registration indicators which you would like the Electoral Commission to consider?

4.30 The suggested alternative indicators are detailed in Chapter 6.

Q15 Do you agree that the proposed basket of electoral registration indicators addresses the key non- and under-registration issues identified in this paper?

4.31 Comments from a few EROs referred to the need to recognise that there are many reasons why people do not register that have nothing to do with the registration process. East Devon District Council argued in their response that: Key issues affecting non- and under-registration are lack of faith in politicians, the media and public perception of politics, lack of interest in the democratic process, concern about personal information and big brother government.

Q16 What support would EROs require to (a) capture and report this data in a timely fashion and (b) develop and implement improvement plans?

4.32 The most common response to this question concerned the need for sufficient resources and software. There was also a desire, from some EROs, to see support from the Commission in the form of templates or survey methodologies.

5 The Commission's response to the consultation

5.1 The Commission welcomes the responses it has received, and is grateful for the effort made by respondents to contribute constructively to the debate. The responses have made a genuine impact on the existence and appearance of the indicators to be piloted during the summer.

Overall themes

The number of proposed indicators

5.2 Although the Commission understands the concerns raised over the number of indicators proposed in the consultation document, this process was intended to actively challenge and improve the potential indicators.

5.3 In addition, the Commission is only seeking to gather information in the first year, not set standards. Once we have received and analysed the information, it will be clearer where there are issues to be resolved. Therefore, the final number of indicators may be fewer. In addition, although the Commission is mindful of the Government's desire to reduce top-down performance standards, it should be remembered that such reduction is only possible because the indicators in question have existed for many years, resulting in a wealth of knowledge of the sectors involved.

5.4 As the Commission is starting from nothing, it is to be expected that in the early years the number of indicators used may be disproportionately high (although all efforts will be made to ensure that unreasonable burdens are not put upon EROs).

5.5 Another factor that presents a challenge is the lack of a quantitative indicator that clearly and genuinely reveals the accuracy of the electoral register in question. If the Commission could ask for the percentage of the eligible adult population included on the register, then almost all other indicators could be discarded. The lack of a catch-all indicator means that the Commission is obliged to use more indicators to try to get the same information.
The number of quantitative indicators

5.6 It is understandable that EROs, in the face of a new performance management framework, are concerned about the quantitative indicators and see a qualitative approach as an opportunity to show what it is actually like in their particular areas. However, although EROs should be judged fairly, the Commission also has to produce a framework that will genuinely measure performance and provide comparable data. It is hard to argue against the notion that a performance management framework has to include some quantitative indicators if it is to be genuinely comparable across the organisations being measured.

The number of qualitative indicators

5.7 The responses to the consultation demonstrate that performance management experts are wary of qualitative indicators. It cannot be denied that such indicators risk introducing a less rigorous and clear method of measuring performance. The development of the qualitative indicators during piloting will be crucial. In addition, the specific nature of the power to set standards means that it will also be necessary to develop an appropriate analysis and verification framework during the piloting phase.

Importance of contextualising the data

5.8 The Commission agrees that the information provided by EROs and local authorities should be contextualised. This fits with the complementary power to collect financial information on both the electoral registration process and the running of electoral events. This data will provide valuable information on the different levels of resources available for electoral functions across the UK. In addition, the indicators themselves, as well as the substantial contextual information already publicly available, add up to a significant set of data that the Commission can use. The Commission will also be working closely with the other performance management bodies such as the Audit Commission and CIPFA to develop a suitable model for comparing performance data (i.e. nearest neighbour or families of authorities, etc.).

Links to performance improvement

5.9 The Commission accepts that there are few explicit links between the indicators and performance improvement processes in the consultation paper. However, it was clearly stated in the paper that the development of such processes firstly required an understanding of the current situation. The collection of the baseline data will allow the Commission to see where the problems lie and what level of support would be needed to improve performance.
There is no need for it

5.10 The Commission strongly disagrees with this sentiment, and it is clear that a large number of EROs also consider the introduction of performance standards a welcome development. This is not to criticise the actions of EROs; the Commission has always been supportive of the work undertaken by EROs. However, the fact remains that the Commission, Parliament and EROs themselves have concerns about the varying level of service received by voters across the UK. In addition, neither the Commission, the Government nor EROs themselves currently have a clear idea of the level of electoral registration, the determinants of differing levels of registration across the UK, or a consensus on an acceptable level of electoral registration.

5.11 The response from the London Borough of Hammersmith and Fulham stated: We strongly welcome the introduction of a nationwide performance regime as part of the Electoral Administration Act, and the Commission's central role in this. The London branch of the AEA concurred with this view, arguing: The branch agrees with the need to produce performance indicators that will aid in the development of

electoral administration, and assist in improving services nationwide. It is recognised that there is a greater need for consistency and quality of data in order properly to consider the performance levels of Electoral Registration Officers (EROs) in both comparable and non-comparable areas, and that there is a need to demonstrate to Parliament that value for money is being gained from the resources devoted to electoral services by local authorities, including from government grants.

5.12 This type of response was also echoed by the Liberal Democrats: We would support the introduction of performance indicator standards for electoral registration and, specifically, standards that encourage appropriate systems to discourage electoral fraud and encourage voter participation. Support from the Government was reinforced by the submission from the DCA.

The indicators

Indicator 1: Percentage of adult population included on the register of electors

5.13 It is understandable that EROs are concerned about the use of ONS data. They have significant doubts over the accuracy of the figures involved and do not want to have their performance judged, in part, by them. However, there is simply a lack of alternative population data, meaning that it would be unrealistic not to use ONS figures. The Commission agrees with the DCA that this will still prove to be a useful proxy registration figure, as the caveats can be clearly stated when presenting the findings. In addition, the difference between the population total and the eligible electorate total may be relatively small in many local authorities.

Indicator 2: Percentage of 16- and 17-year-olds (attainers) included on the register as a percentage of the total population of that age group

5.14 The same arguments for and against using the population data supplied by ONS apply as in Indicator 1.
As only a very small number of 16-year-olds can register, it might be more feasible, but still meaningful, to amend this indicator to cover only 17-year-olds included on the register. In addition, this indicator is perhaps best placed as a sub-set of Indicator 1, rather than an indicator in its own right.

Indicator 3: Recorded errors on the register (number and nature of clerical errors recorded at the last major election/annually)

5.15 The concerns about the utility of this indicator were largely well-founded. The fact that errors can now be amended up until 9pm on the day of poll means that it is extremely unlikely that any voter would be disenfranchised by a clerical error. In addition, it is hard to know how to quantify errors: would an error that left an entire street off the register count as one error? However, the Commission does sympathise with the view that it is useful to know about the quality of the recording of registration data. The issues with it as a quantitative indicator suggest the information may be better gained through qualitative methods.