
Data quality spot checks
Supplementary guidance for use of resources KLOE 2.2
Use of resources 2009/10

Contents

Introduction
Performance indicators
Selecting data for testing
Resource requirements
Spot check approach
Linking findings to use of resources judgements
Technical support
Appendix 1: risk assessment
Appendix 2: service-specific management arrangements
Appendix 3: data quality dimensions
Appendix 4: information about the indicator
Appendix 5: spot check approach


Introduction

1 This guidance is for auditors undertaking spot checks of performance information, which should support their judgement for use of resources KLOE 2.2.

2 The spot checks may be of performance indicators from the National Indicator Set, local performance indicators (PIs) or any other performance measure which lends itself to this style of testing. Auditors may choose to adapt this tool as appropriate to test other types of data if particular local risks are identified.

Background

3 The Audit Commission is committed to stimulating significant improvement in the quality of data and the use of information by decision makers.

4 Effective organisations measure their performance against priorities and targets in order to check how well they are doing and to identify opportunities for improvement. The performance information they use must be fit-for-purpose. The Audit Commission paper, In the know, published in 2008, defines fit-for-purpose information as being relevant, and of an appropriate quality and presentation for the decision being taken.

5 Good quality data is the foundation of good quality information. The paper: Improving Information to Support Decision Making: Standards for Better Quality Data, published jointly in 2007 by the Audit Commission, CIPFA, Audit Scotland, the Wales Audit Office and the Northern Ireland Audit Office, sets out standards for promoting good data quality. Poor data quality puts public bodies at significant risk of, for example, damaging public trust; weakening frontline service delivery; incurring financial loss or poor value for money; and undermining partnership working.

6 The Audit Commission study: Is there something I should know?, published in 2009, found that, although data quality had improved, only 5 per cent of councils were considered to have excellent data quality. Many acknowledged fundamental data quality problems, such as: duplication of data and inconsistency; lack of basic data; poor system design; and confusion over how and where data should be recorded. Many councils also had problems in sharing data with external partners. Staff did not always understand the Data Protection Act, and had misplaced concerns about confidentiality and data sharing.

Scope

7 Use of resources KLOE 2.2 focuses predominantly on arrangements for using fit-for-purpose information and securing data quality. Auditors undertake spot checks of selected data, based on their knowledge of local risks, as evidence to support this KLOE judgement. The spot checks help to confirm whether authorities are producing relevant and reliable data to support decision-making, and are ensuring the security of confidential data.


8 Auditors undertake spot checks at councils and fire and rescue authorities, but not at police authorities or PCTs (where they rely on other work). In addition to these locally selected spot checks, the Commission will continue to specify spot checks of particular data for housing benefits and fire performance indicators. The tools for these specified spot checks are available via the relevant links in the use of resources guidance.

9 The spot checks enable the auditor to consider, as part of the assessment of KLOE 2.2, whether an authority’s arrangements are working in practice and are applied consistently. The spot checks consider whether data is fit-for-purpose, by considering the arrangements to produce the data and testing a small sample of supporting records. The purpose of the spot checks is not to comment specifically on the published value of an indicator.

10 The spot checks assess data against the six data quality dimensions defined by the Audit Commission:

• accuracy;
• validity;
• reliability;
• timeliness;
• relevance; and
• completeness.

11 These dimensions underpin the voluntary standards set out in the joint paper: Improving Information to Support Decision Making: Standards for Better Quality Data.


Performance indicators

National indicators

12 The Department for Communities and Local Government (CLG) introduced a single set of national indicators (the National Indicator Set or NIS) from 1 April 2008, following the government's Comprehensive Spending Review 2007. The final Handbook of Definitions provides definitions for the NIS, which currently comprises 188 indicators. The annexes to the Handbook of Definitions provide useful information on how each NI is calculated and the information sources.

13 The NIS is the only set of indicators against which central government manages the performance of local government. It replaced all other sets of indicators, including Best Value Performance Indicators (BVPIs) and Performance Assessment Framework (PAF) indicators, from 1 April 2008.

14 Targets against the NIS are negotiated through Local Area Agreements (LAAs) at each single tier and county council local strategic partnership. Each LAA includes up to 35 targets from the NIS, complemented by 17 statutory targets on educational attainment and early years.

15 Further information on the NIS can be found on the National Indicator Set section of the Audit Commission’s website. This includes detailed guides and frequently asked questions on a number of national indicators, which have been chosen for inclusion in LAAs. These resources can help auditors to understand the indicators and plan their spot checks.

Local indicators

16 National indicators will not be the only way in which authorities manage performance, and authorities will also be using locally developed performance indicators and measures that align to their objectives and priorities.

17 The Audit Commission works with government departments, national agencies and local authorities to agree and define a wide range of local performance indicators and authorities may use these as the basis of some of their local measures.

Housing and council tax benefit indicators

18 The Commission has specified work on housing and council tax benefits data quality for councils with benefits services. From 2009/10, the approach to this work has been revised because of the earlier use of resources timetable, which is not in line with the certification deadline for housing and council tax benefits certification (BEN01). The data quality spot check is no longer integrated into the HB COUNT approach in the same way as in previous years. Auditors should now use the housing and council tax benefits spot check tool, available via use of resources KLOE 2.2. This involves undertaking some work on the arrangements for benefits data quality and using the results of HB COUNT certification testing from the previous year.

19 Auditors should refer to the tool for housing and council tax benefit data spot checks, available via the use of resources guidance for KLOE 2.2, for further information.

20 The spot checks of housing and council tax benefits data are undertaken in addition to the spot checks selected by the auditor based on local priorities and risks to support the assessment of use of resources KLOE 2.2.

Fire and rescue performance indicators

21 The Commission has specified work on fire data quality spot checks at all fire and rescue authorities (FRAs). This requires auditors to review selected fire NIS performance indicators.

22 Auditors should refer to the fire audit tool for spot checks of performance indicators, available via the use of resources guidance for KLOE 2.2, for further information.


Selecting data for testing

Risk assessment

23 The number and selection of items for spot checking depends on local data quality arrangements and risks identified by the auditor. Key risks influencing an auditor's selection may include:

• risks around delivery of the audited body's corporate priorities;
• risks identified in prior-year data quality assessments, for example PIs which were qualified based on the auditor's detailed work, or weaknesses identified in arrangements;
• consideration of high-risk and sensitive areas and functions not previously reviewed by the auditor, for example social care data;
• risks identified in any relevant systems work, for example information technology risks and automated and manual system risks;
• risks identified by other inspectorates or internal audit reviews;
• analysis of NIS and key local indicator outturns, for example, key LAA PIs with large year-on-year variances may be considered high-risk (see the sketch after this list);
• size and performance of the organisation;
• source of performance information: for example, a data item from a reliable national data provider may be considered low-risk;
• risks identified from auditor assessment against the UoR KLOE, for example weak partnership arrangements; and
• appointed auditor or comprehensive area assessment lead concerns about data quality and use of information.
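Where outturn histories are available, the year-on-year variance screen can be automated. The following is a minimal Python sketch, assuming hypothetical indicator names and outturns; the 25 per cent flagging threshold is an illustrative choice, not a Commission-prescribed figure.

```python
# Flag indicators whose year-on-year movement suggests higher data quality risk.
# Indicator names, outturns and the 25% threshold are hypothetical.

THRESHOLD = 0.25  # flag movements larger than 25% (illustrative choice)

# outturns: indicator -> (prior year value, current year value)
outturns = {
    "NI example: household waste recycled (%)": (38.2, 40.1),
    "Local PI: invoices paid within 30 days (%)": (91.0, 62.5),
}

def year_on_year_change(prior: float, current: float) -> float:
    """Relative change against the prior-year outturn."""
    return abs(current - prior) / prior

for indicator, (prior, current) in outturns.items():
    change = year_on_year_change(prior, current)
    flag = "HIGH RISK" if change > THRESHOLD else "low risk"
    print(f"{flag}  {indicator}: {change:.0%} movement")
```

Such a screen only ranks candidates for selection; the auditor's judgement on local arrangements and risk still determines which indicators are spot checked.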

24 Auditors should note that the purpose of spot checks is to test the audited body's arrangements for producing data that is fit for purpose. Auditors are not expected to test directly the data held by a third party; in these cases the auditor would focus on how the audited body gains assurance about the third party's arrangements for data quality. A spot check will therefore reveal more about data quality arrangements where the audited body, on its own or working with partners, directly produces, compiles and calculates data for the indicator.

25 Some national indicators are based on data produced, compiled and calculated entirely by a central government department or agency. Where this is the case, and particularly where the audited body has no access to the underlying data, a spot check is unlikely to reveal much about data quality arrangements within the audited body. For example, in general it will not be possible for audited bodies to have an individual data sharing protocol with central government, or to carry out ‘reviews’ of data quality or internal audit in central government. Evidence, testing, and conclusions, are likely to be limited with these types of indicator.


26 Auditors should therefore avoid selecting indicators to spot-check where the underlying data and indicator is produced, compiled and calculated entirely by central government departments or agencies (see example in Table 1). They may also wish to avoid selecting indicators where the data is produced, compiled and calculated entirely by other external organisations.

Table 1

Indicator description: NI 172: Percentage of small businesses in the area showing employment growth
Information source: Office for National Statistics (agency of central government)
Notes: Underlying data not available to councils. Numerator, denominator and indicator are published on the BIS (central government) website.

27 Auditors should also avoid selecting one of the 18 indicators which are collected in the Place Survey. The Audit Commission manages the collection and initial processing of the local data for Place Survey indicators, and the final indicators are published by CLG after detailed quality assurance. These indicators can therefore be considered low risk for data quality.

28 Auditors may find the diagram at Appendix 1 useful when considering risks for selecting data quality spot checks. For national indicators, auditors should check the information sources in the CLG Handbook of Definitions for indicators being considered for spot checks.

Sample sizes

29 The number of indicators selected for spot checking and the sample sizes used should be based on local circumstances and risk. The detail set out below is an indicative guide only.

Number of indicators

30 For low risk organisations, a spot check of two to three performance indicators, in addition to any indicators specified by the Commission, should be sufficient to confirm whether the organisation's arrangements for securing data quality are working in practice and are applied consistently.

31 For high risk organisations, the number of indicators selected in addition to those specified by the Commission should reflect the nature and significance of the risks identified. This will include considering the size of the organisation, whether weaknesses are generic across the organisation or specific to certain services or partnerships, and the consequence of any weakness: for example, has the weakness resulted in key decisions being made on the basis of information which is not fit-for-purpose, and therefore in poor value for money?

32 Auditors should collect data selected for spot checks, including NIS indicators, directly from the audited body. Where relevant, auditors should be satisfied the information they are testing is consistent with that used locally to manage performance, and with that reported externally, for example, to CIPFA, central government or in reports to partners or the public.

Sample sizes for detailed testing

33 When testing the underlying data supporting the individual performance indicator, sample sizes of ten can be used as a guideline, as this is likely to be sufficient to identify any significant errors and thus any significant data quality management arrangements issues. However, auditors should consider local circumstances, for example the nature of the data or population to be tested, in setting the size of samples for testing, and could increase the samples where appropriate.
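Selection of the ten items should be unpredictable to the service and repeatable on audit review. As a minimal sketch, assuming the underlying records carry hypothetical case references extracted from a systems report, a random sample might be drawn as follows.

```python
import random

# Draw a spot check sample from the underlying records supporting a PI.
# Record references are hypothetical; in practice they would come from the
# audited body's systems report or compilation document.
record_refs = [f"CASE-{n:04d}" for n in range(1, 251)]  # e.g. 250 cases in year

SAMPLE_SIZE = 10  # guideline sample size from paragraph 33

rng = random.Random(2009)  # fixed seed so the selection can be re-performed
sample = rng.sample(record_refs, SAMPLE_SIZE)

for ref in sorted(sample):
    print(ref)
```

A fixed seed is used so that the same selection can be re-derived later; any other documented, tamper-evident selection method would serve equally well.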


Resource requirements

34 Auditors should allow up to two days per indicator for the spot check work. Resourcing needs will depend on:

• the complexity of the data and the indicator;
• the clarity of the guidance supporting the performance indicator;
• local circumstances for recording and validating data; and
• cumulative auditor knowledge and previous data quality findings.

35 An experienced auditor or performance specialist with a good knowledge of the organisation and performance indicators should carry out this work.


Spot check approach

36 Spot checks of performance indicators consist of:

• applying a set of service-specific management arrangements questions;
• understanding the system used to collect and process the data in accordance with the PI definition; and
• testing the underlying data against the six data quality dimensions (accuracy, validity, reliability, timeliness, relevance and completeness) as applicable.

Service-specific management arrangements questions

37 Auditors should consider the service-specific management arrangements questions at Appendix 2 for the service producing the PIs being reviewed. The questions are based on the key principles set out in the Audit Commission joint paper: Improving Information to Support Decision Making: Standards for Better Quality Data. Auditors may wish to refer to the appendix of this joint paper for more explanation of the principles. Local circumstances may mean that some of the questions will not be applicable for every spot check. Auditors should use their professional judgement when considering these.

Understanding the systems

38 Before testing supporting data, auditors should:

• gain an understanding of the system used to collect and process the data in accordance with the PI definition (the definition will be set locally for local PIs or nationally for NIS indicators). This should include key controls and audit trails; and
• document the system. Documentation does not need to go to the same depth of detail required in opinion work for compliance with ISAs (UK&I) (such as flow charts), but should be detailed enough to give an understanding of the key risks to securing data quality.

39 Auditors should consider the following key questions when assessing the system:

• Does the system appear to be adequately designed to ensure the data is accurate, valid, reliable, timely, relevant and complete (see Appendix 3 for definitions)?

• Are there any issues which need to be addressed in the testing of the underlying data, for example are aspects of the data provided by a third party?

• Do the tests need to take account of specific risks at this organisation, for example, where system providers have changed?

• Do the samples to be tested need to be directed to particular aspects of the service or definition being measured, or the system used to collect and process the data?


Testing the underlying data

40 Testing the underlying data will help gain evidence about whether arrangements for securing the accuracy, validity, reliability, timeliness, relevance and completeness of the data are being applied. Appendix 3 summarises how auditors might test each of the data quality dimensions (the tests will vary depending on the type of data being tested).

41 Auditors may find it useful to complete the table at Appendix 4 to capture the key details of the performance indicator selected for testing. Note that for NIS indicators, the annexes to CLG's Handbook of Definitions provide most of the relevant information. Appendix 5 contains an outline of tests to follow when undertaking spot checks. Auditors should adapt these tests to address specific local risks or to test other types of data.

Evaluating the results of errors

42 Where initial testing identifies errors, auditors should evaluate the reasons for these errors and decide how they impact on their judgement of whether the authority's arrangements are adequate and are working in practice. Where auditors identify errors, they should consider the following questions:

• What do the errors tell me about this PI and the authority's management arrangements?
• Is the error significant?
• Is this an isolated error?
• Is there any pattern to the errors found?
• Have similar errors been identified in previous years?
• Do I have enough evidence to reach a conclusion or is further work required?

Further testing where errors are identified

43 The important question auditors should ask themselves as a result of their spot check testing is: "Do I have enough evidence to inform my judgement on the management arrangements?" Auditors need only collect enough evidence to support the purpose of the spot check: that is, to confirm whether the authority's data quality management arrangements are working effectively in practice. Auditors do not need to carry out additional work to calculate or to confirm corrected indicator figures.

44 Testing of additional performance measures may be required where auditors do not have enough evidence to reach a conclusion on data quality management arrangements. For example, if an error was found in an environment performance measure because the incorrect parameters were used, auditors may decide to extend their testing to see whether the correct parameters have been used in other environment performance measures. Further sample testing may be required where an error has been found, to understand the extent of the error in a particular performance measure. The extent and focus of further testing should be based on the nature of, and the risks attached to, the errors identified. However, auditors should be clear about the potential advantages and disadvantages before undertaking further testing.


Linking findings to use of resources judgements

45 Auditors assess data quality and use of information at KLOE 2.2 in the use of resources assessment. Auditors should use the results of their spot checks of selected data, including those specified by the Commission, as evidence to support their judgement on KLOE 2.2. The spot checks help to confirm whether audited bodies are producing relevant and reliable data to inform decision making, and ensuring the security of confidential data.

46 Auditors should also consider whether the findings and local risks impact on other KLOE. For example, how can an organisation produce relevant, timely and reliable financial monitoring and forecasting information (KLOE 1.3) when there are significant issues relating to the underlying data?

Reporting findings

47 Auditors will not normally prepare a separate report on data quality, as significant findings should be reflected in the use of resources reporting. However, auditors may also decide to feed back detailed findings as part of a local briefing or meeting.

48 Auditors will not be required to make a judgement on the accuracy of published indicators, as was previously the case for PI testing to support CPA service scores. However, to support the sharing of audit evidence and consistency processes, auditors will be expected to report their spot check findings to the Commission via EDC. This will include any significant concerns about the accuracy of the data or performance indicators.


Technical support

49 Auditors in the Commission's audit practice should first seek to resolve technical queries locally on their patch. Any unresolved queries should be emailed to Audit Commission Technical Support (ACTS) via the ACTS portal.

50 Firms' auditors should first seek to resolve technical queries using the firms' own support service. The firms' technical contacts should email any unresolved technical queries to Audit Commission Technical Support (ACTS) via the ACTS portal.

51 Details are available in the Audit Commission's technical support protocol. The request to ACTS should include the following information:

• information sought from ACTS;
• relevant background information; and
• steps already taken to try to resolve the query.

52 ACTS will log and monitor all queries and respond to them within three working days (ten working days where there is a need for a more substantive answer or further research).

Appendix 1: risk assessment

The risk assessment flow runs from risk identification, through consideration of local arrangements, to selection of data for testing.

Risk identification draws on:
• prior year DQ and UoR assessments;
• Code systems work;
• other inspectorates/IA findings;
• analysis of NIS and local PIs;
• DA/CAAL concerns;
• source of data (eg partnership); and
• data quality concerns.

These feed consideration of local arrangements and risk, for example:
• How does the organisation apply the core principles in the data quality standards?
• What has the organisation done to mitigate specific risks to data quality?
• What arrangements are in place to ensure the quality of partnership data?

This consideration informs the selection of data for testing. The resulting data quality spot checks provide evidence to support the spot check conclusions and the use of resources judgement.


Appendix 2: service-specific management arrangements

1 The following service-specific management arrangement questions should be considered in relation to each performance indicator reviewed. Local circumstances may mean that some questions will not be applicable for every spot check. Auditors should use their professional judgement when considering which are relevant.

2 Auditors should consider responses to each relevant question alongside the results from data testing of each PI when considering whether the organisation’s arrangements for securing data quality work in practice and are applied consistently.

Service-specific spot check questions

Governance and leadership

3 Evidence the organisation has put in place a framework for management and accountability of data quality:

• How has data quality, in relation to this indicator, been monitored and reported during the year?
• Has the service taken action to address the results of any previous internal or external reviews of data quality?
• Who is accountable for data quality for this indicator, and how is this made clear?
• Where there is joint working, are there sound governance arrangements based on risk, for example is there an agreement covering data quality with relevant partners?

Policies

4 Evidence the organisation has put in place policies and procedures to secure data quality:

• What guidance has been used to record the data and calculate this indicator? Does it comply with relevant national standards or legislation, for example the Data Protection Act, and is it up-to-date?
• Do staff have ready access to guidance and support for recording data?
• How does the organisation ensure policies and procedures are applied consistently?

Systems and processes

5 Evidence the body has systems and processes which secure data quality:

• Is manual intervention minimised in producing the indicator?
• Does the organisation consider its arrangements for collecting, recording, compiling and reporting the data as part of business planning and management processes?
• Do systems have adequate controls to minimise human error, prevent erroneous data entry and unauthorised entry or manipulation (for example, password controls, audit trails, number of people able to input data)?
• Where appropriate, do the arrangements ensure that data for the indicator is collected and recorded in one system or database only (using COUNT principles)?

People and skills

6 Evidence the body has arrangements to ensure staff have the knowledge, competencies and capacity for their roles regarding data quality:

• Have staff received any data quality training during the year, or training directly related to data needed for this indicator or service?
• Are data quality standards set, for example, through a common competency framework, and are staff assessed against these?
• How does the service ensure consistency in recording data when several people are involved in this task?

Data use and reporting

7 Evidence the body has arrangements to ensure data is subject to a system of internal control and validation:

• Is the data used for producing performance information also, where appropriate, used for day-to-day management of the organisation's business?
• Is the indicator or PI data supported by a clear and complete audit trail?
• Is the data subjected to appropriate levels of verification and senior management approval?


Appendix 3: data quality dimensions

Accuracy: Is the data sufficiently accurate for the intended purposes?

Validity: Is the data recorded and used in compliance with relevant requirements?

Reliability: Does the data reflect stable and consistent collection processes across collection points and over time?

Timeliness: Is the data up-to-date and has it been captured as quickly as possible after the event or activity?

Relevance: Is the data captured applicable to the purposes for which it is used?

Completeness: Is all the relevant data included?

Assurances across these dimensions include: the correct numerator and denominator are used, and the calculation is correct and consistent with the PI definition; figures agree to systems reports or compilation documents; the correct data has been included or excluded, and it is accurate, complete and up-to-date; and data has been collected, recorded and reported consistently for the correct year (eg appropriate cut-off procedures have been applied), using the same methods across the period and across collection points.
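Some of these checks lend themselves to simple automation once the sampled records are extracted. The Python sketch below illustrates cut-off (completeness) and timeliness checks over hypothetical sample records; the field names, the 2008/09 measurement period and the 30-day recording lag standard are assumptions for illustration only.

```python
from datetime import date

# Hypothetical sampled records supporting a PI: each has an event date and
# the date it was recorded in the system.
records = [
    {"ref": "CASE-0012", "event": date(2008, 5, 14), "recorded": date(2008, 5, 16)},
    {"ref": "CASE-0077", "event": date(2009, 1, 3),  "recorded": date(2009, 2, 20)},
    {"ref": "CASE-0145", "event": date(2009, 4, 2),  "recorded": date(2009, 4, 3)},
]

# Measurement period for the indicator (assumed 2008/09 financial year).
PERIOD_START, PERIOD_END = date(2008, 4, 1), date(2009, 3, 31)
MAX_RECORDING_LAG_DAYS = 30  # illustrative local timeliness standard

for rec in records:
    issues = []
    if not PERIOD_START <= rec["event"] <= PERIOD_END:
        issues.append("outside measurement period (cut-off/completeness)")
    lag = (rec["recorded"] - rec["event"]).days
    if lag > MAX_RECORDING_LAG_DAYS:
        issues.append(f"recorded {lag} days after event (timeliness)")
    print(rec["ref"], "OK" if not issues else "; ".join(issues))
```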

Appendix 4: information about the indicator

Indicator description:

Purpose of indicator:

Numerator
Source:
Value:

Denominator
Source:
Value:

Indicator calculation:

Outturn
Reported:
Target:

Central return details (if applicable)
Return format (e.g. per cent):
Measurement period (e.g. 08/09 financial year):
Decimal places:

Note: for national indicators, the annexes to the CLG Handbook of Definitions provide most of this information.
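Where auditors keep working papers electronically, the template above can be captured as a simple record. The following is a minimal Python sketch; the field names paraphrase the template and are not a prescribed Commission schema.

```python
from dataclasses import dataclass

@dataclass
class IndicatorDetails:
    """Working-paper record mirroring the Appendix 4 template (illustrative)."""
    description: str
    purpose: str
    numerator_source: str
    numerator_value: float
    denominator_source: str
    denominator_value: float
    calculation: str              # e.g. "numerator / denominator * 100"
    outturn_reported: float
    outturn_target: float
    return_format: str = "per cent"
    measurement_period: str = "08/09 financial year"
    decimal_places: int = 1
```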


Appendix 5: spot check approach

1. Management arrangements

Answer the service-specific management arrangements questions in Appendix 2, after checking that they are all relevant for this performance indicator.

2. Systems

Obtain an understanding of the system used to collect and process the data in accordance with the performance indicator definition.

3. Testing

Test a small sample of supporting records using the following test guidelines.

Test 1 - Test the accuracy of the PI calculation

Refer to the performance indicator definition, and confirm the correct numerator and denominator are used. Check all figures back to systems reports or compilation documents and re-perform the calculation.
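Re-performing the calculation is straightforward to script. A minimal Python sketch follows, assuming a percentage indicator with hypothetical figures; rounding follows the decimal places specified in the central return details (see Appendix 4).

```python
# Re-perform a percentage PI calculation and compare it with the reported
# outturn. Figures are hypothetical; in practice they come from systems
# reports or compilation documents.
numerator = 4_812        # e.g. cases meeting the PI definition
denominator = 11_954     # e.g. all eligible cases
reported_outturn = 40.3  # value published by the audited body
DECIMAL_PLACES = 1       # from the central return details

recalculated = round(numerator / denominator * 100, DECIMAL_PLACES)

if recalculated == reported_outturn:
    print(f"Re-performed: {recalculated}% agrees with the reported outturn.")
else:
    print(f"Discrepancy: recalculated {recalculated}% vs reported {reported_outturn}%.")
```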

Test 2 - Test the completeness and accuracy of the numerator

Check supporting figures back to the system via the audit trails. Where intermediate calculations or adjustments have been performed to arrive at the required figure, check the arithmetic and review for reasonableness. Select a sample of items and ensure:

• the data is complete for the whole year;
• the data is complete for all methods of collection (for example where data is collected from different sites);
• the data has been recorded accurately and is in accordance with the PI definition;
• the data is valid, for example the correct data has been included or excluded; and
• where samples have been used to calculate the numerator, these are reasonable and follow relevant requirements of the PI definition.
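The whole-year and all-sites completeness checks can also be scripted where submissions are logged. A minimal sketch, assuming hypothetical site names and monthly submission logs for the 2008/09 year:

```python
# Check that numerator data is complete for the whole year and for every
# collection site. Site names and submission logs are hypothetical.
MONTHS = [f"2008-{m:02d}" for m in range(4, 13)] + [f"2009-{m:02d}" for m in range(1, 4)]

# site -> months for which data was submitted
submissions = {
    "North depot": set(MONTHS),
    "South depot": set(MONTHS) - {"2008-12"},  # a gap to be investigated
}

for site, months in submissions.items():
    missing = sorted(set(MONTHS) - months)
    if missing:
        print(f"{site}: missing submissions for {', '.join(missing)}")
    else:
        print(f"{site}: complete for the whole year")
```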

Test 3 - Test the completeness and accuracy of the denominator

Check supporting figures and totals back to source, for example via system audit trails, system reports or control totals. Where intermediate calculations or adjustments have been performed to arrive at the required figure, check the arithmetic and review for reasonableness and accordance with the PI definition. Consider the need for any further detailed testing.

Overall conclusion - Does the reported PI comply with the Audit Commission's data quality criteria?

Consider whether the reported PI complies with the Audit Commission's data quality criteria (accuracy, validity, reliability, timeliness, relevance and completeness) and whether there are any specific data quality concerns.

Consider impact on use of resources judgements (KLOE 2.2)

How does your conclusion on the spot check for this indicator impact on your judgement on the relevance and reliability of data used to make decisions, and on data security? If there are data quality concerns, do they impact on any other KLOE?