Survey on Stakeholder Engagement: Reporting Back, Eduardo Esteban Romero Fong
Description: Presentation by Eduardo Esteban Romero Fong, General Coordinator, Regulatory Impact Assessment, Federal Commission for Regulatory Improvement, Mexico, at the 6th Expert Meeting on Measuring Regulatory Performance: Evaluating Stakeholder Engagement in Regulatory Policy, Reporting Back, Breakout Session 2, The Hague, 16-18 June 2014. Further information is available at http://www.oecd.org/gov/regulatory-policy/
- Survey on Stakeholder Engagement: Report from the Breakout Session, The Hague, Netherlands, June 17, 2014
- Consultation
Principle 2: Adhere to principles of open government, including transparency and participation in the regulatory process, to ensure that regulation serves the public interest and is informed by the legitimate needs of those interested in and affected by regulation. This includes providing meaningful opportunities (including online) for the public to contribute to the process of preparing draft regulatory proposals and to the quality of the supporting analysis. Governments should ensure that regulations are comprehensible and clear and that parties can easily understand their rights and obligations.
Principle 6: Regularly publish reports on the performance of regulatory policy and reform programmes and the public authorities applying the regulations. Such reports should also include information on how regulatory tools such as Regulatory Impact Assessment (RIA), public consultation practices and reviews of existing regulations are functioning in practice.
- Clarity (1/2): Are all questions on stakeholder engagement clear?
Questions: Questions alone may not gather all the facts; do requirements guarantee implementation? It is not possible to check information on every question. The questionnaire is organized with a set of questions for the specific stages of consultation; there could be a FAQ covering all relevant issues. Do we ask for the types of documents or the specific document? The same goes for groups: is it types or specific groups?
Format: It is helpful to have options; we need to ensure the right order or hierarchy and avoid, as much as possible, the use of the option "other". Provide specific spaces for comments.
Do any questions leave room for interpretation?
- Clarity (2/2): Are all questions on stakeholder engagement clear?
Definitions & Scope: Ensure that definitions are equally clear to all; recognize differences among legal systems. The central level of government excludes Parliament.
Statistics: Provide clarity on statistics: supply data or percentages and ensure comparability across countries. Consider countries' ability to provide data. The steering group will decide what to do with statistics and their presentation.
Other Suggestions: Provide clarity on primary legislation. Add a question on the percentage of laws that originate in the Executive. Include the Official Gazette. For proposed regulations, use the word "issued" instead of "drafted". Use the term "open to the general public" instead of "any member of the public".
Do any questions leave room for interpretation?
- Evaluation: Why do we need indicators on consultation?
Comparison: Comparison makes it possible to implement successful practices from other countries and raises interest from decision makers and high-level officials in changing what needs to be changed.
Progress: It also helps to look at evolution or progress within countries.
Identify areas of improvement: Compare and contrast. Identify the baseline and the best practice, and even targets.
- Data: How can the REG indicators be complemented with data/indicators from other sources?
Use data from the parliamentary process. Reach people without Internet or technology access: identify who the audience is, who is excluded, and who should be included. Full implementation cannot be covered in the survey, but we might be able to identify the reasons for differences. Quantity vs. quality: use complementary sources, for example perception surveys, and use groups of indicators. These indicators provide information for other indicators and databases, for example the Better Life Index. Explain the reasons for the differences among practices.
- Tools: How would you use a good practice database to complement the indicators? What could it look like?
Information about the budget used to implement those practices would be useful, to give an idea of the resources needed. Who are the experts or practitioners, and where are they? Provide contacts. Consider the Observatory of Public Sector Innovation (OPSI) database. How should practices be selected? What are the differences between innovative and good (or useful) practices? The database should be searchable, easy to find and user-friendly. Database or evidence? Be careful to keep it updated, perhaps by including the date. Standardize and describe processes.
- Aggregation: Which questions could be grouped together?
Pros & cons: simple weights, or expert, statistical or self-selection of weights? Make it difficult to game the indicators. They should be comparable over time. Track progress back from the 2008 indicators, where possible. Keep them simple to understand.
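As an illustration only (not part of the breakout discussion), the "simple weights" option mentioned above can be sketched as a weighted average of sub-indicator scores. The sub-indicator names, scores and equal weights below are hypothetical, not taken from the OECD survey:

```python
# Illustrative sketch of a simple-weights composite indicator.
# Sub-indicator names, scores, and weights are hypothetical examples,
# not values from the actual REG indicators.

def composite_score(scores, weights):
    """Weighted average of sub-indicator scores (each on a 0-1 scale)."""
    assert len(scores) == len(weights)
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_weight

# Hypothetical country scores for three consultation sub-indicators:
# transparency, participation, and feedback.
scores = [0.8, 0.6, 0.4]
weights = [1, 1, 1]  # "simple weights": equal weighting

print(round(composite_score(scores, weights), 2))  # 0.6
```

Equal weights keep the index simple to understand and harder to game than ad hoc weighting, which is one of the pros noted in the session; the trade-off is that all sub-indicators are treated as equally important.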