At the close of the Open Government Data Programme, Stats NZ commissioned MartinJenkins to carry out an independent review of the Open Data Programme. The closure report includes direct comments and feedback from the interviews, and analysis from the independent reviewer.
Independent review of the Open Data Programme [PDF 381 KB]
This independent post-programme review was commissioned by Stats NZ at the completion of the Open Data Programme (ODP) in its current form. The review focuses on 2016-2020 when the ODP was housed at Stats NZ. The review assesses the achievements of the ODP against its expected outcomes, examines the ODP performance against the delivery and realisation of benefits, and identifies lessons learnt.
The ODP is a cross-government programme designed to accelerate the release and reuse of open government data to maximise the value of that data. The ODP began in 2011 to support the adoption of the New Zealand Government Open Access and Licensing Framework (NZGOAL). The programme was hosted first by the State Services Commission (SSC), then Land Information New Zealand (LINZ), before moving to Stats NZ in 2015.
The ODP worked transparently and much of its work was online, mostly on data.govt.nz and the Open Government Partnership webpages, and previously on the Open Data Programme pages of ict.govt.nz. Its contribution to open government in New Zealand was also noted. The uniqueness of the ODP’s home in a national statistics agency was seen as an advantage, due to the wealth of data knowledge the programme could draw on and Stats NZ’s international links.
The ODP achieved greater awareness of data as an asset and greater availability of government data, and there is anecdotal evidence of an increased use of government data. The ODP with its original purpose remains topical and relevant for New Zealand but can be further refined to be more targeted and strategic, and to increase impact both for data holders and data users.
Having a Cabinet-mandated, voluntary approach to opening data has proved advantageous, though its limitations were noted. Instead of being a compliance exercise for agencies, the ODP triggered a real organisational culture shift in relation to data governance.
In implementing the ODP, Stats NZ has mainly played the role of an advocate and facilitator. It fostered networks of users, brokered relationships between data holders and users, raised awareness among government agencies and within the user community, advocated for data release, and supported capability building. However, some stakeholders expected more guidance and leadership, especially with and within the Government.
The ODP would have benefitted from defining its outcomes and benefits as well as relevant measures at the start. The programme’s approach and activities would then have been more oriented to what needs to change. A more outcome-oriented programme would have been easier to implement as the structure and activities would have followed the outcomes.
As the ODP is a cross-government initiative, it would have been helpful to consider a stronger all-of-government perspective in formulating its outcomes and benefits and in planning its activities. As an all-of-government initiative, effort and delivery could also have been more aligned, or even centralised. “Government helping government” approaches could have been adopted, with larger agencies encouraged to help smaller agencies or to work together on datasets of interest to those agencies.
Resilience is an important issue for multi-year programmes. Continuity and succession planning for staff needs to be built into the programme from the start. The ODP would have benefitted greatly from capability analysis, capability building and succession plans from the beginning.
The Programme has learnt that one-size solutions do not fit the variety of stakeholders. Tailored solutions need to be developed to approach smaller agencies, get their buy-in and help them stocktake and release their data. Better understanding of the needs and capabilities of customers would help assess the baseline and identify the appropriate points of impact.
New Zealand’s Open Data Programme has been developed and implemented in an international context and is part of a global movement. Being an active participant in this international movement has brought mutual benefits to New Zealand and its overseas partners. New Zealand is seen as a reputable and innovative partner, its inputs are incorporated into international guidelines, and its experience is widely shared and replicated.
This report provides findings from an independent review of the Open Data Programme (ODP) at Statistics New Zealand (Stats NZ), carried out by the independent consultants MartinJenkins.
This independent post-programme review was commissioned by Stats NZ at the completion of the Open Data Programme in its current form. The review focuses on 2016-2020 when the ODP was housed at Stats NZ.
The review looked into the programme collateral, outcomes and benefits framework, budget bid and other documents to assess the operation and performance of the programme. The review primarily assessed the achievements of the ODP against its expected outcomes and examined the ODP performance against the delivery and realisation of benefits.
The review also identified lessons learnt that may inform the work of future cross-government data programmes.
The review used a mix of methods to gather feedback from a range of perspectives:
| Name | Organisation | Position |
|---|---|---|
| Sean Audain | Wellington City Council | City Innovation Lead |
| Ania Calderon | Open Data Charter | Executive Director |
| Eleisha Hawkins | Stats NZ | General Manager, System and Partnerships |
| Daria Kwon | Stats NZ | Senior Manager, Data Leadership and Capability |
| Rachel Milicich | Stats NZ | Deputy Government Statistician, Deputy CE, Insights and Statistics |
| Jocelyn Morrison | Stats NZ | Senior Advisor, Data Leadership and Capability |
| Tracy Parsons | New Zealand Ministry of Foreign Affairs & Trade | Manager, Knowledge, Information, and Analytics Unit |
| Suzanne Snively | Transparency International | Chair, Transparency International New Zealand |
| Paul Stone | Stats NZ | Senior Advisor, Data Leadership and Capability |
| Evelyn Wareham | New Zealand Ministry of Business, Innovation, and Employment | Chief Data and Insights Officer |
In this section we summarise the key findings and recommendations from the review. Specifically, these relate to:
The Open Data Programme (ODP) is a cross-government programme designed to accelerate the release and reuse of open government data to maximise the value of that data. The ODP began in 2011 to support the adoption of the New Zealand Government Open Access and Licensing Framework (NZGOAL). The programme was hosted first by the State Services Commission (SSC), then Land Information New Zealand (LINZ), before moving to Stats NZ in 2015. During the transition, the ODP was run as a partnership between LINZ and Stats NZ. By 2017 the move was complete, and Stats NZ took over the operation of the ODP in full.
In 2017, the Open Data Action Plan was published following a public consultation, contributing to Commitment 4 of the Open Government Partnership’s second National Action Plan (which started in 2016). The action plan sets out goals and initiatives for the ODP until 30 June 2020, when the programme came to an end. The ODP was also updated to include initiatives for implementing the principles of the international Open Data Charter, a specific initiative under the Open Government Partnership.
Stakeholders have a strong common understanding of the objectives and purpose of the ODP. The programme looks to sustainably transform the government data culture by supporting agencies to open datasets and by improving their ability to manage datasets for use by third parties. The systematic and sustainable release of open data should become business as usual (‘open data by default’ or by design), increasing the net value of data for New Zealand and accruing economic, social, and environmental benefits.
While the interviewees said that the objectives and purpose of the programme have not changed, they noted that the ODP lost political priority as it moved from agency to agency. Since 2016-2017, various data-related initiatives and programmes (e.g. Chief Data Steward, Open Government Partnership) have been launched, and it has been a challenge to keep the ODP focused, to distinguish it from these initiatives, and to find its place among them. Additionally, the 2017 election created a new context for the ODP, with more focus on data sovereignty and data ethics.
The described developments may have added some uncertainty and confusion about the original purpose of the programme and its role, especially for new staff and new participants.
Previous versions of the Open Data Programme provided strong foundations, due to more than seven years of work from dedicated staff. There were earlier examples of the usual programme frameworks and operating models: governance and advisory groups, surveys, measures, intervention logics, reporting frameworks, agendas, minutes, public events, contact lists, work plans and budgets.
However, the ODP that began in 2017/18 had less of the typical programme planning and monitoring documentation than would usually be expected of a programme of this size. Following a restructure in 2018 and a change of management, work on benefits frameworks and measures for the ODP commenced. However, these were not completed until 2019, which made evaluating the programme much more difficult.
Stats NZ interviewees reflected that measures and reporting against measures should have started much earlier and would have greatly aided prioritisation of work. Even though decisions were made prior to the ODP joining Stats NZ, external interviewees wondered why surveys and reporting of the achievements of the ODP were not continued.
Having more robust processes, programme structure and reporting lines would have benefited the programme’s monitoring, and resulted in stronger programme disciplines from the start of the programme. This would have resulted in clearer communication to external stakeholders and better alignment with the internal processes of Stats NZ.
The ODP worked transparently and much of its work was online, mostly on data.govt.nz, the Open Government Partnership webpages, and previously the Open Data Programme pages of ict.govt.nz. Its contribution to open government in New Zealand was also noted.
The uniqueness of the ODP’s home in a national statistics agency was noted and seen as an advantage due to the wealth of data knowledge the programme could draw on, and Stats NZ’s international links.
The Stats NZ Investment Board saw the ODP as “business as usual”, and the programme was therefore not subject to a heavy programme management or monitoring burden. The monthly reporting through the leadership meetings was also light touch and quite informal.
All interviewees emphasised the efforts and performance of the ODP team. The programme benefitted greatly from having dedicated and inspiring staff who have had a clear vision of what the ODP was about and communicated it effectively to data holders and data users. The ODP staff have developed a solid reputation in the data world which helped in gaining programme participants (i.e. government agencies), creating wide networks and increasing visibility of the ODP, nation-wide and overseas.
Funding and management tracked well against the implementation plan of the programme. The management was ‘adaptive’ in the sense that the implementation plan took into account feedback from stakeholders. There was also great transparency around the implementation of the programme, with the action plan published on the ODP website and the delivery tracked against it, providing details on time-bound tasks.
Delivery was on time, and the funding withdrawal process was well organised. According to the internal monthly reporting (Data System Leadership Dashboard) at Stats NZ, the ODP tracked well against the implementation plan (7 or 8 out of 10 points on track to achieve outputs). The only difficulty was finding procurement partners and things to invest in, which created a risk of underspending every year. Once procurement partners were decided and the activities could be carried out, the programme managed to close with very little underspend each year.
The reporting on the implementation of the ODP was on a monthly basis, via the senior leader meetings and the internal Data System Leadership Dashboard. This reporting was purely operational and focused on activities planned and executed. It did not cover outcomes and benefits but mentioned the preparation and completion of stocktakes, which is one of the performance measures.
More outcome-oriented reports were submitted to the Investment Board annually. These reports, and the Investment Board itself, proved to be a valuable sounding board for outcomes and benefits. The Investment Board noted the lack of a benefits framework in the 2016/17 budget bid. At its insistence, benefits of the ODP and corresponding measures were developed and finalised by the end of 2018 and then reported regularly to the Investment Board.
Prior to the ODP’s move to Stats NZ, participating agencies and data champions had to report annually to Cabinet on the progress of open data released by agencies. Some interviewees argued that this reporting was a useful incentive, as it kept up the pressure to release government data and showed where each organisation stood in comparison to other participants. Unfortunately, this practice ceased, though that decision was made prior to the ODP moving to Stats NZ.
Stakeholders agree that the visibility and outreach approach chosen by the ODP team was very effective and successful. The approach involved many face-to-face engagements and was multipronged. The physical presence and active participation of the ODP team at many events helped the ODP become credible with the data community and agencies.
The ODP staff connected directly with the government agencies to advocate for the value of open data and explain its benefits. Lunch meetings, seminars and workshops were set up both for groups of data holders and for individual agencies. As a result, a number of agencies worked with the ODP, and the knowledge and awareness of open data within the government has grown significantly. The ODP team also spoke directly to the relevant units in agencies about making data accessible, which helped the team to understand the specific needs of the data holders and design an approach to meet them.
A large number of meetups, forums and conferences were organised for data holders and data users. These events targeted non-government stakeholders also and were effective in bringing data holders and data users together (see Table 2 below). The range of events catered for different types of audiences, and most of them were fully subscribed.
We note that these events seemed to be more engaging for government actors and business users (entrepreneurs). Less entrepreneurial users (e.g. banks, researchers, economists) were less involved in, and less often considered for, these events.
The ODP staff also raised awareness of open data within Stats NZ, demonstrating the value of open data and providing a strong narrative of how it can be used.
Meetups stand out as an especially effective format of outreach, communication and awareness raising among the broader range of stakeholders. Data holders could meet (potential) data users in an informal setting and explain and discuss needs and challenges on both sides. This helped to build mutual trust and cooperation. Ultimately, meetups could lead to creating an open data community in New Zealand.
Meetups have been run in four cities: Wellington (since 2017) and Auckland, Christchurch, and Dunedin (since 2018). The mailing list for the meetups grew significantly over the course of the programme, though actual participation in meetups was smaller.
Table 2: Subscription and participation in meetups
| City | Measure | 2017 | 2018 | 2019 | 2020 |
|---|---|---|---|---|---|
| Auckland | Meetup subscription | 0 | 216 | 346 | 431 |
| Auckland | Meetup participation | 0 | 111 | 47 | N/A |
| Christchurch | Meetup subscription | 0 | 63 | 107 | 116 |
| Christchurch | Meetup participation | 0 | 39 | 33 | N/A |
| Dunedin | Meetup subscription | 0 | 53 | 86 | 102 |
| Dunedin | Meetup participation | 0 | 36 | 5 | N/A |
| Wellington | Meetup subscription | 249 | 380 | 503 | 524 |
| Wellington | Meetup participation | 77 | 87 | 63 | N/A |
Note: In 2020, a virtual meetup was conducted instead of a physical one.
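As an illustration of the subscription trend, the figures in Table 2 can be summarised as year-over-year growth rates. The following is a quick sketch using the table's own numbers, not part of the review's original analysis; zero-subscription years are omitted to avoid zero baselines:

```python
# Year-over-year growth in meetup subscriptions, using the
# subscription counts reported in Table 2 (2017-2020).
# Years with 0 subscriptions are omitted so growth is well defined.
subscriptions = {
    "Auckland":     {2018: 216, 2019: 346, 2020: 431},
    "Christchurch": {2018: 63,  2019: 107, 2020: 116},
    "Dunedin":      {2018: 53,  2019: 86,  2020: 102},
    "Wellington":   {2017: 249, 2018: 380, 2019: 503, 2020: 524},
}

def growth_rates(series):
    """Return {year: percent growth over the previous year}, rounded to 1 dp."""
    years = sorted(series)
    return {
        year: round(100 * (series[year] - series[prev]) / series[prev], 1)
        for prev, year in zip(years, years[1:])
    }

for city, series in subscriptions.items():
    print(city, growth_rates(series))
```

Wellington, the longest-running meetup, shows growth slowing sharply by 2020 (roughly 4% over 2019), consistent with the report's observation that subscription growth outpaced actual participation.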
Beyond the growing interest in events, there are no indications of the outputs and impact of the meetups (e.g. what types of cooperation result from meetups, if any; what types of solutions are developed and/or implemented; whether there is a multiplier effect). It is also not clear how participation in the meetups developed and evolved, as this information does not seem to have been collected: for example, the proportion of data holders versus data users, recurrent participants (from the same organisation) versus new participants, and participation of Māori and Pacific peoples.
The use of case studies has been a simple and creative way of raising awareness and advocating for open data, as well as documenting the achievements of the ODP. The ODP team has captured four case studies so far, continuing to build an evidence base for the ODP. These case studies document success stories and best practices in opening and reusing data. They also record the actual value of open data in specific situations, helping to overcome the measurement problem. At the same time, they are used to promote the ODP by providing good examples and incentives for agencies and by demonstrating practical and effective ways of opening and using data.
Inventorying datasets has been a challenging exercise for many participating agencies. However, it has been a useful and necessary first step in the process of opening data, and a catalyst of culture change. Inventories and associated actions triggered a different way of thinking about data (as an asset) and about the agencies themselves (as data custodians).
The network of data champions has been very valuable. Data champions are agents of change within organisations holding data. They can create a way into, and within, the organisation and they can also ask for help from the ODP.
A tendency was noted for organisations to appoint data champions at lower (operational) levels of the organisation, rather than second- or third-tier managers. While this ensures that things get done, it may also reduce the champions’ effectiveness, as they need support from senior management.
At the start of the ODP, there were regular meetings of the data champions’ network, annual reports to Cabinet, and cross-agency cooperation. The meetings were held at different levels, both senior and working. This structure was discontinued at some point, probably because it was considered too onerous, though contact with the data champions was maintained. Currently, only seminars for data champions are held. Some interviewees indicated that the seminars, while useful, are often quite long and difficult to attend. They also indicated that the previous structure provided more guidance on what was expected of them as data champions.
The ODP is based on the Declaration on Open and Transparent Government approved by Cabinet on 8 August 2011. This means that the programme is operating through a Cabinet mandate, not a legislative one. This may have downgraded, or contributed to a perceived downgrading of, the programme’s importance over time though it gave the ODP more flexibility to use influence and other soft leadership methods. Some interviewees noted that the ODP seemed to go down in the hierarchy of priorities as it was moved from agency to agency, and that it could have been more effective if housed at an appropriate level with more monitoring and statutory requirements. Additionally, gaining a legislative mandate is not straightforward or fast, and may have further delayed the work of the ODP.
Stakeholders indicated that it was difficult to keep the ODP focused on its original objectives and to communicate these objectives across the government due to integration of the ODP in the broader Stats NZ data leadership and capability hub. This has led to some confusion for Stats NZ staff, especially new staff, because the broader data stewardship objectives are similar to the ODP (e.g. confusion between open data and data sharing) and because the interplay of the programmes is not clear.
The ODP has had only a small team at both LINZ and Stats NZ: only 3-4 core staff worked on the ODP full time. Because of this, some of the programme's work had to be outsourced to contracted data experts and Stats NZ staff. While outsourcing worked well and was cost-efficient, Stats NZ staff may have been confused, given the broader data initiatives and the general data work of Stats NZ. However, the operating model of a small core team with outsourced data skills was a deliberate choice for the ODP. The consequences of this decision may have made it more difficult to manage expectations about how fast the ODP could bring culture change across government, and left less time for reporting on the effectiveness of the programme. There was also less funding available to support change in agencies.
Much of the public profile and knowledge of the programme was held by one staff member, rather than shared across the team. One Stats NZ interviewee reflected that this was a risk to the programme and a risk to Stats NZ, and succession planning should commence to manage this risk. One interviewee also noted that the small staff numbers limited the outreach of the ODP because they could not be in several places at the same time.
One of the challenges for the ODP has been a series of temporary acting arrangements in senior management, which affected the momentum of the programme. Properly inducting acting senior staff is difficult, and much knowledge is lost when these staff change often. The short-term involvement of staff in acting roles could not ensure the necessary buy-in and ownership, affecting the leadership and progress of the programme.
The ODP is not considered core business for Stats NZ, and it is seen as an external project by internal stakeholders. It appears that even after several years at Stats NZ, the ODP is still operating in a silo, its visibility is limited and internal connections to the rest of Stats NZ are weak. The ODP was not properly integrated in Stats NZ when moved from LINZ, and as a consequence wider organisational buy-in may be lacking. This diminishes the strategic importance of the ODP and the support for its operation – which is crucial in the context of Stats NZ being Government Chief Data Steward and the ODP set up to influence all of government.
The ODP had to adjust to the ways of working of the wider Stats NZ. The ODP may have been slowed down by internal communications with Stats NZ, as it was often necessary to explain actions and activities under the programme and subsequently adjust them. Although the ODP came with its own separate budget, it had to re-align with Stats NZ requirements and provide planning and reporting for its budget. Potentially, neither the integration into Stats NZ nor the adjustment of the ODP’s ways of working was done well, resulting in a somewhat limbo situation.
The development of benefits and measures was a challenging task for the ODP. The ODP budget bid of 2016 was rather vague on potential benefits and did not offer any measures for them. Only at the end of 2018 was a benefits framework agreed with the Investment Board, and the final version of benefits and measures was published in January 2019. The main difficulty was in understanding what can be measured and how it would cover the programme’s outcomes.
Having benefits and measures clarified early on could have helped to focus the programme and its resources: both the staff and the participating agencies would have had more clarity about what they need to achieve. It would have also helped to distinguish the ODP from other data-related programmes and initiatives.
The development of measures for open data programmes is an internationally recognised challenge. It is difficult to determine suitable measures among the different options of what can actually be measured (e.g. economic value of open data, shifts in behaviour, number of open datasets, number and types of reuse). Some important benefits (e.g. organisational structure, change in processes) cannot be quantified at all. The benefits that can be measured are often very expensive and time-consuming to monitor and quantify. Quantitative data on reuse and on economic value for users are especially difficult to capture.
Some interviewees argued that measurement worked well in the early years of the ODP (prior to the move to Stats NZ), when annual reporting to Cabinet was in place. These reports captured more than just how many datasets were released; they also documented how much different agencies knew about their data and its value, what relevant processes were instituted, and other matters. Besides providing evidence of change for the ODP, these reports were helpful for the participating agencies: they helped them to position themselves against each other, to identify areas for improvement and investment, and to get senior buy-in for action on open data. It is not clear why the decision was made to discontinue this reporting, though the decision was made prior to the ODP moving to Stats NZ.
The measurement challenge was partly overcome with the help of case studies. Case studies provide concrete examples for participating organisations of where they can focus their efforts. They illustrate thinking about the whole value chain of open data and demonstrate the specific impact and tangible results of using open data to improve the lives of communities and individuals. Case studies also help capture those measures that are hard or impossible to quantify.
However, the current four case studies can be considered only anecdotal evidence, even when they are combined with previous case studies. While conducting a certain number of case studies is one of the measures, they are not linked to outcomes and impacts of the programme.
Because the benefits and measures of the ODP were agreed very late in the programme implementation, there has been no consistent monitoring of them.
The budget bid did not mention any reporting arrangements. Monthly reporting on the activities completed and planned was done via the internal dashboard at Stats NZ. This reporting did not focus on benefits and outcomes but gave an indication of how the programme tracked against the implementation plan. The annual reports to the Investment Board also focused on progress against the implementation plan and did not cover outcomes and benefits. Depending on the measures appropriate for the programme, annual reporting may have been sufficient.
The ODP’s understanding of its beneficiaries (both data holders and data users) is still developing. It is still unclear who the open data community is, who exactly the ODP is serving and how many beneficiaries are out there. This means that the needs of the beneficiaries are also not clear, which clouds the focus of the programme.
As the ODP has unfolded, the understanding of data holders and their challenges has grown. For instance, it has become clear that many data holders are small agencies with very small data teams that are stretched thin. They experience both financial and capability barriers. Even if they are keen participants in the ODP, they need a lot of help, often hands-on, with opening up data. Some agencies have a decentralised operating model and need tailored approaches to opening data. At the same time, some agencies are more mature and/or have more resources, allowing them to incorporate an open data culture faster.
Understanding the ODP’s beneficiaries is a learning curve and a necessary step for developing more effective outreach strategies and support mechanisms. More tailored approaches to different data holders and data users would allow for better planning and resource allocation, stronger buy-in from the ODP participants and faster progress of the programme.
This section discusses the actual outcomes and benefits of the ODP. What has been accomplished by the programme and what are the benefits of the programme to New Zealand?
Whether the planned benefits and outcomes have been achieved comes back to the ability to measure well. If the overarching benefit is achieving better evidence-based decision making, there are many activities which need to be done in parallel. Making data more available, making the open data more accessible, more timely data, understanding quality, building capability – all need to come together.
The benefits discussed below are mainly based on the interviews, with limited other evidence of the outcomes and benefits of the programme.
One of the main outcomes of the ODP is raised awareness of the value of open data among data holders and (potential) data users. Open data has become a familiar concept: organisations know what open data is and why it is necessary to incorporate its principles in their work. The outreach strategy of the ODP, encompassing conferences, forums, meetups, seminars, and other activities, was instrumental in advocating for open data and building understanding among stakeholders. Outreach activities inspired conversations between and within organisations about the value of data, about data as a public good, and about the role of organisations as data custodians, not data owners. The ODP advocacy has likely been an impetus for the change of mindset and culture in organisations holding data.
While open data itself can be considered an output of the ODP, a more important outcome is the capability building around open data that the ODP encouraged and supported. There was a skill gap around open data, and the ODP helped to close it by creating shared capability. The ODP provided many incentives and tools for capability building, for example data literacy courses, e-learning modules, funding for data stocktakes, and support in deciding what to open and maintain. This ensured not only that data was opened, but that the open data was of high quality.
In 2019, over 170 people participated in data literacy and data visualisation courses provided by the ODP. The customers improved their data-related capabilities, developed new approaches, data roadmaps, and tools, and triggered shifts in data culture.
Fostering the open data community and the community of practice among local councils has supported capability building. Participants in those communities learnt from each other and exchanged experiences and lessons.
In addition to capability building, the ODP provided guidance, recommendations, and examples of how to perform specific steps to open data (for example, data inventories or stocktakes). The ODP researched relevant international practices and tools, searched for best practices in New Zealand and highlighted and publicised them among central and local government, and Crown Research Institutes.
The existence of the ODP has been an incentive and support for organisations willing to open their data. Data champions at such organisations could point to the ODP as an all-of-government effort or priority to get the necessary attention of executives. The seminars and talks by the ODP staff helped organisations to understand the value of open data and to ensure buy-in from the senior staff.
The ODP has enjoyed wide uptake by government agencies at different levels, with more than 170 agencies participating over the course of the programme. Open data inventorying (stocktakes) started in January 2018, and the number of completed stocktakes has increased steadily every year: there were three stocktakes in 2017-2018, five in 2019, and 11 in 2020. As a result, more than 8,000 datasets were open by the end of the programme. This number includes many unique datasets and high-value datasets identified by data users, and significantly exceeds the initial ODP target of 7,000 open datasets by 2020.
The website data.govt.nz has been well visited and the numbers of both users and sessions have grown. For instance, the number of new users increased by more than 25% (year average) from 2018 to 2019. Over the same period, the number of returning users increased by 55%.
However, uptake and reuse of the open datasets are not measured and are therefore unknown, as measuring them is a technically challenging and expensive undertaking. Individual stories and case studies have instead been used to help understand how open data has been used in New Zealand by third parties and what impact it has made. The four case studies published by the ODP team in 2020 indicate that open data helped organisations save or reduce costs, minimise risk in various activities, and make the functioning of government more efficient overall. Open data increases the transparency and accountability of government and ensures equal access to data for all citizens and users. For the wider community, there are examples of reuse of open data that helped keep people safe when working in remote areas or improved the mobility of people with disabilities.
An important achievement of the ODP is the cultivation of a community of open data users. Through outreach activities, the ODP connected different groups (technical, subject-matter, geographic). The ODP supported the establishment of a community of practice among local councils, which is important for their mutual learning and exchange of experiences. The ODP has been successful in finding partners and advisers for various open data initiatives of government agencies, both in New Zealand and overseas. Interviewees were particularly impressed with the ODP team’s ability to understand and identify exactly what was needed for their initiatives and to link up with the right partners (in terms of knowledge and experience). The ODP helped stakeholders navigate government and work in a complementary way. These connections, collaborations, and partnerships are necessary to sustain the programme in the long run and to ensure its success. The ODP also provided a platform and an environment for open data, including the website data.govt.nz, where datasets are catalogued. It is, however, not clear how well or how often the materials and datasets on the website are used.
The ODP also supported the necessary licensing arrangements (NZGOAL) and funded Creative Commons Aotearoa New Zealand (CCANZ) to run NZGOAL training. CCANZ is now known as Tohatoha Aotearoa and continues its work on NZGOAL training and other open access advocacy. However, non-government funding for such activities is extremely difficult to find and difficult to sustain through cost-recovered training. NZGOAL is an essential part of the legal framework for open data in New Zealand and should be part of any future government data programme.
Another side effect of the ODP was the implementation of standards by different data holders. The ODP did not lead to widespread adoption of standards across all of government, but it prompted some government agencies to start thinking about them, alerted interested agencies to open standards (e.g. open geographic standards), and helped implement them.
New Zealand has developed a strong international reputation in the area of open data. New Zealand was an early mover and adopted an unusual approach to open data: not through legislation but through a declaration. The approach has proved effective, and this shows in international comparisons, with New Zealand ranking among the leaders in the Open Data Barometer and the OECD’s open data index.
New Zealand’s experience of the ODP is studied and adopted internationally. For example, the practice of meetups will be scaled up: the OECD and the Open Data Charter have announced a global call to action for governments at all levels around the world to run virtual meetups with data users. The aim of these meetups is to collect data needs and use ideas in order to respond to and recover from COVID-19, and to be more data-ready next time. The international Open Government Partnership is distributing this call among its networks and is interested more generally in the New Zealand experience of meetups as an ongoing practice.
Another example is the measurement of the true value of open data and its reuse, which is an internationally recognised challenge. To address it, the United Nations Economic Commission for Europe (UNECE) established a Task Force on the Value of Official Statistics, which considers how to promote, measure, and communicate the value of official statistics, including open data. Stats NZ is a member of the Task Force. The ODP team developed a relevant framework that has been adopted by the UNECE Task Force to classify the value metrics used by national statistics offices. While it is impossible to quantify every single reuse of open data, qualitative data and case studies of impacts can be analysed to understand the change that open data creates.
While the outreach and awareness-raising strategy worked well for some stakeholders, it may not have reached others. Meetups, forums, and conferences were successful forms of engagement with business users, but they attracted fewer researchers, economists, banks, citizens, and other less entrepreneurial users.
It has been suggested that the interface of the website data.govt.nz (specifically the data catalogue) does not encourage engagement from the wider community of potential users and citizens. The interface is targeted more at developers and is difficult for the general public or non-specialist researchers (i.e. non data scientists) to navigate. This may have limited the outreach and visibility of the ODP to only the most experienced data community.
Some interviewees suggested that the advocacy and awareness-raising work at government agencies has not gone far enough, as there are still chief executives who do not know what the ODP is and what it does.
Working with different agencies on inventorying their data has been a challenging task. The first difficulty was often negotiating the data stocktake itself: in some organisations, it took a few months until an agreement was reached. Further, many smaller organisations have small data teams or no expert teams at all. The ODP team had to provide capability-building training while also helping to solve capacity issues for the data stocktakes. The first stocktakes took 6-7 months on average to complete, but this period shortened in 2020.
Once data stocktakes were completed, publishing the data was not an automatic process for some agencies. The data teams in those organisations had to obtain permission from managers higher up, which in some cases took much time and effort. There are examples where no data was published after the stocktaking exercise; the reason was the low quality of the datasets from the agencies’ perspective.
There is a lack of common standards and frameworks for data collection, inventories, and release across agencies, which has been a serious impediment to open data. Data standards and standard publication processes could have supported the participating agencies by giving them clear pathways to open data and improving the quality of the published data.
In this context, some interviewees mentioned decentralisation as an issue. There is no central publishing portal for data. Datasets are published by different agencies on different portals and in different ways, even when the datasets cover the same topic. The website data.govt.nz serves as a catalogue or pathway to these different websites, not as a publishing site.
The lack of monitoring of, or supervision over, the publication processes and the quality of published data was indicated by some interviewees as an obstacle to achieving better programme outcomes.
Changing organisational cultures to embrace open data is still ongoing. Data stocktaking has been the catalyst for many organisations to start thinking about and discussing how their internal processes, structures, and mindsets need to change. Some interviewees indicated that the stocktaking exercise was not so much about inventories as about learning about data value, capability building, and data culture. However, this aspect of the stocktakes was not so clear at the beginning, and the exercise first focused on “volume” before it became apparent that a more organic, systemic approach was necessary.
All interviewees agreed that New Zealand government agencies have not yet reached the state of open data by design or by default. Opening data is still generally ad hoc across government and not part of business as usual. Achieving this within three years was not a realistic expectation.
In this section, we outline lessons learnt from the programme that are relevant both for the ODP, if it were to continue in another form, and for other cross-government data programmes. We found that interviewees were divided on the strengths and weaknesses of the ODP; for example, some viewed Stats NZ as the best place to host the programme, while others thought it should be hosted in a central agency instead.
The ODP achieved greater awareness of data as an asset and greater availability of government data. There is also anecdotal evidence of an increased use of government data, including use beyond the original purpose.
However, incorporating open data into the day-to-day functioning of the government (open data by design or by default) has not yet been achieved. The necessary culture change and process development require more time than initially hoped. Barriers to data stocktakes and data release are higher for some agencies than originally realised. In some cases, more advocacy work and awareness raising are still necessary in agencies at the senior level. Better understanding of user needs should be developed and high-value datasets released.
The ODP with its original purpose remains topical and relevant for New Zealand, but can be further refined to be more targeted and strategic and to increase impact both for data holders and data users.
Making the ODP part of a larger data stewardship programme may seem logical. However, it could also undermine and dilute the objectives and purpose of open data efforts, both for programme staff and customers, as the ODP would become one initiative within the wider scope of data stewardship programmes. Embedding the ODP in broader policy may also decrease the programme’s visibility and transparency. On the other hand, there were advantages to the programme being at Stats NZ, including access to data know-how and integration with national and international statistical programmes.
Having a voluntary approach to opening data has proved advantageous, though its limitations were noted. An obligation to open data would have been a barrier rather than an enabler for many agencies, and the ODP may not have been as successful. Under the voluntary approach, agencies decided when and how to participate in the programme, and the ODP team negotiated with them on how to open data, how to conduct data inventories, and how to publish. This more flexible approach allowed agencies to play to their strengths and to effect change at their own pace. Instead of being a compliance exercise, the ODP triggered real organisational culture shifts in relation to data governance.
Some interviewees suggested that the ODP could have delivered stronger outcomes if it were located within a different agency than Stats NZ. A different agency (like Treasury or the State Services Commission) could have adopted an approach more aligned with its organisational culture, which could have been more appropriate for initiating and sustaining culture change in different agencies. The SSC could be an appropriate home from the policy perspective: making open data business as usual for agencies means incorporating it into the work of the government service, and open data is also part of open government, transparency, and accountability. However, other interviewees thought that Stats NZ was the best place for the ODP to be housed, due to the importance of long-term data governance and capability, and the international statistics connections.
In implementing the ODP, Stats NZ has mainly played the role of an advocate and facilitator. It fostered networks of users, brokered relationships between data holders and users, raised awareness among government agencies and within the user community, advocated data release and supported capability building.
However, some stakeholders expected more guidance and leadership, especially given that the Government Chief Data Steward, who is responsible for accelerating the release of open data, resides at Stats NZ. Clearer frameworks and guidance on data assessment and release processes, best practices and reference points, and recommended methodologies and technologies for certain types of data (e.g. sensitive data) would have helped agencies design their own processes and improve their practices. Strategy and prioritisation by the ODP could have guided internal planning in participating government agencies.
The ODP would have benefitted from defining its outcomes and benefits, as well as relevant measures, at the start. This would have been beneficial from both the policy and operational perspectives: thinking about the whole system and the cycle of data production and use within government, and designing a programme that is more outcome oriented. The programme’s approaches and activities would then have been oriented from the beginning towards what needed to change, namely capabilities and processes. A more outcome-oriented programme would also have been easier to implement, as structure and activities would have followed the outcomes.
As the ODP is a cross-government initiative, it would have been helpful to consider a stronger all-of-government perspective in formulating its outcomes, benefits, and planning activities. Open data is not an end in itself, but an instrument to achieve government objectives and deliver better public service to citizens. Outcomes of the ODP could be more strongly aligned with government priorities, which may have resulted in higher participation and commitment by agencies.
As an all-of-government effort, the delivery also could have been more aligned or, in some cases, centralised. For example, centralised data publication portals, frameworks, and standards could have been developed. “Government helping government” approaches could have been adopted, with larger agencies encouraged to help smaller agencies or to take care of certain types of datasets that are useful for the whole of government. Research could have been undertaken into whether certain data are collected by several agencies, to remove duplication in government efforts and to merge datasets for better data quality.
Resilience is an important issue for multi-year programmes. For a programme like the ODP that depends heavily on individuals, continuity and succession planning for staff need to be built into the programme from the start. The ODP would have benefitted greatly from capability analysis, capability building, and succession plans. It would have been helpful to have multiple layers of staff, with dedicated mid-level specialists who can help drive the progress of the programme.
The programme has learnt that one-size-fits-all solutions do not suit the variety of stakeholders. Tailored solutions need to be developed to approach smaller agencies, secure their buy-in, and help them stocktake and release their data. A better understanding of the needs and capabilities of customers would help assess the baseline and identify the appropriate points of impact. Different outreach strategies may be necessary to engage better with the regions and a wider range of data users.
New Zealand’s Open Data Programme has been developed and implemented in an international context and is part of a global movement. Being an active participant in this international movement has brought mutual benefits to New Zealand and its overseas partners. New Zealand is seen as a reputable and innovative partner: its inputs are incorporated into international guidelines and its experience is widely shared and replicated.
International cooperation could be further used to close the gaps and address the challenges identified in this review. For instance, the Open Data Charter measurement guide brings together open data indicators and evidence for different levels of government, which can be used to improve outcomes reporting for future programmes. Measuring monetary value could be approached through coherent storytelling about open data use and impact within public sector organisations and by the private sector.
Continuing participation in the international open data movement also ensures that New Zealand stays aligned with international developments. Publishing with purpose appears to be the next trend in open data policy: organisations need to consider the public value of data, specific policy challenges and priorities of the government, and how open data can support or enable the achievement of the government’s goals. This would ensure that investment in open data produces higher value. The Open Data Charter is preparing guidelines on this, focusing on top international issues such as climate change, anti-corruption, the gender pay gap, and others.
If you’d like more information, have a question, or want to provide feedback, email datalead@stats.govt.nz.
Content last reviewed 11 November 2020.