
Consultation and submissions summary: draft algorithm charter

Public consultation on the draft algorithm charter closed on 31 December 2019. This page contains a summary of the consultation process and submissions.

Submissions summary: draft algorithm charter [PDF 225 KB]



Computer algorithms are now a fundamental element of data analytics. Algorithms support the services that government provides to people in New Zealand and help government agencies deliver new, innovative, and well targeted policies to achieve government aims.

In a world where technology is moving rapidly, and the use of artificial intelligence is increasingly common, it is essential that government has the right safeguards in place when it uses public data for decision-making. The government must ensure that data ethics are embedded into its work, and always keep in mind the people and communities being served by these tools.

New Zealand is a member of the Open Government Partnership, which brings together government and civil society leaders to create action plans that make governments more inclusive, open, responsible and accountable. As part of the 2018-2020 Open Government Action Plan, the government committed to take action to make its algorithm use more transparent and accountable.

In 2018, the Government Chief Data Steward (GCDS) and Government Chief Digital Officer undertook a review of how 14 government agencies use algorithms to deliver their functions. The Algorithm assessment report provided a summary of the review. The review focused on areas where algorithms were used in decision-making processes that affect people in direct and significant ways. The review also considered the use of algorithms against the principles for the safe and effective use of data and analytics published by the Privacy Commissioner and the GCDS in May 2018.

In 2019, the Minister of Statistics, Hon. James Shaw, launched a draft algorithm charter (the charter). Signing the charter would commit government agencies to improving transparency and accountability in their use of algorithms over the next five years. 

  • Algorithm charter
  • Algorithm assessment report
  • Government Chief Data Steward

Public consultation on the draft algorithm charter

The consultation period ran from 17 October to 31 December 2019 and was reported by domestic and international media. Beyond central government agencies, submissions were also solicited from a range of key stakeholders, including academics, non-government organisations, civil society representatives, and regulators.

Consultation on the charter was also promoted at the All-of-Government Innovation Showcase on 3 December 2019, and Internet NZ facilitated an online discussion session on Twitter on 20 November 2019. The consultation documentation was available online, and the public were invited to make submissions by email.

34 written submissions were received in relation to the charter and associated issues.

Profile of submissions

Of the 34 submissions, 12 were from individuals and 22 were from organisations or groups.

The names of individual submitters have been withheld to protect their personal information. The organisations that made submissions are:

  • The Artificial Intelligence Forum of New Zealand
  • Antistatic
  • Canterbury District Health Board
  • Chartered Accountants of Australia and New Zealand
  • Deloitte
  • New Zealand Human Rights Commission
  • Internet New Zealand
  • Lowndes Jordon
  • Microsoft
  • New Zealand Council for Civil Liberties
  • New Zealand School Trustees Association
  • Office of the Auditor-General
  • Transparency International New Zealand
  • Xero

Government agencies

  • Archives New Zealand, Department of Internal Affairs
  • Ministry of Foreign Affairs and Trade


Academic institutions

  • Auckland University of Technology
  • Joint submission from Te Mana Raraunga and the University of Auckland
  • Sheffield Hallam University
  • University of Auckland
  • University of Otago

What we heard

Most submitters supported the introduction of a charter and provided feedback about defining the charter’s scope. For example, the Artificial Intelligence Forum of New Zealand (the AI Forum) stated that the charter was “an important step in helping to ensure that any algorithms and associated data sets used within the public sector in New Zealand are developed and utilised in a fair and transparent way”.

In considering a possible charter, many submitters discussed different elements of transparency and accountability, and how these relate to trust. Across all of the submissions received, the major themes were:

  • that the purpose and scope of the charter should be broadened to include all algorithms, not just operational algorithms (algorithms that impact significantly on individuals or groups)
  • that government algorithm use should reflect the principles of the Treaty of Waitangi and the charter should specifically consider the protection of Māori interests
  • general support for monitoring the charter implementation, including several proposals for the establishment of an independent monitoring body
  • the need for regular review of bias and public reporting of implementation
  • the need to develop specific guidance to accompany the charter and aid implementation
  • that human judgement and evaluation need to remain part of the decision-making process based on data, and this should be addressed in the charter.

Summary of responses to consultation document questions

Question: Does the proposed text provide you with increased confidence in how the government uses algorithms?

Many (68%) noted that the introduction of a charter is an important step forward and would contribute towards providing assurance around how the government uses algorithms, as there is currently nothing else in place to provide this. A submission from the University of Auckland stated that “government algorithms can have wide-reaching and long-lasting impacts, so it is only appropriate to have principles that ensure high-quality decisions are being made”.

Other submissions noted that implementation of the charter’s principles would greatly improve the use of algorithms in New Zealand’s public sector and would provide a valuable resource for those working to improve practice in the private sector.

There was a range of feedback on whether the proposed text would provide increased confidence in how the government uses algorithms. For example, Archives New Zealand agreed that the assurance provided by the introduction of a charter is helpful, while 14 submissions by individuals and organisations commented that more is needed to meaningfully develop trust in government.

Submitters who thought that the charter would not increase public confidence in government were concerned that the proposed text did not necessarily align with how government uses algorithms today and focussed too much on how government would use algorithms in the future.

Feedback on ‘Embed a Te Ao Māori perspective in algorithm development or procurement’

Half of all submissions supported the Te Ao Māori commitment but suggested it go further and provide more clarity about how agencies will embed a Te Ao Māori perspective. Some of the submissions suggested a Te Ao Māori perspective may include concepts of indigenous data sovereignty, a shared governance structure, and community rights.

Submitters also suggested that there is a need to monitor implementation and questioned why the Treaty of Waitangi and other obligations are not directly referenced. 6 submissions suggested that a reference to Māori data sovereignty be included in the charter.

The main feedback from Te Mana Raraunga (a Māori data sovereignty network) was that “an ‘algorithm charter’ is insufficient to protect Māori rights and interests and is highly likely to fail. Such failure will not only be to the detriment of Māori, but also puts at risk the trust in the public data system that is foundational to the willingness to participate and provide high quality data. Regulation that includes mechanisms for accountability and redress is necessary. Such regulation would need to include Māori data governance at all levels”.

Te Mana Raraunga also stated that “any framework on the use of government algorithms needs to actively recognise government obligations under the Treaty of Waitangi and under the United Nations Declaration on the Rights of Indigenous Peoples”, and that “a voluntary charter is unlikely to provide the necessary protections and safeguards for Māori in relation to government use of algorithms”.

Te Mana Raraunga also provided feedback on how the charter relates to their Principles of Māori Data Sovereignty (Rangatiratanga, Whakapapa, Whanaungatanga, Kotahitanga, Manaakitanga and Kaitiakitanga).

2 submissions suggested the charter be provided in both English and in Te Reo Māori.

Perspectives of communities and minimising bias

Submitters generally supported the commitment to consider the perspectives of communities, but suggested the charter clarify what this commitment means in practical terms. A range of submissions supported greater community involvement in the creation of government algorithms and gave specific suggestions for how this could be accomplished, including involving communities in governance, decision-making, and testing around the use of algorithms, rather than simply including their perspectives.

Some submitters felt that it would be helpful to require active consultation for the use of algorithms, especially with people who are likely to be impacted by decisions made using algorithms. While submitters recognised this could be a costly step, many argued it would help to establish social licence and help people understand the impacts of this technology.

It was noted in several submissions that the draft charter only mentions bias in relation to reviewing the performance of algorithms. Antistatic (a private research and communications group) argued that “minimising bias and discrimination should be considered much earlier in the process, when the system (including the data collected and used) is being designed”. The University of Auckland suggested that an algorithmic impact assessment, similar to a Privacy Impact Assessment, could broadly check for possible negative impacts.

Microsoft (a private technology company) stated “that inclusiveness is a fundamental principle and all stakeholders have a role to play in how to address AI issues and their implications”.

Question: Should the charter apply only to operational algorithms?

More than half of the submissions received provided views on whether the charter should apply only to operational algorithms.

9 submitters suggested that the charter should apply to more than operational algorithms, and that the scope be widened to include algorithms used for policy and research development. Some reasons provided to support widening the scope of the charter were:

  • All algorithms should be used in a fair, ethical and transparent manner.
  • All algorithm-enabled decisions made by government agencies should be subject to high standards of transparent and safe use.
  • Potential bias is reduced or eliminated by extending the charter to algorithms used in policy and research development.
  • Operational algorithms will be developed based on available data and may already exclude a Te Ao Māori perspective, which argues for widening the scope of the charter.

Other submissions noted the similarities between operational algorithms and ‘business rules’, suggesting that applying the charter to operational algorithms and not business rules would create ambiguity. Some submissions suggested the charter should be based on international classifications of algorithms.

3 submissions supported the proposal to apply the charter to operational algorithms only. This was because respondents believed:

  • this would add clarity and usefully define a more manageable scope
  • operational algorithms are the most “public-facing” algorithms that may directly involve the public’s information and data and involve decisions that significantly impact on individuals and groups.

The Office of the Auditor-General recommended a staged approach: “the algorithm charter and its success (or otherwise) should be reviewed after the proposed 5-year period of commitment. When reviewed, consideration should be given to extending the algorithm charter to other types of algorithms, advanced data analytics techniques, and/or business rules used by government agencies”.

2 submissions suggested the scope of the charter should be extended to algorithm development, with Transparency International NZ (a global civil society organisation against corruption) strongly recommending “clear definition of both and a centralised register of both so that there is more clarity on how many agencies are using algorithms on both operational and other types of algorithms, and what the purpose of each of these algorithms is”.

Internet NZ (a non-profit organisation that administers the .nz domain name and provides infrastructure and security to support it) recommended that “agencies using operational algorithms be required to accept the charter, to adopt specific measures for public accountability, and to be subject to independent review in light of emerging best practice”.

Some submissions challenged the ‘undefined’ category of operational algorithms, suggesting a preferred approach of clearly defining the type of algorithmic tools to which the charter is intended to apply, and making this definition clear and comprehensible.

Some submitters also proposed that data collection, where applicable, should also be included in the scope of the charter. This is because the structure and the quality of the data determines the extent to which the interests of various stakeholders can be considered and/or realised in algorithm development.

As part of responses to this question some submitters also considered possible signatories to the charter, suggesting that the charter should be offered as an optional commitment for any organisation using algorithms, including organisations outside of government.

Question: Have we got the right balance to enable innovation, while retaining transparency?

More than half of the submissions specifically provided feedback on this question; however, feedback on transparency was also provided in general comments and has been captured in the summary below.

Overall, most submitters believed that it is difficult to say in advance whether the balance is right, but most agreed that innovation and transparency are not opposite ends of a trade-off. Two submissions were comfortable with the balance, agreeing that the charter does not stifle innovation, and explored the benefits and challenges of innovation and transparency.

Transparency was a major concern for submitters. Some submissions mentioned that the transparency encouraged through the charter not only protects people's rights and increases confidence, but can also help improve the quality of algorithms and models and build social licence with people who may be subject to these algorithms.

For example, Internet NZ described social licence as a key concept to include in the charter, and saw the charter sitting alongside other frameworks such as the Principles for the safe and effective use of data and analytics (Stats NZ), the Social Investment Agency’s Data Protection and Use Policy, the Ministry of Social Development’s Privacy, Human Rights and Ethics Framework, and the work of the Data Futures Partnership.

Many submitters acknowledged the importance of processes relating to transparency, for example identifying and engaging with stakeholder groups for feedback, co-design, algorithmic audits, and reporting.

It was also noted in some submissions that there is no clarity on what New Zealand users are looking for in terms of transparency relating to algorithm use, or what they trust. Research to learn more about what New Zealanders are seeking from different levels of transparency (including explanation) might be helpful. Some submissions agreed that more openness should, in theory at least, enable more innovation, and listed practical ways of increasing transparency, such as:

  • sharing algorithm code as open source
  • sharing datasets other than personal information as open data by default
  • referring in the charter to alignment with the Open Data Charter and/or the Open Government Partnership Action Plan.

Others suggested that the current wording means an agency could fully comply with the charter and still not deliver meaningful transparency. They suggested that, as a minimum, information should be made available about how decisions were made, what consultation was undertaken, and how bias is tested for.

Submitters were also concerned about the impact of innovation. Submitters commented on the importance of innovation that is beneficial to New Zealanders, pointing out that irresponsible innovation is not in the interests of the public good.

There was also a concern that government agencies need to be mindful of potential downsides as technologies have a greater impact on society. Submitters recognised the charter as a key opportunity to ensure that government approaches algorithms in a way that protects and enhances public confidence, privacy, human rights, and other vital community interests.

Some submissions suggested that accountability and transparency may be better positioned as elements that contribute to equitable outcomes for New Zealanders, rather than as outcomes of the charter in themselves. Other submissions made specific suggestions on how the wording of the charter might be amended to improve the balance between innovation and transparency.

Some submissions acknowledged the risk that a charter that is too prescriptive will hinder innovation, and that too many rules would make compliance more challenging; these submissions supported a principles-based approach for the charter.

Question: Have we captured your specific concerns and expectations, and those of your whānau, community or organisation?

12 submissions provided specific responses to this question and indicated a broad degree of comfort that their individual concerns and expectations had been captured through the consultation process. However, a range of issues were raised around the possible implementation of the charter.

The algorithmic system

Algorithms are part of a wider system that goes beyond the technology itself. Antistatic recommended that “the whole algorithmic system, including data, governance, and decision-making processes, is covered and that this is made clear in the charter’s wording”.

Artificial intelligence (AI)

Submissions encouraged clarifying the scope of the charter. Microsoft and the AI Forum believed the scope should be broader and not focus solely on algorithms; it may be more appropriate to treat algorithms as part of a wider AI system. An ‘Artificial Intelligence Charter’ title could capture broader aspects than algorithms alone, take a more expansive approach, and better encompass the human role in adopting AI and algorithms. In addition, adoption of the charter would support the responsible use of AI going forward, and would be an opportunity for the New Zealand government to be a frontrunner in the practical implementation of AI policy.

Submitters noted that there has been progress in developing technical tools for managing some of the challenges associated with responsible AI, including detecting bias, traceability through model-management, AI security, and improving intelligibility. Microsoft suggested further leveraging these existing efforts in the charter and in subsequent operational work of the NZ government and its agencies in line with the charter.

Trust, confidence and social licence

Several submissions suggested the charter include guidance for people about what to do when there are concerns about how an algorithm has been implemented, what data is used to inform decision-making, or when the reasons for a decision are not clear.

Internet NZ suggested that the public need to understand how a decision was made and when it affects them, and recommended that the “government develops guidance on a right to appeal automated decisions and resources a responsible agency to hear appeals”, or, if that is not possible, that a process or guidance on how to appeal automated decisions be developed.

Human rights

5 submissions supported the charter unequivocally guaranteeing that all algorithm development, procurement, and application adhere to the government’s human rights obligations, including non-discrimination, and recommended that a statement to this effect be included in the text. The Human Rights Commission wanted to “ensure that privacy, ethics and human rights obligations are safeguarded and integrated as part of algorithm development, procurement, and application”.

Data collection and quality

The submission from the University of Auckland suggested it would be useful to explicitly acknowledge that, for some people or groups of people, there may simply be no data available to represent them. This leads to algorithm models that do not accurately reflect the real world (one form of bias), and therefore to algorithms making errors.

Several submissions noted that researchers have also raised concerns about decisions being guided by advice generated automatically by a machine, based on a large set of data that can extend far beyond the experience of the decision-maker. Without appropriate training of the employees who use the system (including a thorough description of the dataset on which it was assessed and the ways in which it may encompass inherent biases or distort data), further discrimination can be perpetuated.

Implementation of the charter

Many submitters recognised that governance of automated algorithms is still developing, both in New Zealand and globally. Some believed there is an emerging global consensus on the governance of algorithms and referenced international examples (for example, external audits are accepted practice in other fields that require both expertise and oversight).

Submissions suggested the charter might benefit from greater emphasis on a commitment to action in addition to oversight and asked for clear information about:

  • how the introduction and application of the charter will be resourced, monitored and reported on
  • how agencies will be supported to implement the charter, including costs
  • how communities will be supported to use the charter to hold agencies to account
  • whether the charter will be regularly reviewed/revisited.

10 submissions suggested the establishment of a group or entity that might provide advice on how agencies ought to implement the charter and provide some consistency in the application of the charter commitments. The detail of these suggestions varied from a centrally monitored list of government algorithms and algorithmic impact assessments to the involvement of the Ombudsman or a similar entity in reviewing charter compliance, or the establishment of a new regulatory body.

The University of Otago recommended that “a new regulatory body should be established to support government agencies when conducting internal reviews and serve a range of functions, including producing best practice guidelines, provide for review of agencies’ in-house evaluations (including seeking more information where appropriate), maintain a register of algorithms used in government, producing an annual report on such uses, and conduct ongoing monitoring on the effect of these tools”.

Review period

It was suggested that the Principles for the safe and effective use of data and analytics and the charter should be reviewed no less frequently than annually, given the rapid pace of technology development.

The AI Forum suggested “the algorithm charter be seen as a ‘living’ document to be kept under review by the government”. The Chartered Accountants of Australia and New Zealand recommended “piloting the charter prior to rolling it out to the entire Public Service to obtain practical feedback on how it could be improved”.


Archives New Zealand’s feedback included: “the charter does not provide comprehensive technical guidance on using algorithms in a way that meets the Public Records Act 2005 (PRA) obligations. We see potential for guidelines and, potentially, standards under the PRA to ensure that this occurs”. Algorithms are, and generate, public records under the PRA. 5 submissions saw the potential for guidelines to assist public sector agencies to implement the commitments in the charter.


Some submitters thought that consultation on the charter requires the wording to be considered in the context of the reasoning that led to the document; the thinking behind the charter should be shared and consulted on so that the resulting charter accurately reflects stakeholder and community concerns as well as international best practice. There was also a view that the charter should restrict itself to the specific actions required for algorithm use, over and above the standard principles of good public policy practice, which of course apply.

Contact us

If you’d like more information, have a question, or want to provide feedback, email
Content last reviewed 20 November 2020.