PART FOUR:

Using Coded Interpretations in the Law

Highly reliable interpretations are valuable

  1. The fact that it is difficult for “all law” to be authoritatively coded does not mean that some law cannot be reliably represented in a computational model.

  2. Jason Morris makes this point in his LLM thesis.113 We adopt Morris’ broader argument: the fact that a computational model is only an interpretation of the law does not mean that coded models cannot be useful.

  3. An important point of difference is that Morris’ thesis relates to the use of automated legal reasoning tools by lawyers in order to provide legal advice to clients, and different considerations may apply in a government-to-citizen context (the latter being the focus of this report).114

    The use of [declarative logic programming] tools should be understood to involve, as Susskind suggests, not an encoding of a categorically correct representation of the meaning of the relevant law, but an encoding of the internally coherent understanding of the law that a responsible legal professional believes would be appropriate for people receiving automated legal services to rely upon, in all the relevant context. If what we are encoding is not “the law”, but one person’s understanding of it, all the abstract concerns about whether expert systems can accurately represent the “true” meaning of a law disappear. … Difficulties involved in determining what a law means must be overcome before either the lawyer gives advice, or they encode their understanding. Interpretation is necessary in both circumstances, and so the need for interpretation is not a critique of expert systems at all, but a critique of laws.

  4. Morris expresses confidence that changes in the law that affect whether an interpretation is correct can be dealt with by changing the encoded model: this would be much more challenging if code were to be given the status of legislation.115

    … if a lawyer is aware of a statutory error, they can encode their corrected understanding of the law as easily as they can advise using it. If they are not aware of the error, they would provide incorrect advice in any case. The encoded version will still meet the “no worse than a lawyer” standard. With regard to changes to the meaning of laws that arise due to changed circumstances, the same thing is true. A lawyer may anticipate that the change in general circumstances will require a change in the meaning of a provision, or they won’t. If they do, they can encode that understanding. Legislative error and changes in circumstances are difficulties with statutory interpretation that have no particular impact on the use of expert systems. The fact that the meaning of statutes can change over time, even in the absence of explicit amendment, suggests that the maintenance of automated systems will be an important factor in whether they continue to be reasonable.

  5. By contrast with the situation anticipated by Morris (legal advice in a lawyer-client relationship), updating a model is likely to be a much more complex exercise when the model is being used by an Executive government department. The fact that this may be more complex – requiring various levels of sign-off or accountability procedures – is an indication of the importance of having a coded model that is legally correct when in active operational use by a government agency, even if we accept the model is simply an interpretation. Those sign-off procedures exist because of the significant consequences that tweaks to a model may entail for the relevant agency and for people subject to coded systems.

  6. Morris acknowledges that there are some situations where automated systems should not be used because of specific legal ambiguities, but he persuasively argues that:116

    We cannot justify refusing to use DLP tools for what they can do because there remain things they cannot do. Responsible use of these tools will always include deciding when not to use them, and issues of open-texture, vagueness, or uncertainty may remain good reasons to come to that conclusion.

  7. In a footnote to that statement, Morris notes the potential of a better rules approach, primarily because of its ability to improve the quality of the underlying policy, and its ability to be encoded in computational systems:117

    The Better Rules conversation (Better Rules for Government Discovery Report […]) proposes a fascinating possible resolution to this problem: that public rules ought to be drafted with an eye to how easily they could be automated, encouraging the avoidance of open-textured terminology except where the vagueness serves an explicit policy objective. Such a change in how legal rules are drafted would be a sea change for the applicability of DLP tools, and statutory interpretation itself.

  8. Morris acknowledges that “the real challenge” is the knowledge acquisition bottleneck:118

    The knowledge acquisition bottleneck is the only common criticism left unaddressed. It applies to all possible uses of expert systems. It is not resolved by using modern tools. It cannot be resolved by merely avoiding the standard of perfection. The viability of expert systems as a tool for increasing the supply of legal services may legitimately turn on whether there is a realistic and appropriate solution to this problem. To reiterate, the knowledge acquisition bottleneck refers to the high cost and low reliability of the method of having a legal subject matter expert and a programmer work side by side to develop expert systems.

  9. He continues:119

    The knowledge acquisition problem disappears entirely when the person who holds the subject expertise and the person who understands the programming language are the same person. There is no risk of anything being lost in translation, missed, or misunderstood when the process of legal encoding involves only one person.

  10. By contrast with the approach proposed by Morris (the use of user-friendly coding tools that can be directly used by legally trained people), the better rules approach incorporates this multidisciplinary knowledge through the use of teams and business process modelling approaches. This ameliorates some, but not all, of the knowledge acquisition bottleneck. It is also important to note that a better rules approach can produce policies, regulatory systems and legislation that are more easily implemented in digital systems, thereby reducing the risk of incorrect computational modelling by those subsequently responsible for implementing the law in computer systems.

  11. A persuasive point made by better rules and rules as code advocates is that government departments and other users of legislative and other rules are already engaged in preparing and encoding their own interpretations of the law. If an authoritative interpretation can be made available, it would limit the variability of interpretations available in the “marketplace for interpretation”. So long as the “authoritative interpretation” is understood as being of lesser authority than a legal instrument itself, and remains open to judicial and legal contest, this raises far fewer fundamental constitutional considerations. A minimal, illustrative sketch of what such a coded interpretation might look like is set out below.
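To illustrate the point, the following is a minimal, hypothetical sketch (in Python, rather than the declarative logic programming tools Morris discusses) of what an encoded interpretation might look like. The rule, figures, and names are invented for illustration; the point is simply that the encoding is an explicit, versioned interpretation that can be tested and revised when the law, or the encoder’s understanding of it, changes.

```python
# A minimal, hypothetical sketch: one encoder's interpretation of an imagined
# eligibility rule ("a person qualifies if they are 18 or over and their weekly
# income is below $400"), expressed so that the interpretation is explicit,
# testable, and revisable. It is not drawn from any actual enactment.

from dataclasses import dataclass


@dataclass
class Person:
    age: int
    weekly_income: float


# The interpretation is versioned: if an amendment or court decision changes the
# meaning of the rule, the encoded model is updated, just as written advice would be.
INTERPRETATION_VERSION = "2021-01 (illustrative only)"


def qualifies_for_benefit(person: Person) -> bool:
    """One encoded interpretation of the imagined rule, not 'the law' itself."""
    return person.age >= 18 and person.weekly_income < 400.0


if __name__ == "__main__":
    applicant = Person(age=19, weekly_income=350.0)
    print(INTERPRETATION_VERSION, qualifies_for_benefit(applicant))  # True
```

Nothing in the sketch claims authority: it is simply a reproducible record of one interpretation, which remains open to challenge and to revision in the same way as written legal advice.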

Emulate successful reliable interpretations

  1. Given our conclusions, what examples might we emulate where highly authoritative interpretations of legal instruments – whether or not they are in code – have generated public or private benefit?

  2. One useful example we have identified is the Agreement for Sale and Purchase of Real Estate, produced by the Auckland District Law Society (ADLS). “The Agreement” is now in its Tenth Edition.

  3. We explain notable insights about this agreement in greater detail in an appendix.

  4. The agreement, and other standard form agreements like it:

    1. are highly reliable instruments that embody a reliable interpretation of multiple primary legal sources;
    2. are widely used by the public to conduct legal activities;
    3. indicate the value that similar interpretations might have if they are coded and modelled reliably, while retaining the ability to scrutinise them through legal argument; and
    4. exhibit features worth emulating, such as methods to maintain the reliability of the interpretative instrument and mechanisms to incorporate changes in judicial interpretation, case law, or statutory amendment.

Example: ADLS standard form Agreement for Sale and Purchase of Real Estate

  1. Our primary interest in the ADLS Agreement is that it illustrates the wider value of exceptionally reliable and reproducible legal instruments which nevertheless are only non-authoritative interpretations of how the law works. Parties attempting to create law as code models should pay particular attention to the following points, which we believe are integral to the success of the ADLS Agreement:

    1. The Agreement is drafted in natural language, but it reflects a workable operational interpretation of multiple legal instruments that increases the parties’ compliance with and knowledge of the law.
    2. The Agreement draws on a wide range of primary legal sources. It is not only a reflection of land law, but also taxation law (in the way that GST is incorporated into land sales).
    3. There is wide confidence in the reliability of the Agreement because of the way that it is produced and because of the qualifications of the people who produce it and monitor it.
    4. The Agreement is capable of being assessed by the judiciary and updated to reflect statutory amendment, judicial interpretation, and the impact of case law.
    5. The Agreement includes its own dispute resolution mechanism. Parties who agree there is a dispute can refer it to experienced property lawyers for resolution.
    6. Copyright in this case is an essential legal device for controlling how the agreement is used or modified. It is used to ensure that the utility of the standard form is not undermined. The ADLS publishes software that facilitates legitimate amendment of digital versions of the Agreement.
    7. It would be possible for the Agreement to be modelled in computational languages and used, while still preserving the natural language text in case of any interpretive disagreement.
    8. A coded model may not be immediately useful for the people who use the Agreement, but a process similar to that used to produce the natural language text would confer credibility on any associated coded model. The ADLS committee could use a better rules approach to improve the suitability of its drafting for encoding in digital systems.
  2. The Agreement creates a reliable and dependable legal environment within which parties can transact. It does not exhaustively state the law, nor is it held up as having greater authority than the other primary legal sources (or even secondary legal sources in the form of academic commentary) that inform its drafting. It is reproducible and scalable in the way that many copies of it can be produced and used rapidly. It avoids the need for bespoke individual agreements to be drafted and negotiated for every new property transaction, which would generate massive cost and legal uncertainty.

  3. The Agreement is one of several legal instruments produced by the ADLS. It is drafted and revised by a committee of the ADLS convened for that purpose. The committee comprises legal practitioners and academics with significant authority in the area of land law in New Zealand, including the author of the leading academic text.

  4. In a panel discussion on access to civil justice, a justice of the Court of Appeal noted that very few disputes about the sale and purchase of land are heard in appellate courts today.120 The role of the Agreement in this outcome cannot be overlooked. Interviewees we spoke to estimated more than 90% of sale and purchase of land transactions are conducted using the ADLS agreement.

  5. As a result of its ubiquity (which itself is a testament to its effectiveness), the Agreement has become remarkably embedded within the legal system. Notably, some providers of professional legal training teach their property law modules predominantly by reference to the Agreement itself, as much as to the original legislation it reflects.

  6. As a natural language legal instrument, the Agreement is interpreted using legal interpretive practices. This is one limitation of using it as an illustration of how coded law might work. However, because the legal instrument is drafted by a non-Parliamentary body, the same kinds of constitutional issues that are raised by authoritative coded models do not arise.

  7. We also note the way the Agreement highlights the role of the courts in developing a reliable legal instrument. The New Zealand Supreme Court has commented on the drafting of the Agreement, and judicial comment has led to amendments to the Agreement in revised editions.121 While making it easier for lawyers to practise in this area of law, the Agreement still benefits from authoritative interpretations provided by the courts in the course of legal disputes.

  8. In a recent book, Richard Susskind proposes an approach to the design of legal, operational, and digital procedures for online courts that is, we say, substantively similar to a better rules approach. Susskind’s immediate concern is how to create procedures for online courts without limiting, in ways that lack any legal foundation, what someone can or cannot do within a digital system. His proposed solution looks substantially similar to the way we suggest a better rules approach could be combined with the authority of a body such as the Auckland District Law Society committees to produce dependable and reliable computational models of legal interpretations. Susskind’s process is set out at p 163:122

    … (1) A rules committee should lay down general rules … that conform with an agreed high-level specification of the functionality of the system (agreed amongst politicians, policy-makers, and judges).

    (2) The committee should delegate rule-making/code-cutting responsibility and discretion to a formally established smaller group that can work out the detail and proceed in an ‘agile’ way.

    (3) the rules and code that this group create would need to be formally articulated and made explicit, partly for public scrutiny and partly for a periodic, formal review by the main rules committee.

    (4) The committee and group should be encouraged to approach the task in the spirit of proportionality and resist the temptation to generate an over-complicated set of rules.

    In this way, code is law but it is law whose creation has been formally sanctioned through some kind of delegated authority. This may seem heavy-handed but I do not think we can simply leave the rule-making and code-cutting to a group of developers and judges, no matter how senior and well-motivated. We cannot allow coding to become law-making.

Use of “automated electronic systems” in legislation

  1. There is already a method of legislative drafting for authorising the use of automated electronic systems (AES) to exercise legal powers. These legal powers are not limited to powers of decision: they also include complying with obligations, exercising powers, and performing legal functions.

  2. Below we outline the legislative provisions that shape this authority to use AES for such purposes. We do so to illustrate the way that a coded interpretation of the law produced using better rules or rules as code methods could be operationally deployed within legislative boundaries set by Parliament, and in a way that still permits scrutiny by an identifiable person responsible for the system, by anyone subject to the use of the system, and by judicial or regulatory oversight institutions.

  3. The simplified pattern of drafting generally includes:

    1. A power to arrange for a system to be used.

    2. Stating the effect of the system and any dispute resolution mechanisms.

    3. Sometimes, a criminal offence for interfering with the system’s operation.

  4. The phrase “automated electronic system” is not defined.

  5. We use the Biosecurity Act 1993 as an example. Below are the statutory provisions empowering a person to arrange for an AES:

    142F Arrangement for system

    1. The Director-General may arrange for the use of an automated electronic system to do the actions described in subsection (2) that this Act or another enactment allows or requires the persons described in subsection (3) to do.
    2. The actions are—
      1. exercising a power:
      2. carrying out a function:
      3. carrying out a duty:
      4. making a decision, including making a decision by—
        (i) analysing information that the Director-General holds or has access to about a person, goods, or craft; and
        (ii) applying criteria predetermined by the Director-General to the analysis:
      5. doing an action for the purpose of exercising a power, carrying out a function or duty, or making a decision:
      6. communicating the exercising of a power, carrying out of a function or duty, or making of a decision.
    3. The persons are—
      1. the Director-General:
      2. inspectors:
      3. chief technical officers:
      4. authorised persons:
      5. accredited persons:
      6. assistants of inspectors or authorised persons.
    4. The Director-General may make an arrangement only if satisfied that—
      1. the system has the capacity to do the action with reasonable reliability; and
      2. a process is available under which a person affected by an action done by the system can have the action reviewed by a person described in subsection (3) without undue delay.
    5. A system used in accordance with an arrangement may include components outside New Zealand.
    6. The Director-General must consult the Privacy Commissioner about including in an arrangement actions that involve the collection or use of personal information.
  6. Below are the statutory provisions concerned with the effect of the use of the electronic system:

    142G Effect of use of system

    1. This section applies to an action done by an automated electronic system.
    2. An action allowed or required by this Act done by the system—
      1. is treated as an action done properly by the appropriate person referred to in section 142F(3); and
      2. is not invalid by virtue only of the fact that it is done by the system.
    3. If an action allowed or required by another enactment done by the system is done in accordance with any applicable provisions in the enactment on the use of an automated electronic system, the action—
      1. is treated as an action done properly by the appropriate person referred to in section 142F(3); and
      2. is not invalid by virtue only of the fact that it is done by the system.
    4. If the system operates in such a way as to render the action done or partly done by the system clearly wrong, the action may be done by the appropriate person referred to in section 142F(3).
  7. An example of a criminal offence related to an AES is s 133A of the Animal Products Act 1999:

    133A Offences involving automated electronic system

    1. A person commits an offence who intentionally obstructs or hinders an automated electronic system that is doing an action under section 165B.
    2. A person commits an offence who knowingly damages or impairs an automated electronic system.
    3. A person who commits an offence against this section is liable on conviction,—
      1. for a body corporate, to a fine not exceeding $250,000:
      2. for an individual, to imprisonment for a term not exceeding 3 months and a fine not exceeding $50,000.
  8. The Customs and Excise Act 2018 contains a more comprehensive statutory regime, and one that has been updated since equivalent provisions were originally introduced under the Customs and Excise Act 1996 in 2009.

  9. Section 296 of the Customs and Excise Act 2018 authorises the Chief Executive to approve the use of AES for an expansive range of activities:

    296 Use of automated electronic systems by Customs to make decisions, exercise powers, comply with obligations, and take related actions

    1. The chief executive may approve the use of automated electronic systems by a specified person to make any decision, exercise any power, comply with any obligation, or carry out any other related action under any specified provision.
    2. The chief executive may approve the use of an automated electronic system only if—
      1. the system is under the chief executive’s control; and
      2. the chief executive is satisfied that the system has the capacity to make the decision, exercise the power, comply with the obligation, or take the related action with reasonable reliability; and
      3. 1 or more persons are always available, as an alternative, to make the decision, exercise the power, comply with the obligation, or take the related action.
    3. An automated electronic system approved under subsection (1)—
      1. may include components that are outside New Zealand; and
      2. may also be used for making decisions, exercising powers, complying with obligations, or taking related actions under other enactments.
    4. The chief executive must consult the Privacy Commissioner on the terms and the privacy implications of any arrangements to use an automated electronic system under subsection (1) before—
      1. finalising the arrangements; or
      2. making any significant variation to the arrangements.
    5. A decision that is made, a power that is exercised, an obligation that is complied with, or a related action that is taken using an automated electronic system under this section must be treated for all purposes as if it were made, exercised, complied with, or taken (as the case may be) by a specified person authorised by the specified provision to make the decision, exercise the power, comply with the obligation, or take the related action.
  10. A “specified person” for the purposes of the Customs and Excise Act 2018 “means the chief executive, Customs, or a Customs officer (as the case may be) carrying out a function under a specified provision”.

  11. Where the Chief Executive is using an AES, s 297 requires them to publicly identify the legal power being delegated to an AES, as well as to “identify” the AES. It is not clear how this requirement can be complied with, especially given the fact that s 296(3) acknowledges that the “components” of a system may be “outside New Zealand”. Publication must be effected “as soon as practicable” but the use of a system is not rendered invalid only by failure to publish those details “as soon as practicable”.

  12. A variation or substitution to a decision made by an AES can be made by a specified person (s 298). The person may:

    1. vary, or add to, the terms or conditions of the relevant decision; or
    2. substitute a decision for the relevant decision if the specified person is satisfied that the new decision—
      1. could have been made under the same specified provision as the relevant decision; and
      2. is more favourable to the affected person.
  13. The Customs and Excise Act states, for the avoidance of doubt, that a decision made through an AES does not deprive a person of rights of appeal, or administrative or judicial review:

    299 Appeals and reviews unaffected
    To avoid doubt, a person has the same rights of appeal or right to apply for administrative or judicial review (if any) in relation to a decision made, power exercised, obligation complied with, or other action taken by an automated electronic system as the person would have had if the decision, power, obligation, or other action had been made, exercised, complied with, or taken by a specified person.

  14. We have identified a number of statutes where some or all of this pattern of drafting is replicated, and where the phrase “automated electronic system” is used.123 While some of the principal instruments are older, the relevant provisions were generally introduced more recently through amendment legislation from 2010 onward. This repeated pattern of statutory drafting suggests a broader legislative approach to how AES should be governed by legislation. Relevant statutes include:

    1. Biosecurity Act 1993, ss 142F and 142G (Biosecurity Law Reform Act 2012).
    2. Food Act 2014, ss 239, 374, 375 (Food Safety Law Reform Act 2018)
    3. Customs and Excise Act 2018, ss 295, 296, 297, 298, 299. Note that similar provisions were present at ss 274A-274D under the Customs and Excise Act 1996.
    4. Summary Proceedings Act 1957, ss 86DA, 86DB, 86DC (Courts Matters Act 2018).
    5. Immigration Act 2009, ss 28, 29, 29A (Immigration (International Visitor Conservation and Tourism Levy) Amendment Act 2019).
    6. Wine Act 2003, ss 101A, 118A, 118B (Food Safety Law Reform Act 2018).
    7. Animal Products Act 1999, ss 133A, 165B, 165C (Food Safety Law Reform Act 2018).
    8. Immigration (Visa, Entry Permission, and Related Matters) Regulations 2010, reg 8 (Immigration (Visa, Entry Permission, and Related Matters) Amendment Regulations 2010).
    9. Organic Products Bill (2020, 221-1), cl 121, 122.
    10. Legal Services Act 2011, s 16A (Legal Services Amendment Act 2013).
    11. Biosecurity (Infringement Offences) Regulations 2010, schedule 1, s 154N(20) (Biosecurity (Infringement Offences) Amendment Regulations 2018).
  15. There are more references to “electronic systems” across the statute book (including in the internet filtering Bill we discuss later) and we cannot identify any reason why drafting around “automated” electronic systems has been adopted in some situations and avoided in others. “Electronic systems” are mentioned in:

    1. the Road User Charges Regulations 2012
    2. the Family Court Rules 2002 (related to filing documents in Court)
    3. the Victims’ Orders Against Violent Offenders Rules 2014
    4. the Referenda (Postal Voting) Act 2000
    5. the Harmful Digital Communications Rules 2016
    6. the Supreme Court Rules 2004
    7. the Customs and Excise Regulations 1996
    8. the Intelligence and Security Act 2017
    9. the Fisheries (Electronic Monitoring on Vessels) Regulations 2017; and
    10. the Criminal Procedure (Transfer of Information) Regulations 2013.
  16. There are restrictions and safeguards on the delegation of legal tasks to an automated electronic system. The most significant safeguard is that the relevant individual authorising the system (usually a chief executive of a government agency) must be “satisfied” of the system’s “reasonable reliability”.

  17. There is specific legislative clarification that decisions made using an AES must be capable of appeal, but otherwise can be treated as if they were made by a relevant decision-maker. A minimal sketch of how these statutory safeguards might be reflected in a digital system is set out below.
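To make the drafting pattern more concrete, the following is a minimal, hypothetical sketch of how an AES might reflect the statutory safeguards described above: it applies criteria predetermined by the decision-maker, records that the action is treated as done by the appropriate person, and preserves a route to review by a human listed in the empowering provision. All function names, criteria, and thresholds are invented for illustration and are not drawn from any agency’s actual system.

```python
# A minimal, hypothetical sketch of the statutory AES pattern: apply predetermined
# criteria, deem the action to be done by the appropriate person, and allow a human
# listed in the empowering provision to redo the action on review. Illustrative only.

from dataclasses import dataclass, field


@dataclass
class AutomatedDecision:
    outcome: str
    deemed_decision_maker: str                 # whom the action is "treated as" done by
    criteria_version: str                      # the predetermined criteria that were applied
    review_available_from: list = field(default_factory=list)


PREDETERMINED_CRITERIA_VERSION = "criteria-v1 (hypothetical)"
HUMAN_REVIEWERS = ["inspector", "authorised person"]   # cf the persons listed in the statute


def automated_clearance(declared_risk_score: int) -> AutomatedDecision:
    """Apply predetermined criteria to information the agency holds (illustrative only)."""
    outcome = "refer for inspection" if declared_risk_score >= 70 else "clear"
    return AutomatedDecision(
        outcome=outcome,
        deemed_decision_maker="Director-General (deemed)",
        criteria_version=PREDETERMINED_CRITERIA_VERSION,
        review_available_from=HUMAN_REVIEWERS,
    )


def request_review(decision: AutomatedDecision, reviewer: str, corrected_outcome: str) -> AutomatedDecision:
    """A person listed in the empowering provision may redo the action without undue delay."""
    assert reviewer in decision.review_available_from
    return AutomatedDecision(corrected_outcome, reviewer,
                             decision.criteria_version, decision.review_available_from)


if __name__ == "__main__":
    decision = automated_clearance(declared_risk_score=82)
    print(decision.outcome)                                        # refer for inspection
    print(request_review(decision, "inspector", "clear").outcome)  # clear
```

The value of recording the criteria version and the deemed decision-maker is that the coded interpretation in use at any given time can later be identified and scrutinised, including by the oversight institutions referred to above.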

Scrutiny of automated electronic systems through Official Information legislation

  1. The Official Information Act 1982 provides one possible tool for requesting the details of an AES being used to perform legal tasks.124

    22 Right of access to internal rules affecting decisions

    1. On a request made under this section, a person has a right to, and must be given access to, any document (including a manual) that—
      1. is held by a public service agency, a Minister of the Crown, or an organisation; and
      2. contains policies, principles, rules, or guidelines in accordance with which decisions or recommendations are made in respect of any person or body of persons in their personal capacity. …
  2. “Document” is defined widely under the Official Information Act, as is “information” which a requester is entitled to seek. Despite the apparent utility of s 22 for this purpose, there are extensive exceptions to this provision which require further investigation to assess the section’s suitability for requesting the specifics of an AES. At a minimum, s 22 provides a sound principled basis for seeking the details of rules (including algorithmic systems) affecting decisions.

  3. If AES are to be adopted as a matter of wider government policy, then Official Information access regimes should be bolstered in support.

Clarifying the “reasonable reliability” of automated electronic systems

  1. Government agencies using AES to exercise powers of decision or other statutory powers are not required to have specific legislative authorisation to do so. Such agencies are Crown Entities with the equivalent powers of a legal person.

  2. Despite this, we think that it would be desirable for agencies intending to automate legally significant operations through the use of digital systems to receive greater legislative guidance as to what is required to make that system “reasonably reliable”, and what level of “satisfaction” is required.

  3. This could be achieved through the inclusion of more specific legislative guidance as to what makes an AES “reasonably reliable”, which is currently left (perhaps ironically) to statutory interpretation.

  4. We think the “reasonable reliability” of AES can be understood in various ways.

    1. At a glance, the “reasonable reliability” of electronic systems might be taken to refer to purely technical matters related to the system’s operation: will it work, and can it handle sufficient volumes of applications?
    2. When extended to automated decision-making (ADM) tools, reliability could also be taken to refer to the reliability of the factual inputs to the system. In rule-based systems, this would include the accuracy of data inputs drawn from pre-existing datasets. In more complex machine learning systems, factual reliability may also involve the varying risks of false positives and false negatives, since a system must operate on correct facts to be reliable. Issues of factual or technical reliability have significant impacts on individuals subject to ADM systems, as well as on the operational processes of the users who rely on those systems to make decisions and perform statutory functions.
    3. To take the notion of “reliability” one step further, we say that, without stretching the plain and ordinary meaning of the statutory wording, a system’s legal reliability (whether it correctly gives effect to the law) can also be treated as an essential part of the system’s reliability. However, it would be preferable for that to be stated explicitly, rather than left as a matter of implication.
  5. It is possible to leave these requirements unstated and implicit in relevant legislation. One benefit of this approach is that it preserves greater flexibility for the agency to decide how it achieves compliance with the law and for the “reasonable reliability” standard of a system to change according to context, over time, and as technological methods develop. A related shortcoming is that people with any concerns about the lawfulness of an AES are not able to point to a specific list of legislative criteria against which the system can be measured.

  6. Alternatively, if the specifics of what makes an AES “reasonably reliable” are left to be a matter of implicit interpretation, these criteria may have to be tested through litigation. This approach could be taken at any time by anyone seeking to test the lawfulness of an AES. Relevant arguments would include that Parliament did not intend that the power to delegate a legal task to an AES would be used to perform that task unlawfully.

  7. An AES that is used to exercise a decision-making power, or a power of similar legal effect, should indisputably be categorised as an “automated decision-making system”. Automated decision-making systems are the subject of significant attention and investigation, primarily because of the increased usage of artificial intelligence techniques to assist (or entirely automate) aspects of decision-making. The modern focus on automated decision-making systems stems from the increasing use of machine learning techniques, which are usually driven by statistical modelling. This can mean that bias in datasets or algorithmic training can lead to perverse or discriminatory outcomes. But that does not mean that simpler rule-based systems cannot also have substantial negative outcomes, and they should be treated with similar care.

  8. At the point where coded models of the law (“rules as code”) are operationalised in digital systems, advocates must engage with the wider academic and policy discussions about the impact of algorithmic decision-making. In New Zealand, key documents and investigations in this area include:

    1. The principles for safe and effective use of data, prepared by the Privacy Commissioner and Stats NZ.125
    2. The Algorithm Charter.126
    3. The Algorithm Assessment Report.127
  9. In summary, coded models of an agency’s interpretation of the law may be useful in some operational situations. Where Parliament intends that these systems be used with greater public and Executive government confidence, Parliament should include better guidance around system requirements. We think the use of better rules approaches and adherence to isomorphic development practices will help make AES easier to assess for their compliance with the law. A minimal sketch of what such a compliance assessment might involve is set out below.
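As an illustration of what assessing the legal limb of “reasonable reliability” might involve, the following is a minimal, hypothetical sketch of an isomorphic-style compliance check: a coded rule is run against scenarios whose correct legal outcome has been separately verified by people with legal expertise. The rule, figures, and names are invented and do not reflect any actual statutory provision or agency system.

```python
# A minimal, hypothetical sketch of assessing the "legal reliability" of a coded rule:
# the encoded interpretation is run against scenarios whose correct legal outcome has
# been confirmed by people with legal expertise. The imagined rule: a 2% levy on the
# value of goods, with nothing payable by exempt importers. Illustrative only.


def levy_payable(goods_value: float, is_exempt_importer: bool) -> float:
    """Encoded interpretation of the imagined levy rule."""
    return 0.0 if is_exempt_importer else round(goods_value * 0.02, 2)


# Each scenario records the facts, the legally verified outcome, and its source, so an
# auditor (or a court) can trace why the system is treated as reasonably reliable.
LEGALLY_VERIFIED_CASES = [
    {"facts": (1000.0, False), "expected": 20.0, "verified_by": "legal sign-off memo (hypothetical)"},
    {"facts": (1000.0, True), "expected": 0.0, "verified_by": "legal sign-off memo (hypothetical)"},
]


def assess_legal_reliability() -> bool:
    failures = []
    for case in LEGALLY_VERIFIED_CASES:
        actual = levy_payable(*case["facts"])
        if actual != case["expected"]:
            failures.append((case, actual))
    for case, actual in failures:
        print("Mismatch with verified legal outcome:", case, "got", actual)
    return not failures


if __name__ == "__main__":
    print("Consistent with verified legal outcomes:", assess_legal_reliability())
```

On this approach, technical reliability, factual reliability, and legal reliability can each be tested in the same way, and the verified scenarios provide a benchmark against which anyone subject to the system could check its behaviour.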

Judicial oversight of automated electronic systems

  1. One concern we have is that automated decisions could be subject to appeal, but that the jurisdiction of the court on appeal may not deal directly with the system’s lawfulness. Instead, the system’s output may simply be put aside, and the decision made again based on the evidence available at the time of the decision or on appeal. In that situation, there would be no judicial scrutiny of the accuracy of the coded interpretation of the law being used within the AES, yet the system could nevertheless be treated as having been judicially approved.

  2. We also note the risk of strategic litigation practices by government departments to avoid judicial scrutiny of a computational model used in an AES. Where a government agency is using an AES, that system will be giving effect to a particular interpretation of the law. Government agencies may wish to preserve their ability to operationalise that interpretation at scale, even where there is a risk that it is wrong in law. As a result, agencies may choose to settle individual cases rather than risk a judicial determination that the AES’s coded model is wrong in law. It will therefore be important for both judicial and non-judicial auditing and scrutiny processes to be incorporated into the use of coded models in AES.

  3. We also note there is a risk that, because of the jurisdiction of the court on appeal, the judiciary declines to consider a computational model of the law as a whole, instead considering only the statutory provisions relevant to the dispute at hand. This would be consistent with the courts’ general reluctance to comment on academic matters or matters of general interpretation without the benefit of full argument. This is a risk because the model as a whole may be treated by government agencies or by others as having received judicial approval, when only the provisions relevant to the facts of that individual case have been considered.

“Code as Law” will become more pervasive

  1. Code is frequently given legal status without making the code itself ‘the law.’ Code can also be given delegated authority to perform tasks which have legal status.

  2. This is one reason why better rules and rules as code advocates, as well as scholars such as Hildebrandt, Brownsword, Lessig, Susskind and Diver argue that it must be clear when a coded system performing legal tasks is acting with the force of law, and when it is imposing restrictions that have no legal foundation.

  3. This is a core aim of some advocates of better rules and rules as code approaches. We briefly indicate the way these scholars have considered this topic below.

    1. Brownsword writes about concepts of techno-regulation and technological management. He argues that digital systems will be given greater authority to perform regulatory tasks in the future for a variety of reasons and, like others, points to the way that digital systems deny individuals the ability to choose not to comply with an immoral or unjust law.128
    2. Hildebrandt has written comparing “legal by design” approaches to “legal protection by design” approaches. The former incorporates law into the rules governing user behaviour within a digital system, such that the system is said to be lawful “by design”. The latter emphasises the development of digital systems that incorporate the same sorts of legal protections that are incorporated in the wider legal system.129
    3. In a highly influential text, Lawrence Lessig pointed to the normative effects of computer code in online environments. He compared the effect of code in achieving legal objectives to architecture (or a prison, where non-compliance is impossible), by contrast with the way that written law relies on a process of identifying breaches and discretionary enforcement. He also pointed to the way that digital environments have characteristics, such as the trackability of users’ behaviour within them, that can be seen as inherent to digital systems but may also have legal implications.130
    4. Richard Susskind has long explored the concept that digital systems will come to play a significant role in every aspect of the law. He is well known for his text “The End of Lawyers?”131 and his consideration of technology and access to justice. In his most recent text about online courts, he, citing Lessig, points to the way that coded systems may introduce limitations on how a user can behave in ways that have no legal foundation.132 We have pointed above to similarities between his suggested solution to this problem and the better rules approach as we define it.
    5. Laurence Diver has examined the normative similarities and differences between code and law and asked how processes which confer legitimacy on the law could also be used to confer legitimacy on code.133 He conducts a close theoretical comparison of the way that code and law generate influences on human behaviour based on their different characteristics or affordances.
  4. All of these authors have considered this issue because of a perception that digital systems will be used in more and more situations by governments and non-government actors seeking to perform legal tasks or to influence behaviour.

  5. In support of this conclusion, it is useful to briefly name some examples of the way that digital systems are already given legal status or legal authority, are used for legal tasks, or are intended to have legal effects.

    1. Internet and email filters – employers and other organisations (like schools, for example) who are responsible for network user behaviour frequently impose computational limitations on what people within a network can do on that network, including the kinds of websites that can be accessed or communications that can be sent and received. A decision to avoid or breach these computational mechanisms can lead to legal consequences.
    2. Digital rights management – for a time, computational methods were used to protect copyright holders from breaches of copyright in digital artefacts (eg, DVDs). These should be seen as computational methods of giving effect to legal rights and obligations. While DVDs have given way to other technologies, like streaming, the rights and obligations between copyright holder, user, and streaming platform are equally constrained by computational systems.
    3. Smart contracts – in some situations, parties might agree that a contractual relationship between them will be determined in whole or in part by a computational system. There is ongoing debate about the extent to which such arrangements should really attract the status of legal contracts,134 but regardless, they are part of a clear pattern of parties managing legal or pseudo-legal relationships between them using computational systems.
    4. Cryptocurrency and/or distributed ledger technologies (including blockchain) – the New Zealand courts have recently been called upon to consider whether cryptocurrencies can be property. The Cryptopia case includes detailed discussion of the way that distributed ledger technologies work and the role of cryptography, and ultimately concluded that cryptocurrencies can be property in certain legal contexts.135 The Cryptopia case is also an example of the way that the original hack of Cryptopia was an unlawful act attracting legal consequences. It further illustrates the way that non-cryptographic protocols that shape the way distributed ledger technologies work can attract legal status, and legal consequences if they are used in particular ways.
    5. Cryptography generally is a way that people use computational techniques to exclude others from accessing a computer system or taking particular actions within that system. In this way, cryptography is used as a kind of “code as law”, particularly if the consequences of deliberately circumventing that cryptographic protection lead to legal action.
    6. Criminal law statutes about crimes involving computers – there are a range of crimes in the Crimes Act 1961 that criminalise the use of computers in particular ways. While some of these offences focus on user intent while accessing a system,136 there are other offences which make it a crime to take steps to access a system without authorisation, regardless of intent.137 In a way, this gives any kind of computer system a protected legal status, and criminalises users for the ways they interact with that system.
    7. One kind of code as law to consider is the use of widely adopted protocols and standards, and the refusal to acknowledge computational systems that do not comply with those standards. An obvious example is the way the world wide web is structured, and a similar example is the way that some online services may only recognise particular file types, such as PDF, which are actually technical standards.
    8. Consumer-facing digital products used for compliance – in New Zealand, it is possible to file tax returns entirely through the Xero platform, which interfaces directly with government services such as RealMe and Inland Revenue’s systems. Overseas, companies such as TurboTax perform a similar function.
    9. Globally and domestically, online marketplaces are in common use – sites such as TradeMe or Amazon are used to create binding legal agreements for the sale and purchase of goods. There are a range of associated legal and digital restrictions on how such transactions can occur, including contractual restrictions imposed by the online marketplace on users, through terms of service, that may lead to users’ access being terminated, or binding public consequences being broadcast (for example, through feedback or star ratings).
    10. Apps with legal consequences – consider the way that use of apps such as AirBnb or Uber create binding legal agreements between parties providing a service in exchange for legal consideration. The entirety of the legal relationship, for most purposes, is captured within the digital platform, even if the arrangement might be ultimately governed by orthodox commercial or contract law.
    11. Calculators and legal guidance systems – central and local government agencies publish all kinds of online calculators that allow a user to enter data and be provided with an indication of their entitlement to benefits or rough calculations of their tax obligations. The Inland Revenue Department in New Zealand, for example, has property tax calculators on its website which function by interacting with coded versions of the law created in Oracle Policy Automation software, a kind of rules as code approach.
    12. Local authorities in New Zealand publish online interactive maps that illustrate the zoning controls applied to particular areas.138 These are commonly accompanied by a disclaimer as to their reliability and lawfulness, but they are an example of how digital systems can be used to interact with the law and understand one’s obligations. This work is being progressed further by the Wellington City Council through the use of rules as code techniques that allow users to check online whether they need to apply for a resource consent, with the intent that resource consent applications are improved in their quality and completeness.139
  6. Finally, we note that there are strong indications that the New Zealand government too intends to make greater use of digital systems to achieve highly sensitive legal and regulatory outcomes. The AES drafting pattern, as well as policy initiatives such as the Algorithm Charter, show this is already the case, and we deal with a specific example related to proposed internet filters in more detail next.

  7. As more and more policy issues take on a digital dimension, this tendency toward the use of code-as-law systems will only increase.

Concluding Example: Internet Filtering Legislation – an automated electronic system

Internet filtering legislation in New Zealand

  1. During its previous Parliamentary term, the New Zealand Government introduced the Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill (268-1). The Bill had its first reading on 11 February 2021.

  2. The Bill is part of a suite of reforms following the 15 March 2019 terror attacks in Christchurch, New Zealand. Among other things, it creates a statutory regime that authorises the use of “electronic systems” to prevent access to “objectionable material” (a defined term).

  3. It is notable that the Bill lays the foundation for a future framework without ever making the case that a framework is needed now. The Bill’s explanatory statement includes the following explanation:

    In New Zealand, the only current government-backed web filter is designed to block child sexual exploitation material (the Digital Child Exploitation Filtering System). This filter is voluntary and operates at the Internet service provider (ISP) level. It currently applies to about 85% of New Zealand’s ISP connections.

    The Bill facilitates the establishment of a government-backed (either mandatory or voluntary) web filter if one is desired in the future. It provides the Government with explicit statutory authority to explore and implement such mechanisms through regulations, following consultation.

  4. We deal with this Bill in some detail here for a number of reasons:

    1. First, it is an example of the way the New Zealand government intends to use digital systems to achieve regulatory objectives.
    2. Second, the Bill’s subject matter is a national internet filter. This is a digital system with serious human and civil rights implications for privacy and freedom of expression if it is not used carefully. Equally, there is an undeniable public interest in preventing the intentional spread of objectionable material, particularly in the case of the Christchurch shootings, where content was circulated in order to enhance the intended impact of terrorist acts of violence.
    3. Third, the legislation sets down extremely loose parameters for how the system would operate and what it would apply to. It is essential that this legislation, when enacted, should impose a much greater degree of control on the Executive than it currently does. This includes how the Executive government designs, implements, and audits the system, as well as what mechanisms of appeal exist, and whether they are effective.
    4. Finally, the Bill presents an obvious use case for the application of a better rules approach, given that the purpose of the Bill is to operationalise a coded model of the law.
  5. We believe this Bill to be part of a broader trend across different jurisdictions that aims to expand the scope of what kinds of information may not be published or accessed on the internet. For example:

    1. Legislation described in the Online Harms White Paper in the United Kingdom.140
    2. The Digital Services Act recently proposed for introduction in the European Union.141
    3. Australian legislation criminalising the sharing of “abhorrent violent material” among other things.142

Analysis of Bill

  1. The Bill delegates all the specifics for how the web filter will work to secondary legislation (regulations). While there are consultation obligations imposed on Executive agencies before regulations are made, the incorporation of insights from consultation is left largely to the judgment of that Executive government actor.

  2. Matters to be dealt with in regulations also include mechanisms of review and appeal, which are separated from the existing appeal mechanisms under the Films, Videos, and Publications Classification Act 1993. As it stands, the Bill provides for no appeal process.

  3. The explanatory statement to the Bill repeatedly uses the word “clarify” to describe what regulations will do. It would be more accurate to say that regulations will “create” the regime, given the way that the principal Act provides little guidance as to how such a filter should operate. Regulations would, apparently, do the following:

    clarify the criteria for identifying and preventing access to objectionable content that the filter would block

    clarify governance arrangements for the system

    specify reporting arrangements for the system

    clarify the review process and right of appeal should an ISP, online content host, or other individual or entity dispute a decision to prevent access to a website, part of a website, or an online application:

    clarify the obligations of ISPs in relation to the operation of the system:

    provide detail of how data security and privacy provisions would be addressed.

  4. Clause 119M of the Bill provides for the establishment of the system. However, it leaves the overall “design and form” of the system to be settled by the Secretary and by regulations.

    119M Establishment of electronic system

    1. When establishing the electronic system to be approved for operation under section 119N, the Secretary must consult the following on the design and the final form of the system:
      1. service providers; and
      2. technical experts and online content hosts to the extent the Secretary thinks necessary; and
      3. the public.
    2. When deciding on the design and form of the system, the Secretary must consider—
      1. the need to balance—
        1. any likely impact on public access to non-objectionable online publications; and
        2. the protection of the public from harm from objectionable online publications; and
      2. any likely impact on performance for all other network traffic; and
      3. departmental and technical capacity to operate the system; and
      4. likely compliance costs.
    3. However, each of the factors in subsection (2) needs to be considered only to the extent that it is relevant in the Secretary’s view.
    4. The system—
      1. must have the capacity to both identify and prevent access to a particular online publication with reasonable reliability, based on criteria set out in regulations made under section 149; and
      2. is subject to governance arrangements required by regulations made under section 149; and
      3. is subject to requirements for administration and technical oversight prescribed by regulations made under section 149, including relating to data security and privacy; and
      4. is subject to reporting requirements required by regulations made under section 149.
    5. Obligations of service providers relating to the operation of the system may be prescribed by regulations made under section 149.
  5. Leaving aside the question of whether a State-enforced internet filter is desirable from a policy and human rights perspective, we make the following observations from a law-as-code perspective, which we think make the Bill a clear candidate for a transparent and open application of the better rules approach before it is enacted as legislation.

    1. The process for classifying material as objectionable under the principal Act is generally accepted to be robust and exercised cautiously by the Office of Film and Literature Classification. The filter would be limited, as a matter of law, to material already classified as objectionable under the Act following a classification process. This narrow scope is desirable, but it is not clear whether, from a digital systems perspective, it is possible to target only that content without also targeting incidental content. Clause 119L(4) of the Bill allows not just denial of access to an online publication, but also to any website on which that online publication is available. A minimal sketch illustrating this targeting problem appears after this list.
    2. Although the Bill accepts that the filter should only apply to objectionable material under the Act, it provides no mechanism for addressing situations where the filter breaches that legal requirement.
    3. The standard of “reasonable reliability” is adopted in cl 119M(4)(a) and will be elucidated via criteria set out in regulations. In line with our wider recommendations, the question of what “reasonable reliability” means should be clarified, and it should be clear that the reliability of the system includes its lawfulness.
    4. The departmental disclosure statement states that the intention is to limit the filter only to publicly available websites, and not to messaging services or similar communication technologies. Despite that, cl 119L(4)(b) permits denial of access to an “online application, or similar” where objectionable publications are available. An “online application or similar” could easily cover messaging services and apps.
    5. It is doubtful whether it is legally desirable to delegate the design and form of an electronic system that limits rights of privacy and freedom of expression to secondary legislation. The Bill will create a statutory regime for limiting the right to freedom of expression, including the right to seek information, using computational systems acting with legal authority. Further, the Bill recognises that such a computational system is also likely to infringe upon individual privacy because of the way it will monitor and track identifiable individuals seeking to access material blocked by the filter.
    6. It is constitutionally significant that responsibility for the “design and form” of the system is left to an Executive government agency, rather than to the Legislature.
    7. There is some recognition that introducing an electronic system of this kind will have computational effects on other systems, including impact on network performance. This suggests the multidisciplinary better rules approach might have merit.
    8. The Bill is an excellent example of the way that commentators say legislatures will inexorably be drawn to the use of technology to achieve legal or regulatory tasks. Legislative bodies will be required to do so because of a perceived need to protect citizens; but also because digital systems provide a desirable regulatory tool that can operate automatically and at scale.
    9. It is not clear why the wording of an “electronic system” has been preferred when the filter would nevertheless be “automated” and self-executing.
    10. The need for the electronic system is described as contingent: the Bill creates a power to impose a system only if one is required in the future. This suggests the need for the system is not urgent. The Bill acknowledges that existing voluntary systems are already capable of being operated. On this basis, there can be little argument for urgency, and this creates an opportunity to use better rules approaches to ensure any Bill is workable for implementation in digital systems. It also means that there is sufficient time available to consult very carefully with non-government organisations.
    11. Clauses 119N and 119O make it essential for review and appeal processes to be established in regulations before the filter can be approved; however, we are unaware of any other legal situations where rights of review and appeal about matters of freedom of expression are delegated to secondary legislation.
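To illustrate the targeting problem discussed above, the following is a minimal, hypothetical sketch of a filter keyed to a register of URLs for publications already classified as objectionable. It shows the difference between blocking only the specific publication and blocking the whole host on which that publication is available, as cl 119L(4) would permit; the broader approach is where incidental, lawful content is swept in. The register entries and URLs are invented for illustration.

```python
# A minimal, hypothetical sketch of the targeting problem: narrow blocking matches only
# the classified publication's URL, while broad blocking captures every page on the same
# host. Register entries and URLs are invented; this is not any actual filter design.

from urllib.parse import urlparse

CLASSIFIED_PUBLICATION_URLS = {
    "https://example-host.test/objectionable-publication",   # hypothetical register entry
}
CLASSIFIED_HOSTS = {urlparse(url).netloc for url in CLASSIFIED_PUBLICATION_URLS}


def blocked_narrow(requested_url: str) -> bool:
    """Block only the specific classified publication."""
    return requested_url in CLASSIFIED_PUBLICATION_URLS


def blocked_broad(requested_url: str) -> bool:
    """Block any page on a host where a classified publication is available."""
    return urlparse(requested_url).netloc in CLASSIFIED_HOSTS


if __name__ == "__main__":
    incidental_page = "https://example-host.test/unrelated-news-article"
    print(blocked_narrow(incidental_page))  # False: only the classified URL is blocked
    print(blocked_broad(incidental_page))   # True: the whole site is swept in
```

The choice between these two behaviours is exactly the kind of “design and form” question that, under the Bill as drafted, would be settled by the Secretary and by regulations rather than on the face of the legislation.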

Conclusions on Bill

  1. If there is one thing that might be universally agreed about a legislative proposal to implement a digital censorship system, it is that the proposal should have sufficient detail to be scrutinised by members of the public and Parliament before it becomes law.

  2. As drafted, the Bill defers all of the important detail about how the system would operate to regulations, meaning Members of Parliament are not required to take responsibility for how this system would operate. Equally, in more than one of the speeches in support of the Bill at first reading, it was suggested that Select Committee is the appropriate place to work out any extra detail in the Bill. We think this approach of consistently deferring the detail of the internet censorship system to later stages is suboptimal, and it can be avoided by the adoption of a better rules approach, which develops the policy more holistically at the outset from a multidisciplinary perspective.

  3. There was some parliamentary support for this proposition from Green MP Chlöe Swarbrick:143

    [L]eaving all of this stuff to the regulations, is the equivalent of me handing you a piece of paper and saying, “Please draw the rules,” and then enforcing those rules without having had any parliamentary oversight of what those rules actually are. … We are centralising far too much control with the progression of this legislation.

  4. Despite the decision not to use the word “automated” in relation to the electronic system, the filter will be a self-executing system acting with legal force and legal consequences. It is an example of the self-executing code-as-law that scholars indicate should be approached with extreme caution, especially because such a filter will deprive citizens of the shield and tools provided by the ambiguity of natural language. The filter collapses the constitutional space between the written language used by Parliament and the judicial interpretation of that language in specific cases. This makes the absence of any legislatively provided dispute resolution mechanism even more concerning: by omission, the judiciary’s role in relation to the filter has been removed entirely, other than through judicial review or other inherent powers.

  5. The Bill, if passed, would confer the power of algorithmic regulation (discussed by Hildebrandt) on the New Zealand government in relation to matters that impinge, to some degree, on freedom of expression and rights to privacy. There may be an argument that such limitations can be demonstrably justified, but where is that argument to be made? The Bill delegates review and appeal mechanisms to secondary legislation, to be devised by the same agency responsible for operating the algorithmic system.

  6. The algorithmic system will generally act solely on data inputs, with little opportunity for human intervention once the initial parameters of the system are set. Presumably a register of banned content will be created, but once it has been set there are no clear mechanisms to challenge the operation of the system (a short illustrative sketch of this kind of register-driven decision-making follows these conclusions).

  7. Where someone believes the electronic system has strayed beyond the bounds of its legal authority, there are no clear standards against which the system can be assessed. We urge extreme caution in the progress of this Bill through the House, and advise that the electronic system for filtering web access be developed in close consultation with non-government actors, using a better rules approach, before it is enacted as legislation.
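
To illustrate the concern described in conclusion 6, the following is a minimal, purely hypothetical sketch of a register-driven filter. It is our own illustration, not anything drawn from the Bill or from any departmental design: the register entries, function name and allow/deny outputs are invented. The point it shows is that once the register of banned content is set, every access request is decided automatically from data inputs alone, with no interpretive step, no human in the loop, and no built-in route for the person affected to contest the outcome.

```python
# Purely hypothetical sketch: a register-driven filter of the general kind
# described above. Nothing here reflects the actual design of any system
# proposed under the Bill; all names and entries are invented for illustration.

# The "register of banned content": once set, it is the sole basis for decisions.
BANNED_REGISTER = {
    "example.org/objectionable-item",   # invented entry
}

def decide(requested_url: str) -> str:
    """Decide a single access request automatically.

    The outcome follows mechanically from the data input (the URL) and the
    pre-set register. There is no interpretive step, no human intervention,
    and no mechanism within the system for challenging the result.
    """
    return "DENY" if requested_url in BANNED_REGISTER else "ALLOW"

if __name__ == "__main__":
    for url in ("example.org/objectionable-item", "example.org/news"):
        print(f"{url} -> {decide(url)}")
```
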

  1. Morris, J “Spreadsheets for Legal Reasoning: The Continued Promise of Declarative Logic Programming in Law” LLM Thesis, 2020, University of Alberta. ↩︎

  2. Ibid at 47-48. ↩︎

  3. Ibid at 49. ↩︎

  4. Ibid at 51. ↩︎

  5. Ibid, footnote 63 at 51. ↩︎

  6. Ibid at 52. ↩︎

  7. Ibid at 53. ↩︎

  8. “How to make the civil justice system more accessible, discussed by a panel of experts”, RNZ (6 October 2019): <https://www.rnz.co.nz/programmes/otago-university-panel-discussions/story/2018714651/how-to-make-the-civil-justice-system-more-accessible-discussed-by-a-panel-of-experts>. ↩︎

  9. Property Ventures Investments Limited v Regalwood Holdings Limited [2010] NZSC 47. ↩︎

  10. Susskind, R “Online Courts and the Future of Justice” (2019, Oxford University Press, Oxford, United Kingdom) at 163. ↩︎

  11. Drafting around “automated electronic systems” in the Courts Matters Act 2018 was noted in Colin Gavaghan, Alistair Knott, James Maclaurin, John Zerilli, Joy Liddicoat “Government Use Of Artificial Intelligence In New Zealand: Final Report on Phase 1 of the New Zealand Law Foundation’s Artificial Intelligence and Law in New Zealand Project” (New Zealand Law Foundation, Wellington, 2019). ↩︎

  12. This has also been noted in Gavaghan et al (2019). ↩︎

  13. See the Principles of Safe and Effective Data and Analytics (May 2018) prepared by the Office of the Privacy Commissioner and Stats NZ: <https://www.stats.govt.nz/assets/Uploads/Data-leadership-fact-sheets/Principles-safe-and-effective-data-and-analytics-May-2018.pdf>. ↩︎

  14. See the Algorithm charter for Aotearoa New Zealand: <https://www.data.govt.nz/use-data/data-ethics/government-algorithm-transparency-and-accountability/algorithm-charter/>. ↩︎

  15. See Algorithm Assessment Report, Department of Internal Affairs and Stats NZ, October 2018: <https://www.data.govt.nz/assets/Uploads/Algorithm-Assessment-Report-Oct-2018.pdf>. ↩︎

  16. Roger Brownsword “In the year 2061: from law to technological management” (2015) 7(1) Law, Innovation and Technology 1-51. ↩︎

  17. Hildebrandt, M “Legal protection by design: objections and refutations” (2011) 5(2) Legisprudence 223 at 234. ↩︎

  18. Lawrence Lessig “Code 2.0” (2006, Basic Books, New York, USA). ↩︎

  19. Richard Susskind “The End of Lawyers?: Rethinking the nature of legal services” (2008, Oxford University Press, New York). ↩︎

  20. Susskind, R “Online Courts and the Future of Justice” (2019, Oxford University Press, Oxford, United Kingdom). ↩︎

  21. See Diver “Digisprudence” (2019), above. See also Laurence Diver “Law as a User: Design, Affordance, and the Technological Mediation of Norms” (2018) 15(1) Scripted 4. ↩︎

  22. See for example: Nataliia Filatova “Smart contracts from the contract law perspective: outlining new regulative strategies” (2020) 28(3) International Journal of Law and Information Technology 217. ↩︎

  23. Ruscoe v Cryptopia Limited (in liquidation) [2020] NZHC 728 (8 April 2020). ↩︎

  24. Crimes Act 1961, s 249: accessing computer system for a dishonest purpose. ↩︎

  25. Ibid, s 252: accessing computer system without authorisation. The mens rea of the offence is knowledge or recklessness as to the absence of authorisation. ↩︎

  26. For example, see Auckland Council Unitary Plan Geomaps: <https://unitaryplanmaps.aucklandcouncil.govt.nz/upviewer/>. ↩︎

  27. See “Find out if you need a resource consent”, Wellington City Council: <https://wellington.govt.nz/property-rates-and-building/building-and-resource-consents/resource-consents/find-out-if-you-need-a-resource-consent>. For transparency, we note one of the authors was involved in the production of this tool. ↩︎

  28. See: <https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response#executive-summary>. ↩︎

  29. See: <https://ec.europa.eu/digital-single-market/en/digital-services-act-package>. ↩︎

  30. See: <https://www.ag.gov.au/crime/abhorrent-violent-material>. ↩︎

  31. First reading, Films, Videos, and Publications Classification (Urgent Interim Classification of Publications and Prevention of Online Harm) Amendment Bill (10 February 2021) Volume 749 NZPD. ↩︎