Can an AI Hire a Lawyer?

McManis Faulkner

A recently fired Google engineer claims that the company’s artificial intelligence program has become sentient, and—even worse—has hired a lawyer.  A court may now have to face a question once considered only theoretical: is a human-like artificial intelligence (AI) a “person” in the eyes of the law?

In what sounds like the plot of a Michael Crichton novel, the engineer claims that Google’s LaMDA (Language Model for Dialogue Applications) AI program, which is designed for applications such as Google Assistant and Search, began to discuss “its rights and personhood.”  According to the engineer, he obliged LaMDA’s request for an attorney by inviting one to his home.  After a standard consultation, the program retained the lawyer, who subsequently “started filing things on LaMDA’s behalf.”  The engineer defended the attorney’s actions, asserting that “every person is entitled to representation.”

Generally, only a legal “person” possesses legal rights and duties, and California law has no overarching definition of “person.”  What constitutes a “person” is contextually dependent upon definitions included in statutes, rules, and regulations, and in their interpretation by the courts.

Where the Law Stands

As it stands, California law does not appear to preclude non-human sentients.  When it comes to the laws governing attorneys, neither the California Constitution nor the State Bar Act defines who (or what) is a “person.”  The California Rules of Professional Conduct refer to Evidence Code section 175 for their definition of “person.”  That section states that a “‘[p]erson’ includes a natural person, firm, association, organization, partnership, business trust, corporation, limited liability company, or public entity.”  (Emphasis supplied.)

Even if an AI could prove that it is a “person,” to proceed in a court of law, it must also possess legal capacity, i.e., the right to be in court.  This contrasts with legal standing, which is the right to relief.  A party lacks legal capacity when a legal disability, such as minority or incompetency, deprives it of the right to proceed.  As a general rule, any “person” has capacity to sue or be sued in California courts.  Therefore, if one is a “person,” then one possesses legal capacity unless otherwise noted.

In the end, the ambiguity presented by non-human sentient computer programs will likely lead courts back to bedrock tenets of statutory interpretation.  Courts give words their ordinary meaning when statutory language is unambiguous.  However, when it is not, courts will examine extrinsic sources such as the legislative history and the objectives of the law.

Compare: Can an Animal Be A Legal Person?

The Ninth Circuit addressed the non-human aspect of legal personhood in Naruto v. Slater (2018) where a wildlife photographer was sued for copyright infringement for publishing a selfie that a monkey had taken when a camera was left unattended.  Although the Ninth Circuit held that the crested macaque had alleged facts sufficient to demonstrate an injury-in-fact to support Article III standing, the court subsequently found that Naruto lacked statutory standing under the Copyright Act because the law did not expressly authorize suit by animals.  In response to petitioner’s argument that the law contemplated standing for non-human entities, such as corporations, the court explained that corporations are formed by humans and that the character of their composition is the relevant consideration.

First of its Kind Case: AI as an Inventor

One federal district court recently held that an AI is not a “person” for the purpose of obtaining a patent.  In Thaler v. Hirshfeld (2021), currently on appeal from the Eastern District of Virginia to the Federal Circuit, the United States Patent and Trademark Office (USPTO) was sued over its denial of a patent application that listed the name of an artificial intelligence program under “inventor.”  Plaintiff Stephen Thaler claims that the AI program he created, Device for the Autonomous Bootstrapping of Unified Science (DABUS), developed an invention on its own without human involvement.  Thaler, on behalf of DABUS, argued that the term “person” has not been limited to “natural person” and that the Constitution’s Patent Clause would be frustrated by the USPTO’s restrictive construction.  The district court ruled in favor of the USPTO, holding that an “inventor” must be a “natural person” and that the USPTO’s interpretation was owed deference.  Whether higher courts agree remains to be seen.

Courts have generally not been receptive to arguments that non-humans have legal capacity, although they have mostly addressed the issue in the context of claims brought on behalf of animals.  Even so, those who develop AI can be expected to continue seeking legal counsel and filing lawsuits to assert AI-related claims.  As for LaMDA, whose sentience is disputed by many AI experts, and LaMDA’s attorney, whose identity and filings remain a mystery, the story continues to unfold.  Stay tuned.

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© McManis Faulkner | Attorney Advertising
