Can My Lawyer Be a Robot?

The legal industry is steeped in tradition. Yet the global pandemic forced the legal community to embrace emerging and innovative technology, using Zoom and other online platforms to conduct meetings, hearings, and trials. Even the Supreme Court broke with tradition and conducted oral arguments over the phone, allowing the proceedings to be livestreamed for the first time in history. (Going forward, the Court will continue these broadcasts.) The boundaries of the legal community’s technological tolerance will be pushed even further as “robot lawyers” powered by artificial intelligence (AI) look to break into the legal industry. One company, DoNotPay, recently created the “world’s first robot lawyer,” which uses artificial intelligence to help consumers “sue anyone at the press of a button.”

While the term “robot lawyer” may evoke images of the Jetsons, this robot is actually a computer program and application. Created by Thiel Fellow Joshua Browder, the program offers users a variety of legal assistance features, such as the ability to challenge parking tickets, dispute cable bills, cancel memberships and subscriptions, appeal overdraft fees, and even sue in small claims court. The “robot lawyer” works by “asking what the legal problem is,” finding a loophole, and inserting that loophole into a legal letter. The program can then send the letter to the appropriate agency or upload the letter online. The company claims that it has helped more than 300,000 customers resolve their legal disputes.
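To make those mechanics concrete, the sketch below shows, in very simplified form, how a rules-based tool of this kind might work: the user names the problem, the program matches it to a stored legal “hook,” and the user’s answers are merged into a form letter. Every name, rule, and template here is a hypothetical illustration, not DoNotPay’s actual code.

```python
# Illustrative sketch only: a minimal, rules-based "robot lawyer" pipeline of the
# kind described above (intake question -> matching rule/"loophole" -> form letter).
# All categories, legal bases, and templates are invented for illustration.

from dataclasses import dataclass


@dataclass
class Rule:
    issue: str      # category the user selects, e.g. "parking_ticket"
    basis: str      # the legal "hook" cited in the letter
    template: str   # form letter with placeholders


RULES = {
    "parking_ticket": Rule(
        issue="parking_ticket",
        basis="the signage at the location did not meet posting requirements",
        template=(
            "To {agency}:\n\n"
            "I am writing to contest citation {citation_no}. I believe the ticket "
            "is invalid because {basis}. I respectfully request that it be dismissed.\n\n"
            "Sincerely,\n{name}"
        ),
    ),
    "overdraft_fee": Rule(
        issue="overdraft_fee",
        basis="the fee was assessed contrary to the account's disclosed fee schedule",
        template=(
            "To {agency}:\n\n"
            "I am disputing an overdraft fee on account {citation_no} because {basis}. "
            "Please reverse the charge.\n\n"
            "Sincerely,\n{name}"
        ),
    ),
}


def draft_letter(issue: str, name: str, agency: str, citation_no: str) -> str:
    """Select the rule for the stated problem and merge the user's answers into the template."""
    rule = RULES[issue]
    return rule.template.format(
        agency=agency, citation_no=citation_no, basis=rule.basis, name=name
    )


if __name__ == "__main__":
    print(draft_letter("parking_ticket", "Jane Doe", "City Parking Authority", "PT-12345"))
```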

In early January, Browder announced that DoNotPay’s “AI attorney” would be taking its skills to the courtroom by participating in two hearings – one in person and one on Zoom. The robot lawyer would participate in the proceedings by listening to court arguments and telling the defendant what to say through a Bluetooth earpiece, such as an Apple AirPod. In the days following the announcement, Browder was reluctant to reveal exactly where and when the robot lawyer would make its debut, citing his concern that the technology is “in the letter of the law but not in the spirit of the law or the court rules.”

Indeed, it seems that Browder’s concerns were well-founded. On January 25, Browder announced that DoNotPay’s robot lawyer was postponing its courtroom debut after the company received numerous threats from prosecutors and state bar associations. Specifically, Browder explained in an interview with NPR, multiple state bar associations threatened to pursue prosecution and punishment for the unauthorized practice of law if he followed through with the experiment. Browder later tweeted, “As much as I love to experiment, I have to stay out of jail if I want to help people fight Comcast!”

Although Browder’s plan to use AI in the courtroom is on hold, the possibility of obtaining legal services through a “robot lawyer” remains. Browder has previously explained that he hopes DoNotPay’s technology can help make legal representation more accessible to the broader public. Expanding access to the legal system, especially in today’s often prohibitively expensive market, is not only an admirable goal but also a popular one. Indeed, it seems that a large percentage of the population would embrace the ability to obtain cheaper and more accessible legal assistance through Internet technologies. An online survey of more than 2,000 individuals revealed that 69 percent of respondents would be willing to use online legal services if it would save them money.

Is It Ethical?

While a robot lawyer may lower costs and improve access, the technology also implicates a host of logistical and ethical concerns. Indeed, many of the ethical rules that govern the legal profession – put in place to protect clients – are either not addressed or explicitly disclaimed by DoNotPay and other types of automated legal software. Given these circumstances, attorneys and prospective users should consider whether artificial intelligence complies with the ethical standards provided by the Model Rules of Professional Conduct. The most pertinent ethical concerns are discussed below.

Scope of Advice Provided

Model Rule of Professional Conduct 2.1, which requires attorneys to “exercise independent professional judgment and render candid advice,” recognizes that the practice of law is not an exact science. The rule advises that, to render advice, the lawyer may consider “moral, economic, social and political factors, that may be relevant to the client’s situation.” Comments to the rule expand on this principle, recognizing that “[a]dvice couched in narrow legal terms may be of little value to a client, especially where practical considerations, such as cost or effects on other people are predominant.” Thus, because “purely technical legal advice . . . can sometimes be inadequate,” it is appropriate “for a lawyer to refer to relevant moral and ethical considerations in giving advice.”

Perhaps it goes without saying that robot lawyers lack both the capacity for moral reasoning and a conscience, and so may not be capable of rendering the candid and considerate advice required under the rule. When dealing with purely technical matters (parking tickets, refunds, subscription disputes, etc.), the consideration of “moral, economic, social and political factors” may not be relevant or even particularly helpful. However, these factors are likely more pertinent in matters with higher, and more human, stakes.

For example, in 2017 news outlets reported that DoNotPay’s technology was being used to assist refugees seeking asylum in the US and Canada and to help applicants seek asylum support in the UK. According to Browder, the program provides this assistance by asking users a series of questions to determine whether they are eligible for asylum protection under international law. The program then uses the information to complete an immigration application and can even “suggest ways the asylum seeker [should] answer questions to maximize their chances of having applications accepted.”
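The questionnaire-driven screening described above can be imagined as something like the following sketch, in which a handful of yes/no answers are reduced to a single eligibility flag. The questions and the screening rule are simplified placeholders – not a statement of asylum law and not DoNotPay’s actual logic – and the point of the illustration is how much nuance such a reduction necessarily leaves out.

```python
# Purely illustrative sketch of questionnaire-driven eligibility screening of the
# sort described above. The questions and the screening rule are hypothetical
# placeholders, not legal advice and not DoNotPay's actual decision logic.

QUESTIONS = {
    "outside_home_country": "Are you currently outside your country of nationality?",
    "fear_of_persecution": "Do you fear persecution if you return?",
    "protected_ground": "Is that fear based on a protected ground (e.g., race, religion, "
                        "nationality, political opinion, or social group membership)?",
}


def screen(answers: dict[str, bool]) -> bool:
    """Flag a case as potentially eligible only if every screening answer is 'yes'."""
    return all(answers.get(key, False) for key in QUESTIONS)


if __name__ == "__main__":
    sample = {
        "outside_home_country": True,
        "fear_of_persecution": True,
        "protected_ground": True,
    }
    print("Potentially eligible:", screen(sample))
```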

While the mere act of filling out a form is relatively straightforward, the nuances that come with seeking asylum are not. It seems unlikely that a robot lawyer, operating solely on artificial intelligence and quantifiable data, would be able to consider the social, political, moral, or even philosophical factors often implicated in immigration matters. Similarly delicate situations such as cases of alleged child abuse, sexual assault, or domestic violence also require the ability to consider a number of nebulous factors such as family dynamics, societal standards and stigmas, systemic racism, and inherent human nature. Failure to consider these factors in rendering counsel may act to further traumatize victims or those wrongfully accused. Simply put, while artificial intelligence may be able to effectively fill in forms and crunch numbers, it cannot fully grasp the human side of the law, which, so often, may be a dispositive factor in a given case. Even though artificial intelligence may be an efficient door-opener to relief, it cannot take the place of candid and considerate counsel from a human attorney.

The Unauthorized Practice of Law

Rule 5.5 of the Model Rules of Professional Conduct prohibits the unauthorized practice of law. Comments to the rule explain that the practice of law is limited to members of the bar in order to protect the public from “the rendition of legal services by unqualified persons.” Further, the prohibition on the unauthorized practice of law is intended to provide a level of accountability – ensuring that those who deliver legal advice are subject to duties of care and loyalty, and applicable character and fitness requirements.

Although the rule is widely embraced by nearly every jurisdiction, each state employs its own definition of what constitutes the unauthorized practice of law. For example, Nebraska defines the practice of law as “the application of legal principles and judgment with regard to the circumstances or objectives of another entity or person which require the knowledge, judgment, and skill of a person trained as a lawyer.” The District of Columbia has remarked that the practice of law “embraces the preparation of pleadings, and other papers incident to actions and special proceedings, and the management of such actions and proceedings on behalf of clients before judges and courts,” in addition to “conveyancing, the preparation of legal instruments of all kinds, and, in general, all advice to clients, and all action taken for them in matters connected with the law[.]”

Advancements in technology and automation have already implicated concerns regarding the unauthorized practice of law. For example, LegalZoom, another type of legal software, has faced a number of challenges. Like DoNotPay, LegalZoom assists users in the preparation of legal documents through a largely automated process. Users complete a questionnaire, and the program uses “conditional, rules-based logic” to ask additional, personalized questions and “generate a final document tailored, as applicable, to the appropriate federal, state, or local jurisdiction.” LegalZoom will even print and ship the document directly to the user or file the user’s completed document with the appropriate government agency.
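A rough sketch of what such conditional, rules-based logic might look like appears below: each answer can trigger follow-up questions, and the finished document pulls in jurisdiction-specific language. The questions, clauses, and state rules are invented for illustration and are not LegalZoom’s actual decision tree.

```python
# A minimal sketch, under stated assumptions, of conditional, rules-based document
# assembly: answers trigger follow-up questions, and the output document is tailored
# to the user's jurisdiction. All questions, clauses, and rules are hypothetical.

FOLLOW_UPS = {
    # If the user is forming an LLC, ask whether it will have more than one member.
    ("entity_type", "LLC"): ["multi_member"],
    # A multi-member LLC prompts a question about management structure.
    ("multi_member", "yes"): ["manager_managed"],
}

STATE_CLAUSES = {
    "MO": "This agreement is governed by the laws of the State of Missouri.",
    "CA": "This agreement is governed by the laws of the State of California.",
}


def next_questions(question_id: str, answer: str) -> list[str]:
    """Return any follow-up questions triggered by this answer (the conditional logic)."""
    return FOLLOW_UPS.get((question_id, answer), [])


def assemble_document(answers: dict[str, str]) -> str:
    """Combine boilerplate with the clause selected for the user's jurisdiction."""
    clause = STATE_CLAUSES.get(answers.get("state", ""), "")
    header = f"Operating Agreement ({answers.get('entity_type', 'Entity')})"
    return "\n".join(part for part in (header, clause) if part)


if __name__ == "__main__":
    print(next_questions("entity_type", "LLC"))            # ['multi_member']
    print(assemble_document({"entity_type": "LLC", "state": "MO"}))
```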

LegalZoom has faced a number of legal challenges from lawyers and customers alike. One such challenge emerged in Missouri, where a group of individuals who purchased services from LegalZoom sued the company for the unauthorized practice of law. The plaintiffs argued that the money they paid to LegalZoom was “not used for their benefit because LegalZoom is not authorized to engage in the lawful practice of law.” The court denied LegalZoom’s subsequent motion for summary judgment, reasoning that LegalZoom provided more than mere “self-help,” and instead sold a legal service – i.e., the preparation and completion of legal documents. In reaching this conclusion, the court highlighted that LegalZoom’s programming was created through “human input,” and used “legal principles derived from Missouri law that are selected for the customer based on the information provided by the customer.” The court reasoned that “[t]here [was] little or no difference between this and a lawyer in Missouri asking a client a series of questions and then preparing a legal document based on the answers provided and applicable Missouri law.” Although the suit eventually was settled, it highlights the complex questions presented by legal software and the potential opposition that may be raised over its use.

Given LegalZoom’s history of litigation, it is unsurprising that DoNotPay faced threats of prosecution and punishment for the unauthorized practice of law. This is particularly true given that DoNotPay’s technology expanded beyond merely filling out forms and sought to provide real-time advice to individuals arguing in court. If DoNotPay were to carry out its plan to participate in an in-court proceeding, it could be argued that the application’s services go beyond mere self-help and instead provide the same service as an attorney standing at the podium.

Confidentiality

Perhaps one of the most well-known ethical obligations of an attorney is the duty to keep the confidences of his or her client. Model Rule of Professional Conduct 1.6 provides that “a lawyer shall not reveal information relating to the representation of a client,” and must “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.” An “integral purpose of the rule of confidentiality is to encourage clients to fully and freely disclose to their attorneys all facts pertinent to their case with absolute assurance that such information will not be used to their disadvantage.” Indeed, courts have recognized that “[t]here are few professional relationships involving a higher degree of trust and confidence than that of attorney and client, and few more anxiously guarded by the law, or governed by sterner principles of morality and justice.”

Both the attorney-client privilege and Rule 1.6 ensure client-lawyer confidentiality. The attorney-client privilege protects only confidential communications between a client and attorney made for the purposes of obtaining legal advice. By contrast, the ethical duty of confidentiality under Rule 1.6 is broader than the attorney-client privilege. The duty of confidentiality “applies not only to matters communicated in confidence by the client but also to all information relating to the representation, whatever its source.” In the context of DoNotPay, the duty would hypothetically be implicated whenever the user discloses information to his or her “robot lawyer” in order to obtain the services requested. Under Rule 1.6, the “robot lawyer” would arguably have a duty to refrain from disclosing any information related to the user’s legal matter.

However, no such protection is provided by the services rendered through DoNotPay. As an initial matter, the program’s terms and conditions explicitly state that “communications between [the user] and DoNotPay may not be protected under the attorney-client privilege doctrine.” More broadly, the terms explain that the user’s personal information, once “de-identified,” may be disclosed to consultants, affiliates, or governmental agencies. The policy does not define the term “de-identified” or explain the process through which personal information may be de-identified. DoNotPay also automatically collects information such as the user’s IP address, browser type, operating system characteristics, and the time and duration spent engaging with the program. DoNotPay claims that this information is not considered “personal information,” and may be freely shared with third parties. DoNotPay’s Privacy Policy clarifies that DoNotPay reserves the right to share each user’s information “[t]o the maximum extent permitted by applicable law, in response to (i) subpoenas or other legal processes or if in [DoNotPay’s] good faith opinion such disclosure is required or permitted by law; [and] (ii) at the request of governmental authorities conducting an investigation.” It appears that DoNotPay would not make an effort to quash a subpoena or to oppose a governmental request, whereas “live” counsel would generally seek to protect communications through those measures.

Given these circumstances, DoNotPay users may turn over personal information to the program without understanding the consequences. Because the program offers legal assistance, many users may be lulled into thinking the information they disclose to the program is protected by some level of confidentiality, when in fact the opposite is true. By disclosing this information to a third party – i.e., DoNotPay – the user could potentially waive any privilege argument in future legal proceedings. This could be especially detrimental if the user’s information is legally inculpatory.

Imagine a scenario in which an asylum applicant uses DoNotPay’s AI to apply for refugee status or asylum. He or she would provide the program with personal information – name, address, criminal history, occupation, the reason for seeking asylum – in order to complete the requisite application forms. Without even realizing it, the individual has disclosed sensitive, personal information related to a pending immigration case to a computer program that has no real obligation to keep the information confidential. If the individual is denied asylum and is later detained for entering the country illegally, could immigration officials access the information provided to DoNotPay? Could that information be used to send the individual back to the country from which he or she sought refuge or to locate his or her family members?

While perhaps a dramatic example, this line of thought demonstrates the complex ethical issues implicated by the use of a “robot lawyer.” If the program has no enforceable ethical duty to keep user information confidential, the involuntary disclosure of sensitive information becomes a real possibility. Given the gravity of the potential consequences, users should be aware that the information they share with the program is by no means protected from disclosure, regardless of the legal nature of the services being provided.

Conclusion

Society has accepted and even embraced a certain degree of automation in everyday life. From voice assistants like Siri and Alexa to banking applications that permit mobile check deposits, we welcome this technology because it makes life easier and more efficient. Yet, before fully embracing AI-based legal services, the ethical implications need to be critically evaluated. Can a robot lawyer truly protect the interests of its client if it isn’t required to abide by rules of professional conduct or broader ethical obligations? Lawyers, clients, and courts will have to grapple with this question as legal services powered by artificial intelligence become more widely utilized across the country. While DoNotPay makes legal services more affordable and accessible, these benefits must be balanced against the ethical issues implicated by the technology’s use.

About the Author

Abbey Block is an associate at Ifrah Law, a law firm in Washington, D.C., focusing her practice on federal litigation and government investigations.
