A Human Solution to a Human Problem: Flaws of AI in Family Law

March 1, 2023
Michaela Madden


From ChatGPT to automation apps, talk surrounding the use of Artificial Intelligence (AI) is taking today's news by storm. You may think AI belongs only in certain industries, such as marketing or tech, but it's slowly but surely making its way into law as well.


The Economic Times recently released an opinion piece titled, "Pardon the AI, Your Honour: The era of a robot lawyer isn't too far away". The article details reactions to the launch of DoNotPay's robot lawyer. The AI-powered legal services app claims to help you "fight corporations, beat bureaucracy, and sue anyone at the press of a button". It made headlines in January when it was scheduled to be used in court. However, while AI has the power to increase efficiency and take over time-consuming tasks, it can't replace the human touch and skill that clients look for in lawyers.

Today’s post will discuss the flaws of relying on AI in family law and the importance of hiring humans – not robots – as lawyers. 


AI Lacks Emotion, Empathy, and Experience

The primary benefit of hiring a human lawyer is their brain, including their emotional intelligence. Yes, case data and research are a huge part of a lawyer's job. However, the other half of the gig is persuading a judge or jury. And while AI can state the facts, it will be hard pressed to present a compelling, convincing argument. Additionally, lawyers build these arguments based on what they've learned from prior cases and experience. Therefore, they know exactly what works and what doesn't from a psychological standpoint.


Lastly, each practice area of law comes with different client needs. Specifically, you may find that family law clients need additional support outside of the courtroom as they navigate their divorce and the legal process that comes with it. AI simply can't provide that kind of empathy or support.

AI Isn’t Totally Trustworthy

One of the key capabilities of AI is automating tasks based on already available data. But what happens when an extremely unique case comes across your desk and there's no readily available data to use? Based on its algorithm, AI will likely pull data from a semi-similar case instead, and there's no guarantee of how relevant that data will be.


This is where completely relying on "robot lawyers" becomes dangerous. Let's say we're talking about going to court for a traffic violation. There's tons of traffic court case data for AI to pull from. As a result, a robot lawyer would probably have enough data to build an accurate argument. But when it comes to more specialized practices, such as family law, each and every case comes with its own unique circumstances. Because each case is different, there are no prior cases AI can reliably pull data from. This means it will be extremely hard to fully trust the argument AI produces.


So should I use AI for anything law-related?

AI can be helpful for automating tasks you don't have time for or enjoy doing, and it can even help with case research or drafting arguments. However, it should never be fully relied on. AI and robot lawyers act more like assistants than anything else, and should be seen as tools to improve efficiency in your practice rather than a replacement. Finishing touches, reviews, and final arguments should always be triple-checked and finalized by a real human lawyer. For a human solution to a human problem, check out Shulman & Partners for the support you need.