Legal research doesn’t have to suck

Published by:
David Johnson

Reviewed by:
Alistair Vigier
Last Modified: 2025-06-04
I’m a legal technology founder, and I want to share how AI software like our startup’s platform, Caseway, is transforming how people access court decisions and do legal research. This is my honest take from the trenches, with all the excitement, surprises, and challenges we’ve faced along the way.
As someone who went to law school, I remember spending countless hours sifting through case law. Picture sitting in a law library or wrestling with clunky databases at 2 AM, trying to find that one precedent on point. It’s a grind.
There are over 2.5 million Canadian court judgments sitting on public databases. No human can read all of that; it’s simply impossible. Even top lawyers often don’t fully read every case they cite. Who has time to digest a 100-page judgment for one footnote? It’s easy to miss something critical. Frankly, legal research has long been a tedious, risk-prone slog.
That’s the pain point that got me interested in artificial intelligence. What if an AI could parse all those millions of court decisions on demand and surface the relevant ones in seconds? Turns out it can. I helped build an artificial intelligence legal research company called Caseway.
An AI that goes through 100 million+ court decisions
The easiest way to think of Caseway is as “an AI that goes through 100 million+ court decisions” looking for answers. Instead of searching manually, you ask it a question in plain English (or legalese) and it retrieves the most relevant cases, even summarizing the pertinent parts. The first time I tried it, it felt almost like cheating, in a good way.

Let me give a practical example. Someone recently got a pesky parking ticket.
Typically, fighting it would mean digging up past cases on similar tickets, a needle-in-a-haystack task most people would simply skip. Instead, they asked our AI for precedents on that type of parking bylaw. Within seconds, it gave them a couple of on-point cases (complete with exact quotes) that they could reference.
They used those cases to draft their argument and beat the ticket in court. It was a small victory, but a glimpse of how even the “little guy vs. the system” dynamic can shift when you have instant legal knowledge on tap.
How Legal AI Like Caseway Works (No Hallucinating Fake Cases!)
Early on, we knew that trust would be the biggest issue. Lawyers and judges aren’t going to accept some black-box artificial intelligence spitting out answers without proof. We’ve all heard about the fiasco where a lawyer asked ChatGPT for cases and it invented citations that didn’t exist.
That actually happened. A lawyer in New York filed a brief with entirely fictional cases courtesy of ChatGPT and was sanctioned for it, and a similar incident in Vancouver, Canada prompted an investigation. Those headlines made many lawyers understandably wary of artificial intelligence. Nobody wants to be the next cautionary tale of “artificial intelligence hallucinations” in the courtroom.
So we took a different approach. We trained Caseway exclusively on official court decisions. If a judge didn’t write it, our AI doesn’t know it. It won’t speculate or pull info from random websites or social media. This dramatically reduces the garbage-in, garbage-out problem. The model focuses on the actual text of millions of judgments. It doesn’t “know” about gossip from Reddit or some blog’s opinion. It sticks to the legal corpus.
Showing Sources Is So Important With Legal AI
The artificial intelligence always shows its sources. Every answer it gives comes with citations to the actual cases (with page references and links). If it says “In Smith v. Jones (2015) the court ruled X”, it will footnote exactly where that came from so you can verify. In other words, it’s an open-book exam approach. This transparency is huge for building trust.
Lawyers and legal professionals can’t just take an AI’s word for it, but if the AI points me to Smith v. Jones, page 5, I can pull up the case myself and double-check. In practice, I find the AI’s summary is usually spot-on, but I love that it forces itself to show its work. It’s like having a super-smart senior paralegal who never gets tired and always provides case references for every answer.
That solves the hallucination problem: if an answer has no source, the system simply tells you it found nothing rather than making something up. This kind of built-in sourcing dramatically increases trust compared to using a generic chatbot.
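To make that “open-book exam” idea concrete, here is a minimal sketch of the retrieve-then-cite-or-refuse pattern described above. Caseway’s actual pipeline isn’t public, so the toy corpus, function names, and crude keyword matching below are illustrative assumptions, not our production code; a real system would run proper search infrastructure over the full body of judgments.

```python
# Illustrative sketch only: shows the "answer only from retrieved sources,
# otherwise refuse" pattern, using a toy in-memory corpus of judgment excerpts.
from dataclasses import dataclass

@dataclass
class Passage:
    case_name: str   # e.g. "Smith v. Jones, 2015"
    page: int        # page reference used in the citation
    text: str        # excerpt from the official judgment

# Hypothetical corpus: in a real system this is an index over millions of judgments.
CORPUS = [
    Passage("Smith v. Jones, 2015", 5,
            "The court held that the municipal parking bylaw was unenforceable "
            "where signage was obscured."),
    Passage("R. v. Example, 2019", 12,
            "Social media posts were admitted as evidence of the defendant's intent."),
]

def retrieve(question: str, corpus: list[Passage], min_overlap: int = 2) -> list[Passage]:
    """Return passages sharing at least `min_overlap` words with the question."""
    q_words = {w.lower().strip("?.,") for w in question.split()}
    hits = []
    for p in corpus:
        overlap = len(q_words & {w.lower().strip(".,") for w in p.text.split()})
        if overlap >= min_overlap:
            hits.append(p)
    return hits

def answer(question: str) -> str:
    """Answer only from retrieved court decisions; refuse if nothing is found."""
    hits = retrieve(question, CORPUS)
    if not hits:
        # The refusal path: no source means no answer, never an invented citation.
        return "No supporting case law found for this question."
    lines = [f"- {p.text} [{p.case_name}, p. {p.page}]" for p in hits]
    return "Relevant authorities:\n" + "\n".join(lines)

print(answer("Can a parking bylaw be enforced if the signage was obscured?"))
print(answer("What does the statute say about corporate tax rates?"))  # -> refusal
```

The point of the sketch is the shape of the behaviour, not the matching logic: every line of the answer carries a verifiable citation, and an empty retrieval produces a refusal instead of a guess.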
One of the most incredible moments was stress-testing the system with obscure questions. We’d ask things like, “Find cases where a judge mentions Facebook posts affecting a defamation claim.” In the past, you’d try a bunch of keyword combinations on CanLII and likely never find that.
Our AI actually pulled up a few relevant hits immediately, complete with quotes from the judgments. It felt almost like magic. It encourages you to ask more nuanced, creative questions because the barrier (time and effort) is so low now.
Saving Time, Money, and Levelling the Field
For lawyers, the value proposition is pretty straightforward: time saved is money saved (or earned). Caseway can find relevant case law “in less than a minute, saving lawyers countless hours and reducing the risk of costly courtroom mistakes.” If you can shave a five-hour research task down to 30 minutes, you can either bill that time elsewhere or handle more cases.
Lawyers using our artificial intelligence have told me it’s like having an assistant who pre-reads the entire law library and hands you just the essential bits. One lawyer said it helped him catch a crucial case he would otherwise have overlooked, which ultimately won a motion that might have been lost. Better research = better odds of winning, which means happier clients and more referrals for the firm.
Improving access to justice
It’s not just about lawyer productivity, though. A big part of my mission (and background) is improving access to justice. Believe it or not, in Canada and the United States, a considerable portion of people end up self-representing in court (often because they can’t afford a lawyer).
We’re talking roughly 70% of litigants in some areas going it alone. That’s scary, because the legal system is a maze and going in solo usually doesn’t end well. These individuals often aren’t doing it by choice; they simply don’t have $300–400 per hour for legal research assistance. I saw this firsthand running a legal referral business: so many people would have hired a lawyer if only it were affordable.

Artificial intelligence can help make legal knowledge accessible to all. For simple matters like that parking ticket, or a tenant trying to figure out their rights, an AI legal assistant can be a game-changer. It won’t turn a layperson into a lawyer.
Still, it can equip them with relevant case law and clear, plain-language explanations, enabling them to make a more informed argument or decision. Our goal with Caseway is to make it useful not just for lawyers, but also for non-lawyers who need legal information. We tout it as the first legal research AI designed for everyone, not just the suits.
Low-income individuals facing a legal issue
We priced it at approximately $49 per month for basic access, which is a fraction of traditional legal fees. We’re exploring a near-free or subsidized tier for individuals who qualify for legal aid.
I’m passionate about this part… If you’re a low-income individual facing a legal issue, you should be able to get reliable answers without taking out a second mortgage. We haven’t fully figured out the economics (servers and AI processing do cost money). Still, we’re looking to partner with legal aid organizations or possibly libraries to offer the software at cost or for free to those who need it most.
Let me temper this with reality… artificial intelligence is not a substitute for a lawyer. It doesn’t give tailored legal advice, and it can’t represent you in court. But it can save you 95% on your legal fees.
I always advise users (especially non-lawyers) to use it as a research and educational software, rather than a replacement for professional counsel in serious matters. For example, someone fighting a complex child custody battle can use Caseway to gather cases and understand legal tests, which helps them be informed.
But that’s not a substitute for having an honest lawyer in a high-stakes trial, at least not with today’s AI and rules. We must be cautious not to overpromise. The public trust will only be established if we’re upfront about what the AI can and cannot do. It can empower people with knowledge; it doesn’t eliminate the need for legal expertise in complex judgment calls.
Old-School Lawyers, Bias, and the Limits of Legal Research AI
Not everyone in the legal field is jumping on the AI bandwagon, and I get it. Law, by nature, is a conservative profession; many attorneys pride themselves on upholding precedent and tradition. When we first started pitching our AI, we got reactions ranging from “Wow, this could save me so much time!” to “Absolutely not, I don’t trust AI and I don’t even want a website for my firm.”
Yes, there are lawyers out there who told us, “No artificial intelligence. We don’t even have a website; we get clients by referral only.” That mindset is thankfully fading as the older generation retires, but it’s still a factor.
Change is hard, and the legal industry has historically been very slow to adopt new tech. Fun fact: many courts only stopped using fax machines recently. So there’s a cultural hurdle. We realized we can’t just drop this tech and expect instant acceptance; we have to educate, build trust, and sometimes even market it in a way that doesn’t trigger fear. One phrase you hear a lot is: “AI won’t replace lawyers, but lawyers who use AI will replace those who don’t.”
Bias and accuracy
Another challenge is bias and accuracy. AI systems are only as good as their training data. If all the past cases have a particular bias (consciously or unconsciously), the AI might reflect that. For instance, if historically courts ruled one way on a topic due to bias, the AI could conceivably steer you toward that same pattern.
We must be aware of artificial intelligence bias, which refers to systematic errors in the AI’s output resulting from biased training data or algorithms. We mitigate this by sticking to official sources (which are generally vetted by the legal process itself), and we’re working on features that can flag anomalies.
However, it’s not a solved problem across the industry. There’s also the issue that an AI trained on case law might be missing context, e.g. it doesn’t inherently “know” what a statute says unless that statute has been quoted in a case.
Our first version was case-law focused
We might incorporate legislation and regulations in the future, but our first version was case-law focused. Users need to understand these limits: if you ask a question that no case directly addresses, the AI might come up empty or provide a very general answer. It’s only as good as the body of law it has to work with.
Regulation is also a lurking challenge. The legal profession has strict rules about who can provide legal advice. We’ve seen pushback when tech tries to cross that line. The startup DoNotPay (which offers AI-driven legal assistance for consumers) attempted to have an AI “robot lawyer” actually argue in traffic court, and state bar associations threatened to prosecute the founder and even suggested jail time, basically scaring them into cancelling that plan.
That incident was a wake-up call. You can innovate, but you can’t just flout unauthorized practice of law regulations. For us, this means we position Caseway carefully: it’s a research platform, not a “lawyer”.
We also focus on jurisdictions like Canada (and specific areas) where we have a strong understanding of the landscape. Expanding to, say, U.S. federal law or other countries means dealing with different data sources, copyright issues (some law databases are proprietary), and varying rules.
Not to mention, the big legal publishers (you know who they are) are protective of their turf. One early legal artificial intelligence startup, ROSS Intelligence, got sued out of existence by a major publisher for using its data to train an AI. So we’ve had to navigate data licensing carefully and lean on public-domain sources where possible.
Changing Attitudes and the Road Ahead
Despite the challenges, I’m optimistic. Attitudes in the legal field are shifting rapidly. It’s actually wild how much has changed in just the past couple of years. Not long ago, investors wouldn’t touch legal technology. It was seen as a niche backwater market. One founder joked that trying to get VC funding for legal tech in 2018 was like “pushing a boulder uphill.”
Suddenly, legal technology is hot. In the AI space, we’ve had big exits that caught investors’ eyes. For example, in 2023, Thomson Reuters acquired an AI legal research startup (Casetext) for a whopping $650 million. That kind of headline travels. Now investors are reaching out to legal AI startups instead of the other way around.
I actually experienced this firsthand: when we launched Caseway, we drew interest not just from legal industry professionals but also from mainstream technology investors who, a few years ago, wouldn’t have known what CanLII or Westlaw were.
Using artificial intelligence for legal research
Even clients expect their lawyers to use modern software to be efficient. As that expectation grows, firms that stick to the old ways will feel the pressure. I foresee a time (sooner rather than later) when clients will ask lawyers, “Are you using artificial intelligence for research? Why am I paying you for five hours of work an AI could do in one?”
I’ve started gently suggesting to friends and family that when they hire a lawyer, they should ask if the law firm uses software like ours. If not, they might be paying a “Luddite tax” for someone else’s inefficiency. That might sound harsh, but as AI becomes more standard, savvy clients won’t want to foot the bill for inefficient practices.
However, let’s not get carried away with the “robot lawyers taking over” narrative. A nuanced view from industry professionals, such as Daniel Lewis (CEO of LegalOn and co-founder of an earlier legal AI startup), suggests that AI will not replace the traditional lawyer business model overnight, but rather change where specific work is done.
Daniel recently noted that we’re likely to see in-house legal teams (corporate counsel) lead the way in AI adoption, using these platforms to handle more work internally instead of outsourcing everything.
From Tedious Hours to Instant Answers
And he predicts something interesting: contrary to the hype, the billable hour in law firms isn’t dead yet. Big firms will still bill hours, but AI will quietly shift some of the workload away from firms and toward those efficient in-house teams.
In other words, lawyers who leverage AI can deliver results faster, and clients (especially businesses) will adjust by keeping more matters in-house or expecting alternative fee arrangements. I agree.
In my own dealings, I’ve seen some forward-thinking law firms already adopt fixed-fee or subscription models for certain services, primarily because artificial intelligence enables them to complete the work in a fraction of the time, making billing by the hour less sensible. The whole economics of legal services could shift toward value delivered rather than hours spent, and AI is a catalyst for that shift.
Software like Caseway is helping lawyers with legal research
The bottom line from my perspective is that AI is here to stay in the legal field, but it’s not a light-switch disruption; it’s a gradual transformation that’s now quickly picking up pace. Today, software like Caseway is helping lawyers and ordinary people obtain answers more rapidly and affordably.
Tomorrow, who knows… Maybe AI will draft entire judgments or help predict case outcomes (some projects are working on that). There’s even talk about AI handling contract negotiations or due diligence at blazing speed.
I’ll wrap up by saying this: building a legal artificial intelligence startup has been one of the most challenging yet rewarding adventures of my life. On one hand, I feel like a mad scientist who unleashed something new into a very old-school domain. On the other hand, I’m still a lawyer at heart, and I genuinely believe in the mission of making law more accessible and efficient.
If we can save a young lawyer from burnout by cutting their all-nighter hours, or help a single mom win her case because she could afford some legal research via AI, that’s worth all the headaches and debugging sessions in the world.
It’s an exciting time to be in the legal tech industry. We’re seeing real cases of it (pun intended). And as more people get comfortable with AI helpers, I think we’ll wonder how we ever lived without them.
Just as we can’t imagine practicing law without online research now, using a legal artificial intelligence assistant might be as routine as using email in a few years. We’re not replacing lawyers; we’re giving them (and everyone) a superpower… The ability to instantly tap into the collective wisdom of decades of court decisions. And that, in my humble opinion, is a game-changer for justice.
-Al Vigier