Do lawyers need the truth?
AI can’t handle the truth when it comes to the law
Almost one in five lawyers is using AI, according to an American Bar Association survey.
But there is a growing number of legal horror stories involving tools like ChatGPT, because chatbots have a tendency to make stuff up: for example, legal precedents from cases that never happened.
Marketplace’s Meghan McCarty Carino spoke with Daniel Ho at Stanford’s Institute for Human-Centered Artificial Intelligence about the group’s recent study on how frequently three of the most popular language models, from OpenAI, Meta and Google, hallucinate when asked to weigh in on or assist with legal cases.
The following is an edited transcript of their conversation.
Lots of questions. What rules do we need? It’s sort of like, “Hey, that guy looks like Vladimir Putin but with a mustache!”
Police increasingly using Colorado DMV facial recognition program
… In a recent case from October, federal records show ATF investigators requested an image scan at the Colorado DMV after security cameras captured a man walking out of a Denver gun shop with guns and ammo.
The face scan pointed to two men, including a 20-year-old named Brayan Enriquez, according to a federal criminal complaint.
… Investigators also noted in the criminal complaint that the face scan of the second suspect led to a man who had nothing to do with the crime.
… DMV records also show that a “possible match” has come up only about one-third of the time over the past four years.