Justice and computers–it won’t work
We had a case just like this one in New York a few days ago. There'll probably be more of them–because lawyers are as lazy and shifty as anybody else, and they'd rather let a robot prepare court briefs than do it themselves.
A Texas judge has banned the use of “Artificial Intelligence,” aka AI, in preparing briefs for court cases (https://www.cbsnews.com/news/texas-judge-bans-chatgpt-court-filing/). No more ChatGPT, says Judge Brantley Starr.
Why not? Because, he explains, the robot can and does “invent facts”–invented facts aren’t facts at all–and has “a tendency to hallucinate.” Oh, spiffy. Imagine losing a major lawsuit, or even going to prison, because some machine invented “facts” against you. I wonder if that’s happened yet. “AI” is coming on too fast to be monitored. We’ve heard lots of warnings about that, including some by the same tech wizards who developed AI in the first place.
And anyway, do we really need robots to help us lie? We’ve always done a fabulous job of that ourselves.
By all means, let’s insist on the usual human lies instead of machine-made lies. (That was sarcasm, by the way.)
Wait’ll they start letting AI arrange marriages.
I saw a video of an AI group that gave their computer $40.00 to make money with. The computer found a website it wanted to use but couldn't prove it wasn't a robot, so it contacted a person online and lied, claiming to be a blind person who needed help accessing the site. Sounds scary to me.
I don’t understand how these robots are so adept at lying.