Then instead of making the documentary about the horrors, it would just generate a couple of lame pictures
I gotta be a bit of an asshole and point out that the Nuremberg trials were very long and extremely thorough about detail and accuracy. The idea was that if the trials just brushed over what happened, Nazis would have a reason to rise again, because the German population would have deemed them unfair. Because of this, I'm sure the Allies would have gone over the documents in excruciating detail, and even if they'd had an LLM they would never have entrusted it with such an important and brittle task.
It's definitely interesting to think about how LLMs get used in law practice (if legal in your country), but the historical context kills that train of thought about the Nuremberg trials for me.
There were still plenty of Nazis left in Germany and many war criminals never ended up on trial. They just continued living normal lives.
LLMs are only about as useful in law practice as an uber-caffeinated non-lawyer. The things simply don't know the law, and any argument they produce should be checked with the same thoroughness a lawyer would give the work of an associate who was revealed to have super early-onset dementia.
If you're in the situation of going to court and are thinking of trusting the advice you get from an LLM, don't. You'd be better off appearing "pro se" with advice from the sovereign citizen movement.
(I mean, so long as you don't try that admiralty court / all-caps pseudo-corp bullshit. The law is written in a keyword-less syntax with a case-insensitive parser.)
To be fair, I really do think they can take on the painful task of outlining something in long-winded legalese if given the details and laws, but ofc everything needs to be double-checked and the LLM needs to be constrained to the exact paragraphs and details you hand it.
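Roughly what I mean by "constrained", as a hedged sketch using the OpenAI Python SDK (the model name, file name, and clause details are placeholders, not anything from a real case):

```python
# Sketch: draft legalese only from statute text the user supplies, never from
# the model's own memory. Everything here still needs review by a real lawyer.
from openai import OpenAI

client = OpenAI()

statute_text = open("statute_section_12.txt").read()  # exact paragraphs, pasted verbatim
clause_details = "Tenant must be given 30 days' written notice before any rent increase."

response = client.chat.completions.create(
    model="gpt-4o",
    temperature=0,  # keep the wording as deterministic as possible
    messages=[
        {"role": "system",
         "content": ("Draft formal contract language. Rely ONLY on the statute text "
                     "provided by the user; if something is not covered there, say so "
                     "instead of inventing a provision.")},
        {"role": "user",
         "content": f"Statute text:\n{statute_text}\n\nDraft a clause covering:\n{clause_details}"},
    ],
)

print(response.choices[0].message.content)
```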
I like AI when it's used right and I think ppl underestimate its ability to eliminate tedious tasks, but ofc I understand why AI is hated right now. I really hope we can get to a world where court transcriptionists don't end up with carpal tunnel syndrome because they only have to supervise an AI instead of typing every word out, or where bureaucracy gets eliminated for the average overworked citizen.
But yeah tech bros fucked that up for all of us for a while.
Out of the box, LLMs are trash, but you can use their APIs to great effect on large amounts of data to find needles in haystacks. Complex needles, not just text strings that any sufficiently sophisticated search could match.
You’re right that anything it finds needs to be gone over by a human, but it HUGELY reduces problem sizes when used intelligently.
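A minimal sketch of that needle-in-a-haystack pattern, again assuming the OpenAI SDK; the model name, the question, and load_documents() are placeholders for whatever corpus and criterion you actually have:

```python
# Run every document through a cheap model with a narrow yes/no question and
# keep only the flagged ones for a human to read. The LLM narrows the pile;
# a person still makes the call on everything it surfaces.
from openai import OpenAI

client = OpenAI()
QUESTION = "Does this document discuss transferring assets to an offshore entity?"

def is_relevant(text: str) -> bool:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        temperature=0,
        messages=[
            {"role": "system", "content": "Answer strictly YES or NO. Do not explain."},
            {"role": "user", "content": f"{QUESTION}\n\n---\n{text[:8000]}"},
        ],
    )
    return resp.choices[0].message.content.strip().upper().startswith("YES")

# load_documents() is assumed to yield (doc_id, text) pairs from wherever the corpus lives.
flagged = [doc_id for doc_id, text in load_documents() if is_relevant(text)]
print(f"{len(flagged)} documents flagged for human review")
```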
I'm not sure why you said you were being an asshole; that's exactly my thought.