I think there’s an important nuance to lmgtfy or RTFM. Those two were clearly identifiable as the (sometimes snarky) minimum-effort response, and sometimes absolutely justified, e.g. if I googled OP’s question myself and checked that the very first result actually answers it.
With slop responses, however, the receiver sometimes has to invest considerable time in reading and processing them just to realize they might be pure slop. And when in doubt, as readers we are left with the moral dilemma of potentially offending the writer by asking “Did you just send me LLM output?”
It is both harder to identify and it drives a wedge into online (and personal) relationships, because it adds a layer of doubt and distrust. This slop shit is poison for internet friendships. Those tech bros all need to fuck off and spend their money on a permanent coke trip until they become irrelevant. :/
Oh yeah, I was thinking of people who link to LLM output, like this: https://chatgpt.com/share/697e8957-9494-8010-beb9-eb90c4760518
Copy-pasting LLM summaries is definitely worse.
It’s really not that hard to identify the soulless drivel output by an LLM in an email. Nobody writes like that, not even a passive-aggressive middle-management psychopath.
Becomes harder in shorter messages, though.