Because if I spent my whole day reviewing AI-generated PRs and walking through the codebase with them only for the next PR to be AI-generated unreviewed shit again, I’d never get my job done.
I’d love to help people learn, but nobody will use anything they learn because they’re just going to ask an LLM to do their task for them anyway.
This is a people problem, and primarily one at a high level: the incentive is to churn out slop rather than do things right, so that's what people do.
Who creates these AI-generated PRs? Colleagues at your company, or third parties contributing to an open source project? If this happened to me and the PRs were coming from colleagues, I would escalate it to my line manager, and I'd expect her to understand why this is a problem and why it has to stop.
Colleagues, and the issue is top-down. I’ve raised it as an issue already. My manager can’t do anything about it.