• argueswithidiots@lemmy.world
    6 hours ago

    I’ve so far avoided using LLMs unless I need something explicitly explained. I don’t know enough to be able to verify any code it would produce, so I don’t know what the hell these vibe coders are doing.

    But I’m also a little long in the tooth to be starting this, so maybe that’s part of my problem.

      • argueswithidiots@lemmy.world
        3 hours ago

        I am aware of what I don’t know, and using something that hallucinates to produce something I actually want to rely on seems silly. I wouldn’t be able to verify its output independently because I don’t know enough. It just seems like asking for trouble. Like I’m supposed to ask the thing that gave me broken code in the first place to fix it? Smort.