• 🇵🇸antifa_ceo@lemmy.ml · 15 hours ago

    AI is fine when it’s your hands, not when it’s your brain. As long as you can vet it, it can be useful for coding. It’s the moment we give agency away to the AI that many see issues with.

  • dfyx@lemmy.helios42.de · 16 hours ago

    Quick, let’s all abandon Linux (edit: and git) because the main developer did something we don’t like! /s

  • Angel Mountain@feddit.nl · 17 hours ago

    The problem is that AI is not useless. It has a lot of other issues, but never being a helpful tool isn’t one of them.

    • prettybunnys@piefed.social · 14 hours ago

      With constrained data sets it’s actually really useful.

      Parsing text and logs and correlating events: super useful.

      When you dump all human “intelligence” into it, you discover how dumb we are collectively.
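
      A rough sketch of what that kind of constrained task can look like, assuming the OpenAI Python SDK (>= 1.0) with an API key in the environment; the log file name and model name are placeholders, not anything from this thread:

      from openai import OpenAI

      client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

      # Keep the data set constrained: only the last 200 lines of one log file.
      # "app.log" and the model name below are placeholders.
      with open("app.log") as f:
          log_lines = f.readlines()[-200:]

      prompt = (
          "Group these log lines into correlated events and summarize each "
          "group with its time range and likely cause:\n\n" + "".join(log_lines)
      )

      response = client.chat.completions.create(
          model="gpt-4o-mini",
          messages=[{"role": "user", "content": prompt}],
      )
      print(response.choices[0].message.content)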

  • palordrolap@fedia.io · 15 hours ago

    If:
    1) you’re smart or practised enough to be able to generate what you’re asking the AI to do for yourself,
    2) you’re able to take what the AI generates and debug, check and correct it using non-AI tools like your own brain,
    3) you’re sure this whole AI-inclusive process will save time and money, and
    4) you’re sure using AI as a crutch won’t cause you brain-rot in the long term,
    then go nuts.

    Caveat: Those last two are tricky traps. You can be sure and wrong.

    Otherwise, grab the documentation or a bunch of examples and start hacking and crafting. Leave the AI alone. Maybe ask it a question about something that isn’t clear, but on no account trust it. It might have developed the same confusion that you have for precisely the same reasons.

    So anyway, Linus clearly fits 1 and 2, and believes 3 and 4, or else he wouldn’t be using an AI. Let’s just hope he hasn’t fallen into the traps.

  • bleistift2@sopuli.xyz · 17 hours ago

    From the project’s README:

    Also note that the python visualizer tool has been basically written by vibe-coding. I know more about analog filters – and that’s not saying much – than I do about python. It started out as my typical “google and do the monkey-see-monkey-do” kind of programming, but then I cut out the middle-man – me – and just used Google Antigravity to do the audio sample visualizer.

    This is the commit: https://github.com/torvalds/AudioNoise/commit/93a72563cba609a414297b558cb46ddd3ce9d6b5
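
    For a sense of what an “audio sample visualizer” involves, here is a minimal sketch in the same spirit (not the actual tool from the commit above; it assumes numpy, scipy, and matplotlib are installed, and “sample.wav” is a placeholder path):

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.io import wavfile

    # Load a PCM WAV file; "sample.wav" is a placeholder path.
    rate, samples = wavfile.read("sample.wav")
    if samples.ndim > 1:
        samples = samples.mean(axis=1)      # mix stereo down to mono
    t = np.arange(len(samples)) / rate      # time axis in seconds

    # Waveform on top, spectrogram below.
    fig, (ax_wave, ax_spec) = plt.subplots(2, 1, figsize=(10, 6))
    ax_wave.plot(t, samples, linewidth=0.5)
    ax_wave.set(title="Waveform", xlabel="Time [s]", ylabel="Amplitude")
    ax_spec.specgram(samples, Fs=rate, NFFT=1024, noverlap=512)
    ax_spec.set(title="Spectrogram", xlabel="Time [s]", ylabel="Frequency [Hz]")
    fig.tight_layout()
    plt.show()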

    • boonhet@sopuli.xyz · 16 hours ago

      Tbf, it’s his project, so he can do whatever he wants.

      The issue is when people do things like that one dude who had Claude implement support for DWARF in… whatever language it was (something ML-y, I think?) and literally didn’t even remove the copyright attribution to some random third person that Claude added. It was a PR of several thousand lines, all AI-generated, and he had no idea how it worked, but said it’s OK, Claude understands it. He didn’t even tell anyone he was going to be working on it, so there was no discussion of the architecture beforehand.

      Edit: OCaml. So I was right that it was something ML-y, lol.

  • FaceDeer@fedia.io · 11 hours ago

    I’ve long found it funny how some people claim that generative AI produces terrible slop, and simultaneously that it’s a huge threat to their jobs.

    • Dekkia@this.doesnotcut.it · 11 hours ago

      The people who make firing decisions often aren’t the ones doing the day-to-day work.

      It’s very possible to be replaced by a machine that does a worse job, as long as your (ex-) boss isn’t aware of it.