When I was young and starting out with computers, programming, BBSes, and later the early internet, technology was something that expanded my mind, helped me research, learn new skills, meet people, and have interesting conversations. It was decentralized, and it put power into the hands of the little guy, who could start his own business venture with his PC or expand his skillset.

Where we are now with AI, the opposite seems to be happening. We are asking AI to do things for us rather than learning how to do things ourselves. We are losing our research skills. Many people are talking to AIs about their problems instead of to other people. And AI will take away our jobs and centralize all power into a handful of billionaire sociopaths with robot armies to carry out whatever nefarious deeds they want.

I hope we somehow make it through this part of history with some semblance of freedom and autonomy intact, but I’m having a hard time seeing how.

  • solomonschuler@lemmy.zip · 1 day ago

    As far as I’m concerned, the generative AI we see in chatbots has no goal associated with it: it just exists for no purpose at all. In contrast, Google Translate and other translation apps (which, BTW, still use machine-learning algorithms) have a far more practical use: translating other languages in real time. I don’t care what companies call it (a tool or not); at the moment it’s a big fucking turd that AI companies are trying to force-feed down our fucking throats.

    You also see this tech slop historically in the evolution of search engines. Before recommendation algorithms came along, a search engine was basically a database where users had to thoughtfully word their queries to get good results. Then came the recommendation algorithm, and I can only imagine no one, literally no one, cared about it, since we could already do the things it offered to solve. Still, it was pushed, and sooner or later integrated into most popular search engines. Now you see the same thing happening with generative AI…

    The purpose of generative AI, much like that of the recommendation algorithm, is to solve nothing; hence the analogy “it’s just a big fucking turd” that I’m trying to get across here: we could already do the things it offered to solve. If you can see the pattern, it’s just a downward-spiraling effect. It appeals to anti-intellectuals (which is most of the US at this point), and Google and other major companies are making record profits by selling user data to brokers: it’s a win for both parties.

    • TubularTittyFrog@lemmy.world · 1 day ago

      it creates perceived shareholder value in an emerging market. that is its purpose.

      its utility is not for the end user. it’s something for shareholders to invest in, and for companies to push in an attempt to generate shareholder interest. It’s there to raise the stock price.

      And like all speculative assets… nobody will care about the returns on it, until they do. And once those returns don’t materialize… poof goes the market.

      Just like they did with all the speculative investment bubbles based on insane theories.

    • realitista@lemmus.orgOP · 1 day ago

      This is how I felt about it a year ago. But it has gotten so much better since then. It automates a lot of time-consuming tasks for me now. I mean, I’ve probably only saved 100 hours using it this year, but that number is going up rapidly, and it’s 100 more than it saved me last year.