• bss03@infosec.pub · ↑3 · 35 minutes ago (edited)

    The whole industry needs to be rebuilt from the foundations. GRTT with a grading ring that tightly controls resources (including, but not limited to, RAM) as the fundamental calculus, instead of whatever JS happens to stick to the Chrome codebase and machine code spewed by your favorite C compiler.

  • Valmond@lemmy.dbzer0.com · ↑7 · 2 hours ago

    Had to install (an old one mind you, 2019) Visual Studio on Windows…

    First it’s like 30GB, what the hell?? It’s an advanced text editor with a compiler and some …

    Crashed a little less than what I remember 🥴😁

  • bampop@lemmy.world · ↑69 ↓1 · 4 hours ago

    My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.

  • GenderNeutralBro@lemmy.sdf.org · ↑53 ↓2 · 4 hours ago (edited)

    Everything bad people said about web apps 20+ years ago has proved true.

    It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.

    It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.

    But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.

    All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.

  • DontRedditMyLemmy@lemmy.world · ↑27 ↓1 · 5 hours ago

    I hate that our expectations have been lowered.

    2016: “oh, that app crashed?? Pick a different one!”

    2026: “oh, that app crashed again? They all crash, just start it again and cross your toes.”

    • wabasso@lemmy.ca · ↑1 · 33 minutes ago

      I’m starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.

  • wulrus@lemmy.world · ↑8 ↓1 · 4 hours ago

    I bought a desktop PC for a little over 2k in late 2011, and still use it. I’m a back-end developer, and certainly I would like to be able to upgrade my 16 GB RAM to 32 GB in an affordable way.

    Other than that, it’s perfectly fine. IDE, a few docker containers, works.

    And modern gaming is a scam anyway. Realistic graphics do not increase fun, they just eat electricity and our money. Retro gaming or not at all.

    Imagine how things would be if they were built to be maintained for 15+ years.

      • wulrus@lemmy.world · ↑1 · 29 minutes ago

        wow, you are right! I didn’t bother to check this whole time of needless suffering, but for what I earn with it in less than an hour I could probably buy 2x8 GB DDR3, lol!

        It just seemed a fair assumption that it would be insanely expensive …

  • kunaltyagi@programming.dev · ↑58 ↓1 · 7 hours ago

    The same? Try worse. Most devices have seen input latency go up, and most applications have higher post-input latency as well.

    Switching from an old system with old UI to a new system sometimes feels like molasses.

    • Korhaka@sopuli.xyz · ↑12 · 5 hours ago

      I work in support for a SaaS product and every single click on the platform takes a noticeable amount of time. I don’t understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it’s far more responsive.

    • Buddahriffic@lemmy.world · ↑12 · 6 hours ago

      Except for KDE. At least compared to cinnamon, I find KDE much more responsive.

      AI-generated code will make things worse. Models are good at providing solutions that generally give the correct output, but the code they generate tends to be shit by final-product standards.

      Though perhaps performance will improve since at least the AI isn’t limited by only knowing JavaScript.

      • boonhet@sopuli.xyz · ↑2 · 4 hours ago

        I still have no idea what it is, but over time my computer, which has KDE on it, gets super slow and I HAVE to restart. Even if I close all applications it’s still slow.

        It’s one reason I’ve been considering upgrading from 6 cores and 32 GB to 16 and 64.

        • dr_robotBones@reddthat.com · ↑1 · 57 minutes ago

          Have you gone through settings and disabled unnecessary effects, indexing and such? With default settings it can get quite slow but with some small changes it becomes very snappy.

          • boonhet@sopuli.xyz · ↑1 · 20 minutes ago

            I have not, but also it’s not slow immediately, it takes time under use to get slow. Fresh boot is quite fast. And then once it’s slow, even if I close my IDE, browsers and everything, it remains slow, even if CPU usage is really low and there’s theoretically plenty of memory that could be freed easily.

        • arendjr@programming.dev · ↑1 · 2 hours ago

          Have you tried disabling the file indexing service? I think it’s called Baloo?

          Usually it doesn’t have too much overhead, but in combination with certain workflows it could be a bottleneck.

        • rumba@lemmy.zip · ↑5 · 4 hours ago

          Upgrade isn’t likely to help. If KDE is struggling on 6@32, you have something going on, and 16@64 is only going to make it last twice as long before choking.

          Wait till it’s slow.

          Check your RAM/CPU in top and the disk in iotop; hammering the disk/CPU (or a bad disk/SSD) can make KDE feel slow.

          plasmashell --replace # this just dumps plasmashell’s widgets/panels

          See if you got a lot of RAM/CPU back or it’s running well; if so, it might be a bad widget or panel.

          if it’s still slow,

          kwin_x11 --replace

          or

          kwin_wayland --replace &

          This dumps everything and refreshes the graphics driver/compositor/window manager

          If that makes it better, you’re likely looking at a graphics driver issue

          I’ve seen some stuff where going to sleep and coming out degrades perf
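
The triage steps above can be bundled into one small script. This is a sketch, not an official KDE tool: it assumes a standard Plasma session with `plasmashell` and KWin on your PATH, and the actual `--replace` commands are left commented out so nothing restarts by accident.

```shell
#!/bin/sh
# Rough KDE slowdown triage, following the steps in the comment above.
# Run from a terminal inside the (allegedly slow) session.

# 1. Memory pressure: MemAvailable is what matters, not MemFree.
awk '/^MemAvailable/ {printf "Available RAM: %.1f GiB\n", $2/1048576}' /proc/meminfo

# 2. Top memory consumers -- anything unexpected near the top is a suspect.
ps -eo rss,comm --sort=-rss | head -n 6

# 3. Restart just the shell (widgets/panels) without logging out:
#   plasmashell --replace &

# 4. If it is still slow, restart the compositor/window manager too:
#   kwin_wayland --replace &    # or: kwin_x11 --replace
```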

          • boonhet@sopuli.xyz · ↑2 · 3 hours ago

            Hmm, I haven’t noticed high CPU usage, but usually it only leaves me around 500MB actually free RAM, basically the entire rest of it is either in use or cache (often about 15 gigs for cache). Turning on the 64 gig swapfile usually still leaves me with close to no free RAM.

            I’ll see if it’s slow already when I get home, I restarted yesterday. Then I’ll try the tricks you suggested. For all I know maybe it’s not even KDE itself.

            Root and home are on separate NVMe drives and there’s a SATA SSD for misc non-system stuff.

            GPU is nvidia 3060ti with latest proprietary drivers.

            The PC does not sleep at all.

            To be fair I also want to upgrade to speed up Rust compilation when working on side projects and because I often have to store 40-50 gigs in tmpfs and would prefer it to be entirely in RAM so it’s faster to both write and read.

            • rumba@lemmy.zip · ↑4 · 3 hours ago

              Don’t let me stop you from upgrading, that’s got loads of upsides. Just suspecting you still have something else to fix before you’ll really get to use it :)

              It CAN be OK to have very low free RAM if it’s used up by buffers/cache (freeable). If buff/cache gets below about 3 GB on most systems, you’ll start to struggle.

              If you have 16 GB, it’s running low, and you can’t account for it in top, you have something leaking somewhere.
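
The free-vs-freeable distinction is easy to check directly. A minimal sketch (Linux-only, parsing /proc/meminfo; field names are standard, the takeaway is that MemAvailable — free plus easily reclaimable cache — is the number that predicts struggling, not MemFree):

```shell
# Summarize free vs freeable memory from /proc/meminfo (values are in KiB).
awk '
  /^MemFree:/      { free  = $2 }
  /^MemAvailable:/ { avail = $2 }
  /^Buffers:/      { buf   = $2 }
  /^Cached:/       { cache = $2 }
  END {
    gib = 1048576  # KiB per GiB
    printf "free: %.1f GiB, available: %.1f GiB, buff/cache: %.1f GiB\n",
           free/gib, avail/gib, (buf+cache)/gib
  }
' /proc/meminfo
```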

              • boonhet@sopuli.xyz · ↑1 · 9 minutes ago

                Lol I sorted top by memory usage and realized I’m using 12 gigs on an LLM I was playing around with to get local code completion in my JetBrains IDE. It didn’t work all that well anyway and I forgot to disable it.

                I did have similar issues before this too, but I imagine blowing 12 gigs on an LLM must’ve exacerbated things. I’m wondering how long I can go now before I’m starting to run out of memory again. Though I was still sitting at 7 gigs buffer/cache and it hadn’t slowed down yet.

  • oyo@lemmy.zip · ↑48 ↓1 · 7 hours ago

    Windows 11 is the slowest Windows I’ve ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open explorer? If you have a slow or intermittent Internet connection it’s literally unusable.

    • da_cow (she/her)@feddit.org · ↑15 · 6 hours ago

      Even Windows 10 is literally unusable for me. Pressing the Windows key takes about 4 seconds for the search to pop up, just for the results to be garbage.

      • UnderpantsWeevil@lemmy.world · ↑11 · 6 hours ago

        Found out about this while watching “Halt and Catch Fire” (AMC’s effort to recreate the magic of Mad Men, but on the computer).

        Doherty Threshold

        In 1982 Walter J. Doherty and Ahrvind J. Thadani published, in the IBM Systems Journal, a research paper that set the requirement for computer response time at 400 milliseconds, not 2,000 (2 seconds), which had been the previous standard. When a human being’s command was executed and returned an answer in under 400 milliseconds, it was deemed to exceed the Doherty threshold, and use of such applications was deemed to be “addicting” to users.

      • WhyJiffie@sh.itjust.works · ↑1 · 4 hours ago

        If it only occurs hours or days after boot, try killing the StartMenuExperienceHost process. That’s what I was doing until I switched to Linux.

        • da_cow (she/her)@feddit.org · ↑2 · 2 hours ago

          I am using Windows like once a week at maximum, and then it only takes about 10 minutes. So I kind of do not really care, and am glad that I do not need to use it more often.

    • Echo Dot@feddit.uk · ↑8 · 7 hours ago

      It takes forever to boot, I know that, and that’s with fast boot, which is extra pathetic.

        • HugeNerd@lemmy.ca · ↑5 · 6 hours ago

          I’ve given up trying to understand modern PC software. I can barely keep up with the little microcontrollers I work with. They aren’t so little.

  • brotato@slrpnk.net · ↑102 ↓3 · 9 hours ago

    The tech debt problem will keep getting worse as product teams keep promising more in less time. Keep making developers move faster. I’m sure nothing bad will come of it.

    Capitalism truly ruins everything good and pure. I used to love writing clean code and now it’s just “prompt this AI to spit out sloppy code that mostly works so you can focus on what really matters… meetings!”

  • Yerbouti@sh.itjust.works · ↑13 ↓1 · 6 hours ago

    I’ll keep saying this: my 2009 i5 750 still feels as fast as my 2-year-old workstations and can play almost everything I want with the 1060.

    • Narauko@lemmy.world · ↑7 · 5 hours ago

      Vista honestly wasn’t as bad as we all said/remember, but it was the start of Windows’ optimization downturn. It worked great on top-of-the-line systems with tons of power, and was the best-looking Windows Microslop ever developed.

      It just happened to also coincide with the start of netbooks and low-power computers going mainstream, and marketing thought that an OS requiring an F1 car should also be sold on a 3-door hatchback with 60 horsepower.

      • rumba@lemmy.zip · ↑4 · 3 hours ago

        Your mileage with Vista was wildly hardware-dependent. Prior to Vista, if you could run one version of Windows, the next version would run just about as well.

        The Indexer and Glass were memory hungry. If you gave it a decent amount of RAM, it could look like a dream. If you turned off Aero on an under-specced machine, it could also run pretty well, but then you didn’t have much of a reason not to just run 98SE.

        The other shoe was drivers. No one was ready for WDDM, and a LOT of the small to mid-sized hardware vendors emergency-released slow, buggy, memory-hungry drivers that just made Vista feel horrible.

        I had some off-the-shelf Compaqs that ran beautifully; my dual P3/SCSI workstation with tons of RAM ran like hot garbage.

  • ZILtoid1991@lemmy.world · ↑20 · 7 hours ago

    They often are worse, because everything needed to be an electron app, so they could hire the cheaper web developers for it, and also can boast about “instant cross platform support” even if they don’t release Linux versions.

    Qt and GTK could do cross-platform support, but not the data collection for big-data purposes.

    • boonhet@sopuli.xyz · ↑4 · 4 hours ago

      There’s no difference whatsoever between Qt or GTK and Electron for data collection. You can add networking to your application in any of those frameworks.

    • Echo Dot@feddit.uk · ↑8 · 7 hours ago

      I don’t know why Electron has to use up so much memory though. It seems to use however much RAM is currently available when it boots; the more RAM the system has, the more Electron seems to think it needs.

      • GamingChairModel@lemmy.world · ↑13 · 6 hours ago

        Chromium is basically Tyrone Biggums asking if y’all got any more of that RAM, so bundling that into Electron is gonna lead to the same behavior.

      • Buddahriffic@lemmy.world · ↑5 ↓1 · 5 hours ago

        Inb4 “uNusEd RAm iS wAStEd RaM!”

        No, unused RAM keeps my PC running fast. I remember the days where accidentally hitting the windows key while in a game meant waiting a minute for it to swap the desktop pages in, only to have to swap the game pages back when you immediately click back into it, expecting it to either crash your computer or probably disconnect from whatever server you were connected to. Fuck that shit.

        • boonhet@sopuli.xyz · ↑3 · 4 hours ago

          I mean unused RAM is still wasted: You’d want all the things cached in RAM already so they’re ready to go.

          • Echo Dot@feddit.uk · ↑1 · 2 hours ago

            I mean, I have access to a computer with a terabyte of RAM. I’m gonna go ahead and say that most applications aren’t going to need that much, and if they do use that much I’m gonna be cross.

            • boonhet@sopuli.xyz · ↑1 · 35 minutes ago

              Wellll

              If you have a terabyte of RAM sitting around doing literally nothing, it’s kinda being wasted. If you’re actually using it for whatever application can make good use of it, which I’m assuming is some heavy-duty scientific computation or running full size AI models or something, then it’s no longer being wasted.

              And yes if your calculator uses the entire terabyte, that’s also memory being wasted obviously.

          • Buddahriffic@lemmy.world · ↑3 · 4 hours ago

            I don’t want my PC wasting resources trying to guess every possible next action I might take. Even I don’t know for sure what games I’ll play tonight.

            • boonhet@sopuli.xyz · ↑3 · 3 hours ago

              Well you’d want your OS to cache the start menu in the scenario you highlighted above. The game could also run better if it can cache assets not currently in use instead of waiting for the last moment to load them. Etc.

              • Buddahriffic@lemmy.world · ↑1 · 3 hours ago

                Yeah, for things that will likely be used, caching is good. I just have a problem with the “memory is free, so find more stuff to cache to fill it” or “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”.

                • boonhet@sopuli.xyz · ↑1 · 30 minutes ago

                  “memory is free, so find more stuff to cache to fill it”

                  As long as it’s being used responsibly and freed when necessary, I don’t have a problem with this

                  “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”

                  On anything running on the end user’s hardware, this I DO have a problem with.

                  I have no problem with a simple backend REST API being built on Spring Boot and requiring a damn gigabyte just to provide a /status endpoint or whatever. Because it runs on one or a few machines, controlled by the company developing it usually.

                  When a simple desktop application uses over a gigabyte because of shitty UI frameworks being used, I start having a problem with it, because that’s a gigabyte used per every single end user, and end users are more numerous than servers AND they expect their devices to do multiple things, rather than running just one application.

  • GreenShimada@lemmy.world · ↑212 ↓2 · 12 hours ago

    For anyone unsure: Jevons Paradox is that when there’s more of a resource to consume, humans will consume more resource rather than make the gains to use the resource better.

    Case in point: AI models could be written to be more efficient in token use (see DeepSeek), but instead AI companies just buy up all the GPUs and shove more compute in.

    For the expansive bloat - same goes for phones. Our phones are orders of magnitude better than what they were 10 years ago, and now it’s loaded with bloat because the manufacturer thinks “Well, there’s more computer and memory. Let’s shove more bloat in there!”

    • GamingChairModel@lemmy.world · ↑17 · 6 hours ago

      Jevons Paradox is that when there’s more of a resource to consume, humans will consume more resource rather than make the gains to use the resource better.

      More specifically, it’s when an improvement in efficiency causes the underlying resource to be used more, because the efficiency reduces cost, and using that resource then becomes even more economically attractive.

      So when factories got more efficient at using coal in the 19th century, England saw a huge increase in coal demand, despite using less coal for any given task.

      • Quetzalcutlass@lemmy.world · ↑2 · 1 hour ago

        Also Eli Whitney inventing the cotton gin to make extracting cotton less of a tedious and backbreaking process, which led to a massive expansion of slave plantations in the American South due to the increased output and profitability of the crop.

    • frunch@lemmy.world · ↑15 · 7 hours ago

      I always felt American car companies were a really good example of that back in the 60s-70s when enormously long vehicles with giant engines were the order of the day. Why not bigger? Why not stronger? It also acted as a symbol of American strength, which was being measured by raw power just like today lol.

      This also reminds me of the way video game programmers in the late 70s/early 80s had such tight limitations to work within that you had to get creative if you wanted to make something stand out. Some very interesting stories from that era.

      I also love to think about the tricks the programmer of Prince of Persia employed to get the “shadow prince” to work…

      https://www.youtube.com/watch?v=sw0VfmXKq54

    • VibeSurgeon@piefed.social · ↑62 ↓1 · 11 hours ago

      Case in point: AI models could be written to be more efficient in token use

      They are being written to be more efficient in inference, but the gains are being offset by trying to wring more capabilities out of the models by ballooning token use.

      Which is indeed a form of Jevons paradox.

      • errer@lemmy.world · ↑22 · 8 hours ago

        Costs have been dropping by a factor of 3 per year, but token use increased 40x over the same period. So while the efficiency is contributing a bit to the use, the use is exploding even faster.
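
As a back-of-envelope check of those numbers: if per-token cost drops 3x while token use grows 40x over the same period, total spend still rises by roughly 40/3 ≈ 13x.

```shell
# Net spend factor when efficiency gains (3x cheaper per token) are
# outpaced by usage growth (40x more tokens) -- Jevons in miniature.
awk 'BEGIN { printf "spend grows %.1fx\n", 40/3 }'
```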