• rmrf@lemmy.ml
    1 day ago

    Honestly, just don't use computers at all; completely eliminate the remote attack vector. And only drink rain water, since city water can be compromised.

    Or recognize that this is a normal part of using software, and have more than one thing between you and a breach.

    • quack@lemmy.zip
      1 day ago

      The rules of cybersecurity:

      1. Under no circumstances should you own a computer.

      2. If you absolutely must own a computer, under no circumstances should you connect it to the internet.

      3. If you absolutely must connect it to the internet, it’s too late and they already have you

      • HubertManne@piefed.social
        23 hours ago

        I know this is a joke, but I'm old enough to remember when we used to install the OS, put the machine on the network, and eventually update it. Then it got to the point where being connected to the internet for even a minute meant the machines were compromised. That's when we got off our duffs and started making custom installs that had updates, configurations, and software pre-installed before we even connected them to the net.

      • StarDreamer@lemmy.blahaj.zone
        23 hours ago

        And how would apt help in this particular case? A supply chain attack can happen with any package manager. In this case, the compromised package was detected and mitigated within 93 minutes, affecting a total of ~330 users, which is a lot better than how a lot of distros handled the xz breach last year.

        All reasonably secure package managers (and https) operate on a chain of trust. There is little that can be done if that chain of trust is broken.
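
        To illustrate the point: here is a minimal toy sketch of that chain of trust, and of why a stolen signing credential defeats it. The names (`TRUSTED_KEY`, `sign`, `verify`) are hypothetical, and HMAC stands in for the asymmetric signatures (GPG/Ed25519) real package managers actually use, just to keep the sketch stdlib-only.

        ```python
        import hashlib
        import hmac

        # Hypothetical model: the client pins a trusted signing key at
        # install time and verifies every package against it.
        TRUSTED_KEY = b"distro-signing-key"  # assumption, not a real key

        def sign(package: bytes, key: bytes) -> str:
            # Real package managers use asymmetric signatures;
            # HMAC is a stand-in so this sketch needs no third-party libs.
            return hmac.new(key, package, hashlib.sha256).hexdigest()

        def verify(package: bytes, signature: str, key: bytes = TRUSTED_KEY) -> bool:
            return hmac.compare_digest(sign(package, key), signature)

        pkg = b"legit package contents"
        sig = sign(pkg, TRUSTED_KEY)
        assert verify(pkg, sig)                       # intact chain: passes
        assert not verify(b"tampered contents", sig)  # tampering: caught

        # But if an attacker obtains the signing credential (the broken
        # chain of trust), their package verifies just as cleanly:
        evil = b"malicious package contents"
        assert verify(evil, sign(evil, TRUSTED_KEY))
        ```

        The last assertion is the whole problem: once the trusted credential itself is stolen, the verification step cannot distinguish the attacker from the maintainer.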

        Based on this, the cause was a malicious VSCode extension that stole credentials which were later used to trigger a CI/CD deployment pipeline. If there's anything to learn from this, it's probably to not use VSCode.

          • StarDreamer@lemmy.blahaj.zone
            22 hours ago
            1. If your assumption is that X.509 is trash, does that mean you hold the same amount of distrust toward TLS?
            2. How do you propose key management scales? Do you have a reasonable alternative to users blindly trusting every single key they come across?
            3. Back to my original question: what prevents a VSCode extension from stealing a private signing key (as opposed to an API key) and causing the same issues described here?