  • I’ll try to give an ELI5 kind of answer here.

    Before the Internet, “networks” were mostly one-off systems - bulletin boards and the like - that you dialed into with a modem to enjoy whatever content happened to be hosted there.

    The Internet was created as a way to connect multiple, disparate networks like these. Now, instead of just letting people access your content, you could let them access other people’s content as well.

    There were lots of programs made to do this. IRC for chatting, Archie for searching FTP sites for downloads you might want, Gopher for browsing menus of documents. There was also Usenet - a threaded discussion system. The discussions looked a lot like Lemmy - there were subject lines, and when you clicked on one there was a threaded discussion you could read and participate in.

    When all this was initially going on, the Internet was mostly text-based. We may have been accessing Usenet from our Windows 3.1 laptops (I used a program called Agent), but all these programs were doing was trading text. Slowly, though, bandwidth started creeping up.

    As bandwidth crept up, people realized that huge text posts to Usenet could be used to carry things like photos encoded as text. uuencoding - a scheme originally built for passing binaries through UNIX-to-UNIX mail - became the standard way to do it, and it didn’t stop at photos. But because Usenet posts are limited in size, big files would get split across multiple sections/posts that could be stitched back together into a whole again (later with PAR “parity archive” files posted alongside to repair any missing pieces).
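    The whole trick above - encode binary as text, split it to fit post limits, stitch and decode on the other end - can be sketched in a few lines. This is a hypothetical illustration, not any real client's code; the function names and the 200-character "post size" are made up for the example. Python's `binascii` module happens to still ship the actual uuencode primitives.

```python
# Sketch of how a binary file could be uuencoded and split into
# Usenet-sized parts, then stitched back together. Names and the
# post-size limit are illustrative, not any real client's API.
import binascii

def uuencode(data: bytes) -> str:
    """Encode raw bytes as uuencoded text, 45 raw bytes per line."""
    lines = []
    for i in range(0, len(data), 45):           # b2a_uu takes at most 45 bytes
        lines.append(binascii.b2a_uu(data[i:i + 45]).decode("ascii"))
    return "".join(lines)

def split_into_posts(text: str, max_chars: int) -> list[str]:
    """Split encoded text into multiple 'posts' small enough for Usenet."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def uudecode(text: str) -> bytes:
    """Decode uuencoded text back into the original bytes."""
    return b"".join(binascii.a2b_uu(line + "\n")
                    for line in text.splitlines())

payload = bytes(range(256)) * 4           # stand-in for a small photo
encoded = uuencode(payload)
posts = split_into_posts(encoded, 200)    # pretend 200 chars fits one post
reassembled = uudecode("".join(posts))    # reader joins posts, then decodes
assert reassembled == payload             # round-trips cleanly
```

    Each uuencoded line carries its own length byte, which is why decoding line-by-line reconstructs the original exactly.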

    It was in this way that Usenet - a system designed for conversation - became a way to trade files.

    Meanwhile the web happened. Discussion quickly moved to the web because you didn’t have to download a separate program to view web forums. At the time, web forums were inherently inferior (they couldn’t do threaded discussion) but they were also inherently superior (they could be moderated). Yeah, Usenet was largely unmoderated, and because of this it was basically a huge pile of dogshit by the time the web got huge.

    Usenet did continue to flourish though - as this sort of Frankenstein file-sharing system. The problem was that most Usenet servers were hosted by ISPs, which wanted to host discussions - not file sharing. So they shut their Usenet servers down. But the file sharing was just too useful to die, so dedicated Usenet providers popped up and picked up the slack where the local ISPs left off. It wasn’t hard. Usenet is just a protocol (NNTP) - anybody can implement it and run a node.
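    "Just a protocol" is worth seeing concretely: NNTP (RFC 3977) is a plain-text, line-based protocol in the same family as SMTP and HTTP/0.9, which is exactly why anyone could stand up a node. The sketch below only builds and parses protocol lines - no real server is contacted, and the newsgroup is just an example.

```python
# Hedged sketch of the NNTP wire format (RFC 3977) that any Usenet
# node speaks. No network connection is made; this just shows why
# implementing a node was never hard - it's verbs and status codes.

def build_command(verb: str, *args: str) -> bytes:
    """NNTP commands are a verb plus arguments, CRLF-terminated."""
    return (" ".join([verb, *args]) + "\r\n").encode("ascii")

def parse_status(line: bytes) -> tuple[int, str]:
    """Responses start with a 3-digit status code, like SMTP/FTP."""
    text = line.decode("ascii").rstrip("\r\n")
    code, _, rest = text.partition(" ")
    return int(code), rest

# A session to fetch one article might look like:
print(build_command("GROUP", "comp.os.minix"))   # select a newsgroup
print(build_command("ARTICLE", "123"))           # fetch article number 123

# ...and the server answers with lines such as:
code, info = parse_status(b"211 1234 3000 4234 comp.os.minix\r\n")
assert code == 211   # 2xx means success, just like other text protocols
```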

    And clients changed too - from readers like the Agent I used, to new readers built on the recognition that people on Usenet aren’t looking for discussion anymore. They’re looking for an easy way to find the files they want, and a program that will seamlessly stitch together all those multi-part posts (repairing them with the PAR files) behind the scenes.
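    The repair part deserves a word: PAR files work on an erasure-coding idea - post extra "parity" blocks so a reader can rebuild parts that never arrived. Real PAR2 uses Reed-Solomon codes; the sketch below shows only the simplest flavor of the same idea, a single XOR parity block that can recover any one lost part. The data is made up for illustration.

```python
# Minimal sketch of the erasure-coding idea behind PAR files: one XOR
# parity block recovers any single missing part. (Real PAR2 uses
# Reed-Solomon codes and can survive multiple losses.)

def xor_blocks(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length blocks byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

parts = [b"AAAA", b"BBBB", b"CCCC"]        # equal-sized file segments
parity = parts[0]
for p in parts[1:]:
    parity = xor_blocks(parity, p)         # parity = part0 ^ part1 ^ part2

# Suppose part 1 never arrived: XOR the parity with the surviving parts.
recovered = parity
for i, p in enumerate(parts):
    if i != 1:
        recovered = xor_blocks(recovered, p)
assert recovered == parts[1]               # the lost part is rebuilt
```

    This is why a client can quietly download one extra parity volume instead of re-fetching a giant post that arrived damaged.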

    This was the purpose behind Newzbin, an index that made it easy to search the remaining federation of (now mostly dedicated, paid) Usenet servers and download everything they had to offer. It was super easy and worked very well, so naturally, it was fucked into oblivion by Hollywood in 2010.

    The great thing about Usenet, though, is you can’t kill it by killing off one node. The other great thing is that it’s pretty stupidly complicated by today’s standards, so it’s been largely forgotten while Hollywood focuses on stuff like torrenting - which is part of why it still exists.

    If you want to access Usenet, you will need to purchase access from a company that runs a Usenet server, and get client software that can find those multi-part posts and stitch them back together (repairing with the PAR files as needed). I am out of the loop, so I am afraid I cannot help you any further with that. But hopefully knowing the history of it and how it works in theory will help.