They/Them, agender-leaning scalie.

ADHD software developer with far too many hobbies/trades: AI, gamedev, webdev, programming language design, audio/video/data compression, software 3D, mass spectrometry, genomics.

Learning German (B2), Chinese (HSK 3-4ish), French (A2).

  • 0 Posts
  • 29 Comments
Joined 1 year ago
Cake day: June 18th, 2023

  • I’d say it’s more like they’re failing upwards. It’s certainly good for AMD, but it seems like it happened in spite of their involvement, not because of it:

    For reasons unknown to me, AMD decided this year to discontinue funding the effort and not release it as any software product. But the good news was that there was a clause in case of this eventuality: Janik could open-source the work if/when the contract ended.

    AMD didn’t want this advertised or released, and even canned this project despite it reaching better performance than the OpenCL alternative. I really don’t get their thought process. It’s surreal. Do they not want to support AI? Do they not like selling GPUs?



  • There’s a weird divide between self-determined identity and external classifications. Often, a culture forms around the label and the external label stops being relevant, because the term carries more social/cultural implications than practical ones. Some people internalize the label because that’s how they wish to steer their future interactions; others ignore it and move on with their lives.

    You can watch all of Star Trek, and some parts of society will label you a Trekkie if they find out, but it’s up to you whether you choose to identify as a Trekkie, or just go about your life not making a big deal about it.


  • Assuming enthusiastic consent, good faith, and that you meant “sex/body they want” instead of “gender they want” (because gender is just a social construct):

    On another hand, it would erase their identity as trans people.

    I don’t think it would. Identities are built from life experiences, and having lived through transition they’d still be trans even if there were no traces of it on their body. A war veteran doesn’t stop being a veteran just because the war ended.

    consider it a genocide

    The definition of genocide depends on intent! Even killing in war doesn’t automatically qualify. It’s only genocide if you’re specifically trying to erase or displace a people or culture.

    • Trying to cure gender dysphoria: it’s not genocide, it’s medical treatment.

    • Trying to “fix” people to make them fit into society: it’s genocide.

    turning them into what they want would mean there is no more trans people

    There are people whose identities don’t stop being trans even if you give them the body they want:

    • A non-binary person’s desired sex/body and social gender might not match. Even with the perfect body (if one exists), they might still identify as trans because that body doesn’t match their social gender.

    • For genderfluid people, there might not be one singular perfect body. Even if their body constantly updated to suit them, they’d probably still identify as trans because they’d be constantly transitioning…








  • Honestly, I don’t think that there’s room for a competitor until a whole new paradigm is found. PyTorch’s community is the biggest and still growing. With their recent focus on compilation, not only are TF and Jax losing any chance at having an advantage, but the barrier to entry for new competitors is becoming much higher. Compilation takes a LOT of development time to implement, and it’s hard to ignore 50-200% performance boosts.

    Community size tends to ultimately drive open source software adoption. You can see the same with the web frameworks - in the end, most people didn’t learn React because it was the best available library, they learned it because the massive community had published so many tutorials and driven so many job adverts that it was a no-brainer to choose it over Angular, Vue, etc. Only the paradigm-shift libraries like Svelte and Htmx have had a chance at chipping away at React’s dominance.


  • The easiest way to get the basics is to search for articles, online courses, and youtube videos about the specific modules you’re interested in. Papers are written for people who are already deep in the field. You’ll get there, but they’re not the most efficient way to get up to speed. I have no experience with textbooks.

    It helps to think of PyTorch as just a fancy math library. It has some well-documented frameworky structure (nn.Module) and a few differentiation engines, but all the deep learning-specific classes/functions (Conv2d, BatchNorm1d, ReLU, etc.) are just optimized math under the hood.

    You can see the math by looking for projects that reimplement everything in numpy, e.g. picoGPT or ConvNet in NumPy.
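    To make “just math under the hood” concrete, here’s a hypothetical plain-Python sketch of two common layers — no autograd, no batching, no numpy — just the arithmetic that nn.ReLU and nn.Linear optimize:

    ```python
    # Minimal, framework-free sketches of two common layers.
    # These mirror what nn.ReLU and nn.Linear compute (minus autograd/batching).

    def relu(xs):
        # ReLU is literally just max(0, x), applied elementwise.
        return [max(0.0, x) for x in xs]

    def linear(x, weight, bias):
        # nn.Linear computes y = W @ x + b; here written as explicit loops.
        # weight is a list of rows (one row per output), bias one value per output.
        return [
            sum(w_i * x_i for w_i, x_i in zip(row, x)) + b
            for row, b in zip(weight, bias)
        ]

    print(relu([-1.0, 0.5, 2.0]))                  # [0.0, 0.5, 2.0]
    print(linear([1.0, 2.0], [[3.0, 4.0]], [0.5])) # [3*1 + 4*2 + 0.5] = [11.5]
    ```

    Everything else PyTorch adds on top of this — tensors, GPU kernels, autograd — is optimization and bookkeeping around the same math.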

    If you can’t get your head around the tensor operations, I suggest searching for “explainers”. Basically for every impactful module there will be a bunch of “(module) Explained” articles or videos out there, e.g. Grouped Convolution, What are Residual Connections. There are also ones for entire models, e.g. The Illustrated Transformer. Once you start googling specific modules’ explainers, you’ll find people who have made mountains of them - I suggest going through their guides and learning everything that seems relevant to what you’re working on.

    If one explanation of something isn’t clicking, just google and find another. People have done an incredible job of making this information freely accessible in many different formats. I basically learned my way from webdev to an AI career with a couple years of casually watching YouTube videos.





  • I think the big difference between people benefiting at small doses (~0.3mg) and large doses (2+mg) is that the 0.3mg group use it for sleep quality through the night, whereas the 2+mg people just need the sudden shock to get to sleep in the first place.

    The drawback with big doses is that your brain becomes less sensitive so your naturally-produced melatonin might not be enough to keep you asleep for the whole night after the pill wears off. It has a very short half-life in the body (under 1 hour), so there’s no way for a single dose before sleeping to last 8 hours. We naturally produce only 0.06-0.08mg per night, so it’s easy to see how supplementing melatonin could desensitize someone and cause them to wake up after just 4-6 hours of sleep.
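    A rough back-of-the-envelope for that half-life claim (assuming simple first-order exponential decay and a ~1-hour half-life — both simplifications):

    ```python
    # Fraction of a melatonin dose remaining after t hours, assuming
    # first-order (exponential) decay with a 1-hour half-life.
    def fraction_remaining(hours, half_life=1.0):
        return 0.5 ** (hours / half_life)

    dose_mg = 3.0
    for t in (1, 4, 8):
        print(f"after {t}h: {dose_mg * fraction_remaining(t):.4f} mg remains")

    # After 8h, a 3mg dose is down to ~0.012mg -- well below the
    # ~0.06-0.08mg the body produces on its own per night.
    ```

    So even a “large” dose is essentially gone long before a full night is over, which is why it can kick-start sleep but can’t maintain it.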

    I have ADHD and am in the large-dose category: I use 2-3mg of melatonin to help me fall asleep. Without it, I can’t sleep reliably because my brain often won’t shut up. Sleeping reliably is so much more important to me than sleep quality.

    Using it only 5 nights a week, I’m not significantly dependent. I can still sleep without melatonin, just less reliably. I’ve tried 0.3mg, but it felt the same as taking nothing.

    For me, 10mg would be excessive and probably harmful in a desensitizing way. The most I’ve taken is 6mg, but it only helped in 2 out of 6 times. The other 4 times my brain just wouldn’t stop. If doubling my usual dose didn’t help, I don’t think doubling it again would be any different.

    There are, however, studies with higher doses, e.g. this one about kids with ADHD that says:

    two-third of the patients responded to relatively medium doses (2.5–6 mg/d), whereas doses above 6 mg added further benefit only in a small percentage of children.

    so I guess it’s different for everyone.