• 0 Posts
  • 8 Comments
Joined 2 years ago
Cake day: March 22nd, 2024

  • I’m just really tired of the parroted and reactionary “blockchain/AI bad” comments because they often have no nuance. It’s not the implementation that’s “bad”, it’s seemingly that they used the tech at all.

    Agreed on that, too.

    There was this one instance of a “remastered” TV show, partially processed with GANs, a long time ago. I pointed this out (as apparently this little detail was forgotten with time), and I got chewed out. Reddit commenters either claimed I was lying (when I helped work on it), or declared that something they had loved for years, way before the LLM bubble, was now awful and intolerable just because it’s “AI”.


  • Of course it can be discussed…

    I’m really into (open-weights) genAI myself, and have been for years, but at the same time I’m under no illusion that the space is clean. The vast majority of services are scams, many open-source AI projects are autogenerated slop from someone with AI psychosis (if not outright Tech Bro scams), and that’s not even touching on what Big Tech is pushing.

    What I’m asserting is that a fat slab of skepticism is healthy in this kind of space. Be an enthusiast, not a believer. I know much less about blockchain, so perhaps I was a little zealous in my judgement, but something about this project raised a lot of red flags in my head, the same way scam-adjacent AI projects do.


    Another thing is that the blockchain scams haven’t gone away, and in ten years they will probably still stubbornly persist. GenAI is going to be the same.




  • To add to what others said:

    LPDDR is used in some inference hardware: the same stuff you find in laptops and smartphones.

    Also, the servers need a whole lot of regular CPU DIMMs, since they’re still mostly EPYC/Xeon servers with 8 GPUs each. And why are they “wasting” so much money on CPU RAM that isn’t really needed, you ask? Same reason as a lot of AI: it’s immediately accessible, already targeted by devs, and AI dev is way more conservative and wasteful than you’d think.

    Same for SSDs. Regular old servers (including AI servers) need them too. In a perfect world they’d use centralized storage for images/weights with near-“diskless” inference/training servers. Some AI servers do this, but most don’t.


    Basically, the waste is tremendous, for the same reason they use cheap gas generators on-site: it’s faster to market.