Intel brings 32GB of VRAM and plenty of bandwidth to the local AI inference party

  • renegadespork@lemmy.jelliefrontier.net · 2 days ago

    “Corporation Pivots to Follow the Money” I suppose.

    I’m so tired of this bubble. The diminishing returns of throwing more compute at AI started rearing their head in 2024. Can we get back to allocating tech to stuff that people actually want?

    • Die4Ever@retrolemmy.com · 2 days ago

      At least this is for consumers/prosumers to buy and OWN, instead of data center products that we can only rent. And we desperately need Intel in the GPU market (or any third player).

      Although IDK how useful these cards are outside of AI.