Intel brings 32GB of VRAM and plenty of bandwidth to the local AI inference party

  • Die4Ever@retrolemmy.com · 20 hours ago

    At least this is something consumers/prosumers can buy and OWN, instead of a data center product we can only rent. And we desperately need Intel in the GPU market (or any third player).

    Although IDK how useful these cards are outside of AI.