I’m just getting into playing with Ollama and want to build some self-hosted AI applications. I don’t need heavy-duty cards because they probably won’t ever be under much load, so I’m mostly looking for power efficiency plus a decent price.

Any suggestions for cards I should look at? So far I’ve been browsing eBay and looking at Tesla M40 24GB cards (GDDR5). They’re reasonably priced, but I’m wondering if anyone has any specific recommendations.
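For context, the workload is just occasional requests against Ollama’s HTTP API, nothing sustained. A minimal sketch of the kind of thing I mean (assumes Ollama’s default `localhost:11434` endpoint; the model name `llama3` is just an example and may differ on your setup):

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumed default install)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to a local Ollama server and return the reply text."""
    payload = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (needs a running Ollama server with the model pulled):
#   print(generate("llama3", "Why is the sky blue?"))
```

So really just single prompts here and there, which is why I don’t think I need a serious card.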

  • dxx255@alien.top · 1 year ago

    I installed a Tesla P40 in an R720xd. You just have to carefully select the correct power cable — the P40 takes an 8-pin EPS/CPU-style connector, not a standard PCIe 8-pin.