FS: Nvidia RTX 2080 Ti 22GB VRAM mod (LLM, GenAI, ChatGPT) - $500
I am freeing up a pair of modded Nvidia 2080 Ti graphics cards with 22GB of VRAM available (technically 24GB is installed, but only 22GB of it is usable).
I’ve been using these for running LLMs locally. With one card you can run models of over 20 billion parameters; with both you can get close to 50 billion. Nothing special is needed to use both cards, Ollama will make use of them on its own.
So if you’re looking to run LLMs locally and do generative AI without breaking the bank, these are great options. I’ve mainly used llama3, but recently I’ve been running DeepSeek-r1. There are also many other freely available models you can try.
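For anyone curious what day-to-day use looks like, here is a rough sketch of chatting with a local model through Ollama's Python client. It assumes the Ollama server is already running and the ollama package is installed (pip install ollama); the model name and prompt are just examples.

# Minimal sketch: talk to a locally hosted model via Ollama.
import ollama

# Download the model if it isn't present yet (llama3 fits comfortably on one 22GB card).
ollama.pull("llama3")

# Ask the local model a question. Ollama handles GPU placement itself,
# so running on one card or two needs no extra configuration here.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why does extra VRAM help when running local LLMs?"}],
)
print(response["message"]["content"])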