GPU Cheap

Hello World

We are all GPU poor

If you have done any LLM training or inference, you have probably run into out-of-memory errors, painfully slow speeds, or both. You may have a 3090 in a nicely built PC, but what you really want is an A100 or an H100. This is a struggle for everyone, including us.

Local GPU

NVIDIA GPUs, new or used, are anything but cheap these days. How good is a 3090 in 2026? Should you buy a GB10-based system (aka DGX Spark) instead? Is it worth forking over $10k for an RTX PRO 6000 Blackwell?

Cloud GPU

There are many neocloud vendors popping up these days. What is a good hourly price for an H100? What is the difference between a container-based service and a VM-based service?

Alternative Platforms

Is Apple Silicon a viable alternative to x86 PCs for AI? How good is the new GPU that AMD or Intel just launched? Where can you access AMD GPUs in the cloud?

Stay Tuned

The answers to all the questions posed above are complicated. We will try to answer them in this blog. Don't forget to check back often.