TensorDock: Affordable, Easy, Hourly Cloud GPUs From $0.32/hour | Free $15 Credit!
Looking for an alternative to the big, expensive cloud providers that fleece you on cloud GPUs? Meet TensorDock.
We're a small, close-knit startup based in Connecticut that sells virtual machines with dedicated GPUs attached. Our goal is not to make money. Rather, our primary goal is to democratize large-scale high-performance computing (HPC) and make it accessible to everyday developers.
1. Ridiculously Easy
Your time is money, so we've tried to make your life as easy as possible. We built our own panel, designed for the GPU use case. No WHMCS here. We did things our way. We have an API too.
When you deploy a Linux server, NVIDIA drivers, Docker, NVIDIA-Docker2, CUDA toolkit, and other basic software packages are preinstalled. For Windows, we include Parsec.
2. Ridiculously Cheap
The cheapest VM you can launch is $0.32/hour: a Quadro RTX 4000 with 2 vCPUs, 4 GB of RAM, and 100 GB of NVMe storage. If you're running an hourly GPU instance at another provider, check our pricing and you'll save by switching to us. If you can commit long term, we can give discounts of up to 40%, sometimes 60% or higher.
Our pricing is unusual. During our experimentation phase, we purchased a ton of different servers and ended up with a heterogeneous fleet. So we decided to charge per resource. Customers are rewarded for choosing the smallest amount of CPU/RAM, and they'll be placed on the smallest host node available. Select your preferred GPU and other configuration, and you'll only be billed for what you are allocated. It's that simple.
If you are training an ML model for 5 hours on 4x NVIDIA A5000s, it'll cost you less than $20.
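The per-resource billing above is just a sum of resource-hours times rates. Here's a minimal sketch of that arithmetic; the rates and the 8 vCPU / 16 GB RAM configuration are illustrative assumptions, not TensorDock's actual price sheet — check the pricing page for real numbers:

```python
# Hypothetical per-resource hourly rates (assumed for illustration only).
RATES = {
    "gpu_a5000": 0.90,   # per GPU-hour
    "vcpu": 0.01,        # per vCPU-hour
    "ram_gb": 0.002,     # per GB of RAM per hour
    "nvme_gb": 0.00005,  # per GB of NVMe storage per hour
}

def hourly_cost(gpus, vcpus, ram_gb, nvme_gb, rates=RATES):
    """Per-resource billing: each resource's quantity times its hourly rate."""
    return (gpus * rates["gpu_a5000"]
            + vcpus * rates["vcpu"]
            + ram_gb * rates["ram_gb"]
            + nvme_gb * rates["nvme_gb"])

# Example: a 4x A5000 box with 8 vCPUs, 16 GB RAM, 100 GB NVMe
per_hour = hourly_cost(gpus=4, vcpus=8, ram_gb=16, nvme_gb=100)
total = per_hour * 5  # a 5-hour training run
```

Under these assumed rates the run comes in under $20, consistent with the claim above; the point is that you pay for exactly the resources you select, nothing more.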
3. Live GPU Stock
As of this very moment, we have over 1,000 GPUs in stock, with another 5,000 GPUs available through reservation: you contact us, and we tell our partner cloud providers to install our host node software stack on their idle GPUs. We can handle your computing needs, no matter how large.
Because we charge per-resource, just check out our pricing:
You can register here:
And then deploy a server here:
It's that simple.
The LES Offer
Not everyone needs GPUs, especially on a server forum like LES. So this is more of a soft launch for us before we go onto other ML-related forums at the start of next year.
This offer is only for LES users with at least 5 thanks and 5 posts/comments. If you already claimed the signup bonus on LET, unfortunately you can't claim a second one.
$5 in account credit for registering and posting your user ID
Cloud GPUs at https://tensordock.com/, ID [Your User ID]
E.g., if your user ID was recbob0gcd, you'd post:
Cloud GPUs at https://tensordock.com/, ID recbob0gcd
Additional $10 in account credit for creating a server & giving feedback
Once we've given you the $5 in account credit, go create a GPU server and give us some feedback on the experience — 2 sentences, please! Again, post your user ID with this comment, and we'll give you an additional $10 in account credit. Bonus if you try using our API.
Our goal is to gather feedback to improve the product before we go bigger.
~ Mark & Richard
Questions? Feel free to ask within this thread.