Running recommender-system experiments may require significant computing resources. Memory-based approaches like kNN, in particular, can need more memory than a typical desktop computer offers. Luckily, many organisations offer free or cheap (cloud) computing resources, though the free tiers are often limited.
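To see why memory becomes the bottleneck, a quick back-of-envelope calculation helps: a memory-based kNN recommender typically holds a dense item-item (or user-user) similarity matrix, whose size grows quadratically with the number of items. The sketch below (function name and numbers are illustrative, not from any particular library) estimates that footprint:

```python
import numpy as np

def knn_similarity_memory_gib(n_items: int, dtype=np.float32) -> float:
    """Rough memory footprint (in GiB) of a dense item-item
    similarity matrix, as used by memory-based kNN recommenders."""
    bytes_total = n_items * n_items * np.dtype(dtype).itemsize
    return bytes_total / 1024**3

# Illustrative example: a catalogue of 100,000 items already needs
# roughly 37 GiB for a float32 similarity matrix -- more than most
# desktop machines have.
print(f"{knn_similarity_memory_gib(100_000):.1f} GiB")
```

In practice, sparse similarity matrices or truncated neighbourhoods reduce this considerably, but the quadratic scaling is the reason rented cloud memory quickly becomes attractive.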
- https://vast.ai/ claims to reduce GPU costs by a factor of 3-5: they don't run their own infrastructure but instead let you rent GPU time on other people's computers. The concept seems very interesting, but there are also critical voices.
More details yet to come…
Related Blog Posts
Even Oldridge from NVIDIA Merlin has written a blog post about Why isn’t your recommender system training faster on GPU? (And what can you do about it?). It’s a nice article that outlines the differences between Computer Vision and NLP — two areas where Deep Learning and GPUs work excellently — and recommender systems, where […]
Google and efabless announced that they will let anybody design their own chip architecture and manufacture those chips free of charge. Maybe it is time for a hardware recommender system on a chip? Did you ever dream about creating your own chip? I mean, a physical chip, one which you can hold in your hand, […]