How much do Tensor Cores improve calculations?

SvenCam

I plan to buy an RTX 2080 for machine learning tasks, but the price is very high! Right now I use a 1080 Ti for my work, which includes many image recognition tasks. After googling, I see that the new RTX GPUs have Tensor Cores, which are built specifically for deep learning. So will I be able to perform image recognition at least 2x faster than now (as mentioned in the Nvidia presentation)? Or will buying a new video card not give me such a performance boost? Thanks for the help.

Answers

CK

I guess that largely depends on the libraries and algorithms you use. Here is a comparison: https://www.pugetsystems.com/labs/hpc/NVIDIA-RTX-2080-Ti-vs-2080-vs-1080-Ti-vs-Titan-V-TensorFlow-Performance-with-CUDA-10-0-1247/

Maybe cloud computing / cloud GPUs are another option for you, if you don't want high upfront hardware costs and/or you need faster results.

Caprico

Tensor Cores speed up CNN and LSTM calculations by roughly 30%-60%. You can improve on that by enabling 16-bit computation. This is an advantage for matrix multiplication because, with numbers being only 16-bit instead of 32-bit, you can transfer twice as many numbers over the same memory bandwidth. Switching to 16-bit can give you around 100%-300% speedups. So yes, an RTX 2080 can give you a 2x boost compared with a 1080 Ti.
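
For example, here is a minimal sketch of mixed-precision (FP16) training in PyTorch using torch.cuda.amp, which is what lets the Tensor Cores run the matrix multiplications in 16-bit. The model and data below are just placeholders; your own network and training loop will differ:

```python
import torch
from torch import nn
from torch.cuda.amp import autocast, GradScaler

# Placeholder model and data -- substitute your own network and data loader.
model = nn.Sequential(
    nn.Conv2d(3, 64, 3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 10),
).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
scaler = GradScaler()  # scales the loss to avoid FP16 gradient underflow

images = torch.randn(8, 3, 224, 224, device="cuda")  # dummy batch
labels = torch.randint(0, 10, (8,), device="cuda")

optimizer.zero_grad()
with autocast():                      # forward pass runs in FP16 where safe
    loss = criterion(model(images), labels)
scaler.scale(loss).backward()         # backward pass on the scaled loss
scaler.step(optimizer)                # unscales gradients, then steps
scaler.update()
```

The same code also runs on a 1080 Ti, but Pascal cards have no Tensor Cores, so FP16 gives much less of a speedup there; on Turing cards like the 2080, the FP16 matrix multiplications map onto the Tensor Cores.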

Regards.

SvenCam

Thanks for your answers.
