Tags: deep-learning, neural-network, tensorflow, conv-neural-network

Deep learning CNN eating VRAM

Published on 2020-03-31 22:59:59

I've recently started building my own small deep learning rig: a Ryzen 5 3600 processor, 32 GB of RAM, and an RTX 2080 SUPER GPU with 8 GB of VRAM.

When working with CNNs (which is most of my work) in TensorFlow, I see that almost all of the VRAM is eaten up immediately (7.8 GB used out of 7.8 GB), while GPU utilization sits at around 35%.

I have installed the latest drivers and am running CUDA v10.0 for compatibility with TensorFlow v2.

Is that normal for CNNs working with images? Should I invest in more VRAM rather than a faster GPU (e.g. a multi-GPU system with a couple of used 1080 Ti or 2060 SUPER cards)?

Thoughts?

Questioner: Hassan Amin
Viewed: 55

Answer by Susmit Agrawal, 2020-01-31 19:59

It is not your CNN occupying the GPU memory; it is TensorFlow. By default, TensorFlow reserves nearly all available GPU memory as soon as it initializes, even if your model does not need that much.

You can make TensorFlow allocate only as much memory as it actually needs with the following snippet (GPU 0, TF 2.x):

import tensorflow as tf

# Allocate GPU memory on demand instead of reserving it all at startup.
gpu_devices = tf.config.experimental.list_physical_devices('GPU')
tf.config.experimental.set_memory_growth(gpu_devices[0], True)

The set_memory_growth option prevents TensorFlow from reserving all of the GPU memory up front, so the reported VRAM usage will reflect what your model actually consumes.
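If you would rather cap TensorFlow at a fixed amount of VRAM instead of letting it grow on demand, TF 2.x also lets you create a logical device with an explicit memory limit. Below is a minimal sketch assuming a single GPU and an arbitrary 4096 MB cap (adjust the value to your model); in newer TF releases the same functionality is exposed as tf.config.set_logical_device_configuration with tf.config.LogicalDeviceConfiguration.

import tensorflow as tf

# Cap TensorFlow at a fixed amount of VRAM on GPU 0.
# The 4096 MB limit is an arbitrary example value, not a recommendation.
gpu_devices = tf.config.experimental.list_physical_devices('GPU')
if gpu_devices:
    tf.config.experimental.set_virtual_device_configuration(
        gpu_devices[0],
        [tf.config.experimental.VirtualDeviceConfiguration(memory_limit=4096)]
    )

Like memory growth, this configuration must be applied before any operation touches the GPU; otherwise TensorFlow raises a RuntimeError because the device has already been initialized.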