To compare deep learning processing speed between GPU and CPU, I used Anaconda to build two virtual environments: one that runs TensorFlow on the GPU and one that runs it on the CPU.

◆ Environment
Windows 10 64-bit
NVIDIA GeForce GTX 1650
Microsoft Visual Studio C++ 2019
CUDA v10.0
cuDNN 7.4

"gpuEnv"
python 3.6
TensorFlow-GPU 2.0.0
Keras 2.3.1

"cpuEnv"
python 3.7.7
TensorFlow 2.1.0
Keras 2.3.1
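
For reference, a rough sketch of how two such environments can be built with Anaconda (the exact commands are my assumption; the package versions follow the lists above):

conda create -n gpuEnv python=3.6
conda activate gpuEnv
pip install tensorflow-gpu==2.0.0 keras==2.3.1

conda create -n cpuEnv python=3.7.7
conda activate cpuEnv
pip install tensorflow==2.1.0 keras==2.3.1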


I trained the following simple model on the Fashion-MNIST dataset in each of the above virtual environments (gpuEnv, cpuEnv).

# Check whether TensorFlow recognizes the GPU
import tensorflow as tf
if tf.test.gpu_device_name():
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
else:
    print("Please install GPU version of TF")

from tensorflow import keras
import numpy as np

# Load Fashion-MNIST and scale pixel values (0-255) to the 0-1 range
fashion_mnist = keras.datasets.fashion_mnist
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
train_images = train_images / 255.0
test_images = test_images / 255.0

# A simple fully connected model: flatten, one hidden layer, softmax output
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(units=128, activation="relu"),
    keras.layers.Dense(units=10, activation="softmax"),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Run training for one epoch
model.fit(train_images, train_labels, epochs=1)

Below is the console output for each virtual environment. You can see that the GPU actually takes longer to process than the CPU.
What are the possible causes? I'm a beginner, so forgive the basic question, but I would appreciate your advice.

>Default GPU Device: /device:GPU:0  # GPU is recognized
>Train on 60000 samples
>60000/60000 [==============================] - 4s 64us/sample - loss: 0.4982 - accuracy: 0.8250
><tensorflow.python.keras.callbacks.History at 0x1a395e48e48>

>Please install GPU version of TF  # GPU is not recognized
>Train on 60000 samples
>60000/60000 [==============================] - 2s 38us/sample - loss: 0.4913 - accuracy: 0.8275
><tensorflow.python.keras.callbacks.History at 0x29ae6374188>
  • Answer #1

    It may be because, as you say, you trained "the following simple MNIST_Dataset".
    A GPU gives no speedup on a workload this small: the per-step overhead of launching kernels and copying data between CPU and GPU memory outweighs the tiny amount of computation in each batch, so the GPU run ends up slower.

    * It's like readying a jet plane just to go to the convenience store around the corner.
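
    A minimal sketch to test this with a heavier workload (the convolutional layers, batch_size, and timing code below are my illustrative additions, not from the question):

    import time
    from tensorflow import keras

    # Load Fashion-MNIST and add a channel axis so Conv2D accepts it
    (train_images, train_labels), _ = keras.datasets.fashion_mnist.load_data()
    train_images = train_images[..., None] / 255.0

    # A heavier convolutional model that gives the GPU enough work per step
    model = keras.Sequential([
        keras.layers.Conv2D(64, 3, activation="relu", input_shape=(28, 28, 1)),
        keras.layers.Conv2D(64, 3, activation="relu"),
        keras.layers.Flatten(),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Larger batches amortize the CPU-GPU transfer overhead; time one epoch
    start = time.perf_counter()
    model.fit(train_images, train_labels, epochs=1, batch_size=512)
    print("1 epoch took {:.1f}s".format(time.perf_counter() - start))

    Run this in both gpuEnv and cpuEnv; with enough computation per batch, the GPU should come out ahead.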