What I want to achieve

We are designing a CNN that takes two images as input and produces a single classification output.

Specifically, the network extracts features from the two images with the feature-extraction layers of VGG16 (using the weights pre-trained on ImageNet rather than initializing them randomly) and classifies the combined features with fully connected layers.

However, the following error occurred.

Error message
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-62-d0881f532e33> in <module>()
----> 1 NN3_conv = Flatten()(merged)
      2 NN3_conv = Dense(8192)(NN3_conv)
      3 NN3_conv = BatchNormalization()(NN3_conv)
      4 NN3_conv = Dense(8192)(NN3_conv)
      5 NN3_conv = BatchNormalization()(NN3_conv)

~\Anaconda3\lib\site-packages\keras\engine\base_layer.py in __call__(self, inputs, **kwargs)
    504             if all([s is not None
    505                     for s in to_list(input_shape)]):
--> 506                 output_shape = self.compute_output_shape(input_shape)
    507             else:
    508                 if isinstance(input_shape, list):

~\Anaconda3\lib\site-packages\keras\layers\core.py in compute_output_shape(self, input_shape)
    499             raise ValueError('The shape of the input to "Flatten" '
    500                              'is not fully defined '
--> 501                              '(got ' + str(input_shape[1:]) + '). '
    502                              'Make sure to pass a complete "input_shape" '
    503                              'or "batch_input_shape" argument to the first'

ValueError: The shape of the input to "Flatten" is not fully defined (got (None, None, 1024)). Make sure to pass a complete "input_shape" or "batch_input_shape" argument to the first layer in your model.
Source code
from keras.models import Model, Sequential
from keras.layers import Input, Dense, Dropout, Activation, Flatten
from keras.layers import add, concatenate
from keras.utils import plot_model
from keras.applications.vgg16 import VGG16
import keras

NN1 = keras.applications.vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None, input_shape=None, pooling=None)
NN2 = keras.applications.vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None, input_shape=None, pooling=None)
merged = concatenate([NN1.output, NN2.output])
NN3_conv = Flatten()(merged)
NN3_conv = Dense(8192)(NN3_conv)
NN3_conv = BatchNormalization()(NN3_conv)
NN3_conv = Dense(8192)(NN3_conv)
NN3_conv = BatchNormalization()(NN3_conv)
NN3_conv = Dense(CLASS_NUM, activation="softmax")(NN3_conv)
model = Model([NN1.input, NN2.input], NN3_conv)

I'm not sure what's causing this.
Any guidance would be greatly appreciated.

Environment/version

OS
Windows 10

Version
Python 3.6.4
Keras 2.3.1

  • Answer #1

    When I specified the standard ImageNet input size, the Flatten layer worked normally. Note that because the same VGG16 model is used twice, simply stacking the two copies makes the layer names collide inside Keras, so the layers of each branch are renamed partway through.
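
    For reference, here is a minimal check (a sketch, assuming the same standalone Keras 2.3.1 setup) of why the original code fails: with input_shape=None, VGG16 is built on a variable-sized input, so the height and width of its output are undefined, and after concatenating the two 512-channel feature maps the tensor shape is (None, None, None, 1024), which Flatten cannot turn into a fixed-length vector.

    from keras.applications.vgg16 import VGG16

    # input_shape=None -> variable-sized input, so the spatial dimensions of the output are unknown
    vgg_free = VGG16(include_top=False, weights='imagenet', input_shape=None)
    print(vgg_free.output_shape)     # (None, None, None, 512)

    # Fixing the input size makes every dimension concrete, so Flatten can compute its output size
    vgg_fixed = VGG16(include_top=False, weights='imagenet', input_shape=(224, 224, 3))
    print(vgg_fixed.output_shape)    # (None, 7, 7, 512)

    The corrected code: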

    from keras.models import Model, Sequential
    from keras.layers import Input, Dense, Dropout, Activation, Flatten
    from keras.layers import add, concatenate, BatchNormalization
    from keras.utils import plot_model
    from keras.applications.vgg16 import VGG16
    import keras

    NN1 = keras.applications.vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None, input_shape=(224, 224, 3), pooling=None)
    NN2 = keras.applications.vgg16.VGG16(include_top=False, weights='imagenet', input_tensor=None, input_shape=(224, 224, 3), pooling=None)

    # Rename the layers of each VGG16 copy so the two branches do not share layer names
    for layer in NN1.layers:
        layer.name = layer.name + "NN1"
    for layer in NN2.layers:
        layer.name = layer.name + "NN2"

    merged = concatenate([NN1.output, NN2.output])
    NN3_conv1 = Flatten()(merged)
    NN3_conv2 = Dense(8192)(NN3_conv1)
    NN3_conv3 = BatchNormalization()(NN3_conv2)
    NN3_conv4 = Dense(8192)(NN3_conv3)
    NN3_conv5 = BatchNormalization()(NN3_conv4)
    NN3_conv6 = Dense(1000, activation="softmax")(NN3_conv5)
    model = Model([NN1.input, NN2.input], NN3_conv6)
    model.summary()
    __________________________________________________________________________________________________
    Layer (type)                    Output Shape          Param #     Connected to
    ==================================================================================================
    input_1NN1 (InputLayer)         (None, 224, 224, 3)   0
    __________________________________________________________________________________________________
    input_2NN2 (InputLayer)         (None, 224, 224, 3)   0
    __________________________________________________________________________________________________
    block1_conv1NN1 (Conv2D)        (None, 224, 224, 64)  1792        input_1NN1[0][0]
    __________________________________________________________________________________________________
    block1_conv1NN2 (Conv2D)        (None, 224, 224, 64)  1792        input_2NN2[0][0]
    __________________________________________________________________________________________________
    ... (omitted) ...
    block1_conv2NN1 (Conv2D)        (None, 224, 224, 64)  36928       block1_conv1NN1[0][0]
    __________________________________________________________________________________________________
    block1_conv2NN2 (Conv2D)        (None, 224, 224, 64)  36928       block1_conv1NN2[0][0]
    __________________________________________________________________________________________________
    block1_poolNN1 (MaxPooling2D)   (None, 112, 112, 64)  0           block1_conv2NN1[0][0]
    __________________________________________________________________________________________________
    block1_poolNN2 (MaxPooling2D)   (None, 112, 112, 64)  0           block1_conv2NN2[0][0]
    __________________________________________________________________________________________________
    block2_conv1NN1 (Conv2D)        (None, 112, 112, 128) 73856       block1_poolNN1[0][0]
    __________________________________________________________________________________________________
    block2_conv1NN2 (Conv2D)        (None, 112, 112, 128) 73856       block1_poolNN2[0][0]
    __________________________________________________________________________________________________
    block2_conv2NN1 (Conv2D)        (None, 112, 112, 128) 147584      block2_conv1NN1[0][0]
    __________________________________________________________________________________________________
    ... (omitted) ...
    flatten_1 (Flatten)             (None, 50176)          0          concatenate_1[0][0]
    __________________________________________________________________________________________________
    dense_1 (Dense)                 (None, 8192)           411049984  flatten_1[0][0]
    __________________________________________________________________________________________________
    batch_normalization_1 (BatchNor (None, 8192)           32768      dense_1[0][0]
    __________________________________________________________________________________________________
    dense_2 (Dense)                 (None, 8192)            67117056  batch_normalization_1[0][0]
    __________________________________________________________________________________________________
    batch_normalization_2 (BatchNor (None, 8192)            32768     dense_2[0][0]
    __________________________________________________________________________________________________
    dense_3 (Dense)                 (None, 1000)             8193000  batch_normalization_2[0][0]
    ==================================================================================================
    Total params: 515,854,952
    Trainable params: 515,822,184
    Non-trainable params: 32,768
    __________________________________________________________________________________________________
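
    For completeness, here is a minimal usage sketch (not from the original post; the dummy data, optimizer choice, and batch size are placeholders) showing the input format this two-input model expects: a list of two image arrays, one per branch, plus one-hot labels.

    import numpy as np
    from keras.applications.vgg16 import preprocess_input

    model.compile(optimizer='adam',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])

    # Dummy data just to illustrate the expected format; real images should be
    # resized to 224x224 and passed through preprocess_input before training.
    x1 = preprocess_input(np.random.rand(8, 224, 224, 3) * 255.0)  # images for branch NN1
    x2 = preprocess_input(np.random.rand(8, 224, 224, 3) * 255.0)  # images for branch NN2
    y = np.eye(1000)[np.random.randint(0, 1000, size=8)]           # one-hot labels

    model.fit([x1, x2], y, batch_size=4, epochs=1)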