
I have a dataset with 11 columns and 8640 rows. I built a neural network with 3 inputs, 2 hidden layers, and 1 output. The full problem actually has 8 outputs, but I need a separate model for each output, so I build them in the loop below. The test set is 33% and the training set is 67%.

import numpy as np
import tensorflow
from sklearn.preprocessing import StandardScaler
from tensorflow.keras import Sequential, layers
from tensorflow.keras.regularizers import l2

for i in range(0, 8):
    inputs = Inputs_arr
    outputs = Outputs_arr[:, i].reshape(-1, 1)
    # scale inputs and outputs with separate scalers
    sc_in = StandardScaler()
    sc_out = StandardScaler()
    inputs = sc_in.fit_transform(inputs)
    outputs = sc_out.fit_transform(outputs)
    # rows where the second input column is 8 or 2 form the test set
    condition = np.isin(Inputs_arr[:, 1], (8, 2))
    inputs_train, inputs_test = inputs[~condition], inputs[condition]
    outputs_train, outputs_test = outputs[~condition], outputs[condition]
    # 3 inputs -> two hidden layers of 10 units -> 1 regularized output
    model = Sequential([
        layers.Dense(units=10, activation='relu', input_dim=3),
        layers.Dense(units=10, activation='relu'),
        layers.Dense(1, kernel_regularizer=l2(0.001), bias_regularizer=l2(0.09)),
    ])
    opt = tensorflow.keras.optimizers.Adam(learning_rate=0.001)
    model.compile(optimizer=opt, loss='mse')
    history = model.fit(inputs_train, outputs_train,
                        validation_data=(inputs_test, outputs_test),
                        epochs=100, batch_size=65)

Currently the R² is 0.87, the training mean squared error is 0.0749, and the test mean squared error is 0.1236. How can the model be improved?
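One way these metrics can be computed from a trained model is sketched below (using sklearn's r2_score and mean_squared_error on the variables from the loop above; not necessarily the exact code behind the numbers quoted):

from sklearn.metrics import mean_squared_error, r2_score

# Sketch: R^2 and MSE for one of the trained models, on the scaled data.
pred_train = model.predict(inputs_train)
pred_test = model.predict(inputs_test)

print('R^2 (test):', r2_score(outputs_test, pred_test))
print('Training MSE:', mean_squared_error(outputs_train, pred_train))
print('Test MSE:', mean_squared_error(outputs_test, pred_test))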

The model is apparently overfitting. Try increasing the learning rate and decreasing the number of units in the hidden layers. Although it already looks quite good as it is, so there may not be much room for improvement.
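A minimal sketch of that suggestion (the unit counts and learning rate here are just example values, not tuned):

from tensorflow.keras import Sequential, layers, optimizers

# Sketch: fewer units per hidden layer and an explicitly larger learning rate.
smaller_model = Sequential([
    layers.Dense(5, activation='relu', input_dim=3),
    layers.Dense(5, activation='relu'),
    layers.Dense(1),
])
smaller_model.compile(optimizer=optimizers.Adam(learning_rate=0.003), loss='mse')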

CrazyElf 2021-11-25 08:48:25

Why does the result change every time? That is, it is the same model, but the result is different on every run.

Abishkozha Amangeldin 2021-11-25 09:41:58

Use cross-validation for training and evaluation. Declare kernel_initializer=keras.initializers.glorot_uniform() (and the same for bias_initializer) in the fully connected layers. Also, batch_size might be too big for your data. A rough sketch is below.
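A sketch of what these suggestions might look like (fold count, seeds, and batch size are example values; inputs/outputs are the scaled arrays from the question's loop):

import numpy as np
from sklearn.model_selection import KFold
from tensorflow import keras
from tensorflow.keras import Sequential, layers

def build_model():
    # explicit glorot_uniform initializers with fixed seeds, which also
    # makes runs repeatable
    glorot = lambda seed: keras.initializers.glorot_uniform(seed=seed)
    return Sequential([
        layers.Dense(10, activation='relu', input_dim=3,
                     kernel_initializer=glorot(1), bias_initializer=glorot(2)),
        layers.Dense(10, activation='relu',
                     kernel_initializer=glorot(3), bias_initializer=glorot(4)),
        layers.Dense(1, kernel_initializer=glorot(5)),
    ])

# 5-fold cross-validation with a smaller batch size
kf = KFold(n_splits=5, shuffle=True, random_state=42)
fold_mse = []
for train_idx, val_idx in kf.split(inputs):
    m = build_model()
    m.compile(optimizer='adam', loss='mse')
    m.fit(inputs[train_idx], outputs[train_idx],
          epochs=100, batch_size=32, verbose=0)
    fold_mse.append(m.evaluate(inputs[val_idx], outputs[val_idx], verbose=0))
print('mean CV MSE:', np.mean(fold_mse))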

Andrew 2021-11-25 12:27:38