I am trying to train a very simple neural network and tune it to find the optimal number of units in the hidden layer. When there is no normalization layer, my code works fine, but when I add the normalizer and the normalization layer after the input tensor in the sequential model definition, the following error pops up:
ValueError: Received incompatible tensor with shape (1,) when attempting to restore variable with shape (10,) and name layer_with_weights-1/bias/.ATTRIBUTES/VARIABLE_VALUE.
Find my code below:
import numpy as np
# NN training
from sklearn.model_selection import train_test_split
import tensorflow.keras as tf
import keras_tuner
import matplotlib.pyplot as plt


class neural_network:
    # Define attributes that are unique to each instance of the class
    def __init__(self, X, y, epochs):
        self.model = tf.Sequential()
        self.history = {}  # History object
        self.X_all = X
        self.y_all = y
        self.epochs = epochs
        self.split_data(self.X_all, self.y_all)  # splits into X_tr/y_tr, X_val/y_val (defined elsewhere)
        self.units = []
        self.weights = {}
        self.bias = {}

    def build_model(self, units, epochs):
        '''Define model architecture'''
        normalizer = tf.layers.Normalization()
        normalizer.adapt(self.X_all)
        model = tf.Sequential()
        model.add(tf.Input(shape=(self.X_tr.shape[1],)))  # Input tensor definition (not a layer!)
        model.add(normalizer)
        model.add(tf.layers.Dense(units, activation='relu'))  # Hidden layer
        model.add(tf.layers.Dense(units=self.y_tr.shape[1], activation='linear'))  # Output layer (no ReLU applied here, only a weighted sum, as per the Neur2SP paper. ASK WHY)
        model.compile(loss='mean_squared_error', optimizer='adam', metrics=['mse'])
        model.summary()
        # model.fit(self.X_tr, self.y_tr, epochs=self.epochs, verbose=0)  # verbose: show progress bar or not
        # How to choose metrics: https://machinelearningmastery.com/custom-metrics-deep-learning-keras-python/
        return model

    def build_hypermodel(self, hp):
        '''Define the search space for tuning the hyperparameters'''
        # hp = keras_tuner.HyperParameters() is not needed: keras_tuner.RandomSearch creates it internally and passes it to the function given as the hypermodel argument
        units = hp.Int("units", min_value=5, max_value=50, step=5)
        model = self.build_model(units=units, epochs=self.epochs)
        return model

    def tune_model(self):
        '''Tune the model using a random-search strategy'''
        tuner = keras_tuner.RandomSearch(
            hypermodel=self.build_hypermodel,  # model-building function
            objective="val_mse",  # objective to optimize (i.e., metric to be minimized)
        )
        # Add a callback to choose the optimal number of epochs
        # https://www.geeksforgeeks.org/choose-optimal-number-of-epochs-to-train-a-neural-network-in-keras/
        earlystopping = tf.callbacks.EarlyStopping(
            monitor="val_mse",
            mode="min",
            patience=2,  # stop training as soon as the validation metric has not improved for 2 epochs
            restore_best_weights=True)
        # tuner.search_space_summary()  # print a summary of the search space
        tuner.search(self.X_tr, self.y_tr, epochs=self.epochs, verbose=0,  # epochs must be passed by keyword; the third positional argument of fit is batch_size
                     validation_data=(self.X_val, self.y_val),
                     callbacks=[earlystopping])  # search takes the same arguments as the fit method
        tuner.results_summary()
        # Get the best model
        best_model = tuner.get_best_models(num_models=1)  # returns a list; best_model[0] is an instance of the keras Sequential class
        best_model[0].summary()
        # Get the best hyperparameters
        best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
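For reference, `hp.Int("units", min_value=5, max_value=50, step=5)` draws from a fixed grid of candidate values (inclusive of the endpoints when the range divides evenly by the step, as it does here); a plain-Python sketch of that grid:

```python
# Candidate values sampled by hp.Int("units", min_value=5, max_value=50, step=5)
min_value, max_value, step = 5, 50, 5
candidates = list(range(min_value, max_value + 1, step))
print(candidates)  # [5, 10, 15, 20, 25, 30, 35, 40, 45, 50]
```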
To help with reproducibility, the shape of the input variables is (5000,7) and the output shape is (5000,1), where 5000 is the total number of samples.
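For context on what the layer in question does: `Normalization.adapt` computes a per-feature mean and variance over the data and stores them as (non-trainable) layer weights, then standardizes inputs at call time. A NumPy-only sketch of roughly that transform, using random data with the shapes above (the epsilon value is an assumption; Keras uses a small guard against zero variance):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 7))  # same shape as the training inputs above

# What Normalization.adapt stores: per-feature mean and variance
mean = X.mean(axis=0)
var = X.var(axis=0)

# What the layer applies at call time (epsilon guards against zero variance)
X_norm = (X - mean) / np.sqrt(var + 1e-7)

print(X_norm.mean(axis=0).round(4))  # ~0 for every feature
print(X_norm.std(axis=0).round(4))   # ~1 for every feature
```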
I am new to keras so any guidance on what might be triggering that error would be highly welcome.