I am trying to train a CNN in MATLAB to predict the mean value of a random vector. To clarify: I generate a random vector with 10 components (using the rand function) 500 times, plot each vector against 1:10, and save each figure separately. The mean value of each of the 500 vectors is also calculated and saved. The saved images are then used as the input (X) for training (70%), validation (15%), and testing (15%) of a CNN that is supposed to predict the mean value of the corresponding random vector (Y). However, the RMSE of the model remains far too high; in other words, the network does not learn, no matter how I change its options and parameters. Any advice?
clear; clc; close all
rng(1)
mkdir('Figures')

% Problem size and train/validation/test split
Num_Sample=500;
N=10;
X=1:N;
Percent_Train=70;
Percent_Val=15;
Percent_Test=100-(Percent_Train+Percent_Val);
Num_Train=floor(Percent_Train/100*Num_Sample);
Num_Val=floor(Percent_Val/100*Num_Sample);
Num_Test=Num_Sample-(Num_Train+Num_Val);
Rand_Ind=randperm(Num_Sample);
Rand_Ind_Train=Rand_Ind(1,1:Num_Train);
Rand_Ind_Val=Rand_Ind(1,1+Num_Train:Num_Train+Num_Val);
Rand_Ind_Test=Rand_Ind(1,1+Num_Train+Num_Val:end);

% Generate the random vectors, save each plot as a JPG, and keep a grayscale
% double copy of every image plus the target mean values
for i=1:Num_Sample
    Y0{i}=2*rand(1,N);
    Mean_Y(1,i)=mean(Y0{i});
    Fig_i=figure(i);
    plot(X,Y0{i},'o','linewidth',2);
    xlim([1,N])
    ylim([0,2])
    Y0{i}=[];                                 % free memory ("clear Y0{i}" is not valid syntax)
    saveas(Fig_i,['Figures/Fig_' num2str(i) '.jpg']);
    Fig_JPG=imread(['Figures/Fig_' num2str(i) '.jpg']);
    %Fig_JPG_RS=imresize(Fig_JPG,0.5);
    Fig_JPG_Gray=rgb2gray(Fig_JPG);
    Fig_JPG_Gray_Double{i}=im2double(Fig_JPG_Gray);
    close all
    disp(i)                                   % progress indicator
end
[R_Fig_JPG_Gray,C_Fig_JPG_Gray]=size(Fig_JPG_Gray);

% Assemble the 4-D image arrays expected by trainNetwork (H x W x 1 x N)
X_Train_0=[];
for i=1:length(Rand_Ind_Train)
    X_Train_0=[X_Train_0 Fig_JPG_Gray_Double{Rand_Ind_Train(1,i)}];
    Fig_JPG_Gray_Double{Rand_Ind_Train(1,i)}=[];   % free memory
end
X_Train=reshape(X_Train_0,R_Fig_JPG_Gray,C_Fig_JPG_Gray,1,length(Rand_Ind_Train));

X_Val_0=[];
for i=1:length(Rand_Ind_Val)
    X_Val_0=[X_Val_0 Fig_JPG_Gray_Double{Rand_Ind_Val(1,i)}];
    Fig_JPG_Gray_Double{Rand_Ind_Val(1,i)}=[];     % free memory
end
X_Val=reshape(X_Val_0,R_Fig_JPG_Gray,C_Fig_JPG_Gray,1,length(Rand_Ind_Val));

X_Test_0=[];
for i=1:length(Rand_Ind_Test)
    X_Test_0=[X_Test_0 Fig_JPG_Gray_Double{Rand_Ind_Test(1,i)}];
    Fig_JPG_Gray_Double{Rand_Ind_Test(1,i)}=[];    % free memory
end
X_Test=reshape(X_Test_0,R_Fig_JPG_Gray,C_Fig_JPG_Gray,1,length(Rand_Ind_Test));

% Regression targets: the mean value of each vector
Y_Train=Mean_Y(1,Rand_Ind_Train)';
Y_Val=Mean_Y(1,Rand_Ind_Val)';
Y_Test=Mean_Y(1,Rand_Ind_Test)';

% Distribution of the targets
figure(1)
histogram(Mean_Y,N)
axis tight
ylabel('Counts')
xlabel('Mean_Y')

% Creating the network layers
layers=[
    imageInputLayer([R_Fig_JPG_Gray C_Fig_JPG_Gray 1])
    convolution2dLayer(5,12,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    %averagePooling2dLayer(2,'Stride',2)
    %convolution2dLayer(3,32,'Padding','same')
    %batchNormalizationLayer
    %reluLayer
    %convolution2dLayer(3,32,'Padding','same')
    %batchNormalizationLayer
    %reluLayer
    dropoutLayer(0.2)
    fullyConnectedLayer(1)
    regressionLayer];

% Training options
miniBatchSize=60;
validationFrequency=3; %floor(numel(Y_Train)/miniBatchSize);
options=trainingOptions('sgdm', ...
    'MiniBatchSize',miniBatchSize, ...
    'MaxEpochs',6, ...
    'InitialLearnRate',1e-3, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropFactor',0.1, ...
    'LearnRateDropPeriod',2, ...
    'Shuffle','every-epoch', ...
    'ValidationData',{X_Val,Y_Val}, ...
    'ValidationFrequency',validationFrequency, ...
    'Plots','training-progress', ...
    'Verbose',false);

% Training the network
Net=trainNetwork(X_Train,Y_Train,layers,options);

% Testing the network
Y_Sim_Test=predict(Net,X_Test);
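For reference, this is roughly how I look at the error after training finishes. It is only a small sketch using the variables already defined in the script above (Y_Sim_Test, Y_Test, Y_Train); nothing new is introduced. A useful yardstick is the constant predictor that always outputs mean(Y_Train): for means of 10 values drawn from 2*rand, its RMSE is about std(Mean_Y) ≈ 0.18, so the CNN would need to get clearly below that to be doing anything useful.

% Sketch: put the RMSE in context, using Y_Sim_Test, Y_Test and Y_Train from above
RMSE_Test=sqrt(mean((double(Y_Sim_Test)-Y_Test).^2));   % CNN test RMSE

% Constant baseline: always predict the mean of the training targets
RMSE_Baseline=sqrt(mean((mean(Y_Train)-Y_Test).^2));

fprintf('CNN test RMSE:      %.4f\n',RMSE_Test)
fprintf('Baseline test RMSE: %.4f\n',RMSE_Baseline)

% Predicted vs. true means (a perfect model would lie on the dashed line)
figure
plot(Y_Test,double(Y_Sim_Test),'o'); hold on
plot([0 2],[0 2],'k--')
xlabel('True mean'); ylabel('Predicted mean')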