[K3BHPD1] - Regression with a Dense Network (DNN)¶
Simple example of a regression with the Boston Housing Prices Dataset (BHPD)
Objectives :¶
- Predict housing prices from a set of house features.
- Understand the principle and the architecture of a regression with a dense neural network
The Boston Housing Prices Dataset contains the prices of houses in various places in Boston.
Alongside the price, the dataset also provides the following information :
- CRIM: This is the per capita crime rate by town
- ZN: This is the proportion of residential land zoned for lots larger than 25,000 sq.ft
- INDUS: This is the proportion of non-retail business acres per town
- CHAS: This is the Charles River dummy variable (this is equal to 1 if tract bounds river; 0 otherwise)
- NOX: This is the nitric oxides concentration (parts per 10 million)
- RM: This is the average number of rooms per dwelling
- AGE: This is the proportion of owner-occupied units built prior to 1940
- DIS: This is the weighted distances to five Boston employment centers
- RAD: This is the index of accessibility to radial highways
- TAX: This is the full-value property-tax rate per 10,000 dollars
- PTRATIO: This is the pupil-teacher ratio by town
- B: This is calculated as 1000(Bk - 0.63)^2, where Bk is the proportion of people of African American descent by town
- LSTAT: This is the percentage lower status of the population
- MEDV: This is the median value of owner-occupied homes in 1000 dollars
What we're going to do :¶
- Retrieve data
- Prepare the data
- Build a model
- Train the model
- Evaluate the result
Step 1 - Import and init¶
You can also adjust TensorFlow's logging verbosity by changing the value of TF_CPP_MIN_LOG_LEVEL (see the snippet after the list) :
- 0 = all messages are logged (default)
- 1 = INFO messages are not printed.
- 2 = INFO and WARNING messages are not printed.
- 3 = INFO, WARNING and ERROR messages are not printed.
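For example, a minimal way to hide INFO and WARNING messages (the variable must be set before TensorFlow is first imported) :

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'   # hide INFO and WARNING messages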
In [1]:
import os
# Select the PyTorch backend for Keras 3 (must be set before keras is imported)
os.environ['KERAS_BACKEND'] = 'torch'
import keras
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import sys
import fidle
# Init Fidle environment
run_id, run_dir, datasets_dir = fidle.init('K3BHPD1')
FIDLE - Environment initialization
Version              : 2.3.2
Run id               : K3BHPD1
Run dir              : ./run/K3BHPD1
Datasets dir         : /lustre/fswork/projects/rech/mlh/uja62cb/fidle-project/datasets-fidle
Start time           : 22/12/24 21:20:59
Hostname             : r3i6n0 (Linux)
Tensorflow log level : Info + Warning + Error (=0)
Update keras cache   : False
Update torch cache   : False
Save figs            : ./run/K3BHPD1/figs (True)
keras                : 3.7.0
numpy                : 2.1.2
sklearn              : 1.5.2
yaml                 : 6.0.2
matplotlib           : 3.9.2
pandas               : 2.2.3
torch                : 2.5.0
Verbosity during training :
- 0 = silent
- 1 = progress bar
- 2 = one line per epoch
In [2]:
fit_verbosity = 1
Override parameters (batch mode) - Just forget this cell
In [3]:
fidle.override('fit_verbosity')
** Overrided parameters : ** fit_verbosity : 2
Step 2 - Retrieve data¶
2.1 - Option 1 : From Keras¶
Boston Housing is a famous historical dataset, so we can get it directly from the Keras datasets module.
In [4]:
# (x_train, y_train), (x_test, y_test) = keras.datasets.boston_housing.load_data(test_split=0.2, seed=113)
2.2 - Option 2 : From a csv file¶
More fun !
In [5]:
data = pd.read_csv(f'{datasets_dir}/BHPD/origine/BostonHousing.csv', header=0)
display(data.head(5).style.format("{0:.2f}").set_caption("Few lines of the dataset :"))
print('Missing Data : ',data.isna().sum().sum(), ' Shape is : ', data.shape)
| | crim | zn | indus | chas | nox | rm | age | dis | rad | tax | ptratio | b | lstat | medv |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 0.01 | 18.00 | 2.31 | 0.00 | 0.54 | 6.58 | 65.20 | 4.09 | 1.00 | 296.00 | 15.30 | 396.90 | 4.98 | 24.00 |
1 | 0.03 | 0.00 | 7.07 | 0.00 | 0.47 | 6.42 | 78.90 | 4.97 | 2.00 | 242.00 | 17.80 | 396.90 | 9.14 | 21.60 |
2 | 0.03 | 0.00 | 7.07 | 0.00 | 0.47 | 7.18 | 61.10 | 4.97 | 2.00 | 242.00 | 17.80 | 392.83 | 4.03 | 34.70 |
3 | 0.03 | 0.00 | 2.18 | 0.00 | 0.46 | 7.00 | 45.80 | 6.06 | 3.00 | 222.00 | 18.70 | 394.63 | 2.94 | 33.40 |
4 | 0.07 | 0.00 | 2.18 | 0.00 | 0.46 | 7.15 | 54.20 | 6.06 | 3.00 | 222.00 | 18.70 | 396.90 | 5.33 | 36.20 |
Missing Data : 0 Shape is : (506, 14)
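With 0 missing values nothing needs to be cleaned here; if there were any, a per-column count would locate them :

data.isna().sum()   # number of missing values for each column (all zeros here)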
Step 3 - Prepare the data¶
3.1 - Split the data¶
In [6]:
# ---- Shuffle and Split => train, test
#
data = data.sample(frac=1., axis=0)
data_train = data.sample(frac=0.7, axis=0)
data_test = data.drop(data_train.index)
# ---- Split => x,y (medv is price)
#
x_train = data_train.drop('medv', axis=1)
y_train = data_train['medv']
x_test = data_test.drop('medv', axis=1)
y_test = data_test['medv']
print('Original data shape was : ',data.shape)
print('x_train : ',x_train.shape, 'y_train : ',y_train.shape)
print('x_test : ',x_test.shape, 'y_test : ',y_test.shape)
Original data shape was : (506, 14) x_train : (354, 13) y_train : (354,) x_test : (152, 13) y_test : (152,)
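Note that pandas' sample() draws a new random split at every run; for a reproducible split, a seed could be passed (a sketch, not used in this notebook) :

data       = data.sample(frac=1.,  axis=0, random_state=42)   # reproducible shuffle
data_train = data.sample(frac=0.7, axis=0, random_state=42)   # reproducible 70% train split
data_test  = data.drop(data_train.index)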
3.2 - Data normalization¶
Note :
- All input data must be normalized, train and test.
- To do this we will subtract the mean and divide by the standard deviation.
- But test data should not be used in any way, even for normalization.
- The mean and the standard deviation will therefore only be calculated with the train data.
In [7]:
display(x_train.describe().style.format("{0:.2f}").set_caption("Before normalization :"))
mean = x_train.mean()
std = x_train.std()
x_train = (x_train - mean) / std
x_test = (x_test - mean) / std
display(x_train.describe().style.format("{0:.2f}").set_caption("After normalization :"))
display(x_train.head(5).style.format("{0:.2f}").set_caption("Few lines of the dataset :"))
x_train, y_train = np.array(x_train), np.array(y_train)
x_test, y_test = np.array(x_test), np.array(y_test)
| | crim | zn | indus | chas | nox | rm | age | dis | rad | tax | ptratio | b | lstat |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
count | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 |
mean | 3.48 | 12.46 | 10.77 | 0.06 | 0.55 | 6.27 | 67.49 | 3.89 | 9.24 | 401.92 | 18.36 | 359.05 | 12.53 |
std | 8.70 | 24.44 | 6.85 | 0.25 | 0.12 | 0.71 | 28.69 | 2.07 | 8.56 | 167.22 | 2.17 | 88.87 | 7.26 |
min | 0.01 | 0.00 | 0.46 | 0.00 | 0.39 | 3.56 | 2.90 | 1.17 | 1.00 | 188.00 | 12.60 | 0.32 | 1.92 |
25% | 0.08 | 0.00 | 4.95 | 0.00 | 0.45 | 5.89 | 42.32 | 2.11 | 4.00 | 277.50 | 16.92 | 376.78 | 6.74 |
50% | 0.22 | 0.00 | 8.14 | 0.00 | 0.53 | 6.21 | 76.25 | 3.46 | 5.00 | 329.00 | 18.80 | 392.05 | 10.59 |
75% | 2.81 | 20.00 | 18.10 | 0.00 | 0.62 | 6.61 | 93.57 | 5.37 | 8.00 | 666.00 | 20.20 | 396.90 | 16.95 |
max | 88.98 | 100.00 | 27.74 | 1.00 | 0.87 | 8.70 | 100.00 | 10.71 | 24.00 | 711.00 | 21.20 | 396.90 | 36.98 |
| | crim | zn | indus | chas | nox | rm | age | dis | rad | tax | ptratio | b | lstat |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
count | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 | 354.00 |
mean | 0.00 | 0.00 | 0.00 | -0.00 | -0.00 | -0.00 | -0.00 | -0.00 | 0.00 | -0.00 | 0.00 | 0.00 | 0.00 |
std | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
min | -0.40 | -0.51 | -1.50 | -0.26 | -1.37 | -3.81 | -2.25 | -1.31 | -0.96 | -1.28 | -2.65 | -4.04 | -1.46 |
25% | -0.39 | -0.51 | -0.85 | -0.26 | -0.89 | -0.54 | -0.88 | -0.86 | -0.61 | -0.74 | -0.66 | 0.20 | -0.80 |
50% | -0.37 | -0.51 | -0.38 | -0.26 | -0.16 | -0.09 | 0.31 | -0.21 | -0.49 | -0.44 | 0.20 | 0.37 | -0.27 |
75% | -0.08 | 0.31 | 1.07 | -0.26 | 0.64 | 0.48 | 0.91 | 0.72 | -0.14 | 1.58 | 0.85 | 0.43 | 0.61 |
max | 9.83 | 3.58 | 2.48 | 3.79 | 2.78 | 3.42 | 1.13 | 3.29 | 1.72 | 1.85 | 1.31 | 0.43 | 3.37 |
| | crim | zn | indus | chas | nox | rm | age | dis | rad | tax | ptratio | b | lstat |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
319 | -0.34 | -0.51 | -0.13 | -0.26 | -0.05 | -0.23 | -0.30 | 0.06 | -0.61 | -0.59 | 0.02 | 0.42 | 0.03 |
255 | -0.40 | 2.76 | -1.04 | -0.26 | -1.37 | -0.56 | -1.69 | 2.57 | -0.96 | -0.52 | -0.90 | 0.41 | -0.45 |
455 | 0.15 | -0.51 | 1.07 | -0.26 | 1.41 | 0.35 | 0.66 | -0.70 | 1.72 | 1.58 | 0.85 | -3.47 | 0.77 |
414 | 4.86 | -0.51 | 1.07 | -0.26 | 1.24 | -2.47 | 1.13 | -1.08 | 1.72 | 1.58 | 0.85 | -3.05 | 3.37 |
86 | -0.39 | -0.51 | -0.92 | -0.26 | -0.88 | -0.36 | -0.78 | 0.26 | -0.73 | -0.93 | 0.07 | 0.42 | 0.05 |
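Equivalently, the normalization above could be done with scikit-learn's StandardScaler, fitted on the training data only (a sketch, assuming the original unnormalized x_train / x_test DataFrames; sklearn is available in this environment). Note that StandardScaler uses the biased standard deviation, so values may differ very slightly from the pandas version :

from sklearn.preprocessing import StandardScaler

scaler  = StandardScaler()                  # mean/std are learned from the train set only
x_train = scaler.fit_transform(x_train)     # fit on train, then transform it (returns a numpy array)
x_test  = scaler.transform(x_test)          # reuse the train statistics on the test set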
Step 4 - Build a model¶
In [8]:
def get_model_v1(shape):
    model = keras.models.Sequential()
    model.add(keras.layers.Input(shape, name="InputLayer"))
    model.add(keras.layers.Dense(32, activation='relu', name='Dense_n1'))
    model.add(keras.layers.Dense(64, activation='relu', name='Dense_n2'))
    model.add(keras.layers.Dense(32, activation='relu', name='Dense_n3'))
    model.add(keras.layers.Dense(1, name='Output'))      # single linear unit : regression output
    model.compile(optimizer = 'adam',
                  loss      = 'mse',                     # regression loss
                  metrics   = ['mae', 'mse'])
    return model
Step 5 - Train the model¶
5.1 - Get it¶
In [9]:
model=get_model_v1( (13,) )
model.summary()
# img=keras.utils.plot_model( model, to_file='./run/model.png', show_shapes=True, show_layer_names=True, dpi=96)
# display(img)
Model: "sequential"
Layer (type)         Output Shape      Param #
Dense_n1 (Dense)     (None, 32)            448
Dense_n2 (Dense)     (None, 64)          2,112
Dense_n3 (Dense)     (None, 32)          2,080
Output (Dense)       (None, 1)              33
Total params: 4,673 (18.25 KB)
Trainable params: 4,673 (18.25 KB)
Non-trainable params: 0 (0.00 B)
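As a sanity check, the parameter counts follow weights + biases for each Dense layer : 13×32 + 32 = 448, 32×64 + 64 = 2,112, 64×32 + 32 = 2,080 and 32×1 + 1 = 33, i.e. 4,673 parameters in total.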
5.2 - Train it¶
In [10]:
history = model.fit(x_train,
y_train,
epochs = 60,
batch_size = 10,
verbose = fit_verbosity,
validation_data = (x_test, y_test))
Epoch 1/60
36/36 - 1s - 33ms/step - loss: 552.6011 - mae: 21.6340 - mse: 552.6011 - val_loss: 459.5090 - val_mae: 19.4499 - val_mse: 459.5090
Epoch 2/60
36/36 - 0s - 8ms/step - loss: 318.6428 - mae: 15.4018 - mse: 318.6428 - val_loss: 137.6176 - val_mae: 9.1210 - val_mse: 137.6176
Epoch 3/60
36/36 - 0s - 8ms/step - loss: 70.6390 - mae: 6.3921 - mse: 70.6390 - val_loss: 58.6480 - val_mae: 5.6008 - val_mse: 58.6480
Epoch 4/60
36/36 - 0s - 8ms/step - loss: 39.1765 - mae: 4.6129 - mse: 39.1765 - val_loss: 41.4944 - val_mae: 4.7192 - val_mse: 41.4944
Epoch 5/60
36/36 - 0s - 8ms/step - loss: 29.0254 - mae: 3.9321 - mse: 29.0254 - val_loss: 33.0362 - val_mae: 4.1499 - val_mse: 33.0362
Epoch 6/60
36/36 - 0s - 8ms/step - loss: 24.0055 - mae: 3.5411 - mse: 24.0055 - val_loss: 28.8925 - val_mae: 3.8650 - val_mse: 28.8925
Epoch 7/60
36/36 - 0s - 8ms/step - loss: 21.0437 - mae: 3.2821 - mse: 21.0437 - val_loss: 26.5158 - val_mae: 3.6375 - val_mse: 26.5158
Epoch 8/60
36/36 - 0s - 8ms/step - loss: 18.8576 - mae: 3.1259 - mse: 18.8576 - val_loss: 24.8119 - val_mae: 3.5243 - val_mse: 24.8119
Epoch 9/60
36/36 - 0s - 8ms/step - loss: 17.1906 - mae: 3.0018 - mse: 17.1906 - val_loss: 23.7083 - val_mae: 3.3951 - val_mse: 23.7083
Epoch 10/60
36/36 - 0s - 8ms/step - loss: 16.3584 - mae: 2.9301 - mse: 16.3584 - val_loss: 23.1440 - val_mae: 3.3218 - val_mse: 23.1440
Epoch 11/60
36/36 - 0s - 8ms/step - loss: 14.8845 - mae: 2.8030 - mse: 14.8845 - val_loss: 22.0320 - val_mae: 3.2148 - val_mse: 22.0320
Epoch 12/60
36/36 - 0s - 8ms/step - loss: 14.1007 - mae: 2.7446 - mse: 14.1007 - val_loss: 21.4347 - val_mae: 3.1709 - val_mse: 21.4347
Epoch 13/60
36/36 - 0s - 8ms/step - loss: 13.5592 - mae: 2.6926 - mse: 13.5592 - val_loss: 20.9801 - val_mae: 3.1183 - val_mse: 20.9801
Epoch 14/60
36/36 - 0s - 8ms/step - loss: 12.8744 - mae: 2.6291 - mse: 12.8744 - val_loss: 20.6138 - val_mae: 3.0929 - val_mse: 20.6138
Epoch 15/60
36/36 - 0s - 8ms/step - loss: 12.5834 - mae: 2.5426 - mse: 12.5834 - val_loss: 20.1067 - val_mae: 3.0289 - val_mse: 20.1067
Epoch 16/60
36/36 - 0s - 8ms/step - loss: 12.2945 - mae: 2.5726 - mse: 12.2945 - val_loss: 20.1485 - val_mae: 2.9954 - val_mse: 20.1485
Epoch 17/60
36/36 - 0s - 8ms/step - loss: 11.6292 - mae: 2.4498 - mse: 11.6292 - val_loss: 19.8272 - val_mae: 3.0280 - val_mse: 19.8272
Epoch 18/60
36/36 - 0s - 8ms/step - loss: 11.7875 - mae: 2.4964 - mse: 11.7875 - val_loss: 19.7747 - val_mae: 2.9893 - val_mse: 19.7747
Epoch 19/60
36/36 - 0s - 8ms/step - loss: 10.8071 - mae: 2.4374 - mse: 10.8071 - val_loss: 19.4899 - val_mae: 2.9394 - val_mse: 19.4899
Epoch 20/60
36/36 - 0s - 8ms/step - loss: 10.5484 - mae: 2.3460 - mse: 10.5484 - val_loss: 19.1290 - val_mae: 2.8710 - val_mse: 19.1290
Epoch 21/60
36/36 - 0s - 8ms/step - loss: 10.2614 - mae: 2.3403 - mse: 10.2614 - val_loss: 19.1074 - val_mae: 2.8635 - val_mse: 19.1074
Epoch 22/60
36/36 - 0s - 8ms/step - loss: 10.0241 - mae: 2.3401 - mse: 10.0241 - val_loss: 19.2033 - val_mae: 2.8503 - val_mse: 19.2033
Epoch 23/60
36/36 - 0s - 8ms/step - loss: 9.9626 - mae: 2.2710 - mse: 9.9626 - val_loss: 18.6036 - val_mae: 2.8252 - val_mse: 18.6036
Epoch 24/60
36/36 - 0s - 8ms/step - loss: 9.6020 - mae: 2.2434 - mse: 9.6020 - val_loss: 18.4747 - val_mae: 2.7770 - val_mse: 18.4747
Epoch 25/60
36/36 - 0s - 8ms/step - loss: 9.5089 - mae: 2.2437 - mse: 9.5089 - val_loss: 18.5592 - val_mae: 2.7752 - val_mse: 18.5592
Epoch 26/60
36/36 - 0s - 8ms/step - loss: 9.1992 - mae: 2.1904 - mse: 9.1992 - val_loss: 18.6239 - val_mae: 2.7469 - val_mse: 18.6239
Epoch 27/60
36/36 - 0s - 8ms/step - loss: 9.0184 - mae: 2.1712 - mse: 9.0184 - val_loss: 18.4458 - val_mae: 2.7831 - val_mse: 18.4458
Epoch 28/60
36/36 - 0s - 8ms/step - loss: 9.2274 - mae: 2.1964 - mse: 9.2274 - val_loss: 18.0883 - val_mae: 2.7530 - val_mse: 18.0883
Epoch 29/60
36/36 - 0s - 8ms/step - loss: 8.8379 - mae: 2.1635 - mse: 8.8379 - val_loss: 18.1416 - val_mae: 2.7474 - val_mse: 18.1416
Epoch 30/60
36/36 - 0s - 8ms/step - loss: 8.6780 - mae: 2.1447 - mse: 8.6780 - val_loss: 18.0183 - val_mae: 2.7820 - val_mse: 18.0183
Epoch 31/60
36/36 - 0s - 8ms/step - loss: 9.1903 - mae: 2.2133 - mse: 9.1903 - val_loss: 18.0481 - val_mae: 2.8027 - val_mse: 18.0481
Epoch 32/60
36/36 - 0s - 8ms/step - loss: 8.4637 - mae: 2.1314 - mse: 8.4637 - val_loss: 17.6470 - val_mae: 2.7634 - val_mse: 17.6470
Epoch 33/60
36/36 - 0s - 8ms/step - loss: 8.5856 - mae: 2.1450 - mse: 8.5856 - val_loss: 17.7953 - val_mae: 2.7016 - val_mse: 17.7953
Epoch 34/60
36/36 - 0s - 8ms/step - loss: 8.1718 - mae: 2.0619 - mse: 8.1718 - val_loss: 17.8786 - val_mae: 2.6658 - val_mse: 17.8786
Epoch 35/60
36/36 - 0s - 8ms/step - loss: 7.8050 - mae: 2.0428 - mse: 7.8050 - val_loss: 17.4320 - val_mae: 2.6921 - val_mse: 17.4320
Epoch 36/60
36/36 - 0s - 8ms/step - loss: 7.7422 - mae: 2.0260 - mse: 7.7422 - val_loss: 17.0421 - val_mae: 2.6534 - val_mse: 17.0421
Epoch 37/60
36/36 - 0s - 8ms/step - loss: 7.7575 - mae: 1.9779 - mse: 7.7575 - val_loss: 17.3632 - val_mae: 2.6220 - val_mse: 17.3632
Epoch 38/60
36/36 - 0s - 8ms/step - loss: 7.6174 - mae: 1.9998 - mse: 7.6174 - val_loss: 16.6116 - val_mae: 2.6319 - val_mse: 16.6116
Epoch 39/60
36/36 - 0s - 8ms/step - loss: 7.5491 - mae: 1.9527 - mse: 7.5491 - val_loss: 17.0154 - val_mae: 2.7230 - val_mse: 17.0154
Epoch 40/60
36/36 - 0s - 8ms/step - loss: 7.6164 - mae: 2.0398 - mse: 7.6164 - val_loss: 17.1923 - val_mae: 2.6671 - val_mse: 17.1923
Epoch 41/60
36/36 - 0s - 8ms/step - loss: 7.3691 - mae: 1.9622 - mse: 7.3691 - val_loss: 16.7006 - val_mae: 2.5751 - val_mse: 16.7006
Epoch 42/60
36/36 - 0s - 8ms/step - loss: 7.0265 - mae: 1.9187 - mse: 7.0265 - val_loss: 16.5408 - val_mae: 2.5706 - val_mse: 16.5408
Epoch 43/60
36/36 - 0s - 8ms/step - loss: 7.0378 - mae: 1.9346 - mse: 7.0378 - val_loss: 16.1454 - val_mae: 2.5522 - val_mse: 16.1454
Epoch 44/60
36/36 - 0s - 8ms/step - loss: 7.0363 - mae: 1.9162 - mse: 7.0363 - val_loss: 16.4976 - val_mae: 2.5639 - val_mse: 16.4976
Epoch 45/60
36/36 - 0s - 8ms/step - loss: 6.7418 - mae: 1.8865 - mse: 6.7418 - val_loss: 16.1442 - val_mae: 2.5754 - val_mse: 16.1442
Epoch 46/60
36/36 - 0s - 8ms/step - loss: 6.6587 - mae: 1.8638 - mse: 6.6587 - val_loss: 16.2308 - val_mae: 2.5458 - val_mse: 16.2308
Epoch 47/60
36/36 - 0s - 8ms/step - loss: 6.8526 - mae: 1.9030 - mse: 6.8526 - val_loss: 15.9731 - val_mae: 2.5638 - val_mse: 15.9731
Epoch 48/60
36/36 - 0s - 8ms/step - loss: 6.7054 - mae: 1.8707 - mse: 6.7054 - val_loss: 16.0701 - val_mae: 2.5394 - val_mse: 16.0701
Epoch 49/60
36/36 - 0s - 8ms/step - loss: 6.4748 - mae: 1.8408 - mse: 6.4748 - val_loss: 15.6880 - val_mae: 2.5163 - val_mse: 15.6880
Epoch 50/60
36/36 - 0s - 8ms/step - loss: 6.4839 - mae: 1.8638 - mse: 6.4839 - val_loss: 16.1445 - val_mae: 2.5782 - val_mse: 16.1445
Epoch 51/60
36/36 - 0s - 8ms/step - loss: 6.1334 - mae: 1.7756 - mse: 6.1334 - val_loss: 15.3155 - val_mae: 2.4964 - val_mse: 15.3155
Epoch 52/60
36/36 - 0s - 8ms/step - loss: 6.0821 - mae: 1.7696 - mse: 6.0821 - val_loss: 15.6474 - val_mae: 2.5117 - val_mse: 15.6474
Epoch 53/60
36/36 - 0s - 8ms/step - loss: 6.0743 - mae: 1.8045 - mse: 6.0743 - val_loss: 14.7500 - val_mae: 2.4915 - val_mse: 14.7500
Epoch 54/60
36/36 - 0s - 8ms/step - loss: 5.7969 - mae: 1.7041 - mse: 5.7969 - val_loss: 15.5576 - val_mae: 2.5976 - val_mse: 15.5576
Epoch 55/60
36/36 - 0s - 8ms/step - loss: 5.9764 - mae: 1.7891 - mse: 5.9764 - val_loss: 15.1105 - val_mae: 2.5220 - val_mse: 15.1105
Epoch 56/60
36/36 - 0s - 8ms/step - loss: 6.0050 - mae: 1.7895 - mse: 6.0050 - val_loss: 15.2569 - val_mae: 2.5354 - val_mse: 15.2569
Epoch 57/60
36/36 - 0s - 8ms/step - loss: 5.6429 - mae: 1.6973 - mse: 5.6429 - val_loss: 14.5583 - val_mae: 2.4487 - val_mse: 14.5583
Epoch 58/60
36/36 - 0s - 8ms/step - loss: 5.5046 - mae: 1.7011 - mse: 5.5046 - val_loss: 15.5672 - val_mae: 2.5913 - val_mse: 15.5672
Epoch 59/60
36/36 - 0s - 8ms/step - loss: 5.8248 - mae: 1.7740 - mse: 5.8248 - val_loss: 14.5579 - val_mae: 2.4877 - val_mse: 14.5579
Epoch 60/60
36/36 - 0s - 8ms/step - loss: 5.9535 - mae: 1.7849 - mse: 5.9535 - val_loss: 14.3759 - val_mae: 2.4764 - val_mse: 14.3759
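The validation loss still fluctuates from epoch to epoch. One common option, not used in this notebook, is an EarlyStopping callback that keeps the best weights; the training call would then look like this (a minimal sketch) :

early_stop = keras.callbacks.EarlyStopping(monitor='val_mae', patience=10,
                                           restore_best_weights=True)   # keep the best val_mae weights
history = model.fit(x_train, y_train,
                    epochs          = 60,
                    batch_size      = 10,
                    verbose         = fit_verbosity,
                    validation_data = (x_test, y_test),
                    callbacks       = [early_stop])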
Step 6 - Evaluate¶
6.1 - Model evaluation¶
In [11]:
score = model.evaluate(x_test, y_test, verbose=0)
print('x_test / loss : {:5.4f}'.format(score[0]))
print('x_test / mae : {:5.4f}'.format(score[1]))
print('x_test / mse : {:5.4f}'.format(score[2]))
x_test / loss : 14.3759 x_test / mae : 2.4764 x_test / mse : 14.3759
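Rather than relying on the order of the returned metrics, evaluate() can also return them as a dictionary (a small sketch) :

score = model.evaluate(x_test, y_test, verbose=0, return_dict=True)
print('x_test / mae : {:5.4f}'.format(score['mae']))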
6.2 - Training history¶
What was the best result during our training ?
In [12]:
df=pd.DataFrame(data=history.history)
display(df)
| | loss | mae | mse | val_loss | val_mae | val_mse |
|---|---|---|---|---|---|---|
0 | 552.601074 | 21.633995 | 552.601074 | 459.509033 | 19.449942 | 459.509033 |
1 | 318.642822 | 15.401767 | 318.642822 | 137.617554 | 9.121042 | 137.617554 |
2 | 70.639000 | 6.392056 | 70.639000 | 58.648014 | 5.600778 | 58.648014 |
3 | 39.176506 | 4.612941 | 39.176506 | 41.494419 | 4.719237 | 41.494419 |
4 | 29.025448 | 3.932127 | 29.025448 | 33.036163 | 4.149935 | 33.036163 |
5 | 24.005482 | 3.541128 | 24.005482 | 28.892485 | 3.864979 | 28.892485 |
6 | 21.043699 | 3.282119 | 21.043699 | 26.515785 | 3.637505 | 26.515785 |
7 | 18.857632 | 3.125867 | 18.857632 | 24.811874 | 3.524302 | 24.811874 |
8 | 17.190617 | 3.001753 | 17.190617 | 23.708338 | 3.395098 | 23.708338 |
9 | 16.358448 | 2.930121 | 16.358448 | 23.143991 | 3.321771 | 23.143991 |
10 | 14.884508 | 2.803027 | 14.884508 | 22.031952 | 3.214803 | 22.031952 |
11 | 14.100672 | 2.744578 | 14.100672 | 21.434681 | 3.170895 | 21.434677 |
12 | 13.559190 | 2.692582 | 13.559190 | 20.980061 | 3.118278 | 20.980061 |
13 | 12.874426 | 2.629063 | 12.874426 | 20.613829 | 3.092899 | 20.613829 |
14 | 12.583358 | 2.542602 | 12.583358 | 20.106726 | 3.028854 | 20.106726 |
15 | 12.294549 | 2.572622 | 12.294549 | 20.148535 | 2.995361 | 20.148535 |
16 | 11.629196 | 2.449756 | 11.629196 | 19.827217 | 3.028043 | 19.827217 |
17 | 11.787493 | 2.496351 | 11.787493 | 19.774734 | 2.989304 | 19.774734 |
18 | 10.807133 | 2.437395 | 10.807133 | 19.489931 | 2.939384 | 19.489931 |
19 | 10.548433 | 2.346042 | 10.548433 | 19.129023 | 2.871042 | 19.129023 |
20 | 10.261354 | 2.340328 | 10.261354 | 19.107397 | 2.863487 | 19.107397 |
21 | 10.024061 | 2.340052 | 10.024061 | 19.203321 | 2.850341 | 19.203321 |
22 | 9.962586 | 2.270966 | 9.962586 | 18.603563 | 2.825216 | 18.603563 |
23 | 9.601974 | 2.243445 | 9.601974 | 18.474682 | 2.777019 | 18.474682 |
24 | 9.508885 | 2.243731 | 9.508885 | 18.559206 | 2.775191 | 18.559206 |
25 | 9.199199 | 2.190363 | 9.199199 | 18.623907 | 2.746933 | 18.623907 |
26 | 9.018448 | 2.171181 | 9.018448 | 18.445766 | 2.783106 | 18.445766 |
27 | 9.227386 | 2.196391 | 9.227386 | 18.088251 | 2.752972 | 18.088251 |
28 | 8.837852 | 2.163522 | 8.837852 | 18.141611 | 2.747433 | 18.141611 |
29 | 8.677971 | 2.144674 | 8.677971 | 18.018318 | 2.781960 | 18.018318 |
30 | 9.190325 | 2.213301 | 9.190325 | 18.048130 | 2.802679 | 18.048130 |
31 | 8.463733 | 2.131410 | 8.463734 | 17.647036 | 2.763420 | 17.647036 |
32 | 8.585588 | 2.144973 | 8.585588 | 17.795256 | 2.701603 | 17.795256 |
33 | 8.171778 | 2.061889 | 8.171778 | 17.878607 | 2.665828 | 17.878607 |
34 | 7.805007 | 2.042782 | 7.805007 | 17.431953 | 2.692094 | 17.431953 |
35 | 7.742220 | 2.025972 | 7.742219 | 17.042139 | 2.653405 | 17.042139 |
36 | 7.757503 | 1.977874 | 7.757503 | 17.363218 | 2.621971 | 17.363218 |
37 | 7.617399 | 1.999812 | 7.617400 | 16.611553 | 2.631893 | 16.611553 |
38 | 7.549075 | 1.952750 | 7.549075 | 17.015360 | 2.723044 | 17.015360 |
39 | 7.616443 | 2.039772 | 7.616443 | 17.192270 | 2.667056 | 17.192270 |
40 | 7.369143 | 1.962158 | 7.369143 | 16.700644 | 2.575090 | 16.700644 |
41 | 7.026539 | 1.918651 | 7.026539 | 16.540834 | 2.570634 | 16.540834 |
42 | 7.037803 | 1.934624 | 7.037803 | 16.145403 | 2.552151 | 16.145403 |
43 | 7.036263 | 1.916229 | 7.036263 | 16.497643 | 2.563887 | 16.497643 |
44 | 6.741823 | 1.886532 | 6.741823 | 16.144201 | 2.575377 | 16.144201 |
45 | 6.658704 | 1.863793 | 6.658704 | 16.230818 | 2.545785 | 16.230818 |
46 | 6.852619 | 1.902994 | 6.852619 | 15.973050 | 2.563817 | 15.973050 |
47 | 6.705412 | 1.870660 | 6.705412 | 16.070127 | 2.539413 | 16.070127 |
48 | 6.474781 | 1.840755 | 6.474780 | 15.688000 | 2.516261 | 15.688000 |
49 | 6.483906 | 1.863812 | 6.483906 | 16.144526 | 2.578224 | 16.144526 |
50 | 6.133446 | 1.775570 | 6.133446 | 15.315453 | 2.496422 | 15.315453 |
51 | 6.082084 | 1.769618 | 6.082084 | 15.647402 | 2.511681 | 15.647402 |
52 | 6.074298 | 1.804548 | 6.074298 | 14.750008 | 2.491462 | 14.750008 |
53 | 5.796865 | 1.704091 | 5.796865 | 15.557580 | 2.597581 | 15.557580 |
54 | 5.976436 | 1.789137 | 5.976436 | 15.110455 | 2.521980 | 15.110455 |
55 | 6.004953 | 1.789515 | 6.004953 | 15.256926 | 2.535437 | 15.256926 |
56 | 5.642893 | 1.697277 | 5.642893 | 14.558274 | 2.448738 | 14.558274 |
57 | 5.504644 | 1.701081 | 5.504644 | 15.567157 | 2.591281 | 15.567157 |
58 | 5.824846 | 1.774044 | 5.824846 | 14.557885 | 2.487737 | 14.557885 |
59 | 5.953528 | 1.784931 | 5.953528 | 14.375948 | 2.476415 | 14.375948 |
In [13]:
print("min( val_mae ) : {:.4f}".format( min(history.history["val_mae"]) ) )
min( val_mae ) : 2.4487
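To know at which epoch this best value was reached, np.argmin can be applied to the same list (a small sketch) :

best_epoch = int(np.argmin(history.history['val_mae']))          # 0-based index of the best epoch
print(f"Best val_mae = {history.history['val_mae'][best_epoch]:.4f} at epoch {best_epoch + 1}")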
In [14]:
fidle.scrawler.history( history, plot={'MSE' :['mse', 'val_mse'],
'MAE' :['mae', 'val_mae'],
'LOSS':['loss','val_loss']}, save_as='01-history')
Saved: ./run/K3BHPD1/figs/01-history_0
Saved: ./run/K3BHPD1/figs/01-history_1
Saved: ./run/K3BHPD1/figs/01-history_2
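fidle.scrawler.history is a plotting helper from the Fidle toolbox; without it, the same curves can be drawn with plain matplotlib, for example for the MAE (a minimal sketch) :

plt.figure(figsize=(8, 5))
plt.plot(history.history['mae'],     label='train mae')
plt.plot(history.history['val_mae'], label='val mae')
plt.xlabel('Epoch')
plt.ylabel('MAE')
plt.legend()
plt.show()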
Step 7 - Make a prediction¶
The input data must be normalized with the same parameters (mean, std) that were used for the training data, as sketched below.
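The my_data values below are already normalized. Starting from raw feature values in the original units, one would reuse the mean and std computed on the training set, for example (a sketch with made-up raw values, shown only to illustrate the transformation) :

raw_sample = pd.Series({'crim': 14.5, 'zn': 0.0, 'indus': 18.1, 'chas': 0.0, 'nox': 0.72,
                        'rm': 4.7, 'age': 98.0, 'dis': 1.7, 'rad': 24.0, 'tax': 666.0,
                        'ptratio': 20.2, 'b': 372.0, 'lstat': 32.6})   # made-up raw values
raw_sample = raw_sample[mean.index]                                    # same column order as training data
my_data    = ((raw_sample - mean) / std).to_numpy().reshape(1, 13)     # normalize with train mean/std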
In [15]:
my_data = [ 1.26425925, -0.48522739, 1.0436489 , -0.23112788, 1.37120745,
-2.14308942, 1.13489104, -1.06802005, 1.71189006, 1.57042287,
0.77859951, 0.14769795, 2.7585581 ]
real_price = 10.4
my_data=np.array(my_data).reshape(1,13)
In [16]:
predictions = model.predict( my_data, verbose=fit_verbosity )
print("Prediction : {:.2f} K$".format(predictions[0][0]))
print("Reality : {:.2f} K$".format(real_price))
1/1 - 0s - 4ms/step
Prediction : 10.39 K$ Reality : 10.40 K$
In [17]:
fidle.end()
End time : 22/12/24 21:21:19
Duration : 00:00:20 215ms
This notebook ends here :-)
https://fidle.cnrs.fr