Applied Data Science Notebook in Python for Beginners to Professionals

Data Science Project – A Guide to build simple Classification and Regression models with Keras in Python

Machine Learning for Beginners - A Guide to build simple Classification and Regression models with Keras in Python

For more projects visit: https://setscholars.net

  • There are 5000+ free end-to-end applied machine learning and data science projects available to download at SETScholars. SETScholars is a Science, Engineering and Technology Scholars community.
In [1]:
# Suppress warnings in Jupyter Notebooks
import warnings
warnings.filterwarnings("ignore")

import matplotlib.pyplot as plt
plt.style.use('fivethirtyeight')

In this notebook, we will learn how to build simple Classification and Regression models with Keras in Python.

Python Code

Create a simulated dataset for Classification

In [2]:
# create a simulated dataset for a classification model
from sklearn.datasets import make_blobs

# generate a binary classification dataset with 10 features
X, y = make_blobs(n_samples=1000, centers=2, n_features=10, random_state=412)

print(); print(X)
print(); print(y)
[[ -1.45467043  -2.64378719   7.33338001 ...  -7.41021183   6.21133413
   -6.95204558]
 [ 10.1951416    7.18718902  -8.66784207 ...  -5.40155119   2.09315293
   -0.06713891]
 [  8.3174425    7.91334626  -9.15008385 ...  -5.65672248   0.44811703
    0.3215537 ]
 ...
 [  9.54934726   6.52541405 -10.26434761 ...  -5.25128744   2.8351045
    0.40673994]
 [  1.30420552  -2.60730524   5.29214819 ...  -5.54417029   8.15774727
   -8.9464098 ]
 [  0.39540319  -1.11136124   5.30736581 ...  -6.42616915   8.14182449
   -7.61577769]]

[1 0 0 0 1 0 1 0 1 0 1 1 0 1 0 0 1 1 0 0 1 0 0 1 0 1 0 1 1 1 1 1 0 1 1 0 0
 0 1 0 1 1 1 0 1 0 0 1 0 1 1 0 0 0 0 0 1 1 1 1 1 0 1 0 1 0 0 0 0 0 0 1 1 0
 0 0 1 0 1 0 0 0 0 1 1 1 0 0 0 1 0 0 0 1 1 0 1 0 1 1 0 0 1 1 1 1 0 1 0 0 1
 1 1 0 1 1 1 1 1 1 0 0 1 0 1 0 0 1 0 1 0 1 1 0 1 1 0 1 1 1 1 1 0 1 1 0 1 1
 1 1 1 1 1 0 1 0 1 0 0 0 0 1 1 0 1 0 1 1 0 0 0 0 0 0 0 0 0 1 1 0 0 0 1 1 1
 1 1 1 0 1 0 0 0 0 1 0 1 1 1 0 1 1 0 0 0 1 1 0 0 1 0 0 0 0 0 0 1 0 1 1 1 0
 1 1 0 0 1 0 1 1 1 1 1 0 0 1 1 1 1 1 1 0 0 1 1 0 0 1 1 1 1 1 1 1 0 0 0 1 1
 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 1 1 0 1 1 1 1 0 0 0 1 1 1 1 0 1 0 0
 0 1 1 0 1 1 1 0 1 1 0 1 0 0 1 1 0 0 1 1 1 0 0 1 0 0 1 1 1 1 1 1 1 0 0 1 1
 1 0 1 1 0 1 0 1 0 0 0 0 0 0 1 0 1 0 0 1 1 0 1 1 1 0 0 0 0 0 0 0 1 1 1 0 1
 0 0 0 0 1 0 1 0 0 0 0 0 1 0 1 0 0 1 0 1 0 1 1 0 0 0 0 1 0 0 0 1 1 0 0 0 1
 1 1 1 1 1 1 1 0 1 1 1 1 0 1 1 0 0 0 0 1 0 0 0 0 0 0 1 0 1 0 1 1 1 1 0 0 1
 0 1 0 0 0 1 1 1 0 1 1 0 1 1 1 1 1 0 1 0 0 0 0 0 0 1 0 0 1 0 1 0 1 1 1 1 0
 0 0 0 1 1 0 1 1 0 1 1 1 0 0 1 1 1 0 0 0 1 1 1 0 1 0 1 0 1 1 0 1 1 0 0 0 0
 1 1 1 0 1 0 0 1 0 1 0 0 0 1 1 1 0 1 0 0 1 0 1 0 0 0 0 1 1 1 0 1 1 1 0 1 1
 1 0 1 1 1 0 0 1 1 0 0 1 0 1 0 0 0 1 1 0 1 1 1 1 0 0 1 1 1 0 0 0 1 1 1 1 0
 1 1 0 0 0 1 1 0 1 1 0 1 0 0 1 0 1 0 1 1 0 0 1 0 0 0 0 0 1 1 1 1 1 0 1 0 0
 0 0 0 0 0 0 0 1 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 1 1 0 1 1 1 0
 0 1 0 0 0 0 0 1 0 1 0 1 1 1 0 0 1 0 1 0 0 1 0 0 0 1 0 0 0 0 1 1 1 0 0 1 0
 1 0 0 1 1 1 1 1 1 1 1 0 1 0 0 0 0 1 1 1 1 0 1 1 1 0 1 0 0 0 1 0 0 1 1 1 0
 0 1 0 1 1 1 0 1 0 0 1 0 0 0 1 0 1 1 1 1 1 1 1 0 1 0 0 0 0 1 1 1 1 0 0 0 0
 0 0 0 1 0 0 1 1 1 1 1 0 1 0 0 0 0 0 0 0 0 1 1 1 0 1 1 1 0 1 1 1 1 1 1 0 1
 1 1 1 1 1 1 0 1 0 1 1 1 1 1 0 1 1 1 0 1 0 0 1 1 1 1 0 0 1 0 1 1 0 0 1 0 1
 1 1 1 1 0 0 0 0 0 1 1 0 0 1 1 1 0 1 0 0 0 1 1 1 1 0 1 1 1 1 1 0 1 1 0 0 1
 1 0 0 0 1 0 0 1 0 1 0 0 0 0 1 1 0 0 1 1 1 1 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0
 0 0 0 0 0 1 1 0 1 0 1 0 1 1 1 0 0 0 1 0 0 0 1 0 0 1 1 1 1 1 0 0 1 1 0 0 1
 0 1 1 0 0 0 0 1 1 0 0 0 0 1 0 1 0 1 0 1 1 0 0 1 0 1 0 0 1 0 0 1 1 0 1 0 1
 1]
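
A quick scatter plot of the first two features, coloured by class label, is a handy sanity check that the two simulated blobs are well separated. This is a minimal sketch reusing the matplotlib import from the first cell; the choice of features 0 and 1 is arbitrary, since make_blobs was asked for 10 features here.

In [ ]:
# Visual sanity check: plot the first two of the ten features, coloured by class.
plt.figure(figsize=(6, 4))
plt.scatter(X[:, 0], X[:, 1], c=y, cmap='coolwarm', s=10)
plt.xlabel('feature 0')
plt.ylabel('feature 1')
plt.title('Simulated classification data (2 centers)')
plt.show()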

Classification model in Keras

In [3]:
import warnings
warnings.filterwarnings("ignore")

from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_blobs
from sklearn.preprocessing import MinMaxScaler

# generate 2d classification dataset
X, y = make_blobs(n_samples=1000, centers=2, n_features=2, random_state=412)
scalar = MinMaxScaler()
scalar.fit(X)
X = scalar.transform(X)

# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')

model.fit(X, y, epochs=500, verbose=0)

# new instances where we do not know the answer
Xnew, _ = make_blobs(n_samples=10, centers=2, n_features=2, random_state=412)
Xnew = scalar.transform(Xnew)

# make a prediction (predict_classes is available in the older standalone Keras used here;
# see the note after the probability output below for newer tf.keras versions)
ynew = model.predict_classes(Xnew)

# show the inputs and predicted outputs
print()
for i in range(len(Xnew)):
    print()
    print("X=%s, Predicted=%s" % (Xnew[i], ynew[i]))
Using TensorFlow backend.

X=[0.10563029 0.08102864], Predicted=[1]

X=[0.84304341 0.82219769], Predicted=[0]

X=[0.10818261 0.03689724], Predicted=[1]

X=[0.16482511 0.20884075], Predicted=[1]

X=[0.91470615 0.83097573], Predicted=[0]

X=[0.88386598 0.65311665], Predicted=[0]

X=[0.18274449 0.21237837], Predicted=[1]

X=[0.87845859 0.97915588], Predicted=[0]

X=[0.78587887 0.68250739], Predicted=[0]

X=[0.10331569 0.31785207], Predicted=[1]
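
To get a rough sense of how well the network has fit, the predicted labels for the (scaled) training data can be compared with the true labels using scikit-learn's accuracy_score. This is a minimal sketch, assuming the trained model and the scaled X, y from the cell above are still in scope.

In [ ]:
from sklearn.metrics import accuracy_score

# Compare predicted labels on the scaled training data against the true labels.
y_train_pred = model.predict_classes(X).ravel()
print("Training accuracy: %.3f" % accuracy_score(y, y_train_pred))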
In [4]:
ynew = model.predict_proba(Xnew)
ynew
Out[4]:
array([[9.9999988e-01],
       [0.0000000e+00],
       [9.9999988e-01],
       [9.9999982e-01],
       [0.0000000e+00],
       [1.1920929e-07],
       [9.9999970e-01],
       [0.0000000e+00],
       [1.6865806e-06],
       [9.9999988e-01]], dtype=float32)
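
Note that predict_classes and predict_proba are only provided by the older standalone Keras used in this notebook; they were removed from tf.keras Sequential models in later releases. On newer versions, the equivalent is to call predict and threshold the sigmoid output. A minimal sketch, assuming the same trained model and scaled Xnew:

In [ ]:
# Equivalent prediction on newer tf.keras versions:
# predict() returns the sigmoid output, i.e. the probability of class 1,
# so thresholding at 0.5 recovers the hard class labels.
yprob = model.predict(Xnew)
yclass = (yprob > 0.5).astype("int32")
print(yclass.ravel())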

Create a simulated dataset for Regression

In [5]:
from sklearn.datasets import make_regression

# generate regression dataset
X, y = make_regression(n_samples=100, n_features=2, noise=0.1, random_state=412)

print(); print(X)
print(); print(y)
[[ 1.3046454  -1.11359938]
 [ 0.09711123 -1.36595059]
 [-1.3219651  -0.21265457]
 [-1.45761077 -0.22657123]
 [ 1.81318954 -1.05019921]
 [ 0.32013868  0.02477864]
 [ 1.65023068  1.04969327]
 [-0.66767978  0.26138894]
 [-1.4120531  -0.98088897]
 [ 0.41759846  0.63891759]
 [-0.57846357  0.08380797]
 [ 0.97257761  0.38414575]
 [ 0.24921048  1.79920158]
 [-0.54893776 -1.72363685]
 [ 0.27124985  1.1754071 ]
 [ 1.34147817 -1.2066923 ]
 [-1.56562222  0.62686989]
 [-0.39410434 -0.26683937]
 [-1.53289573  1.61872462]
 [-1.25829698 -1.18299843]
 [ 0.14065678  0.74702183]
 [ 1.46226468 -0.65578988]
 [-0.95861587 -0.24263857]
 [-0.58041399  1.57846257]
 [ 0.91176968  0.0537408 ]
 [-0.53498294 -0.23603517]
 [-0.80290544 -1.10432497]
 [ 0.19485285  0.79042122]
 [-0.24634308 -0.83507259]
 [-0.16845563  0.72732175]
 [ 1.45772002  0.23658891]
 [-0.31598284 -0.45226836]
 [-1.0312321   1.0382546 ]
 [ 0.66662214 -0.30297025]
 [-0.25246294  0.50065509]
 [-1.82018155 -0.63852667]
 [-0.16182195 -0.63402045]
 [-0.77221866 -0.18942929]
 [-1.45590405 -0.89375095]
 [-0.58097039 -0.16043082]
 [-0.93497157 -1.44946557]
 [-0.32257084  0.27635871]
 [ 0.75960342  1.59747586]
 [ 0.58279866 -0.10393255]
 [ 0.67685281  0.39888536]
 [-0.89398195 -0.04474103]
 [-0.47822172  0.43488564]
 [ 1.7450647  -1.23220322]
 [-0.92058326 -0.21275815]
 [-0.62223991  1.15400551]
 [ 2.54303698 -0.18338027]
 [-2.65516162 -0.18768744]
 [ 0.50281538  1.28030285]
 [-0.04528367  0.47712735]
 [ 0.95173305  0.55001554]
 [-1.75093176 -0.52786403]
 [ 0.79107409  0.67521706]
 [ 0.35725416  0.0337816 ]
 [-1.13225557 -1.80298652]
 [ 0.560494    0.02005529]
 [-1.9755922  -1.59821599]
 [ 0.39446896  0.38234244]
 [ 0.60056399  0.83655442]
 [ 0.68728969 -1.02226085]
 [-1.24953692 -0.11148453]
 [-2.13338745 -0.24814415]
 [ 0.5156659  -0.56032058]
 [-1.25902086  0.08035959]
 [ 0.47844349 -0.10487372]
 [ 0.50328313 -0.15207217]
 [-0.13544111 -0.74215744]
 [ 0.04779735 -0.68953716]
 [-0.96355694 -0.1816465 ]
 [-0.9642961  -2.37631923]
 [ 0.56599441 -0.24101222]
 [ 0.83413026  0.44555735]
 [-0.44933716  0.08092915]
 [-0.29488064  0.12178625]
 [-1.31178855 -0.0552951 ]
 [ 0.03730687  0.45767185]
 [ 0.40679854  0.71098122]
 [-1.2438099  -0.99878941]
 [-0.23117295 -1.61342205]
 [-0.8274801  -0.59912031]
 [ 0.78686427  0.17574324]
 [-1.02156425  0.62747756]
 [ 0.11997635  0.00313723]
 [-0.31742367 -0.27505442]
 [ 0.18106723  0.53802392]
 [ 0.42427475 -0.71642499]
 [-0.04844936 -0.97029406]
 [ 0.80117299  0.63993028]
 [ 0.6383753  -0.19343477]
 [ 1.43493755  0.46236855]
 [-0.43394907  0.58202603]
 [-0.32493369 -0.16231846]
 [ 0.03464479  1.71618776]
 [ 0.10223729  0.48989716]
 [ 0.76608641  0.97984068]
 [-0.3293799  -1.55766312]]

[ 2.51845693e+01 -1.03408293e+02 -1.35374719e+02 -1.48811838e+02
  7.58017310e+01  3.06313474e+01  2.33441535e+02 -3.80550349e+01
 -2.06552014e+02  8.96904091e+01 -4.46124243e+01  1.18181954e+02
  1.69953063e+02 -1.90628377e+02  1.20770208e+02  2.07937345e+01
 -8.82308798e+01 -5.71231332e+01 -4.09304301e+00 -2.09484280e+02
  7.37688192e+01  7.67213096e+01 -1.05659631e+02  7.80321510e+01
  8.56620657e+01 -6.70875456e+01 -1.62293911e+02  8.22640947e+01
 -9.05507895e+01  4.47364253e+01  1.49577926e+02 -6.53009970e+01
 -6.88830731e+00  3.46122205e+01  1.86461994e+01 -2.15125151e+02
 -6.65357100e+01 -8.45875911e+01 -2.03347551e+02 -6.50119329e+01
 -2.02373440e+02 -6.15962825e+00  1.98938190e+02  4.34565029e+01
  9.30821358e+01 -8.35226439e+01 -7.04823364e+00  5.46170217e+01
 -9.97246493e+01  3.91802152e+01  2.11834508e+02 -2.52428061e+02
  1.49969457e+02  3.51255790e+01  1.30390343e+02 -1.99545092e+02
  1.26045943e+02  3.45677335e+01 -2.49022704e+02  5.16948979e+01
 -3.07389360e+02  6.65910103e+01  1.22333406e+02 -2.25600992e+01
 -1.20701884e+02 -2.10765342e+02 -3.74804945e-02 -1.05690069e+02
  3.41351132e+01  3.26045605e+01 -7.30960082e+01 -5.23396076e+01
 -1.00856704e+02 -2.81158418e+02  3.06455502e+01  1.10988469e+02
 -3.34623909e+01 -1.63538333e+01 -1.21648517e+02  4.08605655e+01
  9.47816150e+01 -1.92999130e+02 -1.53185975e+02 -1.22964903e+02
  8.45074235e+01 -3.96997946e+01  1.08451306e+01 -5.09194765e+01
  6.03823736e+01 -2.11245794e+01 -8.40818129e+01  1.23991959e+02
  4.11680894e+01  1.66245750e+02  9.16806615e+00 -4.22604785e+01
  1.43894279e+02  4.91869717e+01  1.48859879e+02 -1.57431951e+02]
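
As with the classification data, a quick plot helps confirm that the simulated targets vary smoothly with each feature. A minimal sketch, assuming X and y from the cell above:

In [ ]:
# Visual check of the simulated regression data:
# each of the two features plotted against the target.
fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for j, ax in enumerate(axes):
    ax.scatter(X[:, j], y, s=10)
    ax.set_xlabel('feature %d' % j)
    ax.set_ylabel('target y')
plt.tight_layout()
plt.show()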

Regression model in Keras

In [6]:
import warnings
warnings.filterwarnings("ignore")

from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_regression
from sklearn.preprocessing import MinMaxScaler

# generate regression dataset
X, y = make_regression(n_samples=100, n_features=2, noise=0.1, random_state=1)
scalarX, scalarY = MinMaxScaler(), MinMaxScaler()
scalarX.fit(X)
scalarY.fit(y.reshape(100,1))
X = scalarX.transform(X)
y = scalarY.transform(y.reshape(100,1))

# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='linear'))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=1000, verbose=0)

# new instances where we do not know the answer
Xnew, _ = make_regression(n_samples=10, n_features=2, noise=0.1, random_state=1)
Xnew = scalarX.transform(Xnew)

# make a prediction
ynew = model.predict(Xnew)

# show the inputs and predicted outputs
for i in range(len(Xnew)):
    print()
    print("X=%s, Predicted=%s" % (Xnew[i], ynew[i]))
X=[0.53594751 0.55645826], Predicted=[0.54876053]

X=[0.48950648 0.22345069], Predicted=[0.17003132]

X=[0.41253599 0.17896904], Predicted=[0.08943562]

X=[0.59584285 0.36673477], Predicted=[0.36960426]

X=[ 0.71405793 -0.10112306], Predicted=[-0.0796763]

X=[0.77212207 0.17283115], Predicted=[0.23529007]

X=[0.87826558 0.28411721], Predicted=[0.40028194]

X=[0.45705409 0.33602922], Predicted=[0.27764916]

X=[0.9043303  0.25004516], Predicted=[0.37459865]

X=[ 0.84316306 -0.04608862], Predicted=[0.01976432]
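
Because both the features and the target were scaled with MinMaxScaler, the predictions above are on the [0, 1] scale of the transformed target. To report them in the original units, the fitted scalarY can be inverted. A minimal sketch, assuming the model, scalers and Xnew from the cell above:

In [ ]:
# Map the scaled predictions back to the original target units
# using the MinMaxScaler that was fitted on y.
ynew_original = scalarY.inverse_transform(ynew)
for i in range(len(Xnew)):
    print("X=%s, Predicted (original scale)=%s" % (Xnew[i], ynew_original[i]))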

Summary

In this coding recipe, we discussed how to build simple Classification and Regression models with Keras in Python.

Specifically, we have learned the following:

  • How to build a simple Classification model with Keras in Python.
  • How to build a simple Regression model with Keras in Python.