
Artificial Neural Network Regression with Python

Last Update: February 10, 2020

Supervised deep learning uses multi-layered algorithms either to find which class output target data belongs to or to predict its value by mapping its optimal relationship with input predictor data. The main supervised deep learning tasks are classification and regression.

This topic is part of the Deep Learning Regression with Python course. Feel free to take a look at the Course Curriculum.

This tutorial has an educational and informational purpose and doesn’t constitute any type of forecasting, business, trading or investment advice. All content, including code and data, is presented for personal educational use exclusively and with no guarantee of exactness or completeness. Past performance doesn’t guarantee future results. Please read the full Disclaimer.

An example of a supervised deep learning algorithm is the artificial neural network [1], which predicts an output target feature by dynamically processing output target and input predictor data through a multi-layer network of optimally weighted node connections. Nodes are organized into input, hidden and output layers. Weight decay L_{2} or sparsity L_{1} regularization is used to lower the variance error source generated by greater model complexity.
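For reference, the scikit-learn MLPRegressor estimator used later in this tutorial exposes L_{2} weight decay through its alpha parameter; an L_{1} sparsity penalty is not available in this estimator. A minimal sketch, with the alpha value chosen only for illustration:

import sklearn.neural_network as ml

# L2 weight decay regularization strength is set through alpha; a larger
# value penalizes node connection weights more strongly (illustrative value).
mlp = ml.MLPRegressor(alpha=0.01)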

1. Activation function

An activation function describes the linear or non-linear connection between nodes. For supervised deep learning, linear (identity), rectified linear unit, hyperbolic tangent sigmoid or logistic sigmoid functions are used.

1.1. Activation function formula notation.

\left ( linear \right )\; a\left ( h \right )=h

Where a\left ( h \right ) = linear activation function, h = hidden node weighted input.
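For reference, the activation functions named above can be written as plain NumPy functions; a minimal sketch for illustration only:

import numpy as np

def linear(h):
    # Identity activation: output equals hidden node input.
    return h

def relu(h):
    # Rectified linear unit: negative inputs are clipped to zero.
    return np.maximum(0, h)

def tanh(h):
    # Hyperbolic tangent sigmoid: output within (-1, 1).
    return np.tanh(h)

def logistic(h):
    # Logistic sigmoid: output within (0, 1).
    return 1 / (1 + np.exp(-h))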

2. Algorithm definition.

Backward propagation of errors using the quasi-Newton limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS), stochastic gradient descent (SGD) or adaptive moment estimation (Adam) algorithms finds optimal node connection weights by minimizing information loss measured through the sum of squared errors.
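In scikit-learn, these three algorithms correspond to the solver parameter of the MLPRegressor estimator; a minimal sketch for illustration only:

import sklearn.neural_network as ml

# Loss minimization algorithm selected through solver: 'lbfgs' (quasi-Newton
# L-BFGS), 'sgd' (stochastic gradient descent) or 'adam' (adaptive moment estimation).
mlp = ml.MLPRegressor(solver='adam')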

2.1. Algorithm formula notation.

min\left ( sse \right )=\sum_{t=1}^{n}\left ( y_{t}-\hat{y}_{t} \right )^{2}

\hat{y}_{t}=a\left ( \sum_{i=1}^{l}\sum_{j=1}^{m}\left ( \upsilon_{ij}+\omega_{ij}\,x_{t} \right ) \right )

Where y_{t} = output target feature data, \hat{y}_{t} = output target node prediction, a\left ( h \right ) = activation function, \upsilon_{ij} = layer i, hidden node j intercept connection or bias, \omega_{ij} = layer i, hidden node j optimal connection weight, x_{t} = input predictor features data, l = number of layers, m = number of hidden nodes, n = number of input predictor features data observations.
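As a worked illustration of the notation above, the sum of squared errors for a network with a single hidden node and a linear activation can be computed directly; a minimal sketch with made-up weights and data, not taken from the tutorial's fitted model:

import numpy as np

# Made-up data and parameters for illustration only.
x = np.array([0.01, -0.02, 0.03])  # input predictor feature data x_t
y = np.array([0.02, -0.01, 0.01])  # output target feature data y_t
v, w = 0.1, 0.5                    # bias (upsilon) and connection weight (omega)

h = v + w * x                      # hidden node weighted input
y_hat = h                          # linear activation: a(h) = h
sse = np.sum((y - y_hat) ** 2)     # sum of squared errors to be minimized
print(np.round(sse, 6))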

3. Python code example.

3.1. Import Python packages [2].

import numpy as np  # numerical arrays
import pandas as pd  # data reading and manipulation
import sklearn.neural_network as ml  # multi-layer perceptron regression

3.2. Artificial neural network regression data reading, target and predictor features creation, training and testing ranges delimiting.

  • Data: S&P 500® index replicating ETF (ticker symbol: SPY) daily adjusted close prices (2007-2015).
  • Daily arithmetic returns are used for the target feature (current day) and the predictor feature (previous day).
  • Target and predictor features creation and training and testing ranges delimiting are not fixed and are only included for educational purposes.
# Read S&P 500 ETF daily adjusted close prices with dates as index.
spy = pd.read_csv('Data//ANN-Regression-Data.txt', index_col='Date', parse_dates=True)
# Daily arithmetic returns as target feature (current day).
rspy = spy.pct_change(1)
rspy.columns = ['rspy']
# One-day lagged returns as predictor feature (previous day).
rspy1 = rspy.shift(1)
rspy1.columns = ['rspy1']
# Join target and predictor features and drop rows with missing values.
rspyall = rspy.join(rspy1)
rspyall = rspyall.dropna()
# Training (2007-2013) and testing (2014-2015) ranges delimiting.
rspyt = rspyall['2007-01-01':'2014-01-01']
rspyf = rspyall['2014-01-01':'2016-01-01']

3.3. Artificial neural network regression fitting, results and output.

  • The artificial neural network is fitted within the training range.
  • The number of hidden nodes, number of hidden layers and linear activation function are not fixed and are only included for educational purposes.
  • Output results might differ depending on the algorithm's random number generation seed.
In:
# Fit network with one hidden node, identity (linear) activation and L-BFGS solver.
annt = ml.MLPRegressor(hidden_layer_sizes=(1,), activation='identity',
                       solver='lbfgs').fit(np.array(rspyt['rspy1']).reshape(-1, 1),
                                           rspyt['rspy'])
# Collect fitted node connection weights and biases for printing.
anntw = [{'0': 'Nodes Connection:', '1': 'Weight or Bias:'},
         {'0': 'Intercept --> Hidden', '1': np.round(annt.intercepts_[0], 4)},
         {'0': 'Input     --> Hidden', '1': np.round(annt.coefs_[0], 4)},
         {'0': 'Intercept --> Output', '1': np.round(annt.intercepts_[1], 4)},
         {'0': 'Hidden    --> Output', '1': np.round(annt.coefs_[1], 4)}]
print('')
print('== Artificial Neural Network Regression Results ==')
print('')
print('ANN Regression Number of Layers:', annt.n_layers_)
print('ANN Regression Nodes Connection Weights and Bias:')
print('')
print(pd.DataFrame(anntw))
Out:
== Artificial Neural Network Regression Results ==

ANN Regression Number of Layers: 3
ANN Regression Nodes Connection Weights and Bias:

                      0                1
0     Nodes Connection:  Weight or Bias:
1  Intercept --> Hidden         [1.1518]
2  Input     --> Hidden       [[0.1687]]
3  Intercept --> Output         [0.9378]
4  Hidden    --> Output      [[-0.8139]]
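The testing range rspyf delimited in section 3.2. is not used above; as a complementary illustration, the fitted network can generate out-of-sample predictions within it. A minimal sketch, assuming the objects created in the previous sections are in scope:

In:
# Out-of-sample predictions within the testing range (illustrative extension).
pred = annt.predict(np.array(rspyf['rspy1']).reshape(-1, 1))
# Mean squared error between testing range returns and their predictions.
mse = np.mean((np.array(rspyf['rspy']) - pred) ** 2)
print('ANN Regression Testing Range MSE:', np.round(mse, 6))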
4. References.

[1] C. M. Bishop, “Neural Networks for Pattern Recognition”, Oxford University Press, 1995.

[2] Travis E. Oliphant, “A Guide to NumPy”, Trelgol Publishing, 2006.

Stéfan van der Walt, S. Chris Colbert and Gaël Varoquaux, “The NumPy Array: A Structure for Efficient Numerical Computation”, Computing in Science & Engineering, 2011.

Wes McKinney, “Data Structures for Statistical Computing in Python”, Proceedings of the 9th Python in Science Conference, 2010.

Fabian Pedregosa, Gaël Varoquaux, Alexandre Gramfort, Vincent Michel, Bertrand Thirion, Olivier Grisel, Mathieu Blondel, Peter Prettenhofer, Ron Weiss, Vincent Dubourg, Jake Vanderplas, Alexandre Passos, David Cournapeau, Matthieu Brucher, Matthieu Perrot, Édouard Duchesnay, “Scikit-learn: Machine Learning in Python”, Journal of Machine Learning Research, 2011.
