[MXDL-1-07] Artificial Neural Network [7/7] - Linear Regression and Non-linear Regression

In this video, we will write code to implement regression models. A single-layer perceptron can be used to implement a linear regression model, and a multi-layer perceptron can be used to implement a non-linear regression model.

In machine learning, we implemented non-linear regression using weighted linear regression or a support vector regressor. A weighted linear regression model takes a long time to make predictions because it is a lazy learner, and a support vector regressor requires the use of a kernel function.

Using artificial neural networks, a non-linear regression model can be implemented simply by changing the loss function and the activation function of the output layer. The output layer uses a linear activation function, and the hidden layers use a non-linear activation function such as sigmoid, hyperbolic tangent, or ReLU. Mean squared error (MSE) is used as the loss function.

Let's implement a single-layer perceptron to perform linear regression and a two-layer perceptron to perform non-linear regression.
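As a rough sketch of the idea, the forward pass of such a two-layer regression network can be written in a few lines of NumPy. The function and parameter names below are illustrative, not the ones used in the video; the hidden layer here uses tanh, and the output layer is linear with an MSE loss, as described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in=1, n_hidden=8, n_out=1):
    # Small random weights for a two-layer perceptron (one hidden layer).
    return {
        "W1": rng.normal(0.0, 0.5, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, 0.5, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def predict(params, x):
    # Hidden layer: non-linear activation (tanh); output layer: linear.
    h = np.tanh(x @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

def mse_loss(params, x, y):
    # Mean squared error between predictions and targets.
    return np.mean((predict(params, x) - y) ** 2)
```

Dropping the hidden layer (a single `x @ W + b`) reduces this to the single-layer perceptron, i.e. plain linear regression.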

We will use the gradient descent function we created before. As mentioned before, to calculate the gradient accurately, we need to use automatic differentiation. However, here we use numerical differentiation to approximate the gradient, in order to better understand how ANNs fundamentally work. We will take a closer look at gradient descent using automatic differentiation when we cover the error backpropagation algorithm later.
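The numerical approach mentioned here can be sketched as a central-difference approximation: each parameter is nudged by a small epsilon in both directions and the loss difference estimates the partial derivative. This is a hypothetical reconstruction, not the video's actual function, and it is far slower than automatic differentiation.

```python
import numpy as np

def numerical_gradient(loss_fn, params, eps=1e-4):
    # Central-difference approximation of d(loss)/d(param) for every entry
    # of every parameter array in the params dict.
    grads = {}
    for key, w in params.items():
        g = np.zeros_like(w)
        it = np.nditer(w, flags=["multi_index"])
        while not it.finished:
            idx = it.multi_index
            orig = w[idx]
            w[idx] = orig + eps
            f_plus = loss_fn(params)
            w[idx] = orig - eps
            f_minus = loss_fn(params)
            w[idx] = orig  # restore the original value
            g[idx] = (f_plus - f_minus) / (2 * eps)
            it.iternext()
        grads[key] = g
    return grads

def gradient_descent(loss_fn, params, lr=0.1, steps=100):
    # Plain batch gradient descent using the numerical gradient above.
    for _ in range(steps):
        grads = numerical_gradient(loss_fn, params)
        for key in params:
            params[key] -= lr * grads[key]
    return params
```

Because every parameter requires two extra loss evaluations per step, this scales poorly with network size, which is exactly why backpropagation (automatic differentiation) is introduced later.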

#ArtificialNeuralNetwork #ANN #LinearRegression #NonlinearRegression #SingleLayerPerceptron #MultiLayerPerceptron