Perceptron neural network-1 with solved example

#perceptron #neuralNetworks #softComputing
Perceptron algorithm with solved example

Module 1: Introduction (10 hrs)
1.1 Biological neurons, McCulloch–Pitts model of a neuron, types of activation functions, network architectures, knowledge representation, Hebb net
1.2 Learning processes: supervised learning, unsupervised learning, and reinforcement learning
1.3 Learning rules: Hebbian learning rule, perceptron learning rule, delta learning rule, Widrow–Hoff learning rule, correlation learning rule, winner-take-all learning rule
1.4 Applications and scope of neural networks

Module 2: Supervised Learning Networks (12 hrs)
2.1 Perceptron networks (continuous and discrete), perceptron convergence theorem, Adaline, Madaline, method of steepest descent, least-mean-square (LMS) algorithm, linearly and non-linearly separable classes and pattern classes
2.2 Back-propagation network
2.3 Radial basis function network

Module 3: Unsupervised Learning Networks (6 hrs)
3.1 Fixed-weight competitive nets
3.2 Kohonen self-organizing feature maps, learning vector quantization
3.3 Adaptive Resonance Theory 1 (ART-1)

Module 4: Associative Memory Networks (8 hrs)
4.1 Introduction, training algorithms for pattern association
4.2 Auto-associative memory network, hetero-associative memory network, bidirectional associative memory
4.3 Discrete Hopfield networks

Module 5: Fuzzy Logic
5.1 Fuzzy sets, fuzzy relations, tolerance and equivalence relations
5.2 Fuzzification and defuzzification
5.3 Fuzzy controllers
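The perceptron learning rule listed in 1.3 (and applied to gate examples in the video) can be sketched in a few lines. This is a minimal illustration, not the video's exact code: it assumes the common soft-computing conventions of bipolar inputs and targets, learning rate alpha = 1, threshold theta = 0, and zero initial weights, and it stops when an entire epoch passes with no weight change. The function names are mine.

```python
def activation(y_in, theta=0.0):
    """Bipolar step activation with a dead zone of width 2*theta."""
    if y_in > theta:
        return 1
    if y_in < -theta:
        return -1
    return 0

def train_perceptron(samples, alpha=1.0, theta=0.0, max_epochs=100):
    """Train a two-input discrete perceptron; samples are (x1, x2, target)."""
    w1 = w2 = b = 0.0
    for epoch in range(1, max_epochs + 1):
        changed = False
        for x1, x2, t in samples:
            y_in = b + w1 * x1 + w2 * x2
            y = activation(y_in, theta)
            if y != t:                  # update weights only on error
                w1 += alpha * t * x1
                w2 += alpha * t * x2
                b += alpha * t
                changed = True
        if not changed:                 # converged: one full clean epoch
            return (w1, w2, b, epoch)
    return (w1, w2, b, max_epochs)

# Bipolar AND gate: target is +1 only when both inputs are +1.
and_gate = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, -1)]
w1, w2, b, epochs = train_perceptron(and_gate)
print(w1, w2, b, epochs)
```

For the bipolar AND gate this converges in the second epoch with weights w1 = 1, w2 = 1 and bias b = −1, the standard textbook solution; the second epoch is the "final answer" because no pattern causes a weight change.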
Comments

The best one of all I have ever searched.

gauravupadhyay

Your teaching is really helpful... you saved my marks in neural networks... thank you❤

Adx_.

Why in such a hurry, bro...
I had to watch the whole video at 0.5× playback speed.
Great teaching, though 😂👍👍

supriyoghosh

Cleared all my doubts; now I can implement Jarvis too.

eisenergy

Thank you so much, the video helped a lot while preparing for the exam.

mohammedarbas

Thank you so much, sir. Sir, please make videos as soon as possible; our university examination starts from 13 December 2018 😓😓 and there are no proper videos available for neural networks. So please make videos as soon as possible...
Please also cover these topics:
Single-layer continuous perceptron
Multi-layer perceptron
Delta learning rule
Winner-take-all learning rule
Separability limitations
Error back-propagation
Hopfield network

artikumari-bihari

Good explanation. Can you please tell when to stop? What did you mean by "final answer"?

seemapathak

Good explanation, but please explain one thing: at the start of epoch 2, for the first input, why did you take x1 = 1, x2 = 1 and t = 1?

hannanbaig

How can a set of data be classified using a simple perceptron? Using a simple perceptron with weights w0, w1, and w2 of −1, 2, and 1, respectively, classify the data points (3, 4); (5, 2); (1, −3); (−8, −3); (−3, 0).
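The question above can be worked mechanically. A sketch, assuming one common convention (the video may use a different threshold convention): the bias weight w0 multiplies a fixed input of 1, and a point (x1, x2) is assigned class +1 when the net input is ≥ 0 and class −1 otherwise.

```python
w0, w1, w2 = -1, 2, 1
points = [(3, 4), (5, 2), (1, -3), (-8, -3), (-3, 0)]

results = []
for x1, x2 in points:
    net = w0 + w1 * x1 + w2 * x2      # net input to the perceptron
    results.append(1 if net >= 0 else -1)
    print((x1, x2), "net =", net, "class =", results[-1])

# (3, 4) and (5, 2) have positive net inputs (9 and 11) and fall in class +1;
# (1, -3), (-8, -3), (-3, 0) have net inputs -2, -20, -7 and fall in class -1.
```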

bhavikprajapati

Hello boss, can you give us a clear picture of where these are used, and why do we have to follow these steps? Is that mandatory?
Also

Mrnaga

In the final table, in the last row, after −3 it should be −1, not 1.

vinitraj

Bro, great work, but always try to explain from scratch and simply.

bhaviyachopra

In some texts the weight update is done as Δw = α(target − output)·xᵢ instead of what you have taught. Which one is correct?
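Neither form is wrong. For bipolar targets and outputs t, y ∈ {−1, +1}, the error-correction form Δw = α(t − y)x is zero when the output is correct and 2αtx when it is wrong, so it matches the update-on-error form Δw = αtx used in most soft-computing texts, up to a factor of 2 that can be absorbed into the learning rate. A quick numeric check (the alpha and x values here are arbitrary):

```python
# Compare the two textbook forms of the perceptron update for bipolar t, y:
#   error-correction form:  delta_w = alpha * (t - y) * x
#   update-on-error form:   delta_w = alpha' * t * x, applied only when y != t
# With t, y in {-1, +1}, (t - y) is 0 when correct and 2t when wrong, so the
# forms coincide once alpha' = 2 * alpha.

alpha, x = 0.5, 3.0

for t, y in [(1, 1), (1, -1), (-1, 1), (-1, -1)]:
    err_form = alpha * (t - y) * x
    on_error_form = 2 * alpha * t * x if y != t else 0.0
    assert err_form == on_error_form
print("the two update rules agree for bipolar t and y")
```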

prateekganguly

Best explanation, but please improve the video quality.

bhootuncle

Then what will we call it...? After how many epochs does the error converge?

mrkunalgoswami

Kindly tell me the value of the bias for inputs 1, 2, 3, and 4 when calculating y_in. Why do we take a zero value for the first input?

voiceofssuet

Lots of doubts, like how many epochs are needed and what the applications of the algorithm are.

Artist_Sir

Is it compulsory to assume theta is 0?
Will theta be zero for all gates, or only for the AND gate?

nishakhubchandani

What is the learning solution to a problem? Please, can you solve this question?

فاطمةعليخضير

Brother, how do you find the bias values in each case???

devaryan