Machine Learning Algorithms
| Paradigm | Learning rule | Architecture | Learning algorithm | Task |
| --- | --- | --- | --- | --- |
| Supervised | Error-correction | Single- or multi-layer perceptron | Perceptron rule, Stochastic gradient descent, Backpropagation, BP+reinf, SBPI | Pattern classification, Function approximation, Prediction, Control |
| Supervised | Error-correction | Convolutional networks | Stochastic gradient descent, Backpropagation | Classification, Computer vision, Speech recognition |
| Supervised | Error-correction | Auto-encoders | Gradient descent | Data compression, Preprocessing |
| Supervised | Competitive | Competitive network | Learning Vector Quantization | Within-class categorization, Data compression |
| Unsupervised | Hebbian | Recurrent network | Hebb rule | Denoising, Attractor learning |
| Unsupervised | Hebbian | Hopfield network | Hebb rule, Inference (BP, TAP, Monte Carlo) | Memory, Denoising, Attractor learning |
| Unsupervised | Boltzmann | Boltzmann machine | Contrastive divergence, Statistical inference | Feature learning, Preprocessing, Denoising, Generative modeling |
| Unsupervised | Kohonen's SOM | Kohonen's SOM | Kohonen's SOM | Data analysis |
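A few of the rules in this table are concrete enough to sketch in code; the short examples below are minimal illustrations, not implementations from the original post. First, the perceptron rule from the supervised, error-correction row: weights are adjusted only when a sample is misclassified. The function name `perceptron_train`, the hyperparameters, and the toy data are my own choices for illustration.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Train a single-layer perceptron with the classic error-correction rule.

    X: (n_samples, n_features) inputs, y: labels in {-1, +1}.
    Weights change only when a sample is misclassified.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:   # misclassified
                w += lr * yi * xi               # error-correction update
                b += lr * yi
    return w, b

# Toy linearly separable problem (logical AND mapped to {-1, +1})
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))   # -> [-1. -1. -1.  1.]
```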
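The Hebb rule and Hopfield network rows cover the unsupervised storage of patterns in a weight matrix and their retrieval as attractors of the network dynamics. A minimal sketch, assuming bipolar (±1) patterns and synchronous updates; `hebb_store` and `hopfield_recall` are illustrative names, not from the post.

```python
import numpy as np

def hebb_store(patterns):
    """Build Hopfield weights with the Hebb rule: sum of outer products, scaled by N."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:                 # patterns have entries in {-1, +1}
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)             # no self-connections
    return W / n

def hopfield_recall(W, state, steps=10):
    """Denoise a probe by repeatedly applying sign(W @ state) until a fixed point."""
    for _ in range(steps):
        new_state = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))      # 3 random patterns over 100 units
W = hebb_store(patterns)
noisy = patterns[0].copy()
flip = rng.choice(100, size=15, replace=False)     # corrupt 15 of the 100 units
noisy[flip] *= -1
recalled = hopfield_recall(W, noisy)
print(np.mean(recalled == patterns[0]))            # close to 1.0: the stored pattern is recovered
```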
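Contrastive divergence from the Boltzmann row is, in practice, usually run on a restricted Boltzmann machine; the sketch below assumes that restricted, binary setting (CD-1, full-batch updates, illustrative sizes and learning rate), which is a narrower case than a general Boltzmann machine.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, a, b, v0, lr=0.05):
    """One CD-1 update on a binary restricted Boltzmann machine.

    W: (n_visible, n_hidden) weights, a: visible biases, b: hidden biases,
    v0: batch of binary visible vectors, shape (batch, n_visible).
    """
    # Positive phase: sample hidden units given the data
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs reconstruction step
    pv1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    # Contrastive divergence: data statistics minus model statistics
    batch = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / batch
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)
    return np.mean((v0 - pv1) ** 2)    # reconstruction error, a rough progress signal

# Toy data: 6 visible units, two repeating binary "prototypes"
data = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]] * 50, dtype=float)
n_v, n_h = 6, 3
W = 0.01 * rng.standard_normal((n_v, n_h))
a, b = np.zeros(n_v), np.zeros(n_h)
for epoch in range(200):
    err = cd1_step(W, a, b, data)
print(round(err, 3))                   # reconstruction error shrinks as training proceeds
```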
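Finally, Kohonen's SOM from the last row: competitive learning in which the best-matching unit and its grid neighbours are pulled toward each input. A 1-D sketch with illustrative parameters and a made-up function name, `train_som_1d`.

```python
import numpy as np

def train_som_1d(X, n_units=10, epochs=30, lr0=0.5, sigma0=3.0):
    """Train a 1-D self-organizing map: find the best-matching unit for each
    sample and move it (and its grid neighbours) toward that sample."""
    rng = np.random.default_rng(2)
    W = rng.random((n_units, X.shape[1]))            # codebook vectors
    grid = np.arange(n_units)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                  # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.5)  # shrinking neighbourhood
        for x in X:
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))   # best-matching unit
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)           # pull neighbourhood toward x
    return W

# Map 2-D points lying on a curve onto a 1-D chain of units
theta = np.linspace(0, np.pi, 200)
X = np.column_stack([np.cos(theta), np.sin(theta)])
W = train_som_1d(X)
print(W.round(2))    # codebook vectors roughly trace the half-circle
```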
Supervised vs. unsupervised
The distinction is whether the training data are labeled (supervised) or unlabeled (unsupervised).