_This repository was archived by the owner on Mar 6, 2021. It is now read-only._

Python-ELM v0.3
===============

###### This is an implementation of the [Extreme Learning Machine](http://www.extreme-learning-machines.org)[1][2] in Python, based on [scikit-learn](http://scikit-learn.org).

###### From the abstract:

> It is clear that the learning speed of feedforward neural networks is in general far slower than required and it has been a major bottleneck in their applications for past decades. Two key reasons behind may be: 1) the slow gradient-based learning algorithms are extensively used to train neural networks, and 2) all the parameters of the networks are tuned iteratively by using such learning algorithms. Unlike these traditional implementations, this paper proposes a new learning algorithm called extreme learning machine (ELM) for single-hidden layer feedforward neural networks (SLFNs) which randomly chooses the input weights and analytically determines the output weights of SLFNs. In theory, this algorithm tends to provide the best generalization performance at extremely fast learning speed. The experimental results based on real-world benchmarking function approximation and classification problems including large complex applications show that the new algorithm can produce best generalization performance in some cases and can learn much faster than traditional popular learning algorithms for feedforward neural networks.
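The training scheme the abstract describes (random, untrained input weights; output weights solved analytically by least squares) can be sketched in a few lines of NumPy. This is a toy illustration, not this repository's API; the helper names `elm_fit` and `elm_predict` are made up for the example.

```python
import numpy as np

def elm_fit(X, y, n_hidden=30, seed=0):
    """Toy single-hidden-layer ELM fit: random input weights, output
    weights solved by least squares (hypothetical helper, not this
    repository's API)."""
    rng = np.random.RandomState(seed)
    W = rng.randn(X.shape[1], n_hidden)   # random input weights, never trained
    b = rng.randn(n_hidden)               # random biases
    H = np.tanh(X @ W + b)                # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# fit a sine curve: the only "training" is one linear solve
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
W, b, beta = elm_fit(X, y)
y_hat = elm_predict(X, W, b, beta)
```

Because the hidden weights are fixed, fitting reduces to a single linear least-squares problem, which is the source of the "extremely fast learning speed" claimed above.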

__License: Simple BSD__

Files
-----

#### __random_layer.py__

Contains the __RandomLayer__, __MLPRandomLayer__, __RBFRandomLayer__ and __GRBFRandomLayer__ classes.
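The idea behind these classes is a transformer that maps inputs through randomly initialized hidden units. A minimal sketch of that pattern, mirroring scikit-learn's `fit`/`transform` convention, is below; the class `TinyRandomLayer` is hypothetical and much simpler than the `RandomLayer` actually implemented in random_layer.py.

```python
import numpy as np

class TinyRandomLayer:
    """Minimal sketch of a random hidden layer in the scikit-learn
    fit/transform style (hypothetical class, not the RandomLayer
    implemented in random_layer.py)."""

    def __init__(self, n_hidden=20, random_state=None):
        self.n_hidden = n_hidden
        self.random_state = random_state

    def fit(self, X, y=None):
        # draw the random projection once; it is never adjusted afterwards
        rng = np.random.RandomState(self.random_state)
        self.weights_ = rng.randn(X.shape[1], self.n_hidden)
        self.biases_ = rng.randn(self.n_hidden)
        return self

    def transform(self, X):
        return np.tanh(X @ self.weights_ + self.biases_)

    def fit_transform(self, X, y=None):
        return self.fit(X).transform(X)

X = np.random.RandomState(0).randn(5, 3)
H = TinyRandomLayer(n_hidden=8, random_state=1).fit_transform(X)  # shape (5, 8)
```

Any model that is linear in `H` can then be trained on top of the frozen random features.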

The GRBFRandomLayer implements the Generalized Radial Basis Function from [[3]](http://sci2s.ugr.es/keel/pdf/keel/articulo/2011-Neurocomputing1.pdf)
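A generalized RBF adds a shape exponent to the ordinary Gaussian unit. One common parameterization is `exp(-(||x - c|| / r)**tau)`, which reduces to the usual Gaussian RBF at `tau = 2`; this is an illustrative sketch with made-up centers and widths, so consult [3] and random_layer.py for the exact form used here.

```python
import numpy as np

def grbf_activations(X, centers, radii, tau=2.0):
    """Generalized RBF activations exp(-(||x - c|| / r)**tau).
    Illustrative parameterization only; see [3] for the exact form."""
    # pairwise distances between samples (n, d) and centers (k, d) -> (n, k)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
    return np.exp(-(d / radii) ** tau)

rng = np.random.RandomState(0)
X = rng.randn(10, 2)
centers = rng.randn(4, 2)
H2 = grbf_activations(X, centers, radii=1.5, tau=2.0)  # ordinary Gaussian RBF
H3 = grbf_activations(X, centers, radii=1.5, tau=3.0)  # sharper, flatter-topped units
```

Varying `tau` changes how quickly activation decays with distance from each center, which is the extra degree of freedom the GRBF layer exposes.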

#### __elm.py__

Contains the __ELMRegressor__, __ELMClassifier__, __GenELMRegressor__, and __GenELMClassifier__ classes.

GenELMClassifier is little more than a wrapper around GenELMRegressor that binar…

The ELMRegressor class is a wrapper around GenELMRegressor that uses a RandomLayer instance by default and exposes the RandomLayer parameters in the constructor. ELMClassifier is similar for classification.
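The classification-as-regression pattern these wrappers use (one-hot "binarized" targets regressed on random features, predictions decoded with argmax) can be sketched without the library. All helper names here are hypothetical, not the ELMClassifier API.

```python
import numpy as np

def fit_elm_classifier(X, y, n_hidden=40, seed=0):
    """Toy ELM classifier: random tanh features, one-hot targets,
    least-squares output weights (hypothetical helper, not the
    ELMClassifier API)."""
    rng = np.random.RandomState(seed)
    classes = np.unique(y)
    T = (y[:, None] == classes[None, :]).astype(float)  # one-hot targets
    W = rng.randn(X.shape[1], n_hidden)
    b = rng.randn(n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)        # one regression per class
    return W, b, beta, classes

def predict_elm_classifier(X, model):
    W, b, beta, classes = model
    scores = np.tanh(X @ W + b) @ beta
    return classes[np.argmax(scores, axis=1)]  # decode regression outputs to labels

# two well-separated Gaussian blobs
rng = np.random.RandomState(1)
X = np.vstack([rng.randn(50, 2) + 4, rng.randn(50, 2) - 4])
y = np.array([0] * 50 + [1] * 50)
model = fit_elm_classifier(X, y)
acc = np.mean(predict_elm_classifier(X, model) == y)
```

The classifier adds nothing to the regression machinery beyond target encoding and decoding, which is why the wrappers stay thin.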

#### __plot_elm_comparison.py__

A small demo (based on scikit-learn's plot_classifier_comparison) that shows the decision functions of a couple of different instantiations of the GenELMClassifier on three different datasets.

#### __elm_notebook.py__

An IPython notebook, illustrating several ways to use the __\*ELM\*__ and __\*RandomLayer__ classes.