This repository was archived by the owner on Mar 6, 2021. It is now read-only.

Commit f8e6fb3 (parent: cfbed88): README.md update

1 file changed: README.md (+96, -19 lines)

Python-ELM
==========

Extreme Learning Machine implementation in Python
Version 0.3

This is an implementation of the [Extreme Learning Machine](http://www.extreme-learning-machines.org) [1][2] in Python, based on [scikit-learn](http://scikit-learn.org).

It's a work in progress, so things can/might/will change.

__David C. Lambert__
__dcl [at] panix [dot] com__

__Copyright © 2013__
__License: Simple BSD__

Files
-----

####__random_layer.py__

Contains the __RandomLayer__, __MLPRandomLayer__, __RBFRandomLayer__ and __GRBFRandomLayer__ classes.

RandomLayer is a transformer that creates a feature mapping of the
inputs that corresponds to a layer of hidden units with randomly
generated components.

The transformed values are a specified function of input activations
that are a weighted combination of dot product (multilayer perceptron)
and distance (RBF) activations:

    input_activation = alpha * mlp_activation + (1-alpha) * rbf_activation

    mlp_activation(x) = dot(x, weights) + bias
    rbf_activation(x) = rbf_width * ||x - center||/radius

_mlp_activation_ is the multilayer perceptron input activation

_rbf_activation_ is the radial basis function input activation

_alpha_ and _rbf_width_ are specified by the user

_weights_ and _biases_ are drawn from a normal distribution with
mean 0 and standard deviation 1

_centers_ are drawn uniformly from the bounding hyperrectangle
of the inputs, and

    radius = max(||x-c||)/sqrt(n_centers*2)

(All random components can be supplied by the user by providing entries in the dictionary given as the _user_components_ parameter.)
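
For concreteness, here is a minimal numpy sketch of the recipe above. Variable names follow the formulas in this README, not the library's internals; it is an illustration, not the actual implementation:

```python
import numpy as np

rng = np.random.RandomState(0)

n_samples, n_features, n_hidden = 100, 5, 20
X = rng.uniform(size=(n_samples, n_features))

alpha, rbf_width = 0.5, 1.0

# weights and biases drawn from N(0, 1), as described above
weights = rng.randn(n_features, n_hidden)
biases = rng.randn(n_hidden)

# centers drawn uniformly from the bounding hyperrectangle of the inputs
centers = rng.uniform(X.min(axis=0), X.max(axis=0),
                      size=(n_hidden, n_features))

# pairwise distances ||x - c||, shape (n_samples, n_hidden)
dists = np.sqrt(((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2))
radius = dists.max() / np.sqrt(n_hidden * 2)

mlp_activation = np.dot(X, weights) + biases
rbf_activation = rbf_width * dists / radius
input_activation = alpha * mlp_activation + (1 - alpha) * rbf_activation

hidden_out = np.tanh(input_activation)  # default transfer function
```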

The input activation is transformed by a transfer function that defaults
to numpy.tanh if not specified, but can be any callable that returns an
array of the same shape as its argument (the input activation array, of
shape [n_samples, n_hidden]).

Transfer functions provided are:

* sine
* tanh
* tribas
* inv_tribas
* sigmoid
* hardlim
* softlim
* gaussian
* multiquadric
* inv_multiquadric

The MLPRandomLayer and RBFRandomLayer classes are just wrappers around the RandomLayer class, with the _alpha_ mixing parameter set to 1.0 and 0.0 respectively (for 100% MLP input activation, or 100% RBF input activation).

The RandomLayer, MLPRandomLayer, and RBFRandomLayer classes can take a callable user-provided transfer function. See the docstrings and the example IPython notebook for details.
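
For example, a usage sketch assuming the constructors expose _n_hidden_, _alpha_ and _activation_func_ parameters as in the docstrings (check the docstrings for the exact signatures):

```python
import numpy as np
from random_layer import RandomLayer, MLPRandomLayer

X = np.random.uniform(size=(100, 5))

# built-in transfer function, selected by name
rl = RandomLayer(n_hidden=20, alpha=0.5, activation_func='sigmoid',
                 random_state=0)
features = rl.fit_transform(X)  # shape (100, 20)

# any callable returning an array of the same shape also works,
# e.g. a rectifier (not one of the built-ins listed above)
relu = lambda acts: np.maximum(acts, 0.0)
mlp = MLPRandomLayer(n_hidden=20, activation_func=relu, random_state=0)
features = mlp.fit_transform(X)
```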

The GRBFRandomLayer implements the Generalized Radial Basis Function from [[3]](http://sci2s.ugr.es/keel/pdf/keel/articulo/2011-Neurocomputing1.pdf).

####__elm.py__

Contains the __ELMRegressor__, __ELMClassifier__, __GenELMRegressor__, and __GenELMClassifier__ classes.

GenELMRegressor and GenELMClassifier both take *RandomLayer instances as part of their constructors, and an optional regressor (conforming to the sklearn API) for performing the fit (instead of the default linear fit using the pseudo-inverse from scipy.linalg.pinv2).

GenELMClassifier is little more than a wrapper around GenELMRegressor that binarizes the target array before performing a regression, then unbinarizes the prediction of the regressor to make its own predictions.

The ELMRegressor class is a wrapper around GenELMRegressor that uses a RandomLayer instance by default and exposes the RandomLayer parameters in the constructor. ELMClassifier is similar for classification.
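
A hypothetical end-to-end sketch of both paths; the _hidden_layer_ keyword follows the docstrings, and the train_test_split import matches the scikit-learn 0.13-era module layout (adjust as needed):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.cross_validation import train_test_split  # sklearn 0.13-era module

from elm import ELMClassifier, GenELMClassifier
from random_layer import RBFRandomLayer

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# simple path: ELMClassifier exposes the RandomLayer parameters directly
clf = ELMClassifier(n_hidden=50, alpha=0.5, activation_func='tanh',
                    random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))

# general path: pass a *RandomLayer instance to GenELMClassifier
gclf = GenELMClassifier(hidden_layer=RBFRandomLayer(n_hidden=50,
                                                    random_state=0))
gclf.fit(X_train, y_train)
print(gclf.score(X_test, y_test))
```
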
####__plot_elm_comparison.py__

A small demo (based on scikit-learn's plot_classifier_comparison) that shows the decision functions of a couple of different instantiations of the GenELMClassifier on three different datasets.

####__elm_notebook.py__

An IPython notebook, illustrating several ways to use the __\*ELM\*__ and __\*RandomLayer__ classes.

Requirements
------------

Written using Python 2.7.3, numpy 1.6.1, scipy 0.10.1, scikit-learn 0.13.1 and ipython 0.12.1.

References
----------

```
[1] http://www.extreme-learning-machines.org

[2] G.-B. Huang, Q.-Y. Zhu and C.-K. Siew, "Extreme Learning Machine:
    Theory and Applications", Neurocomputing, vol. 70, pp. 489-501, 2006.

[3] Fernandez-Navarro, et al., "MELM-GRBF: a modified version of the
    extreme learning machine for generalized radial basis function
    neural networks", Neurocomputing 74 (2011), 2502-2510.
```