MATLAB Codes of ELM Algorithm

(for ELM with randomly generated hidden nodes/neurons)

 

How to use it?

After downloading the ELM codes, save and unzip them in your own folder, then type "help elm" at the MATLAB prompt for help.

 

Basic Usage: elm(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
OR:    [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm(TrainingData_File, TestingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)


Input:

TrainingData_File      -  Filename of the training data set
TestingData_File       -  Filename of the testing data set
Elm_Type               -  ELM as function approximator or classifier:
                          0 for regression;
                          1 for (both binary and multi-class) classification
NumberofHiddenNeurons  -  Number of hidden neurons assigned to the ELM
ActivationFunction     -  Type of activation function:
                          'sig' for the Sigmoidal function
                          'sin' for the Sine function
                          'hardlim' for the Hardlim function
                          'tribas' for the Triangular basis function
                          'radbas' for the Radial basis function (for additive-type SLFNs rather than RBF-type SLFNs)
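For reference, the activation functions listed above can be written out as follows. This is an illustrative Python/NumPy sketch of the standard definitions, not the shipped MATLAB code; the function names mirror the 'ActivationFunction' options.

```python
import numpy as np

# Sketch of the activation function options above (the Sine option is
# simply np.sin); names mirror the MATLAB package's option strings.
def sig(x):      # Sigmoidal function: 1 / (1 + e^-x)
    return 1.0 / (1.0 + np.exp(-x))

def hardlim(x):  # Hardlim: 1 where x >= 0, else 0
    return (np.asarray(x) >= 0).astype(float)

def tribas(x):   # Triangular basis: 1 - |x| on [-1, 1], 0 elsewhere
    return np.maximum(1.0 - np.abs(x), 0.0)

def radbas(x):   # Radial basis: exp(-x^2)
    return np.exp(-np.asarray(x) ** 2)
```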

Output:

TrainingTime      -  CPU time (seconds) spent on training the ELM
TestingTime       -  CPU time (seconds) spent on predicting all testing data
TrainingAccuracy  -  Training accuracy: RMSE for regression, or correct classification rate for classification
TestingAccuracy   -  Testing accuracy: RMSE for regression, or correct classification rate for classification

 

MULTI-CLASS CLASSIFICATION: THE NUMBER OF OUTPUT NEURONS IS AUTOMATICALLY SET EQUAL TO THE NUMBER OF CLASSES.

FOR EXAMPLE, if there are 7 classes in all, there will be 7 output neurons; if neuron 5 has the highest output, the input belongs to the 5th class.
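That coding scheme can be sketched as follows (Python, with illustrative helper names; the MATLAB package performs the equivalent step internally): each label becomes a vector with one entry per output neuron, and the predicted label is the index of the neuron with the highest output.

```python
import numpy as np

# Sketch of the multi-class coding described above ('encode'/'decode' are
# illustrative names, not functions from the MATLAB package).
def encode(labels, n_classes):
    # One output neuron per class; the target for the correct class is +1
    # and -1 elsewhere (a common ELM coding choice, assumed here).
    T = -np.ones((len(labels), n_classes))
    T[np.arange(len(labels)), np.asarray(labels) - 1] = 1.0
    return T

def decode(outputs):
    # Predicted class = index of the neuron with the highest output.
    return (np.argmax(outputs, axis=1) + 1).tolist()

T = encode([5, 1], n_classes=7)   # 7 classes -> 7 output neurons per instance
```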

Sample 1 (regression): [TrainingTime, TestingTime, TrainingAccuracy, TestingAccuracy] = elm('sinc_train', 'sinc_test', 0, 20, 'sig')
Sample 2 (classification): elm('diabetes_train', 'diabetes_test', 1, 20, 'sig')
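For readers curious about what happens during training, the core ELM rule can be sketched in a few lines (a minimal Python/NumPy illustration of the published algorithm, not the shipped MATLAB code): the hidden-layer weights and biases are drawn at random once, and only the output weights are computed, via the Moore-Penrose pseudoinverse.

```python
import numpy as np

# Minimal ELM sketch: random hidden layer, least-squares output weights.
def elm_train(X, T, n_hidden, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (n_hidden, X.shape[1]))  # random input weights
    b = rng.uniform(-1, 1, n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))        # hidden-layer outputs ('sig')
    beta = np.linalg.pinv(H) @ T                    # output weights via pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
    return H @ beta

# Regression on the noise-free SinC function, loosely mirroring Sample 1:
X = np.linspace(-10, 10, 500).reshape(-1, 1)
T = np.sinc(X / np.pi)                              # sin(x)/x
W, b, beta = elm_train(X, T, 20)
rmse = np.sqrt(np.mean((elm_predict(X, W, b, beta) - T) ** 2))
```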

Advanced Usage: If users wish to save a trained network model and use it for different testing data sets, the provided advanced ELM package may be useful:

For training: elm_train(TrainingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)
OR:    [TrainingTime, TrainingAccuracy] = elm_train(TrainingData_File, Elm_Type, NumberofHiddenNeurons, ActivationFunction)

For testing/prediction: elm_predict(TestingData_File)
OR:    [TestingTime, TestingAccuracy] = elm_predict(TestingData_File)

After the training phase is completed, the trained network model is saved in 'elm_model.mat', which can then be used for prediction by calling the function elm_predict. The prediction output is saved in 'elm_output.mat'.
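The train-once / predict-later workflow above can be sketched as follows, with a NumPy .npz file standing in for elm_model.mat (Python illustration with a toy dataset; filenames and variable names here are illustrative, not the package's own).

```python
import numpy as np

# Train-once / predict-later sketch: persist the three learned quantities
# (random hidden parameters plus output weights), then reload for new data.
rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, (100, 3))
T_train = X_train.sum(axis=1, keepdims=True)        # toy regression target

W = rng.uniform(-1, 1, (10, 3))                     # random hidden weights
b = rng.uniform(-1, 1, 10)                          # random hidden biases
H = 1.0 / (1.0 + np.exp(-(X_train @ W.T + b)))
beta = np.linalg.pinv(H) @ T_train                  # learned output weights
np.savez("elm_model.npz", W=W, b=b, beta=beta)      # "training phase" ends here

model = np.load("elm_model.npz")                    # later, for any test set:
X_new = rng.uniform(-1, 1, (5, 3))
H_new = 1.0 / (1.0 + np.exp(-(X_new @ model["W"].T + model["b"])))
Y_new = H_new @ model["beta"]
```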

 

Data Format of Training and Testing Files

Important Notes:

  • It is suggested that all input attributes (except the expected targets) be normalized into the range [-1, 1].
  • The downloaded elm works for single-output function regression and single/multi-class classification cases. Users may easily customize it for multi-output function regression cases.

Data Format: Training and testing files are text files, with each row containing the information of one instance. The first column is the expected output (target) for both regression and classification applications; the remaining columns contain the attribute values of the instance. For example, the following shows a 7-class dataset with 9 input attributes. Since the ELM automatically detects and handles multi-class labels, users can simply use one column to indicate the class label in their training and testing data files.
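A file in this layout is straightforward to parse. The sketch below (Python; 'load_elm_data' is a hypothetical helper, not part of the MATLAB package) splits the target column from the attributes and rescales each attribute column into [-1, 1], as the note above suggests.

```python
import numpy as np

# Sketch: one instance per row, target in the first column, attributes in
# the remaining columns; attributes rescaled per column into [-1, 1].
def load_elm_data(lines):
    data = np.array([[float(v) for v in ln.split()] for ln in lines])
    targets, attrs = data[:, 0], data[:, 1:]
    lo, hi = attrs.min(axis=0), attrs.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)          # guard constant columns
    attrs = 2.0 * (attrs - lo) / span - 1.0
    return targets, attrs

rows = ["7 -0.4 0.2", "3 0.6 0.5", "2 -0.1 -0.3"]   # toy 3-instance sample
targets, attrs = load_elm_data(rows)
```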

 

% Each row: Target, followed by Input Attr 1 ... Input Attr 9
% Target  Attr 1    Attr 2    Attr 3    Attr 4    Attr 5    Attr 6    Attr 7    Attr 8    Attr 9
  7       -0.38462  -0.34545  -0.70115  -0.63636  -0.26154  -0.34545  -0.6      -0.59375  -0.28125
  3       0.630769  0.545455  0.310345  -0.05785  0.753846  0.545455  0.452632  -0.04688  0.75
  2       -0.13846  -0.12727  -0.01149  -0.23967  0.107692  0.018182  -0.03158  -0.15625  0.21875
  3       0.353846  0.236364  0.103448  -0.17355  0.476923  0.236364  0.073684  -0.15625  0.34375
  2       0.261538  0.054545  -0.14943  -0.42149  0.015385  0.054545  -0.07368  -0.32813  0
  4       0.138462  0.054545  0.034483  -0.28926  0.015385  -0.01818  -0.24211  -0.32813  -0.125
  4       0.076923  0.090909  -0.28736  -0.38843  0.076923  0.090909  -0.11579  -0.35938  0.0625
  1       -0.26154  0.236364  0.195402  -0.05785  -0.26154  0.309091  0.263158  -0.04688  -0.28125
  1       -0.13846  0.545455  0.517241  0.07438   -0.13846  0.472727  0.452632  0.078125  -0.03125
  7       -0.01538  -0.05455  -0.4023   -0.43802  -0.01538  -0.05455  -0.30526  -0.40625  -0.03125
  1       -0.13846  0.527273  0.632184  0.123967  -0.26154  0.6       0.557895  0.125     -0.15625
  2       -0.84615  -0.92727  0.632184  0.818182  -0.84615  -0.87273  0.452632  0.734375  -0.875
  2       0.107692  0.163636  0.011494  -0.30579  0.107692  0.109091  -0.09474  -0.32813  -0.15625
  1       -0.56923  -0.29091  -0.33333  -0.28926  -0.35385  -0.16364  -0.24211  -0.26563  -0.375
  4       0.138462  0.163636  0.08046   -0.32231  0.138462  0.090909  0.052632  -0.29688  0.125
% More instances

 

Some samples

 

The following samples are provided for users to try; each reports the average performance over 50 or 10 trials of the application, with the training and testing datasets randomly regenerated at each trial.

  • Function approximation: SinC with noise (testing data is noise free). Command: sinc_mean
  • Binary classification: Diabetes. Command: diabetes_mean
  • Multi-class classification: Landsat Satellite Image. Command: satimage_mean
  • Multi-class classification: Image Segment. Command: segment_mean