MATLAB implementation of LPS

I recently implemented a slightly modified version of LPS in MATLAB. This version is a randomized variant of LPS that drops the depth and segment-length parameters. The only parameter to set is the number of trees, which matters little as long as it is set large enough. Also, the node size in each tree is restricted to 0.01 times the number of rows in order to control the representation size. This introduces another parameter that could be tuned in practice; however, we used the same setting for all datasets.

This version may be slower than the R version because of MATLAB's regression tree implementation. Although the paper submitted to PAMI is still under review, I wanted to release the code.

The new results are provided on the LPS supporting page. Given enough computing power, a larger number of trees will give better (more reliable) results.

The code is available in the Files section. The zip file contains three MATLAB scripts: 'main.m', 'trainLPS.m' and 'LPS.m'.

'trainLPS.m' implements the function trainLPS, which takes three input arguments: trainLPS(series,ntree,nsegment). It returns the trained tree-based ensemble.

trainLPS trains ntree trees to learn the patterns, where each tree is trained on nsegment randomly selected segments. Training LPS is very fast: it is near-linear in the time series length and in the number of series in the database. This makes LPS suitable for similarity search in large databases of long time series.
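To illustrate the segment-sampling step that each tree is trained on, here is a minimal Python sketch (the released code is MATLAB, and the function name `extract_segments` and the predictor/target pairing shown here are my simplifications — in LPS one segment column serves as the regression target for a tree fit on the others; the tree fitting itself is omitted):

```python
import random

def extract_segments(series_list, seg_len, rng=random):
    """Sample one predictor/target segment pair per series.

    For each series, a randomly positioned segment is used as the
    predictor and the immediately following segment as the regression
    target (a simplification of LPS's random column choice). A
    regression tree would then be fit on (X, y).
    """
    X, y = [], []
    for s in series_list:
        # leave room for both the predictor and the target segment
        start = rng.randrange(0, len(s) - 2 * seg_len + 1)
        X.append(s[start:start + seg_len])                 # predictor
        y.append(s[start + seg_len:start + 2 * seg_len])   # target
    return X, y
```

Because each tree only sees a handful of fixed-length segments per series, the per-tree cost grows roughly linearly with the number and length of the series, which is where the near-linear training time comes from.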

'LPS.m' implements the function LPS, which takes three input arguments: LPS(ensemble,testseries,trainseries). It returns the similarity matrix between the test and train time series based on the patterns learned by the ensemble.
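The core idea behind the similarity is leaf matching: once the trained trees have mapped every segment of a series to a leaf, two series are similar when their segments tend to fall into the same leaves. A hedged Python sketch of that counting step (the name `lps_similarity` and the input layout are my assumptions, not the MATLAB code's):

```python
def lps_similarity(leaves_a, leaves_b):
    """Similarity from per-tree leaf assignments.

    leaves_a / leaves_b: one list per tree, each holding the leaf ids
    that the series' segments fall into for that tree. The similarity
    is the fraction of (tree, segment) pairs whose leaf ids match --
    the leaf-matching idea behind LPS, not the exact implementation.
    """
    matches = total = 0
    for la, lb in zip(leaves_a, leaves_b):
        matches += sum(1 for a, b in zip(la, lb) if a == b)
        total += len(la)
    return matches / total if total else 0.0
```

Computing this for every test/train pair yields the similarity matrix that LPS.m returns.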

'main.m' implements 1-Nearest-Neighbor (1NN) classification based on LPS similarity.
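Given the similarity matrix, 1NN classification assigns each test series the label of its most similar training series. A minimal Python sketch of that step (function and variable names are mine):

```python
def one_nn_classify(sim_matrix, train_labels):
    """1NN from a similarity matrix.

    sim_matrix[i][j] is the similarity between test series i and
    train series j; each test series gets the label of the train
    series with the highest similarity.
    """
    preds = []
    for row in sim_matrix:
        best = max(range(len(row)), key=lambda j: row[j])
        preds.append(train_labels[best])
    return preds
```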

Please find the experimental results of the MATLAB implementation in the related blog post.

Let me know if you have any questions. 

Copyright © 2018 mustafa gokce baydogan