Ranking class based on Recursive Feature Elimination (RFE) and Recursive Forward Selection (RFS) methods.
The feature weights returned by the classifier are used to select and rank features within one of the implemented schemes:
- Recursive Feature Elimination family [Guyon02]: RFE, ERFE [Furlanello03], BISRFE, SQRTRFE
- Recursive Forward Selection family [Louw06]: RFS
- One-step ranking (a single pass over the initial weights, with no elimination)
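The elimination loop behind the RFE family can be sketched as follows. This is a minimal illustration, not the mlpy implementation; it assumes the weight provider exposes a weights(x, y) method returning one weight per feature (see compute() below), and it removes a single feature per step, whereas the BISRFE and SQRTRFE variants presumably discard larger blocks of remaining features per step.

import numpy as np

def rfe_sketch(x, y, w):
    remaining = list(range(x.shape[1]))   # indexes of surviving features
    eliminated = []                       # features in order of elimination (worst first)
    while remaining:
        weights = np.abs(w.weights(x[:, remaining], y))  # assumed signature: weights(x, y)
        worst = int(np.argmin(weights))                  # position of the smallest |weight|
        eliminated.append(remaining.pop(worst))          # drop that feature
    return np.array(eliminated[::-1])                    # last eliminated = ranked first
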
Example:
>>> from numpy import *
>>> from mlpy import *
>>> x = array([[1.1, 2.1, 3.1, -1.0], # first sample
... [1.2, 2.2, 3.2, 1.0], # second sample
... [1.3, 2.3, 3.3, -1.0]]) # third sample
>>> y = array([1, -1, 1]) # classes
>>> myrank = Ranking() # initialize ranking class
>>> mysvm = Svm() # initialize svm class
>>> myrank.compute(x, y, mysvm) # compute feature ranking
array([3, 1, 2, 0])
Initialize Ranking class.
Input
- method - [string] method ('onestep', 'rfe', 'bisrfe', 'sqrtrfe', 'erfe', 'rfs')
- lastsinglesteps - [integer] number of final steps in which features are eliminated one at a time (used with 'rfe')
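For example (a hedged illustration; it assumes the two parameters above can be passed by keyword under these names):
>>> myrank = Ranking(method='rfe', lastsinglesteps=5)  # single-feature removal in the last 5 steps
>>> myonestep = Ranking(method='onestep')              # rank once from the initial weights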
Compute the feature ranking.
Input
- x - [2D numpy array float] (sample x feature) training data
- y - [1D numpy array integer] (1 or -1) classes
- w - object (e.g. classifier) with weights() method
- debug - [bool] show the number of remaining features at each step (True or False)
Output
- feature ranking - [1D numpy array integer] ranked feature indexes
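Any object exposing a weights() method can be passed as w. Below is a toy, hypothetical weight provider that scores each feature by the absolute difference of its class means; the sketch assumes compute() calls w.weights(x, y) on the current feature subset:
>>> class MeanDiffWeights:
...     def weights(self, x, y):
...         # one weight per feature: |mean over class +1 minus mean over class -1|
...         return abs(x[y == 1].mean(axis=0) - x[y == -1].mean(axis=0))
>>> myrank = Ranking(method='rfe')
>>> ranking = myrank.compute(x, y, MeanDiffWeights())  # ranked feature indexes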
[Guyon02] I. Guyon, J. Weston, S. Barnhill, and V. Vapnik. Gene Selection for Cancer Classification using Support Vector Machines. Machine Learning, 46(1-3):389-422, 2002.
[Furlanello03] C. Furlanello, M. Serafini, S. Merler, and G. Jurman. An accelerated procedure for recursive feature ranking on microarray data. In Advances in Neural Network Research: IJCNN 2003. Elsevier, 2003.
[Louw06] N. Louw and S. J. Steel. Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination. Computational Statistics & Data Analysis, 51(3):2043-2055, 2006.