Support Vector Machines (SVMs)

class mlpy.Svm(kernel='linear', kp=0.1, C=1.0, tol=0.001, eps=0.001, maxloops=1000, cost=0.0)

Support Vector Machines (SVM).

Example:

>>> from numpy import *
>>> from mlpy import *
>>> xtr = array([[1.0, 2.0, 3.0, 1.0],  # first sample
...              [1.0, 2.0, 3.0, 2.0],  # second sample
...              [1.0, 2.0, 3.0, 1.0]]) # third sample
>>> ytr = array([1, -1, 1])             # classes
>>> mysvm = Svm()                       # initialize Svm class
>>> mysvm.compute(xtr, ytr)             # compute SVM
1
>>> mysvm.predict(xtr)                  # predict SVM model on training data
array([ 1, -1,  1])
>>> xts = array([4.0, 5.0, 6.0, 7.0])   # test point
>>> mysvm.predict(xts)                  # predict SVM model on test point
-1
>>> mysvm.realpred                      # real-valued prediction
-5.5
>>> mysvm.weights(xtr, ytr)             # compute weights on training data
array([ 0.,  0.,  0.,  1.])

Initialize the Svm class.

Input

  • kernel - [string] kernel (‘linear’, ‘gaussian’, ‘polynomial’, ‘tr’)
  • kp - [float] kernel parameter (two sigma squared) for the gaussian and polynomial kernels
  • C - [float] regularization parameter
  • tol - [float] tolerance for testing KKT conditions
  • eps - [float] convergence parameter
  • maxloops - [integer] maximum number of optimization loops
  • cost - [float] cost, in the range -1.0 to 1.0, for cost-sensitive classification
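
The constructor accepts these parameters as keyword arguments. A minimal sketch with illustrative (not tuned) values:

>>> from mlpy import Svm
>>> gsvm = Svm(kernel='gaussian', kp=2.0, C=10.0)  # gaussian kernel, kp = two sigma squared
>>> csvm = Svm(kernel='linear', cost=0.5)          # cost-sensitive linear SVM
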
compute(x, y)

Compute the SVM model.

Input

  • x - [2D numpy array float] (sample x feature) training data
  • y - [1D numpy array integer] (-1 or 1) classes

Output

  • conv - [integer] svm convergence (0: false, 1: true)
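
A minimal sketch of training and checking the convergence flag, on illustrative toy data (raising maxloops is one documented option when the optimizer does not converge):

>>> from numpy import array
>>> from mlpy import Svm
>>> xtr = array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
>>> ytr = array([1, 1, -1, -1])
>>> mysvm = Svm(kernel='linear', maxloops=5000)  # allow more loops than the default
>>> conv = mysvm.compute(xtr, ytr)               # 1: converged, 0: not converged
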
predict(p)

Predict the SVM model on one or more test points.

Input

  • p - [1D or 2D numpy array float] test point(s)

Output

  • cl - [integer or 1D numpy array integer] class(es) predicted
  • self.realpred - [1D numpy array float] real-valued prediction
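
A minimal sketch, on illustrative toy data, of predicting a single point (1D input), a batch of points (2D input), and reading the real-valued prediction:

>>> from numpy import array
>>> from mlpy import Svm
>>> xtr = array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
>>> ytr = array([1, 1, -1, -1])
>>> mysvm = Svm()
>>> conv = mysvm.compute(xtr, ytr)
>>> cl = mysvm.predict(array([1.5, 1.5]))                 # 1D point -> integer class
>>> rp = mysvm.realpred                                   # real-valued prediction for that point
>>> cls = mysvm.predict(array([[1.5, 1.5], [3.5, 3.5]]))  # 2D array -> 1D array of classes
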
weights(x, y)

Return feature weights.

Input

  • x - [2D numpy array float] (sample x feature) training data
  • y - [1D numpy array integer] (-1 or 1) classes

Output

  • fw - [1D numpy array float] feature weights
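
A minimal sketch, on illustrative toy data, of ranking features by their weights:

>>> from numpy import array, argsort
>>> from mlpy import Svm
>>> xtr = array([[1.0, 2.0, 3.0, 1.0],
...              [1.0, 2.0, 3.0, 2.0],
...              [1.0, 2.0, 3.0, 1.0]])
>>> ytr = array([1, -1, 1])
>>> mysvm = Svm()
>>> conv = mysvm.compute(xtr, ytr)
>>> fw = mysvm.weights(xtr, ytr)    # one weight per feature
>>> ranking = argsort(fw)[::-1]     # feature indices, largest weight first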

Note

For the tr kernel (Terminated Ramp kernel), see [Merler06].
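
A minimal sketch of selecting the tr kernel (data values are illustrative):

>>> from numpy import array
>>> from mlpy import Svm
>>> xtr = array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
>>> ytr = array([1, 1, -1, -1])
>>> trsvm = Svm(kernel='tr')        # Terminated Ramp kernel [Merler06]
>>> conv = trsvm.compute(xtr, ytr)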

[Vapnik95] V Vapnik. The Nature of Statistical Learning Theory. Springer-Verlag, 1995.
[Cristianini] N Cristianini and J Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge University Press, 2000.
[Merler06] S Merler and G Jurman. Terminated Ramp - Support Vector Machine: a nonparametric data dependent kernel. Neural Networks, 19:1597-1611, 2006.
