
AIML.com


What are common choices to use for kernels in SVM?


When using a kernelized SVM, the kernel function must be specified. Common choices include:

  • Linear: The linear kernel is the simplest choice and works best when the classes are linearly separable. 
  • Polynomial: The polynomial kernel is a possible choice when the data is not linearly separable. It maps the data into a higher-dimensional space by computing polynomial combinations of the original features, and it requires the degree of the polynomial to be specified. 
  • Radial Basis Function (RBF): The RBF kernel projects the original data into an infinite-dimensional space and is a common choice for non-linear decision boundaries. It requires a gamma parameter that controls how far the influence of a single training observation reaches. It is often the default choice when there is no prior knowledge about the shape of the decision boundary. 
  • Sigmoid: The sigmoid kernel represents the original feature space in the same way as a neural network perceptron with a tanh activation function. For complex non-linear decision boundaries, however, the RBF kernel is usually preferred. 
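The kernel choices above map directly onto the `kernel` parameter of scikit-learn's `SVC`. As a minimal sketch, the snippet below fits each kernel on a toy non-linear dataset; the dataset, noise level, and hyperparameter values are illustrative assumptions, not recommendations from the answer above:

```python
# Illustrative comparison of the four common SVM kernels.
# make_moons is an assumed toy dataset; hyperparameters are example values.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

kernels = {
    "linear": SVC(kernel="linear"),
    "poly": SVC(kernel="poly", degree=3),     # polynomial degree must be specified
    "rbf": SVC(kernel="rbf", gamma="scale"),  # gamma controls each point's influence
    "sigmoid": SVC(kernel="sigmoid"),
}

# Training accuracy per kernel (for illustration only; use held-out data in practice)
scores = {name: clf.fit(X, y).score(X, y) for name, clf in kernels.items()}
for name, acc in scores.items():
    print(f"{name:8s} training accuracy: {acc:.2f}")
```

On a dataset like this with curved class boundaries, the RBF kernel typically separates the classes better than the linear kernel, which matches its role as the usual default for non-linear problems.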

