SVM with Linearly Inseparable Classes

02.05.2021

Intro

In a previous post, we discussed how to build a simple linear classifier with SVM. Now we will handle the case where the classes are not linearly separable. For these cases, we can use different kernels with SVM. We will also show plots of the different kernels so you can see how they differ from linear classification.

Building the Model

To change kernels, we can use the kernel parameter of the SVC class from sklearn. You can view the available kernels in the SVC documentation: https://scikit-learn.org/stable/modules/generated/sklearn.svm.SVC.html
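As a quick sketch, here are the built-in kernel options SVC accepts (the string values below come from the scikit-learn documentation linked above):

```python
from sklearn.svm import SVC

# Built-in kernel choices for SVC; "rbf" is the default
kernel_options = ["linear", "poly", "rbf", "sigmoid"]

for k in kernel_options:
    clf = SVC(kernel=k)
    print(clf.kernel)
```

Each kernel defines a different similarity function between samples, which in turn shapes the decision boundary the SVM can learn.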

# Load libraries
from sklearn.svm import SVC
from sklearn import datasets
import numpy as np

# Set randomization seed
np.random.seed(0)

# Generate two features
features = np.random.randn(200, 2)

# Use an XOR gate (you don't need to know what this is) to generate
# linearly inseparable classes
target_xor = np.logical_xor(features[:, 0] > 0, features[:, 1] > 0)
target = np.where(target_xor, 0, 1)

# Create a support vector machine with a radial basis function kernel
svc = SVC(kernel="rbf", random_state=0, gamma=1, C=1)

# Train the classifier
model = svc.fit(features, target)