From the above image we can read off the slope and intercept, so the equation of the hyperplane is x + y = 0; option (1) is correct.
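As a quick sanity check, here is a minimal sketch verifying that the line x + y = 0 separates the eight training points used in this example (class-1 points should give x + y > 0, class-0 points x + y < 0):

```python
# Training points from the example, split by class
points_class1 = [(4, 5), (3, 2), (2, 3), (2, 2)]
points_class0 = [(-1, -3), (-2, -6), (-2, -4), (-3, -1)]

# For the hyperplane x + y = 0, the decision value is simply x + y
assert all(x + y > 0 for x, y in points_class1)
assert all(x + y < 0 for x, y in points_class0)
print("x + y = 0 separates both classes")
```

Every class-1 point lands strictly above the line and every class-0 point strictly below it, which is consistent with option (1).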
In [42]:
# some basic dependencies
import numpy as np
import matplotlib.pyplot as plt  # to visualize data
from matplotlib import style
style.use("ggplot")
from sklearn import svm

# the feature list in the capital X variable, as x and y coordinates [x, y]
X = np.array([[4, 5], [3, 2], [2, 3], [2, 2],
              [-1, -3], [-2, -6], [-2, -4], [-3, -1]])
# corresponding class labels 1/0: the first 4 rows of X belong to class 1,
# the last 4 rows belong to class 0
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])

# SVC (support vector classifier); here we define our classifier
# with a linear kernel
clf = svm.SVC(kernel='linear', C=1.0)
# train the classifier, i.e. find the best fit for the given data
clf.fit(X, y)

# Now, visualize the data
w = clf.coef_[0]
print(w)  # weight vector
a = -w[0] / w[1]  # slope of the best-fit separating line
print("slope =", int(a))
print("intercept =", int(clf.intercept_[0] / w[1]))  # intercept of the line

xx = np.linspace(-3, 3)
yy = a * xx - clf.intercept_[0] / w[1]
margin = 1 / np.sqrt(np.sum(clf.coef_ ** 2))
print("Margin =", margin)

plt.plot(xx, yy, 'k-', label="Linear SVM hyperplane")
plt.scatter(X[:, 0], X[:, 1], c=y)
plt.legend()
plt.show()

[0.25015266 0.24974557]
slope = -1
intercept = 0
Margin = 2.829002006592767
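For reference, the line parameters follow directly from the fitted coefficients: the decision boundary w0*x + w1*y + b = 0 rearranges to y = -(w0/w1)*x - b/w1, so the slope is -w0/w1 and the intercept is -b/w1. A minimal sketch using the weight vector printed by the notebook cell (b is approximately 0 here, as the intercept output shows):

```python
import numpy as np

# Weight vector as printed by the notebook cell; b taken as 0
# since the printed intercept is 0
w = np.array([0.25015266, 0.24974557])
b = 0.0

slope = -w[0] / w[1]                   # close to -1, i.e. the line x + y = 0
intercept = -b / w[1]                  # 0
margin = 1 / np.sqrt(np.sum(w ** 2))   # distance from the hyperplane to the margin

print(round(slope, 3), intercept, round(margin, 3))
```

Since w0 and w1 are nearly equal, the slope comes out close to -1 and the boundary reduces to x + y = 0, matching the answer above.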
[Plot: scatter of the two classes of training points with the fitted line, legend "Linear SVM hyperplane"]