An Improvement of the Least Squares Twin Support Vector Machine
In binary classification problems, the two classes of data naturally differ from each other, and the problem becomes more complicated when the clusters within each class also contain different numbers of data points. Traditional algorithms such as the Support Vector Machine (SVM), the Twin Support Vector Machine (TSVM), and the Least Squares Twin Support Vector Machine (LSTSVM) cannot fully exploit information about the number of data points in each cluster, which may reduce classification accuracy. In this paper, we propose a new Improved Least Squares Twin Support Vector Machine (called ILS-SVM) for binary classification problems with a class-vs-clusters strategy. Experimental results show that ILS-SVM trains faster than TSVM, and that its accuracy is better than that of LSTSVM and TSVM in most cases.
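For context on the baseline being improved: LSTSVM owes its speed to replacing TSVM's two quadratic programs with two linear systems, one per nonparallel hyperplane. The sketch below is a minimal NumPy illustration of that standard linear LSTSVM step (Kumar and Gopal's formulation), not the ILS-SVM proposed in this paper; the function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0):
    """Linear LSTSVM: find two nonparallel hyperplanes, each close to
    one class and pushed away from the other, by solving two linear
    systems (no quadratic programming, hence the fast training)."""
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])  # augmented matrix of class +1
    F = np.hstack([B, e2])  # augmented matrix of class -1
    # Hyperplane 1 (w1, b1): close to class A, far from class B.
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * (E.T @ E), F.T @ e2)
    # Hyperplane 2 (w2, b2): close to class B, far from class A.
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * (F.T @ F), E.T @ e1)
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def lstsvm_predict(X, plane1, plane2):
    """Assign each point to the class of the nearer hyperplane."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = np.abs(X @ w1 + b1).ravel() / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2).ravel() / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

# Illustrative usage on two well-separated synthetic clusters.
A = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0]])   # class +1
B = np.array([[4.0, 4.0], [4.0, 5.0], [5.0, 4.0]])   # class -1
p1, p2 = lstsvm_fit(A, B)
pred = lstsvm_predict(np.vstack([A, B]), p1, p2)
```

ILS-SVM extends this idea with a class-vs-clusters strategy, fitting hyperplanes against the individual clusters of the opposite class rather than against the class as a whole.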