Publication:
FSOCP: feature selection via second-order cone programming

dc.contributor.author: Güldoğuş, Buse Çisil
dc.contributor.author: Akyüz, Süreyya
dc.contributor.institution: Güldoğuş, Buse Çisil, Department of Industrial Engineering, Bahçeşehir Üniversitesi, Istanbul, Turkey
dc.contributor.institution: Akyüz, Süreyya, Department of Mathematics, Bahçeşehir Üniversitesi, Istanbul, Turkey
dc.date.accessioned: 2025-10-05T14:32:58Z
dc.date.issued: 2025
dc.description.abstract: Feature selection is important for accurately classifying high-dimensional data sets: it identifies relevant features and improves classification accuracy. In operations research, feature selection makes it possible to identify relevant features and build optimal feature subsets for improved predictive performance. This paper proposes a novel feature selection algorithm, inspired by ensemble pruning, that uses second-order cone programming as an embedded feature selection technique with neural networks, named feature selection via second-order cone programming (FSOCP). The proposed FSOCP algorithm trains each feature individually on a neural network and generates a class-probability distribution and prediction, allowing the second-order cone programming model to determine the most important features for improved classification accuracy. The algorithm is evaluated on multiple synthetic data sets and compared with other feature selection techniques, demonstrating its promise as a feature selection approach. © 2025 Elsevier B.V., All rights reserved.
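The abstract describes a two-stage idea: fit a small model per feature to obtain per-feature class-probability estimates, then use a second-order cone program to select the most informative features. The numpy-only sketch below illustrates only the first stage on synthetic data; the SOCP selection step that the paper formulates is replaced here by a simple top-k accuracy ranking for illustration. The synthetic data, the single-feature logistic model, and the ranking step are all assumptions of this sketch, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first two of five features carry signal.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.8 * X[:, 1] > 0).astype(float)

def single_feature_probs(x, y, lr=0.5, steps=300):
    """Logistic regression on one feature via gradient descent;
    returns per-sample class-probability estimates."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(w * x + b)))
        g = p - y  # gradient of the logistic loss w.r.t. the logit
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

# Score each feature by the accuracy of its individual model.
scores = []
for j in range(d):
    p = single_feature_probs(X[:, j], y)
    scores.append(float(np.mean((p > 0.5) == y)))

# Stand-in for the SOCP step: keep the k best-scoring features.
k = 2
selected = sorted(np.argsort(scores)[::-1][:k].tolist())
print(selected)
```

On this seeded data the two signal-carrying features score well above the noise features, so the ranking recovers them; in FSOCP itself this selection would instead come from the optimal solution of the conic program.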
dc.identifier.doi: 10.1007/s10100-023-00903-y
dc.identifier.endpage: 64
dc.identifier.issn: 1613-9178
dc.identifier.issn: 1435-246X
dc.identifier.issue: 1
dc.identifier.scopus: 2-s2.0-85182847689
dc.identifier.startpage: 51
dc.identifier.uri: https://doi.org/10.1007/s10100-023-00903-y
dc.identifier.uri: https://hdl.handle.net/20.500.14719/6464
dc.identifier.volume: 33
dc.language.iso: en
dc.publisher: Springer
dc.relation.source: Central European Journal of Operations Research
dc.subject.authorkeywords: Ensemble Pruning
dc.subject.authorkeywords: Feature Extraction
dc.subject.authorkeywords: Feature Selection
dc.subject.authorkeywords: Neural Networks
dc.subject.authorkeywords: Second Order Cone Programming
dc.subject.indexkeywords: Classification (of information)
dc.subject.indexkeywords: Clustering algorithms
dc.subject.indexkeywords: Feature selection
dc.subject.indexkeywords: Neural networks
dc.subject.indexkeywords: Operations research
dc.subject.indexkeywords: Probability distributions
dc.subject.indexkeywords: Classification accuracy
dc.subject.indexkeywords: Ensemble pruning
dc.subject.indexkeywords: Features extraction
dc.subject.indexkeywords: Features selection
dc.subject.indexkeywords: High dimensional data
dc.subject.indexkeywords: Neural-networks
dc.subject.indexkeywords: Relevant features
dc.subject.indexkeywords: Second-order cone programming
dc.subject.indexkeywords: Second-order conic programming
dc.subject.indexkeywords: Selection techniques
dc.title: FSOCP: feature selection via second-order cone programming
dc.type: Article
dcterms.references:
A Powerful Feature Selection Approach Based on Mutual Information (2008)
Data Clustering Algorithms and Applications (2013)
Battiti, R.: Using Mutual Information for Selecting Features in Supervised Neural Net Learning. IEEE Transactions on Neural Networks 5(4), 537-550 (1994)
Brown, G.: Conditional likelihood maximisation: a unifying framework for information theoretic feature selection. Journal of Machine Learning Research 13, 27-66 (2012)
Cheng, G.: An exploration of dropout with LSTMs. In: Proceedings of the Annual Conference of the International Speech Communication Association (INTERSPEECH), pp. 1586-1590 (2017)
Dobos, I.: Supplier selection: comparison of DEA models with additive and reciprocal data. Central European Journal of Operations Research 29(2), 447-462 (2021)
Dougherty, J.: Supervised and Unsupervised Discretization of Continuous Features, pp. 194-202 (1995)
Pattern Classification (2001)
Duda, J.: Multi-feature evaluation of financial contagion. Central European Journal of Operations Research 30(4), 1167-1194 (2022)
El Aboudi, N.: Review on wrapper feature selection approaches (2016)
dspace.entity.type: Publication
local.indexed.at: Scopus
person.identifier.scopus-author-id: 58115198400
person.identifier.scopus-author-id: 24766886300
