Mixture of Experts with Entropic Regularization for Data Classification

dc.contributor.author: Peralta Márquez, Billy
dc.contributor.author: Saavedra, Ariel
dc.contributor.author: Caro Saldivia, Luis
dc.contributor.author: Soto, Alvaro
dc.date: 2019
dc.date.accessioned: 2021-04-30T16:47:50Z
dc.date.available: 2021-04-30T16:47:50Z
dc.description.abstract: Today, there is growing interest in the automatic classification of a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. Mixture-of-experts is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gate network, typically based on softmax functions, that allows the model to capture complex patterns in the data. In this scheme, each data point is influenced by only one expert; as a result, training can be misguided on real datasets in which complex data need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network, in order to avoid a winner-takes-all output from the gate. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3-6% on some datasets. In future work, we plan to embed feature selection into this model.
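The objective summarized in the abstract can be sketched concretely. The Python/NumPy snippet below is illustrative only, not the authors' implementation: it assumes a softmax gate over K experts and softmax expert classifiers, and the names (moe_entropic_loss, lam for the entropy weight) are hypothetical. The entropy term is subtracted from the negative log-likelihood, so higher gate entropy lowers the loss, which is what discourages a winner-takes-all gate.

# Minimal sketch (not the authors' code) of an entropy-regularized
# mixture-of-experts classification objective. All parameter names,
# including the penalty weight 'lam', are illustrative assumptions.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_entropic_loss(x, y, gate_w, expert_ws, lam=0.1):
    """Mixture-of-experts negative log-likelihood, penalized by the
    Shannon entropy of the gating network (subtracted, so softer gate
    outputs are rewarded and winner-takes-all gating is discouraged).

    x:         (n, d) inputs
    y:         (n,) integer class labels in [0, C)
    gate_w:    (d, K) gate parameters (softmax gate over K experts)
    expert_ws: list of K (d, C) expert parameters (softmax classifiers)
    lam:       entropy penalty weight (hypothetical hyperparameter)
    """
    n = x.shape[0]
    g = softmax(x @ gate_w)                         # (n, K) gate weights
    # Each expert's class posterior, stacked to shape (n, K, C).
    p = np.stack([softmax(x @ w) for w in expert_ws], axis=1)
    # Mixture posterior: sum_k g_k(x) * p_k(y|x)  ->  (n, C)
    mix = np.einsum('nk,nkc->nc', g, p)
    nll = -np.log(mix[np.arange(n), y] + 1e-12).mean()
    # Shannon entropy of the gate outputs, averaged over the batch.
    gate_entropy = -(g * np.log(g + 1e-12)).sum(axis=1).mean()
    return nll - lam * gate_entropy

# Illustrative usage with random data (d=5 features, C=3 classes, K=2 experts):
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 5))
y = rng.integers(0, 3, size=8)
gate_w = rng.normal(size=(5, 2))
expert_ws = [rng.normal(size=(5, 3)) for _ in range(2)]
print(moe_entropic_loss(x, y, gate_w, expert_ws, lam=0.1))

Setting lam = 0 recovers the regular mixture-of-experts objective; larger values of lam trade classification fit for softer gate outputs.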
dc.identifier.citation: ENTROPY, Vol. 21, No. 2, Art. 190, 2019
dc.identifier.doi: 10.3390/e21020190
dc.identifier.uri: http://repositoriodigital.uct.cl/handle/10925/3580
dc.language.iso: en
dc.publisher: MDPI
dc.source: ENTROPY
dc.subject.english: mixture-of-experts
dc.subject.english: regularization
dc.subject.english: entropy
dc.subject.english: classification
dc.title: Mixture of Experts with Entropic Regularization for Data Classification
dc.type: Article
uct.catalogador: WOS
uct.indizacion: SCI