Mixture of Experts with Entropic Regularization for Data Classification

datacite.alternateIdentifier.citation: Entropy, 21 (2), 2019
datacite.alternateIdentifier.doi: 10.3390/e21020190
datacite.creator: Peralta, Billy
datacite.creator: Saavedra, Ariel
datacite.creator: Caro, Luis
datacite.creator: Soto, Alvaro
datacite.date: 2019
datacite.rights: Open access
datacite.subject: mixture-of-experts
datacite.subject: regularization
datacite.subject: entropy
datacite.subject: classification
datacite.title: Mixture of Experts with Entropic Regularization for Data Classification
dc.description.abstract: Today, there is growing interest in the automatic classification of a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. Mixture-of-experts is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gate network, typically based on softmax functions, that learns complex patterns in the data. In this scheme, each data point is influenced by only one expert; as a result, training can be misguided on real datasets in which complex data points need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network, so as to avoid a winner-takes-all output of the gate. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3-6% on some datasets. In future work, we plan to embed feature selection into this model.
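As a quick illustration of the idea described in the abstract, the entropy-penalized objective can be sketched as follows. This is a minimal sketch in our own notation, not taken from the paper: $g_k(x)$ denotes the softmax output of the gating network for expert $k$, $P_k(y \mid x)$ the class posterior of expert $k$, and $\lambda \ge 0$ an assumed regularization weight.

$$
\mathcal{L} \;=\; -\sum_{n=1}^{N} \log \sum_{k=1}^{K} g_k(x_n)\, P_k(y_n \mid x_n) \;-\; \lambda \sum_{n=1}^{N} H\big(g(x_n)\big),
\qquad
H\big(g(x)\big) \;=\; -\sum_{k=1}^{K} g_k(x) \log g_k(x).
$$

Under this sign convention, minimizing $\mathcal{L}$ favors gate distributions with higher Shannon entropy, which discourages the winner-takes-all gating behavior the abstract warns about; the exact formulation in the published paper may differ.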
dc.description.ia_keyword: classification, experts, model, data, mixture, network, datasets
dc.format: PDF
dc.identifier.issn: 1099-4300
dc.identifier.uri: https://repositoriodigital.uct.cl/handle/10925/3580
dc.language.iso: en
dc.publisher: Multidisciplinary Digital Publishing Institute (MDPI)
dc.relation: instname: ANID
dc.relation: reponame: Repositorio Digital RI2.0
dc.rights.driver: info:eu-repo/semantics/openAccess
dc.source: Entropy
dc.subject.ia_ods: SDG 8: Decent work and economic growth
dc.type.driver: info:eu-repo/semantics/article
dc.type.driver: http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.openaire: info:eu-repo/semantics/publishedVersion
dspace.entity.type: Publication
oaire.citationEdition: 2019
oaire.citationIssue: 2
oaire.citationTitle: Entropy
oaire.citationVolume: 21
oaire.fundingReference: ANID FONDECYT 11140892 (Iniciación)
oaire.licenseCondition: Work under a Creative Commons Attribution 4.0 International license
oaire.licenseCondition.uri: https://creativecommons.org/licenses/by/4.0/
oaire.resourceType: Article
oaire.resourceType.en: Article
uct.catalogador: jvu
uct.comunidad: Ingeniería
uct.departamento: Departamento de Ingeniería Informática
uct.facultad: Facultad de Ingeniería
uct.indizacion: Science Citation Index Expanded - SCIE
uct.indizacion: Scopus
uct.indizacion: DOAJ
uct.indizacion: Inspec
Files
Original bundle
Name: Peralta et al. - 2019 - Entropy - Mixture of Experts with Entrop.pdf
Size: 3.52 MB
Format: Adobe Portable Document Format