Mixture of Experts with Entropic Regularization for Data Classification
| datacite.alternateIdentifier.citation | Entropy, 21 (2), 2019 | |
| datacite.alternateIdentifier.doi | 10.3390/e21020190 | |
| datacite.creator | Peralta, Billy | |
| datacite.creator | Saavedra, Ariel | |
| datacite.creator | Caro, Luis | |
| datacite.creator | Soto, Alvaro | |
| datacite.date | 2019 | |
| datacite.rights | Open access | |
| datacite.subject | mixture-of-experts | |
| datacite.subject | regularization | |
| datacite.subject | entropy | |
| datacite.subject | classification | |
| datacite.title | Mixture of Experts with Entropic Regularization for Data Classification | |
| dc.description.abstract | Today, there is growing interest in the automatic classification of a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. Mixture-of-experts is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gating network, typically based on softmax functions, that can learn complex patterns in the data. In this scheme, each data point is influenced by only one expert; as a result, training can be misguided on real datasets in which complex data need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network, so as to avoid a winner-takes-all output of the gating network. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3-6% on some datasets. In future work, we plan to embed feature selection into this model. | |
| dc.description.ia_keyword | classification, experts, model, data, mixture, network, datasets | |
| dc.format | | |
| dc.identifier.issn | 1099-4300 | |
| dc.identifier.uri | https://repositoriodigital.uct.cl/handle/10925/3580 | |
| dc.language.iso | en | |
| dc.publisher | Multidisciplinary Digital Publishing Institute (MDPI) | |
| dc.relation | instname: ANID | |
| dc.relation | reponame: Repositorio Digital RI2.0 | |
| dc.rights.driver | info:eu-repo/semantics/openAccess | |
| dc.source | Entropy | |
| dc.subject.ia_ods | SDG 8: Decent work and economic growth | |
| dc.type.driver | info:eu-repo/semantics/article | |
| dc.type.driver | http://purl.org/coar/resource_type/c_2df8fbb1 | |
| dc.type.openaire | info:eu-repo/semantics/publishedVersion | |
| dspace.entity.type | Publication | |
| oaire.citationEdition | 2019 | |
| oaire.citationIssue | 2 | |
| oaire.citationTitle | Entropy | |
| oaire.citationVolume | 21 | |
| oaire.fundingReference | ANID FONDECYT 11140892 (Iniciación) | |
| oaire.licenseCondition | Work licensed under Creative Commons Attribution 4.0 International | |
| oaire.licenseCondition.uri | https://creativecommons.org/licenses/by/4.0/ | |
| oaire.resourceType | Artículo | |
| oaire.resourceType.en | Article | |
| uct.catalogador | jvu | |
| uct.comunidad | Ingeniería | en_US |
| uct.departamento | Departamento de Ingeniería Informática | |
| uct.facultad | Facultad de Ingeniería | |
| uct.indizacion | Science Citation Index Expanded - SCIE | |
| uct.indizacion | Scopus | |
| uct.indizacion | DOAJ | |
| uct.indizacion | Inspec |
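The abstract describes penalizing the classification cost with the Shannon entropy of the gating network so that the gate does not collapse to a winner-takes-all assignment. As a rough illustration of that idea (not the paper's actual implementation; all function and variable names here are hypothetical, and linear gate/expert models are assumed for simplicity), the regularized objective can be sketched as:

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_entropic_loss(x, y, gate_W, expert_Ws, lam=0.1):
    """Mixture-of-experts negative log-likelihood minus lam times the
    mean Shannon entropy of the gating distribution; the entropy term
    rewards spreading gate mass over experts instead of a single winner."""
    g = softmax(x @ gate_W)                                  # (n, K) gate probs
    p = np.stack([softmax(x @ W) for W in expert_Ws], axis=1)  # (n, K, C) expert posteriors
    mix = np.einsum('nk,nkc->nc', g, p)                      # (n, C) mixture prediction
    nll = -np.log(mix[np.arange(len(y)), y] + 1e-12).mean()  # classification cost
    entropy = -(g * np.log(g + 1e-12)).sum(axis=1).mean()    # gate Shannon entropy
    return nll - lam * entropy
```

With `lam = 0`, this reduces to the standard mixture-of-experts likelihood; a positive `lam` trades a small amount of fit for a more balanced gating distribution, which is the effect the abstract attributes to the entropic regularizer.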
Files

Original bundle
- Name: Peralta et al. - 2019 - Entropy - Mixture of Experts with Entrop.pdf
- Size: 3.52 MB
- Format: Adobe Portable Document Format