- Title
- Improving deep forest by confidence screening
- Creator
- Pang, Ming; Ting, Kaiming; Zhao, Peng; Zhou, Zhi-Hua
- Date
- 2018
- Type
- Text; Conference proceedings
- Identifier
- http://researchonline.federation.edu.au/vital/access/HandleResolver/1959.17/169295
- Identifier
- vital:13989
- Identifier
- https://doi.org/10.1109/ICDM.2018.00158
- Identifier
- 1550-4786 (ISSN); 978-1-5386-9159-5 (ISBN)
- Abstract
- Most studies about deep learning are based on neural network models, where many layers of parameterized nonlinear differentiable modules are trained by backpropagation. Recently, it has been shown that deep learning can also be realized by non-differentiable modules trained without backpropagation, an approach called deep forest. Its representation learning process is based on a cascade of cascades of decision tree forests, whose high memory requirement and high time cost inhibit the training of large models. In this paper, we propose a simple yet effective approach to improve the efficiency of deep forest. The key idea is to pass instances with high confidence directly to the final stage rather than passing them through all the levels. We also provide a theoretical analysis suggesting a means to vary the model complexity from low to high as the level increases in the cascade, which further reduces the memory requirement and time cost. Our experiments show that the proposed approach achieves highly competitive predictive performance while reducing time cost and memory requirement by up to one order of magnitude.
- Publisher
- IEEE
- Relation
- 2018 IEEE International Conference on Data Mining (ICDM); Singapore, Singapore; 17th-20th November 2018; p. 1194-1199
- Rights
- Copyright © 2018 IEEE.
- Rights
- This metadata is freely available under a CC0 license
- Subject
- Classification; Confidence screening; Deep forest; Ensemble methods
- Full Text
- Reviewed
File | Description | Size | Format
---|---|---|---
SOURCE1 | Accepted version | 426 KB | Adobe Acrobat PDF