Geurts, Pierre ; Université de Liège > Dept. of Electrical Engineering, Electronics and Computer Science (Montefiore Institute) > Algorithmics of systems interacting with the physical world
Wehenkel, Louis ; Université de Liège > Dept. of Electrical Engineering, Electronics and Computer Science (Montefiore Institute) > Systems and modeling
Blockeel H, De Raedt L, Ramon J (1998) Top-down induction of clustering trees. In: Proceedings of the fifteenth international conference on machine learning, ICML ’98. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, pp 55–63
Breiman L (1996) Bagging predictors. Mach Learn 24(2):123–140
Brown G, Wyatt J, Harris R, Yao X (2005) Diversity creation methods: a survey and categorisation. Inf Fusion 6(1):5–20 (special issue on diversity in multiple classifier systems)
Buntine W (1992) Learning classification trees. Stat Comput 2:63–73
Chipman HA, George EI, McCulloch RE (1998) Bayesian CART model search. J Am Stat Assoc 93(443):935–960
Chipman HA, George EI, McCulloch RE (2010) BART: Bayesian additive regression trees. Ann Appl Stat 4(1):266–298
Freund Y, Schapire RE (1997) A decision-theoretic generalization of on-line learning and an application to boosting. J Comput Syst Sci 55(1):119–139
Geurts P (2002) Contributions to decision tree induction: bias/variance tradeoff and time series classification. PhD thesis, University of Liège, Belgium
Geurts P, Wehenkel L (2005) Closed-form dual perturb and combine for tree-based models. In: Proceedings of the 22nd international conference on machine learning, ICML ’05. ACM, New York, NY, USA, pp 233–240
Geurts P, Olaru C, Wehenkel L (2001) Improving the bias/variance tradeoff of decision trees: towards soft tree induction. Eng Intell Syst 9:195–204
Geurts P, Ernst D, Wehenkel L (2006a) Extremely randomized trees. Mach Learn 63(1):3–42
Geurts P, Wehenkel L, d’Alché-Buc F (2006b) Kernelizing the output of tree-based methods. In: Cohen WW, Moore A (eds) Proceedings of the 23rd international conference on machine learning, ICML ’06. ACM international conference proceeding series, vol 148, pp 345–352
Ho TK (1998) The random subspace method for constructing decision forests. IEEE Trans Pattern Anal Mach Intell 20(8):832–844
Ioannou Y, Robertson D, Zikic D (2016) Decision forests, convolutional networks and the models in-between. arXiv:1603.01250
Jordan MI (1994) Hierarchical mixtures of experts and the EM algorithm. Neural Comput 6:181–214
Kocev D, Vens C, Struyf J, Džeroski S (2013) Tree ensembles for predicting structured outputs. Pattern Recognit 46(3):817–833
Kontschieder P, Fiterau M, Criminisi A, Rota Bulò S (2015) Deep neural decision forests. In: ICCV, pp 1467–1475
Kuncheva LI, Whitaker CJ (2003) Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Mach Learn 51(2):181–207
Lakshminarayanan B, Roy DM, Teh YW (2013) Top-down particle filtering for Bayesian decision trees. In: ICML, pp 280–288
Louppe G, Geurts P (2012) Ensembles on random patches. In: Flach PA, Bie TD, Cristianini N (eds) ECML/PKDD (1), vol 7523. Lecture notes in computer science. Springer, Berlin, pp 346–361
Taddy M, Chen CS, Yu J, Wyle M (2015) Bayesian and empirical Bayesian forests. In: ICML, pp 967–976
Olaru C, Wehenkel L (2003) A complete fuzzy decision tree technique. Fuzzy Sets Syst 138(2):221–254
Quadrianto N, Ghahramani Z (2014) A very simple safe-Bayesian random forest. IEEE Trans Pattern Anal Mach Intell 37(6):1297–1303
Segal M, Xiao Y (2011) Multivariate random forests. Wiley Interdiscip Rev Data Min Knowl Discov 1(1):80–87
Sethi IK (1990) Entropy nets: from decision trees to neural networks. Proc IEEE 78(10):1605–1613. doi:10.1109/5.58346
Yildiz OT, Alpaydin E (2013) Regularizing soft decision trees. In: Information sciences and systems 2013: proceedings of the 28th international symposium on computer and information sciences, ISCIS 2013, Paris, France, October 28–29, 2013, pp 15–21