Interpretability and explainability methods for tree-based ensemble models: theoretical developments and economic applications
Scientific-Disciplinary Group
13/STAT-01 - Statistics
Description
This project explores the intersection of advanced machine learning and decision-making transparency. Tree-based ensemble models (such as Random Forest and XGBoost) deliver strong predictive performance, but their "black-box" nature often hinders their adoption in highly regulated sectors. This research aims to develop and refine Explainable AI (XAI) methods that make these models interpretable without compromising accuracy. The theoretical framework investigates novel feature attribution metrics, while the applied component validates these tools on economic datasets, addressing complex problems such as credit scoring, financial market forecasting, and public policy analysis.
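The feature attribution idea central to the project can be illustrated with a minimal sketch: path-based attribution for a single decision tree (the "Saabas" method, a precursor of TreeSHAP). The tree, the feature names ("income", "debt"), and all numbers below are hypothetical illustrations, not taken from the project.

```python
# Minimal sketch of path-based feature attribution for one decision tree
# (the "Saabas" method, a precursor of TreeSHAP). Tree and data are toy.

def attribute(tree, x):
    """Walk x's decision path, crediting each change in the node's mean
    prediction to the feature split on at that node."""
    contributions = {}
    node = tree
    bias = node["value"]                 # mean prediction at the root
    while "feature" in node:             # descend until a leaf is reached
        f = node["feature"]
        child = node["left"] if x[f] <= node["threshold"] else node["right"]
        contributions[f] = contributions.get(f, 0.0) + child["value"] - node["value"]
        node = child
    return bias, contributions           # bias + sum(contributions) = prediction

# Toy credit-scoring tree: "value" is the default probability at each node.
tree = {
    "value": 0.5, "feature": "income", "threshold": 40.0,
    "left": {                            # low income -> higher risk
        "value": 0.8, "feature": "debt", "threshold": 10.0,
        "left": {"value": 0.6},          # low debt lowers risk
        "right": {"value": 0.9},
    },
    "right": {"value": 0.2},
}

bias, contribs = attribute(tree, {"income": 30.0, "debt": 5.0})
print(bias, contribs)  # 0.5 {'income': 0.3..., 'debt': -0.2...}
```

By construction, the per-feature contributions plus the root bias recover the model's prediction exactly, which is the additivity property that TreeSHAP also preserves while correcting the path dependence of this simpler scheme.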
Job posting website
https://www.unina.it/it/ateneo/concorsi-e-borse-di-studio/incarichi-di-ricerca
Funding body
MUR
How to apply
Other
View the original posting on the MUR website.