Interpretability and explainability methods for tree-based ensemble models: theoretical developments and economic applications

Position: Post-doctoral research appointment
Institute: University of Naples
Posted on: 07/05/2026 Deadline: 01/06/2026

Scientific-Disciplinary Group

13/STAT-01 - Statistics

Description

This project explores the intersection of advanced machine learning and decision-making transparency. Although tree-based ensemble models (such as Random Forest and XGBoost) deliver strong predictive performance, their "black-box" nature often hinders their adoption in highly regulated sectors. This research aims to develop and refine Explainable AI (XAI) methods that make these models interpretable without compromising accuracy. The theoretical component investigates novel feature attribution metrics, while the applied component validates these tools on economic datasets, addressing complex problems such as credit scoring, financial market forecasting, and public policy analysis.
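To give a flavour of the kind of feature attribution the project refers to, the sketch below implements permutation importance, one standard model-agnostic attribution method: a feature's importance is measured as the increase in prediction error when that feature's values are shuffled. This is an illustrative example only, not the project's proposed method; the toy predictor `ensemble_predict` stands in for a fitted tree ensemble and is entirely hypothetical.

```python
import random

# Hypothetical stand-in for a fitted tree ensemble: output depends
# strongly on feature 0 and only weakly on feature 1 (illustration only).
def ensemble_predict(x):
    return 3.0 * x[0] + 0.5 * x[1]

def mse(model, X, y):
    """Mean squared error of a predictor over a dataset."""
    return sum((model(x) - t) ** 2 for x, t in zip(X, y)) / len(X)

def permutation_importance(model, X, y, n_repeats=20, seed=0):
    """Importance of feature j = average increase in MSE when
    column j is randomly shuffled, breaking its link to the target."""
    rng = random.Random(seed)
    baseline = mse(model, X, y)
    importances = []
    for j in range(len(X[0])):
        drops = []
        for _ in range(n_repeats):
            col = [x[j] for x in X]
            rng.shuffle(col)
            X_perm = [x[:j] + [v] + x[j + 1:] for x, v in zip(X, col)]
            drops.append(mse(model, X_perm, y) - baseline)
        importances.append(sum(drops) / n_repeats)
    return importances

# Synthetic data: targets generated by the toy model itself.
data_rng = random.Random(1)
X = [[data_rng.uniform(-1, 1), data_rng.uniform(-1, 1)] for _ in range(200)]
y = [ensemble_predict(x) for x in X]
imp = permutation_importance(ensemble_predict, X, y)
# Feature 0 should receive a much larger importance than feature 1.
```

Methods like SHAP refine this idea with game-theoretic attributions that are especially efficient to compute for tree ensembles, which is one reason such models are a natural target for the XAI work described above.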

Funding body

MUR

How to apply

Other