Dimension Reduction Forests: Local Variable Importance Using Structured Random Forests

Joshua Daniel Loyal, Ruoqing Zhu, Yifan Cui, Xin Zhang

Research output: Contribution to journal › Article › peer-review


Random forests are one of the most popular machine learning methods due to their accuracy and variable importance assessment. However, random forests only provide variable importance in a global sense. There is an increasing need for such assessments at a local level, motivated by applications in personalized medicine, policy-making, and bioinformatics. We propose a new nonparametric estimator that pairs the flexible random forest kernel with local sufficient dimension reduction to adapt to a regression function’s local structure. This allows us to estimate a meaningful directional local variable importance measure at each prediction point. We develop a computationally efficient fitting procedure and provide sufficient conditions for the recovery of the splitting directions. We demonstrate significant accuracy gains of our proposed estimator over competing methods on simulated and real regression problems. Finally, we apply the proposed method to seasonal particulate matter concentration data collected in Beijing, China, which yields meaningful local importance measures. The methods presented here are available in the drforest Python package. Supplementary materials for this article are available online.
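The abstract describes pairing a forest-based kernel with local sufficient dimension reduction to recover a directional importance measure at each prediction point. As a rough illustration of that idea (not the drforest API), the sketch below substitutes a Gaussian kernel for the random forest kernel and takes the normalized slope of a kernel-weighted local linear fit as the local direction; all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.normal(size=(n, p))
# Regression function depends only on the direction x1 + x2.
y = np.sin(X[:, 0] + X[:, 1]) + 0.1 * rng.normal(size=n)

def local_direction(x0, X, y, bandwidth=1.0):
    """Unit-norm slope of a kernel-weighted local linear fit at x0.

    Stand-in for the paper's forest-kernel + local SDR estimator:
    the Gaussian weights play the role of the random forest kernel.
    """
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2.0 * bandwidth ** 2))
    Xc = X - np.average(X, axis=0, weights=w)
    yc = y - np.average(y, weights=w)
    # Weighted least squares with a small ridge term for stability.
    A = Xc.T @ (w[:, None] * Xc) + 1e-8 * np.eye(X.shape[1])
    beta = np.linalg.solve(A, Xc.T @ (w * yc))
    return beta / np.linalg.norm(beta)

d = local_direction(np.zeros(p), X, y)
```

Near the origin the gradient of sin(x1 + x2) points along (1, 1, 0, 0, 0), so the recovered unit direction concentrates its weight on the first two coordinates; this is the kind of directional local importance the abstract refers to, computed here with a much cruder kernel than the forest-based one.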

Original language: English (US)
Pages (from-to): 1104-1113
Number of pages: 10
Journal: Journal of Computational and Graphical Statistics
Issue number: 4
State: Published - 2022


Keywords
  • Random forests
  • Sufficient dimension reduction
  • Variable importance

ASJC Scopus subject areas

  • Discrete Mathematics and Combinatorics
  • Statistics and Probability
  • Statistics, Probability and Uncertainty

