Decision trees are easy to interpret and have relatively few hyperparameters to tune. Because they are nonparametric and split on feature thresholds rather than fitted coefficients, they are also robust to outliers. On the other hand, decision trees are highly vulnerable to overfitting. You can mitigate this with ensemble methods such as boosted trees or random forests, which combine many trees to reduce variance.
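As a minimal sketch of the overfitting point, the snippet below (assuming scikit-learn is available; the dataset is synthetic and illustrative) compares a single unpruned decision tree with a random forest on noisy data. The tree typically fits the training set perfectly while generalizing worse than the ensemble.

```python
# Sketch: single decision tree vs. random forest on noisy synthetic data.
# Assumes scikit-learn is installed; dataset parameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# flip_y=0.2 injects label noise, which an unpruned tree will memorize.
X, y = make_classification(n_samples=1000, n_features=20, flip_y=0.2,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unpruned tree grows until it fits the training data exactly.
tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# A random forest averages many decorrelated trees, smoothing out noise.
forest = RandomForestClassifier(n_estimators=100,
                                random_state=0).fit(X_train, y_train)

print("tree   train/test:", tree.score(X_train, y_train),
      tree.score(X_test, y_test))
print("forest train/test:", forest.score(X_train, y_train),
      forest.score(X_test, y_test))
```

The large gap between the tree's training and test accuracy is the overfitting described above; the forest narrows that gap by averaging over many trees trained on bootstrapped samples.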