Prepare for the Society of Actuaries PA Exam with our comprehensive quiz. Study with multiple-choice questions, each providing hints and explanations. Gear up for success!

Each practice test/flash card set has 50 randomly selected questions from a bank of over 500. You'll get a new set of questions each time!

What is a common drawback of Decision Trees?

  1. They require normalization of data

  2. Sensitivity to noise and overfitting

  3. Difficulty in understanding the model's decisions

  4. Relying heavily on linear relationships

The correct answer is: Sensitivity to noise and overfitting

A key drawback of Decision Trees is their sensitivity to noise and their tendency to overfit the data. Decision Trees build models by splitting data into subsets based on feature values, which can lead to complex trees that capture noise rather than the underlying patterns in the data. This complexity often results from making too many splits, especially with small sample sizes, producing a model that doesn't generalize well to new, unseen data. Overfitting occurs when the model describes random error or noise instead of the underlying relationship, leading to poor performance on validation or test datasets.

To mitigate this, techniques such as pruning, setting a maximum depth for the tree, or using ensemble methods like Random Forests can improve the model's robustness and ability to generalize.

The other options, while relevant to different modeling techniques, do not specifically apply to Decision Trees in the same way. Normalization is primarily a concern for algorithms that rely on distance calculations or are sensitive to feature scaling. Understanding the decisions made by a Decision Tree is generally more straightforward than with complex models like neural networks. Lastly, Decision Trees do not rely on linear relationships; in fact, one of their strengths is the ability to capture non-linear relationships.
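The overfitting behavior described above can be seen directly with scikit-learn. The sketch below (illustrative only; the data, seed, and parameter choices are assumptions, not part of the exam question) fits an unconstrained tree and a depth-limited tree to noisy data: the unconstrained tree memorizes the noisy training labels, while the depth limit acts as the mitigation mentioned above.

```python
# Illustrative sketch: comparing an unpruned Decision Tree to a depth-limited
# one on noisy data. Data and parameters are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
# The true label follows a simple non-linear rule (an interaction of two
# features), with 15% of labels randomly flipped to simulate noise.
y = (X[:, 0] * X[:, 1] > 0).astype(int)
flip = rng.random(300) < 0.15
y[flip] = 1 - y[flip]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Unconstrained tree: grows until every leaf is pure, memorizing the noise.
deep = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
# Depth-limited tree: max_depth caps complexity, a simple form of pre-pruning.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print("deep train accuracy:   ", deep.score(X_tr, y_tr))     # memorizes noise
print("shallow train accuracy:", shallow.score(X_tr, y_tr))  # cannot memorize
print("deep test accuracy:    ", deep.score(X_te, y_te))
print("shallow test accuracy: ", shallow.score(X_te, y_te))
```

The unconstrained tree typically reaches perfect training accuracy while the depth-limited tree does not; on held-out data the gap narrows or reverses, which is exactly the overfitting pattern the explanation describes. Similar control is available via `min_samples_leaf` or cost-complexity pruning (`ccp_alpha`).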