# Top 30+ Machine Learning - Exploring the Model questions and answers


1. What is the function that takes the input and maps it to the output variable called?

1. Map Function
2. None of the options
3. Hypothesis Function
4. Model Function

2. What is the process of dividing each feature by its range called?

1. Feature Scaling
2. None of the options
3. Feature Dividing
4. Range Dividing
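
Question 2's "dividing each feature by its range" can be sketched in a few lines of Python; `scale_by_range` is a hypothetical helper name, not part of any library:

```python
def scale_by_range(values):
    # Feature scaling: divide each value by the feature's range (max - min),
    # so the spread of the scaled feature becomes exactly 1.
    rng = max(values) - min(values)
    return [v / rng for v in values]

scaled = scale_by_range([10, 20, 30, 40])  # range is 30
```

In practice the minimum is usually subtracted first as well (min-max scaling), which maps the feature into [0, 1].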

3. Problems that predict real-valued outputs are called __________.

1. Classification Problems
2. Regression Problems
3. Real Valued Problems
4. Greedy Problems

4. The result of scaling is a variable in the range [1, 10].

1. False
2. True

5. The objective function for linear regression is also known as the Cost Function.

1. False
2. True

6. What is the Learning Technique in which the right answer is given for each example in the data called?

1. Unsupervised Learning
2. Supervised Learning
3. Reinforcement Learning

7. Output variables are also known as feature variables.

1. False
2. True

8. Input variables are also known as feature variables.

1. False
2. True

9. ____________ controls the magnitude of a step taken during Gradient Descent.

1. Parameter
2. Step Rate
3. Momentum
4. Learning Rate

10. The cost function in linear regression is also called the squared error function.

1. False
2. True
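
Questions 5 and 10 both refer to the squared error cost function. A minimal sketch for a one-variable hypothesis, assuming the conventional 1/(2m) scaling (the names `cost`, `theta0`, `theta1` are illustrative):

```python
def cost(theta0, theta1, xs, ys):
    # Squared error cost J = (1/2m) * sum((h(x) - y)^2), where the
    # hypothesis h(x) = theta0 + theta1 * x maps inputs to outputs.
    m = len(xs)
    return sum((theta0 + theta1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

perfect = cost(0.0, 2.0, [1, 2, 3], [2, 4, 6])  # a perfect fit has zero cost
```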

11. For different parameters of the hypothesis function, we get the same hypothesis function.

1. False
2. True

12. How are the parameters updated during the Gradient Descent process?

1. Sequentially
2. Simultaneously
3. Not updated
4. One at a time
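
Questions 9 and 12 fit together: the learning rate scales each step, and the parameters are updated simultaneously from their old values. A sketch of one batch gradient descent step for the squared error cost (`gradient_step` and `alpha` are hypothetical names):

```python
def gradient_step(theta0, theta1, xs, ys, alpha=0.1):
    # One step of batch gradient descent; alpha is the learning rate that
    # controls the magnitude of the step.
    m = len(xs)
    errors = [theta0 + theta1 * x - y for x, y in zip(xs, ys)]
    grad0 = sum(errors) / m
    grad1 = sum(e * x for e, x in zip(errors, xs)) / m
    # Simultaneous update: both gradients are computed from the old theta0
    # and theta1 before either parameter is overwritten.
    return theta0 - alpha * grad0, theta1 - alpha * grad1

theta0, theta1 = 0.0, 0.0
for _ in range(2000):
    theta0, theta1 = gradient_step(theta0, theta1, [0, 1, 2], [1, 3, 5])
# theta0, theta1 approach 1 and 2, recovering the line y = 1 + 2x
```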

1. For ____________, the error is determined by getting the proportion of values misclassified by the model.

1. Classification
2. Clustering
3. None of the options
4. Regression

2. High values of threshold are good for the classification problem.

1. True
2. False

3. Underfit data has a high variance.

1. True
2. False

4. The ____________ function is used as a mapping function for classification problems.

1. Linear
2. Sigmoid
3. Convex
4. Concave

5. Classification problems with just two classes are called Binary classification problems.

1. True
2. False

6. Where does the sigmoid function asymptote?

1. -1 and +1
2. 0 and 1
3. -inf and +inf
4. 0 and inf

7. A lower decision boundary leads to false positives during classification.

1. False
2. True

8. Linear Regression is an optimal function that can be used for classification problems.

1. False
2. True

9. For ____________, the error is calculated by finding the sum of squared distances between actual and predicted values.

1. Regression
2. None of the options
3. Classification
4. Clustering

10. I have a scenario where my hypothesis fits my training set well but fails to generalize to the test set. What is this scenario called?

1. Underfitting
2. Generalization Failure
3. Overfitting
4. None of the options

11. What is the range of the output values for a sigmoid function?

1. [0, 0.5]
2. [-inf, +inf]
3. [0, 1]
4. [0, inf]
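
Questions 4, 6, and 11 all concern the sigmoid. A minimal sketch showing why its output stays strictly between 0 and 1, asymptoting to 0 and 1 at the extremes:

```python
import math

def sigmoid(z):
    # Logistic (sigmoid) function: maps any real input into (0, 1),
    # approaching 0 as z -> -inf and 1 as z -> +inf, with sigmoid(0) = 0.5.
    return 1.0 / (1.0 + math.exp(-z))

values = [sigmoid(z) for z in (-50, 0, 50)]  # near 0, exactly 0.5, near 1
```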

12. ____________ is the line that separates y = 0 and y = 1 in a logistic function.

1. Divider
2. None of the options
3. Separator
4. Decision Boundary

13. Reducing the number of features can reduce overfitting.

1. False
2. True

14. A suggested approach for evaluating the hypothesis is to split the data into a training set and a test set.

1. True
2. False
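
Question 14's suggested approach can be sketched with the standard library alone; `train_test_split` is a hypothetical helper (the name mirrors the well-known scikit-learn function, but this is not its API):

```python
import random

def train_test_split(data, test_ratio=0.3, seed=42):
    # Shuffle a copy of the data, then hold out a test_ratio fraction as a
    # test set for evaluating the hypothesis on unseen examples.
    shuffled = data[:]
    random.Random(seed).shuffle(shuffled)
    cut = len(shuffled) - round(len(shuffled) * test_ratio)
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(10)))  # 7 training, 3 test examples
```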

15. Overfitting and Underfitting are applicable only to linear regression problems.

1. True
2. False

16. Overfit data has high bias.

1. False
2. True

## ML Exploring the Model - Final Quiz

1. For an underfit data set, the training and the cross-validation error will be high.

1. True
2. False

2. For an overfit data set, the cross-validation error will be much bigger than the training error.

1. True
2. False

3. Problems where discrete-valued outputs are predicted are called __________.

1. Real Valued Problems
2. Classification Problems
3. Greedy Problems
4. Regression Problems

4. What measures the extent to which the predictions change between various realizations of the model?

1. Deviation
2. Bias
3. Variance
4. Difference
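
The variance in the final question can be made concrete: refit the same model class on resampled versions of the data and measure how much the prediction at one fixed point changes between fits. A sketch for simple linear regression, with hypothetical helper names `fit_line` and `prediction_variance`:

```python
import random

def fit_line(xs, ys):
    # Least-squares fit of y = intercept + slope * x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    denom = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / denom
    return my - slope * mx, slope

def prediction_variance(xs, ys, x_query, n_models=200, seed=0):
    # Variance (in the bias/variance sense): spread of the predictions at
    # x_query across models fitted on different bootstrap resamples.
    rng = random.Random(seed)
    n = len(xs)
    preds = []
    for _ in range(n_models):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
        b, s = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
        preds.append(b + s * x_query)
    mean = sum(preds) / len(preds)
    return sum((p - mean) ** 2 for p in preds) / len(preds)
```

On noiseless linear data every resample recovers the same line, so the variance is essentially zero; on noisy data the fits (and hence the predictions) swing with the particular sample drawn, which is the hallmark of a high-variance, overfit model.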