BA5106
STATISTICS FOR MANAGEMENT
UNIT-V-CORRELATION AND REGRESSION
MULTIPLE CHOICE QUESTIONS
1. The covariance is
(A) A measure of the strength of relationship between two variables
(B) Dependent on the units of measurement of the variables
(C) An unstandardized version of the correlation coefficient
(D) All of these
2. Rank the score of 5 in the following set of scores:
9, 3, 5, 10, 8, 5, 9, 7, 3, 4
(A) 3
(B) 4
(C) 4.5
(D) 6
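Note: a minimal sketch of how the tied scores in Question 2 receive an averaged rank, assuming SciPy is available and that ranks are assigned from the lowest score upward (the data are taken from the question itself):

```python
# Sketch: average ranks for tied scores, as in Question 2.
from scipy.stats import rankdata

scores = [9, 3, 5, 10, 8, 5, 9, 7, 3, 4]
ranks = rankdata(scores)             # method='average' by default: tied values share the mean of their positions
print(dict(zip(scores, ranks)))      # the two 5s occupy positions 4 and 5, so each gets rank 4.5
```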
3. Which of the following statistical tests allows causal inferences to be made?
(A) Analysis of variance
(B) Regression
(C) None of these, it’s the design of the research that determines whether causal inferences can be made
(D) t-test
4. $R^2$ is known as the
(A) Multiple correlation coefficient.
(B) Partial correlation coefficient.
(C) Coefficient of determination.
(D) Semi-partial correlation coefficient.
5. Which of the following statements about outliers is not true?
(A) Outliers are values very different from the rest of the data.
(B) Outliers have an effect on the mean.
(C) Influential cases will always show up as outliers.
(D) Outliers have an effect on regression parameters.
6. For which regression assumption does the Durbin–Watson statistic test?
(A) Independence of errors
(B) Linearity
(C) Multicollinearity
(D) Homoscedasticity
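Note: a hedged illustration of Question 6 (the data below are invented, and statsmodels is assumed to be installed). The Durbin-Watson statistic is computed from the residuals of a fitted regression; values near 2 are consistent with independent (uncorrelated) errors:

```python
# Sketch: Durbin-Watson statistic on the residuals of an OLS fit (illustrative data).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2 * x + rng.normal(size=100)      # hypothetical data, not from the question bank

model = sm.OLS(y, sm.add_constant(x)).fit()
print(durbin_watson(model.resid))     # values close to 2 suggest independence of errors
```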
7. Which of the following is not an assumption for Pearson’s correlation analysis?
(A) Normally distributed variables
(B) Monotonic relationship
(C) Linear relationship
(D) Constant variance
8. What is the primary purpose of Pearson’s and Spearman’s correlation coefficients?
(A) Identifying deviations from normality for continuous variables
(B) Examining the relationship between two categorical variables
(C) Examining the relationship between two non-categorical variables
(D) Comparing means across groups
9. Which of the following would be considered a very strong negative correlation?
(A) 0.89
(B) -0.9
(C) 0.09
(D) -0.89
10. Which test is used to determine whether a correlation coefficient is statistically significant?
(A) Paired samples t-test
(B) Chi-squared test
(C) P-value
(D) One-sample t-test
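Note: a minimal sketch related to Question 10 (hypothetical data; SciPy assumed). In practice the significance of a correlation coefficient is judged from the p-value of a test statistic computed from r, which scipy.stats.pearsonr reports directly:

```python
# Sketch: correlation coefficient together with its two-sided p-value.
from scipy.stats import pearsonr

x = [2, 4, 6, 8, 10, 12]
y = [1, 3, 2, 7, 9, 8]
r, p = pearsonr(x, y)
print(r, p)                           # a small p-value indicates a statistically significant r
```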
11. Which of the following is not an assumption for simple linear regression?
(A) Normally distributed variables
(B) Constant variance
(C) Multicollinearity
(D) Linear relationship
12. Continuous predictors influence the _________ of the regression line, while categorical predictors influence the _____________
(A) p-value, $R^2$
(B) $R^2$, p-value
(C) intercept, slope
(D) slope, intercept
13. Which of the following is true about the adjusted $R^2$?
(A) It is usually smaller than the $R^2$
(B) It is usually larger than the $R^2$
(C) It is only used when there is just one predictor
(D) It is used to determine whether residuals are normally distributed
14. The $R^2$ is the squared correlation of which two values?
(A) y and each continuous x
(B) y and the predicted values of y
(C) b and t
(D) b and se
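Note: a quick numeric check of Question 14 (invented data; statsmodels assumed). The $R^2$ reported by a fitted model equals the squared correlation between the observed y and the predicted values of y:

```python
# Sketch: R-squared equals corr(y, y_hat) squared.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 3 + 1.5 * x + rng.normal(size=50)  # hypothetical data

fit = sm.OLS(y, sm.add_constant(x)).fit()
r_y_yhat = np.corrcoef(y, fit.fittedvalues)[0, 1]
print(fit.rsquared, r_y_yhat ** 2)     # the two numbers agree
```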
15. The following are elements of correlation, except
(A) There should be three or more variables
(B) The change in the value of one affects another
(C) There should be a relationship among them
(D) There should be only curvilinear relations among variables
16. If, with a fall in the value of one variable, the value of another variable rises in the same proportion, then they are said to be
(A) None
(B) Both
(C) Negatively correlated
(D) Positively correlated
17. Two variables are said to be positively correlated when, with a ________ in the value of one variable, the value of the other variable also ________
(A) Rise, Falls
(B) No change, Rises
(C) Fall, Rises
(D) Fall, Falls
18. If the coefficient of correlation is exactly equal to -1, then it indicates
(A) Simple correlation
(B) Negative correlation
(C) Positive correlation
(D) Multiple correlation
19. When the correlation is only studied between two variables it is called
(A) Simple correlation
(B) Positive correlation
(C) Multiple correlation
(D) Negative correlation
20. Multiple correlation is
(A) When the correlation is only studied between four variables
(B) When the correlation is studied between three or more variables
(C) When the correlation is only studied between two variables
(D) When the correlation is only studied between three variables
21. If the ratio of change between the two variables is constant, then there will be
(A) Non-linear correlation
(B) Linear correlation
(C) Negative correlation
(D) Positive correlation
22. While drawing a scatter diagram if all points appear to form a straight line going downward from left to right, then it is inferred that there is _________
(A) No correlation
(B) Simple positive correlation
(C) Perfect positive correlation
(D) Perfect negative correlation
23. Correlation coefficient is denoted by
(A) co
(B) l
(C) c
(D) r
24. Who was a great biometrician and statistician?
(A) Kally Pearson
(B) Kaerl Pearson
(C) Karl Pearson
(D) Kal Pearson
25. When r = 1, there is
(A) Perfect +ve relationship between the variables
(B) Perfect -ve relationship between the variables
(C) No relationship between the variables
(D) None
26. What is the range of the simple correlation coefficient?
(A) 1 < r < 1
(B) -1 < r < 1
(C) 1 > r > 1
(D) 1 < r > 1
27. Which method of measuring correlation measures any type of relationship?
(A) Karl Pearson’s coefficient of correlation
(B) Spearman’s rank correlation.
(C) Both
(D) None
28. If cov(x,y) = 0, then
(A) x and y are correlated
(B) x and y are uncorrelated
(C) x and y are linearly related
(D) None
29. The correlation coefficient is independent of change of
(A) Scale
(B) Origin
(C) Scale and origin
(D) None
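Note: a small numeric check of Question 29 (hypothetical data). The correlation coefficient does not change when the origin and the scale of either variable are changed, provided the scale factors are positive:

```python
# Sketch: r is invariant under a change of origin and (positive) scale.
import numpy as np

x = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y = np.array([2.0, 6.0, 5.0, 9.0, 12.0])

u = 10 + 2 * x                         # new origin and scale for x
v = -4 + 0.5 * y                       # new origin and scale for y
print(np.corrcoef(x, y)[0, 1], np.corrcoef(u, v)[0, 1])   # identical values
```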
30. Rank correlation was introduced by
(A) Galton
(B) Spearman
(C) Fisher
(D) Pearson
31. The number of observations in regression analysis is considered as
(A) Degree of average
(B) Degree of possibility
(C) Degree of freedom
(D) Degree of variance
32. If all the conditions or assumptions of simple regression analysis are satisfied, it can give
(A) Dependent Estimation
(B) Independent Estimation
(C) Reliable Estimates
(D) Unreliable Estimates
33. The standard error in regression analysis is known as the
(A) Average of coefficient
(B) Mean of residual
(C) Variance of residual
(D) Average of residual
34. In regression analysis, testing whether the assumptions are true or not is classified as
(A) Specification Analysis
(B) Significance Analysis
(C) Average Analysis
(D) Weighted Analysis
35. The correlation coefficient is
(A) $r(X,Y)=\frac{\sigma_x \sigma_y}{cov(x,y)}$
(B) $r(X,Y)=\frac{cov(x,y)}{\sigma_x \sigma_y }$
(C) $r(X,Y)=\frac{cov(x,y)}{ \sigma_y }$
(D) $r(X,Y)=\frac{cov(x,y)}{\sigma_x }$
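Note: a minimal sketch of the formula in Question 35 (hypothetical data). Dividing the covariance by the product of the two standard deviations, with matching population (ddof=0) moments, reproduces NumPy's own correlation coefficient:

```python
# Sketch: r(X, Y) = cov(X, Y) / (sigma_x * sigma_y).
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 7.0, 9.0])

r = np.cov(x, y, ddof=0)[0, 1] / (x.std() * y.std())
print(r, np.corrcoef(x, y)[0, 1])      # both give the same value
```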
36. The variable which influences the values of another variable or is used for prediction is called
(A) Dependent Variable
(B) Independent Variable
(C) Explained Variable
(D) Regressed
37. The correlation coefficient is
(A) $r=\pm b_{xy}\times b_{yx}$
(B) $r=\pm \sqrt{b_{xy}+ b_{yx}}$
(C) $r=\pm \sqrt{b_{xy} -b_{yx}}$
(D) $r=\pm \sqrt{b_{xy}\times b_{yx}}$
38. The regression coefficient of X on Y is
(A) $b_{xy}=\frac{N \sum dx dy+ (\sum dx)(\sum dy)}{N \sum dy^2-(\sum dy)^2}$
(B) $b_{xy}=\frac{N \sum dx dy-(\sum dx)(\sum dy)}{N \sum dy^2-(\sum dy)^2}$
(C) $b_{yx}=\frac{N \sum dx dy+(\sum dx)(\sum dy)}{N \sum dy^2-(\sum dy)^2}$
(D) $b_{yx}=\frac{N \sum dx dy+(\sum dx)(\sum dy)}{N \sum dy^2+(\sum dy)^2}$
39. The regression coefficient of Y on X is
(A) $b_{yx}=\frac{N \sum dx dy+ (\sum dx)(\sum dy)}{N \sum dx^2-(\sum dx)^2}$
(B) $b_{yx}=\frac{N \sum dx dy-(\sum dx)(\sum dy)}{N \sum dx^2-(\sum dx)^2}$
(C) $b_{xy}=\frac{N \sum dx dy+(\sum dx)(\sum dy)}{N \sum dx^2-(\sum dx)^2}$
(D) $b_{xy}=\frac{N \sum dx dy+(\sum dx)(\sum dy)}{N \sum dx^2+(\sum dx)^2}$
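Note: a short sketch tying Questions 37-39 together (hypothetical data). The deviation formulas for the regression coefficients reduce to cov(x,y)/var(y) for $b_{xy}$ and cov(x,y)/var(x) for $b_{yx}$, and their product equals $r^2$, so $r=\pm \sqrt{b_{xy}\times b_{yx}}$:

```python
# Sketch: b_yx * b_xy = r^2, hence r = +/- sqrt(b_xy * b_yx).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

cov_xy = np.cov(x, y, ddof=0)[0, 1]
b_yx = cov_xy / x.var()                # regression coefficient of Y on X
b_xy = cov_xy / y.var()                # regression coefficient of X on Y
r = np.corrcoef(x, y)[0, 1]
print(r ** 2, b_xy * b_yx)             # equal
```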
40. When one regression coefficient is negative, the other would be
(A) Negative
(B) Positive
(C) Zero
(D) None of these
41. If X and Y are two variates, there can be at most
(A) One regression line
(B) Three regression lines
(C) Two regression lines
(D) More regression lines
42. The lines of regression of X and Y estimate
(A) Y for a given value of X
(B) X for a given value of Y
(C) X from Y and Y from X
(D) None of these
43. A scatter diagram of the variate values (X,Y) gives an idea about
(A) Functional relationship
(B) Regression model
(C) Distribution of errors
(D) No relation
44. If two variables move in a decreasing direction, then the correlation is
(A) Perfect negative
(B) Negative
(C) Positive
(D) No correlation
45. The lines of regression intersect at the point
(A) (X,Y)
(B) (0,0)
(C) $(\sigma_x,\sigma_y)$
(D) $(\bar{X},\bar{Y})$
46. The coefficient of correlation describes
(A) Only magnitude
(B) Only direction
(C) No magnitude and no direction
(D) The magnitude and direction
47. If the regression coefficient of Y on X is 2, then the regression coefficient of X on Y is
(A) $> \frac{1}{2}$
(B) $\leq \frac{1}{2}$
(C) 2
(D) 1
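Note: Question 47 follows from the relation between the two regression coefficients: their product equals $r^2$, which can never exceed 1, so with $b_{yx}=2$ the other coefficient is bounded by $b_{xy}=\frac{r^2}{b_{yx}}\le \frac{1}{2}$.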
48. Regression analysis confined to the study of only two variables at a time is called _________
(A) Linear regression
(B) Multiple regression
(C) Simple regression
(D) Non linear regression
49. Regression analysis confined to the study of more than two variables at a time is called _________
(A) Linear regression
(B) Simple regression
(C) Multiple regression
(D) Non linear regression
50. In _________, the relationship between the two variables x and y is linear
(A) Multiple regression
(B) Simple regression
(C) Linear regression
(D) Non linear regression