1. How is machine learning deployed in real-world scenarios?

2. What are the various aspects of a Machine Learning process?

3. Explain the life cycle of a data science project.

4. What are the types of Distribution?

5. What is the difference between a vector and a matrix?
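
A quick way to make the distinction concrete is with NumPy array dimensions (an illustrative sketch; the values are arbitrary):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])          # vector: 1-D array, shape (3,)
M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])        # matrix: 2-D array, shape (2, 3)

print(v.shape)   # (3,)
print(M.shape)   # (2, 3)
print(M @ v)     # matrix-vector product -> a vector of shape (2,)
```

A vector has one axis (just length); a matrix has two (rows and columns), and matrix-vector multiplication maps one vector space to another.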

6. Give examples of discrete and continuous distributions.

7. What is the difference between correlation and covariance?
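
The distinction is easy to demonstrate numerically: covariance depends on the scale of the variables, while correlation is covariance normalized by the standard deviations, so it is unit-free and bounded in [-1, 1]. An illustrative sketch:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0                      # perfectly linearly related to x

cov_xy = np.cov(x, y)[0, 1]            # covariance: depends on units/scale
corr_xy = np.corrcoef(x, y)[0, 1]      # correlation: scale-free, in [-1, 1]

print(cov_xy)    # 5.0 (doubles if x is measured in half-units)
print(corr_xy)   # 1.0 (unchanged by rescaling)
```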

8. Difference between Linear and Logistic Regression?

9. What are the drawbacks of the linear model?

10. What is ‘Training set’ and ‘Test set’?

11. What is a Random Variable?

12. Which one would you prefer to choose – model accuracy or model performance?

13. How much data will you allocate for your training, validation and test sets?
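
A common (though by no means universal) answer is a 60/20/20 split. A minimal sketch with NumPy, assuming 100 observations and a shuffled index:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
idx = rng.permutation(n)               # shuffle indices before splitting

train_idx = idx[:60]                   # 60% training
val_idx = idx[60:80]                   # 20% validation
test_idx = idx[80:]                    # 20% test

print(len(train_idx), len(val_idx), len(test_idx))  # 60 20 20
```

The exact ratios depend on data size: with very large datasets, far smaller validation and test fractions are often enough.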

14. What is the area under the normal curve?

15. What are some alternative models to a linear regression? Why are they better or worse?

16. What does P-value signify about the statistical data?

17. How can outlier values be treated?
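
One common treatment is to cap (winsorize) values outside the Tukey fences at Q1 - 1.5*IQR and Q3 + 1.5*IQR. A sketch with illustrative data:

```python
import numpy as np

data = np.array([10.0, 12.0, 11.0, 13.0, 12.0, 95.0])  # 95 is an outlier

q1, q3 = np.percentile(data, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

capped = np.clip(data, lower, upper)   # winsorize: cap extreme values
print(upper)    # 15.0
print(capped)   # the 95 is pulled in to 15.0
```

Other options include dropping the rows, transforming the variable (e.g. log), or using models that are robust to outliers.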

18. During analysis, how do you treat missing values?

19. What is the difference between a Test Set and a Validation Set?

20. What are the conditions for Omitted Variable Bias and how does it affect the coefficient estimates? Why? What are some fixes for OVB?

21. How do you interpret the coefficients in a log-log model? Why?

22. What does the Gauss-Markov Theorem say and why is it important?

23. How does heteroscedasticity affect the coefficient estimates and why? What are some fixes for heteroscedasticity?

24. Given a regression setting with a binary response variable, what probability model should be used and why?

25. What happens to the errors in the logistic regression function?

26. Design a regression model to test the Law of Demand.

27. Explain the Central Limit Theorem to a five year-old. (This one gave me the most trouble).

28. Why do the residuals from a linear regression add up to 0?
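
With an intercept in the model, the normal equations force the residual vector to be orthogonal to every column of the design matrix, including the column of ones, so the residuals must sum to zero. A quick numeric check on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=50)
y = 3.0 * x + 1.0 + rng.normal(size=50)

X = np.column_stack([np.ones_like(x), x])    # design matrix WITH intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta

print(residuals.sum())   # ~0, up to floating-point error
```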

29. Is this still true if you fit a regression without intercept?

30. What’s so bad about collinearity?

31. What will happen when you fit a degree-4 polynomial in linear regression?

32. What will happen when you fit a degree-2 polynomial in linear regression?

33. What is the difference between Supervised Learning and Unsupervised Learning?

34. How will you find the correlation between a categorical variable and a continuous variable?

35. What is the difference between squared error and absolute error?

36. What is K-means? How can you select K for K-means?
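
One common heuristic for choosing K is the elbow method: plot the within-cluster sum of squares (inertia) against K and pick the point where the decrease levels off. A minimal, illustrative K-means from scratch (Lloyd's algorithm) on two well-separated blobs:

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Minimal k-means: returns centroids and within-cluster SSE (inertia)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    inertia = ((X - centroids[labels]) ** 2).sum()
    return centroids, inertia

# two well-separated blobs -> inertia drops sharply at k=2 (the "elbow")
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])
inertias = [kmeans(X, k)[1] for k in (1, 2, 3)]
print(inertias)
```

The silhouette score and the gap statistic are more formal alternatives to eyeballing the elbow.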

37. What do you expect will happen with bias and variance as you increase the size of training data?

38. How do you define goodness of fit?

39. What do coefficient estimates mean?

40. How do you measure fit of the model? What do R and D mean?

41. What is Ordinary Least Squares?

42. What are some possible problems with regression models? How do you avoid or compensate for them?

43. Name a few types of regression you are familiar with? What are the differences?

44. What is overfitting a regression model? What are ways to avoid it?

45. In linear regression, under what conditions does R^2 equal exactly 1?

46. How do you perform a regression?

47. Why do you perform a regression?

48. What are the cons of performing a regression?

49. How many variables should you use? What are the downfalls of using too many or too few variables?

50. How do you do feature generation?

51. Is it possible to design a linear regression algorithm using a neural network?

52. What is Systematic Sampling?

53. What are the types of biases that can occur during sampling?

54. What is power analysis?

55. What is a p-value and how is it used for variable selection?
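
A p-value is the probability, under the null hypothesis, of a result at least as extreme as the one observed. A permutation test makes this concrete without any distributional formulas (synthetic data; we count how often a random relabeling of the groups beats the observed mean difference):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 40)        # group A
b = rng.normal(1.0, 1.0, 40)        # group B (true mean shift of 1)

observed = b.mean() - a.mean()
pooled = np.concatenate([a, b])

# permutation test: how often does a random relabeling beat the observed gap?
count = 0
n_perm = 2000
for _ in range(n_perm):
    rng.shuffle(pooled)
    diff = pooled[40:].mean() - pooled[:40].mean()
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / n_perm
print(p_value)   # small -> the observed gap is unlikely under chance alone
```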

56. What are an eigenvalue and an eigenvector?
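
The defining property is that multiplying by the matrix only rescales an eigenvector, by its eigenvalue: A v = λ v. A quick check with NumPy (illustrative diagonal matrix):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
lam, v = eigvals[0], eigvecs[:, 0]

# defining property: A v = lambda v (direction unchanged, only scaled)
print(np.allclose(A @ v, lam * v))   # True
print(sorted(eigvals))               # [2.0, 3.0]
```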

57. Why do the residuals from a linear regression add up to 0?

58. Is this still true if you fit a regression without intercept?

59. Difference between collinearity and correlation?

60. Explain the equation of the logistic regression model.
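
The model puts the log-odds on a linear scale: log(p / (1 - p)) = b0 + b1*x, equivalently p = 1 / (1 + e^-(b0 + b1*x)). A minimal sketch with illustrative coefficients:

```python
import math

def sigmoid(z):
    """Logistic function: maps the linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

b0, b1 = -1.0, 2.0          # illustrative coefficients
x = 0.5
p = sigmoid(b0 + b1 * x)    # P(y = 1 | x)

print(sigmoid(0.0))  # 0.5 -> a log-odds of 0 means a 50/50 probability
print(p)             # also 0.5 here, since b0 + b1*x = 0
```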

61. What is Standard error?

62. How is VIF calculated and how is it interpreted? Do we remove the intercept while calculating VIF?
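
VIF_j = 1 / (1 - R²_j), where R²_j comes from regressing predictor j on all the other predictors; the intercept is kept in that auxiliary regression, and VIF is reported only for the predictors, not the intercept. A sketch with NumPy and synthetic data:

```python
import numpy as np

def vif(X, j):
    """VIF of column j: regress X[:, j] on the remaining columns (+ intercept)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(X)), others])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.1, size=200)   # nearly collinear with x1
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])

print(vif(X, 0))   # large: x1 is almost a copy of x2
print(vif(X, 2))   # near 1: x3 is independent of the others
```

A common rule of thumb flags VIF above 5 or 10 as a sign of problematic multicollinearity.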

63. What is Gradient Descent?
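
Gradient descent iteratively steps opposite the gradient: w <- w - lr * f'(w). A minimal sketch minimizing f(w) = (w - 3)^2, whose gradient is 2(w - 3):

```python
# Minimize f(w) = (w - 3)^2 with plain gradient descent.
w = 0.0          # starting point
lr = 0.1         # learning rate (step size)
for _ in range(100):
    grad = 2.0 * (w - 3.0)   # gradient of f at the current w
    w -= lr * grad           # step downhill

print(w)   # converges toward the minimizer w = 3
```

Too large a learning rate diverges; too small a rate converges slowly, which is why the step size matters so much in practice.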

64. I have two models of comparable accuracy and computational performance. Which one should I choose for production and why?

65. What is the importance of having a selection bias?

66. What is the difference between Bayesian Estimate and Maximum Likelihood Estimation (MLE)?

67. What is the advantage of performing dimensionality reduction before fitting an SVM?

68. How do you ensure you’re not overfitting with a model?

69. What is multicollinearity and how can you overcome it?

70. What are the differences between overfitting and underfitting?

71. How can you overcome Overfitting?

72. Name an example where ensemble techniques might be useful.

73. How would you develop a model to identify plagiarism?

74. Why does L1 regularization cause parameter sparsity whereas L2 regularization does not?
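
The usual intuition: the coordinate-wise (proximal) update for the L1 penalty is soft-thresholding, which maps small weights exactly to zero, while the L2 update shrinks multiplicatively and never reaches zero. A sketch of the two one-dimensional updates:

```python
def l1_prox(w, lam):
    """Soft-thresholding: the L1 update can set a weight exactly to zero."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0

def l2_shrink(w, lam):
    """Ridge-style update: multiplicative shrinkage, never exactly zero."""
    return w / (1.0 + lam)

print(l1_prox(0.3, 0.5))    # 0.0 -> small weights are zeroed out (sparsity)
print(l2_shrink(0.3, 0.5))  # 0.2 -> shrunk, but still nonzero
```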

75. What are the various classification algorithms?

76. How is kNN different from k-means clustering?

77. What is the “curse of dimensionality” and how do you tackle it?

78. How do you handle categorical features in your dataset?

79. What is the other term for best fit line?

80. What are the essential steps in a predictive modeling project?

81. What are the applications of predictive modeling?

82. Define the observation and performance windows.

83. Difference between Factor Analysis and PCA?

84. Explain Dimensionality / Variable Reduction Techniques

85. When should you use classification over regression?

86. What is Fisher Scoring in Logistic Regression?

87. How to validate cluster analysis?

88. How would you handle an imbalanced dataset?

89. What kind of problems does regularization solve?

90. In what areas is pattern recognition used?

91. Which are the popular R packages for decision trees?

92. What’s the difference between Type I and Type II error?

93. What’s a Fourier transform?

94. What’s the difference between probability and likelihood?

95. How would you implement a recommendation system for our company’s users?

96. How do you think Google obtains training data for its self-driving cars?

97. What cross-validation technique would you use on a time series dataset?
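
Standard k-fold shuffles away the temporal order, so for time series a walk-forward (expanding window) scheme is preferred: always train on the past and test on the block that follows. A minimal sketch (sklearn's TimeSeriesSplit implements the same idea):

```python
def expanding_window_splits(n, n_splits, test_size):
    """Walk-forward CV: train on the past only, test on the block that follows."""
    splits = []
    for i in range(n_splits):
        test_end = n - (n_splits - 1 - i) * test_size
        test_start = test_end - test_size
        train_idx = list(range(0, test_start))      # everything before the fold
        test_idx = list(range(test_start, test_end))
        splits.append((train_idx, test_idx))
    return splits

# 10 observations, 3 folds of 2 test points each
for train_idx, test_idx in expanding_window_splits(10, 3, 2):
    print(train_idx, test_idx)
```

Every test index comes strictly after every training index, so the evaluation never leaks future information into the model.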

98. How is a decision tree pruned?

99. When is Ridge regression favorable over Lasso regression?

100. Both being tree-based algorithms, how is random forest different from the gradient boosting machine (GBM)?

101. In what areas is pattern recognition used?

# Data Science Interview questions – Part 1

Posted Sunday, 10 Jun 2018 in Big Data, Data Science, Hadoop, Interview Questions, Python, R, RStudio, Uncategorized

Here is the first list of interview questions (more to come in the next posts; please subscribe to get notified as and when the posts are published):
