
Deep Learning Interview Questions
- What is a Perceptron? Discuss its various components
- What is a Multilayer Perceptron (MLP), also commonly known as a Feedforward Neural Network?
- What do you mean by pretraining, fine-tuning, and transfer learning?
- Describe the training process of a Neural Network model (Forward and Backward propagation); see the NumPy sketch after this list
- What are the key hyper-parameters of a neural network model?
- What is an activation function, and what are the common choices for activation functions?
- What are some options to address overfitting in Neural Networks?
- Why do we add a bias component to a perceptron / neural network?
- What is Rectified Linear Unit (ReLU) activation function? Discuss its advantages and disadvantages
- What is the “dead ReLU” problem, and why is it an issue in Neural Network training?
- What are the vanishing and exploding gradient problems, and how are they typically addressed?
- What do you mean by saturation in neural network training? Discuss the problems associated with it
- Discuss the Softmax activation function; see the softmax sketch after this list
- How does dropout work?
- Explain the Transformer Architecture
- What are the primary advantages of transformer models?
- What are the limitations of transformer models?
- Explain Self-Attention and Masked Self-Attention as used in Transformers; see the attention sketch after this list
- What are recurrent neural networks (RNNs), and what are their applications?
- What is Long Short-Term Memory (LSTM)?
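
For the forward/backward propagation question above, here is a minimal NumPy sketch of one training loop for a tiny two-layer network. The layer sizes, learning rate, and squared-error loss are illustrative assumptions, not prescriptions from the article.

```python
# Minimal sketch: forward and backward propagation for a tiny 2-layer MLP
# on toy data. All sizes, the loss, and the learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 input features, 1 target value each.
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Parameters: input -> hidden (ReLU) -> output (linear).
W1, b1 = rng.normal(size=(3, 5)) * 0.1, np.zeros((1, 5))
W2, b2 = rng.normal(size=(5, 1)) * 0.1, np.zeros((1, 1))
lr = 0.01  # learning rate (a key hyper-parameter)

for step in range(100):
    # ---- Forward propagation ----
    z1 = X @ W1 + b1          # hidden-layer pre-activation
    a1 = np.maximum(0, z1)    # ReLU activation
    y_hat = a1 @ W2 + b2      # linear output layer
    loss = np.mean((y_hat - y) ** 2)

    # ---- Backward propagation (chain rule, by hand) ----
    d_yhat = 2 * (y_hat - y) / len(X)       # dLoss / dy_hat
    dW2 = a1.T @ d_yhat
    db2 = d_yhat.sum(axis=0, keepdims=True)
    d_a1 = d_yhat @ W2.T
    d_z1 = d_a1 * (z1 > 0)                  # ReLU gradient
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # ---- Gradient-descent parameter update ----
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```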
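
For the Softmax question, a short sketch of a numerically stable implementation using the usual max-subtraction trick; the example logits are made up.

```python
# Sketch: softmax turns a vector of logits into a probability distribution.
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    shifted = logits - np.max(logits)   # subtract the max to avoid overflow in exp()
    exps = np.exp(shifted)
    return exps / exps.sum()

print(softmax(np.array([2.0, 1.0, 0.1])))  # ~[0.659, 0.242, 0.099]
```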
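
For the self-attention question, a compact sketch of scaled dot-product attention with an optional causal mask. The dimensions and random projection matrices are illustrative assumptions; real transformer layers add multiple heads, learned weights, and an output projection.

```python
# Sketch: (masked) scaled dot-product self-attention for one sequence.
import numpy as np

def self_attention(X, Wq, Wk, Wv, causal=False):
    """X: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_head = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_head)               # (seq_len, seq_len)
    if causal:
        # Masked self-attention: position i may only attend to positions j <= i.
        mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)
    # Row-wise softmax over the attention scores.
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    return weights @ V                               # (seq_len, d_head)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                          # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv, causal=True)
```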
Relevant articles:
- What is Deep Learning? Discuss its key characteristics, working and applications
- What are the advantages and disadvantages of Deep Learning?
- How do Deep Learning methods compare with traditional Machine Learning methods?
- Explain the basic architecture of a Neural Network, model training and key hyper-parameters
- What are Transformers? Discuss the evolution and major breakthroughs in transformer models
- What are Language Models?
- What is Natural Language Processing (NLP)? List the different types of NLP tasks
- Top 100 Machine Learning Interview Questions and Answers (All Free)