Mastering Deep Learning Fundamentals: Flashcards for Neural Networks

Table of Contents

Deep Learning Flashcards

What is Deep Learning?   drill deep_learning

Front

What is Deep Learning?

Back

Deep Learning is a subset of machine learning that uses artificial neural networks with multiple layers (deep neural networks) to model and process complex patterns in data.
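
A minimal sketch of what "multiple layers" means in practice, using NumPy with made-up layer sizes: each layer applies a learned linear transformation followed by a nonlinearity, and stacking several such layers is what makes the network "deep".

  import numpy as np

  rng = np.random.default_rng(0)

  def relu(x):
      return np.maximum(0.0, x)

  # Hypothetical layer sizes: 4 inputs -> 8 hidden -> 8 hidden -> 2 outputs.
  sizes = [4, 8, 8, 2]
  weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
  biases = [np.zeros(n) for n in sizes[1:]]

  def forward(x):
      # Each layer: linear transform + nonlinearity; stacking them is the "deep" part.
      for W, b in zip(weights[:-1], biases[:-1]):
          x = relu(x @ W + b)
      return x @ weights[-1] + biases[-1]  # linear output layer

  print(forward(rng.normal(size=4)))  # raw scores for the 2 outputs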

Activation Functions   drill deep_learning

Front

Name and describe three common activation functions used in deep learning.

Back

  1. ReLU (Rectified Linear Unit): f(x) = max(0, x). Returns x if positive, otherwise 0.
  2. Sigmoid: f(x) = 1 / (1 + e^(-x)). Squashes input to the range (0, 1); useful for binary classification.
  3. Tanh (Hyperbolic Tangent): f(x) = (e^x - e^(-x)) / (e^x + e^(-x)). Squashes input to the range (-1, 1); often used in hidden layers.
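
A quick NumPy sketch of the three functions above (np.tanh is used directly; the formula above is what it computes):

  import numpy as np

  def relu(x):
      return np.maximum(0.0, x)        # max(0, x), elementwise

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))  # squashes to (0, 1)

  def tanh(x):
      return np.tanh(x)                # squashes to (-1, 1)

  x = np.array([-2.0, 0.0, 2.0])
  print(relu(x), sigmoid(x), tanh(x))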

Backpropagation   drill deep_learning

Front

What is backpropagation in the context of neural networks?

Back

Backpropagation is an algorithm used to train neural networks by calculating gradients of the loss function with respect to the network's weights, allowing for efficient updates to minimize the loss.
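
A hand-rolled sketch for a one-hidden-layer network with a squared-error loss, with made-up shapes: the point is that gradients flow backward through the chain rule, layer by layer.

  import numpy as np

  rng = np.random.default_rng(0)
  x = rng.normal(size=(1, 3))          # one example, 3 features
  y = np.array([[1.0]])                # target
  W1, b1 = rng.normal(0, 0.1, (3, 4)), np.zeros((1, 4))
  W2, b2 = rng.normal(0, 0.1, (4, 1)), np.zeros((1, 1))

  # Forward pass, keeping intermediates for the backward pass.
  z1 = x @ W1 + b1
  h = np.maximum(0.0, z1)              # ReLU
  y_hat = h @ W2 + b2
  loss = 0.5 * np.sum((y_hat - y) ** 2)

  # Backward pass: chain rule from the loss back to each weight.
  d_yhat = y_hat - y                   # dL/dy_hat
  dW2 = h.T @ d_yhat
  db2 = d_yhat
  d_h = d_yhat @ W2.T                  # gradient flowing into the hidden layer
  d_z1 = d_h * (z1 > 0)                # ReLU derivative gates the gradient
  dW1 = x.T @ d_z1
  db1 = d_z1
  # A gradient-descent step would now subtract a multiple of each gradient.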

Convolutional Neural Networks (CNNs)   drill deep_learning

Front

What are Convolutional Neural Networks (CNNs) and what are they primarily used for?

Back

CNNs are a type of deep neural network designed to process grid-like data, such as images. They use convolutional layers to automatically learn hierarchical features from the input data. CNNs are primarily used for image classification, object detection, and other computer vision tasks.
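
A sketch of the core operation, a single-channel "valid" 2D convolution in NumPy (real CNN layers add multiple channels, padding, and stride, but the sliding-window idea is the same):

  import numpy as np

  def conv2d(image, kernel):
      # Slide the kernel over the image; each output pixel is a dot product
      # between the kernel and the image patch under it.
      kh, kw = kernel.shape
      oh = image.shape[0] - kh + 1
      ow = image.shape[1] - kw + 1
      out = np.empty((oh, ow))
      for i in range(oh):
          for j in range(ow):
              out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
      return out

  image = np.arange(25.0).reshape(5, 5)
  edge_kernel = np.array([[1.0, 0.0, -1.0]] * 3)  # crude vertical-edge detector
  print(conv2d(image, edge_kernel))               # 3x3 feature map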

Recurrent Neural Networks (RNNs)   drill deep_learning

Front

What are Recurrent Neural Networks (RNNs) and what types of problems are they suited for?

Back

RNNs are a class of neural networks designed to work with sequential data by maintaining an internal state (memory). They are well-suited for tasks involving time series, natural language processing, and other sequential data problems like speech recognition and machine translation.
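
A minimal vanilla-RNN sketch in NumPy, with made-up sizes: the same weights are applied at every time step, and the hidden state h is the internal memory carried through the sequence.

  import numpy as np

  rng = np.random.default_rng(0)
  input_size, hidden_size = 3, 5
  W_xh = rng.normal(0, 0.1, (input_size, hidden_size))
  W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))
  b_h = np.zeros(hidden_size)

  def rnn(sequence):
      # h_t = tanh(x_t W_xh + h_{t-1} W_hh + b): the recurrence is the memory.
      h = np.zeros(hidden_size)
      for x_t in sequence:
          h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
      return h  # final hidden state summarizes the whole sequence

  sequence = rng.normal(size=(7, input_size))  # 7 time steps
  print(rnn(sequence))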

Gradient Descent   drill deep_learning

Front

Explain the concept of Gradient Descent in deep learning.

Back

Gradient Descent is an optimization algorithm used to minimize the loss function by iteratively moving in the direction of steepest descent. It updates the model's parameters (weights and biases) in the opposite direction of the gradient of the loss function with respect to the parameters.
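
A sketch of the update rule theta <- theta - lr * gradient on a one-dimensional loss L(theta) = (theta - 3)^2, whose gradient is 2(theta - 3):

  # Minimize L(theta) = (theta - 3)^2 by stepping against the gradient.
  theta = 0.0
  learning_rate = 0.1
  for step in range(50):
      gradient = 2.0 * (theta - 3.0)   # dL/dtheta
      theta -= learning_rate * gradient
  print(theta)  # converges toward the minimum at theta = 3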

Overfitting   drill deep_learning

Front

What is overfitting in deep learning and how can it be mitigated?

Back

Overfitting occurs when a model learns the training data too well, including its noise and peculiarities, leading to poor generalization on unseen data. It can be mitigated by:

  1. Using regularization techniques (e.g., L1/L2 regularization)
  2. Applying dropout (sketched below)
  3. Increasing the training data size
  4. Using early stopping
  5. Implementing data augmentation
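
As one concrete example from the list, a sketch of (inverted) dropout in NumPy: at training time each unit is zeroed with probability p and the survivors are rescaled, so no rescaling is needed at inference time.

  import numpy as np

  rng = np.random.default_rng(0)

  def dropout(activations, p=0.5, training=True):
      if not training:
          return activations                       # inference: full network
      mask = rng.random(activations.shape) >= p    # keep with probability 1 - p
      return activations * mask / (1.0 - p)        # rescale the survivors

  h = np.ones(8)
  print(dropout(h, p=0.5))  # roughly half the units zeroed, the rest scaled to 2.0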

Long Short-Term Memory (LSTM)   drill deep_learning

Front

What is Long Short-Term Memory (LSTM) and how does it improve upon traditional RNNs?

Back

LSTM is a type of RNN architecture designed to address the vanishing gradient problem in traditional RNNs. It introduces a memory cell and three gates (input, forget, and output) to better capture long-term dependencies in sequential data. This allows LSTMs to learn and remember information over long sequences more effectively than standard RNNs.
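
A sketch of a single LSTM step in NumPy, showing the three gates and the cell state (the weights are random stand-ins, and real implementations fuse these matrices for speed):

  import numpy as np

  rng = np.random.default_rng(0)
  n_in, n_hid = 3, 4

  def sigmoid(x):
      return 1.0 / (1.0 + np.exp(-x))

  # One weight matrix and bias per gate/candidate, acting on [x, h_prev].
  W = {g: rng.normal(0, 0.1, (n_in + n_hid, n_hid)) for g in "ifoc"}
  b = {g: np.zeros(n_hid) for g in "ifoc"}

  def lstm_step(x, h_prev, c_prev):
      z = np.concatenate([x, h_prev])
      i = sigmoid(z @ W["i"] + b["i"])        # input gate: what to write
      f = sigmoid(z @ W["f"] + b["f"])        # forget gate: what to erase
      o = sigmoid(z @ W["o"] + b["o"])        # output gate: what to expose
      c_tilde = np.tanh(z @ W["c"] + b["c"])  # candidate cell values
      c = f * c_prev + i * c_tilde            # cell state: long-term memory
      h = o * np.tanh(c)                      # hidden state: short-term output
      return h, c

  h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_hid), np.zeros(n_hid))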

Transfer Learning   drill deep_learning

Front

What is Transfer Learning in the context of deep learning?

Back

Transfer Learning is a technique where a model trained on one task is re-purposed on a second related task. It involves using pre-trained models as a starting point for a new task, which can significantly reduce training time and improve performance, especially when limited labeled data is available for the new task.
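
A common sketch of this with PyTorch and torchvision (assuming both are installed, torchvision >= 0.13): load a ResNet-18 pretrained on ImageNet, freeze its weights, and replace the final layer for a hypothetical 10-class task.

  import torch
  import torch.nn as nn
  from torchvision import models

  # Load weights pretrained on ImageNet.
  model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

  # Freeze the pretrained feature extractor.
  for param in model.parameters():
      param.requires_grad = False

  # Replace the classification head for a new, hypothetical 10-class task.
  model.fc = nn.Linear(model.fc.in_features, 10)  # this layer trains from scratch

  # Only the new head's parameters are handed to the optimizer.
  optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)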

Author: Jason Walsh

j@wal.sh

Last Updated: 2024-08-14 06:08:49