What Is Mini-Batch? Meaning, Applications & Example

Small subset of training data used in each iteration.

What is Mini-Batch?

Mini-batch refers to a small subset of the entire training dataset used in each iteration of the training process. Instead of training on the entire dataset at once (batch gradient descent) or one example at a time (stochastic gradient descent), mini-batch training strikes a balance between the two, improving both the efficiency and convergence speed of machine learning algorithms, particularly in deep learning.
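
To make the trade-off concrete, here is a rough back-of-the-envelope comparison of how many weight updates each regime performs per pass over the data (the dataset size and the batch size of 32 are purely illustrative):

N = 1000                         # number of training examples (illustrative)
updates_full_batch = N // N      # 1 update per epoch: the whole dataset at once
updates_stochastic = N // 1      # 1000 updates per epoch: one example at a time
updates_mini_batch = N // 32     # 31 updates per epoch with a batch size of 32
print(updates_full_batch, updates_stochastic, updates_mini_batch)  # 1 1000 31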

Benefits of Mini-Batch

  1. Faster Training: Mini-batch allows parallel processing of multiple data points, speeding up training compared to stochastic gradient descent.
  2. Memory Efficiency: By processing smaller batches, it reduces memory consumption compared to using the full dataset.
  3. Better Generalization: Mini-batch introduces a degree of randomness that can help avoid local minima, potentially leading to better generalization (see the sketch after this list).
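
The randomness in the third point usually comes from reshuffling the dataset before it is split into mini-batches at every epoch. A minimal sketch of that splitting step, using NumPy and made-up sizes:

import numpy as np

X = np.random.default_rng(0).normal(size=(1000, 5))   # 1000 examples, 5 features (synthetic)
batch_size = 100
indices = np.random.permutation(len(X))                # fresh random order for this epoch
mini_batches = [X[indices[i:i + batch_size]]           # each mini-batch holds 100 examples
                for i in range(0, len(X), batch_size)]
print(len(mini_batches), mini_batches[0].shape)        # 10 (100, 5)

Because the examples grouped into each batch change from epoch to epoch, the gradient computed on a batch is a slightly noisy estimate of the full-dataset gradient, which is where the regularizing randomness comes from.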

Example of Mini-Batch in Training

In deep learning, you typically divide your dataset into mini-batches. For example, if you have 1000 data points and you choose a mini-batch size of 100, you’ll update the model’s weights 10 times for each full pass over the dataset (epoch).

# Mini-batch gradient descent for linear regression (NumPy sketch)
import numpy as np

X = np.random.default_rng(0).normal(size=(1000, 5))  # 1000 data points, 5 features
y = X @ np.arange(1.0, 6.0)                          # synthetic targets
weights = np.zeros(5)
learning_rate, batch_size, num_epochs = 0.1, 100, 10

for epoch in range(num_epochs):
    for start in range(0, len(X), batch_size):       # 10 weight updates per epoch
        Xb, yb = X[start:start + batch_size], y[start:start + batch_size]
        # Gradient of the mean squared error on this mini-batch
        gradient = 2 * Xb.T @ (Xb @ weights - yb) / len(Xb)
        weights -= learning_rate * gradient          # update weights
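
In practice, deep learning frameworks handle the batching and shuffling for you. A minimal sketch using PyTorch's DataLoader, assuming PyTorch is installed and the same synthetic 1000-example setup as above:

import torch
from torch.utils.data import DataLoader, TensorDataset

X = torch.randn(1000, 5)                     # 1000 synthetic examples, 5 features
y = torch.randn(1000)                        # synthetic targets
loader = DataLoader(TensorDataset(X, y), batch_size=100, shuffle=True)

for epoch in range(10):
    for batch_X, batch_y in loader:          # yields 10 shuffled mini-batches per epoch
        pass                                 # forward pass, loss, backward, optimizer step

Setting shuffle=True reshuffles the data at the start of every epoch, so each epoch sees a different grouping of examples into mini-batches.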
