Fix Kubernetes Pod Crashes with Gatsby

Fix CrashLoopBackOff in Kubernetes when serving Gatsby static sites, covering build failures, memory limits, and Nginx config issues.

Gatsby Pods Crashing in Kubernetes

Gatsby generates static HTML, so Kubernetes crashes usually happen during the build phase or due to misconfigured serving containers.

Build-Phase OOMKilled

Gatsby builds are memory-intensive. If your build pod gets OOMKilled, increase the memory limit:

# build job
resources:
  limits:
    memory: "4Gi"
  requests:
    memory: "2Gi"
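For context, those resources belong in the build Job's container spec. A minimal sketch (the Job name, image, and backoff settings are illustrative):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: gatsby-build
spec:
  backoffLimit: 2              # don't retry a failing build forever
  template:
    spec:
      restartPolicy: Never
      containers:
        - name: build
          image: registry.example.com/gatsby-site-builder:latest
          command: ["npm", "run", "build"]
          resources:
            requests:
              memory: "2Gi"
            limits:
              memory: "4Gi"
```

With `restartPolicy: Never` and a low `backoffLimit`, an OOMKilled build fails visibly instead of looping.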

You can also cap the Node.js heap size in your Dockerfile. Keep it below the container's memory limit so V8 garbage-collects before the kernel OOM killer fires:

ENV NODE_OPTIONS="--max-old-space-size=3072"
RUN npm run build
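The 3072 figure is roughly 75% of the 4Gi container limit, leaving headroom for non-heap memory (native allocations, Buffers, the process itself). A quick way to derive it, assuming a 4096 MiB limit:

```shell
# Sketch: set the V8 old-space cap at ~75% of the container memory limit.
LIMIT_MIB=4096                     # matches the 4Gi limit above
HEAP_MIB=$((LIMIT_MIB * 3 / 4))    # leave 25% headroom for non-heap memory
echo "--max-old-space-size=$HEAP_MIB"
```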

Serving the Static Files

A multi-stage Dockerfile keeps your runtime image small:

FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

FROM nginx:alpine
COPY --from=builder /app/public /usr/share/nginx/html
COPY nginx.conf /etc/nginx/conf.d/default.conf

Common Nginx Mistakes

Forgetting to handle client-side routes causes 404s that look like crashes in monitoring. Serve Gatsby's generated 404.html via error_page with a real 404 status, rather than listing it in try_files (which would return it with a misleading 200):

server {
  listen 80;
  root /usr/share/nginx/html;

  error_page 404 /404.html;

  location / {
    try_files $uri $uri/ =404;
  }
}
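CrashLoopBackOff on the serving side is often a liveness probe hitting a path nginx cannot serve. Probing a file guaranteed to exist in the image avoids false restarts; a sketch for the Deployment's container (paths and timings are illustrative):

```yaml
livenessProbe:
  httpGet:
    path: /index.html     # always present in the built image
    port: 80
  initialDelaySeconds: 5
  periodSeconds: 10
readinessProbe:
  httpGet:
    path: /index.html
    port: 80
  periodSeconds: 5
```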

Environment Variables at Build Time

Gatsby bakes GATSBY_-prefixed environment variables into the static output at build time, so they must be present in the build pod, not the serving pod. A missing variable can fail the build or, worse, silently bake undefined into the bundle:

env:
  - name: GATSBY_API_URL
    valueFrom:
      configMapKeyRef:
        name: gatsby-config
        key: api-url
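To fail fast with a clear message instead of a cryptic build error, a preflight check can run before `npm run build`. A sketch in plain sh (`check_build_env` and the variable list are illustrative, not a Gatsby feature):

```shell
# Preflight: verify required GATSBY_ variables are set before building.
check_build_env() {
  for var in "$@"; do
    eval "val=\${$var:-}"            # indirect lookup of the variable name
    if [ -z "$val" ]; then
      echo "build aborted: missing required variable $var" >&2
      return 1
    fi
  done
}

# e.g. in the build container's entrypoint:
# check_build_env GATSBY_API_URL && npm run build
```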

Connect Bugsly to your Gatsby site's client-side JavaScript to capture runtime errors from users' browsers, complementing your Kubernetes-level monitoring.
