When something’s broken in a container, logs are the first place to look. I probably run docker logs a dozen times a day when debugging local services.

Basic log viewing

View logs:

docker logs my-container

Follow logs in real time:

docker logs -f my-container

Search through logs:

docker logs my-container | grep 'error'
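
One gotcha: docker logs writes the container's stderr stream to your terminal's stderr, so a plain pipe only filters stdout. If the errors you're after go to stderr, redirect it first:

docker logs my-container 2>&1 | grep 'error'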

Page through logs:

docker logs my-container | less

If you don’t know the container name:

docker ps

View logs of a stopped container:

docker ps -a  # find the container ID
docker logs abc123

Tail the last 100 lines:

docker logs --tail 100 my-container
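
The flags combine, and following just the recent output with timestamps is usually what I actually want:

docker logs -f -t --tail 100 my-container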

Understanding log drivers

Docker’s default logging driver is json-file, which stores logs as JSON on your host machine. This works well for development, but there are other options depending on your needs:

  • json-file: Default, stores logs locally
  • journald: Sends logs to systemd journal (useful on systems using systemd)
  • syslog: Forwards to syslog daemon
  • none: Disables logging entirely (I’ve used this for extremely chatty containers where logs aren’t useful)
  • awslogs, gcplogs: Send directly to cloud logging services
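
Any of these can also be set per container at run time. For that chatty-container case:

docker run --log-driver none myapp

Or, to forward a single container to a syslog daemon (the address here is just a placeholder):

docker run --log-driver syslog --log-opt syslog-address=udp://localhost:514 myapp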

You can configure the log driver in your docker-compose.yml:

services:
  app:
    image: myapp
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

This configuration keeps logs manageable: Docker rotates the file once it reaches 10MB and keeps at most 3 files, so each container uses roughly 30MB of log space at most.
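
If you'd rather set this once for the whole host instead of per service, the same options go in /etc/docker/daemon.json. Note it only applies to containers created after you restart the daemon:

{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}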

When logs get too big

I once had a development container that filled up 50GB of disk space with logs because I left it running over a long weekend. Painful lesson. Here’s what I learnt:

Set rotation limits. The max-size and max-file options (shown above) prevent logs from consuming all your disk space. I now set these on all my local development containers.

Use docker logs --since. If logs are massive, you can filter by time:

docker logs --since 10m my-container  # last 10 minutes
docker logs --since 2025-12-17T09:00:00 my-container  # since specific time
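
Newer Docker versions also support --until, which lets you pull out a specific window:

docker logs --since 2025-12-17T09:00:00 --until 2025-12-17T10:00:00 my-container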

Check log file locations. JSON logs are stored at /var/lib/docker/containers/<container-id>/<container-id>-json.log on Linux. On macOS with Docker Desktop, they’re inside the VM.
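
To find (and measure) a specific container's log file without hunting through that directory:

docker inspect --format '{{.LogPath}}' my-container
sudo du -h "$(docker inspect --format '{{.LogPath}}' my-container)"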

Production considerations

In production, I don’t rely on docker logs at all. Logs should go to a centralised logging system like:

  • Cloud provider logging (CloudWatch, Cloud Logging, Azure Monitor)
  • ELK stack (Elasticsearch, Logstash, Kibana)
  • Loki with Grafana
  • Datadog, New Relic, or similar APM tools

Container logs are ephemeral - once a container is removed, so are its logs (unless you’ve configured persistent storage). Centralised logging solves this and makes it much easier to search across multiple containers.
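
As a rough sketch, here's what shipping straight to CloudWatch looks like with the awslogs driver. The region and log group names are placeholders, and the host needs AWS credentials with permission to write to that group:

services:
  app:
    image: myapp
    logging:
      driver: "awslogs"
      options:
        awslogs-region: "eu-west-1"
        awslogs-group: "my-app-logs"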

For more on Docker logging, the Docker logging documentation covers all the drivers and configuration options in detail. The logging best practices guide is also worth a read if you’re setting up production systems.