Failed to solve: failed to read dockerfile: open Dockerfile: no such file or directory

We have a Python Django-based application that I am trying to build with the docker compose command, but I am running into the error below:

failed to solve: failed to read dockerfile: open Dockerfile: no such file or directory

The Dockerfile exists in the same folder that I am running the command from. This is the command I am running:

docker compose up -d --build

My Docker version is 28.1.1. I tried running it on a Windows machine as well as on Ubuntu 24.04 LTS, and I get the same error on both.

Can someone please help with this?

Please share the content of your compose file and the output of docker ls, so we can see whether the file is actually named Dockerfile and matches what you configured in the compose file.

Hi, when running the ‘docker ls’ command I get an error:

docker: unknown command: docker ls
Run ‘docker --help’ for more information

Below is the content of the compose file:

version: "3.8"

services:
  web:
    build: ./
    command: ./init-scripts/web-dev.sh
    expose:
      - 8000
    ports:
      - 8050:8050
    volumes:
      - ./:/app
      - ${BATCH_INPUT_PATH}:/batches
    environment:
      - SECRET_KEY=${SECRET_KEY}
      - DEBUG=${DEBUG}
      - ALLOWED_HOSTS=${ALLOWED_HOSTS}
      - CSRF_TRUSTED_ORIGINS=${CSRF_TRUSTED_ORIGINS}
      - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS}
      - AUTO_MIGRATE=${AUTO_MIGRATE}
      - DB_NAME=${DB_NAME}
      - PGBOUNCER_HOST=${PGBOUNCER_HOST}
      - PGBOUNCER_PORT=${PGBOUNCER_PORT}
      - DB_PORT=${DB_PORT}
      - DB_USERNAME=${DB_USERNAME}
      - DB_PASSWORD=${DB_PASSWORD}
      - REDIS_HOST=${REDIS_HOST}
      - REDIS_PORT=${REDIS_PORT}
      - RABBITMQ_HOST=${RABBITMQ_HOST}
      - RABBITMQ_PORT=${RABBITMQ_PORT}
      - RABBITMQ_USERNAME=${RABBITMQ_USERNAME}
      - RABBITMQ_PASSWORD=${RABBITMQ_PASSWORD}
      - DOCBUILDER_API_URL=${DOCBUILDER_API_URL}
      - UTILITY_API_URL=${UTILITY_API_URL}
      - ICAP_API_URL=${ICAP_API_URL}
      - FRONTEND_URL=${FRONTEND_URL}
      - DEFINITION_VERSIONS=${DEFINITION_VERSIONS}
      - DEFAULT_DEFINITION_VERSION=${DEFAULT_DEFINITION_VERSION}
      - DATACAP_API_BASE_URL=${DATACAP_API_BASE_URL}
      - DATACAP_APP=${DATACAP_APP}
      - DATACAP_USER=${DATACAP_USER}
      - DATACAP_PASSWORD=${DATACAP_PASSWORD}
      - DATACAP_STATION=${DATACAP_STATION}
      - DATACAP_JOB=${DATACAP_JOB}
      - DATACAP_MILI_JOB=${DATACAP_MILI_JOB}
      - ENABLE_LDAP_AUTH=${ENABLE_LDAP_AUTH}
      - LDAP_SERVER_ADDRESS=${LDAP_SERVER_ADDRESS}
      - LDAP_SERVER_PORT=${LDAP_SERVER_PORT}
      - LDAP_BIND_DN=${LDAP_BIND_DN}
      - LDAP_BIND_PASSWORD=${LDAP_BIND_PASSWORD}
      - LDAP_USER_SEARCH_DN=${LDAP_USER_SEARCH_DN}
      - LDAP_GROUP_SEARCH_DN=${LDAP_GROUP_SEARCH_DN}
      - LDAP_AUTH_USER_GROUP=${LDAP_AUTH_USER_GROUP}
      - LDAP_AUTH_SUPERUSER_GROUP=${LDAP_AUTH_SUPERUSER_GROUP}
      - SEND_EMAILS=${SEND_EMAILS}
      - EMAIL_HOST=${EMAIL_HOST}
      - EMAIL_PORT=${EMAIL_PORT}
      - EMAIL_HOST_USER=${EMAIL_HOST_USER}
      - EMAIL_HOST_PASSWORD=${EMAIL_HOST_PASSWORD}
      - EMAIL_USE_TLS=${EMAIL_USE_TLS}
      - CLASSIFIER_API_URL=${CLASSIFIER_API_URL}
      - MAX_RETRY=${MAX_RETRY}
      - RETRY_INTERVAL=${RETRY_INTERVAL}
      - SERVER_IP=${SERVER_IP}
      - SAVE_TRANSACTION_LOG=${SAVE_TRANSACTION_LOG}
      - DATACAP_CALLBACK_AWAITING=${DATACAP_CALLBACK_AWAITING}
    depends_on:
      - pgbouncer
      - redis
      - rabbitmq
    extra_hosts:
      - "localhost:host-gateway"
  daphne:
    build: ./
    command: daphne -b 0.0.0.0 -p 8001 app.asgi:application
    expose:
      - 8001
    environment:
      - SECRET_KEY=${SECRET_KEY}
      - DEBUG=${DEBUG}
      - ALLOWED_HOSTS=${ALLOWED_HOSTS}
      - CSRF_TRUSTED_ORIGINS=${CSRF_TRUSTED_ORIGINS}
      - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS}
      - AUTO_MIGRATE=${AUTO_MIGRATE}
      - DB_NAME=${DB_NAME}
      - PGBOUNCER_HOST=${PGBOUNCER_HOST}
      - PGBOUNCER_PORT=${PGBOUNCER_PORT}
      - DB_PORT=${DB_PORT}
      - DB_USERNAME=${DB_USERNAME}
      - DB_PASSWORD=${DB_PASSWORD}
      - REDIS_HOST=${REDIS_HOST}
      - REDIS_PORT=${REDIS_PORT}
      - RABBITMQ_HOST=${RABBITMQ_HOST}
      - RABBITMQ_PORT=${RABBITMQ_PORT}
      - RABBITMQ_USERNAME=${RABBITMQ_USERNAME}
      - RABBITMQ_PASSWORD=${RABBITMQ_PASSWORD}
      - DOCBUILDER_API_URL=${DOCBUILDER_API_URL}
      - UTILITY_API_URL=${UTILITY_API_URL}
      - ICAP_API_URL=${ICAP_API_URL}
      - FRONTEND_URL=${FRONTEND_URL}
      - DATACAP_API_BASE_URL=${DATACAP_API_BASE_URL}
      - DEFINITION_VERSIONS=${DEFINITION_VERSIONS}
      - DEFAULT_DEFINITION_VERSION=${DEFAULT_DEFINITION_VERSION}
    depends_on:
      - web
    extra_hosts:
      - "localhost:host-gateway"
  worker:
    build: ./
    command: ./init-scripts/worker.sh
    volumes:
      - ./:/app
      - ${BATCH_INPUT_PATH}:/batches
      - ${SCRIPTS_INPUT_PATH}:/scripts
    environment:
      - SECRET_KEY=${SECRET_KEY}
      - DEBUG=${DEBUG}
      - ALLOWED_HOSTS=${ALLOWED_HOSTS}
      - CSRF_TRUSTED_ORIGINS=${CSRF_TRUSTED_ORIGINS}
      - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS}
      - AUTO_MIGRATE=${AUTO_MIGRATE}
      - DB_NAME=${DB_NAME}
      - PGBOUNCER_HOST=${PGBOUNCER_HOST}
      - PGBOUNCER_PORT=${PGBOUNCER_PORT}
      - DB_PORT=${DB_PORT}
      - DB_USERNAME=${DB_USERNAME}
      - DB_PASSWORD=${DB_PASSWORD}
      - REDIS_HOST=${REDIS_HOST}
      - REDIS_PORT=${REDIS_PORT}
      - RABBITMQ_HOST=${RABBITMQ_HOST}
      - RABBITMQ_PORT=${RABBITMQ_PORT}
      - RABBITMQ_USERNAME=${RABBITMQ_USERNAME}
      - RABBITMQ_PASSWORD=${RABBITMQ_PASSWORD}
      - DOCBUILDER_API_URL=${DOCBUILDER_API_URL}
      - UTILITY_API_URL=${UTILITY_API_URL}
      - ICAP_API_URL=${ICAP_API_URL}
      - FRONTEND_URL=${FRONTEND_URL}
      - DEFINITION_VERSIONS=${DEFINITION_VERSIONS}
      - DEFAULT_DEFINITION_VERSION=${DEFAULT_DEFINITION_VERSION}
      - DATACAP_API_BASE_URL=${DATACAP_API_BASE_URL}
      - DATACAP_APP=${DATACAP_APP}
      - DATACAP_USER=${DATACAP_USER}
      - DATACAP_PASSWORD=${DATACAP_PASSWORD}
      - DATACAP_STATION=${DATACAP_STATION}
      - DATACAP_JOB=${DATACAP_JOB}
      - DATACAP_MILI_JOB=${DATACAP_MILI_JOB}
      - SEND_EMAILS=${SEND_EMAILS}
      - EMAIL_HOST=${EMAIL_HOST}
      - EMAIL_PORT=${EMAIL_PORT}
      - EMAIL_HOST_USER=${EMAIL_HOST_USER}
      - EMAIL_HOST_PASSWORD=${EMAIL_HOST_PASSWORD}
      - EMAIL_USE_TLS=${EMAIL_USE_TLS}
      - CLASSIFIER_API_URL=${CLASSIFIER_API_URL}
      - MOCK_ASSEMBLY_APIS=${MOCK_ASSEMBLY_APIS}
      - CUSTOMS_API_BASE_URL=${CUSTOMS_API_BASE_URL}
      - DSC_API_BASE_URL=${DSC_API_BASE_URL}
      - DSC_API_CLIENT_ID=${DSC_API_CLIENT_ID}
      - DSC_API_CLIENT_SECRET=${DSC_API_CLIENT_SECRET}
      - FREIGHT_API_CLIENT_ID=${FREIGHT_API_CLIENT_ID}
      - FREIGHT_API_CLIENT_SECRET=${FREIGHT_API_CLIENT_SECRET}
      - CUSTOMS_API_KEY=${CUSTOMS_API_KEY}
      - EDM_API_KEY=${EDM_API_KEY}
      - MAX_RETRY=${MAX_RETRY}
      - RETRY_INTERVAL=${RETRY_INTERVAL}
      - TIMESTAMP_API_BASE_URL=${TIMESTAMP_API_BASE_URL}
      - DHL_API_KEY=${DHL_API_KEY}
      - SERVER_IP=${SERVER_IP}
      - USA_API_BASE_URL=${USA_API_BASE_URL}
      - USA_API_KEY=${USA_API_KEY}
      - SAVE_TRANSACTION_LOG=${SAVE_TRANSACTION_LOG}
      - FCM_CLIENT_ID=${FCM_CLIENT_ID}
      - FCM_CLIENT_SECRET=${FCM_CLIENT_SECRET}
    depends_on:
      - web
    extra_hosts:
      - "localhost:host-gateway"
    deploy:
      replicas: ${WORKER_REPLICAS}
  celery:
    build: ./
    command: ./init-scripts/celery.sh
    volumes:
      - ./:/app
      - ${BATCH_INPUT_PATH}:/batches
    environment:
      - SECRET_KEY=${SECRET_KEY}
      - DEBUG=${DEBUG}
      - ALLOWED_HOSTS=${ALLOWED_HOSTS}
      - CSRF_TRUSTED_ORIGINS=${CSRF_TRUSTED_ORIGINS}
      - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS}
      - AUTO_MIGRATE=${AUTO_MIGRATE}
      - DB_NAME=${DB_NAME}
      - PGBOUNCER_HOST=${PGBOUNCER_HOST}
      - PGBOUNCER_PORT=${PGBOUNCER_PORT}
      - DB_PORT=${DB_PORT}
      - DB_USERNAME=${DB_USERNAME}
      - DB_PASSWORD=${DB_PASSWORD}
      - REDIS_HOST=${REDIS_HOST}
      - REDIS_PORT=${REDIS_PORT}
      - RABBITMQ_HOST=${RABBITMQ_HOST}
      - RABBITMQ_PORT=${RABBITMQ_PORT}
      - RABBITMQ_USERNAME=${RABBITMQ_USERNAME}
      - RABBITMQ_PASSWORD=${RABBITMQ_PASSWORD}
      - DOCBUILDER_API_URL=${DOCBUILDER_API_URL}
      - UTILITY_API_URL=${UTILITY_API_URL}
      - ICAP_API_URL=${ICAP_API_URL}
      - FRONTEND_URL=${FRONTEND_URL}
      - DATACAP_API_BASE_URL=${DATACAP_API_BASE_URL}
      - DEFINITION_VERSIONS=${DEFINITION_VERSIONS}
      - DEFAULT_DEFINITION_VERSION=${DEFAULT_DEFINITION_VERSION}
    depends_on:
      - web
    extra_hosts:
      - "localhost:host-gateway"
  celery_beat:
    build: ./
    command: ./init-scripts/celery_beat.sh
    volumes:
      - ./:/app
    environment:
      - SECRET_KEY=${SECRET_KEY}
      - DEBUG=${DEBUG}
      - ALLOWED_HOSTS=${ALLOWED_HOSTS}
      - CSRF_TRUSTED_ORIGINS=${CSRF_TRUSTED_ORIGINS}
      - CORS_ALLOWED_ORIGINS=${CORS_ALLOWED_ORIGINS}
      - AUTO_MIGRATE=${AUTO_MIGRATE}
      - DB_NAME=${DB_NAME}
      - PGBOUNCER_HOST=${PGBOUNCER_HOST}
      - PGBOUNCER_PORT=${PGBOUNCER_PORT}
      - DB_PORT=${DB_PORT}
      - DB_USERNAME=${DB_USERNAME}
      - DB_PASSWORD=${DB_PASSWORD}
      - REDIS_HOST=${REDIS_HOST}
      - REDIS_PORT=${REDIS_PORT}
      - RABBITMQ_HOST=${RABBITMQ_HOST}
      - RABBITMQ_PORT=${RABBITMQ_PORT}
      - RABBITMQ_USERNAME=${RABBITMQ_USERNAME}
      - RABBITMQ_PASSWORD=${RABBITMQ_PASSWORD}
      - DOCBUILDER_API_URL=${DOCBUILDER_API_URL}
      - UTILITY_API_URL=${UTILITY_API_URL}
      - ICAP_API_URL=${ICAP_API_URL}
      - FRONTEND_URL=${FRONTEND_URL}
      - DATACAP_API_BASE_URL=${DATACAP_API_BASE_URL}
      - DEFINITION_VERSIONS=${DEFINITION_VERSIONS}
      - DEFAULT_DEFINITION_VERSION=${DEFAULT_DEFINITION_VERSION}
    depends_on:
      - web
    extra_hosts:
      - "localhost:host-gateway"
  nginx:
    build:
      context: ./nginx
      args:
        CONFIG_FILE: nginx.dev.conf.template
    ports:
      - ${PORT}:8080
    volumes:
      - ${BATCH_INPUT_PATH}:/app/batches
    environment:
      - WEB_HOST=web:8000
      - DAPHNE_HOST=daphne:8001
    depends_on:
      - web
  postgres:
    build:
      context: ./postgres
    expose:
      - 5432
    ports:
      - ${DB_HOST_BIND_PORT}:5432
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - ./postgres/postgresql.conf:/etc/postgresql/postgresql.conf:ro
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USERNAME}
      - POSTGRES_PASSWORD=${DB_PASSWORD}
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${DB_USERNAME} -d ${DB_NAME} -h localhost"]
      interval: 10s
      timeout: 5s
      retries: 5
      start_period: 15s
    command: postgres -c config_file=/etc/postgresql/postgresql.conf
  pgbouncer:
    build:
      context: ./pgbouncer
    ports:
      - "${PGBOUNCER_PORT}:6432"
    environment:
      - POSTGRESQL_HOST=${DB_HOST}
      - POSTGRESQL_PORT=${DB_PORT}
      - POSTGRESQL_DATABASE=${DB_NAME}
      - POSTGRESQL_USERNAME=${DB_USERNAME}
      - POSTGRESQL_PASSWORD=${DB_PASSWORD}
      - PGBOUNCER_DATABASE=${DB_NAME}
      - PGBOUNCER_PORT=${PGBOUNCER_PORT}
      - PGBOUNCER_LISTEN_ADDRESS=0.0.0.0
      - PGBOUNCER_AUTH_TYPE=trust
      - PGBOUNCER_POOL_MODE=${PGBOUNCER_POOL_MODE}
      - PGBOUNCER_MAX_CLIENT_CONN=${PGBOUNCER_MAX_CLIENT_CONN}
      - PGBOUNCER_DEFAULT_POOL_SIZE=${PGBOUNCER_DEFAULT_POOL_SIZE}
      - PGBOUNCER_MIN_POOL_SIZE=${PGBOUNCER_MIN_POOL_SIZE}
      - PGBOUNCER_RESERVE_POOL_SIZE=${PGBOUNCER_RESERVE_POOL_SIZE}
      - PGBOUNCER_MAX_DB_CONNECTIONS=${PGBOUNCER_MAX_DB_CONNECTIONS}
    depends_on:
      postgres:
        condition: service_healthy
  redis:
    build:
      context: ./redis
    ports:
      - ${REDIS_PORT}:6379
    volumes:
      - redis_data:/data
  rabbitmq:
    build:
      context: ./rabbitmq
    ports:
      - ${RABBITMQ_PORT}:5672
      - ${RABBITMQ_DASHBOARD_PORT}:15672
    volumes:
      - rabbitmq_data:/var/lib/rabbitmq/
    environment:
      - RABBITMQ_DEFAULT_USER=${RABBITMQ_USERNAME}
      - RABBITMQ_DEFAULT_PASS=${RABBITMQ_PASSWORD}
    hostname: ${RABBITMQ_SYSTEM_HOSTNAME}

volumes:
  postgres_data:
  redis_data:
  rabbitmq_data:

My bad, it should have been just ls.

I am not sure how much sense it makes for multiple services to build an image from the same Dockerfile in the current folder. I am also not sure whether using ./ as the context is correct, or whether only . should be used.

Also, we now know that a file named Dockerfile (written exactly like that) is expected to exist in the folder where the compose file is located. At least that’s what the services web, daphne, worker, celery and celery_beat expect to find.

So either the file is not named Dockerfile, or using ./ as context is the problem.
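One quick way to rule out the filename (a sketch, not your exact setup): on Windows, hidden file extensions can make a file named Dockerfile.txt display as just "Dockerfile" in Explorer. Something like this shows the pitfall:

```shell
#!/bin/sh
# Hypothetical repro of the most common cause: the file looks right in the
# file manager but has a hidden .txt extension, so the builder cannot find it.
set -eu
dir=$(mktemp -d)
printf 'FROM python:3.12-slim\n' > "$dir/Dockerfile.txt"  # wrong name
ls "$dir"                                                 # prints: Dockerfile.txt
test -f "$dir/Dockerfile" && echo "found" || echo "missing: Dockerfile"
```

Running docker compose config in the project folder also prints the resolved build section for each service, which helps confirm which context and Dockerfile path Compose actually expects.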

Hi, please find attached the output of the ls command. I also modified the docker-compose.yaml to use build: . instead of build: ./, but I am still getting the same error.

I was able to resolve the issue. Some of the services were missing a Dockerfile inside their build-context folders. Adding the Dockerfiles resolved the issue.
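For anyone hitting the same thing: every build context in the compose file needs its own Dockerfile. If a file lives elsewhere or has a different name, Compose can be pointed at it explicitly with build.dockerfile (service names below match the compose file above; the paths are illustrative):

```yaml
services:
  web:
    build:
      context: .
      dockerfile: Dockerfile        # the default; shown for clarity
  nginx:
    build:
      context: ./nginx
      dockerfile: Dockerfile        # must exist inside ./nginx
```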


Make sure that you DO NOT have Dockerfile in your .dockerignore, .gcloudignore, or an equivalent ignore file.
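A self-contained sketch of that check (the offending ignore-file entry here is created just for the demo): an entry like this can keep the Dockerfile out of the context that gets uploaded to a remote builder.

```shell
#!/bin/sh
# Demo: detect a Dockerfile entry in common ignore files.
dir=$(mktemp -d); cd "$dir"
printf 'Dockerfile\n' > .dockerignore   # hypothetical offending entry
warn=""
for f in .dockerignore .gcloudignore; do
  if [ -f "$f" ] && grep -qi '^Dockerfile$' "$f"; then
    warn="$warn $f"
  fi
done
echo "ignore files excluding Dockerfile:$warn"
```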