Docker-compose build and up

I am trying to Dockerize a Django app with tailwind to deploy to lightsail. I followed two tutorials.

Both work individually. Now I am trying to combine the two. I started with the django-tailwind guide, then added the code from the first tutorial to it.

When I run docker-compose build, it runs through (19/28) of the build steps, as far as I can tell. When I run docker-compose up I get the error

gmo-django-lightsail-tailwind-1 | CommandError:
gmo-django-lightsail-tailwind-1 | It looks like node.js and/or npm is not installed or cannot be found.
gmo-django-lightsail-tailwind-1 |
gmo-django-lightsail-tailwind-1 | Visit to download and install node.js for your system.
gmo-django-lightsail-tailwind-1 |
gmo-django-lightsail-tailwind-1 | If you have npm installed and still getting this error message, set NPM_BIN_PATH variable in to match path of NPM executable in your system.
gmo-django-lightsail-tailwind-1 |
gmo-django-lightsail-tailwind-1 | Example:
gmo-django-lightsail-tailwind-1 | NPM_BIN_PATH = "/usr/local/bin/npm

In my settings.py I have NPM_BIN_PATH = r"C:\Program Files\nodejs\npm.cmd" included.
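One thing worth noting: that hard-coded Windows path will not exist inside a Linux container, so the same settings file cannot serve both environments as-is. A minimal sketch of a platform-aware setting — the Linux path `/usr/bin/npm` is an assumption (it is where Alpine's `npm` package typically installs); check it with `which npm` inside the container:

```python
# settings.py sketch: choose the npm path per platform so the same settings
# work on a Windows host and inside a Linux container.
# /usr/bin/npm is an assumed path; verify with `which npm` in the container.
import platform

if platform.system() == "Windows":
    NPM_BIN_PATH = r"C:\Program Files\nodejs\npm.cmd"
else:
    NPM_BIN_PATH = "/usr/bin/npm"
```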

Here is my Dockerfile:

FROM python:3.11.1-alpine3.17


COPY ./requirements.txt /requirements.txt

RUN apk add --update --no-cache \
  postgresql-client \
  build-base \
  postgresql-dev \
  musl-dev \
  zlib \
  zlib-dev


RUN apk add --no-cache \
  musl-dev \
  curl

RUN python -m venv /py && \
  /py/bin/pip install --upgrade pip && \
  /py/bin/pip install -r /requirements.txt

COPY ./scripts /scripts

RUN chmod -R +x /scripts

ENV PATH="/scripts:/py/bin:$PATH"

COPY ./gmo /gmo



CMD ["/scripts/"]

Here is my docker-compose file:

version: "3.8"

services:
  gmo:
    build:
      context: .
    volumes:
      - ./gmo:/gmo
    ports:
      - 8000:8000
    command: sh -c "python manage.py migrate && python manage.py runserver"
    environment:
      DEBUG: 1
      DB_HOST: database
      DB_NAME: django-dev-db
      DB_USER: devuser
      DB_PASS: devpassword123
    depends_on:
      database:
        condition: service_healthy

  tailwind:
    build:
      context: .
    volumes:
      - ./gmo:/gmo
    command: python manage.py tailwind start
    restart: unless-stopped
    tty: true

  database:
    image: postgres:12-alpine
    volumes:
      - dev-db-data:/var/lib/postgresql/data
    environment:
      POSTGRES_DB: django-dev-db
      POSTGRES_USER: devuser
      POSTGRES_PASSWORD: devpassword123
    healthcheck:
      test: ["CMD", "pg_isready", "-q", "-d", "django-dev-db", "-U", "devuser"]
      interval: 5s
      timeout: 5s
      retries: 10

volumes:
  dev-db-data:


And here is my script file:

#!/bin/sh
set -e

SECRET_KEY=nothing python manage.py tailwind install --no-input
SECRET_KEY=nothing python manage.py tailwind build --no-input
SECRET_KEY=nothing python manage.py collectstatic --no-input
python manage.py migrate
gunicorn -b :80 --chdir /gmo gmo.wsgi:application

Well, I’ve tried many things, mostly editing the docker-compose file and the Dockerfile (the latest modification is posted above).

I want Tailwind to live-update when I run docker-compose up. It sometimes works, but then stops live-updating afterwards. Normal HTML changes show up, but Tailwind-specific changes don’t.

This is my first time posting here, so sorry if I placed it in the wrong forum.

This is my original post.

Looks like your python application requires nodejs(?!) and it’s not available inside the container.
Which of the commands requires nodejs? I am trying to understand whether nodejs is needed during image build time or during runtime as well.

This is my first time using Docker for anything, so excuse my ignorance

I believe TailwindCSS uses node.js to compile the CSS. Normally, without Docker, once I edit the Tailwind classes in an HTML file, it auto-updates the generated style.css file, which updates the styling.

How your build process works is not related to docker 🙂

I still have the same questions. Without you describing how the build process works and what it needs, there is not much I can do, as I can only contribute knowledge about the Docker mechanics.

Let’s take docker out of the equation: what dependencies do you need to install on your machine to build and run the application?

Sorry for the late response.

I am unsure how to answer this question tbh.

These are my dependencies:

Also, these are the commands that appear when I run docker-compose build

[+] Building 2.5s (18/27)
=> [gmo-django-lightsail-gmo internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 32B 0.0s
=> [gmo-django-lightsail-tailwind internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 32B 0.0s
=> [gmo-django-lightsail-tailwind internal] load .dockerignore 0.0s
=> => transferring context: 34B 0.0s
=> [gmo-django-lightsail-gmo internal] load .dockerignore 0.0s
=> => transferring context: 34B 0.0s
=> [gmo-django-lightsail-tailwind internal] load metadata for docker.io/library/python:3.11.1-alpine3.17 2.0s
=> [gmo-django-lightsail-gmo internal] load build context 0.1s
=> => transferring context: 20.53kB 0.1s
=> [gmo-django-lightsail-tailwind 1/10] FROM docker.io/library/python:3.11.1-alpine3.17@sha256:d8b0703ce84fe5a5 0.0s
=> [gmo-django-lightsail-tailwind internal] load build context 0.1s
=> => transferring context: 20.53kB 0.0s
=> CACHED [gmo-django-lightsail-tailwind 2/10] COPY ./requirements.txt /requirements.txt 0.0s
=> CACHED [gmo-django-lightsail-tailwind 3/10] RUN apk add --update --no-cache postgresql-client build-base 0.0s
=> CACHED [gmo-django-lightsail-tailwind 4/10] WORKDIR /gmo 0.0s
=> CACHED [gmo-django-lightsail-tailwind 5/10] RUN apk add --no-cache musl-dev curl nodejs 0.0s
=> CACHED [gmo-django-lightsail-tailwind 6/10] RUN python -m venv /py && /py/bin/pip install --upgrade pip && 0.0s
=> CACHED [gmo-django-lightsail-tailwind 7/10] COPY ./scripts /scripts 0.0s
=> CACHED [gmo-django-lightsail-tailwind 8/10] RUN chmod -R +x /scripts 0.0s
=> [gmo-django-lightsail-tailwind 9/10] COPY ./gmo /gmo 0.1s
=> [gmo-django-lightsail-tailwind 10/10] WORKDIR /gmo 0.0s
=> [gmo-django-lightsail-gmo] exporting to image 0.1s
=> => exporting layers 0.1s
=> => writing image sha256:bc1f8470339ce5a0fdedd446c768d4015292b63e170198424191f8daf599ef00 0.0s
=> => naming to Docker 0.0s
=> => writing image sha256:c4e3a29010dd8aeadaa293938f3d49eb60551cce9038db365be72ca18d146262 0.0s
=> => naming to Docker 0.0s

So it means one or more of the commands in your entrypoint script must require nodejs.

If you don’t need a specific nodejs version, try adding the package nodejs to one of your apk add commands.

If your image build required nodejs, it would be beneficial to use a multi-stage build: render the files in a separate stage and copy only the result into the final image, to keep it small. But apparently this is not the case in your situation.
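For reference, a multi-stage build along those lines could look roughly like this. This is only a sketch assembled from the thread’s setup, not a tested configuration: the stage name `assets`, the exact package lists, and the assumption that collectstatic gathers the rendered CSS are all assumptions.

```dockerfile
# Stage 1: render the CSS. Needs nodejs/npm (and build deps for pip),
# all of which are discarded when this stage ends.
FROM python:3.11.1-alpine3.17 AS assets
RUN apk add --no-cache nodejs npm build-base postgresql-dev musl-dev
COPY ./requirements.txt /requirements.txt
RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip && \
    /py/bin/pip install -r /requirements.txt
COPY ./gmo /gmo
WORKDIR /gmo
RUN SECRET_KEY=nothing /py/bin/python manage.py tailwind install --no-input && \
    SECRET_KEY=nothing /py/bin/python manage.py tailwind build --no-input && \
    SECRET_KEY=nothing /py/bin/python manage.py collectstatic --no-input

# Stage 2: final image without the node toolchain, only runtime deps.
FROM python:3.11.1-alpine3.17
RUN apk add --no-cache postgresql-client
COPY --from=assets /py /py
COPY --from=assets /gmo /gmo
ENV PATH="/py/bin:$PATH"
WORKDIR /gmo
CMD ["gunicorn", "-b", ":80", "gmo.wsgi:application"]
```

Only the venv and the already-rendered project directory are copied forward, so nodejs, npm, and the compilers never reach the final image.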

Yea, I think you are right.

in my script file, I believe

SECRET_KEY=nothing python tailwind install --no-input
SECRET_KEY=nothing python tailwind build --no-input

requires node.js. Can I simply add

RUN apk add --no-cache nodejs

to my script file?

Sorry for being ambiguous. I meant in the Dockerfile, as you already have two apk add commands in there.

Also, I missed that you already have nodejs in your Dockerfile. The problem seems to be that npm is either not available or cannot be found. I just checked whether alpine installs npm when nodejs is installed, and the answer is: it does not. You will need to install the npm package as well.
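Concretely, that fix would be a one-line change to one of the existing apk add commands in the Dockerfile — a sketch, using Alpine’s `nodejs` and `npm` package names:

```dockerfile
RUN apk add --no-cache \
  musl-dev \
  curl \
  nodejs \
  npm
```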

Though, are you sure it is really the best option to include the commands that render the CSS in your entrypoint script, thereby requiring nodejs and npm in the final image? This might make sense during development, but it is less efficient for the final image: you would render the content every time the container starts, and it makes the image bigger than it needs to be.

Also: the commands in your compose file for the images you build actually do nothing. When a container starts, it executes the entrypoint script; if a command is provided as well, it is passed as arguments to the entrypoint script. But your entrypoint script does not handle those additional arguments.

How do you recommend I go about it then? What are the best practices?

Ty for that comment. I think you are right. I’m wondering if I can just use a prod and dev env, and dockerize the prod env only, but unsure if my code would work.

I would suggest taking a look at images on dockerhub that use the same frameworks you use. Often the github repository is linked, so you can check out and learn how they solved things.

Your entrypoint script can be quite flexible. You can access each of its argument (you passed in as command in your compose file) individually with ${1}, ${2} and so on, or all of them at once with ${@}. You can also access whatever ENV you declared in your Dockerfile or environment variable you declared in your compose file: ${KEY}.
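A minimal sketch of that argument handling — the `entrypoint` function name is made up for the demo, and the echo lines stand in for the real work (the migrate/gunicorn commands from the thread’s script):

```shell
#!/bin/sh
# Demo: how an entrypoint script can branch on the arguments that
# `command:` in the compose file passes to it.
set -e

entrypoint() {
    if [ "$#" -gt 0 ]; then
        # A command was supplied (e.g. `command: python manage.py tailwind start`);
        # a real script would `exec "$@"` here instead of echoing.
        echo "custom: $*"
    else
        echo "default: migrate + gunicorn"
    fi
}

entrypoint
entrypoint python manage.py tailwind start
```

In a real entrypoint, the echo in the first branch would be `exec "$@"`, so the supplied command replaces the script as the container’s main process.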

I don’t have the faintest idea what best practices are for python images; I’ve never built one.
But there are plenty of blog posts on the topic, e.g. Docker Best Practices for Python Developers

If you want a small final image, you will end up needing to either build a new image whenever you change something in the part that requires nodejs, or have two different images: one for development and one for production.
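One common way to get that split with compose is a second compose file used only for production, while the existing docker-compose.yml stays as the development setup with the bind mount and the tailwind watcher. The file name `docker-compose.prod.yml` and its contents are assumptions for illustration:

```yaml
# docker-compose.prod.yml (hypothetical name): used on its own for production,
# without the tailwind watcher service or the source bind mount.
services:
  gmo:
    build:
      context: .
    command: gunicorn -b :80 --chdir /gmo gmo.wsgi:application
    ports:
      - 80:80
    environment:
      DEBUG: 0
```

It would then be started with `docker compose -f docker-compose.prod.yml up -d --build`, while plain `docker compose up` keeps using the development file.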