Why are the redis, worker and scheduler containers automatically exited?

This is my docker-compose file, and it runs successfully on Windows 10 under WSL (Ubuntu 18.04):

version: '2'
services:
  postgresql:
    image: 'docker.io/bitnami/postgresql:10-debian-10'
    volumes:
      - 'postgresql_data:/bitnami/postgresql'
    environment:
      - AIRFLOW__DATABASE_NAME=bitnami_airflow
      - AIRFLOW__DATABASE_USERNAME=bn_airflow
      - AIRFLOW__DATABASE_PASSWORD=bitnami1
      - ALLOW_EMPTY_PASSWORD=yes
  redis:
    image: docker.io/bitnami/redis:6.0-debian-10
    volumes:
      - 'redis_data:/bitnami'
    environment:
      - ALLOW_EMPTY_PASSWORD=yes
  airflow-scheduler:
    image: docker.io/bitnami/airflow-scheduler:1-debian-10
    environment:
      - AIRFLOW__DATABASE_NAME=bitnami_airflow
      - AIRFLOW__DATABASE_USERNAME=bn_airflow
      - AIRFLOW__DATABASE_PASSWORD=bitnami1
      - AIRFLOW__EXECUTOR=CeleryExecutor
    volumes:
      - airflow_scheduler_data:/bitnami
      - ./dags:/opt/bitnami/airflow/dags
  airflow-worker:
    image: docker.io/bitnami/airflow-worker:1-debian-10
    environment:
      - AIRFLOW__DATABASE_NAME=bitnami_airflow
      - AIRFLOW__DATABASE_USERNAME=bn_airflow
      - AIRFLOW__DATABASE_PASSWORD=bitnami1
      - AIRFLOW__EXECUTOR=CeleryExecutor
    volumes:
      - airflow_worker_data:/bitnami
  airflow:
    image: docker.io/bitnami/airflow:1-debian-10
    environment:
      - AIRFLOW__DATABASE_NAME=bitnami_airflow
      - AIRFLOW__DATABASE_USERNAME=bn_airflow
      - AIRFLOW__DATABASE_PASSWORD=bitnami1
      - AIRFLOW__EXECUTOR=CeleryExecutor
      - AIRFLOW__WEBSERVER_RBAC=True
      - AIRFLOW__CORE_LOAD_EXAMPLES=False
    ports:
      - '8080:8080'
    volumes:
      - airflow_data:/bitnami
      - ./dags:/opt/bitnami/airflow/dags

volumes:
  airflow_scheduler_data:
    driver: local
  airflow_worker_data:
    driver: local
  postgresql_data:
    driver: local
  redis_data:
    driver: local
  airflow_data:
    driver: local
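For comparison, if I am reading the Bitnami documentation correctly, their reference compose file configures the postgresql service itself with POSTGRESQL_* variables so that the bn_airflow role and database actually get created — a sketch based on my reading of their docs, not something I have verified:

```yaml
# Hypothetical fragment: variable names taken from the Bitnami
# PostgreSQL image docs as I understand them, untested here.
postgresql:
  image: 'docker.io/bitnami/postgresql:10-debian-10'
  environment:
    - POSTGRESQL_DATABASE=bitnami_airflow
    - POSTGRESQL_USERNAME=bn_airflow
    - POSTGRESQL_PASSWORD=bitnami1
    - ALLOW_EMPTY_PASSWORD=yes
```

Could the AIRFLOW__* variables I am passing to the postgresql service simply be ignored, leaving the bn_airflow user nonexistent?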

This is the error:
nami INFO Initializing airflow
airflow INFO ==> No injected configuration files found. Creating default config file…
airflow INFO ==> Setting Airflow base URL…
airflow INFO ==> Enabling Webserver authentication…
airflow INFO ==> Enabling examples…
airflow INFO ==> Configuring Airflow database…
airflow INFO ==> Configuring Airflow Celery Executor…
airflow INFO ==> Deploying Airflow from scratch…
postgre INFO Trying to connect to PostgreSQL server
postgre INFO Found PostgreSQL server listening at postgresql:5432
postgre ERROR [canConnect] Connection with 'bn_airflow' user is unsuccessful
postgre ERROR [canConnect] Connection with 'bn_airflow' user is unsuccessful
postgre ERROR [canConnect] Connection with 'bn_airflow' user is unsuccessful
postgre ERROR [canConnect] Connection with 'bn_airflow' user is unsuccessful
postgre ERROR [canConnect] Connection with 'bn_airflow' user is unsuccessful
Error executing 'postInstallation': Cannot connect to PostgreSQL server:
psql: FATAL: password authentication failed for user "bn_airflow"

But when I try to bring the containers up on a Linux machine, the images build successfully; however, the worker, scheduler, and redis containers exit immediately. Is there anything I can do?