I am starting to learn Docker. I have a project where I have to upload information to a PostgreSQL database. I was able to dockerize it and it works correctly, but there is one problem I have not been able to solve.
The Python code that updates the database reads its input from Excel files on a local shared disk. At my work, those Excel files are updated every few hours.
I managed to dockerize the process, but I can't figure out whether there is a way for Docker to listen for changes on the external disk and trigger the code that loads the database.
I have tried a bind mount, but when I make a change in a file, it is not reflected inside the container.
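From what I have read, filesystem change events from the Windows host often do not propagate into a container through a bind mount, so event-based watchers (inotify/watchdog) can miss changes; one workaround I was considering is polling file modification times from inside the container. A minimal sketch of that idea (the directory, interval, and callback are just placeholders, not my real code):

```python
import os
import time


def snapshot(path):
    """Map every file under `path` to its last-modified timestamp."""
    state = {}
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            try:
                state[full] = os.stat(full).st_mtime
            except OSError:
                pass  # file disappeared between walk() and stat()
    return state


def watch(path, on_change, interval=60, max_cycles=None):
    """Poll `path` every `interval` seconds and call `on_change(changed)`
    whenever files are added, removed, or modified.

    `max_cycles` bounds the loop (useful for testing); leave it as None
    to poll forever, e.g. as the container's main process.
    """
    before = snapshot(path)
    cycles = 0
    while max_cycles is None or cycles < max_cycles:
        time.sleep(interval)
        after = snapshot(path)
        # new or modified files, plus deleted ones
        changed = [f for f in after if before.get(f) != after[f]]
        changed += [f for f in before if f not in after]
        if changed:
            on_change(changed)
        before = after
        cycles += 1
```

The idea would be to run something like `watch("/data/incendios", load_into_database)` as the container's entrypoint, where `load_into_database` is the existing loader function; polling sidesteps the missing-event problem at the cost of a small delay.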
This is the structure of my docker-compose file:
version: "3.8"
services:
  app:
    build: .
    image: mi_app_personalizada:latest  # Custom name for the built image
    volumes:
      - "C:/Users/niir/Documents/incendios:/data/incendios"
      - "C:/Users/niir/Documents/deslizamientos:/data/deslizamientos"
    depends_on:
      - db
    environment:
      - DATABASE_URL=postgresql://postgres:password@db:5432/alertas
    networks:
      - mi_red_personalizada
  db:
    image: postgis/postgis
    container_name: mi_db_postgis  # Custom name for the database container
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
      POSTGRES_DB: alertas
    volumes:
      - db_data:/var/lib/postgresql/data
    networks:
      - mi_red_personalizada
  pgadmin:
    image: dpage/pgadmin4
    container_name: mi_pgadmin
    environment:
      PGADMIN_DEFAULT_EMAIL: "EMAIL"
      PGADMIN_DEFAULT_PASSWORD: "PASSWORD"
    ports:
      - "5050:80"
    networks:
      - mi_red_personalizada
networks:
  mi_red_personalizada:  # Definition of the custom network
volumes:
  db_data: