Docker Desktop Partial Volume Mount For Windows 4.25.1

This is a continuation of my thread on the Elastic forums: Setting Up Logstash In Docker-Compose For Bulk Ingest Of CSV Files In Local Machine - Elastic Stack / Logstash - Discuss the Elastic Stack

Docker Desktop 4.25.1 with WSL 2

I was asked by the experts there to make a post on the Docker Forums to continue my issue/troubleshooting.

I currently have a Logstash container running and healthy. The pipeline is also set up.

However, I am facing an issue getting my locally hosted CSV files on my D: drive to be seen by this Docker container.

I understand I need to volume-mount the data folder into this Docker container.

This is the folder I want to volume-mount into the logstash-1 container, so that, within the Docker container environment, the files can be seen and successfully piped into Elasticsearch.

For now, I can get the csv_files folder to be seen inside the Docker container, but none of the CSV files are piping in.

- /d/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files is supposed to convert the Windows path to Unix form so that Docker can read it, or something like that.
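For reference, these are the spellings of the same host path that I have come across for Docker Desktop bind mounts (a sketch; which one is accepted seems to depend on the backend and where docker compose is run from):

volumes:
  # Windows-style path with forward slashes, when compose runs from a Windows shell:
  - D:/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files
  # Unix-style: colon dropped, drive letter lowercased, leading slash:
  - /d/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files
  # From inside a WSL 2 distro, the same folder is visible under /mnt:
  # /mnt/d/ATS_Event_Logs/For-Logstash_(ML)/Logstash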

docker-compose.yml

version: "3.8"

volumes:
  logstashdata01:
    driver: local

networks:
  default:
    name: elastic
    external: true
    
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:${STACK_VERSION}
    labels:
      co.elastic.logs/module: logstash
    user: root
    environment:
      - xpack.monitoring.enabled=false
    volumes:
      - ./:/usr/share/logstash/pipeline/
      - /d/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files # <-- please check whether this is the correct syntax
    command: logstash -r -f /usr/share/logstash/pipeline/logstash.conf
    ports:
      - "5044:5044"
    mem_limit: ${LS_MEM_LIMIT}
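A quick way to check what actually landed inside the container (using the logstash service name from the compose file above, while it is running):

docker compose exec logstash ls -la /usr/share/logstash/csv_files

If that prints an empty directory, the host side of the mount is pointing at the wrong (or a freshly created) folder.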

logstash.conf

input { 
    file { 
        path => "/usr/share/logstash/csv_files/events2022-01-01.csv"
        start_position => "beginning" 
        sincedb_path => "/dev/null"
    } 
}

(rest of file)
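Since my goal is bulk ingest, I understand the file input also accepts a glob instead of a single filename; a sketch of what I plan to try once the mount works:

input {
    file {
        # glob to pick up every CSV in the mounted folder, not just one file
        path => "/usr/share/logstash/csv_files/*.csv"
        start_position => "beginning"
        # /dev/null sincedb means files are re-read from the beginning on every restart
        sincedb_path => "/dev/null"
    }
}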

I wish to seek advice on how to volume-mount this For-Logstash_(ML) folder into my Logstash Docker container, and on what else I should improve in my config files above. Thanks!

Your way works for me, but you can try this format for the mount:

- D:/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files

My guess is that your source folder was created somewhere else, so you mounted an empty folder instead of the one you wanted.

You can also try running the container interactively with a simple “bash” or any other shell instead of running Logstash, to check whether the files are somehow deleted by it. I don’t think that’s the case, though, because the folks on the Elastic forum would have noticed that.
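Something like this should get you a shell with the same mounts, without starting Logstash (a sketch; --entrypoint bypasses the image’s default entrypoint, and logstash is your service name):

docker compose run --rm --entrypoint bash logstash
# then, inside the container:
ls -la /usr/share/logstash/csv_files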


I got my pipeline to work: I transferred my data to reside in the same directory where I spawned my Logstash Docker container.

- D:/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files

was my original way, but I understand from Volume binding using docker compose on Windows - Stack Overflow that you have to drop the colon, replace the backslashes in the Windows path with forward slashes, and write the drive letter in the /d/ form at the start of the line.
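Applied to my folder, that conversion looks like:

D:\ATS_Event_Logs\For-Logstash_(ML)\Logstash    <-- original Windows path
/d/ATS_Event_Logs/For-Logstash_(ML)/Logstash    <-- colon dropped, drive letter lowercased, backslashes flipped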

Even now, I'm still not sure why I can't get my path to be read when my data resides in a different directory instead of the same one as my compose file.

Just want to understand things better.