This is a continuation of my thread on the Elastic Discuss forums: Setting Up Logstash In Docker-Compose For Bulk Ingest Of CSV Files In Local Machine (Elastic Stack / Logstash).
Docker Desktop 4.25.1 with the WSL 2 backend.
I was asked by the experts there to make a post on the Docker Forums to continue troubleshooting my issue.
I currently have a Logstash container running and reporting healthy, and the pipeline is set up.
However, I am facing an issue getting the CSV files hosted locally on my D drive to be seen by this Docker container.
I understand that I need to volume-mount the data folder into the container.
This is the folder I want to volume-mount into the logstash-1 container,
so that the files are visible inside the container environment and can be piped successfully into Elasticsearch.
For now, the csv_files folder can be seen inside the container, but none of the CSV files are piping in.
- /d/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files
As I understand it, the /d/... prefix is supposed to convert the Windows path into a Unix-style form so that Docker can read it, or something along those lines.
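To confirm what the container can actually see, I have been checking with something along these lines (assuming the Compose service is named logstash, as in the file below):

  docker compose exec logstash ls -la /usr/share/logstash/csv_files

The folder itself shows up this way, but no data arrives in Elasticsearch.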
docker-compose.yml
version: "3.8"
volumes:
logstashdata01:
driver: local
networks:
default:
name: elastic
external: true
services:
logstash:
image: docker.elastic.co/logstash/logstash:${STACK_VERSION}
labels:
co.elastic.logs/module: logstash
user: root
environment:
- xpack.monitoring.enabled=false
volumes:
- ./:/usr/share/logstash/pipeline/
**- /d/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files** <-- pls check if this is correct syntax.
command: logstash -r -f /usr/share/logstash/pipeline/logstash.conf
ports:
- "5044:5044"
mem_limit: ${LS_MEM_LIMIT}
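For reference, these are the two other path forms I have come across for bind mounts with Docker Desktop on WSL 2. I am not sure which one applies to my setup, and the quotes are only there to be safe with the parentheses in the folder name:

    volumes:
      - ./:/usr/share/logstash/pipeline/
      # if docker compose is run from a Windows shell (PowerShell/CMD):
      - "D:/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files"
      # if docker compose is run from inside the WSL 2 distro, where Windows drives sit under /mnt:
      # - "/mnt/d/ATS_Event_Logs/For-Logstash_(ML)/Logstash:/usr/share/logstash/csv_files"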
logstash.conf
input {
  file {
    path => "/usr/share/logstash/csv_files/events2022-01-01.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
(rest of file)
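In case it helps, the rest of the file roughly follows the usual csv-filter-plus-elasticsearch-output shape. The sketch below is not my exact config: the glob path, column names, host, credentials and index name are placeholders, since what I ultimately want is for every CSV in the mounted folder to be picked up, not just one file.

    input {
      file {
        path => "/usr/share/logstash/csv_files/*.csv"     # glob so every CSV in the mounted folder is read
        start_position => "beginning"
        sincedb_path => "/dev/null"                       # re-read files on each restart while testing
      }
    }

    filter {
      csv {
        separator => ","
        columns => ["timestamp", "event_id", "message"]   # placeholder column names
      }
    }

    output {
      elasticsearch {
        hosts => ["https://es01:9200"]                    # placeholder - the ES container on the elastic network
        user => "elastic"                                 # placeholder credentials
        password => "${ELASTIC_PASSWORD}"
        cacert => "/usr/share/logstash/certs/ca/ca.crt"   # placeholder CA path, if security is enabled
        index => "ats-events"                             # placeholder index name
      }
      stdout { codec => rubydebug }                       # handy for seeing whether events are read at all
    }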
I wish to seek advice on how to volume-mount this For-Logstash_(ML) folder into my Logstash Docker container, and on anything else I should improve in the config files above. Thanks!