Docker-logstash-csv

Hi there,

I have a big problem here.

I would like to import some CSV files into Kibana and process them through Elasticsearch.
When I try on my local machine, everything works fine with: logstash -f /my_folder/logstash.conf

But now that I'm using Docker, where everything runs in containers, I have a problem.
It works only when I pass a plain message over stdin, but with the working directory on my PC it doesn't work.

I suppose I have to put my CSV inside the container?

My Logstash config file is this one:
input {
  file {
    path => ["/home/sirox/Base_Sim/base2012prova.csv"]
    type => "csv"
  }
}

filter {
  csv {
    columns => ["idade","sexo","racacor"]
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
  }
}

The command that I use to run it is:

docker run -h logstash --name logstash --link elasticsearch:elasticsearch -it --rm -v "$PWD":/config-dir logstash -f /config-dir/logstash.conf

I would really appreciate it if you could help me.
Thanks a lot

Alberto

It seems that you did not mount all the needed files into your Logstash container. You only mounted the config, not the data. Also, if your config points at a /home/foo directory, that directory may not exist inside the container.
Could you check this?
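
For example, a minimal sketch (assuming the CSV still lives in /home/sirox/Base_Sim on the host; /data is an arbitrary container-side path chosen for this example) would mount the data directory as a second volume:

# mount both the config directory and the CSV directory into the container
docker run -h logstash --name logstash --link elasticsearch:elasticsearch \
  -it --rm \
  -v "$PWD":/config-dir \
  -v /home/sirox/Base_Sim:/data \
  logstash -f /config-dir/logstash.conf

The file input in logstash.conf would then use the container-side path:

input {
  file {
    # path as seen inside the container, not on the host
    path => ["/data/base2012prova.csv"]
    type => "csv"
  }
}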

How do I get the data file into the Logstash container? And what path should I specify in logstash.conf inside the container?