Reputation: 891
Any ideas on how to export an entire Elasticsearch database to CSV? I only need a single index exported.
I tried the following Python tool, but my hosted Elasticsearch provider complains about root certificates:
https://github.com/taraslayshchuk/es2csv/blob/master/README.rst
es2csv -i leads -a user:password -u https://host.us-east-1.aws.found.io:9243 -q '' -o database.csv
but I get: elasticsearch.exceptions.ImproperlyConfigured: Root certificates are missing for certificate validation. Either pass them in using the ca_certs parameter or install certifi to use it automatically.
Any help? I'm using Elastic.co's hosted service for my Elasticsearch.
Upvotes: 1
Views: 1269
Reputation: 43
As taraslayshchuk has already said, es2csv has been updated to support SSL. For an elastic.co hosted ES I've been using something like this:
es2csv.py -u https://myserverurl -i myindex-* -r -q '
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2017-04-16T00:00",
        "lt": "2017-04-23T23:59",
        "time_zone": "-06:00"
      }
    }
  }
}
' -o outputfile.csv --auth user:password --use-ssl --verify-certs
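The --use-ssl and --verify-certs flags are what address the root-certificate error from your question, and (if I remember the flags correctly) -r tells es2csv to treat -q as raw Query DSL rather than a Lucene query string. The range query just limits the export to a time window, so adjust the timestamps and time_zone to your data, or pass an empty query as in your original command to dump the whole index.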
Upvotes: 1
Reputation: 71
The latest update of es2csv includes an SSL patch, which should resolve your problem.
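In practice that just means upgrading the package and making sure certifi is available, as the error message itself suggests. A minimal sketch, assuming you installed es2csv with pip:
pip install --upgrade es2csv
pip install certifi
With certifi installed, the client can pick up root certificates automatically, which is exactly what the ImproperlyConfigured message asks for.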
Upvotes: 0
Reputation: 522
You can use Logstash to export an index to CSV:
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "some-index"
    query => '{ "query": { "match_all": {} } }'
  }
}

output {
  file {
    # replace field1/field2 with the document fields you want in each CSV row
    codec => line { format => "%{field1},%{field2}" }
    path => "some-index.csv"
  }
}
Warning: there is also a csv output plugin, but it has a known bug in the Logstash 5.x versions. The line codec configuration above should be fine.
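To run the export, save the configuration to a file and point Logstash at it. A minimal invocation, assuming the config is saved as export-csv.conf (the file name is just an example):
bin/logstash -f export-csv.conf
Logstash will scroll through the index and append one line per document to some-index.csv.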
Upvotes: 2