github.com/pteich/elastic-query-export 1.6.1
Repository: https://github.com/pteich/elastic-query-export.git
Documentation: pkg.go.dev


# README

## elastic-query-export

Export data from ElasticSearch to CSV using a raw or Lucene query (e.g. taken from Kibana). Works with ElasticSearch 6+ (OpenSearch works too) and uses ElasticSearch's Scroll API together with Go's concurrency to export data as fast as possible.

### Install

Download a pre-compiled binary for your operating system from the releases page: https://github.com/pteich/elastic-query-export/releases. You need just this one binary. It works on macOS (Darwin), Linux and Windows.

#### Arch

```shell
yay -S elastic-query-export-bin
```

### General usage

```shell
es-query-export -c "http://localhost:9200" -i "logstash-*" --start="2019-04-04T12:15:00" --fields="RemoteHost,RequestTime,Timestamp,RequestUri,RequestProtocol,Agent" -q "RequestUri:*export*"
```

### CLI Options

| Flag | Default | Description |
| --- | --- | --- |
| -h --help | | show help |
| -v --version | | show version |
| -c --connect | http://localhost:9200 | URI to ElasticSearch instance |
| -i --index | logs-* | name of index to use; use globbing characters * to match multiple |
| -q --query | | Lucene query to match documents (same as in Kibana) |
| --fields | | comma-separated list of fields to export |
| -o --outfile | output.csv | name of output file; use - as filename to write to stdout and pipe data to other commands |
| -f --outformat | csv | format of the output data: possible values csv, json, raw |
| -r --rawquery | | optional raw ElasticSearch query JSON string |
| -s --start | | optional start date; format YYYY-MM-DDThh:mm:ss.SSSZ or any other ElasticSearch default format |
| -e --end | | optional end date; format YYYY-MM-DDThh:mm:ss.SSSZ or any other ElasticSearch default format |
| --timefield | @timestamp | optional time field to use |
| --verifySSL | false | optional, defines how to handle SSL certificates |
| --user | | optional username |
| --pass | | optional password |
| --size | 1000 | size of the scroll window; larger values speed up the export but put more pressure on your nodes |
| --trace | false | enable trace mode to debug queries sent to ElasticSearch |
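Putting a few of these flags together: the sketch below uses the raw-query flag to pass a full ElasticSearch query body instead of a Lucene query string. The index pattern, the query body and the output file name are made-up examples, not taken from the original documentation.

```shell
# Hypothetical example: export two fields from documents matching a
# raw ElasticSearch bool query. The index pattern "logstash-*" and
# the query body are illustrative only.
es-query-export -c "http://localhost:9200" -i "logstash-*" \
  -r '{"bool":{"must":[{"term":{"RequestProtocol":"HTTP/1.1"}}]}}' \
  --fields="RemoteHost,RequestUri" -o requests.csv
```

Note that the raw query string must be valid JSON; quoting it in single quotes keeps the shell from mangling the inner double quotes.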

### Output Formats

- csv - all or selected fields separated by comma (,), with field names in the first line
- json - all or selected fields as JSON objects, one per line
- raw - JSON dump of matching documents including id, index and the _source field containing the document data, one document as JSON object per line
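The json format pairs well with standard line-oriented tools, since each document is one JSON object per line. As a hedged sketch (the index pattern and the Agent field are illustrative examples, not prescribed by the tool), exported JSON lines can be piped straight into jq:

```shell
# Hypothetical example: export only the Agent field as JSON lines to
# stdout, then count the most common values with jq and coreutils.
es-query-export -c "http://localhost:9200" -i "logstash-*" \
  -f json --fields="Agent" -o - \
  | jq -r '.Agent' | sort | uniq -c | sort -rn
```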

### Pipe output to other commands

Since v1.6.0 you can pass - as the filename to send output to stdout. This lets you pipe the export to other commands, like so:

```shell
es-query-export -start="2019-04-04T12:15:00" -q "RequestUri:*export*" -outfile - | aws s3 cp - s3://mybucket/stream.csv
```
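Another use of the stdout mode, sketched along the same lines (the query and the output file name are illustrative), is compressing the export on the fly instead of writing an uncompressed file:

```shell
# Hypothetical example: stream the CSV export to stdout and gzip it
# on the fly; nothing uncompressed ever touches the disk.
es-query-export -q "RequestUri:*export*" --outfile - | gzip > export.csv.gz
```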