# Event generation testing framework

## Introduction

### What it is
Many modules make requests to an HTTP endpoint to fetch metrics data, then manipulate and enrich that data before sending it to Elasticsearch.

The goal is to mock the HTTP responses those modules consume with a generic server that serves them back. For each tested metricset, an HTTP server is launched on a random port (but be aware that the JSON responses written to disk are hardcoded with the value `127.0.0.1:5555`) and replies to the HTTP requests with the mocked response captured from a fixed module version; once the test is done, the server is shut down. This way the data manipulation is tested in isolation and a lot of the boilerplate we had in many modules is removed.
### How to use it
The idea is simple: head to `beats/metricbeat/mb/testing/data` and run `go test .`. It will run all tests, one per metricset of each module.

An alternative is to run `mage mockedTests` from the `metricbeat` directory to achieve the same result, using environment variables instead of flags. For example: `MODULE=apache GENERATE=true mage mockedTests`.
### Worth mentioning
- If the input file in the `testdata` folder is prefixed (named) `docs`, whatever its extension is, and the flag `-data` is passed, the framework will also create a `docs.json` file in the `_meta` folder of the metricset, as has historically been done in Metricbeat.
- The config file must be called `config.yml` and be located inside `metricbeat/module/{module}/{metricset}/_meta/testdata` (a minimal sketch follows this list).
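
As an illustration, a minimal `config.yml` could look like the sketch below. The path and values are borrowed from the Apache example used elsewhere in this README and are only an assumption about a concrete metricset; the available settings are described later in this document.

```yaml
# Hypothetical metricbeat/module/apache/status/_meta/testdata/config.yml
# Minimal sketch: only the test type and the URL path to mock.
type: http
url: "/server-status?auto="
```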
### Available flags / environment variables
- `-data`: Regenerates the expected JSON file with the output of an event and places it within the `testdata` folder. For example: `go test . -data`. If using mage, the environment variable `GENERATE` is available too.
- `-module`: Tests only the specified module. For example: `go test . -module=apache`. If using mage, the `MODULE` environment variable must be set with the name of the module to test.
You can also combine both flags with `go test . -data -module=apache` to generate files for the Apache module only.
### Available settings in `config.yml`
- `type`: (string) The type of test to run. At the moment, only `http` is supported.
- `url`: (string) The URL path that the module usually fetches to get metrics. For example, in the case of the Apache module this url is `/server-status?auto=`.
- `suffix`: (string) The suffix of the input files. By default `json`; another common suffix is `plain` for plain text files.
- `omit_documented_fields_check`: (list of strings) Some fields generated by the modules are completely dynamic, so they aren't documented in `fields.yml`. Set a list of fields or paths in your metricset that might not be documented, like `apache.status.*` for all fields within the `apache.status` object, or `apache.status.hostname` for just that specific field. You can even omit all fields using `*`.
- `remove_fields_from_comparison`: (list of strings) Some fields must be removed for the byte-to-byte comparison but must still be printed in the expected JSON files. Write a list of those fields here. For example, `apache.status.hostname` in the Apache module was generating a new port on each run, so a comparison wasn't possible. Set one item with `apache.status.hostname` to omit this field when comparing outputs.
- `module`: (map) Anything added to this map is appended to the module config before launching the tests. This is useful, for example, for modules that require the user to specify a `namespace`.
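
Putting these settings together, a hypothetical `config.yml` might look like the sketch below; the values simply combine the examples mentioned above and are not taken from a real module.

```yaml
# Illustrative sketch only: the keys are the settings described above,
# the values are assumptions borrowed from the examples in this README.
type: http
url: "/server-status?auto="
suffix: plain
omit_documented_fields_check:
  - "apache.status.*"
remove_fields_from_comparison:
  - "apache.status.hostname"
module:
  namespace: test
```

With a file like this in the metricset's `_meta/testdata` folder, `go test . -data -module=apache` (or the equivalent mage invocation) would regenerate the expected files using these settings.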