github.com/open-telemetry/opentelemetry-collector-contrib/internal/scrapertest

# README

scrapertest

This module provides a mechanism for capturing and comparing expected metric results.

Typical Usage

A scraper test typically looks something like this:

func TestScraper(t *testing.T) {
  cfg := createDefaultConfig().(*Config)
  require.NoError(t, component.ValidateConfig(cfg))

  scraper := newScraper(componenttest.NewNopReceiverCreateSettings(), cfg)

  err := scraper.start(context.Background(), componenttest.NewNopHost())
  require.NoError(t, err)

  actualMetrics, err := scraper.scrape(context.Background())
  require.NoError(t, err)

  expectedFile := filepath.Join("testdata", "scraper", "expected.json")
  expectedMetrics, err := golden.ReadMetrics(expectedFile)
  require.NoError(t, err)

  require.NoError(t, scrapertest.CompareMetrics(expectedMetrics, actualMetrics))
}
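
Scraped values are often nondeterministic (CPU time, memory usage, and so on), so an exact comparison can be too strict. The CompareOptions listed under Functions below relax the comparison. A minimal sketch, assuming CompareMetrics accepts options variadically and that IgnoreMetricValues takes no arguments:

// Compare metric names, types, and attributes, but ignore the data point values.
require.NoError(t, scrapertest.CompareMetrics(
  expectedMetrics,
  actualMetrics,
  scrapertest.IgnoreMetricValues(),
))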

Generating an expected result file

The easiest way to capture the expected result in a file is golden.WriteMetrics.

When writing a new test:

  1. Write the test as if the expected file exists.
  2. Follow the steps below for updating an existing test.

When updating an existing test:

  1. Add a call to golden.WriteMetrics in the appropriate place.
  2. Run the test once.
  3. Remove the call to golden.WriteMetrics.

For example:
func TestScraper(t *testing.T) {
  cfg := createDefaultConfig().(*Config)
  require.NoError(t, component.ValidateConfig(cfg))

  scraper := newScraper(componenttest.NewNopReceiverCreateSettings(), cfg)

  err := scraper.start(context.Background(), componenttest.NewNopHost())
  require.NoError(t, err)

  actualMetrics, err := scraper.scrape(context.Background())
  require.NoError(t, err)

  expectedFile := filepath.Join("testdata", "scraper", "expected.json")

  golden.WriteMetrics(expectedFile, actualMetrics)   // This line is temporary! TODO remove this!!

  expectedMetrics, err := golden.ReadMetrics(expectedFile)
  require.NoError(t, err)

  require.NoError(t, scrapertest.CompareMetrics(expectedMetrics, actualMetrics))
}
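
Some test suites avoid this temporary edit by guarding the write behind a command-line flag. The flag and pattern below are purely illustrative assumptions, not part of this module or of golden:

// update is a hypothetical flag for regenerating golden files,
// e.g. go test -run TestScraper -update
var update = flag.Bool("update", false, "rewrite the expected golden files")

func TestScraper(t *testing.T) {
  // ... run the scraper as in the examples above to obtain actualMetrics ...

  expectedFile := filepath.Join("testdata", "scraper", "expected.json")
  if *update {
    golden.WriteMetrics(expectedFile, actualMetrics) // regenerate instead of editing the test
  }

  expectedMetrics, err := golden.ReadMetrics(expectedFile)
  require.NoError(t, err)
  require.NoError(t, scrapertest.CompareMetrics(expectedMetrics, actualMetrics))
}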

# Packages

golden: reads and writes expected metric result files (the golden.ReadMetrics and golden.WriteMetrics calls used in the examples above).

# Functions

CompareMetrics compares each part of two given Metrics and returns an error if they don't match.
CompareMetricSlices compares each part of two given MetricSlices and returns an error if they don't match.
CompareNumberDataPoints compares each part of two given NumberDataPoints and returns an error if they don't match.
CompareNumberDataPointSlices compares each part of two given NumberDataPointSlices and returns an error if they don't match.
IgnoreMetricAttributeValue is a CompareOption that clears the values of the specified metric attribute.
IgnoreMetricValues is a CompareOption that clears all values.
IgnoreResourceAttributeValue is a CompareOption that removes a resource attribute from all resources.
IgnoreSubsequentDataPoints is a CompareOption that ignores data points after the first.

# Interfaces

CompareOption is applied by the CompareMetricSlices function to mutate an expected and/or actual result before comparing.
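
For example, a comparison that tolerates a metric attribute whose value changes between runs might look like the sketch below; the attribute name and the exact IgnoreMetricAttributeValue signature are assumptions for illustration:

// Sketch: compare two MetricSlices, ignoring the value of a per-run attribute.
err := scrapertest.CompareMetricSlices(
  expectedSlice,
  actualSlice,
  scrapertest.IgnoreMetricAttributeValue("process.pid"), // attribute name is illustrative
)
require.NoError(t, err)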