github.com/mrichman/hargo (module, package)
Version: 1.0.1
Repository: https://github.com/mrichman/hargo.git
Documentation: pkg.go.dev

# README

Hargo


Hargo parses HAR files, converts them to curl commands, and can serve as a load test driver.

NAME:
   hargo - work with HTTP Archive (.har) files

USAGE:
   hargo <command> [arguments] <.har file>

VERSION:
   0.1.2-dev.57 (da53069)

AUTHOR:
   Mark A. Richman <[email protected]>

COMMANDS:
     fetch, f     Fetch URLs in .har
     curl, c      Convert .har to curl
     run, r       Run .har file
     validate, v  Validate .har file
     dump, d      Dump .har file
     load, l      Load test .har file
     help, h      Shows a list of commands or help for one command

GLOBAL OPTIONS:
   --debug        Show debug output
   --help, -h     show help
   --version, -v  print the version

COPYRIGHT:
   (c) 2021 Mark A. Richman

Building and Running Hargo

git clone https://github.com/mrichman/hargo.git
cd hargo
make install
hargo validate test/golang.org.har

About HAR Files

If you use Google Chrome, you can record these files by following the steps below:

  1. Right-click anywhere on the page and click Inspect Element to open Chrome's Developer Tools.
  2. The Developer Tools will open as a panel at the bottom of the page. Click on the Network tab.
  3. Click the Record button, the solid circle in the Network tab, and you'll start recording activity in your browser.
  4. Refresh the page and start working normally.
  5. Right-click within the Network tab and click Save as HAR with Content to save a copy of the activity that you recorded.
  6. In the file window, save the HAR file.
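The exported file is plain JSON, so its overall shape is easy to inspect programmatically. Below is a minimal, self-contained Go sketch, independent of hargo and covering only a small fraction of the HAR fields, that pulls the method and URL out of each recorded entry:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// har mirrors a small slice of the HAR structure: a root "log"
// object holding an "entries" array, one entry per HTTP request.
// The real format defines many more fields.
type har struct {
	Log struct {
		Entries []struct {
			Request struct {
				Method string `json:"method"`
				URL    string `json:"url"`
			} `json:"request"`
		} `json:"entries"`
	} `json:"log"`
}

// entryLines returns one "METHOD URL" line per recorded request.
func entryLines(data []byte) ([]string, error) {
	var h har
	if err := json.Unmarshal(data, &h); err != nil {
		return nil, err
	}
	var lines []string
	for _, e := range h.Log.Entries {
		lines = append(lines, e.Request.Method+" "+e.Request.URL)
	}
	return lines, nil
}

func main() {
	doc := []byte(`{"log":{"entries":[{"request":{"method":"GET","url":"https://golang.org/"}}]}}`)
	lines, err := entryLines(doc)
	if err != nil {
		panic(err)
	}
	for _, l := range lines {
		fmt.Println(l) // prints: GET https://golang.org/
	}
}
```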

Commands

Fetch

The fetch command downloads all resources referenced in a .har file:

hargo fetch foo.har

This will produce a directory named hargo-fetch-yyyymmddhhmmss containing all assets referenced by the .har file. This is similar to what you'd see when invoking wget on a particular URL.

Curl

The curl command will output a curl command line for each entry in the .har file.

hargo curl foo.har
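Each HAR entry records the method, URL, headers, and cookies of a request, which is enough to reconstruct an equivalent curl invocation. The sketch below is a simplified, self-contained illustration of that conversion, not hargo's actual ToCurl implementation (which also handles cookies and request bodies):

```go
package main

import (
	"fmt"
	"strings"
)

// toCurl builds a curl command line from a request's method, URL,
// and headers. This is a simplified stand-in for hargo's ToCurl.
func toCurl(method, url string, headers map[string]string) string {
	var b strings.Builder
	fmt.Fprintf(&b, "curl -X %s", method)
	for name, value := range headers {
		fmt.Fprintf(&b, " -H %q", name+": "+value)
	}
	fmt.Fprintf(&b, " %q", url)
	return b.String()
}

func main() {
	cmd := toCurl("GET", "https://golang.org/",
		map[string]string{"Accept": "text/html"})
	// prints: curl -X GET -H "Accept: text/html" "https://golang.org/"
	fmt.Println(cmd)
}
```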

Run

The run command executes each HTTP request in a .har file:

hargo run foo.har

This is similar to fetch, but does not save any output.

Validate

The validate command will report any errors in the format of a .har file.

hargo validate foo.har

The HAR file format is defined at: https://w3c.github.io/web-performance/specs/HAR/Overview.html

Dump

Dump prints information about all HTTP requests in a .har file:

hargo dump foo.har

Load

Hargo can act as a load test agent. Given a .har file, hargo can spawn a number of concurrent workers to repeat each HTTP request in order. By default, hargo will spawn 10 workers and run for a duration of 60 seconds.

Hargo will also save its results to InfluxDB, if available. Each HTTP response is stored as a point of time-series data, which can be graphed by Chronograf, Grafana, or a similar visualization tool for analysis.

Docker

Build container

docker build -t hargo .

Run container

docker run --rm -v `pwd`/test:/test hargo hargo run /test/golang.org.har

Docker-compose

The example docker-compose file will start three containers:

  • hargo
  • influxdb
  • grafana

The hargo container must first be built (see Build container above). When the compose file is run, it starts a hargo load process that writes its results to InfluxDB. The InfluxDB instance can be viewed through the grafana container, which includes an example dashboard showing the latency of the executed requests. The username/password for all the containers is hargo/hargo.

Commands

cd example/docker-compose
docker-compose up
docker-compose down -v


# Functions

Decode reads from a reader and returns a Har object.
Dump prints all HTTP requests in a .har file.
EntryToRequest converts a HAR Entry to an http.Request.
Fetch downloads all resources referenced in a .har file.
LoadTest executes all HTTP requests in order, concurrently, for a given number of workers.
NewReader returns a bufio.Reader that will skip over an initial UTF-8 byte order mark.
ReadStream reads the .har file as a stream and puts the entries on a channel for consumption.
Run executes all entries in a .har file.
ToCurl converts a HAR Entry to a curl command line: curl -X <method> -b "<name=value&name=value...>" -H <name: value> ...
Validate validates the format of a .har file.
WritePoint inserts data into InfluxDB.

# Structs

Browser that created the log.
Cache contains info about a request coming from browser cache.
CacheObject is used by both beforeRequest and afterRequest.
Content describes details about response content (embedded in <response> object).
Cookie contains list of all cookies (used in <request> and <response> objects).
Creator contains information about the log creator application.
Entry is a unique, optional Reference to the parent page.
Har is a container type for deserialization.
Log represents the root of the exported data.
NVP is simply a name/value pair with a comment.
Page represents an exported web page; the log contains one <page> object for every exported web page and one <entry> object for every HTTP request.
PageTiming describes timings for various events (states) fired during the page load.
PageTimings describes various phases within request-response round trip.
PostData describes posted data, if any (embedded in <request> object).
PostParam is a list of posted parameters, if any (embedded in <postData> object).
Request contains detailed info about performed request.
Response contains detailed info about the response.
TestResult contains results for an individual HTTP request.