Package nicehttp
Version: v0.0.0-20200422123956-0d3d3dd9b482
Repository: https://github.com/lithdew/nicehttp.git
Documentation: https://pkg.go.dev/github.com/lithdew/nicehttp
# README
nicehttp
Package nicehttp contains helper utilities for downloading files/making requests with valyala/fasthttp.
- Download a file from a URL serially, or in chunks with multiple workers in parallel, should the URL allow it.
- Download the contents of a URL and write them to an io.Writer.
- Query the headers of a URL using an HTTP HEAD request.
- Follow redirects provisioned by a URL.
# Functions
Do sends an HTTP request prescribed in req and populates its results into res.
DoDeadline sends an HTTP request prescribed in req and populates its results into res, failing if the deadline is exceeded.
DoTimeout sends an HTTP request prescribed in req and populates its results into res, failing if the timeout is exceeded.
Download downloads the contents of url and writes them to w.
DownloadBytes downloads the contents of url and returns them as a byte slice.
DownloadBytesDeadline downloads the contents of url and returns them as a byte slice, failing if the deadline is exceeded.
DownloadBytesTimeout downloads the contents of url and returns them as a byte slice, failing if the timeout is exceeded.
DownloadDeadline downloads the contents of url and writes them to w, failing if the deadline is exceeded.
DownloadFile downloads the contents of url and writes them to a newly-created file titled filename.
DownloadFileDeadline downloads the contents of url and writes them to a newly-created file titled filename, failing if the deadline is exceeded.
DownloadFileTimeout downloads the contents of url and writes them to a newly-created file titled filename, failing if the timeout is exceeded.
DownloadInChunks downloads the file at url, comprised of length bytes, in chunks using multiple workers, and stores it in writer w.
DownloadInChunksDeadline downloads the file at url, comprised of length bytes, in chunks using multiple workers, and stores it in writer w, failing if the deadline is exceeded.
DownloadInChunksTimeout downloads the file at url, comprised of length bytes, in chunks using multiple workers, and stores it in writer w, failing if the timeout is exceeded.
DownloadSerially downloads the contents of url and writes them to w.
DownloadSeriallyDeadline downloads the contents of url and writes them to w, failing if the deadline is exceeded.
DownloadSeriallyTimeout downloads the contents of url and writes them to w, failing if the timeout is exceeded.
DownloadTimeout downloads the contents of url and writes them to w, failing if the timeout is exceeded.
NewClient instantiates a new nicehttp.Client with sane configuration defaults.
NewWriteBuffer instantiates a new write buffer around dst.
NewWriterAtOffset instantiates a new writer at a specified offset.
QueryHeaders queries url for its content length and whether it supports parallel chunked fetching.
QueryHeadersDeadline queries url for its content length and whether it supports parallel chunked fetching, failing if the deadline is exceeded.
QueryHeadersTimeout queries url for its content length and whether it supports parallel chunked fetching, failing if the timeout is exceeded.
WrapClient wraps an existing fasthttp.Client or Transport into a nicehttp.Client.
# Structs
Client wraps fasthttp.Client with a couple of useful helper functions.
WriteBuffer implements io.Writer and io.WriterAt on an optionally-provided byte slice.
WriterAtOffset implements io.Writer for a given io.WriterAt at an offset.
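The two helper structs fit together: WriteBuffer gives downloads an io.WriterAt target, and WriterAtOffset turns a region of it into a plain io.Writer for one worker. A minimal stdlib-only sketch of that pairing, assuming illustrative `writerAtOffset` and `writeBuffer` types rather than the package's actual implementations:

```go
package main

import (
	"fmt"
	"io"
)

// writerAtOffset adapts an io.WriterAt into an io.Writer that starts
// writing at a fixed offset and advances as data arrives, mirroring the
// role of nicehttp.WriterAtOffset: each worker gets a sequential view
// over its own region of the output.
type writerAtOffset struct {
	dst    io.WriterAt
	offset int64
}

func (w *writerAtOffset) Write(p []byte) (int, error) {
	n, err := w.dst.WriteAt(p, w.offset)
	w.offset += int64(n)
	return n, err
}

// writeBuffer is a minimal stand-in for nicehttp.WriteBuffer: an
// io.WriterAt over a byte slice.
type writeBuffer struct{ buf []byte }

func (b *writeBuffer) WriteAt(p []byte, off int64) (int, error) {
	if int(off)+len(p) > len(b.buf) {
		return 0, io.ErrShortWrite
	}
	return copy(b.buf[int(off):], p), nil
}

func main() {
	buf := &writeBuffer{buf: make([]byte, 10)}

	// Two writers target disjoint regions of the same buffer.
	a := &writerAtOffset{dst: buf, offset: 0}
	b := &writerAtOffset{dst: buf, offset: 5}

	a.Write([]byte("hello"))
	b.Write([]byte("world"))

	fmt.Println(string(buf.buf)) // helloworld
}
```

Since Go 1.20 the standard library offers the same adapter shape as io.NewOffsetWriter.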