github.com/abzcoding/hget
module / package
v0.0.0-20211227074927-ff8e0886516f
Repository: https://github.com/abzcoding/hget.git
Documentation: pkg.go.dev

# README

hget

Features

  • Fast (multithreading and parallel connections; see the sketch after this list)
  • Ability to interrupt/resume (task management)
  • Support for proxies (SOCKS5 or HTTP)
  • Bandwidth limiting
  • You can give it a file that contains a list of URLs to download
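
The parallel download in the first feature follows a common pattern: ask the server for the file size, split it into byte ranges, and fetch each range over its own connection with an HTTP Range header. The sketch below illustrates that technique in plain Go; it is not hget's actual code, and the output file name and connection count are arbitrary.

package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"strconv"
	"sync"
)

// fetchRange downloads bytes [start, end] of url and writes them at the
// matching offset of out. Each goroutine owns a distinct region of the file.
func fetchRange(url string, start, end int64, out *os.File, wg *sync.WaitGroup) {
	defer wg.Done()
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return
	}
	req.Header.Set("Range", fmt.Sprintf("bytes=%d-%d", start, end))
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return
	}
	defer resp.Body.Close()
	io.Copy(io.NewOffsetWriter(out, start), resp.Body)
}

func main() {
	url := os.Args[1]

	// HEAD request to learn the total size (requires Content-Length support).
	head, err := http.Head(url)
	if err != nil {
		panic(err)
	}
	size, err := strconv.ParseInt(head.Header.Get("Content-Length"), 10, 64)
	if err != nil {
		panic(err)
	}

	out, err := os.Create("download.bin")
	if err != nil {
		panic(err)
	}
	defer out.Close()

	const conns = 4
	chunk := size / conns
	var wg sync.WaitGroup
	for i := int64(0); i < conns; i++ {
		start, end := i*chunk, (i+1)*chunk-1
		if i == conns-1 {
			end = size - 1
		}
		wg.Add(1)
		go fetchRange(url, start, end, out, &wg)
	}
	wg.Wait()
}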

Install

$ go get -d github.com/abzcoding/hget
$ cd $GOPATH/src/github.com/abzcoding/hget
$ make clean install

The binary will be built at ./bin/hget; you can copy it to /usr/bin or /usr/local/bin, or even alias wget=hget to replace wget entirely :P

Usage

hget [-n parallel] [-skip-tls false] [-rate bwRate] [-proxy proxy_server] [-file filename] [URL] # download a URL with n connections, without skipping TLS certificate verification
hget tasks # list interrupted tasks
hget resume [TaskName | URL] # resume an interrupted task
hget -proxy "127.0.0.1:12345" URL # download through a SOCKS5 proxy
hget -proxy "http://sample-proxy.com:8080" URL # download through an HTTP proxy
hget -file sample.txt # download a list of URLs
hget -n 4 -rate 100KB URL # download using 4 connections, limited to 100KB per second
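
The -rate flag caps throughput. A typical way to implement that in Go is to wrap the response body in a reader that waits on a token-bucket limiter before letting bytes through; the sketch below uses golang.org/x/time/rate to illustrate the idea and is not hget's actual implementation.

package main

import (
	"context"
	"io"
	"net/http"
	"os"

	"golang.org/x/time/rate"
)

// rateLimitedReader delays reads so that throughput stays under the limit.
type rateLimitedReader struct {
	r   io.Reader
	lim *rate.Limiter
}

func (rr *rateLimitedReader) Read(p []byte) (int, error) {
	n, err := rr.r.Read(p)
	if n > 0 {
		// Wait until the token bucket allows n more bytes.
		if werr := rr.lim.WaitN(context.Background(), n); werr != nil {
			return n, werr
		}
	}
	return n, err
}

func main() {
	resp, err := http.Get(os.Args[1])
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	const bytesPerSec = 100 * 1024 // roughly what "-rate 100KB" would mean
	limited := &rateLimitedReader{
		r:   resp.Body,
		lim: rate.NewLimiter(rate.Limit(bytesPerSec), bytesPerSec),
	}

	out, err := os.Create("download.bin")
	if err != nil {
		panic(err)
	}
	defer out.Close()
	io.Copy(out, limited)
}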

Help

[I] ➜ hget -h
Usage of hget:
  -file string
        filepath that contains links in each line
  -n int
        connection (default 16)
  -proxy string
        proxy for downloading, ex
                -proxy '127.0.0.1:12345' for socks5 proxy
                -proxy 'http://proxy.com:8080' for http proxy
  -rate string
        bandwidth limit to use while downloading, ex
                -rate 10kB
                -rate 10MiB
  -skip-tls
        skip verify certificate for https (default true)

To interrupt an in-progress download, just press Ctrl-C or Ctrl-D in the middle of the download; hget will safely save your progress and you will be able to resume it later.
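
Safe interruption usually means trapping the signal, recording how much of each chunk has been written, and persisting that state so a later run can pick up where it stopped. The sketch below shows the general shape with os/signal and encoding/json; the partState fields, the example URL, and the state.json path are made up for illustration and are not hget's actual on-disk format.

package main

import (
	"encoding/json"
	"fmt"
	"os"
	"os/signal"
	"syscall"
)

// partState is a hypothetical record of one chunk's progress.
type partState struct {
	URL       string `json:"url"`
	RangeFrom int64  `json:"range_from"`
	RangeTo   int64  `json:"range_to"`
	Written   int64  `json:"written"`
}

// saveState serializes the progress records so a later run can resume.
func saveState(path string, parts []partState) error {
	data, err := json.MarshalIndent(parts, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(path, data, 0o644)
}

func main() {
	parts := []partState{
		{URL: "https://example.com/big.iso", RangeFrom: 0, RangeTo: 1 << 20, Written: 0},
	}

	// In a real downloader the transfer would run in background goroutines,
	// updating parts[i].Written as bytes arrive.

	// Block until Ctrl-C (SIGINT) or SIGTERM, then save state and exit cleanly.
	sig := make(chan os.Signal, 1)
	signal.Notify(sig, os.Interrupt, syscall.SIGTERM)
	<-sig

	if err := saveState("state.json", parts); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}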

Download

(demo image from the original README)

Resume

(demo image from the original README)

# Functions

DisplayProgressBar shows a fancy progress bar.
Errorf outputs error level logs.
Errorln is a non-formatted error printer.
Execute configures the HTTPDownloader and uses it to perform the download.
ExistDir checks if `folder` is available.
FatalCheck panics if err is not nil.
FilterIPV4 returns the parsed IPv4 string.
FolderOf makes sure you won't get LFI (local file inclusion).
IsTerminal checks if we have tty.
IsURL checks if `s` is actually a parsable URL.
JoinFile joins separate chunks of the file and forms the final downloaded artifact.
MkdirIfNotExist creates `folder` directory if not available.
NewHTTPDownloader returns a ProxyAwareHttpClient with the given configuration.
Printf outputs information level logs.
ProxyAwareHTTPClient will use an HTTP or SOCKS5 proxy if given one (see the sketch after this list).
Read loads data about the state of downloaded files.
Resume gets back to a previously stopped task.
TaskFromURL runs when you want to download a single URL.
TaskPrint reads and prints data about current download jobs.
Warnf outputs warning level logs.
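
NewHTTPDownloader and ProxyAwareHTTPClient suggest a client whose transport is chosen from the proxy string. The sketch below shows a common way to build such a client in Go with net/http and golang.org/x/net/proxy; the newProxyAwareClient name and its selection rules are assumptions for illustration, not the package's actual API.

package main

import (
	"fmt"
	"net/http"
	"net/url"
	"strings"

	"golang.org/x/net/proxy"
)

// newProxyAwareClient is a hypothetical helper: an "http://" or "https://"
// prefix selects an HTTP proxy, anything else is treated as a SOCKS5
// host:port, and an empty string means a direct connection.
func newProxyAwareClient(proxyAddr string) (*http.Client, error) {
	switch {
	case proxyAddr == "":
		return http.DefaultClient, nil
	case strings.HasPrefix(proxyAddr, "http://"), strings.HasPrefix(proxyAddr, "https://"):
		u, err := url.Parse(proxyAddr)
		if err != nil {
			return nil, err
		}
		return &http.Client{Transport: &http.Transport{Proxy: http.ProxyURL(u)}}, nil
	default:
		dialer, err := proxy.SOCKS5("tcp", proxyAddr, nil, proxy.Direct)
		if err != nil {
			return nil, err
		}
		return &http.Client{Transport: &http.Transport{Dial: dialer.Dial}}, nil
	}
}

func main() {
	client, err := newProxyAwareClient("127.0.0.1:12345") // SOCKS5, as in the README example
	if err != nil {
		panic(err)
	}
	resp, err := client.Get("https://example.com/file.bin")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}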

# Variables

Three exported variables; no descriptions provided by the author.

# Structs

Console is an implementation of the UI interface.
HTTPDownloader holds the required configurations.
Part represents a chunk of the downloaded file.
State holds information about URL Parts.

# Interfaces

UI represents a simple IO output.
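
For context, the sketch below shows one plausible shape for such an interface and its Console implementation: an abstraction over where progress and errors get printed. The method set here is an assumption for illustration, not the package's actual definition.

package main

import (
	"fmt"
	"os"
)

// UI is a hypothetical version of "a simple IO output" interface; the real
// package's method set may differ.
type UI interface {
	Printf(format string, args ...interface{})
	Errorf(format string, args ...interface{})
}

// Console renders output on the terminal: normal messages to stdout,
// errors to stderr.
type Console struct{}

func (Console) Printf(format string, args ...interface{}) {
	fmt.Printf(format, args...)
}

func (Console) Errorf(format string, args ...interface{}) {
	fmt.Fprintf(os.Stderr, format, args...)
}

func main() {
	var ui UI = Console{}
	ui.Printf("downloaded %d of %d bytes\n", 512, 1024)
	ui.Errorf("retrying part %d\n", 3)
}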