Package hdfs (module github.com/stubey/hdfs)
Version: 1.1.3
Repository: https://github.com/stubey/hdfs.git
Documentation: pkg.go.dev

# README

HDFS for Go


This is a native Go client for HDFS. It connects directly to the namenode using the protocol buffers API.

It tries to be idiomatic by aping the stdlib os package, where possible, and implements the interfaces from it, including os.FileInfo and os.PathError.

Here's what it looks like in action:

```go
client, _ := hdfs.New("namenode:8020")

file, _ := client.Open("/mobydick.txt")

buf := make([]byte, 59)
file.ReadAt(buf, 48847)

fmt.Println(string(buf))
// => Abominable are the tumblers into which he pours his poison.
```
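The same os-style approach applies to metadata and error handling. Here is a minimal sketch; it assumes this fork keeps the upstream colinmarc/hdfs signatures, in particular `Client.Stat` returning an os.FileInfo and an error:

```go
package main

import (
	"fmt"
	"log"

	"github.com/stubey/hdfs"
)

func main() {
	// Connect to the namenode; handle the error instead of discarding it.
	client, err := hdfs.New("namenode:8020")
	if err != nil {
		log.Fatal(err)
	}

	// Stat behaves like os.Stat and returns an os.FileInfo.
	info, err := client.Stat("/mobydick.txt")
	if err != nil {
		// Failed lookups come back as *os.PathError values, as in the os package.
		log.Fatal(err)
	}

	fmt.Println(info.Name(), info.Size(), info.IsDir())
}
```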

For complete documentation, check out the Godoc.

The hdfs Binary

Along with the library, this repo contains a commandline client for HDFS. Like the library, its primary aim is to be idiomatic, by enabling your favorite unix verbs:

```
$ hdfs --help
Usage: hdfs COMMAND
The flags available are a subset of the POSIX ones, but should behave similarly.

Valid commands:
  ls [-lah] [FILE]...
  rm [-rf] FILE...
  mv [-fT] SOURCE... DEST
  mkdir [-p] FILE...
  touch [-amc] FILE...
  chmod [-R] OCTAL-MODE FILE...
  chown [-R] OWNER[:GROUP] FILE...
  cat SOURCE...
  head [-n LINES | -c BYTES] SOURCE...
  tail [-n LINES | -c BYTES] SOURCE...
  du [-sh] FILE...
  checksum FILE...
  get SOURCE [DEST]
  getmerge SOURCE DEST
  put SOURCE DEST
```

Since it doesn't have to wait for the JVM to start up, it's also a lot faster than `hadoop fs`:

```
$ time hadoop fs -ls / > /dev/null

real  0m2.218s
user  0m2.500s
sys   0m0.376s

$ time hdfs ls / > /dev/null

real  0m0.015s
user  0m0.004s
sys   0m0.004s
```

Best of all, it comes with bash tab completion for paths!

Installing the library

To install the library, once you have Go all set up:

```
$ go get -u github.com/colinmarc/hdfs
```

Installing the commandline client

Grab a tarball from the releases page and unzip it wherever you like.

You'll want to add the following line to your .bashrc or .profile:

```
export HADOOP_NAMENODE="namenode:8020"
```

To install tab completion globally on linux, copy or link the bash_completion file which comes with the tarball into the right place:

```
ln -sT bash_completion /etc/bash_completion.d/gohdfs
```

By default, the HDFS user is set to the currently-logged-in user. You can override this in your .bashrc or .profile:

```
export HADOOP_USER_NAME=username
```

Compatibility

This library uses "Version 9" of the HDFS protocol, which means it should work with hadoop distributions based on 2.2.x and above. The tests run against CDH 5.x and HDP 2.x.

Acknowledgements

This library is heavily indebted to snakebite.

# Packages

No description provided by the author
No description provided by the author
Package rpc implements some of the lower-level functionality required to communicate with the namenode and datanodes.

# Functions

LoadHadoopConf returns a HadoopConf object representing configuration from the specified path, or finds the correct path in the environment.
New returns a connected Client, or an error if it can't connect.
NewClient returns a connected Client for the given options, or an error if the client could not be created.
NewForConnection returns a Client with the specified underlying rpc.NamenodeConnection.
NewForUser returns a connected Client with the user specified, or an error if it can't connect.
Username returns the value of HADOOP_USER_NAME in the environment, or the current system user if it is not set.
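
As a sketch of how these constructors fit together, the example below connects as an explicit HDFS user. The signatures used here (`Username() (string, error)`, `NewForUser(address, user string) (*Client, error)`, and `Client.ReadDir`) are assumptions carried over from the upstream colinmarc/hdfs API, not verified against this fork:

```go
package main

import (
	"log"

	"github.com/stubey/hdfs"
)

func main() {
	// Username reads HADOOP_USER_NAME, falling back to the current system user.
	user, err := hdfs.Username()
	if err != nil {
		log.Fatal(err)
	}

	// NewForUser connects to the namenode and issues requests as that user.
	client, err := hdfs.NewForUser("namenode:8020", user)
	if err != nil {
		log.Fatal(err)
	}

	// The client is now ready for calls like Open, Stat, or ReadDir.
	if _, err := client.ReadDir("/"); err != nil {
		log.Fatal(err)
	}
	log.Printf("connected to HDFS as %s", user)
}
```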

# Variables

No description provided by the author

# Structs

A Client represents a connection to an HDFS cluster.
ClientOptions represents the configurable options for a client.
ContentSummary represents a set of information about a file or directory in HDFS.
FileInfo implements os.FileInfo, and provides information about a file or directory in HDFS.
A FileReader represents an existing file or directory in HDFS.
A FileWriter represents a writer for an open file in HDFS.
FsInfo provides information about HDFS.
Property is the struct representation of a Hadoop configuration key-value pair.
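
FileWriter and FileReader pair with Create and Open on the Client. The following is a minimal write-then-read sketch; `Client.Create` and its signature are assumed from the upstream colinmarc/hdfs API rather than confirmed for this fork:

```go
package main

import (
	"fmt"
	"io"
	"log"

	"github.com/stubey/hdfs"
)

func main() {
	client, err := hdfs.New("namenode:8020")
	if err != nil {
		log.Fatal(err)
	}

	// FileWriter: returned by Client.Create (signature assumed from upstream).
	w, err := client.Create("/tmp/hello.txt")
	if err != nil {
		log.Fatal(err)
	}
	if _, err := w.Write([]byte("hello from gohdfs\n")); err != nil {
		log.Fatal(err)
	}
	if err := w.Close(); err != nil {
		log.Fatal(err)
	}

	// FileReader: returned by Client.Open; usable as an io.Reader.
	r, err := client.Open("/tmp/hello.txt")
	if err != nil {
		log.Fatal(err)
	}
	defer r.Close()

	data, err := io.ReadAll(r)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(string(data))
}
```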

# Type aliases

HadoopConf represents a map of all the key-value configuration pairs found in a user's Hadoop configuration files.
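
A brief sketch of using it with LoadHadoopConf; the path argument and the map-style lookup are assumptions based on the upstream colinmarc/hdfs API and the descriptions above:

```go
package main

import (
	"fmt"

	"github.com/stubey/hdfs"
)

func main() {
	// Load configuration from a conventional path; exact behaviour when the
	// path is missing is not verified here.
	conf := hdfs.LoadHadoopConf("/etc/hadoop/conf")

	// HadoopConf is a map of configuration keys to values.
	fmt.Println(conf["fs.defaultFS"])
}
```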