github.com/lytics/cloudstorage
module / package
Version: 0.2.16
Repository: https://github.com/lytics/cloudstorage.git
Documentation: pkg.go.dev

# README

Introduction

Cloudstorage is a library for working with cloud storage (Google, AWS, Azure), SFTP, and local files. It provides a unified API for local files, SFTP, and cloud files that aids testing and operating across multiple cloud storage providers.


Features

  • A single unified API for multiple clouds (Google, Azure, AWS) and local files.
  • Cloud upload/download is unified in the API, so you don't have to download a file locally, work with it, and then upload it.
  • Files from the cloud are buffered/cached locally, so access is fast.


Example usage:

Note: for brevity, these examples ignore all errors by assigning them to _.

Creating a Store object:
```go
// This is an example of a local storage object.
// See https://github.com/lytics/cloudstorage/blob/master/google/google_test.go for a GCS example.
config := &cloudstorage.Config{
	Type:       localfs.StoreType,
	AuthMethod: localfs.AuthFileSystem,
	LocalFS:    "/tmp/mockcloud",
	TmpDir:     "/tmp/localcache",
}
store, _ := cloudstorage.NewStore(config)
```
Listing Objects:

See the Go iterator pattern doc for the API design: https://github.com/GoogleCloudPlatform/google-cloud-go/wiki/Iterator-Guidelines

```go
// From a store that has been created:

// Create a query.
q := cloudstorage.NewQuery("list-test/")
// Create an iterator.
iter, err := store.Objects(context.Background(), q)
if err != nil {
	// handle
}

for {
	o, err := iter.Next()
	if err == iterator.Done {
		break
	}
	if err != nil {
		// handle unexpected errors
		break
	}
	log.Println("found object ", o.Name())
}
```
Writing an object:
```go
obj, _ := store.NewObject("prefix/test.csv")
// Open for reading and writing.  f is a file handle to the local filesystem.
f, _ := obj.Open(cloudstorage.ReadWrite)
w := bufio.NewWriter(f)
w.WriteString("Year,Make,Model\n")
w.WriteString("1997,Ford,E350\n")
w.Flush()

// Close syncs the local file to the remote store and removes the local tmp file.
obj.Close()
```
Reading an existing object:
```go
// Calling Get on an existing object returns a cloudstorage object,
// or the cloudstorage.ErrObjectNotFound error.
obj2, _ := store.Get(context.Background(), "prefix/test.csv")
// Note: the file is not yet open.
f2, _ := obj2.Open(cloudstorage.ReadOnly)
bytes, _ := ioutil.ReadAll(f2)
fmt.Println(string(bytes)) // should print the CSV file from the block above
```
Transferring an existing object:
```go
var config = &storeutils.TransferConfig{
	Type:            google.StoreType,
	AuthMethod:      google.AuthGCEDefaultOAuthToken,
	ProjectID:       "my-project",
	DestBucket:      "my-destination-bucket",
	Src:             storeutils.NewGcsSource("my-source-bucket"),
	IncludePrefixes: []string{"these", "prefixes"},
}

transferer, _ := storeutils.NewTransferer(client)
resp, _ := transferer.NewTransfer(config)
```

See testsuite.go for more examples.

Testing

Because the integration tests act against a shared cloud bucket and its objects, run tests without parallelization:

```
cd $GOPATH/src/github.com/lytics/cloudstorage
go test -p 1 ./...
```

# Functions

Backoff sleeps a random amount so failed requests can be retried with backoff.
CachePathObj checks the cache path.
CleanETag transforms a string into the full etag spec, removing extra quote marks and whitespace from the etag.
CleanupCacheFiles cleans up old store cache files; if your process crashes, its old cache files (the local copies of the cloud files) will be left behind.
ContentType checks the content type of a file by looking at its extension (.html, .png); uses package mime for global types.
Copy copies source to destination.
EnsureContextType reads the Type of the metadata.
EnsureDir ensures the directory exists.
Exists reports whether the file path exists on the local file system.
Move moves the source object to the destination.
NewObjectPageIterator creates an iterator that wraps the store List interface.
NewQuery creates a query for finding files under a given prefix.
NewQueryAll queries for all objects/files.
NewQueryForFolders creates a query for finding folders under a given path.
NewStore creates a new Store from a storage Config/Context.
ObjectResponseFromIter gets all objects for an iterator.
ObjectsAll gets all objects for an iterator.
Register adds a store type provider.

# Constants

ContentTypeKey.
MaxResults is the default number of objects to retrieve during a list-objects request; if more objects exist, they will need to be paged.
ReadOnly is one of the file permission levels.
StoreCacheFileExt = ".cache".

# Variables

ErrNotImplemented means this feature is not implemented for this store.
ErrObjectExists is the error for trying to create an already existing file.
ErrObjectNotFound is the error for not finding a file (object).

# Structs

ObjectPageIterator is an iterator that facilitates paging through the store.List() method to read all objects matching a query.
Query is used to query the cloud source.

# Type aliases

AccessLevel is the level of permissions on files.
Filter is a func type definition for filtering objects.
StoreProvider is a provider function for creating new Stores.
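Filter and StoreProvider illustrate a common Go idiom: named function types used as plug-in points. The exact signatures in this package may differ; a self-contained sketch of the pattern with stand-in types:

```go
package main

import "fmt"

// Object is a stand-in for the package's object type (hypothetical here).
type Object struct{ Name string }

// Filter is a func type for filtering objects, as the alias above describes.
type Filter func(objs []Object) []Object

// Store and Config are stand-ins; StoreProvider is a provider function
// for creating new Stores, registered by backend type.
type Store interface{ Type() string }
type Config struct{ Type string }
type StoreProvider func(*Config) (Store, error)

type localStore struct{}

func (localStore) Type() string { return "localfs" }

var providers = map[string]StoreProvider{}

// Register adds a store type provider, mirroring the Register function listed above.
func Register(name string, p StoreProvider) { providers[name] = p }

func main() {
	Register("localfs", func(c *Config) (Store, error) { return localStore{}, nil })

	s, _ := providers["localfs"](&Config{Type: "localfs"})
	fmt.Println(s.Type())

	// A Filter that keeps only .csv objects.
	csvOnly := Filter(func(objs []Object) []Object {
		var out []Object
		for _, o := range objs {
			if len(o.Name) > 4 && o.Name[len(o.Name)-4:] == ".csv" {
				out = append(out, o)
			}
		}
		return out
	})
	fmt.Println(len(csvOnly([]Object{{"a.csv"}, {"b.txt"}})))
}
```

Registering providers in a map keyed by store type is what lets a single NewStore(config) dispatch to localfs, GCS, S3, and so on.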