github.com/takashabe/google-cloud-go (module, package)
Version: 9.1.0+incompatible
Repository: https://github.com/takashabe/google-cloud-go.git
Documentation: pkg.go.dev

# README

Google Cloud Client Libraries for Go


Go packages for Google Cloud Platform services.

import "cloud.google.com/go"

To install the packages on your system,

$ go get -u cloud.google.com/go/...

NOTE: Some of these packages are under development, and may occasionally make backwards-incompatible changes.

NOTE: This GitHub repo is a mirror of https://code.googlesource.com/gocloud.

News

January 18, 2018

v0.18.0

  • bigquery:

    • Marked stable.
    • Schema inference of nullable fields supported.
    • Added TimePartitioning to QueryConfig.
  • firestore: Data provided to DocumentRef.Set with a Merge option can contain Delete sentinels.

  • logging: Clients can accept parent resources other than projects.

  • pubsub:

    • pubsub/pstest: A lightweight fake for pubsub. Experimental; feedback welcome.
    • Support updating more subscription metadata: AckDeadline, RetainAckedMessages and RetentionDuration.
  • oslogin/apiv1beta: New client for the Cloud OS Login API.

  • rpcreplay: A package for recording and replaying gRPC traffic.

  • spanner:

    • Add a ReadWithOptions call that supports a row limit as well as an index (see the sketch after this list).
    • Support query plan and execution statistics.
    • Added OpenCensus support.
  • storage: Clarify checksum validation for gzipped files (it is not validated when the file is served uncompressed).
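
A minimal sketch of the new spanner ReadWithOptions call, assuming a hypothetical Users table with a UsersByEmail secondary index:

// Read at most 10 rows of Users through the UsersByEmail index.
// Table, index, and column names here are hypothetical.
iter := client.Single().ReadWithOptions(ctx, "Users",
	spanner.AllKeys(), []string{"name", "email"},
	&spanner.ReadOptions{Index: "UsersByEmail", Limit: 10})
defer iter.Stop()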

December 11, 2017

v0.17.0

  • firestore BREAKING CHANGES:
    • Remove UpdateMap and UpdateStruct; rename UpdatePaths to Update. Change docref.UpdateMap(ctx, map[string]interface{}{"a.b": 1}) to docref.Update(ctx, []firestore.Update{{Path: "a.b", Value: 1}})

      Change docref.UpdateStruct(ctx, []string{"Field"}, aStruct) to docref.Update(ctx, []firestore.Update{{Path: "Field", Value: aStruct.Field}})

    • Rename MergePaths to Merge; require args to be FieldPaths

    • A value stored as an integer can be read into a floating-point field, and vice versa.

  • bigtable/cmd/cbt:
    • Support deleting a column.
    • Add regex option for row read.
  • spanner: Mark stable.
  • storage:
    • Add Reader.ContentEncoding method.
    • Fix handling of SignedURL headers.
  • bigquery:
    • If Uploader.Put is called with no rows, it returns nil without making a call.
    • Schema inference supports the "nullable" option in struct tags for non-required fields (see the sketch after this list).
    • TimePartitioning supports "Field".
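
A minimal sketch of the "nullable" struct tag option during schema inference (the Item type is hypothetical):

// Notes is inferred as NULLABLE rather than REQUIRED because of the tag.
type Item struct {
	Name  string
	Notes string `bigquery:",nullable"`
}
schema, err := bigquery.InferSchema(Item{})
if err != nil {
	// TODO: Handle error.
}
_ = schema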

October 30, 2017

v0.16.0

  • Other bigquery changes:

    • JobIterator.Next returns *Job; removed JobInfo (BREAKING CHANGE).
    • UseStandardSQL is deprecated; set UseLegacySQL to true if you need Legacy SQL.
    • Uploader.Put will generate a random insert ID if you do not provide one.
    • Support time partitioning for load jobs.
    • Support dry-run queries.
    • A Job remembers its last retrieved status.
    • Support retrieving job configuration.
    • Support labels for jobs and tables.
    • Support dataset access lists.
    • Improve support for external data sources, including data from Bigtable and Google Sheets, and tables with external data.
    • Support updating a table's view configuration.
    • Fix uploading civil times with nanoseconds.
  • storage:

    • Support PubSub notifications.
    • Support Requester Pays buckets.
  • profiler: Support goroutine and mutex profile types.

October 3, 2017

v0.15.0

  • firestore: beta release. See the announcement.

  • errorreporting: The existing package has been redesigned.

  • errors: This package has been removed. Use errorreporting.

Older news

Supported APIs

Google API            Status   Package
BigQuery              stable   cloud.google.com/go/bigquery
Bigtable              stable   cloud.google.com/go/bigtable
Container             alpha    cloud.google.com/go/container/apiv1
Data Loss Prevention  alpha    cloud.google.com/go/dlp/apiv2beta1
Datastore             stable   cloud.google.com/go/datastore
Debugger              alpha    cloud.google.com/go/debugger/apiv2
ErrorReporting        alpha    cloud.google.com/go/errorreporting
Firestore             beta     cloud.google.com/go/firestore
Language              stable   cloud.google.com/go/language/apiv1
Logging               stable   cloud.google.com/go/logging
Monitoring            beta     cloud.google.com/go/monitoring/apiv3
OS Login              alpha    cloud.google.com/compute/docs/oslogin/rest
Pub/Sub               beta     cloud.google.com/go/pubsub
Spanner               stable   cloud.google.com/go/spanner
Speech                stable   cloud.google.com/go/speech/apiv1
Storage               stable   cloud.google.com/go/storage
Translation           stable   cloud.google.com/go/translate
Video Intelligence    beta     cloud.google.com/go/videointelligence/apiv1beta1
Vision                stable   cloud.google.com/go/vision/apiv1

Alpha status: the API is still being actively developed. As a result, it might change in backward-incompatible ways and is not recommended for production use.

Beta status: the API is largely complete, but still has outstanding features and bugs to be addressed. There may be minor backwards-incompatible changes where necessary.

Stable status: the API is mature and ready for production use. We will continue addressing bugs and feature requests.

Documentation and examples are available at https://godoc.org/cloud.google.com/go

Visit or join the google-api-go-announce group for updates on these packages.

Go Versions Supported

We support the two most recent major versions of Go. If Google App Engine uses an older version, we support that as well. You can see which versions are currently supported by looking at the lines following go: in .travis.yml.

Authorization

By default, each API will use Google Application Default Credentials for authorization credentials used in calling the API endpoints. This will allow your application to run in many environments without requiring explicit configuration.

client, err := storage.NewClient(ctx)

To authorize using a JSON key file, pass option.WithServiceAccountFile to the NewClient function of the desired package. For example:

client, err := storage.NewClient(ctx, option.WithServiceAccountFile("path/to/keyfile.json"))

You can exert more control over authorization by using the golang.org/x/oauth2 package to create an oauth2.TokenSource. Then pass option.WithTokenSource to the NewClient function:

tokenSource := ...
client, err := storage.NewClient(ctx, option.WithTokenSource(tokenSource))
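
For instance, a token source can come from Application Default Credentials via golang.org/x/oauth2/google; a minimal sketch (the read-only Storage scope is just an example):

// Assumes imports of "golang.org/x/oauth2/google",
// "google.golang.org/api/option", and "cloud.google.com/go/storage".
tokenSource, err := google.DefaultTokenSource(ctx, storage.ScopeReadOnly)
if err != nil {
	log.Fatal(err)
}
client, err := storage.NewClient(ctx, option.WithTokenSource(tokenSource))
if err != nil {
	log.Fatal(err)
}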

Cloud Datastore GoDoc

Example Usage

First create a datastore.Client to use throughout your application:

client, err := datastore.NewClient(ctx, "my-project-id")
if err != nil {
	log.Fatal(err)
}

Then use that client to interact with the API:

type Post struct {
	Title       string
	Body        string `datastore:",noindex"`
	PublishedAt time.Time
}
keys := []*datastore.Key{
	datastore.NameKey("Post", "post1", nil),
	datastore.NameKey("Post", "post2", nil),
}
posts := []*Post{
	{Title: "Post 1", Body: "...", PublishedAt: time.Now()},
	{Title: "Post 2", Body: "...", PublishedAt: time.Now()},
}
if _, err := client.PutMulti(ctx, keys, posts); err != nil {
	log.Fatal(err)
}
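
Reading the entities back is symmetric; a minimal sketch using GetMulti with the keys defined above:

// Fetch the entities just written; results arrive in the same order as keys.
got := make([]Post, len(keys))
if err := client.GetMulti(ctx, keys, got); err != nil {
	log.Fatal(err)
}
for _, p := range got {
	fmt.Println(p.Title, p.PublishedAt)
}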

Cloud Storage GoDoc

Example Usage

First create a storage.Client to use throughout your application:

client, err := storage.NewClient(ctx)
if err != nil {
	log.Fatal(err)
}
// Read the object1 from bucket.
rc, err := client.Bucket("bucket").Object("object1").NewReader(ctx)
if err != nil {
	log.Fatal(err)
}
defer rc.Close()
body, err := ioutil.ReadAll(rc)
if err != nil {
	log.Fatal(err)
}
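
Writing an object is the mirror image; a minimal sketch (the bucket and object names are illustrative):

// Write "hello world" to object2 in the same bucket.
wc := client.Bucket("bucket").Object("object2").NewWriter(ctx)
if _, err := wc.Write([]byte("hello world")); err != nil {
	log.Fatal(err)
}
// Close flushes buffered data and finalizes the object.
if err := wc.Close(); err != nil {
	log.Fatal(err)
}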

Cloud Pub/Sub GoDoc

Example Usage

First create a pubsub.Client to use throughout your application:

client, err := pubsub.NewClient(ctx, "project-id")
if err != nil {
	log.Fatal(err)
}

Then use the client to publish and subscribe:

// Publish "hello world" on topic1.
topic := client.Topic("topic1")
res := topic.Publish(ctx, &pubsub.Message{
	Data: []byte("hello world"),
})
// The publish happens asynchronously.
// Later, you can get the result from res:
...
msgID, err := res.Get(ctx)
if err != nil {
	log.Fatal(err)
}

// Use a callback to receive messages via subscription1.
sub := client.Subscription("subscription1")
err = sub.Receive(ctx, func(ctx context.Context, m *pubsub.Message) {
	fmt.Println(m.Data)
	m.Ack() // Acknowledge that we've consumed the message.
})
if err != nil {
	log.Println(err)
}
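
If the topic and subscription do not exist yet, they can be created with the same client; a minimal sketch (names are illustrative):

// Create topic1 and a subscription attached to it.
topic, err := client.CreateTopic(ctx, "topic1")
if err != nil {
	log.Fatal(err)
}
if _, err := client.CreateSubscription(ctx, "subscription1",
	pubsub.SubscriptionConfig{Topic: topic}); err != nil {
	log.Fatal(err)
}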

Cloud BigQuery GoDoc

Example Usage

First create a bigquery.Client to use throughout your application:

c, err := bigquery.NewClient(ctx, "my-project-ID")
if err != nil {
	// TODO: Handle error.
}

Then use that client to interact with the API:

// Construct a query.
q := c.Query(`
    SELECT year, SUM(number)
    FROM [bigquery-public-data:usa_names.usa_1910_2013]
    WHERE name = "William"
    GROUP BY year
    ORDER BY year
`)
q.UseLegacySQL = true // the bracketed table reference above uses legacy SQL syntax
// Execute the query.
it, err := q.Read(ctx)
if err != nil {
	// TODO: Handle error.
}
// Iterate through the results.
for {
	var values []bigquery.Value
	err := it.Next(&values)
	if err == iterator.Done {
		break
	}
	if err != nil {
		// TODO: Handle error.
	}
	fmt.Println(values)
}
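
Rows can also be streamed into a table with Uploader.Put; a minimal sketch (the dataset, table, and Score type are hypothetical):

// Stream one row into my_dataset.my_table; the schema is inferred from Score.
type Score struct {
	Name string
	Num  int
}
u := c.Dataset("my_dataset").Table("my_table").Uploader()
if err := u.Put(ctx, []*Score{{Name: "n1", Num: 12}}); err != nil {
	// TODO: Handle error.
}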

Stackdriver Logging GoDoc

Example Usage

First create a logging.Client to use throughout your application:

ctx := context.Background()
client, err := logging.NewClient(ctx, "my-project")
if err != nil {
	// TODO: Handle error.
}

Usually, you'll want to add log entries to a buffer to be periodically flushed (automatically and asynchronously) to the Stackdriver Logging service.

logger := client.Logger("my-log")
logger.Log(logging.Entry{Payload: "something happened!"})
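
Entries are not limited to plain strings; a minimal sketch of an entry with a severity and a structured (JSON-marshalable) payload:

// Log at Error severity with a structured payload.
logger.Log(logging.Entry{
	Severity: logging.Error,
	Payload:  map[string]interface{}{"user": "alice", "action": "delete"},
})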

Close your client before your program exits, to flush any buffered log entries.

err = client.Close()
if err != nil {
	// TODO: Handle error.
}

Cloud Spanner GoDoc

Example Usage

First create a spanner.Client to use throughout your application:

client, err := spanner.NewClient(ctx, "projects/P/instances/I/databases/D")
if err != nil {
	log.Fatal(err)
}
// Simple Reads And Writes
_, err = client.Apply(ctx, []*spanner.Mutation{
	spanner.Insert("Users",
		[]string{"name", "email"},
		[]interface{}{"alice", "[email protected]"})})
if err != nil {
	log.Fatal(err)
}
row, err := client.Single().ReadRow(ctx, "Users",
	spanner.Key{"alice"}, []string{"email"})
if err != nil {
	log.Fatal(err)
}
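
Parameterized queries go through the same read-only transaction; a minimal sketch (iterator here is google.golang.org/api/iterator):

// Run a SQL query with a named parameter and print each matching email.
iter := client.Single().Query(ctx, spanner.Statement{
	SQL:    `SELECT email FROM Users WHERE name = @name`,
	Params: map[string]interface{}{"name": "alice"},
})
defer iter.Stop()
for {
	row, err := iter.Next()
	if err == iterator.Done {
		break
	}
	if err != nil {
		log.Fatal(err)
	}
	var email string
	if err := row.Columns(&email); err != nil {
		log.Fatal(err)
	}
	fmt.Println(email)
}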

Contributing

Contributions are welcome. Please see the CONTRIBUTING document for details. We use Gerrit for our code reviews. Please don't open pull requests against this repo; new pull requests will be automatically closed.

Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Contributor Code of Conduct for more information.

# Packages

Package bigquery provides a client for the BigQuery service.
Package bigtable is an API to Google Cloud Bigtable.
Package civil implements types for civil time, a time-zone-independent representation of time that follows the rules of the proleptic Gregorian calendar with exactly 24-hour days, 60-minute hours, and 60-second minutes.
Package container contains a deprecated Google Container Engine client.
Package datastore provides a client for Google Cloud Datastore.
Package errorreporting is a Google Stackdriver Error Reporting library.
Package firestore provides a client for reading and writing to a Cloud Firestore database.
Package iam supports the resource-specific operations of Google Cloud IAM (Identity and Access Management) for the Google Cloud Libraries.
Package logging contains a Stackdriver Logging client suitable for writing logs.
Package longrunning supports Long Running Operations for the Google Cloud Libraries.
Package profiler is a client for the Stackdriver Profiler service.
Package pubsub provides an easy way to publish and receive Google Cloud Pub/Sub messages, hiding the details of the underlying server RPCs.
Package rpcreplay supports the capture and replay of gRPC calls.
Package spanner provides a client for reading and writing to Cloud Spanner databases.
Package storage provides an easy way to work with Google Cloud Storage.
This package is OBSOLETE.
Package translate is a client for the Google Translation API.