Repository: https://github.com/asfsadas/contrib.git


# README

entproto

entproto is a library and CLI tool to facilitate the generation of .proto files from an ent.Schema.

Disclaimer: this is an experimental feature; expect the API to change in the near future.

Quick Start

Prerequisites:

Download the module:

go get -u github.com/asfsadas/contrib/entproto

Install protoc-gen-entgrpc (ent's gRPC service implementation generator):

go get github.com/asfsadas/contrib/entproto/cmd/protoc-gen-entgrpc

Annotate the schema with entproto.Message() and all fields with the desired proto field numbers (note that field number 1 is reserved for the schema's ID field):

package schema

import (
	"entgo.io/ent"
	"entgo.io/ent/schema"
	"entgo.io/ent/schema/field"
	"github.com/asfsadas/contrib/entproto"
)

type User struct {
	ent.Schema
}

func (User) Annotations() []schema.Annotation {
	return []schema.Annotation{
		entproto.Message(),
		entproto.Service(), // also generate a gRPC service definition
	}
}

func (User) Fields() []ent.Field {
	return []ent.Field{
		field.String("user_name").
			Annotations(entproto.Field(2)),
	}
}

Run the code generation:

go run github.com/asfsadas/contrib/entproto/cmd/entproto -path ./ent/schema

The proto file is generated under ./ent/proto/entpb/entpb.proto:

// Code generated by entproto. DO NOT EDIT.
syntax = "proto3";

package entpb;

option go_package = "github.com/asfsadas/contrib/entproto/internal/todo/ent/proto/entpb";

message User {
  int32 id = 1;

  string user_name = 2;
}

In addition, a file named generate.go is created adjacent to the .proto file. It contains a //go:generate directive that invokes protoc to create the Go files for the protocol buffers and gRPC services. If a file by that name already exists, this step is skipped. The protoc invocation requests code generation from three plugins: protoc-gen-go (standard Go codegen), protoc-gen-go-grpc (standard gRPC codegen), and protoc-gen-entgrpc (an ent-specific protoc plugin that generates service implementations using ent).
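A generate.go of roughly this shape is what gets written (a sketch; the exact protoc flags mirror the command shown in the protoc-gen-entgrpc section below):

package entpb

//go:generate protoc -I=.. --go_out=.. --go-grpc_out=.. --go_opt=paths=source_relative --entgrpc_out=.. --entgrpc_opt=paths=source_relative,schema_path=../../schema --go-grpc_opt=paths=source_relative entpb/entpb.proto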

To generate the Go files from the .proto file run:

go generate ./ent/proto/...

protoc-gen-entgrpc

protoc-gen-entgrpc is a protoc plugin that generates server code implementing the gRPC interface generated from the ent schema. It must receive the path to the ent schema directory (via the schema_path option), which it uses to map the schema definitions to the proto definitions and produce correct code:

protoc -I=.. --go_out=.. --go-grpc_out=.. --go_opt=paths=source_relative --entgrpc_out=.. --entgrpc_opt=paths=source_relative,schema_path=../../schema --go-grpc_opt=paths=source_relative entpb/entpb.proto

As mentioned in the section above, this command will be generated for you for each protobuf package directory when you run the entproto command.

The current version generates a full service implementation; an example can be found in entpb/entpb_user_service.go:

// UserService implements UserServiceServer
type UserService struct {
	client *ent.Client
	UnimplementedUserServiceServer
}

func NewUserService(client *ent.Client) *UserService {
	return &UserService{client: client}
}

// Create implements UserServiceServer.Create
func (svc *UserService) Create(ctx context.Context, req *CreateUserRequest) (*User, error) {
	return nil, status.Error(codes.Unimplemented, "error")
}

// ... and so on

Some caveats with the current version:

  • Currently only "unique" edges are supported (O2O, O2M). Support for multi-relations will land soon.
  • The generated "mutating" methods (Create/Update) currently set all fields, disregarding zero/null values and field nullability.
  • All fields are copied from the gRPC request to the ent client; support for making some fields not settable via the service through a field/edge annotation is also planned.
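Once generated, the service implementation can be registered on a standard gRPC server. Below is a minimal sketch; the generated-package import paths, the SQL driver, and the DSN are assumptions to adjust for your project:

package main

import (
	"log"
	"net"

	"google.golang.org/grpc"

	_ "github.com/mattn/go-sqlite3" // assumed SQL driver; swap for your database

	// Assumed import paths; point these at your generated ent and entpb packages.
	"github.com/asfsadas/contrib/entproto/internal/todo/ent"
	"github.com/asfsadas/contrib/entproto/internal/todo/ent/proto/entpb"
)

func main() {
	// Open an ent client (driver name and DSN are placeholders).
	client, err := ent.Open("sqlite3", "file:ent?mode=memory&cache=shared&_fk=1")
	if err != nil {
		log.Fatalf("opening ent client: %v", err)
	}
	defer client.Close()

	// Register the entproto-generated service implementation on a gRPC server.
	srv := grpc.NewServer()
	entpb.RegisterUserServiceServer(srv, entpb.NewUserService(client))

	lis, err := net.Listen("tcp", ":5000")
	if err != nil {
		log.Fatalf("listening: %v", err)
	}
	if err := srv.Serve(lis); err != nil {
		log.Fatalf("serving: %v", err)
	}
}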

Programmatic code-generation

To programmatically invoke entproto from a custom entc.Generate call, entproto can be used as a gen.Hook. For example:

package main

import (
	"log"
	
	"github.com/asfsadas/contrib/entproto"
	"entgo.io/ent/entc"
	"entgo.io/ent/entc/gen"
)

func main() {
	err := entc.Generate("./ent/schema", &gen.Config{
		Hooks: []gen.Hook{
			// Run entproto codegen in addition to normal ent codegen.
			entproto.Hook(),
		},
	})
	if err != nil {
		log.Fatal(err)
	}
}
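One common way to wire this in, assuming the usual entc layout rather than anything entproto-specific, is to keep the program above in ent/entc.go behind a //go:build ignore constraint and trigger it from ent/generate.go:

package ent

// Assumed layout: the program above lives in ent/entc.go behind a
// "//go:build ignore" constraint, and this file (ent/generate.go) runs it.
//go:generate go run -mod=mod entc.go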

Message Annotations

entproto.Message()

By default, entproto will skip all schemas, unless they explicitly opt-in for proto file generation:

type User struct {
	ent.Schema
}

func (User) Annotations() []schema.Annotation {
	return []schema.Annotation{entproto.Message()}
}

By default, the proto package name for the generated files is entpb, but it can be changed using a functional option:


func (MessageWithPackageName) Annotations() []schema.Annotation {
	return []schema.Annotation{entproto.Message(
		entproto.PackageName("io.entgo.apps.todo"),
	)}
}

Per the protobuf style guide:

Package name should be in lowercase, and should correspond to the directory hierarchy. e.g., if a file is in my/package/, then the package name should be my.package.

Therefore, protos for a package named io.entgo.apps.todo will be placed under io/entgo/apps/todo. To avoid issues with cyclic dependencies, all messages for a given package are placed in a single file named after the last part of the package name. In the example above, the generated file name will be todo.proto.

entproto.SkipGen()

To explicitly opt-out of proto file generation, the functional option entproto.SkipGen() can be used:

func (ExplicitSkippedMessage) Annotations() []schema.Annotation {
	return []schema.Annotation{
		entproto.SkipGen(),
	}
}

This is useful in cases where a Mixin is used and its default behavior enables proto generation.
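For illustration, a hypothetical mixin that enables proto generation by default, and a schema that embeds it but opts back out with entproto.SkipGen() (the ProtoMixin and Internal names are made up for this sketch):

package schema

import (
	"entgo.io/ent"
	"entgo.io/ent/schema"
	"entgo.io/ent/schema/mixin"

	"github.com/asfsadas/contrib/entproto"
)

// ProtoMixin enables proto generation for every schema that embeds it.
type ProtoMixin struct {
	mixin.Schema
}

func (ProtoMixin) Annotations() []schema.Annotation {
	return []schema.Annotation{entproto.Message()}
}

// Internal embeds ProtoMixin but should not be exposed over proto.
type Internal struct {
	ent.Schema
}

func (Internal) Mixin() []ent.Mixin {
	return []ent.Mixin{ProtoMixin{}}
}

func (Internal) Annotations() []schema.Annotation {
	return []schema.Annotation{entproto.SkipGen()}
}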

entproto.Service()

entproto supports the generation of simple CRUD gRPC service definitions from an ent.Schema.

To enable generation of a service definition, add an entproto.Service() annotation:

func (User) Annotations() []schema.Annotation {
	return []schema.Annotation{
		entproto.Message(),
		entproto.Service(),
	}
}

This will generate:

message CreateUserRequest {
  User user = 1;
}

message GetUserRequest {
  int32 id = 1;
}

message UpdateUserRequest {
  User user = 1;
}

message DeleteUserRequest {
  int32 id = 1;
}

service UserService {
  rpc Create ( CreateUserRequest ) returns ( User );

  rpc Get ( GetUserRequest ) returns ( User );

  rpc Update ( UpdateUserRequest ) returns ( User );

  rpc Delete ( DeleteUserRequest ) returns ( google.protobuf.Empty );
}

Method generation can be customized by including the argument entproto.Methods() in the entproto.Service() annotation. entproto.Methods() accepts bit flags to determine what service methods should be generated.

// Generates a Create gRPC service method for the entproto.Service.
entproto.MethodCreate

// Generates a Get gRPC service method for the entproto.Service.
entproto.MethodGet

// Generates an Update gRPC service method for the entproto.Service.
entproto.MethodUpdate

// Generates a Delete gRPC service method for the entproto.Service.
entproto.MethodDelete

// Generates all service methods for the entproto.Service.
// This is the same behavior as not including entproto.Methods.
entproto.MethodAll

To generate a service with multiple methods, bitwise OR the flags.

For example, the ent.Schema can be modified to generate only Create and Get methods:

func (User) Annotations() []schema.Annotation {
	return []schema.Annotation{
		entproto.Message(),
		entproto.Service(
			entproto.Methods(entproto.MethodCreate | entproto.MethodGet),
		),
	}
}

This will generate:

message CreateUserRequest {
  User user = 1;
}

message GetUserRequest {
  int32 id = 1;
}

service UserService {
  rpc Create ( CreateUserRequest ) returns ( User );

  rpc Get ( GetUserRequest ) returns ( User );
}

Field Annotations

entproto.Field

All fields must be annotated with entproto.Field to specify their proto field numbers:

// Fields of the User.
func (User) Fields() []ent.Field {
	return []ent.Field{
		field.String("name").
			Annotations(entproto.Field(2)),
	}
}

The ID field is added to the generated message as well. In the example above it is defined implicitly, but entproto also respects explicitly defined ID fields. The above schema translates to:

message User {
  int32 id = 1;
  string name = 2;
}

Field type mappings:

Ent Type     Proto Type                 More considerations
TypeBool     bool
TypeTime     google.protobuf.Timestamp
TypeJSON     X
TypeUUID     bytes                      When receiving an arbitrary byte slice as input, its 16-byte length must be validated (see the sketch after the validations list below).
TypeBytes    bytes
TypeEnum     Enum                       Proto enums, like proto fields, require stable numbers to be assigned to each value, so an extra annotation is needed to map field values to tag numbers (see entproto.Enum below).
TypeString   string
TypeOther    X
TypeInt8     int32
TypeInt16    int32
TypeInt32    int32
TypeInt      int32
TypeInt64    int64
TypeUint8    uint32
TypeUint16   uint32
TypeUint32   uint32
TypeUint     uint32
TypeUint64   uint64
TypeFloat32  float
TypeFloat64  double

Validations:

  • Field number 1 is reserved for the ID field
  • No duplication of field numbers (this is illegal protobuf)
  • Only supported ent field types are used
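Returning to the TypeUUID row above, a sketch of the kind of length check implied when converting an arbitrary proto bytes field back into a UUID (uuidFromBytes is a hypothetical helper, not part of entproto, and assumes the github.com/google/uuid package):

package entpb

import (
	"fmt"

	"github.com/google/uuid"
)

// uuidFromBytes checks the 16-byte length before converting the raw proto
// bytes field into a uuid.UUID.
func uuidFromBytes(b []byte) (uuid.UUID, error) {
	if len(b) != 16 {
		return uuid.Nil, fmt.Errorf("expected 16 bytes for a UUID, got %d", len(b))
	}
	return uuid.FromBytes(b)
}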

Custom Fields

In some edge cases, it may be required to override the automatic ent-to-proto type mapping. This can be done by passing entproto.Type to the entproto.Field annotation:

field.Uint8("custom_pb").
    Annotations(
        entproto.Field(12,
            entproto.Type(descriptorpb.FieldDescriptorProto_TYPE_UINT64),
        ),
    )

entproto.Enum

Proto enum values, like message fields, are assigned numeric identifiers that are expected to remain stable across all versions. This means that a specific ent Enum field option must always be translated to the same numeric identifier whenever the code is re-generated.

To accommodate this, we add an additional annotation (entproto.Enum) that maps between the Ent Enum options and their desired proto identifier:


// Fields of the Todo.
func (Todo) Fields() []ent.Field {
	return []ent.Field{
		field.String("task").
			Annotations(entproto.Field(2)),
		field.Enum("status").
			Values("pending", "in_progress", "done").
			Default("pending").
			Annotations(
				entproto.Field(3),
				entproto.Enum(map[string]int32{
					"pending":     0,
					"in_progress": 1,
					"done":        2,
				}),
			),
	}
}

Which is transformed into:

message Todo {
  int32 id = 1;

  string task = 2;

  Status status = 3;

  enum Status {
    PENDING = 0;

    IN_PROGRESS = 1;

    DONE = 2;
  }
}

As per the proto3 language guide for enums, the zero (default) value must always be specified. The Proto Style Guide suggests using CAPS_WITH_UNDERSCORES for value names and an _UNSPECIFIED suffix for the zero value. Ent supports specifying default values for Enum fields, and we map this to proto enums in the following manner:

  • If no default value is defined for the enum, we generate a <MessageName>_UNSPECIFIED = 0; option on the enum and verify that no option received the 0 number in the entproto.Enum options field (sketched after this list).
  • If a default value is defined for the enum, we verify that it receives the 0 value in the options field.
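As an illustration of the first rule, a hypothetical enum field with no default value; its entproto.Enum options must then leave 0 unused, since 0 is taken by the generated *_UNSPECIFIED value:

field.Enum("priority").
	Values("low", "high").
	Annotations(
		entproto.Field(4),
		entproto.Enum(map[string]int32{
			// 0 is reserved for the generated *_UNSPECIFIED value,
			// so the options start at 1.
			"low":  1,
			"high": 2,
		}),
	)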

Edges

Edges are annotated in the same way as fields: the entproto.Field annotation specifies the field number for the generated field. Unique relations are mapped to regular message fields; non-unique relations are mapped to repeated fields. For example:

func (BlogPost) Edges() []ent.Edge {
	return []ent.Edge{
		edge.To("author", User.Type).
			Unique().
			Annotations(entproto.Field(4)),
		edge.From("categories", Category.Type).
			Ref("blog_posts").
			Annotations(entproto.Field(5)),
	}
}

func (BlogPost) Fields() []ent.Field {
	return []ent.Field{
		field.String("title").
			Annotations(entproto.Field(2)),
		field.String("body").
			Annotations(entproto.Field(3)),
	}
}

This is transformed to:

message BlogPost {
  int32 id = 1;
  string title = 2;
  string body = 3;
  User author = 4;
  repeated Category categories = 5;
}

Validation:

  • Cyclic dependencies are not supported in protobuf - so back references can only be supported if both messages are output to the same proto package. (In the above example, BlogPost, User and Category must be output to the same proto package).

Contributing

Code generation

Please re-generate all code using go generate ./... before checking code in; otherwise CI will fail on this check. To ensure you get the same output as the CI process, make sure your local environment has the same versions of protoc, protoc-gen-go, and protoc-gen-go-grpc. See the environment setup in the ci.yaml file.

Codegen + Test flow

To rebuild the protoc-gen-entgrpc plugin, regenerate the code and run all tests:

go generate ./cmd/protoc-gen-entgrpc/... && 
  go get github.com/asfsadas/contrib/entproto/cmd/protoc-gen-entgrpc &&
  go generate ./... &&
  go test ./... 

Running in Docker

If you prefer to run code-generation inside a Docker container you can use the provided Dockerfile that mimics the CI environment.

Build the image:

docker build -t entproto-dev .

Run the image (from the root contrib/ directory), mounting your local source code into /go/src inside the container:

docker run -it -v $(pwd):/go/src -w /go/src/entproto entproto-dev bash

From within the container, compile and install your current protoc-gen-entgrpc binary, regenerate all code, and run the tests:

go generate ./cmd/protoc-gen-entgrpc/... && 
  go get github.com/asfsadas/contrib/entproto/cmd/protoc-gen-entgrpc &&
  go generate ./... &&
  go test ./...