Protocol Buffers (aka Protobuf) is an interface definition language and binary serialization format. Schemas defined in `.proto` files are platform-independent and can be used in many languages.

For example, the following Protobuf file (`example.proto`) defines a data structure named `User`:

```proto
syntax = "proto3";
package example;

message User {
  string first_name = 1;
  string last_name = 2;
  bool active = 3;
  User manager = 4;
  repeated string locations = 5;
  map<string, string> projects = 6;
}
```
To use the data structure, you generate code with a Protobuf compiler and a plugin for the language of your choice. To learn more about Protobuf's capabilities, read the official language guide.
Protobuf-ES is a complete implementation of Protocol Buffers in TypeScript, suitable for web browsers and Node.js, created by Buf. It's the only fully-compliant JavaScript Protobuf library that passes the Protobuf conformance tests—read more on our blog.
Protobuf-ES consists of three npm packages:
- @bufbuild/protoc-gen-es: Compiler plugin to generate TypeScript or JavaScript.
- @bufbuild/protobuf: Runtime library with core functionality.
- @bufbuild/protoplugin: Helps to create your own code generator.
The quickstart below shows a simple example of code generation for a `.proto` file.

- Start with a new project:

  ```shell
  mkdir example
  cd example
  npm init -y
  npm install typescript
  npx tsc --init
  ```

- Install the runtime library, code generator, and the Buf CLI:

  ```shell
  npm install @bufbuild/protobuf
  npm install --save-dev @bufbuild/buf @bufbuild/protoc-gen-es
  ```

- Create a `buf.gen.yaml` file that looks like this:

  ```yaml
  # Learn more: https://buf.build/docs/configuration/v2/buf-gen-yaml
  version: v2
  inputs:
    - directory: proto
  plugins:
    - local: protoc-gen-es
      out: src/gen
      opt: target=ts
  ```

- Create a `proto` subdirectory and download `example.proto` into it.

- To generate code for all Protobuf files in the `proto` directory, simply run:

  ```shell
  npx buf generate
  ```

The generated code now exists in `src/gen/example_pb.ts`:

```
.
├── buf.gen.yaml
├── package.json
├── proto
│   └── example.proto
└── src
    └── gen
        └── example_pb.ts
```
`protoc-gen-es` is a standard Protobuf plugin, and can also be used with `protoc`:

```shell
PATH=$PATH:$(pwd)/node_modules/.bin \
  protoc -I . \
  --es_out src/gen \
  --es_opt target=ts \
  proto/example.proto
```

Note that `node_modules/.bin` needs to be added to the `$PATH` so that the Protobuf compiler can find the plugin. This happens automatically with npm scripts.

If you use Yarn, versions v2 and above don't use a `node_modules` directory, so you need to change the variable a bit:

```shell
PATH=$(dirname $(yarn bin protoc-gen-es)):$PATH
```
Our plugin supports a few options to control the generated code. The example above used `target=ts` to generate TypeScript files.

With @bufbuild/buf, multiple options can be specified as a YAML list:

```yaml
# buf.gen.yaml
version: v2
plugins:
  - local: protoc-gen-es
    out: src/gen
    opt: # multiple options
      - target=ts
      - import_extension=js
```

With `protoc`, you specify multiple options with multiple `--es_opt` flags. Alternatively, both compilers allow you to specify multiple options as a single comma-separated value like `target=ts,import_extension=js`.
The `target` option controls whether the plugin generates JavaScript, TypeScript, or TypeScript declaration files. Possible values:

- `target=js`: Generates a `_pb.js` file for every `.proto` input file.
- `target=ts`: Generates a `_pb.ts` file for every `.proto` input file.
- `target=dts`: Generates a `_pb.d.ts` file for every `.proto` input file.

You can pass multiple values by separating them with `+`, for example `target=js+dts`.

By default, the plugin generates JavaScript and TypeScript declaration files, which produces the smallest code size and is the most compatible with various bundler configurations. If you prefer to generate TypeScript, use `target=ts`.
By default, protoc-gen-es doesn't add file extensions to import paths. However, some environments require an import extension. For example, using ECMAScript modules in Node.js requires the `.js` extension, and Deno requires `.ts`. With the `import_extension` option, you can add `.js`/`.ts` extensions to import paths. Possible values:

- `import_extension=none`: Doesn't add an extension. (Default)
- `import_extension=js`: Adds the `.js` extension.
- `import_extension=ts`: Adds the `.ts` extension.
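The effect on import paths can be illustrated with a small sketch. The `addImportExtension` helper below is hypothetical, purely for illustration; it is not part of the plugin or runtime API:

```typescript
// Hypothetical helper that mimics what the import_extension option does
// to a relative import path in generated code. Illustration only.
function addImportExtension(
  path: string,
  ext: "none" | "js" | "ts",
): string {
  if (ext === "none") {
    return path; // default: leave the path untouched
  }
  return `${path}.${ext}`;
}

// With import_extension=js, "./example_pb" becomes "./example_pb.js":
addImportExtension("./example_pb", "js"); // "./example_pb.js"
addImportExtension("./example_pb", "none"); // "./example_pb"
```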
By default, protoc-gen-es generates ECMAScript `import` and `export` statements. For use cases where CommonJS is difficult to avoid, the `js_import_style` option can be used to generate CommonJS `require()` calls. Possible values:

- `js_import_style=module`: Generates ECMAScript `import`/`export` statements. (Default)
- `js_import_style=legacy_commonjs`: Generates CommonJS `require()` calls.
By default, protoc-gen-es omits empty files from the plugin output. The option `keep_empty_files=true` disables pruning of empty files, to allow for smooth interoperation with Bazel and similar tooling that requires all output files to be declared ahead of time. Unless you use Bazel, you probably don't need this option.
protoc-gen-es generates valid TypeScript for current versions of the TypeScript compiler with standard settings. If you use compiler settings that yield an error for generated code, setting the option `ts_nocheck=true` generates an annotation at the top of each file to skip type checks: `// @ts-nocheck`.
The option `json_types=true` generates JSON types for every Protobuf message and enumeration. Calling `toJson()` automatically returns the JSON type if available. Learn more about JSON types.
This section shows the code that Protobuf-ES generates for each Protobuf definition, based on `example.proto`. For every Protobuf source file, it generates a corresponding `.js`, `.ts`, or `.d.ts` file, and adds a `_pb` suffix to the name. For example, for `foo/bar.proto`, it generates `foo/bar_pb.js`.

At the top of each file, it generates a preamble with information about the source file and how it was generated:

```ts
// @generated by protoc-gen-es v2.0.0 with parameter "target=dts"
// @generated from file example.proto (package example, syntax proto3)
/* eslint-disable */
```

Below the preamble are imports. If your Protobuf file imports another Protobuf file, a relative import is generated:

```ts
import type { User } from "./example_pb";
```
Tip
By default, it generates ECMAScript modules, which means we use `import` and `export` statements. To modify imports, see the plugin options `js_import_style` and `import_extension`.
Below the import statements, it generates the schema of the Protobuf file:

```ts
/**
 * Describes the file example.proto.
 */
export declare const file_example: GenFile;
```

You typically only need this export to create a registry, or for advanced use cases with reflection.
Messages are the primary data structures in Protobuf. They're simple objects with an arbitrary number of fields. For the following declaration:
```proto
message User {
  string first_name = 1;
}
```

Protobuf-ES generates a `User` type:

```ts
import type { Message } from "@bufbuild/protobuf";

/**
 * @generated from message example.User
 */
export declare type User = Message<"example.User"> & {
  /**
   * @generated from field: string first_name = 1;
   */
  firstName: string;
};
```

and its schema:

```ts
export declare const UserSchema: GenMessage<User>;
```

If you've used zod before, it's similar: you use the schema to parse a message:

```ts
import { fromBinary } from "@bufbuild/protobuf";
import { UserSchema } from "./gen/example_pb";

const bytes = new Uint8Array([10, 3, 84, 105, 109]);
fromBinary(UserSchema, bytes); // User
```
Schemas are a powerful feature. You can take a deeper dive in the section about reflection.
A Protobuf message has an arbitrary number of fields, and fields have one of several available types. Scalar field types are primitive types, such as a simple string:
```proto
string first_name = 1;
```

They're generated as the closest matching type in ECMAScript:

```ts
/**
 * @generated from field: string first_name = 1;
 */
firstName: string;
```
Here is a complete list of scalar types, and how they map to ECMAScript:
| Protobuf type | ECMAScript type | Notes | Default value |
|---|---|---|---|
| string | string | UTF-8 | `""` |
| bool | boolean | | `false` |
| bytes | Uint8Array | | `new Uint8Array(0)` |
| double | number | Double-precision, 64-bit floating point value | `0` |
| float | number | Single-precision, 32-bit floating point value | `0` |
| int32 | number | 32-bit signed integer with variable length | `0` |
| uint32 | number | 32-bit unsigned integer with variable length | `0` |
| int64 | bigint | 64-bit signed integer with variable length | `0n` |
| uint64 | bigint | 64-bit unsigned integer with variable length | `0n` |
| fixed32 | number | 32-bit unsigned integer with fixed length (always 4 bytes) | `0` |
| fixed64 | bigint | 64-bit unsigned integer with fixed length (always 8 bytes) | `0n` |
| sfixed32 | number | 32-bit signed integer with fixed length (always 4 bytes) | `0` |
| sfixed64 | bigint | 64-bit signed integer with fixed length (always 8 bytes) | `0n` |
| sint32 | number | 32-bit signed integer with variable length, most efficient for negative numbers | `0` |
| sint64 | bigint | 64-bit signed integer with variable length, most efficient for negative numbers | `0n` |
Scalar fields use the zero-value as the default.
If `bigint` isn't available in your environment, you can still serialize and deserialize messages with 64-bit integral fields without losing any data, but the fields hold `string` values instead of `bigint`.
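The reason 64-bit fields map to `bigint` rather than `number` is easy to demonstrate: an ECMAScript `number` is a 64-bit float that can only represent integers exactly up to 2^53 - 1, while `bigint` has arbitrary precision. A quick self-contained sketch:

```typescript
// The largest integer a number can represent exactly: 2 ** 53 - 1.
const maxSafe: number = Number.MAX_SAFE_INTEGER; // 9007199254740991

// Beyond that, number silently loses precision...
const asNumber: number = 9007199254740993; // note the odd last digit
console.log(asNumber === 9007199254740992); // true - precision lost!

// ...while bigint represents any 64-bit value exactly.
const asBigint: bigint = 9007199254740993n;
console.log(asBigint === 9007199254740992n); // false - distinct values

// The full int64 range fits in a bigint:
const int64Max: bigint = 9223372036854775807n;
console.log(int64Max === 2n ** 63n - 1n); // true
```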
If you prefer that a field use `string` instead of `bigint`, use the field option `jstype = JS_STRING`:

```proto
int64 field = 1 [jstype = JS_STRING]; // will generate `field: string`
```

Tip
Set `jstype = JS_STRING` on all applicable fields automatically with buf. Add the following managed mode config:

```yaml
# Add to buf.gen.yaml:
managed:
  enabled: true
  override:
    - field_option: jstype
      value: JS_STRING
```
For the following Protobuf field declaration:
```proto
User manager = 4;
```

Protobuf-ES generates the following property:

```ts
/**
 * @generated from field: example.User manager = 4;
 */
manager?: User;
```
Message fields don't have default values in Protobuf. They are always optional in ECMAScript.
Tip
google.protobuf.Struct and the messages from wrappers.proto have a special representation in generated code.
Repeated fields are represented with an ECMAScript Array. For example, the following Protobuf field declaration:
```proto
repeated string locations = 5;
```

is generated as:

```ts
/**
 * @generated from field: repeated string locations = 5;
 */
locations: string[] = [];
```
Repeated fields have an empty array as the default value.
For the following Protobuf declaration:
```proto
map<string, string> projects = 6;
```

Protobuf-ES generates the property:

```ts
/**
 * @generated from field: map<string, string> projects = 6;
 */
projects: { [key: string]: string } = {};
```
Map fields have an empty object as the default value.
Note
ECMAScript Map objects have great support for key types, but many popular libraries don't support them correctly yet. For this reason, we use an object to represent map fields.
A `oneof` construct in Protobuf guarantees that only one of the contained fields can be selected at a time.
For the following Protobuf definition:
```proto
oneof result {
  int32 number = 1;
  string error = 2;
}
```

Protobuf-ES generates the following property:

```ts
result:
  | { case: "number"; value: number }
  | { case: "error"; value: string }
  | { case: undefined; value?: undefined } = { case: undefined };
```
The entire `oneof` group is turned into an object `result` with two properties:

- `case`: The name of the selected field
- `value`: The value of the selected field

This property is always defined on the message, similar to the way map or repeated fields are always defined. By default, it's `{ case: undefined }`.
In our example, `result.case` can be either `"number"`, `"error"`, or `undefined`. If a field is selected, the property `result.value` contains the value of the selected field.

To select a field, simply replace the `result` object:

```ts
user.result = { case: "number", value: 123 };
user.result = { case: undefined };
```
To query a `oneof` group, you can use if blocks:

```ts
if (user.result.case === "number") {
  user.result.value; // a number
}
```

or a switch statement:

```ts
switch (user.result.case) {
  case "number":
    user.result.value; // a number
    break;
  case "error":
    user.result.value; // a string
    break;
}
```
This representation is particularly useful in TypeScript, because the compiler narrows down the type. The if blocks and switch statements above tell the compiler the type of the `value` property.

Tip
This feature requires the TypeScript compiler option `strictNullChecks` to be enabled. This option is automatically enabled with the option `strict`. See the documentation for details.
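The generated `oneof` shape is a plain TypeScript discriminated union, so the narrowing works without any Protobuf machinery. The following is a self-contained sketch of the same pattern; the `Result` type and `describe` function are illustrations that mirror the generated `result` property, not part of the library:

```typescript
// A discriminated union mirroring the generated oneof property.
type Result =
  | { case: "number"; value: number }
  | { case: "error"; value: string }
  | { case: undefined; value?: undefined };

// The compiler narrows the type of `value` based on `case`.
function describe(result: Result): string {
  switch (result.case) {
    case "number":
      return `number: ${result.value.toFixed(2)}`; // value is a number here
    case "error":
      return `error: ${result.value.toUpperCase()}`; // value is a string here
    default:
      return "not set";
  }
}

describe({ case: "number", value: 123 }); // "number: 123.00"
describe({ case: "error", value: "oops" }); // "error: OOPS"
describe({ case: undefined }); // "not set"
```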
Groups are a deprecated language feature of proto2 that allows you to declare a field and a message at the same time:
```proto
optional group MyGroup = 1 {
  optional int32 int32_field = 1;
}
```

For this group field, Protobuf-ES generates the following property and the message `User_MyGroup`:

```ts
/**
 * @generated from field: optional example.User.MyGroup mygroup = 1;
 */
mygroup?: User_MyGroup;
```
Caution
The groups feature is deprecated and shouldn't be used when creating new schemas. Use nested messages instead.
In proto2, fields can use the `required` keyword to ensure that the field is always set. In Protobuf-ES, required fields are validated when serializing a message, but not when parsing or constructing a message.

With Protobuf-ES v2, `required` is less of a burden because the properties are no longer optional. However, the improvement only applies to scalar and enum fields, not to message fields. For message fields, the behavior for proto2 `required` is unchanged between v1 and v2.
Caution
`required` is a legacy feature. The official language guide states: Do not use.
In proto3, zero values like `0`, `false`, or `""` aren't serialized. The `optional` keyword enables presence tracking for a field, allowing you to distinguish between an absent value and an explicitly set zero value.

```proto
optional bool active = 3;
```

The field is generated as an optional property:

```ts
/**
 * @generated from field: optional bool active = 3;
 */
active?: boolean;
```
Tip
See field presence and default values for more information about optional fields.
Property names are always lowerCamelCase, even if the corresponding Protobuf field uses snake_case. Though there's no official style for ECMAScript, most style guides (AirBnB, MDN, Google) as well as Node.js APIs and browser APIs use lowerCamelCase, and so do we.
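The renaming rule can be sketched in a couple of lines. The `toLowerCamelCase` helper below is hypothetical, for illustration only; the actual conversion is performed inside the Protobuf-ES code generator and may differ on edge cases:

```typescript
// Convert a snake_case Protobuf field name to the lowerCamelCase
// property name used in generated code. Illustration only.
function toLowerCamelCase(fieldName: string): string {
  // Replace each underscore followed by a character with the
  // uppercased character: first_name -> firstName.
  return fieldName.replace(/_([a-z0-9])/g, (_, c: string) => c.toUpperCase());
}

toLowerCamelCase("first_name"); // "firstName"
toLowerCamelCase("active"); // "active" (already lowerCamelCase)
```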
For the following Protobuf definition:
```proto
enum PhoneType {
  UNSPECIFIED = 0;
  MOBILE = 1;
  LAND_LINE = 2;
}
```

Protobuf-ES generates the following TypeScript enum:

```ts
/**
 * @generated from enum example.PhoneType
 */
export enum PhoneType {
  UNSPECIFIED = 0,
  MOBILE = 1,
  LAND_LINE = 2,
}
```
If all enum values share a prefix that corresponds with the enum's name, the prefix is dropped from all enum value names. For example, given the following enum declaration:
```proto
enum PhoneType {
  PHONE_TYPE_UNSPECIFIED = 0;
  PHONE_TYPE_MOBILE = 1;
  PHONE_TYPE_LAND_LINE = 2;
}
```

Protobuf-ES generates the following TypeScript enum:

```ts
/**
 * @generated from enum example.PhoneType
 */
export enum PhoneType {
  UNSPECIFIED = 0,
  MOBILE = 1,
  LAND_LINE = 2,
}
```
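The prefix-dropping rule can be sketched as follows. The `dropEnumPrefix` helper is hypothetical, for illustration; the generator's actual logic also handles edge cases such as value names that would become invalid identifiers:

```typescript
// Drop a shared prefix derived from the enum name from a value name,
// e.g. PhoneType + PHONE_TYPE_MOBILE -> MOBILE. Illustration only.
function dropEnumPrefix(enumName: string, valueName: string): string {
  // Derive the expected prefix: PhoneType -> PHONE_TYPE_
  const prefix =
    enumName.replace(/([a-z0-9])([A-Z])/g, "$1_$2").toUpperCase() + "_";
  return valueName.startsWith(prefix)
    ? valueName.slice(prefix.length)
    : valueName;
}

dropEnumPrefix("PhoneType", "PHONE_TYPE_MOBILE"); // "MOBILE"
dropEnumPrefix("PhoneType", "MOBILE"); // "MOBILE" (no prefix to drop)
```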
A quick refresher about TypeScript enums:

- You can convert an enum value to a string:

  ```ts
  let val: PhoneType = PhoneType.MOBILE;
  let name = PhoneType[val]; // => "MOBILE"
  ```

- You can convert a string to an enum value:

  ```ts
  let val: PhoneType = PhoneType["MOBILE"];
  ```

- TypeScript enums support aliases, as does Protobuf with the `allow_alias` option.
Along with the TypeScript enum, we also generate its schema:
```ts
/**
 * Describes the enum example.PhoneType.
 */
export declare const PhoneTypeSchema: GenEnum<PhoneType>;
```
To learn more about the schema, take a look at the section about reflection.
An extension is a field defined outside of its container message. For example, we can add the field `age` to the message `User`:

```proto
syntax = "proto2";

message User {
  extensions 100 to 200;
}

// The extension can also be defined in a separate file
extend User {
  optional uint32 age = 100;
}
```
Given that extension, Protobuf-ES generates the export:
```ts
/**
 * @generated from extension: optional uint32 age = 100;
 */
export declare const age: GenExtension<User, number>;
```
You can set the `age` extension field like this:

```ts
import { create, setExtension } from "@bufbuild/protobuf";
import { UserSchema, age } from "./example_pb.js";

const user = create(UserSchema);
setExtension(user, age, 77);
```
If the message already has a value for the `age` extension, the value is replaced.

You can remove an extension from a message with the function `clearExtension`. To retrieve an extension value, use `getExtension`. To check whether an extension is set, use `hasExtension`.

```ts
import {
  setExtension,
  getExtension,
  hasExtension,
  clearExtension,
} from "@bufbuild/protobuf";

setExtension(user, age, 77);
hasExtension(user, age); // true
getExtension(user, age); // 77

clearExtension(user, age);
hasExtension(user, age); // false
```
Note that `getExtension` never returns `undefined`. If the extension isn't set, `hasExtension` returns `false`, but `getExtension` returns the default value, for example:

- `0` for numeric types
- `[]` for repeated fields
- an empty message instance for message fields
Extensions are stored as unknown fields on a message. If you retrieve an extension value, it's deserialized from the binary unknown field data. To mutate a value, make sure to store the new value with `setExtension` after mutating. For example, let's say you have the extension field `repeated string hobbies = 101`, and want to add values:

```ts
import { getExtension, setExtension } from "@bufbuild/protobuf";
import { type User, hobbies } from "./example_pb.js";

declare let user: User;

const h = getExtension(user, hobbies);
h.push("Baking");
h.push("Pottery");
setExtension(user, hobbies, h);
```
Note
In proto3, extensions can only be used for custom options.
Tip
To use extensions with the JSON format, you need to provide them in the serialization options.
In Protobuf, you can define a service for Remote Procedure Calls (RPCs):
```proto
service UserService {
  rpc CreateUser(CreateUserRequest) returns (CreateUserResponse);
}
```
For every service, Protobuf-ES generates just the schema that describes the service and its methods:
```ts
/**
 * @generated from service example.UserService
 */
export declare const UserService: GenService<{
  /**
   * @generated from rpc example.UserService.CreateUser
   */
  createUser: {
    methodKind: "unary";
    input: typeof CreateUserRequestSchema;
    output: typeof CreateUserResponseSchema;
  };
}>;
```
Protobuf-ES doesn't implement RPC itself, but other projects can use this typed schema. See Connect-ES for a project that does.
Some names that are valid in Protobuf can't be used in ECMAScript, either because they are reserved keywords like `catch`, or because they would clash with built-in properties like `constructor`. @bufbuild/protoc-gen-es escapes reserved names by adding the suffix `$`.
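The escaping rule is mechanical and can be sketched like this. Both the helper and the short keyword list below are hypothetical, for illustration only; the generator maintains the full set of reserved names:

```typescript
// A small excerpt of names that can't be used as-is in generated
// ECMAScript code. The real generator knows the complete set.
const reserved = new Set(["catch", "constructor", "new", "delete"]);

// Escape a reserved name by appending "$", as described above.
function escapeName(name: string): string {
  return reserved.has(name) ? name + "$" : name;
}

escapeName("catch"); // "catch$"
escapeName("firstName"); // "firstName" (not reserved, unchanged)
```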
Message and enum declarations can be nested in a message. For example:
```proto
syntax = "proto3";
package example;

message User {
  string first_name = 1;
  Type type = 7;
  enum Type {
    USER = 0;
    MANAGER = 1;
  }
}
```
Similar to Protobuf in Go, we join the name of a nested type with its parents' names, separated with an underscore. In generated code, the enum `User.Type` has the identifier `User_Type`.
We believe that comments in Protobuf source files are important, and take great care to carry them over to the generated code as JSDoc comments. That includes license headers in your file, but also comments on messages, fields, services, and methods.
If you deprecate a Protobuf element, we add a JSDoc tag to the generated element:
```ts
/**
 * This field is deprecated
 *
 * @generated from field: string deprecated_field = 1 [deprecated = true];
 * @deprecated
 */
deprecatedField = "";
```
Protobuf files can specify a package for languages that support namespaces, like Java. ECMAScript does not have an equivalent, so Protobuf packages are largely ignored, but are supported in descriptors and type names.
Protobuf has a small standard library of well-known types. @bufbuild/protobuf provides all of them as pre-compiled exports. If you import a well-known type in a Protobuf file, the generated code simply imports from `@bufbuild/protobuf/wkt`.
For some of the well-known types, we provide additional features for convenience:
A `Timestamp` represents a point in time with nanosecond precision. It's independent of any time zone or local calendar. For convenience, we provide a few functions for conversion:
```ts
import {
  type Timestamp,
  timestampNow,
  timestampFromDate,
  timestampFromMs,
  timestampDate,
  timestampMs,
} from "@bufbuild/protobuf/wkt";

// Create a Timestamp for the current time.
let ts: Timestamp = timestampNow();

// Create a Timestamp message from an ECMAScript Date.
ts = timestampFromDate(new Date(1938, 0, 10));

// Create a Timestamp message from a Unix timestamp in milliseconds.
ts = timestampFromMs(818035920123);

// Convert a Timestamp message to an ECMAScript Date.
let date: Date = timestampDate(ts);

// Convert a Timestamp to a Unix timestamp in milliseconds.
let ms: number = timestampMs(ts);
```
`Any` stores an arbitrary message as binary data. For convenience, we provide functions to pack and unpack messages:
```ts
import { type Any, anyPack, anyIs, anyUnpack } from "@bufbuild/protobuf/wkt";
import { create, createRegistry } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

let user: User = create(UserSchema);

// Create a new Any, and pack the given message into it
let any: Any = anyPack(UserSchema, user);

// Check if the Any contains a specific type
anyIs(any, UserSchema); // true
anyIs(any, "example.User"); // true

// Try to unpack a specific message from an Any
anyUnpack(any, UserSchema); // User | undefined

// Unpack an Any, using a registry of known types
const registry = createRegistry(UserSchema);
anyUnpack(any, registry); // Message | undefined
```
`Struct` stores a dynamic object with the same range of types as a plain object in JSON. When this message is used in a field, it's generated as the type `JsonObject` from @bufbuild/protobuf. For example:

```ts
/**
 * @generated from field: google.protobuf.Struct struct = 1;
 */
struct?: JsonObject;
```
This feature makes it very easy to work with `Struct` fields:

```ts
myMessage.struct = {
  text: "abc",
  number: 123,
};
```
wrappers.proto defines a message for every Protobuf primitive type. The messages are useful for embedding primitives in the `google.protobuf.Any` type, or to distinguish between the absence of a primitive field and its default value.

For convenience, fields that use one of the wrapper messages are generated as "unboxed" optional primitives:

```ts
/**
 * @generated from field: google.protobuf.BoolValue bool_value_field = 1;
 */
boolValueField?: boolean;
```
You create a message with the function `create` and a message schema:

```ts
import { create } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

const user: User = create(UserSchema);
```
For convenience, the function accepts an initializer object. All fields in the initializer object are optional, and if not provided, the default value for the field is used:
```ts
import { create } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

const user: User = create(UserSchema, {
  firstName: "Homer",
  active: true,
  manager: {
    // Manager is also a message. You can pass an initializer object here,
    // and don't need create().
    lastName: "Burns",
  },
});
```
Messages can be serialized with the binary or the JSON format. The conformance test suite ensures interoperability with implementations in other languages.
To serialize a message, call the function `toBinary` with the schema and the message. To parse a message, use `fromBinary`:

```ts
import { toBinary, fromBinary } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

declare let user: User;
const bytes: Uint8Array = toBinary(UserSchema, user);
user = fromBinary(UserSchema, bytes);
```
JSON serialization uses the functions `toJson` and `fromJson`:

```ts
import { toJson, fromJson, type JsonValue } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

declare let user: User;
const json: JsonValue = toJson(UserSchema, user);
user = fromJson(UserSchema, json);
```
`JsonValue` can be serialized to a `string` with `JSON.stringify`. For convenience, we also provide the functions `toJsonString` and `fromJsonString` that include the step.

To parse into an existing message, use the functions `mergeFromBinary` and `mergeFromJson`. To learn about serialization options and other details, see the serialization section.
The function `isMessage` is a type guard to check whether a value is a specific message.

```ts
import { create, isMessage } from "@bufbuild/protobuf";
import { UserSchema } from "./gen/example_pb";

const msg: unknown = create(UserSchema);

msg.firstName; // type error
if (isMessage(msg, UserSchema)) {
  msg.firstName; // string
}
```
Messages include their fully qualified name in the `$typeName` property:

```ts
msg.$typeName; // "example.User"
```
Tip
If you don't know the message's schema, you can look it up in a registry.
In general, all fields in Protobuf are optional. The rationale is that optional fields allow the schema to evolve, and that applications are more robust if they handle missing fields gracefully. To avoid boilerplate, Protobuf implementations typically implement default values.
For example, if you create a new `User` message, the boolean field `active` has the default value `false`. The value `false` is never serialized, so it is important to design the schema accordingly.
Protobuf can distinguish between a default value `false` and a property deliberately set to `false` with explicit field presence. In proto3, you can use the `optional` keyword to enable explicit presence for a field:

```proto
syntax = "proto3";

message Presence {
  // Implicit presence - false is not serialized.
  bool a = 1;

  // Explicit presence - false is serialized.
  optional bool b = 2;
}
```
With Protobuf-ES, you can determine whether a field is present with the function `isFieldSet`. In the following example, the field with implicit presence always ignores `false`, while the field with explicit presence tracks that the field has been set:

```ts
import { create, isFieldSet } from "@bufbuild/protobuf";
import { PresenceSchema } from "./gen/example_pb";

const msg = create(PresenceSchema);
isFieldSet(msg, PresenceSchema.field.a); // false
isFieldSet(msg, PresenceSchema.field.b); // false

msg.a = false;
msg.b = false;
isFieldSet(msg, PresenceSchema.field.a); // false
isFieldSet(msg, PresenceSchema.field.b); // true
```
For repeated fields, `isFieldSet` returns true if the array has one or more elements. For map fields, `isFieldSet` returns true if the object has one or more entries. `clearField` resets a field.
Important
Protobuf-ES uses the prototype chain to track explicit presence for fields with default values.
- With proto3, your messages will always be plain objects without a custom prototype.
- With proto2, your messages will always use a custom prototype for default values.
- With editions, your messages will use a custom prototype, unless all scalar and enum fields use `features.field_presence=IMPLICIT`.
Use the function `equals` to check whether two messages of the same schema have the same field values:

```ts
import { equals } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

declare const a: User;
declare const b: User;
equals(UserSchema, a, b); // boolean
```
Note
NaN does not equal NaN.
Important
Extensions and unknown fields are disregarded by `equals()`.
Use the function `clone` to create a deep copy of a message:

```ts
import { clone } from "@bufbuild/protobuf";
import { type User, UserSchema } from "./gen/example_pb";

declare const a: User;
const b = clone(UserSchema, a);
```
Note
`clone()` clones extensions and unknown fields.
As a general guide when deciding between the binary format and JSON, the JSON format is great for debugging, but the binary format is more resilient to changes. For example, you can rename a field and still parse binary data serialized with the previous version. In general, the binary format is also more performant than JSON.
Options for `toBinary`:

- `writeUnknownFields?: boolean`: Controls whether to include unknown fields in the serialized output. The default behavior is to retain unknown fields and include them in the serialized output.

Options for `fromBinary`:

- `readUnknownFields?: boolean`: Controls whether to retain unknown fields during parsing. The default behavior is to retain unknown fields and include them in the serialized output.
Options for `fromJson` and `fromJsonString`:

- `ignoreUnknownFields?: boolean`: By default, unknown properties are rejected. This option overrides that behavior and ignores unknown properties, as well as unrecognized enum string representations.
- `registry?: Registry`: A registry to use for parsing `google.protobuf.Any` and extensions from JSON.
Options for `toJson` and `toJsonString`:

- `alwaysEmitImplicit?: boolean`: By default, fields with implicit presence are not serialized if they are unset. For example, an empty list field or a proto3 int32 field with `0` is not serialized. With this option enabled, such fields are included in the output.
- `enumAsInteger?: boolean`: The name of an enum value is used by default in JSON output. This option overrides the behavior to use the numeric value of the enum value instead.
- `useProtoFieldName?: boolean`: Field names are converted to lowerCamelCase by default in JSON output. This option overrides the behavior to use the proto field name instead.
- `registry?: Registry`: A registry to use for converting extensions and `google.protobuf.Any` to JSON.
- `prettySpaces?: number`: Only available with `toJsonString`. A convenience property for the `space` parameter to `JSON.stringify`.
When binary message data is parsed, unrecognized fields are stored on the message as unknown fields in the property `$unknown?: UnknownField[]`. When the message is serialized, unknown fields are included, preserving them. This default behavior can be modified with the binary serialization options `readUnknownFields` and `writeUnknownFields`.

Extension values are also stored as unknown fields.
At a low level, the Protobuf binary serialization is implemented with the classes `BinaryReader` and `BinaryWriter`. They implement the primitives of the Protobuf binary encoding. Both classes are part of the public API and can be used on their own. The following example uses `BinaryWriter` to serialize data for our example message:
```ts
import { BinaryWriter, WireType } from "@bufbuild/protobuf/wire";
import { fromBinary } from "@bufbuild/protobuf";
import { UserSchema } from "./gen/example_pb";

const bytes = new BinaryWriter()
  // string first_name = 1
  .tag(1, WireType.LengthDelimited)
  .string("Homer")
  // bool active = 3
  .tag(3, WireType.Varint)
  .bool(true)
  .finish();

const user = fromBinary(UserSchema, bytes);
user.firstName; // "Homer"
user.active; // true
```
We require the WHATWG Text Encoding API to convert UTF-8 from and to binary. The API is widely available, but it is not part of the ECMAScript standard. If the API is unavailable in your runtime, use the function `configureTextEncoding` from `@bufbuild/protobuf/wire` to provide your own implementation. Note that the function must be called early in the initialization.
Base64 encoding can be very useful when transmitting binary data. There is no convenient standard API in ECMAScript, but we export two functions for encoding and decoding:
```ts
import { base64Encode, base64Decode } from "@bufbuild/protobuf/wire";

base64Encode(new Uint8Array([2, 4, 8, 16])); // "AgQIEA=="
base64Decode("AgQIEA=="); // Uint8Array(4) [ 2, 4, 8, 16 ]
```
Protobuf-ES supports the size-delimited format for messages. It lets you serialize multiple messages to a stream and parse multiple messages from a stream. A size-delimited message is a varint size in bytes, followed by exactly that many bytes of a message serialized with the binary format. This implementation is compatible with its counterparts in C++, Java, Go, and other languages.
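The framing itself is simple: a base-128 varint length prefix followed by exactly that many message bytes. The varint part can be sketched in a few self-contained lines (illustration only; use `sizeDelimitedEncode` and `sizeDelimitedDecodeStream` in real code):

```typescript
// Encode a non-negative integer as a base-128 varint, the length prefix
// used by the size-delimited format. Illustration only.
function encodeVarint(value: number): Uint8Array {
  const bytes: number[] = [];
  while (value > 0x7f) {
    bytes.push((value & 0x7f) | 0x80); // lower 7 bits, continuation bit set
    value >>>= 7;
  }
  bytes.push(value); // final byte, continuation bit clear
  return new Uint8Array(bytes);
}

function decodeVarint(bytes: Uint8Array): number {
  let value = 0;
  let shift = 0;
  for (const b of bytes) {
    value |= (b & 0x7f) << shift; // accumulate 7 bits per byte
    if ((b & 0x80) === 0) break; // continuation bit clear: done
    shift += 7;
  }
  return value;
}

// A 300-byte message would be framed with the 2-byte prefix [0xac, 0x02].
encodeVarint(300); // Uint8Array [ 172, 2 ]
decodeVarint(new Uint8Array([0xac, 0x02])); // 300
```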
Serialize size-delimited messages with `sizeDelimitedEncode`:

```ts
import { sizeDelimitedEncode } from "@bufbuild/protobuf/wire";
import { type User, UserSchema } from "./gen/example_pb";
import { createWriteStream } from "node:fs";

declare const user: User;

const stream = createWriteStream("delim.bin", { encoding: "binary" });
stream.write(sizeDelimitedEncode(UserSchema, user));
stream.end();
```
You can parse size-delimited messages with `sizeDelimitedDecodeStream`. The function expects an `AsyncIterable<Uint8Array>`, so it works with Node.js out of the box and can be easily adapted to other stream APIs:

```ts
import { sizeDelimitedDecodeStream } from "@bufbuild/protobuf/wire";
import { UserSchema } from "./gen/example_pb";
import { createReadStream } from "node:fs";

const stream = createReadStream("delim.bin");
for await (const user of sizeDelimitedDecodeStream(UserSchema, stream)) {
  console.log(user);
}
```
This is an advanced feature that's enabled with the plugin option `json_types=true`. When it's enabled, @bufbuild/protoc-gen-es generates a JSON type for every Protobuf message.
Given this definition:
syntax = "proto3";
message Example {
int32 amount = 1;
bytes data = 2;
}
the following additional export is generated:
/**
* @generated from message Example
*/
export type ExampleJson = {
/**
* @generated from field: int32 amount = 1;
*/
amount?: number;
/**
* @generated from field: bytes data = 2;
*/
data?: string;
};
The JSON type matches exactly what `toJson()` emits with standard serialization options, and `toJson()` automatically returns the JSON type if available:
const example = create(ExampleSchema, { amount: 123 });
const json: ExampleJson = toJson(ExampleSchema, example);
// Without json_types=true, the following would be a type error:
json.amount; // number | undefined
json.data; // string | undefined
For enumerations, a similar mechanism applies. We generate a union type with all JSON string values for the enum:
syntax = "proto3";
enum Format {
FORMAT_UNSPECIFIED = 0;
FORMAT_BINARY = 1;
FORMAT_JSON = 2;
}
/**
* @generated from enum Format
*/
export type FormatJson = "FORMAT_UNSPECIFIED" | "FORMAT_BINARY" | "FORMAT_JSON";
With the `enumToJson()` and `enumFromJson()` functions, values can be converted between both representations. With `isEnumJson()`, unknown input can be narrowed to known values. If JSON types are available, the functions are type-safe:
const strVal: FormatJson = enumToJson(FormatSchema, Format.BINARY);
const enumVal: Format = enumFromJson(FormatSchema, strVal);
const someString: string = "FORMAT_BINARY";
if (isEnumJson(FormatSchema, someString)) {
someString; // FormatJson
}
Descriptors describe Protobuf definitions. Every Protobuf compiler parses source files into descriptors, which are Protobuf messages themselves. They are a core feature of Protobuf and of Protobuf-ES—they provide access to custom options and are used to generate code, serialize messages, and many other tasks.
Similar to several other Protobuf implementations, Protobuf-ES provides wrapper types for the Protobuf descriptor messages. When we refer to descriptors, we usually mean the wrapped types rather than the low-level Protobuf messages. Our descriptor types are easy to identify: their names always start with `Desc`.
| Type | Wraps descriptor message | Purpose |
|---|---|---|
| `DescFile` | `google.protobuf.FileDescriptorProto` | The root of the hierarchy. Describes a single source file, and contains references to all top-level types defined in it: messages, enums, extensions, and services. |
| `DescMessage` | `google.protobuf.DescriptorProto` | Describes a message. May also contain references to other nested types: messages, enums, and extensions defined inside another message. |
| `DescField` | `google.protobuf.FieldDescriptorProto` | Describes a field defined in a message. |
| `DescOneof` | `google.protobuf.OneofDescriptorProto` | Describes a oneof defined in a message. |
| `DescEnum` | `google.protobuf.EnumDescriptorProto` | Describes an enum, and contains its enum values. |
| `DescEnumValue` | `google.protobuf.EnumValueDescriptorProto` | Describes an enum value. |
| `DescService` | `google.protobuf.ServiceDescriptorProto` | Describes a service, and contains its methods. |
| `DescMethod` | `google.protobuf.MethodDescriptorProto` | Describes a method, also called an "RPC". |
| `DescExtension` | `google.protobuf.FieldDescriptorProto` | Describes an extension, a special kind of field defined outside of its container message. |
Descriptors form a hierarchy with a file at the root:
─ DescFile
│
├─ messages: DescMessage
│ ├─ fields: DescField[]
│ ├─ oneofs: DescOneof[]
│ ├─ nestedMessages: DescMessage[]
│ │ └─ (...more...)
│ ├─ nestedExtensions: DescExtension[]
│ └─ nestedEnums: DescEnum[]
│ └─ values: DescEnumValue[]
│
├─ enums: DescEnum[]
│ └─ values: DescEnumValue[]
│
├─ extensions: DescExtension[]
│
└─ services: DescService[]
└─ methods: DescMethod[]
To convert descriptor messages to the more convenient wrapper types, you can use a registry. You can also access descriptors from generated code—the schemas generated by @bufbuild/protoc-gen-es are these descriptors, just with some additional type information attached.
Tip
You can find a deep dive into the model in Buf's reference about descriptors.
You can fetch descriptors from the Buf Schema Registry. In tests, you can use @bufbuild/protocompile to compile inline Protobuf source to a descriptor.
The `example.proto` file is described by the export `file_example`. It's a `DescFile`, and we can easily walk through its elements:
import { file_example as file } from "./gen/example_pb";
// Loop through all messages defined at the root
for (const message of file.messages) {
message; // DescMessage
message.typeName; // The fully qualified name, e.g. "example.User"
// Loop through all fields for this message
for (const field of message.fields) {
field; // DescField
}
// Messages, enumerations, and extensions can be nested in a message definition
message.nestedMessages; // DescMessage[]
message.nestedEnums; // DescEnum[]
message.nestedExtensions; // DescExtension[]
}
// Loop through all enumerations defined at the root
for (const enumeration of file.enums) {
enumeration; // DescEnum
enumeration.typeName; // The fully qualified name, e.g. "example.PhoneType"
// Loop through all values of this enumeration
for (const value of enumeration.values) {
value; // DescEnumValue
value.name; // The name as specified in the source, e.g. "PHONE_TYPE_MOBILE"
}
}
// Loop through all services
for (const service of file.services) {
service; // DescService
service.typeName; // The fully qualified name, e.g. "example.UserService"
// Loop through all methods of this service
for (const method of service.methods) {
method; // DescMethod
method.name; // The name as specified in the source, e.g. "CreateUser"
}
}
// Loop through all extensions defined at the root
for (const extension of file.extensions) {
extension; // DescExtension
extension.typeName; // The fully qualified name, e.g. "example.sensitive"
}
Messages, enumerations, and extensions can be defined at the root, but they can also be nested in a message definition.
With the `nestedTypes()` utility, you can iterate through all nested types recursively with a single loop. The function accepts a `DescFile` or a `DescMessage`.
import { nestedTypes } from "@bufbuild/protobuf/reflect";
import { file_example as file } from "./gen/example_pb";
for (const type of nestedTypes(file)) {
type; // DescMessage | DescEnum | DescExtension | DescService
type.kind; // "message" | "enum" | "extension" | "service"
}
The schemas generated by @bufbuild/protoc-gen-es have some additional type information attached and allow for a type-safe lookup for some elements:
import { UserSchema, PhoneTypeSchema, PhoneType, UserService } from "./gen/example_pb";
// Look up fields by their localName
UserSchema.field.firstName; // DescField
UserSchema.field.firstName.name; // "first_name"
// Look up enum values by their number
PhoneTypeSchema.value[PhoneType.MOBILE]; // DescEnumValue
PhoneTypeSchema.value[PhoneType.MOBILE].name; // "PHONE_TYPE_MOBILE"
// Look up methods by their localName
UserService.method.createUser; // DescMethod
UserService.method.createUser.name; // "CreateUser"
To walk through the fields of a message, there are several options depending on how you prefer to handle fields in a `oneof` group.
For example, let's use the following message:
syntax = "proto3";
message Question {
string text = 1;
oneof result {
int32 number = 2;
string error = 3;
}
}
You can use `DescMessage.fields` to list all fields, including fields in a `oneof`:
import type { DescMessage } from "@bufbuild/protobuf";
function walkFields(message: DescMessage) {
for (const field of message.fields) {
console.log(field.name); // prints "text", "number", "error"
field.oneof; // DescOneof | undefined
}
}
You can use `DescMessage.oneofs` to list oneof groups and descend into their fields:
import type { DescMessage } from "@bufbuild/protobuf";
function walkOneofs(message: DescMessage) {
for (const oneof of message.oneofs) {
console.log(oneof.name); // prints "result"
for (const field of oneof.fields) {
console.log(field.name); // prints "number", "error"
}
}
}
You can use `DescMessage.members` to list both regular fields and oneof groups:
import type { DescMessage } from "@bufbuild/protobuf";
function walkMembers(message: DescMessage) {
for (const member of message.members) {
console.log(member.name); // prints "text", "result"
if (member.kind == "oneof") {
for (const field of member.fields) {
console.log(field.name); // prints "number", "error"
}
}
}
}
Protobuf has scalar, enum, map, repeated, and message fields. They are all represented by the type `DescField`, and share the following properties:

- `name`: The name as specified in the source, for example `"first_name"`.
- `number`: The field number as specified in the source.
- `localName`: A safe and idiomatic name for ECMAScript, for example `"firstName"`.

Depending on the field type, `DescField` provides more details, for example the descriptor for the message of a message field. To discriminate between the different kinds of fields, use the property `fieldKind`, which can be one of `"scalar"`, `"enum"`, `"message"`, `"list"`, or `"map"`.
The following example exhaustively inspects all field kinds:
import type { DescField } from "@bufbuild/protobuf";
function handleField(field: DescField) {
field.scalar; // ScalarType | undefined
field.message; // DescMessage | undefined
field.enum; // DescEnum | undefined
switch (field.fieldKind) {
case "scalar":
field.scalar; // ScalarType.STRING for a Protobuf field string first_name = 1
break;
case "enum":
field.enum; // DescEnum
break;
case "message":
field.message; // DescMessage
break;
case "list":
field.listKind; // "scalar" | "message" | "enum"
switch (field.listKind) {
case "scalar":
field.scalar; // ScalarType.INT32 for the values in `repeated int32 numbers = 2`
break;
case "message":
field.message; // DescMessage for the values in `repeated User users = 2`
break;
case "enum":
field.enum; // DescEnum for the values in `repeated PhoneType types = 2`
break;
}
break;
case "map":
field.mapKey; // ScalarType.STRING for the keys in `map<string, int32> map = 2`
switch (field.mapKind) {
case "scalar":
field.scalar; // ScalarType.INT32 for the values in `map<string, int32> map = 2`
break;
case "message":
field.message; // DescMessage for the values in `map<string, User> map = 2`
break;
case "enum":
field.enum; // DescEnum for the values in `map<string, PhoneType> map = 2`
break;
}
break;
}
}
Tip
The `fieldKind` and related properties are also available on extension descriptors, `DescExtension`.
Registries are collections of descriptors that enable you to look up a type by its qualified name. When serializing or parsing extensions or `google.protobuf.Any` from JSON, registries are used to look up types.
`Registry` is a set of descriptors for messages, enumerations, extensions, and services:
import type { Registry } from "@bufbuild/protobuf";
declare const registry: Registry;
// Retrieve a type by its qualified name
registry.getMessage("example.User"); // DescMessage | undefined
registry.getEnum("example.PhoneType"); // DescEnum | undefined
registry.getService("example.MyService"); // DescService | undefined
registry.getExtension("example.sensitive"); // DescExtension | undefined
// Loop through types
for (const type of registry) {
type.kind; // "message" | "enum" | "extension" | "service"
}
Registries can be composed with the `createRegistry` function:
import { createRegistry, type Registry } from "@bufbuild/protobuf";
import { UserSchema, file_example } from "./gen/example_pb";
declare const otherRegistry: Registry;
const registry = createRegistry(
  UserSchema, // Initialize with a message, enum, extension, or service descriptor
  file_example, // Add all types from the file descriptor
  otherRegistry, // Add all types from the other registry
);
Mutable registries allow you to add descriptors after creation:
import { createMutableRegistry } from "@bufbuild/protobuf";
import { UserSchema } from "./gen/example_pb";
const registry = createMutableRegistry();
// Adds a message, enum, extension, or service descriptor to the registry
registry.add(UserSchema);
// Removes a descriptor
registry.remove(UserSchema);
File registries provide access to file descriptors and can be created from a `google.protobuf.FileDescriptorSet` from the well-known types. The following command compiles all Protobuf files in the `proto` directory:
buf build proto --output set.binpb
You can read the data and create a file registry with just two steps:
import { readFileSync } from "node:fs";
import {
fromBinary,
createFileRegistry,
type DescMessage,
} from "@bufbuild/protobuf";
import { FileDescriptorSetSchema } from "@bufbuild/protobuf/wkt";
// Read a google.protobuf.FileDescriptorSet from disk.
// The set can be compiled with `buf build --output set.binpb`
const fileDescriptorSet = fromBinary(
FileDescriptorSetSchema,
readFileSync("set.binpb"),
);
// Create a FileRegistry from the google.protobuf.FileDescriptorSet message:
const registry = createFileRegistry(fileDescriptorSet);
// Loop through files
for (const file of registry.files) {
file.name;
}
// Loop through types
for (const type of registry) {
type.kind; // "message" | "enum" | "extension" | "service"
}
With custom options, you can annotate elements in a Protobuf file with arbitrary information.
Custom options are extensions to the `google.protobuf.*Options` messages defined in google/protobuf/descriptor.proto. Let's define an option to mark sensitive fields. Create a `proto/options-example.proto` file:
syntax = "proto3";
package example.options;
import "google/protobuf/descriptor.proto";
extend google.protobuf.FieldOptions {
// This field should be redacted
bool sensitive = 8765;
}
To use this option, edit `example.proto`:
syntax = "proto3";
package example;
message User {
string first_name = 1;
- string last_name = 2;
+ string last_name = 2 [(example.options.sensitive) = true];
bool active = 3;
User manager = 4;
repeated string locations = 5;
map<string, string> projects = 6;
}
When the compiler parses this file, it sets the custom option value on the `options` field of the `google.protobuf.FieldDescriptorProto` for the field `last_name`.
After re-generating code with `buf generate`, you can read the field option with the function `getOption`:
import { getOption, hasOption } from "@bufbuild/protobuf";
import { UserSchema } from "./gen/example_pb";
import { sensitive } from "./gen/options-example_pb";
getOption(UserSchema.field.lastName, sensitive); // true
The companion function `hasOption` returns true if an option is present. The functions behave the same as `hasExtension` and `getExtension`, but accept any descriptor.
Tip
Custom options can be read from generated code, from the schema passed to a plugin, or from any other descriptor.
To learn more about custom options in Protobuf, see the language guide.
Reflection allows you to inspect a schema and dynamically manipulate data. This section explains the core primitives and shows how to write a function to redact sensitive information from messages.
The reflection API provides a simple interface to access and manipulate messages without knowing their type. As an example, let's write a simple function to redact sensitive fields from a message, using the custom option we created above:
import { getOption, type Message, type DescMessage } from "@bufbuild/protobuf";
import { reflect } from "@bufbuild/protobuf/reflect";
import { sensitive } from "./gen/options-example_pb";
export function redact(schema: DescMessage, message: Message) {
const r = reflect(schema, message);
for (const field of r.fields) {
if (getOption(field, sensitive)) {
// This field has the option (example.options.sensitive) = true
r.clear(field);
}
}
}
We can use this function to redact any message:
import { create } from "@bufbuild/protobuf";
import { redact } from "./redact";
import { UserSchema } from "./gen/example_pb";
const msg = create(UserSchema, {
firstName: "Homer",
// This field has the option (example.options.sensitive) = true
lastName: "Simpson",
});
msg.lastName; // "Simpson"
redact(UserSchema, msg);
msg.lastName; // ""
There's one gotcha with our `redact` function: it doesn't ensure that the schema and message match. We can solve this with type inference and constraints:
import type { DescMessage, MessageShape } from "@bufbuild/protobuf";
export function redact<Desc extends DescMessage>(
schema: Desc,
message: MessageShape<Desc>,
) {
// ...
}
Tip
- `EnumShape` extracts the enum type from an enum descriptor.
- `MessageShape` extracts the type from a message descriptor.
- `MessageInitShape` extracts the init type from a message descriptor, i.e. the initializer object for `create()`.
The function `reflect` returns a `ReflectMessage`. Its most important methods are:
`isSet(field: DescField): boolean`
Returns true if the field is set, exactly like the function `isFieldSet` (see Field presence and default values).

`clear(field: DescField): void`
Resets the field so that `isSet()` returns false.
`get<Field extends DescField>(field: Field): ReflectMessageGet<Field>`
Returns the field value, but in a form that's most suitable for reflection:

- Scalar fields: Returns the value, but converts 64-bit integer fields with the option `jstype=JS_STRING` to a `bigint` value. If the field is not set, the default value is returned. If no default value is set, the zero value is returned.
- Enum fields: Returns the numeric value. If the field is not set, the default value is returned. If no default value is set, the zero value is returned.
- Message fields: Returns a `ReflectMessage`. If the field is not set, a new message is returned, but not set on the field.
- List fields: Returns a `ReflectList` object.
- Map fields: Returns a `ReflectMap` object.
Note
`get()` never returns `undefined`. To determine whether a field is set, use `isSet()`.
Tip
If you use a switch statement on `DescField.fieldKind`, the return type of `ReflectMessage.get` is narrowed down. In case that's insufficient, the guard functions `isReflectMessage`, `isReflectList`, and `isReflectMap` from `@bufbuild/protobuf/reflect` can help.
`set<Field extends DescField>(field: Field, value: unknown): void`
Sets a field value, expecting values in the same form that `get()` returns. This method throws an error with helpful information if the value is invalid for the field.
Note
`undefined` is not a valid value. To reset a field, use `clear()`.
Repeated fields are represented with `ReflectList`. Similar to `ReflectMessage`, it provides values in a form most suitable for reflection:

- Scalar 64-bit integer fields with the option `jstype=JS_STRING` are converted to `bigint`.
- Messages are wrapped in a `ReflectMessage`.
import type { DescField } from "@bufbuild/protobuf";
import type { ReflectList, ReflectMessage } from "@bufbuild/protobuf/reflect";
declare const field: DescField;
declare const message: ReflectMessage;
if (field.fieldKind == "list") {
const list: ReflectList = message.get(field);
for (const item of list) {
// ReflectList is iterable
}
list.get(123); // can be undefined
list.add(123); // throws an error for invalid values
}
Map fields are represented with `ReflectMap`. Similar to `ReflectMessage`, it provides values in a form most suitable for reflection:

- A map field is a record object on a message, where keys are always strings. `ReflectMap` converts keys to their closest possible type in TypeScript.
- Messages are wrapped in a `ReflectMessage`.
import type { DescField } from "@bufbuild/protobuf";
import type { ReflectMap, ReflectMessage } from "@bufbuild/protobuf/reflect";
declare const field: DescField;
declare const message: ReflectMessage;
if (field.fieldKind == "map") {
const map: ReflectMap = message.get(field);
for (const [key, value] of map) {
// ReflectMap is iterable
}
map.has(123); // boolean
map.get(123); // can be undefined
map.set(123, "abc"); // throws an error for invalid keys or values
}
Code generator plugins are a unique feature of Protobuf compilers like `protoc` and the Buf CLI. Using a plugin, you can generate files from a Protobuf schema as input. Plugins can generate outputs like RPC clients and server stubs, mappings from Protobuf to SQL, validation code, and pretty much anything else you can think of.
Plugins are implemented as simple executables named `protoc-gen-x`, where `x` is the name of the language or feature that the plugin provides. They receive the schema via stdin and return generated files via stdout.
Tip
The plugin contract is defined in google/protobuf/compiler/plugin.proto.
First, make sure you've completed all steps from the quickstart. You'll add two new dependencies:
- @bufbuild/protoplugin: The plugin framework for Protobuf-ES
- tsx: A command to run TypeScript in Node.js
npm install @bufbuild/protoplugin tsx
Create a new file named `src/protoc-gen-hello.ts`:
import {
createEcmaScriptPlugin,
runNodeJs,
type Schema,
} from "@bufbuild/protoplugin";
const plugin = createEcmaScriptPlugin({
name: "protoc-gen-hello",
version: "v1",
generateTs(schema: Schema) {
// Loop through all Protobuf files in the schema
for (const file of schema.files) {
// Generate a file for each Protobuf file
const f = schema.generateFile(file.name + "_hello.ts");
// Print text to the file
f.print("// hello world");
}
},
});
// Reads the schema from stdin, runs the plugin, and writes the generated files to stdout.
runNodeJs(plugin);
Run the plugin with `tsx` to see it working. It prints its name and version:
npx tsx src/protoc-gen-hello.ts --version
To feed a schema to the plugin, add it to your `buf.gen.yaml` file:
# Learn more: https://buf.build/docs/configuration/v2/buf-gen-yaml
version: v2
inputs:
- directory: proto
plugins:
- local: protoc-gen-es
out: src/gen
opt: target=ts
+ - local: ["tsx", "./src/protoc-gen-hello.ts"]
+ opt: target=ts
+ out: src/gen
Run `npx buf generate` again, and a new file appears with the contents `// hello world`:
.
├── buf.gen.yaml
├── package.json
├── proto
│ └── example.proto
└── src
├── gen
│ ├── example_pb.ts
+ │ └── example_hello.ts
└── protoc-gen-hello.ts
Typically, a plugin loops through the files of the schema and generates a file for each Protobuf file. The property `Schema.files` is an array of file descriptors, which contain descriptors for every message and all other Protobuf elements.
To generate a file, call the `Schema.generateFile` method. It takes a file name as an argument. By convention, you should add a suffix to distinguish files generated by your plugin from files generated by other plugins. For example, we add the suffix `_hello` for our plugin `protoc-gen-hello`:
for (const file of schema.files) {
// file.name is the name of the Protobuf file, minus the .proto extension
schema.generateFile(file.name + "_hello.ts");
}
We recommend adding a preamble to every generated file. It provides helpful information about the plugin, and about the Protobuf file it was generated from:
const f = schema.generateFile(file.name + "_hello.ts");
f.preamble(file);
// Generates the lines:
// @generated by protoc-gen-hello v1 with parameter "target=ts"
// @generated from file example.proto (package example, syntax proto3)
Note
If you don't print anything to a generated file, it won't be generated.
The method `print` adds a line of text to a generated file:
f.print("// hello world");
// Generates the line:
// hello world
If you pass multiple arguments, they are joined:
const world = "world";
f.print("// hello ", world);
// Generates the line:
// hello world
Arguments can also be numbers, booleans, and other types. They are printed as literals:
f.print("const num = ", 123, ";");
// Generates:
// const num = 123;
f.print("const bool = ", true, ";");
// Generates:
// const bool = true;
f.print("const bytes = ", new Uint8Array([0xde, 0xad, 0xbe, 0xef]), ";");
// Generates:
// const bytes = new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF]);
To print string or array literals, wrap them with a function call:
f.print("const str = ", f.string("hello"), ";");
// Generates:
// const str = "hello";
f.print("const arr = ", f.array([1, 2, 3]), ";");
// Generates:
// const arr = [1, 2, 3];
To print documentation comments, take a look at the `jsDoc` method. You can pass it text or a descriptor. For descriptors, it uses any comments on the Protobuf element and adds a helpful `@generated from ...` annotation.
Tip
If you want to generate complex expressions in multiple places, you can move the logic to a function that returns `Printable`. You can pass `Printable` to `GeneratedFile.print`.
Tip
If you prefer, you can use `GeneratedFile.print` with template literals:
const world = "world";
f.print`// hello ${world}`;
// Generates the line:
// hello world
You can generate import statements with a combination of the `import` and `print` methods. For example, importing the `useEffect` hook from React:
const useEffect = f.import("useEffect", "react");
`f.import()` returns an `ImportSymbol` that you can print:
f.print(useEffect, "(() => {");
f.print(" document.title = `You clicked ${count} times`;");
f.print("}, [count]);");
When the `ImportSymbol` is printed, an import statement is automatically generated for you:
import { useEffect } from "react";
useEffect(() => {
document.title = `You clicked ${count} times`;
}, [count]);
Tip
If you need a type-only import, call `toTypeOnly()` on the `ImportSymbol`.
It's common to import from `protoc-gen-es` generated code in other plugins. To make this as easy as possible, the `importSchema` method imports the schema for a descriptor, and the `importShape` method imports its type:
for (const message of file.messages) {
const { create } = f.runtime;
const schema = f.importSchema(message);
const shape = f.importShape(message);
f.print("const msg: ", shape, " = ", create, "(", schema, ");");
}
// Generates:
// import { create } from "@bufbuild/protobuf";
// import type { User } from "./example_pb";
// import { UserSchema } from "./example_pb";
// const msg: User = create(UserSchema);
Tip
`GeneratedFile.runtime` provides common imports from @bufbuild/protobuf.
Importing with the `GeneratedFile` methods has many advantages:
- Conditional imports: Import statements belong at the top of a file, but you usually only find out later in your code whether you need the import, such as in a nested if statement. Conditionally printing the import symbol only generates the import statement when it's actually used.
- Preventing name collisions: For example, if you `import { Foo } from "bar"` and `import { Foo } from "baz"`, `f.import()` automatically renames one of them `Foo$1` to prevent a name collision.
- Import styles: If the plugin option `js_import_style=legacy_commonjs` is set, code is automatically generated with `require()` calls instead of `import` statements.
To export a declaration from your code, use `export`:
f.print(f.export("const", "foo"), " = 123;");
// Generates:
// export const foo = 123;
This method takes two arguments:

- The declaration, for example `const`, `enum`, `abstract class`, or anything else you might need.
- The name of the declaration, which is also used for the export.

The method automatically generates CommonJS exports if the plugin option `js_import_style=legacy_commonjs` is set.
Tip
If you generate exports based on Protobuf names, make sure to escape reserved words with the function `safeIdentifier` from @bufbuild/protoplugin.
The plugin framework recognizes a set of options that can be passed to all plugins (for example `target` and `import_extension`), but if your plugin needs additional parameters, you can provide `parseOptions` to `createEcmaScriptPlugin`:
parseOptions(rawOptions: {key: string, value: string}[]): T;
This function is invoked by the framework, passing in any key/value pairs that it doesn't recognize from its pre-defined list. The returned options are merged with the pre-defined options and passed to the generate functions via the `options` property of the schema.
Tip
Our runnable plugin example shows a custom plugin option, and also custom Protobuf options.
Our hello world plugin only implements `generateTs`, but it can still generate JavaScript and TypeScript declaration files with the plugin option `target=js+dts`. Under the hood, @bufbuild/protoplugin uses the TypeScript compiler to transpile the output from `generateTs` to `.js` and `.d.ts` files as necessary.
Important
Transpiling on the fly works well for many applications, but it comes at the cost of long code generation times for large sets of Protobuf files. To provide the best possible user experience for your plugin, we recommend that you also provide `generateJs` and `generateDts` to `createEcmaScriptPlugin`.
To release a plugin on npmjs.com, you can use the "bin" field of package.json to provide it as an executable. Users who install the package automatically have the executable in their `PATH` when running commands with npm scripts or `npx`.
Tip
To provide the best possible user experience for your plugin, we recommend that you avoid transpilation.
We recommend testing generated code just like handwritten code: identify a representative Protobuf file for your use case, generate code, and then run tests against the generated code. If you implement your own generator functions for the `js` and `dts` targets, we also recommend running all tests against both.
For a runnable example that uses Protocol Buffers to manage a list of users, see packages/protobuf-example. For a custom plugin, see packages/protoplugin-example. It generates Twirp clients for your services, and also uses custom options.
Version 2 provides many new features around reflection, and support for Protobuf editions. To upgrade, you'll need to update your dependencies, re-generate code, and update call sites in your application.
The following npm packages are available with version 2.0.0:
- @bufbuild/protobuf: The runtime library, containing base types, generated well-known types, and core functionality.
- @bufbuild/protoc-gen-es: Provides the code generator plugin `protoc-gen-es`.
- @bufbuild/protoplugin: Framework to create your own code generator plugin.
Update the ones you use with npm:
npm install @bufbuild/protobuf@^2.0.0 @bufbuild/protoc-gen-es@^2.0.0
Make sure to re-generate code with the new plugin, and verify that the package versions match the generated code.
Plugin options now have more convenient default behavior:
- `import_extension` is now `none` by default, which means we don't add a `.js` extension to import paths. If you use the plugin option `import_extension=none`, you can delete it. If you require imports to have the `.js` extension, use `import_extension=js`.
- `ts_nocheck` is now off by default. If you require a `// @ts-nocheck` annotation at the top of generated code, use `ts_nocheck=true`.

Update your `buf.gen.yaml` file if you require the previous behavior.
Are you using the remote plugin?
If you're using the remote plugin instead of the locally installed `protoc-gen-es`, make sure to update the version in your config:
# buf.gen.yaml
version: v2
plugins:
- remote: buf.build/bufbuild/es:v2.0.0
out: gen
Are you using a generated SDK?
If you're using a generated SDK, install `latest` with plugin version v2.0.0.
Next, you'll need to update call sites that construct or serialize messages. For small applications, this will only take a few minutes. For more complex applications, migration can be more involved. If necessary, you can run both versions in parallel to migrate the application piece by piece.
The biggest change is that the generated code no longer uses classes. To create a new instance, you call the function `create()` and pass the generated schema:
- import { User } from "./gen/example_pb";
+ import { create } from "@bufbuild/protobuf";
+ import { UserSchema } from "./gen/example_pb";
- let user = new User({
+ let user = create(UserSchema, {
firstName: "Homer",
});
Methods like `toBinary` and `toJson` are no longer attached to the object. Similar to `create()`, they're simple functions that you call with two arguments: the schema and the message:
import type { User } from "./gen/example_pb";
+ import { UserSchema } from "./gen/example_pb";
+ import { toJsonString } from "@bufbuild/protobuf";
function show(user: User) {
- alert(user.toJsonString());
+ alert(toJsonString(UserSchema, user));
}
Note that messages also no longer implement the special method `toJSON`. Before you pass a message to `JSON.stringify`, convert it to a JSON value first (using `toJson`).
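To see why this matters, here's a self-contained sketch using plain objects rather than real Protobuf-ES types: 64-bit fields surface as `bigint` values in v2, and `JSON.stringify` can't serialize a `bigint` without a `toJSON` hook.

```typescript
// A stand-in for a v2 message with a 64-bit field (bigint value).
const msg = { count: BigInt(123) };

// Without a toJSON() hook, JSON.stringify throws on bigint values.
let failed = false;
try {
  JSON.stringify(msg);
} catch {
  failed = true;
}
console.log(failed); // true

// Convert to a JSON-safe value first (toJson does this for real messages).
const jsonValue = { count: msg.count.toString() };
console.log(JSON.stringify(jsonValue)); // {"count":"123"}
```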
The generated properties remain largely unchanged. There are two improvements:
- A message field using `google.protobuf.Struct` is generated as `JsonObject`.
- Proto2 fields now support default values and are no longer generated as optional properties.
The `toPlainMessage` function and the `PlainMessage<T>` type are no longer necessary. If you create a proto3 message with `create(UserSchema)`, the returned object is already a plain object. You can replace the `PlainMessage<User>` type with `User`. The only difference is that `User` has a property `$typeName`, which is a simple string with the full name of the message, like `"example.User"`. This property makes sure you don't pass the wrong message to a function by accident.
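The discriminating effect of `$typeName` can be illustrated with hypothetical hand-written types (not the generated ones): a string literal property keeps two structurally identical message types from being mixed up at compile time.

```typescript
// Hypothetical types mimicking the shape of generated v2 messages.
type User = { $typeName: "example.User"; firstName: string };
type Admin = { $typeName: "example.Admin"; firstName: string };

function greet(user: User): string {
  return `Hello, ${user.firstName}`;
}

const homer: User = { $typeName: "example.User", firstName: "Homer" };
console.log(greet(homer)); // Hello, Homer

// Passing an Admin to greet() fails to compile, even though the two
// types are otherwise structurally identical.
```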
Well-known types have moved to the subpath export `@bufbuild/protobuf/wkt`.
Protobuf-ES is intended to be a solid, modern alternative to existing Protobuf implementations for the JavaScript ecosystem. It's the first project in this space to provide a comprehensive plugin framework and decouple the base types from RPC functionality.
Some additional features that set it apart from the others:
- ECMAScript module support
- First-class TypeScript support
- Generation of idiomatic JavaScript and TypeScript code
- Generation of much smaller bundles
- Implementation of all proto3 features, including the canonical JSON format
- Implementation of all proto2 features except for the text format
- Support for Editions
- Usage of standard JavaScript APIs instead of the Closure Library
- Compatibility is covered by the Protocol Buffers conformance tests
- Descriptor and reflection support
TypeScript's `enum` definitely has drawbacks. It requires an extra import, `console.log` loses the name, and enums don't have a native equivalent in JavaScript. Admittedly, `{ species: "DOG" }` looks a bit more straightforward than `{ species: Species.DOG }`.
But `enum`s also have some nice properties that union types don't provide. For example, the numeric values can actually be meaningful (`enum {ONE=1, TWO=2}`, for a silly example), and they can be used for bitwise flags.
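For instance, a hypothetical flags `enum` (not from any generated code) shows the bitwise use case, which string union types can't express:

```typescript
// Hypothetical permission flags using power-of-two values.
enum Perm {
  NONE = 0,
  READ = 1,
  WRITE = 2,
  EXEC = 4,
}

// Combine flags with bitwise OR; the result (3) is not a declared member.
const perms = Perm.READ | Perm.WRITE;
console.log(perms); // 3
console.log((perms & Perm.WRITE) !== 0); // true
console.log((perms & Perm.EXEC) !== 0); // false
```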
TypeScript `enum`s also have a property that's important for backwards compatibility in Protobuf. Like enumerations in C# and C++, you can actually assign values other than the declared ones to an enum. For example, consider the following Protobuf file:
enum Species {
UNSPECIFIED = 0;
CAT = 1;
DOG = 2;
}
message Animal {
Species species = 1;
}
If we were to add `HAMSTER = 3;` to the enumeration, old generated code can still (de)serialize an `Animal` created by new generated code:
enum Species {
UNSPECIFIED = 0,
CAT = 1,
DOG = 2,
}
const hamster: Species = 3;
As a result, there is a range of Protobuf features we wouldn't be able to model if we were using string union types for enumerations. Many users may not need those features, but this also has downstream impacts on frameworks such as Connect-ES, which couldn't be a fully featured replacement for gRPC-web if we didn't use TypeScript enums.
We generate our `enum` values based on how they're written in the source Protobuf file. The reason is that the Protobuf JSON spec requires the name of the enum value to be exactly what's used in the Protobuf file, and keeping them identical makes it very easy to encode and decode JSON.
The Buf style guide further says that `enum` values should be UPPER_SNAKE_CASE, so if you follow it, your generated TypeScript `enum` values will be in UPPER_SNAKE_CASE as well.
We don't provide an option to generate different cases for your `enum` values because we try to limit options to those we feel are necessary. PascalCase seems to be more of a stylistic choice, as even TypeScript's documentation uses various ways to name `enum` members. For more about our thoughts on options, see this question.
The short answer is that they're the best way to represent the 64-bit numerical types allowed in Protobuf. `BigInt` has widespread browser support, and for environments where it isn't supported, we fall back to a string representation.
Though it's true that a JavaScript `number` can't represent the full 64-bit range, it can safely represent integers between -(2^53 - 1) and 2^53 - 1 (`Number.MAX_SAFE_INTEGER`). However, this is obviously only effective if you can guarantee that no number in that field will ever exceed that range. Exceeding it could lead to subtle and potentially serious bugs, so the clear-cut usage of `BigInt` makes more sense.
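A quick illustration of where `number` breaks down, using nothing beyond standard JavaScript:

```typescript
// Number arithmetic silently loses precision past 2^53 - 1...
const max = Number.MAX_SAFE_INTEGER; // 9007199254740991
console.log(max + 1 === max + 2); // true: two distinct integers collapse

// ...while BigInt stays exact, which is why 64-bit fields use it.
const big = BigInt(max) + BigInt(2);
console.log(big.toString()); // 9007199254740993
```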
`js_generator.cc` is rarely updated and has fallen behind the quickly moving world of JavaScript. For example:
- It doesn't support ECMAScript modules
- It can't generate TypeScript (third-party plugins are necessary)
- It doesn't support the canonical JSON format
- It doesn't carry over comments from your `.proto` files
Because of this, we want to provide a solid, modern alternative with Protobuf-ES. The main differences of the generated code are:
- We use plain properties for fields, whereas `protoc` uses getter and setter methods
- We implement the canonical JSON format
- We generate much smaller bundles
- We rely on standard APIs instead of the Closure Library
In general, we feel that an abundance of options makes the plugin less approachable. It can be daunting to new users to sift through numerous configuration choices when they're just beginning to use the plugin. Our default position is to be as opinionated as possible about the generated code, and this results in fewer knobs that need turning at configuration time. In addition, too many options also makes debugging more difficult. It's much easier to reason about the generated code when it conforms to a predictable standard.
There are also more concrete reasons why we prefer to add options judiciously. Consider a popular option request: the ability to generate `snake_case` field names as opposed to `camelCase`. If we provided this option, any plugin downstream that accesses these fields or uses the base types would also have to support it and ensure that it's set to the same value across plugins every time files are generated. Any functionality that uses the generated code must also stay in sync. Exposing options, especially those that affect the generated code, introduces an entirely new way for breaking changes to happen. The generated code is no longer predictable, which defeats the purpose of generating code.
This isn't to say that we're completely against adding any options to the plugin. There are obviously cases where adding an option is necessary. However, for cases such as stylistic choices or user preferences, we tend to err on the side of caution.
Protobuf-ES uses package exports. If you see the following error with Parcel, make sure to enable package exports:
@parcel/core: Failed to resolve '@bufbuild/protobuf/codegenv1'
If you see the following error with Metro or Expo, make sure to enable package exports:
Metro error: Unable to resolve module @bufbuild/protobuf/codegenv1
Serialization to JSON and binary is deterministic within a version of Protobuf-ES: regular fields are sorted by field number, while map entries, repeated fields, and extensions are ordered by insertion.