diff --git a/docs/cli/router/plugin/build.mdx b/docs/cli/router/plugin/build.mdx
index 9ce961d3..9cbf5fd5 100644
--- a/docs/cli/router/plugin/build.mdx
+++ b/docs/cli/router/plugin/build.mdx
@@ -48,7 +48,7 @@ If you only want to generate code but not compile the binary (useful when you're
 The build command will automatically check for and install the necessary toolchain (like protoc, protoc-gen-go, etc.) when required tools can't be found in the right version on your system. You can control this behavior with the `--skip-tools-installation` and `--force-tools-installation` flags.
 
-For debugging your plugin, use the `--debug` flag to build with debug symbols. This enables debugging with tools like Delve, GoLand, or VS Code. See the [debugging guide](./debug) for detailed instructions.
+For debugging your plugin, use the `--debug` flag to build with debug symbols. This enables debugging with tools like Delve, GoLand, or VS Code. See the [debugging guide](/router/gRPC/plugins/debugging) for detailed instructions.
 
 You can also install the dependencies manually and use an IDE with Go support. The following table shows the current versions and download links for the required tools:
diff --git a/docs/cli/router/plugin/test.mdx b/docs/cli/router/plugin/test.mdx
index e1f9d84d..0728e6e2 100644
--- a/docs/cli/router/plugin/test.mdx
+++ b/docs/cli/router/plugin/test.mdx
@@ -43,7 +43,7 @@ Testing your plugin is an important step to ensure that your resolvers work corr
 The build command will automatically check for and install the necessary toolchain (like protoc, protoc-gen-go, etc.) when required tools can't be found in the right version on your system. You can control this behavior with the `--skip-tools-installation` and `--force-tools-installation` flags.
 
-For debugging your plugin, use the `--debug` flag to build with debug symbols. This enables debugging with tools like Delve, GoLand, or VS Code. See the [debugging guide](./debug) for detailed instructions.
+For debugging your plugin, use the `--debug` flag to build with debug symbols. This enables debugging with tools like Delve, GoLand, or VS Code. See the [debugging guide](/router/gRPC/plugins/debugging) for detailed instructions.
 
 You can also install the dependencies manually and use an IDE with Go support. The following table shows the current versions and download links for the required tools:
diff --git a/docs/connect-rpc/consume-via-grpc.mdx b/docs/connect-rpc/consume-via-grpc.mdx
new file mode 100644
index 00000000..1dc3109a
--- /dev/null
+++ b/docs/connect-rpc/consume-via-grpc.mdx
@@ -0,0 +1,74 @@
---
title: "Consume gRPC"
description: "Interact with your API using high-performance gRPC clients and standard tooling like grpcurl."
icon: "tower-broadcast"
---

This page demonstrates how your platform enables API consumers to interact with the API using standard gRPC tooling.

Platform teams can adapt the examples on this page when creating consumer-facing documentation to ensure a consistent and supported API consumption experience.

---

For internal microservices, backend systems, or high-performance mobile applications, consumers can interact with the API using the binary gRPC protocol over HTTP/2.

## Prerequisites

To consume the API via standard gRPC clients, consumers need two things provided by the platform team:

1. Connection details: the host and port where the router's ConnectRPC server is listening (e.g., `localhost:5026`).

2. Service definitions: the `service.proto` file generated during the build step.
   This file is required to define the available services, methods, and message structures for binary serialization.

## Ad-hoc testing with `grpcurl`

`grpcurl` is a widely used command-line tool that lets you interact with gRPC servers, similar to how `curl` works for HTTP.
It is excellent for testing and debugging without generating code.

### Command structure

To make a request, you need to point `grpcurl` to your local `.proto` definition file, specify the plaintext flag (unless TLS is configured), provide the request data as JSON, and specify the target endpoint and fully qualified method name.

In production environments, TLS is typically enabled and the `-plaintext` flag should be omitted.

```bash
grpcurl -plaintext \
  -proto <path/to/service.proto> \
  -d '<json-request-data>' \
  <host>:<port> \
  <package>.<service>/<method>
```

## Examples

The following examples assume the router is listening locally on port `5026` and the proto definitions are available in a local `./services/service.proto` file.

### GetEmployeeById

Retrieve an employee by their ID.

```bash
grpcurl -plaintext \
  -proto ./services/service.proto \
  -d '{"id": 1}' \
  localhost:5026 \
  employees.v1.HrService/GetEmployeeById
```

### UpdateEmployeeMood

Update data on the server.

```bash
grpcurl -plaintext \
  -proto ./services/service.proto \
  -d '{"id": 1, "mood": "MOOD_HAPPY"}' \
  localhost:5026 \
  employees.v1.HrService/UpdateEmployeeMood
```

## Programmatic gRPC Clients

While `grpcurl` is useful for testing, nearly every major programming language has robust library support for gRPC. Consumers can use standard `protoc` code generation tools against the provided `service.proto` file to build high-performance, typed clients for backend services.

For the best developer experience in languages like TypeScript, Go, and Kotlin, we recommend using the pre-generated [SDKs](/connect-rpc/consume-via-sdks) instead of raw gRPC clients.
diff --git a/docs/connect-rpc/consume-via-rest-openapi.mdx b/docs/connect-rpc/consume-via-rest-openapi.mdx
new file mode 100644
index 00000000..4393ad63
--- /dev/null
+++ b/docs/connect-rpc/consume-via-rest-openapi.mdx
@@ -0,0 +1,102 @@
---
title: "Consume REST / OpenAPI"
description: "Interact with your API using standard HTTP/JSON requests, suitable for standard clients like cURL, Postman, or browsers."
icon: "arrow-right-arrow-left"
---

This page demonstrates how your platform enables API consumers to interact with the API using standard HTTP tooling and OpenAPI specifications.

Platform teams can adapt the examples on this page when creating consumer-facing documentation to ensure a consistent and supported API consumption experience.

Every GraphQL operation defined in your contract is automatically exposed as a standard HTTP endpoint supporting JSON encoding.
This allows any HTTP client to interact with your API without needing specific gRPC or GraphQL knowledge.

## OpenAPI Specification

If your platform team has generated an OpenAPI specification (e.g. `service.openapi.yaml`) as part of the build process, you can import it into tools like Swagger UI, Redoc, or Postman to explore the API interactively and see the exact request and response contracts.

To learn how to generate the OpenAPI specification file, see [Generate & Distribute SDKs](/connect-rpc/produce-generate-distribute-sdks).

## HTTP POST requests

The standard method for calling any RPC method (both Queries and Mutations) is via an HTTP POST request with a JSON body.

### Required Headers

When making a `POST` request, you must include the following headers to indicate you are using the Connect protocol with JSON:

- `Content-Type: application/json`

- `Connect-Protocol-Version: 1`

### Endpoint URL Structure

The URL structure follows the pattern: `http(s)://<host>:<port>/<package>.<service>/<method>`

For example: `http://localhost:5026/employees.v1.HrService/GetEmployeeById`

### Examples

#### Query with arguments (POST)

Retrieve an employee by ID. The arguments are passed as a JSON object in the request body.

```bash
curl -X POST http://localhost:5026/employees.v1.HrService/GetEmployeeById \
  -H "Content-Type: application/json" \
  -H "Connect-Protocol-Version: 1" \
  -d '{"id": 1}'
```

#### Mutation (POST)

Mutations — operations that modify data and have side effects — must use HTTP POST.

```bash
curl -X POST http://localhost:5026/employees.v1.HrService/UpdateEmployeeMood \
  -H "Content-Type: application/json" \
  -H "Connect-Protocol-Version: 1" \
  -d '{"id": 1, "mood": "MOOD_HAPPY"}'
```

## HTTP GET requests (caching)

GraphQL Query operations are marked as idempotent (`NO_SIDE_EFFECTS`) in the generated protocol buffers. This enables them to be called via HTTP GET requests.

Using GET is highly recommended for read-only operations because it allows responses to be cached by standard CDNs and browser caches, significantly improving performance.

### Constructing the GET request

Since GET requests do not have a body, arguments must be passed via query parameters. The request requires three specific parameters:

- `connect=v1`: Specifies the protocol version.
- `encoding=json`: Specifies the format of the message data.
- `message={...}`: A URL-encoded JSON object containing the arguments.

These parameters are required by the Connect protocol when using HTTP GET with JSON encoding.

### Examples

#### Query without arguments (GET)

Retrieve all employees. The message is an empty JSON object `{}`.

```bash
curl --get \
  --data-urlencode 'encoding=json' \
  --data-urlencode 'message={}' \
  --data-urlencode 'connect=v1' \
  http://localhost:5026/employees.v1.HrService/GetEmployees
```

#### Query with arguments (GET)

Retrieve an employee by ID. The message contains the arguments `{"id": 1}`.

```bash
curl --get \
  --data-urlencode 'encoding=json' \
  --data-urlencode 'message={"id": 1}' \
  --data-urlencode 'connect=v1' \
  http://localhost:5026/employees.v1.HrService/GetEmployeeById
```
diff --git a/docs/connect-rpc/consume-via-sdks.mdx b/docs/connect-rpc/consume-via-sdks.mdx
new file mode 100644
index 00000000..54422fa7
--- /dev/null
+++ b/docs/connect-rpc/consume-via-sdks.mdx
@@ -0,0 +1,111 @@
---
title: "Consume SDKs"
description: "Learn how to install and use the pre-generated, type-safe client libraries for TypeScript and Go"
icon: "brackets-curly"
---

This page demonstrates the developer experience your platform team enables for API consumers using generated SDKs.

Platform teams can adapt the examples on this page when creating consumer-facing documentation to ensure a consistent and supported API consumption experience.

Once your platform team has generated and distributed the client SDKs, consuming the API is straightforward.

You do not need to deal with GraphQL queries, Protocol Buffer definitions, or generation tooling. You simply install a library and interact with strongly typed objects and methods native to your programming language.

This guide assumes you have been provided with the package name or repository URL for the generated SDKs.

The examples below demonstrate TypeScript and Go SDKs. ConnectRPC supports many other languages including Python, Swift, Kotlin, and more. For a complete list of supported languages and their SDKs, see the [official Buf documentation](https://buf.build/docs/bsr/generated-sdks/#supported-languages).

## TypeScript / JavaScript example

The generated TypeScript client provides full type safety for requests and responses, along with autocomplete in your IDE.

### 1. Installation

Install the generated SDK package provided by your platform team, along with the required Connect runtime libraries.

Note: Replace `@my-org/sdk` with the actual package name provided by your team.

```bash
npm install @my-org/sdk @connectrpc/connect @connectrpc/connect-web
```

### 2. Usage Example

To make a request, you create a transport (configuring the base URL) and then instantiate the client for the desired service.

```ts
import { createPromiseClient } from "@connectrpc/connect";
import { createConnectTransport } from "@connectrpc/connect-web";
// Import the service definition from your generated SDK package
import { HrService } from "@my-org/sdk/employees/v1/service_connect";

const transport = createConnectTransport({
  baseUrl: "http://localhost:5026",
});

const client = createPromiseClient(HrService, transport);

async function fetchEmployee() {
  // Request and response objects are fully typed
  const response = await client.getEmployeeById({ id: 1 });
  console.log(`Fetched employee: ${response.employee?.details?.forename}`);
}

fetchEmployee();
```

## Go example

The generated Go SDK provides idiomatic Go structs and interfaces for interacting with the API.

### 1. Installation

Download the generated Go module provided by your platform team, along with the required Connect runtime.

Note: Replace `github.com/my-org/sdk` with the actual module path provided by your team.

```bash
go get github.com/my-org/sdk
```

### 2. Usage Example

Instantiate the service client using a standard HTTP client and the base URL of the router.

```go
package main

import (
	"context"
	"fmt"
	"log"
	"net/http"

	"connectrpc.com/connect"
	// Import the generated packages from your SDK module
	employeesv1 "github.com/my-org/sdk/gen/go/employees/v1"
	"github.com/my-org/sdk/gen/go/employees/v1/employeesv1connect"
)

func main() {
	client := employeesv1connect.NewHrServiceClient(
		http.DefaultClient,
		"http://localhost:5026",
	)

	req := connect.NewRequest(&employeesv1.GetEmployeeByIdRequest{
		Id: 1,
	})

	res, err := client.GetEmployeeById(context.Background(), req)
	if err != nil {
		log.Fatal(err)
	}

	// Response data is strongly typed
	fmt.Printf("Fetched employee: %s\n", res.Msg.Employee.Details.Forename)
}
```
diff --git a/docs/connect-rpc/overview.mdx b/docs/connect-rpc/overview.mdx
new file mode 100644
index 00000000..82b25258
--- /dev/null
+++ b/docs/connect-rpc/overview.mdx
@@ -0,0 +1,273 @@
---
title: "Overview"
description: "Transform GraphQL operations into OpenAPI specifications and type-safe SDKs"
icon: "circle-info"
---

Cosmo ConnectRPC transforms your GraphQL operation collections into OpenAPI specifications and type-safe SDKs - enabling multiple consumption interfaces from a single source of truth.
+ +## From GraphQL Operations to Multiple Interfaces + +```mermaid +%%{init: {'theme':'base', 'themeVariables': { 'primaryColor':'#0284c7','primaryTextColor':'#fff','primaryBorderColor':'#0c4a6e','lineColor':'#64748b','secondaryColor':'#ea580c','secondaryTextColor':'#fff','secondaryBorderColor':'#9a3412','tertiaryColor':'#1e293b','tertiaryTextColor':'#e2e8f0','tertiaryBorderColor':'#475569'}}}%% +flowchart TB + A[GraphQL Operation Collections] + A --> B[OpenAPI Specification] + A --> C[TypeScript SDK] + A --> D[Go SDK] + + style A fill:#0284c7,stroke:#0c4a6e,color:#fff + style B fill:#ea580c,stroke:#9a3412,color:#fff + style C fill:#ea580c,stroke:#9a3412,color:#fff + style D fill:#ea580c,stroke:#9a3412,color:#fff +``` + +The following example shows how a single `GetEmployeeById` GraphQL operation becomes consumable through multiple interfaces: + + + + The source GraphQL operation that defines the API contract: + + ```graphql + query GetEmployeeById($id: Int!) { + employee(id: $id) { + id + details { + forename + surname + } + } + } + ``` + + + Automatically generated OpenAPI specification: + + ```yaml +/employees.v1.HrService/GetEmployeeById: + post: + tags: + - employees.v1.HrService + summary: GetEmployeeById + operationId: employees.v1.HrService.GetEmployeeById + parameters: + - name: Connect-Protocol-Version + in: header + required: true + schema: + $ref: '#/components/schemas/connect-protocol-version' + - name: Connect-Timeout-Ms + in: header + schema: + $ref: '#/components/schemas/connect-timeout-header' + requestBody: + content: + application/json: + schema: + $ref: '#/components/schemas/employees.v1.GetEmployeeByIdRequest' + required: true + responses: + default: + description: Error + content: + application/json: + schema: + $ref: '#/components/schemas/connect.error' + "200": + description: Success + content: + application/json: + schema: + $ref: '#/components/schemas/employees.v1.GetEmployeeByIdResponse' + ``` + + + Type-safe TypeScript client code: + + ```ts + import { createPromiseClient } from "@connectrpc/connect"; + import { createConnectTransport } from "@connectrpc/connect-web"; + import { HrService } from "@my-org/sdk/employees/v1/service_connect"; + + const transport = createConnectTransport({ + baseUrl: "http://localhost:5026", + }); + + const client = createPromiseClient(HrService, transport); + + // Request and response are fully typed + const response = await client.getEmployeeById({ id: 1 }); + console.log(`Employee: ${response.employee?.details?.forename}`); + ``` + + + Type-safe Go client code: + + ```go + import ( + "context" + "connectrpc.com/connect" + employeesv1 "github.com/my-org/sdk/gen/go/employees/v1" + "github.com/my-org/sdk/gen/go/employees/v1/employeesv1connect" + ) + + client := employeesv1connect.NewHrServiceClient( + http.DefaultClient, + "http://localhost:5026", + ) + + req := connect.NewRequest(&employeesv1.GetEmployeeByIdRequest{ + Id: 1, + }) + + res, err := client.GetEmployeeById(context.Background(), req) + // Response is strongly typed + fmt.Printf("Employee: %s\n", res.Msg.Employee.Details.Forename) + ``` + + + +### Why This Matters + +While GraphQL is an excellent interface for schema design and composition, it is not universally consumable. + +Many API consumers - due to existing tooling, security policies, performance requirements, or language ecosystems - rely on REST, RPC, or generated SDKs rather than constructing GraphQL queries directly. 
+ +Without a unified approach, platform teams often end up maintaining parallel APIs and duplicated contracts, leading to drift, inconsistent behavior, and higher operational overhead. + +## Architectural Benefits + +ConnectRPC addresses fundamental infrastructure and operational challenges by leveraging Protocol Buffers as the intermediate representation between GraphQL and consumption interfaces. + +### Performance + +Protocol Buffer encoding delivers significant efficiency gains over JSON-based GraphQL responses: + +- **Compact wire format**: Binary serialization reduces payload sizes, lowering bandwidth consumption and improving response times +- **Optimized parsing**: Generated code eliminates runtime reflection and dynamic parsing overhead +- **HTTP caching**: GET-based queries enable standard HTTP caching layers and CDN integration + +### Reliability + +The architecture inherently improves system reliability through contract enforcement: + +- **Schema-driven validation**: All requests are validated against compiled Protocol Buffer definitions before execution +- **Deterministic behavior**: Trusted Documents eliminate query variability, making performance characteristics predictable +- **Infrastructure observability**: Standard RPC semantics integrate with existing metrics, tracing, and monitoring infrastructure + +### Operational Efficiency + +The approach eliminates common sources of operational complexity: + +- **No manual glue code**: The router handles protocol translation automatically—no custom ingress controllers, API gateways, or translation layers required +- **Single source of truth**: GraphQL operations define the contract once; all downstream artifacts are generated, not maintained +- **Breaking change detection**: Schema changes are validated against defined operations at compile time, surfacing incompatibilities before deployment +- **Reduced attack surface**: Only explicitly defined operations are exposed—arbitrary query construction is not possible at runtime + +### API Abstraction + +Consumers interact through stable, well-defined interfaces rather than raw query languages: + +- **No exposed GraphQL**: API consumers use generated SDKs or standard REST/RPC calls—they never construct or see GraphQL queries +- **Implementation independence**: The underlying GraphQL layer becomes an internal implementation detail, not a public contract +- **Simplified onboarding**: Consumers integrate using familiar patterns (REST, gRPC, typed and generated clients) without GraphQL expertise + +### Security Model + +All API paths are explicitly bounded by design: + +- **Enclosed execution paths**: Consumers can only invoke operations that have been pre-defined and compiled into the system +- **No query injection**: Unlike open GraphQL endpoints, there is no mechanism for clients to construct arbitrary queries +- **Auditable contracts**: Every exposed operation is traceable to a named, versioned definition in source control + +## One API, Multiple Interfaces + +The core concept of API Consumption in Cosmo is simple: + +> GraphQL is your interface for design and governance; RPC and REST are your interfaces for consumption. + +Platform teams define collections of Trusted Documents - named GraphQL queries and mutations that represent the supported API surface. + +```graphql +query GetUser($id: ID!) { + user(id: $id) { + id + name + email + } +} +``` + +> This single `GetUser` operation becomes a versioned RPC method, a REST endpoint and a typed SDK function. 
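To make that concrete, here is a rough sketch of the kind of Protocol Buffer contract this compiles to. The package, service, and message names below are illustrative assumptions; the authoritative definition is whatever `wgc grpc-service generate` produces for your schema:

```protobuf
// Illustrative sketch only - the real output comes from `wgc grpc-service generate`.
syntax = "proto3";

package myapi.v1; // assumed package name

service UserService { // assumed service name
  // GetUser is generated from the GetUser query above.
  rpc GetUser(GetUserRequest) returns (GetUserResponse) {
    // Queries are marked side-effect free, which enables HTTP GET support.
    option idempotency_level = NO_SIDE_EFFECTS;
  }
}

message GetUserRequest {
  string id = 1; // from the $id: ID! variable
}

message GetUserResponse {
  User user = 1; // mirrors the operation's selection set
}

message User {
  string id = 1;
  string name = 2;
  string email = 3;
}
```

Consumers program against this generated contract rather than the underlying GraphQL.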
+ +Cosmo compiles these into Protocol Buffer definitions, and the router acts as a mediation layer, automatically mapping incoming RPC or HTTP requests to these trusted operations against your graph. + +This approach provides: + +| Benefit | Before | After | +|---------|--------|-------| +| **Governance** | Arbitrary queries allowed in production | No arbitrary queries in production - only defined operations are exposed | +| **Type Safety** | Handwritten clients with runtime shape mismatches | No handwritten clients or runtime shape mismatches - strongly typed generated code | +| **Performance** | POST-only requests bypass HTTP caching | GET-based queries unlock HTTP caching and CDNs | + +## How It Works + +The lifecycle moves from GraphQL contract definition to multi-protocol consumption, without introducing additional API layers. + +```mermaid +%%{init: {'theme':'base', 'themeVariables': { 'primaryColor':'#0284c7','primaryTextColor':'#fff','primaryBorderColor':'#0c4a6e','lineColor':'#64748b','secondaryColor':'#ea580c','secondaryTextColor':'#fff','secondaryBorderColor':'#9a3412','tertiaryColor':'#1e293b','tertiaryTextColor':'#e2e8f0','tertiaryBorderColor':'#475569','clusterBkg':'transparent','clusterBorder':'#64748b','titleColor':'#64748b'}}}%% +flowchart TB + A[1\. Define Contracts] --> B[2\. Configure Router] + B --> C[3\. Generate & Distribute SDKs] + C --> D[4a\. Consume via SDKs] + C --> E[4b\. Consume via REST/OpenAPI] + C --> F[4c\. Consume via gRPC] + + subgraph Legend + L1[API Producers] + L2[API Consumers] + end + + click A "/connect-rpc/produce-define-contracts" "Go to Define Contracts" + click B "/connect-rpc/produce-configure-router" "Go to Configure Router" + click C "/connect-rpc/produce-generate-distribute-sdks" "Go to Generate & Distribute SDKs" + click D "/connect-rpc/consume-via-sdks" "Go to Consume via SDKs" + click E "/connect-rpc/consume-via-rest-openapi" "Go to Consume via REST/OpenAPI" + click F "/connect-rpc/consume-via-grpc" "Go to Consume via gRPC" + + style A fill:#0284c7,stroke:#0c4a6e,color:#fff + style B fill:#0284c7,stroke:#0c4a6e,color:#fff + style C fill:#0284c7,stroke:#0c4a6e,color:#fff + style D fill:#ea580c,stroke:#9a3412,color:#fff + style E fill:#ea580c,stroke:#9a3412,color:#fff + style F fill:#ea580c,stroke:#9a3412,color:#fff + style L1 fill:#0284c7,stroke:#0c4a6e,color:#fff + style L2 fill:#ea580c,stroke:#9a3412,color:#fff + style Legend fill:transparent,stroke:#64748b,color:#64748b +``` + +1. **Define Contracts**: Create named GraphQL operations (Trusted Documents) and compile them into Protocol Buffer definitions that act as stable, versioned API contracts. + +2. **Configure Router**: Load the proto definitions into the Cosmo Router, which handles protocol translation automatically without server-side code. + +3. **Generate & Distribute SDKs**: Generate type-safe client SDKs (in languages like Go and TypeScript) and OpenAPI specifications from your definitions, ready for distribution. + +4. **Consume**: Developers install generated SDKs or use standard HTTP clients to interact with the API in their preferred language and protocol. + +## Supported Protocols and Clients + +By defining your operations once, you can support a wide range of consumers: + +- **REST / HTTP**: Simple JSON encoding over HTTP/1.1 or HTTP/2 for universal access via tools like curl. + +- **OpenAPI/Swagger**: Automatically generated specifications for documentation and REST tooling integration. 
- **Typed SDKs**: Generated client libraries for languages like TypeScript, Go, Swift, and Kotlin using standard tools like Buf.

- **gRPC**: High-performance binary protobuf over HTTP/2 for internal microservices.

- **Connect Protocol & gRPC-Web**: Browser-compatible RPC protocols.

All interfaces are generated from the same GraphQL operations and stay in sync by construction.

**Hands-on Tutorial**: The documentation in this section uses examples from our ConnectRPC Demo Repository. We recommend cloning it to follow along with the guides.
diff --git a/docs/connect-rpc/produce-configure-router.mdx b/docs/connect-rpc/produce-configure-router.mdx
new file mode 100644
index 00000000..e6cc8677
--- /dev/null
+++ b/docs/connect-rpc/produce-configure-router.mdx
@@ -0,0 +1,89 @@
---
title: "Configure Router"
description: "Enable the ConnectRPC server in the Cosmo Router to serve your typed API endpoints."
icon: "gears"
---

Once you have defined your contracts and generated your Protocol Buffer definitions, the next step is to configure the Cosmo Router to serve them.

In this architecture, the router runs a ConnectRPC server alongside its standard GraphQL endpoint, acting as a protocol sidecar within the same binary.

It loads your proto definitions, maps RPC calls to GraphQL operations, and handles protocol translation automatically.

## Prerequisites

Ensure you have the following files generated from the [Define Contracts](/connect-rpc/produce-define-contracts) step:

- The directory containing your source `.graphql` operation files (e.g., `./services/`).
- The generated `service.proto` located inside that same directory.

## 1. Update Router Configuration

You need to modify your router's `config.yaml` to enable the ConnectRPC server and tell it where to find your service definitions.

The router uses a storage provider to specify the **root services directory**. The router recursively walks this directory to discover all `.proto` files and their associated `.graphql` operations.

Add the following configuration blocks:

```yaml
# Storage providers must be defined first
storage_providers:
  file_system:
    - id: "fs-services"
      path: "./services"

# ConnectRPC configuration
connect_rpc:
  enabled: true
  storage:
    provider_id: "fs-services" # Must match a storage_providers.file_system[].id
```

For detailed information about router configuration options, including environment variables, storage providers, and advanced settings, see the [Router Configuration documentation](/router/configuration).

## 2. Run the Router (Docker Example)

When running the router container, you must ensure:

- The new ConnectRPC port (e.g., 5026) is exposed.

- The directory containing your service definitions is mounted into the container.

Here is an example `docker run` command:

```bash
# Ports: 3002 is the standard GraphQL port; 5026 is the ConnectRPC port from config.yaml.
# Mounts: the router config file and the services directory with your proto and graphql files.
# Pass GRAPH_API_TOKEN plus any other required environment variables.
docker run \
  --name cosmo-router \
  -p 3002:3002 \
  -p 5026:5026 \
  -v "$(pwd)/config.yaml:/config/config.yaml:ro" \
  -v "$(pwd)/services:/services:ro" \
  -e GRAPH_API_TOKEN="<your-graph-api-token>" \
  ghcr.io/wundergraph/cosmo/router:latest
```

## 3. Verification

When the router starts, it will scan the mounted directories. Look at the logs to verify it has successfully discovered your services and operations.
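If you used the `docker run` example above, you can follow the router logs with standard Docker tooling:

```bash
# Tail the logs of the container started in the previous step
docker logs -f cosmo-router
```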
You should see output similar to this:

```text
INFO discovered service {"service": "HrService", "dir": "/services", "proto_files": 1, "operation_files": 3}
INFO loaded operations for service {"service": "employees.v1.HrService", "operation_count": 3}
INFO ConnectRPC server ready {"addr": "0.0.0.0:5026"}
```

Key indicators:

- `discovered service`: Confirms the `service.proto` file was found.

- `loaded operations`: Confirms the corresponding `.graphql` files were found and mapped.

- `ConnectRPC server ready`: Confirms the server is listening on the configured port.

Your API is now ready to accept traffic via gRPC, Connect, and HTTP/JSON.
diff --git a/docs/connect-rpc/produce-define-contracts.mdx b/docs/connect-rpc/produce-define-contracts.mdx
new file mode 100644
index 00000000..2c6c2ccf
--- /dev/null
+++ b/docs/connect-rpc/produce-define-contracts.mdx
@@ -0,0 +1,258 @@
---
title: "Define Contracts"
description: "Create the API surface using Trusted Documents and generate Protocol Buffer definitions."
icon: "scroll"
---

The first step in the producer workflow is to define the API contract.

In Cosmo, this contract is not defined by creating `.proto` files directly. Instead, you define it by creating a collection of Trusted Documents - named GraphQL queries and mutations that represent your desired API surface.

These operations are then compiled into Protocol Buffer definitions that serve as the stable interface for your consumers.

## 1. Creating Trusted Documents

Trusted Documents are simply standard GraphQL operations saved in `.graphql` files. Each file should contain exactly one named operation.

Create a directory for your service operations (e.g. `services/`) and add your GraphQL files there.

### Rules for Mapping Operations to RPC Methods

- One operation per file: Each `.graphql` file must contain only one operation.

- PascalCase naming: The operation name must use PascalCase (e.g. `GetEmployeeById`). This name will become the RPC method name.

- No root-level aliases: Aliases are not allowed at the root of the query, but nested aliases are permitted.

### Example operations

Query example `services/GetEmployeeById.graphql`:

```graphql
query GetEmployeeById($id: Int!) {
  employee(id: $id) {
    id
    details {
      forename
      surname
    }
  }
}
```

Mutation example `services/UpdateEmployeeMood.graphql`:

```graphql
mutation UpdateEmployeeMood($id: Int!, $mood: Mood!) {
  updateMood(employeeID: $id, mood: $mood) {
    id
    currentMood
    updatedAt
  }
}
```

## 2. Generating Proto Definitions

Once your operations are defined, use the [`wgc` CLI](/cli/essentials) to generate the corresponding Protocol Buffer service definition.

This process compiles your operations and schema into a `.proto` file that defines the services, methods, and message types.

Run the following command from your project root:

```bash
wgc grpc-service generate \
  --input schema.graphql \
  --output ./services \
  --with-operations ./services \
  --package-name "employees.v1" \
  HrService
```

For the complete list of available command flags and options, see the [CLI reference documentation](/cli/essentials).

This command will generate two files in your output directory: `service.proto` and `service.proto.lock.json`.

### How mapping works

At a high level, each GraphQL operation is translated into a single RPC method with strongly typed request and response messages.
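As a sketch of what this looks like for the `GetEmployeeById` operation defined above (the message layout and field numbers here are illustrative assumptions; the generated `service.proto` and its lock file are authoritative):

```protobuf
// Illustrative sketch - the real definition is emitted by `wgc grpc-service generate`.
syntax = "proto3";

package employees.v1;

service HrService {
  rpc GetEmployeeById(GetEmployeeByIdRequest) returns (GetEmployeeByIdResponse) {
    // Query operations are marked side-effect free (see below).
    option idempotency_level = NO_SIDE_EFFECTS;
  }
}

// Built from the operation's variables ($id: Int!).
message GetEmployeeByIdRequest {
  int32 id = 1;
}

// Built from the operation's selection set.
message GetEmployeeByIdResponse {
  Employee employee = 1;
}

message Employee {
  int32 id = 1;
  EmployeeDetails details = 2;
}

message EmployeeDetails {
  string forename = 1;
  string surname = 2;
}
```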
The generator automatically maps GraphQL concepts to protobuf:

| GraphQL        | Protobuf                    |
| -------------- | --------------------------- |
| Operation name | RPC method                  |
| Variables      | Request message             |
| Selection set  | Response message            |
| Scalar types   | Protobuf scalar equivalents |

Query operations are marked with an `idempotency_level = NO_SIDE_EFFECTS` option, enabling support for HTTP GET requests.

## 3. Organizing Multiple Services

The Cosmo Router supports serving multiple gRPC services and packages simultaneously. It achieves this by recursively walking the directory specified in your router configuration to discover `.proto` files and their associated `.graphql` operations.

Because discovery is recursive and based on the package declaration within the generated `.proto` files, you have flexibility in how you organize your directories.

### Standard Package Organization

A common pattern is to organize services by their package name.

**Example: Single Service per Package**

```
services/
└── employee.v1/
    ├── employee.proto
    ├── GetEmployee.graphql
    └── UpdateEmployee.graphql
```

**Example: Multiple Services per Package**

You can group multiple services that share the same proto package into subdirectories.

```
services/
└── company.v1/
    ├── EmployeeService/
    │   ├── employee.proto
    │   └── ...graphql files
    └── DepartmentService/
        ├── department.proto
        └── ...graphql files
```

### Flexible Organization

Since the router relies on the package declaration in the proto file and not the directory name, you can organize directories for convenience.

```
services/
├── foo/
│   ├── employee.proto
│   └── ...graphql files
└── bar/
    ├── department.proto
    └── ...graphql files
```

#### Important: Nested Discovery Rules

While the router searches recursively, it has a specific rule regarding nested proto files: nested proto files are not discovered if a parent directory already contains a proto file.

Once the router finds a `.proto` file in a directory, it stops searching deeper in that specific branch.

```
services/
└── employee.v1/
    ├── employee.proto ✅
    ├── op1.graphql ✅
    └── nested/
        ├── other.proto ❌
        └── op2.graphql ✅
```

**Key:**
- ✅ **Discovered and used** (`employee.proto`, `op1.graphql`, `op2.graphql`)
- ❌ **Not discovered** (`other.proto` - its parent directory already contains a proto file)

## 4. Versioning & Stability

Versioning and compatibility are handled automatically, so you can safely evolve your API without manually managing protobuf field numbers.

> **You usually don't need to think about this.**
> The lock file exists to guarantee compatibility automatically as your API evolves.

A critical part of maintaining a gRPC API - or any API - is ensuring forward compatibility for your clients. This means that as you evolve your GraphQL schema and operations, existing clients must continue to work.

The `wgc grpc-service generate` command manages this automatically using a lock file.

### The `service.proto.lock.json` file

When you generate your proto definitions for the first time, a `service.proto.lock.json` file is created alongside the `.proto` file.

This file records the unique field numbers assigned to every field in your protobuf messages.

You should commit this file to version control.

On subsequent runs, the generator reads this lock file to ensure that:

- existing fields retain their assigned numbers.

- new fields are assigned new, unused numbers.
- removed fields have their numbers marked as "reserved" so that they are not reused.

### How it works

The lock file tracks field numbers using the full, dot-notation path of nested messages.

This ensures that fields in different messages with the same name (e.g. `Details` for `User` vs. `Details` for `Product`) are tracked independently.

#### Example 1: Stable Field Numbers

Adding fields is always safe.
New fields get new numbers; existing fields keep theirs.

```diff
 "fields": {
   "name": 1,
-  "email": 2
+  "email": 2,
+  "phone": 3
 }
```

This behavior guarantees that an old client that only knows about fields 1 and 2 can still communicate with a new server that also sends field 3.

#### Example 2: Handling Removed Fields

Removing fields is safe.
Removed field numbers are reserved and never reused.

```diff
 "fields": {
   "name": 1,
-  "email": 2,
   "phone": 3
-}
+},
+"reservedNumbers": [
+  2
+]
```

This is critical for backward compatibility. If an old client sends data with field number 2, the new server will correctly ignore it as a reserved field, rather than mistakenly trying to parse it as a new, unrelated field that happened to get assigned number 2.

### Deeply Nested Messages (advanced)

You don't need to manage this manually.

The locking mechanism works regardless of nesting depth. It uses full paths like `GetDeepResponse.GetDeep.Level2.Level3.Level4.Level5` to uniquely identify every message scope, ensuring precise control over field numbering throughout your entire API surface.

### Best Practices

1. Commit the lock file: Always commit `service.proto.lock.json` to your version control system along with your `.graphql` and `.proto` files.

2. Do not edit manually: Never manually modify or delete the lock file. Let the `wgc` CLI manage it.

3. Generate on CI: Run the generation step as part of your CI/CD pipeline to ensure the lock file is always up-to-date with your operations.

By following these practices, you can safely evolve your GraphQL-backed gRPC API without breaking existing clients.
diff --git a/docs/connect-rpc/produce-generate-distribute-sdks.mdx b/docs/connect-rpc/produce-generate-distribute-sdks.mdx
new file mode 100644
index 00000000..48b2e319
--- /dev/null
+++ b/docs/connect-rpc/produce-generate-distribute-sdks.mdx
@@ -0,0 +1,166 @@
---
title: "Generate & Distribute SDKs"
description: "Configure Buf to automatically generate and distribute type-safe TypeScript, Go, and OpenAPI artifacts from your API contracts as part of your CI/CD workflow."
icon: "wand-magic-sparkles"
---

This page guides you through the "build step" of your API product.
Once you have defined your contracts (operations and proto files), the next step is to generate the artifacts that your consumers need: type-safe client SDKs and documentation or API specs.

In a production environment, this generation process is typically automated as part of a Continuous Integration and Continuous Deployment (CI/CD) workflow. This ensures that whenever your API contracts change, your SDKs and documentation are automatically regenerated and distributed.

We recommend using [Buf](https://buf.build/), a modern toolchain for working with Protocol Buffers, to manage this process.
*Note: The examples on this page demonstrate setting up generators for **TypeScript** and **Go**, but the underlying workflow is the same for any language supported by the Buf ecosystem.*

## Recommendation: Local Generators

For the most robust and reproducible open-source workflow, we recommend using local generators.

While it is possible to use remote plugins hosted by the Buf Schema Registry (BSR), using local plugins ensures your build pipeline is self-contained, runs offline, and is not subject to external rate limiting for unauthenticated users.

This approach requires a one-time installation of the necessary generator plugins on your development machines, build machines, or CI environment.

## 1. Install Prerequisites

You need the Buf CLI and the specific language generator plugins installed locally in your development environment and on your CI/CD agents.

### Install Buf CLI

Follow the official [Buf installation guide](https://buf.build/docs/cli/installation/) for your OS.

### Install Generator Plugins

Run the following commands to install the plugins for TypeScript, Go, and OpenAPI.

TypeScript/JavaScript (via npm):

```bash
npm install --save-dev @bufbuild/protoc-gen-es @connectrpc/protoc-gen-connect-es
```

Go & OpenAPI (via Go):

```bash
# Ensure $(go env GOPATH)/bin is in your $PATH
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install connectrpc.com/connect/cmd/protoc-gen-connect-go@latest
go install github.com/sudorandom/protoc-gen-connect-openapi@latest
```

## 2. Configure Buf

Create two configuration files in the root of your project (next to your `services/` directory).

`buf.yaml` (module configuration)

This file defines your project as a Buf module. It configures linting rules to ensure your proto definitions follow best practices.

```yaml
version: v1
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT
```

`buf.gen.yaml` (generation targets)

This file tells Buf which plugins to run and where to output the generated code.

Update the Go package path: replace `github.com/your-org/your-repo` below with the actual module path for your Go project. This ensures imports work correctly in the generated Go code.

```yaml
version: v1
managed:
  enabled: true
  go_package_prefix:
    # TODO: Replace with your actual Go module path
    default: github.com/your-org/your-repo/gen/go
plugins:
  # --- TypeScript SDK ---
  # Generates base protobuf types
  - plugin: es
    out: gen/ts
    opt: target=ts
  # Generates ConnectRPC clients
  - plugin: connect-es
    out: gen/ts
    opt: target=ts

  # --- Go SDK ---
  # Generates base protobuf structs
  - plugin: go
    out: gen/go
    opt: paths=source_relative
  # Generates ConnectRPC interfaces/clients
  - plugin: connect-go
    out: gen/go
    opt: paths=source_relative

  # --- OpenAPI Spec ---
  # Generates openapi.yaml for documentation
  - plugin: connect-openapi
    out: gen/openapi
```

Because we installed the plugins locally, specifying just the plugin name (e.g. `plugin: es`) is enough: Buf finds the corresponding binary (e.g. `protoc-gen-es`) in your `$PATH` or `node_modules/.bin`.

## 3. Run Generation

Run the following command in your project root to generate all artifacts:

```bash
buf generate
```

Buf will discover the `.proto` files in your `services/` directory (based on the setup in [Define Contracts](/connect-rpc/produce-define-contracts)) and run the configured plugins.
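Since the `buf.yaml` above also enables lint and breaking-change rules, it can be useful to run those checks in the same step. The `--against` reference below is only an example baseline; point it at whatever ref your team treats as the last released contract:

```bash
# Lint the proto definitions using the rules from buf.yaml
buf lint

# Compare against the main branch (example baseline) to catch breaking changes
buf breaking --against '.git#branch=main'
```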
### Output Structure

You will see a new `gen/` directory with your distributable artifacts:

```
gen/
├── go/
│   └── ...
├── openapi/
│   └── service.openapi.yaml
└── ts/
    └── ...
```

## 4. Distribution Strategy & CI/CD Integration

Once these artifacts are generated, you need a strategy for distributing them to your consumers. This entire workflow should be automated within your CI/CD pipeline to run on every commit, merge, or release.

### Strategy A: Monorepo (recommended for internal teams)

Best when producers and consumers are owned by the same organization.

Check the entire `gen/` directory into your version control system.

- CI/CD: Your CI pipeline should run `wgc grpc-service generate` followed by `buf generate` to ensure the `gen/` folder is always in sync with your operation definitions.

- Consumers: Your internal applications import the SDKs directly from the file path (e.g. importing from `../../packages/sdk/gen/ts`).

- Pros: Easiest setup, atomic updates across services and consumers.

### Strategy B: Polyrepo / Publishing

Best when APIs are consumed by external teams or third parties.

Do not check the `gen/` folder into version control. Instead, use CI/CD to generate and publish the artifacts.

- CI/CD: Your pipeline runs the generation commands and then publishes the output to package registries.

  - TypeScript: Publish `gen/ts` to npm as `@your-org/sdk`.

  - Go: Push `gen/go` to a separate Git repository that acts as a Go module.

  - OpenAPI: Upload `service.openapi.yaml` to your developer portal or host it on S3.

- Consumers: Use standard tools (`npm install`, `go get`) and semantic versioning.

- Pros: Decouples producers and consumers, standard versioning for public consumption.
diff --git a/docs/deployments-and-hosting/terraform.mdx b/docs/deployments-and-hosting/terraform.mdx
index 0eabe879..ff24285e 100644
--- a/docs/deployments-and-hosting/terraform.mdx
+++ b/docs/deployments-and-hosting/terraform.mdx
@@ -47,7 +47,7 @@ You can find complete examples in the [examples](https://github.com/wundergraph/
 ## Available Resources
 
-
+
     Organize and isolate resources using namespaces
diff --git a/docs/docs.json b/docs/docs.json
index 199ef4f5..8a0c6c63 100644
--- a/docs/docs.json
+++ b/docs/docs.json
@@ -72,6 +72,18 @@
             "connect/grpc-services"
           ]
         },
+        {
+          "group": "Cosmo ConnectRPC",
+          "pages": [
+            "connect-rpc/overview",
+            "connect-rpc/produce-define-contracts",
+            "connect-rpc/produce-configure-router",
+            "connect-rpc/produce-generate-distribute-sdks",
+            "connect-rpc/consume-via-sdks",
+            "connect-rpc/consume-via-rest-openapi",
+            "connect-rpc/consume-via-grpc"
+          ]
+        },
         {
           "group": "Router",
           "pages": [
diff --git a/docs/router/configuration.mdx b/docs/router/configuration.mdx
index 72cef7eb..990989f0 100644
--- a/docs/router/configuration.mdx
+++ b/docs/router/configuration.mdx
@@ -837,7 +837,7 @@ modules:
 ## Router Plugins
 
-The configuration for the router plugins. To learn more about the plugins, see the [plugins documentation](/router/plugins).
+The configuration for the router plugins. To learn more about the plugins, see the [plugins documentation](/router/gRPC/plugins).
 | Environment Variable | YAML | Required | Description | Default Value |
 | -------------------- | ------- | ---------------------- | ------------------------------------------------------------------------------------- | ------------- |
diff --git a/docs/router/gRPC/graphql-support.mdx b/docs/router/gRPC/graphql-support.mdx
index b380e34c..1ab1abf5 100644
--- a/docs/router/gRPC/graphql-support.mdx
+++ b/docs/router/gRPC/graphql-support.mdx
@@ -169,4 +169,4 @@ For questions, updates, and community support:
 
-See also: [gRPC Services](./router/gRPC/grpc-services) · [Plugins](./router/gRPC/plugins)
+See also: [gRPC Services](/router/gRPC/grpc-services) · [Plugins](/router/gRPC/plugins)
diff --git a/docs/router/gRPC/plugins.mdx b/docs/router/gRPC/plugins.mdx
index 7c559110..d74f7dfe 100644
--- a/docs/router/gRPC/plugins.mdx
+++ b/docs/router/gRPC/plugins.mdx
@@ -227,7 +227,7 @@ For more details on the directory structure and build process, see the [`wgc rou
 ## Debugging Plugins
 
-Please refer to the [Debugging Plugins](/router/gRPC/debugging) documentation for more details.
+Please refer to the [Debugging Plugins](/router/gRPC/plugins/debugging) documentation for more details.
 
 ## Deployment Considerations
diff --git a/docs/router/gRPC/plugins/go-plugin/logging.mdx b/docs/router/gRPC/plugins/go-plugin/logging.mdx
index aba66d07..09904b9d 100644
--- a/docs/router/gRPC/plugins/go-plugin/logging.mdx
+++ b/docs/router/gRPC/plugins/go-plugin/logging.mdx
@@ -290,4 +290,4 @@ When using JSON logging in production, the structured output integrates well wit
 The structured JSON format makes it easy to query, filter, and create dashboards based on your plugin logs.
 
-See also: [Plugins](./router/gRPC/plugins) · [gRPC Services](./router/gRPC/grpc-services) · [GraphQL Support](./router/gRPC/graphql-support)
+See also: [Plugins](/router/gRPC/plugins) · [gRPC Services](/router/gRPC/grpc-services) · [GraphQL Support](/router/gRPC/graphql-support)