This README is for SDK contributors and AI assistants. For end users/consumers, use the public docs: https://ocp.canton.fairmint.com/.
- Strict types: `"strict": true` across the repo; no unsound assertions
- No `any`/`unknown`: prefer unions, discriminated unions, and narrow interfaces
- Fail fast: validate inputs, throw early with actionable messages
- Explicit APIs: always annotate exported function return types
- Deterministic: avoid hidden state; keep functions pure when possible
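The "no `any`/`unknown`" rule usually means reaching for a discriminated union. A minimal sketch (the type and function names here are illustrative, not actual SDK types):

```typescript
// A discriminated union instead of `any` for a result shape.
type ConversionResult =
  | { kind: 'ok'; value: string }
  | { kind: 'error'; message: string };

function renderResult(result: ConversionResult): string {
  // Narrowing on the `kind` discriminant keeps every branch fully typed,
  // and the switch is exhaustive without a default case.
  switch (result.kind) {
    case 'ok':
      return `value: ${result.value}`;
    case 'error':
      return `failed: ${result.message}`;
  }
}
```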
- `src/functions/<domain>/...`: one operation per file; export `Params`/`Result`
- `src/OcpClient.ts`: grouped facade that wires functions to a `LedgerJsonApiClient`
- `src/types/native.ts`: ergonomic OCF-native types for inputs/outputs
- `src/types/contractDetails.ts`: `ContractDetails` for disclosed cross-domain references
- `src/utils/typeConversions.ts`: conversions between DAML types and native OCF types
- `src/utils/TransactionBatch.ts`: typed batch submission helper
- Name operations with verbs: `createX`, `getXAsOcf`, `archiveXByIssuer`
- Define `XParams` and `XResult` in the operation file; export them
- Validate all required fields in `XParams` and return a precise `XResult`
- Provide `buildXCommand` for batch support when relevant (returns Command + disclosures)
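A hypothetical sketch of what such an operation file might look like. The `CreateStakeholder*` names, the template id string, and the command shape are all stand-ins for illustration, not the real ledger API types:

```typescript
// Hypothetical: src/functions/stakeholder/createStakeholder.ts
export interface CreateStakeholderParams {
  issuerParty: string;
  name: string;
}

export interface CreateStakeholderResult {
  contractId: string;
}

// Stand-in for the real Command + disclosures shape.
interface BuiltCommand {
  templateId: string;
  choice: string;
  argument: CreateStakeholderParams;
  disclosedContracts: string[];
}

export function buildCreateStakeholderCommand(params: CreateStakeholderParams): BuiltCommand {
  // Fail fast: validate every required field with an actionable message.
  if (!params.issuerParty) throw new Error('createStakeholder: issuerParty is required');
  if (!params.name) throw new Error('createStakeholder: name is required');
  return {
    templateId: 'Ocp.Stakeholder:Stakeholder',
    choice: 'Create',
    argument: params,
    disclosedContracts: [],
  };
}
```

The `createX` function itself would then call `buildCreateStakeholderCommand` and submit via the client, so batch and single-command paths share one validated code path.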
- Use `ContractDetails` with all fields required: `contractId`, `createdEventBlob`, `synchronizerId`, `templateId`
- Include only the minimum set of disclosed contracts required for the transaction
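The four field names come from this README; the `string` types and the example values below are assumptions for illustration:

```typescript
// Sketch of the ContractDetails shape with every field required.
interface ContractDetails {
  contractId: string;
  createdEventBlob: string;
  synchronizerId: string;
  templateId: string;
}

// Because no field is optional, an object missing any of them fails to type-check.
const details: ContractDetails = {
  contractId: '00abc...',
  createdEventBlob: 'base64-blob',
  synchronizerId: 'sync::1220...',
  templateId: 'pkg:Module:Template',
};
```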
- Construct with `ClientConfig` from `@fairmint/canton-node-sdk`
- Ensure LEDGER_JSON_API is configured with auth and party/user identifiers
- Map inputs using native OCF types from `types/native`
- Perform conversions in `utils/typeConversions.ts`; reject invalid shapes immediately
- Never hide data issues with defensive checks or fallback values
- If DAML defines a field as an array, trust that it is an array; don't check with `Array.isArray()` or provide empty-array fallbacks
- DAML arrays are never null or undefined, so don't check for that either (e.g., `arr && arr.length` → just `arr.length`)
- If data is malformed, let it throw naturally so issues surface immediately
- This applies to all conversions: fail fast rather than silently handling unexpected data
- Example of what NOT to do: `Array.isArray(data) ? data : []` ❌ or `data && data.length` ❌
- Example of what to do: `data as ExpectedType[]` ✅ and `data.length` ✅
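A sketch of what a fail-fast conversion looks like in practice. The `DamlStockIssuance`/`OcfStockIssuance` shapes are hypothetical, not SDK types:

```typescript
// Illustrative conversion helper following the fail-fast rules above.
interface DamlStockIssuance { security_holders: string[]; quantity: string; }
interface OcfStockIssuance { holders: string[]; quantity: number; }

function toOcfStockIssuance(daml: DamlStockIssuance): OcfStockIssuance {
  // No Array.isArray() guard and no `daml.security_holders && ...` fallback:
  // DAML guarantees the array exists, so malformed data throws naturally.
  const quantity = Number(daml.quantity);
  if (Number.isNaN(quantity)) {
    // Where we DO validate (a numeric string), the error is explicit and actionable.
    throw new Error(`Invalid quantity "${daml.quantity}" in stock issuance`);
  }
  return { holders: daml.security_holders, quantity };
}
```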
- Use `TransactionBatch` for multi-command submissions; prefer `buildXCommand` helpers to ensure types are correct
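A minimal sketch of the batching pattern; the real API in `src/utils/TransactionBatch.ts` may differ, and the `CommandBatch` class below is purely illustrative:

```typescript
interface Command { templateId: string; choice: string; }
interface BatchEntry { command: Command; disclosedContracts: string[]; }

// Illustrative batch accumulator: collect typed entries (as returned by
// buildXCommand helpers) and deduplicate disclosures for one submission.
class CommandBatch {
  private entries: BatchEntry[] = [];

  add(entry: BatchEntry): this {
    this.entries.push(entry);
    return this;
  }

  // The same disclosed contract referenced by several commands is sent once.
  collectDisclosures(): string[] {
    return [...new Set(this.entries.flatMap((e) => e.disclosedContracts))];
  }

  get size(): number {
    return this.entries.length;
  }
}
```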
- Throw `Error` (or domain-specific subclasses) with clear messages; never swallow errors
- Prefer adding context (contract/template ids, party id) to error messages
- Use the shared logger from the node SDK when available
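One way to keep that context consistent is a small error subclass. `OcpOperationError` is a hypothetical name, not an existing SDK class:

```typescript
// Sketch of a domain-specific error subclass that carries ledger context.
class OcpOperationError extends Error {
  constructor(
    message: string,
    readonly context: { contractId?: string; templateId?: string; partyId?: string }
  ) {
    // Embed the context in the message so logs stay actionable on their own.
    super(
      `${message} [${Object.entries(context)
        .map(([k, v]) => `${k}=${v}`)
        .join(', ')}]`
    );
    this.name = 'OcpOperationError';
  }
}
```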
- Unit-test conversions and param validation
- Prefer deterministic fixtures and golden samples for OCF objects
- All OCF fixtures are validated against the official OCF JSON schemas (see `test/utils/ocfSchemaValidator.ts`)
- The OCF schemas are maintained as a git submodule at `libs/Open-Cap-Format-OCF/`
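A minimal sketch of a deterministic round-trip check for conversions. The `ocfToDaml`/`damlToOcf` helpers below are simplified stand-ins for the real conversions in `utils/typeConversions.ts`:

```typescript
interface IssuerFixture {
  object_type: 'ISSUER';
  legal_name: string;
}

// Toy conversion pair used only to illustrate the testing pattern.
function ocfToDaml(ocf: IssuerFixture): { legalName: string } {
  return { legalName: ocf.legal_name };
}

function damlToOcf(daml: { legalName: string }): IssuerFixture {
  return { object_type: 'ISSUER', legal_name: daml.legalName };
}

// Golden-sample style assertion: converting there and back must be lossless.
function assertRoundTrip(fixture: IssuerFixture): void {
  const back = damlToOcf(ocfToDaml(fixture));
  if (JSON.stringify(back) !== JSON.stringify(fixture)) {
    throw new Error(`round-trip changed fixture ${fixture.legal_name}`);
  }
}
```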
- Contributor guidance lives here
- End-user/API docs: Open Cap Table Protocol Canton SDK (https://ocp.canton.fairmint.com/)
- Internal API docs can be generated locally with `npm run docs` into `docs/`
- AI context (single source of truth): `llms.txt` (agent entrypoints: `CLAUDE.md`, `AGENTS.md`, `GEMINI.md`)
- Types are precise; no `any`/`unknown`
- Public APIs annotated; parameters validated
- Errors are actionable; no silent fallbacks
- Functions added to the relevant `index.ts` barrels and `OcpClient`
- Add a `buildXCommand` variant where batching is expected
This SDK includes automated validation of all OCF data against the official Open Cap Format JSON schemas.
Setup:

```bash
# Initialize/update the OCF schema submodule
git submodule update --init --recursive
```

Usage in tests:

```typescript
import { validateOcfObject } from './test/utils/ocfSchemaValidator';

// Validate any OCF object
await validateOcfObject({
  object_type: 'ISSUER',
  id: '...',
  legal_name: 'Example Inc.',
  // ... other fields
});
```

The validator automatically validates both:

- Input fixtures (`fixture.db`): data being sent to Canton
- Output results: data returned from `get*AsOcf` methods
All 78+ test fixtures pass validation for both input and output data.
MIT