diff --git a/README.md b/README.md
index ba3148f..b372e24 100644
--- a/README.md
+++ b/README.md
@@ -155,6 +155,8 @@ binding = H(SLSS_pk || TDD_pk || EGRW_pk)
 
 All three shares are required to recover the secret, and the binding ensures the three problem instances are cryptographically linked.
 
+> ⚠️ Implementation note: the library prefers native SHAKE256 (XOF) support. If the runtime lacks native SHAKE256, kMOSAIC falls back to a counter-mode SHA3-256-based construction that may not provide the same security margins as a native XOF. For production deployments, ensure your runtime supports SHAKE256 or use an environment that provides it.
+
 ### Hard Problems
 
 #### SLSS (Sparse Lattice Subset Sum)
diff --git a/SECURITY_REPORT.md b/SECURITY_REPORT.md
index a06e63a..9ec349c 100644
--- a/SECURITY_REPORT.md
+++ b/SECURITY_REPORT.md
@@ -181,6 +181,27 @@ while (idx < n) {
 
 This eliminates statistical bias by rejecting values that would cause modular reduction bias.
 
+### VULN-014: Decapsulation throws on malformed ciphertext (implicit oracle)
+
+**File:** `src/kem/index.ts`
+**Lines:** 360-420 (approx)
+**Status:** ✅ **FIXED**
+
+#### Description
+
+Certain malformed or corrupted ciphertexts (for example, a truncated NIZK proof or malformed fragment lengths) could cause `decapsulate()` to throw exceptions or exhibit distinguishable behavior. An attacker could use this as a decryption oracle to learn about ciphertext validity.
+
+#### Fix Applied
+
+- Compute the **implicit rejection value** early from the raw ciphertext bytes and use it as the default return value on any validation failure.
+- Wrap critical parsing and verification steps in try/catch blocks: serialization, component decryption (SLSS/TDD/EGRW), NIZK deserialization and verification, and re-encapsulation. Any failure marks decapsulation as invalid but does not throw.
+- Normalize share lengths (expect 32-byte shares) and use zeroed fallbacks to avoid reconstruction exceptions.
+- Replace direct ciphertext byte comparison with fixed-length SHA3-256 hash comparisons to avoid leaks from variable-length ciphertexts.
+- Add a public key consistency check: `sha3_256(serializePublicKey(publicKey)) === secretKey.publicKeyHash`; treat mismatches as invalid decapsulation.
+- Add unit tests exercising tampering and malformed inputs: `test/kem-malformed.test.ts`.
+
+These changes ensure `decapsulate()` always returns a 32-byte pseudorandom secret (implicit reject) on invalid input, preventing oracle-style leakage.
+
 ---
 
 ### VULN-005: Potential Integer Precision Issues
@@ -257,17 +278,21 @@ JavaScript's garbage collector may copy buffer contents during compaction. The `
 
 **File:** `src/utils/shake.ts`
 **Lines:** 82-100
-**Status:** 🟡 ACKNOWLEDGED
+**Status:** ✅ MITIGATED
 
 #### Description
 
 The counter-mode SHA3-256 fallback is not a proven XOF construction. While unlikely to be used on Node.js/Bun, security properties are unverified.
 
-#### Mitigation
+#### Mitigation / Fix Applied
 
-- Native SHAKE256 is available in all target environments (Node.js 18+, Bun)
-- Fallback only triggers in edge cases
-- Consider adding warning log when fallback is used
+- Added an `isNativeShake256Available()` helper so application code can detect and enforce native SHAKE256 availability.
+- Added an explicit README note advising production deployments to use native SHAKE256 or a runtime that supports it.
+- The fallback remains for compatibility; the mitigations above reduce the risk and make fallback use visible to operators.
+
+#### Recommendation
+
+For highest assurance, consider adding a configuration flag that causes startup to fail when native SHAKE256 is unavailable.
 ---
 
@@ -370,6 +395,7 @@ Generator cache creates timing differences between cache hits and misses, potent
 | VULN-001 | TDD plaintext storage | ✅ FIXED | XOR encryption with masked-matrix keystream |
 | VULN-002 | EGRW randomness leak | ✅ FIXED | Ephemeral walk vertex derivation |
 | VULN-004 | Modular bias | ✅ FIXED | Rejection sampling in TDD |
+| VULN-014 | Decapsulation oracle | ✅ FIXED | Safe parsing, implicit-reject, hash-compare |
 
 ### Acknowledged Limitations
 
@@ -397,9 +423,16 @@ Generator cache creates timing differences between cache hits and misses, potent
 
 The kMOSAIC implementation has been assessed and critical security issues have been remediated:
 
-1. **VULN-001 (TDD Plaintext):** Now uses XOR encryption with keystream derived from the masked tensor matrix
-2. **VULN-002 (EGRW Randomness):** Randomness no longer exposed; ephemeral walk vertex used instead
-3. **VULN-004 (Modular Bias):** Rejection sampling now ensures uniform distribution
+1. **VULN-001 (TDD plaintext):** Now uses XOR encryption with a keystream derived from the masked tensor matrix
+2. **VULN-002 (EGRW randomness exposure):** Now derives ciphertext endpoints from ephemeral walks and does not expose randomness
+3. **VULN-004 (Modular bias):** Rejection sampling implemented in TDD sampling
+4. **VULN-014 (Decapsulation oracle):** Decapsulation hardened to return implicit-reject values on malformed or tampered ciphertexts; added unit tests to verify behavior
+
+Additional improvements:
+
+- Added `isNativeShake256Available()` and README guidance to make SHAKE256 availability explicit for production deployments.
+- Added robust unit tests for malformed/corrupted ciphertext handling: `test/kem-malformed.test.ts` (proof tampering, malformed fragments, truncated ciphertexts, public key mismatch).
+
+Overall, the most critical issues have been remediated, and the codebase now includes tests that guard against oracle leakage from malformed ciphertexts.
+Continuous monitoring and peer review are recommended for the remaining acknowledged limitations (timing, zeroization limits, and JS runtime concerns).
 
 The remaining acknowledged items are primarily JavaScript runtime limitations that are well-documented in the code and do not constitute exploitable vulnerabilities in typical deployment scenarios.
diff --git a/src/kem/index.ts b/src/kem/index.ts
index 05777d0..ddda1bb 100644
--- a/src/kem/index.ts
+++ b/src/kem/index.ts
@@ -365,7 +365,14 @@ export async function decapsulate(
 
   // Compute implicit rejection value first (constant-time protection)
   // This is returned if any validation fails
-  const ciphertextBytes = serializeCiphertext(ciphertext)
+  let ciphertextBytes: Uint8Array
+  try {
+    ciphertextBytes = serializeCiphertext(ciphertext)
+  } catch {
+    // Malformed ciphertext serialization — use empty buffer and mark invalid
+    ciphertextBytes = new Uint8Array(0)
+  }
+
   const implicitRejectSecret = shake256(
     hashWithDomain(DOMAIN_IMPLICIT_REJECT, hashConcat(seed, ciphertextBytes)),
     32,
@@ -373,33 +380,93 @@
   let validDecapsulation = 1 // 1 = valid, 0 = invalid
 
-  // Decrypt each fragment
-  const share1 = slssDecrypt(c1, slssSK, params.slss)
-  const share2 = tddDecrypt(c2, tddSK, params.tdd)
-  const share3 = egrwDecrypt(c3, egrwSK, egrwPK, params.egrw)
+  // Quick sanity: ensure public key matches secret key's recorded hash
+  try {
+    const pkHash = sha3_256(serializePublicKey(publicKey))
+    if (!constantTimeEqual(pkHash, publicKeyHash)) {
+      validDecapsulation = 0
+    }
+  } catch {
+    // If serialization of public key fails, mark invalid but continue
+    validDecapsulation = 0
+  }
 
-  // Reconstruct ephemeral secret
-  const recoveredSecret = secretReconstruct([share1, share2, share3])
+  // Decrypt each fragment with safe failure handling
+  let share1: Uint8Array = new Uint8Array(32)
+  let share2: Uint8Array = new Uint8Array(32)
+  let share3: Uint8Array = new Uint8Array(32)
+
+  try {
+    const s1 = slssDecrypt(c1, slssSK, params.slss)
+    if (s1.length === 32) share1 = s1
+    else {
+      validDecapsulation = 0
+    }
+  } catch {
+    validDecapsulation = 0
+  }
 
-  // Fujisaki-Okamoto re-encryption check
-  // Re-encapsulate with recovered secret and verify ciphertext matches
-  const reEncapsulated = encapsulateDeterministic(publicKey, recoveredSecret)
-  const reEncapsulatedBytes = serializeCiphertext(reEncapsulated.ciphertext)
+  try {
+    const s2 = tddDecrypt(c2, tddSK, params.tdd)
+    if (s2.length === 32) share2 = s2
+    else {
+      validDecapsulation = 0
+    }
+  } catch {
+    validDecapsulation = 0
+  }
 
-  // Constant-time comparison of ciphertexts
-  if (!constantTimeEqual(ciphertextBytes, reEncapsulatedBytes)) {
+  try {
+    const s3 = egrwDecrypt(c3, egrwSK, egrwPK, params.egrw)
+    if (s3.length === 32) share3 = s3
+    else {
+      validDecapsulation = 0
+    }
+  } catch {
     validDecapsulation = 0
   }
 
-  // Verify NIZK proof (additional check)
-  const proof = deserializeNIZKProof(proofBytes)
-  const ciphertextHashes = [
-    sha3_256(serializeSLSSCiphertext(c1)),
-    sha3_256(serializeTDDCiphertext(c2)),
-    sha3_256(serializeEGRWCiphertext(c3)),
-  ]
+  // Reconstruct ephemeral secret (shares are normalized to 32 bytes)
+  let recoveredSecret: Uint8Array
+  try {
+    recoveredSecret = secretReconstruct([share1, share2, share3])
+  } catch {
+    // Reconstruction failure — use zeroed secret and mark invalid
+    recoveredSecret = new Uint8Array(32)
+    validDecapsulation = 0
+  }
 
-  if (!verifyNIZKProof(proof, ciphertextHashes, recoveredSecret)) {
+  // Fujisaki-Okamoto re-encryption check (compare hashes to avoid length leaks)
+  let reEncapsulatedBytes: Uint8Array
+  try {
+    const reEncapsulated = encapsulateDeterministic(publicKey, recoveredSecret)
+    reEncapsulatedBytes = serializeCiphertext(reEncapsulated.ciphertext)
+  } catch {
+    reEncapsulatedBytes = new Uint8Array(0)
+    validDecapsulation = 0
+  }
+
+  // Compare fixed-length hashes (constant-time)
+  const originalCtHash = sha3_256(ciphertextBytes)
+  const reCtHash = sha3_256(reEncapsulatedBytes)
+  if (!constantTimeEqual(originalCtHash, reCtHash)) {
+    validDecapsulation = 0
+  }
+
+  // Verify NIZK proof (additional check)
+  try {
+    const proof = deserializeNIZKProof(proofBytes)
+    const ciphertextHashes = [
+      sha3_256(serializeSLSSCiphertext(c1)),
+      sha3_256(serializeTDDCiphertext(c2)),
+      sha3_256(serializeEGRWCiphertext(c3)),
+    ]
+
+    if (!verifyNIZKProof(proof, ciphertextHashes, recoveredSecret)) {
+      validDecapsulation = 0
+    }
+  } catch {
+    // Any failure in proof parsing or verification marks invalid
     validDecapsulation = 0
   }
@@ -730,17 +797,38 @@ export function serializeCiphertext(ct: MOSAICCiphertext): Uint8Array {
  * @returns Ciphertext object
  */
 export function deserializeCiphertext(data: Uint8Array): MOSAICCiphertext {
+  // Basic bounds checks
+  if (data.length < 4) throw new Error('Invalid ciphertext: too short')
+
   const view = new DataView(data.buffer, data.byteOffset)
   let offset = 0
 
   // c1
+  if (offset + 4 > data.length)
+    throw new Error('Invalid ciphertext: truncated c1 length')
   const c1Len = view.getUint32(offset, true)
   offset += 4
+  const MAX_PART = 8 * 1024 * 1024 // 8 MB per component to prevent resource exhaustion (supports MOS-256 public keys)
+  if (c1Len <= 0 || c1Len > MAX_PART || offset + c1Len > data.length)
+    throw new Error('Invalid ciphertext: c1 extends beyond data or too large')
   const c1Start = offset
   const c1View = new DataView(data.buffer, data.byteOffset + c1Start)
+
+  // Validate SLSS component structure
+  if (c1Len < 8) throw new Error('Invalid SLSS ciphertext: too short')
   const uLen = c1View.getUint32(0, true)
-  const u = new Int32Array(data.buffer, data.byteOffset + c1Start + 4, uLen / 4)
+  if (uLen % 4 !== 0)
+    throw new Error('Invalid SLSS ciphertext: u length not multiple of 4')
+  if (4 + uLen + 4 > c1Len)
+    throw new Error('Invalid SLSS ciphertext: malformed lengths')
+  const vLen = c1View.getUint32(4 + uLen, true)
+  if (4 + uLen + 4 + vLen !== c1Len)
+    throw new Error('Invalid SLSS ciphertext: length mismatch')
+  if (vLen % 4 !== 0)
+    throw new Error('Invalid SLSS ciphertext: v length not multiple of 4')
+
+  const u = new Int32Array(data.buffer, data.byteOffset + c1Start + 4, uLen / 4)
   const v = new Int32Array(
     data.buffer,
     data.byteOffset + c1Start + 8 + uLen,
@@ -749,13 +837,19 @@ export function deserializeCiphertext(data: Uint8Array): MOSAICCiphertext {
   offset += c1Len
 
   // c2
+  if (offset + 4 > data.length)
+    throw new Error('Invalid ciphertext: truncated c2 length')
   const c2Len = view.getUint32(offset, true)
   offset += 4
+  if (c2Len <= 0 || c2Len > MAX_PART || offset + c2Len > data.length)
+    throw new Error('Invalid ciphertext: c2 extends beyond data or too large')
   const c2Start = offset
-  const c2DataLen = new DataView(
-    data.buffer,
-    data.byteOffset + c2Start,
-  ).getUint32(0, true)
+  const c2View = new DataView(data.buffer, data.byteOffset + c2Start)
+  const c2DataLen = c2View.getUint32(0, true)
+  if (4 + c2DataLen !== c2Len)
+    throw new Error('Invalid TDD ciphertext: length mismatch')
+  if (c2DataLen % 4 !== 0)
+    throw new Error('Invalid TDD ciphertext: data length not multiple of 4')
   const tddData = new Int32Array(
     data.buffer,
     data.byteOffset + c2Start + 4,
@@ -764,8 +858,12 @@
   offset += c2Len
 
   // c3
+  if (offset + 4 > data.length)
+    throw new Error('Invalid ciphertext: truncated c3 length')
   const c3Len = view.getUint32(offset, true)
   offset += 4
+  if (c3Len <= 16 || c3Len > MAX_PART || offset + c3Len > data.length)
+    throw new Error('Invalid EGRW ciphertext: malformed c3 or too large')
   const c3Start = offset
   const vertexView = new DataView(data.buffer, data.byteOffset + c3Start)
   const vertex = {
@@ -851,38 +949,60 @@ export function serializePublicKey(pk: MOSAICPublicKey): Uint8Array {
  * Format: [level_len:4][level_string][slss_len:4][slss_data][tdd_len:4][tdd_data][egrw_len:4][egrw_data][binding:32]
  */
 export function deserializePublicKey(data: Uint8Array): MOSAICPublicKey {
+  // Basic bounds check
+  if (data.length < 4) throw new Error('Invalid public key: too short')
+
   const view = new DataView(data.buffer, data.byteOffset)
   let offset = 0
 
   // Read security level string
+  if (offset + 4 > data.length)
+    throw new Error('Invalid public key: truncated level length')
   const levelLen = view.getUint32(offset, true)
   offset += 4
+  if (levelLen <= 0 || offset + levelLen > data.length || levelLen > 255)
+    throw new Error('Invalid public key: level length invalid')
+
   const levelBytes = data.slice(offset, offset + levelLen)
   const level = new TextDecoder().decode(levelBytes) as SecurityLevel
   offset += levelLen
 
-  // Get params from level
+  // Get params from level (may throw if level unknown)
   const params = getParams(level)
 
   // Read SLSS public key
+  if (offset + 4 > data.length)
+    throw new Error('Invalid public key: truncated SLSS length')
   const slssLen = view.getUint32(offset, true)
   offset += 4
+  if (slssLen <= 0 || offset + slssLen > data.length)
+    throw new Error('Invalid public key: SLSS component out of bounds')
   const slss = slssDeserializePublicKey(data.slice(offset, offset + slssLen))
   offset += slssLen
 
   // Read TDD public key
+  if (offset + 4 > data.length)
+    throw new Error('Invalid public key: truncated TDD length')
   const tddLen = view.getUint32(offset, true)
   offset += 4
+  if (tddLen <= 0 || offset + tddLen > data.length)
+    throw new Error('Invalid public key: TDD component out of bounds')
   const tdd = tddDeserializePublicKey(data.slice(offset, offset + tddLen))
   offset += tddLen
 
   // Read EGRW public key
+  if (offset + 4 > data.length)
+    throw new Error('Invalid public key: truncated EGRW length')
   const egrwLen = view.getUint32(offset, true)
   offset += 4
+  if (egrwLen <= 0 || offset + egrwLen > data.length)
+    throw new Error('Invalid public key: EGRW component out of bounds')
   const egrw = egrwDeserializePublicKey(data.slice(offset, offset + egrwLen))
   offset += egrwLen
 
   // Read binding (fixed 32 bytes)
+  if (offset + 32 > data.length)
+    throw new Error('Invalid public key: missing binding')
   const binding = data.slice(offset, offset + 32)
 
   return { slss, tdd, egrw, binding, params }
diff --git a/src/problems/egrw/index.ts b/src/problems/egrw/index.ts
index 332f007..c86d367 100644
--- a/src/problems/egrw/index.ts
+++ b/src/problems/egrw/index.ts
@@ -486,6 +486,8 @@ export function egrwSerializePublicKey(pk: EGRWPublicKey): Uint8Array {
  * @returns Public key
  */
 export function egrwDeserializePublicKey(data: Uint8Array): EGRWPublicKey {
+  if (data.length < 32)
+    throw new Error('Invalid EGRW public key: expected 32 bytes')
   const vStart = bytesToSl2(data.slice(0, 16))
   const vEnd = bytesToSl2(data.slice(16, 32))
   return { vStart, vEnd }
diff --git a/src/problems/slss/index.ts b/src/problems/slss/index.ts
index eee5f10..17665b4 100644
--- a/src/problems/slss/index.ts
+++ b/src/problems/slss/index.ts
@@ -673,18 +673,36 @@ export function slssSerializePublicKey(pk: SLSSPublicKey): Uint8Array {
  * @returns Public key
  */
 export function slssDeserializePublicKey(data: Uint8Array): SLSSPublicKey {
+  if (data.length < 8) throw new Error('Invalid SLSS public key: too short')
   const view = new DataView(data.buffer, data.byteOffset)
   let offset = 0
 
   const aLen = view.getUint32(offset, true)
   offset += 4
+  const MAX_PART = 8 * 1024 * 1024 // 8 MB cap for public key component
+  if (aLen <= 0 || aLen > MAX_PART || offset + aLen > data.length)
+    throw new Error(
+      'Invalid SLSS public key: A component out of bounds or too large',
+    )
+  if (aLen % 4 !== 0)
+    throw new Error('Invalid SLSS public key: A length not multiple of 4')
+
   // Copy to a new buffer to ensure proper ownership and alignment
   const aBytes = data.slice(offset, offset + aLen)
   const A = new Int32Array(aBytes.buffer, aBytes.byteOffset, aLen / 4)
   offset += aLen
 
+  if (offset + 4 > data.length)
+    throw new Error('Invalid SLSS public key: truncated t length')
   const tLen = view.getUint32(offset, true)
   offset += 4
+  if (tLen <= 0 || tLen > MAX_PART || offset + tLen > data.length)
+    throw new Error(
+      'Invalid SLSS public key: t component out of bounds or too large',
+    )
+  if (tLen % 4 !== 0)
+    throw new Error('Invalid SLSS public key: t length not multiple of 4')
+
   // Copy to a new buffer to ensure proper ownership and alignment
   const tBytes = data.slice(offset, offset + tLen)
   const t = new Int32Array(tBytes.buffer, tBytes.byteOffset, tLen / 4)
diff --git a/src/problems/tdd/index.ts b/src/problems/tdd/index.ts
index cece190..6aff457 100644
--- a/src/problems/tdd/index.ts
+++ b/src/problems/tdd/index.ts
@@ -613,8 +613,14 @@ export function tddSerializePublicKey(pk: TDDPublicKey): Uint8Array {
  * @returns Public key
  */
 export function tddDeserializePublicKey(data: Uint8Array): TDDPublicKey {
+  if (data.length < 4) throw new Error('Invalid TDD public key: too short')
   const view = new DataView(data.buffer, data.byteOffset)
   const len = view.getUint32(0, true)
+  const MAX_PART = 8 * 1024 * 1024 // 8 MB cap for public key component
+  if (len <= 0 || len > MAX_PART || 4 + len > data.length)
+    throw new Error('Invalid TDD public key: length out of bounds or too large')
+  if (len % 4 !== 0)
+    throw new Error('Invalid TDD public key: length not multiple of 4')
   // Copy to a new buffer to ensure proper ownership and alignment
   const tBytes = data.slice(4, 4 + len)
   const T = new Int32Array(tBytes.buffer, tBytes.byteOffset, len / 4)
diff --git a/src/sign/index.ts b/src/sign/index.ts
index 65d101e..95782bd 100644
--- a/src/sign/index.ts
+++ b/src/sign/index.ts
@@ -346,24 +346,35 @@ export function serializeSignature(sig: MOSAICSignature): Uint8Array {
  * @returns Deserialized signature object
  */
 export function deserializeSignature(data: Uint8Array): MOSAICSignature {
+  if (data.length < 12) throw new Error('Invalid signature: too short')
   const view = new DataView(data.buffer, data.byteOffset)
   let offset = 0
 
   // Commitment
   const commitmentLen = view.getUint32(offset, true)
   offset += 4
+  if (commitmentLen <= 0 || offset + commitmentLen > data.length)
+    throw new Error('Invalid signature: malformed commitment')
   const commitment = data.slice(offset, offset + commitmentLen)
   offset += commitmentLen
 
   // Challenge
+  if (offset + 4 > data.length)
+    throw new Error('Invalid signature: truncated challenge length')
   const challengeLen = view.getUint32(offset, true)
   offset += 4
+  if (challengeLen <= 0 || offset + challengeLen > data.length)
+    throw new Error('Invalid signature: malformed challenge')
   const challenge = data.slice(offset, offset + challengeLen)
   offset += challengeLen
 
   // Response
+  if (offset + 4 > data.length)
+    throw new Error('Invalid signature: truncated response length')
   const responseLen = view.getUint32(offset, true)
   offset += 4
+  if (responseLen <= 0 || offset + responseLen > data.length)
+    throw new Error('Invalid signature: malformed response')
   const response = data.slice(offset, offset + responseLen)
 
   return { commitment, challenge, response }
diff --git a/src/utils/shake.ts b/src/utils/shake.ts
index 66c1c96..39af1d9 100644
--- a/src/utils/shake.ts
+++ b/src/utils/shake.ts
@@ -252,3 +252,13 @@ export function hashWithDomain(domain: string, input: Uint8Array): Uint8Array {
 
   return result
 }
+
+/**
+ * Query whether native SHAKE256 is available in this runtime.
+ * Useful for application code that wants to enforce native-XOF availability
+ * (recommended for production deployments). Returns true if native SHAKE256
+ * is available, false otherwise.
+ */
+export function isNativeShake256Available(): boolean {
+  return checkNativeShake256()
+}
diff --git a/test/kem-malformed.test.ts b/test/kem-malformed.test.ts
new file mode 100644
index 0000000..195c5a2
--- /dev/null
+++ b/test/kem-malformed.test.ts
@@ -0,0 +1,87 @@
+import { describe, test, expect } from 'bun:test'
+import { encapsulate, decapsulate, encrypt, decrypt } from '../src/kem/index.ts'
+import { kemGenerateKeyPair } from '../src/index.ts'
+
+describe('KEM malformed/corrupted ciphertext handling', () => {
+  test('decapsulate returns 32-byte implicit reject on proof tampering', async () => {
+    const { publicKey, secretKey } = await kemGenerateKeyPair()
+    const { ciphertext, sharedSecret } = await encapsulate(publicKey)
+
+    // Tamper proof (set to zeros)
+    const corrupted = {
+      ...ciphertext,
+      proof: new Uint8Array(ciphertext.proof.length),
+    }
+
+    const recovered = await decapsulate(corrupted as any, secretKey, publicKey)
+    expect(recovered).toBeInstanceOf(Uint8Array)
+    expect(recovered.length).toBe(32)
+    // Should not equal original shared secret
+    let equal = true
+    if (recovered.length === sharedSecret.length) {
+      for (let i = 0; i < 32; i++)
+        if (recovered[i] !== sharedSecret[i]) {
+          equal = false
+          break
+        }
+    } else equal = false
+    expect(equal).toBe(false)
+  })
+
+  test('decapsulate returns implicit reject on malformed fragment lengths', async () => {
+    const { publicKey, secretKey } = await kemGenerateKeyPair()
+    const { ciphertext, sharedSecret } = await encapsulate(publicKey)
+
+    // Corrupt SLSS fragment by making u very short
+    const bad = JSON.parse(JSON.stringify(ciphertext))
+    bad.c1.u = new Int32Array([0])
+
+    const recovered = await decapsulate(bad as any, secretKey, publicKey)
+    expect(recovered).toBeInstanceOf(Uint8Array)
+    expect(recovered.length).toBe(32)
+    // Should not equal original shared secret
+    let equal = true
+    for (let i = 0; i < 32; i++)
+      if (recovered[i] !== sharedSecret[i]) {
+        equal = false
+        break
+      }
+    expect(equal).toBe(false)
+  })
+
+  test('decrypt fails gracefully on truncated serialized ciphertext', async () => {
+    const { publicKey, secretKey } = await kemGenerateKeyPair()
+
+    const plaintext = new TextEncoder().encode('test message')
+    const encrypted = await encrypt(plaintext, publicKey)
+
+    // Truncate the data so KEM length header doesn't match available data
+    const truncated = encrypted.slice(0, 8)
+
+    await expect(decrypt(truncated, secretKey, publicKey)).rejects.toThrow()
+  })
+
+  test('decapsulate returns implicit reject when public key does not match secretKey', async () => {
+    const { publicKey, secretKey } = await kemGenerateKeyPair()
+    const { ciphertext, sharedSecret } = await encapsulate(publicKey)
+
+    // Mutate public key binding to simulate mismatch
+    const badPk = {
+      ...publicKey,
+      binding: new Uint8Array(publicKey.binding.length),
+    }
+
+    const recovered = await decapsulate(ciphertext, secretKey, badPk as any)
+    expect(recovered).toBeInstanceOf(Uint8Array)
+    expect(recovered.length).toBe(32)
+
+    // Should not equal original shared secret
+    let equal = true
+    for (let i = 0; i < 32; i++)
+      if (recovered[i] !== sharedSecret[i]) {
+        equal = false
+        break
+      }
+    expect(equal).toBe(false)
+  })
+})