Merged
2 changes: 2 additions & 0 deletions .github/scripts/README.md
@@ -352,6 +352,8 @@ bash .github/scripts/nix.sh --variant fips --link dynamic sbom --target server

Updates Nix expected-hash inputs by parsing **GitHub Actions** packaging logs (fixed-output derivation hash mismatches).

This works even if the workflow run is still in progress (it fetches per-job logs directly when needed).

This command is meant to be used after a CI packaging job fails with a message like:

- `specified: sha256-...`
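The `specified:`/`got:` pair in those messages can be scraped with standard tools. A minimal sketch, assuming a POSIX `sed`; the log excerpt and hash values below are made up, not real digests:

```shell
# Hypothetical excerpt of a Nix fixed-output derivation hash-mismatch log.
LOG='error: hash mismatch in fixed-output derivation /nix/store/xxx-vendor.drv:
         specified: sha256-OLDold111=
            got:    sha256-NEWnew222='

# Pull out the expected ("specified") and actual ("got") hashes.
specified=$(printf '%s\n' "$LOG" | sed -n 's|.*specified:[[:space:]]*\(sha256-[A-Za-z0-9+/=]*\).*|\1|p')
got=$(printf '%s\n' "$LOG" | sed -n 's|.*got:[[:space:]]*\(sha256-[A-Za-z0-9+/=]*\).*|\1|p')

echo "replace $specified with $got"
```

In the real script the `got:` value replaces the stale expected hash in the corresponding Nix expression; this sketch only prints the substitution it would make.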
2 changes: 1 addition & 1 deletion .github/scripts/release.sh
@@ -62,4 +62,4 @@ git cliff -w "$PWD" -u -p CHANGELOG.md -t "$NEW_VERSION"
${SED_BINARY} "${SED_IN_PLACE[@]}" 's/(#\([0-9]\+\))/([#\1](https:\/\/github.com\/Cosmian\/kms\/pull\/\1))/g' CHANGELOG.md

bash .github/scripts/build_ui.sh
bash .github/scripts/nix.sh update-hashes
bash .github/scripts/nix.sh sbom
37 changes: 30 additions & 7 deletions .github/scripts/update_hashes.sh
@@ -114,17 +114,44 @@ else
# Get all failed jobs from this run (id + name).
# We rely on the job name (when available) to infer platform/linkage for server vendor hashes.
FAILED_JOBS=$(gh api "repos/Cosmian/kms/actions/runs/$RUN_ID/jobs" \
--jq '.jobs[] | select(.conclusion == "failure") | [.id, .name] | @tsv' 2>/dev/null || echo "")
--jq '.jobs[]
| select((.conclusion == "failure") or (.status == "in_progress"))
| [.id, .name] | @tsv' 2>/dev/null || echo "")

if [ -z "$FAILED_JOBS" ]; then
echo "No failed jobs found in run $RUN_ID. Nothing to update."
echo "No failed or in-progress jobs found in run $RUN_ID. Nothing to update."
exit 0
fi
fi

# Declare associative array to store hash updates
declare -A FILE_TO_HASH

stream_job_logs() {
local run_id="$1"
local job_id="$2"
local tmp
tmp=$(mktemp -t gha-job-log.XXXXXX)

# Prefer `gh run view` (nice formatting and smaller for failed steps),
# but it may refuse logs while the overall run is still in progress.
if gh run view "$run_id" --log-failed --job "$job_id" >"$tmp" 2>/dev/null; then
cat "$tmp"
rm -f "$tmp"
return 0
fi

if gh run view "$run_id" --log --job "$job_id" >"$tmp" 2>/dev/null; then
cat "$tmp"
rm -f "$tmp"
return 0
fi

# Fallback: fetch raw job logs directly (works even if run is still running).
rm -f "$tmp"
gh api "repos/Cosmian/kms/actions/jobs/$job_id/logs" 2>/dev/null || true
}

# Process each failed job
while IFS=$'\t' read -r JOB_ID JOB_NAME; do
[ -z "${JOB_ID:-}" ] && continue
@@ -142,10 +169,6 @@ while IFS=$'\t' read -r JOB_ID JOB_NAME; do
# If a specific job was requested and it didn't fail, fall back to the full job log.
# Output format is typically: "<STEP NAME> | <log line>".
last_drv_name=""
log_cmd=(gh run view "$RUN_ID" --log-failed --job "$JOB_ID")
if ! "${log_cmd[@]}" >/dev/null 2>&1; then
log_cmd=(gh run view "$RUN_ID" --log --job "$JOB_ID")
fi

while IFS= read -r raw_line; do
line="$raw_line"
@@ -221,7 +244,7 @@ while IFS=$'\t' read -r JOB_ID JOB_NAME; do
last_drv_name=""
fi
fi
done < <("${log_cmd[@]}" 2>/dev/null || true)
done < <(stream_job_logs "$RUN_ID" "$JOB_ID")
done <<<"$FAILED_JOBS"

# Apply updates
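The per-job loop in this script consumes `FAILED_JOBS` as tab-separated `id<TAB>name` rows. A self-contained sketch of that `IFS=$'\t' read` parsing pattern (the job ids and names here are invented):

```shell
# Made-up rows in the same "id<TAB>name" TSV shape as FAILED_JOBS.
JOBS=$(printf '1234\tpackaging (ubuntu, static)\n5678\tpackaging (macos, dynamic)\n')

count=0
while IFS=$'\t' read -r JOB_ID JOB_NAME; do
  # Skip blank rows, as the real loop does.
  [ -z "${JOB_ID:-}" ] && continue
  count=$((count + 1))
  echo "job $JOB_ID: $JOB_NAME"
done <<<"$JOBS"

echo "processed $count jobs"
```

Setting `IFS` only for `read` keeps the split on tabs local to the loop: the first field lands in `JOB_ID` and the remainder (spaces included) in `JOB_NAME`, so job names containing spaces survive intact.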
2 changes: 1 addition & 1 deletion .github/scripts/windows_ui.ps1
@@ -57,7 +57,7 @@ function Build-UI {
{
"name": "cosmian_kms_client_wasm",
"type": "module",
"version": "5.15.0",
"version": "5.16.0",
"main": "cosmian_kms_client_wasm.js",
"types": "cosmian_kms_client_wasm.d.ts"
}
9 changes: 8 additions & 1 deletion CHANGELOG.md
@@ -27,27 +27,34 @@ All notable changes to this project will be documented in this file.

### 🐛 Bug Fixes

- Fix SQL Locate request for OpenTelemetry metrics collector:
- Fix SQL Locate request for OpenTelemetry metrics collector (#694):
- Refactored SQL Locate query building in locate_query.rs to use bound, typed parameters (LocateQuery + LocateParam) instead of interpolating values into SQL (safer, and fixes type/cast handling across SQLite/Postgres/MySQL).
- Updated the SQL backends to consume the new LocateQuery API: crate/server_database/src/stores/sql/{mysql,pgsql,sqlite}.rs.
- Improved DB test error context in json_access_test.rs to make failures easier to diagnose.
- OpenTelemetry wiring updates:
- mod.rs: add OTEL resource attributes (service name/version + optional environment).
- otel_metrics.rs: ensure active_keys_count time series exists even when 0.
- cron.rs: fall back to default username if hsm_admin is empty.
- Fix regression on KMIP 1.0 (Fresh and InitialDate attributes) (#689)
- Fix Linux packaging smoke tests when the host has `/etc/cosmian/kms.toml` present by running with an explicit temp config.
- Make OpenTelemetry export tests resilient under FIPS Nix shells by running `curl` in a clean environment (avoid inherited OpenSSL/LD overrides).
- *(ui)* Azure BYOK export (#697)

### ⚙️ Build

- Nix builds now target GLIBC ≤ 2.34 (Rocky Linux 9 compatibility) by updating pins and building Linux OpenSSL/server outputs against a glibc 2.34 stdenv; server vendor hash expectations are split by static/dynamic on Linux.
- SBOM generation improvements:
- `.github/scripts/nix.sh sbom` strictly validates `--target/--variant/--link`, defaults to generating all combinations, and supports generating a specific server subset.
- SBOM tooling runs in an isolated workdir to avoid stray repo-root artifacts, keeps only final `sbom.csv` + `vulns.csv` reports per output directory, and deduplicates CVE rows in-place (via `nix/scripts/dedup_cves.py`, with optional filtering helper `nix/scripts/filter_vulns.py`).
- *(deps)* Bump jsonwebtoken in the cargo group across 1 directory (#702)
- *(deps)* Bump bytes in the cargo group across 1 directory (#703)
- *(deps)* Bump time in the cargo group across 1 directory (#706)
- *(deps)* Bump actix-files in the cargo group across 1 directory (#707)

### 📚 Documentation

- Update SBOM documentation to match the generator output layout and behavior.
- Update OpenSSL versions (#713)

## [5.15.0] - 2026-01-21

36 changes: 18 additions & 18 deletions Cargo.lock


2 changes: 1 addition & 1 deletion Cargo.toml
@@ -85,7 +85,7 @@ map_err_ignore = "deny"
redundant_clone = "deny"

[workspace.package]
version = "5.15.0"
version = "5.16.0"
edition = "2024"
rust-version = "1.87.0"
authors = [