
Conversation

@lcawl
Contributor

@lcawl lcawl commented Dec 11, 2025

Summary

  • Added a docs-builder changelog bundle command that collects one or more changelogs into a single YAML file (to align with a product release).
  • Added a docs-builder changelog render command that generates Docs V3-friendly markdown files from one or more changelog bundles.

Impetus

The goal is to have a stop-gap way to:

  1. create files like https://github.com/elastic/elastic-agent/blob/main/changelog/9.2.2.yaml and https://github.com/elastic/elasticsearch/blob/main/docs/release-notes/changelog-bundles/9.2.2.yml (which were created by the elastic-agent-changelog-tool build and gradlew bundleChangelogs commands, respectively).
  2. create files like https://github.com/elastic/elastic-agent/tree/main/docs/release-notes and https://github.com/elastic/elasticsearch/tree/main/docs/release-notes (which were created or updated by the elastic-agent-changelog-tool render and gradlew generateReleaseNotes commands, respectively).

Behaviour

The bundle command can build the bundle from (a) all changelogs in a folder, (b) changelogs that have specific product and target values, or (c) changelogs associated with specific PRs. Only (a) existed previously. The long-term goal is to generate these manifests from the list of PRs associated with a GitHub release or deployment event (and then optionally add known issues and security issues and remove feature-flagged changelogs as desired).
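
For context, each input changelog is a small YAML file. A minimal entry might look roughly like the sketch below; the field names (title, type, products, areas, pr) are taken from the resolved bundle example further down, but the authoritative schema is defined by docs-builder, so treat this as illustrative only:

# Hypothetical minimal changelog entry (field names assumed from the bundle examples in this description)
title: Fix ML calendar event update scalability issues
type: bug-fix
products:
- product: elasticsearch
  target: 9.2.2
areas:
- Machine Learning
pr: https://github.com/elastic/elasticsearch/pull/136886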

Examples

An example of the use of both the bundle and render commands can be found in https://github.com/elastic/cloud/pull/150210.

Bundle

Bundle a list of PRs

You can use the --prs option (with the --repo and --owner options if you provide only the PR numbers) to create a bundle of the changelogs that relate to those pull requests. For example:

./docs-builder changelog bundle --prs 108875,135873,136886 --repo elasticsearch --owner elastic

Bundle by PRs listed in a file

The --prs option also supports the use of a file that lists the PRs.
For example, if you have a file with the following PR URLs:

https://github.com/elastic/elasticsearch/pull/108875
https://github.com/elastic/elasticsearch/pull/135873
https://github.com/elastic/elasticsearch/pull/136886
https://github.com/elastic/elasticsearch/pull/137126

Run the bundle command to reference this file and explicitly set the bundle's product metadata:

./docs-builder changelog bundle --prs test/9.2.2a.txt --output-products "elasticsearch 9.2.2"

Alternatively, if the file contains just a list of PR numbers, you must specify the --repo and --owner options:

./docs-builder changelog bundle --prs test/9.2.2b.txt --output-products "elasticsearch 9.2.2" \
  --repo elasticsearch --owner elastic
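
For reference, a PR-numbers file of this kind would contain only the numbers, one per line. The actual contents of test/9.2.2b.txt are not shown here; the following simply reuses the PR numbers from the URL list above:

108875
135873
136886
137126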

Both variations create a bundle like this:

products:
- product: elasticsearch
  target: 9.2.2
entries:
- file:
    name: 1765507819-fix-ml-calendar-event-update-scalability-issues.yaml
    checksum: 069b59edb14594e0bc3b70365e81626bde730ab7
- file:
    name: 1765507798-convert-bytestransportresponse-when-proxying-respo.yaml
    checksum: c6dbd4730bf34dbbc877c16c042e6578dd108b62
- file:
    name: 1765507839-use-ivf_pq-for-gpu-index-build-for-large-datasets.yaml
    checksum: 451d60283fe5df426f023e824339f82c2900311e
- file:
    name: 1765507778-break-on-fielddata-when-building-global-ordinals.yaml
    checksum: 70d197d96752c05b6595edffe6fe3ba3d055c845

In this example, none of the changelogs had target or lifecycle values in their products, so the only version information in this bundle comes from the --output-products option.

Bundle by product and target

If you specify the --input-products option, the bundle includes only the changelogs that have one or more of the specified values:

./docs-builder changelog bundle --input-products "cloud-serverless 2025-12-02 ga, cloud-serverless 2025-12-06 ga"

NOTE: As of #2429 you must always specify "product target lifecycle" (or else a wildcard asterisk).

Even if the changelogs also have other product values, only those specified in the bundle command appear in the output:

products:
- product: cloud-serverless
  target: 2025-12-02
- product: cloud-serverless
  target: 2025-12-06
entries:
- file:
    name: 1765495972-fixes-enrich-and-lookup-join-resolution-based-on-m.yaml
    checksum: 6c3243f56279b1797b5dfff6c02ebf90b9658464
- file:
    name: 1-test.yaml
    checksum: 0229ff4e908a0392af00e0905db94134616e6457

Bundle all changelog files

./docs-builder changelog bundle --directory . --all

NOTE: If your directory contains changelogs that apply to multiple products and/or versions, this can result in a potentially unrealistic bundle. This command option was added to replicate existing behaviour in the elastic-agent-changelog-tool build and gradlew bundleChangelogs commands and will likely be deprecated.

products:
- product: cloud-serverless
  target: 2025-12-02
- product: elasticsearch
  target: 9.2.3
- product: elasticsearch
  target: 9.3.0
- product: kibana
entries:
- file:
    name: 1765319409-fixes-enrich-and-lookup-join-resolution-based-on-m.yaml
    checksum: a01d40dc3673d681452373e5b78d1f01da609ff7
- file:
    name: 1765415340-[es|ql]-take-top_snippets-out-of-snapshot.yaml
    checksum: 4be2d3a14154b432f3a1d83ebfbd5568c69cbd1d
...

Copy the changelogs into the bundle

To include the contents of the changelogs, use the --resolve option:

./docs-builder changelog bundle --prs 108875,135873,136886 --repo elasticsearch --owner elastic --output-products "elasticsearch 9.2.2" --resolve

This generates output similar to the existing Elastic Agent bundles (for example, https://github.com/elastic/elastic-agent/blob/main/changelog/9.2.2.yaml):

products:
- product: elasticsearch
  target: 9.2.2
entries:
- file:
    name: 1765507819-fix-ml-calendar-event-update-scalability-issues.yaml
    checksum: 069b59edb14594e0bc3b70365e81626bde730ab7
  type: bug-fix
  title: Fix ML calendar event update scalability issues
  products:
  - product: elasticsearch
  areas:
  - Machine Learning
  pr: https://github.com/elastic/elasticsearch/pull/136886
...

The command is ready to use. Build succeeds and the help text displays correctly.

Render

The render command turns one or more bundles into markdown release-note files. For example, if you have a bundle like this:

products:
- product: elasticsearch
  target: 9.2.2
entries:
- file:
    name: 1765581721-convert-bytestransportresponse-when-proxying-respo.yaml
    checksum: d7e74edff1bdd3e23ba4f2f88b92cf61cc7d490a
- file:
    name: 1765581721-fix-ml-calendar-event-update-scalability-issues.yaml
    checksum: dfafce50c9fd61c3d8db286398f9553e67737f07
- file:
    name: 1765581651-break-on-fielddata-when-building-global-ordinals.yaml
    checksum: 704b25348d6daff396259216201053334b5b3c1d

You can render it to markdown files as follows:

./docs-builder changelog render \
  --input "changelog-bundle.1.yaml" \
  --title 9.2.2 --output ./release-notes

The command merges all bundles, resolves file references using each bundle's directory, uses the appropriate repo for PR/issue links, and renders markdown files with the specified title.

For example, it generates a 9.2.2 folder in the specified output directory and creates breaking-changes.md, deprecations.md, and index.md files. The index.md file in this case contains:

## 9.2.2 [elastic-release-notes-9.2.2]

### Fixes [elastic-9.2.2-fixes]
* Convert BytesTransportResponse when proxying response from/to local node. [#135873](https://github.com/elastic/elastic/pull/135873) 
* Fix ML calendar event update scalability issues. [#136886](https://github.com/elastic/elastic/pull/136886) 
* Break on FieldData when building global ordinals. [#108875](https://github.com/elastic/elastic/pull/108875) 

If you add the --subsections option to the command, the output changes as follows:

## 9.2.2 [elastic-release-notes-9.2.2]

### Fixes [elastic-9.2.2-fixes]

**Network**
* Convert BytesTransportResponse when proxying response from/to local node. [#135873](https://github.com/elastic/elastic/pull/135873) 

**Machine Learning**
* Fix ML calendar event update scalability issues. [#136886](https://github.com/elastic/elastic/pull/136886) 

**Aggregations**
* Break on FieldData when building global ordinals. [#108875](https://github.com/elastic/elastic/pull/108875) 

There is a --hide-features option to comment out changelogs that match specific feature-id values, per #2412.

There is also a --hide-private-links option to comment out the PR and issue URLs for cases where we're working in private repos, per #2408.

Finally, the command also supports a --config option and honours a render_blockers definition when we want to comment out changelogs that match specific area or type values, per #2426.
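
The exact configuration syntax lives with #2426 and the docs-builder docs; purely as an illustration of the idea (not the actual schema), a render_blockers definition that blocks rendering by area or type might look something like:

# Hypothetical shape only -- the real schema is defined by docs-builder (#2426)
render_blockers:
- area: Machine Learning
- type: deprecation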

These output files can be integrated into existing release-note docs by using file inclusions. For example, see https://github.com/elastic/cloud/pull/150210.

Outstanding items

All known outstanding items have been addressed.

Generative AI disclosure

  1. Did you use a generative AI (GenAI) tool to assist in creating this contribution?
  • Yes
  • No
  2. If you answered "Yes" to the previous question, please specify the tool(s) and model(s) used (e.g., Google Gemini, OpenAI ChatGPT-4, etc.).

Tool(s) and model(s) used: composer-1 agent, claude-4.5-sonnet


@lcawl lcawl changed the title [WIP] Changelog manifest command Add changelog manifest command Dec 12, 2025
@lcawl lcawl marked this pull request as ready for review December 12, 2025 07:43
@lcawl lcawl requested review from a team as code owners December 12, 2025 07:43
@lcawl lcawl requested a review from reakaleek December 12, 2025 07:43
@lcawl lcawl changed the title Add changelog manifest command Add changelog bundle command Dec 12, 2025
@lcawl lcawl marked this pull request as draft December 12, 2025 14:12
@lcawl lcawl marked this pull request as ready for review December 12, 2025 15:00
lcawl and others added 3 commits January 6, 2026 21:17
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
lcawl and others added 2 commits January 6, 2026 21:35
Co-authored-by: Copilot Autofix powered by AI <223894421+github-code-quality[bot]@users.noreply.github.com>
* Improve changelog bundle --output

* Fix changelog render with multiple input

* More fixes

* Fix PR link resolution

* Make hide-private-links bundle-specific
Comment on lines +73 to +78
foreach (var input in inputs)
{
    var bundleInput = Parse(input);
    if (bundleInput != null)
        result.Add(bundleInput);
}
@cotti cotti self-requested a review January 14, 2026 15:10

public async Task<bool> BundleChangelogs(
IDiagnosticsCollector collector,
ChangelogBundleInput input,
Contributor

input is a tad too non-descriptive as a variable coming in as an argument... Can we change it to bundleInput or something else more descriptive?

Comment on lines 509 to 950
public async Task<bool> BundleChangelogs(
IDiagnosticsCollector collector,
ChangelogBundleInput input,
Cancel ctx
)
{
try
{
// Validate input
if (string.IsNullOrWhiteSpace(input.Directory))
{
collector.EmitError(string.Empty, "Directory is required");
return false;
}

if (!_fileSystem.Directory.Exists(input.Directory))
{
collector.EmitError(input.Directory, "Directory does not exist");
return false;
}

// Validate filter options
var filterCount = 0;
if (input.All)
filterCount++;
if (input.InputProducts != null && input.InputProducts.Count > 0)
filterCount++;
if (input.Prs != null && input.Prs.Length > 0)
filterCount++;
if (!string.IsNullOrWhiteSpace(input.PrsFile))
filterCount++;

if (filterCount == 0)
{
collector.EmitError(string.Empty, "At least one filter option must be specified: --all, --input-products, --prs, or --prs-file");
return false;
}

if (filterCount > 1)
{
collector.EmitError(string.Empty, "Only one filter option can be specified at a time: --all, --input-products, --prs, or --prs-file");
return false;
}

// Load PRs from file if specified
var prsToMatch = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
if (!string.IsNullOrWhiteSpace(input.PrsFile))
{
if (!_fileSystem.File.Exists(input.PrsFile))
{
collector.EmitError(input.PrsFile, "PRs file does not exist");
return false;
}

var prsFileContent = await _fileSystem.File.ReadAllTextAsync(input.PrsFile, ctx);
var prsFromFile = prsFileContent
.Split('\n', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries)
.Where(p => !string.IsNullOrWhiteSpace(p))
.ToArray();

if (input.Prs != null && input.Prs.Length > 0)
{
foreach (var pr in input.Prs)
{
_ = prsToMatch.Add(pr);
}
}

foreach (var pr in prsFromFile)
{
_ = prsToMatch.Add(pr);
}
}
else if (input.Prs != null && input.Prs.Length > 0)
{
foreach (var pr in input.Prs)
{
_ = prsToMatch.Add(pr);
}
}

// Build set of product/version combinations to filter by
var productsToMatch = new HashSet<(string product, string version)>();
if (input.InputProducts != null && input.InputProducts.Count > 0)
{
foreach (var product in input.InputProducts)
{
var version = product.Target ?? string.Empty;
_ = productsToMatch.Add((product.Product.ToLowerInvariant(), version));
}
}

// Determine output path to exclude it from input files
var outputPath = input.Output ?? _fileSystem.Path.Combine(input.Directory, "changelog-bundle.yaml");
var outputFileName = _fileSystem.Path.GetFileName(outputPath);

// Read all YAML files from directory (exclude bundle files and output file)
var allYamlFiles = _fileSystem.Directory.GetFiles(input.Directory, "*.yaml", SearchOption.TopDirectoryOnly)
.Concat(_fileSystem.Directory.GetFiles(input.Directory, "*.yml", SearchOption.TopDirectoryOnly))
.ToList();

var yamlFiles = new List<string>();
foreach (var filePath in allYamlFiles)
{
var fileName = _fileSystem.Path.GetFileName(filePath);

// Exclude the output file
if (fileName.Equals(outputFileName, StringComparison.OrdinalIgnoreCase))
continue;

// Check if file is a bundle file by looking for "entries:" key (unique to bundle files)
try
{
var fileContent = await _fileSystem.File.ReadAllTextAsync(filePath, ctx);
// Bundle files have "entries:" at root level, changelog files don't
if (fileContent.Contains("entries:", StringComparison.Ordinal) &&
fileContent.Contains("products:", StringComparison.Ordinal))
{
_logger.LogDebug("Skipping bundle file: {FileName}", fileName);
continue;
}
}
catch (Exception ex) when (ex is not (OutOfMemoryException or StackOverflowException or ThreadAbortException))
{
// If we can't read the file, skip it
_logger.LogWarning(ex, "Failed to read file {FileName} for bundle detection", fileName);
continue;
}

yamlFiles.Add(filePath);
}

if (yamlFiles.Count == 0)
{
collector.EmitError(input.Directory, "No YAML files found in directory");
return false;
}

_logger.LogInformation("Found {Count} YAML files in directory", yamlFiles.Count);

// Deserialize and filter changelog files
var deserializer = new StaticDeserializerBuilder(new ChangelogYamlStaticContext())
.WithNamingConvention(UnderscoredNamingConvention.Instance)
.Build();

var changelogEntries = new List<(ChangelogData data, string filePath, string fileName, string checksum)>();
var matchedPrs = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

foreach (var filePath in yamlFiles)
{
try
{
var fileName = _fileSystem.Path.GetFileName(filePath);
var fileContent = await _fileSystem.File.ReadAllTextAsync(filePath, ctx);

// Compute checksum (SHA1)
var checksum = ComputeSha1(fileContent);

// Deserialize YAML (skip comment lines)
var yamlLines = fileContent.Split('\n');
var yamlWithoutComments = string.Join('\n', yamlLines.Where(line => !line.TrimStart().StartsWith('#')));

// Normalize "version:" to "target:" in products section for compatibility
// Some changelog files may use "version" instead of "target"
// Match "version:" with various indentation levels
var normalizedYaml = VersionToTargetRegex().Replace(yamlWithoutComments, "$1target:");

var data = deserializer.Deserialize<ChangelogData>(normalizedYaml);

if (data == null)
{
_logger.LogWarning("Skipping file {FileName}: failed to deserialize", fileName);
continue;
}

// Apply filters
if (input.All)
{
// Include all
}
else if (productsToMatch.Count > 0)
{
// Filter by products
var matches = data.Products.Any(p =>
{
var version = p.Target ?? string.Empty;
return productsToMatch.Contains((p.Product.ToLowerInvariant(), version));
});

if (!matches)
{
continue;
}
}
else if (prsToMatch.Count > 0)
{
// Filter by PRs
var matches = false;
if (!string.IsNullOrWhiteSpace(data.Pr))
{
// Normalize PR for comparison
var normalizedPr = NormalizePrForComparison(data.Pr, input.Owner, input.Repo);
foreach (var pr in prsToMatch)
{
var normalizedPrToMatch = NormalizePrForComparison(pr, input.Owner, input.Repo);
if (normalizedPr == normalizedPrToMatch)
{
matches = true;
_ = matchedPrs.Add(pr);
break;
}
}
}

if (!matches)
{
continue;
}
}

changelogEntries.Add((data, filePath, fileName, checksum));
}
catch (YamlException ex)
{
_logger.LogWarning(ex, "Failed to parse YAML file {FilePath}", filePath);
collector.EmitError(filePath, $"Failed to parse YAML: {ex.Message}");
continue;
}
catch (Exception ex) when (ex is not (OutOfMemoryException or StackOverflowException or ThreadAbortException))
{
_logger.LogWarning(ex, "Error processing file {FilePath}", filePath);
collector.EmitError(filePath, $"Error processing file: {ex.Message}");
continue;
}
}

// Warn about unmatched PRs if filtering by PRs
if (prsToMatch.Count > 0)
{
var unmatchedPrs = prsToMatch.Where(pr => !matchedPrs.Contains(pr)).ToList();
if (unmatchedPrs.Count > 0)
{
foreach (var unmatchedPr in unmatchedPrs)
{
collector.EmitWarning(string.Empty, $"No changelog file found for PR: {unmatchedPr}");
}
}
}

if (changelogEntries.Count == 0)
{
collector.EmitError(string.Empty, "No changelog entries matched the filter criteria");
return false;
}

_logger.LogInformation("Found {Count} matching changelog entries", changelogEntries.Count);

// Build bundled data
var bundledData = new BundledChangelogData();

// Set products array in output
// If --output-products was specified, use those values (override any from changelogs)
if (input.OutputProducts != null && input.OutputProducts.Count > 0)
{
bundledData.Products = input.OutputProducts
.OrderBy(p => p.Product)
.ThenBy(p => p.Target ?? string.Empty)
.Select(p => new BundledProduct
{
Product = p.Product,
Target = p.Target
})
.ToList();
}
// If --input-products filter was used, only include those specific product-versions
else if (productsToMatch.Count > 0)
{
bundledData.Products = productsToMatch
.OrderBy(pv => pv.product)
.ThenBy(pv => pv.version)
.Select(pv => new BundledProduct
{
Product = pv.product,
Target = string.IsNullOrWhiteSpace(pv.version) ? null : pv.version
})
.ToList();
}
// Otherwise, extract unique products/versions from changelog entries
else
{
var productVersions = new HashSet<(string product, string version)>();
foreach (var (data, _, _, _) in changelogEntries)
{
foreach (var product in data.Products)
{
var version = product.Target ?? string.Empty;
_ = productVersions.Add((product.Product, version));
}
}

bundledData.Products = productVersions
.OrderBy(pv => pv.product)
.ThenBy(pv => pv.version)
.Select(pv => new BundledProduct
{
Product = pv.product,
Target = string.IsNullOrWhiteSpace(pv.version) ? null : pv.version
})
.ToList();
}

// Check for products with same product ID but different versions
var productsByProductId = bundledData.Products.GroupBy(p => p.Product, StringComparer.OrdinalIgnoreCase)
.Where(g => g.Count() > 1)
.ToList();

foreach (var productGroup in productsByProductId)
{
var targets = productGroup.Select(p => string.IsNullOrWhiteSpace(p.Target) ? "(no target)" : p.Target).ToList();
collector.EmitWarning(string.Empty, $"Product '{productGroup.Key}' has multiple targets in bundle: {string.Join(", ", targets)}");
}

// Build entries
if (input.Resolve)
{
// When resolving, include changelog contents and validate required fields
var resolvedEntries = new List<BundledEntry>();
foreach (var (data, filePath, fileName, checksum) in changelogEntries)
{
// Validate required fields
if (string.IsNullOrWhiteSpace(data.Title))
{
collector.EmitError(filePath, "Changelog file is missing required field: title");
return false;
}

if (string.IsNullOrWhiteSpace(data.Type))
{
collector.EmitError(filePath, "Changelog file is missing required field: type");
return false;
}

if (data.Products == null || data.Products.Count == 0)
{
collector.EmitError(filePath, "Changelog file is missing required field: products");
return false;
}

// Validate products have required fields
if (data.Products.Any(product => string.IsNullOrWhiteSpace(product.Product)))
{
collector.EmitError(filePath, "Changelog file has product entry missing required field: product");
return false;
}

resolvedEntries.Add(new BundledEntry
{
File = new BundledFile
{
Name = fileName,
Checksum = checksum
},
Type = data.Type,
Title = data.Title,
Products = data.Products,
Description = data.Description,
Impact = data.Impact,
Action = data.Action,
FeatureId = data.FeatureId,
Highlight = data.Highlight,
Subtype = data.Subtype,
Areas = data.Areas,
Pr = data.Pr,
Issues = data.Issues
});
}

bundledData.Entries = resolvedEntries;
}
else
{
// Only include file information
bundledData.Entries = changelogEntries
.Select(e => new BundledEntry
{
File = new BundledFile
{
Name = e.fileName,
Checksum = e.checksum
}
})
.ToList();
}

// Generate bundled YAML
var bundleSerializer = new StaticSerializerBuilder(new ChangelogYamlStaticContext())
.WithNamingConvention(UnderscoredNamingConvention.Instance)
.ConfigureDefaultValuesHandling(DefaultValuesHandling.OmitNull | DefaultValuesHandling.OmitEmptyCollections)
.Build();

var bundledYaml = bundleSerializer.Serialize(bundledData);

// Output path was already determined above when filtering files
var outputDir = _fileSystem.Path.GetDirectoryName(outputPath);
if (!string.IsNullOrWhiteSpace(outputDir) && !_fileSystem.Directory.Exists(outputDir))
{
_ = _fileSystem.Directory.CreateDirectory(outputDir);
}

// If output file already exists, generate a unique filename
if (_fileSystem.File.Exists(outputPath))
{
var directory = _fileSystem.Path.GetDirectoryName(outputPath) ?? string.Empty;
var fileNameWithoutExtension = _fileSystem.Path.GetFileNameWithoutExtension(outputPath);
var extension = _fileSystem.Path.GetExtension(outputPath);
var timestamp = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
var uniqueFileName = $"{fileNameWithoutExtension}-{timestamp}{extension}";
outputPath = _fileSystem.Path.Combine(directory, uniqueFileName);
_logger.LogInformation("Output file already exists, using unique filename: {OutputPath}", outputPath);
}

// Write bundled file
await _fileSystem.File.WriteAllTextAsync(outputPath, bundledYaml, ctx);
_logger.LogInformation("Created bundled changelog: {OutputPath}", outputPath);

return true;
}
catch (OperationCanceledException)
{
throw;
}
catch (IOException ioEx)
{
collector.EmitError(string.Empty, $"IO error bundling changelogs: {ioEx.Message}", ioEx);
return false;
}
catch (UnauthorizedAccessException uaEx)
{
collector.EmitError(string.Empty, $"Access denied bundling changelogs: {uaEx.Message}", uaEx);
return false;
}
}
Contributor

The method is overall too long; there are some portions that can be extracted into private auxiliary methods.
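
For example, the mutually-exclusive filter check near the top of the method could move into a small private helper along these lines (sketch only; the helper name is illustrative):

// Sketch: extract the filter-option validation from BundleChangelogs.
private static bool ValidateFilterOptions(ChangelogBundleInput input, IDiagnosticsCollector collector)
{
    var filterCount = 0;
    if (input.All)
        filterCount++;
    if (input.InputProducts is { Count: > 0 })
        filterCount++;
    if (input.Prs is { Length: > 0 })
        filterCount++;
    if (!string.IsNullOrWhiteSpace(input.PrsFile))
        filterCount++;

    if (filterCount == 0)
    {
        collector.EmitError(string.Empty, "At least one filter option must be specified: --all, --input-products, --prs, or --prs-file");
        return false;
    }

    if (filterCount > 1)
    {
        collector.EmitError(string.Empty, "Only one filter option can be specified at a time: --all, --input-products, --prs, or --prs-file");
        return false;
    }

    return true;
}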

Comment on lines 1008 to 1390
public async Task<bool> RenderChangelogs(
IDiagnosticsCollector collector,
ChangelogRenderInput input,
Cancel ctx
)
{
try
{
// Validate input
if (input.Bundles == null || input.Bundles.Count == 0)
{
collector.EmitError(string.Empty, "At least one bundle file is required. Use --input to specify bundle files.");
return false;
}

var deserializer = new StaticDeserializerBuilder(new ChangelogYamlStaticContext())
.WithNamingConvention(UnderscoredNamingConvention.Instance)
.Build();

// Validation phase: Load and validate all bundles before merging
var bundleDataList = new List<(BundledChangelogData data, BundleInput input, string directory)>();
var seenFileNames = new Dictionary<string, List<string>>(StringComparer.OrdinalIgnoreCase); // filename -> list of bundle files
var seenPrs = new Dictionary<string, List<string>>(); // PR -> list of bundle files
var defaultRepo = "elastic";

foreach (var bundleInput in input.Bundles)
{
if (string.IsNullOrWhiteSpace(bundleInput.BundleFile))
{
collector.EmitError(string.Empty, "Bundle file path is required for each --input");
return false;
}

if (!_fileSystem.File.Exists(bundleInput.BundleFile))
{
collector.EmitError(bundleInput.BundleFile, "Bundle file does not exist");
return false;
}

// Load bundle file
var bundleContent = await _fileSystem.File.ReadAllTextAsync(bundleInput.BundleFile, ctx);

// Validate bundle structure - check for unexpected fields by deserializing
BundledChangelogData? bundledData;
try
{
bundledData = deserializer.Deserialize<BundledChangelogData>(bundleContent);
}
catch (YamlException yamlEx)
{
collector.EmitError(bundleInput.BundleFile, $"Failed to deserialize bundle file: {yamlEx.Message}", yamlEx);
return false;
}

if (bundledData == null)
{
collector.EmitError(bundleInput.BundleFile, "Failed to deserialize bundle file");
return false;
}

// Validate bundle has required structure
if (bundledData.Products == null)
{
collector.EmitError(bundleInput.BundleFile, "Bundle file is missing required field: products");
return false;
}

if (bundledData.Entries == null)
{
collector.EmitError(bundleInput.BundleFile, "Bundle file is missing required field: entries");
return false;
}

// Determine directory for resolving file references
var bundleDirectory = bundleInput.Directory ?? _fileSystem.Path.GetDirectoryName(bundleInput.BundleFile) ?? Directory.GetCurrentDirectory();

// Validate all referenced files exist and check for duplicates
var fileNamesInThisBundle = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
foreach (var entry in bundledData.Entries)
{
// Track file names for duplicate detection
if (!string.IsNullOrWhiteSpace(entry.File?.Name))
{
var fileName = entry.File.Name;

// Check for duplicates within the same bundle
if (!fileNamesInThisBundle.Add(fileName))
{
collector.EmitWarning(bundleInput.BundleFile, $"Changelog file '{fileName}' appears multiple times in the same bundle");
}

// Track across bundles
if (!seenFileNames.TryGetValue(fileName, out var bundleList))
{
bundleList = [];
seenFileNames[fileName] = bundleList;
}
bundleList.Add(bundleInput.BundleFile);
}

// If entry has resolved data, validate it
if (!string.IsNullOrWhiteSpace(entry.Title) && !string.IsNullOrWhiteSpace(entry.Type))
{
// Validate required fields in resolved entry
if (string.IsNullOrWhiteSpace(entry.Title))
{
collector.EmitError(bundleInput.BundleFile, $"Entry in bundle is missing required field: title");
return false;
}

if (string.IsNullOrWhiteSpace(entry.Type))
{
collector.EmitError(bundleInput.BundleFile, $"Entry in bundle is missing required field: type");
return false;
}

if (entry.Products == null || entry.Products.Count == 0)
{
collector.EmitError(bundleInput.BundleFile, $"Entry '{entry.Title}' in bundle is missing required field: products");
return false;
}

// Track PRs for duplicate detection
if (!string.IsNullOrWhiteSpace(entry.Pr))
{
var normalizedPr = NormalizePrForComparison(entry.Pr, null, null);
if (!seenPrs.TryGetValue(normalizedPr, out var prBundleList))
{
prBundleList = [];
seenPrs[normalizedPr] = prBundleList;
}
prBundleList.Add(bundleInput.BundleFile);
}
}
else
{
// Entry only has file reference - validate file exists
if (string.IsNullOrWhiteSpace(entry.File?.Name))
{
collector.EmitError(bundleInput.BundleFile, "Entry in bundle is missing required field: file.name");
return false;
}

if (string.IsNullOrWhiteSpace(entry.File.Checksum))
{
collector.EmitError(bundleInput.BundleFile, $"Entry for file '{entry.File.Name}' in bundle is missing required field: file.checksum");
return false;
}

var filePath = _fileSystem.Path.Combine(bundleDirectory, entry.File.Name);
if (!_fileSystem.File.Exists(filePath))
{
collector.EmitError(bundleInput.BundleFile, $"Referenced changelog file '{entry.File.Name}' does not exist at path: {filePath}");
return false;
}

// Validate the changelog file can be deserialized
try
{
var fileContent = await _fileSystem.File.ReadAllTextAsync(filePath, ctx);
var checksum = ComputeSha1(fileContent);
if (checksum != entry.File.Checksum)
{
collector.EmitWarning(bundleInput.BundleFile, $"Checksum mismatch for file {entry.File.Name}. Expected {entry.File.Checksum}, got {checksum}");
}

// Deserialize YAML (skip comment lines) to validate structure
var yamlLines = fileContent.Split('\n');
var yamlWithoutComments = string.Join('\n', yamlLines.Where(line => !line.TrimStart().StartsWith('#')));

// Normalize "version:" to "target:" in products section
var normalizedYaml = VersionToTargetRegex().Replace(yamlWithoutComments, "$1target:");

var entryData = deserializer.Deserialize<ChangelogData>(normalizedYaml);
if (entryData == null)
{
collector.EmitError(bundleInput.BundleFile, $"Failed to deserialize changelog file '{entry.File.Name}'");
return false;
}

// Validate required fields in changelog file
if (string.IsNullOrWhiteSpace(entryData.Title))
{
collector.EmitError(filePath, "Changelog file is missing required field: title");
return false;
}

if (string.IsNullOrWhiteSpace(entryData.Type))
{
collector.EmitError(filePath, "Changelog file is missing required field: type");
return false;
}

if (entryData.Products == null || entryData.Products.Count == 0)
{
collector.EmitError(filePath, "Changelog file is missing required field: products");
return false;
}

// Track PRs for duplicate detection
if (!string.IsNullOrWhiteSpace(entryData.Pr))
{
var normalizedPr = NormalizePrForComparison(entryData.Pr, null, null);
if (!seenPrs.TryGetValue(normalizedPr, out var prBundleList2))
{
prBundleList2 = [];
seenPrs[normalizedPr] = prBundleList2;
}
prBundleList2.Add(bundleInput.BundleFile);
}
}
catch (YamlException yamlEx)
{
collector.EmitError(filePath, $"Failed to parse changelog file: {yamlEx.Message}", yamlEx);
return false;
}
}
}

bundleDataList.Add((bundledData, bundleInput, bundleDirectory));
}

// Check for duplicate file names across bundles
foreach (var (fileName, bundleFiles) in seenFileNames.Where(kvp => kvp.Value.Count > 1))
{
var uniqueBundles = bundleFiles.Distinct().ToList();
if (uniqueBundles.Count > 1)
{
collector.EmitWarning(string.Empty, $"Changelog file '{fileName}' appears in multiple bundles: {string.Join(", ", uniqueBundles)}");
}
}

// Check for duplicate PRs
foreach (var (pr, bundleFiles) in seenPrs.Where(kvp => kvp.Value.Count > 1))
{
var uniqueBundles = bundleFiles.Distinct().ToList();
if (uniqueBundles.Count > 1)
{
collector.EmitWarning(string.Empty, $"PR '{pr}' appears in multiple bundles: {string.Join(", ", uniqueBundles)}");
}
}

// If validation found errors, stop before merging
if (collector.Errors > 0)
{
return false;
}

// Merge phase: Now that validation passed, load and merge all bundles
var allResolvedEntries = new List<(ChangelogData entry, string repo)>();
var allProducts = new HashSet<(string product, string target)>();

foreach (var (bundledData, bundleInput, bundleDirectory) in bundleDataList)
{
// Collect products from this bundle
foreach (var product in bundledData.Products)
{
var target = product.Target ?? string.Empty;
_ = allProducts.Add((product.Product, target));
}

var repo = bundleInput.Repo ?? defaultRepo;

// Resolve entries
foreach (var entry in bundledData.Entries)
{
ChangelogData? entryData = null;

// If entry has resolved data, use it
if (!string.IsNullOrWhiteSpace(entry.Title) && !string.IsNullOrWhiteSpace(entry.Type))
{
entryData = new ChangelogData
{
Title = entry.Title,
Type = entry.Type,
Subtype = entry.Subtype,
Description = entry.Description,
Impact = entry.Impact,
Action = entry.Action,
FeatureId = entry.FeatureId,
Highlight = entry.Highlight,
Pr = entry.Pr,
Products = entry.Products ?? [],
Areas = entry.Areas,
Issues = entry.Issues
};
}
else
{
// Load from file (already validated to exist)
var filePath = _fileSystem.Path.Combine(bundleDirectory, entry.File.Name);
var fileContent = await _fileSystem.File.ReadAllTextAsync(filePath, ctx);

// Deserialize YAML (skip comment lines)
var yamlLines = fileContent.Split('\n');
var yamlWithoutComments = string.Join('\n', yamlLines.Where(line => !line.TrimStart().StartsWith('#')));

// Normalize "version:" to "target:" in products section
var normalizedYaml = VersionToTargetRegex().Replace(yamlWithoutComments, "$1target:");

entryData = deserializer.Deserialize<ChangelogData>(normalizedYaml);
}

if (entryData != null)
{
allResolvedEntries.Add((entryData, repo));
}
}
}

if (allResolvedEntries.Count == 0)
{
collector.EmitError(string.Empty, "No changelog entries to render");
return false;
}

// Determine output directory
var outputDir = input.Output ?? Directory.GetCurrentDirectory();
if (!_fileSystem.Directory.Exists(outputDir))
{
_ = _fileSystem.Directory.CreateDirectory(outputDir);
}

// Extract version from products (use first product's target if available, or "unknown")
var version = allProducts.Count > 0
? allProducts.OrderBy(p => p.product).ThenBy(p => p.target).First().target
: "unknown";

if (string.IsNullOrWhiteSpace(version))
{
version = "unknown";
}

// Warn if --title was not provided and version defaults to "unknown"
if (string.IsNullOrWhiteSpace(input.Title) && version == "unknown")
{
collector.EmitWarning(string.Empty, "No --title option provided and bundle files do not contain 'target' values. Output folder and markdown titles will default to 'unknown'. Consider using --title to specify a custom title.");
}

// Group entries by type (kind)
var entriesByType = allResolvedEntries.Select(e => e.entry).GroupBy(e => e.Type).ToDictionary(g => g.Key, g => g.ToList());

// Use title from input or default to version
var title = input.Title ?? version;
// Convert title to slug format for folder names and anchors (lowercase, dashes instead of spaces)
var titleSlug = TitleToSlug(title);

// Render markdown files (use first repo found, or default)
var repoForRendering = allResolvedEntries.Count > 0 ? allResolvedEntries[0].repo : defaultRepo;

// Render index.md (features, enhancements, bug fixes, security)
await RenderIndexMarkdown(collector, outputDir, title, titleSlug, repoForRendering, allResolvedEntries.Select(e => e.entry).ToList(), entriesByType, input.Subsections, input.HidePrivateLinks, ctx);

// Render breaking-changes.md
await RenderBreakingChangesMarkdown(collector, outputDir, title, titleSlug, repoForRendering, allResolvedEntries.Select(e => e.entry).ToList(), entriesByType, input.Subsections, input.HidePrivateLinks, ctx);

// Render deprecations.md
await RenderDeprecationsMarkdown(collector, outputDir, title, titleSlug, repoForRendering, allResolvedEntries.Select(e => e.entry).ToList(), entriesByType, input.Subsections, input.HidePrivateLinks, ctx);

_logger.LogInformation("Rendered changelog markdown files to {OutputDir}", outputDir);

return true;
}
catch (OperationCanceledException)
{
throw;
}
catch (IOException ioEx)
{
collector.EmitError(string.Empty, $"IO error rendering changelogs: {ioEx.Message}", ioEx);
return false;
}
catch (UnauthorizedAccessException uaEx)
{
collector.EmitError(string.Empty, $"Access denied rendering changelogs: {uaEx.Message}", uaEx);
return false;
}
catch (YamlException yamlEx)
{
collector.EmitError(string.Empty, $"YAML parsing error: {yamlEx.Message}", yamlEx);
return false;
}
}
Contributor

This one got too long as well; some portions could be moved into private methods to simplify it.
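
For instance, the per-bundle structural checks could become a helper along these lines (sketch only; the name is illustrative):

// Sketch: basic bundle structure validation extracted from RenderChangelogs.
private static bool ValidateBundleStructure(
    BundledChangelogData? bundledData,
    string bundleFile,
    IDiagnosticsCollector collector)
{
    if (bundledData == null)
    {
        collector.EmitError(bundleFile, "Failed to deserialize bundle file");
        return false;
    }

    if (bundledData.Products == null)
    {
        collector.EmitError(bundleFile, "Bundle file is missing required field: products");
        return false;
    }

    if (bundledData.Entries == null)
    {
        collector.EmitError(bundleFile, "Bundle file is missing required field: entries");
        return false;
    }

    return true;
}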

Contributor

Deserialization of ChangelogData can be extracted and reused:

private ChangelogData? DeserializeChangelogContent(string content, IDeserializer deserializer)
{
    var yamlLines = content.Split('\n');
    var yamlWithoutComments = string.Join('\n', yamlLines.Where(line => !line.TrimStart().StartsWith('#')));
    var normalizedYaml = VersionToTargetRegex().Replace(yamlWithoutComments, "$1target:");
    return deserializer.Deserialize<ChangelogData>(normalizedYaml);
}

Contributor

The deserializer builders are created multiple times; they can be class fields instead.
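
Something like this (sketch only; the field name is illustrative):

// Sketch: build the deserializer once and reuse it across both methods.
private static readonly IDeserializer ChangelogDeserializer =
    new StaticDeserializerBuilder(new ChangelogYamlStaticContext())
        .WithNamingConvention(UnderscoredNamingConvention.Instance)
        .Build();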

Contributor

The ChangelogData required field checks (duplicated between BundleChangelogs and RenderChangelogs) can be moved:

private static bool ValidateChangelogRequiredFields(
    ChangelogData changelogData,
    string filePath,
    IDiagnosticsCollector collector)
{
    if (string.IsNullOrWhiteSpace(changelogData.Title))
    {
        collector.EmitError(filePath, "Changelog file is missing required field: title");
        return false;
    }

    if (string.IsNullOrWhiteSpace(changelogData.Type))
    {
        collector.EmitError(filePath, "Changelog file is missing required field: type");
        return false;
    }

    if (changelogData.Products is not { Count: > 0 })
    {
        collector.EmitError(filePath, "Changelog file is missing required field: products");
        return false;
    }

    if (changelogData.Products.Any(p => string.IsNullOrWhiteSpace(p.Product)))
    {
        collector.EmitError(filePath, "Changelog file has product entry missing required field: product");
        return false;
    }

    return true;
}

Comment on lines +1062 to +1079
if (bundledData == null)
{
collector.EmitError(bundleInput.BundleFile, "Failed to deserialize bundle file");
return false;
}

// Validate bundle has required structure
if (bundledData.Products == null)
{
collector.EmitError(bundleInput.BundleFile, "Bundle file is missing required field: products");
return false;
}

if (bundledData.Entries == null)
{
collector.EmitError(bundleInput.BundleFile, "Bundle file is missing required field: entries");
return false;
}
Contributor

I see. So effectively they (null/empty) mean the same: invalid input.

That's correct; I'm just not 100% sure which path to take: make them nullable to reflect that this is input that can happen and report an error during validation, or throw outright during parsing, since this is an error either way.

cotti added 4 commits January 15, 2026 11:56
Resolved conflicts by keeping features from both branches:
- AddBlockers (from main): Prevents changelog creation based on PR labels
- RenderBlockers (from changelog-manifest): Prevents changelogs from being rendered
- Multiple PRs support (from main): Creates one changelog per PR
- Bundle and render commands (from changelog-manifest): Create bundles and generate markdown

Merged documentation to include all features and examples.
…og-manifest

# Conflicts:
#	tests/Elastic.Documentation.Services.Tests/ChangelogServiceTests.cs
@lcawl lcawl merged commit b84df3b into main Jan 15, 2026
30 checks passed
@lcawl lcawl deleted the changelog-manifest branch January 15, 2026 15:33