Summary
Add the ability to export benchmark results to CSV and Markdown table formats for easy sharing, analysis, and inclusion in documentation.
Motivation
Currently, benchmark results are only displayed in the console. To track performance over time, compare across hardware, or include results in reports and READMEs, a structured export format is needed. CSV enables spreadsheet analysis and charting, while Markdown enables direct pasting into GitHub issues, PRs, and documentation.
Acceptance Criteria
- Add a `--output` CLI flag accepting `csv` and `md` (or `markdown`) as values
- CSV export should include all metrics: task name, accelerator, average latency, P95 latency, throughput, and any additional fields from PCIe vs kernel timing breakdown #7 (timing breakdown) if available
- CSV should include a header row and use standard formatting (comma-delimited, quoted strings where needed)
- Markdown export should produce a GitHub-flavoured Markdown table with aligned columns
- Output should be written to a file (e.g. `results.csv` / `results.md`), with the filename optionally configurable via `--output-file`
- Console output should still be shown by default (export is in addition to, not instead of, console output)
- Include machine/hardware metadata in the export (GPU name, driver version, OS, .NET version) as a header or separate section
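The CSV criteria above could be sketched roughly as follows. This is a minimal illustration, not the repo's actual code: the `BenchmarkResult` record, its field names, and the `CsvExporter` class are all hypothetical. Quoting follows RFC 4180 conventions (quote fields containing commas, quotes, or newlines; double embedded quotes), and numbers are formatted with the invariant culture so the output is locale-independent.

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Text;

// Hypothetical result record; field names are illustrative only.
public record BenchmarkResult(
    string Task, string Accelerator,
    double AvgLatencyMs, double P95LatencyMs, double ThroughputOpsPerSec);

public static class CsvExporter
{
    // Quote a field only when needed (RFC 4180 style).
    static string Quote(string field) =>
        field.IndexOfAny(new[] { ',', '"', '\n' }) >= 0
            ? "\"" + field.Replace("\"", "\"\"") + "\""
            : field;

    public static string ToCsv(IEnumerable<BenchmarkResult> results)
    {
        var sb = new StringBuilder();
        // Header row, as required by the acceptance criteria.
        sb.AppendLine("Task,Accelerator,AvgLatencyMs,P95LatencyMs,ThroughputOpsPerSec");
        foreach (var r in results)
            sb.AppendLine(string.Join(",",
                Quote(r.Task), Quote(r.Accelerator),
                r.AvgLatencyMs.ToString("F3", CultureInfo.InvariantCulture),
                r.P95LatencyMs.ToString("F3", CultureInfo.InvariantCulture),
                r.ThroughputOpsPerSec.ToString("F1", CultureInfo.InvariantCulture)));
        return sb.ToString();
    }
}
```

Writing the result to disk would then be a one-liner, e.g. `File.WriteAllText("results.csv", CsvExporter.ToCsv(results))`.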
Technical Notes
- Consider using `StringBuilder` or `StreamWriter` for efficient file output
- Markdown tables should be formatted for readability (padded columns)
- This pairs well with Batch size sweep analysis #8 (batch size sweep) and Linux / cross-platform CI benchmarks #12 (CI benchmarks) for automated reporting
- Hardware metadata can be sourced from ILGPU's accelerator properties and `System.Runtime.InteropServices.RuntimeInformation`
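A padded GitHub-flavoured table could be generated along these lines (a sketch only; the `MarkdownTable` helper is hypothetical). Each column is widened to the longest cell so the raw Markdown stays readable, which is the "padded columns" point above:

```csharp
using System;
using System.Linq;
using System.Text;

public static class MarkdownTable
{
    // Render headers + rows as a GitHub-flavoured Markdown table.
    // Every row must have headers.Length cells.
    public static string Render(string[] headers, string[][] rows)
    {
        int cols = headers.Length;
        // Column width = widest cell in that column (including the header).
        var widths = new int[cols];
        for (int c = 0; c < cols; c++)
            widths[c] = Math.Max(headers[c].Length,
                rows.Length == 0 ? 0 : rows.Max(r => r[c].Length));

        string Row(string[] cells) =>
            "| " + string.Join(" | ",
                cells.Select((cell, c) => cell.PadRight(widths[c]))) + " |";

        var sb = new StringBuilder();
        sb.AppendLine(Row(headers));
        // Separator row; width + 2 accounts for the padding spaces.
        sb.AppendLine("|" + string.Join("|",
            widths.Select(w => new string('-', w + 2))) + "|");
        foreach (var r in rows)
            sb.AppendLine(Row(r));
        return sb.ToString();
    }
}
```

Rendering the raw Markdown with fixed-width columns also makes diffs of committed `results.md` files easy to review.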
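For the metadata header, `RuntimeInformation` covers the OS, architecture, and .NET version directly. The GPU name would come from ILGPU (the `Accelerator.Name` property); in this sketch it is passed in as a plain string so the example stays independent of accelerator setup, and the `ExportMetadata` class itself is hypothetical. Driver version is omitted here because how to query it depends on the accelerator backend.

```csharp
using System;
using System.Runtime.InteropServices;

public static class ExportMetadata
{
    // Build a comment-style metadata header for the export file.
    // gpuName is supplied by the caller (e.g. from ILGPU's Accelerator.Name).
    public static string Describe(string gpuName) =>
        $"# GPU: {gpuName}{Environment.NewLine}" +
        $"# OS: {RuntimeInformation.OSDescription}{Environment.NewLine}" +
        $"# Arch: {RuntimeInformation.ProcessArchitecture}{Environment.NewLine}" +
        $"# .NET: {RuntimeInformation.FrameworkDescription}";
}
```

For CSV consumers that reject comment lines, the same fields could instead go into a separate `metadata.csv` or a leading section of the Markdown file.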