Changes from all commits
202 commits
ff1a95c
Set data file `sort_order_id` in manifest for writes from Spark
jbewing Nov 25, 2025
47d5f50
Docs: Add gc.enabled table property (#14676)
ebyhr Nov 25, 2025
6a54bc1
Core: Add idempotency-key-lifetime to ConfigResponse (#14649)
huaxingao Nov 25, 2025
078fbeb
Build: Don't ignore major version upgrade for GH actions in dependabo…
manuzhang Nov 26, 2025
790a820
Build: Bump actions/labeler from 5 to 6 (#14689)
dependabot[bot] Nov 26, 2025
5e166b5
Build: Bump actions/setup-python from 5 to 6 (#14690)
dependabot[bot] Nov 26, 2025
b7549da
Build: Bump actions/stale from 9.1.0 to 10.1.0 (#14692)
dependabot[bot] Nov 26, 2025
0846ed5
Build: Bump software.amazon.awssdk:bom from 2.39.2 to 2.39.4 (#14693)
dependabot[bot] Nov 26, 2025
87174fc
Build: Bump actions/upload-artifact from 4 to 5 (#14688)
dependabot[bot] Nov 26, 2025
fe6f78b
Core: Allow overriding view location for subclasses (#14653)
tmater Nov 26, 2025
cd8d2a3
Core: Fix server side planning on empty tables in CatalogHandlers (#1…
geruh Nov 26, 2025
42719ef
Build: Bump actions/checkout from 3 to 6 (#14691)
dependabot[bot] Nov 27, 2025
cf27769
Spark: Fix scala warnings in View code (#14703)
nastra Nov 28, 2025
d2d4135
infra: notify on github workflow failure (#14609)
kevinjqliu Nov 28, 2025
784f1f4
Docs: Fix package of iceberg.catalog.io-impl (#14711)
ebyhr Nov 29, 2025
8a38d6e
Spark 4.0: expire-snapshots with cleanupLevel=None (#14695)
alessandro-nori Nov 29, 2025
52c176d
Use non-deprecated del_branch_on_merge (#14710)
kevinjqliu Nov 29, 2025
b9a8c31
Build: Bump software.amazon.awssdk:bom from 2.39.4 to 2.39.5 (#14718)
dependabot[bot] Nov 30, 2025
3a11776
Build: Bump datamodel-code-generator from 0.35.0 to 0.36.0 (#14716)
dependabot[bot] Nov 30, 2025
46d766a
Build: Bump com.google.errorprone:error_prone_annotations (#14715)
dependabot[bot] Nov 30, 2025
65c667d
Nit: Move unchecked suppression down to violating assignment in `Parq…
smaheshwar-pltr Nov 30, 2025
f3949ce
Build: Bump com.azure:azure-sdk-bom from 1.3.2 to 1.3.3 (#14717)
dependabot[bot] Nov 30, 2025
b35c7ec
Core: Fix NAN_VALUE_COUNTS serialization for ContentFile (#14721)
huaxingao Dec 1, 2025
3747965
Core: Use `@TempDir` in TestTableMetadataParser (#14732)
ebyhr Dec 2, 2025
16e8435
Core: Add UUIDv7 generator (#14700)
huaxingao Dec 2, 2025
166a6eb
Build: Apply spotless for scala code (#8023)
ConeyLiu Dec 2, 2025
29a144a
Refactor SnapshotAncestryValidator (#14650)
aiborodin Dec 2, 2025
9896e8c
Core: Align ContentFile partition JSON with REST spec (#14702)
geruh Dec 2, 2025
da76b87
open-api: use uv and python virtual env (#14684)
kevinjqliu Dec 2, 2025
35d66a3
Core: Support Custom Table/View Operations in RESTCatalog (#14465)
XJDKC Dec 2, 2025
fac485c
Spark: Analyze but don't optimize view body during creation (#14681)
jbewing Dec 3, 2025
74a1160
Spark 4.0, Core: Add Limit pushdown to Scan (#14615)
nastra Dec 3, 2025
fc7cd7d
Spark 3.4,3.5: Add LIMIT pushdown to Scan (#14741)
nastra Dec 3, 2025
bfe06a5
Spark: Move DeleteFiltering out from the vectorized reader (#14652)
pvary Dec 3, 2025
8626ef5
Docs: encryption (#14621)
ggershinsky Dec 3, 2025
52d6a79
Nit: Prefer `Preconditions` in `StandardEncryptionManager` (#14753)
smaheshwar-pltr Dec 4, 2025
65280c0
Spark: Backport move DeleteFiltering out from the vectorized reader (…
pvary Dec 4, 2025
667bc59
Flink: Dynamic Sink: Document writeParallelism and fail on invalid co…
mxm Dec 4, 2025
86e53a7
Flink: Backport: Dynamic Sink: Document writeParallelism and fail on …
mxm Dec 4, 2025
4c0ad42
Core: Align ContentFile enum serialization with REST Spec (#14739)
geruh Dec 4, 2025
d7f8950
Exception on encryption key altering (#14723)
ggershinsky Dec 5, 2025
ba98347
Docs: fix rendering issues in encryption doc (#14756)
huaxingao Dec 5, 2025
8db3d21
Flink: Fix cache refreshing in dynamic sink (#14406)
aiborodin Dec 5, 2025
c4ba60d
Flink: Backport fix cache refreshing in dynamic sink (#14765)
aiborodin Dec 5, 2025
55bfc7e
OpenAPI: use yaml linter (#14686)
kevinjqliu Dec 5, 2025
73f8ab8
Core: Reference IRC to return 204 (#14724)
gaborkaszab Dec 5, 2025
7bac865
Core: Send Idempotency-Key on mutation requests when advertised (#14740)
huaxingao Dec 5, 2025
4c90831
Spark: ORC vectorized reader to use the delete filter (#14746)
pvary Dec 5, 2025
9632a2f
AWS: Configure builder for reuse of http connection pool in SDKv2 (#1…
anuragmantri Dec 5, 2025
885bcbb
site: Update Slack link (#14772)
Fokko Dec 5, 2025
1c024d7
Update configuration.md (#14771)
Kurtiscwright Dec 5, 2025
fc43499
Revert "Update configuration.md (#14771)" (#14780)
manuzhang Dec 6, 2025
3b1ec48
Build: Bump actions/stale from 10.1.0 to 10.1.1 (#14784)
dependabot[bot] Dec 7, 2025
faabc46
Build: Bump nessie from 0.105.7 to 0.106.0 (#14785)
dependabot[bot] Dec 7, 2025
786d164
Build: Bump org.xerial:sqlite-jdbc from 3.51.0.0 to 3.51.1.0 (#14786)
dependabot[bot] Dec 7, 2025
6735edd
Build: Bump software.amazon.awssdk:bom from 2.39.5 to 2.40.3 (#14788)
dependabot[bot] Dec 7, 2025
19b4bd0
Core: Disallow encryption table properties in v1 and v2 (#14668)
ebyhr Dec 8, 2025
0c19450
OpenAPI: Use `PrimitiveTypeValue` rather than `object` (#14184)
Fokko Dec 8, 2025
a739cb3
Flink: Dynamic Sink: Add support for dropping columns (#14728)
mxm Dec 8, 2025
01b29dd
Build: Bump datamodel-code-generator from 0.36.0 to 0.41.0 (#14791)
ebyhr Dec 8, 2025
a9482bd
Flink: Backport: Dynamic Sink: Add support for dropping columns (#14799)
mxm Dec 8, 2025
642b852
OpenAPI: Make namespace separator configurable by server (#14448)
nastra Dec 8, 2025
8901269
OpenAPI: Add idempotency key for the mutating plan endpoints (#14730)
singhpk234 Dec 8, 2025
bd8d289
Hive: Metadata integrity check for encrypted tables (#14685)
szlta Dec 8, 2025
344adf9
Core: Make namespace separator configurable (#10877)
nastra Dec 9, 2025
b2f3a4c
Spark: Backport ORC vectorized reader to use the delete filter (#14794)
pvary Dec 9, 2025
0547af0
Flink: Fix write unknown type to ORC exception and add ut for unknown…
Guosmilesmile Dec 9, 2025
7e41316
Flink: Backport fix write unknown type to ORC exception and add ut fo…
Guosmilesmile Dec 9, 2025
c8f4d5f
Spark: Add comet reader test (#14807)
pvary Dec 9, 2025
03415df
Spark: Backport add comet reader test (#14809)
pvary Dec 9, 2025
c6ba7a4
GCS: Integrate GCSAnalyticsCore Library (#14333)
prudhvimaharishi Dec 9, 2025
0cc337a
Core: REST Scan Planning Task Implementation (#13400)
singhpk234 Dec 10, 2025
122c440
API: Reduce 'Scanning table' log verbosity for long list of strings (…
raunaqmorarka Dec 10, 2025
1118838
Flink: Dynamic Sink: Handle NoSuchNamespaceException properly (#14812)
mxm Dec 10, 2025
482d850
Spark: Test all simple types in TestSelect (#14804)
nastra Dec 10, 2025
cbd3579
Encryption: Simplify Hive key handling and add transaction tests (#14…
smaheshwar-pltr Dec 10, 2025
3611313
Handle SupportsWithPrefix in EncryptingFileIO (#14727)
tom-s-powell Dec 10, 2025
d894a02
Core: Align CharSequenceSet impl with Data/DeleteFileSet (#11322)
nastra Dec 10, 2025
3694095
Throw CommitFailedException when BQ returns FAILED_PRECONDITION. (#14…
vladislav-sidorovich Dec 10, 2025
5025d58
Build: Improvements around applying spotless for Scala (#14798)
ConeyLiu Dec 10, 2025
23dc32e
REST: Implement Batch Scan for RESTTableScan (#14776)
singhpk234 Dec 11, 2025
f531767
Support for TIME, TIMESTAMPNTZ_NANO, UUID types in Inclusive Metrics …
manirajv06 Dec 11, 2025
fc981b4
Flink: Log on cache refresh in dynamic sink (#14792)
aiborodin Dec 11, 2025
7418f49
Core: disable flaky test for batchScan RemoteScanPlanning (#14826)
singhpk234 Dec 11, 2025
e90b06c
Flink: Backport: Dynamic Sink: Handle NoSuchNamespaceException proper…
mxm Dec 12, 2025
cc02655
Flink: Backport: Log on cache refresh in dynamic sink (#14828)
aiborodin Dec 12, 2025
c68f041
Core: Expose the stats of the manifest file content cache (#13560)
gaborkaszab Dec 12, 2025
849f218
Docs: Add Apache Fluss integration link (#14829)
MehulBatra Dec 12, 2025
41b5af3
Azure: KeyManagementClient implementation for Azure Key Vault (#13186)
nandorKollar Dec 12, 2025
baff19f
Docs: Update community meetup guidelines (#14770)
danicafine Dec 12, 2025
9a04882
Spark, Flink: replace deprecated cleanExpiredFiles in expireSnapshots…
dramaticlly Dec 12, 2025
bc23a77
Core: Adjust namespace separator in TestRESTCatalog (#14808)
gaborkaszab Dec 13, 2025
9daef17
Build: Bump software.amazon.awssdk:bom from 2.40.3 to 2.40.8 (#14843)
dependabot[bot] Dec 14, 2025
83b8afa
Build: Bump datamodel-code-generator from 0.41.0 to 0.43.1 (#14845)
ebyhr Dec 14, 2025
837df74
Build: Bump org.immutables:value from 2.11.7 to 2.12.0 (#14844)
dependabot[bot] Dec 14, 2025
abd1d77
Build: Bump io.netty:netty-buffer from 4.2.7.Final to 4.2.8.Final (#1…
dependabot[bot] Dec 14, 2025
b6e262d
Build: Bump actions/upload-artifact from 5 to 6 (#14840)
dependabot[bot] Dec 14, 2025
69b4191
Build: Bump actions/cache from 4 to 5 (#14839)
dependabot[bot] Dec 14, 2025
831b4ea
Core: Change removal of deprecations to 1.12.0 (#14392)
gaborkaszab Dec 15, 2025
ba28a33
Core: Deprecate scan response builder deleteFiles API (#14838)
amogh-jahagirdar Dec 15, 2025
90bfc3d
SPEC: Add NoSuchPlanId to cancel endpoint (#14796)
singhpk234 Dec 15, 2025
f244955
Flink, Core: RewriteDataFiles add max file group count (#14837)
Guosmilesmile Dec 16, 2025
d5bfcaf
Docs: Add schema selection example for time travel queries (#14825)
pallevam Dec 16, 2025
baee887
fix typo in assert message (#14855)
huaxingao Dec 16, 2025
b23f13f
Core: Address Race Condition in ScanTaskIterable (#14824)
singhpk234 Dec 16, 2025
60b42ec
API: Remove redundant } from Transforms javadoc (#14866)
ebyhr Dec 17, 2025
26cb7cd
Flink: Backport RewriteDataFiles add max file group count (#14861)
Guosmilesmile Dec 17, 2025
9ca8029
GCS: bump up gcs-analytics-core version from 1.2.1 to 1.2.3 (#14873)
ajayky-os Dec 17, 2025
33cab35
Spark: Enable remote scan planning with REST catalog (#14822)
nastra Dec 18, 2025
8ea92ff
Core: Simplify handling of the current planId in client side of remot…
nastra Dec 18, 2025
05998ed
Fix: Enable metadata tables support for REST scan planning (#14881)
singhpk234 Dec 18, 2025
00bf964
Spark: Order results to fix test flakiness with remote scan planning …
nastra Dec 19, 2025
2a006ba
Core: Close planFiles() iterable in CatalogHandler (#14891)
nastra Dec 19, 2025
59280a1
Hive: Update view query in HMS when replacing view (#14831)
stuxuhai Dec 19, 2025
554a3c1
GCP: Add service account impersonation support for BigQueryMetastoreC…
joyhaldar Dec 19, 2025
0fb1e3c
OpenAPI: Etag for CommitTableResponse (#14760)
c-thiel Dec 19, 2025
76fcd47
Build: Bump io.netty:netty-buffer from 4.2.8.Final to 4.2.9.Final (#1…
dependabot[bot] Dec 21, 2025
da67268
Build: Bump testcontainers from 2.0.2 to 2.0.3 (#14898)
dependabot[bot] Dec 21, 2025
73c2aa2
Build: Bump software.amazon.awssdk:bom from 2.40.8 to 2.40.13 (#14904)
dependabot[bot] Dec 21, 2025
830cbc9
Build: Bump net.snowflake:snowflake-jdbc from 3.27.1 to 3.28.0 (#14899)
dependabot[bot] Dec 21, 2025
e8f6e90
Build: Bump com.google.cloud:libraries-bom from 26.72.0 to 26.73.0 (#…
dependabot[bot] Dec 21, 2025
d6d44a7
Build: Bump org.apache.httpcomponents.client5:httpclient5 (#14900)
dependabot[bot] Dec 21, 2025
1c5bb01
Build: Bump datamodel-code-generator from 0.43.1 to 0.46.0 (#14905)
ebyhr Dec 22, 2025
f400586
Site: Updates for 1.10.1 Release (#14907)
huaxingao Dec 22, 2025
f1e0273
Issue template: add 1.10.1 to version dropdown (#14916)
huaxingao Dec 23, 2025
752a282
Site: correct release time for 1.10.1 (#14918)
huaxingao Dec 23, 2025
3149892
Spark: Move 4.0 as 4.1
manuzhang Dec 23, 2025
ddea565
Spark: Copy back 4.1 as 4.0
manuzhang Dec 23, 2025
3179684
Spark: Initial support for 4.1.0
manuzhang Dec 1, 2025
ed26fd7
DOAP: add release 1.10.1 (#14917)
huaxingao Dec 23, 2025
0069c5e
INFRA: Skip running CI for doap.rdf file (#14919)
singhpk234 Dec 23, 2025
026ec35
Core: Small cleanup in MergingSnapshotProducer cleanUncommittedAppend…
amogh-jahagirdar Dec 24, 2025
0651b89
[doc] Add highlight note for Hadoop S3A FileSystem (#14913)
nhuantho Dec 24, 2025
5304461
Build: Bump datamodel-code-generator from 0.46.0 to 0.49.0 (#14938)
dependabot[bot] Dec 28, 2025
b26009c
Build: Bump pymarkdownlnt from 0.9.33 to 0.9.34 (#14937)
dependabot[bot] Dec 28, 2025
63c923e
fix test regex (#14939)
singhpk234 Dec 28, 2025
4db3909
Build: Bump org.openapitools:openapi-generator-gradle-plugin (#14934)
dependabot[bot] Dec 28, 2025
4632f31
Build: Bump software.amazon.awssdk:bom from 2.40.13 to 2.40.16 (#14936)
dependabot[bot] Dec 28, 2025
9c3bed6
Docs: Fix MERGE INTO example in Getting Started (#14943)
varun-lakhyani Dec 30, 2025
0004600
Spec: fix impl note about snapshot ID generation (#14720)
dalaro Jan 2, 2026
e131329
Spark: Add ordering to TestSelect to remove flakiness (#14956)
huaxingao Jan 3, 2026
01b59d3
Build: Bump software.amazon.awssdk:bom from 2.40.16 to 2.41.1 (#14961)
dependabot[bot] Jan 4, 2026
64b7b66
Build: Bump datamodel-code-generator from 0.49.0 to 0.52.1 (#14962)
ebyhr Jan 4, 2026
3048d77
Flink: Dynamic Sink: Fix serialization issues with schemas larger tha…
mxm Jan 5, 2026
4bd1fb8
Flink: DynamicSink: Report writer records/bytes send metrics (#14878)
aiborodin Jan 5, 2026
4bc934b
Spark 3.4 | 3.5: Enable remote scan planning (#14963)
singhpk234 Jan 5, 2026
bc7bfa5
Flink: Backport: Dynamic Sink: Fix serialization issues with schemas …
mxm Jan 5, 2026
8f2ed20
Flink: Backport: DynamicSink: Report writer records/bytes send metric…
aiborodin Jan 6, 2026
42cac92
Flink: Fix equalityFieldColumns always null in IcebergSink (#14952)
Guosmilesmile Jan 6, 2026
4fe8ae2
Flink: Backport fix equalityFieldColumns always null in IcebergSink (…
Guosmilesmile Jan 6, 2026
d754518
Core: Reduce manifest logging noise on drop table (#14969)
dramaticlly Jan 6, 2026
bde85b0
API, Spark: Optimize NOT IN and != predicate evaluation for fields co…
joyhaldar Jan 6, 2026
7bfe144
Flink: fix VisibleForTesting import in ZkLockFactory (#14977)
huaxingao Jan 6, 2026
234af35
site: fix live loading in make serve-dev
kevinjqliu Jan 7, 2026
46c871c
Spark: Add Spark app name to env context (#14976)
teamurko Jan 7, 2026
055a73a
AWS: Merge catalog properties with properties prefixed with client.cr…
tom-s-powell Jan 7, 2026
b07c1e5
Spark: Backport: Add Spark app name to env context for Spark v3.4, 3.…
varun-lakhyani Jan 7, 2026
51d548a
API, Core: Scan API for partition stats (#14640)
gaborkaszab Jan 7, 2026
1dce77c
Data: Handle TIMESTAMP_NANO in InternalRecordWrapper (#14974)
ayushtkn Jan 7, 2026
b3b6657
Site: Add Iceberg Summit 2026 section to homepage (#14988)
RussellSpitzer Jan 7, 2026
aee8900
Include key metadata in manifest tables (#14750)
tom-s-powell Jan 7, 2026
88d833b
Core: Handle NotFound exception for missing metadata file (#13143)
coded9 Jan 8, 2026
99f14e7
Spark 4.1: Initial support for MERGE INTO schema evolution (#14970)
szehon-ho Jan 8, 2026
daa3bb2
manually update spark 3.4 (#14993)
kevinjqliu Jan 8, 2026
01e3240
site infra: when running `make serve`, add a tip on using `make serve…
kevinjqliu Jan 8, 2026
cde5b9f
Kafka Connect: Fix CVE-2025-55163 in grpc-netty-shaded (#14985)
rmoff Jan 8, 2026
615b5a0
Core: Unlink table metadata's last-updated timestamp from snapshot ti…
dramaticlly Jan 8, 2026
0094ccc
Use SnapshotRef.MAIN_BRANCH instead of the 'main' string (#14999)
pvary Jan 8, 2026
a7b8a08
Spark 4.1: Fix spark 4.1 test for unlink table metadata's last-update…
dramaticlly Jan 9, 2026
a90848e
infra: add gradle cache to github workflows
kevinjqliu Jan 9, 2026
4a4d734
Core: Add storage credentials to FetchPlanningResultResponse (#14994)
nastra Jan 9, 2026
6f7b568
Flink: Dynamic Sink: Refactor write result aggregation (#14810)
aiborodin Jan 9, 2026
a1c1c1b
Core: Support case-insensitive field lookups in SchemaUpdate (#14734)
mxm Jan 9, 2026
d85f8a8
Kafka Connect: validate table uuid on commit (#14979)
danielcweeks Jan 9, 2026
b2696b9
Kafka Connect: fix table UUID check (#15011)
bryanck Jan 9, 2026
2c92500
Spark: Add location overlap validation for SnapshotTableAction (#14933)
varun-lakhyani Jan 10, 2026
b4bb71f
Spark: Backport #14933: Snapshot location overlap check to spark v3.4…
varun-lakhyani Jan 10, 2026
d671410
Build: Bump datamodel-code-generator from 0.52.1 to 0.52.2 (#15018)
dependabot[bot] Jan 11, 2026
7fd8a2d
Build: Bump io.grpc:grpc-netty-shaded from 1.76.2 to 1.78.0 (#15024)
dependabot[bot] Jan 11, 2026
12f9354
Build: Bump nessie from 0.106.0 to 0.106.1 (#15019)
dependabot[bot] Jan 11, 2026
a401603
Build: Bump com.google.errorprone:error_prone_annotations (#15020)
dependabot[bot] Jan 11, 2026
95d7405
Build: Bump junit-platform from 1.14.1 to 1.14.2 (#15021)
dependabot[bot] Jan 11, 2026
ccf4dfb
Build: Bump software.amazon.awssdk:bom from 2.41.1 to 2.41.5 (#15022)
dependabot[bot] Jan 11, 2026
cc966fc
Build: Bump org.immutables:value from 2.12.0 to 2.12.1 (#15026)
dependabot[bot] Jan 11, 2026
c73116a
Build: Bump orc from 1.9.7 to 1.9.8 (#15025)
dependabot[bot] Jan 11, 2026
b7d9817
Build: Bump junit from 5.14.1 to 5.14.2 (#15023)
dependabot[bot] Jan 11, 2026
4206122
Spark 4.1: Upgrade to Spark 4.1.1 (#14946)
manuzhang Jan 12, 2026
7f81e1e
Core: Use scan API to read partition stats (#14989)
gaborkaszab Jan 12, 2026
f8ee29e
Core: Drop support for Java 11 (#14400)
manuzhang Jan 12, 2026
cbf07cb
AWS, Azure, Core, GCP: Pass planId when refreshing vended credentials…
nastra Jan 13, 2026
42c7f47
Core, Data, Spark: Use partition stats scan API in tests (#14996)
gaborkaszab Jan 13, 2026
d5b83fc
Include key metadata in manifest tables (Spark 4.1) (#15041)
tom-s-powell Jan 13, 2026
243badb
site: Apache Iceberg Project News and Blog (#15013)
kevinjqliu Jan 13, 2026
779af12
add registeristration link closer to the top (#15044)
kevinjqliu Jan 13, 2026
bd96b79
Core, Hive: Detect if a view already exists when registering a table …
nastra Jan 14, 2026
046298f
Bump to Parquet 1.17.0 (#14924)
Fokko Jan 14, 2026
b62802f
Spark: Add test coverage for Hive View catalog (#15048)
nastra Jan 14, 2026
c1aed47
Spark 3.5,4.0: Add test coverage for Hive View catalog (#15052)
nastra Jan 14, 2026
035e0fb
REST Spec: clarify uniqueness of ETags for table metadata responses (…
danielcweeks Jan 15, 2026
38cc881
BigQuery: Eliminate redundant table load by using ETag for conflict d…
joyhaldar Jan 15, 2026
1d438fd
Set data file `sort_order_id` in manifest for writes from Spark
jbewing Nov 25, 2025
7f43292
Merge branch 'set-sort-order-id-from-spark' of https://github.com/jbe…
jbewing Jan 16, 2026
7 changes: 5 additions & 2 deletions .asf.yaml
@@ -33,14 +33,16 @@ github:
squash: true
rebase: true

pull_requests:
# auto-delete head branches after being merged
del_branch_on_merge: true

protected_branches:
main:
required_pull_request_reviews:
required_approving_review_count: 1

required_linear_history: true

del_branch_on_merge: true

features:
wiki: true
@@ -64,6 +66,7 @@ notifications:
commits: commits@iceberg.apache.org
issues: issues@iceberg.apache.org
pullrequests: issues@iceberg.apache.org
jobs: ci-jobs@iceberg.apache.org
jira_options: link label link label

publish:
2 changes: 2 additions & 0 deletions .baseline/checkstyle/checkstyle-suppressions.xml
@@ -55,6 +55,8 @@

<!-- Suppress checks for CometColumnReader -->
<suppress files="org.apache.iceberg.spark.data.vectorized.CometColumnReader" checks="IllegalImport"/>
<!-- Suppress checks for CometDeletedColumnVector -->
<suppress files="org.apache.iceberg.spark.data.vectorized.CometDeletedColumnVector" checks="IllegalImport"/>

<!-- Suppress TestClassNamingConvention for main source files -->
<suppress files=".*[/\\]src[/\\]main[/\\].*" id="TestClassNamingConvention" />
32 changes: 32 additions & 0 deletions .baseline/scala/.scala212fmt.conf
@@ -0,0 +1,32 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

version = 3.9.7

align = none
align.openParenDefnSite = false
align.openParenCallSite = false
align.tokens = []
importSelectors = "singleLine"
optIn = {
configStyleArguments = false
}
danglingParentheses.preset = false
docstrings.style = Asterisk
docstrings.wrap = false
maxColumn = 100
runner.dialect = scala212
32 changes: 32 additions & 0 deletions .baseline/scala/.scala213fmt.conf
@@ -0,0 +1,32 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

version = 3.9.7

align = none
align.openParenDefnSite = false
align.openParenCallSite = false
align.tokens = []
importSelectors = "singleLine"
optIn = {
configStyleArguments = false
}
danglingParentheses.preset = false
docstrings.style = Asterisk
docstrings.wrap = false
maxColumn = 100
runner.dialect = scala213
3 changes: 2 additions & 1 deletion .github/ISSUE_TEMPLATE/iceberg_bug_report.yml
@@ -34,7 +34,8 @@ body:
description: What Apache Iceberg version are you using?
multiple: false
options:
- "1.10.0 (latest release)"
- "1.10.1 (latest release)"
- "1.10.0"
- "1.9.2"
- "1.9.1"
- "1.9.0"
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/iceberg_question.yml
@@ -25,7 +25,7 @@ body:
- type: markdown
attributes:
value: |
Feel free to ask your question on [Slack](https://join.slack.com/t/apache-iceberg/shared_invite/zt-2561tq9qr-UtISlHgsdY3Virs3Z2_btQ) as well.
Feel free to ask your question on [Slack](https://join.slack.com/t/apache-iceberg/shared_invite/zt-3kclosz6r-3heAW3d~_PHefmN2A_~cAg) as well.

Do **NOT** share any sensitive information like passwords, security tokens, private URLs etc.
- type: textarea
3 changes: 0 additions & 3 deletions .github/dependabot.yml
@@ -24,9 +24,6 @@ updates:
schedule:
interval: "weekly"
day: "sunday"
ignore:
- dependency-name: "*"
update-types: ["version-update:semver-major"]
- package-ecosystem: "gradle"
directory: "/"
schedule:
11 changes: 9 additions & 2 deletions .github/workflows/api-binary-compatibility.yml
@@ -43,7 +43,7 @@ jobs:
revapi:
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
with:
# fetch-depth of zero ensures that the tags are pulled in and we're not in a detached HEAD state
# revapi depends on the tags, specifically the tag from git describe, to find the relevant override
@@ -55,10 +55,17 @@
with:
distribution: zulu
java-version: 17
- uses: actions/cache@v5
with:
path: |
~/.gradle/caches
~/.gradle/wrapper
key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle*', '**/gradle-wrapper.properties') }}
restore-keys: ${{ runner.os }}-gradle-
- run: |
echo "Using the old version tag, as per git describe, of $(git describe)";
- run: ./gradlew revapi --rerun-tasks
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v6
if: failure()
with:
name: test logs
17 changes: 9 additions & 8 deletions .github/workflows/delta-conversion-ci.yml
@@ -61,6 +61,7 @@ on:
- 'CONTRIBUTING.md'
- '**/LICENSE'
- '**/NOTICE'
- 'doap.rdf'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -71,16 +72,16 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
env:
SPARK_LOCAL_IP: localhost
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu
java-version: ${{ matrix.jvm }}
- uses: actions/cache@v4
- uses: actions/cache@v5
with:
path: |
~/.gradle/caches
@@ -89,7 +90,7 @@ jobs:
restore-keys: ${{ runner.os }}-gradle-
- run: echo -e "$(ip addr show eth0 | grep "inet\b" | awk '{print $2}' | cut -d/ -f1)\t$(hostname -f) $(hostname -s)" | sudo tee -a /etc/hosts
- run: ./gradlew -DsparkVersions=3.5 -DscalaVersion=2.12 -DkafkaVersions= -DflinkVersions= :iceberg-delta-lake:check -Pquick=true -x javadoc
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v6
if: failure()
with:
name: test logs
@@ -100,16 +101,16 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
env:
SPARK_LOCAL_IP: localhost
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu
java-version: ${{ matrix.jvm }}
- uses: actions/cache@v4
- uses: actions/cache@v5
with:
path: |
~/.gradle/caches
@@ -118,7 +119,7 @@ jobs:
restore-keys: ${{ runner.os }}-gradle-
- run: echo -e "$(ip addr show eth0 | grep "inet\b" | awk '{print $2}' | cut -d/ -f1)\t$(hostname -f) $(hostname -s)" | sudo tee -a /etc/hosts
- run: ./gradlew -DsparkVersions=3.5 -DscalaVersion=2.13 -DkafkaVersions= -DflinkVersions= :iceberg-delta-lake:check -Pquick=true -x javadoc
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v6
if: failure()
with:
name: test logs
4 changes: 2 additions & 2 deletions .github/workflows/docs-ci.yml
@@ -32,8 +32,8 @@ jobs:
matrix:
os: [ubuntu-latest, macos-latest]
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- uses: actions/checkout@v6
- uses: actions/setup-python@v6
with:
python-version: 3.x
- name: Build Iceberg documentation
9 changes: 5 additions & 4 deletions .github/workflows/flink-ci.yml
@@ -61,6 +61,7 @@ on:
- 'CONTRIBUTING.md'
- '**/LICENSE'
- '**/NOTICE'
- 'doap.rdf'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -73,17 +74,17 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
flink: ['1.20', '2.0', '2.1']
env:
SPARK_LOCAL_IP: localhost
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu
java-version: ${{ matrix.jvm }}
- uses: actions/cache@v4
- uses: actions/cache@v5
with:
path: |
~/.gradle/caches
@@ -92,7 +93,7 @@ jobs:
restore-keys: ${{ runner.os }}-gradle-
- run: echo -e "$(ip addr show eth0 | grep "inet\b" | awk '{print $2}' | cut -d/ -f1)\t$(hostname -f) $(hostname -s)" | sudo tee -a /etc/hosts
- run: ./gradlew -DsparkVersions= -DkafkaVersions= -DflinkVersions=${{ matrix.flink }} :iceberg-flink:iceberg-flink-${{ matrix.flink }}:check :iceberg-flink:iceberg-flink-runtime-${{ matrix.flink }}:check -Pquick=true -x javadoc -DtestParallelism=auto
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v6
if: failure()
with:
name: test logs
9 changes: 5 additions & 4 deletions .github/workflows/hive-ci.yml
@@ -62,6 +62,7 @@ on:
- 'CONTRIBUTING.md'
- '**/LICENSE'
- '**/NOTICE'
- 'doap.rdf'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -72,16 +73,16 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
env:
SPARK_LOCAL_IP: localhost
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu
java-version: ${{ matrix.jvm }}
- uses: actions/cache@v4
- uses: actions/cache@v5
with:
path: |
~/.gradle/caches
@@ -90,7 +91,7 @@ jobs:
restore-keys: ${{ runner.os }}-gradle-
- run: echo -e "$(ip addr show eth0 | grep "inet\b" | awk '{print $2}' | cut -d/ -f1)\t$(hostname -f) $(hostname -s)" | sudo tee -a /etc/hosts
- run: ./gradlew -DsparkVersions= -DflinkVersions= -DkafkaVersions= -Pquick=true :iceberg-mr:check -x javadoc
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v6
if: failure()
with:
name: test logs
17 changes: 9 additions & 8 deletions .github/workflows/java-ci.yml
@@ -57,6 +57,7 @@ on:
- 'CONTRIBUTING.md'
- '**/LICENSE'
- '**/NOTICE'
- 'doap.rdf'

concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -67,16 +68,16 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
env:
SPARK_LOCAL_IP: localhost
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu
java-version: ${{ matrix.jvm }}
- uses: actions/cache@v4
- uses: actions/cache@v5
with:
path: |
~/.gradle/caches
@@ -85,7 +86,7 @@ jobs:
restore-keys: ${{ runner.os }}-gradle-
- run: echo -e "$(ip addr show eth0 | grep "inet\b" | awk '{print $2}' | cut -d/ -f1)\t$(hostname -f) $(hostname -s)" | sudo tee -a /etc/hosts
- run: ./gradlew check -DsparkVersions= -DflinkVersions= -DkafkaVersions= -Pquick=true -x javadoc
- uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v6
if: failure()
with:
name: test logs
@@ -96,9 +97,9 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu
@@ -109,9 +110,9 @@ jobs:
runs-on: ubuntu-24.04
strategy:
matrix:
jvm: [11, 17, 21]
jvm: [17, 21]
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v6
- uses: actions/setup-java@v5
with:
distribution: zulu