Elasticsearch is built using the open-source Gradle build tool.
This document provides general guidelines for using and working on the Elasticsearch build logic.
The Elasticsearch project contains 3 build-related projects that are included into the Elasticsearch build as a composite build.
The `build-conventions` project contains build conventions that are applied to all Elasticsearch projects.
The `build-tools` project contains all build logic that we publish for third-party Elasticsearch plugin authors. We provide the following plugins:
- `elasticsearch.esplugin` - A Gradle plugin for building an Elasticsearch plugin.
- `elasticsearch.testclusters` - A Gradle plugin for setting up Elasticsearch clusters for testing within a build.
This project is published as part of the Elasticsearch release and is accessible via
`org.elasticsearch.gradle:build-tools:<versionNumber>`.
These build tools are also used by the elasticsearch-hadoop project maintained by Elastic.
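For a third-party plugin author, consuming these published plugins might look roughly like the sketch below; the version shown is illustrative, not a recommendation:

```groovy
plugins {
  // apply the published build-tools plugins; pick a released build-tools version
  id "elasticsearch.esplugin" version "8.13.0"
  id "elasticsearch.testclusters" version "8.13.0"
}
```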
The `build-tools-internal` project contains all Elasticsearch project-specific build logic that is not meant to be shared with other internal or external projects.
The Elasticsearch build uses several third-party Gradle plugins. All versions are centralized in
/gradle/build.versions.toml (version catalog).
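Entries in a Gradle version catalog can be consumed from build scripts through a typed accessor. A hedged sketch, assuming the catalog declared in /gradle/build.versions.toml is exposed under the name `buildLibs` and declares a hypothetical `spock-core` library:

```groovy
dependencies {
  // coordinates and version are resolved from gradle/build.versions.toml;
  // the catalog name (buildLibs) and library alias (spock.core) are illustrative
  testImplementation buildLibs.spock.core
}
```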
| Plugin ID | Purpose |
|---|---|
| `com.gradle.develocity` | Enables build scans and integration with Gradle Enterprise at gradle-enterprise.elastic.co. Provides build performance metrics, failure diagnostics, and CI integration for Buildkite. |
| `com.gradleup.nmcp.aggregation` | Aggregates all Maven artifacts from projects applying `elasticsearch.publish` for publishing to Maven Central via DRA infrastructure. |
| Plugin ID | Purpose |
|---|---|
| `com.netflix.nebula.ospackage-base` | Creates DEB and RPM Linux packages for Elasticsearch distribution. Handles package metadata, install/remove scripts, file permissions, and package signing. |
| `com.gradleup.shadow` | Creates fat JARs (uber-jars) by merging dependencies into a single JAR. Used for standalone CLI tools (plugin-cli, sql-cli) and the JDBC driver. |
| Plugin ID | Purpose |
|---|---|
| `de.thetaphi:forbiddenapis` | Static bytecode analysis that detects invocations of forbidden API methods. Ensures code doesn't use unsafe or deprecated JDK APIs. Integrated via ForbiddenApisPrecommitPlugin. |
| `org.apache.rat:apache-rat` | License header validation (Apache RAT - Release Audit Tool). Ensures all source files have proper license headers. |
| `com.diffplug.spotless` | Code formatting enforcement using the Eclipse JDT formatter. Provides spotlessJavaCheck and spotlessApply tasks. Configuration in build-conventions/formatterConfig.xml. |
| Plugin ID | Purpose |
|---|---|
| `com.netflix.nebula:gradle-info-plugin` | Automatically includes build metadata (git info, build time, Java version) in JAR manifests. Applied as nebula.info-broker, nebula.info-basic, nebula.info-java, nebula.info-jar. |
| `com.avast.gradle:docker-compose` | Manages Docker Compose environments for integration testing fixtures. Used by TestFixturesPlugin for test infrastructure (AWS, Azure, GCS, HDFS mocks). |
| `org.jetbrains.gradle.plugin.idea-ext` | Enhanced IntelliJ IDEA project configuration. Customizes IDE settings, JUnit configurations, and post-sync tasks. |
When creating a new subproject, choose the appropriate Elasticsearch plugin based on your project type:
| Project Type | Plugin to Apply | Example Projects |
|---|---|---|
| Core library | `elasticsearch.build` | `server`, `libs/*` |
| Module shipped with ES | `elasticsearch.internal-es-plugin` | `modules/*` |
| External plugin | `elasticsearch.internal-es-plugin` + `elasticsearch.publish` | `plugins/*` |
| X-Pack plugin | `elasticsearch.internal-es-plugin` + `elasticsearch.publish` | `x-pack/plugin/*` |
| YAML REST tests | `elasticsearch.internal-yaml-rest-test` | modules/plugins with REST APIs |
| Java REST tests | `elasticsearch.internal-java-rest-test` | modules/plugins needing Java test flexibility |
| Cluster integration tests | `elasticsearch.internal-cluster-test` | Projects testing cluster behavior |
| BWC/upgrade tests | `elasticsearch.bwc-test` or `elasticsearch.fwc-test` | `qa/rolling-upgrade`, `qa/full-cluster-restart` |
| Standalone QA project | `elasticsearch.standalone-rest-test` | `qa/*` subprojects |
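For instance, a new module shipped with Elasticsearch that also runs YAML REST tests might combine plugins along these lines. This is a hedged sketch; the `esplugin` properties shown are illustrative rather than an exact DSL reference:

```groovy
apply plugin: 'elasticsearch.internal-es-plugin'
apply plugin: 'elasticsearch.internal-yaml-rest-test'

// illustrative metadata; consult existing modules/* build scripts for the exact properties
esplugin {
  description = 'An example module'
  classname = 'org.elasticsearch.example.ExamplePlugin'
}
```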
This is an intentionally small set of guidelines for build users and authors
to ensure we keep the build consistent. We also publish Elasticsearch build logic
as build-tools to be usable by third-party Elasticsearch plugin authors. It is
also used by other Elastic teams, such as on elasticsearch-hadoop.
Breaking changes should therefore be avoided and an appropriate deprecation cycle
should be followed.
The Elasticsearch build usually uses the latest Gradle GA release. We stay as close to the latest Gradle releases as possible. In certain cases an update is blocked by a breaking behaviour in Gradle. We're usually in contact with the Gradle team here or working on a fix in our build logic to resolve this.
The Elasticsearch build will fail if any deprecated Gradle API is used.
Tony Robalik has compiled a good list of rules that aligns with our approach to writing and maintaining Elasticsearch Gradle build logic at http://autonomousapps.com/blog/rules-for-gradle-plugin-authors.html. Our current build does not yet tick off all those rules everywhere, but the ultimate goal is to follow these principles. Besides better readability and maintainability, following those rules also lets us adopt newer Gradle features that benefit performance and reliability, e.g. configuration cache support, project isolation, or predictive test selection.
There are a few guidelines to follow that should make your life easier to make changes to the Elasticsearch build.
Please add a member of the es-delivery team as a reviewer if you're making non-trivial changes to the build.
We rely on Gradle dependency verification to mitigate the security risks and avoid integrating compromised dependencies.
This requires third-party dependencies and their checksums to be listed in gradle/verification-metadata.xml.
For updated or newly added dependencies you need to add an entry to this verification file or update the existing one:
```xml
<component group="asm" name="asm" version="3.1">
  <artifact name="asm-3.1.jar">
    <sha256 value="333ff5369043975b7e031b8b27206937441854738e038c1f47f98d072a20437a" origin="official site"/>
  </artifact>
</component>
```
In case of updating a dependency, ensure to remove the unused entry of the outdated dependency manually from the verification-metadata.xml file.
You can also automate the generation of this entry by running your build using the `--write-verification-metadata` command-line option:
```shell
./gradlew --write-verification-metadata sha256 precommit
```
The --write-verification-metadata Gradle option is generally able to resolve reachable configurations,
but we use detached configurations for a certain set of plugins and tasks. Therefore, please ensure you run this option with a task that
uses the changed dependencies. In most cases, precommit or check are good candidates.
We prefer sha256 checksums, as md5 and sha1 are no longer considered safe. The generated entry
will have the origin attribute set to `Generated by Gradle`.
Tip
A manual confirmation of the Gradle generated checksums is currently not mandatory.
If you want to add a level of verification, you can manually confirm the checksum (e.g. by looking it up on the website of the library).
Please replace the content of the origin attribute with `official site` in that case.
Dependency management is a critical aspect of maintaining a secure and reliable build system, requiring explicit control over what we rely on. The Elasticsearch build mainly uses component metadata rules declared in the ComponentMetadataRulesPlugin
plugin to manage transitive dependencies and avoid version conflicts.
This approach ensures we have explicit control over all dependencies used in the build.
- Avoid unused transitive dependencies - Dependencies that are not actually used by our code should be excluded to reduce the attack surface and avoid potential conflicts.
- Prefer versions declared in `build-tools-internal/version.properties` - All dependency versions should be centrally managed in this file to ensure consistency across the entire build.
- Libraries required to compile our code should be direct dependencies - If we directly use a library in our source code, it should be declared as a direct dependency rather than relying on it being transitively available.
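As an illustrative sketch of the centrally managed versions, a property from `build-tools-internal/version.properties` can be referenced in a dependency declaration instead of hardcoding a version string. The `jackson` key and the `versions` map accessor below are illustrative assumptions:

```groovy
// build-tools-internal/version.properties (illustrative entry):
//   jackson = 2.17.2

dependencies {
  // references the centrally managed version rather than a hardcoded one
  api "com.fasterxml.jackson.core:jackson-core:${versions.jackson}"
}
```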
We currently use three main types of component metadata rules to manage transitive dependencies:

- `ExcludeAllTransitivesRule` - Excludes all transitive dependencies for libraries where we want complete control over dependencies or the transitive dependencies are unused.
- `ExcludeOtherGroupsTransitiveRule` - Excludes transitive dependencies that don't belong to the same group as the direct dependency, while keeping same-group dependencies.
- `ExcludeByGroup` - Excludes transitive dependencies that match a specific groupId while keeping all other transitive dependencies with different groupIds.
Examples from the ComponentMetadataRulesPlugin:

```java
// Exclude all transitives - used when transitive deps are unused or problematic
components.withModule("com.fasterxml.jackson.dataformat:jackson-dataformat-cbor", ExcludeAllTransitivesRule.class);

// Exclude other groups - used when we want same-group deps but not external ones
components.withModule("com.azure:azure-core", ExcludeOtherGroupsTransitiveRule.class);

// Exclude only specific groups - used when we want to exclude a specific group of transitive deps
components.withModule("org.apache.logging.log4j:log4j-api", ExcludeByGroup.class, rule -> {
    rule.params(List.of("biz.aQute.bnd", "org.osgi"));
});
```

Version conflicts: when a transitive dependency brings in a different version than what we use:

```java
// brings in jackson-databind and jackson-annotations, not used
components.withModule("com.fasterxml.jackson.dataformat:jackson-dataformat-cbor", ExcludeAllTransitivesRule.class);
```

Unused dependencies: when transitive dependencies are not actually used:

```java
// brings in azure-core-http-netty, not used
components.withModule("com.azure:azure-core-http-netty", ExcludeAllTransitivesRule.class);
```

Mismatching version dependencies: when other versions are required:

```java
// brings in org.slf4j:slf4j-api:1.7.25. We use 2.0.6
components.withModule("org.apache.directory.api:api-asn1-ber", ExcludeOtherGroupsTransitiveRule.class);
```

When adding or updating dependencies, ensure that any required transitive dependencies are either:
- Already available as direct dependencies with compatible versions
- Added as direct dependencies if they're actually used by our code
- Properly excluded if they're not needed
Build logic that is used across multiple subprojects should be considered for extraction
into a Gradle plugin with a corresponding Gradle task implementation.
Elasticsearch specific build logic is located in the build-tools-internal
subproject including integration tests.
- Gradle plugins and tasks should be written in Java.
- We use Groovy and Spock for setting up Gradle integration tests.
- For each Gradle plugin and Gradle task implementation (e.g.
  org.elasticsearch.gradle.internal.info.GlobalBuildInfoPlugin) we aim for:
  - a dedicated unit test class containing all related unit tests in src/test (e.g. GlobalBuildInfoPluginSpec) for basic unit testing of the plugin logic, and
  - a dedicated integration test class containing all related Gradle TestKit based integration tests in src/integTest (e.g. GlobalBuildInfoPluginFuncTest) for testing the plugin in a real Gradle build (based on build-tools/src/testFixtures/groovy/org/elasticsearch/gradle/fixtures/AbstractGradleFuncTest.groovy).
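A minimal functional test following this layout could look like the sketch below. It assumes the shared AbstractGradleFuncTest base class provides a `buildFile` property and a `gradleRunner` helper; the test class name and the plugin id are hypothetical:

```groovy
import org.gradle.testkit.runner.TaskOutcome

class MyPluginFuncTest extends AbstractGradleFuncTest {

    def "plugin can be applied"() {
        given:
        buildFile << """
            plugins {
              id 'elasticsearch.my-plugin' // hypothetical plugin id
            }
        """

        when:
        // run a real Gradle build via TestKit
        def result = gradleRunner("help").build()

        then:
        result.task(":help").outcome == TaskOutcome.SUCCESS
    }
}
```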
The Elasticsearch build makes use of the task avoidance API to keep the configuration time of the build low.
When declaring tasks (in build scripts or custom plugins) this means that we want to register a task like:
```groovy
tasks.register('someTask') { ... }
```
instead of eagerly creating the task:
```groovy
task someTask { ... }
```
The major difference between these two syntaxes is that the configuration block of a registered task is only executed when the task is actually created, because the build requires that task to run. The configuration block of an eagerly created task is executed immediately.
By doing less at Gradle configuration time (only creating tasks that are requested as part of the build, and only running the configuration blocks of those requested tasks), the task avoidance API plays a major part in keeping our build fast.
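The same lazy principle applies when configuring existing tasks: prefer `tasks.named(...).configure { }` over eager lookups such as `tasks.getByName(...)`. A small sketch with an illustrative task name:

```groovy
// lazy: this block only runs if the build actually needs someTask
tasks.named('someTask').configure {
  description = 'Example of lazy task configuration'
}

// eager: avoid this; it forces the task to be created and configured immediately
// tasks.getByName('someTask') { ... }
```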
When using the Elasticsearch test cluster plugin we want to use (similar to the task avoidance API) a Gradle API to create domain objects lazy or only if required by the build. Therefore we register test cluster by using the following syntax:
```groovy
def someClusterProvider = testClusters.register('someCluster') { ... }
```
This registers a potential test cluster named someCluster and provides a provider instance, but doesn't create or configure it yet. This makes the Gradle configuration phase more efficient by
doing less.
To wire this registered cluster into a TestClusterAware task (e.g. RestIntegTest) you can resolve the actual cluster from the provider instance:
```groovy
tasks.register('someClusterTest', RestIntegTestTask) {
  useCluster someClusterProvider
  nonInputProperties.systemProperty 'tests.leader_host', "${-> someClusterProvider.get().getAllHttpSocketURI().get(0)}"
}
```
Additional integration tests for a certain Elasticsearch module that are specific to a certain cluster configuration can be declared in a separate so-called qa subproject of your module.
The benefits of a dedicated project for these tests are:
- `qa` projects are dedicated to specific use-cases and easier to maintain
- It keeps the specific test logic separated from the common test logic.
- You can run those tests in parallel to other projects of the build.
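As an illustrative sketch (the path, project names, and configuration name are hypothetical), such a qa subproject could apply one of the test plugins listed earlier and depend on the module under test:

```groovy
// modules/my-module/qa/special-config/build.gradle (hypothetical path)
apply plugin: 'elasticsearch.internal-java-rest-test'

dependencies {
  // javaRestTestImplementation assumes the plugin above creates a javaRestTest source set
  javaRestTestImplementation project(':modules:my-module')
}
```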
Sometimes we want to share fixture code used by tests across multiple Gradle projects. There are two supported approaches, depending on where the shared classes live:

1. Prefer the built-in `java-test-fixtures` Gradle plugin when, and only when, the shared fixtures are placed in the dedicated `testFixtures` source set (for example `src/testFixtures/java`).

   In the providing project, apply the plugin and place shared code under `src/testFixtures/...`:

   ```groovy
   plugins {
     id 'java-test-fixtures'
   }
   ```

   In the consumer project you can then depend on those fixtures like this:

   ```groovy
   dependencies {
     testImplementation(testFixtures(project(":fixture-providing-project")))
   }
   ```

2. Use `elasticsearch.internal-test-artifact` for the common Elasticsearch case where fixtures and tests live in the same source set (for example `src/test/java`) and you need to share those `test` classes/resources with another project. This plugin provides an additional test artifact derived from the `test` source set, which can be resolved by the consumer project as shown below:

   ```groovy
   dependencies {
     // Add the `test` source set classes/resources from `:fixture-providing-project`.
     testImplementation(testArtifact(project(":fixture-providing-project")))
   }
   ```
This test artifact mechanism makes use of the concept of component capabilities
similar to how the Gradle built-in java-test-fixtures plugin works.
testArtifact(...) is a shortcut declared in the Elasticsearch build. Alternatively you can declare the dependency via an explicit capability requirement:
```groovy
dependencies {
  testImplementation(project(":fixture-providing-project")) {
    capabilities {
      requireCapabilities("${project(':fixture-providing-project').group}:fixture-providing-project-test-artifacts")
    }
  }
}
```

Several precommit tasks support project-specific configuration. Use the task avoidance API when configuring them.
Configure missing class and violation ignores when third-party dependencies use optional APIs:
```groovy
tasks.named("thirdPartyAudit").configure {
  // Ignore classes that are optional dependencies of our dependencies
  ignoreMissingClasses(
    'javax.servlet.ServletContextEvent',
    'org.apache.log.Logger'
  )

  // Ignore known-safe internal API usage in dependencies
  ignoreViolations(
    'com.google.common.hash.Striped64'
  )
}
```

Map related artifacts to a single license when dependencies are published as multiple JARs:
```groovy
tasks.named("dependencyLicenses").configure {
  mapping from: /lucene-.*/, to: 'lucene'
  mapping from: /netty-.*/, to: 'netty'
  mapping from: /jackson-.*/, to: 'jackson'
}
```

Exclude files that legitimately contain patterns detected as forbidden:
```groovy
tasks.named("forbiddenPatterns").configure {
  exclude '**/*.key'  // Test certificates
  exclude '**/*.p12'  // PKCS12 keystores
  exclude '**/*.json' // Test data files
}
```

Projects with REST APIs should declare which specs and tests they need:
```groovy
restResources {
  restApi {
    include '_common', 'cluster', 'indices', 'your_api_name'
  }
  restTests {
    includeCore 'your_api_tests'    // For core APIs
    includeXpack 'your_xpack_tests' // For X-Pack APIs
  }
}
```
}To test an unreleased development version of a third party dependency you have several options.
Currently only OpenJDK EA builds by Oracle are supported.
To test against an early access Java version, you can pass the major
Java version suffixed with -pre as a system property (e.g. -Druntime.java=26-pre) to the Gradle build:
```shell
./gradlew clean test -Druntime.java=26-pre
```
This will run the tests using the JDK 26 pre-release version, picking the latest available build of the matching JDK EA version we expose
in our custom JDK catalogue at https://builds.es-jdk-archive.com/jdks/openjdk/recent.json.
To run against a specific build number of the EA build you can pass a second system property (e.g. -Druntime.java.build=6):
```shell
./gradlew clean test -Druntime.java=26-pre -Druntime.java.build=6
```
- Clone the third party repository locally
- Run `mvn install` to install a copy into your `~/.m2/repository` folder.
- Add this to the root build script:
```groovy
allprojects {
  repositories {
    mavenLocal()
  }
}
```
- Update the version in your dependency declaration accordingly (likely a snapshot version)
- Run the Gradle build as needed
https://jitpack.io is an adhoc repository that supports building Maven projects transparently in the background when resolving unreleased snapshots from a GitHub repository. This approach also works as a temporary solution and is compliant with our CI builds.
- Add the JitPack repository to the root build file:
```groovy
allprojects {
  repositories {
    maven { url "https://jitpack.io" }
  }
}
```
- Add the dependency in the following format:

```groovy
dependencies {
  implementation 'com.github.User:Repo:Tag'
}
```
As the version you can also use a specific short commit hash or main-SNAPSHOT.
In addition to snapshot builds, JitPack supports building pull requests. Simply use PR<NR>-SNAPSHOT as the version.
- Run the Gradle build as needed. Keep in mind the initial resolution might take a bit longer, as the dependency needs to be built by JitPack in the background before it can be resolved.
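To illustrate the version forms mentioned above (the coordinates are hypothetical):

```groovy
dependencies {
  implementation 'com.github.User:Repo:v1.2.3'        // a released tag
  implementation 'com.github.User:Repo:abc1234'       // a specific short commit hash
  implementation 'com.github.User:Repo:main-SNAPSHOT' // latest snapshot of the default branch
}
```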
Note
You should only use that approach locally or on a developer branch for production dependencies as we do not want to ship unreleased libraries into our releases.
For third party libraries that are not built with Maven (e.g. Ant) or are only provided as a plain jar artifact, we can leverage a flat directory repository that resolves artifacts from a flat directory on your filesystem.
- Put the jar artifact with the format `artifactName-version.jar` into a directory named `localRepo` (you have to create this directory manually).
- Declare a flatDir repository in your root build.gradle file. This ensures all projects have the flatDir repository declared, and projects consuming the project you tweaked can also resolve that local dependency:
```groovy
allprojects {
  repositories {
    flatDir {
      dirs 'localRepo'
    }
  }
}
```
- Update the dependency declaration of the artifact in question to match the custom build version. For a file named e.g. `jmxri-1.2.1.jar` the dependency definition would be `x:jmxri:1.2.1`. As the group information is ignored on flatDir repositories, you can use an arbitrary group such as `x`:
```groovy
dependencies {
  implementation 'x:jmxri:1.2.1'
}
```
- Run the Gradle build as needed with `--write-verification-metadata` to ensure the Gradle dependency verification does not fail on your custom dependency:
```shell
# write verification metadata and run the precommit task
./gradlew --write-verification-metadata sha256 precommit
```

Note
As Gradle prefers to use modules whose descriptor has been created from real meta-data rather than being generated,
flat directory repositories cannot be used to override artifacts with real meta-data from other repositories declared in the build.
For example, if Gradle finds only jmxri-1.2.1.jar in a flat directory repository, but jmxri-1.2.1.pom in another repository
that supports meta-data, it will use the second repository to provide the module.
Therefore, it is recommended to declare a version that is not resolvable from the public repositories we use (e.g. Maven Central).