Add Docker build and per-profile image tooling #436
Summary
This PR adds a production-ready Docker build for LST-Bench and documentation to streamline local usage and Docker Hub publishing. It also introduces a Makefile that builds/pushes per-profile images so the correct JDBC drivers are included.
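A minimal sketch of the kind of Dockerfile this describes, assuming a multi-stage layout; the image names, paths, and entrypoint class here are illustrative assumptions, not the PR's actual file:

```dockerfile
# Build stage: compile LST-Bench with the requested Maven profiles
FROM maven:3.9-eclipse-temurin-11 AS build
ARG MAVEN_PROFILES=spark-jdbc
WORKDIR /src
COPY . .
RUN mvn -B -P "${MAVEN_PROFILES}" -DskipTests package

# Runtime stage: slim JRE image with a non-root user
FROM eclipse-temurin:11-jre
RUN useradd --create-home lst-bench
USER lst-bench
WORKDIR /work
COPY --from=build /src/target/ /opt/lst-bench/
# Main class is an assumption for illustration
ENTRYPOINT ["java", "-cp", "/opt/lst-bench/*", "com.microsoft.lst_bench.Driver"]
```

Building per profile means the JDBC drivers pulled in by that Maven profile end up in the image, which is why a single generic image would not work.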
Changes
- Dockerfile with non-root runtime and optional Maven profile support (MAVEN_PROFILES).
- .dockerignore to keep the build context lean.
- Makefile with build-&lt;profile&gt;/push-&lt;profile&gt; targets and build-all/push-all.
- README.md with Docker build/run instructions and a Spark Thrift example.

Motivation / Context
Running LST-Bench from Docker reduces setup time: no local Maven build is needed, and each per-profile image ships with the correct JDBC drivers.
How to Test
Docker Hub image
For convenience, I have already built and pushed the image to my Docker Hub repo.
docker run --rm \
  -v "$PWD/config":/work/config:ro \
  ikyrannas/lst-bench:spark-jdbc \
  -c /work/config/connections.yaml \
  -e /work/config/experiment.yaml \
  -l /work/config/library.yaml \
  -t /work/config/telemetry.yaml \
  -w /work/config/workload.yaml

Local build
docker build -t lst-bench:spark-jdbc --build-arg MAVEN_PROFILES=spark-jdbc .

docker run --rm \
  -v "$PWD/config":/work/config:ro \
  lst-bench:spark-jdbc \
  -c /work/config/connections.yaml \
  -e /work/config/experiment.yaml \
  -l /work/config/library.yaml \
  -t /work/config/telemetry.yaml \
  -w /work/config/workload.yaml

Notes
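The build-&lt;profile&gt;/push-&lt;profile&gt; targets mentioned above could be implemented with GNU Make pattern rules along these lines; this is a sketch under assumed variable names, not the PR's actual Makefile:

```makefile
IMAGE    ?= lst-bench
PROFILES ?= spark-jdbc trino-jdbc

# build-<profile>: build an image tagged with the profile name,
# forwarding the profile to the Maven build inside the Dockerfile
build-%:
	docker build -t $(IMAGE):$* --build-arg MAVEN_PROFILES=$* .

# push-<profile>: build first, then push the tagged image
push-%: build-%
	docker push $(IMAGE):$*

build-all: $(addprefix build-,$(PROFILES))
push-all:  $(addprefix push-,$(PROFILES))

.PHONY: build-all push-all
```

With this layout, `make build-trino-jdbc` produces `lst-bench:trino-jdbc` without any per-profile duplication in the Makefile.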
Images are built per profile (spark-jdbc, trino-jdbc, etc.) to ensure the correct JDBC driver set is included.

Checklist