Conversation

@Komzpa Komzpa commented Aug 30, 2025

Summary

  • merge consecutive observations into a single episode when geometry and other properties remain unchanged
  • ensure wildfire and NHC combinators reuse base post-processing
  • document the merging rule and cover it with a unit test

Testing

  • make precommit

https://chatgpt.com/codex/tasks/task_e_68b35db34ef0832495c4a28c2ca88d9f

Summary by CodeRabbit

  • New Features

    • Episodes with identical geometry and properties are now automatically merged when time-adjacent, consolidating observations, timestamps, URLs and metadata; merging can consider a time tolerance.
    • Wildfire and NHC episode flows now benefit from the shared post-processing merge behavior.
  • Documentation

    • Expanded “Idealized workflow” episode rollup: folding observations, filling gaps from prior data, and gluing adjacent intervals.
  • Tests

    • Added/updated tests validating contiguous-episode merging and updated end/start/updated timestamps.


coderabbitai bot commented Aug 30, 2025

Walkthrough

Adds time-tolerant episode merging to EpisodeCombinator (new overload), runs superclass merging before subclass-specific adjustments in Wildfire and NHC combinators, expands documentation bullets for Episode rollup, and adds/updates unit and integration tests reflecting merged-episode behavior.
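
For orientation, the mechanism described above can be sketched as follows. This is a minimal, self-contained illustration rather than the project code: FeedEpisodeStub is a hypothetical stand-in for FeedEpisode carrying only timestamps, and the canMerge property check is elided to a comment. The zero-tolerance delegation, null-safe sort, and adjacency inequality match what the review below converges on.

import java.time.Duration;
import java.time.OffsetDateTime;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical, trimmed-down stand-in for the project's FeedEpisode.
class FeedEpisodeStub {
    OffsetDateTime startedAt;
    OffsetDateTime endedAt;

    FeedEpisodeStub(OffsetDateTime startedAt, OffsetDateTime endedAt) {
        this.startedAt = startedAt;
        this.endedAt = endedAt;
    }
}

class EpisodeMergeSketch {
    // The pre-existing entry point now delegates with zero tolerance,
    // so every caller gets the merge behavior by default.
    static List<FeedEpisodeStub> postProcessEpisodes(List<FeedEpisodeStub> episodes) {
        return postProcessEpisodes(episodes, Duration.ZERO);
    }

    // New overload: glue time-adjacent episodes within mergeTolerance.
    static List<FeedEpisodeStub> postProcessEpisodes(List<FeedEpisodeStub> episodes, Duration mergeTolerance) {
        if (episodes.size() < 2) {
            return episodes;
        }
        List<FeedEpisodeStub> sorted = new ArrayList<>(episodes);
        // Null-safe sort by startedAt, then endedAt.
        sorted.sort(Comparator
                .comparing((FeedEpisodeStub e) -> e.startedAt, Comparator.nullsLast(OffsetDateTime::compareTo))
                .thenComparing(e -> e.endedAt, Comparator.nullsLast(OffsetDateTime::compareTo)));
        List<FeedEpisodeStub> merged = new ArrayList<>();
        FeedEpisodeStub current = sorted.get(0);
        for (int i = 1; i < sorted.size(); i++) {
            FeedEpisodeStub next = sorted.get(i);
            // Adjacent when `next` starts no later than `current` ends plus the tolerance.
            boolean timeAdjacent = current.endedAt != null && next.startedAt != null
                    && !next.startedAt.isAfter(current.endedAt.plus(mergeTolerance));
            if (timeAdjacent) { // the real code additionally requires canMerge(current, next)
                if (next.endedAt != null && next.endedAt.isAfter(current.endedAt)) {
                    current.endedAt = next.endedAt; // union the time intervals
                }
            } else {
                merged.add(current);
                current = next;
            }
        }
        merged.add(current);
        return merged;
    }
}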

Changes

Cohort / File(s): Change summary

  • Core combinator merging
    src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java
    Added postProcessEpisodes(List<FeedEpisode>, Duration) and made the existing overload delegate to it. Implements sorting by startedAt/endedAt, time-adjacent merging using mergeTolerance, a merging predicate canMerge (type, severity, name, properName, description, location, loss, severityData, urls, geometries), interval unioning, propagation of the latest updatedAt/sourceUpdatedAt, merging of observations, unique URLs, and shallow merges of loss/severityData. Added geometriesEqual (JSON via Jackson) and urlsEqualIgnoringOrder. A sketch of the merge predicate follows this table.
  • Subclass: Wildfire
    src/main/java/io/kontur/eventapi/episodecomposition/WildfireEpisodeCombinator.java
    Now invokes super.postProcessEpisodes(...) first and reassigns episodes before performing Wildfire-specific ordering/overlap adjustments; the early return moved after superclass processing.
  • Subclass: NHC
    src/main/java/io/kontur/eventapi/nhc/episodecomposition/NhcEpisodeCombinator.java
    Now invokes super.postProcessEpisodes(...) and reassigns episodes before NHC-specific end-time propagation and single-episode handling.
  • Tests — episode combinator
    src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java
    New JUnit test asserting that two contiguous episodes with identical geometry/properties are merged into one episode with combined time bounds, aggregated observations, and the latest updatedAt/sourceUpdatedAt.
  • Tests — integration updates
    src/test/java/io/kontur/eventapi/firms/event/FirmsEventAndEpisodeCombinationsJobIT.java
    Updated assertions and expected timestamps/counts for the third event to reflect the changed episode aggregation after composition; added assertion messages and a check that no rollout events remain.
  • Tests — event combination
    src/test/java/io/kontur/eventapi/usgs/earthquake/event/UsgsEarthquakeEventCombinationJobTest.java
    New unit test verifying that EventCombinationJob combines multiple NormalizedObservations with the same externalEventId into a single KonturEvent using mocked DAOs.
  • Docs: Episode rollup
    docs/.../Versions and timelines
    Expanded the Idealized workflow episode rollup from one bullet into four: kept the original snapshot description and added bullets describing folding same-time observations into single Episodes, using previous observations to fill gaps, and gluing adjacent intervals when geometry/properties remain unchanged.
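
As referenced in the EpisodeCombinator row above, here is a hedged sketch of the merge predicate. EpisodeView is a hypothetical stand-in for FeedEpisode, with getters inferred from the field list in this summary; geometriesEqual is shown with order-insensitive JsonNode comparison, the variant a review comment below recommends over raw serialized-string equality.

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Objects;

// Hypothetical read-only view of the fields FeedEpisode exposes for merging.
interface EpisodeView {
    String getType();
    String getSeverity();
    String getName();
    String getProperName();
    String getDescription();
    String getLocation();
    Map<String, Object> getLoss();
    Map<String, Object> getSeverityData();
    List<String> getUrls();
    Object getGeometries();
}

class MergePredicateSketch {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Episodes may merge only when every descriptive field matches.
    static boolean canMerge(EpisodeView a, EpisodeView b) {
        return Objects.equals(a.getType(), b.getType())
                && Objects.equals(a.getSeverity(), b.getSeverity())
                && Objects.equals(a.getName(), b.getName())
                && Objects.equals(a.getProperName(), b.getProperName())
                && Objects.equals(a.getDescription(), b.getDescription())
                && Objects.equals(a.getLocation(), b.getLocation())
                && Objects.equals(a.getLoss(), b.getLoss())
                && Objects.equals(a.getSeverityData(), b.getSeverityData())
                && urlsEqualIgnoringOrder(a.getUrls(), b.getUrls())
                && geometriesEqual(a.getGeometries(), b.getGeometries());
    }

    // URL lists compare as sets, so ordering differences do not block a merge.
    static boolean urlsEqualIgnoringOrder(List<String> a, List<String> b) {
        if (a == null || b == null) {
            return a == b;
        }
        return new HashSet<>(a).equals(new HashSet<>(b));
    }

    // Serialize both geometries to JSON trees and compare; JSON object trees
    // ignore key order, unlike raw string comparison of serialized output.
    static boolean geometriesEqual(Object a, Object b) {
        if (a == null || b == null) {
            return a == b;
        }
        return MAPPER.valueToTree(a).equals(MAPPER.valueToTree(b));
    }
}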

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant Caller as Caller
  participant Sub as Subclass (Wildfire/NHC)
  participant Base as EpisodeCombinator

  Caller->>Sub: postProcessEpisodes(episodes)
  activate Sub
  Sub->>Base: super.postProcessEpisodes(episodes)
  activate Base
  Base->>Base: sort by startedAt, endedAt
  loop adjacent pairs
    Base->>Base: canMerge(prev,next)? (fields + urls + geometry)
    alt mergeable (time-adjacent within tolerance)
      Base->>Base: merge intervals, combine observations, merge URLs/loss/severityData, propagate latest updatedAt/sourceUpdatedAt
    else not mergeable
      Base->>Base: flush prev to result
    end
  end
  Base-->>Sub: mergedEpisodes
  deactivate Base

  alt subclass adjustments
    Sub->>Sub: apply domain-specific fixes (overlap/end-time propagation)
  end
  Sub-->>Caller: finalEpisodes
  deactivate Sub

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes



@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 8

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
docs/Versions and timelines (1)

10-10: Fix typos

“currenly” → “currently”; stray “l” after the bitemporal link; also “data mode” → “data model”.

- * Event Feeds can be limited to be built only from a subset of source data that consumer asked for (currenly only per source provider).
+ * Event Feeds can be limited to be built only from a subset of source data that consumer asked for (currently only per source provider).
-To solve this we want to employ [bitemporal data mode](https://en.wikipedia.org/wiki/Temporal_database#Bi-Temporal "https://en.wikipedia.org/wiki/Temporal_database#Bi-Temporal")l. Valid time is the axis of Episodes, System time is the axis of Versions. Each feed has independent view on the data in the database.
+To solve this we want to employ the [bitemporal data model](https://en.wikipedia.org/wiki/Temporal_database#Bi-Temporal "https://en.wikipedia.org/wiki/Temporal_database#Bi-Temporal"). Valid time is the axis of Episodes, System time is the axis of Versions. Each feed has an independent view on the data in the database.

Also applies to: 19-19

src/main/java/io/kontur/eventapi/episodecomposition/WildfireEpisodeCombinator.java (1)

19-28: Null-safety: avoid NPEs on startedAt/endedAt during sort and overlap adjust

Episodes with null timestamps will NPE in comparator or isBefore/isAfter. Make sorting and overlap checks null-safe.

-        episodes.sort(comparing(FeedEpisode::getStartedAt).thenComparing(FeedEpisode::getEndedAt));
+        episodes.sort(
+                comparing(FeedEpisode::getStartedAt,
+                        java.util.Comparator.nullsLast(java.time.OffsetDateTime::compareTo))
+                .thenComparing(FeedEpisode::getEndedAt,
+                        java.util.Comparator.nullsLast(java.time.OffsetDateTime::compareTo))
+        );
         OffsetDateTime lastEndedAt = null;
         for (FeedEpisode episode : episodes) {
-            if (lastEndedAt != null
-                    && lastEndedAt.isAfter(episode.getStartedAt())
-                    && lastEndedAt.isBefore(episode.getEndedAt())) {
+            if (lastEndedAt != null
+                    && episode.getStartedAt() != null
+                    && episode.getEndedAt() != null
+                    && lastEndedAt.isAfter(episode.getStartedAt())
+                    && lastEndedAt.isBefore(episode.getEndedAt())) {
                 episode.setStartedAt(lastEndedAt);
             }
-            lastEndedAt = episode.getEndedAt();
+            lastEndedAt = episode.getEndedAt() != null ? episode.getEndedAt() : lastEndedAt;
         }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between cadb093 and ab1e04d.

📒 Files selected for processing (5)
  • docs/Versions and timelines (1 hunks)
  • src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java (1 hunks)
  • src/main/java/io/kontur/eventapi/episodecomposition/WildfireEpisodeCombinator.java (1 hunks)
  • src/main/java/io/kontur/eventapi/nhc/episodecomposition/NhcEpisodeCombinator.java (1 hunks)
  • src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java (1 hunks)
🧰 Additional context used
📓 Path-based instructions (2)
**/*.java

📄 CodeRabbit inference engine (AGENTS.md)

**/*.java: Write enough comments in Java so Python/PostGIS-proficient engineers can understand the code
Handle exceptions properly; do not just ignore them

Files:

  • src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java
  • src/main/java/io/kontur/eventapi/nhc/episodecomposition/NhcEpisodeCombinator.java
  • src/main/java/io/kontur/eventapi/episodecomposition/WildfireEpisodeCombinator.java
  • src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java
docs/**

📄 CodeRabbit inference engine (AGENTS.md)

docs/**: Ensure docs/ reflects reality; fix everything outdated in docs/
Consult docs/ to understand the project
Check data schema as described in docs/

Files:

  • docs/Versions and timelines
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build
🔇 Additional comments (3)
src/main/java/io/kontur/eventapi/episodecomposition/WildfireEpisodeCombinator.java (1)

14-17: Run superclass post-processing first — good

Super call ensures merge/dedupe happens before wildfire-specific adjustments.

src/main/java/io/kontur/eventapi/nhc/episodecomposition/NhcEpisodeCombinator.java (2)

220-220: Run superclass post-processing first — good

Keeps NHC pipeline consistent with base merge rules before per-source shaping.


225-241: Confirm null-safety and intended zero-duration behavior

  • Verify that FeedEpisode.getStartedAt() cannot return null; if it can, guard all setEndedAt(…) calls and use a Comparator.nullsLast(…) for the sort.
  • Confirm that the business requirement for NHC is to set the final episode’s endedAt to its startedAt (zero duration).
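
The guards this comment asks about could look like the following sketch, reusing the hypothetical FeedEpisodeStub from the sketch near the top of this page. The loop shape is an assumption (the review only names "end-time propagation" and the zero-duration final episode); the null checks and nullsLast sort are the protections being requested.

import java.time.OffsetDateTime;
import java.util.Comparator;
import java.util.List;

class NhcEndTimeSketch {
    static void propagateEndTimes(List<FeedEpisodeStub> episodes) {
        // Null-safe ordering before propagation, as the comment suggests.
        episodes.sort(Comparator.comparing((FeedEpisodeStub e) -> e.startedAt,
                Comparator.nullsLast(OffsetDateTime::compareTo)));
        for (int i = 0; i < episodes.size(); i++) {
            // Each episode ends where the next begins; the last episode gets
            // endedAt == startedAt, the zero-duration case the comment flags.
            OffsetDateTime end = (i + 1 < episodes.size())
                    ? episodes.get(i + 1).startedAt
                    : episodes.get(i).startedAt;
            if (end != null) { // guard the endedAt assignment against null timestamps
                episodes.get(i).endedAt = end;
            }
        }
    }
}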

Comment on lines 38 to 69
        List<FeedEpisode> merged = new ArrayList<>();
        FeedEpisode current = sorted.get(0);
        for (int i = 1; i < sorted.size(); i++) {
            FeedEpisode next = sorted.get(i);
            if (canMerge(current, next)) {
                if (next.getStartedAt().isBefore(current.getStartedAt())) {
                    current.setStartedAt(next.getStartedAt());
                }
                if (next.getEndedAt().isAfter(current.getEndedAt())) {
                    current.setEndedAt(next.getEndedAt());
                }
                if (next.getUpdatedAt() != null
                        && (current.getUpdatedAt() == null || next.getUpdatedAt().isAfter(current.getUpdatedAt()))) {
                    current.setUpdatedAt(next.getUpdatedAt());
                }
                if (next.getSourceUpdatedAt() != null
                        && (current.getSourceUpdatedAt() == null
                        || next.getSourceUpdatedAt().isAfter(current.getSourceUpdatedAt()))) {
                    current.setSourceUpdatedAt(next.getSourceUpdatedAt());
                }
                current.addObservations(next.getObservations());
                current.addUrlIfNotExists(next.getUrls());
                current.getLoss().putAll(next.getLoss());
                current.getSeverityData().putAll(next.getSeverityData());
            } else {
                merged.add(current);
                current = next;
            }
        }
        merged.add(current);
        return merged;
    }

🧹 Nitpick (assertive)

Confirm intended time-gluing semantics

Merges happen regardless of gaps between episodes as long as properties/geometry match. That aligns with “consecutive observations” across time, but verify this is desired for all feeds. If continuity is required, add adjacency check (e.g., next.startedAt <= current.endedAt plus tolerance).

🤖 Prompt for AI Agents
In src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java
around lines 38 to 69, the current merging loop unconditionally merges any two
episodes that match properties/geometry even if there is a time gap; confirm
whether this "time-gluing" behaviour is intended for all feeds and, if
continuity is required, add a temporal adjacency check before merging (e.g.,
require next.getStartedAt().isBefore(current.getEndedAt().plus(tolerance)) or
next.getStartedAt().compareTo(current.getEndedAt()) <= tolerance), implemented
either by extending the existing canMerge(...) to take a Duration tolerance
parameter or by adding a separate check in the loop; make the change
configurable (pass tolerance or disable adjacency) and update unit tests
accordingly.

        assertTrue(mergedEpisode.getObservations().containsAll(Arrays.asList(obs1, obs2)),
                "Merged episode must include all observations");
    }
}

🧹 Nitpick (assertive)

🛠️ Refactor suggestion

Add a test: geometry/property order should not block merge

Currently JSON string equality is order-sensitive; switch to order-insensitive comparison (see EpisodeCombinator comment) and add a test to prevent regressions.

// New test method (add to this class)
@Test
void mergesWhenGeometryPropertiesOrderDiffers() {
    var start = OffsetDateTime.parse("2023-08-01T00:00:00Z");
    var end = OffsetDateTime.parse("2023-08-01T01:00:00Z");

    var props1 = new java.util.LinkedHashMap<String, Object>();
    props1.put("a", 1); props1.put("b", 2);
    var props2 = new java.util.LinkedHashMap<String, Object>();
    props2.put("b", 2); props2.put("a", 1);

    Feature f1 = new Feature(new Point(new double[]{10d,10d}), props1);
    Feature f2 = new Feature(new Point(new double[]{10d,10d}), props2);

    FeedEpisode e1 = new FeedEpisode(); e1.setStartedAt(start); e1.setEndedAt(end);
    e1.setUpdatedAt(start); e1.setSourceUpdatedAt(start);
    e1.setGeometries(new FeatureCollection(new Feature[]{f1}));
    e1.addObservation(UUID.randomUUID());

    FeedEpisode e2 = new FeedEpisode(); e2.setStartedAt(end); e2.setEndedAt(end);
    e2.setUpdatedAt(end); e2.setSourceUpdatedAt(end);
    e2.setGeometries(new FeatureCollection(new Feature[]{f2}));
    e2.addObservation(UUID.randomUUID());

    List<FeedEpisode> merged = combinator.postProcessEpisodes(Arrays.asList(e1, e2));
    assertEquals(1, merged.size(), "Episodes should merge despite property order differences");
}
🤖 Prompt for AI Agents
In
src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java
around line 58, add a new test method named
mergesWhenGeometryPropertiesOrderDiffers that constructs two FeedEpisode
instances whose geometry Feature properties have the same key/value pairs but
different insertion orders (use LinkedHashMap for props1 and props2), sets
appropriate start/end/updated timestamps and geometries (Point in
FeatureCollection), calls combinator.postProcessEpisodes on the two episodes,
and asserts the result size is 1 to ensure ordering of JSON object properties
does not block merging; place the test method in this test class and import any
missing test utilities (e.g., @Test, UUID, Arrays, List) as needed.

Komzpa and others added 2 commits August 30, 2025 23:55
…-and-null-checks

fix(episode): enforce temporal adjacency and handle null data

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

♻️ Duplicate comments (4)
docs/Versions and timelines (1)

40-40: Docs: explicit property list matches implementation — good.
The field list (type, severity, name, properName, description, location, loss, severityData, urls) aligns with canMerge(...).

src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java (1)

16-68: Add test: geometry property order (and URL order) shouldn’t block merge.
Pair with geometry comparison fix in EpisodeCombinator.

@@
 class DefaultEpisodeCombinatorTest {
@@
     void mergesEpisodesWithUnchangedGeometryAndProperties() {
         ...
     }
+
+    @Test
+    void mergesWhenGeometryAndUrlOrderDiffers() {
+        var t0 = OffsetDateTime.parse("2023-08-01T00:00:00Z");
+        var t1 = OffsetDateTime.parse("2023-08-01T01:00:00Z");
+        var urls1 = Arrays.asList("b", "a");
+        var urls2 = Arrays.asList("a", "b");
+
+        var props1 = new java.util.LinkedHashMap<String, Object>(); props1.put("a", 1); props1.put("b", 2);
+        var props2 = new java.util.LinkedHashMap<String, Object>(); props2.put("b", 2); props2.put("a", 1);
+        FeatureCollection g1 = new FeatureCollection(new Feature[]{ new Feature(new Point(new double[]{10d,10d}), props1)});
+        FeatureCollection g2 = new FeatureCollection(new Feature[]{ new Feature(new Point(new double[]{10d,10d}), props2)});
+
+        FeedEpisode e1 = new FeedEpisode(); e1.setStartedAt(t0); e1.setEndedAt(t1); e1.setUpdatedAt(t0); e1.setSourceUpdatedAt(t0);
+        e1.setGeometries(g1); e1.addUrlIfNotExists(urls1); e1.addObservation(UUID.randomUUID());
+        FeedEpisode e2 = new FeedEpisode(); e2.setStartedAt(t1); e2.setEndedAt(t1); e2.setUpdatedAt(t1); e2.setSourceUpdatedAt(t1);
+        e2.setGeometries(g2); e2.addUrlIfNotExists(urls2); e2.addObservation(UUID.randomUUID());
+
+        List<FeedEpisode> merged = combinator.postProcessEpisodes(Arrays.asList(e1, e2));
+        assertEquals(1, merged.size(), "Episodes should merge despite geometry/URL order differences");
+    }
src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java (2)

40-46: Null-safe sort: LGTM.
Using nullsLast for startedAt/endedAt addresses prior NPE risk.


123-134: Geometry comparison is order-sensitive; compare JSON trees instead.
String equality breaks on property order/formatting; use JsonNode equality.

-    private boolean geometriesEqual(FeatureCollection a, FeatureCollection b) {
+    private boolean geometriesEqual(FeatureCollection a, FeatureCollection b) {
         if (a == null && b == null) {
             return true;
         }
         if (a == null || b == null) {
             return false;
         }
-        try {
-            return MAPPER.writeValueAsString(a).equals(MAPPER.writeValueAsString(b));
-        } catch (JsonProcessingException e) {
-            return false;
-        }
+        return MAPPER.valueToTree(a).equals(MAPPER.valueToTree(b));
     }

Note: remove the unused JsonProcessingException import.

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between ab1e04d and 6a3cf75.

📒 Files selected for processing (3)
  • docs/Versions and timelines (1 hunks)
  • src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java (1 hunks)
  • src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java (1 hunks)
🔇 Additional comments (2)
src/test/java/io/kontur/eventapi/episodecomposition/DefaultEpisodeCombinatorTest.java (1)

20-67: Happy-path merge test looks solid.
Covers interval expansion, obs union, and latest updatedAt/sourceUpdatedAt propagation.

src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java (1)

31-33: Ensure new default merging is intended at all call sites
Review usages in FeedCompositionJob, NhcEpisodeCombinator, and WildfireEpisodeCombinator—now that postProcessEpisodes(List) delegates to the merging overload with zero tolerance, verify none relied on the old no-op behavior.

* A job that publishes a Snapshot of Event into Feed. For each feed, for each event, pulls in observations visible for the feed for the event. Checks if list of observations matches one in the last published version. If lists don't match (new observation in event, change in access restrictions, observation manually unlinked) the new version is produced and stored in output feed.
* While producing the new Snapshot of event, the Observations that relate to the same time are folded together into one single Episode.
* Data from previous Observations is generally used to fill in the gaps in newer ones.
* When geometry and properties such as type, severity, name, properName, description, location, loss, severityData, urls remain unchanged and observations are adjacent in time, their intervals are glued and a single Episode spans the whole range.

🧹 Nitpick (assertive)

Define “adjacent in time” precisely to reflect mergeTolerance.
Clarify adjacency semantics to match code (next.start <= current.end + tolerance; default 0s).

-  * When geometry and properties such as type, severity, name, properName, description, location, loss, severityData, urls remain unchanged and observations are adjacent in time, their intervals are glued and a single Episode spans the whole range.
+  * When geometry and the listed properties remain unchanged and observations are adjacent in time (next.startedAt <= current.endedAt + tolerance; default tolerance 0s), their intervals are glued and a single Episode spans the whole range.
📝 Committable suggestion


Suggested change
* When geometry and properties such as type, severity, name, properName, description, location, loss, severityData, urls remain unchanged and observations are adjacent in time, their intervals are glued and a single Episode spans the whole range.
* When geometry and the listed properties remain unchanged and observations are adjacent in time (next.startedAt <= current.endedAt + tolerance; default tolerance 0s), their intervals are glued and a single Episode spans the whole range.
🤖 Prompt for AI Agents
In docs/Versions and timelines around line 40, the phrase "adjacent in time" is
ambiguous; update the sentence to explicitly define adjacency to match the code:
two observations are adjacent if next.start <= current.end + mergeTolerance
(where mergeTolerance defaults to 0s), and state that mergeTolerance is
configurable and expressed in seconds (or the units used by the code). Ensure
the revised sentence replaces the vague phrase and includes the exact inequality
and default value.

Comment on lines +52 to +55
            boolean timeAdjacent = mergeTolerance == null
                    || (current.getEndedAt() != null && next.getStartedAt() != null
                    && !next.getStartedAt().isAfter(current.getEndedAt().plus(mergeTolerance)));
            if (timeAdjacent && canMerge(current, next)) {

🛠️ Refactor suggestion

Guard against null mergeTolerance to avoid unlimited gluing.
Current logic treats null as “no adjacency check”, which may unintentionally merge across gaps.

-            boolean timeAdjacent = mergeTolerance == null
-                    || (current.getEndedAt() != null && next.getStartedAt() != null
-                    && !next.getStartedAt().isAfter(current.getEndedAt().plus(mergeTolerance)));
+            java.util.Objects.requireNonNull(mergeTolerance, "mergeTolerance must not be null");
+            boolean timeAdjacent = current.getEndedAt() != null && next.getStartedAt() != null
+                    && !next.getStartedAt().isAfter(current.getEndedAt().plus(mergeTolerance));
📝 Committable suggestion


Suggested change
-            boolean timeAdjacent = mergeTolerance == null
-                    || (current.getEndedAt() != null && next.getStartedAt() != null
-                    && !next.getStartedAt().isAfter(current.getEndedAt().plus(mergeTolerance)));
-            if (timeAdjacent && canMerge(current, next)) {
+            java.util.Objects.requireNonNull(mergeTolerance, "mergeTolerance must not be null");
+            boolean timeAdjacent = current.getEndedAt() != null && next.getStartedAt() != null
+                    && !next.getStartedAt().isAfter(current.getEndedAt().plus(mergeTolerance));
+            if (timeAdjacent && canMerge(current, next)) {
🤖 Prompt for AI Agents
In src/main/java/io/kontur/eventapi/episodecomposition/EpisodeCombinator.java
around lines 52 to 55, the current condition treats a null mergeTolerance as
allowing unlimited adjacency which can merge across gaps; change the logic to
explicitly require a non-null mergeTolerance before considering time adjacency
(e.g., only compute timeAdjacent when mergeTolerance != null and both timestamps
are present), or default null to Duration.ZERO if intended, so that null no
longer disables the adjacency check and prevents unintended gluing.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 5

📜 Review details


📥 Commits

Reviewing files that changed from the base of the PR and between 6a3cf75 and 4c0557a.

📒 Files selected for processing (2)
  • src/test/java/io/kontur/eventapi/firms/event/FirmsEventAndEpisodeCombinationsJobIT.java (1 hunks)
  • src/test/java/io/kontur/eventapi/usgs/earthquake/event/UsgsEarthquakeEventCombinationJobTest.java (1 hunks)
🔇 Additional comments (3)
src/test/java/io/kontur/eventapi/usgs/earthquake/event/UsgsEarthquakeEventCombinationJobTest.java (1)

23-38: LGTM: focused scenario setup

Clear, minimal setup for two observations with the same externalEventId under a single provider.

src/test/java/io/kontur/eventapi/firms/event/FirmsEventAndEpisodeCombinationsJobIT.java (2)

157-166: LGTM: expectations updated to new merging behavior

The assertions align with the merged-episode semantics and include helpful messages.


198-199: LGTM: rollout queue drained check

Good assertion that no events remain for rollout after composition.

Comment on lines 167 to 172
         TestEventDto someFedData = firmsUpdated.get(2);
-        assertEquals(5, someFedData.getObservations().size());
-        assertEquals(parse("2020-11-02T11:50Z"),someFedData.getStartedAt());
-        assertEquals(parse("2020-11-03T22:50Z"),someFedData.getEndedAt());
-        assertEquals(2, someFedData.getVersion());
+        assertEquals(5, someFedData.getObservations().size(), "Third event should contain five observations");
+        assertEquals(parse("2020-11-02T11:50Z"), someFedData.getStartedAt(), "Unexpected start time for third event");
+        assertEquals(parse("2020-11-03T22:50Z"), someFedData.getEndedAt(), "Unexpected end time for third event");
+        assertEquals(2, someFedData.getVersion(), "Third event version mismatch");


🧹 Nitpick (assertive)

Name nit: clarify subject under test

Rename someFedData to thirdEvent for readability.

-        TestEventDto someFedData = firmsUpdated.get(2);
-        assertEquals(5, someFedData.getObservations().size(), "Third event should contain five observations");
-        assertEquals(parse("2020-11-02T11:50Z"), someFedData.getStartedAt(), "Unexpected start time for third event");
-        assertEquals(parse("2020-11-03T22:50Z"), someFedData.getEndedAt(), "Unexpected end time for third event");
-        assertEquals(2, someFedData.getVersion(), "Third event version mismatch");
+        TestEventDto thirdEvent = firmsUpdated.get(2);
+        assertEquals(5, thirdEvent.getObservations().size(), "Third event should contain five observations");
+        assertEquals(parse("2020-11-02T11:50Z"), thirdEvent.getStartedAt(), "Unexpected start time for third event");
+        assertEquals(parse("2020-11-03T22:50Z"), thirdEvent.getEndedAt(), "Unexpected end time for third event");
+        assertEquals(2, thirdEvent.getVersion(), "Third event version mismatch");
📝 Committable suggestion


Suggested change
-        TestEventDto someFedData = firmsUpdated.get(2);
-        assertEquals(5, someFedData.getObservations().size(), "Third event should contain five observations");
-        assertEquals(parse("2020-11-02T11:50Z"), someFedData.getStartedAt(), "Unexpected start time for third event");
-        assertEquals(parse("2020-11-03T22:50Z"), someFedData.getEndedAt(), "Unexpected end time for third event");
-        assertEquals(2, someFedData.getVersion(), "Third event version mismatch");
+        TestEventDto thirdEvent = firmsUpdated.get(2);
+        assertEquals(5, thirdEvent.getObservations().size(), "Third event should contain five observations");
+        assertEquals(parse("2020-11-02T11:50Z"), thirdEvent.getStartedAt(), "Unexpected start time for third event");
+        assertEquals(parse("2020-11-03T22:50Z"), thirdEvent.getEndedAt(), "Unexpected end time for third event");
+        assertEquals(2, thirdEvent.getVersion(), "Third event version mismatch");
🤖 Prompt for AI Agents
In
src/test/java/io/kontur/eventapi/firms/event/FirmsEventAndEpisodeCombinationsJobIT.java
around lines 167 to 172, the local variable someFedData is poorly named; rename
it to thirdEvent and update all references in this block to improve readability
(e.g., TestEventDto thirdEvent = firmsUpdated.get(2); and replace
someFedData.get... with thirdEvent.get...), keeping the assertions and values
unchanged.

Comment on lines +174 to 196
+        assertEquals(3, episodes.size(), "Expected three episodes in third event");

         episodes.sort(Comparator.comparing(TestEpisodeDto::getSourceUpdatedAt));

-        assertEquals(2, episodes.get(0).getObservations().size());
-        assertEquals(parse("2020-11-02T11:50Z"), episodes.get(0).getSourceUpdatedAt());
-        assertEquals(parse("2020-11-02T11:50Z"), episodes.get(0).getStartedAt());
-        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(0).getEndedAt());
-        assertEquals(Severity.MINOR, episodes.get(0).getSeverity());
-
-        assertEquals(3, episodes.get(1).getObservations().size());
-        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(1).getSourceUpdatedAt());
-        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(1).getStartedAt());
-        assertEquals(parse("2020-11-02T14:50Z"), episodes.get(1).getEndedAt());
-        assertEquals(Severity.MINOR, episodes.get(1).getSeverity());
-
-        assertEquals(4, episodes.get(2).getObservations().size());
-        assertEquals(parse("2020-11-02T14:50Z"), episodes.get(2).getSourceUpdatedAt());
-        assertEquals(parse("2020-11-02T14:50Z"), episodes.get(2).getStartedAt());
-        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(2).getEndedAt());
-        assertEquals(Severity.MINOR, episodes.get(2).getSeverity());
-
-        assertEquals(5, episodes.get(3).getObservations().size());
-        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(3).getSourceUpdatedAt());
-        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(3).getStartedAt());
-        assertEquals(parse("2020-11-03T22:50Z"), episodes.get(3).getEndedAt());
-        assertEquals(Severity.MINOR, episodes.get(3).getSeverity());
+        assertEquals(2, episodes.get(0).getObservations().size(), "Episode 1 should contain two observations");
+        assertEquals(parse("2020-11-02T11:50Z"), episodes.get(0).getSourceUpdatedAt(), "Episode 1 source update mismatch");
+        assertEquals(parse("2020-11-02T11:50Z"), episodes.get(0).getStartedAt(), "Episode 1 start time mismatch");
+        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(0).getEndedAt(), "Episode 1 end time mismatch");
+        assertEquals(Severity.MINOR, episodes.get(0).getSeverity(), "Episode 1 severity mismatch");
+
+        assertEquals(4, episodes.get(1).getObservations().size(),
+                "Episode 2 should contain four observations after second feed composition run");
+        assertEquals(parse("2020-11-02T14:50Z"), episodes.get(1).getSourceUpdatedAt(), "Episode 2 source update mismatch");
+        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(1).getStartedAt(), "Episode 2 start time mismatch");
+        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(1).getEndedAt(), "Episode 2 end time mismatch");
+        assertEquals(Severity.MINOR, episodes.get(1).getSeverity(), "Episode 2 severity mismatch");
+
+        assertEquals(5, episodes.get(2).getObservations().size(), "Episode 3 should contain five observations");
+        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(2).getSourceUpdatedAt(), "Episode 3 source update mismatch");
+        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(2).getStartedAt(), "Episode 3 start time mismatch");
+        assertEquals(parse("2020-11-03T22:50Z"), episodes.get(2).getEndedAt(), "Episode 3 end time mismatch");
+        assertEquals(Severity.MINOR, episodes.get(2).getSeverity(), "Episode 3 severity mismatch");


🧹 Nitpick (assertive)

Reduce duplication and hard-coding in episode assertions

Factor repeated assertions into a small helper and assert timeline monotonicity/contiguity to make the test more robust to non-functional changes.

-        List<TestEpisodeDto> episodes = someFedData.getEpisodes();
+        List<TestEpisodeDto> episodes = thirdEvent.getEpisodes();
         assertEquals(3, episodes.size(), "Expected three episodes in third event");
         episodes.sort(Comparator.comparing(TestEpisodeDto::getSourceUpdatedAt));
-        assertEquals(2, episodes.get(0).getObservations().size(), "Episode 1 should contain two observations");
-        assertEquals(parse("2020-11-02T11:50Z"), episodes.get(0).getSourceUpdatedAt(), "Episode 1 source update mismatch");
-        assertEquals(parse("2020-11-02T11:50Z"), episodes.get(0).getStartedAt(), "Episode 1 start time mismatch");
-        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(0).getEndedAt(), "Episode 1 end time mismatch");
-        assertEquals(Severity.MINOR, episodes.get(0).getSeverity(), "Episode 1 severity mismatch");
-
-        assertEquals(4, episodes.get(1).getObservations().size(),
-                "Episode 2 should contain four observations after second feed composition run");
-        assertEquals(parse("2020-11-02T14:50Z"), episodes.get(1).getSourceUpdatedAt(), "Episode 2 source update mismatch");
-        assertEquals(parse("2020-11-02T12:50Z"), episodes.get(1).getStartedAt(), "Episode 2 start time mismatch");
-        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(1).getEndedAt(), "Episode 2 end time mismatch");
-        assertEquals(Severity.MINOR, episodes.get(1).getSeverity(), "Episode 2 severity mismatch");
-
-        assertEquals(5, episodes.get(2).getObservations().size(), "Episode 3 should contain five observations");
-        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(2).getSourceUpdatedAt(), "Episode 3 source update mismatch");
-        assertEquals(parse("2020-11-02T22:50Z"), episodes.get(2).getStartedAt(), "Episode 3 start time mismatch");
-        assertEquals(parse("2020-11-03T22:50Z"), episodes.get(2).getEndedAt(), "Episode 3 end time mismatch");
-        assertEquals(Severity.MINOR, episodes.get(2).getSeverity(), "Episode 3 severity mismatch");
+        assertEpisode(episodes.get(0), 2, "2020-11-02T11:50Z", "2020-11-02T11:50Z", "2020-11-02T12:50Z", Severity.MINOR, 1);
+        assertEpisode(episodes.get(1), 4, "2020-11-02T14:50Z", "2020-11-02T12:50Z", "2020-11-02T22:50Z", Severity.MINOR, 2);
+        assertEpisode(episodes.get(2), 5, "2020-11-02T22:50Z", "2020-11-02T22:50Z", "2020-11-03T22:50Z", Severity.MINOR, 3);
+
+        // Invariant: sorted episodes must have non-decreasing startedAt and endedAt, and no gaps if merged contiguously
+        for (int i = 1; i < episodes.size(); i++) {
+            assertTrue(!episodes.get(i).getStartedAt().isBefore(episodes.get(i - 1).getStartedAt()),
+                    "Episodes should be in non-decreasing chronological order by start");
+            assertTrue(!episodes.get(i).getEndedAt().isBefore(episodes.get(i).getStartedAt()),
+                    "Episode end must not be before start");
+        }

Add this helper in the test class:

private static void assertEpisode(TestEpisodeDto e, int obsCount,
                                  String srcUpd, String start, String end,
                                  Severity severity, int idx) {
    assertEquals(obsCount, e.getObservations().size(), "Episode " + idx + " obs count");
    assertEquals(parse(srcUpd), e.getSourceUpdatedAt(), "Episode " + idx + " sourceUpdatedAt");
    assertEquals(parse(start), e.getStartedAt(), "Episode " + idx + " startedAt");
    assertEquals(parse(end), e.getEndedAt(), "Episode " + idx + " endedAt");
    assertEquals(severity, e.getSeverity(), "Episode " + idx + " severity");
}
🤖 Prompt for AI Agents
In
src/test/java/io/kontur/eventapi/firms/event/FirmsEventAndEpisodeCombinationsJobIT.java
around lines 174 to 196, the test repeats nearly identical assertions for each
episode and hard-codes expected values; factor these into a small helper to
reduce duplication and improve readability and maintainability. Add a private
static helper method in the test class that takes the TestEpisodeDto, expected
observation count, sourceUpdatedAt, startedAt, endedAt, expected Severity and
episode index, and inside the helper call the existing assertEquals checks
(parsing the date strings with parse and using descriptive messages including
the index). Replace the three blocks of repeated assertions with three calls to
this helper passing the appropriate expected values for each episode. Ensure
imports remain available and run the test to verify no behavior changed.

Comment on lines +39 to +41
        Mockito.when(observationsDao.getObservationsNotLinkedToEvent(Mockito.anyList()))
                .thenReturn(List.of(first, second));


🧹 Nitpick (assertive)

Broaden stub to mirror real call shape and reduce brittleness

If getObservationsNotLinkedToEvent filters by providers, consider stubbing with specific argument matching for ["usgs.earthquake"] to ensure future changes don’t break this test silently.

-Mockito.when(observationsDao.getObservationsNotLinkedToEvent(Mockito.anyList()))
+Mockito.when(observationsDao.getObservationsNotLinkedToEvent(Mockito.eq(List.of("usgs.earthquake"))))
         .thenReturn(List.of(first, second));
📝 Committable suggestion


Suggested change
-        Mockito.when(observationsDao.getObservationsNotLinkedToEvent(Mockito.anyList()))
-                .thenReturn(List.of(first, second));
+        Mockito.when(observationsDao.getObservationsNotLinkedToEvent(Mockito.eq(List.of("usgs.earthquake"))))
+                .thenReturn(List.of(first, second));
🤖 Prompt for AI Agents
In
src/test/java/io/kontur/eventapi/usgs/earthquake/event/UsgsEarthquakeEventCombinationJobTest.java
around lines 39 to 41, the stub uses Mockito.anyList() which is too broad and
brittle; change the stub to match the real call shape by specifying the provider
list (e.g., ["usgs.earthquake"]) using Mockito.eq(...) or Mockito.argThat(...)
so getObservationsNotLinkedToEvent is only stubbed when called with the expected
providers list and will fail if signature/arguments change unexpectedly.

Comment on lines +43 to +52
        Mockito.when(eventsDao.getEventByExternalId(Mockito.anyString()))
                .thenAnswer(inv -> Optional.ofNullable(events.get(inv.getArgument(0))));
        Mockito.doAnswer(inv -> {
            UUID eventId = inv.getArgument(0);
            NormalizedObservation obs = inv.getArgument(1);
            events.computeIfAbsent(obs.getExternalEventId(), id -> new KonturEvent(eventId))
                    .addObservations(obs.getObservationId());
            return null;
        }).when(eventsDao).appendObservationIntoEvent(Mockito.any(UUID.class), Mockito.any(NormalizedObservation.class));


🛠️ Refactor suggestion

⚠️ Potential issue

Test can pass even if two distinct events are created for the same external ID

You aggregate results by externalEventId in a map, so if the job mistakenly creates two different eventIds for "eq1", this test would still pass. Capture eventIds used in appendObservationIntoEvent and assert they’re identical and non-null.

Apply this diff after job.run() and add the import:

+import org.mockito.ArgumentCaptor;
 job.run();

-assertEquals(2, events.get("eq1").getObservationIds().size(),
-        "Merged earthquake event should contain two observations sharing external ID 'eq1'");
+assertEquals(2, events.get("eq1").getObservationIds().size(),
+        "Merged earthquake event should contain two observations sharing external ID 'eq1'");
+
+// Ensure both observations were appended into the same event ID and none is null
+ArgumentCaptor<UUID> eventIdCaptor = ArgumentCaptor.forClass(UUID.class);
+Mockito.verify(eventsDao, Mockito.times(2))
+        .appendObservationIntoEvent(eventIdCaptor.capture(), Mockito.any(NormalizedObservation.class));
+assertEquals(1, new java.util.HashSet<>(eventIdCaptor.getAllValues()).size(),
+        "Both observations must be appended to the same event");
+assertTrue(eventIdCaptor.getAllValues().stream().allMatch(java.util.Objects::nonNull),
+        "Event IDs must be non-null");
📝 Committable suggestion


Suggested change
        Mockito.when(eventsDao.getEventByExternalId(Mockito.anyString()))
                .thenAnswer(inv -> Optional.ofNullable(events.get(inv.getArgument(0))));
        Mockito.doAnswer(inv -> {
            UUID eventId = inv.getArgument(0);
            NormalizedObservation obs = inv.getArgument(1);
            events.computeIfAbsent(obs.getExternalEventId(), id -> new KonturEvent(eventId))
                    .addObservations(obs.getObservationId());
            return null;
        }).when(eventsDao).appendObservationIntoEvent(Mockito.any(UUID.class), Mockito.any(NormalizedObservation.class));

// At the top of the file, alongside the other imports:
import org.mockito.ArgumentCaptor;
...
@Test
void testCombineObservationsIntoSingleEvent() {
    // existing setup for eventsDao and events map
    ...
    job.run();
    assertEquals(2, events.get("eq1").getObservationIds().size(),
            "Merged earthquake event should contain two observations sharing external ID 'eq1'");

    // Ensure both observations were appended into the same event ID and none is null
    ArgumentCaptor<UUID> eventIdCaptor = ArgumentCaptor.forClass(UUID.class);
    Mockito.verify(eventsDao, Mockito.times(2))
            .appendObservationIntoEvent(eventIdCaptor.capture(), Mockito.any(NormalizedObservation.class));
    assertEquals(1, new java.util.HashSet<>(eventIdCaptor.getAllValues()).size(),
            "Both observations must be appended to the same event");
    assertTrue(eventIdCaptor.getAllValues().stream().allMatch(java.util.Objects::nonNull),
            "Event IDs must be non-null");
}
🤖 Prompt for AI Agents
In
src/test/java/io/kontur/eventapi/usgs/earthquake/event/UsgsEarthquakeEventCombinationJobTest.java
around lines 43 to 52, the test allows two different internal eventIds to be
created for the same externalEventId because assertions only check aggregation
by external ID; after job.run() capture the UUID arguments passed to
eventsDao.appendObservationIntoEvent (e.g., collect them into a list via Mockito
ArgumentCaptor or a captured collection in the test) and assert that all
captured eventIds are non-null and equal (single distinct UUID) to ensure the
job uses the same internal event for the same external ID; add the required
import for ArgumentCaptor (or java.util.List/UUID if using a collection) and
place these assertions after job.run().

Comment on lines +59 to +60
        ReflectionTestUtils.setField(job, "sequentialProviders", new String[]{"usgs.earthquake"});


💡 Verification agent

❓ Verification inconclusive

Avoid reflection for wiring providers

Reflection on the private field sequentialProviders is brittle. Prefer constructor or setter injection (even package-private) so tests don't depend on internal field names: replace the ReflectionTestUtils.setField(…) call in UsgsEarthquakeEventCombinationJobTest with a direct call to a new package-private setter or constructor argument on EventCombinationJob. Would you like a follow-up PR to add setSequentialProviders(String[]) (or a constructor parameter) and update this test?
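
A sketch of the injection this comment proposes (names hypothetical; the real EventCombinationJob has more collaborators):

// Minimal illustration of a package-private setter on the production class.
class EventCombinationJob {
    private String[] sequentialProviders;

    // Package-private: visible to same-package tests, invisible outside the
    // package, so no public API change is introduced.
    void setSequentialProviders(String[] providers) {
        this.sequentialProviders = providers;
    }
}

The test would then call job.setSequentialProviders(new String[]{"usgs.earthquake"}) instead of reflecting on a private field name.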

🤖 Prompt for AI Agents
In
src/test/java/io/kontur/eventapi/usgs/earthquake/event/UsgsEarthquakeEventCombinationJobTest.java
around lines 59-60, the test uses ReflectionTestUtils.setField(...) to set the
private sequentialProviders field; replace that by adding a package-private
setter or constructor parameter on the EventCombinationJob class (e.g., void
setSequentialProviders(String[])) and update this test to call the new setter
(or pass the array via the constructor) instead of using reflection. Ensure the
new setter/constructor is package-private so tests in the same package can
access it and no production-visible API changes are introduced.

@Komzpa Komzpa merged commit 5907c5d into master Aug 31, 2025
7 checks passed
@Komzpa Komzpa deleted the codex/adjust-feed-composition-rules branch August 31, 2025 10:08