Conversation
📝 Walkthrough

DB migration SQL and the SQL generator were changed so that `genscript_upsert_kind` and `genscript_upsert_source_kind` return SMALLINT IDs instead of void; callers now capture and propagate those IDs. Exception paths that raised on missing kinds were replaced with upsert calls. An integration test validating extension data initialization was added.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Generator as SQL Generator
    participant Migrator as Migration SQL
    participant DB as Postgres
    Note over Generator,Migrator: Codegen emits upsert helpers that return IDs
    Generator->>DB: Emit genscript_upsert_kind / genscript_upsert_source_kind (RETURN SMALLINT)
    Migrator->>DB: CALL genscript_upsert_kind(name)
    DB-->>Migrator: kind_id (inserted or existing)
    Migrator->>DB: INSERT/UPSERT dependent rows USING kind_id
    DB-->>Migrator: confirmation / inserted ids
    Migrator->>DB: CALL genscript_upsert_source_kind(name)
    DB-->>Migrator: source_kind_id
    Migrator->>DB: INSERT/UPSERT dependent rows USING source_kind_id
```
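The flow above hinges on upsert helpers that return the row ID instead of void. A minimal PL/pgSQL sketch of that pattern, assuming a simple `kind(id SMALLINT, name TEXT UNIQUE)` table; the real generated helpers and table definitions live in the migration SQL and may differ:

```sql
-- Sketch only: the table and column names here are assumptions, not the
-- generated schema. The ON CONFLICT ... DO UPDATE clause guarantees that
-- RETURNING yields a row even when the name already exists (DO NOTHING
-- would return no row on conflict, leaving kind_id NULL).
CREATE OR REPLACE FUNCTION genscript_upsert_kind(kind_name TEXT)
RETURNS SMALLINT AS $$
DECLARE
    kind_id SMALLINT;
BEGIN
    INSERT INTO kind (name)
    VALUES (kind_name)
    ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name
    RETURNING id INTO kind_id;

    RETURN kind_id;
END;
$$ LANGUAGE plpgsql;
```

Because the insert is idempotent, calling the helper twice with the same name returns the same ID, which is what allows the migration to drop its previous `RAISE EXCEPTION` path.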
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs
Suggested reviewers
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@cmd/api/src/database/migration/extensions_integration_test.go`:
- Around line 116-121: Rename the method receiver from "suite" to "s" on the
IntegrationTestSuite teardown method (teardownIntegrationTestSuite) to follow
receiver naming guidelines; update all references inside the method (e.g.,
s.BHDatabase, s.Context and the s.BHDatabase.Close call) so the function
compiles and behavior remains the same.
- Around line 44-114: Both setupIntegrationTestSuite and getPostgresConfig
declare variables mid-function; hoist those declarations into a top-level var
block at the start of each function (e.g. declare ctx, connConf, gormDB, err in
setupIntegrationTestSuite and config, environmentMap, err in getPostgresConfig)
and keep only assignments/initializations inline where necessary; update uses in
functions (calls to database.OpenDatabase, db.Migrate, db.PopulateExtensionData,
utils.LoadIntegrationTestConfig, parsing loop) to work with the hoisted vars.
ℹ️ Review info
Configuration used: Repository YAML (base), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (4)
- cmd/api/src/database/migration/extensions/ad_graph_schema.sql
- cmd/api/src/database/migration/extensions/az_graph_schema.sql
- cmd/api/src/database/migration/extensions_integration_test.go
- packages/go/schemagen/generator/sql.go
Actionable comments posted: 1
♻️ Duplicate comments (1)
cmd/api/src/database/migration/extensions_integration_test.go (1)
82-90: Hoist variable declarations into a `var` block in `getPostgresConfig`.

`config`, `err`, and `environmentMap` are still declared inline with `:=` rather than in a top-level `var ( ... )` block.

♻️ Proposed fix

```diff
 func getPostgresConfig(t *testing.T) pgtestdb.Config {
 	t.Helper()
-	config, err := utils.LoadIntegrationTestConfig()
+	var (
+		config         *utils.IntegrationTestConfig // adjust type as needed
+		environmentMap = make(map[string]string)
+		err            error
+	)
+
+	config, err = utils.LoadIntegrationTestConfig()
 	require.NoError(t, err)
-	environmentMap := make(map[string]string)
 	for _, entry := range strings.Fields(config.Database.Connection) {
```

As per coding guidelines: Group variable initializations in a `var ( ... )` block and hoist them to the top of the function when possible.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cmd/api/src/database/migration/extensions_integration_test.go` around lines 82 - 90, Hoist the local declarations for config, err, and environmentMap into a top-of-function var block in getPostgresConfig; replace the inline short-declarations (:=) with the var block like "var ( config <type> ; err error ; environmentMap map[string]string )" and then assign to them using "=" when calling utils.LoadIntegrationTestConfig() and setting environmentMap = make(map[string]string) before populating it. Ensure all subsequent uses of config, err, and environmentMap in getPostgresConfig remain unchanged.
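The guideline behind this comment can be sketched outside the project: declarations grouped in one `var` block at the top of the function and assigned with `=` below, instead of scattered `:=` declarations. The helper name and the key=value connection format below are illustrative, not the project's actual API:

```go
package main

import (
	"fmt"
	"strings"
)

// parseConnection demonstrates the hoisted-var-block style: all locals are
// declared once up front, and the loop body only assigns to them. The
// key=value format mirrors a libpq-style connection string (assumption).
func parseConnection(conn string) map[string]string {
	var (
		environmentMap = make(map[string]string)
		parts          []string
	)

	for _, entry := range strings.Fields(conn) {
		if parts = strings.SplitN(entry, "=", 2); len(parts) == 2 {
			environmentMap[parts[0]] = parts[1]
		}
	}

	return environmentMap
}

func main() {
	env := parseConnection("host=localhost port=5432 user=bh dbname=bloodhound")
	fmt.Println(env["host"], env["port"]) // localhost 5432
}
```

The behavior is identical to the `:=` version; the style simply makes every local's scope and zero value visible at the top of the function.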
🧹 Nitpick comments (1)
cmd/api/src/database/migration/extensions_integration_test.go (1)
127-166: Hoist `err`, `extensions`, and `totalRecords` into the existing `var` block; add a comment explaining the intentional double call.

Two issues:

- `err` (line 134) and `extensions`, `totalRecords`, `err` (line 138) are declared mid-function with `:=` instead of being declared in the top-level `var` block.
- `PopulateExtensionData` is already called once inside `setupIntegrationTestSuite` (line 65) and then called again here (line 134). The second call appears intentional, verifying that the fixed SQL is idempotent, but there is no comment explaining this, which will confuse future readers.

♻️ Proposed fix

```diff
 func TestExtensions_GetOnStartExtensionData(t *testing.T) {
 	var (
-		testSuite = setupIntegrationTestSuite(t)
+		testSuite    = setupIntegrationTestSuite(t)
+		extensions   []model.GraphSchemaExtension // adjust type as needed
+		totalRecords int
+		err          error
 	)
 	defer testSuite.teardownIntegrationTestSuite(t)

-	err := testSuite.BHDatabase.PopulateExtensionData(testSuite.Context)
+	// Call PopulateExtensionData a second time to validate idempotency (the core fix: no RAISE EXCEPTION on re-run).
+	err = testSuite.BHDatabase.PopulateExtensionData(testSuite.Context)
 	require.NoError(t, err)

 	// Validate Both Schema Extensions Exist
-	extensions, totalRecords, err := testSuite.BHDatabase.GetGraphSchemaExtensions(testSuite.Context, model.Filters{}, model.Sort{}, 0, 0)
+	extensions, totalRecords, err = testSuite.BHDatabase.GetGraphSchemaExtensions(testSuite.Context, model.Filters{}, model.Sort{}, 0, 0)
```

As per coding guidelines: Group variable initializations in a `var ( ... )` block and hoist them to the top of the function when possible.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cmd/api/src/database/migration/extensions_integration_test.go` around lines 127 - 166, Hoist the local variables err, extensions, and totalRecords into the existing var block at the top of TestExtensions_GetOnStartExtensionData instead of declaring them inline with :=; after moving them, replace the mid-function short declarations with assignments (e.g., err = ... and extensions, totalRecords, err = ...). Also add a brief comment above the second call to PopulateExtensionData explaining the call is intentional to verify idempotency (i.e., the SQL can be run twice safely). Update references to GetGraphSchemaExtensions and PopulateExtensionData accordingly.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@cmd/api/src/database/migration/extensions_integration_test.go`:
- Around line 168-188: The default branches in validateSourceKind and
validateEnvironmentKind currently call t.Error which only marks failure and
allows execution to continue; replace those calls with a fatal failure (e.g.,
t.Fatalf or require.FailNow) so the test stops immediately on an unknown
extensionName. Update the default case in validateSourceKind (function
validateSourceKind) and in validateEnvironmentKind (function
validateEnvironmentKind) to use t.Fatalf("Invalid extension name %s",
extensionName) or require.FailNow with a message to prevent subsequent
misleading failures.
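The distinction this prompt draws (`t.Error` marks failure but continues; `t.Fatalf` stops the test immediately) can be sketched outside the testing package with a validator whose default branch is a hard error. The extension-to-kind mapping below is hypothetical, not the project's actual data:

```go
package main

import "fmt"

// validateSourceKind mirrors the recommended shape of the test helper: known
// extension names map to an expected source kind, and anything else is a hard
// failure (the analogue of t.Fatalf) rather than a soft one that would let a
// test keep running on bad input. The kind names are illustrative.
func validateSourceKind(extensionName string) (string, error) {
	switch extensionName {
	case "AD":
		return "ADBase", nil
	case "AZ":
		return "AZBase", nil
	default:
		return "", fmt.Errorf("invalid extension name %s", extensionName)
	}
}

func main() {
	if kind, err := validateSourceKind("AD"); err == nil {
		fmt.Println(kind)
	}
	if _, err := validateSourceKind("GCP"); err != nil {
		fmt.Println("stopping:", err)
	}
}
```

In a real test body, the caller would translate the error into `t.Fatalf` so no later assertions run against an unrecognized extension.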
---
Duplicate comments:
In `@cmd/api/src/database/migration/extensions_integration_test.go`:
- Around line 82-90: Hoist the local declarations for config, err, and
environmentMap into a top-of-function var block in getPostgresConfig; replace
the inline short-declarations (:=) with the var block like "var ( config <type>
; err error ; environmentMap map[string]string )" and then assign to them using
"=" when calling utils.LoadIntegrationTestConfig() and setting environmentMap =
make(map[string]string) before populating it. Ensure all subsequent uses of
config, err, and environmentMap in getPostgresConfig remain unchanged.
---
Nitpick comments:
In `@cmd/api/src/database/migration/extensions_integration_test.go`:
- Around line 127-166: Hoist the local variables err, extensions, and
totalRecords into the existing var block at the top of
TestExtensions_GetOnStartExtensionData instead of declaring them inline with :=;
after moving them, replace the mid-function short declarations with assignments
(e.g., err = ... and extensions, totalRecords, err = ...). Also add a brief
comment above the second call to PopulateExtensionData explaining the call is
intentional to verify idempotency (i.e., the SQL can be run twice safely).
Update references to GetGraphSchemaExtensions and PopulateExtensionData
accordingly.
📒 Files selected for processing (1)
cmd/api/src/database/migration/extensions_integration_test.go
wes-mil
left a comment
nice improvements 😄 (would like someone from opengraph to also review as usual)
Actionable comments posted: 1
🧹 Nitpick comments (1)
cmd/api/src/database/migration/extensions_integration_test.go (1)
82-90: Fail fast when required DB connection keys are missing.

If `database.connection` is malformed, this helper currently continues with empty fields and fails later with less actionable DB errors. Add required-key assertions right after parsing.

🔧 Suggested hardening

```diff
 func getPostgresConfig(t *testing.T) pgtestdb.Config {
 	t.Helper()
 	config, err := utils.LoadIntegrationTestConfig()
 	require.NoError(t, err)
 	environmentMap := make(map[string]string)
 	for _, entry := range strings.Fields(config.Database.Connection) {
 		if parts := strings.SplitN(entry, "=", 2); len(parts) == 2 {
 			environmentMap[parts[0]] = parts[1]
 		}
 	}
+
+	for _, key := range []string{"host", "user", "password", "dbname"} {
+		require.NotEmpty(t, environmentMap[key], "missing %q in database.connection", key)
+	}
+	if !strings.HasPrefix(environmentMap["host"], "/") {
+		require.NotEmpty(t, environmentMap["port"], "missing %q in database.connection", "port")
+	}
 	if strings.HasPrefix(environmentMap["host"], "/") {
```

Also applies to: 92-116
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@cmd/api/src/database/migration/extensions_integration_test.go` around lines 82 - 90, After parsing config.Database.Connection into environmentMap in the test (after the loop that builds environmentMap), immediately assert presence of the required DB keys (e.g., "host", "port", "user", "password", "dbname" or whatever keys your integration tests expect) and fail the test with require.NoError/require.NotEmpty-style assertions if any are missing; update the same check pattern also for the later parsing block around lines 92-116. Locate the parsing logic that calls utils.LoadIntegrationTestConfig and fills environmentMap and add explicit require.NotContains/require.NotEmpty/require.True checks referencing environmentMap["host"], environmentMap["port"], environmentMap["user"], environmentMap["password"], environmentMap["dbname"] (adjust keys to match your config) so the test fails fast with a clear message when keys are missing.
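A standalone sketch of the fail-fast idea, independent of the testing framework: collect every missing key first so the caller can abort with one clear message instead of an opaque driver error later. The required-key list follows libpq conventions and is an assumption about what this project needs:

```go
package main

import "fmt"

// missingKeys reports which required connection keys are absent or empty in
// the parsed key=value map. Returning all missing keys at once gives a more
// actionable failure message than stopping at the first one.
func missingKeys(env map[string]string, required []string) []string {
	var missing []string
	for _, key := range required {
		if env[key] == "" {
			missing = append(missing, key)
		}
	}
	return missing
}

func main() {
	env := map[string]string{"host": "localhost", "user": "bh"}
	fmt.Println(missingKeys(env, []string{"host", "user", "password", "dbname"}))
}
```

In the test helper, a non-empty result would feed directly into a single `t.Fatalf` (or `require.Empty`) before any connection attempt.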
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@cmd/api/src/database/migration/extensions_integration_test.go`:
- Around line 127-130: The test TestExtensions_GetOnStartExtensionData currently
only checks record count and per-row fields; update it (and the similar block
around the function starting at lines 142–164) to explicitly assert the exact
set of extension names {"AD","AZ"} is present. After extracting the rows (e.g.,
the slice/variable built from the query in
TestExtensions_GetOnStartExtensionData and the other test), collect the name
values and add an assertion that the names match the expected set (use the test
framework's set/element matching helper such as require.ElementsMatch or
assert.ElementsMatch) so the test fails if any name is missing or unexpected.
Ensure this assertion is placed after the per-row shape checks and references
the same row/name variables used in those tests.
---
Nitpick comments:
In `@cmd/api/src/database/migration/extensions_integration_test.go`:
- Around line 82-90: After parsing config.Database.Connection into
environmentMap in the test (after the loop that builds environmentMap),
immediately assert presence of the required DB keys (e.g., "host", "port",
"user", "password", "dbname" or whatever keys your integration tests expect) and
fail the test with require.NoError/require.NotEmpty-style assertions if any are
missing; update the same check pattern also for the later parsing block around
lines 92-116. Locate the parsing logic that calls
utils.LoadIntegrationTestConfig and fills environmentMap and add explicit
require.NotContains/require.NotEmpty/require.True checks referencing
environmentMap["host"], environmentMap["port"], environmentMap["user"],
environmentMap["password"], environmentMap["dbname"] (adjust keys to match your
config) so the test fails fast with a clear message when keys are missing.
📒 Files selected for processing (1)
cmd/api/src/database/migration/extensions_integration_test.go
```go
func TestExtensions_GetOnStartExtensionData(t *testing.T) {
	var (
		testSuite = setupIntegrationTestSuite(t)
	)
```
Assert the exact extension names (AD, AZ), not only record count.
The current checks verify count and per-row shape, but they don’t explicitly enforce the expected extension name set. Add a final name-set assertion to make the regression test stricter.
✅ Tighten expectations
```diff
 func TestExtensions_GetOnStartExtensionData(t *testing.T) {
 	var (
-		testSuite = setupIntegrationTestSuite(t)
+		testSuite      = setupIntegrationTestSuite(t)
+		extensionNames = make([]string, 0, 2)
 	)
@@
 	require.NoError(t, err)
 	require.Equal(t, 2, totalRecords)
+	require.Len(t, extensions, 2)
 	for _, extension := range extensions {
+		extensionNames = append(extensionNames, extension.Name)
 		require.True(t, extension.IsBuiltin, "All extensions should be marked as built-in")
@@
 		validateEnvironmentKind(t, extension.Name, environmentKind.Name)
 	}
+
+	require.ElementsMatch(t, []string{"AD", "AZ"}, extensionNames)
 }
```

Also applies to: 142-164
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cmd/api/src/database/migration/extensions_integration_test.go` around lines
127 - 130, The test TestExtensions_GetOnStartExtensionData currently only checks
record count and per-row fields; update it (and the similar block around the
function starting at lines 142–164) to explicitly assert the exact set of
extension names {"AD","AZ"} is present. After extracting the rows (e.g., the
slice/variable built from the query in TestExtensions_GetOnStartExtensionData
and the other test), collect the name values and add an assertion that the names
match the expected set (use the test framework's set/element matching helper
such as require.ElementsMatch or assert.ElementsMatch) so the test fails if any
name is missing or unexpected. Ensure this assertion is placed after the per-row
shape checks and references the same row/name variables used in those tests.
Description
Removes the `RAISE EXCEPTION` from the SQL during on-start migration; if a necessary kind is not found, it will be inserted into the database so that the migration process can continue.

Motivation and Context
Resolves BED-7480
How Has This Been Tested?
Integration tests added! Also tested locally, the schema data still populates.
Screenshots (optional):
Types of changes
Checklist:
Summary by CodeRabbit

- Tests: Added an integration test validating extension data initialization.
- Chores: Generated upsert helpers now return SMALLINT IDs, and missing kinds are inserted instead of raising exceptions.