
Conversation


@Dami-star Dami-star commented Jan 5, 2026

  • Add comprehensive test suite for TreelandUserConfig with valid DConfig
  • Test immediate destroy, initialization completion, and property changes
  • Add stress tests for high-frequency creation/destruction cycles
  • Verify safety of signal handling using QPointer mechanism
  • Tests confirm crash fix prevents SIGSEGV in production scenarios

Summary by Sourcery

Add a new test suite to validate TreelandUserConfig behavior against a valid DConfig schema and guard against regressions of a prior SIGSEGV crash.

Tests:

  • Introduce multiple scenario-based tests covering creation, initialization, property changes, and destruction of TreelandUserConfig using the real org.deepin.dde.treeland.user DConfig.
  • Add stress and sequential operation tests to exercise high-frequency creation/destruction cycles and rapid config updates under realistic event processing.
  • Wire the new TreelandUserConfig DConfig test binary into the CMake test target so it runs as part of the automated test suite.

@deepin-ci-robot

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: Dami-star

The full list of commands accepted by this bot can be found here.

Details: Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment


sourcery-ai bot commented Jan 5, 2026

Reviewer's Guide

Adds a new Qt-based test executable and CTest entry that stress-tests TreelandUserConfig against a valid DConfig schema to guard against lifecycle-related SIGSEGV crashes during creation, initialization, property changes, and destruction.

File-Level Changes

Change Details Files
Introduce ValidConfigCrashTest Qt test case that exercises TreelandUserConfig lifecycle and signal safety with a valid DConfig schema.
  • Create ValidConfigCrashTest QObject-based test fixture with init/cleanup and six focused test slots covering different lifecycle scenarios.
  • Implement one-time DConfig availability detection using TreelandUserConfig::createByName, initialization polling, and static flags to allow skipping tests when the schema is missing.
  • Add multiple lifecycle tests: immediate destroy after creation, destroy after initialization, property change then destroy, rapid property changes before destroy, sequential config create/init/use/destroy loops, and a configurable high-frequency stress test controlled by TREELAND_STRESS_CYCLES.
  • Use QCoreApplication::processEvents and short sleeps to flush queued DConfig::valueChanged and QMetaObject::invokeMethod lambdas, attempting to reproduce the historic SIGSEGV when lambdas access destroyed objects.
  • Set up a QGuiApplication-based main() with offscreen platform fallback when DISPLAY is unset, and include the generated MOC file for the test class.
tests/test_treeland_userconfig/test_dconfig_valid_config.cpp
Register the new TreelandUserConfig DConfig validity test in the CMake/CTest configuration.
  • Add a new test_treeland_userconfig subdirectory to the top-level tests/CMakeLists.txt so the new tests are built.
  • Define an executable target test_dconfig_valid_config, link it against Qt6 Core/Gui/Test and libtreeland, set include paths, and register it as a CTest test.
tests/CMakeLists.txt
tests/test_treeland_userconfig/CMakeLists.txt



@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 4 issues, and left some high level feedback:

  • The tests rely on an installed DConfig schema at /usr/share/dsg/configs/... and a specific config name, which makes them environment-dependent; consider injecting a test config or mocking DConfig so the suite can run in isolation.
  • There are many tight loops with QThread::msleep and repeated QCoreApplication::processEvents() (e.g., up to 200 iterations) which may make the tests slow and flaky; consider using signal-based synchronization (QSignalSpy, QEventLoop with timeouts) or reducing the iteration counts.
  • Several tests end with QVERIFY(true) regardless of what happens; if the intent is to detect race-related crashes, consider also asserting on observable conditions (e.g., initialization state, config validity) so regressions that do not crash still get caught.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- The tests rely on an installed DConfig schema at `/usr/share/dsg/configs/...` and a specific config name, which makes them environment-dependent; consider injecting a test config or mocking DConfig so the suite can run in isolation.
- There are many tight loops with `QThread::msleep` and repeated `QCoreApplication::processEvents()` (e.g., up to 200 iterations) which may make the tests slow and flaky; consider using signal-based synchronization (QSignalSpy, QEventLoop with timeouts) or reducing the iteration counts.
- Several tests end with `QVERIFY(true)` regardless of what happens; if the intent is to detect race-related crashes, consider also asserting on observable conditions (e.g., initialization state, config validity) so regressions that do not crash still get caught.

## Individual Comments

### Comment 1
<location> `tests/test_treeland_userconfig/test_dconfig_valid_config.cpp:41-50` </location>
<code_context>
+    void test_immediate_destroy_valid_dconfig()
</code_context>

<issue_to_address>
**suggestion (testing):** Tests only assert `QVERIFY(true)` and won’t fail if the valid DConfig path/contract changes

Right now these tests only verify that the process doesn’t crash, so they won’t fail if the DConfig contract or environment changes.

Please add functional assertions that verify the intended behavior, e.g. that a valid DConfig is found and initialized and that the config object reaches a consistent state before destruction. For example:
- `QVERIFY(config->isInitializeSucceed());` for valid DConfig tests (and use `QSKIP` if the schema isn’t available),
- `QVERIFY(dconfig && dconfig->isValid());` where a valid config is expected, or
- use `QSignalSpy` to check that expected signals are emitted before destruction.

This will let the tests catch behavioral regressions, not just crashes.

Suggested implementation:

```cpp
        for (int cycle = 0; cycle < 50; ++cycle) {
            // Use createByName with correct parameters like production code
            TreelandUserConfig *config = TreelandUserConfig::createByName("org.deepin.dde.treeland.user",
                                                                          "org.deepin.dde.treeland",
                                                                          "/dde");

            // We expect a valid config object for the known-good DConfig
            QVERIFY2(config, "TreelandUserConfig::createByName() returned nullptr for a valid DConfig");

            // If the schema / backend is not available on this system, skip the functional part
            if (!config->isInitializeSucceed()) {
                delete config;
                QSKIP("TreelandUserConfig could not be initialized (schema or backend not available); skipping valid DConfig stability test.");
            }

            // Verify that the underlying DConfig object exists and is valid for this config
            auto dconfig = config->dconfig();
            QVERIFY2(dconfig, "TreelandUserConfig did not expose a DConfig instance for a supposedly valid config");
            QVERIFY2(dconfig->isValid(), "Underlying DConfig is not valid for a supposedly valid TreelandUserConfig");

            // Immediate delete after confirming a consistent, initialized state

```

- Ensure that `TreelandUserConfig` actually provides the methods used above:
  - `bool TreelandUserConfig::isInitializeSucceed() const;`
  - A getter for the underlying config, here assumed as `auto TreelandUserConfig::dconfig() const;` returning a pointer with `isValid()` (adjust the method name and type if your API differs, e.g. `dConfig()` or `backend()`).
- Make sure this test file includes the appropriate headers for `TreelandUserConfig` and the DConfig type that provides `isValid()`, and that it is using the Qt Test framework so `QVERIFY2` and `QSKIP` are available.
- Re-add or keep the `delete config;` line after the “Immediate delete” comment if it exists below the shown snippet; the test must still delete `config` within the loop to exercise the destruction path.
- If you later decide to assert signals as well, you can wrap the body with a `QSignalSpy` against the relevant signal (e.g. an `initialized` or `ready` signal) and add `QCOMPARE(spy.count(), 1);` before `delete config;`.
</issue_to_address>

### Comment 2
<location> `tests/test_treeland_userconfig/test_dconfig_valid_config.cpp:71-52` </location>
<code_context>
+    void test_destroy_after_initialization()
</code_context>

<issue_to_address>
**suggestion (testing):** Busy-wait loops with `msleep`/`processEvents` are prone to flakiness; consider using Qt’s `QTRY_*` helpers

In `test_destroy_after_initialization` (and similar tests), the loop that polls `isInitializeSucceed()` with `QThread::msleep` and `QCoreApplication::processEvents()` is likely to be flaky on slow or loaded CI. QtTest’s `QTRY_VERIFY`/`QTRY_COMPARE` handle event processing and timeouts for you, e.g.

```cpp
QTRY_VERIFY_WITH_TIMEOUT(config->isInitializeSucceed(), 2000);
```

Switching to these makes the tests simpler, more readable, and less prone to intermittent failures while preserving the intended timing/race coverage.

Suggested implementation:

```cpp
#include <QCoreApplication>
#include <QTest>

```

```cpp
            // Wait for initialization to complete
            // This gives time for DConfig to init and signal connection to be set up
            QTRY_VERIFY_WITH_TIMEOUT(config->isInitializeSucceed(), 2000);

```

The SEARCH block for the comment assumes no existing code was added under the comment yet. If a manual polling loop (using `QCoreApplication::processEvents()` and `QThread::msleep`) already exists below that comment, extend the SEARCH section to include and remove that loop entirely, keeping only the updated comment and the `QTRY_VERIFY_WITH_TIMEOUT` line in the REPLACE block. This pattern can be applied similarly to other tests that use the same busy-wait pattern.
</issue_to_address>

### Comment 3
<location> `tests/test_treeland_userconfig/test_dconfig_valid_config.cpp:5-14` </location>
<code_context>
+// SPDX-License-Identifier: Apache-2.0 OR LGPL-3.0-only OR GPL-2.0-only OR GPL-3.0-only
+
+// Test: Reproduce SIGSEGV crash using VALID DConfig
+// The config schema is at: /usr/share/dsg/configs/org.deepin.dde.treeland/org.deepin.dde.treeland.user.json
+//
+// The crash mechanism:
+// 1. initializeInConfigThread connects to DConfig::valueChanged signal
+//    QObject::connect(config, &DConfig::valueChanged, this, [this](key) { updateValue(key); })
+// 2. updateValue calls: QMetaObject::invokeMethod(this, [this]() { updateProperty(...) })
+// 3. Data object destroyed while lambda is queued
+// 4. Lambda tries to access destroyed 'this' -> SIGSEGV
+
+#include "treelanduserconfig.hpp"
+
+#include <QObject>
</code_context>

<issue_to_address>
**suggestion (testing):** Tests rely on system-installed DConfig schema, which makes them environment-dependent

These tests depend on a specific system path (`/usr/share/dsg/configs/...`) and a real DConfig instance, so they may silently “pass” in environments where the schema is missing or invalid (they only `QVERIFY(true)`). To make them reliable and self-contained, either:
- fail or `QSKIP` when `config->config()` is null or `!dconfig->isValid()`, or
- provide a test-only DConfig schema or mock that guarantees a valid configuration.

This way the tests actually exercise the intended valid-config path on any host.
</issue_to_address>

### Comment 4
<location> `tests/test_treeland_userconfig/test_dconfig_valid_config.cpp:261-62` </location>
<code_context>
+    void test_high_frequency_stress()
</code_context>

<issue_to_address>
**suggestion (testing):** High-frequency stress test with many cycles and sleeps may be slow and flaky in CI

This test does 100 create/destroy cycles with sleeps and repeated `processEvents()`, and earlier tests also use nested sleep loops. That’s good for reproducing timing issues locally but risks slow, flaky CI runs. Consider either reducing iteration counts for CI, guarding the heaviest stress test with a flag (e.g. env var for extended runs), or replacing fixed `msleep` loops with shorter timeouts plus `QTRY_*` macros to keep coverage while avoiding timeouts on slower machines.

Suggested implementation:

```cpp
        bool ok = false;
        int cycles = qEnvironmentVariableIntValue("TREELAND_STRESS_CYCLES", &ok);
        if (!ok || cycles <= 0) {
            // Use a reduced default for CI; increase locally via TREELAND_STRESS_CYCLES
            cycles = 20;
        }

        qDebug() << "Running high frequency stress test with" << cycles << "cycles";

        for (int cycle = 0; cycle < cycles; ++cycle) {
            TreelandUserConfig *config = TreelandUserConfig::createByName("org.deepin.dde.treeland.user",
                                                                          "org.deepin.dde.treeland",
                                                                          "/dde");

```

To fully implement the intent of your review comment, consider:
1. Replacing other long `msleep`-based polling loops in this test file with `QTRY_VERIFY`, `QTRY_COMPARE`, or similar macros using reasonable timeouts, so tests adapt better to slower CI machines.
2. Documenting the `TREELAND_STRESS_CYCLES` environment variable in your test README or developer docs so contributors know how to run extended local stress tests (e.g. `TREELAND_STRESS_CYCLES=500 ctest -R test_dconfig_valid_config`).
</issue_to_address>


@Dami-star Dami-star marked this pull request as draft January 7, 2026 01:49
@Dami-star Dami-star marked this pull request as ready for review January 7, 2026 03:00
@Dami-star Dami-star requested a review from zccrs January 7, 2026 03:00

@sourcery-ai sourcery-ai bot left a comment


Hey - I've found 1 issue, and left some high level feedback:

  • These tests depend on the real DConfig schema under /usr/share/dsg/configs and runtime availability, which may make CI/environment-independent runs brittle; consider injecting a configurable schema path or using a mocked DConfig backend so the tests are self-contained.
  • Many of the test loops combine QThread::msleep/usleep with manual processEvents calls and high iteration counts; consolidating this into a small helper that waits with a bounded timeout would make the tests easier to reason about and reduce the risk of flakiness or excessive runtime.
  • The repeated createByName/initialization/wait/destruction pattern across multiple tests is very similar; factoring this into a shared utility that returns an initialized TreelandUserConfig (or nullptr on failure) would reduce duplication and make it easier to adjust the lifecycle behavior in one place.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- These tests depend on the real DConfig schema under /usr/share/dsg/configs and runtime availability, which may make CI/environment-independent runs brittle; consider injecting a configurable schema path or using a mocked DConfig backend so the tests are self-contained.
- Many of the test loops combine QThread::msleep/usleep with manual processEvents calls and high iteration counts; consolidating this into a small helper that waits with a bounded timeout would make the tests easier to reason about and reduce the risk of flakiness or excessive runtime.
- The repeated `createByName`/initialization/wait/destruction pattern across multiple tests is very similar; factoring this into a shared utility that returns an initialized TreelandUserConfig (or nullptr on failure) would reduce duplication and make it easier to adjust the lifecycle behavior in one place.

## Individual Comments

### Comment 1
<location> `tests/test_treeland_userconfig/test_dconfig_valid_config.cpp:27` </location>
<code_context>
+#include <chrono>
+#include <atomic>
+
+class ValidConfigCrashTest : public QObject
+{
+    Q_OBJECT
</code_context>

<issue_to_address>
**suggestion (testing):** Tests depend on a system-wide DConfig schema, making them environment-dependent and potentially skipped entirely in CI

Because the suite is gated on `s_dConfigAvailable` and `/usr/share/dsg/configs/...`, it will be fully skipped on systems without the schema/service, so the crash regression is never exercised. Please add at least one hermetic path (e.g., test-local DConfig schema or a mock/stub of `TreelandUserConfig`/DConfig) so CI always runs a test that validates the crash fix independently of the host environment. Keep these as integration tests, but don’t rely on them alone for coverage.

Suggested implementation:

```cpp
#include <QStandardPaths>
#include <QTemporaryDir>
#include <QDir>
#include <QFile>
#include <QScopedPointer>

#include <chrono>
#include <atomic>

```

```cpp
class ValidConfigCrashTest : public QObject
{
    Q_OBJECT

private:
    // Static flag to track if DConfig is available
    static bool s_dConfigAvailable;
    static bool s_initialized;

    // Holds hermetic DConfig schema root used when system schema is unavailable.
    // This keeps the temporary directory alive for the whole test lifetime.
    QScopedPointer<QTemporaryDir> m_hermeticSchemaDir;

private Q_SLOTS:

```

To fully implement hermetic coverage for the crash regression and make CI independent of the host DConfig installation, the following additional changes are required elsewhere in this file:

1. **Change `initTestCase()` to create a hermetic DConfig schema when the system schema is not available instead of skipping all tests:**
   - Locate the definition of `ValidConfigCrashTest::initTestCase()`.
   - Before any logic that sets `s_dConfigAvailable` or calls `QSKIP`, insert logic roughly along these lines:
     ```cpp
     void ValidConfigCrashTest::initTestCase()
     {
         // Existing logic that checks for system-wide DConfig schema...
         // If you currently set s_dConfigAvailable based on a check like:
         //   QFile::exists("/usr/share/dsg/configs/...") or similar,
         // then leave that code in place.

         if (!s_dConfigAvailable) {
             // No system-wide DConfig schema available; create a hermetic one in a temp dir.
             m_hermeticSchemaDir.reset(new QTemporaryDir);
             QVERIFY2(m_hermeticSchemaDir->isValid(), "Failed to create temporary DConfig schema directory for tests");

             QDir schemaRoot(m_hermeticSchemaDir->path());
             // Adjust "dsg/configs" and filename to match what TreelandUserConfig/DConfig actually expects.
             QVERIFY2(schemaRoot.mkpath(QStringLiteral("dsg/configs")),
                      "Failed to create hermetic DConfig schema subdirectory");

             const QString schemaFilePath =
                 schemaRoot.filePath(QStringLiteral("dsg/configs/treeland-userconfig-valid.dconfig")); // example name
             QFile schemaFile(schemaFilePath);
             QVERIFY2(schemaFile.open(QIODevice::WriteOnly | QIODevice::Truncate),
                      "Failed to create hermetic DConfig schema file");

             // Write a minimal valid DConfig schema that is sufficient to exercise the code path
             // that previously crashed. The exact contents must match your actual DConfig schema format.
             const QByteArray minimalSchema =
                 QByteArrayLiteral(
                     "{\n"
                     "  \"id\": \"org.treeland.userconfig\",\n"
                     "  \"options\": [\n"
                     "    { \"name\": \"someOption\", \"type\": \"bool\", \"default\": true }\n"
                     "  ]\n"
                     "}\n");
             QCOMPARE(schemaFile.write(minimalSchema), minimalSchema.size());
             schemaFile.close();

             // Point TreelandUserConfig/DConfig at the hermetic schema root.
             // How you do this depends on the project; for example, via an env var
             // or a setter on TreelandUserConfig.
             //
             // Examples (pick the one that matches your codebase):
             //   qputenv("D_CONFIG_DIRS", m_hermeticSchemaDir->path().toUtf8());
             // or
             //   TreelandUserConfig::setConfigSearchPath(m_hermeticSchemaDir->path());
             //
             // Ensure this is done before constructing any TreelandUserConfig instances.

             // Mark DConfig as available so the rest of the tests run against this hermetic schema.
             s_dConfigAvailable = true;
         }

         // Remove or relax any unconditional QSKIP that would skip the entire test suite
         // when !s_dConfigAvailable. If you still want to distinguish system vs hermetic,
         // you can add a QWARN instead of skipping.
     }
     ```
   - Adjust the directory structure (`"dsg/configs"`), file name (`"treeland-userconfig-valid.dconfig"`), and JSON (or other) schema content to exactly match what `TreelandUserConfig`/DConfig expects in this project.

2. **Ensure the crash regression is covered by at least one test that runs against the hermetic schema:**
   - Identify the test that originally exercised the crash regression (e.g., something like `testValidConfigDoesNotCrash()` or similar).
   - Make sure that:
     - It does **not** early-return or `QSKIP` based on `s_dConfigAvailable`; instead, it should assume `initTestCase()` has made DConfig available (via system or hermetic schema).
     - It constructs and uses `TreelandUserConfig` (or the code that used to crash) normally, so when the system schema is missing in CI, it still runs against the hermetic schema.
   - If the existing tests only treat `s_dConfigAvailable` as a gate (skipping everything when false), refactor them to rely on the hermetic setup instead of skipping.

3. **If the project provides a mock/stub API for DConfig/TreelandUserConfig, prefer it:**
   - Instead of writing out an actual schema file, you can:
     - Use an existing mock backend or in-memory backend, if the codebase has one (e.g. `MockDConfigBackend`, `TreelandUserConfig::setBackendForTesting(...)`, etc.).
     - Configure that mock in `initTestCase()` when the system DConfig is missing, and set `s_dConfigAvailable = true`.
   - The key goal is that **CI always runs a test that constructs `TreelandUserConfig` (or the crashing code path)** even when `/usr/share/dsg/configs/...` is absent.

4. **Optional: add a dedicated regression test function for the “no system DConfig” scenario:**
   - Inside `ValidConfigCrashTest`, you can add a separate test slot (e.g., `void testCrashRegressionWithoutSystemDConfig();`) that:
     - Forces the “no system DConfig” scenario (or simulates it) by ensuring the hermetic path is used.
     - Constructs `TreelandUserConfig` (or the relevant object) and verifies it does not crash and behaves sensibly (e.g., defaults are used).
   - Since this depends on existing helpers and types not shown in the snippet, its exact implementation must match how `TreelandUserConfig` is normally constructed in your tests.

Implementing the above ensures that even on CI hosts without `/usr/share/dsg/configs/...`, the test suite creates a private DConfig schema and still exercises the crash regression path, fulfilling the comment’s requirement for a hermetic testing path.
</issue_to_address>

