Conversation
📝 Walkthrough

This PR introduces Spark wallet support to COINCUBE by adding a new …

Changes
Sequence Diagram(s)

sequenceDiagram
participant GUI as GUI<br/>(App)
participant SC as SparkClient
participant Bridge as Bridge<br/>subprocess
participant SDK as Breez Spark<br/>SDK
GUI->>SC: SparkClient::connect()
activate SC
SC->>Bridge: spawn(path, config)
activate Bridge
SC->>Bridge: send Init request
Bridge->>SDK: connect_mainnet(mnemonic)
activate SDK
SDK-->>Bridge: ok
deactivate SDK
Bridge->>Bridge: register event listener
Bridge-->>SC: Init response
SC->>SC: create event broadcast channel
SC-->>GUI: Connected SparkClient
deactivate SC
rect rgba(0, 100, 200, 0.5)
Note over GUI,SDK: Background Tasks
SC->>Bridge: spawn reader task
SC->>Bridge: spawn writer task
SC->>Bridge: spawn stderr task
Bridge->>SDK: emit events (PaymentSucceeded, etc)
Bridge-->>SC: Event frames
SC-->>GUI: broadcast events
end
deactivate Bridge
sequenceDiagram
participant User as User
participant View as Spark Send<br/>View
participant State as SparkSend<br/>State
participant Backend as SparkBackend
participant Bridge as Bridge
participant SDK as Breez Spark<br/>SDK
User->>View: enter destination & amount
View->>State: SparkSendMessage::*InputChanged
State->>State: update fields, clear prepared
User->>View: click "Prepare"
View->>State: Message::SparkSend(PrepareRequested)
State->>Backend: parse_input(destination)
Backend->>Bridge: ParseInput request
Bridge->>SDK: classify input
SDK-->>Bridge: ParseInputOk (kind, min/max)
Bridge-->>Backend: ParseInputOk response
alt LNURL Pay detected
State->>Backend: prepare_lnurl_pay(input, amount)
Backend->>Bridge: PrepareLnurlPay request
Bridge->>SDK: SDK prepare with LNURL
else Regular Send
State->>Backend: prepare_send(input, amount)
Backend->>Bridge: PrepareSend request
Bridge->>SDK: SDK prepare
end
SDK-->>Bridge: PrepareSendOk (handle, fee)
Bridge-->>Backend: PrepareSendOk response
Backend-->>State: Result
State->>State: phase := Prepared(ok)
State-->>View: update phase
View-->>User: show confirm dialog
User->>View: click "Send"
View->>State: Message::SparkSend(ConfirmRequested)
State->>Backend: send_payment(handle)
Backend->>Bridge: SendPayment request
Bridge->>SDK: SDK send or LNURL pay
SDK-->>Bridge: SendPaymentOk
Bridge-->>Backend: SendPaymentOk response
Backend-->>State: Result
State->>State: phase := Sent(ok), clear inputs
State-->>View: update
View-->>User: show success
sequenceDiagram
participant GUI as GUI App
participant State as SparkReceive<br/>State
participant Backend as SparkBackend
participant Bridge as Bridge
participant SDK as Breez Spark<br/>SDK
participant Events as Event<br/>Stream
rect rgba(0, 150, 100, 0.5)
Note over GUI,Events: Receive BOLT11
end
GUI->>State: SparkReceiveMessage::GenerateRequested
State->>Backend: receive_bolt11(amount, desc)
Backend->>Bridge: ReceiveBolt11 request
Bridge->>SDK: SDK receive_bolt11
SDK-->>Bridge: ReceivePaymentOk
Bridge-->>Backend: response
Backend-->>State: Result
State->>State: encode QR, phase := Generated
State-->>GUI: show QR + invoice
rect rgba(0, 150, 100, 0.5)
Note over Events: Background Event Stream
end
Events->>Events: PaymentSucceeded event fires
Events-->>GUI: Message::AppEvent(PaymentSucceeded)
GUI->>State: handle PaymentReceived
alt invoice matches
State->>State: phase := Received
State-->>GUI: show confirmation
end
rect rgba(100, 150, 0, 0.5)
Note over GUI,SDK: Receive On-chain
end
GUI->>State: SparkReceiveMessage::GenerateRequested (OnchainBitcoin)
State->>Backend: receive_onchain(new_address)
Backend->>Bridge: ReceiveOnchain request
Bridge->>SDK: SDK receive_onchain
SDK-->>Bridge: ReceivePaymentOk
Bridge-->>Backend: response
Backend-->>State: Result
State->>State: phase := Generated
State-->>GUI: show address + QR
Events->>Events: DepositsChanged event fires
Events-->>GUI: reload deposits
GUI->>Backend: list_unclaimed_deposits
Backend->>Bridge: ListUnclaimedDeposits request
Bridge->>SDK: SDK list deposits
SDK-->>Bridge: ListUnclaimedDepositsOk
Bridge-->>Backend: response
Backend-->>GUI: deposits list
GUI->>State: SparkReceiveMessage::ClaimDepositRequested(txid, vout)
State->>Backend: claim_deposit(txid, vout)
Backend->>Bridge: ClaimDeposit request
Bridge->>SDK: SDK claim
SDK-->>Bridge: ClaimDepositOk
Bridge-->>Backend: response
Backend-->>State: Result
State->>State: mark claimed, refresh deposits
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~135 minutes

Possibly related PRs
Suggested reviewers
Poem
Cursor Bugbot has reviewed your changes and found 3 potential issues.
Reviewed by Cursor Bugbot for commit 72ca626. Configure here.
```rust
            method: Method::Shutdown,
        });
    }
}
```
Drop uses racy strong_count check for shutdown decision
Low Severity
The `Drop` impl for `SparkClient` uses `Arc::strong_count(&self.inner) > 1` to decide whether to send a shutdown request. This check is inherently racy: between the `strong_count` read and the actual `Arc` drop, another thread could clone or drop a different `SparkClient`, leading to either a missed shutdown (count transiently rises) or a spurious shutdown (count transiently drops). `kill_on_drop(true)` on the `Command` acts as a backstop for the missed-shutdown case, but the fire-and-forget shutdown request sent at `id: u64::MAX` could collide with a real request if `next_id` wraps (extremely unlikely but theoretically possible).
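The race-free alternative in std is `Arc::into_inner`, which hands the inner value back only to the unique last owner, so the "am I last?" check and the teardown happen as one atomic operation. A minimal sketch (the `Inner` type here is a hypothetical stand-in for the client's shared state, not the actual code):

```rust
use std::sync::Arc;

// Hypothetical stand-in for the client's shared state.
struct Inner {
    id: u32,
}

// Returns Some(inner) only if `arc` was the last strong reference.
// Unlike reading Arc::strong_count and dropping afterwards, the check
// and the teardown are a single atomic operation, so two threads can
// never both conclude "I was last", nor both skip shutdown.
fn try_take_last(arc: Arc<Inner>) -> Option<Inner> {
    Arc::into_inner(arc)
}

fn main() {
    let a = Arc::new(Inner { id: 7 });
    let b = Arc::clone(&a);

    // Two owners alive: the first consumer is not the last one.
    assert!(try_take_last(a).is_none());

    // Now `b` is the sole owner, so it takes the value and could
    // run the shutdown sequence here.
    let inner = try_take_last(b).expect("last owner takes the value");
    assert_eq!(inner.id, 7);
    println!("ok");
}
```

The real `Drop` could downgrade to this pattern by moving teardown into whichever drop observes `Arc::into_inner` returning `Some`.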
```rust
};
if !matches_invoice {
    return Task::none();
}
```
Receive panel advances on unrelated payment events
Medium Severity
The `matches_invoice` check defaults to `true` when either `displayed_invoice` or the event's `bolt11` field is `None`. This means that when a BOLT11 invoice is displayed (`displayed_invoice` is `Some`) but the event carries `bolt11: None` (documented for on-chain/Spark-native payments), any `PaymentSucceeded` event, including completely unrelated ones, will trigger the "Received" celebration screen and reset the panel state, prematurely clearing the user's displayed invoice.
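A stricter correlation predicate is a small change. The sketch below (function and parameter names are illustrative, not the actual code) treats a missing `bolt11` as a non-match instead of defaulting to `true`; the review additionally suggests requiring an explicit receive correlation for invoice-less events, which would slot into the `(Some(_), None)` arm:

```rust
// Sketch of a stricter correlation check: advance the receive panel
// only when the event's BOLT11 exactly matches the invoice currently
// displayed. An event without a bolt11 no longer matches a displayed
// invoice.
fn matches_displayed(displayed_invoice: Option<&str>, event_bolt11: Option<&str>) -> bool {
    match (displayed_invoice, event_bolt11) {
        // Exact match on the displayed invoice.
        (Some(shown), Some(paid)) => shown == paid,
        // Nothing displayed: nothing to correlate, ignore the event.
        (None, _) => false,
        // Invoice displayed but event carries no bolt11: treat as
        // unrelated instead of defaulting to true.
        (Some(_), None) => false,
    }
}

fn main() {
    assert!(matches_displayed(Some("lnbc1abc"), Some("lnbc1abc")));
    assert!(!matches_displayed(Some("lnbc1abc"), Some("lnbc1xyz")));
    assert!(!matches_displayed(Some("lnbc1abc"), None)); // the reported bug case
    assert!(!matches_displayed(None, Some("lnbc1abc")));
    println!("ok");
}
```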
```rust
    /// Liquid wallet — advanced wallet for L-BTC, USDt, and
    /// Liquid-specific receive flows.
    Liquid,
}
```
Default WalletKind silently reroutes existing cubes to Spark
Medium Severity
`WalletKind` defaults to `Spark`, so existing cubes whose settings file lacks `default_lightning_backend` (all pre-PR cubes) will silently switch their Lightning Address routing from Liquid to Spark on upgrade. While `resolve_lightning_backend` falls back to Liquid when Spark is unavailable, cubes that do have a configured Spark signer will start routing invoices through Spark without any user action or notification: a potentially surprising behavior change for existing users.
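The field-level-default fix suggested later in this review can be sketched without serde machinery (names follow the review's suggestion; `load_backend` is a hypothetical stand-in for what `#[serde(default = "default_liquid")]` would do during deserialization):

```rust
// Sketch of the migration-safe default: keep WalletKind::default() =
// Spark for newly created cubes, but give the settings *field* its own
// fallback so files written before this PR keep routing through
// Liquid. In the real code this helper would be wired up as
// #[serde(default = "default_liquid")].
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum WalletKind {
    Spark,
    Liquid,
}

impl Default for WalletKind {
    fn default() -> Self {
        WalletKind::Spark // what new cubes get at creation time
    }
}

// Deserialization fallback for settings files missing the field.
fn default_liquid() -> WalletKind {
    WalletKind::Liquid
}

// Stand-in for serde's behavior: use the field if present, otherwise
// the field-level default rather than the type-level one.
fn load_backend(field: Option<WalletKind>) -> WalletKind {
    field.unwrap_or_else(default_liquid)
}

fn main() {
    // Pre-PR settings file: field absent, routing stays on Liquid.
    assert_eq!(load_backend(None), WalletKind::Liquid);
    // Explicitly configured cubes keep their choice.
    assert_eq!(load_backend(Some(WalletKind::Spark)), WalletKind::Spark);
    // New cubes still default to Spark at creation time.
    assert_eq!(WalletKind::default(), WalletKind::Spark);
    println!("ok");
}
```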
Actionable comments posted: 18
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (4)
coincube-gui/src/loader.rs (1)
858-865: ⚠️ Potential issue | 🟠 Major: Replace `std::thread::sleep` with an async sleep to avoid blocking the executor thread.

`sync()` runs under `Task::perform`, which uses the tokio runtime. `std::thread::sleep` parks the executor worker for a full second on every poll with `sleep=true`, stalling unrelated async work like wallet tasks and UI updates. Use `tokio::time::sleep(...).await` instead, which is already available in your dependencies and used elsewhere in the codebase.

Suggested fix

```diff
 async fn sync(
     daemon: Arc<dyn Daemon + Sync + Send>,
     sleep: bool,
 ) -> Result<GetInfoResult, DaemonError> {
     if sleep {
-        std::thread::sleep(std::time::Duration::from_secs(1));
+        tokio::time::sleep(std::time::Duration::from_secs(1)).await;
     }
     daemon.get_info().await
 }
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@coincube-gui/src/loader.rs` around lines 858-865: the sync function is blocking the tokio executor by calling std::thread::sleep; replace that call in async fn sync(...) with an asynchronous sleep using tokio::time::sleep(std::time::Duration::from_secs(1)).await so the task yields instead of parking the worker thread; keep the existing daemon.get_info().await return behavior and add the tokio::time::sleep call only when sleep == true.

coincube-gui/src/services/lnurl/stream.rs (1)
21-42: ⚠️ Potential issue | 🟠 Major: Include routing inputs in the subscription identity.

`create_stream` captures `preferred` and `spark_backend`, but this `Hash` implementation ignores both. If the user changes `default_lightning_backend` or the Spark bridge comes up/down while `token` and `retries` stay the same, the existing SSE task keeps minting invoices with the stale route until the subscription happens to restart for some other reason.

Suggested fix
```diff
 impl Hash for LnurlStreamData {
     fn hash<H: Hasher>(&self, state: &mut H) {
         self.token.hash(state);
         self.retries.hash(state);
+        matches!(self.preferred, WalletKind::Spark).hash(state);
+        self.spark_backend.is_some().hash(state);
     }
 }
```

🤖 Prompt for AI Agents
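As a self-contained illustration of why this works (field types simplified; `prefer_spark` and `spark_backend_up` are hypothetical stand-ins for `preferred` and `spark_backend.is_some()`), folding the routing inputs into `Hash` changes the value Iced uses as the subscription identity, which is what forces the SSE task to restart:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Simplified sketch of the stream data with routing inputs included.
struct LnurlStreamData {
    token: String,
    retries: u32,
    prefer_spark: bool,     // stands in for `preferred: WalletKind`
    spark_backend_up: bool, // stands in for `spark_backend.is_some()`
}

impl Hash for LnurlStreamData {
    fn hash<H: Hasher>(&self, state: &mut H) {
        self.token.hash(state);
        self.retries.hash(state);
        self.prefer_spark.hash(state);
        self.spark_backend_up.hash(state);
    }
}

// The subscription identity derived from the Hash impl.
fn identity(data: &LnurlStreamData) -> u64 {
    let mut h = DefaultHasher::new();
    data.hash(&mut h);
    h.finish()
}

fn main() {
    let before = LnurlStreamData {
        token: "tok".into(),
        retries: 3,
        prefer_spark: false,
        spark_backend_up: false,
    };
    // Same token/retries, but the Spark bridge came up and routing flipped.
    let after = LnurlStreamData {
        token: "tok".into(),
        retries: 3,
        prefer_spark: true,
        spark_backend_up: true,
    };
    // Different identity => Iced tears down the old SSE task and
    // recreates it with the new route captured.
    assert_ne!(identity(&before), identity(&after));
    println!("ok");
}
```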
Verify each finding against the current code and only fix it if needed. In `@coincube-gui/src/services/lnurl/stream.rs` around lines 21-42: the Hash impl for LnurlStreamData currently only hashes token and retries, so changes to routing inputs are ignored; update the Hash implementation for LnurlStreamData to also incorporate the routing-related fields used by create_stream/handle_invoice_request: include preferred (WalletKind) and the spark_backend identity (e.g., hash spark_backend.is_some() or the Arc pointer/inner identity) so that changes to default_lightning_backend or the Spark bridge presence will change the stream's hash and force Iced to recreate the SSE task.

coincube-gui/src/gui/tab.rs (1)
975-984: ⚠️ Potential issue | 🟠 Major: Remote-backend sessions still drop Spark on app creation.

`BreezClientLoadedAfterPin` now loads Spark, but `create_app_with_remote_backend` has no Spark parameter and hardcodes `None` into `App::new`. That means remote-backend users will never see Spark after login, even when the backend was loaded successfully during PIN verification.

Also applies to: 1114-1115

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@coincube-gui/src/gui/tab.rs` around lines 975-984: the create_app_with_remote_backend function currently hardcodes None for the Spark argument when calling App::new, which prevents Spark loaded by BreezClientLoadedAfterPin from being visible; change the signature of create_app_with_remote_backend to accept the Spark (or Option<SparkType>) parameter and forward that value into app::App::new instead of None, then update all call sites (including where BreezClientLoadedAfterPin returns a loaded Breez client/PIN flow) to pass the loaded Spark through; ensure the type matches the App::new parameter and adjust related declarations where the function is defined/used (also update the other occurrence around lines 1114-1115) so remote-backend users receive the loaded Spark after login.

coincube-gui/src/app/state/liquid/send.rs (1)
790-833: ⚠️ Potential issue | 🟠 Major: Preserve USDt classification even when `asset_info` is missing.

`load_balance` already treats `asset_id == usdt_id` as the USDt check, but this block switches back to `asset_info.as_ref().map(...)` and uses that to set `is_usdt_payment`. If a USDt payment arrives without `asset_info`, the row falls back to BTC formatting/fiat conversion and loses the USDt description. Use `asset_id` to classify the asset, and only use `amount_minor` when it is available.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@coincube-gui/src/app/state/liquid/send.rs` around lines 790-833: the USDt classification must use the asset_id check instead of relying on asset_info presence: set is_usdt_payment by matching payment.details against DomainPaymentDetails::LiquidAsset { asset_id, .. } and comparing asset_id == usdt_id; compute usdt_amount_minor separately as asset_info.as_ref().map(|i| i.amount_minor) only when asset_info exists; keep amount = Amount::from_sat(usdt_amount_minor.unwrap_or(payment.amount_sat)); keep fiat_amount = None when is_usdt_payment else use fiat_converter.convert(amount); and adjust the desc match to detect USDt by asset_id (or is_usdt_payment) but avoid depending on asset_info for choosing the "USDt Transfer" fallback.
🧹 Nitpick comments (3)
docs/SPARK_WALLET.md (1)
13-25: Add a language specifier to the fenced code block.

The ASCII diagram fenced code block is missing a language specifier. Use an empty specifier or `text` to satisfy markdown linting.

📝 Suggested fix

````diff
-```
+```text
 ┌──────────────┐  stdin/stdout  ┌──────────────────────┐
````

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@docs/SPARK_WALLET.md` around lines 13-25: the fenced ASCII diagram (the triple backticks surrounding the diagram in the docs/SPARK_WALLET.md file) is missing a language specifier; update the opening fence from ``` to ```text (or an empty specifier) so the block is marked as plain text and satisfies the markdown linter; edit the code block that contains the ASCII diagram (the block starting with the triple backticks above the diagram) to include the specifier.

coincube-spark-bridge/Cargo.toml (1)
22-23: Consider documenting the lint override reason.

The `mismatched_lifetime_syntaxes` lint is allowed, which may be necessary due to upstream crate code generation. A brief comment explaining why this override is needed would aid future maintenance.

📝 Suggested documentation

```diff
 [lints.rust]
+# breez-sdk-spark generates code with lifetime syntax that triggers this lint
 mismatched_lifetime_syntaxes = "allow"
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@coincube-spark-bridge/Cargo.toml` around lines 22-23: add a brief comment next to the Cargo.toml lints override explaining why mismatched_lifetime_syntaxes is set to "allow" (e.g., due to upstream/generated code or a specific dependency causing unavoidable lifetime syntax mismatches). Locate the [lints.rust] section and the mismatched_lifetime_syntaxes entry and insert a one-line rationale referencing the upstream crate or codegen that necessitates this override so future maintainers understand the exception.

coincube-gui/src/app/view/liquid/send.rs (1)
1881-2041: Remove the parked legacy toggle before merge.

Keeping the full `_old_send_asset_toggle` implementation around makes this file harder to maintain and easy to accidentally resurrect once the new picker flow diverges.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@coincube-gui/src/app/view/liquid/send.rs` around lines 1881-2041: delete the parked legacy toggle by removing the entire _old_send_asset_toggle function (including the leading NOTE comment and the #[allow(dead_code)] attribute) from the file; ensure you also remove any leftover references to _old_send_asset_toggle and its symbol (fn _old_send_asset_toggle, SendAsset variants used only there) so the compiler/warnings stay clean, and run a quick build to confirm no callers remain (e.g. search for _old_send_asset_toggle and LiquidSendMessage::PresetAsset usages that were only for this old toggle).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.env.example:
- Line 21: The .env.example currently ends immediately after the
BREEZ_SPARK_API_KEY= line and triggers the EndingBlankLine lint; add a single
trailing newline at the end of the file so the file ends with a blank line
(ensure the final character after the BREEZ_SPARK_API_KEY= line is a newline).
In `@coincube-gui/src/app/breez_spark/client.rs`:
- Around line 483-497: In shutdown(), the code sets inner.closed (via
closed.swap(true, Ordering::SeqCst)) before calling request(Method::Shutdown),
which causes request() to immediately return BridgeUnavailable; change the logic
so the Shutdown RPC actually gets sent: either move the closed.swap(true, ...)
to after awaiting self.request(Method::Shutdown) inside shutdown(), or update
request() to special-case Method::Shutdown and allow the call even when
inner.closed is true; ensure the same fix is applied to the equivalent block
around lines 525-533 (the alternate shutdown path).
- Around line 690-747: On stdout close or read error inside spawn_reader_task,
drain the pending map and fail each outstanding oneshot by taking
pending.lock().await, removing all entries (e.g., via drain or iterating keys),
and sending an immediate error/response to each sender so callers don't wait the
full timeout; perform this cleanup in both the Ok(None) and Err(e) branches
before breaking. Also mark the client as closed (set the shared "closed" flag or
notify the client-closed signal used by the client code) after failing pending
requests so other tasks see the subprocess is dead. Ensure you update only
symbols shown here: spawn_reader_task, pending (PendingMap), and event_tx/Frame
handling, and send a failure through each oneshot::Sender previously stored in
pending.
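The drain-and-fail cleanup described above can be sketched with std channels standing in for tokio oneshots (types simplified; `fail_pending` and the `String` error type are hypothetical):

```rust
use std::collections::HashMap;
use std::sync::mpsc;

// When the bridge's stdout closes, fail every outstanding request
// immediately instead of letting callers wait out their timeouts.
// In the real code the map holds tokio oneshot senders keyed by
// request id; std::sync::mpsc stands in for them here.
fn fail_pending(pending: &mut HashMap<u64, mpsc::Sender<Result<String, String>>>) {
    for (_id, tx) in pending.drain() {
        // Receiver may already be gone; ignore send errors.
        let _ = tx.send(Err("bridge closed".to_string()));
    }
}

fn main() {
    let mut pending = HashMap::new();
    let (tx1, rx1) = mpsc::channel();
    let (tx2, rx2) = mpsc::channel();
    pending.insert(1u64, tx1);
    pending.insert(2u64, tx2);

    // Simulate the reader task observing EOF on the bridge's stdout.
    fail_pending(&mut pending);

    // All waiters were answered immediately and the map is empty.
    assert!(pending.is_empty());
    assert_eq!(rx1.recv().unwrap(), Err("bridge closed".to_string()));
    assert_eq!(rx2.recv().unwrap(), Err("bridge closed".to_string()));
    println!("ok");
}
```

After the drain, the reader task would also set the shared `closed` flag so subsequent `request()` calls fail fast.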
In `@coincube-gui/src/app/breez_spark/config.rs`:
- Around line 1-8: The current api_key_from_env() bakes an empty string via
option_env!(...).unwrap_or("") which hides missing credentials; change it to
enforce compile-time or explicit failure: either replace option_env! with
env!(...) so missing BREEZ_SPARK_API_KEY / BREEZ_API_KEY triggers a compile
error (mirroring the Breez Liquid pattern), or keep option_env! but return a
Result and map a missing key to a SparkConfigError (e.g., return
Err(SparkConfigError::MissingApiKey)) so callers of
api_key_from_env()/SparkConfig::new() fail early instead of embedding an empty
API key; update api_key_from_env, SparkConfig::new, and any callers to handle
the Result accordingly.
In `@coincube-gui/src/app/settings/mod.rs`:
- Around line 178-183: The doc is wrong because serde(default) will use
WalletKind::default() (currently Spark); fix by providing an explicit
deserialization default that returns Liquid: add a function (e.g.,
default_liquid() -> crate::app::wallets::WalletKind { WalletKind::Liquid }) and
change the field attribute to #[serde(default = "default_liquid")] on
default_lightning_backend; alternatively, if you want every default across the
code to be Liquid, change WalletKind::default() in types.rs to return
WalletKind::Liquid — if you choose the serde-default approach, ensure
new_with_id still explicitly sets Spark for newly created cubes as before.
In `@coincube-gui/src/app/state/global_home.rs`:
- Around line 1122-1123: The filter currently skips any payment whose
payment.status is not DomainPaymentStatus::Pending, which wrongly excludes
in-flight variants; update the check in the loop that computes pending totals so
that Created and WaitingFeeAcceptance are treated as pending too (i.e., only
continue/skip for terminal states, or change the matches! to allow Pending |
Created | WaitingFeeAcceptance). Locate the conditional using
matches!(payment.status, DomainPaymentStatus::Pending) and modify it to include
DomainPaymentStatus::Created and DomainPaymentStatus::WaitingFeeAcceptance in
the set of statuses counted as pending for
restore_pending_liquid_to_vault_transfer and the pending totals calculation.
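The widened status check above is a one-line `matches!` change; sketched here with an illustrative enum (variant set taken from the review text, so the real `DomainPaymentStatus` may differ):

```rust
// Illustrative status enum; only the pending-vs-terminal split matters.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum DomainPaymentStatus {
    Created,
    Pending,
    WaitingFeeAcceptance,
    Complete,
    Failed,
}

// Treat every non-terminal, in-flight state as pending, not just
// the literal `Pending` variant.
fn is_pending(status: DomainPaymentStatus) -> bool {
    matches!(
        status,
        DomainPaymentStatus::Pending
            | DomainPaymentStatus::Created
            | DomainPaymentStatus::WaitingFeeAcceptance
    )
}

fn main() {
    assert!(is_pending(DomainPaymentStatus::Pending));
    assert!(is_pending(DomainPaymentStatus::Created));
    assert!(is_pending(DomainPaymentStatus::WaitingFeeAcceptance));
    assert!(!is_pending(DomainPaymentStatus::Complete));
    assert!(!is_pending(DomainPaymentStatus::Failed));
    println!("ok");
}
```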
In `@coincube-gui/src/app/state/liquid/receive.rs`:
- Around line 23-35: usdt_amount_minor currently treats a payment as USDt only
when asset_info is Some; change it so the USDt check is based solely on asset_id
== usdt_id and return a sensible minor-unit value even if asset_info is None:
inside usdt_amount_minor match DomainPaymentDetails::LiquidAsset where asset_id
== usdt_id, return Some(amount_minor) when asset_info.is_some(), otherwise
return a fallback (e.g. the payment's amount_sat converted/used as the
minor-unit fallback) instead of None; alternatively split responsibilities by
adding a separate is_usdt predicate (asset_id == usdt_id) and keep this helper
only extracting minor amount with a fallback so callers that check is_some() do
not lose USDt payments.
In `@coincube-gui/src/app/state/spark/overview.rs`:
- Around line 58-69: The view() status selection wrongly prioritizes snapshot
over error, showing stale Connected(...) after a get_info() failure; fix by
either clearing self.snapshot when an error is set (e.g., in the error-handling
path that sets self.error in get_info()) or change view() to check self.error
before self.snapshot so Error(...) is returned first; apply the same fix to the
analogous logic around the code referenced at lines 141-147 (the other status
derivation).
In `@coincube-gui/src/app/state/spark/receive.rs`:
- Around line 273-299: The PaymentReceived handler
(SparkReceiveMessage::PaymentReceived) currently treats any event lacking a
BOLT11 as a match (matches_invoice => true), which allows unrelated sends to
mark the panel as received; update the handler to reject non-positive amounts
(ignore events where amount_sat <= 0) and additionally require an explicit
receive correlation when bolt11 is absent (e.g., check an explicit
direction/receive flag on the event or a receive-specific correlation field)
before setting self.displayed_invoice = None / self.qr_data = None and
transitioning to SparkReceivePhase::Received { amount_sat }; keep the existing
exact-match logic for when both self.displayed_invoice and bolt11 are Some(...)
so BOLT11 correlation still works.
- Around line 185-205: The handlers for SparkReceiveMessage::MethodSelected,
AmountInputChanged, and DescriptionInputChanged reset UI state but do not cancel
or version-guard the in-flight receive_* task, so a late
GenerateSucceeded/GenerateFailed can overwrite the new inputs; add the same
request-token/version guard used in the send flow: when changing
method/amount/description increment or rotate the receive request token/version
(or cancel the existing receive task) and ensure GenerateSucceeded and
GenerateFailed only apply their results if the token/version matches the current
one (referencing the receive_* task, GenerateSucceeded, GenerateFailed, and the
Message variants MethodSelected/AmountInputChanged/DescriptionInputChanged).
In `@coincube-gui/src/app/state/spark/send.rs`:
- Around line 115-124: The prepare task results (PrepareSucceeded/PrepareFailed)
can apply after the user edits inputs; add a monotonic prepare_version counter
to the Spark send state, bump it whenever DestinationInputChanged or
AmountInputChanged (and before launching resolve_and_prepare), pass the current
version into the spawned prepare task's completion message, and when handling
PrepareSucceeded/PrepareFailed ignore any result whose version does not equal
the current state.prepare_version; apply the same pattern for the other
prepare-launching branch (the code around lines 162-180) and ensure
ConfirmRequested only proceeds if the state.phase and prepare_version still
match the expected values.
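The version-guard pattern reads roughly like this, stripped of the Iced `Task` machinery (struct and method names are illustrative, not the actual code):

```rust
// Each input edit bumps prepare_version; the spawned prepare task
// carries the version it was started with and echoes it back, and
// stale completions are dropped on arrival.
struct SparkSendState {
    prepare_version: u64,
    prepared_fee: Option<u64>,
}

impl SparkSendState {
    fn on_input_changed(&mut self) {
        // Invalidate any in-flight prepare.
        self.prepare_version += 1;
        self.prepared_fee = None;
    }

    fn start_prepare(&self) -> u64 {
        // The async task would capture this and include it in its
        // completion message.
        self.prepare_version
    }

    fn on_prepare_succeeded(&mut self, version: u64, fee: u64) {
        if version != self.prepare_version {
            return; // stale result from before the last edit
        }
        self.prepared_fee = Some(fee);
    }
}

fn main() {
    let mut state = SparkSendState { prepare_version: 0, prepared_fee: None };

    let v1 = state.start_prepare();
    state.on_input_changed(); // user edits the amount mid-flight
    let v2 = state.start_prepare();

    state.on_prepare_succeeded(v1, 100); // stale: ignored
    assert_eq!(state.prepared_fee, None);

    state.on_prepare_succeeded(v2, 250); // current: applied
    assert_eq!(state.prepared_fee, Some(250));
    println!("ok");
}
```

`ConfirmRequested` would apply the same guard, refusing to send unless the phase and version still match.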
In `@coincube-gui/src/app/state/spark/settings.rs`:
- Around line 216-264: Capture the pre-toggle value and return it with the RPC
result so the rollback can restore that exact state: in StableBalanceToggled,
save let prev = self.stable_balance_active before changing
self.stable_balance_active, then change the Task::perform payload to return
(result, prev) (i.e., async move { (backend.set_stable_balance(enabled).await,
prev) }) and update the message variant SparkSettingsMessage::StableBalanceSaved
to carry the previous value (e.g., Result<(bool, Option<bool>), String> or add a
new payload field) so StableBalanceSaved(result, prev) handler can set
self.stable_balance_saving = false and on Err restore self.stable_balance_active
= prev rather than flipping the current value; update all construction sites of
StableBalanceSaved in the StableBalanceToggled success/error branches
accordingly (use backend.set_stable_balance, StableBalanceToggled,
StableBalanceSaved, self.stable_balance_active, self.stable_balance_saving to
locate changes).
In `@coincube-gui/src/app/view/spark/receive.rs`:
- Around line 267-274: The short_error function currently slices the input
string by byte index (using &err[..MAX]) which can panic for multi-byte UTF-8
characters; change the truncation to operate on Unicode scalar values (chars)
instead of bytes — e.g., count up to MAX characters using err.char_indices() or
err.chars().take(MAX).collect::<String>() and then append the ellipsis when
truncated; keep the MAX constant as the character limit and ensure the function
returns the full string when its character count is <= MAX.
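A char-based truncation avoids the boundary panic; in this sketch the helper name mirrors the review and `MAX` is an illustrative limit (the real constant differs):

```rust
// Slicing with &err[..MAX] panics when byte index MAX falls inside a
// multi-byte UTF-8 character; counting Unicode scalar values (chars)
// instead is always safe. MAX is treated as a character limit here.
fn short_error(err: &str) -> String {
    const MAX: usize = 10; // illustrative character limit
    if err.chars().count() <= MAX {
        err.to_string()
    } else {
        let truncated: String = err.chars().take(MAX).collect();
        format!("{truncated}…")
    }
}

fn main() {
    // Short strings pass through untouched.
    assert_eq!(short_error("short"), "short");

    // Twelve two-byte characters: a byte-indexed slice at an odd
    // offset would panic, the char-based version truncates cleanly.
    let long = "é".repeat(12);
    let expect = format!("{}…", "é".repeat(10));
    assert_eq!(short_error(&long), expect);
    println!("ok");
}
```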
In `@coincube-gui/src/app/view/spark/transactions.rs`:
- Around line 60-64: Update the empty-state copy in the
SparkTransactionsStatus::Loaded(payments) if payments.is_empty() branch (the
Column::new() push that calls p1_regular) so it no longer references "once the
Send / Receive panels land"; replace that string with text that reflects the
panels are available now (for example: "No payments yet. Incoming and outgoing
payments will appear here when you send or receive funds."). Ensure the new
message is used in the same place where p1_regular(...) is currently called.
In `@coincube-gui/src/gui/tab.rs`:
- Around line 367-370: The installer path recreates the app with
app::App::new_without_wallet passing None for the Spark backend, which drops an
existing spark_backend when backing out of vault setup; update the Installer
struct to carry the existing spark_backend (similar to how it already carries
breez_client) and when constructing the app in the vault flow pass
Installer.spark_backend (or clone) into app::App::new_without_wallet instead of
None so the Spark backend is preserved; ensure all places that construct
Installer initialize and propagate spark_backend accordingly.
In `@coincube-gui/src/services/lnurl/stream.rs`:
- Around line 321-344: The Spark branch currently trusts event.description
without verifying it against event.description_hash; compute the SHA-256 hash of
the UTF-8 bytes of event.description and compare it (in the same encoding/format
used elsewhere for event.description_hash) before calling receive_bolt11 in the
use_spark path (symbols: use_spark, event.description, event.description_hash,
receive_bolt11, WalletKind::Spark, BOLT11_MAX_DESCRIPTION_BYTES); if the hash
does not match, treat it as a mismatch and fall back to the non-Spark path
(i.e., do not call spark.receive_bolt11 and proceed to the else/liquid handling)
or return an error, so the minted invoice’s d tag cannot diverge from the
validated metadata hash.
In `@coincube-spark-bridge/src/server.rs`:
- Around line 63-85: The sweep task keeps the ServerState alive and prevents
writer shutdown; replace the strong Arc usage with a Weak reference so the sweep
doesn’t extend ServerState lifetime: create let sweep_state =
Arc::downgrade(&state) and inside the loop attempt sweep_state.upgrade() — if
upgrade() returns None break the loop, otherwise call sweep_expired_prepares
with the upgraded Arc; additionally ensure any extra event_tx clones owned by
ServerState are dropped (or explicitly closed) before awaiting writer_task in
run() so rx.recv() can observe channel closure.
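The `Weak`-based sweep can be sketched as a single sweep "tick" (this `ServerState` is a simplified stand-in, and the real loop would also await an interval timer between ticks):

```rust
use std::sync::atomic::{AtomicU32, Ordering};
use std::sync::{Arc, Weak};

// Simplified stand-in for the bridge's shared server state.
struct ServerState {
    swept: AtomicU32,
}

// One sweep iteration: holding only a Weak means the background task
// cannot keep ServerState (and its event_tx) alive; it bails out as
// soon as the last strong reference is gone.
fn sweep_tick(weak: &Weak<ServerState>) -> bool {
    match weak.upgrade() {
        Some(state) => {
            // ... sweep_expired_prepares(&state) would run here ...
            state.swept.fetch_add(1, Ordering::SeqCst);
            true // keep looping
        }
        None => false, // state dropped: exit the sweep loop
    }
}

fn main() {
    let state = Arc::new(ServerState { swept: AtomicU32::new(0) });
    let weak = Arc::downgrade(&state);

    assert!(sweep_tick(&weak)); // state alive: sweep runs
    assert_eq!(state.swept.load(Ordering::SeqCst), 1);

    drop(state); // server shuts down, last strong reference gone
    assert!(!sweep_tick(&weak)); // the loop would now break
    println!("ok");
}
```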
In `@docs/WALLETS.md`:
- Around line 7-31: The fenced code block in docs/WALLETS.md currently has a
bare opening fence which triggers markdownlint MD040; update the opening fence
to specify a language (e.g., change the backticks starting the tree block to
```text) so the tree layout block is explicitly marked as text; locate the
fenced block that contains the directory tree under coincube-gui/src/app/ and
modify only the opening fence to include the language identifier (leave the
block contents and closing fence unchanged).
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: c88001fc-28c4-49b5-ad66-7c553c8b2633
⛔ Files ignored due to path filters (2)
- `Cargo.lock` is excluded by `!**/*.lock`
- `coincube-spark-bridge/Cargo.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (71)
.env.example
Cargo.toml
coincube-gui/Cargo.toml
coincube-gui/src/app/breez_liquid/assets.rs
coincube-gui/src/app/breez_liquid/client.rs
coincube-gui/src/app/breez_liquid/config.rs
coincube-gui/src/app/breez_liquid/mod.rs
coincube-gui/src/app/breez_spark/assets.rs
coincube-gui/src/app/breez_spark/client.rs
coincube-gui/src/app/breez_spark/config.rs
coincube-gui/src/app/breez_spark/mod.rs
coincube-gui/src/app/cache.rs
coincube-gui/src/app/menu.rs
coincube-gui/src/app/message.rs
coincube-gui/src/app/mod.rs
coincube-gui/src/app/settings/mod.rs
coincube-gui/src/app/state/connect/cube.rs
coincube-gui/src/app/state/connect/mod.rs
coincube-gui/src/app/state/global_home.rs
coincube-gui/src/app/state/liquid/overview.rs
coincube-gui/src/app/state/liquid/receive.rs
coincube-gui/src/app/state/liquid/send.rs
coincube-gui/src/app/state/liquid/settings.rs
coincube-gui/src/app/state/liquid/sideshift_receive.rs
coincube-gui/src/app/state/liquid/sideshift_send.rs
coincube-gui/src/app/state/liquid/transactions.rs
coincube-gui/src/app/state/mod.rs
coincube-gui/src/app/state/spark/mod.rs
coincube-gui/src/app/state/spark/overview.rs
coincube-gui/src/app/state/spark/receive.rs
coincube-gui/src/app/state/spark/send.rs
coincube-gui/src/app/state/spark/settings.rs
coincube-gui/src/app/state/spark/transactions.rs
coincube-gui/src/app/state/vault/export.rs
coincube-gui/src/app/view/buysell/mavapay/mod.rs
coincube-gui/src/app/view/buysell/panel.rs
coincube-gui/src/app/view/global_home.rs
coincube-gui/src/app/view/liquid/overview.rs
coincube-gui/src/app/view/liquid/receive.rs
coincube-gui/src/app/view/liquid/send.rs
coincube-gui/src/app/view/liquid/sideshift_send.rs
coincube-gui/src/app/view/liquid/transactions.rs
coincube-gui/src/app/view/message.rs
coincube-gui/src/app/view/mod.rs
coincube-gui/src/app/view/spark/mod.rs
coincube-gui/src/app/view/spark/overview.rs
coincube-gui/src/app/view/spark/receive.rs
coincube-gui/src/app/view/spark/send.rs
coincube-gui/src/app/view/spark/settings.rs
coincube-gui/src/app/view/spark/transactions.rs
coincube-gui/src/app/wallets/liquid.rs
coincube-gui/src/app/wallets/mod.rs
coincube-gui/src/app/wallets/registry.rs
coincube-gui/src/app/wallets/spark.rs
coincube-gui/src/app/wallets/types.rs
coincube-gui/src/export.rs
coincube-gui/src/gui/tab.rs
coincube-gui/src/installer/mod.rs
coincube-gui/src/launcher.rs
coincube-gui/src/loader.rs
coincube-gui/src/services/connect/login.rs
coincube-gui/src/services/lnurl/mod.rs
coincube-gui/src/services/lnurl/stream.rs
coincube-spark-bridge/Cargo.toml
coincube-spark-bridge/src/main.rs
coincube-spark-bridge/src/sdk_adapter.rs
coincube-spark-bridge/src/server.rs
coincube-spark-protocol/Cargo.toml
coincube-spark-protocol/src/lib.rs
docs/SPARK_WALLET.md
docs/WALLETS.md
```
# explicitly — it takes precedence over `BREEZ_API_KEY` for the Spark
# backend. See `coincube-gui/src/app/breez_spark/config.rs`.
BREEZ_API_KEY=
BREEZ_SPARK_API_KEY=
```
*(no newline at end of file)*
Add a trailing newline to satisfy dotenv lint.
The file currently ends immediately after BREEZ_SPARK_API_KEY=, which matches the reported EndingBlankLine warning.
🔧 Proposed fix

```diff
 BREEZ_API_KEY=
 BREEZ_SPARK_API_KEY=
+
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```
BREEZ_API_KEY=
BREEZ_SPARK_API_KEY=
```
🧰 Tools
🪛 dotenv-linter (4.0.0)
[warning] 21-21: [EndingBlankLine] No blank line at the end of the file
(EndingBlankLine)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.env.example at line 21, The .env.example currently ends immediately after
the BREEZ_SPARK_API_KEY= line and triggers the EndingBlankLine lint; add a
single trailing newline at the end of the file so the file ends with a blank
line (ensure the final character after the BREEZ_SPARK_API_KEY= line is a
newline).
```rust
if self
    .inner
    .closed
    .swap(true, Ordering::SeqCst)
{
    return Ok(());
}

// Best-effort: send Shutdown and wait up to 5s for the child
// to exit, otherwise kill it.
let shutdown_result = tokio::time::timeout(
    Duration::from_secs(5),
    self.request(Method::Shutdown),
)
.await;
```
shutdown() never sends the Shutdown RPC.
It sets closed = true before calling request(Method::Shutdown), but request() immediately returns BridgeUnavailable once that flag is set. So this path always skips the graceful RPC and falls back to wait/kill. Move the flag update after the request, or let request() special-case Method::Shutdown.
Also applies to: 525-533
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/app/breez_spark/client.rs` around lines 483 - 497, In
shutdown(), the code sets inner.closed (via closed.swap(true, Ordering::SeqCst))
before calling request(Method::Shutdown), which causes request() to immediately
return BridgeUnavailable; change the logic so the Shutdown RPC actually gets
sent: either move the closed.swap(true, ...) to after awaiting
self.request(Method::Shutdown) inside shutdown(), or update request() to
special-case Method::Shutdown and allow the call even when inner.closed is true;
ensure the same fix is applied to the equivalent block around lines 525-533 (the
alternate shutdown path).
```rust
fn spawn_reader_task(
    stdout: tokio::process::ChildStdout,
    pending: PendingMap,
    event_tx: broadcast::Sender<Event>,
) {
    tokio::spawn(async move {
        let mut lines = BufReader::new(stdout).lines();
        loop {
            match lines.next_line().await {
                Ok(Some(line)) => {
                    if line.trim().is_empty() {
                        continue;
                    }
                    let frame: Frame = match serde_json::from_str(&line) {
                        Ok(f) => f,
                        Err(e) => {
                            warn!("Spark bridge emitted unparseable line: {} ({})", line, e);
                            continue;
                        }
                    };
                    match frame {
                        Frame::Response(resp) => {
                            if let Some(sender) = pending.lock().await.remove(&resp.id) {
                                let _ = sender.send(resp);
                            } else {
                                warn!(
                                    "Spark bridge response for unknown id {} — dropping",
                                    resp.id
                                );
                            }
                        }
                        Frame::Event(event) => {
                            // `broadcast::Sender::send` returns Err only
                            // when there are zero active receivers.
                            // That's fine in Phase 4d — if no panel has
                            // subscribed yet, the event is dropped
                            // on the floor. Log at debug so smoke
                            // tests can observe the flow.
                            debug!("Spark bridge event: {:?}", event);
                            let _ = event_tx.send(event);
                        }
                        Frame::Request(_) => {
                            warn!("Spark bridge sent a Request frame — ignoring");
                        }
                    }
                }
                Ok(None) => {
                    debug!("Spark bridge stdout closed");
                    break;
                }
                Err(e) => {
                    warn!("Spark bridge stdout read error: {}", e);
                    break;
                }
            }
        }
    });
}
```
Fail in-flight requests immediately when the bridge dies.
When stdout closes or read errors, this task just exits. The pending map still retains the outstanding oneshot::Senders, so callers sit until the 30s timeout even though the subprocess is already gone. Clear the pending map and mark the client closed before breaking so waiting requests fail immediately.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/app/breez_spark/client.rs` around lines 690 - 747, On stdout
close or read error inside spawn_reader_task, drain the pending map and fail
each outstanding oneshot by taking pending.lock().await, removing all entries
(e.g., via drain or iterating keys), and sending an immediate error/response to
each sender so callers don't wait the full timeout; perform this cleanup in both
the Ok(None) and Err(e) branches before breaking. Also mark the client as closed
(set the shared "closed" flag or notify the client-closed signal used by the
client code) after failing pending requests so other tasks see the subprocess is
dead. Ensure you update only symbols shown here: spawn_reader_task, pending
(PendingMap), and event_tx/Frame handling, and send a failure through each
oneshot::Sender previously stored in pending.
```rust
//! Spark SDK configuration — read at **compile time** from env vars.
//!
//! Mirrors the pattern used by [`crate::app::breez_liquid::config`]:
//! the API key is baked into the binary via `env!(...)` so the packaged
//! app doesn't need a runtime `.env` to connect. Override the value by
//! setting `BREEZ_SPARK_API_KEY` (or `BREEZ_API_KEY`, which is the
//! fallback — the same issued Breez key currently covers both the Liquid
//! and Spark SDKs) before running `cargo build`.
```
🧩 Analysis chain
🏁 Script executed:

```shell
# First, let's find and read the Breez Spark config file
find . -name "config.rs" | grep breez_spark
```
Repository: coincubetech/coincube
Length of output: 109

🏁 Script executed:

```shell
# Let's also check the file structure
ls -la coincube-gui/src/app/
```
Repository: coincubetech/coincube
Length of output: 961

🏁 Script executed:

```shell
# Read the Breez Spark config file
cat -n coincube-gui/src/app/breez_spark/config.rs
```
Repository: coincubetech/coincube
Length of output: 3747

🏁 Script executed:

```shell
# Also read the Breez Liquid config file for comparison
cat -n coincube-gui/src/app/breez_liquid/config.rs
```
Repository: coincubetech/coincube
Length of output: 4482
Align Spark API key loading with the stricter Breez Liquid pattern.
The api_key_from_env() function uses option_env!() with an unwrap_or("") fallback, which silently bakes an empty string into the binary if neither BREEZ_SPARK_API_KEY nor BREEZ_API_KEY is set. In contrast, the Breez Liquid config (which this file claims to mirror) uses env!() directly, causing a hard compile-time failure if BREEZ_API_KEY is missing. For consistency and to catch misconfigured builds earlier, consider switching to env!() or explicitly returning a SparkConfigError on missing credentials rather than deferring the failure to SparkClient::connect handshake time.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/app/breez_spark/config.rs` around lines 1 - 8, The current
api_key_from_env() bakes an empty string via option_env!(...).unwrap_or("")
which hides missing credentials; change it to enforce compile-time or explicit
failure: either replace option_env! with env!(...) so missing
BREEZ_SPARK_API_KEY / BREEZ_API_KEY triggers a compile error (mirroring the
Breez Liquid pattern), or keep option_env! but return a Result and map a missing
key to a SparkConfigError (e.g., return Err(SparkConfigError::MissingApiKey)) so
callers of api_key_from_env()/SparkConfig::new() fail early instead of embedding
an empty API key; update api_key_from_env, SparkConfig::new, and any callers to
handle the Result accordingly.
```rust
/// Which backend should fulfill incoming Lightning Address invoices
/// for this cube. Starts as `Liquid` (backwards-compatible default
/// for existing cubes); Phase 5 flips the default to `Spark` and
/// adds a per-cube override in Settings.
#[serde(default)]
pub default_lightning_backend: crate::app::wallets::WalletKind,
```
Documentation states incorrect default value.
The doc comment states the default is Liquid for backwards compatibility, but WalletKind::default() in types.rs is Spark. For existing cubes where the field is absent, #[serde(default)] will use Spark, not Liquid.
If backwards compatibility requires Liquid as the default for existing cubes, you need to either:
- Change `WalletKind::default()` to `Liquid`, or
- Use a custom default function: `#[serde(default = "default_liquid")]`
🔧 Option: Custom default for backwards compatibility
If Liquid should be the default for deserialization (existing cubes):

```diff
+fn default_lightning_backend() -> crate::app::wallets::WalletKind {
+    crate::app::wallets::WalletKind::Liquid
+}
+
 /// Which backend should fulfill incoming Lightning Address invoices
-/// for this cube. Starts as `Liquid` (backwards-compatible default
-/// for existing cubes); Phase 5 flips the default to `Spark` and
-/// adds a per-cube override in Settings.
-#[serde(default)]
+/// for this cube. Defaults to `Liquid` for backwards compatibility
+/// with existing cubes; new cubes created in Phase 5+ default to `Spark`.
+#[serde(default = "default_lightning_backend")]
 pub default_lightning_backend: crate::app::wallets::WalletKind,
```

Then in `new_with_id`, explicitly set `Spark` for new cubes (which is already done at line 215).
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/app/settings/mod.rs` around lines 178 - 183, The doc is
wrong because serde(default) will use WalletKind::default() (currently Spark);
fix by providing an explicit deserialization default that returns Liquid: add a
function (e.g., default_liquid() -> crate::app::wallets::WalletKind {
WalletKind::Liquid }) and change the field attribute to #[serde(default =
"default_liquid")] on default_lightning_backend; alternatively, if you want
every default across the code to be Liquid, change WalletKind::default() in
types.rs to return WalletKind::Liquid — if you choose the serde-default
approach, ensure new_with_id still explicitly sets Spark for newly created cubes
as before.
```rust
SparkTransactionsStatus::Loaded(payments) if payments.is_empty() => Column::new()
    .push(p1_regular(
        "No payments yet. Incoming and outgoing payments will \
         appear here once the Send / Receive panels land.",
    ))
```
Fix the empty-state copy.
This message says payments will appear once the Send / Receive panels land, but this PR already adds those panels. Shipped as-is, it will read like stale feature-gating text.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/app/view/spark/transactions.rs` around lines 60 - 64, Update
the empty-state copy in the SparkTransactionsStatus::Loaded(payments) if
payments.is_empty() branch (the Column::new() push that calls p1_regular) so it
no longer references "once the Send / Receive panels land"; replace that string
with text that reflects the panels are available now (for example: "No payments
yet. Incoming and outgoing payments will appear here when you send or receive
funds."). Ensure the new message is used in the same place where p1_regular(...)
is currently called.
```rust
let (app, command) = app::App::new_without_wallet(
    breez.clone(),
    None, // Spark backend — Phase 4 wires the runtime spawn
    cfg,
```
Preserve spark_backend when returning from the installer.
This path recreates the app with None for Spark, so a cube that already loaded Spark loses it as soon as the user backs out of vault setup. Installer needs to carry the existing Spark backend the same way it already carries breez_client.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/gui/tab.rs` around lines 367 - 370, The installer path
recreates the app with app::App::new_without_wallet passing None for the Spark
backend, which drops an existing spark_backend when backing out of vault setup;
update the Installer struct to carry the existing spark_backend (similar to how
it already carries breez_client) and when constructing the app in the vault flow
pass Installer.spark_backend (or clone) into app::App::new_without_wallet
instead of None so the Spark backend is preserved; ensure all places that
construct Installer initialize and propagate spark_backend accordingly.
```rust
let use_spark = matches!(preferred, WalletKind::Spark)
    && spark_backend.is_some()
    && event
        .description
        .as_deref()
        .is_some_and(|d| d.len() <= BOLT11_MAX_DESCRIPTION_BYTES);

let invoice_result = if use_spark {
    let spark = spark_backend.expect("checked above");
    let description = event
        .description
        .clone()
        .expect("checked above");
    log::info!(
        "[LNURL] Routing request {} via Spark (desc_len={})",
        request_id,
        description.len()
    );
    spark
        .receive_bolt11(Some(amount_sat), description, None)
        .await
        .map(|ok| ok.payment_request)
        .map_err(|e| e.to_string())
} else {
```
Verify the Spark description preimage before using it.
The Spark branch trusts event.description and never checks that it matches event.description_hash. If those fields ever diverge, we'll mint an invoice whose d tag does not match the metadata hash the payer validated, which can break the payment or commit to the wrong description. Please hash-check the description before receive_bolt11(...) and fall back to Liquid (or reject) on mismatch.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-gui/src/services/lnurl/stream.rs` around lines 321 - 344, The Spark
branch currently trusts event.description without verifying it against
event.description_hash; compute the SHA-256 hash of the UTF-8 bytes of
event.description and compare it (in the same encoding/format used elsewhere for
event.description_hash) before calling receive_bolt11 in the use_spark path
(symbols: use_spark, event.description, event.description_hash, receive_bolt11,
WalletKind::Spark, BOLT11_MAX_DESCRIPTION_BYTES); if the hash does not match,
treat it as a mismatch and fall back to the non-Spark path (i.e., do not call
spark.receive_bolt11 and proceed to the else/liquid handling) or return an
error, so the minted invoice’s d tag cannot diverge from the validated metadata
hash.
```rust
let state = Arc::new(ServerState::new(tx.clone()));

// Phase 4f: background sweep that evicts pending-prepare entries
// older than `PREPARE_TTL`. Both pending maps share the same
// policy. The task lives for the bridge process lifetime; it
// exits when the ServerState's strong count drops to 1 (i.e.
// only the sweep task itself holds it, which only happens after
// `run()` returns and the dispatcher state has been dropped).
let sweep_state = Arc::clone(&state);
tokio::spawn(async move {
    let mut tick = tokio::time::interval(PREPARE_SWEEP_INTERVAL);
    // Skip the immediate first tick — wait one full interval so
    // the sweep doesn't fire during the init handshake.
    tick.tick().await;
    loop {
        tick.tick().await;
        // Stop sweeping once the dispatcher state is gone.
        if Arc::strong_count(&sweep_state) <= 1 {
            break;
        }
        sweep_expired_prepares(&sweep_state).await;
    }
});
```
Shutdown/EOF can hang because the writer channel never closes.
run() waits for writer_task, but ServerState still owns an event_tx clone and the sweep task keeps an Arc<ServerState> alive. That means rx.recv() may never observe channel closure after EOF or shutdown, so the bridge can hang instead of exiting cleanly. Use an explicit writer-stop path, or drop/abort the remaining state/tasks before awaiting the writer.
Also applies to: 147-149, 178-182
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@coincube-spark-bridge/src/server.rs` around lines 63 - 85, The sweep task
keeps the ServerState alive and prevents writer shutdown; replace the strong Arc
usage with a Weak reference so the sweep doesn’t extend ServerState lifetime:
create let sweep_state = Arc::downgrade(&state) and inside the loop attempt
sweep_state.upgrade() — if upgrade() returns None break the loop, otherwise call
sweep_expired_prepares with the upgraded Arc; additionally ensure any extra
event_tx clones owned by ServerState are dropped (or explicitly closed) before
awaiting writer_task in run() so rx.recv() can observe channel closure.
```
coincube-gui/src/app/
├── wallets/           ← domain types + registry
│   ├── mod.rs         pub-use surface
│   ├── types.rs       WalletKind, DomainPayment*, DomainRefundableSwap
│   ├── registry.rs    WalletRegistry + LightningRoute
│   ├── liquid.rs      LiquidBackend (wraps BreezClient)
│   └── spark.rs       SparkBackend (wraps SparkClient)
├── breez_liquid/      ← Liquid SDK wrapper (in-process)
│   ├── mod.rs         loader (load_breez_client)
│   ├── config.rs      BreezConfig::from_env
│   ├── client.rs      BreezClient — full Liquid read/write API
│   └── assets.rs      L-BTC / USDt descriptors
├── breez_spark/       ← Spark SDK wrapper (subprocess)
│   ├── mod.rs         loader (load_spark_client)
│   ├── config.rs      SparkConfig::from_env
│   ├── client.rs      SparkClient — spawns + owns the bridge subprocess
│   └── assets.rs      Spark asset registry (BTC-only today)
├── state/
│   ├── liquid/        LiquidOverview, LiquidSend, LiquidReceive, …
│   └── spark/         SparkOverview, SparkSend, SparkReceive, …
└── view/
    ├── liquid/        view renderers
    └── spark/         view renderers
```
Add a language to this fenced block.
The opening fence here is bare, which trips markdownlint's MD040. text would be enough for the tree layout.
🧰 Tools
🪛 markdownlint-cli2 (0.22.0)
[warning] 13-13: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/WALLETS.md` around lines 7 - 31, The fenced code block in
docs/WALLETS.md currently has a bare opening fence which triggers markdownlint
MD040; update the opening fence to specify a language (e.g., change the
backticks starting the tree block to ```text) so the tree layout block is
explicitly marked as text; locate the fenced block that contains the directory
tree under coincube-gui/src/app/ and modify only the opening fence to include
the language identifier (leave the block contents and closing fence unchanged).


Note
High Risk
Adds a new Spark wallet payment/receive flow and an IPC subprocess client, plus changes settings schema and Lightning Address routing, which impacts core wallet/payment behavior and persistence compatibility.
Overview
- Adds a new Spark wallet path in the GUI: a `breez_spark` module that spawns and talks to `coincube-spark-bridge` over JSONL IPC (request/response + event stream), plus new Spark panel states for Overview/Send/Receive/Transactions/Settings and a new `Menu::Spark` section with cached expand/collapse state.
- Introduces a `wallets` abstraction layer with domain payment types, migrating Liquid panels from direct `breez-sdk-liquid` types to `DomainPayment`/`DomainRefundableSwap` and routing Liquid access through `LiquidBackend`; adds `WalletRegistry` to hold Liquid + optional Spark backends and wires Spark event subscriptions into the app loop.
- Updates persisted settings to rename `liquid_wallet_signer_fingerprint` to `breez_wallet_signer_fingerprint` (with a serde alias for backwards compatibility) and adds a `default_lightning_backend` preference; extends `.env.example` with `BREEZ_SPARK_API_KEY` (fallback to `BREEZ_API_KEY`) and adds a new `coincube-spark-protocol` workspace crate while explicitly excluding `coincube-spark-bridge` from the main workspace due to dependency conflicts.

Reviewed by Cursor Bugbot for commit 72ca626.
Summary by CodeRabbit
New Features
Documentation