refactor: consolidate fee splitting functions in Gateway contract #80

onahprosper wants to merge 14 commits into `main` from
Conversation
…ate event names in IGateway interface
📝 Walkthrough

The PR renames and splits settlement flows (adds settleIn, renames settle → settleOut), updates fee-splitting signatures/events, adds ProviderBatchCallAndSponsor for signed batched execution, migrates tests/fixtures to ESM/BigInt, updates Hardhat config and generated typings/artifacts, and adjusts deployment/ignition modules and artifacts.
Sequence Diagram

```mermaid
sequenceDiagram
    participant User
    participant ProviderBatchCallAndSponsor
    participant Aggregator
    participant Gateway
    participant Token as ERC20

    User->>ProviderBatchCallAndSponsor: execute(calls, signature)
    activate ProviderBatchCallAndSponsor
    ProviderBatchCallAndSponsor->>ProviderBatchCallAndSponsor: recover signer from signature
    ProviderBatchCallAndSponsor->>ProviderBatchCallAndSponsor: verify signer == IGateway(gateway).getAggregator()
    ProviderBatchCallAndSponsor->>ProviderBatchCallAndSponsor: increment nonce
    loop each Call
        ProviderBatchCallAndSponsor->>ProviderBatchCallAndSponsor: _executeCall(call)
        ProviderBatchCallAndSponsor->>ProviderBatchCallAndSponsor: emit CallExecuted
    end
    ProviderBatchCallAndSponsor->>ProviderBatchCallAndSponsor: emit BatchExecuted(nonce)
    deactivate ProviderBatchCallAndSponsor

    rect rgba(0, 150, 200, 0.5)
        Note over Aggregator,Gateway: Outbound settlement (settleOut)
        Aggregator->>Gateway: settleOut(splitOrderId, orderId, liquidityProvider, settlePercent, rebatePercent)
        activate Gateway
        Gateway->>Gateway: update order state
        Gateway->>Gateway: choose local vs FX fee split
        Gateway->>Token: transfer fees to recipients
        Gateway->>Gateway: emit SettleOut and fee events
        deactivate Gateway
    end

    rect rgba(100, 200, 100, 0.5)
        Note over User,Gateway: Inbound settlement (settleIn)
        User->>Gateway: settleIn(orderId, token, amount, senderFeeRecipient, senderFee, recipient, rate)
        activate Gateway
        Token->>Gateway: transfer(amount)
        Gateway->>Gateway: update order state
        Gateway->>Gateway: choose local vs FX fee split
        Gateway->>Token: transfer fees to recipients
        Gateway->>Gateway: emit SettleIn and fee events
        deactivate Gateway
    end
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
Actionable comments posted: 7
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
contracts/Gateway.sol (1)
214-222: Event name mismatch: emits `OrderSettled` but the interface declares `SettleOut`.

The interface has been updated to declare the `SettleOut` event (line 41 in IGateway.sol), but the implementation still emits `OrderSettled`. This will cause a compilation error or event signature mismatch.

🐛 Proposed fix

```diff
 // emit settled event
-emit OrderSettled(
+emit SettleOut(
     _splitOrderId,
     _orderId,
     _liquidityProvider,
     _settlePercent,
     _rebatePercent
 );
```
🤖 Fix all issues with AI agents
In `@contracts/Gateway.sol`:
- Line 238: The settleIn handler only checks order[_orderId].isFulfilled and can
overwrite an existing order created by createOrder; add a strict existence check
(e.g., require(order[_orderId].creator == address(0) || !order[_orderId].exists,
'OrderAlreadyExists')) before writing to the order mapping in settleIn to ensure
an orderId cannot be reused, and apply the same existence check to the analogous
settlement code path mentioned in the review (the other settlement block that
currently mirrors this logic) so neither path can overwrite orders from
createOrder.
- Around line 227-228: The NatSpec reference above function settleIn in
Gateway.sol is pointing to the wrong symbol ({processSettlement-IGateway});
update the doc to reference the correct interface member by changing the tag to
{settleIn-IGateway} (or the exact symbol name used in IGateway) so the NatSpec
link resolves to the interface method for settleIn.
- Line 237: The settleIn function is missing the onlyAggregator modifier,
allowing unauthorized callers to perform settlements; update the settleIn
function signature (the external whenNotPaused returns (bool) declaration for
settleIn) to include onlyAggregator just like settleOut and refund so only the
aggregator can invoke it, ensuring the same access control is applied.
- Around line 242-253: In settleIn, the contract only transfers _amount up front
but for local transfers (rate == 100) the function later calls
_handleLocalTransferFeeSplitting which expects the contract to hold _senderFee
as well; update the deposit so the contract pulls the sender fee for local
transfers by changing the initial IERC20(_token).transferFrom call to transfer
_amount + _senderFee when _rate == 100 (compute a requiredDeposit variable
before the transfer and use it), keep processedAmount -= _senderFee and
protocolFee logic intact, and ensure this mirrors createOrder's behavior so
subsequent local fee transfers in _handleLocalTransferFeeSplitting succeed.
In `@contracts/interfaces/IGateway.sol`:
- Around line 52-60: The SettleIn event declares an address parameter named
senderFee which is misleading; rename the parameter to senderFeeRecipient in the
SettleIn event declaration (and update any references) so it matches the emitted
argument in Gateway.sol and clarifies that the value is an address recipient,
not an amount; update the event signature and all usages that reference SettleIn
(including any emits) to use senderFeeRecipient.
- Around line 41-47: The implementation still emits the old OrderSettled event;
update the emission to use the interface's renamed event SettleOut with the
exact parameter order and indexed fields (bytes32 splitOrderId, bytes32 indexed
orderId, address indexed liquidityProvider, uint64 settlePercent, uint64
rebatePercent). Locate the emit call in Gateway.sol (the function that currently
does emit OrderSettled(...)) and replace it with emit SettleOut(...) passing the
same values so listeners of SettleOut receive the event.
In `@contracts/ProviderBatchCallAndSponsor.sol`:
- Around line 56-60: The current construction of encodedCalls via repeated
abi.encodePacked concatenation is both quadratic (repeated reallocations) and
unsafe for hashing because abi.encodePacked with dynamic bytes can produce
ambiguous concatenations; instead compute the digest with
keccak256(abi.encode(nonce, calls)) (or keccak256(abi.encode(nonce,
individualCallStructArray)) if calls is a struct array) and remove the manual
loop and encodedCalls variable. Replace the abi.encodePacked usage with
abi.encode when building the hash and use the calls array directly in keccak256
to eliminate collision risk and O(n²) behavior (referencing encodedCalls, calls,
nonce, and the keccak256 line).
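The packed-encoding pitfall called out above can be demonstrated outside Solidity. A minimal sketch in plain Node (the byte values are arbitrary, chosen only for illustration): concatenating dynamic fields without length prefixes, as `abi.encodePacked` does, maps different inputs to identical bytes, while length-prefixing each field (the idea behind `abi.encode`) keeps them distinct.

```javascript
// Packed concatenation of dynamic fields: ["ab", "c"] and ["a", "bc"]
// produce identical bytes, so their hashes would collide.
const packed = (parts) => Buffer.concat(parts.map((p) => Buffer.from(p)));

const a = packed(["ab", "c"]);
const b = packed(["a", "bc"]);
console.log(a.equals(b)); // true — ambiguous concatenation

// Length-prefixing each dynamic field (what a length-aware encoding does)
// removes the ambiguity.
const prefixed = (parts) =>
  Buffer.concat(
    parts.flatMap((p) => {
      const buf = Buffer.from(p);
      const len = Buffer.alloc(4);
      len.writeUInt32BE(buf.length);
      return [len, buf];
    })
  );

console.log(prefixed(["ab", "c"]).equals(prefixed(["a", "bc"]))); // false
```

This is why hashing `abi.encode(nonce, calls)` is collision-resistant where the packed loop is not.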
🧹 Nitpick comments (4)
contracts/ProviderBatchCallAndSponsor.sol (3)
25-25: Hardcoded gateway address limits contract reusability.

The `gatewayAddress` is a compile-time constant, which means this contract cannot be reused across different deployments or networks where the Gateway contract has a different address. Consider using an immutable variable set in the constructor, or an upgradeable pattern if flexibility is needed:

♻️ Suggested refactor

```diff
-address public constant gatewayAddress = 0x30F6A8457F8E42371E204a9c103f2Bd42341dD0F;
+address public immutable gatewayAddress;
+
+constructor(address _gatewayAddress) {
+    require(_gatewayAddress != address(0), "Invalid gateway");
+    gatewayAddress = _gatewayAddress;
+}
```
71-78: Consider removing or documenting the stub function more clearly.

This overloaded `execute` function always reverts. If it's intentionally disabled, consider:
- Adding a more descriptive revert reason explaining why direct execution is not supported
- Or removing it entirely if not needed for interface compliance

Also, minor typo in the natspec: "doesnt" → "doesn't".
99-103: Original revert reason is lost on failed calls.

When a call fails, the original revert reason is discarded, making debugging difficult. Consider capturing and bubbling up the revert data:

♻️ Suggested improvement

```diff
 function _executeCall(Call calldata callItem) internal {
-    (bool success,) = callItem.to.call{value: callItem.value}(callItem.data);
-    require(success, "Call reverted");
+    (bool success, bytes memory returnData) = callItem.to.call{value: callItem.value}(callItem.data);
+    if (!success) {
+        if (returnData.length > 0) {
+            assembly {
+                revert(add(returnData, 32), mload(returnData))
+            }
+        }
+        revert("Call reverted");
+    }
     emit CallExecuted(msg.sender, callItem.to, callItem.value, callItem.data);
 }
```

contracts/interfaces/IGateway.sol (1)
74-74: Consider whether all three parameters should be indexed.

All three parameters (`orderId`, `sender`, `amount`) are indexed. Having `amount` indexed is unusual since you typically filter events by order or sender, not exact amounts. This uses an extra topic slot unnecessarily.

♻️ Suggested change

```diff
-event SenderFeeTransferred(bytes32 indexed orderId, address indexed sender, uint256 indexed amount);
+event SenderFeeTransferred(bytes32 indexed orderId, address indexed sender, uint256 amount);
```
…stence check in Gateway contract
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
contracts/Gateway.sol (1)
156-163: Clarify whether `settleOut` should include the `whenNotPaused` modifier.

`settleOut` is missing the `whenNotPaused` modifier while `createOrder` and `settleIn` both include it. If pause is intended to halt all settlements, add the modifier; if `settleOut` should remain operational during pause (e.g., to allow the aggregator to process pending orders), document this design choice.

🔧 Possible adjustment

```diff
-) external onlyAggregator returns (bool) {
+) external onlyAggregator whenNotPaused returns (bool) {
```
🤖 Fix all issues with AI agents
In `@contracts/Gateway.sol`:
- Around line 181-184: Update the misleading comment above the call to
_handleFxTransferFeeSplitting in Gateway.sol: replace “// fx transfer - sender
keeps all fee” with a concise note that reflects actual behavior (e.g., that
sender fee may be split and a portion can be routed to treasury/aggregator via
senderToAggregator logic). Reference the order[_orderId].senderFee and the
senderToAggregator path in the comment so readers know
_handleFxTransferFeeSplitting can allocate part of the sender fee to
treasury/aggregator rather than the sender keeping the entire fee.
In `@contracts/interfaces/IGateway.sol`:
- Around line 177-180: Update the NatSpec for the settleIn function in the
IGateway interface to clarify that it is intentionally permissionless for onramp
flows (callers supply tokens via transferFrom) rather than limited to an
"authorized aggregator"; explicitly state expected caller behavior, token
transfer pattern, and reference EIP-7702/onramp usage so integrators don’t
assume aggregator-only access and so implementers know to enforce any off-chain
aggregator checks elsewhere.
- Around line 182-186: Update the NatSpec for the _amount parameter to state
that it represents the total tokens transferred into the Gateway (i.e., the
gross transfer-in amount that includes sender fee and any protocol/FX fees)
rather than the net amount delivered to the recipient; apply this change to the
_amount `@param` in the comment for settleIn (and any related Gateway interface
comments) so integrators understand fees are deducted from _amount when
computing recipient and fee amounts.
…Out and implement settleIn for order processing
…ependencies and add license information for several packages
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
test/gateway/gateway.settleOrder.test.js (1)
1106-1112: Fix BigNumber arithmetic: use `.add()` instead of the `+` operator.

`MAX_BPS` is a `BigNumber`, and using the `+` operator will cause JavaScript type coercion issues rather than performing proper BigNumber addition. This could result in unexpected behavior or test failures.

🐛 Proposed fix

```diff
-const invalidRebatePercent = MAX_BPS + 1; // 100.001%
+const invalidRebatePercent = MAX_BPS.add(1); // 100.001%
```
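Note that if the suite has been migrated to ethers v6 native `BigInt` (as the walkthrough suggests), the same mistake surfaces differently: mixing `BigInt` and `Number` throws a `TypeError` instead of silently coercing. A small sketch, with a hypothetical `MAX_BPS` value chosen for illustration:

```javascript
// Hypothetical value — the real MAX_BPS comes from the contract.
const MAX_BPS = 100_000n;

// Mixing BigInt and Number throws rather than silently coercing:
let threw = false;
try {
  const bad = MAX_BPS + 1; // TypeError: cannot mix BigInt and other types
} catch (e) {
  threw = e instanceof TypeError;
}

// The safe form keeps both operands as BigInt:
const invalidRebatePercent = MAX_BPS + 1n;

console.log(threw, invalidRebatePercent);
```

So under BigInt the bug is at least loud; under v5 `BigNumber` the `+` operator coerces silently, which is why the explicit `.add()` fix matters.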
🤖 Fix all issues with AI agents
In `@contracts/ProviderBatchCallAndSponsor.sol`:
- Around line 29-31: The constructor sets gatewayAddress without validation, so
deploying with address(0) would break onlyAggregator checks; update the
constructor (constructor) to validate _gatewayAddress != address(0) using a
require that reverts with a clear message (e.g., "gatewayAddress cannot be
zero") before assigning gatewayAddress, ensuring onlyAggregator can function
correctly.
In `@test/utils/utils.manager.js`:
- Around line 29-43: The Events.Gateway.SenderFeeTransferred entry is unused in
tests; either add assertions for the contract's sender-fee emission in the
settle flow or remove the mapping key to avoid stale test utilities. Locate
test(s) that call the contract's settle function (the contract method settle)
and add an assertion that the transaction emits the SenderFeeTransferred event
(using your test framework's to.emit/expect pattern and the
Events.Gateway.SenderFeeTransferred symbol with the expected args), or simply
delete Events.Gateway.SenderFeeTransferred from test/utils/utils.manager.js if
you prefer not to cover it.
♻️ Duplicate comments (2)
contracts/ProviderBatchCallAndSponsor.sol (1)
60-65: Harden batch signature digest (domain separation + collision resistance).

The digest isn't bound to chain/contract and uses `abi.encodePacked` over dynamic `bytes`, which can collide and is O(n²). Consider hashing the structured data with explicit domain separation.

🔒 Proposed fix

```diff
-bytes memory encodedCalls;
-for (uint256 i = 0; i < calls.length; i++) {
-    encodedCalls = abi.encodePacked(encodedCalls, calls[i].to, calls[i].value, calls[i].data);
-}
-bytes32 digest = keccak256(abi.encodePacked(nonce, encodedCalls));
+bytes32 digest = keccak256(abi.encode(block.chainid, address(this), nonce, calls));
```

Reference: Solidity `abi.encodePacked` dynamic bytes hash collision and recommended `abi.encode` for hashing.

contracts/Gateway.sol (1)
181-184: Update FX fee-split comments to match behavior.

`_handleFxTransferFeeSplitting` does split the sender fee; the comments say the sender keeps all fees.

📝 Suggested edit

```diff
-// fx transfer - sender keeps all fee
+// fx transfer - split sender fee per senderToAggregator settings
 ...
-// FX transfer - sender keeps all fee
+// FX transfer - split sender fee per senderToAggregator settings
```

Also applies to: 287-288
🧹 Nitpick comments (4)
artifacts/@openzeppelin/contracts/utils/Strings.sol/Strings.json (1)
1-10: Add `artifacts/` to `.gitignore` to exclude auto-generated build files.

The `artifacts/` directory currently appears in version control with 41 tracked files. These are auto-generated Hardhat artifacts that can be regenerated via `npx hardhat compile`. Committing them bloats the repository and causes unnecessary diffs. The `.gitignore` file has a comment on line 13, but the actual exclusion pattern is missing. Add `artifacts/` as a proper entry in `.gitignore` to prevent these files from being tracked.

contracts/Gateway.sol (1)
242-242: Clarify that `_amount` already includes the sender fee.

A short inline note here would prevent confusion with `createOrder`, which transfers `_amount + _senderFee`.

📝 Suggested edit

```diff
-IERC20(_token).transferFrom(msg.sender, address(this), _amount);
+// _amount already includes sender fee; fee is split later
+IERC20(_token).transferFrom(msg.sender, address(this), _amount);
```

Based on learnings, this reflects the intended off-chain computation of `_amount` including `_senderFee`.

test/gateway/gateway.settleOrder.test.js (2)
1315-1343: Remove unused variable.

The `ret` variable from `getSupportedInstitutions()` is fetched but not used in this test case. The test only needs to verify the revert condition.

♻️ Proposed cleanup

```diff
 it("Should revert when settleIn is called with zero sender fee for local transfer", async function () {
-    const ret = await getSupportedInstitutions();
     const orderId = ethers.utils.keccak256(ethers.utils.defaultAbiCoder.encode(["string"], ["test-order-4"]));
     const rate = 100; // Local transfer requires sender fee
     const data = [{ bank_account: "09090990903" }];
```
1437-1444: Consider moving the settings reset to `afterEach` for test robustness.

If this test fails before reaching the reset block (lines 1437-1444), subsequent tests may run with incorrect token fee settings. Moving the reset to an `afterEach` hook ensures cleanup regardless of test outcome.

♻️ Alternative: Use afterEach for cleanup

```js
// Inside describe("settleIn", function () {
afterEach(async function () {
  // Ensure token settings are reset after each test
  await gateway.connect(this.deployer).setTokenFeeSettings(
    mockUSDT.address,
    50000, // senderToProvider
    50000, // providerToAggregator
    0,     // senderToAggregator
    500    // providerToAggregatorFx
  );
});
```

Then remove the manual reset at lines 1437-1444.
chibie
left a comment
in addition,
processedAmount should be senderAmount
…tocol fees
- Changed the data structure in IGateway interface to replace senderFeeRecipient with aggregatorFee.
- Updated Gateway contract logic to calculate and transfer aggregator fees instead of protocol fees.
- Modified settleOrder tests to reflect changes in fee calculations and ensure correct amounts are transferred.
- Removed unused event SenderFeeTransferred from Events.
…regator address retrieval
- Renamed variables for clarity in fee calculations, changing `senderAmount` to `amountToSettle`.
- Updated logic to correctly handle aggregator fees during order settlements.
- Added a new function `getAggregator` in the IGateway interface to retrieve the aggregator address.
- Adjusted event emissions to reflect changes in fee handling.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@contracts/Gateway.sol`:
- Around line 227-272: The settleIn function treats _amount as net instead of
the documented gross (it currently calls IERC20.transferFrom(msg.sender,
address(this), _amount + _senderFee) and sends full _amount to the recipient
while also handling _senderFee), causing double‑charging and overpayment; fix
by: change the incoming pull to IERC20.transferFrom(..., _amount) only, compute
amountToSettle = _amount - _senderFee (and then subtract any aggregatorFee for
FX flows), send aggregatorFee to treasury if applicable, update
order[_orderId].amount to the net amountToSettle, and ensure
IERC20.transfer(_recipient, amountToSettle) and order fields (senderFee,
protocolFee) reflect these net/gross semantics in settleIn.
In `@contracts/interfaces/IGateway.sol`:
- Around line 176-197: The docs/implementation mismatch for settleIn must be
reconciled by treating _amount as the gross amount (includes sender fee)
everywhere: update Gateway.settleIn so it transfersFrom only the gross _amount
(not _amount + _senderFee), compute recipientAmount = _amount - _senderFee (and
subtract any protocol fee if applicable), send recipientAmount to _recipient,
and send _senderFee to _senderFeeRecipient; ensure function signature and
NatSpec remain consistent with this gross semantics and adjust any internal
references that currently assume _amount is net.
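As a sanity check on the gross semantics described above, a back-of-the-envelope sketch with made-up figures (the amounts and variable names are illustrative, not taken from the contract):

```javascript
// Hypothetical figures: _amount is the gross transfer-in, inclusive of fees.
const amount = 1_000_000n;   // gross amount pulled once via transferFrom
const senderFee = 5_000n;    // paid out to the sender fee recipient
const protocolFee = 2_000n;  // paid out to treasury on FX flows

// Net amount delivered to the recipient after all deductions.
const recipientAmount = amount - senderFee - protocolFee;

// The splits must reassemble into the single gross pull —
// i.e., the caller is charged exactly once, never _amount + _senderFee.
const totalOut = recipientAmount + senderFee + protocolFee;
console.log(recipientAmount, totalOut === amount);
```

If the contract instead pulled `_amount + _senderFee` while also deducting fees from `_amount`, `totalOut` would exceed the pull, which is the double-charging the comment warns about.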
…nt after deducting all fees
- Adjusted transfer logic to handle gross amount and subtract sender and protocol fees.
- Updated variable names for clarity, changing `senderAmount` to `recipientAmount`.
- Ensured that the correct amount is transferred to the recipient after all fees are deducted.
- Emitted settlement event with the updated recipient amount.
- Updated index files to export MockUSDT and its factory.
- Modified IGateway interface to include changes in event parameters for SettleIn.
- Updated hardhat types to include new contract factories and deployment methods for ProxyAdmin and TransparentUpgradeableProxy.
- Refactored existing index files to organize exports for better clarity and maintainability.
Actionable comments posted: 12
🤖 Fix all issues with AI agents
In `@artifacts/build-info/solc-0_8_20-32286d087051e35025cad535aaaac947e68287bf.json`:
- Around line 48-49: The contract in the artifact contains inconsistent naming:
rename contract MockUSDT to MockUSDC, fix the ERC20 name in the constructor from
"MockUDSC" to "MockUSDC" (keep or adjust the symbol "MUSDC" as desired), and
update any string literals or require messages that reference "MockUSDT" (e.g.,
in burnAll) to "MockUSDC"; ensure the constructor, mint, burn, and burnAll
functions still compile after renaming and then regenerate the build artifacts.
In `@artifacts/build-info/solc-0_8_20-fca8cc730a3cef896a4a57d6ce6a72d2a65b5d82.json`:
- Line 49: The digest construction in execute() currently builds encodedCalls
via a manual loop using abi.encodePacked which can cause collisions; replace
that logic by computing the digest with abi.encode to include length prefixes
(i.e., remove the encodedCalls loop and set bytes32 digest =
keccak256(abi.encode(nonce, calls))); update the execute(Call[] calldata calls,
bytes calldata signature) function to use this new digest computation
(references: execute, encodedCalls, digest) and remove the abi.encodePacked
accumulation to prevent replayable signature collisions.
In `@artifacts/contracts/ProviderBatchCallAndSponsor.sol/artifacts.d.ts`:
- Around line 7-42: Rename the locally declared interface IGateway to a unique
name (e.g., IAggregatorGateway) and update all references: change the interface
declaration symbol and the ArtifactMap entries in artifacts.d.ts from IGateway
to IAggregatorGateway, and update usages in the contract code such as the
onlyAggregator modifier, the gatewayAddress getter, and any cast/ABI usage that
references IGateway so they now reference IAggregatorGateway; ensure the ABI
entry name (getAggregator) and any import/fully‑qualified names (e.g.,
"contracts/ProviderBatchCallAndSponsor.sol:IGateway") are replaced accordingly
to avoid the duplicate ArtifactMap key.
In `@contracts/ProviderBatchCallAndSponsor.sol`:
- Around line 25-27: The comment above the gatewayAddress constant has a typo
("eao"); update the comment to use the correct acronym "EOA" (Externally Owned
Account) so the comment reads e.g. "constant because when it assigned to EOA we
want to ensure it never changes" or similar; locate the comment immediately
preceding the address public constant gatewayAddress and replace "eao" with
"EOA".
In `@hardhat.config.ts`:
- Around line 118-148: The chainDescriptors entries for chains 42220 (celo),
534352 (scroll), and 1135 (lisk) have swapped fields: swap the values of url and
apiUrl in each object's blockExplorers.etherscan so that url is the human-facing
explorer website and apiUrl is the API endpoint used for verification (update
the objects under chainDescriptors for 42220, 534352, and 1135 accordingly).
In `@ignition/modules/GatewayDeployProxy.ts`:
- Around line 10-14: The proxy is being constructed with empty initialization
data ("0x"), leaving the implementation uninitialized; encode and pass
Gateway.initialize(...) as the initializer when creating
TransparentUpgradeableProxy. Use the gateway contract's initialize function
selector/encoded call (as shown in the test fixture) and replace "0x" in the
m.contract("TransparentUpgradeableProxy", [gateway, proxyAdminOwner, "0x"]) call
with the ABI-encoded Gateway.initialize(...) payload so the proxy delegates the
required MAX_BPS, Ownable2Step and Pausable setup to the implementation during
deployment.
In `@ignition/modules/GatewayUpgrade.ts`:
- Around line 34-38: The current call m.staticCall(proxy, "admin", [],
"getProxyAdmin") will revert for non-admin callers because
ITransparentUpgradeableProxy.admin() is ifAdmin-gated; replace this by either
accepting the ProxyAdmin address from configuration/environment and using that
value to set proxyAdminAddress, or read the EIP‑1967 admin storage slot directly
(compute ADMIN_SLOT = bytes32(uint256(keccak256('eip1967.proxy.admin')) - 1) and
read storage at that slot for existingProxyAddress) and convert the returned
slot value to an address; update references to proxyAdminAddress and remove the
m.staticCall to ITransparentUpgradeableProxy.admin so the upgrade flow no longer
depends on calling admin() as a non-admin.
In `@package.json`:
- Around line 19-37: Two test files (test/gateway/gateway.settleOrder.test.js
and test/gateway/gateway.createorder.test.js) still use CommonJS require() which
breaks under "type":"module"; change each require(...) to ES module import
statements (use named or default import form matching the original
destructuring), update any module.exports in code they import to export/exports
(or convert exported helpers to export default/named) and adjust references
accordingly, and handle CommonJS globals if present (replace
__dirname/__filename uses with import.meta.url or path utilities). Locate
require calls in those two files and replace them with appropriate import lines
and fix any exported helper modules to use export syntax so tests run under ESM.
In `@test/fixtures/mockUSDT.js`:
- Around line 7-8: The deployment result from ethers.deployContract("MockUSDT")
is returned before the transaction is mined; after obtaining mockUSDT, await
mockUSDT.waitForDeployment() before calling mockUSDT.getAddress() so the
deployment transaction is confirmed and avoids race conditions in tests.
In `@test/utils/utils.manager.js`:
- Around line 76-79: The helper mockMintDeposit uses the deprecated
gateway.address (ethers v6); change it to await gateway.getAddress() and pass
that result to usdc.connect(account).approve(...). Specifically, inside
mockMintDeposit ensure you call const gatewayAddr = await gateway.getAddress()
(or inline await gateway.getAddress()) and use gatewayAddr when calling
usdc.connect(account).approve(...), keeping the function async and preserving
the mint call.
- Around line 56-74: The code in getSupportedInstitutions uses the deprecated
ethers.utils.formatBytes32String; replace those calls with
ethers.encodeBytes32String for currency, accessBank.code/name and
diamondBank.code/name to be compatible with ethers v6 (update any imports if
needed so ethers is the v6 API). Ensure you update all occurrences in
getSupportedInstitutions to call ethers.encodeBytes32String("...") for the
currency, accessBank.code, accessBank.name, diamondBank.code, and
diamondBank.name.
In `@types/ethers-contracts/common.ts`:
- Around line 44-49: The deploy signature on MinEthersFactory is wrong: change
deploy(...a: ARGS[]) to accept the argument tuple rather than an array of tuples
(e.g., deploy(...a: ARGS) or deploy(...args: ARGS)) so the factory deploy takes
the normal variadic tuple parameters; update MinEthersFactory's deploy signature
accordingly and keep GetContractTypeFromFactory and GetARGsTypeFromFactory
(which uses Parameters<F['deploy']>) intact so they resolve the correct tuple
types.
🧹 Nitpick comments (2)
hardhat.config.ts (1)
86-110: Inconsistent optimizer settings across compiler versions.

The optimizer is disabled for versions 0.8.18 and 0.8.9 but enabled for 0.8.20. If this is intentional (e.g., only optimizing production contracts compiled with 0.8.20), consider adding a comment to clarify. Otherwise, this inconsistency could lead to unexpected gas costs or behavior differences.
contracts/ProviderBatchCallAndSponsor.sol (1)
101-105: Consider bubbling revert reasons from failed calls.

When a call fails, the current implementation reverts with a generic "Call reverted" message. This makes debugging difficult since the actual revert reason from the target contract is lost.
♻️ Proposed fix to bubble revert reasons
```diff
 function _executeCall(Call calldata callItem) internal {
     (bool success, bytes memory returnData) = callItem.to.call{value: callItem.value}(callItem.data);
-    require(success, "Call reverted");
+    if (!success) {
+        if (returnData.length > 0) {
+            assembly {
+                revert(add(32, returnData), mload(returnData))
+            }
+        } else {
+            revert("Call reverted");
+        }
+    }
     emit CallExecuted(msg.sender, callItem.to, callItem.value, callItem.data);
 }
```
```json
"project/contracts/mocks/MockUSDC.sol": {
  "content": "//SPDX-License-Identifier: UNLICENSED\npragma solidity ^0.8.18;\n\nimport {ERC20} from \"@openzeppelin/contracts/token/ERC20/ERC20.sol\";\n\n/// @dev Mock mintable USDC\ncontract MockUSDT is ERC20 {\n constructor() ERC20(\"MockUDSC\", \"MUSDC\") {\n _mint(msg.sender, 1_000_000E18);\n }\n\n function mint(uint256 _amount) external {\n _mint(msg.sender, _amount);\n }\n\n function burn(uint256 _amount) external {\n _burn(msg.sender, _amount);\n }\n\n function burnAll() external {\n uint256 _balanceOf = balanceOf(msg.sender);\n require(_balanceOf > 0, \"MockUSDT: Nothing to burn\");\n _burn(msg.sender, _balanceOf);\n }\n}\n"
```
🧩 Analysis chain
🏁 Script executed:

```shell
#!/bin/bash
# Find and inspect the actual MockUSDC.sol file
echo "=== Searching for MockUSDC.sol source file ==="
fd -a 'MockUSDC.sol' --type f

# Also check if MockUSDT.sol exists
echo -e "\n=== Searching for MockUSDT.sol source file ==="
fd -a 'MockUSDT.sol' --type f

# Check for any mock token files in the contracts/mocks directory
echo -e "\n=== Looking for mock token files ==="
fd -a 'Mock.*\.sol' --type f | grep -i 'mock\|usdc\|usdt'
```

Repository: paycrest/contracts
Length of output: 290
🏁 Script executed:

```shell
#!/bin/bash
# If MockUSDC.sol exists, show its content
echo "=== Checking MockUSDC.sol content ==="
if [ -f "project/contracts/mocks/MockUSDC.sol" ]; then
  cat -n "project/contracts/mocks/MockUSDC.sol"
else
  echo "File not found at project/contracts/mocks/MockUSDC.sol"
fi

# Search for references to MockUSDC and MockUSDT in the codebase
echo -e "\n=== Searching for MockUSDC references ==="
rg 'MockUSDC' --type solidity --max-count 20
echo -e "\n=== Searching for MockUSDT references ==="
rg 'MockUSDT' --type solidity --max-count 20
```

Repository: paycrest/contracts
Length of output: 323
🏁 Script executed:

```shell
#!/bin/bash
# Read the actual MockUSDC.sol file from the correct path
echo "=== Content of MockUSDC.sol ==="
find . -name 'MockUSDC.sol' -type f -exec cat -n {} \;

# Search for MockUSDC and MockUSDT references
echo -e "\n=== Searching for MockUSDC references ==="
rg 'MockUSDC' --type sol
echo -e "\n=== Searching for MockUSDT references ==="
rg 'MockUSDT' --type sol
```

Repository: paycrest/contracts
Length of output: 1086
🏁 Script executed:

```shell
#!/bin/bash
# Search for any imports or references to these mock tokens
echo "=== Searching for imports of MockUSDC ==="
rg 'MockUSDC' -i
echo -e "\n=== Searching for imports of MockUSDT ==="
rg 'MockUSDT' -i
echo -e "\n=== Checking for other mock token files ==="
fd -a 'Mock' --type f | grep -E '\.sol$'
```

Repository: paycrest/contracts
Length of output: 50376
Fix MockUSDC vs MockUSDT naming inconsistencies in the source contract.
The `contracts/mocks/MockUSDC.sol` file declares `contract MockUSDT`, initializes the ERC20 with token name "MockUDSC" (typo), and references "MockUSDT" in error messages. The file comment indicates USDC intent. Align the contract name, token name (fix typo to "MockUSDC"), and error messages with the filename and intended token type, then regenerate artifacts.
🤖 Prompt for AI Agents
In
`@artifacts/build-info/solc-0_8_20-32286d087051e35025cad535aaaac947e68287bf.json`
around lines 48 - 49, The contract in the artifact contains inconsistent naming:
rename contract MockUSDT to MockUSDC, fix the ERC20 name in the constructor from
"MockUDSC" to "MockUSDC" (keep or adjust the symbol "MUSDC" as desired), and
update any string literals or require messages that reference "MockUSDT" (e.g.,
in burnAll) to "MockUSDC"; ensure the constructor, mint, burn, and burnAll
functions still compile after renaming and then regenerate the build artifacts.
| "content": "// SPDX-License-Identifier: MIT\n// OpenZeppelin Contracts (last updated v4.9.0) (utils/Strings.sol)\n\npragma solidity ^0.8.0;\n\nimport \"./math/Math.sol\";\nimport \"./math/SignedMath.sol\";\n\n/**\n * @dev String operations.\n */\nlibrary Strings {\n bytes16 private constant _SYMBOLS = \"0123456789abcdef\";\n uint8 private constant _ADDRESS_LENGTH = 20;\n\n /**\n * @dev Converts a `uint256` to its ASCII `string` decimal representation.\n */\n function toString(uint256 value) internal pure returns (string memory) {\n unchecked {\n uint256 length = Math.log10(value) + 1;\n string memory buffer = new string(length);\n uint256 ptr;\n /// @solidity memory-safe-assembly\n assembly {\n ptr := add(buffer, add(32, length))\n }\n while (true) {\n ptr--;\n /// @solidity memory-safe-assembly\n assembly {\n mstore8(ptr, byte(mod(value, 10), _SYMBOLS))\n }\n value /= 10;\n if (value == 0) break;\n }\n return buffer;\n }\n }\n\n /**\n * @dev Converts a `int256` to its ASCII `string` decimal representation.\n */\n function toString(int256 value) internal pure returns (string memory) {\n return string(abi.encodePacked(value < 0 ? 
\"-\" : \"\", toString(SignedMath.abs(value))));\n }\n\n /**\n * @dev Converts a `uint256` to its ASCII `string` hexadecimal representation.\n */\n function toHexString(uint256 value) internal pure returns (string memory) {\n unchecked {\n return toHexString(value, Math.log256(value) + 1);\n }\n }\n\n /**\n * @dev Converts a `uint256` to its ASCII `string` hexadecimal representation with fixed length.\n */\n function toHexString(uint256 value, uint256 length) internal pure returns (string memory) {\n bytes memory buffer = new bytes(2 * length + 2);\n buffer[0] = \"0\";\n buffer[1] = \"x\";\n for (uint256 i = 2 * length + 1; i > 1; --i) {\n buffer[i] = _SYMBOLS[value & 0xf];\n value >>= 4;\n }\n require(value == 0, \"Strings: hex length insufficient\");\n return string(buffer);\n }\n\n /**\n * @dev Converts an `address` with fixed length of 20 bytes to its not checksummed ASCII `string` hexadecimal representation.\n */\n function toHexString(address addr) internal pure returns (string memory) {\n return toHexString(uint256(uint160(addr)), _ADDRESS_LENGTH);\n }\n\n /**\n * @dev Returns true if the two strings are equal.\n */\n function equal(string memory a, string memory b) internal pure returns (bool) {\n return keccak256(bytes(a)) == keccak256(bytes(b));\n }\n}\n" | ||
| }, | ||
| "project/contracts/ProviderBatchCallAndSponsor.sol": { | ||
| "content": "// SPDX-License-Identifier: MIT\npragma solidity ^0.8.18;\n\nimport \"@openzeppelin/contracts/utils/cryptography/ECDSA.sol\";\n\ninterface IGateway {\n function getAggregator() external view returns (address);\n}\n/**\n * @title ProviderBatchCallAndSponsor\n *\n * When an EOA upgrades via EIP‑7702, it delegates to this implementation.\n * Off‑chain, the account signs a message authorizing a batch of calls. The message is the hash of:\n * keccak256(abi.encodePacked(nonce, calls))\n * The signature must be generated with the EOA’s private key so that, once upgraded, the recovered signer equals the account’s own address (i.e. address(this)).\n *\n * This contract provides just one way to execute a batch:\n * 1. With a signature: Any sponsor can submit the batch if it carries a valid signature.\n *\n * Replay protection is achieved by using a nonce that is included in the signed message.\n */\ncontract ProviderBatchCallAndSponsor {\n using ECDSA for bytes32;\n \n // constant because when it assigned to eao we want to ensure it never changes\n /// @notice The address of the Gateway contract.\n address public constant gatewayAddress = 0x56dA8fCE8FD64CaaE90D80DED55587b282bb4303;\n /// @notice A nonce used for replay protection.\n uint256 public nonce;\n\n /// @notice Represents a single call within a batch.\n struct Call {\n address to;\n uint256 value;\n bytes data;\n }\n\n modifier onlyAggregator() {\n\t\trequire(msg.sender == IGateway(gatewayAddress).getAggregator(), \"OnlyAggregator\");\n\t\t_;\n\t}\n\n /// @notice Emitted for every individual call executed.\n event CallExecuted(address indexed sender, address indexed to, uint256 value, bytes data);\n /// @notice Emitted when a full batch is executed.\n event BatchExecuted(uint256 indexed nonce, Call[] calls);\n\n /**\n * @notice Executes a batch of calls using an off–chain signature.\n * @param calls An array of Call structs containing destination, ETH value, and calldata.\n * @param signature The ECDSA 
signature over the current nonce and the call data.\n *\n * The signature must be produced off–chain by signing:\n * The signing key should be the account’s key (which becomes the smart account’s own identity after upgrade).\n */\n function execute(Call[] calldata calls, bytes calldata signature) external payable onlyAggregator {\n // Compute the digest that the account was expected to sign.\n bytes memory encodedCalls;\n for (uint256 i = 0; i < calls.length; i++) {\n encodedCalls = abi.encodePacked(encodedCalls, calls[i].to, calls[i].value, calls[i].data);\n }\n bytes32 digest = keccak256(abi.encodePacked(nonce, encodedCalls));\n \n bytes32 ethSignedMessageHash = ECDSA.toEthSignedMessageHash(digest);\n\n // Recover the signer from the provided signature.\n address recovered = ECDSA.recover(ethSignedMessageHash, signature);\n require(recovered == address(this), \"Invalid signature\");\n\n _executeBatch(calls);\n }\n\n /**\n * @notice Executes a batch of calls directly.\n * @dev This contract doesnt authorized self execution.\n * @param calls An array of Call structs containing destination, ETH value, and calldata.\n */\n function execute(Call[] calldata calls) external payable {\n revert(\"Not implemented\"); // we don't expect this to be called directly\n }\n\n /**\n * @dev Internal function that handles batch execution and nonce incrementation.\n * @param calls An array of Call structs.\n */\n function _executeBatch(Call[] calldata calls) internal {\n uint256 currentNonce = nonce;\n nonce++; // Increment nonce to protect against replay attacks\n\n for (uint256 i = 0; i < calls.length; i++) {\n _executeCall(calls[i]);\n }\n\n emit BatchExecuted(currentNonce, calls);\n }\n\n /**\n * @dev Internal function to execute a single call.\n * @param callItem The Call struct containing destination, value, and calldata.\n */\n function _executeCall(Call calldata callItem) internal {\n (bool success,) = callItem.to.call{value: callItem.value}(callItem.data);\n 
require(success, \"Call reverted\");\n emit CallExecuted(msg.sender, callItem.to, callItem.value, callItem.data);\n }\n\n // Allow the contract to receive ETH (e.g. from DEX swaps or other transfers).\n fallback() external payable {}\n receive() external payable {}\n}" |
🧩 Analysis chain
🌐 Web query:
Solidity abi.encodePacked dynamic types collision vulnerability signature
💡 Result:
abi.encodePacked(...) uses Solidity’s non-standard packed ABI encoding: it concatenates values tightly, and for dynamic types (string, bytes, dynamic arrays) it encodes the data in-place without a length field. That makes the byte stream ambiguous as soon as you pack 2+ dynamic values, so different inputs can produce identical bytes, and therefore identical keccak256 hashes/signature digests. This is the “abi.encodePacked collision” vulnerability. [1]
Why this breaks signature security
If your signature scheme hashes packed user-controlled fields, e.g.
bytes32 digest = keccak256(abi.encodePacked(a, b)); // a, b are dynamic
an attacker can find (a, b) and (a', b') such that:
abi.encodePacked(a, b) == abi.encodePacked(a', b')
So a signature intended for (a, b) will also verify for (a', b') (same digest), enabling authorization bypass / replay on different parameters. This is explicitly warned about in the Solidity ABI spec; the classic example is abi.encodePacked("a","bc") == abi.encodePacked("ab","c"). [1]
When collisions happen
- High risk: 2+ dynamic types in the same abi.encodePacked(...) (e.g., string,string; bytes,bytes; string,bytes; dynamic arrays). [1][2]
- Also risky: mixing types where boundaries can be ambiguous without padding/length (packed encoding is not self-delimiting). [1]
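The ambiguity described above is easy to reproduce outside Solidity. The sketch below is a minimal Python analogy (the helpers `encode_packed` and `encode_with_lengths` are hypothetical stand-ins that mimic the relevant property of `abi.encodePacked` and `abi.encode` for bytes inputs, not the actual ABI routines):

```python
# Demonstrates why tightly packing multiple dynamic values is ambiguous:
# concatenation without length prefixes loses the field boundaries.
def encode_packed(*fields: bytes) -> bytes:
    # Mimics abi.encodePacked for bytes/string inputs: no length prefixes.
    return b"".join(fields)

def encode_with_lengths(*fields: bytes) -> bytes:
    # Mimics the relevant property of abi.encode: each dynamic field
    # carries its own length, so field boundaries are unambiguous.
    return b"".join(len(f).to_bytes(32, "big") + f for f in fields)

# The classic collision from the Solidity ABI spec:
assert encode_packed(b"a", b"bc") == encode_packed(b"ab", b"c")

# With length prefixes, the same two inputs encode differently:
assert encode_with_lengths(b"a", b"bc") != encode_with_lengths(b"ab", b"c")
```

Hashing either encoding with keccak256 inherits the same property: the packed form collides, the length-prefixed form does not.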
How to fix / safe patterns
- Prefer abi.encode(...) for anything you hash for signatures/authentication, because it includes offsets/lengths for dynamic data (unambiguous). [1]
- If you must use packed encoding, ensure at most one dynamic argument, or add explicit domain separators / lengths, e.g. hash each dynamic piece first:
  bytes32 digest = keccak256(abi.encode(TYPEHASH, keccak256(bytes(a)), keccak256(bytes(b)), otherStaticFields));
- Use EIP-712 typed structured-data encodings (typed, length-safe) rather than ad-hoc encodePacked digests (same core idea as #2). (General best practice; the key point is "typed + unambiguous".)
Rule of thumb: never sign/verify keccak256(abi.encodePacked(...)) when multiple packed inputs are dynamic or user-controlled. [1][3]
Sources
[1] Solidity ABI spec — “Non-standard Packed Mode” warning about ambiguity/collisions with multiple dynamic types (and the "a","bc" vs "ab","c" example).
[2] MetaMask encodePacked docs noting ambiguity with more than one dynamic type.
[3] Practical vulnerability write-ups illustrating collision scenarios and impact on hashes/signatures.
🌐 Web query:
Solidity abi.encode vs abi.encodePacked signed messages best practices
💡 Result:
Key difference (why it matters for signatures)
- abi.encode(...) is standard ABI encoding (32-byte aligned, includes offsets/lengths for dynamic types). It's unambiguous, so different inputs won't "accidentally" produce the same byte payload.
- abi.encodePacked(...) is non-standard packed encoding (tight concatenation; dynamic types are encoded without length), so combining multiple dynamic / variable-length values can be ambiguous and lead to hash collisions (same keccak256 for different inputs). This is especially dangerous for signed messages / authorization. [1][2][3]
Best practices for “signed message” hashes
- Prefer EIP-712 for anything security/authorization-related (permits, meta-tx, allowlists, orders, etc.). It gives you:
  - Domain separation (chainId, verifying contract, name/version) to prevent cross-contract / cross-chain replay.
  - A standardized, wallet-supported typed-data flow (eth_signTypedData*). [4][5]
  In Solidity, the struct hash is typically built with keccak256(abi.encode(...)) (not packed). [5]
- If you must use "personal_sign" / EIP-191 style signing, hash your structured parameters with:
  - bytes32 messageHash = keccak256(abi.encode(...)); (safe encoding)
  - then apply the EIP-191 prefix ("\x19Ethereum Signed Message:\n32") and hash again.
  Using packed encoding is appropriate for the fixed, known prefix + 32-byte hash case, because there's no ambiguity there. [1]
- Avoid keccak256(abi.encodePacked(...)) when:
  - there are two or more dynamic types (string, bytes, dynamic arrays), or
  - user-controlled variable-length values are concatenated for an auth/signature check. [1][2][3]
- If you insist on packed encoding, constrain it so it's unambiguous:
  - ensure at most one dynamic argument, or
  - hash each dynamic field first (keccak256(bytes(str))) and pack only fixed-size values (e.g., bytes32, uint256, address). [3][5]
- Always include replay protection in the signed payload:
  - at minimum: nonce + deadline/expiry
  - and ideally also: chainId + verifyingContract (the EIP-712 domain does this for you). [4][5]
Rule of thumb:
- For signatures/authorization: keccak256(abi.encode(...)) (or EIP-712).
- Use abi.encodePacked only for carefully bounded cases (fixed-size fields, or prefixing), not for "concatenate arbitrary user inputs and sign".
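The "hash each dynamic field first, then encode only fixed-size values" pattern can be sketched in Python. Note an assumption: `hashlib.sha3_256` is used here as a stand-in for Ethereum's keccak256 (the two differ only in padding); the `safe_digest` helper and its field names are hypothetical, for illustration:

```python
import hashlib

def h(data: bytes) -> bytes:
    # Stand-in hash. Ethereum uses keccak256, which differs from NIST
    # SHA3-256 in its padding rule, but the construction is identical.
    return hashlib.sha3_256(data).digest()

def safe_digest(nonce: int, field_a: bytes, field_b: bytes) -> bytes:
    # Hash each dynamic field first, then concatenate only fixed-size
    # values: 32-byte nonce || 32-byte hash || 32-byte hash.
    # Every component has a fixed width, so no ambiguity remains.
    return h(nonce.to_bytes(32, "big") + h(field_a) + h(field_b))

# Shifting bytes between the two dynamic fields now changes the digest
# (unlike the packed encoding, where "a"+"bc" == "ab"+"c"):
assert safe_digest(0, b"a", b"bc") != safe_digest(0, b"ab", b"c")

# And bumping the nonce changes the digest too (replay protection):
assert safe_digest(0, b"a", b"bc") != safe_digest(1, b"a", b"bc")
```

EIP-712's struct hashing follows the same idea: dynamic members are hashed individually before the outer `abi.encode`.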
Sources: Solidity ABI packed-mode ambiguity and dynamic-type issue [1], SWC-133 guidance for signature contexts [2], security guidance to avoid packed encoding when hashing [3], EIP-712 standard [4], OpenZeppelin EIP712 docs recommending abi.encode + keccak256 for typed data [5].
Replace abi.encodePacked with abi.encode for signed call digest construction.
abi.encodePacked concatenates dynamic bytes without length prefixes. When multiple calls are packed in a loop, different call arrays can produce identical digests, allowing signatures to be replayed across different batches. abi.encode includes length fields and prevents collision.
Safer digest construction
- bytes memory encodedCalls;
- for (uint256 i = 0; i < calls.length; i++) {
- encodedCalls = abi.encodePacked(encodedCalls, calls[i].to, calls[i].value, calls[i].data);
- }
- bytes32 digest = keccak256(abi.encodePacked(nonce, encodedCalls));
+ bytes32 digest = keccak256(abi.encode(nonce, calls));
🤖 Prompt for AI Agents
In
`@artifacts/build-info/solc-0_8_20-fca8cc730a3cef896a4a57d6ce6a72d2a65b5d82.json`
at line 49, The digest construction in execute() currently builds encodedCalls
via a manual loop using abi.encodePacked which can cause collisions; replace
that logic by computing the digest with abi.encode to include length prefixes
(i.e., remove the encodedCalls loop and set bytes32 digest =
keccak256(abi.encode(nonce, calls))); update the execute(Call[] calldata calls,
bytes calldata signature) function to use this new digest computation
(references: execute, encodedCalls, digest) and remove the abi.encodePacked
accumulation to prevent replayable signature collisions.
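To make the cross-batch collision concrete, the following Python sketch mirrors the flagged loop: each call's `(to, value, data)` fields are concatenated with no per-call framing, so the boundary between one call's `data` and the next call's fields is lost. The `pack_calls` helper and the stand-in 20-byte addresses are hypothetical, for illustration only:

```python
# Each call is (to, value, data). Packing concatenates the fields with
# no per-call length framing, exactly like the abi.encodePacked loop.
def pack_calls(calls):
    out = b""
    for to, value, data in calls:
        out += to + value.to_bytes(32, "big") + data
    return out

addr_a = b"\xaa" * 20  # stand-in 20-byte addresses
addr_b = b"\xbb" * 20

# Batch 1: a single call whose `data` happens to end with bytes that
# parse as another call's (to, value) fields.
batch1 = [(addr_a, 1, b"\x01\x02" + addr_b + (2).to_bytes(32, "big"))]

# Batch 2: two calls, the second carrying empty calldata.
batch2 = [(addr_a, 1, b"\x01\x02"), (addr_b, 2, b"")]

# Different batches, identical packed bytes -> identical signed digests,
# so a signature over batch1 would also authorize batch2.
assert pack_calls(batch1) == pack_calls(batch2)
```

Encoding with per-field lengths (as `abi.encode(nonce, calls)` does) keeps the call boundaries in the byte stream and breaks this equivalence.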
| export interface IGateway$Type { | ||
| readonly _format: "hh3-artifact-1"; | ||
| readonly contractName: "IGateway"; | ||
| readonly sourceName: "contracts/ProviderBatchCallAndSponsor.sol"; | ||
| readonly abi: [{"inputs":[],"name":"getAggregator","outputs":[{"internalType":"address","name":"","type":"address"}],"stateMutability":"view","type":"function"}]; | ||
| readonly bytecode: "0x"; | ||
| readonly deployedBytecode: "0x"; | ||
| readonly linkReferences: {}; | ||
| readonly deployedLinkReferences: {}; | ||
| readonly immutableReferences: {}; | ||
| readonly inputSourceName: "project/contracts/ProviderBatchCallAndSponsor.sol"; | ||
| readonly buildInfoId: "solc-0_8_20-fca8cc730a3cef896a4a57d6ce6a72d2a65b5d82"; | ||
| }; | ||
| export interface ProviderBatchCallAndSponsor$Type { | ||
| readonly _format: "hh3-artifact-1"; | ||
| readonly contractName: "ProviderBatchCallAndSponsor"; | ||
| readonly sourceName: "contracts/ProviderBatchCallAndSponsor.sol"; | ||
| readonly abi: [{"anonymous":false,"inputs":[{"indexed":true,"internalType":"uint256","name":"nonce","type":"uint256"},{"components":[{"internalType":"address","name":"to","type":"address"},{"internalType":"uint256","name":"value","type":"uint256"},{"internalType":"bytes","name":"data","type":"bytes"}],"indexed":false,"internalType":"struct ProviderBatchCallAndSponsor.Call[]","name":"calls","type":"tuple[]"}],"name":"BatchExecuted","type":"event"},{"anonymous":false,"inputs":[{"indexed":true,"internalType":"address","name":"sender","type":"address"},{"indexed":true,"internalType":"address","name":"to","type":"address"},{"indexed":false,"internalType":"uint256","name":"value","type":"uint256"},{"indexed":false,"internalType":"bytes","name":"data","type":"bytes"}],"name":"CallExecuted","type":"event"},{"stateMutability":"payable","type":"fallback"},{"inputs":[{"components":[{"internalType":"address","name":"to","type":"address"},{"internalType":"uint256","name":"value","type":"uint256"},{"internalType":"bytes","name":"data","type":"bytes"}],"internalType":"struct ProviderBatchCallAndSponsor.Call[]","name":"calls","type":"tuple[]"}],"name":"execute","outputs":[],"stateMutability":"payable","type":"function"},{"inputs":[{"components":[{"internalType":"address","name":"to","type":"address"},{"internalType":"uint256","name":"value","type":"uint256"},{"internalType":"bytes","name":"data","type":"bytes"}],"internalType":"struct ProviderBatchCallAndSponsor.Call[]","name":"calls","type":"tuple[]"},{"internalType":"bytes","name":"signature","type":"bytes"}],"name":"execute","outputs":[],"stateMutability":"payable","type":"function"},{"inputs":[],"name":"gatewayAddress","outputs":[{"internalType":"address","name":"","type":"address"}],"stateMutability":"view","type":"function"},{"inputs":[],"name":"nonce","outputs":[{"internalType":"uint256","name":"","type":"uint256"}],"stateMutability":"view","type":"function"},{"stateMutability":"payable","type":"receive"}]; | ||
| readonly bytecode: "0x608060405234801561000f575f80fd5b50610bd68061001d5f395ff3fe60806040526004361061003f575f3560e01c80633f707e6b146100485780636171d1c91461005b5780638b851b951461006e578063affed0e0146100b257005b3661004657005b005b610046610056366004610819565b6100d4565b610046610069366004610858565b610113565b348015610079575f80fd5b506100957356da8fce8fd64caae90d80ded55587b282bb430381565b6040516001600160a01b0390911681526020015b60405180910390f35b3480156100bd575f80fd5b506100c65f5481565b6040519081526020016100a9565b60405162461bcd60e51b815260206004820152600f60248201526e139bdd081a5b5c1b195b595b9d1959608a1b60448201526064015b60405180910390fd5b7356da8fce8fd64caae90d80ded55587b282bb43036001600160a01b0316633ad59dbc6040518163ffffffff1660e01b8152600401602060405180830381865afa158015610163573d5f803e3d5ffd5b505050506040513d601f19601f8201168201806040525081019061018791906108fb565b6001600160a01b0316336001600160a01b0316146101d85760405162461bcd60e51b815260206004820152600e60248201526d27b7363ca0b3b3b932b3b0ba37b960911b604482015260640161010a565b60605f5b848110156102a957818686838181106101f7576101f761091d565b90506020028101906102099190610931565b61021790602081019061094f565b8787848181106102295761022961091d565b905060200281019061023b9190610931565b602001358888858181106102515761025161091d565b90506020028101906102639190610931565b61027190604081019061096a565b6040516020016102859594939291906109da565b604051602081830303815290604052915080806102a190610a1b565b9150506101dc565b505f8054826040516020016102bf929190610a3f565b6040516020818303038152906040528051906020012090505f61030e827f19457468657265756d205369676e6564204d6573736167653a0a3332000000005f908152601c91909152603c902090565b90505f6103508287878080601f0160208091040260200160405190810160405280939291908181526020018383808284375f920191909152506103b292505050565b90506001600160a01b038116301461039e5760405162461bcd60e51b8152602060048201526011602482015270496e76616c6964207369676e617475726560781b604482015260640161010a565b6103a888886103d4565b5050505050505050565b5f805f6103bf858561047156
5b915091506103cc816104b3565b509392505050565b5f8054908190806103e483610a1b565b91905055505f5b828110156104315761041f8484838181106104085761040861091d565b905060200281019061041a9190610931565b6105ff565b8061042981610a1b565b9150506103eb565b50807f280bb3599696acbf79fb8ffcde81a57337b52500f789600fbb1cff9b4cbaba398484604051610464929190610a80565b60405180910390a2505050565b5f8082516041036104a5576020830151604084015160608501515f1a6104998782858561071b565b945094505050506104ac565b505f905060025b9250929050565b5f8160048111156104c6576104c6610b5b565b036104ce5750565b60018160048111156104e2576104e2610b5b565b0361052f5760405162461bcd60e51b815260206004820152601860248201527f45434453413a20696e76616c6964207369676e61747572650000000000000000604482015260640161010a565b600281600481111561054357610543610b5b565b036105905760405162461bcd60e51b815260206004820152601f60248201527f45434453413a20696e76616c6964207369676e6174757265206c656e67746800604482015260640161010a565b60038160048111156105a4576105a4610b5b565b036105fc5760405162461bcd60e51b815260206004820152602260248201527f45434453413a20696e76616c6964207369676e6174757265202773272076616c604482015261756560f01b606482015260840161010a565b50565b5f61060d602083018361094f565b6001600160a01b03166020830135610628604085018561096a565b604051610636929190610b6f565b5f6040518083038185875af1925050503d805f8114610670576040519150601f19603f3d011682016040523d82523d5f602084013e610675565b606091505b50509050806106b65760405162461bcd60e51b815260206004820152600d60248201526c10d85b1b081c995d995c9d1959609a1b604482015260640161010a565b6106c3602083018361094f565b6001600160a01b0316337fed7e8f919df9cc0d0ad8b4057d084ebf319b630564d5da283e14751adc931f3a6020850135610700604087018761096a565b60405161070f93929190610b7e565b60405180910390a35050565b5f807f7fffffffffffffffffffffffffffffff5d576e7357a4501ddfe92f46681b20a083111561075057505f905060036107cf565b604080515f8082526020820180845289905260ff881692820192909252606081018690526080810185905260019060a0016020604051602081039080840390855afa1580156107a1573d5f803e3d5ffd5b5050604051
601f1901519150506001600160a01b0381166107c9575f600192509250506107cf565b91505f90505b94509492505050565b5f8083601f8401126107e8575f80fd5b50813567ffffffffffffffff8111156107ff575f80fd5b6020830191508360208260051b85010111156104ac575f80fd5b5f806020838503121561082a575f80fd5b823567ffffffffffffffff811115610840575f80fd5b61084c858286016107d8565b90969095509350505050565b5f805f806040858703121561086b575f80fd5b843567ffffffffffffffff80821115610882575f80fd5b61088e888389016107d8565b909650945060208701359150808211156108a6575f80fd5b818701915087601f8301126108b9575f80fd5b8135818111156108c7575f80fd5b8860208285010111156108d8575f80fd5b95989497505060200194505050565b6001600160a01b03811681146105fc575f80fd5b5f6020828403121561090b575f80fd5b8151610916816108e7565b9392505050565b634e487b7160e01b5f52603260045260245ffd5b5f8235605e19833603018112610945575f80fd5b9190910192915050565b5f6020828403121561095f575f80fd5b8135610916816108e7565b5f808335601e1984360301811261097f575f80fd5b83018035915067ffffffffffffffff821115610999575f80fd5b6020019150368190038213156104ac575f80fd5b5f81515f5b818110156109cc57602081850181015186830152016109b2565b505f93019283525090919050565b5f6109e582886109ad565b6bffffffffffffffffffffffff198760601b168152856014820152838560348301375f9301603401928352509095945050505050565b5f60018201610a3857634e487b7160e01b5f52601160045260245ffd5b5060010190565b8281525f610a5060208301846109ad565b949350505050565b81835281816020850137505f828201602090810191909152601f909101601f19169091010190565b60208082528181018390525f906040808401600586901b8501820187855b88811015610b4d57878303603f190184528135368b9003605e19018112610ac3575f80fd5b8a0160608135610ad2816108e7565b6001600160a01b0316855281880135888601528682013536839003601e19018112610afb575f80fd5b90910187810191903567ffffffffffffffff811115610b18575f80fd5b803603831315610b26575f80fd5b8188870152610b388287018285610a58565b96890196955050509186019150600101610a9e565b509098975050505050505050565b634e487b7160e01b5f52602160045260245ffd5b818382375f9101908152919050565b838152604060208201525f610b976040
83018486610a58565b9594505050505056fea264697066735822122042e77a99c684d5c0d4fe73c5495f8473bf83b096b103360db73b60a067ac60c464736f6c63430008140033"; | ||
| readonly deployedBytecode: "0x60806040526004361061003f575f3560e01c80633f707e6b146100485780636171d1c91461005b5780638b851b951461006e578063affed0e0146100b257005b3661004657005b005b610046610056366004610819565b6100d4565b610046610069366004610858565b610113565b348015610079575f80fd5b506100957356da8fce8fd64caae90d80ded55587b282bb430381565b6040516001600160a01b0390911681526020015b60405180910390f35b3480156100bd575f80fd5b506100c65f5481565b6040519081526020016100a9565b60405162461bcd60e51b815260206004820152600f60248201526e139bdd081a5b5c1b195b595b9d1959608a1b60448201526064015b60405180910390fd5b7356da8fce8fd64caae90d80ded55587b282bb43036001600160a01b0316633ad59dbc6040518163ffffffff1660e01b8152600401602060405180830381865afa158015610163573d5f803e3d5ffd5b505050506040513d601f19601f8201168201806040525081019061018791906108fb565b6001600160a01b0316336001600160a01b0316146101d85760405162461bcd60e51b815260206004820152600e60248201526d27b7363ca0b3b3b932b3b0ba37b960911b604482015260640161010a565b60605f5b848110156102a957818686838181106101f7576101f761091d565b90506020028101906102099190610931565b61021790602081019061094f565b8787848181106102295761022961091d565b905060200281019061023b9190610931565b602001358888858181106102515761025161091d565b90506020028101906102639190610931565b61027190604081019061096a565b6040516020016102859594939291906109da565b604051602081830303815290604052915080806102a190610a1b565b9150506101dc565b505f8054826040516020016102bf929190610a3f565b6040516020818303038152906040528051906020012090505f61030e827f19457468657265756d205369676e6564204d6573736167653a0a3332000000005f908152601c91909152603c902090565b90505f6103508287878080601f0160208091040260200160405190810160405280939291908181526020018383808284375f920191909152506103b292505050565b90506001600160a01b038116301461039e5760405162461bcd60e51b8152602060048201526011602482015270496e76616c6964207369676e617475726560781b604482015260640161010a565b6103a888886103d4565b5050505050505050565b5f805f6103bf8585610471565b915091506103cc816104b3565b509392505050565b5f8054
908190806103e483610a1b565b91905055505f5b828110156104315761041f8484838181106104085761040861091d565b905060200281019061041a9190610931565b6105ff565b8061042981610a1b565b9150506103eb565b50807f280bb3599696acbf79fb8ffcde81a57337b52500f789600fbb1cff9b4cbaba398484604051610464929190610a80565b60405180910390a2505050565b5f8082516041036104a5576020830151604084015160608501515f1a6104998782858561071b565b945094505050506104ac565b505f905060025b9250929050565b5f8160048111156104c6576104c6610b5b565b036104ce5750565b60018160048111156104e2576104e2610b5b565b0361052f5760405162461bcd60e51b815260206004820152601860248201527f45434453413a20696e76616c6964207369676e61747572650000000000000000604482015260640161010a565b600281600481111561054357610543610b5b565b036105905760405162461bcd60e51b815260206004820152601f60248201527f45434453413a20696e76616c6964207369676e6174757265206c656e67746800604482015260640161010a565b60038160048111156105a4576105a4610b5b565b036105fc5760405162461bcd60e51b815260206004820152602260248201527f45434453413a20696e76616c6964207369676e6174757265202773272076616c604482015261756560f01b606482015260840161010a565b50565b5f61060d602083018361094f565b6001600160a01b03166020830135610628604085018561096a565b604051610636929190610b6f565b5f6040518083038185875af1925050503d805f8114610670576040519150601f19603f3d011682016040523d82523d5f602084013e610675565b606091505b50509050806106b65760405162461bcd60e51b815260206004820152600d60248201526c10d85b1b081c995d995c9d1959609a1b604482015260640161010a565b6106c3602083018361094f565b6001600160a01b0316337fed7e8f919df9cc0d0ad8b4057d084ebf319b630564d5da283e14751adc931f3a6020850135610700604087018761096a565b60405161070f93929190610b7e565b60405180910390a35050565b5f807f7fffffffffffffffffffffffffffffff5d576e7357a4501ddfe92f46681b20a083111561075057505f905060036107cf565b604080515f8082526020820180845289905260ff881692820192909252606081018690526080810185905260019060a0016020604051602081039080840390855afa1580156107a1573d5f803e3d5ffd5b5050604051601f1901519150506001600160a01b0381166107c9575f6001
92509250506107cf565b91505f90505b94509492505050565b5f8083601f8401126107e8575f80fd5b50813567ffffffffffffffff8111156107ff575f80fd5b6020830191508360208260051b85010111156104ac575f80fd5b5f806020838503121561082a575f80fd5b823567ffffffffffffffff811115610840575f80fd5b61084c858286016107d8565b90969095509350505050565b5f805f806040858703121561086b575f80fd5b843567ffffffffffffffff80821115610882575f80fd5b61088e888389016107d8565b909650945060208701359150808211156108a6575f80fd5b818701915087601f8301126108b9575f80fd5b8135818111156108c7575f80fd5b8860208285010111156108d8575f80fd5b95989497505060200194505050565b6001600160a01b03811681146105fc575f80fd5b5f6020828403121561090b575f80fd5b8151610916816108e7565b9392505050565b634e487b7160e01b5f52603260045260245ffd5b5f8235605e19833603018112610945575f80fd5b9190910192915050565b5f6020828403121561095f575f80fd5b8135610916816108e7565b5f808335601e1984360301811261097f575f80fd5b83018035915067ffffffffffffffff821115610999575f80fd5b6020019150368190038213156104ac575f80fd5b5f81515f5b818110156109cc57602081850181015186830152016109b2565b505f93019283525090919050565b5f6109e582886109ad565b6bffffffffffffffffffffffff198760601b168152856014820152838560348301375f9301603401928352509095945050505050565b5f60018201610a3857634e487b7160e01b5f52601160045260245ffd5b5060010190565b8281525f610a5060208301846109ad565b949350505050565b81835281816020850137505f828201602090810191909152601f909101601f19169091010190565b60208082528181018390525f906040808401600586901b8501820187855b88811015610b4d57878303603f190184528135368b9003605e19018112610ac3575f80fd5b8a0160608135610ad2816108e7565b6001600160a01b0316855281880135888601528682013536839003601e19018112610afb575f80fd5b90910187810191903567ffffffffffffffff811115610b18575f80fd5b803603831315610b26575f80fd5b8188870152610b388287018285610a58565b96890196955050509186019150600101610a9e565b509098975050505050505050565b634e487b7160e01b5f52602160045260245ffd5b818382375f9101908152919050565b838152604060208201525f610b97604083018486610a58565b9594505050505056fea2646970667358
22122042e77a99c684d5c0d4fe73c5495f8473bf83b096b103360db73b60a067ac60c464736f6c63430008140033"; | ||
| readonly linkReferences: {}; | ||
| readonly deployedLinkReferences: {}; | ||
| readonly immutableReferences: {}; | ||
| readonly inputSourceName: "project/contracts/ProviderBatchCallAndSponsor.sol"; | ||
| readonly buildInfoId: "solc-0_8_20-fca8cc730a3cef896a4a57d6ce6a72d2a65b5d82"; | ||
| }; | ||
| import "hardhat/types/artifacts"; | ||
| declare module "hardhat/types/artifacts" { | ||
| interface ArtifactMap { | ||
| ["IGateway"]: IGateway$Type | ||
| ["ProviderBatchCallAndSponsor"]: ProviderBatchCallAndSponsor$Type; | ||
| ["contracts/ProviderBatchCallAndSponsor.sol:IGateway"]: IGateway$Type | ||
| ["contracts/ProviderBatchCallAndSponsor.sol:ProviderBatchCallAndSponsor"]: ProviderBatchCallAndSponsor$Type; | ||
| } |
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check for duplicate IGateway artifact keys across all artifacts
rg -n --glob 'artifacts/**/*.d.ts' '\["IGateway"\]'
Length of output: 311
🏁 Script executed:
#!/bin/bash
# Check if IGateway exists in the contracts/interfaces directory
fd -t f IGateway.sol
Length of output: 95
🏁 Script executed:
#!/bin/bash
# Search for IGateway interface definitions in contracts
rg -n 'interface IGateway' --type sol
Length of output: 89
🏁 Script executed:
#!/bin/bash
# Search for IGateway interface definition without type restriction
rg -n 'interface IGateway' contracts/
Length of output: 184
🏁 Script executed:
#!/bin/bash
# Check the actual ProviderBatchCallAndSponsor.sol source to see the local interface
head -100 contracts/ProviderBatchCallAndSponsor.sol
Length of output: 4093
🏁 Script executed:
#!/bin/bash
# Check artifacts.d.ts to see how it resolves the conflict
cat artifacts/artifacts.d.ts | head -30
Length of output: 344
Avoid duplicate ArtifactMap key for IGateway.
The artifact file registers ["IGateway"] for the interface defined locally in ProviderBatchCallAndSponsor.sol, but IGateway is also defined in contracts/interfaces/IGateway.sol. This creates a conflict—notice that artifacts/artifacts.d.ts resolves the ambiguity by setting ["IGateway"]: never, preventing proper artifact type resolution. Rename the local interface (e.g., IAggregatorGateway) and update references in the code (e.g., the onlyAggregator modifier).
🛠️ Example fix (rename the local interface)
-interface IGateway {
+interface IAggregatorGateway {
function getAggregator() external view returns (address);
}🤖 Prompt for AI Agents
In `@artifacts/contracts/ProviderBatchCallAndSponsor.sol/artifacts.d.ts` around
lines 7 - 42, Rename the locally declared interface IGateway to a unique name
(e.g., IAggregatorGateway) and update all references: change the interface
declaration symbol and the ArtifactMap entries in artifacts.d.ts from IGateway
to IAggregatorGateway, and update usages in the contract code such as the
onlyAggregator modifier, the gatewayAddress getter, and any cast/ABI usage that
references IGateway so they now reference IAggregatorGateway; ensure the ABI
entry name (getAggregator) and any import/fully‑qualified names (e.g.,
"contracts/ProviderBatchCallAndSponsor.sol:IGateway") are replaced accordingly
to avoid the duplicate ArtifactMap key.
```
// constant because when it assigned to eao we want to ensure it never changes
/// @notice The address of the Gateway contract.
address public constant gatewayAddress = 0x56dA8fCE8FD64CaaE90D80DED55587b282bb4303;
```
Minor typo in comment.
Line 25: "eao" should be "EOA" (Externally Owned Account).
📝 Proposed fix
```
- // constant because when it assigned to eao we want to ensure it never changes
+ // constant because when it is assigned to an EOA we want to ensure it never changes
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```
// constant because when it is assigned to an EOA we want to ensure it never changes
/// @notice The address of the Gateway contract.
address public constant gatewayAddress = 0x56dA8fCE8FD64CaaE90D80DED55587b282bb4303;
```
🤖 Prompt for AI Agents
In `@contracts/ProviderBatchCallAndSponsor.sol` around lines 25 - 27, The comment
above the gatewayAddress constant has a typo ("eao"); update the comment to use
the correct acronym "EOA" (Externally Owned Account) so the comment reads e.g.
"constant because when it assigned to EOA we want to ensure it never changes" or
similar; locate the comment immediately preceding the address public constant
gatewayAddress and replace "eao" with "EOA".
```
chainDescriptors: {
  42220: {
    name: "celo",
    blockExplorers: {
      etherscan: {
        name: "celoscan",
        url: "https://api.celoscan.io/api",
        apiUrl: "https://celoscan.io/",
      },
    },
  },
  534352: {
    name: "scroll",
    blockExplorers: {
      etherscan: {
        name: "Scroll Explorer",
        url: "https://api.scrollscan.com/api",
        apiUrl: "https://scrollscan.com/",
      },
    },
  },
  1135: {
    name: "lisk",
    blockExplorers: {
      etherscan: {
        name: "Lisk Explorer",
        url: "https://explorer.lisk.com/api",
        apiUrl: "https://explorer.lisk.com",
      },
    },
  },
},
```

(The rendered diff also interleaves the removed legacy `customChains` entries, e.g. `network: "amoy", chainId: 80002, urls: { apiURL: "https://api-amoy.polygonscan.com/api", browserURL: "https://amoy.polygonscan.com" }` and `network: "celo", chainId: 42220, urls: { apiURL: "https://api.celoscan.io/api", browserURL: "https://celoscan.io/" }`.)
🧩 Analysis chain
🌐 Web query:
Hardhat v3 chainDescriptors etherscan url apiUrl configuration
💡 Result:
In Hardhat v3, the Etherscan verification endpoint configuration for custom (non-natively-supported) chains is done via chainDescriptors → blockExplorers.etherscan, where you set both the human-facing explorer URL and the API base URL:
// hardhat.config.ts
import { defineConfig } from "hardhat/config";
import hardhatVerify from "@nomicfoundation/hardhat-verify";
export default defineConfig({
plugins: [hardhatVerify],
verify: {
etherscan: {
apiKey: process.env.ETHERSCAN_API_KEY!, // or configVariable(...)
},
},
chainDescriptors: {
123456: {
name: "ExampleChain",
blockExplorers: {
etherscan: {
name: "ExampleScan",
url: "https://example.scan", // explorer website
apiUrl: "https://example.scan/api", // explorer API base used for verification
},
},
},
},
});

- `url`: the block explorer website base URL. (hardhat.org)
- `apiUrl`: the block explorer API base URL (the one Hardhat uses to submit verification requests). (hardhat.org)
- The API key is configured under `verify.etherscan.apiKey`. (hardhat.org)
Citations:
- 1: https://hardhat.org/verify-custom-networks?utm_source=openai
- 2: https://hardhat.org/verify-custom-networks?utm_source=openai
- 3: https://hardhat.org/verify-custom-networks?utm_source=openai
🌐 Web query:
Celoscan API endpoint documentation vs web explorer URL
💡 Result:
Celoscan’s web explorer URL and its API base URL are different things:
Web explorer (human UI):
- Celo Mainnet: https://celoscan.io/
- Celo Sepolia: https://sepolia.celoscan.io/ (listed as official explorers in Celo’s network overview) [3]

API endpoint (programmatic): Celoscan’s “Developer API” docs currently point to Etherscan API V2, where you call a single multichain endpoint and specify Celo via `chainid`. [1]
- Base: https://api.etherscan.io/v2/api
- Celo Mainnet uses `chainid=42220`, Celo Sepolia uses `chainid=11142220`. [1]
- Example (block number): https://api.etherscan.io/v2/api?chainid=42220&module=proxy&action=eth_blockNumber&apikey=YOUR_KEY [2]
So: don’t use the explorer URL (celoscan.io) as the API base; use the Etherscan V2 API base and set the appropriate Celo chainid. [1][2]
Sources:
[1] Etherscan V2 “Supported Chains” (includes Celo chain IDs) — https://docs.etherscan.io/etherscan-v2/getting-started/supported-chains
[2] Etherscan V2 endpoint example (/v2/api, chainid=...) — https://docs.etherscan.io/api-reference/endpoint/ethblocknumber
[3] Celo docs listing Celoscan as a block explorer — https://docs.celo.org/tooling/overview/network-overview
🌐 Web query:
Hardhat defineConfig blockExplorers etherscan url apiUrl meaning
💡 Result:
In Hardhat’s defineConfig when you add a custom explorer under chainDescriptors -> <chainId> -> blockExplorers -> etherscan, the two fields are:
- `url`: the human-facing block explorer website base URL (used for links / UI navigation), e.g. `https://example.scan`. (hardhat.org)
- `apiUrl`: the programmatic API endpoint that the verification plugin calls to submit source code and poll verification status, e.g. `https://example.scan/api` (or an Etherscan v2 endpoint like `https://api.etherscan.io/v2/api`). (hardhat.org)
Related: in older @nomicfoundation/hardhat-verify “customChains” config (Hardhat 2-era), the same idea appears as urls.browserURL (website) vs urls.apiURL (API). (docs.linea.build)
Citations:
- 1: https://hardhat.org/verify-custom-networks?utm_source=openai
- 2: https://hardhat.org/verify-custom-networks?utm_source=openai
- 3: https://docs.linea.build/network/how-to/verify-smart-contract/hardhat?utm_source=openai
Fix swapped url and apiUrl in chainDescriptors for all three chains.
In chainDescriptors, the url and apiUrl fields are reversed across celo, scroll, and lisk configurations. According to Hardhat v3 documentation, url should point to the explorer's website (human-facing) and apiUrl should point to the API endpoint used for contract verification.
Current (incorrect):
- Celo: `url` → API endpoint, `apiUrl` → website
- Scroll: `url` → API endpoint, `apiUrl` → website
- Lisk: `url` → API endpoint, `apiUrl` → website
Swap these values so that url points to the web explorer and apiUrl points to the API endpoint.
🤖 Prompt for AI Agents
In `@hardhat.config.ts` around lines 118 - 148, The chainDescriptors entries for
chains 42220 (celo), 534352 (scroll), and 1135 (lisk) have swapped fields: swap
the values of url and apiUrl in each object's blockExplorers.etherscan so that
url is the human-facing explorer website and apiUrl is the API endpoint used for
verification (update the objects under chainDescriptors for 42220, 534352, and
1135 accordingly).
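Under the documented semantics, a corrected descriptor for one chain might look like the sketch below (illustrative only; the two URLs are taken from the diff above, and per the Celoscan notes the Etherscan v2 multichain endpoint may be preferable as `apiUrl`):

```typescript
// Hypothetical corrected entry for Celo (42220): url = website, apiUrl = API.
chainDescriptors: {
  42220: {
    name: "celo",
    blockExplorers: {
      etherscan: {
        url: "https://celoscan.io/",            // human-facing explorer
        apiUrl: "https://api.celoscan.io/api",  // verification API endpoint
        name: "celoscan",
      },
    },
  },
},
```

The same swap applies to the scroll (534352) and lisk (1135) entries.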
package.json
Outdated
```
"type": "module",
"devDependencies": {
  "@daochild/tronweb-typescript": "^1.1.2",
  "@ethersproject/abi": "^5.7.0",
  "@ethersproject/providers": "^5.7.2",
  "@nomicfoundation/hardhat-chai-matchers": "^1.0.6",
  "@nomicfoundation/hardhat-network-helpers": "^1.0.8",
  "@nomicfoundation/hardhat-toolbox": "^2.0.2",
  "@nomiclabs/hardhat-ethers": "^2.2.3",
  "@nomiclabs/hardhat-etherscan": "^3.1.7",
  "@openzeppelin/hardhat-upgrades": "^1.27.0",
  "@nomicfoundation/hardhat-ethers": "^4.0.4",
  "@nomicfoundation/hardhat-ethers-chai-matchers": "^3.0.2",
  "@nomicfoundation/hardhat-mocha": "^3.0.9",
  "@nomicfoundation/hardhat-network-helpers": "^3.0.3",
  "@nomicfoundation/hardhat-toolbox-mocha-ethers": "^3.0.2",
  "@nomicfoundation/hardhat-typechain": "^3.0.1",
  "@nomicfoundation/hardhat-verify": "^3.0.8",
  "@openzeppelin/truffle-upgrades": "^1.21.0",
  "@typechain/ethers-v5": "^10.2.0",
  "@typechain/hardhat": "^6.1.5",
  "chai": "^4.3.7",
  "@types/chai": "^5.2.3",
  "@types/mocha": "^10.0.10",
  "chai": "^6.2.2",
  "dotenv": "^16.0.2",
  "ethers": "^5.7.2",
  "hardhat": "^2.14.0",
  "hardhat-deploy": "^0.11.34",
  "hardhat-gas-reporter": "^1.0.9",
  "solidity-coverage": "^0.8.2",
  "ethers": "^6.16.0",
  "hardhat": "^3.0.12",
  "mocha": "^11.7.5",
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Locate Hardhat config files and check for CommonJS usage
echo "=== Looking for hardhat.config files ==="
fd -a 'hardhat.config.*' 2>/dev/null || echo "No hardhat.config files found"
echo -e "\n=== Checking for require/module.exports in hardhat config ==="
rg -n --glob 'hardhat.config.*' 'require\(|module\.exports' 2>/dev/null || echo "No require/module.exports in hardhat configs"
echo -e "\n=== Checking for require in test files ==="
rg -n --glob 'test/**/*.{js,ts}' 'require\(' 2>/dev/null || echo "No require found in test files"
echo -e "\n=== List all config-like files in root ==="
fd -a '.*config' --type f --max-depth 1 2>/dev/null || true

Repository: paycrest/contracts
Length of output: 1516
🏁 Script executed:
#!/bin/bash
echo "=== All test files in repository ==="
find test -type f \( -name "*.js" -o -name "*.ts" \) 2>/dev/null | sort
echo -e "\n=== Count of require statements in test directory ==="
rg -c 'require\(' test/ 2>/dev/null | grep -v ':0$' || echo "Some files contain require"
echo -e "\n=== Check fixtures directory ==="
rg -n 'require\(' test/fixtures/ 2>/dev/null | head -20
echo -e "\n=== Check utils directory ==="
rg -n 'require\(' test/utils/ 2>/dev/null | head -20

Repository: paycrest/contracts
Length of output: 502
🏁 Script executed:
#!/bin/bash
echo "=== Content of test/fixtures/gateway.js ==="
cat -n test/fixtures/gateway.js 2>/dev/null | head -30
echo -e "\n=== Content of test/fixtures/mockUSDT.js ==="
cat -n test/fixtures/mockUSDT.js 2>/dev/null | head -30
echo -e "\n=== Content of test/utils/utils.manager.js ==="
cat -n test/utils/utils.manager.js 2>/dev/null | head -40

Repository: paycrest/contracts
Length of output: 3687
🏁 Script executed:
#!/bin/bash
echo "=== Content of test/gateway/gateway.ownable.test.js (first 20 lines) ==="
head -20 test/gateway/gateway.ownable.test.js

Repository: paycrest/contracts
Length of output: 594
Convert test files to ESM: 2 test files still use CommonJS require() and will break with "type": "module" enabled.
- `test/gateway/gateway.settleOrder.test.js` (6 require statements)
- `test/gateway/gateway.createorder.test.js` (6 require statements)
The hardhat config, fixtures, utils, and test/gateway/gateway.ownable.test.js are already ESM-safe. Convert the two listed test files to use import statements instead of require().
🤖 Prompt for AI Agents
In `@package.json` around lines 19 - 37, Two test files
(test/gateway/gateway.settleOrder.test.js and
test/gateway/gateway.createorder.test.js) still use CommonJS require() which
breaks under "type":"module"; change each require(...) to ES module import
statements (use named or default import form matching the original
destructuring), update any module.exports in code they import to export/exports
(or convert exported helpers to export default/named) and adjust references
accordingly, and handle CommonJS globals if present (replace
__dirname/__filename uses with import.meta.url or path utilities). Locate
require calls in those two files and replace them with appropriate import lines
and fix any exported helper modules to use export syntax so tests run under ESM.
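As a sketch of the mechanical CommonJS-to-ESM conversion (the module names below are illustrative assumptions, not the files' actual imports):

```javascript
// Before (CommonJS) -- breaks once package.json sets "type": "module":
// const { expect } = require("chai");
// const { gatewayFixture } = require("../fixtures/gateway.js");

// After (ESM):
import { expect } from "chai";
import { gatewayFixture } from "../fixtures/gateway.js"; // ESM requires the explicit file extension
```

Any helper these tests import must expose named exports (`export function gatewayFixture() { ... }`) rather than `module.exports`, or the imports above will resolve to `undefined`.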
```
const mockUSDT = await ethers.deployContract("MockUSDT");
console.log("MockUSDT deployed to:", await mockUSDT.getAddress());
```
🧩 Analysis chain
🌐 Web query:
Hardhat v3 ethers.deployContract behavior mining await
💡 Result:
In Hardhat v3 + ethers v6, await ethers.deployContract("Name", ...) gives you an ethers.Contract instance, but it doesn’t guarantee the deployment tx has been mined (i.e., you may still need to wait for the receipt). Hardhat’s ethers plugin documents deployContract as returning Promise<ethers.Contract> without stating it waits for mining, unlike the viem plugin which explicitly says its deployContract waits until the deployment transaction is mined. [1] [2]
So if you need the deployment to be mined (address usable, receipt available, downstream tooling that “detects deploy” based on receipts, etc.), do:
const c = await ethers.deployContract("Counter");
await c.waitForDeployment(); // ethers v6 (ethers v5 was c.deployed())
const addr = await c.getAddress();

This “explicit wait” is also what third-party tooling expects when it says to call waitForDeployment() (ethers v6) to ensure the deployment receipt is available. [3]
If you have automining disabled on Hardhat Network, any waitForDeployment() / tx.wait() can hang forever until you mine a block, because the tx is queued but never included in a block. Fix by manually mining (or re-enabling automine):
await network.provider.send("evm_setAutomine", [false]);
const c = await ethers.deployContract("Counter"); // tx sent/queued
await network.provider.send("evm_mine"); // mine a block
await c.waitForDeployment(); // now it resolves

[1] [2] [4]
Sources: Hardhat hardhat-ethers docs (deployContract API) [1], Hardhat hardhat-viem docs (deployContract waits for mining) [2], Tenderly Hardhat note about needing waitForDeployment() for ethers deploy helpers [3], explanation of hangs when automine is off and you must evm_mine [4].
🌐 Web query:
ethers deployContract waitForDeployment Hardhat v3
💡 Result:
In Hardhat v3 (with ethers v6), ethers.deployContract() returns a contract object before it’s mined, so you should await waitForDeployment() before reading the address or interacting with it. Hardhat v3’s hardhat-ethers plugin exposes ethers via network.connect(). [1]
import { network } from "hardhat";

const { ethers } = await network.connect();

const counter = await ethers.deployContract("Counter", [42]); // ctor args optional
await counter.waitForDeployment(); // ethers v6
const address = await counter.getAddress();
console.log("deployed at:", address);

Notes:
- `deployContract(name, constructorArgs?, signer?)` is the Hardhat v3 helper for deployments. [1]
- `waitForDeployment()` is the ethers v6 way to wait until the deployment tx is mined (ethers v5 used `deployed()`). [7]
Add waitForDeployment() to ensure the deployment transaction is mined.
In Hardhat v3 with ethers v6, ethers.deployContract() returns the contract instance before the deployment transaction is mined. Without explicitly awaiting waitForDeployment(), calling getAddress() can race and cause test flakiness.
🛠️ Suggested fix

```
  const mockUSDT = await ethers.deployContract("MockUSDT");
+ await mockUSDT.waitForDeployment();
  console.log("MockUSDT deployed to:", await mockUSDT.getAddress());
```
🤖 Prompt for AI Agents
In `@test/fixtures/mockUSDT.js` around lines 7 - 8, The deployment result from
ethers.deployContract("MockUSDT") is returned before the transaction is mined;
after obtaining mockUSDT, await mockUSDT.waitForDeployment() before calling
mockUSDT.getAddress() so the deployment transaction is confirmed and avoids race
conditions in tests.
```
async function getSupportedInstitutions() {
  const currency = ethers.utils.formatBytes32String("NGN");

  const accessBank = {
    code: ethers.utils.formatBytes32String("ABNGNGLA"),
    name: ethers.utils.formatBytes32String("ACCESS BANK"),
  };

  const diamondBank = {
    code: ethers.utils.formatBytes32String("DBLNNGLA"),
    name: ethers.utils.formatBytes32String("DIAMOND BANK"),
  };

  return {
    currency,
    accessBank,
    diamondBank,
  };
}
```
ethers.utils.formatBytes32String is deprecated in ethers v6.
The ethers.utils namespace was removed in ethers v6. Use ethers.encodeBytes32String instead.
🐛 Proposed fix

```
 async function getSupportedInstitutions() {
-  const currency = ethers.utils.formatBytes32String("NGN");
+  const currency = ethers.encodeBytes32String("NGN");
   const accessBank = {
-    code: ethers.utils.formatBytes32String("ABNGNGLA"),
-    name: ethers.utils.formatBytes32String("ACCESS BANK"),
+    code: ethers.encodeBytes32String("ABNGNGLA"),
+    name: ethers.encodeBytes32String("ACCESS BANK"),
   };
   const diamondBank = {
-    code: ethers.utils.formatBytes32String("DBLNNGLA"),
-    name: ethers.utils.formatBytes32String("DIAMOND BANK"),
+    code: ethers.encodeBytes32String("DBLNNGLA"),
+    name: ethers.encodeBytes32String("DIAMOND BANK"),
   };
   return {
     currency,
     accessBank,
     diamondBank,
   };
 }
```
🤖 Prompt for AI Agents
In `@test/utils/utils.manager.js` around lines 56 - 74, The code in
getSupportedInstitutions uses the deprecated ethers.utils.formatBytes32String;
replace those calls with ethers.encodeBytes32String for currency,
accessBank.code/name and diamondBank.code/name to be compatible with ethers v6
(update any imports if needed so ethers is the v6 API). Ensure you update all
occurrences in getSupportedInstitutions to call
ethers.encodeBytes32String("...") for the currency, accessBank.code,
accessBank.name, diamondBank.code, and diamondBank.name.
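For intuition, `encodeBytes32String` is essentially "UTF-8 encode, then zero-pad on the right to 32 bytes." A simplified re-implementation (illustrative sketch, not the library's code):

```typescript
// Simplified sketch of ethers v6 encodeBytes32String: UTF-8 bytes,
// zero-padded on the right to 32 bytes, then hex-encoded.
function encodeBytes32String(text: string): string {
  const bytes = new TextEncoder().encode(text);
  if (bytes.length > 31) throw new Error("bytes32 string must be less than 32 bytes");
  const padded = new Uint8Array(32);
  padded.set(bytes); // remaining bytes stay 0x00
  return "0x" + Array.from(padded, (b) => b.toString(16).padStart(2, "0")).join("");
}

console.log(encodeBytes32String("NGN").slice(0, 8)); // 0x4e474e
```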
```
async function mockMintDeposit(gateway, account, usdc, amount) {
  await usdc.connect(account).mint(amount);
  await usdc.connect(account).approve(gateway.address, amount);
}
```
gateway.address is deprecated in ethers v6.
In ethers v6, contract instances no longer have a .address property. Use await gateway.getAddress() instead.
🐛 Proposed fix

```
 async function mockMintDeposit(gateway, account, usdc, amount) {
   await usdc.connect(account).mint(amount);
-  await usdc.connect(account).approve(gateway.address, amount);
+  await usdc.connect(account).approve(await gateway.getAddress(), amount);
 }
```
🤖 Prompt for AI Agents
In `@test/utils/utils.manager.js` around lines 76 - 79, The helper mockMintDeposit
uses the deprecated gateway.address (ethers v6); change it to await
gateway.getAddress() and pass that result to usdc.connect(account).approve(...).
Specifically, inside mockMintDeposit ensure you call const gatewayAddr = await
gateway.getAddress() (or inline await gateway.getAddress()) and use gatewayAddr
when calling usdc.connect(account).approve(...), keeping the function async and
preserving the mint call.
```
export type MinEthersFactory<C, ARGS> = {
  deploy(...a: ARGS[]): Promise<C>
}

export type GetContractTypeFromFactory<F> = F extends MinEthersFactory<infer C, any> ? C : never
export type GetARGsTypeFromFactory<F> = F extends MinEthersFactory<any, any> ? Parameters<F['deploy']> : never
```
Fix MinEthersFactory deploy signature to accept the argument tuple, not an array of tuples.
deploy(...a: ARGS[]) makes the method take multiple tuple arguments (e.g., [string, number][]) instead of the usual argument list, which breaks typing for factory deploy calls. This is a type-level correctness issue.
🔧 Proposed fix

```
-export type MinEthersFactory<C, ARGS> = {
-  deploy(...a: ARGS[]): Promise<C>
-}
+export type MinEthersFactory<C, ARGS extends any[]> = {
+  deploy(...a: ARGS): Promise<C>
+}
```
🤖 Prompt for AI Agents
In `@types/ethers-contracts/common.ts` around lines 44 - 49, The deploy signature
on MinEthersFactory is wrong: change deploy(...a: ARGS[]) to accept the argument
tuple rather than an array of tuples (e.g., deploy(...a: ARGS) or
deploy(...args: ARGS)) so the factory deploy takes the normal variadic tuple
parameters; update MinEthersFactory's deploy signature accordingly and keep
GetContractTypeFromFactory and GetARGsTypeFromFactory (which uses
Parameters<F['deploy']>) intact so they resolve the correct tuple types.
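A minimal sketch of why the variadic-tuple form type-checks correctly (the factory and contract shapes below are illustrative, not the generated typings):

```typescript
// With `ARGS extends any[]` and `deploy(...a: ARGS)`, the tuple spreads into a
// normal parameter list, so Parameters<F["deploy"]> recovers [string, number].
type MinEthersFactory<C, ARGS extends any[]> = {
  deploy(...a: ARGS): Promise<C>;
};

type TokenFactory = MinEthersFactory<{ addr: string }, [string, number]>;

// A call site now takes the arguments positionally, not as an array of tuples.
const factory: TokenFactory = {
  deploy: async (name: string, supply: number) => ({ addr: `0x${name.length}${supply}` }),
};

factory.deploy("Token", 1000).then((c) => console.log(c.addr)); // 0x51000
```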
- Changed Mocha version from 11.7.5 to 11.3.0 in package.json and package-lock.json.
- Updated hardhat.config.ts to move Mocha timeout settings under the new test configuration structure.
- Adjusted TypeScript configuration to use ES2020 module syntax and bundler resolution.
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@hardhat.config.ts`:
- Around line 11-12: The current env value is built only from
dotenv.config().parsed so system/CI environment variables (process.env) are
ignored; update the construction of env to merge parsed and process.env so that
CI/system variables are included and take precedence. Locate the dotenv.config()
call and the env variable declaration (dotEnvResult, dotenv.config(), env) and
change env to combine dotEnvResult.parsed with process.env (with process.env
overriding parsed) and cast to Record<string,string> as needed.
In `@package.json`:
- Line 9: The "ownership" npm script is malformed because it passes "mocha" as
an argument to "npx hardhat test"; update the "ownership" script (symbol:
ownership) to call Hardhat correctly by removing the extra "mocha" token so it
matches other test scripts (e.g., use "npx hardhat test
test/gateway/gateway.ownable.test.js" or the project’s standard test
invocation).
```
const dotEnvResult = dotenv.config();
const env = (dotEnvResult.parsed ?? {}) as Record<string, string>;
```
Environment variables from system/CI won't be picked up.
dotenv.config().parsed only contains variables loaded from the .env file. System environment variables (e.g., those injected in CI pipelines) won't be included in env, causing builds/deployments to fail in CI where secrets are typically set via system environment, not .env files.
🔧 Proposed fix to use process.env with dotenv loaded

```
 import dotenv from "dotenv";
-const dotEnvResult = dotenv.config();
-const env = (dotEnvResult.parsed ?? {}) as Record<string, string>;
+dotenv.config();
+const env = process.env as Record<string, string>;
```
🤖 Prompt for AI Agents
In `@hardhat.config.ts` around lines 11 - 12, The current env value is built only
from dotenv.config().parsed so system/CI environment variables (process.env) are
ignored; update the construction of env to merge parsed and process.env so that
CI/system variables are included and take precedence. Locate the dotenv.config()
call and the env variable declaration (dotEnvResult, dotenv.config(), env) and
change env to combine dotEnvResult.parsed with process.env (with process.env
overriding parsed) and cast to Record<string,string> as needed.
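A self-contained sketch of the suggested merge semantics (the variable values below are made up for illustration; in the real config, `systemEnv` would be `process.env`):

```typescript
// Merge .env-parsed values with the system environment, letting CI/system
// variables override the .env file.
const parsed: Record<string, string> = { ETHERSCAN_API_KEY: "from-dotenv" };
const systemEnv: Record<string, string | undefined> = { ETHERSCAN_API_KEY: "from-ci" };

// In hardhat.config.ts this would be:
//   const env = { ...(dotEnvResult.parsed ?? {}), ...process.env } as Record<string, string>;
const env = { ...parsed, ...systemEnv } as Record<string, string>;

console.log(env.ETHERSCAN_API_KEY); // from-ci
```

Spread order matters: the later spread (`systemEnv`) wins on key collisions, which is exactly the "system overrides .env" precedence the review asks for.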
```
 "scripts": {
   "test": "npx hardhat test",
-  "ownership": "npx hardhat test test/gateway/gateway.ownable.test.js",
+  "ownership": "npx hardhat test mocha test/gateway/gateway.ownable.test.js",
```
Malformed ownership script command.
The command npx hardhat test mocha test/gateway/gateway.ownable.test.js appears incorrect—it passes mocha as an argument to hardhat test. This will likely fail or behave unexpectedly.
🔧 Proposed fix to match other test scripts

```
-    "ownership": "npx hardhat test mocha test/gateway/gateway.ownable.test.js",
+    "ownership": "npx hardhat test test/gateway/gateway.ownable.test.js",
```
🤖 Prompt for AI Agents
In `@package.json` at line 9, The "ownership" npm script is malformed because it
passes "mocha" as an argument to "npx hardhat test"; update the "ownership"
script (symbol: ownership) to call Hardhat correctly by removing the extra
"mocha" token so it matches other test scripts (e.g., use "npx hardhat test
test/gateway/gateway.ownable.test.js" or the project’s standard test
invocation).
…ening

- Implemented a new task to print available accounts and their ETH balances.
- Created a task to flatten specified contracts and their dependencies, with license handling.
Actionable comments posted: 3
🤖 Fix all issues with AI agents
In `@hardhat.config.ts`:
- Line 14: The config defines a hardcoded testPrivateKey and falls back to it
when DEPLOYER_PRIVATE_KEY is missing, which is unsafe for production networks;
update the networks configuration (where testPrivateKey is referenced) to stop
using the well-known key by removing the fallback and instead set the accounts
array to be empty when DEPLOYER_PRIVATE_KEY is not present (so deployments fail
fast), ensuring this change is applied for all production networks listed
(mainnet, arbitrum, polygon, base, bsc, optimism, scroll, celo, assetchain,
lisk) and remove or deprecate the testPrivateKey constant usage.
In `@tasks/accounts.ts`:
- Around line 15-21: The console.log call is printing raw wei with the wrong
`%i` specifier; change the second argument to a string-formatted ETH value and
use `%s` in the format string: call hre.ethers.formatEther on the result of
provider.getBalance(account.address) and pass that formatted string to
console.log (update the format string from "%s (%i ETH)" to use "%s (%s ETH)" or
similar), keeping references to account.address, provider.getBalance,
hre.ethers.formatEther and the existing console.log invocation.
In `@tasks/flatten.ts`:
- Around line 23-26: The code writes to outputPath (const outputPath =
path.join("flattened", "Gateway.sol")) with fs.writeFileSync but doesn't ensure
the "flattened" directory exists; before calling fs.writeFileSync(outputPath,
flattened) create the directory for path.dirname(outputPath) (e.g., using
fs.mkdirSync(dir, { recursive: true }) or fs.promises.mkdir(dir, { recursive:
true })) so the write won't throw ENOENT; update the block around outputPath,
path.join, and fs.writeFileSync accordingly.
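A sketch of the guarded write (the paths below use a temp directory for illustration; in the task itself the directory would be `path.dirname(outputPath)` for `flattened/Gateway.sol`):

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Ensure the output directory exists before writing, so writeFileSync
// cannot fail with ENOENT on a fresh checkout.
const outputPath = path.join(os.tmpdir(), "flattened-demo", "Gateway.sol");
fs.mkdirSync(path.dirname(outputPath), { recursive: true });
fs.writeFileSync(outputPath, "// flattened source");

console.log(fs.existsSync(outputPath)); // true
```

`{ recursive: true }` also makes `mkdirSync` a no-op when the directory already exists, so no existence check is needed.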
🧹 Nitpick comments (4)
tasks/flatten.ts (3)
13-14: Output filename is hardcoded regardless of input files.The task accepts arbitrary
filesvia arguments, but the output is always written toflattened/Gateway.sol. If a user passes different contracts (e.g.,["contracts/Token.sol"]), the output filename would be misleading.Consider deriving the output name from the input files, or documenting that this task is Gateway-specific.
Also applies to: 24-24
16-17: Comment doesn't match behavior.The comment says "Remove every line" but the code replaces
SPDX-License-Identifier:withLicense-Identifier:— the lines remain, just renamed. Consider updating the comment to reflect the actual behavior (e.g., "Rename SPDX-License-Identifier to avoid duplicate license warnings").
20-21: Consider a more readable approach for keeping only the first pragma.The IIFE closure pattern is clever but obscure. A clearer alternative improves maintainability.
♻️ More readable alternative
// Remove every line started with "pragma experimental ABIEncoderV2;" except the first one - flattened = flattened.replace(/pragma experimental ABIEncoderV2;\n/gm, ((i) => (m: any) => (!i++ ? m : ""))(0)); + let pragmaFound = false; + flattened = flattened.replace(/pragma experimental ABIEncoderV2;\n/gm, (match) => { + if (!pragmaFound) { + pragmaFound = true; + return match; + } + return ""; + });tasks/accounts.ts (1)
3-5: Replace the empty interface with a type alias.

Lines 3–5 trigger Biome's `noEmptyInterface` rule (enabled via "recommended": true in your config); a `type` is clearer and lint-clean. The interface is only used once as a function parameter type and has no usages elsewhere.

Suggested change
-interface AccountTaskArguments {
-  // No argument in this case
-}
+type AccountTaskArguments = Record<string, never>;
const env = (dotEnvResult.parsed ?? {}) as Record<string, string>;

dotenv.config();
const testPrivateKey = "0000000000000000000000000000000000000000000000000000000000000001"
🧩 Analysis chain
🏁 Script executed:
find . -name "hardhat.config.ts" -o -name "hardhat.config.js" | head -20

Repository: paycrest/contracts
Length of output: 81
🏁 Script executed:
cat -n ./hardhat.config.ts

Repository: paycrest/contracts
Length of output: 5707
🌐 Web query:
Hardhat empty accounts array configuration behavior
💡 Result:
In Hardhat, the networks.<name>.accounts field is not “optional if empty”—it directly determines what signers Hardhat can use:
- For JSON-RPC ("http") networks, `accounts` can be:
  - "remote" (use the node's unlocked accounts); this is also the default if you omit `accounts`. [1]
  - an array of private keys / config variables (use those as local signers). [1]
  - an HD wallet config object (derive accounts). [1]
If you set accounts: [], you are explicitly configuring Hardhat to use zero local accounts. That means Hardhat will typically have no available signer, so anything that needs a default account/wallet client (deployments, sending txs, etc.) can fail with “no account / no wallet client” style errors (e.g., Hardhat 3’s “Default Wallet Client Not Found … if no accounts were configured for the selected network”). [2]
Practical takeaway:
- If you intended "use the node's accounts", omit `accounts` or set `accounts: "remote"`. [1]
- If you intended "use my key(s)", provide at least one key/config variable in the array (not an empty array). [1]
Sources:
[1] Hardhat configuration reference (accounts default and accepted shapes). (hardhat.org)
[2] Hardhat errors reference (missing accounts ⇒ missing default wallet client). (hardhat.org)
Citations:
- 1: https://hardhat.org/docs/reference/configuration?utm_source=openai
- 2: https://hardhat.org/docs/reference/errors?utm_source=openai
🌐 Web query:
private key 0x0000000000000000000000000000000000000000000000000000000000000001 well-known test key
💡 Result:
0x000…0001 (i.e., the integer private key 1) is a “well‑known” key because on secp256k1 the public key is computed as P = k·G; with k = 1, the public key is just the generator point G itself. The generator point (in compressed SEC form) is 0279be66…f81798. [1]
Because it’s a real (and trivially guessable) key, people commonly reference it in examples and “puzzle”/demo contexts; it corresponds to widely cited, fixed addresses, e.g.:
- Bitcoin (P2PKH, compressed pubkey) address: `1BgGZ9tcN4rm9KBzDn7KprQz87SZ26SAMH` for private key …0001. [2]
- Ethereum address: `0x7E5F4552091A69125d5DfCb7b8C2659029395Bdf` for the same private key …0001. [3]
It is not safe for testing with real funds on any public chain (bots sweep deposits immediately); use dedicated dev chains or generated test keys instead. [2]
Remove the fallback to a well-known test key on production networks.
The config uses a well-known private key (0x000…0001 → Ethereum address 0x7E5F4552091A69125d5DfCb7b8C2659029395Bdf) as a fallback when DEPLOYER_PRIVATE_KEY is missing. This key is publicly documented in tutorials and actively monitored by bots; any funds sent to it are immediately swept. Using it for mainnet, arbitrum, polygon, base, bsc, optimism, scroll, celo, assetchain, or lisk risks accidental deployment with a completely compromised signer.
Replace with an empty accounts array to fail fast if the env var is missing:
🔐 Suggested fix (apply across all networks)
-const testPrivateKey = "0000000000000000000000000000000000000000000000000000000000000001"
+const deployerPrivateKey = env.DEPLOYER_PRIVATE_KEY;
+const accounts = deployerPrivateKey ? [deployerPrivateKey] : [];

-accounts: [env.DEPLOYER_PRIVATE_KEY || testPrivateKey],
+accounts,

🤖 Prompt for AI Agents
In `@hardhat.config.ts` at line 14, The config defines a hardcoded testPrivateKey
and falls back to it when DEPLOYER_PRIVATE_KEY is missing, which is unsafe for
production networks; update the networks configuration (where testPrivateKey is
referenced) to stop using the well-known key by removing the fallback and
instead set the accounts array to be empty when DEPLOYER_PRIVATE_KEY is not
present (so deployments fail fast), ensuring this change is applied for all
production networks listed (mainnet, arbitrum, polygon, base, bsc, optimism,
scroll, celo, assetchain, lisk) and remove or deprecate the testPrivateKey
constant usage.
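A minimal sketch of the fail-fast pattern from the suggested fix: with no DEPLOYER_PRIVATE_KEY, the accounts array is empty, so Hardhat has zero local signers and deployments fail instead of silently signing with the well-known key. The helper name `accountsFor` is illustrative, not part of the actual config:

```javascript
// Fail-fast accounts resolution: empty array when the env var is missing.
function accountsFor(env) {
  const key = env.DEPLOYER_PRIVATE_KEY;
  return key ? [key] : []; // [] => no signer => deployment fails fast
}

console.log(accountsFor({})); // []
console.log(accountsFor({ DEPLOYER_PRIVATE_KEY: "0xabc" })); // [ '0xabc' ]
```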
console.log(
  "%s (%i ETH)",
  account.address,
  // hre.ethers.utils.formatEther(
  // getBalance returns wei amount, format to ETH amount
  await provider.getBalance(account.address)
  // )
🧩 Analysis chain
🏁 Script executed:
# Find ethers version in package manifests
rg -n "ethers" --type json -g 'package.json' -g 'package-lock.json' -g 'yarn.lock' -g 'pnpm-lock.yaml' | head -20

Repository: paycrest/contracts
Length of output: 1321
🏁 Script executed:
# Also check if package.json exists and read it
if [ -f "package.json" ]; then
cat package.json | grep -A 5 '"ethers"'
fi

Repository: paycrest/contracts
Length of output: 216
🏁 Script executed:
# Read the actual code in tasks/accounts.ts
if [ -f "tasks/accounts.ts" ]; then
cat -n tasks/accounts.ts | head -30
fi

Repository: paycrest/contracts
Length of output: 865
Format balances to ETH (not raw wei) and use %s format specifier.
Line 16 uses %i format with a wei value, which will format incorrectly. For ethers v6, extract the balance, format it using hre.ethers.formatEther(), and use %s instead:
Suggested change
- console.log(
- "%s (%i ETH)",
- account.address,
- // hre.ethers.utils.formatEther(
- // getBalance returns wei amount, format to ETH amount
- await provider.getBalance(account.address)
- // )
- );
+ const balanceWei = await provider.getBalance(account.address);
+ const balanceEth = hre.ethers.formatEther(balanceWei);
+ console.log("%s (%s ETH)", account.address, balanceEth);

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
- console.log(
-   "%s (%i ETH)",
-   account.address,
-   // hre.ethers.utils.formatEther(
-   // getBalance returns wei amount, format to ETH amount
-   await provider.getBalance(account.address)
-   // )
+ const balanceWei = await provider.getBalance(account.address);
+ const balanceEth = hre.ethers.formatEther(balanceWei);
+ console.log("%s (%s ETH)", account.address, balanceEth);
🤖 Prompt for AI Agents
In `@tasks/accounts.ts` around lines 15 - 21, The console.log call is printing raw
wei with the wrong `%i` specifier; change the second argument to a
string-formatted ETH value and use `%s` in the format string: call
hre.ethers.formatEther on the result of provider.getBalance(account.address) and
pass that formatted string to console.log (update the format string from "%s (%i
ETH)" to use "%s (%s ETH)" or similar), keeping references to account.address,
provider.getBalance, hre.ethers.formatEther and the existing console.log
invocation.
// Write to flattened/Gateway.sol
const outputPath = path.join("flattened", "Gateway.sol");
fs.writeFileSync(outputPath, flattened);
console.log(`Flattened contract written to ${outputPath}`);
Ensure the output directory exists before writing.
fs.writeFileSync will throw ENOENT if the flattened directory doesn't exist. Consider creating it first.
🐛 Proposed fix
// Write to flattened/Gateway.sol
const outputPath = path.join("flattened", "Gateway.sol");
+ fs.mkdirSync(path.dirname(outputPath), { recursive: true });
fs.writeFileSync(outputPath, flattened);
console.log(`Flattened contract written to ${outputPath}`);

🤖 Prompt for AI Agents
In `@tasks/flatten.ts` around lines 23 - 26, The code writes to outputPath (const
outputPath = path.join("flattened", "Gateway.sol")) with fs.writeFileSync but
doesn't ensure the "flattened" directory exists; before calling
fs.writeFileSync(outputPath, flattened) create the directory for
path.dirname(outputPath) (e.g., using fs.mkdirSync(dir, { recursive: true }) or
fs.promises.mkdir(dir, { recursive: true })) so the write won't throw ENOENT;
update the block around outputPath, path.join, and fs.writeFileSync accordingly.
Description
This PR refactors the Gateway contract to eliminate code duplication in fee handling by consolidating duplicate fee splitting functions into single reusable implementations.
Changes made:
Benefits:
References
Related to the Gateway contract fee handling implementation.
Testing
Manual testing:
Checklist
By submitting a PR, I agree to Paycrest's Contributor Code of Conduct and Contribution Guide.
Summary by CodeRabbit
New Features
Bug Fixes
Chores
Tests