This repository was archived by the owner on Jan 6, 2026. It is now read-only.
Merged
39 changes: 6 additions & 33 deletions README.md
@@ -2,36 +2,9 @@

SP Retrieval Checker Module

- [Roadmap](https://pl-strflt.notion.site/SPARK-Roadmap-ac729c11c49b409fbec54751d1bc6c8a)
- [API](https://github.com/filecoin-station/spark-api)

## Development

Install [Zinnia CLI](https://github.com/filecoin-station/zinnia).

```bash
$ # Install dev tooling
$ npm ci
$ # Lint
$ npm run lint
$ # Fix linting issues
$ npm run lint:fix
$ # Run module
$ zinnia run main.js
$ # Test module
$ zinnia run test.js
```

## Release

On a clean working tree, run the following command:

```bash
$ ./release.sh <semver>
$ # Example
$ ./release.sh 1.0.0
```

Use GitHub's changelog feature to fill out the release notes.

Publish the new release and let the CI/CD workflow upload the sources to IPFS & IPNS.
> [!CAUTION]
>
> **This repository is no longer maintained.**
>
> Filecoin Spark and Checker Network continue to operate in a permissioned architecture.
> See the [announcement](https://x.com/FilecoinCDN/status/1932472254245298504) for more details.
15 changes: 12 additions & 3 deletions main.js
@@ -1,4 +1,13 @@
import Spark from './lib/spark.js'
/* global Zinnia */

const spark = new Spark()
await spark.run()
Zinnia.activity.error(
'Spark update: Filecoin Station and Checker Network programmes ended. The node is no longer contributing to the network, and there will be no further rewards. Thank you for your participation!',
)

while (true) {
await new Promise((resolve) => setTimeout(resolve, 60_000))
}

// import Spark from './lib/spark.js'
// const spark = new Spark()
// await spark.run()