diff --git a/docs/marketplace/Guides/prepare-comfyui.md b/docs/cli/Guides/Solutions/comfyui.md
similarity index 91%
rename from docs/marketplace/Guides/prepare-comfyui.md
rename to docs/cli/Guides/Solutions/comfyui.md
index 5b8dc8eb..aa1cab11 100644
--- a/docs/marketplace/Guides/prepare-comfyui.md
+++ b/docs/cli/Guides/Solutions/comfyui.md
@@ -1,14 +1,14 @@
---
-id: "prepare-comfyui"
-title: "Prepare a ComfyUI Workflow"
-slug: "/guides/prepare-comfyui"
-sidebar_position: 5
+id: "comfyui"
+title: "ComfyUI"
+slug: "/guides/solutions/comfyui"
+sidebar_position: 2
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
-This guide provides step-by-step instructions for preparing a **ComfyUI** workflow with custom nodes before uploading it. For security reasons, you cannot upload custom nodes directly to a deployed ComfyUI.
+This guide provides step-by-step instructions for preparing a **ComfyUI** workflow with custom nodes to run on Super Protocol. For security reasons, you cannot upload custom nodes directly to a deployed ComfyUI.
:::note
@@ -28,7 +28,7 @@ You can prepare your model, workflow, and custom node files manually or using Do
1. Clone the [Super-Protocol/solutions](https://github.com/Super-Protocol/solutions/) GitHub repository to the location of your choosing:
- ```
+ ```shell
git clone https://github.com/Super-Protocol/solutions.git --depth 1
```
@@ -54,13 +54,13 @@ You can prepare your model, workflow, and custom node files manually or using Do
Access the running container with the following command:
- ```
+ ```shell
docker exec -it comfyui bash
```
Go to the `models` directory inside the container and download the model files to the corresponding subdirectories using the `wget` command. For example:
- ```
+ ```shell
wget https://huggingface.co/prompthero/openjourney/resolve/main/mdjrny-v4.safetensors
```
@@ -68,7 +68,7 @@ You can prepare your model, workflow, and custom node files manually or using Do
If you have the model on your computer, copy its files to the container using the following command:
- ```
+ ```shell
docker cp
`/sp/inputs/input-0002`<br/>etc. | Possible data locations<br/>(AI model, dataset, training scripts, etc.) | Read-only |
+| `/sp/output` | Output directory for results | Read and write |
+| `/sp/certs` | Contains the order certificate, private key, and `workloadInfo` | Read-only |
+
+Your scripts must find the data in `/sp/inputs` and write the results to `/sp/output`.
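The file layout above can be sketched as a small POSIX shell routine. This is a minimal illustration under stated assumptions, not part of the Super Protocol tooling: the function name and the `result.txt` marker are hypothetical, and inside an order you would pass `/sp/inputs` and `/sp/output`:

```shell
# process_order: scan every input-* directory and write a marker result file.
# $1 = inputs directory, $2 = output directory (hypothetical helper).
process_order() {
  in_dir="$1"; out_dir="$2"
  for d in "$in_dir"/input-*/; do
    [ -d "$d" ] || continue          # skip if the glob matched nothing
    echo "found input: $d"           # a real script would inspect the contents here
  done
  echo "ok" > "$out_dir/result.txt"  # anything written here becomes the order result
}

# Inside the TEE: process_order /sp/inputs /sp/output
```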
+
+### 2. Place an order
+
+2.1. Initiate a dialog to construct and place an order:
+
+```shell
+./run-unsloth.sh
+```
+
+2.2. `Enter TEE offer id (number)`: Enter a compute offer ID. This determines the available compute resources and cost of your order. You can find the full list of available compute offers on the [Marketplace](https://marketplace.superprotocol.com/).
+
+2.3. `Choose run mode`: `1) file`.
+
+2.4. `Select the model option`:
+
+- `1) Medgemma 27b (offer 15900)`: Select this option if you need an untuned MedGemma 27B.
+- `2) your model`: Select this option to use another model. Then, when prompted for `Model input`, enter one of the following:
+  - a path to the model's resource JSON file, if it was already uploaded with SPCTL
+  - a model offer ID, if the model exists on the Marketplace
+  - a path to a local directory with the model, to upload it using SPCTL.
+- `3) no model`: No model will be used.
+
+2.5. `Enter path to a .py/.ipynb file OR a directory`: Enter the path to your training script (file or directory). If you enter a directory, you will be prompted to select the file to run (the entrypoint). Note that resource files cannot be reused in this step; the script must be uploaded anew every time.
+
+2.6. `Provide your dataset as a resource JSON path, numeric offer id, or folder path`: As with the model, enter one of the following:
+
+- a path to the dataset's resource JSON file, if it was already uploaded with SPCTL
+- a dataset offer ID, if the dataset exists on the Marketplace
+- a path to a local directory with the dataset, to upload it using SPCTL.
+
+2.7. `Upload SPCTL config file as a resource?`: Answer `N` unless you need to use SPCTL from within the TEE during order execution. If you answer `Y`, your script should run a `curl` command to download SPCTL and look for the uploaded `config.json` in the `/sp/inputs/` subdirectories.
+
+2.8. Wait for the order to be created and find the order ID in the output, for example:
+
+```shell
+Unsloth order id: 259126
+Done.
+```
+
+### 3. Check the order result
+
+3.1. The order will take some time to complete. Check the order status:
+
+```shell
+./spctl orders get
**Preparation**
-Alice builds a solution—a Docker image containing her script (1). She uploads the solution using SPCTL (2) and grants Bob access for verification (3).
+Alice builds a solution—a Docker image containing her script ([1](/cli/guides/collaboration#alice-1-build-a-solution)). She uploads the solution using SPCTL ([2](/cli/guides/collaboration#alice-2-upload-the-solution)) and grants Bob access for verification ([3](/cli/guides/collaboration#alice-3-send-the-solution-to-bob)).
-Bob (or an independent auditor) downloads the solution (4) and verifies that it is safe to process his data (5).
+Bob (or an independent auditor) downloads the solution ([4](/cli/guides/collaboration#bob-4-download-the-solution)) and verifies that it is safe to process his data ([5](/cli/guides/collaboration#bob-5-verify-the-solution)).
-Bob uploads his dataset to remote storage using SPCTL (6). The dataset is automatically encrypted during upload, and only Bob holds the key.
+Bob uploads his dataset to remote storage using SPCTL ([6](/cli/guides/collaboration#bob-6-upload-the-dataset)). The dataset is automatically encrypted during upload, and only Bob holds the key.
-Bob creates an offer on the Marketplace (7). The offer requires Bob's manual approval for use. He shares the offer's IDs with Alice.
+Bob creates an offer on the Marketplace ([7](/cli/guides/collaboration#bob-7-create-an-offer)). The offer requires Bob's manual approval for use. He shares the offer's IDs with Alice.
**Execution**
-Alice places an order on Super Protocol using her solution and Bob's offer ID (8). The order remains **Blocked** by the data suborder.
+Alice places an order on Super Protocol using her solution and Bob's offer ID ([8](/cli/guides/collaboration#alice-8-place-an-order)). The order remains **Blocked** by the data suborder.
-Bob manually completes the data suborder (9). The command includes the verified solution hash. Completion succeeds only if this hash matches the actual solution hash, meaning the solution was not altered.
+Bob manually approves the usage of his dataset for the image with a specific hash ([9](/cli/guides/collaboration#bob-9-complete-the-data-suborder)). If this hash matches the actual solution hash, the CVM begins to process the order. If the hashes do not match, the order will be terminated with an error.
-Once the computation finishes, Alice can download the result (10). All the data within the TEE (solution, dataset, order results, etc.) is automatically deleted.
+Once the computation finishes, Alice can download the result ([10](/cli/guides/collaboration#alice-10-download-the-order-results)). All the data within the TEE (solution, dataset, order results, etc.) is automatically deleted.
-Both Alice and Bob can retrieve the order report (11) that confirms the authenticity of the entire trusted setup.
+Both Alice and Bob can retrieve the order report ([11](/cli/guides/collaboration#alice-and-bob-11-get-the-order-report)) that confirms the authenticity of the entire trusted setup.
## Prerequisites
@@ -93,11 +93,11 @@ Both Alice and Bob can retrieve the order report (11) that confirms the authenti
1.1. Write a Dockerfile that creates an image with your code. Keep in mind the special file structure inside the TEE:
-| **Location** | **Purpose** | **Access** |
-| :- | :- | :- |
-| `/sp/inputs/input-0001/`<br/>`/sp/inputs/input-0002/`<br/>etc. | Possible data locations | Read-only |
-| `/sp/output/` | Output directory for results | Write; read own files |
-| `/sp/certs/` | Contains the order certificate | Read-only |
+| **Location** | **Purpose** | **Access** |
+| :- | :- | :- |
+| `/sp/inputs/input-0001/`<br/>`/sp/inputs/input-0002/`<br/>etc. | Possible data locations | Read-only |
+| `/sp/output/` | Output directory for results | Write; read own files |
+| `/sp/certs/` | Contains the order certificate, private key, and workloadInfo | Read-only |
Your scripts must find the data in `/sp/inputs/` and write the results to `/sp/output/`.
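For the `/sp/certs/` row, here is a hedged sketch of how a script might inspect the mounted order certificate with stock `openssl`. The file name under `/sp/certs/` is an assumption, not something this guide specifies:

```shell
# show_cert: print the subject and validity window of a PEM certificate.
show_cert() {
  openssl x509 -in "$1" -noout -subject -dates
}

# Inside the TEE (the file name is an assumption):
# show_cert /sp/certs/certificate.crt
```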
@@ -112,7 +112,7 @@ You can find several Dockerfile examples in the [Super-Protocol/solutions](https
1.2. Build an image:
```shell
-docker build -t
-Alice uploads her model ([6](/cli/guides/fine-tune#alice-6-upload-the-model)) and Bob uploads his dataset ([7](/cli/guides/fine-tune#bob-7-upload-the-dataset)) to remote storage using SPCTL. The dataset is automatically encrypted during upload, and only Bob holds the key.
+Alice uploads her model ([6](/cli/guides/fine-tune#alice-6-upload-the-model)) and Bob uploads his dataset ([7](/cli/guides/fine-tune#bob-7-upload-the-dataset)) to remote storage using SPCTL. Files are automatically encrypted during upload, and only the uploader holds the key.
Bob creates an offer on the Marketplace ([8](/cli/guides/fine-tune#bob-8-create-an-offer)). The offer requires Bob's manual approval for use. He shares the offer's IDs with Alice.
@@ -81,10 +81,8 @@ sequenceDiagram
note over Alice,Blockchain: Execute
Alice ->>+ Super Protocol / TEE: 9. Place an order
- Super Protocol / TEE ->> Storage: Download the solution
- Super Protocol / TEE ->> Storage: Download the model
- Bob ->> Super Protocol / TEE: 10. Complete the suborder
- Super Protocol / TEE ->> Storage: Download the dataset
+ Bob ->> Super Protocol / TEE: 10. Approve the usage of the dataset
+ Super Protocol / TEE ->> Storage: Download the solution, model, and dataset
Super Protocol / TEE ->> Blockchain: Publish the order report
Super Protocol / TEE ->> Super Protocol / TEE: Process the order
Super Protocol / TEE ->>- Storage: Upload the order results
@@ -94,13 +92,11 @@ sequenceDiagram
```
-Alice places an order on Super Protocol ([9](/cli/guides/fine-tune#alice-9-place-an-order)), adding the solution, her model, and Bob's offer. The order does not proceed automatically and remains `Blocked` by the data suborder with Bob's dataset.
+Alice places an order on Super Protocol ([9](/cli/guides/fine-tune#alice-9-place-an-order)), adding the solution, her model, and Bob's offer. The order does not proceed automatically and remains `Blocked`.
-Bob manually completes the respective data suborder ([10](/cli/guides/fine-tune#bob-10-complete-the-data-suborder)). The command he uses includes the solution hash. The completion will be successful only if this hash matches the actual solution hash.
+Bob manually approves the usage of his dataset for the image with a specific hash ([10](/cli/guides/fine-tune#bob-10-complete-the-data-suborder)). If this hash matches the actual solution hash, the CVM begins to process the order. If the hashes do not match, the order will be terminated with an error.
-If the suborder is completed successfully, the execution of the main order proceeds.
-
-When the main order is complete, Alice downloads the result ([11](/cli/guides/fine-tune#alice-11-download-the-order-results)). All the data within the TEE (solution, AI model, dataset, order results, etc.) is automatically deleted.
+When the order is complete, Alice downloads the result ([11](/cli/guides/fine-tune#alice-11-download-the-order-results)). All the data within the TEE (solution, AI model, dataset, order results, etc.) is automatically deleted.
Both Alice and Bob can retrieve the order report ([12](/cli/guides/fine-tune#alice-and-bob-12-get-the-order-report)) that confirms the authenticity of the entire trusted setup.
@@ -128,11 +124,11 @@ Both Alice and Bob can retrieve the order report ([12](/cli/guides/fine-tune#ali
Keep in mind the special file structure inside the TEE:
-| **Location** | **Purpose** | **Access** |
-| :- | :- | :- |
-| `/sp/inputs/input-0001`<br/>`/sp/inputs/input-0002`<br/>etc. | Possible data locations<br/>(AI model, dataset, training scripts, etc.) | Read-only |
-| `/sp/output` | Output directory for results | Write; read own files |
-| `/sp/certs` | Contains the order certificate | Read-only |
+| **Location** | **Purpose** | **Access** |
+| :- | :- | :- |
+| `/sp/inputs/input-0001`<br/>`/sp/inputs/input-0002`<br/>etc. | Possible data locations<br/>(AI model, dataset, training scripts, etc.) | Read-only |
+| `/sp/output` | Output directory for results | Read and write |
+| `/sp/certs` | Contains the order certificate, private key, and workloadInfo | Read-only |
Your solution must find the data in `/sp/inputs` and write the results to `/sp/output`.
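Because any `input-XXXX` directory may hold any of the resources (AI model, dataset, training scripts), a solution typically has to probe them to find what it needs. A minimal sketch with a hypothetical helper; the marker file name is illustrative:

```shell
# find_input: print the first input-* directory under $1 that contains file $2.
find_input() {
  base="$1"; name="$2"
  for d in "$base"/input-*/; do
    if [ -e "$d$name" ]; then   # $d already ends with a slash
      printf '%s\n' "$d"
      return 0
    fi
  done
  return 1                      # nothing matched
}

# Inside the TEE: model_dir="$(find_input /sp/inputs model.safetensors)"
```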
@@ -149,7 +145,7 @@ You can find several Dockerfile examples in the [Super-Protocol/solutions](https
1.2. Build an image:
```shell
-docker build -t
`/sp/inputs/input-0002`<br/>etc. | Possible data locations | Read-only |
-| `/sp/output` | Output directory for results | Write; read own files |
-| `/sp/certs` | Contains the order certificate | Read-only |
+| **Location** | **Purpose** | **Access** |
+| :- | :- | :- |
+| `/sp/inputs/input-0001`<br/>`/sp/inputs/input-0002`<br/>etc. | Possible data locations | Read-only |
+| `/sp/output` | Output directory for results | Write; read own files |
+| `/sp/certs` | Contains the order certificate, private key, and workloadInfo | Read-only |
So, your solution must find the data in `/sp/inputs` and write the results to `/sp/output`.
@@ -156,4 +156,8 @@ For example:
```shell
./spctl orders download-result 256587
-```
\ No newline at end of file
+```
+
+## Support
+
+If you have any issues or questions, contact Super Protocol on [Discord](https://discord.gg/superprotocol) or via the [contact form](https://superprotocol.zendesk.com/hc/en-us/requests/new).
\ No newline at end of file
diff --git a/docs/cli/index.md b/docs/cli/index.md
index 39e44e41..c2d013d9 100644
--- a/docs/cli/index.md
+++ b/docs/cli/index.md
@@ -41,7 +41,7 @@ import TabItem from '@theme/TabItem';
You can also download and install SPCTL manually from the Super Protocol [GitHub repository](https://github.com/Super-Protocol/ctl).
-## Set Up
+## Set up
You can set up SPCTL using the `./spctl setup` command or by manually creating a configuration file.
@@ -103,7 +103,6 @@ You can set up SPCTL using the `./spctl setup` command or by manually creating a
}
}
}
-
```
3. Do not change the preconfigured values and set values to the following keys:
diff --git a/docs/data-for-ai/Overview/about.md b/docs/data-for-ai/Overview/about.md
index 2ee25347..0471cf17 100644
--- a/docs/data-for-ai/Overview/about.md
+++ b/docs/data-for-ai/Overview/about.md
@@ -5,7 +5,7 @@ slug: "/overview/about"
sidebar_position: 1
---
-The Super Protocol Data-for-AI Campaign is more than a contest—it’s a collaborative initiative to rethink how AI systems are trained in high-stakes, regulated domains. By sourcing high-quality, publicly available regulatory and clinical data, we aim to make AI development transparent, decentralized, and verifiable from the ground up.
+The Super Protocol Data-for-AI Campaign is more than a contest—it's a collaborative initiative to rethink how AI systems are trained in high-stakes, regulated domains. By sourcing high-quality, publicly available regulatory and clinical data, we aim to make AI development transparent, decentralized, and verifiable from the ground up.
This campaign is powered by Super Protocol, a decentralized cloud platform designed for privacy-preserving AI computing. It combines confidential execution, on-chain traceability, and cryptographic proof of origin, creating a secure foundation for open collaboration between AI systems and data contributors.
@@ -13,13 +13,13 @@ This campaign is powered by Super Protocol, a decentralized cloud platform desig
AI companies in regulated industries like healthcare face a difficult trade-off: build costly internal systems to collect and validate data, or rely on opaque, third-party pipelines with unknown provenance. Both come with serious limitations—compliance overhead, audit risks, and a lack of trust in the data itself.
-This campaign explores a third path: a verifiable, decentralized pipeline for AI training. Every submitted data link is publicly auditable, cryptographically signed by the contributor, and logged to a smart contract on the opBNB network. It’s not just about finding data—it’s about proving where it came from and how it was used.
+This campaign explores a third path: a verifiable, decentralized pipeline for AI training. Every submitted data link is publicly auditable, cryptographically signed by the contributor, and logged to a smart contract on the opBNB network. It's not just about finding data—it's about proving where it came from and how it was used.
-We’re working with Tytonix, whose medical AI systems will be trained directly on this dataset. Their tools help medical device companies navigate regulatory approvals faster and at lower cost. Your contributions fuel a real-world application with immediate value.
+We're working with Tytonix, whose medical AI systems will be trained directly on this dataset. Their tools help medical device companies navigate regulatory approvals faster and at lower cost. Your contributions fuel a real-world application with immediate value.
## Why verifiability is crucial
-In healthcare AI, data integrity isn’t optional. It must be provable—both to regulators and the companies relying on it.
+In healthcare AI, data integrity isn't optional. It must be provable—both to regulators and the companies relying on it.
Super Protocol ensures every submission has a traceable origin, a clear audit trail, and immutable on-chain attribution. This builds a usable bridge between community-sourced input and production-grade AI.
@@ -27,18 +27,18 @@ Super Protocol ensures every submission has a traceable origin, a clear audit tr
- On-chain record → compliance-ready data
- Decentralized sourcing → scalable, cost-effective pipelines
-What’s submitted here isn’t just checked off—it’s accounted for.
+What's submitted here isn't just checked off—it's accounted for.
## Just the beginning
Super Protocol already supports confidential AI training: models run in secure environments where data remains private, even from developers. Deployments are signed, logged, and verifiable. That infrastructure is live.
-What’s missing—until now—is granular, user-attributed input. The ability to train AI on individual contributions, where each data point is trackable, auditable, and tied to its source without sacrificing privacy.
+What's missing—until now—is granular, user-attributed input. The ability to train AI on individual contributions, where each data point is trackable, auditable, and tied to its source without sacrificing privacy.
-This campaign is the first step. In future phases, contributors will be able to control how their data is used, know when it contributes to training, and opt in or out of specific models. It’s the beginning of a long-term shift: from closed, anonymous datasets to a transparent, accountable, and privacy-respecting AI ecosystem.
+This campaign is the first step. In future phases, contributors will be able to control how their data is used, know when it contributes to training, and opt in or out of specific models. It's the beginning of a long-term shift: from closed, anonymous datasets to a transparent, accountable, and privacy-respecting AI ecosystem.
## Where you come in
Contribute real-world data. Climb the leaderboard. Earn your share of $30,000 in USDT and Super Stakes (convertible into Super Tokens at the token generation event).
-This isn’t just a data campaign. It’s the foundation for an AI system that doesn’t require trust, because everything is verifiable, transparent, and owned.
\ No newline at end of file
+This isn't just a data campaign. It's the foundation for an AI system that doesn't require trust, because everything is verifiable, transparent, and owned.
\ No newline at end of file
diff --git a/docs/data-for-ai/Overview/dates.md b/docs/data-for-ai/Overview/dates.md
index 3c7567b5..2f4c1d45 100644
--- a/docs/data-for-ai/Overview/dates.md
+++ b/docs/data-for-ai/Overview/dates.md
@@ -14,4 +14,4 @@ June 9 – June 23, 12:00 PM UTC
→ All activity counts toward leaderboard ranking and final rewards.
**Daily Reset:**
-Every day at 12:00 PM UTC, submission limits are reset, and the points’ value increases by 4%.
\ No newline at end of file
+Every day at 12:00 PM UTC, submission limits are reset, and the value of points increases by 4%.
\ No newline at end of file
diff --git a/docs/data-for-ai/Overview/support.md b/docs/data-for-ai/Overview/support.md
index dcd74251..13dab134 100644
--- a/docs/data-for-ai/Overview/support.md
+++ b/docs/data-for-ai/Overview/support.md
@@ -5,7 +5,7 @@ slug: "/overview/support"
sidebar_position: 6
---
-If you have questions, encounter issues, or need assistance during the campaign, we’re here to help.
+If you have questions, encounter issues, or need assistance during the campaign, we're here to help.
## Support ticket
@@ -15,4 +15,4 @@ For official support via email, please [submit a request](https://superprotocol.
If you prefer real-time communication, you can also get help through our [Discord server](https://discord.com/invite/superprotocol). The channel is **#data-for-ai**.
-We’re committed to supporting you throughout the campaign.
\ No newline at end of file
+We're committed to supporting you throughout the campaign.
\ No newline at end of file
diff --git a/docs/data-for-ai/Rules/referrals.md b/docs/data-for-ai/Rules/referrals.md
index a2187631..07958729 100644
--- a/docs/data-for-ai/Rules/referrals.md
+++ b/docs/data-for-ai/Rules/referrals.md
@@ -9,9 +9,9 @@ The referral system allows you to earn additional points by inviting others to j
## How it works
-- After registration, you’ll receive a unique referral link.
+- After registration, you'll receive a unique referral link.
- When someone signs up using your link—a *referee*—and starts submitting valid data links, you earn referral points.
-- There’s no limit to how many people you can refer.
+- There's no limit to how many people you can refer.
- Each participant can only be referred once.
- If someone signs up without your link or uses another link first, they cannot be reassigned to you.
@@ -27,13 +27,13 @@ Day 3: ~37.9 points
...
Day 14 (Final Day): ~58.8 points per link
-The longer the campaign runs, the more valuable each referee’s activity becomes.
+The longer the campaign runs, the more valuable each referee's activity becomes.
Note: While later submissions earn more per link, inviting people early gives them time to contribute more overall, resulting in higher total rewards for you.
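The 4% daily compounding can be sanity-checked with `awk`. Assuming a day-1 value of roughly 35 points (the base value is not stated in this excerpt), day 3 lands near the ~37.9 quoted above:

```shell
# Approximate value of a link on day N, assuming ~35 points on day 1
# and 4% compound growth per day (the day-1 base is an assumption).
day=3
awk -v n="$day" 'BEGIN { printf "%.1f\n", 35 * 1.04 ^ (n - 1) }'   # prints 37.9
```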
## Referral penalty
-If your referee submits an invalid data link, you’ll lose the referral reward for one previously earned link from that referee. This only affects the bonus points earned from that specific referee and does not impact your own points or rewards from other referees.
+If your referee submits an invalid data link, you'll lose the referral reward for one previously earned link from that referee. This only affects the bonus points earned from that specific referee and does not impact your own points or rewards from other referees.
Referral points cannot go negative, and the same rule applies individually to each referee and each invalid link.
diff --git a/docs/data-for-ai/Rules/rewards.md b/docs/data-for-ai/Rules/rewards.md
index 7864d0a6..7bb9f9c4 100644
--- a/docs/data-for-ai/Rules/rewards.md
+++ b/docs/data-for-ai/Rules/rewards.md
@@ -7,7 +7,7 @@ sidebar_position: 3
## Reward recipients
-Only the top 1,000 participants will get prizes, and the rewards will depend on the rank. The rank is determined by the user’s total points: own points plus referral points.
+Only the top 1,000 participants will get prizes, and the rewards will depend on the rank. The rank is determined by the user's total points: own points plus referral points.
| **Rank** | **USDT** | **Super Stakes** |
| :- | :- | :- |
@@ -41,7 +41,7 @@ The top 50 participants might be subject to KYC checks to verify identity and pr
## Leaderboard
-To check winners, participants, referrals, rewards, and more, [read the campaign’s smart contract](https://opbnb.bscscan.com/address/0x8c77ef6ed2ee514d1754fbfc2710d70e9d6ba871#readContract) on the opBNB network.
+To check winners, participants, referrals, rewards, and more, [read the campaign's smart contract](https://opbnb.bscscan.com/address/0x8c77ef6ed2ee514d1754fbfc2710d70e9d6ba871#readContract) on the opBNB network.
### Check a participant
@@ -65,7 +65,7 @@ Fields in the example in order of appearance:
| `0` | Number of links validated today. Always `0` because the campaign has ended. |
| `true` | Flag indicating if the address is registered as a campaign participant. |
| `false` | Flag indicating if the address has claimed the reward. |
-| `0x8da2c62C23aEBeb1Aa8b5eE96d341d26a2edec6eB` | The referrer’s address. |
+| `0x8da2c62C23aEBeb1Aa8b5eE96d341d26a2edec6eB` | The referrer's address. |
| `68` | Number of referees. |
| `2640` | Points the participant earned for their referrer. |
| `67738` | Points the participant earned from their referees. |
@@ -73,7 +73,7 @@ Fields in the example in order of appearance:
| `0` | Total number of duplicate links submitted. |
| `237` | Total number of valid links submitted. |
| `152` | Total number of invalid links submitted. |
-| `0xbF4aC1b6efd5C21e5Ce93f34c8F43C8a9bCACA3F3` | The participant’s address. |
+| `0xbF4aC1b6efd5C21e5Ce93f34c8F43C8a9bCACA3F3` | The participant's address. |
| `813` | Current rank in the leaderboard. |
| `97280` | Total points earned. |
| `10000000000000000000` | USDT reward, in denominations. 10<sup>18</sup> denominations = 1 USDT. |
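As a quick check of the conversion in the last row, the example reward above works out to 10 USDT:

```shell
# Convert an on-chain reward from denominations to USDT (10^18 denominations = 1 USDT).
awk 'BEGIN { printf "%.2f USDT\n", 10000000000000000000 / 1e18 }'   # prints 10.00 USDT
```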
diff --git a/docs/developers/deployment_guides/tunnels/repo.md b/docs/developers/deployment_guides/tunnels/repo.md
index 101b3f1b..0f63e806 100644
--- a/docs/developers/deployment_guides/tunnels/repo.md
+++ b/docs/developers/deployment_guides/tunnels/repo.md
@@ -15,7 +15,7 @@ These Github Actions are automating the commands outlined in the [previous step]
1. Go to [GitHub](https://github.com) and log in to your account.
-2. Click the [New Repository](https://github.com/new) button in the top-right. Enter `superprotocol-test-app` as repository name. You’ll have an option there to initialize the repository with a README file. Add `Node` as `.gitignore` template.
+2. Click the [New Repository](https://github.com/new) button in the top-right. Enter `superprotocol-test-app` as repository name. You'll have an option there to initialize the repository with a README file. Add `Node` as `.gitignore` template.
3. Click the “Create repository” button.
diff --git a/docs/developers/marketplace_gui/walkthrough.md b/docs/developers/marketplace_gui/walkthrough.md
index 60febcd1..4a25bcd8 100644
--- a/docs/developers/marketplace_gui/walkthrough.md
+++ b/docs/developers/marketplace_gui/walkthrough.md
@@ -7,9 +7,9 @@ sidebar_position: 2
## 1. Introduction
-To better understand how Super Protocol works, let’s take a step-by-step walkthrough through the Marketplace GUI.
+To better understand how Super Protocol works, let's take a step-by-step walkthrough of the Marketplace GUI.
-As an example we’ll deploy the [Super Chat](/developers/offers/superchat) app with the tunnels. Please note that for this walkthrough we'll be using [Tunnels Launcher](/developers/offers/launcher), which cuts a few corners in order to streamline the experience. For the full tunnels deployment capabilities please refer to [this guide](/developers/deployment_guides/tunnels).
+As an example, we'll deploy the [Super Chat](/developers/offers/superchat) app with tunnels. Please note that for this walkthrough we'll be using [Tunnels Launcher](/developers/offers/launcher), which cuts a few corners in order to streamline the experience. For the full tunnels deployment capabilities, please refer to [this guide](/developers/deployment_guides/tunnels).
You might want to read up on the fundamental Super Protocol concepts - such as [offers](/fundamentals/offers), [orders](/fundamentals/orders), [requirements and configurations](/fundamentals/slots), and [tunnels](/fundamentals/tunnels) - in advance, or - just dive into it and figure it out as you go. Your choice.
@@ -142,7 +142,7 @@ To create this order via CLI, click the **Copy CLI workflow** button. It will ge
:::info Step 6. Set up a passphrase.
-Either input your own passphrase or generate a new one. Then press the `Place Order` button. Save your passphrase! You won’t be able to access your order results without it. For testing it's easier to have a single passphrase for all orders.
+Either input your own passphrase or generate a new one. Then press the `Place Order` button. Save your passphrase! You won't be able to access your order results without it. For testing it's easier to have a single passphrase for all orders.
:::
diff --git a/docs/fundamentals/certification.md b/docs/fundamentals/certification.md
index 611d5375..a943c6b5 100644
--- a/docs/fundamentals/certification.md
+++ b/docs/fundamentals/certification.md
@@ -25,7 +25,7 @@ The SubRoot CAs, in turn, issue and sign certificates for
**Guide** |
Request to stop execution of the order on the consumer's side. The order status is changed to "canceling", the provider saves the end result of the order and moves the order to "canceled" status. If the offer is of cancelable type, smart contract immediately refunds the remaining deposit based on the proportion of time running or depositSpent. If the offer is of non-cancellable type, the provider sets a fee for their work after the order is complete.
This method works only when all sub-orders are stopped.
| | | | **refillOrder(guid orderId, uint256 orderAmount)** | order.consumer | blockchain |
-| Replenishment of the deposit by the customer. Normally required when renewing a rental. It can also be used to obtain additional results if that is supported by the provider’s offer. | | |
+| Replenishment of the deposit by the customer. Normally required when renewing a rental. It can also be used to obtain additional results if that is supported by the provider's offer. | | |
| **withdrawProfit(guid orderId) public** | order.provider.tokenReceiver | SDK + blockchain |
| Order profit withdrawal by the provider. Available after the order is executed. In this case, the profit is transferred to deferred payments for the number of days specified in the protocol settings (_profitWithdrawDelayDays_). | | |
| **withdrawChange(guid orderId) public** | order.consumer | SDK + blockchain |
diff --git a/docs/whitepaper/high-level-description.md b/docs/whitepaper/high-level-description.md
index 5a660198..6f7a05fe 100644
--- a/docs/whitepaper/high-level-description.md
+++ b/docs/whitepaper/high-level-description.md
@@ -9,7 +9,7 @@ sidebar_position: 5
-From a bird’s eye view, Super Protocol involves the interactions shown in the above diagram. The interactions include the following entities:
+From a bird's eye view, Super Protocol involves the interactions shown in the above diagram. The interactions include the following entities:
- **Provider Offers.** In a form of a provider offer, the provider offers their resources or values in exchange for a certain reward. The offer can fall into one of three categories:
- **Input.** Offers of this type are used for cooperative processing within a trusted execution environment (TEE). These can be data offers or solution offers.
diff --git a/docs/whitepaper/target-audience.md b/docs/whitepaper/target-audience.md
index 3be24d01..9e2b8787 100644
--- a/docs/whitepaper/target-audience.md
+++ b/docs/whitepaper/target-audience.md
@@ -24,7 +24,7 @@ Super Protocol is very unique in a way that it allows equipment owners to engage
Data in the modern world is being created everywhere, while new, more advanced processing algorithms allow for an ever larger number of ways to use this data. However, there are quite a few challenges here. Obviously, large volumes of data are not originally meant to be public, which means they require anonymization or confidential computing.
-High-quality anonymization is not always possible without losing data utility. Additionally, many of today’s analytical tools enable successful [data de-anonymization](https://www.cs.utexas.edu/~shmat/shmat_oak09.pdf) under many different circumstances: “To demonstrate its effectiveness on real-world networks, we show that a third of the users who can be verified to have accounts on both Twitter, a popular microblogging service, and Flickr, an online photo-sharing site, can be re-identified in the anonymous Twitter graph with only a 12% error rate.”
+High-quality anonymization is not always possible without losing data utility. Additionally, many of today's analytical tools enable successful [data de-anonymization](https://www.cs.utexas.edu/~shmat/shmat_oak09.pdf) under many different circumstances: “To demonstrate its effectiveness on real-world networks, we show that a third of the users who can be verified to have accounts on both Twitter, a popular microblogging service, and Flickr, an online photo-sharing site, can be re-identified in the anonymous Twitter graph with only a 12% error rate.”
Almost any data owner would benefit from monetizing it—as long as it does not harm their business as a whole. This is borne out by the widespread development of technologies for analyzing big data.
diff --git a/docusaurus.config.js b/docusaurus.config.js
index aa49eaab..c3a86fd7 100644
--- a/docusaurus.config.js
+++ b/docusaurus.config.js
@@ -195,13 +195,13 @@ const config = {
position: "right",
label: "Developers",
},*/
- {
+ /*{
type: "doc",
docId: "index",
position: "right",
label: "Whitepaper",
docsPluginId: "whitepaper",
- },
+ },*/
],
},
prism: {
@@ -230,7 +230,7 @@ const config = {
"@easyops-cn/docusaurus-search-local",
({
hashed: true,
- docsRouteBasePath: [/*"developers", */"marketplace", "whitepaper", "fundamentals", "cli"],
+ docsRouteBasePath: [/*"developers", */"marketplace", /*"whitepaper", */"fundamentals", "cli"],
language: ["en"],
highlightSearchTermsOnTargetPage: true,
explicitSearchResultPath: true,