
How it works

Khawla edited this page Mar 12, 2025 · 1 revision

At the heart of Lumina is a decentralized data collection system: a network of independent Feeders responsible for sourcing data and feeding it into the Lasernet chain.

A feeder consists of the following components as seen in the figure below:

  • Scraper
  • Collector
  • Processor

The scraper collects trade data from various CEXs and DEXs. The collector and processor aggregate the data through a two-step process to produce a scalar value associated with an asset, which is subsequently published on-chain. In most cases, this value represents the asset's price in USD.
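
As a rough sketch of this data flow, the following self-contained Go program wires a scraper goroutine, a collector loop, and a trivial averaging step together over a channel. The `Trade` type, the `run` function, and the plain average are simplified illustrations, not the real `models` package or aggregation methods:

```go
package main

import (
	"fmt"
	"sync"
)

// Trade is a simplified stand-in for models.Trade.
type Trade struct {
	Pair  string
	Price float64
}

// run wires the three components together: a scraper goroutine writes into
// tradesChannel, a collector loop drains it, and a processor step reduces
// the collected trades to one scalar per pair (a plain average here).
func run() map[string]float64 {
	tradesChannel := make(chan Trade)
	var wg sync.WaitGroup

	wg.Add(1)
	go func() { // scraper: would normally stream trades from an exchange websocket
		defer wg.Done()
		tradesChannel <- Trade{"BTC-USDT", 65000}
		tradesChannel <- Trade{"BTC-USDT", 65010}
	}()
	go func() { wg.Wait(); close(tradesChannel) }()

	// collector: gather trades per market
	sums := map[string]float64{}
	counts := map[string]int{}
	for t := range tradesChannel {
		sums[t.Pair] += t.Price
		counts[t.Pair]++
	}

	// processor: reduce each market's trades to a scalar value
	out := make(map[string]float64)
	for pair, s := range sums {
		out[pair] = s / float64(counts[pair])
	}
	return out
}

func main() {
	fmt.Println(run()["BTC-USDT"])
}
```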

Scrapers

Each scraper is implemented in a dedicated file in the folder /pkg/scrapers with the main function signature func NewExchangeScraper(pairs []models.ExchangePair, tradesChannel chan models.Trade, wg *sync.WaitGroup); for decentralized exchanges, the signature takes pools instead of pairs.
Its role is to continuously fetch trade data from a given exchange and send it to the channel tradesChannel.
The expected input for a scraper is a set of pair tickers such as BTC-USDT. Tickers are always capitalized, with symbols separated by a hyphen. It is the scraper's job to format the pair ticker so that it can subscribe to the corresponding (websocket) stream.
For centralized exchanges, a JSON file in /config/symbolIdentification is needed that assigns a blockchain and address to each ticker symbol the scraper handles.
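
A minimal scraper skeleton following the documented signature might look as follows. The `ExchangePair` and `Trade` types, the `NewExampleScraper` name, and the Binance-style lowercase ticker formatting are simplified assumptions, not the real `models` package:

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// Simplified stand-ins for the models package (assumed shapes, not the real types).
type ExchangePair struct {
	QuoteToken string // e.g. "BTC"
	BaseToken  string // e.g. "USDT"
}

type Trade struct {
	Pair  string
	Price float64
}

// FormatTicker turns the canonical capitalized, hyphen-separated ticker into
// the form an exchange stream expects, e.g. "btcusdt" in Binance's style.
func FormatTicker(pair ExchangePair) string {
	return strings.ToLower(pair.QuoteToken + pair.BaseToken)
}

// NewExampleScraper mirrors the documented signature. A real scraper would
// subscribe to the exchange websocket; here it emits one dummy trade per pair.
func NewExampleScraper(pairs []ExchangePair, tradesChannel chan Trade, wg *sync.WaitGroup) {
	defer wg.Done()
	for _, p := range pairs {
		tradesChannel <- Trade{Pair: FormatTicker(p), Price: 0}
	}
}

func main() {
	pairs := []ExchangePair{{QuoteToken: "BTC", BaseToken: "USDT"}}
	tradesChannel := make(chan Trade, len(pairs))
	var wg sync.WaitGroup
	wg.Add(1)
	go NewExampleScraper(pairs, tradesChannel, &wg)
	wg.Wait()
	close(tradesChannel)
	for t := range tradesChannel {
		fmt.Println(t.Pair)
	}
}
```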

Collector

The collector gathers trades from all running scrapers. As soon as it receives a signal through a trigger channel, it bundles trades into atomic tradesblocks. An atomic tradesblock is a set of trades restricted to one market on one exchange, for instance BTC-USDT trades on the Binance exchange. These tradesblocks are sent to the Processor.
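
The bundling step can be sketched as follows, keying buffered trades by exchange and market; the `Trade` and `TradesBlock` types and the `bundle` helper are simplified stand-ins for the real ones:

```go
package main

import "fmt"

type Trade struct {
	Exchange string
	Pair     string
	Price    float64
	Volume   float64
}

// TradesBlock is the set of trades restricted to one market on one exchange.
type TradesBlock struct {
	Exchange string
	Pair     string
	Trades   []Trade
}

// bundle groups buffered trades into atomic tradesblocks keyed by exchange
// and market; in the real collector this runs on each trigger signal.
func bundle(trades []Trade) map[string]TradesBlock {
	blocks := make(map[string]TradesBlock)
	for _, t := range trades {
		key := t.Exchange + ":" + t.Pair
		b := blocks[key]
		b.Exchange, b.Pair = t.Exchange, t.Pair
		b.Trades = append(b.Trades, t)
		blocks[key] = b
	}
	return blocks
}

func main() {
	trades := []Trade{
		{"Binance", "BTC-USDT", 65000, 0.5},
		{"Binance", "BTC-USDT", 65010, 0.2},
		{"Kraken", "BTC-USD", 64990, 0.1},
	}
	blocks := bundle(trades)
	fmt.Println(len(blocks), len(blocks["Binance:BTC-USDT"].Trades))
}
```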

Processor

The processor is a two-step aggregation procedure similar to MapReduce:

  • Step 1: Aggregate trades from an atomic tradesblock. The type of aggregation can be selected through an environment variable (see Feeder/main). The only assumption on the aggregation implementation is that it returns a float64.
  • Step 2: Aggregate filter values obtained in step 1. The selection of aggregation method and assumptions are identical to Step 1. The obtained scalar value is sent to the Oracle feeder.
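
The two steps above can be illustrated with a volume-weighted average for Step 1 and a median for Step 2. Both are just examples of pluggable methods returning a float64, not necessarily the ones selected via the environment variable:

```go
package main

import (
	"fmt"
	"sort"
)

// step1 reduces the trades of one atomic tradesblock to a single float64;
// a volume-weighted average price is shown, but the method is pluggable.
func step1(prices, volumes []float64) float64 {
	var num, den float64
	for i, p := range prices {
		num += p * volumes[i]
		den += volumes[i]
	}
	return num / den
}

// step2 aggregates the per-block filter values from step 1 into the final
// scalar sent to the Oracle feeder; a median is shown.
func step2(values []float64) float64 {
	sort.Float64s(values)
	n := len(values)
	if n%2 == 1 {
		return values[n/2]
	}
	return (values[n/2-1] + values[n/2]) / 2
}

func main() {
	// Two atomic tradesblocks for the same asset on different exchanges.
	a := step1([]float64{100, 102}, []float64{1, 3})
	b := step1([]float64{101, 101}, []float64{2, 2})
	fmt.Println(step2([]float64{a, b}))
}
```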

Feeder contract

The feeder feeds a simple key/value oracle, publishing the value obtained from the Processor. It is worth mentioning that the feeder can also contain the trigger mechanism that initiates an iteration of the data flow.
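
Conceptually, the oracle the feeder publishes to behaves like a key/value store mapping an asset key to its latest scalar value. The `Oracle` type below is purely illustrative, since the real contract lives on Lasernet:

```go
package main

import "fmt"

// Oracle is an in-memory sketch of a key/value oracle (the real one is on-chain).
type Oracle struct {
	values map[string]float64
}

func NewOracle() *Oracle {
	return &Oracle{values: make(map[string]float64)}
}

// SetValue publishes the scalar produced by the processor under an asset key.
func (o *Oracle) SetValue(key string, v float64) {
	o.values[key] = v
}

// GetValue reads back the most recently published value for the key.
func (o *Oracle) GetValue(key string) float64 {
	return o.values[key]
}

func main() {
	o := NewOracle()
	o.SetValue("BTC/USD", 65005.0)
	fmt.Println(o.GetValue("BTC/USD"))
}
```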
