15 changes: 0 additions & 15 deletions app.py

This file was deleted.

118 changes: 118 additions & 0 deletions backend-analytics/main.py
@@ -0,0 +1,118 @@
from fastapi import FastAPI, Query, HTTPException
from fastapi.middleware.cors import CORSMiddleware
from pydantic import BaseModel
from typing import Dict, List, Literal
import requests
import numpy as np

app = FastAPI(title="Bitcoin Fee Stats API")

app.add_middleware(
    CORSMiddleware,
    allow_origins=[
        "http://localhost:3000",  # local frontend
    ],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

BLOCK_INTERVAL = 1000
URL_API = "https://bitcoincorefeerate.com/fees/2/economical/2"


# Response models
class BlockStat(BaseModel):
    height: int
    p25: float
    p75: float
    avgFee: float
    status: Literal["overpaid", "underpaid", "within_range"]


class SummaryItem(BaseModel):
    count: int
    percent: float


class StatsResponse(BaseModel):
    start_height: int
    end_height: int
    latest_block_height: int
    blocks: List[BlockStat]
    summary: Dict[str, SummaryItem]


# Helper functions
def classify_block(avg_fee: float, p25: float, p75: float) -> str:
    if avg_fee > p75:
        return "overpaid"
    if avg_fee < p25:
        return "underpaid"
    return "within_range"


# Routes
@app.get("/stats", response_model=StatsResponse)
def get_stats(start_height: int = Query(..., ge=0)):
    end_height = start_height + BLOCK_INTERVAL

    # Fetch raw ratio data from the upstream API
    try:
        r = requests.get(URL_API, timeout=30)
        r.raise_for_status()
        data = r.json()
    except (requests.RequestException, ValueError) as e:
        raise HTTPException(
            status_code=502,
            detail=f"Failed to fetch upstream API: {e}",
        )

    mempool_stats = data.get("mempool_health_statistics", [])

    # Group ratios per block height
    block_ratios = {}
    for entry in mempool_stats:
        block_ratios.setdefault(entry["block_height"], []).append(entry["ratio"])

    # Compute percentiles, average, and classification
    blocks = []
    counts = {"overpaid": 0, "underpaid": 0, "within_range": 0}

    for height in range(start_height, end_height):
        ratios = block_ratios.get(height, [])
        if not ratios:
            continue  # skip blocks with no data

        arr = np.array(ratios)
        p25 = float(np.percentile(arr, 25))
        p75 = float(np.percentile(arr, 75))
        avg = float(arr.mean())
        status = classify_block(avg, p25, p75)
        counts[status] += 1

        blocks.append(
            BlockStat(height=height, p25=p25, p75=p75, avgFee=avg, status=status)
        )

    total_blocks = len(blocks)

    def summarize(key: str) -> SummaryItem:
        count = counts[key]
        percent = round(count / total_blocks * 100, 2) if total_blocks else 0.0
        return SummaryItem(count=count, percent=percent)

    summary = {
        "overpaid": summarize("overpaid"),
        "underpaid": summarize("underpaid"),
        "within": summarize("within_range"),
    }

    latest_block_height = max((b.height for b in blocks), default=start_height)

    return {
        "start_height": start_height,
        "end_height": end_height,
        "latest_block_height": latest_block_height,
        "blocks": blocks,
        "summary": summary,
    }


if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
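The classification rule in `classify_block` can be sanity-checked on toy data. This standalone sketch mirrors (does not import) the endpoint's logic; the sample ratios are invented for illustration:

```python
import numpy as np

def classify_block(avg_fee, p25, p75):
    # Same rule as the endpoint: compare the block's average
    # against its own 25th/75th percentiles
    if avg_fee > p75:
        return "overpaid"
    if avg_fee < p25:
        return "underpaid"
    return "within_range"

# A single outlier fee drags the average above the 75th percentile
ratios = [1.0, 1.0, 1.0, 1.0, 100.0]
p25 = float(np.percentile(ratios, 25))
p75 = float(np.percentile(ratios, 75))
avg = float(np.mean(ratios))
status = classify_block(avg, p25, p75)  # "overpaid"
```

Because the mean is sensitive to outliers while the percentiles are not, a block with one extreme fee is flagged "overpaid" even though most fees are typical.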
8 changes: 8 additions & 0 deletions backend/.gitignore
@@ -0,0 +1,8 @@
# test files
test_mock.py
test_secure_connection.py
test_rpc_ports.py
test_getbestblockhash.py

rpc_config.ini

80 changes: 80 additions & 0 deletions backend/README.md
@@ -0,0 +1,80 @@
The Collector (The Brain): Runs in the background, continuously pulling real data, making a prediction, and archiving the result.

The API (The Server): Provides instantaneous responses by querying the pre-computed data, avoiding slow, repeated calls to the Bitcoin Core node.

app.py

The API Server (Flask)

Exposes endpoints for both real-time data (/fees) and high-performance historical analytics (/block-stats).
Also exposes an experimental `/fees/mempool` endpoint that uses the current block template
to return fee rate percentiles (default 25/50/75) straight from the mempool. Pass a custom
comma-separated list of percentiles via `?percentiles=10,50,90`.
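A minimal sketch of how such a `?percentiles=` query string might be parsed and applied, assuming the defaults above; function names are illustrative, not the actual app.py implementation:

```python
import numpy as np

def parse_percentiles(raw, default=(25, 50, 75)):
    """Parse a comma-separated percentile list, falling back to the default."""
    if not raw:
        return list(default)
    try:
        values = [float(p) for p in raw.split(",")]
    except ValueError:
        return list(default)
    # keep only valid percentile values (0..100)
    return [p for p in values if 0 <= p <= 100] or list(default)

def mempool_fee_percentiles(fee_rates, raw_percentiles=""):
    """Compute the requested percentiles over a list of mempool fee rates."""
    ps = parse_percentiles(raw_percentiles)
    arr = np.array(fee_rates, dtype=float)
    return {f"p{int(p)}": float(np.percentile(arr, p)) for p in ps}
```

Falling back to the default on malformed input keeps the endpoint forgiving of bad query strings instead of returning a 4xx.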

collector.py

The Background Worker

Continuously monitors the blockchain, runs the custom prediction logic (get_custom_fee_prediction_asap), and archives the prediction-vs-actual data. It now stores percentile information (p10, p25, p50, p75, p90) for each processed block so that accuracy summaries can be generated later.
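The per-block percentile record the collector archives might look like the following sketch (a hypothetical shape, not the collector's actual schema):

```python
import numpy as np

def percentile_record(block_height, fee_rates):
    """Build the p10/p25/p50/p75/p90 row archived for one processed block."""
    arr = np.array(fee_rates, dtype=float)
    return {
        "block_height": block_height,
        **{f"p{p}": float(np.percentile(arr, p)) for p in (10, 25, 50, 75, 90)},
    }
```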

json_rpc_request.py

The RPC Client

Handles all communication with the Bitcoin Core node. Includes exponential backoff and a built-in cache for reliable, efficient data retrieval.
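The retry behaviour can be sketched as a generic backoff wrapper; the real client's interface in json_rpc_request.py may differ, and the parameters here are illustrative:

```python
import time

def call_with_backoff(fn, retries=4, base_delay=0.5, sleep=time.sleep):
    """Retry fn() with exponential backoff: 0.5s, 1s, 2s, 4s, ..."""
    last_exc = None
    for attempt in range(retries):
        try:
            return fn()
        except Exception as exc:
            last_exc = exc
            sleep(base_delay * (2 ** attempt))
    raise last_exc
```

Injecting `sleep` as a parameter keeps the wrapper testable without real delays.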

database.py

SQLite persistence that stores every processed block’s actual fee range, prediction, and percentile information. Provides helpers to fetch recent history and compute a summary of overpayment/underpayment/within-range accuracy for a configurable window (default 1,000 blocks).
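An accuracy summary over a recent window can be computed with a single grouped query; this sketch assumes a `fee_analysis` table with a `status` column, which may not match database.py's real schema:

```python
import sqlite3

def accuracy_summary(conn, window=1000):
    """Tally overpaid/underpaid/within_range over the most recent rows."""
    rows = conn.execute(
        "SELECT status, COUNT(*) FROM ("
        "  SELECT status FROM fee_analysis"
        "  ORDER BY block_height DESC LIMIT ?"
        ") GROUP BY status",
        (window,),
    ).fetchall()
    counts = dict(rows)
    total = sum(counts.values())
    return {
        "total": total,
        **{k: counts.get(k, 0) for k in ("overpaid", "underpaid", "within_range")},
    }
```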

New API endpoints

- `GET /fees/mempool`: experimental percentile estimator based on `getblocktemplate`.
- `GET /analytics/summary`: returns historical performance metrics (`total`, `overpayment_val`, `within_val`, etc.) for the stored predictions. Accepts `?limit=` and `?forecaster=` query parameters.
- `GET /api/v1/fees/estimate`: unified estimator endpoint with `method=mempool|historical|hybrid`, `target`, and `percentile`. Includes warnings when average block coverage is low or when using mempool mode for multi-block targets.
- External fallbacks (optional, set `EXTERNAL_FALLBACK_ENABLED=1`):
- `GET /external/block-stats/{N}` → proxies `https://bitcoincorefeerate.com/block-stats/{N}/`
- `GET /external/fees-stats/{N}` → proxies `https://bitcoincorefeerate.com/fees-stats/{N}/`
- `GET /external/fees-sum/{N}` → proxies `https://bitcoincorefeerate.com/fees-sum/{N}/`
- `/analytics/summary` will return external data if internal DB has no records yet.
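A client would call the unified estimator with query parameters as described above; this stdlib sketch only builds the request URL (base URL and a running server are assumptions):

```python
from urllib.parse import urlencode

def build_estimate_url(base_url, method="hybrid", target=3, percentile=50):
    """Build the URL for GET /api/v1/fees/estimate with its query parameters."""
    params = urlencode({"method": method, "target": target, "percentile": percentile})
    return f"{base_url}/api/v1/fees/estimate?{params}"
```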

database.py

The Persistence Layer (SQLite)

Stores the historical performance of our model (prediction, min fee, max fee) over thousands of blocks for instant retrieval. Uses `INSERT OR IGNORE` so that re-processing a block never creates duplicate rows.
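The effect of `INSERT OR IGNORE` on a primary-keyed table can be demonstrated in isolation; the column names here are illustrative and may differ from database.py's real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fee_analysis ("
    " block_height INTEGER PRIMARY KEY,"
    " prediction REAL, min_fee REAL, max_fee REAL)"
)
conn.execute(
    "INSERT OR IGNORE INTO fee_analysis VALUES (?, ?, ?, ?)",
    (800000, 12.5, 1.0, 300.0),
)
# A second insert for the same height is silently ignored,
# so re-processing a block is idempotent
conn.execute(
    "INSERT OR IGNORE INTO fee_analysis VALUES (?, ?, ?, ?)",
    (800000, 99.0, 2.0, 400.0),
)
conn.commit()
```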

+---------------------------+ +----------------------------+
| Frontend (Next.js) | REST (JSON) | Flask API |
| /dashboard, /stats +---------------->+ /api/v1/fees/estimate |
| calls /fees/mempool, | | /fees/mempool |
| /analytics/summary | | /analytics/summary |
+---------------------------+ +------------+---------------+
|
| RPC (JSON-RPC)
v
+-----------------------------+
| Bitcoin Core (bitcoind) |
| getblocktemplate |
| getrawmempool (verbose) |
| getblockstats, estimates... |
+--------------+--------------+
^
|
periodic |
fetch/store |
|
+------------------------------------------+ |
| Collector (background process) | |
| - Snapshot mempool (txids + verbose) | |
| - Predict p50 via blocktemplate | |
| - Compute block coverage | |
| - Compute high-fee inclusion ratio | |
| - Store analytics in SQLite | |
+---------------------------+--------------+ |
| |
v |
+------------------+ |
| SQLite (history) |<-----------------+
| fee_analysis |
+------------------+