133 changes: 133 additions & 0 deletions API_ENDPOINTS.md
@@ -0,0 +1,133 @@
# L2 Batches Analytics API Endpoints

This document describes the new API endpoints added for L2 batch analysis.

## Existing Endpoints

- `GET /` - Welcome message
- `GET /tx?tx_hash=<hash>` - Single transaction analysis
- `GET /contract?contract_address=<address>` - Contract analysis

## New L2 Analytics Endpoints

### 1. Daily Transactions per Batcher

**Endpoint:** `GET /daily_txs`

**Parameters:**
- `batcher_address` (string) - The batcher address to filter by
- `start_timestamp` (i64) - Start timestamp (Unix timestamp)
- `end_timestamp` (i64) - End timestamp (Unix timestamp)

**Example:**
```
GET /daily_txs?batcher_address=0x5050F69a9786F081509234F1a7F4684b5E5b76C9&start_timestamp=1640995200&end_timestamp=1641081600
```

**Response:**
```json
{
"batcher_address": "0x5050F69a9786F081509234F1a7F4684b5E5b76C9",
"tx_count": 42
}
```

### 2. ETH Saved

**Endpoint:** `GET /eth_saved`

**Parameters:**
- `batcher_address` (string) - The batcher address to filter by
- `start_timestamp` (i64) - Start timestamp (Unix timestamp)
- `end_timestamp` (i64) - End timestamp (Unix timestamp)

**Example:**
```
GET /eth_saved?batcher_address=0x5050F69a9786F081509234F1a7F4684b5E5b76C9&start_timestamp=1640995200&end_timestamp=1641081600
```

**Response:**
```json
{
"batcher_address": "0x5050F69a9786F081509234F1a7F4684b5E5b76C9",
"total_eth_saved_wei": "1234567890123456789"
}
```

**Note:** ETH saved is calculated as the difference between the hypothetical cost of posting the batch data as EIP-7623 calldata and the actual cost paid for blob data.
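The per-transaction calculation behind this note can be sketched as follows (a minimal sketch: the signed return type and the helper name `eth_saved_wei` are illustrative, not the service's actual implementation):

```rust
// Sketch of the per-transaction ETH-saved calculation described above.
// A negative result would mean blobs were more expensive than EIP-7623 calldata.
fn eth_saved_wei(eip_7623_calldata_wei_spent: u128, blob_data_wei_spent: u128) -> i128 {
    eip_7623_calldata_wei_spent as i128 - blob_data_wei_spent as i128
}

fn main() {
    // e.g. 25_000 calldata gas at 20 gwei vs 131_072 blob gas at 15 gwei
    let calldata_cost = 25_000u128 * 20_000_000_000;
    let blob_cost = 131_072u128 * 15_000_000_000;
    println!("{}", eth_saved_wei(calldata_cost, blob_cost));
}
```

The endpoint reports the sum of these per-transaction differences over the requested time range.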

### 3. Total Blob Data Gas

**Endpoint:** `GET /blob_data_gas`

**Parameters:**
- `batcher_address` (string) - The batcher address to filter by
- `start_timestamp` (i64) - Start timestamp (Unix timestamp)
- `end_timestamp` (i64) - End timestamp (Unix timestamp)

**Example:**
```
GET /blob_data_gas?batcher_address=0x5050F69a9786F081509234F1a7F4684b5E5b76C9&start_timestamp=1640995200&end_timestamp=1641081600
```

**Response:**
```json
{
"batcher_address": "0x5050F69a9786F081509234F1a7F4684b5E5b76C9",
"total_blob_data_gas": 1234567890
}
```

### 4. Total Pectra Data Gas (EIP-7623)

**Endpoint:** `GET /pectra_data_gas`

**Parameters:**
- `batcher_address` (string) - The batcher address to filter by
- `start_timestamp` (i64) - Start timestamp (Unix timestamp)
- `end_timestamp` (i64) - End timestamp (Unix timestamp)

**Example:**
```
GET /pectra_data_gas?batcher_address=0x5050F69a9786F081509234F1a7F4684b5E5b76C9&start_timestamp=1640995200&end_timestamp=1641081600
```

**Response:**
```json
{
"batcher_address": "0x5050F69a9786F081509234F1a7F4684b5E5b76C9",
"total_pectra_data_gas": 9876543210
}
```

## Technical Notes

- All timestamps are in Unix timestamp format (seconds since January 1, 1970)
- Gas values are in gas units
- ETH values are in wei (1 ETH = 10^18 wei)
- The endpoints query the SQLite database created by the L2 monitoring service
- The database is automatically populated by the `run_l2_batches_monitoring_service`
- All analytics endpoints now require a `batcher_address` parameter to filter results for a specific batcher
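Since `total_eth_saved_wei` is returned as a decimal string, a client will typically parse it before display; a minimal sketch using the 1 ETH = 10^18 wei conversion from the notes above (the helper name `wei_to_eth` is illustrative):

```rust
// Sketch: parse a wei string as returned by /eth_saved and convert to ETH for display.
// f64 loses precision below ~1 wei granularity, which is fine for display purposes.
fn wei_to_eth(wei: u128) -> f64 {
    wei as f64 / 1e18
}

fn main() {
    let wei: u128 = "1234567890123456789".parse().expect("decimal wei string");
    println!("{:.6} ETH", wei_to_eth(wei));
}
```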

## Monitored Batcher Addresses

Currently, the system monitors the following batcher addresses:
- `0x5050F69a9786F081509234F1a7F4684b5E5b76C9` (Base)
- `0x6887246668a3b87F54DeB3b94Ba47a6f63F32985` (Optimism)

## Server Startup

To start the server with the new APIs:

```bash
# Set required environment variables
export ETHEREUM_PROVIDER="your_ethereum_rpc_url"
export ETHERSCAN_API_KEY="your_etherscan_api_key"
export CHAIN_ID="1" # 1 for mainnet, 11155111 for sepolia
export PORT="3000" # optional, default 3000

# Start the server
cargo run
```

The server will start both the HTTP API and the L2 monitoring service in parallel.
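Query windows for the analytics endpoints can be assembled from the shell; a minimal sketch building a last-24-hours URL (assumes a POSIX shell with `date +%s`, and the server on its default port):

```bash
# Build a last-24-hours query window (86400 seconds per day)
END=$(date +%s)
START=$((END - 86400))
echo "http://localhost:3000/daily_txs?batcher_address=0x5050F69a9786F081509234F1a7F4684b5E5b76C9&start_timestamp=${START}&end_timestamp=${END}"
```

Pass the printed URL to `curl` once the server is running.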
1 change: 1 addition & 0 deletions Cargo.lock


5 changes: 4 additions & 1 deletion Cargo.toml
@@ -30,4 +30,7 @@ sqlx = { version = "0.8", features = [ "runtime-tokio-rustls", "sqlite" ] }
async-trait = "0.1"
tracing = "0.1"
tracing-subscriber = { version = "0.3.19", features = ["env-filter"] }
chrono = { version = "0.4", features = ["serde"] }

[dev-dependencies]
tempfile = "3.8"
4 changes: 4 additions & 0 deletions README.md
@@ -94,6 +94,10 @@ Example:
curl "http://localhost:3000/contract?contract_address=0x41dDf7fC14a579E0F3f2D698e14c76d9d486B9F7"
```

### APIs to read historical data

See the [related doc](/API_ENDPOINTS.md).

## Development

1. Clone the repository:
209 changes: 209 additions & 0 deletions examples/fill_test_data.rs
@@ -0,0 +1,209 @@
use pectralizer::server::types::TxAnalysisResponse;
use pectralizer::tracker::database::{Database, SqliteDatabase, TrackedBatch};
use std::time::{SystemTime, UNIX_EPOCH};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("🔧 Filling database with test data...");

    // Connect to the database
    let db = SqliteDatabase::new("./l2_batches_monitoring.db", 0).await?;

    // Get current timestamp
    let now = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .as_secs() as i64;

    // Use exact addresses from L2_BATCHERS_ADDRESSES in l2_monitor.rs
    let base_batcher = "0x5050f69a9786f081509234f1a7f4684b5e5b76c9"; // Base - lowercase hex format
    let optimism_batcher = "0x6887246668a3b87f54deb3b94ba47a6f63f32985"; // Optimism - lowercase hex format

    // Create proper TxAnalysisResponse structures for test data
    let create_analysis_response = |timestamp: u64,
                                    gas_used: u64,
                                    gas_price: u128,
                                    blob_gas_price: u128,
                                    blob_gas_used: u64,
                                    eip_7623_calldata_gas: u64,
                                    legacy_calldata_gas: u64|
     -> String {
        let response = TxAnalysisResponse {
            timestamp,
            gas_used,
            gas_price,
            blob_gas_price: Some(blob_gas_price),
            blob_gas_used,
            eip_7623_calldata_gas,
            legacy_calldata_gas,
            blob_data_wei_spent: Some(blob_gas_used as u128 * blob_gas_price),
            legacy_calldata_wei_spent: legacy_calldata_gas as u128 * gas_price,
            eip_7623_calldata_wei_spent: eip_7623_calldata_gas as u128 * gas_price,
        };
        serde_json::to_string(&response).unwrap()
    };

    // Base batcher test data
    let base_batches = vec![
        TrackedBatch {
            id: None,
            tx_hash: "0xbase1111111111111111111111111111111111111111111111111111111111111"
                .to_string(),
            batcher_address: base_batcher.to_string(),
            analysis_result: create_analysis_response(
                (now - 86400) as u64, // 1 day ago
                150000,               // gas_used
                20_000_000_000,       // gas_price (20 gwei)
                15_000_000_000,       // blob_gas_price (15 gwei)
                131072,               // blob_gas_used
                15000,                // eip_7623_calldata_gas
                12000,                // legacy_calldata_gas
            ),
            timestamp: now - 86400, // 1 day ago
            last_analyzed_block: None,
        },
        TrackedBatch {
            id: None,
            tx_hash: "0xbase2222222222222222222222222222222222222222222222222222222222222"
                .to_string(),
            batcher_address: base_batcher.to_string(),
            analysis_result: create_analysis_response(
                (now - 82800) as u64, // 23 hours ago
                200000,               // gas_used
                25_000_000_000,       // gas_price (25 gwei)
                18_000_000_000,       // blob_gas_price (18 gwei)
                262144,               // blob_gas_used
                25000,                // eip_7623_calldata_gas
                20000,                // legacy_calldata_gas
            ),
            timestamp: now - 82800, // 23 hours ago
            last_analyzed_block: None,
        },
        TrackedBatch {
            id: None,
            tx_hash: "0xbase3333333333333333333333333333333333333333333333333333333333333"
                .to_string(),
            batcher_address: base_batcher.to_string(),
            analysis_result: create_analysis_response(
                (now - 79200) as u64, // 22 hours ago
                175000,               // gas_used
                22_000_000_000,       // gas_price (22 gwei)
                16_000_000_000,       // blob_gas_price (16 gwei)
                196608,               // blob_gas_used
                18000,                // eip_7623_calldata_gas
                15000,                // legacy_calldata_gas
            ),
            timestamp: now - 79200, // 22 hours ago
            last_analyzed_block: None,
        },
    ];

    // Optimism batcher test data
    let optimism_batches = vec![
        TrackedBatch {
            id: None,
            tx_hash: "0xop111111111111111111111111111111111111111111111111111111111111111"
                .to_string(),
            batcher_address: optimism_batcher.to_string(),
            analysis_result: create_analysis_response(
                (now - 75600) as u64, // 21 hours ago
                300000,               // gas_used
                30_000_000_000,       // gas_price (30 gwei)
                20_000_000_000,       // blob_gas_price (20 gwei)
                393216,               // blob_gas_used
                35000,                // eip_7623_calldata_gas
                28000,                // legacy_calldata_gas
            ),
            timestamp: now - 75600, // 21 hours ago
            last_analyzed_block: None,
        },
        TrackedBatch {
            id: None,
            tx_hash: "0xop222222222222222222222222222222222222222222222222222222222222222"
                .to_string(),
            batcher_address: optimism_batcher.to_string(),
            analysis_result: create_analysis_response(
                (now - 72000) as u64, // 20 hours ago
                350000,               // gas_used
                35_000_000_000,       // gas_price (35 gwei)
                25_000_000_000,       // blob_gas_price (25 gwei)
                524288,               // blob_gas_used
                45000,                // eip_7623_calldata_gas
                36000,                // legacy_calldata_gas
            ),
            timestamp: now - 72000, // 20 hours ago
            last_analyzed_block: None,
        },
        TrackedBatch {
            id: None,
            tx_hash: "0xop333333333333333333333333333333333333333333333333333333333333333"
                .to_string(),
            batcher_address: optimism_batcher.to_string(),
            analysis_result: create_analysis_response(
                (now - 68400) as u64, // 19 hours ago
                250000,               // gas_used
                28_000_000_000,       // gas_price (28 gwei)
                22_000_000_000,       // blob_gas_price (22 gwei)
                327680,               // blob_gas_used
                28000,                // eip_7623_calldata_gas
                22000,                // legacy_calldata_gas
            ),
            timestamp: now - 68400, // 19 hours ago
            last_analyzed_block: None,
        },
    ];

    // Insert Base batches
    println!("📊 Inserting Base batcher data...");
    for batch in base_batches {
        match db.save_tracked_batch(&batch).await {
            Ok(_) => println!(" ✅ Inserted batch: {}", batch.tx_hash),
            Err(e) => println!(" ❌ Failed to insert batch {}: {}", batch.tx_hash, e),
        }
    }

    // Insert Optimism batches
    println!("📊 Inserting Optimism batcher data...");
    for batch in optimism_batches {
        match db.save_tracked_batch(&batch).await {
            Ok(_) => println!(" ✅ Inserted batch: {}", batch.tx_hash),
            Err(e) => println!(" ❌ Failed to insert batch {}: {}", batch.tx_hash, e),
        }
    }

    // Add some older data for testing different time ranges
    println!("📊 Inserting older test data...");
    let older_batch = TrackedBatch {
        id: None,
        tx_hash: "0xold1111111111111111111111111111111111111111111111111111111111111".to_string(),
        batcher_address: base_batcher.to_string(),
        analysis_result: create_analysis_response(
            (now - 604800) as u64, // 7 days ago
            120000,                // gas_used
            15_000_000_000,        // gas_price (15 gwei)
            12_000_000_000,        // blob_gas_price (12 gwei)
            131072,                // blob_gas_used
            12000,                 // eip_7623_calldata_gas
            10000,                 // legacy_calldata_gas
        ),
        timestamp: now - 604800, // 7 days ago
        last_analyzed_block: None,
    };

    match db.save_tracked_batch(&older_batch).await {
        Ok(_) => println!(" ✅ Inserted older batch: {}", older_batch.tx_hash),
        Err(e) => println!(" ❌ Failed to insert older batch: {}", e),
    }

    println!("\n🎉 Test data insertion completed!");
    println!("📈 Summary:");
    println!(" - 3 Base batcher transactions");
    println!(" - 3 Optimism batcher transactions");
    println!(" - 1 older transaction for time range testing");
    println!("\n🔍 You can now test the API endpoints with:");
    println!(" Base batcher: {}", base_batcher);
    println!(" Optimism batcher: {}", optimism_batcher);
    println!(" Time range: {} to {}", now - 86400, now);

    Ok(())
}