@mytec: iter3.10 start, baseline rc ready
This commit is contained in:

RFCP-3.10-LinkBudget-Fresnel-Interference.md — new file, 463 lines
# RFCP — Iteration 3.10: Link Budget, Fresnel Zone & Interference Modeling

## Overview

Add three interconnected RF analysis features: a link budget calculator panel, Fresnel zone visualization on terrain profiles, and basic interference (C/I) modeling for multi-site scenarios. These build on existing infrastructure — propagation models, terrain profiles, and multi-site coverage.

## Priority Order

1. Link Budget Calculator (simplest, standalone UI)
2. Fresnel Zone Visualization (extends the terrain profile)
3. Interference Modeling (extends the coverage engine)

---
## Feature 1: Link Budget Calculator

### Description

A panel/dialog that shows the complete RF link budget as a table — from transmitter to receiver. It uses existing propagation model values but presents them in the standard telecom link budget format.

### Implementation

**New component:** `frontend/src/components/panels/LinkBudgetPanel.tsx`

The panel should display a table with one row per element in the link chain. It should use the currently selected site's parameters and a configurable receiver point (either clicked on the map or entered as coordinates).

**Link Budget Table Structure:**
```
TRANSMITTER
  Tx Power (dBm)                [from site config, e.g. 43 dBm]
  Tx Antenna Gain (dBi)         [from site config, e.g. 18 dBi]
  Tx Cable/Connector Loss (dB)  [new field, default 2 dB]
  EIRP (dBm) = Tx Power + Gain - Cable Loss

PATH
  Distance (km)                 [calculated from Tx to Rx point]
  Free Space Path Loss (dB)     [existing formula: 20log(d) + 20log(f) + 32.45]
  Terrain Diffraction Loss (dB) [from terrain_los model if available]
  Vegetation Loss (dB)          [from vegetation model if available]
  Atmospheric Loss (dB)         [from atmospheric model if available]
  Total Path Loss (dB) = sum of all path losses

RECEIVER
  Rx Antenna Gain (dBi)         [configurable, default 0 dBi for handset]
  Rx Cable Loss (dB)            [configurable, default 0 dB]
  Rx Sensitivity (dBm)          [configurable, default -100 dBm]

RESULT
  Received Power (dBm) = EIRP - Total Path Loss + Rx Gain - Rx Cable
  Link Margin (dB) = Received Power - Rx Sensitivity
  Status = "OK" if margin >= 0, "FAIL" otherwise
```
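To sanity-check the table math by hand, here is a worked example under assumed values (10 km free-space path at 1800 MHz, no terrain/vegetation/atmospheric loss — a sketch for verification, not a definitive test vector):

```python
import math

# Assumed inputs: 43 dBm Tx, 18 dBi gain, 2 dB cable loss, 10 km,
# 1800 MHz, 0 dBi isotropic receiver, -100 dBm sensitivity.
eirp_dbm = 43 + 18 - 2  # 59 dBm
fspl_db = 20 * math.log10(10) + 20 * math.log10(1800) + 32.45
rx_power_dbm = eirp_dbm - fspl_db + 0 - 0
margin_db = rx_power_dbm - (-100)

print(round(fspl_db, 1))    # → 117.6
print(round(margin_db, 1))  # → 41.4
```

A margin around 41 dB at 10 km free space is a useful reference point when eyeballing the panel output.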
**Backend addition:** Add a new endpoint or extend the existing coverage API.

**File:** `backend/app/api/routes/coverage.py` (or a new `link_budget.py`)
```python
import math


@router.post("/api/link-budget")
async def calculate_link_budget(request: dict):
    """Calculate point-to-point link budget.

    Body: {
        "site_id": "...",          # or tx_lat/tx_lon/tx_params
        "tx_lat": 48.46,
        "tx_lon": 35.04,
        "tx_power_dbm": 43,
        "tx_gain_dbi": 18,
        "tx_cable_loss_db": 2,
        "tx_height_m": 30,
        "rx_lat": 48.50,
        "rx_lon": 35.10,
        "rx_gain_dbi": 0,
        "rx_cable_loss_db": 0,
        "rx_sensitivity_dbm": -100,
        "rx_height_m": 1.5,
        "frequency_mhz": 1800
    }
    """
    from app.services.terrain_service import terrain_service

    # Calculate distance
    distance_m = terrain_service.haversine_distance(
        request["tx_lat"], request["tx_lon"],
        request["rx_lat"], request["rx_lon"],
    )
    distance_km = distance_m / 1000

    # Get elevations
    tx_elev = await terrain_service.get_elevation(request["tx_lat"], request["tx_lon"])
    rx_elev = await terrain_service.get_elevation(request["rx_lat"], request["rx_lon"])

    # EIRP
    eirp_dbm = request["tx_power_dbm"] + request["tx_gain_dbi"] - request["tx_cable_loss_db"]

    # Free space path loss
    freq = request["frequency_mhz"]
    fspl_db = 20 * math.log10(distance_km) + 20 * math.log10(freq) + 32.45 if distance_km > 0 else 0

    # Terrain profile for LOS check
    profile = await terrain_service.get_elevation_profile(
        request["tx_lat"], request["tx_lon"],
        request["rx_lat"], request["rx_lon"],
        num_points=100,
    )

    # Simple LOS check — does terrain block the line of sight?
    tx_total_height = tx_elev + request.get("tx_height_m", 30)
    rx_total_height = rx_elev + request.get("rx_height_m", 1.5)

    terrain_loss_db = 0
    los_clear = True
    prev_blocked = False
    for i, point in enumerate(profile):
        if i == 0 or i == len(profile) - 1:
            continue
        # Linear interpolation of the LOS line at this point
        fraction = i / (len(profile) - 1)
        los_height = tx_total_height + fraction * (rx_total_height - tx_total_height)
        if point["elevation"] > los_height:
            los_clear = False
            # Simple knife-edge diffraction estimate: ~6 dB per distinct
            # obstruction. Count each contiguous blocked run once — not
            # every sample inside it, which would wildly overestimate.
            if not prev_blocked:
                terrain_loss_db += 6
            prev_blocked = True
        else:
            prev_blocked = False

    total_path_loss = fspl_db + terrain_loss_db

    # Received power
    rx_power_dbm = eirp_dbm - total_path_loss + request["rx_gain_dbi"] - request["rx_cable_loss_db"]

    # Link margin
    margin_db = rx_power_dbm - request["rx_sensitivity_dbm"]

    return {
        "distance_km": round(distance_km, 2),
        "distance_m": round(distance_m, 1),
        "tx_elevation_m": round(tx_elev, 1),
        "rx_elevation_m": round(rx_elev, 1),
        "eirp_dbm": round(eirp_dbm, 1),
        "fspl_db": round(fspl_db, 1),
        "terrain_loss_db": round(terrain_loss_db, 1),
        "total_path_loss_db": round(total_path_loss, 1),
        "los_clear": los_clear,
        "rx_power_dbm": round(rx_power_dbm, 1),
        "margin_db": round(margin_db, 1),
        "status": "OK" if margin_db >= 0 else "FAIL",
        "profile": profile,
    }
```
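The flat "~6 dB per obstruction" is a deliberate simplification. If a later iteration wants something more physical, the standard single knife-edge approximation (ITU-R P.526 style) is a small drop-in. This is a hedged sketch, not part of the current spec; `h_m` is the obstacle height above the direct Tx-Rx line:

```python
import math


def knife_edge_loss_db(h_m: float, d1_m: float, d2_m: float, frequency_mhz: float) -> float:
    """Single knife-edge diffraction loss (ITU-R P.526-style approximation).

    h_m: obstacle height above the direct Tx-Rx line (negative = below it).
    d1_m / d2_m: distances from Tx and Rx to the obstacle, in meters.
    """
    wavelength = 300.0 / frequency_mhz
    if d1_m <= 0 or d2_m <= 0:
        return 0.0
    # Fresnel-Kirchhoff diffraction parameter
    v = h_m * math.sqrt(2 * (d1_m + d2_m) / (wavelength * d1_m * d2_m))
    if v <= -0.78:
        return 0.0  # obstacle well below the path — negligible loss
    return 6.9 + 20 * math.log10(math.sqrt((v - 0.1) ** 2 + 1) + v - 0.1)
```

An obstacle exactly grazing the LOS line (`h_m = 0`) gives ≈ 6 dB, which is where the simplified constant in the endpoint comes from.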
### UI Requirements
- New panel accessible from a sidebar or toolbar button (calculator icon)
- Click on the map to set the Rx point (with crosshair cursor)
- Auto-populates Tx params from the selected site
- Shows the result table with color coding (green margin = OK, red = FAIL)
- Optionally draws a line on the map from Tx to Rx

---
## Feature 2: Fresnel Zone Visualization

### Description

Draw a Fresnel zone ellipse overlay on the Terrain Profile chart, showing where terrain intrudes into the first Fresnel zone. This is critical for judging whether a radio link will actually work — even when terrain does not block the direct LOS, Fresnel zone obstruction causes significant signal loss.

### Implementation

**Modify:** the existing Terrain Profile component/chart

**Fresnel Zone Radius Formula:**
```python
import math


def fresnel_radius(n: int, frequency_mhz: float, d1_m: float, d2_m: float) -> float:
    """Calculate the nth Fresnel zone radius at a point along the path.

    Args:
        n: Fresnel zone number (1 = first zone, most important)
        frequency_mhz: Frequency in MHz
        d1_m: Distance from transmitter to this point (meters)
        d2_m: Distance from this point to receiver (meters)

    Returns:
        Radius of the nth Fresnel zone in meters
    """
    wavelength = 300.0 / frequency_mhz  # meters
    d_total = d1_m + d2_m
    if d_total == 0:
        return 0
    radius = math.sqrt((n * wavelength * d1_m * d2_m) / d_total)
    return radius
```
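A quick numeric check of the formula at the midpoint of the reference link (1800 MHz, 10 km) — useful as a known-good value for the testing checklist:

```python
import math

# First Fresnel zone at the midpoint of a 10 km, 1800 MHz link
frequency_mhz = 1800
wavelength = 300.0 / frequency_mhz  # ≈ 0.167 m
d1 = d2 = 5000.0                    # midpoint of a 10 km path
r1 = math.sqrt((1 * wavelength * d1 * d2) / (d1 + d2))
print(round(r1, 1))  # → 20.4
```

So the first-zone radius peaks at roughly 20 m mid-path for this frequency/distance, and shrinks toward both endpoints.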
**Backend endpoint:** `backend/app/api/routes/coverage.py`
```python
import math


@router.post("/api/fresnel-profile")
async def fresnel_profile(request: dict):
    """Calculate terrain profile with Fresnel zone boundaries.

    Body: {
        "tx_lat": 48.46, "tx_lon": 35.04, "tx_height_m": 30,
        "rx_lat": 48.50, "rx_lon": 35.10, "rx_height_m": 1.5,
        "frequency_mhz": 1800,
        "num_points": 100
    }
    """
    from app.services.terrain_service import terrain_service

    tx_lat, tx_lon = request["tx_lat"], request["tx_lon"]
    rx_lat, rx_lon = request["rx_lat"], request["rx_lon"]
    tx_height = request.get("tx_height_m", 30)
    rx_height = request.get("rx_height_m", 1.5)
    freq = request.get("frequency_mhz", 1800)
    num_points = request.get("num_points", 100)

    # Get terrain profile
    profile = await terrain_service.get_elevation_profile(
        tx_lat, tx_lon, rx_lat, rx_lon, num_points
    )

    total_distance = profile[-1]["distance"] if profile else 0

    # Get endpoint elevations
    tx_elev = profile[0]["elevation"] if profile else 0
    rx_elev = profile[-1]["elevation"] if profile else 0
    tx_total = tx_elev + tx_height
    rx_total = rx_elev + rx_height

    wavelength = 300.0 / freq  # meters

    # Calculate the Fresnel zone at each profile point
    fresnel_data = []
    los_blocked = False
    fresnel_blocked = False
    worst_clearance = float('inf')

    for i, point in enumerate(profile):
        d1 = point["distance"]    # distance from tx
        d2 = total_distance - d1  # distance to rx

        # LOS height at this point (linear interpolation)
        if total_distance > 0:
            fraction = d1 / total_distance
        else:
            fraction = 0
        los_height = tx_total + fraction * (rx_total - tx_total)

        # First Fresnel zone radius
        if d1 > 0 and d2 > 0 and total_distance > 0:
            f1_radius = math.sqrt((1 * wavelength * d1 * d2) / total_distance)
        else:
            f1_radius = 0

        # Fresnel zone boundaries (height above sea level)
        fresnel_top = los_height + f1_radius
        fresnel_bottom = los_height - f1_radius

        # Clearance: how much space between terrain and the Fresnel bottom
        clearance = fresnel_bottom - point["elevation"]

        if clearance < worst_clearance:
            worst_clearance = clearance

        if point["elevation"] > los_height:
            los_blocked = True
        if point["elevation"] > fresnel_bottom:
            fresnel_blocked = True

        fresnel_data.append({
            "distance": point["distance"],
            "lat": point["lat"],
            "lon": point["lon"],
            "terrain_elevation": point["elevation"],
            "los_height": round(los_height, 1),
            "fresnel_top": round(fresnel_top, 1),
            "fresnel_bottom": round(fresnel_bottom, 1),
            "f1_radius": round(f1_radius, 1),
            "clearance": round(clearance, 1),
        })

    return {
        "profile": fresnel_data,
        "total_distance_m": round(total_distance, 1),
        "tx_elevation": round(tx_elev, 1),
        "rx_elevation": round(rx_elev, 1),
        "frequency_mhz": freq,
        "wavelength_m": round(wavelength, 4),
        "los_clear": not los_blocked,
        "fresnel_clear": not fresnel_blocked,
        "worst_clearance_m": round(worst_clearance, 1),
        "recommendation": (
            "Clear — excellent link" if not fresnel_blocked
            else "Fresnel zone partially blocked — expect 3-6 dB additional loss"
            if not los_blocked
            else "LOS blocked — significant diffraction loss expected"
        ),
    }
```
### Frontend Visualization
On the existing Terrain Profile chart:
- Draw the LOS line (straight line from Tx to Rx) — this may already exist
- Draw the first Fresnel zone as a **semi-transparent elliptical area** around the LOS line
  - Upper boundary = `fresnel_top` series
  - Lower boundary = `fresnel_bottom` series
  - Color: light blue with ~20% opacity
- Where terrain intersects the Fresnel zone, highlight it in red/orange
- Show clearance info in the profile tooltip
- Add a summary badge: "LOS Clear ✓" / "Fresnel 60% Clear ⚠" / "LOS Blocked ✗"

---
## Feature 3: Interference Modeling (C/I Ratio)

### Description

Add a carrier-to-interference ratio calculation to the coverage engine. For each grid point, compute the C/I ratio: the signal from the serving cell versus the sum of signals from all other cells on the same frequency. Display it as a separate heatmap layer.

### Implementation

**Backend changes:**

**File:** `backend/app/services/coverage_service.py` (or `gpu_service.py`)

Add the C/I calculation after the existing coverage computation:
```python
import numpy as np


def calculate_interference(self, sites: list, coverage_results: dict) -> np.ndarray:
    """Calculate the C/I ratio for each grid point.

    For each point:
    - C = signal strength from the strongest (serving) cell
    - I = sum of signal strengths from all other co-frequency cells
    - C/I = C - 10*log10(sum of linear interference powers)

    Returns an array of C/I values in dB.
    """
    # Get all RSRP grids (already calculated).
    # For each point, find:
    #   1. Best server (strongest signal) = C
    #   2. Sum of all others on the same frequency = I
    #   3. C/I = C(dBm) - I(dBm)

    # Group sites by frequency
    freq_groups = {}
    for site in sites:
        freq = site.get("frequency_mhz", 1800)
        if freq not in freq_groups:
            freq_groups[freq] = []
        freq_groups[freq].append(site)

    # Only calculate interference for frequency groups with 2+ sites.
    # For single-site frequencies, C/I = infinity (no interference).

    # The RSRP values are already in dBm; convert to linear for summing:
    #   P_linear = 10^(P_dBm / 10)
    #   I_total_linear = sum(P_linear for all interferers)
    #   I_total_dBm = 10 * log10(I_total_linear)
    #   C/I = C_dBm - I_total_dBm
    pass
```
**Key algorithm (for the GPU pipeline in `gpu_service.py`):**
```python
import cupy as cp

# After computing RSRP for all sites at all grid points:
# rsrp_grid shape: (num_sites, num_points), in dBm

# Convert to linear (mW)
rsrp_linear = 10 ** (rsrp_grid / 10.0)  # CuPy array

# For each point, the best server
best_server_idx = cp.argmax(rsrp_grid, axis=0)
best_rsrp_linear = cp.take_along_axis(rsrp_linear, best_server_idx[cp.newaxis, :], axis=0)[0]

# Total power from all sites
total_power = cp.sum(rsrp_linear, axis=0)

# Interference = total - serving
interference_linear = total_power - best_rsrp_linear

# C/I ratio in dB
# Avoid log10(0) with a small epsilon
epsilon = 1e-30
ci_ratio_db = 10 * cp.log10(best_rsrp_linear / (interference_linear + epsilon))

# Clip to a reasonable range
ci_ratio_db = cp.clip(ci_ratio_db, -20, 50)
```
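Since the CuPy path needs a GPU, a NumPy mirror of the same algorithm is handy for unit tests and for verifying values by hand (the function name and the toy grid below are illustrative, not part of the spec):

```python
import numpy as np


def ci_ratio_db(rsrp_grid_dbm: np.ndarray) -> np.ndarray:
    """C/I per grid point; rsrp_grid_dbm has shape (num_sites, num_points)."""
    rsrp_linear = 10 ** (rsrp_grid_dbm / 10.0)  # dBm -> mW
    best_idx = np.argmax(rsrp_grid_dbm, axis=0)
    best_linear = np.take_along_axis(rsrp_linear, best_idx[np.newaxis, :], axis=0)[0]
    interference = rsrp_linear.sum(axis=0) - best_linear
    eps = 1e-30  # avoid log10(0) for single-server points
    ci = 10 * np.log10(best_linear / (interference + eps))
    return np.clip(ci, -20, 50)


# Two sites, two points: serving cell is 10 dB and 5 dB above the interferer
grid = np.array([[-70.0, -90.0],
                 [-80.0, -85.0]])
print(ci_ratio_db(grid))  # C/I ≈ [10, 5] dB
```

With only dB subtraction of a single interferer, C/I is just the RSRP difference — the linear sum only matters once there are 3+ co-channel sites.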
### Frontend Visualization
- Add a toggle in the coverage controls: "Show: Signal (RSRP) | Interference (C/I)"
- The C/I heatmap uses a different color scale:
  - Dark red: < 0 dB (interference dominant — no service)
  - Orange: 0-10 dB (marginal)
  - Yellow: 10-20 dB (acceptable)
  - Green: 20-30 dB (good)
  - Blue: > 30 dB (excellent, minimal interference)
- The C/I map only makes sense with 2+ sites on the same frequency
- Show a warning if all sites are on different frequencies (no co-channel interference)

### API Response Extension
Add a `ci_ratio` field to the coverage calculation response alongside the existing `rsrp` values.

---
## Testing Checklist

### Link Budget
- [ ] Panel opens from toolbar/sidebar
- [ ] Click on map sets the Rx point
- [ ] Tx parameters auto-populate from the selected site
- [ ] Link budget table shows all rows correctly
- [ ] Margin calculation is correct (manual verification)
- [ ] Color coding: green for non-negative margin, red for negative
- [ ] Line drawn on map from Tx to Rx

### Fresnel Zone
- [ ] Terrain profile shows the Fresnel zone overlay
- [ ] Fresnel ellipse is widest at the midpoint (correct shape)
- [ ] Red highlighting where terrain enters the Fresnel zone
- [ ] Summary shows LOS/Fresnel status
- [ ] Works at different frequencies (zone size changes with frequency)
- [ ] Clearance values are reasonable (first Fresnel zone at 1800 MHz over 10 km ≈ 20 m radius at the midpoint)

### Interference
- [ ] C/I toggle appears when 2+ sites exist
- [ ] C/I heatmap renders with the correct color scale
- [ ] Single-site scenario shows "no interference" or infinite C/I
- [ ] Two sites on the same frequency show interference zones between them
- [ ] C/I values are reasonable (> 20 dB near the serving cell, < 10 dB at the cell edge)
## Build & Deploy

```bash
cd D:\root\rfcp

# Backend — just restart uvicorn (Python, no build)
cd backend
python -m uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

# Frontend — rebuild if UI components changed
cd frontend
npm run build

# Full installer rebuild if needed
# (use existing build script)
```
## Commit Message

```
feat(rf): add link budget, Fresnel zone, and interference modeling

- Add /api/link-budget endpoint with full path analysis
- Add /api/fresnel-profile endpoint with zone clearance calculation
- Add C/I ratio computation to GPU coverage pipeline
- Add LinkBudgetPanel frontend component
- Add Fresnel zone overlay to terrain profile chart
- Add C/I heatmap toggle alongside RSRP display
- Group interference by frequency for co-channel analysis
```
## Success Criteria

1. Link budget shows the correct margin for a known test case (Dnipro, 10 km, 1800 MHz)
2. Fresnel zone visually shows the ellipse on the terrain profile
3. Two co-frequency sites show an interference pattern between them
4. All three features work with existing terrain data (no new downloads needed)
5. GPU pipeline performance is not significantly degraded by the C/I calculation
RFCP-3.9.1-Terra-Integration.md — new file, 246 lines
# RFCP — Iteration 3.9.1: Terra Tile Server Integration

## Overview
Connect terrain_service.py to our SRTM tile server (terra.eliah.one) as the primary download source, add a terrain status API endpoint, and create a bulk pre-download utility. The `data/terrain/` directory already exists.

## Context
- terra.eliah.one is live and serving tiles via Caddy file_server
- SRTM3 (90m): 187 tiles, 515 MB — full Ukraine coverage (N44-N51, E018-E041)
- SRTM1 (30m): 160 tiles, 3.9 GB — same coverage area
- terrain_service.py already has bilinear interpolation (3.9.0)
- Backend runs on Windows with an RTX 4060; tiles are stored locally in `data/terrain/`
- The server is a download source, NOT used during realtime calculations

## Changes Required
### 1. Update SRTM_SOURCES in terrain_service.py

**File:** `backend/app/services/terrain_service.py`

Replace the current SRTM_SOURCES (lines 22-25):
```python
SRTM_SOURCES = [
    "https://elevation-tiles-prod.s3.amazonaws.com/skadi/{lat_dir}/{tile_name}.hgt.gz",
    "https://s3.amazonaws.com/elevation-tiles-prod/skadi/{lat_dir}/{tile_name}.hgt.gz",
]
```
With a prioritized source list:
```python
SRTM_SOURCES = [
    # Our tile server — SRTM1 (30m) preferred, uncompressed
    {
        "url": "https://terra.eliah.one/srtm1/{tile_name}.hgt",
        "compressed": False,
        "resolution": "srtm1",
    },
    # Our tile server — SRTM3 (90m) fallback
    {
        "url": "https://terra.eliah.one/srtm3/{tile_name}.hgt",
        "compressed": False,
        "resolution": "srtm3",
    },
    # Public AWS mirror — SRTM1, gzip compressed
    {
        "url": "https://elevation-tiles-prod.s3.amazonaws.com/skadi/{lat_dir}/{tile_name}.hgt.gz",
        "compressed": True,
        "resolution": "srtm1",
    },
]
```
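For reference, `{tile_name}` follows the standard SRTM convention: each tile covers a 1°×1° cell named after its south-west corner (so N48E035 spans 48–49°N, 34.5°E falls in N48E034). A sketch of that mapping — the service presumably already has its own equivalent, so this helper name is hypothetical:

```python
import math


def srtm_tile_name(lat: float, lon: float) -> str:
    """SRTM tile name for a coordinate, from the 1-degree SW corner.

    Hypothetical helper for illustration — terrain_service likely has
    its own version of this logic.
    """
    lat_c, lon_c = math.floor(lat), math.floor(lon)
    ns = "N" if lat_c >= 0 else "S"
    ew = "E" if lon_c >= 0 else "W"
    return f"{ns}{abs(lat_c):02d}{ew}{abs(lon_c):03d}"


print(srtm_tile_name(48.46, 35.04))  # → N48E035
```

Note the zero-padding (2 digits latitude, 3 digits longitude) — a mismatch there is a classic cause of 404s against the tile server.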
Update `download_tile()` to handle the new source format:
```python
# Module-level imports used below:
import gzip
import io
import zipfile

import httpx


async def download_tile(self, tile_name: str) -> bool:
    """Download an SRTM tile from the configured sources, preferring the highest resolution."""
    tile_path = self.get_tile_path(tile_name)

    if tile_path.exists():
        return True

    lat_dir = tile_name[:3]  # e.g., "N48"

    async with httpx.AsyncClient(timeout=60.0, follow_redirects=True) as client:
        for source in self.SRTM_SOURCES:
            url = source["url"].format(lat_dir=lat_dir, tile_name=tile_name)
            try:
                response = await client.get(url)

                if response.status_code == 200:
                    data = response.content

                    # Skip empty or error-page responses
                    if len(data) < 1000:
                        continue

                    if source["compressed"]:
                        if url.endswith('.gz'):
                            data = gzip.decompress(data)
                        elif url.endswith('.zip'):
                            with zipfile.ZipFile(io.BytesIO(data)) as zf:
                                for name in zf.namelist():
                                    if name.endswith('.hgt'):
                                        data = zf.read(name)
                                        break

                    # Validate tile size (SRTM1 or SRTM3)
                    if len(data) not in (3601 * 3601 * 2, 1201 * 1201 * 2):
                        print(f"[Terrain] Invalid tile size {len(data)} from {url}")
                        continue

                    tile_path.write_bytes(data)
                    res = source["resolution"]
                    size_mb = len(data) / 1048576
                    print(f"[Terrain] Downloaded {tile_name} ({res}, {size_mb:.1f} MB)")
                    return True

            except Exception as e:
                print(f"[Terrain] Failed from {url}: {e}")
                continue

    print(f"[Terrain] Could not download {tile_name} from any source")
    return False
```
### 2. Add Terrain Status API Endpoint

**File:** `backend/app/api/routes.py` (or wherever the API routes are defined)

Add a new endpoint:
```python
@router.get("/api/terrain/status")
async def terrain_status():
    """Return terrain data availability info."""
    from app.services.terrain_service import terrain_service

    cached_tiles = terrain_service.get_cached_tiles()
    cache_size = terrain_service.get_cache_size_mb()

    # Categorize by resolution (an SRTM1 tile is 3601x3601 16-bit samples)
    srtm1_tiles = [t for t in cached_tiles
                   if (terrain_service.terrain_path / f"{t}.hgt").stat().st_size == 3601 * 3601 * 2]
    srtm3_tiles = [t for t in cached_tiles if t not in srtm1_tiles]

    return {
        "total_tiles": len(cached_tiles),
        "srtm1": {
            "count": len(srtm1_tiles),
            "resolution_m": 30,
            "tiles": sorted(srtm1_tiles),
        },
        "srtm3": {
            "count": len(srtm3_tiles),
            "resolution_m": 90,
            "tiles": sorted(srtm3_tiles),
        },
        "cache_size_mb": round(cache_size, 1),
        "memory_cached": len(terrain_service._tile_cache),
        "terra_server": "https://terra.eliah.one",
    }
```
### 3. Add Bulk Pre-Download Endpoint

**File:** same routes file

```python
@router.post("/api/terrain/download")
async def terrain_download(request: dict):
    """Pre-download tiles for a region.

    Body: {"center_lat": 48.46, "center_lon": 35.04, "radius_km": 50}
    Or:   {"tiles": ["N48E034", "N48E035", "N47E034", "N47E035"]}
    """
    from app.services.terrain_service import terrain_service

    if "tiles" in request:
        tile_list = request["tiles"]
    else:
        center_lat = request.get("center_lat", 48.46)
        center_lon = request.get("center_lon", 35.04)
        radius_km = request.get("radius_km", 50)
        tile_list = terrain_service.get_required_tiles(center_lat, center_lon, radius_km)

    missing = [t for t in tile_list if not terrain_service.get_tile_path(t).exists()]

    if not missing:
        return {"status": "ok", "message": "All tiles already cached", "count": len(tile_list)}

    # Download missing tiles
    downloaded = []
    failed = []
    for tile_name in missing:
        success = await terrain_service.download_tile(tile_name)
        if success:
            downloaded.append(tile_name)
        else:
            failed.append(tile_name)

    return {
        "status": "ok",
        "required": len(tile_list),
        "already_cached": len(tile_list) - len(missing),
        "downloaded": downloaded,
        "failed": failed,
    }
```
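The radius-to-tiles step relies on `get_required_tiles`. Roughly, it has to turn the radius into a lat/lon bounding box and enumerate every 1° cell the box touches. A sketch of that logic under the usual ~111 km/degree approximation (function name and details are assumptions, not the actual service API):

```python
import math


def tiles_for_radius(center_lat: float, center_lon: float, radius_km: float) -> list:
    """Tiles whose 1-degree cell intersects a bounding box around the center.

    Illustrative sketch of what get_required_tiles might do; assumes
    ~111 km per degree of latitude and scales longitude by cos(lat).
    """
    dlat = radius_km / 111.0
    dlon = radius_km / (111.0 * max(math.cos(math.radians(center_lat)), 0.01))
    tiles = set()
    for lat in range(math.floor(center_lat - dlat), math.floor(center_lat + dlat) + 1):
        for lon in range(math.floor(center_lon - dlon), math.floor(center_lon + dlon) + 1):
            ns = "N" if lat >= 0 else "S"
            ew = "E" if lon >= 0 else "W"
            tiles.add(f"{ns}{abs(lat):02d}{ew}{abs(lon):03d}")
    return sorted(tiles)


print(tiles_for_radius(48.46, 35.04, 10))  # → ['N48E034', 'N48E035']
```

Even a 10 km radius around Dnipro crosses the E034/E035 tile boundary, which is why the endpoint should always work in tile sets rather than a single tile.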
### 4. Add Tile Index Endpoint

**File:** same routes file

```python
@router.get("/api/terrain/index")
async def terrain_index():
    """Fetch the tile index from the terra server."""
    import httpx
    try:
        async with httpx.AsyncClient(timeout=10.0) as client:
            resp = await client.get("https://terra.eliah.one/api/index")
            if resp.status_code == 200:
                return resp.json()
    except Exception:
        pass
    return {"error": "Could not reach terra.eliah.one", "offline": True}
```
## Testing Checklist

- [ ] `GET /api/terrain/status` returns tile counts and sizes
- [ ] `POST /api/terrain/download {"center_lat": 48.46, "center_lon": 35.04, "radius_km": 10}` downloads missing tiles from terra.eliah.one
- [ ] Tiles downloaded from terra are valid HGT format (2,884,802 or 25,934,402 bytes)
- [ ] SRTM1 is preferred over SRTM3 when downloading
- [ ] Existing tiles are not re-downloaded
- [ ] Coverage calculation works with terrain data (test with Dnipro coordinates)
- [ ] `GET /api/terrain/index` returns the terra server tile list
## Build & Deploy

```bash
cd D:\root\rfcp\backend
# No build needed — Python backend, just restart.
# Kill existing uvicorn and restart:
python -m uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```
## Commit Message

```
feat(terrain): integrate terra.eliah.one tile server

- Add terra.eliah.one as primary SRTM source (SRTM1 30m preferred)
- Keep AWS S3 as fallback source
- Add /api/terrain/status endpoint (tile inventory)
- Add /api/terrain/download endpoint (bulk pre-download)
- Add /api/terrain/index endpoint (terra server index)
- Validate tile size before saving
- Add follow_redirects=True to httpx client
```
## Success Criteria

1. terrain_service downloads from terra.eliah.one first
2. /api/terrain/status shows correct tile counts by resolution
3. /api/terrain/download fetches tiles for any Ukrainian coordinate
4. Offline mode works — no downloads attempted if tiles exist locally
5. Coverage calculation uses real elevation data instead of flat terrain
@@ -1,352 +0,0 @@
# UMTC Wiki v2.0 — MEGA TASK: Integration & Polish

Read UMTC-Wiki-v2.0-REFACTOR.md and UMTC-Wiki-v2.0-ROADMAP.md for full context.

This is a comprehensive task covering all remaining fixes and integration work.
Take your time, think hard, and work through each section systematically.
Report after completing each major section.

---
## SECTION A: Fix Critical Tauri 404 Bug

The sidebar loads the full content tree correctly, but clicking ANY article shows a 404.

### Debug steps:
1. In `frontend/src/lib/api.ts` — find where getPage is called with a slug.
   Add `console.log('[WIKI] getPage called with slug:', slug)`

2. In `frontend/src/lib/utils/backend.ts` — in the tauriGetPage function.
   Add `console.log('[WIKI] Tauri invoke get_page with:', slug)`

3. In `desktop/src-tauri/src/commands/content.rs` — in the get_page handler.
   Add `eprintln!("[WIKI] get_page received slug: {}", slug)`
   Add `eprintln!("[WIKI] trying path: {:?}", resolved_path)`

4. Check the Sidebar.svelte component — what href/slug does it generate when the user clicks?
   The web version uses `/api/pages/{slug}` — in desktop mode it should invoke with just the slug part.

5. Common mismatches to check:
   - Leading slash: sidebar sends `/lte/bbu` but Rust expects `lte/bbu`
   - File extension: Rust looks for `lte/bbu.md` but the file is `lte/bbu/index.md`
   - URL encoding: Ukrainian characters in slugs
   - The SvelteKit catch-all route `[...slug]` may pass the slug differently

6. Fix the mismatch. Test navigation to at least 10 different pages, including:
   - Root sections (lte/, ran/, mikrotik/)
   - Nested pages (lte/bbu, ran/srsenb-config)
   - Glossary terms (glossary/prb)
   - Deep nesting if any
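The slug-normalization fix usually boils down to one small resolver that tolerates both layouts and the leading slash. A sketch of that logic (the real fix lives in the Rust `get_page` handler — this helper and its name are hypothetical, shown only to pin down the expected behavior):

```python
from pathlib import Path


def resolve_slug(content_root: Path, raw_slug: str):
    """Normalize a frontend slug and try both candidate file layouts.

    Hypothetical mirror of what the Rust get_page handler should do:
    strip the leading slash, then try `<slug>.md` and `<slug>/index.md`.
    """
    slug = raw_slug.strip("/")  # tolerate a leading slash from the router
    if not slug:
        slug = "index"
    for candidate in (f"{slug}.md", f"{slug}/index.md"):
        path = content_root / candidate
        if path.is_file():
            return path
    return None  # genuine 404
```

Whatever the Rust implementation ends up doing, it should accept both `/lte/bbu` and `lte/bbu` and find either file layout, or the sidebar and the handler will keep disagreeing.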
---

## SECTION B: Fix Web Deployment

The web version must keep working. Test and fix:

1. Check that `backend/content.py` imports work:
   - `from wiki_frontmatter import ArticleFrontmatter`
   - `from wikilinks import WikiLinksExtension`
   - `from backlinks import BacklinksIndex`
   - `from admonitions import AdmonitionsExtension`

   If any import fails, fix the module.

2. Add the admonitions extension to the markdown pipeline in content.py
   (wikilinks was already integrated; verify admonitions too).

3. Make sure the backlinks API endpoint in main.py works:
   - GET /api/pages/{slug:path}/backlinks
   - Should return `{ "slug": "...", "backlinks": [...], "count": N }`

4. Add grade/status/category to the page API response:
   - GET /api/pages/{slug} should now include grade, status, and category fields

5. Create a simple test script `scripts/test_web.py`:
```python
# Test that the backend starts and key endpoints work
import requests

BASE = "http://localhost:8000"

# Test navigation
r = requests.get(f"{BASE}/api/navigation")
assert r.status_code == 200
nav = r.json()
print(f"Navigation: {len(nav)} sections")

# Test page load
r = requests.get(f"{BASE}/api/pages/index")
assert r.status_code == 200
print(f"Home page: {r.json().get('title', 'OK')}")

# Test search
r = requests.get(f"{BASE}/api/search?q=LTE")
assert r.status_code == 200
print(f"Search 'LTE': {len(r.json())} results")

# Test backlinks
r = requests.get(f"{BASE}/api/pages/glossary/enb/backlinks")
print(f"Backlinks for eNB: {r.status_code}")

print("\nAll tests passed!")
```
---

## SECTION C: Frontend Wiki Components — Full Integration

### C.1: Article Grade Badge on Pages

In the wiki page view (`frontend/src/routes/[...slug]/+page.svelte` or equivalent):
- Import the ArticleGrade component
- Display the grade badge next to the page title
- The grade comes from the page API response (field: `grade`)
- If there is no grade, don't show the badge
- Style: a small badge inline with the title, not a separate block
### C.2: Breadcrumbs Component
|
||||
|
||||
Create/update `frontend/src/lib/components/wiki/Breadcrumbs.svelte`:
|
||||
```svelte
|
||||
<!-- Example: Головна > LTE > BBU Setup -->
|
||||
<nav class="breadcrumbs">
|
||||
<a href="/">Головна</a>
|
||||
<span class="separator">/</span>
|
||||
<a href="/lte">LTE</a>
|
||||
<span class="separator">/</span>
|
||||
<span class="current">BBU Setup</span>
|
||||
</nav>
|
||||
```
|
||||
- Generate from current page slug
|
||||
- Each segment is a link except the last
|
||||
- Use titles from navigation tree if available, otherwise humanize slug
|
||||
- Works in both web and desktop mode
|
||||
- Integrate into the page layout — show above article title
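The slug-to-breadcrumbs logic above can be sketched in Python (the real component will live in Svelte/TypeScript; `humanize` and the `titles` lookup shape are illustrative assumptions, not the existing API):

```python
def humanize(segment: str) -> str:
    """Fallback label when the navigation tree has no title for a slug segment."""
    return segment.replace("-", " ").replace("_", " ").title()

def build_breadcrumbs(slug: str, titles: dict[str, str]) -> list[tuple[str, str]]:
    """Return (label, href) pairs; an empty href marks the current (unlinked) page."""
    crumbs = [("Головна", "/")]
    parts = [p for p in slug.split("/") if p]
    for i, part in enumerate(parts):
        path = "/" + "/".join(parts[: i + 1])
        label = titles.get(path.lstrip("/"), humanize(part))
        # The last segment is the current page, so it gets no link
        crumbs.append((label, "" if i == len(parts) - 1 else path))
    return crumbs
```

For `build_breadcrumbs("lte/bbu-setup", {"lte": "LTE"})` this yields Головна → LTE → current page, matching the markup example.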

### C.3: Admonition CSS

Add styles for admonition boxes to the global CSS or a component:

```css
.admonition {
  border-left: 4px solid;
  border-radius: 4px;
  padding: 12px 16px;
  margin: 16px 0;
}
.admonition-note { border-color: #3b82f6; background: rgba(59,130,246,0.1); }
.admonition-warning { border-color: #f59e0b; background: rgba(245,158,11,0.1); }
.admonition-tip { border-color: #10b981; background: rgba(16,185,129,0.1); }
.admonition-danger { border-color: #ef4444; background: rgba(239,68,68,0.1); }

/* Dark mode */
:global(.dark) .admonition-note { background: rgba(59,130,246,0.15); }
:global(.dark) .admonition-warning { background: rgba(245,158,11,0.15); }
:global(.dark) .admonition-tip { background: rgba(16,185,129,0.15); }
:global(.dark) .admonition-danger { background: rgba(239,68,68,0.15); }

.admonition-title {
  font-weight: 600;
  margin-bottom: 4px;
}
.admonition-icon {
  margin-right: 8px;
}
```

### C.4: Wiki-Link CSS

Add styles for wiki-links:

```css
.wiki-link {
  color: #3b82f6;
  text-decoration: none;
  border-bottom: 1px dotted #3b82f6;
}
.wiki-link:hover {
  border-bottom-style: solid;
}
.red-link {
  color: #ef4444;
  border-bottom-color: #ef4444;
}
.red-link:hover::after {
  content: " (сторінку не знайдено)"; /* "(page not found)" */
  font-size: 0.75em;
  color: #9ca3af;
}
```

### C.5: Backlinks Panel Integration

In the page view, after the article content:
- Show the BacklinksPanel component
- Pass the current page slug
- Works in both web (API) and desktop (Tauri IPC)
- Only show it if there are backlinks (don't show an empty panel)

### C.6: Table of Contents (sidebar)

If the page has headings, generate a table of contents:
- Extract h2/h3 from the rendered HTML, or use TOC data from the backend
- Show as a floating sidebar on wide screens (>1200px)
- Collapsible on smaller screens
- Highlight the current section on scroll (IntersectionObserver)
- Works in both modes
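If the backend does not provide TOC data, the h2/h3 extraction can be sketched like this (a minimal approach assuming the renderer already puts `id` attributes on headings; the regex shape is illustrative, not an existing helper):

```python
import re

def extract_toc(html: str) -> list[dict]:
    """Pull h2/h3 headings (with id attributes) out of rendered article HTML."""
    toc = []
    for match in re.finditer(r'<h([23])[^>]*\bid="([^"]+)"[^>]*>(.*?)</h\1>', html, re.S):
        level, anchor, text = match.groups()
        toc.append({
            "level": int(level),
            "id": anchor,
            # Strip any inline tags inside the heading text
            "text": re.sub(r"<[^>]+>", "", text).strip(),
        })
    return toc
```

Each entry's `id` is what the sidebar links to (`#setup`), and `level` drives indentation of h3 items under their h2.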

---

## SECTION D: Search Integration for Desktop

1. Test Tantivy search in Tauri:
   - The search command should be wired to the Search component
   - Type in the search bar → results appear
   - Cyrillic text should work (test: "мережа" [network], "антена" [antenna], "LTE")

2. If search doesn't work, debug:
   - Is the search index built on startup? Check Rust logs
   - Are content files found? Check content path resolution
   - Is the query reaching the search command?

3. Search results should show:
   - Page title
   - Brief excerpt (first 150 characters of content)
   - Clicking a result navigates to the page

4. Keyboard shortcut: Ctrl+K should focus the search bar (already exists in web, verify in desktop)

---

## SECTION E: Content Quality Pass

### E.1: Content Audit Script

Create `scripts/analyze_content.py`:
- Scan all .md files in content/
- For each file report: has_frontmatter, word_count, has_code_blocks, grade, broken_wiki_links
- Summary: total articles, by grade, articles needing work
- Print actionable output
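The per-file report could look roughly like this (a sketch: the frontmatter layout and the slug-resolution rule for wiki-links — lowercase, spaces to dashes — are assumptions to be matched against the real `wikilinks.py` logic):

```python
import re
from pathlib import Path

WIKI_LINK = re.compile(r"\[\[([^\]|#]+)")

def audit_file(path: Path, known_slugs: set[str]) -> dict:
    """Report basic quality signals for one markdown article."""
    text = path.read_text(encoding="utf-8")
    has_fm = text.startswith("---")
    grade = None
    if has_fm:
        m = re.search(r"^grade:\s*(\S+)", text, re.M)
        grade = m.group(1) if m else None
    # Drop the frontmatter block before counting words
    body = re.sub(r"\A---.*?^---\s*", "", text, flags=re.S | re.M) if has_fm else text
    links = [l.strip() for l in WIKI_LINK.findall(body)]
    broken = [l for l in links if l.lower().replace(" ", "-") not in known_slugs]
    return {
        "file": path.name,
        "has_frontmatter": has_fm,
        "word_count": len(body.split()),
        "has_code_blocks": "```" in body,
        "grade": grade,
        "broken_wiki_links": broken,
    }
```

The summary pass is then just grouping these dicts by `grade` and flagging files with `broken_wiki_links` or no frontmatter.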

### E.2: Add More Glossary Terms (20 more)

Create glossary entries with proper frontmatter (grade: B, category: glossary):

**Radio & core network terms:**
- SGW (Serving Gateway)
- PGW (PDN Gateway)
- HSS (Home Subscriber Server)
- RSRP (Reference Signal Received Power)
- RSRQ (Reference Signal Received Quality)
- SINR (Signal to Interference plus Noise Ratio)
- EARFCN (E-UTRA Absolute Radio Frequency Channel Number)
- OFDM (Orthogonal Frequency Division Multiplexing)
- MIMO (Multiple Input Multiple Output)
- QoS (Quality of Service)

**Infrastructure terms:**
- WireGuard
- MikroTik
- Mesh Network
- VLAN (Virtual LAN)
- BGP (Border Gateway Protocol)
- mTLS (Mutual TLS)
- Caddy (Web Server)

**Protocol terms:**
- S1AP (S1 Application Protocol)
- GTP (GPRS Tunnelling Protocol)
- SCTP (Stream Control Transmission Protocol)

Each glossary term should:
- Have its title in English with a Ukrainian description
- Use [[wiki-links]] to cross-reference other terms
- Include: what it is, why it matters for UMTC, key parameters
- Be 100-300 words
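Stubbing 20 entries by hand is tedious, so a small generator helps keep the frontmatter consistent. This is a sketch only: the `grade`/`status`/`category` fields follow the plan above, while the `updated` field and exact layout are illustrative assumptions:

```python
from datetime import date

def glossary_stub(term: str, full_name: str, description_uk: str,
                  related: list[str]) -> str:
    """Build a markdown skeleton for one glossary entry."""
    see_also = "\n".join(f"- [[{r}]]" for r in related)
    return (
        "---\n"
        f"title: {term}\n"
        "grade: B\n"
        "status: draft\n"
        "category: glossary\n"
        f"updated: {date.today().isoformat()}\n"
        "---\n\n"
        f"# {term} ({full_name})\n\n"
        f"{description_uk}\n\n"
        "## Див. також\n\n"
        f"{see_also}\n"
    )
```

Loop it over a (term, full name, description, related) table and write each result to `content/glossary/{term.lower()}.md`.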

### E.3: Upgrade 5 Key Articles to Grade B

Pick the 5 most important articles and upgrade them:
- Add proper frontmatter with all fields
- Add :::note and :::warning admonitions where useful
- Add [[wiki-links]] to glossary terms
- Add a "Див. також" (See also) section with related articles
- Verify technical accuracy
- Set grade: B

Good candidates:
- Main LTE overview
- srsENB configuration
- WireGuard setup
- Open5GS overview
- MikroTik basics

---

## SECTION F: Desktop App Polish

### F.1: Window Title

Show the current page title in the window title bar:

`UMTC Wiki — {Page Title}`

### F.2: Keyboard Navigation

- Arrow keys navigate the sidebar
- Enter opens the selected item
- Backspace goes back
- Ctrl+K opens search (verify)

### F.3: Error Handling

- If a page is not found, show a friendly Ukrainian message instead of a generic 404
- If the content directory is missing, show setup instructions
- If the search index fails to build, log the error but don't crash

### F.4: About Dialog

Add a simple about/info dialog accessible from a gear icon or Help menu:
- UMTC Wiki v2.0
- Built with Tauri + SvelteKit + Rust
- Number of content articles
- "Офлайн документація для УМТЗ" ("Offline documentation for UMTC")

---

## SECTION G: Production Builds

### G.1: Web Docker Build Test

Update docker-compose.yml if needed to include the new backend modules.
Make sure the Dockerfile copies:
- backend/wiki_frontmatter.py
- backend/wikilinks.py
- backend/backlinks.py
- backend/admonitions.py
- All content/ files

### G.2: Tauri Production Build

Run `npx tauri build` and fix any remaining compilation errors.
Report the output binary size and location.

---

## Order of Operations

Do these in order — each section builds on the previous:

1. **SECTION A** — Fix 404 bug (CRITICAL, everything depends on this)
2. **SECTION B** — Verify web backend
3. **SECTION C** — Frontend components
4. **SECTION D** — Search
5. **SECTION E** — Content
6. **SECTION F** — Desktop polish
7. **SECTION G** — Production builds

Report after each section with:
- What was done
- What files were changed
- Any issues found
- Ready for next section?

Think hard about edge cases. Don't break existing functionality.
Good luck! 🚀

@@ -180,3 +180,93 @@ async def get_terrain_file(region: str):
    if os.path.exists(terrain_path):
        return FileResponse(terrain_path)
    raise HTTPException(status_code=404, detail=f"Region '{region}' not found")


@router.get("/status")
async def terrain_status():
    """Return terrain data availability info."""
    cached_tiles = terrain_service.get_cached_tiles()
    cache_size = terrain_service.get_cache_size_mb()

    # Categorize by resolution based on file size
    srtm1_tiles = []
    srtm3_tiles = []
    for t in cached_tiles:
        tile_path = terrain_service.terrain_path / f"{t}.hgt"
        try:
            if tile_path.stat().st_size == 3601 * 3601 * 2:
                srtm1_tiles.append(t)
            else:
                srtm3_tiles.append(t)
        except Exception:
            pass

    return {
        "total_tiles": len(cached_tiles),
        "srtm1": {
            "count": len(srtm1_tiles),
            "resolution_m": 30,
            "tiles": sorted(srtm1_tiles),
        },
        "srtm3": {
            "count": len(srtm3_tiles),
            "resolution_m": 90,
            "tiles": sorted(srtm3_tiles),
        },
        "cache_size_mb": round(cache_size, 1),
        "memory_cached": len(terrain_service._tile_cache),
        "terra_server": "https://terra.eliah.one",
    }


@router.post("/download")
async def terrain_download(request: dict):
    """Pre-download tiles for a region.

    Body: {"center_lat": 48.46, "center_lon": 35.04, "radius_km": 50}
    Or: {"tiles": ["N48E034", "N48E035", "N47E034", "N47E035"]}
    """
    if "tiles" in request:
        tile_list = request["tiles"]
    else:
        center_lat = request.get("center_lat", 48.46)
        center_lon = request.get("center_lon", 35.04)
        radius_km = request.get("radius_km", 50)
        tile_list = terrain_service.get_required_tiles(center_lat, center_lon, radius_km)

    missing = [t for t in tile_list if not terrain_service.get_tile_path(t).exists()]

    if not missing:
        return {"status": "ok", "message": "All tiles already cached", "count": len(tile_list)}

    # Download missing tiles
    downloaded = []
    failed = []
    for tile_name in missing:
        success = await terrain_service.download_tile(tile_name)
        if success:
            downloaded.append(tile_name)
        else:
            failed.append(tile_name)

    return {
        "status": "ok",
        "required": len(tile_list),
        "already_cached": len(tile_list) - len(missing),
        "downloaded": downloaded,
        "failed": failed,
    }


@router.get("/index")
async def terrain_index():
    """Fetch tile index from terra server."""
    import httpx
    try:
        async with httpx.AsyncClient(timeout=10.0) as client:
            resp = await client.get("https://terra.eliah.one/api/index")
            if resp.status_code == 200:
                return resp.json()
    except Exception:
        pass
    return {"error": "Could not reach terra.eliah.one", "offline": True}

@@ -526,19 +526,33 @@ class CoverageService:
        progress_fn("Loading terrain", 0.25)
        await asyncio.sleep(0)
        t_terrain = time.time()

        # Check for missing tiles before attempting download
        radius_km = settings.radius / 1000.0
        missing_tiles = self.terrain.get_missing_tiles(site.lat, site.lon, radius_km)
        if missing_tiles:
            _clog(f"⚠ Missing terrain tiles: {missing_tiles} - will attempt download")

        tile_names = await self.terrain.ensure_tiles_for_bbox(
            min_lat, min_lon, max_lat, max_lon
        )
        for tn in tile_names:
            self.terrain._load_tile(tn)

        # Check what actually loaded
        loaded_tiles = [tn for tn in tile_names if tn in self.terrain._tile_cache]
        failed_tiles = [tn for tn in tile_names if tn not in self.terrain._tile_cache]
        if failed_tiles:
            _clog(f"⚠ TERRAIN WARNING: Failed to load tiles {failed_tiles}. "
                  "Coverage accuracy reduced - using flat terrain for affected areas.")

        site_elevation = self.terrain.get_elevation_sync(site.lat, site.lon)

        point_elevations = {}
        for lat, lon in grid:
            point_elevations[(lat, lon)] = self.terrain.get_elevation_sync(lat, lon)
        terrain_time = time.time() - t_terrain
        _clog(f"Tiles: {len(tile_names)}, site elev: {site_elevation:.0f}m, "
        _clog(f"Tiles: {len(loaded_tiles)}/{len(tile_names)} loaded, site elev: {site_elevation:.0f}m, "
              f"pre-computed {len(grid)} elevations")
        _clog(f"━━━ PHASE 2 done: {terrain_time:.1f}s ━━━")

@@ -277,10 +277,11 @@ class GPUService:
        lons: np.ndarray,
        terrain_cache: dict,
    ) -> np.ndarray:
        """Look up elevations from cached terrain tiles.
        """Look up elevations from cached terrain tiles with bilinear interpolation.

        Vectorized implementation: processes per-tile (1-4 tiles) instead of
        per-point (thousands of points). Inner operations are all NumPy vectorized.
        per-point (thousands of points). Uses bilinear interpolation for
        sub-meter accuracy (vs 15m error with nearest-neighbor at 30m resolution).

        Args:
            lats, lons: Flattened arrays of coordinates
@@ -313,16 +314,39 @@ class GPUService:
            tile_lons = lons[mask]

            size = tile.shape[0]
            # Vectorized row/col calculation
            rows = ((1 - (tile_lats - lat_int)) * (size - 1)).astype(int)
            cols = ((tile_lons - lon_int) * (size - 1)).astype(int)
            rows = np.clip(rows, 0, size - 1)
            cols = np.clip(cols, 0, size - 1)

            # Vectorized lookup - single operation for ALL points in tile
            tile_elevs = tile[rows, cols].astype(np.float64)
            tile_elevs[tile_elevs == -32768] = 0.0
            elevations[mask] = tile_elevs
            # Vectorized bilinear interpolation
            lat_frac = tile_lats - lat_int
            lon_frac = tile_lons - lon_int

            row_exact = (1.0 - lat_frac) * (size - 1)
            col_exact = lon_frac * (size - 1)

            r0 = np.clip(row_exact.astype(int), 0, size - 2)
            c0 = np.clip(col_exact.astype(int), 0, size - 2)
            r1 = r0 + 1
            c1 = c0 + 1

            dr = row_exact - r0
            dc = col_exact - c0

            # Get four corner values for all points at once
            z00 = tile[r0, c0].astype(np.float64)
            z01 = tile[r0, c1].astype(np.float64)
            z10 = tile[r1, c0].astype(np.float64)
            z11 = tile[r1, c1].astype(np.float64)

            # Bilinear interpolation (vectorized)
            result = (z00 * (1 - dr) * (1 - dc) +
                      z01 * (1 - dr) * dc +
                      z10 * dr * (1 - dc) +
                      z11 * dr * dc)

            # Handle void values (-32768) - set to 0
            void_mask = (z00 == -32768) | (z01 == -32768) | (z10 == -32768) | (z11 == -32768)
            result[void_mask] = 0.0

            elevations[mask] = result

        return elevations

@@ -20,8 +20,24 @@ class TerrainService:
    """

    SRTM_SOURCES = [
        "https://elevation-tiles-prod.s3.amazonaws.com/skadi/{lat_dir}/{tile_name}.hgt.gz",
        "https://s3.amazonaws.com/elevation-tiles-prod/skadi/{lat_dir}/{tile_name}.hgt.gz",
        # Our tile server — SRTM1 (30m) preferred, uncompressed
        {
            "url": "https://terra.eliah.one/srtm1/{tile_name}.hgt",
            "compressed": False,
            "resolution": "srtm1",
        },
        # Our tile server — SRTM3 (90m) fallback
        {
            "url": "https://terra.eliah.one/srtm3/{tile_name}.hgt",
            "compressed": False,
            "resolution": "srtm3",
        },
        # Public AWS mirror — SRTM1, gzip compressed
        {
            "url": "https://elevation-tiles-prod.s3.amazonaws.com/skadi/{lat_dir}/{tile_name}.hgt.gz",
            "compressed": True,
            "resolution": "srtm1",
        },
    ]

    def __init__(self):
@@ -48,7 +64,7 @@ class TerrainService:
        return self.terrain_path / f"{tile_name}.hgt"

    async def download_tile(self, tile_name: str) -> bool:
        """Download SRTM tile if not cached locally"""
        """Download SRTM tile from configured sources, preferring highest resolution."""
        tile_path = self.get_tile_path(tile_name)

        if tile_path.exists():
@@ -56,33 +72,45 @@ class TerrainService:

        lat_dir = tile_name[:3]  # e.g., "N48"

        async with httpx.AsyncClient(timeout=60.0) as client:
            for source_url in self.SRTM_SOURCES:
                url = source_url.format(lat_dir=lat_dir, tile_name=tile_name)
        async with httpx.AsyncClient(timeout=60.0, follow_redirects=True) as client:
            for source in self.SRTM_SOURCES:
                url = source["url"].format(lat_dir=lat_dir, tile_name=tile_name)
                try:
                    response = await client.get(url)

                    if response.status_code == 200:
                        data = response.content

                        if url.endswith('.gz'):
                            data = gzip.decompress(data)
                        elif url.endswith('.zip'):
                            with zipfile.ZipFile(io.BytesIO(data)) as zf:
                                for name in zf.namelist():
                                    if name.endswith('.hgt'):
                                        data = zf.read(name)
                                        break
                        # Skip empty responses
                        if len(data) < 1000:
                            continue

                        if source["compressed"]:
                            if url.endswith('.gz'):
                                data = gzip.decompress(data)
                            elif url.endswith('.zip'):
                                with zipfile.ZipFile(io.BytesIO(data)) as zf:
                                    for name in zf.namelist():
                                        if name.endswith('.hgt'):
                                            data = zf.read(name)
                                            break

                        # Validate tile size (SRTM1: 25,934,402 bytes, SRTM3: 2,884,802 bytes)
                        if len(data) not in (3601 * 3601 * 2, 1201 * 1201 * 2):
                            print(f"[Terrain] Invalid tile size {len(data)} from {url}")
                            continue

                        tile_path.write_bytes(data)
                        print(f"[Terrain] Downloaded {tile_name} ({len(data)} bytes)")
                        res = source["resolution"]
                        size_mb = len(data) / 1048576
                        print(f"[Terrain] Downloaded {tile_name} ({res}, {size_mb:.1f} MB)")
                        return True

                except Exception as e:
                    print(f"[Terrain] Failed from {url}: {e}")
                    continue

        print(f"[Terrain] Could not download {tile_name}")
        print(f"[Terrain] Could not download {tile_name} from any source")
        return False

    def _load_tile(self, tile_name: str) -> Optional[np.ndarray]:
@@ -149,56 +149,179 @@ class TerrainService:

        return self._load_tile(tile_name)

    def _bilinear_sample(self, tile: np.ndarray, lat: float, lon: float) -> float:
        """Sample elevation with bilinear interpolation for sub-meter accuracy.

        SRTM1 at 30m means nearest-neighbor can have 15m positional error.
        Bilinear interpolation reduces this to sub-meter accuracy.
        """
        size = tile.shape[0]

        # Tile southwest corner
        lat_int = int(lat) if lat >= 0 else int(lat) - 1
        lon_int = int(lon) if lon >= 0 else int(lon) - 1

        # Fractional position within tile (0.0 to 1.0)
        lat_frac = lat - lat_int  # 0 = south edge, 1 = north edge
        lon_frac = lon - lon_int  # 0 = west edge, 1 = east edge

        # Convert to row/col (note: rows go north to south!)
        row_exact = (1.0 - lat_frac) * (size - 1)  # 0 = north, size-1 = south
        col_exact = lon_frac * (size - 1)  # 0 = west, size-1 = east

        # Four surrounding grid points
        r0 = int(row_exact)
        c0 = int(col_exact)
        r1 = min(r0 + 1, size - 1)
        c1 = min(c0 + 1, size - 1)

        # Fractional position between grid points
        dr = row_exact - r0
        dc = col_exact - c0

        # Get four corner values
        z00 = tile[r0, c0]
        z01 = tile[r0, c1]
        z10 = tile[r1, c0]
        z11 = tile[r1, c1]

        # Handle void (-32768) values - fall back to nearest valid
        void_val = -32768
        corners = [(z00, r0, c0), (z01, r0, c1), (z10, r1, c0), (z11, r1, c1)]
        if z00 == void_val or z01 == void_val or z10 == void_val or z11 == void_val:
            valid = [(z, r, c) for z, r, c in corners if z != void_val]
            if not valid:
                return 0.0
            # Return nearest valid value
            return float(valid[0][0])

        # Bilinear interpolation
        elevation = (z00 * (1 - dr) * (1 - dc) +
                     z01 * (1 - dr) * dc +
                     z10 * dr * (1 - dc) +
                     z11 * dr * dc)

        return float(elevation)

    async def get_elevation(self, lat: float, lon: float) -> float:
        """Get elevation at specific coordinate (meters above sea level)"""
        """Get elevation at specific coordinate with bilinear interpolation."""
        tile_name = self.get_tile_name(lat, lon)
        tile = await self.load_tile(tile_name)

        if tile is None:
            return 0.0

        size = tile.shape[0]

        # Calculate position within tile
        lat_int = int(lat) if lat >= 0 else int(lat) - 1
        lon_int = int(lon) if lon >= 0 else int(lon) - 1

        lat_frac = lat - lat_int
        lon_frac = lon - lon_int

        # Row 0 = north edge, last row = south edge
        row = int((1 - lat_frac) * (size - 1))
        col = int(lon_frac * (size - 1))

        row = max(0, min(row, size - 1))
        col = max(0, min(col, size - 1))

        elevation = tile[row, col]

        # -32768 = void/no data
        if elevation == -32768:
            return 0.0

        return float(elevation)
        return self._bilinear_sample(tile, lat, lon)

    def get_elevation_sync(self, lat: float, lon: float) -> float:
        """Sync elevation lookup from memory cache. Returns 0.0 if tile not loaded."""
        """Sync elevation lookup with bilinear interpolation. Returns 0.0 if tile not loaded."""
        tile_name = self.get_tile_name(lat, lon)
        tile = self._tile_cache.get(tile_name)
        if tile is None:
            return 0.0

        size = tile.shape[0]
        lat_int = int(lat) if lat >= 0 else int(lat) - 1
        lon_int = int(lon) if lon >= 0 else int(lon) - 1
        return self._bilinear_sample(tile, lat, lon)

        row = int((1 - (lat - lat_int)) * (size - 1))
        col = int((lon - lon_int) * (size - 1))
        row = max(0, min(row, size - 1))
        col = max(0, min(col, size - 1))
    def get_elevations_batch(self, lats: np.ndarray, lons: np.ndarray) -> np.ndarray:
        """Vectorized elevation lookup with bilinear interpolation.

        elevation = tile[row, col]
        return 0.0 if elevation == -32768 else float(elevation)
        Handles points spanning multiple tiles efficiently.
        Groups points by tile, processes each tile with full NumPy vectorization.
        Tiles must be pre-loaded into memory cache.

        Args:
            lats: Array of latitudes
            lons: Array of longitudes

        Returns:
            Array of elevations (0.0 for missing tiles or void data)
        """
        elevations = np.zeros(len(lats), dtype=np.float32)

        # Compute tile indices for each point
        lat_ints = np.floor(lats).astype(int)
        lon_ints = np.floor(lons).astype(int)

        # Group by tile using unique key
        unique_tiles = set(zip(lat_ints, lon_ints))

        for lat_int, lon_int in unique_tiles:
            # Get tile name
            lat_letter = 'N' if lat_int >= 0 else 'S'
            lon_letter = 'E' if lon_int >= 0 else 'W'
            tile_name = f"{lat_letter}{abs(lat_int):02d}{lon_letter}{abs(lon_int):03d}"

            tile = self._tile_cache.get(tile_name)
            if tile is None:
                continue

            # Mask for points in this tile
            mask = (lat_ints == lat_int) & (lon_ints == lon_int)
            tile_lats = lats[mask]
            tile_lons = lons[mask]

            size = tile.shape[0]

            # Vectorized bilinear interpolation for all points in this tile
            lat_frac = tile_lats - lat_int
            lon_frac = tile_lons - lon_int

            row_exact = (1.0 - lat_frac) * (size - 1)
            col_exact = lon_frac * (size - 1)

            r0 = np.clip(row_exact.astype(int), 0, size - 2)
            c0 = np.clip(col_exact.astype(int), 0, size - 2)
            r1 = r0 + 1
            c1 = c0 + 1

            dr = row_exact - r0
            dc = col_exact - c0

            # Get four corner values for all points at once
            z00 = tile[r0, c0].astype(np.float32)
            z01 = tile[r0, c1].astype(np.float32)
            z10 = tile[r1, c0].astype(np.float32)
            z11 = tile[r1, c1].astype(np.float32)

            # Bilinear interpolation (vectorized)
            result = (z00 * (1 - dr) * (1 - dc) +
                      z01 * (1 - dr) * dc +
                      z10 * dr * (1 - dc) +
                      z11 * dr * dc)

            # Handle void values (-32768) - set to 0
            void_mask = (z00 == -32768) | (z01 == -32768) | (z10 == -32768) | (z11 == -32768)
            result[void_mask] = 0.0

            elevations[mask] = result

        return elevations

    def get_required_tiles(self, center_lat: float, center_lon: float, radius_km: float) -> list:
        """Determine which tiles are needed for a coverage calculation."""
        # Convert radius to degrees (approximate)
        lat_delta = radius_km / 111.0  # ~111 km per degree latitude
        lon_delta = radius_km / (111.0 * np.cos(np.radians(center_lat)))

        min_lat = center_lat - lat_delta
        max_lat = center_lat + lat_delta
        min_lon = center_lon - lon_delta
        max_lon = center_lon + lon_delta

        tiles = []
        for lat in range(int(np.floor(min_lat)), int(np.floor(max_lat)) + 1):
            for lon in range(int(np.floor(min_lon)), int(np.floor(max_lon)) + 1):
                lat_letter = 'N' if lat >= 0 else 'S'
                lon_letter = 'E' if lon >= 0 else 'W'
                tile_name = f"{lat_letter}{abs(lat):02d}{lon_letter}{abs(lon):03d}"
                tiles.append(tile_name)

        return tiles

    def get_missing_tiles(self, center_lat: float, center_lon: float, radius_km: float) -> list:
        """Check which needed tiles are not available locally."""
        required = self.get_required_tiles(center_lat, center_lon, radius_km)
        return [t for t in required if not self.get_tile_path(t).exists()]

    async def get_elevation_profile(
        self,

docs/devlog/gpu_supp/RFCP-3.9.0-SRTM-Terrain-Integration.md
Normal file
436
docs/devlog/gpu_supp/RFCP-3.9.0-SRTM-Terrain-Integration.md
Normal file
@@ -0,0 +1,436 @@
|
# RFCP 3.9.0 — SRTM1 Real Terrain Data Integration

## Context

RFCP currently downloads terrain tiles from an elevation API at runtime.
This works but has limitations:
- Requires an internet connection
- Unknown data source quality
- No offline capability (critical for tactical/field use)
- No control over resolution or caching

Goal: replace it with SRTM1 (30 m resolution) HGT files in an offline-first architecture.

## SRTM1 Data Format

HGT files are dead simple:
- 1°×1° tiles, named by southwest corner: `N48E033.hgt`
- 3601×3601 grid of signed 16-bit integers (big-endian)
- Each value = elevation in meters
- File size: exactly 25,934,402 bytes (3601 × 3601 × 2)
- Row order: north to south (first row = northernmost)
- Column order: west to east
- Adjacent tiles overlap by 1 pixel on shared edges
- Void/no-data value: -32768

Compressed (.hgt.zip): typically ~10-15 MB per tile.
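The row order above is the usual stumbling block: row 0 is the NORTH edge, so the row index runs opposite to latitude. A toy grid makes the convention concrete (the 3×3 `TILE` stands in for a real 3601×3601 big-endian int16 grid):

```python
# A tiny 3x3 stand-in tile; rows run north to south, columns west to east
TILE = [
    [70, 71, 72],  # north row (row 0)
    [60, 61, 62],
    [50, 51, 52],  # south row
]

def hgt_value(tile, lat, lon, sw_lat, sw_lon):
    """Nearest-neighbor lookup showing the HGT index convention:
    the row index is inverted relative to latitude, col 0 is the west edge."""
    size = len(tile)
    row = round((1.0 - (lat - sw_lat)) * (size - 1))
    col = round((lon - sw_lon) * (size - 1))
    return tile[row][col]
```

The tile's southwest corner (lat = sw_lat, lon = sw_lon) lands in the LAST row, first column; the northeast corner in the first row, last column.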
|
||||
|
||||
## Architecture
|
||||
|
||||
### Tile Storage Layout
|
||||
|
||||
```
|
||||
{app_data}/terrain/
|
||||
├── srtm1/ # 30m resolution tiles
|
||||
│ ├── N48E033.hgt # Uncompressed for fast access
|
||||
│ ├── N48E034.hgt
|
||||
│ ├── N48E035.hgt
|
||||
│ └── ...
|
||||
├── tile_index.json # Metadata: available tiles, checksums, dates
|
||||
└── downloads/ # Temporary download staging
|
||||
```
|
||||
|
||||
On Windows, `{app_data}` = the application's data directory.
|
||||
For PyInstaller exe: `data/terrain/` relative to exe location.
|
||||
The path must be configurable (environment variable or config file).
|
||||
|
||||
### Tile Manager (new file: `terrain_manager.py`)
|
||||
|
||||
```python
|
||||
class SRTMTileManager:
|
||||
"""Manages SRTM1 HGT tile storage, loading, and caching."""
|
||||
|
||||
def __init__(self, terrain_dir: str):
|
||||
self.terrain_dir = Path(terrain_dir)
|
||||
self.srtm1_dir = self.terrain_dir / "srtm1"
|
||||
self.srtm1_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# In-memory cache: tile_name -> numpy array
|
||||
self._tile_cache: Dict[str, np.ndarray] = {}
|
||||
self._max_cache_tiles = 16 # ~16 tiles = ~400 MB RAM
|
||||
|
||||
def get_tile_name(self, lat: float, lon: float) -> str:
|
||||
"""Convert lat/lon to SRTM tile name."""
|
||||
# Floor to get southwest corner
|
||||
lat_int = int(lat) if lat >= 0 else int(lat) - 1
|
||||
lon_int = int(lon) if lon >= 0 else int(lon) - 1
|
||||
|
||||
lat_prefix = "N" if lat_int >= 0 else "S"
|
||||
lon_prefix = "E" if lon_int >= 0 else "W"
|
||||
|
||||
return f"{lat_prefix}{abs(lat_int):02d}{lon_prefix}{abs(lon_int):03d}"
|
||||
|
||||
def get_required_tiles(self, center_lat, center_lon, radius_km) -> List[str]:
|
||||
"""Determine which tiles are needed for a coverage calculation."""
|
||||
# Calculate bounding box from center + radius
|
||||
# Return list of tile names
|
||||
|
||||
def has_tile(self, tile_name: str) -> bool:
|
||||
"""Check if tile exists locally."""
|
||||
return (self.srtm1_dir / f"{tile_name}.hgt").exists()
|
||||
|
||||
def load_tile(self, tile_name: str) -> Optional[np.ndarray]:
|
||||
"""Load tile from disk into memory. Returns 3601x3601 int16 array."""
|
||||
if tile_name in self._tile_cache:
|
||||
return self._tile_cache[tile_name]
|
||||
|
||||
hgt_path = self.srtm1_dir / f"{tile_name}.hgt"
|
||||
if not hgt_path.exists():
|
||||
return None
|
||||
|
||||
# Read raw HGT: big-endian signed 16-bit
|
||||
data = np.fromfile(str(hgt_path), dtype='>i2')
|
||||
tile = data.reshape((3601, 3601))
|
||||
|
||||
# Replace void values
|
||||
tile = tile.astype(np.float32)
|
||||
tile[tile == -32768] = np.nan
|
||||
|
||||
# Cache management (LRU-style: evict oldest if full)
|
||||
if len(self._tile_cache) >= self._max_cache_tiles:
|
||||
oldest_key = next(iter(self._tile_cache))
|
||||
del self._tile_cache[oldest_key]
|
||||
|
||||
self._tile_cache[tile_name] = tile
|
||||
return tile
|
||||
|
||||
def get_elevation(self, lat: float, lon: float) -> Optional[float]:
|
||||
"""Get elevation at a single point with bilinear interpolation."""
|
||||
tile_name = self.get_tile_name(lat, lon)
|
||||
tile = self.load_tile(tile_name)
|
||||
if tile is None:
|
||||
return None
|
||||
return self._bilinear_sample(tile, lat, lon)
|
||||
|
||||
def get_elevations_batch(self, lats: np.ndarray, lons: np.ndarray) -> np.ndarray:
|
||||
"""Get elevations for array of points. Vectorized."""
|
||||
# Group points by tile
|
||||
# Load needed tiles
|
||||
# Vectorized bilinear interpolation per tile
|
||||
# Return array of elevations
|
||||
|
||||
async def download_tile(self, tile_name: str) -> bool:
|
||||
"""Download a single tile from remote source (if online)."""
|
||||
# Try multiple sources in order:
|
||||
# 1. Own server (future: UMTC sync endpoint)
|
||||
# 2. srtm.fasma.org (no auth required)
|
||||
# 3. viewfinderpanoramas.org (no auth, void-filled)
|
||||
# Returns True if successful
|
||||
|
||||
def get_missing_tiles(self, center_lat, center_lon, radius_km) -> List[str]:
|
||||
"""Check which needed tiles are not available locally."""
|
||||
required = self.get_required_tiles(center_lat, center_lon, radius_km)
|
||||
return [t for t in required if not self.has_tile(t)]
|
||||
```
|
||||
|
||||
### Bilinear Interpolation (CRITICAL for accuracy)

The current system uses nearest-neighbor sampling (pick the closest grid cell). With SRTM1's 30 m grid, nearest-neighbor can be off by up to 15 m in position, producing stair-step artifacts. Bilinear interpolation removes these artifacts and yields smoothly varying elevations between grid points.

```python
def _bilinear_sample(self, tile: np.ndarray, lat: float, lon: float) -> float:
    """Sample elevation with bilinear interpolation."""
    # Tile southwest corner (floor handles negative coordinates correctly)
    lat_int = int(np.floor(lat))
    lon_int = int(np.floor(lon))

    # Fractional position within tile (0.0 to 1.0)
    lat_frac = lat - lat_int  # 0 = south edge, 1 = north edge
    lon_frac = lon - lon_int  # 0 = west edge, 1 = east edge

    # Convert to row/col (note: rows run north to south!)
    row_exact = (1.0 - lat_frac) * 3600.0  # 0 = north, 3600 = south
    col_exact = lon_frac * 3600.0          # 0 = west, 3600 = east

    # Four surrounding grid points
    r0 = int(row_exact)
    c0 = int(col_exact)
    r1 = min(r0 + 1, 3600)
    c1 = min(c0 + 1, 3600)

    # Fractional position between grid points
    dr = row_exact - r0
    dc = col_exact - c0

    # Bilinear interpolation
    z00 = tile[r0, c0]
    z01 = tile[r0, c1]
    z10 = tile[r1, c0]
    z11 = tile[r1, c1]

    # Handle NaN (void) values
    if np.isnan(z00) or np.isnan(z01) or np.isnan(z10) or np.isnan(z11):
        # Fall back to the first non-NaN neighbor
        valid = [z for z in (z00, z01, z10, z11) if not np.isnan(z)]
        return float(valid[0]) if valid else 0.0

    elevation = (z00 * (1 - dr) * (1 - dc) +
                 z01 * (1 - dr) * dc +
                 z10 * dr * (1 - dc) +
                 z11 * dr * dc)
    return float(elevation)
```

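The four-point weighting can be sanity-checked on a synthetic tile: on a planar surface, bilinear interpolation reconstructs the plane exactly. A standalone check (the `bilinear` helper below mirrors the formula above, taking row/col directly rather than lat/lon):

```python
import numpy as np

def bilinear(tile: np.ndarray, row_exact: float, col_exact: float) -> float:
    # Same four-point weighting as _bilinear_sample above.
    r0, c0 = int(row_exact), int(col_exact)
    r1, c1 = min(r0 + 1, tile.shape[0] - 1), min(c0 + 1, tile.shape[1] - 1)
    dr, dc = row_exact - r0, col_exact - c0
    return float(tile[r0, c0] * (1 - dr) * (1 - dc) +
                 tile[r0, c1] * (1 - dr) * dc +
                 tile[r1, c0] * dr * (1 - dc) +
                 tile[r1, c1] * dr * dc)

# On a plane where elevation equals the column index, interpolation is exact:
plane = np.tile(np.arange(11, dtype=np.float32), (11, 1))
assert abs(bilinear(plane, 4.3, 7.25) - 7.25) < 1e-5
```

Nearest-neighbor on the same plane would return 7.0, an error of a quarter cell.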
### Vectorized Batch Elevation (for GPU pipeline)

This replaces the current `_batch_elevation_lookup` in gpu_service.py and must handle multi-tile lookups seamlessly.

```python
def get_elevations_batch(self, lats: np.ndarray, lons: np.ndarray) -> np.ndarray:
    """Vectorized elevation lookup with bilinear interpolation.

    Handles points spanning multiple tiles efficiently.
    Groups points by tile, processes each tile with full NumPy vectorization.
    """
    elevations = np.zeros(len(lats), dtype=np.float32)

    # Tile index (southwest corner) per point; floor handles negatives
    lat_ints = np.floor(lats).astype(int)
    lon_ints = np.floor(lons).astype(int)

    # Group by tile
    tile_keys = lat_ints * 1000 + lon_ints  # unique key per tile
    unique_keys = np.unique(tile_keys)

    for key in unique_keys:
        mask = tile_keys == key
        lon_int = int(key % 1000)  # Python/NumPy % is non-negative here
        if lon_int > 500:          # decode negative longitudes
            lon_int -= 1000
        lat_int = int((key - lon_int) // 1000)

        tile_name = self._make_tile_name(lat_int, lon_int)
        tile = self.load_tile(tile_name)

        if tile is None:
            elevations[mask] = 0.0  # no data
            continue

        # Vectorized bilinear for all points in this tile
        tile_lats = lats[mask]
        tile_lons = lons[mask]

        lat_frac = tile_lats - lat_int
        lon_frac = tile_lons - lon_int

        row_exact = (1.0 - lat_frac) * 3600.0
        col_exact = lon_frac * 3600.0

        r0 = np.clip(row_exact.astype(int), 0, 3599)
        c0 = np.clip(col_exact.astype(int), 0, 3599)
        r1 = np.clip(r0 + 1, 0, 3600)
        c1 = np.clip(c0 + 1, 0, 3600)

        dr = row_exact - r0
        dc = col_exact - c0

        z00 = tile[r0, c0]
        z01 = tile[r0, c1]
        z10 = tile[r1, c0]
        z11 = tile[r1, c1]

        result = (z00 * (1 - dr) * (1 - dc) +
                  z01 * (1 - dr) * dc +
                  z10 * dr * (1 - dc) +
                  z11 * dr * dc)

        # Handle NaN voids
        nan_mask = np.isnan(result)
        if nan_mask.any():
            result[nan_mask] = 0.0

        elevations[mask] = result

    return elevations
```

## Integration Points

### 1. Replace terrain_service.py elevation lookup

The current terrain service downloads elevation data from an API. Replace those calls with SRTMTileManager:

```python
# OLD:
elevation = await self.terrain_service.get_elevation(lat, lon)

# NEW:
elevation = self.tile_manager.get_elevation(lat, lon)
# Or for batch (GPU pipeline Phase 2.6):
elevations = self.tile_manager.get_elevations_batch(lats_array, lons_array)
```

### 2. Replace _batch_elevation_lookup in gpu_service.py

The vectorized elevation lookup in gpu_service.py currently loads tiles and does nearest-neighbor sampling. Replace it with tile_manager.get_elevations_batch(), which does bilinear interpolation.

### 3. Coverage service pre-check

Before starting a calculation, check that all needed tiles are available:

```python
missing = self.tile_manager.get_missing_tiles(site_lat, site_lon, radius_km)
if missing:
    if has_internet:
        # Try to download missing tiles
        for tile_name in missing:
            await self.tile_manager.download_tile(tile_name)
    else:
        # Return warning to frontend
        return {"warning": f"Missing terrain tiles: {missing}. Using flat terrain."}
```

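The `get_required_tiles` helper this pre-check relies on can be approximated with a bounding box in degrees. A sketch, assuming a spherical-earth conversion (~111 km per degree of latitude):

```python
import math

def get_required_tiles(center_lat: float, center_lon: float,
                       radius_km: float) -> list:
    """Tiles covering a bounding box around the site (spherical-earth approx)."""
    dlat = radius_km / 111.0  # ~111 km per degree of latitude
    # Longitude degrees shrink with latitude; clamp cos to avoid div-by-zero
    dlon = radius_km / (111.0 * max(math.cos(math.radians(center_lat)), 0.01))
    tiles = []
    for lat_int in range(math.floor(center_lat - dlat),
                         math.floor(center_lat + dlat) + 1):
        for lon_int in range(math.floor(center_lon - dlon),
                             math.floor(center_lon + dlon) + 1):
            ns = 'N' if lat_int >= 0 else 'S'
            ew = 'E' if lon_int >= 0 else 'W'
            tiles.append(f"{ns}{abs(lat_int):02d}{ew}{abs(lon_int):03d}")
    return tiles
```

For a 10 km radius around Dnipro (≈48.45 N, 34.98 E) this yields two tiles, N48E034 and N48E035, since the box straddles the 35° meridian.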
### 4. Frontend notification

When tiles are missing, show a warning banner:
"⚠ Terrain data not available for this area. Coverage accuracy reduced."

When tiles are being downloaded:
"⬇ Downloading terrain data... (N48E033.hgt, 12.5 MB)"

### 5. Terrain Profile Viewer

The terrain profile viewer should use the same tile_manager for consistent elevation data. With bilinear interpolation, profiles will be much smoother and more accurate.

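Profile sampling can reuse the batch lookup: interpolate lat/lon between the endpoints and feed the arrays to `get_elevations_batch`. A sketch (linear interpolation in degrees, adequate for paths of a few tens of km; longer paths would want geodesic interpolation):

```python
import numpy as np

def profile_points(lat1: float, lon1: float,
                   lat2: float, lon2: float, n: int = 200):
    """Evenly spaced lat/lon samples between two endpoints."""
    t = np.linspace(0.0, 1.0, n)
    lats = lat1 + t * (lat2 - lat1)
    lons = lon1 + t * (lon2 - lon1)
    return lats, lons

# Usage (hypothetical):
#   lats, lons = profile_points(48.45, 34.9, 48.50, 35.1)
#   elevations = tile_manager.get_elevations_batch(lats, lons)
```

Because both endpoints and the midpoints go through the same bilinear-interpolated tile data, a profile crossing a tile boundary stays continuous.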
## Download Sources (Priority Order)

For auto-download when online:

1. **srtm.fasma.org** (no auth, direct HGT.zip download)
   URL: `https://srtm.fasma.org/N48E033.SRTMGL1.hgt.zip`
   - Free, no registration
   - SRTM1 (30m) data
   - May be slow or unreliable

2. **viewfinderpanoramas.org** (no auth, void-filled data)
   URL: `http://viewfinderpanoramas.org/dem1/{region}/{tile}.hgt.zip`
   - Free, no registration
   - Void areas filled from topographic maps
   - Better quality in mountainous areas
   - File naming may differ by region

3. **Future: UMTC sync server**
   URL: `https://rfcp.{your-domain}/api/terrain/tiles/{tile_name}.hgt`
   - Self-hosted on your infrastructure
   - Accessible via WireGuard mesh
   - Can be pre-populated with the full Ukraine dataset

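A possible shape for the download path, split so the unzip step is testable offline. The `fasma_url` / `extract_hgt` / `download_tile_sync` names are illustrative assumptions; the real `download_tile` would be async and fall back to the other sources in the order above:

```python
import io
import urllib.request
import zipfile
from pathlib import Path

HGT_BYTES = 3601 * 3601 * 2  # expected size of a valid SRTM1 .hgt file

def fasma_url(tile_name: str) -> str:
    # Source 1 above; other sources would be appended in priority order.
    return f"https://srtm.fasma.org/{tile_name}.SRTMGL1.hgt.zip"

def extract_hgt(zip_payload: bytes, tile_name: str, dest_dir: Path) -> bool:
    """Unpack the .hgt member from a downloaded zip into the tile dir."""
    with zipfile.ZipFile(io.BytesIO(zip_payload)) as zf:
        for member in zf.namelist():
            if member.endswith(".hgt"):
                (dest_dir / f"{tile_name}.hgt").write_bytes(zf.read(member))
                return True
    return False

def download_tile_sync(tile_name: str, dest_dir: Path) -> bool:
    """Blocking download sketch; a production version would also verify the
    extracted file size against HGT_BYTES before trusting it."""
    try:
        with urllib.request.urlopen(fasma_url(tile_name), timeout=30) as resp:
            payload = resp.read()
    except OSError:
        return False
    return extract_hgt(payload, tile_name, dest_dir)
```

Renaming the extracted member to `{tile_name}.hgt` also normalizes sources whose internal file naming differs.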
## Offline Bundle Strategy

For installer / field deployment:

### Option A: Region packs
Pre-package tiles by operational area:
- `terrain-dnipro.zip` — 4 tiles around the Dnipro area (~100 MB)
- `terrain-ukraine-east.zip` — ~50 tiles, eastern Ukraine (~1.2 GB)
- `terrain-ukraine-full.zip` — ~171 tiles, all of Ukraine (~4.3 GB)

### Option B: On-demand with cache
Ship empty; download tiles as needed on first calculation and cache them permanently. Works well for development/testing.

### Option C: Live USB bundle
For tactical deployment, include the full Ukraine terrain dataset on the live USB alongside the application. 4.3 GB is acceptable for a USB drive.

Recommendation: **Option B for now** (development), **Option C for deployment**.

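Option A packs could be produced straight from an already-populated tile directory; a sketch (the `build_region_pack` helper is hypothetical):

```python
import zipfile
from pathlib import Path

def build_region_pack(tile_names, terrain_dir: Path, out_zip: Path) -> int:
    """Bundle cached .hgt tiles into a region pack zip. Returns tiles packed;
    tiles missing from terrain_dir are silently skipped."""
    packed = 0
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in tile_names:
            hgt = terrain_dir / f"{name}.hgt"
            if hgt.exists():
                zf.write(hgt, arcname=hgt.name)
                packed += 1
    return packed
```

Deflate compresses flat-terrain tiles well, which is why ~171 raw tiles (~4.3 GB) can ship as smaller zips.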
## File Changes

### New Files
- `backend/app/services/terrain_manager.py` — SRTMTileManager class

### Modified Files
- `backend/app/services/terrain_service.py` — Replace API calls with tile_manager
- `backend/app/services/gpu_service.py` — Replace _batch_elevation_lookup
- `backend/app/services/coverage_service.py` — Add missing tile pre-check
- `backend/app/main.py` — Initialize tile_manager on startup

### Config
- Add `TERRAIN_DIR` environment variable / config option
- Default: `./data/terrain` relative to the backend executable

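A minimal sketch of the config lookup (the real app may resolve this through its existing settings machinery instead):

```python
import os
from pathlib import Path

def terrain_dir() -> Path:
    """Resolve the terrain tile directory from the environment,
    falling back to the documented default."""
    return Path(os.environ.get("TERRAIN_DIR", "./data/terrain"))
```

On startup the tile manager would call `terrain_dir().mkdir(parents=True, exist_ok=True)` so downloads have somewhere to land.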
## Testing

```powershell
# Build and test
cd D:\root\rfcp\backend
pyinstaller ..\installer\rfcp-server-gpu.spec --noconfirm
.\dist\rfcp-server\rfcp-server.exe
```

### Test 1: First run (no tiles cached)
- Start the app, trigger a calculation
- Should attempt to download the required tile(s)
- If online: downloads, caches, calculates
- If offline: warning, flat-terrain fallback

### Test 2: Cached tiles
- Run the same calculation again
- Tile loaded from disk cache, no download
- Should be fast (tile load from disk < 100 ms)

### Test 3: Accuracy comparison
- Compare elevation at known points (e.g., Dnipro city center)
- Cross-reference with Google Earth elevation
- Expected accuracy: ±20 m horizontal, ±16 m vertical (SRTM absolute accuracy spec, 90% confidence)

### Test 4: Multi-tile calculation
- Set radius to 50 km+ to span multiple tiles
- Verify seamless stitching at tile boundaries
- No elevation jumps or artifacts at edges

### Test 5: Terrain profile
- Draw a terrain profile across a tile boundary
- Should be smooth, no discontinuity
- Compare with a Google Earth profile for the same path

### Test 6: Performance
- Tile load time from disk: <100 ms
- Batch elevation lookup (6000 points): <50 ms
- Should not regress overall calculation time
- Memory: ~52 MB per loaded tile (3601×3601 float32), max 16 tiles ≈ 830 MB

## What NOT to Change

- Don't modify the GPU pipeline architecture (Phase 2.5/2.6/2.7)
- Don't change propagation model math
- Don't change API endpoints or response formats
- Don't change frontend map or heatmap rendering
- Don't change OSM building/vegetation fetching
- Don't change the PyInstaller build process (just add the data dir)

## Success Criteria

- [ ] SRTM1 tiles load correctly (3601×3601, 30 m resolution)
- [ ] Bilinear interpolation working (smoother than nearest-neighbor)
- [ ] Offline mode works with pre-cached tiles
- [ ] Auto-download works when online
- [ ] Missing-tile warning shown to user
- [ ] Multi-tile seamless stitching
- [ ] Terrain profile accuracy matches Google Earth within 20 m
- [ ] No performance regression (calculation time same or faster)
- [ ] Tile cache directory configurable