Use Netryx to index street-view panoramas and geolocate any street-level photo to precise GPS coordinates using CosPlace, ALIKED/DISK, and LightGlue.
Install the skill via skill4agent:

```shell
npx skill4agent add aradotso/trending-skills netryx-street-level-geolocation
```

Skill by ara.so, from the Daily 2026 Skills collection.
```shell
git clone https://github.com/sparkyniner/Netryx-OpenSource-Next-Gen-Street-Level-Geolocation.git
cd Netryx-OpenSource-Next-Gen-Street-Level-Geolocation
python3 -m venv venv
source venv/bin/activate               # Windows: venv\Scripts\activate
pip install -r requirements.txt
pip install git+https://github.com/cvg/LightGlue.git   # required
pip install kornia                     # optional: Ultra Mode (LoFTR)
export GEMINI_API_KEY="your_key_here"  # never hard-code; use an env var
brew install python-tk@3.11            # macOS: match your Python version
python test_super.py
```

Project layout:

```
netryx/
├── test_super.py          # Main app: GUI + indexing + search pipeline
├── cosplace_utils.py      # CosPlace model loading & descriptor extraction
├── build_index.py         # Standalone high-performance index builder
├── requirements.txt
├── cosplace_parts/        # Raw embedding chunks written during indexing
└── index/
    ├── cosplace_descriptors.npy   # All 512-dim descriptors (compiled)
    └── metadata.npz               # Coordinates, headings, panoid IDs
```

Indexing takes a centre point (latitude, longitude) and a radius; rough costs:

| Radius | ~Panoramas | Time (M2 Max) | Index size |
|---|---|---|---|
| 0.5 km | 500 | 30 min | ~60 MB |
| 1 km | 2 000 | 1–2 h | ~250 MB |
| 5 km | 30 000 | 8–12 h | ~3 GB |
| 10 km | 100 000 | 24–48 h | ~7 GB |
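As a sanity check on the table above, the compiled descriptor matrix alone is N × 512 float32 values; the on-disk index sizes also include metadata and the raw `cosplace_parts/` chunks, so they run larger than this lower bound. A quick estimate:

```python
# Rough size of the compiled descriptor matrix alone (float32, 512-dim).
# The on-disk sizes in the table above also include metadata and raw
# cosplace_parts/ chunks, so they are larger than this lower bound.
def descriptor_megabytes(n_panoramas: int, dim: int = 512, dtype_bytes: int = 4) -> float:
    return n_panoramas * dim * dtype_bytes / 1e6

print(descriptor_megabytes(2_000))    # 1 km tier: 4.096 MB of raw descriptors
print(descriptor_megabytes(30_000))   # 5 km tier: 61.44 MB of raw descriptors
```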
```python
from cosplace_utils import load_cosplace_model, extract_descriptor
from PIL import Image
import torch

device = torch.device("cuda" if torch.cuda.is_available() else
                      "mps" if torch.backends.mps.is_available() else "cpu")

model = load_cosplace_model(device=device)
img = Image.open("query.jpg").convert("RGB")
descriptor = extract_descriptor(model, img, device=device)  # shape: (512,)
print("Descriptor shape:", descriptor.shape)
```

Load the compiled index and search within a radius of a coarse location:

```python
import numpy as np
from math import radians, sin, cos, sqrt, atan2

# Load compiled index
descriptors = np.load("index/cosplace_descriptors.npy")      # (N, 512)
meta = np.load("index/metadata.npz", allow_pickle=True)
latitudes = meta["latitudes"]    # (N,)
longitudes = meta["longitudes"]  # (N,)
headings = meta["headings"]      # (N,)
panoids = meta["panoids"]        # (N,)

def haversine_km(lat1, lon1, lat2, lon2):
    R = 6371.0
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat/2)**2 + cos(radians(lat1))*cos(radians(lat2))*sin(dlon/2)**2
    return R * 2 * atan2(sqrt(a), sqrt(1 - a))

def search_index(query_descriptor, center_lat, center_lon,
                 radius_km=2.0, top_k=500):
    """Return top-k candidate indices ranked by cosine similarity within radius."""
    # Radius mask
    dists = np.array([
        haversine_km(center_lat, center_lon, lat, lon)
        for lat, lon in zip(latitudes, longitudes)
    ])
    mask = dists <= radius_km

    # Cosine similarity (descriptors assumed L2-normalised)
    q = query_descriptor / (np.linalg.norm(query_descriptor) + 1e-8)
    sims = descriptors[mask] @ q  # cosine scores
    local_indices = np.where(mask)[0]
    ranked = local_indices[np.argsort(sims)[::-1]]  # descending
    return ranked[:top_k]

candidates = search_index(descriptor, center_lat=48.8566,
                          center_lon=2.3522, radius_km=1.0)
print(f"Top candidate panoid: {panoids[candidates[0]]}")
print(f"  lat={latitudes[candidates[0]]:.6f}  lon={longitudes[candidates[0]]:.6f}")
```

To rebuild the compiled index:

```python
# Run after adding new cosplace_parts/*.npz chunks
import subprocess
subprocess.run(["python", "build_index.py"], check=True)
```

Alternatively, load `build_index.py` directly as a module:

```python
import importlib.util, pathlib

spec = importlib.util.spec_from_file_location("build_index",
                                              pathlib.Path("build_index.py"))
build_index = importlib.util.module_from_spec(spec)
spec.loader.exec_module(build_index)
build_index.build()  # adjust to the actual function name in the file
```

For each candidate, Stage 2 runs:
1. Download panorama tiles from Street View (8 tiles, stitched)
2. Crop rectilinear view at indexed heading
3. Generate multi-FOV crops: 70°, 90°, 110°
4. Extract keypoints:
   - CUDA → ALIKED (1024 keypoints)
   - MPS/CPU → DISK (768 keypoints)
5. LightGlue deep feature matching vs. query keypoints
6. RANSAC geometric verification → inlier count

Refinement and scoring:

- Heading refinement: top-15 candidates × ±45° at 15° steps × 3 FOVs
- Spatial consensus: cluster matches into 50 m cells; prefer dense clusters
- Confidence score: geographic clustering tightness + uniqueness ratio

Key tuning parameters in `test_super.py`:

| Parameter | Default | Effect |
|---|---|---|
| Grid resolution | 300 | Panorama density during indexing; don't change |
| Radius (search) | user-set | Haversine filter radius in km |
| Top-K candidates | 500–1000 | Candidates passed to Stage 2 |
| Heading steps | ±45° / 15° | Refinement sweep range |
| Spatial cell size | 50 m | Consensus clustering granularity |
| Neighbourhood expansion | 100 m | Ultra Mode only |
| Weak match threshold | 50 inliers | Triggers descriptor hopping in Ultra Mode |
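The 50 m spatial-consensus cell from the table can be sketched as a simple grid hash. `cell_id` and `best_cluster` below are hypothetical helpers for illustration, not the app's actual functions:

```python
import math
from collections import Counter

# Hypothetical sketch of the 50 m spatial-consensus step: bucket candidate
# matches into ~50 m grid cells, weight each cell by inlier count, and
# pick the densest cluster.
def cell_id(lat: float, lon: float, cell_m: float = 50.0) -> tuple:
    # ~111,320 m per degree of latitude; longitude shrinks by cos(lat)
    y = int(lat * 111_320 / cell_m)
    x = int(lon * 111_320 * math.cos(math.radians(lat)) / cell_m)
    return (x, y)

def best_cluster(matches):
    """matches: iterable of (lat, lon, inlier_count) tuples."""
    votes = Counter()
    for lat, lon, inliers in matches:
        votes[cell_id(lat, lon)] += inliers
    cell, score = votes.most_common(1)[0]
    return cell, score

# Two nearby candidates (~1 m apart) outvote a lone distant one
matches = [(48.85660, 2.35220, 60), (48.85661, 2.35221, 55), (48.90000, 2.40000, 80)]
print(best_cluster(matches)[1])  # 115
```

This mirrors the "prefer clusters" heuristic: a geographically tight group of moderate matches beats an isolated strong one.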
```python
import torch

# Netryx auto-selects; mirror this logic in custom scripts
if torch.cuda.is_available():
    device = torch.device("cuda")   # ALIKED, full speed
elif torch.backends.mps.is_available():
    device = torch.device("mps")    # Mac M-series, DISK
else:
    device = torch.device("cpu")    # works, but significantly slower
```

Multiple cities can share a single index; each search is restricted by its centre and radius:

```python
# Index Paris, London, and Tokyo into the same index/, then
# search per city by specifying a centre + radius:

# Paris only
candidates = search_index(desc, center_lat=48.8566, center_lon=2.3522, radius_km=5)

# London only
candidates = search_index(desc, center_lat=51.5074, center_lon=-0.1278, radius_km=5)
```

Troubleshooting:

- `import lightglue` fails: `pip install git+https://github.com/cvg/LightGlue.git`
- `import kornia` fails (Ultra Mode only): `pip install kornia`
- Tkinter/GUI errors on macOS: `brew install python-tk@3.11` (match your exact Python version)
- New `cosplace_parts/*.npz` chunks not appearing in searches: rerun `python build_index.py`
- Gemini features inactive: `export GEMINI_API_KEY="your_key_here"`, then verify with `echo $GEMINI_API_KEY`
- Too few candidates reaching Stage 2: raise `top_k` in `search_index`

Dependencies:

| Package | Role |
|---|---|
| torch | Model inference backbone |
| LightGlue | Deep feature matching |
| kornia | LoFTR dense matching (Ultra Mode) |
| NumPy | Index storage and cosine similarity |
| Pillow | Image loading and preprocessing |
| tkinter | GUI (stdlib; may need an upgrade on macOS) |
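A quick way to check which of these packages are importable in the active venv; note the import names are assumptions where they differ from package names (Pillow imports as `PIL`):

```python
import importlib.util

# Probe each dependency's import name without actually importing it;
# Pillow's import name is "PIL".
deps = ("torch", "lightglue", "kornia", "numpy", "PIL", "tkinter")
missing = [m for m in deps if importlib.util.find_spec(m) is None]
print("missing:", missing or "none")
```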
- [ ] Clone repo and create venv
- [ ] `pip install -r requirements.txt`
- [ ] `pip install git+https://github.com/cvg/LightGlue.git`
- [ ] (optional) `pip install kornia`
- [ ] (optional) `export GEMINI_API_KEY=...`
- [ ] `python test_super.py`
- [ ] Create mode → set coords + radius → Create Index (wait for completion)
- [ ] `python build_index.py` (if not auto-built)
- [ ] Search mode → upload photo → Manual/AI Coarse → Run Search
- [ ] Read GPS result + confidence score on map