Readit News
Posted by u/vancecookcobxin 4 days ago
Show HN: I built a real-time OSINT dashboard pulling 15 live global feeds (github.com/BigBodyCobain/...)
Sup HN,

I got tired of bouncing between Flightradar, MarineTraffic, and Twitter every time something kicked off globally, so I wrote a dashboard to aggregate it all locally. It’s called Shadowbroker.

I’ll admit I leaned way too hard into the "movie hacker" aesthetic for the UI, but the actual pipeline underneath is real. It pulls commercial/military ADS-B, the AIS WebSocket stream (about 25,000+ ships), N2YO satellite telemetry, and GDELT conflict data into a single MapLibre instance.

Getting this to run without melting my browser was the hardest part. I'm running this on a laptop with an i5 and an RTX 3050, and initially, dumping 30k+ moving GeoJSON features onto the map just crashed everything. I ended up having to write pretty aggressive viewport culling, debounce the state updates, and compress the FastAPI payloads by like 90% just to make it usable.
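The culling itself is conceptually simple; here's a minimal sketch of the idea (illustrative only, not the repo's actual code; assumes Point features with `[lon, lat]` coordinates and ignores antimeridian wrap):

```python
# Hypothetical sketch of viewport culling: only features inside the
# client's current map bounds get serialized and sent to the browser.
def cull_to_viewport(features, west, south, east, north):
    """Keep only GeoJSON Point features whose [lon, lat] coordinates
    fall inside the viewport bounding box (west, south, east, north)."""
    visible = []
    for f in features:
        lon, lat = f["geometry"]["coordinates"]
        if west <= lon <= east and south <= lat <= north:
            visible.append(f)
    return visible
```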

My favorite part is the signal layer—it actually calculates live GPS jamming zones by aggregating the real-time navigation degradation (NAC-P) of commercial flights overhead.
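The rough shape of that idea looks like this (an illustrative sketch, not the repo's implementation: bin ADS-B reports into grid cells and flag cells where most aircraft report a low NAC-P, which indicates degraded GPS position accuracy; the thresholds here are made up):

```python
from collections import defaultdict

# Hypothetical sketch: aggregate live ADS-B reports into 1-degree grid
# cells and flag cells where most aircraft report a degraded NAC-P.
def jamming_cells(reports, nacp_threshold=5, min_aircraft=3, bad_ratio=0.5):
    """reports: iterable of (lon, lat, nacp). Returns flagged (lon, lat)
    integer grid cells with enough aircraft and a high share of low NAC-P."""
    cells = defaultdict(list)
    for lon, lat, nacp in reports:
        cells[(int(lon), int(lat))].append(nacp)
    flagged = []
    for cell, nacps in cells.items():
        if len(nacps) >= min_aircraft:
            bad = sum(1 for n in nacps if n <= nacp_threshold)
            if bad / len(nacps) >= bad_ratio:
                flagged.append(cell)
    return flagged
```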

It’s Next.js and Python. I threw a quick-start script in the releases if you just want to spin it up, but the repo is open if you want to dig into the backend.

Let me know if my MapLibre implementation is terrible, I'm always looking for ways to optimize the rendering.

afatparakeet · 4 days ago
Optimizing some of that geojson into realtime tiles is a really fun and engaging project.

Have you seen these projects?

https://github.com/protomaps/PMTiles

https://github.com/maplibre/martin

vancecookcobxin · 3 days ago
They are definitely on the horizon! I am a HUGE fan of both of those projects, and they're on the roadmap for the architecture.

Right now, ShadowBroker is really optimized for 'blinking blip' real-time radar tracking (streaming the raw GeoJSON payload from the FastAPI backend directly to MapLibre every 60s), so we stay as close as possible to smooth 60fps entity animations across the map.

Moving to something like Martin would be incredible for handling EVEN MORE entities if we start archiving historical flight and AIS data into a proper PostGIS database, but the trade-off of having to invalidate the vector tile cache every few seconds for live-moving targets makes it overkill right now.
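The smooth-animation side of that trade-off usually comes down to client-side interpolation between payloads; a rough sketch of the idea (hypothetical helper, shown in Python for brevity even though the real animation would run in the MapLibre client):

```python
# Hypothetical sketch of dead reckoning between two position fixes:
# the client renders many frames between 60s data updates by linearly
# interpolating each entity's [lon, lat] for the current render time.
def interpolate_position(p0, p1, t0, t1, t):
    """Linearly interpolate a [lon, lat] position between fixes p0 (at
    time t0) and p1 (at time t1), clamping render time t to [t0, t1]."""
    if t1 <= t0:
        return p1
    frac = max(0.0, min(1.0, (t - t0) / (t1 - t0)))
    return [p0[0] + (p1[0] - p0[0]) * frac,
            p0[1] + (p1[1] - p0[1]) * frac]
```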

afatparakeet · 3 days ago
Yeah, less ideal for the realtime data, but it could be useful for lightening the load of the more static layers.

Great project, will be contributing!

KronisLV · 3 days ago
Protomaps is really cool also when you just want maps for a country and to serve them without too much of a hassle, their CLI has pretty much everything you need: https://docs.protomaps.com/pmtiles/cli

I set that up for an agricultural project a while back.

totetsu · 3 days ago
Is this kind of hyper-awareness of data you can't actually do anything about even a desirable thing, or just a pathway into a hole of hyper-alert stress and low self-efficacy?
himmi-01 · 2 days ago
This one is so good. Bookmarked. Thanks. I think the only thing I need now is a way to enter a city name and have it gather data if available.
vavkamil · 4 days ago
You leaked `./frontend/.env.local` & `./backend/.env` inside `ShadowBroker_v0.1.zip` in the first commit.
tfghhjh · 3 days ago
that's why it's called OSINT

everything is open source

DetroitThrow · 3 days ago
the real OSINT is always in the comments
Escapade5160 · 2 days ago
Whole thing feels very vibe coded. Even OP's post here.
porridgeraisin · 3 days ago
What made you check that
wildrhythms · 2 days ago
It's both the first and last thing to check
stef25 · 2 days ago
It's called Hacker News


koenschipper · 19 hours ago
That live GPS jamming calculation using commercial flight NAC-P degradation is honestly brilliant. Such a clever use of existing public telemetry.

You mentioned compressing the FastAPI payloads by 90% to keep the browser from melting. I'm really curious about your approach there. Did you just crank up gzip/brotli on the JSON responses, or did you have to switch to something like MessagePack, Protobuf, or a custom binary format to handle that volume of moving GeoJSON features?

Also, never apologize for the "movie hacker" UI. A project like this absolutely deserves that aesthetic. Awesome work!
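For anyone weighing those options, the gzip baseline is easy to measure on a representative payload (illustrative made-up data below; the real feed shape comes from the ADS-B/AIS streams described in the post):

```python
import gzip
import json

# Illustrative payload: 1,000 moving Point features with a few properties.
features = [
    {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [i * 0.01, i * 0.02]},
        "properties": {"id": i, "heading": i % 360},
    }
    for i in range(1000)
]

raw = json.dumps({"type": "FeatureCollection", "features": features}).encode()
compressed = gzip.compress(raw)

# GeoJSON is highly repetitive, so gzip alone usually yields a large win;
# binary formats (MessagePack, Protobuf, flat typed arrays) mostly save
# further by dropping the repeated keys before compression even starts.
```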

CountGeek · 3 days ago
born-jre · 3 days ago
I was building something like this:

https://github.com/blue-monads/potato-apps/tree/master/cimpl...

I should finish it but haven't had the time.

rationalist · 3 days ago
Risky click. (It's okay.)
ivannovazzi · a day ago
Nice work! The real-time vehicle movement rendering challenge you mentioned reminded me of a tool I've been building — Moveet (https://github.com/ivannovazzi/moveet), an open-source fleet simulator that drives synthetic vehicles on real OSM road networks using A* pathfinding, streaming positions over WebSocket. It's aimed at developers who need realistic GPS movement data to test fleet or logistics software, and supports Kafka, Redis, and REST sinks. Your approach of batching 30k+ GeoJSON features into typed arrays is a great pattern for this kind of workload.
RovaAI · 2 days ago
Great execution on aggregating live feeds. Two questions from someone who does similar work on the B2B side:

1. How do you handle deduplication when the same event surfaces across multiple feeds simultaneously? For news aggregation this is the hard part - an event that appears in Reuters, Bloomberg, and 12 downstream outlets is one story, not 13.

2. What's your rate limiting strategy across 15 sources? Some of the better data APIs (Shodan, GreyNoise, etc.) have strict per-minute limits that become a real constraint at even modest query frequencies.

The B2B application of this pattern is company intelligence - pulling company news, job postings, funding signals, and tech stack changes from 10+ sources and surfacing the relevant signal per account. Same architecture challenge (deduplication, rate limits, signal:noise ratio) with a much smaller initial data volume but higher precision requirements per entity.
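One common answer to the deduplication question is fuzzy keying on normalized title tokens: merge events whose token sets overlap strongly. A minimal sketch of that approach (hypothetical, not how either project actually does it; the 0.6 threshold is an assumption):

```python
def normalize(title):
    """Lowercase, strip punctuation, drop very short words -> token set."""
    cleaned = "".join(c if c.isalnum() or c.isspace() else " "
                      for c in title.lower())
    return {t for t in cleaned.split() if len(t) > 2}

def dedup(events, threshold=0.6):
    """Greedy dedup: an event joins the first kept story whose title
    token set has Jaccard similarity >= threshold, else starts a new
    story. Returns a list of event groups (one group per story)."""
    stories = []  # list of (token_set, [events])
    for ev in events:
        toks = normalize(ev["title"])
        for seen, group in stories:
            union = toks | seen
            if union and len(toks & seen) / len(union) >= threshold:
                group.append(ev)
                break
        else:
            stories.append((toks, [ev]))
    return [group for _, group in stories]
```

Real pipelines usually add a time bucket and a location key on top of the text similarity, so two similar stories a week apart don't collapse into one.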