bigveech commented on Sorcerer (YC S24) raises $3.9M to launch more weather balloons   axios.com/pro/climate-dea... · Posted by u/tndl
huslage · 3 months ago
This is an anti-pattern. Doing the work that the government should be (and was) doing and then selling back the data to them or others when the data should be (and was) public domain is absolutely terrible for society.

No one should fund this.

bigveech · 3 months ago
Just to put it in perspective: it costs $300–500 to produce a single atmospheric profile with current balloon infrastructure. The U.S. launches ~180 a day—that’s at least $54K daily. Not exactly “pennies.” :)

And the government already buys the helium, radiosondes, and ground systems from private vendors—so the money’s going to private industry anyway. It’s just inefficient.

With 50 of our systems doing 4 profiles a day (which is nowhere close to max scale), you get the same volume of data for far less. And because we reach remote and oceanic areas that aren't being measured today, the data is also more valuable!
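The back-of-the-envelope comparison above can be sketched in a few lines. All figures are the comment's own claims (per-profile cost range, ~180 U.S. launches/day, 50 systems × 4 profiles/day), not verified specs:

```python
# Cost/volume comparison using the numbers claimed in the comment above.
COST_PER_PROFILE_LOW = 300   # USD, low end of the quoted $300-500 range
PROFILES_PER_DAY_US = 180    # approximate current U.S. radiosonde cadence

daily_cost_low = COST_PER_PROFILE_LOW * PROFILES_PER_DAY_US
print(daily_cost_low)  # 54000 -> the "$54K daily" floor in the comment

# Claimed equivalent coverage: 50 systems, each producing 4 profiles/day
sorcerer_profiles_per_day = 50 * 4
print(sorcerer_profiles_per_day)  # 200, i.e. >= the ~180 current profiles
```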

Also, the data you’re referring to isn’t inherently public domain. It becomes public when the government buys it and redistributes it. That’s true whether they pay for the infrastructure themselves or buy the data directly from a company.

avecchi-sorc commented on Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data    · Posted by u/tndl
pkhodiyar · a year ago
curious to know how you guys store and process the data and make meaningful inferences out of the data collected. Redshift? BigQuery? Datazip?
avecchi-sorc · a year ago
Internally, yes, we use a data warehouse to store the raw observations. We go through a process similar to traditional NWP to produce forecasts, except with a few AI steps in between. Unfortunately, the current standard is to store these as large multi-dimensional data files like grib2/netCDF/zarr.

In case you’re curious, here’s where NOAA stores all their GFS related forecasts: https://registry.opendata.aws/noaa-gfs-bdp-pds/
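As a concrete illustration, GFS output in that public bucket is organized by run date, cycle, and forecast hour. The helper below builds an object key following the commonly documented layout of the `noaa-gfs-bdp-pds` bucket (the `gfs.YYYYMMDD/CC/atmos/...` structure); the function name and default resolution are this sketch's own choices, so verify paths against the registry page before relying on them:

```python
from datetime import date

BUCKET = "noaa-gfs-bdp-pds"  # public NOAA GFS bucket on AWS (linked above)

def gfs_key(run_date: date, cycle: int, fhour: int, res: str = "0p25") -> str:
    """Build the object key for a GFS pressure-level GRIB2 file.

    Assumes the post-GFSv16 layout (gfs.YYYYMMDD/CC/atmos/...);
    cycle is the model run hour (0, 6, 12, 18), fhour the forecast hour.
    """
    d = run_date.strftime("%Y%m%d")
    return f"gfs.{d}/{cycle:02d}/atmos/gfs.t{cycle:02d}z.pgrb2.{res}.f{fhour:03d}"

print(gfs_key(date(2024, 4, 11), 0, 6))
# gfs.20240411/00/atmos/gfs.t00z.pgrb2.0p25.f006
```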

bigveech commented on Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data    · Posted by u/tndl
OccamsMirror · a year ago
Are you pairing your data with satellite observations?
bigveech · a year ago
Yes; however, as with traditional forecasts, we weight our balloon observations much more heavily.
avecchi-sorc commented on Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data    · Posted by u/tndl
popctrl · a year ago
This sounds like super interesting and meaningful work. Are you hiring, or do you have any advice for your average software engineer on getting into this space?
avecchi-sorc · a year ago
I was in the same boat. I've been a software engineer for as long as I can remember and always wanted to do more than just build B2B SaaS.

Max, the first engineer at Urban Sky, hit me up and asked if I wanted to build their mission control. At the time, Urban Sky was just a four-person team, so they couldn’t pay me as much, but I jumped at the chance, even though it meant taking about half my usual salary.

Funny enough, my SaaS background actually helped me create mission control software that was way ahead of the curve!

I guess my advice is: find a small company you're passionate about, where you can make a big impact, and be open to taking a pay cut. It makes you less of a risk for the company to take on, and you get to work on something that really matters. Plus, when you're solving real problems, things tend to work out, and eventually you'll end up earning what you should.

bigveech commented on Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data    · Posted by u/tndl
petervandijck · a year ago
When they said "launch more often" I guess you took that advice to heart :) Congrats on the launch(es).
bigveech · a year ago
Yes! It’s quite confusing to distinguish between launches at the office haha
bigveech commented on Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data    · Posted by u/tndl
pagade · a year ago
To a layperson like me, could you explain how these balloons will be cleaned up / collected after their life? What material are they made up of?
bigveech · a year ago
Here's a video of one of our recent recoveries: https://youtu.be/8DWYLG_95V0
avecchi-sorc commented on Launch HN: Sorcerer (YC S24) – Weather balloons that collect more data    · Posted by u/tndl
counters · a year ago
> One thing we believe, however, is that the reanalysis step in weather forecasting is unnecessary in the long term, and that future (ML) weather models will eventually opt to generate predictions based on un-assimilated raw data and will get better results in doing so.

The idea that we'll be able to run ML weather models using "raw" observations and skip, or implicitly incorporate, the assimilation step is spot-on - there's been an enormous shift in the AI-weather community over the past year to acknowledge that this is coming, and very soon.

But... in your launch announcement you seem to imply that you're already using your data for building and running these types of models. Can you clarify how you're actually going to be using your data over the next 12-24 months while this next-generation AI approach matures? Are you just doing traditional assimilation with NWP?

Also, to the point about reanalysis - that's almost certainly not correct. There are massive avenues of scientific research which rely on a fully-assimilated and reconciled, corrected, consistent analysis of atmospheric conditions. AI models in the form of foundation models or embeddings might provide new pathways to build reanalysis products, but they are a vital and critical tool and will likely be so for the foreseeable future.

avecchi-sorc · a year ago
> There are massive avenues of scientific research which rely on a fully-assimilated and reconciled, corrected, consistent analysis of atmospheric conditions.

That’s a good point! In fact, observation-based foundation models will likely include a "reanalysis-like" step in their final output.

Regarding the next 6-12 months, we will be integrating our data with traditional NWP models and utilizing AI for forecasting. We've developed a compact AI model that can directly assimilate our "ground truth" data with reanalysis, specifically for use in AI forecasting models.

Once we have hundreds of systems deployed, we'll use the collected observations, combined with historical publicly available data, to train a foundational model that will directly predict specific variables based on raw observations.
