Potential integration: Run HVAC fans and/or an attic fan and/or a crawlspace fan if indoor AQI is worse than outdoor AQI
This says that Air Quality Sensor support was added to the Matter protocol in 2023: https://csa-iot.org/newsroom/matter-1-2-arrives-with-nine-ne... :
> Air Quality Sensors – Supported sensors can capture and report on: PM1, PM2.5, PM10, CO2, NO2, VOC, CO, Ozone, Radon, and Formaldehyde. Furthermore, the addition of the Air Quality Cluster enables Matter devices to provide AQI information based on the device’s location
/? matter protocol Air Quality Cluster: https://www.google.com/search?q=matter+protocol+Air+Quality+...
PS: Do you use air pressure to correct the CO2 readings like the Aranet4 does, or would users need to manually recalibrate when they move to a higher/lower elevation than where the sensor was calibrated?
We have not integrated pressure compensation of the CO2 readings yet, but will certainly look into it before shipping the device.
Just curious, though: does it cover finer particles, and if not, why? Same question for a general AQI rating. The one random (probably poor-quality) portable AQI monitor I have covers different levels of fine particles, and that seems really useful during fire season when it's bad (in NorCal). But of course I'm not super knowledgeable about any of this.
Anyway fantastic project, I absolutely love it.
You can find more info here: https://www.crowdsupply.com/networked-artifacts/air-lab/upda...
This post [2] details the process they went through to port their device firmware to Wasm for their interactive demo. As a colleague put it, it could be a pretty solid Show HN in its own right.
[1] https://news.ycombinator.com/item?id=44190541
[2] https://www.crowdsupply.com/networked-artifacts/air-lab/upda...
I reached out to HN by email (as suggested on their tips page) to gauge whether my story/product could be posted as a Show HN. They pointed out that HN requires a more "direct" demo of things. Their suggestion was to create a "raw" video showing how the device works and feels. It made a lot of sense to me. However, as a designer by training, it's hard for me to produce something like that, as I naturally gravitate towards polishing it too much. When discussing this dilemma with my colleague, we remembered an idea I had some time ago about creating interactive renderings for the Air Lab website. We quickly agreed that this would be worth testing, since the whole goal of the video was to give the HN community a feel for the device.
As mentioned in my comment and the Crowd Supply update, I used Emscripten to compile the stock firmware to WASM. Luckily, by that time I was already mostly done extracting a hardware abstraction layer from the firmware. This meant I already had a clean API that I needed to "mock" and connect to the fake sensors and controls on the website. So most of the work that week was actually building the simulator app around the compiled firmware using Ember.js and integrating it. In doing so, I also found a couple of bugs in the firmware itself that were much easier to debug with the simulator than with a real device.
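The HAL-mocking approach described above can be sketched roughly like this (all names here are hypothetical, not the Air Lab's actual API): the firmware is written against a struct of function pointers, and the simulator build wires those pointers to mocks whose values the web page's fake sensor controls can update.

```c
/* Hypothetical hardware-abstraction-layer interface. On real hardware
 * the pointers would go to I2C sensor drivers; in the WASM/simulator
 * build they go to mocks fed from JavaScript. */
typedef struct {
    float (*read_co2_ppm)(void);
    float (*read_temperature_c)(void);
} hal_sensors_t;

/* --- simulator-side mock state --------------------------------------- */
static float sim_co2_ppm = 600.0f;
static float sim_temp_c  = 21.5f;

/* In the actual WASM build these setters would be exported (e.g. with
 * EMSCRIPTEN_KEEPALIVE) so the page's sliders can call them from JS. */
void sim_set_co2_ppm(float v)       { sim_co2_ppm = v; }
void sim_set_temperature_c(float v) { sim_temp_c  = v; }

static float mock_read_co2_ppm(void)       { return sim_co2_ppm; }
static float mock_read_temperature_c(void) { return sim_temp_c; }

/* The firmware only ever sees this struct, so it runs unmodified
 * whether the values come from silicon or from a web page. */
const hal_sensors_t sim_hal = {
    .read_co2_ppm       = mock_read_co2_ppm,
    .read_temperature_c = mock_read_temperature_c,
};
```

A nice side effect, as noted above, is that the same seam makes firmware bugs reproducible and debuggable on the desktop.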
I can recommend that anyone reach out to the HN moderators to validate their post, especially if it is not a software thing that one can immediately try out. But even then, I think most posts/projects could benefit from a more interactive demo.