A few years ago I came to the realisation that if you want people to be more environmentally conscious or economical in terms of utility consumption (electricity, water, gas, etc.), they need far better data than a single figure per month.
You want to be able to see usage at a resolution of at most 5 minutes.
That way people can spot things like “having my electric heater on for those couple of hours used more electricity than all my lights use for a month”.
I have an inverter and solar panels at my place (very common now in South African middle-class homes due to our unreliable electricity utility), and I can see a full history of electricity usage.
It’s easy for me to see where I can improve my efficiency or why my consumption was so high.
It’s still only an overall figure though, so you have to make an informed assumption as to what caused the consumption.
For example, it’s obvious that the 3 kW draw for about an hour after I shower is the geyser heating itself back up. I can see from the usage stats that my battery was depleted overnight, that solar production is still low because I shower in the early morning, and that the energy was therefore coming from the grid (the inverter records all these figures).
It is then obvious that I can very simply save money on electricity by putting a timer on my geyser so that it only heats after 10am or so, once the sun is high enough for solar production to cover the energy usage.
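As a back-of-the-envelope check on that reasoning, here is the saving from shifting the reheat onto solar. The 3 kW draw and roughly one-hour reheat come from the observation above; the tariff is an assumed example figure, not a real rate.

```python
# Rough saving from shifting the geyser reheat (~3 kW for ~1 h)
# from grid power to solar.
GEYSER_KW = 3.0        # observed draw while reheating
REHEAT_HOURS = 1.0     # observed reheat duration
TARIFF_PER_KWH = 3.0   # assumed grid rate, e.g. in ZAR

daily_kwh_shifted = GEYSER_KW * REHEAT_HOURS
monthly_saving = daily_kwh_shifted * 30 * TARIFF_PER_KWH
print(daily_kwh_shifted, monthly_saving)  # 3.0 270.0
```

At the assumed tariff, a single daily reheat shifted onto solar is worth a few hundred currency units a month, which is why the timer pays for itself so quickly.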
Now I just wish I had something as convenient for monitoring water consumption.
I agree fully, I believe that my Home Assistant energy dashboard has done more for our energy consumption than any other measure.
If you’re in the Netherlands you can get something like a “slimme lezer”, plug it into the P1 port of your energy meter, and it will pop up in Home Assistant with the right sensors.
The energy dashboard will give an overview of your gas and electricity usage, solar production, proportion used from grid/solar and even a home battery if present. It’s really great.
Combine it with some Aqara (Zigbee, but easily overloaded) or Shelly (WiFi, and I find them very robust) energy-monitoring power sockets and you get a very good idea of the simplest measures to take to save power. You can even add cost per kWh and per m³ (gas) to the sensors in HA.
I have the same setup. I used it to make a green light go on when we have more than 500 watt excess solar production, so my wife knows she can turn on the washing machine for free.
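The rule behind such an automation can be sketched as a thresholded state machine. The 500 W on-threshold comes from the comment; the lower off-threshold (hysteresis, so the light doesn't flicker around the boundary) and the sample readings are assumptions, and wiring this to a real light would go through Home Assistant rather than this standalone sketch.

```python
# Minimal sketch of the "green light on excess solar" rule, with an
# assumed hysteresis band so the light doesn't flicker near 500 W.
ON_THRESHOLD_W = 500
OFF_THRESHOLD_W = 300  # assumed lower bound before switching off again

def green_light(excess_w: float, currently_on: bool) -> bool:
    """Return the new light state given current excess production."""
    if currently_on:
        return excess_w > OFF_THRESHOLD_W  # stay on until well below 500 W
    return excess_w > ON_THRESHOLD_W

state = False
states = []
for excess_w in [450, 520, 480, 350, 250]:  # made-up sample readings
    state = green_light(excess_w, state)
    states.append(state)
# states -> [False, True, True, True, False]
```

The hysteresis matters in practice: solar output fluctuates with passing clouds, and a single threshold would have the light blinking right when you're deciding whether to start the machine.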
> If you’re in the Netherlands you can get something like a “slimme lezer”, plug it into the P1 port of your energy meter, and it will pop up in Home Assistant with the right sensors.
If only the meters had any power nearby, or offered a 5V USB port, so you could plug in your reader and forget it. But no. Now I'd have to keep a small Li-ion battery alive at -20°C, since the meters are typically outside and there is never any power nearby; they are in a closed cabinet. Only people with indoor power meters escape this (I know of zero cases of that with detached houses; I think the power companies require the meter to be outside, in a cabinet they can access without access to the house).
I have no frame of reference, but I have a feeling that here (USA) it's super illegal to "tamper" with any public utilities infrastructure. Could be wrong, though, but no USB ports.
That's really cool, though. I wish we could do that.
I have a friend that is diabetic (a number of friends, actually).
He has been doing a pretty lousy job of managing his diet.
Until his doctor prescribed a monitor for a few weeks.
This is a device that looks like a big Band-Aid that you put on your arm, inserting a fine needle under the skin, and communicates with an app on the phone, reporting things like glucose levels.
Once he realized the effects of the foods he was eating, he immediately changed his diet, and has been sticking to it, since (he no longer wears the monitor).
The UI of the app was pretty good. The historical data readout is what did it for him.
I think even people who don’t have diabetes could benefit from this. Would having access to my own glucose levels throughout the day give me some insights into how what I eat influences my own mood, energy levels etc?
This company is offering it as a service, https://www.limborevolution.com/, though it's expensive. I haven't used it, but the owner has had a few successes. Shaq is also an investor.
It'd be cool to give something like that a try. I love that we have so much technology to monitor our bodies, but I'd never use anything that required a cell phone app, since you'd have to worry about who else is collecting/selling that data. We need more devices that work entirely offline, but they're hard to find, because companies know they can make so much more money by collecting/selling your personal data and pushing ads.
Your experience also points to the limits of monitoring and subsequent behavioral change, though. I mean, yeah, it might prompt you to start your washing machine a bit earlier or a bit later to align with high production by your solar panels... but how much consumption can you really move around like that, and how many energy hogs can you just decide to not use? If you notice high energy use while cooking, are you going to start eating more salads instead? Across Europe electricity meters are being replaced by smart meters and people are really hyping up the advantages of being able to continuously monitor your energy usage, but I think the jury is still out whether it'll actually lead to significant energy savings.
Ultimately the biggest wins are when big appliances and heating/cooling respond to self-production or take advantage of times when electricity is cheap (if you're on a per-hour or per-day dynamic contract), whether that's with a simple timer like the one you installed, a relay that shuts down heating when you're cooking or something fancier like a Fronius Ohmpilot [1] that tweaks heating power to exactly match PV (over)production.
> I think the jury is still out whether it'll actually lead to significant energy savings.
In Finland you can get an electricity contract that follows the hourly spot prices. Usually the hourly price varies in the range of 5c to 20c/kWh, but sometimes it jumps up to 40c, even 80c/kWh. The record was 2€/kWh for a couple of hours in one day.
People who have chosen this kind of contract usually reduce their consumption during the ridiculously expensive hours, which typically occur when wind energy production is low and some power plant happens to be offline for maintenance at the same time.
You can also get a contract with a fixed price, if you want.
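To see why a spot-price contract changes behaviour, compare running the same load during a cheap hour and during a spike. The prices below are made-up examples within the ranges mentioned above, in EUR/kWh.

```python
# Same 2 kW load, same 3-hour run, wildly different cost depending
# on when it runs. Prices are illustrative examples.
LOAD_KW = 2.0
HOURS = 3

cheap_price = 0.05   # a typical off-peak hour
spike_price = 0.80   # a rare expensive hour

cost_cheap = LOAD_KW * HOURS * cheap_price   # about 0.30 EUR
cost_spike = LOAD_KW * HOURS * spike_price   # about 4.80 EUR
```

A 16x cost difference for the identical wash cycle is exactly the kind of signal that gets people to check the price before starting the machine.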
There have been a few simple experiments in the UK, where consumers have been encouraged to reduce usage at peak times, that have been successful. But as you say, it's going to need the appliances to support it. Everything needs a "get this done by X o'clock" mode, whether that's a dishwasher, washing machine or car charger.
> If you notice high energy use while cooking, are you going to start eating more salads instead?
Of course, why wouldn't you? If the assumption isn't that effectively unlimited power is available on demand you adjust use accordingly.
On sunny days with excess power, maybe you charge and do laundry. On a stretch of cloudy days, you avoid long periods of cooking or using large tools like welders or air compressors.
Adjusting to our environment rather than chasing convenience is a very reasonable approach to making a real dent in reducing our environmental impact.
> but how much consumption can you really move around like that, and how many energy hogs can you just decide to not use? If you notice high energy use while cooking, are you going to start eating more salads instead?
There’s been some interest in this locally, due to energy regulation moving to a system that punishes peak usage more, and also high electricity prices. And the topic came up that people would be encouraged to do things that are considered unsafe, like washing their clothes while they are sleeping.
Things like this need to be automated. (And be safe.) Manually following fancy gadgets won’t make much of a difference.
Meanwhile we’re (or were, I don’t know the current status) selling hydropower to the rest of Europe during the fall, emptying the reservoirs before winter so that electricity prices become unreasonable (leading to strategies like washing your clothes while you sleep, or just being content to freeze indoors).
And I have never seen a good argument for the export/import (Europe) arrangement. But I guess we can try to sideline that whole conversation by nagging people to turn off the light in the bathroom when it’s not busy.
However, knowing that a particular device is bad means that when I eventually need to replace it, I will also factor in energy efficiency and features such as the ability to check energy prices in my purchasing decision.
About water consumption: depending on your make of water meter, there's often a small reflective wheel that turns, e.g., once for every liter. Sometimes these are made out of metal or are even slightly magnetic. An Arduino with an optical or Hall effect sensor might get you really far toward real-time, high-resolution data collection!
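The counting side of such a sensor reduces to simple logic once the pulses arrive. Here is a sketch assuming one pulse per litre (check your own meter's marking); the actual GPIO/sensor wiring is out of scope.

```python
# Turn meter-wheel pulse timestamps into a flow rate, assuming
# 1 pulse = 1 litre (meter-dependent assumption).
def litres_per_minute(pulse_times_s, window_s=60.0):
    """Flow rate over the trailing window, given pulse timestamps in seconds."""
    if not pulse_times_s:
        return 0.0
    latest = pulse_times_s[-1]
    recent = [t for t in pulse_times_s if latest - t < window_s]
    return len(recent) * (60.0 / window_s)

# e.g. five pulses spread over the last minute -> 5 litres/minute
rate = litres_per_minute([0.0, 5.0, 10.0, 30.0, 59.0])
```

On a microcontroller you'd count pulses in an interrupt handler instead of keeping a timestamp list, but the conversion to a rate is the same arithmetic.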
Alternatively I've had success in wiring up a temperature probe directly to the incoming water line, and comparing that temperature to the ambient temperature. Where I live that works because the water arrives from underground & is always much cooler than ambient air. The time-integrated difference between the two is a proxy for how much water you use... this is much more involved to get meaningful data from, tho.
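The time-integration step can be sketched as a trapezoidal sum over the (ambient − pipe) temperature gap. The sample readings below are invented, and as noted the result is only a relative proxy that would need calibrating against the real meter.

```python
# Integrate (ambient - pipe) temperature over time while water flows.
# Readings are made-up sample data: (seconds, pipe_temp_C, ambient_temp_C).
samples = [
    (0, 21.0, 21.0),    # no flow: pipe has warmed to ambient
    (60, 12.0, 21.0),   # tap opened: cold ground water arrives
    (120, 11.5, 21.0),
    (180, 20.5, 21.0),  # tap closed: pipe warming back up
]

proxy = 0.0
for (t0, p0, a0), (t1, p1, a1) in zip(samples, samples[1:]):
    # trapezoidal integration of the temperature gap over each interval
    proxy += ((a0 - p0) + (a1 - p1)) / 2 * (t1 - t0)
# proxy is in degree-seconds; only ratios between days are meaningful
```

With one-minute samples the resolution is coarse, but comparing the daily proxy totals against the meter's monthly figure would give you the calibration factor.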
On mobile so hard to link, but memory says OpenEnergyMonitor's docs site on pulse counters has a computer vision approach too. Think it reads the numbers from the display.
Per device energy tallies also give you interesting data.
Home Assistant can do that in the Energy dashboard, and you can answer questions and learn surprising things: like how much energy my "rack" (UPS+mac mini+5 disk bay+a few other things) actually uses vs. e.g. my fridge or my washing machine; or that my desk computer is actually quite low, but boy does the screen cost a lot when active; or what charging the electric bike costs; or the effect of setting the thermostat to 19 instead of 20 in winter; or, oh wow, in summer the fan that we use a lot to make things bearable actually ends up using as much energy as our water heater!
(power measurements are done using Shelly Plug Plus S + 3EM + 4PM devices, thermal measurements using Shelly H&T Plus)
> like how much energy my "rack" (UPS+mac mini+5 disk bay+a few other things) actually uses
I find it better to remain ignorant in that regard. Jokes aside, it's also interesting seeing wall draw vs UPS draw vs PSU draw if each piece of your equipment supports that.
The Shelly stuff is also quite fun to play with (I recommend an AC adapter for the H&T). I have the little black spherical sensors, and the data resolution is significantly worse on battery, since it tries to sleep in a low-power state as much as possible. It's fun to see the server cabinet (mine's enclosed) vs. room vs. different room temps. You can also see when the HVAC cycles on and off, and when someone takes a shower (humidity spikes).
Being able to see your usage is helpful - at least, for those of us interested.
For example, I was surprised to see how much our electronics (stereo, amplifier, TV, etc.) in the living room use, even when off (some devices are older, with high standby currents). It motivated me to put everything on a timer that only turns power on in the evenings, since that's the only time they get used.
> It motivated me to put everything on a timer that only turns power on in the evenings, since that's the only time they get used.
I was surprised to learn that a timer itself also uses power. I borrowed a Kill-A-Watt from the library and found that a two-decade-old timer uses 2.3 W, while a newer one uses 0.6 W. That tells me that I should only keep the old timer for rare occasions.
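Annualising the timers' own standby draw makes the comparison concrete; the electricity price below is an assumed example figure.

```python
# Annual cost of a timer's own standby draw, at an assumed price.
PRICE_PER_KWH = 0.30   # assumed example electricity price
HOURS_PER_YEAR = 24 * 365

def annual_cost(standby_watts):
    """Yearly running cost of a device drawing standby_watts continuously."""
    return standby_watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

old_timer = annual_cost(2.3)  # the two-decade-old timer
new_timer = annual_cost(0.6)  # the newer one
```

At that assumed price the old timer costs a few currency units more per year than the new one, so whether a timer saves money at all depends on how much standby draw it eliminates versus what it burns itself.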
Yeah, we were away from home recently, and it was interesting to see that with everything "off" we were still using a constant 200 W or so, so even with no one home we used just over 4 kWh each day. 120 kWh each month just for "idle" usage is definitely not a trivial amount of money; at current prices that's £20!
The consumer element is the sugar to help the masses swallow the pill. If it was just about the consumer, the unit would never report its findings back to base. But blurting back your information is integral to, well, all smart devices. That is the point.
Once the government has that info, it will be able to come up with bespoke taxes for you according to what it ordains as fair use. 'Your showers are too long', 'your toolshed is too big a draw on the electric' therefore 'you need to buy carbon credits to offset the environmental damage you are causing'.
It's the slow descent to greater tyranny, and loss of personal control. It's amazing that people put up with it, but a slight discount in the short term, or visibility of your own data, is probably enough to get most people to accept spying infrastructure in their lives forever.
As evidenced by the article, the problem is that some of these devices within less than 10 years can become essentially bricks.
I think these devices must be required to send the data to the utility company and the utility company must be forced to make the data easily accessible in a standard format so that independent analysis is trivially possible.
This way you don't have a situation where a device manufacturer goes out of business and the capability to monitor is lost.
We have one as well, but since we're on a variable rate tariff (Octopus Tracker) it's completely useless: it doesn't know the current electricity/gas price, and it seems to receive rate updates from the network about once every few weeks, so the numbers it displays are just wrong.
I've made my own little Raspberry Pico display that queries today's energy prices and shows those, but I have not been able to show today's energy consumption alongside (and therefore show the day's cost so far). Octopus provides an API to query the kWh used... but only for the last day. I even got their little Octopus Mini that broadcasts live usage to their app, but I have not been able to query the live data from it from my Raspberry; I don't have the necessary skills in web technologies to do that, unfortunately :-(
I have one here in Bucharest and, while fancy, as in it blips a red light when the power consumption is higher than usual, it doesn't help me at all.
As in, yeah, running the washing machine consumes power, I knew that, and the same goes for the electric oven or the vacuum cleaner, but what am I supposed to do with that information? Not wash my clothes anymore? Not use the oven? Leave dust all over the place for longer?
In my home state of Victoria, Australia, the government had a program to give out these PowerPal[1] units for free, which can measure your usage in real time using the flashing LED on smart meters. We also require all energy grid operators (the people who own the poles and wires) to have an energy portal where users can get near-real-time data to the nearest 30 minutes, soon to be 5 with some new legislation.
The latter most people have no idea about, but the PowerPal has been a smashing success in helping consumers understand what is using energy.
Instead of a single usage figure per month, it should be a cumulative line chart with high resolution, preferably zoomable. The customer would see long periods of almost nothing, and occasional big jumps (e.g. when the heater is on).
I had this hope when the energy companies in France installed the national "smart meter" (Linky). It was a fun story, because it was linked to spying on you at home, 5G, COVID and radiation.
Unfortunately this is a closed system where the energy company will not let you access the data outside of their own dashboard. I would like to think that this is against the national trend toward Open Data but it is what it is.
There are some funky solutions where you connect a board to an input of the meter and somehow get the data into Home Assistant, but it is, like I said, "funky" (completely guerrilla-style, without any backing from the power company, and if you have a problem it will probably be your fault).
> It was a fun story, because it was linked to spying on you at home, 5G, COVID and radiation.
The first of those is a genuine concern
> In Australia debt collectors can make use of the data to know when people are at home.[63] Used as evidence in a court case in Austin, Texas, police agencies secretly collected smart meter power usage data from thousands of residences to determine which used more power than "typical" to identify marijuana growing operations.[64]
> Smart meter power data usage patterns can reveal much more than how much power is being used. Research has demonstrated that smart meters sampling power levels at two-second intervals can reliably identify when different electrical devices are in use.
IIRC my smart meter in the UK lets me choose between 30m and 24h reporting, probably as a response to these fears, but you just set it as a preference on the provider website, not locally on the meter. It would be trivial for them to just be lying about that and logging data to GCHQ at the maximum precision. That may seem outlandish, but so did PRISM until it was revealed
Some people also refuse to have one so that they can't be forced onto a dynamic-priced tariff. At the moment those are opt-in, but I think their concern is a fair one too. Though if the powers that be wanted to coerce people onto them, they could simply crank the price of fixed tariffs anyway
As a toy project I put a raspi with a small screen in my living room that would show the temperature and humidity for the last three days as a graph. It was somewhat of an eye catcher in the living room. The data was always interesting. Even if the humidity did not change much, at the very least you could always see that it was colder during the nights, which told me it was working.
It taught me so many things.
The effect of opening the windows short vs. long on temperature and humidity.
That the sun shining into my room was way more effective than my heating.
How the temperature goes down exponentially when my heating turns off, and how the length of the on/off cycles depends on the temperature outside. Etc.
It's ridiculous we don't have unlimited free or almost free energy, the future truly sucks.
Likewise it's ridiculous devices are not automatically saving energy when unused, it's such a simple change it should be standard.
Asking the user to worry about it should not be needed. That's the goal we should strive for. For now monitoring will have to do but I think a combination of solar + iron air battery will make everybody living in a sunny-enough place independent from the grid with ample margin - and we can supply the rest with nuclear.
This problem can easily be solved without any device at all. Of course, it demands education and "intelligent behavior". If only people had the curiosity to read the specs on a heater and see "2000W" of power, and compare that to the "15W" on the specs of an LED bulb. Same for water: one can just place a bowl under the faucet and measure the time it takes to fill up. Now you have your water consumption rate. We can choose the "device-based" route, but that road ends with Idiocracy and problems so "big" nobody can solve them.
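The bowl measurement is a one-liner of arithmetic; the bowl size and fill time below are example figures.

```python
# Flow rate from a timed bowl fill: known volume / measured time.
BOWL_LITRES = 2.0     # example bowl size
FILL_SECONDS = 15.0   # example measured fill time

flow_l_per_min = BOWL_LITRES / FILL_SECONDS * 60  # about 8 L/min
```

That gives you the faucet's rate; multiply by how long you typically run it and you have per-use consumption without any hardware.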
If we're comparing apples to apples all the time, sure. I think it's pretty obvious to most people who care to look that a 60w conventional bulb uses more energy than a 15w LED bulb (which, for the record, is the 100w conventional equivalent). Consider, however, these questions:
If my 2000w heater is running on the 800w setting and turns on when my room has dropped below the point I consider acceptably chilly and turns off above that point, how much electricity have I used in the last hour?
If I have 3 15w LEDs on a dimmer and run them intermittently throughout the day, how much electricity have I used in the last hour?
If my TV is off, but plugged in, and accesses the Internet a few times a day automatically to check for new versions, how much power has it used today?
I think this makes the case for, at least, a kill-a-watt style device. A whole home solution with sufficient report granularity and a report interface visible in the home would be worth the extra trouble, IMHO.
Edit: For the record, these are all real-scenarios from my house.
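Those scenarios reduce to duty-cycle arithmetic, and that is the point: the duty cycles and the TV's standby draw below are guesses, which is exactly what a meter would replace with measurements.

```python
# Duty-cycle energy sketches for the three scenarios above.
# All duty cycles and the standby wattage are assumed figures.
def kwh(watts, hours, duty=1.0):
    """Energy in kWh for a load drawing `watts` for `hours` at `duty` cycle."""
    return watts * duty * hours / 1000

heater = kwh(800, 1, duty=0.4)     # thermostat cycling, assumed on 40% of the hour
lights = kwh(3 * 15, 1, duty=0.5)  # three dimmed LEDs, assumed 50% average level
tv = kwh(2, 24)                    # networked standby, assumed ~2 W draw
```

The heater guess alone spans an order of magnitude depending on insulation and weather, which is why a whole-home meter with decent granularity beats spec-sheet reasoning here.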
If you're looking for something simple to try to work this out: I bought a smart plug a few years back that could record usage, for around 20 USD; you can then move it between your devices to get a sense of each one's usage.
Long-term tracking of individual devices' energy usage is nice, but just knowing from past measurements how much a device tends to use is already very useful.
Do you have a conventional HWS (hot water system)? These are notoriously power-hungry, often poorly maintained and calibrated, and hard to monitor.
You can buy a power meter plug - that sits between appliance and socket - and work your way around almost all your appliances apart from, typically, oven, air conditioner(s), and hot water systems. For those you're going to need to experiment by turning as many things off as you can, to establish a baseline, and review your switch meter periodically for short (several minute) intervals, with and without the larger appliances turned on.
(You can get induction coil systems to report usage of these larger appliances, but they're typically onerously priced.)
Sounds pretty good as such. The worry to have these days, though, is if we can also get this without energy usage data being traded between all sorts of shady companies and/or criminal organizations.
About monitoring water consumption: maybe using a webcam + OCR would help to recognize the reading on a water meter?
Then Home Assistant would be helpful for seeing charts of energy consumption, etc.
Like you, I have a solar system (Sunsynk) and it gives all sorts of wonderful stats, but I hardly bother looking at it, as it is meaningless information on its own. I just wish it were more intelligent about balancing battery discharge vs. the load-shedding schedule vs. the day's weather.
I have to manually set things up for winter vs summer.
My geyser, which is not on solar, is on an IoT controller (Hotbot); I set the times I want the water to be heated, and it knows when there is load shedding (via 3G), so it can intelligently deviate from my set times.
Corollary: people are cheap and lazy, so if you want them to not do something, make it cumbersome in a way that does not justify the cost difference, or vice versa.
SDGE started breaking down our energy usage by "type" a few months ago. I'm not exactly sure how, but they can categorize things as laundry, lights, entertainment, computers, always-on, etc.
My PG&E (Bay Area) does the same, but I'm pretty sure it's an estimate based on house size, number of people, and a questionnaire I filled out with questions such as "how many times do you run the dishwasher per week". Your house pulls electricity through the breaker box or it doesn't.
Then again, it's possible different appliances have specific electrical signatures, like a 5 kW draw for 1.5 hours = dishwasher.
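That "signature" idea is a real technique (load disaggregation, sometimes called NILM). A naive sketch matches step changes in whole-home power against known appliance step sizes; the signatures, tolerance, and readings below are made-up examples.

```python
# Naive load disaggregation: label step changes in whole-home power
# by matching them against known appliance signatures (made-up values).
signatures_w = {1500: "dishwasher heating element", 150: "fridge compressor"}

def label_steps(readings_w, tolerance_w=50):
    """Return (appliance, on/off) events detected between successive readings."""
    events = []
    for prev, cur in zip(readings_w, readings_w[1:]):
        step = cur - prev
        for sig, name in signatures_w.items():
            if abs(abs(step) - sig) <= tolerance_w:
                events.append((name, "on" if step > 0 else "off"))
    return events

# e.g. whole-home watt readings sampled once a minute
events = label_steps([200, 1720, 1700, 230])
```

Real systems look at much more than step size (reactive power, harmonics, duration patterns), which is how two-second sampling can fingerprint individual devices, as the smart-meter privacy quotes elsewhere in this thread point out.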
Realtime data is nice, especially when I can get it from the source instead of having to fiddle with individual outlets. When we upgraded to smart meters in the UK (which you actually get an account credit for doing), our power company gave us a portable touch display that read from both the gas and electricity with different breakdowns of consumption.
> Now I just wish I had something as convenient for monitoring water consumption.
You can set this up non-invasively with ultrasonic flow meters like the TUF-2000M. It isn’t cheap, but it works quite well if you don't want any of the risks associated with cracking open your pipes.
(There are also cheaper options if you don’t mind opening up your pipes too.)
Thanks for sharing. Does the system supply grid voltage to your home autonomously, i.e. when the grid is offline? Does it work on all three phases? In Germany, most of the systems need the grid to be online in order to work.
Unpopular opinion: shifting environmental blame to individual consumers is a form of gaslighting (pun not intended, but still good). You may spend a fortune on solar, heat pumps and A-class appliances, but you will never save a fraction of the power consumption of a single datacenter cabinet or aluminium furnace. One large enough manufacturing plant cancels out all the environmental initiatives of a medium-sized town, including public spaces.
Also, we need to stop gaslighting people with fake solutions like "planting trees" or "deleting emails". Sure, it's good, but the scale is so small that you're actually losing focus on what really makes changes.
I like how the author is surprised by the technological aberration that is Linux-powered home appliances: a Node server to power and publish, over WiFi, a web site, an API, a web socket, while the site is being displayed by an outdated webview engine within a heavily constrained terminal which can't be reused for anything else. That's... the norm.
All this is very common. And yet displaying a couple of digits and a bar graph could be done with a pair of microcontrollers communicating over some wired bus.
With the power supplies of this era, this pair of devices probably pulls 16 W idle. Running 24/7, they probably consume as much as a small fridge does in total. The LCA of the solution must be appalling as well, especially compared with a few one-dollar microcontrollers.
The worst of all is that this whole mess turned into bricks probably 3 years after it was installed, maybe less.
The reason why the Mirai botnet is still at large is: Android.
From a business perspective nobody wants to pay the costly people that can do microcontroller programming. Frontend devs are dirt cheap, especially for something as simple as that interface displaying the bar charts.
From the employee perspective, my impression was that EE developers tend to get lower salaries than web developers.
But it could be the case that building an android or web app for a simple UI would take less dev-months than an embedded app with similar functionality.
There is also an enormous amount of flexibility gained when, instead of designing and building your own single-purpose device, you just use a cheap, mass-produced, off-the-shelf, general-purpose device.
> within a heavily constrained terminal which can't be reused for anything else.
Except for botnets and/or spying. Some of those boards already contain MEMS microphones and cameras (the box in the picture even shows the camera lens). I'd have taken the device apart for a look inside, or at least run some diagnostics to see which hardware was installed/detected.
Pulling wires through anywhere after it's finished is an immense installation hassle though. It might be possible...or it might be completely impractical even if you can (i.e. low voltage buses and unshielded power wires don't play nice together if they're parallel).
Yeah I don't understand why he is shocked that this communicates wirelessly. He even bought a modern flat with Ethernet because he clearly knows how much pain it is to add wiring to a house. Very weird.
> With the power supplies of this era, this pair of devices probably pulls 16 W idle. Running 24/7, they probably consume as much as a small fridge does in total. The LCA of the solution must be appalling as well, especially compared with a few one-dollar microcontrollers.
At the average cost of electric in the USA this amounts to under $2/month. Seems negligible to me?
For a device with so little functionality, and non-critical functionality at that, I wouldn't call $24/year negligible. My whole-home Ryzen router/server idles around there. Honestly, I'd bet the fuse was missing because the last tenant was an engineer, investigated this thing themselves, found it a useless waste, and pulled it.
> A Node server to power and publish, over WiFi, a web site, an API, a web socket, while the site is being displayed by an outdated webview engine within a heavily constrained terminal which can't be reused for anything else. That's... the norm.
I really wonder why this happens. Seems penny wise & pound foolish. Perhaps they failed to hire the right developer for the right abstraction level, and ended up with "web developers" I guess.
Money. That's almost always the reason for "why would they do that?"
It's much cheaper and more sustainable for the wealthy and powerful to train individuals on very high-level technologies and then reuse their skills in every way they can, regardless of feasibility, the economic and ecological footprint, or any concern outside of making a profit.
Electron is not some comic book villain. JavaScript is not horrible and can be the optimal choice for many software applications.
But these technologies and tools are easy to teach to many workers who may or may not: understand the computational architecture to come up with better economic efficiencies, have interest in applying their skills to properly solve a problem rather than put food on the table, and so on.
The higher level the skill is, the less interest and deep systemic understanding needed for the job: many new jobs created.
> There were two strings printed with labels “SSID” and “Pwd”. I froze in horror. They wouldn’t dare. It is literally 3 meter distance. These are embedded devices, they do not need this complexity…
Not surprising at all. I would expect that a lot of these are bought as retrofits, and not as a part of new construction. Running wires through existing walls can be annoying, and they don't want to put that barrier to sale in front of them. And you can get a good-enough WiFi chipset for a few bucks these days.
> I need a 3A fuse [...] After installation, I checked the temperature of the fuse multiple times during the day to get at least some indication that things are not going to get worse. It worked fine for a more than a week now, but I still do not recommend experiments like this to anyone.
Probably don't need to be so worried here. If it's a 3A fuse, the entirety of your apartment's mains power is not running through it. A 3A fuse would burn out in a fraction of a second if you tried to do that.
Also, oh, man, Jazelle. I'd forgotten about that. Hardware support for Java bytecode... that did not pan out well.
> A 3A fuse would burn out in a fraction of a second if you tried to do that.
He bought it on Amazon. He has every reason to be worried that it won't burn out. Louis Rossmann did a video[0] where he put 8 amps through a 2-amp fuse and left the room for quite a long time; I think it was several minutes with 8 A going through a 2 A fuse.
In general, people have the wrong idea about how fuses work. They're not supposed to blow at their rated current, they're supposed to withstand it indefinitely, and only blow at much higher currents. Look up any datasheet from a well established manufacturer and see for yourself (like this one from littelfuse: https://littelfuse.com/products/fuses/cartridge-fuses/5x20mm... )
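A crude model of that behaviour, with illustrative breakpoints roughly in the spirit of a typical time-lag fuse (not taken from any specific datasheet):

```python
# Rough model: a fuse carries 100% of rated current indefinitely,
# opens slowly at moderate overloads, and only opens fast well above
# rating. Breakpoints are illustrative, not from a datasheet.
def fuse_behaviour(current_a, rated_a):
    ratio = current_a / rated_a
    if ratio <= 1.0:
        return "carries indefinitely"
    if ratio < 2.0:
        return "opens slowly (minutes, or maybe never)"
    return "opens fast (seconds or less)"
```

So a "3A fuse" passing 3 A forever is correct behaviour, not a defect; the question with no-name parts is whether they open at all at the currents where they should.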
Amazon had one of their buildings in California shut down a few months ago by the fire department when a generator started smoking. It was probably due to a bad fuse they bought off Amazon. https://signalscv.com/2023/07/fire-breaks-out-at-amazon/ That's how blinded by their avarice Amazon has become; they can't even protect their own house. Notice how the Nilight fuses (https://amzn.to/3S06G2n) are still listed, even after a YouTube video with 300,000 views demonstrated that their 2 ampere fuse takes 10 amps to blow. I even had an electrical fire in my house recently, due to components that I purchased off Amazon. I know Amazon monitors Hacker News PR closely, since they took down those ChatGPT generated listings within minutes of us posting them here. Yet they do nothing about product listings that put our lives, and their own lives, in critical danger.
Well yeah, there's that. My assumption about the worry the author expressed was that it was just an "I'm a little uncomfortable with mains power" type worry, not "did I buy a crappy part that's going to explode" type worry. If it was the latter, that's, well... entirely avoidable.
Also: WiFi has inherent galvanic isolation with a wide gap.
It isn't strictly necessary, as anyone here obviously knows, but it can be a cost-effective way to isolate the [electrical] pokey-bits from the [meat-based] pokey-bits, and to avoid loops when things go wrong.
> Also, oh, man, Jazelle. I'd forgotten about that. Hardware support for Java bytecode... that did not pan out well.
As someone who was too young to be paying any attention during this time, what were some of the reasons this didn’t pan out? Java seems so dominant looking back that I’m surprised something like this wouldn’t have been a success.
The Lisp machine failed because Lisp compiler technology got better and better at targeting generic 32-bit CPU hardware, which was becoming increasingly cheap and plentiful. So the benefits of having all this custom hardware to specially execute Lisp code were nullified -- leaving only the costs.
The same thing happened to Java in hardware. It seemed like a good idea at the time because it let developers target a language they were already familiar with, and it presented an alternative to Wintel, especially when you realize that Java was all the rage as a sort of universal programming environment; in particular, J2ME was a big deal for proto-"smart" phones before the iPhone came along. But embedded Java didn't really pan out, memory and CPU time got cheaper, and compiler and JIT tech improved to the point where there was just no benefit to adding the hardware it took to decode Java instructions. So Jazelle was deprecated and replaced with something called ThumbEE, a more generic framework based on ARM's Thumb instruction set for running code for an abstract machine, providing features like automatic null-pointer checking. For example, you could set up a ThumbEE environment for running Python or .NET code in addition to Java. Nowadays even ThumbEE is deprecated; neither feature appears in ARMv8 processors, for instance.
I have also wondered this for years, and was always told "because JITs work better", but that felt a bit handwavy. Luckily for both of us, David Chisnall just published an article in ACM Queue about designing ISAs that properly explains the reasoning behind Jazelle and why it did not work out in the long run:
> Small code is also important [for a simple single-issue in-order core]. A small microcontroller core may be as small as 10KB of SRAM (static random access memory). A small decrease in encoding efficiency can dwarf everything when considering the total area cost: If you need 20 percent more SRAM for your code, then that can be equivalent to doubling the core area. Unfortunately, this constraint almost directly contradicts the previous one [about decoder complexity]. This is why Thumb-2 and RISC-V focused on a variable length encoding that is simple to decode: They save code size without significantly increasing decoder complexity.
> This is a complex tradeoff that is made even more complicated when considering multiple languages. For example, Arm briefly supported Jazelle DBX (direct bytecode execution) on some of its mobile cores. This involved decoding Java bytecode directly, with Java VM (virtual machine) state mapped into specific registers. A Java add instruction, implemented in a software interpreter, requires at least one load to read the instruction, a conditional branch to find the right handler, and then another to perform the add. With Jazelle, the load happens via instruction fetch, and the add would add the two registers that represented the top of the Java stack. This was far more efficient than an interpreter but did not perform as well as a JIT (just-in-time) compiler, which could do a bit more analysis between Java bytecodes.
> Jazelle DBX is an interesting case study because it made sense only in the context of a specific set of source languages and microarchitectures. It provided no benefits for languages that didn't run in a Java VM. By the time devices had more than about 4MB of RAM, Jazelle was outperformed by a JIT. Within that envelope, however, it was a good design choice.
> Jazelle DBX should serve as a reminder that optimizations for one size of core can be incredibly bad choices for other cores
So: a decent JIT works better if you can afford the overhead of the JIT. Jazelle was only a good idea in a very brief period of time when this wasn't true, and even then only if you insist on running a Java VM.
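To make the quoted comparison concrete, here is a toy stack-machine interpreter in JavaScript (invented opcodes, not real JVM bytecode) showing the per-instruction fetch-and-dispatch overhead that Jazelle moved into hardware:

```javascript
// Toy stack-machine interpreter. Every bytecode costs a fetch (the array
// read) plus a dispatch branch (the switch) before any real work happens;
// Jazelle's hardware decode eliminated exactly this overhead.
const IPUSH = 0; // push the following literal
const IADD  = 1; // pop two values, push their sum
const HALT  = 2; // stop, return top of stack

function run(code) {
  const stack = [];
  let pc = 0;
  for (;;) {
    const op = code[pc++];        // "a load to read the instruction"
    switch (op) {                 // "a conditional branch to find the handler"
      case IPUSH: stack.push(code[pc++]); break;
      case IADD:  stack.push(stack.pop() + stack.pop()); break; // the actual add
      case HALT:  return stack.pop();
    }
  }
}
```

`run([IPUSH, 2, IPUSH, 3, IADD, HALT])` returns 5 after paying the fetch/dispatch tax four times over; Jazelle made that tax disappear, and JITs then made it irrelevant.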
>> To be honest, the whole thing was a bit scary, since I was very close to the mains
I laughed at this. Changing a fuse is… a bit scary? They literally teach this in elementary school in the U.K. - or they did. As you say, no need to fretfully check the fuse - either it blows or it doesn’t, and you’ll know when it does. At least he didn’t find the receptacle holding a dead fuse, carefully wrapped in the ceremonial aluminium shroud of eternal life and certain death, which is a crime I may have committed in my younger, more fire-prone years.
I find it interesting how uncomfortable some people are outside of their comfort zones - but then I am a person who spends his life sticking his nose in stuff he has no business with.
I don't know how old the author is, but I'm not surprised when people even 10-15 years younger than I am (I'm in my 40s) shy away from digging into the guts of how things work.
I feel like I was at the tail end of when it was ok to experiment with technology as a kid and teen. The early '00s brought much more in the way of disposable, locked-down devices. Kids growing up today (despite the educational push of orgs like the Raspberry Pi Foundation) are presented with hermetically-sealed devices that present a sanitized interface. Manufacturers explicitly don't want their customers taking things apart, discovering how they work, or tinkering with them in any way... and often even try to put legal barriers in place to keep people from doing stuff like this.
This is a far cry from when I was very young (and before I was born) when computers and kits would come with full schematics and datasheets!
> Running wires through existing walls can be annoying, and they don't want to put that barrier to sale in front of them.
It also makes it more convenient to compromise the device from across the street (or across town with a directional antenna). Though of course that's not a problem if your security is up to par and the device continues to receive regular security updates, and we can only surmise that the author has discovered a rare outlier in this space where that is not the case.
> we can only surmise that the author has discovered a rare outlier in this space where that is not the case.
Exactly what I was thinking! What luck that the author found the single IoT device out there that's a cobbled-together piece of bodged electronics designed by a graduate of a webdev bootcamp with a Corel Draw focus. A device that, while only ~15 years old, is not only hopelessly useless but also obsolete and insecure.
It's a good thing all other consumer IoT device manufacturers think about and prioritize security and longevity! Also, that customers nowadays are more focused on installing something fit-for-purpose and sustainable once, rather than buying the cheapest shit possible with the blinkiest LEDs.
I shudder to think about how long they tried to get the string-and-cups based telephone to work in my building until the 1930's when they installed the copper still used today for DSL. Or how terrible the paper-straw based water system must have been up to the 1890's when they realized investing in metal pipes has advantages. So glad the days of short-term thinking are behind us.
Sure, but manufacturers -- as we should well know by now -- don't particularly care about that.
And for a device like this -- a rare one where it seems they sold it without any kind of online subscription service -- their goal is to sell units, and telling people they'll have to cut holes in their walls and run wires (for most people this probably means hiring someone) is certainly going to sell fewer units.
> device continues to receive regular security updates
Have to reply to this, and my response was covered a bit by your statement of "security up to par".
Nothing should be considered secure. All those bug bounties exist to entice black hats into giving up juicy pre-0day vulnerabilities.
So even when a device is up to date with security patches, we must understand that there are countless unknown bugs still needing to be patched, often discovered by people who will never tell, never disclose, never report, and will only use them for nefarious purposes.
This is why security is nothing without monitoring.
And why nothing is ever "safe", only likely "more safe" due to a security update.
Consider everything that is network connected as compromised. Everything.
> Probably don't need to be so worried here. If it's a 3A fuse, the entirety of your apartment's mains power is not running through it.
If it's a "3A fuse" that doesn't blow at 6A or worse, then it will get very hot (fire hazard) if/when there's a short regardless of the distance to the mains power.
If it truly is a 3A fuse, then great. If it's bought from Amazon then I doubt it's truly a 3A fuse.
Louis Rossmann over at YouTube has been covering this Amazon fuses thing. None of the fuses he tried from the top results blew until he put 4x or 5x the rated current through them.
1) A dead short in the circuit: the fuse will blow pretty much instantly.
2) An overload on the circuit: the fuse will blow sooner or later depending on how large the excess load is. If the fuse is rated at 3A, it's not going to be fine at 2.9A and then instantly blow at 3.1A. You'd need actual current monitoring for that.
And some fuses are "delayed" to allow an overload for a few moments, such as when starting a motor.
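A rough way to quantify "sooner or later": fast-blow fuses are characterized on datasheets by a melting I²t rating. A back-of-the-envelope sketch (the 12.5 A²s figure is an assumed value for illustration, not from any particular datasheet):

```javascript
// Approximate time-to-blow from a fuse's melting I²t rating.
// Only meaningful well above the rated current; near the rating the
// element sheds heat as fast as it generates it and never blows.
function blowTimeSeconds(i2tAmpSqSec, faultAmps) {
  return i2tAmpSqSec / (faultAmps * faultAmps);
}

// With an assumed 12.5 A²s element: a 50 A dead short blows it in ~5 ms,
// while 6 A (double an imagined 3 A rating) takes a third of a second or so.
```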
None of this disputes that Amazon is well known to sell garbage, and not just limited to fuses. That's why I don't buy anything there.
> Also, oh, man, Jazelle. I'd forgotten about that. Hardware support for Java bytecode... that did not pan out well.
I'd love someday to learn more about why Jazelle failed.
The first SoC I worked on almost 20 years ago was built around an ARM926EJ-S, just like in the story. It was built for Nokia, who used Symbian OS [1], and supported user-installable apps written against Java Micro Edition [2].
The utter mess of Symbian's app discovery and installation, I suspect, was a prime reason Apple created their App Store for the iPhone.
Nevertheless, the fundamental concept of HW-accelerated Java apps doesn't sound crazy. What happened? Were they just stuck with a sinking ship, Symbian?
The name of the company (Netthings) seemed familiar to me; it turns out I had read a blog article about the hardcoded NTP servers they used in their devices being firewalled off, and the devices therefore losing time sync.
My guess was that it might have had a time-correction feature via integration with the British radio time signal, but this device is from 2015 (the article says), so probably not.
> Turns out, they found out an even more innovative time sync mechanism. When you open the UI in the browser, they quickly redirect you to "/set-time/" + Date.now(). This sets a global variable in the Node.js app responsible for "now".
I know this sounds pithy (and it is!) but you'd be surprised exactly how cheap and cost-effective WiFi-enabled SoCs are. A lot of the time we're getting WiFi for free, and most of those SoCs don't have an Ethernet controller by default, so it's more cost-effective to use WiFi if it fits your use case.
Other physical protocols/connection types can be supported, of course (I wonder what the longest I2C run ever is), but when you're talking about a retrofitted client like this one, WiFi or wireless protocols in general are best.
Purely from a materials perspective, 2x cheap WiFi-capable microcontrollers (e.g. esp8266) will cost the manufacturer something like $4-$5 total for both devices - which is comparable to 3m cable+connectors+cheap chips to handle the cable connection (even ignoring the cost of some person to install the cable which is far more expensive than that) so indeed I don't get why the author considers that doing the connection over WiFi is somehow wasteful.
ESP32s are basically universal at this point. You can have them for under a dollar if you order in any sort of bulk, and you get WiFi and Bluetooth right out of the box. At this point it's more expensive to not use WiFi.
FWIW I asked the OP to send me the /etc/shadow file, which he had tried to bruteforce with John the Ripper (unsuccessfully). And because it contains old-school UNIX crypt() hashes, I was able to crack the root pw in ~7 hours with hashcat on 12 x RTX 4090: the root password is Newt@rd$
Not that it's particularly useful, as this device lets you gain root access via TCF anyway without authentication. But maybe this password is reused in other areas...
If you’re in the Netherlands you can get something like a “slimme lezer” (“smart reader”), plug it into the P1 port of your energy meter, and it will pop up in Home Assistant with the right sensors.
The energy dashboard will give an overview of your gas and electricity usage, solar production, proportion used from grid/solar and even a home battery if present. It’s really great.
Combine it with some Aqara (Zigbee, but easily overloaded) or Shelly (WiFi, and I find them very robust) energy-monitoring power sockets and you get a very good idea of the simplest measures to take to save power. You can even add cost per kWh and per m³ (gas) to the sensors in HA.
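Those P1 sensors come from the meter's DSMR telegrams, which are plain text and easy to parse yourself if you'd rather not use a ready-made reader. A minimal sketch (the OBIS code 1-0:1.8.1 is the standard tariff-1 consumption register; the sample line format follows the DSMR companion standard, but double-check against your meter's output):

```javascript
// Extract the tariff-1 "electricity delivered to client" total (kWh) from
// a DSMR P1 telegram. Telegrams are newline-separated OBIS lines such as:
//   1-0:1.8.1(012345.678*kWh)
function readTariff1KWh(telegram) {
  const m = telegram.match(/1-0:1\.8\.1\(([\d.]+)\*kWh\)/);
  return m ? Number(m[1]) : null;
}
```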
If only the meters had any power nearby, or offered a 5V USB port, so you could plug in your reader and forget about it. But no. Instead I'd have to keep a small Li-ion battery alive at -20°C, since the meters are typically outside and there is never any power nearby; they sit in a closed cabinet. That leaves only the people with indoor power meters, and I know of zero such cases with detached houses; I think the power companies require the meter to be outside, in a cabinet they can access without entering the house.
That's really cool though I wish we could do that.
He has been doing a pretty lousy job of managing his diet.
Until his doctor prescribed a monitor for a few weeks.
This is a device that looks like a big Band-Aid that you put on your arm, inserting a fine needle under the skin, and communicates with an app on the phone, reporting things like glucose levels.
Once he realized the effects of the foods he was eating, he immediately changed his diet, and has been sticking to it, since (he no longer wears the monitor).
The UI of the app was pretty good. The historical data readout is what did it for him.
I can confirm that being more aware of the results you get when you eat the wrong food can be a strong incentive to help you choose better food.
Ultimately the biggest wins are when big appliances and heating/cooling respond to self-production or take advantage of times when electricity is cheap (if you're on a per-hour or per-day dynamic contract), whether that's with a simple timer like the one you installed, a relay that shuts down heating when you're cooking or something fancier like a Fronius Ohmpilot [1] that tweaks heating power to exactly match PV (over)production.
[1] https://www.fronius.com/en/solar-energy/installers-partners/...
In Finland you can get an electricity contract that follows the hourly spot prices. Usually the hourly price varies in the range of 5c to 20c/kWh, but sometimes it jumps up to 40c, even 80c/kWh. The record was 2€/kWh for a couple of hours in one day.
Current hourly prices for today and tomorrow:
https://oomi.fi/en/electricity/electricity-contracts/active/...
People who have chosen this kind of contract usually reduce their consumption during the ridiculously expensive hours, which tend to occur when low wind-energy production coincides with some power plant being offline for maintenance.
You can also get a contract with a fixed price, if you want.
https://www.theguardian.com/money/2023/jan/23/households-gre...
Of course, why wouldn't you? If the assumption isn't that effectively unlimited power is available on demand you adjust use accordingly.
On sunny days with excess power, maybe you charge and do laundry. On a stretch of cloudy days you avoid long periods of cooking or using large tools like welders or air compressors.
Adjusting to our environment rather than chasing convenience is a very reasonable approach to making a real dent in reducing our environmental impact.
There’s been some interest in this locally due to energy regulation moving to a more punishing system for peak usage. Also high electricity prices. And the topic came up that people would be encouraged to do things that are considered unsafe like washing their clothes while they are sleeping.
Things like this need to be automated. (And be safe.) Manually following fancy gadgets won’t make much of a difference.
Meanwhile we’re (or were, don’t know the current status) selling hydropower to the rest of the Europe during the fall, emptying the reservoirs before the winter so that electricity prices become unreasonable (leading to strategies like washing your clothes while you sleep or just being content to freeze while indoors).
And I have never seen a good argument for the export/import (Europe) arrangement. But I guess we can try to sideline that whole conversation by nagging people to turn off the light in the bathroom when it’s not busy.
However, knowing that a particular device is bad means that when I eventually need to replace it in the future, I will also factor in energy efficiency and features such as it being able to check energy prices in my purchasing decision.
Alternatively I've had success in wiring up a temperature probe directly to the incoming water line, and comparing that temperature to the ambient temperature. Where I live that works because the water arrives from underground & is always much cooler than ambient air. The time-integrated difference between the two is a proxy for how much water you use... this is much more involved to get meaningful data from, tho.
---
Edit: a proximity sensor that detects metal might be the most straightforward thing, if you have a water meter with a rotating metal gauge https://www.alldatasheet.com/view.jsp?Searchword=LJ12A3-4-Z/...
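The time-integration idea from the temperature-probe approach can be sketched in a few lines (the sampling interval, field names, and the clamping of negative readings are all assumptions; the output is a relative proxy until calibrated against a real meter reading):

```javascript
// Integrate (ambient - pipe) over time. While water flows, the inlet pipe
// is pulled toward the cold supply temperature, so a bigger integral means
// more water drawn. Readings at or above ambient are clamped as noise.
function usageProxy(samples, intervalSeconds) {
  // samples: [{ pipeC, ambientC }, ...] taken at a fixed interval
  let degreeSeconds = 0;
  for (const s of samples) {
    const delta = s.ambientC - s.pipeC;
    if (delta > 0) degreeSeconds += delta * intervalSeconds;
  }
  return degreeSeconds; // calibrate: litres ≈ k * degreeSeconds for some k
}
```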
Home Assistant can do that in the Energy dashboard, and you can answer questions and learn surprising things: how much energy my "rack" (UPS + Mac mini + 5-disk bay + a few other things) actually uses vs. e.g. my fridge or my washing machine; that my desk computer's draw is actually quite low, but boy does the screen cost a lot when active; what charging the electric bike costs; what the effect is of setting the thermostat to 19 instead of 20 in winter; or, oh wow, in summer the fan that we use a lot to make things bearable actually ends up using as much energy as our water heater!
(power measurements are done using Shelly Plug Plus S + 3EM + 4PM devices, thermal measurements using Shelly H&T Plus)
I find it better to remain ignorant in that regard. Jokes aside, it's also interesting seeing wall draw vs UPS draw vs PSU draw if each piece of your equipment supports that.
The Shelly stuff is also quite fun to play with (I recommend a AC adapter for H&T). I have the little black spherical sensors and the data resolution is significantly worse on battery since it tries to sleep in low power state as much as possible. It's fun to see the server cabinet (mine's enclosed) vs room vs different room temps. You can also see when the HVAC cycles on and off and when someone takes a shower (humidity spikes).
For example, I was surprised to see how much our electronics (stereo, amplifier, TV, etc.) in the living room use, even when off (some devices are older, with high standby currents). It motivated me to put everything on a timer that only turns the power on in the evenings, since that's the only time they get used.
It's a small thing, but small things add up.
I was surprised to learn that a timer itself also uses power. I borrowed a Kill-a-watt from the library and found that a two-decades-old timer uses 2.3W while a newer one uses 0.6W. That tells me I should keep the old timer only for rare occasions.
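To put a price on that difference (the $0.15/kWh figure is an assumed tariff; substitute your own):

```javascript
// Yearly cost of a device's continuous standby draw.
function yearlyCostUSD(watts, pricePerKWh = 0.15) {
  const kWhPerYear = (watts * 24 * 365) / 1000;
  return kWhPerYear * pricePerKWh;
}

// Old 2.3 W timer: ~20 kWh/year, roughly $3/year at the assumed price.
// New 0.6 W timer: ~5 kWh/year, under $1/year. Small, but it adds up
// across every always-on wall wart in the house.
```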
This is one of the reasons all UK homes are being fitted for free with smart meters. (There are others, such as enabling better grid control.)
> You want to be able to see usage to a resolution of at most 5 minutes.
Mine updates every few seconds and has a set of traffic-light LEDs at the bottom giving a visible guide to energy use.
https://www.edfenergy.com/smart-meters/using-a-smart-meter/c...
Once the government has that info, it will be able to come up with bespoke taxes for you according to what it ordains as fair use. 'Your showers are too long', 'your toolshed is too big a draw on the electric' therefore 'you need to buy carbon credits to offset the environmental damage you are causing'.
It's the slow descent to greater tyranny, and loss of personal control. It's amazing that people put up with it, but a slight discount in the short term, or visibility of your own data, is probably enough to get most people to accept spying infrastructure in their lives forever.
I think these devices must be required to send the data to the utility company and the utility company must be forced to make the data easily accessible in a standard format so that independent analysis is trivially possible.
This way you don't have a situation where a device manufacturer goes out of business and the capability to monitor is lost.
I've made my own little Raspberry Pi Pico display that queries today's energy prices and shows them, but I have not been able to show today's energy consumption alongside (and therefore show the day's cost so far). Octopus provides an API to query the kWh used... but only for the previous day. I even got their little Octopus Mini, which broadcasts live usage to their app, but I have not been able to query the live data from it from my Pico; I don't have the necessary skills in web technologies to do that, unfortunately :-(
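For what it's worth, the consumption endpoint returns plain JSON, so the web part can stay small. A hedged sketch in Node (the URL shape and response fields are paraphrased from Octopus's public API docs and should be double-checked; the MPAN, serial, and key are placeholders you'd replace with your own):

```javascript
// Sum half-hourly readings from a response shaped like Octopus's
// consumption endpoint: { results: [{ consumption, interval_start }, ...] }
function totalKWh(response) {
  return response.results.reduce((sum, r) => sum + r.consumption, 0);
}

// Fetching it (Node 18+ ships fetch; the API key goes in as a Basic-auth
// username with an empty password):
async function fetchConsumptionKWh(mpan, serial, apiKey) {
  const url =
    `https://api.octopus.energy/v1/electricity-meter-points/${mpan}` +
    `/meters/${serial}/consumption/`;
  const res = await fetch(url, {
    headers: {
      Authorization: "Basic " + Buffer.from(apiKey + ":").toString("base64"),
    },
  });
  return totalKWh(await res.json());
}
```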
As in, yeah, running the washing machine is power consuming, I knew that, and the same goes for the electric oven or for the vacuum cleaner, but what am I supposed to do with that information? Not wash my clothes anymore? Not using the oven? Leaving dust all over the place for longer?
The former is something most people have no idea about, but the Powerpal has been a smashing success in helping consumers understand what is using energy.
[1] https://www.powerpal.net/
Unfortunately this is a closed system where the energy company will not let you access the data outside of their own dashboard. I would like to think that this is against the national trend toward Open Data but it is what it is.
There are some funky solutions where you connect a board to an input of the meter and somehow get the data in Home Assistant but it is like I said "funky" (completely guerilla style, without any backing of the power company and if you have a problem it will probably be your fault).
The first of those is a genuine concern
> In Australia debt collectors can make use of the data to know when people are at home.[63] Used as evidence in a court case in Austin, Texas, police agencies secretly collected smart meter power usage data from thousands of residences to determine which used more power than "typical" to identify marijuana growing operations.[64]
> Smart meter power data usage patterns can reveal much more than how much power is being used. Research has demonstrated that smart meters sampling power levels at two-second intervals can reliably identify when different electrical devices are in use.
from https://en.wikipedia.org/wiki/Smart_meter#Privacy_concerns
IIRC my smart meter in the UK lets me choose between 30m and 24h reporting, probably as a response to these fears, but you just set it as a preference on the provider website, not locally on the meter. It would be trivial for them to just be lying about that and logging data to GCHQ at the maximum precision. That may seem outlandish, but so did PRISM until it was revealed
Some people also refuse to have one so that they can't be forced onto a dynamic-priced tariff. At the moment those are opt-in, but I think their concern is a fair one too. Though if the powers that be wanted to coerce people onto them, they could simply crank the price of fixed tariffs anyway
As a toy project I put a raspi with a small screen in my living room that would show the temperature and humidity for the last three days as a graph. It was somewhat of an eye catcher in the living room. The data was always interesting. Even if the humidity did not change much, at the very least you could always see that it was colder during the nights, which told me it was working.
It taught me so many things. The effect of opening the windows briefly vs. for a long time on temperature and humidity. That the sun shining into my room was way more effective than my heating. How the temperature decays exponentially when my heating turns off, and how the length of the on/off cycles depends on the temperature outside. Etc.
Likewise, it's ridiculous that devices don't automatically save energy when unused; it's such a simple change that it should be standard.
Asking the user to worry about it should not be needed. That's the goal we should strive for. For now, monitoring will have to do, but I think a combination of solar + iron-air batteries will make everybody living in a sunny-enough place independent from the grid with ample margin, and we can supply the rest with nuclear.
If my 2000w heater is running on the 800w setting and turns on when my room has dropped below the point I consider acceptably chilly and turns off above that point, how much electricity have I used in the last hour?
If I have 3 15w LEDs on a dimmer and run them intermittently throughout the day, how much electricity have I used in the last hour?
If my TV is off, but plugged in, and accesses the Internet a few times a day automatically to check for new versions, how much power has it used today?
I think this makes the case for, at least, a kill-a-watt style device. A whole home solution with sufficient report granularity and a report interface visible in the home would be worth the extra trouble, IMHO.
Edit: For the record, these are all real-scenarios from my house.
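The heater case is the easiest to put numbers on: for a thermostat-cycled load, the energy over a period is rated power times the fraction of time it was actually on. The 40% duty cycle below is a made-up figure; the whole argument for monitoring is that you don't otherwise know it:

```javascript
// Energy (Wh) used by a load that cycles on and off over a period.
function energyWh(ratedWatts, dutyCycle, hours = 1) {
  return ratedWatts * dutyCycle * hours;
}

// e.g. the 800 W heater setting, on 40% of the past hour: about 320 Wh.
```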
I've seen decent reviews on the "no plumbing required" water meters. Flume has a product available in the U.S. that gets pretty good reviews (https://www.amazon.com/Flume-Smart-Water-Monitor-Detector/dp...).
Long term tracking usage of individual device energy usage is nice, but just knowing from past measurements how much a device tends to use is already very useful.
You can buy a power meter plug - that sits between appliance and socket - and work your way around almost all your appliances apart from, typically, oven, air conditioner(s), and hot water systems. For those you're going to need to experiment by turning as many things off as you can, to establish a baseline, and review your switch meter periodically for short (several minute) intervals, with and without the larger appliances turned on.
(You can get induction coil systems to report usage of these larger appliances, but they're typically onerously priced.)
Or hire an electrician to do it, if you aren't totally confident in your ability to do that safely.
You mean energy companies and governments?
I have to manually set things up for winter vs summer.
My geyser, which is not on solar, is on an IoT controller (Hotbot), and I set the times I want the water to be heated; it knows when there is loadshedding (via 3G) so it can intelligently deviate from my set times.
There are some screenshots here that kind of give you the idea: https://www.sdge.com/energy-management-tool
Then again, it's possible different appliances have specific electrical signatures, like +5 kWh over 1.5 hours = dishwasher.
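That idea (matching load "signatures") is the basis of non-intrusive load monitoring. A deliberately naive sketch, with an invented appliance table and thresholds; real disaggregation is considerably more involved:

```javascript
// Match a step change in whole-home power draw against known appliances.
// Wattages and tolerances here are illustrative, not measured values.
const signatures = [
  { name: "dishwasher", watts: 1800, toleranceW: 200 },
  { name: "kettle",     watts: 2400, toleranceW: 150 },
];

function guessAppliance(stepWatts) {
  const hit = signatures.find(
    (s) => Math.abs(stepWatts - s.watts) <= s.toleranceW
  );
  return hit ? hit.name : null;
}
```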
You can set this up non-invasively with ultrasonic flow meters like the TUF-2000M. It isn’t cheap, but it does work quite alright if you don't want any of the risks associated with cracking open your pipes.
(There are also cheaper options if you don’t mind opening up your pipes too.)
The system must just work behind the scenes and optimise energy use as much as possible automatically.
All this is very common. And yet displaying a couple of digits and a bar graph could be done with a pair of microcontrollers communicating over some wired bus.
With the power supplies of that era, this pair of devices probably draws 16 W idle. Running 24/7, they probably consume as much as a small fridge does in total. The life-cycle assessment of the solution must be appalling as well, especially compared with a few one-dollar microcontrollers.
The worst of all is that this whole mess turned into bricks probably 3 years after it was installed, maybe less.
From a business perspective nobody wants to pay the costly people that can do microcontroller programming. Frontend devs are dirt cheap, especially for something as simple as that interface displaying the bar charts.
But it could be the case that building an android or web app for a simple UI would take less dev-months than an embedded app with similar functionality.
The embedded world isn't known for paying well.
Except for botnets and/or spying. Some of those boards already contain MEMS microphones and cameras (the box in the picture even shows the camera lens). I'd have taken the device apart to look inside, or at least run some diagnostics to see which hardware was installed/detected.
At the average cost of electricity in the USA this amounts to under $2/month. Seems negligible to me?
https://www.wolframalpha.com/input?i=16+watts+*+24+hours+*+3...
source: https://ourworldindata.org/grapher/daily-median-income
On the other hand, adding one more graph to the microcontroller-powered solution will probably mean redoing it from scratch.
I really wonder why this happens. Seems penny wise & pound foolish. Perhaps they failed to hire the right developer for the right abstraction level, and ended up with "web developers" I guess.
It's much cheaper and more sustainable for the wealthy and powerful to train individuals on very high-level technologies and then reuse their skills in every way they can, regardless of feasibility, economic and ecological footprint, or any concern outside of making a profit.
Electron is not some comic book villain. JavaScript is not horrible and can be the optimal choice for many software applications.
But these technologies and tools are easy to teach to many workers who may or may not understand the underlying computational architecture well enough to find better economic efficiencies, have any interest in applying their skills to properly solve a problem rather than just put food on the table, and so on.
The higher-level the skill, the less interest and deep systemic understanding the job requires: hence, many new jobs created.
Can you expand on that? Would 7-segment displays and a couple of LEDs be enough? Which hardware would you use?
Not surprising at all. I would expect that a lot of these are bought as retrofits, and not as a part of new construction. Running wires through existing walls can be annoying, and they don't want to put that barrier to sale in front of them. And you can get a good-enough WiFi chipset for a few bucks these days.
> I need a 3A fuse [...] After installation, I checked the temperature of the fuse multiple times during the day to get at least some indication that things are not going to get worse. It worked fine for a more than a week now, but I still do not recommend experiments like this to anyone.
Probably don't need to be so worried here. If it's a 3A fuse, the entirety of your apartment's mains power is not running through it. A 3A fuse would burn out in a fraction of a second if you tried to do that.
Also, oh, man, Jazelle. I'd forgotten about that. Hardware support for Java bytecode... that did not pan out well.
He bought it on Amazon. He has every reason to worry that it won't blow. Louis Rossmann did a video[0] where he put 8 amps through a 2 amp fuse and left the room with the current still flowing for quite a long time, I think several minutes.
[0]: https://www.youtube.com/watch?v=B90_SNNbcoU
It isn't strictly necessary, as anyone here obviously knows, but it can be a cost-effective way to isolate the [electrical] pokey-bits from the [meat-based] pokey-bits, and to avoid loops when things go wrong.
Wireless has uses beyond just eliminating wires.
This never occurred to me, thanks.
As someone who was too young to be paying any attention during this time, what were some of the reasons this didn’t pan out? Java seems so dominant looking back that I’m surprised something like this wouldn’t have been a success.
The same thing happened to Java in hardware. It seemed like a good idea at the time because it let developers target a language they were already familiar with, and presented an alternative to Wintel -- especially when you realize that Java was all the rage as a sort of universal programming environment, and in particular J2ME was a big deal for proto-"smart" phones before the iPhone came along. But embedded Java didn't really pan out: memory and CPU time got cheaper, and compiler and JIT tech improved to the point where there was just no benefit to adding the hardware it took to decode Java instructions. So Jazelle was deprecated and replaced with something called ThumbEE, a more generic framework based on ARM's Thumb instruction set for running code for an abstract machine, providing features like automatic null-pointer checking. You could set up a ThumbEE environment for running Python or .NET code in addition to Java. Nowadays even ThumbEE is deprecated; neither feature appears in ARMv8 processors, for instance.
> Small code is also important [for a simple single-issue in-order core]. A small microcontroller core may be as small as 10KB of SRAM (static random access memory). A small decrease in encoding efficiency can dwarf everything when considering the total area cost: If you need 20 percent more SRAM for your code, then that can be equivalent to doubling the core area. Unfortunately, this constraint almost directly contradicts the previous one [about decoder complexity]. This is why Thumb-2 and RISC-V focused on a variable length encoding that is simple to decode: They save code size without significantly increasing decoder complexity.
> This is a complex tradeoff that is made even more complicated when considering multiple languages. For example, Arm briefly supported Jazelle DBX (direct bytecode execution) on some of its mobile cores. This involved decoding Java bytecode directly, with Java VM (virtual machine) state mapped into specific registers. A Java add instruction, implemented in a software interpreter, requires at least one load to read the instruction, a conditional branch to find the right handler, and then another to perform the add. With Jazelle, the load happens via instruction fetch, and the add would add the two registers that represented the top of the Java stack. This was far more efficient than an interpreter but did not perform as well as a JIT (just-in-time) compiler, which could do a bit more analysis between Java bytecodes.
> Jazelle DBX is an interesting case study because it made sense only in the context of a specific set of source languages and microarchitectures. It provided no benefits for languages that didn't run in a Java VM. By the time devices had more than about 4MB of RAM, Jazelle was outperformed by a JIT. Within that envelope, however, it was a good design choice.
> Jazelle DBX should serve as a reminder that optimizations for one size of core can be incredibly bad choices for other cores
So: a decent JIT works better if you can afford the overhead of the JIT. Jazelle was only a good idea in a very brief period of time when this wasn't true, and even then only if you insist on running a Java VM.
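The dispatch overhead the quoted article describes, one load to fetch the bytecode and a branch to find its handler before any real work happens, is easy to see in a minimal software interpreter. This toy stack machine is purely illustrative (it is not real JVM bytecode); Jazelle's trick was doing the fetch and dispatch in the hardware's instruction pipeline instead:

```javascript
// Toy stack-machine interpreter: every opcode pays for a fetch and a
// dispatch branch before the operation itself. This per-instruction tax
// is what Jazelle moved into hardware (and what a JIT compiles away).
const OP_PUSH = 0, OP_ADD = 1, OP_HALT = 2;

function run(code) {
  const stack = [];
  let pc = 0;
  while (true) {
    const op = code[pc++];            // load: read the next "bytecode"
    switch (op) {                     // conditional branch: find the handler
      case OP_PUSH:
        stack.push(code[pc++]);       // operand follows the opcode
        break;
      case OP_ADD: {
        const b = stack.pop(), a = stack.pop();
        stack.push(a + b);            // the actual work: one add
        break;
      }
      case OP_HALT:
        return stack.pop();
    }
  }
}

console.log(run([OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_HALT])); // 5
```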
[0] https://queue.acm.org/detail.cfm?id=3639445
I laughed at this. Changing a fuse is… a bit scary? They literally teach this in elementary school in the U.K. - or they did. As you say, no need to fretfully check the fuse - either it blows or it doesn’t, and you’ll know when it does. At least he didn’t find the receptacle holding a dead fuse, carefully wrapped in the ceremonial aluminium shroud of eternal life and certain death, which is a crime I may have committed in my younger, more fire-prone years.
I find it interesting how uncomfortable some people are outside of their comfort zones - but then I am a person who spends his life sticking his nose in stuff he has no business with.
I feel like I was at the tail end of when it was ok to experiment with technology as a kid and teen. The early '00s brought much more in the way of disposable, locked-down devices. Kids growing up today (despite the educational push of orgs like the Raspberry Pi Foundation) are presented with hermetically-sealed devices that present a sanitized interface. Manufacturers explicitly don't want their customers taking things apart, discovering how they work, or tinkering with them in any way... and often even try to put legal barriers in place to keep people from doing stuff like this.
This is a far cry from when I was very young (and before I was born) when computers and kits would come with full schematics and datasheets!
https://www.bbc.co.uk/bitesize/guides/z6r37nb/revision/6
It also makes it more convenient to compromise the device from across the street (or across town with a directional antenna). Though of course that's not a problem if your security is up to par and the device continues to receive regular security updates, and we can only surmise that the author has discovered a rare outlier in this space where that is not the case.
Exactly what I was thinking! What luck that the author found the single IoT device out there that's a cobbled-together piece of bodged electronics designed by a graduate of a webdev bootcamp with a Corel Draw focus. A device that, while only ~15 years old, is not only hopelessly useless but also obsolete and insecure.
It's a good thing all other consumer IoT device manufacturers think about and prioritize security and longevity! Also, that customers nowadays are more focused on installing something fit-for-purpose and sustainable once than on buying the cheapest shit possible with the blinkiest LEDs.
I shudder to think about how long they tried to get the string-and-cups based telephone to work in my building until the 1930's when they installed the copper still used today for DSL. Or how terrible the paper-straw based water system must have been up to the 1890's when they realized investing in metal pipes has advantages. So glad the days of short-term thinking are behind us.
Just remember that the S in IoT stands for security :)
And for a device like this -- a rare one where it seems they sold it without any kind of online subscription service -- their goal is to sell units, and telling people they'll have to cut holes in their walls and run wires (for most people this probably means hiring someone) is certainly going to sell fewer units.
Have to reply to this, and my response was covered a bit by your statement of "security up to par".
Nothing should be considered secure. All those bug bounties are there to entice black hats into giving up juicy pre-0day vulnerabilities.
So even when a device is up to date with security updates, we must all understand that there are countless unknown bugs still needing to be patched, often being discovered by those who will never tell, never disclose, never report, and will only use them for nefarious purposes.
This is why security is nothing without monitoring.
And why nothing is ever "safe", only likely "more safe" due to a security update.
Consider everything that is network connected as compromised. Everything.
If it's a "3A fuse" that doesn't blow at 6A or worse, then it will get very hot (fire hazard) if/when there's a short regardless of the distance to the mains power.
If it truly is a 3A fuse, then great. If it's bought from Amazon then I doubt it's truly a 3A fuse.
1) a dead short in the circuit. Fuse will blow pretty much instantly.
2) an overload on the circuit. Fuse will blow sooner or later depending on how great the load excess is. If the fuse is rated at 3A, it's not going to be fine at 2.9A and then instantly blow at 3.1A. You'd need actual current monitoring to do that.
And some fuses are "delayed" to allow an overload for a few moments, such as when starting a motor.
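The two failure modes can be sketched with a toy I²t ("melting integral") model. The constant below is invented for illustration, not taken from any datasheet, and real fuses near their rating hold far longer than this model suggests (often minutes to hours):

```javascript
// Toy model of fuse behavior under a dead short vs. a mild overload.
// I2T is an assumed melting integral for a hypothetical 3 A fuse; real
// time-current curves are much flatter near the rated current.
const I2T = 50; // assumed melting integral, in A^2 * s

function blowTimeSeconds(amps, rated = 3) {
  if (amps <= rated) return Infinity;  // within rating: (ideally) never blows
  return I2T / (amps * amps);          // the smaller the excess, the slower
}

console.log(blowTimeSeconds(300)); // dead short: well under a millisecond
console.log(blowTimeSeconds(4));   // mild overload: seconds, not instant
console.log(blowTimeSeconds(2.9)); // under the rating: Infinity
```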
None of this disputes that Amazon is well known to sell garbage, and not just limited to fuses. That's why I don't buy anything there.
I'd love someday to learn more about why Jazelle failed.
The first SoC I worked on almost 20 years ago was built around an ARM926EJ-S, just like in the story. It was built for Nokia, who used Symbian OS [1], and supported user-installable apps written against Java Micro Edition [2].
The utter mess of Symbian's app discovery and installation, I suspect, was a prime reason Apple created their App Store for the iPhone.
Nevertheless, the fundamental concept of HW-accelerated Java apps doesn't sound crazy. What happened? Were they just stuck with a sinking ship, Symbian?
[1] https://en.wikipedia.org/wiki/Symbian
[2] https://en.wikipedia.org/wiki/Java_Platform,_Micro_Edition
> C in IoT stands for “cost-effective” I guess
but it's actually C for cableless.
Article: https://strugglers.net/~andy/blog/2018/12/24/the-internet-of...
It also appears that they went into liquidation in 2018, so good luck getting any support with that device from them!
Reads like a quote from a Philip K. Dick book.
(Clearly that didn't pan out so well)
> Turns out, they found out an even more innovative time sync mechanism. When you open the UI in the browser, they quickly redirect you to "/set-time/" + Date.now(). This sets a global variable in the Node.js app responsible for "now".
(https://mastodon.social/@laplab/111789584104871367)
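The mechanism the quote describes can be sketched as follows. Names and structure here are my guesses, since the device's real firmware is not public: the UI's landing page redirects the browser to "/set-time/" + Date.now(), and the server stashes the visitor's wall-clock time in a global:

```javascript
// Sketch of the device's time-sync "mechanism": no RTC and no NTP, so it
// borrows the clock from whoever opens the web UI. All identifiers here
// are hypothetical; only the "/set-time/" + Date.now() redirect is from
// the quoted description.
let visitorEpochMs = null;  // "now" as last reported by a visitor's browser
let setAtUptimeMs = null;   // our own reference point when it was reported

// Server-side handler for GET /set-time/<epoch-ms>
function handleSetTime(epochMs) {
  visitorEpochMs = epochMs;   // trust the browser's Date.now() blindly
  setAtUptimeMs = Date.now();
}

// The app's notion of the current time
function deviceNow() {
  if (visitorEpochMs === null) return null;  // nobody has visited yet
  return visitorEpochMs + (Date.now() - setAtUptimeMs);
}

// Simulate a browser hitting /set-time/1700000000000:
handleSetTime(1700000000000);
console.log(deviceNow()); // visitor's epoch plus whatever ms have elapsed
```

Which also means the device's clock is wrong until someone opens the UI, and any visitor (or attacker) can set it to anything.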
I know this sounds pithy (and it is!) but you'd be surprised exactly how cheap and cost-effective Wi-Fi-enabled SoCs are. A lot of the time we're getting Wi-Fi for free, and most of those SoCs don't have an Ethernet controller by default, so it's more cost-effective to use Wi-Fi if it fits your use case.
Other physical protocols/connection types can be supported of course (I wonder what the longest I2C run ever is), but when you're talking about a retrofitted client like this one, Wi-Fi or wireless protocols in general are best.
Power consumption.
Up front vs ongoing cost; the creators chose to minimise the up-front cost in favour of a marginally higher ongoing cost.
Not that it's particularly useful, as this device lets you gain root access via TCF anyway without authentication. But maybe this password is reused in other areas...