GDP, or even GDP per capita, is not the best metric. You can sell 100 iPhones for $1,000 each or 500 cheap Androids for $200 each, and both countries will have the same GDP, but I think the output and outcome are much better in the latter case: you produced 5x more products, and 5x more people can benefit from and be more productive with those 'tools'. Sure, a $1,000 iPhone is better than a $200 Android phone, but is it 5x better? Same with cars: you can sell one Ferrari or several cheaper Toyotas for the same total value. We would have to measure how many goods we can actually produce.
The Victorian government has also failed significantly on public housing. The wait time is about 20 months (VicGov's target is 10 months; it was 14 months three years ago), and they're currently looking to demolish much of the existing stock without many short-term options for residents.
It seems very unlikely to me that Victoria's house prices will drop in any significant way this decade.
I think there is a good chance they will, as long as a change of government doesn't deliberately dismantle the current approach. Yes, there's population growth, and yet prices have been stagnant or declining for the past few years while construction has picked up. That's a good trend!
I'm not familiar with the situation with public housing, but I'm happy to accept that the government has failed there. This seems like a separate failure, though, rather than an indictment of their approach to increasing supply generally, which I think is working.
Widespread sentiment that if you don't buy something ASAP, you'll never be able to - meaning lots of buyers skimp on due diligence to close a sale.
Things have been crazy for a long time, but I am actually optimistic for Melbourne specifically - the construction rate is up and the state government has been decreasing the power local governments have to block or delay development. If this continues, housing affordability should improve. My main concern is that a change of government may put an end to it, but I hope not.
Some details about what VIC is doing differently are in this AFR article, if you're interested (archive link, because the original is paywalled):
How does git help you find certain text in files? `grep` should do the trick just fine, unless I misunderstand what "chuck TODO comments in the code" means; the code lives on your disk, no?
Grep will find them too, but any that show up in the diff, you know for sure were added by you.
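A minimal sketch of the difference, assuming a `main` base branch and a plain `TODO` marker:

```sh
# grep finds every TODO in the working tree, yours or not
grep -rn "TODO" .

# the diff only shows lines you added on this branch
# ('+' prefix marks additions)
git diff main... | grep '^+.*TODO'
```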
Mercurial on Windows was "download TortoiseHg, use it", whereas git didn't have a good GUI and was full of footguns around line endings, case-insensitivity of branch names, and the like.
Nowadays I use Sublime Merge on Windows and Linux alike and it's fine. That solves the GUI issue, though the line-ending issue is the same as it's always been (it's fine if you remember to set it to "don't change line endings" globally, but you have to remember to do that), and I'm not sure about case-insensitivity of branch names.
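If I remember correctly, the global "don't change line endings" setting is `core.autocrlf`:

```sh
# check files in and out byte-for-byte, with no CRLF/LF conversion
git config --global core.autocrlf false
```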
Pretty sure Mercurial handles arbitrary filenames as UTF-8-encoded bytestrings. Whether there was a problem with this in the past I can't recall, but I'd be very surprised if there were one now.
Edit: it does seem there at least used to be issues around this:
https://stackoverflow.com/questions/7256708/mercurial-proble...
though Google does show at least some results for similar issues with git.
VexRiscv is aware of this unofficial standard and asks for four 16x64 multiplies, adding the results together on the next cycle. This produces a much better fmax on FPGAs, but if you were targeting an ASIC, you would be better off asking for a full 64-bit multiplier, or not trying for a single-cycle multiply at all.
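Here's a software model of that decomposition in Python (just a sketch of the arithmetic - the function name is mine, and this is not the actual VexRiscv RTL):

```python
MASK64 = (1 << 64) - 1

def mul_4x16x64(a: int, b: int) -> int:
    """Model a 64x64 multiply built from four 16x64 partial products."""
    acc = 0
    for i in range(4):
        chunk = (a >> (16 * i)) & 0xFFFF  # i-th 16-bit slice of a
        acc += (chunk * b) << (16 * i)    # one 16x64 multiply, shifted into place
    return acc & MASK64                   # low 64 bits, as MUL returns

assert mul_4x16x64(0xDEADBEEFCAFEBABE, 0x123456789ABCDEF0) == \
       (0xDEADBEEFCAFEBABE * 0x123456789ABCDEF0) & MASK64
```

In hardware, each of the four partial products maps to its own multiplier, and the shift-and-add happens in the following pipeline stage.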
Most modern CPUs tend to target a 3-cycle pipelined multiply, which works out to roughly 64/3 ≈ 22 bits per cycle, i.e. 22-bit-wide multipliers. On an FPGA, each 22-bit multiplication would require two 18-bit multiplier blocks, for a total of six multipliers, wasting even more resources.
-----
In general, "FPGA-friendly" means optimizing your design to take advantage of the things that are cheap on FPGAs, like the 18-bit-wide multipliers and the block RAM. Such designs tend to run faster on FPGAs and use fewer resources, but they're wasteful to synthesize to an ASIC.
As opposed to, say, interfacing with an FPGA, which would be a totally different way to be "FPGA-friendly".
How hard or expensive would it be for a reasonably equipped lab to build its own optical clock, though? I see there are optical clocks the size of a few rack units on the market for a rather hefty price; are the materials needed that expensive, or is it just the expertise?
Oh, and to know if it's any good, you have to either build two of them (ideally more) to compare against each other (ideally using different approaches, so their errors are less correlated), or have access to a clock better than the one you're building to compare against. So you can rarely get away with building just one if you want to know whether you've succeeded.
Source: I work on the software for these portable optical clocks: https://phys.org/news/2025-07-quantum-clocks-accuracy-curren...