No, it isn't. It can be socially impolite to yawn unexcused, when someone is talking to you, as it has come to be interpreted as boredom rather than tiredness or similar. But it isn't inherently impolite to, for instance, yawn when walking down the street, or in a setting where someone isn't talking to you.
What you describe is, in my opinion, true for Western cultures. In Brazil they are not so relaxed about it; in Asia even less so.
Space is a vacuum, i.e. the lack-of-a-thing that makes a thermos great at keeping your drink hot. A satellite is, if nothing else, a fantastic thermos. A data center in space would necessarily rely entirely on cooling by radiation, unlike a terrestrial data center that can also make use of convection and conduction. You can't just pipe heat out into the atmosphere or build a heat exchanger. You can't exchange heat with vacuum. You can only radiate heat into it.
Heat is going to limit the compute that can be done in a satellite data center, and radiative cooling solutions are going to massively increase weight. It makes far more sense to build data centers in the Arctic.
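For a sense of scale, here is a minimal back-of-the-envelope sketch of radiator sizing using the Stefan-Boltzmann law. The numbers (300 K radiator temperature, emissivity 0.9, 1 MW of compute) are my own illustrative assumptions, and it optimistically ignores absorbed sunlight and Earth IR, which only make the problem worse:

```python
# Back-of-the-envelope radiator sizing via the Stefan-Boltzmann law:
# P = emissivity * sigma * A * T^4, solved for the area A.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w: float,
                     temp_k: float = 300.0,
                     emissivity: float = 0.9) -> float:
    """Radiator area needed to reject `power_w` watts at temperature `temp_k`."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# A modest 1 MW data center radiating near room temperature:
area = radiator_area_m2(1e6)
print(f"{area:.0f} m^2")  # roughly 2,400 m^2 of radiator surface
```

Even under these idealized assumptions, a single megawatt calls for thousands of square meters of radiator, all of which has to be launched.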
Musk is up to something here. This could be another hyperloop (i.e. a distracting promise meant to sabotage competition). It could be a legal dodge. It could be a power grab. What it will not be is a useful source of computing power. Anyone who takes this venture seriously is probably going to be burned.
A satellite is quite unlike a thermos in the sense that it is carefully tuned to keep its temperature within a relatively narrow band around room temperature[1] during all operational phases.
This is because, despite being intended for use in space, devices and parts are usually tested and qualified for temperature limits around room temperature.
[1] "Room temperature" is actually a technical term meaning 20°C (exceptions in some fields and industries confirm the rule).
The claim that the title insurance industry is the reason for lack of adoption of Torrens title schemes is uncited, and immediately followed by descriptions of several cases where Torrens title was adopted (often poorly) and later abandoned.
Fascinating, how is ownership established if there is no single source of truth?
I feel the answer to this is also crucial to understanding OP. It could be a minor annoyance or a real possibility of losing your land.
1 15.95 dragonwriter
2 14.37 tptacek
3 12.80 jacquesm
4 11.15 dang
Take this (and OP) with a grain of salt, if for no other reason than that it does not account for how long someone has been commenting here.

As a young intern, I arrived early one morning to find the PCB layout software (PADS PowerPCB) on our "design PC" wasn’t working. (I use quotes because it was just the beefiest machine we had, naturally our boss’s PC, which he kindly shared.)
Obviously the dongle. I tried unplugging and replugging it, with and without the printer daisy-chained. Nothing.
So I begrudgingly asked my colleague who’d just arrived. He looked at the dongle, looked at me, looked at the dongle again, and started laughing.
Turns out our boss had stayed late the previous night processing customer complaints. One customer had sent back a "broken" dongle for the product we were selling. The boss tested it on his PC, found it worked fine, and mailed it back on his way home.
Except he didn’t send our dongle back. He had sent my PowerPCB dongle. More fun was had when the rest of the team and finally our boss arrived. Luckily he took it with good humor.
Today the same argument is rehashed - it's outrageous that VS Code uses 1 GB of RAM, when Sublime Text works perfectly in a tiny 128 MB.
But notice that the tiny/optimized/well-behaved amount of today, 128 MB, is 30 times larger than the outrageously decadent amount from Wirth's time.
If you told Wirth "hold my beer, my text editor needs 128 MB", he would just not comprehend such a concept; it would seem like you have no idea what numbers mean in programming.
I can't wait for the day when programmers 20 years from now will talk about the amazingly optimized editors of today - VS Code, which lived in a tiny 1 GB of RAM.
Both compute and memory are getting closer to fundamental physical limits, and it is unlikely that the next 60 years will be anything like the last 60.
While the argument for compute is relatively simple, it is a bit harder to see for memory. We are nowhere near any limit on total memory capacity; the limiting factor is how much storage we can bring how close to our computing units.
Now, there is still ground to cover and low-hanging fruit to pick, but I think we will eventually see a renaissance of appreciation for efficient programs in our lifetimes.