A common mode I have seen: phone in lap, front-facing camera ingesting an exam page hung over the edge of the desk. The student then flips the page and looks down for the answer.
33.6 kbit/s (a later addition)
31.2 kbit/s (a later addition)
28.8 kbit/s (the theoretical maximum for most people; I remember being jealous of people who actually got it)
26.4 kbit/s (what my internet usually hit in practice)
24.0 kbit/s (I remember seeing this)
21.6 kbit/s (apparently this was very common, though I don't remember seeing it)
19.2 kbit/s
16.8 kbit/s
14.4 kbit/s (quite possible)
(lower bitrates are also documented; this is all multiples of 2.4 kbit/s)
Also, remember to assume about 10 bits per byte of actual data, since there is various protocol overhead.

For completeness, 33.6 required insane levels of signal clarity on the phone line, and was mostly fiction outside of urban and dense suburban areas.
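That rule of thumb is easy to sanity-check. A minimal sketch (the flat 10-bits-per-byte divisor is the approximation from above, not an exact figure; real overhead varied by protocol):

```rust
// Back-of-envelope: effective application throughput at a given modem
// line rate, assuming ~10 bits on the wire per byte of payload
// (start/stop bits plus protocol overhead -- the rule of thumb above).
fn effective_bytes_per_sec(line_rate_bits_per_sec: u32) -> u32 {
    line_rate_bits_per_sec / 10
}

fn main() {
    for rate in [14_400u32, 28_800, 33_600, 56_000] {
        let bps = effective_bytes_per_sec(rate);
        println!(
            "{:>6} bit/s -> ~{} bytes/s (~{:.1} KB/s)",
            rate,
            bps,
            bps as f64 / 1024.0
        );
    }
    // A 1 MB file at 28.8k: roughly 1_048_576 / 2_880 ≈ 364 s, i.e. ~6 minutes.
}
```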
Prior to 14.4k, there were other generations of modems: 9600, 2400, and even 300 baud modems were all you could get in their respective eras, each cutting-edge at the time.
56K (also called V.90 or "V.everything") leaned into the quantization that happens on digital phone trunks, rather than letting the analog-to-digital conversion chew up your analog modem waveforms. The trick here is that the pseudo-digital-over-analog leg from your house to the local exchange was limited to a few miles. Try this from too far out of town, and it just doesn't work. And to be clear, this was prior to DSL, which is similar but a completely different beast.
Oh, and the V.90 spec was a compromise between two competing 56K standards at the time: K56Flex and X2. This meant that ISPs needed to have matching modems on their end to handle the special 56K signaling. Miraculously, the hardware vendors did something that was good for everyone and compromised on a single standard, and then pushed firmware patches that allowed the two brands to interoperate on existing hardware.
Also, line conditions were subject to a range of factors. It's all copper wire hung from power poles, after all. Poor-quality materials, sloppy workmanship, and aging infrastructure would introduce noise all by themselves, even more so during weather events. This meant that, for some, any given day was either a good day or a bad day to try to dial into the internet.
Also: a lot of development teams in security-oriented fields are doing a lot of self-investigation and improvement anyway. Red Teams still have value, and prove that time and again, in spite of that.
IMO, having another team attack your stuff also creates "real" stakes for failure that feel closer to reality than some existential hacker threat. I think just the presence of a looming "Red Team Exercise" creates a stronger motivation to do a better job when building IT systems.
They say that they keep CO2 in liquid form at room temperature, then turn it into gas and capture the energy released.
- Won't the gas be very cold on expansion from a high-pressure, room-temperature liquid? It could grab some thermal energy from the environment, of course, even in winter, but isn't the efficiency going to depend significantly on ambient temperature?
- To turn the gas into the liquid, they need to compress it; this will produce large amounts of heat. They will need large radiators to dissipate (and lose) it, or some kind of storage so it can be reused when expanding the gas. What could that be?
- How can the whole thing have a 75% round-trip efficiency, if they use turbines that only have about 40% efficiency in thermal power plants? They must be using something else, not bound by the confines of the Carnot cycle. What might that be?
1. Decompressing the gas can be used to do work, like turning a turbine. It's not particularly efficient, as you mention, but it can store some energy for a while. Also, the tech to do this is practically off-the-shelf right now and doesn't rely on a ton of R&D to ramp up. Well, maybe the large storage tanks do, but that should be all. So it _does_ function, and nobody else is doing it this way, so perhaps that's seen as a competitive edge of sorts.
2. The storage tech has viable side-products, so the bottom line could be diversified so as not to be completely reliant on electricity generation. The compressed gas itself can be sold. Processed a little further, it can be sold as dry ice. Or maybe the facility could be dual-purposed for refrigeration of goods.
3. IMO, their use of CO2 as a working fluid is an attempt to sound carbon-sequestration-adjacent. Basically, doubling down on environmentally-sound keywords to attract investment. Yes, I'm saying they're greenwashing what should otherwise be a sand battery or something else that moves _heat_ around more efficiently.
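The "how much energy is actually in there" question can be put in rough numbers with an ideal-gas isothermal-work estimate. Big caveat: CO2 near its critical point is very much not an ideal gas, and the 57 bar vapor-pressure figure is an assumed round number, so treat this as order-of-magnitude only:

```rust
// Order-of-magnitude check on the energy available from expanding CO2
// stored as a room-temperature liquid, treating it as an ideal gas
// expanded isothermally: W = n * R * T * ln(p_high / p_low).
// (CO2 is NOT ideal near its critical point; this is a rough estimate.)
const R: f64 = 8.314; // J / (mol K)
const MOLAR_MASS_CO2: f64 = 0.044; // kg / mol

fn isothermal_work_j_per_kg(t_kelvin: f64, p_high_bar: f64, p_low_bar: f64) -> f64 {
    let mol_per_kg = 1.0 / MOLAR_MASS_CO2; // ~22.7 mol per kg of CO2
    mol_per_kg * R * t_kelvin * (p_high_bar / p_low_bar).ln()
}

fn main() {
    // CO2 vapor pressure at ~20 C is roughly 57 bar (assumed figure).
    let w = isothermal_work_j_per_kg(293.0, 57.0, 1.0);
    println!("~{:.0} kJ/kg, or ~{:.3} kWh/kg", w / 1e3, w / 3.6e6);
    // On the order of 0.06 kWh/kg: tens of kilograms of CO2 per kWh stored,
    // which is why the tanks have to be enormous.
}
```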
“When open source developers working in codebases that they are deeply familiar with use AI tools to complete a task, they take longer to complete that task”
I have anecdotally found this to be true as well: an LLM greatly accelerates my ramp-up time in a new codebase, but then actually leads me astray once I am familiar with the project.
Coming from other programming languages, I had a lot of questions that would be tough to nail down in a Google search, or combing through docs and/or tutorials. In retrospect, it's super fast at finding answers to things that _don't exist_ explicitly, or are implied through the lack of documentation, or exist at the intersection of wildly different resources:
- Can I get compile-time type information of Enum values?
- Can I specialize a generic function/type based on Enum values?
- How can I use macros to reflect on struct fields?
- Can I use an enum without its enclosing namespace, as I can in C++?
- Does Rust have a 'with' clause?
- How do I avoid declaring lifetimes on my types?
- What is an idiomatic way to implement the Strategy pattern?
- What is an idiomatic way to return a closure from a function?
...and so on. This "conversation" happened here and there over the period of two weeks. Not only was ChatGPT up to the task, but it was able to suggest what technologies would get me close to the mark if Rust wasn't built to do what I had in mind. I'm now much more comfortable and competent in the language, but miles ahead of where I would have been without it.
I wish he’d write books.
Highly recommended: https://m.youtube.com/@ZackFreedman
You mean, always amazingly augmented, aspiring to alienate all other audible aspirations? Zack is always a treat.
I remember fights over whether or not navigation in frames was bad practice. Not iframes, frames. Who here remembers frames?
I remember using HTTP 204 before AJAX to send messages to the server without reloading the page.
I remember building... image maps[1]... professionally in the early 2000s. I remember spending multiple days drawing the borders of States on a map of the country in Dreamweaver so we could have a clickable map.
I remember Dreamweaver templates and people updating things wrong and losing their changes on a template update and no way to get it back because no one used version control.
I remember <input type=image> and handling where you clicked on an image in the backend.
I remember streaming updates to pages via motion jpeg. Still works in Chrome, less reliably in Firefox.
I remember the multiple steps we took towards a proper IE PNG fix just to get alpha blending... before we got the ActiveX one that worked somewhat reliably... Just for tastes to change and everything to become flat and us to not really need it anymore.
I remember building site navigations in Java, Flash, and Silverlight.
I remember spacer gifs and conditional comments and what a godsend Firebug was.
I don't know when I got old, it just happened one day.
1. https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
I was there, writing sites professionally when this was rolled out.
They're more or less deprecated, but I miss having a first-class building block that lets you resize areas of the screen. The recommendation is to use anything but a <frameset>, but there's no replacement for that one feature.
> I remember building site navigations in Java, Flash, and Silverlight.
Don't forget ActiveX (actually, yes, we should all try to forget)