Readit News
buzzybee commented on Iodine: A full superset of Java with enhancements   blogs.remobjects.com/2017... · Posted by u/dwarfland
arkadiytehgraet · 8 years ago
I don't get it: there is already a widely accepted 'better' Java called Kotlin that offers much more than this: way better integration with IDEs, way better integration with existing Java codebases, better everything... Yet somehow we should start using a NEW IDE with a NEW language supported by a completely unknown company with zero guarantees about its support. I just don't get it.
buzzybee · 8 years ago
RemObjects is not unknown. You're just new.
buzzybee commented on FizzleFade   fabiensanglard.net/fizzle... · Posted by u/pietrofmaggi
buzzybee · 8 years ago
FizzleFade is also found in MicroProse games from the era (e.g. Railroad Tycoon, Civilization), sometimes in full-screen transitions and other times to fade in single sprites. But more relevant to "id Software history", you can find it in Origin's Space Rogue, which John Romero contributed to. A likely possibility is that he picked up the trick on that or an earlier project while at Origin.

It's also possible to use a slower "arbitrary PRNG and bump" scheme that tests the VRAM for the desired values (e.g., if it were a sprite, by running the blit at that pixel address and testing) and walks forwards or backwards until an unset value is found. If the walk can be done fast enough, it'll execute at the same framerate as an LFSR cycle-length fade. It can be further supplemented with an estimation heuristic or a low-resolution map to generate unique patterns. It's just less speedy and less mathematically interesting than the LFSR.
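For reference, the LFSR variant from the article boils down to something like the sketch below (plot_pixel() is a hypothetical stand-in for the actual VGA write; the taps are the 17-bit ones Sanglard describes):

```c
/* Minimal sketch of the 17-bit LFSR fizzle fade over a 320x200 screen.
   plot_pixel() is a hypothetical stand-in for the real blit. */
#include <stdint.h>

void plot_pixel(unsigned x, unsigned y);   /* hypothetical pixel write */

void fizzlefade(void)
{
    uint32_t rndval = 1;
    do {
        unsigned y   = rndval & 0x000FF;         /* low 8 bits  -> Y */
        unsigned x   = (rndval & 0x1FF00) >> 8;  /* next 9 bits -> X */
        unsigned lsb = rndval & 1;
        rndval >>= 1;
        if (lsb)
            rndval ^= 0x00012000;                /* feedback taps */
        if (x < 320 && y < 200)
            plot_pixel(x, y);                    /* each on-screen pixel visited once */
    } while (rndval != 1);                       /* full LFSR cycle completed */
}
```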

buzzybee commented on Motive.c: The Soul of the Sims (1997)   donhopkins.com/home/image... · Posted by u/mountainplus
bluejekyll · 8 years ago
Is there an advantage to storing this as an array, indexed by enum, rather than a struct of 16 floats?

I can't think of a reason. Having to index by enum seems like it could be error prone.

Also, in the init(), instead of looping over all indexes of the array and setting each to 0, couldn't a memset be used?

buzzybee · 8 years ago
It lets the coder do some late binding without resorting to a lot of casts or more elaborate metaprogramming: the enum index used for assignment is itself an assignable value, and therefore can be passed around and manipulated if the algorithm needs it. It's a very intentional tradeoff of structure and long-term clarity for a faster turnaround on design changes.
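A minimal sketch of the tradeoff (not the actual Sims source; the motive names and helper are made up), which also covers the memset-style init from the parent question:

```c
/* Sketch only: contrasting a struct of named floats with an enum-indexed
   array. With the array, the "which motive" choice is itself a value that
   can be computed, stored, and passed around at runtime. */
#include <stdio.h>
#include <string.h>

enum Motive { HUNGER, ENERGY, COMFORT, FUN, MOTIVE_COUNT };

float motives[MOTIVE_COUNT];

/* The motive to adjust is plain data; callers can pick it dynamically. */
void adjust(enum Motive m, float delta) { motives[m] += delta; }

int main(void)
{
    memset(motives, 0, sizeof motives);   /* all-bits-zero is 0.0f on IEEE-754 targets */
    enum Motive most_urgent = HUNGER;     /* could be chosen by whatever algorithm needs it */
    adjust(most_urgent, -0.5f);
    printf("hunger = %f\n", motives[HUNGER]);
    return 0;
}
```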

I haven't even checked to see if the final program uses it. It's the kind of thing, though, that you might go in thinking, "maybe I will need that", and it doesn't add too much overhead to your prototype.

buzzybee commented on Does OO really match the way we think (1997) [pdf]   leshatton.org/Documents/O... · Posted by u/tjalfi
buzzybee · 8 years ago
My strategy for personal coding recently has shifted towards a sincere exploitation of automatic programming (in a manner similar to model-driven development or language-oriented programming, or the research of VPRI). The overall feedback loop looks like this:

* Write an initial mockup that exercises APIs and data paths in an imperative, mostly-straightline coding style

* Write a source code generator that reproduces part of the mockup by starting with the original code as a string and gradually reworking it into a smaller specification.

* Now maintain and extend the code generator, and write new mockup elements to develop features.

The mockup doesn't have to be 100% correct or clean, nor does the code generator have to be 100% clean itself, nor does 100% of the code have to be automated (as long as there is a clear separation between hand-written modules and generated ones), but the mockup is necessary as a skeleton to guide the initial development, and similarly comprehensible output one layer down is a design goal. Language-level macro systems are typically not sufficient for this task, since they tend to obscure their resulting output and thus become harder to debug. Languages that deal well with strings and sum types, on the other hand, are golden as generator sources, since they add another layer of checks.
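To make that concrete, here's a toy sketch of the kind of bespoke generator I mean (the spec, struct name, and fields are made up for illustration): a tiny spec is expanded into a generated module, and from then on you maintain the spec and the generator rather than hand-editing the output.

```c
/* Toy generator sketch: a small spec (just field names here) is expanded
   into a generated C module. Everything is hypothetical; the point is the
   loop spec -> generator -> output that stays readable one layer down. */
#include <stdio.h>

static const char *fields[] = { "width", "height", "depth" };
static const int nfields = sizeof fields / sizeof fields[0];

int main(void)
{
    puts("/* generated file - edit the spec, not this file */");
    puts("typedef struct Box {");
    for (int i = 0; i < nfields; i++)
        printf("    float %s;\n", fields[i]);
    puts("} Box;");
    puts("");
    puts("void box_init(Box *b) {");
    for (int i = 0; i < nfields; i++)
        printf("    b->%s = 0.0f;\n", fields[i]);
    puts("}");
    return 0;
}
```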

I'm still only using this for personal code, but it's gradually becoming more "real" in my head as I pursue it: the thing that stopped me before was developing the right feedback loop, and I'm convinced that the way to go is a pretty lean base implementation (Go is an example of how much language power I'd want in the target output) and an assumption that you're building a bespoke generator for the application, one that won't be used anywhere else.

Source code generation gets a bad rap because the immediate payoffs are rare, and it's easy to take an undisciplined approach that just emits unreadable boilerplate without gaining anything, but the potential benefits are huge and make me not really care about design patterns or frameworks or traditional "paradigms" anymore. Those don't achieve anywhere near the same amount of empowerment.

buzzybee commented on Cleaning a Dirty Sponge Only Helps Its Worst Bacteria   nytimes.com/2017/08/04/sc... · Posted by u/aaronbrethorst
buzzybee · 8 years ago
I do microwave the sponge after use, for about 2 minutes so that it's bone dry. The reasoning is simple: most of the bacteria are going to hang out in the water. It's still going to be infected, but it doesn't have to be disinfected. It just has to remain close enough to the ambient load of the kitchen that it doesn't obviously inflame my hands when I pick it up. After all, my hands are germy too, most of the time. The soap, water, and pressure applied during cleaning are supposed to do most of the work.
buzzybee commented on Lindy effect   en.wikipedia.org/wiki/Lin... · Posted by u/bushido
stouset · 8 years ago
Not only that, but endgames frequently involve long periods of positional maneuvering that can take dozens of moves before one side realizes an edge, or before it becomes clear that it's heading toward a draw.
buzzybee · 8 years ago
The longest possible chess game:

https://www.chess.com/blog/kurtgodden/the-longest-possible-c...

Chess AIs, perhaps needless to say, are very good at computing at the depth necessary to win drawn-out endgames.

buzzybee commented on Microsoft Paint to be killed off after 32 years   theguardian.com/technolog... · Posted by u/barking
Joeri · 8 years ago
Do you actually use Windows? That's not my experience at all. The difference even just among the stuff Windows ships with is staggering. Just compare the Control Panel to the Settings app: they look like they belong to different OSes, and you have to use both to access all configuration.
buzzybee · 8 years ago
IME real users don't care about the app being aesthetically different, but they do care if the common idioms have changed (e.g. position of OK/Cancel). That shouldn't depend on your toolkit, though.
buzzybee commented on Microsoft Paint to be killed off after 32 years   theguardian.com/technolog... · Posted by u/barking
Karunamon · 8 years ago
They did the same thing with the calculator, turning what is probably the simplest app on the entire computer into a Metro-ified flat-design-meme store-dependent mess for absolutely no benefit.
buzzybee · 8 years ago
Some time ago someone recommended SpeedCrunch to me for calculator stuff and I use it all the time now. It's a little less intuitive since you type expressions instead of pressing buttons, but it does a ton more.
buzzybee commented on Bitcoin May Have Solved Its Scaling Problem   motherboard.vice.com/en_u... · Posted by u/artsandsci
richardw · 8 years ago
Cryptocurrencies need to stop requiring us all to hold every transaction in one unified blockchain dump. There has to be some way to break the network out into shards while still preserving the distributed nature and the ability to pay anyone.

No matter what we do, moving and storing every single transaction is insane. It's like my bank account needing to know what every person in the world purchased this morning. I shouldn't need to know what some guy on the other side of the world spent his lunch money on just to buy my own.

What am I missing? Surely the uber comp-sci PhDs have fully solved this?

buzzybee · 8 years ago
Worse-is-better: we already "solved" it with altcoins.
buzzybee commented on A hacker stole $31M of Ether – how it happened, and what it means for Ethereum   medium.freecodecamp.org/a... · Posted by u/HaseebQ
owenversteeg · 8 years ago
I think the fundamental problem here is an economic one. Make three assumptions:

1) most contracts worth implementing in Ethereum are fairly complex

2) even given great developers, bugs are inevitable in complex code

3) the budget of the contract-makers' security team MUST be smaller than that of the hackers

You quickly see that if the chance of a bug is nonzero, "smart contracts" don't make economic sense. If you have a $100k contract, and you spend $5k on security (which would absolutely destroy most companies' margins by the way) you'll be facing hackers that are EACH willing to spend up to $90k or so. Let's say all the experts in this example world are $200/hr. You spent 25 expert-hours on security. But you're being hacked by people who spent 450 expert-hours on hacking you.

With that in mind, would YOU want to use a smart contract? Spend 5% of the contract value instantly on security, and risk losing 105%? This isn't a normal loss by the way, where you can prosecute someone or sue somebody. No, this is the instant, digital theft of the entire value of the contract, to an anonymous digital address where it will be quickly blended in with hundreds of millions of dollars of similar thefts a month.
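Spelling out the asymmetry in that (entirely hypothetical) example:

```c
/* Quick arithmetic check of the made-up figures above; not real data. */
#include <stdio.h>

int main(void)
{
    double contract = 100000.0;          /* $100k contract value      */
    double defense  = 0.05 * contract;   /* 5% spent on security      */
    double attack   = 90000.0;           /* ~90% still worth stealing */
    double rate     = 200.0;             /* $ per expert-hour         */

    printf("defender: %.0f expert-hours\n", defense / rate);   /* 25  */
    printf("attacker: %.0f expert-hours\n", attack / rate);    /* 450 */
    printf("budget ratio: %.0fx\n", attack / defense);         /* 18x */
    return 0;
}
```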

buzzybee · 8 years ago
Regarding smart contracts and other inventions of cryptocurrency: I think, most of all, the crypto market is absolutely vicious, in a way that both capital markets and technology companies haven't seen (in the public eye) for many, many years. As of right now there's still some faith left that Ethereum is going to go places because firms keep trying to use the technology on the basis of hype. Unlike with most overhyped and half-baked tech, though, the failures do not get shoved in the back room for some hapless dev or ops team to deal with. This is like if Apple and Google had daily columns on the front page showing every bug or support issue that they experienced in the past 24 hours.

And that's been true right from the beginning with Bitcoin: scams, schemes, heists, data loss - the headlines scream blood all the time. We've never had that kind of "mean time to disaster" in technology before. The status quo was that something would fail, but the failure would never quite get back to the individual or company that produced it. They would spin it away, and no careers would be harmed. But the likes of Solidity produce a meat-grinder, a blind destroyer of any who dare enter calling themselves rockstar.

I think it's kind of awesome and beautiful, in that sense. It will induce a maturation throughout the economy to adapt to this new pressure. But like adolescence, it can be an ugly work in progress, and I don't expect Ethereum itself to be a survivor at this rate.
