I’ve spent time at small startups and on “elite” big tech teams, and I’m usually the only one on my team using a debugger. Almost everyone in the real world (at least in web tech) seems to do print statement debugging. I have tried and failed to get others interested in using my workflow.
I generally agree that it’s the best way to start understanding a system. Breaking on an interesting line of code during a test run and studying the call stack that got me there is infinitely easier than trying to run the code forwards in my head.
Young grugs: learning this skill is a minor superpower. Take the time to get it working on your codebase, if you can.
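If you want a concrete starting point, here is a minimal sketch of that break-and-read-the-stack workflow, assuming a Python codebase tested with pytest; the function, file, and test names are invented for illustration:

    def apply_discount(total, customer):
        # Drop into the debugger right at the interesting line; Python 3.7+
        # ships the breakpoint() builtin, which starts pdb by default.
        breakpoint()
        if customer.get("loyalty_years", 0) > 5:
            return total * 0.9
        return total

    def test_loyal_customer_gets_discount():
        assert apply_discount(100, {"loyalty_years": 7}) == 90

    # Run with output capture disabled so pdb can take over the terminal:
    #   pytest -s test_pricing.py
    # At the (Pdb) prompt:
    #   w         -> print the call stack that got you here
    #   u / d     -> walk up and down the frames and inspect their locals
    #   p total   -> print a variable in the current frame
    #   c         -> continue the test run

The same loop works in an IDE debugger with a clickable breakpoint; the point is breaking inside a real test run and reading the stack instead of simulating it in your head.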
By all means, learn to use a debugger well but don't become overly dependent on them.
We’ve been at that limit for decades.
Anything that becomes mainstream is likely to get twisted and turned into whatever the "powers that be" want it to be.
So, while using XP, Scrum, or Kanban properly in a sane environment is going to be great, if you work in an un-sane one, the powers that be will have turned whatever system you're using into theirs. This is how things like SAFe are born: attempts to make "agile safe for the corporation" that are of course nothing more than corporate BS under an agile name, and that gives agile a bad name.
Just like Jira is getting a bad name because it's so configurable that corporations can use it to do what they do. You can also use it as nothing more than an electronic place to house your "post-it notes on a wall". It's all up to you, your coworkers, and your company. Nobody can blame Atlassian / Jira for taking these corporations' money. I know I would, if I'd had the idea of releasing a ticketing system that doesn't even know you should use surrogate keys for all your entities, and instead makes the issue key, which can change when you move an issue between projects, your "primary key" referenced everywhere, so shit breaks. :shrug:
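To make that surrogate-key gripe concrete, here is a toy sketch in Python (not Jira's actual data model; the Issue class and field names are invented) showing why a reference through a mutable issue key breaks when the issue moves projects, while a reference through an immutable surrogate id keeps working:

    import itertools

    _ids = itertools.count(1)

    class Issue:
        def __init__(self, project, number, title):
            self.id = next(_ids)              # surrogate key: never changes
            self.key = f"{project}-{number}"  # natural key: changes if the issue moves
            self.title = title

    issue = Issue("WEB", 42, "Fix login timeout")

    # Two ways a comment might reference its parent issue:
    comment_by_key = {"issue_key": issue.key, "body": "repro steps attached"}
    comment_by_id = {"issue_id": issue.id, "body": "repro steps attached"}

    # Moving the issue to another project renames its key...
    issue.key = "PLAT-7"

    # ...so the key-based reference now dangles, while the surrogate id still resolves.
    print(comment_by_key["issue_key"] == issue.key)  # False: broken reference
    print(comment_by_id["issue_id"] == issue.id)     # True: still valid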
I can't believe it is still the best. It's been about 30 years. During that time, so much has happened: the death of supercomputer companies like Convex and Cray, SIMD going from expensive machines like the MasPar MP-1 to virtually every processor, the dot-com boom, the rise of Google-style server farms, and so on.
And now the transition to neural net processing.
I mean, it is a testament to the authors that they kept anyone else from even attempting a competing book for so long. It is a great case study in how to stay relevant in tech over the long term.
But man, before it came out, two or three new computer architecture textbooks appeared every year, each one detailing the next cool thing that architectures were being called upon to do.
It's exhibit A of Peter Thiel's case that we are living in an era of very low innovation. If computer architecture were a really healthy field, classes would have to be taught from recently published papers, because the field would be moving faster than a textbook could be published.
Hats off to the authors, but man, this is really depressing.
But why[1]?
[1]: https://www.benq.com/en-us/monitor/programming/rd280u.html