Readit News
Posted by u/pentab 3 years ago
Ask HN: Why are we not using debuggers more?
I noticed that I rarely use debuggers. I asked around, and most of my co-workers are the same. We rely on debuggers only as a last resort, instead opting for prints, asserts, or the "method of the long stare". We are developing in Python/Java/C++ and use other convenience tools like IDEs.

Are you also avoiding debuggers? If so, why? What makes debugging so tedious?

PaulHoule · 3 years ago
For me debuggers are the first resort. I learned a long time ago that if you get in the habit of hacking the code to debug it (say adding print statements) you will eventually check in debug-related changes that you shouldn't. Using the debugger means you don't have to hack the source code.

I use JetBrains tools and have an easy time debugging in Java, Python and JavaScript. In Java I'd say you can use unit tests to experiment interactively the same way people do with the REPL in Python, with the difference that you get unit tests out of the deal, as opposed to having lines scroll away in the console.

I use WebStorm to debug JavaScript programs that run in npm, but if it is running in the browser I just use the "developer tools" from Firefox, Chrome, Edge or Safari. I think there is some way to attach WebStorm to a running web browser but I've never completely figured it out.

I think the Unix culture is allergic to debugging. It's not that hard to use gdb from the command line and in fact you can do some pretty awesome things with it such as embedded system debugging or debugging the C++ and Java sides of an application at the same time, but for a long time I kept trying graphical front ends for gdb such as ddd that "just don't work".

vr46 · 3 years ago
I started out using debuggers because it made sense to step through the execution, then stopped for years as a junior- and mid-level because nobody else did, least of all the toxic lead engineers that were liberally sprinkled over every London agency I worked in. Then I eventually turned 40, at which point I instantly stopped giving a shit about what anyone thought of me, and went back to being a totally-non-rock-star-stroke-ninja coder again. Debugging.

It’s not just Unix culture that’s allergic to debugging, it’s every chancer and charlatan the world over.

alfonmga · 3 years ago
I use a debugger every day. Delve[0], Go's debugger, made me love the process of debugging my code – either attaching the debugger to an existing running process, or the feedback loop of debugging the test code until it passes the test case.

Back in the day, when I didn't use one, it was a miserable developer experience. Thanks to Go and its great decision to have unit testing built into the standard library, writing tests and debugging them is now a joy.

Gone are the days of not using a debugger because the programming language of choice didn't treat debuggers/unit testing as first-class citizens. I sleep a lot better now thanks to writing my backends in Go and having the confidence that I can go as deep as I need to fix any bug.

Debugging for me is an automatic action like drinking water when you're thirsty; two clicks (set breakpoint and click the debug button) and I'm back into the debugger, again.

[0]: https://github.com/go-delve/delve

sdevonoes · 3 years ago
I use debuggers, but only if "prints" don't help me figure out the issue at hand first. Debugging via "prints" works out of the box.

You put a "print" statement dumping whatever data structure you want, you run the program again, and voilà, you've got something. Now, "where to put the print" is an art in itself, of course. If you don't know where to put it, then either you start writing multiple "print"s here and there (following an approach similar to binary search) or you use a debugger and go step by step until you find the right place to inspect.
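That "binary search with prints" placement can be sketched in Python. This is a toy pipeline; the stage names and the planted bug are invented purely for illustration:

```python
# Toy pipeline: the final result is wrong and we don't know which
# stage corrupts the data. Bisect with prints: checkpoint the value
# partway through, then narrow in on the half that looks wrong.

def load(n):       return list(range(n))
def transform(xs): return [x * 2 for x in xs]
def buggy_sum(xs): return sum(xs) + 1   # the bug we're hunting

def run(n):
    xs = load(n)
    print("after load:", xs[:3], "...")        # checkpoint 1: looks fine
    xs = transform(xs)
    print("after transform:", xs[:3], "...")   # checkpoint 2: still fine
    total = buggy_sum(xs)                      # so the bug must be below here
    return total
```

Each run halves the suspect region, so even a long pipeline only needs a few print-and-rerun cycles.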

The problem with debuggers is that they do not work out of the box. If you use the terminal, then you have to use an external program and read its manual to understand how to debug and how to set breakpoints. Depending on the platform you are using, you may even need an executable to run the debugger on. Sometimes you don't have an executable. Sometimes you don't know where the "entry" point of your program is, but you know where the bug lies. Some debuggers require you to point the debugger at the entry point.

If you use an IDE like JetBrains', then debugging also requires some (minimal) setup (last week I had to debug a Node.js program: it was simple, but it didn't work out of the box. Also, when I opened my IDE on a different Node.js repository, my debugging setup was gone).

themodelplumber · 3 years ago
Some ideas:

First, if you can fix most issues with a glance or a brief stare, using a debugger never really feels necessary. It may also feel less like using your own brain and solving a puzzle, and puzzle-solving is a huge part of human psychology (see the "method of the long stare" in the post).

Also, the story of the last N years has been the text editor, not the IDE. The IDE and its world are really where the integrated debugging story is a big deal.

But if you are using a text editor--or text editors--integrating debugging is less of a thing. Your time may be better spent with various productivity-focused changes for ergonomics, like studying or changing keyboard shortcuts, installing or writing plugins, setting up your own scaffolding using system tools integration, and so on.

Plus once you know the various shortcuts, it's maybe more fun to use them and zip around adding prints than it is to debug. You also gain practice this way, after all.

But if you moved from text editors to IDEs, IMO you probably brought that same set of practices along.

Anyway, good q, I've thought about this recently as well.

AH4oFVbPT4f8 · 3 years ago
I use debuggers all the time. "Don't repeat yourself" and "keep it simple, stupid" go a long way. I'd much rather have something readable and easy to follow than something clever.

mdcds · 3 years ago
I program in Scala and didn't know what the IntelliJ debugger could do until someone taught me how to use it (including the nifty trick of setting breakpoints inside 3rd-party libs).

Sometimes you don't know what you don't know.

But yeah, these days I rely on the debugger heavily. Even for Python :)
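In Python specifically, the built-in `breakpoint()` (3.7+) is the one-line version of "set breakpoint and hit debug". It dispatches through `sys.breakpointhook`, which by default opens pdb at that line; the sketch below swaps in a recording hook purely so it can run non-interactively (the function and values are made up):

```python
import sys

hits = []
# breakpoint() calls sys.breakpointhook(); replace the default hook
# (which would drop into pdb) with a recorder so this sketch runs
# without an interactive session.
sys.breakpointhook = lambda *args, **kwargs: hits.append(args)

def double_plus_one(x):
    y = x * 2
    breakpoint()  # with the default hook, pdb would open right here
    return y + 1

result = double_plus_one(3)
```

In normal use you'd leave the default hook alone and just drop `breakpoint()` where you'd otherwise have put a print; `PYTHONBREAKPOINT=0` disables all of them at once without editing the code.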

suprjami · 3 years ago
I make an effort to use a debugger first.

Right now I'm writing some stuff which takes over the terminal and works in recursive loops (ncurses roguelike map generator) so print debugging is a hassle. It's actually quicker to use a debugger.

I find that attaching gdb and running "bt" and "info locals" or inspecting variables with "p" is much cleaner than printing, and helps me visualise what the code is doing much better.

This use of a debugger for "little things" is immensely helpful when I need to use a debugger at work to solve "big things", which usually happen remotely and after the fact, with only a core file, so printing isn't even an option there.
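For comparison, gdb's `bt` / `info locals` / `p` workflow has a rough post-mortem analogue in Python's standard library, useful for the same kind of after-the-fact inspection (a sketch; the failing function and its variables are invented):

```python
import sys
import traceback

def inner(n):
    scale = 10
    return scale // n   # blows up when n == 0

def capture_post_mortem():
    """Run inner(0) and, much like 'bt' plus 'info locals', capture
    the formatted traceback and the locals of the raising frame."""
    try:
        inner(0)
    except ZeroDivisionError as exc:
        tb = exc.__traceback__
        # walk to the innermost frame, where the error actually happened
        while tb.tb_next is not None:
            tb = tb.tb_next
        bt_text = traceback.format_exc()          # ~ bt
        local_vars = dict(tb.tb_frame.f_locals)   # ~ info locals / p
        return bt_text, local_vars

bt_text, local_vars = capture_post_mortem()
```

`pdb.post_mortem()` gives the fully interactive version of this when you'd rather poke around than capture.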

bobajeff · 3 years ago
I'm getting more into using debuggers lately. They can really help; they just take time to learn.

As to why they are a last resort: for me it was having to learn the debugger and also set up the debugging environment. Every binary needs to be built with debugging symbols, and for any libraries you want to step through, you have to find the symbols too.

Also, I think it depends on how low-level the language is and what kind of program it is.

Some languages/environments don't have debuggers, or the ones they have aren't very good. I wouldn't say most mature debuggers are super great to use either.

pentab · 3 years ago
Yes, it is sad that setting up debugging is so tedious.

Could you give some examples of languages/environments with "bad" debuggers?

bobajeff · 3 years ago
Off the top of my head: Multi-threaded C++ is one. Odin lang has poor support for many debugger features.

The Chromium Project can be way too slow to run in a debugger. Even just debugging the Content Shell can be very slow.