Coders at Work - observations

Recently, I’ve been reading “Coders at Work” while commuting to the office. The commute only takes me 15 minutes, so it’s going rather slowly. The book is a collection of interviews with some of the most famous programmers, and it’s really interesting. I’ve only read about 200 pages so far (5 interviews), but I’ve already found two intriguing points:

  • almost everyone so far insists on using printf for debugging. Sure, they use GDB occasionally, but not much. It’s been a while since I used GDB, but I don’t remember it being that bad; then again, I didn’t know Visual Studio at the time. Anyway, I find this rather old-school, and I find it hard to believe it’s possible to debug any bigger application this way. I’ve developed two complete games using only printf for debugging, but it wasn’t exactly my choice. It was my first job, we were making Game Boy Advance games, and there was only one devkit in the office (I wasn’t lucky enough to get it). It was possible to debug using an emulator, but it was slow as hell, so I wrote 99% of the code without a debugger. It actually worked out quite nicely, but those were really small-scale applications; I just don’t see myself doing it again (for a whole application, of course; sometimes logging is priceless and indispensable, since a debugger breaks the program flow). One reason may be that the interviewed coders usually develop web-based, non-realtime systems, where printfs are probably more viable. Still, a little weird.

  • an interesting observation about how today’s languages, the ones considered “modern”, don’t really introduce many new features. In terms of mechanisms, they’re not that different from Lisp or Smalltalk, which are 30-40 years old. On top of that, we’re still widely using C++, which lacks many of those mechanisms, or has them hacked on afterwards. The original languages never gained much mainstream popularity, but they still live “in disguise” in Python, Ruby, or Java.

Other than that, I’ve been pretty busy at work recently, so there’s not much new stuff… I’m still trying to finish one of my old experiments, though; it just needs a little cleanup. And I’ve just found another itch to scratch. I should find some time for this in November.

Old comments

peirz 2009-10-18 21:03:52

I once wrote a fairly big project at home where I deliberately did not use a debugger. I just set up SCons and wrote code in vim, so I wouldn’t get tempted :) And the immediate result I noticed was that I put much, much more emphasis on unit tests. Because guess what: if the application doesn’t work, I have no way to find out other than printf. So I’d better cover as much ground as possible in the tests’ assertions. At the end of the project, I was kind of proud that I only had to break out a real debugger maybe 3 or 4 times, and more importantly, to my surprise, I hadn’t really been any less productive for avoiding MSVC.
So I guess I can sort of see their point if they’re doing this on purpose, or just can’t be bothered to bring out the big tools.
(This was a GUI framework plus a bunch of supporting libs for i18n etc. in C++, so not some trivial 3-weekends crap).

admin 2010-05-04 23:38:35

@Jonathan: good point. However, I still believe it’s strongly dependent on the type of application. My current company has the strongest TDD/smoke-test/unit-test environment I’ve seen in my years in the industry, yet I still spend probably almost 30% of my development time in the debugger.

Jonathan Hartley 2010-05-04 11:57:50

@peirz: For the record, I use vim too. :-)
We have a story at my workplace we call ‘the great IDE anecdote.’ Drokk, I’ve told this story a bunch of times already… I’m just going to post it on my blog and link to it, hang on…
The story of how we all came to choose to ditch traditional IDEs in favour of Vi and Emacs:

Jonathan Hartley 2010-05-04 10:52:31

I think the use of printf for debugging stems from a style of development where one simply needs to debug less often, and when one does, it is easier to tell what is going on, so the benefits a debugger brings are of relatively less value.
While debugging is sometimes necessary, it is a waste of time that could be better spent. To avoid this waste, it is better to spend the effort developing code that is bug-free the first time around. (Heck, this sounds preachy; forgive me, I know you know this, it’s just the way it came out.)
The guys in Coders at Work clearly know this - they are extraordinary developers. But even for lesser mortals like me, there are easily learned ideas that can have a similar effect: things like test-driven development, dynamic languages, and the use of asserts to enforce design by contract, invariants, and bounds functions.
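To illustrate the asserts-as-contracts idea in the document’s own C++: a minimal sketch (the `Stack` class and its method names are invented for this example, not taken from any project mentioned here):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Illustrative sketch: asserts used as precondition/postcondition checks.
class Stack {
    std::vector<int> items_;
public:
    void push(int v) {
        items_.push_back(v);
        assert(!items_.empty());  // postcondition: stack is non-empty after push
    }
    int pop() {
        assert(!items_.empty());  // precondition: caller must not pop an empty stack
        int v = items_.back();
        items_.pop_back();
        return v;
    }
    std::size_t size() const { return items_.size(); }
};
```

A violated assert aborts right at the failing call site with file and line, which often pinpoints the bug faster than a debugger session would.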
On my current ten-person project, to my astonishment, there have only been about five occasions in the last three years when I’ve wanted to fire up a debugger. The rest of the time, if I’m trying to understand what’s going on, I’ll either write a new unit test or add a print - possibly both.
TDD, coupled with languages that don’t require a compile phase, tips the scales in this direction too, since you can literally add a print, press a key in your editor to run the appropriate unit tests (one of which sets up all the preconditions needed to demonstrate the puzzling behaviour), and see the output from the print instantaneously. No compile delay. No application-launch delay. No complex interaction with the application to invoke the puzzling behaviour. I’m talking about a 500 ms feedback loop. No debugger can compete with that.
IMHO. Thanks for all the marvellous blog posts.
Best regards,

More Reading