Thursday, July 12, 2012

Some things I've learnt about programming

I've been programming for over 30 years, from machines that seem puny today (Z80- and 6502-based) to the latest kit, using languages that range from BASIC, assembly language, C and C++ through Tcl, Perl, Lisp, ML and occam to arc, Ruby, Go and more.

The following is a list of things I've learnt.

0. Programming is a craft, not science or engineering

Programming is much closer to a craft than a science or engineering discipline. It's a combination of skill and experience expressed through tools. The craftsman chooses specific tools (and sometimes makes their own) and learns to use them to create.

To my mind that's a craft. I think the best programmers are closer to watchmakers than bridge builders or physicists. Sure, it looks like it's science or engineering because of the application of logic and mathematics, but at its core it's taking tools in your hands (almost) and crafting something.

Given that it's a craft, it's not hard to see that experience matters, tools matter and intuition matters.

1. Honesty is the best policy

When writing code it's sometimes tempting to try stuff to see what works and to get a program working without truly understanding what's happening. The classic examples are an API call you decide to insert because, magically, it makes a bug go away, or a printf that, once inserted, causes a program to stop crashing.

Both are examples of personal dishonesty. You have to ask yourself: "Do I understand why my program is doing X?" If you do not, you'll run into trouble later on. It's the programmer's responsibility to know what's going on, because the computer will do precisely what it's told, not what you wish it would do.

Honesty requires rigor. You have to be rigorous about ensuring that you know what your program does and why.

2. Simplify, simplify, simplify

Tony Hoare said: "There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult."

Simplify, refactor, delete.

I'd rephrase Hoare's maxim as "Inside every large, complex program is a small, elegant program that does the same thing, correctly".

Related to this is the 'small pieces loosely joined' philosophy. It's better to structure a program in small parts that communicate than to create some gigantic monolith. This is partly what has made UNIX successful.
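As an illustration of 'small pieces loosely joined', a UNIX pipeline like `grep GET access.log | sort | uniq -c` can be mimicked by small, single-purpose functions. This is only a sketch with made-up log lines, not code from any real system:

```python
from collections import Counter

def matching(lines, needle):
    """Keep only lines containing needle (the 'grep' piece)."""
    return [line for line in lines if needle in line]

def counted(lines):
    """Tally identical lines (the 'sort | uniq -c' piece)."""
    return Counter(lines)

# Invented sample data; each piece is trivial alone, the power is in joining them.
log = ["GET /", "GET /about", "POST /login", "GET /"]
print(counted(matching(log, "GET")))
```

Each function knows nothing about the others; they meet only at a simple shared interface (a list of strings), which is what makes them easy to rearrange and reuse.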

3. Debuggers are sometimes a crutch, profilers are not

I almost never use a debugger. I make sure my programs produce log output and I make sure to know what my programs do. Most times I can figure out what's wrong with my code from the log file without recourse to a debugger.

The reason I don't use a debugger much is I think it leads to lazy thinking. Many people when faced with a bug reach for the debugger and dive into setting breakpoints and examining memory or variable values. It's easy to become enamored with such a powerful tool, but a little bit of thinking tends to go a long way. And if your program is so complex that you need a debugger you might need to go back to #2.

(Aside: having said all that, one of the programmers I most respect, John Ousterhout, seemed to spend all day in the Windows debugger).

On the other hand, profilers are essential if you need to understand performance. You'll never cease to be amazed what a profiler will tell you.
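To make the profiler point concrete, here is a minimal sketch using Python's standard-library cProfile; `slow_sum` is just an invented hot spot to give the profiler something to find:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    """A deliberately naive loop standing in for a real hot spot."""
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_sum(200_000)
profiler.disable()

# Print the five most expensive calls; this is where the surprises live.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

The same idea applies in any language: measure before optimizing, because the hot spot is rarely where you guessed it would be.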

4. Code duplication will bite you

Don't Repeat Yourself. Do everything just once in your code.

This is related to #2, but is a special case. Even a simple piece of code that's duplicated will lead to trouble later when you 'fix' one version and forget about the other one.
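As a toy illustration (the function names are invented): once the same range check is copy-pasted into two places, a fix to one copy will eventually miss the other. Doing it once avoids that:

```python
def clamp(value, low, high):
    """The range rule lives in exactly one place; fix it once, fix it everywhere."""
    return max(low, min(high, value))

def set_volume(level):
    # Reuses the single rule rather than repeating max/min inline.
    return clamp(level, 0, 100)

def set_brightness(level):
    return clamp(level, 0, 100)
```

If the rule changes, say to reject out-of-range input instead of clamping it, there is only one function to edit.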

5. Be promiscuous with languages

Some people get obsessed with a specific language and have to do everything in it. This is a mistake. There is no single greatest language for all tasks.

The key thing is to know which language in your toolbox you'll use for which problem. And it's best to have lots of tools. Try out different languages, build things in them.

For example, perhaps you'll not use Python or ML very much but you'll have played with list comprehensions and seen their power. Or you'll dabble in Go and will have seen how it handles concurrency. Or you'll have used Perl and seen the power of really flexible string handling. Or you'll have used PHP to quickly build a dynamic web page.
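The list-comprehension point can be seen in a few lines (the word list is arbitrary): filtering and transforming a list becomes one declarative expression instead of a loop with bookkeeping.

```python
words = ["go", "perl", "python", "ml", "php"]

# A loop with bookkeeping...
result = []
for w in words:
    if len(w) > 2:
        result.append(w.upper())

# ...collapses into a single declarative expression.
long_upper = [w.upper() for w in words if len(w) > 2]
assert long_upper == result
```

Even if you never write Python for a living, having seen this shape changes how you think about loops in every other language.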

I hate language wars. They're basically for losers because you're arguing about the wrong thing. For example, in my hands PHP is a disaster; in others' hands it sings. Similar things can be said about C++.

6. It's easier to grow software than build it

This is related to #2. Start small and grow out. If you are attacking a problem then it's easier to grow from a small part of the problem that you've tackled (perhaps having stubbed out or simulated missing parts) than to design a massive architecture up front.

When you create a massive architecture from the start you (a) get it wrong and (b) have created a Byzantine maze that you'll find hard to change. If, on the other hand, you work from small pieces that communicate with each other, refactoring will be easier when you realize you got it wrong from the start.

The root of this is that you never know what the truly correct architecture will look like. That's because it's very rare to know what the external stimuli of your program will be like. You may think that you know, say, the pattern of arriving TCP traffic that your mail server will handle, or the number of recipients, or you may not have heard of spam. Something will come along from outside to mess up your assumptions and if your assumptions have been cast into a large, interlocked, complex program you are in serious trouble.
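The 'stub out missing parts' idea can be sketched with the mail-server example (all names here are hypothetical): the delivery logic can be grown and exercised long before a real spam filter exists.

```python
class StubSpamFilter:
    """Stand-in for a real filter: always answers 'not spam'.
    It lets the rest of the program grow before the hard part is built."""
    def is_spam(self, message):
        return False

def deliver(message, spam_filter):
    """Route a message; depends only on the filter's interface, not its guts."""
    if spam_filter.is_spam(message):
        return "quarantine"
    return "inbox"

print(deliver("hello", StubSpamFilter()))
```

When the real filter arrives it slots in behind the same `is_spam` interface, and `deliver` never needs to change; the small piece, not the big architecture, is what was designed up front.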

7. Learn the layers

I think that having an understanding of what's happening in a program from the CPU up to the language you are using is important. It's important to understand the layers (be it in C understanding the code it's compiled to, or in Java understanding the JVM and how it operates).

It helps enormously when dealing with performance problems and also with debugging. On one memorable occasion I recall a customer sending my company a screenshot of a Windows 2000 crash that showed the state of a small bit of memory and the registers. Knowing the version of the program he had we were able to identify a null pointer problem and its root cause just from that report.

8. I'm not young enough to know everything

I've still got plenty to learn. There are languages I've barely touched and would like to (Erlang, Clojure). There are languages I dabble in but don't know well (JavaScript) and there are ideas that I barely understand (monads).

PS It's been pointed out that I haven't mentioned testing. I should have added that I do think test suites are important for any code that's likely to be around for a while. Perhaps when I've been programming for another 30 years I'll have an answer to the question "Do unit tests improve software?". I've written code with and without extensive unit tests and I still don't quite know the answer, although I lean towards believing that unit tests make a difference.
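For what it's worth, even a throwaway suite of plain assertions documents behaviour cheaply; `word_count` here is invented purely for illustration:

```python
def word_count(text):
    """Count whitespace-separated words."""
    return len(text.split())

# Each assertion pins down one behaviour, including the edge cases.
assert word_count("") == 0
assert word_count("hello world") == 2
assert word_count("  oddly   spaced  ") == 2
print("all tests passed")
```

The suite costs a minute to write and, crucially, turns "I think this handles empty input" into something the machine checks on every run.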


Unknown said...

I strongly agree with everything except your comment on the use of a debugger. To me it is just another tool to use wisely. When things don't work I postulate theories for why and then try to prove my theories through the judicious use of breakpoints.

anon said...

Regarding the simplify issue, I've always followed this rule: If it's getting complicated, you're doing it wrong.

EddieB said...

"1. Honesty is the best policy"

Another example of this: Programming By Coincidence.

Baron Mango said...

I'm with Graham Moore - I switch between logs and debuggers, depending on which seems like the right tool - but I also want to add that a debugger is one of the best **learning** tools, particularly for beginners. Nothing like stepping through code, getting that CPU-level view of your instructions, when you're first learning.

Unknown said...

Zero-based numbering made me smile. :)

mcv said...

I'd say that programming is both a craft and an engineering discipline. First comes the engineering - designing the program - then the craftsmanship - writing the code.

aqc said...

Programming is purely a craft; engineering and science are tools.

Unknown said...

Well said.

I'd alter #4 to "Feature duplication will bite you".
Don't repeat the same code to serve the same intent.

and I'd add...
#9. Learn from the past.
Don't fall into the trap of believing your situation, solution or problem is unique. Become a student of the past - of software development and of human history.

Mizchief said...

For number 2 I defer to Einstein:
"Make everything as simple as possible, but not simpler."

If you try to make your design too simple, it doesn't properly describe the problem you are trying to solve, and you end up with a collection of clever tricks rather than a simple and elegant solution using some basic OOP or other design tools.

Will said...

Unit tests help me move faster, but it is mostly a self-confidence thing...

shevy said...

I disagree with you on the topic of programming languages.

A better language is like a better car - you can go faster with it.

To ignore those differences says a lot about your intelligence, because you, in this post here, deny that some languages (like PHP) are inherently worse than other languages.

Are you not man enough to admit that there are better languages because they are designed better?

rdm said...

A theme here is "understand your code" -- if you, the author, cannot understand your own code... who can?

And, with this tool in hand, I would like to revisit the "debugger" issue:

From my point of view, the most important thing to understand about my code is the data that I am working with. If I understand the data, then my code becomes tools to manipulate it. And a tool like a debugger can be a tool to inspect my data.

Of course *I* have to be responsible for understanding. And in some contexts a good program is sufficient for representing everything about the data that I'm working with. In other contexts, however, it can be good to understand intermediate results. (And there are related issues when dealing with other people's code.)

Unknown said...

Debuggers are tools and are appropriate when the tool needed is a debugger. Same as printf and writing to the register that controls the LED bank.

I find that when I'm stepping through code, what I'm really doing is forcing myself to slow down my thought processes and consider what each step is /supposed/ to be doing. I don't just see what the value of a variable is - I see the delta between what it is and what I expected it to be. It's understanding the delta that gives your intuition full rein...

Mizchief said...

Yeah, I semi-agree about the debugger. It's a great tool but not a crutch. I've seen guys spend all day trying to set up their debugger to solve a production issue when five minutes of looking at the code and thinking "how is this output possible?" would have given the solution.

Nate.Flink said...

I created this to underscore your point:

Jorge Varas said...

For #2 I'll add that simplifying is not the same as making it easier; sometimes it is hard to simplify, but we should do it anyway.

Rich explains it better than me:

Marcus said...

Thank you for this, it's good to hear real experience...

Rodger said...

This reminds me of the Dreyfus Model Of Skills Acquisition:

Sounds like you are one of the masters.

Anonymous said...

Thank you very much. Of great help to a budding programmer like me. Thanks much.

Dave said...

Good article, but if software is a craft, why do we still interview as if it were a science? Some science questions are fine, but not when 90% of the interview is science questions.

Christof Kaller said...

My #9: write your code in a way others can read and understand. It goes along with #2. Use descriptive names - they will help you later on. Comment only a few special parts of the code.

John Morrison said...

There is nothing wrong with using a debugger. It is an excellent way of getting intimate with your code and the fastest way to get in there and see what is happening.

I have worked with programmers that don't use debuggers. It takes them ages to achieve anything, they are not quite sure what is happening, and their code is unreadable because of all the logging calls.

Vellanova said...

I echo the sentiments of several previous comments in saying that using a debugger is in most cases definitely not a crutch, and in most instances is far superior to relying on logging statements.

It is also often a useful exercise to step through all your code (and the code of others as well) in the debugger even when there are no apparent bugs - it gives you a better understanding of exactly what the code is doing and I find is far more effective than a static code review.

Unknown said...

I am two things: an artist that sculpts energy & a kitchen-sink scientist that uses the first law of thermodynamics to judge the heat emissions at point of mass-user contact to gauge the efficacy of this mound of energy I am forming and reforming from within a subliminal time-frame. I view the optimum solution as nothing & if I can't manage that then it is the one that is directly next to nothing.

In principle I give absolute precedence to data over program; in practice I remove the concept of precedence and invoke instructional symmetry & I have no idea what a 'debugger' is. @thepoettrap

Unknown said...

Hmm, I'm heavy in debugger mode - enough so that when I use tools without one, I get frustrated. I guess I don't like littering my code with log statements. Although I'll admit both that, if I did, I would get useful information from users when I needed it, and that sometimes I do go into 'crutch mode', where I have to remind myself to think about the problem.

t.t said...


Thank you for sharing your experiences.

Anonymous said...

Maybe you should look up the definition of engineering.

Jasmine said...

"Are you not man enough to admit that there are better languages because they are designed better?"

I think statements like that point out the real origin of language wars - MEN are competitive by nature and they don't understand the concept that something might not always be better. Men *rank* everything, from women to programming languages, and they want their rankings to be universally applicable, but sorry, some men *like* fat chicks and you're just going to have to deal with the fact that your 10 is not the same as someone else's.

Is Arnold Schwarzenegger better than Judge Judy? In what situations? Always? Men can not deal with the fact that Judy is better than Arnold sometimes, and sometimes it's the other way around. That's where the stupid insistence that one language is better than another comes from. It is simply not true.

I have also been programming for 30 years, and maybe you just need to have 30 years of experience to see this, but languages simply exist, they don't have any moral implication until you attach one. Language wars are a symptom of believing in the wrong things. Is a word processor better than a pencil? Is it really? There are people who will insist that the word processor is always better in all situations, and that's just plain wrong. The same thing is wrong about programming languages.

If you find yourself believing that one language is better than another, you are wrong, and your task is to figure out what went sideways in your thinking, not to defend your incorrect belief.

Kurt Guntheroth said...

Every engineering discipline has an element of craft; there are graceful, elegant buildings, and there are Bauhaus monstrosities. Likewise, a program can be elegant or messy, and still more-or-less do its job.

There is much craft in the individual practice of coding, and much engineering (or not) in design and in the organization of teams to do the work in a timely way.

rdm said...

And, to underline some of the issues Jasmine called out:

Most of the words used "in a language" are words defined to build the application.

Sometimes the best implementations come from building an initial version
using one language and then rebuilding (using lessons learned) in another language. Or, several other languages.

SQL and Javascript can both be useful languages, for some people, and yet it's often useful to insert some other language(s) between them.

In a modern mashup you often do not need to know what languages components of your system were built using.

We need sensible people -- people that can see past nonsensical name slinging -- making the important decisions.

Dan Sutton said...

This is an excellent article. I think the debugger thing is debatable: when you're under the gun to finish a project, it's a useful tool to have: I use it all the time. Of course it's a crutch: it's also a useful tool which pride isn't going to stop me from using. I think if there's a tool at your disposal then there's probably nothing wrong with using it as long as it doesn't contradict point 1 of yours, which is of supreme importance.

Regarding point 0, I'd actually classify programming as an art form: if your algorithms aren't beautiful then they're probably not that great; Father Dijkstra knew this and even went as far as to publish a work entitled, "Ten Beautiful Algorithms."

Programmers who can't see the beauty in the code aren't programmers at all: they're plumbers: a great piece of code should elicit an emotional response.

I remember, back in the '80s, looking at an assembly language listing for drawing lines on a VGA. The programmer had split the task into four distinct algorithms: vertical lines (easy - you just work out the bit mask then keep adding the width of the screen in bytes to move to the next line), horizontal lines (fill in all the bytes with FF and then do the bits on the ends), lines at more than 45 degrees to horizontal (easy because there's only one bit set for each scan line) and then the difficult one: lines at less than 45 degrees, which have a different bit mask for each line. Somehow, his assembly code achieved this by rotating the bit mask for each scan line such that it was always correct: the carry flag of the CPU handled the extra bits or got rid of them: the loop was tiny, with no conditionals in it, and it JUST HAPPENED TO WORK -- amazing -- I remember a feeling of such joy when I finally figured out what the guy had done, since the thought process behind it was so beautiful.

Programming should be like this - unfortunately, many programmers these days - especially corporate programmers, although I hate to say it - fail to see that beauty and instead write pedestrian code, in the creation of which they're never thinking about the whole problem in abstract terms, but rather solving one step at a time, even if the steps represent something which can be expressed in far less code and far more beautifully, vis-a-vis your point #2 (see Edward DeBono and lateral thinking).

Dan Sutton said...

Here's a gem you'll appreciate if you haven't seen it before (or even if you have):

Anonymous said...

This dilettantish essay is ridiculous. It seems you spent 30 years for nothing.

Unknown said...

Most software developers have a background in science; hence, after their 10,000 hours of practice, they have a tendency to think of programming as a craft. I do too.

Nonetheless, software is about organizing ideas, giving definition to concepts, etc., so a better analogy would be to compare programming to legal work. The computer's laws are just a lot more precise, and the judge impartial and unforgiving.

If we, as a community, stop thinking in terms of carpentry and more in terms of legal practice, we will organize our work and life in a way that is a lot more suitable to the kind of activities required to develop software.

rdm said...

The legal system analogy is valid.

And, incomplete.

Issues to consider include:

common sense (plentiful in humans, lacking in computers).

conflicting points of view (plentiful in law, and arguably the reason for its existence, but something we need to avoid or neglect in programming -- and its resolution is just the starting point for any useful work).

Persuasive rhetoric... ok, it's good to have some of this on your side in programming exercises, but it probably belongs in the marketing department (and perhaps leadership -- RIP Steve Jobs).

Unknown said...

I agree. I think debuggers are one of the most important tools for a developer. Log files are useful in some cases, but debuggers are the best way to learn how some code works or to make sure your code does what you want (especially for object-oriented languages).

Krunal Jariwala said...

Great points. May I share these points on my website, with credit and a link to you? I would love our users to read these things.

tz said...

Unit tests are less important than inverse functions. As in f(f**-1(x)) and f**-1(f(x)) should return x. Then fuzz x and see.

"Unit tests" tend to be a lot like looking for the common = v.s. == error in an if statement. You know too much and too little about what to test for.

In my current project, the first thing I did to test a marshaler and unmarshaler pair was to feed the unmarshaled output back to a known-good device. Sort of: g(x) is known to work, so m(u(g(x))) should work identically. But I found problems - small typos, but then those are often harder to find than massive logic errors.
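(The round-trip property described above, sketched in Python with the standard-library JSON codec standing in for a real marshaler; `marshal` and `unmarshal` are illustrative names, not any particular project's API:)

```python
import json

def marshal(record):
    """Serialize a record to text (stand-in for a real marshaler)."""
    return json.dumps(record, sort_keys=True)

def unmarshal(text):
    """Parse the text back into a record (the inverse function)."""
    return json.loads(text)

# The inverse-function property: unmarshal(marshal(x)) == x for varied x.
samples = [{}, {"id": 1}, {"name": "ann", "tags": ["a", "b"]}]
for x in samples:
    assert unmarshal(marshal(x)) == x
print("round-trip holds for all samples")
```

Fuzzing then just means generating many random `x` values and checking the same equality.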

Rob G said...

There are a few clear violations of the DRY principle in comments above.

Nędza Darek said...

"4. Code duplication will bite you"
My professor said: "you shouldn't duplicate but sometimes it is better to duplicate"

About Monads: do you mean J's or Haskell's monads?

John Keklak said...

Damn you, I've now spent an hour reading about monads. :-)

Matthew Edmondson said...

Interesting article; however, like many others have said, I'd also disagree about the debugger point.

I actually recently compiled a list of my own experiences, and I certainly agree on the refactoring point you've made. I hope it can throw a different angle on (and maybe add some more to) your points above:
