Too Busy For Words - the PaulWay Blog

Wed 1st Feb, 2012

Critical Thinking

In the inevitable rant-fest that followed the LWN story on the proposal to have /lib and /bin point to /usr/lib and /usr/bin respectively (short story), I observe with wry amusement the vocal people who say "Look at PulseAudio - it's awful, I have to fight against it all the time, that's why we shouldn't do this". The strange, sad thing about these people is that they happily ignore all those people (like me) for whom PulseAudio just works. There's some little conceited part of their brain that says "I must be the only person that's right and everyone else has got it wrong." It's childish, really.

And in my experience, those people often make unrealistic demands on new software, or misuse it - consciously or unconsciously, and with or without learning about it first. These people are semi-consciously determined to prove that the new thing is wrong, and everything they do then becomes in some way critical of it. Any success is dismissed as "because I knew what to do", and every failure is pounced on as proof that "the thing doesn't work". I've seen this with new hardware, new software, new cars, new clothes, new houses, new accommodation, and so on. You can see it in the fact that there's almost no correlation between complaints about wind generator noise and the actual noise levels measured at the complainants' properties. Human beings all have a natural inclination to believe that they are right and everyone else is wrong, and some of us fight past that to be rational and fair.

This is why I didn't get Rusty's post on the topic. It's either completely and brilliantly ironic, or (frankly) misguided. His good reasons are all factual; his 'bad' reasons are all ad hominem attacks on a person. I'd understand if it were, say, Microsoft he was criticising - "I don't trust Microsoft submitting a driver to the kernel; on the one hand it's OK code, on the other hand it's Microsoft and I don't trust their motives" - because Microsoft has proven so often that its larger motives are anti-competitive even if its individual engineers and programmers mean well. But dmesg, PulseAudio and systemd have all been (IMO) well thought out solutions to clearly defined problems. systemd, for example, succeeds because it uses methods that are simple, already in use, and solve the problem naturally. PulseAudio does not pretend to solve the same problems as JACK. I agree that Lennart can be irritating sometimes, but I once read an article by someone clever pointing out that you don't have to like a person in order to use their code...


Going from zero

A friend of mine and I were discussing cars the other day. He said that he thought the invention of the electric motor was a curse on cars, because it meant you wouldn't have a gearbox to control which gear you were in. A suitable electric motor has enough power to drive a car from zero to a comfortable top speed (110 km/h) at reasonable acceleration using a fixed gear ratio - the car stays in what would be third gear and you drive it around like that. He maintained, however, that you needed to know which gear you were in, and to change gears yourself, because otherwise you could find yourself using a gear that you hadn't chosen.

I argued that, in fact, having to select a gear meant that drivers both new and experienced would occasionally miss a gear change and put the gearbox into neutral by mistake, causing grinding of gears and possible crashes as the car was now out of control. He claimed to have heard of a clever device that would sit over your gearbox and tell you when you weren't in gear, but you couldn't drive the car like that all the time because it made the car too slow. So you tested the car with this gearbox-watcher, and once you knew that the car itself wouldn't normally miss a gear, you just had to blame the driver if the car blew up, crashed, or had other problems. But he was absolutely consistent in his attitude towards electric motors: you lost any chance to find out that you weren't in the right gear, and therefore the whole invention could be written off as basically misguided.

Now, clever readers will have worked out by this point that the conversation was not real, and was in fact by way of an analogy (the strain on the examples gives it away, for one thing). The friend was real - Rusty Russell - but instead of electric motors we were discussing the Go programming language, and instead of gearboxes we were discussing the state of variables.

In Go, all variables are defined to contain zero unless initialised otherwise. In C, a variable can be declared but left undefined - the language standard leaves the value of a local variable that is declared but not initialised indeterminate. From the C perspective, there are several reasons you might not want to automatically pre-initialise a variable when you define it - it's about to be set from some other structure, for example, so pre-initialising it is a waste of time. And being able to detect when a variable has been used without knowing what its state is - using valgrind, for example - means you can detect subtle programming errors that can have hard-to-find consequences when the variable's meaning or initialisation is changed later on. If you can't know whether the programmer is using zero because that's what they really wanted or because it just happened to be the default and they didn't think about it, then how do you know which usage is correct?
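
To make the contrast concrete before weighing the arguments, here's a minimal Go sketch - the type and variable names are mine, purely for illustration - of what "every variable starts at zero" looks like in practice:

    package main

    import "fmt"

    // counter is a purely illustrative type; its fields get zero values too.
    type counter struct {
        hits  int
        label string
    }

    func main() {
        var n int      // 0
        var s string   // ""
        var p *counter // nil
        var c counter  // {hits: 0, label: ""}

        // All of these are well-defined without any explicit initialisation.
        fmt.Println(n, s == "", p == nil, c.hits) // prints: 0 true true 0
    }

The equivalent uninitialised locals in C would hold indeterminate values, and reading them before assignment is exactly the kind of bug valgrind is there to catch.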

From the Go perspective, in my opinion, these arguments are a kludgy way of seeing a bug as a feature. An optimising compiler can easily detect when a variable is set twice without any intervening examination of its state, and simply remove the first initialisation - so the 'waste of time' argument is a non-issue. Likewise, any self-respecting static analysis tool can determine whether a variable is tested before it's explicitly assigned, and I can think of a couple of heuristics for determining when that usage isn't intended.
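
As a sketch of the dead-store point - assuming, as is typical of modern Go and C compilers, that redundant stores to locals are removed during optimisation; the readTimeout function is invented for the example:

    package main

    import "fmt"

    // readTimeout stands in for "setting the variable from some other
    // structure"; its name and value are invented for illustration.
    func readTimeout() int { return 42 }

    func main() {
        var timeout int         // implicitly zeroed by the language...
        timeout = readTimeout() // ...then immediately overwritten.

        // Nothing reads timeout between those two lines, so the implicit
        // zeroing is a dead store that an optimising compiler can drop.
        fmt.Println(timeout)
    }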

And one of the most common errors in C is use of uninitialised variables; this happens to new and experienced programmers alike, and those subtle programming problems happen far more often in real-world code as it evolves over time - it is still rare for people to run valgrind over their code every time before they commit it to the project. It's far more useful to eliminate this entire category of bugs once and for all. As far as I can see, you lose nothing and you gain a lot more security.
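
Here's the sort of bug category I mean, as a hedged little sketch rather than anyone's real code - in Go the accumulator below is guaranteed to start at zero, whereas the equivalent C local, with the explicit "= 0" forgotten, would be indeterminate and might only misbehave on some runs:

    package main

    import "fmt"

    // sum is a hypothetical helper: the classic accumulator loop where a C
    // programmer who forgets "int total = 0;" gets an indeterminate result.
    func sum(values []int) int {
        var total int // never explicitly initialised, but guaranteed to be 0
        for _, v := range values {
            total += v
        }
        return total
    }

    func main() {
        fmt.Println(sum([]int{3, 5, 7})) // always 15, on every run and platform
    }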

To me, the arguments against a default value are a kind of lesser Stockholm Syndrome. C programmers learn from long experience to do things the 'right way', including making sure you initialise your variables explicitly before you use them, because of all the bugs - from brutally obvious to deviously subtle - that are caused by doing anything else. Tools like valgrind are workarounds that patch the problem up indirectly, after the fact. People even come to love them - like the people who love being deafened by the sound of growling, blaring petrol engines and associate that cacophony with a feeling of power. They mock those new, silent electric motors because they don't have the same warts and the same pain-inducing behaviour as the old petrol engine.

I'm sure C has many good things to recommend it. But I don't think lack of default initialisation is one.



All posts licensed under the CC-BY-NC license. Author Paul Wayper.

