OK, a little philosophy here - with justification, as there's all too much crap code out there. I'm fixing some now, written supposedly by pros. Pros they were not, and it's quite a mess in there - not only organized poorly, but plenty of sins of omission that "seemed to work fine at the time". And fixing sins of omission requires that you actually know your stuff.
For example, I'm working on some data-acquisition code for a Digilent ChipKit Uno32 with a WiFi shield, using their example web server - the idea being that I'll be able to suck down log files from the data-acq part with any browser on my network.
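(For the curious: Digilent's example server does the real work in my project, but the shape of the thing is roughly this - sketched here with the stock Arduino WiFi and SD libraries rather than the Digilent networking classes the ChipKit actually uses, and with a made-up SSID, log file name, and chip-select pin, purely for illustration.)

```cpp
#include <SPI.h>
#include <WiFi.h>
#include <SD.h>

// Assumptions, for illustration only: stock Arduino WiFi + SD libraries,
// a log file named "datalog.txt" on an SD card, hypothetical credentials.
char ssid[] = "mynet";          // hypothetical
char pass[] = "mypassword";     // hypothetical
WiFiServer server(80);

void setup() {
  WiFi.begin(ssid, pass);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  SD.begin(4);                  // SD chip-select pin varies by shield
  server.begin();
}

void loop() {
  WiFiClient client = server.available();
  if (!client) return;

  // Naive on purpose: ignore what was asked for and always serve the log.
  while (client.connected() && client.available()) client.read();

  client.println("HTTP/1.1 200 OK");
  client.println("Content-Type: text/plain");
  client.println("Connection: close");
  client.println();

  File log = SD.open("datalog.txt");  // hypothetical log written by the data-acq side
  while (log && log.available()) client.write(log.read());
  if (log) log.close();
  client.stop();
}
```

The real code parses the request and does plenty more, but "browser asks, device streams the log back" is the whole idea.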
Yes, I put tellem in this code (same protocol, but obviously not written in Perl for this - it's their strange mod of C++ over there).
Thing is, the name lookup sometimes works when it shouldn't. Not always, but often enough that, had I not known what was going on, fixing it would have been a very intense bear of a thrash. Why does it work when it shouldn't?
Well, WiFi and other routers often keep track of MAC addresses of things hooked up to them - at least until the next power failure - YMMV depending on the router involved. This means that my router, my laptop, and perhaps my browser (on which I'm writing this) "already know" a numeric IP for, say, my Digilent Uno32...the browser even saves things across sessions.
That is, until something forgets how to map, say, http://uno321 to its real address - which at the moment is 192.168.1.73, but which could change on the next boot of my router(s). That's why I wrote a lightweight network mapper - tellem and tellme - so that hosts files can be updated by the machines themselves, going around any "remembering" that turns out to be wrong later.
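(I won't reproduce the actual tellem/tellme protocol here, but the device side of the idea looks roughly like this - sketched with the stock Arduino WiFiUDP API rather than the ChipKit's own library, and with an invented port and message format: the device just keeps shouting its name and current IP onto the LAN so something listening can fix up hosts files.)

```cpp
#include <SPI.h>
#include <WiFi.h>
#include <WiFiUdp.h>
#include <stdio.h>
#include <string.h>

// NOT the real tellem wire format -- port number and message layout are
// invented here purely to show the shape of the idea.
char ssid[] = "mynet";                    // hypothetical
char pass[] = "mypassword";               // hypothetical
const unsigned int ANNOUNCE_PORT = 5300;  // hypothetical
IPAddress broadcastAddr(255, 255, 255, 255);
WiFiUDP udp;

void announceName(const char *hostname) {
  IPAddress ip = WiFi.localIP();
  char msg[64];
  snprintf(msg, sizeof(msg), "%s=%d.%d.%d.%d",
           hostname, ip[0], ip[1], ip[2], ip[3]);
  udp.beginPacket(broadcastAddr, ANNOUNCE_PORT);
  udp.write((const uint8_t *)msg, strlen(msg));
  udp.endPacket();
}

void setup() {
  WiFi.begin(ssid, pass);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  udp.begin(ANNOUNCE_PORT);
}

void loop() {
  announceName("uno321");   // the name the hosts files should end up carrying
  delay(60000);             // re-announce once a minute so stale mappings get corrected
}
```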
As it turns out, I could have left all this out completely and it would have worked for quite some time...and trying to find out why it stopped working would have been a nightmare once browsers couldn't find the web server...which thing failed, and why?
Had I not known how things work, I'd have left out all this extra stuff that makes it work every time.
It's obvious that many so-called coders/programmers don't live by this rule. It's been made more obvious in hangouts on G+ where I meet wannabe programmers/script kiddies/hackers (who want to be uber hats, black or white, though it seems most of those who even want to be white hats are thinking "Walter White"). Many of these guys won't buy books and read them, and think we who do this right are merely keeping secret from them some small, simple magic incantation that, if they had it, would let them rule. It's not that simple - you really do have to know how all the layers work, and at all the levels appropriate for your particular project.
You can learn, say, C from K&R's rather thin book - maybe in one day. That doesn't get you squat - if you don't know the system libraries, at times down to the source-code level and how/why each thing is done, you can't do diddly. You want to program Windows and just guess at what MFC or .NET are doing? That's how you write code that crashes with errors that are a bear to find. FWIW, you can do the same with Perl from "Learning Perl" - the concept I'm trying to get across here makes the actual language more or less irrelevant. Add some nifty drag-and-drop "a monkey can do this" IDE and you're good to go out and make horrible-quality code.
The same of course holds for Linux or embedded programming - if you don't know the totality of the environment, not just the language, you don't know squat. Yeah, you might make a monkey program that sometimes works, but that's not pro-grade stuff, and my experience using others' code shows that in spades.
Do you really understand the interactions within your own code, much less with a multitasking opsys? And if you use threads, how about those too? Most don't, and then wonder why things crash seemingly at random, with, of course, bogus info from the debugger. For example, if you hit a breakpoint, of course a serial UART has over- (or under-) flowed by the time you see the info in the debugger - you have to know things like that.
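(The cure for that particular one, by the way, is to have the running code keep score itself instead of trusting a snapshot taken while the CPU is parked on a breakpoint. Something like this sketch - the buffer size and reporting interval are arbitrary:)

```cpp
// Let the running code count its own overruns instead of trusting a snapshot
// taken while the CPU is halted on a breakpoint.
const size_t BUF_SIZE = 256;
uint8_t buf[BUF_SIZE];
size_t head = 0, tail = 0;
unsigned long dropped = 0;        // bytes lost because our buffer was full
unsigned long lastReport = 0;

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Drain the driver's buffer promptly; count whatever we can't keep.
  while (Serial.available() > 0) {
    int c = Serial.read();
    size_t next = (head + 1) % BUF_SIZE;
    if (next == tail) {
      dropped++;                  // full -- evidence a halted debugger can't show you
    } else {
      buf[head] = (uint8_t)c;
      head = next;
    }
  }

  // ... consume buf[tail..head) here as the application needs ...

  if (millis() - lastReport > 5000) {
    lastReport = millis();
    Serial.print("rx bytes dropped so far: ");
    Serial.println(dropped);
  }
}
```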
In this case, I'm even having a struggle with the IDE itself, MPIDE (it's written in Java, but it's for Arduino-style C++ development), as it's obvious that someone wanted something that looked like a real IDE but didn't know WTF they were doing. Random crashes due to misuse of threads...loss of editor data...can't click on an error in the compiler warnings and jump to the file? So it's missing even the basic features, on top of not being basically reliable.
And it matters...boy, you sure don't want to click "Serial Monitor" near the end of an upload to the hardware so as to catch the first things it will print - it'll crash, and in the process lose some of your recently edited source code if you didn't save it manually first (all REAL IDEs save for you when you compile).
There are a jillion examples of this, but a list of just the ones I run into all the time would take more space than this board's disks can hold. Do us all a favor - don't be one of them.
OK, rant-off.
But this is why I believe some pretty smart guys don't trust embedded-CPU devices to act right, never mind PCs - because the people who program them are often guilty of this error, along with others. It used to be so hard to program embedded chips that this wasn't much of a problem, and as often as not, the guy who programmed the thing was also the hardware designer, or at least in the same room.
Dumb example of that: PIC chips in general come up with all pins floating. With a pullup or pulldown on anything that matters, you can ensure that in the "I've just booted or just crashed" state (remember, they tend to have a watchdog timer - use it, and use it right), things are set up so that even in a fail mode, nothing connected to the chip goes up in smoke - there's a minimal sketch of the idea below. Simple...but apparently all too complex for the latest generation of hackers (a word I don't use in the new pejorative sense - I do make a distinction between an honest kludger and a cracker, which these days is what is seemingly meant by "hacker", despite my protests for years). Funny how language changes due to ignorance - and lawyers. For example, FUD used to mean Fear, Uncertainty, and Deception. But you can't accuse someone of deception without risking a libel suit - so now it's taken to mean "Fear, Uncertainty, and Doubt", which just makes the third term redundant with the second.
How can you tell the truth if A: you don't know it, and B: the language changes to mean something different, anyway?
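(Back to that PIC example - here's the minimal sketch I promised, written in the Arduino-style C++ I'm using on the ChipKit rather than PIC C, since the principle is identical. The pin numbers are invented, and serviceWatchdog() is a stand-in for whatever the real part's clear-watchdog operation is, CLRWDT on a PIC or otherwise.)

```cpp
// The principle from the PIC example, in Arduino-style C++ rather than PIC C.
// Pin numbers are made up; serviceWatchdog() is a placeholder for the
// chip-specific clear-watchdog operation.
const int HEATER_PIN = 7;      // hypothetical: something that could smoke if left on
const int RELAY_PIN  = 8;      // hypothetical

void serviceWatchdog() {
  // Placeholder: on a real part this is the chip-specific "kick the WDT" step.
}

void setup() {
  // First thing after ANY reset -- cold boot, crash, watchdog bite --
  // drive the outputs to the do-no-harm state before doing anything else.
  pinMode(HEATER_PIN, OUTPUT);
  digitalWrite(HEATER_PIN, LOW);
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);
  // External pullups/pulldowns should hold these same safe levels while the
  // pins are still floating, i.e. between reset and the lines above.
}

void loop() {
  // ... real work here ...

  // Kick the watchdog in exactly one place, and only when the loop is
  // actually healthy -- never from a timer interrupt that keeps running
  // while the main code is wedged.
  serviceWatchdog();
}
```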
OK, rant really off. Please, all you coders out there, take this to heart - it's not one example, it's a philosophy.