Why does so much commercial software and hardware seem like it hasn’t undergone more than cursory debugging? The Netflix streaming component on my Blu-ray player crashes about 20% of the time. I’ve replaced it with a Chromecast, which somehow ended up in a reboot loop on the second day I owned it, though the problem seems to have resolved itself (and it’s otherwise awesome, if limited in Canada). Customer support at my internet service provider seems to think it’s normal to suggest that a router should have to be periodically rebooted.
Same thing at work. The software I use to reconstruct SPECT images has a completely reproducible error that shows up whenever I generate an attenuation map. That’s such a routine operation that it must have been exercised repeatedly during development. How was it not caught and fixed? I spent yesterday afternoon trying to import a set of journal articles into Papers 3 before giving up: it had so many bugs that it would be faster to build my reference list by hand. I thought maybe it was just my computer, but apparently not.
What’s going on? Is shipping half-finished, poorly tested crap the new normal?