I was having a heated discussion with a HAL "guru" over on their forums regarding the silly bugs I was finding in HALi. Basic properties like Sensors.Timer and Sensors.Counter are broken and completely untested. I asked why Automated Living doesn't have regression tests for these basic features, and "guru" told me that testing is not worth a developer's time, that developers don't have time to do any testing, that regressions would be thrown away every major release, and that it's better to just throw the software to the public and let them beta test it than to do any extensive testing in house.
QA engineers cost nearly as much as software engineers ($95k average vs. $115k average in Silicon Valley), so it's absurd to claim software engineers are too expensive to spend time testing their own software. Not only that, who do you think has a better idea of how the software is supposed to work? Any testing they can do would be a huge benefit. HAL guru also feels it's not even worth testing HALi unless the test engineer understands "Visual Studio, including C#, C++, Visual Basic. And also understands Perl, VB Script, Java Script and HTML." That's quite a list. I wonder if AL really has developers who understand all that. The funny thing is the bugs I found were found using Visual Basic and JScript, and those bugs would show up in ANY language.
Furthermore, most HAL users are out-of-the-box users. They're not going to be programming in C# or C++; more than likely it'll be VB and VB Script. Why not write regressions in the languages that will get the most use, especially if you can't cover ALL the languages above? Better to test something than nothing, right?
For over 14 years, I've worked in ASIC design at large (1000+ employees), medium (200 employees), and small (20+ employees) Silicon Valley companies. At every company I've worked at, software engineers have had down time between major releases. It's the same for all my friends and old co-workers at other companies, too. HAL guru says AL engineers don't have down time between major releases. "Hummmmmmmm...."
How are regressions a bad thing? Imagine if I had a script that tested all the basic objects and methods:
flagCount = HAL.Sensors.Flag.Count;
varCount = HAL.Sensors.Variable.Count;
ctrCount = HAL.Sensors.Counter.Count;
timCount = HAL.Sensors.Timer.Count;
Then it would have been REALLY easy to see that the Counter and Timer properties aren't even functional. HAL guru insists such an effort will be tossed out with the next major release, yet claims that everybody who wrote HALi apps will be able to just "update the HALi reference and recompile". If that's the case, how would a script like the one above be a waste of time? It wouldn't even need to be compiled, and you'd get the benefit of testing all the features from previous releases that carry forward. Sounds like a no-brainer to me.
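To make that concrete, here's a minimal sketch of what such a directed check could look like in JavaScript. The HAL object below is a mock I've invented purely for illustration (in a real HALi session you'd be talking to the live object instead), and checkSensorCounts is a hypothetical helper name; the Timer collection is deliberately broken to show what the script would report.

```javascript
// Hypothetical stand-in for the live HALi object -- mocked here so the
// sketch is self-contained. Timer.Count is broken on purpose to
// simulate the real bug.
var HAL = {
  Sensors: {
    Flag:     { Count: 12 },
    Variable: { Count: 7 },
    Counter:  { Count: 3 },
    Timer:    { Count: undefined }  // simulates the broken property
  }
};

// Directed regression: every basic sensor collection must report a
// usable, non-negative numeric Count.
function checkSensorCounts(hal) {
  var failures = [];
  var names = ["Flag", "Variable", "Counter", "Timer"];
  for (var i = 0; i < names.length; i++) {
    var coll = hal.Sensors[names[i]];
    if (!coll || typeof coll.Count !== "number" || coll.Count < 0) {
      failures.push("Sensors." + names[i] + ".Count is broken");
    }
  }
  return failures;
}

var failures = checkSensorCounts(HAL);
// failures → ["Sensors.Timer.Count is broken"]
```

A script this size takes minutes to write, never needs recompiling, and flags the broken property the moment a release regresses it.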
HAL guru is very fond of public betas with minimal or no internal testing. But public betas are like random testing: there's no guarantee those end users will exercise even 10% of the code, especially since HALi only has a handful of active users. How good is the quality of their testing? How can you measure what they've tested? Sure, it's "free", but at what cost?
On the other hand, if you have a QA engineer creating a testplan of DIRECTED tests for the basic methods, events, and properties, you know exactly what's being tested. You get the low-hanging fruit out of the way. This makes your beta testers more efficient, because they're not thrashing on bugs in basic features. It keeps your software engineers out of panic mode when you release a crappy beta and 25 bug reports a day come in, many of them duplicates filed by different users. Plus you can leverage those regressions in future releases. Your QA engineer also works with the developers to figure out what else needs testing: features and behaviors your random beta testers have no clue about or don't care about. You'll have a shorter beta, get the product to market sooner with better quality, and you'll have happier customers. That doesn't sound bad, right?