For some interesting reading, range about the Internet for articles detailing how the software world has changed in recent years with the success of companies like Facebook and similarly ubiquitous social-networking technologies. Those companies have fostered the advent of DevOps, which is more a paradigm shift in corporate culture than merely a mechanical development/quality-assurance/deployment strategy, and which demonstrates a new way of thinking about deployment scaling in the cloud (with an unbelievable number of servers available) while maintaining an aggressive development schedule. Sprint-cycle application development and cloud-based deployment are the order of the day for these newer entities. No longer does development sit in a cycle of a year or more, but rather one measured in months at most, or weeks, even days. Getting customer-requested features quickly into the product and out to the customers is still job one, but – oh, hey! – what a difference in implementation! (See Ben Horowitz's article "How Software Testing Has Changed.")
How does any piece of software maintain its integrity in a pedal-to-the-metal blast from one feature-set stage to the next? It is obviously difficult, and it is creating an entirely new style of interdependence among development, testing/quality assurance, and the customers.
In the early 1990s, an engineer at AT&T Bell Labs wrote a paper positing (broadly paraphrased) that, contrary to the wisdom of the day – which held that user manuals were to be written after the software was coded and tested, and were solely for the benefit of the user – such end-user documentation was not an add-on to the overall software product. Instead it was a critical, front-loaded part of the software cycle, from design to delivery. He reasoned that by having the end-user documentation at hand, developers would have a blueprint for the entire product effort, and that by engaging all the principal actors in creating the documentation both preparatory to, and in parallel with, code design and testing, many of the common hitches and glitches – feature creep, code bloat, missing features, and the like – could be caught early and smoothed out. We have since seen that thrust of thinking applied in myriad ways across a spectrum of industries.
When that paper was written, we lived in a 16-bit world of kilobytes that was working on expanding to megabytes with 32 bits. Office-level local-area networking was in its infancy. Obtaining and updating desktop application software was a major endeavor. Minicomputers and mainframes were common in larger enterprises. Today, twenty-five years later, we find ourselves in an analogous situation – albeit at a vastly grander scale – emerging from the cocoon of networked desktop computer systems into the virtually unbounded environment of wireless systems and terabyte-scale cloud servers, where products are updated responsively rather than cyclically.
The world of ECM is at a synaptic junction: enterprises are poised to jump the gap to cloud storage and are looking for products that can sustain lightning-fast, responsive enhancement cycles while providing sound, solid, secure processing. One such leading product is the ILINX suite from ImageSource, Inc. Check it out.
QA Test Engineer