Another school of thought has it that if you perform a task only a handful of times in a given timeframe, you should automate it, because there's a good chance you'll forget how to do it by the time it comes around again. Both points of view have merit, but neither sheds much light on how to set about automating.
The traditional administrator's path to automation is scripts. Lots and lots of scripts. Shell scripts, batch files, PowerShell scripts, you name it! Cron jobs and scheduled tasks running on a dozen different machines all executing little scripts in the background.
At first, these scripts are simple time savers; they help an administrator avoid scut work and focus on what really matters. Over time, however, the business comes to rely on those scripts. Eventually the author will go on vacation, quit, or otherwise become unavailable. That's usually right about when the scripts stop working and someone else has to fix them, and quickly, because the whole automated house of cards comes tumbling down.
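The kind of script in question is usually something trivial parked in cron. As a purely hypothetical sketch (the path, retention period and even the choice of language are invented for illustration), consider a nightly log cleanup job:

```python
#!/usr/bin/env python3
"""Hypothetical nightly cleanup job, the sort an admin quietly parks in cron.

Deletes log files older than RETENTION_DAYS. Simple and useful, and
completely opaque once its author leaves.
"""
import os
import time

LOG_DIR = "/var/log/myapp"   # hypothetical path
RETENTION_DAYS = 30

def purge_old_logs(log_dir: str, retention_days: int) -> list[str]:
    """Delete files older than the retention window; return what was removed."""
    cutoff = time.time() - retention_days * 86400
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(path)
    return removed

if __name__ == "__main__" and os.path.isdir(LOG_DIR):
    for path in purge_old_logs(LOG_DIR, RETENTION_DAYS):
        print(f"removed {path}")
```

Nothing about this script records why 30 days was chosen, what depends on the logs being pruned, or which machines it runs on. Multiply it by a few hundred and you have the house of cards described above.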
A large enough organization's IT can be held together by badly documented scripts and applets written by dozens of administrators over the course of decades, most of which nobody currently employed in IT knows how to manage or maintain. This is a nightmare scenario, and one best avoided.
APIs and code versioning tools have helped make automation easier. Scripts probably still aren't properly documented, but if version control is actually used, then at least what they are trying to do, and the history of their evolution, can be understood by those seeking to make changes later on.
Ideally all scripting within an IT organization would be well documented, version controlled, undergo unit testing, integration testing and regression testing. In reality, however, it is far easier to issue an edict to that effect than to actually have it followed.
Scripting frameworks evolved, and they were good. The DevOps movement was born, and Infrastructure as Code became a buzzword that actually meant something practical in the real world. Scripting evolved from code written in a dozen languages, depending on platform and administrator comfort, into something that organizations invested time, money and training in.
Instead of reliance on scattered cron jobs, scripting became managed by centralized servers with specialized agents for all workloads under management. These could eventually control workloads in the public cloud as well as on premises.
With the likes of Puppet, Chef, Ansible and Salt, automation has grown up.
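The conceptual shift these tools embody is from imperative scripts ("run these commands") to declarative, idempotent state management ("make the system look like this"). The following toy sketch illustrates that idea only; the resource names are invented, and real tools like Puppet and Ansible are vastly more sophisticated:

```python
# Toy illustration of the declarative, idempotent model behind tools
# like Puppet, Chef, Ansible and Salt: describe the desired state,
# diff it against the current state, and apply only what is wrong.
# The service names here are purely hypothetical.

desired_state = {
    "ntp":     {"installed": True,  "running": True},
    "telnetd": {"installed": False, "running": False},
}

def plan(current: dict, desired: dict) -> list[str]:
    """Return the actions needed to converge current state to desired state."""
    actions = []
    for name, want in desired.items():
        have = current.get(name, {"installed": False, "running": False})
        if want["installed"] and not have["installed"]:
            actions.append(f"install {name}")
        if not want["installed"] and have["installed"]:
            actions.append(f"remove {name}")
        if want["running"] and not have["running"] and want["installed"]:
            actions.append(f"start {name}")
        if not want["running"] and have["running"]:
            actions.append(f"stop {name}")
    return actions

current_state = {"telnetd": {"installed": True, "running": True}}
print(plan(current_state, desired_state))
```

The crucial property is idempotence: run the plan a second time on a converged system and it produces no actions at all, which is what makes these tools safe to run continuously where a pile of one-shot scripts is not.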
Unfortunately, not all is well in Mudville. For all that the tools to make automation less risky have grown and adapted, vendors haven't. Most vendors post an API of some variety, but in far too many cases it is a mere afterthought, or originally designed for internal use only.
The APIs are often poorly documented. Modules to talk to your favourite automation suite may or may not exist, and their quality is something of a toss-up. Automation may have grown up, but it still hasn't moved out of its parents' house yet.
A look at networking offers a great example of the battles being fought here. Networking competes with storage for the title of most conservative market within IT, and the dominant player, Cisco, has fought automation tooth and nail.
This is largely because automation leads to commoditization of the underlying infrastructure. Cisco has built an empire on being a virtual monopoly and they aren't keen to see the foundations eroded. Fortunately, they don't have much of a choice.
Software Defined Networking (SDN) is the relevant buzzword in the ongoing effort to separate the control plane (the configuration) from the data plane (the functionality). We'll return to this later and discuss why the separation is important enough to merit its own movement.
Much as with networking, the storage market stubbornly resists automation as well. APIs abound, but actually using them is a whole other discussion entirely. Unlike networking, however, storage went through a massive diversification prior to automation really becoming a mainstream concern.
The result is that no one entity dominates the storage market. There are storage solutions that focus almost entirely on automation, and virtually every storage solution (regardless of provenance) has enough of an API that if you really wanted to you could make it dance.
This does not mean storage vendors have embraced Puppet or any other automation framework. Typically, there is a great deal of antipathy towards these frameworks. Some vendors don't want to be commoditized; others simply don't see automation as a "serious" endeavour for the storage market. They want to sell SANs, and they don't much care what you do with them once they have your money.
Whether we are talking networking, storage or some other recalcitrant aspect of the IT ecosystem only one thing really matters to vendors: control.
Cisco doesn't really want SDN to take off. It doesn't want to lose control over the market by letting individuals other than its certified domain experts be able to design and implement complex networks. It doesn't want customers to be able to take their scripts and simply point them at another vendor's switches without having to go through a lot of pain during the migration. In short, Cisco doesn't want to be commoditized.
Storage is already a commodity. Here the game is being played for control of upper layers of the stack. Storage vendors want you to buy into their automation platforms, their integration with virtualization and containerization and their hooks into public, private and hybrid cloud solutions.
Of course, as customers, we don't want vendors to have control. Vendors with vices locked onto our genitals have a nasty history of squeezing until there is no money left to be had. Just look at Oracle licensing.
Automation in these areas thus requires arbiters. Applications that you can code to which in turn can apply your scripts to hardware and software from multiple vendors.
These arbiters have been around for some time, but they are now growing into full-grown frameworks in their own right. Templates, profiles and role-based administration are regularly featured. REST APIs and integration with mainstream automation frameworks like Puppet are par for the course.
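In spirit, an arbiter accepts one vendor-neutral description of what you want and renders it into each backend's native form, so your automation never has to speak a particular vendor's dialect directly. A heavily simplified sketch of that idea, with invented vendor names and config syntaxes:

```python
# Heavily simplified sketch of the "arbiter" pattern: one vendor-neutral
# intent rendered into each managed backend's native configuration.
# The vendor names and config formats below are invented for illustration.

intent = {"vlan_id": 42, "name": "storage-net"}

def render_vendor_a(intent: dict) -> str:
    # Hypothetical CLI-style configuration
    return f"vlan {intent['vlan_id']}\n name {intent['name']}"

def render_vendor_b(intent: dict) -> dict:
    # Hypothetical JSON/REST-style payload
    return {"id": intent["vlan_id"], "label": intent["name"]}

BACKENDS = {"vendor_a": render_vendor_a, "vendor_b": render_vendor_b}

def apply_everywhere(intent: dict) -> dict:
    """Render the same intent for every backend under management."""
    return {vendor: render(intent) for vendor, render in BACKENDS.items()}

print(apply_everywhere(intent))
```

The point of the abstraction is the migration story from the Cisco discussion above: swap out a backend and the intent, and the scripts that produce it, stay exactly the same.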
You cannot today simply install Puppet and automate your whole datacenter. With the help of SDN frameworks, however, you can automate networking. With the help of copy data management applications, you can automate storage. And you can do so in a reliable, sustainable fashion that will remain manageable and maintainable long after the current round of administrators has moved on.
This is how you automate the datacenter. Not with a collection of scripts and cron jobs, but with frameworks and API arbitration. Sidestep the power games of vendors and get on with the job of running your datacenter. Happy coding.
For more information on Copy Data Management, please see the previous blog in this series, Copy Data Management is Much More than Just Making Copies.
Trevor Pott is a guest writer with Catalogic Software. Trevor is a full-time nerd from Edmonton, Alberta, Canada. He splits his time between systems administration, technology writing, and consulting. As a consultant he helps Silicon Valley start-ups better understand systems administrators and how to sell to them. He currently pens a weekly column for The Register, one of the world's largest online science and technology magazines, with a monthly readership of 7.2 million people worldwide. Trevor can be found at http://www.egeek.ca/ for those looking to engage his jedi-like guidance.