Automation Essentials: 5 Things Automation Does to Improve Digital Service Delivery
As digital transformation accelerates, automation is increasingly essential. It only makes sense that as we digitally enable the new economy, we would also seek to automate our digital management.
It’s important, however, to understand what we mean by automation. We’re not simply talking about writing software. Automation means systems that run themselves and heal themselves with minimal human intervention.
The earliest form of automation was probably that old mainframe standby, the batch scheduler. Many people don’t realize that batch processing remains a critical component of digital platforms. There is a clear line of sight from the original batch processors to modern job schedulers such as CA Autosys and the Amazon Simple Workflow Service.
As Amazon says, “In Amazon SWF, a task represents a logical unit of work that is performed by a component of your workflow. Coordinating tasks in a workflow involves managing intertask dependencies, scheduling and concurrency in accordance with the logical flow of the application… Amazon SWF gives you full control over implementing tasks and coordinating them without worrying about underlying complexities such as tracking their progress and maintaining their state.”
Modern digital professionals sometimes ask why job schedulers are still needed. Sometimes, the thought is that digital services should be entirely event-driven and real-time, and schedulers sound too old-school, tied to a nightly or monthly schedule. Here are five reasons why they still matter:
- The business cycle
- Cross-platform execution
- DevOps and provisioning
- Policy-driven configuration management
- Big data and analytics
1: The business cycle
Why isn’t everything event-driven? Well, we still have business cycles. Paychecks come every two weeks. Transactions are settled daily. Capacity metrics are analyzed weekly. Sales performance is reported monthly. Accounting cycles report monthly, quarterly and yearly. For better or worse, we live in a cyclical world and this is not likely to change. These cycles are complex to manage and often involve a variety of processes and tasks. Job schedulers provide the end-to-end flow of business logic required for these cycles. And, they represent these end-to-end flows independently of their implementing platforms.
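The idea of a scheduler representing an end-to-end business cycle can be sketched as a set of tasks with dependencies, run in order. The task names and the tiny runner below are illustrative, not any particular product's API — a minimal sketch of how a monthly close might be expressed to a job scheduler:

```python
# A cyclical job stream expressed as tasks with dependencies,
# executed in dependency order. Illustrative only.
from graphlib import TopologicalSorter

# Each task maps to the tasks it depends on (hypothetical names).
monthly_close = {
    "extract_transactions": [],
    "settle_accounts": ["extract_transactions"],
    "reconcile_ledger": ["settle_accounts"],
    "generate_reports": ["reconcile_ledger"],
    "distribute_reports": ["generate_reports"],
}

def run_cycle(jobs):
    """Execute jobs respecting inter-task dependencies."""
    order = list(TopologicalSorter(jobs).static_order())
    for job in order:
        print(f"running {job}")  # a real scheduler would dispatch to a platform
    return order

order = run_cycle(monthly_close)
```

The point is that the business logic — what must happen, and in what order — lives in the scheduler, independently of the platforms that implement each step.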
2: Cross-platform execution
Workload automation tools can span platforms – for example, processing data on a mainframe and downloading it to a Web portal. They provide an important layer of control and abstraction above particular software technologies. A job scheduler does not care whether a process is written in shell script, SQL, Java, Python or Go. Enterprise-class schedulers can seamlessly execute job streams running across mainframes, Windows environments, Unix and Linux.
Finally, the job scheduler is no longer merely a clock-driven robot. It can respond dynamically to events, kicking off complex job streams in response to messages and transactions.
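This indifference to the implementing language can be sketched simply: to the control layer, every job is just an opaque command with an exit status. The commands below are illustrative stand-ins for real jobs:

```python
# Sketch: the scheduler treats each job as an opaque command, so whether
# it is implemented in shell, SQL, Java, Python or Go is irrelevant to
# the control layer. Commands here are trivial placeholders.
import subprocess
import sys

jobs = {
    "transform": [sys.executable, "-c", "print('transform step')"],  # a Python job
    "download":  ["sh", "-c", "echo 'download step'"],               # a shell job
}

def dispatch(command):
    """Run one job; report success and output back to the control layer."""
    result = subprocess.run(command, capture_output=True, text=True)
    return result.returncode == 0, result.stdout.strip()

results = {name: dispatch(cmd) for name, cmd in jobs.items()}
```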
3: DevOps and provisioning
Systems management in the virtual and cloud world uses choreography extensively. Instead of traditional job schedulers, dedicated systems tools such as BladeLogic and Jenkins may be used to facilitate the building, packaging and installation of software. At lower levels, automation tools can also create virtual machines and interconnect them into complex clusters, auto-registering them with network, backup and monitoring services.
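That auto-registration step can be sketched as a small choreography: creating a node triggers its registration with each dependent service, and the operation is safe to re-run if it fails partway. The registry, node and service names below are illustrative, not any specific tool's API:

```python
# Sketch of provisioning choreography: a new node is auto-registered
# with dependent services, idempotently. Illustrative names only.
def provision(node, services, registry):
    """Create a node and register it with each dependent service."""
    registered = registry.setdefault(node, set())
    for service in services:
        if service not in registered:  # safe to re-run after a partial failure
            registered.add(service)    # real tools call each service's API here
    return registered

registry = {}
provision("app-vm-01", ["network", "backup", "monitoring"], registry)
provision("app-vm-01", ["network", "backup", "monitoring"], registry)  # re-run is a no-op
```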
4: Policy-driven configuration management
Systems tend to “drift”: their configurations change through human error, or even through lower-level faults such as memory corruption. Automation is used to ensure that systems “know” their healthy state and can try to restore it when discrepancies arise. This is called policy-driven configuration management, and it can take place locally within one compute node or more broadly across distributed systems. At the largest scale, such as in the event of a disaster, complex infrastructures defined as code can be readily re-created through automated choreography.
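The core of drift remediation is comparing a node's actual state against its declared, desired state and correcting the difference. The configuration keys below are illustrative — a minimal sketch of the reconcile loop at the heart of policy-driven tools:

```python
# Sketch of policy-driven configuration management: detect drift between
# desired and actual state, then remediate it. Keys/values illustrative.
desired = {"ntp": "enabled", "ssh_root_login": "disabled", "log_level": "info"}
actual  = {"ntp": "enabled", "ssh_root_login": "enabled",  "log_level": "debug"}

def reconcile(desired, actual):
    """Return the settings that drifted, and restore them to policy."""
    drift = {k: v for k, v in desired.items() if actual.get(k) != v}
    actual.update(drift)  # in reality: apply each change via the platform's API
    return drift

drift = reconcile(desired, actual)
```

In real tools this loop runs continuously, so a node converges back to policy without a human noticing the drift.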
5: Big data and analytics
Data is getting bigger and bigger. An event-driven world sounds great in theory, but when operations involve moving, indexing and searching petabytes of data, the basic physics of computing says this will take time. Some of the most valuable information analysis requires days or weeks of processing, streams of jobs and tasks that need careful management in case of failures. Automation provides that critical control layer, so that a two-week process doesn’t have to be started over from scratch in the event of a partial failure halfway through. And, processing can be shifted to times when capacity is available or cheaper.
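The "don't start over from scratch" property comes from checkpointing: completed steps are recorded durably, so a restarted stream skips them. The step names and in-memory checkpoint below are illustrative — real schedulers persist this state:

```python
# Sketch of a restartable job stream: completed steps are checkpointed,
# so after a partial failure the stream resumes where it left off.
def run_stream(steps, checkpoint, fail_at=None):
    """Run steps in order, skipping any already checkpointed."""
    for step in steps:
        if step in checkpoint:
            continue                         # already done in a previous run
        if step == fail_at:
            raise RuntimeError(f"{step} failed")
        checkpoint.add(step)                 # persisted durably in a real system

steps = ["ingest", "index", "aggregate", "publish"]
done = set()

try:
    run_stream(steps, done, fail_at="aggregate")  # first run fails partway
except RuntimeError:
    pass

run_stream(steps, done)  # restart: ingest and index are not re-run
```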
Automation has come a long way from its humble origins in batch scheduling, and it remains a critical component of digital service delivery, unlikely to fade away. As a control layer that separates the “what” from the “how,” it is essential for effective digital service operations. In an era of ever-higher expectations for service availability and large-scale data analytics, its role in the digital ecosystem is assured.