In 1995, the cruise ship Royal Majesty ran aground because of an electrical problem with its GPS system. Although it should have been clear to any experienced crewmember that the ship was veering off course, most simply assumed that the GPS would correct itself, or that someone else would take responsibility for fixing the problem.
Humans have a bias to trust computers over other humans, and that bias grows over time as computers keep proving their accuracy and trustworthiness. When a human operator notices something wrong with an automated system, they will often disregard the evidence in front of them and go with what the computer says.
This is an excellent example of the “Automation Paradox”.
As automation becomes more effective, the role of the human operator becomes more vital. Just as automation can create exponential benefits and efficiencies, it can also scale up the harm caused by human error and poor implementation.
In the early days of computing, mainframes were expensive and difficult to use. Administrators took great care in their maintenance and configuration, and hacking was very unlikely. Provisioning a new machine could take months and required approval from many different departments. And if a mainframe ever crashed, the company could still maintain some level of operations through its paper-based processes.
Today, virtualization makes it easy to launch new servers quickly, often with default security settings. IT departments must contend with virtualization sprawl, shadow IT, and employees working on unauthorized systems. Provisioning has become so easy that administrators struggle to keep new systems from getting added to the network. As a result, tolerance for data loss, security breaches, and unplanned downtime has dropped to virtually zero.
A day in the life of the average IT manager often resembles the broom scene from Disney’s Fantasia.
Thankfully, the tools have also improved. Today's IT administrators have access to backup and disaster recovery systems that are both powerful and easy to use. But the automation paradox applies to these systems, too.
If you can protect all of your virtualized systems from a single application, that's great. But it also means that a single human error can cause far more damage. As your data protection and business continuity tools become more powerful, you have a corresponding duty to be extra cautious in their implementation, management, and monitoring.
This is why we recommend delegating your data protection and business continuity to a dedicated specialist who does this kind of work, and nothing else. When you outsource your backup and disaster recovery to a specialist, you know the work is being done by dedicated experts with the training, experience, and resources to ensure that your systems are always protected.
When disaster strikes, you can take comfort in knowing that these specialists perform real-world recoveries every day. They know how to get the job done right, without fail.
You need the best automation tools, but they have to be managed by the best-trained and most skilled technicians. The more efficient the automation, the more crucial the role of the human operator. If you want total peace of mind, make sure you have the best people implementing, managing, and monitoring your backups.