Making the most of automation

Problem solvers are constantly faced with decisions about technology. These questions are not simply about which technologies to use, but also about how to manage a technology once it’s integrated into the business or system. Today there is much talk about Artificial Intelligence (AI), but the questions surrounding AI focus mainly on what it can do, which applications it is right for, and the impact it might have on jobs. This is a logical starting point, but there’s much more that ought to be considered.

Technology is Automation

At its core, AI, like many other modern technologies (Robotic Process Automation, driverless vehicles, factory robots, etc.), is a tool of automation. These technologies speed up a process or system, take the human out of the loop, and are supposed to perform with greater speed, quality, and safety, and to make better decisions than humans would. In short, the value offered by these modern, revolutionary technologies is the same value that the steam engine offered in the 18th century. Yes, today’s technologies are more sophisticated. They make better decisions and have broader applications, but the difference is one of degree, not of kind. We gain value through their ability to automate.

For us, this is a very good thing. For several generations we’ve been working with automation, and many people have done a good job of explaining how best to use automation to achieve business objectives while also reducing risks to safety and quality. The luminary electrical engineer and human factors pioneer Jens Rasmussen had one of the most insightful ways of thinking about the potential of automation as well as its risks. He called this the “Automation Conundrum.”

Jens Rasmussen’s drawing of the “Automation Conundrum”

The Automation Conundrum

At its core, the Automation Conundrum encapsulates an interesting paradox: an intricate dance between the capabilities and limitations of automation. Rasmussen’s insight concerns the optimal design of the system. When automation works, it really works. It outperforms humans on all the critical dimensions, such as safety, quality, and productivity. But when it doesn’t work, it actually underperforms what a normal person can do. Moreover, the window for good automation performance is very narrow. By comparison, humans are more adaptable to different work conditions and can maintain a relatively consistent level of performance in many different environments.
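The trade-off can be sketched with a toy model. The numbers below are illustrative assumptions, not measurements: automation holds a high, flat level of performance inside a narrow operating window and drops off sharply outside it, while human performance starts lower but degrades gradually as conditions grow less predictable.

```python
def automation_performance(variability: float) -> float:
    """Toy model: near-perfect inside a narrow operating window
    (variability <= 0.2), sharply degraded outside it."""
    return 0.95 if variability <= 0.2 else 0.30

def human_performance(variability: float) -> float:
    """Toy model: humans start lower but degrade gradually and level off."""
    return max(0.45, 0.80 - 0.35 * variability)

for v in (0.1, 0.3, 0.9):
    winner = ("automation" if automation_performance(v) > human_performance(v)
              else "human")
    print(f"variability={v}: {winner} performs better")
```

Inside the stable window, automation wins decisively; as soon as conditions drift outside it, the adaptable human comes out ahead, which is exactly the conundrum.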

The lesson from Rasmussen’s Automation Conundrum is to rely on automation only when you can ensure that it will stay within the range of optimal performance. This calls back to the earlier point: The selection of a technology should not simply be based on what it can do. It should also be evaluated based on its maintainability, adaptability, integration into the system, and resiliency to changing circumstances and environments. This may require deep technical knowledge as well as organizational knowledge, so it’s imperative that technological decisions are made with the most relevant subject matter experts within the organization and industry.

Principles for Using Technology

To navigate this complex terrain, and make technology decisions simpler, I propose a set of guiding principles:

Principle #1: Avoid Overreliance on Automation

Automation should be viewed as a tool to augment human capabilities. It should not be a substitute for human judgment, and it should not take over a decision-making role or remove people from control.

Principle #2: Acknowledge the Paradox of Automation

The paradox of automation highlights the inherent trade-offs associated with technological advancement. As automation systems become increasingly sophisticated, there is a tendency for human operators to become complacent and overly reliant on technology. This phenomenon underscores the importance of maintaining a vigilant and proactive approach to risk management, recognizing that as the system improves, our ability to prevent mistakes may diminish.

Principle #3: Embrace Decentralized Control

Centralized control structures may offer the illusion of efficiency, but they can also introduce rigidity and vulnerability in the face of uncertainty. Decentralized control architectures, on the other hand, promote adaptability and resilience, enabling rapid responses to changing circumstances. By distributing decision-making authority across multiple nodes, we can enhance the robustness and agility of the system, mitigating the impact of unforeseen events.

Principle #4: Prevent Irreversible Changes

Automation systems should be designed with built-in safeguards to prevent irreversible changes or catastrophic failures. By incorporating mechanisms for error detection and correction, we can minimize the impact of unforeseen errors and maintain system integrity. Additionally, the implementation of fail-safe mechanisms and redundant systems can further enhance reliability and mitigate the consequences of potential failures.
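One minimal way to keep automated changes reversible, sketched here in Python, is to apply every change to a checkpointed copy of the system state and commit it only if it passes validation. The function names, the settings, and the validation rule are hypothetical, chosen only to illustrate the pattern:

```python
import copy

def apply_with_rollback(state, change, validate):
    """Apply `change` to a copy of `state`; keep the result only if
    `validate` approves it, otherwise return the untouched original."""
    candidate = change(copy.deepcopy(state))
    if validate(candidate):
        return candidate   # commit the validated change
    return state           # roll back: the original state is untouched

# Hypothetical example: an automated controller proposes a new setpoint,
# but a guard rejects anything outside the safe operating range.
settings = {"line_speed": 50}
proposed = apply_with_rollback(
    settings,
    lambda s: {**s, "line_speed": 500},    # runaway automated change
    lambda s: 0 < s["line_speed"] <= 100,  # fail-safe validation rule
)
print(proposed)  # the unsafe change is rejected; the original survives
```

Because the change is made on a copy and validated before commit, an erroneous automated action leaves no irreversible mark on the real state.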

Principle #5: Use Automation in Stable Environments

As the Automation Conundrum demonstrates, automation is great under certain conditions, but it performs poorly when those conditions fluctuate or change. Therefore, it’s important to create a stable environment for the technology to operate in. Otherwise, it’s best to rely on the adaptability of humans. Just one more reason standardization is important!

Practical Steps to Optimizing Technology

With these principles in mind, we can now turn to two practical steps during the technology integration process.

Step #1: Conduct a Comprehensive Risk Analysis

Utilize robust risk analysis methodologies, such as What-if analysis, Failure Mode and Effects Analysis (FMEA), and fault trees, to identify potential hazards and vulnerabilities. By thinking critically about potential risks, we can proactively identify areas of concern and implement targeted measures to enhance system safety and reliability.
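In a classic FMEA, each failure mode is scored for severity, occurrence, and detectability (each typically on a 1–10 scale), and the product of the three, the Risk Priority Number (RPN), is used to rank where to act first. A minimal sketch of that scoring step, with made-up failure modes and scores for illustration:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (easily detected) .. 10 (nearly undetectable)

    @property
    def rpn(self) -> int:
        """Risk Priority Number: severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection

# Hypothetical scores for two failure modes of an automated line.
modes = [
    FailureMode("sensor drift goes unnoticed", severity=7, occurrence=5, detection=6),
    FailureMode("sudden power loss", severity=9, occurrence=2, detection=2),
]

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:3d}: {m.name}")
```

Note how the ranking captures the Automation Conundrum’s lesson: the quiet, hard-to-detect failure outranks the dramatic but obvious one.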

Step #2: Use Simulation Training

Implement rigorous simulation training programs, modeled after the training protocols used in high-risk industries such as aviation. These simulations can replicate a wide range of scenarios, including rare but critical events that may have significant implications for system performance. By exposing operators to realistic scenarios in a controlled environment, we can cultivate situational awareness, decision-making skills, and effective response strategies.

Conclusion

For problem solvers, the effective use and application of technology is paramount to the success and growth of a business. Modern tools offer many possibilities for streamlining business operations, increasing quality, improving customer satisfaction, and making workplaces safer. But while these technologies are effective and powerful, we must also be aware of their limitations and of how best to use them. The Automation Conundrum offers a clear way to think about technology and performance.


Problem solvers seeking to use or integrate new technologies should understand these limitations and design principles. These guiding principles will not only reduce risk and create greater resilience in organizations; they will also help ensure optimal performance, whatever the environment and regardless of whether a process is automated or not.

Michael Parent

Michael Parent is CEO of the Problem Solving Academy and author of “The Lean Innovation Cycle,” a book that explores the intersection of Problem Solving, Lean, and Human Centered Design. Throughout his career, Michael has coached executives through strategic problem solving, strategy, and operations management, and has led numerous projects in a variety of industries.
