Observe and Report
In response to the Process Mining Manifesto, Neil Ward-Dutton has written an interesting blog post, where he contrasts the now-typical “active” process management systems with a new, “passive” kind of system which can be enabled by process mining:
What's particularly interesting to me, based on my reading of the manifesto at least, is that the authors (or at least some of them) appear to propose that process mining in its broadest context provides the foundation for a different kind of process management system from the kind many people are familiar with today – one that's passive rather than active.
[…] Through ongoing and continuous mining of event logs in the background, not directly connected to the systems that people use to get work done, such a system would work by detecting the shadows that work casts onto existing IT systems; tracking those shadows in the context of models (discovered or purposely created); and then using that analysis to drive a) management insights into opportunities for improvement and b) operational insights into optimal execution of work.
Neil’s post lays out this idea and its implications in more detail, and I would encourage you to read it in its entirety. I have been thinking along similar lines for quite some time, and in that spirit, here are some of my thoughts on this topic.
The perils of an intelligent system
The idea of “passive” systems for process support is intriguing, and has been the subject of a number of research papers even before the Process Mining Manifesto¹. In one way or another, researchers always gravitate towards a visionary take on the topic, sketching a “brave new world” scenario where an all-knowing and intelligent AI learns from process observations in the background, and then automatically applies its findings to current operations.
I think that an “automated learning” approach, i.e. a fully automated “passive” system, will always have to strike a balance between being overly restrictive on the one hand and, on the other, ultimately useless because its recommendations amount to little more than common sense. That is not to say the direction is not worth pursuing, but the balance is hard to strike for the general use case, and it is probably best left to university researchers for some time to come.
The future is already here
I would argue that you can start assembling your very own “passive” system, with tools that are available right now. For process execution, use any system which places no constraints on how processes are executed. To achieve transparency, complement that system with a process mining tool which lets you know how work is executed in detail, on demand.
The actual change needs to happen in the paradigm, i.e. in the way that stakeholders understand process management. Abandon the idea of “controlling” process execution, where constraints and rules are dictated from above to prevent mishaps. Replace it with a “trust and check” model, in which knowledge workers enjoy complete freedom. Through periodic process mining analysis, management can spot quality or efficiency problems reliably and early on, and then take appropriate action to prevent them from recurring. That action can take the form of meetings or briefings to communicate rules and best practices, of explicit rules or constraints implemented in the case management system, or of anything else, really.
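To make the “trust and check” idea concrete, here is a minimal sketch of the kind of periodic analysis such a check could start from: grouping an event log by case, extracting the distinct activity sequences (process variants) and per-case throughput times. The column names (`case_id`, `activity`, `timestamp`) and the tiny in-memory log are assumptions for illustration; a real log would come from a CSV export or a process mining tool.

```python
from collections import Counter, defaultdict
from datetime import datetime

def mine_variants(rows):
    """Group events by case, order them by timestamp, and return
    (variant frequencies, per-case throughput times)."""
    cases = defaultdict(list)
    for row in rows:
        ts = datetime.fromisoformat(row["timestamp"])
        cases[row["case_id"]].append((ts, row["activity"]))

    variants = Counter()
    durations = {}
    for case_id, events in cases.items():
        events.sort()  # chronological order within the case
        variants[tuple(activity for _, activity in events)] += 1
        durations[case_id] = events[-1][0] - events[0][0]
    return variants, durations

# Hypothetical mini-log: two cases, one with a rework loop
log = [
    {"case_id": "1", "activity": "register", "timestamp": "2024-01-01T09:00"},
    {"case_id": "1", "activity": "approve",  "timestamp": "2024-01-01T12:00"},
    {"case_id": "2", "activity": "register", "timestamp": "2024-01-02T09:00"},
    {"case_id": "2", "activity": "rework",   "timestamp": "2024-01-02T10:00"},
    {"case_id": "2", "activity": "approve",  "timestamp": "2024-01-02T15:00"},
]

variants, durations = mine_variants(log)
for variant, count in variants.most_common():
    print(" -> ".join(variant), count)
```

A report like this, produced on a schedule rather than enforced at execution time, is exactly the “check” half of the model: unusual variants or outlier durations surface as candidates for a briefing or a new rule, without constraining how the work was done.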
The current paradigm emphasizes anticipating problems and preventing them proactively. If you trust in the experience and intelligence of your staff, and trust that they have the best interests of your company in mind, you can change that paradigm right now, without waiting for other tools to arrive. The actual shift is not a technical one, but one in the mindset of all actors involved, especially management.