Process Improvement – Under the Hood
Blog: KWKeirstead's Blog
Here’s the deal . . .
You are in a corporation that
1) “thinks” BPM,
2) has mapped processes,
3) has a run-time workflow/workload Case platform, whose Cases have stated objectives,
4) has non-subjective means of assessing progress toward meeting Case objectives,
5) has a compiler that is capable of carving up your process maps into run-time BPM templates,
6) has a way to stream Cases onto BPM templates,
7) has good Case Managers.
All good, except that requisite #3 (the run-time workflow/workload Case platform) depends on one essential capability: the ability to auto-build a Case History.
Each user log-in that results in any data change or data augmentation needs to result in auto-recording of the “session” by way of a system-applied date/timestamp, complete with a user “signature” and all data, as it was at the time it was entered, on the form versions that were in service at the time.
Once in, the platform must not allow any changes to the data. Errors, omissions, and late data are handled by posting copies of the Forms and allowing edits to those copies, with new “session” recordings.
Considering that not all data needed at a Case can be recorded precisely at the time it becomes known to a user, all Forms at process steps (structured or ad hoc) must accommodate a reference date-of-capture, which can precede the Case History session date by hours, days, even weeks.
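The mechanics above (system-applied timestamps, user signatures, form versions, append-only storage, corrections via posted copies, and a date-of-capture that may precede the session date) can be sketched in a few lines of Python. This is a minimal illustration, not any particular platform's implementation; the `Session` and `CaseHistory` names are my own for the sketch.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Session:
    """One immutable Case History entry: who, what, on which form version."""
    user_signature: str
    form_id: str
    form_version: str
    data: dict                  # field values exactly as entered
    date_of_capture: datetime   # when the data became known (may precede session_date)
    session_date: datetime      # system-applied timestamp, set at commit time

class CaseHistory:
    """Append-only log: sessions can be added, never edited or deleted."""
    def __init__(self):
        self._sessions = []

    def record(self, user, form_id, form_version, data, date_of_capture=None):
        now = datetime.now()
        s = Session(user, form_id, form_version, dict(data),
                    date_of_capture or now, now)
        self._sessions.append(s)
        return s

    def correct(self, original, user, new_data):
        """Errors/omissions are handled by posting an edited copy of the form
        as a NEW session; the original entry is never altered."""
        return self.record(user, original.form_id, original.form_version,
                           new_data, date_of_capture=original.date_of_capture)

    def entries(self):
        return tuple(self._sessions)   # read-only view for auditors
```

Note that `correct()` carries forward the original date-of-capture, so an auditor can still see both what was first entered and when the underlying event actually occurred.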
The cardinal rule in some industry sectors is that data not in the system “does not exist” – the interpretation of this rule is that protocol requires users to visit the Case History prior to taking any current decisions/actions. If the data is not in the Hx, there is a good chance the decision will be made only on the basis of what is in the Case History. Who knew what, when, is all-important in many Case audits.
So, how now do you go about improving decision-making at Cases and improving processes dynamically?
First comes data analytics.
Unless you are trying to post big-screen notices to individual shoppers at malls based on Internet searches they did last night, data analytics for improved dynamic decision-making is not complicated.
A small change at branching decision boxes allows analytics to provide a hint to the user as to which branching options have been the most popular (e.g. they went that way 60% of the time).
Clearly, your data sample size must be sufficient, and you may need or want to filter your statistics by timeframe, especially for initiatives that anticipate different seasonal outcomes or where legislation may have changed recently.
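The branching hint, together with the sample-size and timeframe filters just mentioned, amounts to a short computation. A possible sketch, assuming a traversal log of `(decision_box, branch, timestamp)` records (the function name and log shape are my own, purely illustrative):

```python
from collections import Counter
from datetime import datetime

def branch_hints(log, decision_box, since=None, min_sample=30):
    """Return each branch's share of traffic at a decision box -- the 'hint'
    shown to the user (e.g. 'they went that way 60% of the time').

    `since` filters to a timeframe (seasonal effects, legislation changes);
    returns None when the sample is too small to be meaningful."""
    taken = [branch for box, branch, ts in log
             if box == decision_box and (since is None or ts >= since)]
    if len(taken) < min_sample:
        return None                      # insufficient sample size
    total = len(taken)
    return {branch: n / total for branch, n in Counter(taken).most_common()}
```

At run time the platform would call this when rendering the decision box and display the percentages alongside the branch options, leaving the actual choice to the user.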
As for dynamic process improvement, the best approach I have been able to think of is to post the process map and then overlay it to show skips, jumps, loopbacks and ad hoc insertions, with stats where possible. Ad hoc interventions should be noted as well, particularly in terms of their timing (e.g. each time step #1024 is skipped, a specific ad hoc intervention is inserted, possibly giving a good indication that the map needs to be updated to show the ad hoc intervention in lieu of the skipped step).
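The tallies behind such an overlay can be computed by comparing each Case's executed trace against the mapped step sequence. A rough sketch under simplifying assumptions (steps as plain strings, one linear mapped sequence; `overlay_stats` is a hypothetical name):

```python
from collections import Counter

def overlay_stats(mapped_steps, case_traces):
    """Tally, across Case traces: skipped mapped steps, ad hoc insertions
    (steps not on the map), and loopbacks (a mapped step revisited after
    other steps have run). These counts feed the process-map overlay."""
    mapped = set(mapped_steps)
    skips, ad_hoc, loopbacks = Counter(), Counter(), Counter()
    for trace in case_traces:
        executed = set(trace)
        for step in mapped_steps:
            if step not in executed:
                skips[step] += 1          # mapped step never performed
        seen = []
        for step in trace:
            if step not in mapped:
                ad_hoc[step] += 1         # insertion not on the map
            elif step in seen and seen[-1] != step:
                loopbacks[step] += 1      # revisited after moving on
            seen.append(step)
    return {"skips": skips, "ad_hoc": ad_hoc, "loopbacks": loopbacks}
```

When a particular skip and a particular ad hoc insertion keep appearing together across Cases, that co-occurrence is exactly the signal described above that the map should be updated.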
I am not prepared to say that any of the above, aside from the mall initiative, is AI, but for me it provides a pretty good way of improving processes.