The next trend
Blog: Process-Modeling.com - Rick Geneva
In the IT world, trends come and go. Today’s “must have” or “must do” is tomorrow’s dust collector. Recently I had a conversation with a colleague about BPM and whether it will continue to grow, or whether its days are numbered. He said to me, “Are you still doing that process stuff? BPM is old news.” My reply was simple. While trends in automating processes come and go, process management has been around since before the computer. The computer enables people to be more efficient in many ways, but the software you use today is constantly being replaced by the latest, greatest trend. BPM is not software. It’s not something you buy. It’s something you do. There are many systems on the market based on older technologies that fall out of favor as new systems emerge. But to say that BPM is ancient history would be like saying that business itself is ancient history as well.
Application or Process?
A business process exists whether or not you automate it through a BPM system or a workflow tool. Many organizations choose not to use a formal BPM approach to process. Instead, they use a traditional vertical-market application that automates some of the process based on rules and logic provided by the software vendor. Some degree of process management exists with this approach. However, the logic (the business know-how) is essentially outsourced to the software vendor. Often this requires the organization to undergo a massive customization effort to make the vertical-market solution fully effective.
The application acts as a participant in the process. Without a business process there would be no need for the application. So you could say that what you use the application for is the process, and the application is the tool that helps you be more efficient at doing your part of it.
Why BPM is a constant
Everything about business involves a process. Presenting a product to a customer, ordering supplies, and collecting money are all examples of processes. In simpler terms, you could call each of these activities a workflow. The real benefit of BPM comes into play when you start to analyze the complex interaction between many of these individual workflows. Most likely the simple workflow evolves to include computer systems, probably just simple applications at first, gradually becoming more complex. These systems become participants of the process as well. Eventually the computer systems become an integral part of the process, often automating parts of the original process as well as enabling more efficiency as more people and systems are involved.
Let’s not forget that BPM is a management technique more than it is about technology. With BPM we are not talking about managing specifically people, systems, vendors, customers, or money. Instead we are talking about managing people, systems, vendors, customers, and money, as well as the complex interactions between them. The more complexity a process contains, the more your organization gains from proper business process management methodology and technique.
The next trend in IT
Enough about defending the need for BPM. Here’s what I see as the next emerging trend in information technology: cloud computing.
So what does cloud computing have to do with BPM? A lot, actually. Earlier in this post I contrasted applications and process. But we are now at a point in history where that line becomes even more blurred.
At the core of cloud computing is hardware; a lot of it. But instead of 4 or 5 servers running one application, we start to see a trend where 10 servers run 40 or 50 applications. Hardware virtualization means that CPUs, memory, and storage capacity are pooled across multiple systems. The old days of dedicated redundant systems for maximum fault tolerance are coming to an end, because the basic architecture of a cloud system is already three or four times redundant in every way. In fact, I’ve seen demonstrations where the plug is literally pulled out of the wall and the system keeps running. This is because there are dozens of power supplies with dozens of plugs, and often multiple complete systems in different data centers, all acting as one gigantic supercomputer.
The cloud computing design is more reliable, and it’s cheaper to operate. Moore’s law observes that computing power doubles about every two years. The problem in data centers is that in most cases 95% of the CPU capacity is wasted sitting idle, because it’s only utilized when someone is using that specific system. When people happen to be using another system, the idle CPU simply burns up kilowatts of power from the electric company while waiting to process the next request as quickly as possible. But with cloud computing, a cluster of computers operates as one. A peak load on one application can be absorbed across the entire system while less frequently used applications give up capacity they aren’t using.
The idea of cloud computing is undoubtedly inspired by the way the Internet works. At any specific point on the backbone of the Internet, if a failure occurs, there might be a localized disruption. But nobody has ever heard of the entire Internet going offline. It’s designed to be fault tolerant to a point where even nuclear war won’t take it offline.
Now back to my point about where BPM is involved in all of this. Prior to the cloud computing trend, systems were isolated in various localized data centers with virtually no way to communicate with each other apart from the system interfaces (APIs) that were designed to perform a specific function. Cloud computing brings systems previously separated by physical hardware together on one hardware platform. When you add service-oriented architecture (SOA) to this combination, there start to be fewer objections to using web services. Many software engineers and architects believe that a native interface (such as Java to Java or .NET to .NET) is better than web services for performance reasons. But in a cloud environment, when an application needs to talk to another system, the other system is often in the same virtual memory space on the same hardware cluster, so the usual performance objection carries much less weight.
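To make the native-versus-web-service contrast concrete, here is a minimal sketch in Python (the service, data, and function names are my own hypothetical examples, not anything from a real system). The only extra cost of the web-service style is serializing and parsing the payload, which matters far less when both ends live on the same cluster:

```python
import json

def total_native(items):
    # "Native" interface: the caller passes in-memory objects directly.
    return sum(item["qty"] * item["price"] for item in items)

def total_web_service(request_body):
    # Web-service style: the payload arrives serialized (JSON over HTTP
    # in practice), so the service pays a parse/serialize cost per call.
    items = json.loads(request_body)["items"]
    return json.dumps({"total": total_native(items)})

items = [{"qty": 2, "price": 3.50}, {"qty": 1, "price": 10.00}]
print(total_native(items))  # 17.0

response = total_web_service(json.dumps({"items": items}))
print(json.loads(response)["total"])  # 17.0
```

Both styles compute the same answer; the difference is purely in how the call crosses the boundary between systems.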
No longer do we have to worry about databases exceeding 50 terabytes. A cloud system can handle petabytes or even exabytes without a flinch. So the notion of storing data exactly once in a “normalized” database starts to look like more trouble than the data modeling effort is worth. Store it thousands of times in hundreds of formats to serve dozens of applications, because you can transform the data just as fast as you can store it.
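As a toy illustration of that write-time fan-out (a sketch only; the record and the formats here are hypothetical examples of mine, not a recommended schema), the same logical record can simply be transformed into every format its consumers want at the moment it is stored:

```python
import json
import csv
import io

record = {"order_id": 1001, "customer": "Acme", "total": 17.0}

def as_json(rec):
    # Format for applications that speak JSON.
    return json.dumps(rec, sort_keys=True)

def as_csv_row(rec):
    # Flat row format for reporting tools.
    buf = io.StringIO()
    csv.writer(buf).writerow([rec["order_id"], rec["customer"], rec["total"]])
    return buf.getvalue().strip()

# Write-time fan-out: store one record in every format needed,
# instead of normalizing it once and transforming on every read.
stores = {"json": as_json(record), "csv": as_csv_row(record)}
print(stores["csv"])  # 1001,Acme,17.0
```

Storage is treated as cheap and transformation as fast, exactly the trade the paragraph above describes.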
Again, BPM is about management. If I can connect everything, nobody objects to connecting, and I have a virtually unlimited amount of computing power, imagine the complexity we can create! So what are you going to do with it? Burn up megawatts of power instead of the kilowatts of previous generations? Instead, how about getting smart about what you do with your computing power? All of the computing power in the world will not do anything for your organization if you don’t manage the processes that you are attempting to automate.
The trend as I see it will be for applications to be more process aware, and BPM systems to become more like applications. Many early attempts at BPM systems were nothing more than task state management for workflows. System to system integration was added, but this is still very task centric.
There are several categories of applications. Some examples include:
- Tools such as a calculator, disk defragmenter, backup utility, etc.
- Data origination tools such as a word processor, spreadsheet
- Information sharing tools such as email, screen sharing, etc.
- Collaboration tools such as groupware and BPM automation systems
The problem is that we still need one application for one job, and another application to do something else. For example, my word processor program is good for writing documents but doesn’t do so well at adding 2 + 2. My spreadsheet crunches numbers well but doesn’t manage people well (although some people insist that a spreadsheet is actually a database).
History hints at what’s next
If there is anything that history tells us about technology, it’s that consolidation of multiple systems is inevitable. Back in the 1970s, a CB radio or walkie-talkie was all the rage in business communications. Then someone got the idea to combine a radio with a telephone, and the cellular telephone was born. While we’re at it, why not put a camera on it? Personally, I couldn’t figure out this marriage of technologies when it first emerged, but now I find myself sending pictures of my daughter to my friends and family on a regular basis. Oh, and while we’re at it, why not hook the phone to the Internet? For that matter, why not hook your refrigerator and toaster oven to the Internet too? That way you can call up your appliances and tell them to make you breakfast before you get out of bed. Better yet, as I sit here stretched out in business class on my favorite airline, I’m writing a blog post while connected to the Internet, powered by cellular phone technology.
In the above example there are a few major enablers of the merged technologies. First, there is a reliable cellular telephone network available virtually anywhere on earth that people are found en masse. Next there is the Internet: always on, always ready to serve. This is the infrastructure. The telephone and the camera are the tools. The collaboration is when I hit the send button from 30,000 feet (about 9,100 meters) above sea level, telling my wife how wonderful the remote control for the powered lay-flat seats is on this airliner, and she replies, “That’s nice honey, enjoy. I have to put the little one to bed.” I’m constantly in touch with the world, powered by so much complexity that I never have to know about it, or even know that it exists.
The cloud, simply put, creates an enabling infrastructure that I never have to worry about. It’s there, it’s always on, and it would take a full-blown nuclear war or an asteroid hitting the earth to take it offline (in which case we’d all be dead anyway, so why worry about it). Data exists somewhere, but I no longer care where. Someone just added 5 more terabytes of RAM to the cloud and I didn’t even know it (yes, I said terabytes of RAM, not hard drives). My software got updated with the click of a button, and if I don’t like it I can click a button to switch back to the previous version without the tedious uninstall-and-reinstall process.
So I’m on the Internet, on my blog page (a cloud-enabled application), writing about the cloud while I’m looking down at the clouds. Sorry, but I couldn’t resist pointing out the irony.
The green screen effect
It’s been said many times that we are coming full circle back to the days of the green screen. Ironically, “green” means something else today, and it’s helping push us back toward the concept of the terminal attached to the mainframe. For those of you who are too young to remember, the terminal was a green CRT screen. Remember the CRT? You know, that huge, clunky tube screen. Before CRTs were color, they displayed characters in either green or yellow. Green today means using less power and being friendly to the environment. One of the most compelling arguments for moving to cloud computing is that it’s more environmentally friendly as well as easier to manage. It also means distributing computing power everywhere like a grid, and hosting applications online instead of installing them on your local machine.
Recently Google announced that they are releasing an operating system that is not much more than a window to the Internet. Anyone see where this is going yet? Think back to my example of how the cell phone could let me make toast in the morning without getting out of bed: what happens when the Internet meets cloud computing and both applications and processes are so intertwined that you can’t tell where one stops and the other begins? Applications? Process? Don’t know, don’t care. I have work to do, so stop bothering me with such trivial things.
I don’t have to install applications anymore. I simply have to cache the data locally in case, by some odd occurrence, I cannot connect to the Internet. Even the green-screen terminals of the 1960s had this concept: they “buffered” the data in an 8-kilobyte local chip in case the connection to the mainframe was severed momentarily. The numbers certainly got bigger, but the concept is coming full circle. Want a word processor or spreadsheet? Try Google Docs. Want an image editor? Yes, you can do that online too. Why keep it local? Local storage is a single point of failure that packrats like me, who never delete anything, cannot afford to risk. Even this blog post is auto-saved out to the cloud somewhere, and I don’t have to worry about losing anything even if this plane I’m sitting on crashes. Google’s applications automatically store a temporary copy locally until the data is sent to the cloud for permanent storage. I bet some of you didn’t even know it works this way. That’s the point. It’s fault tolerant, it’s green, it’s empowering, and it’s transparent. At least to me, this sounds like a trend that is here to stay.
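That buffer-then-sync pattern can be sketched in a few lines. This is a toy model, not Google’s actual mechanism: the class, the local JSON-lines file, and the in-memory dict standing in for remote storage are all my own invented stand-ins.

```python
import json
import os
import tempfile

class BufferedStore:
    """Local buffer first, cloud second: writes land in a local file,
    then flush to remote storage when a connection is available."""

    def __init__(self, buffer_path):
        self.buffer_path = buffer_path
        self.cloud = {}  # stand-in for remote, permanent storage

    def save(self, key, value):
        # Always cache locally first, so nothing is lost while offline.
        with open(self.buffer_path, "a") as f:
            f.write(json.dumps({"key": key, "value": value}) + "\n")

    def flush(self):
        # When connectivity returns, replay the local buffer to the cloud
        # and clear it; the local copy was only ever temporary.
        if not os.path.exists(self.buffer_path):
            return
        with open(self.buffer_path) as f:
            for line in f:
                entry = json.loads(line)
                self.cloud[entry["key"]] = entry["value"]
        os.remove(self.buffer_path)

path = os.path.join(tempfile.mkdtemp(), "buffer.jsonl")
store = BufferedStore(path)
store.save("draft", "blog post text")  # works even with no connection
store.flush()                          # sync once we're back online
print(store.cloud["draft"])            # blog post text
```

The same shape applies whether the buffer is an 8-kilobyte chip in a 1960s terminal or a browser’s local cache today.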
Person, system, or process? They are all process, out in the cloud. It won’t be long before I’m having a conversation with your virtual presence or avatar while you are out of the office. I’m hoping people will finally realize that sending spreadsheets over email is not process management; it’s completely wasteful. While we’re at it, let’s get rid of email too and think of something more efficient, because I’m tired of sifting through 200+ emails per day trying to follow a conversation. And this way I can ensure I never get another spreadsheet emailed to me. Please link it, don’t send it. That way I can actually find the correct version when I need it instead of sifting through 5 or 6 outdated ones.
Getting to the point
In case you don’t see where I’m going with this post, let me spell it out clearly and concisely. Show people they can get things done without worrying about the technology that powers it, and you will start to see things getting done. Take the complexity away from workers and give them what they need, when they need it. Make it convenient and available wherever, whenever. Don’t bog them down with the complexity of technology. Likewise, don’t bog down your organization’s workforce with the complexity of business processes. They don’t need to know how it works, just that it does. Make it simple, seamless, and bullet-proof reliable. Don’t make them worry about whether the data is accurate and up to date.
So the trend I see for the future of BPM is a new wave of process management that is not bogged down by legacy fears and horror stories of integration challenges. Applications that you install to support the business process will become merely windows to the data produced by process instances. Synchronizing data to the local machine in a “buffer” will become a standard feature of BPM systems, and local storage will likely become the backup copy rather than the permanent store. Permanent storage will be somewhere in the cloud, and it will be nearly impossible to determine its precise physical location. In many ways, the technology world as we’ve known it for the past decade is reversing course and going back to the concepts of the 1960s. But Moore’s law of computing power will continue to accelerate things. And along with this, I believe BPM will become hundreds of times more effective when practiced religiously throughout the organization. Before this can happen, I think BPMN will have to evolve to handle more complex event processing needs, because the complexity will certainly be there, even if most of us no longer have to know it exists.
You might have noticed that I haven’t written any posts in a while. The reason is that I had an epiphany. A light bulb turned on in my head (and yes, it was a compact fluorescent, because I’m on the green trend too). It was such a crazy thought that I started to wonder where my passion for BPM was going. Then, just at my most critical moment, a dear friend asked me if I’m still doing “that process stuff.” I stand by my words. Yes, I do BPM. And I do it because there is more need for it now, in the cloud computing age, than ever before. Those of us who specialize in process management need to realize that we have invented a profession. We are not IT specialists, business analysts, efficiency experts, or project managers. We are a bit of all of these things in one. This is how I got the job title Process Expert, and I’m looking forward to a time in the not-so-distant future when this is a job title I can find on the job search sites on a regular basis.
I see that this post is getting rather long, so it’s time to wrap it up and open it for discussion. I’m looking forward to comments from my readers so that I can write more on this topic. But for now, it’s time for me to do a video conference with my 11-month-old child from 37,000 feet in the air (camera + laptop + internet + airborne internet was a wonderful marriage of gadgets). I wonder what she’ll be doing when she’s my age? One thing is for sure: she’ll be online. Someone will probably have found a way to remove the thunderous background jet noise from the airborne video call. But I doubt that by then the FAA (Federal Aviation Administration, for those of you outside the USA) will allow her to stay online during takeoff and landing. Until then, there is a lesson to be learned: never forget to model the exceptional conditions into your processes, no matter how reliable the underlying technology becomes.