
Auditing case management executions with Kafka and Red Hat Process Automation Manager

Blog: Drools & jBPM Blog

Case management targets problem resolution for dynamic processes, as opposed to the efficiency-oriented approach of BPM for routine, predictable tasks. It handles one-off situations in which the process flow is determined by the incoming request. Red Hat Process Automation Manager provides the ability to define case management applications using the BPMN 2.0 notation. In this article, we will explore how you can capture audit metrics for a running case instance.

Using Case Listeners for fine-grained and async auditing

A case event listener can be used to capture notifications for case-related events and operations invoked on a case instance. These notifications can then be sent downstream to analytical tools. The listener is implemented by overriding any of the methods defined by the CaseEventListener interface.

In our example, we will set up a listener to capture the following events: case started, case data added, case data removed, case closed, comment added, and case reopened.

We will then send them over to a Kafka topic from which this data can be visualized or analyzed.
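The listener publishes through a KafkaProducer<String, String> field named producer, created when the listener is instantiated. A minimal sketch of that setup is shown below; the broker address and serializer settings are assumptions for illustration, and the actual configuration can be found in the repository linked at the end of this section.

    // Minimal, illustrative producer setup. The bootstrap server address is an
    // assumption (a local broker); the real configuration lives in the demo repository.
    private KafkaProducer<String, String> producer = createProducer();

    private KafkaProducer<String, String> createProducer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        return new KafkaProducer<>(props);
    }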

    // Serializes the audit DTO to JSON and publishes it to the "case_events" topic, keyed by case id.
    private void pushToKafka(CaseDefinition caseDefinition) {
        try {
            producer.send(new ProducerRecord<String, String>("case_events",
                    caseDefinition.getCaseId(),
                    new ObjectMapper().writeValueAsString(caseDefinition)));
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
    }

    // Fired after a new case instance has been started.
    @Override
    public void afterCaseStarted(CaseStartEvent event) {
        CaseDefinition caseDefinition = new CaseDefinition(event.getCaseId(), "Case Started", null, null, new Date());
        pushToKafka(caseDefinition);
    }

    // Fired after data has been added to the case file; event.getData() carries the added payload.
    @Override
    public void afterCaseDataAdded(CaseDataEvent event) {
        CaseDefinition caseDefinition = new CaseDefinition(event.getCaseId(), "Case Data Added", event.getData(), null, new Date());
        pushToKafka(caseDefinition);
    }

    // Fired after data has been removed from the case file.
    @Override
    public void afterCaseDataRemoved(CaseDataEvent event) {
        CaseDefinition caseDefinition = new CaseDefinition(event.getCaseId(), "Case Data Removed", event.getData(), null, new Date());
        pushToKafka(caseDefinition);
    }

    // Fired after a case instance has been closed.
    @Override
    public void afterCaseClosed(CaseCloseEvent event) {
        CaseDefinition caseDefinition = new CaseDefinition(event.getCaseId(), "Case Closed", null, null, new Date());
        pushToKafka(caseDefinition);
    }

    // Fired after a comment has been added to the case; the comment text and author are captured.
    @Override
    public void afterCaseCommentAdded(CaseCommentEvent event) {
        CaseComment caseComment = new CaseComment(event.getComment().getComment(), event.getComment().getAuthor());
        CaseDefinition caseDefinition = new CaseDefinition(event.getCaseId(), "Comments Added", null, caseComment, new Date());
        pushToKafka(caseDefinition);
    }

    // Fired after a previously closed case has been reopened.
    @Override
    public void afterCaseReopen(CaseReopenEvent event) {
        CaseDefinition caseDefinition = new CaseDefinition(event.getCaseId(), "Case Reopened", null, null, new Date());
        pushToKafka(caseDefinition);
    }

    // Flushes and closes the Kafka producer when the listener is discarded.
    @Override
    public void finalize() {
        System.out.println("case listener clean up");
        producer.flush();
        producer.close();
    }

Notice how we extract the event data properties we are interested in so that we can push them for analysis. We will then package the listener class as a Maven project so that we can configure it in our case project. A complete example of the listener can be found in this git repository.
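For reference, CaseDefinition and CaseComment above are plain DTOs owned by the listener project (not to be confused with the jBPM CaseDefinition API class). A minimal sketch, with field names inferred from the constructor calls above, could look like this:

import java.util.Date;
import java.util.Map;

// Illustrative audit DTO; field names are inferred from the listener code above.
public class CaseDefinition {

    private String caseId;
    private String eventType;            // e.g. "Case Started", "Case Data Added"
    private Map<String, Object> data;    // case file payload, when present
    private CaseComment comment;         // comment details, when present
    private Date timestamp;

    public CaseDefinition(String caseId, String eventType, Map<String, Object> data,
                          CaseComment comment, Date timestamp) {
        this.caseId = caseId;
        this.eventType = eventType;
        this.data = data;
        this.comment = comment;
        this.timestamp = timestamp;
    }

    // Getters let Jackson serialize the DTO to JSON in pushToKafka().
    public String getCaseId() { return caseId; }
    public String getEventType() { return eventType; }
    public Map<String, Object> getData() { return data; }
    public CaseComment getComment() { return comment; }
    public Date getTimestamp() { return timestamp; }
}

// Illustrative comment DTO capturing the comment text and its author.
class CaseComment {

    private String comment;
    private String author;

    CaseComment(String comment, String author) {
        this.comment = comment;
        this.author = author;
    }

    public String getComment() { return comment; }
    public String getAuthor() { return author; }
}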

Configuring the listener on the case project

Now that we have created the listener, let's configure it in our case project.

First, we should add the listener jar to Business Central so that our case project can use it. You can use the Business Central UI to upload the jar file. The artifact upload option can be accessed through the Business Central menu → Settings → Artifacts. Upload the jar file there.

Now, let's add the dependency for the listener jar to our case project. You can do this in Business Central by accessing Menu → Design → PROJECT_NAME → Settings → Dependencies.

Next, you can configure the listener using the deployment descriptors. These can be accessed in Business Central at Menu → Design → PROJECT_NAME → Settings → Deployments.

Finally, we can build and deploy the changes, and the listener should be able to capture case changes as they occur.

Visualizing the collected data

In order to visualize the data, let us set up a simple UI application. This Quarkus application reads from the Kafka topic where we push our case metrics and displays them on a responsive UI. The application can be started using:

mvn quarkus:dev

The UI application should be available at http://0.0.0.0:8582/ListenerUI.html
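Under the hood, the UI application consumes the case_events topic. The following is a minimal sketch of such a consumer, assuming the Quarkus SmallRye Reactive Messaging Kafka extension; the channel name and class are illustrative, and the actual demo application may be wired differently.

import javax.enterprise.context.ApplicationScoped;

import org.eclipse.microprofile.reactive.messaging.Incoming;

@ApplicationScoped
public class CaseEventConsumer {

    // The "case-events" channel is assumed to be mapped to the case_events topic
    // in application.properties, for example:
    //   mp.messaging.incoming.case-events.connector=smallrye-kafka
    //   mp.messaging.incoming.case-events.topic=case_events
    @Incoming("case-events")
    public void onCaseEvent(String json) {
        // Each JSON audit record would then be pushed on to the browser UI.
        System.out.println("Received case audit event: " + json);
    }
}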

Testing the Case Audit Metrics

Let us create a case request.
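The case can be started from the Business Central UI or through the KIE Server remote API. As an illustration, a case could also be started programmatically with the KIE Server Java client; the server URL, credentials, container id, and case definition id below are placeholders.

import org.kie.server.api.marshalling.MarshallingFormat;
import org.kie.server.client.CaseServicesClient;
import org.kie.server.client.KieServicesClient;
import org.kie.server.client.KieServicesConfiguration;
import org.kie.server.client.KieServicesFactory;

public class StartCaseExample {

    public static void main(String[] args) {
        // Placeholder endpoint and credentials for a local KIE Server instance.
        KieServicesConfiguration config = KieServicesFactory.newRestConfiguration(
                "http://localhost:8080/kie-server/services/rest/server", "user", "password");
        config.setMarshallingFormat(MarshallingFormat.JSON);

        KieServicesClient client = KieServicesFactory.newKieServicesClient(config);
        CaseServicesClient caseClient = client.getServicesClient(CaseServicesClient.class);

        // Placeholder container id and case definition id.
        String caseId = caseClient.startCase("my-case-project", "MyCase.MyCaseDefinition");
        System.out.println("Started case: " + caseId);
    }
}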

We can now see that audit metrics start populating on the UI application we created.

Notice how the case start and case data added events have been captured. For every data element added to the case file, the event carries the payload associated with that data.
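For example, assuming the CaseDefinition DTO sketched earlier, a data-added record pushed to the case_events topic might look roughly like this (the case id and payload values are purely illustrative):

{
  "caseId": "CASE-0000000001",
  "eventType": "Case Data Added",
  "data": {
    "requestType": "New Hardware",
    "requestedBy": "jdoe"
  },
  "comment": null,
  "timestamp": 1621857600000
}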

Other case changes, such as comments being added and cases being closed, are captured in the same way.

Summary

This simple demo project, consisting of the case listener and the UI, can be used with any case project. It shows how to set up a listener for a case and push case events to Kafka for effective monitoring and audit traceability.

References:

Case Listener

The post Auditing case management executions with Kafka and Red Hat Process Automation Manager appeared first on KIE Community.
