Managing Benchmark Definitions

You can define benchmark definitions in the Benchmark Management view of the Control Center perspective. You can design benchmarks for processes and activities and publish them from the Benchmark Management view. Any user can view benchmark definitions, but only Administrator users can save and publish benchmarks.

The benchmark definition contains general properties, categories, and conditions.

Once the benchmarks are defined and published, you need to specify the association in the Configuration panel. For more information, please refer to the chapter Configuring Benchmark Settings.

This chapter covers the following:

Designing Benchmark Definitions

By default, the Benchmark Definitions view opens in the Design mode. Existing benchmark definitions are displayed at the top of the Benchmark Management view when the Design mode is selected. You can add, edit, delete, clone, publish, upload, and download benchmark definitions.

By default, the Default Benchmark definition is available in the Benchmark Definitions table.

Benchmark Definitions
Figure: Benchmark Definitions

Published Benchmark Definition

Once the benchmark is published, it becomes available in the Published mode.

For more information, please refer to the section Publishing a Benchmark Definition.

Adding a new Benchmark Definition

To add a new benchmark definition:

  1. Click the Add Benchmark icon displayed in the toolbar. The Default Benchmark definition is added.

    Add Benchmark Definitions
    Figure: Add Benchmark Definitions

  2. Specify General Properties
  3. Specify Categories for the benchmark
  4. Set Conditions
  5. Save the benchmark definition. The benchmark definition is added.

Adding General Properties

General Properties - Benchmark Definition
Figure: General Properties - Benchmark Definition

Defining Categories

In the Categories tab, you can specify the benchmark categories and the conditions under which they are applied. A tree of deployed models from Model & Go!, Modeling, and Eclipse Modeler is displayed. You can navigate to each model, its processes, and its activities, and specify the benchmark for them. By default, only non-auxiliary processes and interactive activities not marked as Auxiliary are shown. Click the eye icon to toggle the display of auxiliary processes and non-interactive activities.

Categories - Benchmark Definition
Figure: Categories - Benchmark Definition

Selecting an element in the tree populates a benchmark table containing that element's conditions and the benchmark categories defined for the benchmark definition. Conditions cannot be defined on model nodes; therefore, you cannot select a model node while defining a category.

Category Evaluation

Categories are evaluated from right to left. If the rightmost cell resolves to true, that category is applied; otherwise, evaluation continues with the next cell to the left. Each cell must resolve to either true or false.
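The right-to-left rule can be illustrated with a small sketch. The function and data names below are illustrative only, not part of the product; only the evaluation order reflects the rule described above.

```javascript
// Illustrative sketch of right-to-left category evaluation.
// The names evaluateBenchmark, categories, and ctx are hypothetical.
function evaluateBenchmark(categories, ctx) {
  // Start with the rightmost category cell and walk left;
  // the first cell that resolves to true determines the category.
  for (let i = categories.length - 1; i >= 0; i--) {
    if (categories[i].condition(ctx)) {
      return categories[i].name;
    }
  }
  return null; // no category applies
}

const categories = [
  { name: "On Time",   condition: ctx => ctx.ageSeconds <= ctx.targetSeconds },
  { name: "Late",      condition: ctx => ctx.ageSeconds > ctx.targetSeconds },
  { name: "Very Late", condition: ctx => ctx.ageSeconds > 2 * ctx.targetSeconds }
];

// Both "Late" and "Very Late" resolve to true here, but "Very Late"
// wins because it sits further to the right.
console.log(evaluateBenchmark(categories, { ageSeconds: 7300, targetSeconds: 3600 }));
// → Very Late
```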

Adding a Category

When defining benchmark categories, keep the sequence, category name, and color consistent across different benchmarks. To add a category, perform the following steps:

  1. From the Categories tab, select the process or activity of the model for which you want to define a category

    Select Process/Activity
    Figure: Select Process or Activity

  2. Select the Enable Benchmark option to enable benchmark calculation for the selected element. By default, it is cleared, which means that the benchmark is not calculated. A category can be defined only if this option is selected. By default, the Default Category is displayed with a green color code. You can change the default category or add a new category.
  3. Click the down arrow displayed adjacent to the Default Category and click Add

    Add Category
    Figure: Add Category

  4. Click the green color square. The color palette is displayed.

    Specify Category Color
    Figure: Specify Category Color

  5. Select the color for a category

    Note:
    If you use the same color across multiple benchmark definitions, it should carry the same meaning, that is, the same color/label combination. Avoid using the same color with different semantics across different benchmarks, for example, red meaning "Very Late" in one benchmark and "Not Late" in another.

  6. Select a condition from the drop-down list. By default, the Free Form option is selected. The following condition types are available:

Free Form
A JavaScript editor is provided. The free-form JavaScript expression must resolve to true or false. You can use the following data in the JavaScript formula:
  • The predefined PROCESS_PRIORITY primitive data
  • Process data
  • All activity/process controlling parameters defined on the activity or process. For example, the Target Execution Time. Accessor methods for parameters with time semantics return the corresponding time interval in seconds.
  • JavaScript syntax. For example, activityInstance.getActivity().getTargetExecutionTime()
  • The start time of the activity or process. For example, activityInstance.getStartTime()
  • JavaScript method activityInstance.getAge(), returning the result of new Date() - activityInstance.getStartTime() in milliseconds.
  • Criticality value
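For example, a Free Form condition that flags an activity as late when its age exceeds the target execution time could use the accessors listed above (getAge() returns milliseconds, getTargetExecutionTime() returns seconds). The stub activityInstance object below only mimics these accessors so the sketch is runnable; in the Free Form editor, activityInstance is provided by the engine.

```javascript
// Stub that mimics the accessors described above, for illustration only.
const activityInstance = {
  getActivity: () => ({ getTargetExecutionTime: () => 3600 }), // 1 hour, in seconds
  getAge: () => 5400 * 1000                                    // 1.5 hours, in milliseconds
};

// The expression must resolve to true or false. Note the unit
// conversion: getAge() is in milliseconds, the target time in seconds.
const isLate =
  activityInstance.getAge() / 1000 >
  activityInstance.getActivity().getTargetExecutionTime();

console.log(isLate); // → true
```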
Data Expression
  • Time
    • Current Time - The current time (that is, the server time) is evaluated against Process Data (for example, Business Date) and, optionally, an offset.
    • Process Start Time
    • Root Process Start Time
  • Operators - You can select from the following options:
    • Later than - The condition is met if the process/activity execution takes place after the selected Time and the selected Date/Time Process Data.
    • Before - The condition is met if the process/activity execution takes place before the selected Time and the selected Date/Time Process Data.
  • Date/Time Process Data - The drop-down list includes all primitive date/time data, including the default data and Structured Data. The process start times of the root and child process instances are also available.
  • Evaluation Type -
    • Business Days - Calculations do not include days (entire days) blocked in the calendar associated with the process instance when it is started. Calculations begin from the business date. The business calendar is associated if it is selected in the General properties tab. For more information, please refer to the section Business Days Calculations of the chapter Benchmarks from the Concepts handbook.
    • Calendar Days - Calculations are based on linear calendar days beginning from the business date with no regard to any blocked off time in associated calendars.
  • Apply Offset - If selected, it enables fields that support specifying an offset relative to the process data selected above.
    • Days - Calculations are in days from the referenced process (date/time) data. For example, when referencing business date:
      • 0 = Process Instance's Business Date
      • +1 = Process Instance's Business Date plus 1 business or calendar day
      • -1 = Process Instance's Business Date minus 1 business or calendar day
    • At - Time control for setting an explicit time using the current locale
Examples:

  • Process must be completed in 10 business days.


    Figure: Set at process level

  • The execution of activities within a process must be completed by a given time and day since the start of the process (by 2:30 PM on the 5th business day after process start).


    Figure: Set at process level

  • Activity instance must be completed by 4 PM the same day the process instance was started.


    Figure: Set at activity level

Moving a Category

To move a category, click the down arrow displayed adjacent to the category you want to move and click Move Right or Move Left. You can move a category to the right or left of any existing category. For more information, please refer to the section Category Evaluation.

Move Category
Figure: Move Category

Cloning a Category

To clone a category, click the down arrow displayed adjacent to the category you want to clone and click Clone.

Clone Category
Figure: Clone Category

Deleting a Category

To delete a category, click the down arrow displayed adjacent to the category you want to delete and click Delete.

Delete Category
Figure: Delete Category

Saving a Benchmark Definition

To save the benchmark definition:

  1. Select the row of the benchmark definition that you want to save
  2. Click the Save icon in the toolbar

    Save Benchmark Definitions
    Figure: Save Benchmark Definitions

  3. Click Ok in the Success dialog

    Publish Benchmark Definitions
    Figure: Success - Save Benchmark Definition

Publishing a Benchmark Definition

Benchmark definitions can be saved without publishing, but they become available and effective only after they are published.

To publish a benchmark:

  1. Select the row of the benchmark definition that you want to publish
  2. Click the Publish icon in the toolbar

    Publish Benchmark Definitions
    Figure: Publish Benchmark Definitions Icon

  3. Click Publish in the confirmation dialog

    Publish Benchmark Definitions
    Figure: Publish Benchmark Definitions

  4. Click Ok in the Success dialog

    Publish Benchmark Definitions
    Figure: Success - Publish Benchmark Definitions

In the worklist view, the Benchmark column displays the evaluated benchmark. For more information on activity and process benchmarks, please refer to the chapters Working with Activity Tables and Working with Process Tables, respectively.

Republishing a Benchmark Definition

Editing and publishing an existing benchmark definition overwrites it and immediately applies the new benchmarks to in-flight process instances and activity instances that use that benchmark definition.

Cloning a Benchmark Definition

To clone the benchmark definition:

  1. Select the Design mode from the drop-down list
  2. Select the row of the benchmark definition that you want to clone
  3. Click the Clone Benchmark icon in the toolbar

    Clone Benchmark Definitions
    Figure: Clone Benchmark Definitions

  4. Modify the details of the cloned benchmark. For more information, please refer to the section Adding a new Benchmark Definition.
  5. Save and publish the benchmark definition

Deleting a Benchmark Definition

You can delete a benchmark definition at design time or after publishing the benchmark. However, if you delete a benchmark definition from the Design mode that is already published, the published benchmark is not deleted.

You cannot delete benchmarks that are in use by an active process instance. For more information, please refer to the section Deletion of Benchmarks from the Benchmarks chapter of the Concepts handbook.

To delete a benchmark definition at design time:

  1. Select the Design mode from the drop-down list
  2. Select the row of the benchmark definition that you want to delete
  3. Click the Delete Benchmark icon in the toolbar

    Delete Benchmark Definitions
    Figure: Delete Benchmark Definitions

  4. Select Ok in the confirmation dialog. The benchmark definition is deleted.

    Delete Benchmark Definitions
    Figure: Confirm - Delete Benchmark Definition

Deleting a Published Benchmark Definition

To delete a published benchmark:

  1. Select the Published mode from the drop-down list
  2. Select the row of the benchmark definition that you want to delete
  3. Click the Delete Benchmark icon in the toolbar

    Delete Published Benchmark Definitions
    Figure: Delete Published Benchmark Definitions

  4. Select Ok in the confirmation dialog. The published benchmark definition is deleted.

    Delete Benchmark Definitions
    Figure: Confirm - Delete Published Benchmark Definition

Uploading a Benchmark Definition

You can upload a benchmark definition file in JSON format in the Design mode. To upload a benchmark definition file:

  1. Select the Design mode from the drop-down list
  2. Click the Upload Benchmark from File icon in the toolbar

    Upload Benchmark from File
    Figure: Upload Benchmark from File

  3. In the File Upload dialog, click the File icon

    File Upload
    Figure: File Upload

  4. Select the .JSON file to be uploaded
  5. In the File Upload dialog, click the Upload icon

    File Upload
    Figure: Upload

The benchmark definition is uploaded to the Benchmark Definitions panel.

Downloading a Benchmark Definition

You can download the benchmark definition JSON file from the Design mode.

Select the benchmark definition row from the benchmark table and click the Download Benchmark icon.

Download Benchmark
Figure: Download Benchmark

Downloading a Published Benchmark

You can download the published benchmark JSON file from the Published mode.

  1. Select the Published mode from the drop-down list
  2. Select the benchmark definition row from the benchmark table and click the Download Benchmark icon

    Download Published Benchmark
    Figure: Download Published Benchmark

Example

The following example explains how to create and evaluate a benchmark for a specific process.

  1. Create a model named Benchmark Test Model in the Modeling perspective, as shown in the following screenshot

    Benchmark Model
    Figure: Benchmark Model

  2. Deploy the model
  3. From the Control Center perspective, open the Benchmark Management view
  4. Add the benchmark named Test Benchmark in the General properties section

    Test Benchmark
    Figure: Test Benchmark - General Properties

  5. In the Categories tab, select a process
  6. Click the Enable Benchmark check box and define Green, Blue and Red categories as shown in the following screenshot:

    Test Benchmark
    Figure: Test Benchmark - Categories

  7. Similarly, set the Enable Benchmark option for the activities Benchmark - Get Loan Amount and Benchmark - Review Loan Amount of the Benchmark-Loan process.

    Test Benchmark
    Figure: Enable Benchmark - Activities

  8. Save the Test Benchmark
  9. Once saved, publish the benchmark
  10. Go to the Configuration settings of the Infinity Process Platform Portal and select the Benchmark Management option
  11. For the Benchmark Test Model, specify and save the settings as shown in the following screenshot:

    Test Benchmark
    Figure: Benchmark Management - Portal Configuration Settings

  12. From the Administration perspective, open the Daemons view
  13. Start the Benchmark Daemon

    Test Benchmark
    Figure: Benchmark Daemon

  14. Go to Workflow Execution perspective
  15. Activate three instances of the Benchmark-Loan process, specify the values 12000, 7000, and 4000, and Suspend and Save these activities
  16. Open the Worklist of the user and select the Benchmark column from the Select Column dialog. The benchmark flags are displayed according to the specified categories.

    Test Benchmark
    Figure: Worklist - Benchmark Category Flags
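The Green, Blue, and Red categories used in step 6 could, for instance, be driven by Free Form conditions on the entered loan amount. The thresholds and the name loanAmount below are assumptions chosen to match the sample values 12000, 7000, and 4000; substitute your own process data and business rules.

```javascript
// Hypothetical thresholds for the Test Benchmark categories; the
// real conditions depend on your process data and business rules.
function categoryFor(loanAmount) {
  if (loanAmount >= 10000) return "Red";   // highest amounts flagged Red
  if (loanAmount >= 5000)  return "Blue";
  return "Green";
}

console.log(categoryFor(12000)); // → Red
console.log(categoryFor(7000));  // → Blue
console.log(categoryFor(4000));  // → Green
```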