Business Rules Engines & “Prescriptive Analytics”

24 Dec

Someone asked me the other day how best to evaluate business rules engines (BRE) or business rules management systems (BRMS). What follows are my quick notes. A BRE is part of a larger solution. I like what this month’s INFORMS magazine said about “Prescriptive Analytics” (page 14), which I find falls into the same category as “Adaptive Analytics”. They define prescriptive analytics as follows:

“Prescriptive analytics leverages the emergence of big data and computational and scientific advances in the fields of statistics, mathematics, operations research, business rules and machine learning. Prescriptive analytics is essentially this chain of transformations whereby structured and unstructured big data is processed through intermediate representations to create a set of prescriptions (suggested future actions). These actions are essentially changes (over a future time frame) to variables that influence metrics of interest to an enterprise, government or another institution.”

With that in mind to frame a discussion of BRMS…

Why was the BRE created? Different companies have approached the problem differently, which has resulted in differing feature sets. TIBCO built a rules engine around its Enterprise Service Bus (ESB) business; SAS offers a rules engine feature organized around processing rules within its various analytical packages (money laundering, fraud); StreamBase has a rules engine optimized for applying “real time” rules to streaming financial trading data.

Many BRE / BRMS products started as workflow or content management tools and conflate the idea of a rules engine with workflow-related management. This is important to understand, as certain applications of a BRE need to be abstracted away from the workflow management aspect of things.

The above tends to frame why BRE products are built the way they are. However, there is a core set of capabilities that exists within all rules engines:

Rule Management. Does the BRMS provide a rules authoring environment appropriate for the particular installation? One is looking for the rules authoring approach to support a level of abstraction around how business entities are identified and how rules are applied to those entities (a Business Entity Model).

How does the BRMS support the ability to evaluate current rules and re-use rules, or portions of rules? This is related to how the BRMS supports the evolution of rules in order to optimize them.
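The idea of authoring against a business entity model, with reusable portions of rules, can be sketched in plain Python. All class and function names below are hypothetical illustrations, not any vendor’s actual authoring API:

```python
# Sketch: authoring rules against a business entity model rather than
# raw data columns, so rule fragments can be reused and recombined.
# Illustrative only; real BRMS authoring environments differ widely.

class Customer:
    """A business entity the rules are expressed against."""
    def __init__(self, country, monthly_volume):
        self.country = country
        self.monthly_volume = monthly_volume

# Small, named conditions are the reusable "portions of rules".
def is_high_volume(c):
    return c.monthly_volume > 100_000

def is_offshore(c):
    return c.country not in {"US", "CA"}

def enhanced_due_diligence(c):
    # A composite rule built by reusing existing fragments.
    return is_high_volume(c) and is_offshore(c)

c = Customer("KY", 250_000)
needs_edd = enhanced_due_diligence(c)
```

Because the fragments are named and independent, evolving a rule (say, tightening the volume threshold) changes one definition rather than every composite rule that uses it.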

Processing functions: When creating rules, what functions exist within the rules authoring component of the engine that can be called on by the user? This discussion falls into two categories:

  1. Functions that are universal – think of these as functions similar to those available in Excel: sum, count, average, absolute value, etc.
  2. Functions that are external to the rules engine but whose invocation is handled by the tool.
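The distinction between the two categories can be sketched as follows. The names (`average`, `external_fraud_score`, `high_risk_rule`) are hypothetical, standing in for a native aggregate and an externally hosted analytic:

```python
# Minimal sketch of the two function categories a rules authoring
# environment might expose. All names are illustrative, not any
# vendor's actual API.

def average(values):
    """A 'universal' function, akin to Excel's AVERAGE."""
    return sum(values) / len(values)

def external_fraud_score(transaction):
    # An 'external' function: logic that lives outside the rules
    # engine (e.g., an analytical scoring model) but whose invocation
    # is handled by the tool on the rule author's behalf. This body is
    # a stand-in for a call out to that service.
    return 0.9 if transaction["amount"] > 10_000 else 0.1

def high_risk_rule(transactions):
    """A rule mixing both categories: a native aggregate plus an
    external scoring call."""
    avg = average([t["amount"] for t in transactions])
    return [t for t in transactions
            if t["amount"] > 2 * avg and external_fraud_score(t) > 0.5]

txns = [{"id": 1, "amount": 100}, {"id": 2, "amount": 120},
        {"id": 3, "amount": 50_000}]
flagged = high_risk_rule(txns)
```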

The latter is perhaps most important from an analytical perspective, in that few rules engines appear to have the capability to apply a complex rule based on a native set of functions alone. Additionally, the ability to call external functions, or integrate non-BRE logic, affects a BRE’s ability to deal with file-based data and the associated processing that is part of the Hadoop / MapReduce approach to the Big Data challenge.

How rules are triggered. How does the tool support rule management from a processing perspective? In complex event processing there needs to be a way to manage when rules are executed. This becomes important when a collection of rules must be executed only on receipt of all the required data. This particular perspective is framed by how the solution has been architected. In general, there is a desire to reduce input/output activity against the data source(s) when that activity does not produce an actionable result.
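One way to picture this “fire only when all the data has arrived” discipline is a small gate in front of the engine. This is a sketch under assumed names (`TriggerGate`, `on_data`), not any specific BRE’s mechanism:

```python
# Sketch: defer rule execution until every input a rule needs has
# arrived, so the engine is never handed a rule that cannot complete.
# Names here are illustrative, not any specific BRE's API.

class TriggerGate:
    def __init__(self):
        self.required = {}   # rule name -> set of inputs it needs
        self.arrived = {}    # rule name -> inputs seen so far

    def register(self, rule, inputs):
        self.required[rule] = set(inputs)
        self.arrived[rule] = set()

    def on_data(self, source):
        """Record an arriving input; return the rules now ready to fire."""
        ready = []
        for rule, needed in self.required.items():
            if source in needed:
                self.arrived[rule].add(source)
                if self.arrived[rule] == needed:
                    ready.append(rule)
        return ready

gate = TriggerGate()
gate.register("aml_check", ["customer_profile", "transaction_batch"])
r1 = gate.on_data("customer_profile")    # still waiting on one input
r2 = gate.on_data("transaction_batch")   # all inputs present: fire
```

The payoff is exactly the I/O point above: no fetch-and-evaluate cycle is spent on a rule that could not have produced an actionable result.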

Sourcing the input. What functions are native to the BRE that provide access to data? Are there connectors for all the major databases, unstructured sources, web sites (open source), and file systems? This is perhaps less of an issue than it used to be, as the notion of data virtualization has evolved and tools are available to easily format data for delivery to the BRE.

Interfaces. What interface capabilities exist for all of the BRE components? Does the BRE have well-documented APIs that expose the functions required to effectively integrate the tool into the overall workflow? The ability to reach into the system to expose information of interest is critical. This is most important from an operational perspective: in the management of complex event processing rules, the ability to look into the system and create a simplified view of what can be an overwhelming volume of data is essential.

See also:

  • December INFORMS magazine article on the analysis of imagery and “Prescriptive Analytics”. This frames the discussion of the combination of capabilities that needs to exist in a decision support capability for complex and adaptive problems; a rules-based approach is one of those capabilities.
  • Semantic Reasoning
  • Complex Event Processing


2 Responses to “Business Rules Engines & “Prescriptive Analytics””

  1. Ben Wright January 30, 2013 at 7:08 am #

    If the BRE is constrained to trigger a certain rules package based upon availability of 100% of the necessary data, and yet is self-sufficient with regard to specific data storage methodologies, then it seems some sort of triggering mechanism must be included. For example, in most structured database implementations the concept of a trigger could satisfy the requirement to reduce unnecessary rule package execution. However, if the BRE is agnostic about its data partnerships, another triggering methodology must be employed.

    • analyticaltern February 1, 2013 at 5:57 pm #

      You have put your finger on a key challenge. The rules engine is generally the slowest part of the system. There needs to be a mechanism to ensure that rules that cannot be completed are not presented to the engine. This will also cut down on disk I/O, which is another bottleneck area. We had a table that tracked the arrival of data, and triggered based on the availability of all data for a rule – note, NOT the rule package. If you have rules that run in many rule packages, you want to run a rule as soon as its data is available. In this way, the output of the rule can then be considered as a necessary input for other rules.
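      The arrival-table mechanism the reply describes can be sketched roughly as follows. This is a reconstruction under assumed names (`on_arrival`, the `rules` map), not the actual implementation mentioned; note how a rule’s output is fed back in as an arriving input for downstream rules:

```python
# Sketch of a data-arrival table: track which inputs each rule needs,
# fire each rule (not the whole package) as soon as its inputs are
# complete, and feed its output back in as an input to other rules.
# Hypothetical names and rules throughout.

rules = {
    "velocity": {"inputs": {"txn_feed"},              "fn": lambda d: "fast"},
    "escalate": {"inputs": {"velocity", "watchlist"}, "fn": lambda d: "alert"},
}

arrived = {}   # input name -> value: the "arrival table"
fired = []     # order in which rules actually executed

def on_arrival(name, value):
    arrived[name] = value
    # Re-check every rule; run any whose inputs are now all present.
    for rule, spec in rules.items():
        if rule not in fired and spec["inputs"] <= arrived.keys():
            fired.append(rule)
            result = spec["fn"]({k: arrived[k] for k in spec["inputs"]})
            on_arrival(rule, result)  # a rule's output is itself an input

on_arrival("txn_feed", ["t1", "t2"])   # "velocity" fires immediately
on_arrival("watchlist", {"acct-9"})    # now "escalate" can fire too
```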
