Testing rule scenarios in the Ruletest Expected panel

Using Ruletests, you can submit request data as input to Rulesheets or Ruleflows to see how the rules are evaluated and what output they produce. You can make Ruletests even more powerful by specifying the results you expect and then seeing how they reconcile with the actual output. Running the test against a specified Rulesheet or Ruleflow automatically compares the actual Output data to your Expected data, and color codes the differences for easy review and analysis.
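Conceptually, this reconciliation is a per-attribute comparison between the Expected values and the values the rules actually produced. The Java sketch below illustrates that idea only; the class, method, and sample attribute names are hypothetical and are not part of Corticon's API or file formats.

import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Illustrative only: mimics how a Ruletest run reconciles actual Output
 * values against Expected values, attribute by attribute. Names and data
 * are hypothetical, not taken from the Corticon API.
 */
public class ExpectedComparisonSketch {

    /**
     * Compare expected vs. actual values and report each attribute as a
     * match or a difference, analogous to the color coding in the
     * Expected panel.
     */
    static void compare(Map<String, Object> expected, Map<String, Object> actual) {
        for (Map.Entry<String, Object> e : expected.entrySet()) {
            Object actualValue = actual.get(e.getKey());
            boolean matches = e.getValue() == null
                    ? actualValue == null
                    : e.getValue().equals(actualValue);
            System.out.printf("%-30s expected=%-10s actual=%-10s %s%n",
                    e.getKey(), e.getValue(), actualValue,
                    matches ? "MATCH" : "DIFFERENCE");
        }
    }

    public static void main(String[] args) {
        // Expected values entered in the Expected panel (hypothetical data).
        Map<String, Object> expected = new LinkedHashMap<>();
        expected.put("Cargo.weight", 20000);
        expected.put("Cargo.needsRefrigeration", Boolean.TRUE);

        // Actual values produced by running the rules (hypothetical data).
        Map<String, Object> actual = new LinkedHashMap<>();
        actual.put("Cargo.weight", 20000);
        actual.put("Cargo.needsRefrigeration", Boolean.FALSE);

        compare(expected, actual);   // flags needsRefrigeration as a difference
    }
}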
You can establish the expected data in either of two ways:
1. Create expected data from test output:
a. Create or import a request into a Ruletest.
b. Run the test against an appropriate Rulesheet or Ruleflow.
c. Choose the menu command Ruletest > Testsheet > Data > Output > Copy to Expected, or click the corresponding button in the Corticon Studio toolbar.
2. Create expected data directly from the Vocabulary:
a. Drag and drop nodes from the Rule Vocabulary window to create a tree structure in the Expected panel that is identical to the input tree.
b. Enter expected values for the Input attributes as well as for the attributes that the rules will add to the Output panel (see the sketch after this list).
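The following Java sketch illustrates the idea behind the second approach: the Expected tree starts with the same structure and values as the Input tree and then also carries values for the attributes the rules are expected to add. The entity and attribute names are hypothetical sample data, not a Corticon file format.

import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Illustrative only: the Expected data mirrors the Input data and adds
 * expected values for attributes the rules will set. Sample names are
 * hypothetical.
 */
public class ExpectedTreeSketch {
    public static void main(String[] args) {
        // Input tree: one Cargo entity with the attributes supplied as input.
        Map<String, Object> cargoInput = new LinkedHashMap<>();
        cargoInput.put("weight", 20000);
        cargoInput.put("volume", 800);

        // The Expected tree begins as a copy of the Input tree (same structure)...
        Map<String, Object> cargoExpected = new LinkedHashMap<>(cargoInput);

        // ...and then gains expected values for attributes the rules will add.
        cargoExpected.put("container", "standard");

        System.out.println("Input:    " + cargoInput);
        System.out.println("Expected: " + cargoExpected);
    }
}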
Note: See the topics in Techniques that refine rule testing.
* Navigating in Ruletest Expected comparison results
* Reviewing test results when using the Expected panel
* Techniques that refine rule testing