Welcome to the OPA Hub!


Category Archives: Oracle Policy Modeling

Fun with Aliases and Strings #2


Returning to the “Aliases and Strings” theme of the previous post, where we looked into an example of String concatenation: just a reminder, in the previous article you created the entity model and set up a couple of relationships, before using a rule to decide whether a ticker tape instance is a member of a relationship called the next ticker tapes.

So here is the continuation of the document you saw in the previous steps:

Aliases and Strings #2

The first part should look reasonably familiar, since it builds on the example with the next ticker tapes. But it uses the second relationship, called the closest ticker tape. Note the wording: closest ticker tape, not ticker tapes. We are aiming for the closest one, or if you prefer, the next one in line. So for ticker tape number 3, the closest would be number 4.

Dodgey Ticker

We again use an alias, but things get a bit sticky in the following parts. Where did the further ticker tape come from? Well, perhaps unsurprisingly, it’s another alias. You see, we already used the other ticker tape in the conclusion, so we need to use another word: in this case further was my personal choice, but it could have been any other word that meant something in this context. So by now we have the following, expressed in conversational style:

Compare ticker tape A (with other tapes, let’s say B, C and D). B, C or D will be called the closest ticker tape if the following is true.

  1. B, C or D has an ID that is higher than the ID for A
  2. Using the next ticker tapes as your starting point (so, B, C and D)
  3. Compare them (so B compared to C, B compared to D, etc.) to this rule
  4. Is B’s ID less than or equal to C’s (for example)?

So we end up with the ticker tape that is in the next ticker tapes AND has an ID that is less than or equal to those of the other next ticker tapes: it is the closest one.
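If it helps to see that set logic written out procedurally, here is a minimal JavaScript sketch of the same idea. It is purely illustrative: OPA evaluates this declaratively through the relationships, and the IDs, texts and function names are just assumptions for the sketch.

// Illustrative only: the "next" and "closest" ticker tape logic in plain JavaScript
var tapes = [
	{ id: 1, text: "Text 1" },
	{ id: 2, text: "Text 2" },
	{ id: 3, text: "Text 3" }
];

// "the next ticker tapes" for tape A: every tape with a higher ID than A
function nextTickerTapes(a) {
	return tapes.filter(function (b) { return b.id > a.id; });
}

// "the closest ticker tape" for tape A: the member of the next ticker tapes whose ID
// is less than or equal to every other member's ID (in other words, the lowest higher ID)
function closestTickerTape(a) {
	var next = nextTickerTapes(a);
	return next.filter(function (b) {
		return next.every(function (c) { return b.id <= c.id; });
	})[0]; // undefined when A is the last tape in line
}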

I’m reminded of this excellent conversation from Monty Python since it can get a bit confusing at first:

video

The final rule concerns whichever ticker tape has the longest string. And that string is what you are about to create, for each and every instance of your entity.

We’re coming with you!

You will generate a string of text for each of the entity instances (so, for each of the ticker tape instances). And this string will be the driver of a logical loop.

Firstly, let’s set the scene and remind ourselves of the context:

  1. “Text 1”
  2. “Text 2”
  3. “Text 3”

Each ticker tape has a text message, for example “Text 1”. This message should be concatenated with the other text messages to form a long “final” string, with a comma inserted between them and of course a “.” at the end, just to make a nice tidy “final” string. It might look like “Text 1, Text 2, Text 3.”.

Aliases and Strings #2

So each instance has a text string, and a “final text”. The “final text” will be the ticker tape text string concatenated with the closest ticker tape’s text string, plus a comma if required – for example, if there are no “next ticker tapes” for a given tape, it is because we have reached the end of the instances (number 4, if there is no number 5).
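Continuing the little JavaScript sketch from earlier (again purely illustrative, reusing the closestTickerTape function from that sketch), the “final text” logic works out roughly like this:

// Illustrative only: the recursive "final text" build-up, reusing closestTickerTape from the earlier sketch
function finalText(tape) {
	var closest = closestTickerTape(tape);
	if (!closest) {
		// No next ticker tapes: this is the last instance, so close the string with a full stop
		return tape.text + ".";
	}
	// Otherwise: own text, a comma, then whatever the closest tape builds up in turn
	return tape.text + ", " + finalText(closest);
}

// finalText(tapes[0]) gives "Text 1, Text 2, Text 3."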

The following attributes give us the numbers used in the table above:

Aliases and Strings #2

And the final (final) global attribute:

Final String Result

Aliases and Strings #2

In the next part of this series, there will be a chance to look back on the techniques, observe the warning message and generally investigate your logical loop.

Aliases and Strings part three will be with you in a few days. In the meantime, of course, you can read the online help here.

OPA Word Rules – Level Up!

This content is accessible only to logged in users of the OPA Hub Website.

Stunning Infographics with Oracle Policy Automation and Easelly #1

This content is accessible only to logged in users of the OPA Hub Website.

Importing Data into an Interview : Excel Example


Readers will remember a while ago I explained briefly how to use Microsoft Excel to act as a Connection Datasource – in this overview article, followed by this one in a little more detail. Now we look at another challenge: Importing Data into an Interview.

Well, here comes another example of the ubiquitous nature of Microsoft Excel. The customer requirement was as follows:

Using a simple mechanism, let the user upload an existing Excel spreadsheet into the Interview. Parse the spreadsheet, read the data in it, and create corresponding rows in an Entity. Let the user review the data but do not require any new data entry. There may be up to 250 rows of data to import. So how do you go about Importing Data into an Interview?

So how can we face up to a challenge like that? We need:

  • An upload that isn’t a standard File Upload Group
  • A parsing mechanism to read Excel and extract the data in a given tab, or wherever
  • A custom Entity Collect to handle the data import / create the rows in a Screen

The shopping list above isn’t that long.

The File Upload is essentially an HTML 5 component to let the user select a file on their computer. We cannot access an arbitrary local path from JavaScript, so we need the user to point to the file they want to upload.

There are a number of JavaScript-based Excel parsers, including the excellent SheetJS js-xlsx which we used. It is capable of converting to and from Excel, which is no easy task when you consider that an Excel file is basically a Zip file with a bunch of complicated stuff inside it. The library can convert to HTML, CSV and magically (for our requirement) JSON. Awesome!
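As a rough idea of what that looks like (a minimal sketch, assuming xlsx.full.min.js is already loaded and a workbook has been read with XLSX.read, as in the mount code further down), the conversion utilities are one-liners:

// Minimal SheetJS sketch; the sheet choice and options are just examples
var worksheet = workbook.Sheets[workbook.SheetNames[0]];

var asCsv  = XLSX.utils.sheet_to_csv(worksheet);                             // CSV text
var asHtml = XLSX.utils.sheet_to_html(worksheet);                            // HTML table
var asJson = XLSX.utils.sheet_to_json(worksheet, { raw: true, header: 1 });  // array of rows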

Plus, in a previous post we’ve also looked at the (large amount of) work required to build a Custom Entity Collect Extension. In fact when I was writing that article I was thinking, for goodness sake Richard, when do you think you will actually need to go to the trouble of building that Entity Collect Extension? Well, I’ve finally found a use for it – Importing Data into an Interview!

We need an Entity Collect Extension since we need some way of getting the Excel data into the Entity Collect, which ultimately means we need to do some work behind the scenes between the import of the data and the display of the Entity Collect. We need to rewire the Entity Collect temporarily so that it sucks our Excel data up, before we show it to the user so they can examine the results.

For the purposes of a raw demo, I unplugged all the other functionality (delete buttons, add buttons, etc.) and just concentrated on getting the data into the Entity Collect. There are however a few caveats. Once you get into the larger imports, at least in the Debugger, you can expect to see “concurrent record editing” errors. I’m trying to find out what the limit is exactly, but up to a few hundred rows I think it’s OK.

So let’s look at the items in turn.

File Upload and Data Load

{
	// Build an HTML5 file input inside the custom Input control's element
	var div = document.createElement("input");
	div.id = "myFile";
	div.type = "File";
	el.appendChild(div);

	function handleFile(e) {
		var files = e.target.files,
			f = files[0];
		var reader = new FileReader();
		reader.onload = function (e) {
			// Parse the selected file with SheetJS
			var data = new Uint8Array(e.target.result);
			var workbook = XLSX.read(data, {
				type: 'array'
			});

			// worksheet and jsonoutput are deliberately left global, so the
			// Entity Collect extension (further down) can pick the data up later
			worksheet = workbook.Sheets["YOURWORKSHEET"];
			jsonoutput = XLSX.utils.sheet_to_json(worksheet, {
				raw: true, header: 1
			});
		};

		reader.readAsArrayBuffer(f);

		// Record the chosen file name and path in an input attribute
		var completepath = $(':file').val();
		interview.setInputValue("rest_filenameandpath", completepath);
	}

	// Attach the handler to the file input we just created
	var filedialog = document.getElementById("myFile");
	filedialog.addEventListener('change', handleFile, false);

	var completepath = $(':file').val();
	control.setValue(completepath);
}

Assuming you have a custom Input extension framework as your starting point, the above code goes in the mount. It builds an HTML5 file upload control and attaches an event handler. The Excel-related code depends upon xlsx.full.min.js being in the resources directory. But that’s it: you’ve loaded the Excel file into a JSON object.
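For context, here is a rough skeleton of where that mount sits, assuming the standard custom Input extension registration pattern (the extension file lives in the Interview’s resources folder alongside xlsx.full.min.js). The custom property name used to target the control is purely an example for this sketch:

// Sketch only: registering a custom Input extension whose mount contains the code shown above
OraclePolicyAutomation.AddExtension({
	customInput: function (control, interview) {
		if (control.getProperty("name") === "xlsxUpload") {   // illustrative custom property
			return {
				mount: function (el) {
					// ... the file input / SheetJS code shown above goes here ...
				},
				update: function (el) {
				},
				unmount: function (el) {
					// Remove the injected file input when the Screen is torn down
					var filedialog = document.getElementById("myFile");
					if (filedialog) { filedialog.parentNode.removeChild(filedialog); }
				}
			};
		}
	}
});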

Entity Collect

The next step is to include a Custom Entity Collect in your project, and use the jsonoutput object (which you just created from the imported file above) in the mount of the Entity Collect to load the JSON into the Entity Collect. The following is an extract from the mount code:

var numEntities = Object.size(jsonoutput); // Object.size is not built-in JavaScript: assume a small helper; jsonoutput.length would do the same for an array
// Remove header row if the file has one
jsonoutput.shift();

// Load records into the Entity Collect
if (control.getRows() == 0) {
	for (j = 0; j < numEntities - 1; j++) {
		control.addNewRow();
		var mycurrentrecords = control.getRows();
		mycurrentrecords[j][0].setValue(jsonoutput[j][0]);
		mycurrentrecords[j][1].setValue(jsonoutput[j][1]);
		mycurrentrecords[j][2].setValue(jsonoutput[j][2]);
		mycurrentrecords[j][3].setValue(jsonoutput[j][3]);
	}
	// Redraw the rows once the data has been loaded (drawrows is a helper defined elsewhere in the custom Entity Collect)
	drawrows();
}

The end result is something like this:

Importing Data into an Interview

The File is loaded into the Entity Collect, and the contents displayed to the user. In my case I unhooked all the code related to modification (the onchange stuff from the original idea) and removed the add / delete buttons, since it was designed to just allow the user to see the loaded result, not modify it.

Importing Data into an Interview

If you want to have a look at the project, just download the very basic example here.

Sneak Peek : Web Authoring – Modules


Having recently installed an Oracle Policy Modeling 19A platform, in an On Premise / Private Cloud scenario, I proceeded, as per usual, to dig around and see what was new behind the scenes. One item that caught my attention was a parameter, in the usual table (I’m not going to mention which table, since if you have an On Premise installation you will know which table I mean, and if you are in the Cloud, you won’t have access to it anyway) called feature_web_rules_enabled.  Web Rule Authoring has been on the radar for a while, and mentioned under safe harbor at a few events recently. So what’s behind this idea of Web Authoring?

Simply put, it’s the idea that some users might want to develop rules without Microsoft Word or Excel.

So, I couldn’t help but switch the parameter on. Now, obviously this is a work in progress, and the finished product will look very different, but it is always exciting to see what the Oracle Policy Automation team are working on. That is especially true in respect of rule authoring without Microsoft Word or Excel. Web Authoring “Modules” have the potential to provide a new approach to rule creation, perhaps targeting a more technically savvy audience, whilst Microsoft Word and Excel continue to provide the well-known and well-loved platform for more business-oriented users. Of course, that is just my subjective opinion: I’m sure Davin and the team have their own very exciting vision of the future.

So here are the things I was able to look at, once I had enabled the parameter:

Web Rules Modules

A funky new icon, which is a combination of Repository and Deployment arranged to look like a set of epaulettes that an airline captain might wear. Good Choice!

At its heart, obviously, is a new interface for entering rules, complete with auto-detection of attributes and operators (watch for the formatting and indenting in the video below), plus Rule Assistant-style pop-up suggestions as you type, complete with functions. And secondly, there is the ability to quickly deploy the Module and pick up the WSDL from this “one-shot” screen.

As I said at the start : the end product will I’m sure look completely different, but isn’t it exciting to see the development of a new feature already taking shape? What do you think of the potential of this?

Back to OPA Basics : Oracle Policy Modeling Features


Welcome to another in our periodic back to OPA basics series. At the moment I am watching a lot of new starters join a set of experienced developers, and funnily enough, both groups are sometimes stuck in their routine. The new people fall back on what they know from other rules engines; the more experienced people, who are familiar with Oracle Policy Modeling, repeat what they learned long ago and do not necessarily notice what has been added to the application since they first started using it.

So here are my top five cool time savers and useful things you can find in Oracle Policy Modeling today.

  • The Rule Assistant

I still find it strange that many people don’t use the Assistant in Word. If there was ever a tool to avoid having to remember the arcane spelling and phraseology of an Oracle Policy Automation attribute, this is it!

Back to OPA Basics - Rule Assistant

  • The Convert to Test Case Export

Watching people fill in every cell of their Test Case spreadsheet, when they have probably already saved many of the initial scenarios as debug sessions, I think they should remember that they can move a unit test into a Test Case, and vice versa.

Back to OPA Basics - Export as Test Case

  • The Find Unused Attribute Filter

I tend to use this one when the Rule Assistant has not been used very much, so we are looking for duplicate / misspelled attributes in our Project (see Rule Assistant, above).

Back to OPA Basics - Unused Attributes

  • The Inclusion Report

OK, so I’m probably cheating as far as this one is concerned, but it is a real time saver. In 19A, the introduction of the Inclusion Report has saved me time already. Great for beginning an impact assessment when some sort of surgery is required on Project structure. Find out more here.

Back to OPA Basics - Inclusion Report

  • The Export Entity Data Model Option

This is a tiny little option hidden away in the toolbar but I’m often asked by non-OPA people for the data model and I find this export really simple and quick to use.

Tiny Button - Export Data Model

Well, that’s our top five for now. What other tricks do you use to get the most out of Oracle Policy Modeling?

Let us know in the comments!

 

Business Process : Force PDF Download in OPA

This content is accessible only to logged in users of the OPA Hub Website.

Excel as an OPA Data Connection #2


So, following on from the previous post in this “Excel as an OPA Data Connection” series, we have been investigating using Excel as both the Metadata and Data storage for a prototype Connection. The goal is that if I want to quickly model something and show it to a prospect, I can do everything in Excel and do not need any other software.

The previous post laid out the architecture, and highlighted that we would be putting the Excel on a Web Server and using a Web Service to open and manipulate the Excel file. Once again, this is just a bit of an interesting concept and is not at all for real-life use, especially as Microsoft has explicitly stated that remote automation with Microsoft Office is not a good idea (nor is it supported).

Excel as an OPA Data Connection : Common Points

In our Web Service code, the three main endpoints (GetMetadata, Load and Save) have a great deal in common. They all manipulate either DataTables (Load and Save) or MetaTables (GetMetadata) and in the case of DataTables they have Row(s) and DataField(s) whilst the MetaTables have MetaFields.  Essentially your job is to build these into a hierarchy, and they form the response that is sent back to Oracle Policy Automation.

In the screenshot below, the Excel data range is being parsed into a series of Metafields for the GetMetadata response, and we are setting the different tags of our SOAP response. I took a shortcut in the prototype and no matter which metadata you ask for you get all the fields back in the response.

 Excel as an OPA Data Connection GetMetadata

Excel as an OPA Data Connection : Context

In the case of the Load and Save, they receive either loadrequest.context or saverequest.context which will contain any URL argument items you want to pass into the Service (for example, in a Load, you will pass an ID which will correspond to the row of data you want). You can pass in as many as you want.

In the screenshot below, you can see that both the Load and Save have the context object available. In the code, since it is a prototype, we assume the only context element is the one we are looking for.

 Excel as an OPA Data Connection Context

Excel as an OPA Data Connection : Load and Save

In the case of a Load request, you will send back the Tables and Row(s) and Fields for the Interview. In the case of a Save request, since we are only handling updates in our prototype,  we send back the Row and Fields as the response, marking them as “input” fields. We are not using Load after Submit in our case, so there are no “output” fields to send back. If we were to handle inserts, we would have to send back the Row and Fields in that case as well, plus the ID of the new record for example in a Load after Submit.

In the screenshot below I am testing the Load Response, hard coding some values into the fields. Note the “TEST” value at the top. In the finished version of course, these are replaced with Excel Cell contents.

 Excel as an OPA Data Connection Load Example

Since the entire Web Service is written in Visual Basic, it is a great learning opportunity to be able to dump the XML requests and responses to files, in order to better understand what Oracle Policy Automation is sending or receiving. The screenshot below shows a Save Request being sent to the Web Service.

 Excel as an OPA Data Connection XML Example

Excel as an OPA Data Connection : Attribute Types

One area where I got quite confused is the different ways to indicate what kind of attribute you are working with. In the request for a save, for example, there are field types for each attribute which are actually constants (0,1,2,3,4 and so on, one for each type). When building the save response I needed to then map that number to an AttributeTypeEnum (another constant value) and set the ItemElementName to get the “<date-val>” or “<text-val>” tags that you need. It took a while to work out the logic. Between MetaFields that say they are “STRING” and DataFields that say they are “text-val” it can get a bit boring!

Here’s the sort of thing I mean, checking the fieldtype and mapping to the AttributeTypeEnum:

Excel as an OPA Data Connection : Video

Rather than keep on writing, let’s have a video to put it all together and see the adventure in the flesh, doing what it is supposed to do!  This was great fun, and is definitely a cool way to learn more about the Connector API and what would be needed when building an integration. Of course, these days we have Integration Cloud and so many more managed tools and services but I am of the opinion that it is better to be over-informed than under-informed. Speaking of which, if you need to read the official Connector API Overview, you will find it here.

Of course there is more that could be done. As described above, the first thing will be to implement record creation. Then, perhaps, a child table or two using the same basic principle. Who knows, one day when I have more time I might come back to it.

If anyone wants the Visual Studio Project and code, then just leave a Comment. Have fun!

 

 

 

Dynamic Charts in Container Controls Example Image

Dynamic Charts in Container Controls

This content is accessible only to logged in users of the OPA Hub Website.

Back to Basics : Seeding from a URL Parameter


This topic comes up regularly and it often seems to get mixed up with just plain old “starting the interview from a URL”. So it’s time for a little refresher about Seeding from a URL Parameter. Time for another Back to Basics topic (catch the previous one here)!

First of all : What do we mean by seeding from a URL parameter?

Sometimes you want to start an Interview with certain attributes already populated. But this is not a situation where you are using a Connection, for example, to perform an inbound mapping loaded at start. Perhaps you are embedding Oracle Policy Automation Interviews on a Web site, and the Web site is going to pass some information into the Interview when it starts.

In pseudo-world, you want to do something like this:

https://myinterview.com/startsession/Interview/?myattribute=X

It’s really quite simple, and it is becoming more frequent, since Oracle Policy Automation increasingly appears in modern interfaces driven by JavaScript and frameworks built on it.

But if you try to do something like this with Oracle Policy Automation, you get a nasty surprise. It doesn’t work at all.

Then you read the documentation, and discover two or three important things that must be in place.

  1. You must authorize pre-seeding for the attribute(s) you are interested in

Seeding from a URL Parameter 1

  2. You must use the seedData= tag on your URL to introduce the information to the Interview.

Armed with this you come back to your Interview and you try something like this, with impressive results:

Seeding from a URL Parameter 2

What the heck is happening here? It all seemed so simple. Well, it is, but you forgot the most important point: whatever you are seeding from the URL must be URL encoded. So the “=” character, for example, is not going to get through the defences of Oracle Policy Automation. You have to encode it all first. So let’s imagine you have an attribute whose name is my_seeded_value, and you want to populate that attribute with Richard Napier.

You need to encode part of what you are sending. You do NOT need to encode “seedData=”; in fact, if you do, there will be yet another error. Your URL needs to look like this:

http://xxx.com/web-determinations/startsession/URL_Seeding?seedData=

And the rest of the URL needs to be something like my_seeded_value=Richard Napier. But that part needs to be encoded. As a simple example, take it to https://www.urlencoder.org/ and put it through the encoder. It probably comes out like my_seeded_value%3DRichard%20Napier. Excellent. Now you can create the complete URL:

http://xxx.com/web-determinations/startsession/URL_Seeding?seedData=my_seeded_value%3DRichard%20Napier

And it comes out just fine:

Seeding from a URL Parameter 3

So there has definitely been an important lesson here: don’t encode seedData=, but do encode the information after it.
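If you are building the link programmatically, for example from the JavaScript of the hosting page, encodeURIComponent does the encoding for you. A quick sketch, reusing the example names from above:

// Sketch: encode the seed data, never the seedData= keyword itself
var base = "http://xxx.com/web-determinations/startsession/URL_Seeding";
var seed = "my_seeded_value=Richard Napier";

var url = base + "?seedData=" + encodeURIComponent(seed);
// ...?seedData=my_seeded_value%3DRichard%20Napier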

There is a lot more to learn about URL-based seeding, and we will continue this after the break. If you want the full online help, it is to be found here.

Have a good Holiday and see you in 2019!