Wednesday, December 3, 2025

A Simple VBCS App to Query Oracle Fusion Tables Dynamically

We often need to write SQL queries to get data out of Oracle Fusion.

For this we have the BI Publisher data model, but every time we want to change the query we have to edit the data model.

Also, the data model viewer exports data as XML and only up to 200 records. If we want to export more than 200 records, we need to create a report using this data model.

In this post we will see how we can create a simple VBCS app to run queries dynamically.

For this we will need the following:

1). A BI Publisher data model to accept a dynamic query as a base64-encoded string

2). A BIP report

3). An OIC integration to run the report and provide the output to the VBCS app

4). A VBCS app to provide the UI for running queries and seeing results

-> We will import and use the fast-xml-parser library to convert the data from XML to JSON.

Let us see each step in detail.

1). A BI Publisher data model to accept a dynamic query as a base64-encoded string

We need to create a BI Publisher data model with a PL/SQL block, like the one below.

 


 

We need to create two parameters. The first, query1, will accept the base64-encoded SQL query.

Since it is a string, the maximum length is 32767 characters; in this example we have used 32000.

The second parameter will be xdo_cursor.

The data model query block is:

DECLARE
  TYPE refcursor IS REF CURSOR;
  xdo_cursor refcursor;
  var        VARCHAR2(32000);
BEGIN
  -- decode the base64-encoded query1 parameter back to plain SQL text
  var := utl_raw.cast_to_varchar2(utl_encode.base64_decode(utl_raw.cast_to_raw(:query1)));
  -- open the report cursor on the decoded query
  OPEN :xdo_cursor FOR var;
END;
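We can sanity-check the decode expression on its own in any SQL client, using the same standard UTL_RAW/UTL_ENCODE calls the data model uses (the literal below is just an example encoding of a short query):

```sql
-- Decode a base64 string back to plain text, exactly as the data model does
SELECT utl_raw.cast_to_varchar2(
         utl_encode.base64_decode(
           utl_raw.cast_to_raw('c2VsZWN0IGludm9pY2VfaWQsIGludm9pY2VfbnVtIGZyb20gYXBfaW52b2ljZXNfYWxs')))
  FROM dual;
-- returns: select invoice_id, invoice_num from ap_invoices_all
```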

 

Now let's run a query and save the sample data, so that we can use the saved sample data to create the report.

Let's write a simple query (there should not be a semicolon at the end):

select invoice_id, invoice_num from ap_invoices_all

Now we need to convert this to base64, because if we pass the query directly it will give an error.

We can use any text editor or an online tool to convert to base64.


The encoded string for above query is:

c2VsZWN0IGludm9pY2VfaWQsIGludm9pY2VfbnVtIGZyb20gYXBfaW52b2ljZXNfYWxs
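The same encoding can also be produced in code; a minimal sketch (btoa in the browser, or Buffer in Node.js):

```javascript
// Encode the SQL query to base64 before passing it to the query1 parameter.
const query = "select invoice_id, invoice_num from ap_invoices_all";

// In a browser: const encoded = window.btoa(query);
// In Node.js:
const encoded = Buffer.from(query, "utf8").toString("base64");

console.log(encoded);
// c2VsZWN0IGludm9pY2VfaWQsIGludm9pY2VfbnVtIGZyb20gYXBfaW52b2ljZXNfYWxs
```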

Now let's pass this in our data model's query1 parameter and click on View:

 



We will get the output like below:

 

Save this as sample data.

Now let's create a report.

2). A BIP report.

Select the data model for the report:

There is no need to have a layout, as the report will give the data XML as output. We can also create a template and set an output type if required.

Save the report and then view it.

 

 

We can export this report as Data XML.

 

 

3). An OIC integration to run the report and provide the output to the VBCS app

We will create a simple OIC integration which accepts the base64-encoded query from VBCS, runs the BIP report, and returns the base64 report bytes to VBCS.

 


 

 

 

 

Now call the BIP report using a SOAP adapter:

 

 

Set attributeFormat to xml:
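For reference, the call the SOAP adapter makes is the runReport operation of BI Publisher's ExternalReportWSSService, with a payload roughly like the one below (the report path here is an assumption for illustration; use the path of the report created in step 2, and note query1 matches the data model parameter). The base64 report bytes come back in the response's reportBytes element.

```xml
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
               xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
  <soap:Body>
    <pub:runReport>
      <pub:reportRequest>
        <!-- xml returns the raw data XML instead of a formatted layout -->
        <pub:attributeFormat>xml</pub:attributeFormat>
        <!-- hypothetical path; replace with your report's catalog path -->
        <pub:reportAbsolutePath>/Custom/DynamicQueryReport.xdo</pub:reportAbsolutePath>
        <pub:parameterNameValues>
          <pub:item>
            <pub:name>query1</pub:name>
            <pub:values>
              <pub:item>c2VsZWN0IGludm9pY2VfaWQsIGludm9pY2VfbnVtIGZyb20gYXBfaW52b2ljZXNfYWxs</pub:item>
            </pub:values>
          </pub:item>
        </pub:parameterNameValues>
      </pub:reportRequest>
    </pub:runReport>
  </soap:Body>
</soap:Envelope>
```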

 

 

Finally, send the response back to VBCS:

 

 

4). A VBCS app to provide the UI for running queries and seeing results

This is the start and end of the whole process.

In the sample app we have provided a text area to enter the query and then a run icon.

There is a progress bar to display the progress of the query-execution steps, a table to display the results, and an export button to export the results as CSV.

The UI looks like below:


 

Now let's run a query:

 

 

The result will look like below:


We can export this output to CSV.

Another query example, with the previous one commented out:


 


We need to do the following in VBCS:

  • Whatever query we have in the text area, we need to encode it to base64 through JavaScript:

Sample Function:

EncodeToBase64(arg1) {
  // window.btoa handles Latin-1 input, which covers plain-ASCII SQL text
  let encoded = window.btoa(arg1);
  return encoded;
}
 
The argument will be the query text from the text area.
  • Call the REST endpoint (the OIC integration), passing this base64 string
  • The response from the integration will be the base64 report bytes. We need to decode that using JS.
The sample Function: 

DecodeOutput(arg1) {
  // decode the base64 report bytes back into the report XML text
  let decoded = window.atob(arg1);
  return decoded;
}
 
  • This will return the XML; we need to parse this XML and convert it to JSON.
  • We will use the fxp (fast-xml-parser) library.
We can get it from here: https://naturalintelligence.github.io/fast-xml-parser/lib/fxp.min.js
For the import in VBCS we need to do the following:
  • Place the library in our resources/js folder.

Make an entry in index.html as follows: <script src="resources/js/fxp.min.js"></script>

 

The sample JS function to convert the XML to JSON will be:

ConvertXmlToJson(arg1) {
  return new Promise(function (resolve, reject) {
    const parser = new fxp.XMLParser();
    try {
      // parse the report XML string into a plain JS object
      const result = parser.parse(arg1);
      resolve(result);
    } catch (err) {
      console.log(err);
      reject(err);
    }
  });
}
 
Now we have the JSON. We need to display this data in a table.
We need column headers and data.
We will create two variables, for the table columns and the table data:
the table column variable as type Any, and the table data variable as an ADP.
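To make the target shapes concrete, for the sample query the two variables would end up holding data like this (illustrative values, not from a real run):

```javascript
// Shape expected by the oj-table columns attribute:
const columns = [
  { headerText: "INVOICE_ID", field: "INVOICE_ID" },
  { headerText: "INVOICE_NUM", field: "INVOICE_NUM" }
];

// Shape of the row objects that get wrapped in an ArrayDataProvider:
const rows = [
  { INVOICE_ID: 300000001, INVOICE_NUM: "INV-001" },
  { INVOICE_ID: 300000002, INVOICE_NUM: "INV-002" }
];

console.log(columns.length, rows.length);
```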
        
 
Below is the sample function to create the column headers for the table:
 
ParseReportJsonReturnTableHeaders(arg1) {
  var ColumnArrayOfJsonObject = [];
  var valueVar = {};
  const obj = JSON.parse(arg1);
  var MaxHdrCounter = 1;
  for (var key in obj) {
    if (key == "ROWSET") {
      for (var rowkey in obj[key]) {
        // the headers only need to be built once, from the first row
        if (MaxHdrCounter > 1)
          break;
        if (Array.isArray(obj[key][rowkey])) {
          // multiple rows: take the column names from the first row
          for (var actualrowkeyArray in obj[key][rowkey]) {
            if (MaxHdrCounter > 1)
              break;
            for (var innermostkeyArray in obj[key][rowkey][actualrowkeyArray]) {
              valueVar.headerText = innermostkeyArray;
              valueVar.field = innermostkeyArray;
              ColumnArrayOfJsonObject.push({ ...valueVar });
              MaxHdrCounter = MaxHdrCounter + 1;
            }
          }
        } else {
          // single row: the row object itself holds the column names
          for (var actualrowkeyNonArray in obj[key][rowkey]) {
            valueVar.headerText = actualrowkeyNonArray;
            valueVar.field = actualrowkeyNonArray;
            ColumnArrayOfJsonObject.push({ ...valueVar });
            MaxHdrCounter = MaxHdrCounter + 1;
          }
        }
      }
    }
  }
  return ColumnArrayOfJsonObject;
}
 

Similarly, we need to do the same for the data in the table.
We will be using an Array Data Provider.
By default it will not be available to use in the JS functions. We need to import it like below.

At the beginning (the first line of the JS file) we need to update the define block like this:

define(['ojs/ojarraydataprovider'], (ArrayDataProvider) => {
 
Below is the sample function to create the data for the table:
 
ParseReportJsonReturnTableData(arg1) {
  var DataArrayOfJsonObject = [];
  var dataRowArray = {};
  const obj = JSON.parse(arg1);
  for (var key in obj) {
    if (key == "ROWSET") {
      for (var rowkey in obj[key]) {
        if (Array.isArray(obj[key][rowkey])) {
          // multiple rows: build one plain object per row
          for (var actualrowkeyArray in obj[key][rowkey]) {
            // reset per row, so a column missing from this row does not
            // carry over a stale value from the previous row
            dataRowArray = {};
            for (var innermostkeyArray in obj[key][rowkey][actualrowkeyArray]) {
              dataRowArray[innermostkeyArray] = obj[key][rowkey][actualrowkeyArray][innermostkeyArray];
            }
            DataArrayOfJsonObject.push({ ...dataRowArray });
          }
        } else {
          // single row: the row object itself holds the values
          for (var actualrowkeyNonArray in obj[key][rowkey]) {
            dataRowArray[actualrowkeyNonArray] = obj[key][rowkey][actualrowkeyNonArray];
          }
          DataArrayOfJsonObject.push({ ...dataRowArray });
        }
      }
    }
  }
  return new ArrayDataProvider(DataArrayOfJsonObject, {
    keyAttributes: '@index'
  });
}
 
 
In the action chain we will update the table column variable and the table data variable with the responses of the above two functions.

Now in the table we can map the table columns and data separately like below:


 
In code mode it will look like below:

 

Note:- We can combine some of the functions we saw above, to reduce the total number of functions.

This is a simple app which shows the flow of how we can query dynamically from VBCS, present the data in tabular form after parsing, and also provide an option to download the data as CSV (we need to search for and install the Export Data component from the component catalog).
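The Export Data component handles the actual download; to see conceptually what it is doing, a minimal CSV builder over the same row objects could look like this (a sketch, not the component's actual code):

```javascript
// Build a CSV string from an array of row objects; the header line
// comes from the keys of the first row.
function rowsToCsv(rows) {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (v) => {
    const s = String(v === undefined || v === null ? "" : v);
    // quote fields containing commas, quotes, or newlines
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  const lines = rows.map((r) => headers.map((h) => escape(r[h])).join(","));
  return [headers.join(","), ...lines].join("\n");
}

const csv = rowsToCsv([
  { INVOICE_ID: 1001, INVOICE_NUM: "INV-001" },
  { INVOICE_ID: 1002, INVOICE_NUM: "INV,002" }
]);
console.log(csv);
```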

The mapping for export data will look like below:



We can enhance this application to have features like

  • Format query
  • Query history - store the executed queries in ATP with environment, query, user, unique id and status
  • Rerunning past queries - fetch the query from ATP and allow it to be rerun
  • Running queries for different Fusion environments from the same page - create an OIC connection for each environment and have branches in the integration for each; have an LOV on the VBCS page and pass the selection to the integration
  • Having links for Fusion tables etc. - this will help make it easy to find the Fusion tables; we can give the OER link for each module's tables

 
as shown in the screenshots below.

 Before Formatting:


 

 After Formatting:


 

 

Get the past-run queries, which we stored in ATP


 

Select the query we want to rerun and click the highlighted rerun button

 

Run queries in different Fusion environments from a single page, without needing to open the BIP of each environment.



Links to Oracle OER tables for different modules

