In many OIC instances we end up with a large number of integrations, each with multiple versions, along with integrations that were created only for POCs.
As time passes we keep adding new integrations and new versions of them, so the number keeps growing.
At some point, the sheer number of versions and unused integrations makes the instance difficult to manage.
At the same time we may not want to simply delete them: there may be versions we want to revert to, and the POC integrations may prove useful in the future.
We can deactivate the unused integrations and keep them in the instance, but that still leaves the instance cluttered.
So before cleaning up, we should first take a backup and then remove the integrations that are not currently in use.
For example, suppose we currently have 200+ integrations with different versions. Exporting them manually would take a lot of time and effort, which we could spend on something more productive (or on relaxing).
And as the number of integrations grows, the manual effort only increases.
There are REST APIs available for OIC integrations which we can leverage for this task.
We can get the list here:
https://docs.oracle.com/en/cloud/paas/integration-cloud/rest-api/api-integrations-integrations.html
We will use "Export an Integration" from this list.
There are different ways to take a bulk backup of all integrations.
Some of the common ones are:
1) Write cURL scripts, one per integration, and run them together using the "&" separator.
The downside is that we need to keep updating the script.
2) Maintain packages in OIC and keep related integrations together.
This reduces the effort somewhat, but still needs a lot of manual work, since we might end up creating different packages for integrations with different needs, and we won't want to keep unrelated integrations together.
3) Create a Test Suite in SOAP UI and use Groovy to call the API multiple times to export the integrations. It will be similar to This; there we used a SOAP service, but we can use REST the same way.
We may need to maintain a CSV file with the integration code and version, or call another OIC API to get the list of integrations (it returns details for only 100 integrations per call, so we need to use the "offset" and "limit" query parameters if we have more than 100 integrations).
4) Maintain a Git repository for the OIC integrations and keep adding new changes.
We usually don't maintain Git for OIC integrations, but where it exists, this is the better approach.
5) Create an OIC integration that exports all the integrations to your laptop or desktop, from where you can move them to a shared repository.
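As a rough illustration of option 1, a small script can generate one export call per integration. The endpoint path and the CODE|VERSION identifier format below are assumptions based on the OIC REST API docs linked above, and the host, user, and integration names are placeholders; verify them against your own instance before use.

```python
from urllib.parse import quote

def export_url(base_url: str, code: str, version: str) -> str:
    """Build the 'Export an Integration' URL for one integration.
    Assumed shape: /ic/api/integration/v1/integrations/{CODE|VERSION}/archive,
    with the '|' separator percent-encoded."""
    int_id = quote(f"{code}|{version}", safe="")
    return f"{base_url}/ic/api/integration/v1/integrations/{int_id}/archive"

def curl_command(base_url: str, code: str, version: str, user: str = "OIC_USER") -> str:
    """One cURL line per integration; join the lines with '&' to run in parallel."""
    url = export_url(base_url, code, version)
    return f'curl -u {user} -o {code}_{version}.iar "{url}"'

# Hypothetical integration list; in practice this is what keeps needing updates.
for code, version in [("HELLO_WORLD", "01.00.0000"), ("ORDER_SYNC", "02.10.0000")]:
    print(curl_command("https://myoic.example.com", code, version))
```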
In this article we will look at option 5, i.e. using an OIC integration.
Let us assume we do not have any sFTP server available to archive the integrations, so we will export them to our laptop and keep the backup there.
The high-level steps to achieve this are:
- Download the OIC connectivity agent and install it on your laptop, following the steps in This article. The example there uses a Windows machine; similar steps apply on Mac or Linux.
- Verify from OIC that the agent is up and running.
- Create a connection using the File adapter and the agent configured above.
- Use the OIC REST APIs to get all integrations and to export integrations, documented at the URL below:
https://docs.oracle.com/en/cloud/paas/integration-cloud/rest-api/api-integrations-integrations.html
- The Get All Integrations API provides only 100 integration details at a time, so we need to use the "offset" and "limit" query parameters if we have more than 100 integrations in our OIC instance.
- We can create one OIC integration that generates offset/limit pairs and passes them to a second OIC integration, whose job is to export the integrations to our local machine. This way the child integration runs as multiple instances (one per offset/limit combination), which saves time compared with a single sequential integration.
- Use the File adapter to write the exported files to a local machine directory.
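The parent integration's job of generating offset/limit pairs boils down to simple arithmetic. A minimal sketch, using the 284-integration count from later in this article as an example:

```python
import math

def offset_batches(total_results: int, limit: int = 100):
    """Generate the (offset, limit) pairs the parent integration would pass
    to the child integration, one pair per child instance."""
    return [(i * limit, limit) for i in range(math.ceil(total_results / limit))]

# 284 integrations -> three child runs
print(offset_batches(284))  # [(0, 100), (100, 100), (200, 100)]
```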
The steps below can be followed to achieve the above.
Downloading and installing an agent
Follow the steps in This article.
We can verify the agent name under:
Integrations -> Agents
Verify that the agent status is up and running under:
Monitoring -> Integrations -> Agents
Create a connection using the File adapter and select the agent we configured, as shown in the screenshot below:
Our logic to export the integrations will be: get all integrations -> use the code and version to call the export API for each integration in a loop.
Since the OIC REST API to get all integrations returns only 100 integrations by default, we will use the offset and limit query parameters to fetch the integration details.
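The paging logic can be sketched as follows. Since we cannot call a live instance here, fetch_page is a stand-in that fakes a 284-integration instance; a real implementation would issue the authenticated GET with the offset and limit query parameters instead.

```python
def fetch_page(offset: int, limit: int) -> dict:
    """Stand-in for GET .../integrations?offset=..&limit=..
    Fakes a 284-integration instance so the paging logic is runnable;
    a real call would use urllib or requests with basic auth."""
    TOTAL = 284
    items = [{"code": f"INT_{i}", "version": "01.00.0000"}
             for i in range(offset, min(offset + limit, TOTAL))]
    return {"totalResults": TOTAL, "items": items}

def list_all_integrations(limit: int = 100):
    """Page through the API until all totalResults entries are collected."""
    first = fetch_page(0, limit)
    all_items = list(first["items"])
    total = first["totalResults"]
    offset = limit
    while offset < total:
        all_items.extend(fetch_page(offset, limit)["items"])
        offset += limit
    return all_items

print(len(list_all_integrations()))  # 284
```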
The Get All Integrations response brings a lot of details (connections, links, tracking variables, etc.) which we do not need in order to export the integrations.
So we can customize the response JSON while configuring the REST adapter in the OIC integration.
For now, let us look at a sample JSON customized by removing the fields unnecessary for our need.
We could remove further fields such as id, createdBy, and links.
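For illustration, a trimmed response might look like the JSON below. The field names (totalResults, items, code, version, lockedFlag, etc.) are based on the Retrieve Integrations response and should be verified against your own instance:

```json
{
  "totalResults": 284,
  "items": [
    {
      "code": "HELLO_WORLD",
      "version": "01.00.0000",
      "name": "HELLO WORLD",
      "status": "ACTIVATED",
      "lockedFlag": false
    }
  ]
}
```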
Now let us see how to design our OIC integrations to export any number of integrations available in our OIC instance.
Create a sample REST endpoint trigger connection, or reuse the one that comes by default with OIC and is used in the sample integrations provided by Oracle.
Then create a REST adapter invoke connection to call the OIC REST APIs.
-> We can also create a single REST connection of type Trigger and Invoke.
We will use these two REST connections and the File adapter connection in our integrations.
We will have 2 integrations:
1) The parent integration generates the offset and limit (=100) values and triggers the child integration with them.
2) The child integration is called multiple times (in separate threads) based on the number of integrations (>100); each instance fetches the integrations for its "offset" and "limit" and exports them to the local machine directory configured in the File adapter.
Let us walk through the screenshots step by step.
Get the total number of integrations (here we pass offset=0 and limit=1, because the response always contains the "totalResults" field, even when only one integration's details are returned).
Now let us see the child integration:
Customized JSON in the response payload for Get All Integrations with offset and limit.
Export Integration API call
Mapper for ExportInt:
Write the file to our local machine using the File adapter, without using any schema.
Mapper for the File adapter:
Note:
Locked integrations cannot be exported; the API throws an error, so we have put the export operation inside a scope to skip such failures. Alternatively, we can filter to fetch only the integrations that are not locked.
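The alternative filter can be sketched as below; lockedFlag is the assumed name of the lock indicator in the response, so check it against your instance:

```python
def exportable(integrations: list) -> list:
    """Keep only integrations that are not locked, since exporting a
    locked integration fails. 'lockedFlag' is an assumed field name."""
    return [i for i in integrations if not i.get("lockedFlag", False)]

# Hypothetical sample data mimicking the trimmed API response.
sample = [
    {"code": "ORDER_SYNC", "version": "01.00.0000", "lockedFlag": False},
    {"code": "IN_EDIT", "version": "02.00.0000", "lockedFlag": True},
]
print([i["code"] for i in exportable(sample)])  # ['ORDER_SYNC']
```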
Now let us see how the integrations work.
Activate both integrations and run the parent integration, which generates the offset and limit values and calls the child integration.
We had 284 integrations, so the child integration was called 3 times with a limit of 100 and offsets of 0, 100, and 200.
Now let us see the directory on our local machine to verify the exported integrations.