Posts Tagged 'OSB'

Branching in Native REST Services

In a previous post, we introduced native REST service (Typed and Un-typed) support in 12.2.1. However, we can observe the following gaps there:

  • We used only the GET method for demonstration; typically this would not be the case, as a REST service can also support other HTTP methods (POST, PUT and DELETE).
  • No branching in Typed REST Services when multiple HTTP methods are supported.
  • No branching in Un-Typed REST Services when multiple HTTP methods are supported.

In this post, we will try to cover the above aspects. Note that all of this discussion relates to native REST services unless stated otherwise.

Branching in Typed REST Services:

Add POST method support for typedEmployees resource as shown below.

typedbranch

typedbranch1

typedbranch2

typedbranch3

Since a Typed REST Service uses a WADL in which each operation name is annotated with soa:name, we can simply make use of an Operational Branch.

opbranch 

You can use a URL like the one below to access the REST service.

http://localhost:7003/restDemo/typedService/typedEmployees

Branching in Un-Typed REST Services:

Since Un-Typed REST services do not use a WADL, we can't use an Operational Branch as above. So in this release, OSB introduced a new node called REST Branch for this purpose.

restbrnch

Add REST Branch in pipeline by dragging it from Components.

rest1

For each REST branch, you must specify the supported Media Types, the Resource Path and the HTTP method.

branch

Use the + icon to add media types and provide the other information as shown below. Here we are creating a REST resource called untypedEmployees that supports GET with the media types application/xml and application/json.

rest2

Modify REST branch name in General section of Properties. We can add more branches using highlighted icon below.

rest3

We can add POST method support for same resource path as shown below.

post

post1

post3

Test the Proxy as shown below. Note that we specified the required parameters in the HTTP headers.

test

test2

You can use a URL like the one below to access this REST service; make sure that the Content-Type header is always passed (see the sketch after the URL).

http://localhost:7003/restDemo/untypedService/untypedEmployees
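For example, a client call to the POST method could look like the sketch below (shown with JavaScript fetch purely for illustration; the JSON payload matches the one used in the log excerpts under Observations):

// Illustrative client call; any HTTP client works as long as Content-Type is set,
// since OSB uses that header to decide how to parse the payload into $body.
fetch("http://localhost:7003/restDemo/untypedService/untypedEmployees", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ a: 1234, b: 3455 })
}).then(res => console.log("HTTP status:", res.status));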

Observations:

  • OSB parses the payload based on the Content-Type HTTP header in the request. We can use a Log activity to see the $body contents. Refer to this post to enable logging.
  • When Content-Type is application/xml, $body is logged as below.

[PostPipelinePair, request-ab047b9.N47e0f03.0.15591b0ca2c.N7d1d, Stage1, REQUEST] CreateEmployeeLog: <soapenv:Body xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">[[<a><b>1233333</b></a></soapenv:Body>]]

  • When Content-Type is application/json, $body is logged as below.

[PostPipelinePair, request-ab047b9.N47e0f03.0.15591b0ca2c.N7d1d, Stage1, REQUEST] CreateEmployeeLog: <soapenv:Body xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">{"a":1234,"b":3455}</soapenv:Body>

  • OSB binds a globally-scoped object called process, which can be used as process.body or process.var, similar to the $body and $xyz XPath variables. This notation is used in JavaScript expressions. Use a Log activity with a JavaScript expression as shown below to verify this (a minimal sketch of such an expression follows the screenshots).

logjs

logjs1

logjs2
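A minimal sketch of the kind of expression used in that Log action, assuming JavaScript is selected as the expression language:

// Log action expression (JavaScript), a sketch: it simply returns the payload bound by OSB.
// process.body is the inbound payload; a pipeline variable named, say, empId
// (hypothetical) would be read the same way as process.empId.
process.body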

  • When Content-Type is application/xml, process.body is logged as below

[PostPipelinePair, request-ab047b9.N47e0f03.0.15591b0ca2c.N7d1d, Stage1, REQUEST] CreateEmployeeLog: <a>[[<b>1233333</b></a>]]

  • When Content-Type is application/json, process.body is logged as below

[PostPipelinePair, request-ab047b9.N47e0f03.0.15591b0ca2c.N7d1d, Stage1, REQUEST] CreateEmployeeLog: {"a":1234,"b":3455}

  • Though a REST service can accept both JSON and XML payloads, no automatic conversion between them takes place at runtime; in native REST services it has to be done programmatically.
  • When using End-to-End XML, use XQuery/XSLT for the transformation.
  • When using End-to-End JSON, use JavaScript for the transformation (a minimal sketch follows this list).
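A minimal sketch of such a JavaScript transformation, assuming the mapped result can be assigned back to process.body to hand it on to the rest of the pipeline (the inbound shape matches the payload from the log excerpts above; the employee wrapper and field names are made up for illustration):

// JavaScript pipeline action, a sketch: map the inbound JSON request to a target JSON structure.
var req = process.body;   // e.g. {"a":1234,"b":3455}
process.body = {          // hypothetical target structure
  employee: {
    id: req.a,
    code: req.b
  }
};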

References:

https://docs.oracle.com/middleware/1221/osb/develop/GUID-FE2CAC5B-E4DF-49DE-AD3C-36EEAF750BFE.htm#OSBDV-GUID-BAE387C8-F1BE-49CF-8789-EFE220D216DB

Enabling Logging in Service Bus

To enable pipeline logging in Service Bus, the steps remain the same as before, but the location where we perform this activity has changed. The steps, illustrated by the screenshots in this post, are:

  • Enable logging in Global Settings
  • Enable logging at Pipeline level

Global Settings

Login to EM Console and navigate to SOA –> service-bus (Admin Server) as shown below.

gbtree

Click on Global Settings tab and set Logging Enabled property. We can also enable Monitoring, Alerts, Reporting and Result Cache as shown below.

osbglobal

Pipeline Settings

In EM Console, navigate to SOA –> service-bus –> <<Service Bus Project>>.

proxylog

Go to Operations tab and query for Pipelines. Here we can see all monitoring related properties for Pipelines.

proxylog1

Click the pipeline and go to the Properties tab to enable Logging as shown below. We can set other Monitoring and Tracing related properties as well. We can also set the log level at which the messages are written to the log files.

pplog

This logging information is shown in <<osbservername>>-diagnostic.log.

Another related blog entry: https://thecattlecrew.net/2015/12/23/oracle-soa-12c-quicktip-enable-servicebus-message-tracing-in-defaultdomain/

Service Bus 12.2.1 – REST Support

In this blog, we will review the native REST service support added in 12.2.1. You can refer to this post to find information about the same from a 12.1.3 perspective.

Before discussing further, we will first see how 12.2.1 provides backward compatibility with 12.1.3. In 12.1.3, a REST Proxy Service converts the native REST payload to SOAP before calling a Pipeline/Split-Join, and a REST Business Service converts SOAP back to the native REST payload, i.e. the internal communication happens using WSDL interfaces only.

To create a REST binding (Proxy or Business service) the 12.1.3 way, check the option shown below; the other steps remain the same.

We can see that both a WSDL and a WADL get created in the project.

wadl

req1

req2

To access the REST resource, use a URL like http://localhost:<<OSB Port>>/<<proxy endpoint>>/<<resource name>>, e.g. http://localhost:7003/restDemo/REST1213WayPS/employees

To access the design-time WADL, use a URL like http://localhost:<<OSB Port>>/sbresource?WADL/<<project path>>/<<WADL name>>, e.g. http://localhost:7003/sbresource?WADL/RESTIn1213way/WSDL/REST1213WayPS

To access the effective WADL, use a URL like http://localhost:<<OSB Port>>/sbresource?(PROXY or BIZ)/<<project path>>/<<proxy or biz service name>>, e.g. http://localhost:7003/sbresource?PROXY/RESTIn1213way/ProxyServices/REST1213WayPS

Now in 12.2.1, we have native REST support and there is no need to create a WSDL for internal communication. This native support is broadly classified into the following categories:

  • Un-typed Proxy/Business Service – no method information is available at design time, so no WADL is involved.
  • Typed Proxy/Business Service – the method information is available at design time, so a WADL containing this information is used/created.

The REST binding can be used to create both Proxy and Business services in the above categories. In this post, we discuss the Proxy Service perspective; the same steps can be followed for Business services.

Creating Typed Proxy Service:

We use the REST binding to create a native REST service. Drag the REST binding from the Components window to the Proxy Services swim lane, or right-click and choose the REST option.

typedbind1

Provide a name for the REST binding and do not select the WSDL interfaces check box, as we are creating native REST services. Click Next.

typed1

Create a new REST resource as shown below.

typedemp

typedemp2

Create a REST method using following steps by clicking + icon in Methods.

typedreq

typedresp

typedfinish

Now verify that a WADL file is generated automatically with the method information defined above. Next, create a pipeline using the following steps.

typedpp

typedpp1

typedpp2

typedpp3

Connect Proxy Service, Pipeline and Business Service as shown below. Use the same business service as we used earlier.

sboverview

Finish the message flow as shown below.

ppmflow

routing

Deploy and test your project in the Service Bus console. Observe that all media types supported by the REST service are shown in the Accept choice list.

typedtestreq

typedtestresp

Now we will see how to use an existing WADL to create Typed REST services.

Again drag the REST binding from Components to Proxy Services swim lane or right click in swim lane to choose REST option.

untypedbind1

Provide a name for the REST binding and do not select the WSDL interfaces check box, as we are creating native REST services. Click Next.

typedproxy

Choose REST1213WayPS.wadl. This confirms that WADLs generated by 12.1.3 REST services are supported here. Observe that the REST methods are populated automatically from the selected WADL.

typed

wadlselect

wadloper

Click Finish and verify that new WADL is generated again for this Proxy Service.

wadl

Now finish pipeline message flow as above using WADL created in above step.

typedexistingpp

sboverview1

To access the REST resource, use the URL http://localhost:7003/restDemo/typedService/typedEmployees

To access the design-time WADL, use the URL http://localhost:7003/sbresource?WADL/RESTTypedServices/TypedRestService

To access the effective WADL, use the URL http://localhost:7003/sbresource?PROXY/RESTTypedServices/TypedRestService

Observations:

  • A WADL is always created for Typed native REST services when one is not chosen during creation.
  • Nowhere are we able to specify the input/output message structure (XML or JSON schema) for REST methods. I think this may be improved in later releases.
  • When a native REST Proxy Service supports multiple content types (XML, JSON), automatic payload conversion (XML to JSON and vice versa) does not happen as it does in WSDL-based REST services. I will try to cover more on this in later posts.
  • The Content-Type HTTP header is used by OSB for content parsing, and we can see it set automatically when a media type is chosen in the test console.
  • The value given for soa:name in the WADL populates the $operation context variable in the pipeline.
  • A 12.1.3 WADL is not supported for creating pipelines but can be used to create a Proxy; however, a new WADL will be generated by OSB, as we saw above.

Creating Un-typed Proxy Service:

Create a Proxy Service using the following steps. Observe the choice of Transport and that nowhere do we define REST resources or methods.

untypedps

untypedps1

untypedps2

untypedps3

Create a pipeline using the following steps and observe that we are not selecting any WADL as we did earlier.

untypepp

untypedpp1

Connect all these pieces as shown below and complete Message Flow as we did earlier.

sboverview2

Deploy and test your project in the Service Bus console. Observe that all media types are shown in the Media Type choice list, as we have not specified the supported types anywhere. Service Bus uses the Content-Type HTTP header for parsing the payload, and you can see it is set automatically when we choose the media type in the Test Console.

typedtest

untypedtestresp

To access the REST resource, use the URL http://localhost:7003/restDemo/untypedService

Observations:

  • No WADL is used during creation of Un-typed native REST services.
  • Again, nowhere are we able to specify the input/output message structure (XML or JSON schema) for REST methods.
  • Again, no automatic payload conversion happens when the REST Proxy supports multiple content types.
  • The Content-Type HTTP header is used by OSB for content parsing, and we can see it set automatically when a media type is chosen in the test console.

In the above two sections, we created both the Proxy and the Pipeline separately, and we can observe that a WADL is optional for REST-based pipelines. So Pipelines too are classified into Typed and Un-typed depending on the usage of a WADL.

So now the question arises about compatibility between Proxies and Pipelines, as both of them can be Typed or Un-Typed. Since Typed is the more restrictive of the two (it carries the REST method definitions), a Typed Proxy can call both Un-Typed and Typed pipelines, provided a Typed pipeline uses the same WADL. In the same way, an Un-Typed Proxy can call both Un-Typed and Typed Pipelines.

The source code used in this post can be downloaded from here. Please note that you need to create a DB connection pool with the JNDI name eis/DB/LocalDB to run this project.

Reference:

https://docs.oracle.com/middleware/1221/osb/develop/GUID-C346DF7D-041D-4E10-BE1C-451F50719106.htm#OSBDV89235

12.2.1 OSB JDev Issues

The following information is related to 12.2.1 release unless stated otherwise.

Issue 1:

OSB projects get converted to SOA projects after adding an XQuery to the workspace. You can confirm this by looking at the Components window, which shows SOA-related components after opening a pipeline.

Fortunately, this issue is already documented in support note 2090174.1 and the solution is to apply patch 22226040. Refer to this post for instructions on applying the patch. Make sure that ORACLE_HOME and MW_HOME point to the right locations when you have multiple Middleware homes.

Verify that the patch is applied successfully by issuing opatch lspatches. Restart JDeveloper after clearing the cache (the system directory).

If you still see this issue, verify that the TechnologyScopeConfiguration in the project's .jpr file does not contain a SOA entry, i.e. it looks similar to the snippet below.

<hash n="oracle.ide.model.TechnologyScopeConfiguration">
  <list n="technologyScope">
    <string v="Maven"/>
    <string v="ServiceBusTechnology"/>
    <string v="WSDL"/>
    <string v="WSPolicy"/>
    <string v="XML"/>
  </list>
</hash>

Issue 2:

When opening any existing Service Bus application for the first time, JDeveloper gets stuck saying 'Loading Maven…'. To resolve the issue, modify the version to 12.2.1-0-0 in the parent section of the pom files of the Service Bus projects, including the System project. A sample is shown below.

<parent>
  <groupId>com.oracle.servicebus</groupId>
  <artifactId>sbar-project-common</artifactId>
  <version>12.2.1-0-0</version>
</parent>

SOA 12c – Maven Articles

Using Maven Sync Plugin

Using Maven for SOA Deployment

Using Maven for Service Bus Deployment

SOA 12c – ESS Articles

Creating ESS Job metadata using EM Console

Creating ESS Job metadata using JDeveloper

Creating ESS Schedule metadata

Creating ESS Incompatibility metadata

Creating ESS Job Sets metadata

Retry functionality in ESS Jobs

Creating Async ESS Job Definition

Using Schedule Job activity in BPEL

SOA 12c – Creating ESS Async Job Definition

In an earlier post, we saw how to create an ESS Job Definition using a synchronous web service. Now, we will look at creating a Job Definition using an asynchronous BPEL web service with a 5-minute Wait activity to simulate a delayed response.

We will also take a look at the other changes required for deployment when a new job definition is created in an existing ESS application under a new package structure.

Create the Async Job Definition with the help of the following screenshots. Make sure that you always use the concrete WSDL.

asyncbpeljob

projexplore

tab

selectwsdl

selectwsdl1

asyncws

asyncws1

You need to modify MAR profile to include the new job definition and also adf-config.xml file to include the valid namespace as shown below.

marchanges

adfconfig

Now deploy the ESS application and submit a new request using this job definition. You will observe that the ESS job status is Running while it waits for the response from the BPEL web service, as shown below.

waitjob

Once the ESS job receives the response from the asynchronous BPEL web service, the job finishes and shows the status Succeeded, as shown below.

aftersuccess

You will see similar behavior when you use an ADF BC service.

SOA 12c–Creating ESS Job Set metadata

Often, we may have to run multiple jobs to complete some functional process. ESS lets you do this using a Job Set, where we can add multiple jobs as steps in the metadata and submit them as a single unit. We can also specify the relationship among these steps as either Serial or Parallel.

A Job Set can also include another Job Set, so a Parallel Job Set can contain a Serial Job Set and vice versa, allowing more complex Job Sets to be created.

To create Job Set, select File –> New –> Enterprise Scheduler Metadata –> Job Set.

jobsetnew

Give a meaningful name and use the same package as used in previous post.

jobsetnew1

Click OK and you can observe new Job Set shown in Project Explorer and a new tab is opened.

explrjobset

jobsetsteps

Let us create a Serial Job Set first. Click + in the Job Set Steps section to add jobs. We can also specify any System or Application properties at each step using the respective tabs shown below.

step1

Now your Job Set Steps visual diagram looks like the one below. After execution, each job step can assume any of the statuses Success, Warning or Error, represented by the icons in the screenshot below. Here you can define the relationship of a step to the other steps based on that status. The following diagram depicts a Job Set whose execution stops on the occurrence of an Error or Warning and proceeds to the next step (if any) on Success.

step1graphic

Similarly add another step as shown below.

step2

step2graphic

Now modify the relationships of Step1 as shown below.

step1and2

Now save your changes and deploy to ESS server using the steps mentioned in previous post.

To view the newly created Job Set in the EM console, navigate to ESSAPP –> Job Metadata –> Job Sets and search for EssNativeHostingApp as shown below.

emjobsetview

Navigate to ESSAPP –> Job Requests –> Submit Job Request and submit Job Set as shown below.

submitjobset

submitjobset1

Run this Job Set when the service is down so that we can see its behavior when an error occurs during execution. Observe that both of the steps result in an error, as shown below.

jobsetstatus

The Search Job Requests page shows these requests as below, where each step is executed as a child request; we can also observe the serial execution by looking at the Processing Start Time and Run Time.

jobstatus1

Now modify Step1 to Stop on the occurrence of an error. Save your changes and deploy your application to the ESS server.

step1stop

Submit request using this Job Set and observe the Job Status as shown below. Now you can clearly observe that only Step1 has been executed because of the above changes.

step1stop1

step1stop2

Now bring the service up and submit the Job Set to observe that both of the steps are successful.

stepsuccess

By default, each step's status determines the terminal state of the Job Set. To override this behavior, you can define the system property SYS_selectState at the step level as shown below and set its value to false.

sysprop1

sysprop2

ss

Now let us create a Parallel Job Set with the help of the following screenshots.

parallel1

parallel2

parallel3

parallel1step1

parallel1step2

parallel4

In a Parallel Job Set, execution of all steps starts at the same time, hence we can't define relationships among steps based on step execution status as in a Serial Job Set. However, you can still define the step-level system property SYS_selectState to override the default behavior.

Now save changes and deploy application to ESS server. Submit a request using this new Job Set and observe the parallel runtime behavior as shown below.

jobsetparallel

parallestatus

You can find more information about Job Sets in the ESS documentation here, and the sample project used in this blog can be downloaded from here.

Using EM Console to Create Job Set

We can also create a Job Set in EM console as shown below and the steps look similar to above.

emjobset

emcreate1

emcreate2

emcreatestep

emcreate3

SOA 12c–Creating ESS Incompatibility metadata

We often come across the following restrictions when using ESS jobs, either because of data corruption issues or for some other functional reason.

  • Only a single instance of a Job definition should run at any particular time.
  • Some jobs should not run while certain other jobs are running, irrespective of parameters.
  • Some jobs should not run while certain other jobs are running when they act on the same object, i.e. have the same value for a particular parameter.

In ESS, all of the above requirements are addressed using an Incompatibility definition. The first requirement is addressed by the Self Incompatible option. The second and third requirements are addressed by Global and Domain type incompatibility definitions respectively. In this blog post, we will learn how to create Incompatibilities catering to the above requirements.

To create Incompatibility metadata, select File –> New –> Enterprise Scheduler Metadata –> Incompatibility.

incomp

Give a meaningful name and use the same package as used in previous post. Here we are creating Global type.

incompdef

Click OK and you can observe the new Incompatibility file shown in the project explorer and a new tab is opened.

projexpl

incomptab

Click + icon in Entities section to start adding the jobs.

joblist

Select the required jobs and click OK. If we try to save, we get an error as shown below: an Incompatibility definition mandates selecting the Self Incompatible option when we add just a single job (entity).

error

To set this option, double-click the job in the Entities section, choose the option as shown below and click OK. This makes the ESS request processor run only a single instance of this job at any particular time.

selfincomp

Now save your changes and deploy to ESS server using the steps mentioned in previous post.

To view the newly created Incompatibility in the EM console, navigate to ESSAPP –> Job Metadata –> Incompatibilities and search for EssNativeHostingApp as shown below.

deployedincomp

To verify the effect of incompatibility, let us submit 2 instances of the same job and schedule them to run at same time as shown below.

submitted

At the scheduled time, we can observe that one of the requests is blocked, as shown below, because of our incompatibility definition.

blocked

Once the job with request ID 205 completes, 206 is kicked off by the request processor, which is evident from the start times shown in the screenshot below.

starttime

If you want to make two jobs incompatible with each other, add the other job to the incompatibility definition as shown below.

secondjob

secondjob1

Similar to the above, you can observe the same blocking behavior for these two jobs in the following screenshots when they are submitted in the EM console for the same scheduled time.

schedule2jobs

sucess2jobs

Domain type Incompatibility

You can create incompatibility definition of Domain type using the following screenshots.

domainincomp

domainincomp1

Double-click each job in the Entities section to select the property to be used for defining the incompatibility. Note that the two jobs can use different property names in the incompatibility definition.

1stjobprop

2ndprop

Now your incompatibility definition should look like below.

2ndincomp

Now deploy your ESS project to verify the effect of the new incompatibility definition. Note that you have to remove SecondOSBJob from the previous incompatibility definition before deployment, as that one is of Global type; this is necessary to see the effect of this Domain type incompatibility, which is based on properties.

When different values are used for the parameters, you can observe that both jobs start execution at the same time, which is evident from the date values shown below.

sametime

When the same value is used for the parameter, you can observe that one of the requests is blocked, which is evident from the following screenshots.

samevalues

samevalues1

Using EM Console to Create Incompatibility

We can also create an Incompatibility in the EM console as shown below; the steps are similar to the above.

emincomp

emincompcreate

createincomp

entitylist

selfincompem

defn

You can find more information in the documentation here, and the sample project used in this blog can be downloaded from here.

SOA 12c–Creating ESS Schedule metadata

The real strength of ESS comes from its ability to schedule a job, which is a common use case. ESS provides Schedule metadata to let the user schedule a job either based on recurrence or on explicit dates. In this post, we will see how to create this Schedule metadata and use it with the ESS job created in the previous post.

To create Schedule metadata, select File –> New –> Enterprise Scheduler Metadata –> Schedule.

newsch1

Give a meaningful name for the schedule and use the same package as used in previous post.

newsch2

Click OK and you can observe the new schedule in project explorer.

schprojexplore

Let us define a schedule so that the ESS job runs every 2 minutes, three times, within the Start and End dates shown below. You can also give explicit dates to be included regardless of the recurrence settings in the respective section. Observe that we are not specifying any ESS job while defining the schedule, which enables the schedule to be reused for any ESS job.

newsch3

Now save your changes and deploy to ESS server using the steps mentioned in previous post.

To view the newly created Schedule in the EM console, navigate to ESSAPP –> Job Requests –> Define Schedule and search for EssNativeHostingApp as shown below.

emnewsch3

You can also set the recurrence using the Every field for seconds, hours, days, weeks, etc., as shown below.

recurr

The appropriate selection of days, weeks, etc. is enabled based on the selected frequency, as shown below.

recurr1

Navigate to ESSAPP –> Job Requests –> Submit Job Request and select the job definition as shown below. Go to Schedule section and click search icon for Use existing schedule.

jobsubmission

Select the appropriate schedule and click OK.

selectschedule

schsubmission2

To see the submitted requests, navigate to ESSAPP –> Job Requests –> Search Job Requests and search for this job.

searchjobreq

In the above screenshot, you can observe that 3 requests (we used 3 as the count in the Schedule) have been submitted as child jobs. Click the Parent ID and Request ID to see more information about these job requests.

parentreq

childreq

Using EM Console to Create Schedule

We can also create a schedule using EM console directly as shown below.

emnewsch

emnewsch1

emnewsch2

emnewsch3

