Tuesday, December 8, 2015

Oracle Managed File Transfer (MFT) custom callouts

Originally posted at TMNS Blog

This week I was doing a proof of concept with Oracle Managed File Transfer and exploring one of its advanced features in more detail: custom callouts.
Callouts allow you to execute transfer pre- and post-processing. Some callouts are available out-of-the-box for compression, decompression, encryption and decryption. The interesting thing, however, is that you can build your own custom callouts. Oracle provides some samples on how to build callouts on the MFT website; see bit.ly/learnmft for details.


Based on the RunScriptPre_01.zip sample provided by Oracle, I started by developing a similar callout, but one to be used in the Target-post action. Below are the high-level steps on how to do that and a link to the zip file containing all the information you need to test it yourself:
  1. Create a Java class (RunScriptPost.java) that implements the oracle.tip.mft.engine.processsor.plugin.PostCalloutPlugin interface
  2. Compile the class, package it in a jar file (RunScriptPost.jar) and copy it to $DOMAIN_HOME/mft/callouts/
  3. Create a callout definition file (RunScriptPost.xml) to describe the callout
  4. Run the createCallouts WLST command (createCallout.py) to configure the callout in MFT
Once you complete the steps above, the callout will be available in MFT Design, to be selected in the “add post-processing actions” screen of your Transfers.
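Conceptually, all the callout does is run a configured script against the file being transferred. Below is a minimal Python sketch of that behaviour, purely for illustration: the real callout is a Java class implementing the PostCalloutPlugin interface, and the function and argument names here are made up.

```python
import subprocess

def run_script_post(script_path, file_path):
    """Illustrative sketch of what the RunScriptPost callout does:
    invoke a configured script, passing the transferred file as an
    argument, and fail the post-processing step if the script fails."""
    result = subprocess.run([script_path, file_path],
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError("post-processing script failed: " + result.stderr)
    return result.stdout
```

In the actual callout, the script location would come from the callout parameters described in RunScriptPost.xml, and failures would be reported back to MFT through the plugin API.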
Check the README.txt file for a detailed description on how to deploy, configure and test it.
Download: RunScriptPost


After creating RunScriptPost, I decided to build a second callout to call an auditing web service. I started by defining a service interface for a generic auditing service that could be implemented by anyone who needs to log or audit the transfers managed by MFT. With the interface available, I created a client jar file containing the web service client and built a sample implementation of the service using SOA Suite. Finally, I created two callouts, one for Pre Auditing and another for Post Auditing. The steps to create the callouts were the same as described above, and you can get the result from the link below. Check the README.txt file for details on how to use it.
Download: GenericWSAudit
I hope this helps you understand how MFT callouts work. Moreover, I expect you can immediately take advantage of the callouts I wrote during this POC.

Thursday, November 5, 2015

How to install Cloud Adapter for Salesforce.com

Originally posted at TMNS Blog
In our recent TechNote we discussed the pros and cons of the Cloud Adapter versus a custom web service call. In this How To we will guide you through the installation of the Cloud Adapter, detail some of the issues you can face during installation, and describe the changes needed to fix them.

Installation steps cookbook:

  1. To use the Cloud Adapter for Salesforce.com, you need to install the initial 12c release of SOA Suite or BPM Suite.
  2. If you want to use it only within a SOA Suite SCA, calling the adapter from a BPEL flow, no additional installation is needed.
  3. If you want to create a Business Service in OSB and expose the Salesforce Adapter operations via a Proxy Service, there are additional steps that you need to follow (see details at the bottom of the post):
    a) Apply the Bundle Patch for SOA Suite or for BPM Suite [Issue #1]
    b) Create a Security System Policy using Enterprise Manager [Issue #2]
    c) Change the server configuration to not validate hostnames, or to allow wildcards in the certificate URLs [Issue #3]
  4. Follow the Cloud Adapter instructions to download the supported Salesforce WSDL and make the additional server configuration changes:
    a) Download the Salesforce Enterprise WSDL from the Salesforce.com Setup web page
    b) Export the Salesforce certificate from the browser and import it into the server keystore using the Java keytool
    c) Configure the credentials in EM, defining a CSF key with the username and password (with security token)
Check all configuration steps here: https://docs.oracle.com/middleware/1213/cloudadapter-salesforce/TKSDP.pdf

Issues and solutions:

Below you have the details of some of the issues you can face during installation and the changes needed to fix them:

If you do not apply the Bundle Patch for SOA/BPM Suite, when you call Salesforce at run-time you will see the error message below:

OSB-380001: Invoke JCA outbound service failed with application error, exception: com.bea.wli.sb.transports.jca.JCATransportException:
oracle.tip.adapter.sa.api.JCABindingException: oracle.tip.adapter.sa.impl.fw.ext.org.collaxa.
[ salesforceReferencePortType::create(parameters,parameters) ] –
WSIF JCA Execute of operation ‘create’ failed due to:
JCA Binding Component connection issue.
JCA Binding Component is unable to create an
outbound JCA (CCI) connection.
[ salesforceReferencePortType::create(parameters,parameters) ] – :
The JCA Binding Component was unable to establish an outbound
JCA CCI connection due to the following issue: BINDING.JCA-12561
JCA Resource Adapter location error (WebLogic).
Unable to locate the JCA Resource Adapter via .jca binding file element <connection-factory/>
The JCA Binding Component is unable to locate the Resource
Adapter specified in the <connection-factory/> element:
(properties: {csfkey=SalesForceKey, applicationVersion=34.0, csfMap=oracle.wsm.security, jndi.location=cloud/CloudAdapter, targetWSDLURL
The reason for this is most likely that either
 1) the Resource Adapters RAR file has not been deployed successfully to the WebLogic J2EE Application server or
 2) the JNDI <jndi-name> setting in the WebLogic JCA deployment descriptor has not been set to cloud/CloudAdapter. In the last case you might have to add a new ‘connector-factory’ entry (connection) to the deployment descriptor.
Please correct this and then restart the WebLogic Application Server

Caused by: javax.naming.NameNotFoundException: While trying to lookup ‘cloud.CloudAdapter’ didn’t find subcontext ‘cloud’. Resolved ”; remaining name ‘cloud/CloudAdapter’

There are two solutions for this issue:

1) Manually change the connection mode parameter in the advanced transport section of the business service in the OSB console after deployment:
  1. Open the OSB console
  2. Create a session
  3. Click on the project and then on the Salesforce business service
  4. Open the Transport Detail tab
  5. Open the advanced section and change the connection mode to “unmanaged”
  6. Activate the session in the console
2) Apply the Bundle Patch, after which no manual configuration is needed


If you do not create the Security Policy, when you call Salesforce at run-time you will see the error message below:

javax.resource.ResourceException: Unable to create Cloud Operation:
  at oracle.tip.adapter.cloud.CloudAdapterInteraction.create
  at oracle.tip.adapter.cloud.CloudAdapter
  at oracle.tip.adapter.sa.impl.fw.wsif.jca.
  at oracle.tip.adapter.sa.impl.fw.wsif.jca.WSIFOperation_
Caused by: oracle.cloud.connector.api.CloudInvocationException: Unable
to find username in credential store.
  at oracle.cloud.connector.salesforce.
  at oracle.cloud.connector.impl.
  at oracle.tip.adapter.cloud.CloudAdapterInteraction.create

The steps below describe how to create the Security Policy needed:

1. Log in to Fusion Middleware Control (Enterprise Manager).
2. Expand “WebLogic Domain” in the left panel.
3. Right-click on the domain you want to modify and select Security > System Policies to display the System Policies page.
4. In the System Policies page, click on the “Create…” button.
5. In the Codebase field, enter the path to the jar file, e.g. file:${osb.oracle.home}/soa/modules/oracle.
6. In the Permissions section, click on the “Add” button.
7. In the new window, change the “Type” field to “Principal” and click on the search button. Select the “Administrator” permission and click “OK”.
8. Back in the previous window, you will now see “oracle.security.jps.service.credstore.CredentialAccessPermission” under Permissions.
9. Select “oracle.security.jps.service.credstore.CredentialAccessPermission”, click on the “Edit…” button and modify it as follows:
Resource Name: context=SYSTEM,mapName=SOA,keyName=*
Permission Action: *
10. Click on “OK” to save the new permission.
See more details in the Oracle Support note “SOA/OSB 12c: Cloud Adapter Patch Reference” (Doc ID 1917423.1).
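The same system policy can also be granted from the command line with the OPSS grantPermission WLST command instead of clicking through EM. A sketch of such a WLST session is below; the URL and credentials are illustrative, and the codebase is the same (truncated) jar path used in the steps above, so take the full path from Doc ID 1917423.1:

```python
# Illustrative WLST (Jython) session; run against the Admin Server.
connect('weblogic', 'welcome1', 't3://localhost:7001')
grantPermission(codeBaseURL='file:${osb.oracle.home}/soa/modules/oracle.',
                permClass='oracle.security.jps.service.credstore.CredentialAccessPermission',
                permTarget='context=SYSTEM,mapName=SOA,keyName=*',
                permActions='*')
disconnect()
```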


If you do not change the server configuration related to hostname validation, when you call Salesforce at run-time you will see the error message below:

Certificate chain received from [URL – IP] failed hostname verification check. Certificate contained *.[URL] but check expected [URL]

There are two options to solve this:

1. Disable hostname verification, as explained in the Cloud Adapter documentation (“Set Hostname Verification to None”)
2. Change the hostname verifier to allow wildcards:
  a) Go to the WebLogic admin console -> Environment -> Servers -> your server -> Configuration -> SSL
  b) Click “Lock & Edit”
  c) Open the “Advanced” section
  d) Change “Hostname Verification” from “BEA Hostname Verifier” to “Custom Hostname Verifier”
  e) Set “Custom Hostname Verifier” to weblogic.security.utils.SSLWLSWildcardHostnameVerifier
  f) Click “Save” and then “Activate Changes”
  g) Restart your server.


When you create a new Salesforce Adapter instance selecting SOSL/SOQL and add parameters to the query, you can get the error message below when you complete the wizard, and the Business Service is not created:

Failed to generate the business service
error: Unexpected character encountered (lex state 3): ‘

The workaround for this issue is to create the Salesforce Adapter instance without any query parameters and, after completing the wizard, right-click on the adapter, select “Edit JCA” to run the adapter wizard again, and replace the query, this time including the parameters. The wizard will then complete as expected and all adapter metadata files will be updated accordingly.

Wednesday, October 28, 2015

Salesforce.com integration: Cloud Adapter versus Custom Web Service call

Originally posted at TMNS Blog
As part of Oracle Fusion Middleware, Oracle introduced the Cloud Adapter for Salesforce.com integration. In this TechNote we will first provide some general information about the Cloud Adapter and briefly describe its support, behaviour and restrictions. Then we will compare the Cloud Adapter with a custom web service call, state the strengths and limitations of each approach, and end with some final recommendations.

Cloud Adapter integration
The Cloud Adapter offers a single integration platform to unify cloud and on-premises applications, reducing the effort to manage authentication, session management and transformation mappings that is usually needed when you build a direct integration with Salesforce using the WSDL API. Depending on the business needs, integration can be done from different systems and in different areas and data sets: Account, Contact, Product, Invoice, Billing, Order, Fulfilment, integrating the hundreds of objects existing in Salesforce.
Following the usual Adapter Wizard approach, the integration can be easily created by browsing, searching and selecting one or more Salesforce business objects and operations. It also allows modelling of SOSL/SOQL queries, providing design time validation capabilities.
The Cloud Adapter is available in the 11g and 12c versions of Oracle Fusion Middleware. Although we expect out-of-the-box functionality from this new feature of the Middleware platform, the initial setup needs some attention and manual configuration on the server side. We will soon post another blog in which you will find the installation steps needed to get the Cloud Adapter for Salesforce.com integration working in both SOA Suite and OSB. We will use Oracle 12c as the reference for all technical details.

Adapter support, behaviour and restrictions
The Cloud Adapter for Salesforce.com supports the following operations:
CORE: convertLead, getDeleted, getUpdated, merge, undelete, upsert
CRUD: create, retrieve, update, delete
MISC: getUserInfo, process
SOSL/SOQL: query, queryAll, search, queryMore

At design time, each time you instantiate a Salesforce Adapter in your SCA or OSB project, an operation needs to be selected, and for each operation you can choose the set of Salesforce objects that will be affected. Each invocation of the adapter instance should be seen as a transaction within Salesforce. There is a header property, “All or None”, that lets you specify what happens in case of an intermediate failure: roll back all changes, or keep what was already changed.
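The effect of the “All or None” property can be sketched as follows. This is plain Python with made-up names; the real rollback happens on the Salesforce side:

```python
def apply_batch(records, save_fn, all_or_none=True):
    """Sketch of the "All or None" header semantics: with
    all_or_none=True any failure rejects the whole batch; otherwise
    successful records are kept and failures reported individually."""
    saved, errors = [], []
    for record in records:
        try:
            saved.append(save_fn(record))
        except ValueError as exc:
            errors.append((record, str(exc)))
            if all_or_none:
                return [], errors   # intermediate failure: keep nothing
    return saved, errors
```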
At run-time, the adapter handles the user session transparently, storing the session in a cache for future calls and avoiding multiple calls to log the user in.
There are restrictions on the number of objects that can be selected for one Salesforce Adapter instance; the number varies depending on the operation selected. In addition, because you select the operation and objects when you instantiate the adapter, if you need separate transactions, or need information from one object (e.g. an account ID) to be added to another object (e.g. a contact), you will need to create two separate adapter instances, make two separate calls, and handle the distributed transaction and compensations yourself.
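That two-call pattern with a compensating action can be sketched as follows. This is pure Python; the three function arguments are illustrative stand-ins for the actual adapter invocations:

```python
def create_account_with_contact(create_account, create_contact, delete_account,
                                account, contact):
    """Two separate adapter calls with a compensation step: the contact
    needs the account id, so two calls are made; if the second call
    fails, the first one is compensated by deleting the account again."""
    account_id = create_account(account)
    try:
        contact_id = create_contact(dict(contact, AccountId=account_id))
    except Exception:
        delete_account(account_id)   # compensation for the first call
        raise
    return account_id, contact_id
```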
After the environment is correctly set up, using the Cloud Adapter is really straightforward. If your integration involves a simple scenario, with only a few interactions with Salesforce and operations on a reduced number of objects, the Cloud Adapter is the suggested way to process Salesforce requests. For complex scenarios with many interactions with Salesforce, or for a generic approach that allows any call to be made to Salesforce, the Cloud Adapter may be too time consuming, because of the way a Salesforce Adapter instance is created in your project: you will need one instance of the adapter for each different object and/or operation.
Alternate approach
For complex scenarios it might be better to interact directly with the Salesforce API using the Enterprise WSDL. You will need to manage the sessions (login/logout) in your custom code, but this gives you more flexibility to create a generic approach and reuse it.
A custom Proxy or Business Service can be built based on the same WSDL used by the Oracle Cloud Adapter. You need to import the WSDL as a web service reference and call the expected operation, sending the Salesforce business object as the payload.
Before you call the service to query or change objects, however, you need to call the Login operation and handle the Session ID received back, which needs to be included in every subsequent request. As a recommendation, it is better to handle the session separately from the integration code and keep the session alive on the server side using a more robust approach, reusing the same session for multiple calls. In this case, you should handle the exceptions raised when Salesforce invalidates the session, logging in again and retrying the failed transaction.
Salesforce limits the number of logins you can do per day and expects you to store the Session ID and Server URL for multiple calls. This means that you should not login/logout for each operation, otherwise your logins will start to be rejected. In addition, because of the way Salesforce handles sessions, once you log in, the session is opened on the Salesforce side and any subsequent logins done through the API join the same session instead of creating a new one. So if you log out, you invalidate the session not only for your flow instance, but for all flows that logged in while your session was active. The recommendation in this case is NOT to log out: just log in if you do not have a Session ID and keep the session alive, handling only the exceptions raised when the session is terminated by Salesforce, in which case you log in again.
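This recommended session handling (log in once, cache the Session ID, never log out, and re-login plus a single retry when Salesforce terminates the session) can be sketched as follows; the login function and fault type are illustrative:

```python
class InvalidSessionError(Exception):
    """Stand-in for the fault Salesforce returns when a session expires."""

class SalesforceSession:
    """Cache the session id, never log out, and on an invalid-session
    fault log in again and retry the call once."""
    def __init__(self, login_fn):
        self._login = login_fn        # performs the actual Login call
        self._session_id = None

    def call(self, operation, *args):
        if self._session_id is None:
            self._session_id = self._login()      # first use: log in once
        try:
            return operation(self._session_id, *args)
        except InvalidSessionError:
            self._session_id = self._login()      # session was terminated
            return operation(self._session_id, *args)  # retry once
```

A failed retry after a fresh login is then a genuine error to propagate, not a session problem.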
Below is a comparison between the two approaches, with the advantages and disadvantages of each one.

Advantages

Oracle Salesforce Cloud Adapter:
  • Straightforward wizard-based approach during design
  • Built-in management of session and authentication
  • Easier mapping of data during design
  • Performance restricted only by hardware and Salesforce restrictions
  • Adapter supports up to 6 previous versions of Salesforce.com

Direct call to Salesforce API:
  • All operations exposed in one API
  • All objects exposed in one API without restrictions
  • Can be part of a generic framework
  • Performance restricted only by hardware and Salesforce restrictions
  • No extra license costs

Disadvantages

Oracle Salesforce Cloud Adapter:
  • Does not work out of the box; requires additional configuration setup
  • Limitations imposed by Salesforce on the number of objects per operation are transferred to the adapter
  • Adapter is limited to only one operation per created instance
  • Not generic enough to be used as part of a framework
  • Extra cost per processor, on top of every SOA licence

Direct call to Salesforce API:
  • Extra effort needed in development for session and authentication management
  • Extra effort needed for data transformation within BPEL or OSB because of polymorphism
  • Requires more knowledge and a more experienced developer to make it work
  • Limitations imposed by Salesforce on the number of objects per operation still exist, although the API itself has no limitations

The use of the web service interface from Salesforce is free of charge, so if you follow the direct approach, the cost is limited to the effort of building the custom integration. If you plan, however, to use the Salesforce Adapter to speed up the initial integrations, you need to consider the cost associated with the adapter, which is around $17,500 per core.

Final recommendations
  • When dealing with only a limited number of operations and objects (one or two different operations on up to 5 objects) on the Salesforce side, the Oracle Cloud Adapter is preferred, as it provides authentication and session management out of the box.
  • For complex situations with a large number of tasks that require different operations on multiple different objects, we recommend direct calls to the Salesforce API, for flexibility and a generic approach.
  • Only direct invocation of the Salesforce API is generic enough to be part of a framework, so if you plan to include Salesforce in a framework for higher reusability, direct invocation of the Salesforce API is the recommended approach.
The choice of integration approach is not made solely on technical information and data, but also on the existing environment setup and business requirements, so the final choice will depend on the project and the use cases it needs to cover.