Handling Content Enrichment in a Dynamic Data Mapping Solution

In the first part, we covered how to handle JSON payloads that arrive with varied field names across different client systems in WSO2 Micro Integrator.

In this second part, we will explore how to dynamically enrich the data originating from distinct client systems without creating a dedicated data mapper for each one. Each field in the incoming payload may require unique parsing and data manipulation before being assigned to its designated canonical field, so the solution must accommodate a different enrichment logic for each field.

Let's employ the same technique demonstrated in the first part of the article. To recap briefly: the data mapper runs off a ".dmc" file that contains JavaScript code for the data mapping. By adjusting this ".dmc" file, we can introduce code that executes JavaScript dynamically, thereby attaining the intended outcome.

Enrichment logic can be defined in JavaScript. Compose a JS script that derives each value from the input fields of the incoming payload. The data mapper can then be modified to execute the JavaScript logic specified for a particular field, and a JSON map can be injected to supply the parsing logic for each field in the payload.
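As a minimal sketch of the idea (the names here are illustrative, not part of the final solution), the mapping is a plain JSON object whose keys are canonical field names and whose values are JavaScript expressions evaluated against the payload:

// Illustrative sketch: each canonical field maps to a JavaScript expression
// that derives its value from the incoming payload.
var enrichmentMap = {
    "priority": "payload['priority'].substring(0, payload['priority'].indexOf('-')).trim()"
};

var payload = { "priority": "2 - High" };

// eval() runs the expression stored for the field; 'payload' is in scope here
var enrichedPriority = eval(enrichmentMap["priority"]); // "2"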

Step 1: Create a sample JSON file to define the data mapper

Create a JSON file containing the following data.

{
    "number": "110001",
    "priority": "2 - High",
    "location": "1002 Newyork"
}

Store the incoming payload as a property so that it is accessible later inside the ".dmc" file:

<property expression="json-eval($)" name="payload" scope="default" type="STRING"/>

From the above payload, consider a scenario where the target system requires only the numeric value preceding the "-" symbol for the priority field. Similarly, the location field requires only the first four characters.
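In plain JavaScript, those two parsing rules amount to the following (a standalone illustration, independent of the data mapper):

// Priority: keep only the numeric part before the "-" separator
var priority = "2 - High";
var parsedPriority = priority.substring(0, priority.indexOf('-')).trim(); // "2"

// Location: keep only the first four characters
var location = "1002 Newyork";
var parsedLocation = location.substring(0, 4).trim(); // "1002"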

Step 2: Define the data mapper

Begin the data mapping process by providing the payload file from Step 1 as input; use the same sample file for both input and output. Next, navigate to the generated file with the ".dmc" extension in the registry resource project and open it. Inside, you will find the code snippet below.

map_S_root_S_root = function() {
    var outputroot = {};
    outputroot.number = inputroot.number;
    outputroot.priority = inputroot.priority;
    outputroot.location = inputroot.location;
    return outputroot;
};

Step 3: Modify the .dmc file

Now, change the JavaScript code in the ".dmc" file so that, for each input field that requires it, it executes the dynamic JS script supplied in the JSON key-value map (defined in Step 4).

Note: After modifying the .dmc file, refrain from reopening or editing it in Integration Studio, as doing so will overwrite the custom mapping code with the default generated code.

map_S_root_S_root = function() {
    var outputroot = {};
    outputroot.number = inputroot.number;
    outputroot.priority = enrichDynamicFieldValue(DM_PROPERTIES.DEFAULT['payload'], 'priority', DM_PROPERTIES.DEFAULT['mappingData']);
    outputroot.location = enrichDynamicFieldValue(DM_PROPERTIES.DEFAULT['payload'], 'location', DM_PROPERTIES.DEFAULT['mappingData']);
    return outputroot;
};

enrichDynamicFieldValue = function(payload, canonicalFieldName, mappingData) {
    // Parse the injected key-value map of canonical field name -> JS snippet
    var mappingJSON = JSON.parse(mappingData);
    // Pick the enrichment script defined for this field
    var script = mappingJSON[canonicalFieldName];
    // Expose the original payload to the script as 'payload'
    payload = JSON.parse(payload);
    // Execute the field-specific logic and return the enriched value
    return eval(script);
};

The code above introduces a new function, enrichDynamicFieldValue, which takes three inputs: the payload to be converted, the canonical field name that requires enrichment, and the mapping data. To integrate it, replace the original "inputroot.fieldname" assignments in the initial .dmc code with calls to this function. With this substitution, the function dynamically executes the JavaScript logic specified in the mapping data for the designated field and returns the enriched value derived from the input payload.
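To see the function in isolation, here is a hypothetical standalone check (outside Micro Integrator) using the same string arguments the .dmc code passes in via DM_PROPERTIES:

// Stand-ins for the 'payload' and 'mappingData' properties set in the sequence
var payloadStr = JSON.stringify({ "priority": "2 - High", "location": "1002 Newyork" });
var mappingStr = JSON.stringify({
    "priority": "payload['priority'].substring(0, payload['priority'].indexOf('-')).trim();",
    "location": "payload['location'].substring(0, 4).trim();"
});

enrichDynamicFieldValue(payloadStr, 'priority', mappingStr); // returns "2"
enrichDynamicFieldValue(payloadStr, 'location', mappingStr); // returns "1002"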

Step 4: Define the enrichment mapping payload factory

Within the API/sequence file, before invoking the data mapper mediator, generate a key-value payload that defines the enrichment logic for the input field names of the integrating client system, and store this payload as a property. This property is then read inside the .dmc file to enrich the input payload.

Note: In this post the mapping payload is hardcoded; in a real project it should be stored in a database. It is recommended to maintain a separate input enrichment mapping payload for each client system in the database and retrieve it based on the requesting client system (see the sketch after the configuration below).

<payloadFactory media-type="json">
    <format>
        {
            "priority": "payload['priority'].substring(0, payload['priority'].indexOf('-')).trim();",
            "location": "payload['location'].substring(0, 4).trim();"
        }
    </format>
    <args/>
</payloadFactory>

<property expression="json-eval($)" name="mappingData" scope="default" type="STRING"/>
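As a sketch of the database-backed variant mentioned in the note above (the identifiers and structure here are assumptions for illustration), the mapping records could be keyed by a client-system identifier, with the matching record serialized into the mappingData property:

// Hypothetical per-client mapping store; in practice this would be a database
// table queried (e.g. via a DB mediator) before the data mapper is invoked.
var mappingsByClient = {
    "clientA": { "priority": "payload['priority'].substring(0, payload['priority'].indexOf('-')).trim();" },
    "clientB": { "priority": "payload['prio'].split('/')[0].trim();" }
};

// Serialize the record for the requesting client; the returned string is what
// would be stored in the 'mappingData' property used by the .dmc code.
function getMappingData(clientId) {
    return JSON.stringify(mappingsByClient[clientId]);
}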

Step 5: Test the implementation

Time to test the implementation. Call the API with the input payload below:

{
    "number": "INC_12805",
    "priority": "3 - Medium",
    "location": "1005-Dallas"
}

With the help of the custom dynamic data mapper, the API converts the above payload as follows:

{
    "number": "INC_12805",
    "priority": "3",
    "location": "1005"
}

Using this approach, you can generate an output payload for each individual client system and carry out the conversion and enrichment of the payload, eliminating the need to write new data mapper code for each integrating system.

An inside look into today's ESB tools

The core characteristics that differentiate an ESB from hub-and-spoke EAI and message brokers are, for the most part, being overlooked by current ESB tool vendors. Instead, vendors are enriching their older hub-and-spoke tools with open standards such as web services, XML, XPath, and XSLT. The overlooked features, which include the pervasive grid, selective deployment, and autonomous and federated environments, are precisely what enable organizations to adopt the ESB selectively and incrementally.

Most ESB deployments today are based on the hub-and-spoke model. In addition to being monolithic and afflicted with a single point of failure, these deployments omit many core ESB features, either because of tool constraints or a lack of awareness of them.


A blog post I published some time back on the iGate blog site gives an overview of the features missing from present-day ESB tools that are core to the ESB concept.

Read the full blog post here: http://www.igate.com/iblog/?p=332