Handling content enrichment in a Dynamic Data Mapping Solution

In the first part, we covered how to handle JSON payloads with varied field names across different client systems in WSO2 Micro Integrator.

In this second part, we will explore how to dynamically enrich data originating from distinct client systems without creating a dedicated data mapper for each one. Each field within the incoming payload might require unique parsing and data manipulation before being assigned to the designated canonical field, so the solution must accommodate different enrichment logic for individual fields.

Let's employ the same technique demonstrated in the first part of this article. To recap briefly, the Data Mapper operates through a ".dmc" file that contains JavaScript code for data mapping. By adjusting this ".dmc" file, we can introduce the code required to dynamically execute JavaScript, thereby attaining the intended outcome.

Enrichment logic can be defined using JavaScript: compose a JS script that derives the values from the input fields of the incoming payload. The Data Mapper can then be modified to execute the JavaScript logic specified for a particular field, and a JSON map can be defined and injected to specify the parsing logic for each field in the payload.

Step 1: Create a sample JSON file to define the Data Mapper

Create a JSON file containing the following data.

{
    "number": "110001",
    "priority": "2 - High",
    "location": "1002 Newyork"
}

<property expression="json-eval($)" name="payload" scope="default" type="STRING"/>

From the above payload, consider a scenario where the target system requires the numeric value preceding the "-" symbol to be set in the priority field. Similarly, the location field requires only the first four characters.
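Before wiring this into the Data Mapper, the two transformations can be sketched in plain JavaScript, using the field names from the sample payload above:

```javascript
// Sketch of the per-field enrichment logic described above,
// using the sample payload as input.
var payload = {
    "number": "110001",
    "priority": "2 - High",
    "location": "1002 Newyork"
};

// Priority: keep only the numeric value preceding the "-" symbol.
var priority = payload["priority"].substring(0, payload["priority"].indexOf("-")).trim();

// Location: keep only the first four characters.
var location = payload["location"].substring(0, 4).trim();

console.log(priority); // "2"
console.log(location); // "1002"
```

These are exactly the expressions that will later be injected as strings through the mapping payload in Step 4.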

Step 2: Define the Data Mapper

Begin the data mapping process by providing the payload file from step 1 as input. Use the same sample file for both input and output. Next, navigate to the file with the ".dmc" extension created in the registry resource project and open it. Inside, you will find the code snippet given below.

map_S_root_S_root = function() {
    var outputroot = {};
    outputroot.number = inputroot.number;
    outputroot.priority = inputroot.priority;
    outputroot.location = inputroot.location;
    return outputroot;
};

Step 3: Modify .dmc file

Now, change the JavaScript code in the ".dmc" file so that it executes the dynamic JS script given in the JSON key-value map (defined in Step 4) for each required input field name.

Note: After modifying the .dmc file, refrain from reopening it or making changes in Integration Studio, as doing so will overwrite the custom mapping code with the default generated code.

map_S_root_S_root = function() {
    var outputroot = {};
    outputroot.number = inputroot.number;
    outputroot.priority = enrichDynamicFieldValue(DM_PROPERTIES.DEFAULT['payload'], 'priority', DM_PROPERTIES.DEFAULT['mappingData']);
    outputroot.location = enrichDynamicFieldValue(DM_PROPERTIES.DEFAULT['payload'], 'location', DM_PROPERTIES.DEFAULT['mappingData']);
    return outputroot;
};

enrichDynamicFieldValue = function(payload, canonicalFieldName, mappingData) {
    var mappingJSON = JSON.parse(mappingData);
    var script = mappingJSON[canonicalFieldName];
    payload = JSON.parse(payload);
    return eval(script);
};

The above code introduces a new function called enrichDynamicFieldValue, which takes three inputs: the payload to be converted, the canonical field name that requires enrichment, and the mapping data. To integrate this function, replace the original "inputroot.fieldname" references in the initial .dmc code. With this substitution, the function dynamically executes the JavaScript logic specified in the mapping data for the designated field and returns the enriched value derived from the input payload.
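The function can be exercised outside Micro Integrator to verify the enrichment behavior. In this sketch, plain strings stand in for the 'payload' and 'mappingData' properties that DM_PROPERTIES.DEFAULT would supply at runtime:

```javascript
// Standalone sketch of enrichDynamicFieldValue, runnable outside
// Micro Integrator; plain strings replace the DM_PROPERTIES lookups.
var enrichDynamicFieldValue = function(payload, canonicalFieldName, mappingData) {
    var mappingJSON = JSON.parse(mappingData);
    var script = mappingJSON[canonicalFieldName];
    payload = JSON.parse(payload);
    // Direct eval runs the snippet with the local 'payload' in scope.
    return eval(script);
};

// Sample payload, as stored by the 'payload' property.
var payloadStr = JSON.stringify({
    "number": "110001",
    "priority": "2 - High",
    "location": "1002 Newyork"
});

// Mapping data: canonical field name -> JS enrichment snippet.
var mappingData = JSON.stringify({
    "priority": "payload['priority'].substring(0, payload['priority'].indexOf('-')).trim();",
    "location": "payload['location'].substring(0, 4).trim();"
});

console.log(enrichDynamicFieldValue(payloadStr, 'priority', mappingData)); // "2"
console.log(enrichDynamicFieldValue(payloadStr, 'location', mappingData)); // "1002"
```

Note that eval executes whatever snippet the mapping supplies, so in a real deployment the mapping source must be trusted.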

Step 4: Define the enrichment mapping payloadFactory

Within the API/Sequence file, before invoking the Data Mapper mediator, generate a key-value payload that defines the enrichment logic for the input field names of the integrating client system, as referenced in the .dmc file. Store this payload as a property; it will be used within the .dmc file to enrich the input payload.

Please note: in this post the mapping payload is hardcoded; in a real project it should be stored in a database. It is recommended to maintain a separate enrichment mapping payload for each client system in the database and retrieve it based on the respective client's source system.

<payloadFactory media-type="json">
    <format>
        {
            "priority": "payload['priority'].substring(0, payload['priority'].indexOf('-')).trim();",
            "location": "payload['location'].substring(0, 4).trim();"
        }
    </format>
    <args/>
</payloadFactory>

<property expression="json-eval($)" name="mappingData" scope="default" type="STRING"/>

Step 5: Testing the program

Time to test the implementation. Call the API with the input payload below:

{
    "number": "INC_12805",
    "priority": "3 - Medium",
    "location": "1005-Dallas"
}

With the help of the custom dynamic Data Mapper, the API converts the above payload as follows:

{
    "number": "INC_12805",
    "priority": "3",
    "location": "1005"
}

Using this approach, you can define an enrichment mapping payload for each individual client system and carry out the conversion and enrichment of the payload. This eliminates the need to create new Data Mapper code for each integrating system.

Building Dynamic Data Mapping Solution for Parsing JSON Payloads with Varied Field Names Across Different Client Systems in WSO2 Micro Integrator

When developing integration solutions, it is often necessary to handle input source keys or field names that vary between different systems. Unfortunately, the current data mapping solution offered by WSO2 requires defining the input and output fields during the development phase, without the flexibility to load a dynamic input configuration file at runtime. As a result, this limitation forces the creation of a separate data mapper project for each integrating system, adding extra development and deployment effort for each integrating client system.

While there is no direct method available, we can accomplish this task through a non-standard approach. The data mapper operates using a “.dmc” file, which contains JavaScript code to facilitate data mapping. By modifying this “.dmc” file, we can insert the necessary code to load a list of dynamic field names, achieving the desired functionality.

Step 1: Create sample JSON file

Create a JSON file with the data below:

{
    "number": "110001",
    "description": "unable to create order"
}

<property expression="json-eval($)" name="payload" scope="default" type="STRING"/>

Step 2: Define the Data Mapper

Begin the data mapping process by providing the payload file from step 1 as input. Use the same sample file for both input and output. Next, navigate to the file with the ".dmc" extension created in the registry resource project and open it. Inside, you will find the code provided below.

map_S_root_S_root = function() {
    var outputroot = {};
    outputroot.number = inputroot.number;
    outputroot.description = inputroot.description;
    return outputroot;
};

Step 3: Modify .dmc file

Now, change the JavaScript code in the “.dmc” file to enable dynamic retrieval of input field names from a JSON key-value map.

Note: After modifying the .dmc file, refrain from reopening it or making changes in Integration Studio, as doing so will overwrite the custom mapping code with the default generated code.

map_S_root_S_root = function() {
    var outputroot = {};
    outputroot.number = getDynamicFieldName(DM_PROPERTIES.DEFAULT['payload'], 'number', DM_PROPERTIES.DEFAULT['mappingData']);
    outputroot.description = getDynamicFieldName(DM_PROPERTIES.DEFAULT['payload'], 'description', DM_PROPERTIES.DEFAULT['mappingData']);
    return outputroot;
};

getDynamicFieldName = function(payload, canonicalFieldName, mappingData) {
    var mappingJSON = JSON.parse(mappingData);
    var fieldName = mappingJSON[canonicalFieldName];
    var obj = JSON.parse(payload);
    return obj[fieldName];
};

The above code introduces a new function called getDynamicFieldName, which takes three inputs: the payload to be converted, the canonical field name to look up in the map, and the mapping data. To incorporate this function, substitute the original inputroot.fieldname references in the initial .dmc code. With this substitution, the JavaScript logic retrieves the dynamic field names from the mapping data and processes the input payload accordingly.
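The lookup can be verified in plain JavaScript outside Micro Integrator. In this sketch, plain strings stand in for the 'payload' and 'mappingData' properties that DM_PROPERTIES.DEFAULT would supply at runtime:

```javascript
// Standalone sketch of getDynamicFieldName, runnable outside
// Micro Integrator; plain strings replace the DM_PROPERTIES lookups.
var getDynamicFieldName = function(payload, canonicalFieldName, mappingData) {
    var mappingJSON = JSON.parse(mappingData);
    var fieldName = mappingJSON[canonicalFieldName];
    var obj = JSON.parse(payload);
    return obj[fieldName];
};

// Client payload arrives with the client's own field names...
var payloadStr = JSON.stringify({
    "incident_no": "INC_12801",
    "desc": "Unable to open the website"
});

// ...and the mapping translates canonical names to client names.
var mappingData = JSON.stringify({
    "number": "incident_no",
    "description": "desc"
});

console.log(getDynamicFieldName(payloadStr, 'number', mappingData));      // "INC_12801"
console.log(getDynamicFieldName(payloadStr, 'description', mappingData)); // "Unable to open the website"
```

Supporting a new client system then only requires a new mapping JSON, not a new .dmc file.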

Step 4: Create the dynamic field mapping payloadFactory

Within the API/Sequence file, before invoking the Data Mapper mediator, generate a key-value payload that establishes a correspondence between the input field names of the integrating client system and the canonical field names specified in the .dmc file. Store this payload as a property; it will be used within the .dmc file to parse the input payload.

Please note: in this post the mapping payload is hardcoded; ideally it should be stored in a database. It is recommended to maintain a separate input mapping payload for each client system in the database and retrieve it based on the respective client's source system.

<payloadFactory media-type="json">
    <format>
        {
            "number": "incident_no",
            "description": "desc"
        }
    </format>
    <args/>
</payloadFactory>

<property expression="json-eval($)" name="mappingData" scope="default" type="STRING"/>
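As a sketch of the database-backed alternative recommended in the note above, a DBLookup mediator could populate the mappingData property instead of the hardcoded payloadFactory. The database URL, credentials, table and column names (client_mappings, mapping_json), and the clientId property are all assumptions for illustration:

```xml
<!-- Hypothetical: fetch the per-client mapping JSON from a database.
     Connection details, table/column names, and the clientId property
     are illustrative only. -->
<dblookup>
    <connection>
        <pool>
            <driver>com.mysql.jdbc.Driver</driver>
            <url>jdbc:mysql://localhost:3306/integration</url>
            <user>esb_user</user>
            <password>esb_password</password>
        </pool>
    </connection>
    <statement>
        <sql>select mapping_json from client_mappings where client_id = ?</sql>
        <parameter expression="get-property('clientId')" type="VARCHAR"/>
        <result column="mapping_json" name="mappingData"/>
    </statement>
</dblookup>
```

The result column is stored in the mappingData property, so the .dmc code shown earlier works unchanged regardless of where the mapping comes from.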

Step 5: Testing the program

Time to test the implementation. Call the API with the input payload below:

{
    "incident_no": "INC_12801",
    "desc": "Unable to open the website"
}

With the help of the custom dynamic Data Mapper, the API converts the above payload as follows:

{
    "number": "INC_12801",
    "description": "Unable to open the website"
}

Using this approach, you can define a new data mapping payload for each individual client system and carry out the conversion. This eliminates the need to create new Data Mapper code for each integrating system.

An inside look into today's ESB tools

The core characteristics that differentiate an ESB from Hub-and-Spoke-based EAI and message brokers are largely being overlooked by current ESB tool vendors. Instead, vendors are enriching their older Hub-and-Spoke tools with open standards such as web services, XML, XPath, and XSLT. The overlooked features, which include the pervasive grid, selective deployment, and autonomous and federated environments, are precisely what enable organizations to adopt the ESB selectively and incrementally.

Most ESB deployments today are based on the Hub-and-Spoke model. In addition to being monolithic and afflicted with a single point of failure, these deployments fail to implement many core ESB features, either due to tool constraints or a lack of awareness of them.


My blog post, published some time back on the iGate blog site, gives an overview of the features missing from present-day ESB tools that are core to the ESB concept.

Read the full blog post here: http://www.igate.com/iblog/?p=332