Monday, 16 July 2012


Continuing from my earlier post a long way back...
Let's see what inbound interfaces are.

Inbound interfaces: inbound interfaces are how data comes into Oracle Applications. Whenever we move from a legacy system to Oracle Applications,
we need to move the data residing in that system (master and transactional data) from the legacy system to the new Oracle Apps system.

Inbound interfaces are broadly classified into two types:


Conversion: a conversion is a one-time activity where we move the data from the legacy system to Oracle Applications.

Interface: an interface is an integration between Oracle Applications and some other third-party application, and it continues to exist even after migrating
to the new Oracle Apps system.

Let's talk about conversions.

In a typical conversion process we load data provided by the client team into Oracle Applications.
Usually the data is provided in dat format or in Excel (*.csv).
The usual steps involved in a conversion are:

1. Load the dat/csv files into a staging table -- this step is done using external tables/SQL*Loader
2. Perform the basic validations on the data in the staging table
3. Derive any data required based on conditional logic and/or default some of the columns
4. Mark the valid records in the staging table
5. Pull all the valid records and insert them into the open interface tables/call the APIs
6. If the data is loaded into interface tables, run the standard interface program programmatically or submit it as part of a request set
7. Once the interface program runs, check whether any records are left in the open interface tables in an error state
8. Update the corresponding staging table records with the error status and error message
9. Pull all the error records from the staging table and generate an error records report with the corresponding error messages
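As a sketch of step 1, here is the external-table approach against a hypothetical staging table (all names here -- xx_item_stg, xx_item_ext, xx_conv_dir, the columns -- are illustrative, not from any standard):

```sql
-- Hypothetical staging table for an item conversion
CREATE TABLE xx_item_stg (
  segment1        VARCHAR2(40),              -- item number from the legacy file
  description     VARCHAR2(240),
  process_status  VARCHAR2(2) DEFAULT 'N',   -- N = newly loaded, not yet validated
  error_message   VARCHAR2(2000)
);

-- External table pointing at the client-provided CSV file
CREATE TABLE xx_item_ext (
  segment1    VARCHAR2(40),
  description VARCHAR2(240)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY xx_conv_dir              -- DB directory object mapped to the file location
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('items.csv')
);

-- Step 1: pull the file contents into the staging table
INSERT INTO xx_item_stg (segment1, description)
SELECT segment1, description FROM xx_item_ext;
```

The same load can equally be done with a SQL*Loader control file; external tables just keep the whole step inside the database.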

Typical conversion examples are items, customers, suppliers, etc.

One of the conversion approaches we took recently, where the data volume was very high, was to
divide the program into a two-step process:
1. Load the data
2. A program to validate and load the data

This process is very useful when we have many custom validations to run before loading data into the system,
so we developed the custom program to run in two modes:
1. Validation mode
2. Validation and insert mode

For this process we have a processed-status column in the staging table. We load the data with status N.
Once we validate the data, we mark the valid records as V and the error records as CE -- custom validation error.
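The status transitions above boil down to simple updates on the staging table (a sketch, assuming the hypothetical xx_item_stg table and an illustrative error message):

```sql
-- Record passed all custom validations: N -> V
UPDATE xx_item_stg
SET    process_status = 'V'
WHERE  rowid = :rec_rowid;

-- Record failed a custom validation: N -> CE, keeping the reason for the error report
UPDATE xx_item_stg
SET    process_status = 'CE',
       error_message  = 'Item number is missing'
WHERE  rowid = :rec_rowid;
```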

In validation mode we just validate the data loaded in the staging table, generate the error report based on the
custom validation logic, and mark the records as V or CE.

In validation and insert mode we pick all the records marked as V or N. The validation logic is performed only on records in
N status. We then load all the records marked as V into the open interface tables or call the APIs.
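The two-mode driver can be sketched as below -- a minimal outline, assuming the hypothetical xx_item_stg staging table with the N/V/CE statuses described above; the validation shown is a placeholder, and the interface insert is a deliberately incomplete example against the standard item open interface table:

```sql
-- Hypothetical two-mode conversion driver; table, procedure, and validation are illustrative
CREATE OR REPLACE PROCEDURE xx_item_conversion (
  p_mode IN VARCHAR2   -- 'VALIDATE' or 'VALIDATE_INSERT'
)
IS
BEGIN
  -- Validation pass: only records still in N status are validated
  FOR r IN (SELECT rowid rid, segment1
            FROM   xx_item_stg
            WHERE  process_status = 'N') LOOP
    IF r.segment1 IS NULL THEN   -- placeholder for the real custom validations
      UPDATE xx_item_stg
      SET    process_status = 'CE',
             error_message  = 'Item number is missing'
      WHERE  rowid = r.rid;
    ELSE
      UPDATE xx_item_stg
      SET    process_status = 'V'
      WHERE  rowid = r.rid;
    END IF;
  END LOOP;

  IF p_mode = 'VALIDATE_INSERT' THEN
    -- Insert pass: push every valid record into the open interface table
    -- (only a few of the required columns shown; organization_id etc. omitted)
    INSERT INTO mtl_system_items_interface
           (segment1, process_flag, transaction_type, set_process_id)
    SELECT segment1, 1, 'CREATE', 1
    FROM   xx_item_stg
    WHERE  process_status = 'V';
  END IF;

  COMMIT;
END xx_item_conversion;
```

Running it once in 'VALIDATE' mode produces the error report without touching the interface tables; rerunning in 'VALIDATE_INSERT' mode validates only the remaining N records and loads everything marked V.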

By following this process we avoid multiple iterations of the conversion process.

The conversion process itself is an iterative process where we clean the data by performing multiple mock conversions.
Adding validations:
The best approach is to have the concurrent program switch validations on and off whenever required.
Normally we have around 2-3 mock conversions before we load data into the production system.
If we have the option to switch off validations, we can switch off the ones that are very time consuming, because by the time we go to production
we would already have cleaned our data.
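That switch can simply be a Yes/No parameter on the concurrent program that guards the expensive checks -- a sketch, again assuming the hypothetical xx_item_stg table and illustrative validations:

```sql
-- Hypothetical validation routine; p_run_validations maps to a Yes/No
-- concurrent program parameter
CREATE OR REPLACE PROCEDURE xx_validate_items (
  p_run_validations IN VARCHAR2 DEFAULT 'Y'
)
IS
BEGIN
  -- Cheap structural checks always run
  UPDATE xx_item_stg
  SET    process_status = 'CE',
         error_message  = 'Item number is missing'
  WHERE  process_status = 'N'
  AND    segment1 IS NULL;

  -- Time-consuming cross-reference checks run only when switched on;
  -- by the final production run the data is already clean, so they can be skipped
  IF p_run_validations = 'Y' THEN
    UPDATE xx_item_stg s
    SET    process_status = 'CE',
           error_message  = 'Item already exists'
    WHERE  s.process_status = 'N'
    AND    EXISTS (SELECT 1
                   FROM   mtl_system_items_b m
                   WHERE  m.segment1 = s.segment1);
  END IF;

  -- Whatever survives is valid
  UPDATE xx_item_stg
  SET    process_status = 'V'
  WHERE  process_status = 'N';
END xx_validate_items;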
