Wednesday, 4 July 2012

Open Interfaces in APPS EBS - Best Practices

This article assumes that you have already read the article on data migration; if not, see the data migration article included at the end of this post.

What is common between Data Migration and interfaces?
Both essentially use APIs or open interface tables. See the data migration article at the end of this post.

What is the difference between data migration and conversion?
These are essentially the same.

What is the difference between data migration and interfaces?
Data migration is a one-time activity, whereas interfaces are ongoing processes that run regularly. Hence error handling must be thought through carefully when designing interfaces.

What are the best practices for interfaces?
1. An interface must have a well-defined error reporting mechanism.
2. Data errors must be fixed at the source once a transaction gets rejected.
3. If your support team is often modifying transactions in interface tables using SQL, then your interface design is flawed.
4. If you log support issues with any interface on a relatively regular basis, then it's time to revisit your design.
5. Interfaces must be designed so that, once the original errors are fixed, those transactions get re-processed during the next run (or on demand).
6. If possible, a common interface mechanism/infrastructure should be in place, so that all interfaces are written in a similar manner.
7. There must not be any hard coding for mapping source system data to EBS. You can either use Oracle's lookups screen to define mappings, or use a custom mapping screen.
8. In some cases you will have a generic screen which facilitates one-to-one, one-to-many, many-to-one and many-to-many mappings.
9. There must be a simple mechanism to detect duplicate processing of any transaction. One way this can be achieved is by making your source system specify a unique identifier for each record.
10. Do not store references to ROWID in any part of your interfaces.
11. Think upfront, and design and write your interface in such a manner that once it goes to production, you never receive an email to fix any stuck records. If you do receive data-fixing requests, then ensure the interface program is changed so that such cases get handled without programmer intervention in future.
12. Have proper debugging. You could add a parameter for a debug flag, so that debug messages aren't generated unnecessarily. And don't forget to use FND Logging, which is delivered out of the box by Oracle.
13. Make the concurrent program end with warning in case of errors. This can be done by setting retcode=1. The out-of-the-box concurrent program notification facility can then be used to inform end users of the errors encountered during processing. By doing so, you make monitoring automatic, i.e. when submitting/scheduling the interface concurrent program, you can attach a workflow role. A minimal sketch of points 12 and 13 follows this list.
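As a minimal sketch of points 12 and 13, assuming a hypothetical procedure name xx_process_interface and a hypothetical p_debug_flag parameter (fnd_file, fnd_log and the errbuf/retcode convention are standard EBS):

CREATE OR REPLACE PROCEDURE xx_process_interface
(
   errbuf       OUT VARCHAR2,
   retcode      OUT VARCHAR2,
   p_debug_flag IN  VARCHAR2 DEFAULT 'N'  -- debug parameter, so messages are not generated unnecessarily
)
IS
   l_error_count NUMBER := 0;
BEGIN
   -- FND Logging: written only when the current runtime log level allows it
   IF fnd_log.level_statement >= fnd_log.g_current_runtime_level THEN
      fnd_log.string (fnd_log.level_statement,
                      'xx.interface.xx_process_interface',
                      'Interface run started');
   END IF;

   -- ... pick up unprocessed records and call the API / populate the open interface here ...
   -- increment l_error_count for every record that fails validation or the API call

   IF p_debug_flag = 'Y' THEN
      fnd_file.put_line (fnd_file.log, 'Errored records in this run: ' || l_error_count);
   END IF;

   IF l_error_count > 0 THEN
      -- end the request with WARNING so the attached notification/workflow role is alerted
      errbuf  := l_error_count || ' record(s) errored; see the log for details';
      retcode := '1';
   ELSE
      retcode := '0';
   END IF;
END xx_process_interface;
/

Register the procedure as a concurrent program with the debug flag as an optional parameter; a retcode of 1 completes the request with Warning, which triggers the notification to the attached role.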


What tools can I use for point-to-point interfaces?
A point-to-point interface methodology is not ideal for a large organization. However, to keep costs low when the number of interfaces is very small, the point-to-point approach can be justified. Please note, though, that organizations do grow in size, especially after implementing Oracle ERP, hence point-to-point interfaces cannot be a long-term strategy.

The point-to-point (p2p) interface approaches are listed below:-
1. File tables (external tables)
   --You can map the structure of an ASCII file to a table structure; see the sketch after this list.
2. XML Gateway
   --You can use XML Gateway and XML Message Designer if you have many interfaces with XML sources. Keep in mind, though, that there may not be any future for XML Gateway in Fusion Applications.
3. SQL*Loader
   --Avoid this for interfaces, as its error reporting is not user friendly. Use it for data migration, where SQL*Loader's efficiency pays off.
4. DB links between different Oracle systems within the company
   --DBAs often raise questions regarding the security of database links. However, security concerns can be minimized by making the links non-public and connecting them to dedicated schemas with minimum privileges.
5. Java concurrent program with JDBC
   --If your source system is a database like SQL Server, DB2 or any other database that supports JDBC, this approach could be ideal. The source system can create a specialized schema and expose the necessary views or tables/synonyms, so that you can pull the data from that system and load it into your system using APIs. All of this can happen seamlessly, within one program.
6. Java concurrent program with FTP and a CSV parser
   --Again, all of the above steps can be done within one single program.
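As an illustrative sketch of the file-table approach in item 1 (the directory path, the directory name xx_inbound_dir, the file name and the table/column names are all assumptions):

CREATE OR REPLACE DIRECTORY xx_inbound_dir AS '/u01/app/inbound';

-- external table mapping a comma-delimited ASCII file to a relational structure
CREATE TABLE xx_cust_file_ext
(
   customer_id     VARCHAR2(30),
   customer_name   VARCHAR2(240),
   operating_name  VARCHAR2(240),
   operating_id    VARCHAR2(30)
)
ORGANIZATION EXTERNAL
(
   TYPE ORACLE_LOADER
   DEFAULT DIRECTORY xx_inbound_dir
   ACCESS PARAMETERS
   (
      RECORDS DELIMITED BY NEWLINE
      FIELDS TERMINATED BY ','
      MISSING FIELD VALUES ARE NULL
   )
   LOCATION ('xx_inbound.dat')
)
REJECT LIMIT UNLIMITED;

Once created, the file can be queried and joined like any other table (SELECT * FROM xx_cust_file_ext), which makes validation and error reporting far simpler than working with raw SQL*Loader rejects.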


Usual bad practices:-
1. Each developer writing their own mapping tables and screens.
2. Hard coding in interface
3. No thoughtful process for error reporting and error correction (deviating from the fix-at-source philosophy)
4. Calling SQL*Plus sessions from host programs without due consideration for APPS password security
5. No debugging
6. Each interface program using its own File Dequeue mechanism.


Long term interface strategy using Oracle EBS Delivered tools
Oracle Warehouse Builder [Preferably 10g Version]
This strategy means that you move away from a point-to-point interface design. I think a simple tool like Warehouse Builder can be used as a single point of interface control. Its powerful features mean that you can do mappings and transformations of data structures without writing your own code. Warehouse Builder appears to be a part of the Fusion strategy, hence it is a good investment. Warehouse Builder can pull data from various sources using ODBC, JDBC, DB links etc. Hence you can leverage a Warehouse Builder design to run an interface end-to-end, including submission of the interface API.
Limitation:- This is a batch mechanism, not a messaging-based one.

XML Gateway
If you want an Oracle-delivered tool to implement interfaces using messaging, you can use XML Gateway with message maps. But the investment in this tool may be lost when you move to a BPEL-based architecture.

Web Services
Wait for Fusion, which integrates BPEL, unless you wish to manage web services in a standalone manner.
==============

This article explains the steps to write programs for data migration in Oracle Apps.

Question : Why this article?
Answer : Because in every Oracle ERP implementation you have some level of data migration activity. I will try to explain the steps, followed by a real life example of data migration.

Question : How to do data migration?
Answer :
A. Prepare the source data
B. Design your staging tables and write code
C. Test Test Test....

Question : What is involved in the preparation stage of data migration in oracle apps?
Answer: For the preparation stage, the following must be done.
1. Understand the structure of the data being imported and also its business purpose.
2. Fully understand where the data will end up residing in Oracle Apps.
3. Find out if open interfaces or APIs exist in Oracle Apps to facilitate loading the required data.
4. Ensure that the lookup codes or setup in Oracle Apps support the values coming from the source system (a validation-query sketch follows this list).
5. Think about how the errors will be reported and managed.
6. Also think about how the transactions that fail migration will be re-tried. You may decide to knock up a simple screen if a high volume of transactions needs user intervention for cleansing.
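For point 4, a validation query of the following shape is often enough; the staging table xx_stg_customers, its payment_term column and the lookup type XX_LEGACY_PAYMENT_TERM_MAP are purely illustrative names:

-- list legacy values that have no enabled lookup code mapping them
SELECT DISTINCT stg.payment_term AS unmapped_legacy_value
  FROM xx_stg_customers stg
 WHERE NOT EXISTS
       (SELECT 1
          FROM fnd_lookup_values flv
         WHERE flv.lookup_type  = 'XX_LEGACY_PAYMENT_TERM_MAP'
           AND flv.language     = USERENV ('LANG')
           AND flv.enabled_flag = 'Y'
           AND flv.lookup_code  = stg.payment_term);

Any rows returned mean that either a mapping lookup or a setup value needs to be added before the migration run.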

Question: And how about the development stage during migration?
Answer: Do the following
1. Design the stages in which data will flow.
Traditionally, your legacy/source data will be loaded into some temporary (staging) tables in a raw (as-is) format/structure. Next you will transform the data into a physical structure that is similar to the structure of the Oracle Apps tables/APIs. And then you will call the API or populate the open interface tables.
2. To do the above, you will first write one-off scripts to create those staging tables and load data into them using SQL*Loader or some other method. If your source system is Oracle too, then I suggest you create a db link to pull the data into your staging tables.
All the steps pertaining to the transformation can be done via a PL/SQL concurrent program (unless you are using tools like Warehouse Builder).
3. Run the concurrent program that you would have written for the transformation of data. This will either populate your open interface tables or perform the migration directly if APIs are being used.
(Again, all this can be done via tools too, but here I am talking about a small migration activity, hence ignore the tool talk.)
4. Ensure that errors are logged into some error logging table.
If the data errors out in Oracle's open interface tables, then you could have a database view that unions your error table with Oracle's error logging table. A sketch of the staging and error tables follows.
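A minimal sketch of the staging table, the db-link pull and the error logging table described above; all table, column and db-link names (xx_stg_customers, legacy_db, xx_migration_errors) are assumptions for illustration:

-- one-off script: staging table holding legacy data "as is"
CREATE TABLE xx_stg_customers
(
   legacy_cust_id   VARCHAR2(30),
   legacy_cust_name VARCHAR2(240),
   process_flag     VARCHAR2(1) DEFAULT 'N',   -- N = new, P = processed, E = errored
   error_message    VARCHAR2(2000)
);

-- if the source system is also Oracle, pull the data over a database link
INSERT INTO xx_stg_customers (legacy_cust_id, legacy_cust_name)
   SELECT cust_id, cust_name
     FROM customers@legacy_db;   -- legacy_db is a hypothetical db link
COMMIT;

-- simple custom error logging table; a view can union this with Oracle's interface error tables
CREATE TABLE xx_migration_errors
(
   legacy_cust_id VARCHAR2(30),
   error_source   VARCHAR2(30),
   error_message  VARCHAR2(2000),
   creation_date  DATE DEFAULT SYSDATE
);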

Question: How about testing?
Answer: You will figure out the following at this stage, from the records that error out:
1. Missing mapping codes
2. Missing setups in Oracle Apps
Testing will most likely be repetitive, until a migration run leaves behind only a very low volume of errored records, or none at all.

Question : If I get a comma-delimited file, how will I load that into tables?
Answer : You can use SQL*Loader, a Java program with a CSV parser, or a file-based (external) table approach.

Question : In Oracle Apps, do we usually receive an XML data file during migration?
Answer : Keep in mind that you usually migrate data from mainframe or standalone systems which now stand outdated.
Such systems usually produce comma-delimited or tab-delimited files.

Question : OK, once the data from the source system has been loaded using SQL*Loader, what next?
Answer: The data model of the source data may or may not comply with the data model in Oracle Apps. Hence you need a transformation step in most cases. This is explained below with an example of the TCA API to migrate customers/parties.

Question : Should I not ask the legacy system people to transform data as per our requirements?
Answer : No, don't bother doing so, for the reasons below:
A. Techies of the legacy system will not be happy that their system is being made redundant. Hence don't expect much value addition from them.
B. There is a possibility that, in an attempt to transform the data to Oracle's data structure, they might introduce faults/bugs.
If the transformation bugs are encountered at your end in Apps, you can fix them yourself. However, if the legacy team does the transformation for you, then you become dependent on them for bug fixes. Eventually, tired of waiting on them, you might end up doing the transformation yourself anyway.
C. If your design for the transformation changes, you should not be dependent upon the legacy system... just fix it yourself at your own end in Apps.

Question : If we end up using interface APIs in Apps for data migration, then where lies the difference between migration and interfaces?
Answer : Migration is a one-off activity, even though it uses the same set of tables/APIs as interfaces do.

Question : Enough explanation, now give me an example.
Answer : Let's assume you get a task to migrate customers from a legacy system into TCA (or call it AR - Receivables for simplicity).
Your file from the source is:-
lcust.dat
1000,GE, GE Capital,1000-1
1001,Cisco,Cisco Routers,1001-1
1002,Barclays,Barclays Investments,1001-1
1003,Barclays,Barclays Mortgages,1001-2

The format of the data is: customer id, customer parent name, customer operating company name, operating company id.

Step 1. FTP this file to your database server and run SQL*Loader to load the file into a table named xx_legacy_cust; this table will have four columns, one column for each field in the source file (a minimal control-file sketch follows).
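A minimal SQL*Loader control-file sketch for this step; the column names of xx_legacy_cust are assumptions:

-- lcust.ctl : loads lcust.dat into XX_LEGACY_CUST
LOAD DATA
INFILE 'lcust.dat'
TRUNCATE
INTO TABLE xx_legacy_cust
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  parent_cust_name,
  operating_cust_name,
  operating_cust_id
)

You would invoke it on the server with something like: sqlldr userid=apps control=lcust.ctl log=lcust.log (you will be prompted for the password).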
Step 2. Transform this data...
For this we create two tables...
XX_TRNSFRM_PARTY
   --parent_cust_Id
   --parent_cust_name
XX_TRNSFRM_CUST_ACCOUNT
   --parent_cust_Id
   --operating_cust_Id
   --operating_cust_name
Now, you can write a PL/SQL program (or plain SQL, as sketched below) to split the data from table xx_legacy_cust into the two tables listed above.
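A sketch of that split in plain SQL; the xx_legacy_cust column names follow the control-file sketch above and are assumptions, and the grouping should be adjusted if one parent spans several legacy customer ids:

-- distinct parents go into XX_TRNSFRM_PARTY
INSERT INTO xx_trnsfrm_party (parent_cust_id, parent_cust_name)
   SELECT DISTINCT customer_id, parent_cust_name
     FROM xx_legacy_cust;

-- operating companies go into XX_TRNSFRM_CUST_ACCOUNT, keyed by the parent
INSERT INTO xx_trnsfrm_cust_account (parent_cust_id, operating_cust_id, operating_cust_name)
   SELECT customer_id, operating_cust_id, operating_cust_name
     FROM xx_legacy_cust;

COMMIT;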

Step 3. Now write the PL/SQL program to migrate this data into Oracle (pseudo code; the API parameter lists are elided).
CREATE OR REPLACE PROCEDURE xx_migrate_parties (errbuf  OUT VARCHAR2,
                                                retcode OUT VARCHAR2)
IS
   l_return_status VARCHAR2 (1);
   l_msg_count     NUMBER;
   l_msg_data      VARCHAR2 (2000);
BEGIN
   FOR p_party_rec IN (SELECT * FROM xx_trnsfrm_party)
   LOOP
      -- call the TCA API to create the party (e.g. hz_party_v2pub.create_organization)
      hz_party_v2pub.create_organization (..parameters…);

      FOR p_acct_rec IN (SELECT *
                           FROM xx_trnsfrm_cust_account
                          WHERE parent_cust_id = p_party_rec.parent_cust_id)
      LOOP
         -- now create a customer account for this party
         hz_cust_account_v2pub.create_cust_account (..parameters…
                                                    l_return_status,
                                                    l_msg_count,
                                                    l_msg_data);

         -- write any API messages to the concurrent request log
         IF l_msg_count > 1 THEN
            FOR i IN 1 .. l_msg_count
            LOOP
               fnd_file.put_line (fnd_file.log,
                                  i || '. '
                                  || SUBSTR (fnd_msg_pub.get (p_encoded => fnd_api.g_false), 1, 255));
            END LOOP;
         END IF;
      END LOOP;
   END LOOP;
END xx_migrate_parties;

Now some notes:-
1. No two data migrations are exactly the same, hence I have given pseudo code, and not the actual code.
2. In some cases the quality of the source data can be so bad that you may need a third party to cleanse the data before you run migration.
3. The API's used for data migration and for developing open interfaces are the same.
