SAP Connector Guide - Informatica Knowledge Base

Informatica Cloud (Version Fall 2015)
SAP Connector Guide
Informatica Cloud SAP Connector Guide
Version Fall 2015
December 2015
Copyright (c) 1993-2015 Informatica LLC. All rights reserved.
This software and documentation contain proprietary information of Informatica LLC and are provided under a license agreement containing restrictions on use and
disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any
form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC. This Software may be protected by U.S. and/or
international Patents and other Patents Pending.
Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as
provided in DFARS 227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14
(ALT III), as applicable.
The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us
in writing.
Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange,
PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange, Informatica
On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging and
Informatica Master Data Management are trademarks or registered trademarks of Informatica LLC in the United States and in jurisdictions throughout the world. All
other company and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights
reserved. Copyright © Sun Microsystems. All rights reserved. Copyright © RSA Security Inc. All Rights Reserved. Copyright © Ordinal Technology Corp. All rights
reserved. Copyright © Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright © Meta
Integration Technology, Inc. All rights reserved. Copyright © Intalio. All rights reserved. Copyright © Oracle. All rights reserved. Copyright © Adobe Systems
Incorporated. All rights reserved. Copyright © DataArt, Inc. All rights reserved. Copyright © ComponentSource. All rights reserved. Copyright © Microsoft Corporation. All
rights reserved. Copyright © Rogue Wave Software, Inc. All rights reserved. Copyright © Teradata Corporation. All rights reserved. Copyright © Yahoo! Inc. All rights
reserved. Copyright © Glyph & Cog, LLC. All rights reserved. Copyright © Thinkmap, Inc. All rights reserved. Copyright © Clearpace Software Limited. All rights
reserved. Copyright © Information Builders, Inc. All rights reserved. Copyright © OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved.
Copyright Cleo Communications, Inc. All rights reserved. Copyright © International Organization for Standardization 1986. All rights reserved. Copyright © ej-technologies GmbH. All rights reserved. Copyright © Jaspersoft Corporation. All rights reserved. Copyright © International Business Machines Corporation. All rights
reserved. Copyright © yWorks GmbH. All rights reserved. Copyright © Lucent Technologies. All rights reserved. Copyright (c) University of Toronto. All rights reserved.
Copyright © Daniel Veillard. All rights reserved. Copyright © Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright © MicroQuill Software Publishing, Inc. All
rights reserved. Copyright © PassMark Software Pty Ltd. All rights reserved. Copyright © LogiXML, Inc. All rights reserved. Copyright © 2003-2010 Lorenzi Davide, All
rights reserved. Copyright © Red Hat, Inc. All rights reserved. Copyright © The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright
© EMC Corporation. All rights reserved. Copyright © Flexera Software. All rights reserved. Copyright © Jinfonet Software. All rights reserved. Copyright © Apple Inc. All
rights reserved. Copyright © Telerik Inc. All rights reserved. Copyright © BEA Systems. All rights reserved. Copyright © PDFlib GmbH. All rights reserved. Copyright ©
Orientation in Objects GmbH. All rights reserved. Copyright © Tanuki Software, Ltd. All rights reserved. Copyright © Ricebridge. All rights reserved. Copyright © Sencha,
Inc. All rights reserved. Copyright © Scalable Systems, Inc. All rights reserved. Copyright © jQWidgets. All rights reserved. Copyright © Tableau Software, Inc. All rights
reserved. Copyright© MaxMind, Inc. All Rights Reserved. Copyright © TMate Software s.r.o. All rights reserved. Copyright © MapR Technologies Inc. All rights reserved.
Copyright © Amazon Corporate LLC. All rights reserved. Copyright © Highsoft. All rights reserved. Copyright © Python Software Foundation. All rights reserved.
Copyright © BeOpen.com. All rights reserved. Copyright © CNRI. All rights reserved.
This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various versions
of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or agreed to in
writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software
copyright © 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License
Agreement, which may be found at http:// www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any
kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose.
The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California,
Irvine, and Vanderbilt University, Copyright (©) 1993-2006, all rights reserved.
This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and
redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html.
This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg, <daniel@haxx.se>. All Rights Reserved. Permissions and limitations regarding this
software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or
without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
The product includes software copyright 2001-2005 (©) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.dom4j.org/ license.html.
The product includes software copyright © 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to
terms available at http://dojotoolkit.org/license.
This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations
regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html.
This product includes software copyright © 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at
http:// www.gnu.org/software/ kawa/Software-License.html.
This product includes OSSP UUID software which is Copyright © 2002 Ralf S. Engelschall, Copyright © 2002 The OSSP Project Copyright © 2002 Cable & Wireless
Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php.
This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are
subject to terms available at http:/ /www.boost.org/LICENSE_1_0.txt.
This product includes software copyright © 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at
http:// www.pcre.org/license.txt.
This product includes software copyright © 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http:// www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php.
This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://
www.stlport.org/doc/ license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://
httpunit.sourceforge.net/doc/ license.html, http://jung.sourceforge.net/license.txt , http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/
license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/licenseagreements/fuse-message-broker-v-5-3- license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html;
http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; . http://www.w3.org/Consortium/Legal/
2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://
forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://
www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://
www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/
license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://
www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/CreateJS/EaselJS/blob/master/src/easeljs/display/Bitmap.js;
http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE; http://jdbc.postgresql.org/license.html; http://
protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/LICENSE; http://web.mit.edu/Kerberos/krb5current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/LICENSE; https://github.com/hjiang/jsonxx/
blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/LICENSE; http://one-jar.sourceforge.net/index.php?
page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html; https://github.com/tinkerpop/
blueprints/blob/master/LICENSE.txt; http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html; https://aws.amazon.com/asl/; https://github.com/
twbs/bootstrap/blob/master/LICENSE; https://sourceforge.net/p/xmlunit/code/HEAD/tree/trunk/LICENSE.txt; https://github.com/documentcloud/underscore-contrib/blob/
master/LICENSE, and https://github.com/apache/hbase/blob/master/LICENSE.txt.
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution
License (http://www.opensource.org/licenses/cddl1.php) the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License
Agreement Supplemental License Terms, the BSD License (http:// www.opensource.org/licenses/bsd-license.php), the new BSD License (http://opensource.org/
licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0) and the Initial Developer's Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).
This product includes software copyright © 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this
software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab.
For further information please visit http://www.extreme.indiana.edu/.
This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject
to terms of the MIT license.
See patents at https://www.informatica.com/legal/patents.html.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is
subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT
INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT
LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
Part Number: IC-SAPCG-23000-0001
Table of Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Informatica Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Informatica Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Informatica Cloud Web Site. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Informatica Cloud Communities. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Informatica Cloud Marketplace. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Informatica Cloud Connector Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Informatica Knowledge Base. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Informatica Cloud Trust Site. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Informatica Global Customer Support. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Part I: Introduction to SAP Connector. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Chapter 1: Introduction to SAP Connector. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
SAP Connector Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica Cloud and SAP Integration Methods. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Data Integration using SAP Tables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Data Integration Using BAPI/RFC Functions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Data Integration Using IDocs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Communication Interfaces. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
SAP Metadata Utility. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Part II: SAP Connector Administration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Chapter 2: SAP Connector Administration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
SAP Connector Administration Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
SAP Table Connector Administration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
Step 1. Downloading and Installing the Microsoft Visual C++ Redistributable. . . . . . . . . . . . 17
Step 2. Downloading and Configuring the Libraries for Table Read and Write. . . . . . . . . . . . 17
Step 3. Configuring saprfc.ini. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Step 4. Configuring SAP User Authorization. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Step 5. Installing SAP Table Connection Transport Files. . . . . . . . . . . . . . . . . . . . . . . . . 21
Step 6: Configuring HTTPS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
SAP IDocs and RFCs/BAPI Connector Administration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Step 1. Downloading and Installing the Microsoft Visual C++ Redistributable. . . . . . . . . . . . 25
Step 2. Downloading and Configuring SAP Libraries for IDoc and BAPI/RFC. . . . . . . . . . . . 26
Step 3. Configuring saprfc.ini. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Step 4. Defining SAP Connector as a Logical System in SAP. . . . . . . . . . . . . . . . . . . . . . 29
Step 5. Configuring SAP User Authorizations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Step 6. Installing and Configuring the SAP Metadata Utility. . . . . . . . . . . . . . . . . . . . . . . . 33
Part III: Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Chapter 3: SAP Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
SAP Connections Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
SAP Table Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
SAP Table Connection Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
SAP Connection Rules and Guidelines. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
SAP IDoc and BAPI/RFC Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
SAP RFC/BAPI Interface Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
SAP IDoc Reader Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
SAP IDoc Writer Connection Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
Creating an SAP Table Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Creating an SAP IDoc Reader Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Creating an SAP IDoc Writer or SAP RFC/BAPI Interface Connection . . . . . . . . . . . . . . . . . . . 41
Chapter 4: Troubleshooting. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Troubleshooting Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
SAP Table Connection Errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Part IV: Data Integration Using SAP Table. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Chapter 5: SAP Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
SAP Tables and Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Rules and Guidelines for SAP Sources and Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Chapter 6: Data Synchronization Tasks with SAP Table. . . . . . . . . . . . . . . . . . . . 48
Data Synchronization Tasks with SAP Table Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
SAP Table Sources in Data Synchronization Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
SAP Table Lookups in Data Synchronization Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Configuring a Data Synchronization Task with a Single SAP Object as the Source. . . . . . . . . . . . 50
Configuring a Data Synchronization Task with Multiple SAP Objects as the Source. . . . . . . . . . . 52
Monitoring a Data Synchronization Task. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Data Synchronization Task Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Step 1: Define the Data Synchronization Task. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Step 2: Configure the SAP Table Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Step 3: Configure the Flat File Target. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Step 4: Configure the Field Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Chapter 7: Mappings and Mapping Configuration Tasks with SAP Table. . . . . . . 59
Mapping and Mapping Configuration Tasks with SAP Table Overview. . . . . . . . . . . . . . . . . . . . 59
SAP Table Sources in Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
SAP Table Lookups in Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Configuring a Mapping with an SAP Table Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Creating a Mapping Configuration Task. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Mapping with an SAP Table Source Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Step 1: Define the Mapping. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Step 2: Configure the SAP Table Source. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Step 3: Configure the Flat File Target. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Step 4: Save the Mapping and Create a Mapping Configuration Task. . . . . . . . . . . . . . . . . 66
Part V: Data Integration Using BAPI/RFC Functions. . . . . . . . . . . . . . . . . . . . . . . . . . . 68
Chapter 8: BAPI/RFC Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
BAPI/RFC Mapplets Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
BAPI/RFC Mapplet Parameters. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
BAPI/RFC Parameter Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
BAPI/RFC Functions with Nested Structures. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
System Variables. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Integration ID in BAPI/RFC Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Target Object for BAPI/RFC Error Output. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Rules and Guidelines for BAPI/RFC Mapplets in Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Importing BAPI/RFC Metadata. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Chapter 9: Mapping and Mapping Configuration Tasks Using BAPI/RFC Functions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Mapping and Mapping Configuration Tasks Using BAPI/RFC Functions Overview. . . . . . . . . . . . 75
Importing a BAPI/RFC Mapplet to Informatica Cloud. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
Configuring a Mapping with a BAPI/RFC Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
Mappings with BAPI/RFC Function Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Step 1: Importing BAPI_SALESORDER_CREATEFROMDAT1 Metadata. . . . . . . . . . . . . . . 78
Step 2: Importing the BAPI_SALESORDER_CREATEFROMDAT1 Mapplet to Informatica Cloud. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
Step 3: Configuring a Mapping with the bapi_salesorder_createfromdat1 Mapplet. . . . . . . . . 82
Part VI: Data Integration Using IDocs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Chapter 10: IDoc Mapplets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
IDoc Mapplets Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
Segments and Groups. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
Segment and Group Status. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
IDocs Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Outbound Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Outbound Mapplet Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Target Object for Outbound Mapplet Error Output. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
Inbound Mapplet. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
Key Fields and Control Record Fields. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
IDoc Primary and Foreign Keys. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Importing IDoc Metadata. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Chapter 11: Mapping and Mapping Configuration Tasks Using IDocs. . . . . . . . . 95
Mapping and Mapping Configuration Tasks Using IDocs Overview. . . . . . . . . . . . . . . . . . . . . . 95
IDoc Reader Sources in Mappings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Importing an IDoc Mapplet to Informatica Cloud. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Configuring an Outbound Mapping to Read IDocs from SAP. . . . . . . . . . . . . . . . . . . . . . . . . . 97
Configuring an Inbound Mapping to Write IDocs to SAP. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Outbound Mapping to Read IDocs from SAP Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
Step 1: Importing MATMAS IDoc Metadata. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
Step 2: Importing the MATMAS04_Interpreter_Mapping Mapplet to Informatica Cloud. . . . . . 102
Step 3: Configuring an Outbound Mapping with the MATMAS IDoc. . . . . . . . . . . . . . . . . . 103
Inbound Mapping to Write IDocs To SAP Example. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Step 1: Importing MATMAS IDoc Metadata. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
Step 2: Importing the MATMAS03_Prepare_Mapping Mapplet to Informatica Cloud. . . . . . . 107
Step 3: Configuring an Inbound Mapping with the MATMAS IDoc. . . . . . . . . . . . . . . . . . . 107
Appendix A: SAP Data Type Reference. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
SAP Data Type Reference Overview. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
SAP and Transformation Data Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
Preface
The Informatica Cloud SAP Connector Guide contains information about how to set up and use SAP
Connector. The guide explains how organization administrators and business users can use SAP Connector
to read data from and write data to SAP.
Informatica Resources
Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you
have questions, comments, or ideas about this documentation, contact the Informatica Documentation team
through email at infa_documentation@informatica.com. We will use your feedback to improve our
documentation. Let us know if we can contact you regarding your comments.
The Documentation team updates documentation as needed. To get the latest documentation for your
product, navigate to Product Documentation from https://mysupport.informatica.com.
Informatica Cloud Web Site
You can access the Informatica Cloud web site at http://www.informatica.com/cloud. This site contains
information about Informatica Cloud editions and applications.
Informatica Cloud Communities
Use the Informatica Cloud Community to discuss and resolve technical issues in Informatica Cloud. You can
also find technical tips, documentation updates, and answers to frequently asked questions.
Access the Informatica Cloud Community at:
http://www.informaticacloud.com/community
To find resources on using Cloud Application Integration (the Informatica Cloud Real Time service), access
the community at:
https://community.informatica.com/community/products/informatica_cloud/application_integration
Developers can learn more and share tips at the Cloud Developer community:
http://www.informaticacloud.com/devcomm
Informatica Cloud Marketplace
Visit the Informatica Marketplace to try and buy Informatica Cloud Connectors, Informatica Cloud integration
templates, and Data Quality mapplets:
https://community.informatica.com/community/marketplace/informatica_cloud_mall
Informatica Cloud Connector Documentation
You can access documentation for Informatica Cloud Connectors at the Informatica Cloud Community:
https://community.informatica.com/cloud/index.htm
You can also download individual connector guides: https://community.informatica.com/docs/DOC-2687.
Informatica Knowledge Base
As an Informatica customer, you can access the Informatica Knowledge Base at
https://mysupport.informatica.com. Use the Knowledge Base to search for documented solutions to known
technical issues about Informatica products. You can also find answers to frequently asked questions,
technical white papers, and technical tips. If you have questions, comments, or ideas about the Knowledge
Base, contact the Informatica Knowledge Base team through email at KB_Feedback@informatica.com.
Informatica Cloud Trust Site
You can access the Informatica Cloud trust site at http://trust.informaticacloud.com. This site provides real
time information about Informatica Cloud system availability, current and historical data about system
performance, and details about Informatica Cloud security policies.
Informatica Global Customer Support
You can contact a Customer Support Center by telephone or online.
For online support, click Submit Support Request in the Informatica Cloud application. You can also use
Online Support to log a case. Online Support requires a login. You can request a login at
https://mysupport.informatica.com.
The telephone numbers for Informatica Global Customer Support are available from the Informatica web site
at https://www.informatica.com/services-and-training/support-services/contact-us.html.
Part I: Introduction to SAP
Connector
This part contains the following chapter:
• Introduction to SAP Connector, 11
CHAPTER 1
Introduction to SAP Connector
This chapter includes the following topics:
• SAP Connector Overview, 11
• Informatica Cloud and SAP Integration Methods, 11
• Communication Interfaces, 13
• SAP Metadata Utility, 13
SAP Connector Overview
You can use SAP Connector to integrate with SAP systems in batch, asynchronous, or synchronous modes
based on your requirements.
Informatica Cloud supports ABAP, IDoc read, IDoc write, or BAPI/RFC functions to integrate with SAP
systems. You can choose one of four SAP connection types to connect to SAP systems based on the
interface requirements.
You can use the SAP connection in Data Synchronization tasks, mappings, and Mapping Configuration tasks.
Create a Data Synchronization task to synchronize data between a source and target. Create a Mapping
Configuration task to process data based on the data flow logic defined in a mapping or integration template.
Informatica Cloud and SAP Integration Methods
SAP is an application platform that integrates multiple business applications and solutions, such as Customer
Relationship Management (CRM), Advanced Planner and Optimizer (APO), and Bank Analyzer. Developers
can add business logic within SAP using Java 2 Enterprise Edition (J2EE) or Advanced Business Application
Programming-Fourth Generation (ABAP/4 or ABAP), a language proprietary to SAP.
You can use the SAP Table, BAPI/RFC functions, and IDocs integration methods to extract data from or load
data to SAP systems.
Use SAP Table for data-level integration, BAPI/RFC functions for object-level integration, and IDocs for
message-level integration.
Data Integration using SAP Tables
You can integrate SAP data dictionary objects by creating an SAP Table connection.
You can use an SAP Table connection to read data from SAP and write to any target. You can also write data
from any source to custom tables in SAP. Contact Informatica Global Customer Support for information about using
an SAP Table connection to write data to SAP systems.
You can access all the objects in the SAP catalog, including transparent tables, cluster tables, pool tables,
and views. The Secure Agent accesses data through the application layer in SAP using ABAP. Data is
streamed to the Secure Agent through the HTTP(S) protocol. SAP Table Connector supports joins and filters on
the source tables.
To optimize performance when the Secure Agent and the SAP system are in different networks, you can
enable data compression when you read data from SAP.
When you create a Data Synchronization task, mapping, or Mapping Configuration task, Informatica Cloud
generates a dynamic ABAP query to read from SAP tables and write to custom SAP tables.
Data Integration Using BAPI/RFC Functions
Business Application Programming Interfaces (BAPI) provide a way for third-party applications to
synchronously integrate with SAP at the object level. You use BAPIs to read, create, change, or delete data
in SAP.
BAPIs allow access to the SAP system objects through methods for the business object types. Together with
the business object types, BAPIs define and document the interface standard at the business level.
You define BAPIs in the SAP Business Objects Repository. You can call BAPIs as an ABAP program within
SAP or from any external application. SAP Connector uses the RFC protocol to call BAPI/RFC functions outside of
SAP.
You can import a BAPI/RFC function as a mapplet to Informatica Cloud. You can then use the mapplet in a
mapping to read, create, change, or delete data in SAP. When you run the mapping or the Mapping
Configuration task, Informatica Cloud makes the RFC function calls to SAP to process data synchronously.
You can view and test the BAPI interface definitions in SAP using transaction SE37.
Data Integration Using IDocs
Intermediate Documents (IDocs) electronically exchange data between SAP applications or between SAP
applications and external programs. IDoc is a message-based integration interface that processes data
asynchronously.
IDoc is a component of the Application Link Enabling (ALE) module within SAP that can send and receive
Intermediate Documents (IDocs) over the RFC protocol.
ALE Layers
The message-based architecture of ALE comprises three layers:
• Application layer that provides ALE an interface to SAP to send or receive messages from external systems.
• Distribution layer that filters and converts messages to ensure that they are compatible between different SAP releases.
• Communications layer that enables ALE to support synchronous and asynchronous communication. You use IDocs for asynchronous communication.
The architecture of ALE provides a way to send IDocs as text files without connecting to a central database.
Applications can communicate with each other without converting between formats to accommodate
hardware or platform differences.
IDoc Record Types
IDocs contain three record types:
• Control record, which identifies the message type.
• Data records that contain the IDoc data in segments.
• Status records that describe the status of the IDoc. Status record names are the same for each IDoc type.
ALE Components
ALE has the following components:
• Logical System. All systems that need to communicate using ALE/IDoc must be set up as a logical system within SAP. An SAP administrator can set up logical systems in transaction BD54.
• Distribution Model. Defines an agreement between two logical systems on the messages that can be exchanged and identifies the sender and the receiver. An SAP administrator can set up distribution models in transaction BD64.
• Partner Profile. Stores the IDoc type and processing logic related to the distribution model. An SAP administrator can set up partner profiles in transaction WE20.
• RFC Destination. Defines the protocol and access to the logical system. An SAP administrator can set up RFC destinations in transaction SM59.
• Message Type. Representation of a business object.
• IDoc Type. Representation of a message type. SAP uses IDoc types to support backward compatibility across various SAP releases.
• IDoc. An instance of an IDoc type that contains business data.
Communication Interfaces
SAP uses TCP/IP as the native communication interface to communicate with Informatica Cloud.
SAP also uses the Remote Function Call (RFC) communication protocol to communicate with Informatica
Cloud. To execute remote calls from Informatica Cloud, SAP requires connection information and the service
name and gateway on the application server. The service and gateway parameters and the connection
information are stored in a configuration file named saprfc.ini on the Secure Agent machine.
SAP Metadata Utility
You can use the SAP Metadata utility on the Windows operating system to import metadata from standard
BAPIs, custom RFCs, and IDoc messages through Informatica Cloud mapplets.
The utility generates an Informatica Cloud mapplet XML file based on the API functionality that you specify.
Import the mapplet to Informatica Cloud and use the mapplet in a mapping.
Use the utility to import one BAPI/RFC function or IDoc message at a time. After you import the metadata,
you can stay connected to the same SAP system destination to import more than one BAPI/RFC or IDoc in
the same session. To import metadata from a different destination, end the session and launch the utility
again.
By default, the utility writes the XML file to the <Utility Installation Directory>/generatedMappings
folder. However, you can configure the location of the output file.
Note: You do not need to configure SAP or install SAP transports to use the SAP Metadata utility.
Part II: SAP Connector
Administration
This part contains the following chapter:
• SAP Connector Administration, 16
CHAPTER 2
SAP Connector Administration
This chapter includes the following topics:
• SAP Connector Administration Overview, 16
• SAP Table Connector Administration, 16
• SAP IDocs and RFCs/BAPI Connector Administration, 25
SAP Connector Administration Overview
SAP Connector requires configuration on the machine that hosts the Secure Agent and also on the SAP
systems. The administrators for each of these systems must perform the configuration tasks for their
respective systems.
SAP Table Connector Administration
Before users can use an SAP Table connection to process SAP table data, an SAP Administrator must
perform the following tasks:
1. Download and install the Microsoft Visual C++ redistributable.
2. Download and configure the SAP libraries for SAP Table read and write.
3. Configure the saprfc.ini file.
4. Configure SAP user authorization.
5. Install transport files.
6. Configure HTTPS.
After the administrator has performed the configuration, users can set up and use an SAP table connection in
Data Synchronization and Mapping Configuration tasks.
Step 1. Downloading and Installing the Microsoft Visual C++
Redistributable
If you do not have Microsoft Visual C++ (VC++) installed, download and install the Microsoft Visual C++
redistributable (x86) from the Microsoft website. You can then run applications developed with VC++.
1. Download and install two versions of the VC++ redistributable.
For more information about this issue from SAP, see SAP Note 684186 on the SAP website: http://service.sap.com/notes.
The following list shows the Secure Agent system and the related VC++ redistributable versions to install:
• Windows 7 64-bit: Microsoft VC++ 2008 (http://www.microsoft.com/en-us/download/details.aspx?id=29) and Microsoft VC++ 2010 (http://www.microsoft.com/en-us/download/details.aspx?id=5555)
• Windows Server 2008 64-bit: Microsoft VC++ 2008 (http://www.microsoft.com/en-us/download/details.aspx?id=29) and Microsoft VC++ 2010 (http://www.microsoft.com/en-us/download/details.aspx?id=5555)
• Windows 7 32-bit or Windows Server 2008 R2 32-bit: Microsoft VC++ 2005 (http://www.microsoft.com/en-us/download/details.aspx?id=14431) and Microsoft VC++ 2008 (http://www.microsoft.com/en-us/download/details.aspx?id=29)
• Linux: Not required.
Verify that your system meets the requirements for each redistributable version you install.
2. Restart the Secure Agent after installation.
Step 2. Downloading and Configuring the Libraries for Table Read
and Write
Before you can use an SAP Table connection, download and configure the SAP libraries. Install and
configure the SAP libraries on the Secure Agent machine.
The libraries that you use are based on whether you want to read from SAP tables or write to SAP tables.
Downloading and Configuring Libraries to Read from SAP Tables
Download the SAP JCo libraries, configure the PATH system variable, and copy a library to the Secure
Agent. Contact Informatica Global Customer Support if you encounter any issues when you download the
libraries.
1. Go to the SAP Service Marketplace: http://service.sap.com/connectors.
Note: You will need SAP credentials to access the Service Marketplace.
2. Download the appropriate installer for the 32-bit or 64-bit SAP JCo libraries and unzip the following file:
• Windows: sapjco3-NTintel-3.0.11.zip
• Linux: sapjco3-linuxintel-3.0.11.tgz
Note: When the Secure Agent runs on a 32-bit machine, download the 32-bit SAP JCo libraries. When the Secure Agent runs on a 64-bit machine, download the 64-bit SAP JCo libraries. Verify that you download the most recent version of the libraries.
3. Set the PATH environment variable to the location of the unzipped file.
4. Copy the sapjco3.jar file to the following Secure Agent directories:
<Secure Agent Installation Directory>/main/tomcat/plugins/300620
<Secure Agent Installation Directory>/main/bin/rdtm/javalib
The <Secure Agent Installation Directory>/main/tomcat/plugins/300620 directory is created after Informatica enables the SAP Table Connector license for the organization. A sketch of steps 3 and 4 on a Windows Secure Agent machine appears after this procedure.
5. Restart the Secure Agent.
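The following is a minimal sketch of steps 3 and 4 on a Windows Secure Agent machine. The unzip location C:\sapjco3 and the Secure Agent installation path C:\Informatica Cloud Secure Agent are assumptions for illustration only; substitute the paths used in your environment.
set PATH=%PATH%;C:\sapjco3
copy C:\sapjco3\sapjco3.jar "C:\Informatica Cloud Secure Agent\main\tomcat\plugins\300620"
copy C:\sapjco3\sapjco3.jar "C:\Informatica Cloud Secure Agent\main\bin\rdtm\javalib"
The set command changes PATH only for the current command prompt session. To make the change permanent, edit the system PATH variable in the Windows environment variable settings, and then restart the Secure Agent.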
Downloading and Configuring Libraries to Write to SAP Tables
Download and configure the SAP RFC SDK 7.2 libraries. Contact Informatica Global Customer Support if you
encounter any issues when you download the libraries.
Note: If you performed this step for an SAP IDoc or RFC/BAPI connection, you do not need to do it again.
1. Go to the SAP Service Marketplace: http://service.sap.com
Note: You must have SAP credentials to access the Service Marketplace.
2. Download the Classic RFC SDK Unicode 7.2 libraries for the Secure Agent system.
Verify that you use the Classic RFC SDK libraries and not the NetWeaver RFC SDK libraries.
SAP provides the RFC SDK Unicode 7.2 libraries in the following service archive (SAR) files:
• Windows 64-bit: RFC_10-10009747.SAR
• Windows 32-bit: RFC_10-10009746.SAR
• Linux 64-bit: RFC_10-10009745.SAR
• Linux 32-bit: RFC_10-10009742.SAR
Use the most recent version available. The SAP file name might vary based on the version.
3. Use the SAPCAR.exe utility to unzip the SAR file.
4. Copy the files in the lib directory to the following directory:
<Secure Agent installation directory>\main\bin\rdtm
5. Set the following permissions for each RFC SDK library:
• Read, write, and execute permissions for the current user.
• Read and execute permissions for all other users.
For an example of steps 3 through 5 on a Linux Secure Agent machine, see the sketch after this procedure.
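The following is a minimal sketch of steps 3 through 5 on a Linux 64-bit Secure Agent machine. The SAR file name, the working directory, the Secure Agent installation path /opt/infaagent, and the librfc* file name pattern are assumptions for illustration; use the file names and paths from your own download and installation.
SAPCAR -xvf RFC_10-10009745.SAR
cp lib/* /opt/infaagent/main/bin/rdtm/
chmod 755 /opt/infaagent/main/bin/rdtm/librfc*
The chmod 755 setting gives the owner read, write, and execute permissions and gives all other users read and execute permissions, which matches the permissions listed in step 5.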
Step 3. Configuring saprfc.ini
SAP uses the Remote Function Call (RFC) communications protocol to communicate with other systems. To
enable the Secure Agent to connect to the SAP system as an RFC client, create and configure the
saprfc.ini file on the machines that host the Secure Agent.
saprfc.ini Entry Types for SAP Tables
The SAP table connection uses the following types of entries to connect to SAP:
Type A
For all SAP connections. Connects to an SAP system. Each Type A entry specifies one SAP system.
The following text shows a sample Type A entry:
DEST=sapr3
TYPE=A
ASHOST=sapr3
SYSNR=00
RFC_TRACE=0
Type B
For all SAP connections. Enables SAP to create a connection to the application server with the least
load at run time. Use a type B entry to enable SAP load balancing. The following text shows a sample
Type B entry:
DEST=sapr3
TYPE=B
R3NAME=ABV
MSHOST=infamessageserver.informatica.com
GROUP=INFADEV
saprfc.ini Parameters for SAP Tables
For SAP table connections, configure the Type A or Type B entries in saprfc.ini.
The following parameters are used in saprfc.ini for SAP table connections:
• DEST (Type A, B). Logical name of the SAP system for the connection. All DEST entries must be unique. You must have only one DEST entry for each SAP system. Use up to 32 characters to define a logical name.
• TYPE (Type A, B). Type of connection. Set to A or B.
• ASHOST (Type A, B). Host name or IP address of the SAP application. Informatica Cloud uses this entry to attach to the application server.
• SYSNR (Type A, B). SAP system number.
• R3NAME (Type B). Name of the SAP system.
• MSHOST (Type B). Host name of the SAP Message Server.
• GROUP (Type B). Group name of the SAP application server.
• RFC_TRACE (Type A). Debugs RFC connection-related problems. 0 is disabled. 1 is enabled.
Configuring saprfc.ini for SAP Tables
Configure the saprfc.ini file for SAP Table connections.
1. Use a DOS editor or WordPad to create the saprfc.ini file.
2. In saprfc.ini, create an entry for each SAP connection that you want to use.
Create a Type A entry for each system with unique DEST parameters. Create a Type B entry to configure load balancing. A complete example file appears after this procedure.
3. Save the file.
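The following is a minimal sketch of a complete saprfc.ini file that defines a direct connection to a development system and a load-balanced connection to a production system. The DEST names, host names, system number, system name, message server, and logon group are illustrative assumptions only; replace them with the values for your SAP landscape.
DEST=SAPDEV
TYPE=A
ASHOST=sapdev01.example.com
SYSNR=00
RFC_TRACE=0

DEST=SAPPRD
TYPE=B
R3NAME=PRD
MSHOST=sapprdmsg.example.com
GROUP=PUBLIC
Each DEST value is unique, as required in step 2.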
Configuring the Location of the saprfc.ini File
Save the saprfc.ini file to the Secure Agent machine and configure the RFC_INI environment variable to point
to the file.
1. For an saprfc.ini file that contains information about a source SAP system, copy the file to a directory local to each Secure Agent that you want to use to read from SAP. Repeat this step for every Secure Agent that you want to use to read from SAP.
2. For an saprfc.ini file that contains information about a target SAP system, copy the saprfc.ini file to the following directory: <SecureAgent_InstallDir>/main/bin/rdtm. Repeat this step for every Secure Agent that you want to use to write to SAP.
3. For each Secure Agent machine that you use, configure the RFC_INI environment variable to point to the location of the saprfc.ini file, as shown in the example after this procedure.
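The following is a minimal sketch of setting the RFC_INI environment variable. The file locations are assumptions for illustration; point the variable at the full path of the saprfc.ini file that you saved on each Secure Agent machine.
On Windows, set a system-wide variable and then restart the Secure Agent:
setx RFC_INI "C:\Informatica Cloud Secure Agent\main\bin\rdtm\saprfc.ini" /M
On Linux, export the variable in the profile of the user that runs the Secure Agent, and then restart the agent:
export RFC_INI=/opt/infaagent/main/bin/rdtm/saprfc.ini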
Step 4. Configuring SAP User Authorization
Configure the SAP user account to process SAP table data.
The following authorizations are required to read from SAP tables:
• S_BTCH_JOB: DELE, LIST, PLAN, SHOW. Set Job Operation to RELE.
• S_PROGRAM: BTCSUBMIT, SUBMIT
• S_RFC: SYST, SDTX, SDIFRUNTIME, /INFADI/TBLRDR
• S_TABU_DIS: &_SAP_ALL
The following authorizations are required to write to SAP tables:
• S_RFC: /INFATRAN/ZPMW
• S_TABU_DIS: &_SAP_ALL
Step 5. Installing SAP Table Connection Transport Files
Install the SAP Table connection transport files on the SAP machines that you want to access. Before you
install the transports on your production system, install and test the transports in a development system.
The transport includes the following objects:
• /INFADI/TABLEACCESS (Package). Namespace for all objects listed below.
• /INFADI/TBLRDR (Function Group). Namespace for the function module.
• /INFADI/RFC_READ_TABLES (Function Module). Function module used to generate a query based on Informatica Cloud task metadata. Also used to schedule the program to execute data extraction.
• /INFADI/EXECUTE_DYNQUERY (Program). ABAP/4 program to execute a query and to push data to the Informatica Cloud SAP table connection. Called in sync or async mode.
• /INFADI/IMPORTFLDS (Table). Internal structure referenced by the function module.
• /INFADI/TABLEDATA (Table). Internal table type referenced by the function module.
Installing Transport Files
Install transport files from a Secure Agent directory to read from a Unicode or non-Unicode SAP system.
The transport files are for SAP version ECC 5.0 or later. To install transport files for an earlier version or to
write to an SAP system, contact Informatica Global Customer Support.
1. Find the transport files in the following directory on the Secure Agent machine: <Secure Agent Installation Directory>\main\bin\rdtm\sap-transport\SAPTableReader.
2. Copy the cofile transport file to the Cofile directory in the SAP transport management directory on each SAP machine that you want to access.
The cofile transport file uses the following naming convention: TABLE_READER_K<number>.G00.
3. Remove "TABLE_READER_" from the file name to rename the cofile.
For example, for a cofile transport file named TABLE_READER_K900556.G00, rename the file to K900556.G00.
4. Copy the data transport file to the Data directory in the SAP transport management directory on each SAP machine that you want to access.
The data transport file uses the following naming convention: TABLE_READER_R<number>.G00.
5. Remove "TABLE_READER_" from the file name to rename the file. A sketch of steps 2 through 5 appears after this procedure.
6. To import the transports to SAP, in the STMS, click Extras > Other Requests > Add and add the transport request to the system queue.
7. In the Add Transport Request to Import Queue dialog box, enter the request number for the cofile transport.
The request number inverts the order of the renamed cofile as follows: G00K<number>.
For example, for a cofile transport file renamed as K900556.G00, enter the request number as G00K900556.
8. In the Request area of the import queue, select the transport request number that you added, and click Import.
9. If you are upgrading from a previous version of the Informatica Transports, select the Override Originals option.
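The following is a minimal sketch of steps 2 through 5 for the example cofile name above, run on an SAP application server host. The transport management directory /usr/sap/trans is a common default but may differ on your system, and the data file name R900556.G00 is an assumption that pairs with the example cofile; substitute the actual file names from the SAPTableReader directory.
cp TABLE_READER_K900556.G00 /usr/sap/trans/cofiles/K900556.G00
cp TABLE_READER_R900556.G00 /usr/sap/trans/data/R900556.G00
Copying with the target name in this way renames the files as they are placed in the cofiles and data directories, so no separate rename step is needed.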
Step 6: Configuring HTTPS
To connect to SAP through HTTPS and read SAP table sources, you must configure the machine that hosts
the Secure Agent and the machine that hosts the SAP system. You must also enable HTTPS when you
configure an SAP Table connection in Informatica Cloud.
Perform the following configuration tasks on the Secure Agent and SAP systems:
HTTPS Configuration on the Secure Agent System
To configure HTTPS on the machine that hosts the Secure Agent, perform the following tasks:
1. Create a certificate using OpenSSL and the Java keytool.
2. Convert the OpenSSL certificate (PKCS#12 certificate) to the SAP-specific format (PSE) using the SAPGENPSE tool.
Currently, self-signed certificates are supported.
HTTPS Configuration on the SAP System
To configure HTTPS on the machine that hosts the SAP system, perform the following tasks:
1. Enable the HTTPS service on the SAP system.
2. Import the certificate in PSE format to the SAP system trust store.
Prerequisites
Before you create an OpenSSL certificate, verify the following prerequisites:
• Download OpenSSL from https://www.openssl.org/community/binaries.html. Install Win64OpenSSL_Light-1_0_2d.exe to a local directory on the Secure Agent machine.
The openssl.exe, ssleay32.dll, libeay32.dll, and openssl.cfg files are available in the <OpenSSL Installation Directory>.
• Based on the operating system of the machine that hosts the Secure Agent and the SAP system, download the latest available patch of the SAPGENPSE Cryptography tool from the SAP Service Marketplace. For information, see Downloading the SAP Cryptographic Library.
Typically, the SAPGENPSE files are extracted to the nt-x86_64 directory.
• Configure the following SAP parameters: icm/server_port, ssl/ssl_lib, sec/libsapsecu, ssf/ssfapi_lib, ssf/name, icm/HTTPS/verify_client, ssl/client_pse, and wdisp/ssl_encrypt. For information, see the SAP documentation.
Create an OpenSSL Certificate
Create a self-signed certificate using OpenSSL.
1. At the command prompt, set the OPENSSL_CONF variable to the absolute path to the openssl.cfg file. For example, enter the following command: set OPENSSL_CONF=C:\OpenSSL-Win64\bin\openssl.cfg
2. Navigate to the <OpenSSL Installation Directory>\bin directory.
3. To generate a 2048-bit RSA private key, enter the following command: openssl.exe req -new -newkey rsa:2048 -sha1 -keyout <RSAkey File_Name>.key -out <RSAkey File_Name>.csr
4.
When prompted, enter the following values:
•
Private key password (PEM pass phrase). Enter a phrase that you want to use to encrypt the secret
key. Re-enter the password for verification.
Important: Make a note of this PEM password. You need to specify this value in some of the
following steps.
•
Two letter code for country name.
•
State or province name.
•
Locality name.
•
Organization name.
•
Organization unit name.
•
Common name (CN). Mandatory.
Important: Enter the fully qualified host name of the machine that hosts the Secure Agent.
•
Email address.
5.
Enter the following extra attributes you want to send along with the certificate request:
•
Challenge password.
•
Optional company name.
An RSA private key of 2048-bit size is created. The <RSAkey File_Name>.key and <RSAkey
File_Name>.csr files are generated in the current location.
6.
To generate a self-signed certificate using the RSA private key, enter the following command: openssl x509 -req -days 11499 -in <RSAkey File_Name>.csr -signkey <RSAkey File_Name>.key -out
<Certificate File_Name>.crt
7.
When prompted, enter the PEM pass phrase for the RSA private key.
The <Certificate File_Name>.crt file is generated in the current location.
8.
Concatenate the contents of the <Certificate File_Name>.crt file and the <RSAkey File_Name>.key
file to a .pem file.
a.
Open the <Certificate File_Name>.crt file and the <RSAkey File_Name>.key files in a Text
editor.
b.
Create a file and save it as <PEM File_Name>.pem.
c.
Copy the contents of the <Certificate File_Name>.crt file and paste it in the .pem file.
d.
Copy the contents of the <RSAkey File_Name>.key file and append it to the existing contents of the .pem
file.
e.
Save the <PEM File_Name>.pem file.
9.
To create a PKCS#12 certificate, enter the following command at the command prompt: openssl pkcs12 -export -in <PEM File_Name>.pem -out <P12 File_Name>.p12 -name "domain name".
10.
When prompted, enter the following details:
•
The PEM pass phrase for the .pem file.
•
An export password for the P12 file. Re-enter the password for verification.
Important: Make a note of this export password for the P12 file. You need to specify this value in
some of the following steps and while creating the SAP Table connection in Informatica Cloud.
The <P12 File_Name>.p12 file is generated in the current location.
11.
To create a Java keystore file, enter the following command: keytool -v -importkeystore -srckeystore <P12 File_Name>.p12 -srcstoretype PKCS12 -destkeystore <JKS File_Name>.jks -deststoretype JKS -srcalias "source alias" -destalias "destination alias".
12.
When prompted, enter the following details:
•
Password for the destination keystore, the JKS file.
Important: Make a note of this password. You need to specify this password while creating the SAP
Table connection in Informatica Cloud.
•
Password for the source keystore, the P12 file. Enter the Export password for the P12 file.
The <JKS File_Name>.jks file is generated in the current location.
Important: While enabling HTTPS in an SAP Table connection, you must specify the name and location
of this keystore file. You must also specify the destination keystore password as the Keystore Password
and the source keystore password as the Private Key Password.
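The following command sequence is a minimal sketch of the certificate creation flow described in this section, using sample file names (mykey, mycert, mystore) and a sample host name (agenthost.example.com) that are placeholders only:
set OPENSSL_CONF=C:\OpenSSL-Win64\bin\openssl.cfg
cd C:\OpenSSL-Win64\bin
openssl.exe req -new -newkey rsa:2048 -sha1 -keyout mykey.key -out mykey.csr
openssl x509 -req -days 11499 -in mykey.csr -signkey mykey.key -out mycert.crt
(Concatenate mycert.crt and mykey.key into mycert.pem in a text editor, as described in step 8.)
openssl pkcs12 -export -in mycert.pem -out mycert.p12 -name "agenthost.example.com"
keytool -v -importkeystore -srckeystore mycert.p12 -srcstoretype PKCS12 -destkeystore mystore.jks -deststoretype JKS -srcalias "agenthost.example.com" -destalias "agenthost.example.com"
Keep the PEM pass phrase, the P12 export password, and the JKS keystore password that you enter at the prompts. The last two are required when you create the SAP Table connection.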
Convert an OpenSSL Certificate to PSE Format
You can convert an OpenSSL certificate to PSE format using the SAPGENPSE tool.
1.
At the command prompt, navigate to the <SAPGENPSE Extraction Directory>.
2.
To generate a PSE file, enter the following command: sapgenpse import_p12 -p <PSE_Directory>
\<PSE File_Name>.pse <P12 Certificate_Directory>\<P12 File_Name>.p12
3.
When prompted, enter the following details:
•
Password for the P12 file. Enter the Export password for the P12 file.
•
Personal identification number (PIN) to protect the PSE file. Re-enter the PIN for verification.
The <PSE File_Name>.pse file is generated in the specified directory.
4.
To generate the certificate based on the PSE format, enter the following command: sapgenpse
export_own_cert -p <PSE File_Directory>\<PSE File_Name>.pse -o <Certificate_Name>.crt
5.
When prompted, enter the PSE PIN.
The <Certificate_Name>.crt file is generated in the current location. Import this certificate file to the SAP
system trust store.
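For example, assuming SAPGENPSE was extracted to C:\SAP\sapgenpse\nt-x86_64 and the PKCS#12 file from the previous section is C:\certs\mycert.p12 (both paths are assumptions), the conversion might look like the following:
cd C:\SAP\sapgenpse\nt-x86_64
sapgenpse import_p12 -p C:\certs\mycert.pse C:\certs\mycert.p12
sapgenpse export_own_cert -p C:\certs\mycert.pse -o mycert.crt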
Enable the HTTPS Service on SAP System
Enable the HTTPS service from the SMICM transaction.
For information, see HTTP(S) Settings in ICM.
Import a Certificate to SAP System Trust Store
1.
Log in to SAP and go to the STRUST transaction.
2.
Select SSL Client (Standard) and specify the password. In the Import Certificate dialog, you may need
to select Base64 format as the certificate file format.
3.
Click the Import icon and select the <Certificate_Name>.crt file in PSE format.
Note: You may need to add a DNS entry for the agent host on the SAP application server if the agent host
is on a different network.
4.
Click Add to Certificate List.
5.
Restart the ICM.
For more information, see Importing the Certificate From the File System.
SAP IDocs and RFCs/BAPI Connector Administration
Before you can use an SAP connection to process data through IDocs or RFCs/BAPIs, an SAP administrator
must perform the following tasks:
1.
If necessary, download and install the Microsoft Visual C++ Redistributable.
2.
Download and configure the SAP libraries.
3.
Configure the saprfc.ini file.
4.
Define SAP Connector as an external logical system in SAP.
5.
Configure SAP user authorization.
6.
Install and configure the SAP Metadata utility.
After the administrator has performed the configuration, you can create and use SAP RFC/BAPI, IDoc
Reader, and IDoc Writer connections in mappings.
Step 1. Downloading and Installing the Microsoft Visual C++
Redistributable
If you do not have Microsoft Visual C++ (VC++) installed, download and install the Microsoft Visual C++
redistributable (x86) from the Microsoft website. You can then run applications developed with VC++.
1.
Download and install two versions of the VC++ redistributable.
For more information about this issue from SAP, see SAP Note 684186 on the SAP website:
http://service.sap.com/notes.
The following list shows the Secure Agent system and the related VC++ redistributable versions to install:
- Windows 7 64-bit: Microsoft VC++ 2008 (http://www.microsoft.com/en-us/download/details.aspx?id=29) and Microsoft VC++ 2010 (http://www.microsoft.com/en-us/download/details.aspx?id=5555)
- Windows Server 2008 64-bit: Microsoft VC++ 2008 (http://www.microsoft.com/en-us/download/details.aspx?id=29) and Microsoft VC++ 2010 (http://www.microsoft.com/en-us/download/details.aspx?id=5555)
- Windows 7 32-bit or Windows Server 2008 R2 32-bit: Microsoft VC++ 2005 (http://www.microsoft.com/en-us/download/details.aspx?id=14431) and Microsoft VC++ 2008 (http://www.microsoft.com/en-us/download/details.aspx?id=29)
- Linux: Not required.
Verify that your system meets the requirements for each redistributable version you install.
2.
Restart the Secure Agent after installation.
Step 2. Downloading and Configuring SAP Libraries for IDoc and
BAPI/RFC
Download and configure the SAP RFC SDK libraries. Contact Informatica Global Customer Support if you
encounter any issues when you download the libraries.
Note: If you performed this step for an SAP table connection and want to connect to Unicode SAP systems,
you do not need to download and configure the classic RFC SDK 7.2 libraries again. However, if you want to
connect to non-Unicode SAP systems, you must download and configure the classic RFC SDK 7.1 libraries.
1.
Go to the SAP Service Marketplace: http://service.sap.com.
Note: You must have SAP credentials to access the Service Marketplace.
2.
Download the classic RFC SDK libraries for the Secure Agent system.
•
To connect to Unicode SAP systems, download the classic RFC SDK Unicode 7.2 libraries.
•
To connect to non-Unicode SAP systems, download the classic RFC SDK non-Unicode 7.1 libraries.
Verify that you download the classic RFC SDK libraries and not the SAP NetWeaver RFC SDK libraries.
The following list shows the Secure Agent system and the associated service archive (SAR) file for the
RFC SDK Unicode 7.2 libraries and RFC SDK non-Unicode 7.1 libraries:
- Windows 64-bit: RFC_10-10009747.SAR
- Windows 32-bit: RFC_10-10009746.SAR
- Linux 64-bit: RFC_10-10009745.SAR
- Linux 32-bit: RFC_10-10009742.SAR
Use the most recent patch available. The SAP file name might vary based on the version.
3.
Use the SAPCAR.exe utility to unzip the SAR file.
4.
Copy the files in the lib directory to the following directory:
<Secure Agent installation directory>\main\bin\rdtm
5.
Set the following permissions for each RFC SDK library:
•
Read, write, and run permissions for the current user.
•
Read and run permissions for all other users.
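For example, on a 64-bit Windows Secure Agent machine, the extraction and copy in steps 3 and 4 might look like the following. The download location, SAR file name, and Secure Agent installation path are assumptions; substitute your own values, and copy the contents of the extracted lib directory as described in step 4:
cd C:\Downloads
SAPCAR.exe -xvf RFC_10-10009747.SAR
copy lib\*.* "C:\Program Files\Informatica Cloud Secure Agent\main\bin\rdtm"
On Linux, the permissions described in step 5 correspond to a command such as chmod 755 on each copied library file.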
Step 3. Configuring saprfc.ini
SAP uses the Remote Function Call (RFC) communications protocol to communicate with other systems. To
enable the Secure Agent to connect to the SAP system as an RFC client, create and configure the
saprfc.ini file on the machines that host the Secure Agent.
saprfc.ini Entry Types for IDoc and BAPI/RFC
The SAP connection uses the following types of entries to connect to SAP:
Type A
For SAP IDoc Writer and BAPI/RFC connections. Connects to an SAP system. Each Type A
entry specifies one SAP system. The following text shows a sample Type A entry:
DEST=sapr3
TYPE=A
ASHOST=sapr3
SYSNR=00
RFC_TRACE=0
Type B
For SAP Table, IDoc Writer, and BAPI/RFC connections. Use a type B entry to enable SAP load
balancing with the SAP message server. The following text shows a sample Type B entry:
DEST=sapr3
TYPE=B
R3NAME=ABV
MSHOST=infamessageserver.informatica.com
GROUP=INFADEV
Type R
For SAP IDoc Reader connections only. Connects to an SAP system to receive outbound IDocs. The
following text shows a sample Type R entry:
DEST=sapr346CLSQA
TYPE=R
PROGID=PID_LSRECEIVE
GWHOST=sapr346c
GWSERV=sapgw00
RFC_TRACE=1
saprfc.ini Parameters for IDoc and BAPI/RFC
For SAP IDoc and BAPI/RFC connections, configure the Type A, B, and R entries in the saprfc.ini file.
The following list describes the parameters in the saprfc.ini file to use for SAP IDoc and BAPI/RFC connections:
- DEST (Type A, B, R): Logical name of the SAP system for the connection. All DEST entries must be unique. You must have only one DEST entry for each SAP system. Use up to 32 characters to define a logical name.
- TYPE (Type A, B, R): Type of connection. Set to A, B, or R.
- ASHOST (Type A): Host name or IP address of the SAP application. Informatica Cloud uses this entry to attach to the application server.
- SYSNR (Type A): SAP system number.
- R3NAME (Type B): Name of the SAP system.
- MSHOST (Type B): Host name of the SAP Message Server.
- GROUP (Type B): Group name of the SAP application server.
- PROGID (Type R): Program ID. The Program ID must be the same as the Program ID for the logical system you define in the SAP system to send or receive IDocs or to consume business content data.
- GWHOST (Type R): Host name of the SAP gateway.
- GWSERV (Type R): Server name of the SAP gateway. The GWSERV property is the HTTP port value. If the SAP GUI is also available on the Secure Agent machine, this value is automatically updated. If not, you need to manually update the value in the saprfc.ini file. Ensure that the value is the same as the HTTP port value in the C:\Windows\System32\drivers\etc\services file.
- RFC_TRACE (Type A, R): Debugs RFC connection-related problems. 0 is disabled. 1 is enabled.
The following text shows a sample saprfc.ini file:
DEST=SAPR3
TYPE=A
ASHOST=SAPR3
SYSNR=00
DEST=SAPR346CLSQA
TYPE=R
PROGID=PID_LSRECEIVE
GWHOST=sapr346c
GWSERV=sapgw00
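The GWSERV value must match a service entry on the Secure Agent machine. For example, if GWSERV=sapgw00, the C:\Windows\System32\drivers\etc\services file would typically contain a line such as the following. The port 3300 shown here is the common default for SAP gateway instance 00 and is only an assumption; use the port configured for your SAP system:
sapgw00    3300/tcp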
Configuring saprfc.ini for IDoc and RFC/BAPI
Configure the saprfc.ini file for SAP IDoc and RFC/BAPI connections.
1.
Use a DOS editor or WordPad to create the saprfc.ini file.
Notepad can introduce errors to the saprfc.ini file.
2.
In the saprfc.ini file, create an entry for each SAP connection that you want to use.
Create a Type A entry for an IDoc Writer or RFC/BAPI connection type. Create a Type R entry for an
IDoc Reader connection type. Create a Type B entry for load balancing.
If you connect to multiple SAP systems, create appropriate entries for each system with unique DEST
parameters.
3.
Save the file.
4.
Copy saprfc.ini to the following directory for every Secure Agent machine that you want to use:
<SecureAgent_InstallDir>/main/bin/rdtm.
Step 4. Defining SAP Connector as a Logical System in SAP
To use SAP Connector to send and receive IDocs from SAP, you must define SAP Connector as an external
logical system in SAP.
Create a single logical system in SAP for IDoc ALE integration with SAP Connector. When you define SAP
Connector as a logical system, SAP acknowledges SAP Connector as an external system that can receive
outbound IDocs from SAP and send inbound IDocs to SAP.
Perform the following steps to define SAP Connector as a logical system:
1.
Create a logical system in SAP for SAP Connector.
2.
Create an RFC destination for SAP Connector.
3.
Create a tRFC port for the RFC destination.
4.
Create a partner profile for SAP Connector.
5.
Create outbound and inbound parameters for the partner profile.
Note: These steps are based on SAP version 4.6C. The steps may differ if you use a different version. For
complete instructions on creating a logical system in SAP, see the SAP documentation.
Step 1. Create a Logical System for SAP Connector
To uniquely identify SAP Connector as a client within a network, define SAP Connector as an external logical
system in SAP.
1.
Go to transaction SALE.
The Display IMG window appears.
2.
Expand the tree to navigate to the Application Link Enabling > Sending and Receiving Systems >
Logical Systems > Define Logical System operation.
3.
Click the IMG - Activity icon to run the Define Logical System operation.
An informational dialog box appears.
4.
Click Enter.
The Change View Logical Systems window appears.
5.
Click New Entries.
The New Entries window appears.
6.
Enter a name and description for the logical system entry for SAP Connector.
Step 2. Create an RFC Destination
Create an RFC destination and program ID for SAP Connector.
1.
Go to transaction SM59.
The Display and Maintain RFC Destinations window appears.
2.
Click Create.
The RFC Destination window appears.
3.
Enter the name of the logical system you created as the RFC destination.
4.
To create a TCP/IP connection, enter T as the connection type.
5.
Enter a description for the RFC destination.
6.
Click Save.
7.
For Activation Type, click Registration.
8.
For Program ID, enter the same name as the RFC destination name.
Use the Program ID as the value for the PROGID parameter in the saprfc.ini file.
9.
If the SAP system is a Unicode system and the Secure Agent runs on AIX (64-bit), HP-UX IA64, Linux
(32-bit), Solaris (64-bit), or Windows, click the Special Options tab, and select the Unicode option
under Character Width in Target System.
SAP provides Unicode RFC libraries for these operating systems. When the Secure Agent runs on one of
these operating systems, it uses the Unicode RFC libraries to process Unicode data.
Step 3. Create a tRFC Port for the RFC Destination
Create a tRFC port for the RFC destination you defined in SAP. SAP uses the tRFC port to communicate with
SAP Connector.
1.
Go to transaction WE21.
2.
Click Ports > Transactional RFC.
3.
Click Create.
The Ports in IDoc Processing dialog box appears.
4.
Click Generate Port Name or Own Port Name and enter a name.
5.
Click Enter.
6.
Enter a description for the port.
7.
Select the IDoc record version type.
8.
Enter the name of the RFC destination you created.
Step 4. Create a Partner Profile for SAP Connector
Create a partner profile for the logical system you defined for SAP Connector. When SAP communicates with
an external system, it uses the partner profile to identify the external system.
1.
Go to transaction WE20.
2.
Click Create.
3.
Enter the following properties:
- Partner number: Name of the logical system you created for SAP Connector.
- Partner type: Partner profile type. Enter LS for logical system for ALE distribution systems.
4.
In the Post-processing tab, enter the following properties:
- Type: User type. Enter US for user.
- Agent: The SAP user login name.
- Lang: Language code that corresponds to the SAP language. Enter EN for English.
5.
In the Classification tab, enter the following properties:
- Partner class: Enter ALE.
- Partner status: Indicates the status of communication with the partner. To communicate with the partner, enter A for active.
Step 5. Create Outbound and Inbound Parameters for the Partner Profile
Outbound parameters define the IDoc message type, IDoc basic type, and port number for outbound IDocs.
Inbound parameters define the IDoc message type for inbound IDocs.
SAP uses outbound parameters when it sends IDocs to SAP Connector. Create an outbound parameter for
each IDoc message type that SAP sends to SAP Connector. SAP uses inbound parameters when it receives
IDocs from SAP Connector. Create an inbound parameter for each IDoc message type that SAP receives
from SAP Connector.
1.
From the partner profiles window, click Create Outbound Parameter.
The Partner Profiles: Outbound Parameters window appears.
2.
Enter the following properties:
- Message Type: The IDoc message type the SAP system sends to SAP Connector.
- Receiver Port: The tRFC port number you defined.
- IDoc Type: The IDoc basic type of the IDocs the SAP system sends to SAP Connector.
3.
Click Save.
The Packet Size property appears.
4.
Enter a value between 10 and 200 IDocs as the packet size.
The packet size determines the number of IDocs that SAP sends in one packet to SAP Connector.
5.
Click Enter.
6.
Repeat steps 1 through 5 to create an outbound parameter for each IDoc message type that SAP sends
to SAP Connector.
7.
Click Create Inbound Parameter.
The Partner Profiles: Inbound Parameters window appears.
8.
For each inbound parameter, enter the following properties:
- Message Type: The IDoc message type the SAP system receives from SAP Connector.
- Process Code: The process code. The SAP system uses the process code to call the appropriate function module to process the IDocs it receives.
9.
Click Enter.
10.
Repeat steps 7 through 9 to create an inbound parameter for each IDoc message type that SAP receives
from SAP Connector.
Step 5. Configuring SAP User Authorizations
An SAP administrator needs to create a profile in the development, test, and production SAP systems so that
you can use the integration features. This profile must include authorization for the objects and related
activities. The profile on the test system should be the same as the profile on the production system.
The setup of the user and profiles is done within SAP using the SAP GUI. This activity is external to
Informatica Cloud.
BAPI/RFC
An SAP user requires the following authorization to execute tasks using the BAPI/RFC functions:
- Authorization object: S_RFC. Authorization values: SYST, SDTX, SDIFRUNTIME, RFC1, RFC2.
Note: In addition to the above authorization, the user needs access to any BAPI/RFC function that needs
to be executed.
IDoc
An SAP user requires the following authorization to execute tasks with IDoc messages:
- Authorization object: S_RFC. Authorization values: SYST, SDTX, SDIFRUNTIME, RFC1, RFC2, EDIMEXT.
Note: In addition to the above authorization, the user needs access to the specific IDocs and underlying
transactions that need to be executed.
Step 6. Installing and Configuring the SAP Metadata Utility
To import BAPI/RFC or IDoc metadata from SAP systems and generate mapplets, you need to install and
configure the SAP Metadata utility.
Prerequisites
Before you use the SAP Metadata utility, verify the following prerequisites:
•
Configure the RFC_INI system environment variable for the path to the saprfc.ini file. Verify that you
include the file name in the variable.
•
Download and install the 32-bit SAP JCo libraries for your operating system. You can find the libraries on
the SAP Service Marketplace at the following URL: http://service.sap.com. Unzip the contents to a local
directory.
•
Add the location of the SAP JCo libraries to the PATH system environment variable.
•
Verify that an SAP user has the authorization to browse and extract metadata.
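For example, on a Windows Secure Agent machine, the RFC_INI and PATH prerequisites above can be set from a command prompt as shown below. The paths are assumptions; define the variables as system environment variables so that they persist across restarts:
set RFC_INI=C:\Informatica\saprfc.ini
set PATH=%PATH%;C:\SAP\JCo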
Installation and Configuration
Install and configure the SAP Metadata utility on the machine that hosts the Secure Agent.
1.
Download the SAP Metadata utility zip file, SapUtility.zip from the Informatica Cloud Community.
2.
Unzip the file to a local directory.
Avoid using spaces in the directory name because spaces can cause imports to fail.
3.
Edit the <SAP Metadata utility download directory>/SAPUtil.bat file to define the CLASSPATH
and JAVA_HOME variables.
a.
Enter the SAP JCo libraries directory and the sapjco3.jar file name in the CLASSPATH variable
and remove “REM” from the following line:
REM SET CLASSPATH=%CLASSPATH%;<Location of sapjco3.jar>\sapjco3.jar
For example: SET CLASSPATH=%CLASSPATH%;C:\SAP\JCo\sapjco3.jar
b.
Enter the JAVA JRE directory in the JAVA_HOME variable and remove “REM” from the following
line:
REM SET JAVA_HOME=<JRE_LOCATION>
For example: SET JAVA_HOME=C:\Program Files (x86)\Informatica Cloud Secure Agent\jre
Use the JAVA JRE included with the Informatica Cloud Secure Agent.
4.
Save and close the batch file.
Part III: Connections
This part contains the following chapters:
•
SAP Connections, 36
•
Troubleshooting, 43
CHAPTER 3
SAP Connections
This chapter includes the following topics:
•
SAP Connections Overview, 36
•
SAP Table Connections, 37
•
SAP IDoc and BAPI/RFC Connections, 38
•
Creating an SAP Table Connection, 40
•
Creating an SAP IDoc Reader Connection, 41
•
Creating an SAP IDoc Writer or SAP RFC/BAPI Interface Connection, 41
SAP Connections Overview
Use an SAP connection to read data from and write data to SAP systems.
Informatica Cloud supports ABAP, IDocs, or BAPI/RFC functions to integrate with SAP systems. You can
choose one of four SAP connection types to connect to SAP systems based on the interface requirements.
The following list describes the different SAP connections:
- SAP Connector: Use the connection to read data from SAP and write to any target. You can also write data from any source to custom tables in SAP. Contact Global Customer Support for information about using an SAP Table connection to write data to SAP systems.
- SAP RFC/BAPI Interface: Use the connection to read and write data using BAPI/RFC functions.
- IDoc Reader: Use the connection to read Intermediate Documents (IDocs) from SAP systems.
- IDoc Writer: Use the connection to write IDocs to SAP systems.
SAP Table Connections
SAP Table connections enable you to access data directly from SAP tables. You can use the SAP Table
connection type in Data Synchronization tasks, mappings, and Mapping Configuration tasks.
You can use the SAP Table connection type to read data from transparent tables, cluster tables, pool tables,
or views. You can also use the SAP Table connection type to write data to custom transparent tables.
To enable the Secure Agent to connect to SAP through HTTPS, you must enable HTTPS and specify the
keystore details when you configure an SAP Table connection. To read SAP table sources through HTTPS,
specify an SAP Table connection configured for HTTPS when you create Data Synchronization tasks,
mappings, or Mapping Configuration tasks.
SAP Table Connection Properties
To process SAP table data, select the SAP Connector connection type and configure the following
properties:
- Runtime Environment: Runtime environment that contains the Secure Agent that you want to use to access SAP tables.
- Username: SAP user name with the appropriate user authorization.
- Password: SAP password.
- Client: SAP client number.
- Language: Language code that corresponds to the SAP language.
- Saprfc.ini Path: Local directory to the saprfc.ini file. To write to SAP tables, use the following directory: <Secure Agent Installation Directory>/main/bin/rdtm.
- Destination: Type A DEST in the saprfc.ini file. Destination is case sensitive. Note: Use all uppercase letters for the destination.
- Port Range: HTTP port range. The SAP Table connection uses the specified port numbers to connect to SAP tables using the HTTP protocol. Ensure that you specify valid numbers to prevent connection errors. Default: 10000-65535. Enter a range within the default range, for example, 10000-20000. When a range is outside the default range, the connection uses the default range.
- Test Streaming: Tests the connection. When selected, tests the connection using both the RFC and HTTP protocols. When not selected, tests the connection using the HTTP protocol.
- Https Connection: When selected, connects to SAP through the HTTPS protocol. To successfully connect to SAP through HTTPS, verify that an administrator has configured the machines that host the Secure Agent and the SAP system.
- Keystore Location: The absolute path to the JKS keystore file.
- Keystore Password: The destination password specified for the .JKS file.
- Private Key Password: The export password specified for the .P12 file.
SAP Connection Rules and Guidelines
The following SAP data types are not supported for the SAP table writer at this time:
•
SSTRING
•
STRING
•
RAWSTRING
Tasks that include these data types for the SAP table writer might fail.
SAP IDoc and BAPI/RFC Connections
SAP connections enable you to access SAP data through the IDoc or BAPI/RFC interfaces. You can use the
connections in mappings and Mapping Configuration tasks.
SAP RFC/BAPI Interface Connection Properties
The following list describes the properties:
- User Name: SAP user name with authorization on S_DATASET, S_TABU_DIS, S_PROGRAM, and B_BTCH_JOB objects.
- Password: SAP password.
- Connection String: Type A DEST in the saprfc.ini file.
- Code Page: The code page compatible with the SAP target. Select one of the following code pages:
  - MS Windows Latin 1. Select for ISO 8859-1 Western European data.
  - UTF-8. Select for Unicode and non-Unicode data.
  - Shift-JIS. Select for double-byte character data.
  - ISO 8859-15 Latin 9 (Western European).
  - ISO 8859-2 Eastern European.
  - ISO 8859-3 Southeast European.
  - ISO 8859-5 Cyrillic.
  - ISO 8859-9 Latin 5 (Turkish).
  - IBM EBCDIC International Latin-1.
- Language Code: Language code that corresponds to the SAP language.
- Client Code: SAP client number.
SAP IDoc Reader Connection Properties
The following list describes the properties:
- Destination Entry: Type R DEST in the saprfc.ini file. The Program ID for this destination entry must be the same as the Program ID for the logical system you defined in SAP to receive IDocs.
- Code Page: The code page compatible with the SAP source. Select one of the following code pages:
  - MS Windows Latin 1. Select for ISO 8859-1 Western European data.
  - UTF-8. Select for Unicode and non-Unicode data.
  - Shift-JIS. Select for double-byte character data.
  - ISO 8859-15 Latin 9 (Western European).
  - ISO 8859-2 Eastern European.
  - ISO 8859-3 Southeast European.
  - ISO 8859-5 Cyrillic.
  - ISO 8859-9 Latin 5 (Turkish).
  - IBM EBCDIC International Latin-1.
SAP IDoc Writer Connection Properties
The following list describes the properties:
- User Name: SAP user name with authorization on S_DATASET, S_TABU_DIS, S_PROGRAM, and B_BTCH_JOB objects.
- Password: SAP password.
- Connection String: Type A DEST in the saprfc.ini file.
- Code Page: The code page compatible with the SAP target. Select one of the following code pages:
  - MS Windows Latin 1. Select for ISO 8859-1 Western European data.
  - UTF-8. Select for Unicode and non-Unicode data.
  - Shift-JIS. Select for double-byte character data.
  - ISO 8859-15 Latin 9 (Western European).
  - ISO 8859-2 Eastern European.
  - ISO 8859-3 Southeast European.
  - ISO 8859-5 Cyrillic.
  - ISO 8859-9 Latin 5 (Turkish).
  - IBM EBCDIC International Latin-1.
- Language Code: Language code that corresponds to the SAP language.
- Client Code: SAP client number.
Creating an SAP Table Connection
1.
Click Configure > Connections.
2.
Click New in the Connections page.
The New Connection page appears.
3.
Enter a name for the SAP Table connection.
Connection names can contain alphanumeric characters, spaces, and the following special characters: _ . + -
Connection names are not case sensitive.
4.
Enter a description for the connection.
The description can have a maximum length of 255 characters.
5.
Select SAP Connector as the connection type.
The SAP Connector Connection Properties appear.
6.
Select the name of the runtime environment where you want to run the tasks.
7.
Enter an SAP user name with the appropriate user authorization.
8.
Enter the SAP password.
9.
Enter the SAP client number.
10.
Enter the language code that corresponds to the SAP language.
11.
Enter the complete path to the saprfc.ini file.
12.
Enter the Type A DEST in the saprfc.ini file.
13.
Enter a range of HTTP port numbers that you can use.
14.
Select Test Streaming to test the connection with both the RFC and HTTP protocols. Clear the field to test
the connection with the HTTP protocol.
15.
Select Https Connection to connect to SAP through HTTPS protocol.
16.
Enter the absolute path to the JKS keystore file.
17.
To specify the destination keystore password as the Keystore Password, type the destination password
specified for the .JKS file while creating the OpenSSL certificate.
18.
To specify the source keystore password as the Private Key Password, type the export password
specified for the .P12 file while creating the OpenSSL certificate.
19.
Click Test to test the SAP Table connection using HTTP protocol.
20.
Click OK to save the connection.
Creating an SAP IDoc Reader Connection
1.
Click Configure > Connections.
2.
Click New in the Connections page.
The New Connection page appears.
3.
Enter a name for the SAP IDoc Reader connection.
Connection names can contain alphanumeric characters, spaces, and the following special characters: _ . + -
Connection names are not case sensitive.
4.
Enter a description for the connection.
The description can have a maximum length of 255 characters.
5.
Select SAP as the connection type.
The SAP Connection Properties appear.
6.
Select the name of the runtime environment where you want to run the tasks.
7.
Select iDoc Reader as the SAP connection type.
The iDoc Reader Connection Properties appear.
8.
Enter the Type R DEST entry in the saprfc.ini file.
The Program ID for this destination entry must be the same as the Program ID for the logical system you
defined in SAP to receive IDocs.
9.
Select UTF-8 as the code page compatible with the SAP source.
10.
Click OK to save the connection.
Creating an SAP IDoc Writer or SAP RFC/BAPI
Interface Connection
1.
Click Configure > Connections.
2.
Click New in the Connections page.
The New Connection page appears.
3.
Enter a name for the connection.
Connection names can contain alphanumeric characters, spaces, and the following special characters: _ . + -
Connection names are not case sensitive.
4.
Enter a description for the connection.
The description can have a maximum length of 255 characters.
5.
Select SAP as the connection type.
The SAP Connection Properties appear.
6.
Select the name of the runtime environment where you want to run the tasks.
7.
Select the SAP connection type. You can choose one of the following options:
•
To create an IDoc Writer connection, select iDoc Writer.
•
To create a BAPI/RFC connection, select SAP RFC/BAPI Interface.
The connection properties appear.
8.
Enter an SAP user name with the appropriate user authorization.
9.
Enter the SAP password.
10.
Enter the Type A DEST in the saprfc.ini file as the connection string.
11.
Select the code page compatible with the SAP system.
12.
Enter the language code that corresponds to the SAP language.
13.
Enter the SAP client number.
14.
Click OK to save the connection.
CHAPTER 4
Troubleshooting
This chapter includes the following topics:
•
Troubleshooting Overview , 43
•
SAP Table Connection Errors, 43
Troubleshooting Overview
Use the following sections to troubleshoot errors in Informatica Cloud. For a list of common error messages
and possible solutions, see the Informatica Cloud Community article,
"Troubleshooting: Common Error Messages".
SAP Table Connection Errors
The following error displays when I test an SAP Table connection:
Test Connection Failed for <connection name>. saprfc.ini file is not found in the
directory d:\foldername.
In the connection properties, verify the accuracy of the saprfc.ini file location. Also, make sure the file is
readable.
The following error displays when I test an SAP Table connection:
Test Connection Failed for <connection name>/sap/conn/jco/JCoException
Verify that the sapjco3.jar has been saved to the appropriate directories. For more information, see the
Informatica Cloud Administrator Guide.
Restart the Secure Agent after you copy the sapjco3.jar.
The following error displays when I test an SAP Table connection or use the connection in a task.
Test Connection Failed for <connection name>. Error getting the version of the native
layer: java.lang.UnsatisfiedLinkError: no sapjco3 in java.library.path.
Verify that the location of the sapjco3.dll file is in the PATH variable for the Secure Agent machine.
If necessary, see the Informatica Cloud Administrator Guide.
The following error displays when I test an SAP Table connection or use the connection in a task:
Test Connection Failed for <connection name>. Error getting the version of the native
layer: java.lang.UnsatisfiedLinkError: C:\Program Files\vikram\sapjco3NTintel-3.0.9\sapjco3.dll: This application has failed to start because the application
configuration is incorrect. Reinstalling the application may fix this problem.
If necessary, see the Informatica Cloud Administrator Guide.
The following error displays when I test an SAP Table connection or use the connection in a task:
Test Connection Failed for <connection name>. Error getting the version of the native
layer: java.lang.UnsatisfiedLinkError: no sapjco3 in java.library.path.
Add the location of sapjco3.dll to PATH variable and restart the Secure Agent.
A task that reads from SAP tables fails with the following error:
Error occurred processing data from SAP : Unable to establish Http Communication
between SAP server and agent! Shutting down reader.
The HTTP port is not open or the incoming request is being blocked by Windows Firewall. To resolve the
issue, in Windows Firewall, use the advanced settings to create a new incoming rule. Apply the rule to TCP
and all ports, and choose the HTTP-In protocol.
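As a sketch only, a command-line equivalent of the Windows Firewall steps above might look like the following. The rule name is arbitrary and the port range assumes the default SAP Table connection port range; confirm the scope that your organization requires:
netsh advfirewall firewall add rule name="Informatica SAP HTTP In" dir=in action=allow protocol=TCP localport=10000-65535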
The following error message displays when I select an SAP table as a source object in a Data
Synchronization task.
Field QUERYRESULT not a member of TABLES
Install the latest transport files and clear the browser cache. For more information about installing transport
files, see the Informatica Cloud Administrator Guide.
Part IV: Data Integration Using
SAP Table
This part contains the following chapters:
•
SAP Table, 46
•
Data Synchronization Tasks with SAP Table, 48
•
Mappings and Mapping Configuration Tasks with SAP Table, 59
CHAPTER 5
SAP Table
This chapter includes the following topics:
•
SAP Tables and Views, 46
•
Rules and Guidelines for SAP Sources and Targets, 46
SAP Tables and Views
You can connect to transparent, pool, and cluster tables using an SAP Table connection. You can also
connect to SAP views. Informatica Cloud does not differentiate between tables and views. You extract data
from views the same way you extract data from tables.
When you select a table, Informatica Cloud displays the table name followed by the business name in the
Select Object dialog box. You can filter by table name or business name when you connect to the SAP
system.
Informatica Cloud imports the following SAP table information:
•
Source name
•
Column names
•
Business descriptions
•
Data types, length, precision, and scale
Rules and Guidelines for SAP Sources and Targets
Use the following rules and guidelines when you configure SAP sources and targets:
•
When you configure an SAP source, configure row limits using the advanced source properties available
on the scheduling page of the task wizard. Row limits on the data filters page of the task wizard are
not enabled for SAP sources.
•
Do not use tables as SAP Table sources if the sources have circular primary key-foreign key relationships.
•
When you use more than one SAP table in a Data Synchronization task, you can use one cluster table or
one pool table. If you use more than one cluster or pool table, errors occur at run time. You can use the
Object Search dialog box to see if a table is a cluster table or a pool table.
•
When you join a cluster table or pool table with a transparent table, include all key fields in the transparent
table in the join condition. List the fields in the order that they appear in the SAP system.
•
When you join a cluster table or pool table with a transparent table, use all of the source fields in the
transparent table that you use in the joins and filters in the field mapping. Also, map at least one field from
the cluster or pool table.
•
Define relationships for multiple sources after the data preview displays the data. You can use the wizard
in advanced mode to avoid waiting to preview data.
•
Data sorting is not supported on cluster or pool table fields.
•
Do not use the Is Null or Is Not Null operators in data filters for SAP cluster and pool tables.
•
Do not use the Is Null or Is Not Null operators in data filters on SAP character fields.
•
Due to an SAP limitation, tasks that require a read longer than 30 minutes can fail. You might use one or
more of the following suggestions if you encounter this problem:
- Use the SAP advanced source properties to limit the number of rows to be read.
- Configure a data filter to reduce the number of rows to be read.
- Reduce the number of output fields for the task.
- Configure the SAP parameter rdisp/max_wprun_time to allow more time for the read. For more
information, see the SAP documentation.
- To increase the number of records that the Secure Agent can retrieve at one time, you can increase the
Java heap memory for the Secure Agent. To do this, edit the Secure Agent. In the System Configuration
Details section, select DTM and set the JVMOption1 property to the following value: -Xmx512m. Click OK
to save the change and restart the Secure Agent. Adjust the value for the JVMOption1 property based
on the number of records you want to retrieve and the available memory on the Secure Agent machine.
•
For a lookup on an SAP object, configure the lookup to return less than 20 rows. Tasks might fail if the
lookup returns more than 20 rows.
•
A lookup on an SAP object does not return matching rows if the lookup comparison value is NULL.
•
When you define a reject file name for an SAP target, use the default name or the variable
$ErrorFileName. The $ErrorFileName variable uses the following convention for reject file name:
s_dss_<task name>_<run number>_error.csv.bad
•
When you define a reject directory for an SAP target, use the variable $PMBadFileDir. When you use the
$PMBadFileDir variable, the Data Synchronization task writes the reject file to the following Secure Agent
directory:
<SecureAgent_InstallDir>/main/rdtmDir/error
CHAPTER 6
Data Synchronization Tasks with
SAP Table
This chapter includes the following topics:
•
Data Synchronization Tasks with SAP Table Overview, 48
•
SAP Table Sources in Data Synchronization Tasks, 49
•
SAP Table Lookups in Data Synchronization Tasks, 50
•
Configuring a Data Synchronization Task with a Single SAP Object as the Source, 50
•
Configuring a Data Synchronization Task with Multiple SAP Objects as the Source, 52
•
Monitoring a Data Synchronization Task, 54
•
Data Synchronization Task Example, 54
Data Synchronization Tasks with SAP Table
Overview
The Data Synchronization application allows you to synchronize data between a source and target.
You can configure a Data Synchronization task using the Data Synchronization Task wizard. You can use
SAP Table objects as sources, targets, or lookup objects. You can use expressions to transform the data
according to your business logic, use data filters to filter data before writing it to targets, and sort data in
ascending or descending order of multiple fields.
When you create a task, you can associate it with a schedule to run it at specified times or on regular
intervals. Or, you can run it manually. You can monitor tasks that are currently running in the activity monitor
and view logs about completed tasks in the activity log.
For more information about Data Synchronization tasks, see the Informatica Cloud User Guide.
SAP Table Sources in Data Synchronization Tasks
When you configure a Data Synchronization task to use an SAP Table source, you can configure the source
properties.
The source properties appear on the Source page of the Data Synchronization Task wizard when you specify
an SAP Table connection.
The following list describes the SAP Table source properties:
- Connection: Name of the source connection.
- Source Type: Source type. Select one of the following types:
  - Single. Select to specify a single SAP Table object.
  - Multiple. Select to specify multiple SAP Table objects. When you specify multiple source objects, you must create relationships between the source objects.
- Source Object: Source object for the task.
- Add: Adds multiple source objects.
- Create Relationship: Creates relationship between selected source object and related source object. Specify a join condition between a source object key field and a related source object key field.
- Edit Relationship: Edits a join condition.
- Display technical field names instead of labels: When selected, displays technical names instead of business names of the fields in the specified source object.
- Display source fields in alphabetical order: When selected, displays source fields in alphabetic order. By default, fields appear in the order returned by the source system.
- Data Preview: Displays the first 10 rows of the first five columns in the object and the total number of columns in the object.
- Preview All Columns: Previews all source columns in a file.
You can also configure advanced source properties when you schedule the Data Synchronization task.
Advanced source properties appear on the Schedule page of the Data Synchronization Task wizard.
The following list describes the SAP Table advanced source properties:
- Number of rows to be fetched: The number of rows that are randomly retrieved from the SAP table. Default value of zero retrieves all the rows in the table.
- Number of rows to be skipped: The number of rows to be skipped.
- Packet size in MB: Packet size. Default is 10 MB.
- Enable Compression: Enables compression. If the Secure Agent and the SAP system are not located in the same network, you may want to enable the compression option to optimize performance.
SAP Table Lookups in Data Synchronization Tasks
When you configure field mappings in a Data Synchronization task, you can create a lookup to an SAP Table
object.
When you use an SAP Table object as a lookup, you do not need to configure specific SAP Table properties.
For more information, see the Informatica Cloud User Guide.
Configuring a Data Synchronization Task with a
Single SAP Object as the Source
1.
Click Apps > Data Synchronization.
The Data Synchronization Task Wizard appears.
2.
Enter a name for the Data Synchronization task.
The names of Data Synchronization tasks must be unique within the organization. Data Synchronization
task names can contain alphanumeric characters, spaces, and the following special characters: _ . + -
Data Synchronization task names are not case sensitive.
3.
Enter a description for the Data Synchronization task.
The description can have a maximum length of 255 characters.
4.
Select the task operation that you want to perform on the target. Select one of the following options: Insert,
Update, Upsert, or Delete.
5.
Click Next to enter the source details.
a.
Select an SAP Table connection.
b.
Select Single as the source type.
c.
Click Select to specify the SAP source object.
The Select Source Object dialog box appears. The dialog box displays up to 200 objects. If the
objects you want to use do not appear, enter a search string to search based on name and
description.
d.
Click Select.
The Data Preview area displays the first 10 rows of the first five columns in the SAP object and the
total number of columns in the object. To preview all source columns in a file, click Preview All
Columns.
6.
To display technical names instead of business names, select Display technical field names instead
of labels.
7.
To display source fields in alphabetic order, click Display source fields in alphabetical order.
By default, fields appear in the order returned by the source system.
8.
Click Next to specify the target connection and target objects.
9.
Click Next to specify any data filters or sort criteria.
Note: Specify the row limit in the Advanced Source Properties section in the Schedule page.
10.
Click New to create a data filter. You can choose to create a simple or advanced data filter.
•
To create a simple data filter, select a source object, source field, and operator. Enter the value you
want to use and click OK.
•
To create an advanced data filter, click Advanced. Select a source object and enter the field
expression you want to use and click OK.
You can use parameters defined in a parameter file in the data filters. When you use a parameter in a
data filter, start the data filter with the parameter.
11.
Click New to configure the sort criteria.
a.
Select the source object, sort by field, and the sort direction.
b.
Click New to configure additional sort criteria or click Delete to remove a sort criteria.
12.
Click Next to configure the field mappings. Perform any of the following steps based on your
requirements.
a.
Click Edit Types in the Source column to edit the precision and scale of the SAP object.
b.
Click Add Mapplet to select a mapplet and optionally specify a connection for the mapplet.
c.
Click Automatch to match source and target fields with similar names.
d.
Click Refresh Fields to update the cache and view the latest field attributes.
e.
Click Edit Types in the Target column to edit the data type, precision, and scale of the target object.
Note that this option is not available for all target types.
f.
Select a source field and drag it to the target field to map the source and target fields. Repeat for all
the fields that you want to map.
g.
Click the Add or Edit Expression icon to define a field expression to transform data.
h.
Click the Add or Edit Lookup icon to create a lookup. Specify the lookup connection, object, source
and lookup fields, output field, multiplicity, and lookup expression.
i.
Click Validate Mapping to validate all the field mappings.
j.
Click Clear Mapping to clear all the field mappings.
13.
Click Next to configure a schedule and advanced options. Perform any of the following steps based on
your requirements.
a.
Click Run this task on schedule and specify the schedule you want to use.
b.
Configure the email notification options.
c.
Configure advanced options for the task.
d.
Configure the advanced source properties and advanced target properties.
e.
Specify the execution mode.
14.
Save the Data Synchronization task. You can choose one of the following save options.
•
Click Save and Close to save the task and close the Data Synchronization task.
•
Click Save and Continue to save the task and continue with configuring the Data Synchronization
task.
•
Click Save and Run to save and run the Data Synchronization task.
Configuring a Data Synchronization Task with
Multiple SAP Objects as the Source
1.
Click Apps > Data Synchronization.
The Data Synchronization Task Wizard appears.
2.
Enter a name for the Data Synchronization task.
The names of Data Synchronization tasks must be unique within the organization. Data Synchronization
task names can contain alphanumeric characters, spaces, and the following special characters: _ . + -
Data Synchronization task names are not case sensitive.
3.
Enter a description for the Data Synchronization task.
The description can have a maximum length of 255 characters.
4.
Select the task operation that you want to perform on the target. Select one of the following options: Insert,
Update, Upsert, or Delete.
5.
Click Next to enter the source details.
a.
Select an SAP Table connection.
b.
Select Multiple as the source type.
c.
Click Add to specify an SAP source object.
The Select Source Object dialog box appears. The dialog box displays up to 200 objects. If the
objects you want to use do not appear, enter a search string to search based on name and
description. To search for an object using the technical name, enclose the name in double quotes.
d.
Repeat the previous steps to add multiple SAP objects. To remove a selected object, click the
Delete icon.
6.
Create relationships between the multiple SAP objects.
a.
Select an SAP object and click Create Relationship to create the join conditions between the
source and the related object.
The Create Relationship dialog box appears.
b.
Specify the key field in the source SAP object, the type of join, the join operator, the related SAP
object, and the key field in the related object.
c.
Click OK to create the relationship.
d.
Repeat the previous steps to create multiple relationships.
7.
To display technical names instead of business names, select Display technical field names instead
of labels.
8.
To display source fields in alphabetic order, click Display source fields in alphabetical order.
By default, fields appear in the order returned by the source system.
9.
Click Next to specify the target connection and target objects.
10.
Click Next to specify any data filters or sort criteria.
Note: Specify the row limit in the Advanced Source Properties section in the Schedule page.
11.
Click New to create a data filter. You can choose to create a simple or advanced data filter.
•
To create a simple data filter, select a source object, source field, and operator. Enter the value you
want to use and click OK.
•
To create an advanced data filter, click Advanced. Select a source object and enter the field
expression you want to use and click OK.
You can use parameters defined in a parameter file in data filters. When you use a parameter in a
data filter, start the data filter with the parameter.
12.
Click New to configure the sort criteria.
a.
Select the source object, sort by field, and the sort direction.
b.
Click New to configure additional sort criteria or click Delete to remove a sort criteria.
13.
Click Next to configure the field mappings. Perform any of the following steps based on your
requirements.
a.
In the Source column, select one of the SAP objects or All source objects to map the fields.
b.
Click Edit Types in the Source column to edit the precision and scale of the selected SAP object.
c.
Click Add Mapplet to select a mapplet and optionally specify a connection for the mapplet.
d.
Click Automatch to match source and target fields with similar names.
e.
Click Refresh Fields to update the cache and view the latest field attributes.
f.
Click Edit Types in the Target column to edit the data type, precision and scale of the target object.
Note that this option is not available for all target types.
g.
Select a source field and drag it to the target field to map the field. Repeat for all the fields that you
want to map.
h.
Click the Add or Edit Expression icon to define a field expression to transform data.
i.
Click the Add or Edit Lookup icon to create a lookup. Specify the lookup connection, object, source
and lookup fields, output field, multiplicity, and lookup expression.
j.
Click Validate Mapping to validate all the field mappings.
k.
Click Clear Mapping to clear all the field mappings.
14.
Click Next to configure a schedule and advanced options. Perform any of the following steps based on
your requirements.
a.
Click Run this task on schedule and specify the schedule you want to use.
b.
Configure the email notification options.
c.
Configure advanced options for the task.
d.
Configure the advanced source properties and advanced target properties.
e.
Specify the execution mode.
15.
Save the Data Synchronization task. You can choose one of the following save options.
•
Click Save and Close to save the task and close the Data Synchronization task.
•
Click Save and Continue to save the task and continue with configuring the Data Synchronization
task.
•
Click Save and Run to save and run the Data Synchronization task.
Monitoring a Data Synchronization Task
When you run a Data Synchronization task, you can view details about the task in the activity monitor.
Select Monitor > Activity Monitor to view the task details.
After the job is completed, select Monitor > Activity Log to view the activity log. Select the name of the task
to view the task details. Click the session log to view details about the completed task. In addition, you can
view the ABAP statement associated with the task in the session log.
You can also monitor the progress of the task by calling Transaction SM37 from SAP. You can view the
actual job duration in SAP. The job duration listed in the Informatica Cloud activity log is a higher value
because it also includes time required to complete processing in Informatica Cloud.
You can view the HTTP and HTTPS log files in the SMICM transaction. Optionally, you can increase trace
level to 3 to view the detailed logs.
Data Synchronization Task Example
You can create a Data Synchronization task to read data from multiple SAP objects and write the data to a
flat file object.
You can read General Ledger Accounting line items from the BKPF and BSEG tables in SAP. BSEG is an
SAP Cluster table that is used to store Accounting Document Segment information. BKPF is a Transparent
SAP Table that is used to store Accounting Document Header information. In this example, you can join the
BKPF and BSEG tables and map the source object to a flat file target object.
In this example, to write the accounting document details to a flat file object, perform the following steps:
1.
Define the Data Synchronization task.
2.
To configure the SAP Table sources, select an SAP Table connection, and select the BKPF transparent
table and the BSEG cluster table as the source objects. Create join conditions between the source BKPF
table and the related BSEG table.
3.
To configure a flat file target for the task, select a flat file connection and specify a flat file object.
4.
Configure the field mappings to define the data that the Data Synchronization task writes to the target.
5.
Save and run the Data Synchronization task.
Step 1: Define the Data Synchronization Task
1.
Click Apps > Data Synchronization.
The Data Synchronization Task Wizard appears.
2.
Enter a name for the Data Synchronization task.
3.
Enter a description for the Data Synchronization task.
4.
Select the insert task operation for the target.
The following image shows the Data Synchronization task definition page:
5.
Click Next.
Step 2: Configure the SAP Table Source
1.
Select an SAP Table connection.
2.
Select Multiple as the source type.
3.
Click Add to specify the SAP source object.
The Select Source Object dialog box appears. Select the BKPF transparent table.
4.
Click Select.
5.
Click Add to select the BSEG cluster table.
The following image shows the Select Source Object dialog box:
6.
Create relationships between the SAP tables.
a.
Select the BKPF SAP object and click Create Relationship to create the join conditions between
the source BKPF table and the related BSEG table.
The Create Relationship dialog box appears.
b.
Specify the key field in the source SAP object, the type of join, the join operator, the related SAP
object, and the key field in the related object.
c.
Click OK to create the relationship.
d.
Repeat the previous steps to create multiple relationships.
The following image shows the Create Relationship dialog box:
7.
Select a source object to preview the data. The Data Preview area displays the first 10 rows of the first
five columns in the SAP object. You can also view the total number of columns in the object. To preview
all source columns in a file, click Preview All Columns.
8.
To display technical names instead of business names, select Display technical field names instead
of labels.
9.
To display source fields in alphabetical order, click Display source fields in alphabetical order.
By default, fields appear in the order returned by the source system.
The following image shows the join conditions for multiple SAP objects in the Data Synchronization task
source details page:
10.
Click Next.
Step 3: Configure the Flat File Target
1.
Select a flat file connection and select a flat file object.
2.
Select a target flat file object and click OK.
The following image shows a flat file object in the Data Synchronization task target details page:
3.
Click Next to specify any data filters or sort fields.
4.
Click Next.
Step 4: Configure the Field Mapping
1.
Map the source and target fields.
You can select all source objects or one of the source objects to map with the target fields.
2.
Click Next to configure a schedule and advanced options.
3.
Save and run the Data Synchronization task.
CHAPTER 7
Mappings and Mapping
Configuration Tasks with SAP
Table
This chapter includes the following topics:
•
Mapping and Mapping Configuration Tasks with SAP Table Overview, 59
•
SAP Table Sources in Mappings, 60
•
SAP Table Lookups in Mappings, 61
•
Configuring a Mapping with an SAP Table Source, 61
•
Creating a Mapping Configuration Task, 62
•
Mapping with an SAP Table Source Example, 63
Mapping and Mapping Configuration Tasks with SAP
Table Overview
Use a mapping to define data flow logic that is not available in Data Synchronization tasks, such as specific
ordering of logic or joining sources from different systems. Use the Informatica Cloud Mapping Designer to
configure mappings.
When you configure a mapping to describe the flow of data from source and target, you can also add
transformations to transform data. A transformation includes field rules to define incoming fields. Links
visually represent how data moves through the data flow.
After you create a mapping, you can run the mapping or you can deploy the mapping in a Mapping
Configuration task. The Mapping Configuration application allows you to process data based on the data flow
logic defined in a mapping or integration template.
Use the Mapping Configuration Task wizard to create a Mapping Configuration task. When you create a
Mapping Configuration task, you select the mapping or integration template for the task to use.
If you configured parameters, which are placeholders for information, in a mapping, you can define the
parameters in the Mapping Configuration task. Defining parameters provides additional flexibility and allows
you to use the same mapping in multiple Mapping Configuration tasks. For example, you can use a
parameter for a source connection in a mapping, and then define the source connection when you configure
the Mapping Configuration task.
When you create a Mapping Configuration task, you can associate the task with a schedule to run it at
specified times or on regular intervals. Or, you can run it manually. You can also configure advanced session
properties. You can monitor tasks that are currently running in the activity monitor and view details about
completed tasks in the activity log.
For more information, see the Informatica Cloud User Guide.
SAP Table Sources in Mappings
To read data from an SAP application, configure an SAP Table object as the Source transformation in a
mapping.
Specify the name and description of the SAP Table source. Configure the source and advanced properties
for the source object.
The following table describes the source properties that you can configure in a Source transformation:
Connection: Name of the source connection.
Source Type: Select one of the following types:
- Single. Select to specify a single SAP Table object.
- Multiple. Select to specify multiple SAP Table objects.
- Parameter. Select to specify a parameter name. You can configure the source object in a Mapping Configuration task associated with a mapping that uses this Source transformation.
Object: Source object. When you specify multiple source objects, you must create relationships between the source objects.
The following table describes the SAP Table advanced source properties:
Number of rows to be fetched: The number of rows that are randomly retrieved from the SAP Table. Default value of zero retrieves all the rows in the table.
Number of rows to be skipped: The number of rows to be skipped.
Packet size in MB: Packet size. Default is 10 MB.
Enable Compression: Enables compression. If the Secure Agent and the SAP system are not located in the same network, you may want to enable the compression option to optimize performance.
Tracing Level: Sets the amount of detail that appears in the log file. You can choose terse, normal, verbose initialization, or verbose data. Default is normal.
SAP Table Lookups in Mappings
In a mapping, you can configure a Lookup transformation to represent an SAP Table object.
When you use an SAP Table object as a lookup, you do not need to configure specific SAP Table properties.
For more information, see the Informatica Cloud User Guide.
Configuring a Mapping with an SAP Table Source
Use the Informatica Cloud Mapping Designer to configure a mapping.
1.
To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2.
Enter a name and description for the mapping, and click OK.
You can use alphanumeric characters and underscores (_) in the mapping name.
3.
To configure a source, on the Transformation palette, click Source.
4.
In the Properties panel, on the General tab, enter a name and description.
5.
Click the Source tab and configure source details.
6.
Specify the source type. You can choose one of the following options:
•
Select Single Object to select a single SAP object.
•
Select Multiple Objects to specify source object, related source object, and configure the
relationship between the source objects.
•
Select Parameter to configure the source objects in a Mapping Configuration task associated with
this mapping.
7.
Click Query Options in the Source tab to specify any filter and sort options for the SAP object.
8.
Click Advanced to specify the advanced source properties.
9.
To add or remove source fields, to update field metadata, or to synchronize fields with the source, click
the Fields tab.
10.
To add a transformation, on the Transformation palette, click the transformation name. Or, drag the
transformation onto the mapping canvas.
a.
On the General tab, you can enter a name and description for the transformation.
b.
Draw a link to connect the previous transformation to the transformation.
When you link transformations, the downstream transformation inherits the incoming fields from the
previous transformation.
For a Joiner transformation, draw a master link and a detail link.
c.
To preview fields, configure the field rules, or rename fields, click Incoming Fields.
d.
Configure additional transformation properties, as needed.
The properties that you configure vary based on the type of transformation you create.
e.
To add another transformation, repeat these steps.
11.
To add a Target transformation, on the Transformation palette, click Target.
a.
On the General tab, you can enter a name and description.
b.
Draw a link to connect the previous transformation to the Target transformation.
c.
Click the Target tab and configure target details. If necessary, configure the advanced target
properties.
Target details and advanced target properties appear based on the connection type. For more
information, see the Informatica Cloud Transformation Guide.
d.
To preview fields, configure the field rules, or rename fields, click Incoming Fields.
e.
Click Field Mapping and map the fields that you want to write to the target.
f.
To add another Target transformation, repeat these steps.
12.
Save and run the mapping or save and create a Mapping Configuration task.
Creating a Mapping Configuration Task
You can create a Mapping Configuration task based on a valid mapping or integration template on the
Mappings page.
1.
Click Apps > Mapping Configuration.
2.
Click New.
The New Mapping Configuration Task appears.
3.
Enter a name for the task.
Task names must be unique within the organization. Task names can contain alphanumeric characters,
spaces, and the following special characters: _ . + -
Task names are not case sensitive.
4.
Select the runtime environment that contains the Secure Agent that you want to use to access the SAP
tables.
5.
Select Mapping as the type of task on which you want to base the Mapping Configuration task.
6.
Click Select to specify a mapping.
The Select a Mapping dialog box appears.
7.
Select a mapping or search for the required mapping, and click OK.
The image of the selected mapping appears.
8.
Click Next.
If you specified any parameters in the source or target details in the mapping, the Source or Target page
appears. If not, the Schedule page appears.
9.
10.
Click Next to configure a schedule and advanced options. Perform any of the following steps based on
your requirements.
a.
Click Run this task on schedule and specify the schedule you want to use.
b.
Configure the email notification options.
c.
Configure advanced options for the task.
d.
Configure the advanced source properties and advanced target properties.
e.
Specify the execution mode.
Optionally, add advanced session properties.
a.
Click Add.
b.
Select a session property.
c.
Configure the value of the session property.
11.
Save and run the Mapping Configuration task.
Mapping with an SAP Table Source Example
You can create a mapping and a Mapping Configuration task to read data from a single SAP object and
write the data to a target object.
You can read data from an SAP purchasing document header, the EKKO table, and write the purchasing
details to any target.
In this example, to read data from the EKKO table and write the data to a flat file target object, perform the
following steps:
1.
Define the mapping.
2.
To configure an SAP Table source, select an SAP Table connection and select the EKKO table.
3.
To configure a flat file target, select a flat file connection, specify a flat file object, and map the source
and target fields.
4.
Save the mapping and create a Mapping Configuration task.
Step 1: Define the Mapping
1.
To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2.
Enter a name and description for the mapping.
The following image shows the New Mapping dialog box:
3.
Click OK.
Step 2: Configure the SAP Table Source
1.
To configure an SAP source, on the Transformation palette, click Source.
2.
In the Properties panel, on the General tab, enter a name and description.
3.
Click the Source tab to configure source details.
4.
Specify an SAP Table connection as the source object connection.
5.
Specify the source type as Single Object and click Select.
6.
In the Select Source Object dialog box, select the EKKO table.
7.
Click Query Options in the Source tab to specify any filter and sort options for the SAP Table object.
8.
Click Advanced to specify the advanced source properties.
The following image shows the source details page:
Step 3: Configure the Flat File Target
1.
To add a flat file Target transformation, on the Transformation palette, click Target.
2.
On the General tab, enter a name and description.
3.
Draw a link to connect the Source transformation to the Target transformation.
4.
Click the Target tab to configure the flat file target details.
5.
Specify a flat file connection as the target connection.
6.
Select the target type as Single Object and click Select.
7.
Specify a flat file object.
The following image shows the target details:
8.
To preview fields, click Incoming Fields.
The following image shows the incoming field details:
9.
Click Field Mapping and map the fields that you want to write to the target.
The following image shows the field mapping details:
Step 4: Save the Mapping and Create a Mapping Configuration
Task
1.
Click Save > Save and New Configuration Task.
The New Mapping Configuration Task page appears.
2.
Enter a name and description for the task.
3.
Select the runtime environment that contains the Secure Agent you want to use to access SAP tables.
The following image shows the Mapping Configuration task details:
4.
Click Next to configure the schedule and advanced options.
5.
Save and run the Mapping Configuration task.
Part V: Data Integration Using
BAPI/RFC Functions
This part contains the following chapters:
•
BAPI/RFC Mapplets, 69
•
Mapping and Mapping Configuration Tasks Using BAPI/RFC Functions, 75
CHAPTER 8
BAPI/RFC Mapplets
This chapter includes the following topics:
•
BAPI/RFC Mapplets Overview, 69
•
BAPI/RFC Mapplet Parameters, 70
•
Target Object for BAPI/RFC Error Output, 73
•
Rules and Guidelines for BAPI/RFC Mapplets in Mappings, 73
•
Importing BAPI/RFC Metadata, 74
BAPI/RFC Mapplets Overview
You can use the SAP Metadata utility to import BAPI/RFC metadata and generate a mapplet.
The BAPI/RFC mapplet includes a BAPI/RFC transformation. The BAPI/RFC transformation makes
BAPI/RFC calls in SAP. BAPI/RFC calls include requests to the SAP system, such as creating, changing, or
deleting data in SAP applications. To perform these tasks, BAPI/RFC functions use function parameter
values.
BAPI/RFC Mapplet Parameters
BAPI/RFC functions use function parameter values to perform tasks. A BAPI/RFC mapplet includes input and
output groups based on the BAPI/RFC transformation.
Function Parameters
BAPI/RFC functions can have the following parameters:
Scalar input parameters: Scalar input values. Some BAPI functions require scalar input values to perform tasks such as changing data.
Scalar output parameters: Scalar output values that a BAPI function returns after performing a task.
Table parameters: SAP structures with more than one row. Table parameters can be input, output, or both. Input table parameters pass table input values to a BAPI/RFC function. For example, some BAPI/RFC functions require table inputs to change data.
Input and Output Groups
The BAPI/RFC transformation can contain the following groups:
Scalar input: Input group for scalar parameters. Contains a field for each scalar input parameter. The group name is SCALAR_INPUT and the field names are SI_<FieldName>.
Table input: One group for each table parameter. The fields represent import structures. The group name is TABLE_INPUT_<structure name> and field names are TI_<StructureName>_<FieldName>.
Scalar output: Output group for scalar parameters. Contains a field for each scalar output parameter. The group name is SCALAR_OUTPUT and field names are SO_<FieldName>.
Table output: One group for each table parameter. The fields represent output structures. The group name is TABLE_OUTPUT_<structure name> and field names are TO_<StructureName>_<FieldName>.
Error output: Passes data from data conversion errors and invalid BAPI/RFC calls. Map the error output field to a target to see error messages about data conversion and BAPI/RFC calls.
BAPI/RFC Parameter Properties
When you get objects for a BAPI/RFC function in the SAP Metadata utility, you can view the parameter
properties and return structures.
You can also change the direction of the BAPI/RFC table parameters. The direction of the table parameters
determines which groups in the transformation are input groups and which are output groups.
If the BAPI return structure is custom, you can edit the return structure properties that you selected during
import for reusable BAPI/RFC transformations. Otherwise, you can view the BAPI/RFC return structure
parameters.
The following table describes the BAPI/RFC function parameter properties:
Name: Name of the export, import, and table parameters and columns.
Associated Type: Definition of the parameter in SAP.
Short Description: Short description of the export, import, and table parameters and columns.
Optional: Indicates if the Secure Agent should pass a value to the parameter when it calls the BAPI.
Direction: Indicates if the parameter requires input or provides output. Values are Input, Output, Both, or None.
Datatype: Object data type.
Precision: Object precision.
Scale: Object scale.
Default: Default value of the parameter in SAP, if any. SAP uses the default value when the Secure Agent does not pass a value to SAP.
The following table describes the parameter properties in the Return Structure tab:
Return Structure: Return parameter name to determine the status of function calls. Value is RETURN if the BAPI contains a default return structure. If the BAPI does not contain a default return structure, select any table output parameter or scalar output parameter of type STRUCTURE. Default is None.
Status Field: Required if you select a value for the return structure. Select a field from the structure for status. If you select None for the return structure, this parameter is blank.
Text Field: Required if you select a value for the return structure. Select a field from the structure for status messages. If you select None for the return structure, this parameter is blank.
Status Indicator for Warning: Enter an indicator message for warning. If you select None for the return structure, the value is W. Default is W.
Status Indicator for Error: Enter an indicator message for error. If you select None for the return structure, the value is E. Default is E.
Status Indicator for Abort: Enter an indicator message for abort. If you select None for the return structure, the value is A. Default is A.
BAPI/RFC Functions with Nested Structures
You can import metadata for BAPI/RFC functions with nested structures.
A BAPI/RFC transformation includes multiple groups. When a BAPI function contains a nested structure,
ports for the input and output groups in a BAPI/RFC transformation use the following naming convention:
<group_name>_<parameter_name>_<field name>
For example:
SCALAR_INPUT_PARAM1_FIELD1
If there are multiple input or output structures, the BAPI/RFC transformation includes each structure
parameter name in the port names. For example, BAPI Z_TST2 has the parameter INPUT1, which is of the
type ZTYPE1. ZTYPE1 has several components such as FIELD1 and FIELD2. FIELD2 is a component of the
type structure. It contains field F1. The naming convention in the BAPI/RFC transformation for FIELD1 is:
SCALAR_INPUT_INPUT1_FIELD1
The naming convention in the BAPI/RFC transformation for the field F1 is:
SCALAR_INPUT_INPUT1_FIELD2_F1
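The following Python sketch is illustrative only; it is not part of Informatica Cloud or the SAP Metadata utility. It shows how the naming convention above flattens a nested parameter definition into port names:

    def build_port_names(group, parameter, fields, prefix=None):
        # Recursively flatten a nested SAP structure into port names using the
        # <group_name>_<parameter_name>_<field_name> convention described above.
        names = []
        for field_name, subfields in fields.items():
            base = f"{group}_{parameter}_{field_name}" if prefix is None else f"{prefix}_{field_name}"
            if subfields:
                # Nested structure: recurse into its components.
                names.extend(build_port_names(group, parameter, subfields, prefix=base))
            else:
                names.append(base)
        return names

    # BAPI Z_TST2, parameter INPUT1 of type ZTYPE1: FIELD1 is scalar, FIELD2 is a structure with field F1.
    ports = build_port_names("SCALAR_INPUT", "INPUT1", {"FIELD1": None, "FIELD2": {"F1": None}})
    # ports == ["SCALAR_INPUT_INPUT1_FIELD1", "SCALAR_INPUT_INPUT1_FIELD2_F1"]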
System Variables
SAP uses system variables to set default values for some BAPI import parameters. The variables provide
information, such as current date and time for the operating system on which SAP runs. System variables
start with “SY-”. For example, SY-DATLO represents the local date of the SAP system.
The Secure Agent provides values for some system variables to define default input values for BAPI/RFC
parameters. The Secure Agent uses the values as default input values for some ports of BAPI/RFC
transformations. The Secure Agent uses the default values when there is no input for a port or when the port
is not connected to an upstream transformation or source.
You can use the following system variables:
SY-LANGU: Login language from the SAP application connection properties.
SY-MODNO: RFC handle value.
SY-MANDT: Value taken from the SAP application connection properties.
SY-DATUM: Local date on the Secure Agent machine processing the data.
SY-UZEIT: Local time on the Secure Agent machine processing the data.
SY-UNAME: Logon user ID from the SAP application connection properties.
SY-HOST: SAP host name from the SAP application connection properties.
Integration ID in BAPI/RFC Mapplet
The Integration ID field is a key field in the BAPI mapplet. Each BAPI/RFC mapplet includes an Integration ID
input field and output field.
When you run a mapping with a BAPI/RFC mapplet, the Secure Agent makes BAPI/RFC calls to SAP to
process the data. The Secure Agent makes the BAPI/RFC calls when it reaches the end of file. Depending on
the mapping configuration, the Secure Agent can also issue a commit.
The BAPI/RFC call is based on the input data of the Integration ID ports. The Secure Agent makes one call to
SAP for each Integration ID. Pass a value to the Integration ID ports in the scalar input group and all
mandatory table input groups of the BAPI/RFC mapplet.
Note: You must map the Integration ID input field even when a BAPI or RFC does not require other input
fields.
If the BAPI/RFC call fails or if there is a data conversion error, SAP passes the data for the integration ID in
comma-separated format to an error output group. If the mapping contains a target instance that is connected
to the error output group, the Secure Agent writes the data to the target.
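The call pattern can be pictured with the following Python-style sketch. It is illustrative only; the row format and the call_bapi helper are assumptions, not Informatica Cloud APIs:

    from itertools import groupby

    def process(rows, call_bapi):
        # Rows that share an Integration ID form one unit of work.
        rows = sorted(rows, key=lambda row: row["integration_id"])
        for integration_id, group in groupby(rows, key=lambda row: row["integration_id"]):
            # One BAPI/RFC call to SAP for each Integration ID.
            call_bapi(integration_id, list(group))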
Target Object for BAPI/RFC Error Output
To receive input data from a BAPI/RFC function call or data conversion errors from SAP, you can map a
target transformation to a BAPI/RFC mapplet.
Create a target transformation with a column of the String data type and precision of 65535. Connect the
column in the target object to the Error Output Group in the BAPI/RFC mapplet. The Secure Agent writes the
error output data up to 65,535 characters to the target in comma-delimited format. If the error output data is
longer than 65,535 characters, the Secure Agent truncates the data.
Rules and Guidelines for BAPI/RFC Mapplets in
Mappings
When you configure a mapping with a BAPI/RFC mapplet, use the following rules and guidelines:
•
Pass a value to the Integration ID ports in the scalar input group and all mandatory table input groups of
the BAPI/RFC transformation.
•
Add a target object if you want to receive BAPI/RFC function call errors from the BAPI error group.
Use the following guidelines when passing data to BAPI/RFC function input parameters:
•
When the function input parameter data type is INT1 or NUMC, provide positive values for the function
input.
•
When the source input data for a BAPI/RFC function is of the integer data type, do not use string data in
the source transformation. Otherwise, the mapping fails.
•
If the input data for a BAPI/RFC function mapping has a higher scale than the SAP metadata
specification, the Secure Agent rounds the data to comply with the SAP metadata. When you run a
mapping in high precision mode, the mapping can fail due to overflow if the round-off data cascades to the
precision digits. For example, the data type and precision for a BAPI/RFC function parameter is DEC
(6,5). The input data that you pass to the function parameter is 9.99999. When the Secure Agent
processes the input data, it rounds the input data to 10, which is not compatible with the SAP metadata.
The mapping fails. The sketch after this list illustrates the rounding behavior.
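A minimal sketch of the rounding behavior, using Python decimals and assuming a DEC (6,5) parameter with input data at a higher scale; it is not product code:

    from decimal import Decimal, ROUND_HALF_UP

    # DEC (6,5) allows at most one digit before the decimal point (maximum 9.99999).
    value = Decimal("9.999999")                        # input scale 6, higher than the SAP scale of 5
    rounded = value.quantize(Decimal("1.00000"), rounding=ROUND_HALF_UP)
    print(rounded)                                     # 10.00000, which overflows DEC (6,5), so the mapping fails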
Importing BAPI/RFC Metadata
You can use the SAP Metadata utility to import BAPI/RFC metadata and generate a mapplet.
1.
Navigate to the SAP Metadata utility installation directory and double-click the SAPUtil.bat file.
The Import SAP IDOC/BAPI/RFC wizard appears.
2.
Select the SAP system to which you want to connect.
All systems specified in the saprfc.ini file appear in the drop-down list. A sample saprfc.ini entry appears after these steps.
3.
Enter the SAP user name.
4.
Enter the password associated with the SAP user.
5.
Enter the client number.
6.
Enter the language code.
7.
Select BAPI/RFC and click Connect.
The connection to the SAP system is established.
8.
Click Next.
The Step 2: BAPI/RFC Selection page appears.
9.
Enter the name of the BAPI/RFC function.
Note: Verify that you enter the exact name of the BAPI/RFC function that you want to retrieve.
10.
Click Get Objects.
You can view the import, export, and table parameter details of the BAPI/RFC function.
11.
Select the scope of the transformation.
For real-time processing, select Transaction as the transformation scope. For batch processing and for validation in a non-production environment, select All Input as the transformation scope.
12.
Specify the direction for the tables to indicate if the table parameters in the BAPI are input, output, or
both.
13.
To select a directory for the output files, click the button next to the Output Directory field.
The Browse For Folder dialog box appears.
14.
Select a directory for the output files and click OK.
15.
Click Finish.
The mapplet for the specified BAPI/RFC function is created in the output directory.
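All systems that appear in the drop-down list in step 2 are defined in the saprfc.ini file. A minimal application server entry (TYPE=A) might look like the following; the destination name, host, and system number are placeholder values that you replace with your own:

    DEST=SAPDEV
    TYPE=A
    ASHOST=sapdev.example.com
    SYSNR=00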
CHAPTER 9
Mapping and Mapping
Configuration Tasks Using
BAPI/RFC Functions
This chapter includes the following topics:
•
Mapping and Mapping Configuration Tasks Using BAPI/RFC Functions Overview, 75
•
Importing a BAPI/RFC Mapplet to Informatica Cloud, 76
•
Configuring a Mapping with a BAPI/RFC Mapplet, 76
•
Mappings with BAPI/RFC Function Example, 77
Mapping and Mapping Configuration Tasks Using
BAPI/RFC Functions Overview
You can import a BAPI/RFC function as a mapplet to Informatica Cloud. You can then configure a mapping to
use the mapplet to manage data in SAP systems.
For example, to update sales order data in SAP, generate a BAPI/RFC mapplet from the
BAPI_SALESORDER_CHANGE function and configure a mapping using the mapplet.
You can configure a mapping with a BAPI/RFC mapplet to pass input data to BAPI/RFC function input
parameters.
To access the BAPI/RFC functionality through Informatica Cloud, perform the following tasks:
1.
Import the BAPI/RFC metadata from SAP and generate a mapplet using the SAP Metadata utility.
2.
Import the BAPI/RFC mapplet to Informatica Cloud.
3.
Configure a mapping using the generated BAPI/RFC mapplet. Map the Integration ID field and other
inputs and outputs. Save and run the mapping or create a Mapping Configuration task using this
mapping.
For information about mappings and Mapping Configuration tasks, see the Informatica Cloud User Guide.
Importing a BAPI/RFC Mapplet to Informatica Cloud
1.
Select Configure > Mapplets.
The Mapplets page appears.
2.
Click New.
The New Mapplet page appears.
3.
Enter a unique name for the BAPI/RFC mapplet.
4.
Optionally, enter a description for the BAPI/RFC mapplet you want to import.
5.
Select the mapplet type as Active.
All BAPI/RFC mapplets are active.
6.
Click Upload to navigate to the XML file you generated using the SAP Metadata utility.
The Upload Metadata XML File dialog box appears.
7.
Click Choose File.
By default, you can view the generated BAPI/RFC mapplets as XML files in the <SAP Metadata Utility
installation directory>/generatedMappings directory.
8.
Select an XML file and click Open.
You can view the input, output, and table details of the BAPI/RFC mapplet.
9.
Click OK.
You can view the imported mapplet in the Mapplets page.
Configuring a Mapping with a BAPI/RFC Mapplet
1.
To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2.
Enter a name and description for the mapping, and click OK.
You can use alphanumeric characters and underscores (_) in the mapping name.
3.
To configure a source, on the Transformation palette, click Source.
4.
In the Properties panel, on the General tab, you can enter a name and description.
5.
Click the Source tab and configure the source details.
Source details and advanced source properties appear based on the connection type.
6.
To add a BAPI/RFC mapplet transformation, on the Transformation palette, click Mapplet.
a.
On the General tab, enter a name and description for the mapplet.
b.
Draw a link to connect the previous transformation to the mapplet transformation.
c.
On the Mapplet tab, click Select.
The Select Mapplet dialog box appears.
d.
Specify a BAPI/RFC mapplet that you imported to Informatica Cloud and click OK.
e.
Click Connection to specify an SAP RFC/BAPI Interface connection. You can create a connection,
select a connection, or specify a parameter name for the connection.
f.
To preview fields, configure the field rules, or rename fields, click Incoming Fields.
g.
Click Field Mapping and map the incoming source fields with the Integration ID in the BAPI/RFC
mapplet.
7.
To add any other transformation, on the Transformation palette, click the transformation name. Or,
drag the transformation onto the mapping canvas.
a.
On the General tab, enter a name and description for the transformation.
b.
Draw a link to connect the previous transformation to the transformation.
When you link transformations, the downstream transformation inherits the incoming fields from the
previous transformation.
For a Joiner transformation, draw a master link and a detail link.
c.
To preview fields, configure the field rules, or rename fields, click Incoming Fields.
d.
Configure additional transformation properties, as needed.
The properties that you configure vary based on the type of transformation you create.
e.
To add another transformation, repeat these steps.
8.
To add a Target transformation, on the Transformation palette, click Target.
a.
On the General tab, you can enter a name and description.
b.
Draw a link to connect the previous transformation to the Target transformation.
c.
Click the Target tab and configure target details. If necessary, configure the advanced target
properties.
Target details and advanced target properties appear based on the connection type.
d.
To preview fields, configure the field rules, or rename fields, click Incoming Fields.
e.
Click Field Mapping and map the fields that you want to write to the target.
f.
To add another Target transformation, repeat these steps.
9.
Save and run the mapping or save and create a Mapping Configuration task.
For more information about transformations, see the Informatica Cloud Transformation Guide.
Mappings with BAPI/RFC Function Example
You can use the bapi_salesorder_createfromdat1 BAPI function to create sales order details for a
customer in your organization.
In this example to create a sales order, perform the following tasks:
Step 1: Import the metadata of the bapi_salesorder_createfromdat1 BAPI function using the SAP Metadata utility.
Perform the following steps to import the function metadata:
1.
Launch the SAP Metadata utility and specify the SAP connection properties to connect to the SAP
system.
2.
Verify that you select the BAPI/RFC option and then connect to the SAP system.
The Next button is enabled only after you establish a connection to the SAP system.
3.
Enter the complete name of the BAPI function, bapi_salesorder_createfromdat1, to fetch the
BAPI objects.
4.
Retain the default output directory for the generated mapplet.
Step 2: Import the generated mapplet to Informatica Cloud.
Log in to Informatica Cloud and import the generated mapplet file,
BAPI_SALESORDER_CREATEFROMDAT1_Mapping.xml, from the output directory.
Step 3: Configure a mapping using the generated mapplet.
Perform the following steps to configure a mapping:
1.
Specify source objects to enter the order header data from the ORDER_HEADER_IN structure, the
partner data using the ORDER_PARTNERS table, and item data using the ORDER_ITEMS_IN table
as input parameters.
2.
Add the Mapplet transformation. Draw a link to connect the flat file Source transformation to the
Mapplet transformation. Draw the following links:
•
ORDER_HEADER_IN source object to the Scalar_Input input port of the BAPI mapplet.
•
ORDER_ITEMS_IN source object to the Table_Input_Order_Items_IN input port of the BAPI
mapplet.
•
ORDER_PARTNERS source object to the Table_Input_Order_partners input port of the BAPI
mapplet.
3.
Configure the mapplet transformation.
a.
Select the generated mapplet from the output directory. Verify that you specify an SAP RFC/
BAPI Interface connection for the mapplet.
b.
Map the incoming source fields with the BAPI parameter properties.
4.
Configure a flat file object to which you can write the sales order details. Draw a link to connect the
Table Output in the Mapplet transformation to the flat file Target transformation. Create multiple flat
file target objects to write the sales order. Create the following target objects and map them to the
associated output ports in the BAPI mapplet:
•
ORDER_CFGS_BLOB, ORDER_CFGS_INST, ORDER_CFGS_PART_OF,
ORDER_CFGS_REF, and ORDER_CFGS_VALUE to write item configuration data.
•
ORDER_ITEMS_OUT to write detailed item data.
•
ORDER_CCARD to write the credit card details.
•
ORDER_SCHEDULE_EX to write the structure of VBEP with English field names.
•
Scalar_Output to write the Scalar_Output from the mapplet.
•
Error_Output to write the Error_Output from the mapplet.
Step1: Importing BAPI_SALESORDER_CREATEFROMDAT1
Metadata
Launch the SAP Metadata utility to generate a mapplet for the BAPI_SALESORDER_CREATEFROMDAT1 function.
1.
Navigate to the SAP Metadata utility installation directory and double-click the SAPUtil.bat file.
The Import SAP IDOC/BAPI/RFC wizard appears.
2.
Select the SAP system to which you want to connect.
All systems specified in the saprfc.ini file appear in the drop-down list.
3.
Enter the SAP user name.
4.
Enter the password associated with the SAP user.
5.
Enter the client number.
6.
Enter the language code.
7.
Select BAPI/RFC and click Connect.
The SAP Metadata utility establishes a connection to the SAP system.
The following image shows the Connection Properties dialog box in the SAP Metadata utility:
8.
Click Next.
The Step 2: BAPI/RFC Selection page appears.
9.
Enter the name of the BAPI/RFC function as BAPI_SALESORDER_CREATEFROMDAT1 and click Get Objects.
You can view the import, export, and table parameter details of the BAPI/RFC function.
10.
Specify the direction for the tables to indicate if the table parameters in the BAPI are input, output, or
both.
The following image shows the BAPI/RFC selection dialog box:
11.
Click Finish.
The SAP Metadata utility generates the BAPI_SALESORDER_CREATEFROMDAT1_Mapping.xml mapplet file
for the BAPI_SALESORDER_CREATEFROMDAT1 function and saves the file in the <SAP Metadata Utility
installation directory>/generatedMappings directory.
Step 2: Importing the BAPI_SALESORDER_CREATEFROMDAT1
Mapplet to Informatica Cloud
1.
Select Configure > Mapplets.
The Mapplets page appears.
2.
Click New.
The New Mapplet page appears.
3.
Enter a unique name for the mapplet.
4.
Optionally, enter a description for the mapplet you want to import.
The following image shows the New Mapplet page:
5.
Select the mapplet type as Active.
6.
Click Upload to navigate to the XML file you generated using the SAP Metadata utility.
The Upload Metadata XML File dialog box appears.
7.
Click Choose File.
8.
Navigate to the <SAP Metadata Utility installation directory>/generatedMappings directory,
select the BAPI_SALESORDER_CREATEFROMDAT1_Mapping.xml, and click Open.
The following image shows the input, output, and table details of the
BAPI_SALESORDER_CREATEFROMDAT1 mapplet:
9.
Click OK.
You can view the imported mapplet in the Mapplets page.
Step 3: Configuring a Mapping with the
bapi_salesorder_createfromdat1 Mapplet
1.
To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2.
Enter a name and description for the mapping, and click OK.
3.
To configure the flat file source, on the Transformation palette, click Source.
4.
In the Properties panel on the General tab, enter a name and description.
5.
Click the Source tab and configure the source details.
a.
Select a flat file connection.
b.
Select Single Object as the Source Type.
c.
Click Select to specify a flat file that contains the source fields.
The following image shows the flat file source details:
6.
Add order header, order items, and order partners as flat file source objects for the mapping.
7.
To add a BAPI/RFC mapplet transformation, on the Transformation palette, click Mapplet.
a.
On the General tab, enter a name and description for the mapplet.
b.
Draw a link to connect the flat file Source transformation to the Mapplet transformation.
c.
On the Mapplet tab, click Select.
The Select Mapplet dialog box appears.
d.
Select the bapi_salesorder_createfromdat1 mapplet and click OK.
e.
Click Connection to specify an SAP RFC/BAPI Interface connection.
The following image shows the bapi_salesorder_createfromdat1 mapplet details:
f.
To preview fields, configure the field rules, or rename fields, click Incoming Fields.
g.
Click Field Mapping and map the incoming source fields with the appropriate mapplet input fields.
The following image shows the field mapping of the incoming source fields with the mapplet input
fields:
8.
To add a flat file target, on the Transformation palette, click Target.
9.
On the General tab, enter a name and description for the target.
10.
Click the Target tab and configure target details.
a.
Select a flat file connection for the target.
b.
Select Single Object as the target type.
c.
Click Select to specify the target object.
The Target Object dialog box appears.
d.
Select the flat file object, and click OK.
e.
Optionally, you can choose to forward rejected rows in the Advanced Target options.
f.
Repeat the above steps to add all the flat file target objects.
The following image shows the target transformation details:
11.
Draw a link to connect the output ports in the Mapplet transformation to the flat file Target
transformation.
The following image shows the mapping configured for the bapi_salesorder_createfromdat1 mapplet:
12.
Save and run the mapping.
Part VI: Data Integration Using
IDocs
This part contains the following chapters:
•
IDoc Mapplets, 86
•
Mapping and Mapping Configuration Tasks Using IDocs, 95
CHAPTER 10
IDoc Mapplets
This chapter includes the following topics:
•
IDoc Mapplets Overview, 86
•
Segments and Groups, 86
•
Outbound Mapplet, 89
•
Inbound Mapplet, 90
•
Importing IDoc Metadata, 93
IDoc Mapplets Overview
You can import an IDoc as a mapplet using the SAP Metadata utility. An IDoc contains a hierarchical
structure consisting of segments. Each segment is an SAP structure defined in the SAP system.
An IDoc has header and data record components. The header component contains control information, such
as creation date and status. The control information is in an SAP structure called EDIDC. The data records
are in an SAP structure called EDIDD.
Segments and Groups
An IDoc is a hierarchical structure that contains segments. A segment can be a parent or child. A child
segment depends on another segment. A parent segment contains child segments. A parent segment can be
a child of another segment.
IDoc segments are organized into groups. The following rules determine the group to which a segment
belongs. A short sketch of these rules appears after the list:
•
A parent segment starts a new group. For example, in the MATMAS04 IDoc, the E1MARCM segment
contains a child and therefore starts a group.
•
A child segment that is not a parent belongs to the group that is started by its immediate parent. For
example, in the MATMAS04 IDoc, the E1MARA1 segment does not contain a child and therefore belongs
to the group of its parent E1MARAM.
•
A group can also be a parent or a child.
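A minimal Python sketch of these rules, assuming a simple segment object that knows its parent and children; it is illustrative only and not part of the SAP Metadata utility:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Segment:
        name: str
        parent: Optional["Segment"] = None
        children: List["Segment"] = field(default_factory=list)

    def group_of(segment: Segment) -> str:
        # A parent segment, or the top-level segment, starts a group named after itself.
        if segment.children or segment.parent is None:
            return segment.name
        # A child segment that is not a parent belongs to its immediate parent's group.
        return group_of(segment.parent)

    # MATMAS04 example from the text: E1MARA1 has no children, so it belongs to the E1MARAM group.
    e1maram = Segment("E1MARAM")
    e1mara1 = Segment("E1MARA1", parent=e1maram)
    e1maram.children.append(e1mara1)
    assert group_of(e1mara1) == "E1MARAM"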
Segment and Group Status
After you specify the message and doc type in the SAP Metadata utility, you can view the segments and
groups in the IDoc.
Segments and groups can be required or optional. In an IDoc mapplet, a required segment must exist in the
IDoc only if its group, its parent groups, and its parent segments are required or selected. For example, the
E1MARAM group is required. Therefore, its required child segment E1MAKTM must exist in the IDoc while its
optional child segment E1MARA1 does not have to exist in the IDoc.
If a required segment belongs to an optional group that is not selected, then the segment does not have to
exist in the IDoc. For example, the E1MARCM group is optional. Therefore, the required E1MARCM segment
also becomes optional.
When a segment is required, the Segment Status column is selected. When a group is required, the Group
Status column is selected.
For example, specify the message type as MATMAS, the IDoc Type as MATMAS04, and click Fetch. You can
view the segments and groups in the IDoc.
The following table describes how you can use the Segment Status and Group Status columns to understand
which segments are required in the MATMAS04 IDoc:
Segment Name   Segment Group   Segment Status   Group Status   Required in IDoc
E1MARAM        E1MARAM         Required         Required       Required
E1MARA1        E1MARAM         Optional         Optional       Optional
E1MARCM        E1MARCM         Required         Optional       Optional
The following image shows that the E1MARAM segment and the E1MARAM group are required:
IDocs Properties
When you fetch an IDoc in the SAP Metadata utility, you can view all the IDoc segments. Select a segment to
view the fields in the segment.
IDocs Properties
The following table describes the IDocs properties that you can view and specify in the SAP Metadata utility:
Message Type: Application messages that classify categories of data. For example, ORDERS and MATMAS (Material Master).
IDoc Type: Data structure associated with the message type. For example, MATMAS01 and MATMAS02 for MATMAS. IDocs contain the data associated with the message type.
Control Page: Displays the control record. You can add partner profiles to the control record as key-value pairs. You can also update and delete any partner profiles associated with the control record.
Select All Segments: Includes all segments in the IDoc mapplet.
Deselect All Segments: Removes all selected segments except required segments from the IDoc mapplet.
Select Transformation: Indicates the type of transformation you want to use to generate the IDoc mapplet. Select one of the following values:
- Prepare. Select to generate a mapplet that writes source data as an IDoc message.
- Interpreter. Select to generate a mapplet that reads IDoc messages.
- Both. Select to generate a mapplet that reads IDoc messages and another mapplet that writes IDoc messages.
Transformation Scope: Indicates how the Secure Agent applies the transformation logic to incoming data. Select one of the following values:
- Transaction. Applies the transformation logic to all rows in a transaction. Select Transaction when the results of the transformation depend on all rows in the same transaction, but not on rows in other transactions. When you select Transaction, associated mappings can run in real time.
- All Input. Applies the transformation logic to all incoming data and drops the incoming transaction boundaries. Select All Input when the results of the transformation depend on all rows of data in the source.
Default is All Input.
Segment Name: Segment names of the IDoc type.
Description: Description of the segments.
Select: Selects the data segments to include in the transformation. When you select a segment, the parent segments and all required child segments are also selected. When you clear a segment, all child segments are also cleared.
Segment Status: When selected, indicates that the segment is required in the IDoc mapplet.
Group Status: When selected, indicates that the group is required in the IDoc mapplet.
Min. Occurs: Minimum number of occurrences of the segment in an IDoc.
Max Occurs: Maximum number of occurrences of the segment in an IDoc.
Segment Fields
Select a segment name to view the field names of the segment.
The following table describes the segment field details:
Name: Field name of a segment.
Description: Description of the field.
SAP Datatype: SAP data type of the field.
Precision: Precision of the field.
Scale: Scale of the field.
Outbound Mapplet
You can capture changes to the master data or transactional data in the SAP application database in real
time.
When data in the application database changes, the SAP system creates IDocs to capture the changes and
sends the IDocs to Informatica Cloud. You can use the IDoc Reader connection to read the IDoc messages in
real time as they are generated by the SAP system.
If the Secure Agent is not running when the SAP system sends outbound IDocs, the Secure Agent does not
receive the IDocs. However, the SAP system stores the outbound IDocs in EDI tables, which are a staging
area for guaranteed message delivery. You can configure the SAP system to resend the IDocs by configuring
the tRFC port used to communicate with the Secure Agent. When you configure the port, you can enable
background processes in SAP that try to resend the IDocs to the Secure Agent a set number of times.
To generate the outbound mapplet to read IDoc messages from the SAP system, use the Interpreter
transformation when you import the IDoc metadata. Import the outbound mapplet to Informatica Cloud and
configure an outbound mapping.
Outbound Mapplet Ports
An outbound IDoc mapplet contains predefined ports. You cannot edit the ports.
The following table describes the mapplet ports:
Basic IDoc Type: Basic IDoc type name. The basic IDoc type defines the structure of an IDoc. Each basic type describes the standard IDoc segments, the format of the data fields, and the size. The basic type contains all the standard fields that are necessary for carrying out a business transaction.
Extended IDoc Type: Extended IDoc type name. An IDoc extension is an extension of a basic type and contains additional custom IDoc segments and fields that are not available in the standard basic type.
IDocRecord: IDoc message data.
DocumentNumber: Unique message number of the IDoc.
Target Object for Outbound Mapplet Error Output
You can configure an outbound IDoc mapping to write IDocs that are not valid to a relational or flat file target.
To write IDocs that are not valid to a relational or flat file target, connect the
IDoc_Interpreter_Error_Output_Group port in the outbound mapplet to a relational or flat file target object.
You must also configure the error log type session property in the Schedule page.
Inbound Mapplet
You can synchronize transactional data in a legacy application with the data in the SAP application database.
Use an inbound SAP IDoc mapping to send the transactional data from the legacy application database to
the SAP system. Informatica Cloud extracts the data from the legacy application data source, prepares the
data in SAP IDoc format, and sends the data to the SAP system as inbound IDocs using ALE. You can use
the IDoc Writer connection to write inbound SAP IDoc messages to SAP systems.
To generate the inbound mapplet to write IDocs to SAP systems, use the Prepare transformation when you
import the IDoc metadata. Import the inbound mapplet to Informatica Cloud and configure an inbound
mapping.
Key Fields and Control Record Fields
An IDoc mapplet includes a primary key (GPK) and a foreign key (GFK) in each segment. When you
configure a mapping, integration template, or Mapping Configuration task, map the primary key field, foreign
key field, and any control record fields.
Note: You can enable control record fields when you import the IDoc metadata.
The Prepare transformation in the SAP IDoc Writer mapplet can have primary key and foreign key fields and
other input fields, including control record fields.
The Prepare transformation has the following output fields:
•
IDoc Data. Map this field to an IDoc target.
•
Error IDoc Data. Map this field to see error messages about IDoc syntax/data conversion.
When you import IDoc metadata, you can add fields to a Prepare transformation. In the SAP Metadata utility,
you can click Control Page and add the control record keys as key-value pairs. For example, you can add the
following key-value pairs for the sender partner type and the sender partner number:
•
Key: SNDPRT and Value: LS for Logical System
•
Key: SNDPRN and Value ICS
In an inbound mapping, you can pass the sender partner number to SAP. You can pass a value to the
CONTROL_SNDPRN port in the control input group of the Prepare transformation. If you do not connect this
port to an upstream transformation, the Secure Agent uses the partner number value of SNDPRN key you
specify in the SAP Metadata utility.
IDoc Primary and Foreign Keys
An IDoc message is organized hierarchically with one top-level parent segment and one or more second-level child segments. Second-level child segments can also have one or more third-level child segments.
To maintain the structure of the IDoc data, the Prepare transformation in the SAP IDoc Writer mapplet uses
primary and foreign keys. The top-level parent segment has a primary key. Each child segment has a primary
key and a foreign key. The foreign key of each child segment references the primary key of its parent
segment. For example, the foreign key of a second-level child segment references the primary key of the top-level parent segment. Similarly, the foreign key of a third-level child segment references the primary key of
the second-level child segment.
The Prepare transformation groups incoming IDoc data based on the values in the primary and foreign key
fields. The Control Input group of the Prepare transformation represents the parent segment. All other groups
of the Prepare transformation except the ErrorIDocData group represent second-level or third-level child
segments. The ErrorIDocData group is used for processing invalid IDocs.
The following table shows the groups of the Prepare transformation and the fields used for the primary and
foreign keys:
Control Input Group:
- GPK_DOCNUM. Primary key of the parent segment.
Child Segment 1:
- GPK_<Child1_name>. Primary key of Child Segment 1.
- GFK_DOCNUM_<Child1_name>. Foreign key of Child Segment 1; references the primary key of the parent segment.
Child Segment A of Child Segment 1:
- GPK_<Child1A_name>. Primary key of Child Segment A of Child Segment 1.
- GFK_<Child1_name>_<Child1A_name>. Foreign key of Child Segment A of Child Segment 1; references the primary key of Child Segment 1.
Child Segment 2:
- GPK_<Child2_name>. Primary key of the IDoc child segment.
- GFK_DOCNUM_<Child2_name>. Foreign key of Child Segment 2; references the primary key of the parent segment.
Child Segment B of Child Segment 2:
- GPK_<Child2B_name>. Primary key of Child Segment B of Child Segment 2.
- GFK_<Child2_name>_<Child2B_name>. Foreign key of Child Segment B of Child Segment 2; references the primary key of Child Segment 2.
Each value for the GPK_<name> field needs to be unique. Each GFK_<parent_name>_<group_name> field
needs to reference the primary key of its parent segment.
For example, the following table shows the relationship of primary and foreign keys in an IDoc message
named ABSEN1 with four child segments:
Group                  Field                   Primary/Foreign Key
CONTROL_INPUT_ABSEN1   GPK_DOCNUM              P1
E2ABSE1                GPK_E2ABSE1             C1
E2ABSE1                GFK_DOCNUM_E2ABSE1      P1
E2ABSE2                GPK_E2ABSE2             C2
E2ABSE2                GFK_DOCNUM_E2ABSE2      P1
E2ABSE2A               GPK_E2ABSE2A            C2A
E2ABSE2A               GFK_E2ABSE2_E2ABSE2A    C2
E2ABSE3                GPK_E2ABSE3             C3
E2ABSE3                GFK_DOCNUM_E2ABSE3      P1
E2ABSE3B               GPK_E2ABSE3B            C3B
E2ABSE3B               GFK_E2ABSE3_E2ABSE3B    C3
E2ABSE4                GPK_E2ABSE4             C4
E2ABSE4                GFK_DOCNUM_E2ABSE4      P1
The Prepare transformation uses these primary and foreign key relationships to maintain the structure of the
IDoc data. Any foreign key field that does not match the primary key of its parent segment results in an
orphan row. Any primary key field that is not unique results in a duplicate row.
Verify that each IDoc message has a unique primary key for the top-level parent segment and for each child
segment, and that each foreign key matches the primary key of its parent segment.
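A minimal sketch of that verification, assuming prepared rows carry their GPK value and, for child rows, the GFK value that should match the parent; it is illustrative only and not product code:

    def validate_keys(parent_rows, child_rows):
        # Flag duplicate primary keys among the parent rows.
        parent_keys = [row["gpk"] for row in parent_rows]
        duplicates = {key for key in parent_keys if parent_keys.count(key) > 1}
        # Flag orphan child rows whose foreign key matches no parent primary key.
        orphans = [row for row in child_rows if row["gfk"] not in parent_keys]
        return duplicates, orphans

    # Example: the C2A row references C2, so it is valid; a row referencing C9 would be an orphan.
    parents = [{"gpk": "C2"}]
    children = [{"gpk": "C2A", "gfk": "C2"}, {"gpk": "C2B", "gfk": "C9"}]
    print(validate_keys(parents, children))   # (set(), [{'gpk': 'C2B', 'gfk': 'C9'}])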
Importing IDoc Metadata
1.
Navigate to the SAP Metadata utility installation directory and double-click the SAPUtil.bat file.
The Import SAP IDOC/BAPI/RFC wizard appears.
2.
Select the SAP system to which you want to connect.
All systems specified in the saprfc.ini file appear in the drop-down list.
3.
Enter the SAP user name.
4.
Enter the password associated with the SAP user.
5.
Enter the client number.
6.
Enter the language code.
7.
Select IDoc and click Connect.
The connection to the SAP system is established.
8.
Click Next.
The Step 2: Select SAP IDoc Prepare Transformation page appears.
9.
Enter the message type and the IDoc type, and click Fetch.
You can view the segment details of the IDoc.
10.
Select the transformation type. You can choose one of the following options:
•
To generate outbound mappings to read IDocs from an SAP system, select the Interpreter
transformation.
•
To generate inbound mappings to write IDocs to an SAP system, select the Prepare transformation.
•
To generate outbound and inbound mappings, select Both.
11.
Select the transformation scope. You can choose one of the following options:
•
Choose Transaction to apply the transformation logic to all rows in a transaction. Select Transaction
when the results of the transformation depend on all rows in the same transaction, but not on rows in
other transactions. When you select Transaction, associated mappings can run in real time. For
outbound mappings, select Transaction.
•
Choose All Input to apply the transformation logic to all incoming data, and to drop the incoming
transaction boundaries. Select All Input when the results of the transformation depend on all rows of
data in the source.
12.
Select the segments you want to include in the mapplet.
You can click Select All Segments to include all segments in the IDoc. Click Deselect All Segments to
remove all selected segments except required segments from the IDoc.
13.
To add other fields into the Control Record input group of a mapplet, perform the following steps:
a.
Click Control Page. Add more control records if you have more than one logical system.
b.
Select the checkbox for the field that you want to add, and then click Partner Profile > New.
c.
Enter the key and value for the partner profile. The key is the field name, and the value is the
partner type.
d.
Optionally, to add a control record field to the mapplet, from the Control Record page, select the
checkbox for the field you want to add. This will enable you to map the selected control record fields
when you configure a mapping, integration template, or Mapping Configuration task.
14.
Select a directory for the output files and click OK.
15.
Click Finish.
The mapplet for the specified IDoc is created in the output directory.
CHAPTER 11
Mapping and Mapping
Configuration Tasks Using IDocs
This chapter includes the following topics:
•
Mapping and Mapping Configuration Tasks Using IDocs Overview, 95
•
IDoc Reader Sources in Mappings, 96
•
Importing an IDoc Mapplet to Informatica Cloud, 97
•
Configuring an Outbound Mapping to Read IDocs from SAP, 97
•
Configuring an Inbound Mapping to Write IDocs to SAP, 99
•
Outbound Mapping to Read IDocs from SAP Example, 100
•
Inbound Mapping to Write IDocs To SAP Example, 105
Mapping and Mapping Configuration Tasks Using IDocs Overview
To send and read IDocs, Informatica Cloud integrates with SAP applications using Application Link Enabling
(ALE).
ALE is an SAP proprietary technology that enables data communication between SAP systems. ALE also
enables data communication between SAP and external systems.
You can configure outbound mappings to read IDocs from SAP and inbound mappings to write IDocs to SAP.
To configure an outbound mapping to read IDocs from SAP, perform the following tasks:
1. Import the IDoc metadata from SAP and generate a mapplet using the SAP Metadata utility. Verify that you selected the segments and groups you want to include in the IDoc. In addition, select the Interpreter transformation.
2. Import the IDoc mapplet to Informatica Cloud.
3. Configure an outbound mapping using the generated IDoc mapplet. Add a Source transformation to read data from the SAP system, configure the IDoc mapplet, and add a Target transformation to write the IDoc in the target object.
To configure an inbound mapping to write IDocs to SAP, perform the following tasks:
1. Import the IDoc metadata from SAP and generate a mapplet using the SAP Metadata utility. Verify that you selected the segments and groups you want to include in the IDoc. In addition, select the Prepare transformation.
2. Import the IDoc mapplet to Informatica Cloud.
3. Configure an inbound mapping using the generated IDoc mapplet. Add a Source transformation to read data from the source system, configure the IDoc mapplet, and add a Target transformation to write the IDoc to the SAP system.
For information about mappings and Mapping Configuration tasks, see the Informatica Cloud User Guide.
IDoc Reader Sources in Mappings
To read IDocs from an SAP application, use an SAP IDoc Reader connection and configure the IDoc Reader
source properties in the Source transformation in a mapping.
Specify the name and description of the IDoc Reader source. Configure the source and advanced properties
for the source object.
You can configure the following IDoc Reader source properties in a Source transformation:
• Connection: Name of the source connection.
• Source Type: Source type. Select Single for a single source object. When you select an SAP IDoc Reader connection, the source type is a single object and the source object is the IDoc Reader Object. The source object has the generic structure of an IDoc message.
• Object: Source object.
You can configure the following SAP IDoc Reader advanced source properties:
• Idle Time: Number of seconds the Secure Agent waits for IDocs to arrive before it stops reading from the SAP source. For example, if you enter 30 seconds for idle time, the Secure Agent waits 30 seconds after reading from the SAP source. If no new IDocs arrive within 30 seconds, the Secure Agent stops reading from the SAP source. Default is 300.
• Packet Count: Number of packets the Secure Agent reads from SAP before stopping. For example, if you enter 10 for Packet Count, the Secure Agent reads the first 10 packets from the SAP source and then stops. The Packet Size property in the ALE configuration determines the number of IDocs the Secure Agent receives in a packet. If you enter -1 for the packet count, the Secure Agent reads an unlimited number of packets. Default is -1.
• Realtime Flush Latency: Determines, in seconds, how often the Secure Agent flushes data from the source.
• Reader Time Limit: Period of time, in seconds, during which the Secure Agent reads IDocs from the SAP source. For example, if you specify 10 as the reader time limit, the Secure Agent stops reading from the SAP source after 10 seconds. If you enter 0 as the reader time limit, the Secure Agent continues to read IDocs from SAP for an infinite period of time. Default is 0.
• Recovery Cache Folder: Location of the recovery cache folder.
• Tracing Level: Amount of detail that appears in the log file. You can choose terse, normal, verbose initialization, or verbose data. Default is normal.
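For example, consider the following combination of settings (an illustration only, not a recommendation): Idle Time 30, Packet Count -1, Reader Time Limit 0, and Realtime Flush Latency 10. With these values, the Secure Agent reads an unlimited number of packets, flushes data from the source approximately every 10 seconds, and stops reading only after no new IDocs arrive for 30 consecutive seconds.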
Importing an IDoc Mapplet to Informatica Cloud
1. Select Configure > Mapplets.
The Mapplets page appears.
2. Click New.
The New Mapplet page appears.
3. Enter a unique name for the IDoc mapplet.
4. Optionally, enter a description for the IDoc mapplet that you want to import.
5. Select the mapplet type as Active.
All IDoc mapplets are active.
6. Click Upload to navigate to the XML file that you generated using the SAP Metadata utility.
The Upload Metadata XML File dialog box appears.
7. Click Choose File.
By default, you can view the generated IDoc mapplets as XML files in the <SAP Metadata Utility installation directory>/generatedMappings directory.
8. Select an XML file and click Open.
You can view the XML file details of the IDoc mapplet.
9. Click OK.
You can view the imported mapplet on the Mapplets page.
Configuring an Outbound Mapping to Read IDocs from SAP
1. To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2. Enter a name and description for the mapping, and click OK.
You can use alphanumeric characters and underscores (_) in the mapping name.
3. To configure an SAP source, on the Transformation palette, click Source.
4. In the Properties panel, on the General tab, enter a name and description.
5. Click the Source tab and select an SAP IDoc Reader connection.
When you select an SAP IDoc Reader connection, the source type is a single object and the source object is the IDoc Reader Object.
6. If required, configure the advanced source properties.
7. To add an IDoc mapplet transformation, on the Transformation palette, click Mapplet.
a. On the General tab, enter a name and description for the mapplet.
b. Draw a link to connect the previous transformation to the Mapplet transformation.
c. On the Mapplet tab, click Select.
The Select Mapplet dialog box appears.
d. Specify an IDoc mapplet that you imported using the Interpreter transformation and click OK.
8. To add any other transformation, on the Transformation palette, click the transformation name. Or, drag the transformation onto the mapping canvas.
a. On the General tab, enter a name and description for the transformation.
b. Draw a link to connect the previous transformation to the transformation.
When you link transformations, the downstream transformation inherits the incoming fields from the previous transformation.
For a Joiner transformation, draw a master link and a detail link.
c. To preview fields, configure the field rules, or rename fields, click Incoming Fields.
d. Configure additional transformation properties, as needed.
The properties that you configure vary based on the type of transformation that you create.
e. To add another transformation, repeat these steps.
9. To add a Target transformation, on the Transformation palette, click Target.
a. On the General tab, enter a name and description.
b. Draw a link to connect the previous transformation to the Target transformation.
c. Click the Target tab and configure target details. If required, configure the advanced target properties.
Target details and advanced target properties appear based on the connection type.
d. To preview fields, configure the field rules, or rename fields, click Incoming Fields.
e. Click Field Mapping and map the fields that you want to write to the target.
f. To add another Target transformation, repeat these steps.
10. Save and run the mapping, or save and create a Mapping Configuration task.
For information about the source and target transformations, see the Informatica Cloud Transformation
Guide.
Configuring an Inbound Mapping to Write IDocs to SAP
1. To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2. Enter a name and description for the mapping, and click OK.
You can use alphanumeric characters and underscores (_) in the mapping name.
3. To configure a source, on the Transformation palette, click Source.
4. In the Properties panel, on the General tab, enter a name and description.
5. Click the Source tab and configure the source details.
6. To add an IDoc mapplet transformation, on the Transformation palette, click Mapplet.
a. On the General tab, enter a name and description for the mapplet.
b. Draw a link to connect the previous transformation to the Mapplet transformation.
c. On the Mapplet tab, click Select.
The Select Mapplet dialog box appears.
d. Specify an IDoc mapplet that you imported using the Prepare transformation and click OK.
e. To preview fields, configure the field rules, or rename fields, click Incoming Fields.
Note: You must link the DOCNUM port of the mapplet to the source transformation. The DOCNUM
port represents a unique number for each IDoc and the SAP system does not accept inbound IDocs
without a unique document number.
7. To add any other transformation, on the Transformation palette, click the transformation name. Or, drag the transformation onto the mapping canvas.
a. On the General tab, enter a name and description for the transformation.
b. Draw a link to connect the previous transformation to the transformation.
When you link transformations, the downstream transformation inherits the incoming fields from the previous transformation.
For a Joiner transformation, draw a master link and a detail link.
c. To preview fields, configure the field rules, or rename fields, click Incoming Fields.
d. Configure additional transformation properties, as needed.
The properties that you configure vary based on the type of transformation that you create.
e. To add another transformation, repeat these steps.
8. To add a Target transformation, on the Transformation palette, click Target.
a. On the General tab, enter a name and description.
b. Draw a link to connect the previous transformation to the Target transformation.
c. Click the Target tab and configure SAP target details.
d. To preview fields, configure the field rules, or rename fields, click Incoming Fields.
e. Click Field Mapping and map the fields that you want to write to the target.
f. To add another Target transformation, repeat these steps.
9. Save and run the mapping, or save and create a Mapping Configuration task.
Outbound Mapping to Read IDocs from SAP Example
You can read material master (MATMAS) IDocs from SAP and write them to a flat file object.
In this example, to read the MATMAS IDocs, perform the following steps:
Step 1: Import MATMAS metadata using the SAP Metadata utility.
Perform the following steps to import the MATMAS IDoc:
1. Launch the SAP Metadata utility and specify the SAP connection properties to connect to the SAP system.
2. Verify that you select the IDoc option and then connect to the SAP system.
The Next button is enabled only after you establish a connection to the SAP system.
3. Enter MATMAS as the message type and MATMAS04 as the IDoc type to fetch the IDoc segments and fields for MATMAS.
4. To read IDocs, select the Interpreter transformation.
5. Retain the default output directory for the generated mapplet.
Step 2: Import the generated mapplet to Informatica Cloud.
Log in to Informatica Cloud and import the MATMAS mapplet XML file from the output directory.
Step 3: Configure a mapping using the generated mapplet.
Perform the following steps to configure a mapping:
1. Configure an SAP source. Specify an SAP IDoc Reader connection.
2. Add the Mapplet transformation. Draw a link to connect the Source transformation to the Mapplet transformation.
3. Map the incoming IDoc Record field with the IDocData field in the mapplet.
4. Configure the Mapplet transformation. Select the generated mapplet from the output directory.
5. Configure a flat file object to which you can write the material master details. Draw a link to connect the Control Output Group in the Mapplet transformation to the flat file Target transformation.
Note: Based on your requirement, you can choose to configure multiple flat file objects for each segment in the IDoc and for the IDoc_Interpreter_Error_Output_Group.
Step 1: Importing MATMAS IDoc Metadata
1. Navigate to the SAP Metadata utility installation directory and double-click the SAPUtil.bat file.
The Import SAP IDOC/BAPI/RFC wizard appears.
2. Select the SAP system to which you want to connect.
All systems specified in the saprfc.ini file appear in the drop-down list.
3. Enter the SAP user name.
4. Enter the password associated with the SAP user.
5. Enter the client number.
6. Enter the language code.
7. Select IDoc and click Connect.
The SAP Metadata utility establishes a connection to the SAP system.
The following image shows the Connection Properties dialog box in the SAP Metadata utility:
8. Click Next.
The Step 2: Select SAP IDoc Prepare Transformation page appears.
9. Enter the message type as MATMAS and the IDoc type as MATMAS04, and click Fetch.
You can view the segment and field details of the IDoc.
10. Select the Interpreter transformation and the scope of the transformation as Transaction.
The following image shows the SAP IDoc specification dialog box:
11. Select the segments you want to include in the mapplet.
You can click Select All Segments to include all segments in the IDoc. You can click Deselect All Segments to remove all selected segments except required segments from the IDoc.
12. Retain the default directory for the output files and click OK.
13. Click Finish.
The MATMAS04_Interpreter_Mapping.xml mapplet for the MATMAS IDoc is created in the <SAP Metadata Utility installation directory>/generatedMappings directory.
Step 2: Importing the MATMAS04_Interpreter_Mapping Mapplet to Informatica Cloud
1. Select Configure > Mapplets.
The Mapplets page appears.
2. Click New.
The New Mapplet page appears.
3. Enter a unique name for the IDoc mapplet.
4. Optionally, enter a description for the IDoc mapplet that you want to import.
The following image shows the New Mapplet page:
5. Select the mapplet type as Active.
6. Click Upload to navigate to the XML file that you generated using the SAP Metadata utility.
The Upload Metadata XML File dialog box appears.
7. Click Choose File.
8. Navigate to the <SAP Metadata Utility installation directory>/generatedMappings directory, select the MATMAS04_Interpreter_Mapping.xml file, and click Open.
The following image shows the input and output details of the MATMAS04_Interpreter_Mapping mapplet:
9. Click OK.
You can view the imported mapplet on the Mapplets page.
Step 3: Configuring an Outbound Mapping with the MATMAS IDoc
1. To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2. Enter a name and description for the mapping, and click OK.
3. To configure an SAP source, on the Transformation palette, click Source.
4. In the Properties panel, on the General tab, enter a name and description.
5. Click the Source tab and select an SAP IDoc Reader connection.
When you select an SAP IDoc Reader connection, the source type is a single object and the source object is the IDoc Reader Object.
6. If required, configure the advanced source properties.
The following image shows the SAP source details:
7. To add an IDoc mapplet transformation, on the Transformation palette, click Mapplet.
8. On the General tab, enter a name and description for the mapplet.
9. Draw a link to connect the Source transformation to the Mapplet transformation.
10. On the Mapplet tab, click Select.
The Select Mapplet dialog box appears.
11. Specify an IDoc mapplet that you imported using the Interpreter transformation and click OK.
The following image shows the MATMAS mapplet details:
12. To preview the incoming single IDoc Reader Object, click Incoming Fields.
The following image shows the incoming single IDoc Reader object:
13. Click Field Mappings to map the incoming IDoc Record field with the IDocData field in the mapplet.
14. To add a flat file Target transformation, on the Transformation palette, click Target.
15. On the General tab, enter a name and description.
16. Click the Target tab and configure target details.
a. Select a flat file connection for the target.
b. Select Single Object as the Target Type.
c. Click Select to specify the target object.
The Target Object dialog box appears.
d. Select the Create New at Runtime option, specify a name for the flat file object, and click OK.
e. Optionally, you can choose to forward rejected rows in the Advanced Target options.
The following image shows the target transformation details:
17. Draw a link to connect the mapplet groups to the flat file target object.
18. Save and run the mapping.
Inbound Mapping to Write IDocs To SAP Example
You can create material master (MATMAS) data in SAP by using the MATMAS IDoc type.
In this example, to write MATMAS IDocs to SAP, perform the following steps:
Step 1: Import MATMAS metadata using the SAP Metadata utility.
Perform the following steps to import the MATMAS IDoc:
1. Launch the SAP Metadata utility and specify the SAP connection properties to connect to the SAP system.
2. Verify that you select the IDoc option and then connect to the SAP system.
The Next button is enabled only after you establish a connection to the SAP system.
3. Enter MATMAS as the message type and MATMAS03 as the IDoc type to fetch the IDoc segments and fields for MATMAS.
4. To write IDocs, select the Prepare transformation.
5. Retain the default output directory for the generated mapplet.
Step 2: Import the generated mapplet to Informatica Cloud.
Log in to Informatica Cloud and import the MATMAS mapplet XML file from the output directory.
Step 3: Configure a mapping using the generated mapplet.
Perform the following steps to configure a mapping:
1. Configure multiple flat file sources to provide data to the mapplet input fields (a sample control record file appears after this list).
2. Add the Mapplet transformation. Draw a link to connect the Source transformation to the Mapplet transformation.
3. Configure the Mapplet transformation. Select the generated mapplet from the output directory.
4. Configure an SAP object to write the material master details and a flat file object to write the error details. Draw a link to connect the IDoc_Prepare_Output_Group_For_MATMAS03 to the IDocWriter object. Draw another link to connect the IDoc_Prepare_Error_Output_Group_For_MATMAS03 to the Error_Output flat file object.
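The following sketch shows what a flat file that feeds the Control_Input_Group might look like. The column names follow the standard IDoc control record and are shown only as an illustration; use the exact field names that the generated mapplet exposes, and provide a unique DOCNUM value for each IDoc:

DOCNUM,MESTYP,IDOCTP,SNDPRT,SNDPRN,RCVPRT,RCVPRN
1000000001,MATMAS,MATMAS03,LS,INFACLOUD,LS,SAPCLNT800
1000000002,MATMAS,MATMAS03,LS,INFACLOUD,LS,SAPCLNT800

SNDPRT and RCVPRT carry the partner type (for example, LS for a logical system), and SNDPRN and RCVPRN carry the partner numbers defined in the ALE configuration.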
Step 1: Importing MATMAS IDoc Metadata
1. Navigate to the SAP Metadata utility installation directory and double-click the SAPUtil.bat file.
The Import SAP IDOC/BAPI/RFC wizard appears.
2. Select the SAP system to which you want to connect.
All systems specified in the saprfc.ini file appear in the drop-down list.
3. Enter the SAP user name.
4. Enter the password associated with the SAP user.
5. Enter the client number.
6. Enter the language code.
7. Select IDoc and click Connect.
The SAP Metadata utility establishes a connection to the SAP system.
8. Click Next.
The Step 2: Select SAP IDoc Prepare Transformation page appears.
9. Enter the message type as MATMAS and the IDoc type as MATMAS03, and click Fetch.
You can view the segment and field details of the IDoc.
10. Select the Prepare transformation and the scope of the transformation as All Input.
11. Select the segments you want to include in the mapplet.
You can click Select All Segments to include all segments in the IDoc. You can click Deselect All Segments to remove all selected segments except required segments from the IDoc.
12. To add other fields to the Control Record input group of a mapplet, perform the following steps:
a. Click Control Page. Add more control records if you have more than one logical system.
b. Select the checkbox for the field that you want to add, and then click Partner Profile > New.
c. Enter the key and value for the partner profile. The key is the field name, and the value is the partner type (an example appears after this procedure).
d. Optionally, to add a control record field to the mapplet, on the Control Record page, select the checkbox for the field that you want to add. You can then map the selected control record fields when you configure a mapping, integration template, or Mapping Configuration task.
13. Retain the default directory for the output files and click OK.
14. Click Finish.
The MATMAS03_Prepare_Mapping.xml mapplet for the MATMAS IDoc is created in the <SAP Metadata Utility installation directory>/generatedMappings directory.
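In step 12, the partner profile key is a control record field name and the value is a partner type. For example (an illustration only; use the field names and partner types defined in your own ALE partner profiles):

Key: SNDPRN
Value: LS

This associates the sender partner number field with the logical system (LS) partner type.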
Step 2: Importing the MATMAS03_Prepare_Mapping Mapplet to Informatica Cloud
1. Select Configure > Mapplets.
The Mapplets page appears.
2. Click New.
The New Mapplet page appears.
3. Enter a unique name for the IDoc mapplet.
4. Optionally, enter a description for the IDoc mapplet that you want to import.
5. Select the mapplet type as Active.
6. Click Upload to navigate to the XML file that you generated using the SAP Metadata utility.
The Upload Metadata XML File dialog box appears.
7. Click Choose File.
8. Navigate to the <SAP Metadata Utility installation directory>/generatedMappings directory, select the MATMAS03_Prepare_Mapping.xml file, and click Open.
9. Click OK.
You can view the imported mapplet on the Mapplets page.
Step 3: Configuring an Inbound Mapping with the MATMAS IDoc
1. To create a mapping, click Design > Mappings, and then click New Mapping.
The New Mapping dialog box appears.
2. Enter a name and description for the mapping, and click OK.
3. To configure a flat file source, on the Transformation palette, click Source.
4. In the Properties panel, on the General tab, enter a name and description.
5. Click the Source tab and select a flat file connection.
6. Add separate flat file sources for the control_input group and segments in the MATMAS IDoc.
7. To add an IDoc Mapplet transformation, on the Transformation palette, click Mapplet.
8. On the General tab, enter a name and description for the mapplet.
9. On the Mapplet tab, click Select.
The Select Mapplet dialog box appears.
10. Specify an IDoc mapplet that you imported using the Prepare transformation and click OK.
11. Draw a link to connect the Source transformations to the Mapplet transformation. For example, connect the Control_Input source object to the Control_Input_Group for the MATMAS IDoc.
12. To preview the incoming fields, click Incoming Fields.
The following image shows the incoming fields:
13. Click Field Mappings to map the incoming fields with the mapplet input fields.
The following image shows the field mapping in the mapplet:
14. To add a Target transformation for writing to SAP, on the Transformation palette, click Target.
a. On the General tab, enter a name and description.
b. Click the Target tab and configure SAP target details.
15. To add a flat file Target transformation for the error output, on the Transformation palette, click Target.
a. On the General tab, enter a name and description.
b. Click the Target tab and configure flat file target details.
16. Draw a link to connect the IDoc_Prepare_Output_Group_For_MATMAS03 to the IDocWriter object. Draw another link to connect the IDoc_Prepare_Error_Output_Group_For_MATMAS03 to the Error_Output flat file object.
The following image shows the inbound mapping for the MATMAS IDoc:
17. Save and run the mapping.
APPENDIX A
SAP Data Type Reference
This appendix includes the following topics:
• SAP Data Type Reference Overview, 110
• SAP and Transformation Data Types, 111
SAP Data Type Reference Overview
Informatica Cloud uses the following data types in mappings, Data Synchronization tasks, and Mapping
Configuration tasks with SAP:
Native data types
Native data types are data types specific to the source and target databases or flat files. They appear in
non-SAP sources and targets in the mapping.
SAP data types
SAP data types appear in the Fields tab for Source and Target transformations when you choose to edit
metadata for the fields. SAP performs any necessary conversion between the SAP data types and the
native data types of the underlying source database tables.
Transformation data types
Set of data types that appear in the remaining transformations. They are internal data types based on
ANSI SQL-92 generic data types, which Informatica Cloud uses to move data across platforms.
Transformation data types appear in all remaining transformations in a mapping, Data Synchronization
task, or Mapping Configuration task.
When Informatica Cloud reads source data, it converts the native data types to the comparable
transformation data types before transforming the data. When Informatica Cloud writes to a target, it converts
the transformation data types to the comparable native data types.
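For example, consider a mapping that reads an SAP amount field of the CURR data type and writes it to a relational target (the scenario is illustrative only). The field appears with the SAP data type CURR when you edit the source metadata, Informatica Cloud converts the value to the transformation Decimal data type while the data moves through the mapping, and when the data is written, the Decimal value is converted to the comparable native numeric data type of the target database.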
SAP and Transformation Data Types
The following table lists the SAP data types that Informatica Cloud supports and the corresponding
transformation data types:
SAP Data Type - Transformation Data Type - Range for Transformation Data Type
ACCP - Date/time - Jan 1, 0001 A.D. to Dec 31, 9999 A.D.
CHAR - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
CLNT - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
CUKY - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
CURR - Decimal - Precision 1 to 28 digits, scale 0 to 28.
DATS - Date/time - Jan 1, 0001 A.D. to Dec 31, 9999 A.D. Precision to the nanosecond.
DEC - Decimal - Precision 1 to 28 digits, scale 0 to 28.
FLTP - Double - Precision 15, scale 0.
INT1 - Small Integer - Precision 5, scale 0.
INT2 - Small Integer - Precision 5, scale 0.
INT4 - Integer - Precision 10, scale 0.
LANG - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
LCHR - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
LRAW - Binary
NUMC - Decimal or Double - Precision 1 to 28 digits, scale 0 to 28.
PREC - Binary
QUAN - Decimal - Precision 1 to 28 digits, scale 0 to 28.
RAW - Binary - Uninterpreted sequence of bytes with a maximum length of 255 positions.
TIMS - Date/time - Jan 1, 0001 A.D. to Dec 31, 9999 A.D. Precision to the nanosecond.
UNIT - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
VARC - String - 1 to 104,857,600 characters. Fixed-length or varying-length string.
Index
B
BAPI/RFC
configuring mappings 76
importing mapplet 76
importing metadata 74
map error output 73
mapping example 77
nested structure 71
BAPI/RFC mappings
configuring 76
configuring mapping example 82
example 77
import mapplet example 80
import metadata example 78
overview 75
BAPI/RFC mapplets
importing 76
integration ID 72
overview 69
parameters 70
rules and guidelines 73
BAPI/RFC metadata
importing 74
BAPI/RFC parameter
properties 70
system variables 72
C
certificate
converting to PSE format 24
importing to SAP 25
Cloud Application Integration community
URL 8
Cloud Developer community
URL 8
connections
configuration for SAP IDoc and RFC/BAPI connections 25
configuration for SAP table connections 16, 17, 25
SAP 38
SAP configuration requirements 21
SAP IDoc Reader 39
SAP IDoc Writer 39
SAP RFC/BAPI Interface 38
SAP Table 37
creating
SAP Table connections 40
D
Data Synchronization tasks
example 54
monitoring 54
multiple SAP object sources 52
overview 48
rules and guidelines for SAP sources and targets 46
SAP Table lookups 50
SAP Table sources 49
single SAP object source 50
data types
SAP 110
defining logical system
configuration for SAP IDoc connections 29
H
HTTPS
configuring 22
HTTPS configuration
enable service on SAP 25
overview 22
prerequisites 22
I
IDoc mappings
overview 95
read example 100, 105
IDoc mapplets
importing 97
importing example 102, 107
overview 86
IDoc Reader mappings
sources 96
IDocs
importing metadata 93
importing metadata example 100, 106
mapping read example 100, 105
primary and foreign keys 91
properties 88
importing certificate
SAP trust store 25
importing IDocs
MATMAS example 100, 106
inbound mappings
configuring to write IDocs 99
configuring to write IDocs example 107
Informatica Cloud Community
URL 8
Informatica Cloud web site
URL 8
Informatica Global Customer Support
contact information 9
L
libraries
to read from and write to SAP tables 17
to read from SAP tables 17
to write to SAP tables 18
logical system
creating for SAP Connector 29
defining SAP Connector 29
SAP, program ID 30
M
Mapping Configuration tasks
creating 62
overview 59
mapping example
SAP Table source 63
mappings
IDoc Reader sources 96
overview 59
SAP Table lookups 61
SAP Table source example 63
SAP Table sources 60
mapplet
outbound 89
mapplets
BAPI/RFC 69
inbound 90
O
OpenSSL certificate
converting to PSE format 24
creating 23
outbound mappings
configuring to read IDocs 97
configuring to read IDocs example 103
outbound mapplet
map error output 90
ports 89
P
partner profile
creating outbound and inbound parameters 31
SAP ALE integration 31
program ID
ALE integration 30
SAP, logical system 30
R
RFC destination
creating tRFC port 30
rules and guidelines
BAPI/RFC mapplets 73
S
SAP
additional configuration for SAP table connections 16, 17, 25
data types 110
SAP (continued)
installing transport files for processing table data 21
SAP connections
IDoc and BAPI/RFC 38
overview 36
rules and guidelines 38
SAP Connector
communication interfaces 13
integration methods 11
overview 11
user authorizations 32
SAP Connector administration
overview 16
SAP IDoc Reader
connection properties 39
SAP IDoc Reader connection
creating 41
SAP IDoc Writer
connection properties 39
SAP iDoc Writer connection
creating 41
SAP integration methods
overview 11
using BAPI/RFC functions 12
using IDocs 12
using SAP Table 11
SAP libraries
for SAP IDoc and RFC/BAPI connections 17, 25, 26
to read from and write to SAP tables 17
to read from SAP tables 17
to write to SAP tables 18
SAP Metadata utility
installation and configuration 33
installing and configuring 33
overview 13
prerequisites 33
SAP RFC/BAPI connection
creating 41
SAP RFC/BAPI Interface
connection properties 38
SAP sources
tables and views 46
SAP sources and targets
rules and guidelines in Data Synchronization tasks 46
SAP Table
connection properties 37
SAP table connections
troubleshooting 43
SAP Table connections
creating 40
overview 37
SAP Table mapping example
defining the mapping 63
SAP Table mappings
configuring 61
SAP user authorization
configuring to process table data 20
saprfc.ini
configuring for IDoc and RFC/BAPI 29
configuring for SAP IDoc and BAPI/RFC 27
configuring for SAP tables 19, 20
entry types for IDoc and RFC/BAPI 27
entry types for SAP tables 19
parameters for IDoc and RFC/BAPI 28
parameters for SAP tables 19
segments and groups
overview 86
status 87
status
segments and groups 87
T
tRFC port
configuring for IDoc connections 30
trust site
description 9