Channel: SCN : All Content - Data Services and Data Quality

How to Resolve the (BODI-1112394) (BODI-1111081) error when calling a function module in SAP


Introduction

When calling a function module in SAP via an RFC function call, the (BODI-1112394) (BODI-1111081) error is a common occurrence. This document provides a solution for such cases.

 

Error Message

For demonstration purposes, I've chosen the function module GUID_CREATE, which is a standard FM in SAP. The requirement is to generate GUIDs in BODS by calling this function module and store the generated GUIDs in a template table. How these stored GUIDs are subsequently used in BODS ETL routines varies with the requirements, so this document will not explore it.

 

A screenshot of the error message is shown below in Fig 1.

Fig 1.jpg

 

The above error occurs because the function call in the mapping for the GUID column in Fig 1 is syntactically incorrect for BODS validation: BODS treats an RFC_FUNCTION mapping as invalid when no input parameters are passed in the function call.

 

The Function Module

The import and export parameter definitions for the FM GUID_CREATE are shown in Fig 2 below. The FM can be viewed in transaction SE37.

 

Fig 2.jpg

From Fig 2 it is clear that the function module does not need an input parameter; simply invoking the FM provides the GUID output defined in its export parameters. When this function module is executed in SAP, it runs without errors and returns the GUID as expected (Fig 3).

 

Fig 3.jpg

In BODS, however, the syntax error occurs precisely because this function module has no import parameters.

 

How to Resolve This Situation?

To resolve this error, an input parameter must be defined for the RFC_FUNCTION call so that BODS validation succeeds. This can be achieved by creating a custom function module (ZGUID_CREATE) in SAP that calls the original FM GUID_CREATE but also declares an import parameter. The import parameter need not be used inside the FM; its only purpose is to complete the syntax of the RFC function call in BODS.

 

Fig 4 below shows the "Define Input Parameter(s)" section of the RFC_FUNCTION call for the function module GUID_CREATE.

 

Fig 4.jpg

 

The definition of the custom function module ZGUID_CREATE is shown in Fig 5.

 

Fig 5.jpg
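
In case the screenshot is hard to read, here is a minimal sketch of such a wrapper. The parameter names I_DUMMY and E_GUID_32 are illustrative (use whatever names appear in Fig 5 on your system), and the sketch assumes the EV_GUID_32 export of the standard GUID_CREATE FM:

FUNCTION zguid_create.
*"----------------------------------------------------------------------
*"  IMPORTING
*"     VALUE(I_DUMMY) TYPE CHAR1
*"  EXPORTING
*"     VALUE(E_GUID_32) TYPE CHAR32
*"----------------------------------------------------------------------
* I_DUMMY is never read inside the FM; it exists only so that the
* RFC_FUNCTION call in BODS has an input parameter to pass and can
* therefore be validated. The FM must be created as a remote-enabled
* (RFC) function module.

  CALL FUNCTION 'GUID_CREATE'
    IMPORTING
      ev_guid_32 = e_guid_32.

ENDFUNCTION.

In the BODS mapping, any constant value (for example an empty string) can then be supplied for the dummy input parameter, which is what completes the RFC_FUNCTION syntax.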

 

Note that the ZGUID_CREATE function module does not use the declared import parameter in its source code; the parameter exists solely for the benefit of the RFC_FUNCTION call in BODS. In Fig 6 below, the RFC_FUNCTION call contains the input parameter in the "Define Input Parameter(s)" section.

 

Fig 6.jpg

 

The RFC_FUNCTION call in the mapping for the GUID column is now syntactically complete, as shown below, and validates without errors.

 

 

Fig 7.jpg

 

This is one way of resolving the syntax error message when calling a function module via an RFC function call.


Change source path stored in a global variable in a Data Services batch job


Hi Experts,

 

My organization created a job in Data Services 3.2 to cleanse data read from Excel flat files. The folder path was stored in a global variable (I think), and now the directories have changed, hence it is throwing the error below.

 

Error, Input file  does not exist please confirm existence and restart job, 16 ) >

failed, due to error <50316>: <>>> Error, Input file  does not exist please confirm existence and restart job>.

I want to update the folder path. I am sure it would be easy, but I am very new to BODS.


(12.2) 07-15-14 16:10:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Sleeping for 35.000000 seconds...  '

(12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Waking up......  '

(12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : 'Starting the timer loop number 6...'

(12.2) 07-15-14 16:10:43 (14232:12656) WORKFLOW: Work flow <WF_Metadata_Files> is started.

(12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> $G_FILENAME_IN : ALL_Metadata_SALES.xls...'

(12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> looking for input file name

                                                 \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls'

(12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>>  Input file Name is '

(12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB ERROR' : '>>> Error, Input file  does not exist please confirm existence and restart job'


I want to update the folder path \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls to \\Home\BIData\finance\production\sales\Metadata\ALL_Metadata_SALES.xls


When I investigated WF_Metadata_Files I saw there is a global variable called INPUT_DIR, and I assume I have to change the path there. I tried to find the old directory in the batch job but I can't find it, and even when I give a value to the global variable it still points to the old path.



Can anybody please help me.


Thanks

Tim

Web Service Error with Special characters (é, à)


Hi All,

 

I am facing an issue in a CRM interface that is connected to an external system using a web service. Both systems exchange data in XML format, which works fine. My process transfers master data created in CRM to the external system using that web service. This process fails (i.e. the XML does not flow to the external system) when my master data contains special characters such as é or à. I get an 'XML parser error' as a response in CRM, and in the external system these XMLs are not received/updated. This behavior occurs only in outbound mode (CRM to external system), whereas inbound mode (external system to CRM) works fine.

 

FYI, I am using UTF-8 encoding in CRM. My web service is a custom web service.

 

Please let me know if you need any clarifications.

 

Your response is much appreciated!!!!!

 

Thanks in Advance!!!!!!!

ABAP Dataflow or Regular Dataflow?


We can extract data from an ECC system using BODS by creating either a regular dataflow or an ABAP dataflow.

 

Are there any guidelines from SAP saying that we should use only ABAP dataflows?

 

What is the advantage of using an ABAP dataflow over a regular dataflow? Any performance improvement?

Data Services 4.2 HDFS (HDP 2.4)



Trying to read data from HDFS and load it into HANA. The job is simple, without any transformation, but it's not working and gives the following error:

 

8631 2822256384 RUN-050011 7/16/2014 9:25:10 AM |Data flow New_DataFlow3|Reader Query__AL_ReadFileMT_Read

8631 2822256384 RUN-050011 7/16/2014 9:25:10 AM Error: <HDFS Failed Connect to INBLRLLSSC114.apj.global.corp.sap in reader>.

8618 1423800064 RUN-050011 7/16/2014 9:25:27 AM |Data flow New_DataFlow3|Reader Query__AL_ReadFileMT_Read

8618 1423800064 RUN-050011 7/16/2014 9:25:27 AM Error: <HDFS Failed Connect to INBLRLLSSC114.apj.global.corp.sap in reader>.

 

Complete log is attached.

 

A job that reads from and loads to HDFS (HDFS file format) works, but HDFS to HANA or HDFS to a local file doesn't work.

 

It seems the environment setup file provided with Data Services (hadoop_env.sh) is for an older version of Hadoop.

 

Data Services is installed on Linux, and the Linux host is not part of the Hadoop cluster. I installed all required Hadoop components (Hive, Pig) on the Data Services host. Connectivity from the Data Services host to the Hadoop cluster is working fine; in fact there is no issue with data loading from Hive to HANA.

 

 

Any help/clue is appreciated to resolve this issue.

Using READ_TEXT to read multiline SAP data in DI


Hello experts,

 

I call the 'READ_TEXT' function in DI to get data from SAP. If the returned data contains only one line then everything is OK, but if the returned data contains several lines then READ_TEXT returns only the first line.

 

As an example: I need to get the 'Item Text' for a 'Purchase Order'. I call 'READ_TEXT' in an 'R/3 Data Flow', in a 'Query Transform', with the following input parameters:

-- Client: [SAP client]

--  ID: F01

--  Language: E

--  NAME: [PO number + item number]

--  Object: EKPO

 

And 'READ_TEXT' returns only the first line in the output parameter O_TLINE (varchar(132)).

 

Could you please advise how I can get multiline data, if it is possible with 'READ_TEXT'?
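
One workaround worth considering — a rough, untested sketch rather than a confirmed solution — is a custom RFC-enabled wrapper that calls READ_TEXT and concatenates the returned LINES table into one field, so DI receives the whole text in a single scalar output. The wrapper name Z_READ_TEXT_ALL and its parameter names below are hypothetical:

FUNCTION z_read_text_all.
*"----------------------------------------------------------------------
*"  IMPORTING
*"     VALUE(I_ID)       TYPE THEAD-TDID
*"     VALUE(I_LANGUAGE) TYPE THEAD-TDSPRAS
*"     VALUE(I_NAME)     TYPE THEAD-TDNAME
*"     VALUE(I_OBJECT)   TYPE THEAD-TDOBJECT
*"  EXPORTING
*"     VALUE(E_TEXT)     TYPE STRING
*"----------------------------------------------------------------------
* Reads the long text and returns all lines concatenated into one
* field, so the complete text reaches DI in a single scalar column.

  DATA: lt_lines TYPE STANDARD TABLE OF tline,
        ls_line  TYPE tline.

  CALL FUNCTION 'READ_TEXT'
    EXPORTING
      id        = i_id
      language  = i_language
      name      = i_name
      object    = i_object
    TABLES
      lines     = lt_lines
    EXCEPTIONS
      not_found = 1
      OTHERS    = 2.

  CHECK sy-subrc = 0.

  LOOP AT lt_lines INTO ls_line.
    IF e_text IS INITIAL.
      e_text = ls_line-tdline.
    ELSE.
      CONCATENATE e_text ls_line-tdline INTO e_text SEPARATED BY space.
    ENDIF.
  ENDLOOP.

ENDFUNCTION.

If STRING is not permitted in the RFC interface on your release, a long CHAR field, or returning the lines as a table and unnesting them in DI, would be alternatives.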

Thank you.

 

Regards,

Mikhail Krutalevich

Want to create a file format txt file with a dynamic file name


Within the file format's file name I tried $G_Load_Type, and tried to set $G_Load_Type = KPOP_20121106Facid.txt to use a datestamp within the file name.

When I ran the job I got the following error:

 

Data flow New_DataFlow4|Loader $G_Load_Type

Cannot open file <C:/Users/Johnson/Desktop/KPOP_20121106Facid.txt>. Check its path and permissions.

 

How can I create files with a dynamic name based on the datestamp of the day of job execution?

 

Thanks a lot for the helpful info.

DS 4 - Call RFC function without input parameters


Hi

I am calling an ECC function in BODS which is RFC-enabled. It does not take any input parameters but returns a TEXT as output.

I have used Row_Generation --> query1 to call the RFC function. This doesn't need to be unnested.

I have query1 --> query2, in which the output is connected to the target.

 

The validation error I am getting is:

[Function Call:SRS_DEC_135..ZRD_TEST_HELLO_WORLD]
NRDM Function <Syntax error at line <1>: <>: near <)> found <')'> expecting <'(', &ERROR, __AL_LOOKUPEX_TRAN, __AL_SEARCH_REPLACE_TRAN, __RFC_FUNCTION, __AL_SCRIPT_FUNCTION, __AL_STORED_PROCEDURE, __AL_EXTRACT_FROM_XML, __AL_TRAN_FUNCTION, +, AL_UNSPECIFIED_PARAM, CONVERT, a float, identifier, an integer, a null, a quoted identifier, ;, a string, a decimal, VARCHAR, VARIABLE, -, +>.
1 error(s), 0 warning(s).

Check and fix the syntax and retry the operation. (BODI-1112394)> (BODI-1111191)

 

I found that this has been partially discussed on another thread, http://scn.sap.com/message/13272300#13272300, but my function does not have input parameters, and that thread does not answer this scenario.

 

So... is it required that an RFC function being called in BODS has input parameters?

BTW, this simple RFC function has been tested on the ECC side for syntax and execution.

 

Can someone shed some light on this issue, please?

Many thanks

-- shalaka


Batch Job Failed - Error Log


Hi,

 

Batch job failed with the below error:

 

Error: Index of table <> is in UNUSABLE state. Unique index constraint violated. Drop index, remove duplicates from table.

 

I recently enhanced the job by adding new fields and defined the join conditions as below; the DS version is 3.2.

 

In From tab:

 

Outer source   /   Inner source

LFA1              /     ADRC

LFA1             /      ADRCT

LFA1            /       ZNIT

 

 

In WHERE tab:

LFA1.ADRNR = ADRC.ADRNR AND

ZNIT.LIFNR  = LFA1.LIFNR   AND

LFA1.ADRNR = ADRCT.ADRNR AND

LFA1.SPRAS = 'E'

Data Services job rolling back Inserts but not Deletes or Updates


I have a fairly simple CDC job that I'm trying to put together. My source table has a record type code of "I" for Inserts, "D" for deletes, "UB" for Update Before and "UP" for Update After. I use a Map_CDC_Operation transform to update the destination table based on those codes.

 

I am not using the Transaction Control feature (because it just throws an error when I use it)

 

 

My issue is as follows.

Let's say I have a set of 10,000 insert records in my source table. Record number 4000 happens to be a duplicate of record number 1. The job will process the records in order starting with record 1 and begin happily inserting records into the destination table. Once it gets to record 4000, however, it runs into a duplicate key issue; my try/catch block catches the error and the dataflow exits. All records inserted prior to the error are rolled back in the destination.

 

But the same is not true for updates or deletes. If I have 10000 deletes and 1 insert in the middle that happens to be an insert of a duplicate key, any deletes processed before the insert will not be rolled back. This is also the case for updates.

 

And again, I am not using Transaction Control, so I'm not sure why the inserts are being rolled back and, more curiously, why updates and deletes are not. I'm not sure why there isn't a consistent result regardless of the type of operation. Does anyone know what's going on here, or what I'm doing wrong/what my misconception may be?

 

Environment information: both source and destination are SQL Server 2008 databases and the Data Services version we use is 14.1.1.460.

 

If you require more information, please let me know.

DS Connectivity with SAP BW


Hi,

 

DS is working fine with SAP BW DEV, but while connecting to BWQ for testing it gets stuck, showing the error below. I have the same authorization as in DEV, and I created the RFC connection in SAP BW (SM59) and in DS.

 

Thanks in advance,

ODBC call for data source failed: . Notify Customer Support.


Need help!

 

While running the DS job through the Designer, we are getting the below error:

 

ODBC call <SQLDriverConnect> for data source <SErver name> failed: <[>. Notify Customer Support.

 

Additional notes:

 

1. The source and target datastores connect properly, as I have imported table definitions using them.

2. The source and target datastores point to a SQL Server 2008 R2 Express Edition DB (64-bit).

3. It seems the server machine has both 32-bit and 64-bit drivers installed. Please refer to the attached driverdetails.jpg file for more details.

 

Thanks

Dynamic File Name as source file format


Hi Experts,

 

 

I have to load a daily Excel workbook to HANA.

 

Files will come to a shared folder on the BODS job server in the format 'FILE_NAME_<DAYMONYYYY>.xlsx'.

Every day the file name is appended with the system date.

 

Example: 'FILE_NAME_17JUL2014.xlsx', 'FILE_NAME_18AUG2014.xlsx' etc.

 

I tried to achieve this by creating a substitution parameter and using it to define the file name dynamically.

 

But I am not able to define the parameter correctly (screenshot attached).

 

Please advise where I am going wrong. Is there a better way to achieve this?

 

 

Thanks,

SAP DS 4.2: job server group is not displayed when exporting a batch job


Hi Friends,

 

I have created 3 job servers and a job server group, and added all three job servers, which are on different machines, to the server group.

 

While running a job or exporting a job to a batch file I am not able to find the job server group name (in the drop-down box), but I can see the job server names there.

 

Could you please advise if I am missing anything in the job server configuration.

 

Thanks in Advance!!

Ram

SAP ECC Access for BODS Configuration


Hi All,

We would like to migrate data using the IDoc methodology and configure an RFC connection in the BODS Management Console. My query is: which accesses should we have for the SAP system so that we can load IDocs into SAP ECC? Thanks in advance.


Data Migration Using IDOC


Hi All,

While using IDocs for data migration, we see two ways in which the IDocs can be generated:

 

1. One IDoc containing multiple data segments.

 

2. Multiple IDocs, each containing a single data segment.

 

Which one of the above is the correct solution? Could you please provide input?

Looking for Information


Hi All,

I have a requirement in which I need to get data from BW and FTP it to a Unix system with a .dat extension.

Please let me know whether I will be able to create a .dat file from BODS or whether I need a workaround to get it done.

 

Regards,

Gannu. N

SAP BODS 4.0 client Installation


Hi,

 

I installed SAP BODS 4.0 on one of our servers (6 months back), and I installed the SAP BODS client on my local system (Win 7 OS). The BODS client and server were working fine. Last week I formatted my local system and installed the BODS client once again. I updated my Oracle TNS files, created the ODBC connection, and updated the system hosts file.

 

Now when I open BODS on my local system the first BODS screen opens, but after selecting any repository BODS does not open (the Designer closes automatically and I don't get any error). SAP BODS is working fine on the server.

 

I think I have missed something. Can anyone help me with this?

 

 

Thanks ,

Kaushik

SAP BODS architecture


Hi,

 

I am working with BODS version 4.0.

 

Can anyone tell me the architectural and functional differences between SAP BODS versions 4.0, 4.1 and 4.2?

 

 

Thanks ,

Kaushik

Configure HIVE adapter on Data Services 4.2 SP1


Hi all,

 

We are on version 4.2 and there is a request to connect to a Hadoop cluster via Data Services. In our environment the Hadoop installation is on Linux and the Data Services installation is on Windows. I have gone through a couple of blogs and a wiki page on SCN; all of them describe Data Services also being installed on Linux for the connectivity to work properly.

 

Question: has anyone successfully attempted such a connection to pull data out of Hadoop where Data Services is on Windows and Hadoop is on Linux? Can we place the required .jar files on the Windows machine and configure an adapter, and will it work?

 

Any help is appreciated.

 

_panda_


