Channel: SCN : All Content - Data Services and Data Quality
Viewing all 4013 articles

ABAP data flow shared directory method failing


Hi,

 

I am a relatively new BODS developer, and I am currently trying to set up an RFC connection to an SAP data source. After a considerable amount of pain, we finally managed to set it up, but since the data we pull is approximately 120 million records, we figured we cannot use normal data flows. We switched to ABAP data flows, but the 'Shared Directory' transfer method is not working. After quite a lot of googling and reading various posts on this community, we finally understood quite a bit of it. The issue is that our SAP source server is on a UNIX OS, whereas our BODS 4.0 is running on Windows. Installing Samba is not an option, as it is against our policies. Can someone help us out, please?

And if we need to switch to the Custom Transfer method because there is no other way to do it, could you tell us the steps to follow to set up an SFTP connection?

 

Thanks,

Ashok


How to consume date ranges while taking any Month/Year wise count?


Hi guys,

I am facing a scenario and need logic on how to build it in Data Services. Please help me with this:

 

There is an HR master data table with the following columns:

 

EMP_CODE

EMP_NAME

AREA

START_DATE

END_DATE

 

Example, records will look like this :

 

1003   Samuel    California   01 Apr 2011   31 Mar 2013
1003   Samuel    Arizona      01 Apr 2013   30 Nov 2015
1003   Samuel    New York     01 Dec 2015   31 Dec 9999
1007   Nathan    xyz          10 Aug 2013   10 Dec 2015
1009   Samarth   Caracus      30 Jun 1989   31 Mar 2011
1009   Samarth   Delhi        01 Apr 2011   25 Dec 2014

 

Each employee also has historical records, which indicate changes of location based on dates. An END_DATE of 31 Dec 9999 indicates that the employee is still working and present in the organization.

 

Now the requirement is to build a dashboard on this data in which KPIs show the headcount of employees by month and year. For example: how many employees were present in Dec 1997, Mar 2010, Feb 2016, etc. The data model should be robust enough that any month/year headcount can be pulled out of it based on these two columns, i.e. START_DATE and END_DATE. We can add further columns to the table if required, but as of now we have these two date columns on which we need to build the logic.

 

If we use the END_DATE column to find headcounts, the confusion is how to make the system understand that EMP_CODE 1003 was also present in all 12 months of 2012, 2014, and 2015 when queried from the reporting level (Tableau). If we consume the existing data as is, it will only show that 1003 was present in Mar 2013, Nov 2015, and Dec 9999. How do we show its presence in the remaining months? Do we need to generate rows for the intervening dates via some logic, or how should this be handled? The data needs to be generic, i.e. the user can see any month/year headcount from the data I make available in the table.
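A common way to handle this is to expand each employment interval into one row per month in the ETL (via a date dimension or generated rows), or to evaluate START_DATE/END_DATE directly at query time. Here is a minimal Python sketch of the interval logic, assuming the 31 Dec 9999 sentinel is capped at a chosen reporting date; the function names are illustrative, not from any BODS API:

```python
from datetime import date

CURRENT_END = date(9999, 12, 31)  # sentinel: employee still in the organization

def months_covered(start, end, cap):
    """Yield (year, month) for every month the interval [start, end] touches."""
    if end == CURRENT_END:
        end = cap  # cap open-ended records at the reporting date
    y, m = start.year, start.month
    while (y, m) <= (end.year, end.month):
        yield (y, m)
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)

def headcount(records, year, month, cap):
    """Count distinct employees present in the given month."""
    present = {emp for emp, start, end in records
               if (year, month) in months_covered(start, end, cap)}
    return len(present)

records = [  # (EMP_CODE, START_DATE, END_DATE), taken from the example above
    ("1003", date(2011, 4, 1), date(2013, 3, 31)),
    ("1003", date(2013, 4, 1), date(2015, 11, 30)),
    ("1003", date(2015, 12, 1), CURRENT_END),
    ("1007", date(2013, 8, 10), date(2015, 12, 10)),
]
print(headcount(records, 2014, 6, cap=date(2016, 3, 31)))  # 2: both 1003 and 1007
```

Materializing one row per employee per month with this kind of expansion makes the reporting side a simple distinct count per month, at the cost of table size.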

 

Please help on this urgently...Any help would be highly appreciated. Thank you for reading and paying attention.

 

Dirk Venken : Any inputs Please?

Unable to view Record Count of SAP ECC tables in profile tab


Hello everyone,

 

I came across something strange and thought I'd share it with the community here. I am using SAP DS 4.2 SP5. I have made a datastore connection to an SAP ECC system and imported a few HR master data tables like PA0000, PA0001, etc.

 

To my surprise, when I went to check the record count of these tables in the 'Profile' tab of the 'View data' window and clicked the 'Records' button, it said 0 records. The table contains over 1 lakh (100,000) records, but unlike other database tables, the ECC tables show 0 records when we click the 'Records' button in the 'Profile' tab of the view-data window.

 

I have never witnessed this before: to count the physical records of a table, we usually simply go to the 'Profile' tab of the database table and click the 'Records' button to get the count(*) of the table. I am not sure why this is not happening for SAP ECC datastore tables only.

 

Any help/inputs on this would be highly appreciated.

 

Thanks in advance.

Is 'push down to database' optimization technique applicable when source and target are SQL server databases?


Hi All,


Is push down to database optimization technique applicable when source and target are SQL server databases?


Shweta

Insert, update and delete flags in the Table Comparison transform


I am exploring the differences between the Table Comparison, Map Operation, and History Preserving transforms. On the internet I read that 'the output of the Table Comparison transform is rows flagged as INSERT or UPDATE'.

 

But I could not find any of these flags; instead, Table Comparison actually inserts and updates rows in the target table directly. I did not get output rows flagged as INSERT or UPDATE.

 

 

Which one is true?

BAPI error Infopackage Data Services Datasource


Hi,

 

I have created a Data Services datasource on BW and am looking to schedule a Data Services job from BW.

 

When I click on the '3rd party selection' tab of the InfoPackage and click refresh, I get the error below.

When the job is triggered from Data Services, I see a 'red' request in the PSA with the same error.

 

Error: Server repository could not create function template for 'BAPI_ISOURCE_DP_GETPARDEF' caused by: com.sap.conn.jco.JCoException:

Message no. RSM799

 

Could someone let me know how to resolve this? We are using Data Services 3.2 and SAP BW 7.01 SP3.

 

Thanks,

Raj

Splitting of BOE and DS on AIX


Hi,

Please guide me to the correct place if I ask this question in the wrong place.

 

My customer has an installation of BOE 4.1 SP4 PL3 on AIX. Data Services 4.2 SP3 PL1 is installed on the same server. The database is installed on a separate server.

 

Now they want DS installed on another (clean) AIX server. I am facing this error:

"No SIA Node Found

 

No valid local SIA nodes exist for the given CMS connection information.

You will not be able to install the following services: RFC Server, Administrator, Metadata exchange, and View Data)."

 

As far as I can understand from note 1837296, I should have a working SIA node. Thus, I have to fix this problem.

 

Correct? Or do I simply have to set up a SIA since I'm splitting the installation? (And should this SIA be on central server or local server or both?)

 

I have tried to look in Master Guide, Installation Guide and Administration Guide, without getting any wiser so far.

 

I have access to the CMC, but I'm not able to identify whether there is a "rogue SIA node" there. (It looks like there is no SIA node at all...)

From note 1891022 I find that a new SIA is created in the CCM (Central Configuration Manager). But I have not been able to find out anything about the CCM: how to start it, or from where. I'm accessing AIX via PuTTY, so if it's an X utility, I have a problem... Could it be that svrcfg is the same tool, but in a "command prompt version"? (Except I find nothing about a Server Intelligence Agent there...)


I also wonder when and why a SIA is installed, and when and for what it is needed. Knowing that would make it easier to figure out which problem I have...


I really hope someone out there finds it in their heart to guide me in the right direction here.


Kind regards,

Gørril



Data Services with Amazon S3/Redshift as target


Hello Experts,

 

I have a requirement for which I am not able to figure out the best possible way ahead. Any light you can shed on the problem will be really appreciated. I am using DS 4.1.1.

 

I have an application hosted on SQL Server 2008. I need to load millions of records from this DB to Amazon Redshift. Below are the two solutions I could think of:

 

1) Using a third-party data driver: use one of the many ODBC drivers available online (e.g. the DataDirect Redshift driver) to transfer data from SQL Server to a staging area and then in turn into the Redshift DB. As far as I know, there isn't any adapter shipped with SAP for this kind of requirement; please correct me if I am wrong. Without an external ODBC driver, the data load will take ages. Since a DB link is not an option with Redshift, I am not able to push down the complete data flow. The Bulk Load tab also does not appear for target tables imported from Redshift.

 

2) Creating a file format output, generating a file as the output of the data flow, then running a Java program to transfer that file from a shared drive to S3, and finally running the COPY command to move the data from S3 into Redshift. But in this approach I am not sure how to call the Java program from BODS, and the process to move the file from a shared drive to S3 is also vague.

 

Please, can anyone shed some light on this problem?

 

Thanks,


Migrating from BODS 4.1 to BODS 4.2 Cannot import IDOC


Hi,

 

We are upgrading from BODS 4.1 to 4.2.


We are using an IDoc as a target in one of our jobs. We are not able to re-import the IDoc in BODS 4.2. We are getting 'Unknown error when importing IDOC - BODI 1112415'.

 

The K900187.r22, R900187.r22, R000001.r63, and K000001.r63 transports have been imported into SAP ECC (SAP Notes 1980221 and 1916294).

 

Will it cause a problem if both the r22 and r63 transports are imported, or do we need to import only one of them? What else can be checked to solve this issue?

 

Note: We are able to import the IDOC in BODS 4.1.

 

Thanks,

Ravi

How to delete tasks in my work list of Information steward


Hello Experts,

 

I would like to delete the tasks in my worklist in Information Steward. Could someone guide me through the steps to do so?

 

 

 

Thanks in advance,

How to write a lookup for these SQL statements


Hi,

 

How can I write lookups for the two fields acct_type and cust_type?


case when (select d.tbl_seg_key from proddb2.lkuptbl d where d.tbl_name = 'TOPNATND' and substr(tbl_seg_key, 5,10) = a.cust_acct_no FETCH FIRST 1 ROWS ONLY ) is not null then 'National'

     when (select d.tbl_seg_key from proddb2.lkuptbl d where d.tbl_name = 'TOPNATN' and substr(tbl_seg_key, 5,10) = a.cust_acct_no FETCH FIRST 1 ROWS ONLY ) is not null then 'National'

      else ' '  end as acct_type,

case when (select d.tbl_seg_key from proddb2.lkuptbl d where d.tbl_name = 'TOPNATND' and substr(tbl_seg_key, 5,10) = a.cust_acct_no FETCH FIRST 1 ROWS ONLY ) is not null then 'Direct'

     else 'Non-Direct' end as cust_type,



Please suggest how I can implement these lookups in BODS.
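In Data Services, correlated subqueries like these are usually replaced by lookup_ext() calls in a Query transform (one per table-name condition, with caching enabled). The derivation of the two flags from the lookup results is simple; here is a hedged Python sketch of the equivalent logic, with table and column names taken from the SQL above (derive_flags itself is an illustrative name, not a BODS function):

```python
def derive_flags(cust_acct_no, lkuptbl):
    """Derive acct_type and cust_type for one account.
    lkuptbl: rows of (tbl_name, tbl_seg_key) from proddb2.lkuptbl.
    A match mirrors the SQL: substr(tbl_seg_key, 5, 10) == cust_acct_no."""
    def has_match(tbl_name):
        return any(name == tbl_name and key[4:14] == cust_acct_no  # 1-based substr(5, 10)
                   for name, key in lkuptbl)

    natnd = has_match("TOPNATND")
    natn = has_match("TOPNATN")
    acct_type = "National" if (natnd or natn) else " "
    cust_type = "Direct" if natnd else "Non-Direct"
    return acct_type, cust_type

rows = [("TOPNATND", "XXXX1234567890")]
print(derive_flags("1234567890", rows))  # ('National', 'Direct')
```

Note that both output columns reuse the same 'TOPNATND' match, so a single cached lookup result stored in an intermediate column can feed both output expressions instead of performing the lookup twice.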




Thanks,

Ramnath



data services - function table structure not correct


Hello Everyone!

 

I am using Data Services to call an RFC in my BW system, which is defined as a "Source BW" datastore in DS.

After importing the function, I realized that its structure was not imported correctly.

When I look in SE37 in BW, I can see the correct structure of the import table parameter:


BID tables.JPG

and then looking at table E_T_GRID_DATA:

DS Table E_T_GRID_DATA structure.JPG

In DS, however, E_T_GRID_DATA has only this "Line" component:

DS tables.JPG

 

Has anyone faced this, and does anyone know how to import the correct structure?



Thanks in advance,

Liron.


Cannot view data for SAP CRM table in BODS 4.2 SP5


Hi

 

I created a new datastore in the BODS local repository to connect to our QA CRM environment. I was able to import some tables, but I am not able to view data for table BUT000. I checked with my user ID in SAP Logon and was able to see the table data in transaction SE16.

 

 

Error calling RFC function to get table data: <RFC_ABAP_EXCEPTION-(Exception_Key: DATA_BUFFER_EXCEEDED, SY-MSGTY: E, SY-MSGID: FL, SY-MSGNO: 046, SY-MSGV1: /BODS/RFC_READ_TABLE)>.

 

 

 

Any inputs would be appreciated!!

 

Regards

Arun Sasi

 


xml error


I need help fixing an error in an XML file.

 

Normally, every XML file has the <gen:DocumentAttestationDate>2016-03-01</gen:DocumentAttestationDate> tag properly formed...

but in some of the XML files it comes out like this:
<gen:DocumentAttestationDate>

2015-09-07</gen:DocumentAttestationDate>

 

This means there is whitespace after the opening tag, which pushes the date onto the next line. How can I get the data and the closing tag onto the same line as the opening tag?
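If the stray whitespace comes from the source column, trimming it in the Query transform before the XML target (e.g. with ltrim_blanks()/rtrim_blanks() on the mapped column) is the usual in-flow fix. If you have to repair already-generated files instead, a small post-processing script can collapse the whitespace between a tag and its text. A hedged Python sketch (this regex approach is my assumption, not a BODS feature):

```python
import re

def tidy_element_text(xml: str) -> str:
    """Join element text back onto the same line as its tags, so
    <tag>\\n  value</tag> becomes <tag>value</tag>."""
    # whitespace/newline between an opening tag and its text
    xml = re.sub(r">\s*\n\s*([^<\s])", r">\1", xml)
    # whitespace/newline between the text and its closing tag
    xml = re.sub(r"([^>\s])\s*\n\s*<", r"\1<", xml)
    return xml

broken = "<gen:DocumentAttestationDate>\n2015-09-07</gen:DocumentAttestationDate>"
print(tidy_element_text(broken))
# <gen:DocumentAttestationDate>2015-09-07</gen:DocumentAttestationDate>
```

The character classes exclude '<' and '>', so indentation around nested elements is left alone; only text glued directly to its own tags is rejoined.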

 

thanks for your help

expert please come in!!!


I have this table:

Start_date    End_date      Due_day
2016.03.01    2016.06.23    12

 

I want to produce a table like this (generating the date (month + due day) for every month between the start date and the end date):

 

Start_date    End_date      Mid_date      Due_day
2016.03.01    2016.06.23    2016.03.12    12
2016.03.01    2016.06.23    2016.04.12    12
2016.03.01    2016.06.23    2016.05.12    12
2016.03.01    2016.06.23    2016.06.12    12
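One way to do this in Data Services is to cross-join the input row against a Row_Generation transform (one row per month index) and compute the mid date from the sequence number. The underlying logic, sketched in Python (a minimal sketch; it assumes Due_day is a valid day in every month in range, which always holds for days 1-28):

```python
from datetime import date

def due_dates(start: date, end: date, due_day: int):
    """Return date(year, month, due_day) for every month from start's
    month through end's month, keeping only dates inside [start, end]."""
    out = []
    y, m = start.year, start.month
    while (y, m) <= (end.year, end.month):
        d = date(y, m, due_day)
        if start <= d <= end:
            out.append(d)
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)
    return out

for d in due_dates(date(2016, 3, 1), date(2016, 6, 23), 12):
    print(d)  # 2016-03-12, 2016-04-12, 2016-05-12, 2016-06-12
```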

Issue when we try to consume a RESTful web service


Hi Experts,

 

We are actually facing this issue when trying to consume a RESTful web service:

 

 

Error: an element named <envelope> present in the XML data input does not exist in the XML format used to set up this XML source in data flow <…>. Validate your XML data.

 

pic1.jpg

 

Steps Followed to connect to webservices:

⦁ The client submitted XSD request and response files

⦁ Submitted files:

⦁ request_cmm.xsd

⦁ response_cmm.xsd

⦁ The request_cmm.xsd file contains a line that SAP Data Services translates into an input parameter at function-import time. We deleted this line.

 

pic2.jpg

 

 

⦁ The client submitted the WADL file: wadl_cmm.wadl

⦁ We needed to modify the WADL to reference the XSD files submitted by the client. Attached is an image of the WADL submitted by the client:

pic3.jpg

⦁ The WADL file after our line adjustments:

pic4.jpg

⦁ We placed the XSD and WADL files on the SAP Data Services Designer workstation.

⦁ Created a new datastore

⦁ Datastore type: Web Service REST

⦁ Specified the WADL

⦁ Clicked Apply to test the connectivity

⦁ We imported the needed function.

⦁ The function was added to the datastore

⦁ We created a job to consume the web service

 

⦁ We created the job flow in SAP Data Services Designer

pic5.jpg

 

⦁ When we execute the created job, we get this in the trace file (trace_rest.txt):

pic6.jpg

⦁ The message is from the RESTful web service.

 

⦁ The job ends with the following error message:

 

pic7.jpg

 

Please, if you have any suggestions or know anything that might help with this, let me know. Thanks in advance.

 

Our system is SAP Data Services 4.2 SP5 Patch 3 on Red Hat Linux 6.6.

Total Hours Calculation group by week in Data Services


I have a requirement to create an EMPLOYEE TIMESHEET HISTORY table where we add up total timesheet hours per employee, grouped by week.

Below is example source data for employees, with day-wise timesheet hours for 3 weeks:

 

 

EMPLOYEE_CODE   DATE_TIMEBAND   ACTUAL_HOURS
1               1/01/2016       0
1               2/01/2016       0
1               3/01/2016       0
1               4/01/2016       0
1               5/01/2016       8
1               6/01/2016       8
1               7/01/2016       8
1               8/01/2016       8
1               9/01/2016       0
1               10/01/2016      0
1               11/01/2016      8
1               12/01/2016      8
1               13/01/2016      8
1               14/01/2016      8
1               15/01/2016      8
2               4/01/2016       8
2               5/01/2016       8
2               6/01/2016       8
2               7/01/2016       8
2               8/01/2016       8
2               9/01/2016       0
2               10/01/2016      0

 

 

We need the output like below:

 

EMPLOYEE_CODE   DATE_TIMEBAND (Week_Starting_Date)   TOTAL_HOURS (WEEKLY)
1               28/12/2015                           0
1               4/01/2016                            32
1               11/01/2016                           40
2               4/01/2016                            40

 

 

Please suggest any approach to solve this problem.
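One approach is to derive a week-start column and aggregate on it: in a Query transform, compute the Monday of each DATE_TIMEBAND (e.g. using day_in_week()) and then group by EMPLOYEE_CODE and that derived date with sum(ACTUAL_HOURS). The logic, sketched in Python against a slice of the sample data (a minimal sketch, assuming weeks start on Monday as in the expected output):

```python
from collections import defaultdict
from datetime import date, timedelta

def week_start(d: date) -> date:
    """Monday of the week containing d."""
    return d - timedelta(days=d.weekday())

def weekly_totals(rows):
    """rows: (employee_code, work_date, actual_hours) tuples.
    Returns {(employee_code, week_start_date): total_hours}."""
    totals = defaultdict(int)
    for emp, d, hours in rows:
        totals[(emp, week_start(d))] += hours
    return dict(totals)

rows = [  # employee 1, first two weeks of the sample data
    (1, date(2016, 1, 1), 0), (1, date(2016, 1, 2), 0), (1, date(2016, 1, 3), 0),
    (1, date(2016, 1, 4), 0), (1, date(2016, 1, 5), 8), (1, date(2016, 1, 6), 8),
    (1, date(2016, 1, 7), 8), (1, date(2016, 1, 8), 8),
]
print(weekly_totals(rows))
# {(1, datetime.date(2015, 12, 28)): 0, (1, datetime.date(2016, 1, 4)): 32}
```

This reproduces the expected rows above: 1/01/2016 was a Friday, so it falls into the week starting 28/12/2015 with 0 hours, while 4/01-8/01 sum to 32 for the week starting 4/01/2016.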

 

Thanks in advance for help and support

How to perform a DataServices Support Package update?


Hello  Experts,

 

We have been asked to perform a DS SP update from DS 4.2 SP1 to DS 4.2 SP5 Patch 2.

 

It is clear to me that I have to update IPS and Information Steward to the SP levels compatible with DS.

 

My concern is that I have searched for a DS SP update guide, but there is only a DS Installation Guide and a DS Upgrade Guide, which are not very clear about an SP update within the same release (4.2).

 

In the DS support-package software section, I can see the complete software comes together (it seems a new, full DS installation needs to be performed):

 

 

DS.png

 

Based on this we have the following concerns/questions:

 

- Is there any documentation we can follow for a DS SP and patch update?

 

- Does this DS SP update have to be performed following the steps for a DS upgrade activity?

 

- We have also found a very detailed post, "SAP Data Services 4.x Upgrade steps".
Would this post also apply to a DS SP activity on the same release (4.2)?

 

 

Any guidance is much appreciated!

 

 

Thank you

 

Alex C.

 

 

Current System features for your reference:

 

1 - SAP Data Services 4.2 SP1 Patch 1

2 - SAP Information Steward 4.2 SP1

3 - Information Platform Services 4.1 SP2

 

DB

 

Oracle Database 11g Enterprise Edition Release 11.2.0.3.0

 

OS

 

SUSE Linux Enterprise Server 11 (x86_64)

VERSION = 11

PATCHLEVEL = 4

Unable to view Options for SAP DQM transforms


Hi Folks,


Recently, we installed SAP DQM 4.2 SP4 and imported all the real-time service jobs into SAP Data Services.


When we clicked on each data quality transform, we were able to see parameters on the Input and Output tabs, but the Options tab was totally empty (no parameters).


We checked all the data quality transforms and found the same behaviour.

 

Please find a screenshot below, and let me know if you have any suggestions.

 

data_cleanse.png

 

 

Regards,

Santosh

Data Integrator Transforms are grayed out in Designer DS 4.2 SP5


Hi All,

 

I am not able to use the Data Integrator transforms in Designer; they are grayed out as shown below. Is it a license/version issue? I ran the License Manager application on our Linux job server and it says Data Quality Management Premium version.

 

Do we need to update the license key?

 

 

Regards

Arun Sasi


