Channel: SCN : All Content - Data Services and Data Quality

Which version number corresponds to which patch / SP / FP for Data Services (and Data Integrator)?


 

Hello,

 

I am searching for a document like "1602088 - Which version number corresponds to which patch / SP / FP for SAP BusinessObjects BI 4.0", but for Data Integrator/Data Services.


Does anybody know of one?


Thanks in advance,

 

Odile

 


How does DS fare in the market?


Hi experts,

 

SAP's data warehousing strategy includes both BW and DS. My first question is: which one will dominate the DW market within SAP? Perhaps one of them will completely take over the other's role.

 

And compared with the other popular ETL tools, where does DS stand right now in terms of future viability and popularity?

 

Many Thanks,

Version 4.1 bug when calling BAPI that returns a structure called RETURN


To all,

 

In BODS 4.1 service pack 1

 

When calling a BAPI (BAPI_ADDRESSORG_GETDETAIL) and selecting the structure/table called RETURN as output from the function,

 

That structure RETURN is unreachable in a query.

 

The name is quoted as

 

Call_Functions."RETURN".TYPE

 

but then

 

...   Cannot resolve column <TYPE> whose full path is <Call_Functions.RETURN> in nested schema <RETURN>.

Ensure that in the FROM clause for nested schema <RETURN>, path <Call_Functions.RETURN> is defined;

or that it is defined in the FROM clause for any of the parent schemas.

 

This worked in 4.0, but now in 4.1 service pack 1 it is broken.

 

It seems rather onerous to have the ABAP programmers wrap every BAPI and rename the RETURN structure (who is going to pay for that?).

 

The loss of the .chm, the compiled Windows help, is painful too; PDFs are too slow to search for documentation.

 

F1 doesn't do anything any more.

 

Any help out there?

 

Jim

SAP ECC to Data Services


Hi Experts - We have a requirement to fetch data from an SAP ECC system.

We have DS 3.2 SP3 in our landscape.

 

Is there any way I can use the standard SAP ECC extractors?

 

I want to execute a standard extractor and get the data into Data Services.

 

 

Thanks

R

If a job ended after storing the first 50 records in the target, how can we start loading from record 51?


Hi Experts,

 

I have the question below; can anyone please suggest an approach?

 

If a job ended after storing the first 50 records in the target, how can we start loading from record 51, assuming there are 1 million records in the source?
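One common way to make the load restartable is sketched below in the Data Services scripting language; the datastore Target_DS, table TGT_EMP and key column EMP_ID are hypothetical placeholders, and the approach assumes the source is read in a stable key order.

# Script before the dataflow: remember the highest key already loaded.
$G_Max_Key = sql('Target_DS', 'SELECT NVL(MAX(EMP_ID), 0) FROM TGT_EMP');

# In the dataflow's Query transform, restrict the source with a WHERE clause
# such as  SRC_EMP.EMP_ID > $G_Max_Key  so a restarted run resumes at record 51
# instead of record 1. Alternatively, re-run from the start with the target
# loader's auto-correct load option switched on, so the first 50 rows are
# updated rather than duplicated.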

 

Thanks

Madhu

BODS 4.1 job failed - ORA-01722


Hi,


I created a simple test job with a single data flow.

The data flow holds the objects below:


Source table 'abc' with 2 fields, x and y (both varchar)

Query transform

Table Comparison

Key Generation

Target table 'pqr' with 3 fields: x, y and z (the key generation field)


When I run the above job, it terminates with error ORA-01722 (invalid number).


I am not doing any datatype conversions in the query transform.


Could someone suggest how I can run the job successfully?

Data Services Workbench - error while loading data


Hi,

 

I was trying out the Data Services Workbench to extract data from SQL Server. I created two datastore connections, for the source and the target, by entering the respective credentials, and the connectivity was established successfully for both. I then created a replication job and validated it, since I have to move the data to the target. Validation was successful, but the job throws an error when executed. All the metadata for the datastores has been moved into Data Services (the two datastores were created automatically in the Data Services tool, and the respective workflows and dataflows were generated automatically), but the data was not populated into the target. Do I need to re-execute the jobs in the Data Services tool as well to trigger the data load into the target?

 

1. The authentication used here was SQL Server authentication.

2. I just extracted three tables from the source and tried to place them in the target. The error is attached below:

Data Service Workbench.JPG

 

 

 

 

Best,

Sanjay

Data Services Management Console - Real-Time Job


Hello, I have a problem with the Administrator tree in the Data Services Management Console. The Real-Time component does not appear, so I cannot access the real-time jobs. Do you have any suggestions?


BODS - Taking a unique record


Hi Experts,

 

I have a situation where the records look like this:

 

A     B     C     D

01     P     M     AA

01     P     M     AB

02     P     M     AA

02     P     M     AC

 

The above is my source data, and my target should look like this:

 

A     B     C     D

01     P     M     AA

01     P     M     AB

02     P     M     AB

02     P     M     AC

 

That is, if for the same combination of A, B and C one of the records has D = AB, do not change anything.

But if no record in the group has the value AB, then AA should be converted to AB.

 

 

Please let me know how to implement this logic in BODS.
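For illustration, one possible dataflow shape is sketched below in the Data Services expression language; SRC, Query_Flag and Query_Out are placeholder names, not objects from any repository.

# Query_Flag (Group By: A, B, C): one row per group, with a flag telling
# whether the group already contains an AB row.
#     HAS_AB = max(ifthenelse(SRC.D = 'AB', 1, 0))

# Query_Out: join SRC back to Query_Flag on A, B and C; map column D as
#     ifthenelse(Query_Flag.HAS_AB = 0 and SRC.D = 'AA', 'AB', SRC.D)
# and map all other columns 1:1. Groups that already contain AB stay
# untouched; in the remaining groups only the AA rows become AB.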

 

 

Thanks

Ashique

How to configure Single Sign On on Data Services and Information Steward (4.1 version)


First of all, I need to know whether SSO works for DS and IS at all.

I have a few links describing how to configure BI for SSO (we did it and it is working fine), but they are not really applicable to DS and IS. At a certain point I cannot proceed any further, so I am looking for answers before I submit an actual support ticket.

Data Services 4 to ECC 6.0


Hi,

 

We want to extract data from ECC using DS 4.0.

    

We want to know:

1. the options we have to do that;

2. the pre-tasks or prerequisites;

3. the step-by-step procedure to connect;

4. how to test the connection.

 

Thanks

Creating a file


Hi experts,

Can I create a file (any file) in DS?

Create a dataflow, apply the transforms, and generate the file.

I want to save it on the server so it can then be used by another application (SAP BusinessObjects Enterprise XI 3.x).

 

Thanks.

BODS Custom Scheduler - Event Based Sequential Scheduler Technique


Scheduling jobs to match your requirements is an important and sometimes difficult task in BODS.

 

However, you can now handle this task the way you prefer.

You can create a schedule of jobs that executes them one after another, i.e. an event-based sequential scheduler in BODS.

 

 

Step 1 –

You need an execution command file for each job you want to schedule. If you already know how to create one, go to Step 2 directly.

 

a) Otherwise, follow the steps in the document below to get a batch file for each BODS job:

     http://scn.sap.com/community/data-services/blog/2012/08/22/sap-bods--running-scheduling-bods-jobs-from-linux-command-line-using-third-party-scheduler

 

    1. You will get one job_name.bat file (for Windows).
    2. You can execute this file directly from the command prompt.
    3. Execute it and check that it runs as expected in the command prompt.
    4. If not, cut/paste the batch file to another location and try running it again.

E.g. type E:\SAP\JOB_NAME.BAT and press Enter.

 

 

Step 2 –

 

Once the file executes successfully from the command prompt, we can create our own scheduler in the BODS Designer.

1. Create a new job and a new script.

2. Type into the script:

exec('cmd.exe', 'E:\SAP\JOB_1.BAT', 8);

exec('cmd.exe', 'Batch_file_Path\JOB_2.BAT', 8);

exec('cmd.exe', 'Batch_file_Path\JOB_3.BAT', 8);

...and so on.

3. Save the script and the job.

 

Your custom scheduler for many jobs is ready.

 

This will execute your jobs in whatever sequence you decide.

You can extend this technique to execute many jobs in parallel by adding more script/workflow combinations.

If you know Python, MS-DOS batch commands, or shell scripts, you can do much more with this scheduling technique.
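As a possible extension, the sketch below (Data Services scripting language) adds a simple failure check between two jobs. It assumes that flag 8 makes exec() wait for the command and return its exit status at the start of the returned string; the '/c' argument and the '0:' check are assumptions too, so verify exec()'s flag behaviour in the Reference Guide for your version before relying on this.

$G_Result = exec('cmd.exe', '/c E:\SAP\JOB_1.BAT', 8);

# Assumption: with flag 8 the returned string starts with the exit status,
# so a leading '0:' means JOB_1 finished without errors.
if (index($G_Result, '0:', 1) = 1)
begin
    exec('cmd.exe', '/c E:\SAP\JOB_2.BAT', 8);
end
else
begin
    raise_exception('JOB_1 failed, aborting the sequence: ' || $G_Result);
end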

Query usage issue


Hi All,

 

I have a simple question.

 

I have a source file that contains, for example, employees from 5 departments, and I want my target to consist of 2 files:

the first for all employees from dept 1 and the second for those from dept 2.

The output columns and the mappings are not supposed to be the same in the 2 target files.

 

Do I have to place 2 Query transforms after the source, each with its own WHERE clause and its own mapping per department?

Is there a way to use only 1 Query that contains the columns of both and then split into the 2 files?

Does the Query run on the database, or does only the source read run on the database?

I want to be sure: if I use 2 Queries after the source, is the query run twice on the database, or is it done in a memory cache?
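To picture the split, here is a hedged sketch of the dataflow design; DEPT_ID and the object names are placeholders for whatever the file actually contains.

# File source reader
#   |-- Query_Dept1: its own column list and mappings, WHERE SRC.DEPT_ID = 1  -> target file 1
#   |-- Query_Dept2: its own column list and mappings, WHERE SRC.DEPT_ID = 2  -> target file 2
#
# Because the source is a flat file, both WHERE clauses are evaluated by the
# DS engine in memory from a single read of the file; a filter can only be
# pushed down as SQL when the source is a database table. You can check what
# is pushed down via Validation > Display Optimized SQL in the Designer.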

 

regards,

Ilan

Doubt regarding Rapid Marts


Hi Guys,

 

I want to understand what a Rapid Mart is in Data Services.

I am aware of the best-practice documents and standard content jobs provided by SAP for doing migration with Data Services.

Can anybody tell me how a Rapid Mart differs from such best-practice documents and standard content jobs for Data Services?

 

 

Thanks ,

Mayank Mehta


Difference between the Data Services scheduler and the BOE scheduler?


Hi All,

 

Can anyone explain the difference between the Data Services scheduler and the BOE scheduler options in the Data Services Management Console?

If someone can share some documents related to these concepts, that would be helpful.

 

Thanks & Regards,

Sabarish.M

Issue with Oracle NUMBER columns becoming decimal(28,7) in DS


I have an Oracle table with NUMBER columns that make no mention of NUMBER(28,7).

 

As soon as I import it into the datastore it becomes decimal(28,7). How and where can I modify the column's datatype, at least to remove the scale of 7?

 

Here is the screenshot; when I right-click the target table under the dataflow and choose modify, it is greyed out. There has to be some way to change that.

 

I changed the corresponding column within the Query transform to have 0 decimals (28,0), but it had no effect; after the load ran, all those columns still show 7 zeros after the decimal point. Please advise.

 

Thanks a lot for the helpful info.

Datatype_issue.JPG

Is there any possibility to schedule a job based on a factory calendar in Data Services?


Is there any possibility to schedule a job based on a factory calendar in Data Services?
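If neither the Data Services scheduler nor a BOE calendar covers your factory calendar, one common workaround is sketched below in the DS scripting language: schedule the job daily and let a leading script decide whether to continue. The datastore DWH_DS, the table FACTORY_CALENDAR and its columns are hypothetical, and $G_Is_Workday is assumed to be declared as int.

# Leading script: only proceed on factory working days.
$G_Is_Workday = sql('DWH_DS', 'SELECT COUNT(*) FROM FACTORY_CALENDAR WHERE CAL_DATE = TRUNC(SYSDATE) AND WORKDAY_FLAG = 1');

if ($G_Is_Workday = 0)
begin
    # Not a working day: stop here, or wrap the real workflow in a
    # conditional instead if a red (failed) job is undesirable.
    raise_exception('Not a factory working day - skipping this run.');
end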

Inserting 3 new records for each entry


Hi Experts,
 
I have a situation where I seek your assistance.
 
I have a table:
 
MATNR   MTART   WERKS   NAME
001         A          AA         NEW
002         B          AC         NEW1
003         B          AD         NEW2
004         A          AE          NEW3
 
Now, whenever I find MTART = B, I have to insert 2 more records for the same MATNR. The two extra records are always the same, i.e.:
 
MTART = B, WERKS = ZZ, NAME = OLD1
MTART = B, WERKS = YY, NAME = OLD2
 
in addition to the existing MATNR, so the output should look like:
 
MATNR   MTART   WERKS   NAME
001         A          AA         NEW
002         B          AC         NEW1
002         B          ZZ         OLD1
002         B          YY         OLD2
003         B          AD         NEW2
003         B          ZZ         OLD1
003         B          YY         OLD2
004         A          AE          NEW3
 
 
Please let me know how to implement this logic.
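One possible dataflow shape, as a hedged sketch; SRC and the Query/Merge names below are placeholders.

# Query_Orig : all columns mapped 1:1 from SRC (keeps every original row).
#
# Query_Old1 : WHERE SRC.MTART = 'B'
#              MATNR and MTART mapped 1:1, WERKS mapped as 'ZZ', NAME as 'OLD1'
#
# Query_Old2 : WHERE SRC.MTART = 'B'
#              MATNR and MTART mapped 1:1, WERKS mapped as 'YY', NAME as 'OLD2'
#
# Merge      : Query_Orig + Query_Old1 + Query_Old2 -> target
# Every MTART = B row then appears three times: once as read and twice with
# the fixed WERKS/NAME values.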
 
Thanks
Ashique

Has anyone called an Oracle procedure with input parameters? I am having an issue


I need to call an Oracle PL/SQL procedure or function, passing input parameter values based on variables.

 

I have a function called

 

UDF_Create_Transaction(). Whenever this function is called, I need to pass four variable values: $Username, $Start_dateTime, $End_DateTime, $DataflowName.
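A minimal sketch in the Data Services scripting language, assuming the function has been imported into an Oracle datastore (hypothetically named ORA_DS, owner SCOTT) via Import By Name with type Function; an imported database function can then be called directly with the global variables as arguments.

# Call the imported function with the four global variables.
$G_Result = ORA_DS.SCOTT.UDF_CREATE_TRANSACTION($Username, $Start_dateTime, $End_DateTime, $DataflowName);
print('UDF_Create_Transaction returned: ' || $G_Result);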

 

 

I know for sure that a lot of members here have used Oracle PL/SQL functions or procedures, called via a SQL object or via a post-load command.

 

Need your kind help.

 

Thanks a lot for the helpful info.
