Channel: SCN : All Content - Data Services and Data Quality
Viewing all 4013 articles

NULL in source gets converted to Blanks in target


I am using Data Services to load data from SQL Server to Teradata. For target columns of VARCHAR or CHAR data types in Teradata, if the corresponding source column value is NULL, the data is loaded into the target as a blank field value instead of NULL. For other data types such as datetime or int, a NULL in the source is loaded as NULL in the target. I am using the FASTLOAD bulk-loader option in Data Services to load data into Teradata.

Should I change any setting in the Data Services config file to solve this problem?
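Not from this thread, but one common workaround sketch: if the bulk loader cannot be made to pass NULLs through for character columns, a post-load SQL step (for example run via the sql() function in a script after the data flow) can convert the blanks back to NULL. TARGET_TABLE and COL1 below are placeholder names, not from the original post:

```sql
-- Hedged workaround sketch (placeholder table/column names): after the
-- FastLoad completes, set blank CHAR/VARCHAR values back to NULL.
-- Teradata CHAR columns are blank-padded, so compare on TRIM.
UPDATE TARGET_TABLE
SET COL1 = NULL
WHERE TRIM(COL1) = '';
```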

 

Edited by: indumeena on May 31, 2011 9:54 PM


BI 4.1 SP03 Patch 01 incompatibility issue with DS 4.2 SP01


Hello,

 

This message is just to advise that the error "Unable to locate BOE platform components. Install or reinstall BOE platform and retry" when you try to open Data Services Designer with the following software can be solved by uninstalling Patch 01 of BI 4.1 SP03:

 

Windows 7 SP01

SAP BO BI Platform v4.1 SP03 Patch 01 (client components)

SAP BO Data Services v4.2 SP01 Patch 04 (client components)

 

Hope this helps

 

Regards

DS 4.2 job processing


Hi guys

 

I have a job which inserts around 800,000 records into a target table. The problem is that the first 600,000 records take less than 10 minutes, but inserting the remaining 200,000 records takes more than 50 minutes. I want to know why the processing speed changes after 600,000 records.

 

The monitor log shows:

 

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 639000, 490.670, 692.161, 3.682, 5.429, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 646000, 505.679, 707.160, 1.466, 4.555, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 646000, 505.669, 707.160, 3.089, 4.555, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 646000, 520.678, 722.159, 0.000, 0.000, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 646000, 520.668, 722.159, 0.000, 0.000, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 647000, 535.677, 737.158, 0.125, 0.515, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 647000, 535.667, 737.158, 0.390, 0.515, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 647000, 550.677, 752.158, 0.000, 0.000, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 647000, 550.667, 752.158, 0.000, 0.000, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 647000, 565.676, 767.157, 0.000, 0.000, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 647000, 565.666, 767.157, 0.000, 0.000, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 647000, 580.675, 782.156, 0.031, 0.031, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 647000, 580.665, 782.156, 0.000, 0.031, 0:5

DF_CL_EXTRACT/Q_READ, PROCEED, 648000, 595.674, 797.155, 0.156, 0.562, 0:0

DF_CL_EXTRACT/Q_READ_TABLENM, PROCEED, 648000, 595.665, 797.156, 0.406, 0.562, 0:5

 

Please find the monitor file attached.

DS 4.2 get ECC CDHDR deltas in ABAP data flow using last run log table


I have a DS 4.2 batch job where I'm trying to get ECC CDHDR deltas inside an ABAP data flow.  My SQL Server log table has an ECC CDHDR last_run_date_time (e.g. '6/6/2014 10:10:00') where I select it at the start of the DS 4.2 batch job run and then update it to the last run date/time at the end of the DS 4.2 batch job run.

 

The problem is that CDHDR has the date (UDATE) and time (UTIME) in separate fields, and inside an ABAP data flow only a limited set of DS functions is available. For example, outside of the ABAP data flow I could use the DS function concat_date_time for UDATE and UTIME so that I could have a where clause of 'concat_date_time(UDATE, UTIME) > last_run_date_time and concat_date_time(UDATE, UTIME) <= current_run_date_time'. However, inside the ABAP data flow the DS function concat_date_time is not available. Is there some way to concatenate UDATE + UTIME inside an ABAP data flow?
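One way to express the same window without concat_date_time (a sketch, not an answer from the thread) is to compare UDATE and UTIME as separate columns, which needs only plain comparisons that an ABAP data flow can handle. $v_last_date/$v_last_time and $v_curr_date/$v_curr_time are assumed variables holding the split date and time parts of the stored timestamps:

```sql
-- Equivalent of: concat_date_time(UDATE, UTIME) >  last_run_date_time
--            and concat_date_time(UDATE, UTIME) <= current_run_date_time
( UDATE > $v_last_date
  OR (UDATE = $v_last_date AND UTIME > $v_last_time) )
AND
( UDATE < $v_curr_date
  OR (UDATE = $v_curr_date AND UTIME <= $v_curr_time) )
```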

 

Any help is appreciated.

 

Thanks,

Brad

Autocreate for Template Table if it doesn't exist


Hello.

 

I have a Data Services Project and I face a problem:

I use a template table, and the first time the job executes the table is auto-created. But if I delete the table from the database (or point the datastore to another database that doesn't have the table) and execute the job again, it doesn't auto-create it and I get the error:

" Invalid object name 'TableName' ".

 

Furthermore, I don't want to check the "Drop and re-create" option, because I've placed it in a while loop in order to get data from different tables (with the same schema).

 

This is a problem for me because I want to create the package once and execute it for different new target databases.
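One possible workaround (an assumption on my part, not something from this thread): run a script step before the data flow that creates the table only when it is missing, for example via the sql() function against the target datastore. The table and column names below are placeholders:

```sql
-- Hedged sketch (SQL Server syntax, placeholder names): create the
-- target only if it does not already exist, so the data flow never
-- hits "Invalid object name".
IF OBJECT_ID('dbo.TableName', 'U') IS NULL
    CREATE TABLE dbo.TableName (
        ID   int,
        Col1 varchar(50)
    );
```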

 

Is there a solution for my problem?

 

Thanks in advance for your help,

Nicholas

20 million records from SAP CRM to MS SQL Server


Dear experts,

 

I need to load around 20 million records from a Z table in SAP CRM to a MS SQL DB Table, once a month.

What's the recommended method for doing it?

A BW like extractor, customized from a table view in RSA5?

An ABAP dataflow?

RFC Function call?

or IDoc?

 

Thanks in advance!

Rafael

install BODS 4.x 64 bits on solaris X86 system amd64


Hello community,

 

Please help me with this issue.

 

I'm trying to install Data Services on a Solaris x86 64-bit system.

 

I installed the 64-bit Oracle client on the machine, but I can't execute the BODS installer (it returns an "invalid argument" error).

 

I cannot find out what the matter is!

 

Here is the error I'm getting when trying to execute ./setup.sh:

 

 

./setup.sh[109]: .: line 110: setup.engine/perl/bin/perl: cannot execute [Invalid argument]

/home/kamel/DS/DATAUNITS/DS41SOL64/setupexe -launchedFromSH

./setup.sh[159]: /home/kamel/DS/DATAUNITS/DS41SOL64/setupexe: setupexe: cannot execute [Invalid argument]

Finished, return code is 126

 

P.S.: I'm running the installation on a virtual machine with 4 processors, 4 GB RAM, and a 50 GB hard drive.

 

please help me out !

DS 4.2 support Oracle 10g


Do the DS 4.2 CMS repository and DS repository support Oracle 10g?

 

The DS 4.2 PAM mentions Oracle 11g, but the DS 4.2 Installation Guide, in the extra requirements for Oracle section, mentions 10g R2.

 

It is kind of confusing. Can someone clarify?


Getting error in SAP BODS: 230102


Dear Experts,

 

I am new to SAP BODS, and I am getting the below error in SAP BODS.

 

Data Flow terminated due to error <230102>

 

We are transferring data from MS SQL to SAP BW.

 

Please tell me how to resolve it.

Many Many Thanks In Advance.

 

Regards,

Divyesh Patel

(9930578182)

Employee Master Data load from BODS to SAP


Hi All,

I need some help regarding data migration from PeopleSoft to SAP using BODS (Business Objects Data Services)

 

I need to know how sample HR data from PeopleSoft (from a SQL Server table mapped in BODS) can be loaded to an SAP sandbox. For example, for Material Master data, BODS has a BAPI that can create records in SAP.

Can HR_MAINTAIN_MASTERDATA be used to load master data from BODS to SAP?

 

Or is there any other BAPI or IDOC to load HR employee master data from BODS to SAP?

 

Thanks & Regards,

Shilpa

Full connectivity catalog of SAP data services


I have been searching within the community and on the public domain but have not found a single resource that lists all the data sources available to Data Services, either natively or via adapters.

 

By that I mean (at a minimum):

  • 3rd party business applications (JDE, Salesforce, etc. etc.)
  • RDBMS systems
  • Non RDBMS systems
  • SAP OLTP systems
  • SAP/BOBJ systems
  • SAP HANA
  • Unstructured formats: text files, images, office files, PDFs, etc.
  • ODBC, JDBC
  • Webservices
  • etc. etc.

SQL Server source - Can see data but job fails


I have set up a new SQL Server DB datastore.  I can import tables and views and even see the data when I do a data preview in the dataflow.  When I try to run the job I am getting the following error message.  What am I missing in the setup?

 

error.png

windows authentication for repository


Hi guys

 

Could you please help me with how to access repositories using Windows authentication on BODS 4.2 SP1? Presently they are accessible using the Administrator account only.

 

regards

Can BODS connect to MS SSAS ?


Hi,

 

I want to know if BusinessObjects Data Services can connect to Microsoft SQL Server Analysis Services (SSAS).

 

Please let me know if you need more details.

 

Thanks

Vikas

Data Services 12.2.3.0 BODI-1112015 Adapter metadata import failed


Hi Experts,

 

I am using Data Services 12.2.3.0.

 

I have an issue importing functions through an 'Adapter' type datastore into Data Services. I can open the datastore and see the list of available functions, but when I try to import them, I get the error BODI-1112015 Adapter metadata import failed.

 

The setup and the errors are below.

The adapter datastore is set up as follows:

New_DataStore.jpg

 

I built a new keystore called clientkeystore.jks in ..\bin. Then I created the .CSR file, and imported the signed chained certificate (I believe it's a chained certificate) of the server hosting the WSDL into the keystore.

Thanks to the post http://scn.sap.com/thread/1589052 : after changing the metadata character set to UTF-8, I can see a list of functions when I open this New_Datastore in Data Services. That proves the datastore setup has no problem parsing the WSDL file and giving me the list of functions in it.

 

However, the error appears when I try to import them.

 

Error is:

Adapter metadata import failed. Error message: (BODI-1112015) Error parsing the <TheFunctionToBeImported> included in the XML sent by the adapter to represet a function <Error importing XML Schema from file <adapter_schema_in.xsd>:<XML parser failed: Error <Schema Representation Constraint: Namespace 'http://result.form.v81.api.keysurvey.com' is referenced without <import> declaration> at line <13>, char <46> in < < xsd:schema xmln:xsd=http://www.w3.org/2001/XMLSchema" xmln:tns="http://result.form.v81.api.keystore.com" xmlns:diws="http://businessobjects.com/diwebservice" targetnamespace="http://www.businessobjects.com/diwebservice"><xsd:import namespace='http://v81.api.keysurvey.com' schemaLocation='C:\Program Files\Business Objects\BusinessObjects Data Services\ext\webservice\FormResultManagemenetgetRespondentsgetRespondents0.xsd'/>

<xsd: import namespace='http://result.form.v81.api.keysurvey.com' schemaLocation='C:\Program Files\Business Objects\BusinessObjects Data Services\ext\webservice\FormResultManagemenetgetRespondentsgetRespondents2.xsd'/> ........

 

When comparing it with the WSDL file (as below), it is worth noting that the schemaLocation has been changed to a local directory under C:\Program Files\Business Objects\BusinessObjects Data Services\ext\webservice, while that was not the case in the WSDL, where the schemaLocation is on the server. WSDL.jpg

 

I am wondering if the redirection from the server specified in the WSDL file to the local directory has caused this error. The error 'namespace is referenced without <import>' is apparently wrong, as the <import> is right there.

 

Or perhaps there is some other reason behind this.

 

I appreciate any advice or questions from you!


Array support in SAP Data Services


Hello,

 

I am using Business Objects Data Services Designer (version 14.2.2.0).

I have a requirement to call a PL/SQL procedure that has an array as an out parameter, which needs to be captured in BODS and passed over to another PL/SQL procedure.

Can anybody suggest how to implement this in BODS?
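Not from this thread, but a common pattern sketch: since BODS cannot bind PL/SQL collection types directly, one option is a wrapper procedure that flattens the array into a staging table, which BODS (or the second procedure) can then read. Every name below is hypothetical:

```sql
-- Hypothetical Oracle PL/SQL wrapper: call the first procedure, then
-- persist its array OUT parameter to a staging table that BODS can read.
CREATE OR REPLACE PROCEDURE wrap_get_values AS
  v_vals t_num_array;  -- assumed collection type used by the procedure
BEGIN
  get_values(p_out => v_vals);  -- hypothetical first procedure
  DELETE FROM stg_values;
  FOR i IN 1 .. v_vals.COUNT LOOP
    INSERT INTO stg_values (val) VALUES (v_vals(i));
  END LOOP;
  COMMIT;
END;
/
```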

 

Thanks in Advance

 

- Rajnish

Find null values and replace with next row's value+10


Hi,

 

BO Data Services Version: 4.1 SP2

 

I have a (.CSV) file as source. It has integer values for some combinations and no value for a few combinations, i.e. some combinations don't exist in the file. I want to replace the missing values with some value, or with the next row's value + 10.

 

How can I achieve this? Any help is much appreciated.

 

The data in the file looks like the below:

 

Store ID | Type      | Name   | Stock
101      | Fruits    | Orange | 2500
101      | Vegetable | Carrot | 2000
102      | Fruits    | Grapes | 4000
102      | Fruits    | Orange | 2000
102      | Vegetable | Carrot | 3000

 

 

Expected result (the newly filled row was formatted in bold italic in the original):

 

Store ID | Type      | Name   | Stock
101      | Fruits    | Grapes | 2510
101      | Fruits    | Orange | 2500
101      | Vegetable | Carrot | 2000
102      | Fruits    | Grapes | 4000
102      | Fruits    | Orange | 2000
102      | Vegetable | Carrot | 3000
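Assuming the CSV is first staged into a table (STG_STOCK and its column names are placeholders), the expected result could be sketched in SQL Server syntax like this: generate every store x (Type, Name) combination, and for missing combinations take the next existing row's Stock + 10. This is only a sketch of the logic and assumes at most one missing row in each ordered run:

```sql
SELECT c.StoreID, c.Type, c.Name,
       COALESCE(s.Stock,
                -- next row's Stock + 10 for combinations missing in the file
                LEAD(s.Stock) OVER (PARTITION BY c.StoreID
                                    ORDER BY c.Type, c.Name) + 10) AS Stock
FROM (SELECT DISTINCT src.StoreID, tn.Type, tn.Name
      FROM STG_STOCK src
      CROSS JOIN (SELECT DISTINCT Type, Name FROM STG_STOCK) tn) c
LEFT JOIN STG_STOCK s
  ON s.StoreID = c.StoreID AND s.Type = c.Type AND s.Name = c.Name;
```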

 

Thanks,

Jeni

SAP BODS Delta data is not being updated


Hi Experts,

 

I have Data Service 4.0.

I am using extractor 2lis_11_vahdr.

I have taken Extractor as CDC.

 

My Scenario is like this:

 

I have an extractor ABAP data flow: 2lis_11_vahdr --> Query transform --> data transport. After that come a Query transform, a Map_CDC_Operation transform, and a template table.

 

In the initial load I get the full data.

Then if I change one record, the delta returns it as two rows: the deleted row and the new changed row.

 

So in the target, via CDC, it should net out to one updated record.

But I am getting 3 rows.

 

And Map_CDC_Operation has two options:

Sequencing Column = DI_SEQUENCE_NUMBER

Row Operation Column = DI_OPERATION_TYPE

 

Could these two Map_CDC_Operation settings be the cause of the problem?

 

Please suggest a solution.

 

 

With Regards,

Chintan Vora

Error: "Message short text 26 107 does not exist"


Hi DS Expert,

 

We've upgraded our DS from 3.2 to 4.2.

But sometimes several jobs fail with the error: "Message short text 26 107 does not exist".

This error did not occur in DS 3.2.

 

Please help...

 

Thank you.

How to call and run parameters in Procedures with Sap Data Services?


Hello Guys,
I'm migrating all SSIS 2008 packages to SAP Data Services.
During this process I ran into a difficulty running stored procedures (SQL Server 2008 R2) within SAP Data Services.

BODS006.jpg

I need help converting this code sample:


EXEC dbo.prcInserirLogExecucaoSSIS

@FEED = ?,

@TIPO_ENTRADA = 'NOVA_CARGA',

@ARQUIVO = ?



to sql('datastore', 'example') with parameter passing...
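A hedged sketch of the equivalent Data Services script call (the datastore name DS_SqlServer and the variable names are assumptions; inside the string passed to sql(), {$var} substitutes a variable's value in quotes):

```sql
-- $v_feed and $v_arquivo are assumed global variables replacing the
-- SSIS '?' parameters.
sql('DS_SqlServer',
    'EXEC dbo.prcInserirLogExecucaoSSIS
         @FEED = {$v_feed},
         @TIPO_ENTRADA = ''NOVA_CARGA'',
         @ARQUIVO = {$v_arquivo}');
```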
