Channel: SCN : All Content - Data Services and Data Quality
Viewing all 4013 articles

SAP BODS With Success Factors


[Screenshot: SFDS.png]

Hi All,

 

I am unable to connect SAP BODS to the SuccessFactors test server. We are using SAP BODS 4.2, and in the Admin Console we can also see the SuccessFactors adapter. When we try to open the SAP DS job server connections, we get an error; please refer to the screenshot above.

 

 

 

If anyone has worked on connecting BODS to SuccessFactors, I would appreciate your help with this.

 

Drop me an email: Pandeyanuj21@gmail.com

Call me: +971529278497

 

Thanks

Anuj


Overwrite Schema by uploading a Flat File in SAP Data Services


Hello,

 

I want to upload a flat file. During my e-learning training, the system always asked in the file format editor whether I wanted to overwrite the default schema with the schema of the flat file I was uploading.

 

But when I load a flat file now, the system does not ask whether I want to overwrite the schema, and it uploads only the first two columns of my ten-field table.

 

Does anyone have an idea how I can solve this problem?

 

 

Greetings, Philp

DS Management Console: Data Validation: There is no data available


Hi,

 

I have created a job with a data flow containing a Validation transform and added one simple rule:

 

If the COUNTRY field value is not equal to DE, the row goes to the Fail file.

If the COUNTRY field value is equal to DE, the row goes to the Pass file.

 

The job works fine.

I see the failed records in the correct file and in the rule-violation output.

 

When I execute the job, I checked the following options in the execution pop-up:

 

[x] Collect statistics for optimization

[x] Collect statistics for monitoring

 

Then I expected to see something in the Data Services Management Console under DATA VALIDATION.

However, it loads for a while and then displays the message "There is no data available".

 

I have tried it several times.

 

Which setting am I missing here?

 

Thanks,

 

Andrzej

ETL Recovery methods


Hi All,

Suppose an ETL job fails for some reason in the middle of a run.

How do we rerun the ETL job so that only the records that were not loaded are processed?

Thanks in advance.
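Data Services has a built-in answer for this (execute the job with recovery enabled, then restart it with the recover option so completed steps are skipped). A common complementary pattern is a high-water mark: on restart, load only the source rows whose key (or timestamp) is greater than the maximum already committed to the target. A minimal Python sketch of that idea; the tables and column names below are invented for illustration:

```python
# High-water-mark recovery sketch (not BODS itself): after a failure,
# reload only rows whose key exceeds the maximum key already committed.

source = [
    {"id": 1, "val": "a"},
    {"id": 2, "val": "b"},
    {"id": 3, "val": "c"},
    {"id": 4, "val": "d"},
]

# Pretend the job failed after loading the first two rows.
target = [{"id": 1, "val": "a"}, {"id": 2, "val": "b"}]

def recover_load(source_rows, target_rows, key="id"):
    """Append only the rows whose key is above the committed maximum."""
    high_water = max((r[key] for r in target_rows), default=0)
    missing = [r for r in source_rows if r[key] > high_water]
    target_rows.extend(missing)
    return missing

loaded = recover_load(source, target)
print(len(loaded))  # -> 2
```

This pattern only works when the target load is append-only with a monotonically increasing key; otherwise a delete-and-reload of the affected partition, or table comparison, is safer.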

running WinSCP bat file


Hi guys

 

I am trying to download a zip folder from a site using SFTP.

 

To achieve this, I created a .bat file without a host key. When I run it manually it works fine: it does not ask for the host key and downloads the zip folder. The problem is that when I try to automate the same thing using Data Services, it asks for the host key.

 

Could you please help me understand why this difference occurs when running through Data Services?

 

Is there any way to avoid entering the host key?

 

Thanks for your help.

 

Regards

Physical memory utilization is high


Hi All,

 

I am working in a production environment with high-end hardware.

 

My ETL and database are installed on two different servers.

 

The DB server runs MS SQL Server 2008, with 256 GB of RAM and a high-end processor (I don't remember the exact configuration).

 

I can't recall the exact specification of the ETL server either, but it definitely has high-end hardware as well.

 

Now my question: even when we run a single job, the physical memory utilization of the production DB server shoots up to full utilization (all 253-254 GB).

 

Why does this happen?

What are its consequences?

 

What are the reasons for such high memory utilization?

 

Thanks in advance.

 

Regards,

Sachin

SAP Data Services 4.2 SP 06 on Windows with Hadoop Integration (Hortonworks HDP 2.3.2)


Hello,

 

The SAP BODS 4.2 SP 06 release notes say: "Data Services 4.2 SP 06 now supports Hadoop on the Windows platform (Hortonworks HDP 2.2.6 only)."

 

We are getting ready to prepare a new BODS landscape for Hadoop integration. We are using Hortonworks HDP 2.3.2.

When I look at the BODS PAM, it says Hadoop support is available only on the Linux platform, but the BODS 4.2 SP 06 release notes say Hadoop integration is supported with BODS installed on Windows.

 

Is anyone here using BODS 4.2 SP 06 on Windows integrated with Hadoop?

 

Thanks in advance

BODS job failed due to Invalid value for date


hi team,

 

I recently joined this BODS project and am just a beginner with it.

I need to know where and how I can check the source data to investigate the error.

One of the DS jobs failed with the error below:

Invalid value <Month: 20> for date <01042012>. Context: Column <>.

 

I need to find the root cause; it seems a record in the source system caused this error.

 

But I don't know how to proceed.

Please help!
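An error like Invalid value <Month: 20> usually means one source row holds a date string whose month part is out of range for the format the job expects. One way to find the offending record is to extract the source column and try to parse every value with that format. A minimal sketch, assuming a DDMMYYYY layout (check the actual format in the mapping); the sample rows are invented:

```python
# Scan (key, date-string) pairs and report values that fail to parse
# with the expected date format, to locate the bad source record.
from datetime import datetime

rows = [
    ("1001", "01042012"),
    ("1002", "01202012"),   # "month" 20 -> this row would break the job
    ("1003", "15112013"),
]

def find_bad_dates(rows, fmt="%d%m%Y"):
    bad = []
    for key, value in rows:
        try:
            datetime.strptime(value, fmt)
        except ValueError:
            bad.append((key, value))
    return bad

print(find_bad_dates(rows))  # -> [('1002', '01202012')]
```

In practice you would pull the key and date columns from the source table (or staging file) instead of the hard-coded list, then look up the reported keys in the source system.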


PDF Format to db table


Hello,

 

Is there any standard functionality in BODS or BO to read PDF files directly and export the data to a database table? If standard functionality for PDF files does not exist, what is the best workaround to automate this process?

 

Thanks and regards,

Upgrade BODS to 4.2SP5


Hi All,

 

We have a single Windows production machine with BO 4.1 SP2 and BODS 4.2 SP2 installed.

 

Now we want to separate BO and BODS onto different machines, for better performance and easier upgrades.

 

I was able to separate BO and upgrade it to 4.1 SP6, but I don't have much knowledge of how to separate BODS and upgrade it to 4.2 SP5.

 

I want to install IPS 4.1SP6 and BODS 4.2SP5 on new machine for ETL.

 

Can someone guide me on the best approach and practices for separating BODS onto another server and upgrading it?

 

 

Regards,

Marven

Unable to log into Designer


Hi Everyone,

I recently installed Designer 4.2 SP6 on a Windows 8.1 machine; our repository is in SQL Server 2012. When I try to connect to the local repository, I keep getting this message: "SQL Server driver, cannot generate SSPI context"

 

[Screenshot: Capture.PNG]

Has anyone seen this error before?

I can't seem to find anything on this SSPI context error.

TIA

Lynne

System Exception occurred. Process dump option is off. Process is not dumped


My job was working fine until yesterday, but today it throws the following error. I am using BODS 4.1.

 

Can anyone please help me with this?

 

System Exception <ACCESS_VIOLATION> occurred. Process dump option is off. Process is not dumped.
Call stack:
0x00000000805DF2E5, acta_strcmp()+0357 byte(s),
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\src\genutil\strfunc.cpp, line 0308+0017 byte(s)
0x0000000080560EF4, Compare()+4148 byte(s),
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\src\eval\calc.cpp, line 0854+0026 byte(s)
0x000000008054D0FA, XRow_data::compareBDBBtree()+1258 byte(s),
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\src\eval\row.cpp, line 4543+0053 byte(s)
0x00000000813F1CDA, DSPcache::DSFileBtreeKeyFinder::findRow()+0234 byte(s),
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\src\pcache\ds_file_btree.cpp, line 1329+0122 byte(s)
0x00000000813F2568, DSPcache::DSFileBtreeCursor::getRow()+0040 byte(s),
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\src\pcache\ds_file_btree.cpp, line 1822+0024 byte(s)
0x00000000800E8F89, XTran_eqcache_runtime_new<std::multimap<XRow_data_ptr_with_schema * __ptr64,XRow_data *
__ptr64,lessThanForEquivCacheJoins,std::allocator<XRow_data * __ptr64> >,std::_Tree<std::_Tmap_traits<XRow_data_ptr_with_schema
* __ptr64,XRow_data * __ptr64,lessThanForEquivCacheJoins,std::allocator<XRow_data * __ptr64>,1>
>::iterator,std::_Tree<std::_Tmap_traits<XRow_data_ptr_with_schema * __ptr64,XRow_data *
__ptr64,lessThanForEquivCacheJoins,std::allocator<XRow_data * __ptr64>,1> >::const_iterator>::getNext()+0569 byte(s),
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\inc\xform\teqcachenewimpl.cpp, line 0120
d:\im_ds_4.2_sp_rel\src\dataservices\dataintegrator\codeline\code\src\rww\rww.cpp, line 0451
0x0000000000A7438E, RWThreadFunctionImp::run()+0126 byte(s)
0x0000000000A5C184, RWRunnableImp::exec()+0372 byte(s)
0x0000000000A74643, RWThreadImp::exec()+0051 byte(s)
0x0000000000A75F59, RWThreadImp::_setTimeSliceQuantum()+0169 byte(s)
0x0000000073F237D7, endthreadex()+0071 byte(s)
0x0000000073F23894, endthreadex()+0260 byte(s)
0x00000000770159ED, BaseThreadInitThunk()+0013 byte(s)
0x000000007724C541, RtlUserThreadStart()+0033 byte(s)
Registers:
RAX=0000000000000031 RBX=00000000FC50B63C RCX=000000000C64B84C RDX=00000000FC50B63C RSI=0000000010D0F162
RDI=0000000010D0F159 RBP=00000000FC50B63C RSP=000000000D4ABF30 RIP=00000000805DF2E5 FLG=0000000000010202
R8=000000000000000A R9=0000000010140211 R10=0000000002C6EED0 R11=0000000000000000 R12=000000000000000A
R13=0000000010140211 R14=0000000010140211 R15=000000000BC1A550
Exception code: C0000005 ACCESS_VIOLATION
Fault address: 00000001805DF2E5 01:00000000005DE2E5 D:\Program Files (x86)\SAP BusinessObjects\Data Services\bin\acta.dll
==========================================================
Collect the following and send to Customer Support:
1. Log files(error_*, monitor_*, trace_*) associated with this failed job.
2. Exported ATL file of this failed job.
3. DDL statements of tables referenced in this failed job.
4. Data to populate the tables referenced in the failed job. If not possible, get the last few rows (or sample of them) when
the job failed.
5. Core dump, if any, generated from this failed job.
==========================================================
The system encountered an exception and cannot process this action. This message contains some internal system details which have been hidden for security. If you need to see the full contents of the original message, ask your administrator to assign additional privileges to your account.

A row delimiter should be seen after column number


A column delimiter was seen after column number <70> for row number <533394> in file

The total number of columns defined is <70>, so a row delimiter should be seen after column number <70>. Please check the file for bad data, or redefine the input schema for the file by editing the file format in the UI.

 

So I checked the flat file: I retrieved the data for that row number and checked the last column, but found nothing unusual in that row that could have caused this error.

 

There are 70 columns, and the message suggests there is a problem in the last column. I checked, but there was no stray carriage return.

 

Please help me on this.
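This error almost always means some row does not have exactly 70 fields: an unescaped column delimiter inside a field value adds a column, while a missing or extra row delimiter shifts everything after it. Rather than eyeballing row 533394 alone, it can help to scan the whole file for rows whose field count differs from the schema. A minimal sketch; the delimiter, quoting behaviour, and column count are assumptions to adjust to the actual file format:

```python
# Report (line number, field count) for every row whose number of
# fields differs from the defined schema.
import csv
import io

SAMPLE = "a,b,c\n1,2,3\n4,5,6,7\n8,9\n"   # schema defines 3 columns

def bad_rows(text, expected_cols, delimiter=","):
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    return [(lineno, len(row))
            for lineno, row in enumerate(reader, start=1)
            if len(row) != expected_cols]

print(bad_rows(SAMPLE, expected_cols=3))  # -> [(3, 4), (4, 2)]
```

For the real file, open it with `open(path, newline="")` instead of the inline sample. Note that a bad row earlier in the file can make a later row (such as 533394) the first one reported by the engine, so check the earliest mismatch this scan finds.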

Transporting DataSources for BW


Hello.

 

I'm loading data into BW using Data Services, and I created RFC connections and a source system in BW to communicate with Data Services.

I also created the DataSources in BW, replicated them in Data Services, and everything works fine. All of this is in Development.

 

When moving to QA, I transported the DataSources from DEV to QA in BW.

In Data Services I promoted the job with the connection (datastore) and all the replicated DataSources as well. The promotion finished without any errors.

 

When I tested running the job in QA, I got an error that says:

"There is no hierarchy available for Infosource = <DS_DATASOURCE> and source system = <DS_DV>"

 

After several hours of trial and error, I was able to figure out the issue: everything comes down to the fact that the DataSource (which was promoted from Data Services DEV to QA) is tied to the source system. That is, "DS_DV" is the source system of BW DEV, and it should be DS_QA for BW QA.

 

I had to reimport the DataSource in Data Services QA, modify the job, and replace the DataSource (which was pointing to DEV) with the replicated one.

After this I was able to push data to BW.

 

My question is this: is there a way to map source systems the way you do in BW? In BW there is a table where you can translate source systems during a transport request, so a system called DS_DV in DEV is renamed to DS_QA when transported to QA. I'm looking for this kind of configuration in Data Services, to be able to replace or map the DataSource to the right environment after a promotion.

 

 

Any help?

Thanks.

How to create a WADL for Data Services (RESTful)?

$
0
0

Hi

 

What are the steps to create a WADL for Data Services 4.2?

 

or

 

What should the structure of the WADL be?

 

or

 

What are the steps to import a WADL?

 

I am using SoapUI to create a WADL from a URL, but when I import the function into a REST datastore, I get an empty function or an XML error.

 

[Screenshot: Captura22.JPG]


BODS exception handling


I am using a BODS job to insert rows directly into a target table. On errors, such as when a target column expects a non-null value while the job passes a null, the whole job fails. What transform do I use so the job ignores those bad rows from the source and continues running? Thanks.
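The Validation transform is the usual tool here: rows that satisfy a rule (for example, the column IS NOT NULL) go to the Pass output and on to the target, while violating rows go to the Fail output (e.g. an error table), so the job keeps running. What that routing does, sketched in plain Python with an invented column:

```python
# Sketch of validation-style routing: split rows into a "pass" set
# (loaded to the target) and a "fail" set (sent aside) instead of
# letting one bad row abort the whole load.

rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": None},     # would violate a NOT NULL constraint
    {"id": 3, "name": "Carol"},
]

def validate(rows, rule):
    passed = [r for r in rows if rule(r)]
    failed = [r for r in rows if not rule(r)]
    return passed, failed

passed, failed = validate(rows, lambda r: r["name"] is not None)
print(len(passed), len(failed))  # -> 2 1
```

In the data flow this corresponds to one rule per constraint you want to pre-check, with the Fail output connected to an error table so rejected rows can be inspected and reprocessed.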

What are the common Bugs/errors in SAP BO data services?


Hi All,

 

I am new to SAP Data Services. I wanted to know: what are the common bugs/errors in SAP BO Data Services?

 

Thanks,

Shweta

How to optimise the job with huge source tables?


Hi All,

 

I have two source tables with a large number of records (227,354,374 and 46,526,852) that I need to load into my target structure in BW.

I am using the Merge transform to load the data into my target.

 

I'm a bit concerned about the performance problems this huge number of records may cause when I start the load.

 

Are there any performance tuning techniques I can use, or a way to split the records into smaller chunks, so as to avoid performance glitches?

 

Regards,

Ankit
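One common technique for loads of this size is range partitioning on a numeric key: split the key space into chunks and give each chunk its own data flow instance (or WHERE clause), so no single flow has to process a quarter-billion rows at once; degree-of-parallelism and bulk-loader settings can then be applied per chunk. A small sketch of the range generation only; the key column, bounds, and chunk size are illustrative:

```python
# Generate inclusive (low, high) key ranges covering [min_key, max_key],
# one range per chunked load.

def key_ranges(min_key, max_key, chunk_size):
    low = min_key
    while low <= max_key:
        high = min(low + chunk_size - 1, max_key)
        yield low, high
        low = high + 1

ranges = list(key_ranges(1, 100, 30))
print(ranges)  # -> [(1, 30), (31, 60), (61, 90), (91, 100)]

# Each range then becomes a source filter such as:
#   WHERE ROW_ID BETWEEN 31 AND 60
```

In practice the real min/max come from the source tables (SELECT MIN(key), MAX(key)), and the chunk size is tuned so each chunk fits comfortably in the job server's cache.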

how to pass Y/N in bat file


Hi guys

 

I need help passing Y/N in a .bat file that downloads a file from a website.

 

When I run the .bat file from Data Services, I am prompted to enter Y/N to continue downloading the file from the website.

 

See the details below:

 

If you trust this host, press Yes. To connect without adding the host key to the cache, press No. To abandon the connection, press Cancel. In scripting, you should use a -hostkey switch to configure the expected host key. (Y)es, (N)o, C(a)ncel (10 s), (C)opy Key: Cancel

 

Please tell me how to pass Y by default in the .bat file, so that when it runs, Y is passed automatically to accept the trusted host.

 

Thanks for your help
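The prompt itself points to the fix: instead of answering Y, pass the expected fingerprint with WinSCP's `-hostkey` switch on the `open` command, which makes the session non-interactive. A sketch that assembles such a command line in Python; the user, host, fingerprint, and paths are placeholders (capture the real fingerprint once from a trusted interactive session):

```python
# Build a winscp.com command line that supplies the expected host key
# via -hostkey, so no Y/N prompt appears when run unattended.

def winscp_command(user, host, hostkey, remote_zip, local_dir):
    open_cmd = f'open sftp://{user}@{host}/ -hostkey="{hostkey}"'
    return [
        "winscp.com",
        "/command",
        open_cmd,
        f"get {remote_zip} {local_dir}\\",
        "exit",
    ]

cmd = winscp_command(
    "etl_user", "sftp.example.com",
    "ssh-rsa 2048 xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx",
    "/outbound/data.zip", "C:\\downloads",
)
print(cmd[2])
```

Putting the equivalent `winscp.com /command "open ... -hostkey=..." "get ..." "exit"` line into the .bat file is safer than blindly auto-answering Y, because the transfer fails if the server's key ever changes unexpectedly.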

SAP Data Services landscape and architecture


Hi there,

 

simple question:

 

A data migration has to be done from one SAP ERP source system to an SAP ERP target system.

 

Source and target systems have DEV - QA - PRD environment.

 

Do we also require DEV/QA/PRD environments for SAP DS,

and do we transport projects & repositories via ATL files from one environment to another?

 

Or is one SAP DS box enough, since the datastores allow multiple configurations for different source & target systems,

and we can switch between the different configurations?

 

What is usual and what is best practice?

 

Thanks for the explanation!

 

Andrzej
