Guys,
I am trying to get data from Salesforce into BODS and want to use the SQL transform to do this.
I create the SQL transform, choose the Salesforce source datastore, set the Database Type to Adapter, and put in the query (I tried a really simple one that should work):
SELECT ID, Name, CreatedByID FROM SVMXC__Service_Contract__c
But when I run the job I get an error message:
RUN-058105 Error preparing to read SQL: Class for interface com.acta.adapter.FunctionCall is not implemented, function call is not imported, or XML script for the adapter operation is empty.
I also tried to use the pushdown_sql() function within the SQL transform, just as a test. Same result.
My SFDC adapter works otherwise: I can read metadata, import tables, and run jobs with it.
Now these are the questions I have:
1. Is this supposed to work at all? I mean, can I use the SQL transform (i.e. push SQL down) against an SFDC adapter datastore at all?
2. If yes, what do you think my issue is? The adapter version is 14.2.3.0.
Some background on why I want to do this.
For fields that reference another object (one with its own table of master data), SFDC stores only the ID in the table, a varchar(18) like 'a3He0000000K3L9EAK'. For example, CreatedBy is an 18-character string that is an ID into the User table, where the master data (Name, UserID etc.) lives.
When I extract and load an SFDC table into BODS, 50% of the fields in the table are these IDs, which makes them unusable business-wise. I would need to load the parent tables as well and do a look-up to get the 'real' data (like the UserID for CreatedBy), roughly the kind of join sketched below.
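For comparison, once both tables are staged in BODS, the look-up would amount to something like this join (just a sketch; the staging table names STG_SERVICE_CONTRACT and STG_USER are made up for illustration):

-- resolve the CreatedByID look-up ID against the staged User table
SELECT c.ID,
       c.Name,
       u.UserID AS CreatedBy_UserID
FROM STG_SERVICE_CONTRACT c
LEFT JOIN STG_USER u ON u.ID = c.CreatedByID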
SFDC's own query language (SOQL), on the other hand, can resolve these references directly through its built-in relationships. For example, this works in SOQL:
SELECT ID, Name, Createdby.UserID FROM SVMXC__Service_Contract__c
This retrieves the actual UserID for the user, not the 18-character ID string.
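And this is what I would like to push down for several look-up fields at once, something like the following (again just a sketch; I am assuming the standard LastModifiedBy relationship works the same way as CreatedBy):

SELECT ID, Name, CreatedBy.UserID, LastModifiedBy.UserID FROM SVMXC__Service_Contract__c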
Thank you, Gabor