Hello,
My project is displaying the following error when it generates a TXT file with more than 3 million records.
I have a project in which, when executed, DS goes to a Z table in SAP, selects every record in that table (3+ million), and generates a semicolon-separated (;) TXT file in DS. However, DS is not able to generate this file when there are more than 3 million records; it displays the following error:
(12.2) 02-21-14 10:06:27 (2996:2952) PRINTFN: INFO - Definicao da $G_FF_Diretorio_OUT = C:\DS_Neogrid\Upload\QAS
(12.2) 02-21-14 10:06:27 (2996:2952) PRINTFN: INFO - Definicao da $G_FF_Diretorio_OUT = C:\DS_Neogrid\UpLoad\QAS
(12.2) 02-21-14 10:06:27 (5928:3040) DATAFLOW: Process to execute data flow <DF_TransfArq_035> is started.
(12.2) 02-21-14 10:06:27 (5928:3040) DATAFLOW: Data flow <DF_TransfArq_035> is started.
(12.2) 02-21-14 10:06:27 (5928:3040) DATAFLOW: Cache statistics determined that data flow <DF_TransfArq_035> uses <0> caches with a total size of <0> bytes. This is less than (or equal to) the virtual memory <1609564160> bytes available for caches. Statistics is switching the cache type to IN MEMORY.
(12.2) 02-21-14 10:06:27 (5928:3040) DATAFLOW: Data flow <DF_TransfArq_035> using IN MEMORY Cache.
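For reference, what I am trying to produce is roughly the following: read every row from the Z table and stream it out as a semicolon-separated TXT file, without holding the full result set in memory at once. This is only a sketch in Python, not my DS job: the SQLite database, the table name Z_MY_TABLE, the columns, and the output file name are all made up for illustration.

import csv
import sqlite3

# Stand-in for the SAP Z table: an in-memory SQLite database with a few demo rows.
# (In the real job the source is a Z table read through the SAP datastore.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Z_MY_TABLE (matnr TEXT, werks TEXT, menge REAL)")
conn.executemany(
    "INSERT INTO Z_MY_TABLE VALUES (?, ?, ?)",
    [("100001", "1000", 5.0), ("100002", "1000", 12.5), ("100003", "2000", 0.75)],
)

CHUNK_SIZE = 50_000  # fetch and write in blocks instead of loading everything

cursor = conn.cursor()
cursor.execute("SELECT matnr, werks, menge FROM Z_MY_TABLE")

with open("upload_qas.txt", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out, delimiter=";")
    while True:
        rows = cursor.fetchmany(CHUNK_SIZE)
        if not rows:
            break
        # Each chunk is written and released before the next fetch,
        # so memory use stays flat regardless of how many records exist.
        writer.writerows(rows)

conn.close()

The only point of the sketch is that the rows are fetched and written in blocks, which is the behaviour I expected from DS with the flat-file target, instead of everything going through one in-memory cache.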
Can anyone help me?
Thank you.
Regards,
Wagner