Wednesday, March 7, 2012

DTSX package continues to throw errors when working with a large dataset.

I have a dataset of between 40-50K records that has to go through a pre-defined process. SSIS works just fine with smaller sets, even up to 20K, but this job keeps blowing up with something along the lines of "cannot write to recordset destination". Does this make sense to anyone? The server is a 2-processor box with 2GB of RAM. Physical memory usage spikes to about 1.6GB during the run, but the processor never really gets above 30% usage. Does this product just not scale yet?
I'd be interested to see the full error message. Depending on how wide the buffer is, trying to stuff that many rows into a variable may be a bit optimistic, just because of the memory requirements. The SSIS pipeline is great for large volumes, and the buffer design means you don't have to load all data into memory at once, but I suspect you are actually forcing just that by using the Recordset Destination. Why do you want to do this?
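The back-of-envelope math is easy to sketch. As a rough illustration only (the row widths and the per-row overhead factor below are assumptions for the example, not figures from the poster's package), here is roughly how the recordset's footprint grows with row width:

# Rough estimate of the memory a Recordset Destination needs when it
# materialises every pipeline row into a single object variable.
# The row widths and overhead factor are illustrative assumptions,
# not measured values from any real package.

def recordset_memory_mb(row_count, avg_row_bytes, overhead_factor=2.0):
    """Estimate the in-memory size of a recordset in MB.

    overhead_factor is a guess at per-row object/provider overhead;
    the real figure depends on column types and the OLE DB provider.
    """
    return row_count * avg_row_bytes * overhead_factor / (1024 * 1024)

if __name__ == "__main__":
    rows = 50000  # the upper end of the 40-50K range from the question
    for width in (200, 1000, 4000):  # assumed average row widths in bytes
        print("%d rows x %d B/row ~= %.0f MB in memory"
              % (rows, width, recordset_memory_mb(rows, width)))

On a 2GB box that is already sitting at 1.6GB, even a few hundred MB held in a single variable can be enough to tip the destination over, which is why streaming rows to a proper destination (or a staging table) is usually the safer route.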
