Reputation: 35
I have a data set with about 8,000 variables and 100,000 rows. I currently have a PROC SQL statement that creates a table containing the row count and the sums of 5,500 binary columns. Example:
proc sql;
  create table want as        /* output table name is a placeholder */
  select count(*) as CNT
       , sum(Column1) as Column1
       , sum(Column2) as Column2
       ....
       , sum(Column5500) as Column5500
  from Table;
quit;
I'm getting this error:
ERROR: where clause processing could not obtain memory.
I believe that this is coming from SAS hitting a memory limit. I don't have access to the config file to adjust the memory size.
Upvotes: 0
Views: 1026
Reputation: 35
After working through this problem, the issue appears to be naming the result column the same as the source column, e.g. SUM(Column1) as Column1. If the code is changed to SUM(Column1) as Column1_s, the issue disappears.
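A minimal sketch of the reworked query, using have and want as placeholder data set names:
proc sql;
  create table want as
  select count(*)        as cnt
       , sum(Column1)    as Column1_s     /* alias no longer matches the source column name */
       , sum(Column2)    as Column2_s
       /* ... */
       , sum(Column5500) as Column5500_s
  from have;
quit;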
Upvotes: 0
Reputation: 51566
No idea why it should require a large amount of memory to just sum 1's and 0's.
Try using PROC SUMMARY instead.
proc summary data=have;
  var column1-column5500;                      /* the 5,500 binary columns to total */
  output out=want(rename=(_freq_=cnt)) sum=;   /* _FREQ_ = row count; sum= keeps original names */
run;
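With no CLASS or BY statement, want will contain a single row: cnt (the renamed automatic _FREQ_ variable) is the number of observations read, and sum= with no target names writes each sum back under its original variable name. PROC SUMMARY also adds an automatic _TYPE_ variable, which can be dropped with out=want(drop=_type_ rename=(_freq_=cnt)) if it's not wanted.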
Upvotes: 4