Reputation: 619
I have a question about inserting columns into a table from a stored procedure. I have two tables, one with about 100 rows and one with about 1 million rows. Basically, I have a function that returns an int by comparing a value from each table, and I want to run the function on every combination of values.
Table 1 (~100 rows)
Col1
A
B
C
D
Table 2 (~1 Mil rows)
Col1
1
2
3
4
function(i, j), where i is from Table 1 and j is from Table 2.
The code, in a nutshell, looks like this:
while i < count(*) of table 1
    print i
    while j < count(*) of table 2
        select function(i, j)
        j = j + 1
    end
    i = i + 1
    j = 0
end
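Roughly speaking, that loop in runnable T-SQL would look something like this (Table1, Table2, and dbo.MyFunction are placeholder names standing in for the real objects):
-- Placeholder names: Table1, Table2, and dbo.MyFunction stand in for the real objects.
DECLARE @i INT = 1, @j INT
DECLARE @cnt1 INT = (SELECT COUNT(*) FROM Table1)
DECLARE @cnt2 INT = (SELECT COUNT(*) FROM Table2)

WHILE @i <= @cnt1
BEGIN
    PRINT @i
    SET @j = 1
    WHILE @j <= @cnt2
    BEGIN
        -- evaluate the comparison function for this (i, j) pair
        SELECT dbo.MyFunction(@i, @j)
        SET @j = @j + 1
    END
    SET @i = @i + 1
END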
The result is displayed as one long single column, interleaving i (from Table 1) with the int values returned by the function.
i(1)
1
3
4
.
.
.
i(2)
1
2
.
.
I want to display the above result as a table, dynamically assigning a column name to each i:
i(1) i(2) i(3) .....
1 1 5
3 2 9
4 3 4
Thanks in advance.
Upvotes: 0
Views: 191
Reputation: 3775
This would be my solution:
-- Build a comma-separated list of the t1.i values, each wrapped in [],
-- to use as the dynamic column list for the PIVOT.
DECLARE @columns VARCHAR(MAX)
SELECT @columns = COALESCE(@columns + ',[' + CAST(t1.i AS VARCHAR(10)) + ']'
                          ,'[' + CAST(t1.i AS VARCHAR(10)) + ']')
FROM t1

-- Cross join the two tables, evaluate the function for every pair,
-- then pivot so each t1.i value becomes its own column.
DECLARE @query VARCHAR(MAX)
SET @query = '
SELECT * FROM
(
    SELECT t1.i, t2.j, dbo.yourFunction(t1.i, t2.j) AS functionResult
    FROM t1
    CROSS JOIN t2
) AS tbl3
PIVOT
(
    SUM(functionResult) FOR i IN (' + @columns + ')
) AS tblResults'

EXECUTE (@query)
You can check out a working example of this here on SEDE.
And just for clarification, you would replace t1 and t2 with the actual names of your two tables. Also, in the SEDE example I just added i and j together (t1.i + t2.j as functionResult) because I can't create functions there, but you would call your function instead, as I show in my answer (dbo.yourFunction(t1.i, t2.j) as functionResult).
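For completeness, a minimal setup the query above would run against might look like the following (the table, column, and function names are just placeholders matching the snippet; adjust them to your schema):
-- Placeholder tables matching the names used in the dynamic query above.
CREATE TABLE t1 (i INT);
CREATE TABLE t2 (j INT);
INSERT INTO t1 (i) VALUES (1), (2), (3);
INSERT INTO t2 (j) VALUES (10), (20), (30);
GO

-- Stand-in scalar function; replace the body with your real comparison logic.
CREATE FUNCTION dbo.yourFunction (@i INT, @j INT)
RETURNS INT
AS
BEGIN
    RETURN @i + @j;  -- same stand-in used in the SEDE example
END;
GO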
Upvotes: 0
Reputation: 4469
To get your desired output, please see the code below. You only need to change the table names and call your function in the commented area of the code block.
DECLARE @rest INT,
        @rev INT,
        @counter INT,
        @Table1Count INT,
        @Table2Count INT
DECLARE @colName VARCHAR(50),
        @Fields NVARCHAR(MAX),
        @Sql VARCHAR(MAX)

SELECT @colName = 'Col', @Fields = '', @rest = 1, @rev = 1, @counter = 1
SELECT @Table1Count = COUNT(*) FROM Table1
SELECT @Table2Count = COUNT(*) FROM Table2

-- One row per row of Table2; one column is added per row of Table1 below.
CREATE TABLE #temp (ID INT IDENTITY(1,1), defaultColumn VARCHAR(50))

WHILE @rest <= @Table2Count
BEGIN
    INSERT INTO #temp (defaultColumn) VALUES (@rest)
    SET @rest = @rest + 1
END

WHILE @counter <= @Table1Count
BEGIN
    PRINT (CAST(@counter AS VARCHAR))
    SET @colName = @colName + CAST(@counter AS VARCHAR)
    SET @Fields = @Fields + @colName

    IF @counter <> @Table1Count
    BEGIN
        SELECT @Fields = @Fields + ', '
    END

    -- Add one column to #temp for the current Table1 row.
    EXEC ('ALTER TABLE #temp ADD ' + @colName + ' VARCHAR(250)')

    WHILE @rev <= @Table2Count
    BEGIN
        -------------------------------------------------
        -- Some modification may be needed for your function
        -- CALL YOUR FUNCTION HERE
        -------------------------------------------------
        SET @Sql = 'UPDATE #temp SET ' + @colName + ' = ' + CAST(@rev AS VARCHAR)
                 + ' WHERE defaultColumn = ' + CAST(@rev AS VARCHAR)
        EXEC (@Sql)
        SET @rev = @rev + 1
    END

    SELECT @rev = 1, @counter = @counter + 1, @colName = 'Col'
END

EXEC ('SELECT ' + @Fields + ' FROM #temp')
DROP TABLE #temp
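As a rough illustration of what the commented placeholder might become once a real function is plugged in (dbo.yourFunction is an assumed name; the counter-based arguments follow the structure of the loop above, so adapt them to your actual keys), the dynamic UPDATE inside the inner loop could look like this:
-- Hypothetical replacement for the placeholder UPDATE inside the inner loop:
SET @Sql = 'UPDATE #temp SET ' + @colName
         + ' = dbo.yourFunction(' + CAST(@counter AS VARCHAR) + ', defaultColumn)'
         + ' WHERE defaultColumn = ' + CAST(@rev AS VARCHAR)
EXEC (@Sql)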
Keep in mind that the number of columns must stay below the maximum allowed per table in MS SQL Server (for the version you are using).
Hope this is helpful to you; thanks for your time.
Upvotes: 0
Reputation: 20320
That's going to be expensive: 100,000,000 results! Support in SQL Server is very poor for this.
To use sp_crosstab (which is not at the top of anybody's list of most desirable things to do), you need three columns.
So that would be 'I', 'J' and f(i,j).
Instead of the while loops, you can do what's normally an error:
Select Table1.Col1, Table2.Col1, Function(Table1.Col1, Table2.Col1) From Table1, Table2
(or From Table1 Cross Join Table2)
This gives you a cartesian product: all 100 records in Table1 joined to all 1,000,000 in Table2. Pass that into sp_crosstab correctly and you'll get a row for every value in Table2, a column for every row in Table1, and each cell will hold the result of the function.
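For reference, a minimal sketch of producing and materializing that three-column result set for the crosstab step (Table1, Table2, and Col1 come from the question; dbo.MyFunction and #CrossResult are placeholder names):
-- Three columns: the Table1 value, the Table2 value, and f(i, j).
-- Expect ~100 million rows, so this will take time and space.
SELECT  t1.Col1 AS I,
        t2.Col1 AS J,
        dbo.MyFunction(t1.Col1, t2.Col1) AS FResult
INTO    #CrossResult
FROM    Table1 AS t1
CROSS JOIN Table2 AS t2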
Having said that, don't be surprised if this is a little bit slow, and possibly uses up a bit of memory....
You might want to try it out with some smaller ranges.
Upvotes: 1