mekar10

Reputation: 651

How can I insert 100000 rows in SQL Server?

INSERT INTO pantscolor_t (procode, color, pic) 
VALUES
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg'),
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg'),
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg'),
..........
..........
..........

INSERT INTO pantscolor_t (procode,color,pic)
VALUES
('74251', 'Charcoal', '511charcoal.jpg'),
('74251', 'Charcoal', '511charcoal.jpg'),
('74251', 'Charcoal', '511charcoal.jpg'),
('74251', 'Charcoal', '511charcoal.jpg'),
.............
.............
.............

 INSERT INTO........................
 INSERT INTO........................
 INSERT INTO........................
 INSERT INTO........................

I have 100,000 rows like this, but my INSERT statements are bigger than 1,000 rows each. When I run the SQL statement in SSMS, I get an error:

The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values.
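
For context, SQL Server caps a single INSERT ... VALUES at 1,000 row value expressions, so the data has to be split across multiple statements; a minimal sketch using the placeholder rows from above:

INSERT INTO pantscolor_t (procode, color, pic)
VALUES
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg');
-- ...up to 1,000 row value expressions per INSERT...

INSERT INTO pantscolor_t (procode, color, pic)
VALUES
('74251', 'Charcoal', '511charcoal.jpg');
-- ...next batch of at most 1,000 rows, and so on...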

Upvotes: 41

Views: 105699

Answers (6)

jerichaux

Reputation: 66

PRELUDE

I tried some of the other answers here for a manual insert of 250,000 rows, broken into 5 groups of 50,000. I let a few attempts run for ~5 minutes apiece (all of which I cancelled manually before completion) before deciding to try my own hand at a solution.

For example, user12408924's answer (quoted below) allowed me to attempt to insert >1,000 rows, but it was dreadfully slow. (For reference, I was inserting into a table composed of 11 VARCHAR(255) columns, 1 VARCHAR(1) column, 2 DATE columns, and 1 FLOAT column, all nullable.)

Nothing against their answer; it's great! It just didn't fit my particular use case.

INSERT mytable (col1, col2, col3, col4, col5, col6)
SELECT * FROM (VALUES
('1502577', '0', '114', 'chodba', 'Praha', 'Praha 1'),
('1503483', '0', 'TVP', 'chodba', 'Praha', 'Praha 2'),
/* ... more than 1000 rows ... */
('1608107', '0', '8', 'sklad', 'Tlumačov', 'Tlumačov'),
('1608107', '0', '9', 'sklad', 'Tlumačov', 'Tlumačov')
) AS temp (col1, col2, col3, col4, col5, col6);

SOLUTION: OPENJSON

So, instead, I used a janky solution that should not have worked, but it worked quite well and with zero issues, with each 50,000-row insert completing in <3 seconds (obviously machine-dependent, but prior attempts took >5 minutes, as mentioned above).

The following batch is an example with far fewer rows and far fewer columns, but it is otherwise verbatim what I successfully executed:

DECLARE @JsonArr NVARCHAR(MAX) = N'[
    {"UserID":"001","FirstName":"Alpha","LastName":"First","Email":"[email protected]","BirthDate":"1970-01-01"},
    {"UserID":"002","FirstName":"Bravo","LastName":"Second","Email":"[email protected]","BirthDate":"1970-01-02"},
    {"UserID":"003","FirstName":"Charlie","LastName":"Third","Email":"[email protected]","BirthDate":"1970-01-03"},
    {"UserID":"004","FirstName":"Delta","LastName":"Fourth","Email":"[email protected]","BirthDate":"1970-01-04"},
    {"UserID":"005","FirstName":"Foxtrot","LastName":"Fifth","Email":"[email protected]","BirthDate":"1970-01-05"},
    {"UserID":"006","FirstName":"Golf","LastName":"Sixth","Email":"[email protected]","BirthDate":"1970-01-06"},
    {"UserID":"007","FirstName":"Hotel","LastName":"Seventh","Email":"[email protected]","BirthDate":"1970-01-07"}
]';
INSERT INTO
    [DBName].[SchemaName].[TargetName]
        ([UserID], [FirstName], [LastName], [Email], [BirthDate])
    SELECT [UserID], [FirstName], [LastName], [Email], [BirthDate]
        FROM OPENJSON(@JsonArr)
            WITH (
                    [UserID] VARCHAR(255)
                        COLLATE SQL_Latin1_General_CP1_CI_AS
                    , [FirstName] VARCHAR(255)
                        COLLATE SQL_Latin1_General_CP1_CI_AS
                    , [LastName] VARCHAR(255)
                        COLLATE SQL_Latin1_General_CP1_CI_AS
                    , [Email] VARCHAR(255)
                        COLLATE SQL_Latin1_General_CP1_CI_AS
                    , [BirthDate] DATE
                );
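
If the same pattern is run once per 50,000-row chunk, it may be convenient to wrap it in a stored procedure that takes the JSON as a parameter. A minimal sketch, reusing the placeholder names from the batch above; the procedure name is hypothetical, and CREATE OR ALTER requires SQL Server 2016 SP1 or later (OPENJSON itself needs 2016+):

CREATE OR ALTER PROCEDURE [SchemaName].[InsertUsersFromJson]  -- hypothetical name
    @JsonArr NVARCHAR(MAX)
AS
BEGIN
    SET NOCOUNT ON;
    -- Same OPENJSON shredding as above, parameterized per chunk.
    INSERT INTO [DBName].[SchemaName].[TargetName]
        ([UserID], [FirstName], [LastName], [Email], [BirthDate])
    SELECT [UserID], [FirstName], [LastName], [Email], [BirthDate]
    FROM OPENJSON(@JsonArr)
        WITH (
              [UserID]    VARCHAR(255)
            , [FirstName] VARCHAR(255)
            , [LastName]  VARCHAR(255)
            , [Email]     VARCHAR(255)
            , [BirthDate] DATE
        );
END;
GO

-- One call per JSON chunk of up to ~50,000 rows:
-- EXEC [SchemaName].[InsertUsersFromJson] @JsonArr = N'[ ... ]';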

Limitations

The MSSQL NVARCHAR documentation specifies a maximum of 2GB of space allocated for an NVARCHAR(MAX):

Variable-size string data. The value of n defines the string size in byte-pairs, and can be from 1 through 4,000. max indicates that the maximum storage size is 2^31-1 characters (2 GB).
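
If there is any doubt whether a generated chunk fits, its size can be checked in the same batch where @JsonArr is declared; a small sketch:

-- DATALENGTH returns the size in bytes; NVARCHAR stores 2 bytes per character,
-- so @JsonArr must stay under 2^31-1 bytes (~2 GB).
SELECT DATALENGTH(@JsonArr) AS JsonSizeInBytes;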

Upvotes: 0

Apurva Singh
Apurva Singh

Reputation: 4990

Create a CSV file (out.csv, etc.).
Then use:

BULK INSERT pantscolor_t   -- the target table name (taken from the question) is required
FROM 'C:\out.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
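
For reference, a matching out.csv for the question's three columns might look like this (values taken from the question; the exact file layout is an assumption):

74251,Black,511black.jpg
74251,OD Green,511odgreen.jpg
74251,Charcoal,511charcoal.jpg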

Upvotes: 0

user12408924
user12408924

Reputation: 261

INSERT mytable (col1, col2, col3, col4, col5, col6)
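-- A VALUES table value constructor used as a derived table is not subject to
-- the 1,000-row cap that applies when VALUES feeds an INSERT directly.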
SELECT * FROM (VALUES
('1502577', '0', '114', 'chodba', 'Praha', 'Praha 1'),
('1503483', '0', 'TVP', 'chodba', 'Praha', 'Praha 2'),
/* ... more than 1000 rows ... */
('1608107', '0', '8', 'sklad', 'Tlumačov', 'Tlumačov'),
('1608107', '0', '9', 'sklad', 'Tlumačov', 'Tlumačov')
) AS temp (col1, col2, col3, col4, col5, col6);

Upvotes: 25

Scorpion99
Scorpion99

Reputation: 248

By applying the following, you should not get any error:

INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251','Black','511black.jpg')

INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'OD Green', '511odgreen.jpg')

INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'Black', '511black.jpg')

INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'OD Green', '511odgreen.jpg')

INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'Black', '511black.jpg')

...........

I tried it and it worked; of course, you can use Excel to concatenate the values easily.

Upvotes: 10

Alex Jorgenson
Alex Jorgenson

Reputation: 912

Another solution is to use a select query with unions.

INSERT INTO pantscolor_t (procode,color,pic)
SELECT '74251', 'Black', '511black.jpg'
UNION ALL SELECT '74251', 'OD Green', '511odgreen.jpg'
UNION ALL SELECT '74251', 'Black', '511black.jpg'
UNION ALL SELECT '74251', 'OD Green', '511odgreen.jpg'
UNION ALL SELECT '74251', 'Black', '511black.jpg'
UNION ALL SELECT '74251', 'OD Green', '511odgreen.jpg'
--etc....

UNION ALL is used instead of UNION to speed up the query when dealing with thousands of records. UNION ALL allows duplicate rows, whereas UNION ensures that duplicates do not exist in the result set. In this scenario we don't want to remove any possible duplicates, so UNION ALL is used.

Upvotes: 32

Bala
Bala

Reputation: 4547

Create a CSV file (or any file with a defined field delimiter and row delimiter) and use the BULK INSERT option to load the file into the database. The file can have 100,000 rows; there won't be any problem loading a huge file with bulk upload.

http://msdn.microsoft.com/en-us/library/ms188365.aspx
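
A minimal sketch against the question's table, assuming a comma-delimited file with one row per line (the file path and options are placeholders):

BULK INSERT pantscolor_t
FROM 'C:\data\pantscolor.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 1,   -- set to 2 if the file has a header row
    TABLOCK         -- take a table lock; can enable minimal logging for large loads
);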

Upvotes: 23
