Reputation: 1776
I would like to insert about 100,000 rows with default values into an SQLite database, and I want the engine to do the work rather than iterating in a loop in Python.
For inserting with default values, the doc has this to say:
The third form of an INSERT statement is with DEFAULT VALUES. The INSERT ... DEFAULT VALUES statement inserts a single new row into the named table. Each column of the new row is populated with its default value, or with a NULL if no default value is specified as part of the column definition in the CREATE TABLE statement.
That does not seem helpful, so I thought I could use INSERT INTO ... SELECT:
The second form of the INSERT statement contains a SELECT statement instead of a VALUES clause. A new entry is inserted into the table for each row of data returned by executing the SELECT statement. If a column-list is specified, the number of columns in the result of the SELECT must be the same as the number of items in the column-list. Otherwise, if no column-list is specified, the number of columns in the result of the SELECT must be the same as the number of columns in the table. Any SELECT statement, including compound SELECTs and SELECT statements with ORDER BY and/or LIMIT clauses, may be used in an INSERT statement of this form.
But I can't figure out how to make a SELECT statement return 100,000 rows against an empty table for use in an INSERT INTO ... SELECT statement. The docs on SQLite expressions haven't given me any clues.
Is there a clever solution to this instead of brute forcing it with Python?
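For reference, the brute-force Python loop I'm trying to avoid looks roughly like this (table and file names are just placeholders, assuming every column of MyTable has a usable default):

import sqlite3

con = sqlite3.connect("example.db")  # placeholder database file
cur = con.cursor()

# One INSERT ... DEFAULT VALUES per row -- this is the loop I'd like to avoid.
for _ in range(100000):
    cur.execute("INSERT INTO MyTable DEFAULT VALUES")

con.commit()
con.close()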
Upvotes: 0
Views: 501
Reputation: 180030
Having multiple rows with the same values does not make sense in a relational database.
Anyway, another way to get default values for all the actual columns is to insert NULL into the internal rowid column.
To generate rows out of thin air, you need a recursive common table expression:
INSERT INTO MyTable(rowid)
WITH RECURSIVE n(r) AS (
    SELECT NULL               -- initial row
    UNION ALL
    SELECT NULL FROM n        -- each recursive step adds one more row
    LIMIT 100000              -- stop after 100,000 rows
)
SELECT r FROM n;
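A minimal sketch of running this from Python with the standard sqlite3 module (MyTable and the database file name are assumptions; the table's other columns need usable defaults):

import sqlite3

con = sqlite3.connect("example.db")  # placeholder database file

# One statement; SQLite generates all 100,000 rows itself.
con.execute("""
    INSERT INTO MyTable(rowid)
    WITH RECURSIVE n(r) AS (
        SELECT NULL
        UNION ALL
        SELECT NULL FROM n
        LIMIT 100000
    )
    SELECT r FROM n
""")
con.commit()

print(con.execute("SELECT COUNT(*) FROM MyTable").fetchone()[0])  # -> 100000
con.close()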
Upvotes: 1