Reputation: 9399
When I run the following I get:
Error: D1_TYPE_ERROR: Type 'object' not supported
Is there a more appropriate way to do bulk inserts with SQLite and Cloudflare's D1 tooling?
const items = [1,2,3,4]
await db
.prepare(`INSERT OR REPLACE INTO table_name (field1, field2) VALUES (?, ?), (?, ?)`)
.bind(items)
.run();
Upvotes: 1
Views: 885
Reputation: 540
The D1Database
bind
function takes its values as a rest parameter:
bind(...values: unknown[]): D1PreparedStatement;
So you can spread the array into it:
const items = [1,2,3,4]
await db
.prepare(`INSERT OR REPLACE INTO table_name (field1, field2) VALUES (?, ?), (?, ?)`)
.bind(...items)
.run();
Since D1
allows binding parameters with the ?
placeholder, we can build the values template dynamically:
const values = [ ['Foo', 1], ['Bar', 2] ]
const valueTemplate: string = values.map(value =>
  `(${Array(value.length).fill('?').join(', ')})`
).join(', ') // result = (?, ?), (?, ?)
const params: unknown[] = values.flat() // result = ['Foo', 1, 'Bar', 2]
Then, we can batch insert like this.
const query = `INSERT INTO table (col1, col2) VALUES ${valueTemplate};`
const result = await db.prepare(query).bind(...params).run()
This works fine until you exceed D1's parameter limit: Cloudflare caps the number of bound parameters per query ("Maximum bound parameters per query") at 100. So we have to split values
into chunks. My helper class D1Ext
solves that problem.
export class D1Ext {
  // e.g. row ['Foo', 1, null] -> valueTemplate '(?, 1, NULL)' with params ['Foo']
  private static getValueTemplateAndParams(previousParamSize: number, row: unknown[]) {
    const values: string[] = []
    const params: unknown[] = []
    let paramLimitReached = false
    for (const value of row) {
      if (value === null) values.push('NULL')
      else if (typeof value === 'number') values.push(String(value))
      else {
        values.push('?')
        params.push(value)
        paramLimitReached = (previousParamSize + params.length) > 100
      }
    }
    return {
      valueTemplate: '(' + values.join(', ') + ')',
      params: params,
      paramLimitReached: paramLimitReached
    }
  }

  public static batchInsert(db: D1Database, table: string, columns: string[], rows: unknown[][]): Promise<D1Result[]> {
    // INSERT INTO table (col1, col2) VALUES
    const queryPrefix = `INSERT INTO ${table} (${columns.join(', ')}) VALUES `
    const batches: {query: string, params: unknown[]}[] = []
    // e.g. ['(?, NULL)', '(NULL, ?)']
    let values: string[] = []
    // e.g. ['Foo', 'Bar']
    let params: unknown[] = []
    rows.forEach(row => {
      const valueAndParams = this.getValueTemplateAndParams(params.length, row)
      // only flush when there is something to flush, so a single oversized
      // row cannot produce an empty (malformed) INSERT
      if (valueAndParams.paramLimitReached && values.length > 0) {
        // flush the current batch when the param limit (100) would be exceeded
        const query = queryPrefix + values.join(', ') + ';'
        batches.push({query: query, params: params})
        // start a new batch with the current row
        values = [valueAndParams.valueTemplate]
        params = [...valueAndParams.params]
      } else {
        values.push(valueAndParams.valueTemplate)
        params.push(...valueAndParams.params)
      }
    })
    if (values.length > 0) {
      // flush the final, non-empty batch
      const query = queryPrefix + values.join(', ') + ';'
      batches.push({query: query, params: params})
    }
    return db.batch(batches.map(batch => db.prepare(batch.query).bind(...batch.params)))
  }
}
const tableName = 'table1'
const columns = ['col1', 'col2']
const values = [ ['Foo', 1], ['Bar', 2] ]
const results = await D1Ext.batchInsert(db, tableName, columns, values)
const isInserted: boolean = results.every((result) => result.success)
Note that my class inlines
NULL
and number
values into the SQL instead of binding them, to reduce the batch (chunk) size. You can remove that optimization in the getValueTemplateAndParams
function if you want.
Upvotes: 1