Smack

Reputation: 946

Reading large XML file and inserting data into database

I have large XML files, up to 4 to 5 GB.

I managed to read them with XmlReader, but now I want to add the values to a database.

I used simple INSERT queries to add the values and got stuck: it takes more than 2 hours to insert the values from a 2 GB file. Is there another approach that would be more efficient for adding the values from the XML files to the database?

The XML files have attributes, and their values also need to be added to the database.

I am using SQL Server 2008 and .NET.

Thanks.

Upvotes: 1

Views: 4425

Answers (4)

Adrian Salazar

Reputation: 5319

You will get a faster response if you read the XML file directly from SQL Server.

It's a three-step process; simplified, it looks like this:

```sql
-- 1) Load the XML into a variable and declare a document handle
DECLARE @doc xml = 'your xml here';
DECLARE @handler int;

-- 2) Parse the document
EXEC sp_xml_preparedocument @handler OUTPUT, @doc;

-- 3) Query it as a rowset
SELECT * FROM OPENXML(@handler, '/ROOT/Path/To/Data');
```
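Since the question mentions attribute values, a fuller sketch could map attributes to columns with a `WITH` clause and insert the rowset in one set-based statement (the XML shape, the `dbo.Items` table, and the column names here are hypothetical, and the handle should always be released afterwards):

```sql
-- Hypothetical XML shape: <ROOT><Item id="1" name="foo" /></ROOT>
DECLARE @doc xml = '<ROOT><Item id="1" name="foo" /><Item id="2" name="bar" /></ROOT>';
DECLARE @handler int;

EXEC sp_xml_preparedocument @handler OUTPUT, @doc;

-- Map the attributes to columns and insert the whole rowset at once
INSERT INTO dbo.Items (Id, Name)          -- dbo.Items is a hypothetical target table
SELECT Id, Name
FROM OPENXML(@handler, '/ROOT/Item')
WITH (Id   int           '@id',
      Name nvarchar(50)  '@name');

-- Release the document handle to free the parsed document's memory
EXEC sp_xml_removedocument @handler;
```

Note that `sp_xml_preparedocument` holds the parsed document in memory, so for a 4–5 GB file you would still have to feed it in chunks rather than as a single document.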

Upvotes: 0

Hamlet Hakobyan

Reputation: 33381

You can use XML Bulk Load (SQLXML) via COM interop. Maybe that can help you?

Upvotes: 0

SteveM

Reputation: 305

This sounds like a "batch"-style operation. Consider taking a different approach:

  1. Create an XSL transform that converts your XML into a BULK INSERT statement
  2. Save the resulting SQL script to a file or build a command string
  3. Execute the SQL script
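The script produced by the steps above might look something like this sketch, assuming the XSL transform writes the rows out as a flat delimited file (the table name, file path, and terminators are hypothetical):

```sql
-- Hypothetical: C:\staging\data.csv was produced from the XML by the XSL transform
BULK INSERT dbo.Items                  -- hypothetical target table
FROM 'C:\staging\data.csv'             -- path as seen from the SQL Server machine
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 10000,           -- commit in batches to keep the log manageable
    TABLOCK                            -- table lock enables minimally logged inserts
);
```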

See this MSDN topic.

This approach will almost certainly perform better than iterating over the XML and inserting one row at a time.

Upvotes: 0

paul

Reputation: 22001

bcp is usually the quickest way...

http://msdn.microsoft.com/en-us/library/ms191184.aspx
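A typical invocation might look like this sketch (the database, table, file, and server names are hypothetical; `-T` assumes Windows authentication):

```shell
# Hypothetical: load a character-mode file (extracted from the XML) into dbo.Items
bcp MyDatabase.dbo.Items in data.txt -S myserver -T -c -b 10000
# -T = trusted connection, -c = character data, -b = rows per batch
```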

Upvotes: 1
