Reputation: 1298
I have a concept that I want to put in place. It is for the business logic layer of our software. Our application can literally insert thousands of records at a time. I simply want to take the JSON I get, import it to a table, then use a stored procedure to import the data. So every save our system does could potentially create an actual table named by a GUID, process the data, then delete it. I am wondering whether this will have negative effects on our database.
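For what it's worth, on SQL Server 2016 or later you can shred the JSON into a session-scoped temp table instead of a GUID-named permanent table. A minimal sketch (the `@json` payload, column names, and `dbo.ImportStaged` procedure are hypothetical):

```sql
-- Assumes SQL Server 2016+ for OPENJSON.
DECLARE @json NVARCHAR(MAX) = N'[{"Id":1,"Name":"Widget"},{"Id":2,"Name":"Gadget"}]';

SELECT Id, Name
INTO #Staging                 -- temp table scoped to this session, no GUID naming needed
FROM OPENJSON(@json)
     WITH (Id INT '$.Id', Name NVARCHAR(100) '$.Name');

EXEC dbo.ImportStaged;        -- hypothetical procedure that reads #Staging
```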
Upvotes: 0
Views: 418
Reputation: 118611
If you're deleting the tables, then you're not creating "too many". Obviously every system has limits, so yes, it's possible to create too many tables. But if you're talking about 100,000 tables created and destroyed over time, then no.
Many DBs, for example, have the concept of the "temporary" table, which is no more than a table whose lifespan is tied to the life of the connection. So these tables are created and destroyed routinely.
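In SQL Server terms, that looks like this (a sketch; the table and column names are made up):

```sql
-- A local temp table (#-prefixed) is dropped automatically when the
-- session that created it ends; you can also drop it explicitly.
CREATE TABLE #Import (Id INT, Name NVARCHAR(100));

INSERT INTO #Import (Id, Name) VALUES (1, N'example');
-- ... process the staged rows here ...

DROP TABLE #Import;  -- optional; the server cleans it up at disconnect anyway
```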
So, using a normal table in this fashion should be no problem.
Upvotes: 0
Reputation: 432230
You can prepare a temp table before calling the stored procedure rather than a persistent table. That way, every process can use the same name; otherwise you'll need a lot of dynamic SQL.
You can use SqlBulkCopy to load into this temp table, or into the real table directly.
Note: SQL Server 2008 and later also give you table-valued parameters.
And thousands of rows in one go is exactly what an RDBMS is designed for...
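The table-valued-parameter route might look like this, as a sketch (assumes SQL Server 2008+; `dbo.RecordList`, `dbo.ImportRecords`, and `dbo.TargetTable` are hypothetical names):

```sql
-- A user-defined table type that the client fills and passes in one call.
CREATE TYPE dbo.RecordList AS TABLE (Id INT, Name NVARCHAR(100));
GO
CREATE PROCEDURE dbo.ImportRecords
    @rows dbo.RecordList READONLY   -- table-valued parameters must be READONLY
AS
BEGIN
    -- One set-based insert handles thousands of rows at once.
    INSERT INTO dbo.TargetTable (Id, Name)
    SELECT Id, Name FROM @rows;
END;
```

From .NET you'd pass a DataTable (or an IEnumerable of SqlDataRecord) as the `@rows` parameter, so no staging table is needed at all.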
Upvotes: 2
Reputation: 34909
If they are temporary tables, I don't see a problem. Just don't try to do this with permanent tables.
That said, there is probably a better way to accomplish whatever you are doing without resorting to creating a table each time. If you explain more about what you are trying to do, we might be able to suggest an alternative that would perform better.
Upvotes: 0