Reputation: 5417
I've often had to load multiple items to a particular record in the database. For example: a web page displays items to include for a single report, all of which are records in the database (Report is a record in the Report table, Items are records in Item table). A user is selecting items to include in a single report via a web app, and let's say they select 3 items and submit. The process will add these 3 items to this report by adding records to a table called ReportItems (ReportId,ItemId).
Currently, I would do something like this in the code:
public void AddItemsToReport(string connStr, int Id, List<int> itemList)
{
    Database db = DatabaseFactory.CreateDatabase(connStr);
    string sqlCommand = "AddItemsToReport";
    DbCommand dbCommand = db.GetStoredProcCommand(sqlCommand);

    // Build a delimited string of item ids, e.g. "1~2~3"
    string items = "";
    foreach (int i in itemList)
        items += string.Format("{0}~", i);
    if (items.Length > 0)
        items = items.Substring(0, items.Length - 1);

    // Add parameters
    db.AddInParameter(dbCommand, "ReportId", DbType.Int32, Id);
    db.AddInParameter(dbCommand, "Items", DbType.String, items);
    db.ExecuteNonQuery(dbCommand);
}
and this in the Stored procedure:
INSERT INTO ReportItem (ReportId, ItemId)
SELECT @ReportId, Id
FROM fn_GetIntTableFromList(@Items, '~')
Where the function returns a one column table of integers.
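For illustration only, the splitting that such a function performs is equivalent to the following C# sketch (the actual T-SQL function body isn't shown in the question, so this is just the same logic expressed client-side):

```csharp
using System;
using System.Linq;

class SplitDemo
{
    // C# equivalent of what a delimited-list-to-table function does:
    // split on the delimiter and parse each piece as an integer.
    static int[] GetIntsFromList(string list, char delimiter)
    {
        return list
            .Split(new[] { delimiter }, StringSplitOptions.RemoveEmptyEntries)
            .Select(int.Parse)
            .ToArray();
    }

    static void Main()
    {
        int[] ids = GetIntsFromList("3~7~42", '~');
        Console.WriteLine(string.Join(",", ids)); // prints "3,7,42"
    }
}
```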
My question is this: is there a better way to handle something like this? Note, I'm not asking about database normalizing or anything like that, my question relates specifically with the code.
Upvotes: 53
Views: 58440
Reputation: 6076
Why not use a table-valued parameter? https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/sql/table-valued-parameters
Upvotes: 5
Reputation: 119806
You either do what you've already got (pass in a delimited string and then parse it out into a table value), or the other choice is to pass in a wodge of XML and handle it in much the same way.
I haven't had a chance to look at SQL 2008 yet to see if they've added any new functionality to handle this type of thing.
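For reference, the XML approach (available since SQL Server 2005) looks roughly like this. The element names and the shredding query are illustrative assumptions, not from the original post:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class XmlParamDemo
{
    // Build an XML fragment from the item ids. The <items>/<i>
    // element names are made up for this example.
    static string BuildItemsXml(int[] itemIds)
    {
        return new XElement("items",
                itemIds.Select(id => new XElement("i", id)))
            .ToString(SaveOptions.DisableFormatting);
    }

    static void Main()
    {
        string xml = BuildItemsXml(new[] { 3, 7, 42 });
        Console.WriteLine(xml); // <items><i>3</i><i>7</i><i>42</i></items>

        // The stored procedure would then shred the XML, e.g.:
        // INSERT INTO ReportItem (ReportId, ItemId)
        // SELECT @ReportId, x.i.value('.', 'int')
        // FROM @Items.nodes('/items/i') AS x(i);
    }
}
```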
Upvotes: 5
Reputation: 23315
Query a Single Field for Multiple Values in a Stored Procedure
http://www.norimek.com/blog/post/2008/04/Query-a-Single-Field-for-Multiple-Values-in-a-Stored-Procedure.aspx
Upvotes: 1
Reputation: 754250
If going to SQL Server 2008 is an option for you, there's a new feature called "Table-valued parameters" to solve this exact problem.
Check out more details on TVP here and here or just ask Google for "SQL Server 2008 table-valued parameters" - you'll find plenty of info and samples.
Highly recommended - if you can move to SQL Server 2008...
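A rough sketch of what the TVP approach looks like from ADO.NET. The type name dbo.IntList and the procedure signature are assumptions for the example; you'd first create the table type on the server with something like CREATE TYPE dbo.IntList AS TABLE (Id int NOT NULL):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class TvpDemo
{
    // Package the ids into a DataTable whose shape matches the
    // (assumed) dbo.IntList table type.
    static DataTable ToIdTable(int[] itemIds)
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        foreach (int id in itemIds)
            table.Rows.Add(id);
        return table;
    }

    static void AddItemsToReport(string connStr, int reportId, int[] itemIds)
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("AddItemsToReport", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@ReportId", reportId);

            // Pass the whole list in one round trip as a structured parameter
            SqlParameter p = cmd.Parameters.AddWithValue("@Items", ToIdTable(itemIds));
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.IntList";

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The stored procedure then just does INSERT INTO ReportItem (ReportId, ItemId) SELECT @ReportId, Id FROM @Items with no string parsing at all.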
Upvotes: 37
Reputation: 5131
Here's a very clear-cut explanation of Table-Valued Parameters from sqlteam.com: Table Valued Parameters
Upvotes: 2
Reputation: 17260
Your string join logic can probably be simplified:
string items = string.Join("~", itemList.Select(item => item.ToString()).ToArray());
That will save you some string concatenation, which is expensive in .Net.
I don't think anything is wrong with the way you are saving the items. You are limiting trips to the db, which is a good thing. If your data structure was more complex than a list of ints, I would suggest XML.
Note: I was asked in the comments if this would save us any string concatenation (it does indeed). I think it is an excellent question and would like to follow up on that.
If you peel open string.Join with Reflector you will see that Microsoft is using a couple of unsafe (in the .Net sense of the word) techniques, including using a char pointer and a structure called UnSafeCharBuffer. What they are doing, when you really boil it down, is using pointers to walk across an empty string and build up the join. Remember that the main reason string concatenation is so expensive in .Net is that a new string object is placed on the heap for every concatenation, because string is immutable. Those memory operations are expensive. String.Join(..) is essentially allocating the memory once, then operating upon it with a pointer. Very fast.
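As a side note: on .NET 4 and later (an assumption if you're on an older framework) the ToString/ToArray dance isn't needed, because String.Join gained generic overloads that accept an IEnumerable<T> directly:

```csharp
using System;
using System.Collections.Generic;

class JoinDemo
{
    static void Main()
    {
        var itemList = new List<int> { 1, 2, 3 };

        // .NET 4+ overload: takes IEnumerable<T> and calls
        // ToString on each element internally.
        string items = string.Join("~", itemList);
        Console.WriteLine(items); // prints "1~2~3"
    }
}
```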
Upvotes: 19
Reputation: 7622
See http://www.sommarskog.se/arrays-in-sql-2005.html for a detailed discussion of this issue and the different approaches that you could use.
Upvotes: 2
Reputation: 124696
One potential issue with your technique is that it doesn't handle very large lists - you may exceed the maximum string length for your database. I use a helper method that concatenates the integer values into an enumeration of strings, each of which is shorter than a specified maximum length (the following implementation also optionally checks for and removes duplicate ids):
public static IEnumerable<string> ConcatenateValues(IEnumerable<int> values, string separator, int maxLength, bool skipDuplicates)
{
    IDictionary<int, string> valueDictionary = null;
    StringBuilder sb = new StringBuilder();
    if (skipDuplicates)
    {
        valueDictionary = new Dictionary<int, string>();
    }
    foreach (int value in values)
    {
        if (skipDuplicates)
        {
            if (valueDictionary.ContainsKey(value)) continue;
            valueDictionary.Add(value, "");
        }
        string s = value.ToString(CultureInfo.InvariantCulture);
        if ((sb.Length + separator.Length + s.Length) > maxLength)
        {
            // Max length reached, yield the result and start again
            if (sb.Length > 0) yield return sb.ToString();
            sb.Length = 0;
        }
        if (sb.Length > 0) sb.Append(separator);
        sb.Append(s);
    }
    // Yield whatever's left over
    if (sb.Length > 0) yield return sb.ToString();
}
Then you use it something like:
using (SqlCommand command = ...)
{
    command.Connection = ...;
    command.Transaction = ...; // if in a transaction
    SqlParameter parameter = command.Parameters.Add("@Items", ...);
    foreach (string itemList in ConcatenateValues(values, "~", 8000, false))
    {
        parameter.Value = itemList;
        command.ExecuteNonQuery();
    }
}
Upvotes: 8