Reputation: 736
In my application I have a SQL Server 2008 employee table, Swipedaily_Tbl, with 11 columns, into which the employees' daily swipes are inserted. There are about 8000 employees in my company, which means at least 16000 rows are created daily.
I am planning to delete all the rows at the end of each month and save them to another table in order to increase performance, or to back up the previous month's data as a dump file from my application itself.
As I am new to SQL Server and database administration, can anyone suggest whether there is a better approach? Can I create a dump file from the application?
Upvotes: 0
Views: 106
Reputation: 159
As all too often, it depends. Native partitioning requires the Enterprise Edition of SQL Server; however, there are ways around it (although not very clean), like this.
If you do have the Enterprise Edition of SQL Server, I would take a serious look at partitioning (well linked in some of the other answers here). I wouldn't split on a monthly basis, though; quarterly or semi-annually is probably enough, since at two swipes per employee per day you get less than half a million rows per month, and a 1.5-3 million row table isn't that much for SQL Server to handle.
If you are already seeing performance issues with only a few months of data, have you reviewed the most frequent queries hitting the table and made sure they're using indexes?
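For example, if the most frequent query filters by employee and date range, a covering index along these lines is a reasonable starting point (the column names below are placeholders, since the real table definition isn't shown in the question):

    -- Column names are placeholders; adjust to the real Swipedaily_Tbl definition.
    CREATE NONCLUSTERED INDEX IX_Swipedaily_Employee_Date
        ON dbo.Swipedaily_Tbl (EmployeeID, SwipeDateTime)
        INCLUDE (SwipeType);  -- add the columns the query selects so the index covers it

Check the actual execution plans of your heaviest queries before adding indexes; an index that nothing uses only slows down the 16000 daily inserts.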
Upvotes: 1
Reputation: 268
Either use table partitioning, so that inserting new data into a huge table won't affect its performance, or use a script, run monthly as a SQL Agent job, to back up the data and delete it from the existing table. Note that if you are using an identity column, you may need some changes in the script to avoid conflicts between old and new data.
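A minimal sketch of the monthly step such a job could run, assuming an archive table named Swipedaily_Archive_Tbl with the same columns and a SwipeDate datetime column to filter on (both names are assumptions):

    -- Assumed names: Swipedaily_Archive_Tbl and SwipeDate are placeholders.
    DECLARE @FirstOfMonth DATETIME = DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0);

    BEGIN TRANSACTION;

    -- Copy everything older than the current month into the archive table.
    INSERT INTO dbo.Swipedaily_Archive_Tbl
    SELECT * FROM dbo.Swipedaily_Tbl
    WHERE SwipeDate < @FirstOfMonth;

    -- Then remove those rows from the live table.
    DELETE FROM dbo.Swipedaily_Tbl
    WHERE SwipeDate < @FirstOfMonth;

    COMMIT TRANSACTION;

If Swipedaily_Tbl has an identity column, define the corresponding archive column without IDENTITY (or use SET IDENTITY_INSERT) so the copied values are kept as-is and cannot conflict with new rows.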
Upvotes: 2
Reputation: 5316
It would depend on your requirements for the "old" data. Personally, I would strongly consider using table partitioning. See: http://technet.microsoft.com/en-us/library/dd578580(v=sql.100).aspx
Keep all records in the table; this will make queries that look at current and historic data simultaneously simpler and potentially cheaper.
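A rough sketch of what that could look like, assuming a datetime column named SwipeDate to partition on and quarterly boundaries (the column name and boundary dates are examples only):

    -- One partition per quarter; boundary values shown are examples.
    CREATE PARTITION FUNCTION pf_SwipeByQuarter (DATETIME)
    AS RANGE RIGHT FOR VALUES ('2013-01-01', '2013-04-01', '2013-07-01', '2013-10-01');

    CREATE PARTITION SCHEME ps_SwipeByQuarter
    AS PARTITION pf_SwipeByQuarter ALL TO ([PRIMARY]);

    -- The table's clustered index is then built on the partition scheme, e.g.:
    -- CREATE CLUSTERED INDEX CIX_Swipedaily_SwipeDate
    --     ON dbo.Swipedaily_Tbl (SwipeDate) ON ps_SwipeByQuarter (SwipeDate);

If you ever do need to move old data out, ALTER TABLE ... SWITCH can move a whole partition out almost instantly, avoiding large DELETEs.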
Upvotes: 1
Reputation: 6637
You can create another table identical to Swipedaily_Tbl (11 columns), with one additional column that records when each row was inserted into the backup table. You can then create a script that backs up the data older than one month and deletes that data from the original table, and a batch or console application that is scheduled to run it at the end of each month.
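A sketch of that approach; every column name below is a placeholder, since the real 11-column definition isn't shown in the question:

    -- Backup table: same columns as Swipedaily_Tbl plus an archive timestamp.
    CREATE TABLE dbo.Swipedaily_Backup_Tbl
    (
        EmployeeID    INT      NOT NULL,  -- placeholder; mirror the 11 real columns
        SwipeDateTime DATETIME NOT NULL,  -- placeholder
        -- ...remaining Swipedaily_Tbl columns...
        ArchivedOn    DATETIME NOT NULL DEFAULT (GETDATE())
    );

    -- Run by the scheduled batch/console application at month end:
    DECLARE @Cutoff DATETIME = DATEADD(MONTH, -1, GETDATE());  -- older than one month

    INSERT INTO dbo.Swipedaily_Backup_Tbl (EmployeeID, SwipeDateTime /* , ... */)
    SELECT EmployeeID, SwipeDateTime /* , ... */
    FROM dbo.Swipedaily_Tbl
    WHERE SwipeDateTime < @Cutoff;

    DELETE FROM dbo.Swipedaily_Tbl
    WHERE SwipeDateTime < @Cutoff;

The ArchivedOn default records when each row landed in the backup table, as described above.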
Hope this helps.
Thanks.
Upvotes: 1
Reputation: 10853
However, with proper indexing, you should be OK retaining the data in your original table for a much longer period: 365 days x 8000 employees x 2 swipes = 5.84 million records, which is not too much for SQL Server to handle.
Raj
Upvotes: 1