Reputation: 8352
Running the following query in SQL Server Management Studio gives the error below.
update table_name set is_active = 0 where id = 3
A severe error occurred on the current command. The results, if any, should be discarded.
I have tried the same update statement on a couple of other tables in the database and they work fine.
DBCC CHECKTABLE('table_name');
gives
DBCC results for 'table_name'.
There are 13 rows in 1 pages for object "table_name".
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
Upvotes: 53
Views: 158933
Reputation: 1296
In my case it was a remote procedure call that was attempting to return xml. I converted to varchar(max) in the remote proc, and back to xml on the calling side.
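A minimal sketch of that workaround, assuming an illustrative procedure name (GetPayload), linked server ([SomeLinkedServer]) and database (MyDb):
-- Remote side: return the payload as varchar(max) instead of xml
CREATE PROCEDURE dbo.GetPayload
AS
BEGIN
    DECLARE @payload xml = N'<root><item id="1"/></root>';
    SELECT CAST(@payload AS varchar(max)) AS PayloadText;
END;
GO
-- Calling side: capture the text and convert it back to xml
CREATE TABLE #Payload (PayloadText varchar(max));
INSERT INTO #Payload EXEC [SomeLinkedServer].MyDb.dbo.GetPayload;
SELECT CAST(PayloadText AS xml) AS Payload FROM #Payload;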
Upvotes: 0
Reputation: 101
In my case, the number of fields in the temporary table did not match the number of fields returned by the query against the linked server:
INSERT INTO #SomeTempTable
EXEC [SomeLinkedServer].master.dbo.sp_executesql N'SELECT * FROM table'
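A sketch of the fix under that assumption: declare the temp table with a column list that matches the remote result set exactly (the column names and types below are illustrative).
CREATE TABLE #SomeTempTable (Id int, Name nvarchar(100), CreatedAt datetime2);
INSERT INTO #SomeTempTable (Id, Name, CreatedAt)
EXEC [SomeLinkedServer].master.dbo.sp_executesql
    N'SELECT Id, Name, CreatedAt FROM dbo.SomeTable';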
Upvotes: 0
Reputation: 9210
I received this same exception, but it was because I was using PolyBase to issue a query against an external table from an API request that was enlisting in a transaction via TransactionScope. The exception thrown from ADO.NET was the unhelpful: "A severe error occurred on the current command".
However, running the same query in SSMS finally made me realize that it was the transaction causing the issue; there I saw the following:
Unsupported transaction manager request 0 encountered. SQL Server Parallel DataWarehousing TDS endpoint only supports local transaction request for 'begin/commit/rollback'.
I then placed this particular query in its own transaction scope with the Suppress option, so that it would not enlist in the ambient TransactionScope, and it worked.
using (var transactionScope = this.transactionScopeFactory.Create(
    System.Transactions.TransactionScopeOption.Suppress,
    new System.Transactions.TransactionOptions { IsolationLevel = System.Transactions.IsolationLevel.ReadCommitted },
    System.Transactions.TransactionScopeAsyncFlowOption.Enabled))
{
    // ... query
    transactionScope.Complete(); // Suppress scope option renders this unnecessary, but just a good habit
}
Upvotes: 0
Reputation: 1237
I had the same issue, but the table was too big so I couldn't fix it from the UI or by dropping and recreating all the indexes. So here is another way to solve this, or at least it worked for me: copy the data from a backup copy of the database into a new table.
-- dbo schema assumed; the target table must not already exist for SELECT ... INTO
select * into myDatabase.dbo.myTable from myDatabase_BK.dbo.myTable
Upvotes: 0
Reputation: 112
Just wanted to add my two cents to help the next person who comes across this.
We noticed that a specific report was failing after half an hour. I assumed it was a timeout, as the amount of data being returned was huge. It turns out the SSRS default timeout setting is 1800 seconds (30 minutes). I changed the setting on this specific report to run indefinitely with no timeout, and that resolved our issue.
Next step is to identify ways to improve the performance of the MSSQL behind the report :)
Upvotes: 0
Reputation: 653
In my case it was something else: the += operator caused this. I had to replace += X with field = field + X to overcome it. I assume this is a bug, though I wasn't able to find any related KB article on Microsoft's sites. I am using SQL Server 2008 R2 (10.50.1600).
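A minimal repro sketch of the two forms, using an assumed numeric column named counter on the table from the question:
-- compound assignment form that triggered the error on that build
update table_name set counter += 1 where id = 3
-- equivalent explicit form that worked
update table_name set counter = counter + 1 where id = 3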
Upvotes: 0
Reputation: 2261
In my case, the method context.Database.CreateIfNotExists()
was called multiple times before the database had been created, and it crashed with the error "A severe error occurred on the current command. The results, if any, should be discarded."
Upvotes: 1
Reputation: 128
One other possible cause we just found after having this issue across multiple databases/tables on the same server is the number of connections open to the SQL Server.
We had an app that wasn't closing its SQL connections and was leaving them open, so we were running around 28K-31K connections (SQL Server maxes out at around 32K), and we noticed that once we killed a few thousand sleeping connections it took care of the error listed in this question.
The fix was to update the apps to make sure they closed their connections instead of leaving them open.
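A quick way to check whether you are in this situation is a read-only query against the standard DMVs (individual sessions can then be killed with KILL, but the real fix belongs in the application):
-- count user sessions and how many of them are sitting idle
SELECT COUNT(*) AS total_sessions,
       SUM(CASE WHEN status = 'sleeping' THEN 1 ELSE 0 END) AS sleeping_sessions
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;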
Upvotes: 1
Reputation: 6476
I was getting the error in Hangfire, where I did not have access to the internal workings of the library, nor was I able to trace what the primary cause was.
Building on @Remus Rusanu's answer, I was able to fix this with the following script.
--first set the database to single user mode
ALTER DATABASE TransXSmartClientJob
SET SINGLE_USER
WITH ROLLBACK IMMEDIATE;
GO
-- Then try to repair
DBCC CHECKDB(TransXSmartClientJob, REPAIR_REBUILD)
-- when done, set the database back to multiple user mode
ALTER DATABASE TransXSmartClientJob
SET MULTI_USER;
GO
Upvotes: 0
Reputation: 829
This seems to happen when there's a generic problem with your data source that isn't being handled.
In my case I had inserted a bunch of data and the indexes on the table had become corrupt, so they needed rebuilding. I found a script to rebuild them all, and that seemed to fix it. To track the error down, I ran the same query directly against the database, one that had worked 100+ times previously.
Upvotes: 1
Reputation: 2357
This error is exactly what it means: something bad happened that would not normally happen.
In my most recent case, the REAL error was:
Msg 9002, Level 17, State 2, Procedure MyProcedure, Line 2 [Batch Start Line 3]
The transaction log for database 'MyDb' is full due to 'LOG_BACKUP'.
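If you hit that same underlying cause, a quick check and the usual remedy look roughly like this (a sketch assuming the database name from the message and an illustrative backup path):
-- LOG_BACKUP here means the log cannot be reused until a log backup is taken
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyDb';
-- under the FULL recovery model, a log backup frees the space for reuse
BACKUP LOG MyDb TO DISK = N'C:\Backups\MyDb_log.trn';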
Upvotes: 3
Reputation: 900
I just had the same error, and it was down to a corrupted index. Re-indexing the table fixed the problem.
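For reference, a minimal re-indexing sketch against the table from the question (assuming the dbo schema; REORGANIZE is a lighter-weight alternative if a full rebuild is too heavy):
ALTER INDEX ALL ON dbo.table_name REBUILD;
-- ALTER INDEX ALL ON dbo.table_name REORGANIZE;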
Upvotes: 62
Reputation: 2139
In my case, I was using System.Threading.CancellationTokenSource to cancel a SqlCommand, but not handling the exception with catch (SqlException) { }
Upvotes: 3
Reputation: 4531
A different scenario but the same error: I got this error when I was trying to insert records into a temporary table using a stored procedure. It turned out there was a parameter mismatch. I was trying to insert a BIGINT into an INT.
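A hypothetical repro of that kind of mismatch (the procedure and column names are illustrative):
-- the temp table column is INT while the procedure's result set returns BIGINT
CREATE TABLE #Results (Id int);
INSERT INTO #Results (Id)
EXEC dbo.SomeProcedureReturningBigintIds;
-- fix: match the column type to the procedure's output
-- CREATE TABLE #Results (Id bigint);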
Credit goes to Vicky Harp: http://vickyharp.com/2012/03/troubleshooting-a-severe-error-occurred-on-the-current-command/
Upvotes: 7
Reputation: 2501
In my case, I was using a subquery and had the same problem. I realized that the problem was a memory leak: restarting the MSSQL service flushed the tempdb resources and freed a huge amount of memory, and that solved the problem.
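Before restarting the service, it can be worth checking how much of tempdb is actually in use; a read-only sketch against the standard DMV:
USE tempdb;
SELECT
    SUM(user_object_reserved_page_count) * 8 AS user_objects_kb,
    SUM(internal_object_reserved_page_count) * 8 AS internal_objects_kb,
    SUM(unallocated_extent_page_count) * 8 AS free_kb
FROM sys.dm_db_file_space_usage;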
Upvotes: 12
Reputation: 294437
Run DBCC CHECKTABLE('table_name');
Check the LOG folder where the instance is installed (usually \Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\LOG) for any file named 'SQLDUMP*'
Upvotes: 6