PURWU

Reputation: 427

How to fix error "Catastrophic failure. Cannot fetch a row from OLE DB provider "OraOLEDB.Oracle" for linked server"

I have a SQL Server 2008 R2 64-bit with a linked server to Oracle 12c using OraOLEDB.Oracle provider.

When I fetch data from the Oracle linked server using OPENQUERY, I am receiving the following error message:

The OLE DB provider "OraOLEDB.Oracle" for linked server reported an error. The provider reported an unexpected catastrophic failure.

Cannot fetch a row from OLE DB provider "OraOLEDB.Oracle" for linked server.

In OraOLEDB.Oracle provider options, Allow Inprocess is checked.

This error only happens when I query all the data; if I add a WHERE clause such as WHERE ROWNUM < 100 to limit the result to 99 rows, the query runs without issue.
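For reference, the failing pattern looks roughly like this (a sketch; ORCL, the schema, and the table name are placeholders, not the actual objects from the question):

    -- Fails with the catastrophic failure on a large result set:
    SELECT * FROM OPENQUERY(ORCL, 'SELECT * FROM my_schema.my_table');

    -- Works when the row count is capped at 99 rows:
    SELECT * FROM OPENQUERY(ORCL, 'SELECT * FROM my_schema.my_table WHERE ROWNUM < 100');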

This does not happen to other servers with the same environment.

I have tried unchecking and re-checking Allow Inprocess, but it didn't work. I also tried uninstalling the existing Oracle client and reinstalling it, but that didn't work either.

Upvotes: 4

Views: 4780

Answers (2)

Mike

Reputation: 1876

This article (in French, so use Google Chrome to auto-translate it if needed) talks about using the FetchSize parameter to control how many rows are fetched with each 'page'. In that article they say that setting the value (which defaults to 100) to 101 was sufficient to solve the error for all their requests, regardless of how many rows they requested. (In my own situation, a value of 101 did nothing more than give me one more row than a value of 100 did.)

Note that the FetchSize value can be set either in the registry as described in that article, or in the creation script for your Linked Server:

EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer',
    @srvproduct = N'Oracle', @provider = N'OraOLEDB.Oracle',
    @datasrc = N'MyDNSnameForMyOracleServer:1521/MyServiceName',
    @provstr = N'FetchSize=1000'
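If the linked server already exists, one way to change the provider string is to drop and recreate it (a sketch; the server name, data source, and FetchSize value are placeholders you would substitute with your own):

    -- Remove the existing linked server definition (and its login mappings)
    EXEC master.dbo.sp_dropserver @server = N'MyLinkedServer', @droplogins = 'droplogins'

    -- Recreate it with FetchSize set in the provider string
    EXEC master.dbo.sp_addlinkedserver @server = N'MyLinkedServer',
        @srvproduct = N'Oracle', @provider = N'OraOLEDB.Oracle',
        @datasrc = N'MyDNSnameForMyOracleServer:1521/MyServiceName',
        @provstr = N'FetchSize=1000'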

Note that in my experimentation, I found that setting FetchSize to a larger number caused problems, depending on how wide the queried table was. With an extremely wide table (494 columns) I could not get it to work with a FetchSize greater than 3062. With a 2-column query, I could raise it to somewhere between 10 million and 100 million (at which point memory allocation errors started).

In any case, it feels like this must be a bug in the Oracle Client (we're using version 21.3 - 64bit) - because a select statement should successfully retrieve all the records regardless of the FetchSize parameter.

Perhaps tweaking the FetchSize parameter will help you with your issue, if it still exists.

Upvotes: 0

Brad Clark

Reputation: 11

Make sure the "Users" group has "Read & execute", "List folder contents" and "Read" permissions on the entire directory your Oracle Client is installed in. That is what fixed it for me.
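One way to grant those rights recursively from a command line (a sketch; the install path is a placeholder for your actual Oracle Client directory, and the command should be run from an elevated prompt on the SQL Server machine):

    REM Grant the Users group read & execute (RX) on the Oracle Client
    REM directory, inherited by all child folders (CI) and files (OI)
    icacls "C:\oracle\client" /grant "Users:(OI)(CI)RX" /T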

I was also having a query work fine in SQLPlus, but as soon as I tried to execute it from Toad using the Linked Server, I would get that "catastrophic failure". I was also not able to see views or tables for the Linked Server from within SSMS. I finally went back to basics, and checked user permissions on the Driver directory, and they were all over the place.
I went to the base folder just off the root of the drive, opened the Security tab, and clicked "Advanced" on that tab. Then, in the "Advanced Security Settings" dialog, I clicked "Change Permissions". Once you do that, there is a check box labeled "Replace all child object permission entries with inheritable permission entries from this object"; check that box and click "OK".

I verified the SQLPlus query still worked, then ran all of my test queries that were failing, and they all now gave results. I am also now able to see Tables & Views in SSMS.

Upvotes: 1
