Reputation: 1
I am using Apache Druid for data storage, and I am now trying to export a datasource of around 7 GB to local storage using Druid's EXTERN function, like so:

INSERT INTO
  EXTERN(
    local(exportPath => 'exportLocation/query1')
  )
AS CSV
SELECT *
FROM test
This works in my dev environment, but in the prod environment it fails with the exception below:
"status": {
"id": "query-6c555efd-4fa5-4235-852f-2b932ed3eb60",
"groupId": "query-6c555efd-4fa5-4235-852f-2b932ed3eb60",
"type": "query_controller",
"createdTime": "2024-10-07T10:58:43.454Z",
"queueInsertionTime": "1970-01-01T00:00:00.000Z",
"statusCode": "FAILED",
"status": "FAILED",
"runnerStatusCode": "WAITING",
"duration": 56763,
"location": {
"host": "--------",
"port": -------,
"tlsPort": -
},
"dataSource": "__query_select",
"errorMsg": "WorkerRpcFailed: RPC call to task failed unrecoverably: [query-6c555efd-4fa5-4235-852f-2b932ed3eb60-..."
}
What I want to know is whether some configuration setting is missing in prod. Can anyone also explain why this kind of exception occurs?
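The errorMsg above is truncated, so I have been pulling the full task report from the Overlord's task reports endpoint to look for the underlying worker failure. A minimal sketch of what I run (the router URL and the commented-out curl target are placeholders for my cluster, not real endpoints):

```shell
# Task id copied from the failed status payload above.
TASK_ID="query-6c555efd-4fa5-4235-852f-2b932ed3eb60"
# Placeholder: point this at your Router (or Overlord) base URL.
ROUTER="http://localhost:8888"

# Druid serves the full task report, including worker errors,
# at /druid/indexer/v1/task/{taskId}/reports.
URL="${ROUTER}/druid/indexer/v1/task/${TASK_ID}/reports"
echo "$URL"

# Uncomment when pointed at a live cluster:
# curl -s "$URL" | python -m json.tool
```

The report usually contains the per-worker error that the controller's `WorkerRpcFailed` message elides.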
I am currently comparing the runtime.properties files from both environments. Is there any other way to export a complete datasource as a CSV (or as multiple CSV files, split by partition)?
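For context, while diffing the two environments I have been checking the export-related properties in particular. As far as I understand from the Druid export documentation, exporting with `local()` requires the export base directory to be configured on the cluster, roughly like this (paths here are illustrative, not my actual values):

```properties
# common.runtime.properties
# Base directory that local() export paths must resolve under;
# if this is unset, local export is rejected.
druid.export.storage.baseDir=/path/to/export/base
```

My assumption is that a mismatch here between dev and prod could explain the failure, but I have not confirmed it.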
Upvotes: 0
Views: 74