Fabiano

Reputation: 23

Spoon runs slow from Postgres to Oracle

I have a Spoon ETL transformation that reads a table from Postgres and writes it into Oracle. No transformation, no sort: just SELECT col1, col2, ... col33 FROM table. With 350,000 rows in input, the throughput is 40-50 rec/sec.

If I read/write the same table from Postgres to Postgres with ALL columns (col1...col100), I get 4-5,000 rec/sec. The same if I read/write from Oracle to Oracle: 4-5,000 rec/sec. So, for me, it is not a network problem.

If I try another Postgres table with only 7 columns, the performance is good.

Thanks for the help.

Upvotes: 0

Views: 442

Answers (1)

Kullayappa M

Reputation: 174

The same happened in my case: while loading data from Oracle with the transformation running on my local machine (Windows), the processing rate was 40 r/s, but it was 3,000 r/s for a Vertica database.

I couldn't figure out what the exact problem was, but I found a way to increase the row throughput. It worked for me; you can do the same.

Right-click on the Table Input step and you will see "Change Number Of Copies to Start".

Then include the condition below in the WHERE clause to avoid duplicates. When you choose "Change Number Of Copies to Start", the query is triggered N times and would return duplicate rows, but adding this condition to the WHERE clause makes each copy fetch only its own distinct slice of the records:

where ora_hash(v_account_number,10)=${internal.step.copynr}

v_account_number is the primary key in my case. The 10 is the number of copies minus 1: for example, if you chose 11 copies to start, then 11 - 1 = 10. Set it according to the number of copies you choose.
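Putting it together, with a hypothetical table and column names (accounts, col1...col3) and 11 copies to start, the Table Input query would look something like this sketch:

```sql
-- Hypothetical names; 11 copies to start, so the ora_hash bucket count is 11 - 1 = 10.
SELECT col1, col2, col3
FROM accounts
WHERE ora_hash(v_account_number, 10) = ${internal.step.copynr}
```

Each step copy substitutes its own copy number (0 through 10) for ${internal.step.copynr}, so the 11 parallel queries together return every row exactly once.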

Please note this works, but I suggest using it on a local machine for testing purposes; on the server you will likely not face this issue, so comment out the condition when deploying to servers.
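The principle behind the trick is plain hash partitioning. Here is a minimal Python sketch (Python's md5 standing in for Oracle's ora_hash, and made-up key values standing in for v_account_number) showing why the N parallel copies return disjoint slices with no duplicates and no missing rows:

```python
import hashlib

# Illustrative only: md5 stands in for ora_hash, and the key values are invented.
N_COPIES = 11

# Hypothetical primary-key values (placeholders for v_account_number).
rows = [f"ACC{i:05d}" for i in range(1000)]

def partition_for(key: str, n_copies: int) -> int:
    """Deterministic bucket in 0 .. n_copies - 1, like ora_hash(key, n_copies - 1)."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % n_copies

# Each "copy" keeps only the rows whose bucket matches its copy number,
# mirroring WHERE ora_hash(pk, N-1) = copynr in each parallel query.
slices = [[r for r in rows if partition_for(r, N_COPIES) == copynr]
          for copynr in range(N_COPIES)]

# Every row lands in exactly one slice: nothing duplicated, nothing missing.
assert sum(len(s) for s in slices) == len(rows)
assert set().union(*(set(s) for s in slices)) == set(rows)
```

Because the hash is deterministic, a given key always lands in the same bucket, so exactly one copy ever fetches it.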

Upvotes: 1
