Xavier S.

Reputation: 1157

perl execute sql file (DBI oracle)

I have the following problem: I have a SQL file to execute with the Perl DBI CPAN module. I saw two solutions on this website to solve my problem:

  1. Read SQL file line by line

  2. Read SQL file in one instruction

So, which one is better, and what is the real difference between the two solutions?

EDIT

It's for a library. I need to retrieve output and the return code.

Kind of files passed might be as following:

    set serveroutput on;
    set pagesize 20000;
    spool "&1."
    DECLARE
        -- Retrieve the arguments
        -- &2: FLX_REF, &3: SVR_ID, &4: ACQ_STT, &5: ACQ_LOG, &6: FLX_COD_DOC, &7: ACQ_NEL, &8: ACQ_TYP
        VAR_FLX_REF VARCHAR2(100):=&2;
        VAR_SVR_ID NUMBER(10):=&3;
        VAR_ACQ_STT NUMBER(4):=&4;
        VAR_ACQ_LOG VARCHAR2(255):=&5;
        VAR_FLX_COD_DOC VARCHAR2(30):=&6;
        VAR_ACQ_NEL NUMBER(10):=&7;
        VAR_ACQ_TYP NUMBER:=&8;
    BEGIN
        INSERT INTO ACQUISITION_CFT 
            (ACQ_ID, FLX_REF, SVR_ID, ACQ_DATE, ACQ_STT, ACQ_LOG, FLX_COD_DOC, ACQ_NEL, ACQ_TYP) 
        VALUES 
            (TRACKING.SEQ_ACQUISITION_CFT.NEXTVAL, ''VAR_FLX_REF'', 
                ''VAR_SVR_ID'', sysdate, VAR_ACQ_STT, ''VAR_ACQ_LOG'',
                ''VAR_FLX_COD_DOC'', VAR_ACQ_NEL, VAR_ACQ_TYP);
    END;

    /
    exit;

I have another question, again with the DBI Oracle module: may I use the same code for a SQL file and for a control file?

(Example of SQL Control file)

    LOAD DATA
    APPEND INTO TABLE DOSSIER
    FIELDS TERMINATED BY ';'
    (
    DSR_IDT,
    DSR_CNL,
    DSR_PRQ,
    DSR_CEN,
    DSR_FEN,
    DSR_AN1,
    DSR_AN2,
    DSR_AN3,
    DSR_AN4,
    DSR_AN5,
    DSR_AN6,
    DSR_PI1,
    DSR_PI2,
    DSR_PI3,
    DSR_PI4,
    DSR_NP1,
    DSR_NP2,
    DSR_NP3,
    DSR_NP4,
    DSR_NFL,
    DSR_NPG,
    DSR_LTP,
    DSR_FLF,
    DSR_CLR,
    DSR_MIM,
    DSR_TIM,
    DSR_NDC,
    DSR_EMS NULLIF DSR_EMS=BLANKS "sysdate",
    JOB_IDT,
    DSR_STT,
    DSR_DAQ "CASE WHEN :DSR_DAQ IS NOT NULL THEN SYSDATE ELSE NULL END"

    )

Upvotes: 0

Views: 1038

Answers (3)

user1919238

Reputation:

"Line by line" only makes sense if each SQL statement is on a single line. You probably mean statement by statement.

Beyond that, it depends on what your SQL file looks like and what you want to do.

How complex is your SQL file? Could it contain things like this?

    select foo from table where column1 = 'bar;';   --Get foo; it will be used later.

The simple way to read an SQL file statement by statement is to split by semicolons (or whatever the statement delimiter is). But this method will fail if you might have semicolons in other places, like comments or strings. If you split this statement by semicolons, you would try to execute the following four "commands":

    select foo from table where column1 = 'bar;
    ';
    --Get foo; 
    it will be used later.

Obviously, none of these are valid. Handling statements like this correctly is no simple matter. You have to completely parse SQL to figure out what the statements are. Unfortunately, there is no ready-made module that can do this for you (SQL::Script is a good start on an SQL file processing module, but according to the documentation it just splits on semicolons at this point).

If your SQL file is simple, not containing any statement delimiters within statements or comments; or if it is predictable in some other way (such as having one statement per line), then it is easy to split the file into statements and execute them one by one. But if you have to handle arbitrary SQL syntax, including cases such as above, this will be a complex task.
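For the simple case, splitting on semicolons and executing each piece with DBI is only a few lines of Perl. This is a minimal sketch under exactly that assumption (no semicolons inside strings or comments); the `split_statements` helper and the connection details are illustrative, not a full parser:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Naive statement splitter: only safe when no semicolons appear
# inside strings or comments (hypothetical helper, not a real parser).
sub split_statements {
    my ($sql) = @_;
    return grep { /\S/ } map { s/^\s+|\s+$//gr } split /;/, $sql;
}

# Example usage with DBI (connection details are placeholders):
# my $dbh = DBI->connect('dbi:Oracle:MYDB', $user, $pass,
#                        { RaiseError => 1, AutoCommit => 0 });
# open my $fh, '<', 'script.sql' or die "open: $!";
# my $sql = do { local $/; <$fh> };
# $dbh->do($_) for split_statements($sql);
```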

What kind of task?

  • Do you need to retrieve the output?
  • Is it important to detect errors in any individual statement, or is it just a batch job that you can run and not worry about it?

If this is something that you can just run and forget about, you could just have Perl execute a system command, telling Oracle to process the file. This will be simpler than handling all of the statements yourself. But if you need to process the results or handle errors within Perl, doing it yourself statement by statement will be a necessity.
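The "run and forget" option can be sketched as a small wrapper around an external command: capture the output with backticks and read the exit code from `$?`. The `run_cmd` helper below is hypothetical, and it assumes `sqlplus` is on your `PATH` if you use it for Oracle:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Run an external command, capturing combined output and the exit code.
# (Hypothetical helper; for Oracle you would pass something like
#  'sqlplus', '-S', "$user/$pass\@$db", '@script.sql', @args.)
sub run_cmd {
    my (@cmd) = @_;
    my $out = qx(@cmd 2>&1);     # capture stdout and stderr
    my $rc  = $? >> 8;           # exit code of the child process
    return ($out, $rc);
}

# my ($output, $rc) = run_cmd('sqlplus', '-S', $connect, '@script.sql');
# die "sqlplus failed ($rc): $output" if $rc;
```

Note that SQL*Plus may exit 0 even when individual statements fail, unless the script uses `WHENEVER SQLERROR EXIT`, so the captured output still needs checking.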

Update: based on your response, you want to write a library that can handle arbitrary SQL statements. In that case, you definitely need to parse the SQL and execute the statements one at a time. This is do-able, but not simple. The possibility of BEGIN...END blocks means that you have to be able to correctly handle semicolons within a statement.

The SQL::Statement class of modules may be helpful.
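One way to cope with `BEGIN...END` blocks, given files like the one in the question, is to treat a block opened by `DECLARE` or `BEGIN` as a single unit that ends at a lone `/` line, while plain statements still end at a semicolon. This sketch ignores strings and comments, so it is a starting point rather than a full parser:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Split a script into statements, treating DECLARE/BEGIN ... / blocks
# as single units (a sketch: strings and comments are not handled).
sub split_script {
    my ($text) = @_;
    my (@stmts, $buf, $in_block) = ((), '', 0);
    for my $line (split /\n/, $text) {
        $in_block = 1 if !$in_block && $line =~ /^\s*(?:DECLARE|BEGIN)\b/i;
        if ($in_block) {
            if ($line =~ /^\s*\/\s*$/) {       # a lone slash ends the block
                push @stmts, $buf;
                ($buf, $in_block) = ('', 0);
            } else {
                $buf .= "$line\n";
            }
        } else {
            $buf .= "$line\n";
            if ($line =~ /;\s*$/) {            # plain statement ends at ';'
                push @stmts, $buf;
                $buf = '';
            }
        }
    }
    push @stmts, $buf if $buf =~ /\S/;
    return @stmts;
}
```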

Upvotes: 1

grahamj42

Reputation: 2762

In fact, the two answers you reference propose the same solution: read and execute statement by statement (the first is clearer on this point). The second question also has an alternative answer for the case where the file contains a single statement.

If you don't execute the SQL line-by-line, it's very difficult to trap any errors.

Upvotes: 1

Neil Slater

Reputation: 27197

Reading a table one row at a time is more complex, but it can use less memory, provided you structure your code to process each item as it arrives rather than needing the full set later.

Often you want to process each item separately (e.g. to do work on the data), in which case you might as well use the read line-by-line approach to define your loop.

I tend to use the single-instruction approach by default, but as soon as I am concerned about the number of records (especially in long-running batch processes), or need to loop through the data as the first task, I read records one by one.
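In DBI terms, the two approaches correspond to `fetchrow_arrayref` in a loop versus `fetchall_arrayref`. The sketch below shows both; the connection string, table, and the `process_*` helpers are placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Sketch only: connection details and helpers are hypothetical.
# my $dbh = DBI->connect('dbi:Oracle:MYDB', $user, $pass, { RaiseError => 1 });
# my $sth = $dbh->prepare('SELECT dsr_idt, dsr_cnl FROM dossier');
# $sth->execute;

# 1. Row by row: roughly constant memory; work is done per item.
# while (my $row = $sth->fetchrow_arrayref) {
#     process_row(@$row);        # hypothetical per-item work
# }

# 2. All at once: simpler code, but the whole result set sits in memory.
# my $all = $sth->fetchall_arrayref;
# process_all($all);             # hypothetical whole-set work
```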

Upvotes: 1
