Reputation: 11294
I have the below in a bash script. The inputsqls.txt file contains 10 SQL statements that take several hours to run. Is there a way to periodically write the results to output.txt before all 10 statements have run? E.g. every 30 minutes, write the results of all queries executed so far to the file, or update the file each time one query completes, so that you could check, say, 4 of the query results without having to wait for all 10 to complete.
Eventually output.txt should contain the results of all 10 queries.
sqlplus -l <<sqlly > output.txt
USER/PASSWORD@SID
@inputsqls.txt
EXIT
sqlly
Note: breaking up each file to have just one statement is not an ideal workaround.
Upvotes: 2
Views: 1906
Reputation: 52863
Use the SQL*Plus spool command in inputsqls.txt.
spool output.txt append
This appends whatever is printed to stdout to output.txt as it gets written.
If output.txt doesn't already exist when starting, just remove append:
spool output.txt
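As a sketch, inputsqls.txt could wrap its queries with spool so each result set is flushed to the file as it completes (the table names and queries below are placeholders, not from the original script):

```sql
-- Hypothetical inputsqls.txt: spool writes each query's results to
-- output.txt as they are produced, instead of waiting for all ten.
spool output.txt append

SELECT COUNT(*) FROM some_big_table;      -- long-running query 1
SELECT COUNT(*) FROM another_big_table;   -- long-running query 2
-- ... remaining queries ...

spool off
```

Note that once spool writes output.txt directly, the shell's `> output.txt` redirection in the calling script is redundant and would capture the same text a second time.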
Upvotes: 3
Reputation: 881423
It depends on whether the SQL statements generate any output before they're finished. If they all run for a long time and only then produce their results, the output will probably appear only at the very end of the process.
The way you have it is probably the best approach, since output.txt will be flushed periodically anyway, leading to regular updates to the file. But this won't happen if all the output comes at the end.
One possibility would be to break up the ten queries into individual files (assuming you're running them in sequence rather than in parallel). This may allow each query to finish sooner and output its results, for example:
sqlplus -l <<sqlly > output.txt
USER/PASSWORD@SID
@inputsqls_01.txt
EXIT
sqlly
sqlplus -l <<sqlly >> output.txt
USER/PASSWORD@SID
@inputsqls_02.txt
EXIT
sqlly
:
sqlplus -l <<sqlly >> output.txt
USER/PASSWORD@SID
@inputsqls_10.txt
EXIT
sqlly
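The redirection pattern above can be seen in miniature; here echo stands in for the sqlplus runs, purely for illustration:

```shell
#!/bin/sh
# '>' truncates output.txt for the first run; '>>' appends for the rest,
# so results accumulate in the file as each run finishes.
# 'echo' is a stand-in for the sqlplus invocations.
echo "results of inputsqls_01" > output.txt
echo "results of inputsqls_02" >> output.txt
cat output.txt
```

This way output.txt contains the first query's results as soon as the first block finishes, rather than only after all ten have run.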
Upvotes: 0