A Question Asker

Reputation: 3333

Linux: how to queue some jobs in the background?

Here is the functionality I am looking for (and haven't quite found):

I have x processes that I want to run sequentially. Some of them could be quite time consuming.

I want these processes to run in the background of my shell.

I know about nohup, but it doesn't seem to work perfectly. Assuming job1 is a time-consuming job: if I press Ctrl+C at the blank line I get after running nohup job1 && job2 && job3 &, then job2 and job3 won't run, and job1 may or may not complete depending on how long I let it run.

Is there a way to get the functionality I want? I am ssh'ed into a linux server. For bonus points, I'd love it if the jobs that I queued up would continue running even if I closed my connection.

Thanks for your help.

EDIT: A small addendum to the question: if I have a shell script with three exec statements

exec BIGTHING
exec smallthing
exec smallthing

will it definitely be sequential? And is there a way to wrap those all into one exec line to get the equivalent functionality?

i.e. exec BIGTHING & smallthing & smallthing, or with && or somesuch

Upvotes: 11

Views: 14235

Answers (7)

rbrc

Reputation: 871

A little more information:

If you separate your jobs with &&, it behaves like the && (logical AND) operator in a programming language, such as in an if statement: if job1 exits with an error, job2 won't run, and so on. This is called "short-circuiting." If you separate the jobs with ; instead, they will all run regardless of the return codes.
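For instance, a minimal sketch with stand-in commands:

false && echo "job2 ran"   # prints nothing: false fails and && short-circuits
false ; echo "job2 ran"    # prints "job2 ran": ; ignores the return code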

If you end the line with &, it will background the entire thing. If you forget, you can hit Ctrl+Z to suspend the command and get a prompt back; then the command bg will resume it in the background.

If you close your ssh session, it will likely end the commands, because they are attached to your shell (at least in most configurations of bash). If you want them to continue after you've logged out, use the command disown.
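Putting those together, a hedged sketch (job1, job2, job3 are placeholders for your own commands):

(job1; job2; job3) &   # run the whole sequence in one background subshell
disown -h %+           # bash: don't send it SIGHUP when the session ends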

Upvotes: 1

tachylatus

Reputation: 123

Alternatively, if you want a bit more control of your queue, e.g. the ability to list and modify entries, check out Task Spooler. It is available in the Ubuntu 12.10 Universe repositories as the package task-spooler.
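For example (assuming the Ubuntu package, where the binary is installed as tsp; the job names are placeholders):

tsp ./bigthing      # queue the first job; it starts immediately
tsp ./smallthing    # queued; starts once bigthing finishes
tsp                 # list the queue and each job's state
tsp -c 0            # show the captured output of job 0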

Upvotes: 8

myroslav

Reputation: 3793

I agree that screen is a very handy tool. There is also the command-line tool enqueue, which lets you enqueue additional jobs as needed, i.e. even after you have started one job, you can add another while the first is still running.

Here's sample from enqueue README:

$ enqueue add mv /disk1/file1 /disk2/file1
$ enqueue add mv /disk1/file2 /disk2/file2
$ enqueue add beep
$ enqueue list

Upvotes: 3

eumiro

Reputation: 213025

Use screen.

  1. ssh to the server
  2. run screen
  3. launch your programs: job1;job2;job3 - separated with semicolons, they will run sequentially
  4. Detach from the screen: CTRL-A, D
  5. logout from the server

(later)

  1. ssh to the server
  2. run screen -r
  3. and you are in your shell with your job queue running...
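The whole round trip looks something like this (server name and job names are placeholders):

$ ssh user@server
$ screen
$ job1; job2; job3    # runs sequentially inside the screen session
# press CTRL-A, then D to detach, and log out; later:
$ ssh user@server
$ screen -r           # reattach: your queue is still running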

Upvotes: 14

user562374

Reputation: 3917

To background a job while detaching it from the shell (tasks backgrounded with a plain & remain tied to the shell's job control and its SIGHUP on exit), use

setsid foo
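To queue several jobs this way, one sketch (job names and log file are placeholders) is to hand setsid a single shell that runs them in order:

setsid sh -c 'job1; job2; job3' > queue.log 2>&1 < /dev/null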

Upvotes: 0

Arne Claassen

Reputation: 14414

The basic mechanism for backgrounding a job from the shell is

job1 &

So you could write a simple shell script that runs them sequentially, although at that point I'd consider diving into a scripting language like Perl or Python and writing a small script to fork those processes. If your script forks itself as its first step and does all its work in the fork, it will immediately return control to you at the shell and keep running even if you log out.
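A minimal sketch of that shell-script approach (job names are placeholders; nohup stands in for the fork-and-detach step):

#!/bin/sh
# queue.sh: run the jobs one after another, detached from the login shell
nohup sh -c 'job1; job2; job3' > queue.log 2>&1 &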

Upvotes: 0

yossi

Reputation: 13315

After your command, put a &.

For example, to run an executable named foo:

./foo &

Depending on your shell's configuration, it may also keep running even if you close your ssh connection.

Upvotes: -2
