user1420474

Reputation: 301

Parallel Programming = multi child processes or each process create a child?

I'm not sure what "parallel programming" means, but I have two thoughts about it:

  1. Process A produces a certain number of child processes. Once process A finishes creating them, all the child processes run at the same time (see the sketch after this question).

  2. Process A creates child process B, process B creates its own child process C, process C creates child process D, and so on. Once the chain is finished, all the processes with assigned tasks run at the same time.

Which one is the correct way to think about parallel programming? Thanks for the help!

EDIT: Am I right to assume that running different programs at the same time requires multiple processes?
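
For concreteness, here is a minimal sketch of scenario 1 using POSIX fork() (the choice of C and the POSIX API is my own assumption; the question itself isn't tied to any system). The parent spawns N children, each child does its share of work and exits, and the parent waits for all of them:

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        const int N = 4;                 /* number of children to spawn */

        for (int i = 0; i < N; i++) {
            pid_t pid = fork();
            if (pid < 0) {
                perror("fork");
                exit(EXIT_FAILURE);
            }
            if (pid == 0) {
                /* child: do its share of the work, then exit */
                printf("child %d (pid %d) working\n", i, (int)getpid());
                exit(EXIT_SUCCESS);
            }
            /* parent: continue the loop and create the next child */
        }

        /* parent waits for all children; they ran concurrently */
        for (int i = 0; i < N; i++)
            wait(NULL);

        return 0;
    }

Each child runs concurrently with its siblings as soon as it is forked, so the parent does not have to "finish creating" all children before the first ones start working.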

Upvotes: 1

Views: 834

Answers (3)

Tudor

Reputation: 62459

The term "parallel programming" is much broader than merely "a process spawning child processes or threads". It can mean:

  1. A single process spawning a team of threads to do some work.
  2. A single process spawning a team of child processes to do some work.
  3. Multiple processes spawned independently to do cooperative computing on the same machine.
  4. Multiple processes spawned independently to do cooperative computing on different machines in a network.
  5. A single process communicating with a GPU that performs all the parallel computing.
  6. Any combination or nesting of the above.

Basically, parallel programming is the act of writing applications, or groups of applications, that solve a problem in parallel. Threads, processes, etc. are just means to achieve this (a sketch of item 1 follows below).

Consider this scenario: A large parallel text processing task taking place on a cluster. A master node issues commands to all its slave nodes to spawn processes for computation (global parallelism). Each process in turn spawns multiple threads/child processes to take advantage of local parallelism (multi-core/multi-processor nodes).
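
As one concrete illustration of item 1, here is a minimal sketch of a single process spawning a team of POSIX threads (pthreads is just one possible API; the point applies to any threading library):

    #include <pthread.h>
    #include <stdio.h>

    #define NUM_THREADS 4

    /* each thread handles one slice of the overall work */
    static void *worker(void *arg) {
        long id = (long)arg;
        printf("thread %ld doing its slice of the work\n", id);
        return NULL;
    }

    int main(void) {
        pthread_t threads[NUM_THREADS];

        /* spawn the team */
        for (long i = 0; i < NUM_THREADS; i++)
            pthread_create(&threads[i], NULL, worker, (void *)i);

        /* wait for every member of the team to finish */
        for (int i = 0; i < NUM_THREADS; i++)
            pthread_join(threads[i], NULL);

        return 0;
    }

(Link with -lpthread when compiling.) Swap threads for forked processes, MPI ranks, or GPU kernels and the structure of the problem stays the same; that is why the term is so broad.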

Upvotes: 2

Ira Baxter

Reputation: 95400

Your application (call it a "process") starts with some number of live threads, typically 1 in classic Unix and Windows systems. Threads represent separate demands to do work.

Threads run in real or pseudo-parallel, depending on the number of available real CPUs, how the scheduler works, priorities, demands from other processes/threads on the same machine, and whether a thread is waiting for some interaction with another thread to complete. You should generally imagine them running in parallel regardless of how fast they individually make progress or of the scheduler's policies.

At any moment, a thread may die (quit or suicide). Or, it may manufacture more threads for its process. So the number of threads a process owns is in general dynamic. (In most OSes, threads can also spawn other processes, but that just confuses the picture without adding anything really different).
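
A minimal sketch of that dynamic behavior, assuming pthreads (my choice of API; the point is OS-agnostic): a thread manufactures another thread, which could in turn spawn more, so the process's thread count changes over its lifetime.

    #include <pthread.h>
    #include <stdio.h>

    /* a thread spawned dynamically at runtime by another thread */
    static void *grandchild(void *arg) {
        (void)arg;
        printf("grandchild thread running\n");
        return NULL;   /* this thread now "dies" */
    }

    /* a thread may itself manufacture more threads for its process */
    static void *child(void *arg) {
        (void)arg;
        pthread_t t;
        pthread_create(&t, NULL, grandchild, NULL);
        pthread_join(t, NULL);
        return NULL;
    }

    int main(void) {
        pthread_t t;
        pthread_create(&t, NULL, child, NULL);  /* thread count is now dynamic */
        pthread_join(t, NULL);
        return 0;
    }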

Upvotes: 0

Alex Lockwood

Reputation: 83311

I would say that neither is the "correct thought of parallel programming", since neither scenario guarantees that the processes are executed simultaneously.

Parallelism is when tasks literally run at the same time (e.g., on a multicore processor). If all of your processes are forked/executed on a single-core processor, then this would be considered concurrency, not parallelism.
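
One way to check which situation applies is to ask the OS how many cores are online. A minimal sketch, assuming the sysconf name _SC_NPROCESSORS_ONLN (a common glibc/BSD/macOS extension, not strict POSIX):

    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* _SC_NPROCESSORS_ONLN is a widespread extension, not strict POSIX */
        long cores = sysconf(_SC_NPROCESSORS_ONLN);

        if (cores > 1)
            printf("%ld cores online: true parallelism is possible\n", cores);
        else
            printf("1 core online: tasks can only run concurrently, not in parallel\n");

        return 0;
    }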

Upvotes: 0
