
Synchronization and pooling of processes in Python

Synchronization between processes

The multiprocessing package supports spawning processes using an API similar to the threading module. It offers both local and remote concurrency, allowing the programmer to make full use of multiple processors on a given machine. It runs on both Windows and UNIX.

The package also contains synchronization primitives equivalent to those in the threading module.

Example Code

from multiprocessing import Process, Lock

def my_function(lock, num):
    lock.acquire()
    print('hello world', num)
    lock.release()

if __name__ == '__main__':
    lock = Lock()
    for num in range(10):
        Process(target=my_function, args=(lock, num)).start()

Here the lock ensures that only one process writes to standard output at a time, so the messages from different processes are not interleaved.

Pooling of processes

For pooling we use the Pool class. A Pool object represents a pool of worker processes; it carries out the tasks submitted to it, distributing them among the workers.

class multiprocessing.Pool([processes[, initializer[, initargs[, maxtasksperchild]]]])

A Pool object controls a pool of worker processes to which jobs can be submitted. It supports asynchronous results with timeouts and callbacks, and provides a parallel map implementation.

If processes is None, the number returned by cpu_count() is used. If initializer is not None, each worker process calls initializer(*initargs) when it starts.

apply(func[, args[, kwds]])

This is analogous to the built-in apply() function. This method blocks until the result is ready, so apply_async() is better suited for performing work in parallel.

apply_async(func[, args[, kwds[, callback]]])

A variant of apply() that returns an AsyncResult object immediately instead of blocking. If callback is specified, it is called with the result once it becomes ready.
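As a minimal sketch of the non-blocking pattern (the function name square is illustrative):

```python
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == '__main__':
    with Pool(processes=2) as pool:
        # apply_async returns immediately with an AsyncResult
        async_result = pool.apply_async(square, (7,))
        # get() blocks until the worker has produced the result
        print(async_result.get())  # 49
```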

map(func, iterable[, chunksize])

A parallel equivalent of the built-in map() function, supporting only one iterable argument. It blocks until the complete result is ready.

This method chops the iterable into a number of chunks, which it submits to the process pool as separate tasks.
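The chunking behavior can be controlled explicitly with the chunksize argument; a rough sketch (function name cube is illustrative):

```python
from multiprocessing import Pool

def cube(x):
    return x ** 3

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # Each worker receives chunks of 25 items instead of one item
        # per task, reducing inter-process communication overhead.
        results = pool.map(cube, range(100), chunksize=25)
        print(results[:4])  # [0, 1, 8, 27]
```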

map_async(func, iterable[, chunksize[, callback]])

A variant of map() that returns an AsyncResult object immediately instead of blocking.

imap(func, iterable[, chunksize])

A lazier version of map(): it returns the results as an iterator instead of waiting for the whole list (like itertools.imap() in Python 2).

The chunksize argument has the same meaning as in map().

imap_unordered(func, iterable[, chunksize])

The same as imap(), except that the ordering of the results from the returned iterator is arbitrary: each result is yielded as soon as it is ready.
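A small sketch of the arbitrary ordering, using an artificial delay so later inputs can finish first (slow_square is an illustrative name):

```python
from multiprocessing import Pool
import time

def slow_square(x):
    # Simulate variable task duration so results can finish out of order
    time.sleep(0.01 * (5 - x))
    return x * x

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # Results arrive as they complete; the order is not guaranteed
        for value in pool.imap_unordered(slow_square, range(5)):
            print(value)
```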

close()

Prevents any more tasks from being submitted to the pool. The worker processes exit once all outstanding tasks have been completed.

terminate()

Use this method to stop the worker processes immediately, without completing outstanding work.

join()

Waits for the worker processes to exit. close() or terminate() must be called before using join().
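The close()/join() shutdown sequence can be sketched as follows (the function name double is illustrative):

```python
from multiprocessing import Pool

def double(x):
    return 2 * x

if __name__ == '__main__':
    pool = Pool(processes=2)
    result = pool.map_async(double, range(5))
    pool.close()   # no more tasks may be submitted after this point
    pool.join()    # wait for the worker processes to exit
    print(result.get())  # [0, 2, 4, 6, 8]
```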

class multiprocessing.pool.AsyncResult

Returned by Pool.apply_async() and Pool.map_async().

get([timeout])

Returns the result when it arrives. If timeout is not None and the result does not arrive within timeout seconds, multiprocessing.TimeoutError is raised.

wait([timeout])

Waits until the result is available or until timeout seconds have passed.

ready()

Returns whether the call has completed.

successful()

Returns whether the call completed without raising an exception. Raises an error if the result is not ready yet.
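A brief sketch of ready() and successful() together (the function names ok and fail are illustrative):

```python
from multiprocessing import Pool

def ok(x):
    return x + 1

def fail(x):
    raise ValueError('boom')

if __name__ == '__main__':
    with Pool(processes=2) as pool:
        good = pool.apply_async(ok, (1,))
        bad = pool.apply_async(fail, (1,))
        good.wait()
        bad.wait()
        # Both calls have completed, but only one succeeded
        print(good.ready(), good.successful())  # True True
        print(bad.ready(), bad.successful())    # True False
```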

Example Code

# -*- coding: utf-8 -*-
"""
Created on Sun Sep 30 12:17:58 2018
@author: oldtoolbag.com
"""
from multiprocessing import Pool
import time

def myfunction(m):
    return m * m

if __name__ == '__main__':
    my_pool = Pool(processes=4)  # start 4 worker processes
    result = my_pool.apply_async(myfunction, (10,))  # evaluate myfunction(10) asynchronously in a single process
    print(result.get(timeout=1))  # prints "100"
    print(my_pool.map(myfunction, range(10)))  # prints "[0, 1, 4, ..., 81]"
    my_it = my_pool.imap(myfunction, range(10))
    print(next(my_it))  # prints "0"
    print(next(my_it))  # prints "1"
    print(my_it.next(timeout=1))  # prints "4" unless your computer is *very* slow
    result = my_pool.apply_async(time.sleep, (10,))
    print(result.get(timeout=1))  # raises multiprocessing.TimeoutError