122 Thread operations in a Python program: the concurrent module

First, an introduction to the concurrent module

The concurrent.futures module provides a highly encapsulated interface for asynchronous calls.

ThreadPoolExecutor: a thread pool that provides asynchronous calls

ProcessPoolExecutor: a process pool that provides asynchronous calls

ProcessPoolExecutor and ThreadPoolExecutor both implement the same interface, which is defined by the abstract class Executor.
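To illustrate this (my addition, not in the original post): both pool classes derive from the abstract base class Executor, so code written against one can usually be switched to the other by changing a single name.

from concurrent.futures import Executor, ThreadPoolExecutor, ProcessPoolExecutor

# both pool classes subclass the abstract Executor, so they expose the same methods
print(issubclass(ThreadPoolExecutor, Executor))   # True
print(issubclass(ProcessPoolExecutor, Executor))  # True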

Second, the basic methods

submit(fn, *args, **kwargs): submit a task asynchronously

map(func, *iterables, timeout=None, chunksize=1): submit tasks in bulk, replacing the for loop + submit pattern

shutdown(wait=True): the equivalent of a multiprocessing pool's pool.close() + pool.join() operations

  • wait=True: block until every task in the pool has finished and its resources have been reclaimed, then continue
  • wait=False: return immediately, without waiting for the pool's tasks to finish
  • but whatever the value of wait, the whole program still waits until all tasks have finished
  • submit and map must be called before shutdown

result(timeout=None): get the result of a task

add_done_callback(fn): bind a callback function
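A minimal sketch (my addition, not from the original post) tying these methods together; the worker function square is hypothetical:

from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

if __name__ == '__main__':
    pool = ThreadPoolExecutor(2)              # pool with 2 worker threads
    future = pool.submit(square, 3)           # submit one task asynchronously
    future.add_done_callback(lambda f: print('done:', f.result()))  # fires when the task finishes
    print(future.result())                    # block until this task's result is ready -> 9
    print(list(pool.map(square, range(5))))   # map replaces for + submit, results come back in order
    pool.shutdown(wait=True)                  # no new tasks accepted; wait for all tasks to finish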

Third, process pools and thread pools

What a pool does: limit the number of processes or threads.

When to limit: when the number of concurrent tasks far exceeds what the machine can handle, you cannot simply launch them all at once; limiting the number of processes or threads keeps the server from being overwhelmed.

3.1 process pool

from concurrent.futures import ProcessPoolExecutor
from multiprocessing import current_process
import time


def task(i):
    print(f'{current_process().name} is running task {i}')
    time.sleep(1)


if __name__ == '__main__':
    pool = ProcessPoolExecutor(4)  # the process pool holds 4 processes
    for i in range(20):  # 20 tasks
        pool.submit(task, i)  # submit task i; the 4 processes in the pool take the tasks in turn

3.2 thread pool

from concurrent.futures import ThreadPoolExecutor
from threading import current_thread
import time


def task(i):
    print(f'{current_thread().name} is running task {i}')
    time.sleep(1)


if __name__ == '__main__':
    pool = ThreadPoolExecutor(4)  # the thread pool holds 4 threads
    for i in range(20):  # 20 tasks
        pool.submit(task, i)  # submit task i; the 4 threads in the pool take the tasks in turn
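As a usage note (my addition, not from the original): an executor can also be used as a context manager; leaving the with block is equivalent to calling shutdown(wait=True).

from concurrent.futures import ThreadPoolExecutor
import time

def task(i):
    time.sleep(1)
    return i

if __name__ == '__main__':
    # exiting the with block calls pool.shutdown(wait=True),
    # so all 20 tasks are finished before the final print runs
    with ThreadPoolExecutor(4) as pool:
        futures = [pool.submit(task, i) for i in range(20)]
    print([f.result() for f in futures])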

Fourth, map usage

from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor
import os, time, random


def task(n):
    print('%s is running' % os.getpid())
    time.sleep(random.randint(1, 3))
    return n ** 2


if __name__ == '__main__':

    executor = ThreadPoolExecutor(max_workers=3)

    # for i in range(20):
    #     future = executor.submit(task, i)

    executor.map(task, range(1, 21))  # map replaces the for loop + submit
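Worth noting (my addition, not part of the original post): executor.map returns an iterator that yields the tasks' return values in input order, so the results can be consumed directly. A minimal variant, with a hypothetical worker square:

from concurrent.futures import ThreadPoolExecutor

def square(n):
    return n ** 2

if __name__ == '__main__':
    executor = ThreadPoolExecutor(max_workers=3)
    results = executor.map(square, range(1, 21))  # an iterator; results come back in input order
    print(list(results))                          # [1, 4, 9, ..., 400]; iterating waits for each result
    executor.shutdown()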

Fifth, synchronous and asynchronous

These can be understood as two ways of submitting tasks.

Synchronous: after submitting a task, you must wait for it to finish (and get its return value) before the next line of code can execute.

Asynchronous: after submitting a task, the next line of code executes immediately, without waiting for the task to finish.

Synchronous submission is therefore equivalent to running the tasks serially (a minimal sketch follows).
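A sketch of the synchronous style (my addition; the original post only shows the asynchronous form below): calling result() right after submit() blocks on every task, so the tasks effectively run one after another.

from concurrent.futures import ThreadPoolExecutor
import time

def task(i):
    time.sleep(1)
    return i * 2

if __name__ == '__main__':
    pool = ThreadPoolExecutor(4)
    for i in range(5):
        res = pool.submit(task, i).result()  # blocks until task i finishes: serial, roughly 5 seconds total
        print(res)
    pool.shutdown()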

Asynchronous:

from concurrent.futures import ProcessPoolExecutor
from multiprocessing import current_process
import time

n = 1

def task(i):
    global n
    print(f'{current_process().name} is running task {i}')
    time.sleep(1)
    n += i
    return n

if __name__ == '__main__':
    pool = ProcessPoolExecutor(4)  # the process pool holds 4 processes
    pool_lis = []
    for i in range(20):  # 20 tasks
        future = pool.submit(task, i)  # submit task i; the 4 processes in the pool take the tasks in turn
        # print(future.result())  # this would block here waiting for the task's result; if the result
        #                         # is not ready yet, every task ends up running serially,
        #                         # which is what motivates the pool.shutdown() call below
        pool_lis.append(future)
    pool.shutdown(wait=True)  # close the pool to new tasks and block until all submitted tasks have finished
    for p in pool_lis:
        print(p.result())

    print(n)  # still 1 in the parent process: each child process works on its own copy of the global n,
              # so the parent never sees the children's updates; the way to get values back is through
              # the futures' result(), as above
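A related option (my addition, not from the original post): concurrent.futures.as_completed yields each future as soon as it finishes, so results can be collected in completion order instead of submission order.

from concurrent.futures import ProcessPoolExecutor, as_completed
import time

def task(i):
    time.sleep(1)
    return i ** 2

if __name__ == '__main__':
    pool = ProcessPoolExecutor(4)
    futures = [pool.submit(task, i) for i in range(20)]
    for f in as_completed(futures):  # yields futures in the order they finish
        print(f.result())
    pool.shutdown()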

Sixth, the callback function

import time
from threading import current_thread
from concurrent.futures import ThreadPoolExecutor

def task(i):
    print(f'{current_thread().name} is running {i}')
    time.sleep(1)
    return i ** 2

# parse is a callback function
def parse(future):
    # process the result we got
    print(f'{current_thread().name} finished the current task')
    print(future.result())


if __name__ == '__main__':
    pool = ThreadPoolExecutor(4)
    for i in range(20):
        future = pool.submit(task, i)

        '''
        Bind a function to the submitted task; when the task finishes, that function is
        triggered (hence "callback function") and the future object is passed to it as
        its only argument. It is called a callback because, once the task is done,
        execution "calls back" into parse.
        '''
        future.add_done_callback(parse)
        # add_done_callback(parse): parse is the callback function
        # add_done_callback() is a bound method of the future object; its argument is a function
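One caveat (my addition): if the task raised an exception, calling future.result() inside the callback re-raises it there; a more defensive callback can check future.exception() first. A minimal sketch of such a callback:

def parse(future):
    exc = future.exception()  # the exception raised by the task, or None if it succeeded
    if exc is not None:
        print('task failed:', exc)
    else:
        print('task result:', future.result())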

Origin www.cnblogs.com/xichenHome/p/11569111.html