Multithreading, Multiprocessing, and Coroutines (Repost)

  • The Bilibili uploader 蚂蚁学Python explains this very well
  • Watching the video at work is inconvenient, so the key parts are excerpted here as text for easy reference
  • Original source: https://www.bilibili.com/video/BV1bK411A7tV

Multithreading

  1. Define a function

    def my_func(a, b):
        do_something(a, b)
  2. Create a thread

    import threading

    t = threading.Thread(target=my_func, args=(a, b))
  3. Start the thread

    t.start()
  4. Wait for it to finish (a combined runnable sketch follows this list)

    t.join()
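
  • Putting the four steps together, a minimal runnable sketch (the worker body and its arguments below are my own placeholders, not from the video):

    import threading

    def my_func(a, b):
        # placeholder workload
        print(f"working on {a} and {b}")

    # create, start, and wait for the thread
    t = threading.Thread(target=my_func, args=(1, 2))
    t.start()
    t.join()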

Queues

  • queue.Queue is thread-safe

    import queue

    q = queue.Queue()

    # add and remove items
    q.put(item)
    item = q.get()

    # inspect the queue state
    q.qsize()
    q.empty()
    q.full()
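
  • As a sketch of what "thread-safe" buys you (my own example, assuming one producer thread and one consumer thread sharing the queue):

    import queue
    import threading

    q = queue.Queue()

    def producer():
        for i in range(5):
            q.put(i)          # one thread writes...

    def consumer():
        for _ in range(5):
            print(q.get())    # ...while another reads; no extra locking needed

    threading.Thread(target=producer).start()
    threading.Thread(target=consumer).start()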

Thread safety (locks)

  1. try-finally

    import threading

    lock = threading.Lock()

    lock.acquire()
    try:
        # do something
        ...
    finally:
        lock.release()
  2. with

    import threading

    lock = threading.Lock()

    with lock:
        # do something
        ...
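
  • A concrete sketch of what the lock protects (the shared counter below is my own example, not from the video):

    import threading

    lock = threading.Lock()
    count = 0

    def add_one():
        global count
        with lock:
            # without the lock, concurrent read-modify-write could lose updates
            count += 1

    threads = [threading.Thread(target=add_one) for _ in range(10)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(count)  # always 10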

Thread pools

  1. map: results are returned in the same order as the input arguments

    from concurrent.futures import ThreadPoolExecutor

    arg_list = []

    with ThreadPoolExecutor() as pool:
        results = pool.map(my_func, arg_list)
        for result in results:
            print(result)
  2. submit: with as_completed, results can be consumed in completion order

    from concurrent.futures import ThreadPoolExecutor, as_completed

    arg_list = []

    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(my_func, arg) for arg in arg_list]

        # in submission order
        for future in futures:
            print(future.result())

        # in completion order
        for future in as_completed(futures):
            print(future.result())

Using a thread pool in Flask

import time
from concurrent.futures import ThreadPoolExecutor

from flask import Flask

app = Flask(__name__)
pool = ThreadPoolExecutor()


def do_1():
    time.sleep(1)
    return 'do_1'


def do_2():
    time.sleep(1)
    return 'do_2'


def do_3():
    time.sleep(1)
    return 'do_3'


@app.route("/")
def index():
    # the three tasks run concurrently in the thread pool,
    # so the response takes about 1 second instead of 3
    result_1 = pool.submit(do_1)
    result_2 = pool.submit(do_2)
    result_3 = pool.submit(do_3)
    return {
        '1': result_1.result(),
        '2': result_2.result(),
        '3': result_3.result(),
    }


if __name__ == "__main__":
    app.run()

Multiprocessing

(Image: screenshot from the 蚂蚁学Python Bilibili video at 03:00)
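
The ProcessPoolExecutor API mirrors ThreadPoolExecutor; here is a minimal sketch of basic usage (the CPU-bound worker below is my own placeholder, not from the video):

    from concurrent.futures import ProcessPoolExecutor

    def cpu_bound(n):
        # CPU-bound work is where multiple processes help, since each
        # worker process has its own interpreter and GIL
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            for result in pool.map(cpu_bound, [10_000, 20_000, 30_000]):
                print(result)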

Using multiprocessing in Flask

import time
from concurrent.futures import ProcessPoolExecutor

from flask import Flask

app = Flask(__name__)


def do_1():
    time.sleep(1)
    return 'do_1'


def do_2():
    time.sleep(1)
    return 'do_2'


def do_3():
    time.sleep(1)
    return 'do_3'


@app.route("/")
def index():
    result_1 = pool.submit(do_1)
    result_2 = pool.submit(do_2)
    result_3 = pool.submit(do_3)
    return {
        '1': result_1.result(),
        '2': result_2.result(),
        '3': result_3.result(),
    }


if __name__ == "__main__":
    # unlike the thread pool, the process pool is created under the
    # __main__ guard, after the worker functions have been defined
    pool = ProcessPoolExecutor()
    app.run()

Coroutines: asyncio and await

import asyncio

import aiohttp

loop = asyncio.get_event_loop()


async def get_url(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as resp:
            result = await resp.text()
            print(f"url:{url},{len(result)}")


urls = [f"https://www.cnblogs.com/#p{page}" for page in range(1, 50 + 1)]
tasks = [loop.create_task(get_url(url)) for url in urls]
loop.run_until_complete(asyncio.wait(tasks))
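
Note that get_event_loop / run_until_complete is the older event-loop API; on Python 3.7+ the same crawl can be written with asyncio.run and asyncio.gather (my own modernized sketch, not from the video):

    import asyncio

    import aiohttp

    async def get_url(session, url):
        async with session.get(url) as resp:
            result = await resp.text()
            print(f"url:{url},{len(result)}")

    async def main():
        urls = [f"https://www.cnblogs.com/#p{page}" for page in range(1, 50 + 1)]
        # one shared session for all requests
        async with aiohttp.ClientSession() as session:
            await asyncio.gather(*(get_url(session, url) for url in urls))

    asyncio.run(main())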

Limiting asyncio concurrency

  1. try-finally

    import asyncio

    sem = asyncio.Semaphore(10)

    await sem.acquire()
    try:
        # do something
        ...
    finally:
        sem.release()
  2. with

    import asyncio

    sem = asyncio.Semaphore(10)

    async with sem:
        # do something
        ...
  • Example

    import asyncio

    import aiohttp

    loop = asyncio.get_event_loop()

    # limit to 10 concurrent requests
    semaphore = asyncio.Semaphore(10)


    async def get_url(url):
        async with semaphore:
            async with aiohttp.ClientSession() as session:
                async with session.get(url) as resp:
                    result = await resp.text()
                    print(f"url:{url},{len(result)}")


    urls = [f"https://www.cnblogs.com/#p{page}" for page in range(1, 50 + 1)]
    tasks = [loop.create_task(get_url(url)) for url in urls]
    loop.run_until_complete(asyncio.wait(tasks))