gather_and_close now automatically locks the pool

This commit is contained in:
2022-03-29 19:43:21 +02:00
parent 23a4cb028a
commit 1beb9fc9b0
7 changed files with 7 additions and 29 deletions


@@ -143,7 +143,6 @@ Or we could use a task pool:
pool = TaskPool()
await pool.map(another_worker_function, data_iterator, num_concurrent=5)
...
- pool.lock()
await pool.gather_and_close()
Calling the :py:meth:`.map() <asyncio_taskpool.pool.TaskPool.map>` method this way ensures that there will **always** -- i.e. at any given moment in time -- be exactly 5 tasks working concurrently on our data (assuming no other pool interaction).
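To illustrate what "exactly 5 tasks working concurrently" means, here is a stdlib-only sketch of the bounded-concurrency pattern. It is not the actual `TaskPool.map` implementation; `map_concurrent` and its semaphore-based approach are stand-ins for illustration only, and the instrumentation just verifies that concurrency never exceeds the limit.

```python
import asyncio

async def map_concurrent(worker, data, num_concurrent=5):
    """Run `worker` over `data` with at most `num_concurrent` running at once.

    Hypothetical stand-in for TaskPool.map(..., num_concurrent=5): a new
    item starts as soon as a semaphore slot frees up, so at any moment
    at most `num_concurrent` workers are active.
    """
    sem = asyncio.Semaphore(num_concurrent)

    async def run(item):
        async with sem:  # at most num_concurrent tasks hold the semaphore
            return await worker(item)

    return await asyncio.gather(*(run(x) for x in data))

async def main():
    running = peak = 0

    async def worker(x):
        nonlocal running, peak
        running += 1
        peak = max(peak, running)      # record highest observed concurrency
        await asyncio.sleep(0.01)      # simulate I/O-bound work
        running -= 1
        return x * 2

    results = await map_concurrent(worker, range(20), num_concurrent=5)
    return peak, results

peak, results = asyncio.run(main())
print(peak)         # 5 -- the limit is saturated but never exceeded
print(results[:3])  # [0, 2, 4]
```

With this commit, the real pool no longer needs an explicit `pool.lock()` before `gather_and_close()`, since the latter now locks the pool itself.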