Have you run into situations where concurrent execution could speed up your Python code? Are you using a GUI toolkit?
This talk gives you the background to use concurrency in your code without shooting yourself in the foot - which is quite easy if you don't understand how concurrent execution differs from linear execution!
The presentation starts by explaining some concepts such as concurrency, parallelism, resources, atomic operations, race conditions and deadlocks.
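As a small taste of these concepts, here is a minimal sketch of a race condition and its classic fix (the counter, thread count and iteration count are illustrative, not part of the talk): two threads incrementing a shared counter must guard the read-modify-write with a lock, because `counter += 1` is not atomic.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # Without the lock, this read-modify-write is not atomic:
        # two threads can read the same old value and lose updates.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 - correct only because of the lock
```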
Then we discuss the commonly used approaches to concurrency: multithreading with the threading module, multiprocessing with the multiprocessing module, and event loops (which include the asyncio framework). Each of these approaches has its typical use cases, which are explained.
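To give a flavor of the event-loop approach, here is a minimal asyncio sketch (the coroutine names and delays are illustrative): many I/O-bound waits run concurrently on a single thread.

```python
import asyncio

async def fetch(name, delay):
    # Simulated I/O-bound work; while one coroutine awaits,
    # the event loop runs the others.
    await asyncio.sleep(delay)
    return name

async def main():
    # Both "requests" wait concurrently, so total time is ~0.01 s, not 0.02 s.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.01))

print(asyncio.run(main()))  # ['a', 'b']
```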
You can implement concurrency on a number of abstraction levels. The lowest level consists of primitives like locks, events, semaphores and so on. A higher abstraction level is using queues, typically with worker threads or processes. Even higher abstraction levels are active objects (hiding primitives or queues behind an API; this includes "actors", if you have heard of them), the thread and process pools in concurrent.futures, and the asyncio framework. Finally, you can "outsource" concurrency by leaving it to a message broker, which is a distinct process that receives and distributes messages.
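The queue-with-workers level mentioned above can be sketched as follows (the squaring task, two workers and None sentinel are illustrative choices): worker threads pull items from a shared queue, so no explicit locking of the work items is needed.

```python
import queue
import threading

tasks = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        item = tasks.get()
        if item is None:  # sentinel value: shut this worker down
            break
        results.put(item * item)

workers = [threading.Thread(target=worker) for _ in range(2)]
for w in workers:
    w.start()

for n in range(5):
    tasks.put(n)
for _ in workers:      # one sentinel per worker
    tasks.put(None)
for w in workers:
    w.join()

# Completion order varies between runs; sort only for display.
print(sorted(results.queue))  # [0, 1, 4, 9, 16]
```

The same pattern with even less boilerplate is what concurrent.futures pools provide: `ThreadPoolExecutor().map(func, items)` hides the queue and the workers entirely.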
The talk closes with some tips and best practices, mainly: