Limiting Python Function Execution Time with a Parameterized Decorator via Multiprocessing

A decorator to limit the execution time of Python functions

Chris Knorowski
Towards Data Science



In this article, I will walk you through creating a decorator to limit the execution time of a function in your Python program via multiprocessing. My primary motivation for building this decorator was to limit a Python function's execution time with a simple syntax and minimal dependencies.

A naive approach is to start a timer inside the Python function, periodically check whether execution has exceeded the limit, and exit if it has. That approach may be fine as a simple one-off solution, but any call to a third-party library would block until it returns, preventing the time check from ever running.
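To illustrate the problem (this is not the article's code, just a sketch), the naive approach only enforces the limit at points where our own code explicitly checks the clock:

```python
import time

def sum_with_deadline(n, limit_seconds):
    """Naive approach: manually check the clock inside our own loop."""
    start = time.monotonic()
    total = 0
    for i in range(n):
        total += i
        # The check only runs between our own statements -- a single long
        # call into a third-party library here could never be interrupted.
        if time.monotonic() - start > limit_seconds:
            raise TimeoutError("Exceeded time limit")
    return total
```

If the loop body were replaced by one slow library call, the deadline check would never fire until that call returned.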

I also wanted a solution that was as unobtrusive as possible and could be applied easily throughout the codebase. Decorators provide the nice syntax and abstraction to accomplish this goal.

With that in mind, I knew I wanted to create a decorator that could be attached to any function in my project. The decorator would handle limiting the function’s execution time to some specified amount. I also wanted to keep everything purely in Python to limit the dependencies/complexity of adding this sort of scheduler.

The main challenges in doing this were:

  1. The decorator should take a parameter for the max execution time to make it easily extensible.
  2. The decorated functions must be able to accept arbitrary inputs and return arbitrary outputs.
  3. The timer should work even if the executing function made calls to third-party libraries.

To start, I needed to create a decorator that could take parameters as arguments. After some research, I found an excellent Stack Overflow thread where people proposed several solutions.

I followed the architecture given in a post by Peter Mortensen in the comments to create a decorator for decorators. I won’t go into how this works, but you can jump into the thread for a more detailed explanation. For more information about decorators, I often go here for a refresher.

You can then attach this decorator to the decorator you want to apply to your function, allowing you to parameterize that decorator. I want to create a run_with_timer decorator that takes the maximum execution time as a parameter. It looks like this.
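The original code listing is not shown here, so the following is a minimal sketch of that structure. The `parametrized` helper follows the decorator-for-decorators pattern from the Stack Overflow thread; the body of `run_with_timer` is only a placeholder until the timing logic is filled in:

```python
import functools

def parametrized(dec):
    """Decorator for decorators: lets `dec` accept extra parameters."""
    def layer(*args, **kwargs):
        def repl(func):
            return dec(func, *args, **kwargs)
        return repl
    return layer

@parametrized
def run_with_timer(func, max_execution_time):
    """Limit `func` to `max_execution_time` seconds (logic filled in next)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Placeholder: the multiprocessing-based timing logic goes here.
        return func(*args, **kwargs)
    return wrapper
```

With this in place, `@run_with_timer(max_execution_time=5)` can be attached to any function, and `functools.wraps` keeps the wrapped function's name and docstring intact.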

Next, we can fill in the code to limit execution time. The logic is as follows: the main process uses Python's multiprocessing module to run the decorated function in a separate process. The main process sets a timer and kills the subprocess executing the function if it exceeds the time limit.

The multiprocessing setup has two parts. The first is a function I call function_runner, which acts as a wrapper running in the new process: it executes the Python function and returns the results in a form the multiprocessing machinery can handle. The second is the multiprocessing code, which spawns the new process, sets a timer, and kills the spawned process if it hasn't finished in time.
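A sketch of those two pieces, under the assumption that a `Pipe` carries the result back to the parent (the `send_end` name suggests this) and that the child is terminated after `join` times out. Note that it relies on the `fork` start method (the default on Linux): with `spawn`, the decorated function would have to be pickled by its qualified name, which now points at the wrapper itself.

```python
import functools
import multiprocessing


class TimeExceededException(Exception):
    pass


def parametrized(dec):
    # Decorator-for-decorators helper (repeated so this snippet runs standalone).
    def layer(*args, **kwargs):
        def repl(func):
            return dec(func, *args, **kwargs)
        return repl
    return layer


def function_runner(send_end, func, *args, **kwargs):
    """Runs in the child process: execute the function and pipe back the result."""
    result = func(*args, **kwargs)
    send_end.send(result)


@parametrized
def run_with_timer(func, max_execution_time):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # "fork" avoids pickling the decorated function by qualified name,
        # which would resolve back to this wrapper (not available on Windows).
        ctx = multiprocessing.get_context("fork")
        recv_end, send_end = ctx.Pipe(duplex=False)
        proc = ctx.Process(
            target=function_runner,
            args=(send_end, func, *args),
            kwargs=kwargs,
        )
        proc.start()
        proc.join(max_execution_time)  # block for at most the time limit
        if proc.is_alive():
            proc.terminate()  # still running: kill it and raise
            proc.join()
            raise TimeExceededException("Exceeded Execution Time")
        return recv_end.recv()
    return wrapper
```

Because the wrapped function runs in a separate process, the operating system does the preemption for us: even a blocking third-party call is killed when the timer expires.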

Finally, I can create the function to wrap with my run_with_timer decorator. I'll call it sleeping_bear.
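A sketch of that function, reconstructed from the printed output below: the messages, the per-second sleep loop, and the five-second limit are all inferred from the results rather than taken from the original listing, and the decorator machinery is repeated so the example runs on its own.

```python
import functools
import multiprocessing
import time


class TimeExceededException(Exception):
    pass


def parametrized(dec):
    def layer(*args, **kwargs):
        def repl(func):
            return dec(func, *args, **kwargs)
        return repl
    return layer


def function_runner(send_end, func, *args, **kwargs):
    send_end.send(func(*args, **kwargs))


@parametrized
def run_with_timer(func, max_execution_time):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        ctx = multiprocessing.get_context("fork")  # fork: see note on pickling
        recv_end, send_end = ctx.Pipe(duplex=False)
        proc = ctx.Process(target=function_runner,
                           args=(send_end, func, *args), kwargs=kwargs)
        proc.start()
        proc.join(max_execution_time)
        if proc.is_alive():
            proc.terminate()
            proc.join()
            raise TimeExceededException("Exceeded Execution Time")
        return recv_end.recv()
    return wrapper


@run_with_timer(max_execution_time=5)  # limit inferred from the output below
def sleeping_bear(name, hibernation=10):
    """Print a message for each second of hibernation, then wake up."""
    print(f"{name} is going to hibernate")
    for i in range(hibernation):
        print(i, "zZZ" * (i + 1))
        time.sleep(1)
    return f"{name} is waking up!"
```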

When we run the sleeping_bear function, it will terminate if it exceeds the time limit set in the decorator parameter. If the Python function finishes before the time limit, the send_end handler returns the results.

sleeping_bear("Grizzly", hibernation=10)
>> Grizzly is going to hibernate
>> 0 zZZ
>> 1 zZZzZZ
>> 2 zZZzZZzZZ
>> 3 zZZzZZzZZzZZ
>> 4 zZZzZZzZZzZZzZZ
>>
>> TimeExceededException: Exceeded Execution Time

sleeping_bear("Grizzly", hibernation=2)
>> Grizzly is going to hibernate
>> 0 zZZ
>> 1 zZZzZZ
>>
>> "Grizzly is waking up!"

In summary, I have shown you how to create a decorator to limit the execution time of Python functions using multiprocessing. I was able to solve the three main challenges.

  1. Create a parameterized decorator to limit the max execution time.
  2. Allow arbitrary input parameters to the Python functions being wrapped.
  3. Limit the execution time of any function, even one calling into third-party code, by leaning on the operating system's process scheduling via the multiprocessing module.

As a bonus, it was all done in pure Python with no third-party dependencies. There is some overhead to spawning a new process, but my goal was to limit longer-running functions, so this was not a deal breaker.

If you enjoyed this make sure to follow me to support more content like this in the future. Thank you for reading, and as always, if you have any suggestions or feedback, please let me know in the comments.


CTO/Cofounder of SensiML. Works at the intersection of physics, software engineering and machine learning.