My application manages the state of multiple objects, called Requests, over a substantial period of time. Each Request has a unique identifier and goes through a distinct lifecycle. New Requests arise in the system over time.
I'd like to write a separate log file for each Request. The log would track every interesting change to the state of that Request. So if I wanted to know everything about the history of Request X, it would be simple to go and look at X.log.
Obviously, I could hand-roll a solution using plain files, but I'd like to do this with Python's logging framework. One way would be to create a new logger instance for every unique Request, configure it to point at the correct file, and log away. But that feels like the wrong solution: it creates one Logger per Request, and since loggers are cached by the logging module and never garbage-collected, the number of them grows without bound as new Requests keep entering the system.
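For reference, the logger-per-Request approach I mean would look roughly like this (the function name, logger naming scheme, and file paths are just illustrative, not from my real code):

```python
import logging

def get_request_logger(request_id):
    # One Logger and one file per Request -- the approach I'm wary of.
    logger = logging.getLogger(f"request.{request_id}")
    if not logger.handlers:  # avoid stacking handlers on repeat calls
        handler = logging.FileHandler(f"{request_id}.log")
        handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

get_request_logger("X").info("Request X moved to state ACTIVE")
```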
I was hoping for some way to configure a single logger, perhaps with a custom handler, so that output is redirected to a different file depending on the ID of the incoming Request. I've looked at the docs, but everything I can find seems to operate on the incoming records themselves rather than on choosing the outgoing destination.
Is this possible?
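To make it concrete, something like the sketch below is what I'm imagining. The RequestFileHandler class and the convention of passing request_id via extra= are my own invention, not anything I've found in the docs:

```python
import logging

class RequestFileHandler(logging.Handler):
    """Route each record to a per-Request file, keyed on a 'request_id' attribute."""

    def __init__(self):
        super().__init__()
        self._file_handlers = {}  # request_id -> FileHandler

    def emit(self, record):
        request_id = getattr(record, "request_id", "unknown")
        handler = self._file_handlers.get(request_id)
        if handler is None:
            # Lazily open X.log the first time Request X logs something.
            handler = logging.FileHandler(f"{request_id}.log")
            handler.setFormatter(self.formatter)
            self._file_handlers[request_id] = handler
        handler.emit(record)

    def close(self):
        for handler in self._file_handlers.values():
            handler.close()
        super().close()

logger = logging.getLogger("requests")
logger.setLevel(logging.INFO)
logger.addHandler(RequestFileHandler())

# The caller tags every message with the Request it belongs to.
logger.info("state changed to ACTIVE", extra={"request_id": "X"})
```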
A related question, if I did end up creating a logger per Request: Logger instances are cached by the logging module and don't seem to come with any clean-up API. Is a del sufficient, or do I need some other mechanism (a logging.Manager subclass?) to remove the Logger instance?
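For context, the kind of clean-up I imagine I'd have to resort to looks like this; manager.loggerDict is an internal, undocumented attribute, which is exactly why I'm uneasy about it:

```python
import logging

def dispose_request_logger(request_id):
    name = f"request.{request_id}"
    logger = logging.getLogger(name)
    # Close and detach the file handler, then drop the cached Logger itself
    # out of the (undocumented) manager.loggerDict registry.
    for handler in list(logger.handlers):
        handler.close()
        logger.removeHandler(handler)
    logging.Logger.manager.loggerDict.pop(name, None)
```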