I have a Flask application running on Gunicorn with 2 workers and 2 threads each (4 concurrent execution contexts). My custom logging function writes to a file without any process-level locking:
def debug(msg: str):
    with open("con7.log", "a", encoding="utf-8") as logf:
        logf.write(msg + "\n")
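For context, a typical call site looks roughly like this (the route name and message are illustrative, not the real ones):

from flask import Flask

app = Flask(__name__)

@app.route("/ping")  # illustrative endpoint
def ping():
    debug("handling /ping")  # the helper defined above
    return "ok"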
Will this cause race conditions or log corruption, throw exceptions, or stop my Python script entirely?
Current Setup:
- Gunicorn: 2 workers with 2 threads each (--workers 2 --threads 2; the equivalent config file is sketched after this list)
- Processes: 2 separate Python worker processes, each running 2 threads
- Logging: a simple file append on every request
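For reference, the same worker/thread settings expressed as a gunicorn.conf.py would look roughly like this (the bind address is illustrative; in practice I pass the flags on the command line):

# gunicorn.conf.py, equivalent to --workers 2 --threads 2
workers = 2            # two separate worker processes
threads = 2            # two threads per worker, i.e. 4 concurrent execution contexts
bind = "0.0.0.0:8000"  # illustrative bind address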
What I've researched:
- Python's built-in logging module is thread-safe but not process-safe (the kind of process-level locking I'm weighing is sketched after this list)
- Multiple processes writing to the same file can produce interleaved content
- File operations in append mode generally don't raise concurrent-write errors on Linux
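If process-level locking does turn out to be necessary, this is roughly what I'd add (a sketch assuming Linux, using an advisory fcntl.flock around each append; debug_locked is a hypothetical name):

import fcntl

def debug_locked(msg: str):
    with open("con7.log", "a", encoding="utf-8") as logf:
        # Advisory inter-process lock: blocks until no other process holds it.
        fcntl.flock(logf, fcntl.LOCK_EX)
        try:
            logf.write(msg + "\n")
        finally:
            fcntl.flock(logf, fcntl.LOCK_UN)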
My specific questions:
1. Will concurrent file writes from multiple processes throw exceptions that crash my script, or just produce messy logs?
2. What's the worst-case scenario? Will my Flask app keep serving requests even if the log file gets corrupted?