You could roll your own, somewhat, with threading:
#!/usr/bin/python3
def _spawn_background_interpreter(*args, **kwargs):
    from threading import Thread

    def _open_interp(locs):
        import code
        code.interact(local=locs)

    locs = args[0] if args else None
    t = Thread(target=_open_interp, args=(locs,))
    t.daemon = True  # don't let the interpreter thread keep the program alive
    t.start()
Call with _spawn_background_interpreter(locals()).
I haven't tested this extensively, but it will probably be fine as long as your program doesn't continuously print to the console - otherwise its output will get munged together with the interactive interpreter's.
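For example, a minimal driver might look like this (the busy-loop is just a stand-in for your real program):

if __name__ == "__main__":
    import time
    _spawn_background_interpreter(locals())
    while True:        # stand-in for your real main loop
        time.sleep(1)  # meanwhile, you can poke at state from the prompt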
The "opening a new console" idea is interesting, but very environment-specific, so I won't tackle that. I would be interested if there's a better prepackaged solution out there.
Edit: an attempt at a multiprocessing solution:
def _spawn_background_interpreter(*args, **kwargs):
    from multiprocessing import Process
    import sys, os

    def _open_interp(locs, stdin):
        import code
        sys.stdin = os.fdopen(stdin)  # reattach the parent's stdin in the child
        code.interact(local=locs)

    locs = args[0] if args else None
    fileno = sys.stdin.fileno()
    p = Process(target=_open_interp, args=(locs, fileno))
    p.daemon = True
    p.start()
The reason I initially avoided multiprocessing is that each new process gets its own PID (and its own stdin). Thus, I had to pass the parent process's stdin file descriptor to the child, and things get a little hacky from there. Note that there is a bug in Python 3.2 and lower that causes tracebacks to spew any time you call exit() in a multiprocessing process; it is fixed in 3.3.
Unfortunately, the multiprocessing code only runs on POSIX-compliant systems - i.e., not on Windows. Not insurmountable; it would just require a more involved solution involving pipes.
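For illustration, here is one rough, untested sketch of what such a pipe-based approach might look like (the PipeConsole class and the function names are hypothetical, not from any library):

import code
import multiprocessing as mp

class PipeConsole(code.InteractiveConsole):
    """Interactive console that reads its input from a multiprocessing pipe."""
    def __init__(self, conn, locs=None):
        super().__init__(locals=locs)
        self._conn = conn

    def raw_input(self, prompt=""):
        print(prompt, end="", flush=True)  # echo the >>> / ... prompt
        line = self._conn.recv()           # block until the parent sends a line
        if line is None:                   # parent signaled EOF
            raise EOFError
        return line

def _console_child(conn, locs):
    PipeConsole(conn, locs).interact()

def _spawn_background_interpreter_pipe(locs=None):
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=_console_child, args=(child_conn, locs), daemon=True)
    p.start()
    return parent_conn  # the main process forwards input lines via .send()

The main process would then read lines from its own stdin and forward them with parent_conn.send(line), sending None to signal EOF. On Windows you'd also need the usual if __name__ == "__main__" guard around the spawning code, and locs would have to be picklable.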
Anyway, the multiprocessing implementation will likely perform better for you if you're approaching 100% CPU utilization in your main thread, since the interpreter doesn't have to contend with your program for the GIL. Give it a try if you're on *nix.
code.InteractiveConsole(locals=locals()).interact() from the code package provides similar functionality to IPython.embed(), but doesn't require IPython.
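For example, a minimal sketch of dropping into such a console at any point in a script:

import code

x = 42
# Opens an interactive prompt with access to x; exit with Ctrl-D (Ctrl-Z on Windows).
code.InteractiveConsole(locals=locals()).interact()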