I'm not new to Python, but I'm still looking for the most appropriate way of sharing string-formatted data between processes.
Currently I have 4 processes (1 parent + 3 children via multiprocessing). The parent fetches data from a long-poll server and, after a few checks, sends it to a MySQL server.
The children (one per kind of task) then process the data the way I need.
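For context, here's a minimal, runnable sketch of the layout I described. The fetch and DB calls are hypothetical stubs, and a `multiprocessing.Queue` stands in for the MySQL handoff:

```python
import multiprocessing as mp

def fetch_from_long_poll():
    # hypothetical stub for the long-poll fetch
    return ['item-%d' % i for i in range(6)]

def store_in_db(record, db):
    # hypothetical stub standing in for the MySQL insert
    db.append(record)

def worker(tasks, results):
    # each child handles one kind of task; here it just uppercases
    for record in iter(tasks.get, None):  # None is the stop signal
        results.put(record.upper())

def main():
    tasks, results = mp.Queue(), mp.Queue()
    db = []
    children = [mp.Process(target=worker, args=(tasks, results))
                for _ in range(3)]
    for child in children:
        child.start()
    records = fetch_from_long_poll()
    for record in records:
        store_in_db(record, db)   # persist first, so nothing is lost
        tasks.put(record)         # then hand off to the children
    for _ in children:
        tasks.put(None)           # one stop signal per child
    out = sorted(results.get() for _ in records)  # drain before join
    for child in children:
        child.join()
    return db, out
```

In the real project the children read the data back out of MySQL rather than a queue, and that round-trip is exactly the part that slows down as the project grows.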
Everything works fine for now: my data is stored safely and is always accessible, which is especially handy when I need to debug or implement new features. But it all gets slower as the project grows, which is not good for a web app. As you might guess, the data stored in the DB is a list or its objects. I know Python has some issues with transferring data between processes (or rather, software restrictions)...
I was thinking about temporarily storing the data in JSON or plain text files, but concurrency wouldn't allow me to do that. I could also try sockets, but is it worth starting a callback server for such a purpose?
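To be concrete about the file idea: making concurrent appends safe would need something like advisory locking, and that extra locking and disk round-trip per record is what puts me off. A POSIX-only sketch (the path and record shape are made up):

```python
import fcntl
import json

def append_record(path, record):
    # serialize writers with an advisory lock so concurrent appends
    # from several processes don't interleave (POSIX only)
    with open(path, 'a') as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        try:
            f.write(json.dumps(record) + '\n')
            f.flush()
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)

def read_records(path):
    # one JSON object per line
    with open(path) as f:
        return [json.loads(line) for line in f]
```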
Asynchronous approaches didn't work for me either. So what are my options here? I can't afford to lose any piece of data, but I also need to keep it fast. Any other suggestions are welcome :)
P.S. Most related topics are outdated or didn't answer my question, because I'm sure the way I'm working with data isn't the best one so far.