
If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?

I cannot simply import script A into script B, as that creates a new instance of script A rather than giving access to the variables in the already running instance.

The only way I can think of doing it is by having script A write to a file and then having script B read the data from that file. This is less than ideal, however, as something bad might happen if script B tries to read the file while script A is still writing to it. I am also looking for much faster communication between the two programs.

EDIT: Here are the examples as requested. I am aware of why this doesn't work, but it is the basic premise of what needs to be achieved. My source code is very long and unfortunately confidential, so it is not going to help here. In summary, script A is running Tkinter and gathering data, while script B is views.py as part of Django, but I'm hoping this can be achieved purely in Python.

Script A

import time

i = 0

def return_data():
    return i

if __name__ == "__main__":
    while True:
        i = i + 1
        print(i)
        time.sleep(.01)

Script B

import time
from scriptA import return_data

if __name__ == '__main__':
    while True:
        print(return_data())  # from script A
        time.sleep(1)
  • You should be able to import one module to the other, instantiate a single instance (using a singleton if necessary) and then assign attributes/values to this instance so you can read from it as needed in the secondary script. Commented May 9, 2017 at 4:32
  • If the scripts aren't too long or sensitive, it would help to see the source code Commented May 9, 2017 at 4:33
  • Perhaps you can use a file socket? That seems like an option for streaming data. Commented May 9, 2017 at 4:36
  • The question is too vague. "Pass the live data to another script" could mean many different things. How are you passing it? Over a socket? Via a RESTful interface? As command-line arguments? Do you pass the data once when starting the second program, or is it constantly updated as it changes? Please show a Minimal, Complete, and Verifiable example. Commented May 9, 2017 at 11:49

7 Answers


You can use the multiprocessing module to implement a Pipe between the two modules. You can then start one of the modules as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass Python objects like dicts and lists through them.

Ex: mp2.py:

from multiprocessing import Process, Pipe  # Queue isn't needed here
from mp1 import f

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())   # prints "Hello"

mp1.py:

# no imports are needed here; mp2.py passes the connection object in

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)
    child_conn.close()
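
If the producer has to keep sending data rather than a single message (as the comments below ask), a minimal sketch could send a stream and finish with a sentinel value. This variant, including f_stream and the None sentinel, is my own illustration and not part of the original answer:

from multiprocessing import Process, Pipe

def f_stream(conn):
    # producer: send a sequence of values, then a sentinel to signal the end
    for i in range(5):
        conn.send(i)
    conn.send(None)  # sentinel: no more data
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f_stream, args=(child_conn,))
    p.start()
    while True:
        data = parent_conn.recv()  # blocks until the next item arrives
        if data is None:  # sentinel received, stop consuming
            break
        print('got', data)  # ...do more stuff with data here...
    p.join()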

4 Comments

This seems to be the way to go. However, why doesn't the following work? If you add this code to the end of mp1.py: i = 0; def g(): print(i); if __name__ == "__main__": while True: i = i + 1; g() — why doesn't running mp2.py return the current i?
Didn't quite understand your question there. However, if you want to call the function g(), you need to specify it in p = Process(target=g, args=(child_conn,))
See the examples in the edit; if you insert your code into them, mp2 will return 0 rather than whatever i is at in mp1.
This program is exactly what I need, except how do I make it keep passing more data? For example, if mp1.py runs a loop that keeps printing from a websocket (while True: print(await websocket.recv())), mp2.py as written will never move past the print(parent_conn.recv()) line, when I really need to process that data. What I really want is a sequence like: data_received = parent_conn.recv(), then do more stuff with data_received.

If you want to read and modify shared data between two scripts that run separately, a good solution is to take advantage of the Python multiprocessing module and use a Pipe() or a Queue() (see the differences here). This way you keep the scripts synchronized and avoid problems with concurrency and global variables (like what happens if both scripts want to modify a variable at the same time).

As Akshay Apte said in his answer, the best part about using pipes/queues is that you can pass Python objects through them.

There are also methods to avoid waiting for data if none has been passed yet (queue.empty() and pipeConn.poll()).

See an example using Queue() below:

    # main.py
    from multiprocessing import Process, Queue
    from stage1 import Stage1
    from stage2 import Stage2


    s1= Stage1()
    s2= Stage2()

    # S1 to S2 communication
    queueS1 = Queue()  # s1.stage1() writes to queueS1

    # S2 to S1 communication
    queueS2 = Queue()  # s2.stage2() writes to queueS2

    # start s2 as another process
    s2 = Process(target=s2.stage2, args=(queueS1, queueS2))
    s2.daemon = True
    s2.start()     # Launch the stage2 process

    s1.stage1(queueS1, queueS2) # start sending stuff from s1 to s2 
    s2.join() # wait till the s2 process finishes

    # stage1.py
    import time
    import random

    class Stage1:

      def stage1(self, queueS1, queueS2):
        print("stage1")
        lala = []
        lis = [1, 2, 3, 4, 5]
        for i in range(len(lis)):
          # to avoid unnecessary waiting
          if not queueS2.empty():
            msg = queueS2.get()    # get msg from s2
            print("! ! ! stage1 RECEIVED from s2:", msg)
            lala = [6, 7, 8] # now that a msg was received, further msgs will be different
          time.sleep(1) # work
          random.shuffle(lis)
          queueS1.put(lis + lala)             
        queueS1.put('s1 is DONE')

    # stage2.py
    import time

    class Stage2:

      def stage2(self, queueS1, queueS2):
        print("stage2")
        while True:
            msg = queueS1.get()    # wait till there is a msg from s1
            print("- - - stage2 RECEIVED from s1:", msg)
            if msg == 's1 is DONE':  # must match the sentinel string exactly (no trailing space)
                break # ends loop
            time.sleep(1) # work
            queueS2.put("update lists")             

EDIT: I just found that you can use queue.get(False) to avoid blocking when receiving data; it raises queue.Empty if nothing is available, so there is no need to check first whether the queue is empty. This is not possible if you use pipes.
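
A minimal sketch of that non-blocking pattern (the queue import is only needed for the Empty exception):

    import queue  # only for the queue.Empty exception
    from multiprocessing import Queue

    q = Queue()
    try:
        msg = q.get(False)  # non-blocking get
    except queue.Empty:
        msg = None  # nothing available yet, carry on with other work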

Comments


I solved the same problem using the Shared Memory Dict library, a very simple dict implementation built on top of multiprocessing.shared_memory (available on PyPI as shared-memory-dict).

Source1.py

from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    smd_config["status"] = True

    while True:
        smd_config["status"] = not smd_config["status"]
        sleep(1)

Source2.py

from shared_memory_dict import SharedMemoryDict
from time import sleep

smd_config = SharedMemoryDict(name='config', size=1024)

if __name__ == "__main__":
    while True:
        print(smd_config["status"])
        sleep(1)

1 Comment

This is a grossly underrated solution. It's just what I need right now.

You could use the pickle module to pass data between two Python programs, by serializing objects to a file that the other program then reads.

import pickle 

def storeData(): 
    # initializing data to be stored in db 
    employee1 = {'key' : 'Engineer', 'name' : 'Harrison', 
    'age' : 21, 'pay' : 40000} 
    employee2 = {'key' : 'LeadDeveloper', 'name' : 'Jack', 
    'age' : 50, 'pay' : 50000} 

    # database 
    db = {} 
    db['employee1'] = employee1 
    db['employee2'] = employee2 

    # it's important to use binary mode; 'wb' overwrites any previous dump
    dbfile = open('examplePickle', 'wb')

    # source, destination 
    pickle.dump(db, dbfile)                   
    dbfile.close() 

def loadData(): 
    # for reading also binary mode is important 
    dbfile = open('examplePickle', 'rb')      
    db = pickle.load(dbfile) 
    for keys in db: 
        print(keys, '=>', db[keys]) 
    dbfile.close() 
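
The answer doesn't show how the two programs would call these functions. A minimal sketch (the module and file names are my assumptions) could be split across two scripts, with the same caveat as the file-based approach in the question, namely that the reader can catch the file mid-write:

writer.py:

from example_pickle import storeData  # assumed module holding the functions above

if __name__ == '__main__':
    storeData()  # serialize the dicts to the 'examplePickle' file

reader.py (run separately, after writer.py has finished):

from example_pickle import loadData

if __name__ == '__main__':
    loadData()  # read 'examplePickle' back and print its contents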

Comments


If anyone is still looking (note that fcntl is Unix-only):

import fcntl

def write_to_file(filename, content):
    with open(filename, 'w') as file:
        fcntl.flock(file, fcntl.LOCK_EX) # Locks the file for writing
        file.write(content)
        fcntl.flock(file, fcntl.LOCK_UN) # Unlocks the file

write_to_file('myfile.txt', 'This is a message')

import time

def read_from_file(filename):
    while True:
        try:
            with open(filename, 'r') as file:
                fcntl.flock(file, fcntl.LOCK_SH | fcntl.LOCK_NB) # non-blocking shared lock; raises if the writer holds it
                content = file.read()
                fcntl.flock(file, fcntl.LOCK_UN) # Unlocks the file
                return content
        except IOError:
            print("File is being written to, waiting...")
            time.sleep(1)

content = read_from_file('myfile.txt')
print(content)

You can add a flag line to the text file to synchronize reading and writing, if you require the file to be updated before the other script reads it. For example, the flag is False when the file has not been updated since it was last read: whenever information is read from the file, the flag is set to False; whenever information is written to the file, the flag is set to True (a sketch of this follows below).

I have another approach as well, but it requires the array to always be the same size; I use it for sharing frames between many running scripts.
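
A minimal sketch of the flag idea, reusing the locking calls from above (the first-line-as-flag layout and the helper names are my own assumptions):

import fcntl

def write_with_flag(filename, content):
    # first line is the flag ('True' = fresh data), the rest is the payload
    with open(filename, 'w') as f:
        fcntl.flock(f, fcntl.LOCK_EX)
        f.write('True\n' + content)
        fcntl.flock(f, fcntl.LOCK_UN)

def read_if_fresh(filename):
    # returns the payload only if it hasn't been read before, else None
    with open(filename, 'r+') as f:  # assumes the writer created the file first
        fcntl.flock(f, fcntl.LOCK_EX)  # exclusive, because we rewrite the flag
        lines = f.read().splitlines()
        fresh = bool(lines) and lines[0] == 'True'
        payload = '\n'.join(lines[1:]) if fresh else None
        if fresh:
            f.seek(0)
            f.truncate()
            f.write('False\n' + '\n'.join(lines[1:]))  # mark as already read
        fcntl.flock(f, fcntl.LOCK_UN)
        return payload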

Comments


I find it rather charming to use sockets for communication between two independent processes.

This approach can even be found in the official documentation of the multiprocessing module.

shortened example [from the docs]

The following server code creates a listener which uses 'secret password' as an authentication key. It then waits for a connection and sends some data to the client:

from multiprocessing.connection import Listener

address = ('localhost', 6000)     # family is deduced to be 'AF_INET'

with Listener(address, authkey=b'secret password') as listener:
    with listener.accept() as conn:
        print('connection accepted from', listener.last_accepted)

        conn.send([2.25, None, 'junk', float])
        
        conn.send_bytes(b'hello')

The following code connects to the server and receives some data from the server:

from multiprocessing.connection import Client

address = ('localhost', 6000)

with Client(address, authkey=b'secret password') as conn:
    
    print(conn.recv())           # => [2.25, None, 'junk', float]

    print(conn.recv_bytes())     # => 'hello'
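
Note that the listener script has to be running before the client starts; otherwise the Client call fails with ConnectionRefusedError.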

Alternatives

Alternatively, use a socket-based message queue module like Celery, but it will need a message queue broker (RabbitMQ or Redis).

I fooled around with ZeroMQ once; there you can write the broker in Python itself. See their guide with loads of examples of how to write message queue brokers and clients in different programming languages.
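
For a taste of it, here is a minimal ZeroMQ PUB/SUB sketch (assuming pyzmq is installed; the port number is arbitrary). For a simple one-to-one stream like the question's counter, no separate broker is needed:

publisher.py:

import time
import zmq

context = zmq.Context()
socket = context.socket(zmq.PUB)
socket.bind("tcp://*:5556")  # arbitrary port

i = 0
while True:
    socket.send_string(str(i))  # stream the live value
    i += 1
    time.sleep(.01)

subscriber.py:

import zmq

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect("tcp://localhost:5556")
socket.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to everything

while True:
    print(socket.recv_string())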

Comments


I was trying to share an array between two concurrently running Python scripts. Andre's solution was helpful; I couldn't share a whole array with Shared Memory Dict, but each item of the array can be shared separately. The source file has to be started before the receive file. As you can see, Item-4 is shared back and forth:

SrcArry.py

from shared_memory_dict import SharedMemoryDict
from time import sleep
Arry = SharedMemoryDict(name=0, size=64)
if __name__ == "__main__":
    while True:
        try:
            print(Arry["Itm4"])
        except KeyError:
            # Itm4 doesn't exist until the receive script writes it back
            pass
        for i in range (0, 16):
            Arry["Itm1"] = i
            Arry["Itm2"] = (i - 1)
            Arry["Itm3"] = (i + 2)
            Arry["Itm4"] = (i + 3)
        sleep(1)

RcvArry.py

from shared_memory_dict import SharedMemoryDict
from time import sleep
Trry = SharedMemoryDict(name=0, size=128)
if __name__ == "__main__":
    while True:
        print(Trry["Itm1"])
        print(Trry["Itm2"])
        print(Trry["Itm3"])
        print(Trry["Itm4"])
        Trry["Itm4"] = 100
        sleep(1)

The variable name of the array (Arry vs Trry) and the declared size can differ between the send and receive scripts, as long as the shared-memory name passed to SharedMemoryDict matches. Item-4 doesn't exist before the receive file is started, so I have to handle an exception in the source file. You may find a better way.

Comments
