
Simultaneous Write To Multiple Files

I'm working on a project that uses a BeagleBone to read from multiple sensors and then pass that data into a text file. I'm monitoring six different muscles, so I have the fil…

Solution 1:

Your functions have infinite loops in them, so they will never return. One file was created by LeftHamAcquisition, but since that function never returned, none of the other functions were ever executed. You need something like the multiprocessing module to run them in parallel. In particular, I would recommend multiprocessing pools and the apply_async function:

import multiprocessing
import Queue
import time

# one global constant: the poison pill
# this could really be whatever you want - my string choice is arbitrary
STOP = "stop"

# change the signature of your function to accept a queue for the main
# process to pass a poison pill
def LeftHamAcquisition(kill_queue):
    f_name = 'HamLeft.txt'
    # you aren't doing anything with "file_name" - should it be removed?
    # file_name = os.path.join(/relevant file path/)

    # use a file context manager:
    with open(f_name, 'a+') as HamLData:
        while True:
            # in the infinite loop, add a check for the poison pill
            try:
                val = kill_queue.get(block=False)
                if val == STOP:
                    return  # leave if the poison pill was sent
            except Queue.Empty:
                pass  # ignore an empty queue

            EMGhamL = ADC.read_raw('AIN1')
            HamLData.write(str(elapsed_milliseconds))
            HamLData.write('\t')
            HamLData.write(str(EMGhamL))
            HamLData.write('\n')

# ... the rest of your functions ...

# The Program
print "Press ctrl-C to end acquisition"

# a list of your functions
f_list = [
f_list = [
    LeftHamAcquisition,
    RightHamAcquisition,
    LeftVastAcquisition,
    RightVastAcquisition,
    LeftQuadAcquisition,
    RightQuadAcquisition,
]

pool = multiprocessing.Pool()  # create the worker pool

# create the queue used to pass the poison pills
# (a plain multiprocessing.Queue() can't be passed to pool workers - it
# raises a RuntimeError - so use a manager queue instead)
kill_queue = multiprocessing.Manager().Queue()

for f in f_list:
    # kick off the functions, passing them the poison pill queue
    # (the trailing comma makes args a one-element tuple)
    pool.apply_async(f, args=(kill_queue,))

try:
    # put the main process to sleep while the workers do their thing
    while True:
        time.sleep(60)

except KeyboardInterrupt:
    # close the workers nicely - put one poison pill on the queue for each
    for f in f_list:
        kill_queue.put(STOP)
    pool.close()
    pool.join()
    raise
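
Note that these snippets assume ADC has already been imported and initialized somewhere. If you're using the Adafruit_BBIO library (an assumption on my part - your imports aren't shown), that setup would look like:

import Adafruit_BBIO.ADC as ADC

ADC.setup()  # must be called before any read_raw() calls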

Also, there's no reason to have this many functions. They all do the same thing, just with different strings and variable names. You should refactor them into one function that you can pass arguments to:

def acquisition(kill_queue, f_name, ain):

    with open(f_name, 'a+') as f:
        while True:

            try:
                val = kill_queue.get(block=False)
                if val == STOP:
                    return
            except Queue.Empty:
                pass

            an_val = ADC.read_raw(ain)

            # where does "elapsed_milliseconds" come from? it's undefined
            # in your example code
            f.write("{}\t{}\n".format(elapsed_milliseconds, an_val))

With this function, instead of providing a list of individual functions as in my multiprocessing example, you can just reuse it, calling it repeatedly with different arguments (which is the whole point of functions).
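For instance, here's a sketch of how the pool from earlier would drive it - the file names and AIN channel pairings below are placeholders, so match them to your actual wiring:

# map each output file to its ADC channel
channels = [
    ('HamLeft.txt',   'AIN1'),
    ('HamRight.txt',  'AIN2'),
    ('VastLeft.txt',  'AIN3'),
    ('VastRight.txt', 'AIN4'),
    ('QuadLeft.txt',  'AIN5'),
    ('QuadRight.txt', 'AIN6'),
]

pool = multiprocessing.Pool()
kill_queue = multiprocessing.Manager().Queue()

# one worker per muscle, all running the same acquisition function
for f_name, ain in channels:
    pool.apply_async(acquisition, args=(kill_queue, f_name, ain))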
