multithreading - How to control processes in Python?
I want to run several processes in parallel without tying up the CPU, so the CPU can do other jobs.
In Python, I use os.system to call a binary. These calls are independent and can run in parallel, but the binaries may run for different lengths of time.
What I want, for example, is to keep 8 of them running in parallel, and if one exits early, start another one.
What I am doing now is this:

```python
count = 0
for f in files:
    count = count + 1
    cmd = exe
    if count != 8:
        cmd = cmd + " &"
    else:
        count = 0
    os.system(cmd)
```
But this is not ideal: every eighth command is run without '&', so the loop blocks on it whether it runs long or short, while the other slots may sit idle.
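If you want to stay with plain subprocess launches, one way to get the "keep 8 alive, refill on exit" behaviour is a small polling loop. This is a sketch under assumptions: the sleep commands below are hypothetical stand-ins for your binaries, and the 20-job list replaces your `files`:

```python
import subprocess
import sys
import time

# Hypothetical workload: stand-ins for the real binaries, each
# sleeping a different amount of time to mimic jobs of varying length.
durations = [0.01 * (i % 4) for i in range(20)]
pending = [
    [sys.executable, "-c", f"import time; time.sleep({d})"] for d in durations
]
MAX_PROCS = 8
running = []
completed = 0

while pending or running:
    # Reap any process that has exited, freeing its slot.
    alive = []
    for p in running:
        if p.poll() is None:
            alive.append(p)
        else:
            completed += 1
    running = alive
    # Immediately refill up to MAX_PROCS from the pending commands,
    # so a job that exits early is replaced right away.
    while pending and len(running) < MAX_PROCS:
        running.append(subprocess.Popen(pending.pop(0)))
    time.sleep(0.005)

print(completed)
```

The loop never has more than 8 children alive at once, and a slot is refilled as soon as `poll()` reports an exit, regardless of launch order.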
I also tried the multiprocessing module:

```python
p = Pool(8)
print(p.map(f, list_of_args))
```
But in that case it is not running 8 processes in parallel all of the time, since some of them exit early.
There is no need for synchronization between the processes.
I have 16 CPU cores and want to use half of them (8 processes running in parallel).
You'd better not use os.system; subprocess.Popen is more powerful and safer. subprocess.Popen does not block on the call, so you don't need to append '&' at the end of the command.
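A minimal sketch of that non-blocking behaviour (the 1-second sleeping child is just an illustrative stand-in for your binary):

```python
import subprocess
import sys
import time

start = time.time()
# Popen returns as soon as the child is launched; it does not wait
# for the child (here: a 1-second sleep) to finish.
child = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(1)"])
launch_time = time.time() - start

print(launch_time < 1.0)  # the call came back before the child ended
child.wait()              # explicitly block until the child exits
```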
As for the question itself: operating systems are quite good at balancing workload automatically, so you should not worry about idling processes vs. running ones. Launch a pool of workers and let them run until needed, without worrying about 'wasting' resources: an idling process takes a bit of memory, and that's it.
When it comes to improving your code, you might want to use a pool of threads instead of a pool of processes. As your workers spend their time waiting for the child processes to finish, threads are better suited than processes for that.
If you can use Python 3, concurrent.futures will do the job for you.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def function(myfile):
    command = ('whatever', 'you', 'want', 'to', 'do', 'with', myfile)
    process = subprocess.Popen(command, stdout=subprocess.PIPE)
    process.communicate()

with ThreadPoolExecutor(max_workers=8) as executor:
    # executor.map returns an iterator, not a Future;
    # consuming it waits for all the jobs to finish.
    for _ in executor.map(function, files):
        pass
```