Share large (read-only) binary string between Python processes?

I have a large, read-only bytes object that I need to use from several different Python (3) processes, each of which "returns" (adds to a result queue) a list of results based on its work.

Since this object is very large and read-only, I would like to avoid copying it into the address space of each worker process. The research I've done suggests that shared memory is the right way to go, but I haven't been able to find a good resource or example of how exactly to do this with the multiprocessing module.

Thanks in advance.



1 answer


You can use multiprocessing.Array, which is like a ctypes array but backed by shared memory, when you give it a ctypes type.

import ctypes
import multiprocessing

# No lock needed, as no writes will be done.
array = multiprocessing.Array(ctypes.c_char, long_byte_string, lock=False)

For example:

>>> import multiprocessing
>>> import ctypes
>>> array = multiprocessing.Array(ctypes.c_char, b'\x01\x02\xff\xfe', lock=False)
>>> array[0]
b'\x01'
>>> array[2:]
b'\xff\xfe'
>>> array[:]
b'\x01\x02\xff\xfe'
>>> b'\xff' in array
True
