Thursday, 15 January 2015

Memory consumption when using multiprocessing in Python



I am using Python's multiprocessing module to launch a Monte Carlo simulation in order to speed up the computation. The code I have looks like this:

import multiprocessing

import numpy as np


def main():
    # (various parameters being set up...)
    start = 0
    end = 10
    count = int(1e4)
    time = np.linspace(start, end, num=count)

    num_procs = 12
    mc_iterations_per_proc = int(1e5)
    mc_iterations = num_procs * mc_iterations_per_proc

    mean_estimate, mean_estimate_variance = np.zeros(count), np.zeros(count)
    pool = multiprocessing.Pool(num_procs)
    for index, (estimate, estimate_variance) in enumerate(
            pool.imap_unordered(
                mc_linear_estimate,
                ((disorder_mean, intensity, wiener_std, time)
                 for index in xrange(mc_iterations)),
                chunksize=mc_iterations_per_proc)):
        # Welford's algorithm: update the running mean and variance in place
        delta = estimate - mean_estimate
        mean_estimate = mean_estimate + delta / float(index + 1)
        mean_estimate_variance = mean_estimate_variance + delta * (estimate - mean_estimate)
    mean_estimate_variance = mean_estimate_variance / float(index)

OK, the mc_linear_estimate function takes *args and creates additional variables within it. It looks like this:

def mc_linear_estimate(*args):
    disorder_mean, intensity, wiener_std, time = args[0]

    theta_process = source_process(time, intensity, disorder_mean)
    xi_process = observed_process(time, theta_process, wiener_std)
    gamma = error_estimate(time, intensity, wiener_std, disorder_mean)
    estimate = signal_estimate(time, intensity, wiener_std, disorder_mean, gamma, xi_process)
    estimate_variance = (estimate - theta_process) ** 2
    return estimate, estimate_variance

As you can see, the number of iterations is pretty big (1.2M), and each array holds 10K doubles, hence I use Welford's algorithm to compute the mean and variance, since it does not require storing every element of the considered sequences in memory. However, this does not help.
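
For reference, here is a minimal scalar sketch of Welford's update (the same recurrences as in the loop above), showing that only the running aggregates need to be kept in memory:

def welford(samples):
    # Running aggregates: mean and sum of squared deviations (M2)
    mean, m2 = 0.0, 0.0
    for n, x in enumerate(samples, 1):
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    # Dividing by (n - 1) gives the sample variance, matching the
    # final division by `index` in main() above
    return mean, m2 / (n - 1)

print welford([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # (5.0, ~4.571)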

The problem is that I run out of memory. When I launch the application, 12 processes emerge (as seen with the top program on my Linux machine). They instantly start consuming a lot of memory, but since the Linux machine I'm using has 49G of RAM, things are OK for some time. Then, as each of the processes takes around 4G of RAM, one of them fails and shows <defunct> in top. Then another process falls off, and this continues until only one process is left, which eventually fails with an "out of memory" exception.
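
For scale, a rough back-of-the-envelope of the sizes involved (an estimate, assuming a worker materializes a whole chunk of results before sending it back):

import numpy as np

# Each 10k-element float64 array weighs about 80 KB, and every
# iteration returns two of them, so a single buffered chunk of
# chunksize=1e5 results would be on the order of 16 GB.
count = int(1e4)
array_bytes = count * np.dtype(np.float64).itemsize  # 80,000 bytes
result_bytes = 2 * array_bytes                       # ~160 KB per iteration
chunk_bytes = result_bytes * int(1e5)                # ~16 GB per chunk
print array_bytes, result_bytes, chunk_bytes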

The questions:

What am I possibly doing wrong?

How can I improve the code so that it wouldn't consume all the memory? (One tentative direction is sketched below.)
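
A minimal sketch of the kind of change I have in mind (hypothetical and untested; mc_iteration is a stand-in, not my real function): pass each worker a lightweight seed instead of full arrays, and keep the chunksize small so that buffered argument and result chunks stay small.

import multiprocessing

import numpy as np


def mc_iteration(seed):
    # Hypothetical stand-in for mc_linear_estimate: the worker
    # rebuilds the arrays it needs instead of receiving them.
    np.random.seed(seed)
    estimate = np.random.randn(int(1e4))
    return estimate, estimate ** 2


if __name__ == '__main__':
    count = int(1e4)
    mean_estimate = np.zeros(count)
    mean_estimate_variance = np.zeros(count)
    pool = multiprocessing.Pool(12)
    results = pool.imap_unordered(mc_iteration, xrange(int(1.2e6)),
                                  chunksize=100)
    for index, (estimate, estimate_variance) in enumerate(results):
        delta = estimate - mean_estimate
        mean_estimate += delta / float(index + 1)
        mean_estimate_variance += delta * (estimate - mean_estimate)
    mean_estimate_variance /= float(index)
    pool.close()
    pool.join()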

python memory-management multiprocessing
