python - How to remove methods of a reference class and its objects in R
This is going to be a long question, so pardon me.
I have the following scenario; I guess giving pseudo code will explain things better.
A Python file, test.py:

    def test(i):
        from rpy2.robjects import r
        r.source('r_file.r')
        r.call_function(...)  # with arguments
        # some operations
        del r
An R file, r_file.r:

    rm(list=ls(all=TRUE))
    # some global variables
    # some reference classes
    # creating objects of the reference classes

    call_function = function(some_arguments) {
        # processing
        # call a few methods on the reference class objects
        # call more methods and operations
        rm(list=ls(all=TRUE))
        gc()
        return(0)
    }
The call to the function test in Python happens for different values of 'i', i.e. the function gets invoked multiple times from main. Hence, I source the R file more than once. I wanted a new R interpreter every time I invoke the Python function; therefore, I import r every time the function is called and delete the rpy2 object afterwards.
Within the R function call_function, I invoke methods, which in turn create reference class objects.
Within the R code, I use rm at the beginning of the code and when the function call_function exits.
Given this background, the problem I'm facing is that rm does not remove all traces of the reference classes in the code, and I keep getting a warning like this:
    In .removePreviousCoerce(class1, class2, where, prevIs) :
      methods currently exist for coercing from "rev_r5" to "envRefClass"; they will be replaced.
Here, rev_r5 is a reference class. I do not want this to happen; is there a way to remove the methods and objects related to the reference classes using rm?
Removing objects from R's global environment does not mean that you have a freshly started R process (class and method definitions may remain, as you discovered).
R functions such as removeClass(), removeMethod(), or removeGeneric() could be considered, but unless there are objective requirements (like avoiding loading big objects over and over again), creating new R processes each time might be the safest way to go (starting an R process is relatively fast).
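For completeness, a minimal sketch of what in-process cleanup could look like, assuming the class name rev_r5 from the question (the key point is that rm() alone does not touch class or method definitions):

```r
# Hedged sketch: clean up a reference class named rev_r5 (name assumed).
rm(list = ls(all = TRUE))   # removes objects, but NOT class/method definitions
removeClass("rev_r5")       # removes the class definition itself
gc()                        # prompt collection of the freed objects
```

Even with this, hidden state can linger in other environments, which is why restarting the process remains the safest option.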
Since it is not possible to terminate and restart an embedded R (a limitation coming from R itself, not from rpy2), you'll have to start and stop the Python processes embedding R.
One way to do this is to use the Python package multiprocessing (included in Python's standard library). An added bonus is that the processes can run in parallel.
A simple example using Doug Hellmann's excellent tutorial as a base:

    import multiprocessing

    def r_worker(i):
        """Worker function: embeds its own R via rpy2."""
        print('worker %i started' % i)
        from rpy2.robjects import r
        r.source('r_file.r')
        r.call_function(...)  # with arguments
        # some operations
        del r
        return

    if __name__ == '__main__':
        jobs = []
        for i in range(5):
            p = multiprocessing.Process(target=r_worker, args=(i,))
            jobs.append(p)
            p.start()
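If the parent process needs each job's result back, multiprocessing.Pool follows the same pattern. Below is a pure-Python sketch in which the rpy2 work is replaced by a placeholder computation (the function names and the doubling are illustrative only):

```python
import multiprocessing

def worker(i):
    """Placeholder for the rpy2 work: a real worker would import
    rpy2.robjects, source r_file.r and call call_function here."""
    return i * 2  # stand-in for a value computed in R

def run_jobs(n):
    # Each pool worker is a separate OS process, so each one would
    # embed its own fresh R instance.
    with multiprocessing.Pool(processes=4) as pool:
        return pool.map(worker, range(n))

if __name__ == '__main__':
    print(run_jobs(5))  # [0, 2, 4, 6, 8]
```

Because every worker is a distinct OS process, the embedded R is created and torn down with the process, which sidesteps the class-definition cleanup problem entirely.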
Tags: python, r, rpy2