Multiple R global environments in Python with rpy2


I'm working on a web project using Flask (Python) with an R backend, and rpy2 to communicate between Python and R; the logic (written in R) runs on the server side. I have R scripts that are called from Python methods, and I pass parameters from Python to R through the globalenv that rpy2 offers, as you can see below.

Python file

@map_module.route("/executekmeans", methods=['POST'])
def execute_kmeans():
    robjects.globalenv['numberclusters'] = 5  # pass the parameter through globalenv
    reqr = robjects.r.source(map_module.root_path + "/r/kmeans.r")[0][0]  # call the R script
    reqr = json.loads(reqr)
    return jsonify(reqr)

R script

# k-means clustering
require(jsonlite)

# helper function: drop columns by name
functiondrop <- function(df, names) {
  return(df[, -which(names(df) %in% names)])
}

# data preparation: coerce every column to numeric
for (i in seq_len(ncol(datatoanalyse))) {
  datatoanalyse[, i] <- as.numeric(as.character(datatoanalyse[, i]))
}

# execute the k-means algorithm
data_kmeans <- functiondrop(datatoanalyse, c("lat", "lon"))
cl <- kmeans(data_kmeans, numberclusters, algorithm = kmeansalgorithm)  # here I'm using the parameters passed from Python

# bind the cluster number to the data
datatoanalyse <- cbind(datatoanalyse[, c("lat", "lon")], colorcluster = cl$cluster)

# change the cluster numbers to colour conventions
datatoanalyse$colorcluster[datatoanalyse$colorcluster == 1] <- "red"
datatoanalyse$colorcluster[datatoanalyse$colorcluster == 2] <- "blue"
datatoanalyse$colorcluster[datatoanalyse$colorcluster == 3] <- "green"
datatoanalyse$colorcluster[datatoanalyse$colorcluster == 4] <- "orange"
datatoanalyse$colorcluster[datatoanalyse$colorcluster == 5] <- "yellow"

# convert the data frame to a JSON string (returned as the value of source())
toJSON(datatoanalyse, pretty = FALSE)

So the problem is that the global environment is shared: the project is accessed by multiple users, who can change its variables at runtime. I need the globalenv (or something equivalent) to be unique for each R-script call. How can I create new environments with rpy2, or how can I manage the global environment so that it doesn't create conflicts between parameters when the R scripts are called by different users?
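A minimal sketch of one possible approach, assuming the same map_module blueprint, kmeans.r script and numberclusters parameter as in the question: rpy2 exposes rpy2.robjects.Environment(), which wraps R's new.env(), and R's source() accepts a local= argument that can be an environment. Writing the parameters into a per-request environment and sourcing the script with local=env keeps them out of globalenv. The r_lock below is a hypothetical threading.Lock, added because the embedded R interpreter is a single shared instance and should not be entered from concurrent Flask request threads.

import json
import threading

from flask import jsonify
import rpy2.robjects as robjects

r_lock = threading.Lock()  # serialise access to the single embedded R instance (assumption: multi-threaded Flask server)

@map_module.route("/executekmeans", methods=['POST'])
def execute_kmeans():
    with r_lock:
        env = robjects.Environment()   # fresh R environment for this call (wraps new.env())
        env['numberclusters'] = 5      # the parameter lives only in this environment
        # evaluate the script inside env: free variables such as numberclusters
        # are looked up there first, so nothing is written to globalenv
        result = robjects.r['source'](map_module.root_path + "/r/kmeans.r", local=env)[0][0]
    return jsonify(json.loads(result))

The R script itself should not need changes: when source() evaluates the code with local = env, the lookup for numberclusters starts in env and only falls back to the enclosing environments for things like kmeans() and toJSON(). An alternative with a similar effect is to wrap the script body in an R function and call that function from Python with the parameters as arguments, so nothing has to be injected into any environment at all.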

