
I am using FlowSOM() clustering from the FlowSOM package and am getting an error while a vectorized function is running:

Error in map2():
ℹ In index: 8.
ℹ With name: FileID8.
Caused by error in map() at flowCompare/R/parameter_optimization_simple.R:3:3:
ℹ In index: 11.
Caused by error in pdf():
! could not open file 'C:***\AppData\Local\Temp\Rtmp6n5WXe/consensus.pdf'
Run rlang::last_trace() to see where the error occurred.

where *** is just the personal file path on my laptop.

The error seems random and is not restricted to certain files.

The input data are patient data, summarized in a list of data frames. The idea is to cluster each patient/data frame over a set range of nClus values in FlowSOM() and detect the optimal hyperparameter for that clustering tool.

However, my vectorized function stops at a random file, claiming it could not open consensus.pdf.
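
From the traceback, the failing pdf() call is not in my own code. As far as I can tell from skimming the FlowSOM source (this is my assumption, not documented behaviour), the consensus metaclustering that runs when nClus is set calls ConsensusClusterPlus() with a PDF report written into the session temp directory, which is where the Rtmp.../consensus.pdf path in the error comes from. A minimal sketch of how the temp directory can be checked for writability (write_test.pdf is just a throwaway file name):

# Assumption: FlowSOM's consensus metaclustering writes consensus.pdf into
# tempdir() via ConsensusClusterPlus(..., title = tempdir(), plot = "pdf").
# Quick check that the session temp directory exists and accepts a PDF:
tmp <- tempdir()
file.access(tmp, mode = 2)            # 0 means writable, -1 means not
test_pdf <- file.path(tmp, "write_test.pdf")
pdf(test_pdf); plot(1:10); dev.off()  # should create the file without error
file.exists(test_pdf)
unlink(test_pdf)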

For better traceability, I generated a reprex:

library(flowCore)
library(FlowSOM)
library(purrr)
library(ggplot2)
library(smerc)

# 40 dummy "patients": 300 x 9 data frames of random values
df.list <- replicate(40, as.data.frame(matrix(rnorm(2700, mean = 3, sd = 4), ncol = 9)),
                     simplify = FALSE)
df.list <- map(df.list, function(.y) { colnames(.y) <- paste0("C", 1:9); .y })
names(df.list) <- paste0("File", 1:40)

# mean Euclidean distance of each row to the cluster centroid
mean_ss <- function(x) {
  centroid <- colMeans(x)
  mean(sapply(seq_len(nrow(x)), function(i) dist(rbind(x[i, ], centroid))))
}

calculate_median_ss <- function(dat, i, mode) {
  # cluster with FlowSOM and pull the metacluster assignment per cell
  Cluster_ <- as.numeric(GetMetaclusters(FlowSOM(input = flowFrame(dat), nClus = i, seed = 1)))

  dfds <- data.frame(dat, Cluster_)

  clust <- table(dfds[["Cluster_"]])
  ds <- split(dfds, dfds[["Cluster_"]])
  ds <- lapply(ds, as.matrix)
  ds <- lapply(ds, mean_ss)
  ds <- unlist(ds)
  ds <- switch(mode, "median" = median(ds), "mean" = mean(ds))
  return(list(susq = ds, tab = clust))
}

parameter_optimization_simple <- function(dat, channels, nam, smoothing, seq_x) {

  opt_plot <- map(seq_x, ~ calculate_median_ss(as.matrix(dat[channels]), .x, "mean"))

  plot.df <- data.frame(nClus = seq_x, median_ss = unlist(map(opt_plot, ~ .x[[1]])))

  p <- ggplot(plot.df, aes(nClus, median_ss)) +
    labs(x = "nClus", y = "mean_ss")
  p <- switch(smoothing, "Y" = p + geom_smooth(), "N" = p + geom_point() + geom_line())
  # extract the (optionally smoothed) x/y values of the first layer
  p <- ggplot_build(p)$data[[1]]

  elbow <- elbow_point(p[["x"]], p[["y"]])$x

  cat("Clustering for", nam, "optimized! \n")

  return(elbow)
}

channels <- paste0("C", 1:9)
seq_x <- seq(4, 34, 2)

set.seed(5)
opt_param_list <- map2(.x = df.list, .y = names(df.list),
                       ~ parameter_optimization_simple(.x, channels, nam = .y,
                                                       smoothing = "Y", seq_x = seq_x))

I have tried removing and re-installing FlowSOM and I have updated all packages and R, but the error keeps popping up. I also couldn't find any pattern to when it occurs; it seems completely random and not reproducible.

The whole error message looks like this:

(screenshot of the full error message)
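
Since the failure is intermittent, one stop-gap I am considering is wrapping the per-file optimization in a small retry helper so that a single failed pdf() call does not abort the whole map2() run. This is only a sketch on top of the reprex above (retry_optimization is a name I made up), not a fix for the underlying cause:

# Hypothetical helper: retry parameter_optimization_simple() a few times
# before giving up, because the pdf() failure only happens occasionally.
retry_optimization <- function(dat, channels, nam, smoothing, seq_x, tries = 3) {
  for (attempt in seq_len(tries)) {
    res <- tryCatch(
      parameter_optimization_simple(dat, channels, nam, smoothing, seq_x),
      error = function(e) e
    )
    if (!inherits(res, "error")) return(res)
    message("Attempt ", attempt, " failed for ", nam, ": ", conditionMessage(res))
  }
  NA_real_  # give up on this file after 'tries' attempts
}

opt_param_list <- map2(df.list, names(df.list),
                       ~ retry_optimization(.x, channels, nam = .y,
                                            smoothing = "Y", seq_x = seq_x))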

Comments:

  • What is the full output of rlang::last_trace() on error? Please paste it above; the failure is likely less random once the full stack trace is reviewed. Commented Nov 17 at 19:26
  • The rstudio tag is only appropriate if this behaves differently in the base R terminal (outside the RStudio IDE) than it does when running R inside RStudio. Since the question doesn't suggest this, I'm removing the tag. If you have tested in R outside of RStudio and it behaves differently, please add this to your question when re-adding the tag. Thanks! Commented Nov 17 at 20:44
  • How is the question different from your previous, now deleted, one? Commented Nov 18 at 6:14
  • @r2evans No, the rstudio tag was apparently misleading, thank you for the correction! Commented Nov 18 at 13:08
  • @LTyrone I am sorry. I tried this initially and SO suggested to edit the error message as a picture. Commented Nov 18 at 22:16

