Error in `tempfile()`: cannot find unused tempfile name during unit testing

When running unit tests for some packages (e.g. dplyr), tests fail with the following error:

Error in `tempfile()`: cannot find unused tempfile name
Backtrace:
     ▆
  1. └─testthat::expect_snapshot(desc(mean), error = TRUE) at test-desc.R:2:2
  2.   ├─testthat:::with_is_snapshotting(...)
  3.   └─testthat:::verify_exec(quo_get_expr(x), quo_get_env(x), replay)
  4.     ├─withr::local_pdf(tempfile())
  5.     │ └─withr:::pdf_dev(...)
  6.     │   └─grDevices::pdf(...)
  7.     │     └─grDevices:::checkIntFormat(file)
  8.     │       └─base::gsub("%%", "", s, fixed = TRUE)
  9.     │         └─base::is.factor(x)
 10.     └─base::tempfile()

This appears to be either an OS-level or R-installation configuration issue, as it does not occur on my laptop or on other available servers. There is a thread from 2005 mentioning this problem when using Sweave, but no real solution. Below are the sessionInfo(), sessioninfo::platform_info(), and ulimit -a output.
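In case it helps narrow this down, here is the kind of check I can run in the failing session. These are all base-R calls; the comments describe what a healthy session should report, and `tempdir(check = TRUE)` requires R >= 3.5.0 (this system is on 4.1.3, so it is available):

```r
# Sanity checks on the session temp directory (base R only)
tempdir()                         # where this session's temp dir lives
dir.exists(tempdir())             # should be TRUE if it still exists on disk
file.access(tempdir(), mode = 2)  # 0 if writable, -1 if not
length(list.files(tempdir()))     # how crowded the directory is

# tempdir(check = TRUE) recreates the directory if it was deleted out
# from under the session (e.g. by a tmp cleaner)
tempdir(check = TRUE)
tempfile()                        # does generating a fresh name work now?
```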

> sessionInfo()
R version 4.1.3 (2022-03-10)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Red Hat Enterprise Linux 8.8 (Ootpa)

Matrix products: default
BLAS/LAPACK: /usr/lib64/libopenblasp-r0.3.15.so

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C              
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8    
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8   
 [7] LC_PAPER=en_US.UTF-8       LC_NAME=C                 
 [9] LC_ADDRESS=C               LC_TELEPHONE=C            
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C       

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] dplyr_1.1.2    testthat_3.1.9

loaded via a namespace (and not attached):
 [1] pillar_1.9.0      compiler_4.1.3    dbplyr_2.3.2      lobstr_1.1.2     
 [5] prettyunits_1.1.1 tools_4.1.3       bit_4.0.5         pkgbuild_1.4.1   
 [9] pkgload_1.3.2     memoise_2.0.1     RSQLite_2.3.1     evaluate_0.21    
[13] lifecycle_1.0.3   tibble_3.2.1      pkgconfig_2.0.3   rlang_1.1.1      
[17] cli_3.6.1         DBI_1.1.3         rstudioapi_0.14   fastmap_1.1.1    
[21] withr_2.5.0       generics_0.1.3    desc_1.4.2        vctrs_0.6.3      
[25] bit64_4.0.5       rprojroot_2.0.3   tidyselect_1.2.0  glue_1.6.2       
[29] R6_2.5.1          processx_3.8.1    fansi_1.0.4       waldo_0.5.1      
[33] blob_1.2.4        callr_3.7.3       purrr_1.0.1       magrittr_2.0.3   
[37] ps_1.7.5          ellipsis_0.3.2    utf8_1.2.3        stringi_1.7.12   
[41] cachem_1.0.8      crayon_1.5.2      brio_1.1.3  

> sessioninfo::platform_info()
 setting  value
 version  R version 4.1.3 (2022-03-10)
 os       Red Hat Enterprise Linux 8.8 (Ootpa)
 system   x86_64, linux-gnu
 ui       RStudio
 language (EN)
 collate  en_US.UTF-8
 ctype    en_US.UTF-8
 tz       America/New_York
 date     2023-09-25
 rstudio  2023.03.0+386.pro1 Cherry Blossom (server)
 pandoc   2.0.6 @ /usr/bin/pandoc
[dplyr]$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 502453
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 4096
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 502453
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

I have tried using a different temp directory to rule out disk space as the cause. I have also compared system settings to confirm that the user limit on open files is the same on this system as on one where the error does not occur; they are identical.

I have also tried deleting everything in the tempdir, as suggested in ‘Warning: Error in tempfile: cannot find unused tempfile name’ when rendering multiple R Markdowns, but to no avail.
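For reference, the reset I attempted was roughly the following. The `unlink()` call and `tempdir(check = TRUE)` are my own sketch of that thread's suggestion, not code copied from it:

```r
# Clear out everything under the session temp dir
old <- list.files(tempdir(), full.names = TRUE)
unlink(old, recursive = TRUE, force = TRUE)

# Ask R to recreate the temp dir if it has gone missing (R >= 3.5.0)
tempdir(check = TRUE)

# Then retry the failing call
tempfile()
```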
