OOM in R on POWER9
I'm requesting 32 GB of memory (mem=32000) on POWER9 and then loading a 2.5 GB file in R. According to the R function object.size, the loaded data occupies 5.6 GB in memory. I assumed that would be enough to apply an s2dverification function, but R crashes when I call Clim on the data with:
slurmstepd: error: Detected 1 oom-kill event(s) in step 2570280.batch cgroup. Some of your processes may have been killed by the cgroup out-of-memory handler.
When I increase the requested memory to mem=50000, it works.
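My only guess so far is a rough back-of-envelope (an assumption on my part, not something I measured): if Clim temporarily holds a few copies of the 5.6 GB array on top of the two arguments I pass to it, the total quickly crosses 32 GB but stays under 50 GB:

```shell
# Back-of-envelope only: assume the function may create 2-4 temporary
# copies of the 5.6 GB input in addition to the two arguments passed in.
obj_mb=5600                                   # object.size of the data, in MB
for copies in 2 3 4; do
  total=$(( obj_mb * 2 + obj_mb * copies ))   # two arguments + temporaries
  echo "${copies} temporary copies -> ${total} MB"
done
```

With 4 temporary copies that is 33600 MB, which would explain why 32000 fails and 50000 succeeds, but this is pure guesswork.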
Here is the full script:
#!/usr/bin/env bash
#SBATCH --ntasks 1
#SBATCH -J clim.check.p9
#SBATCH --mem=32000
#SBATCH --output log_%x-%j.out
#SBATCH --time 00:30:00
module load R
set -ex
wd=tmp/$$
mkdir -p $wd
cat > $wd/script.r << EOF
.libPaths(new = '/gpfs/projects/bsc32/share/R_libs/3.5')
library(s2dverification)
load("/esarchive/scratch/swild/dcpp_diag/data/Rdata/tastos/ece33_dcpp_tastos_annual_mem1-10fcy1-10.Rdata")
print("data loaded")
print(format(object.size(var),units="MB"))
clim <- Clim(var,var,memb=TRUE)$clim_exp
print(str(clim))
EOF
Rscript --verbose $wd/script.r
Is there a way to find out in advance how much memory I would need?
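After the fact I can at least see what a finished or oom-killed run actually used (assuming Slurm accounting is enabled on the cluster; the sacct fields below are standard ones), but I'd like an estimate up front:

```shell
# Query Slurm accounting for the peak memory of a completed job.
# 2570280 is the job id from the oom-kill message above; MaxRSS is the
# peak resident set size the step actually used.
jobid=2570280
if command -v sacct >/dev/null 2>&1; then
  sacct -j "$jobid" --format=JobID,State,ReqMem,MaxRSS,MaxVMSize
else
  echo "sacct not available here"
fi
```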