s2dverification issues
https://earth.bsc.es/gitlab/es/s2dverification/-/issues

Issue #261: SVD(): Problematic documentation (2021-05-18, aho)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/261

A user reported:
> SVD is a misleading name for a function that does MCA and CCA (Maximum Covariance Analysis / Canonical Correlation Analysis). Also, in the function documentation, SVD should stand for "singular value decomposition", not "single value decomposition".

Issue #232: WeatherRegime sensitive to node (2019-10-29, Edu)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/232

Hi, @nperez @aho
We (@portega and I) are using the WeatherRegime function to compute k-means clusters of sea ice concentration in various simulations and observations. The problem is that we have found the results to be strongly sensitive to the machine. If I use `Power9`, I get very different clusters every time I run the scripts. For example, these plots show spatial correlations between simulated and observed clusters (each color is one cluster; there are three). The results vary a lot and make any meaningful interpretation nearly impossible.
Example 1 (top is results in winter, bottom in summer; clusters calculated yesterday)
![cluster-power9-1](/uploads/904894bd7fe456def719351c228a0bc5/cluster-power9-1.png)
Example 2 (clusters from yesterday too)
![cluster-power9-2](/uploads/2dc8bf1c6a9f18666f02ff364fa26591/cluster-power9-2.png)
When I run the same scripts in `bsceslogin1`, changing the modules I load as
```
# ==== LOAD PACKAGES
# ==== bsceslogin1
module load R/3.2.0-foss-2015a-bare
module load CDO/1.9.0-foss-2015a
# ==== POWER9
#module load R/3.5.0-foss-2018b
#module load CDO/1.9.4-foss-2018b
#module load s2dverification/2.8.3-foss-2018b-R-3.5.0
```
I get the following:
Example 1 (clusters calculated in January or so)
![cluster-fatnode-1](/uploads/10b277a367a7abebc139e963215cfadf/cluster-fatnode-1.png)
Example 2 (clusters calculated yesterday)
![cluster-fatnode-2](/uploads/bb56d3e1c682cd23ac2da55fca4f4da4/cluster-fatnode-2.png)
The module `R/3.2.0-foss-2015a-bare` is not available in `power9`. It's not CDO because the anomaly fields from which clusters are calculated are identical every time. Results are only different after using WeatherRegime.
Any ideas why this happens? I know many details are missing, so please feel free to ask for anything.
I can also get you the scripts we use.
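One likely source of this kind of machine-to-machine variability (a guess, not a diagnosis of the WeatherRegime internals) is that k-means starts from randomly sampled centroids, so any difference in RNG state or defaults changes the solution. A minimal, self-contained sketch with toy data:

```r
# Illustrative toy example, not the WeatherRegime code: kmeans() picks its
# initial centroids at random, so an unseeded RNG (or different RNG defaults
# across R versions) can yield different clusters on each run or machine.
set.seed(42)                                  # fix the RNG state
data <- matrix(rnorm(300 * 10), nrow = 300)   # toy stand-in for anomaly fields
km <- kmeans(data, centers = 3, nstart = 100, iter.max = 100)
km$size                                       # cluster sizes, now reproducible
```

If WeatherRegime exposes no seed control, calling set.seed() immediately before it, and increasing the number of restarts if possible, may stabilise the clusters.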
Thanks!

Issue #220: CDORemap crashing with dimension named 'time' (2018-11-22, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/220

As reported by @etourign, when an array with a 'time' dimension is provided to CDORemap and it is not the shortest dimension, CDORemap crashes due to 'unlimited dimension in wrong position'. CDORemap should automatically rename the time dimension to a temporary name, interpolate, and then rename it back.

Issue #214: Test whether the difference between 2 correlations is significant (2018-05-29, Eleftheria Exarchou)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/214

Hi @nmanubens, I'd like to ask whether a Steiger's test (described here: https://journals.ametsoc.org/doi/pdf/10.1175/MWR-D-16-0037.1) is available within s2dverification and, if so, how to use it. I basically would like to test whether the difference between two correlations is significant.
Assignee: Nicolau Manubens Gil

Issue #211: LoadMembersChunks fails too often (2018-05-04, Eleftheria Exarchou)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/211

@nmanubens I have an issue with LoadMembersChunks: it is very sensitive to errors. If, for example, one file is missing or has fewer lead times than specified, the call fails and then the whole script fails. This is an issue because you need to wait a couple of hours for this Load call to finish, so it has already happened a few times that I left it running in the evening hoping to have the data in the morning, without success. And debugging it also takes hours, because with so many files (one file per chunk) the chance that some file is missing or corrupt is high. Couldn't it simply be that in these cases there is only a warning at the very end, and s2dverification fills the missing data with NaNs? Perhaps this behaviour could be optional with a flag?
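A minimal sketch of the behaviour being requested; read_chunk_safe(), read_fun, and chunk_dims are hypothetical names for illustration, not part of the s2dverification API:

```r
# Hedged sketch: wrap each chunk read so that a missing or corrupt file yields
# NA-filled data plus a warning, instead of aborting the whole Load call.
read_chunk_safe <- function(read_fun, file, chunk_dims, allow_missing = TRUE) {
  tryCatch(read_fun(file), error = function(e) {
    if (!allow_missing) stop(e)
    # Emit a warning and fill the chunk with NA instead of failing.
    warning("Could not read '", file, "': filled with NA (",
            conditionMessage(e), ")")
    array(NA_real_, dim = chunk_dims)
  })
}
```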
I tag @jacosta because he had the same issue too.

Issue #207: CDORemap: bugs with metadata and cdo remap (2017-10-09, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/207

As reported by @amanriqu, CDORemap crashes in the following situation due to a wrong interpretation of metadata:
```r
mask_path <- '/esnas/exp/ecearth/constant/land_sea_mask_2560x1280.nc'
mask <- NcToArray(mask_path, vars_to_read = 'LSM', drop_var_dim = TRUE)
lats <- NcToArray(mask_path, vars_to_read = 'lat', drop_var_dim = TRUE)
lons <- NcToArray(mask_path, vars_to_read = 'lon', drop_var_dim = TRUE)
new_mask <- CDORemap(mask, lons, lats, europe.MSWEP$source_files[1], 'bilinear')
```
Ignoring the metadata fixes that issue, but then another bug appears: CDORemap expects the dimension names in the remapped file to be 'lon' and 'lat', while the actual names are 'longitude' and 'latitude':
```r
mask_path <- '/esnas/exp/ecearth/constant/land_sea_mask_2560x1280.nc'
mask <- NcToArray(mask_path, vars_to_read = 'LSM', drop_var_dim = TRUE)
attr(mask, 'variables') <- NULL
lats <- NcToArray(mask_path, vars_to_read = 'lat', drop_var_dim = TRUE)
attr(lats, 'variables') <- NULL
lons <- NcToArray(mask_path, vars_to_read = 'lon', drop_var_dim = TRUE)
attr(lons, 'variables') <- NULL
new_mask <- CDORemap(mask, lons, lats, europe.MSWEP$source_files[1], 'bilinear')
```
Assignee: Nicolau Manubens Gil

Issue #203: Load(): Bug when interpolating data files with more than 4 dimensions (2017-11-15, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/203

As reported in https://earth.bsc.es/gitlab/es/requests/issues/116, cdo griddes cannot detect the grid in files with more than 4 dimensions. This makes `Load()` crash.
`CDORemap()` can interpolate arrays with more than 4 dimensions by creating intermediate files whose arrays have at most 4 dimensions. The new Load should be built on top of CDORemap to solve this issue.
Assignee: Nicolau Manubens Gil

Issue #201: Load - specify xmin/xmax ymin/ymax for interpolation (2017-11-15, Alasdair Hunter)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/201

Can an option be added to Load() so that the user can specify the initial lonmin/latmin and the increment when interpolating the data?
This could be done using the xfirst, xinc and yfirst, yinc options from CDO (unless I have misunderstood something).
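For reference, the CDO mechanism referred to here takes a plain-text grid description file, which a remap command such as `cdo remapbil,gridfile in.nc out.nc` can then target. The values below are illustrative, not a recommendation:

```
gridtype = lonlat
xsize    = 360
ysize    = 180
xfirst   = -179.5
xinc     = 1.0
yfirst   = -89.5
yinc     = 1.0
```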
cheers,
Alasdair
Assignee: Nicolau Manubens Gil

Issue #198: CDORemap: it should automatically put the 'time' dimension at the end of each chunk sent to cdo (2017-11-15, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/198

The following crashes, because the chunks sent to cdo have dims (time, lon, lat):
```R
library(startR)
repos_path <- '/esarchive/exp/ecearth/i00k/monthly_mean/heatc/ohc_2d_avg_0-300m_i00k_$sdate$_fc0-4_*.nc'
heatc <- Start(repos = repos_path,
sdate = 'first',
var = 'heatc_sl',
time = 'all',
ensemble = 'all',
x = 'all',
y = 'all',
return_vars = list(var_names = NULL),
var_var = 'var_names')
mask_path <- '/esnas/autosubmit/con_files/mask.regions.Ec3.0_O1L46.nc'
lon <- Start(repos = mask_path,
var = 'nav_lon',
t = 'first',
z = 'first',
x = 'all',
y = 'all',
return_vars = list(var_names = NULL),
var_var = 'var_names')
lat <- Start(repos = mask_path,
var = 'nav_lat',
t = 'first',
z = 'first',
x = 'all',
y = 'all',
return_vars = list(var_names = NULL),
var_var = 'var_names')
library(s2dverification)
heatc2 <- heatc
lon2 <- lon
lat2 <- lat
attr(heatc2$Data, 'variables') <- NULL
names(dim(heatc2$Data))[4] <- 't'
attr(lon2$Data, 'variables') <- NULL
attr(lat2$Data, 'variables') <- NULL
heatc3 <- CDORemap(Subset(heatc2$Data, 1:3, list(1, 1, 1), drop = 'selected'),
Subset(lon2$Data, 1:4, list(1, 1, 1, 1), drop = 'selected'),
Subset(lat2$Data, 1:4, list(1, 1, 1, 1), drop = 'selected'),
't106grid', 'bil')
Error in R_nc4_def_var_float: NetCDF: NC_UNLIMITED in the wrong index
Name of variable that the error occurred on: "var"
[1] "----------------------"
[1] "Var: var"
[1] "Ndims: 3"
[1] "Dimids: "
[1] 2 1 0
Error in ncvar_add(nc, vars[[ivar]], verbose = verbose, indefine = TRUE) :
Error in ncvar_add, defining var var
```

Issue #197: Check the longitudes when loading data (2017-11-15, Martin Menegoz)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/197

After some discussion with @cprodhomme and @rfernand, I open this discussion to point out an issue that I found when loading data with Load with specific longitudes. I have found that most of the issues that we had in the past (a long time ago, in IC3) have been solved. I did some tests playing with the longitudes in one CNRM-CM5 experiment. Load provides consistent results when playing with negative or positive longitudes, whether crossing the Greenwich meridian or not, except in one configuration, (-180;+180), where I found a bug:
(0;360) or (-180;+179.9) give correct and consistent results. That's not the case for (-180;+180), which cannot be used. You can see the script and the data that I used to look at this issue here: /esnas/scratch/mmenegoz/MORDICUS/36_members/ENSO/test_load/test_load_longitudes.R
And the plot that shows the difference in temperature in the Tropics between data loaded with (0;360) and with (-180;+180): [Temperature_tropic_1991_diff2.eps](/uploads/7b712d2409b778c83cbdd948a11fdc6d/Temperature_tropic_1991_diff2.eps)
I think we should also check that there are no more issues when loading data from other models, where the longitudes and latitudes are not ordered as in CNRM-CM5.
Cheers,
Martin
Assignee: Nicolau Manubens Gil

Issue #185: Problem with extrapolation in Load (2017-11-15, Chloé Prodhomme)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/185

Hi everybody,
Following issue #182, I open this new issue: with @mmenegoz we found some serious issues with the extrapolation implemented in Load. Basically, I suggest everybody be very careful and completely avoid extrapolating (going from low to high resolution) with Load, so always choose the coarser grid of model and observations as the target in Load.
Below is an illustration of the problem with extrapolation:
Here I show the difference between ERA-Land and ERA-Interim at two resolutions (the t106 grid and the ERA40 grid, r144x73).![obs_diff_ERA40-ERAint_gridt1062000grid](/uploads/f130012c76e193dd863644283424f17c/obs_diff_ERA40-ERAint_gridt1062000grid.png)![obs_diff_ERA40-ERAint_grid144x732000grid](/uploads/30485005a12378c7133964e24ca480c5/obs_diff_ERA40-ERAint_grid144x732000grid.png)
What you can see is that the products are strongly different (up to 3º) on the coarse grid. However, when you extrapolate to the higher-resolution grid, you see those very strange bands appearing. While it is always bad to extrapolate from low to high resolution, the result in this concrete case is worse than could be expected. This shows that the interpolation method we are using is not valid at all for extrapolation.
Therefore, I would suggest the following actions:
- Short term: choose the coarser grid as the default instead of the model grid, or, to avoid breaking compatibility, print a warning when an extrapolation is performed.
- Short term: explain this recommendation in the documentation.
- Long term: work on implementing other extrapolation/interpolation methods that could be offered as options for Load.
Assignee: Nicolau Manubens Gil

Issue #183: Possibility to easily change a plot (2019-09-04, Martin Menegoz)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/183

Hi,
Very often we have to make nice plots for papers/reports, and for that we often need to make slight modifications to the s2dverification scripts to add some specific touches to our plots.
In the past, I would copy PlotAno.R locally, for example, to manually modify a line thickness or color. It was a quick way to do it, and I did not create a new branch because my modifications were really small and very specific to my application. Nor did I want to add too many arguments to the PlotAno.R function; otherwise it would soon have far too many.
Now we cannot do that any more, because some new functionality in PlotAno.R (and probably in other functions) prevents running them standalone after making small modifications, for example this snippet in PlotAno.R:
    if (!is.null(fileout)) {
      deviceInfo <- .SelectDevice(fileout)
      saveToFile <- deviceInfo$fun
      fileout <- deviceInfo$files
    }
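A possible workaround (a sketch, not an officially supported pattern): open the graphics device yourself and call the plotting function with fileout = NULL, so the .SelectDevice branch above is never entered. The file name and the plot() stand-in are illustrative:

```r
# Hedged workaround sketch: open the device manually and pass fileout = NULL
# to the plotting function, skipping its internal device selection.
png('ano_plot.png', width = 800, height = 600)   # hypothetical output name
plot(1:10, type = 'l', lwd = 3)  # stand-in for PlotAno(..., fileout = NULL)
dev.off()                        # close the device to flush the file
```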
@nmanubens, what do you recommend when we want to include some small changes in functions like PlotAno.R?
For example, today, I would like just to increase the thickness of the black curve corresponding to the observations in PlotAno.R.
All the best,
Martin
PS: @cprodhomme may be interested in your answer.
Assignee: Nicolau Manubens Gil

Issue #179: Load(): Interpolation issue (2017-11-15, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/179

As @nmishra reported, there's an issue when loading ncep/system2_m1 data.
```r
library(s2dverification)
library(ncdf)
# mask for MODEL
# --------------
fnc <- open.ncdf("/esnas/exp/ecearth/land_sea_mask_512x256.nc")
lataux <- get.var.ncdf(fnc,"lat")
lot <- get.var.ncdf(fnc,"lon")
maskaux <- get.var.ncdf(fnc,"LSM")
close.ncdf(fnc)
nlat=length(lataux)
if (lataux[1]>0){
lataux=rev(lataux)
maskmod=maskaux[,seq(nlat,1)]
}
maskmod[which(maskmod > 0.5)] <- 1
maskmod[which(maskmod < 0.5)] <- 0
#maskmod=t(maskmod)
listmaskmod=list(maskmod, maskmod, maskmod, maskmod)
# mask for OBSERVATION
# --------------------
#fnc <- open.ncdf("/esnas/exp/ecearth/land_sea_mask_512x256.nc")
#maskobs <- get.var.ncdf(fnc,"LSM")
#close.ncdf(fnc)
#maskobs[which(is.na(maskmod))] <- 1
#listmaskobs=list(maskobs)
#masklst=list("/esnas/exp/ecearth/land_sea_mask_512x256.nc")
# load NOVEMBER start date seasonal data - DJB
# --------------------------------------------
# generate dataseq for NOVEMBER start date
# ----------------------------------------
Novstart <- as.Date("19921101", "%Y%m%d")
#Novend <- as.Date("19921101", "%Y%m%d")
Novend <- as.Date("20121101", "%Y%m%d")
Novdateseq <- format(seq(Novstart, Novend, by = "year"), "%Y%m%d")
# NovStartData=Load("tas", #prlr
# c( "glosea5_sea", "ECMWF_S4_sea", "NCEP_sea", "MF_sea"),
# obs = "ERAint", #GPCP
# sdates = Novdateseq, leadtimemin = 2, leadtimemax = 4,
# lonmin = -20, lonmax = 70, latmin = 25, latmax = 75,
# storefreq = "monthly", sampleperiod = 1, nmember = 51, output = "lonlat",
# maskmod = listmaskmod, #maskobs = listmaskobs,
# grid = "r512x256",
# configfile = "/home/Earth/nmishra/s2dv_test/BSC_chloe.conf")
glosea5 <- list(path = '/esnas/exp/glosea5/specs-seasonal_i1p1/$STORE_FREQ$_mean/allmemb/$VAR_NAME$/$VAR_NAME$_$START_DATE$.nc')
NovStartData <- Load("tas", #'prlr'
exp = list(glosea5,
list(name = 'ecmwf/system4_m1'),
list(name = 'ncep/system2_m1'),
list(name = 'meteofrance/system4_m1')),
obs = c('erainterim'), #'gpcp_v2.2'
sdates = Novdateseq,
leadtimemin = 2, leadtimemax = 4,
lonmin = -20, lonmax = 70,
latmin = 25, latmax = 75,
storefreq = "monthly", sampleperiod = 1,
nmember = 51, output = "lonlat",
maskmod = listmaskmod,
#maskobs = listmaskobs,
grid = "r512x256")
```
Assignee: Nicolau Manubens Gil

Issue #175: Function to compute reliability according to Weisheimer and Palmer (2014) (2017-11-15, Omar Bellprat)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/175

Hi,
I have written a function which computes the reliability scores according to:
Weisheimer, A. and T. N. Palmer (2014), On the reliability of seasonal climate forecasts. J. R. Soc. Interface, 11 (96), 20131162, doi:10.1098/rsif.2013.1162.
Is that something useful for s2dverification, even though it's a probabilistic measure? The function computes the reliability for different classes but does not plot the reliability diagram, which I have been doing with SpecsVerification.
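For context, a toy sketch of the binning step at the heart of a reliability computation; this is not the function discussed here, and Weisheimer & Palmer (2014) additionally categorise forecasts by the slope of a weighted regression through these points:

```r
# Toy data: forecast probabilities and binary outcomes, reliable by design.
set.seed(1)
p <- runif(500)                # toy forecast probabilities
o <- rbinom(500, 1, p)         # toy binary observations
bins <- cut(p, breaks = seq(0, 1, by = 0.2), include.lowest = TRUE)
fcst_prob <- tapply(p, bins, mean)   # mean forecast probability per bin
obs_freq  <- tapply(o, bins, mean)   # observed relative frequency per bin
cbind(fcst_prob, obs_freq)           # points of the reliability diagram
```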
@nmanubens @vtorralba @cprodhomme @lcaron

Issue #173: PLOT failed in auto-ecearth: Error in Clim(toto1$mod, toto1$obs, memb = TRUE) (2017-11-15, Miguel Castrillo <miguel.castrillo@bsc.es>)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/173

I got this error in test t00z (/esnas/autosubmit/t00z), ORCA1L75-LIM3 (nemo3 only); what could it be? EC-Earth experiments don't throw any error.
Apparently we are using commit 5fe4388f8f1337144e231dbeb65636f9aa7b118c; are we outdated, @nmanubens?
Log file at /esnas/scratch/cfu/mcastril/t00z/LOG_t00z/t00z_19900101_fc0_2_PLOT-15101.err
````
Attaching package: ‘s2dverification’
The following object is masked from ‘package:base’:
Filter
Error in Clim(toto1$mod, toto1$obs, memb = TRUE) :
At least 4 dim needed : c(nexp/nobs, nmemb, nsdates, nltime)
Execution halted
````
Assignee: Nicolau Manubens Gil

Issue #172: Load(): Leak of shared memory when force stopping (2017-11-15, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/172
Assignee: Nicolau Manubens Gil

Issue #171: Load(): Launching multiple processes in computing platforms even if only 1 processor requested (2016-10-17, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/171
Assignee: Nicolau Manubens Gil

Issue #170: Load(): Not able to load cmorized files by default because CMOR convention is 'lon' and 'lat' (2016-09-14, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/170
Assignee: Nicolau Manubens Gil

Issue #169: Load(): Crashing randomly when working on SMP (2020-10-07, Nicolau Manubens Gil)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/169

Load() was used several times. 20% of the times it crashed, but running it again with the same parameters would eventually work. As reported by @llledo.
```r
* The load call you issued is:
* Load(var = "va60ml", exp = list(structure(list(name = "eraint", path =
* "/esnas/reconstructions/ecmwf/$EXP_NAME$/6hourly/$VAR_NAME$/$VAR_NAME$_$YEAR$$MONTH$.nc"),
* .Names = c("name", "path"))), obs = NULL, sdates = "19941001",
* grid = NULL, output = "lonlat", storefreq = "monthly", ...)
* See the full call in '$load_parameters' after Load() finishes.
* Fetching first experimental files to work out 'var_exp' size...
* Exploring dimensions...
* /esnas/reconstructions/ecmwf/eraint/6hourly/va60ml/va60ml_199410.nc
* Success. Detected dimensions of experimental data: 1, 1, 1, 124, 256,
* 512
* Fetching first observational files to work out 'var_obs' size...
* Success. Detected dimensions of observational data: 0
* Will now proceed to read and process 1 data files:
* /esnas/reconstructions/ecmwf/eraint/6hourly/va60ml/va60ml_199410.nc
* Total size of requested data: 130023424 bytes.
* - Experimental data: ( 1 x 1 x 1 x 124 x 256 x 512 ) x 8 bytes =
* 130023424 bytes.
* - Observational data: ( 0 ) x 8 bytes = 0 bytes.
* If size of requested data is close to or above the free shared RAM
* memory, R will crash.
Error in attach.resource(obj, ...) :
Fatal error in attach: big.matrix could not be attached.
Error: dims [product 16252928] do not match the length of object [0]
Execution halted
```
Milestone: Release 3.0.0 to CRAN
Assignee: Nicolau Manubens Gil

Issue #167: Score function timings (2016-08-31, ahunter)
https://earth.bsc.es/gitlab/es/s2dverification/-/issues/167

The score functions have been modified to be compatible with the veriApply wrapper in easyVerification. The veriApply function enables the use of multiple cores for parallel processing, which should speed up the computation time.
However, when using the score functions with veriApply on a single core, the computation time is greatly increased. Is it worthwhile developing our own wrapper function, similar to veriApply but more efficient?
Results of timing experiments for the "Corr" correlation function, applied to a typical dataset (elapsed times in seconds):
- Original function: 150.682
- New function with veriApply: 303.635
- New function looped over the array: 177.273
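A lightweight wrapper of the kind discussed could look like the sketch below; score_wrapper and the member x sdate x lat x lon dimension order are illustrative assumptions, not an s2dverification API:

```r
# Apply a score function at each grid point of an array, without the
# per-call overhead of a generic wrapper.
score_wrapper <- function(score_fun, exp, obs) {
  # exp: array(member, sdate, lat, lon); obs: array(sdate, lat, lon)
  nlat <- dim(exp)[3]
  nlon <- dim(exp)[4]
  out <- matrix(NA_real_, nlat, nlon)
  for (i in seq_len(nlat)) {
    for (j in seq_len(nlon)) {
      out[i, j] <- score_fun(exp[, , i, j], obs[, i, j])
    }
  }
  out
}

# Toy usage with a simple ensemble-mean correlation as the score:
corr_score <- function(e, o) cor(colMeans(e), o)
exp <- array(rnorm(2 * 20 * 4 * 5), c(2, 20, 4, 5))  # member, sdate, lat, lon
obs <- array(rnorm(20 * 4 * 5), c(20, 4, 5))         # sdate, lat, lon
res <- score_wrapper(corr_score, exp, obs)           # one score per grid point
```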