Chatted off-line... I'm happy with the development.
There is an internal parameter, distr, that might need to be adjustable depending on the variable.
It worked well! Thank you!
Sorry!
Yes, this is the follow-up of the case that we discussed yesterday.
I modified recipe$Analysis$Time$ftime_max, data$hcst$Dates$start, data$obs$Dates$start, and recipe$Analysis$Variables$freq.
I also added 'bias' to the outputs from compute_skill_metrics(), while no modifications were made to the probabilities.
Then the error appeared when running save_data().
Chung
Another error appears...
> save_data(recipe = recipe, data = data, skill_metrics = skill_metrics,
+ probabilities = probabilities)
Error in ClimProjDiags::Subset(as.Date(data_cube$Dates$start), "syear", :
Input array 'x' must be a numeric array.
Could you provide some hints, please?
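For what it's worth, one thing I noticed while poking at this: as.Date() on a dimensioned array drops the dim attribute, so the result is no longer an array, which would match the "must be a numeric array" complaint from Subset(). A minimal base-R sketch (the dimension name here is just illustrative):

```r
# A character array of dates with a named dimension, mimicking data$hcst$Dates$start
dates <- array(c("1993-01-01", "1994-01-01", "1995-01-01"),
               dim = c(syear = 3))

converted <- as.Date(dates)   # as.Date() drops the dim attribute
is.array(converted)           # FALSE: no longer an array

# Restoring the dims (as numeric days since epoch) gives Subset() a numeric array
restored <- array(as.numeric(as.Date(dates)), dim = dim(dates))
is.array(restored)            # TRUE
```
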
Another question pops up...
The directory names of the output nc files after "output_dir" would use the $Analysis$Variables$freq of the recipe, right?
In this case, I will have to change it before running save_data() as well; otherwise, it will overwrite the outputs, right?
That will be great. Thanks!
Here is the script: /esarchive/scratch/cchou/FOCUS/seasonal-verification/task3/tas.system5c3s.raw.R
The first part of the for loop computed the seasonal part.
I see. Sorry for that.
I tried to avoid loading the same data set twice, because I have to do both the monthly and the seasonal verification for the same start date.
What I did was: load the three months, take the first month for the monthly part and save the monthly results, then, with the same input data, compute the three-month average for the seasonal part before saving the seasonal results.
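In case it helps, the reuse I describe looks roughly like this (a toy base-R sketch; the variable names and dimension sizes are illustrative, not the actual script):

```r
# Load once: 3 forecast months for the same start date
# toy dims: (time = 3, lat = 2, lon = 2)
data_3m <- array(seq_len(3 * 2 * 2), dim = c(time = 3, lat = 2, lon = 2))

# Monthly part: keep only the first forecast month
monthly <- data_3m[1, , , drop = FALSE]

# Seasonal part: average the three months along 'time', reusing the same input
seasonal <- apply(data_3m, c(2, 3), mean)
dim(seasonal) <- c(time = 1, lat = 2, lon = 2)
```
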
XD
Chung
Thanks a lot for the improvements.
I got an error from the test submitted last week (please see the errors below), and it raised a question regarding the saving module (the latest improvement has not yet been updated).
Error in ArrayToNc(vars, outfile) :
The dimension 'time' is defined or used more than once in the provided data but the dimension specifications do not match.
Calls: save_data -> save_metrics -> ArrayToNc
In addition: There were 50 or more warnings (use warnings() to see the first 50)
Execution halted
The pathway of the above error file: /esarchive/scratch/cchou/FOCUS/seasonal-verification/task3/sh.output/F.bias-1030699.err
I wonder if the above error came from the fact that I requested 3 months of forecast time in recipe$Analysis$Time$ftime_max, but took only the first forecast month for the following computations, including saving the outputs (i.e., calibrated hindcast, skill metrics, probabilities).
When save_data() was run, did it take $Analysis$Time$ftime_max for the time dimension in the nc output files?
If so, is there a way to avoid this inconsistency in the time dimension between the recipe and the rest of the data sets?
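To make the suspected mismatch concrete, a sanity check like this before save_data() would flag it (the recipe structure follows the fields above; the check itself is just a sketch with toy data):

```r
# Toy recipe and data mimicking the structures discussed above
recipe <- list(Analysis = list(Time = list(ftime_max = 3)))

# After keeping only the first forecast month, the 'time' dim of the data is 1
skill_metric <- array(rnorm(1 * 2 * 2), dim = c(time = 1, lat = 2, lon = 2))

n_time_recipe <- recipe$Analysis$Time$ftime_max
n_time_data   <- dim(skill_metric)[["time"]]

if (n_time_recipe != n_time_data) {
  # This is the inconsistency I suspect trips up the nc writing step
  message("time mismatch: recipe says ", n_time_recipe,
          " but data has ", n_time_data)
}
```
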
Thanks in advance.
Chung
Hi @vagudets
Could I specify the "method" when using "qmap" to calibrate the data? If not, would it be a good idea to unfold qmap the way CST_Calibration does, i.e., offer 'PTF', 'DIST', 'RQUANT', 'QUANT', and 'SSPLIN' instead of a single 'qmap' option?
Kind regards,
Chung
Sorry, I missed this request. It works with the develop-PlotEquiMap().
Thank you.
Chung
Hi @aho,
I am trying to use PlotEquiMap within PlotLayout, with the dots provided for showing the significant points.
However, when an array of all ones (all TRUE) was manually provided, the dots showed a spatial pattern but did not fully cover the domain. Here is a sample script.
rm(list = ls())
library(s2dv)
load('/esarchive/scratch/cchou/MEDGOLD/cs_wine_paper/data/test.RData')
PlotLayout(fun = PlotEquiMap,
           plot_dims = c('lat', 'lon'),
           var = CORR,
           special_args = DOT,
           dot_symbol = 19, dot_size = 0.1,
           brks = brks,
           cols = cols,
           col_inf = 'white',
           colNA = 'lightgrey',
           triangle_ends = c(TRUE, FALSE),
           filled.continents = filled.continents,
           filled.oceans = filled.oceans,
           titles = names,
           title_scale = 0.1,
           subtitle_scale = 0.7,
           axelab = FALSE, units = units,
           nrow = 4, ncol = 2,
           fileout = 'test.png',
           width = 4,
           height = 9)
Could you take a look, please?
Thanks,
Chung
Thank you @aho, I will try with other combinations.
Hi @vagudets,
I am using both the ESS Verification Tool and CSDownscale for the seasonal verification tasks in FOCUS-Africa.
Since interpolation and bias correction are combined in some CSDownscale functions that I may use later, I wonder if it would be a good idea to include CSDownscale in the tool, similar to what the current function "calibrate_datasets" does.
Another possible solution could be to include the other 'bias correction' methods of CSDownscale (e.g., linear regression and analogs) in the tool.
@jramon @rmarcos , please let me know what you think about this.
Thank you.
Kind regards,
Chung
Hi @vagudets,
Thanks for the explanation.
I don't mind whether the forecast time starts at 0 or 1, as long as it is clearly described in the comment after '#'.
Regarding the percentile/probability, it reminds me of the computation of some indicators (related to some functions in CSIndicators), where the function converts all the values to quantiles before checking whether the quantiles exceed a threshold (if so, it is counted as one, for instance). Now this seems to be irrelevant XD...
In my opinion, to avoid (my own, XD) confusion about a parameter (like the prob or percentile here): if this parameter is used ONLY in the SKILL module, I would go for Victoria's format. However, if 'Percentiles:prob' would be used both INSIDE and OUTSIDE of the SKILL section (for instance, to compute an indicator or a threshold before verification), then Alba's would be better.
Just my two cents
Chung
Thank you @aho.
I have tried the normal and gamma distributions; I thought either one of them should work with precipitation data...
I will try different ones or pass some other arguments.
Kind regards,
Chung
Hi @aho,
I tried different distributions (also qstep and start.fun, following the example provided in 'qmap'), but it doesn't work for my precipitation data.
Here is the error message.
<simpleError in optim(par = vstart, fn = fnobj, fix.arg = fix.arg, obs = data, gr = gradient, ddistnam = ddistname, hessian = TRUE, method = meth, lower = lower, upper = upper, ...): function cannot be evaluated at initial parameters>
Error in do.ply(i) :
task 1 failed - "'mledist' failed to estimate parameters for 'mod' with the error code 100"
Even after I removed all the negative values and converted the units so that the values are not too small, it still doesn't work and returns the same error.
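Just to illustrate what I am trying to achieve (this is plain empirical quantile mapping in base R, not the qmap internals): parametric fits like the 'mledist' step can fail when the sample is dominated by exact zeros, which is typical for precipitation, whereas an empirical mapping sidesteps the fitting entirely.

```r
set.seed(1)
# Toy precipitation-like samples: many exact zeros plus a gamma wet tail
obs <- c(rep(0, 60), rgamma(40, shape = 0.8, scale = 4))
mod <- c(rep(0, 70), rgamma(30, shape = 0.5, scale = 8))

# Empirical quantile mapping: send each model value through the model ECDF,
# then read off the corresponding observed quantile
probs  <- ecdf(mod)(mod)
mapped <- quantile(obs, probs = probs, names = FALSE, type = 8)

# 'mapped' now roughly follows the observed distribution, zeros included
```
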
Could you please help with it?
I have saved the sample data (see the above same script).
Thank you.
Chung
Hi @aho,
Sorry, I should have updated the progress on this issue. When using the sample dataset provided above (which is temperature), it worked fine after adding dist = norm. But later I found the precipitation data didn't like it and failed again with the same error. So I am about to test another distribution and see whether the precipitation is happy or not. I will update as soon as I can.
Thank you.
Chung