# Changes between Version 3 and Version 4 of udg/ecoms/RPackage/examples/verification

Timestamp:
May 12, 2016 4:59:25 PM

}}}

[[Image(image-20160512-161841.png)]]

The results reveal a significant cold bias in the CFSv2 model predictions.

=== Correlation

We follow a similar approach to compute the correlation of the ensemble-mean forecast against the verifying observations:

{{{#!text/R
corr <- veriApply("EnsCorr", fcst = mn.tx.forecast$Data, obs = mn.tx.obsintp$Data,
                  ensdim = 1, tdim = 2)
fields::image.plot(tx.forecast$xyCoords$x, tx.forecast$xyCoords$y, t(corr),
                   asp = 1, xlab = "", ylab = "", main = "Mean tmax correlation - JJA")
downscaleR:::draw.world.lines()
}}}

[[Image(image-20160512-162800.png)]]

The ensemble-mean summer forecasts for 1991-2000 correlate well with the verifying observations over the north-western sector of the analysis area, but they do not skilfully represent the year-to-year variability over the Iberian Peninsula and the Mediterranean area.

=== Ranked probability skill score (RPSS)

We next illustrate the ranked probability skill score (RPSS). Here we use the RPSS for tercile forecasts, i.e. probability forecasts for the three categories colder than average, near average, and warmer than average. To convert the observations and forecasts into probabilities for these three categories, we pass the additional argument prob to the veriApply function, giving the quantile boundaries of the chosen categories. To validate on the terciles we set prob = c(1/3, 2/3), as indicated next:

{{{#!text/R
rpss <- veriApply("EnsRpss", fcst = mn.tx.forecast$Data, obs = mn.tx.obsintp$Data,
                  prob = c(1/3, 2/3), ensdim = 1, tdim = 2)
}}}

In this case the output is a list with two components. The first is the RPSS lon-lat matrix, as in the previous examples. The second provides the standard error of the score, which is useful to assess its significance at each particular grid point (at the 95% confidence level
in this example):

{{{#!text/R
# RPSS map
fields::image.plot(tx.forecast$xyCoords$x, tx.forecast$xyCoords$y, t(rpss$rpss),
                   asp = 1, xlab = "", ylab = "", main = "tmax RPSS - JJA")
downscaleR:::draw.world.lines()
# Compute the significant points and locate them spatially:
sig.i <- rpss$rpss > rpss$rpss.sigma * qnorm(0.95)
lons <- rep(mn.tx.obsintp$xyCoords$x, each = length(mn.tx.obsintp$xyCoords$y))
lats <- rep(mn.tx.obsintp$xyCoords$y, length(mn.tx.obsintp$xyCoords$x))
points(lons[sig.i], lats[sig.i], pch = 19)
}}}

[[Image(image-20160512-165909.png)]]

== Acknowledgements

These examples have been prepared by Jonas Bhend (**Meteo Swiss**) in collaboration with the **Santander Met Group**.

== Package versions and session info

{{{#!text/R
print(sessionInfo(), locale = FALSE)
## R version 3.3.0 (2016-05-03)
## Platform: x86_64-pc-linux-gnu (64-bit)
## Running under: Ubuntu 14.04.4 LTS
##
## attached base packages:
## [1] stats     graphics  grDevices utils     datasets  methods   base
##
## other attached packages:
## [1] downscaleR_1.0-1        easyVerification_0.2.0  loadeR.ECOMS_1.0-0      loadeR_1.0-0            loadeR.java_1.1-0
## [6] rJava_0.9-8             SpecsVerification_0.4-1
##
## loaded via a namespace (and not attached):
##  [1] Rcpp_0.12.4       devtools_1.10.0   maps_3.1.0        MASS_7.3-44       evd_2.3-2         munsell_0.4.3     colorspace_1.2-6
##  [8] lattice_0.20-33   pbapply_1.1-3     plyr_1.8.3        fields_8.4-1      tools_3.3.0       CircStats_0.2-4   parallel_3.3.0
## [15] grid_3.3.0        spam_1.3-0        dtw_1.18-1        digest_0.6.9      abind_1.4-3       akima_0.5-12      bitops_1.0-6
## [22] RCurl_1.95-4.8    memoise_1.0.0     sp_1.2-3          proxy_0.4-15      scales_0.4.0      boot_1.3-17       verification_1.42
}}}
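
As a footnote to the RPSS section, the significance test used there can be illustrated on toy data. This is a minimal sketch, not part of the original example: the RPSS and standard-error values below are made up, mimicking the structure of the list returned by veriApply("EnsRpss", ...), and only base R is needed:

{{{#!text/R
# Assumed toy values: a 2x3 grid of RPSS scores and their standard errors
rpss.toy  <- matrix(c(0.40, 0.05, -0.10,
                      0.30, 0.25,  0.02), nrow = 2, byrow = TRUE)
sigma.toy <- matrix(0.12, nrow = 2, ncol = 3)  # standard error at each grid point

# A grid point is significantly skilful (one-sided test, 95% level) when its
# RPSS exceeds qnorm(0.95) (about 1.645) standard errors:
sig.toy <- rpss.toy > sigma.toy * qnorm(0.95)
# With these toy values the threshold is 0.12 * 1.645 = 0.197, so only the
# points with RPSS 0.40, 0.30 and 0.25 are flagged as significant
}}}

In the example above the same comparison is applied to the rpss and rpss.sigma components returned by veriApply, and the flagged points are overplotted on the RPSS map.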