Tutorial 9.3b - Randomized Complete Block ANOVA (Bayesian)
24 Dec 2015
If you are completely on top of the conceptual issues pertaining to Randomized Complete Block (RCB) ANOVA, and just need to use this tutorial to learn about RCB ANOVA in R, you are invited to skip down to the section on RCB ANOVA in R.
Overview
You are strongly advised to review the information on the nested design in tutorial 9.3a. I am not going to duplicate the overview here.
Tutorial 9.2a (Nested ANOVA) introduced the concept of employing sub-replicates that are nested within the main treatment levels as a means of absorbing some of the unexplained variability that would otherwise arise from designs in which sampling units are selected from amongst highly heterogeneous conditions. Such (nested) designs are useful in circumstances where the levels of the main treatment (such as burnt and un-burnt sites) occur at a much larger temporal or spatial scale than the experimental/sampling units (e.g. vegetation monitoring quadrats).
For circumstances in which the main treatments can be applied (or naturally occur) at the same scale as the sampling units (such as whether a stream rock is enclosed by a fish proof fence or not), an alternative design is available. In this design (randomized complete block design), each of the levels of the main treatment factor is grouped (blocked) together (in space and/or time) and therefore, whilst the conditions between the groups (referred to as `blocks') might vary substantially, the conditions under which each of the levels of the treatment are tested within any given block are far more homogeneous (see Figure below).
If any differences between blocks (due to the heterogeneity) can account for some of the total variability between the sampling units (thereby reducing the amount of variability that the main treatment(s) failed to explain), then the main test of treatment effects will be more powerful/sensitive.
As a simple example of a randomized complete block (RCB) design, consider an investigation into the roles of different organism scales (microbial, macro invertebrate and vertebrate) on the breakdown of leaf debris packs within streams. An experiment could consist of four treatment levels - leaf packs protected by fish-proof mesh, leaf packs protected by fine macro invertebrate exclusion mesh, leaf packs protected by dissolving antibacterial tablets, and leaf packs relatively unprotected as controls.
As an acknowledgement that there are many other unmeasured factors that could influence leaf pack breakdown (such as flow velocity, light levels, etc) and that these are likely to vary substantially throughout a stream, the treatments are to be arranged into groups or 'blocks' (each containing a single control, microbial, macro invertebrate and fish protected leaf pack). Blocks of treatment sets are then secured in locations haphazardly selected throughout a particular reach of stream. Importantly, the arrangement of treatments in each block must be randomized to prevent the introduction of some systematic bias - such as light angle, current direction etc.
Blocking does however come at a cost. The blocks absorb both unexplained variability as well as degrees of freedom from the residuals. Consequently, if the amount of the total unexplained variation that is absorbed by the blocks is not sufficiently large to offset the reduction in degrees of freedom (which may result from either less than expected heterogeneity, or from the scale at which the blocks are established being inappropriate to explain much of the variation), then for a given number of sampling units (leaf packs), the tests of main treatment effects will suffer power reductions.
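The degrees-of-freedom cost can be made concrete with a quick sketch. The numbers here are hypothetical (four leaf-pack treatments in ten blocks, one unit per treatment per block); they simply show how many residual degrees of freedom the blocks absorb.

```r
# Hypothetical design: t treatments crossed with b blocks, one unit per cell
t <- 4; b <- 10; n <- t * b
df.treat <- t - 1                 # treatment degrees of freedom
df.resid.crd <- n - t             # residual df if units were completely randomized
df.resid.rcb <- n - t - (b - 1)   # residual df after the blocks absorb b - 1 df
c(CRD = df.resid.crd, RCB = df.resid.rcb)
```

Unless the blocks soak up enough unexplained variance, moving from 36 to 27 residual degrees of freedom makes the treatment F-test less, not more, powerful.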
Treatments can also be applied sequentially or repeatedly at the scale of the entire block, such that at any single time, only a single treatment level is being applied (see the lower two sub-figures above). Such designs are called repeated measures. A repeated measures ANOVA is to a single factor ANOVA as a paired t-test is to an independent samples t-test.
One example of a repeated measures analysis might be an investigation into the effects of five different diet drugs (four doses and a placebo) on the food intake of lab rats. Each of the rats (`subjects') is subject to each of the five treatments (within subject effects), which are administered in a random order.
In another example, temporal recovery responses of sharks to by-catch entanglement stresses might be simulated by analyzing blood samples collected from captive sharks (subjects) every half hour for three hours following a stress inducing restraint. This repeated measures design allows the anticipated variability in stress tolerances between individual sharks to be accounted for in the analysis (so as to permit a more powerful test of the main treatments). Furthermore, by performing repeated measures on the same subjects, repeated measures designs reduce the number of subjects required for the investigation.
Essentially, this is a randomized complete block design except that the within subject (block) effect (e.g. time since stress exposure) cannot be randomized (the consequences of which are discussed in the section on Sphericity).
To suppress contamination effects resulting from the proximity of treatment sampling units within a block, units should be adequately spaced in time and space. For example, the leaf packs should not be so close to one another that the control packs are affected by the antibacterial tablets, and there should be sufficient recovery time between subsequent drug administrations.
In addition, the order or arrangement of treatments within the blocks must be randomized so as to prevent both confounding as well as computational complications (Sphericity). Whilst this is relatively straightforward for the classic randomized complete block design (such as the leaf packs in streams), it is logically not possible for repeated measures designs.
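Randomizing the arrangement within each block is straightforward in R. A minimal sketch (the treatment labels are hypothetical, echoing the leaf-pack example):

```r
set.seed(1)
treatments <- c("control", "microbial", "macroinvertebrate", "fish")
# one independently shuffled ordering per block (five blocks here)
block.layouts <- replicate(5, sample(treatments))
block.layouts  # columns are blocks, rows are positions within a block
```

Each column is a complete block (every treatment appears exactly once), with an independently randomized arrangement.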
Blocking factors are typically random factors (see the section on fixed versus random factors) that represent all the possible blocks that could be selected. As such, no individual block can truly be replicated. Randomized complete block and repeated measures designs can therefore also be thought of as un-replicated factorial designs in which there are two or more factors but the interactions between the blocks and all the within block factors are not replicated.
Linear models
The linear models for randomized complete block designs with one and with two within-block factors are:
$$
\begin{align}
y_{ij}&=\mu+\beta_{i}+\alpha_j + \varepsilon_{ij} &\hspace{2em} \varepsilon_{ij} &\sim\mathcal{N}(0,\sigma^2), \hspace{1em}\sum\beta=0\\
y_{ijk}&=\mu+\beta_{i} + \alpha_j + \gamma_{k} + \beta\alpha_{ij} + \beta\gamma_{ik} + \alpha\gamma_{jk} + \beta\alpha\gamma_{ijk} + \varepsilon_{ijk} &&\text{(Model 1)}\\
y_{ijk}&=\mu+\beta_{i} + \alpha_j + \gamma_{k} + \alpha\gamma_{jk} + \varepsilon_{ijk} &&\text{(Model 2)}
\end{align}
$$
where $\mu$ is the overall mean, $\beta$ is the effect of the blocking Factor B, $\alpha$ and $\gamma$ are the effects of within-block Factor A and Factor C respectively, and $\varepsilon$ is the random unexplained or residual component.
Tests for the effects of blocks as well as effects within blocks assume that there are no interactions between blocks and the within block effects. That is, it is assumed that any effects are of similar nature within each of the blocks. Whilst this assumption may well hold for experiments that are able to consciously set the scale over which the blocking units are arranged, when designs utilize arbitrary or naturally occurring blocking units, the magnitude and even polarity of the main effects are likely to vary substantially between the blocks.
The approach to un-replicated factorial analysis preferred by some bio-statisticians (the non-additive or `Model 1' approach) is to include the block by within-subject effect interactions (e.g. $\beta\alpha$). Whilst these interaction effects cannot be formally tested, they can be used as the denominators in F-ratio calculations of their respective main effects tests (see the tables that follow).
Opponents argue that since these blocking interactions cannot be formally tested, there is no sound inferential basis for using them as separate error terms. Alternatively, models can be fitted additively (`Model 2') whereby all the block by within-subject effect interactions are pooled into a single residual term ($\varepsilon$). Although the latter approach is simpler, each of the within-subject effects tests then assumes that there are no interactions involving the blocks and, perhaps even more restrictively, that sphericity (see the section on Sphericity) holds across the entire design.
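The two models can be sketched with quick frequentist fits on simulated data. The `aov()`/`lm()` calls below are illustrative only (they are not part of the Bayesian analyses that follow), and the data frame `dat` is invented for the sketch:

```r
set.seed(2)
# one observation per treatment (A, 3 levels) per block (10 blocks)
dat <- expand.grid(A = gl(3, 1), Block = gl(10, 1))
dat$y <- with(dat, 5 * as.numeric(A) + rep(rnorm(10, 0, 3), each = 3) + rnorm(30))

# Model 1 (non-additive): within-block effects tested against the Block stratum
summary(aov(y ~ A + Error(Block), data = dat))

# Model 2 (additive): Block and A fitted together against a single residual term
anova(lm(y ~ Block + A, data = dat))
```

In the additive fit, the residual line carries 30 - 1 - 9 - 2 = 18 degrees of freedom: the pooled block-by-treatment interaction.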
Assumptions
As with other ANOVA designs, the reliability of hypothesis tests is dependent on the residuals being:
- normally distributed. Boxplots using the appropriate scale of replication (reflecting the appropriate residuals/F-ratio denominator; see tables above) should be used to explore normality. Scale transformations are often useful.
- equally varied. Boxplots and plots of means against variance (using the appropriate scale of replication) should be used to explore the spread of values. Residual plots should reveal no patterns. Scale transformations are often useful.
- independent of one another. Although the observations within a block may not strictly be independent, provided the treatments are applied or ordered randomly within each block or subject, within block proximity effects on the residuals should be random across all blocks and thus the residuals should still be independent of one another. Nevertheless, it is important that experimental units within blocks are adequately spaced in space and time so as to suppress contamination or carryover effects.
RCB in R (JAGS and STAN)
Simple RCB
Scenario and Data
Imagine we had designed an experiment in which we intend to measure a response ($y$) to one of three treatments ('a1', 'a2' and 'a3'). Unfortunately, the system that we intend to sample is spatially heterogeneous and thus will add a great deal of noise to the data, making it difficult to detect a signal (the impact of treatment).
Thus, in an attempt to constrain this variability, you decide to apply a design (RCB) in which each of the treatments is applied within each of 35 blocks dispersed randomly throughout the landscape. As this section is mainly about the generation of artificial data (and not specifically about what to do with the data), understanding the actual details is optional and can be safely skipped. Consequently, I have folded (toggled) this section away.
- the number of treatments = 3
- the number of blocks containing treatments = 35
- the means of the treatments = 40, 70 and 80 respectively
- the variability (standard deviation) between blocks of the same treatment = 12
- the variability (standard deviation) between treatments within blocks = 5
library(plyr)
set.seed(1)
nTreat <- 3
nBlock <- 35
sigma <- 5
sigma.block <- 12
n <- nBlock*nTreat
Block <- gl(nBlock, k=1)
A <- gl(nTreat, k=1)
dt <- expand.grid(A=A, Block=Block)
#Xmat <- model.matrix(~Block + A + Block:A, data=dt)
Xmat <- model.matrix(~-1+Block + A, data=dt)
block.effects <- rnorm(n = nBlock, mean = 40, sd = sigma.block)
A.effects <- c(30,40)
all.effects <- c(block.effects, A.effects)
lin.pred <- Xmat %*% all.effects

# OR
Xmat <- cbind(model.matrix(~-1+Block, data=dt), model.matrix(~-1+A, data=dt))
## Sum to zero block effects
block.effects <- rnorm(n = nBlock, mean = 0, sd = sigma.block)
A.effects <- c(40,70,80)
all.effects <- c(block.effects, A.effects)
lin.pred <- Xmat %*% all.effects
## the quadrat observations (within sites) are drawn from
## normal distributions with means according to the site means
## and standard deviations of 5
y <- rnorm(n, lin.pred, sigma)
data.rcb <- data.frame(y=y, expand.grid(A=A, Block=Block))
head(data.rcb)  #print out the first six rows of the data set
         y A Block
1 37.39761 1     1
2 61.47033 2     1
3 78.07370 3     1
4 30.59803 1     2
5 59.00035 2     2
6 76.72575 3     2
Exploratory data analysis
Normality and Homogeneity of variance
boxplot(y~A, data.rcb)
Conclusions:
- there is no evidence that the response variable is consistently non-normal across all populations - each boxplot is approximately symmetrical
- there is no evidence that variance (as estimated by the height of the boxplots) differs between the three populations. More importantly, there is no evidence of a relationship between mean and variance - the height of the boxplots does not increase with increasing position along the y-axis. Hence there is no evidence of non-homogeneity of variance
- if required, transform the scale of the response variable (to address normality etc.). Note that transformations should be applied to the entire response variable (not just those populations that are skewed).
Block by within-Block interaction
library(car)
with(data.rcb, interaction.plot(A, Block, y))
# OR with ggplot
library(ggplot2)
ggplot(data.rcb, aes(y=y, x=A, group=Block, color=Block)) + geom_line() +
  guides(color=guide_legend(ncol=3))
library(car)
residualPlots(lm(y~Block+A, data.rcb))
           Test stat Pr(>|t|)
Block             NA       NA
A                 NA       NA
Tukey test    -0.885    0.376
# the Tukey's non-additivity test by itself can be obtained via an internal function
# within the car package
car:::tukeyNonaddTest(lm(y~Block+A, data.rcb))
      Test     Pvalue 
-0.8854414  0.3759186 
# alternatively, there is also a Tukey's non-additivity test within the
# asbio package
library(asbio)
with(data.rcb, tukey.add.test(y, A, Block))
Tukey's one df test for additivity 
F = 0.7840065   Denom df = 67   p-value = 0.3790855
Conclusions:
- there is no visual or inferential evidence of any major interactions between Block and the within-Block effect (A). Any trends appear to be reasonably consistent between Blocks.
Model fitting or statistical analysis
JAGS
Full parameterization | Matrix parameterization | Hierarchical parameterization |
---|---|---|
$$ \begin{array}{rcl} y_{ijk}&\sim&\mathcal{N}(\mu_{ij},\sigma^2)\\ \mu_{ij} &=& \beta_0 + \beta_{i} + \gamma_{j}\\ \gamma_{j}&\sim&\mathcal{N}(0,\sigma_{B}^2)\\ \beta_0, \beta_i&\sim&\mathcal{N}(0,1000000)\\ \sigma, \sigma_{B}&\sim&\mathcal{Cauchy}(0,25)\\ \end{array} $$ | $$ \begin{array}{rcl} y_{ijk}&\sim&\mathcal{N}(\mu_{ij},\sigma^2)\\ \mu_{ij} &=& \mathbf{X}\beta + \gamma_{j}\\ \gamma_{j}&\sim&\mathcal{N}(0,\sigma_{B}^2)\\ \beta&\sim&\mathcal{MVN}(0,1000000)\\ \sigma, \sigma_{B}&\sim&\mathcal{Cauchy}(0,25)\\ \end{array} $$ | $$ \begin{array}{rcl} y_{ijk}&\sim&\mathcal{N}(\mu_{ij},\sigma^2)\\ \mu_{ij} &=& \beta_{i} + \gamma_{j}\\ \gamma_{j}&\sim&\mathcal{N}(\beta_0,\sigma_{B}^2)\\ \beta_0, \beta_i&\sim&\mathcal{N}(0, 1000000)\\ \sigma, \sigma_{B}&\sim&\mathcal{Cauchy}(0,25)\\ \end{array} $$ |
The full parameterization shows the effects parameterization in which there is an intercept ($\beta_0$) and two treatment effects ($\beta_2$ and $\beta_3$).
The matrix parameterization is a compressed notation. In this parameterization, there are three beta parameters (one representing the mean of treatment a1, and the other two representing the treatment effects: the differences between a2 and a1, and between a3 and a1). In generating priors for each of these three beta parameters, we could loop through each and assign a non-informative normal prior to each (as in the Full parameterization version). However, it turns out that it is more efficient (in terms of mixing and thus the number of necessary iterations) to define the priors via a multivariate normal distribution. This has as many means as there are parameters to estimate (3) and a 3x3 variance-covariance matrix with zeros off the diagonal and 1,000,000 in the diagonals. $$ \boldsymbol{\mu}=\left[ \begin{array}{c} 0\\ 0\\ 0\\ \end{array} \right], \hspace{2em} \boldsymbol{\Sigma}=\left[ \begin{array}{ccc} 1000000&0&0\\ 0&1000000&0\\ 0&0&1000000\\ \end{array} \right] $$
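Note that JAGS's dmnorm() is parameterized in terms of a mean vector and a precision matrix (the inverse of the variance-covariance matrix), so a vague prior with variance 1,000,000 is supplied as a precision of 1/1,000,000. A quick sketch of the two objects:

```r
# mean vector and precision matrix for the vague multivariate normal prior
a0 <- rep(0, 3)
A0 <- diag(1.0E-06, 3)   # precision 1e-6 on the diagonal
solve(A0)                # back-transform: implied variances of 1e6 on the diagonal
```

These are the `a0` and `A0` objects passed in with the data list for the matrix parameterization below.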
Rather than assume a specific variance-covariance structure, just like lme() we can incorporate an appropriate structure to account for different dependency/correlation structures in our data. In RCB designs, it is prudent to capture the residuals to allow checks that there are no outstanding dependency issues following model fitting.
Full effects parameterization
modelString="
model {
   #Likelihood
   for (i in 1:n) {
      y[i]~dnorm(mu[i],tau)
      mu[i] <- beta0 + beta[A[i]] + gamma[Block[i]]
      res[i] <- y[i]-mu[i]
   }
   #Priors
   beta0 ~ dnorm(0, 1.0E-6)
   beta[1] <- 0
   for (i in 2:nA) {
      beta[i] ~ dnorm(0, 1.0E-6) #prior
   }
   for (i in 1:nBlock) {
      gamma[i] ~ dnorm(0, tau.B) #prior
   }
   # half-Cauchy(0,25) priors on sigma and sigma.B
   tau <- pow(sigma,-2)
   sigma <- z/sqrt(chSq)
   z ~ dnorm(0, 0.0016)I(0,)   #1/25^2 = 0.0016
   chSq ~ dgamma(0.5, 0.5)
   tau.B <- pow(sigma.B,-2)
   sigma.B <- z.B/sqrt(chSq.B) #note z.B, not z
   z.B ~ dnorm(0, 0.0016)I(0,) #1/25^2 = 0.0016
   chSq.B ~ dgamma(0.5, 0.5)
}
"
data.rcb.list <- with(data.rcb,
                      list(y=y,
                           Block=as.numeric(Block),
                           A=as.numeric(A),
                           n=nrow(data.rcb),
                           nBlock=length(levels(Block)),
                           nA=length(levels(A))
                      )
)
params <- c("beta0","beta","gamma","sigma","sigma.B","res")
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps + ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
[1] -1.523615
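As an aside, the sampler settings above determine the total number of iterations run per chain: enough post-burn-in draws so that, after thinning and pooling across chains, numSavedSteps samples remain. Repeating the settings so the snippet is self-contained:

```r
# MCMC settings repeated from above so this snippet stands alone
burnInSteps <- 3000
nChains <- 3
numSavedSteps <- 3000
thinSteps <- 10
# per-chain iterations = burn-in + (saved samples * thinning) / chains
nIter <- burnInSteps + ceiling((numSavedSteps * thinSteps) / nChains)
nIter
```

This yields 13000 iterations per chain, matching the "each with 13000 iterations (first 3000 discarded)" reported in the JAGS output below.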
jags.effects.f.time <- system.time(
    data.rcb.r2jags.f <- jags(data=data.rcb.list,
                              inits=NULL,
                              parameters.to.save=params,
                              model.file=textConnection(modelString),
                              n.chains=3,
                              n.iter=nIter,
                              n.burnin=burnInSteps,
                              n.thin=thinSteps
    )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 583

Initializing model
jags.effects.f.time
   user  system elapsed 
  9.997   0.056  10.096 
print(data.rcb.r2jags.f)
Inference for Bugs model at "5", fit using jags,
 3 chains, each with 13000 iterations (first 3000 discarded), n.thin = 10
 n.sims = 3000 iterations saved
          mu.vect sd.vect    2.5%     25%     50%     75%   97.5%  Rhat n.eff
beta[1]     0.000   0.000   0.000   0.000   0.000   0.000   0.000 1.000     1
beta[2]    28.469   1.126  26.153  27.751  28.488  29.220  30.719 1.002  1000
beta[3]    40.184   1.142  37.899  39.422  40.197  40.955  42.340 1.001  2400
beta0      43.037   2.081  38.975  41.670  42.995  44.422  47.240 1.001  2100
gamma[1]   -6.545   3.277 -13.077  -8.749  -6.447  -4.305  -0.256 1.001  3000
gamma[2]   -9.939   3.295 -16.422 -12.129  -9.923  -7.758  -3.491 1.001  3000
...
gamma[35]  22.236   3.288  15.714  20.028  22.275  24.431  28.651 1.002  1400
res[1]      0.906   2.722  -4.421  -0.874   0.883   2.698   6.305 1.001  3000
res[2]     -3.490   2.710  -8.813  -5.343  -3.513  -1.652   1.824 1.001  3000
...
res[105]   -1.056   2.807  -6.634  -2.988  -1.048   0.729   4.291 1.002  2000
sigma       4.692   0.414   3.993   4.403   4.657   4.951   5.599 1.002  1400
sigma.B    11.583   1.513   9.043  10.493  11.427  12.520  14.839 1.001  2700
deviance  619.973  11.360 600.077 611.961 618.909 627.051 644.698 1.002  1800

For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).

DIC info (using the rule, pD = var(deviance)/2)
pD = 64.5 and DIC = 684.5
DIC is an estimate of expected predictive error (lower deviance is better).
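The quoted rule can be illustrated directly: pD is half the posterior variance of the deviance, and DIC adds pD to the mean deviance. The draws below are simulated stand-ins, not the actual chains:

```r
set.seed(1)
# stand-in posterior deviance draws (roughly matching the output above)
deviance.draws <- rnorm(3000, mean = 620, sd = 11.4)
pD  <- var(deviance.draws) / 2      # effective number of parameters
DIC <- mean(deviance.draws) + pD    # mean deviance penalized by pD
round(c(pD = pD, DIC = DIC), 1)
```

Lower DIC indicates better expected out-of-sample predictive performance, so DIC is typically used to compare candidate models fitted to the same data.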
data.rcb.mcmc.list.f <- as.mcmc(data.rcb.r2jags.f)
[1] "beta0" "beta[1]" "beta[2]" "beta[3]" "deviance" "gamma[1]" "gamma[10]" "gamma[11]" "gamma[12]" "gamma[13]" "gamma[14]" "gamma[15]" "gamma[16]" "gamma[17]" "gamma[18]" "gamma[19]" [17] "gamma[2]" "gamma[20]" "gamma[21]" "gamma[22]" "gamma[23]" "gamma[24]" "gamma[25]" "gamma[26]" "gamma[27]" "gamma[28]" "gamma[29]" "gamma[3]" "gamma[30]" "gamma[31]" "gamma[32]" "gamma[33]" [33] "gamma[34]" "gamma[35]" "gamma[4]" "gamma[5]" "gamma[6]" "gamma[7]" "gamma[8]" "gamma[9]" "res[1]" "res[10]" "res[100]" "res[101]" "res[102]" "res[103]" "res[104]" "res[105]" [49] "res[11]" "res[12]" "res[13]" "res[14]" "res[15]" "res[16]" "res[17]" "res[18]" "res[19]" "res[2]" "res[20]" "res[21]" "res[22]" "res[23]" "res[24]" "res[25]" [65] "res[26]" "res[27]" "res[28]" "res[29]" "res[3]" "res[30]" "res[31]" "res[32]" "res[33]" "res[34]" "res[35]" "res[36]" "res[37]" "res[38]" "res[39]" "res[4]" [81] "res[40]" "res[41]" "res[42]" "res[43]" "res[44]" "res[45]" "res[46]" "res[47]" "res[48]" "res[49]" "res[5]" "res[50]" "res[51]" "res[52]" "res[53]" "res[54]" [97] "res[55]" "res[56]" "res[57]" "res[58]" "res[59]" "res[6]" "res[60]" "res[61]" "res[62]" "res[63]" "res[64]" "res[65]" "res[66]" "res[67]" "res[68]" "res[69]" [113] "res[7]" "res[70]" "res[71]" "res[72]" "res[73]" "res[74]" "res[75]" "res[76]" "res[77]" "res[78]" "res[79]" "res[8]" "res[80]" "res[81]" "res[82]" "res[83]" [129] "res[84]" "res[85]" "res[86]" "res[87]" "res[88]" "res[89]" "res[9]" "res[90]" "res[91]" "res[92]" "res[93]" "res[94]" "res[95]" "res[96]" "res[97]" "res[98]" [145] "res[99]" "sigma" "sigma.B"
Error in HPDinterval.mcmc(as.mcmc(x)): obj must have nsamp > 1
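The error above arises when HPDinterval() is handed an object containing only a single sample per parameter; given a matrix of pooled posterior draws it behaves as expected. A self-contained sketch, with fake draws standing in for as.matrix(data.rcb.mcmc.list.f):

```r
library(coda)
set.seed(1)
# fake posterior draws standing in for as.matrix(data.rcb.mcmc.list.f)
draws <- cbind(beta0 = rnorm(1000, 43, 2.1), sigma = rnorm(1000, 4.7, 0.4))
HPDinterval(as.mcmc(draws), prob = 0.95)
```

Each row of the result is the 95% highest posterior density interval for one monitored parameter.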
Matrix parameterization
modelString="
model {
   #Likelihood
   for (i in 1:n) {
      y[i]~dnorm(mu[i],tau)
      mu[i] <- inprod(beta[],X[i,]) + gamma[Block[i]]
      res[i] <- y[i]-mu[i]
   }
   #Priors
   beta ~ dmnorm(a0,A0)
   for (i in 1:nBlock) {
      gamma[i] ~ dnorm(0, tau.B) #prior
   }
   # half-Cauchy(0,25) priors on sigma and sigma.B
   tau <- pow(sigma,-2)
   sigma <- z/sqrt(chSq)
   z ~ dnorm(0, 0.0016)I(0,)   #1/25^2 = 0.0016
   chSq ~ dgamma(0.5, 0.5)
   tau.B <- pow(sigma.B,-2)
   sigma.B <- z.B/sqrt(chSq.B) #note z.B, not z
   z.B ~ dnorm(0, 0.0016)I(0,) #1/25^2 = 0.0016
   chSq.B ~ dgamma(0.5, 0.5)
}
"
A.Xmat <- model.matrix(~A, data.rcb)
data.rcb.list <- with(data.rcb,
                      list(y=y,
                           Block=as.numeric(Block),
                           X=A.Xmat,
                           n=nrow(data.rcb),
                           nBlock=length(levels(Block)),
                           nA=ncol(A.Xmat),
                           a0=rep(0,3),
                           A0=diag(1.0E-06,3)  #dmnorm expects a precision matrix
                      )
)
params <- c("beta","gamma","sigma","sigma.B","res")
adaptSteps = 1000
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps + ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
[1] 1.174783
jags.effects.m.time <- system.time(
    data.rcb.r2jags.m <- jags(data=data.rcb.list,
                              inits=NULL,
                              parameters.to.save=params,
                              model.file=textConnection(modelString),
                              n.chains=3,
                              n.iter=nIter,
                              n.burnin=burnInSteps,
                              n.thin=thinSteps
    )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 910

Initializing model
jags.effects.m.time
   user  system elapsed 
  9.888   0.060   9.989 
print(data.rcb.r2jags.m)
Inference for Bugs model at "5", fit using jags,
 3 chains, each with 13000 iterations (first 3000 discarded), n.thin = 10
 n.sims = 3000 iterations saved
          mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff
beta[1] 43.004 2.154 38.742 41.589 43.024 44.450 47.311 1.002 1200
beta[2] 28.490 1.128 26.296 27.745 28.468 29.232 30.687 1.001 3000
beta[3] 40.174 1.119 37.894 39.445 40.217 40.945 42.305 1.001 3000
gamma[1] -6.523 3.316 -13.224 -8.720 -6.491 -4.331 -0.002 1.002 3000
gamma[2] -9.928 3.262 -16.347 -12.141 -9.928 -7.750 -3.371 1.002 1100
gamma[3] -3.790 3.261 -10.171 -5.930 -3.819 -1.648 2.568 1.001 3000
gamma[4] 8.056 3.294 1.682 5.791 8.029 10.306 14.360 1.001 3000
gamma[5] 6.700 3.285 0.285 4.408 6.772 8.851 13.058 1.001 3000
gamma[6] -2.556 3.281 -9.064 -4.748 -2.501 -0.465 4.022 1.002 1100
gamma[7] -5.123 3.245 -11.542 -7.238 -5.119 -3.038 1.359 1.002 1400
gamma[8] 10.357 3.230 4.101 8.183 10.293 12.512 16.746 1.001 3000
gamma[9] 5.189 3.293 -1.058 2.988 5.164 7.294 11.634 1.001 3000
gamma[10] -13.734 3.234 -19.824 -15.957 -13.783 -11.585 -7.237 1.001 3000
gamma[11] -12.787 3.274 -19.356 -15.003 -12.764 -10.485 -6.646 1.001 3000
gamma[12] 3.699 3.334 -3.009 1.446 3.721 6.013 10.243 1.002 1800
gamma[13] 9.537 3.285 3.082 7.277 9.515 11.773 16.179 1.004 700
gamma[14] -2.810 3.270 -9.311 -4.922 -2.875 -0.661 3.646 1.001 3000
gamma[15] 8.485 3.284 2.103 6.273 8.488 10.717 14.932 1.001 3000
gamma[16] 0.504 3.244 -5.816 -1.783 0.427 2.753 6.886 1.002 1700
gamma[17] -9.719 3.298 -15.908 -11.982 -9.843 -7.419 -3.233 1.002 3000
gamma[18] 2.954 3.271 -3.455 0.784 2.929 5.090 9.405 1.001 3000
gamma[19] -14.297 3.222 -20.502 -16.454 -14.387 -12.157 -7.738 1.002 1600
gamma[20] 12.195 3.241 5.871 10.051 12.263 14.406 18.318 1.001 3000
gamma[21] 20.036 3.334 13.391 17.852 20.066 22.230 26.832 1.002 3000
gamma[22] -10.911 3.278 -17.106 -13.208 -10.912 -8.698 -4.470 1.001 3000
gamma[23] -16.691 3.292 -23.152 -18.860 -16.741 -14.495 -10.125 1.002 3000
gamma[24] 2.696 3.284 -3.611 0.453 2.633 4.875 9.190 1.002 1100
gamma[25] -9.063 3.257 -15.351 -11.338 -9.052 -6.797 -2.875 1.001 3000
gamma[26] 26.953 3.339 20.622 24.648 26.899 29.207 33.513 1.002 1700
gamma[27] -6.736 3.322 -13.246 -8.906 -6.761 -4.509 -0.271 1.001 3000
gamma[28] 3.366 3.222 -3.119 1.210 3.379 5.481 9.746 1.001 3000
gamma[29] -4.626 3.287 -10.929 -6.910 -4.653 -2.364 1.806 1.001 3000
gamma[30] -11.090 3.275 -17.542 -13.353 -11.037 -8.833 -4.581 1.001 2200
gamma[31] 1.669 3.216 -4.783 -0.406 1.680 3.742 7.875 1.001 3000
gamma[32] -18.987 3.313 -25.508 -21.173 -18.976 -16.840 -12.251 1.003 870
gamma[33] 11.323 3.319 4.720 9.175 11.341 13.511 17.752 1.001 3000
gamma[34] 3.424 3.291 -2.896 1.251 3.394 5.548 10.024 1.002 1100
gamma[35] 22.371 3.309 15.878 20.171 22.370 24.577 28.991 1.001 3000
res[1] 0.917 2.759 -4.620 -0.940 0.880 2.729 6.493 1.001 3000
res[2] -3.500 2.772 -9.057 -5.434 -3.469 -1.693 2.092 1.001 3000
res[3] 1.418 2.760 -3.833 -0.468 1.381 3.256 6.862 1.001 3000
res[4] -2.478 2.737 -7.717 -4.410 -2.502 -0.647 2.944 1.001 3000
res[5] -2.566 2.743 -7.906 -4.443 -2.506 -0.741 2.697 1.001 2500
res[6] 3.475 2.705 -1.710 1.623 3.469 5.252 8.817 1.001 2000
res[7] -2.142 2.748 -7.596 -3.953 -2.146 -0.258 3.177 1.002 1700
res[8] 1.590 2.732 -3.843 -0.189 1.586 3.427 6.863 1.001 2300
res[9] 0.271 2.731 -5.077 -1.570 0.257 2.108 5.552 1.001 2700
res[10] -0.807 2.734 -6.303 -2.632 -0.735 1.028 4.582 1.001 3000
res[11] 0.807 2.737 -4.645 -0.994 0.806 2.604 6.060 1.001 3000
res[12] 1.290 2.734 -4.196 -0.559 1.311 3.144 6.653 1.001 3000
res[13] 5.344 2.750 -0.121 3.514 5.395 7.155 10.637 1.001 2600
res[14] -6.654 2.756 -12.035 -8.496 -6.678 -4.773 -1.251 1.001 3000
res[15] 2.249 2.726 -3.186 0.477 2.292 4.046 7.596 1.001 3000
res[16] -0.758 2.716 -6.013 -2.563 -0.808 1.092 4.477 1.001 3000
res[17] 4.403 2.726 -0.922 2.628 4.407 6.222 9.669 1.001 3000
res[18] -4.118 2.690 -9.356 -5.966 -4.124 -2.291 1.188 1.001 2800
res[19] 0.928 2.715 -4.362 -0.893 0.905 2.764 6.150 1.001 3000
res[20] 1.924 2.724 -3.450 0.100 1.859 3.752 7.179 1.001 3000
res[21] -3.809 2.713 -9.068 -5.634 -3.852 -2.023 1.562 1.001 3000
res[22] 1.041 2.731 -4.382 -0.753 1.098 2.864 6.299 1.001 2000
res[23] 2.314 2.716 -3.093 0.462 2.315 4.160 7.565 1.001 2700
res[24] -1.671 2.672 -6.945 -3.462 -1.615 0.091 3.339 1.001 3000
res[25] 6.421 2.734 1.089 4.552 6.410 8.200 11.946 1.001 3000
res[26] 2.789 2.737 -2.520 0.959 2.773 4.604 8.298 1.001 3000
res[27] -8.071 2.760 -13.435 -9.908 -8.100 -6.224 -2.329 1.001 3000
res[28] -0.402 2.701 -5.890 -2.202 -0.416 1.396 4.854 1.001 2000
res[29] -2.148 2.759 -7.566 -3.919 -2.174 -0.263 3.280 1.001 3000
res[30] -0.077 2.712 -5.592 -1.880 -0.072 1.667 5.222 1.001 3000
res[31] -1.809 2.691 -6.979 -3.644 -1.847 0.028 3.370 1.001 3000
res[32] 3.014 2.712 -2.336 1.175 2.948 4.859 8.457 1.001 3000
res[33] -3.436 2.731 -8.874 -5.345 -3.414 -1.597 2.086 1.001 3000
res[34] -1.538 2.743 -6.945 -3.379 -1.541 0.244 3.892 1.001 3000
res[35] -4.090 2.793 -9.334 -5.995 -4.088 -2.291 1.354 1.001 3000
res[36] 6.334 2.782 0.936 4.507 6.327 8.081 11.939 1.001 3000
res[37] 0.265 2.754 -5.047 -1.548 0.266 2.126 5.733 1.002 1900
res[38] 2.742 2.772 -2.723 0.984 2.748 4.623 8.056 1.002 1500
res[39] -1.573 2.758 -6.879 -3.416 -1.575 0.216 3.851 1.002 1400
res[40] 6.868 2.728 1.621 5.028 6.878 8.687 12.302 1.002 1500
res[41] -3.211 2.745 -8.689 -4.989 -3.264 -1.441 2.212 1.001 2100
res[42] -4.026 2.721 -9.412 -5.821 -4.026 -2.205 1.360 1.001 2400
res[43] 6.246 2.717 0.781 4.427 6.248 8.067 11.439 1.001 2700
res[44] -2.659 2.727 -8.028 -4.536 -2.625 -0.803 2.647 1.001 3000
res[45] -2.127 2.716 -7.465 -3.957 -2.154 -0.267 3.152 1.001 3000
res[46] -0.695 2.655 -5.883 -2.499 -0.605 1.073 4.420 1.001 2900
res[47] 1.179 2.677 -4.102 -0.615 1.197 2.956 6.499 1.001 3000
res[48] -0.301 2.679 -5.713 -2.050 -0.302 1.490 4.819 1.001 3000
res[49] 1.841 2.673 -3.491 0.047 1.930 3.609 7.021 1.001 3000
res[50] -0.006 2.680 -5.260 -1.870 0.040 1.822 5.162 1.001 3000
res[51] -3.334 2.673 -8.610 -5.127 -3.320 -1.547 1.788 1.001 3000
res[52] 4.850 2.717 -0.588 3.073 4.817 6.666 10.211 1.002 1300
res[53] -1.428 2.741 -6.836 -3.233 -1.452 0.417 3.954 1.002 1700
res[54] -2.937 2.704 -8.170 -4.720 -2.979 -1.154 2.298 1.002 1900
res[55] -2.761 2.652 -8.288 -4.517 -2.653 -1.001 2.233 1.001 3000
res[56] 2.814 2.694 -2.730 1.043 2.871 4.574 8.034 1.001 3000
res[57] -2.802 2.708 -8.353 -4.600 -2.691 -1.065 2.261 1.001 3000
res[58] 1.809 2.657 -3.322 -0.003 1.768 3.576 7.132 1.002 1200
res[59] 0.100 2.641 -4.964 -1.773 0.148 1.823 5.472 1.002 1500
res[60] 0.202 2.682 -4.849 -1.690 0.178 1.986 5.448 1.002 1800
res[61] 1.025 2.743 -4.220 -0.775 0.975 2.807 6.519 1.001 3000
res[62] -0.710 2.753 -5.840 -2.514 -0.770 1.069 4.743 1.001 3000
res[63] 3.207 2.731 -2.133 1.472 3.140 4.902 8.842 1.001 3000
res[64] -4.092 2.763 -9.609 -5.864 -4.053 -2.264 1.299 1.001 3000
res[65] 6.544 2.760 1.168 4.741 6.624 8.420 11.809 1.001 3000
res[66] -4.356 2.758 -9.929 -6.218 -4.308 -2.483 0.912 1.001 3000
res[67] -0.348 2.725 -5.659 -2.204 -0.316 1.523 4.917 1.001 3000
res[68] 0.026 2.708 -5.254 -1.860 0.123 1.905 5.243 1.001 3000
res[69] -2.278 2.710 -7.547 -4.109 -2.232 -0.488 3.060 1.001 3000
res[70] 0.851 2.695 -4.547 -0.884 0.848 2.640 6.153 1.001 3000
res[71] -6.925 2.721 -12.373 -8.670 -6.913 -5.141 -1.537 1.001 2600
res[72] 6.844 2.700 1.617 5.034 6.911 8.645 12.018 1.001 2200
res[73] -3.887 2.687 -8.936 -5.741 -3.932 -2.088 1.403 1.001 2900
res[74] 3.630 2.716 -1.602 1.849 3.575 5.459 9.102 1.001 3000
res[75] -1.316 2.710 -6.421 -3.129 -1.360 0.506 4.159 1.001 3000
res[76] -4.892 2.723 -9.983 -6.830 -4.919 -3.012 0.316 1.001 3000
res[77] 10.808 2.755 5.424 8.899 10.797 12.654 16.243 1.002 3000
res[78] -1.225 2.731 -6.517 -3.100 -1.273 0.665 4.258 1.001 3000
res[79] -3.170 2.752 -8.477 -5.005 -3.163 -1.317 2.249 1.002 1000
res[80] -3.432 2.756 -8.813 -5.308 -3.443 -1.555 1.853 1.002 1200
res[81] 5.338 2.762 0.002 3.510 5.327 7.194 10.680 1.002 1300
res[82] 1.814 2.727 -3.565 0.011 1.780 3.641 7.008 1.001 3000
res[83] 1.827 2.672 -3.589 0.115 1.861 3.604 6.985 1.001 3000
res[84] -2.914 2.718 -8.320 -4.720 -2.929 -1.077 2.320 1.001 3000
res[85] -5.479 2.751 -10.911 -7.350 -5.467 -3.666 -0.052 1.001 3000
res[86] -1.907 2.756 -7.162 -3.749 -1.938 -0.055 3.639 1.001 3000
res[87] 6.784 2.762 1.416 4.956 6.769 8.564 12.313 1.001 3000
res[88] -3.940 2.671 -9.110 -5.748 -3.987 -2.149 1.568 1.001 3000
res[89] -6.246 2.658 -11.470 -7.983 -6.301 -4.433 -1.107 1.001 3000
res[90] 8.338 2.668 3.224 6.599 8.287 10.080 13.656 1.001 3000
res[91] -0.283 2.687 -5.429 -2.159 -0.251 1.541 5.005 1.001 3000
res[92] -2.091 2.696 -7.290 -3.904 -2.072 -0.279 3.203 1.001 3000
res[93] 2.710 2.713 -2.517 0.875 2.760 4.515 8.115 1.001 3000
res[94] -1.244 2.716 -6.596 -3.068 -1.210 0.573 3.872 1.001 2700
res[95] -7.262 2.699 -12.509 -9.113 -7.225 -5.472 -2.189 1.002 1800
res[96] 5.180 2.706 -0.225 3.303 5.205 7.010 10.342 1.002 1600
res[97] 1.984 2.763 -3.475 0.153 1.960 3.775 7.552 1.002 2200
res[98] -2.353 2.755 -7.896 -4.114 -2.375 -0.530 3.125 1.001 2900
res[99] 2.363 2.764 -3.222 0.566 2.345 4.230 7.936 1.001 3000
res[100] -3.551 2.761 -8.907 -5.415 -3.585 -1.684 1.863 1.001 2700
res[101] 8.461 2.732 3.307 6.641 8.435 10.293 13.842 1.001 2000
res[102] -4.234 2.725 -9.536 -6.071 -4.277 -2.390 1.048 1.002 1700
res[103] 2.981 2.747 -2.332 1.113 2.987 4.808 8.341 1.001 3000
res[104] 1.821 2.769 -3.515 -0.091 1.789 3.685 7.272 1.001 3000
res[105] -1.148 2.757 -6.483 -3.016 -1.161 0.685 4.305 1.001 3000
sigma 4.682 0.404 3.967 4.400 4.658 4.938 5.551 1.001 3000
sigma.B 11.616 1.495 9.132 10.567 11.435 12.537 15.014 1.003 980
deviance 619.768 11.408 600.405 611.767 618.725 626.775 643.972 1.002 1400

For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
DIC info (using the rule, pD = var(deviance)/2)
pD = 65.0 and DIC = 684.8
DIC is an estimate of expected predictive error (lower deviance is better).
data.rcb.mcmc.list.m <- as.mcmc(data.rcb.r2jags.m)
modelString="
model {
   #Likelihood
   for (i in 1:n) {
      y[i]~dnorm(mu[i],tau)
      mu[i] <- inprod(beta[],X[i,]) + inprod(gamma[], Z[i,])
      res[i] <- y[i] - mu[i]
   }
   #Priors
   beta ~ dmnorm(a0,A0)
   for (i in 1:nZ) {
      gamma[i] ~ dnorm(0, tau.B) #prior
   }
   tau <- pow(sigma,-2)
   sigma <- z/sqrt(chSq)
   z ~ dnorm(0, 0.0016)I(0,)  #1/25^2 = 0.0016
   chSq ~ dgamma(0.5, 0.5)
   tau.B <- pow(sigma.B,-2)
   sigma.B <- z.B/sqrt(chSq.B)
   z.B ~ dnorm(0, 0.0016)I(0,)  #1/25^2 = 0.0016
   chSq.B ~ dgamma(0.5, 0.5)
}
"
A.Xmat <- model.matrix(~A,data.rcb)
Zmat <- model.matrix(~-1+Block, data.rcb)
data.rcb.list <- with(data.rcb,
        list(y=y,
             X=A.Xmat, n=nrow(data.rcb),
             Z=Zmat, nZ=ncol(Zmat),
             nA=ncol(A.Xmat),
             a0=rep(0,3), A0=diag(0,3)
        )
)
params <- c("beta","gamma","sigma","sigma.B","res")
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps+ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
[1] -0.2973848
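The sampler settings above are worth unpacking with a little arithmetic. With a burn-in of 3000, a thinning rate of 10 and a target of 3000 retained samples spread across 3 chains, each chain must run long enough to yield 1000 post-burn-in thinned draws. A minimal sketch of that bookkeeping (numbers taken from this tutorial):

```r
# Sampler bookkeeping used throughout this tutorial
burnInSteps   <- 3000
nChains       <- 3
numSavedSteps <- 3000
thinSteps     <- 10

# Total iterations per chain: burn-in plus enough post-burn-in draws to
# retain numSavedSteps/nChains samples after thinning
nIter <- burnInSteps + ceiling((numSavedSteps * thinSteps)/nChains)
nIter                                   # 13000 iterations per chain

# Retained samples: post-burn-in draws divided by the thinning rate
savedPerChain <- (nIter - burnInSteps)/thinSteps
savedPerChain * nChains                 # 3000 samples in total
```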
jags.effects.m2.time <- system.time(
data.rcb.r2jags.m2 <- jags(data=data.rcb.list,
          inits=NULL,
          parameters.to.save=params,
          model.file=textConnection(modelString),
          n.chains=3,
          n.iter=nIter,
          n.burnin=burnInSteps,
          n.thin=thinSteps
          )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 4621

Initializing model
jags.effects.m2.time
   user  system elapsed 
 28.221   0.104  28.487 
print(data.rcb.r2jags.m2)
Inference for Bugs model at "6", fit using jags,
 3 chains, each with 13000 iterations (first 3000 discarded), n.thin = 10
 n.sims = 3000 iterations saved
          mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff
beta[1] 43.074 2.115 38.783 41.715 43.067 44.446 47.249 1.002 1800
beta[2] 28.408 1.132 26.197 27.679 28.396 29.153 30.612 1.002 1400
beta[3] 40.127 1.121 37.966 39.394 40.112 40.884 42.261 1.002 1100
gamma[1] -6.667 3.325 -13.075 -8.904 -6.612 -4.499 -0.171 1.003 920
gamma[2] -9.884 3.264 -16.376 -12.029 -9.956 -7.709 -3.347 1.003 870
gamma[3] -3.780 3.240 -10.314 -5.931 -3.792 -1.646 2.595 1.003 950
gamma[4] 7.991 3.195 2.062 5.811 7.901 10.155 14.234 1.002 1900
gamma[5] 6.589 3.313 -0.059 4.437 6.627 8.769 12.997 1.001 3000
gamma[6] -2.549 3.207 -8.875 -4.747 -2.498 -0.389 3.616 1.002 1000
gamma[7] -5.185 3.308 -11.838 -7.415 -5.187 -2.967 1.175 1.001 3000
gamma[8] 10.322 3.301 3.840 8.052 10.361 12.493 16.938 1.001 3000
gamma[9] 5.307 3.282 -1.027 3.054 5.281 7.495 11.819 1.002 1400
gamma[10] -13.863 3.244 -20.343 -16.028 -13.793 -11.735 -7.595 1.001 2700
gamma[11] -12.894 3.289 -19.431 -15.060 -12.864 -10.684 -6.426 1.001 2900
gamma[12] 3.683 3.354 -2.647 1.389 3.607 5.961 10.515 1.001 3000
gamma[13] 9.408 3.294 2.830 7.205 9.442 11.594 15.889 1.002 1300
gamma[14] -2.811 3.278 -9.311 -5.032 -2.805 -0.621 3.792 1.001 2900
gamma[15] 8.458 3.329 2.028 6.190 8.355 10.700 15.089 1.001 2800
gamma[16] 0.494 3.250 -5.636 -1.745 0.430 2.671 7.074 1.001 3000
gamma[17] -9.665 3.212 -16.259 -11.844 -9.547 -7.521 -3.477 1.004 540
gamma[18] 2.969 3.249 -3.621 0.861 2.913 5.171 9.384 1.001 3000
gamma[19] -14.382 3.254 -20.722 -16.477 -14.365 -12.317 -7.966 1.001 3000
gamma[20] 12.242 3.282 5.882 10.045 12.274 14.373 18.782 1.002 1700
gamma[21] 20.046 3.341 13.551 17.772 19.968 22.289 26.473 1.001 3000
gamma[22] -10.907 3.272 -17.547 -13.073 -10.864 -8.779 -4.412 1.001 3000
gamma[23] -16.597 3.290 -23.064 -18.759 -16.587 -14.428 -10.232 1.002 1400
gamma[24] 2.726 3.227 -3.601 0.516 2.655 4.887 9.086 1.001 3000
gamma[25] -9.107 3.273 -15.445 -11.276 -9.195 -6.920 -2.539 1.001 3000
gamma[26] 26.961 3.260 20.632 24.757 26.954 29.123 33.400 1.001 3000
gamma[27] -6.803 3.228 -13.194 -8.882 -6.781 -4.698 -0.386 1.002 1400
gamma[28] 3.400 3.174 -2.655 1.241 3.424 5.586 9.404 1.001 3000
gamma[29] -4.577 3.245 -10.940 -6.738 -4.623 -2.357 1.739 1.001 3000
gamma[30] -11.103 3.260 -17.536 -13.304 -11.104 -8.933 -4.649 1.001 2300
gamma[31] 1.673 3.205 -4.774 -0.412 1.683 3.800 8.200 1.001 3000
gamma[32] -19.083 3.277 -25.486 -21.280 -19.145 -16.890 -12.623 1.001 3000
gamma[33] 11.267 3.306 4.944 9.043 11.298 13.471 17.724 1.001 3000
gamma[34] 3.393 3.297 -3.147 1.169 3.406 5.624 9.659 1.001 3000
gamma[35] 22.310 3.306 16.073 20.040 22.294 24.502 29.067 1.001 2400
res[1] 0.991 2.763 -4.476 -0.842 0.987 2.852 6.364 1.003 860
res[2] -3.345 2.733 -8.826 -5.139 -3.346 -1.501 1.964 1.002 1600
res[3] 1.540 2.756 -3.922 -0.278 1.568 3.352 6.954 1.002 1500
res[4] -2.592 2.682 -7.946 -4.446 -2.577 -0.765 2.776 1.002 1800
res[5] -2.597 2.704 -7.857 -4.417 -2.611 -0.798 2.682 1.002 1500
res[6] 3.409 2.682 -1.903 1.595 3.420 5.197 8.791 1.002 1100
res[7] -2.222 2.724 -7.516 -4.062 -2.296 -0.402 3.018 1.002 1000
res[8] 1.592 2.725 -3.658 -0.245 1.604 3.349 6.991 1.001 2000
res[9] 0.240 2.699 -5.083 -1.534 0.227 2.017 5.469 1.002 1700
res[10] -0.812 2.704 -6.071 -2.699 -0.795 1.109 4.262 1.002 1700
res[11] 0.884 2.738 -4.545 -1.002 0.936 2.795 6.111 1.001 3000
res[12] 1.333 2.703 -4.006 -0.550 1.321 3.278 6.428 1.001 3000
res[13] 5.386 2.733 0.046 3.497 5.424 7.262 10.938 1.001 3000
res[14] -6.530 2.742 -11.959 -8.372 -6.498 -4.749 -1.034 1.001 3000
res[15] 2.339 2.724 -2.835 0.440 2.337 4.140 7.880 1.001 3000
res[16] -0.834 2.743 -6.166 -2.698 -0.860 1.012 4.419 1.003 740
res[17] 4.408 2.726 -0.777 2.572 4.380 6.245 9.796 1.002 1500
res[18] -4.147 2.710 -9.534 -5.991 -4.136 -2.298 1.002 1.002 1500
res[19] 0.921 2.770 -4.377 -0.954 0.917 2.791 6.523 1.002 1500
res[20] 1.998 2.744 -3.170 0.146 1.967 3.864 7.473 1.001 3000
res[21] -3.768 2.774 -9.105 -5.655 -3.759 -1.955 1.747 1.001 3000
res[22] 1.008 2.709 -4.180 -0.818 1.022 2.772 6.392 1.001 3000
res[23] 2.362 2.713 -2.912 0.550 2.359 4.073 7.764 1.001 3000
res[24] -1.657 2.713 -7.029 -3.399 -1.703 0.080 3.783 1.001 3000
res[25] 6.233 2.704 0.886 4.381 6.296 8.055 11.477 1.002 1100
res[26] 2.683 2.727 -2.719 0.861 2.660 4.488 8.025 1.001 2800
res[27] -8.210 2.690 -13.525 -9.999 -8.205 -6.352 -3.031 1.001 2600
res[28] -0.342 2.717 -5.804 -2.172 -0.313 1.506 4.967 1.001 3000
res[29] -2.007 2.689 -7.343 -3.855 -2.027 -0.192 3.319 1.001 3000
res[30] 0.031 2.705 -5.273 -1.763 -0.041 1.869 5.263 1.001 3000
res[31] -1.771 2.718 -6.997 -3.637 -1.820 0.062 3.467 1.001 3000
res[32] 3.133 2.721 -2.123 1.321 3.079 4.966 8.433 1.001 3000
res[33] -3.350 2.719 -8.587 -5.227 -3.329 -1.513 2.071 1.001 3000
res[34] -1.591 2.753 -7.170 -3.417 -1.606 0.279 3.576 1.001 3000
res[35] -4.062 2.784 -9.632 -5.942 -4.026 -2.158 1.378 1.001 3000
res[36] 6.328 2.772 0.919 4.546 6.369 8.180 11.578 1.001 3000
res[37] 0.324 2.760 -5.057 -1.528 0.288 2.171 5.821 1.001 2300
res[38] 2.884 2.737 -2.526 1.098 2.857 4.658 8.231 1.001 3000
res[39] -1.465 2.767 -6.875 -3.291 -1.491 0.381 3.995 1.001 2900
res[40] 6.800 2.713 1.531 5.001 6.809 8.575 12.149 1.001 2600
res[41] -3.198 2.726 -8.514 -5.026 -3.158 -1.394 2.237 1.001 3000
res[42] -4.046 2.719 -9.413 -5.893 -4.029 -2.283 1.506 1.001 3000
res[43] 6.203 2.780 0.688 4.337 6.258 8.068 11.586 1.001 3000
res[44] -2.620 2.776 -8.081 -4.488 -2.621 -0.728 2.777 1.001 3000
res[45] -2.122 2.801 -7.630 -3.982 -2.096 -0.165 3.232 1.001 3000
res[46] -0.754 2.706 -6.194 -2.582 -0.768 1.127 4.512 1.001 3000
res[47] 1.202 2.726 -4.100 -0.626 1.208 3.112 6.430 1.001 3000
res[48] -0.312 2.717 -5.815 -2.091 -0.285 1.541 4.798 1.001 3000
res[49] 1.718 2.706 -3.537 -0.076 1.695 3.518 7.185 1.004 610
res[50] -0.048 2.725 -5.247 -1.874 -0.083 1.764 5.404 1.003 900
res[51] -3.410 2.701 -8.600 -5.222 -3.462 -1.614 2.077 1.003 750
res[52] 4.766 2.709 -0.518 2.929 4.779 6.527 10.146 1.002 1600
res[53] -1.430 2.734 -6.804 -3.201 -1.369 0.393 3.868 1.001 3000
res[54] -2.974 2.710 -8.302 -4.740 -2.998 -1.149 2.447 1.001 3000
res[55] -2.745 2.715 -8.144 -4.553 -2.812 -0.908 2.608 1.001 3000
res[56] 2.911 2.717 -2.442 1.143 2.883 4.705 8.267 1.001 3000
res[57] -2.738 2.719 -8.141 -4.530 -2.756 -0.910 2.721 1.001 3000
res[58] 1.693 2.738 -3.826 -0.132 1.758 3.514 6.930 1.001 2300
res[59] 0.064 2.776 -5.562 -1.818 0.117 1.961 5.396 1.001 3000
res[60] 0.133 2.764 -5.378 -1.693 0.198 1.969 5.549 1.001 2500
res[61] 0.946 2.802 -4.530 -0.924 0.960 2.837 6.333 1.002 1400
res[62] -0.708 2.813 -6.209 -2.573 -0.753 1.157 4.840 1.001 2500
res[63] 3.176 2.782 -2.266 1.334 3.136 5.049 8.635 1.001 3000
res[64] -4.165 2.709 -9.420 -5.987 -4.225 -2.381 1.145 1.001 3000
res[65] 6.552 2.660 1.316 4.784 6.539 8.286 11.674 1.001 3000
res[66] -4.382 2.690 -9.714 -6.166 -4.396 -2.602 0.776 1.001 3000
res[67] -0.511 2.754 -6.040 -2.327 -0.546 1.419 4.867 1.002 1300
res[68] -0.055 2.757 -5.724 -1.859 -0.079 1.845 5.428 1.003 700
res[69] -2.393 2.764 -7.893 -4.168 -2.369 -0.549 3.114 1.003 700
res[70] 0.753 2.718 -4.439 -1.030 0.707 2.546 6.266 1.001 3000
res[71] -6.943 2.712 -11.989 -8.803 -7.001 -5.190 -1.477 1.001 3000
res[72] 6.793 2.690 1.519 5.020 6.778 8.581 12.054 1.001 3000
res[73] -3.912 2.749 -9.547 -5.813 -3.874 -1.969 1.402 1.001 3000
res[74] 3.687 2.781 -1.830 1.846 3.706 5.569 8.909 1.001 3000
res[75] -1.294 2.728 -6.736 -3.132 -1.283 0.504 3.972 1.001 3000
res[76] -4.970 2.702 -10.044 -6.802 -4.959 -3.167 0.467 1.001 3000
res[77] 10.812 2.681 5.668 8.906 10.841 12.565 16.311 1.001 3000
res[78] -1.255 2.683 -6.487 -3.043 -1.297 0.495 4.065 1.001 3000
res[79] -3.173 2.712 -8.569 -4.945 -3.122 -1.316 1.945 1.002 1300
res[80] -3.352 2.693 -8.672 -5.162 -3.289 -1.527 1.882 1.001 3000
res[81] 5.383 2.716 0.135 3.576 5.406 7.204 10.612 1.001 3000
res[82] 1.711 2.669 -3.425 -0.086 1.686 3.520 6.877 1.001 3000
res[83] 1.805 2.649 -3.315 -0.002 1.763 3.645 6.896 1.001 3000
res[84] -2.970 2.641 -8.114 -4.783 -2.952 -1.228 2.196 1.001 3000
res[85] -5.598 2.741 -10.945 -7.442 -5.620 -3.755 -0.136 1.002 1300
res[86] -1.944 2.759 -7.379 -3.811 -1.929 -0.059 3.442 1.001 2200
res[87] 6.713 2.729 1.416 4.901 6.749 8.503 12.166 1.001 3000
res[88] -3.996 2.682 -9.294 -5.794 -3.958 -2.231 1.295 1.001 3000
res[89] -6.220 2.722 -11.648 -7.989 -6.183 -4.396 -0.984 1.001 3000
res[90] 8.330 2.700 2.973 6.604 8.334 10.121 13.526 1.001 3000
res[91] -0.355 2.690 -5.527 -2.151 -0.390 1.456 4.916 1.001 3000
res[92] -2.082 2.685 -7.274 -3.834 -2.156 -0.275 3.514 1.001 3000
res[93] 2.685 2.701 -2.625 0.890 2.690 4.461 8.277 1.001 3000
res[94] -1.218 2.759 -6.666 -3.017 -1.219 0.550 4.149 1.001 3000
res[95] -7.154 2.750 -12.652 -8.963 -7.228 -5.307 -1.734 1.001 3000
res[96] 5.254 2.780 -0.209 3.395 5.239 7.035 10.904 1.001 3000
res[97] 1.971 2.751 -3.399 0.120 1.989 3.804 7.443 1.002 1800
res[98] -2.284 2.732 -7.683 -4.131 -2.302 -0.416 3.174 1.001 3000
res[99] 2.398 2.737 -2.896 0.595 2.394 4.171 7.697 1.001 3000
res[100] -3.589 2.768 -8.965 -5.497 -3.564 -1.749 1.979 1.001 3000
res[101] 8.505 2.725 3.183 6.651 8.534 10.315 13.906 1.001 3000
res[102] -4.225 2.733 -9.489 -6.026 -4.236 -2.442 1.038 1.001 3000
res[103] 2.973 2.745 -2.387 1.160 3.003 4.793 8.346 1.001 3000
res[104] 1.894 2.757 -3.501 0.038 1.879 3.743 7.260 1.001 3000
res[105] -1.108 2.764 -6.620 -2.942 -1.108 0.771 4.185 1.001 3000
sigma 4.680 0.408 3.969 4.389 4.649 4.942 5.544 1.001 2100
sigma.B 11.682 1.534 9.146 10.592 11.534 12.562 15.177 1.001 3000
deviance 619.919 11.149 600.770 612.087 619.098 626.943 644.105 1.001 3000

For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
DIC info (using the rule, pD = var(deviance)/2)
pD = 62.2 and DIC = 682.1
DIC is an estimate of expected predictive error (lower deviance is better).
data.rcb.mcmc.list.m2 <- as.mcmc(data.rcb.r2jags.m2)
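Once the fit has been coerced to an mcmc.list, the usual coda diagnostics apply (summaries, effective sample sizes, Rhat). A minimal sketch of that workflow, using simulated chains so the snippet is self-contained; in practice you would pass the mcmc.list produced above instead of the fake one:

```r
library(coda)

# Two simulated chains standing in for the real mcmc.list from the fit above
set.seed(1)
chain1 <- mcmc(matrix(rnorm(2000), ncol = 2,
                      dimnames = list(NULL, c("beta[1]", "beta[2]"))))
chain2 <- mcmc(matrix(rnorm(2000), ncol = 2,
                      dimnames = list(NULL, c("beta[1]", "beta[2]"))))
fake.mcmc.list <- mcmc.list(chain1, chain2)

summary(fake.mcmc.list)        # posterior summaries per parameter
effectiveSize(fake.mcmc.list)  # effective sample sizes
gelman.diag(fake.mcmc.list)    # Rhat (potential scale reduction factor)
```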
Hierarchical parameterization
For a simple model with only two hierarchical levels, the hierarchical parameterization is the same as the model above.
$R^2$ and finite population standard deviations
modelString="
model {
   #Likelihood (estimating the block means, gamma)
   for (i in 1:n) {
      y[i]~dnorm(mu[i],tau)
      mu[i] <- gamma[Block[i]] + inprod(beta[], X[i,])
      y.err[i] <- mu[i]-y[i]
   }
   for (i in 1:nBlock) {
      gamma[i] ~ dnorm(0, tau.block)
   }
   #Priors
   for (i in 1:nX) {
      beta[i] ~ dnorm(0, 1.0E-6) #prior
   }
   sigma ~ dunif(0, 100)
   tau <- 1 / (sigma * sigma)
   sigma.block ~ dunif(0, 100)
   tau.block <- 1 / (sigma.block * sigma.block)
   sd.y <- sd(y.err)
   sd.block <- sd(gamma)
}
"
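Note the distinction built into this model: sigma.block is the superpopulation standard deviation (the SD of the distribution from which the block effects are drawn), whereas sd.block <- sd(gamma) is the finite-population standard deviation (the spread of the particular block effects realized in this design). A small self-contained illustration of why the two differ, with a hypothetical superpopulation value rather than the fitted one:

```r
# Illustration (not the fitted model): superpopulation vs finite-population SD
set.seed(42)
sigma.block <- 12                      # hypothetical superpopulation SD
gamma <- rnorm(35, 0, sigma.block)     # 35 realized block effects, as in this design

# The finite-population SD is the SD of these 35 realized effects; it will
# be close to, but not exactly equal to, sigma.block
sd(gamma)
```

Because the finite-population SD conditions on the observed blocks, its posterior is typically much narrower than that of sigma.block, which is exactly the pattern visible in the output below.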
A.Xmat <- model.matrix(~A,ddply(data.rcb,~Block,catcolwise(unique)))
data.rcb.list <- with(data.rcb,
        list(y=y,
             Block=Block,
             X=A.Xmat,
             n=nrow(data.rcb),
             nBlock=length(levels(Block)),
             nX=ncol(A.Xmat)
        )
)
params <- c("beta","sigma","sd.y","sd.block","sigma.block")
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps+ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
[1] -1.321587
jags.SD.time <- system.time(
data.rcb.r2jagsSD <- jags(data=data.rcb.list,
          inits=NULL,
          parameters.to.save=params,
          model.file=textConnection(modelString),
          n.chains=3,
          n.iter=nIter,
          n.burnin=burnInSteps,
          n.thin=thinSteps
          )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 899

Initializing model
jags.SD.time
   user  system elapsed 
  3.972   0.008   3.996 
print(data.rcb.r2jagsSD)
Inference for Bugs model at "5", fit using jags,
 3 chains, each with 13000 iterations (first 3000 discarded), n.thin = 10
 n.sims = 3000 iterations saved
            mu.vect sd.vect    2.5%     25%     50%     75%   97.5%  Rhat n.eff
beta[1]      43.008   2.193  38.860  41.550  42.966  44.443  47.378 1.001  2800
beta[2]      28.447   1.121  26.275  27.654  28.436  29.224  30.645 1.001  3000
beta[3]      40.177   1.134  37.950  39.428  40.190  40.941  42.370 1.001  2500
sd.block     11.499   0.472  10.584  11.180  11.497  11.827  12.400 1.001  3000
sd.y          4.605   0.232   4.220   4.438   4.580   4.751   5.113 1.002  1700
sigma         4.659   0.410   3.937   4.372   4.635   4.911   5.565 1.001  3000
sigma.block  11.967   1.611   9.316  10.865  11.819  12.883  15.607 1.001  2100
deviance    619.453  10.667 601.034 611.788 618.614 626.518 642.120 1.001  3000

For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).

DIC info (using the rule, pD = var(deviance)/2)
pD = 56.9 and DIC = 676.3
DIC is an estimate of expected predictive error (lower deviance is better).
data.rcb.mcmc.listSD <- as.mcmc(data.rcb.r2jagsSD)
Xmat <- model.matrix(~A, data.rcb)
coefs <- data.rcb.r2jagsSD$BUGSoutput$sims.list[['beta']]
fitted <- coefs %*% t(Xmat)
X.var <- aaply(fitted,1,function(x){var(x)})
Z.var <- data.rcb.r2jagsSD$BUGSoutput$sims.list[['sd.block']]^2
R.var <- data.rcb.r2jagsSD$BUGSoutput$sims.list[['sd.y']]^2
R2.marginal <- (X.var)/(X.var+Z.var+R.var)
R2.marginal <- data.frame(Mean=mean(R2.marginal), Median=median(R2.marginal), HPDinterval(as.mcmc(R2.marginal)))
R2.conditional <- (X.var+Z.var)/(X.var+Z.var+R.var)
R2.conditional <- data.frame(Mean=mean(R2.conditional), Median=median(R2.conditional), HPDinterval(as.mcmc(R2.conditional)))
R2.block <- (Z.var)/(X.var+Z.var+R.var)
R2.block <- data.frame(Mean=mean(R2.block), Median=median(R2.block), HPDinterval(as.mcmc(R2.block)))
R2.res <- (R.var)/(X.var+Z.var+R.var)
R2.res <- data.frame(Mean=mean(R2.res), Median=median(R2.res), HPDinterval(as.mcmc(R2.res)))
rbind(R2.block=R2.block, R2.marginal=R2.marginal, R2.res=R2.res, R2.conditional=R2.conditional)
                     Mean     Median      lower     upper
R2.block       0.30002983 0.29979922 0.25796679 0.3362629
R2.marginal    0.65172770 0.65224634 0.61326493 0.6888135
R2.res         0.04824247 0.04754536 0.03857305 0.0586949
R2.conditional 0.95175753 0.95245464 0.94130510 0.9614270
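These four components are not independent: since each is a share of the same total variance (X.var + Z.var + R.var), the block, marginal and residual components sum to one, and the conditional R² is the sum of the marginal and block components. A quick arithmetic check using the posterior means reported above:

```r
# Posterior means from the table above
R2.block       <- 0.30002983
R2.marginal    <- 0.65172770
R2.res         <- 0.04824247
R2.conditional <- 0.95175753

# Shares of the same total variance must sum to one...
all.equal(R2.block + R2.marginal + R2.res, 1)      # TRUE
# ...and the conditional R2 is the marginal plus the block component
all.equal(R2.marginal + R2.block, R2.conditional)  # TRUE
```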
Rstan
Cell means parameterization
rstanString="
data{
   int n;
   int nA;
   int nB;
   vector [n] y;
   int A[n];
   int B[n];
}
parameters{
   real alpha[nA];
   real<lower=0> sigma;
   vector [nB] beta;
   real<lower=0> sigma_B;
}
model{
   real mu[n];
   // Priors
   alpha ~ normal( 0 , 100 );
   beta ~ normal( 0 , sigma_B );
   sigma_B ~ cauchy( 0 , 25 );
   sigma ~ cauchy( 0 , 25 );

   for ( i in 1:n ) {
      mu[i] <- alpha[A[i]] + beta[B[i]];
   }
   y ~ normal( mu , sigma );
}
"
data.rcb.list <- with(data.rcb,
        list(y=y,
             A=as.numeric(A),
             B=as.numeric(Block),
             n=nrow(data.rcb),
             nB=length(levels(Block)),
             nA=length(levels(A))
        )
)
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps+ceiling((numSavedSteps * thinSteps)/nChains)
library(rstan)
rstan.c.time <- system.time(
data.rcb.rstan.c <- stan(data=data.rcb.list,
          model_code=rstanString,
          pars=c('alpha','sigma','sigma_B'),
          chains=nChains,
          iter=nIter,
          warmup=burnInSteps,
          thin=thinSteps,
          save_dso=TRUE
          )
)
TRANSLATING MODEL 'rstanString' FROM Stan CODE TO C++ CODE NOW.
COMPILING THE C++ CODE FOR MODEL 'rstanString' NOW.

SAMPLING FOR MODEL 'rstanString' NOW (CHAIN 1).

Iteration:     1 / 13000 [  0%]  (Warmup)
Iteration:  1300 / 13000 [ 10%]  (Warmup)
Iteration:  2600 / 13000 [ 20%]  (Warmup)
Iteration:  3001 / 13000 [ 23%]  (Sampling)
Iteration:  4300 / 13000 [ 33%]  (Sampling)
Iteration:  5600 / 13000 [ 43%]  (Sampling)
Iteration:  6900 / 13000 [ 53%]  (Sampling)
Iteration:  8200 / 13000 [ 63%]  (Sampling)
Iteration:  9500 / 13000 [ 73%]  (Sampling)
Iteration: 10800 / 13000 [ 83%]  (Sampling)
Iteration: 12100 / 13000 [ 93%]  (Sampling)
Iteration: 13000 / 13000 [100%]  (Sampling)
#  Elapsed Time: 0.4 seconds (Warm-up)
#                1.14 seconds (Sampling)
#                1.54 seconds (Total)

SAMPLING FOR MODEL 'rstanString' NOW (CHAIN 2).

Iteration:     1 / 13000 [  0%]  (Warmup)
Iteration:  1300 / 13000 [ 10%]  (Warmup)
Iteration:  2600 / 13000 [ 20%]  (Warmup)
Iteration:  3001 / 13000 [ 23%]  (Sampling)
Iteration:  4300 / 13000 [ 33%]  (Sampling)
Iteration:  5600 / 13000 [ 43%]  (Sampling)
Iteration:  6900 / 13000 [ 53%]  (Sampling)
Iteration:  8200 / 13000 [ 63%]  (Sampling)
Iteration:  9500 / 13000 [ 73%]  (Sampling)
Iteration: 10800 / 13000 [ 83%]  (Sampling)
Iteration: 12100 / 13000 [ 93%]  (Sampling)
Iteration: 13000 / 13000 [100%]  (Sampling)
#  Elapsed Time: 0.37 seconds (Warm-up)
#                1.26 seconds (Sampling)
#                1.63 seconds (Total)

SAMPLING FOR MODEL 'rstanString' NOW (CHAIN 3).

Iteration:     1 / 13000 [  0%]  (Warmup)
Iteration:  1300 / 13000 [ 10%]  (Warmup)
Iteration:  2600 / 13000 [ 20%]  (Warmup)
Iteration:  3001 / 13000 [ 23%]  (Sampling)
Iteration:  4300 / 13000 [ 33%]  (Sampling)
Iteration:  5600 / 13000 [ 43%]  (Sampling)
Iteration:  6900 / 13000 [ 53%]  (Sampling)
Iteration:  8200 / 13000 [ 63%]  (Sampling)
Iteration:  9500 / 13000 [ 73%]  (Sampling)
Iteration: 10800 / 13000 [ 83%]  (Sampling)
Iteration: 12100 / 13000 [ 93%]  (Sampling)
Iteration: 13000 / 13000 [100%]  (Sampling)
#  Elapsed Time: 0.42 seconds (Warm-up)
#                1.3 seconds (Sampling)
#                1.72 seconds (Total)
print(data.rcb.rstan.c)
Inference for Stan model: rstanString.
3 chains, each with iter=13000; warmup=3000; thin=10;
post-warmup draws per chain=1000, total post-warmup draws=3000.

           mean se_mean   sd    2.5%     25%     50%     75%   97.5% n_eff Rhat
alpha[1]  42.89    0.05 2.24   38.52   41.39   42.90   44.37   47.27  2125    1
alpha[2]  71.38    0.05 2.21   66.97   69.88   71.41   72.89   75.72  2166    1
alpha[3]  83.08    0.05 2.20   78.74   81.57   83.10   84.53   87.42  2114    1
sigma      4.67    0.01 0.41    3.93    4.38    4.63    4.92    5.55  2939    1
sigma_B   11.90    0.03 1.56    9.24   10.82   11.71   12.85   15.41  3000    1
lp__    -313.76    0.11 5.45 -324.89 -317.25 -313.37 -309.85 -304.44  2691    1

Samples were drawn using NUTS(diag_e) at Mon Mar 9 09:10:22 2015.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
data.rcb.rstan.c.df <- as.data.frame(extract(data.rcb.rstan.c))
head(data.rcb.rstan.c.df)
  alpha.1 alpha.2 alpha.3 sigma sigma_B   lp__
1   39.66   68.98   81.29 3.990  12.033 -302.8
2   42.15   70.92   81.55 4.437  15.084 -321.2
3   43.65   72.31   85.08 4.874   9.853 -316.7
4   46.98   75.78   88.41 4.893  11.365 -326.1
5   41.91   69.89   81.54 4.678  14.699 -322.2
6   40.61   68.60   79.92 4.293  15.213 -309.9
data.rcb.mcmc.c <- rstan:::as.mcmc.list.stanfit(data.rcb.rstan.c)
plyr:::adply(as.matrix(data.rcb.rstan.c.df),2,MCMCsum)
       X1   Median       X0.     X25.     X50.     X75.    X100.    lower    upper  lower.1  upper.1
1 alpha.1   42.900    34.146   41.385   42.900   44.371   51.119   38.613   47.342   40.641   45.058
2 alpha.2   71.405    62.558   69.880   71.405   72.889   79.280   66.819   75.389   69.261   73.589
3 alpha.3   83.102    75.224   81.569   83.102   84.526   91.045   78.704   87.371   80.878   85.183
4 sigma      4.634     3.519    4.376    4.634    4.922    6.221    3.849    5.434    4.203    4.991
5 sigma_B   11.709     7.877   10.820   11.709   12.848   21.071    9.041   15.084   10.200   13.095
6 lp__    -313.369  -340.392 -317.248 -313.369 -309.850 -298.056 -324.061 -303.851 -318.308 -307.557
Full effects parameterization
rstanString="
data{
   int n;
   int nB;
   vector [n] y;
   int A2[n];
   int A3[n];
   int B[n];
}
parameters{
   real alpha0;
   real alpha2;
   real alpha3;
   real<lower=0> sigma;
   vector [nB] beta;
   real<lower=0> sigma_B;
}
model{
   real mu[n];
   // Priors
   alpha0 ~ normal( 0 , 1000 );
   alpha2 ~ normal( 0 , 1000 );
   alpha3 ~ normal( 0 , 1000 );
   beta ~ normal( 0 , sigma_B );
   sigma_B ~ cauchy( 0 , 25 );
   sigma ~ cauchy( 0 , 25 );

   for ( i in 1:n ) {
      mu[i] <- alpha0 + alpha2*A2[i] + alpha3*A3[i] + beta[B[i]];
   }
   y ~ normal( mu , sigma );
}
"
A2 <- ifelse(data.rcb$A=='2',1,0)
A3 <- ifelse(data.rcb$A=='3',1,0)
data.rcb.list <- with(data.rcb,
        list(y=y,
             A2=A2, A3=A3,
             B=as.numeric(Block),
             n=nrow(data.rcb),
             nB=length(levels(Block))
        )
)
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps + ceiling((numSavedSteps * thinSteps)/nChains)
library(rstan)
rstan.f.time <- system.time(
data.rcb.rstan.f <- stan(data=data.rcb.list,
          model_code=rstanString,
          pars=c('alpha0','alpha2','alpha3','sigma','sigma_B'),
          chains=nChains,
          iter=nIter,
          warmup=burnInSteps,
          thin=thinSteps,
          save_dso=TRUE
          )
)
SAMPLING FOR MODEL 'e6b26e59c453ce19af522e475363a98a' NOW (CHAIN 1).

Chain 1, Iteration:     1 / 13000 [  0%]  (Warmup)
Chain 1, Iteration:  1300 / 13000 [ 10%]  (Warmup)
Chain 1, Iteration:  2600 / 13000 [ 20%]  (Warmup)
Chain 1, Iteration:  3001 / 13000 [ 23%]  (Sampling)
Chain 1, Iteration:  4300 / 13000 [ 33%]  (Sampling)
Chain 1, Iteration:  5600 / 13000 [ 43%]  (Sampling)
Chain 1, Iteration:  6900 / 13000 [ 53%]  (Sampling)
Chain 1, Iteration:  8200 / 13000 [ 63%]  (Sampling)
Chain 1, Iteration:  9500 / 13000 [ 73%]  (Sampling)
Chain 1, Iteration: 10800 / 13000 [ 83%]  (Sampling)
Chain 1, Iteration: 12100 / 13000 [ 93%]  (Sampling)
Chain 1, Iteration: 13000 / 13000 [100%]  (Sampling)
#  Elapsed Time: 0.687363 seconds (Warm-up)
#                1.86471 seconds (Sampling)
#                2.55208 seconds (Total)

SAMPLING FOR MODEL 'e6b26e59c453ce19af522e475363a98a' NOW (CHAIN 2).

Chain 2, Iteration:     1 / 13000 [  0%]  (Warmup)
Chain 2, Iteration:  1300 / 13000 [ 10%]  (Warmup)
Chain 2, Iteration:  2600 / 13000 [ 20%]  (Warmup)
Chain 2, Iteration:  3001 / 13000 [ 23%]  (Sampling)
Chain 2, Iteration:  4300 / 13000 [ 33%]  (Sampling)
Chain 2, Iteration:  5600 / 13000 [ 43%]  (Sampling)
Chain 2, Iteration:  6900 / 13000 [ 53%]  (Sampling)
Chain 2, Iteration:  8200 / 13000 [ 63%]  (Sampling)
Chain 2, Iteration:  9500 / 13000 [ 73%]  (Sampling)
Chain 2, Iteration: 10800 / 13000 [ 83%]  (Sampling)
Chain 2, Iteration: 12100 / 13000 [ 93%]  (Sampling)
Chain 2, Iteration: 13000 / 13000 [100%]  (Sampling)
#  Elapsed Time: 0.64299 seconds (Warm-up)
#                1.78891 seconds (Sampling)
#                2.4319 seconds (Total)

SAMPLING FOR MODEL 'e6b26e59c453ce19af522e475363a98a' NOW (CHAIN 3).

Chain 3, Iteration:     1 / 13000 [  0%]  (Warmup)
Chain 3, Iteration:  1300 / 13000 [ 10%]  (Warmup)
Chain 3, Iteration:  2600 / 13000 [ 20%]  (Warmup)
Chain 3, Iteration:  3001 / 13000 [ 23%]  (Sampling)
Chain 3, Iteration:  4300 / 13000 [ 33%]  (Sampling)
Chain 3, Iteration:  5600 / 13000 [ 43%]  (Sampling)
Chain 3, Iteration:  6900 / 13000 [ 53%]  (Sampling)
Chain 3, Iteration:  8200 / 13000 [ 63%]  (Sampling)
Chain 3, Iteration:  9500 / 13000 [ 73%]  (Sampling)
Chain 3, Iteration: 10800 / 13000 [ 83%]  (Sampling)
Chain 3, Iteration: 12100 / 13000 [ 93%]  (Sampling)
Chain 3, Iteration: 13000 / 13000 [100%]  (Sampling)
#  Elapsed Time: 0.721642 seconds (Warm-up)
#                1.92995 seconds (Sampling)
#                2.65159 seconds (Total)
print(data.rcb.rstan.f)
Inference for Stan model: e6b26e59c453ce19af522e475363a98a.
3 chains, each with iter=13000; warmup=3000; thin=10;
post-warmup draws per chain=1000, total post-warmup draws=3000.

          mean se_mean   sd    2.5%     25%     50%     75%   97.5% n_eff Rhat
alpha0   43.01    0.06 2.20   38.71   41.51   43.00   44.41   47.41  1445    1
alpha2   28.45    0.02 1.13   26.24   27.69   28.45   29.21   30.71  3000    1
alpha3   40.15    0.02 1.14   37.97   39.40   40.15   40.92   42.40  2663    1
sigma     4.65    0.01 0.42    3.92    4.36    4.63    4.92    5.52  3000    1
sigma_B  11.90    0.03 1.59    9.24   10.75   11.75   12.89   15.52  2700    1
lp__   -313.00    0.10 5.38 -324.44 -316.37 -312.61 -309.11 -303.95  2811    1

Samples were drawn using NUTS(diag_e) at Wed Dec 23 11:14:07 2015.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
data.rcb.rstan.f.df <- as.data.frame(extract(data.rcb.rstan.f))
head(data.rcb.rstan.f.df)
    alpha0   alpha2   alpha3    sigma  sigma_B      lp__
1 45.34963 28.86244 41.28095 4.288403 11.64362 -316.1950
2 46.56000 29.03809 40.02504 4.589700 11.16517 -309.2829
3 41.02639 29.06624 40.60771 4.816826 10.61045 -307.8916
4 43.12900 28.36760 39.66213 4.574166 12.08069 -313.3190
5 43.42910 28.56547 41.53942 4.432488 12.23545 -304.2449
6 44.52987 26.68965 39.34490 4.134544 11.10949 -311.3038
data.rcb.mcmc.f <- rstan:::as.mcmc.list.stanfit(data.rcb.rstan.f)
plyr:::adply(as.matrix(data.rcb.rstan.f.df), 2, MCMCsum)
       X1      Median         X0.        X25.        X50.        X75.       X100.       lower       upper     lower.1     upper.1
1  alpha0   42.998651   34.792884   41.513957   42.998651   44.405610   50.294911   38.811920   47.497702   40.593886   44.830394
2  alpha2   28.447276   23.922377   27.693890   28.447276   29.208311   32.080887   26.313779   30.740973   27.139039   29.394033
3  alpha3   40.151930   35.955333   39.398216   40.151930   40.916647   44.493609   38.084372   42.500434   39.034976   41.300031
4   sigma    4.626867    3.510477    4.358376    4.626867    4.921509    7.065056    3.874188    5.429956    4.195516    5.005088
5 sigma_B   11.753132    7.543985   10.749614   11.753132   12.887957   19.570931    9.016837   15.113263   10.095304   13.135502
6    lp__ -312.610598 -334.649842 -316.366239 -312.610598 -309.105863 -297.718740 -323.717718 -303.285639 -317.279913 -306.961175
Matrix effects parameterization
rstanString="
data{
   int n;
   int nX;
   int nB;
   vector [n] y;
   matrix [n,nX] X;
   int B[n];
}

parameters{
  vector [nX] beta;
  real<lower=0> sigma;
  vector [nB] gamma;
  real<lower=0> sigma_B;
}

transformed parameters {
  vector[n] mu;

  mu <- X*beta;
  for (i in 1:n) {
    mu[i] <- mu[i] + gamma[B[i]];
  }
}

model{
    // Priors
    beta ~ normal( 0 , 100 );
    gamma ~ normal( 0 , sigma_B );
    sigma_B ~ cauchy( 0 , 25 );
    sigma ~ cauchy( 0 , 25 );

    y ~ normal( mu , sigma );
}
"
Xmat <- model.matrix(~A, data=data.rcb)
data.rcb.list <- with(data.rcb,
                      list(y=y, X=Xmat, nX=ncol(Xmat),
                           B=as.numeric(Block),
                           n=nrow(data.rcb), nB=length(levels(Block))
                      )
)
library(rstan)
rstan.d.time <- system.time(
  data.rcb.rstan.d <- stan(data=data.rcb.list,
                           model_code=rstanString,
                           pars=c('beta','sigma','sigma_B'),
                           chains=3, iter=3000, warmup=1000, thin=2,
                           save_dso=TRUE
  )
)
SAMPLING FOR MODEL '18e6498c61bcea7cdfdc0535e9da24c2' NOW (CHAIN 1).

Chain 1, Iteration:    1 / 3000 [  0%]  (Warmup)
Chain 1, Iteration:  300 / 3000 [ 10%]  (Warmup)
Chain 1, Iteration:  600 / 3000 [ 20%]  (Warmup)
Chain 1, Iteration:  900 / 3000 [ 30%]  (Warmup)
Chain 1, Iteration: 1001 / 3000 [ 33%]  (Sampling)
Chain 1, Iteration: 1300 / 3000 [ 43%]  (Sampling)
Chain 1, Iteration: 1600 / 3000 [ 53%]  (Sampling)
Chain 1, Iteration: 1900 / 3000 [ 63%]  (Sampling)
Chain 1, Iteration: 2200 / 3000 [ 73%]  (Sampling)
Chain 1, Iteration: 2500 / 3000 [ 83%]  (Sampling)
Chain 1, Iteration: 2800 / 3000 [ 93%]  (Sampling)
Chain 1, Iteration: 3000 / 3000 [100%]  (Sampling)
#  Elapsed Time: 0.453177 seconds (Warm-up)
#                0.62562 seconds (Sampling)
#                1.0788 seconds (Total)

SAMPLING FOR MODEL '18e6498c61bcea7cdfdc0535e9da24c2' NOW (CHAIN 2).

Chain 2, Iteration:    1 / 3000 [  0%]  (Warmup)
Chain 2, Iteration:  300 / 3000 [ 10%]  (Warmup)
Chain 2, Iteration:  600 / 3000 [ 20%]  (Warmup)
Chain 2, Iteration:  900 / 3000 [ 30%]  (Warmup)
Chain 2, Iteration: 1001 / 3000 [ 33%]  (Sampling)
Chain 2, Iteration: 1300 / 3000 [ 43%]  (Sampling)
Chain 2, Iteration: 1600 / 3000 [ 53%]  (Sampling)
Chain 2, Iteration: 1900 / 3000 [ 63%]  (Sampling)
Chain 2, Iteration: 2200 / 3000 [ 73%]  (Sampling)
Chain 2, Iteration: 2500 / 3000 [ 83%]  (Sampling)
Chain 2, Iteration: 2800 / 3000 [ 93%]  (Sampling)
Chain 2, Iteration: 3000 / 3000 [100%]  (Sampling)
#  Elapsed Time: 0.545787 seconds (Warm-up)
#                0.602373 seconds (Sampling)
#                1.14816 seconds (Total)

SAMPLING FOR MODEL '18e6498c61bcea7cdfdc0535e9da24c2' NOW (CHAIN 3).

Chain 3, Iteration:    1 / 3000 [  0%]  (Warmup)
Chain 3, Iteration:  300 / 3000 [ 10%]  (Warmup)
Chain 3, Iteration:  600 / 3000 [ 20%]  (Warmup)
Chain 3, Iteration:  900 / 3000 [ 30%]  (Warmup)
Chain 3, Iteration: 1001 / 3000 [ 33%]  (Sampling)
Chain 3, Iteration: 1300 / 3000 [ 43%]  (Sampling)
Chain 3, Iteration: 1600 / 3000 [ 53%]  (Sampling)
Chain 3, Iteration: 1900 / 3000 [ 63%]  (Sampling)
Chain 3, Iteration: 2200 / 3000 [ 73%]  (Sampling)
Chain 3, Iteration: 2500 / 3000 [ 83%]  (Sampling)
Chain 3, Iteration: 2800 / 3000 [ 93%]  (Sampling)
Chain 3, Iteration: 3000 / 3000 [100%]  (Sampling)
#  Elapsed Time: 0.475119 seconds (Warm-up)
#                0.637245 seconds (Sampling)
#                1.11236 seconds (Total)
print(data.rcb.rstan.d)
Inference for Stan model: 18e6498c61bcea7cdfdc0535e9da24c2.
3 chains, each with iter=3000; warmup=1000; thin=2;
post-warmup draws per chain=1000, total post-warmup draws=3000.

           mean se_mean   sd    2.5%     25%     50%     75%   97.5% n_eff Rhat
beta[1]   42.98    0.11 2.08   38.70   41.64   42.99   44.37   46.96   333 1.01
beta[2]   28.42    0.02 1.12   26.23   27.65   28.41   29.18   30.62  2244 1.00
beta[3]   40.12    0.02 1.12   37.90   39.40   40.12   40.90   42.20  2351 1.00
sigma      4.65    0.01 0.40    3.97    4.36    4.61    4.90    5.51  1901 1.00
sigma_B   11.93    0.03 1.60    9.26   10.79   11.79   12.90   15.57  2405 1.00
lp__    -312.99    0.14 5.47 -324.51 -316.52 -312.72 -309.00 -303.73  1441 1.01

Samples were drawn using NUTS(diag_e) at Wed Dec 23 11:24:24 2015.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
data.rcb.rstan.d.df <- as.data.frame(extract(data.rcb.rstan.d))
head(data.rcb.rstan.d.df)
    beta.1   beta.2   beta.3    sigma   sigma_B      lp__
1 39.14560 31.09324 39.65505 4.573291 10.759842 -314.6550
2 44.86351 27.03882 39.13673 5.144527 10.361909 -311.5803
3 43.57541 29.97864 40.73320 5.101668  9.780195 -318.3110
4 45.90323 27.41146 39.27068 4.899846 12.238562 -313.2588
5 40.54387 27.54918 41.86662 4.718828 13.027700 -317.1884
6 41.80010 27.94253 40.45006 4.865833  9.818217 -321.6274
data.rcb.mcmc.d <- rstan:::as.mcmc.list.stanfit(data.rcb.rstan.d)
plyr:::adply(as.matrix(data.rcb.rstan.d.df), 2, MCMCsum)
       X1      Median         X0.        X25.        X50.        X75.       X100.       lower       upper    lower.1     upper.1
1  beta.1   42.990618   35.807843   41.644381   42.990618   44.365448   50.508390   38.929137   47.081249   41.26412   45.269817
2  beta.2   28.414124   23.722028   27.647317   28.414124   29.179416   32.416893   26.251619   30.623911   27.25986   29.452460
3  beta.3   40.121429   35.797484   39.399052   40.121429   40.900593   44.350100   38.055674   42.353027   39.11139   41.327864
4   sigma    4.612858    3.455549    4.355802    4.612858    4.898886    6.464649    3.939388    5.481554    4.17415    4.957871
5 sigma_B   11.792348    7.897597   10.788851   11.792348   12.897968   18.886186    9.090628   15.144342   10.14461   13.213148
6    lp__ -312.723587 -339.349432 -316.520568 -312.723587 -308.998518 -299.367259 -324.529867 -303.740895 -317.55765 -306.738854
Planned comparisons and pairwise tests
Since there are no restrictions on the type and number of comparisons derived from the posteriors, Bayesian analyses provide a natural framework for exploring additional contrasts and comparisons. For example, to compare all possible pairs of treatment levels:
coefs <- data.rcb.r2jags.m$BUGSoutput$sims.list[['beta']]
head(coefs)
         [,1]     [,2]     [,3]
[1,] 42.36122 28.03856 41.06767
[2,] 43.91985 28.54764 39.03881
[3,] 42.39406 27.01535 38.90414
[4,] 42.33675 28.00178 40.57500
[5,] 39.45086 29.89651 40.68564
[6,] 42.20516 25.63713 38.38961
newdata <- data.frame(A=levels(data.rcb$A))
# A Tukey's contrast matrix
library(multcomp)
tuk.mat <- contrMat(n=table(newdata$A), type="Tukey")
Xmat <- model.matrix(~A, data=newdata)
pairwise.mat <- tuk.mat %*% Xmat
pairwise.mat
      (Intercept) A2 A3
2 - 1           0  1  0
3 - 1           0  0  1
3 - 2           0 -1  1
comps <- coefs %*% t(pairwise.mat)
MCMCsum <- function(x) {
   data.frame(Median=median(x, na.rm=TRUE),
              t(quantile(x, na.rm=TRUE)),
              HPDinterval(as.mcmc(x)),
              HPDinterval(as.mcmc(x), p=0.5))
}
(comps <- plyr:::adply(comps, 2, MCMCsum))
     X1   Median       X0.     X25.     X50.     X75.    X100.     lower    upper  lower.1  upper.1
1 2 - 1 28.46831 23.861253 27.74467 28.46831 29.23208 32.74363 26.360184 30.73643 27.73428 29.21728
2 3 - 1 40.21721 35.148536 39.44456 40.21721 40.94479 45.57754 38.033142 42.42591 39.63044 41.10973
3 3 - 2 11.70102  8.092086 10.91325 11.70102 12.46882 15.67819  9.546795 14.05773 11.01591 12.56149
library(ggplot2)
library(grid)      # for unit(), used in plot.margin
library(gridExtra)
ggplot(comps, aes(x=X1, y=Median)) +
  coord_flip() +
  geom_hline(yintercept=0, linetype=2) +
  geom_errorbar(aes(ymin=lower, ymax=upper), width=0) +
  geom_errorbar(aes(ymin=lower.1, ymax=upper.1), width=0, size=1.25) +
  geom_point() +
  scale_y_continuous("Effect size (median)") +
  scale_x_discrete("Comparison (A)") +
  theme(panel.grid.major=element_blank(),
        panel.grid.minor=element_blank(),
        panel.background=element_blank(),
        panel.border=element_blank(),
        axis.line=element_line(),
        axis.line.y=element_blank(),
        axis.title.y=element_text(size=17, vjust=2, angle=90),
        axis.text.y=element_text(size=12),
        axis.title.x=element_text(size=17, vjust=-2),
        axis.text.x=element_text(size=10),
        strip.background=element_rect(fill="transparent", colour="black"),
        plot.margin=unit(c(0.5,0.5,2,2), "lines")
  )
RCB (repeated measures) ANOVA in R - continuous within
Scenario and Data
Imagine now that we have designed an experiment to investigate the effects of a continuous predictor ($x$, for example time) on a response ($y$). Again, the system that we intend to sample is spatially heterogeneous and will therefore add a great deal of noise to the data, making it difficult to detect a signal (impact of treatment).
Thus, in an attempt to constrain this variability, we again decide to apply an RCB design in which each level of the within-block factor (such as time) is measured within each of 35 blocks dispersed randomly throughout the landscape. As this section is mainly about the generation of artificial data (and not specifically about what to do with the data), understanding the actual details is optional and can be safely skipped. Consequently, I have folded (toggled) this section away.
- the number of times = 10
- the number of blocks containing treatments = 35
- mean slope (rate of change in response over time) = 30
- mean intercept (value of response at time 0) = 200
- the variability (standard deviation) between blocks = 30
- the variability (standard deviation) of the residuals = 50
- the first-order autocorrelation of the residuals (rho) = 0.8
library(plyr)
set.seed(1)
slope <- 30
intercept <- 200
nBlock <- 35
nTime <- 10
sigma <- 50
sigma.block <- 30
n <- nBlock*nTime
Block <- gl(nBlock, k=1)
Time <- 1:10
rho <- 0.8
dt <- expand.grid(Time=Time, Block=Block)
Xmat <- model.matrix(~-1+Block + Time, data=dt)
block.effects <- rnorm(n = nBlock, mean = intercept, sd = sigma.block)
#A.effects <- c(30,40)
all.effects <- c(block.effects, slope)
lin.pred <- Xmat %*% all.effects

# OR
Xmat <- cbind(model.matrix(~-1+Block, data=dt), model.matrix(~Time, data=dt))
## Sum to zero block effects
##block.effects <- rnorm(n = nBlock, mean = 0, sd = sigma.block)
###A.effects <- c(40,70,80)
##all.effects <- c(block.effects, intercept, slope)
##lin.pred <- Xmat %*% all.effects

## the quadrat observations (within sites) are drawn from
## normal distributions with means according to the site means
## and standard deviations of 5
eps <- NULL
eps[1] <- 0
for (j in 2:n) {
  eps[j] <- rho*eps[j-1] #residuals
}
y <- rnorm(n, lin.pred, sigma) + eps

#OR
eps <- NULL
# first value cant be autocorrelated
eps[1] <- rnorm(1, 0, sigma)
for (j in 2:n) {
  eps[j] <- rho*eps[j-1] + rnorm(1, mean = 0, sd = sigma) #residuals
}
y <- lin.pred + eps
data.rm <- data.frame(y=y, dt)
head(data.rm)  #print out the first six rows of the data set
         y Time Block
1 208.3803    1     1
2 132.4775    2     1
3 201.4656    3     1
4 150.1660    4     1
5 169.8155    5     1
6 298.2939    6     1
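As a quick sanity check on the second (AR(1)) residual construction above, the lag-1 autocorrelation of the simulated residual series should be close to the value of rho (0.8) used to generate it. A minimal sketch (this re-generates a residual series with a different seed purely for illustration; it is not part of the tutorial's analysis):

```r
# Regenerate an AR(1) residual series and confirm its lag-1
# autocorrelation is near the generating value rho = 0.8
set.seed(2)
rho <- 0.8
sigma <- 50
n <- 350
eps <- numeric(n)
eps[1] <- rnorm(1, 0, sigma)            # first value cannot be autocorrelated
for (j in 2:n) {
  eps[j] <- rho * eps[j - 1] + rnorm(1, mean = 0, sd = sigma)
}
# empirical lag-1 autocorrelation (second element of the ACF)
acf(eps, plot = FALSE)$acf[2]
```

With 350 observations, the empirical estimate should fall within sampling error of 0.8.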
library(ggplot2)
ggplot(data.rm, aes(y=y, x=Time)) + geom_smooth(method='lm') + geom_point() + facet_wrap(~Block)
Exploratory data analysis
Normality and Homogeneity of variance
boxplot(y~Time, data.rm)
ggplot(data.rm, aes(y=y, x=factor(Time))) + geom_boxplot()
Conclusions:
- there is no evidence that the response variable is consistently non-normal across all populations - each boxplot is approximately symmetrical
- there is no evidence that variance (as estimated by the height of the boxplots) differs between the ten populations. More importantly, there is no evidence of a relationship between mean and variance - the height of the boxplots does not increase with increasing position along the y-axis. Hence there is no evidence of non-homogeneity
- there is consequently no need to transform the scale of the response variable (to address normality etc.). Note, transformations should be applied to the entire response variable (not just those populations that are skewed).
Block by within-Block interaction
library(car)
with(data.rm, interaction.plot(Time, Block, y))
ggplot(data.rm, aes(y=y, x=Time, color=Block, group=Block)) + geom_line() + guides(color=guide_legend(ncol=3))
library(car)
residualPlots(lm(y~Block+Time, data.rm))
           Test stat Pr(>|t|)
Block             NA       NA
Time           0.485    0.628
Tukey test    -0.663    0.507
# the Tukey's non-additivity test by itself can be obtained via an internal function
# within the car package
car:::tukeyNonaddTest(lm(y~Block+Time, data.rm))
      Test     Pvalue
-0.6628553  0.5074232
# alternatively, there is also a Tukey's non-additivity test within the
# asbio package
library(asbio)
with(data.rm, tukey.add.test(y, Time, Block))
Tukey's one df test for additivity
F = 1.2438417   Denom df = 305   p-value = 0.2656099
Conclusions:
- there is no visual or inferential evidence of any major interactions between Block and the within-Block effect (Time). Any trends appear to be reasonably consistent between Blocks.
Sphericity
Since the levels of Time cannot be randomly assigned, it is likely that sphericity is not met.
library(biology) epsi.GG.HF(aov(y~Error(Block)+Time, data=data.rm))
$GG.eps
[1] 0.3843977

$HF.eps
[1] 0.433252

$sigma
[1] 4010.234
Conclusions:
- Both the Greenhouse-Geisser and Huynh-Feldt epsilons are very low. In fact, since the Greenhouse-Geisser epsilon is lower than 0.5, we will base any corrections on the Huynh-Feldt measure. Essentially, when using traditional ANOVA modelling, we would multiply the degrees of freedom by the epsilon value in order to lower the sensitivity of the tests. This is somewhat of a hack that attempts to compensate for the inflated Type I error rate by adjusting the tests proportionally to the approximate degree of severity (deviation from sphericity).
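The mechanics of the epsilon correction can be illustrated as follows. This is only a sketch: the F statistic and degrees of freedom below are hypothetical placeholders (not values estimated from these data); only the Greenhouse-Geisser epsilon is taken from the output above.

```r
# Epsilon-corrected F test: multiply both numerator and denominator
# degrees of freedom by epsilon before computing the p-value.
F.stat <- 5.2                 # hypothetical F statistic
df1 <- 9; df2 <- 306          # hypothetical uncorrected degrees of freedom
eps <- 0.3843977              # Greenhouse-Geisser epsilon from above

p.uncorrected <- pf(F.stat, df1, df2, lower.tail = FALSE)
p.corrected   <- pf(F.stat, eps * df1, eps * df2, lower.tail = FALSE)
c(p.uncorrected, p.corrected)
```

Shrinking the degrees of freedom in this way yields a larger (more conservative) p-value, which is exactly the intended loss of sensitivity.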
Alternatively (and preferably), we can explore whether there are auto-correlation patterns in the residuals. Note, as there were only ten time periods, it does not make logical sense to explore lags above 10.
library(nlme)
data.rm.lme <- lme(y~Time, random=~1|Block, data=data.rm)
acf(resid(data.rm.lme), lag=10)
Conclusions:
- The autocorrelation function (ACF), evaluated at a range of lags up to 10, indicates that there is a cyclical pattern of residual auto-correlation. We really should explore incorporating some form of correlation structure into our model.
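For reference, a first-order autoregressive correlation structure can be incorporated into an lme() fit via corAR1(). The sketch below uses a small self-contained simulated data set of the same shape as data.rm (35 blocks by 10 times) purely to illustrate the syntax; the Bayesian equivalents for these data are developed in the sections that follow.

```r
library(nlme)
set.seed(1)
# minimal simulated data in the same shape as data.rm (illustration only)
d <- expand.grid(Time = 1:10, Block = factor(1:35))
d$y <- 200 + 30 * d$Time + rnorm(nrow(d), 0, 50)

# random intercept per Block plus AR(1) correlation of the
# within-Block residuals ordered by Time
fit <- lme(y ~ Time, random = ~1 | Block,
           correlation = corAR1(form = ~Time | Block), data = d)
# the estimated Phi parameter approximates the residual lag-1 autocorrelation
coef(fit$modelStruct$corStruct, unconstrained = FALSE)
```

For data generated with genuine autocorrelation (unlike this independent-noise illustration), Phi would be expected to approach the generating rho.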
Model fitting or statistical analysis
JAGS
Full parameterization | Matrix parameterization | Hierarchical parameterization |
---|---|---|
$$ \begin{array}{rcl} y_{ijk}&\sim&\mathcal{N}(\mu_{ij},\sigma^2)\\ \mu_{ij} &=& \beta_0 + \beta_{i} + \gamma_{j(i)}\\ \gamma_{j(i)}&\sim&\mathcal{N}(0,\sigma_{B}^2)\\ \beta_0, \beta_i&\sim&\mathcal{N}(0,100000)\\ \sigma^2, \sigma_{B}^2&\sim&\mathcal{Cauchy}(0,25)\\ \end{array} $$ | $$ \begin{array}{rcl} y_{ijk}&\sim&\mathcal{N}(\mu_{ij},\sigma^2)\\ \mu_{ij} &=& \mathbf{X}\beta + \gamma_{j(i)}\\ \gamma_{j(i)}&\sim&\mathcal{N}(0,\sigma_{B}^2)\\ \beta&\sim&\mathcal{MVN}(0,100000)\\ \sigma^2, \sigma_{B}^2&\sim&\mathcal{Cauchy}(0,25)\\ \end{array} $$ | $$ \begin{array}{rcl} y_{ijk}&\sim&\mathcal{N}(\mu_{ij},\sigma^2)\\ \mu_{ij} &=& \beta_0 + \beta_{i} + \gamma_{j(i)}\\ \gamma_{j(i)}&\sim&\mathcal{N}(0,\sigma_{B}^2)\\ \beta_0, \beta_i&\sim&\mathcal{N}(0, 1000000)\\ \sigma^2, \sigma_{B}^2&\sim&\mathcal{Cauchy}(0,25)\\ \end{array} $$ |
The full parameterization shows the effects parameterization in which there is an intercept ($\beta_0$) and two treatment effects ($\beta_i$, where $i$ is 1, 2).
The matrix parameterization is a compressed notation. In this parameterization, there are three beta parameters (one representing the mean of treatment a1, and the other two representing the treatment effects (differences between a2 and a1, and between a3 and a1)). In generating priors for each of these three parameters, we could loop through each and define a non-informative normal prior for each (as in the full parameterization version). However, it turns out that it is more efficient (in terms of mixing and thus the number of necessary iterations) to define the priors from a multivariate normal distribution. This has as many means as there are parameters to estimate (3) and a 3x3 variance-covariance matrix with zeros off the diagonal and 1000000 in the diagonal. $$ \mu\sim\left[ \begin{array}{c} 0\\ 0\\ 0\\ \end{array} \right], \hspace{2em} \sigma^2\sim\left[ \begin{array}{ccc} 1000000&0&0\\ 0&1000000&0\\ 0&0&1000000\\ \end{array} \right] $$
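Note that JAGS parameterizes the multivariate normal (dmnorm) by a precision matrix, the inverse of the variance-covariance matrix above. The prior inputs described in the paragraph above could therefore be constructed in R as follows (a sketch; the object names a.mu and a.prec are illustrative, not from the tutorial's model):

```r
# Inputs for a multivariate normal prior on three parameters:
# a mean vector of zeros and a diagonal precision matrix
# (precision = 1/variance = 1/1000000 = 1e-6 on the diagonal)
nX <- 3
a.mu <- rep(0, nX)
a.prec <- diag(1.0E-6, nX)
a.prec
```

These would then be passed to JAGS as data and used in a statement of the form beta[1:nX] ~ dmnorm(a.mu[], a.prec[,]).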
Rather than assume a specific variance-covariance structure, we can (just as with lme()) incorporate an appropriate structure to account for different dependency/correlation structures in our data. In RCB designs, it is prudent to capture the residuals so that we can check that no outstanding dependency issues remain after model fitting.
Full effects parameterization
modelString="
model {
   #Likelihood
   for (i in 1:n) {
      y[i]~dnorm(mu[i],tau)
      mu[i] <- beta0 + beta*Time[i] + gamma[Block[i]]
      res[i] <- y[i]-mu[i]
   }

   #Priors
   beta0 ~ dnorm(0, 1.0E-6)
   beta ~ dnorm(0, 1.0E-6)
   for (i in 1:nBlock) {
      gamma[i] ~ dnorm(0, tau.B) #prior
   }
   tau <- pow(sigma,-2)
   sigma <- z/sqrt(chSq)
   z ~ dnorm(0, 0.0016)I(0,)    #1/25^2 = 0.0016
   chSq ~ dgamma(0.5, 0.5)

   tau.B <- pow(sigma.B,-2)
   sigma.B <- z.B/sqrt(chSq.B)
   z.B ~ dnorm(0, 0.0016)I(0,)  #1/25^2 = 0.0016
   chSq.B ~ dgamma(0.5, 0.5)
 }
"
data.rm.list <- with(data.rm,
                     list(y=y,
                          Block=as.numeric(Block),
                          Time=Time,
                          n=nrow(data.rm),
                          nBlock=length(levels(Block))
                     )
)
params <- c("beta0","beta",'gamma',"sigma","sigma.B","res")
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps+ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
[1] 0.9598324
jags.effects.f.time <- system.time(
  data.rm.r2jags.f <- jags(data=data.rm.list,
                           inits=NULL,
                           parameters.to.save=params,
                           model.file=textConnection(modelString),
                           n.chains=3,
                           n.iter=nIter,
                           n.burnin=burnInSteps,
                           n.thin=thinSteps
  )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
   Graph Size: 1815

Initializing model
jags.effects.f.time
user system elapsed 30.410 0.164 30.664
print(data.rm.r2jags.f)
Inference for Bugs model at "5", fit using jags, 3 chains, each with 13000 iterations (first 3000 discarded), n.thin = 10 n.sims = 3000 iterations saved mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff beta 32.704 1.152 30.475 31.939 32.683 33.449 34.962 1.002 1900 beta0 167.104 13.005 141.832 158.519 167.181 175.619 192.917 1.001 3000 gamma[1] -41.918 21.470 -85.938 -56.071 -41.921 -27.977 0.789 1.001 3000 gamma[2] 75.885 21.316 34.048 62.116 75.867 89.977 118.173 1.001 3000 gamma[3] 4.040 20.752 -36.796 -10.040 4.088 18.018 44.390 1.001 3000 gamma[4] 84.330 21.491 42.155 69.811 84.525 98.156 127.562 1.002 3000 gamma[5] -54.946 21.868 -97.303 -69.900 -54.597 -40.009 -13.005 1.002 1400 gamma[6] 5.394 21.218 -37.404 -8.371 5.914 19.673 46.428 1.001 3000 gamma[7] -95.251 21.618 -137.968 -109.493 -95.316 -81.318 -53.125 1.001 3000 gamma[8] -30.879 21.696 -74.300 -45.162 -30.958 -16.123 11.282 1.001 3000 gamma[9] 74.818 21.731 31.804 60.244 74.469 89.409 118.312 1.001 3000 gamma[10] 17.672 21.447 -23.916 3.263 17.708 31.968 59.536 1.001 3000 gamma[11] 101.374 21.653 60.948 86.562 101.234 115.815 143.977 1.001 3000 gamma[12] 33.530 21.467 -9.007 19.126 33.337 47.782 75.017 1.001 3000 gamma[13] -60.145 21.611 -103.492 -74.700 -60.171 -45.638 -18.564 1.001 3000 gamma[14] -84.567 21.481 -127.018 -99.563 -84.499 -70.585 -41.204 1.001 3000 gamma[15] 40.324 21.000 -1.103 25.904 40.303 54.150 81.007 1.001 2400 gamma[16] 70.285 21.365 28.294 55.900 70.484 84.694 111.513 1.002 1200 gamma[17] 8.368 21.643 -33.280 -6.362 8.166 23.195 49.769 1.001 3000 gamma[18] -32.845 21.397 -75.182 -47.393 -32.794 -18.464 8.855 1.001 2400 gamma[19] 33.625 21.493 -8.787 19.567 33.591 47.426 75.772 1.003 880 gamma[20] 77.463 21.952 34.170 62.636 77.326 92.913 118.041 1.001 3000 gamma[21] 86.457 21.737 43.411 71.579 86.679 101.527 128.565 1.001 3000 gamma[22] 25.011 21.261 -17.686 11.399 24.538 39.068 68.664 1.001 3000 gamma[23] 19.047 21.495 -22.685 4.403 18.696 33.418 62.236 1.001 3000 gamma[24] 
-55.608 21.482 -97.248 -70.183 -55.844 -40.777 -14.447 1.001 3000 gamma[25] 25.985 21.668 -16.876 11.489 25.851 41.109 69.144 1.001 3000 gamma[26] -60.660 21.449 -102.098 -75.148 -60.470 -46.552 -16.890 1.001 3000 gamma[27] -72.455 21.497 -115.552 -86.709 -72.054 -58.242 -30.887 1.002 1300 gamma[28] -101.686 21.196 -143.323 -115.680 -102.121 -87.263 -61.071 1.001 2800 gamma[29] -42.330 21.984 -85.621 -57.484 -42.143 -27.338 -0.591 1.001 3000 gamma[30] -57.399 21.737 -100.094 -71.642 -57.864 -42.735 -13.552 1.002 1500 gamma[31] 17.763 21.450 -24.195 2.901 17.868 32.341 58.496 1.003 700 gamma[32] -22.575 21.479 -65.693 -36.984 -22.081 -8.033 19.144 1.001 2400 gamma[33] 30.022 21.194 -12.262 15.951 30.447 44.285 71.679 1.002 2000 gamma[34] 25.703 21.122 -15.480 11.570 25.684 39.627 66.349 1.001 2800 gamma[35] -38.600 21.726 -81.407 -53.020 -39.010 -24.081 4.340 1.001 3000 res[1] 50.490 19.406 12.040 37.616 50.369 63.073 89.148 1.001 3000 res[2] -58.117 19.132 -95.461 -70.799 -58.134 -45.579 -19.988 1.001 3000 res[3] -21.833 18.925 -58.419 -34.241 -21.905 -9.681 15.551 1.001 3000 res[4] -105.837 18.786 -142.191 -118.292 -105.906 -93.738 -68.501 1.001 3000 res[5] -118.891 18.717 -155.543 -131.260 -118.963 -106.774 -81.559 1.001 3000 res[6] -23.117 18.718 -59.986 -35.553 -23.312 -10.695 14.245 1.001 3000 res[7] 17.543 18.791 -19.593 5.022 17.275 30.076 55.544 1.001 3000 res[8] 73.332 18.933 35.900 60.822 72.978 86.090 111.819 1.001 3000 res[9] 78.169 19.144 39.955 65.502 77.788 91.096 116.952 1.001 3000 res[10] 60.660 19.421 21.592 47.924 60.318 73.577 99.453 1.001 3000 res[11] -61.054 19.602 -99.232 -73.840 -61.129 -47.993 -22.438 1.001 3000 res[12] 20.061 19.359 -17.284 7.213 19.897 33.073 58.192 1.001 3000 res[13] 7.018 19.183 -29.684 -5.919 6.804 19.926 44.992 1.001 3000 res[14] -41.965 19.075 -78.273 -54.957 -42.085 -29.272 -4.225 1.001 3000 res[15] -2.674 19.036 -38.971 -15.753 -2.706 10.016 35.087 1.001 3000 res[16] 38.679 19.067 2.029 25.642 38.776 51.371 76.662 
1.001 3000 res[17] 112.281 19.167 75.712 99.163 112.597 125.106 150.706 1.001 3000 res[18] 45.689 19.335 8.627 32.558 46.160 58.425 84.275 1.001 3000 res[19] 2.481 19.570 -34.984 -11.083 3.170 15.260 41.612 1.001 3000 res[20] -33.894 19.869 -71.785 -47.620 -33.315 -20.744 5.541 1.001 3000 res[21] 6.802 19.388 -32.872 -6.345 6.714 20.205 44.991 1.001 3000 res[22] -15.378 19.094 -53.960 -28.131 -15.291 -2.284 22.630 1.001 3000 res[23] -30.115 18.866 -68.013 -42.758 -29.926 -17.152 7.259 1.001 3000 res[24] 44.430 18.706 7.141 32.107 44.559 57.076 81.063 1.001 3000 res[25] -3.443 18.616 -39.963 -15.908 -3.342 9.079 32.770 1.001 3000 res[26] -26.814 18.598 -63.353 -39.174 -26.646 -14.349 9.895 1.001 3000 res[27] 5.984 18.650 -30.923 -6.292 6.173 18.366 43.138 1.001 3000 res[28] 55.293 18.773 18.259 42.916 55.428 67.655 92.516 1.001 3000 res[29] -0.645 18.965 -37.912 -13.303 -0.461 11.831 37.124 1.001 3000 res[30] -32.734 19.225 -70.327 -45.746 -32.432 -20.076 5.224 1.001 3000 res[31] 12.318 19.933 -26.235 -0.864 12.687 25.434 50.427 1.001 2900 res[32] 56.782 19.693 19.201 43.749 57.127 69.935 94.441 1.001 3000 res[33] 28.366 19.518 -8.899 15.269 28.616 41.399 65.427 1.001 3000 res[34] -53.848 19.410 -91.248 -66.913 -53.494 -41.000 -17.197 1.001 3000 res[35] 36.795 19.370 -0.711 23.815 37.003 49.742 73.140 1.001 3000 res[36] 95.066 19.398 57.139 82.308 95.461 108.270 132.012 1.001 3000 res[37] 33.870 19.495 -4.618 21.157 34.263 47.035 71.603 1.001 3000 res[38] 16.637 19.658 -22.226 3.780 17.010 29.762 55.083 1.001 3000 res[39] -82.410 19.887 -122.264 -95.332 -82.013 -69.247 -43.483 1.001 3000 res[40] -45.729 20.180 -86.466 -58.817 -45.453 -32.387 -5.814 1.001 3000 res[41] 163.552 19.721 125.866 150.121 163.548 176.532 202.085 1.001 2600 res[42] 65.278 19.444 28.343 52.070 65.255 78.244 103.415 1.001 2500 res[43] 29.004 19.231 -7.453 15.824 29.142 41.832 66.665 1.002 1900 res[44] 6.363 19.086 -29.929 -6.924 6.542 19.672 43.468 1.002 1700 res[45] -14.288 19.010 -50.567 
-27.628 -14.337 -1.140 22.490 1.002 1500 res[46] -98.958 19.003 -135.672 -112.208 -98.937 -85.855 -62.062 1.002 1400 res[47] -40.522 19.066 -77.495 -53.568 -40.692 -27.502 -3.983 1.002 1300 res[48] -95.952 19.198 -133.392 -109.206 -96.104 -82.859 -58.793 1.002 1200 res[49] -65.497 19.398 -103.339 -78.638 -65.575 -52.138 -27.665 1.002 1100 res[50] -10.775 19.663 -49.182 -24.060 -10.937 2.631 27.582 1.002 1000 res[51] -74.894 19.604 -112.111 -88.110 -74.879 -62.154 -36.767 1.001 3000 res[52] -17.982 19.314 -54.578 -31.001 -17.746 -5.366 19.586 1.001 3000 res[53] -18.879 19.089 -55.536 -31.878 -18.732 -6.342 18.440 1.001 3000 res[54] -51.236 18.932 -87.103 -64.156 -50.982 -38.615 -14.707 1.001 3000 res[55] -12.960 18.844 -49.103 -25.873 -12.810 -0.372 23.573 1.001 3000 res[56] -36.891 18.826 -73.267 -49.776 -36.976 -24.485 -0.592 1.001 3000 res[57] 53.746 18.878 17.370 40.815 53.631 66.374 90.220 1.001 3000 res[58] 36.172 19.000 -0.323 23.149 35.946 48.910 72.820 1.001 3000 res[59] 65.125 19.191 28.385 52.003 64.974 78.074 102.784 1.001 3000 res[60] 55.364 19.448 18.376 42.221 55.237 68.470 92.783 1.001 3000 res[61] 53.278 19.922 14.552 40.400 53.048 66.649 93.002 1.001 3000 res[62] -0.374 19.660 -38.830 -13.023 -0.661 12.789 38.364 1.001 3000 res[63] 3.264 19.463 -35.082 -9.450 3.006 16.446 41.950 1.001 3000 res[64] 38.678 19.333 0.734 25.972 38.501 51.681 76.685 1.001 3000 res[65] -62.507 19.271 -99.975 -75.397 -62.582 -49.754 -24.478 1.001 3000 res[66] 21.225 19.277 -16.952 8.277 21.130 34.002 59.078 1.001 3000 res[67] 9.364 19.352 -29.020 -3.280 9.312 22.117 47.247 1.001 3000 res[68] -8.089 19.495 -46.954 -20.990 -7.980 4.799 29.342 1.001 3000 res[69] -62.729 19.705 -101.861 -75.981 -62.636 -49.760 -24.848 1.001 3000 res[70] -101.895 19.978 -141.093 -115.300 -101.984 -88.507 -63.133 1.001 3000 res[71] -88.062 19.964 -127.870 -101.538 -87.755 -74.493 -49.523 1.001 3000 res[72] -31.041 19.717 -70.085 -44.202 -30.631 -17.705 6.899 1.001 3000 res[73] -116.328 19.534 
-155.476 -129.402 -115.893 -103.318 -78.411 1.001 3000 res[74] -130.422 19.419 -169.430 -143.343 -130.093 -117.543 -92.973 1.001 3000 res[75] -65.232 19.371 -103.791 -77.980 -65.008 -52.431 -27.767 1.001 3000 res[76] -63.061 19.392 -102.132 -75.582 -63.004 -50.149 -25.391 1.001 3000 res[77] 69.055 19.481 29.857 56.476 69.062 81.790 107.019 1.003 3000 res[78] 128.226 19.637 88.133 115.407 128.205 141.284 166.622 1.001 3000 res[79] 142.510 19.859 102.290 129.581 142.359 155.739 181.020 1.001 3000 res[80] 123.865 20.144 83.602 110.599 123.893 137.385 162.879 1.001 3000 res[81] 38.602 20.014 -1.008 25.334 38.746 51.662 78.229 1.001 3000 res[82] -12.588 19.710 -51.453 -25.593 -12.756 0.423 26.629 1.001 3000 res[83] 12.615 19.470 -25.289 -0.173 12.641 25.485 51.119 1.001 3000 res[84] 74.846 19.296 37.322 62.013 74.794 87.501 112.990 1.002 3000 res[85] 104.251 19.190 66.972 91.597 104.153 116.810 141.983 1.001 3000 res[86] 32.401 19.152 -4.545 19.479 32.285 45.135 69.950 1.001 3000 res[87] -65.902 19.184 -102.275 -78.898 -65.895 -53.119 -28.566 1.001 3000 res[88] -69.624 19.284 -106.028 -82.781 -69.618 -56.824 -31.787 1.001 3000 res[89] -45.616 19.452 -82.397 -59.002 -45.480 -32.474 -7.501 1.001 3000 res[90] 18.556 19.686 -19.200 4.878 18.594 31.813 57.193 1.001 3000 res[91] -6.375 19.645 -46.936 -19.201 -5.740 6.850 30.685 1.001 3000 res[92] 11.087 19.365 -28.819 -1.650 11.794 24.050 47.458 1.001 3000 res[93] 17.971 19.150 -21.644 5.330 18.451 30.980 54.292 1.001 3000 res[94] 70.960 19.002 31.721 58.299 71.279 84.017 106.739 1.001 3000 res[95] 51.718 18.924 13.421 39.127 51.882 64.682 87.953 1.001 3000 res[96] 19.313 18.916 -18.781 6.918 19.478 32.441 56.157 1.001 3000 res[97] -46.627 18.977 -85.087 -59.029 -46.534 -33.639 -9.915 1.001 3000 res[98] -68.450 19.108 -106.603 -80.803 -68.248 -55.528 -31.873 1.001 3000 res[99] -78.684 19.307 -117.252 -91.086 -78.492 -65.666 -41.856 1.001 3000 res[100] 48.221 19.572 8.678 35.284 48.311 61.301 85.706 1.001 3000 res[101] 151.859 
             mu.vect  sd.vect     2.5%      25%      50%      75%    97.5%  Rhat n.eff
... [residual summaries res[102] through res[350] omitted for brevity; each row
    reports the posterior mean, sd, quantiles, Rhat and n.eff, with all Rhat
    values close to 1] ...
sigma         63.079    2.515   58.326   61.370   62.978   64.664   68.204 1.002  1900
sigma.B       60.746    8.321   46.425   55.013   59.927   65.631   79.037 1.001  3000
deviance    3894.805    9.297 3878.620 3888.202 3894.145 3900.658 3914.644 1.001  3000

For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).

DIC info (using the rule, pD = var(deviance)/2)
pD = 43.2 and DIC = 3938.0
DIC is an estimate of expected predictive error (lower deviance is better).
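The pD and DIC figures in the summary above can be reproduced directly from the printed deviance statistics: under the rule pD = var(deviance)/2, the effective number of parameters follows from the posterior standard deviation of the deviance, and DIC is the posterior mean deviance plus pD. A quick standalone check using the numbers from the output:

```r
## Verify the DIC arithmetic from the printed summary
## (values copied from the output above).
dev.mean <- 3894.805   # posterior mean of the deviance
dev.sd   <- 9.297      # posterior sd of the deviance
pD  <- dev.sd^2 / 2    # pD = var(deviance)/2
DIC <- dev.mean + pD   # DIC = mean deviance + pD
round(c(pD = pD, DIC = DIC), 1)
```

This recovers pD = 43.2 and DIC = 3938.0, matching the summary.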
data.rm.mcmc.list.f <- as.mcmc(data.rm.r2jags.f)
Matrix parameterization
modelString="
model {
   #Likelihood
   for (i in 1:n) {
      y[i]~dnorm(mu[i],tau)
      mu[i] <- inprod(beta[],X[i,]) + gamma[Block[i]]
      res[i] <- y[i]-mu[i]
   }

   #Priors
   beta ~ dmnorm(a0,A0)
   for (i in 1:nBlock) {
      gamma[i] ~ dnorm(0, tau.B) #prior
   }
   tau <- pow(sigma,-2)
   sigma <- z/sqrt(chSq)        #half-Cauchy prior on sigma
   z ~ dnorm(0, 0.0016)I(0,)    #1/25^2 = 0.0016
   chSq ~ dgamma(0.5, 0.5)
   tau.B <- pow(sigma.B,-2)
   sigma.B <- z.B/sqrt(chSq.B)  #half-Cauchy prior on sigma.B
   z.B ~ dnorm(0, 0.0016)I(0,)  #1/25^2 = 0.0016
   chSq.B ~ dgamma(0.5, 0.5)
}
"
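The `inprod(beta[],X[i,])` term in the likelihood is simply the i-th row of the matrix product X&beta;: each observation's expected value is built from the design matrix row and the coefficient vector. A minimal standalone sketch (using a made-up three-observation `Time` vector and hypothetical coefficients, not the tutorial data) illustrates the equivalence:

```r
## Illustration only: inprod(beta[], X[i,]) in JAGS is row i of X %*% beta.
Time <- c(1, 2, 3)                  # hypothetical predictor values
X <- model.matrix(~Time)            # intercept column plus Time column
beta <- c(167.5, 32.7)              # hypothetical intercept and slope
mu <- as.vector(X %*% beta)         # what the JAGS loop computes row by row
mu
```

This is why the matrix parameterization needs only the design matrix `X` and coefficient vector `beta` passed in as data, rather than separate terms for each effect.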
Xmat <- model.matrix(~Time,data.rm)
data.rm.list <- with(data.rm,
   list(y=y,
        Block=as.numeric(Block),
        X=Xmat,
        n=nrow(data.rm),
        nBlock=length(levels(Block)),
        nA=ncol(Xmat),
        a0=rep(0,ncol(Xmat)),
        A0=diag(0,ncol(Xmat))
   )
)
params <- c("beta","gamma","sigma","sigma.B","res")
adaptSteps = 1000
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps+ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
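The `nIter` formula above works backwards from the desired posterior sample: each chain must run long enough to cover the burn-in plus, after thinning every 10th iteration and pooling across 3 chains, still retain 3000 saved samples. A standalone check of that arithmetic with the same settings:

```r
## Iteration arithmetic: burn-in plus enough post-burn-in iterations so that,
## after thinning by 10 and splitting across 3 chains, 3000 samples remain.
burnInSteps   <- 3000
numSavedSteps <- 3000
thinSteps     <- 10
nChains       <- 3
nIter <- burnInSteps + ceiling((numSavedSteps * thinSteps) / nChains)
nIter
```

This yields 13000 iterations per chain, which is the figure reported in the fitted model's summary header.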
[1] -0.9528011
jags.effects.m.time <- system.time(
   data.rm.r2jags.m <- jags(data=data.rm.list,
                            inits=NULL,
                            parameters.to.save=params,
                            model.file=textConnection(modelString),
                            n.chains=3,
                            n.iter=nIter,
                            n.burnin=burnInSteps,
                            n.thin=thinSteps
   )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 2521

Initializing model
jags.effects.m.time
   user  system elapsed 
 29.562   0.108  29.804 
print(data.rm.r2jags.m)
Inference for Bugs model at "5", fit using jags,
 3 chains, each with 13000 iterations (first 3000 discarded), n.thin = 10
 n.sims = 3000 iterations saved
          mu.vect sd.vect    2.5%     25%     50%     75%   97.5%  Rhat n.eff
beta[1]   167.553  12.810 142.156 159.427 167.460 176.006 192.755 1.001  3000
beta[2]    32.661   1.197  30.288  31.839  32.660  33.479  35.018 1.002  3000
gamma[1]  -42.273  20.936 -83.035 -56.410 -42.140 -28.377  -1.552 1.001  2800
gamma[2]   76.232  21.234  34.282  62.151  76.530  90.666 118.817 1.002  1600
... [block effects gamma[3] through gamma[35] and the residual summaries
    res[1] onwards omitted for brevity; all Rhat values are close to 1] ...
res[268] 78.465 19.104 41.282 65.161 78.257 91.333 115.702 1.001 3000 res[269] 4.499 19.333 -33.401 -9.063 4.368 17.653 42.366 1.001 3000 res[270] 28.996 19.632 -9.114 15.217 28.924 42.389 67.858 1.001 3000 res[271] -95.310 19.797 -135.376 -107.748 -95.144 -81.942 -55.944 1.001 2100 res[272] -61.236 19.518 -100.669 -73.559 -61.140 -48.078 -22.938 1.002 1900 res[273] -9.148 19.308 -48.070 -21.273 -9.118 3.852 28.699 1.002 1700 res[274] -47.728 19.172 -85.635 -59.891 -47.624 -34.960 -10.284 1.002 1600 res[275] 10.353 19.109 -27.794 -1.818 10.679 22.976 47.548 1.002 1400 res[276] 37.592 19.122 -0.743 25.321 37.967 50.170 74.725 1.002 1300 res[277] 91.119 19.209 53.069 78.782 91.222 103.701 128.855 1.002 1600 res[278] 42.425 19.370 4.045 29.913 42.682 55.145 80.813 1.002 1200 res[279] -3.424 19.603 -42.168 -16.204 -3.214 9.362 35.775 1.002 1100 res[280] -80.684 19.905 -119.568 -93.793 -80.530 -67.663 -41.254 1.002 1100 res[281] -74.119 19.813 -113.214 -86.758 -74.007 -61.178 -35.643 1.001 2400 res[282] -117.377 19.539 -155.438 -129.993 -117.216 -104.423 -79.363 1.001 2500 res[283] -51.168 19.336 -89.000 -63.745 -51.206 -38.495 -13.193 1.001 2600 res[284] -29.577 19.205 -67.218 -42.075 -29.764 -16.639 8.201 1.001 2700 res[285] 93.087 19.148 55.431 80.621 92.906 106.184 130.864 1.001 2800 res[286] 23.313 19.166 -14.738 10.564 23.245 36.294 60.745 1.001 3000 res[287] 83.931 19.259 45.966 70.984 83.829 97.029 121.662 1.002 3000 res[288] 46.441 19.425 7.643 33.488 46.455 59.533 83.687 1.001 3000 res[289] -30.547 19.663 -70.344 -43.804 -30.498 -17.275 7.131 1.001 3000 res[290] 8.811 19.970 -31.240 -4.566 8.741 22.413 47.360 1.001 3000 res[291] 8.487 19.839 -30.198 -4.727 7.866 22.005 47.226 1.001 3000 res[292] -28.626 19.540 -66.894 -41.706 -29.106 -15.325 10.145 1.001 3000 res[293] -42.746 19.310 -80.659 -55.812 -42.908 -29.709 -4.464 1.001 2600 res[294] -7.372 19.152 -44.977 -20.398 -7.364 5.634 30.569 1.001 2300 res[295] -40.098 19.069 -77.338 -53.244 -40.077 -27.286 
-2.636 1.001 2100 res[296] 37.028 19.060 -0.051 23.694 37.130 49.655 74.297 1.002 1900 res[297] -15.644 19.127 -52.480 -29.045 -15.669 -3.106 22.066 1.002 1800 res[298] 12.466 19.268 -24.191 -1.017 12.590 25.179 50.521 1.002 1700 res[299] 30.744 19.481 -6.571 17.070 30.784 43.560 69.226 1.002 1600 res[300] -15.607 19.765 -53.302 -29.523 -15.543 -2.730 23.548 1.002 1500 res[301] -41.668 19.925 -79.984 -55.519 -42.286 -27.906 -3.561 1.001 3000 res[302] -58.184 19.601 -95.500 -71.837 -58.692 -44.511 -20.675 1.001 3000 res[303] 8.850 19.345 -28.376 -4.648 8.303 22.286 46.243 1.001 3000 res[304] 91.777 19.161 54.968 78.462 91.326 105.042 129.357 1.001 3000 res[305] 27.745 19.050 -8.566 14.395 27.260 40.833 65.448 1.001 3000 res[306] 74.555 19.014 37.950 61.090 74.206 87.563 112.284 1.001 3000 res[307] 61.170 19.053 24.288 47.553 60.708 74.249 99.225 1.001 3000 res[308] -44.658 19.168 -81.762 -58.141 -45.131 -31.685 -6.585 1.001 3000 res[309] -69.288 19.355 -106.962 -82.463 -69.716 -56.151 -30.415 1.001 3000 res[310] -28.672 19.615 -66.754 -41.970 -28.965 -15.552 11.159 1.001 3000 res[311] 11.177 19.169 -26.751 -1.728 11.335 24.014 48.669 1.002 1400 res[312] -105.799 18.884 -142.730 -118.589 -105.489 -93.168 -68.560 1.002 1400 res[313] -48.781 18.672 -85.091 -61.302 -48.417 -36.121 -11.405 1.002 1400 res[314] 0.038 18.535 -35.839 -12.477 0.473 12.840 37.167 1.002 1500 res[315] -9.417 18.475 -45.440 -21.930 -9.115 3.244 27.616 1.002 1600 res[316] -37.666 18.492 -73.456 -50.259 -37.325 -25.009 -1.156 1.002 1600 res[317] 73.159 18.586 37.607 60.465 73.419 85.654 109.483 1.001 2800 res[318] 58.208 18.756 22.584 45.666 58.657 70.791 95.224 1.002 1900 res[319] 49.515 19.001 12.895 36.800 49.868 62.081 87.091 1.001 2000 res[320] -13.445 19.316 -50.117 -26.374 -13.430 -0.852 24.835 1.001 2200 res[321] -86.840 19.753 -125.505 -100.631 -87.159 -73.637 -47.464 1.001 3000 res[322] -145.870 19.462 -183.690 -159.434 -146.273 -132.882 -107.598 1.001 3000 res[323] -138.567 19.240 
-175.496 -151.852 -139.074 -125.799 -100.857 1.001 3000 res[324] -44.112 19.092 -80.837 -57.166 -44.654 -31.457 -6.033 1.001 3000 res[325] 50.563 19.017 13.495 37.549 49.998 63.248 88.500 1.001 3000 res[326] 116.499 19.018 79.154 103.452 115.806 129.176 154.478 1.001 3000 res[327] 155.135 19.094 117.236 142.388 154.384 167.846 193.707 1.001 3000 res[328] 108.794 19.244 70.568 95.838 108.207 121.538 147.398 1.001 3000 res[329] 21.861 19.467 -16.295 8.786 21.373 34.481 60.807 1.001 3000 res[330] -3.400 19.760 -41.683 -16.591 -3.813 9.663 36.201 1.001 3000 res[331] -111.648 20.067 -149.854 -125.260 -111.319 -98.393 -72.762 1.003 790 res[332] -107.226 19.776 -144.576 -120.540 -107.048 -93.931 -68.297 1.003 750 res[333] -5.537 19.553 -42.372 -18.918 -5.216 7.595 32.965 1.003 720 res[334] -1.165 19.402 -37.592 -14.569 -1.041 11.978 37.002 1.003 690 res[335] 50.047 19.324 13.314 36.744 50.004 63.166 87.900 1.004 670 res[336] 60.158 19.320 23.760 47.070 60.178 73.384 98.080 1.004 650 res[337] 4.301 19.390 -32.229 -8.882 4.227 17.646 42.422 1.004 640 res[338] 85.207 19.533 48.224 71.651 84.981 98.737 123.640 1.005 560 res[339] 58.302 19.748 20.991 44.571 58.138 72.068 97.003 1.004 630 res[340] -8.612 20.032 -46.119 -22.493 -8.845 5.528 31.117 1.004 630 res[341] 40.658 19.685 3.448 27.350 40.482 53.517 79.299 1.001 3000 res[342] -40.740 19.418 -77.738 -53.723 -40.816 -28.015 -2.330 1.001 3000 res[343] 12.728 19.221 -24.182 -0.401 12.788 25.449 50.498 1.001 3000 res[344] 36.860 19.098 -0.564 24.023 36.739 49.428 74.164 1.001 3000 res[345] 12.816 19.049 -24.411 0.216 12.581 25.508 50.190 1.001 3000 res[346] -13.634 19.076 -50.633 -26.459 -13.878 -0.866 24.163 1.001 3000 res[347] 35.910 19.177 -1.512 23.028 35.570 48.845 73.841 1.001 3000 res[348] -24.923 19.352 -62.350 -37.528 -25.189 -11.766 12.662 1.001 3000 res[349] -70.191 19.599 -108.269 -82.871 -70.339 -56.810 -32.042 1.001 3000 res[350] -36.535 19.915 -75.217 -49.542 -36.668 -22.995 2.009 1.001 3000 sigma 63.073 2.504 
58.445 61.335 62.980 64.733 68.013 1.002 1200 sigma.B 61.040 8.449 46.623 55.357 60.161 65.905 80.112 1.001 3000 deviance 3894.560 9.116 3879.137 3887.995 3893.733 3900.224 3914.722 1.002 1200 For each parameter, n.eff is a crude measure of effective sample size, and Rhat is the potential scale reduction factor (at convergence, Rhat=1). DIC info (using the rule, pD = var(deviance)/2) pD = 41.5 and DIC = 3936.1 DIC is an estimate of expected predictive error (lower deviance is better).
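The DIC bookkeeping can be checked with a couple of lines of arithmetic (a sketch in Python, purely for illustration; the deviance mean and standard deviation are taken from the output above):

```python
# Sanity check of the DIC rule reported above:
# pD = var(deviance)/2 and DIC = mean(deviance) + pD.
dev_mean, dev_sd = 3894.560, 9.116   # posterior mean and sd of the deviance

pD = dev_sd ** 2 / 2    # approximately 41.5, as reported
DIC = dev_mean + pD     # approximately 3936.1, as reported
```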
data.rm.mcmc.list.m <- as.mcmc(data.rm.r2jags.m)
Data.Rm.mcmc.list.m <- data.rm.mcmc.list.m
Given that Time cannot be randomized, there is likely to be a temporal dependency structure in the data. The above analyses assume no temporal dependency; more specifically, they assume that the variance-covariance matrix exhibits a structure known as sphericity.
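To see what a first order autoregressive structure implies, a small simulation sketch (in Python rather than R, purely for illustration; all values are made up) shows that residuals generated with autoregressive parameter rho have an empirical lag-1 correlation close to rho:

```python
import random

# Simulate many AR(1) residual series and confirm that the average
# correlation between successive observations is approximately rho.
random.seed(1)
rho, n, nseries = 0.7, 200, 500

lag1_total = 0.0
for _ in range(nseries):
    # stationary start, then res[i] = rho*res[i-1] + noise
    res = [random.gauss(0, 1 / (1 - rho ** 2) ** 0.5)]
    for _ in range(n - 1):
        res.append(rho * res[-1] + random.gauss(0, 1))
    x, y = res[:-1], res[1:]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    lag1_total += cov / (sx * sy)

lag1 = lag1_total / nseries   # close to rho (slight negative small-sample bias)
```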
Let's explicitly model a first order autoregressive (AR(1)) correlation structure in an attempt to accommodate the expected temporal autocorrelation.
modelString="
model {
   #Likelihood
   y[1] ~ dnorm(mu[1],tau)
   mu[1] <- eta1[1]
   eta1[1] ~ dnorm(eta[1], taueps)
   eta[1] <- inprod(beta[],X[1,]) + gamma[Block[1]]
   res[1] <- y[1] - mu[1]
   for (i in 2:n) {
      y[i] ~ dnorm(mu[i],tau)
      mu[i] <- eta1[i]
      eta1[i] ~ dnorm(temp[i], taueps)
      temp[i] <- eta[i] - rho*(mu[i-1]-y[i-1])
      eta[i] <- inprod(beta[],X[i,]) + gamma[Block[i]]
      res[i] <- y[i] - mu[i]
   }
   beta ~ dmnorm(a0,A0)
   for (i in 1:nBlock) {
      gamma[i] ~ dnorm(0, tau.B) #prior
   }
   rho ~ dunif(-1,1)
   tau <- pow(sigma,-2)
   sigma <- z/sqrt(chSq)
   z ~ dnorm(0, 0.0016)I(0,)          #1/25^2 = 0.0016
   chSq ~ dgamma(0.5, 0.5)
   taueps <- pow(sigma.eps,-2)
   sigma.eps <- z.eps/sqrt(chSq.eps)
   z.eps ~ dnorm(0, 0.0016)I(0,)      #1/25^2 = 0.0016
   chSq.eps ~ dgamma(0.5, 0.5)
   tau.B <- pow(sigma.B,-2)
   sigma.B <- z.B/sqrt(chSq.B)
   z.B ~ dnorm(0, 0.0016)I(0,)        #1/25^2 = 0.0016
   chSq.B ~ dgamma(0.5, 0.5)
   sd.y <- sd(res)
   sd.block <- sd(gamma)
}
"
Xmat <- model.matrix(~Time, data.rm)
data.rm.list <- with(data.rm,
   list(y=y,
        Block=as.numeric(Block),
        X=Xmat,
        n=nrow(data.rm),
        nBlock=length(levels(Block)),
        nA=ncol(Xmat),
        a0=rep(0,ncol(Xmat)),
        A0=diag(0,ncol(Xmat))
   )
)
params <- c("beta",'gamma',"sigma","sigma.B","res",'sigma.eps','rho','sd.y','sd.block')
adaptSteps = 1000
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 10
nIter = burnInSteps + ceiling((numSavedSteps * thinSteps)/nChains)
library(R2jags)
rnorm(1)
[1] -0.5810783
jags.effects.mt.time <- system.time(
   data.rm.r2jags.mt <- jags(data=data.rm.list,
      inits=NULL,
      parameters.to.save=params,
      model.file=textConnection(modelString),
      n.chains=3,
      n.iter=nIter,
      n.burnin=burnInSteps,
      n.thin=thinSteps
   )
)
Compiling model graph
   Resolving undeclared variables
   Allocating nodes
Graph Size: 3931

Initializing model
jags.effects.mt.time
   user  system elapsed 
 82.169   0.332  83.111 
data.rm.mt.mcmc <- data.rm.r2jags.mt$BUGSoutput$sims.matrix
summary(as.mcmc(data.rm.mt.mcmc[,grep('beta|sigma|rho', colnames(data.rm.mt.mcmc))]))
Iterations = 1:3000
Thinning interval = 1 
Number of chains = 1 
Sample size per chain = 3000 

1. Empirical mean and standard deviation for each variable,
   plus standard error of the mean:

              Mean      SD Naive SE Time-series SE
beta[1]   170.4335 12.0485 0.219975       0.219975
beta[2]    32.1415  1.1709 0.021377       0.021377
rho         0.7389  0.1305 0.002383       0.002383
sigma      46.9463  4.5583 0.083222       0.083222
sigma.B    54.8340  8.3323 0.152127       0.152127
sigma.eps  20.9418  5.2581 0.095999       0.093444

2. Quantiles for each variable:

              2.5%      25%      50%      75%    97.5%
beta[1]   146.6049 162.0844 170.6362 178.5282 194.1298
beta[2]    29.8261  31.3479  32.1157  32.9132  34.5192
rho         0.5203   0.6303   0.7329   0.8427   0.9786
sigma      38.6692  43.4660  46.9013  50.4678  55.3807
sigma.B    40.1914  48.9771  54.1602  60.0107  72.6525
sigma.eps   8.6820  18.0282  21.8396  24.7380  29.0647
#head(data.rm.r2jags.mt$BUGSoutput$sims.list[[c('beta','rho','sigma')]])
#print(data.rm.r2jags.mt)
data.rm.mcmc.list.mt <- as.mcmc(data.rm.r2jags.mt)
Data.Rm.mcmc.list.mt <- data.rm.mcmc.list.mt

# R2 calculations
library(plyr)  # for aaply
Xmat <- model.matrix(~Time, data.rm)
coefs <- data.rm.r2jags.mt$BUGSoutput$sims.list[['beta']]
fitted <- coefs %*% t(Xmat)
X.var <- aaply(fitted, 1, function(x){var(x)})
X.var[1:10]
        1         2         3         4         5         6         7         8         9        10 
 8394.090  8493.176  7718.802  8959.809  7583.523  8720.200  9251.465  9191.655  9301.433 10150.338 
Z.var <- data.rm.r2jags.mt$BUGSoutput$sims.list[['sd.block']]^2
R.var <- data.rm.r2jags.mt$BUGSoutput$sims.list[['sd.y']]^2
R2.marginal <- (X.var)/(X.var+Z.var+R.var)
R2.marginal <- data.frame(Mean=mean(R2.marginal), Median=median(R2.marginal),
   HPDinterval(as.mcmc(R2.marginal)))
R2.conditional <- (X.var+Z.var)/(X.var+Z.var+R.var)
R2.conditional <- data.frame(Mean=mean(R2.conditional), Median=median(R2.conditional),
   HPDinterval(as.mcmc(R2.conditional)))
R2.block <- (Z.var)/(X.var+Z.var+R.var)
R2.block <- data.frame(Mean=mean(R2.block), Median=median(R2.block),
   HPDinterval(as.mcmc(R2.block)))
R2.res <- (R.var)/(X.var+Z.var+R.var)
R2.res <- data.frame(Mean=mean(R2.res), Median=median(R2.res),
   HPDinterval(as.mcmc(R2.res)))
(r2 <- rbind(R2.block=R2.block, R2.marginal=R2.marginal, R2.res=R2.res,
   R2.conditional=R2.conditional))
                    Mean    Median     lower     upper
R2.block       0.2144352 0.2143692 0.1500259 0.2729597
R2.marginal    0.6242208 0.6252443 0.5625655 0.6868316
R2.res         0.1613440 0.1607172 0.1144348 0.2106814
R2.conditional 0.8386560 0.8392828 0.7893186 0.8855652
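The R2 quantities above are a simple partition of the posterior variance into fixed-effect (Time), block, and residual shares. A minimal sketch (in Python, for illustration only, with made-up variance components) of the same partitioning logic:

```python
# Variance partitioning as used above: marginal R2 is the fixed-effect share,
# conditional R2 adds the block (random-effect) share, and the three shares
# (block, marginal, residual) sum to one.
def r2_components(x_var, z_var, r_var):
    # x_var: variance of the fixed-effect fitted values
    # z_var: variance of the block (random) effects
    # r_var: residual variance
    tot = x_var + z_var + r_var
    return {'block': z_var / tot,
            'marginal': x_var / tot,
            'residual': r_var / tot,
            'conditional': (x_var + z_var) / tot}

r2 = r2_components(x_var=6.2, z_var=2.1, r_var=1.6)  # hypothetical values
```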
It would appear that the incorporation of a first order autocorrelation structure is indeed appropriate. The degree of correlation between successive points is 0.733.
Summary figure
library(grid)  # for unit(), used in plot.margin
coefs <- data.rm.r2jags.mt$BUGSoutput$sims.list[['beta']]
newdata <- with(data.rm, data.frame(Time=seq(min(Time, na.rm=TRUE),
   max(Time, na.rm=TRUE), len=100)))
Xmat <- model.matrix(~Time, newdata)
pred <- (coefs %*% t(Xmat))
pred <- adply(pred, 2, function(x) {
   data.frame(Mean=mean(x), Median=median(x, na.rm=TRUE),
      t(quantile(x, na.rm=TRUE)),
      HPDinterval(as.mcmc(x)), HPDinterval(as.mcmc(x), p=0.5))
})
newdata <- cbind(newdata, pred)
#Also calculate the partial observations
Xmat <- model.matrix(~Time, data.rm)
pred <- colMeans(as.vector(coefs %*% t(Xmat)) + data.rm.r2jags.mt$BUGSoutput$sims.list[['res']])
part.obs <- cbind(data.rm, Median=pred)
ggplot(newdata, aes(y=Median, x=Time)) +
   geom_point(data=part.obs, aes(y=Median)) +
   geom_ribbon(aes(ymin=lower, ymax=upper), fill='blue', alpha=0.2) +
   geom_line() +
   scale_x_continuous('Time') +
   scale_y_continuous('Y') +
   theme_classic() +
   theme(axis.title.y=element_text(vjust=2, size=rel(1.2)),
         axis.title.x=element_text(vjust=-2, size=rel(1.2)),
         plot.margin=unit(c(0.5,0.5,2,2), 'lines'))
STAN
modelString="
data {
   int n;
   int nX;
   int nB;
   vector [n] y;
   matrix [n,nX] X;
   int B[n];
}
parameters {
   vector [nX] beta;
   real<lower=0> sigma;
   vector [nB] gamma;
   real<lower=0> sigma_B;
}
transformed parameters {
   vector[n] mu;
   mu <- X*beta;
   for (i in 1:n) {
      mu[i] <- mu[i] + gamma[B[i]];
   }
}
model {
   // Priors
   beta ~ normal( 0 , 100 );
   gamma ~ normal( 0 , sigma_B );
   sigma_B ~ cauchy( 0 , 25 );
   sigma ~ cauchy( 0 , 25 );

   y ~ normal( mu , sigma );
}
"
Xmat <- model.matrix(~Time, data=data.rm)
data.rm.list <- with(data.rm, list(y=y, X=Xmat, nX=ncol(Xmat),
   B=as.numeric(Block), n=nrow(data.rm), nB=length(levels(Block))))
library(rstan)
rstan.d.time <- system.time(
   data.rm.rstan.d <- stan(data=data.rm.list,
      model_code=modelString,
      pars=c('beta','sigma','sigma_B'),
      chains=3, iter=3000, warmup=1000, thin=2,
      save_dso=TRUE
   )
)
SAMPLING FOR MODEL '18e6498c61bcea7cdfdc0535e9da24c2' NOW (CHAIN 1).
... (iteration progress omitted) ...
#  Elapsed Time: 5.78416 seconds (Warm-up)
#                1.68206 seconds (Sampling)
#                7.46622 seconds (Total)

SAMPLING FOR MODEL '18e6498c61bcea7cdfdc0535e9da24c2' NOW (CHAIN 2).
... (iteration progress omitted) ...
#  Elapsed Time: 3.05728 seconds (Warm-up)
#                1.65579 seconds (Sampling)
#                4.71307 seconds (Total)

SAMPLING FOR MODEL '18e6498c61bcea7cdfdc0535e9da24c2' NOW (CHAIN 3).
... (iteration progress omitted) ...
#  Elapsed Time: 4.13186 seconds (Warm-up)
#                1.76267 seconds (Sampling)
#                5.89453 seconds (Total)
print(data.rcb.rstan.d)
Inference for Stan model: 18e6498c61bcea7cdfdc0535e9da24c2.
3 chains, each with iter=3000; warmup=1000; thin=2; 
post-warmup draws per chain=1000, total post-warmup draws=3000.

           mean se_mean   sd    2.5%     25%     50%     75%   97.5% n_eff Rhat
beta[1]   42.98    0.11 2.08   38.70   41.64   42.99   44.37   46.96   333 1.01
beta[2]   28.42    0.02 1.12   26.23   27.65   28.41   29.18   30.62  2244 1.00
beta[3]   40.12    0.02 1.12   37.90   39.40   40.12   40.90   42.20  2351 1.00
sigma      4.65    0.01 0.40    3.97    4.36    4.61    4.90    5.51  1901 1.00
sigma_B   11.93    0.03 1.60    9.26   10.79   11.79   12.90   15.57  2405 1.00
lp__    -312.99    0.14 5.47 -324.51 -316.52 -312.72 -309.00 -303.73  1441 1.01

Samples were drawn using NUTS(diag_e) at Wed Dec 23 11:24:24 2015.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
modelString="
data {
   int n;
   int nX;
   int nB;
   vector [n] y;
   matrix [n,nX] X;
   int B[n];
   vector [n] tgroup;
}
parameters {
   vector [nX] beta;
   real<lower=0> sigma;
   vector [nB] gamma;
   real<lower=0> sigma_B;
   real ar;
}
transformed parameters {
   vector[n] mu;
   vector[n] E;
   vector[n] res;
   mu <- X*beta;
   for (i in 1:n) {
      E[i] <- 0;
   }
   for (i in 1:n) {
      mu[i] <- mu[i] + gamma[B[i]];
      res[i] <- y[i] - mu[i];
      if (i > 0 && i < n && tgroup[i+1] == tgroup[i]) {
         E[i+1] <- res[i];
      }
      mu[i] <- mu[i] + (E[i] * ar);
   }
}
model {
   // Priors
   beta ~ normal( 0 , 100 );
   gamma ~ normal( 0 , sigma_B );
   sigma_B ~ cauchy( 0 , 25 );
   sigma ~ cauchy( 0 , 25 );

   y ~ normal( mu , sigma );
}
"
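The E/res bookkeeping in the transformed parameters block above carries each observation's residual forward one step within a block (tgroup), and adds ar times that carried residual to the next expected value. A minimal Python sketch of the same recursion (with hypothetical inputs, purely for illustration):

```python
# Sketch of the AR(1) residual carry-over used in the Stan model above:
# within each tgroup, the previous observation's residual (scaled by ar)
# is added to the next observation's expected value; the carry-over is
# reset to zero at the start of each new group.
def ar1_adjust(y, mu, tgroup, ar):
    E = [0.0] * len(y)          # carried-over residuals (zero at group starts)
    adjusted = []
    for i in range(len(y)):
        res_i = y[i] - mu[i]    # residual before the AR adjustment
        if i + 1 < len(y) and tgroup[i + 1] == tgroup[i]:
            E[i + 1] = res_i    # carry forward within the same group only
        adjusted.append(mu[i] + ar * E[i])
    return adjusted
```

With y = [1, 2, 3, 4], mu = [0, 0, 0, 0], two groups [1, 1, 2, 2] and ar = 0.5, the adjustment resets at the group boundary, so only the second observation of each group receives a carried-over residual.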
Xmat <- model.matrix(~Time, data=data.rm)
data.rm.list <- with(data.rm, list(y=y, X=Xmat, nX=ncol(Xmat),
   B=as.numeric(Block), n=nrow(data.rm), nB=length(levels(Block)),
   tgroup=as.numeric(Block)))
library(rstan)
rstan.d.time <- system.time(
   data.rm.rstan.d <- stan(data=data.rm.list,
      model_code=modelString,
      pars=c('beta','sigma','sigma_B','ar'),
      chains=3, iter=3000, warmup=1000, thin=2,
      save_dso=TRUE
   )
)
SAMPLING FOR MODEL '05a3ad88499ba51c07ee081ddf2e3a38' NOW (CHAIN 1).
... (iteration progress omitted) ...
#  Elapsed Time: 4.75173 seconds (Warm-up)
#                1.52767 seconds (Sampling)
#                6.27941 seconds (Total)

SAMPLING FOR MODEL '05a3ad88499ba51c07ee081ddf2e3a38' NOW (CHAIN 2).
... (iteration progress omitted) ...
#  Elapsed Time: 4.14949 seconds (Warm-up)
#                1.49128 seconds (Sampling)
#                5.64077 seconds (Total)

SAMPLING FOR MODEL '05a3ad88499ba51c07ee081ddf2e3a38' NOW (CHAIN 3).
... (iteration progress omitted) ...
#  Elapsed Time: 25.8087 seconds (Warm-up)
#                12.2924 seconds (Sampling)
#                38.1011 seconds (Total)
print(data.rm.rstan.d)
Inference for Stan model: 05a3ad88499ba51c07ee081ddf2e3a38.
3 chains, each with iter=3000; warmup=1000; thin=2; 
post-warmup draws per chain=1000, total post-warmup draws=3000.

            mean se_mean    sd     2.5%      25%      50%      75%    97.5% n_eff Rhat
beta[1]   172.54    0.31 13.85   144.76   163.00   172.88   182.19   198.80  1946    1
beta[2]    32.02    0.03  1.80    28.40    30.84    32.03    33.23    35.62  3000    1
sigma      54.65    0.05  2.25    50.54    53.05    54.59    56.15    59.11  2213    1
sigma_B    58.21    0.30 11.71    37.64    50.22    57.35    65.16    84.52  1483    1
ar          0.74    0.00  0.06     0.63     0.70     0.74     0.78     0.84  1787    1
lp__    -1731.82    0.17  5.87 -1744.35 -1735.41 -1731.47 -1727.66 -1721.51  1248    1

Samples were drawn using NUTS(diag_e) at Thu Dec 24 08:35:39 2015.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
BRM
library(brms)
data.rm.brm <- brm(y~(1|Block) + Time, data=data.rm, family='gaussian',
   prior=c(set_prior('normal(0,100)', class='b'),
           set_prior('cauchy(0,5)', class='sd')),
   n.chains=3, n.iter=2000, warmup=500, n.thin=2
)
SAMPLING FOR MODEL 'gaussian(identity) brms-model' NOW (CHAIN 1).
... (iteration progress omitted) ...
#  Elapsed Time: 1.19739 seconds (Warm-up)
#                1.41722 seconds (Sampling)
#                2.61461 seconds (Total)

SAMPLING FOR MODEL 'gaussian(identity) brms-model' NOW (CHAIN 2).
... (iteration progress omitted) ...
#  Elapsed Time: 0.871095 seconds (Warm-up)
#                1.40274 seconds (Sampling)
#                2.27384 seconds (Total)

SAMPLING FOR MODEL 'gaussian(identity) brms-model' NOW (CHAIN 3).
... (iteration progress omitted) ...
#  Elapsed Time: 0.960711 seconds (Warm-up)
#                1.42992 seconds (Sampling)
#                2.39063 seconds (Total)
summary(data.rm.brm)
 Family: gaussian (identity) 
Formula: y ~ (1 | Block) + Time 
   Data: data.rm (Number of observations: 350) 
Samples: 3 chains, each with n.iter = 2000; n.warmup = 500; n.thin = 2; 
         total post-warmup samples = 2250
   WAIC: 3928.94

Random Effects: 
~Block (Number of levels: 35) 
              Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
sd(Intercept)    60.65      8.51    46.33    79.97        852    1

Fixed Effects: 
          Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
Intercept   164.56     12.21   141.16   188.47        520    1
Time         32.80      1.19    30.52    35.12       2148    1

Family Specific Parameters: 
         Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
sigma(y)    63.18      2.47    58.32    68.14       2124    1

Samples were drawn using NUTS(diag_e). For each parameter, Eff.Sample is a
crude measure of effective sample size, and Rhat is the potential scale
reduction factor on split chains (at convergence, Rhat = 1).
stancode(data.rm.brm)
functions { 
} 
data { 
  int<lower=1> N;  # number of observations 
  vector[N] Y;  # response variable 
  int<lower=1> K;  # number of fixed effects 
  matrix[N, K] X;  # FE design matrix 
  # data for random effects of Block 
  int<lower=1> J_1[N];  # RE levels 
  int<lower=1> N_1;  # number of levels 
  int<lower=1> K_1;  # number of REs 
  real Z_1[N];  # RE design matrix 
} 
transformed data { 
} 
parameters { 
  real b_Intercept;  # fixed effects Intercept 
  vector[K] b;  # fixed effects 
  vector[N_1] pre_1;  # unscaled REs 
  real<lower=0> sd_1;  # RE standard deviation 
  real<lower=0> sigma;  # residual SD 
} 
transformed parameters { 
  vector[N] eta;  # linear predictor 
  vector[N_1] r_1;  # REs 
  # compute linear predictor 
  eta <- X * b + b_Intercept; 
  r_1 <- sd_1 * (pre_1);  # scale REs 
  # if available add REs to linear predictor 
  for (n in 1:N) { 
    eta[n] <- eta[n] + Z_1[n] * r_1[J_1[n]]; 
  } 
} 
model { 
  # prior specifications 
  b_Intercept ~ normal(0,100); 
  b ~ normal(0,100); 
  sd_1 ~ cauchy(0,5); 
  pre_1 ~ normal(0, 1); 
  sigma ~ cauchy(0, 128); 
  # likelihood contribution 
  Y ~ normal(eta, sigma); 
} 
generated quantities { 
} 
library(brms)
data.rm.brm <- brm(y~(1|Block) + Time, data=data.rm, family='gaussian',
   autocor=cor_ar(~Time|Block),
   prior=c(set_prior('normal(0,100)', class='b'),
           set_prior('cauchy(0,5)', class='sd')),
   n.chains=3, n.iter=2000, warmup=500, n.thin=2
)
SAMPLING FOR MODEL 'gaussian(identity) brms-model' NOW (CHAIN 1).
... (iteration progress omitted) ...
#  Elapsed Time: 2.3523 seconds (Warm-up)
#                3.77681 seconds (Sampling)
#                6.12911 seconds (Total)

SAMPLING FOR MODEL 'gaussian(identity) brms-model' NOW (CHAIN 2).
... (iteration progress omitted) ...
#  Elapsed Time: 2.60853 seconds (Warm-up)
#                3.93915 seconds (Sampling)
#                6.54768 seconds (Total)

SAMPLING FOR MODEL 'gaussian(identity) brms-model' NOW (CHAIN 3).
... (iteration progress omitted) ...
#  Elapsed Time: 2.01037 seconds (Warm-up)
#                2.1447 seconds (Sampling)
#                4.15507 seconds (Total)
summary(data.rm.brm)
 Family: gaussian (identity) 
Formula: y ~ (1 | Block) + Time 
   Data: data.rm (Number of observations: 350) 
Samples: 3 chains, each with n.iter = 2000; n.warmup = 500; n.thin = 2; 
         total post-warmup samples = 2250
   WAIC: 3831.08

Random Effects: 
~Block (Number of levels: 35) 
              Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
sd(Intercept)     57.1     11.54    36.16     82.3       1206    1

Correlation Structure: arma(~Time|Block, 1, 0, 0)
      Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
ar[1]     0.74      0.05     0.63     0.84       1915    1

Fixed Effects: 
          Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
Intercept   173.34     14.17   144.97   200.29       1719    1
Time         31.97      1.76    28.61    35.41       2035    1

Family Specific Parameters: 
         Estimate Est.Error l-95% CI u-95% CI Eff.Sample Rhat
sigma(y)    54.87      2.28     50.6    59.62       1876    1

Samples were drawn using NUTS(diag_e). For each parameter, Eff.Sample is a
crude measure of effective sample size, and Rhat is the potential scale
reduction factor on split chains (at convergence, Rhat = 1).
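Comparing the two brm() fits via their reported WAIC values (lower is better) supports retaining the AR(1) structure; a trivial sketch of the comparison (Python, for illustration, with the WAIC values taken from the two summaries above):

```python
# WAIC values reported by the two brm() fits above; lower WAIC indicates
# better expected out-of-sample predictive accuracy.
waic_indep = 3928.94   # y ~ (1|Block) + Time, no autocorrelation
waic_ar1 = 3831.08     # same model with autocor=cor_ar(~Time|Block)

delta = waic_indep - waic_ar1   # a positive difference favours the AR(1) model
```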
stancode(data.rm.brm)
functions {
}
data {
  int<lower=1> N;  # number of observations
  vector[N] Y;  # response variable
  int<lower=1> K;  # number of fixed effects
  matrix[N, K] X;  # FE design matrix
  # data for random effects of Block
  int<lower=1> J_1[N];  # RE levels
  int<lower=1> N_1;  # number of levels
  int<lower=1> K_1;  # number of REs
  real Z_1[N];  # RE design matrix
  # data needed for ARMA effects
  int<lower=0> Kar;  # AR order
  int<lower=0> Kma;  # MA order
  int<lower=1> Karma;  # max(Kma, Kar)
  matrix[N, Karma] E_pre;  # matrix of zeros
  vector[N] tgroup;  # indicates independent groups
}
transformed data {
}
parameters {
  real b_Intercept;  # fixed effects Intercept
  vector[K] b;  # fixed effects
  vector[N_1] pre_1;  # unscaled REs
  real<lower=0> sd_1;  # RE standard deviation
  vector[Kar] ar;  # autoregressive effects
  real<lower=0> sigma;  # residual SD
}
transformed parameters {
  vector[N] eta;  # linear predictor
  matrix[N, Karma] E;  # ARMA design matrix
  vector[N] e;  # residuals
  vector[N_1] r_1;  # REs
  # compute linear predictor
  eta <- X * b + b_Intercept;
  E <- E_pre;
  r_1 <- sd_1 * (pre_1);  # scale REs
  # if available add REs to linear predictor
  for (n in 1:N) {
    eta[n] <- eta[n] + Z_1[n] * r_1[J_1[n]];
    # calculation of ARMA effects
    e[n] <- (Y[n]) - eta[n];
    for (i in 1:Karma) {
      if (n + 1 - i > 0 && n < N && tgroup[n + 1] == tgroup[n + 1 - i]) {
        E[n + 1, i] <- e[n + 1 - i];
      }
    }
    eta[n] <- (eta[n] + head(E[n], Kar) * ar);
  }
}
model {
  # prior specifications
  b_Intercept ~ normal(0,100);
  b ~ normal(0,100);
  sd_1 ~ cauchy(0,5);
  pre_1 ~ normal(0, 1);
  sigma ~ cauchy(0, 128);
  # likelihood contribution
  Y ~ normal(eta, sigma);
}
generated quantities {
}
standata(data.rm.brm)
$N [1] 350 $Y [1] 208.380314 132.477496 201.465562 150.165955 169.815533 298.293920 371.658255 460.151329 497.692584 512.887406 214.638477 328.458329 348.119053 331.840683 403.835561 477.892357 584.198484 [18] 550.310782 539.807094 536.136434 210.649978 221.174663 239.142086 346.390918 331.222048 340.555491 406.057444 488.070806 464.837333 465.451784 296.455967 373.624170 377.912791 328.402246 [35] 451.749533 542.724790 514.232959 529.704674 463.361987 532.747423 308.413772 242.844032 239.273847 249.337277 261.390299 209.425006 300.565229 277.839323 340.998623 428.425181 130.308136 [52] 219.924118 251.730731 252.078752 323.058162 331.832082 455.173415 470.302936 531.960288 554.903565 157.834535 136.886606 173.228746 241.347754 172.866622 289.302703 310.145450 325.397071 [69] 303.461651 296.999208 80.867402 170.592340 118.009673 136.619587 234.514204 269.389457 434.209939 526.085233 573.073037 587.132600 313.227767 294.741595 352.648822 447.584444 509.693739 [86] 470.547466 404.948819 433.930958 490.643923 587.519526 211.104742 261.270693 300.859216 386.552805 400.014415 400.314259 367.078370 377.959342 400.428954 560.038550 453.040165 439.167655 [103] 390.221439 310.601091 377.285142 356.370121 514.399750 488.309609 523.273695 742.203477 343.729831 368.701255 323.955340 366.886014 346.184644 383.151931 400.017172 375.190685 421.056242 [120] 512.764105 291.916201 237.169954 255.527973 346.666161 311.894368 219.363017 230.967842 244.684857 312.394872 350.001614 70.044875 97.629080 165.520985 188.828300 291.279513 282.869035 [137] 333.540361 321.301611 372.153413 406.048023 157.513111 201.548746 308.999266 376.018994 408.727162 374.115493 470.503413 499.112401 501.450819 614.011651 254.383355 294.405130 381.341373 [154] 422.914637 416.710995 412.178289 453.229806 499.988299 518.651407 604.573973 310.960277 289.656276 340.494186 250.209091 313.668818 375.685961 395.207891 372.212013 395.922739 518.132567 [171] 236.033982 153.904597 251.398967 269.716658 289.892015 
276.808427 331.981828 418.310445 428.937272 442.476437 240.797728 261.190025 287.884262 291.945824 377.505718 357.222220 439.167623 [188] 508.546540 506.384512 576.455953 338.610986 379.233211 490.192079 385.599412 347.796891 312.974960 368.327373 489.456760 597.023264 629.596854 390.926836 387.011589 420.571723 448.783017 [205] 420.488264 427.239729 446.563584 544.457022 507.230047 429.581145 213.487459 210.674754 233.427275 324.669889 320.274416 343.859002 460.901246 511.829200 559.265746 569.271746 212.046383 [222] 171.253138 208.280618 355.157846 389.658858 420.393121 459.393067 460.701444 417.070844 591.891105 220.858469 225.820589 161.103275 192.546098 223.735573 254.276829 208.138448 355.807789 [239] 478.889438 529.655452 265.026542 327.465656 335.367964 344.024017 441.070383 395.181131 400.302898 399.401523 442.227773 402.857977 100.560038 191.907863 258.450263 245.728323 318.206772 [256] 232.596658 274.716308 367.087933 378.381705 424.548460 86.647678 186.773009 137.964192 181.254049 225.306060 282.356137 289.775555 434.372195 393.067172 450.225822 2.808493 69.543071 [273] 154.293221 148.374062 239.116384 299.017411 385.205788 369.173122 355.984988 311.386909 83.939565 73.342715 172.213073 226.465591 381.791190 344.678672 437.958440 433.129434 388.803246 [290] 460.822065 151.002615 146.550490 165.092448 233.126971 233.063190 342.850717 322.839504 383.611104 434.550884 420.860852 175.987300 192.132944 291.828149 407.417228 376.046038 455.517528 [307] 474.793803 401.627353 409.658777 482.936557 187.912636 103.597817 193.277461 274.757842 297.964106 302.376147 445.862792 463.573573 487.541269 457.243442 142.685301 116.316482 156.280805 [324] 283.396897 410.733289 509.330898 580.628527 566.949038 512.677754 520.077598 114.521295 151.604562 285.954755 322.988629 406.862507 449.634223 426.439033 540.006364 545.762897 511.509798 [341] 202.720364 153.983957 240.113760 296.907070 305.524232 311.735584 393.940807 365.769869 353.162494 419.480032 $K [1] 1 $X Time 1 1 
2 2 3 3 4 4 5 5 6 6 7 7 8 8 9 9 10 10 11 1 12 2 13 3 14 4 15 5 16 6 17 7 18 8 19 9 20 10 21 1 22 2 23 3 24 4 25 5 26 6 27 7 28 8 29 9 30 10 31 1 32 2 33 3 34 4 35 5 36 6 37 7 38 8 39 9 40 10 41 1 42 2 43 3 44 4 45 5 46 6 47 7 48 8 49 9 50 10 51 1 52 2 53 3 54 4 55 5 56 6 57 7 58 8 59 9 60 10 61 1 62 2 63 3 64 4 65 5 66 6 67 7 68 8 69 9 70 10 71 1 72 2 73 3 74 4 75 5 76 6 77 7 78 8 79 9 80 10 81 1 82 2 83 3 84 4 85 5 86 6 87 7 88 8 89 9 90 10 91 1 92 2 93 3 94 4 95 5 96 6 97 7 98 8 99 9 100 10 101 1 102 2 103 3 104 4 105 5 106 6 107 7 108 8 109 9 110 10 111 1 112 2 113 3 114 4 115 5 116 6 117 7 118 8 119 9 120 10 121 1 122 2 123 3 124 4 125 5 126 6 127 7 128 8 129 9 130 10 131 1 132 2 133 3 134 4 135 5 136 6 137 7 138 8 139 9 140 10 141 1 142 2 143 3 144 4 145 5 146 6 147 7 148 8 149 9 150 10 151 1 152 2 153 3 154 4 155 5 156 6 157 7 158 8 159 9 160 10 161 1 162 2 163 3 164 4 165 5 166 6 167 7 168 8 169 9 170 10 171 1 172 2 173 3 174 4 175 5 176 6 177 7 178 8 179 9 180 10 181 1 182 2 183 3 184 4 185 5 186 6 187 7 188 8 189 9 190 10 191 1 192 2 193 3 194 4 195 5 196 6 197 7 198 8 199 9 200 10 201 1 202 2 203 3 204 4 205 5 206 6 207 7 208 8 209 9 210 10 211 1 212 2 213 3 214 4 215 5 216 6 217 7 218 8 219 9 220 10 221 1 222 2 223 3 224 4 225 5 226 6 227 7 228 8 229 9 230 10 231 1 232 2 233 3 234 4 235 5 236 6 237 7 238 8 239 9 240 10 241 1 242 2 243 3 244 4 245 5 246 6 247 7 248 8 249 9 250 10 251 1 252 2 253 3 254 4 255 5 256 6 257 7 258 8 259 9 260 10 261 1 262 2 263 3 264 4 265 5 266 6 267 7 268 8 269 9 270 10 271 1 272 2 273 3 274 4 275 5 276 6 277 7 278 8 279 9 280 10 281 1 282 2 283 3 284 4 285 5 286 6 287 7 288 8 289 9 290 10 291 1 292 2 293 3 294 4 295 5 296 6 297 7 298 8 299 9 300 10 301 1 302 2 303 3 304 4 305 5 306 6 307 7 308 8 309 9 310 10 311 1 312 2 313 3 314 4 315 5 316 6 317 7 318 8 319 9 320 10 321 1 322 2 323 3 324 4 325 5 326 6 327 7 328 8 329 9 330 10 331 1 332 2 333 3 334 4 335 5 336 6 337 7 338 8 339 9 340 10 341 1 342 2 343 3 344 4 345 5 346 6 
347 7 348 8 349 9 350 10 $J_1 [1] 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 3 3 3 3 3 3 3 3 3 3 4 4 4 4 4 4 4 4 4 4 5 5 5 5 5 5 5 5 5 5 6 6 6 6 6 6 6 6 6 6 7 7 7 7 7 [66] 7 7 7 7 7 8 8 8 8 8 8 8 8 8 8 9 9 9 9 9 9 9 9 9 9 10 10 10 10 10 10 10 10 10 10 11 11 11 11 11 11 11 11 11 11 12 12 12 12 12 12 12 12 12 12 13 13 13 13 13 13 13 13 13 13 [131] 14 14 14 14 14 14 14 14 14 14 15 15 15 15 15 15 15 15 15 15 16 16 16 16 16 16 16 16 16 16 17 17 17 17 17 17 17 17 17 17 18 18 18 18 18 18 18 18 18 18 19 19 19 19 19 19 19 19 19 19 20 20 20 20 20 [196] 20 20 20 20 20 21 21 21 21 21 21 21 21 21 21 22 22 22 22 22 22 22 22 22 22 23 23 23 23 23 23 23 23 23 23 24 24 24 24 24 24 24 24 24 24 25 25 25 25 25 25 25 25 25 25 26 26 26 26 26 26 26 26 26 26 [261] 27 27 27 27 27 27 27 27 27 27 28 28 28 28 28 28 28 28 28 28 29 29 29 29 29 29 29 29 29 29 30 30 30 30 30 30 30 30 30 30 31 31 31 31 31 31 31 31 31 31 32 32 32 32 32 32 32 32 32 32 33 33 33 33 33 [326] 33 33 33 33 33 34 34 34 34 34 34 34 34 34 34 35 35 35 35 35 35 35 35 35 35 $N_1 [1] 35 $K_1 [1] 1 $Z_1 [1] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 [98] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 [195] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 [292] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 $NC_1 [1] 0 $tgroup [1] 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 3 3 3 3 3 3 3 3 3 3 4 4 4 4 4 4 4 4 4 4 5 5 5 5 5 5 5 5 5 5 6 6 6 6 6 6 6 6 6 6 7 7 7 7 7 [66] 7 7 7 7 7 8 8 8 8 8 8 8 8 8 8 9 9 9 9 9 9 9 9 9 9 10 10 10 10 10 10 
10 10 10 10 11 11 11 11 11 11 11 11 11 11 12 12 12 12 12 12 12 12 12 12 13 13 13 13 13 13 13 13 13 13 [131] 14 14 14 14 14 14 14 14 14 14 15 15 15 15 15 15 15 15 15 15 16 16 16 16 16 16 16 16 16 16 17 17 17 17 17 17 17 17 17 17 18 18 18 18 18 18 18 18 18 18 19 19 19 19 19 19 19 19 19 19 20 20 20 20 20 [196] 20 20 20 20 20 21 21 21 21 21 21 21 21 21 21 22 22 22 22 22 22 22 22 22 22 23 23 23 23 23 23 23 23 23 23 24 24 24 24 24 24 24 24 24 24 25 25 25 25 25 25 25 25 25 25 26 26 26 26 26 26 26 26 26 26 [261] 27 27 27 27 27 27 27 27 27 27 28 28 28 28 28 28 28 28 28 28 29 29 29 29 29 29 29 29 29 29 30 30 30 30 30 30 30 30 30 30 31 31 31 31 31 31 31 31 31 31 32 32 32 32 32 32 32 32 32 32 33 33 33 33 33 [326] 33 33 33 33 33 34 34 34 34 34 34 34 34 34 34 35 35 35 35 35 35 35 35 35 35 $E_pre [,1] [1,] 0 [2,] 0 [3,] 0 [4,] 0 [5,] 0 [6,] 0 [7,] 0 [8,] 0 [9,] 0 [10,] 0 [11,] 0 [12,] 0 [13,] 0 [14,] 0 [15,] 0 [16,] 0 [17,] 0 [18,] 0 [19,] 0 [20,] 0 [21,] 0 [22,] 0 [23,] 0 [24,] 0 [25,] 0 [26,] 0 [27,] 0 [28,] 0 [29,] 0 [30,] 0 [31,] 0 [32,] 0 [33,] 0 [34,] 0 [35,] 0 [36,] 0 [37,] 0 [38,] 0 [39,] 0 [40,] 0 [41,] 0 [42,] 0 [43,] 0 [44,] 0 [45,] 0 [46,] 0 [47,] 0 [48,] 0 [49,] 0 [50,] 0 [51,] 0 [52,] 0 [53,] 0 [54,] 0 [55,] 0 [56,] 0 [57,] 0 [58,] 0 [59,] 0 [60,] 0 [61,] 0 [62,] 0 [63,] 0 [64,] 0 [65,] 0 [66,] 0 [67,] 0 [68,] 0 [69,] 0 [70,] 0 [71,] 0 [72,] 0 [73,] 0 [74,] 0 [75,] 0 [76,] 0 [77,] 0 [78,] 0 [79,] 0 [80,] 0 [81,] 0 [82,] 0 [83,] 0 [84,] 0 [85,] 0 [86,] 0 [87,] 0 [88,] 0 [89,] 0 [90,] 0 [91,] 0 [92,] 0 [93,] 0 [94,] 0 [95,] 0 [96,] 0 [97,] 0 [98,] 0 [99,] 0 [100,] 0 [101,] 0 [102,] 0 [103,] 0 [104,] 0 [105,] 0 [106,] 0 [107,] 0 [108,] 0 [109,] 0 [110,] 0 [111,] 0 [112,] 0 [113,] 0 [114,] 0 [115,] 0 [116,] 0 [117,] 0 [118,] 0 [119,] 0 [120,] 0 [121,] 0 [122,] 0 [123,] 0 [124,] 0 [125,] 0 [126,] 0 [127,] 0 [128,] 0 [129,] 0 [130,] 0 [131,] 0 [132,] 0 [133,] 0 [134,] 0 [135,] 0 [136,] 0 [137,] 0 [138,] 0 [139,] 0 [140,] 0 [141,] 0 [142,] 0 [143,] 0 [144,] 0 [145,] 0 
[146,] 0 [147,] 0 [148,] 0 [149,] 0 [150,] 0 [151,] 0 [152,] 0 [153,] 0 [154,] 0 [155,] 0 [156,] 0 [157,] 0 [158,] 0 [159,] 0 [160,] 0 [161,] 0 [162,] 0 [163,] 0 [164,] 0 [165,] 0 [166,] 0 [167,] 0 [168,] 0 [169,] 0 [170,] 0 [171,] 0 [172,] 0 [173,] 0 [174,] 0 [175,] 0 [176,] 0 [177,] 0 [178,] 0 [179,] 0 [180,] 0 [181,] 0 [182,] 0 [183,] 0 [184,] 0 [185,] 0 [186,] 0 [187,] 0 [188,] 0 [189,] 0 [190,] 0 [191,] 0 [192,] 0 [193,] 0 [194,] 0 [195,] 0 [196,] 0 [197,] 0 [198,] 0 [199,] 0 [200,] 0 [201,] 0 [202,] 0 [203,] 0 [204,] 0 [205,] 0 [206,] 0 [207,] 0 [208,] 0 [209,] 0 [210,] 0 [211,] 0 [212,] 0 [213,] 0 [214,] 0 [215,] 0 [216,] 0 [217,] 0 [218,] 0 [219,] 0 [220,] 0 [221,] 0 [222,] 0 [223,] 0 [224,] 0 [225,] 0 [226,] 0 [227,] 0 [228,] 0 [229,] 0 [230,] 0 [231,] 0 [232,] 0 [233,] 0 [234,] 0 [235,] 0 [236,] 0 [237,] 0 [238,] 0 [239,] 0 [240,] 0 [241,] 0 [242,] 0 [243,] 0 [244,] 0 [245,] 0 [246,] 0 [247,] 0 [248,] 0 [249,] 0 [250,] 0 [251,] 0 [252,] 0 [253,] 0 [254,] 0 [255,] 0 [256,] 0 [257,] 0 [258,] 0 [259,] 0 [260,] 0 [261,] 0 [262,] 0 [263,] 0 [264,] 0 [265,] 0 [266,] 0 [267,] 0 [268,] 0 [269,] 0 [270,] 0 [271,] 0 [272,] 0 [273,] 0 [274,] 0 [275,] 0 [276,] 0 [277,] 0 [278,] 0 [279,] 0 [280,] 0 [281,] 0 [282,] 0 [283,] 0 [284,] 0 [285,] 0 [286,] 0 [287,] 0 [288,] 0 [289,] 0 [290,] 0 [291,] 0 [292,] 0 [293,] 0 [294,] 0 [295,] 0 [296,] 0 [297,] 0 [298,] 0 [299,] 0 [300,] 0 [301,] 0 [302,] 0 [303,] 0 [304,] 0 [305,] 0 [306,] 0 [307,] 0 [308,] 0 [309,] 0 [310,] 0 [311,] 0 [312,] 0 [313,] 0 [314,] 0 [315,] 0 [316,] 0 [317,] 0 [318,] 0 [319,] 0 [320,] 0 [321,] 0 [322,] 0 [323,] 0 [324,] 0 [325,] 0 [326,] 0 [327,] 0 [328,] 0 [329,] 0 [330,] 0 [331,] 0 [332,] 0 [333,] 0 [334,] 0 [335,] 0 [336,] 0 [337,] 0 [338,] 0 [339,] 0 [340,] 0 [341,] 0 [342,] 0 [343,] 0 [344,] 0 [345,] 0 [346,] 0 [347,] 0 [348,] 0 [349,] 0 [350,] 0 $Kar [1] 1 $Kma [1] 0 $Karma [1] 1
Worked Examples
Randomized block and simple repeated measures ANOVA (Mixed effects) references
- McCarthy (2007) - Chpt ?
- Kery (2010) - Chpt ?
- Gelman & Hill (2007) - Chpt ?
- Logan (2010) - Chpt 13
- Quinn & Keough (2002) - Chpt 10
Randomized block design
A plant pathologist wanted to examine the effects of two different strengths of tobacco virus on the number of lesions on tobacco leaves. She knew from pilot studies that leaves were inherently very variable in their response to the virus. In an attempt to account for this leaf-to-leaf variability, both treatments were applied to each leaf. Eight individual leaves were divided in half, with half of each leaf inoculated with the weak-strength virus and the other half inoculated with the strong-strength virus. So the leaves were blocks and each treatment was represented once in each block. A completely randomized design would have had 16 leaves, with 8 whole leaves randomly allocated to each treatment.
Download Tobacco data set
Format of tobacco.csv data file

LEAF | TREATMENT | NUMBER
---|---|---
L1 | Strong | 35.90
L1 | Weak | 25.02
L2 | Strong | 34.12
... | ... | ...
Open the tobacco data file.
tobacco <- read.table('../downloads/data/tobacco.csv', header=T, sep=',', strip.white=T)
head(tobacco)
  LEAF TREATMENT   NUMBER
1   L1    Strong 35.89776
2   L1      Weak 25.01984
3   L2    Strong 34.11786
4   L2      Weak 23.16740
5   L3    Strong 35.70215
6   L3      Weak 24.12191
To appreciate the difference between this design (Randomized Complete Block), in which there is a single within-group effect (Treatment), and a nested design (in which there are between-group effects), I will illustrate the current design diagrammatically.
- Note that each level of the Treatment (Strong and Weak) is applied to each Leaf (Block)
- Note that Treatments are randomly applied
- The Treatment effect is the mean difference between Treatment pairs per leaf
- Blocking in this way is very useful when spatial or temporal heterogeneity is likely to add noise that could make it difficult to detect a difference between Treatments. Hence it is a way of experimentally reducing unexplained variation (compared to nesting, which involves statistical reduction of unexplained variation).
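The "mean difference between Treatment pairs per leaf" idea can be sketched directly in R. The mini data frame below is a hypothetical stand-in (values loosely based on the tobacco data), not the actual data set:

```r
# Hypothetical mini version of a randomized complete block data set:
# each leaf (block) receives both treatment levels exactly once.
tob <- data.frame(
  LEAF      = rep(paste0("L", 1:4), each = 2),
  TREATMENT = rep(c("Strong", "Weak"), times = 4),
  NUMBER    = c(35.9, 25.0, 34.1, 23.2, 35.7, 24.1, 26.2, 27.9)
)
# every Leaf x Treatment combination occurs once (a complete block design)
table(tob$LEAF, tob$TREATMENT)
# the Treatment effect is the mean of the within-leaf (Weak - Strong) differences
diffs <- with(tob, tapply(NUMBER, LEAF, diff))
mean(diffs)  # -7.925
```

Because each difference is computed within a leaf, the leaf-to-leaf variability cancels out of the treatment comparison.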
Exploratory data analysis has indicated that the response variable could be normalized via a fourth-root transformation.
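For reference, a fourth-root transformation is simply the response raised to the power 1/4 (with made-up positive counts here, not the tobacco data); it is milder than a log transformation and is defined at zero:

```r
# Fourth-root transformation sketch (hypothetical positive counts)
y  <- c(2, 36, 110, 460)
y4 <- y^(1/4)     # equivalently sqrt(sqrt(y))
round(y4, 2)
# back-transform with y4^4 when presenting results on the original scale
```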
- Fit a model for the Randomized Complete Block
- Full Effects parameterization - random intercepts model (JAGS)
number of lesions_i = β_{Leaf j(i)} + ε_i where ε ∼ N(0, σ²)
View full effects parameterization (JAGS) code

modelString=
## write the model to a text file (I suggest you alter the path to somewhere more relevant
## to your system!)
writeLines(modelString, con="../downloads/BUGSscripts/ws9.3bQ1.1a.txt")

tobacco.list <- with(tobacco,
    list(number=NUMBER,
         treatment=as.numeric(TREATMENT),
         leaf=as.numeric(LEAF),
         n=nrow(tobacco),
         nTreat=length(levels(as.factor(TREATMENT))),
         nLeaf=length(levels(as.factor(LEAF)))
    )
)
params <- c("perc.Treat","p.decline","p.decline5","p.decline10","p.decline20",
            "p.decline50","p.decline100","Treatment.means","beta.leaf","beta.treatment",
            "sigma.res","sigma.leaf","sd.Resid","sd.Leaf","sd.Treatment")
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 100
nIter = ceiling((numSavedSteps * thinSteps)/nChains)

library(R2jags)
library(coda)
##
tobacco.r2jags.a <- jags(data=tobacco.list,
    inits=NULL, #inits=list(inits,inits,inits), # since there are three chains
    parameters.to.save=params,
    model.file="../downloads/BUGSscripts/ws9.3bQ1.1a.txt",
    n.chains=3,
    n.iter=nIter,
    n.burnin=burnInSteps,
    n.thin=thinSteps
)
Compiling model graph Resolving undeclared variables Allocating nodes Graph Size: 151 Initializing model
print(tobacco.r2jags.a)
Inference for Bugs model at "../downloads/BUGSscripts/ws9.3bQ1.1a.txt", fit using jags, 3 chains, each with 1e+05 iterations (first 3000 discarded), n.thin = 100 n.sims = 2910 iterations saved mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff Treatment.means[1] 34.696 2.677 29.460 32.958 34.705 36.351 40.133 1.001 2900 Treatment.means[2] 26.821 2.724 21.542 25.079 26.822 28.500 32.203 1.001 2900 beta.leaf[1] 34.696 2.677 29.460 32.958 34.705 36.351 40.133 1.001 2900 beta.leaf[2] 33.699 2.705 28.447 31.912 33.720 35.536 38.931 1.001 2900 beta.leaf[3] 34.387 2.690 29.106 32.618 34.383 36.186 39.467 1.001 2900 beta.leaf[4] 32.815 2.923 27.004 30.996 32.825 34.693 38.665 1.001 2900 beta.leaf[5] 33.951 2.768 28.461 32.262 33.976 35.715 39.368 1.002 2700 beta.leaf[6] 37.929 3.085 32.006 35.904 37.960 39.929 44.083 1.001 2900 beta.leaf[7] 39.712 3.559 32.697 37.201 39.900 42.148 46.370 1.001 2900 beta.leaf[8] 32.691 2.945 26.849 30.761 32.739 34.617 38.363 1.001 2800 beta.treatment[1] 0.000 0.000 0.000 0.000 0.000 0.000 0.000 1.000 1 beta.treatment[2] -7.876 2.369 -12.663 -9.294 -7.876 -6.448 -3.066 1.001 2900 p.decline 0.999 0.037 1.000 1.000 1.000 1.000 1.000 1.029 2900 p.decline10 0.969 0.173 0.000 1.000 1.000 1.000 1.000 1.002 2900 p.decline100 0.000 0.000 0.000 0.000 0.000 0.000 0.000 1.000 1 p.decline20 0.688 0.463 0.000 0.000 1.000 1.000 1.000 1.001 2900 p.decline5 0.992 0.087 1.000 1.000 1.000 1.000 1.000 1.031 2100 p.decline50 0.000 0.019 0.000 0.000 0.000 0.000 0.000 1.291 2900 perc.Treat -22.612 6.343 -35.286 -26.524 -22.669 -18.856 -9.250 1.001 2900 sd.Leaf 3.291 1.494 0.264 2.269 3.458 4.311 5.947 1.007 1200 sd.Resid 6.540 0.000 6.540 6.540 6.540 6.540 6.540 1.000 1 sd.Treatment 2.034 0.610 0.792 1.665 2.034 2.400 3.270 1.001 2900 sigma.leaf 4.062 2.406 0.300 2.485 3.780 5.295 9.579 1.003 2300 sigma.res 4.610 1.346 2.676 3.637 4.423 5.330 7.784 1.001 2400 deviance 92.092 6.615 80.458 87.095 91.801 97.054 105.117 1.001 2900 For each parameter, n.eff is a crude 
measure of effective sample size, and Rhat is the potential scale reduction factor (at convergence, Rhat=1). DIC info (using the rule, pD = var(deviance)/2) pD = 21.9 and DIC = 114.0 DIC is an estimate of expected predictive error (lower deviance is better).
tobacco.mcmc.a <- tobacco.r2jags.a$BUGSoutput$sims.matrix
Note that the Leaf β are not actually the leaf means. Rather they are the intercepts for each leaf (since we have fit a random intercepts model). They represent the value of the first Treatment level (Strong) within each Leaf.
The Treatment effect thus represents the mean of the differences between the second Treatment level (Weak) and these intercepts.
Indeed, the analysis assumes that there is no real interaction between the Leaves (Blocks) and the Treatment effects (within Blocks) - otherwise the mean Treatment effect would be an oversimplification of the true nature of the populations. The presence of such an interaction would indicate that the Blocks (Leaves) do not all represent the same population.
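Derived quantities such as p.decline and perc.Treat in the JAGS output are simple posterior summaries. The sketch below shows how they can be computed from draws; the draws here are simulated stand-ins (rnorm), not the actual MCMC samples:

```r
set.seed(1)
# hypothetical posterior draws standing in for the real MCMC samples
treatment.effect <- rnorm(3000, mean = -7.9, sd = 2.4)   # Weak - Strong
strong.mean      <- rnorm(3000, mean = 34.7, sd = 2.7)
# probability that lesion numbers decline at all under the Weak treatment
p.decline <- mean(treatment.effect < 0)
# percentage change relative to the Strong mean, and P(decline of at least 20%)
perc.Treat  <- 100 * treatment.effect / strong.mean
p.decline20 <- mean(perc.Treat < -20)
```

Each quantity is just the proportion of posterior draws satisfying the condition of interest, which is why they can also be monitored directly inside the JAGS model.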
View full code

modelString="
model {
  #Likelihood
  for (i in 1:n) {
    number[i]~dnorm(mean[i],tau.res)
    mean[i] <- beta[leaf[i],treatment[i]]
    y.err[i] <- number[i] - mean[i]
  }
  #Priors and derivatives
  for (i in 1:nLeaf) {
    for (j in 1:nTreat) {
      beta[i,j] ~ dnorm(0,1.0E-6)
    }
  }
  #Effects
  beta0 <- beta[1,1]
  for (i in 1:nLeaf) {
    beta.leaf[i] <- mean(beta[i,]-beta[1,])
  }
  #Leaf.means[1] <- mean(beta[1,])
  for (i in 1:nLeaf) {
    Leaf.means[i] <- mean(beta[i,])
  }
  for (i in 1:nTreat) {
    beta.treatment[i] <- mean(beta[,i]-beta[,1])
  }
  for (i in 1:nTreat) {
    Treatment.means[i] <- mean(beta[,i])
  }
  for (i in 1:nLeaf){
    for (j in 1:nTreat){
      beta.int[i,j] <- beta[i,j]-beta.leaf[i]-beta.treatment[j]-beta0
    }
  }
  tau.res <- pow(sigma.res,-2)  # precision = 1/variance
  #sigma.res ~ dgamma(0.001,0.001)
  sigma.res ~ dunif(0,100)
  sd.Leaf <- sd(Leaf.means)
  sd.Treatment <- sd(Treatment.means)
  sd.Int <- sd(beta.int[,])
  sd.Resid <- sd(y.err)
}
"
## write the model to a text file (I suggest you alter the path to somewhere more relevant
## to your system!)
writeLines(modelString, con="../downloads/BUGSscripts/ws9.3bQ1.1a1.txt")

tobacco.list <- with(tobacco,
    list(number=NUMBER,
         treatment=as.numeric(TREATMENT),
         leaf=as.numeric(LEAF),
         n=nrow(tobacco),
         nTreat=length(levels(as.factor(TREATMENT))),
         nLeaf=length(levels(as.factor(LEAF)))
    )
)
params <- c("Leaf.means","Treatment.means","beta","beta.leaf","beta.treatment",
            "sigma.res","sd.Leaf","sd.Treatment","sd.Int","sd.Resid")
adaptSteps = 1000
burnInSteps = 200
nChains = 3
numSavedSteps = 5000
thinSteps = 10
nIter = ceiling((numSavedSteps * thinSteps)/nChains)

library(R2jags)
##
tobacco.r2jags2 <- jags(data=tobacco.list,
    inits=NULL, #inits=list(inits,inits,inits), # since there are three chains
    parameters.to.save=params,
    model.file="../downloads/BUGSscripts/ws9.3bQ1.1a1.txt",
    n.chains=3,
    n.iter=nIter,
    n.burnin=burnInSteps,
    n.thin=thinSteps
)
Compiling model graph Resolving undeclared variables Allocating nodes Graph Size: 185 Initializing model
print(tobacco.r2jags2)
Inference for Bugs model at "../downloads/BUGSscripts/ws9.3bQ1.1a1.txt", fit using jags, 3 chains, each with 16667 iterations (first 200 discarded), n.thin = 10 n.sims = 4941 iterations saved mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff Leaf.means[1] 30.460 0.517 30.360 30.449 30.459 30.469 30.557 1.233 4900 Leaf.means[2] 28.639 0.791 28.548 28.632 28.642 28.652 28.742 1.276 4900 Leaf.means[3] 29.913 0.523 29.822 29.902 29.912 29.923 30.003 1.275 4200 Leaf.means[4] 27.081 0.421 26.986 27.072 27.082 27.092 27.199 1.251 4900 Leaf.means[5] 29.105 0.421 29.009 29.093 29.103 29.113 29.203 1.249 4900 Leaf.means[6] 36.364 0.462 36.265 36.351 36.361 36.371 36.462 1.270 4900 Leaf.means[7] 39.594 0.714 39.512 39.594 39.604 39.614 39.714 1.278 1900 Leaf.means[8] 26.837 0.573 26.736 26.831 26.841 26.851 26.947 1.274 4500 Treatment.means[1] 34.941 0.196 34.889 34.935 34.940 34.945 34.991 1.223 4900 Treatment.means[2] 27.057 0.286 27.015 27.056 27.061 27.066 27.111 1.281 4300 beta[1,1] 35.909 0.704 35.764 35.884 35.898 35.912 36.039 1.268 4900 beta[2,1] 34.111 1.022 33.969 34.103 34.117 34.131 34.252 1.279 4900 beta[3,1] 35.726 0.672 35.581 35.689 35.702 35.717 35.849 1.235 3700 beta[4,1] 26.224 0.640 26.074 26.210 26.224 26.239 26.374 1.282 4900 beta[5,1] 33.009 0.673 32.883 33.003 33.017 33.031 33.168 1.285 3200 beta[6,1] 36.719 0.693 36.580 36.714 36.728 36.743 36.850 1.279 2500 beta[7,1] 44.722 0.741 44.569 44.710 44.723 44.738 44.861 1.268 2600 beta[8,1] 33.109 1.329 32.949 33.095 33.110 33.124 33.240 1.280 3300 beta[1,2] 25.011 0.646 24.874 25.006 25.020 25.034 25.152 1.222 3200 beta[2,2] 23.168 0.727 23.034 23.153 23.167 23.182 23.317 1.263 4900 beta[3,2] 24.099 0.738 23.979 24.108 24.122 24.137 24.251 1.268 1300 beta[4,2] 27.939 0.528 27.811 27.925 27.939 27.953 28.085 1.221 4900 beta[5,2] 25.201 0.902 25.042 25.173 25.188 25.202 25.318 1.275 4900 beta[6,2] 36.008 0.600 35.877 35.980 35.993 36.008 36.147 1.253 4900 beta[7,2] 34.466 0.865 34.345 34.470 34.484 34.498 
34.634 1.276 1600 beta[8,2] 20.565 0.689 20.436 20.557 20.572 20.587 20.728 1.271 4900 beta.leaf[1] 0.000 0.000 0.000 0.000 0.000 0.000 0.000 1.000 1 beta.leaf[2] -1.821 0.770 -1.962 -1.831 -1.817 -1.802 -1.674 1.276 4900 beta.leaf[3] -0.548 0.586 -0.678 -0.561 -0.546 -0.532 -0.406 1.243 4900 beta.leaf[4] -3.379 0.593 -3.518 -3.391 -3.377 -3.363 -3.223 1.217 4900 beta.leaf[5] -1.355 0.719 -1.507 -1.371 -1.356 -1.342 -1.230 1.220 4900 beta.leaf[6] 5.904 0.770 5.763 5.888 5.902 5.916 6.033 1.265 4900 beta.leaf[7] 9.134 0.907 9.007 9.132 9.145 9.159 9.298 1.255 3400 beta.leaf[8] -3.623 0.912 -3.771 -3.632 -3.618 -3.604 -3.485 1.265 4900 beta.treatment[1] 0.000 0.000 0.000 0.000 0.000 0.000 0.000 1.000 1 beta.treatment[2] -7.884 0.309 -7.945 -7.886 -7.879 -7.872 -7.808 1.236 4900 sd.Int 2.622 0.435 2.569 2.595 2.598 2.602 2.642 1.255 340 sd.Leaf 4.586 0.377 4.536 4.564 4.568 4.572 4.612 1.271 570 sd.Resid 6.540 0.000 6.540 6.540 6.540 6.540 6.540 1.000 1 sd.Treatment 5.575 0.218 5.521 5.567 5.571 5.576 5.618 1.246 4900 sigma.res 51.375 28.812 3.174 26.416 51.698 76.713 97.751 1.038 160 deviance -71.342 32.120 -105.102 -93.550 -80.596 -58.897 10.262 1.036 180 For each parameter, n.eff is a crude measure of effective sample size, and Rhat is the potential scale reduction factor (at convergence, Rhat=1). DIC info (using the rule, pD = var(deviance)/2) pD = 510.4 and DIC = 439.1 DIC is an estimate of expected predictive error (lower deviance is better).
View matrix parameterization (JAGS) code

modelString="
model {
  #Likelihood
  for (i in 1:n) {
    y[i]~dnorm(mu[i],tau.res)
    mu[i] <- inprod(gamma[],Z[i,]) + inprod(beta[],X[i,])
    y.err[i] <- y[i] - mu[i]
  }
  #Priors and derivatives
  for (i in 1:nZ) {
    gamma[i] ~ dnorm(mu.gamma,tau.leaf)
  }
  mu.gamma ~ dnorm(0,1.0E-06)
  for (i in 1:nX) {
    beta[i] ~ dnorm(0,1.0E-6)
  }
  Treatment.means[1] <- beta[1]
  Treatment.means[2] <- beta[1]+beta[2]
  # Half-cauchy (scale=25) priors on variance
  tau.res <- pow(sigma.res,-2)
  sigma.res <- z/sqrt(chSq)
  z ~ dnorm(0, .0016)I(0,)
  chSq ~ dgamma(0.5, 0.5)
  tau.leaf <- pow(sigma.leaf,-2)
  sigma.leaf <- z.leaf/sqrt(chSq.leaf)
  z.leaf ~ dnorm(0, .0016)I(0,)
  chSq.leaf ~ dgamma(0.5, 0.5)
  sd.Leaf <- sd(gamma)
  sd.Treatment <- sd(Treatment.means)
  sd.Resid <- sd(y.err)
}
"
## write the model to a text file (I suggest you alter the path to somewhere more relevant
## to your system!)
writeLines(modelString, con="../downloads/BUGSscripts/ws9.3bQ1.1b.txt")

Xmat <- model.matrix(~TREATMENT, tobacco)
Zmat <- model.matrix(~LEAF, tobacco)
tobacco.list <- with(tobacco,
    list(y=NUMBER,
         X=Xmat, nX=ncol(Xmat),
         Z=Zmat, nZ=ncol(Zmat),
         n=nrow(tobacco)
    )
)
tobacco.list
$y [1] 35.90 25.02 34.12 23.17 35.70 24.12 26.22 27.94 33.02 25.19 35.99 36.73 34.48 44.72 20.57 33.11 $X (Intercept) TREATMENTWeak 1 1 0 2 1 1 3 1 0 4 1 1 5 1 0 6 1 1 7 1 0 8 1 1 9 1 0 10 1 1 11 1 1 12 1 0 13 1 1 14 1 0 15 1 1 16 1 0 attr(,"assign") [1] 0 1 attr(,"contrasts") attr(,"contrasts")$TREATMENT [1] "contr.treatment" $nX [1] 2 $Z (Intercept) LEAFL2 LEAFL3 LEAFL4 LEAFL5 LEAFL6 LEAFL7 LEAFL8 1 1 0 0 0 0 0 0 0 2 1 0 0 0 0 0 0 0 3 1 1 0 0 0 0 0 0 4 1 1 0 0 0 0 0 0 5 1 0 1 0 0 0 0 0 6 1 0 1 0 0 0 0 0 7 1 0 0 1 0 0 0 0 8 1 0 0 1 0 0 0 0 9 1 0 0 0 1 0 0 0 10 1 0 0 0 1 0 0 0 11 1 0 0 0 0 1 0 0 12 1 0 0 0 0 1 0 0 13 1 0 0 0 0 0 1 0 14 1 0 0 0 0 0 1 0 15 1 0 0 0 0 0 0 1 16 1 0 0 0 0 0 0 1 attr(,"assign") [1] 0 1 1 1 1 1 1 1 attr(,"contrasts") attr(,"contrasts")$LEAF [1] "contr.treatment" $nZ [1] 8 $n [1] 16
params <- c("Treatment.means","gamma","beta","sigma.res","sigma.leaf",
            "sd.Resid","sd.Leaf","sd.Treatment")
burnInSteps = 3000
nChains = 3
numSavedSteps = 3000
thinSteps = 100
nIter = ceiling((numSavedSteps * thinSteps)/nChains)

library(R2jags)
##
tobacco.r2jags.b <- jags(data=tobacco.list,
    inits=NULL, #inits=list(inits,inits,inits), # since there are three chains
    parameters.to.save=params,
    model.file="../downloads/BUGSscripts/ws9.3bQ1.1b.txt",
    n.chains=3,
    n.iter=nIter,
    n.burnin=burnInSteps,
    n.thin=thinSteps
)
Compiling model graph Resolving undeclared variables Allocating nodes Graph Size: 288 Initializing model
print(tobacco.r2jags.b)
Inference for Bugs model at "../downloads/BUGSscripts/ws9.3bQ1.1b.txt", fit using jags, 3 chains, each with 1e+05 iterations (first 3000 discarded), n.thin = 100 n.sims = 2910 iterations saved mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff Treatment.means[1] 33.617 9.065 16.294 27.972 33.640 39.220 51.619 1.001 2700 Treatment.means[2] 25.765 9.083 8.148 20.169 25.644 31.302 43.939 1.001 2700 beta[1] 33.617 9.065 16.294 27.972 33.640 39.220 51.619 1.001 2700 beta[2] -7.852 2.475 -12.688 -9.391 -7.896 -6.381 -2.819 1.001 2700 gamma[1] 0.682 6.830 -12.926 -3.302 0.736 4.630 14.115 1.002 2900 gamma[2] -0.726 4.353 -9.381 -3.383 -0.682 1.929 7.993 1.002 1100 gamma[3] 0.051 4.280 -8.644 -2.554 -0.001 2.606 8.432 1.001 2300 gamma[4] -1.668 4.402 -10.107 -4.456 -1.847 1.021 7.512 1.001 2900 gamma[5] -0.432 4.288 -8.802 -3.126 -0.453 2.273 8.139 1.003 770 gamma[6] 3.774 4.497 -5.724 1.007 4.022 6.783 12.086 1.003 730 gamma[7] 5.668 4.814 -5.001 2.714 6.028 8.875 14.321 1.003 970 gamma[8] -1.780 4.454 -10.410 -4.633 -1.827 0.912 7.221 1.001 2900 sd.Leaf 3.828 1.798 0.368 2.682 3.888 4.927 7.529 1.001 2900 sd.Resid 4.301 0.941 2.964 3.591 4.119 4.897 6.362 1.001 2900 sd.Treatment 5.562 1.720 2.021 4.512 5.584 6.640 8.972 1.002 2800 sigma.leaf 4.673 2.866 0.409 2.877 4.254 5.985 11.737 1.001 2900 sigma.res 4.719 1.462 2.627 3.656 4.465 5.446 8.243 1.001 2900 deviance 92.673 7.095 80.563 87.201 92.125 97.650 107.445 1.001 2900 For each parameter, n.eff is a crude measure of effective sample size, and Rhat is the potential scale reduction factor (at convergence, Rhat=1). DIC info (using the rule, pD = var(deviance)/2) pD = 25.2 and DIC = 117.9 DIC is an estimate of expected predictive error (lower deviance is better).
tobacco.mcmc.b <- tobacco.r2jags.b$BUGSoutput$sims.matrix
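With the posterior samples in matrix form, credible intervals are just quantiles of the relevant column. A sketch with a simulated stand-in matrix (the column name mimics the JAGS output; HPDinterval() from the coda package is a common alternative to equal-tailed quantiles):

```r
set.seed(2)
# simulated stand-in for a sims matrix such as tobacco.mcmc.b
sims <- cbind("beta[2]" = rnorm(2910, mean = -7.85, sd = 2.48))
# 95% equal-tailed credible interval and posterior median
quantile(sims[, "beta[2]"], probs = c(0.025, 0.5, 0.975))
```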
Matrix parameterization STAN code

rstanString="
data{
  int n;
  int nZ;
  int nX;
  vector [n] y;
  matrix [n,nX] X;
  matrix [n,nZ] Z;
  vector [nX] a0;
  matrix [nX,nX] A0;
}
parameters{
  vector [nX] beta;
  real<lower=0> sigma;
  vector [nZ] gamma;
  real<lower=0> sigma_Z;
}
transformed parameters {
  vector [n] mu;
  mu <- Z*gamma + X*beta;
}
model{
  // Priors
  beta ~ multi_normal(a0,A0);
  gamma ~ normal( 0 , sigma_Z );
  sigma_Z ~ cauchy(0,25);
  sigma ~ cauchy(0,25);
  y ~ normal( mu , sigma );
}
generated quantities {
  vector [n] y_err;
  real<lower=0> sd_Resid;
  y_err <- y - mu;
  sd_Resid <- sd(y_err);
}
"
# Generate a data list
Xmat <- model.matrix(~TREATMENT, data=tobacco)
Zmat <- model.matrix(~-1+LEAF, data=tobacco)
tobacco.list <- with(tobacco,
    list(y=NUMBER,
         Z=Zmat, X=Xmat,
         nX=ncol(Xmat), nZ=ncol(Zmat),
         n=nrow(tobacco),
         a0=rep(0,ncol(Xmat)),
         A0=diag(100000,ncol(Xmat))
    )
)
# define parameters
burnInSteps = 6000
nChains = 3
numSavedSteps = 3000
thinSteps = 100
nIter = burnInSteps+ceiling((numSavedSteps * thinSteps)/nChains)

library(rstan)
tobacco.rstan.a <- stan(data=tobacco.list,
    model_code=rstanString,
    pars=c('beta','gamma','sigma','sigma_Z','sd_Resid'),
    chains=nChains,
    iter=nIter,
    warmup=burnInSteps,
    thin=thinSteps,
    save_dso=TRUE
)
TRANSLATING MODEL 'rstanString' FROM Stan CODE TO C++ CODE NOW.
COMPILING THE C++ CODE FOR MODEL 'rstanString' NOW.

SAMPLING FOR MODEL 'rstanString' NOW (CHAIN 1).

Iteration:      1 / 106000 [  0%]  (Warmup)
Iteration:   6001 / 106000 [  5%]  (Sampling)
Iteration:  16600 / 106000 [ 15%]  (Sampling)
Iteration:  27200 / 106000 [ 25%]  (Sampling)
Iteration:  37800 / 106000 [ 35%]  (Sampling)
Iteration:  48400 / 106000 [ 45%]  (Sampling)
Iteration:  59000 / 106000 [ 55%]  (Sampling)
Iteration:  69600 / 106000 [ 65%]  (Sampling)
Iteration:  80200 / 106000 [ 75%]  (Sampling)
Iteration:  90800 / 106000 [ 85%]  (Sampling)
Iteration: 101400 / 106000 [ 95%]  (Sampling)
Iteration: 106000 / 106000 [100%]  (Sampling)
#  Elapsed Time: 0.61 seconds (Warm-up)
#                12.95 seconds (Sampling)
#                13.56 seconds (Total)

SAMPLING FOR MODEL 'rstanString' NOW (CHAIN 2).

Iteration:      1 / 106000 [  0%]  (Warmup)
Iteration:   6001 / 106000 [  5%]  (Sampling)
Iteration:  16600 / 106000 [ 15%]  (Sampling)
Iteration:  27200 / 106000 [ 25%]  (Sampling)
Iteration:  37800 / 106000 [ 35%]  (Sampling)
Iteration:  48400 / 106000 [ 45%]  (Sampling)
Iteration:  59000 / 106000 [ 55%]  (Sampling)
Iteration:  69600 / 106000 [ 65%]  (Sampling)
Iteration:  80200 / 106000 [ 75%]  (Sampling)
Iteration:  90800 / 106000 [ 85%]  (Sampling)
Iteration: 101400 / 106000 [ 95%]  (Sampling)
Iteration: 106000 / 106000 [100%]  (Sampling)
#  Elapsed Time: 0.62 seconds (Warm-up)
#                11.21 seconds (Sampling)
#                11.83 seconds (Total)

SAMPLING FOR MODEL 'rstanString' NOW (CHAIN 3).

Iteration:      1 / 106000 [  0%]  (Warmup)
Iteration:   6001 / 106000 [  5%]  (Sampling)
Iteration:  16600 / 106000 [ 15%]  (Sampling)
Iteration:  27200 / 106000 [ 25%]  (Sampling)
Iteration:  37800 / 106000 [ 35%]  (Sampling)
Iteration:  48400 / 106000 [ 45%]  (Sampling)
Iteration:  59000 / 106000 [ 55%]  (Sampling)
Iteration:  69600 / 106000 [ 65%]  (Sampling)
Iteration:  80200 / 106000 [ 75%]  (Sampling)
Iteration:  90800 / 106000 [ 85%]  (Sampling)
Iteration: 101400 / 106000 [ 95%]  (Sampling)
Iteration: 106000 / 106000 [100%]  (Sampling)
#  Elapsed Time: 0.56 seconds (Warm-up)
#                13.18 seconds (Sampling)
#                13.74 seconds (Total)
print(tobacco.rstan.a)
Inference for Stan model: rstanString.
3 chains, each with iter=106000; warmup=6000; thin=100;
post-warmup draws per chain=1000, total post-warmup draws=3000.

           mean se_mean   sd   2.5%    25%    50%    75%  97.5% n_eff Rhat
beta[1]   34.97    0.04 2.38  30.33  33.49  34.97  36.42  39.80  3000    1
beta[2]   -7.90    0.04 2.41 -12.76  -9.38  -7.83  -6.41  -3.15  2995    1
gamma[1]  -0.40    0.05 2.63  -6.00  -1.89  -0.28   1.15   4.80  3000    1
gamma[2]  -1.25    0.05 2.82  -7.49  -2.81  -1.03   0.50   4.09  3000    1
gamma[3]  -0.62    0.05 2.72  -6.43  -2.15  -0.47   0.93   4.91  2938    1
gamma[4]  -2.18    0.05 2.90  -8.50  -3.92  -1.89  -0.20   2.98  3000    1
gamma[5]  -1.14    0.05 2.75  -7.03  -2.77  -0.85   0.57   3.92  3000    1
gamma[6]   2.95    0.06 3.08  -2.21   0.73   2.64   4.77   9.73  2893    1
gamma[7]   4.66    0.06 3.47  -0.93   1.87   4.62   6.97  11.81  3000    1
gamma[8]  -2.24    0.05 2.97  -9.00  -3.96  -1.97  -0.21   2.98  3000    1
sigma      4.64    0.02 1.34   2.74   3.65   4.40   5.38   7.81  3000    1
sigma_Z    4.08    0.05 2.44   0.60   2.47   3.79   5.20   9.71  2716    1
sd_Resid   4.23    0.02 0.83   2.96   3.57   4.13   4.82   5.93  3000    1
lp__     -42.01    0.08 3.95 -50.01 -44.32 -41.84 -39.67 -33.64  2426    1

Samples were drawn using NUTS(diag_e) at Mon Mar 9 07:26:24 2015.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at
convergence, Rhat=1).
tobacco.mcmc.d <- rstan:::as.mcmc.list.stanfit(tobacco.rstan.a)
tobacco.mcmc.df.d <- as.data.frame(extract(tobacco.rstan.a))
library(plyr)
# Finite-population standard deviations
## Leaf
library(coda)
sd.leaf <- tobacco.mcmc.df.d[,3:10]
SD.leaf <- apply(sd.leaf,1,sd)
data.frame(mean=mean(SD.leaf), median=median(SD.leaf),
           HPDinterval(as.mcmc(SD.leaf)),
           HPDinterval(as.mcmc(SD.leaf),p=0.68))
         mean   median     lower    upper  lower.1 upper.1
var1 3.298585 3.402101 0.3716819 5.626209 1.941839 4.91738
# Treatment
treatments <- NULL
treatments <- cbind(tobacco.mcmc.df.d[,'beta.1'],
                    tobacco.mcmc.df.d[,'beta.1']+tobacco.mcmc.df.d[,'beta.2'])
sd.treatment <- apply(treatments,1,sd)
data.frame(mean=mean(sd.treatment), median=median(sd.treatment),
           HPDinterval(as.mcmc(sd.treatment)),
           HPDinterval(as.mcmc(sd.treatment),p=0.68))
         mean   median    lower   upper  lower.1  upper.1
var1 5.590889 5.533738 2.255233 9.03083 3.965325 7.075698
- Full Effects parameterization - random intercepts model (JAGS)
- Before fully exploring the parameters, it is prudent to examine the convergence and mixing diagnostics.
Choose any of the parameterizations (they should yield much the same results), although it is sometimes useful to compare the
performance of the effects vs matrix parameterizations and of JAGS vs STAN.
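In addition to the traceplots, autocorrelation and Raftery diagnostics shown for each parameterization, the coda package also provides Gelman-Rubin and effective-sample-size summaries. A minimal sketch, assuming the tobacco.mcmc.d mcmc.list generated earlier from the STAN fit:

```r
library(coda)
# Potential scale reduction factors (values close to 1 suggest the
# chains have converged on the same posterior)
gelman.diag(tobacco.mcmc.d)
# Approximate number of independent samples per parameter
effectiveSize(tobacco.mcmc.d)
```

The same calls apply equally to `as.mcmc(tobacco.r2jags.a)` or `as.mcmc(tobacco.r2jags.b)`.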
Full effects parameterization JAGS code
library(coda)
plot(as.mcmc(tobacco.r2jags.a))
preds <- c("beta.leaf[1]","beta.treatment[2]", "sigma.res", "sigma.leaf")
autocorr.diag(as.mcmc(tobacco.r2jags.a)[, preds])
         beta.leaf[1] beta.treatment[2]    sigma.res   sigma.leaf
Lag 0      1.00000000       1.000000000  1.000000000   1.00000000
Lag 100   -0.02310616      -0.004846084  0.018933504   0.01600178
Lag 500   -0.01362871       0.019976354 -0.009438916  -0.01285453
Lag 1000   0.01127440       0.001736608  0.011160479  -0.01859763
Lag 5000  -0.03264369      -0.032800331 -0.021046655   0.01361098
raftery.diag(as.mcmc(tobacco.r2jags.a))
[[1]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s

[[2]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s

[[3]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s
Matrix parameterization JAGS code

library(coda)
plot(as.mcmc(tobacco.r2jags.b))
preds <- c("beta[1]","beta[2]", "sigma.res", "sigma.leaf")
autocorr.diag(as.mcmc(tobacco.r2jags.b)[, preds])
               beta[1]      beta[2]    sigma.res   sigma.leaf
Lag 0     1.0000000000  1.000000000  1.000000000  1.000000000
Lag 100  -0.0120118544 -0.019808713  0.001787276  0.018949717
Lag 500   0.0006146483 -0.002923409 -0.003111989 -0.002902121
Lag 1000 -0.0098257947  0.018441595  0.008440164  0.023778084
Lag 5000  0.0348224150 -0.010366829  0.007009670 -0.012140377
raftery.diag(as.mcmc(tobacco.r2jags.b))
[[1]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s

[[2]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s

[[3]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s
Matrix parameterization STAN code

library(coda)
preds <- c("beta[1]", "beta[2]", "sigma", "sigma_Z")
plot(tobacco.mcmc.d)
autocorr.diag(tobacco.mcmc.d[, preds])
             beta[1]       beta[2]        sigma      sigma_Z
Lag 0     1.00000000  1.0000000000  1.000000000  1.000000000
Lag 100  -0.01695628 -0.0007715059 -0.018002793  0.045442483
Lag 500  -0.03455833 -0.0345866232  0.010145404 -0.011939255
Lag 1000  0.03571482  0.0228961331 -0.007372627  0.009944821
Lag 5000 -0.02002230 -0.0349517250 -0.025108536  0.018566023
raftery.diag(tobacco.mcmc.d)
[[1]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s

[[2]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s

[[3]]

Quantile (q) = 0.025
Accuracy (r) = +/- 0.005
Probability (s) = 0.95

You need a sample size of at least 3746 with these values of q, r and s
- $R^2$ calculations
$R^2$ calculations from matrix parameterization of JAGS code
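The calculations below follow the marginal/conditional $R^2$ decomposition for mixed models (in the spirit of Nakagawa and Schielzeth 2013), computed per MCMC iteration from the variance of the fixed-effects fitted values ($\sigma^2_f$), the block (LEAF) variance ($\sigma^2_{b}$) and the residual variance ($\sigma^2_{\varepsilon}$):

$$
R^2_{marginal} = \frac{\sigma^2_f}{\sigma^2_f + \sigma^2_{b} + \sigma^2_{\varepsilon}}, \qquad
R^2_{conditional} = \frac{\sigma^2_f + \sigma^2_{b}}{\sigma^2_f + \sigma^2_{b} + \sigma^2_{\varepsilon}}
$$

The marginal $R^2$ reflects the variance explained by the TREATMENT fixed effect alone; the conditional $R^2$ adds the variance absorbed by the blocking factor.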
# R2 calculations
Xmat <- model.matrix(~TREATMENT, tobacco)
coefs <- tobacco.r2jags.b$BUGSoutput$sims.list[['beta']]
fitted <- coefs %*% t(Xmat)
X.var <- aaply(fitted,1,function(x){var(x)})
X.var[1:10]
        1         2         3         4         5         6         7         8         9        10
29.677582 33.969637 10.949423 42.947571 16.358872 19.447180 26.444414  8.866723 11.674656 17.731018
Z.var <- tobacco.r2jags.b$BUGSoutput$sims.list[['sd.Leaf']]^2
R.var <- tobacco.r2jags.b$BUGSoutput$sims.list[['sd.Resid']]^2

R2.marginal <- (X.var)/(X.var+Z.var+R.var)
R2.marginal <- data.frame(Mean=mean(R2.marginal), Median=median(R2.marginal),
                          HPDinterval(as.mcmc(R2.marginal)))
R2.conditional <- (X.var+Z.var)/(X.var+Z.var+R.var)
R2.conditional <- data.frame(Mean=mean(R2.conditional), Median=median(R2.conditional),
                             HPDinterval(as.mcmc(R2.conditional)))
R2.block <- (Z.var)/(X.var+Z.var+R.var)
R2.block <- data.frame(Mean=mean(R2.block), Median=median(R2.block),
                       HPDinterval(as.mcmc(R2.block)))
R2.res <- (R.var)/(X.var+Z.var+R.var)
R2.res <- data.frame(Mean=mean(R2.res), Median=median(R2.res),
                     HPDinterval(as.mcmc(R2.res)))
(tobacco.R2 <- rbind(R2.block=R2.block, R2.marginal=R2.marginal,
                     R2.res=R2.res, R2.conditional=R2.conditional))
                    Mean    Median        lower     upper
R2.block       0.3039250 0.3078796 3.470989e-07 0.5948360
R2.marginal    0.3225995 0.3287253 6.488894e-02 0.5786886
R2.res         0.3734755 0.3328591 1.353632e-01 0.7715377
R2.conditional 0.6265245 0.6671409 2.284623e-01 0.8646368
- Summary figure time
Matrix parameterization JAGS code
preds <- c('beta[1]','beta[2]')
coefs <- tobacco.r2jags.b$BUGSoutput$sims.matrix[,preds]
newdata <- data.frame(TREAT=levels(tobacco$TREAT))
Xmat <- model.matrix(~TREAT, newdata)
pred <- coefs %*% t(Xmat)
library(plyr)
newdata <- cbind(newdata,
                 adply(pred, 2, function(x) {
                   data.frame(Median=median(x),
                              HPDinterval(as.mcmc(x)),
                              HPDinterval(as.mcmc(x), p=0.68))
                 }))
newdata
   TREAT X1   Median    lower    upper  lower.1  upper.1
1 Strong  1 33.64026 16.40838 51.70186 24.87108 41.38903
2   Weak  2 25.64370  8.67158 44.14395 17.29038 34.04036
library(ggplot2)
p1 <- ggplot(newdata, aes(y=Median, x=TREAT)) +
  geom_linerange(aes(ymin=lower, ymax=upper), show_guide=FALSE) +
  geom_linerange(aes(ymin=lower.1, ymax=upper.1), size=2, show_guide=FALSE) +
  geom_point(size=4, shape=21, fill="white") +
  scale_y_continuous('Number of lesions') +
  theme_classic() +
  theme(axis.title.y=element_text(vjust=2,size=rel(1.2)),
        axis.title.x=element_text(vjust=-2,size=rel(1.2)),
        plot.margin=unit(c(0.5,0.5,2,2), 'lines'))

preds <- c('sd.Resid','sd.Treatment','sd.Leaf')
tobacco.sd <- adply(tobacco.mcmc.b[,preds],2,function(x) {
  data.frame(mean=mean(x), median=median(x),
             HPDinterval(as.mcmc(x)),
             HPDinterval(as.mcmc(x),p=0.68))
})
head(tobacco.sd)
            X1     mean   median       lower    upper  lower.1  upper.1
1     sd.Resid 4.300579 4.118987 2.807450690 5.993276 3.152196 4.902288
2 sd.Treatment 5.561850 5.583537 2.131784567 9.044775 4.070960 7.271658
3      sd.Leaf 3.828328 3.888253 0.003503719 6.586173 1.998388 5.375745
rownames(tobacco.sd) <- c("Residuals", "Treatment", "Leaf")
tobacco.sd$name <- factor(c("Residuals", "Treatment", "Leaf"),
                          levels=c("Residuals", "Treatment", "Leaf"))
# express each component as a percentage of the total (x100 so the
# labels below really are percentages)
tobacco.sd$Perc <- 100*tobacco.sd$median/sum(tobacco.sd$median)
p2 <- ggplot(tobacco.sd, aes(y=name, x=median)) +
  geom_vline(xintercept=0, linetype="dashed") +
  geom_hline(xintercept=0) +
  scale_x_continuous("Finite population \nvariance components (sd)") +
  geom_errorbarh(aes(xmin=lower.1, xmax=upper.1), height=0, size=1) +
  geom_errorbarh(aes(xmin=lower, xmax=upper), height=0, size=1.5) +
  geom_point(size=3, shape=21, fill='white') +
  geom_text(aes(label=sprintf("(%4.1f%%)",Perc), vjust=-1)) +
  theme_classic() +
  theme(axis.title.y=element_blank(),
        axis.text.y=element_text(size=rel(1.2),hjust=1))
library(gridExtra)
grid.arrange(p1, p2, ncol=2)
tobacco.R2$name <- factor(c('Leaf','Treatment','Residuals','Conditional (Total)'),
                          levels=c('Conditional (Total)','Residuals','Treatment','Leaf'))
tobacco.R2 <- subset(tobacco.R2, name != 'Residuals')
p3 <- ggplot(tobacco.R2, aes(y=name, x=Median)) +
  geom_vline(xintercept=0, linetype="dashed") +
  geom_hline(xintercept=0) +
  #scale_x_continuous("Finite population \nvariance components (sd)")+
  #geom_errorbarh(aes(xmin=lower.1, xmax=upper.1), height=0, size=1)+
  geom_errorbarh(aes(xmin=lower, xmax=upper), height=0, size=1.5) +
  geom_point(size=3, shape=21, fill='white') +
  #geom_text(aes(label=sprintf("(%4.1f%%)",Perc),vjust=-1))+
  theme_classic() +
  theme(axis.title.y=element_blank(),
        axis.text.y=element_text(size=rel(1.2),hjust=1))
library(gridExtra)
grid.arrange(p1, p3, ncol=2)