[Update: See this follow-up post.]

Here is an example of what I mean. First, a histogram of the data, consisting of 50 individual counts. (E.g., for each of 50 subjects, the number of times they checked their e-mail in the last hour.) The histogram has posterior-credible negative binomial distributions superimposed.

Here is the JAGS model statement:

model {
  # Likelihood:
  for ( i in 1:N ) {
    y[i] ~ dnegbin( p , r )
  }
  m <- r*(1-p)/p
  # Prior:
  r ~ dgamma( gSh , gRa )
  p ~ dbeta( 1.001 , 1.001 )
}
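As a sanity check on the mean formula m = r*(1-p)/p: JAGS's dnegbin(p, r) uses the same parameterization as SciPy's nbinom with n = r, so the formula can be verified numerically. A small Python sketch with arbitrary example values:

```python
from scipy import stats

# JAGS dnegbin(p, r) corresponds to scipy's nbinom(n=r, p=p);
# its mean is r*(1-p)/p, the quantity m computed in the model.
r, p = 3.5, 0.4  # arbitrary example values
mean_scipy = stats.nbinom.mean(r, p)
mean_formula = r * (1 - p) / p
print(mean_scipy, mean_formula)  # both 5.25
```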

Although the MCMC chain comes up with reasonable estimates in the long run, there is a lot of autocorrelation along the way:
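For readers who want to quantify it, lag-k autocorrelation is easy to compute directly from a chain. A minimal numpy sketch, using a synthetic AR(1) series as a stand-in for a sticky MCMC trace (illustrative only, not the actual chain from this model):

```python
import numpy as np

def acf(x, lag):
    """Sample lag-`lag` autocorrelation of a 1-D chain."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# Synthetic AR(1) series with coefficient 0.9, mimicking a sticky trace.
rng = np.random.default_rng(1)
chain = np.empty(50_000)
chain[0] = 0.0
for t in range(1, chain.size):
    chain[t] = 0.9 * chain[t - 1] + rng.normal()

print(round(acf(chain, 1), 2))  # near 0.9
```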

It is clear that the reason for the autocorrelation is the strong correlation of the parameters:
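That trade-off can be seen directly from the likelihood: for any fixed p, the best-fitting r is roughly the one that keeps the implied mean r*(1-p)/p near the sample mean, so r and p move together along a curved ridge. A Python sketch on simulated counts (simulated data, not the actual 50 counts):

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
y = rng.negative_binomial(n=4, p=0.4, size=50)  # simulated counts

def best_r(p):
    # r that maximizes the negative binomial likelihood at a fixed p
    nll = lambda r: -stats.nbinom.logpmf(y, r, p).sum()
    return minimize_scalar(nll, bounds=(1e-3, 1e3), method="bounded").x

for p in (0.3, 0.4, 0.5):
    r = best_r(p)
    print(f"p={p}: best r={r:.2f}, implied mean={r*(1-p)/p:.2f}")
# best r rises steeply with p while the implied mean barely moves:
# that curved ridge is the strong r-p correlation in the scatter plot
```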

As I said above, there must be some well-known way to address this issue. Can someone point me to it? Thanks.

I don't know a transformation that will let you use a Gibbs sampler, but this looks to me like a good case for a random-walk Metropolis sampler. Your scatter plots show that m and p are pretty much uncorrelated, so use them. First form the posterior kernel in terms of r and p, next substitute r -> m*p/(1-p) (the inverse of m = r*(1-p)/p), then multiply the result by p/(1-p) to account for the Jacobian of the transformation, and finally make joint draws of m and p. (I think that's right.) Now you can compute anything you want with those draws.
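This suggestion can be sketched in a few lines. The Python stand-in below uses simulated counts and assumed vague prior values gSh = gRa = 0.01 (the post does not give them): it forms the log posterior kernel in (m, p) via the substitution r = m*p/(1-p), which inverts m = r*(1-p)/p, adds the log Jacobian log(p/(1-p)), and makes joint random-walk Metropolis draws.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated counts standing in for the 50 observed e-mail counts
y = rng.negative_binomial(n=4, p=0.4, size=50)
gSh, gRa = 0.01, 0.01  # assumed vague gamma prior; not given in the post

def log_post(m, p):
    """Log posterior kernel in (m, p): substitute r = m*p/(1-p) into the
    (r, p) kernel and add log|dr/dm| = log(p/(1-p)) for the Jacobian."""
    if m <= 0.0 or not (0.0 < p < 1.0):
        return -np.inf
    r = m * p / (1.0 - p)
    ll = stats.nbinom.logpmf(y, r, p).sum()             # dnegbin(p, r) likelihood
    lp = stats.gamma.logpdf(r, a=gSh, scale=1.0 / gRa)  # r ~ dgamma(gSh, gRa)
    lp += stats.beta.logpdf(p, 1.001, 1.001)            # p ~ dbeta(1.001, 1.001)
    return ll + lp + np.log(p) - np.log1p(-p)           # + log Jacobian

# Joint random-walk Metropolis on (m, p)
m, p = y.mean(), 0.5
cur = log_post(m, p)
draws = []
for _ in range(20_000):
    m_new, p_new = m + rng.normal(0, 0.4), p + rng.normal(0, 0.04)
    cand = log_post(m_new, p_new)
    if np.log(rng.uniform()) < cand - cur:
        m, p, cur = m_new, p_new, cand
    draws.append((m, p))
post = np.array(draws[5_000:])  # drop burn-in
print(post[:, 0].mean())  # posterior mean of m, close to y.mean()
```

Because the proposal moves m and p jointly and they are nearly uncorrelated in the posterior, the chain should mix far better than a componentwise sampler stepping along the curved r-p ridge.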

Thanks very much for your suggestion, Mark. Alternatively, a simple reparameterization in JAGS also does the trick. See the follow-up post now linked at the top of this post. Thanks again.
