Description
Dear all,
I am working with genomic selection in sugarcane. To get familiar with the BGLR package, I took the data available in Spindel et al. (2015) and I am fitting several Bayesian models (Bayesian gBLUP, Bayes A, Bayes B, Bayes C, Ridge Regression and the Bayesian LASSO). For Bayes A, I use the following function:
# Randomly assign each observation to one of 5 cross-validation folds
subs <- sample(rep(1:5, length.out = length(y)))
ref <- data.frame(i = 1:length(y), sg = subs)

# Fit Bayes A with the phenotypes of fold i masked, and return the predictions
BayesA <- function(i, my.nIter, my.burnIn, my.thin){
  trn.y <- y
  trn.y[ref$sg == i] <- NA
  BGLR(y = trn.y, ETA = list(list(X = genot, model = 'BayesA')),
       nIter = my.nIter, burnIn = my.burnIn, thin = my.thin,
       verbose = FALSE)$yHat
}
The first time, I ran:
# Run the 5 folds in parallel (mclapply is from the parallel package)
system.time(result <- data.frame(mclapply(1:5, function(i) BayesA(i, my.nIter=10000, my.burnIn=0, my.thin=1),
                                          mc.preschedule=FALSE, mc.set.seed=TRUE, mc.cores=ncores)))
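For the diagnostics below, varE, varU and mu are the posterior samples that BGLR writes to its monitor files. This is only a sketch of how I read them back in; the file names assume the default saveAt prefix and a term that reports a varU (e.g. an RKHS/gBLUP term), so they need to be adjusted to the actual output of each model:

library(coda)   # for as.mcmc() and autocorr() used below

# Assumed file names: BGLR writes 'mu.dat' and 'varE.dat' at the saveAt prefix;
# the per-term file (here 'ETA_1_varU.dat') depends on the model of that term.
mu   <- read.table('mu.dat')          # intercept samples  -> column V1
varE <- read.table('varE.dat')        # residual variance  -> column V1
varU <- read.table('ETA_1_varU.dat')  # genetic variance (RKHS/gBLUP term)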
# Combine the sampled parameters into a single mcmc object (coda package)
mcmc <- as.mcmc(data.frame(varE = varE$V1, varU = varU$V1, mu = mu$V1))
# Autocorrelation of each parameter at lags 0, 10, ..., 1000
ac <- autocorr(mcmc, lags = seq(0, 1000, 10))
acor <- data.frame(seq(0, 1000, 10),
                   stack(data.frame(varE = ac[,1,1], varU = ac[,2,2], mu = ac[,3,3])))
names(acor) <- c('interval', 'autocorrelation', 'parameter')
# Plot the autocorrelation profiles
ggplot(acor, aes(x = interval, y = autocorrelation, colour = parameter, group = parameter)) +
  geom_line(size = 0.75) +
  geom_point(size = 1, pch = 21, fill = 'grey30', alpha = 0.3) +
  theme_bw()
The second time, I used:
# Longer, thinned chain: 1,001,000 iterations, 1,000 burn-in, keep every 50th sample
system.time(result <- data.frame(mclapply(1:5, function(i) BayesA(i, my.nIter=1001000, my.burnIn=1000, my.thin=50),
                                          mc.preschedule=FALSE, mc.set.seed=TRUE, mc.cores=ncores)))
From these results, I see that even after 1,001,000 iterations there is still autocorrelation in the Markov chain. What would you suggest to solve this problem? How can I reach at least 1,000 effective samples for each parameter?
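(As a reference for what I mean by effective samples, here is a minimal sketch of how I would check the effective sample size with coda, using the mcmc object built above:)

library(coda)

# Effective sample size per parameter; I am aiming for at least ~1,000
ess <- effectiveSize(mcmc)
print(ess)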
Best regards
Renato



