The famous probabilist and statistician Persi Diaconis wrote an article not too long ago about the "Markov chain Monte Carlo (MCMC) Revolution." The paper describes how a diverse set of problems can be solved with MCMC. The first example he gives is a text decryption problem solved with a simple Metropolis-Hastings sampler.
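The mechanics are simple enough to sketch. Roughly: score a candidate substitution key by how English-like the decrypted text looks, propose a small random change to the key, and accept the change with probability min(1, ratio of plausibilities). Here's a minimal Python sketch of that idea, assuming a hypothetical `log_bigram` dictionary of letter-pair log-probabilities; it's not Diaconis's actual code, just the flavor of it.

```python
import math
import random
import string

def plausibility(key, ciphertext, log_bigram):
    """Log-plausibility of the text decrypted with `key` under a bigram model."""
    decrypted = ciphertext.translate(str.maketrans(string.ascii_lowercase, key))
    # Unseen letter pairs get a small penalty score (an arbitrary choice here).
    return sum(log_bigram.get(pair, -10.0)
               for pair in zip(decrypted, decrypted[1:]))

def metropolis_step(key, ciphertext, log_bigram):
    """Propose swapping two letters in the key; accept per the Metropolis rule."""
    i, j = random.sample(range(26), 2)
    proposal = list(key)
    proposal[i], proposal[j] = proposal[j], proposal[i]
    proposal = "".join(proposal)
    log_ratio = (plausibility(proposal, ciphertext, log_bigram)
                 - plausibility(key, ciphertext, log_bigram))
    # Accept with probability min(1, exp(log_ratio)).
    if log_ratio >= 0 or random.random() < math.exp(log_ratio):
        return proposal
    return key
```

Run that step a few thousand times and the chain tends to wander toward keys whose decryptions read like English, which is what makes the newspaper cryptograms crackable.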
I was always stumped by those cryptograms in the newspaper and thought it would be pretty cool if I could crack them with statistics.
After signing a huge deal with the Angels, Pujols has been having a really bad year. He hasn't hit a home run yet, snapping a career-long streak. So I thought it would be a good idea to use some statistics to estimate how good or bad we should expect Pujols to actually be this year.
Coming into the year, he had a career .328/.420/.617 AVG/OBP/SLG line. Through one month, he has a .
I guess you could call this On Bayes Percentage. *cough*
Fresh off learning Bayesian techniques in one of my classes last quarter, I thought it would be fun to try to apply the method. I was able to find some examples of Hierarchical Bayes being used to analyze baseball data at Wharton.

Setting up the problem
On-base percentage (OBP) is probably the most important basic offensive statistic in baseball.
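OBP counts hits, walks, and hit-by-pitches per plate appearance, roughly (H + BB + HBP) / (AB + BB + HBP + SF), so each plate appearance can be treated as a Bernoulli "got on base or didn't" trial. As a rough sketch of the kind of model involved (a single conjugate Beta-Binomial update, not the full hierarchical setup), here's what that looks like in Python. The prior pseudo-counts and the one-month numbers below are made up purely for illustration; only the .420 career OBP comes from the numbers above.

```python
from scipy import stats

# Prior centered near Pujols's career OBP of .420. The pseudo-count total
# (alpha + beta = 500, roughly a season of plate appearances) is an arbitrary
# choice controlling how strongly we trust the career number.
alpha_prior, beta_prior = 0.420 * 500, 0.580 * 500

# Hypothetical one-month numbers, purely for illustration.
on_base, plate_appearances = 30, 110

# Conjugacy: Beta prior + Binomial data gives a Beta posterior with the
# successes added to alpha and the failures added to beta.
posterior = stats.beta(alpha_prior + on_base,
                       beta_prior + (plate_appearances - on_base))

print("Posterior mean OBP:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```

The appeal of the Bayesian setup is exactly this shrinkage: a bad month pulls the estimate down a little, but the career numbers keep it from overreacting to 100-odd plate appearances.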