On shrinkage estimation under divergence loss

Document Type: Original Article

Authors

1 Department of Statistics, Faculty of Mathematical Sciences, Ferdowsi University of Mashhad, Iran

2 Graduate School of Science, Osaka Prefecture University, Sakai, Japan

3 Department of Statistics, School of Mathematics, Statistics, and Computer Science, College of Science, University of Tehran, Tehran, Iran

Abstract

In this paper, superiority conditions are established, under divergence loss, for a general class of shrinkage estimators in the problem of estimating the normal mean. This extends the work of Ghosh and Mergel (2009).
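As a rough illustration of the setting (not taken from the paper), the sketch below compares, by Monte Carlo simulation, the divergence-loss risk of the sample mean with that of the classical James-Stein estimator for X ~ N_p(theta, I_p). It assumes the known-identity-covariance case, in which the divergence loss of Ghosh, Mergel and Datta (2008) reduces, up to normalization conventions, to (1 - exp(-beta(1-beta)||a - theta||^2 / 2)) / (beta(1-beta)); all function names are illustrative.

import numpy as np

def divergence_loss(theta, a, beta):
    """Divergence loss between N_p(theta, I) and N_p(a, I), 0 < beta < 1 (assumed form)."""
    d2 = np.sum((a - theta) ** 2)
    return (1.0 - np.exp(-beta * (1.0 - beta) * d2 / 2.0)) / (beta * (1.0 - beta))

def james_stein(x):
    """Classical James-Stein shrinkage estimator of the mean of N_p(theta, I), p >= 3."""
    p = x.shape[0]
    return (1.0 - (p - 2) / np.sum(x ** 2)) * x

def empirical_risk(theta, estimator, beta, n_rep=20000, seed=None):
    """Monte Carlo estimate of the divergence-loss risk of an estimator at theta."""
    rng = np.random.default_rng(seed)
    losses = np.empty(n_rep)
    for i in range(n_rep):
        x = theta + rng.standard_normal(theta.shape[0])
        losses[i] = divergence_loss(theta, estimator(x), beta)
    return losses.mean()

if __name__ == "__main__":
    p, beta = 10, 0.5
    theta = np.zeros(p)  # shrinkage gains are largest near the shrinkage target
    print("risk of X (sample mean):", empirical_risk(theta, lambda x: x, beta, seed=0))
    print("risk of James-Stein    :", empirical_risk(theta, james_stein, beta, seed=0))

For p >= 3 and theta near the origin, the James-Stein risk estimate comes out noticeably smaller; this is the Stein phenomenon that the paper studies, under divergence loss, for a general class of shrinkage estimators.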

Keywords

References
  1. Amari, S. (1982). Differential geometry of curved exponential families-curvatures and information loss. The Annals of Statistics, 10, 357–387.
  2. Arashi, M. and Tabatabaey, S. M. M. (2010). A note on classical Stein-type estimators in elliptically contoured models. Journal of Statistical Planning and Inference, 140, 1206–1213.
  3. Baranchik, A. J. (1970). A family of minimax estimators of the mean of a multivariate normal distribution. The Annals of Mathematical Statistics, 41, 642–645.
  4. Brandwein, A. C. and Strawderman, W. E. (1980). Minimax estimators of location parameters for spherically symmetric distributions with concave loss. The Annals of Statistics, 8, 279–284.
  5. Casella, G. (1990). Estimators with nondecreasing risk: Application of a chi-square identity. Statistics and Probability Letters, 10, 107–109.
  6. Cressie, N. and Read, T. R. C. (1984). Multinomial goodness-of-fit tests. Journal of the Royal Statistical Society Series B: Statistical Methodology, 46, 440–464.
  7. Efron, B. and Morris, C. (1973). Stein’s estimation rule and its competitors-An empirical Bayes approach. Journal of the American Statistical Association, 68, 117–130.
  8. Faith, R. E. (1978). Minimax Bayes estimators of a multivariate normal mean. Journal of Multivariate Analysis, 8, 372–379.
  9. Fourdrinier, D., Strawderman, W. E. and Wells, M. T. (2003). Robust shrinkage estimation for elliptically symmetric distributions with unknown covariance matrix. Journal of Multivariate Analysis, 85, 24–39.
  10. George, E. I. (1991). Shrinkage domination in a multivariate common mean problem. The Annals of Statistics, 19, 952–960.
  11. Ghosh, M., Mergel, V. and Datta, G. S. (2008). Estimation, prediction and the Stein phenomenon under divergence loss. Journal of Multivariate Analysis, 99, 1941–1961.
  12. Ghosh, M. and Mergel, V. (2009). On the Stein phenomenon under divergence loss and an unknown variance-covariance matrix. Journal of Multivariate Analysis, 100, 2331–2336.
  13. James, W. and Stein, C. (1961). Estimation with quadratic loss. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1, 361–379.
  14. Maruyama, Y. (2004). Stein’s idea and minimax admissible estimation of a multivariate normal mean. Journal of Multivariate Analysis, 88, 320–334.
  15. Shao, P. Y.-S. and Strawderman, W. E. (1994). Improving on the James-Stein positive-part estimator. The Annals of Statistics, 22, 1517–1538.
  16. Srivastava, M. S. and Kubokawa, T. (2005). Minimax multivariate empirical Bayes estimators under multicollinearity. Journal of Multivariate Analysis, 93, 394–416.
  17. Stein, C. (1956). Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. In Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 1, 197–206.
  18. Stein, C. (1981). Estimation of the mean of a multivariate normal distribution. The Annals of Statistics, 9, 1135–1151.
  19. Strawderman, W. E. (1971). Proper Bayes minimax estimators of the multivariate normal mean. The Annals of Mathematical Statistics, 42, 385–388.
  20. Wells, M. T. and Zhou, G. (2008). Generalized Bayes minimax estimators of the mean of multivariate normal distribution with unknown variance. Journal of Multivariate Analysis, 99, 2208–2220.