Title

Data Increase and Statistical Inferences

Parallel Title

資料增加和統計推論

Author

凌嘉華(Chiahua Ling)

Keywords

Bayesian method ; prior distribution ; conjugate distribution ; 貝氏方法 ; 先驗分配 ; 共軛分配

Journal Title

台北海洋技術學院學報

Volume/Issue (Publication Date)

Vol. 8, No. 2 (2017/06/01)

Pages

102 - 114

Language

English

英文摘要 (English Abstract)

In this study we follow the Bayesian approach with one difference. In the traditional Bayesian method one assumes a prior distribution, possibly a conjugate one, centered at hyperparameters. Our data-increase approach is intended to help with small samples, or cases where only a few observations are available. We use prior distributions centered at the given observations (treating them as hyperparameters) to generate a larger artificial dataset, which may be termed a second-generation dataset. This larger second-generation dataset is then used to draw statistical inferences. The method depends on computational resources and may be useful in applied problems.
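The abstract's data-increase idea can be sketched in a few lines: center one prior per observed value and pool the draws into a second-generation dataset. The normal prior family, the common scale (the sample standard deviation), and the number of draws per observation below are illustrative assumptions; the abstract leaves these choices open (e.g., a conjugate family could be used instead).

```python
import numpy as np

rng = np.random.default_rng(0)

def second_generation(sample, draws_per_obs=200, scale=None):
    """Expand a small sample by drawing from one prior per observation.

    Each observation is treated as a hyperparameter: a normal prior is
    centered at that value (an assumed, illustrative choice of family),
    and all draws are pooled into a larger artificial dataset.
    """
    sample = np.asarray(sample, dtype=float)
    if scale is None:
        # Assumed common spread for the priors: the sample std. deviation.
        scale = sample.std(ddof=1)
    # loc broadcasts so row i holds draws_per_obs draws centered at sample[i].
    expanded = rng.normal(loc=sample[:, None], scale=scale,
                          size=(sample.size, draws_per_obs))
    return expanded.ravel()

small = [4.2, 5.1, 3.8, 4.9]       # a very small original sample
big = second_generation(small)     # second-generation dataset
print(big.size)                    # 4 observations * 200 draws = 800
print(big.mean())                  # inference now uses the larger set
```

Any downstream inference (means, intervals, tests) is then computed on `big` rather than on the four original values; the mean of the expanded set stays close to the original sample mean by construction.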

中文摘要 (Chinese Abstract)

本研究我們使用和貝氏技術些微不同的方法,傳統的貝氏方法是假設有先驗分配(可能是共軛分配)和以超參數為中心。我們用小樣本中非常少的觀察值作資料增加擴大,用以觀察值(假設它們是超參數)為中心的先驗分配來產生一個較大的資料集(稱作第二產生資料集),這個第二產生資料集可以用來做統計推論,這方法需要用到電腦資源而且是對應用問題有幫助的。

Subject Classification: Humanities > General Humanities
Engineering > General Engineering
Social Sciences > General Social Sciences
References
  1. (1985). SAS User's Guide: Basics. Cary, North Carolina: SAS Institute Inc.
  2. (1985). SAS User's Guide: Statistics. Cary, North Carolina: SAS Institute Inc.
  3. Andrews, D. F., Pregibon, D. (1978). Finding the outliers that matter. J. R. Statist. Soc., 40(1), 85-93.
  4. Casella, G., Berger, R. L. (2001). Statistical Inference. Brooks/Cole Cengage Learning.
  5. Casella, G., George, E. (1992). Explaining the Gibbs sampler. The American Statistician, 46, 167-174.
  6. Clayton, D. (2003). Conditional likelihood inference under complex ascertainment using data augmentation. Biometrika, 90, 976-981.
  7. David, Herbert A., Nagaraja, Haikady N. (2003). Order Statistics. John Wiley & Sons.
  8. Dempster, A. P., Laird, N. M., Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm (with discussion). Journal of the Royal Statistical Society, Series B, 39(1), 1-38.
  9. Efron, B. (1982). The jackknife, the bootstrap, and other resampling plans. Philadelphia, PA: Society for Industrial and Applied Mathematics.
  10. Faraway, J. J. (1992). On the Cost of Data Analysis. Journal of Computational and Graphical Statistics, 1, 213-229.
  11. Hastie, T., Tibshirani, R., Friedman, J. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York: Springer-Verlag.
  12. Kamakura, W. A., Wedel, M., Rosa, F. D., Mazzon, J. A. (2003). Cross Selling through Database Marketing: A Mixed Data Factor Analyzer for Data Augmentation and Prediction. International Journal of Research in Marketing, 20(1), 45-65.
  13. Kosuke, I., David, A. V. D. (2005). A Bayesian Analysis of the Multinomial Probit Model Using Marginal Data Augmentation. Journal of Econometrics, 124(2), 311-344.
  14. Liu, Jun S., Wu, Ying Nian (1999). Parameter Expansion for Data Augmentation. Journal of the American Statistical Association, 94(448), 1264-1274.
  15. Pelosi, M. K., Sandifer, T. M. (2002). Doing Statistics For Business: Data, Inference, and Decision Making. New York: John Wiley & Sons, Inc.
  16. Tanner, M. A. (1991). Tools for Statistical Inference: Observed Data and Data Augmentation Methods. New York: Springer-Verlag.
  17. Tanner, M. A., Wong, W. H. (1987). The Calculation of Posterior Distributions by Data Augmentation. Journal of the American Statistical Association, 82, 528-540.
  18. Van Dyk, D. A., Meng, X.-L. (2001). The art of data augmentation. J. Comput. Graph. Stat., 10, 1-111.
  19. Wei, G. C. G., Tanner, M. A. (1990). A Monte Carlo implementation of the EM algorithm and the poor man's data augmentation algorithm. Journal of the American Statistical Association, 85, 699-704.