Taylor & Francis Group

The Bayes rule of the parameter in (0,1) under the power-log loss function with an application to the beta-binomial model

dataset
posted on 2017-06-27, 03:56 authored by Ying-Ying Zhang, Ming-Qin Zhou, Yu-Han Xie, Wen-He Song

We propose the power-log loss function, plotted in Figure 1, for the restricted parameter space (0, 1); it satisfies all six properties listed in Table 1 for a good loss function on (0, 1). In particular, the power-log loss function penalizes gross overestimation and gross underestimation equally, is convex in its argument, and attains its global minimum at the true unknown parameter. The power-log loss function on (0, 1) is an analog of the power-log loss function on (0, ∞), which is the popular Stein's loss function. We then calculate the Bayes rule (estimator) of the parameter in (0, 1) under the power-log loss function, the posterior expected power-log loss (PEPLL) at the Bayes estimator, and the integrated risk under the power-log loss (IRPLL) at the Bayes estimator, which is also the Bayes risk under the power-log loss (BRPLL). We also calculate the usual Bayes estimator under the squared error loss, which we prove to be larger than the Bayes estimator under the power-log loss. Next, we analytically calculate the Bayes estimators and the PEPLL at the Bayes estimators under a beta-binomial model. Finally, numerical simulations and a real data example of some monthly magazine exposure data illustrate our theoretical results on two size relationships, one among the Bayes estimators and one among the PEPLLs.
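The beta-binomial setting described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's derivation: it assumes a Beta(α, β) prior with a binomial likelihood, so the posterior is Beta(α + x, β + n − x), and it uses a Stein-loss-style rule 1/E[1/θ | x] as a hypothetical stand-in for the power-log Bayes estimator (whose exact form is derived in the paper). By Jensen's inequality any such rule is smaller than the posterior mean, consistent with the size relationship stated in the abstract.

```python
# Hedged sketch of Bayes estimators under a beta-binomial model.
# Assumption: the Stein-type rule below is only an illustrative stand-in
# for the paper's power-log Bayes estimator.

def posterior_params(alpha, beta, x, n):
    """Beta posterior parameters after observing x successes in n trials."""
    return alpha + x, beta + n - x

def squared_error_bayes(a, b):
    """Posterior mean: the Bayes estimator under squared error loss."""
    return a / (a + b)

def stein_type_bayes(a, b):
    """1 / E[1/theta] for theta ~ Beta(a, b); requires a > 1."""
    # For Beta(a, b): E[1/theta] = (a + b - 1) / (a - 1).
    return (a - 1) / (a + b - 1)

# Example: Beta(2, 3) prior, 7 successes in 20 trials.
a, b = posterior_params(alpha=2.0, beta=3.0, x=7, n=20)
mean_est = squared_error_bayes(a, b)    # 9 / 25 = 0.36
stein_est = stein_type_bayes(a, b)      # 8 / 24 ≈ 0.333
assert stein_est < mean_est  # Jensen: 1/E[1/theta] <= E[theta]
```

The final assertion mirrors the abstract's claim that the squared-error Bayes estimator exceeds the one obtained under a Stein-type (here, a hypothetical power-log-style) loss.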

Funding

This work was supported by the Fundamental Research Funds for the Central Universities [grant number CQDXWL-2012-004].
