A novel weighted likelihood estimation with empirical Bayes flavor
Keywords:
Consistency, data-dependent prior, empirical Bayes, exponentiated distribution, location parameter, maximum likelihood estimator, super-efficiency, unbounded likelihood

Abstract
We propose a novel approach to estimation, where each individual observation in a random sample is used to derive an estimator of an unknown parameter using the maximum likelihood principle. These individual estimators are then combined as a weighted average to produce the final estimator. The weights are chosen to be proportional to the likelihood function evaluated at the estimators based on each observation. The method can be related to a Bayesian approach in which the prior distribution is data-driven. In the case of estimating a location parameter of a unimodal density, the prior distribution is the empirical distribution of the sample, and it converges to the true distribution that generated the data as the sample size increases.
We provide several examples illustrating the new method, argue for its consistency, and conduct simulation studies to assess the performance of the estimators. It turns out that this straightforward methodology produces consistent estimators, which appear comparable to those obtained by the maximum likelihood method.
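To make the construction concrete, the following is a minimal sketch, not the authors' code, assuming a normal location model in which the single-observation maximum likelihood estimate of the location is the observation itself; the function name and the choice of model are purely illustrative. Each observation supplies a candidate estimate, and the candidates are averaged with weights proportional to the full-sample likelihood evaluated at each candidate.

```python
import numpy as np
from scipy import stats


def weighted_likelihood_estimate(x, logpdf=stats.norm.logpdf):
    """Illustrative weighted likelihood estimator of a location parameter.

    Assumes a symmetric unimodal density with mode at zero (normal by
    default), so the single-observation MLE of the location is the
    observation itself. The final estimate is a weighted average of these
    candidates, with weights proportional to the likelihood of the whole
    sample evaluated at each candidate location.
    """
    x = np.asarray(x, dtype=float)
    candidates = x  # single-observation MLEs of the location parameter
    # Log-likelihood of the full sample at each candidate location.
    loglik = np.array([logpdf(x, loc=theta).sum() for theta in candidates])
    # Normalize on the log scale for numerical stability before exponentiating.
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    return float(np.sum(w * candidates))


# Hypothetical usage: estimate the mean of a normal sample.
rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=1.0, size=50)
print(weighted_likelihood_estimate(sample))
```

In this normal-location example the weighted average lands close to the sample mean, which is also the maximum likelihood estimator, consistent with the comparability claimed in the abstract.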