
Journal of Modern Mathematics and Statistics

ISSN (Print): 1994-5388

On Bayesian Estimation in Generalized Geometric Series Distribution

Khurshid Ahmad Mir
Page: 105-108 | Received 21 Sep 2022, Published online: 21 Sep 2022


Abstract

In this study, a Bayesian analysis of the Generalized Geometric Series Distribution (GGSD) under different types of loss functions has been carried out.


INTRODUCTION

The probability function of the Generalized Geometric Series Distribution (GGSD) was given by Mishra (1982), using results from lattice path analysis, as:

P(X = x) = [1/(1 + βx)] C(1 + βx, x) θ^x (1 − θ)^(1 + βx − x),  x = 0, 1, 2, …; 0 < θ < 1, θβ < 1   (1)

It can be seen that at β = 1, Eq. 1 reduces to the simple geometric distribution θ^x(1 − θ) and is a particular case of Jain and Consul (1971)’s generalized negative binomial distribution, in the same way as the geometric distribution is a particular case of the negative binomial distribution.
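As a quick numerical sanity check (a sketch, assuming the GGSD pmf has the form P(X = x) = [1/(1 + βx)] C(1 + βx, x) θ^x (1 − θ)^(1 + βx − x) with integer β), the β = 1 reduction to the geometric distribution and the normalization can be verified directly:

```python
from math import comb

def ggsd_pmf(x, theta, beta):
    # assumed GGSD pmf: 1/(1+βx) * C(1+βx, x) * θ^x * (1-θ)^(1+βx-x)
    m = 1 + beta * x
    return comb(m, x) / m * theta**x * (1 - theta)**(m - x)

# at beta = 1 the pmf collapses to the simple geometric distribution θ^x(1-θ)
for x in range(10):
    assert abs(ggsd_pmf(x, 0.3, 1) - 0.3**x * 0.7) < 1e-12

# the probabilities still sum to (essentially) 1 for beta = 2
total = sum(ggsd_pmf(x, 0.2, 2) for x in range(300))
assert abs(total - 1) < 1e-6
```

The truncation at 300 terms is safe here because the tail terms decay geometrically for θβ < 1.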

The various properties and estimation of Eq. 1 have been discussed by Mishra (1982) and Mishra and Singh (1992). Hassan et al. (2007) discussed the Bayesian analysis under non-informative and conjugate priors. In this study, the Bayesian analysis of the Generalized Geometric Series Distribution (GGSD) under different symmetric loss functions is studied.

Preliminary theory: Let x be a random variable whose distribution depends on r parameters θ1, θ2, …, θr and let Ω denote the parameter space of possible values of θ. For the general problem of estimating some specified real-valued function φ(θ) of the unknown parameters θ from the results of a random sample of n observations, we shall assume that φ(θ) is defined for all θ in Ω.

Let x1, x2, …, xn be the sample observations. Also, let φ̂ be an estimate of φ(θ) and let L(φ̂, φ(θ)) be the loss incurred by taking the value of φ(θ) to be φ̂. It should be noted that we restrict consideration here to loss functions which depend on θ through φ(θ) only. If Ψ(θ) is the prior density of θ, then according to Bayes’ theorem the posterior density of θ is π(θ|x) = l(θ|x)Ψ(θ)/k, where l(θ|x) is the likelihood function of θ given the sample x and:

k = ∫Ω l(θ|x)Ψ(θ) dθ

It follows that for a given x, the expected loss, i.e., the risk of the estimator φ̂, is:

R(φ̂) = ∫Ω L(φ̂, φ(θ)) π(θ|x) dθ   (2)

Assuming the integral in Eq. 2 exists and that sufficient regularity conditions prevail to permit differentiation under the integral sign, the optimum estimator of φ(θ) will be a solution of the equation:

∫Ω [∂L(φ̂, φ(θ))/∂φ̂] π(θ|x) dθ = 0   (3)

The validity of Eq. 3 and the desirability that it should lead to a unique solution necessarily impose restrictions on one’s choice of loss function and prior density of θ. The loss functions which have been considered here are as follows:

 

 

where δ is a small known quantity.

where δ1 and δ2 are two known quantities.
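As a generic illustration of the risk-minimization step in Eq. 2 and 3, the following sketch (with a hypothetical discrete posterior) shows that minimizing the posterior expected squared-error loss recovers the posterior mean:

```python
# Hypothetical discrete posterior over θ; squared-error loss L(d, θ) = (d - θ)^2.
thetas = [0.1, 0.2, 0.3, 0.4]
probs = [0.1, 0.4, 0.3, 0.2]   # posterior weights, sum to 1

def risk(d):
    # posterior expected loss (Eq. 2) for the estimate d
    return sum(p * (d - t) ** 2 for t, p in zip(thetas, probs))

post_mean = sum(p * t for t, p in zip(thetas, probs))
# grid search for the minimum-risk estimate; it lands on the posterior mean
best = min((i / 1000 for i in range(1001)), key=risk)
assert abs(best - post_mean) < 1e-3
```

For an absolute-error loss the same search would instead land on the posterior median, which is why the choice of loss function matters for the resulting estimator.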

MATERIALS AND METHODS
Bayesian estimation of parameter θ of GGSD under different priors:
The likelihood function from Eq. 1 is obtained as:

L(θ|x) = k θ^y (1 − θ)^(n + (β − 1)y)   (4)

where y = Σxi and k = Π [1/(1 + βxi)] C(1 + βxi, xi) does not depend on θ.

When β is known, the part of the likelihood function which is relevant to Bayesian inference on the unknown parameter θ is θ^y (1 − θ)^(n + (β − 1)y).
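The factorization in Eq. 4 can be checked numerically. The sketch below assumes the GGSD pmf form P(X = x) = [1/(1 + βx)] C(1 + βx, x) θ^x (1 − θ)^(1 + βx − x) and uses a hypothetical sample; the ratio of the full likelihood to the kernel θ^y (1 − θ)^(n + (β − 1)y) is the same at two different values of θ, confirming it is a constant k:

```python
from math import comb, prod

def ggsd_pmf(x, theta, beta):
    # assumed GGSD pmf form (Eq. 1)
    m = 1 + beta * x
    return comb(m, x) / m * theta**x * (1 - theta)**(m - x)

# hypothetical sample; y = sum of the observations
sample, beta = [0, 2, 1, 3, 0], 2
n, y = len(sample), sum(sample)

def kernel(theta):
    # the θ-dependent part of the likelihood in Eq. 4
    return theta**y * (1 - theta)**(n + (beta - 1) * y)

# likelihood / kernel is a constant k, free of θ
k1 = prod(ggsd_pmf(x, 0.25, beta) for x in sample) / kernel(0.25)
k2 = prod(ggsd_pmf(x, 0.40, beta) for x in sample) / kernel(0.40)
assert abs(k1 - k2) < 1e-9
```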

Bayesian estimation of parameter θ of GGSD under non-informative prior: We assume the prior of θ to be:

g(θ) ∝ 1/[θ(1 − θ)], 0 < θ < 1   (5)

The posterior distribution of θ from Eq. 4 and 5 is:

π(θ|x) = θ^(y − 1) (1 − θ)^(n + (β − 1)y − 1) / B(y, n + (β − 1)y),  y = Σxi   (6)

The Bayes estimator of the parametric function φ(θ) under the squared error loss function is the posterior mean, which is given as:

φ̂ = ∫₀¹ φ(θ) π(θ|x) dθ

If we take φ(θ) = θ, the Bayes estimate of θ is given by:

θ̂ = y/(n + βy) = Σxi/(n + βΣxi)   (7)

This coincides with the moment and ML estimates of θ.
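Assuming the posterior under the non-informative prior is Beta(y, n + (β − 1)y) with y = Σxi, the closed form of Eq. 7 can be checked against direct numerical integration; the sample summary n = 10, y = 7, β = 2 is hypothetical:

```python
# hypothetical sample summary: n observations with y = sum of x_i
n, y, beta = 10, 7, 2
a_post, b_post = y, n + (beta - 1) * y     # posterior Beta(7, 17)

est = y / (n + beta * y)                   # closed form of Eq. 7

# numerical posterior mean: ratio of Riemann sums for
# ∫ θ·θ^(a-1)(1-θ)^(b-1) dθ and ∫ θ^(a-1)(1-θ)^(b-1) dθ
N = 100_000
num = den = 0.0
for i in range(1, N):
    t = i / N
    w = t ** (a_post - 1) * (1 - t) ** (b_post - 1)
    num += t * w
    den += w
assert abs(num / den - est) < 1e-6
```

Here 7/(10 + 2·7) = 7/24, the mean of a Beta(7, 17) distribution, which is exactly the coincidence with the ML estimate noted above.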

Bayesian estimation of parameter θ of GGSD under beta prior: A more general Bayes estimator of θ can be obtained by assuming a beta distribution as the prior information on θ. Thus:

g(θ) = θ^(a − 1) (1 − θ)^(b − 1) / B(a, b), 0 < θ < 1; a, b > 0   (8)

The posterior distribution of θ is then:

π(θ|x) = θ^(y + a − 1) (1 − θ)^(n + (β − 1)y + b − 1) / B(y + a, n + (β − 1)y + b)   (9)

The Bayes estimator of the parametric function φ(θ) under the squared error loss function is the posterior mean and is given as:

φ̂ = E[φ(θ)|x] = ∫₀¹ φ(θ) π(θ|x) dθ   (10)

If we take φ(θ) = θ, the Bayes estimator of θ is given as:

θ̂ = (y + a)/(n + βy + a + b)   (11)
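Assuming Eq. 11 has the form θ̂ = (y + a)/(n + βy + a + b) with y = Σxi, a short sketch shows how the prior parameters behave like pseudo-counts (the sample summary n = 10, y = 7, β = 2 is hypothetical):

```python
# assumed form of Eq. 11: Bayes estimate under a Beta(a, b) prior
def bayes_est(n, y, beta, a, b):
    return (y + a) / (n + beta * y + a + b)

n, y, beta = 10, 7, 2                      # hypothetical sample summary
# a = b = 0 recovers the non-informative-prior / ML estimate of Eq. 7
assert bayes_est(n, y, beta, 0, 0) == y / (n + beta * y)
# the prior acts like pseudo-counts: larger a pulls the estimate up,
# larger b pulls it down
assert bayes_est(n, y, beta, 3, 1) > bayes_est(n, y, beta, 0, 0)
assert bayes_est(n, y, beta, 0, 5) < bayes_est(n, y, beta, 0, 0)
```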

If a = b = 0, Eq. 11 coincides with Eq. 7. We can consider the more general loss function:

(12)

where c is a positive constant and a and b are known quantities. Under this loss function, the Bayes’ estimator is given by:

Where:


(13)

and:


(14)

Using Eq. 11, 13 and 14, the Bayes’ estimator under the loss function Eq. 12 is given by:

(15)

 

Substituting a = 0 and b = 1, the loss function Eq. 12 becomes the loss function L1 and the Bayes’ estimator under the loss function L1 using Eq. 15 is:

 

 

 

which is the mean of the posterior distribution.

Substituting a = -2 and b = 1, the loss function Eq. 12 becomes the loss function L2 and the Bayes’ estimator under the loss function L2 using Eq. 15 is (Table 1):

 


 

 

Table 1: The seven different loss functions and the respective Bayes’ estimators of θ under these loss functions
where δ1 and δ2 are two known quantities

 

 

Substituting a = 0 and b = 1.2, the loss function Eq. 12 becomes the loss function L3 and the Bayes’ estimator under the loss function L3 using Eq. 15 is:

 

 

 

 

Substituting a = 0 and b = 1.2, the loss function Eq. 12 becomes the loss function L4 and the Bayes’ estimator under the loss function L4 using Eq. 15 is:

 

 

 

 

The Bayes’ estimator under the zero-one type of loss function L5 is the mode of the posterior distribution:

 

 

 

 

The Bayes’ estimator for the special zero-one type of loss function L6 is:

 

 

 

RESULTS AND DISCUSSION

When δ2 = -δ1, the corresponding Bayes’ estimators are identical. It has also been noted that at β = 1, the above estimators become the Bayesian estimators of the parameter of the simple geometric distribution under the loss functions L1, L2, L3, L4, L5 and L6, respectively.
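The β = 1 reduction can be illustrated with a small simulation (a sketch; the seed, sample size and θ = 0.3 are arbitrary choices): geometric data θ^x(1 − θ) are drawn by inverse transform, and the estimate y/(n + y), i.e., Eq. 7 with β = 1, recovers θ:

```python
import math
import random

random.seed(0)
theta = 0.3
# inverse-transform sampling for the geometric pmf θ^x (1-θ), x = 0, 1, 2, ...
xs = [int(math.log(1 - random.random()) / math.log(theta)) for _ in range(20_000)]
n, y = len(xs), sum(xs)
est = y / (n + y)      # Eq. 7 with beta = 1
assert abs(est - theta) < 0.02
```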

How to cite this article:

Khurshid Ahmad Mir. On Bayesian Estimation in Generalized Geometric Series Distribution.
DOI: https://doi.org/10.36478/jmmstat.2010.105.108
URL: https://www.makhillpublications.co/view-article/1994-5388/jmmstat.2010.105.108