Show That the Gamma Distribution Belongs to the Exponential Family

Content

  1. Special class of Gamma distributions and relationships of Gamma distribution
  2. Gamma distribution exponential family
  3. Relationship between gamma and normal distribution
  4. Poisson gamma distribution | poisson gamma distribution negative binomial
  5. Weibull gamma distribution
  6. Application of gamma distribution in real life | gamma distribution uses | application of gamma distribution in statistics
  7. Beta gamma distribution | relationship between gamma and beta distribution
  8. Bivariate gamma distribution
  9. Double gamma distribution
  10. Relation between gamma and exponential distribution | exponential and gamma distribution | gamma exponential distribution
  11. Fit gamma distribution
  12. Shifted gamma distribution
  13. Truncated gamma distribution
  14. Survival function of gamma distribution
  15. MLE of gamma distribution | maximum likelihood gamma distribution | likelihood function of gamma distribution
  16. Gamma distribution parameter estimation method of moments | method of moments estimator gamma distribution
  17. Confidence interval for gamma distribution
  18. Gamma distribution conjugate prior for exponential distribution | gamma prior distribution | posterior distribution poisson gamma
  19. Gamma distribution quantile function
  20. Generalized gamma distribution
  21. Beta generalized gamma distribution

Special class of Gamma distributions and relationships of Gamma distribution

  In this article we will discuss the special forms of gamma distributions and the relationships of the gamma distribution with different continuous and discrete random variables; some estimation methods used in sampling a population with the gamma distribution are also briefly discussed.

Topics on Gamma Distribution

Gamma distribution exponential family

  The gamma distribution belongs to the exponential family: it is a two-parameter exponential family, a large and widely applicable family of distributions, as most real-life problems can be modelled with it and the standard calculations within the exponential family can be done easily. In the two-parameter case we take the probability density function as

\frac{e^{-x/\lambda }x^{\alpha -1}}{\lambda ^{\alpha }\Gamma (\alpha )}I_{x> 0}

if we restrict to a known value of α (alpha), this two-parameter family will reduce to a one-parameter exponential family

f(x|\lambda )=e^{-x/\lambda -\alpha \log \lambda }\frac{x^{\alpha -1}}{\Gamma (\alpha )}I_{x> 0}

and for known λ (lambda)

f(x|\alpha )=e^{(\alpha -1)\log x-\alpha \log \lambda -\log \Gamma (\alpha )}e^{-x/\lambda }I_{x> 0}
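As a quick numerical check (a sketch assuming scipy is available; the variable names are illustrative), the two-parameter gamma density above can be reassembled from its exponential-family pieces, natural statistics (log x, x), natural parameters (α − 1, −1/λ), and log-partition α log λ + log Γ(α), and compared against scipy's gamma pdf:

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import gamma

# Gamma pdf f(x) = x^(alpha-1) e^(-x/lam) / (lam^alpha * Gamma(alpha))
alpha, lam = 3.0, 2.0
x = np.linspace(0.1, 10.0, 50)

# Exponential-family form exp(eta . T(x) - A) with T(x) = (log x, x),
# eta = (alpha - 1, -1/lam), log-partition A = alpha*log(lam) + log Gamma(alpha)
log_pdf_expfam = (alpha - 1) * np.log(x) - x / lam - alpha * np.log(lam) - gammaln(alpha)

log_pdf_scipy = gamma.logpdf(x, a=alpha, scale=lam)
print(np.allclose(log_pdf_expfam, log_pdf_scipy))  # True
```

The two log-densities agree point by point, confirming the factorization used above.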

Relationship between gamma and normal distribution

  In the probability density function of the gamma distribution, if we take alpha nearer to 50 we will get the nature of the density function as


As we increase the shape parameter of the gamma distribution, the density curve increasingly resembles the normal curve; if the shape parameter alpha tends to infinity the gamma distribution becomes more symmetric and normal-like, but since the support of the gamma distribution is the semi-infinite interval [0, ∞) while the normal distribution is supported on the whole real line, even a nearly symmetric gamma distribution is not the same as a normal distribution.
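This convergence can be seen numerically (a sketch assuming scipy; the shape values chosen are illustrative): compare Gamma(α, θ) with the normal distribution of matching mean αθ and variance αθ², and watch the largest pdf discrepancy shrink as α grows.

```python
import numpy as np
from scipy.stats import gamma, norm

# Gamma(alpha, scale=theta) vs Normal(alpha*theta, alpha*theta^2):
# the maximum pdf gap shrinks as the shape parameter alpha grows.
theta = 1.0
gaps = []
for alpha in (5, 50, 500):
    g = gamma(a=alpha, scale=theta)
    n = norm(loc=alpha * theta, scale=np.sqrt(alpha) * theta)
    x = np.linspace(g.ppf(0.001), g.ppf(0.999), 400)
    gaps.append(np.max(np.abs(g.pdf(x) - n.pdf(x))))

print(gaps)  # a decreasing sequence
```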

poisson gamma distribution | poisson gamma distribution negative binomial

   The Poisson and binomial distributions are discrete: their random variables deal with discrete values, specifically success and failure in the form of Bernoulli trials, which give random success or failure as the only outcomes. The mixture of the Poisson and gamma distributions, also known as the negative binomial distribution, is the outcome of repeated Bernoulli trials, and it can be parameterized in different ways: if the r-th success occurs after x trials, then it can be parameterized as

P(X_{1}=x|p,r)=\binom{x-1}{r-1}p^{r}(1-p)^{x-r}

and if we count the number of failures before the r-th success, then it can be parameterized as

P(X_{2}=x|p,r)=\binom{x+r-1}{x}p^{r}(1-p)^{x}

and considering the values of r and p

r=\frac{\mu^{2}}{\sigma ^{2}-\mu}

p=\frac{r}{r+\mu}

the general form of the parameterization for the negative binomial or poisson gamma distribution is

P(X=x)=\binom{x+r-1}{x}p^{r}(1-p)^{x} \ \ x=0,1,2,…

and alternative one is

P(X=x)=\binom{x+r-1}{x} \left ( \frac{\alpha }{\alpha +1} \right )^{r} \left ( \frac{1}{\alpha +1} \right )^{x} \ \ x=0,1,2,…

this binomial distribution is known as negative because of the coefficient

\binom{x+r-1}{x} =\frac{(x+r-1)(x+r-2)\cdots r}{x!} \ = (-1)^{x}\frac{(-r-(x-1))(-r-(x-2))\cdots (-r)}{x!} \ = (-1)^{x}\frac{(-r)(-r-1)\cdots (-r-(x-1))}{x!} \ =(-1)^{x}\binom{-r}{x}

and this negative binomial or Poisson gamma distribution is well defined, as the total probability for this distribution is one (writing q = 1 − p):

1=p^{r}p^{-r} \ =p^{r}(1-q)^{-r} \ =p^{r} \sum_{x=0}^{\infty}\binom{-r}{x}(-q)^{x} \ =p^{r} \sum_{x=0}^{\infty} (-1)^{x} \binom{-r}{x}q^{x} \ =\sum_{x=0}^{\infty} \binom{x+r-1}{x}p^{r}q^{x}

The mean and variance for this negative binomial or Poisson gamma distribution are

E(X)=\frac{r(1-p)}{p}

var(X)=\frac{r(1-p)}{p^{2}}

the Poisson and gamma relation we can get by the following calculation

P(X=x)=\frac{1}{\Gamma (\alpha) \beta ^{\alpha }}\int_{0}^{\infty}\frac{e^{-\lambda }\lambda ^{x}}{x!}\lambda ^{\alpha -1}e^{-\lambda /\beta } d\lambda

=\frac{1}{x!\Gamma (\alpha)\beta ^{\alpha }}\int_{0}^{\infty}\lambda ^{\alpha +x-1}e^{-\lambda (1+1/\beta )}d\lambda

=\frac{1}{\Gamma (x+1)\Gamma (\alpha )\beta ^{\alpha }} \Gamma (\alpha +x)\left ( \frac{\beta }{\beta +1} \right )^{\alpha +x}

=\binom{\alpha +x-1}{x}\left ( \frac{1}{\beta +1} \right )^{\alpha } \left ( 1-\frac{1}{\beta +1} \right )^{x}

Thus the negative binomial is the mixture of the Poisson and gamma distributions, and this distribution is used in modelling day-to-day problems where we require a mixture of discrete and continuous.
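The mixture calculation above can be verified numerically (a sketch assuming scipy; parameter values are illustrative): integrating the Poisson pmf against a gamma density over λ should reproduce the negative binomial pmf with r = α and p = 1/(β + 1).

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma, nbinom, poisson

# Mix Poisson(lam) over lam ~ Gamma(shape=alpha, scale=beta): the marginal
# pmf equals the negative binomial with r = alpha and p = 1/(beta + 1).
alpha, beta = 2.5, 1.5
r, p = alpha, 1.0 / (beta + 1.0)

mixture, exact = [], []
for k in range(6):
    val, _ = quad(lambda lam: poisson.pmf(k, lam) * gamma.pdf(lam, a=alpha, scale=beta),
                  0, np.inf)
    mixture.append(val)
    exact.append(nbinom.pmf(k, r, p))

print(np.allclose(mixture, exact))  # True
```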


Weibull gamma distribution

   There are generalizations of the exponential distribution which involve the Weibull as well as the gamma distribution; the Weibull distribution has the probability density function

f(x) = \begin{cases} 0 & x \leq v \\ \frac{\beta }{\alpha }\left ( \frac{x-v}{\alpha } \right )^{\beta -1} \exp \left \{ -\left ( \frac{x-v}{\alpha } \right )^{\beta } \right \} & x > v \end{cases}

and cumulative distribution function as

F(x) = \begin{cases} 0 & x \leq v \\ 1- \exp \left \{ -\left ( \frac{x-v}{\alpha } \right )^{\beta } \right \} & x > v \end{cases}

whereas the pdf and cdf of the gamma distribution have already been discussed above. The main connection between the Weibull and gamma distributions is that both are generalizations of the exponential distribution; the difference between them is that when the power of the variable is greater than one the Weibull distribution gives a quick result, while for less than one the gamma gives a quick result.
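The three-parameter Weibull pdf above can be checked against a library implementation (a sketch assuming scipy; the parameter values are illustrative): scipy's `weibull_min` takes the shape as `c`, the location v as `loc`, and the scale α as `scale`.

```python
import numpy as np
from scipy.stats import weibull_min

# Three-parameter Weibull from the pdf above: shape beta, scale alpha,
# location v; in scipy this is weibull_min(c=beta, loc=v, scale=alpha).
beta_w, alpha_w, v = 1.7, 2.0, 0.5
x = np.linspace(v + 0.01, v + 10.0, 100)

pdf_formula = (beta_w / alpha_w) * ((x - v) / alpha_w) ** (beta_w - 1) \
    * np.exp(-((x - v) / alpha_w) ** beta_w)
pdf_scipy = weibull_min.pdf(x, c=beta_w, loc=v, scale=alpha_w)
print(np.allclose(pdf_formula, pdf_scipy))  # True
```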

     We will not discuss here the generalized Weibull gamma distribution, which requires a separate discussion.

application of gamma distribution in real life | gamma distribution uses | application of gamma distribution in statistics

  There are a number of applications where the gamma distribution is used to model a situation, such as aggregating insurance claims, rainfall amount accumulation, the manufacturing and distribution of any product, the load on a specific web server, and telecom exchanges, etc. Actually, the gamma distribution gives the wait-time prediction till the nth event, so there are a number of applications of the gamma distribution in real life.

beta gamma distribution | relationship between gamma and beta distribution

    The beta distribution is the random variable with the probability density function

f(x) = \begin{cases} \frac{1}{B(a,b)}x^{a-1}(1-x)^{b-1} & 0< x < 1 \\ 0 & otherwise \end{cases}

where

B(a,b)= \int_{0}^{1}x^{a-1}(1-x)^{b-1} dx

which has the relationship with the gamma function as

B(a,b)= \frac{\Gamma (a)\Gamma (b)}{\Gamma (a+b)}

and the beta distribution is related to the gamma distribution as follows: if X is a gamma random variable with shape parameter α and scale one, and Y is an independent gamma random variable with shape parameter β and scale one, then the random variable X/(X+Y) is beta distributed.

In other words, if X is Gamma(α, 1) and Y is Gamma(β, 1) then the random variable X/(X+Y) is Beta(α, β)

and also, if X_n is a Beta(k, n) random variable, then as n grows the scaled variable nX_n converges in distribution to a Gamma(k, 1) random variable:

\mathbf{nX_{n} \overset{d}{\rightarrow} \Gamma (k,1) \ \ as \ \ n \to \infty}
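The gamma-to-beta relationship can be checked by simulation (a sketch assuming numpy; the shape values and sample size are illustrative): draw independent Gamma(a, 1) and Gamma(b, 1) samples and compare the moments of X/(X+Y) with the Beta(a, b) mean and variance.

```python
import numpy as np

# If X ~ Gamma(a, 1) and Y ~ Gamma(b, 1) are independent, then
# X/(X+Y) ~ Beta(a, b); check mean and variance by Monte Carlo.
rng = np.random.default_rng(0)
a, b, n = 2.0, 5.0, 200_000
X = rng.gamma(shape=a, scale=1.0, size=n)
Y = rng.gamma(shape=b, scale=1.0, size=n)
Z = X / (X + Y)

mean_theory = a / (a + b)                          # Beta(a, b) mean
var_theory = a * b / ((a + b) ** 2 * (a + b + 1))  # Beta(a, b) variance
print(Z.mean(), mean_theory)
print(Z.var(), var_theory)
```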

bivariate gamma distribution

     A two-dimensional or bivariate random variable is continuous if there exists a function f(x,y) such that the joint distribution function is

F(x,y)=\int_{-\infty}^{x}\left [ \int_{-\infty}^{y}f(u,v) dv \right ]du

where

F(+\infty,+\infty)=\lim_{x \to +\infty, y \to +\infty } \int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)dvdu

= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(u,v)dvdu =1

and the joint probability density function is obtained by

\frac{\partial^2 F(x,y)}{\partial x \partial y }= f(x,y)

there are a number of bivariate gamma distributions; one of them is the bivariate gamma distribution with probability density function

f(x,y)=\frac{\beta ^{\alpha +\gamma }}{\Gamma (\alpha )\Gamma (\gamma )}x^{\alpha -1}(y-x)^{\gamma -1}e^{-\beta y}, \ \ 0< x< y< \infty ; \ \alpha ,\beta ,\gamma > 0

double gamma distribution

  The double gamma distribution is one of the bivariate distributions, with gamma random variables having parameters alpha and one, and joint probability density function

f_{Y_{1}Y_{2}}(y_{1},y_{2})=\frac{1}{\Gamma (\alpha _{1})\Gamma (\alpha _{2})}y_{1}^{\alpha_{1} -1}y_{2}^{\alpha_{2} -1} \exp (-y_{1} -y_{2}), \ \ y_{1}> 0, y_{2}> 0

this density forms the double gamma distribution with the respective random variables, and the moment generating function for the double gamma distribution is

\mathbf{M_{Y_{1}Y_{2}}(t,s)=\left ( \frac{1}{1-t} \right )^{\alpha _{1}} \left (\frac{1}{1-s} \right )^{\alpha _{2}} }

relation between gamma and exponential distribution | exponential and gamma distribution | gamma exponential distribution

   Since the exponential distribution is the distribution with the probability density function

f(x) = \begin{cases} \lambda e^{-\lambda x} & \ if \ \ x\geq 0 \\ 0 & \ if \ \ x< 0 \end{cases}

and the gamma distribution has the probability density function

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} & x\geq 0 \\ 0 & x < 0 \end{cases}

clearly, if we put the value of alpha as one we will get the exponential distribution; that is, the gamma distribution is nothing but a generalization of the exponential distribution, which predicts the wait time till the occurrence of the next nth event, while the exponential distribution predicts the wait time till the occurrence of the next event.
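This reduction is easy to verify (a sketch assuming scipy; note scipy parameterizes both distributions with scale = 1/λ): with shape α = 1 the gamma pdf coincides with the exponential pdf.

```python
import numpy as np
from scipy.stats import expon, gamma

# With shape alpha = 1 the gamma pdf reduces to lambda*exp(-lambda*x),
# i.e. the exponential pdf (scipy uses scale = 1/lambda for both).
lam = 0.8
x = np.linspace(0.0, 10.0, 100)

pdf_gamma = gamma.pdf(x, a=1, scale=1 / lam)
pdf_expon = expon.pdf(x, scale=1 / lam)
print(np.allclose(pdf_gamma, pdf_expon))  # True
```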

fit gamma distribution

   Fitting given data with a gamma distribution implies finding the probability density function, which involves shape, location, and scale parameters; finding these parameters for the application at hand and then calculating the mean, variance, standard deviation, and moment generating function constitutes the fitting of the gamma distribution. Since different real-life problems can be modelled with a gamma distribution, the data for each situation must be fit to the gamma distribution; for this purpose various techniques in various environments already exist, e.g. in R, Matlab, Excel, etc.
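As one concrete illustration (a sketch assuming scipy; the simulated data and true parameters are illustrative), scipy's `gamma.fit` estimates shape, location, and scale by maximum likelihood; pinning `floc=0` leaves only shape and scale to estimate.

```python
import numpy as np
from scipy.stats import gamma

# Maximum-likelihood fit of a gamma distribution with scipy;
# floc=0 pins the location so only shape and scale are estimated.
rng = np.random.default_rng(42)
data = rng.gamma(shape=2.0, scale=3.0, size=5000)

shape_hat, loc_hat, scale_hat = gamma.fit(data, floc=0)
print(shape_hat, scale_hat)  # close to the true 2.0 and 3.0

# summary statistics of the fitted model: mean and variance
print(shape_hat * scale_hat, shape_hat * scale_hat ** 2)
```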

shifted gamma distribution

     Whenever there is an application-driven need to shift the two-parameter gamma distribution, a new generalized three-parameter (or other generalized) gamma distribution shifts the shape, location, and scale; such a gamma distribution is known as a shifted gamma distribution.

truncated gamma distribution

     If we restrict the range or domain of the gamma distribution for the shape, scale, and location parameters, the restricted gamma distribution is known as a truncated gamma distribution, based on the conditions.

survival function of gamma distribution

     The survival function for the gamma distribution is defined as the function S(x) as follows

S(x)=1-\frac{\Gamma_{x} (\gamma )}{\Gamma (\gamma )} \ \ x\geq 0 ; \gamma > 0 \ where \ \ \Gamma_{x}(a) =\int_{0}^{x} t^{a-1}e^{-t} dt
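This survival function can be evaluated directly (a sketch assuming scipy; the shape value is illustrative): scipy's `gammainc` is exactly the regularized lower incomplete gamma function Γ_x(γ)/Γ(γ), so 1 − gammainc matches `gamma.sf`.

```python
import numpy as np
from scipy.special import gammainc
from scipy.stats import gamma

# S(x) = 1 - Gamma_x(g)/Gamma(g); scipy's gammainc is the regularized
# lower incomplete gamma function Gamma_x(g)/Gamma(g).
g = 2.5
x = np.linspace(0.1, 15.0, 50)

surv_formula = 1 - gammainc(g, x)
surv_scipy = gamma.sf(x, a=g)
print(np.allclose(surv_formula, surv_scipy))  # True
```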

mle of gamma distribution | maximum likelihood gamma distribution | likelihood function of gamma distribution

We know that maximum likelihood takes a sample from the population as representative, and uses this sample as an estimator for the probability density function, maximizing over the parameters of the density function. Before going to the gamma distribution, recall some basics: for the random variable X, the probability density function with theta as parameter has likelihood function

L(\theta ; x_{1},x_{2},\ldots,x_{n}) =f_{\theta }(x_{1}, x_{2},\ldots,x_{n} ),

this we can express as

L(\theta ; x_{1},x_{2},\ldots,x_{n}) =\prod_{i=1}^{n}f_{\theta} (x_{i})

and the method of maximizing this likelihood function is to find the estimate \hat{\theta} for which

L(\hat{\theta} ; x_{1},x_{2},\ldots,x_{n}) =\sup_{\theta \in \Theta } L(\theta ; x_{1},x_{2},\ldots,x_{n})

if such a theta satisfies this equation; and as log is a monotone function, we can write this in terms of log

\log L(\hat{\theta} ; x_{1},x_{2},\ldots,x_{n}) =\sup_{\theta \in \Theta } \log L(\theta ; x_{1},x_{2},\ldots,x_{n})

and such a supremum exists if

\frac{\partial \log L(\hat{\theta }; x_{1},\ldots,x_{n}) }{\partial \theta_{j} }=0, \ \ j=1,2,\ldots,k, \ \ \theta =(\theta _{1}, \ldots,\theta _{k})

now we apply maximum likelihood to the gamma distribution function as

f(x | \alpha ,\beta )=\prod_{i=1}^{n}f(x_{i} | \alpha ,\beta )=\left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} \right )^{n}\prod_{i=1}^{n}x_{i}^{\alpha -1} exp(-\beta x_{i}) \propto \beta ^{n\alpha } exp\left ( -\beta \sum_{i=1}^{n}x_{i} \right )

the log-likelihood of the function will be

l (\beta | \alpha ,x) \propto n\alpha \log \beta -\beta n \bar{x} \propto \alpha \log \beta - \bar{x} \beta

so

0=\frac{\partial l}{\partial \beta } =\frac{\alpha }{\beta } -\bar{x},

and hence

\hat{\beta }= \frac{\alpha }{\bar{x}}

This can also be achieved as

\textbf{L}(\alpha ,\beta | x)=\left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x_{1}^{\alpha -1} e^{-\beta x_{1}} \right )\cdots \left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x_{n}^{\alpha -1} e^{-\beta x_{n}} \right ) =\left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} \right)^{n} (x_{1} x_{2}\cdots x_{n})^{\alpha -1} e^{-\beta (x_{1}+x_{2}+\cdots +x_{n})}

and by taking the logarithm,

\ln \textbf{L}(\alpha ,\beta | x)=n(\alpha \ln \beta -\ln \Gamma (\alpha ))+(\alpha -1)\sum_{i=1}^{n} \ln x_{i} -\beta \sum_{i=1}^{n}x_{i}

and the parameters can be obtained by differentiating

\frac{\partial }{\partial \alpha }\ln \textbf{L}(\hat{\alpha }, \hat{\beta } |x)=n\left(\ln \hat{\beta }-\frac{\mathrm{d} }{\mathrm{d} \alpha } \ln \Gamma (\hat{\alpha })\right)+\sum_{i=1}^{n} \ln x_{i}=0

\frac{\partial }{\partial \beta }\ln \textbf{L}(\hat{\alpha }, \hat{\beta } |x)=n \frac{\hat{\alpha }}{\hat{\beta }} -\sum_{i=1}^{n}x_{i}=0 \ \ or \ \ \bar{x}=\frac{\hat{\alpha }}{\hat{\beta }}

n\left(\ln \hat{\alpha } -\ln \bar{x} -\frac{\mathrm{d} }{\mathrm{d} \alpha } \ln \Gamma (\hat{\alpha }) \right)+\sum_{i=1}^{n} \ln x_{i}=0
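The last score equation has no closed-form solution in α̂, but it is one-dimensional and can be solved numerically (a sketch assuming scipy and the rate parameterization above; the simulated data are illustrative): rearranged, it reads ln α − ψ(α) = ln x̄ − mean(ln x), with ψ the digamma function.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import digamma

# Solve ln(alpha) - psi(alpha) = ln(xbar) - mean(ln x) for alpha-hat,
# then beta-hat = alpha-hat / xbar (rate parameterization).
rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=1 / 2.0, size=10_000)  # true alpha=3, rate beta=2

xbar, logbar = x.mean(), np.log(x).mean()

def score(a):
    return np.log(a) - digamma(a) - (np.log(xbar) - logbar)

alpha_hat = brentq(score, 1e-3, 100.0)  # root-find on a bracketing interval
beta_hat = alpha_hat / xbar
print(alpha_hat, beta_hat)  # close to 3.0 and 2.0
```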

gamma distribution parameter estimation method of moments | method of moments estimator gamma distribution

   We can calculate the moments of the population and of the sample with the help of the expectation of nth order, respectively; the method of moments equates these moments of the distribution and the sample to estimate the parameters. Suppose we have a sample of a gamma random variable with the probability density function

f(x|\alpha ,\lambda )=\frac{\lambda ^{\alpha }}{\Gamma (\alpha )}x^{\alpha -1}e^{-\lambda x} , \ \ x\geq 0

we know the first two moments for this probability density function are

\mu _{1}=\frac{\alpha }{\lambda } \ \ \ \mu _{2}=\frac{\alpha (\alpha +1) }{\lambda ^{2}}

so

{\lambda } =\frac{\alpha}{\mu _{1}}

we will get from the second moment, if we substitute lambda,

\frac{\mu _{2}}{\mu _{1}^{2}}=\frac{\alpha +1}{\alpha }

and from this the value of alpha is

\alpha =\frac{\mu _{1}^{2}}{\mu _{2}-\mu _{1}^{2}}

and now lambda will be

\lambda =\frac{\mu _{1}^{2}}{\mu _{2}-\mu _{1}^{2}} \cdot \frac{1}{\mu _{1}} \ \ \ \ \ =\frac{\mu _{1}}{\mu _{2}-\mu _{1}^{2}}

and the moment estimator using the sample will be

\hat{\lambda }=\frac{\bar{X}}{\hat{\sigma }^{2}}
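The two moment estimators are simple functions of the sample mean and variance, so they are easy to compute directly (a sketch assuming numpy; the simulated data and true parameters are illustrative):

```python
import numpy as np

# Method-of-moments estimates in the rate parameterization:
# alpha-hat = xbar^2 / s^2 and lambda-hat = xbar / s^2.
rng = np.random.default_rng(7)
x = rng.gamma(shape=4.0, scale=1 / 1.5, size=20_000)  # true alpha=4, rate lambda=1.5

xbar, s2 = x.mean(), x.var()
alpha_hat = xbar ** 2 / s2
lambda_hat = xbar / s2
print(alpha_hat, lambda_hat)  # close to 4.0 and 1.5
```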

confidence interval for gamma distribution

   A confidence interval for the gamma distribution is a way to estimate a parameter and its uncertainty: it tells at what percentage the interval is expected to contain the true value of the parameter. This confidence interval is obtained from the observations of random variables; since it is obtained from random observations, it is itself random. To get a confidence interval for the gamma distribution there are different techniques in different applications that we have to follow.

gamma distribution conjugate prior for exponential distribution | gamma prior distribution | posterior distribution poisson gamma

     The posterior and prior distributions are terminologies of Bayesian probability theory; a prior is conjugate to a likelihood if the resulting posterior belongs to the same family as the prior. In terms of theta, let us show that the gamma distribution is the conjugate prior for the exponential distribution.

if the probability density function of the gamma distribution in terms of theta is

f_{\Theta }(\theta )=\frac{\beta ^{\alpha }\theta ^{\alpha -1}e^{-\beta \theta }}{\Gamma (\alpha )}

and we assume the distribution for the given data, conditional on theta, is exponential,

f_{X_{i}|\Theta }(x_{i}|\theta )=\theta e^{-\theta x_{i}}

so the joint distribution will be

f(X|\Theta )=\theta^{n} e^{-\theta \sum x_{i}}

and using the relation

\textbf{Posterior} \propto \textbf{Likelihood} \ \times \ \textbf{Prior}

we have

f_{\Theta |X}(\theta |x) \propto \theta ^{n}e^{-\theta \sum x_{i}} \times \theta ^{\alpha -1}e^{-\beta \theta }

=\theta ^{n +\alpha -1} e^{-\theta (\sum x_{i} + \beta )}

\therefore \theta| X \sim \textbf{Gamma}(n+\alpha , \sum x_{i} +\beta )

and similarly, for a Poisson likelihood with a gamma prior on λ, the posterior is

f_{\Lambda | X} (\lambda |x) \propto \lambda ^{\sum x_{i}+\alpha -1} e^{-(n+\beta )\lambda }

so the gamma distribution is a conjugate prior for the exponential distribution, as the posterior is again a gamma distribution.
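The conjugate update derived above is just arithmetic on the hyperparameters, which a short script makes concrete (a sketch assuming numpy; the prior values and simulated data are illustrative):

```python
import numpy as np

# Conjugate update: Gamma(alpha, beta) prior on the exponential rate theta;
# after n observations the posterior is Gamma(alpha + n, beta + sum(x)).
rng = np.random.default_rng(3)
alpha_prior, beta_prior = 2.0, 1.0
theta_true = 0.5
x = rng.exponential(scale=1 / theta_true, size=1000)

alpha_post = alpha_prior + len(x)
beta_post = beta_prior + x.sum()
posterior_mean = alpha_post / beta_post
print(posterior_mean)  # concentrates near the true rate 0.5
```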

gamma distribution quantile function

   The quantile function of the gamma distribution is the function that gives the points of the gamma distribution which relate to the rank order of the values in the gamma distribution; it requires the cumulative distribution function, and different languages provide different algorithms and functions for the quantile of the gamma distribution.
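In Python, for instance (a sketch assuming scipy; the parameter values are illustrative), the quantile function is `gamma.ppf`, the inverse of `gamma.cdf`:

```python
import numpy as np
from scipy.stats import gamma

# gamma.ppf is the quantile function: it inverts the cdf.
a, scale = 2.0, 3.0
q = np.array([0.05, 0.25, 0.5, 0.75, 0.95])

x_q = gamma.ppf(q, a=a, scale=scale)
print(np.allclose(gamma.cdf(x_q, a=a, scale=scale), q))  # True
```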

generalized gamma distribution

    As the gamma distribution itself is a generalization of the exponential family of distributions, adding more parameters to this distribution gives us the generalized gamma distribution, which is a further generalization of this distribution family; different physical requirements give different generalizations, and a frequent one uses the probability density function

f(x)=\frac{(\frac{x-\mu }{\beta })^{\gamma -1} exp (-\frac{x-\mu }{\beta })}{\beta \Gamma (\gamma )} \ \ x\geq \mu ;\gamma ,\beta > 0

the cumulative distribution function of such a generalized gamma distribution can be obtained by

F(x)=\frac{\Gamma _{x}(\gamma )}{\Gamma (\gamma )} \ \ x\geq 0, \gamma > 0

where the numerator represents the incomplete gamma function as

\Gamma _{x}(a)=\int_{0}^{x}t^{a-1}e^{-t}dt

using this incomplete gamma function, the survival function for the generalized gamma distribution can be obtained as

S(x)=1-\frac{\Gamma _{x}(\gamma )}{\Gamma (\gamma )} \ \ x\geq 0, \gamma > 0

another version of this three-parameter generalized gamma distribution has the probability density function

f(t)=\frac{\beta }{\Gamma (k)\theta } \left ( \frac{t}{\theta } \right )^{k\beta -1} e^{-\left ( \frac{t}{\theta } \right )^{\beta }}

where k, β, θ are the parameters, all greater than zero; this generalization has convergence problems, and to overcome them the Weibull parameters are replaced by

\mu =\ln (\theta )+\frac{1}{\beta } \ln \left ( \frac{1}{\lambda ^{2}} \right ) \ \ \ \sigma =\frac{1}{\beta \sqrt{k}} \ \ \ \lambda =\frac{1}{\sqrt{k}} \ \ \ where \ \ -\infty< \mu < \infty , \ \sigma > 0 , \ 0< \lambda

using this parameterization the convergence of the density function is obtained, so the more general form of the gamma distribution with convergence is the distribution with probability density function

f(t) = \begin{cases}\frac{|\lambda |}{\sigma \cdot t}\cdot \frac{1}{\Gamma \left ( \frac{1}{\lambda ^{2}} \right )}\cdot \exp \left [ \frac{\lambda \cdot \frac{\ln (t)-\mu }{\sigma }+\ln \left ( \frac{1}{\lambda ^{2}} \right )-e^{\lambda \cdot \frac{\ln (t)-\mu }{\sigma }} }{\lambda ^{2}} \right ] &\text{if } \lambda \neq 0\\ \frac{1}{t \cdot \sigma \sqrt{2 \pi }}e^{-\frac{1}{2}\left ( \frac{\ln (t)-\mu }{\sigma } \right )^{2}} &\text{if } \lambda =0\end{cases}
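The three-parameter form f(t) given earlier in this section can be checked against a library implementation (a sketch assuming scipy; the parameter values are illustrative): scipy's `gengamma` corresponds to that density with `a = k`, `c = β`, and `scale = θ`.

```python
import numpy as np
from scipy.special import gamma as gamma_fn
from scipy.stats import gengamma

# f(t) = beta/(Gamma(k)*theta) * (t/theta)^(k*beta - 1) * exp(-(t/theta)^beta)
# corresponds to scipy's gengamma with a=k, c=beta, scale=theta.
k, beta_g, theta = 1.5, 2.0, 3.0
t = np.linspace(0.1, 10.0, 100)

pdf_formula = (beta_g / (gamma_fn(k) * theta)) * (t / theta) ** (k * beta_g - 1) \
    * np.exp(-((t / theta) ** beta_g))
pdf_scipy = gengamma.pdf(t, a=k, c=beta_g, scale=theta)
print(np.allclose(pdf_formula, pdf_scipy))  # True
```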

Beta generalized gamma distribution

   The gamma distribution involves the parameter beta in its density function, because of which the gamma distribution is sometimes known as the beta generalized gamma distribution, with the density function

g_{\beta ,\gamma ,c}(x)=\frac{c\lambda ^{c\beta }}{\Gamma (\beta )}x^{c\beta -1}\exp \left \{ -(\lambda x)^{c} \right \}, \ \ x> 0

with cumulative distribution function

G_{\beta ,\gamma ,c}(x)=\frac{\gamma (\beta ,(\lambda x)^{c})}{\Gamma (\beta )},

which was already discussed in detail in the discussion of the gamma distribution; further, the beta generalized gamma distribution is defined with the cdf

F(x)=I_{G}(x)(a,b)=\frac{1}{B(a,b)}\int_{0}^{G(x)}\omega ^{a-1}(1-\omega )^{b-1}d\omega ,

where B(a,b) is the beta function, and the probability density function for this can be obtained by differentiation; the density function will be

f(x)=\frac{g(x)}{B(a,b)}G(x)^{a-1}\left \{ 1-G(x) \right \}^{b-1}

here G(x) is the above-defined cumulative distribution function of the gamma distribution; if we substitute this value, then the cumulative distribution function of the beta generalized gamma distribution is

F(x)=I_{\gamma (\beta ,(\lambda x)^{c})/\Gamma (\beta )}(a,b)=\frac{1}{B(a,b)}\int_{0}^{{\gamma (\beta ,(\lambda x)^{c})/\Gamma (\beta )}}\omega ^{a-1} (1-\omega )^{b-1} d\omega

and the probability density function

f(x)=\frac{c\lambda ^{c\beta }x^{c\beta -1}\exp \left \{ -(\lambda x)^{c} \right \}\gamma (\beta ,(\lambda x)^{c})^{a-1}\left \{ \Gamma (\beta )-\gamma (\beta ,(\lambda x)^{c}) \right \}^{b-1}}{B(a,b)\Gamma (\beta )^{a+b-1}}

the remaining properties can be extended to this beta generalized gamma distribution with the usual definitions.

Conclusion:

There are different forms and generalizations of the gamma distribution and of the gamma distribution exponential family for real-life situations; such forms and generalizations were covered here, in addition to the estimation methods of the gamma distribution in population sampling of data. If you require further reading on the gamma distribution exponential family, please go through the link and books below.

https://en.wikipedia.org/wiki/Gamma_distribution

A First Course in Probability by Sheldon Ross

Schaum's Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH


Source: https://lambdageeks.com/gamma-distribution-exponential-family/
