where $M_{GIG}$ denotes the moment generating function of the Generalized Inverse
Gaussian distribution, which serves as the mixing distribution in this variance-mean mixture subclass.
Powers of the MGF, $M_{GH}(u)^p$, admit the representation in \eqref{appendixI:eq:ghypmom}
only for $p=1$, so the GH family is in general not closed under convolution.
The exception is the NIG subclass, and then only when the convolved distributions
share the same shape parameter $\alpha$ and skew parameter $\beta$. The MGF of the NIG is,
$$
{M_{NIG(\alpha ,\beta ,\delta ,\mu )}}(u) = {e^{\mu u}}\frac{{{e^{\delta \sqrt {{\alpha ^2} - {\beta ^2}} }}}}{{{e^{\delta \sqrt {{\alpha ^2} - {{(\beta + u)}^2}} }}}}.
$$
Because $\delta$ and $\mu$ enter the exponents linearly, the product of NIG MGFs with common $\alpha$ and $\beta$ is again an NIG MGF, and the scale and location parameters simply add under convolution:
$$
NIG(\alpha ,\beta ,{\delta _1},{\mu _1}) * \cdots * NIG(\alpha ,\beta ,{\delta _n},{\mu _n}) = NIG(\alpha ,\beta ,{\delta _1} + \cdots + {\delta _n},{\mu _1} + \cdots + {\mu _n}).
$$
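The closure property can also be checked by simulation: a sum of independent NIG draws with common $\alpha$ and $\beta$ should follow the single NIG with summed $\delta$ and $\mu$. A minimal sketch, again assuming SciPy's parametrisation and illustrative parameter values, compares the two via the Kolmogorov--Smirnov distance:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, beta = 2.0, 0.5                    # common shape and skew (illustrative)
d1, m1, d2, m2 = 1.0, -0.3, 1.5, 0.8      # individual scale/location parameters

def nig_rvs(delta, mu, size):
    # scipy parametrisation: a = alpha*delta, b = beta*delta, loc = mu, scale = delta
    return stats.norminvgauss.rvs(alpha * delta, beta * delta,
                                  loc=mu, scale=delta, size=size, random_state=rng)

n = 20000
x = nig_rvs(d1, m1, n) + nig_rvs(d2, m2, n)   # convolution by sampling

# Target law: NIG(alpha, beta, d1 + d2, m1 + m2)
d, m = d1 + d2, m1 + m2
ks = stats.kstest(x, lambda t: stats.norminvgauss.cdf(t, alpha * d, beta * d,
                                                      loc=m, scale=d))
print(ks.statistic)   # small KS distance indicates the same distribution
```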
For completeness, the characteristic function of the GH distribution follows from \eqref{appendixI:eq:ghypmom} by replacing $u$ with $\text{i}u$,
$$
{\phi _{GH(\lambda ,\alpha ,\beta ,\delta ,\mu )}}(u) = {e^{\mu \text{i}u}}{\left( {\frac{{{\alpha ^2} - {\beta ^2}}}{{{\alpha ^2} - {{(\beta + {\text{i}}u)}^2}}}} \right)^{\lambda /2}}\frac{{{K_\lambda }\left( {\delta \sqrt {{\alpha ^2} - {{(\beta + {\text{i}}u)}^2}} } \right)}}{{{K_\lambda }\left( {\delta \sqrt {{\alpha ^2} - {\beta ^2}} } \right)}}.
$$
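This expression can be evaluated directly with `scipy.special.kv`, which accepts complex arguments. The sketch below (illustrative parameters) checks that $\phi(0)=1$ and that for $\lambda = -1/2$, where $K_{-1/2}(z) = \sqrt{\pi/(2z)}\,e^{-z}$, the expression collapses to the NIG characteristic function $M_{NIG}(\text{i}u)$:

```python
import numpy as np
from scipy.special import kv

def gh_cf(u, lam, alpha, beta, delta, mu):
    # Characteristic function of GH(lam, alpha, beta, delta, mu) as displayed above.
    z0 = np.sqrt(alpha**2 - beta**2 + 0j)
    zu = np.sqrt(alpha**2 - (beta + 1j * u)**2)
    return (np.exp(1j * mu * u)
            * (z0**2 / zu**2) ** (lam / 2)
            * kv(lam, delta * zu) / kv(lam, delta * z0))

lam, alpha, beta, delta, mu = -0.5, 2.0, 0.5, 1.2, 0.3   # illustrative values
u = 0.7

# For lambda = -1/2 the GH is the NIG, so the CF must equal M_NIG(iu).
z0 = np.sqrt(alpha**2 - beta**2 + 0j)
zu = np.sqrt(alpha**2 - (beta + 1j * u)**2)
nig_cf = np.exp(1j * mu * u) * np.exp(delta * z0) / np.exp(delta * zu)

print(gh_cf(0.0, lam, alpha, beta, delta, mu))            # equals 1
print(abs(gh_cf(u, lam, alpha, beta, delta, mu) - nig_cf))  # near machine precision
```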