Pitman–Yor process
In probability theory, a Pitman–Yor process,[1][2][3][4] denoted PY(d, θ, G0), is a stochastic process whose sample path is a probability distribution. A draw from this process is an infinite discrete probability distribution whose atoms are drawn from G0 and whose weights follow a two-parameter Poisson–Dirichlet distribution. The process is named after Jim Pitman and Marc Yor.
The process is governed by three parameters: a discount parameter 0 ≤ d < 1, a strength (or concentration) parameter θ > −d, and a base distribution G0 over a probability space X. When d = 0, the Pitman–Yor process reduces to the Dirichlet process with concentration parameter θ. The discount parameter gives the Pitman–Yor process more flexibility over tail behavior than the Dirichlet process, whose weights decay exponentially; for d > 0 the weights instead decay according to a power law. This makes the Pitman–Yor process useful for modeling data with power-law tails (e.g., word frequencies in natural language).
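As an illustration (not part of the original article), the weights of a Pitman–Yor process can be sampled by the stick-breaking construction: with V_k ~ Beta(1 − d, θ + kd) for k = 1, 2, …, the k-th weight is w_k = V_k ∏_{j&lt;k} (1 − V_j). The function name and truncation level below are for illustration only:

```python
import random

def py_stick_breaking(d, theta, num_weights, rng=None):
    """Sample the first `num_weights` stick-breaking weights of a
    Pitman-Yor process PY(d, theta).

    Each V_k ~ Beta(1 - d, theta + k*d), and the k-th weight is
    V_k times the length of the stick remaining after the first
    k-1 breaks.  For d = 0 this recovers the Dirichlet-process
    stick-breaking (GEM) weights.
    """
    assert 0 <= d < 1 and theta > -d
    rng = rng or random.Random(0)
    weights, remaining = [], 1.0
    for k in range(1, num_weights + 1):
        v = rng.betavariate(1.0 - d, theta + k * d)
        weights.append(remaining * v)   # w_k = V_k * prod_{j<k}(1 - V_j)
        remaining *= 1.0 - v            # shrink the remaining stick
    return weights
```

The truncated weights always sum to less than 1; the deficit is the mass assigned to the (infinitely many) remaining atoms.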
The exchangeable random partition induced by the Pitman–Yor process is an example of a Chinese restaurant process, a Poisson–Kingman partition, and a Gibbs-type random partition.
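This induced partition can be simulated by the two-parameter (Pitman–Yor) generalization of the Chinese restaurant process: customer n + 1 joins an occupied table k with probability (n_k − d)/(θ + n) and opens a new table with probability (θ + dK)/(θ + n), where n_k is the occupancy of table k and K is the current number of tables. A minimal sketch (function name is illustrative, not from the source):

```python
import random

def py_crp(d, theta, n, rng=None):
    """Seat n customers by the two-parameter Chinese restaurant
    process and return the table occupancy counts.

    With i customers seated, the next one joins occupied table k
    with probability (n_k - d)/(theta + i) and opens a new table
    with probability (theta + d*K)/(theta + i).  Note the masses
    sum correctly: sum_k (n_k - d) + theta + d*K = i + theta.
    """
    assert 0 <= d < 1 and theta > -d
    rng = rng or random.Random(0)
    counts = []                      # n_k for each occupied table
    for i in range(n):               # i customers already seated
        if not counts:
            counts.append(1)         # first customer opens the first table
            continue
        u = rng.uniform(0.0, theta + i)
        for k, nk in enumerate(counts):
            u -= nk - d              # mass of joining table k
            if u < 0:
                counts[k] += 1
                break
        else:
            counts.append(1)         # u fell in the new-table mass
    return counts
```

For d &gt; 0 the number of occupied tables grows like a power of n, in contrast to the logarithmic growth of the one-parameter (Dirichlet process) case.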
Naming conventions
The name "Pitman–Yor process" was coined by Ishwaran and James[5] after Pitman and Yor's review of the subject.[2] However, the process was originally studied in Perman et al.[6][7]
It is also sometimes referred to as the two-parameter Poisson–Dirichlet process, after the two-parameter generalization of the Poisson–Dirichlet distribution which describes the joint distribution of the sizes of the atoms in the random measure, sorted by strictly decreasing order.
See also
- Chinese restaurant process
- Dirichlet distribution
- Latent Dirichlet allocation
References
1. Ishwaran, H.; James, L. F. (2003). "Generalized weighted Chinese restaurant processes for species sampling mixture models". Statistica Sinica. 13: 1211–1235.
2. Pitman, Jim; Yor, Marc (1997). "The two-parameter Poisson–Dirichlet distribution derived from a stable subordinator". Annals of Probability. 25 (2): 855–900. CiteSeerX 10.1.1.69.1273. doi:10.1214/aop/1024404422. MR 1434129. Zbl 0880.60076.
3. Pitman, Jim (2006). Combinatorial Stochastic Processes. Vol. 1875. Berlin: Springer-Verlag. ISBN 9783540309901.
4. Teh, Yee Whye (2006). "A hierarchical Bayesian language model based on Pitman–Yor processes". Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics.
5. Ishwaran, H.; James, L. F. (2001). "Gibbs Sampling Methods for Stick-Breaking Priors". Journal of the American Statistical Association. 96 (453): 161–173. CiteSeerX 10.1.1.36.2559. doi:10.1198/016214501750332758.
6. Perman, M.; Pitman, J.; Yor, M. (1992). "Size-biased sampling of Poisson point processes and excursions". Probability Theory and Related Fields. 92: 21–39. doi:10.1007/BF01205234.
7. Perman, M. (1990). Random Discrete Distributions Derived from Subordinators (Thesis). Department of Statistics, University of California at Berkeley.