
Exact sampling of determinantal point processes without eigendecomposition

Published online by Cambridge University Press: 23 November 2020

Claire Launay* (Université de Paris)
Bruno Galerne** (Université d’Orléans)
Agnès Desolneux*** (CNRS and ENS Paris-Saclay)

*Postal address: Laboratoire MAP5, Université de Paris, CNRS, Paris, 75006, France. Email: claire.launay.math@gmail.com
**Postal address: Institut Denis Poisson, Université d’Orléans, Université de Tours, CNRS, Orléans, 45100, France.
***Postal address: Centre Borelli, CNRS, ENS Paris Saclay, Gif-sur-Yvette, 91190, France.

Abstract

Determinantal point processes (DPPs) enable the modeling of repulsion: they provide diverse sets of points. The repulsion is encoded in a kernel K that can be seen, in a discrete setting, as a matrix storing the similarity between points. The main exact algorithm to sample DPPs uses the spectral decomposition of K, a computation that becomes costly for a large number of points. Here we present an alternative exact algorithm for sampling DPPs in discrete spaces that avoids computing eigenvalues and eigenvectors. The method is innovative, and numerical experiments show results competitive with those of the initial algorithm.
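To illustrate the general idea of sampling a discrete DPP without an eigendecomposition, the sketch below implements a generic sequential elimination scheme: each item is kept with its conditional inclusion probability, read off the current diagonal pivot, and the kernel is then updated by a Schur complement. This is a minimal sketch of the elimination-based family of samplers, not the paper's exact pseudocode; the function name `sample_dpp_sequential` is ours, and the code assumes a real symmetric kernel K with eigenvalues in [0, 1] whose pivots stay away from zero.

```python
import numpy as np

def sample_dpp_sequential(K, rng):
    """Sequential DPP sampling via pivoted elimination (no eigendecomposition).

    Assumes K is a real symmetric kernel matrix with eigenvalues in [0, 1]
    and that the pivots encountered are nonzero (generic kernels).
    """
    A = np.array(K, dtype=float)  # work on a copy; K is left untouched
    n = A.shape[0]
    sample = []
    for j in range(n):
        # The pivot A[j, j] is the probability that item j belongs to the
        # sample, conditioned on the decisions already made for items < j.
        if rng.random() < A[j, j]:
            sample.append(j)
        else:
            A[j, j] -= 1.0  # condition on "item j is not in the sample"
        # Schur-complement update: propagate the decision to later items.
        A[j + 1:, j] /= A[j, j]
        A[j + 1:, j + 1:] -= np.outer(A[j + 1:, j], A[j, j + 1:])
    return sample

# Sanity check: for a diagonal kernel the items are independent,
# each included with probability K[i, i].
rng = np.random.default_rng(0)
print(sample_dpp_sequential(0.3 * np.eye(4), rng))
```

For a diagonal kernel each item is drawn independently with probability K[i, i], which gives a quick sanity check of the scheme; the point of the elimination view is that the same loop remains exact for non-diagonal kernels, at the cost of one rank-one update per item instead of a full spectral decomposition.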

Type: Research Papers
Copyright: © Applied Probability Trust 2020


