

What are the ultimate limits of photonic quantum memories?

Published online by Cambridge University Press:  24 March 2023

Mustafa Gündoğan*
Affiliation:
Humboldt-Universität zu Berlin, Berlin, Germany
Daniel K.L. Oi
Affiliation:
University of Strathclyde, Glasgow, UK
*
Author for correspondence: Mustafa Gündoğan, Email: guendomu@physik.hu-berlin.de

Extract

Photonic quantum memories are required in many applications in quantum information science, with performance requirements that vary by application. Although classical light storage has been demonstrated on time scales of minutes (Dudin et al., 2013; Heinze et al., 2013) to hours (Ma et al., 2021) in different systems, the storage of true single photons and single-photon-level coherent pulses is still limited to a few seconds at most (Wang et al., 2021; Ortu et al., 2022; Hain et al., 2022; Stas et al., 2022). In this Question, we would like to explore the challenges of quantum memory storage for the purposes of quantum communication and the distribution of entanglement, e.g. in quantum repeaters. Furthermore, recent work has proposed using quantum memories with hour-long storage times for quantum computation (Gouzien and Sangouard, 2021) and for physically transporting single photons for astronomical interferometry (Bland-Hawthorn et al., 2021) and global quantum communications (Wittig et al., 2017; Gündoğan et al., 2023).

Information

Type
Question
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press