Performance assessment of an imaging system is important for optimizing its design across various technologies. The information-theoretic viewpoint, based on communication theory or statistical inference theory, provides objective and operational measures of imaging performance. These approaches can be further developed by combining them with quantum statistical inference theory to optimize imaging performance over measurements and analyze its quantum limits, which is needed to improve an imaging system when photon shot noise is the dominant noise source in the measurement. The aim of this review is to discuss and analyze recent developments in this branch of quantum imaging.

- Chinese Optics Letters
- Vol. 22, Issue 6, 060009 (2024)
1. Performance Assessment of Imaging Systems
Imaging science has become a rapidly evolving multidisciplinary field in the past few years. In most imaging scenarios, the intensity of the light used during the imaging process is relatively high, and thus the quantum nature of the optical field does not manifest[1]. However, when the light levels are rather low, the imaging quality is limited by quantum effects. The field of quantum imaging encompasses a broad spectrum of imaging disciplines wherein the inherent quantum fluctuations of light play a crucial role during the imaging process and significantly impact its performance[2,3]. In these cases, assessing the performance and determining its quantum limit through powerful approaches is of great importance in guiding the practical optimization of the imaging system.
As a typical topic in quantum imaging, ghost imaging (GI) is an imaging modality that utilizes the high-order correlation of light fields to acquire imaging information[4,5]. The technique initially emerged as an unconventional imaging method that uses quantum-entangled photons as the illuminating source and retrieves the image through the two-photon correlation obtained from photon coincidence counting[6,7]. However, subsequent work demonstrated its feasibility with classical sources[8,9]. Along with the proof-of-principle verification and application of GI techniques, the performance evaluation of its imaging results has become of significant importance[10].
Resolving power, a key indicator for most imaging systems, can be defined in many ways[11,12]. The most famous traditional measure of optical resolving power is Rayleigh’s criterion[13] on the minimum distance between two incoherent point sources that is resolvable from the image. Although Rayleigh’s criterion is still widely used today due to its simplicity, it was a rough idea in the first place. The minimum resolvable distance behind Rayleigh’s criterion can be placed on a more rigorous foundation in terms of a statistical phase transition, where the sample complexity for statistically resolving closely spaced point sources goes from polynomial to exponential, but at the expense of making the calculation more complicated[14].
Resolving power can be better defined from information-theoretic perspectives, which we divide into three categories: overall assessment, image quality assessment, and task-oriented assessment. Overall assessment focuses on the capability of an imaging system to transfer information from object to image. For instance, based on the Nyquist–Shannon sampling theorem, the degrees of freedom of an image were proposed to evaluate an imaging system[15,16]. Here, it is essentially the mutual information and capacity[17] of the optical channel that is considered. Similar ideas exist in the quantum imaging field. For example, in GI, by treating the optical system as a communication process, the mutual information between the detection signal and the imaging object is analyzed as a performance measure to indicate the acquired information content for quantitative assessment of the system[18]. Image quality assessment focuses on the faithfulness of the image to the object and, explicitly or implicitly, involves a comparison between the image (or the observation data) and the object. For instance, the mean square error of the estimates for radiance values at each point[19] is an explicit object–image comparison, while the mutual information is an implicit comparison using the correlation between the object and its image or observation data[20]. For example, the image mutual information between the imaging result and the ground-truth image has been investigated to assess the imaging quality of a GI system[21]. Task-oriented assessment is based on tasks that infer specific features of the objects, such as the brightness, the number, and the distance of optical point sources[22–24]. The tasks used to assess resolving power are often strongly relevant to realistic scenarios, e.g., astronomical observation[25] and molecular imaging[26], and thus have clear operational significance.
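As a concrete illustration of an implicit object–image comparison, the image mutual information mentioned above can be estimated from the joint histogram of the two images. A minimal sketch in Python (the bin count, image size, and noise levels are arbitrary choices for illustration, not taken from Ref. [21]):

```python
import numpy as np

def image_mutual_information(img_a, img_b, bins=32):
    """Estimate the mutual information (in bits) between two images
    from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                  # joint distribution
    px = pxy.sum(axis=1, keepdims=True)        # marginal of img_a
    py = pxy.sum(axis=0, keepdims=True)        # marginal of img_b
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
obj = rng.random((64, 64))                               # ground-truth image
noisy = obj + 0.1 * rng.standard_normal(obj.shape)       # mildly degraded image
very_noisy = obj + 1.0 * rng.standard_normal(obj.shape)  # strongly degraded image

# The estimated MI decreases as the reconstruction degrades
print(image_mutual_information(obj, noisy), image_mutual_information(obj, very_noisy))
```

The histogram estimator is biased for small samples, but it suffices to rank reconstructions of the same scene by fidelity.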
2. Quantum Statistical Description of Imaging
Statistical inference theory, including parameter estimation and hypothesis testing, is the main mathematical foundation of task-oriented assessment of resolving power and is also applicable to image quality assessment[22,27–30].
To apply statistical inference theory to guide the optimization of an imaging system and analyze its quantum limit, various fluctuations, including the extrinsic noise and the intrinsic quantum noise, should be taken into account. The extrinsic noise, like the background light and dark count, comes from technical limitations. The intrinsic quantum noise, e.g., the photon shot noise, comes from the random nature of quantum measurement.
The imaging process is usually treated as a map from the object space to the observation space with additive noise[31]. Instead, we should consider the complex amplitude of the optical field,
The quantum description of optical field is split into two parts, the field operators and the quantum states, both of which are represented on the Fock space associated with a set of orthonormal spatiotemporal modes. To focus on the spatial resolving power, assume that the light is quasimonochromatic, scalar, and paraxial. Then, within one temporal mode, the quantum state for the spatial degree of freedom can be expressed with the Sudarshan–Glauber representation as[38]
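For reference, the standard multimode form of this representation reads (following Ref. [38]; the mode labels here may differ from those in the review's own equation)

```latex
\rho = \int P(\{\alpha_k\}) \, \bigotimes_k |\alpha_k\rangle\langle\alpha_k| \, \prod_k \mathrm{d}^2\alpha_k ,
```

where $|\alpha_k\rangle$ is the coherent state of the $k$-th spatial mode and $P$ is the Sudarshan–Glauber quasiprobability density, which can be singular or negative for nonclassical states.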
3. Diffraction Limit and State Discrimination
The diffraction limit of imaging can be understood from the perspective of modal transformation during light propagation. For instance, a typical
Figure 1.Diffraction limit understood in the perspective of modal transformation. Here, U is the Fourier transform and P stands for the projection that only allows low-spatial-frequency components at the Fourier plane to pass through, and U† is the adjoint of U. The red and green grids represent the coordinate space and the frequency space in the transverse plane, respectively.
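The U†PU picture of Fig. 1 can be sketched numerically: Fourier-transform the object, discard the high-spatial-frequency components with a pupil mask, and transform back. A minimal one-dimensional sketch (the grid size and pupil cutoff are arbitrary illustrative choices), in which two closely spaced point sources blur into a single peak:

```python
import numpy as np

# Object: two point sources separated by a few grid cells
n = 256
obj = np.zeros(n)
obj[124] = 1.0
obj[132] = 1.0

# U: Fourier transform; P: pupil passing only low spatial frequencies; U†: inverse transform
spectrum = np.fft.fft(obj)
freqs = np.fft.fftfreq(n)
cutoff = 0.02                      # pupil cutoff as a fraction of the sampling rate
pupil = np.abs(freqs) <= cutoff
image = np.fft.ifft(spectrum * pupil).real

# The band-limited image merges the two points into one broad peak at the midpoint
print(image.argmax())
```

Widening the cutoff (i.e., a larger numerical aperture) restores two distinct maxima, which is the modal-transformation view of the diffraction limit.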
The above-mentioned perspective of diffraction limit becomes more pronounced when considering a single-photon wave packet propagating from the object plane to the image plane. For a single-photon state, the spatial modal function
Correspondingly, the inner product of the single-photon states is transformed as
The nonzero overlap between
For thermal sources, the quantum state of the optical fields in the object plane can be described by the Sudarshan–Glauber representation of Eq. (3) with[38]
4. Quantum Statistical Inference Toolbox
The advantage of the quantum description of imaging is that it is convenient for utilizing quantum statistical inference theory, which concerns how to effectively extract the information encoded in a quantum state by optimizing quantum measurements and data processing. An abstract procedure of quantum statistical inference is illustrated in Fig. 2. After an encoding process, the quantum state depends on an unknown parameter
Figure 2.Quantum statistical inference process.
The ultimate purpose of quantum statistical inference theory is to find a good measurement
In practice, direct minimization of an objective function
5. Imaging Performance Evaluation Based on Parameter Estimation
When the unknown parameter
5.1. CRB in GI with respect to imaging quality
The classical CRB has been used for the image quality assessment of GI[53,54]. Generally speaking, the CRB is a principled theoretical means of bounding the uncertainty of parameter estimation. By extending it to the scenario where the image is regarded as a set of multiple parameters, the CRB for image information retrieval in a GI system was initially explored. Through the likelihood function
Because of the specific assumption about the light fields in the above study on the CRB of GI systems, it is restricted to imaging uncertainty assessment for light fields subject to Gaussian statistics. However, as different kinds of optimized light fields have been developed[55], a more general method for CRB evaluation is desirable. Recently, a practical numerical method for obtaining the CRB, as well as the fundamental uncertainty of the GI imaging result[54], has been proposed by incorporating the Bayesian filtering paradigm[56].
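To make the roles of the likelihood function, the FI, and the CRB concrete, consider a toy single-pixel model (an illustrative assumption, not the model of Refs. [53,54]): each bucket signal is a known pattern value times an unknown transmittance plus Gaussian detection noise. The Fisher information then has a closed form, and a Monte Carlo run shows the maximum-likelihood estimator attaining the bound:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: y_i = a_i * x + w_i, with known pattern values a_i
# and Gaussian noise w_i ~ N(0, sigma^2)
x_true = 0.7
sigma = 0.2
n_patterns = 500
a = rng.random(n_patterns)            # speckle-pattern values at this pixel

# Fisher information and Cramer-Rao bound for this linear-Gaussian model
fisher = np.sum(a**2) / sigma**2
crb = 1.0 / fisher

# Monte Carlo: the maximum-likelihood (least-squares) estimator attains the CRB
n_trials = 2000
estimates = np.empty(n_trials)
for t in range(n_trials):
    y = a * x_true + sigma * rng.standard_normal(n_patterns)
    estimates[t] = np.dot(a, y) / np.dot(a, a)   # ML estimate of x

print(estimates.var(), crb)   # empirical variance is close to the bound
```

For this linear-Gaussian model the ML estimator is exactly efficient; in more realistic GI models the CRB is generally attained only asymptotically.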
In particular, considering the multiple varying light-field patterns and the detection noise, the detection process of GI is first modeled as the forward process of a Bayesian filtering problem consisting of state evolution and observation. Then, via the prediction and update procedure of Bayesian filtering, the probability distribution of the desired image information is gradually estimated, given an appropriate initialization. The estimate of the imaging result, as well as its uncertainty lower bound, can then be obtained accordingly. Figure 3(a) illustrates the image information estimation results for different sampling numbers, with the evaluated CRB distribution shown in Fig. 3(b). The CRB shows properties similar to those found in the analytical study: it is generally the same for each pixel and decreases as the sampling number increases. To better show the effect of uncertainty estimation using the CRB, Fig. 3(c) shows the variation of both the reconstruction MSE and the evaluated average CRB with the sampling number under different SNRs. The MSE matches the CRB well, which indicates that the uncertainty represented via the CRB can be adopted for quantitative imaging quality assessment.
Figure 3.Results of the Bayesian filtering method. (a) The estimation results of image information under different numbers of measurements; (b) the evaluated CRB distribution corresponding to (a); (c) variation of both the reconstruction MSE and evaluated average CRB with the sampling number under different SNRs (adapted from Fig. 2 in Ref. [
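The prediction–update recursion described above can be sketched under a linear-Gaussian simplification with a standard Kalman filter: a static image makes the prediction step trivial, and the posterior covariance plays the role of the per-pixel uncertainty. This is an illustrative stand-in for, not a reproduction of, the scheme of Ref. [54]:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 16-pixel image and Gaussian bucket-detection noise
n_pix, sigma = 16, 0.1
x_true = rng.random(n_pix)

# Prior belief: mean m, covariance P (broad, weakly informative)
m = np.full(n_pix, 0.5)
P = np.eye(n_pix) * 10.0

for t in range(400):
    a = rng.random(n_pix)                            # illumination pattern (observation vector)
    y = a @ x_true + sigma * rng.standard_normal()   # bucket measurement
    # Update step of the linear-Gaussian (Kalman) filter; the static image
    # makes the state-evolution (prediction) step the identity.
    s = a @ P @ a + sigma**2                         # innovation variance
    k = P @ a / s                                    # Kalman gain
    m = m + k * (y - a @ m)
    P = P - np.outer(k, a @ P)

# The diagonal of the posterior covariance shrinks as samples accumulate,
# mirroring the CRB behavior described in the text.
print(np.max(np.abs(m - x_true)), np.mean(np.diag(P)))
```

Running the loop with fewer iterations leaves larger posterior variances, reproducing qualitatively the sampling-number dependence in Fig. 3.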
Besides quality assessment, the CRB and FI have also been utilized to enhance the quality of a GI system. For example, in the corresponding Fourier-transform GI method[57], better image retrieval is achieved by selecting detection signals that contain more information and yield a lower CRB. Specifically, from the conditional probability distribution of the detection light-field signals in both paths of the system,
Figure 4.(a) The FI of different parts of detection signals as a function of relative intensity β (with respect to the mean intensity of the detection signals); (b) comparison of experimental results reconstructed from 15,000 samplings using the proposed scheme (left) and vanilla correlation scheme (right) (adapted from Figs. 7 and 8 in Ref. [
5.2. CRB in resolving incoherent point sources
In the 1970s, Helstrom, the pioneer of quantum detection and estimation theory, calculated the quantum CRB for the strength, frequency, and position of an incoherent point source in the presence of background light[41]. These works focus on single-point resolution limited by background-light noise. In 2006, Ram et al. took the classical CRB on the estimation error for the separation between two incoherent optical point sources as a fundamental resolution measure and studied the two-point resolution of direct imaging limited by photon shot noise[24]. They showed that the classical FI about the separation under direct imaging abruptly decreases to zero as the two point sources approach each other. This result is consistent with the basic idea behind Rayleigh’s criterion: the significant overlap between the intensity profiles of the light from the two sources degrades the distinguishability of the sources’ features.
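This collapse of the classical FI can be reproduced numerically for a Gaussian point-spread function. In the sketch below (units with σ = 1; the grid parameters are arbitrary), the per-photon FI about the separation is evaluated on a grid and drops toward zero as the sources approach each other:

```python
import numpy as np

def direct_imaging_fisher(d, sigma=1.0, n_grid=4001, half_width=10.0):
    """Per-photon classical Fisher information about the separation d between two
    equally bright incoherent point sources under direct imaging (Gaussian PSF)."""
    x = np.linspace(-half_width, half_width, n_grid)
    def g(mu):
        return np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    p = 0.5 * (g(-d / 2) + g(d / 2))        # image-plane intensity profile
    # derivative of p with respect to d (each source centroid shifts by d/2)
    dp = (g(d / 2) * (x - d / 2) - g(-d / 2) * (x + d / 2)) / (4 * sigma**2)
    dx = x[1] - x[0]
    return float(np.sum(dp**2 / p) * dx)

# The classical FI collapses as d -> 0 ("Rayleigh's curse"), while the
# quantum FI stays at 1/(4 sigma^2) = 0.25 for all separations (Sec. 5.2)
for d in (2.0, 0.5, 0.1):
    print(d, direct_imaging_fisher(d))
```

For small d the FI scales roughly as d², so the CRB on the separation diverges, matching the abrupt loss of distinguishability described above.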
In 2016, Tsang et al.[37] revealed substantial room for the improvement of two-point resolving power by calculating the quantum FI of the single-photon states about the separation between two incoherent point sources that are closely placed. They also demonstrated that the spatial-mode demultiplexing (SPADE) with respect to the Hermite–Gaussian modes can attain the quantum-limited two-point resolution. Some advanced tools like the moment estimation[58–60] and quantum semiparametric estimation[61,62] are used for generalized quantum superresolution analysis from two-point problem to extended sources. A recent review on this branch of quantum superresolution can be found in Ref. [63].
6. Imaging Performance Evaluation Based on Hypothesis Testing
Another easy-to-use framework of quantum statistical inference is quantum hypothesis testing, which corresponds to the cases where the unknown parameter
The Chernoff exponent[65] and its quantum version[66,67] concern the asymptotic efficiency of hypothesis testing and are much easier to apply than the minimum error probability together with the Helstrom test. For problems with a large number of samples, we usually perform the same measurement on individual systems, so the observation data are i.i.d. random variables. According to classical hypothesis testing theory[65], the minimum error probability then decreases exponentially with the number of samples, that is,
This error rate, called the (classical) Chernoff exponent, can be calculated through the expression[65]
Therefore, the classical Chernoff exponent assesses the asymptotic efficiency of a measurement for hypothesis testing, and the quantum Chernoff exponent gives the quantum limit. They play roles similar to those of the classical and quantum FI in parameter estimation problems.
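For reference, the standard expressions are P_e ~ e^(−nξ) with ξ = −ln min over 0 ≤ s ≤ 1 of Σ_x p0(x)^s p1(x)^(1−s), and the quantum Chernoff exponent replaces the sum by Tr(ρ0^s ρ1^(1−s))[66,67]. The classical exponent is easy to evaluate numerically; a minimal sketch with two hypothetical photon-count distributions (the distributions are illustrative assumptions):

```python
import numpy as np

def chernoff_exponent(p0, p1, n_grid=1001):
    """Classical Chernoff exponent: xi = -ln min_{0<=s<=1} sum_x p0(x)^s p1(x)^(1-s),
    evaluated by a grid search over s."""
    s_values = np.linspace(0.0, 1.0, n_grid)
    vals = [np.sum(p0**s * p1**(1 - s)) for s in s_values]
    return -np.log(min(vals))

# Hypothetical photon-count distributions under the two hypotheses
p0 = np.array([0.5, 0.3, 0.2])
p1 = np.array([0.2, 0.3, 0.5])
xi = chernoff_exponent(p0, p1)

# With n i.i.d. samples the minimum error probability decays as ~exp(-n * xi)
print(xi)
```

Because this pair of distributions is symmetric under s ↔ 1 − s, the minimum sits at s = 1/2, where the exponent reduces to the Bhattacharyya bound.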
With hypothesis testing theory, in 1964, Harris[22] analyzed resolving power by considering the task of deciding whether the optical field in the image plane is generated by one incoherent source or by two, under conditions of high background radiation. In 1973, Helstrom[68] investigated the quantum limit of the minimum error probability for the one-versus-two incoherent optical source problem. However, Helstrom noted that the optimal measurement (the Helstrom test), “though possible in principle, cannot be carried out by any known apparatus.” In addition, Helstrom’s work[68] does not compare the quantum-limited performance with that of direct imaging, so it is unclear how much room for improvement there is.
In 2018, Lu et al.[69] utilized the Chernoff exponent to compare the asymptotic efficiencies of the binary SPADE measurement[37], the image inversion interferometry (SLIVER) measurement[70], direct imaging, and the quantum limit. They showed that both SPADE and SLIVER attain the quantum limit in the sub-Rayleigh regime, without knowing the separation in advance. In 2021, Huang and Lupo[71], using asymmetric quantum hypothesis testing, showed that SPADE and SLIVER are superior to direct imaging for the exoplanet detection problem. In 2022, Zanforlin et al.[72], based on hypothesis testing, experimentally demonstrated optical quantum superresolution imaging for detecting a weak secondary source.
7. Conclusion
Imaging science is a rapidly evolving multidisciplinary field, and various technologies have been used to optimize imaging systems. A good criterion for evaluating imaging performance is of great importance in guiding the design and optimization of imaging systems. The advantage of the quantum-information-theoretic perspective is that it allows the resolution optimization problem to be tackled in a unified formalism. It often provides an operational resolution measure and an asymptotically attainable lower bound that reveals the quantum limit of that measure. The quantum treatment of imaging may give new insights into the ultimate room for improvement of resolving power. More importantly, new measurement strategies like SPADE are superior to direct imaging in the sub-Rayleigh regime. This branch of quantum imaging is rapidly developing, and many proof-of-principle experiments have been reported. It is hoped that, with these information-theoretic perspectives, quantum-optimal measurements can be implemented for realistic imaging problems in the near future.
For GI, related studies have so far mainly focused on evaluating imaging performance with information-based measures and on optimizing imaging quality. For evaluation, the mutual information and the CRB are used as content-free indicators for beforehand assessment, and it has also been demonstrated that increasing the detection information contributes to improving the imaging quality. These studies have shown the effectiveness of information theory in assessing and optimizing GI systems, while also suggesting further theoretical possibilities for related research. First, the fundamental theoretical model of GI for information-theoretic study could be further investigated. For example, studies involving optical coherence would be desirable, since optical coherence plays an essential role in GI (as well as in several other quantum imaging modalities), whereas the probabilistic modeling in existing GI studies rarely accounts for it. Also, although existing studies have shown that the information measure behaves similarly to the mean square error and imaging SNR in imaging quality assessment, the analysis of performance evaluation should be extended by further connecting the information measure with commonly recognized evaluation indicators of an imaging system. In addition, the information-theoretic approaches could be further applied to adaptive and task-oriented imaging system design. There have been several studies on GI techniques for dynamic imaging or specific non-imaging tasks, where the flexible information encoding and mapping of the GI modality is exploited; the information measure or CRB analysis framework described above is expected to be incorporated into them to put more related system design studies on a solid theoretical basis. Currently, studies on GI focus mostly on classical sources.
Nonetheless, quantum GI techniques exploiting nonclassical sources, as well as the quantum description of the interaction between light and objects, are rather fascinating, and better prospects are expected from combining them with quantum statistical inference methods.
References
[1] M. I. Kolobov. Quantum Imaging(2007).
[2] Y. Shih. Quantum imaging. IEEE J. Sel. Top. Quantum Electron., 13, 1016(2007).
[3] A. Gatti, E. Brambilla, L. Lugiato. Quantum imaging. Prog. Opt., 51, 251(2008).
[4] J. H. Shapiro, R. W. Boyd. The physics of ghost imaging. Quantum Inf. Process., 11, 949(2012).
[5] Y. Shih. The physics of ghost imaging. Classical, Semi-classical and Quantum Noise, 169(2012).
[11] A. J. den Dekker, A. van den Bos. Resolution: a survey. J. Opt. Soc. Am. A, 14, 547(1997).
[12] V. Ronchi. Resolving power of calculated and detected images. J. Opt. Soc. Am., 51, 458(1961).
[14] S. Chen, A. Moitra. Algorithmic foundations for the diffraction limit. Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing, 490(2021).
[15] G. T. di Francia. Resolving power and information. J. Opt. Soc. Am., 45, 497(1955).
[16] D. Gabor. IV light and information. Progress in Optics, 1, 109(1961).
[17] R. E. Blahut. Principles and Practice of Information Theory(1987).
[22] J. L. Harris. Resolving power and decision theory. J. Opt. Soc. Am., 54, 606(1964).
[27] J. L. Harris. Diffraction and resolving power. J. Opt. Soc. Am., 54, 931(1964).
[31] H. H. Barrett, K. J. Myers. Foundations of Image Science(2003).
[32] L. Mandel, E. Wolf. Coherence properties of optical fields. Rev. Mod. Phys., 37, 231(1965).
[36] M. Tsang. Quantum limits to optical point-source localization. Optica, 2, 646(2015).
[38] L. Mandel, E. Wolf. Optical Coherence and Quantum Optics(1995).
[39] J. W. Goodman. Introduction to Fourier Optics(2004).
[40] C. Fabre, N. Treps. Modes and states in quantum optics. Rev. Mod. Phys., 92, 035005(2020).
[41] C. W. Helstrom. Quantum Detection and Estimation Theory(1976).
[43] S. M. Kay. Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory(1993).
[45] H. Cramér. Mathematical Methods of Statistics(1946).
[46] C. R. Rao. Information and the accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc., 37, 81(1945).
[49] V. Giovannetti, S. Lloyd, L. Maccone. Quantum metrology. Phys. Rev. Lett., 96, 010401(2006).
[54] L.-K. Du, C. Hu, S. Liu et al. Bayesian recursive information optical imaging: a ghost imaging scheme based on Bayesian filtering(2023).
[56] S. Särkkä, L. Svensson. Bayesian Filtering and Smoothing(2023).
[60] S. Zhou, L. Jiang. Modern description of Rayleigh’s criterion. Phys. Rev. A, 99, 013808(2019).
[63] M. Tsang. Resolving starlight: a quantum perspective. Contemp. Phys., 60, 279(2020).
[64] C. W. Helstrom. Detection theory and quantum mechanics. Inf. Control, 10, 254(1967).
