**Fourier ptychography (FP)**

The interested reader can also refer to our book, Fourier Ptychographic Imaging: A MATLAB Tutorial (the code can be downloaded here; see also the erratum).

**Concept**

Optical engineers use a concept called the space-bandwidth product (SBP) to characterize the total number of resolvable pixels of an imaging system. In microscopy, the SBPs of most off-the-shelf objective lenses are on the order of 10 megapixels, regardless of their magnification factors or numerical apertures (NA). For example, a standard 20X microscope objective (MPLN 20X, 0.4 NA, Olympus) has a resolution of 0.8 um and a field-of-view 1.1 mm in diameter, corresponding to an SBP of ~8 megapixels. The question then arises: can we design a microscope platform with a gigapixel SBP? In principle, we could simply scale up the size of a lens to increase the SBP. However, the geometrical aberrations associated with a lens also scale up as its size increases, so more optical surfaces must be introduced to gain degrees of freedom in lens optimization. The resulting lens design is both expensive to produce and difficult to align, and is thus not a practical solution for conventional microscope platforms.
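As a back-of-the-envelope check of these numbers (the sketch below assumes one pixel per half-resolution element; other sampling conventions give somewhat different figures):

```python
import math

# SBP estimate for the 20X / 0.4 NA objective quoted above.
fov_diameter_mm = 1.1
resolution_um = 0.8

fov_area_um2 = math.pi * (fov_diameter_mm * 1e3 / 2) ** 2
pixel_um = resolution_um / 2              # Nyquist sampling assumption
sbp = fov_area_um2 / pixel_um ** 2
print(f"SBP ~ {sbp / 1e6:.1f} megapixels")  # prints: SBP ~ 5.9 megapixels
```

With this convention the estimate lands at roughly 6 megapixels, on the same order as the ~8 megapixels quoted in the text.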

Fourier ptychography (FP) tackles this problem from another perspective. It brings together two innovations in computational optics to bypass the SBP barrier. The first is the phase retrieval technique originally developed for electron imaging. It is well known that light detectors, such as CCDs and photographic plates, measure only the intensity variations of the light that hits them. Phase information, which characterizes how much the light is delayed through propagation, is lost during the recording process. The phase retrieval technique recovers this lost phase information from intensity-only measurements. It typically consists of alternately enforcing the known information about the object in the spatial and Fourier domains. The second innovation employed in FP is the aperture synthesis technique invented by Dr. Martin Ryle, who received the Nobel Prize in 1974. This technique was originally developed for radio astronomy, aiming to bypass the resolution limit of a single radio telescope. It combines images from a collection of telescopes to expand the Fourier passband and improve the achievable resolution. By integrating these two innovations, FP uses a unique data fusion algorithm to recover a high-resolution, high-SBP sample image. This image contains both the intensity and phase information of the sample, a complete picture of the entire light field.
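The alternating-enforcement idea can be sketched as a toy Gerchberg-Saxton-style loop in NumPy (an illustrative sketch of classic two-domain phase retrieval, not the exact algorithm used in FP):

```python
import numpy as np

def gerchberg_saxton(amp_spatial, amp_fourier, n_iters=100, seed=0):
    """Toy alternating-projection phase retrieval.

    Given intensity-only measurements (amplitudes) in the spatial and
    Fourier domains, bounce between the two domains, enforcing the known
    amplitude in each while keeping the current phase estimate.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, amp_spatial.shape)
    field = amp_spatial * np.exp(1j * phase)            # random initial phase
    for _ in range(n_iters):
        F = np.fft.fft2(field)
        F = amp_fourier * np.exp(1j * np.angle(F))      # Fourier-domain constraint
        field = np.fft.ifft2(F)
        field = amp_spatial * np.exp(1j * np.angle(field))  # spatial-domain constraint
    return field
```

The returned field satisfies the spatial amplitude constraint exactly, and its Fourier amplitude error is non-increasing over the iterations.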

The physical scheme of FP is very simple. It consists of an LED array and a conventional microscope with a low-NA objective lens. The LED array successively illuminates the sample from multiple angles. At each illumination angle, FP records a low-resolution intensity image through the low-NA objective lens. The objective’s optical transfer function imposes a well-defined constraint in the Fourier domain. This constraint is digitally panned across Fourier space to reflect the angular variation of the illumination. Following the phase retrieval technique, FP recovers a high-resolution complex sample solution by alternately constraining its amplitude to match the acquired low-resolution image sequence and its spectrum to match the panning Fourier constraint. The panning Fourier constraint also enables passband expansion in Fourier space, sharing its root with the aperture synthesis technique. The resolution of the final FP reconstruction is determined by the largest incident angle of the LED array. As such, FPM is able to bypass the design conflicts of conventional microscopes and achieve high-resolution, wide field-of-view imaging capabilities.
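A minimal sketch of this recovery loop might look as follows; the grid sizes, binary pupil model, and replace-within-pupil update rule are simplified assumptions for illustration, not the exact published algorithm:

```python
import numpy as np

def fp_recover(images, offsets, n_hi, pupil, n_iters=20):
    """Toy Fourier ptychography recovery loop (illustrative sketch only).

    images  : low-resolution intensity measurements, one per LED angle
    offsets : (row, col) shift of the pupil center in the hi-res spectrum
    pupil   : binary circular aperture (n_lo x n_lo) modeling the objective NA
    """
    n_lo = pupil.shape[0]
    half = n_hi // 2 - n_lo // 2
    spectrum = np.zeros((n_hi, n_hi), dtype=complex)
    # initialize with the on-axis (first) low-resolution measurement
    spectrum[half:half + n_lo, half:half + n_lo] = \
        np.fft.fftshift(np.fft.fft2(np.sqrt(images[0]))) * pupil
    for _ in range(n_iters):
        for img, (dr, dc) in zip(images, offsets):
            r0, c0 = half + dr, half + dc
            sub = spectrum[r0:r0 + n_lo, c0:c0 + n_lo] * pupil
            lo = np.fft.ifft2(np.fft.ifftshift(sub))
            lo = np.sqrt(img) * np.exp(1j * np.angle(lo))    # match measured amplitude
            new_sub = np.fft.fftshift(np.fft.fft2(lo))
            region = spectrum[r0:r0 + n_lo, c0:c0 + n_lo]
            region[pupil > 0] = new_sub[pupil > 0]           # update within the pupil
    return np.fft.ifft2(np.fft.ifftshift(spectrum))
```

Each inner step enforces the measured amplitude in the spatial domain and writes the result back into the shifted pupil region of the spectrum, which is exactly the alternating-constraint structure described above.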

The name ‘Fourier ptychography’ comes from a related imaging modality, ptychography. Drawing connections and distinctions between FP and ptychography also helps in understanding FP’s concept and principle. Ptychography is a lensless imaging approach originally proposed for transmission electron microscopy. It uses a focused beam to illuminate the sample and records multiple diffraction patterns as a function of sample position. This set of diffraction patterns is then used to recover the complex sample profile following the iterative phase retrieval strategy. Clearly, FP and ptychography share the same phase retrieval strategy: iteratively seeking a complex sample solution that is consistent with many intensity measurements. In ptychography, the object support constraints for phase retrieval are imposed by the confined illumination beam in the spatial domain. As such, the sample must be mechanically scanned through the desired field-of-view. In FP, however, the object support constraints are given by the confined optical transfer function in the Fourier domain. In this regard, FP acts as the Fourier counterpart to ptychography, justifying its name. By using a low-NA objective lens, FP naturally offers a large and fixed field-of-view, a high signal-to-noise ratio, and no mechanical scanning, in contrast to conventional ptychography. Computational processing used in conjunction with the panning LED illumination then leads to an improved-resolution, high-SBP final image. Furthermore, FP is able to digitally correct for aberrations and extend the depth of focus beyond the physical limitations of the employed optics.

*Fig. 1 FP setup and imaging procedure. (a) An LED array placed beneath a conventional microscope illuminates the sample with a unique wavevector k. (b) The object’s finite spatial frequency support, defined by the microscope’s NA in the Fourier domain (red circle), is imposed at offset locations to reflect each unique LED illumination angle. The Fourier transforms of many shifted low-resolution measurements (each circle) are stitched together to recover a complex sample spectrum with resolution extending well beyond the objective lens cutoff frequency. (c) Light emitted from a single LED is regarded as a plane wave with wavevector (kxn, kyn) over a small sample area. (d) LEDs are sequentially illuminated during the FPM image acquisition process.*

**Operation of the Fourier ptychographic microscope**

The video below shows the operation of the Fourier ptychographic microscope. At each step, one LED element is turned on to illuminate the sample from a single incident angle. The captured low-resolution images are then used to recover the high-resolution complex object image.

**High-resolution, Wide-field Imaging**

The FP setup is shown in Fig. 1, where a fixed LED array illuminates the sample from multiple angles. At each illumination angle, FPM records a low-resolution sample intensity image through a low-NA objective lens. The objective’s NA imposes a well-defined constraint in the Fourier domain. This aperture constraint shifts to different positions across Fourier space as the illumination angle changes. FP performs phase retrieval by iteratively conforming a high-resolution complex sample image to both this shifting Fourier constraint and the associated sequence of low-resolution images – a unique phase-retrieval-based approach.

*Fig. 2 (Left) Fourier ptychographic microscope prototype setup. (Top right) Raw data captured with a 2X objective lens. (Bottom right) The corresponding high-resolution reconstructions.*

The raw data of a USAF resolution target and a blood smear sample are shown in Fig. 2 (top right). We used a 2X objective lens (0.08 NA) in the acquisition process. The corresponding FP reconstructions are shown in the bottom right of Fig. 2, with a maximum synthetic NA of 0.5. We note that the resolution of the FPM reconstruction is determined by the largest incident angle of the LED illumination. The capability of improving the resolution of a low-NA lens may provide a new solution for the wide-field, high-resolution imaging needs that dominate digital pathology, hematology, and neuroscience. Fig. 3 shows a wide field-of-view image acquired using the FPM setup.

*Fig. 3 Wide field-of-view imaging using the Fourier ptychographic microscope.*

**Quantitative Phase Imaging**

The challenge of recovering quantitative phase information from a specimen’s digital image has stimulated the development of many computational techniques over the past several decades. FP is, in fact, a phase retrieval approach that uses angular diversity to recover the complex sample image. To verify FP’s ability to accurately recover optical phase, we imaged a sample containing microbeads in oil (3 um and 6.5 um diameter, n_oil = 1.48, n_sphere = 1.6), shown in Fig. 4(a). Unwrapped line traces of the optical phase shift induced by two different-sized spheres lead to the estimated microbead thickness curves in Fig. 4(c1)-(c2), which exhibit close agreement with theory. Fig. 4(b) presents an FPM reconstruction of a complex biological sample, a human blood smear immersed in oil, a common quantitative phase measurement target. The FP and ground-truth digital holography (DH) phase maps closely match, as shown by the phase trace through a red blood cell in Fig. 4(c3).

*Fig. 4 Comparing FP phase reconstructions to digital holographic and theoretical data. FPM transforms low-resolution intensity images from a 2X objective (a1) into a high-resolution phase map (a2) of different-sized polystyrene microbeads, compared with a DH reconstruction (a3) using a 40X objective. (b) A similar image sequence highlights FPM’s phase imaging capabilities on a human blood smear. (c) Line traces through the microbeads and an RBC demonstrate quantitative agreement with the expected phase performance.*
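The thickness estimate follows the standard phase-delay relation; the sketch below uses the indices quoted above and assumes an illustrative illumination wavelength of 632 nm, which is not stated in the text:

```python
import math

# Expected peak phase delay through a microbead immersed in oil.
n_oil, n_sphere = 1.48, 1.60
diameter_um = 3.0            # smaller bead from the text
wavelength_um = 0.632        # assumed illumination wavelength

phi_max = 2 * math.pi * (n_sphere - n_oil) * diameter_um / wavelength_um  # radians

# Inverting the same relation converts a measured (unwrapped) phase trace
# into a thickness estimate, as done for the curves in Fig. 4(c):
thickness_um = phi_max * wavelength_um / (2 * math.pi * (n_sphere - n_oil))
```

For these numbers the peak delay is roughly 3.6 rad, i.e. more than half a wave, which is why the raw phase must be unwrapped before the thickness is estimated.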

**Macroscopic Fourier ptychographic imaging**

FP can also be implemented in a macroscopic imaging setting. In this case, the object is placed in the far field and the camera is scanned over different x-y positions to acquire images corresponding to different passbands. Far-field propagation is equivalent to performing a Fourier transform of the light field, and thus the finite size of the camera aperture naturally serves as a confined support constraint in Fourier space. By scanning the entire camera over different x-y positions, we can therefore synthesize a large passband in Fourier space and recover a super-resolution sample image. The final achievable resolution in this case is determined by how far the camera can be moved, not by the size of the lens’s aperture. Fig. 5(c) shows a raw image from the platform, whose resolution is limited by the lens’s aperture. Fig. 5(d) shows the image recovered using FP, whose resolution is determined by the scanning distance of the entire camera.

*Fig. 5 Scheme of the macroscopic FP concept. (a) The sample is placed in the far field, and the aperture of the camera lens naturally serves as a confined support constraint in the Fourier plane. By scanning the entire camera to different positions, we can synthesize a larger passband in Fourier space, enabling super-resolution imaging beyond the resolution limit of the lens’s aperture. (b) Our experimental setup, where a USAF target is placed in the far field and a 2D motion stage scans the entire camera in the x-y plane. The raw image of the object directly captured by the camera (c1) and the corresponding Fourier spectrum (c2). The recovered image using the camera-scanning FP scheme (d1) and the corresponding Fourier spectrum (d2).*
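The resolution gain from camera scanning can be estimated with an Abbe-type far-field formula; all numbers below are illustrative assumptions, not the actual experimental parameters of Fig. 5:

```python
# Far-field resolution estimate for the camera-scanning geometry.
wavelength_um = 0.632    # e.g. a HeNe laser line (assumed)
z_m = 0.7                # object-to-aperture distance (assumed)
aperture_mm = 6.0        # physical aperture of the camera lens (assumed)
scan_mm = 30.0           # total x-y translation of the camera (assumed)

def resolution_um(synthetic_aperture_mm):
    # smallest resolvable feature ~ wavelength * distance / aperture diameter
    return wavelength_um * (z_m * 1e6) / (synthetic_aperture_mm * 1e3)

res_lens = resolution_um(aperture_mm)          # limited by the lens aperture
res_fp = resolution_um(aperture_mm + scan_mm)  # limited by the scan range
```

With these stand-in numbers, scanning the camera over 30 mm improves the resolution by a factor of six, matching the claim that the scan range, not the lens aperture, sets the final resolution.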

The advantage of camera-scanning FP can be better understood by drawing connections and distinctions with light-field imaging. Both light-field imaging and camera-scanning FP capture multiple images of the sample from different perspectives. Light-field imaging, however, is an incoherent modality, while camera-scanning FP is a coherent one. In light-field imaging, the acquired images are shifted and added to perform 3D refocusing, similar to the tomographic reconstruction process, and the resolution is still limited by the aperture of the individual lens. Camera-scanning FP, on the other hand, is able to truly bypass the resolution limit of the lens’s aperture. The recovered complex light field can also be back-propagated to different positions along the optical axis for 3D holographic imaging. We envision that camera-scanning FP may find use in remote sensing and other defense-related applications.

**Spectral Multiplexing via Fourier Ptychography**

Information multiplexing is important for biomedical imaging and chemical sensing. FP can also be used for information multiplexing and coherent-state decomposition. As in a typical Fourier ptychographic setting, we use an array of light sources to illuminate the sample from different incident angles and acquire the corresponding low-resolution images with a monochromatic camera. In this technique, however, multiple light sources are lit simultaneously for information multiplexing, and the acquired images thus represent incoherent summations of the sample transmission profiles corresponding to different coherent states. We show that, by using a state-multiplexed FP recovery routine, we can decompose the incoherent mixture of the FP acquisitions to recover a high-resolution sample image. We also show that color-multiplexed imaging can be performed by simultaneously turning on R/G/B LEDs during data acquisition. The multiplexed FP technique may provide a solution for handling the partial coherence of the light sources used in Fourier ptychographic imaging platforms. It can also replace spectral filters, gratings, or other optical components for spectral multiplexing and demultiplexing. With the availability of cost-effective broadband LEDs, the multiplexed FP technique may open up exciting opportunities for computational multi-spectral imaging.

*Fig. 6 Spectral-multiplexed FP scheme. R/G/B LEDs are turned on simultaneously for illumination. Low-resolution images are acquired using a 0.1 NA objective lens and a monochrome camera. A color-multiplexed FP recovery algorithm is then used to decouple the R/G/B channels from the low-resolution images. A high-resolution color image of the sample can be recovered using computation instead of spectral filters.*
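The multiplexed forward model can be illustrated in a few lines: a monochrome camera records the incoherent sum of the per-state intensities, not the intensity of the coherently summed fields, and that sum is what the state-multiplexed recovery routine must decompose. Random fields stand in for the actual channel images:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
# Three hypothetical coherent states (e.g. the R, G, and B channels),
# each represented by a complex field at the sensor plane.
fields = [rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)) for _ in range(3)]

measured = sum(np.abs(f) ** 2 for f in fields)   # what the monochrome camera records
coherent = np.abs(sum(fields)) ** 2              # NOT what the camera records
```

The two quantities differ because the states do not interfere with one another; the recovery routine exploits the angular diversity of the acquisitions to separate the channels again.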

**3D Imaging via Fourier Ptychography**

We can also use the Fourier ptychography scheme to recover the exit wavefront of a 3D sample (video). The complex wavefront can then be back-propagated to any plane along the optical axis. The video below demonstrates the 3D imaging capability of the Fourier ptychography approach.
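Digital refocusing of a recovered wavefront can be sketched with the standard angular spectrum propagation method (an illustrative implementation, not the exact code used here):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex field by distance z via the angular spectrum method.

    A negative z back-propagates, which is how a recovered FP wavefront can be
    digitally refocused to different planes along the optical axis.
    Units: wavelength, dx, and z in the same length unit (e.g. microns).
    """
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Because the transfer function has unit modulus for propagating components, forward propagation followed by back-propagation over the same distance returns the original field.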

**Extension for super-resolution Fluorescence Imaging**

As a coherent imaging technique, the original FP approach cannot be used for fluorescence microscopy. The reason is simple: fluorophores are not responsive to the phase variations of the excitation waves. In the original FP prototype, no matter which angle we illuminate the sample from, the fluorescence emission remains unchanged, and no additional information can be extracted from the raw images. To adapt the FP recovery concept for fluorescence microscopy, we can simply replace the plane-wave illumination with intensity-varied patterns. In this case, fluorophores respond to the intensity variations, and each low-resolution image provides additional information about the high-resolution sample profile. We term this scheme pattern-illuminated Fourier ptychography.

*Fig. 7 Experimental demonstration of the pattern-illuminated FP approach. (a1) Sample image acquired using the 10X objective with uniform illumination. (a2) Deconvolved version of (a1). (b1) Speckle-illuminated FP raw image. (b2) FP recovery using 49 raw images. (c1) Sample image acquired using a 40X high-NA objective. (a3), (b3), (c2) Intensity line traces of the highlighted features in (a1), (a2), (b2), and (c1). For (a1) and (c1), multiple frames were averaged to match the photon budget of (b2).*
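The frequency-mixing principle behind pattern illumination can be illustrated with a 1D toy example: a pattern frequency within reach of the optics shifts an object frequency from beyond the detection cutoff into the passband as a difference frequency. All frequencies below are illustrative, expressed in FFT bins:

```python
import numpy as np

n = 256
x = np.arange(n)
cutoff = 40                  # detection passband half-width (assumed)
f_obj, f_pat = 60, 35        # object detail beyond cutoff; pattern frequency

density = 1 + 0.5 * np.cos(2 * np.pi * f_obj * x / n)   # fluorophore density
pattern = 1 + 0.5 * np.cos(2 * np.pi * f_pat * x / n)   # excitation intensity
emission = density * pattern  # fluorescence responds to excitation intensity

spec = np.abs(np.fft.fft(emission))
# the product term contributes at |f_obj - f_pat| = 25, inside the passband
```

The object frequency (bin 60) lies outside the passband, but the multiplication moves a copy of it to bin 25, where the low-NA optics can detect it; many such patterns together let the recovery algorithm reassemble the high-frequency content.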

**Incoherent photographic imaging using structured light**

Controlling photographic illumination in a structured fashion is a common practice in computational photography and image-based rendering. The Fourier ptychography approach can also be used for incoherent photographic imaging. Here, frequency mixing between the object and the structured light shifts high-frequency object information into the passband of the photographic lens. Therefore, the recorded intensity images contain object information beyond the cutoff frequency of the collection optics. From multiple images acquired under different structured light patterns, the Fourier ptychographic algorithm can recover both the super-resolution object image and the unknown illumination pattern.

*Fig. 8 Incoherent photographic imaging using the Fourier ptychography approach. (a1)-(a3) Reference images using uniform R/G/B illumination, (b) combined reference color image, (c1)-(c3) recovered super-resolution images using the Fourier ptychographic approach, (d) combined super-resolution color image.*