The resolution of an optical imaging system – a microscope, telescope, or camera – can be limited by factors such as imperfections in the lenses or misalignment. However, there is a fundamental limit to the resolution of any optical system, imposed by the physics of diffraction. An optical system whose resolution reaches this theoretical limit is said to be diffraction-limited.
- Reflection, refraction and diffraction are all boundary behaviors of waves, associated with the bending of the path of a wave. The bending of the path is observable when the wave travels in a two- or three-dimensional medium. Reflection occurs when a wave bounces off a barrier.
- The wave nature of light leads to two very important properties: refraction, where the direction of light propagation is altered at the boundary between media of different densities, and diffraction, which has among its consequences that light can 'bend around corners'.
For objects with dimensions close to the wavelength, the wave nature of light intervenes, which leads to the phenomenon of diffraction. Diffraction patterns are typically studied behind an orifice cut in a screen.
(Image: a ray of light being refracted in a plastic block.)
In optics, the refractive index (also known as the refraction index or index of refraction) of a material is a dimensionless number that describes how fast light travels through the material. It is defined as n = c/v, where c is the speed of light in vacuum and v is the phase velocity of light in the medium. Diffraction describes how electric fields propagate through free space and through media with an index of refraction; refraction describes how light rays 'bend' at an interface between two media with different indices of refraction. The rainbow is created because the index of refraction of water droplets changes as a function of wavelength.
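These two definitions can be illustrated numerically. The sketch below (plain Python; the refractive indices for air and water are illustrative textbook values, not from this text) computes the phase velocity from n = c/v and the bending of a ray from Snell's law:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def phase_velocity(n: float) -> float:
    """Phase velocity v = c / n in a medium of refractive index n."""
    return C / n

def snell_refraction_angle(n1: float, n2: float, theta1_deg: float) -> float:
    """Angle of the refracted ray (degrees) from Snell's law: n1 sin θ1 = n2 sin θ2."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection: no refracted ray")
    return math.degrees(math.asin(s))

# Illustrative values: air (n ≈ 1.0) into water (n ≈ 1.33)
v_water = phase_velocity(1.33)                     # ≈ 2.25e8 m/s
theta2 = snell_refraction_angle(1.0, 1.33, 45.0)   # ≈ 32.1°
```

The ray bends toward the normal when entering the denser medium, consistent with the prism and rainbow examples discussed later in the text.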
The diffraction-limited angular resolution of a telescopic instrument is proportional to the wavelength of the light being observed, and inversely proportional to the diameter of its objective's entrance aperture. For telescopes with circular apertures, the size of the smallest feature in an image that is diffraction limited is the size of the Airy disk. As one decreases the size of the aperture of a telescopic lens, diffraction proportionately increases. At small apertures, such as f/22, most modern lenses are limited only by diffraction and not by aberrations or other imperfections in the construction.
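The scaling just described can be checked numerically. A minimal sketch (the 2.4 m aperture and 500 nm wavelength are illustrative values chosen here, not taken from the text) applies the Rayleigh criterion for a circular aperture:

```python
import math

def diffraction_limit_rad(wavelength_m: float, aperture_m: float) -> float:
    """Rayleigh criterion for a circular aperture: θ ≈ 1.22 λ / D (radians)."""
    return 1.22 * wavelength_m / aperture_m

ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0

# Illustrative: a 2.4 m aperture observing green light at 500 nm
theta = diffraction_limit_rad(500e-9, 2.4)
print(f"{theta * ARCSEC_PER_RAD:.3f} arcsec")  # ≈ 0.052 arcsec
```

Halving the aperture diameter doubles θ, which is the proportional increase in diffraction mentioned above.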
For microscopic instruments, the diffraction-limited spatial resolution is proportional to the light wavelength, and inversely proportional to the numerical aperture of either the objective or the object illumination source, whichever is smaller.
In astronomy, a diffraction-limited observation is one that achieves the resolution of a theoretically ideal objective of the size of the instrument used. However, most observations from Earth are seeing-limited due to atmospheric effects. Optical telescopes on the Earth work at a much lower resolution than the diffraction limit because of the distortion introduced by the passage of light through several kilometres of turbulent atmosphere. Some advanced observatories have recently started using adaptive optics technology, resulting in greater image resolution for faint targets, but it is still difficult to reach the diffraction limit even with adaptive optics.
Radiotelescopes are frequently diffraction-limited, because the wavelengths they use (from millimeters to meters) are so long that the atmospheric distortion is negligible. Space-based telescopes (such as Hubble, or a number of non-optical telescopes) always work at their diffraction limit, if their design is free of optical aberration.
The beam from a laser with near-ideal beam propagation properties may be described as being diffraction-limited. A diffraction-limited laser beam, passed through diffraction-limited optics, will remain diffraction-limited, and will have a spatial or angular extent essentially equal to the resolution of the optics at the wavelength of the laser.
The Abbe diffraction limit for a microscope
The observation of sub-wavelength structures with microscopes is difficult because of the Abbe diffraction limit. Ernst Abbe found in 1873 that light with wavelength λ, traveling in a medium with refractive index n and converging to a spot with half-angle θ, will have a minimum resolvable distance of d = λ / (2 n sin θ).
The quantity n sin θ in the denominator is called the numerical aperture (NA) and can reach about 1.4–1.6 in modern optics, hence the Abbe limit is d = λ/2.8. For green light around 500 nm and an NA of 1, the Abbe limit is roughly d = λ/2 = 250 nm (0.25 μm), which is small compared to most biological cells (1 μm to 100 μm), but large compared to viruses (100 nm), proteins (10 nm) and less complex molecules (1 nm). To increase the resolution, shorter wavelengths can be used, as in UV and X-ray microscopes. These techniques offer better resolution but are expensive, suffer from lack of contrast in biological samples, and may damage the sample.
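As a numeric check of the figures above, a short sketch (plain Python; the 90° half-angle realizes NA = 1 with n = 1, mirroring the text's example):

```python
import math

def abbe_limit_nm(wavelength_nm: float, n: float, half_angle_deg: float) -> float:
    """Abbe minimum resolvable distance d = λ / (2 n sin θ), in nm."""
    na = n * math.sin(math.radians(half_angle_deg))
    return wavelength_nm / (2.0 * na)

# Green light at 500 nm, NA = 1 (n = 1, θ = 90°): d = 250 nm, as in the text
d_na1 = abbe_limit_nm(500.0, 1.0, 90.0)   # 250.0 nm
# Oil-immersion optics with NA ≈ 1.4: d = λ/2.8 ≈ 179 nm
d_oil = abbe_limit_nm(500.0, 1.4, 90.0)   # ≈ 178.6 nm
```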
Implications for digital photography
In a digital camera, diffraction effects interact with the effects of the regular pixel grid. The combined effect of the different parts of an optical system is determined by the convolution of the point spread functions (PSF). The point spread function of a diffraction limited lens is simply the Airy disk. The point spread function of the camera, otherwise called the instrument response function (IRF) can be approximated by a rectangle function, with a width equivalent to the pixel pitch. A more complete derivation of the modulation transfer function (derived from the PSF) of image sensors is given by Fliegel. Whatever the exact instrument response function, it is largely independent of the f-number of the lens. Thus at different f-numbers a camera may operate in three different regimes, as follows:
- If the spread of the IRF is small with respect to the spread of the diffraction PSF, the system is essentially diffraction limited (so long as the lens itself is diffraction limited).
- If the spread of the diffraction PSF is small with respect to that of the IRF, the system is instrument limited.
- If the spreads of the PSF and IRF are similar, both limit the available resolution of the system.
The spread of the diffraction-limited PSF is approximated by the diameter of the first null of the Airy disk, d = 2.44 λN,
where λ is the wavelength of the light and N is the f-number of the imaging optics. For f/8 and green (0.5 μm wavelength) light, d = 9.76 μm. This is similar to the pixel size for the majority of commercially available 'full frame' (43mm sensor diagonal) cameras and so these will operate in regime 3 for f-numbers around 8 (few lenses are close to diffraction limited at f-numbers smaller than 8). Cameras with smaller sensors will tend to have smaller pixels, but their lenses will be designed for use at smaller f-numbers and it is likely that they will also operate in regime 3 for those f-numbers for which their lenses are diffraction limited.
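The three regimes can be sketched as follows (plain Python; the factor-of-2 threshold and the 6 µm pixel pitch are illustrative assumptions made here, not values from the text):

```python
def airy_first_null_um(wavelength_um: float, f_number: float) -> float:
    """Diameter of the first null of the Airy disk: d = 2.44 λ N."""
    return 2.44 * wavelength_um * f_number

def regime(psf_um: float, pixel_pitch_um: float, ratio: float = 2.0) -> str:
    """Crude classification of which blur dominates; the factor-of-2
    threshold is an assumption for illustration only."""
    if psf_um > ratio * pixel_pitch_um:
        return "diffraction limited"
    if pixel_pitch_um > ratio * psf_um:
        return "instrument limited"
    return "both contribute"

d = airy_first_null_um(0.5, 8)   # 9.76 µm at f/8 and 0.5 µm, as in the text
print(regime(d, 6.0))            # with a 6 µm pixel pitch: "both contribute"
```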
Obtaining higher resolution
There are techniques for producing images that appear to have higher resolution than allowed by simple use of diffraction-limited optics. Although these techniques improve some aspect of resolution, they generally come at an enormous increase in cost and complexity. Usually the technique is only appropriate for a small subset of imaging problems, with several general approaches outlined below.
Extending numerical aperture
The effective resolution of a microscope can be improved by illuminating from the side.
In conventional microscopes such as bright-field or differential interference contrast, this is achieved by using a condenser. Under spatially incoherent conditions, the image is understood as a composite of images illuminated from each point on the condenser, each of which covers a different portion of the object's spatial frequencies. This effectively improves the resolution by, at most, a factor of two.
Simultaneously illuminating from all angles (fully open condenser) drives down interferometric contrast. In conventional microscopes, the maximum resolution (fully open condenser, at NA = 1) is rarely used. Further, under partially coherent conditions, the recorded image is often non-linear in the object's scattering potential, especially when looking at non-self-luminous (non-fluorescent) objects. To boost contrast, and sometimes to linearize the system, unconventional microscopes (with structured illumination) synthesize the condenser illumination by acquiring a sequence of images with known illumination parameters. Typically, these images are composited to form a single image with data covering a larger portion of the object's spatial frequencies than with a fully closed condenser (which is also rarely used).
Another technique, 4Pi microscopy, uses two opposing objectives to double the effective numerical aperture, effectively halving the diffraction limit, by collecting the forward- and backward-scattered light. When imaging a transparent sample with a combination of incoherent or structured illumination, while collecting both forward- and backward-scattered light, it is possible to image the complete scattering sphere.
Unlike methods relying on localization, such systems are still limited by the diffraction limit of the illumination (condenser) and collection optics (objective), although in practice they can provide substantial resolution improvements compared to conventional methods.
The diffraction limit is only valid in the far field, as it assumes that no evanescent fields reach the detector. Various near-field techniques that operate less than ≈1 wavelength of light away from the image plane can obtain substantially higher resolution. These techniques exploit the fact that the evanescent field contains information beyond the diffraction limit, which can be used to construct very high resolution images, in principle beating the diffraction limit by a factor proportional to how well a specific imaging system can detect the near-field signal. For scattered-light imaging, instruments such as near-field scanning optical microscopes and Nano-FTIR, which are built atop atomic force microscope systems, can achieve resolutions of 10–50 nm. The data recorded by such instruments often requires substantial processing, essentially solving an optical inverse problem for each image.
Metamaterial-based superlenses can image with resolution better than the diffraction limit by locating the objective lens extremely close (typically hundreds of nanometers) to the object.
In fluorescence microscopy the excitation and emission are typically at different wavelengths. In total internal reflection fluorescence microscopy, a thin portion of the sample located immediately on the cover glass is excited with an evanescent field and recorded with a conventional diffraction-limited objective, improving the axial resolution.
However, because these techniques cannot image beyond one wavelength, they cannot be used to image into objects thicker than one wavelength, which limits their applicability.
Far-field imaging techniques are most desirable for imaging objects that are large compared to the illumination wavelength but that contain fine structure. This includes nearly all biological applications in which cells span multiple wavelengths but contain structure down to molecular scales. In recent years several techniques have shown that sub-diffraction limited imaging is possible over macroscopic distances. These techniques usually exploit optical nonlinearity in a material's reflected light to generate resolution beyond the diffraction limit.
Among these techniques, the STED microscope has been one of the most successful. In STED, multiple laser beams are used to first excite, and then quench, fluorescent dyes. The nonlinear response to illumination caused by the quenching process, in which adding more light causes the image to become less bright, generates sub-diffraction-limited information about the location of dye molecules, allowing resolution far beyond the diffraction limit provided high illumination intensities are used.
The limits on focusing or collimating a laser beam are very similar to the limits on imaging with a microscope or telescope. The only difference is that laser beams are typically soft-edged beams. This non-uniformity in light distribution leads to a coefficient slightly different from the 1.22 value familiar in imaging. But the scaling is exactly the same.
The beam quality of a laser beam is characterized by how well its propagation matches an ideal Gaussian beam at the same wavelength. The beam quality factor M squared (M2) is found by measuring the size of the beam at its waist, and its divergence far from the waist, and taking the product of the two, known as the beam parameter product. The ratio of this measured beam parameter product to that of the ideal is defined as M2, so that M2=1 describes an ideal beam. The M2 value of a beam is conserved when it is transformed by diffraction-limited optics.
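The M² measurement just described reduces to a one-line formula: the ideal Gaussian beam parameter product is λ/π, so M² is the measured product divided by that. A minimal sketch (the 1064 nm wavelength, 100 µm waist radius, and 4 mrad half-divergence are invented illustrative numbers):

```python
import math

def m_squared(waist_radius_m: float, half_divergence_rad: float,
              wavelength_m: float) -> float:
    """M² from the beam parameter product: BPP = w0·θ; ideal Gaussian BPP = λ/π."""
    bpp = waist_radius_m * half_divergence_rad
    return bpp * math.pi / wavelength_m

# Illustrative 1064 nm beam: 100 µm waist radius, 4 mrad half-divergence
m2 = m_squared(100e-6, 4e-3, 1064e-9)  # ≈ 1.18, essentially diffraction-limited
```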
The outputs of many low and moderately powered lasers have M2 values of 1.2 or less, and are essentially diffraction-limited.
The same equations apply to other wave-based sensors, such as radar and the human ear.
As opposed to light waves (i.e., photons), massive particles have a different relationship between their quantum mechanical wavelength and their energy. This relationship indicates that the effective 'de Broglie' wavelength is inversely proportional to the momentum of the particle. For example, an electron at an energy of 10 keV has a wavelength of 0.01 nm, allowing the electron microscope (SEM or TEM) to achieve high resolution images. Other massive particles such as helium, neon, and gallium ions have been used to produce images at resolutions beyond what can be attained with visible light. Such instruments provide nanometer scale imaging, analysis and fabrication capabilities at the expense of system complexity.
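The 10 keV electron figure can be reproduced from λ = h/p. A sketch using the non-relativistic momentum p = √(2mE) (adequate at 10 keV, where the relativistic correction is below about 1%):

```python
import math

H = 6.62607015e-34      # Planck constant, J·s
M_E = 9.1093837015e-31  # electron rest mass, kg
EV = 1.602176634e-19    # joules per electron-volt

def de_broglie_nm(energy_ev: float, mass_kg: float = M_E) -> float:
    """Non-relativistic de Broglie wavelength λ = h / sqrt(2 m E), in nm."""
    p = math.sqrt(2.0 * mass_kg * energy_ev * EV)
    return H / p * 1e9

lam = de_broglie_nm(10e3)  # ≈ 0.012 nm for a 10 keV electron, as in the text
```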
- Born, Max; Wolf, Emil (1997). Principles of Optics. Cambridge University Press. ISBN 0-521-63921-2.
- Lipson, Lipson and Tannhauser (1998). Optical Physics. United Kingdom: Cambridge. p. 340. ISBN 978-0-521-43047-0.
- Fliegel, Karel (December 2004). 'Modeling and Measurement of Image Sensor Characteristics' (PDF). Radioengineering. 13 (4).
- van Hulst, Niek (2009). 'Many photons get more out of diffraction'. Optics & Photonics Focus. 4 (1).
- Streibl, Norbert (February 1985). 'Three-dimensional imaging by a microscope'. Journal of the Optical Society of America A. 2 (2): 121–127. Bibcode:1985JOSAA...2..121S. doi:10.1364/JOSAA.2.000121.
- Sheppard, C.J.R.; Mao, X.Q. (September 1989). 'Three-dimensional imaging in a microscope'. Journal of the Optical Society of America A. 6 (9): 1260–1269. Bibcode:1989JOSAA...6.1260S. doi:10.1364/JOSAA.6.001260.
- Puts, Erwin (September 2003). 'Chapter 3: 180 mm and 280 mm lenses' (PDF). Leica R-Lenses. Leica Camera. Archived from the original (PDF) on December 17, 2008. Describes the Leica APO-Telyt-R 280mm f/4, a diffraction-limited photographic lens.
MOREAU René, Professor Emeritus at Grenoble-INP, SIMaP Laboratory (Science and Engineering of Materials and Processes), member of the Academy of Sciences and the Academy of Technologies.
SOMMERIA Joël, Research Director at the CNRS, Laboratory of Geophysical and Industrial Flows (LEGI).
For quantities such as the mass of a contaminant, heat, or momentum, diffusion requires a material medium stirred at microscopic scales. Light, by contrast, can propagate in a vacuum, where it undergoes no diffusion at all. In a transparent medium whose composition is not uniform, such as air or ice, diffusion of light can nevertheless occur. In English the distinction between this phenomenon and molecular diffusion is clear, since it is called scattering. The scattering of light depends on the size of the scattering objects (see The colours of the sky).
At the scales of air molecules, or of the water molecules constituting ice – objects much smaller (nanometers) than the wavelengths of light (between 0.4 and 0.8 μm) – scattering is predominant at short wavelengths, thus selecting the colour blue. This is Rayleigh scattering.
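The selectivity of Rayleigh scattering follows from its λ⁻⁴ intensity law, a standard result not stated explicitly in the text. A one-function sketch comparing illustrative blue (450 nm) and red (650 nm) wavelengths:

```python
def rayleigh_ratio(lambda1_nm: float, lambda2_nm: float) -> float:
    """Ratio of Rayleigh-scattered intensities I1/I2, using I ∝ λ⁻⁴."""
    return (lambda2_nm / lambda1_nm) ** 4

# Blue light at 450 nm is scattered ≈ 4.4× more strongly than red at 650 nm,
# which is why the clear sky is blue.
print(round(rayleigh_ratio(450.0, 650.0), 1))  # 4.4
```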
Water droplets suspended in clouds and fogs can reach 100 microns, much larger than the wavelength of light. Scattering then reduces to multiple reflections on the droplet surfaces. It is therefore no longer wavelength-selective, which explains the white colour of clouds, with shades of grey depending on the amount of sunlight absorbed.
On the other hand, at intermediate scales close to the wavelengths of light, such as pollens, aerosols and smoke, Mie scattering imposes a slightly bluish grey colour. This colour is typical of the 'blue line of the Vosges' above coniferous forests emitting pollen and microdroplets of isoprene (originally a compound of natural rubber).
These three types of scattering correspond to different modes of interaction of light with the scattering object. For objects that are large compared to the wavelength, it is possible to reason in terms of light rays, which corresponds to geometric optics. At the interface between two media, part of the light is reflected, and the other part passes through the interface with a modified propagation direction: this is called refraction. This process is observed in a prism like the one in the vignette of the focus Deviation of light by a prism. The different wavelengths of white light have different refraction angles, which leads to colour separation. Water drops or ice crystals can act as prisms, which leads to rainbows (Spectacular Rainbows) or other atmospheric halo phenomena (Atmospheric Halos). In clouds, reflections and refractions are multiple, which blurs the separation of colours and restores the white colour of sunlight.
For objects with dimensions close to the wavelength, the wave nature of light intervenes, which leads to the phenomenon of diffraction. Diffraction patterns are typically studied behind an orifice cut in a screen. The effect is similar to diffraction by a small object, but it is easier to observe, thanks to the absorption of the non-diffracted light. Figure 2 thus represents the diffraction pattern of monochromatic light of wavelength λ passing through a slit of width a. The diffracted light constitutes a main beam opening at an angle λ/a, surrounded by smaller side lobes. Thus, in the limit of a wide slit (λ/a small), the beam remains parallel, in accordance with geometric optics, while for λ ~ a the diffracted beam opens very wide.
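The two regimes described above (nearly parallel beam for a wide slit, wide opening for λ ~ a) can be made concrete with a short sketch, taking sin θ = λ/a for the half-angle of the main lobe (the slit widths of 100λ and 1.2λ are illustrative choices made here):

```python
import math

def main_lobe_half_angle_rad(wavelength: float, slit_width: float) -> float:
    """Half-angle of the main diffracted beam from a slit: sin θ = λ / a."""
    ratio = wavelength / slit_width
    if ratio >= 1.0:
        return math.pi / 2  # λ ≥ a: the beam opens over the whole half-space
    return math.asin(ratio)

# Wide slit (a = 100 λ): the beam stays nearly parallel (~0.57°)
narrow = math.degrees(main_lobe_half_angle_rad(1.0, 100.0))
# Slit comparable to λ (a = 1.2 λ): the beam opens very wide (~56°)
wide = math.degrees(main_lobe_half_angle_rad(1.0, 1.2))
```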
For white light, the different components diffract at different angles, leading to the coloured patterns shown in Figure 3 (obtained in the case of a circular orifice). As the short wavelengths are diffracted more widely, one can see why Mie scattering is dominated by the blue colour, in a way that depends on the size of the particles.
Cover image. Blue sky due to Rayleigh scattering, with a white cloud showing non-selective scattering by the droplets it contains. [Source: Pixabay, royalty-free]