Definition: Sensor; Image Sensor; Digital Image Sensor

In a DSLR camera the image sensor, or sensor for short, is an integrated circuit chip. The chip converts light striking its surface into electrical signals, which are then used to create the image.

The digital image sensor is an integrated circuit chip with an array of light-sensitive components on its surface. The array is made up of individual photosensitive points. Each photosensitive point inside the image circle converts the light that strikes it into an electrical signal. The full set of electrical signals is then converted into an image by the on-board computer.

There are typically millions of photosensitive sensor points on a digital image sensor. The number depends on the size of the sensor, its purpose and its resolution. For most purposes, the number of sensor points on the array is the same as the megapixel count that defines the camera's resolution. For example, the Canon 5D MkIII records 5760 × 3840 = 22,118,400 pixels (marketed as 22.3 megapixels) on a full-frame sensor (35.8 × 23.9 mm).
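As a sanity check, the pixel arithmetic above can be reproduced in a few lines of Python (the marketed 22.3 MP figure is slightly higher because it also counts sensor points outside the recorded frame):

```python
# Arithmetic behind the pixel count quoted above: the recorded frame
# of the Canon 5D MkIII is 5760 x 3840 pixels.
width_px, height_px = 5760, 3840

total_pixels = width_px * height_px        # sensor points in the frame
megapixels = total_pixels / 1_000_000      # 1 megapixel = one million pixels

print(total_pixels)            # 22118400
print(round(megapixels, 1))    # 22.1
```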

All of the individual photosensitive points on the image sensor must lie within the image circle on the focal plane; any points lying outside the image circle would be dark. It is therefore important to use lenses whose image circle is correctly matched to the size of the image sensor.

[Image: Digital image sensor – a sophisticated photosensitive array, signal amplifier and computer (Wikipedia).]

The sensor types

There are two main types of image sensor: the ‘charge-coupled device’ (CCD) and the ‘complementary metal–oxide–semiconductor’ (CMOS) active pixel sensor. There is no discernible difference in the optical quality of the two systems, but there are implications for shutter design, and each has advantages and disadvantages that suit it to slightly different imaging devices.

Both types of sensor convert the light striking the individual sensor points into electrical signals. However, they differ in how the resultant electrical signal is collected and treated.

In the CCD device each sensor point struck by light creates an electrical charge proportional to the light intensity striking it. All the sensor points are activated at once and the charge is stored. Once the exposure is complete, the first row of the array is read by the on-board computer, and each charge is converted to data describing the light that struck that sensor point. Next, the second row of the array is shifted into the first row and read. Then the third row is passed to the second, then to the first, and read in turn. This cascade of charges allows the entire array to capture the image in one instant and be read off sequentially after the exposure. This ‘frame-transfer’ method captures the entire image at once.
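The row-cascade readout described above can be sketched as a toy simulation. This is a simplification for illustration only, not real sensor firmware; the function name and the tiny two-column frame are made up:

```python
# Toy model of CCD readout: after exposure, each stored row of charge
# is shifted one row toward the readout register, and the row at the
# register is read, repeating until the whole array has been emptied.
def ccd_readout(charge_rows):
    """charge_rows: stored charges, first row first. Returns rows in read order."""
    read_order = []
    while charge_rows:
        read_order.append(charge_rows[0])   # read the row at the register
        charge_rows = charge_rows[1:]       # all remaining rows shift up one
    return read_order

frame = [[0.2, 0.9], [0.5, 0.4], [0.7, 0.1]]   # three rows of stored charge
print(ccd_readout(frame))   # rows are read first-to-last, after the exposure
```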

In the CMOS device each sensor point is directly integrated with an amplifier. When light strikes the sensor surface it creates an electrical voltage signal that is immediately amplified and converted to image data. The computer reads off the amplified signals sequentially, row by row, during the exposure, so the first sensor point is read a fraction of a second before the last. This time difference has consequences for the design of the camera shutter. It can also lead to predictable image distortion and some artefacts, particularly when shooting fast-moving objects.
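The readout skew described above can be illustrated with a short timing sketch. The per-row read time here is an assumed figure for illustration, not a specification of any real sensor:

```python
# Toy model of rolling CMOS readout timing: rows are read one after
# another, so the last row is sampled later than the first.
row_read_time_s = 1 / 30_000      # assumed time to read one row
rows = 3840                       # row count from the example sensor above

last_row_delay_s = (rows - 1) * row_read_time_s   # skew from first to last row
print(f"top-to-bottom readout skew: {last_row_delay_s * 1000:.1f} ms")
# roughly 128 ms under these assumptions -- long enough for a fast-moving
# subject to shift position between the first and last rows being read
```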


In digital cameras the image sensor system either is a complex computer itself (CMOS) or is linked to one (CCD). On the front of the sensor is the array of light-sensitive points (photosensitive pixels). In both CCD and CMOS technologies each photosensitive pixel creates an electrical charge proportional to the intensity of the light striking it.

In CCD technology, the charge in each sensor point is read off after exposure and transferred to the on-board computer, a second chip installed to handle the image processing, which compiles the image.

In CMOS technology the amplifier for each sensor is built directly into each sensor point. This means the computer that manages the data can be built into the same chip, sensing the charge directly and performing the image creation and any manipulation on-chip.

Cost and use

CCD technology is expensive and often requires more processing power. The need for extra components to complete the processing makes it difficult to use CCD technology in compact devices like consumer cameras. The technology also suffers from high noise levels, particularly at high light intensities. In low-end cameras this is not a problem, but where image quality matters, cooling systems have often been used to reduce the noise, adding to the expense. CCDs have been used in high-end imaging devices for astronomy, specialist cameras and medical imaging, as well as in consumer cameras, mobile-phone cameras and web cams.

CMOS active pixel sensors are best suited to applications where compact size, low power consumption and on-chip processing are important. They are used in everything from high-end digital cameras to mobile-phone cameras. The technology is cheaper than CCD and uses less power.

