How digital cameras work


Knowing how a digital camera works gives you a greater understanding of its strengths and weaknesses, and thus, how to use it. It can also guide you to make more informed purchases.

How A Digital Camera Works
All cameras work by using a lens to focus light coming from a scene onto a light-sensitive image sensor. Digital cameras are designed very similarly to film cameras, but there are some fundamental differences.

Instead of film, digital cameras use digital sensors and memory to capture and record images. The LCD monitor on a digital camera gives a photographer the advantage of previewing images immediately, without the delay of developing film. While a physical shutter is used in a film camera, an electronic one is used in most digital compact cameras, and DSLRs may use a combination of both.

The Image Capture Process

Step 1:
When the camera's sensor is ready to shoot a picture, it is at a "reset" position in the darkness of the camera's body, with no electrical charge.
This is a 4x4 pixel section of a standard digital camera image sensor. Each square represents a single photosite and its corresponding pixel. Each photosite has either a red, blue, or green color filter overlaid on it (in a Bayer pattern) to interpret color.
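As a rough illustration of that 4x4 layout, here is a minimal sketch in plain Python (no camera libraries involved) that prints the repeating Bayer pattern for a small sensor section. The RGGB tile ordering used here is just one common variant, assumed for illustration.

```python
# Build a small Bayer color-filter layout (RGGB variant) for illustration.
def bayer_pattern(rows, cols):
    """Return a rows x cols grid of 'R', 'G', 'B' filter labels."""
    # Each 2x2 tile repeats: R G on the top row, G B on the bottom row.
    tile = [["R", "G"],
            ["G", "B"]]
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in bayer_pattern(4, 4):
    print(" ".join(row))
# R G R G
# G B G B
# R G R G
# G B G B
```

Notice that half the photosites are green, which matches the Bayer layout discussed further below.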

Step 2:
When the shutter button is pressed, the shutter turns on or opens for a preset period of time, and light from the scene passes through the camera's lens before finally hitting the image sensor.

Step 3:
Each photosite on the sensor measures the light photons hitting it and generates an electrical charge that registers the intensity of the light being captured. Depending on the intensity of the light and the resultant charge, each photosite is assigned a brightness value between 0 (black) and 255 (white), with the values in between forming a grayscale. The black "0" value is calibrated against bordering photosites that are not used in image sensing.

Step 4:
After the charge for each pixel has been registered, an analog-to-digital converter (ADC) "reads" every pixel and converts the information into binary format for storage; the pixel's full RGB color is then worked out in the next step.
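To make Steps 3 and 4 concrete, here is a minimal sketch in plain Python. The full-scale charge figure is a made-up assumption, not a real camera specification; the sketch simply shows how an analog reading becomes an 8-bit value and then a binary string.

```python
# Illustrative only: map an analog charge reading to an 8-bit digital value.
FULL_SCALE_CHARGE = 40_000.0  # hypothetical charge (in electrons) that saturates a photosite

def quantize(charge):
    """Clamp the charge to the photosite's range and quantize it to 0-255."""
    fraction = max(0.0, min(charge / FULL_SCALE_CHARGE, 1.0))
    return round(fraction * 255)

for charge in (0.0, 10_000.0, 40_000.0):
    value = quantize(charge)
    print(f"charge {charge:>8.0f} -> value {value:>3} -> binary {value:08b}")
# charge        0 -> value   0 -> binary 00000000
# charge    10000 -> value  64 -> binary 01000000
# charge    40000 -> value 255 -> binary 11111111
```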

Step 5:
To add color to the image, each pixel needs to define the specific color of the light hitting it by identifying it with a combination of values for red, green, and blue. As each pixel can only sense a single color (the one its filter lets through), it derives the values for the two missing colors by sampling and averaging the values recorded by the closest surrounding pixels that carry the corresponding color filters.
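Here is a very simplified sketch of that interpolation idea, assuming the same RGGB layout sketched under Step 1 (real cameras use far more sophisticated demosaicing algorithms). For one interior pixel, each missing color is estimated by averaging the neighboring photosites that carry that color's filter.

```python
# Simplified demosaicing: estimate a full RGB value for one interior pixel of a Bayer mosaic.
def estimate_rgb(raw, pattern, r, c):
    """Average the 3x3 neighborhood of (r, c), grouped by each neighbor's filter color."""
    sums = {"R": 0, "G": 0, "B": 0}
    counts = {"R": 0, "G": 0, "B": 0}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            color = pattern[r + dr][c + dc]
            sums[color] += raw[r + dr][c + dc]
            counts[color] += 1
    return tuple(sums[k] / counts[k] for k in ("R", "G", "B"))

# Toy 4x4 raw capture: each number is the single brightness value its photosite measured.
raw = [[120,  90, 130,  95],
       [ 85, 200,  80, 210],
       [125,  88, 135,  92],
       [ 82, 205,  84, 198]]
pattern = [["R", "G", "R", "G"],
           ["G", "B", "G", "B"],
           ["R", "G", "R", "G"],
           ["G", "B", "G", "B"]]
print(estimate_rgb(raw, pattern, 1, 1))  # (127.5, 85.75, 200.0) for the pixel at row 1, col 1
```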

How do image sensors work?

Developed on the same premise as traditional film, digital image sensors are designed to absorb light rays. Once the intensity and colour of the light have been recorded, the information is converted into an electrical signal, which is then used to create a digital image almost instantly.
What are they made from?
An image sensor is primarily made up of three layers, all of which play an important role in recording the light that produces an image. The top layer is a grid of tiny microlenses, which direct light accurately down onto each pixel location. Beneath the microlenses sits a layer of colour filters, and below that the photodiodes, formed in silicon, a material that absorbs light well. The photodiodes respond to the light that reaches them, letting the sensor determine its intensity (highlights and shadows) and, together with the filters, record colour.
What is a pixel?
Your camera's sensor is made up of millions of pixels; each one is responsible for a small area of your digital image. The number of pixels the sensor has determines how many megapixels your camera offers. The more megapixels, the higher your image resolution.
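As a quick sketch of how that figure is worked out, the megapixel count is just the total number of pixels expressed in millions; the 6000 x 4000 grid below is an assumed example, roughly what a 24-megapixel sensor provides.

```python
# Megapixels are simply the total pixel count expressed in millions.
width_px, height_px = 6000, 4000   # assumed example resolution
megapixels = width_px * height_px / 1_000_000
print(f"{megapixels:.0f} MP")      # 24 MP
```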
How do they capture light and colour?
An image sensor on its own is monochrome: it can detect the intensity of light, but not its colour. To record colour, a layer of tiny colour filters is laid over the light-sensitive photodiodes. These filters only pass certain wavelengths of light, which helps the sensor determine the correct colours. Most cameras use a Bayer filter system, which makes each pixel responsive to one primary colour: Red, Green or Blue (RGB). The coloured filters are laid out in rows following a pattern that alternates between red and green in one row and green and blue in the next. Green appears most frequently because the human eye is most receptive to this colour. The camera then uses a process called colour interpolation to ensure your image has all the right hues and tones.
Sensor sizes 
Camera manufacturers often boast about their cameras' high megapixel counts, which can mislead consumers into thinking that the more megapixels a camera offers, the better the image results will be. In fact, to get great shots, there needs to be a sensible balance between sensor size and megapixel count. Unless your sensor is large enough to accommodate all the pixels comfortably, the resulting image quality may be no better than if you were to shoot with a low-megapixel camera.
To ensure your camera has the potential to shoot great-quality captures, look into how many pixels your sensor offers in relation to its size. For example, an APS-C size image sensor of around 23.6 x 15.7mm will generally produce better-quality images with just 14 megapixels than with 22 megapixels. This is because larger pixels can record more light and detail, which results in better-quality captures (see the pixel-size sketch after the list below).
  • Full frame sensor (35mm) – High-end professional DSLR cameras offer what is commonly known as a full-frame sensor, which is the same size as 35mm film.
  • APS-C sensor (23.6mm x 15.7mm) – Most mid-range and entry-level DSLRs and CSCs offer APS-C-sized image sensors, based on the size of traditional APS-C film.
  • 7.5mm x 5.7mm sensor - Commonly found in compact cameras, the smaller sensor size can still produce great shots providing there aren’t too many megapixels present.
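To see why the same sensor size with fewer megapixels means larger individual pixels, here is a rough sketch that estimates the pixel pitch (the width of one pixel) for the APS-C example above. The even, square-pixel spread is a simplifying assumption; real sensors also reserve some area for non-imaging photosites.

```python
import math

# Estimate the pixel pitch (size of one pixel) for a sensor of given dimensions.
def pixel_pitch_um(width_mm, height_mm, megapixels):
    """Assume square pixels spread evenly over the whole sensor area."""
    area_mm2 = width_mm * height_mm
    pixels = megapixels * 1_000_000
    return math.sqrt(area_mm2 / pixels) * 1000  # convert mm to micrometres

for mp in (14, 22):
    print(f"APS-C at {mp} MP -> roughly {pixel_pitch_um(23.6, 15.7, mp):.1f} micrometres per pixel")
# APS-C at 14 MP -> roughly 5.1 micrometres per pixel
# APS-C at 22 MP -> roughly 4.1 micrometres per pixel
```

Each of the bigger pixels gathers more light, which is the reason given above for the 14-megapixel sensor producing cleaner images.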
Types of sensor 
Along with different sizes, there are two main types of image sensor, CMOS and CCD. Both are designed to turn light into electronic signals so that the camera can produce a digital image. However, it’s the way in which the sensors do this that makes them differ.
  • CCD – In a CCD (Charge-Coupled Device) sensor, each pixel captures light and then moves the recorded information along to the edge of the sensor, where it is converted into an electrical signal. These electrical signals are in turn used to produce a digital image. Although accurate, this conversion process requires a lot of battery power.
  • CMOS  - By comparison, a CMOS (Complementary Metal Oxide Semiconductor) sensor will automatically convert light into an electrical signal as soon as it reaches a pixel. This is much more efficient, less expensive and requires a lot less power, making them a popular choice.
  • Live MOS – Camera manufacturers such as Leica, Panasonic and Olympus are known for using an entirely different type of sensor, Live MOS. This sensor is considered a bridge between CMOS and CCD, as image quality is said to be on a par with a CCD but with the efficiency of a CMOS.
FOV Crops

Digital sensors are typically smaller than the 36 x 24mm surface of one frame of 35mm film, so if you use the same lens to shoot the same scene on the two different media, the photo captured by the sensor will be cropped, as it won't cover as wide an area as the frame of film.
This is called a cropped field of view (FOV). Because of this, the "35mm film equivalent" measurement was created to describe the focal lengths of lenses used on digital cameras whose sensors are smaller than traditional film. This measurement, though not always perfectly accurate, gives you an idea of how lenses perform on different digital cameras. Sensor sizes vary greatly, from the tiny ones found in mobile phones to the "full-frame" (no FOV crop) sensors seen in pro DSLRs.
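As a rough sketch of how a 35mm-equivalent figure can be worked out, the usual approach compares sensor diagonals; the APS-C dimensions and the 50mm example lens below are assumptions for illustration.

```python
import math

# Crop factor = diagonal of a full-frame sensor (36 x 24mm) divided by the sensor's diagonal.
def crop_factor(width_mm, height_mm):
    full_frame_diag = math.hypot(36.0, 24.0)
    return full_frame_diag / math.hypot(width_mm, height_mm)

aps_c = crop_factor(23.6, 15.7)                               # typical APS-C sensor
print(f"APS-C crop factor: {aps_c:.2f}")                      # about 1.53
print(f"A 50mm lens frames like a {50 * aps_c:.0f}mm lens")   # about 76mm in 35mm-equivalent terms
```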
How Much Resolution Do You Really Need?
Do higher megapixel counts make for better images? The honest answer is NO. 
In fact, a 3-megapixel camera already captures photos good enough for A4-sized prints. Higher resolutions are only necessary if you crop your photos or print at large sizes, because they let you capture more detail. Cramming a high resolution onto a too-small sensor can also increase noise in pictures, so while higher resolutions can be good, more is not always better, depending on the camera in question.
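Here is a quick back-of-the-envelope sketch of how print size relates to resolution; the dots-per-inch targets are assumptions, and the lower figures are closer to what the 3-megapixel claim above implies.

```python
# Megapixels needed to print a given paper size at a chosen print resolution (dpi).
def megapixels_needed(width_in, height_in, dpi):
    return (width_in * dpi) * (height_in * dpi) / 1_000_000

# A4 paper is roughly 11.7 x 8.3 inches.
for dpi in (150, 200, 300):
    print(f"A4 at {dpi} dpi needs about {megapixels_needed(11.7, 8.3, dpi):.1f} MP")
# A4 at 150 dpi needs about 2.2 MP
# A4 at 200 dpi needs about 3.9 MP
# A4 at 300 dpi needs about 8.7 MP
```

At a relaxed 150-200 dpi, which looks fine at normal viewing distances, a roughly 3-megapixel image covers an A4 print; the stricter 300 dpi guideline is where extra megapixels start to matter.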

Digital Zoom And Interpolation
A camera's digital zoom uses a processing technique called interpolation to crop and magnify objects in the frame. In other words, it only simulates the effect of zooming in, as opposed to optical zoom, which actually uses the lens to zoom in and out on a subject.

Interpolation works by spreading the pixels of the cropped area further apart, then sampling them to artificially fill in the surrounding void with what the camera thinks should be there. That's why photos magnified with digital zoom are less detailed than those magnified using an optical zoom. If you're particular about your camera's zoom capabilities, do note that certain models available today are engineered specifically to suit this need, coming with long-range optical zoom ranges up to 30x and higher!
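Here is a stripped-down sketch of that idea using nearest-neighbor resampling on a tiny grayscale grid. Real cameras use smarter interpolation, but the point is the same: no new detail appears, because the output is built entirely from the cropped pixels.

```python
# Digital zoom, crudely: crop the center of an image, then upscale it by repeating pixels.
def digital_zoom(image, factor):
    """Crop the central 1/factor of a square grid and blow it back up to the original size."""
    size = len(image)
    crop = max(1, size // factor)
    start = (size - crop) // 2
    cropped = [row[start:start + crop] for row in image[start:start + crop]]
    # Nearest-neighbor upscaling: every output pixel just copies the closest cropped pixel.
    return [[cropped[r * crop // size][c * crop // size] for c in range(size)]
            for r in range(size)]

image = [[ 10,  20,  30,  40],
         [ 50,  60,  70,  80],
         [ 90, 100, 110, 120],
         [130, 140, 150, 160]]
for row in digital_zoom(image, 2):
    print(row)
# [60, 60, 70, 70]
# [60, 60, 70, 70]
# [100, 100, 110, 110]
# [100, 100, 110, 110]
```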
This series of pictures illustrates the effects of using optical and digital zoom on a Canon Powershot S95. The first image is shot at no zoom, the second with the maximum 3.8x optical zoom, and the last with digital zoom at 15x. In the last photo, the camera merely crops the portion out from the image and interpolates it to the new size, resulting in slight degradation of quality.

I hope you enjoy reading the 2nd part of digital photography basics.
Later then. ^_^

