TEST BANK
Test Bank for Digital Image Processing and Analysis: Computer Vision and
Image Analysis, 4th Edition
Scott E Umbaugh: Digital Image Processing and Analysis: Computer Vision and
Image Analysis 4e
Solutions for Chapter 1: Digital Image Processing and Analysis
1. Digital image processing is also referred to as computer imaging and can be defined as the
acquisition and processing of visual information by computer. It can be divided into application
areas of computer vision and human vision: in computer vision applications the end user
is a computer, and in human vision applications the end user is a human. Image analysis ties these
two primary application areas together, and can be defined as the examination of image data to
solve a computer imaging problem. A computer vision system can be thought of as a deployed
image analysis system.
2. In general, a computer vision system has an imaging device, such as a camera, and a computer
running analysis software to perform a desired task. Examples include: a system to inspect parts
on an assembly line; a system to aid in the diagnosis of cancer via MRI images; a system to
automatically navigate a vehicle across Martian terrain; and a system to inspect welds in an
automotive assembly factory.
3. The image analysis process requires the use of tools such as image segmentation, image
transforms, feature extraction and pattern classification. Image segmentation is often one of the
first steps in finding higher level objects from the raw image data. Feature extraction is the
process of acquiring higher level image information, such as shape or color information, and may
require the use of image transforms to find spatial frequency information. Pattern classification
is the act of taking this higher level information and identifying objects within the image.
4. Hardware and software.
5. Gigabit Ethernet, USB 3.2, USB 4.0, Camera Link.
6. It samples an analog video signal to create a digital image. This sampling is done at a fixed
rate: it measures the voltage of the signal and uses this value for the pixel brightness. It uses
the horizontal sync pulse to control timing for one line of video (one row in the digital image),
and the vertical sync pulse to mark the end of a field or frame.
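The sampling step described above can be sketched in code. This is a minimal illustration, not from the text: the function name, the voltage range, and the simulated signal are assumptions chosen for the example, and real frame grabbers do this in hardware.

```python
# Minimal sketch (illustrative, not from the text): sampling a simulated
# analog video line at a fixed rate and quantizing each voltage sample to
# an 8-bit pixel brightness, as a digitizer would for one row of the image.

def sample_video_line(voltage, num_pixels, v_min=0.0, v_max=0.7):
    """Sample the analog signal voltage(t) at num_pixels evenly spaced
    times across one line (t in [0, 1)) and map each voltage to an
    8-bit gray level in 0..255."""
    row = []
    for i in range(num_pixels):
        t = i / num_pixels                     # fixed sampling rate
        v = voltage(t)
        v = min(max(v, v_min), v_max)          # clip to the valid range
        gray = round((v - v_min) / (v_max - v_min) * 255)
        row.append(gray)
    return row

# Example: a linear voltage ramp digitizes to a black-to-white gradient.
row = sample_video_line(lambda t: 0.7 * t, num_pixels=8)
```

One horizontal sync pulse would trigger one call like this; the vertical sync pulse would signal that the full set of rows for a field or frame is complete.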
7. A sensor is a measuring device that responds to various parts of the EM spectrum, or other
signal that we desire to measure. To create images, the measurements are taken across a
two-dimensional grid, thus creating a digital image.
8. A range image is an image where the pixel values correspond to the distance from the imaging
sensor. They are typically created with radar, ultrasound or lasers.
9. The reflectance function describes the way an object reflects incident light. This relates to
what we call color and texture; it determines how the object looks.
10. Radiance is the light energy reflected from, or emitted by, an object; whereas irradiance is
the incident light falling on a surface. So radiance is measured in Power/(Area)(Solid Angle), and
irradiance is measured in Power/Area.
11. A photon is a massless particle that is used to model EM radiation. A CCD is a charge-
coupled device. Quantum efficiency is a measure of how effectively a sensing element converts
photonic energy into electrical energy, and is given by the ratio of electrical output to photonic
input.
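The quantum efficiency ratio can be made concrete with a small numeric example. This is a minimal sketch with illustrative numbers, not values from the text; here the electrical output is taken as electrons collected per incident photon.

```python
# Minimal sketch (illustrative numbers, not from the text): quantum
# efficiency as the ratio of electrical output to photonic input --
# here, electrons collected per incident photon on a CCD element.

def quantum_efficiency(electrons_out, photons_in):
    """QE = electrical output / photonic input."""
    return electrons_out / photons_in

# A sensing element that converts 6,000 of every 10,000 incident photons
# into collected electrons has a quantum efficiency of 0.6 (60%).
qe = quantum_efficiency(6000, 10000)
```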
12. See Figure 1.4-5 and use the lens equation: 1/a + 1/b = 1/f. If the object is at infinity,
1/a → 0, so 0 + 1/b = 1/f; therefore f = b.
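The lens equation can also be checked numerically. A minimal sketch, with illustrative distances not taken from the text: solving for the image distance b and confirming that b approaches f as the object distance a grows large.

```python
# Minimal sketch (illustrative values, not from the text): solving the
# thin-lens equation 1/a + 1/b = 1/f for the image distance b, and
# checking the limit where the object distance a goes to infinity.

def image_distance(a, f):
    """Given object distance a and focal length f (same units),
    return image distance b from 1/b = 1/f - 1/a."""
    return 1.0 / (1.0 / f - 1.0 / a)

# Finite object: a = 1 m, f = 50 mm -> b lands slightly beyond f.
b_near = image_distance(a=1.0, f=0.05)
# Very distant object: as a -> infinity, 1/a -> 0 and b -> f.
b_far = image_distance(a=1.0e9, f=0.05)
```

This matches the answer above: for an object at infinity the image forms in the focal plane, so the sensor is placed at b = f.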