Solutions Manual for Digital Image Processing and Analysis: Computer Vision and Image Analysis, 4e by Scott Umbaugh (All Chapters)
Solutions for Chapter 1: Digital Image Processing and Analysis
1. Digital image processing is also referred to as computer imaging and can be defined as the
acquisition and processing of visual information by computer. It can be divided into the application
areas of computer vision and human vision: in computer vision applications the end user
is a computer, and in human vision applications the end user is a human. Image analysis ties these
two primary application areas together and can be defined as the examination of image data to
solve a computer imaging problem. A computer vision system can be thought of as a deployed
image analysis system.
2. In general, a computer vision system has an imaging device, such as a camera, and a computer
running analysis software to perform a desired task. Examples include: a system to inspect parts
on an assembly line, a system to aid in the diagnosis of cancer via MRI images, a system to
automatically navigate a vehicle across Martian terrain, and a system to inspect welds in an
automotive assembly factory.
3. The image analysis process requires the use of tools such as image segmentation, image
transforms, feature extraction and pattern classification. Image segmentation is often one of the
first steps in finding higher level objects from the raw image data. Feature extraction is the
process of acquiring higher level image information, such as shape or color information, and may
require the use of image transforms to find spatial frequency information. Pattern classification
is the act of taking this higher level information and identifying objects within the image.
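The following is a minimal sketch, not from the text, of how these steps fit together in code; the NumPy-only thresholding, the feature names, and the size-based classification rule are illustrative assumptions rather than the book's algorithms.

import numpy as np

def segment(image, threshold=128):
    # Segmentation: separate object pixels from the background with a global threshold.
    return image > threshold

def extract_features(image, mask):
    # Feature extraction: higher-level information such as object size and brightness.
    return {"area": int(mask.sum()),
            "mean_brightness": float(image[mask].mean()) if mask.any() else 0.0}

def classify(features, area_cutoff=50):
    # Pattern classification: assign a label based on the extracted features.
    return "large object" if features["area"] > area_cutoff else "small object"

image = np.zeros((32, 32), dtype=np.uint8)
image[8:20, 8:20] = 200                      # synthetic bright square as the "object"
mask = segment(image)
features = extract_features(image, mask)
print(features, classify(features))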
4. hardware and software.
5. Gigabyte Ethernet, USB 3.2, USB 4.0, Camera Link.
6. It samples an analog video signal to create a digital image. The sampling is done at a fixed
rate: the digitizer measures the voltage of the signal at each sample and uses this value for the
pixel brightness. It uses the horizontal sync pulse to control timing for one line of video (one row
in the digital image), and the vertical sync pulse to mark the end of a field or frame.
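A minimal sketch of this fixed-rate sampling, assuming a simulated analog line signal and an illustrative 0.7-volt white level; none of these values come from the text.

import numpy as np

def sample_video_line(analog_line, pixels_per_line=640, v_max=0.7):
    # Sample the line voltage at a fixed rate: pixels_per_line samples per line.
    t = np.linspace(0.0, 1.0, pixels_per_line)
    voltages = analog_line(t)
    # Quantize 0..v_max volts to 8-bit pixel brightness values.
    return np.clip(voltages / v_max * 255, 0, 255).astype(np.uint8)

row = sample_video_line(lambda t: 0.7 * t)   # a bright ramp across one line
print(row[0], row[-1])                       # 0 ... 255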
7. A sensor is a measuring device that responds to various parts of the EM spectrum, or to another
signal that we desire to measure. To create images, the measurements are taken across a two-
dimensional grid, resulting in a digital image.
8. A range image is an image where the pixel values correspond to the distance from the imaging
sensor. They are typically created with radar, ultrasound or lasers.
9. The reflectance function describes the way an object reflects incident light. This relates to
what we call color and texture; it determines how the object looks.
10. Radiance is the light energy reflected from, or emitted by, an object, whereas irradiance is
the incident light falling on a surface. So radiance is measured in Power/(Area × Solid Angle), and
irradiance is measured in Power/Area.
11. A photon is a massless particle that is used to model EM radiation. A CCD is a charge-
coupled device. Quantum efficiency is a measure of how effectively a sensing element converts
photonic energy into electrical energy, and is given by the ratio of electrical output to photonic
input.
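A quick illustrative calculation of quantum efficiency as this ratio; the photon and electron counts below are assumed numbers, not from the text.

photons_in = 1_000_000      # photons striking the sensing element
electrons_out = 950_000     # charge carriers actually generated
quantum_efficiency = electrons_out / photons_in
print(quantum_efficiency)   # 0.95, i.e. 95%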
12. See Fig. 1.4-5 and use the lens equation: 1/a + 1/b = 1/f. If the object is at infinity, then
1/a → 0, so 1/b = 1/f; therefore b = f.
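A small numerical check of this limit, assuming an illustrative 50 mm focal length: as the object distance a grows, the image distance b approaches f.

def image_distance(a, f):
    # Solve the lens equation 1/a + 1/b = 1/f for b.
    return 1.0 / (1.0 / f - 1.0 / a)

f = 50.0                                   # focal length in mm (assumed)
for a in (200.0, 1_000.0, 1_000_000.0):
    print(a, image_distance(a, f))         # b -> 50.0 as a -> infinity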
13. Yes, solid state, because the quantum efficiency is 95%.

N = A·t·∫ b(λ) q(λ) dλ
  = 20 (10×10^-3) ∫_400^700 600λ (0.95) dλ
  = 114 ∫_400^700 λ dλ
  = 114 [λ^2/2]_400^700
  = 114 (165,000) = 1.881×10^7
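A quick arithmetic check of this result, assuming the reading of the integrand above (b(λ) = 600λ, q(λ) = 0.95):

integral = (700**2 - 400**2) / 2            # ∫ λ dλ from 400 to 700 = 165,000
N = 20 * (10e-3) * 600 * 0.95 * integral    # = 114 * 165,000
print(integral, N)                          # 165000.0  18810000.0, i.e. 1.881×10^7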
14. With interlaced scanning, a frame rate of 1/30 of a second corresponds to a field rate of 1/60 of a
second. Here we have 240 lines per field and 640 pixels per line, which gives (240)(640) = 153,600
pixels in 1/60 of a second. So the sampling rate must be:

153,600 pixels / (1/60 sec) = 9.216×10^6, or about 9 megahertz
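The same arithmetic as a one-line check:

rate = (240 * 640) / (1 / 60)   # 153,600 pixels per field, one field every 1/60 s
print(rate)                     # 9216000.0 pixels/second, about 9 MHz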
15. Gamma rays have the most energy, radio waves have the least. For human life, more energy
is more dangerous.
16. UV is used in fluorescence microscopy, and IR images are used in remote sensing, law
enforcement, medical thermography and fire detection.
17. Acoustic imaging works by sending out pulses of sonic energy (sound) at various frequencies,
and then measuring the reflected waves. The time it takes for the reflected signal to appear
contains distance information, and the amount of energy reflected contains information about the
object’s density and material. The measured information is then used to create a two- or three-
dimensional image. Geological applications, for example oil and mineral exploration, typically
use low frequency sounds (around hundreds of hertz). Ultrasonic, or high frequency sound,
imaging is often used in manufacturing to detect defects and in medicine to “see” inside opaque
objects, such as a woman’s womb to image a developing baby.
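A minimal sketch of how the echo delay carries the distance information mentioned above; the 1500 m/s speed of sound (roughly water or soft tissue) is an illustrative assumption.

def echo_distance(round_trip_time_s, speed_of_sound_m_s=1500.0):
    # The pulse travels to the object and back, so halve the round-trip path.
    return speed_of_sound_m_s * round_trip_time_s / 2.0

print(echo_distance(100e-6))   # a 100 microsecond echo -> 0.075 m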