Solutions Manual for Digital Image Processing
and Analysis Computer Vision and Image
Analysis, 4e by Scott Umbaugh (All Chapters)
Solutions for Chapter 1: Digital Image Processing and Analysis
1. Digital image processing is also referred to as computer imaging and can be defined as the
acquisition and processing of visual information by computer. It can be divided into the
application areas of computer vision and human vision: in computer vision applications the end
user is a computer, and in human vision applications the end user is a human. Image analysis ties these
two primary application areas together, and can be defined as the examination of image data to
solve a computer imaging problem. A computer vision system can be thought of as a deployed
image analysis system.
2. In general, a computer vision system has an imaging device, such as a camera, and a computer
running analysis software to perform a desired task. Examples: a system to inspect parts on an
assembly line; a system to aid in the diagnosis of cancer via MRI images; a system to
automatically navigate a vehicle across Martian terrain; a system to inspect welds in an
automotive assembly factory.
3. The image analysis process requires the use of tools such as image segmentation, image
transforms, feature extraction and pattern classification. Image segmentation is often one of the
first steps in finding higher level objects from the raw image data. Feature extraction is the
process of acquiring higher level image information, such as shape or color information, and may
require the use of image transforms to find spatial frequency information. Pattern classification
is the act of taking this higher level information and identifying objects within the image.
4. hardware and software.
5. Gigabit Ethernet, USB 3.2, USB 4.0, Camera Link.
6. It samples an analog video signal to create a digital image. The sampling is done at a fixed
rate: at each sample instant the voltage of the signal is measured and used as the pixel brightness.
The horizontal sync pulse controls the timing for one line of video (one row in the digital image),
and the vertical sync pulse signals the end of a field or frame.
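The fixed-rate sampling described above can be sketched in code; the ramp signal, the line time, and the 8-bit quantization below are illustrative assumptions, not values from the text.

```python
# Sketch of frame-grabber-style sampling: an analog scan line is modeled as a
# voltage function of time, sampled at a fixed rate to form one image row.
def sample_scan_line(voltage, line_time, num_pixels, v_max=1.0):
    """Sample an analog line signal at a fixed rate into 8-bit pixel values."""
    dt = line_time / num_pixels            # fixed sampling interval
    row = []
    for i in range(num_pixels):
        v = voltage(i * dt)                # measure voltage at sample instant
        row.append(round(255 * v / v_max)) # quantize to pixel brightness
    return row

# Example: a ramp from 0 V to 1 V across the line gives a left-to-right gradient.
row = sample_scan_line(lambda t: t / 63.5e-6, 63.5e-6, 640)
```

One call produces one row; repeating it per horizontal sync pulse, and starting a new frame on the vertical sync pulse, builds the full digital image.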
7. A sensor is a measuring device that responds to various parts of the EM spectrum, or other
signal that we desire to measure. To create images, the measurements are taken across a two-
dimensional grid, thus creating a digital image.
8. A range image is an image where the pixel values correspond to the distance from the imaging
sensor. They are typically created with radar, ultrasound or lasers.
9. The reflectance function describes the way an object reflects incident light. This relates to
what we call color and texture; it determines how the object looks.
10. Radiance is the light energy reflected from, or emitted by, an object, whereas irradiance is
the incident light falling on a surface. So radiance is measured in Power/(Area × Solid Angle), and
irradiance is measured in Power/Area.
11. A photon is a massless particle that is used to model EM radiation. A CCD is a charge-
coupled device. Quantum efficiency is a measure of how effectively a sensing element converts
photonic energy into electrical energy, and is given by the ratio of electrical output to photonic
input.
12. See Fig. 1.4-5 and use the lens equation: 1/a + 1/b = 1/f. If the object is at infinity,
1/a → 0, so the equation reduces to 1/b = 1/f; therefore b = f.
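The lens-equation result can be checked numerically; the function name and the focal-length/object-distance values below are made up for illustration, with a, b, f named as in the text.

```python
import math

# Thin-lens equation: 1/a + 1/b = 1/f, solved for the image distance b.
def image_distance(a, f):
    """Return b from 1/a + 1/b = 1/f; a may be math.inf for a distant object."""
    if math.isinf(a):
        return f                      # 1/a -> 0, so 1/b = 1/f and b = f
    return 1.0 / (1.0 / f - 1.0 / a)
```

For example, with f = 50 and the object at infinity the image forms at b = 50, matching b = f; with a = 100 and f = 50 the image distance works out to b = 100.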
13. Yes, solid state, because the quantum efficiency is 95%.
N = A·t·∫ b(λ)q(λ) dλ = 20(10×10⁻³) ∫₄₀₀⁷⁰⁰ 600λ(0.95) dλ = 114 ∫₄₀₀⁷⁰⁰ λ dλ
= 114 [λ²/2]₄₀₀⁷⁰⁰ = 114(165,000) ≈ 1.881×10⁷
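The integral above can be verified numerically; the constants A·t = 0.2, q = 0.95, and the linear-in-λ integrand 600λ are read off the worked figures and should be treated as assumptions about the original problem statement.

```python
# Numeric check of the photon-count integral N = A*t * integral of b(l)*q(l) dl
# over 400-700 nm, with b(l)*q(l) = 600*l*0.95 (assumed from the worked solution).
At = 20 * (10 * 10**-3)   # A*t = 0.2
q = 0.95                  # quantum efficiency

def photon_count(steps=300000):
    lo, hi = 400.0, 700.0
    dlam = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        lam = lo + (i + 0.5) * dlam      # midpoint rule sample point
        total += 600.0 * lam * q * dlam  # b(lam)*q(lam)*dlam
    return At * total
```

The midpoint rule is exact for a linear integrand, so this reproduces 114 × 165,000 ≈ 1.881×10⁷.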
14. With interlaced scanning, a frame rate of 1/30 of a second is a field rate of 1/60 of a
second. Here we have 240 lines per field and 640 pixels per line, which gives (240)(640)
= 153,600 pixels in 1/60 of a second. So the sampling rate must be:
153,600 pixels / (1/60 sec) = 9.216×10⁶; or about 9 megahertz
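The arithmetic above can be reproduced directly; the variable names are just labels for the quantities in the text.

```python
# Interlaced-video sampling rate: pixels per field divided by the field period.
lines_per_field = 240
pixels_per_line = 640
field_period_s = 1 / 60                               # one field every 1/60 s

pixels_per_field = lines_per_field * pixels_per_line  # 153,600 pixels
sample_rate_hz = pixels_per_field / field_period_s    # about 9.216 MHz
```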
15. Gamma rays have the most energy, radio waves have the least. For human life, more
energy is more dangerous.
16. UV is used in fluorescence microscopy, and IR images are used in remote sensing,
law enforcement, medical thermography and fire detection.
17. Acoustic imaging works by sending out pulses of sonic energy (sound) at various
frequencies, and then measuring the reflected waves. The time it takes for the reflected
signal to appear contains distance information, and the amount of energy reflected
contains information about the object's density and material. The measured information
is then used to create a two- or three-dimensional image. Geological applications, for
example oil and mineral exploration, typically use low frequency sound (around hundreds
of hertz). Ultrasonic, or high frequency sound, imaging is often used in manufacturing
to detect defects and in medicine to "see" inside opaque objects, such as a woman's
womb to image a developing baby.
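The time-to-distance step at the heart of acoustic imaging can be sketched in a few lines; the 1500 m/s speed of sound (a typical value for water and soft tissue) and the function name are assumptions for illustration.

```python
# Convert a measured echo delay into a distance: the pulse travels out to the
# reflector and back, so the one-way distance is half the round-trip path.
def echo_distance_m(round_trip_s, speed_m_s=1500.0):
    """Distance to the reflecting object, given the round-trip echo time."""
    return speed_m_s * round_trip_s / 2.0
```

For example, an echo returning after 0.1 s at 1500 m/s places the reflector about 75 m away; applying this per transducer position and delay builds up the two- or three-dimensional image described above.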