Solutions Manual for Digital Image Processing and Analysis: Computer Vision and Image Analysis, 4e by Scott Umbaugh (All Chapters)
Solutions for Chapter 1: Digital Image Processing and Analysis
1. Digital image processing is also referred to as computer imaging and can be defined as the
acquisition and processing of visual information by computer. It can be divided into application
areas of computer vision and human vision: in computer vision applications the end user
is a computer, and in human vision applications the end user is a human. Image analysis ties these
two primary application areas together, and can be defined as the examination of image data to
solve a computer imaging problem. A computer vision system can be thought of as a deployed
image analysis system.
2. In general, a computer vision system has an imaging device, such as a camera, and a computer
running analysis software to perform a desired task. Examples include: a system to inspect parts
on an assembly line, a system to aid in the diagnosis of cancer via MRI images, a system to
automatically navigate a vehicle across Martian terrain, and a system to inspect welds in an
automotive assembly factory.
3. The image analysis process requires the use of tools such as image segmentation, image
transforms, feature extraction and pattern classification. Image segmentation is often one of the
first steps in finding higher level objects from the raw image data. Feature extraction is the
process of acquiring higher level image information, such as shape or color information, and may
require the use of image transforms to find spatial frequency information. Pattern classification
is the act of taking this higher level information and identifying objects within the image.
4. Hardware and software.
5. Gigabit Ethernet, USB 3.2, USB 4.0, Camera Link.
6. It samples an analog video signal to create a digital image. This sampling is done at a fixed
rate: at each sample time it measures the voltage of the signal and uses this value for the pixel
brightness. It uses the horizontal sync pulse to control timing for one line of video (one row in
the digital image), and the vertical sync pulse to mark the end of a field or frame.
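As an illustration of this sampling process, the sketch below digitizes one line of a hypothetical analog signal into 8-bit pixel values; the signal shape, voltage range, and pixel count are all assumed for the example:

```python
def digitize_line(voltage_at, n_pixels, v_min=0.0, v_max=0.7):
    """Sample an analog video line at n_pixels fixed points and
    quantize each voltage to an 8-bit gray level (0-255)."""
    pixels = []
    for i in range(n_pixels):
        t = i / n_pixels                              # sample position along the line, 0..1
        v = min(max(voltage_at(t), v_min), v_max)     # clip to the valid voltage range
        pixels.append(round((v - v_min) / (v_max - v_min) * 255))
    return pixels

# Hypothetical analog line: a bright bar in the middle, dark elsewhere.
line = digitize_line(lambda t: 0.7 if 0.4 < t < 0.6 else 0.1, n_pixels=10)
print(line)
```

Raising `n_pixels` raises the effective sampling rate, which is what determines the horizontal resolution of each row.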
7. A sensor is a measuring device that responds to various parts of the EM spectrum, or other
signal that we desire to measure. To create images, the measurements are taken across a two-
dimensional grid, thus creating a digital image.
8. A range image is an image where the pixel values correspond to the distance from the imaging
sensor. They are typically created with radar, ultrasound or lasers.
9. The reflectance function describes the way an object reflects incident light. This relates to
what we call color and texture; it determines how the object looks.
10. Radiance is the light energy reflected from, or emitted by, an object; whereas irradiance is
the incident light falling on a surface. So radiance is measured in Power/(Area·Solid Angle), and
irradiance is measured in Power/Area.
11. A photon is a massless particle that is used to model EM radiation. A CCD is a charge-
coupled device. Quantum efficiency is a measure of how effectively a sensing element converts
photonic energy into electrical energy, and is given by the ratio of electrical output to photonic
input.
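The ratio definition of quantum efficiency can be shown with a small sketch; the electron and photon counts below are hypothetical:

```python
def quantum_efficiency(electrons_out, photons_in):
    """QE as the ratio of electrical output to photonic input."""
    return electrons_out / photons_in

# Hypothetical CCD element: 9,500 electrons collected from 10,000 photons.
qe = quantum_efficiency(9500, 10000)
print(qe)  # 0.95
```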
12. See Fig. 1.4-5 and use the lens equation:

1/a + 1/b = 1/f

If the object is at infinity:

1/∞ + 1/b = 1/f ; 0 + 1/b = 1/f ; b = f
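The lens equation can also be checked numerically; the focal length and object distance below are assumed example values:

```python
def image_distance(f, a):
    """Solve the thin-lens equation 1/a + 1/b = 1/f for the image distance b."""
    if a == float("inf"):
        return f  # object at infinity focuses at the focal length
    return 1.0 / (1.0 / f - 1.0 / a)

print(image_distance(f=50.0, a=float("inf")))  # 50.0
print(image_distance(f=50.0, a=200.0))         # a finite object moves the focus out
```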
13. Yes, solid state, because the quantum efficiency is 95%.

N = A t ∫ b(λ) q(λ) dλ
  = 20(10×10⁻³) ∫₄₀₀⁷⁰⁰ 600λ (0.95) dλ
  = 114 ∫₄₀₀⁷⁰⁰ λ dλ = 114 (λ²/2)│₄₀₀⁷⁰⁰ = 114(165,000) ≈ 1.881×10⁷
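The arithmetic can be verified as a runnable check, assuming the worked numbers in this answer (leading factor 20·(10×10⁻³), integrand 600λ with quantum efficiency 0.95, limits 400 to 700):

```python
def photon_count(scale=20 * 10e-3, b0=600.0, qe=0.95, lo=400.0, hi=700.0):
    # The integral of λ dλ from lo to hi has the closed form (hi² - lo²)/2,
    # which is 165,000 for the limits 400..700.
    integral = (hi**2 - lo**2) / 2.0
    return scale * b0 * qe * integral

N = photon_count()
print(N)  # ≈ 1.881e7
```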
14. With interlaced scanning, a frame in 1/30 of a second is a field rate of 1/60 of
a second. Here we have 240 lines per field and 640 pixels per line, which gives (240)(640)
= 153,600 pixels in 1/60 of a second. So the sampling rate must be:

153,600 pixels / (1/60 sec) = 9.216×10⁶ ; or about 9 megahertz
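The same arithmetic as a runnable check, using the field geometry given in the answer:

```python
# 240 lines/field x 640 pixels/line, delivered in one field time of 1/60 s.
lines_per_field = 240
pixels_per_line = 640
field_time_s = 1.0 / 60.0

pixels_per_field = lines_per_field * pixels_per_line   # 153,600 pixels per field
sampling_rate_hz = pixels_per_field / field_time_s     # ≈ 9.216 MHz
print(sampling_rate_hz)
```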
15. Gamma rays have the most energy, radio waves have the least. For human life,
more energy is more dangerous.
16. UV is used in fluorescence microscopy, and IR images are used in remote
sensing, law enforcement, medical thermography and fire detection.
17. Acoustic imaging works by sending out pulses of sonic energy (sound) at various
frequencies, and then measuring the reflected waves. The time it takes for the reflected
signal to appear contains distance information, and the amount of energy reflected
contains information about the object's density and material. The measured information
is then used to create a two or three dimensional image. It is used in geological
applications, for example oil and mineral exploration, typically using low frequency sound
(around hundreds of hertz). Ultrasonic, or high frequency sound, imaging is often used
in manufacturing to detect defects and in medicine to "see" inside opaque objects
such as a woman's womb to image a developing baby.
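The echo-delay-to-distance relation described here can be sketched for the ultrasound case; the speed of sound in soft tissue (~1540 m/s) and the echo time are assumed example values:

```python
# Depth of a reflector from an ultrasound echo delay: the pulse travels
# down and back, so depth is half the round trip at the speed of sound.
SOUND_SPEED_TISSUE = 1540.0  # m/s, an assumed typical value for soft tissue

def echo_depth_m(round_trip_s, speed=SOUND_SPEED_TISSUE):
    return speed * round_trip_s / 2.0

# An echo returning after 130 microseconds places the reflector ~10 cm deep.
depth = echo_depth_m(130e-6)
print(depth)
```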