Courseware download: ftp://bcc.hitsz.edu.cn/DigitalImageProcessing/ (username: download, password: download)

Digital Image Processing: Digital Imaging Fundamentals
Dr. Guangming Lu (luguangm@gmail.com)

Contents
Main purpose: introduce several concepts related to DIP and some of the notation used throughout the course. This lecture will cover:
- Human visual system
- Light and the electromagnetic spectrum
- Image acquisition
- Sampling and quantization
- Resolution
- Basic relationships between pixels

Human Visual System
- The best vision model we have!
- Knowledge of how images form in the eye can help us with processing digital images.
- We will take just a whirlwind tour of the human visual system.

Structure of the Human Eye
- The lens focuses light from objects onto the retina.
- The retina is covered with light receptors called cones (6-7 million) and rods (75-150 million).
- Cones are concentrated around the fovea and are very sensitive to colour.
- Rods are more spread out and are sensitive to low levels of illumination.

Blind-Spot Experiment
- Draw an image like the one described here on a piece of paper: a dot and a cross about 6 inches apart.
- Close your right eye and focus on the cross with your left eye.
- Hold the image about 20 inches away from your face and move it slowly towards you.
- The dot should disappear!

Brightness Adaptation & Discrimination
- Because digital images are displayed as a discrete set of intensities, the eye's ability to discriminate between different intensity levels is an important consideration in DIP results.
- The human visual system can perceive approximately 10^10 different light intensity levels.
- Subjective brightness is a logarithmic function of the light intensity incident on the eye.
- However, at any one time we can only discriminate between a much smaller number of levels: brightness adaptation.
- For a given set of conditions, the current sensitivity level of the visual system is called the brightness adaptation level.
- Perceived brightness is not a simple function of intensity: Mach bands (1865). Seeing is believing?
- Similarly, the perceived intensity of a region is related to the light intensities of the regions surrounding it.
- Another experiment: a piece of paper on a desk always looks white, but it can appear totally black when used to shield the eyes while looking directly at a bright sky.

Optical Illusions
- Our visual systems play lots of interesting tricks on us.

Light and the Electromagnetic Spectrum
- Newton, 1666: white light separates into violet, blue, green, yellow, orange, and red, each colour blending smoothly into the next.
- Light is just the particular part of the electromagnetic spectrum that can be sensed by the human eye.
- Visible light: 0.43-0.79 μm.
- The electromagnetic spectrum is split up according to the wavelengths of different forms of energy: λ = c/ν and E = hν, where c is the speed of light, ν is the frequency, and h is Planck's constant.
- Electromagnetic radiation can also be viewed as a stream of massless particles, each containing a certain amount of energy; each bundle of energy is called a photon.
- High-frequency electromagnetic phenomena carry more energy per photon; that is the reason gamma rays are so dangerous to living organisms.
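As a quick numerical illustration of the photon-energy relation E = hν = hc/λ noted above, the short Python sketch below (my own addition, not part of the slides; the two wavelengths are representative example values) compares a visible-light photon with a gamma-ray photon.

```python
# Minimal sketch: photon energy E = h*c/wavelength for two example wavelengths.
h = 6.626e-34   # Planck's constant, J*s
c = 3.0e8       # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy in joules carried by a single photon of the given wavelength."""
    return h * c / wavelength_m

green = photon_energy(550e-9)   # green light, ~0.55 um (inside the 0.43-0.79 um band)
gamma = photon_energy(1e-12)    # a gamma ray, ~1 pm (assumed example wavelength)

print(f"green-light photon: {green:.2e} J")
print(f"gamma-ray photon:   {gamma:.2e} J")
print(f"energy ratio: about {gamma / green:,.0f}x per photon")
```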
Light and the Electromagnetic Spectrum
- The colours that we perceive are determined by the nature of the light reflected from an object.
- For example, if white light is shone onto a green object, most wavelengths are absorbed while green light is reflected from the object (white light in, other colours absorbed, green light reflected).

Light reflectance properties
- A body that reflects light relatively balanced in all visible wavelengths appears white to the observer.
- A body that favours reflectance in a limited range of the visible spectrum exhibits some shade of colour.
- Achromatic or monochromatic light: its only attribute is intensity (gray level), ranging from black through gray to white.

Chromatic light
Three basic quantities are used to describe the quality of a chromatic light source:
- Radiance (watts, W) (发光强度): the total amount of energy that flows from the light source.
- Luminance (lumens, lm) (光通量): a measure of the amount of energy an observer perceives from a light source. Example: light emitted in the far infrared region can have significant radiance, yet an observer perceives almost none of it.
- Brightness: a subjective descriptor of light perception that is practically impossible to measure.

Light and the Electromagnetic Spectrum
- In principle, if a sensor can be developed that is capable of detecting energy radiated by a band of the electromagnetic spectrum, we can image events of interest in that band.
- However, the wavelength of an electromagnetic wave required to "see" an object must be of the same size as or smaller than the object.
- Electromagnetic waves are not the only means of image generation; sound reflection, for example, is used in ultrasonic imaging.
- Note: there is an error in the reference book in this section; "far infrared" should be "far ultraviolet" (page 35).

Other EM Spectrum: Short-wavelength End
- Gamma rays: medical imaging, astronomical imaging, imaging radiation in nuclear environments.
- Hard X-rays: industrial applications.
- Soft X-rays: chest X-rays (shorter-wavelength end), dental X-rays (lower-energy end).
- Ultraviolet: microscopy imaging.
- Infrared region: near-infrared and far-infrared (e.g. a tumour imaged in the visible vs. infrared bands).
- Microwave: microwave ovens, communication, radar.
- Radio waves: AM, FM, TV, medical imaging, and observation of stellar bodies.

Image Acquisition
- Images are typically generated by illuminating a scene and absorbing the energy reflected by the objects in that scene.

Imaging Sensors
Image acquisition sensor arrangements:
- Single sensor
- Strip sensor
- Sensor array

Image Sampling and Quantization
- Incoming energy lands on a sensor material responsive to that type of energy, and this generates a continuous voltage.
- To create a digital image, we need to convert the continuous sensed data into digital form.
- This involves two steps: sampling and quantization.
- Digitizing the coordinate values is called sampling, and digitizing the amplitude values is called quantization.
- Quantization is the process of converting a continuous analogue signal into a digital representation of that signal.

Mathematical statement
Let Z be the set of integers and R the set of real numbers.
- Sampling: partition the xy plane into a grid, the coordinates of the centre of each grid cell being a pair of elements from the Cartesian product Z² (the set of all ordered pairs (zi, zj), with zi and zj integers from Z).
- Quantization: f is a function that assigns a gray-level value (a real number in R) to each distinct pair of coordinates (x, y). (A small code sketch of these two steps follows.)
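To make the two steps concrete, here is a minimal NumPy sketch (my own illustration, not from the slides; `scene` is a hypothetical stand-in for a continuous image function). The scene is sampled on an M x N grid and its amplitudes are quantized to L = 2^k gray levels.

```python
import numpy as np

def scene(x, y):
    # Hypothetical continuous image function f(x, y), returning intensities in [0, 1).
    return 0.5 + 0.495 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)

def sample_and_quantize(M, N, k):
    """Sample the continuous scene on an M x N grid, then quantize to L = 2**k levels."""
    # Sampling: digitize the coordinate values by evaluating the scene
    # at the centres of an M x N grid over the unit square.
    xs = (np.arange(M) + 0.5) / M
    ys = (np.arange(N) + 0.5) / N
    f = scene(xs[:, None], ys[None, :])          # continuous amplitude values

    # Quantization: digitize the amplitude values to L discrete gray levels.
    L = 2 ** k
    digital = np.floor(f * L).astype(np.uint8)   # integers in 0 .. L-1
    return digital

img = sample_and_quantize(M=40, N=40, k=6)       # 40 x 40 samples, 6 bits per pixel
print(img.shape, img.min(), img.max())
```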
Representation
- The result of sampling and quantization is a matrix of gray-level values: an M x N digital image f(x, y).
- Usually M and N are positive integers, and the number of gray levels is an integer power of 2: L = 2^k.
- Both the spatial and gray-level resolutions determine the storage size of an image (in bytes). Example: spatial resolution 40 x 40, gray-level resolution 64 (log2 64 = 6 bits/pixel). Number of pixels: 40 x 40 = 1600. Storage size (no compression, no overhead): 1600 x 6 = 9600 bits = 1200 bytes ≈ 1.17 KB.

Spatial & Intensity Level Resolution
- The spatial resolution of an image is determined by how sampling was carried out.
- Spatial resolution simply refers to the smallest discernible detail in an image. Vision specialists will often talk about pixel size; graphic designers will talk about dots per inch (DPI). (Example: a 5.1-megapixel image.)
- Intensity level resolution refers to the number of intensity levels used to represent the image. The more intensity levels used, the finer the level of detail discernible in an image.
- Intensity level resolution is usually given in terms of the number of bits used to store each intensity level:

  Number of bits   Number of intensity levels   Example codes
  1                2                            0, 1
  2                4                            00, 01, 10, 11
  4                16                           0000, 0101, 1111
  8                256                          00110011, 01010101
  16               65,536                       1010101010101010

- Effect of reducing spatial resolution: 1024 x 1024, 512 x 512, 256 x 256, 128 x 128, 64 x 64, and 32 x 32 versions of the same image.
- Effect of reducing intensity resolution: 256 grey levels (8 bits per pixel), 128 (7 bpp), 64 (6 bpp), 32 (5 bpp), 16 (4 bpp), 8 (3 bpp), 4 (2 bpp), 2 (1 bpp).
- With spatial resolution M x N and gray-level resolution L, how many samples and gray levels are required for a good approximation? The resolution (degree of discernible detail) of an image depends on the number of samples and the number of gray levels: the more these parameters are increased, the more closely the digitized array approximates the original image. But storage and processing requirements increase rapidly as a function of M, N, and k.
- The big question with resolution is always: "how much is enough?" This all depends on what is in the image and what you would like to do with it. Key questions include: Does the image look aesthetically pleasing? Can you see what you need to see within the image?
- Example: the picture on the right is fine for counting the number of cars, but not for reading the number plate.

Zooming and Shrinking Digital Images
- Zooming corresponds to oversampling; shrinking corresponds to subsampling.
- Zooming involves two steps: the creation of new pixel locations, and the assignment of gray levels to those new locations.
- Nearest-neighbour interpolation (NN): each new location takes the gray level of its nearest neighbour in the original image. Pixel replication is a special case of NN; NN tends to produce a checkerboard effect.
- Bilinear interpolation: uses the four nearest neighbours of a point; the new gray level is computed from G(A), G(B), G(C), and G(D), the gray levels of the neighbouring points A, B, C, D (see the sketch below).
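As an illustration of the two interpolation schemes just described, the following NumPy sketch (my own, not from the slides; it assumes a gray-level image stored as a 2-D array, and the A-D labels in the comments are my own assignment) zooms an image by an integer factor using nearest-neighbour replication and using bilinear interpolation over the four nearest neighbours.

```python
import numpy as np

def zoom_nearest(img, factor):
    """Nearest-neighbour zoom: each new pixel copies its closest original pixel."""
    M, N = img.shape
    rows = (np.arange(M * factor) / factor).astype(int)   # nearest source row index
    cols = (np.arange(N * factor) / factor).astype(int)   # nearest source column index
    return img[rows][:, cols]

def zoom_bilinear(img, factor):
    """Bilinear zoom: each new pixel is a distance-weighted blend of its 4 neighbours."""
    M, N = img.shape
    out = np.zeros((M * factor, N * factor))
    for i in range(M * factor):
        for j in range(N * factor):
            x, y = i / factor, j / factor                  # position in original coordinates
            x0, y0 = int(x), int(y)                        # neighbour A (labelling assumed)
            x1, y1 = min(x0 + 1, M - 1), min(y0 + 1, N - 1)
            a, b = x - x0, y - y0                          # fractional offsets
            out[i, j] = ((1 - a) * (1 - b) * img[x0, y0] +  # weight on G(A)
                         (1 - a) * b       * img[x0, y1] +  # weight on G(B)
                         a       * (1 - b) * img[x1, y0] +  # weight on G(C)
                         a       * b       * img[x1, y1])   # weight on G(D)
    return out

small = np.array([[10.0, 50.0],
                  [100.0, 200.0]])
print(zoom_nearest(small, 2))
print(zoom_bilinear(small, 2).round(1))
```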
Zooming and Shrinking Digital Images
- Shrinking is done in a similar manner as just described for zooming: for integer shrinking factors, delete the appropriate rows and columns; otherwise, expand the grid to fit over the original image and use nearest-neighbour or bilinear interpolation for the gray-level assignment.

Basic Relationships Between Pixels
Definitions: f(x, y) is a digital image; p and q are pixels.
- A pixel p at (x, y) has 2 horizontal and 2 vertical neighbours: (x+1, y), (x-1, y), (x, y+1), (x, y-1). This set of pixels is called the 4-neighbours of p, written N4(p).
- The 4 diagonal neighbours of p form ND(p): (x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1).
- N8(p) = N4(p) ∪ ND(p) is the set of 8-neighbours of p.

  N4(p):      ND(p):      N8(p):
  .  N  .     N  .  N     N  N  N
  N  p  N     .  p  .     N  p  N
  .  N  .     N  .  N     N  N  N

[Figure: a small binary image with labelled pixels S, Q, and P illustrating the N4(p), N8(p), and ND(p) relationships.]

Connectivity
- Connectivity between pixels is important because it is used in establishing the boundaries of objects and the components of regions in an image.
- Two pixels are connected if they are neighbours (i.e. adjacent in some sense, e.g. N4(p), N8(p), ...) and their gray levels satisfy a specified criterion of similarity (e.g. equality, ...).
- V is the set of gray-level values used to define adjacency (e.g. V = {1} for adjacency of pixels of value 1).

Adjacency
We consider three types of adjacency:
- 4-adjacency: two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).
- 8-adjacency: p and q are 8-adjacent if q is in the set N8(p).
- m-adjacency: p and q with values from V are m-adjacent if q is in N4(p), or if q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels with values from V.
Mixed (m-) adjacency is a modification of 8-adjacency and is used to eliminate the multiple-path connections that often arise when 8-adjacency is used (e.g. with V = {1}, compare the 8-adjacency and m-adjacency paths through the same arrangement of pixels).

Path (通路)
- A path is a sequence of adjacent pixels. For example, a path from pixel p with coordinates (x, y) to pixel q with coordinates (s, t) is a sequence (x0, y0), (x1, y1), ..., (xn, yn), where (x0, y0) = (x, y), (xn, yn) = (s, t), and (xi, yi) is adjacent to (xi-1, yi-1) for 1 ≤ i ≤ n; n is called the length of the path.
- If (x0, y0) = (xn, yn), the path is a closed path (闭合通路).
- We can define 4-, 8-, and m-paths depending on the type of adjacency used.

Connected Set and Region
- Let S represent a subset of pixels in an image. Two pixels p and q are said to be connected (连通) in S if there exists a path between them consisting entirely of pixels in S.
- For any pixel p in S, the set of pixels that are connected to it in S is called a connected component (连通分量) of S.
- If S has only one connected component, then S is called a connected set (连通集).
- Let R be a subset of pixels in an image. R is called a region (区域) of the image if R is a connected set.
- The boundary (边界, also called 边缘 or 轮廓) of a region R is the set of pixels in the region that have one or more neighbours that are not in R.

Distance Measures
For pixels p, q, z with coordinates (x, y), (s, t), (u, v), D is a distance function (metric) if:
- D(p, q) ≥ 0, with D(p, q) = 0 if and only if p = q;
- D(p, q) = D(q, p); and
- D(p, z) ≤ D(p, q) + D(q, z).

- Euclidean distance between p(x, y) and q(s, t): De(p, q) = [(x - s)² + (y - t)²]^(1/2). The points (pixels) having a distance less than or equal to r from (x, y) are contained in a disk of radius r centred at (x, y).
- D4 distance (city-block distance): D4(p, q) = |x - s| + |y - t|. The pixels at D4 distance ≤ r from (x, y) form a diamond centred at (x, y); e.g. the pixels with D4 ≤ 2 from p form a diamond, and the pixels with D4 = 1 are the 4-neighbours of p.
- D8 distance (chessboard distance): D8(p, q) = max(|x - s|, |y - t|). The pixels at D8 distance ≤ r from p form a square centred at p; e.g. the pixels with D8 ≤ 2 from p form a square, and the pixels with D8 = 1 are the 8-neighbours of p.
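The three distance measures can be written directly as small functions. The sketch below is my own illustration (not from the slides); the example coordinates are arbitrary.

```python
import math

def d_euclidean(p, q):
    """Euclidean distance De(p, q) = sqrt((x - s)^2 + (y - t)^2)."""
    (x, y), (s, t) = p, q
    return math.sqrt((x - s) ** 2 + (y - t) ** 2)

def d4(p, q):
    """City-block distance D4(p, q) = |x - s| + |y - t| (diamond-shaped contours)."""
    (x, y), (s, t) = p, q
    return abs(x - s) + abs(y - t)

def d8(p, q):
    """Chessboard distance D8(p, q) = max(|x - s|, |y - t|) (square contours)."""
    (x, y), (s, t) = p, q
    return max(abs(x - s), abs(y - t))

p, q = (2, 3), (5, 7)
print(d_euclidean(p, q), d4(p, q), d8(p, q))   # 5.0, 7, 4

# The pixels q with d4(p, q) == 1 are exactly the 4-neighbours N4(p),
# and the pixels q with d8(p, q) == 1 are exactly the 8-neighbours N8(p).
```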
Common Distance Definitions
- Euclidean distance (2-norm)
- D4 distance (city-block distance)
- D8 distance (chessboard distance)
[Figure: the pixels at distance 1, 2, 3, ... from a centre pixel under each of the three metrics.]

Summary
We have looked at:
- Human visual system
- Light and the electromagnetic spectrum
- Image acquisition
- Sampling and quantization
- Resolution
- Basic relationships between pixels
Next time we start to look at techniques for image enhancement.