
COURSE 10 DIGITAL IMAGE PROCESSING


Presentation Transcript


1. UNIVERSITY OF MEDICINE AND PHARMACY “Victor Babeş” TIMISOARA, MEDICAL INFORMATICS DEPARTMENT, www.medinfo.umft.ro/dim

2. COURSE 10 – DIGITAL IMAGE PROCESSING

3. 1. WHY IMAGE PROCESSING?
• Applications:
• (a) improvement of pictorial information for human interpretation;
• (b) processing of scene data for autonomous machine perception.
• Landmarks:
• early 1920s – pictures transmitted through cable between London and New York;
• 1964 – pictures of the Moon, transmitted by Ranger 7

4. Application domains:
• (a) medicine, geography, meteorology, physics, astronomy, defense, industry
• (b) optical character recognition (OCR), artificial imaging systems in industry, digital processing of fingerprints, weather prediction, screening of blood samples
• Human visual perception – superior to all imaging methods

5. 2. FUNDAMENTALS – THE IMAGING MODEL
• Definition: image
• A two-dimensional light intensity function, noted f(x,y), denoting the intensity (luminosity) of the image at any point (x,y)
• The nature of f(x,y) may be characterised by two components:
• (1) illumination i(x,y)
• (2) reflectance r(x,y)
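
The slide lists the two components but not how they combine; in the standard illumination/reflectance model (a well-known textbook formulation, not spelled out on the slide) the image is their product, with the customary bounds:

```latex
f(x,y) = i(x,y)\, r(x,y), \qquad 0 < i(x,y) < \infty, \qquad 0 < r(x,y) < 1
```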

6. Definition:
• The intensity of a monochrome image f(x,y) = the gray level l of the image at the point (x,y)
• Lmin ≤ l ≤ Lmax
• Lmin = imin·rmin and Lmax = imax·rmax
• [Lmin, Lmax] – the gray scale
• in practice: [0, L]
• l = 0 is considered to be black
• l = L is considered to be white

7. IMAGE SAMPLING AND QUANTIZATION
• Uniform sampling and quantization
• Digitization of the spatial coordinates (x,y) = image sampling
• Digitization of the amplitude f(x,y) = gray-level quantization
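
For illustration, a minimal Python sketch of the two steps (the function name, the unit-square domain and the NumPy-based implementation are our assumptions, not part of the slide):

```python
import numpy as np

def sample_and_quantize(f, N, M, m):
    """Sample a continuous intensity function f(x, y) on an N x M grid
    over the unit square, then quantize the amplitude to G = 2**m levels."""
    G = 2 ** m
    xs = np.linspace(0.0, 1.0, N)                 # spatial sampling along x
    ys = np.linspace(0.0, 1.0, M)                 # spatial sampling along y
    samples = np.array([[f(x, y) for y in ys] for x in xs])
    lo, hi = samples.min(), samples.max()
    samples = (samples - lo) / (hi - lo + 1e-12)  # normalize to [0, 1]
    return np.round(samples * (G - 1)).astype(np.uint16)  # levels 0 .. G-1

# Example: a smooth synthetic "scene" digitized as a 256 x 256, 8-bit image
img = sample_and_quantize(lambda x, y: np.sin(4 * np.pi * x) * np.cos(4 * np.pi * y),
                          N=256, M=256, m=8)
```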

8. Suppose the continuous image f(x,y) is approximated by equally spaced samples arranged in the form of an N×M array – the digital image

9. Pixel and voxel (illustration)

10. Digital image f(x,y): f : Z×Z → R or f : Z×Z → Z
• In digital image processing: N = 2^n, M = 2^k, G = 2^m
• The number of bits necessary to store a digital image: b = N·M·m
• Question: How many samples and gray levels are required for a good approximation?
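
A quick worked example of the storage formula (the helper name and the 512 × 512, 8-bit figures are illustrative only):

```python
def storage_bits(n, k, m):
    """Bits needed to store an N x M image with G = 2**m gray levels,
    where N = 2**n and M = 2**k: b = N * M * m."""
    N, M = 2 ** n, 2 ** k
    return N * M * m

# A 512 x 512 image with 256 gray levels (n = k = 9, m = 8):
bits = storage_bits(9, 9, 8)                   # 2,097,152 bits
print(bits, "bits =", bits // 8, "bytes")      # 262,144 bytes = 256 KB
```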

11. BASIC RELATIONSHIPS BETWEEN PIXELS
• Notation: f(x,y) – image; p and q – pixels; S – subset of pixels from f(x,y)
• A pixel p at coordinates (x,y) has:
• 4 horizontal and vertical neighbors: (x+1,y), (x-1,y), (x,y+1), (x,y-1) – the set N4(p), the “4-neighbors of p”
• 4 diagonal neighbors: (x+1,y+1), (x+1,y-1), (x-1,y+1), (x-1,y-1) – the set ND(p); together with N4(p) they form N8(p), the “8-neighbors of p”
• Direction codes: 0-East, 1-NE, 2-N, 3-NW, 4-W, 5-SW, 6-S, 7-SE
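
A minimal Python sketch of the three neighborhoods (boundary handling is omitted; the function names are ours), as used in the connectivity definitions on the next slides:

```python
def n4(p):
    """N4(p): the 4 horizontal and vertical neighbors of p = (x, y)."""
    x, y = p
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def nd(p):
    """ND(p): the 4 diagonal neighbors of p = (x, y)."""
    x, y = p
    return [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]

def n8(p):
    """N8(p): union of N4(p) and ND(p)."""
    return n4(p) + nd(p)
```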

12. CONNECTIVITY
• adjacent pixels
• similarity criterion for the gray level: l ∈ V
• binary image: V = {1}
• gray-level image: V = {32, 33, ..., 63, 64}
• We consider 3 connectivity types:
• (a) 4-connectivity: p and q, if lp, lq ∈ V and q ∈ N4(p)
• (b) 8-connectivity: p and q, if lp, lq ∈ V and q ∈ N8(p)
• (c) m-connectivity (mixed connectivity): p and q, if lp, lq ∈ V and
• (1) q ∈ N4(p), or
• (2) q ∈ ND(p) and N4(p) ∩ N4(q) = ∅
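
A hedged Python sketch of the three adjacency tests (the helper names are ours; `gray` is assumed to be a dict mapping pixel coordinates to gray levels, and the m-connectivity test uses the usual textbook refinement that only common 4-neighbors whose gray level is in V count):

```python
def _n4(p):
    """4-neighbors (horizontal and vertical) of p = (x, y)."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def _nd(p):
    """Diagonal neighbors of p = (x, y)."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def adjacent(gray, p, q, V, kind="4"):
    """4-, 8- or m-adjacency of pixels p and q with respect to the
    gray-level set V; `gray` is a dict mapping (x, y) to a gray level."""
    if gray.get(p) not in V or gray.get(q) not in V:
        return False
    if kind == "4":
        return q in _n4(p)
    if kind == "8":
        return q in _n4(p) | _nd(p)
    if kind == "m":
        # common 4-neighbors of p and q whose gray level is also in V
        common = {r for r in _n4(p) & _n4(q) if gray.get(r) in V}
        return q in _n4(p) or (q in _nd(p) and not common)
    raise ValueError("kind must be '4', '8' or 'm'")
```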

13. Definitions:
• A pixel p is adjacent to a pixel q if they are connected.
• Two subsets S1 and S2 of the image are adjacent if at least one pixel from S1 is adjacent to a pixel from S2.
• A path from pixel p of coordinates (x,y) to a pixel q of coordinates (s,t) is a sequence of distinct pixels with coordinates (x0,y0), (x1,y1), ..., (xn,yn), where (x0,y0) = (x,y), (xn,yn) = (s,t), and (xi,yi) is adjacent to (xi-1,yi-1) for 0 < i ≤ n.
• n = the length of the path between p and q.
• If p and q are pixels of a subset S of the image, then p is connected to q in S if there is a path from p to q lying entirely within S.
• For any pixel p in S, the set of pixels in S connected to p forms a connected component of S.
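
A small sketch of how a connected component can be grown from a pixel by breadth-first search (our own illustration; the subset S is assumed to be a Python set of coordinates and 8-connectivity is used):

```python
from collections import deque

def neighbors8(p):
    """8-neighborhood of p = (x, y)."""
    x, y = p
    return [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def connected_component(S, p):
    """Connected component of the subset S (a set of pixel coordinates)
    that contains pixel p, using 8-connectivity: breadth-first search."""
    component, frontier = {p}, deque([p])
    while frontier:
        for r in neighbors8(frontier.popleft()):
            if r in S and r not in component:
                component.add(r)
                frontier.append(r)
    return component
```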

14. DISTANCE MEASURES
• For pixels p, q and z of coordinates (x,y), (s,t) and (u,v),
• D is a distance function (metric) if:
• D(p,q) ≥ 0, with D(p,q) = 0 if p = q
• D(p,q) = D(q,p)
• D(p,z) ≤ D(p,q) + D(q,z)
• Euclidean distance: De(p,q) = [(x-s)^2 + (y-t)^2]^(1/2)
• D4 distance (city-block distance): D4(p,q) = |x-s| + |y-t|
• D8 distance (chessboard distance): D8(p,q) = max(|x-s|, |y-t|)
• (The slide also illustrates the pixels at D4 ≤ 2 and at D8 ≤ 2 from (x,y).)
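
The three distances, as a short Python illustration (the function name and the example pair are ours):

```python
def distances(p, q):
    """Euclidean (De), city-block (D4) and chessboard (D8) distances
    between pixels p = (x, y) and q = (s, t)."""
    (x, y), (s, t) = p, q
    de = ((x - s) ** 2 + (y - t) ** 2) ** 0.5      # Euclidean
    d4 = abs(x - s) + abs(y - t)                   # city block
    d8 = max(abs(x - s), abs(y - t))               # chessboard
    return de, d4, d8

print(distances((0, 0), (3, 4)))                   # (5.0, 7, 4)
```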

15. ARITHMETIC AND LOGIC OPERATIONS
• Arithmetic operations between two pixels p and q:
• addition: p + q
• subtraction: p - q
• multiplication: p * q (or pq)
• division: p ÷ q
• Logic operations:
• AND: p AND q (or p·q)
• OR: p OR q (or p+q)
• COMPLEMENT: NOT p (or ~p)

16. Neighborhood-oriented operations
• Mask – template, window, filter
• New value for the centre pixel z5: the weighted sum of the mask coefficients and the gray levels in the 3×3 neighborhood (see the sketch below)
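
A minimal sketch of that weighted sum (the flat, row-by-row ordering of the nine weights and values is our assumption):

```python
def mask_response(weights, neighborhood):
    """Response of a 3x3 mask at the centre pixel z5:
    R = w1*z1 + w2*z2 + ... + w9*z9, with the weights and the
    neighborhood gray levels given row by row as 9-element sequences."""
    return sum(w * z for w, z in zip(weights, neighborhood))

# A 3x3 averaging mask applied to one neighborhood
average = [1 / 9] * 9
new_z5 = mask_response(average, [10, 12, 11, 9, 250, 10, 11, 12, 10])
```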

17. IMAGING GEOMETRY
• Notation: (X,Y,Z) in 3-D; (x,y) in 2-D
• Translation
• Scaling
• Rotation
• Concatenating transformations
• Inverse transformations
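
A sketch of these transformations in homogeneous coordinates (our own NumPy illustration: 4×4 matrices, so concatenation is matrix multiplication and the inverse transformation is the matrix inverse):

```python
import numpy as np

def translation(dx, dy, dz):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

def scaling(sx, sy, sz):
    """4x4 homogeneous scaling matrix."""
    return np.diag([sx, sy, sz, 1.0])

def rotation_z(theta):
    """4x4 homogeneous rotation about the Z axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

# Concatenation: rotate, then scale, then translate (read right to left)
A = translation(5, 0, 0) @ scaling(2, 2, 1) @ rotation_z(np.pi / 2)
point = np.array([1, 0, 0, 1])                 # (X, Y, Z) in homogeneous form
transformed = A @ point
original = np.linalg.inv(A) @ transformed      # inverse transformation
```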

18. IMAGE ENHANCEMENT
• the techniques discussed are problem-oriented
• spatial domain techniques
• frequency domain techniques
• combinations of the two techniques

19. SPATIAL DOMAIN METHODS
• g(x,y) = T[f(x,y)]
• where f(x,y) – the input image, g(x,y) – the processed image, T – an operator on f, defined over some neighborhood of (x,y)

20. ENHANCEMENT BY POINT PROCESSING – SIMPLE INTENSITY TRANSFORMATIONS: s = T(r)
• Image negative
• Contrast stretching
• Bit-plane slicing
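
Minimal Python sketches of the three transformations, assuming an 8-bit NumPy image (L = 256); the function names and the simple linear stretch are illustrative choices, not the slide's own formulas:

```python
import numpy as np

L = 256                                        # number of gray levels (8-bit)

def negative(img):
    """Image negative: s = (L - 1) - r."""
    return (L - 1) - img

def contrast_stretch(img, r_min=None, r_max=None):
    """Stretch the gray levels linearly onto the full range [0, L-1]."""
    r_min = img.min() if r_min is None else r_min
    r_max = img.max() if r_max is None else r_max
    s = (img.astype(float) - r_min) / max(r_max - r_min, 1) * (L - 1)
    return np.clip(s, 0, L - 1).astype(np.uint8)

def bit_plane(img, k):
    """Extract bit plane k (0 = least significant) as a binary image."""
    return (img >> k) & 1
```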

21. HISTOGRAM PROCESSING
• The histogram of a digital image with L gray levels in the range [0, L-1] is the discrete function p(rk) = nk / n, where:
• rk – the kth gray level, k = 0, 1, 2, ..., L-1
• nk – the number of pixels with the kth gray level
• n – the total number of pixels in the image
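
A short Python sketch of this definition (assuming the image is an integer NumPy array with values in [0, L-1]):

```python
import numpy as np

def histogram(img, L=256):
    """Normalized histogram p(r_k) = n_k / n of a gray-level image."""
    n_k = np.bincount(img.ravel(), minlength=L)    # pixel count per gray level
    return n_k / img.size                          # p(r_k); the values sum to 1
```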

  22. Histogram equalization
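
A hedged sketch of histogram equalization via the cumulative histogram, s_k = (L-1) · Σ_{j≤k} n_j / n (the discrete form of the usual transformation; the implementation details are ours):

```python
import numpy as np

def equalize(img, L=256):
    """Histogram equalization: remap each gray level r_k to
    s_k = (L - 1) * sum_{j <= k} n_j / n (cumulative histogram)."""
    n_k = np.bincount(img.ravel(), minlength=L)
    cdf = np.cumsum(n_k) / img.size                # cumulative distribution
    mapping = np.round((L - 1) * cdf).astype(np.uint8)
    return mapping[img]                            # apply s = T(r) pixel-wise
```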

23. SPATIAL FILTERING
• Linear filters – using a “mask”
• Nonlinear filters
• Example: fog effect + imprecise edges (blurring)
• smoothing filters = “integrative filters”

  24. Smoothing filters
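
For illustration, a naive Python sketch of two common smoothing filters, the neighborhood average and the median (border pixels are left unchanged; the window size and implementation are our choices):

```python
import numpy as np

def smooth(img, size=3, method="mean"):
    """Smoothing ("integrative") filtering with a size x size window:
    'mean' averages the neighborhood, 'median' takes its median."""
    out = img.astype(float).copy()
    r = size // 2
    for x in range(r, img.shape[0] - r):
        for y in range(r, img.shape[1] - r):
            window = img[x - r:x + r + 1, y - r:y + r + 1]
            out[x, y] = window.mean() if method == "mean" else np.median(window)
    return out
```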

25. Derivative filters
• Gradient filter
• Laplace filter
• “Derivative filters” emphasize the areas of sudden gray-level transition (1st and 2nd derivative of the image function); they are used to identify edges and delimit contours.
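
A sketch of typical derivative kernels and their application (the Sobel and Laplacian masks shown here are common choices, not necessarily the ones used in the original slides; borders are skipped):

```python
import numpy as np

# Common derivative kernels (one of several possible choices)
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])    # 1st derivative, x
SOBEL_Y = SOBEL_X.T                                         # 1st derivative, y
LAPLACE = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])      # 2nd derivative

def apply_kernel(img, kernel):
    """Slide a 3x3 kernel over the image (correlation; borders skipped)."""
    out = np.zeros_like(img, dtype=float)
    for x in range(1, img.shape[0] - 1):
        for y in range(1, img.shape[1] - 1):
            out[x, y] = np.sum(kernel * img[x - 1:x + 2, y - 1:y + 2])
    return out

def gradient_magnitude(img):
    """Approximate |grad f| from the two Sobel responses."""
    gx, gy = apply_kernel(img, SOBEL_X), apply_kernel(img, SOBEL_Y)
    return np.hypot(gx, gy)
```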

26. DICOM standard – Digital Imaging and Communications in Medicine
• The DICOM standard facilitates medical imaging equipment interoperability by specifying:
• a set of mandatory protocols for all equipment that conforms to the standard
• the syntax and semantics of the commands and of the information associated with these protocols
• the information provided by equipment conforming to the standard

27. Short history
• 1970s – computerized tomography, followed by the development of other imaging techniques → need for standards for transferring images and associated information between equipment manufactured by various companies
• 1983 – the American College of Radiology (ACR) and the National Electrical Manufacturers Association (NEMA) formed a committee to develop the DICOM standard (developed and published according to NEMA and ISO/IEC guidelines)
• The standard was developed together with other international standardization organizations:
• CEN TC251 – Europe
• JIRA – Japan
• IEEE
• HL7
• ANSI – USA
• 1988 – DICOM version 2
• 2001 – DICOM version 3 (published by NEMA)

  28. DICOM v.3 standard

29. Modular structure – new facilities can be added
• Introduces “information objects” not only for images and graphics (studies, reports, etc.)
• Sets the method for identifying relationships between “information objects” in a network

  30. BREAK
