Journal of Engineering and Applied Sciences

Year: 2010
Volume: 5
Issue: 4
Page No. 290 - 295

ZKIP and Mutual Information as Measure on Image Analysis

Authors : S. Samundeeswari and M. Thiyagarajan

Abstract: The researchers try to extract the component image virtually hidden in the principal image. Information-theoretic quantities are used to measure the similarity of images in a statistical framework. Statistical methods describe texture in a form suitable for statistical pattern recognition: coarse structure is suggested by large primitives and fine structure by smaller primitives. The co-occurrence matrix method is used for edge frequency detection and separation. Here, we appeal to mutual information to bring out the component image from a complete image presentation by a statistical zero-knowledge interactive protocol (statistical ZKIP).

How to cite this article:

S. Samundeeswari and M. Thiyagarajan, 2010. ZKIP and Mutual Information as Measure on Image Analysis. Journal of Engineering and Applied Sciences, 5: 290-295.

INTRODUCTION

Image processing and machine vision are recent fields of advanced research in computer science and technology (Sonka et al., 1998). Different techniques are adopted to solve problems in the registration, segmentation and splitting of component images from a unified image acquisition. Here, the researchers address the problem of extracting an invisible component from a unified image with a single outlook and overview. The researchers appeal to a similarity measure based on mutual information and cross-correlation. This type of analysis throws light on the study of biomedical and satellite images, capturing the subroutines needed to extract the required component for further analysis. The method can be extended to give a better understanding of 3-dimensional perspectives with the help of the front elevation and side view.

This novel approach can be put in the category of statistical zero-knowledge interactive protocols, in which a statistical difference and a mutual information difference are used with ease. The approach is based on the mutual information difference, giving a statistical interactive protocol for the alignment and rendering of 2-D images (Revathy, 2008).

MATERIALS AND METHODS

Pattern recognition is used for region classification and the basic methods of pattern recognition must be understood in order to study more complex machine vision processes. The theory of pattern recognition is thoroughly discussed by various researchers such as Sonka et al. (2001). In addition, Ross (1997) introduced other related techniques such as graph matching, neural nets, genetic algorithms, simulated annealing and fuzzy logic.

No recognition is possible without knowledge. Decisions about the classes or groups into which recognition objects are classified are based on such knowledge; knowledge about objects and their classes gives the necessary information for object classification. Both specific knowledge about the objects being processed and more general knowledge about object classes are required. The ability to develop relations in classification is studied through similarity measures. Similarity measures based on data manipulation are computed using the cosine amplitude and max-min methods in an uncertain environment. Other methods estimate similarity through exponential functions. The principle of noninteractivity between sets can be introduced under the assumption of independent probability modelling. Measures of information, in particular measures based on mutual information, are found to be most suitable for the extraction of component images well embedded in an image. This notion was studied by statisticians such as Renyi (1970) and other scientists.
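As a rough illustration of the data-driven similarity measures mentioned above, the cosine amplitude and max-min methods can be sketched as follows. This is a minimal sketch in Python; the function names and the assumption of nonnegative feature vectors for the max-min method are ours.

```python
import numpy as np

def cosine_amplitude(x, y):
    """Cosine amplitude similarity: |x . y| / sqrt((x . x)(y . y))."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return abs(np.dot(x, y)) / np.sqrt(np.dot(x, x) * np.dot(y, y))

def max_min(x, y):
    """Max-min similarity for nonnegative feature vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.minimum(x, y).sum() / np.maximum(x, y).sum()

a = np.array([1.0, 2.0, 3.0])
print(cosine_amplitude(a, a))  # identical vectors give similarity 1.0
print(max_min(a, a))           # likewise 1.0
```

Both measures return 1 for identical vectors and decrease as the vectors disagree, which is the behaviour required of a similarity relation.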

Image similarity-based methods are broadly used in the study of medical imaging. A basic image similarity-based method consists of a transformation model, which maps reference image coordinates into the target image space; an image similarity metric, which quantifies the degree of correspondence between features in both image spaces achieved by a given transformation; and an optimization algorithm, which tries to maximize image similarity by changing the transformation parameters. The choice of an image similarity measure depends on the nature of the images to be registered. Uniformity measures are commonly used for registration of images of the same modality. The simplex method and Powell's routine (Haralick, 1979) are commonly used for the registration problem. Neither method requires function derivatives to be calculated. The simplex method considers all degrees of freedom simultaneously and is not known for its speed of convergence. Powell's method (Sonka et al., 2001) optimizes each transformation parameter in turn and is relatively sensitive to local optima in the registration function.

Basic concepts: The researchers start with the following definitions used in the subsequent study.

Definition 1
Entropy:
The entropy (Renyi, 1970) H (X) for the discrete random variable X with probability distribution function p is defined as:

H (X) = H (p) = -Σ_x p (x) log p (x)

where for reasons of continuity we define:

0 log 0 = 0

The entropy of X, H (X) may also be denoted H (p). The notation H (p) emphasizes the dependence of entropy on the probability distribution of X as opposed to the actual intensity values of X.
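As a concrete illustration, the entropy of a distribution (and of an image's intensity histogram) can be computed as follows. This is a minimal Python sketch; the function names and the default of 256 gray levels are our choices.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p(x) log2 p(x), using 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability bins: 0 log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def image_entropy(img, levels=256):
    """Entropy of a grayscale image, with p estimated from its histogram."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    return entropy(hist / hist.sum())

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(entropy([1.0]))        # 0.0: a constant has no uncertainty
```

Note that the result depends only on the histogram p, not on the actual intensity values, exactly as the notation H (p) emphasizes.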

Definition 2
Joint entropy:
The joint entropy, H (X, Y) for the discrete random variables X and Y with joint probability distribution r is defined as:

H (X, Y) = H (r) = -Σ_{x, y} r (x, y) log r (x, y)

Definition 3
Relative entropy:
Relative entropy is a measure of the distance between one probability distribution and another. It measures the error of using an estimated distribution q in place of the true distribution p. The relative entropy, D (p||q), of 2 probability distributions p and q over X is defined as:

D (p || q) = Σ_x p (x) log (p (x)/q (x))

Where for reasons of continuity we define 0 log (0/q) = 0 and p log (p/0) = ∞. A special case of relative entropy is mutual information. Mutual information measures the amount of information shared between 2 random variables or the decrease in the randomness of one random variable due to knowledge of another (Gonzalez et al., 2004).
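Relative entropy can be computed directly from the definition. The sketch below (function name ours) uses base-2 logarithms and applies the continuity convention 0 log (0/q) = 0 by masking out zero-probability terms.

```python
import numpy as np

def relative_entropy(p, q):
    """D(p||q) = sum p(x) log2(p(x)/q(x)), using 0 log(0/q) = 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                     # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

print(relative_entropy([0.5, 0.5], [0.5, 0.5]))  # 0.0: identical distributions
print(relative_entropy([1.0, 0.0], [0.5, 0.5]))  # 1.0: one bit of estimation error
```

Note that D (p || q) is not symmetric in p and q, so it is a distance only in an informal sense.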

Definition 4
Mutual information:
Let X and Y be 2 random variables with probability distributions p and q, respectively and joint probability distribution r. Mutual information, I (X; Y) is the relative entropy between the joint probability distribution, r and the product distribution, d where d (x, y) = p (x) q (y). That is:

I (X; Y) = D (r || d) = Σ_x Σ_y r (x, y) log (r (x, y)/(p (x) q (y)))

If the random variables X and Y are independent then the joint probability distribution is equal to the product distribution, that is, r = d. Thus, mutual information measures the correlation between X and Y relative to X and Y being independent (Haralick, 1979). Mutual information can be expressed in terms of entropy:


I (X; Y) = H (X) + H (Y) - H (X, Y)

The mutual information MI (A, B) of 2 images A and B has 3 equivalent definitions, each of which explains the mutual information differently. The first uses the difference between the entropy of an image A and the entropy of the same image knowing another image B:

MI (A, B) = H (A) - H (A|B) = H (B) - H (B|A)

Here, H (A) measures the information contained in the image A, while H (A|B) measures the amount of information contained in image A when image B is known. The mutual information corresponds to the amount of information that image B carries about image A or, equivalently, the amount of information that image A carries about image B. The second definition refers to the distance Σ_i p_i log (p_i/q_i), which measures the distance between two distributions p and q:

MI (A, B) = Σ_{a, b} p_{ab} log (p_{ab}/(p_a p_b))

This is the distance between the joint distribution p_{ab} of images A and B and the product distribution p_a p_b obtained when images A and B are independent. This definition of mutual information is therefore a measure of the mutual dependence between images A and B, which is largest when images A and B are most similar (Haralick, 1979). The third definition of mutual information is a combination of the entropies of the 2 images, separate and joint:

MI (A, B) = H (A) + H (B) - H (A, B)
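The third form is convenient computationally: estimate the marginal and joint intensity distributions from histograms of the two images and combine their entropies. A minimal Python sketch follows (the function names and bin count are our choices, not from the paper):

```python
import numpy as np

def _H(p):
    """Entropy of a pmf given as an array, with 0 log 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(a, b, bins=32):
    """MI(A, B) = H(A) + H(B) - H(A, B) from the joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1)           # marginal distribution of A
    p_b = p_ab.sum(axis=0)           # marginal distribution of B
    return float(_H(p_a) + _H(p_b) - _H(p_ab.ravel()))

ramp = np.arange(256).reshape(16, 16)
print(mutual_information(ramp, ramp))        # 5.0: identical images give MI = H(A)
rng = np.random.default_rng(0)
a = rng.integers(0, 256, (64, 64))
b = rng.integers(0, 256, (64, 64))
print(mutual_information(a, b, bins=8))      # near 0: independent images share little
```

For identical images the joint histogram is diagonal, so H (A, B) = H (A) and MI reduces to H (A); for independent images MI is close to zero, up to histogram estimation noise.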

Definition 5
Mutual difference:
If x = (x_1, ..., x_n) is a discrete probability distribution then the entropy of x, denoted H (x), is defined as:

H (x) = -Σ_i x_i log x_i

Mutual difference is the promise problem:

MD = (MD_Y, MD_N)

Where:

MD_Y = {(X, Y): H (X) ≥ H (Y) + 1}
MD_N = {(X, Y): H (Y) ≥ H (X) + 1}

and X and Y are circuits encoding probability distributions. Requiring a gap of 1 bit of entropy in the definition of MD is inessential, as any noticeable gap can easily be amplified by replacing each distribution with many independent copies of itself. This contrasts with the definition of statistical difference, in which the thresholds of 2/3 and 1/3 are not arbitrary.
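The amplification step mentioned above relies on entropy being additive over independent copies: k independent copies of X have entropy k·H (X), so an entropy gap between two distributions scales by k. A small numeric check in Python (function names ours):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, with 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def independent_copies(p, k):
    """Joint pmf of k independent copies of a distribution p."""
    joint = np.array([1.0])
    for _ in range(k):
        joint = np.outer(joint, p).ravel()   # product distribution
    return joint

p = np.array([0.9, 0.1])
q = np.array([0.6, 0.4])
gap = entropy(q) - entropy(p)                # a small entropy gap
gap4 = entropy(independent_copies(q, 4)) - entropy(independent_copies(p, 4))
print(gap, gap4)                             # the gap is multiplied by 4
```

Any noticeable gap can therefore be stretched to exceed 1 bit by taking enough copies.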

Problem specification: To extract the component images virtually hidden in the principal image, information-theoretic quantities are used to measure the similarity of images in a statistical framework. Statistical description methods describe textures in a form suitable for statistical pattern recognition: coarse structures are built from large primitives, fine structure from smaller primitives. Co-occurrence matrices are used for edge frequency detection and separation (Mao, 2007). Here, the researchers appeal to mutual information to bring out the component images from a complete image presentation.

As the images become misaligned, the dispersion of their joint histogram increases. Registration of 2 images can therefore be accomplished by minimizing the joint entropy of the images but mutual information is a better criterion, as the marginal entropies H (A) and H (B) are taken into account:

MI (A, B) = H (A) + H (B) – H (A, B)

The optimal transformation can be gained by maximizing mutual information of the 2 images. So if the images are of the same object when they are correctly registered, corresponding pixels in the 2 images will be of the same anatomical or pathological structure. Normalized measure of mutual information is defined as follows:

NMI (A, B) = (H (A) + H (B))/H (A, B)

Normalized mutual information has been shown to be more robust for intermodality registration than standard mutual information.
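Assuming the standard form NMI (A, B) = (H (A) + H (B))/H (A, B), normalized mutual information can be sketched in Python as follows (function names and bin count are ours):

```python
import numpy as np

def _H(p):
    """Entropy of a pmf given as an array, with 0 log 0 = 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def normalized_mutual_information(a, b, bins=32):
    """NMI(A, B) = (H(A) + H(B)) / H(A, B); ranges from 1 up to 2."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    return float((_H(p_ab.sum(axis=1)) + _H(p_ab.sum(axis=0))) / _H(p_ab.ravel()))

ramp = np.arange(256).reshape(16, 16)
print(normalized_mutual_information(ramp, ramp))   # 2.0 for identical images
```

Independent images give NMI near 1 (since H (A) + H (B) = H (A, B)), identical images give 2, and the ratio is less sensitive to the size of the overlap region than raw MI.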

Normalized cross-correlation: The correlation between 2 images (cross-correlation) is a standard approach to feature detection. It can be used as a measure for calculating the degree of similarity between 2 images. Its mathematical definition is:

NCC (A, B) = Σ (A - μ_A) (B - μ_B)/√(Σ (A - μ_A)² Σ (B - μ_B)²)

where μ_A and μ_B are the mean intensities of the images and the sums run over corresponding pixels.

This metric computes the pixel-wise cross-correlation and normalizes it by the square root of the autocorrelation of the images. Misalignment between the images results in small measure values. The metric is insensitive to multiplicative factors between the images and produces a cost function with sharp peaks and well-defined minima. The correlation coefficient is a good measure of alignment for images of the same subject acquired with the same modality at different times, for example to follow changes in the intensity or shape of a structure.
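A minimal Python sketch of normalized cross-correlation for two equal-size images follows (function name ours). Subtracting the means makes the measure insensitive to intensity offsets, and the normalization removes multiplicative factors, as described above.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation (correlation coefficient) of two images."""
    a = np.array(a, dtype=float).ravel()   # copy so the inputs are not modified
    b = np.array(b, dtype=float).ravel()
    a -= a.mean()                          # remove intensity offset
    b -= b.mean()
    return float(np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b)))

img = np.arange(64, dtype=float).reshape(8, 8)
print(ncc(img, img))            # 1.0: perfect alignment
print(ncc(img, 3.0 * img + 7))  # 1.0: insensitive to gain and offset
```

The value lies between -1 and 1 and drops quickly as the images are shifted out of alignment, which is what gives the cost function its sharp peak.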

Method: Mutual information is estimated from image statistics and computed over the region of overlap, that is, the intersection of the image spaces or video frames. In general, the region of overlap grows as the image spaces or video frames become aligned and shrinks as they become misaligned; the region of overlap determines which image pixels contribute to the computation of the statistics. Normalized mutual information is the similarity measure least affected by the overlap statistics. Gradient mutual information determines statistics for the images that include spatial information (Samundeeswari, 2009).
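As an illustration of the overlap idea, registration by translation can be sketched as an exhaustive search for the shift that maximizes mutual information computed only over the overlapping columns. This is a toy Python sketch; the function names, the restriction to horizontal shifts and the search range are our simplifications.

```python
import numpy as np

def _mi(a, b, bins=16):
    """Mutual information from the joint histogram of the overlap region."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    H = lambda x: -np.sum(x[x > 0] * np.log2(x[x > 0]))
    return H(p.sum(axis=1)) + H(p.sum(axis=0)) - H(p.ravel())

def best_horizontal_shift(ref, tgt, max_shift=5):
    """Return the column shift of tgt that maximizes MI over the overlap."""
    best_s, best_mi = 0, -np.inf
    w = ref.shape[1]
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:                             # overlap after shifting tgt right
            overlap_ref, overlap_tgt = ref[:, s:], tgt[:, :w - s]
        else:                                  # overlap after shifting tgt left
            overlap_ref, overlap_tgt = ref[:, :s], tgt[:, -s:]
        mi = _mi(overlap_ref, overlap_tgt)
        if mi > best_mi:
            best_s, best_mi = s, mi
    return best_s

rng = np.random.default_rng(1)
scene = rng.integers(0, 256, (40, 48))
ref, tgt = scene[:, :40], scene[:, 3:43]   # tgt views the scene 3 columns over
print(best_horizontal_shift(ref, tgt))     # recovers the shift: 3
```

Only pixels inside the overlap contribute to the histograms, so the statistics change with the candidate transformation exactly as described above.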

RESULTS AND DISCUSSION

The proposed method was applied to register an image and its component, referred to as the reference image and the target image; the registered image is shown for the three metrics. The pixel values are converted to mod 256 and a transformation is applied to the pixel values at position (x, y) with a function based on the RGB values and the parameters of the linear fit for the pixel values under consideration.

The estimated function values are used in the metrics for mutual information, normalized mutual information and the normalized cross-correlation coefficient. The parameter under consideration is put into different cases based on the residue classes modulo 256. A change of origin and scale helps to obtain the normalized values. The researchers find that the mutual information calculated from the registered image and the target image agree, as shown in Table 1. The image represents a pencil drawing by an artist which can be viewed in 2 perspectives: one as a complete rose, the other a sub-image which can be extracted as a couple in sitting posture.

Case 1: double(cat(4, avi(1:end).cdata))/255; % convert the RGB frames to gray scale for mutual info and normalized mutual info

The proposed method was applied to register two components of the images. Figure 1-6 show the reference image, target image and registered image for all the three metrics MI, NMI and NCC. The resulting values are shown in Table 1 and Fig. 2-4:

Information for A ∪ B
Information for A ∩ B
Use of result for:

H (A, B) = H (A) + H (B) – H (A ∩ B)

H (A), H (B) are the split images.

Statistical ZKIP: Pattern recognition and image analysis require the processing of pictorial and other non-numeric information in science, engineering and technical analysis. The technical details required to compute the different parameters explaining these qualities can be quantified approximately by the design of a numerical database. Approximate reasoning on the knowledge base for segmentation and boundary detection in composite images is the problem addressed here, in which similarity measures based on mutual information and normalized mutual information give the desired results in the registration process. For feature enhancement and simplicity of approach, the problem is framed as a statistical zero-knowledge interactive protocol.

Table 1: Mutual information for split image

Fig. 1: Total image (registered)

Fig. 2: Component image (implemented)

Fig. 3: Total image (registered)

The interaction posed on mutual information:

Using commands from MATLAB and AutoCAD, we could describe points on the boundary of the composite and the component images

Fig. 4: Component image (implemented)

Fig. 5: Total image (registered)

The agreement on the sequence of steps is exchanged between prover and verifier
This protocol raises the performance criteria to a higher order of approximation

With this, we declare that the verifier could segment the component image in the given number of trials on a set with probability >2/3.

Fig. 6: a) component image-1; b) component image-2 (implemented)

This is in agreement with the suggested protocol. The required database is shown in Table 1 and 2. This supports the interaction between prover and verifier. Table 2 shows the exchange of data between the prover and verifier on a specific window. Data points on the boundary and nearby points on the couple are given to the verifier.

Table 2: Exchange of data between prover and verifier

The verifier identifies the couple points on the boundary from the set given to him with probability >2/3. The interaction protocol uses the mutual information on the interaction of the couple and the rose.

Data analysis
MATLAB commands:

imread(filename) = Reads an image from a graphics file
edge(I, 'sobel') = Finds edges in an intensity image
imcrop() = Crops an image
imshow() = Displays an image
pixval() = Displays information about image pixels

AutoCAD commands:

Limits = To set the drawing screen limits
Zoom = To get the picture on the required screen
Line = To draw the picture
Hatch = To hatch the points

CONCLUSION

The use of least squares in the correlation is an intuitive method that allows images to be registered as a whole using the intensities of the images. The probability distributions generally used in such studies take care of distributions having multiple modes. The statistical zero-knowledge interactive protocol based on the mutual information difference gives an easy approach to the study of segmentation problems. Using the AutoCAD commands, the points on the relevant portions of the image are obtained. These points are evaluated at specific windows and used in the statistical interactive protocol between prover and verifier in the process of component extraction.
