IOSR Journal of Electronics and Communication Engineering (IOSR-JECE) e-ISSN: 2278-2834, p-ISSN: 2278-8735. Volume 8, Issue 2 (Nov.-Dec. 2013), PP 52-57 www.iosrjournals.org

'Useful' R-norm Information Measure and its Properties

D.S. Hooda, Keerti Upadhyay and D.K. Sharma
Jaypee University of Engineering and Technology, A.B. Road, Raghogarh-473226, Distt. Guna, M.P. (India)

Abstract: In the present communication, a new 'useful' R-norm information measure has been defined and characterized axiomatically. Its particular cases have been discussed, and the properties of the new measure have also been studied.
Keywords: non-additivity, R-norm entropy, stochastic independence, utility distribution.

I. Introduction

Let us consider the set of positive real numbers not equal to 1, denoted by $\mathbb{R}^{+} = \{R : R > 0,\ R \neq 1\}$, and let

$\Delta_n = \left\{P = (p_1, p_2, \ldots, p_n) : p_i \geq 0 \text{ for each } i,\ \sum_{i=1}^{n} p_i = 1\right\}$, $n \geq 2$,

be the set of all complete finite probability distributions.

[1] studied the R-norm information of the distribution $P$, defined for $R \in \mathbb{R}^{+}$ by:

$H_R(P) = \dfrac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i^R\right)^{1/R}\right]$.   (1)

The R-norm information measure (1) is a real function defined on $\Delta_n$, where $n \geq 2$ and $R$ is a positive real number different from 1. The measure (1) is different from the entropies of [2], [3], [4] and [5]. Its main property is that as $R \to 1$, (1) approaches Shannon's entropy, and as $R \to \infty$, $H_R(P) \to 1 - \max_i p_i$, $i = 1, 2, \ldots, n$. The measure (1) can be generalized in many ways. [6] proposed and characterized the following parametric generalization of (1):

$H_R^{\beta}(P) = \dfrac{R}{R-\beta^2}\left[1 - \left(\sum_{i=1}^{n} p_i^{R/\beta^2}\right)^{\beta^2/R}\right]$, $0 < \beta \leq 1$, $R > 0\ (\neq 1)$.   (2)

The above measure (2) was called the generalized R-norm information measure of degree $\beta$, and it reduces to (1) when $\beta = 1$. Further, when $R = 1$, (2) reduces to:

$H_1^{\beta}(P) = \dfrac{1}{1-\beta^2}\left[1 - \left(\sum_{i=1}^{n} p_i^{1/\beta^2}\right)^{\beta^2}\right]$, $0 < \beta \leq 1$.   (3)

In case $\beta^2$ is replaced by $\beta$, (3) reduces to:

$H^{\beta}(P) = \dfrac{1}{1-\beta}\left[1 - \left(\sum_{i=1}^{n} p_i^{1/\beta}\right)^{\beta}\right]$, $\beta \neq 1$.   (4)

This is an information measure which has been given by [7]. It can be seen that (4) also reduces to Shannon's entropy when $\beta \to 1$.

[8] proposed and studied the following parametric generalization of (1):

$H_R^{\alpha,\beta}(P) = \dfrac{R}{R-\beta^2}\left[1 - \left(\sum_{i=1}^{n} p_i^{\alpha R/\beta^2}\right)^{\beta^2/\alpha R}\right]$, $\alpha \geq 1$, $0 < \beta \leq 1$, $R > 0\ (\neq 1)$.   (5)

They called (5) the generalized R-norm information measure of type $\alpha$ and degree $\beta$. (5) reduces to (2) when $\alpha = 1$, and it further reduces to (1) when $\beta = 1$. Recently, [9] applied (5) in studying the bounds of a generalized mean code length.

In order to distinguish the events $E_1, E_2, \ldots, E_n$ of a physical system with respect to a given qualitative characteristic, we ascribe to each event $E_i$ a non-negative number $u(E_i) = u_i \geq 0$ directly proportional to its importance. We call $u_i$ the utility or importance of the event $E_i$ whose probability of occurrence is $p_i$. In general, $u_i$ is independent of $p_i$ (see [10]).

[11] characterized a quantitative-qualitative measure of the experiment $E$, which was called 'useful' information by [10], given by:

$H(P;U) = H(p_1, p_2, \ldots, p_n; u_1, u_2, \ldots, u_n) = -\sum_i u_i p_i \log p_i$, $u_i > 0$, $0 \leq p_i \leq 1$, $\sum_i p_i = 1$.   (6)

Later on, [12] characterized the following measure of 'useful' information:

$H(P;U) = -\dfrac{\sum_i u_i p_i \log p_i}{\sum_i u_i p_i}$.   (7)
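For concreteness, the two 'useful' information measures (6) and (7) can be computed directly. The following sketch uses natural logarithms and arbitrarily chosen illustrative utilities:

```python
import math

def useful_information(p, u):
    """Belis-Guiasu 'useful' information of eq. (6): -sum u_i p_i log p_i (nats)."""
    return -sum(ui * pi * math.log(pi) for pi, ui in zip(p, u) if pi > 0)

def useful_information_normalized(p, u):
    """Bhaker-Hooda measure of eq. (7): eq. (6) divided by sum u_i p_i."""
    return useful_information(p, u) / sum(ui * pi for pi, ui in zip(p, u))

P = [0.5, 0.3, 0.2]
U = [4.0, 1.0, 2.0]   # utilities: the first outcome is deemed most important

print(useful_information(P, U))
print(useful_information_normalized(P, U))
```

With all utilities equal, the weighting drops out and both measures reduce to Shannon's entropy.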

Analogous to (1), we consider a measure of 'useful' R-norm information as given below:

$H_R(P;U) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_i u_i p_i^R}{\sum_i u_i p_i}\right)^{1/R}\right]$,   (8)

where $U = (u_1, u_2, \ldots, u_n)$ is the utility distribution and $u_i > 0$ is the utility of an event with probability $p_i$. It may be noted that as $R \to 1$, (8) reduces to (7). Further, to each $P \in \Delta_n$ we associate a corresponding utility distribution $U$ from the set of utility distributions on $n$ events.

In the present paper we characterize the 'useful' R-norm information measure (8) axiomatically in section II. In section III we study the properties of the new measure of 'useful' R-norm information.
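The new measure (8) is straightforward to evaluate numerically. The sketch below (our own function names, arbitrary illustrative values) also checks that (8) approaches (7) as $R \to 1$:

```python
import math

def useful_r_norm(p, u, R):
    """'Useful' R-norm information measure of eq. (8); R > 0, R != 1."""
    num = sum(ui * pi ** R for pi, ui in zip(p, u))
    den = sum(ui * pi for pi, ui in zip(p, u))
    return (R / (R - 1.0)) * (1.0 - (num / den) ** (1.0 / R))

def bhaker_hooda(p, u):
    """Eq. (7), in nats."""
    num = -sum(ui * pi * math.log(pi) for pi, ui in zip(p, u) if pi > 0)
    return num / sum(ui * pi for pi, ui in zip(p, u))

P = [0.5, 0.3, 0.2]
U = [4.0, 1.0, 2.0]

# As R -> 1, the 'useful' R-norm measure (8) approaches (7).
print(useful_r_norm(P, U, 1.000001), bhaker_hooda(P, U))
```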

II. Axiomatic Characterization

Let $S_n$, $n = 2, 3, \ldots$, be the domain of pairs $(P;U)$ of probability and utility distributions, and let $G_n$ be a sequence of functions of the $p_i$'s and $u_i$'s, $i = 1, 2, \ldots, n$, defined over $S_n$ and satisfying the following axioms:

Axiom 2.1. $G_n(P;U) = a_1 + a_2 \sum_{i=1}^{n} h(p_i, u_i)$, where $a_1$ and $a_2$ are non-zero constants and $(p, u) \in J = \{(x, y) : 0 \leq x \leq 1,\ 0 < y < \infty\}$. This axiom is also called the sum property.

Axiom 2.2. For $P \in \Delta_n$ and $P' \in \Delta_m$ with utility distributions $U$ and $U'$, $G_{mn}$ satisfies the following property:

$G_{mn}(P \ast P'; U \ast U') = G_n(P;U) + G_m(P';U') - \dfrac{1}{a_1}\, G_n(P;U)\, G_m(P';U')$.

Axiom 2.3. $h(p, u)$ is a continuous function of its arguments $p$ and $u$.

Axiom 2.4. Let all the events be equiprobable and of equal utility; then

$G_n\left(\dfrac{1}{n}, \ldots, \dfrac{1}{n}; u, \ldots, u\right) = \dfrac{R}{R-1}\left[1 - n^{(1-R)/R}\right]$, $n = 2, 3, \ldots$, $R > 0\ (\neq 1)$.

First of all we prove the following three lemmas to facilitate the proof of the main theorem.


Lemma 2.1. From axioms 2.1 and 2.2, it is very easy to arrive at the following functional equation:

$\sum_{i=1}^{n} \sum_{j=1}^{m} h(p_i p_j', u_i u_j') = -\dfrac{a_2}{a_1} \left[\sum_{i=1}^{n} h(p_i, u_i)\right] \left[\sum_{j=1}^{m} h(p_j', u_j')\right]$,   (9)

where $(p_i, u_i), (p_j', u_j') \in J$ for $i = 1, 2, \ldots, n$ and $j = 1, 2, \ldots, m$.

Lemma 2.2. The continuous solution of (9) is the continuous solution of the functional equation:

$h(pp', uu') = -\dfrac{a_2}{a_1}\, h(p, u)\, h(p', u')$.   (10)

Proof: Let $a, b, c, d$ and $a', b', c', d'$ be positive integers such that $1 \leq a \leq a'$, $1 \leq b \leq b'$, $1 \leq c \leq c'$ and $1 \leq d \leq d'$. Setting $n = a' - a + 1 = c' - c + 1$ and $m = b' - b + 1 = d' - d + 1$, take

$p_i = \dfrac{1}{a'}$ $(i = 1, 2, \ldots, a' - a)$, $p_{a'-a+1} = \dfrac{a}{a'}$;  $u_i = \dfrac{1}{c'}$ $(i = 1, 2, \ldots, c' - c)$, $u_{c'-c+1} = \dfrac{c}{c'}$;

$p_j' = \dfrac{1}{b'}$ $(j = 1, 2, \ldots, b' - b)$, $p_{b'-b+1}' = \dfrac{b}{b'}$;  $u_j' = \dfrac{1}{d'}$ $(j = 1, 2, \ldots, d' - d)$, $u_{d'-d+1}' = \dfrac{d}{d'}$.

From equation (9) we have:

$(a'-a)(b'-b)\, h\!\left(\dfrac{1}{a'b'}, \dfrac{1}{c'd'}\right) + (b'-b)\, h\!\left(\dfrac{a}{a'b'}, \dfrac{c}{c'd'}\right) + (a'-a)\, h\!\left(\dfrac{b}{a'b'}, \dfrac{d}{c'd'}\right) + h\!\left(\dfrac{ab}{a'b'}, \dfrac{cd}{c'd'}\right)$
$= -\dfrac{a_2}{a_1}\left[(a'-a)\, h\!\left(\dfrac{1}{a'}, \dfrac{1}{c'}\right) + h\!\left(\dfrac{a}{a'}, \dfrac{c}{c'}\right)\right]\left[(b'-b)\, h\!\left(\dfrac{1}{b'}, \dfrac{1}{d'}\right) + h\!\left(\dfrac{b}{b'}, \dfrac{d}{d'}\right)\right]$.   (11)

Taking $a = b = c = d = 1$ in (11), we get:

$h\!\left(\dfrac{1}{a'b'}, \dfrac{1}{c'd'}\right) = -\dfrac{a_2}{a_1}\, h\!\left(\dfrac{1}{a'}, \dfrac{1}{c'}\right) h\!\left(\dfrac{1}{b'}, \dfrac{1}{d'}\right)$.   (12)

Taking $a = a'$, $c = c'$ in (11) and using (12), we have:

$h\!\left(\dfrac{b}{a'b'}, \dfrac{d}{c'd'}\right) = -\dfrac{a_2}{a_1}\, h\!\left(\dfrac{1}{a'}, \dfrac{1}{c'}\right) h\!\left(\dfrac{b}{b'}, \dfrac{d}{d'}\right)$.   (13)

Again taking $b = b'$, $d = d'$ in (11) and using (12), we get:

$h\!\left(\dfrac{a}{a'b'}, \dfrac{c}{c'd'}\right) = -\dfrac{a_2}{a_1}\, h\!\left(\dfrac{1}{b'}, \dfrac{1}{d'}\right) h\!\left(\dfrac{a}{a'}, \dfrac{c}{c'}\right)$.   (14)

Now (11) together with (12), (13) and (14) reduces to:

$h\!\left(\dfrac{ab}{a'b'}, \dfrac{cd}{c'd'}\right) = -\dfrac{a_2}{a_1}\, h\!\left(\dfrac{a}{a'}, \dfrac{c}{c'}\right) h\!\left(\dfrac{b}{b'}, \dfrac{d}{d'}\right)$.   (15)

Putting $\dfrac{a}{a'} = p$, $\dfrac{c}{c'} = u$, $\dfrac{b}{b'} = p'$, $\dfrac{d}{d'} = u'$ in (15), we get the required result (10).

Next we obtain the general solution of (10).

Lemma 2.3. The general continuous solutions of equation (10) are given by:

$h(p, u) = -\dfrac{a_1}{a_2}\, p^{\alpha} u^{\gamma}$, where $\alpha > 0$, $\gamma > 0$,   (16)

and

$h(p, u) = 0$.   (17)

Proof: Taking $g(p, u) = -\dfrac{a_2}{a_1}\, h(p, u)$ in (10), we have:

$g(pp', uu') = g(p, u)\, g(p', u')$.   (18)

The most general continuous solutions of (18) (refer to [13]) are given by:

$g(p, u) = p^{\alpha} u^{\gamma}$, $\alpha > 0$ and $\gamma > 0$,   (19)

and

$g(p, u) = 0$.   (20)

On substituting $g(p, u) = -\dfrac{a_2}{a_1}\, h(p, u)$ in (19) and (20) we get (16) and (17) respectively. This proves Lemma 2.3 for all rationals $p \in (0,1]$ and $u > 0$; however, by continuity, it holds for all reals $p \in (0,1]$ and $u > 0$.

Theorem 2.1. The measure (8) is determined by axioms 2.1 to 2.4.

Proof: Substituting the solution (16) in axiom 2.1, we have:

$G_n(P;U) = a_1\left[1 - \left(\dfrac{\sum_{i=1}^{n} u_i^{\gamma} p_i^{\alpha}}{\sum_{i=1}^{n} u_i p_i}\right)^{1/\alpha}\right]$, $\alpha > 0$.   (21)

Taking $p_i = \dfrac{1}{n}$ and $u_i = u$ for each $i$ in (21), we have:

$G_n\left(\dfrac{1}{n}, \ldots, \dfrac{1}{n}; u, \ldots, u\right) = a_1\left[1 - n^{(1-\alpha)/\alpha}\, u^{(\gamma-1)/\alpha}\right]$, $n = 2, 3, \ldots$.   (22)

Axiom 2.4 together with (22) gives:

$a_1\left[1 - n^{(1-\alpha)/\alpha}\, u^{(\gamma-1)/\alpha}\right] = \dfrac{R}{R-1}\left[1 - n^{(1-R)/R}\right]$.

It implies

$a_1 = \dfrac{R}{R-1}$, $\alpha = R\ (R \neq 1)$ and $\gamma = 1$.

Putting these values in (21) we have:

$G_n(P;U) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_{i=1}^{n} u_i p_i^R}{\sum_{i=1}^{n} u_i p_i}\right)^{1/R}\right] = H_R(P;U)$.

This completes the proof of Theorem 2.1.

Particular cases: (a) When utilities are ignored, i.e. $u_i = 1$ for each $i$, (8) reduces to (1). (b) Further, as $R \to 1$, (1) reduces to Shannon's entropy [2].

III. Properties of 'Useful' R-norm Information Measure

The 'useful' R-norm information measure $H_R(P;U)$ satisfies the following properties:

Property 3.1. $H_R(P;U)$ is a symmetric function of its arguments, provided the permutation of the $p_i$'s and $u_i$'s is taken jointly:

$H_R(p_1, p_2, \ldots, p_{n-1}, p_n; u_1, u_2, \ldots, u_{n-1}, u_n) = H_R(p_n, p_1, p_2, \ldots, p_{n-1}; u_n, u_1, u_2, \ldots, u_{n-1})$.

Property 3.2. $H_R\left(\dfrac{1}{8}, \dfrac{1}{8}; 1, 1\right) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{1}{8}\right)^{(R-1)/R}\right]$.

Proof: $H_R(P;U) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_i u_i p_i^R}{\sum_i u_i p_i}\right)^{1/R}\right]$ for $i = 1, 2$, i.e.

$H_R(P;U) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{u_1 p_1^R + u_2 p_2^R}{u_1 p_1 + u_2 p_2}\right)^{1/R}\right]$.

Taking $p_1 = \dfrac{1}{8}$, $p_2 = \dfrac{1}{8}$, $u_1 = 1$ and $u_2 = 1$:

$H_R\left(\dfrac{1}{8}, \dfrac{1}{8}; 1, 1\right) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{2(1/8)^R}{2(1/8)}\right)^{1/R}\right] = \dfrac{R}{R-1}\left[1 - \left(\dfrac{1}{8}\right)^{(R-1)/R}\right]$.

Property 3.3. The addition of an event whose probability of occurrence is zero, or whose utility is zero, has no effect on the 'useful' information, i.e.

$H_R(p_1, p_2, \ldots, p_n, 0; u_1, u_2, \ldots, u_n, u_{n+1}) = H_R(p_1, p_2, \ldots, p_n; u_1, u_2, \ldots, u_n) = H_R(p_1, p_2, \ldots, p_n, p_{n+1}; u_1, u_2, \ldots, u_n, 0)$.

Proof: Let us consider

$H_R(p_1, p_2, \ldots, p_n, 0; u_1, u_2, \ldots, u_n, u_{n+1}) = \dfrac{R}{R-1}\left[1 - \left(\dfrac{u_1 p_1^R + u_2 p_2^R + \cdots + u_n p_n^R + u_{n+1} \cdot 0^R}{u_1 p_1 + u_2 p_2 + \cdots + u_n p_n + u_{n+1} \cdot 0}\right)^{1/R}\right] = H_R(P;U)$.

Similarly we can prove that $H_R(p_1, p_2, \ldots, p_n, p_{n+1}; u_1, u_2, \ldots, u_n, 0) = H_R(P;U)$.

Property 3.4. $H_R(P;U)$ satisfies non-additivity of the following form:

$H_R(P \ast Q; U \ast V) = H_R(P;U) + H_R(Q;V) - \dfrac{R-1}{R}\, H_R(P;U)\, H_R(Q;V)$,

where $P \ast Q = (p_1 q_1, \ldots, p_1 q_m, p_2 q_1, \ldots, p_2 q_m, \ldots, p_n q_1, \ldots, p_n q_m)$ and $U \ast V = (u_1 v_1, \ldots, u_1 v_m, u_2 v_1, \ldots, u_2 v_m, \ldots, u_n v_1, \ldots, u_n v_m)$.

Proof: R.H.S. $= H_R(P;U) + H_R(Q;V) - \dfrac{R-1}{R}\, H_R(P;U)\, H_R(Q;V)$

$= \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_i u_i p_i^R}{\sum_i u_i p_i}\right)^{1/R}\right] + \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_j v_j q_j^R}{\sum_j v_j q_j}\right)^{1/R}\right] - \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_i u_i p_i^R}{\sum_i u_i p_i}\right)^{1/R}\right]\left[1 - \left(\dfrac{\sum_j v_j q_j^R}{\sum_j v_j q_j}\right)^{1/R}\right]$

$= \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_i u_i p_i^R}{\sum_i u_i p_i}\right)^{1/R}\left(\dfrac{\sum_j v_j q_j^R}{\sum_j v_j q_j}\right)^{1/R}\right]$

$= \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_{i=1}^{n}\sum_{j=1}^{m} u_i v_j (p_i q_j)^R}{\sum_{i=1}^{n}\sum_{j=1}^{m} u_i v_j p_i q_j}\right)^{1/R}\right]$

$= H_R(P \ast Q; U \ast V) = $ L.H.S.
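Properties 3.3 and 3.4 lend themselves to direct numerical verification. The following sketch (distributions and utilities chosen arbitrarily for illustration) checks both:

```python
def useful_r_norm(p, u, R):
    """'Useful' R-norm information measure of eq. (8)."""
    num = sum(ui * pi ** R for pi, ui in zip(p, u))
    den = sum(ui * pi for pi, ui in zip(p, u))
    return (R / (R - 1.0)) * (1.0 - (num / den) ** (1.0 / R))

P, U = [0.5, 0.3, 0.2], [4.0, 1.0, 2.0]
Q, V = [0.6, 0.4], [1.0, 3.0]
R = 2.5

# Property 3.3: adding a zero-probability event does not change the measure.
assert abs(useful_r_norm(P + [0.0], U + [7.0], R) - useful_r_norm(P, U, R)) < 1e-9

# Property 3.4: non-additivity over the product distributions P*Q and U*V.
PQ = [pi * qj for pi in P for qj in Q]
UV = [ui * vj for ui in U for vj in V]
lhs = useful_r_norm(PQ, UV, R)
hp, hq = useful_r_norm(P, U, R), useful_r_norm(Q, V, R)
rhs = hp + hq - ((R - 1.0) / R) * hp * hq
assert abs(lhs - rhs) < 1e-9
print("properties 3.3 and 3.4 verified")
```

The identity in Property 3.4 holds exactly here because the inner ratio in (8) factorizes over the product distribution, as the proof above shows.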

Property 3.5. Let $A_i$, $A_j$ be two events having probabilities $p_i$, $p_j$ and utilities $u_i$, $u_j$ respectively; then we define the utility $u$ of the compound event $A_i \cup A_j$ as:

$u(A_i \cup A_j) = \dfrac{u_i p_i + u_j p_j}{p_i + p_j}$.   (23)

Theorem 3.1. Under the composition law (23), the following holds:

$H_R(p_1, p_2, \ldots, p_{n-1}, p', p''; u_1, u_2, \ldots, u_{n-1}, u', u'') = H_R(P;U) + (p' + p'')\, H_R\!\left(\dfrac{p'}{p' + p''}, \dfrac{p''}{p' + p''}; u', u''\right)$,

where $p_n = p' + p''$ and $u_n = \dfrac{u'p' + u''p''}{p' + p''}$.

Proof: By (8),

$H_R(p_1, \ldots, p_{n-1}, p', p''; u_1, \ldots, u_{n-1}, u', u'') = \dfrac{R}{R-1}\left[1 - \left(\dfrac{\sum_{i=1}^{n-1} u_i p_i^R + u'p'^R + u''p''^R}{\sum_{i=1}^{n-1} u_i p_i + u'p' + u''p''}\right)^{1/R}\right]$.

By the composition law (23), $u_n p_n = u'p' + u''p''$, so the denominator above equals $\sum_{i=1}^{n} u_i p_i$. Splitting the numerator as $\sum_{i=1}^{n} u_i p_i^R + \left(u'p'^R + u''p''^R - u_n p_n^R\right)$ and regrouping the resulting terms as in (8) gives:

$H_R(p_1, \ldots, p_{n-1}, p', p''; u_1, \ldots, u_{n-1}, u', u'') = H_R(P;U) + (p' + p'')\, H_R\!\left(\dfrac{p'}{p' + p''}, \dfrac{p''}{p' + p''}; u', u''\right)$.

This completes the proof of Theorem 3.1.

IV. Conclusion

The 'useful' R-norm information measure has been defined and characterized for probability distributions $P \in \Delta_n$. This is a new addition to the family of generalized information measures. In the present paper we have considered a physical system having a qualitative characterization in addition to the quantitative one, and have accordingly defined and characterized a new 'useful' R-norm information measure. This measure can further be generalized in many ways and can be applied in source coding when the source symbols have utilities in addition to probabilities of occurrence.

References
[1]. D. E. Boekee and J. C. A. Van der Lubbe, The R-norm information measure, Information and Control, 45, 1980, 136-155.
[2]. C. E. Shannon, A mathematical theory of communication, Bell System Technical Journal, 27, 1948, 379-423.
[3]. A. Renyi, On measures of entropy and information, Selected Papers of Alfred Renyi, 2, 1976, 565-580.
[4]. J. Havrda and F. Charvat, Quantification method of classification processes: concept of structural entropy, Kybernetika, 3, 1967, 30-35.
[5]. Z. Daroczy, Generalized information functions, Information and Control, 16, 1970, 36-51.
[6]. D. S. Hooda and Anant Ram, Characterization of the generalized R-norm entropy, Caribbean Journal of Mathematical and Computer Science, 8, 2002, 18-31.
[7]. S. Arimoto, Information-theoretical considerations on estimation problems, Information and Control, 19, 1971, 181-199.
[8]. D. S. Hooda and D. K. Sharma, Generalized R-norm information measures, Journal of Applied Mathematics, Statistics & Informatics, 4(2), 2008, 153-168.
[9]. D. S. Hooda, Sonali Saxena and D. K. Sharma, A generalized R-norm entropy and coding theorem, International Journal of Mathematical Sciences and Engineering Applications, 5, 2011, 385-393.
[10]. G. Longo, Quantitative-Qualitative Measure of Information (Springer-Verlag, New York, 1972).
[11]. M. Belis and S. Guiasu, A quantitative-qualitative measure of information in cybernetic systems, IEEE Transactions on Information Theory, IT-14, 1968, 593-594.
[12]. U. S. Bhaker and D. S. Hooda, Mean value characterization of 'useful' information measures, Tamkang Journal of Mathematics, 24, 1993, 383-394.
[13]. J. Aczel, Lectures on Functional Equations and Their Applications (Academic Press, New York, 1966).
