Human Communications Wiki

Knowledge has not been treated as a mathematical element, whereas information has long been regarded as a measurable mathematical quantity [1][2]. Can we measure knowledge in the same way as information?

Introduction

Before discussing why knowledge is mathematically intractable, we illustrate a few differences between information and knowledge. The knowledge management paper [3] investigated the Data, Information, Knowledge and Wisdom (DIKW) layers. In this article we omit the discussion of Wisdom, an abstract concept outside the scope of science. The paper [3] shows that knowledge is one level above information from the perspective of human mental processing. Data are primitive observations sensed from internal and external signals, information is the valuable data selected from them, and knowledge is information together with the relationships among its parts, so that knowledge carries additional information such as an importance ordering. The importance ordering in knowledge indicates which pieces of information are more important than others. Knowledge therefore implicitly contains ordering information that does not belong to information itself: each bit of an information sequence has the same importance. Thus, knowledge must be constructed from more than one symbol, whereas information can be treated symbol by symbol independently.
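As a rough illustration of this difference (not taken from the cited papers), the following Python sketch represents information as a list of symbols treated independently, and knowledge as the same symbols plus an importance ordering over them; the symbol names and importance scores are invented for the example.

    # Information: a set of symbols, each treated independently.
    information = ["rain_forecast", "train_schedule", "stock_price"]

    # Knowledge: the same symbols plus a relationship among them,
    # here an importance ordering (the scores are illustrative only).
    importance = {"rain_forecast": 0.2, "train_schedule": 0.7, "stock_price": 0.1}
    knowledge = sorted(information, key=lambda item: importance[item], reverse=True)

    print(knowledge)  # ['train_schedule', 'rain_forecast', 'stock_price']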

Measuring information

In information theory, the state probability of each symbol, $p(x)$, is a fundamental element of processing. For example, the amount of information carried by a given symbol $x$ is represented as

$$I(x) = -\log p(x),$$

where the logarithmic measure is used for mathematical tractability. If the probability of the event represented by a symbol is low, the amount of information it carries is correspondingly high. The other fundamental measure of information quantity is the entropy, which is given by

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$

where $\mathcal{X}$ is the symbol space.
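As a concrete check of these two definitions, the following Python sketch computes the self-information of each symbol and the entropy of a small, made-up four-symbol source, using base-2 logarithms so the results are in bits.

    import math

    # Made-up probabilities for a four-symbol source.
    p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    # Self-information I(x) = -log2 p(x): rarer symbols carry more information.
    for symbol, prob in p.items():
        print(symbol, -math.log2(prob))   # 1.0, 2.0, 3.0, 3.0 bits

    # Entropy H(X) = -sum over the symbol space of p(x) log2 p(x).
    H_X = -sum(prob * math.log2(prob) for prob in p.values())
    print("H(X) =", H_X)                  # 1.75 bits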

Suggested Way to Measure Knowledge Entropy

Now, let us discuss how the amount of knowledge can be measured. Since each piece of knowledge consists of more than one piece of information and cannot be measured independently, we cannot define an 'absolute' amount of knowledge for each knowledge symbol. Instead of defining the amount of each knowledge symbol, let us therefore define the entropy of knowledge as follows:

$$H_K = -\sum_{k \in \{0,1\}} p_k \log p_k = -p_1 \log p_1 - p_0 \log p_0,$$

where we assume that the knowledge symbols can be grouped into two sets, an important group (labelled 1, with probability $p_1$) and a useless group (labelled 0, with probability $p_0 = 1 - p_1$).
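Under this two-group assumption, the knowledge entropy reduces to a binary entropy over the group probabilities. The sketch below computes it for the same made-up source as above, with an assumed split of the symbols into important and useless groups (the grouping itself is an illustrative assumption).

    import math

    # Same made-up source; assume 'a' and 'b' are the important symbols.
    p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    important = {"a", "b"}

    # Group probabilities: p1 = P(important), p0 = P(useless) = 1 - p1.
    p1 = sum(prob for sym, prob in p.items() if sym in important)
    p0 = 1.0 - p1

    # Knowledge entropy H_K = -p1 log2 p1 - p0 log2 p0.
    H_K = -p1 * math.log2(p1) - p0 * math.log2(p0)
    print("H_K =", H_K)   # about 0.811 bits, never more than log2(2) = 1 bit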

Although for analytical simplicity we categorize knowledge into only two sets in this article, according to whether it is important or useless, more general cases can be treated in a straightforward manner. Because we order all knowledge factors into just two sets, 1 for important and 0 for useless, with the assumption of an equal level within each set, the entropy of knowledge is smaller than the entropy of information with the same number of factors:

$$H_K \le H(X).$$

Intuitively, grouping the symbols into two sets discards the distinctions within each set, so the resulting entropy can only decrease, as the numerical check below shows.
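Continuing the same made-up example, the short check below computes both entropies and confirms the inequality; the numbers are purely illustrative.

    import math

    p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # made-up source
    important = {"a", "b"}                               # assumed grouping

    H_X = -sum(prob * math.log2(prob) for prob in p.values())
    p1 = sum(prob for sym, prob in p.items() if sym in important)
    H_K = -p1 * math.log2(p1) - (1 - p1) * math.log2(1 - p1)

    # Grouping the symbols into two sets discards detail, so the
    # knowledge entropy never exceeds the information entropy.
    assert H_K <= H_X
    print(H_K, "<=", H_X)   # about 0.811 <= 1.75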
Then, let us discuss whether there is any way to specify the amount of each individual piece of knowledge.

References

The following items are the citations for this article; they will later be reformatted with the BibTeX system in LaTeX.

  1. T. M. Cover and J. A. Thomas, Elements of Information Theory
  2. S. Haykin, Digital Communications
  3. G. Bellinger, D. Castro, and A. Mills, Data, Information, Knowledge, and Wisdom