Monday, April 19, 2010

Information and knowledge

Common words are ambiguous, indeterminate, and confusing. Still, they are what we have to use every day.

So, here are my working definitions of two very basic words, information and knowledge, each provisionally assembled from online sources.



INFORMATION: 1. a measure of the freedom of choice with which a message is selected from the set of all possible messages; 2. a difference which makes a difference.

Online source 1
INFORMATION THEORY or communication theory, mathematical theory formulated principally by the American scientist Claude E. Shannon to explain aspects and problems of information and communication. While the theory is not specific in all respects, it proves the existence of optimum coding schemes without showing how to find them. For example, it succeeds remarkably in outlining the engineering requirements of communication systems and the limitations of such systems.

In information theory, the term information is used in a special sense; it is a measure of the freedom of choice with which a message is selected from the set of all possible messages. Information is thus distinct from meaning, since it is entirely possible for a string of nonsense words and a meaningful sentence to be equivalent with respect to information content.


Numerically, information is measured in bits (short for binary digit; see binary system). One bit is equivalent to the choice between two equally likely choices. For example, if we know that a coin is to be tossed but are unable to see it as it falls, a message telling whether the coin came up heads or tails gives us one bit of information. When there are several equally likely choices, the number of bits is equal to the logarithm of the number of choices taken to the base two. For example, if a message specifies one of sixteen equally likely choices, it is said to contain four bits of information.

Bibliography

See C. E. Shannon and W. Weaver, The Mathematical Theory of Communication (1949); M. Mansuripur, Introduction to Information Theory (1987).

From "Information Theory," The Columbia Encyclopedia, Sixth Edition. New York: Columbia University Press, 2009.
(Obtained from Questia)
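
To make the quoted arithmetic concrete, here is a quick Python sketch (the function names and the sample sentence are my own, not from the encyclopedia). It reproduces the coin-toss and sixteen-choice examples, and it also illustrates the point that information is distinct from meaning: a sentence and a random shuffle of its characters get the same per-character score, because the measure depends only on symbol frequencies, never on sense.

import math
import random
from collections import Counter

def bits_for_choices(n):
    # Information carried by selecting one of n equally likely messages.
    return math.log2(n)

print(bits_for_choices(2))   # coin toss: 1.0 bit
print(bits_for_choices(16))  # one of sixteen choices: 4.0 bits

def bits_per_symbol(text):
    # Shannon entropy in bits per character, from symbol frequencies alone.
    counts = Counter(text)
    total = len(text)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

sentence = "information is distinct from meaning"
nonsense = "".join(random.sample(sentence, len(sentence)))  # same letters, no sense

print(bits_per_symbol(sentence))  # identical values: the measure sees
print(bits_per_symbol(nonsense))  # frequencies, not meaning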


Online source 2
"In fact, what we mean by information - the elementary unit of information - is a difference which makes a difference". (Bateson [1973], 428).

Bateson, G., 1973, Steps to an Ecology of Mind (Frogmore, St. Albans: Paladin).

From "Semantic Conceptions of Information" in Stanford Encyclopedia of Philosophy
http://plato.stanford.edu/entries/information-semantic/




KNOWLEDGE: the objects, concepts, and relationships created by logical inference from prior knowledge and/or information, together with justification or explanation.

Online source
knowledge (artificial intelligence, information science)

The objects, concepts and relationships that are assumed to exist in some area of interest. A collection of knowledge, represented using some knowledge representation language, is known as a knowledge base, and a program for extending and/or querying a knowledge base is a knowledge-based system.
Knowledge differs from data or information in that new knowledge may be created from existing knowledge using logical inference. If information is truthful data plus meaning, then knowledge is information plus justification/explanation.
From The Free On-line Dictionary of Computing.
Dictionary.com website: http://dictionary.reference.com/browse/knowledge
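
The FOLDOC point that new knowledge is created from existing knowledge by logical inference, and carries a justification, is easy to sketch. The toy forward-chaining knowledge base below is my own illustration of that idea, not anything from FOLDOC; the rule format and the Socrates example are assumptions I picked for brevity.

# A toy knowledge base: facts map to their justifications, and each rule
# derives a new fact (conclusion) from existing facts (premises).
facts = {"Socrates is a man": "given"}
rules = [
    # (premises, conclusion, explanation)
    (["Socrates is a man"], "Socrates is mortal", "all men are mortal"),
]

def forward_chain(facts, rules):
    # Apply every rule repeatedly until no new fact can be inferred.
    changed = True
    while changed:
        changed = False
        for premises, conclusion, explanation in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                # The derived fact is stored with its justification, matching
                # "knowledge is information plus justification/explanation".
                facts[conclusion] = f"inferred from {premises}: {explanation}"
                changed = True
    return facts

for fact, justification in forward_chain(facts, rules).items():
    print(f"{fact}  [{justification}]")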









