Exactly what information is (at least one very useful technical definition of it) is answered by the mathematical theory of information. A rich field with many applications to biology, linguistics, and electronics, the theory is couched in the language of bits, each bit of information conveying one binary choice. [Hence 5 bits, for example, convey 5 such choices and are sufficient to distinguish from among 32 (or 2^5) alternatives, there being 32 (2^5) possible yes-no sequences of length 5.] Bits serve too as units in the numerical measure of such notions as the entropy of information sources, the capacity of communication channels, and the redundancy of messages.
Source: Paulos, J.A. (1991). "Binary numbers and codes," Beyond Innumeracy, p. 26.
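A minimal Python sketch (not from the source) illustrating the two ideas in the excerpt: k bits suffice to distinguish among 2^k alternatives, and entropy measures the average information per symbol of a source. The function names bits_to_distinguish and entropy_bits are hypothetical, chosen here for illustration.

import math

def bits_to_distinguish(n_alternatives: int) -> int:
    """Minimum number of yes/no choices (bits) needed to pick out
    one item from n_alternatives: the smallest k with 2**k >= n."""
    return math.ceil(math.log2(n_alternatives))

def entropy_bits(probabilities) -> float:
    """Shannon entropy in bits of a discrete source with the given
    probability distribution: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# 5 bits suffice for 32 alternatives, as in the excerpt.
print(bits_to_distinguish(32))   # 5

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy_bits([0.5, 0.5]))  # 1.0
print(entropy_bits([0.9, 0.1]))  # ~0.469

Running bits_to_distinguish(32) reproduces the excerpt's example, and the entropy values show why redundancy (a predictable, biased source) lowers the information conveyed per symbol.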
Additional links:
W2tQ 'information'
W2tQ 'Ye Olde paper: 1996 "Humans, information and science"'