Description
As individual needs have arisen in the fields of physics, electrical engineering, and computational science, each field has created its own theories of information to serve as conceptual instruments for advancing its developments. This book provides a coherent consolidation of information theories from these different fields. The author surveys current theories and then introduces the underlying notion of symmetry, showing how information is related to the capacity of a system to distinguish itself. A formal methodology using group theory leads to the application of Burnside's Lemma to count distinguishable states, providing a versatile tool for quantifying complexity and information capacity in any physical system. Written in an informal style, the book is accessible to researchers in physics, chemistry, biology, and computational science, as well as many other fields.
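The counting technique mentioned above can be illustrated with a small, self-contained sketch that is not taken from the book: Burnside's Lemma states that the number of distinguishable states (orbits) of a set acted on by a symmetry group G equals the average number of states left fixed by each group element. The Python example below applies this to binary configurations on a ring of n sites under cyclic rotations; the choice of system and the function names are illustrative assumptions, not the author's own example.

from math import gcd

def burnside_cyclic_binary(n: int) -> int:
    # Count distinguishable binary configurations on a ring of n sites
    # under the cyclic rotation group C_n, via Burnside's Lemma:
    #   orbits = (1/|G|) * sum over g in G of |Fix(g)|.
    # A rotation by k sites fixes exactly 2**gcd(n, k) binary configurations.
    total_fixed = sum(2 ** gcd(n, k) for k in range(n))
    return total_fixed // n  # the Burnside average is always an integer

def brute_force_count(n: int) -> int:
    # Direct check: enumerate all 2**n configurations and count rotation
    # orbits explicitly (only feasible for small n).
    seen, orbits = set(), 0
    for x in range(2 ** n):
        if x in seen:
            continue
        orbits += 1
        bits = format(x, f"0{n}b")
        for k in range(n):
            seen.add(int(bits[k:] + bits[:k], 2))
    return orbits

if __name__ == "__main__":
    for n in range(1, 9):
        assert burnside_cyclic_binary(n) == brute_force_count(n)
        print(n, burnside_cyclic_binary(n))

For n = 3, for example, the lemma gives (8 + 2 + 2) / 3 = 4 distinguishable states, matching the direct enumeration: 000, 111, and the rotation classes of 001 and 011.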