This book gathers concepts of information across diverse fields: physics, electrical engineering, and computational science. It surveys current theories, discusses the underlying notions of symmetry, and shows how a system's capacity for distinguishing its states relates to information. The author develops a formal methodology using group theory, leading to the application of Burnside's Lemma to count distinguishable states. This provides a tool to quantify complexity and information capacity in any physical system.
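For readers unfamiliar with it, Burnside's Lemma states that the number of distinguishable states (orbits) of a state set $X$ under a symmetry group $G$ is
$$ |X/G| \;=\; \frac{1}{|G|} \sum_{g \in G} |X^g|, $$
where $X^g$ is the set of states left fixed by the symmetry operation $g$. As a standard illustration (not one of the book's own examples): two-coloring the corners of a square under its four rotations gives $(16 + 2 + 4 + 2)/4 = 6$ distinguishable colorings. How this counting is applied to physical systems is developed in the text.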
Written in an informal style, the book is equally accessible to researchers in physics, chemistry, biology, and computational science, as well as many other fields.