As individual needs have arisen in the fields of physics, electrical engineering and computational science, each has created its own theories of information to serve as conceptual instruments for advancing developments. This book provides a coherent consolidation of information theories from these different fields. The author gives a survey of current theories and then introduces the underlying notion of symmetry, showing how information is related to the capacity of a system to distinguish itself. A formal methodology using group theory is employed, leading to the application of Burnside's Lemma to count distinguishable states. This provides a versatile tool for quantifying complexity and information capacity in any physical system. Written in an informal style, the book is accessible to researchers in physics, chemistry, biology, computational science, and many other fields.
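To illustrate the kind of counting the blurb refers to: Burnside's Lemma states that the number of distinguishable configurations under a symmetry group equals the average number of configurations fixed by each group element. A minimal sketch (not taken from the book) for the classic necklace-coloring case, where the group is the cyclic group of rotations:

```python
from math import gcd

def count_necklaces(n: int, k: int) -> int:
    """Count distinguishable k-colorings of an n-bead necklace under rotation.

    Burnside's Lemma: the number of orbits equals the average number of
    colorings fixed by each group element. For the cyclic group C_n, a
    rotation by i beads fixes exactly k**gcd(i, n) colorings.
    """
    return sum(k ** gcd(i, n) for i in range(n)) // n

# Example: two-color necklaces with four beads.
print(count_necklaces(4, 2))  # 6 distinguishable necklaces
```

Here the `gcd(i, n)` term counts the independent cycles a rotation induces on the beads; a coloring is fixed only if it is constant on each cycle, giving `k**gcd(i, n)` fixed colorings per rotation.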