This book defines and develops a unifying principle of physics: that of ‘extreme physical information’. The information in question is, perhaps surprisingly, not Shannon or Boltzmann entropy but Fisher information, a simple concept little known to physicists. Both the statistical and the physical properties of Fisher information are developed. This information is shown to be a physical measure of disorder, sharing with entropy the property of monotonic change with time. The information concept is applied ‘phenomenally’ to derive most known physics, from statistical mechanics and thermodynamics to quantum mechanics, the Einstein field equations, and quantum gravity. Many new physical relations and concepts are developed, including new definitions of disorder, time, and temperature. The information principle rests on a new theory of measurement, one that incorporates the observer into the phenomenon being observed. The ‘request’ for data creates the law that ultimately gives rise to the data: the observer creates his or her local reality.
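For orientation, the standard definition of Fisher information may be helpful; it is not taken from the text above but is the conventional form for a probability density p(x; θ) depending on a parameter θ:

\[
I(\theta) \;=\; \int \left( \frac{\partial \ln p(x;\theta)}{\partial \theta} \right)^{2} p(x;\theta)\, dx .
\]

Intuitively, I(θ) measures how sharply the density responds to changes in θ: a sharply peaked, ordered density carries high Fisher information, while a broad, disordered one carries little, which is why it can serve as a measure of disorder.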