Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World
Author: Leslie Valiant
Publisher: Hachette UK
We have effective theories for very few things. Gravity is one, electromagnetism another. But for most things—whether as mundane as finding a mate or as major as managing an economy—our theories are lousy or nonexistent. Fortunately, we don't need them, any more than a fish needs a theory of water to swim; we're able to muddle through. But how do we do it? In Probably Approximately Correct, computer scientist Leslie Valiant presents a theory of the theoryless. The key is “probably approximately correct” learning, Valiant's model of how anything can act without needing to understand what is going on. The study of probably approximately correct algorithms reveals the shared computational nature of evolution and cognition, indicates how computers might possess authentic intelligence, and shows why hacking a problem can be far more effective than developing a theory to explain it. After all, finding a mate is a lot more satisfying than finding a theory of mating. Offering an elegant, powerful model that encompasses all of life's complexity, Probably Approximately Correct will revolutionize the way we look at the universe's greatest mysteries.
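The idea behind "probably approximately correct" learning can be made concrete with the classic textbook example of learning an interval [0, a] on the unit line: with roughly (1/ε)·ln(1/δ) random examples, the learned hypothesis is, with probability at least 1 − δ, wrong on at most an ε fraction of inputs. The sketch below is an illustrative toy, not taken from the book; the target value and helper names are assumptions.

```python
import math
import random

# Toy PAC-learning illustration: learn the interval [0, a] from labeled
# uniform samples by guessing [0, largest positive example].

def learn_threshold(samples):
    """Hypothesize [0, a] as [0, max positive example seen]."""
    positives = [x for x, is_member in samples if is_member]
    return max(positives) if positives else 0.0

epsilon, delta = 0.05, 0.05          # accuracy and confidence parameters
m = math.ceil((1 / epsilon) * math.log(1 / delta))  # PAC sample bound: 60 examples

random.seed(0)
a = 0.7                              # hidden target threshold (an assumption)
samples = [(x, x <= a) for x in (random.random() for _ in range(m))]

a_hat = learn_threshold(samples)
error = a - a_hat  # misclassified probability mass under the uniform distribution
```

The learner never forms a "theory" of the target concept; it simply takes the most conservative hypothesis consistent with the data, and the sample bound guarantees that this is probably (confidence 1 − δ) approximately (error at most ε) correct.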
Being as Communion
Author: William A. Dembski
For a thing to be real, it must be able to communicate with other things. If this is so, then the problem of being receives a straightforward resolution: to be is to be in communion. So the fundamental science, indeed the science that needs to underwrite all other sciences, is a theory of communication. Within such a theory of communication the proper object of study becomes not isolated particles but the information that passes between entities. In Being as Communion philosopher and mathematician William Dembski provides a non-technical overview of his work on information. Dembski attempts to make good on the promise of John Wheeler, Paul Davies, and others that information is poised to replace matter as the primary stuff of reality. With profound implications for theology and metaphysics, Being as Communion develops a relational ontology that is at once congenial to science and open to teleology in nature. All those interested in the intersections of theology, philosophy, and science should read this book.
10th Conference on Computability in Europe, CiE 2014, Budapest, Hungary, June 23-27, 2014, Proceedings
Author: Arnold Beckmann
This book constitutes the refereed proceedings of the 10th Conference on Computability in Europe, CiE 2014, held in Budapest, Hungary, in June 2014. The 42 revised papers presented were carefully reviewed and selected from 78 submissions; they are included together with 15 invited papers in these proceedings. The conference had six special sessions: computational linguistics, bio-inspired computation, history and philosophy of computing, computability theory, online algorithms, and complexity in automata theory.
The book outlines selected projects conducted under the supervision of the author. Moreover, it discusses significant relations between Interactive Granular Computing (IGrC) and numerous dynamically developing scientific domains worldwide, along with features characteristic of the author's approach to IGrC. The results presented are a continuation and elaboration of various aspects of Wisdom Technology, initiated and developed in cooperation with Professor Andrzej Skowron. Based on the empirical findings from these projects, the author explores the following areas: (a) understanding the causes of the theory and practice gap problem (TPGP) in complex systems engineering (CSE); (b) generalizing computing models of complex adaptive systems (CAS) (in particular, natural computing models) by constructing an interactive granular computing (IGrC) model of networks of interrelated, interacting complex granules (c-granules), belonging to a single agent and/or to a group of agents; (c) developing methodologies based on the IGrC model to minimize the negative consequences of the TPGP. The book introduces approaches to the above issues using the proposed IGrC model. In particular, the IGrC model refers to the key mechanisms used to control the processes related to the implementation of CSE projects. One of the main aims was to develop a mechanism of IGrC control over computations that model a project's implementation processes, so as to maximize the chances of the project's success while minimizing the emerging risks. In this regard, IGrC control is usually performed by means of properly selected and enforced (among project participants) project principles. These principles are examples of c-granules, expressed by complex vague concepts (themselves also represented by c-granules). The c-granules evolve over time; in particular, the meaning of the concepts is also subject to change.
This methodology is illustrated using project principles applied by the author during the implementation of the POLTAX, AlgoTradix, Merix, and Excavio projects outlined in the book.
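To give a rough feel for the vocabulary above, the sketch below models c-granules as nodes that belong to one or more agents and record interactions with other granules. All names and structures here are hypothetical simplifications for illustration only, not the author's formal IGrC definitions.

```python
from dataclasses import dataclass, field

# Deliberately simplified sketch: a c-granule belongs to agents and links
# to other granules it interacts with. Names are illustrative assumptions.

@dataclass
class CGranule:
    name: str
    owners: set = field(default_factory=set)    # agents the granule belongs to
    links: list = field(default_factory=list)   # granules it interacts with

    def interact(self, other: "CGranule") -> None:
        """Record a symmetric interaction between two c-granules."""
        self.links.append(other)
        other.links.append(self)

# A project principle and a vague concept, both modeled as c-granules;
# the concept is shared between two agents, as in the book's description
# of c-granules belonging to a single agent and/or a group of agents.
principle = CGranule("weekly risk review", owners={"agent_A"})
concept = CGranule("project success", owners={"agent_A", "agent_B"})
principle.interact(concept)
```

In the book's terms, the enforced project principles and the vague concepts they express would both be c-granules in such a network, evolving as the project unfolds.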