**Author:** Jim Pitman

**Publisher:** Springer Science & Business Media

**Category:** Mathematics

**Page:** 560

This is a text for a one-quarter or one-semester course in probability, aimed at students who have done a year of calculus. The book is organised so a student can learn the fundamental ideas of probability from the first three chapters without reliance on calculus. Later chapters develop these ideas further using calculus tools. The book contains more than the usual number of examples worked out in detail. The most valuable thing for students to learn from a course like this is how to pick up a probability problem in a new setting and relate it to the standard body of theory. The more they see this happen in class, and the more they do it themselves in exercises, the better. The style of the text is deliberately informal. My experience is that students learn more from intuitive explanations, diagrams, and examples than they do from theorems and proofs. So the emphasis is on problem solving rather than theory.

This is a graduate-level textbook on measure theory and probability theory. The book can be used as a text for a two-semester sequence of courses in measure theory and probability theory, with an option to include supplemental material on stochastic processes and special topics. It is intended primarily for first-year Ph.D. students in mathematics and statistics, although mathematically advanced students from engineering and economics would also find the book useful. Prerequisites are kept to the minimal level of an understanding of basic real analysis concepts such as limits, continuity, differentiability, Riemann integration, and convergence of sequences and series. A review of this material is included in the appendix. The book starts with an informal introduction that provides some heuristics into the abstract concepts of measure and integration theory, which are then rigorously developed. The first part of the book can be used for a standard real analysis course for both mathematics and statistics Ph.D. students, as it provides full coverage of topics such as the construction of Lebesgue-Stieltjes measures on the real line and Euclidean spaces, the basic convergence theorems, L^p spaces, signed measures, the Radon-Nikodym theorem, Lebesgue's decomposition theorem and the fundamental theorem of Lebesgue integration on R, product spaces and product measures, and the Fubini-Tonelli theorems. It also provides an elementary introduction to Banach and Hilbert spaces, convolutions, Fourier series, and the Fourier and Plancherel transforms. Thus Part I would be particularly useful for students in a typical Statistics Ph.D. program if a separate course on real analysis is not a standard requirement. Part II (Chapters 6-13) provides full coverage of standard graduate-level probability theory. It starts with Kolmogorov's probability model and Kolmogorov's existence theorem.
It then treats thoroughly the laws of large numbers, including renewal theory and ergodic theorems with applications, and then weak convergence of probability distributions, characteristic functions, the Lévy-Cramér continuity theorem, and the central limit theorem as well as stable laws. It ends with conditional expectations and conditional probability, and an introduction to the theory of discrete-time martingales. Part III (Chapters 14-18) provides a modest coverage of discrete-time Markov chains with countable and general state spaces, MCMC, continuous-time discrete-space jump Markov processes, Brownian motion, mixing sequences, bootstrap methods, and branching processes. It could be used for a topics/seminar course or as an introduction to stochastic processes. Krishna B. Athreya is a professor in the departments of mathematics and statistics and a Distinguished Professor in the College of Liberal Arts and Sciences at Iowa State University. He has been a faculty member at the University of Wisconsin-Madison; the Indian Institute of Science, Bangalore; and Cornell University; and has held visiting appointments in Scandinavia and Australia. He is a fellow of the Institute of Mathematical Statistics, USA; a fellow of the Indian Academy of Sciences, Bangalore; an elected member of the International Statistical Institute; and serves on the editorial boards of several journals in probability and statistics. Soumendra N. Lahiri is a professor in the department of statistics at Iowa State University. He is a fellow of the Institute of Mathematical Statistics, a fellow of the American Statistical Association, and an elected member of the International Statistical Institute.

This book is in two volumes, and is intended as a text for introductory courses in probability and statistics at the second- or third-year university level. It emphasizes applications and logical principles rather than mathematical theory. A good background in freshman calculus is sufficient for most of the material presented. Several starred sections have been included as supplementary material. Nearly 900 problems and exercises of varying difficulty are given, and Appendix A contains answers to about one-third of them. The first volume (Chapters 1-8) deals with probability models and with mathematical methods for describing and manipulating them. It is similar in content and organization to the 1979 edition. Some sections have been rewritten and expanded; for example, the discussions of independent random variables and conditional probability. Many new exercises have been added. In the second volume (Chapters 9-16), probability models are used as the basis for the analysis and interpretation of data. This material has been revised extensively. Chapters 9 and 10 describe the use of the likelihood function in estimation problems, as in the 1979 edition. Chapter 11 then discusses frequency properties of estimation procedures, and introduces coverage probability and confidence intervals. Chapter 12 describes tests of significance, with applications primarily to frequency data.

This is the only book that gives a rigorous and comprehensive treatment, with many examples, exercises, and remarks, at this particular level between the standard first undergraduate course and the first graduate course based on measure theory. There is no competitor to this book. The book can be used in classrooms as well as for self-study.

This is the first half of a text for a two-semester course in mathematical statistics at the senior/graduate level for those who need a strong background in statistics as an essential tool in their career. To study this text, the reader needs a thorough familiarity with calculus, including such things as Jacobians and series, but a somewhat less intense familiarity with matrices, including quadratic forms and eigenvalues. For convenience, these lecture notes were divided into two parts: Volume I, Probability for Statistics, for the first semester, and Volume II, Statistical Inference, for the second. We suggest that the following distinguish this text from other introductions to mathematical statistics. 1. The most obvious thing is the layout. We have designed each lesson for the (U.S.) 50-minute class; those who study independently probably need the traditional three hours for each lesson. Since we have more than (the U.S. again) 90 lessons, some choices have to be made. In the table of contents, we have used a * to designate those lessons which are "interesting but not essential" (INE) and may be omitted from a general course; some exercises and proofs in other lessons are also "INE". We have made lessons of some material which other writers might stuff into appendices. Incorporating this freedom of choice has led to some redundancy, mostly in definitions, which may be beneficial.

This unique book delivers an encyclopedic treatment of classic as well as contemporary large sample theory, dealing with both statistical problems and probabilistic issues and tools. The book is unique in its detailed coverage of fundamental topics. It is written in an extremely lucid style, with an emphasis on the conceptual discussion of the importance of a problem and the impact and relevance of the theorems. There is no other book in large sample theory that matches this book in coverage, exercises and examples, bibliography, and lucid conceptual discussion of issues and theorems.

**Cohesively Incorporates Statistical Theory with R Implementation**

Since the publication of the popular first edition of this comprehensive textbook, the contributed R packages on CRAN have increased from around 1,000 to over 6,000. Designed for an intermediate undergraduate course, Probability and Statistics with R, Second Edition explores how some of these new packages make analysis easier and more intuitive as well as create more visually pleasing graphs.

**New to the Second Edition**

- Improvements to existing examples, problems, concepts, data, and functions
- New examples and exercises that use the most modern functions
- Coverage probability of a confidence interval and model validation
- Highlighted R code for calculations and graph creation

**Gets Students Up to Date on Practical Statistical Topics**

Keeping pace with today's statistical landscape, this textbook expands your students' knowledge of the practice of statistics. It effectively links statistical concepts with R procedures, empowering students to solve a vast array of real statistical problems with R.

**Web Resources**

A supplementary website offers solutions to odd exercises and templates for homework assignments, while the data sets and R functions are available on CRAN.

This book offers a straightforward introduction to the mathematical theory of probability. It presents the central results and techniques of the subject in a complete and self-contained account. As a result, the emphasis is on giving results in simple forms with clear proofs, and on eschewing more powerful forms of theorems that require technically involved proofs. Throughout, there is a wide variety of exercises to illustrate and to develop ideas in the text.

This book provides a versatile and lucid treatment of classic as well as modern probability theory, while integrating it with core topics in statistical theory and some key tools in machine learning. It is written in an extremely accessible style, with elaborate motivating discussions and numerous worked-out examples and exercises. The book has 20 chapters on a wide range of topics, 423 worked-out examples, and 808 exercises. It is unique in its unification of probability and statistics, its coverage, its superb exercise sets, its detailed bibliography, and its substantive treatment of many topics of current importance. The book can be used as a text for a year-long graduate course in statistics, computer science, or mathematics, for self-study, and as an invaluable research reference on probability and its applications. Particularly worth mentioning are the treatments of distribution theory, asymptotics, simulation and Markov chain Monte Carlo, Markov chains and martingales, Gaussian processes, VC theory, probability metrics, large deviations, the bootstrap, the EM algorithm, confidence intervals, maximum likelihood and Bayes estimates, exponential families, kernels and Hilbert spaces, and a self-contained complete review of univariate probability.

This book has exerted a continuing appeal since its original publication in 1970. It develops the theory of probability from axioms on the expectation functional rather than on the probability measure, demonstrating that the standard theory unrolls more naturally and economically this way, and that applications of real interest can be addressed almost immediately. New to this edition are chapters giving an economical introduction to dynamic programming, which is then applied to the allocation problems represented by portfolio selection and the multi-armed bandit. The investment theme is continued with a critical investigation of the concept of risk-free trading and the associated Black-Scholes formula, while another new chapter develops the basic ideas of large deviations.