Fisher, Neyman, and the Creation of Classical Statistics

Author: Erich L. Lehmann
Publisher: Springer Science & Business Media
Total Pages: 123
Release: 2011-07-25
Genre: Mathematics
ISBN: 1441995005

Classical statistical theory—hypothesis testing, estimation, and the design of experiments and sample surveys—is mainly the creation of two men: Ronald A. Fisher (1890-1962) and Jerzy Neyman (1894-1981). Their contributions sometimes complemented each other, sometimes occurred in parallel, and, particularly at later stages, often were in strong opposition. The two men would not be pleased to see their names linked in this way, since throughout most of their working lives they detested each other. Nevertheless, they worked on the same problems, and through their combined efforts created a new discipline. This book by E.L. Lehmann, himself a student of Neyman's, explores the relationship between Neyman and Fisher, as well as their interactions with other influential statisticians, and the statistical history they helped create together. Drawing on direct correspondence and original papers, Lehmann recreates a historical account of the development of the Neyman–Pearson theory, Fisher's dissent from it, and other important statistical theories.

Neyman

Author: Constance Reid
Publisher: Springer Science & Business Media
Total Pages: 338
Release: 1998
Genre: Biography & Autobiography
ISBN: 9780387983578

Jerzy Neyman received the National Medal of Science "for laying the foundations of modern statistics and devising tests and procedures that have become essential parts of the knowledge of every statistician." Until his death in 1981 at the age of 87, Neyman was vigorously involved in the concerns and controversies of the day, a scientist whose personality and activity were integral parts of his contribution to science. His career is thus particularly well-suited for the non-technical life-story which Constance Reid has made her own in such well-received biographies of Hilbert and Courant. She was able to talk extensively with Neyman and have access to his personal and professional letters and papers. Her book will thus appeal to professional statisticians as well as amateurs wanting to learn about a subject which permeates almost every aspect of modern life.

R. A. Fisher, the Life of a Scientist

Author: Joan Fisher Box
Publisher: John Wiley & Sons
Total Pages: 560
Release: 1978
Genre: Biography & Autobiography
ISBN:

Contents: Nature and nurture; In the wilderness; Mathematical statistics; Rothamsted Experimental Station; Tests of significance; The design of experiments; The genetical theory of natural selection; The evolution of dominance; The role of a statistician; Galton Professor of Eugenics; Evolutionary ideas; In the United States and India; Blood groups in man; Losses of war; Arthur Balfour Professor of Genetics; The biometrical movement; Scientific inference; Retirement.

Classic Topics on the History of Modern Mathematical Statistics

Author: Prakash Gorroochurn
Publisher: John Wiley & Sons
Total Pages: 776
Release: 2016-03-29
Genre: Mathematics
ISBN: 1119127939

"There is nothing like it on the market...no others are as encyclopedic...the writing is exemplary: simple, direct, and competent." —George W. Cobb, Professor Emeritus of Mathematics and Statistics, Mount Holyoke College Written in a direct and clear manner, Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times presents a comprehensive guide to the history of mathematical statistics and details the major results and crucial developments over a 200-year period. Presented in chronological order, the book features an account of the classical and modern works that are essential to understanding the applications of mathematical statistics. Divided into three parts, the book begins with extensive coverage of the probabilistic works of Laplace, who laid much of the foundations of later developments in statistical theory. Subsequently, the second part introduces 20th century statistical developments including work from Karl Pearson, Student, Fisher, and Neyman. Lastly, the author addresses post-Fisherian developments. Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times also features: A detailed account of Galton's discovery of regression and correlation as well as the subsequent development of Karl Pearson's X2 and Student's t A comprehensive treatment of the permeating influence of Fisher in all aspects of modern statistics beginning with his work in 1912 Significant coverage of Neyman–Pearson theory, which includes a discussion of the differences to Fisher’s works Discussions on key historical developments as well as the various disagreements, contrasting information, and alternative theories in the history of modern mathematical statistics in an effort to provide a thorough historical treatment Classic Topics on the History of Modern Mathematical Statistics: From Laplace to More Recent Times is an excellent reference for academicians with a mathematical background who are teaching or studying the history or philosophical controversies of mathematics and statistics. The book is also a useful guide for readers with a general interest in statistical inference.

Statistics on the Table

Author: Stephen M. Stigler
Publisher: Harvard University Press
Total Pages: 514
Release: 2002-09-30
Genre: History
ISBN: 9780674009790

This lively collection of essays examines statistical ideas with an ironic eye for their essence and what their history can tell us about current disputes. The topics range from 17th-century medicine and the circulation of blood, to the cause of the Great Depression, to determinations of the shape of the Earth and the speed of light.

Statistics in Food Science and Nutrition

Author: Are Hugo Pripp
Publisher: Springer Science & Business Media
Total Pages: 71
Release: 2012-09-10
Genre: Technology & Engineering
ISBN: 1461450098

Many statistical innovations are linked to applications in food science. For example, Student's t-test (a statistical method) was developed to monitor the quality of stout at the Guinness Brewery, and multivariate statistical methods are applied widely in the spectroscopic analysis of foods. Nevertheless, statistical methods are most often associated with engineering, mathematics, and the medical sciences, and are rarely thought to be driven by food science. Consequently, there is a dearth of statistical methods aimed specifically at food science, forcing researchers to utilize methods intended for other disciplines. The objective of this Brief is to highlight the most needed and relevant statistical methods in food science and thus eliminate the need to learn about them from other fields. All methods and their applications are illustrated with examples from the research literature.
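
As a concrete illustration of the kind of test the blurb alludes to, here is a minimal sketch of a two-sample t-test in Python; the batch measurements, variable names, and the choice of Welch's variant are assumptions made for this example, not material from the Brief.

```python
# Minimal sketch: comparing two hypothetical brewing batches with a two-sample
# t-test (the kind of quality-control problem "Student" faced at Guinness).
# The numbers below are invented purely for illustration.
from scipy import stats

batch_a = [4.2, 4.5, 4.1, 4.4, 4.3, 4.6]  # hypothetical measurements, batch A
batch_b = [4.0, 4.1, 3.9, 4.2, 4.0, 4.1]  # hypothetical measurements, batch B

# Welch's variant (equal_var=False) avoids the equal-variance assumption.
t_stat, p_value = stats.ttest_ind(batch_a, batch_b, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```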

Statistical Inference as Severe Testing

Author: Deborah G. Mayo
Publisher: Cambridge University Press
Total Pages: 503
Release: 2018-09-20
Genre: Mathematics
ISBN: 1108563309

Mounting failures of replication in the social and biological sciences give a new urgency to critically appraising proposed reforms. This book pulls back the cover on disagreements between experts charged with restoring integrity to science. It denies two pervasive views of the role of probability in inference: to assign degrees of belief, and to control error rates in the long run. If statistical consumers are unaware of the assumptions behind rival evidence reforms, they can't scrutinize the consequences that affect them (in personalized medicine, psychology, etc.). The book sets sail with a simple tool: if little has been done to rule out flaws in inferring a claim, then it has not passed a severe test. Many methods advocated by data experts do not stand up to severe scrutiny and are in tension with successful strategies for blocking or accounting for cherry picking and selective reporting. Through a series of excursions and exhibits, the philosophy and history of inductive inference come alive. Philosophical tools are put to work to solve problems about science and pseudoscience, induction and falsification.

Permutation Statistical Methods

Author: Kenneth J. Berry
Publisher: Springer
Total Pages: 634
Release: 2016-05-03
Genre: Mathematics
ISBN: 3319287702

This research monograph provides a synthesis of a number of statistical tests and measures which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. The topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. The book is written for a statistically informed audience and can also easily serve as a textbook in a graduate course in departments such as statistics, psychology, or biology. In particular, the audience for the book is teachers of statistics, practicing statisticians, applied statisticians, and quantitative graduate students in fields such as medical research, epidemiology, public health, and biology.
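
To show what "depending only on the data at hand" looks like in practice, here is a minimal sketch of a two-sample permutation test on the difference in means; the data, the test statistic, and the number of permutations are illustrative assumptions and are not drawn from the monograph.

```python
# Minimal sketch of a two-sample permutation test: repeatedly relabel the pooled
# observations at random and count how often the relabeled difference in means
# is at least as extreme as the one observed. No normality assumption is needed.
import numpy as np

group_a = np.array([12.1, 11.8, 12.4, 12.9, 11.5])  # invented data
group_b = np.array([10.9, 11.2, 10.7, 11.4, 11.0])  # invented data

observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])
rng = np.random.default_rng(0)

n_perm = 10_000
extreme = 0
for _ in range(n_perm):
    shuffled = rng.permutation(pooled)
    diff = shuffled[:group_a.size].mean() - shuffled[group_a.size:].mean()
    if abs(diff) >= abs(observed):  # two-sided comparison
        extreme += 1

p_value = extreme / n_perm
print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")
```

With more permutations the p-value estimate stabilizes; in practice a measure of effect size would be reported alongside it, as the monograph's comparisons do.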

Randomization, Masking, and Allocation Concealment

Author: Vance Berger
Publisher: CRC Press
Total Pages: 251
Release: 2017-10-30
Genre: Mathematics
ISBN: 1315305100

Randomization, Masking, and Allocation Concealment is indispensable for any trial researcher who wants to use state-of-the-art randomization methods and to be able to describe these methods correctly. Far too often the subtle nuances that distinguish proper randomization from flawed randomization are completely ignored in trial reports that state only that randomization was used, with no additional information. Experience has shown that in many cases the type of randomization that was used was flawed. It is only a matter of time before medical journals and regulatory agencies come to realize that we can no longer rely on (or publish) flawed trials, and that flawed randomization in and of itself disqualifies a trial from being robust or high quality, even if it is otherwise well conducted. This book will help to clarify the role randomization plays in ensuring internal validity and in drawing valid inferences from the data. The various chapters cover a variety of randomization methods and are not limited to the most common (and most flawed) ones. Readers will come away with a profound understanding of what constitutes a valid randomization procedure, so that they can distinguish the valid from the flawed among not only existing methods but also methods yet to be developed.
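
As one concrete example of a randomization scheme of the general kind the book discusses, the sketch below implements permuted-block randomization with varying block sizes for a two-arm trial; the block sizes, arm labels, and function name are assumptions made for illustration and do not reproduce any specific procedure recommended in the book.

```python
# Minimal sketch of permuted-block randomization for a two-arm trial.
# Varying the block size makes upcoming assignments harder to predict,
# which supports allocation concealment. Illustration only.
import random

def block_randomize(n_subjects, block_sizes=(4, 6), seed=None):
    """Return a list of 'A'/'B' assignments of length n_subjects."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_subjects:
        size = rng.choice(block_sizes)      # pick a block size at random
        block = ["A"] * (size // 2) + ["B"] * (size // 2)
        rng.shuffle(block)                  # permute treatments within the block
        allocation.extend(block)
    return allocation[:n_subjects]

print(block_randomize(20, seed=42))
```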