STUDENTS’ REASONING ABOUT p-VALUES

2015 ◽  
Vol 14 (2) ◽  
pp. 7-27
Author(s):  
BIRGIT C. AQUILONIUS ◽  
MARY E. BRENNER

Results from a study of 16 community college students are presented. The research question concerned how students reasoned about p-values. Students' approach to p-values in hypothesis testing was procedural. Students viewed p-values as something that one compares to alpha values in order to arrive at an answer and did not attach much meaning to p-values as an independent concept. It is therefore not surprising that students were often puzzled about how to translate their statistical answer into an answer to the question asked in the problem. Some reflections on how instruction in statistical hypothesis testing can be improved are given. First published November 2015 at Statistics Education Research Journal Archives
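For concreteness, the procedural "compare p to alpha" step the abstract describes might look like the following minimal Python sketch; the one-sample t-test and the data are illustrative only and are not drawn from the study.

```python
# Minimal sketch of the procedural "compare p to alpha" reasoning described above.
# The test and data are illustrative, not taken from the study.
from scipy import stats

sample = [2.9, 3.4, 3.1, 2.7, 3.8, 3.2, 3.5, 2.8]  # hypothetical sample
alpha = 0.05

# One-sample t-test of H0: population mean = 3.0
t_stat, p_value = stats.ttest_1samp(sample, popmean=3.0)

# The purely procedural reading: p versus alpha yields a reject/fail-to-reject
# answer, but by itself says nothing about the original research question.
if p_value < alpha:
    print(f"p = {p_value:.3f} < {alpha}: reject H0")
else:
    print(f"p = {p_value:.3f} >= {alpha}: fail to reject H0")
```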

2019 ◽  
Vol 81 (8) ◽  
pp. 535-542
Author(s):  
Robert A. Cooper

Statistical methods are indispensable to the practice of science. But statistical hypothesis testing can seem daunting, with P-values, null hypotheses, and the concept of statistical significance. This article explains the concepts associated with statistical hypothesis testing using the story of “the lady tasting tea,” then walks the reader through an application of the independent-samples t-test using data from Peter and Rosemary Grant's investigations of Darwin's finches. Understanding how scientists use statistics is an important component of scientific literacy, and students should have opportunities to use statistical methods like this in their science classes.
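A minimal sketch of an independent-samples t-test of the kind the article walks through is shown below; the beak-depth values are hypothetical and are not the Grants' actual measurements.

```python
# Illustrative independent-samples t-test; the beak depths are made up for
# demonstration and are not the Grants' finch data.
from scipy import stats

# Hypothetical beak depths (mm) for two groups of medium ground finches
before_drought = [8.6, 8.8, 9.1, 8.4, 9.0, 8.7, 8.9, 8.5]
after_drought = [9.4, 9.7, 9.2, 9.6, 9.9, 9.3, 9.5, 9.8]

# Welch's variant (equal_var=False) avoids assuming equal group variances
t_stat, p_value = stats.ttest_ind(before_drought, after_drought, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```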


2010 ◽  
Vol 9 (1) ◽  
pp. 68-96
Author(s):  
HOLLYLYNNE STOHL LEE ◽  
ROBIN L. ANGOTTI ◽  
JAMES E. TARR

We examined how middle school students reason about results from a computer-simulated die-tossing experiment, including various representations of data, to support or refute an assumption that the outcomes on a die are equiprobable. We used students’ actions with the software and their social interactions to infer their expectations and whether or not they believed their empirical data could be used to refute an assumption of equiprobable outcomes. Comparisons across students illuminate intricacies in their reasoning as they collect and analyze data from the die tosses. Overall, our research contributes to understanding how students can engage in informal hypothesis testing and use data from simulations to make inferences about a probability distribution. First published May 2010 at Statistics Education Research Journal Archives
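The kind of simulation-based reasoning the abstract describes can be sketched as follows; this is illustrative Python, not the software environment used in the study, and the formal chi-square test stands in for the students' informal comparison of observed and expected counts.

```python
# Sketch of a die-tossing simulation followed by a goodness-of-fit check of the
# equiprobable assumption. Illustrative only; not the study's software.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
tosses = rng.integers(1, 7, size=600)            # 600 simulated tosses of a fair die
observed = np.bincount(tosses, minlength=7)[1:]  # counts for faces 1-6
expected = np.full(6, len(tosses) / 6)           # 100 per face if equiprobable

chi2, p_value = stats.chisquare(observed, expected)
print("Observed counts:", observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```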


2017 ◽  
Vol 16 (1) ◽  
pp. 55-65
Author(s):  
PATRICK WHITE ◽  
STEPHEN GORARD

Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on delivery. The emphasis has generally been on increased use of complex techniques, specialist software and, most importantly in the context of this paper, a continued focus on inferential statistical tests, often at the expense of other types of analysis. We argue that this ‘business as usual’ approach to the content of statistics teaching is problematic for several reasons. First, the assumptions underlying inferential statistical tests are rarely met, meaning that students are being taught analyses that should only be used very rarely. Secondly, all of the most common outputs of inferential statistical tests – p-values, standard errors and confidence intervals – suffer from a similar logical problem that renders them at best useless and at worst misleading. Eliminating inferential statistical tests from statistics teaching (and practice) would avoid the creation of another generation of researchers who either do not understand, or knowingly misuse, these techniques. It would also have the benefit of removing one of the key barriers to students’ understanding of statistical analysis. First published May 2017 at Statistics Education Research Journal Archives
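For readers unfamiliar with the three outputs the authors criticise, the following hedged sketch shows how a p-value, standard error, and confidence interval are typically computed for a sample mean; the data are hypothetical and the example is not drawn from the paper.

```python
# Minimal illustration of the three outputs discussed above (p-value, standard
# error, confidence interval) for a sample mean; hypothetical data.
import numpy as np
from scipy import stats

sample = np.array([4.1, 5.3, 4.8, 5.9, 4.4, 5.1, 4.7, 5.6, 5.0, 4.9])
n = len(sample)
mean = sample.mean()
se = sample.std(ddof=1) / np.sqrt(n)  # standard error of the mean

t_stat, p_value = stats.ttest_1samp(sample, popmean=4.5)  # p-value vs. H0: mu = 4.5
ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean, scale=se)  # 95% CI

print(f"SE = {se:.3f}, p = {p_value:.3f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```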


Author(s):  
Helena Kraemer

“As ye sow, so shall ye reap”: For almost 100 years, researchers have been taught that the be-all and end-all in data-based research is the p-value. The resulting problems have now generated concern, often voiced by those of us who have long taught researchers to work this way. We must bear a major responsibility for the present situation and must alter our teachings. Although the Zhang and Hughes paper is titled “Beyond p-value”, its focus remains entirely on statistical hypothesis testing studies (HTS) and p-values (1). Instead, I would propose that there are three distinct, necessary, and important phases of research: 1) Hypothesis Generation Studies (HGS) or Exploratory Research (2-4); 2) Hypothesis Testing Studies (HTS); 3) Replication and Application of Results. Of these, HTS is undoubtedly the most important, but without HGS, HTS is often weak and wasteful, and without Replication and Application, the results of HTS are often misleading.


2015 ◽  
Vol 14 (1) ◽  
pp. 90-111
Author(s):  
THOMAS P. HOGAN ◽  
BRIAN A. ZABOSKI ◽  
TIFFANY R. PERRY

How does the student untrained in advanced statistics interpret results of research that reports a group difference? In two studies, statistically untrained college students were presented with abstracts or professional associations’ reports and asked to estimate the scores obtained by the original participants in the studies. These estimates were converted to inferred effect sizes and compared with the actual effect sizes. Inferred effect sizes substantially overestimated actual effect sizes for all reports, a phenomenon dubbed the tall-tale effect. The effect was obtained with a variety of reports and statistics. The tall-tale effect could be controlled somewhat with simple changes in wording. This finding suggests a program of research aimed at better calibrating readers’ inferences with the effect sizes actually obtained in the research. First published May 2015 at Statistics Education Research Journal Archives
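The abstract does not specify how score estimates were converted to effect sizes; one standard conversion is the pooled-standard-deviation form of Cohen's d, sketched below with hypothetical group scores.

```python
# Illustrative computation of a standardized effect size (Cohen's d) from two
# groups of scores; one common way an "inferred effect size" might be obtained.
import numpy as np

def cohens_d(group1, group2):
    """Cohen's d using the pooled standard deviation."""
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

# Hypothetical score estimates for two groups described in a report
print(f"d = {cohens_d([78, 82, 75, 80, 85], [70, 68, 74, 72, 69]):.2f}")
```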


2016 ◽  
Vol 15 (2) ◽  
pp. 179-196
Author(s):  
SYLVIA KUZMAK

Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics depends on building a “mature” understanding of common random phenomena such as the rolling of dice or the blind drawing of balls from an urn. An analysis of the verbalizations of 24 college students who interacted with random phenomena involving mixtures of colored marbles is presented, using cognitive schemas to represent the subjects’ expressed understanding. A cognitive schema representing a mature understanding is contrasted with a diversity of observed immature understandings. Teaching to explicitly build the mature cognitive schema is proposed. First published November 2016 at Statistics Education Research Journal Archives


2017 ◽  
Vol 16 (1) ◽  
pp. 66-73
Author(s):  
JAMES NICHOLSON ◽  
JIM RIDGWAY

White and Gorard make important and relevant criticisms of some of the methods commonly used in social science research, but go further by criticising the logical basis for inferential statistical tests. This paper comments briefly on matters on which we broadly agree with them and more fully on matters where we disagree. We agree that too little attention is paid to the assumptions underlying inferential statistical tests and to the design of studies, and that p-values are often misinterpreted. We show why we believe their argument concerning the logic of inferential statistical tests is flawed and how White and Gorard misrepresent the protocols of such tests, and we make brief suggestions for rebalancing the statistics curriculum. First published May 2017 at Statistics Education Research Journal Archives


2004 ◽  
Vol 4 (3) ◽  
pp. 58-58
Author(s):  
Flavia Jolliffe ◽  
Iddo Gal
