Welcome to CARMA’s First Webcast

Dr. Jeff Stanton, Syracuse University

Null and Equivalence Hypothesis Testing

October 2nd, 2020 / 12:00 – 1:30 pm ET (Reception to Follow)

PowerPoint Slides

Jeffrey M. Stanton, Ph.D. (University of Connecticut, 1997) is a Professor in the School of Information Studies at Syracuse University. Stanton’s academic specialty is applied data science. Dr. Stanton served as action editor for the journal Human Resource Management from 2004 to 2011, and he currently serves on the editorial board of Organizational Research Methods, the premier methodological journal in the field of management.

Stanton has published research on job satisfaction, work-related stress, psychometrics, and statistics, with a focus on self-report techniques. He has conducted projects that apply the principles of behavioral science and organizational research to understanding the interactions of people and technology in institutional contexts. His background also includes years of experience in start-up companies. For example, Stanton worked as a human resources analyst for Applied Psychological Techniques, an HR consulting firm based in Darien, Connecticut. His projects at this firm included the creation, implementation, and assessment of a performance appraisal system; the development of a selection battery for customer service representatives; and the creation of a job classification and work standards system for over 350 positions in the public utilities industry.

He has written four books: Reasoning with Data: An Introduction to Traditional and Bayesian Statistics with R; An Introduction to Data Science, with lead author and fellow iSchool professor Jeffrey Saltz; Information Nation: Education and Careers in the Emerging Information Professions, with Dr. Indira Guzman and Dr. Kathryn Stam; and The Visible Employee: Using Workplace Monitoring and Surveillance to Protect Information Assets Without Compromising Employee Privacy or Trust, with Dr. Kathryn Stam.

Abstract

Everyone Talks About the Null, But Almost Nobody Does Anything About It!

Testing and rejecting the null hypothesis is a routine part of quantitative research, but relatively few organizational researchers prepare for confirming the null or, similarly, for testing a hypothesis of equivalence (e.g., that two group means are practically identical). Both theory and practice could benefit from greater attention to this capability. Planning ahead for equivalence testing also provides helpful input for ensuring sufficient statistical power in a study. This CARMA webcast provides a brief tutorial on two frequentist and two Bayesian techniques for testing a hypothesis of no non-trivial effect. The webcast is based on a 2020 article in Organizational Research Methods entitled “Evaluating Equivalence and Confirming the Null in the Organizational Sciences.”
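As a concrete illustration of one frequentist approach of the kind the webcast covers, the sketch below applies the two one-sided tests (TOST) procedure to simulated data in Python. The groups, the equivalence bound (delta), and all numeric values are hypothetical assumptions used only for illustration; they are not drawn from the webcast or the article.

    # TOST equivalence test for two independent groups (illustrative sketch).
    # All data and the bound `delta` are simulated/assumed, not from the webcast.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2020)
    group_a = rng.normal(loc=5.0, scale=1.0, size=60)  # e.g., ratings under condition A
    group_b = rng.normal(loc=5.1, scale=1.0, size=60)  # e.g., ratings under condition B

    delta = 0.5   # smallest raw-score difference considered non-trivial (assumed)
    alpha = 0.05  # significance level for each one-sided test

    # Lower-bound test: H0: mu_a - mu_b <= -delta  vs.  H1: mu_a - mu_b > -delta
    _, p_lower = stats.ttest_ind(group_a + delta, group_b, alternative="greater")

    # Upper-bound test: H0: mu_a - mu_b >= +delta  vs.  H1: mu_a - mu_b < +delta
    _, p_upper = stats.ttest_ind(group_a - delta, group_b, alternative="less")

    if max(p_lower, p_upper) < alpha:
        print(f"Equivalence supported within ±{delta} (p_lower={p_lower:.4f}, p_upper={p_upper:.4f})")
    else:
        print(f"Equivalence not supported (p_lower={p_lower:.4f}, p_upper={p_upper:.4f})")

Rejecting both one-sided nulls at the chosen alpha supports the conclusion that the true mean difference lies within the ±delta bounds; fixing delta in advance is also what makes a prospective power analysis for the equivalence test possible.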

Registration Instructions

  • Log in to your CARMA account.
  • Once you are logged in, you will see an option in the middle of the page to “Purchase Subscription/Make Reservation”.
  • Select the appropriate event and check out.
  • You will receive an e-mail with access information a few days before the event.

Presenter’s Contributions to CARMA

CARMA Short Courses by Dr. Jeff Stanton

  • Introduction to Data Mining with R
  • Introduction to Big Data and Text Mining with R
  • Statistical Analysis of Big Data with R

CARMA Recordings by Dr. Jeff Stanton

  • Issues with Internet Data Collection
  • Ramp Up Big Data Research and Teaching with R
  • Association Rules Mining, Bayes and How to Use Markdown

Upcoming CARMA Events

  • Sept. 9 – Members Day
  • Oct. 2 – First Webcast of the 2020-21 Academic Year
  • Oct. 9 – First Topic Interest Group Meeting (Structural Equation Methods)
  • Oct. 21 – Second Webcast of the 2020-21 Academic Year
  • Oct. 30 – First Topic Interest Group Meeting (Multilevel Analysis)

Other CARMA Recordings on Similar Topics

  • Big Data Analytics – Dr. Fred Oswald
  • Big Data Concepts – Dr. Sang Eun Woo
  • Why and How to Replace Statistical Significance Tests with Better Methods – Dr. Andreas Schwab
  • Storytelling Through Statistics – Dr. Lisa Lambert
  • Questionable Research Practices – Dr. George Banks
  • Statistical Analysis with Big Data – Dr. Fred Oswald
  • Inductive Research Approaches – Dr. Paul Spector
  • Ramp Up Big Data Research and Teaching with R – Dr. Jeffrey Stanton
  • Lies My Statistics/Methods Teacher Taught Me – Dr. Charles Reichardt
  • Power Analysis for Traditional and Modern Hypothesis Tests – Dr. Kevin Murphy