October 7, 2019

The first two presentations of the CARMA Webcast Lecture Series originated from the new home of CARMA, the Rawls College of Business at Texas Tech University. Kicking off the 2019-2020 series on September 6 with the Larry James Memorial Lecture was Dr. Paul Bliese of the University of South Carolina. Dr. Bliese is the current Editor of Organizational Research Methods and a long-time CARMA Short Course Instructor, and his presentation, Panel Data: Methodological Frameworks and Analytic Tools, was viewed at Rawls by a live audience of over 30 faculty and graduate students. His lecture discussed the analysis of panel data by macro- and micro-oriented researchers, emphasizing the theoretical questions that different methodological frameworks can answer and showing how different analytic approaches often provide identical answers.

The second presentation, by Dr. Justin DeSimone of the University of Alabama, occurred on October 4. In presenting Recommendations for Discouraging, Identifying, and Removing Dirty Data in Survey Research, Dr. DeSimone focused on problematic, undesirable, and low-quality data (or “dirty data”) in survey research. Topics covered included (a) ways to discourage participants from providing low-quality or low-effort data, (b) methods of identifying dirty data in a dataset, and (c) the debate about when to retain and when to remove dirty data.

Recordings of both CARMA Webcast Lectures are available for free on-demand viewing in the CARMA Video Library by faculty and students from CARMA’s Institutional Premium and Basic Membership Programs. Over 125 universities worldwide are CARMA Members for 2019-2020, and the Video Library contains over 160 recorded lectures from previous CARMA Webcast Programs.

The next two lectures of the CARMA Webcast Lecture Series will be provided on November 8 by Dr. Sang Eun Woo of Purdue University (Big Data Concepts) and Dr. Fred Oswald of Rice University (Big Data Analysis).

For more information on CARMA and its programs and events, visit the CARMA website: http://carmattu.com/

Panel Data: Methodological Frameworks and Analytic Tools 

Bio:

Paul D. Bliese is a professor in the Department of Management at the Darla Moore School of Business. He received a Ph.D. from Texas Tech University and a B.A. from Texas Lutheran University. After graduating in 1991, he worked for a year for the Bureau of Labor Statistics. In 1992, he joined the US Army, where he spent 22 years as a research psychologist at the Walter Reed Army Institute of Research. In his last military assignment, he served as the director of the Center for Military Psychiatry and Neuroscience and retired at the rank of Colonel in 2014.

Over his military career, Bliese directed a large portfolio of research initiatives examining stress, leadership, well-being and performance. From 2007 to 2014, he oversaw the US Army’s Mental Health Advisory Team program assessing the morale and well-being of soldiers deployed to Iraq and Afghanistan. Throughout his professional career, Bliese has led efforts to advance statistical methods and apply analytics to complex organizational data. He developed and maintains the multilevel package for the open-source statistical programming language R, and his research has been influential in advancing organizational multilevel theory. He has published in numerous outlets and served on many editorial boards. He was an Associate Editor for the Journal of Applied Psychology from 2010 to 2017 and is the incoming Editor-in-Chief for Organizational Research Methods.

Abstract:

Panel data (i.e., longitudinal data from multiple higher-level entities) are common in organizational research and can be approached using a surprisingly large number of methodological frameworks and analytic tools. The presentation discusses the analysis of panel data with the goal of facilitating communication between macro- and micro-oriented researchers. Emphasis will be placed on delineating the theoretical questions that different methodological frameworks can answer and on showing how different analytic approaches often provide identical answers. The talk builds on a forthcoming review article in the Journal of Management entitled “Bridging methodological divides between macro- and microresearch: Endogeneity and methods for panel data”.

Recommendations for Discouraging, Identifying, and Removing Dirty Data in Survey Research

Bio

Dr. Justin A. DeSimone received his Ph.D. from Georgia Tech and subsequently served as a post-doc at the University of Nebraska prior to arriving at the University of Alabama. Justin’s research focuses on innovative personality assessment as well as issues related to research methodology, psychometrics, and statistics. His work has appeared in some of the top journals in the field, including the Journal of Applied Psychology, Organizational Research Methods, and the Journal of Organizational Behavior. Justin serves as an ad hoc reviewer for many journals in the organizational sciences, as well as a senior ad hoc reviewer for the Journal of Organizational Behavior. He routinely presents his research at the annual meetings of both the Academy of Management and the Society for Industrial and Organizational Psychology.

Abstract:

This presentation focuses on problematic, undesirable, and low-quality data (or “dirty data”) in survey research. After defining and discussing the potential influence of dirty data, the presentation suggests strategies for minimizing the impact of low-quality data in an effort to enhance the trustworthiness of survey data. Specifically, this presentation covers (a) ways to discourage participants from providing low-quality or low-effort data, (b) methods of identifying dirty data in a dataset, and (c) the debate about when to retain and when to remove dirty data. This presentation also discusses the potential influence of survey administration format (e.g., pencil-and-paper, computer, online) on dirty data.