Andres De Los Reyes received his Ph.D. in 2008 from Yale University. He began his career as an Assistant Professor at the University of Maryland at College Park. Within 10 years, he was promoted up the ranks to Full Professor with tenure. He serves as Director of the Comprehensive Assessment and Intervention Program, where he has provided research training to hundreds of undergraduate and graduate students. The key goal of his research program is to improve our understanding of the discrepant results produced by mental health assessments. Dr. De Los Reyes has published over 150 peer-reviewed articles, books, and book chapters on these and other topics. His peer-reviewed articles have been published in such high-impact journals as Psychological Bulletin, Psychological Review, Nature Neuroscience, Clinical Psychological Science, Child Development Perspectives, Development and Psychopathology, Psychological Assessment, Clinical Psychology Review, Journal of Clinical Child and Adolescent Psychology, and Annual Review of Clinical Psychology. His first book, The Early Career Researcher’s Toolbox (2020), received a Presidential Citation from the American Psychological Association, and an updated edition of the book was released by Springer in 2024. He is the author of Discrepant Results in Mental Health Research, a book released by Oxford University Press in 2024, and one that recently inspired a TED talk.

Dr. De Los Reyes has received over $1.5 million in funding for his work from the Institute of Education Sciences, National Science Foundation, and National Institutes of Health. His service record reveals his passion for education, mentoring, and professional development. In 2019, Dr. De Los Reyes served as Chair of the Board of Educational Affairs of the American Psychological Association, Psychology’s largest organization with over 100,000 members.
He serves as Editor-in-Chief of the Journal of Clinical Child and Adolescent Psychology (JCCAP; 2017-2026), a top-tier journal with subscriptions at institutions in over 30 countries. He also founded and serves as Program Chair for the Future Directions Forum. This annual event offers professional development workshops and small-group and one-on-one consultations on all aspects of scholarly work. In 2024, Dr. De Los Reyes was selected to Co-Chair (with Ye Tong) the revisions to the widely used Standards for Educational and Psychological Testing, and elected to serve as Chair of the American Psychological Association’s Coalition for Psychology in Schools and Education. He has received a number of honors for his work, including the American Psychological Association’s Distinguished Scientific Award for an Early Career Contribution to Psychology, the Society for Research in Child Development's Early Career Research Contributions Award, the Association for Behavioral and Cognitive Therapies' President's New Researcher Award, Fellow status at both the American Psychological Association and Association for Psychological Science, and most recently the 2021 Early Career Psychologist Champion Award, also from the American Psychological Association. During the 2021-2022 academic year, Dr. De Los Reyes served as the Fulbright Canada Research Chair in Mental Health at the University of Regina.

Areas of Interest

  • Operations Triad Model (https://doi.org/10.1146/annurev-clinpsy-050212-185617)
  • CONTEXT: Classifying Observations Necessitates Theory, Epistemology, and Testing (https://doi.org/10.1080/15374416.2022.2111684)
  • Needs-to-Goals Gap (https://doi.org/10.1016/j.cpr.2021.102114)
  • Discrepant results in mental health research (https://bit.ly/DiscrepantResultsOxfordUniPressPage)
  • Social anxiety (https://doi.org/10.1007/s10567-020-00314-4)
  • Family relationships (https://doi.org/10.1111/cdep.12306)

Doctoral Programs

  • Clinical

Degrees

  • PhD
    Yale University; Psychology, 2008
  • MPhil
    Yale University; Psychology, 2006
  • MS
    Yale University; Psychology, 2004
  • BA
    Florida International University; Psychology, 2001
  • BA
    Florida International University; Political Science, 2001
  • BS
    Florida International University; Criminal Justice, 2001

My research program focuses on an age-old question germane to understanding individual differences in behavior, and with implications for scholarship in Human Development, Family Science, Education, Psychology, and the Social Sciences more broadly: When researchers measure how two or more people view the same social experience, why do they often encounter discrepant results in these views?  We see this question reflected in our relationships and larger social environments.  Discrepant results occur within families, when caregivers differ in their views on child rearing.  Discrepant results manifest when evaluating the effectiveness of education and mental health programs, as different outcome measures often point to differing levels of effectiveness (e.g., Schneider, 2020; Weisz et al., 2017).  They appear within our institutional structures―plaintiffs and defendants arguing opposing sides of legal matters that are adjudicated in the courtrooms of common law countries, for example.  When valued for the insights they reveal, discrepant results beget consensus building and mutually beneficial coalitions among disparate community partners.  Yet, too often, discrepant results beget efforts to determine who has the “right” or “most accurate” view of a social experience, even when several views―not just one―each accurately reflect how relationships and social environments impact human behavior (see De Los Reyes et al., 2019a).  In fact, scholars can leverage discrepant results to understand individual behavior and how it varies within and across the relationships and social environments that typify day-to-day life (De Los Reyes et al., 2023; De Los Reyes et al., 2019b).  Understanding discrepant results and leveraging strategies that embrace and integrate them has implications for multiple scientific disciplines and decision-making settings in societies globally.
In this statement, I describe the approach I take to addressing questions relevant to interpreting discrepant results, and using knowledge about these results to understand links among individual behavior, interpersonal relationships, and social environments.

In my scholarship, I gravitate toward areas of scholarly discourse where discrepant results happen often, and where uncertainty remains among scholars as to what these discrepancies reflect.  Thus, I have dedicated my career to understanding the discrepant results produced in scholarship about mental health.  The “origin story” of scholarship about discrepant results in mental health lies with a reality of assessing mental health domains.  As with many other lines of discourse in the Social Sciences, scholars lack “gold standard” instruments to assess any one mental health domain.  We do not have “one test” for detecting anxiety, or mood concerns, or concerns with sustaining attention, for example.  Addressing this challenge requires not one instrument, but rather multiple, rigorously studied instruments.  The most often-used and well-understood instruments consist of surveys and interviews administered to people and significant others in their lives, such as caregivers and teachers in the case of youth, or spouses and coworkers in the case of adults.  Reports completed by these informants form the backbone of what we know about mental health interventions and special education programs.  The data produced by these reports factor into researchers’ decisions about which interventions appear to “work” or produce beneficial effects for those who receive them.  More broadly, authoritative bodies use these same data to make high-stakes decisions about interventions, with implications for determining which interventions should be “scaled up” and consumed by the public.  Consider those entities tasked with classifying “Evidence-Based Interventions,” such as the United Kingdom’s National Institute for Health and Care Excellence (NICE) guidelines, the American Psychological Association’s Clinical Practice Guidelines, and the U.S. Department of Education’s What Works Clearinghouse (WWC), to name a few.
These entities classify interventions as “evidence-based” after considering many studies, each of which relied on multiple informants’ reports to arrive at the data used to estimate the effects of interventions.  Yet, how might one make these classifications accurately, if studies do not reveal the same results about an intervention’s effects?

We all have likely experienced reading the news about a scientific study that produced a given result―the effects of caffeine on memory or health outcomes, for example―and thought, “I read about a study that said the opposite.”  It turns out that these same kinds of discrepant results appear in mental health studies.  Any two mental health studies often differ in their estimates of anything from the prevalence of mental health conditions to the effects of mental health treatments.  Researchers even encounter discrepant results among findings made in a single study.  Discrepant results factor into what we think we know about how often mental health conditions occur, what causes them, and how to improve mental health.  Yet, researchers do not know what to do with discrepant results when they encounter them.  The problem does not lie with their methods.  Discrepant results appear even when researchers use high-quality instruments to collect data, and they appear in the results produced by controlled experiments and uncontrolled field studies alike.  The problem actually lies with how researchers interpret their data, and the decisions they make with those data.  My work reveals that discrepant results often contain information pertinent to understanding links among individual behavior, interpersonal relationships, and social environments.

In my work, I seek to reduce uncertainties in research and decision-making generally, by understanding what discrepant results reflect.  Along these lines, even when informants like caregivers, youth, and teachers make reports about the mental health of, for instance, the youth, and use identical instruments to make these reports (e.g., surveys with parallel item content and response options), these informants nonetheless produce discrepant estimates of the youth’s mental health status.  For instance, an assessor collecting a report from a caregiver might learn that their child displays oppositional behavior, but the child’s teacher does not corroborate the caregiver’s impressions.  At other times, an assessor observes the reverse pattern―a teacher reports that a student in their classroom needs help with anxiety, but the student’s caregiver does not concur.  At times, an assessor collects a report from a child, who reports struggling with relatively covert depressive symptoms that reports from adult authority figures fail to capture.  My work indicates that these discrepant results often reflect aspects of the social environments in which informants observe the youth’s behavior (e.g., caregivers at home vs. teachers at school; De Los Reyes et al., 2022).  This is because when researchers assess any one person’s mental health, they do not randomly select their sources.  Much like the journalist preparing a news story, researchers strategically solicit their sources.  In fact, in Psychology these sources have a name―structurally different informants.  These are informants who harbor unique abilities to observe the person undergoing evaluation within social environments pertinent to understanding that person’s behavior (see Eid et al., 2008).  These structural differences among informants signify that discrepant results do not have to pose obstacles to decision-making.  In fact, they may reveal learning opportunities that enhance decision-making.
What excites me most about this faculty opening is the potential to develop multidisciplinary endeavors that leverage knowledge about discrepant results to not only address long-standing problems in mental health research, but also inspire new lines of research for probing discrepant results and how they manifest in multiple areas of discourse. 

Using frameworks I published in leading theoretical outlets in Human Development, Education, and Psychology (e.g., Psychological Bulletin, Child Development Perspectives, Exceptional Children, Annual Review of Clinical Psychology, Journal of Youth and Adolescence), I examine how discrepant results in mental health research reveal meaningful information about individual differences in behavior.  I take a team scholarship approach akin to the team science approaches that are now commonplace in STEM (see Stokols et al., 2008).  That is, I leverage the ubiquitous nature of discrepant results to produce scholarship with a network of colleagues across Human Development, Education, Organizational Behavior, Cognitive Science, Neuroscience, Medicine, and Social Work.  I also take a developmentally informed approach that traverses multiple developmental periods, including early childhood, late adolescence, and emerging adulthood.  I integrate multi-informant, psychophysiological, observational, and performance-based assessment paradigms, and I leverage these paradigms to test questions using a suite of experimental, controlled observation, naturalistic, and quantitative review designs.  The goal of my research program is to understand discrepant results in assessments of mental health, the relationships and social environments that shape them, and how we might harness discrepant results to inform decision-making.

  • National
    Founder/Program Chair, JCCAP Future Directions Forum (www.jccapfuturedirectionsforum.com) (2017-)
  • National
    Co-Chair (w/Ye Tong), Revisions of 2014 Standards for Educational and Psychological Testing (2024-)
  • National
    Chair, Coalition for Psychology in Schools and Education (2024-2026)
  • National
    Co-Chair (w/Hannah Ades), Professional Development and Mentoring Special Interest Group (2023-)
  • National
    Chair (Elected), Board of Educational Affairs (2019)
  • National
    Member (Re-Elected), Board of Educational Affairs (2020-2022)
  • National
    Member (Elected), Board of Educational Affairs (2017-2019)
  • National
    Member, Executive Board of the Society of Clinical Child and Adolescent Psychology (2016-2026)
  • Professional
    Editor-in-Chief, Journal of Clinical Child and Adolescent Psychology (2017-2026)
  • Professional
    Associate Editor, Journal of Clinical Child and Adolescent Psychology (2011-2015)
  • Professional
    Associate Editor, Journal of Psychopathology and Behavioral Assessment (2012-2015)
  • Professional
    Associate Editor, Journal of Early Adolescence (2014-2015)
  • Professional
    Associate Editor, Journal of Child and Family Studies (2010-2014)
  • Professional
    Guest Editor for a Special Issue in Journal of Clinical Child and Adolescent Psychology (2023)
  • Professional
    Guest Editor for a Special Section in Journal of Clinical Child and Adolescent Psychology (2020)
  • Professional
    Guest Editor for a Special Section in Clinical Psychological Science (2018)
  • Professional
    Guest Editor for a Special Issue in Journal of Youth and Adolescence (2016)
  • Professional
    Guest Editor for a Special Section in Journal of Psychopathology and Behavioral Assessment (2016)
  • Professional
    Guest Editor for a Special Issue in Journal of Clinical Child and Adolescent Psychology (2015)
  • Professional
    Guest Editor for a Special Section in Journal of Clinical Child and Adolescent Psychology (2011)
  • Campus
    Faculty Leader, First-Year Innovation and Research Experience (2016-2022)
  • Campus
    Associate Chair of Psychology (2014-2015)
  • Campus
    Chair, Psychology Undergraduate Program Committee (2014-2015)
  • Campus
    Chair, Psychology Diversity Committee (2015-2017)
  • Campus
    Member, College of Behavioral and Social Science’s Diversity Advisory Council (2015-2016)
  • Campus
    Director of Clinical Training (2017-2018)
  • Campus
    Area Head, Clinical Psychology Doctoral Program (2018-2021)

Current Students

Former Students

  • Sarah A. Thomas, Ph.D.
    Current Position: Assistant Professor of Psychiatry and Human Behavior (Research), Brown University
  • Tara M. Augenstein, Ph.D.
    Current Position: Assistant Professor of Psychiatry, University of Rochester
  • Melanie F. Lipton, Ph.D.
    Current Position: Clinical Psychologist, Wilmington VA Medical Center
  • Bridget A. Makol, Ph.D.
    Current Position: Assistant Professor of Pediatrics, Rush University Medical Center
  • Melanie Arenson, Ph.D.
    Current Position: Post-Doctoral Fellow, Boston University
  • Noor Qasmieh, Ph.D.
    Current Position: Post-Doctoral Fellow, Johns Hopkins University

Related Students (Listed by Student on Student's Profile)

  • Noor Qasmieh
  • Akram Yusuf
3123H, Biology/Psychology Building
Department of Psychology
Email: adlr [at] umd.edu