Martin, Katherine B.
Hammal, Zakia
Ren, Gang
Cohn, Jeffrey F.
Cassell, Justine
Ogihara, Mitsunori
Britton, Jennifer C.
Gutierrez, Anibal
Messinger, Daniel S.
Funding for this research was provided by:
Autism Speaks
National Institute of General Medical Sciences (1R01GM105004)
Article History
Received: 2 November 2016
Accepted: 6 February 2018
First Online: 27 February 2018
Authors’ information
JCohn leads efforts to develop methods for the automatic analysis of facial expression and the automated tracking of rigid and non-rigid head motion, and has applied those tools to research in psychopathology. ZH is part of JCohn's interdisciplinary group and has applied computer vision and machine learning to improve measurement of, and theoretical advances in, mother-infant interaction, depression, pain, and social interaction. JCassell is the director of the Human-Computer Interaction Institute in the School of Computer Science at Carnegie Mellon University and focuses on applying computer vision systems to the study of human interactions. GR is a postdoctoral fellow at the Center for Computational Science and is part of the Big Data Mining and Data Analytics team, and MO is a professor in the Department of Computer Science at the University of Miami; together, GR and MO have expertise in computational approaches to big data. JB is an assistant professor in the Department of Psychology at the University of Miami with expertise in interdisciplinary research that utilizes advancements in technology to study the intersection of human development and clinical outcomes. AG is the assistant director of the Center for Autism and Related Disabilities (CARD) at the University of Miami. DM's research uses objective measurement of body movement, facial expression, and vocalization to better understand communicative development in typically developing children and those affected by ASD.
Ethics approval and consent to participate
Recruitment and procedures were approved by the University of Miami's Social and Behavioral Sciences Institutional Review Board (reference number: 20070095). Written parental consent was obtained before participation in the study.
Consent for publication
Parents consented to the following: "VIDEOTAPES: When the researcher plays with your child and when your child watches the children's videos, we will be videotaping your child. The videotapes of your child will be coded by experts, undergraduate students, and other parents. Additionally, the videos will be measured using computer software at the University of Miami, University of Pittsburgh, and the University of Denver. They may also be used in publications and at scientific conferences. You agree to all of these uses by taking part in this study…." "CONFIDENTIALITY: Your child's videotaped images will be rated by other people, analyzed by computer at the University of Pittsburgh, and presented in scientific reports. The images will not be linked to personal identifiers in any way. These videotaped images are scientifically valuable and will be stored permanently by the investigators as a record of this study. At your request, the images will be immediately destroyed. The US Department of Health and Human Services (DHHS) may request to review and obtain copies of your records. Your records may also be reviewed for audit purposes by authorized University employees or other agents who will be bound by the same provisions of confidentiality."
Competing interests
The authors declare that they have no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.