Adolescents and older adults pay less attention to social cues in real-world interactions than young adults.
A new digital app has been shown to successfully detect one key symptom associated with ASD in young children. The app, which combines gaze tracking with machine learning algorithms, could become an inexpensive new tool to aid in the diagnosis of autism.
A newly designed “seeing eye mask” can capture pulse, eye movements, and sleep signals.
A new study reveals the relationship between attentional state and emotion through pupillary reactions. Visual perception elicits emotions in all attentional states, while auditory perception elicits emotions only when attention is paid to sounds.
Using scenes from movies, researchers discovered how different brain areas can be engaged flexibly and as needed. The study sheds light on how the brain transitions between moral reasoning and empathy.
New eye-tracking technology monitors naturalistic eye movements in mice, revealing similarities to and differences from human eye movement.
Eye movements distinguish between “go” and “no go” actions early in the decision-making process, before the hand even starts to move.