
Our Brains On Zoom

Ever wonder how our brains deal with Zoom sessions or calls? Researchers from Yale decided to find out how our brains react to this form of social interaction.

Image created with generative AI (Michael S. Helfenbein)

We know that social interactions are the cornerstone of all human societies, and our brains are finely tuned to process dynamic facial cues (a primary source of social information) during real, in-person encounters. So does that change with online interactions such as video meetings and calls?

The researchers found that neural signaling during online exchanges was substantially suppressed compared with the activity observed during face-to-face conversations.

While most previous research using imaging tools to track brain activity during social interactions has involved only single individuals, the Yale lab of senior author Joy Hirsch developed a unique suite of neuroimaging technologies that allows researchers to study, in real time, interactions between two people in natural settings.

Findings

Researchers found that the strength of neural signaling was dramatically reduced on Zoom relative to “in-person” conversations. Specifically, face-to-face conversations were associated with increased neural activity, longer gaze time, and larger pupil diameters, suggestive of heightened arousal in the two brains. Increased EEG activity during in-person interactions was also characteristic of enhanced face-processing ability.

In addition, the researchers found more coordinated neural activity between the brains of individuals conversing in person, which suggests an increase in reciprocal exchanges of social cues between the interacting partners.

According to Dr. Hirsch, “Overall, the dynamic and natural social interactions that occur spontaneously during in-person interactions appear to be less apparent or absent during Zoom encounters. This is a really robust effect.” She continued, “These findings illustrate how important live, face-to-face interactions are to our natural social behaviors. Online representations of faces, at least with current technology, do not have the same ‘privileged access’ to social neural circuitry in the brain that is typical of the real thing.”

Sources:

Hathaway B. Yale News. Zooming in on our brains on Zoom. October 25, 2023. [Link]

Zhao N, Zhang X, Noah JA, Tiede M, Hirsch J. Separable Processes for Live “In-Person” and Live “Zoom-like” Faces. Imaging Neuroscience (2023). https://doi.org/10.1162/imag_a_00027


Tablet App Screens For Autism

Duke University researchers have demonstrated an AI-driven app that runs on a tablet and screens for autism in children by measuring and weighing a variety of behavioral indicators.

Image: Nature Medicine

Described in a study published online in Nature Medicine in October 2023, the app, called SenseToKnow, evaluates and measures a variety of behavioral indicators and delivers scores that suggest the probability that the child tested may be on the autism spectrum. The results are fully interpretable, meaning they spell out exactly which behavioral indicators led to the app’s conclusions and why.

This ability gives health care providers detailed information on what to look for and consider in children referred for full assessments and intervention. The researchers state that SenseToKnow’s ease of use and lack of hardware limitations, combined with its demonstrated accuracy across sex, ethnicity and race, could help eliminate known disparities in early autism diagnosis and intervention by allowing autism screening to take place in any setting, even in the child’s own home.

The app uses almost every sensor in the tablet’s arsenal to measure and characterize the child’s response without the need for any sort of calibration or special equipment. It then uses AI to analyze the child’s responses to predict how likely it is that the child will be diagnosed with autism.
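To make the idea of an “interpretable” screening score more concrete, here is a minimal sketch in Python. It is a hypothetical illustration only, not SenseToKnow’s actual model: the indicator names, weights, and logistic form are assumptions invented for this example. It simply shows how several behavioral indicators could be combined into a probability while reporting how much each indicator contributed to the result.

```python
# Hypothetical sketch: NOT the SenseToKnow model or its real feature set.
# Illustrates combining behavioral indicators into a probability while
# reporting each indicator's contribution (an "interpretable" score).

import math

# Made-up indicator names and weights, purely for illustration.
WEIGHTS = {
    "gaze_to_social_stimuli": -1.2,    # more social gaze -> lower score
    "response_to_name_delay": 0.9,     # longer delay -> higher score
    "blink_rate_variability": 0.4,
    "head_movement_complexity": 0.6,
}
BIAS = -0.5


def screen(indicators: dict[str, float]) -> tuple[float, dict[str, float]]:
    """Return (probability, per-indicator contributions) for standardized inputs."""
    contributions = {name: WEIGHTS[name] * value for name, value in indicators.items()}
    logit = BIAS + sum(contributions.values())
    probability = 1.0 / (1.0 + math.exp(-logit))  # logistic link
    return probability, contributions


if __name__ == "__main__":
    example = {
        "gaze_to_social_stimuli": -0.8,    # standardized (z-scored) values
        "response_to_name_delay": 1.1,
        "blink_rate_variability": 0.3,
        "head_movement_complexity": 0.7,
    }
    prob, contribs = screen(example)
    print(f"Screening probability: {prob:.2f}")
    for name, value in sorted(contribs.items(), key=lambda kv: -abs(kv[1])):
        print(f"  {name}: {value:+.2f}")
```

Listing the per-indicator contributions alongside the overall probability is what lets a clinician see which behaviors drove a given score, which is the general property the researchers describe, even though their actual model is far more sophisticated.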

Source:

Perochon, S., Di Martino, J.M., Carpenter, K.L.H. et al. Early detection of autism using digital behavioral phenotyping. Nat Med (2023). https://doi.org/10.1038/s41591-023-02574-3