To train these models, the researchers used digital transcriptions of handwritten responses from Framingham Heart Study participants who were asked to describe a picture of a woman who is apparently preoccupied with washing dishes while two kids raid a cookie jar behind her back. These descriptions did not preserve the handwriting from the original responses, says Rhoda Au, director of neuropsychology at the Framingham study and a professor at Boston University. (Her team was responsible for transcribing data for the new paper but did not participate beyond that.) Yet even without the physical handwriting, IBM says its main AI model was able to detect linguistic features that are sometimes related to early signs of cognitive impairment. These include certain misspellings, repeated words and the use of simplified phrases rather than grammatically complex sentences. This evidence is in line with clinicians’ understanding of how Alzheimer’s disease can affect language, Royyuru says.
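To give a rough sense of the kinds of surface cues described here, the sketch below computes a few simple linguistic features (repeated words, average sentence length as a crude proxy for syntactic complexity, vocabulary diversity) from a transcribed picture description. This is purely illustrative and is not IBM's model; the function name and feature choices are the author's own assumptions.

```python
# Illustrative sketch only: crude surface features of the kind the article
# mentions (repeated words, simplified phrasing). Not IBM's actual model.
import re
from collections import Counter


def surface_features(transcript: str) -> dict:
    """Compute a few simple linguistic cues from a picture-description transcript."""
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]

    counts = Counter(words)
    repeated = {w: c for w, c in counts.items() if c > 2}   # frequently repeated words
    avg_sentence_len = len(words) / max(len(sentences), 1)  # proxy for syntactic complexity
    type_token_ratio = len(counts) / max(len(words), 1)     # vocabulary diversity

    return {
        "repeated_words": repeated,
        "avg_sentence_length": round(avg_sentence_len, 1),
        "type_token_ratio": round(type_token_ratio, 2),
    }


if __name__ == "__main__":
    sample = "The boy is on the stool. The stool is tipping. The boy is taking cookies."
    print(surface_features(sample))
```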