Researchers at Middle East Technical University (METU) and Koç University have developed software that can read and correctly analyze the emotional state of its user through textual, visual, and auditory cues, coupled with neural data acquired through functional magnetic resonance imaging (fMRI). The project is funded by the search giant Google under its research awards program.
The research project, titled “Pattern Analysis of Functional Magnetic Resonance Imaging”, states as its ultimate goal using the resulting sentiment class labels to improve Google’s search engine.
The cross-disciplinary collaboration between the two universities appears to be bearing fruit. “For example, we can correctly guess certain things the user is thinking of, like colors, objects, animals, clothing, vegetables, and fruits,” said Dr. Fatoş Yarman-Vural of METU Computer Engineering. “The success rate is above 80%.” The research team believes these may be among the most advanced models of their kind currently in use anywhere in the world.
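To give a rough sense of how guessing a thought-of category from brain scans can work, here is a minimal, purely illustrative sketch of multi-voxel pattern analysis: a classifier is trained on fMRI activation patterns for known categories and then assigns new scans to the nearest learned pattern. The data here are synthetic, and the nearest-centroid classifier is a hypothetical stand-in; the article does not describe the actual model or features the METU/Koç team uses.

```python
# Illustrative MVPA-style sketch: classify which category a subject is
# "thinking of" from a (synthetic) fMRI activation pattern.
# All names and parameters here are invented for illustration only.
import random

random.seed(0)
CATEGORIES = ["color", "animal", "fruit"]
N_VOXELS = 20  # real scans have tens of thousands of voxels

# Each category gets a characteristic "activation template".
templates = {c: [random.gauss(0, 1) for _ in range(N_VOXELS)]
             for c in CATEGORIES}

def simulate_scan(category, noise=0.5):
    """A noisy activation pattern for one trial of the given category."""
    return [v + random.gauss(0, noise) for v in templates[category]]

def centroid(patterns):
    """Voxel-wise mean of several activation patterns."""
    return [sum(xs) / len(xs) for xs in zip(*patterns)]

def classify(pattern, centroids):
    """Assign the pattern to the category with the nearest centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: sq_dist(pattern, centroids[c]))

# "Training": average a few scans per category into a centroid.
trained = {c: centroid([simulate_scan(c) for _ in range(10)])
           for c in CATEGORIES}

# "Testing": score accuracy on fresh simulated scans.
trials = [(c, simulate_scan(c)) for c in CATEGORIES for _ in range(20)]
correct = sum(classify(p, trained) == true_c for true_c, p in trials)
accuracy = correct / len(trials)
print(f"accuracy: {accuracy:.0%}")
```

On clean synthetic data like this the accuracy is high; the point is only the train-then-match structure, not the numbers, and a real system cannot label a category it was never trained on, as the article notes below.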
The requirement that test subjects be placed in an MRI machine, and the software’s inability to predict categories it has not yet been taught, may let you keep your guard down and leave the tin foil in your drawer. “This is just the beginning,” warns Dr. Yarman-Vural. “These intelligent machines are, for the first time, working to understand the natural intelligence that engendered them.”