Decision-making brain region also deciphers different phonetic sounds

Washington, July 1 : A collaborative team of researchers from Brown University and the University of Cincinnati has found that a front portion of the brain, which handles decision-making, also helps decipher different phonetic sounds.

Writing about their findings in the journal Psychological Science, the researchers have revealed that this section of the brain is called the left inferior frontal sulcus.

They say that this section treats different pronunciations of the same speech sound, such as a 'd' sound, the same way.

The researchers say that in determining this, they have solved a mystery.

MRI studies showed that test subjects reacted to different sounds (ta and da, for example) but appeared to recognize the same sound even when pronounced with slight variations.

"No two pronunciations of the same speech sound are exactly alike. Listeners have to figure out whether these two different pronunciations are the same speech sound such as a 'd' or two different sounds such as a 'd' sound and a 't' sound," said Emily Myers, assistant professor (research) of cognitive and linguistic sciences at Brown University.

Lead researcher Sheila Blumstein, the Albert D. Mead Professor of Cognitive and Linguistic Sciences at Brown, said that the findings provided a window into how the brain processes speech.

"No one has shown before what areas of the brain are involved in these decisions. As human beings we spend much of our lives categorizing the world, and it appears as though we use the same brain areas for language that we use for categorizing non-language things like objects," said Blumstein.

The research team studied 13 women and five men, ages 19 to 29. All were brought into an MRI scanner at Brown University's Magnetic Resonance Facility, so that the researchers could measure blood flow in response to different types of stimuli.

Subjects were asked to listen to series of syllables as they lay in the scanner. The sounds were derived from recorded, synthesized speech. Initially, subjects would hear four identical "dah" or "tah" sounds in a row, which reduced brain activity because of the repetition. The fifth sound could be the same or a different sound.

The study showed that the brain signal in the left inferior frontal sulcus changed when the final sound was a different one. But if the final sound was only a different pronunciation of the same sound, the brain's response remained steady.

According to Myers and Blumstein, the study matters for efforts to understand language and speech, and how the brain makes sense of different sounds and pronunciations.

"What these results suggest is that [the left inferior frontal sulcus] is a shared resource used for both language and non-language categorization," Blumbstein said.

ANI

© 2000-2017 AndhraNews.net. All Rights Reserved and are of their respective owners.