Monday, March 9, 2009

Language

Language is a means of transmitting and processing information, organizing sensory perceptions, and expressing thoughts, feelings, and intentions. The content of language encompasses the past, present, and future. The development of language does not necessarily require speech and audition: deaf individuals learn to communicate with sign language. Language is most easily acquired in childhood. Linguistic messages are transmitted and received through speaking and hearing, writing and reading, or (in the case of sign language) the production and interpretation of gestures. The cerebral language areas are located in the left hemisphere in over 90% of right-handers and in 60% of left-handers; the remaining individuals have bihemispheric or (in 1–2%) exclusively right-hemispheric dominance for language. The left (dominant) hemisphere is responsible for the cognitive processing of language, while the right (nondominant) hemisphere produces and recognizes the emotional components of language (prosody: emphasis, rhythm, and melody). Language is also subserved by subcortical structures (the left thalamus, the left caudate nucleus, and their associated fiber pathways). Language function depends on the well-coordinated activity of an extensive neural network in the left hemisphere. It is simplistic to suppose that language is understood and produced by a unidirectional flow of information through a chain of independently operating brain areas linked in series; rather, any particular linguistic function (such as reading, hearing, or speaking) relies on the simultaneous activation of multiple, disparate cortical areas. Yet the simplified model of language outlined below (proposed by Wernicke and further elaborated by Geschwind) usually suffices for the purposes of clinical diagnosis.
Hearing and speaking. Acoustic signals are transduced in the inner ear into neural impulses in the cochlear nerve, which ascend through the auditory pathway and its relay stations to the primary and secondary auditory cortex (p. 100). From here, the information is sent to Wernicke’s area (the “posterior language area”), consisting of Wernicke’s area proper, in the superior temporal gyrus (Brodmann area 22), as well as the angular and supramarginal gyri (areas 39, 40). The angular gyrus processes auditory, visual, and tactile information, while Wernicke’s area proper is the center for the understanding of language. It is from here that the arcuate fasciculus arises, the fiber tract that conveys linguistic information onward to Broca’s area (areas 44 and 45; the “anterior language area”). Grammatical structures and articulation programs are represented in Broca’s area, which sends its output to the motor cortex (speech, p. 130). Spoken language is regulated by an auditory feedback circuit in which the utterer hears his or her own words and the cortical language areas modulate the speech output accordingly.
Reading and writing. The visual pathway conveys visual information to the primary and secondary visual cortex (p. 80), which, in turn, project to the angular gyrus and Wernicke’s area, where visually acquired words are understood, perhaps after a prior “conversion” to phonetic form. Wernicke’s area then projects via the arcuate fasciculus to Broca’s area, as discussed above; Broca’s area sends its output to the motor cortex (for speech or, perhaps, to the motor hand area for writing). This pathway enables the recognition and comprehension of written language, as well as reading aloud.
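The Wernicke–Geschwind flow described in the two paragraphs above can be sketched as a small directed graph. This is purely an illustrative simplification of the text, not a clinical model; the area names follow the passage, and each area is given only its single main onward projection.

```python
# Simplified Wernicke-Geschwind pathway as a directed graph.
# Each area maps to the area(s) it projects to, per the text above.
PATHWAY = {
    "auditory cortex": ["Wernicke's area"],
    "visual cortex": ["angular gyrus"],
    "angular gyrus": ["Wernicke's area"],
    "Wernicke's area": ["Broca's area"],  # via the arcuate fasciculus
    "Broca's area": ["motor cortex"],     # speech, or the hand area for writing
}

def route(start, goal="motor cortex"):
    """Follow the main projection from an input area to the motor cortex."""
    path = [start]
    while path[-1] != goal:
        path.append(PATHWAY[path[-1]][0])
    return path

print(" -> ".join(route("auditory cortex")))
print(" -> ".join(route("visual cortex")))
```

Following the auditory route reproduces the hearing-and-speaking chain, and the visual route the reading chain, with both converging on Wernicke’s area before Broca’s area, as the model requires.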
Examination. The clinical examination of language includes spontaneous speech, naming of objects, speech comprehension, speech repetition, reading, and writing. The detailed assessment of aphasia requires the use of test instruments such as the Aachen aphasia test, perhaps in collaboration with neuropsychologists and speech therapists. Disturbances of speech may be classified as fluent or nonfluent. Examples of the former are paragrammatism (faulty sentence structure), meaningless phrases, circumlocution, semantic paraphasia (contextual substitution, e.g., “leg” for “arm”), phonemic paraphasia (substitution of one sound for another, e.g., “tan” for “can”), neologisms (nonexistent words), and fluent gibberish (jargon). Examples of the latter are agrammatism (word chains without grammatical structure), echolalia (repetition of heard words), and automatism (repeating the same word many times). Prosody and dysarthria (if present; p. 130) are evaluated during spontaneous speech. Anomia is the inability to name objects. Patients with aphemia can read, write, and understand spoken language but cannot speak.
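The fluent/nonfluent distinction above amounts to a simple lookup table. The sketch below restates the paragraph's taxonomy in code; it is only a mnemonic for the classification in the text, not a diagnostic tool.

```python
# Symptom -> fluency category, as listed in the examination paragraph.
SYMPTOMS = {
    "paragrammatism": "fluent",
    "circumlocution": "fluent",
    "semantic paraphasia": "fluent",
    "phonemic paraphasia": "fluent",
    "neologism": "fluent",
    "jargon": "fluent",
    "agrammatism": "nonfluent",
    "echolalia": "nonfluent",
    "automatism": "nonfluent",
}

def classify(symptom):
    """Return the fluency category of a speech disturbance, if listed."""
    return SYMPTOMS.get(symptom, "unclassified")
```

For example, `classify("echolalia")` returns `"nonfluent"`, while terms outside the fluent/nonfluent scheme (such as anomia or aphemia) fall through to `"unclassified"`.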


Atlas Neurology





Neural Processing of Language
Language (in humans) is a learned form of communication with the external environment. It encompasses the sensory appreciation of symbols, their central interpretation, and the verbal or non-verbal expression of the symbols in a manner intelligible to others who have also learned the language. It is used for the expression of cognitive activity, which ranges in complexity from simple repetition of received language to the expression of thought processes. Language processing should not be confused with the expression of sounds that are semantically meaningless (such as a cough, or a baby’s gurgle, which may be a form of communication that expresses pleasure, but is arguably not language processing). At the moment we have a very poor understanding of the neural processing of language, and virtually all information comes from observing the results of damage to brain areas (see p. 346). Language involves the processing of visual, auditory, and (as in the case of the blind) tactile inputs. Auditory inputs travel from the ear to the auditory cortex, and from there to the auditory association cortex in the angular gyrus. From the association areas, the signals are projected to Wernicke’s area, the left posterior part of the temporal lobe (Brodmann’s area 22), where comprehension of the inputs is effected.
The information is then projected to Broca’s area (Brodmann’s areas 44 and 45), where the semantic ‘dictionary’ is stored, and where the stored words are assembled meaningfully. This information is sent to the frontal cortex and the premotor areas for associative motor processing prior to being sent to the motor cortex, where vocal and manual articulation are controlled, and the word is spoken or written.
Visual language inputs travel from the retina to the primary visual cortex, where electrical impulses are converted into raw visual information. This is projected to the adjacent extrastriate cortex for further processing, before it is sent to Broca’s area for assembly into grammatical form, and is projected from there to the prefrontal and premotor areas. There are thus independent pathways for the auditory and visual processing of language. Very little is known about how a tactile input is translated into language by the brain. These postulated pathways are undoubtedly vast simplifications, but they do explain why damage to Broca’s area still permits the understanding of language, even if it cannot be expressed. They also explain why the interruption of the arcuate fasciculus, which connects Wernicke’s area with Broca’s area, prevents the conversion of auditory language inputs into verbal expression of words. They do not take into account the fact that subcortical structures and cortical white matter are also involved in the processing of language. The left caudate nucleus, thalamus, and some interconnecting pathways in the left hemisphere participate in language processing. The pathways also do not explain the mechanisms whereby patients with severe damage to left hemisphere areas involved in speech are still able to sing and express sounds (including words) in melodic form. This process, called prosody, is more emotional than cognitive in nature, and is thought to be a function of the right hemisphere, which is more concerned with intuitive and nonreasoning cortical function.
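The disconnection argument in the paragraph above (interrupting the arcuate fasciculus severs auditory comprehension from speech output) can be made concrete with a toy reachability check on a minimal pathway graph. This is an illustrative sketch only, with the areas reduced to three nodes from the text.

```python
# Minimal pathway: auditory input -> Wernicke -> (arcuate fasciculus) -> Broca -> motor.
edges = {
    ("auditory cortex", "Wernicke's area"),
    ("Wernicke's area", "Broca's area"),  # arcuate fasciculus
    ("Broca's area", "motor cortex"),
}

def reachable(src, dst, edges):
    """Depth-first search: can information flow from src to dst?"""
    frontier, seen = [src], set()
    while frontier:
        node = frontier.pop()
        if node == dst:
            return True
        seen.add(node)
        frontier += [b for a, b in edges if a == node and b not in seen]
    return False

print(reachable("auditory cortex", "motor cortex", edges))  # intact pathway

# Lesion the arcuate fasciculus: auditory input can no longer drive speech,
# even though Wernicke's area (comprehension) and Broca's area (output) are intact.
lesioned = edges - {("Wernicke's area", "Broca's area")}
print(reachable("auditory cortex", "motor cortex", lesioned))
```

The intact graph carries auditory input through to the motor cortex; removing the single Wernicke-to-Broca edge disconnects comprehension from expression, the pattern seen clinically in conduction aphasia.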


Atlas Neurosciences
