
Monday, November 7, 2016

Non-Auditory Languages

A great many of us are accustomed to auditory languages, but those are not the only languages around. Not by a long shot!

Linguists use the word "channel" to describe the different ways in which linguistic information can be transmitted. The auditory channel is only one of those. There are also olfactory, visual, and tactile channels - essentially, a channel for every sense.

Sign languages are an important form of non-auditory language used by humans. It's worth noting that American Sign Language is its own language, not at all the same as Signed Exact English. When working with signs, it's easy to assume that signs are more iconic than auditory language, but if you look across the world's sign languages (different countries have different ones), each has its own iconicity. The idea that a sign is iconic is common, but how each sign is iconic is culturally based.

Sign languages in fiction are a challenge to work with. In fact, any non-auditory language risks being overwritten by our auditory impressions of the language the story is written in.

Language change is universal; over time it occurs in semiotic systems, in gesture, and in sign languages.

A language like American Sign Language has its own grammar. Grammar doesn't take the same form in non-auditory languages that it does in auditory ones, but it still categorizes, and it still does the basic job of grammar, which is to create shared context where none currently exists. In the same way that onomatopoeia imitates actual sounds, sign languages can do really cool things to indicate the manner in which actions are performed. They can also use locations in space to refer back to antecedents.

In my 2016 story "The Language of the Silent," which I wrote with Sheila Finch, the sign language was created as a language of rebellion by the people who used it. An enslaved population wished to coordinate their rebellion, so they designed a set of signs based on the auditory language they spoke, and this grew into a full-fledged language they could use. I based my idea for this language in part on the way Hebrew was revived as a living language for the Jewish people in Israel.

Readers whose native language is auditory will come into a story with the base-level assumption that the language used will be auditory.

Other options include color-based languages for cephalopods who can change their skin color. Morgan suggested a language based on pheromones.

Auditory languages are strung out over time, because there are limits on how speech sounds are produced and how quickly they can follow one another. Visual languages are less constrained by time. You could imagine a language where color suggested emotional content and pattern carried grammatical information. A bioluminescent creature might have a finely tuned sense of color.

Languages have to solve particular types of problems, like how to convey the passage of time, how to indicate relative position, etc. They can solve these problems in different ways. You could, for example, use the shifting balance of two different olfactory chemicals to convey information.

Helen Keller used a tactile version of English to communicate when she was unable to use the auditory and visual channels.

It's important to realize that humans don't communicate solely on the auditory channel. We communicate simultaneously on multiple channels, including the visual (gesture, facial expression), olfactory (pheromones), tactile, etc. We can communicate by telephone, though, because the major burden of grammar falls on the auditory channel. Tone of voice is not the same as speech sounds in conveying grammar, but it still plays an important role, and speech can sound strange when it is missing.

Disability affects languages of all types. You could imagine a disability in a cephalopod, for example (as Morgan is doing).

Deafness is not just a disability. Deaf people also form a language community with its own culture. This is why proposing to give people hearing via cochlear implant is so complex: the implant can allow people to hear auditory language, but it simultaneously endangers the culture and language of the Deaf community.

It is a mistake to assume that a channel problem equates to a mental, cognitive, or emotional problem. For example, if we are unable to make or to understand facial expressions, this can be misinterpreted as a mental problem, as if we have a diminished capacity for a rich inner life. It's too easy to assume that because a piece of expected evidence is missing, the inner life itself is also missing.

In The Liars I created a language that was only partially conveyed on the auditory channel. So much of it was conveyed on a magnetic channel humans could not detect that the humans concluded the Poik were cognitively diminished, and this contributed to discrimination and exploitation.

Ask yourself how important the various channels are, and what kind of information each is used for.

#SFWApro
