English language cues

Nonverbal communication cues are gestures that your body naturally gives in certain situations to display your feelings, either about that particular situation or toward the people around you. A person needs to be aware of the nonverbal signals that he or she is sending, as well as recognize and accurately interpret other people's nonverbal signals. Interpretation needs to be done carefully, and a person must also take the other person's culture into consideration. Misinterpretation can lead to a misconstrued situation in which one individual gets the wrong perception, resulting in the discontinuation of the conversation and possibly in less interaction (if any at all) between the two.

Facial expressions are just one of the many nonverbal cues that a person displays. One might not think about it, but facial expressions are probably the most actively used nonverbal cues that people encode and decode on a daily basis. According to Ekman and Friesen (as cited in Hansen, Johnsen, Hart, Waage, & Thayer, 2008, p. 639), humans are able to identify and distinguish between six different facial expressions of emotion: fear, sadness, anger, disgust, happiness, and surprise, as well as a neutral expression. Every facial expression can have multiple interpretations, depending on the decoder. Think about an individual who is smiling. Where that individual is, what he or she is doing, whom or what he or she is smiling at, or even just how the smile looks can all give the smile a different effect or meaning in different relationships and across various cultures. We, as a society, have certain unwritten rules that govern our views of nonverbal expression and interpretation, defining what we believe is socially acceptable. In the example of the smiling individual, we generally know how long a smile is supposed to last under different circumstances, or whether we are supposed to be smiling at all. We also know when it is appropriate to display the other emotions listed above. In this paper I will compare facial expressions across various topics and situations, including culture, gender, peer relationships, online communication, and sign language.


In just about everything people do, there is a reflection of their particular cultural background. Emotions are, for the most part, cross-cultural; however, culture does play a role in how we express those emotions through facial expressions in different situations. Research by Matsumoto, Yoo, Hirayama, and Petrova (as cited by Yamamoto & Suzuki, 2006, p. 168) analyzed social interactions among people belonging to American, Japanese, and Russian cultures. Their studies concluded that the Japanese were better able than Americans to control their facial expressions of emotion in every social interaction in the study. Adding further evidence of cultural variance, Friesen (as cited by Yamamoto & Suzuki, 2006, p. 168) found that, while watching a predominantly distressing film, Americans openly revealed their negative attitudes whereas the Japanese controlled their emotions and revealed mere smiles. It has also been found that people are usually better at correctly identifying emotional facial expressions conveyed by people of their own race than those conveyed by someone with a different cultural background (Wickline, Bailey, & Nowicki, 2009).


As far as gender differences go, girls tend to be better at identifying, as well as interpreting, nonverbal communication in general. Through my research, I found that it is still up for debate whether females are more accurate than males at decoding facial portrayals of emotion. The majority of the studies I reviewed concluded that females are better interpreters than males, whereas other studies say this has not yet been proven and that further research is needed to establish it accurately. A study by van Beek and Dubas (2008) found that girls and older teens, as opposed to boys and younger children, held more negative opinions of the basic expressions they were able to identify. Another study used 11 photos of facial expressions conveying seven different emotions; six of the 11 photos were of females and the rest were of males. The researchers asked participants to identify the emotion represented in each picture. In this study the female facial expressions, especially anger and sadness, were identified more accurately than the photos of the males (Palermo & Coltheart, 2004). Swenson and Casmir (1998) went into further detail about gender accuracy by suggesting that:

A biological model would predict that a female is born with more finely-tuned senses as a protective measure. Born without the physical strength of her male counterpart, she has to be able to interpret danger more acutely as a protective measure. From a sociological standpoint, we can say the female, through conditioning, learns to be more attuned to others in order to gain power through cooperation rather than through aggressive competition. (p. 223)

Peer relationships

According to four separate studies conducted by Ekman, Siegman and Feldstein, DePaulo, and Noller (as cited by Schachner, Shaver, & Mikulincer, 2005, p. 141), nonverbal communication can have profound effects on social interaction as well as personal relationships. If an individual is not capable of properly encoding and decoding, the consequences can include problems with mental health, social adjustment, and contentment in his or her various relationships. These consequences could result in an individual becoming a social outcast. In our peer relationships we rely heavily on facial expressions to judge how other people are feeling when we interact with them. With this knowledge we are able to fine-tune our communication so that it is effective in each given situation. If you do not know how the other person feels, it often becomes very hard to communicate with them, because you do not know how to direct your communication or which topics are appropriate to discuss. One study supports this by saying, "if we can more adequately interpret the expression of internal feelings by others, we can more effectively design and manage our own communication efforts to help produce mutual understanding" (Swenson & Casmir, 1998, p. 214).

Online communication

We look at each person's facial expressions to tell how that person is feeling or whether something might be wrong. We also depend on facial expressions and other nonverbal cues to accurately interpret what the other person is saying. Without these expressions and cues, the messages people try to convey can have numerous meanings, which is why it is often harder to communicate with someone online. Thankfully, in the newer instant messaging systems we have today, we are able to signal our emotions when chatting online and so communicate more effectively. The little faces we can use while chatting are called emoticons. Since we are not directly in front of the other person and cannot see his or her actual face, we have to rely on these self-selected emoticons in the hope that each individual accurately displays and interprets the intended mood.
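As a toy illustration of how emoticons stand in for facial expressions in text chat, here is a minimal Python sketch; the mapping and the function name are hypothetical and are not taken from any real messaging system:

```python
# Hypothetical mapping from common emoticons to some of the basic
# emotions named by Ekman and Friesen; not any real chat client's table.
EMOTICON_TO_EMOTION = {
    ":)": "happiness",
    ":(": "sadness",
    ">:(": "anger",
    ":O": "surprise",
}

def emotions_in(message):
    """Return the emotions signalled by any emoticons found in the message."""
    return [emotion for emoticon, emotion in EMOTICON_TO_EMOTION.items()
            if emoticon in message]

print(emotions_in("I passed the exam :)"))   # → ['happiness']
```

The point of the sketch is the limitation discussed above: the receiver can only decode what the sender chose to type, so a message with no emoticon carries no such cue at all.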

Sign language

People who use sign language to communicate rely solely on nonverbal cues to interpret the ideas and emotions individuals are trying to express to them. As in online conversations, a sentence can be interpreted in more than one way, but luckily in this case individuals can actually see the facial expressions of the people they are communicating with. With this information, people who depend on sign language can engage in more efficient conversation with those around them. Evidence of this is provided by Patricia Siple (as cited in Muir & Richardson, 1995, p. 392), who "observed that subjects viewing sign language look at the face, with small excursions around the face, of the signer. This behavior demonstrates the importance of the face in giving clues to the meaning of gestures."


I never used to think about how much you can deduce from another person's facial expressions or how vital those expressions are for carrying out effective conversations. Through my research and classroom discussions, my eyes have been opened to how often we express our emotions through nonverbal cues, especially facial expressions. Whether we realize it or not, facial expressions are encoded and decoded on a daily basis without our even having to think about it. People need to recognize the importance of how they interpret the various nonverbal behaviors that others display, and to be aware of the signals they themselves send through different bodily gestures. As with everything studied, further research needs to be conducted to better understand how these expressions affect our lives in various situations.

Source: https://www.ukessays.com/essays/english-language/nonverbal-facial-communication-cues-analysis-english-language-essay.php


Attributes of prosody

There is no agreed number of prosodic variables. In auditory terms, the major variables are

  • the pitch of the voice (varying between low and high)
  • length of sounds (varying between short and long)
  • loudness, or prominence (varying between soft and loud)
  • timbre (quality of sound)

In acoustic terms, these correspond reasonably closely to

  • fundamental frequency (measured in hertz, or cycles per second)
  • duration (measured in time units such as milliseconds or seconds)
  • intensity, or sound pressure level (measured in decibels)
  • spectral characteristics (distribution of energy at different parts of the audible frequency range)
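To make two of these units concrete, here is a minimal Python sketch; the function names are my own, and the 20 micropascal reference pressure is the conventional one for sound pressure level in air:

```python
import math

def fundamental_frequency(period_seconds):
    # Fundamental frequency in hertz is the reciprocal of the period
    # of vocal-fold vibration (cycles per second).
    return 1.0 / period_seconds

def sound_pressure_level(p_rms_pascal, p_ref=20e-6):
    # Intensity expressed as sound pressure level in decibels,
    # relative to the conventional 20 micropascal reference in air.
    return 20.0 * math.log10(p_rms_pascal / p_ref)

# A glottal period of 10 ms corresponds to a 100 Hz fundamental,
# and an RMS pressure of 0.02 Pa corresponds to 60 dB SPL.
print(fundamental_frequency(0.010))
print(sound_pressure_level(0.02))
```

The logarithmic decibel scale is why, perceptually, equal steps in loudness correspond to multiplicative rather than additive changes in sound pressure.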



Some writers have described intonation entirely in terms of pitch, while others propose that what we call intonation is in fact an amalgam of several prosodic variables. The form of English intonation is often said to be based on three aspects:

  • The division of speech into units
  • The highlighting of particular words and syllables
  • The choice of pitch movement (e.g. fall or rise)


Stressed or prominent syllables are typically marked by one or more of the following cues:

  • pitch prominence, that is, a pitch level that is different from that of neighbouring syllables, or a pitch movement
  • increased length
  • increased loudness
  • differences in timbre: in English and some other languages, stress is associated with aspects of vowel quality (whose acoustic correlate is the formant frequencies or spectrum of the vowel). Unstressed vowels tend to be centralized relative to stressed vowels, which are normally more peripheral in quality[3]




Cognitive aspects





Listeners asked to judge emotion from prosody identify some emotions more reliably than others:

  • Anger and sadness: High rate of accurate identification
  • Fear and happiness: Medium rate of accurate identification
  • Disgust: Poor rate of accurate identification

Child language


Brain regions involved

See also

  • Intonation
  • Paralanguage
  • Phonological hierarchy
  • Prosodic unit
  • Prosody (poetry)
  • Semantic prosody, or discourse prosody
  • Tempo of speech


Further reading

  • Nespor, Marina. "Prosody: an interview with Marina Nespor". ReVEL, vol. 8, n. 15, 2010.
  • Nolte, John. The Human Brain, 6th edition.

External links

  • Lessons in Prosody (from the University of Freiburg, preserved by the Internet Archive)
  • Prosody on the Web - (a tutorial on prosody)

Source: https://en.wikipedia.org/wiki/Prosody_%28linguistics%29


Assessment Purpose & Use

Alternate ACCESS for ELLs is an assessment of English language proficiency (ELP) for students in grades 1-12 who are classified as English language learners (ELLs) and have significant cognitive disabilities that prevent their meaningful participation in the ACCESS for ELLs assessment. The No Child Left Behind Act (NCLB; 2001) requires that all students identified as ELLs be assessed annually for English language proficiency, including students who receive special education services. The Individuals with Disabilities Education Act (IDEA; 2004) also mandates that students with disabilities participate in state-wide and district-wide assessment programs, including alternate assessments with appropriate accommodations, when it is documented in their Individualized Education Programs (IEPs). For this reason, WIDA created the Alternate ACCESS for ELLs to meet federal accountability requirements and to provide educators with a measure sensitive to the English language proficiency growth of ELLs with significant cognitive disabilities.

Assessment Overview

Alternate Language Proficiency Levels

Alternate ACCESS for ELLs aligns with the WIDA Alternate English Language Proficiency levels. These levels were designed to expand upon Level P1 (Entering) by increasing the sensitivity of the measure for students who have significant cognitive disabilities. The alternate ELP levels give students a chance to demonstrate progress within Level P1.

Alternate Model Performance Indicators (AMPIs)

Each AMPI specifies three elements:

  • language function (e.g., indicate, match, locate),
  • example topic (e.g., text elements), and
  • form of support (e.g., sensory, graphic, interactive).

Example AMPIs for Grade-Level Cluster 9-12 in the Standard of Language of Science in the Domain of Reading:

Level A1: Attend to labeled pictures related to science
Example activity: Teacher presents student with labeled pictures of weather conditions. Student attends to the pictures by demonstrating eye gaze, making sounds, etc.

Level A2: Match pictures with science vocabulary words
Example activity: Teacher presents student with three pictures depicting weather conditions. Student matches the pictures to the correct words (e.g., sun, cloud, snow).

Level A3: Locate single components of data from everyday sources represented in tables
Example activity: Teacher presents student with a weather forecast from a newspaper and asks “What day will it be rainy?” Student indicates the correct day.

English Language Development Standards

Test items are written from AMPIs and MPIs from four of WIDA’s ELD standards:

  • Social & Instructional Language
  • Language of Language Arts
  • Language of Mathematics
  • Language of Science
Test Section                 Standards               Number of Tasks   Range of Levels
Listening                    SIL, LoMA, LoSC, LoLA   9                 A1-A3 and P1-P2
Reading                      SIL, LoMA, LoSC, LoLA   9                 A1-A3 and P1-P2
Speaking (Parts A and B)     LoMA, LoSC              8                 A1-A3 and P1-P2
Writing (Parts A, B, and C)  SIL, LoSC, LoLA         10                A1-A3 and P1-P3

Language Domains

Each test form assesses the four language domains of Listening, Speaking, Reading, and Writing.

Grade-Level Clusters

Test forms are divided into the following grade-level clusters:

  • Grades 1-2
  • Grades 3-5
  • Grades 6-8
  • Grades 9-12

Sample Items

The Sample Items publication is intended to help Alternate ACCESS for ELLs test administrators become familiar with the new features of the assessment. Within this document, one sample item is provided for each domain (Listening, Speaking, Reading, and Writing) in the 3-5 grade level cluster so that test administrators can see how test items are formatted in each section.

Test Development

The Future of Alternate ACCESS for ELLs

The Alternate Model Performance Indicators are currently being revised to align with the Common Core Essential Elements and WIDA’s 2012 Amplification of the ELD Standards. Once the AMPIs are revised, the Alternate ACCESS for ELLs test forms will be modified to reflect the updated framework.

Technical Reports


Source: https://www.wida.us/assessment/alternateaccess.aspx
