Alphabet Street aka Corpus Symposium at VRTwebcon 8

I was delighted to take part in my first webinar as a presenter. Leo Selivan (@leoselivan) asked me to join the corpus symposium for the 8th VRT web conference alongside Jennie Wright (@teflhelper) and Sharon Hartle (@hartle). You can find links to our talks, as well as my slides, at the end of this post.

Presenting in a webinar is definitely a unique experience: it is like talking to yourself while knowing that others are watching and listening in. Other things to note are making sure your microphone is loud enough and that PowerPoints uploaded to online systems like Adobe Connect don’t show your slide notes!

My talk was about using the BYU-Wikipedia corpus to help recycle coursebook vocabulary and was titled Darling (BYU) Wiki, in homage to the recent passing of the great musician Prince. Another webinar note – people can’t hear the music from your computer if you have headphones on!

As I have already posted about using BYU-Wiki for vocabulary recycling, in this post I want to give some brief notes on designing worksheets using principles from the research literature. When talking about the slide below, I did not really explain in the talk what input enhancement and input flood are, and I also did not point out that my adaptation from Barbieri & Eckhardt (2007) was very loose : ).

[Slide: worksheet design]

“Input enhancement draws learners’ attention to targeted grammatical features by visually or acoustically flagging L2 input to enhance its perceptual saliency but with no guarantee that learners will attend to the features” (Kim, 2006: 345).

For written text, enhancement techniques include underlining, bolding, italicizing, capitalizing, and colouring. Note that the KWIC output from COCA uses colour to label parts of speech.

Input flood similarly enhances saliency, in this case through sheer frequency, and draws on studies showing the importance of repetition in language learning.
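To make the two ideas a little more concrete, here is a minimal sketch (my own illustration, not something from the talk or from Barbieri & Eckhardt) of how you might visually flag target collocations in a worksheet text and check how often they occur in it. The text and the collocation list are invented examples.

```python
import re

# Invented worksheet text and target collocation, for illustration only
text = ("Heavy rain fell all night. The heavy rain caused flooding, "
        "and more heavy rain is forecast for the weekend.")
targets = ["heavy rain"]

def flag_and_count(text, targets):
    """Bold each target item (input enhancement) and count its
    occurrences (a rough check on the level of input flood)."""
    counts = {}
    for item in targets:
        # Naive matching: exact phrase, case-insensitive, no lemmatisation,
        # so inflected or re-ordered variants of a collocation are missed.
        pattern = re.compile(re.escape(item), re.IGNORECASE)
        counts[item] = len(pattern.findall(text))
        text = pattern.sub(lambda m: f"**{m.group(0)}**", text)  # Markdown-style bolding
    return text, counts

enhanced, counts = flag_and_count(text, targets)
print(enhanced)  # worksheet text with the collocation visually flagged
print(counts)    # {'heavy rain': 3} – is that enough of a flood?
```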

Szudarski & Carter (2015) concluded that a combination of input enhancement and input flood can lead to performance gains in collocational knowledge.

Hopefully this post has briefly highlighted some points I did not cover in my 20-minute talk. A huge thanks to those who took the time to attend, to Leo and Heike Philp (@heikephilp) for organizing things so smoothly, and to my co-presenters Jennie and Sharon. Do browse the recordings of the other talks as there are some very interesting ones to check out.

Talk recording links, slides and related blog posts

Jennie Wright, Making trouble-free tasks with corpora

Sharon Hartle, SkELL as a Key to Unlock Exam Preparation

Mura Nava, Darling (BYU) Wiki

Question and Answer Round

My talk slides (pdf)

Summary Post by Sharon Hartle

8th Virtual Round Table Web Conference 6-8 May 2016 program overview

References and further reading:

Barbieri, F., & Eckhardt, S. E. (2007). Applying corpus-based findings to form-focused instruction: The case of reported speech. Language Teaching Research, 11(3), 319-346.

Han, Z., Park, E. S., & Combs, C. (2008). Textual enhancement of input: Issues and possibilities. Applied Linguistics, 29(4), 597-618.

Kim, Y. (2006). Effects of input elaboration on vocabulary acquisition through reading by Korean learners of English as a foreign language. TESOL Quarterly, 40(2), 341-373.

Szudarski, P., & Carter, R. (2015). The role of input flood and input enhancement in EFL learners’ acquisition of collocations. International Journal of Applied Linguistics.


#IATEFL 2016 – Corpus Tweets 2

This is a Storify of tweets by Sandy Millin, Dan Ruelle and Leo Selivan on the talk Answering language questions from corpora by James Thomas. Hats off to the tweeters; I know it’s not an easy task!


Answering language questions from corpora by James Thomas as reported by Sandy Millin, Dan Ruelle & Leo Selivan

  1. James Thomas on answering language questions from corpora. Did not know Masaryk uni was home of Sketch Engine!
  2. JT has written a book about discovering English through SketchEngine with lots of ways you can search and use the corpus
  3. JT trains his trainees how to use SketchEngine, so they can teach learners how to learn language from language
  4. JT Need to ensure that tasks and texts have a lot of affordances
  5. We live in an era of collocation, multi-word units, pragmatic competence, fuzziness and multiple affordances – James Thomas
  6. JT Why do SS have language questions? Are the rules inadequate? It’s about hierarchy of choice…
  7. JT Not much choice in terms of letters or morphemes, but lots of choice at text level
  8. JT Patterns are visible in corpora. They are regular features and cover a lot of core English
  9. JT What counts as a language pattern? Collocation, word grammar, language chunks, colligation (and more I didn’t get!)
  10. JT Students have questions about lexical cohesion, spelling mistakes, collocations: at every level of hierarchy
  11. JT Examples of q’s: Does whose refer only to people? Can women be described as handsome? Any patterns with tense/aspect clauses?
  12. JT q’s: Does the truth lie? What is friendly fire? What are the collocations of rule?
  13. JT introduces SKELL: Sketch Engine for Language Learning http://skell. (don’t know!)
  14. “Rules don’t tell whole story” – James Thomas making an analogy w/ Einstein who said same about both the wave & the particle theory
  15. JT SKELL selects useful sentences only, excludes proper nouns, obscure words etc. 40 sentences
  16. http://skell.sketchengine.co.uk – nice simple interface, need to play with it more. #iatefl

  17. JT searched for mansplain in SKELL and it already has 7 or 8 examples in there
  18. JT Algorithm to reduce the number of sentences only works when there are a lot of examples. With a few, sentences are often longer
  19. Sketch Engine is a pretty hardcore linguistic tool, but I can see the use of Skell for language learners. #iatefl
  20. JT Corpora can also teach you more about grammar patterns too, for example periphrasis (didn’t get definition fast enough!)
  21. JT Can search for present perfect continuous for example: have been .*ing [see the regex sketch after this list]
  22. JT You can search for ‘could of’ in SKELL – appears fairly often, but relatively insignificant compared to ‘could have’
  23. Can use frequency in corpus search results to gauge which is “more correct” / “the norm”. #iatefl
  24. JT SKELL can sort collocations by whether a noun is the object or subject of a word for example. Can use ‘word sketch’ function
  25. Unclear whether collocation results in Skell are sorted according to “significance” / frequency or randomly #iatefl
  26. JT See @versatilepub for discounts on book about SKELL
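As a footnote to tweet 21: the SKELL query have been .*ing translates fairly directly into a regular expression if you want to hunt for the present perfect continuous in your own texts. This is my own rough sketch, not something from the talk, and the sample sentence is invented.

```python
import re

# Rough regex version of the SKELL query 'have been .*ing' (tweet 21).
# It over-matches (e.g. 'have been nothing') and misses negatives and
# adverbs ('have not been working'), so treat it as a starting point only.
PPC = re.compile(r"\b(?:have|has|'ve)\s+been\s+\w+ing\b", re.IGNORECASE)

sample = ("They have been working with corpora all week, "
          "and she has been writing up the results.")

for match in PPC.finditer(sample):
    print(match.group(0))
# -> have been working
#    has been writing
```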