Parents have many reasons for raising their children with multiple languages. Some hope for better career opportunities for their offspring, while others focus on the reported cognitive and intellectual benefits of learning an additional tongue, including better attention skills, improved memory and quicker decision-making. Still others, such as the writer Ben Faccini in Aeon, want to push back against the worldwide dominance of English. And for countless families, multilingualism is simply a way of life, a tradition they want to pass on to their children. But whatever the motivation for giving children a multilingual upbringing, the questions and worries are the same. How can I manage to teach my child all these languages? What is the best method for achieving this goal? And how can I stand firm against the challenges and difficulties that inevitably come with raising a family that differs from ‘the norm’?

First, a note on terms, since there has been some confusion. Many people use ‘bilingual’ and ‘multilingual’ interchangeably. Traditionally, the first referred to knowledge of two languages, while the second meant that someone spoke three or more. Annick De Houwer, a professor of language acquisition and multilingualism at the University of Erfurt in Germany, uses ‘bilingual’ to describe a person who knows two or more languages.

Most books and articles aimed at families raising bilingual children claim that the best way to teach a child a language is OPOL, the one-person, one-language method: one parent speaks one language (often the majority language) while the other speaks the minority language. For many reasons, this method is not ideal. When De Houwer analysed a large sample of bilingual children living in the Flemish part of Belgium, she found that only a portion of them ended up speaking both languages. The success rate was highest when both parents spoke the home language while the children learned the other language at school. Moreover, even if the parents spoke more than two languages, the children acquired only the ones that the parents actively used at home. This makes sense, given the direct correlation between the amount of time spent talking to a child and the success of language acquisition. However, the exact amount of time needed is not known. Some believe that a child needs to hear a language for more than 30 per cent of waking hours in order to speak it with ease, but there is no scientific evidence to support this figure.

In the first half of the 20th century, multilingualism was a dirty word, and speaking more than one language was associated with a lower IQ and poorer cognitive skills. While these assumptions are wrong, many of them still prevail. Countless families have been asked to stop speaking their mother tongue to their children, out of fear that an additional language or two will not just confuse children but also hamper the integration of immigrants’ children. These fears are unfounded, and are based on a distrust of differences in skin colour, culture and religious identity, said De Houwer.

The prestige that a language enjoys also plays a role in the success of multilingualism.
In Europe, languages such as English, German and French are respected and acknowledged, while eastern European languages such as Russian or Polish, as well as Middle Eastern languages such as Arabic or Turkish, are regarded with suspicion. De Houwer warns that this can lead to children losing the motivation to learn their mother tongues. When their languages are not accepted, she noted, the children don’t feel accepted either. What’s more, language discrimination violates article 30 of the United Nations Convention on the Rights of the Child. Parents can consciously decide to send their children to a school where their language and culture will be respected.

Thanks to the work of researchers such as Ellen Bialystok at York University in Canada, bilingualism has become something of a fashion. In the United States, bilingual and immersion schools, designed specifically to help minorities retain their languages, are now attracting families from white, middle-class backgrounds who have jumped at the opportunity to give their children the gift of multilingualism. This enthusiasm, while commendable, might actually push out the very groups that these schools were built to support.

Middle-class families are also more likely to engage in what the sociologist Annette Lareau at the University of Pennsylvania called ‘concerted cultivation’, an approach in which parents actively foster their children’s skills and abilities in order to maximise success. But these children were also overscheduled, exhausted, and lacked time to play, get bored and just be children. Now multilingualism has become part of the intensive parenting trend. Some parents today approach raising multilingual children the same way they approach nutrition, discipline or schooling: a zero-sum game in which the only route to success is through sacrifice and the surrender of all parental needs. This is not necessary, and it can even be harmful: research has shown that parents, especially mothers, who believe in the tenets of intensive parenting are more likely to be depressed.

Also under attack is the idea that ‘more and earlier is better’. De Houwer said that children learn a foreign language best after the age of 12, noting that the current trend in European countries of teaching children English at an ever-earlier age is not producing the expected results. Starting a language too early – especially if a child is already learning a language at home from one of his or her parents – could cause the child to lose all motivation to learn it.

The sheer amount of information available to parents of multilingual children can be overwhelming, especially on top of all the other parenting advice mothers and fathers get. As a result, parents can become anxious, nervous and unsure of themselves and their teaching techniques. While raising multilingual children is hard work, it’s time to relax. Ultimately, the goal is not even that unique: according to estimates, perhaps half the world’s population already speaks two languages or more, and many speak a dialect on top of that. And the very act of learning languages can be an emotional boon. The biggest benefit, as the linguist Amy Thompson at the University of South Florida found, is that multilingualism makes people more tolerant, because it helps them acquire two very important skills: cultural competence (a better understanding of the various ways that people behave and communicate) and tolerance of ambiguity (how people approach new situations).
It follows, then, that the biggest lesson to learn from bilingualism isn’t really the mechanics: the how, or even the why, of it. It’s much bigger than that. Raising multilingual children is about raising open-minded, tolerant and globally aware citizens. And if that’s the case, parents should overcome the challenges and lead the way.

Olga Mecking

This article was originally published at Aeon and has been republished under Creative Commons.
Marguerite Durling is curious about the origins of this week's word, and even more curious to know "where exactly is my flabber and what does it do when it's aghast?"

After extensive research into your question, Marguerite, I'm left flummoxed, bamboozled, confusticated, absquatulated, spifflicated and somewhat discombobulated – but barely any the wiser. What I can say with some confidence is that flabbergasted dates from at least 1772 and may have originated in Sussex, England. And that's about it.

The Century Dictionary, in the po-faced tone that dictionaries specialise in, states: "Like many other popular words expressing intensity of action, [it is] not separable into definite elements or traceable to a definite origin." That's not so surprising. Nonsense words, like flabbergasted and the others in the second paragraph above, have a habit of appearing out of seemingly nowhere, and attempts to unpack them generally end in tears. Your flabber, Marguerite, is an elusive entity of unknown location whose name may or may not have something to do with flabby or flappy. What it does when it's aghast is equally unknown. Sorry.

One of the likely keys to inventing a successful nonsense word is to give it at least the appearance of originating from an actual word, Latin being a good choice. Confusticated, absquatulated and discombobulated all fall into that category. Then there are words like bunkum, which look totally made up but aren't exactly. Bunkum is a phonetic spelling of Buncombe, a county in North Carolina, USA. In 1820, Buncombe Congressman Felix Walker launched into a long, dull speech for the sole purpose of ensuring his words would be reported in the local press, and inadvertently gave us a word that describes not only most parliamentary speeches ever since, but ninety per cent of what appears on social media today as well.

One of my favourite recent discoveries is kakistocracy, a term for government by the worst people. Tweeted recently by the former CIA director John Brennan to describe you-know-who's administration, it dates back to 1829, when the author Thomas Love Peacock coined it as an antonym for aristocracy. It comes from the Greek kakistos – worst – which may have its origins in the Proto-Indo-European kakka, to defecate. Go, Mr Brennan! By the way, if you're after a good read, grab Peacock's Nightmare Abbey or Crotchet Castle, two hilarious gothic novels that take the piss out of the then burgeoning Romantic movement and still hold up beautifully nearly 200 years after publication.

For such a popular drink, coffee has a relatively short history – there are no records of it having been consumed before the mid-1400s. No wonder they were called the Dark Ages.

Given its Arabian origins, it’s hardly surprising that the word coffee also originates from there. It percolated its way into English around 1600 via the Italian caffe, which was drawn from the Turkish kahveh, which trickled down from the Arabic qahwah. As English writers are wont to do, they spelt the word a gazillion different ways before settling on today’s eminently sensible version. One of the more outlandish early options was chaoua. One can only imagine that whoever came up with that was barely awake and desperately in need of a strong cup of the very thing they were writing about.

As a callow youth, I was right into instant coffee with my morning toast and jam.
That disgusting brew (my palate is vastly more discerning now) was made from the rough-and-ready robusta, as opposed to the arabica species that gives us those heady brews – brews that send Aucklanders into paroxysms of delight and almost make us forget how long it took us to crawl the one kilometre to the café on our city’s pathetically overcrowded and underfunded roading network.

One of my most exciting coffee adventures came when my former brother-in-law and his wife returned from a trip to Malaysia with a packet of Indonesian kopi luwak for my birthday. In the delicate words of Wikipedia, this coffee “undergoes a peculiar process” – namely, being first eaten by the Asian palm civet and subsequently evacuated from its bowels, having fermented slightly and developed a “uniquely rich, slightly smoky aroma and flavour, with hints of chocolate”. E. coli be damned – I was in heaven.

Although horrifically expensive – around $30 a cup – kopi luwak pales in comparison to the black ivory coffee beans fed to elephants in Thailand, which then perform a similar function to the civet on a presumably grander scale. The resulting delicacy sells for three times the price of the civet’s efforts. Despite that, what’s the bet that the poor sod who trundles after the elephant with a trowel and bucket earns only a pittance?

As for my fellow coffee nuts, you’ll be pleased to know that coffee consumption is associated with multiple health benefits, including lower rates of cardiovascular disease, a reduced risk of type 2 diabetes and – one health benefit to rule them all – a longer lifespan. Then, of course, there's the benefit to our partners. Who doesn't prefer a tolerable, post-coffee human being to the monosyllabic monster they woke up to a few minutes earlier?

If you think the quality of American political debate has taken a sudden turn for the worse since you-know-who took office, now might be the time to rethink that view. Check out this 2012 article. If it doesn't depress you, nothing will.
Reader Marguerite Durling recently saw an ad for a movie called The Kids Are All Right, leading her to wonder whether it was a "political movie or another instance of 'alternative' spelling?" The latter, Marguerite – and you can drop the quotation marks, young lady. While alright is a legitimate spelling – and the one used by The Who in their song – all right has long been regarded as the preferred option for formal writing. Here's what Merriam-Webster has to say about it.

No one reads anymore? Go on. Rumour has it that people no longer have time to read. I'm not convinced. I think that, if anything, fewer writers are taking the time to actually write. Everywhere I look, weak writing abounds. Those who produce it are often the loudest at proclaiming that people don’t read. That’s like serving up slop in a kitchen and then saying people don’t eat.

A recent Pew study found that people were reading just as many books in 2016 as they did in 2012. What’s more, the number of books people read is closely associated with their level of education. In other words, smart people read.

Having bought the lie that no one reads, many businesses devote little thought to the quality of what they say. Hence the flood of beautiful but vacuous websites, annual reports, brochures, newsletters and other material. Failing to say clearly and simply what you mean has a massive impact: people stop listening. Vast sums are spent on design, yet content – dwell on that word for a moment if you will – is often left to non-specialists. Forgive me, but are you insane? Would you forgive a restaurant that served up immaculately presented but tasteless food?

My company (and I) stand for clear, powerful, engaging writing. By that, I mean writing that is not only understandable but also demands to be read. Businesses that take the trouble to deliver such writing get listened to in a way that those that don’t never will. They also make the world a better place. But that’s another story. Like great design, clear, simple writing can look easy. And just like great design, it’s anything but.
Words have a habit of slowly and subtly shifting in meaning over time, so that after a few centuries they can come to mean something quite different from what they once did. Case in point: today's word. When you and I think of waiting, our mental image is probably of someone hanging about passively, and possibly a little bored. In the 1200s, that wasn't what waiting was about at all – a wait was a watch, guard or sentry, and failure to remain alert could have dire consequences (see Game of Thrones; any episode will do). An alternative sense at the time was to lurk with hostile intent, a sense we retain to this day in the phrase lie in wait.

It wasn't until the 1400s that the meaning to remain in one place began to assert itself. The first waiting room was recorded in the 1680s (presumably that's when the first 10-year-old Cleo magazines and Reader's Digests became available), and the sense of waiting something out (think Australians and their desire to win the Bledisloe Cup) arose in the mid-1800s. As for that hard-working, underappreciated soul who caters to your every whim at your favourite restaurant before you walk away leaving a pathetic tip, you miserable sod, waiters – or "attendants at the table" – are a creation of the 1560s. Lucky for you they're not waits in the ancient sense of the word. Otherwise you might end up, Game of Thrones style, with a spear in your rear.

If you flinch whenever you hear someone address a group as "youse", consider that it's you, not the speaker, who's the uneducated, ignorant one. Are you still there? Great! Now let me tell you why this is so.

In standard English, you is both singular and plural. While that may seem unremarkable to you (that’s you sitting in your seat, and all of you collectively), it separates English from many other languages that do distinguish the singular and plural forms. It also distinguishes standard English from its many non-standard forms. “Hey youse!” is a perfectly sensible way for the speaker to make it clear that he’s addressing the whole group, not just one person. Likewise with “y’all”, “youse guys” (mainly New York) and “yinz” (Pennsylvania). It also makes the speaker more consistent than everyone else. Standard English already distinguishes between I and we, and between him/her and them. You is the only personal pronoun with just the one form for both singular and plural.

But having a separate form for each would be useful, no? In fact, we once did. Early English had thou, singular, and ye, plural (hence "hear ye, hear ye"). After the Norman invasion, thou gradually became a familiar form of address, and you a formal, deferential option. So if you were chatting with the king, he’d say “Wouldst thou like a bowl of maggot-infested gruel?” and you would grovellingly reply “If it pleases you, Your Majesty”. Then, around the 18th century, thou began to fall out of favour. The reasons are not entirely clear, but some commentators invoke an emerging spirit of egalitarianism. Either way, that left you to do the heavy lifting for both singular and plural references – and left the so-called uneducated masses to do something about it by inventing new plural forms. Good for them, I say, the clever dicks. Youse may not be standard English, but it's certainly neat, clean and logical. You don’t have to like it, but heaping scorn on it is both illogical and unjust.
Popular theory has it that this little gem of a word was coined by Theodor Geisel, aka Dr Seuss, in his 1950 book If I Ran the Zoo. That may be a stretch, however. While Dr Seuss almost certainly played a key role – if not the key role – in popularising nerd, the word had been around for a few years already as a variation on nert, which itself was a humorous pronunciation of nut. In If I Ran the Zoo, nerds are what the Oxford Words blog describes as "small, unkempt, humanoid creature[s] with a large head and a comically disapproving expression". Some would argue that the word's meaning hasn't altered since, but try telling that to Bill Gates or Mark Zuckerberg during the job interview.

Related insulty-type words include geek and the rather aggressive nimrod. Geek originally meant a sideshow freak (hence Bob Dylan's lyric "you hand in your ticket / to go see the geek"), and possibly derived from the German geck, to mock or cheat. The modern sense of a social misfit with advanced computer skills dates from the early 1980s. Like nerd, geek has been reclaimed by those who've been labelled with the term, which makes it a kissing cousin of words like gay, dyke, queer and even some racially abusive terms that we won't repeat here.

Nimrod, on the other hand, was the son of Cush, referred to in Genesis as "a mighty hunter before the Lord". How the name became an insult is a mystery, although various theories have been suggested – none of them convincing. Whatever the case, one thing you can be sure of is that if anyone calls you a nimrod today, they're not expressing admiration for your impressive pedigree.

POSTSCRIPT: I wrote this article originally in 2013. Today I received an email newsletter from World Wide Words that addressed Nimrod's fall from grace and may render my claim in the paragraph above redundant. You can find the WWW piece here.