Well, than…

It may be clear by now to anyone who’s read my previous posts: I’m a big fan of grammar and language. Combine that with an adulthood spent reading the internet, add chronic pedantry, and you’ll find that reading most internet comments is just a whirlwind of frustration for me.

I should, however, preface this post by saying that complaints about pervasive language mistakes are futile. Language evolves – that’s one of the many wonderful things about it – and as language is ultimately a tool to enable mutual understanding and communication, one man alone can’t fight it. If “step foot” (ew) becomes more used, and thus better understood, than its correct counterpart “set foot”, or “parting shot” eventually displaces “Parthian shot”, then that is the way of language and it’s not worth trying to fight. I’m writing this post, though, to encourage anyone reading to stop and think about the words they’re using, their inherent functions, and their relations to each other. Such is the fascination of grammar.

Aside from the uneducated abuse of “you’re” and “its”, there are some things people often overlook that are actually quite interesting when you stop and think about them. (I want this to be more of a “this is quite cool” post than a “for crying out loud, people” post.)

What I’m really talking about here is the use of “than”. It’s a wonderful word, almost directly equivalent in its usage to “quam” in Latin, and I’d really hate to see it die out. There are far too many people who replace it with “then”, a practice which needs to be expunged without mercy. I have no time for people who say “I’m better then you” or any other such ironic boast where the comparative form is followed by “then” rather than “than”.

And here’s the thing – “than”, like “quam”, comes almost exclusively after the comparative form of an adjective. It is used to compare two nouns or noun phrases. As I’m sure you know, you never say “A is good than B” or “A is best than B”; it can only be used following the comparative – “better than”, “bigger than”, “more incomprehensible than”, “spindlier than”, “more verbose than”, etc. One of my pet hates, therefore, is when people use the phrase “different than”. It comes up everywhere; people have been using it for years (see ‘You’re No Different’ by Ozzy Osbourne, off Bark At The Moon), and nobody actually picks up on the fact that it’s a mistake.
As I’ve just established, “than” is used with the comparative form of an adjective. “Different”, though entailing a comparison by its meaning, is in fact the positive form of the adjective; “more different” is the comparative form. The preposition you should be using with “different” is “from”. We can see this when we look at the roots of the word “different”: it comes from the verb “differ”. The “-ent” suffix is just the Latin present participle ending; it’s the same as saying “differing”. And you’d never say “A differs than B.” You’d say “A differs from B”, e.g. “My opinion differs from yours” or “My opinion is differing from yours”, and so you’d say “My opinion is different from yours.” There are some people who would also say “different to” – however, if you look even further into the roots of the word, the “di-” prefix is a form of the Latin “dis-”, meaning “apart” or “away from”, so “from” is what I’m sticking with.
The only time you should ever actually say “different than” is, as mentioned above, when it’s the comparative form of “different”. If, for example, both A and B are different from C, you could say “A is more different than B from C”. But you *should* never say “A is different than C”. That said, you probably will – such is the nature of language.

More about “than”: there are a couple of instances where you use “than” and it doesn’t feel like it follows a comparative adjective. I’m talking about “other than” and “rather than”. But here’s what’s really interesting: don’t both “other” and “rather” look like comparative forms of hypothetical adjectives “oth” and “rath” respectively? Moreover, both of those words introduce a sense of comparison: “Can we eat pizza, [rather] than foie gras?” can be translated as “Can we eat pizza, [that would be more enjoyable] than foie gras?” Even “other” still introduces a sense of difference, separation, or distance between the two subjects. If a subject is “A”, “oth-” could be translated as “[not A]”. So to say “we need something other than A” could be translated as “we need something [less A] than A.”

Bit of a bumpy, theoretical ride there, but I think I got my point across. If you made it through, good on you.

JH

P.S. Superlative forms traditionally end in “-est” or “-st”: biggest, most, starkest, greenest, etc. But… so does “best”. Think about it. And, in fact, so does “last” (or [most final]). I just blew my own mind.

Example =/= epitome

We as a generation have an interesting relationship with clichés and analogies. Saturated as we are in mass media and pop culture, it’s very easy for the hivemind to pick up on one theme, or one item, and run with it, at the expense of paying attention to other equally worthy items. A recent example would be the collective outrage over Cecil the Lion, when cynics were quick to point out that we were ignoring the gross animal abuses that take place on a much larger scale in our very own livestock industries for the sake of a single lion, rather humanely (if illegally) killed.

There is probably encyclopaediarum* worth of blog posts and articles already written on the disproportionality of trends in social media, but that’s not quite what I’m getting at here. What I’ve been thinking about is how, thanks to this pop culture (and social media) saturation over the last 50 years, one historical instance of a subject can be repeated and repeated until it becomes the go-to example for that subject. In internet writing on any subject matter, 99% of the time this particular example will be the one to come up.
*[genitive plural, just like “years’ worth”, check your Latin]

To put my fact where my abstract is: need to draw a comparison to a cult? You’ll probably end up naming the Manson family. Need a nuclear disaster? Chernobyl. Racial segregation? South African apartheid. Need a mass murderer? Jack the Ripper will usually do the trick, or Ed Gein, especially if you want him to be a bit “weird” with his victims. More positive examples: in sport, footballers are so often looking for “the next [Beckham/Pelé/Maradona/Messi/Ronaldo]”, and hockey players are ALWAYS talking about Wayne Gretzky. It’s hard to talk about rock guitar without someone bringing up Hendrix or Jimmy Page (ew).
And of course, the one I’ve tried to avoid as much as possible – need a totalitarian state? Nazi Germany. A state where fearmongering wins out over rationality? Nazi Germany. Books are burned? Nazi Germany. Racism? Nazi freaking Germany. Is someone slightly controlling, cruel, or abusive of their authority? They’re “worse than Hitler”. Nazi Germany embodied the extremes of so many tropes of history that Godwin’s Law even states that “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.”

I’m not saying that these examples are irrelevant or unjustified: quite the opposite. (In fact, I’ve got a blog post coming up specifically about how comparisons to the Nazis get dismissed as cliché more readily than they should be. They’re often perfectly appropriate – the rise of UKIP, alongside Katie Hopkins and David Cameron recently calling migrants “cockroaches” and “swarms” respectively, exemplifies exactly how insidiously the politics of fear can take hold, as they did in 1930s Germany.) My point is that thanks to their repeated reference in popular culture, these examples become the only ones we can think of. Case in point: all of the examples I used above are based entirely on my familiarity with the internet and pop culture, and have nothing to do with my knowledge of the subjects themselves.

My problem with this is twofold: first, it impedes our expanding minds and limits our education to the bare minimum needed to understand the subject. When one example will suffice, why would we ever need to look further into the subject, even if there are more appropriate examples of just slightly less renown? Second, the examples aren’t always the best, most appropriate, or most egregious instances of their subjects.
E.g.:
-To refer to the “next Beckham” in British football discounts the legacies of Alan Shearer, Gary Lineker and Frank Lampard. (I can’t expand much more, as my knowledge of football is pretty much limited to those “big names” – which rather makes my point for me.)
-When you’re talking about innovative rock in the 1960s and 70s, I won’t dispute Hendrix’s title as one of the absolute greats, but frankly Tony Iommi’s sound and playing style took rock guitar in equally new and inventive directions. I won’t even hear a case for Jimmy Page (hence the “ew” above) – one of the most sued-for-plagiarism musicians in rock history – when his contemporary Ritchie Blackmore was busy pioneering the neoclassical guitar style. Blackmore and Iommi just happen to be a little bit less famous. An up-and-coming player with amazing speed and tapping skills is more likely to be compared to Page or Clapton (ironically nicknamed “Slowhand”) than to Eddie Van Halen, even though Van Halen is a much more appropriate comparison.
-For mass murderers: Jack the Ripper only killed 7 people, granted in a rather brutal fashion. Ed Gein only killed 2. Elizabeth Báthory or Gilles de Rais make for much more brutal, frankly more interesting, and less trite serial killers, but because Jack the Ripper was such a media frenzy even at the time, and because Norman Bates and Buffalo Bill were partly based on Gein, those are our “go-to” serial killers.
-To refer to anything and everything cult-related in the same breath as the Manson family simplifies everything we know about cults – in their many and varied forms – to one instance of a group of murderers led by a mass murderer. This gives us completely the wrong impression of what cults actually are and what they do – thus misrepresenting their effect on, and place in, society.
-Apartheid. This is really what prompted me to write this post, since, as a half-South African who loves his relatives, I am particularly sensitive to representations of apartheid-era South Africa. Let me first make clear that in no way do I endorse racism or racial segregation. My problem with apartheid being the “go-to” example for racial segregation is that it’s presented, much like Nazi Germany, as a sort of unique evil that could only have taken place in its particular place and time, and that “we” would never do that sort of thing these days. It demonises all white South Africans (for something that many of them opposed, and many more of them actually had no say in) as a unique group of racists in history, meanwhile discounting ALL THE OTHER EXAMPLES of racism and racial segregation. South Africa had racial segregation from 1948 to 1994 – a total of 46 years. The United States of America’s much less known ‘Jim Crow’ laws of racial segregation existed on paper from 1890 until 1965 – a total of 75 years. The state of Alabama did not posthumously pardon the innocent Scottsboro Boys until 2013. Need I go on? How about the caste system in India? Or, for that matter, all colonial powers and their relationships with their colonies? Of course those instances go further and further back in history, and you could argue that the post-WWII world, with the end of European imperialism, demands different standards. Certainly, South Africa maintaining segregation until 1994 is pretty damn shocking, but at the same time there was also a racially motivated war going on in the Balkans, genocides and all, that occupies far less of our attention today. What really stands out about the South African example of apartheid is that it is the one that, time-wise, coincided with the rise of mass media and popular culture, and so it became our “go-to” example for racial segregation – and, to make a linguistic point, it will ever remain thus while we use the Afrikaans word “apartheid”.
-Chernobyl works well as a counterpoint, because Chernobyl was for an extremely long time the worst nuclear disaster in history. BUT today we have Fukushima. Sources disagree on which was “worse”, because you have to factor in radiation levels (on which they seem quite comparable), the number of lives affected, etc. But Chernobyl remains the “go-to” comparison for nuclear disaster, sometimes giving way even to “Three Mile Island”, even though we now have a more recent, more vivid example from Japan.
-Finally, the Nazis. I’m glad I’m getting tired of writing this post, because it is very easy to go on and on about the Nazis, and I don’t want to do that. But basically, all the points I’ve made so far apply here. Totalitarian state? How about North Korea, the White Terror in Taiwan, or the DDR – all of which were/are more effective and controlling than the Third Reich. A population gone hysterical with nationalism? Cultural Revolution-era China. War crimes, atrocities, and genocide? How about Imperial Japan – the horrifying treatment of prisoners, Unit 731, and the Rape of Nanjing, of which so many of my friends are ignorant? Or the Armenian Genocide of 1915, which remains unrecognised by so many governments? And how about the war crimes of the United States of America – the only state ever to have used nuclear weapons in war, not to mention napalm in Viet Nam? None of these caused nearly as much death as the Holocaust, it’s true (though the Rape of Nanjing caused an unimaginable ~200,000 deaths in six weeks alone). But surely, if only for the sake of maintaining a full understanding of the world, these other instances deserve more of our recognition. Perhaps, you say, Nazi Germany represents all of the above. But so did the USSR in the 1930s, during which more than 20 million of Stalin’s own people died. Likewise, the regime of Pol Pot and the Khmer Rouge remains for me the most atrocious in history, during which a full QUARTER of Cambodia’s population was killed in just four years. Needless to say, none of the above excuses or lessens the crimes of Nazi Germany, nor am I seeking to do that. All I’m saying is that, like apartheid, these awful things can’t be viewed as something unique to their time and place; they have recurred throughout history, and we need to understand and be aware of that.

This post, grim though it was, has mostly been to make my point about what I’ve called the “go-to example”. Ultimately, it is both risky and limiting to continually use one particular example as a hallmark, standard, or milestone in any subject matter. The “go-to” example may be germane, and it may be effective – but it comes at the expense of our broader knowledge and, in the end, it probably says more about the pervasive role of the media and pop culture than it does about the subject matter.

JH

Fun with the English alphabet

Here’s one for you: say the alphabet out loud, as usual making sure to name the letters rather than sounding them out (as in, “ayy” for A, “bee” for B, “see” for C etc.).

You’ll find that every vowel’s name sounds the way the vowel is pronounced in conjunction with the ‘magic E’ or ‘special E’ (however you were taught it in school): that is to say, A is pronounced the way it is in “bake” or “cake”, not like in “back” or “cat”; E sounds like “bee”/“see” rather than “bet”/“set”; I sounds like “die”/“pie” rather than “dig”/“pig”; O sounds like “rope”/“hope”, not “rot”/“hot”; and U sounds like “tube”/“crude” rather than “tut”/“crud”.

Meanwhile, among the consonants, if you say them in the way of the Queen’s English/Received Pronunciation, or even Standard American, then H and W are the only letters whose ‘names’ give no indication of their sound – they’re really nothing like it (R as well in English pronunciation). Most consonants’ names indicate their sound, like D being “dee”, and even calling X “eks” or C “see” indicates their usage in many frequent cases (e.g. box = boks; ceiling = seeling). But we call W “double yoo” and H “aitch”.
Interestingly enough (at least in England), calling H “haitch” is often deemed a sign of a lack of ‘education’ or ‘class’, even though, for the purposes of indicating a letter’s sound, it’s actually more useful than “aitch”.

The case of W may be contested: David Crystal has argued that a “w” sound is the result of an extremely shortened “u” (“uu” or “oo”) sound. For example, sound out “uuitch”, “uarm”, or “ooait”. You probably said “witch”, “warm”, and “wait”. So the name “Double U” for W could be argued to be an indicator of its pronunciation, if only vicariously, through the pronunciation of U.

In SE England, Australasia, and South Africa, we also call R “ah”, which gives you no indication as to how to pronounce a word like “red”. Meanwhile in America, the North and West of the UK, and Ireland, it’s “arr” (or “orr” in Ireland), which is much more helpful. However, you could point out that calling R “ah” correlates with its usage at the end of a word: in RP we also say “bah”, “cah”, and “fah”, while the wider anglosphere would call it “ar” and say “bar”, “car”, and “far”.

I have deliberately avoided Y until now. Its vowel/consonant status is always debated, and I thought it would screw up my findings. But strangely, calling Y “why” does indicate its usage at the end of a word, in the same way as with X and R: “fly”, “sky”, etc. So it does fit into that consonant category (for the purposes of this exercise), in that its ‘name’ does sound like its usage at least about half the time – but it’s still problematic, because it gives no indication of its usage as a consonant. “Why” doesn’t tell you at all how to pronounce “yes” or “yellow” – but it does tell you how to pronounce “dye” and “lye”. In that sense, it better resembles a vowel.

So “why”, “eks”, and “ah” are a little bit weird, but it’s “double yoo” and “aitch” that are the real odd ones out.

[If you’ve got through this post, you can probably see why I called it the “English” alphabet and not the “Roman” alphabet; it’s about our specific usage of Roman letters in English.]

So that’s what I’ve stumbled on today… what I can’t tell you is why. Why do the vowels’ names sound like they do in conjunction with E,* rather than alone? What makes H, R, W, X, and Y so inconsistent with their names? For that matter, who actually names letters, and how organically do these things evolve? Who decided that Americans would start calling Z “zee” while Canadians would inherit the British “zed”?
Dahned if I know.

*now that I think about it, words like “cake”, “see”, “pie”, “note”, and “tube” are actually far more consistent in pronunciation across the broader anglosphere than “cat”, “set”, “pig”, “not”, or “tub”. Don’t believe me? Try saying the latter set of words in Yorkshire, Canadian, South African, and New Zealand accents and you’ll see immediately.

JH

Brave New World?

I’ve been re-re-re-reading Brave New World recently.

My thoughts on Huxley’s writing style aside (it’s kinda crappy, but seems almost deliberately so, which is sort of endearing), it’s quite clear why the book is treated as a landmark. Huxley’s postulations on the future are strange, terrifying, and captivating, making it (for me) an incisive warning about the transformative power of scientific advance, akin to what Orwell did for politics. And, as you read the rest of this post, it’s vital to remember that Brave New World was published in 1932.

It’s interesting to consider that the way of the world in Huxley’s novel is not necessarily presented as evil. The way the World Controllers took control certainly is, but as for everything else… It’s just that it’s so different – hypnopaedia, ectogenesis, the lack (or sheer horror) of parents – that it seems scary to us. The only exception to that is the prenatally determined, biochemically enforced caste system, which it is very easy to morally object to.
And yet… as Henry Foster argues in the book, the lowest caste, the Epsilons, are nevertheless as happy as they can be. They’ve never known any other life, nor do they want to, nor can they even conceive of being an Alpha/Beta/Gamma/Delta – they’ve been taught from birth that they’re glad they’re not of another caste, and happy to be Epsilons. The same is true for every single caste. Still, it is very easy to argue against the morality of that, as this mentality is just the result of pure propaganda. The key point is that the quality of life is markedly different between the castes: Alphas and Betas enjoy going to the “feelies” and playing obstacle golf, while Gammas and Deltas are specifically taught to hate the wilderness so that they want to spend more time slaving away in factories.
But for the most part, the people in that world, the subjects of the story, are no more good or evil than us. The reason I bring up Foster’s argument is that, in the same way that a Delta would be horrified to live the life of an Alpha, and vice versa, we, the readers, are horrified by the lives of our counterparts, the subjects of the novel, just as they would be by ours. Our world, to them, is a dystopian past. “Mother” is practically a swearword, and the concept of having time to be alone and think is practically reviled. This raises an interesting point about subjectivity and cultural hegemony. Just because something “is”, does not mean that is how it “should be” – neither in our current world nor in Huxley’s future.

(That is, with the exception of the caste system, which I absolutely do not endorse.) But the way the caste system is enforced – prenatally, by treating specific embryos differently – is not necessarily evil. It’s not cruel and causes no pain. It just better prepares the humans to survive and live in a system which, arguably, is evil. This, for me, demonstrates the most crucial point about scientific advances – that science, like energy, knowledge, or power, depends entirely on how you use it.

On the subject of scientific advances, there’s one other point that has really stood out to me. In Huxley’s world, thanks to all the genetic tooling around and the drugs and pills the characters take, they “look the same until age sixty”, and simply drop dead soon after. There’s no explanation given; it’s just a thing that seems to happen. Moreover, if left alone too long, or if they stray too far from their easy lifestyle and pleasure-loving activities, the Alphas soon begin to grow inexplicably distressed and need to take a gramme of soma to calm down. All of their activity is designed to keep them away from thinking and contemplating for their whole lives… until they drop dead.
This fight against the ageing process… is it so different from what we’re working towards today? Google is already working on Calico, a project that seeks to “devise interventions that slow aging and counteract age-related diseases” – that is, to put off death entirely. As for the easy lifestyles – are we not very nearly there, certainly in the developed world, compared even to fifty years ago? But it’s a well-documented fact that our sedentary lifestyles, a result of ease of living and technological advances, are also killing us. Furthermore, there’s an odd correlation between the countries with the most advanced infrastructures and lifestyles and disturbing psychological conditions. Japan for me is the stand-out example: a place where you can go into a restaurant, be seated by touchscreen, order by touchscreen and have your food arrive on a little electric train.* Japan’s technological prowess is world-famous, but it also has one of the highest teen suicide rates in the world and is one of the worst cases of an ageing population (where fertility decreases and life expectancy increases). Germany, probably the most advanced country in Europe, also has the worst case of population ageing, while famously free-and-easy Sweden has one of the highest suicide rates. Population ageing – or at least the fertility decrease, the more concerning part – is often attributed to a lack of interest in sex among the younger generations. This could be due to any number of factors, but the almost perpetual entertainment we now enjoy – be it games, films, or the ease of accessing pornography – absolutely plays a part.
*N.B. not every restaurant, of course

In short, we don’t seem to be all too far from a brave new world already, and some of the negative consequences on society are already evident. Maybe they’re not as Huxley imagined, but they’re there. Moreover, with my generation being one of the first in the ‘Information Age’, we haven’t actually seen the consequences in later life of all this prolonged exposure to the radiation of smartphones and computer screens – not to mention all the chemicals in everything we breathe, eat, or drink  (but I really don’t have time to go into *that* kettle o’ fish). We may be finding ways to prolong our lives beyond anything seen in history, but we seem to be facing some unexpected consequences. And if Calico finds a way to “put off” death indefinitely… we may just start dropping dead anyway, just like the Alphas in Brave New World.

What I’ve written here is probably old news to a lot of you, or things you’ve thought about/discussed before. It’s just amazing to me that it’s all encapsulated in a slightly-pulp book written in 1932.

JH

[some of this post is built on a previous discussion with Kishan Rajdev about our bodies vs. technological advances]

The Wheel Of Time

[I might update this post more as I continue to reflect, in which case SPOILERS WILL FOLLOW – and the column format of my blog is problematic]

Well, that’s that.

Ten years after I bought the first book, and exactly a year and fourteen books of solid reading since I decided to start again from the top, I have finished reading Robert Jordan’s Wheel of Time series.

It really is one of the greatest fantasy epics of all time, and probably the best I’ve ever read. It may have been long-winded, and excruciatingly slow in places, but the last three books (co-written by Brandon Sanderson) made it SO worth pushing through. It’ll take me a while to get over the emotional rollercoaster of the last book – and it’s been a long time since any book has made me feel that way.

It was like the most amazing blend of characterisation on a Robin Hobb/GRRM level, and world creation on a Tolkien level. And that magic system… completely RJ’s own. I don’t think any other author I’ve read has come up with as comprehensive and powerful a magic system. The whole series simultaneously used, abused, averted and created fantasy tropes like nothing else I’ve read. I’m glad it’s been with me for the last ten years.

I cannot recommend the series highly enough, if you’re looking for an epic story for at least the next year. Myself, I’m at a loss…

[SPOILERS]
Egwene… 😥 Honestly the most shocking moment of the entire series. I’m still getting over it. Why, of all the main characters, did it have to be her? Sure, I was sad at Siuan and Bryne and Bashere and Lan (I thought), but Egwene was one of THE main characters. One of the TEOTW crew.  And it was such a glorious, epic sending off, it’s going to stay with me for a long, long time.
[SPOILERS]

[SPOILERS]
I think it was so shocking because there was none of that reflection on her old life – you would’ve thought that, in her final narrative, there’d be some of her thinking back to being Bran al’Vere’s daughter, the innkeeper’s girl, who had come so far. But no, none of that. Perhaps it was appropriate, as she was by then the Amyrlin, and no longer Egwene al’Vere.
[SPOILERS]

[SPOILERS]
But she had just conquered balefire, had single-handedly un-broken the Pattern, and then went to her death without passing on how she did it. And the lack of reflection from other characters on her death, and the lack of recognition of what she had actually achieved, combined with the rather quick ending of the series, made the whole thing feel really abrupt. I guess that’s why it was so shocking.
[SPOILERS]

In defence of Area Studies

Back at my alma mater, and in my current master’s, the subject I’ve pursued is best described as ‘area studies’: Chinese with Japanese (~75% language, 25% culture/history) under the title of ‘Asian & Middle Eastern Studies’ for my BA, and a mixture of Chinese Law, Japanese History, Taiwanese Society and International Politics of East Asia for my MA.

For many, this seems quite wishy-washy. Indeed, in our modern vocation-driven learning environment, “area studies” doesn’t give nearly as good an idea of my skills as, say, “law” or “economics” would for others. Even within the humanities, history and anthropology degrees – often (unfairly) derided as not setting students up for the future – do at least provide you with extensive training in the methodologies* of history/anthropology, and so train you for a career in that subject area. Meanwhile, with the variety of topics it covers, ‘area studies’ seems to imply more of a “jack of all trades, master of none” approach.

*If you’re not familiar with academia, the methodology is the approach you take to a subject. Simply put, even if you’re writing about the same topic – say, the assassination of William McKinley – a historian would take a vastly different approach, consider different factors, and structure a paper differently from a sociology/political studies/law major.

The shortcomings of area studies were brought home to me personally quite recently. In a meeting with one of my essay supervisors, she asked about my course generally, and about my plans for the future. When I told her about my other modules, and that I was hoping to go abroad to do another master’s in East Asian Studies (with different modules), she looked concerned and said she was “worried about what direction [I’m] headed in, academically”. Now, she’s a good lecturer, very good at her subject, and I respect the hell out of her, but I kind of resented that.

So if I’m worried about what the area studies label means for my employment prospects, why insist on using it? Why not just say I have a degree in Chinese and Japanese? Well:
A) because I set quite high standards for myself and I don’t consider my own level of either Chinese or Japanese to be degree-level – not compared to my peers who only took one or the other, and certainly not these days. The last thing I want to do is mislead someone into thinking I’m fluent in both languages (though I do have a working knowledge of both) and come out looking like an idiot when I can’t translate to the standard they need.
B) because to say my degree trained me to speak Chinese and Japanese would be to neglect the many wonderful things that the non-language-based learning has introduced me to. While I love languages, it was learning about things like Confucianism and Mohism, the Gempei War, the Meiji Restoration and Dr. Sun Yat-sen that really made East Asia a world I wanted to discover. Arguably, without learning the languages, I couldn’t have discovered those things, but many great scholars before me have done the translation legwork, and so much of this is now available in English. This is by no means to deny the utility of studying languages, and my languages in particular; it’s just that I feel the true value of my education so far has derived not solely from fluency in one or two languages, but from that and everything else I have discovered about a world still so far removed from our own.

So that’s why I cling to the label. Moreover, I’m proud to, and I’m actually glad that I chose the multidisciplinary, jack-of-all-trades approach. This is because, while I’ve not spent three years studying history, or politics, etc. at the postgraduate level, I have at least been given a pretty decent grounding in all of those methodological approaches. And because I’ve studied one particular area of the world in depth, I have a decent grounding in those methodologies as they pertain specifically to an area that not many people have experience of. As far as (academic) career prospects are concerned, I may not be able to jump straight into foreign policy institutes, but I do have enough knowledge that, if I wanted to pursue that path, with a little elbow grease and a willingness to start at the lower levels, it’s still a viable option. Imagine it like a long corridor with many different doors. Each piece of expertise can get you into one suite, and once you’re in, you can get further and further into that suite with the grounding you’ve been given. I may not be able to get beyond the entrance hall of any particular suite, but so many more doors on that corridor feel open to me.

And ultimately, those open doors – not just in careers, but in life – are the most important thing to me. By studying what I like to study from so many different angles, I feel truly enlightened. In my Chinese Law course, there are honest historical explanations as to why Chinese law on freedom of religion is still so stringent (hint: less ‘communism’, more ‘Taipings and Boxers’), and my professor seemed to really appreciate the historical perspectives I could offer. Likewise, there is a real discussion to be had over whether escalations in Chinese/Japanese tensions are easily explained by political models of behaviour or whether there is a deep-seated historical, even sociological, animosity between the two. Especially in places beyond our cultural safety-net of the Western world, you simply can’t understand one aspect of non-Western societies out of context.

One of my best friends is a rather fantastic lawyer in London – studying Chinese Law this year has not made me an expert in law, but it has given me a far greater appreciation for what he and all law students go through, and it’s honestly enriched our friendship. We’ve had long discussions over pints about mediation vs. litigation, and it was on such an informal, fun level that we could have been talking about the football. And that sort of thing is what makes area studies so valuable to me. It may be a slower career path than most, but looking beyond careers, to what it has meant for my life and my views on the world, I am so glad I’ve chosen this path.

JH

Names, or 正名 (zhèngmíng, “the rectification of names”)

So, to start, maybe it’s worth explaining why I’ve chosen ‘Academia Metallica’ as my blog title.

I ran through a number of names, some of which were unavailable. That was the first hurdle. The next was finding a title that I would be happy with, that would ring memorable in the ears, and that was, ultimately, representative of me.

A couple of the possibilities:

Johann Valjohann (a fun play on my name but a little bit too theatre-niche)
I simply rock into Mordor (again, a fun title, but since it’s based-on-a-meme-based-on-a-LOTR-quote, I felt it would have little mass appeal or lasting power)
Shame On The Night (taken, unfortunately)

So, what I came to eventually was Academia Metallica.
To explain:
-It’s a Latin title, because I love the Latin language, and Roman history (more on that in another blog post).
-‘Academia’, not because I think of myself as an academic (…yet, though a man can dream), but because I love academia, learning, and reading; it is pretty much what has defined my life so far and the only thing I’ve been consistently good at.
-‘Metallica’, because along with academia, something I’ve defined myself by for the last 10 or 15 years has been my love of metal and rock music. ‘Metallica’ has the double-whammy of both making the title mean ‘Metallic Academia’ (so… my dream job, basically) and also instantly recalling that metal-cum-rock* household name, Metallica. They’re no longer my favourite band, but I did spend approximately ALL of 2007 listening to nothing but Metallica. So, they’re the ones who got me into this mess in the first place.
*don’t laugh. It’s English; look it up.

So, that’s the name explained. Some people might see the use of Latin as pretentious (and indeed the Chinese in the title of this post). To that I say: I may be very flawed, but if there’s one thing I try my utmost to avoid being, it’s pretentious. My view is: I know Latin, and it’s proven useful in this context and provided me with the best possible title, so why not use it? I don’t use Latin (or Chinese, or any other language) to put on airs or make Boris Johnson-esque pretences to being more elite or fanciful than I actually am; I use it because sometimes you can’t say something well in one language, but perfectly in another. Plus, I f*cking love languages (more on that in another blog post somewhere down the line), and that’s never going to stop, so I’m never going to restrict myself to English – a grammatical gauntlet/lexical labyrinth of a language that I just happen to have been raised to speak.
Besides, another of the names I was considering was ‘Academicusque Metallicus’ (“academic and metal”, to describe myself) which would have been WAY more pretentious.

Well, I guess that’s my first official blog post done. Hope you enjoyed it and got something of an insight into me. I’m quite enjoying this myself; it’s good training for typing long pieces, and a way of procrastinating from my dissertation at the same time. So, hopefully there’s plenty more to come!

JH

P.S. “Shame On The Night” would have been a reference to the song by Ronnie James Dio. A fantastic sentiment by a fantastic singer, and probably the only thing I’d ever consider getting tattooed on myself (more on that later)