‘Land, kill and leave’: On CIVCAS and HVT

The photographs, the documents, the whistleblower testimony are all there — the brutal details of our diggers’ conduct brought forward into the harsh light of day.

A blow has been dealt to the prestige of Australia’s special forces, with similar damage likely to follow for the reputation of the Australian Army as a whole.

At first, it might seem tempting to think of these kinds of events as isolated incidents that do not speak to a more widespread problem within the Army’s special operations community. But misconduct on the battlefield also speaks to a wayward shift in a military force’s broader operating culture.

Along with the Maywand District murders and the Panjwai massacre, these new allegations levelled against Australian soldiers in Uruzgan will come to symbolise the ultimate failure of Western militaries to adapt to a fight where the decisive battle was for the human terrain.

According to our military leaders, the reason for Australia’s presence in Uruzgan province between 2001 and 2014 was to “clear, hold and build” a Taliban-free Afghanistan. Per counterinsurgency doctrine, by providing an enduring sense of physical security to local Afghans, the “hearts and minds” as well as the rifles and trigger-fingers of fighting-aged males in Uruzgan would eventually be won over.

At some point it seems that this strategic guidance either failed or was wholly ignored.

While Special Operations soldiers had earlier played a kind of “guardian angel” role in support of their regular counterparts in the Mentoring and Reconstruction Task Force, as the Afghan war dragged on, that role became increasingly aggressive.

An upsurge in “direct action” operations began to distract from efforts to secure the population. By 2010, much of the task group was solely focused on so-called “high-value targeting” — the coalition’s effort to kill or capture an ever-growing list of local Taliban “commanders”.

As a former Special Operations Task Group member drily put it to me, the new penchant for fly-in fly-out missions conducted out the side of a Black Hawk saw the entire concept of operations switch from “clear, hold and build” to “land, kill and leave”.

Of course, operating in this manner was never going to defeat the Taliban. Insurgencies are complex adaptive systems capable of surviving the deaths of leaders. As David Kilcullen writes in Counterinsurgency: “decapitation has rarely succeeded [and] with good reason — efforts to kill or capture insurgent leaders inject energy into the system by generating grievances and causing disparate groups to coalesce”.

All this considered then, by channelling an apparent “shoot first, never ask questions at all” ethos, there’s a good argument to be made that much of SOTG’s work in the final years of the Afghan War was counter-productive.

In many ways, the sunset years of operations in Afghanistan marked a transitional moment in the Australian way of war — one which saw our special forces transformed into the hyper-conventional juggernaut they are today.

In other Western forces, the over-emphasis on “conventionalised” operations — that is, heavy-hitting operations which deviate from the subtle and indirect approach of yesteryear — has had similar results on the ground.

The Australian flag sewn onto the arm of a military uniform worn by a man

The New Zealand SAS is currently reeling from allegations that its members carried out “revenge raids” against civilians. US Navy SEAL Teams have now been linked to extra-judicial killings and corpse desecration on the battlefield. In Britain too, the story is much the same. Reports of “rogue” SAS troopers and battlefield executions. Civilian casualties. A Ministry of Defence probe into war crimes allegations.

Incident by incident, this is how the war in Afghanistan was lost.

Despite more than a decade and a half of sustained military effort, today the Taliban and other extremist groups control as much as 40 per cent of the country.

Certainly, where our own efforts are concerned, the data is clear. Australia’s war in Afghanistan was a failure. According to the Institute for the Study of War, districts like Shah Wali Kot (where Corporal Ben Roberts-Smith’s VC-winning charge took place) are now categorised as “high confidence Taliban support zones”.

Elsewhere, the observable metrics on the ground speak for themselves. In 2002, US intelligence estimated the Taliban’s strength at 7,000 fighters. As of 2016, that number had increased to 25,000. As this year’s spring fighting season begins, the Taliban still control roughly a quarter of Afghanistan.

More than anything, what these new revelations demonstrate is that somewhere along the way our military, and our special forces in particular, simply lost the ability to effectively counter an insurgency.

Once upon a time, “the best of the best” were trained to operate like “phantoms” — treading lightly and prudently alongside their local partners.

Today, however, the legacy they will leave behind in the minds of Afghans will be a brutal one. The civilian cost of the Special Operations Task Group’s operations in Afghanistan is now apparent for all to see.

Practical Tips for Skimming the Qur’an: Or, How to Study Islam without Rigour

In a world where media outlets are now plucking Middle East experts from the ranks of heavyset non-Arabic-speaking private military contractors (who have never interacted with people from “these cultures” unless they were heavily armed and travelling in up-armoured convoys), a few colleagues of mine over at the Australian National University have started a great new project called “Re-Anth”. Envisioned as a clearinghouse for popular, prescient scholarship in the social sciences, the general objective of Re-Anth is simple – to reintroduce anthropological thinking into the wider social and political discourse. As such, this will hopefully be the first of many contributions I can make to their new blog.

The first topic I’ve been asked to write about is the concept of “praxis” – one of those great buzzwords you will only ever come across in postgraduate anthropology seminars or in vaguely-meaningful but mostly arcane discussions of Hegel’s contributions to philosophy.

Praxis, in the context in which anthropologists use it, refers to the process by which immaterial concepts and ideas (Aristotle’s theoria) are realised through action – the bridge between what Hannah Arendt saw as the two defining categories of human thought and behaviour – “vita contemplativa” (the contemplative life) and “vita activa” (the active life).

While the term itself suffers from a terminal case of jargonitis (in part because praxis is an import word from ancient Greek and in part because praxis is also the German word for “practice”, which has a separate meaning in English-language anthropology), the spirit of the praxis concept is as follows: there exists a process which connects the things people think about with the things people do, and mapping this contemplation-action algorithm is key to understanding how a member of a particular cultural group is likely to think and behave under a given set of conditions.

There is a huge body of theoretical muck out there to wade through in one’s search for a definition of praxis (from experience, this can actually lead to a reduced understanding of the concept), but since praxis, like anthropology itself, is practically-oriented (or indeed, praxically-oriented), a good way to grapple with the concept is to think about a religion like Islam not only as a “practice” (that is, something someone does) but also as a “process” (the contemplative and active steps which lead to the doing). By reflecting on the process by which religious texts like the Qur’an (a body of work that contains various theoria) are interpreted and then incorporated into the daily lives of individuals, for example, one can observe the praxis concept in the field.

For a student of Islamic societies, the process by which the Qur’an is brought into the material world is the textbook example of the praxical process. Similarly, if one looks at a political project like “Marxism” – which Antonio Gramsci referred to as “the philosophy of praxis” in his Prison Notebooks – one can observe an analogous process (indeed, a revolutionary strategy) by which a utopian ideal is interpreted and then progressively introduced into society by the Marxist. Both the Marxist revolutionary process and Quranic exegesis-enactment (as a hermeneutic process), therefore, are examples of praxis “in the wild”.

With praxis thus defined, and with the title of this post suggesting that there is something lacking in how “Islam” is characterised in public discourse, it is now incumbent upon us to consider how the praxis concept might improve the way we think about Islam, re-injecting some intellectual rigour into the discussion.

As I’ve discussed previously, the “true meaning” of any text (especially a religious text) is ultimately interpretive. This should be self-evident to anyone who studied “the novel” in high school – especially if one’s English teacher was intent on extruding bizarre, hidden meanings from the most innocuous of sentences. Certainly, the fact that deciphering a text is an interpretive process (praxis) should be self-evident to anyone who is familiar with the way in which the law is interpreted by the courts. As Barack Obama said of the US Constitution in his final address as President: “it’s really just a piece of parchment. It has no power on its own. We, the people, give it power – with our participation, and the choices we make.”

The “participation and choices” which Obama speaks of is, in this instance, a description of constitutional praxis – the process by which the law is interpreted, reflected upon, incorporated and then lived by “We the People”.

To Islamic scholars, the praxis concept is encapsulated by a process called ijtihad – the mental and physical effort which connects the Muslim vita contemplativa with the vita activa (to revisit Arendt). Ijtihad, therefore, is the process (thus the praxis) through which interpretations of Islamic jurisprudence are constructed. It follows, then, that because jurisprudential interpretation is ultimately subjective, sharia (the legal aspect of Islam) cannot be thought of as comprising a single codex and, much like constitutional opinion amongst American jurists, cannot be understood as a monolithic bloc that regulates Muslim behaviour in any single way.

For this reason – that is, because the ijtihad process produces many different interpretations of both sharia and Islam itself – it is uniquely artless to paint a literal “broad church” with such a broad brush stroke. Likewise and for the very same reason, it is equally artless for one to imply that ISIS’ worldview has “nothing to do with Islam” (this is often described as an “apologist” statement by those who themselves detest the label “Islamophobic”).

Having said that, I suspect that there are very few serious scholars of Islam who would claim that Islamist extremism has “nothing to do with Islam”. As both Shadi Hamid and Reza Aslan have argued – it’s not that ISIS’ ideology is “not Islamic” per se (because the very nebulous nature of religious praxis means that if one says it is Islamic then it is Islamic) but rather that using ISIS as a case study to inform a generalisation about what it means to be a Muslim is inaccurate and unfair to Muslim minorities in the West. As such, despite the shrill cries ringing out from the far reaches of the internet that terrorist-sympathising “snowflake SJW apologists” are amassing in their “safe spaces” to measure just how little of nothing terrorism has to do with Islam, I’m yet to come across any serious peer-reviewed research that would dispute a claim like “ISIS’ foot-soldiers are Muslims”. The critique, therefore, is probably a straw-man argument.

In many ways then, the greatest intellectual failure of “the anti-Islam school” (that is, the school formerly known as “Islamophobic”) lies not in its interpretation of Islamic text per se but rather in its refusal to incorporate a discussion of praxis into how Islam is actually lived – that is, the inability to see Islam not merely as a set of practices but also as a process by which the practitioner interprets text and engages with the sacred.

Certainly, it is possible that one could conclude that the Qur’an is intrinsically violent or misogynistic if one selectively read (as ISIS does) verses like 9:5 or 4:34 to the exclusion of contradicting verses like 109:6 and 30:21 (even though, as the anti-Islam school will tell you, later-occurring verses are supposed to abrogate earlier verses). Yes, if you read the Qur’an like that you might find “Islam [monolithic]” guilty of many crimes.

But of course, in order to find Islam guilty of these crimes, one would also have to refute the role of praxis in producing human behaviour – discounting, for example, the possibility that Islam is a living, breathing religion (defined by heterodoxy) or that Muslims are followers of a constantly-evolving faith, a community possessed of a diverse collection of doxa that oscillate from “asymptote to asymptote”. So yes, if one used such a myopic approach – that is, if one employed a textualist, literalist, atomistic, and wholly un-holistic approach to religion as an entire field of study, ignoring the fact that interpretation matters or dismissing the empirically-tested finding that diversity of religious opinion exists even in small-scale societies – one might conclude that Islam is, must be, has to be “bad”, “evil”, “antithetical to Western democracy”… as Tasmanian Senator Jacqui Lambie seems to have concluded.

Naturally, if you use this approach, you’ll probably not find much intellectual backing for your work outside the various (group)think-tanks run by Daniel Pipes or Robert Spencer (or, indeed, Richard Spencer). But then, “left-wing SJWs on university campuses”, right?

Ultimately, the bottom line is this: giving credence to the praxis concept is absolutely critical to the study of Islam [not monolithic]. Moreover, if one actually goes out on the streets and talks to Muslims about how they interpret the Qur’an and how that interpretation influences their behaviour (note: this requires interacting with a sample size that is larger than the cellblock of Camp X-Ray or the mullet-wearing Lebanese teenagers hanging out in hotted-up cars down the road), one would probably conclude that diversity of opinion in a religious congregation which comprises more than a fifth of the world’s population might well be infinite; that praxis is really the only thing that counts when crafting generalisations about “Muslims”; and that ultimately, the Qur’an (regardless of whether or not it is the word of God) is simply a collection of words recorded on a sheaf of palm-fronds. To borrow again from Obama, the Qur’an exists but it is up to Muslims through their “participation and choices” to interpret it and live it.

It might seem bizarre that a religion which regulates its phases of worship according to incremental changes in the lunar cycle could have so much diversity of thought. Here though, it’s worth noting that, according to hadith, the notion of ikhtilaf (Arabic: إختلاف), meaning “difference” or “diversity”, was seen as a blessing by Muhammad. Indeed, according to a comprehensive study of the subject by Musawah, ikhtilaf al-fuqaha (“diversity of opinion amongst jurists”) not only existed as far back as the Abbasid Caliphate but was also respected as a necessary part of realising a truer, greater Islam.

A non-Muslim interested in thinking more about praxis might consider his own practices, and the contemplation-action algorithm that led him there. If, for example, one subscribes to the Christian faith and goes to church every Sunday, consider the following passage in Matthew 6:5-6.

“And when you pray, do not be like the hypocrites, for they love to pray standing in the synagogues and on the street corners to be seen by others. Truly I tell you, they have received their reward in full. But when you pray, go into your room, close the door and pray to your Father, who is unseen. Then your Father, who sees what is done in secret, will reward you.”

Would one say then, that conducting the Sunday ritual at church is “un-Christian”? The answer, of course, is “no”. The only thing that matters here, beyond your self-identification as a “Christian”, is the praxis which underpins the religious choices you have made. In the end, the process of selecting the Sunday ritual and participating in the ritual itself, is the only bit that matters.

We are winning but The Horror will continue

The images coming out of Nice are shocking. Bodies crushed beneath the multi-tonne might of a truck. Revellers who just minutes before were celebrating the festivities of France’s Bastille Day mowed down in the street. Corpses everywhere. People fleeing, running for their lives. All of it live-streamed by the ubiquitous smartphone.

This terrorist attack (if that is indeed what this was) did not occur in isolation. In preceding weeks we have witnessed similar scenes of carnage in other great cities of the world – Istanbul, Medina, Dhaka and Baghdad. Terrorism is not new to us. But this attack is particularly frightening for two reasons.

At a visceral level, the mangled bodies on the promenade remind us of the human cost of terrorism in a way which even the vaporised nothingness which follows a suicide bombing can fail to convey. The mashed bodies are the bodies of actual recognisable people. The Horror, in the sense which Conrad meant it, is real.

Secondly, and perhaps most frighteningly, the use of an everyday vehicle as the primary weapon in a terrorist attack shows us that despite our best efforts to catalogue and trace fertiliser purchases at hardware stores, strictly control the dissemination of firearms, and ban pen-knives on planes, we can never fully contain the threat posed by violent extremists. Preventing access to the means by which this violence is perpetrated is crucial but we should be under no illusions – we will never completely eradicate terrorism.

Reactionary voices will come forward saying that a ban on Muslim immigration is the solution to terrorism and Donald Trump will inevitably tweet, as he has tweeted before, that “I alone can solve” (the problem). But make no mistake – no border, no pogrom, no government-funded de-radicalisation program will ever be able to negate the possibility, however infinitesimal, that a madman will slip into the driver’s seat of a legally-purchased, road-worthy truck and run down dozens of innocent people in the street.

The perpetrators of these attacks plan and execute them with specific objectives – that is, a “desired end-state” – in mind. The political function of a terrorist attack is to incite fear in a population and, if the scenes of chaos in Nice are anything to go by, IS has achieved this end-state. “Nous sommes terrifiés” (“We are terrified”), tweeted the Mayor of Nice, begging the Niçois to remain indoors. The city is in a state of panic. At a global level, the terrorists are celebrating further because we, like the Niçois, are afraid as well – afraid that we will be the victims of the next terrorist attack.

But while the terrorists’ coup in Nice and the marked increase in terrorist attacks should give us all cause for concern, we should not confuse “an increase in terrorist attacks committed by the Islamic State” (assuming IS is responsible) with a statement like “the Islamic State is winning”.

Far from it, in fact: on the ground in Syria and Iraq, where this fight really matters, IS is not winning. In the last few months alone, thanks to the co-ordinated efforts of Western, Iraqi and Kurdish forces (and the non-related but mutually-supportive efforts of the Syrian government and its ally, Russia), IS has lost a significant amount of its territory. Palmyra is back in the hands of the Syrian government. Fallujah is back in the hands of the Iraqi army, and the Kurds have chased IS back to the gates of Mosul.

Indeed, if we use Mao Zedong’s three-phase model of guerrilla war as a model for a successful fight, we can see that the last year has been disastrous for IS – a year which has seen it regress from “Phase 3” (wherein the guerrilla army, as in 2014, begins the decisive annexation of enemy territory) back to “Phase 2” – the use of intimidation tactics like terrorist attacks to weaken the enemy’s resolve.

Furthermore, when one observes that the attacks in the holy city of Medina have drawn the ire of prominent Saudi Salafists, or when one considers the empirical observation that the use of indiscriminate violence is ultimately counter-productive to a group’s political aims, then, all things put together, these attacks appear less as a sign of strength and more as an indication that IS presents a diminished threat compared with the one it posed just a few years ago.

One by one, its fighters in Iraq and Syria are being picked off. Just yesterday, in a demonstration of the effectiveness of US airpower, Omar Al-Shishani, the Georgian-born commander of the Caliphate’s north and heartthrob of ISIS’ muhajireen (foreign fighters), was obliterated by a laser-guided GBU-12. Indeed, according to some in the online #ISfanclub, it is yesterday’s loss of Al-Shishani which inspired this new attack in Nice. Thus we arrive at the following conclusion: outgunned, on the run and lacking the means with which to commit atrocities, ISIS has now resorted to running innocent people over with trucks.

In real terms, as I tweeted just yesterday, ISIS is in a bad place. If current trends in Iraq and Syria continue, my guess is that ISIS will be militarily defeated by this time next year. On the conventional battlefield, they are done. The terrorist attacks, however, will likely continue as ISIS reverts to “Phase 2” tactics. In turn, we should prepare ourselves for the next battle – to make sense of and systematically defeat the ideology of salafi-jihadism. This will take time. And patience. But we should be confident about our ultimate victory. Yes, we know now that a truck can be used by this enemy for indiscriminate violence. The prospect truly is terrifying. But as Omar Al-Shishani learned yesterday, 230kg of ordnance (when used selectively) and a patient, cerebral approach are far more effective.


The Arabic word for “Islam” on the right. Inverted into the shape of a Kalashnikov in the thought bubble. Source: Jabertoon


What does it mean to be “radical”?


Radical (chemistry): “A molecule that contains at least one unpaired electron… because of their odd electrons, free radicals are usually highly reactive… they [can] react with intact molecules, generating new free radicals in the process” (Encyclopaedia Britannica, 2015)

Radicalism (political): “Radicalism is characterized less by its principles than by the manner of their application” (Cyclopaedia of Political Science, 1893)


If it can be said that free radicals in chemistry are good at creating more free radicals, or that political radicals have a tendency to replicate and create revolutions, then it is also a rule of Twitter that anything the popular neuroscientist Sam Harris says about Islam and Islamism will be liked, retweeted and defended by his legions of fans. The Law states that if it is he that created it, then the Tweet will be spread, regardless of any discrepancies or oddities in the Tweet’s molecular structure.

Such was the case with Harris’ online rebuttal of the terminology used by Hillary Clinton in her response to the Orlando shooting – 691 retweets, 1,866 “favourites”.

Here Harris sought to chide Clinton for her use of pleonasms, arguing that the excessive use of the adjective “radical” made her terminology linguistically redundant. In many ways, Harris is right to focus on language. Phraseology is important in the discussion of jihadist violence. “If Hillary is only against the radical jihadists,” an onlooker might otherwise wonder, “what about the mainstream jihadists? Are they OK then?”

At this point in Trump’s over-televised run for the presidency, everyone who isn’t living under a rock should be aware that there is no such thing as a “mainstream jihadist”, but the larger point Harris is trying to make is still valid – terminology is important and informed debate begins with the correct use of language.

Not being one to shirk the opportunity to nitpick however, I offered that although the term “radical jihadism” is a redundant pleonasm (much as the term “redundant pleonasm” is itself a pleonasm) the term “radical Islamism” is acceptable to use since there are many different schools of Islamist thought. This includes what we might call “mainstream” and “radical” forms of Islamism.

Harris’ response was simply:

Harris’ suggestion, of course, is that theocracy – as a system of government wherein all authority is derived from a deity – has some kind of innate quality which makes it “radical” and that because of this quality it is therefore redundant to affix the adjective “radical” to the word “Islamism” (since the central aim of most Islamists is the establishment of an Islamic theocracy).

As awful as theocracies are, one runs into problems by blanket-labelling an entire system of government as “radical” – even one as flawed as theocracy. If theocratic ideas were necessarily radical, what would one then make of a country whose Pledge of Allegiance is a pledge to “one Nation, under God”? Or what would one make of the Vatican – a religious theocracy run by priests? Would one really argue that the Pope and his cabal of cardinals are nothing but “a bunch of radicals”? One could argue that, I suppose, but it would be a very radical argument to make indeed. And there would be a great many Catholics in the American mainstream (citizens of “one Nation, under God”) who would disagree with that position.

It is clear then that the word “radical” has an inherently relative quality and that it is better understood simply as “that which is not mainstream”. By extension, the term “radicalism” refers simply to a collection of political beliefs which do not exist in the mainstream. It is merely the antonym of the humdrum middle-ground.

The distinction I was trying to make between what we might call “radical Islamism” and the less radical (but no less repugnant) forms of Islamism essentially coheres with the same distinction made previously by political philosopher Olivier Roy. Roy’s thesis was that Islamism is not one single movement but a spectrum of political beliefs which “oscillates between two poles” – a “revolutionary pole” and a “reformist pole”. The distinction should seem fairly self-evident to anyone with even a cursory understanding of the history of political Islam. Some radical Islamists (whom we typically refer to as jihadists today) want to pick up a sword and accelerate Islamisation with cold steel while others in the mainstream are more relaxed (Jacob Olidort and Graeme Wood call these relaxed types “quietist”) – seeking to focus their efforts on Islamising society “from the bottom up, bringing about, ipso facto, the advent of an Islamic State”.

The distinction between Roy’s “revolutionary” Islamists (whom we can safely call the “radicals” among the Islamists) and “reformist” Islamists should be familiar to Harris because Maajid Nawaz made a very similar distinction in the book they co-authored:

“…When I say ‘Islamism’ I mean the desire to impose any given interpretation of Islam on society. When I say ‘jihadism’ I mean the use of force to spread Islamism.” (Nawaz/Harris 2016)

Thus, given what we have discussed about the different forms of Islamism we arrive at the following diagram which expresses the fact that Islamism is not unitary but oscillates between “mainstream” and “radical” poles.

Spectrum of Political Islam

Fig 1.1

The “pollination line” in Fig 1.1 is used to demarcate the point at which Islam ceases to be simply “one’s religion” and becomes a political ideology – the point at which the believer pollinates the spiritual life with “the world of the profane”. In essence, the pollination line delineates what we in the West might call “the separation of church and state”. My specific use of the term “pollination” is intentional here, having borrowed it from a controversial hadith narrated by Anas which offers a glimpse of a secular Islamic world in which worldly affairs are separated from spiritual ones (see footnote below)***.

One will note, in the mere existence of this diagram, that I have taken care in how I approach the subject of Islamism, viewing it less as a monolithic bloc (as Harris and Trump have now come to view Islam itself) and more as a highly complex collection of diverse social and political movements (all of them seriously flawed).

There are other reasons to be more careful when we taxonomise Islamism. How else can we distinguish between the Shia Islamism of Hizballah and the Sunni Islamism of Al-Qaeda, for example? Or how can we tell the difference between the fractious (and almost-incidental) Islamism amongst the Pashtun and the ethno-nationalist-tinged Islamism now popular amongst many of the Tuareg in northern Mali?

Indeed, if one were to borrow from the political theorist David Nolan and rework my diagram by including another ideological vector like “primacy of the tribe vs primacy of the Caliphate”, one could plot the ideological differences between the various Islamist groups with significantly more accuracy on a Cartesian chart.

As with the following:

Cartesian Chart of Islamism

Fig 1.2 (Noting that Boko Haram’s ideology is nearly unplottable)
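For readers who prefer to see the two-vector idea behind Fig 1.2 made concrete, it can be sketched in a few lines of Python. The coordinates and the two cut-off values below are hypothetical placements for illustration only, not measured data, and the group list is merely a sample of those discussed above:

```python
# A minimal sketch of the two-vector taxonomy behind Fig 1.2.
# Axis x: religion-state fusion, from "mainstream" (0.0) to "radical" (1.0).
# Axis y: primacy of the tribe (0.0) vs primacy of the Caliphate (1.0).
# All coordinates are hypothetical illustrations, not measured data.

groups = {
    "Muslim Brotherhood": (0.5, 0.4),
    "Hizballah":          (0.7, 0.5),
    "Al-Qaeda":           (0.9, 0.8),
    "ISIS":               (1.0, 1.0),
    "Anbar tribes":       (0.6, 0.2),
}

def quadrant(x, y, x_split=0.75, y_split=0.5):
    """Classify a group relative to two (assumed) dividing lines:
    x_split separates Roy's "reformist" pole from his "revolutionary"
    pole; y_split separates tribe-first from Caliphate-first loyalties."""
    pole = "revolutionary" if x >= x_split else "reformist"
    loyalty = "Caliphate-first" if y >= y_split else "tribe-first"
    return f"{pole}, {loyalty}"

for name, (x, y) in sorted(groups.items()):
    print(f"{name:20s} -> {quadrant(x, y)}")
```

Shifting `x_split` or `y_split` is the programmatic equivalent of moving the “pollination line” for a different sample population, as discussed below for Raqqa.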

Indeed, once we realise that the issue of Islamism is far greater in scope than the white Muslim convert next door regurgitating the filth he reads on the Internet (that is, once we remove ourselves whole-bodily from the ethnocentrism of our own backyard) we will realise that Islamism, just like Islam itself, is very far from a single creed.

Ultimately, the most succinct way I could put the distinction between “the various Islamisms” was by pointing out that in some Muslim-majority countries (like Egypt) there are some Islamists who almost everyone would regard as a terrorist and others who would be democratically elected.

Naturally, by simply pointing out that Islamist views are fairly mainstream in many Muslim-majority countries (which they are), I was likened to an ISIS sympathiser by the Harris fan club. According to my logic, they claimed, ISIS’ ideology shouldn’t be considered radical because within the Islamic State, ISIS’ worldview is the prevailing worldview.

Ad hominem aside, it’s actually a reasonable point to make. Hypothetically, if researchers were able to obtain unbiased psephological data from within the Islamic State, or if we reduced the sample size of our “spectrum of political belief” diagram (Fig 1.1) to, say, fighting-aged males currently residing in the city of Raqqa, we would likely find that ISIS’ worldview is far from radical. One might even observe “the pollination line” shifting completely to the right, indicating that everyone is in total agreement that Islam should be indivisible from the affairs of state (although if we use my Cartesian model, one would plot the Anbar tribes on a higher co-ordinate than the ISIS muhajireen on the “primacy of the tribe” vector).

Of course (returning to Harris’ original critique), we know that when Hillary Clinton is talking about trends in contemporary politics she is not restricting her sample size to fighting-aged males in Raqqa. So the point is moot – ISIS’ worldview is indeed objectively radical in this context. In saying this, I will concede (to the glee of Sam Harris’ fan club) that if the sample size for this discussion was restricted to the US (which it may have been, since Clinton was talking about Orlando) then yes, Islamism should be considered a radical ideology. This would mean that Harris is right and the term “radical Islamism” uses a redundant adjective (shame! O shame on you Hillary!). But if we can leave ethnocentrism out of our thinking for a moment and think of this “war of ideas” as a global war and the entire world (with its 50 Muslim-majority countries) as our sample size, then it makes sense to make distinctions between different kinds of Islamist belief. Clinton has served as America’s top diplomat, so I would hope that she was thinking big-picture on this issue.

While I’m on the topic of making concessions to the Harris fan club, I’ll also concede that all Islamists (“quietist” or not) are in a sense “radical” in that an Islamist seeks to be an agent of radical change to society – transforming it completely. There is very little “moderation” in Islamist ideation, which is why many Islamists end up becoming “extremists” (the antonym of “moderates”). But to repeat the hundred-and-something-year-old quote at the top of this article: “radicalism is characterized less by its principles than by the manner of their application”.

Heaving ho then, while we could continue discussing whether “radical Islamism” constitutes a pleonasm, the key point is that if we can’t make simple distinctions between the ideation systems of someone like Abu Bakr Al-Baghdadi (a militant jihadist, whom I would label a radical Islamist) and someone like Mohammed Morsi (an Islamist whose views, according to the results of the Egyptian vote in 2012, are fairly mainstream in Egypt) then there really is no hope for our ability to understand the place of Islam in our world.

Of course, Harris’ thesis (the one that is retweeted by his legions of fans… and then repackaged in less savoury terms by Trump™) is that the world’s “Muzz-lims” should be considered followers of an intrinsically radical religion – Islam being what it is – a religion founded by a puritanical Bedouin raider.

While the latter point about Mohammed might be true, the reality is that in the world we live in today – a world which the founder of Islam was integral in shaping (for better or for worse) – the Islamic worldview and even the Islamist worldview is far from a “radical” one.

This is not to say that one should not speak out against Islamism (as Harris’ fan club seems to think I am suggesting). On the contrary, given all the empirical evidence which suggests that mixing religion and politics is about as good an idea as mixing sleeping pills and alcohol, I’ll be the first to speak out against Islamism if it ever becomes a mainstream belief in Canada (thankfully, an Islamist would be considered a radical in my neighbourhood). I’ll also happily speak out against Pakistan’s blasphemy laws, Saudi Arabia’s harsh dispensation of judicial punishment and the reign of theocracy in Iran.

But if we can agree that “Islamism” is the “enemy” (to use a term which others with military backgrounds can relate to) then our first duty in this global war of political ideas is to understand this enemy as best we can. One need not repeat the Sun Tzu edict here for emphasis.

Understanding this enemy involves conducting what military planners call a “stakeholder analysis” – mapping out all the individual actors within the conflict ecosystem to grasp the role they play in producing and transforming violence. This mapping exercise might involve building a profile of each of these individual actors, and occasionally, categorising them according to where their views might lie on a spectrum of political belief (as we have done in Fig 1.2).

Understanding and making the distinction between what we might call “mainstream” Islamists (the quietist types) and “radical” Islamists (the jihadists) is important here because it enables us to adjust the parameters of our targeting apparatus within the system. This enables us to focus our efforts on the targets that matter the most. Indeed, if we remember that labor is in short supply, our aim should always be to attack targets who, once removed from the system, will have a significant effect on the enemy’s centre of gravity. A “radical” Islamist is good at creating more “radical” Islamists, just as in chemistry a radical molecule is good at creating more radical molecules. Therefore, it follows, we need to have words which enable us to categorise and identify radical Islamists where they exist.
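To make the idea of adjusting “the parameters of our targeting apparatus” concrete, the toy sketch below shows one way a stakeholder map might be represented in code. Everything here is hypothetical – the actors, the one-dimensional belief coordinate and the cut-off value are invented for illustration, not drawn from Fig 1.2.

```python
# Toy stakeholder map: each actor in the conflict ecosystem gets a profile
# and a position on a one-dimensional spectrum of political belief.
# All names, coordinates and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class Actor:
    name: str
    role: str
    belief: float  # 0.0 = secularist ... 1.0 = militant jihadist

RADICAL_THRESHOLD = 0.8  # assumed cut-off between "mainstream" and "radical"

def categorise(actor: Actor) -> str:
    """Label an actor according to where they sit on the spectrum."""
    return "radical" if actor.belief >= RADICAL_THRESHOLD else "mainstream"

actors = [
    Actor("quietist imam", "religious leader", 0.55),
    Actor("foreign fighter", "combatant", 0.95),
]

for a in actors:
    print(f"{a.name}: {categorise(a)}")
```

A real stakeholder analysis would, of course, use more than one axis (my own Cartesian model adds a “primacy of the tribe” vector), but even a crude filter like this illustrates how the mainstream/radical distinction narrows the target list.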

Remembering that our ultimate aim in this war is to move that pollination line in Fig 1.1 as far over to the left as we possibly can, the larger problem – the problem of Islamist violence in our world – is greater than any debate over terminology. Ultimately, however, our ability to solve the problem rests on our ability to understand it, and if we can’t grasp the basic terminology and the importance of making basic distinctions between the different forms of Islamism then we’ll never find a solution.






Hadith #2363 – narrated from Anas

“The Prophet (peace be upon him) passed by some people who were busy with pollination and said: “if they would not do this, then it would still come out right”

The date crop that resulted was of a very poor quality.

Then he passed by them and asked: “what is with your date palms?”

They said: “You had told us such-and-such…”

He said: “You know best the affairs of your worldly life.”




A Blueprint for Destroying Militant Jihadism

The Problem

By now it should be obvious that the application of brute force, by itself, is insufficient in the effort to destroy militant jihadism. Similarly, while state intelligence organs have proven effective at disrupting threats to domestic security and adding new names to shiny-white balls in the drone-strike lottery, the jihadist problem still persists. It persists. And it persists because we have failed to apprehend the nature of the problem.


Nope, sorry Akhmal, it’s neither

In more ways than one, this non-apprehension stems from our tendency to glean information through computer screens instead of through people – a symptom of our preference for technologism (as exemplified by the “death from above” problem-solution continuum) instead of humanism (an in-depth understanding of old mate Akhmal and his problems). As a result, and in light of the fact that jihadist terrorism is much worse (by several orders of magnitude) than it was even five years ago, it seems that we still don’t know why cultural facts on the ground in faraway places are manifesting as effects elsewhere.

Indeed, what our misadventures attempting to defeat insurgencies in Iraq and Afghanistan have demonstrated is that our inability to understand the cultural environments in which we operate renders instantly useless any and all efforts we might make as counter-insurgents.

In this war, knowing who to kill can be less important than knowing who not to kill. A given target on the Joint Prioritized Effects List might yield indices of “1” on a threat association matrix but that same target might also be a swing-voting imam, siding with the jihadists not because of any ideological affinity he has with them but because he is engaging in a survival maximisation strategy – collaborating out of necessity.  If we had only known this before we droned him into oblivion, we might have slipped him a few greenbacks, done his speech-writing for him and used his sermons against the bad guys. In many ways, the logic for being better-informed (and perhaps more selective) in our bomb-dropping is numerical – we have a limited amount of ordnance to fire at any given location and we know we can’t and don’t want to kill everybody in that location. But we cannot be better informed until we go and get informed.

We cannot expect the mere building of infrastructure in Afghanistan’s mountainous “land of unrestraint” (yaghistan) to capture the hearts and minds of a tribal population who have a culturally-engrained suspicion of cities (shahr).

Neither can we expect the Sunni of Anbar to fight for us “out of gratitude” for the armed social work we once conducted in the past. “Hearts” (and well-building) can be valuable to us, yes. But hearts are not nearly as valuable as minds. Furthermore, without observing the “cultural mind” that is driving the phenomenon of militant jihadism, as it is occurring on the ground, the best strategy we will ever be able to hope for in our hopeless war of attrition is two 5.56mm in the heart and one in the mind.

It is clear then that what is required to defeat militant jihadism is a detailed, even ethnographic, understanding of any future terrain where this ideological conflict is likely to take place. It is not enough to simply draw causal links between jihadism and incorporeal factors like “grievances”. Nor is it enough to pin the blame for jihadist recruitment on vaguely-defined states of being like “poverty” or vaguely-described “charismatic recruiters” and “madrassas”. Further questions need to be asked by people researching these things on the ground. What are these “grievances”? Where did they come from? What is the nature of local “poverty”? If there are “charismatic recruiters” in Saudi-funded madrassas on the AfPak border, which ones in particular are churning out the bad guys? Why these ones? What is the cultural terrain in which these “bad madrassas” are ensconced? In short, what are the “roots” of the so-called “roots of terrorism”?

Female CST-2 member speaks with Afghan child

“Right, but before we blew up your school did you like to go?” (Source: Sunnyinkabul.com)

Up to now, we have largely relied on arcane computer-plotted metrics like “significant kinetic effects” to tell us what the violence looks like rather than walking around, talking to people and finding out what the violence is actually doing. By relying on the quantitative data we are missing out on the qualitative description – the somatic inputs which inform us about the totality of cultural life and the dispositions and allegiances of the people.

The Practitioners

As far as seeking to better understand the problem, the US Army Human Terrain System represented a step in the right direction. But it was a dismal failure. Putting uniforms on social scientists and asking them to “do” anthropology in the context of a military operation is laughable. One cannot be a “participant-observer” if one is dressed like the Terminator in a town where the favoured dresscode is a kaftan or a dashiki. Once the boots are on the ground, it may be too late for anthropologists to do anything useful. Indeed, if Iraq is anything to go by, it may almost be too late to do anything at all.

Having said that, let me be clear. In and of itself, counterinsurgency strategy (COIN) is, at the very least, theoretically sound. Clear, hold, build. It does work – or at least, it can work. But it only works if it is executed by a group of practitioners well-informed enough to have a mastery of the cultural terrain and a thorough understanding of the political forces driving the conflict. The agents of the British empire were only able to execute successful counterinsurgencies after hundreds of years of deep immersion in the cultural environments they occupied – much of which involved sending explorers and ethnologists like Francis Younghusband and Richard Burton out to the periphery of imperium in order to bring back cultural information and whispers of rumblings in the hills. Comparatively, 6-month military rotations whose aim is to work through a list of people to kill seem a bit pathetic.

This century has seen the US leading the ham-fisted fight against jihadism. But if the rise (or perhaps, “the scent”) of Donald Trump is symptomatic of a necrotic rot and general decline of a once great America, the responsibility for preserving Western civilisation against the very real threats which menace it will increasingly fall into the hands of smaller powers – Australia, Canada, The Netherlands, Denmark, even New Zealand.

Everyone knows the UN is broken but other multilateral institutions, like the International Criminal Court, can be leveraged and incorporated into the defence policies of small powers. We now have an international legal instrument to prosecute our enemies (war criminals all of them) – what need is there to have the Americans lock them up in Guantanamo? We now have a refuse station to deposit the trash – so why not bring Ahmad Al-Mahdi and “Caliph” Al-Baghdadi and Abubakr Shekau (and Joseph Kony, for that matter), kicking and screaming to The Hague, where we can handcuff them (naked or otherwise) to the handrails outside?

In some ways, the foreign policy decisions of a country like Australia are symptomatic of a delusion where a small power thinks itself a great power. We are the truck drivers and logisticians for America’s theme park in Iraq’s Emerald City. We supplement our big cousin’s Air Force with a few extra fighter jets (which we buy off him for exorbitant prices). Secretly however, we all know that a country like Australia (with a population of 20 million) or Canada (whose landmass is largely a frozen waste) will never be able to join the global superpower club. And really, we don’t want this anyway. We don’t want to conquer Afghanistan and install a glorious empire which will last a thousand years. We don’t want to occupy Iraq and raze all the mosques and make barbecues and the production of maple syrup mandatory. We are really just pre-emptive isolationists. All we want to do is look after ourselves and remove, surgically, the little cancers in the world which are threatening to spread.

Jihadism is one such cancer. And as with any cancer, it can be treated early or treated too late. One can cut the polyp out with a scalpel or one can wait till it becomes carcinogenic. One can pre-empt the spread or one can wait until it spreads, choosing instead to confront the problem with a bag-full of toxic chemicals (conventional military force) which is just as likely to destroy the rest of the body as it is to force the body into remission.

So how do we cut out the cancer? Well, here’s a blueprint.

The Blueprint

A few years ago, a popular model was put forward to describe why complex adaptive systems like terrorist networks are so difficult to destroy – a model which juxtaposed decentralised systems with other systems whose command is centrally-controlled. The metaphor used was “the starfish and the spider”.

A spider, as we know, is reasonably easy to kill. Crush its invertebrate body between your fingertips and all its legs – its subsidiary parts – will cease to function. The hierarchical institutions of nation-states often look like spiders. Kill the mad king, his knights surrender. Or, in the world of today, if a drone is ready to be fired and the President is in a meeting the whole operation freezes because the chain of command and control is temporarily paralysed.

The starfish, however, doesn’t need centralised command and control (C2). There is no singular brain running the show but a series of nerves running along the ambulacral surface of each individual arm. If any individual arm is cut off, it grows back – and in some species, the severed arm can even regenerate into an entirely new starfish. Each arm is, in effect, autonomous – decentralised.


A starfish regenerating an arm

Unlike with the spider, there is not one nerve centre to destroy but many waiting to grow back. And the biology maps onto reality – organisations like Al-Qaeda and the lone wolf cells operating at the periphery of the Islamic State are demonstrative of the starfish model.
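For readers who prefer the point demonstrated rather than asserted, the resilience claim behind the metaphor can be sketched as a toy simulation. The two networks below are invented purely for illustration (they model no real organisation): remove the hub from the “spider” and its arms lose contact with each other; remove any one node from the decentralised “starfish” ring and the remainder stays connected.

```python
# Compare node-removal resilience in a hub-centred ("spider") network
# versus a decentralised ring ("starfish"). Both graphs are invented.

def reachable(adj, start):
    """Return the set of nodes reachable from `start` (breadth-first)."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

def remove_node(adj, node):
    """Delete a node and every edge pointing at it."""
    return {n: [m for m in ms if m != node] for n, ms in adj.items() if n != node}

# "Spider": every cell reports to a single hub.
spider = {"hub": ["a", "b", "c", "d"],
          "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"]}
# "Starfish": cells connected in a ring, with no single point of failure.
starfish = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "a"]}

print(len(reachable(remove_node(spider, "hub"), "a")))   # 1: each arm is isolated
print(len(reachable(remove_node(starfish, "a"), "b")))   # 3: the ring survives
```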

To extend this analogy further, there are other congruent examples we could take from Greek myth – e.g. contrasting the regenerative heads of the Lernaean Hydra with the single-minded Delphic Python (the classic mythological serpent: “cut off the head and the body dies”).


Guess who the terrorists are in this picture.

With this model in mind, our task is now to figure out ways in which to kill these “starfish”, given that our current strategy (the drone-strike lottery) is having a limited net effect on the battlespace. As stated earlier, the fundamental problem with our approach to this conflict has been our inability to understand the taxonomy, the anatomy and the reproductive capacity (that is, the nature) of the starfish – so, in many ways, the problem comes down to a problem of information and intelligence collection.

The nature of the information-space today is different to what it was during the Cold War. Unlike during the Cold War (when information was scarce and jealously guarded by those who held it), today’s “globalised” world is defined by what the anthropologist Arjun Appadurai calls “trajectories of disjuncture”. Information is no longer hidden here and there; it is everywhere, available to everyone. It is no longer the purview of spies in the employ of the government. It is ripe for the picking by anyone – journalists, lobbyists, soldiers with blogs, unemployed hobbyists surfing the internet. Much of the information (but not all) is already out there, at one’s fingertips, waiting to be apprehended.

Traditional intelligence organisations still fulfill a set of vital and specific functions. They collect high-level information which circulates through diplomatic circles; they analyse specific sets of information as it pertains directly to government policy; and, crucially, they deliver advice to policy-makers. But the world is far too big, the desertscapes and mountain ranges where jihadism is metastasizing are far too expansive for a bevy of urbane and taciturn bureaucrats to apprehend the nature of the problem as it appears on the ground.

Michael Nagata, the Japanese-American general who was until recently the head of the Pentagon program to train and equip Syria’s rebels, argued that in the fight against jihadism it will “take a network to defeat a network”. Following this logic then, the government (a centrally-controlled spider) is going to need help from the outside. This is where the private sector can assist.

Unlike a modern nation-state, there is no inevitable form which an entity in the private sector need adopt. Businesses like eBay have made billions by wresting control from central authority figures and placing it in the hands of the masses – by becoming thriving profitable starfish.

In general, and for good reason, there is a healthy suspicion of handing any kind of role in the War on Terror to the private sector. Indeed, apart from the problem of accountability there is a similar suspicion (if not outright distrust) of the motivations of those in the business world. Just look at the controversies surrounding the free rein that private military corporations like Blackwater have had over diplomatic security in Iraq. On this point, Machiavelli said most of what needed to be said regarding the problems posed by mercenaries in his writings about the condottieri in 16th-century Italy.

In some cases however, specific and limited outsourcing of government war-time tasks to the private sector can prove indispensable rather than inimical. Contractors are profit-minded which means, if they are paid according to outcomes, increased efficiency in the use of time and resources. Consider, as a heuristic comparison, the time it might take an individual military contractor to board a plane to the UAE and take up a job training Arab forces (as many retired Western soldiers have done) versus the time it might take an Australian military unit, even a special forces unit, to do the same.

The main problem with outsourcing, of course, will always be the issue of accountability. But insofar as the government holds the purse strings the private sector will always be accountable to its pay-masters. And laws still apply to individuals. Furthermore, with a degree of separation between the public and private sectors comes an additional, and useful element of deniability for the government. A condottiere does not carry a government ID card – therefore the government cannot be burned at the stake for the condottiere‘s shortcomings.


The condottieri, the gentlemen-mercenaries of Italy

Ultimately then, given what has been discussed about our cultural knowledge-gap and given the future role which smaller, devolved, government-affiliated but private entities might play, one could conclude that our order of battle (particularly in the sphere of intelligence-collection) needs a complete restructure. And it starts, of course, with government itself.

The current force pitted against jihadism behaves much like “the spider” – where a single-minded body controls eight independent and often knock-kneed arms (*cough* Sovereign Borders). But as the war evolves, it is increasingly clear that what is required to destroy militant jihadism, once and for all, is an organism that more closely resembles a jellyfish.

To biologists, jellyfish are known as medusae, named for the chthonic snake-haired monster from Greek mythology. A medusa typically takes the form of an umbrella. In this metaphor, the upper surface (the exumbrella) is the figurehead of governance (an influencer but not necessarily a decider of the mundane and everyday) which encompasses everything. The exumbrella is in turn supported by a pulsating hydroskeleton (a more efficient, flexible bureaucracy) and a tangle of toxin-delivering stingers (the military, especially the special forces).

The key distinguishing feature between the jellyfish and its older arachnid self is obtrusiveness of presence. While the spider is intrusive – a blot in one’s surroundings, a menace, something to be feared – the jellyfish is confidential, cordial almost, barely noticed as it pulsates seamlessly through the environment. In battle, however, a medusa is just as lethal as the spider. The semi-transparent Australian Irukandji, the smallest of the box jellyfish, is also the deadliest, despite being the size of a fingernail.

Again, and crucially, the jellyfish is not intrusive – it does not meddle, disrespectfully and contumeliously, in the same way that the spider does. Jellyfish do not “hang out”, occupying the corner of a room and hunkering down in a maze of silk and HESCO, browbeating those caught in its web about the virtues of democracy. Jellyfish simply “bloom” – reproducing seasonally and in large numbers when the sunshine increases – in a way which, crucially, never disrupts the ecosystem.

Still though, the bloom hunts. And there is yet prey to be hunted.

So, the bloom goes forward. And swimming with, amongst and at the vanguard of this bloom will be other carnivorous hydrozoa – sworn into the service of the medusan public but privately employed – at an arm’s length. Hydrozoa like the Portuguese Man o’ War, which distinguishes itself from the bloom jellyfish in that it is not one organism comprised of many cells but a colonial organism made up of many individual organisms called “zooids”.


The fleet moves

In principle, the privately-contracted Man o’ War is independent from the bloom and this independence can be useful to the bloom. The Man o’ War remains accountable to the bloom, who feed it the bloom’s scraps, but it complements the bloom because its structure – with its many “zooids” – is different. These zooids can reproduce at random through a process called “direct fission” – redeploying copies of themselves instantaneously.

As the bloom and the Man o’ War approach the juvenile starfish, teams of these zooids break away and descend upon the prey. The zooids attach themselves to the prey’s exterior – problematising the nature of the prey, fissioning further to create more zooids – “local” zooids – who can solve the deeper problem of the prey, assisting with the uncreation of the prey by thickly describing the prey. The zooids prepare the battlespace for the rest of the bloom by showing the bloom where the prey’s weaknesses are; what the prey subsists off; contextualising the prey as the right prey within an entire seabed of prey (that is, not our mate Imam Akhmal) in a way which complements the inputs gathered by the bloom’s sensory organs – the bloom’s spooky spy-feelers.


A zooid. Microscopic. Deadly.

Having colonised the prey’s crusty back, the zooids weigh the prey down and the prey is consumed by the bloom. Then, the bloom moves on, in search of more prey, with the auxiliary zooids swimming in front, disappearing silently into the deep.

This is the blueprint for defeating militant jihadism and we need to get behind it – money and all.

In a statement directed at his government pay-masters, General James Mattis, the Warrior-Monk of the US Marine Corps, put the issue of the allocation of resources rather succinctly: “if you don’t fund the State Department fully, then I need to buy more ammunition,” he said.

Applying Mattis’ logic, if Western governments with vested interests in the problem of terrorism don’t properly fund ground-based, human-conducted research which seeks to grapple with the problem in the places where it is metastasizing, then those same governments are going to need greater funding for missile research. In any given location, by the time the jihadist problem requires a military intervention to slow the spread, it is already too late.

The key to destroying militant jihadism is a re-structuring of the intelligence sector and in part a devolution of certain functions to the private sector (getting behind the mercenary zooids) to assist with collecting more information about the problem. A knowledge-gap persists. And we need to fill it.

Take the spread of jihadism in Mali for example. Jihadism has been spreading, rapidly, over the last two years. Right now, of course, everyone is paying attention to ISIS because ISIS is in vogue. ISIS are the ones with all the fancy videos and media attention. But while Iraq and Syria are now firmly in the clutches of jihadism, a new group – the Force de libération du Macina (FLM) – is growing in central Mali. Before, jihadism was just a problem in the far north of Mali, a fad amongst a few Arab traders and disaffected nomadic Tuareg. Now, for the first time, FLM is targeting settled Fulani in Mali proper, wooing them to jihadism with nostalgic dreams of long-since forgotten caliphates. This is where the zooids will prove indispensable. Send in the zooids. Let them find out what’s happening in the Sahara. Indeed, what is happening in Fulani Mali? What is the problem? Why is the cancer spreading?


James Mattis

Lo, The Terminus

“Oooh! For Christ’s sake let me alone!” cried the wounded man, but still he was lifted and laid on the stretcher.

Nicholas Rostov turned away and, as if searching for something, gazed into the distance, at the waters of the Danube, at the sky, and at the sun. How beautiful the sky looked; how blue, how calm, and how deep! How bright and glorious was the setting sun! With what soft glitter the waters of the distant Danube shone. And fairer still were the faraway blue mountains beyond the river, the nunnery, the mysterious gorges, and the pine forests veiled in the mist of their summits… There was peace and happiness… “I should wish for nothing else, nothing, if only I were there,” thought Rostov. “In myself alone and in that sunshine there is so much happiness; but here… groans, suffering, fear, and this uncertainty and hurry… There—they are shouting again, and again are all running back somewhere, and I shall run with them, and it, death, is here above me and around… Another instant and I shall never again see the sun, this water, that gorge!…”

At that instant the sun began to hide behind the clouds, and other stretchers came into view before Rostov. And the fear of death and of the stretchers, and love of the sun and of life, all merged into one feeling of sickening agitation. – Leo Tolstoy, War and Peace, Book II, Chapter VIII


Navigating the terminus of the hollow, melted-out Sphinx Glacier.

The old man, chained, by time, to his wheelchair, looks up at me with eyes wide. Medical paraphernalia protrudes from him everywhere. On his wrist, there is a coloured band with a name and a number and a barcode – the international accessory of the admitted infirm.

“Are you going to be around here for a while?” he asks with open palms. Fingers spread, hands pointing up – like a supplicant.

I come to a halt in front of him, keys jangling at my waist, short-wave radio clasped to my belt. In the evenings I work security at the hospital, doing my two-hourly rounds through palliative care. Checking the locks on doors, alarm systems, fire panels. That kind of thing.
“Jack was a logger” according to the life synopsis that the nurses have sticky-taped to the wall next to the door to his room.

He left his native Ontario at age 15 and worked his way across the country on the trans-Canada railroad. A stint in the boiler rooms of the coal-powered ships crossing the Pacific followed; then time in Papua New Guinea hunting “alligators” [sic]. Later, he would “serve as a mercenary” and then, returning to Canada, serve with the RCMP as a Mountie above the Arctic Circle. Then, he settled down, in the fjords of British Columbia, with his wife and three children. This is the bio of a man who has lived a very full life – an adventurous life. Jack was a “fun hog” in the sense that Chouinard and Tompkins might have used the term.

“Are you going to be around here for a while?” he asks me again.

I nod, and point to the “Security” embellishment on my uniform. “I’m always around,” I say.

He doesn’t hear me. Jack is mostly deaf and the deafness does not help with the dementia. He beckons me toward him, asking me to repeat myself – and, leaning in, progressively closer, I eventually give up.

I hold up two fingers. Jack can still see. He gets it. Kind of. “You’re here for two hours?”

I nod. Close enough.

“But I need someone to watch out for me,” he says. “Can’t you stay awhile and watch out for me?”

I nod. “I’m here for you Jack,” I say. He doesn’t hear me.

“I need someone to watch out for me,” he repeats.

A nurse at the nursing station, seeing me detained part-way through my patrol, intervenes. “Come on now Jack,” she says, and she approaches, inserts herself into Jack’s surroundings and then smiles at me as if to say, “it’s OK, I’ve got this now. You’re right to go on.”

I look back at the bio sheet on the door again, reading more about Jack’s life. Here, the choice of tense in the wording stands out. Jack “was a logger”; “he enjoyed fishing”; “he took to deep-sea sailing on the West Coast”. Here is a life history written in the past tense – the same tense we employ for the life histories of Norgay, Napoleon, Nietzsche – as though the man were already dead.

A nurse reports that one of the maintenance guys has left the door to the outside workshop open. It’s my job to go and lock it. Access control. I step out a fire escape. The evening is clear and cold in Squamish. No winter rains today. Just the chill as the last of the day’s light disappears behind the Tantalus Range. I look east towards the Garibaldi range. In the distance, the Crosscut Ridge of Mt Isosceles is seen through the valley-gap between Crumpit Woods and the lower flanks of the Chief, silhouetted in the light of a rising moon. In its current state, caked in ice and snow, the Crosscut Ridge is very much in winter condition. Last summer, we’d tried to get in there to climb it, only to be shut down by weather and distance and ability. Late season conditions. Melting glaciers reaching the end of their lifespans.

The saw-toothed Crosscut Ridge, “the obscure object of our desire”, centre-right.

I was preparing for another shot at it in the early spring, hoping to use skis to cut the approach time by traversing the ice floes on Garibaldi Lake. This time before the summer sun had melted everything out and before the glacier became a labyrinth again.
I return inside and patrol through the “Intermediate Secured Unit” – where they put the high-risk patients – and then, with my rounds complete, I step out into the main hallway again. Someone else, Jim, an old miner, is complaining that another resident entered his room and stole all his stuff. He seems upset. Upset people can become aggressive and Jim has a history of aggression. For the most part, I ignore him. I let the nurses know about his problem and tell them to raise me on the radio if they need me.
I walk away. I don’t much want to grow old, I think to myself, although I know that one day I will have to. I don’t want to die either but I know that this is not an option available to me.
In pre-modern Japanese society, the base of Mount Fuji was said to be a site for a practice called ‘ubasute’ – whereby the elderly and the infirm were left before the mountain’s bosom to die. Similar things have been said of pre-colonial Inuit society, where “old Eskimos were set adrift on ice floes” – farewelled into Nature’s arms. The historicity of these past practices is the subject of intense debate. They may indeed just be myths. But the fact that rumours of these other-worldly practices have persisted (even if solely amongst foreigners gossiping about the Other) reminds us that the problem of how Man should spend his last days is a problem we have not yet solved as a species. We are uneasy about, and perhaps not yet satisfied with, the systems we have designed for dying. How can we be?
I walk on through the corridors, passing by the infirm in their beds – respirators on, holding on, clinging on. Televisions play in all the rooms. Just another half hour of television. Hold on just a little bit longer. I feel very happy for my beloved grandfather (just passed in December) that he did not spend long in permanent care before he died. He escaped that fate – the fate of a man dying while surrounded by others who are also dying. Quickly and painlessly, he went.
The next day, the rains return but then it clears for a while around midday. I can see my objective again – the Crosscut Ridge. I imagine myself on top of the highest gendarme – picking my way along its plated back. I am looking across my domain – my mountains – and I am wondering what it will feel like to die. I am wondering what it must have been like for Ari, when he fell from Mount Aspiring. What were those seconds like? Those final seconds of falling, before impact on the Bonar Glacier? Surely, there must have been fear. Anxiety. But still, I have to believe, I must believe that he was at peace with himself – that he’d accepted it, and in accepting it, experienced a sensation of something akin to bliss.
Yes, I think to myself, gazing across at Garibaldi and Phyllis’ Engine and the Sphinx – mountains named for beings past, both real and fictive, with their own life histories attached. Death is a problem.
A host of dark questions gnaw at me. How do I stay alive in these mountains? How do I keep living without growing old? How do I face the inevitable without becoming a nihilist? How much more of this beauty can I enjoy before I am too old to keep seeking it out? And will I be able to find enjoyment, find beauty in other things, when I am too old and too weak and I’ve lost my mobility?
A few days later, I clock on again at the hospital and continue on my rounds through the residential home. Jack, in his wheelchair, is in the hallway again. He looks docile now. The faintest hint of a smile crosses his lips. Like the dying Count Bezukhov, the father of Pierre, the protagonist of War and Peace:
“While the count was being turned over, one of his arms fell back helplessly and he made a fruitless effort to pull it forward. Whether he noticed the look of terror with which Pierre regarded that lifeless arm, or whether some other thought flitted across his dying brain, at any rate he glanced at the refractory arm, at Pierre’s terror-stricken face, and again at the arm, and on his face a feeble, piteous smile appeared, quite out of keeping with his features, that seemed to deride his own helplessness.”
Jack, half-smiling still, is wheeled back into his room by a carer, embracing the infinite jest of it all. And me, the mountaineer just down from my mountains, the summiteer but after the fact, the security guard on my lonely night patrol – I am left, alone, in the hallway. Alone with another pithy quote. Nietzsche. The so-called nihilist, again.
“One should part from life as Odysseus parted from Nausicaa,” Nietzsche wrote. “Blessing it, rather than in love with it.”
I poke my head around the corner and see Jack being helped out of the wheelchair and into his bed. He moves at a glacial pace – the sound of the crepitus in his bones like the crack and grind of crevasses in the fracture zone. The whole mass is moving downstream to its end. Here, at his terminus, Jack is ready to go. Ready to transition from one world into the next.

Why I Chose To Commemorate Australia Day This Year


As the date marking the arrival of the First Fleet at Sydney Cove passes us by, we have all paid heed to the now-annual calls for Australia Day to be struck from our national calendar. Yes, this year, like every year, we have heard how the date treasured by lovers of barbecues, beer and Triple J is not “Australia Day” but “Invasion Day” – a date which, rather than commemorating some abiding sense of Australianness, instead grotesquely celebrates the beginning of White Man’s colonization of the Great Southern Land. January 26, 1788, so this line of reasoning goes, marked the beginning of a cultural genocide which systematically dispossessed the indigenous peoples of Terra Australis of their land, history and future. It is therefore a national disgrace to be celebrating Australia Day on this date.

This year, among the “Down With Australia Day” pronouncements, a popular video produced by Buzzfeed has been doing the rounds on social media. Labelled “an aboriginal response to ‘Australia Day’”, the video documents the responses of several indigenous speakers who discuss what Australia Day means to them. Celebrating Australia Day, according to several of these speakers, is “insensitive”; a commemoration of an “invasion”; a day which is “really really sad” for the suffering sown by British colonists after their arrival.


“Australia Day…. urggghh”



“Don’t you mean ‘Invasion Day’?”


“Invasion Day… It’s insensitive to say the least”

While in principle I think there’s merit to some of the arguments advanced in this video (e.g. that January 1, the day of Federation, would be a more appropriate date on which to celebrate Australia Day), for the most part the video reproduces vile, viral memes which add little to serious discussion about “real” issues in aboriginal Australia (like continuing disadvantage in remote-living communities). Instead, it regurgitates spoon-fed fallacies about the history and culture of aboriginal Australia which hold no weight anthropologically or historically.

In today’s blog rant I will try to outline some of the main fallacies in today’s discourse about Aboriginal and White Australia, all of which are present for reproduction and retransmission in silly videos like the Buzzfeed one above. So below: a dissection of some random piece of click-bait I watched on Facebook – for better or for worse.

1. Sweeping Generalisations about Indigenous Australia

What is perhaps most remarkable about this video is the sweeping generalisations and falsehoods many of its speakers make about Aboriginal Australia. This is even more remarkable because the speakers self-describe as indigenous Australians.

“Oldest Surviving Culture”

The first and most obvious fallacy is one speaker’s assertion that Invasion Day marks the survival of the oldest culture on earth. Anyone who has browsed through a tourist brochure selling bite-sized aboriginal cultural experiences is probably accustomed to the “oldest surviving culture” claim. While aboriginal peoples have inhabited Australia for a long, long time, even the most cursory examination of what a “culture” actually is shows how ridiculous it is to apply superlatives like “oldest” or “most survival-ey”.

Basic anthropology tells us that “culture”, a term which describes the prevailing set of discourses and practices within a given human society, is not static. Rather, “culture” is a continuously evolving set of norms in a state of constant flux. Since culture is ever-changing, ever-transforming, resembling something one day and something else the next, to talk about “aboriginal culture” as possessing attributes like “age” or “hardiness” is meaningless. Indeed, it is no more valuable to talk about Aboriginal culture as being “the world’s oldest surviving culture” than it is to talk about the mace carried by the Serjeant-at-Arms of the Australian Parliament as being “a cultural relic of the bludgeoning instruments used by early hominids in the East African Cradle of Humankind”.


Behold! The distant descendant of a spear-wielding !Xoo subsistence-hunter from the Kalahari Plain… in traditional dress.

The point is that Aboriginal culture today is utterly different from what it was in 1788 (the destruction of aboriginal culture by colonialism is the very reason Invasion Day is so controversial in the first place). And perhaps the greatest irony in this video is that the speakers talking about “oldest surviving cultures” are wearing European-style business attire and German Adidas T-shirts, speaking English and talking into Japanese-made video cameras.

Beyond the anthropological falsehoods which the claims of “oldest culture” represent, there is an obvious cognitive dissonance when people speak about “cultural genocide” and “oldest surviving cultures” in the same sentence. Which is it? Was your culture wiped out or did it survive? Personally, I think that the dispossession of aboriginal people in Australia constituted a cultural genocide (just look at Tasmania), a historical truth which simply adds to why I so thoroughly dislike the idea that “aboriginal culture” (whatever is meant by this rather general term since there are hundreds of different language groups) has some kind of ageless survivability.


“[Australia Day] pisses me off”

“They were a peaceful people”/”We are an inclusive people”

According to the young boy interviewed in the video (who, it should be noted, is clearly below the age of informed consent as an interviewee – a black mark on Buzzfeed’s journalistic standards), the arrival of the First Fleet was a day when Europeans came and “slaughtered… a peaceful people”. Apart from the historical fallacy that the First Fleeters simply rocked up and started slaughtering people on the same day they arrived (more on this later), there is a more pernicious untruth in the claim that the indigenous inhabitants of Australia were any more “peaceful” than anybody else who has ever lived.

“Inclusiveness” is also advanced by one of the speakers as a cultural component (an apparently exclusive one) of aboriginal Australia. While most of the aboriginal informants I have come across during ethnographic research in Cape York could be described as both “inclusive” and “peaceful”, to claim that either of these adjectives is an abiding and overwhelming cultural trait is to make a sweeping generalisation without the backing of the empirical record.

Certainly, in pre-colonial Kuuk Thaayorre society, clan rivalries saw the Thaayorre come into almost constant violent contact with members of the Kuuk Yaak language group (“snake speakers”), a historical enmity which manifested in the eradication of the Kuuk Yaak as a cultural unit. No one in the modern Cape York community of Pormpuraaw self-identifies as “Kuuk Yaak” anymore – one is either “Wik-Mungkan” (a language group with strong ties to the township of Aurukun to the north) or Thaayorre. The elimination of a neighbouring enemy tribe seems neither “inclusive” nor “peaceful” to me.


My good friend Peret Arkwookerum (nicknamed “Wookie” meaning “flying fox”… which is also one of his totems), half Wik-Mungkan, half Kuuk Thaayorre, catching a black bream on his first cast at a sacred site near Pormpuraaw

We know, of course, that the video’s speakers are trying to argue that the pre-colonial Eora of Sydney were comparatively peaceful and inclusive (at least in comparison to the British). But even in the case of the Eora, there is little evidence to suggest that they were any less war-like than the Kuuk Thaayorre of Northern Queensland. Spearings were a common occurrence among the natives of pre-colonial Sydney. Disputes were often settled by violence. Under Pemulwuy, a group of aboriginal insurgents gathered to resist (rightfully) the settlers occupying their lands. Pemulwuy himself was rumoured to have been blinded in one eye in a violent incident with an enemy from another tribe.

Indeed, with all the violence and exclusiveness observed throughout the history of Aboriginal Australia it is fair to say that perhaps one of the most remarkable features about Aboriginal people, historically and into the present, is how remarkably like the rest of us they are. Aboriginal people are people – and like all societies, pre-colonial Aboriginal society had its racism and its bloodshed, its exclusivity and its conflict. Peace and not conflict is the exception to the rule throughout most of human history and it was no different in the Australia that existed before the arrival of Europeans. To imagine pre-colonial Aboriginal society as having been the embodiment of some kind of Utopian dream-state is to deny almost everything we know about the evolution of human history. We may as well start waxing lyrical about “noble savages”.


Bennelong, an Eora collaborator described by Watkin Tench as “a second Omai”, the textbook “noble savage”.


Pemulwuy. Aboriginal Australia’s most successful guerrilla leader.


2. Excessive Use of the First Person Plural (“We”, “Our”)

One of the traps we often fall into when talking about the historical lives of our ancestors is what I would like to call “the excessive use of the first person plural”. When you identify with people who lived hundreds of years ago, seeing them as part of the extended genealogical network we call “our family”, it is easy to start using terms like “we” and “our” when discussing what happened to these historical family members during their lifetimes… even though we ourselves weren’t there to witness or feel what happened to them.

Now, I’m not going to sit here and claim that there is no validity to the idea of “inherited grievances” nor am I going to deny that oppression and structural violence experienced by members of the same social group can be felt, in real terms, for generations (and still continues to be felt by aboriginal peoples today). Indeed, ancestry and inherited grievances are complex issues which are heavily tied to peoples’ conceptions of their own identity.

But my main problem with somebody claiming that January 26, 1788 was “that day that we lost all that we had” stems from the fact that although you might have had relatives who were there and suffered at the hands of the Sydney Cove colonials, you yourself weren’t actually there.

In a similar vein, some years ago, while munching on a shawarma in a Jerusalem hole-in-the-wall eatery, I listened to an Israeli man prattle on about “how we [the Israelites] suffered at the hands of the Philistines (the pre-modern Palestinians)” – how they took our land, et cetera, et cetera (until David came along). Naturally, being in Israel and surrounded by heavily armed IDF soldiers doing the rounds through the Old City, my reaction was to smile and nod. Inwardly, however, I couldn’t help but think: “Really, good sir? This suffering at the hands of the Philistines happened to you, did it? You personally?”


In the presence of overwhelming firepower one is inclined to agree with whatever one is told. In the Old City of Jerusalem, some years ago.

You see, one of the major problems with constantly reaching back to form connections with events which happened to past peoples is that 1) it can perpetuate cycles of violence (as we see in the “who stole whose land” debates in the former Yugoslavia and the Holy Land, which date back to the Middle Ages) and 2) it becomes easy to fall into an ontological phantasm whereby you confuse something that happened to a historical person (whom you never actually met) with something that happened to you yourself.


Can someone please explain to me who stole the Temple Mount from who again?

Of course, I’m not suggesting that the subjugation of the Eora peoples in New South Wales in 1788 has no bearing on a Guugu Yimidhirr person in Far North Queensland in the present. Events like the arrival of the First Fleet are butterfly-effect events – events which generated and continue to generate sociological hurricanes across the continent. Equally, I’m not suggesting that today’s aboriginal Australians should collectively “get over” the dispossession of their ancestors from their native lands, nor am I suggesting that it is wrong to draw parallels between the historical suffering of Australia’s first inhabitants and the ongoing structural violence directed against aboriginal peoples in this country. It would be insensitive to tell people to “get over” a cultural genocide and it would be factually incorrect to claim that the use of the first person plural in the context of one’s ancestors never holds any weight.

What I am really ranting against here is the excessive use of terms like “we” and “our” when talking about past persons and historical events. I have genetic links to the starving Irish who were loaded onto ships and sent to a penal colony in the Southern Hemisphere but I am not those Irish. I have an ancestral link to Lt Jack Walsh (my great-grand uncle), the first Queensland officer to take a bullet to the head at the landing of ANZAC, but I wouldn’t have the slightest idea what taking a bullet to the head actually feels like.

Similarly, it is perfectly valid for me to claim that my ancestor the Scottish outlaw Rob Roy McGregor was “one of us” (“us” being “Clan Cattanach”… represent; “touch not the cat, bot the glove”) but it would be excessive to claim that everything he lost and experienced at the hands of the English was physically lost and experienced by me as well. In many ways, to claim Rob Roy McGregor’s suffering as my own would be not dissimilar to claiming his achievements as my own – in the way today’s patriotic Americans claim “the liberation of France from the Nazis” as one of their own personal achievements (see comedian Doug Stanhope tear these kinds of claims apart).

3. The Date Itself


Perhaps the most eloquent and sensible speaker in the video is the dude in the red and blue shirt. His understanding of Australia Day, as he describes it, is like “if a guy comes into your house, does horrible things to your family, and says ‘we’re gonna have a party and have a barbie and listen to Triple J on the date we turned up’.”

Read as a celebration of “the day White Man turned up” (a date which symbolically represents the beginning of a cultural genocide), Australia Day does seem a bit “sadistic”. Again, I agree that there is some merit to the idea of picking a different date to celebrate Australia Day – a more neutral date like January 1 (the date of Federation in 1901) – which doesn’t carry the same historical and emotional baggage as the arrival of the First Fleet.

But to play devil’s advocate: if we as Australians have a responsibility to “never forget” what happened to Aboriginal Australians under colonialism, then doesn’t it make sense to commemorate the arrival of the First Fleet in much the same way that “never again” commemorations have memorialised the genocide in Rwanda or apartheid in South Africa? Indeed, the counter-cultural “Invasion Day” is dredged up every year simply by virtue of the date Australia Day already falls on, so wouldn’t all these awareness-raising efforts about the atrocities in Australian history fade into obscurity if the date were simply changed?

Similarly, if one is being faithful to the historical record, one should note that January 26, 1788 most certainly was not the bloodiest chapter in the history of European colonialism in Australia. There were no slaughters or massacres carried out on the day of the Sydney Cove landing (which, according to my readings, was actually a few days before January 26 anyway). Indeed, compared to some of the other dates in the colonial history of Australia, January 26 was a comparatively tame one. Australia Day does not commemorate, for example, the date of the first landfall made by Europeans on Australian shores – in early 1606 – when the Dutch navigator Willem Janszoon made the first contact with aboriginal Australians near Cape Keerweer – a contact which was characterised by the massacre of “savage, cruel, black barbarians” who had slain some of Janszoon’s sailors.

Neither does Australia Day commemorate travesties like the Black War in Tasmania, part of which involved the formation of an extended line by the 63rd Regiment to corral Tasmanian aborigines into a penal colony on the Tasman Peninsula (and/or shoot them on sight). Indeed, while the landing at Sydney Cove marked the beginning of an awful period of colonization and oppression, the date of the landing itself – January 26, 1788 – was a pretty low-key, native-friendly event. Relations between aboriginals and settlers remained amicable for at least the first year… until the Governor’s game-keeper, John McIntyre, started slaughtering Eora for fun on his hunting parties, resulting in his own death at the hands of Pemulwuy.

That said, arguing that it is acceptable to celebrate Aussie Day on January 26 because January 26 is not as bad a date as other dates hardly makes for a good moral argument, and I’m not seeking to revise the history of the First Fleet’s arrival by painting it as a harmless event in our nation’s history.

More than that, what I’m not calling for is an Andrew Bolt version of Australia where Aboriginal people just “get over” the wrongs done to them and “pick themselves up by their own bootstraps”. Nor am I advocating for any particular position in the discussion over who owes what to whom between White and Aboriginal Australians in modern Australia – the ins-and-outs of Native Title still need work.

What I am calling for is a little bit more intellectual honesty in the way we discuss aboriginal Australia and the history of European colonialism in Australia. Yes, the colonisation of Australia and the dispossession of its native peoples was a travesty of genocidal proportions. But no, the First Fleet did not land at Sydney Cove and immediately begin slaughtering people in droves. Yes, aboriginal Australians have inhabited Australia for at least 50,000 years. But no, Aboriginal culture is not “the world’s oldest surviving culture”, because the very idea of an oldest surviving culture is a load of anthropological horse-shit. (And anyway, what about the uncontacted Yąnomamö peoples of the Orinoco basin, or the grumpy, resistant-to-contact North Sentinelese of tsunami-survival fame? They’re pretty old-skool as well.) And finally, yes, there are many nice aboriginal people around the traps today, but to imply that every one of the hundreds of language-groups which constituted pre-colonial aboriginal Australia was “peaceful” or “inclusive” is utterly misleading.

Ultimately, it’s worth mentioning that the above video was produced by Buzzfeed (under the rather stomach-churning watermark “Buzzfeed Aboriginal”), the internet’s chief purveyor of clickbait-for-profit. So the video is perhaps not really worthy of serious intellectual consideration. Certainly, we know from the outset that the video is designed to emotionally manipulate us into sharing and spreading (not unlike videos produced by ISIS or the Lions of Rojava in Syria). And yes, sharing and spreading is something that White Guilters all over my Facebook newsfeed have certainly done… by my last count this video has 2,082,458 views.

Indeed, the white demographic of the Facebook video-sharers is worthy of note. Conspicuously absent from the re-share meme-train are any of my aboriginal and Torres Strait Islander friends on Facebook – probably because they are too busy catching barramundi or counting crocodile eggs with the Indigenous Land and Sea Rangers program or doing other things… like protecting the country.


Wookie examines one of his totemic ancestors (“minh pinch” is the Kuuk Thaayorre word for “crocodile”)

As for me, while my aboriginal friends are out fishing and drinking beer on Australia Day, lapping up a beautiful Cape York sunset, I’m writing this from the cold depths of wintry Canada and I’ll be spending the rest of the day dreaming of barbecues, thongs, beaches and Triple J. After that, I’ll be waiting for ANZAC Day – sharpening my pencils for the annual debate over whether the remembrance of the landing at ANZAC Cove constitutes a day for the mourning of dead sons or a day when Australians unite to glorify bloodshed and violence. Probably, ANZAC Day (like Australia/Invasion Day) is really a little bit of both – a celebration of what we have and a remembrance of what we lost.


Sunset over the Gulf of Carpentaria in the Western Cape York community of Pormpuraaw. Reminds me very much of the Aboriginal flag.

happy aus day