Sunday, May 31, 2015

Nutritional and Agricultural Realities of Veganism

Note: This is an excerpt, with some important corrections, from a post I made on the Anarchoprimitivism subreddit. You can find the original thread here.

For veganism not to negatively impact your health due to a poverty of important nutrients, namely certain vitamins (K2, B12), protein, and fatty acids, you have to rely on plant foods that 1) aren't really possible to amass in quantity solely from foraging and 2) require additional processing in order to make the necessary nutrients available. I'll take each point in detail.
1) Most plant foods (fruit, tubers, roots, various aerial parts like stalks, shoots, and leaves) are quite poor in protein and fatty acids, and, depending on the climate, only seasonally available. Plant foods that are relatively richer in these nutrients, such as seeds (which include nuts, beans, and other legumes), are either impractical to gather in sufficient quantities to adequately satisfy human nutritional requirements or are otherwise impractical to process--i.e., you may have tons of acorns or walnuts, but it is extremely labor-intensive to extract the edible portions of the food. In the case of acorns, the tannins need to be leached out before the acorns are pounded into flour. In the case of walnuts, one Native American method was to crush all the nuts and boil everything until the meat floated and the shells sank to the bottom of the pot. This would take hours (gather, crush in batches, boil in batches, strain, repeat), as opposed to just snatching a few blackberries along the path. As a supplement to a diet that includes animal foods, it may be reasonable to boil walnuts once in a while to add variety. As a staple, though, you'd be rather tied to this method of preparation for lack of alternative sources of nutrition. What's more, many plant foods (including walnuts) contain "anti-nutrients" such as phytic and oxalic acid that interfere with the absorption of nutrients. These can be reduced by extended cooking and/or soaking, or by germination. All require extra time and effort. Animal foods generally do not contain anti-nutrients.
In general, it is less efficient to derive certain nutrients from plants than it is from other foods, including animal foods. The point about anti-nutrients notwithstanding, many nutrients touted in plant foods actually don't exist in an immediately bioavailable form, or are less efficiently absorbed. For example, most people associate carrots with providing loads of Vitamin A in the form of beta-carotene. However, getting Vitamin A in the form of beta-carotene turns out to be 1/12th as efficient as getting it in the form of retinol from animal sources. By weight, beef liver has about three times as much retinol as carrots have beta-carotene, which means that it is about thirty-six times less efficient to derive Vitamin A from eating just carrots than it is from eating just beef liver. Another example would be the omega-3 fatty acid alpha-linolenic acid (ALA) found in abundance in flax seeds. ALA is one of three omega-3 essential fatty acids, the other two being the more important eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), the most crucial of the three. Although humans can synthesize EPA and DHA from ALA such as that found in flax seeds, the efficiency of either conversion is only a few percent. In animal foods like meat from pastured animals and wild oily fish, DHA and EPA are readily available for absorption and don't require any conversion. This makes sense: components from animals are more easily recognized and repurposed in our own animal bodies, since they are more or less going to be applied to similar purposes from one animal to the next. The calcium in cow's milk or in bones can be very easily repurposed for your own bones--the calcium in broccoli, not as much (again, partly due to anti-nutrients). Add to this extra step in metabolizing nutrients from plant foods the fact that digestion of plant foods is less efficient than digestion of animal foods, and it starts to become apparent that relying on plants to supply the majority of one's nutritional needs is not a choice strategy from a pure survival standpoint. For example, it is relatively common to find undigested plant foods in the stool of someone with weak digestion, but it's relatively unheard of to find undigested muscle fibers from a chicken in the same person's stool.
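To make the carrot-versus-liver arithmetic explicit (using the rough figures above, which are illustrative rather than lab-grade): a 12-to-1 conversion penalty for beta-carotene, multiplied by liver's roughly 3-to-1 advantage in concentration, gives 12 x 3 = 36. Gram for gram, then, carrots supply on the order of one thirty-sixth the usable Vitamin A of beef liver.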
Other staples that vegans typically depend on, like legumes, aren't available in sufficient quantities in the wild to even sustain a small population of humans. Thus, these plants need to be domesticated in order for the vegan strategy to possibly work. In modern veganism, for example, soy is an important source of protein and there has yet to be a suitable substitute for it. This leads me to the second point.
2) Because of the high amounts of anti-nutrients in some raw plant foods, particularly those valued by vegans as substitutes for meat-based protein, many agricultural cultures have learned to ferment or otherwise process certain foods in order to reduce or eliminate anti-nutrients and make other nutrients available. Taking the example of soy again, East Asian cultures have a long tradition of fermenting the bean into soy sauce, natto, miso, and tempeh. Of these four fermented soy products, only natto provides Vitamin K2, a nutrient critical for bone health and proper blood coagulation that is generally not found outside of animal and fermented plant foods. Fermentation of certain plant foods (but not soy) also makes Vitamin B12, a critical nutrient generally only found in animal foods, available. Fermentation, however, requires well-controlled conditions in order to work properly, which implies sedentarism and a degree of technical sophistication. While some hunter-gatherers do consume fermented foods, these foods aren't generally the product of sophisticated techniques, but rather of just letting foods (like meat) spoil or finding food already in the process of fermentation.
Another whole category of food that basically can only exist under sedentary agricultural conditions is grain, as grains a) can't be gathered in sufficient quantities in the wild to be worthwhile and thus need to be grown in large quantities and cared for accordingly, b) require threshing, hulling, winnowing, and milling before they are edible, steps which are only practically achievable with various specialized tools and machines, and c) being seasonal harvests, require suitable storage conditions that will keep them dry and free from pests, again implying sedentarism.
Probably the closest analogue to modern veganism is found in the East Asian Buddhist vegetarian traditions, which, because dairy foods were not consumed in these cultures, relied more heavily on the versatility of the soy bean, leading to the innovation of many unique and iconic vegan foods such as the aforementioned soy products as well as tofu. An imitation meat made from wheat gluten also exists and is part of the traditional East Asian Buddhist diet. It can be said that modern veganism owes much to East Asian Buddhism, and, as with modern veganism, East Asian Buddhism owes much to vast fields of soy beans, none of which the monks cared for themselves. East Asian Buddhist monks were not traditionally very physically robust or even healthy. They generally consumed one meal a day. They did not have to be healthy in the same sense that people who, say, farmed for a living needed to be healthy. Since monks didn't devote any time to growing or cooking their own food (they needed that time to study sutras and perform religious functions), East Asian Buddhism doesn't really qualify as a sustainable culture. The monks fed off of the labor of others, the same as the clergy of any other organized religion. I've heard that Buddhist monks were originally prohibited from growing their own food, since this would entail tilling the soil, which results in the death of soil-dwelling animals. Generally, pre-modern monks were fed strictly off of the donations of faithful villagers. Thus, the case of East Asian Buddhism doesn't really apply as an example of a fully vegan culture, since monks basically did nothing directly to feed themselves, and the farming culture that supported the monks certainly was not vegan.
In conclusion, an exclusively plant-based diet (even one that includes bacterial and fungal foods), because of the difficulty inherent in procuring the nutrition that humans require to remain reasonably healthy, absolutely requires advanced agricultural knowledge, capability, and labor. Again, to my knowledge there are no pre-modern vegan cultures, most likely because such a diet is inefficient, unhealthy, and impractical.

Sunday, February 15, 2015

Scientific vs. Magical Thinking

The text below is from a personal communication with a friend in a slightly revised form. It comes out of a discussion regarding the role of science in opposing industrial civilization and whether or not a connection to nature requires magical thinking.

I believe scientific thinking requires the ability to accurately record and preserve data. Of course, the human mind and, in a broader sense, human cultures are well-suited for storing certain kinds and amounts of data, which is how individuals and societies compile, organize, and summarize learned information about the world. However, it seems that cultures without written language necessarily resort to mnemonic devices such as myths, parables, legends, songs, poems, and other aspects of oral tradition in order to preserve and pass down knowledge, and this method always entails simplification, distortion, and elision simply because it is not practical to memorize and represent information "literally". Things were better conveyed and memorized (and were more entertaining around a campfire) if they were represented using basic archetypical symbolism, a sort of shorthand. Thus people could learn what was poisonous, when was the best time to catch a certain kind of fish, and so on from myths, which of course would typically be couched in magical terms. Magical thinking is an efficient way for pre-literate humans to condense and convey tomes' worth of empirically-gleaned information. If you have writing, you can start to reduce the dependence on magical causation because you can write down and refer back to more objective, precise, and accurate information. No scientific papers need to rhyme, and most don't readily roll off the tongue. Their primary advantage is that they can convey information symbolically without having to resort to a mnemonic shorthand, but it's probably impossible for most people to memorize even a few scientific papers replete with data tables and graphs word for word. If we do away with things like cloud servers, thumb drives, the internet, places to store hard drives and books, and even access to writing in general, then it seems to me that science would be limited to what could be remembered solely in the brains of the people in any given society, and there would be no safeguards against "data corruption", i.e., a regression into magical thinking for the sake of easier memorization of scientific facts at the expense of accuracy. This can happen very quickly: an interruption in formal schooling lasting even a single generation can plunge a social group back toward near-illiteracy. Science presupposes constant upkeep. I personally have no problem with a slide back toward magical thinking (more magical thinking means less ability of humans to actually impact their environment), but if we lean on science as a means to the end of abolishing industry, we will then have to make an exception for the things that keep scientific thinking from devolving back into magical thinking before we've accomplished said goal. In other words, in order to use science to take down industry, which I agree is a very good goal, we have to not really take down all of industry, because science depends on industry to a very large extent. Again, I don't just mean microscopes and satellites, things that are obviously the products of industrial activity, but mostly just the means of recording and storing accurate data. If you feel that it's okay to sacrifice some accuracy and precision and to just stick to the basic outlines of some scientific theory, then I feel that you're already veering back toward magical thinking, because I think by definition anything that isn't strictly scientific is to at least some degree faith-based or magical.
Correlation will again start to be interpreted as causation in the natural world, and magical thinking is all about correlation. You would essentially be doing the same thing that those anti-vaccine or anti-GMO people are doing, because they are also using a form of magical thinking. This is why I said it's like a catch-22--the necessity of keeping science viable presupposes some degree of industry, so how to get rid of one by using the other? Perhaps it's possible that science will become less important as industry starts to flag, but it could just as easily be that one of the first things to go in the struggle against industry is science, precisely because of how dependent accurate science is on things like the internet and an international community of scientists able to share information freely. It might be that attacking industry would just take away science as a viable tool to use against industry, which is more or less my feeling right now, and that ignorant pseudo-scientific fanaticism might end up leading the charge anyway. You're right that that sort of movement doesn't really have much staying power, but unfortunately that might be the default mode of humans sans civilized science, and maybe the irrational passion of such a movement or number of movements could still have a significant role to play against industry--the Gothic barbarians sacking the edifice of Roman civilization, as it were. 

I know you distinguish between several types of science, and I may very well be conflating the different types, but I don't think I know of any way to reconcile any notion of science with a lack of writing and libraries, physical or otherwise. I still feel that it's preferable for the world to just devolve back to magical thinking rather than risk using science as a tool against industry, which could backfire as I argued above. Magical thinking in and of itself can be a threat to industry (which is why the left hates this so-called "ignorance" and pushes science in its education) if a sufficient portion of the population subscribes to it. Take the anti-vaccination controversy as an example. The scientific and medical communities in the US are appalled and deeply concerned about the anti-vaccination trend because it poses a serious public health threat that could ultimately contribute to a destabilization of the economy and national security. These are both things that I wouldn't mind seeing, not to mention a reduction in population and productivity. Industry is necessary to prevent huge outbreaks of disease in densely populated areas, and to accept the invasiveness of industry requires training in scientific thought; otherwise you'll just say fuck off, don't touch my kids, Jesus or Allah or whatever other magical notion will protect them. In this example, magical thinking would be quite helpful in undermining the stability of industry.

As far as connecting with nature, I do believe that a magical or mystical mindset is necessary and probably also a default state of most children prior to civilized education. For example, it's not as though the native Hawaiians were on the verge of dying off from lack of knowing how to get food and medicine before Europeans visited them, this despite the fact that Hawaiian culture was pre-scientific and based on a magical religion in which various deities had to be appeased in order for certain things, like gathering potent medicine or food, to be achieved. Now, worshiping gods probably had little to no real-world effect, thus it was unscientific, but it did encourage a certain kind of relationship with nature--that is, you had to ask for something before you took it, as though nature were full of other people who happened to be fruit trees and medicinal roots and fish. You don't just take stuff from other people because they'll get angry, so you ask. This tends to discourage things like clear-cutting and overfishing. From a scientific point of view, you don't have to ask anything that isn't human for anything you want. It might destroy a mountain to mine all its ore, but the mountain will never get mad at you, and the animals and plants won't ever take revenge. A scientific person knows better than to believe otherwise. This is why he can do such things. I like the inhibitory effect of magical thinking on such destructive activity. However, there's another point here. Those who study the natural world utilizing a scientific framework, even though they may genuinely love the natural world, are always distanced from it because of the subject-object relationship that underwrites any scientific enterprise. So, for example, if I love my friends and want to understand them better, I don't tap all their phones and hack their emails and keep detailed dossiers on their activities, nor do I breed them or dissect them. I could learn a lot from doing so, but then they're not my friends, they're objects of study, and I miss out on an actual friendship with them. If I didn't know better, I could easily mistake study for communion, but they are decidedly different things, at least to me. There are things about my friends that they will never reveal to me, and it's important that I don't attempt to find those things out. If I want something from them, I have to ask permission and reciprocate. I have to respect their autonomy in order for our friendship to be real--they're whole people, not things for me to dissect. Taking such an approach to the study of nature, however, would be decidedly unscientific. In this sense, science helps contribute to a psychopathic attitude toward nature. One definition of psychopathy is the belief that no one else is real and therefore that no one else qualifies for the same considerations that the psychopath reserves for himself. The materialistic, non-mystical, science-based worldview also treats the natural world as basically uninhabited by any sentience worthy of consideration, so it's okay to stake your claim and milk it all dry. This kind of worldview could not exist prior to science because magical thinking was the only game in town. Science, of course, is critical for developing sophisticated technology, and the ability to manipulate the world at even the subatomic level certainly goes a long way toward reinforcing the subject-object relationship and the consequent psychopathy.
This is why I believe that a mystical orientation toward the world and an acceptance of the fundamental inscrutability of the natural world (hence magic) is necessary in order to live in connection to nature without utterly destroying her. It's also very lonely to not attribute a magical intelligence to the natural environment, which I feel is an under-appreciated source of mental illness in modern societies and leads to a need to "fill the void" with our own civilized likeness.

Monday, February 2, 2015

Three Short Theses on Violence

I. Primitivists need to jettison non-violence as an ideal. Hunter-gatherer cultures are too varied with respect to violence to safely form any generalizations. As a general rule, it seems that forager groups who live in large territories, are highly mobile, or enjoy relative isolation, such as the Hadza, are less prone to violence, whereas foragers who live in close proximity, such as the Asmat cannibals of New Guinea famous for eating Michael Rockefeller, tend to be more warlike. Archaeologist Lawrence H. Keeley attempts to vindicate civilization in his book War Before Civilization: The Myth of the Peaceful Savage with evidence of pervasive prehistoric violence in both agricultural and forager societies. (While I have not yet read the book, I have no doubt that he has plenty of legitimate evidence of hunter-gatherer violence, though how pre-civilized inter-human violence that mostly left nature intact could be less desirable than a peaceful civilization that will unambiguously ruin the planet for a majority of life forms is less clear.) From the book's Wikipedia entry (notes in brackets are my own):
One half of the people found in a Nubian cemetery dating to as early as 12,000 years ago had died of violence. The Yellowknives tribe in Canada was effectively obliterated by massacres committed by Dogrib Indians, and disappeared from history shortly thereafter [Note: Not really. The Yellowknives, while suffering massive losses, did not actually "disappear" from history; a small number of survivors continue to live on in Canada]. Similar massacres occurred among the Eskimos, the Crow Indians, and countless others [Note: Yellowknives, Dogrib, Eskimo/Inuits and Crow were all pre-contact hunter-gatherers]. These mass killings occurred well before any contact with the West. In Arnhem Land in northern Australia, a study of warfare among the Australian Aboriginal Murngin people in the late-19th century found that over a 20-year period no less than 200 out of 800 men, or 25% of all adult males, had been killed in inter-tribal warfare [Note: All of Australia's indigenous peoples were hunter-gatherers prior to European contact].
Then there is the intra-group violence of many, many forager cultures, such as virtually all of the Aborigines of Australia (PDF):
The particularly high level of violence against women was a feature of pre-contact Aboriginal Australia. First contact explorers and colonists noted with distress the terrible scars and bruises that marked the women due to the frequent brutality of their menfolk. Sutton and Kimm point to Stephen Webb's palaeopathology studies which verify that violence against Aboriginal women was prevalent for thousands of years right across the mainland continent. Webb analysed 'trauma using 6,241 adult post-cranial bone samples and 1,409 cranial samples from prehistoric remains derived from all major regions of Australia except Tasmania'. He found that female cranial injuries, of a kind indicating 'deliberate aggression', were more frequent than male cranial injuries.
Such violence is attested in forager groups' own myths and stories. They are not ashamed of it but rather derive a significant part of their identity from it. Violence is always interpreted through the lens of culture. Often, harming those from an outside group is tolerated or even encouraged, whereas violence against one's own is sometimes frowned upon, but sometimes also tolerated. Receiving violence is usually inversely weighted: violence that comes from outside one's group is a greater concern than violence from within the group. We don't need to force the ideal of non-violence onto forager identity. Nature does not judge violence and seems satisfied to let many interactions within her realm be defined by intense brutality. Certainly few non-domesticated life forms are strangers to violence. If we want to live with nature--that is, if we want to survive for the long term on this planet--then we should relearn to accept violence and cultivate the maturity that all forager groups exhibit when confronted with struggle, death, abuse, and disease, rather than imposing artificial ideals like justice, equality, non-violence, etc., that, frankly, arise chiefly from the civilized mindset as foils to wildness. Only when we learn to be satisfied with the Dao of nature and stop judging its finely tuned systems of violence and death will the urge to create a "better" world using technical means be curbed and our planet be spared.

II. There is no way to live on this planet without participating in violence. Violence is a big part of the Dao of nature. Life sustains life, but death also sustains life. It is true that most life forms show an aversion to death, but this by no means indicates that death is somehow wrong. Life's calibrations reference a bigger picture. Take human reproduction as an example. Say fifty million sperm cells vie for a single egg all at once. All fifty million strive to reach that egg, but typically at least 49,999,999 simply die without ever accomplishing their goal. If each sperm were not completely driven to fertilize an egg, or if there were fewer sperm and therefore less "competition", the egg might not get fertilized. If every sperm could fulfill its desire and fertilize its own egg, the world would be comically overpopulated (even more so than it is today). The way the Dao has prescribed it, it is necessary for the majority of sperm to die in ignominy in order for life to continue the way it is supposed to. Not everyone gets what he or she wants, and that's the way we should want it to be. Likewise, most "higher order" life forms like mammals and birds have above a 50% die-off rate before offspring reach reproductive age. This includes non-domesticated humans, which is why calculated hunter-gatherer life expectancies are so low. Nature counts on that percentage in its designs. To nature, 50% is by no means high. Since our lives depend on unencumbered wilderness, we need to learn to embrace facts like higher infant mortality. Anything else is just fighting against nature, and that would undermine any anti-tech critique, as the only thing that lies outside of nature would be the artifice achievable only via technical means.
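To see how that die-off rate skews the familiar statistics, take some illustrative (not ethnographic) numbers: if half of all children die by an average age of 5 and the survivors live to an average age of 70, then mean life expectancy at birth works out to 0.5 x 5 + 0.5 x 70 = 37.5 years, even though the typical adult in such a group dies around 70. The low published figures mostly reflect infant and juvenile mortality, not short adult lives.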

III. Those who choose not to eat animals believing that they are reducing suffering delude themselves, though their intentions may well be noble. While it may be possible to survive strictly off of gathered wild plant and fungal foods (though I highly doubt it), one would surely be, at the least, severely malnourished, as nutrients such as protein, fat, and several vitamins are not easily obtained outside of animal sources. Plants in general tend to be very poor in protein, with the exception of legumes and some grains, which, of course, cannot be gathered in adequate quantities in the wild and therefore presuppose agriculture. I'm sure it's not necessary here to go over how agriculture harms the planet, let alone the animals that vegans and vegetarians claim to be sparing. I can't say the same kind of harm would arise from natural predation relationships, including humans hunting animals, and a strong case can be made that ecosystems actually depend on animals killing other animals. To judge predation negatively as violence is rather absurd. Of all the possible ways of obtaining food on this planet, hunting and gathering leave the most nature intact, even when that hunting results in extensive species extinctions. In the worst-case scenario, hunters who have hunted all possible game into extinction will themselves soon perish or else learn to be less profligate in their harvesting, allowing for a quicker rehabilitation of the ecosystem, since much more nature will have been left intact than what a failed agricultural society leaves in its wake. Low-tech hunting and gathering are still the lightest way to tread on this planet, and one of the many reasons why civilization is inherently problematic is that it can by no means accommodate this lifestyle. Take civilization as a given and we are left with only bad choices: the large-scale suffering of conventionally-raised food animals, impractical and often unaffordable "ethically raised" animals, vegetarian and vegan diets that must make up for nutritional deficiencies by relying on ecologically-destabilizing agriculture, and genetically-modified crops and animals.

Tuesday, January 20, 2015

The Ghost of Technology Future: A Review of Black Mirror

Note: The following review was originally published by FC Journal, which is now defunct, so I've reproduced the review here.

The British science fiction television series Black Mirror draws from a tradition defined by genre paragons like The Twilight Zone and The Outer Limits, shows that attempt to articulate and explore what Freud termed ‘the uncanny’—that which is at once familiar and yet strange, a paradox that defines the modern condition. However, whereas past tales of the uncanny resort to invoking extraterrestrial forces or elements of the supernatural, Black Mirror represents an important realization: when you’ve got advanced technology, the notion of the supernatural becomes redundant. Through two seasons and a Christmas special of blisteringly clever hour-long episodes, series creator and main writer Charlie Brooker delivers a trenchant satire of modern technology. While each episode tells a different and unrelated story with a different cast, the theme of the technological uncanny looms large throughout. The technology depicted in the series, much like the technology of our own world, estranges people from their own reality, generating uncanny situations like reliving the same day again and again or encountering the doppelganger of a deceased loved one. Sharp, original, often harrowing, and unexpectedly haunting, Black Mirror excels at revealing the implacable foreignness that dwells at the heart of even our most pedestrian technological contrivances.

In contrast to what one might expect from typical science fiction, each episode of Black Mirror portrays hypothetical technology that, far from being a revelation, is largely just a modest extrapolation of our own current epidemic of smartphones, social networking services, and wearable gadgets into the very near future. It’s science fiction, but only just barely. A service that generates the textual and vocal likeness of a deceased loved one based on his internet activity during life (S2:E1 “Be Right Back”) is science fiction only in the technical details. The ability of technology to produce a reasonably accurate psychological profile of a person from decades’ worth of hourly status updates, tweets, and Google searches is already upon us. A machine that induces memory loss (S2:E2 “White Bear”), a surgically-implanted and neurologically-wired device that records everything you see and hear via your eyes and ears (S1:E3 “The Entire History of You”), a society that operates as a social network incarnate (S1:E2 “Fifteen Million Merits”)—the hypothetical scenarios featured in each episode directly reference tech that either already exists or else would probably not be surprising to see in a few years’ time, and this helps make some of the episodes seem remarkably realistic and believable.

The narratives are facilitated by able and affecting performances from a high-profile roster of British and American actors, including Toby Kebbell (Dawn of the Planet of the Apes), Rory Kinnear (The Imitation Game), Hayley Atwell (Captain America: The First Avenger), Rafe Spall (Life of Pi), Jessica Brown Findlay (Downton Abbey), and Jon Hamm (Mad Men). However, some of the finest moments on the show come from actors who may be less familiar to American audiences, including Jodie Whittaker as Ffion, who struggles to convince her husband that not all truth can be found in the apparent objectivity of a recorded past, and Daniel Kaluuya as Bing, a young man trapped in a screen-based society that interacts chiefly through social network avatars and crass advertisements. As the episodes feature mostly different casts, it’s not possible here to do full justice to each and every superlative performance, so suffice it to say that the performances throughout the series are consistently convincing, robust, and deliberate in a way that makes the characters and stories feel thoroughly real even in the midst of the most absurd or fantastical scenarios, which speaks to the competence of the episodes’ directors as much as to the talent of the actors. The series’ cinematography and visual effects succeed in framing and accentuating the performances and help color each scene with an additional layer of emotional resonance—the bleary morning following a sleepless night of obsessing over a video clip, the garish glow of a cartoon celebrity advertisement lighting up a urine-soaked underpass, the creeping claustrophobia of a snowed-in house—resulting in an atmosphere of near pitch-perfect malaise. Each story often instills a sense of ethical or even existential disquiet that will linger for hours, days, or, if one isn’t too distracted, even weeks. This was the case for this reviewer, who found it difficult to watch several episodes in a row, needing a day or two in between viewings to digest and recover. A sort of cognitive vertigo takes hold as the mind struggles to reconcile the many moral conundrums left open at the end of each episode. The tech that defines the lives of the characters in these episodes, much like the tech that surrounds us in the real world, mediates even the most intimate aspects of life—sex, death, memory—and, in facilitating their experiences, reduces their humanity to spectacle and entertainment. It is impossible not to feel this degradation as a palpable weight on the soul after each viewing.

As such, the series is decidedly bleak, offering virtually no suggestions for solving the problems it raises, and this might count as its chief deficiency. Perhaps it has no answers, or perhaps it is implying that there can be no answer to the problem of technology, only resignation. Thus we watch as characters resign themselves to their defeat and humiliation, endure perpetual torment, or even acquiesce and convert. Black Mirror doesn’t seem interested in asking how we got to where we are or where we seem to be headed, and so doesn’t quite count as a cautionary tale or allegory. Rather, the show seems content to simply reflect back at us the true bleakness of the technological world that entraps us without speculating on what we could or should do, a wake-up call as opposed to a battle plan. Part of what makes Black Mirror an exceptional series is its insistence on approaching the subject matter unequivocally, without the apologism that is typical even of dystopian science fiction that levels some half-hearted critique at technology. Thus it avoids the usual pitfall that causes lesser sci-fi tales to undermine their own themes when they inevitably try to salvage technology, rehabilitated or otherwise somehow pardoned, from where it really belongs—oblivion. Black Mirror combines compelling narrative, incisive wit, and cerebral cinematography to articulate technology’s inherent duplicity in a laudable and sorely needed effort to reawaken us to the progress that encroaches upon all facets of our lives. In this, the series hopefully represents the beginning of a shift in industrial society’s attitude toward technology as embodied in our fiction—a move away from the glorification and worship of machines that characterizes the majority of science fiction and a much-needed step toward skepticism and apprehension of the techno-industrial enterprise. Black Mirror provides us with a reflection more faithful than most, unadulterated by the typical propaganda and contrived happy endings, so that we might manage to pull out the wiring in our brains for a moment and take a clear look at ourselves for the first time in a long time.

Tuesday, December 16, 2014

Extolling the Human Scale in the Dao De Jing

Here are two passages from the Dao De Jing that summarize the importance of the human scale, which is characterized by insularity and rejection of high technology. Let them serve as an epigraph for my post on the Islamic State and the human scale.

The more taboos and inhibitions there are in the world,
The poorer the people become.
The sharper the weapons the people possess,
The greater confusion reigns in the realm.
The more clever and crafty the men,
The oftener strange things happen.
The more articulate the laws and ordinances,
The more robbers and thieves arise.
--Dao De Jing, Chapter 57
 
Ah, for a small country with a small population! Though there are highly efficient mechanical contrivances, the people have no use for them. Let them mind death and refrain from migrating to distant places. Boats and carriages, weapons and armour there may still be, but there are no occasions for using or displaying them. Let the people revert to communication by knotting cords. See to it that they are contented with their food, pleased with clothing, satisfied with their houses, and inured to their simple ways of living. Though there may be another country in the neighbourhood so close that they are within sight of each other and the crowing of cocks and barking of dogs in one place can be heard in the other, yet there is no traffic between them, and throughout their lives the two peoples have nothing to do with each other.
--Dao De Jing, Chapter 80 (transl. John C. H. Wu)

Friday, December 12, 2014

On Defining the Natural

Of central importance to anarcho-primitivist and Daoist thought is the concept of nature. Without the distinction between, say, technology, progress, civilization, and domestication on the one hand, and a notion of an unmanipulated, pristine natural state on the other, an anti-civilization critique becomes impossible. Both sides know this all too well, and both continue to commit much energy to either elucidating or discrediting this idea of nature. The resulting contest is predictable and tired: anti-civilization folks stressing the unnatural suffering and ruin inflicted by civilization, post-modernist defenders of civilization insisting that any distinction between nature and civilization is a delusion. It is therefore understandable that AP thinkers spend much of their time dragging out evidence that demonstrates how civilization and nature are at odds--the entire plausibility of the AP critique would seem to hinge on there really being such a thing as a "state of nature".

Unfortunately, that ideological position is fraught with issues. And it really is ideological--little or no scientific evidence actually supports the notion of a static state anywhere in the "natural" world. Species come in and out of existence with or without the meddling of humans, mountains rise and fall, volcanoes become lakes, lakes become forests, forests become deserts, deserts become oceans, fish become frogs, dinosaurs become birds, quadrupeds become bipeds. The notion of a "state of nature" in the West probably derives more from biblical influence than from actual science. In reality, there is no point in any given ecosystem's history at which one could point and declare that to be the definitive state of that ecosystem. Consider the current hysteria over invasive species. We'll take the decidedly unscientific war on invasive plants as an example*. In order for there to be such a thing as invasive plants, there has to be a notion of native plants. In other words, it is ideologically necessary to construct a myth of native plants, who have "always" been here, and whose venerable existence is under threat from invasive plants, who have no place being outside of their native environments. However, virtually all plants have been highly mobile historically and, in many cases, can change their range annually via seed dispersal and clonal propagation. The planet undergoes periodic warming and cooling, with direct and dramatic implications for terrestrial (and aquatic) flora and fauna. Long before humans roamed North America, for example, plants adapted to warmer climes were able to make their way up to more northerly latitudes during interglacial periods, only to retreat back southward as things got cooler again. Cold-adapted plants enacted a reverse migration during ice ages. Plants have also been able to access remote places like the Hawaiian Islands, the most isolated landmass on the planet and something of a "novelty" themselves in terms of their relatively recent appearance, and colonize them thoroughly despite thousands of miles of interceding ocean. New species arrived regularly throughout the islands' history, negotiating new relationships with the established plants and animals, sometimes dying out, sometimes overrunning, and sometimes striking a balance between those two extremes. Do none of these incursions into new territory count as invasions, or is it all one long story of invasion? How should one decide which dispersal events count as natural and, therefore, acceptable and desirable, and which count as unnatural and disruptive? If nature is conceived of as a pristine condition that existed before civilization, then the occurrence of novel species and chemical compounds must be considered unnatural despite the fact that such novelties predate the existence of humans. On the other hand, if such novelty is to be considered naturally occurring after all, then there never was a pristine, primeval state of nature to begin with, just a constant series of changes. It is precisely in this light that one could argue that civilization is nothing but a direct product of natural evolution and thus as natural as a coral reef. Thus, we anarcho-primitivists can rave all we want about all the awful things technology and civilization are doing to the biosphere and to ourselves, but without convincingly and coherently addressing this glaring paradox, the AP critique essentially has nowhere to go--it is trapped under the weight of its own contradictions.
Defenders of civilization will accuse us of romanticism and cherry-picking, and, until we adopt a meaningful definition of nature, they will be right.

The challenge, then, is to define something that is constantly in flux. Owing to biases that can be said to be traditional to Western philosophy (peripheral thinkers like Heraclitus excepted), which prefers to conceptualize things in distinct and unchanging categories for the sake of elucidating permanent and universal truths, the nuances of nature as flux have never sat well. If science now accepts these nuances (quantum mechanics, evolution, chaos, relativity), it does so begrudgingly, as evidenced by lingering inertia in the various branches of science. For example, biologists continue to use the Latin binomial taxonomic classification system for organisms despite the fact that such a classification scheme is in many ways contrived and fails to reflect the constant genetic evolution that proceeds from one generation of a species to the next. Occasionally, a newly discovered organism throws the entire classificatory system into disarray, requiring the renaming of several members of a genus or even family. The system is furthermore not useful for illustrating ecological relationships. Rather, it is artificially imposed for the sake of maintaining the myth that the world can be organized into discrete parts, like a machine. It is revealing that, by contrast, indigenous people consistently demonstrate remarkable awareness and familiarity with virtually all organisms sharing their environment, often even surpassing the ability of researchers to identify individual species and their habits, without anything approaching the Latin binomial system. For example, a type of tree might be customarily called an elder out of recognition for its role in an ecosystem. This name immediately conveys ecological information based on local function, whereas the scientific name Sambucus canadensis, created in a language nobody even speaks anymore, indicates nothing more than the fact that this plant is related to other Sambucus trees in far-flung regions of the world, and that this particular tree exists in North America (hence canadensis--"of Canada"). The scientific taxonomic system confers a distinct identity that is intended to be absolute--Sambucus canadensis is Sambucus canadensis even when it is in Europe, or Australia, or on the International Space Station. No part of its identity need derive from any given environment--every organism an island unto itself, and nature a random assortment of pieces. However, in a traditional manner of identification, if the tree is called an elder tree and understood to be an ecological elder to other species, then it matters little whether, from a scientific viewpoint, the tree is canadensis or chinensis; as long as it is willing to fulfill the same role, it is the same tree. In other words, an entity's identity derives from its context, not from an innate, genetically-derived identity--the latter view depends on a sort of scientific delusion that is in denial of flux as the essential characteristic of nature. Science and, more broadly, Western philosophy exhibit a fixation on absolute categorization and hierarchical ranking. There is an implied assumption that precise categorization leads directly to understanding. Appreciation for the larger trajectory of an ecosystem's evolution over time is lost in the constant focus on discerning details such as the relatively spurious differences between individual related species.
This focus on the minutiae of the biosphere comes at the cost of lost perspective on the course of change on the planet as a whole--failing to see the forest for the trees, both figuratively and literally. As a result, our science--the study of nature--actually has not the slightest wherewithal with which to judge something such as civilization to be natural or unnatural, and our society's intuition (or lack thereof) about what can be considered natural versus unnatural is accordingly deeply flawed. The debate over whether anything can even be considered "outside of nature" is therefore largely relegated to philosophy--in other words, indefinitely shelved.

The principal issue with defining nature seems to center on the existence of novelty. For example, we know that terrestrial animals evolved from marine animals. A sort of fish, the story goes, gradually came to spend more and more time on dry land until a branch of its progeny evolved to be able to survive outside of the water. The first fish to crawl on dry land, then, was certainly a novelty of the first order. Even a child knows that a fish naturally belongs in the water--it quickly dies, otherwise. So, does that fact make all terrestrial animals the product of the unnatural behavior of that pioneer fish? Of course, that position seems hardly tenable. However, if learning to live on land should not be considered unnatural for that ancient fish, then why should, say, tossing a live mackerel into the middle of the Alps, or rocketing various animals into orbit, be thought any more unnatural? The answer seems to have to do not with change in and of itself, but rather with the pace or rate of change. Tossing the mackerel onto the side of a mountain dramatically exceeds the rate at which the fish can adapt, and thus the Swiss Alps should be considered an unnatural environment for mackerel and all other fish. However, this fact does not preclude the possibility that, if allowed sufficient time and measured exposure to a foreign environment, a creature adapted to one environment could eventually undergo enough changes over generations to survive in a new one. The critical element that decides whether a novelty in the environment is natural or unnatural is therefore the rate at which that change takes place. Thus, the question that should be asked about, say, genetically modified salmon is not whether it is ipso facto "natural" for a salmon to consume soy and corn all its life inside a farm, but rather how long it would take for such a change in salmon adaptation to occur spontaneously--that is, without genetic modification. Obviously, such changes would take much longer, far too long for enterprise and capitalism's needs, hence GMOs. Nature has to work much more slowly because any change in one species implies a ripple effect that prods every other aspect of an ecosystem, and possibly the entire biosphere, into adjusting to that change, and every secondary change generated this way entails further changes, and so on: non-linear, multilateral feedback loops that are constantly active. There is therefore no stable state devoid of change per se, but there is a relative stability in never-ending flux and adaptation so long as the pace of change does not exceed the parameters of this natural tempo. This speaks to a kind of inertia that exists across all ecosystems, one that both discourages sudden aberrations from occurring and absorbs some of the shock to the system that results when such an aberration does occur. It is precisely because this inertia gets flouted in order to effect all of civilization's myriad novelties that every technological advance is decidedly unnatural. Technology, by definition, seeks to achieve something that would not otherwise result at nature's pace of change, and technological innovation is essentially the outstripping of natural change. There is no comparing that sort of artificial innovation to the evolution of a rhinoceros' horn or a bat's wings.

Nature's pace of change is itself variable and based on the totality of circumstances within the biosphere at any given moment--this is why long periods of stability in an ecosystem will tend to promote slower change, whereas ecosystems undergoing sudden and dramatic disturbances will tend to see cascading changes at a faster rate. It might be useful to visualize the phenomenon of change in terms of a construct of scaffolds, similar to the scaffolding used while erecting a building. A naturally-occurring novelty within an ecosystem can be represented as a new scaffold built atop older scaffolding. Over time, this scaffold becomes integrated into the overall structure when it in turn serves as part of the immediate foundation for newer scaffolding. What would have previously been structurally impossible or unstable (unnatural) can become not only possible but necessary over time as the entire structure builds up in a steady and stable manner. The scaffolds build up like a pyramid, with the base growing wider as the structure grows taller, and with each scaffold interlocked with many other scaffolds both adjacent and distant, yielding additional stability through redundancy. This means that the removal or destruction of too many of the scaffolds at any point will have implications for the integrity of the entire structure. An occasional loss here and there under normal conditions doesn't present a real threat to stability, and even significant damage can be repaired, holes patched, and failing parts replaced or even strengthened beyond previous limits. In the absence of further disturbance, it is possible for nature to incorporate an imbalance over time into a stable structure by building scaffolding to support the flaw and then eventually strengthening it so that it can itself support additional scaffolding and bridge to the other parts of the structure.

Occasional disturbances can be thought of as a way for nature to test and improve upon stability. Witness the feral revival ongoing at Chernobyl, even while radiation levels remain too high to allow humans to move back in. Because there was one catastrophic event followed by a period of relative stability, wildlife such as birds and deer have reclaimed the contaminated land at Chernobyl, though with visible effects from the lingering radiation. However, if disturbances occur at a pace that outstrips the rate at which nature can adapt to them, the resiliency of the entire structure is put at risk. As Homo sapiens, we, of course, also live on this structure that we are currently destabilizing. We achieve this destabilization by trying to erect a parallel structure on top of nature's structure. We call this parallel structure civilization. Civilization is erected by raiding the natural world, dismantling the various scaffolds that hold this or that aspect of the biosphere in place, and repurposing them for much narrower human designs. At a small enough scale, the strain placed by civilization on nature's structure is tolerable if not negligible, especially if the pace of civilized development does not exceed the pace of nature's own "construction", which at the outset of, say, the Holocene (11,700 years ago) would have been robust enough to withstand the first few millennia of civilized humans. However, with the advent of industrial means of production, the rate (economists call it "efficiency") of civilized development suddenly began to increase exponentially. More and more holes in the natural structure started to appear as the raiding of the natural world began to proceed at an increasingly reckless rate. While some of those who lived at the margins of this artificial structure of civilization could perceive its destabilizing effects on nature, most of those who lived within civilization could see only progress developing at faster and faster rates while remaining insulated from the effects of this progress on the entire framework of life. The existence of an alternative structure within nature began to encourage the development of a myopic, civilization-centric mindset. The fatal flaw of this way of thinking, of course, was always the mistaken belief that civilization exists parallel to, and thus outside of, the natural world, when in reality it is an inescapable fact that civilization rests upon the very structure that it is literally dead set on tearing down. For civilization, there is no alternative to destruction of the natural world, and therefore it has no other path than self-destruction. Even without the constant acceleration of growth in civilization, the sheer strain of a civilization of seven-going-on-nine billion sapiens, and the bare minimum amount of materials and pollution required to keep them all alive to a standard that the political left finds acceptable, easily outstrips nature's ability to provide at the given rate of change many times over. Civilization, therefore, is not an alternative or successor to nature, but more like a tumorous growth that feeds off of nature even as it destroys her. Tumors are not a type of external pathogenic condition in organisms, but rather a sort of imbalanced "runaway" growth of endogenous cells that no longer respond to normal signals to stop multiplying. Normal rates of cell growth are not typically problematic, nor are the cells themselves inherently dangerous under nominal conditions.
Cells normally divide in order to replenish dead cells and maintain homeostatic functioning within an organism. Once again, we see that it is not the fact that things change (in this case, cellular mitosis) that is the problem, but rather the rate at which that change occurs (the difference between homeostasis and cancer). One can then surmise that the "inevitability" of civilized progress is, like cancer, actually conditional. To consider civilization natural would be the same as accepting cancer as being the normal state of an organism.

When discussing rates of change in nature, it seems worthwhile to contemplate mass extinctions, as they provide illuminating examples of the principle of nature as rhythm. In particular, the extinction of most of the world's terrestrial megafauna (outside of Africa) at the end of the last ice age has deep implications for anarcho-primitivist theory regarding the inherent sustainability of hunter-gatherer society. While the debate between the "overkill" theorists and the "climate change" advocates continues to play out, the evidence seems to more heavily support the overkill-by-humans argument without completely denying climate change's significance. Obviously, the humans in question were purely hunter-gatherer, except perhaps for the domestication of dogs as companions and/or hunting partners ca. 30,000 years ago. From Wikipedia:

Outside the mainland of Afro-Eurasia, these megafaunal extinctions followed a highly distinctive landmass-by-landmass pattern that closely parallels the spread of humans into previously uninhabited regions of the world, and which shows no correlation with climatic history (which can be visualized with plots over recent geological time periods of climate markers such as marine oxygen isotopes or atmospheric carbon dioxide levels). Australia was struck first around 45,000 years ago, followed by Tasmania about 41,000 years ago (after formation of a land bridge to Australia about 43,000 years ago), Japan apparently about 30,000 years ago, North America 13,000 years ago, South America about 500 years later, Cyprus 10,000 years ago, the Antilles 6000 years ago, New Caledonia and nearby islands 3000 years ago, Madagascar 2000 years ago, New Zealand 700 years ago, the Mascarenes 400 years ago, and the Commander Islands 250 years ago. Nearly all of the world's isolated islands could furnish similar examples of extinctions occurring shortly after the arrival of Homo sapiens, though most of these islands, such as the Hawaiian Islands, never had terrestrial megafauna, so their extinct fauna were smaller.

An analysis of Sporormiella fungal spores (which derive mainly from the dung of megaherbivores) in swamp sediment cores spanning the last 130,000 years from Lynch's Crater in Queensland, Australia showed that the megafauna of that region virtually disappeared about 41,000 years ago, at a time when climate changes were minimal; the change was accompanied by an increase in charcoal, and was followed by a transition from rainforest to fire-tolerant sclerophyll vegetation. The high-resolution chronology of the changes supports the hypothesis that human hunting alone eliminated the megafauna, and that the subsequent change in flora was most likely a consequence of the elimination of browsers and an increase in fire. The increase in fire lagged the disappearance of megafauna by about a century, and most likely resulted from accumulation of fuel once browsing stopped. Over the next several centuries grass increased; sclerophyll vegetation increased with a lag of another century, and a sclerophyll forest developed after about another thousand years. During two periods of climate change about 120 and 75 thousand years ago, sclerophyll vegetation had also increased at the site in response to a shift to cooler, drier conditions; neither of these episodes had a significant impact on megafaunal abundance. Similar conclusions regarding the culpability of human hunters in the disappearance of Pleistocene megafauna were obtained via an analysis of a large collection of eggshell fragments of the flightless Australian bird Genyornis newtoni and from analysis of Sporormiella fungal spores from a lake in eastern North America.
The following two passages are taken from Sapiens: A Brief History of Humankind by Yuval Noah Harari. I've reproduced the passages below in their entirety because I feel that they merit reading by anyone who doubts the human factor in prehistoric megafaunal extinction. Taking Australia first as a case study:

Some scholars try to exonerate our species, placing the blame on the vagaries of the climate (the usual scapegoat in such cases). Yet it is hard to believe that Homo sapiens was completely innocent. There are three pieces of evidence that weaken the climate alibi, and implicate our ancestors in the extinction of the Australian megafauna.
Firstly, even though Australia's climate changed 45,000 years ago, it wasn't a very remarkable upheaval. It's hard to see how the new weather patterns alone could have caused such a massive extinction. It's common today to explain anything and everything as the result of climate change, but the truth is that earth's climate never rests. It is in constant flux. Every event in history occurred against the background of some climate change.
In particular, our planet has experienced numerous cycles of cooling and warming. During the last million years, there has been an ice age on average every 100,000 years. The last one ran from about 75,000 to 15,000 years ago. Not unusually severe for an ice age, it had twin peaks, the first about 70,000 years ago and the second at about 20,000 years ago. The giant diprotodon appeared in Australia more than 1.5 million years ago and successfully weathered at least ten previous ice ages. It also survived the first peak of the last ice age, around 70,000 years ago. Why, then, did it disappear 45,000 years ago? Of course, if diprotodons had been the only large animal to disappear at this time, it might have been just a fluke. But more than 90 per cent of Australia's megafauna disappeared along with the diprotodon. The evidence is circumstantial, but it's hard to imagine that Sapiens, just by coincidence, arrived in Australia at the precise point that all these animals were dropping dead of the chills.

Secondly, when climate change causes mass extinctions, sea creatures are usually hit as hard as land dwellers. Yet there is no evidence of any significant disappearance of oceanic fauna 45,000 years ago. Human involvement can easily explain why the wave of extinction obliterated the terrestrial megafauna of Australia while sparing that of the nearby oceans. Despite its burgeoning navigational abilities, Homo sapiens was still overwhelmingly a terrestrial menace.

Thirdly, mass extinctions akin to the archetypal Australian decimation occurred again and again in the ensuing millennia--whenever people settled another part of the Outer World. In these cases Sapiens guilt is irrefutable. For example, the megafauna of New Zealand--which had weathered the alleged 'climate change' of c. 45,000 years ago without a scratch--suffered devastating blows immediately after the first humans set foot on the islands. The Maoris, New Zealand's first Sapiens colonisers, reached the islands about 800 years ago. Within a couple of centuries, the majority of the local megafauna was extinct, along with 60 per cent of all bird species.

A similar fate befell the mammoth population of Wrangel Island in the Arctic Ocean (200 kilometres north of the Siberian coast). Mammoths had flourished for millions of years over most of the northern hemisphere, but as Homo sapiens spread--first over Eurasia and then over North America--the mammoths retreated. By 10,000 years ago there was not a single mammoth to be found in the world, except on a few remote Arctic islands, most conspicuously Wrangel. The mammoths of Wrangel continued to prosper for a few more millennia, then suddenly disappeared about 4,000 years ago, just when the first humans reached the island. 

And touching also on the Americas:

For decades, palaeontologists and zooarchaeologists--people who search for and study animal remains--have been combing the plains and mountains of the Americas in search of the fossilised bones of ancient camels and the petrified faeces of giant ground sloths. When they find what they seek, the treasures are carefully packed up and sent to laboratories, where every bone and every coprolite (the technical name for fossilised turds) is meticulously studied and dated. Time and again, these analyses yield the same results; the freshest dung balls and the most recent camel bones date to the period when humans flooded America, that is, between approximately 12,000 and 9000 BC. Only in one area have scientists discovered younger dung balls: on several Caribbean islands, in particular Cuba and Hispaniola, they found petrified ground-sloth scat dating to about 5000 BC. This is exactly the time when the first humans managed to cross the Caribbean Sea and settle these two large islands.

In reality, people probably should not be so hard on their own species when it comes to these mass extinctions. Other species, given the necessary conditions, have done just as much damage on their own scale. The brown tree snake accidentally introduced to Guam, the cane toad intentionally introduced to Australia, the kudzu vine currently devouring the southern United States*--all have done their fair share of "ecological devastation". Organisms are designed to be opportunistic, and when a creature or plant is introduced into virgin land, it will glut itself on the available resources until some force arrives to strike a new equilibrium with it, whether that force be a new predator, new defensive adaptations by prey, or simply starvation for having eaten up absolutely everything it could. Like the creatures above, the hunter-gatherers who proceeded to exterminate the megafauna of the New World and Australia were acting more or less on instinct, which would explain the symmetry in outcomes between the two locales despite a complete lack of communication between them. They did not intentionally evolve the intelligence necessary to pull off massive mammoth slaughters, and they did not cause a land bridge to appear between Russia and Alaska. People had no way of knowing better. They had an unfair advantage over the megafauna and they kept capitalizing on it until things started to even out. The few surviving megafauna of North America--the bison, moose, and other ungulates, along with the bears and wild cats--eventually got wise to human predation, and it became significantly more difficult to nab one. This is presumably how a natural equilibrium gets established in the wake of a new imbalance, and the more sudden the introduction of a novel element into an established environment, the more time is necessary before equilibrium can be re-established. When the brown tree snake came to Guam, the native birds had no evolved wariness or defenses against snakes at all. Twelve native bird species are now extinct. However, if the snakes were left on the island long enough without human attempts at meddling, they would surely starve before long for having eaten everything there was to eat. An island is a precarious and fragile ecosystem. Witness Easter Island's inhabitants consuming their resources to the brink of self-annihilation. With the Maori of Aotearoa/New Zealand, the initial abundance of the giant flightless moa induced a sort of shortsighted wastefulness in the newly arrived Polynesians:
Moa may have been hunted to extinction within a century of human arrival to New Zealand. Moa made such easy prey that by AD 1200 the hunting of Moa alone provided food surpluses sufficient to provide for the settling of large villages up to 3 hectares. These villages were permanent coastal encampments from which bands would set out on several-week hunts to slaughter and carry back Moa. Over 300 Moa butchering sites are known, 117 on South Island, which together account for some 100,000-500,000 Moa. With such abundance came a good deal of waste: as much as 50% of usable weight was discarded in the field. At around the same time as hunting was at its peak, the forests of South Island were burned off. The extraordinary abundance of food resources supported a population of as many as 10,000 people. However, by the late 1400s the Moa hunting society collapsed. By about A.D. 1400 all moa are generally thought to have become extinct, along with the Haast's Eagle which had relied on them for food.--http://www.nzbirds.com/birds/moa.html
The birds, like the dodo, had no fear of humans and were easily caught. The Maori were said to be extremely wasteful with the meat, keeping only the tastiest portions and letting the rest rot. True, both the Maori and the Rapa Nui came from Polynesian agricultural stock, and it could be argued that their imprudence somehow stems from that cultural background, but on the whole that argument seems weak. At the very least, it seems doubtful that a group of hunter-gatherers transported to a pristine island in the Pacific would have behaved all that differently, except that maybe they'd have died off much sooner for want of taro and pigs to start with.

This characterization of ancient peoples as wasteful and imprudent with their natural resources stands in stark contrast to the image that most anarcho-primitivists, and even indigenous peoples themselves, hold of their ancestors, who, it is claimed, always lived with nothing but reverence and respect for nature, always sought balance with the environment, made use of every part of hunted game, and so on and so forth. One way to resolve this contradiction, of course, is to simply ignore the zooarchaeological evidence and go on reveling in noble savage myths. I suspect many anarcho-primitivists will proceed to do just that, being ideologically invested in these myths. Indeed, most indigenous peoples take serious offense at the suggestion that their ancestors were solely responsible for the extinction of most of the rare animals and plants in their country. For example, both native Hawaiians and Maori insist that their ancestors always taught them not to be greedy with game and fish, and there is truth to this. Taboos concerning when and under what circumstances an animal may be taken, for example, are numerous in such cultures, and it is clear from ethnographic accounts and ongoing cultural practices that waste is and was frowned upon.

However, the assumption that the ancient Polynesians were this conserving of resources from the very beginning rests on no actual historical evidence and is contradicted by the archaeological record. One should not necessarily conclude that the Hawaiians or Maori are being disingenuous, but it does appear that they suffer from a sort of cultural amnesia regarding that early period of their history when their ancestors, very understandably, glutted themselves on all the low-hanging fruit. Only after this initial gluttony did the various Polynesian cultures begin to change, and this probably became possible only after some form of damage or reduction in easily obtained wild food became apparent, i.e., species extinctions. Culture finally caught up to reality, and this new-found respect for nature became ingrained in many Polynesian cultures' beliefs; it is therefore not surprising that native Polynesians take such exception to the notion that they were responsible for species extinctions early in their history. It just shows how thorough and powerful culture can be. The first Polynesian colonizers had to "eat spiders" for a while before they decided that that was not the best course, and now no one recalls ever having eaten spiders, but one can be fairly certain that at some point, somebody did.

Culture can be almost any kind of story, and in that sense it's a great hoax; a new culture cannot be said ipso facto to be any less authentic and legitimate than an established one, but attention should be paid to the likely consequences of any given culture if one were in the business of choosing between multiple options. A culture based on detachment from nature and non-stop economic growth implies a certain set of consequences, whereas a culture based on sparse, low-tech band societies implies a different set. As Kaczynski pointed out over 20 years ago, popularizing an alternative culture to that of industrial civilization is a stratagem that we can and should use to change people's relationship to nature. In all likelihood, it's the only viable option we have at our disposal.

Culture, like nature, needs sufficient time to catch up to a sudden novelty. The novel Lord of the Flies and its like all riff on the assumption that without authority, humans will invariably devolve back to what Hobbes always said we were: nasty brutes who incessantly kill one another. While this outcome would almost certainly be true in the short term, it would be strange to think that, given enough time, some sort of cultural solution would not organically arise to address the problems introduced by the original anomalous situation, the same way it has for the Polynesians and for all cultures who have weathered the mistakes of their early pasts and matured enough to find stability with their environment. That's what culture is, after all--an amalgamation of attempts to solve recurring problems by literally changing the way an entire society thinks on the most fundamental level, even to the point of collective amnesia of things ever having been different.

The domestication of fire, for example, was a significant achievement for humans, and I am not aware of any anarcho-primitivists who would go so far as to reject even fire as a technology. However, fire can be dangerous, so it almost certainly took a long time--generations--for people to make all the mistakes one could possibly make with fire before all the variables were grasped and humans were mature enough to handle fire responsibly. The pace of change in the adoption of this radical new technology still fell within the bounds of what the human mind could grasp, and the damage inflicted upon nature via human fire was incorporated and transformed into a new synergy, with organisms adapting to and even benefiting from, say, controlled burns of forest understories. Thus, by means of slowly adopting a new technology, both humans and nature had time to integrate the novelty and produce a new synergy. The novel essentially becomes natural by virtue of an acceptable rate of change, as opposed to any sort of static, inherent quality of "natural" or "artificial". Currently, it seems that human culture is trying but failing to incorporate the novelties that technological development keeps throwing at us. The rate of change is not only too fast, but it keeps accelerating. Culture is losing the race badly. This would be the main qualitative difference between, say, a culture learning to incorporate fire-based technology and our current situation, in which we cannot even all agree on the basic issues to begin with.
Even though early sapiens seem to have caused the extinction of the majority of the planet's megafauna wherever they roamed outside of Africa, they still did not enact change at a rate that outstripped the biosphere's ability to adapt. The planet survived the Quaternary extinction event, just as it did the previous waves of mass extinction long before humans were around. Certainly, the killing of the terrestrial megafauna came nowhere close in thoroughness or planetary impact to, say, the Cretaceous-Paleogene event that wiped out the non-avian dinosaurs, or the even earlier Permian-Triassic event (evocatively termed "the Great Dying"). The catastrophes of the past, while appearing explosive in historical retrospect, still occurred slowly enough to abide by nature's rhythm of change. By contrast, what technology continues to demonstrate to us in terrifying clarity is that the catastrophes of the present and the future will continue to occur and accumulate at ever faster rates, achieving an unprecedented type of novelty: the truly artificial, the fundamentally other, the collapse of synthesis, the antithesis of existence.

*For a good read on the madness that is invasive plant hysteria in the US, see Timothy Lee Scott's Invasive Plant Medicine.

Thursday, October 16, 2014

The Appeal of the Islamic State to Westerners and the Fallout from Techno-Scale Reality

The ascent of the extremist Islamist group known variously as the Islamic State, ISIS, ISIL, or Da'esh shocked most Westerners with its sudden appearance on the mainstream media radar in the wake of multiple successful campaigns to expand the group's territory, taking over important cities and infrastructure in Iraq and Syria. As most people have heard by now, the ranks of the group, which I will refer to from here on out as ISIS, have been steadily bolstered by recruits from Western nations. By some estimates, at least three thousand men and women from the United States, Canada, and Europe have joined the fighting in Iraq and Syria to date on the side of ISIS. Many living in the West may find this trend baffling--why give up the peace and comfort of one's middle-class life, journey to a very dangerous place where, in some cases, one doesn't even speak the language, endure Spartan living conditions, and throw in with a group roundly condemned by one's government, media, and neighbors for brazen acts of violence? The motivation behind such a decision will probably elude you if you fail to recognize the deeply embedded human need for a human-scale reality. In modern industrial societies especially, given our ingrained, myopic fixation on 'human potential' and 'limitless possibilities' through technology and human ingenuity, it is almost automatically assumed that any sane person would prefer peace and tolerance over war and brutality.¹

We know from comparative anthropology and archaeology that, in stark contrast to Western culture, non-industrialized, pre-modern, and traditional cultures were characterized by cohesive worldviews that generally supported what I've been calling 'human-scale' reality. All traditional cultures, for example, have an explanation of how people came to be and how the things around them worked. For every question, there was already an answer that every member of the culture either knew or could learn from someone who knew. Such ideas about the world were not open to debate, nor did it seem likely that anyone within a given culture would even think to question them--what would there have been to gain, when each generation's goal was simply the continuation of the previous generations' ways? There would have been a deep sense of comfort in knowing how everything in the universe worked and fit together, and in understanding one's place within that universe--things are no different in modern organizations like religious congregations, cults, clubs, etc. A face-to-face community bound together by shared views on the world, egalitarian treatment and good standing, a common goal, and a deep conviction in the correctness of one's actions: this is what a human-scale reality looks like, and, as the evidence of anthropology and archaeology argues, this is what humans have grown used to over two million years of existence. It is, arguably, what the human mind expects to find when it looks out into the world.

However, thanks to the assault of Western scientific progress, it is more or less impossible to maintain this scale. As it expanded outward from Europe in three waves of resource-hunger--the first as Roman ambition, the second under the guise of Christian evangelical glory and manifest destiny, and the third in the form of industrial capitalism's total war on the world--Western society has incorporated and devoured all the cultures it has touched, rendering traditional cultural worldviews obsolete and feverishly replacing them with a different scale of reality: first, an imperial civilization-scale; second, a transcendental Christian-scale; third, a capitalist techno-scale. This is how the ongoing zombie apocalypse has played out. Instead of having a clear, straightforward, and satisfying answer to questions regarding things like the purpose of life, people are now expected to define the purpose of life more or less for themselves, individually, as an inviolable matter of personal belief--to be based, ideally, on the exposure to many different viewpoints made possible by a modern education and access to the internet and books, and on an overwhelming amount of information and argument that is typically contradictory and inconclusive. The amount of evidence an individual has to consider in determining answers to the central questions about life that culture can no longer resolve accumulates exponentially, and the pressure and pitfalls of trying to keep up with it all can be staggering, even mentally harmful, typically resulting in a feeling of being totally lost and detached, even paralyzed. Even worse, the answer an individual decides upon might be rejected by those around her, effectively alienating her. People don't do well in heterogeneous groups with different beliefs, values, goals, and expectations of behavior. Cooperation and communication easily break down and must then be enforced by an arbitrary authority.
Science has succeeded in giving every other culture an inferiority complex and crippling its ability to form a cogent worldview, leaving a critical void where a definitive cultural narrative ought to exist.

Scientific rationalism, then, is clearly not compatible with healthy human psychology. The attempt to understand phenomena exclusively via objective observation and logic is unnatural and frequently counterintuitive. This much is obvious from the fact that scientists must undergo extensive training to properly apply the scientific method and to understand and check their own biases and logical fallacies. To do science, an individual needs to turn her back on her evolved humanity as well as reject an intimate, subjective relationship to the world. For a society to embrace science, even those who do not personally perform science must be taught to accept scientific thought as the most legitimate form of thought, and scientific ideas as the most concrete and real, even though scientific thought and ideas are frequently counterintuitive and complex, requiring elucidation and mediation by specialists whose claims by definition cannot be verified by lay people. Already, the human scale has been lost by virtue of the need for this mediation. Lay parents, for example, have long since been barred from teaching their own children a traditional worldview and the culturally important skills passed down to them by their elders; instead they must now delegate all aspects of their children's education, with direct implications for children's development and identity, to strangers and outsiders whose credentials as an institution are not open to question. When a way of knowing based on subjective experience, intuition, and folk knowledge passed down by previous generations conflicts with what science holds to be correct, one usually must yield to the scientific authority or else be considered naive, stubborn, ignorant, or fanatical--in other words, not to be taken seriously, and therefore not representative of society's standards. The same dynamic manifests in all hierarchical relationships--civilians don't get to question cops, Catholics don't get to question the Pope, and taxpayers don't get to question the Internal Revenue Service. The specialists in positions of authority are privy to information you aren't, and because you don't know what they know, you have no leverage to influence the policies they enact. As a result, one's intuitive worldview becomes increasingly irrelevant, and the value of personal assessments of the world diminishes. Our ability to comprehend the world, and therefore to make sound decisions, is essentially outsourced to specialists who inform us after the fact of how and what we think; this describes the basic mechanism of propaganda and why it is needed in civilized societies.

Science, however, differs from all other forms of propaganda in a significant way. Perhaps what makes science even more oppressive than, say, medieval hierarchical societies based on the doctrine of the divine right of kings is that science consists of theories that are always subject to change pending compelling evidence. At least in a pre-scientific age, an individual or a social class or group could count on the worldview they were taught when they were young, regardless of how unjust or oppressive, to stay the same throughout their lifetimes. People are adaptable, and as long as everyone seems to agree on a certain interpretation, humans seem to have been capable of accepting the consensus and conducting their lives accordingly without feeling the strain of ontological doubt and existential uncertainty. In contrast, to be a good scientist or a good member of a scientific society, one must always be open to changing virtually everything one believes to be true at any given moment. Science is always developing, and nothing can be stated with absolute certainty lest the central tenets of science be violated. This is considered a revolutionary virtue of science, compared to the unchanging beliefs passed down through many generations in traditional cultures. The problem, once again, is that the human mind does not appear to like this sort of ambiguity, and seems to suffer greatly when forced to accept it. Some individuals can be trained to embrace science's unresolvable ambiguities (and are subsequently rewarded for doing so), but I would argue that the average person constantly rebels against this paradigm, even if unconsciously, by holding a few unassailable convictions throughout her life, whether they concern the existence of God, ethical certainties such as that all children deserve an education, or notions that dark-skinned men are dangerous, that vitamin C boosts the immune system (it doesn't), or that homosexuality is a disease. Probably the only dogmatic axiom in all of science is the implied presumption that only things that can be objectively observed are actual and real; overall, science is far too impoverished in unshakeable dictums to fill the void left by culture. The average person in modern society still craves the all-encompassing certainty of a human-scale worldview that countless prior generations of humans enjoyed, and to this person, science and its corollary strains of liberalism and progressivism are constant and pervasive tyrants, systematically denying both certainty and meaning to a mind that starves for the return of an intuitively comprehensible and universally shared worldview. This tendency is particularly evident in immigrants from more 'traditional' places. Wherever such immigrants settle, they tend to hold close to their own cultural conventions, replete with religious beliefs, customs, foods, and usually a healthy dose of racism, all in direct defiance of liberal and scientific principles. Many immigrants gather together in migrant communities or form ethnic neighborhoods. Often, immigrant parents do not want their children to date or marry outside of their ethnicity.
These insular tendencies, while sometimes tolerated and even romanticized as part of the charm and allure of living in a multicultural city, more often than not get portrayed unsympathetically as backwards and distinctly un-American, with the usual moral of such stories being that racial and cultural tolerance, acculturation through Western education, and participation in mainstream Western capitalist society represent the proper aspirations of immigrants and the true American dream. Only a token, moderate expression of cultural pride or ethnic identity is tolerated; one may perhaps wear traditional dress on holidays as long as it does not scandalize conventional norms of modesty and decency, or eat traditional foods as long as they are not too disgusting to mainstream American sensibilities, but one cannot, for example, consider the word of God in the Qur'an to carry more weight than the US Constitution, or engage in polygamy as per one's people's customs. A true and complete expression of any foreign culture in proximity to any other culture is by definition a challenge if not a threat, and therefore only emasculated versions, similar to the ones on display at Disney World, are tolerated. Likewise, the mantra that "we are all one human family underneath it all" is critical propaganda in enabling the globalization that industrialism so desperately needs. As the world becomes globalized in the image of capitalist industry, it becomes possible to observe the implementation of this leftist worldview as official policy in more and more nations, as only the leftist myth of all-inclusive, secular, color-blind, gender-neutral international cooperation can establish the psychological foundation necessary to induce otherwise xenophobic and tribally-oriented peoples to forgo traditional bonds and accept membership in a global economy.

Generations of people growing up in industrial nations, especially in progressive urban centers, will have internalized this contrived leftist morality to the point where, even when there is overwhelming evidence to the contrary, we all still believe that it is not only acceptable, but morally incumbent upon us all to promote tolerance and build a society that somehow accommodates people of all views. Witness Article 1 of the UN Universal Declaration of Human Rights: "All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood", and Article 2:
Everyone is entitled to all the rights and freedoms set forth in this Declaration, without distinction of any kind, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status. Furthermore, no distinction shall be made on the basis of the political, jurisdictional or international status of the country or territory to which a person belongs, whether it be independent, trust, non-self-governing or under any other limitation of sovereignty.

The above credenda are phrased in a seemingly optimistic and humanistic tone, but they actually express a sentiment of subjugation. The notion that everyone is equal is tantamount to stating that there can be no basis for group identity or cultural uniqueness--anyone who asserts that her group is special violates the above articles. This way of thinking exists purely to enable the urbanizing and globalizing tendencies of industrial capitalism, technology, and science. Without these forces, such assertions concerning the equality of all human beings would be irrelevant. It should be obvious that for most of human existence, each band or village naturally considered itself special--it would have been the only society that mattered to its members, indeed, the only one with which they were completely familiar. Life revolved around their group. Many indigenous names of aboriginal tribes attest to the antiquity of this ethnocentrism: "Dene", "Gwich'in", "Inuit", "Lenape", "L'nuk", "Maklak", "Mamaceqtaw", "Ndee", "Numakiki", "Numinu", "Nuutsiu", "Olek'wol", "Tanaina", and "Tsitsistas" are just some of the names by which Native American tribes identified themselves to outsiders, all of which translate simply as "the people". Tribes like the Sahnish, Anishinaabe, Dunne-za, Gaigwu, and Nuxbaaga are even more assertive of their central importance, with names that mean "the original people", "the principal people", or "the true people". Many more examples could be found if we were to include aborigines from outside North America.²

Obviously, this ethnocentrism cannot be cited as evidence of backwards racism or chauvinism; rather, it strongly corroborates my argument that people are naturally adapted to a high degree of insularity and cultural isolation, and appear to have lived quite well that way for several hundred thousand years. No indigenous culture prior to encountering Europeans had any clue as to the breadth of foreign cultures residing in far-off parts of the planet, as these other cultures did not affect its way of life in the slightest until contact with Europeans was established. When different indigenous groups did encounter one another, they certainly did not extend the rights and protections they reserved for their own people to the foreign group, who were, of course, naturally considered less than real people. Native American tribes often saw nothing wrong with taking advantage of an outsider, white or native; whereas such behavior would have been unacceptable if directed at another member of one's tribe, one would receive the approbation of peers for pulling off a ballsy swindle on some unfamiliar sap. European explorers documented indigenous peoples of both the New World and Australia acting in seemingly erratic and contradictory ways upon first encounters, apparently out of not really knowing the best way to approach a foreign entity that could turn out to be dangerous. Sometimes the natives would decide to shower the explorers with immense generosity in the form of gifts and food, whereas other times the same people might try to loot or outright kill the strangers unprovoked; still other times the natives would simply flee at the sight of them.
One humorous account in Bill Bryson's travelogue of Australia, In a Sunburned Country, tells of how a group of aborigines, upon encountering a European explorer in the desert, stared in seeming bemusement until one of them casually inserted the tip of his spear into the stranger in order to see what would happen.

Sadly, history holds far more sobering examples of the consequences of clashing worldviews and of xenophobia enabled by technology. As history has attempted to show us time and time again, simply encountering a foreign worldview induces anxiety and is an easy trigger for violence. Mandating tolerance and diversity via authority and propaganda is inherently oppressive from an anthropological and even biological point of view. People shouldn't be forced to learn, against all evolutionary programming, to accept the cognitive dissonance that comes with the constant presence of strangers in their space. They should be able to live in a human-scale environment where familiarity can beget confidence, connectedness, and a sense of security. Bashing on, say, neo-Nazis or homophobes is really beside the point. Such intolerant people perceive things in black and white because, on a very basic psychological level, they are trying to salvage and affirm the simplicity of the human scale against the accelerating onslaught of techno-scale political correctness. Their prejudice simply reveals the sickness of the techno-scale reality and its empty reverence for universal tolerance. Their intolerance is a backlash against the implicit psychological violence that modern civilization inflicts upon them in order to coerce acceptance of a reality their biology instinctively rejects.

In light of this unrequited need for a human-scale reality, it should not be surprising that fundamentalist movements--groups who hold hard and fast to scripture or doctrine as the literal truth and ultimate authority regarding all things--hold such strong appeal for those who feel disaffected and alienated by the epistemological and moral ambiguity of science, whose dissolution of a cohesive and intuitive worldview shared by others leaves these marginalized people to seek meaning in a consumerism that is ultimately unsatisfying. The high profile of ISIS, coupled with its immense success in its campaigns in Iraq and Syria, makes the group highly attractive to those living disenfranchised lives in the West. Richard Barrett of the Soufan Group, an intelligence consultancy, writes in his report on foreign fighters in Syria that ISIS recruits from France, who number more than seven hundred to date, are characterized as "disaffected, aimless and lacking a sense of identity or belonging". He goes on directly to state:
This appears to be common across most nationalities and fits with the high number of converts, presumably people are seeking a greater purpose and meaning in their lives. Indeed, the Islamist narrative of Syria as a land of 'jihad' features prominently in the propaganda of extremist groups on both sides of the war, just as it does in the social media comments of their foreign recruits. The opportunity and desire to witness and take part in a battle prophesized 1,400 years earlier is a strong motivator. And for some, so too is the opportunity to die as a 'martyr', with extremist sheikhs and other self-appointed religious pundits declaring that anyone who dies fighting the 'infidel' enemy, whoever that may be, will be favored in the afterlife.
As reports such as those referenced above reveal, there is a strong sense of camaraderie and shared purpose within ISIS's ranks. A disaffected individual from the West can easily find in ISIS what she felt was lacking from her old life, and the organization's propaganda, targeted at young Muslims via social media, suggests that ISIS is keenly aware of the appeal of the human-scale, simplified view of the world it can offer and of the potential for fellowship that such a view possesses. Because the West long ago rejected religious dogma and championed universalist science, it cannot offer any of these things, and, as we can see, for many people the lack of absolutes in a scientific society drives them into the welcoming arms of an ideology that promises to shrink the world back down to size--an ideology that is already demonstrating the vitality of its simplistic worldview through ISIS's continued military victories against the more "civilized" nations, which, in their eyes, are also victories against weak convictions and ineffectual leftist universalism/globalism.

The West may be troubled and baffled by the steady stream of its citizens joining up for militant jihad, but now we can see that the West's confusion belies its longstanding denial. High-technology civilization tries hard to convince us that the distinction between a human-scale reality and a techno-scale one isn't real, and that even if it were, humans could easily adapt from the former to the latter. It goes so far as to disclaim responsibility for all our various anxieties, neuroses, psychoses, and physical diseases, along with growing rates of suicide and depression, persistent substance abuse, and an ever-expanding penchant for appalling acts of violence that now extends to joining foreign terrorist militias, protesting ignorance of the cause of these ills while remaining ever mindful to characterize those who act out with force as 'uncivilized', when the truth is just the opposite.

¹I have to say that this post is not meant to endorse the actions of ISIS or any other terrorist group, though I am certainly not condemning them, either. That's what the UN is for. I encourage those citizens of the US who recoil at the atrocities ISIS inflicts upon those it considers 'infidels' or enemies of its way of life to contemplate the following seldom-recounted piece of American history:
In 1813 several hundred Cherokees enlisted under the command of a bush lawyer turned general, Andrew Jackson. Old Hickory, as he became known for his intractable personality, was forty-six, gaunt, shrewd, violent, one arm crippled by dueling wounds--the latest from a duel with his own brother. Of Carolina frontier stock, he hated Indians but was more than willing to employ them as high-grade cannon fodder. His Creek War, hailed by Jackson as a victory for civilization, was notorious for the savagery of white troops under his command. They skinned dead Creeks for belt leather; and Davy Crockett, who was there, told how a platoon set fire to a house with "forty-six warriors in it" and afterward ate potatoes from the cellar basted in human fat.--Ronald Wright, Stolen Continents: The Americas Through Indian Eyes Since 1492
We now pay homage to Old Hickory, who later became the seventh president of the United States, by printing his likeness on our currency. Readers from fellow civilized nations: feel free to supply your own favorite "victory for civilization" from your homeland's illustrious history, and we'll show those jihadis how truly civilized people behave.

²In our mostly de-tribalized, nuclear-family paradigm in the West, this penchant to claim uniqueness for one's own can still be observed in the way an individual normally considers her parents or children to be special. Consider, for example, the tendency of civilized children to call their respective parents "mom" and "dad"; they apply these terms only to their own parents, and even though the terms are useless for disambiguation on any larger scale of organization, people remain beholden to what the left technically would have to consider a backward and tribally-minded holdover from our unfortunate evolutionary past: this insistence on hurtfully and divisively reserving the terms "mom" and "dad" exclusively for one's own parents, as if each person's parents were somehow unique, each mother the warmest, each father the strongest. Of course, most individuals don't feel that their parents or children are interchangeable with and equivalent to others; I would argue that this is normal human psychology. Any child instinctively understands that her parents are far more important than anyone else's parents; other people's parents occupy a mostly marginal place in a child's life. In a tribal society, this mentality naturally extends to the tribe as a whole vis-à-vis other, less familiar groups. My point here is that liberalism would prefer you see this way of thinking as a violation of universal human rights.