Category Archives: News

It’s Time to Invade the Amazon

A headline in yesterday’s Atlantic read “The Amazon Fires Are More Dangerous Than WMDs.” My wife made roughly the same point to me when news of the fires broke days ago: “Why don’t we send the military or something?” That highbrow news magazines and commonsensical elementary school teachers share the same response to the problem suggests that the nature of the crisis and the appropriate American response to it are widely known. If the US is going to spend trillions of dollars to maintain the world’s largest, best-funded military, it should be able to fight fires as well as terrorists. I don’t pretend to understand the logistics of how that would work exactly (strap a water balloon to a drone?), but the principle is so rudimentary, so obvious that it should be a no-brainer. In a century defined by aggressively interventionist US foreign policy, the US cannot consistently insert itself into one country in the name of bombs that don’t exist while refusing to insert itself into another country where the bomb has already gone off and is burning up the very air we breathe. The hypocrisy is evident (universally, apparently).

It’s time the US government introduced a little parity into the way it treats global crises. If migrants deserve military mobilization, this crisis certainly does–if for no other reason than that the devastation of another Latin American economy cannot help but exacerbate the immigration “problem” here. It’s my hope that people and governments–around the world–do more than jockey and tweet in the face of global catastrophe. After all, what’s the point of the Leviathan state if it has its tentacles in every aspect of our lives and does nothing to save them when they are in peril?


Liu Yifei and Cultural Donatism

Liu Yifei has done it. She’s stood on the wrong side of a hot-button cultural issue and now the Chinese-American star of the upcoming Mulan remake has risked the ire of Twitter and its oft-fabled, never-forming boycott. The specific issue has to do with support for the Hong Kong police in the midst of “sometimes violent” ongoing pro-democracy protests–to the great satisfaction and delight, I’m sure, of the never-hypocritical ‘blue lives matter’ folks. The specific issue also doesn’t really matter. It’s the confusion of artist with art, producer with product that concerns me here. My political inclinations being well established, it’s safe to say that I stand neither with the protesters nor the police (though, if possible, I stand considerably less with the police), but I am also equally appalled by those who would consider–as one user put it–an entertainment product “destroyed” because a participant did or said something objectionable.

From a Christian perspective–not to mention the perspective of common sense–the most basic stupidity involved in this should be obvious: everyone involved in every piece of art or entertainment has done, said, or believed something which is now (or will be in the future) objectionable. What has changed in the last 24 hours is not the quality of Liu, as either an actress or a person, but our knowledge about her. It is the tearing back of the veil that revises our perceptions and supposedly demands new judgments; after all, nothing changed about Paddington between 2015 and 2017 except the unsettling knowledge that it was produced by Harvey Weinstein’s company.

The problem, of course, is that our ignorance is unfounded. We know the truth about human nature and delude ourselves rather than deal with the fact that our ability to do good–in the greater world or in the arts–cannot be wholly undone by our sinful nature. Christians know that “none is righteous, no not one,” but we set that aside and presume innocence (in deference to the American way) in spite of a universal knowledge of guilt. If your affection for Mulan cannot survive the crumbling of your feigned ignorance–the revelation of the specific manifestation of what you already know incontestably by biblical axiom or the laws of statistical probability to be true–then the problem lies with you and not with Mulan.

(The most ridiculous manifestation of this comes not with new knowledge but with anachronism and hindsight, when we don’t learn something new but become ashamed of not caring about something old. The Confederate statues and flags were not hidden away from public view; the “sins” of Woody Allen were adjudicated–in his favor–by law enforcement. Only standards about acceptable causes and levels of outrage have changed with time.)

The greater issue, however, that Christians seem to overlook when debating whether or not you can still listen to “Thriller” without being a rape apologist–or, put more benignly, if you can separate the art from the artist–is that great Christian theologians of days past have already arbitrated this question and offered us a resoundingly clear conclusion. In the middle of the fourth century, Donatus Magnus took to his cultural equivalent of Twitter with his cultural equivalent of hashtag activism, calling on Christians to #boycottbadpriests. The Donatist heretics believed that the character of sinful priests had a negative effect on the sacraments they administered, so much so that people baptized by corrupt priests (particularly those who had renounced the faith to save their lives during the Roman persecutions) should not be admitted to the church. The character of the individual, they argued, corrupted the character of the act.

The Church Fathers–perhaps realizing that all have sinned and fall short of the glory of God–swooped in to set the record straight. Augustine, a North African like Donatus, made a particularly strong case when commenting on the Gospel of John and reflecting on Paul’s letter to the Corinthians:

For what does Paul say? “I have planted, Apollos watered; but God gave the increase. Neither is he that plants anything, nor he that waters; but God who gives the increase.” [1 Corinthians 3:6-7] But he who is a proud minister is reckoned with the devil; but the gift of Christ is not contaminated, which flows through him pure, which passes through him liquid, and comes to the fertile earth. Suppose that he is stony, that he cannot from water rear fruit; even through the stony channel the water passes, the water passes to the garden beds; in the stony channel it causes nothing to grow, but nevertheless it brings much fruit to the gardens. For the spiritual virtue of the sacrament is like the light: both by those who are to be enlightened is it received pure, and if it passes through the impure it is not stained.

In other words, the goodness of God is such that it cannot be destroyed even if it passes through or is administered by an impure vessel. (Which is good, since that’s the only kind available at present.) The principle extends far beyond the sacraments. Since God is the only one Who is, absolutely, good, all goodness must be by definition derivative from that goodness and good by virtue of its origin in the divine and not because the doer is somehow good. What is good cannot be made evil by the agent that accomplishes it.

My point here is not that movies are like sacraments or even like works of righteousness. Obviously they aren’t. The principle, however, remains the same. In confusing act and actor, we accept a humanist fallacy that locates the value of any particular thing in a person rather than in some transcendent notion of good or evil.* The Church Fathers, in contrast, echo the Scriptures in which God acts not occasionally or exceptionally but routinely through downright evil vessels, hardening their hearts or rousing them to bloody and devastating wars to accomplish righteous purposes. It is the product and not the tool that is just.

In other words, if viewing a film needs to be evaluated as a moral act, if we must take principled stands about what we see based on appeals to higher virtues, then the criteria by which we judge should be limited only to the content of the creation and not the character of the creators. Whether the film is pornographic matters; whether the cameraman watches pornography does not. Seeing Mulan is not tantamount to supporting the Hong Kong police, and to think otherwise is to accept a particularly silly old North African heresy.


*That may be fine for the secular humanist–though I wish them luck living consistently with the almost certain likelihood that any particular person is a flawed vessel and therefore utterly incapable of good. The world becomes a bleak place when even the work of your own hands is beneath your own contempt.

Has Rihanna Gone Native?

Twitter has already moved on to the latest shiny new racist distraction coming from the White House. Unaccustomed though historians are to living in the present, I’m still dwelling on the past. Specifically, the flash-in-the-pan outrage that accompanied Rihanna’s recent East-meets-West themed photo shoot with Harper’s Bazaar China. Last week, I pointed out that the recent popular application of “cultural appropriation” stood as a sad caricature of legitimate complaints about blackface–both the specific act and the general spirit present in other behaviors in the same vein. I noted that using “cultural appropriation” as a blunt instrument to enforce rigid cultural borders (even if only against majority groups) also worked specifically counter to progressivism’s putative goal of productive multiculturalism. In doing so, it has roots in an older cultural policing practice: the stigma against “going native.”

I’ve written about the stigma on “going native” here in the past, specifically with reference to what might be called the appropriation of East Asian culture (especially) by white Americans. In brief, the idea was one popular among advocates and opponents of 19th century imperialism who worried that too much exposure to “lesser” (non-Euro-American) cultures might infect and degrade the imperializing populations. It, in essence, inverted the proper flow of cultural transmission, one which was always from whites to non-whites. The evidence for this was the assumption (or, as we would say now, the appropriation) of native culture by white colonizers.

If that profoundly racist and conservative rationale for white people not dressing, speaking, singing, dancing, or acting like people of other cultures seems deeply antithetical to (and even a potential source of) the contemporary woke and progressive rationale for avoiding cultural appropriation, that’s probably fair–so far as it goes. Because in practice, the very behaviors that were policed by 19th century imperialists are the same behaviors that agitate progressives now.

The most obvious example pertains to dress. As it so happens, the same policing of Rihanna’s dress that made such transient headlines before has near parallels among the women of far-flung imperial outposts. In 1911, the Irish writer Beatrice Grimshaw reflected on her experience living in Papua New Guinea:

This question of dress is a burning one among island ladies. The native loose robe, hung straight down from a yoke, is very much cooler, and the doctors say, healthier, than belted and corseted dresses such as European women wear. But there is nevertheless a strong feeling against it, because it is supposed to mean a tendency to “go native,” and the distinguishing customs of the race acquire, in the island world, a significance quite out of proportion to their surface importance, because of the greatness of the thing they represent. Therefore, the white woman, unless she is suffering from bad health, and needs every possible help to withstand the heat of the climate, sticks to her blouses and corsets, as a rule, and sometimes “says things” about people who do not.

Grimshaw clearly sides with common sense when it comes to ladies’ dress, but she recognizes she is in the minority. It doesn’t matter if the clothing is consistent with the host culture (as Rihanna’s was) or if it is medically or environmentally sensible (which, of course, Rihanna’s wasn’t); what matters is the “greatness of the thing they represent.” Divorced from the language of “going native,” Grimshaw’s words might just as easily have found their way onto Twitter, where the “distinguishing customs of race” are considered off limits not because the prohibition makes sense but because they carry symbolic significance all out of proportion to their real import.

Not everyone was so forgiving with regard to dress as Grimshaw,* but always the focus was on safeguarding the borders of cultural integrity through a prohibition on sartorial miscegenation. Consider this study by Nicholas Thomas and Richard Eves about other reports from the South Seas:

[Martin of Nitendi] describes the acts of piracy of a white man living among an island people who attack and plunder a passing ship. Because of his acts of “reckless courage,” this man is accepted into the local society and achieves a measure of authority and influence that is surpassed only by the chief. As in many stories, he has assumed “native garb” and wears a girdle of ti leaves and nothing else other than a hat made of coconut leaves shading his “blood-shot” and “savage” eyes from the sun. This character, Jim Martin, was put ashore by a whaling vessel because of his mutinous conduct, and as a result he has dissociated himself forever from civilization, becoming one of the most desperate and blood-stained beachcombers that had “ever cursed the fair isles of the South Pacific.” Becke emphasizes that Martin’s previous identity as a white man has been erased when he writes “he had been a White Man.” Elsewhere, again emphasizing the “savage” state to which Martin has fallen, Becke describes him as a “wild, naked creature,” a description that also calls into question his humanity. He had indeed become “more a savage native than a white man.” The story ends with a punitive expedition in pursuit of Martin for the massacre of a ship’s crew. He escapes into the mountains but is eventually shot and mortally wounded by a soldier who had thought he was a “nigger.” When Martin is found wounded by the lake, his racial identity is called into question when a soldier asks him, “Who are you? Are you a white man?”

Thomas and Eves note that “for those whites who live in the racial and cultural borderlands of the Pacific,” the possibility of going native is ever-present but the possibility of redemption is all but non-existent. Insofar as the “process of degeneration is…conceived as a one-way journey,” going native has more overt similarities to the “weeaboo” slurs described in the link above. Even in this, however, there are parallels to current “cultural appropriation” discourse. This is true inasmuch as its purveyors believe that appropriation entails a loss for the people of the native culture, one that must be somehow total and permanent (if for no other reason than because that is necessary to explain the scale of the outrage).

The real difference, if there is one, has only to do with which party is identified as aggrieved by the porousness of cultural borders. With “cultural appropriation,” the accusation is that when white people dress or act like non-white people, it represents a colonization of non-white culture by white culture. The ironic reverse is true with “going native,” in which the fear is that when white people adopt non-white culture it represents a colonization of civilization by barbarism.

Both diagnoses are inconsistent with progressive ideals of multiculturalism rigorously conceived in that they (a) assume that cultures are hard realities with policeable borders and (b) consider the breaching of those borders in some or all circumstances to be undesirable. The best theories of culture, however, roundly repudiate that first contention, presenting culture rather as a fluid manifestation of supra-linguistic modes of symbolic communication. If that is true, cultures are not racially or nationally contingent. In-groups and out-groups are defined by fluency, the ability to operate culturally within those unspoken codes of conduct. That, as many advocates of Black English have noted, is the difference between white authors who write fluently in Black English and the minstrelsy that is intended to make African Americans look stupid.

The second contention–that racial permeability is bad in some, many, or all circumstances–is not only untenable under the definition of culture outlined above, it also distorts to a common, twisted purpose the logic of racial realism espoused by the racist extremes of the right. Proceeding from the same unscientific, academically ignorant premise that race and/or culture has a substantial, independent existence, they assume without cause that something is lost or damaged when that existence is diluted or contaminated. In other words, those who sing the songs of condemnation about cultural appropriation have retained the tune to the old “going native” hymns and merely updated the words to suit their new, equally troubling purpose.

Rihanna will be fine. She’s not white; the culture she appropriated isn’t black. Both of these things–not to mention outrageous fame and popularity–will insulate her from any lasting impact. After all, she has been called out by these same arbiters of contemporary cultural decorum before and been just fine. For the rest of us, though, serious thought must be given to precisely how long we are willing to leave unchallenged the idea that our cultures are treasures that we can hide in a box and take out to play with only in the presence of others who look like us. It’s inconsistent with the aims of American progressives, inconsistent with the conclusions of modern academics, inconsistent with basic common sense.


*It is worth noting that Grimshaw was not a great pioneer of acculturation. Her commonsense approach to culture had clear limits as well. When she portrayed the negative aspects of “going native,” she, too, made clear reference to dress:
The Islands are not the place for the ne’er-do-well, and I would also warn the exasperating young man, who never did a square day’s work in his life, never got into trouble with his employers or his superiors, but always found himself misunderstood, unappreciated, and incomprehensibly “sacked,” with an excellent character, at the first hint of slacking business—that the islands will not suit him either. If he comes out, he will not starve or go to the workhouse, because you cannot die of hunger where there is always enough vegetable food to keep the laziest alive, and you do not need workhouses, under the same happy conditions—but he will “go native,” and there are some who would say he had better starve, a good deal. There are men who have “gone native” in most of the Pacific groups, living in the palm-leaf huts with the villagers—but a white man in a waist-cloth and a bush of long hair, sleeping on a mat and living on wild fruit and scraps given by the generous natives, drunk half the time and infinitely lower, in his soberest hours, than the coloured folk who unwisely put up with him, is not a happy spectacle.

 


Rihanna and the Self-Caricature of Progressivism

Twitter is mad this morning. And as increasingly irrelevant as I think activism on the platform is becoming, I find today’s outrage particularly outrageous. On Tuesday, Harper’s Bazaar China released photos (set for publication in August) from a recent shoot with pop star Rihanna, in which the Barbadian singer:

[i]n one photo…gazes down, a dainty Chinese fan in one hand and a bright red sash around her waist. In another, she poses in front of a traditional folding screen, the golden ornaments in her hair reminiscent of the royal fashions of ancient China.

The whole purpose of the piece was, by design, to show what happens “when western style icon meets eastern aesthetic.” Unfortunately, the magazine forgot to run that fairly benign vision by the Internet censors in advance. On cue, Twitter cried “cultural appropriation” and let slip the dogs of war.

Cultural appropriation, for the uninitiated, is the use of styles or artifacts of one culture by individuals not of that culture. That’s a broad and inoffensive definition that might make you think of white folks using chopsticks at a Chinese buffet or cooking tacos on Tuesdays (in the proud tradition of Moctezuma). In practice (or, in Twitter practice) the term has become an unevenly but aggressively applied synonym for racism of the career-hobbling (but not necessarily career-ending) blackface sort. It is a kind of politico-cultural slur applied to anyone violating the sacrosanct borders of culture, particularly majority groups who don the visual style or who ape the artistic output of minority cultural groups.

Herein lies part of the most tangled and interesting problem with the accusations against Rihanna. On the one hand, Afro-Caribbeans are an almost unrecognizably small minority in mainland China, where the consuming audience for the photos lives. Yet, on the other hand, Sino-Caribbeans–the descendants of Chinese laborers brought to the Caribbean–are a significant and clearly visible minority in Rihanna’s cultural home. As an Afro-Caribbean in Chinese garb, is she a member of the mainstream taking on the culture of the weak and marginalized, or is she a minority adopting the dominant culture where she happens to be at the moment?

That’s more nuance than Twitter is willing to get into and, quite frankly, it is irrelevant to my larger concerns. The first is the confusion–typical of the overreach of contemporary cultural progressives–between adoption and parody, between (in other words) white people performing hip hop and wearing FUBU and white people appearing in blackface. Blackface (like the yellow- and redface parallels) is a particularly disgusting bit of American history wherein white people made their faces up to look like caricatures of African Americans in order to propagate demeaning stereotypes about a people they were actively oppressing. That’s why blackface was (and still is) terrible–as are all Frito Bandito-style Halloween and Cinco de Mayo get-ups.

But we can’t confuse this with white tourists in Tokyo wearing yukata on Tanabata, with Monica Geller getting braids to manage her hair in Barbados, or with the very fact of white people drinking margaritas and eating Tex-Mex on Cinco de Mayo. Not only is that silly–raising perfectly legitimate ad absurdum questions about just how much appropriation is appropriate–but it fails to accurately identify what is wrong with blackface. The problem with blackface is not that white people made themselves look like black people, it’s that they did it maliciously and without authentic understanding of (or even superficial desire to understand) the people they were parodying. An attempt to understand, appreciate, and participate in a culture on its own terms has as little in common with blackface as a surgeon’s scalpel does with a mugger’s switchblade. Knifing people is wrong…depending.

That’s not too much nuance to expect from the general public, or it shouldn’t be. The point has not been entirely lost on Twitter. Rihanna’s defenders are absolutely right to point out that the designers, publishers, and audience are all Chinese:

By going straight to the source and finding a Chinese designer, her supporters said she had honored the culture and people from which the aesthetic was borrowed.

They ought to go a step further and point out the basically colonial attitude involved in being outraged on behalf of the Chinese for the appropriation of their culture. The anxiety over “cultural appropriation” is a peculiarly affluent western phenomenon, one that is overwhelmingly white. To the extent that white people feel the need to explain what minority cultures will find offensive in our behavior, they perpetuate the kind of paternalisms they purport to resist. Outside of the paradoxically closed discourse of Western cultural progressives, the world is a much less insecure place. When I took kyūdō lessons in Japan, I was required to remove my western clothing and don the traditional kyūdōgi because failing to do so would have been inappropriate. It was a source of confusion when I had to struggle to explain to our teacher why I did not want pictures of me from the class shared on social media because I did not want to have to explain my non-western garb to any overzealous undergrads. Audiences in China, meanwhile, appear more likely to be befuddled by American outrage than to share it:

[T]here is a contrast between audiences in mainland China, who have largely complimented the shoot, and audiences overseas, who seem more conflicted.
On the Chinese micro-blogging platform Weibo, the majority of comments about Harper’s Bazaar cover appeared positive. “No wonder she is the Queen of Shandong (province),” one user wrote, using a nickname Chinese fans have given Rihanna. “She is a foreigner that is most suitable to the Chinese style.”

“It looks so good! Slay! The Chinese style compliments her so well,” another user wrote, while other Weibo comment threads are filled with heart emojis and exclamations of “wow” and “beautiful!”

For those in China who appreciate and consume American musical culture, seeing a pop music icon appreciating, even celebrating, Chinese culture is a tribute rather than an insult.

The almost willful lack of nuance with which angry progressives distribute accusations of cultural appropriation dulls the force of their argument about genuinely hurtful acts of performative racial violence like blackface. Beyond replicating old paternalisms in new but no less patronizing forms, it transforms banal acts of cultural fluidity into mortal sins to the end of confusing and alienating anyone who doesn’t understand the demarcating line of transgression between eating sushi and wearing a red sash while using a paper fan. “If I can’t get it right, I might as well stop worrying about getting it wrong. So why is blackface racist again?”

Just as importantly, bemoaning cultural appropriation undercuts their larger multicultural objective by treating culture as a bounded reality that is fixed and inherited like race. In that respect, they replicate a more specific form of colonialism. In the interest of giving that adequate space, I’ll address it more fully in a subsequent post.


The Usual Policy

I have made several attempts over the course of the last year to attack the discourse of novelty that surrounds the actions of the current administration. It seems that my voice (through great personal effort) is finally reaching the masses. Chris Rock posted an image yesterday to social media reminding outraged Americans that the United States was founded on a policy of family separation. The greater message–mine, not necessarily Chris Rock’s–is that something is wrong because it is wrong, not because it is without precedent or because it is inconsistent with “American” values. The precedents are there and the values are only as good, as real, as meaningful as the actions they have, do, and will provoke.

There’s my happy belated birthday to the US.

[Image: U.S. Navy flag at a ballgame]


Bad History is Still Good Politics

Last September, I wrote a bit about the frustrating rhetoric of novelty that surrounds this presidency. There are frequent claims–sometimes from the president but especially from his critics–that he is a norm-shattering figure, untethered from the historical rules and codes of conduct of the American presidency. The cry of the “resistance” since the 2016 election has been never to normalize this administration. But it is normal, at least with regard to its positioning in the grand flow of American presidential history. For every horrible (or, if you’re so inclined, laudable) thing he has done, there is a clear precedent or analogue in administrations past. He is not the corruption (or, again if you’d prefer, metamorphosis) of the US presidency, he is “its culmination–historically and morally the distillation of everything it stands for and has always stood for.”

Nevertheless, the rhetoric persists, and it appears to be born from a delusion that American global authority is rooted in virtue rather than force, exceptionalism rather than triumphalism. Neurotic though it may be, this collective self-deception has been an effective political tool for both parties for at least a century (as far back as Woodrow Wilson’s “moral diplomacy”) and probably longer. The fact that this president has thrown open the curtain–or, to recycle my metaphor from last fall, wiped the lipstick off the pig–may leave us uncomfortable with what we see, but it doesn’t change what we’ve always had. The “wizard” was always a con-artist; the pig was always a pig.

That didn’t stop the narrative from reappearing in this past week’s Democratic primary debates, where once again historical blindness came into service of political rhetoric. Former Vice President Joe Biden–who by all accounts had a rough night and who will almost certainly not suffer for it–doubled down on the rhetoric of norm-shattering novelty in his closing remarks. See if you can catch the bogus history in the following:

I’m ready to lead this country because I think it’s important we restore the soul of this nation. This president has ripped it out. It’s the only president in our history who has equated racists and white supremacists with ordinary and decent people. He’s the only president who has, in fact, engaged and embraced dictators and thumbed their nose at our allies. I’m, secondly, running for president because I think we have to restore the backbone of America, the poor and hardworking middle class people.

If you guessed “all of it,” then you’re right. The appeal to America’s exceptional virtue is clear and utterly nonsensical. The “soul” (now lost) of the US is, for Biden, its racial inclusivity and its repudiation of authoritarian governments. On a night when Biden got raked over the coals for his insensitivity toward the lingering pain of America’s racial history, it is telling to me that no one has called him out on the first of these claims. Every American president at least through (and including) Abraham Lincoln would have equated southern slave owners with “ordinary and decent people.” Say what you will about a brief and radical period of Reconstruction, but most presidents since would also have embraced what the contemporary left considers “racists and white supremacists” as perfectly regular folks. When we have had twelve American presidents who owned slaves–eight of whom owned slaves while in office–it’s pretty ridiculous to say that this president is the first to consider racism ordinary.

But pointing that out does a disservice to the narrative that it is this president (rather than America itself) who has a pretty consistent and universal race problem.

The idea that the US had a clear and steadfast policy of opposing authoritarian governments prior to 2016 is equally absurd, as any even remotely honest reading of the Cold War in Latin America will reveal. Support for Cuban authoritarian Fulgencio Batista (a support which helped provoke Castro’s communist revolution) and the overthrow of democratically elected leaders in Guatemala (Arbenz) and Chile (Allende) in favor of brutal military dictators are just the beginning. From the détente with Papa Doc to Operation Ajax and the installation of the shah in Iran, the US has a long and proud history of engaging and embracing dictators. In fact, the rise of modern dictatorships coincides–not entirely coincidentally–with the end of isolationism as a viable US foreign policy. If this president has decided to do his cozying up to dictators in public rather than through covert agencies, that is again a change in window dressing, not substance.

Biden knows what he is saying is ridiculous. Or someone on his staff does. I don’t believe that no one in the whole machine of the current Democratic field is smart enough to see the patent historical absurdity of the claims being made. The problem is that bad history has always been good politics. Because “that’s not how we do things” has more resonance than “that’s not how we do things lately,” and “that’s not who we are” is more comforting than “that’s not who we pretend to be.”

 


Contemporary Feminism Confronts Reality

A number of recent articles, spread across several publications, have tried to stress the degree to which the message of contemporary feminism is having trouble breaking through to the level of popular consciousness. The theme continues to appear every few days as I sift through the news, but two articles in particular have stuck with me. The first was in Politico’s magazine, entitled “It’s Sexism, Stupid. Why men are dominating the Democratic 2020 primary.” The article joins the chorus of those lamenting the failure of any woman to catch fire and dominate in the Democratic field the way Hillary Clinton dominated the much smaller 2016 field. The article suggests that, rather than overcrowding in the field or the flawed nature of the female candidates, the problem is “sexism and misogyny—albeit often unconscious, unwitting and the result of implicit bias.” As evidence, the article points to a 2008 study, which found that

gender is a powerful force in inducing voters to defect across party lines. Specifically, when men and women were pitted against each other in head-to-head match-ups for the presidency, a substantial proportion of Democratic voters (12.3 percent) defected to a male Republican, John McCain, rather than vote for a female candidate from their own party, Hillary Clinton.

Similarly, and arguably somewhat less surprisingly, a sizeable proportion of Republican voters (15.5 percent) defected to a male Democrat, John Edwards, rather than vote for a female candidate from their own party, Elizabeth Dole. (This tendency was true for both male and, notably, female voters, and was not balanced out by any comparable pattern of defection toward female presidential hopefuls.)

Another article, which appeared more recently in the Atlantic and struck a little closer to home (literally), was “Even Breadwinning Wives Don’t Get Equality at Home.” The principal complaint in this article is that, while gender equality gets lots of attention and makes most of its progress in the workplace or in politics, gender equality in the home is getting left behind.

Breadwinning wives also don’t get parity in how household chores are divvied up. As wives’ economic dependence on their husbands increases, women tend to take on more housework. But the more economically dependent men are on their wives, the less housework they do. Even women with unemployed husbands spend considerably more time on household chores than their spouses. In other words, women’s success in the workplace is penalized at home.

In the end, women take on more domestic responsibilities than men in a way that is mostly unrelated to their availability for those responsibilities. They take them on because they are women even when their husbands have no gainful employment to occupy their time.

The problem, as both articles see it, is that implicit biases about women’s nature and roles continue to override the public discourse about gender equality. The problem, as I see it, is that whatever people may believe or espouse in the public sphere, there is a fundamental disconnect between what contemporary feminism asserts and people’s actual experience. Contemporary feminism continues to base its argument for the equality of the sexes on the irrelevance (or perhaps obsolescence) of sex in a way that defies people’s everyday experience.

It is one thing to say that men and women should have equal access to employment and public positions, that compensation and advancement should be based entirely on competence without regard for sex. It is another thing entirely to say that we should not see sex, that sex is not a real or meaningful category through which we approach the world. It is more radical still to imagine that all the consequential beliefs that we attach to sex are in fact gender–sex’s ephemeral cousin, entirely culturally rooted, hopelessly fluid, and utterly untethered from biological sex. Even if all of that is true–and I’m pretty sure I had to sign an oath in blood on the back of my PhD saying it was–it flies in the face of how people operate in their day to day lives. With each step toward the root logic of contemporary feminism, we shift further into the realm of cognitive dissonance, where the (coherentist) theoretical soundness of feminist theory butts up against the realities of lived gender economies.

This, perhaps more than self-ghettoizing, explains women’s complicity in the structures of their oppression in both articles. The story on domestic balance noted that “one possible explanation for this is that by outearning their husbands, wives worry that they are breaking norms on gender expectations.” The argument goes further:

[I]t’s not just men who are keen on enforcing the notion that they should be the family’s earner in chief. Wives play a crucial role in framing husbands as breadwinners too. A lawyer who had been the breadwinner in her marriage told me that after she lost her job, she turned her focus to her husband’s business and how he could grow it, instead of worrying about how she could find another job to ensure that their family remains financially stable. Ironically, her educational credentials and prior work experience mean that she is actually positioned to bring in more money than her husband. Instead of focusing on how the unemployed woman could get her next job, the couples I talked with focused their attention on ensuring that the husband’s career was flourishing.

Just like women who cross party lines to vote in favor of a man, women will torch their own career prospects because they are invested in the idea of male vocation as central not only to male-identity but also to household-identity.

The solution, according to “Breadwinning Wives,” is better public policy that will pave the way for gender equity at home. The solution, according to “Sexism Stupid,” is tough talk with the unconverted public.

People who might be explicitly committed to egalitarianism still have gender biases in certain contexts, including presidential races. And they are unwitting experts at concocting post hoc rationalizations for foregone, irrational conclusions.

The idea that you aren’t voting for a woman not because you don’t want to, but because America just isn’t ready for a female candidate smacks to me of that kind of thinking. Perhaps America isn’t ready because you’re one of the many who prefers male to female candidates, and who unconsciously reaches for excuses to rationalize your preference. This country will never be ready for a woman president, to our detriment, if this continues.

My criticism of both of these solutions is that they assume that the problem is not with the feminist vision of equality-qua-irrelevance but with the mass of humans behaving in ways that seem natural to them and that resonate with their experience. Women face real and meaningful problems in our society, but the message doesn’t seem to be getting through in practice. That may be because, unlike liberal Twitter, the bulk of Americans continue to see sex as a legitimate, meaningful, useful tool for ordering their personal, professional, and political relationships. As a consequence, they expect–whether with hope or fear–that a woman president would be different from a typical (i.e. male) president. They expect a woman’s loss of employment to affect her and her household differently than a man’s lack of employment. (And they don’t reduce those consequences to the raw economics of who makes more dollars and cents–because this isn’t a murky Marxist dystopia where all people have been unsexed, uncultured, and reduced to engines of revenue production.) In other words, people expect sex to matter.

And that doesn’t automatically preclude gender equality, nor even many of the concrete goals of contemporary feminism. It just requires a different rhetorical platform from which to make your argument. Rather than chastising us for our neanderthal stupidity and ostracizing those who bother to root some aspects of gendered behavior in evolutionary biology rather than culture, maybe it is time to work toward meaningful solutions to significant problems within the context of prevailing beliefs about sex.


Christians Don’t Go to Church. So What?

A new Pew study finds that most Western Europeans who are subject to church taxes are content to pay those taxes even if they generally don’t actually go to church. The taxes, which are mandatory for Christians in six countries and voluntary in three others, go to keep churches open in spite of generally low levels of attendance and engagement throughout much of Western Europe. In each of the six countries where the tax is obligatory (unless you officially leave the church), a substantial majority of those subject to the fee pay it and an equally substantial majority of those who pay it say they are likely to continue rather than resign membership in churches that they do not attend.

[Charts: Pew survey results on church taxes in Western Europe]

A news article about the report sees the results of Pew’s study as something of a curiosity.

From the outside, Western Europe is often seen as a highly secularized region where established religion is dying out….

Besides attention to church taxes, the report highlights some anomalies about Europeans’ attitudes about religious observance. In Finland, 77% of those surveyed called themselves Christian while only 10% said they attended church regularly. In Germany, the figures were 71% Christians compared with 24% churchgoers.

In several countries, half or more of those who approve of church taxes said they also favored the separation of church and state. In Scandinavia, about three-quarters of respondents who pay church taxes through the state say the state should stay out of religion.

In many significant ways, Western Europe has secularized in the last two and a half centuries, but those who look at rates of church attendance to illustrate this trend (as well as those who look at a willingness to finance the church as an anomaly contradicting that trend) fundamentally misunderstand the traditional, historical relationship between Christians and their religion.

Particularly in the US, a nostalgia for the 1950s, when weekly church attendance was at its high-water mark, has skewed our understanding of the historically robust symbiosis of religious identification and truancy. In the medieval period, church was a luxury of the wealthy, and the rank and file took the sacrament only rarely, on high holy days. In the vaunted early days of the Puritan experiment in New England, many towns in Connecticut regularly had attendance rates below 15%. In colonial Virginia in the 1660s, only one in five parishes even had resident clergy. (Even the priests weren’t going to church.) When you look at that golden age in the 1950s, the level of attendance was only roughly 50%–even as an overwhelming majority of Americans did (and do) identify as Christians.

Church attendance has never been an adequate measure of religious belief or adherence. We may imagine that in the days before Constantine, when most or all Christians were devout believers rather than cultural conformists, Christians gladly gathered every Sunday for fellowship. But there’s no way to substantiate that statistically. What we know from the rest of Christian history is that being a Christian didn’t have much to do with going to church until the pietist/evangelical movements really caught fire in the 18th century and made the connection between personal devotions and religious adherence.

This observation is not, importantly, a justification for skipping church; I am still one of those American Christians who makes an effort to attend services weekly. It is, however, a call to stop being confused by the historically regular (if not normal) state of things. The simplistic equation of religious affiliation with church/mosque/synagogue/temple/shrine attendance misunderstands the role religion has historically played in society. As long as only 16% of the world’s population identifies as secular or non-religious, we can safely say that religion as such is secure no matter how few Europeans or Americans are in pews on Sunday morning.


Not All Suffering is Cruciform

Sandwiched here between the two Easter commemorations this week, we’ve been granted a wonderful opportunity to reflect on the sufferings of Jesus. This opportunity comes from an unlikely source: Rep. Steve King.

King spoke at a town hall this week and fielded a question about the fear that Christians were being persecuted in the United States. (You have my sympathies.) King took the question a step further, looking away from the hardships of American Christians and turning instead to his own personal tribulations:

When I have to step down to the floor of the House of Representatives, and look up at those 400-and-some accusers — you know we just passed through Easter and Christ’s passion — and I have better insight into what He went through for us, partly because of that experience.

King no doubt has suffered. He is reviled by Democrats and held at arm’s length by Republicans, has been stripped of his committee assignments, and was the implicit subject of a censorious resolution on the House floor. But not all suffering is cruciform.

Throughout the New Testament, Christians are warned that they may face sufferings. Jesus warned of it before he ever arrived at the cross and, in the aftermath, Christians suffered more than enough to make it a common refrain in the epistles. For our purposes, the most instructive passage comes from 1 Peter 3:

Now who is there to harm you if you are zealous for what is good? But even if you should suffer for righteousness’ sake, you will be blessed. Have no fear of them, nor be troubled, but in your hearts honor Christ the Lord as holy, always being prepared to make a defense to anyone who asks you for a reason for the hope that is in you; yet do it with gentleness and respect, having a good conscience, so that, when you are slandered, those who revile your good behavior in Christ may be put to shame. For it is better to suffer for doing good, if that should be God’s will, than for doing evil.

For Christ also suffered once for sins, the righteous for the unrighteous…

We are encouraged in our suffering precisely because Christ has suffered, and we are exhorted to be fearless as long as our suffering is in the mold of Christ’s. What is cruciform suffering? There are two criteria (mentioned in this passage): suffering for doing good rather than doing evil and suffering on behalf of others. When we face trials under these conditions, they are to our credit–to our ultimate credit even.

But Steve King isn’t suffering for doing what’s right. In the most generous reading of events, he is suffering for doing what’s impolitic. He made some less than condemnatory comments about white nationalism and white supremacy…on the record…to the New York Times. (And it’s not the first time he’s tipped his cards either.) Suffering for being racially insensitive, for being bad at your job, or for being an out-and-out racist is not the same as suffering for doing right.

As we think about the suffering of Jesus in this Easter season, it’s worth putting our day to day trials (even those things we think of as persecutions) into perspective.

 


What the News Doesn’t Understand about George Washington

Politico Magazine recently ran an article about what the president doesn’t understand about George Washington. Even without details about the presidential excursion to Mount Vernon, it is safe to say that what the latest president doesn’t know about the first president is roughly coequal with everything that has been written in the many volumes about Washington. Even so, the Politico article cannot help but reveal its own blind spots. The deceptive categorization “History Dept.” atop the headline doesn’t stop Peter Cannellos from making some pretty egregious historical errors throughout the piece. It starts immediately when Cannellos compares Washington with Caesar, Napoleon, and “every past conqueror.” Washington was a rebel, maybe even a revolutionary, but it’s beyond generous to call him a conqueror.

But I’m not here for semantic quibbles. There is a much more egregious error made by Cannellos–and, if we’re being fair, by many more in the general public–about Washington: when faced with the possibility of tremendous and permanent power, he gave it all up to retire to a life of quiet seclusion, preferring principled democratic service to dictatorial authority. It is the Cincinnatus myth, named after the famous Roman general who was twice given total authority by the Roman Senate and twice gave it up voluntarily when his assigned task was completed. The myth has such a deep and enduring hold on the American national consciousness that we have a major city named after Cincinnatus (and, indirectly, after Washington and the Society of the Cincinnati over which he presided as the first president).

Cannellos sums it up like this:

[Washington] gave up power. This wasn’t expected of him; most Americans hoped he would remain president—for life, if possible. He chose instead to return to his farm at Mount Vernon. He yearned for home but also to establish enduring precedents for the nation whose independence he had helped painfully win: No man is bigger than the country. The office is more important than any president. Power is a privilege to be wielded and then handed to another.

It’s an attractive story; I mean, who wouldn’t want to live in a country founded by such a man? This rendering is even true in some of its more technical specifics. The problem is that it’s founded on a deep misunderstanding of what it is that leaders like Cincinnatus and Washington gave up.

Let’s talk briefly about Cincinnatus, who in the fifth century BCE took on dictatorial powers to stop an invading army and later to thwart a revolution. In both cases the general relinquished his powers in accordance with the law once his assignment was done. In some ways, Cincinnatus’ power was substantial: he could make laws or ignore existing laws, execute people by fiat, lead the Roman army without the advice or consent of the Senate, and spend the treasury as needed. The problem was that, in the fifth century, the Roman army was not that big and the Roman treasury not that rich. Rome was still centuries away from an empire the mastery of which would be any great prize. The Romans themselves understood this; that’s why, as they grew, they stopped appointing dictators. When crises in the first century BCE forced them back to dictators in desperation, there were no men of the “high moral character” of Cincinnatus left to be found. Yet what had changed was not the morality of Romans but the degree of temptation. What separates Julius Caesar (who makes an appearance in Cannellos’s article) and Cincinnatus (whose presence is only implied) is time, not character.

The lesson from ancient history is instructive when thinking about Washington. It is easy to see his refusal to become a king and his voluntary resignation after two terms as great sacrifices when viewed through the lens of the modern imperial presidency–or even the presidency of people like Abraham Lincoln. The government that Washington stood at the head of was a second-try experiment with highly limited and still untested powers. His branch of that government was certainly not the strongest and would only get weaker (temporarily) after he left it and the Supreme Court began to assert itself. Washington–correctly–understood the Constitution and the theories of government behind it to grant primary power to the legislative branch, and he deferred to it in almost everything except that which was specifically and narrowly within his purview.

In fact, it is best to remember that Washington almost certainly exercised more power in almost every other role in his later life than he did as president. As a general, his control of the substantial continental military was nearer to absolute and substantially more consequential than anything he wielded as president. His influence as a tycoon of Virginia real estate granted him more tangible powers as well. As president in the late eighteenth century, he was a second-tier bureaucrat and statesman.

If that seems hyperbolic, the reality of the current presidency is more so. The reason modern politicians are so desperate to cling to power is precisely because they have it. They vie for control of arguably the most powerful nation in the world. The current president wields greater power and authority not only than did Washington but also significantly more than George III did in England. When patriots called for Washington to be made a king like those of Europe, it was not an invitation to power but to impotence. (Just think about the impending fate of the French king.) When they called on him to be president, the country he presided over was supposed to be one with a government only as strong as absolutely necessary with an executive whose very existence was a concession to the failure of the previous system.

So when the current president quipped, in his inimitable style, “If [Washington] was smart, he would’ve put his name on [Mount Vernon]…You’ve got to put your name on stuff or no one remembers you,” the most appropriate response–the one given by the actual historian present–was to point out that Washington managed to get his name on plenty of stuff, like the capital city in which the president resides. The worst response, however, is to lapse into elegy about the man who had all the power in the world and surrendered it out of duty and love of country. Washington, like his predecessor Cincinnatus, simply quit his job because he had a better one waiting back home.
