
The Biggest Threats to Democracy

History reminds us of those critical events that pose threats to us on various levels: to our well-being at a narrow level and to the foundations of our democracies at a much broader level. And most of these existential threats seem to come from the outside: wars, terrorism, ethnic cleansing.

But it’s not quite that simple — the biggest threats come not from external sources of evil, but from within us. Perhaps the two most significant are our apathy and our paranoia. Taken together they erode our duty to protect our democracy and hand ever-increasing power to those who claim to protect us. Thus, before the Nazi machine enslaved huge portions of Europe, the citizens of Germany allowed it to gain power; before Al-Qaeda and ISIS and their terrorist lookalikes gained notoriety, local conditions allowed these groups to flourish. We are all complicit in our inaction — driven by indifference or fear, or both.

Two timely events serve to remind us of the huge costs and consequences of our inaction born of apathy and paranoia. One comes from the not-too-distant past; the other portends our future. First, it is Victory in Europe (VE) Day, the anniversary of the Allied victory in WWII on May 8, 1945. Many millions perished through the brutal policies of Nazi ideology and its instrument, the Wehrmacht, and millions more subsequently perished in the fight to restore moral order. Much of Europe at first ignored the growing threat of the National Socialists. As the threat grew, Europe continued to contemplate appeasement. Only later, as the true scale of the atrocities became apparent, did leaders realize that the threat needed to be tackled head-on.

Second, a federal appeals court in the United States ruled on May 7, 2015 that the National Security Agency’s collection of millions of phone records is illegal. This serves to remind us of the threat that our own governments pose to our fundamental freedoms under the promise of continued comfort and security. For those who truly care about the fragility of democracy this is a momentous and rightful ruling. It is all the more remarkable that since the calamitous events of September 11, 2001 few have challenged this governmental overreach into our private lives: our phone calls, our movements, our internet surfing habits, our credit card history. We have seen few public demonstrations and all too little ongoing debate. Indeed, only through the recent revelations by Edward Snowden did the debate even enter the media cycle. And, the debate is only just beginning.

Both of these events show that only we, the people who are fortunate enough to live within a democracy, can choose a path that strengthens our governmental institutions and balances these against our fundamental rights. By corollary we can choose a path that weakens our institutions too. One path requires engagement and action against those who use fear to make us conform. The other path, often easier, requires that we do nothing, accept the status quo, curl up in the comfort of our cocoons and give in to fear.

So this is why the appeals court ruling is so important. Though only three in number, the judges have established that our government has been acting illegally, yet supposedly on our behalf. While they did not terminate the unlawful program, they pointedly requested that the US Congress debate and then define laws that would be narrower and less at odds with citizens’ constitutional rights. So the courts have done us all a great favor. One can only hope that this opens the eyes, ears and mouths of the apathetic and fearful so that they continually demand fair and considered action from their elected representatives. Only then can we begin to make inroads against the real and insidious threats to our democracy — our apathy and our fear. And perhaps, also, Mr. Snowden can take a small helping of solace.

From the Guardian:

The US court of appeals has ruled that the bulk collection of telephone metadata is unlawful, in a landmark decision that clears the way for a full legal challenge against the National Security Agency.

A panel of three federal judges for the second circuit overturned an earlier ruling that the controversial surveillance practice first revealed to the US public by NSA whistleblower Edward Snowden in 2013 could not be subject to judicial review.

But the judges also waded into the charged and ongoing debate over the reauthorization of a key Patriot Act provision currently before US legislators. That provision, which the appeals court ruled the NSA program surpassed, will expire on 1 June amid gridlock in Washington on what to do about it.

The judges opted not to end the domestic bulk collection while Congress decides its fate, calling judicial inaction “a lesser intrusion” on privacy than at the time the case was initially argued.

“In light of the asserted national security interests at stake, we deem it prudent to pause to allow an opportunity for debate in Congress that may (or may not) profoundly alter the legal landscape,” the judges ruled.

But they also sent a tacit warning to Senator Mitch McConnell, the Republican leader in the Senate who is pushing to re-authorize the provision, known as Section 215, without modification: “There will be time then to address appellants’ constitutional issues.”

“We hold that the text of section 215 cannot bear the weight the government asks us to assign to it, and that it does not authorize the telephone metadata program,” concluded their judgment.

“Such a monumental shift in our approach to combating terrorism requires a clearer signal from Congress than a recycling of oft-used language long held in similar contexts to mean something far narrower,” the judges added.

“We conclude that to allow the government to collect phone records only because they may become relevant to a possible authorized investigation in the future fails even the permissive ‘relevance’ test.

“We agree with appellants that the government’s argument is ‘irreconcilable with the statute’s plain text’.”

Read the entire story here.

Image: Edward Snowden. Courtesy of Wikipedia.

I Don’t Know, But I Like What I Like: The New Pluralism

In an insightful opinion piece, excerpted below, a millennial wonders if our fragmented, cluttered, information-rich society has damaged pluralism by turning action into indecision. Even aesthetic preferences come to be so laden with judgmental baggage that expressing a preference for one type of art, or car, or indeed cereal, seems to become an impossible conundrum for many born in the mid-1980s or later. So a choice becomes a way to alienate those not chosen — when did selecting a cereal become such an onerous exercise in political correctness and moral relativism?

From the New York Times:

Critics of the millennial generation, of which I am a member, consistently use terms like “apathetic,” “lazy” and “narcissistic” to explain our tendency to be less civically and politically engaged. But what these critics seem to be missing is that many millennials are plagued not so much by apathy as by indecision. And it’s not surprising: Pluralism has been a large influence on our upbringing. While we applaud pluralism’s benefits, widespread enthusiasm has overwhelmed desperately needed criticism of its side effects.

By “pluralism,” I mean a cultural recognition of difference: individuals of varying race, gender, religious affiliation, politics and sexual preference, all exalted as equal. In recent decades, pluralism has come to be an ethical injunction, one that calls for people to peacefully accept and embrace, not simply tolerate, differences among individuals. Distinct from the free-for-all of relativism, pluralism encourages us (in concept) to support our own convictions while also upholding an “energetic engagement with diversity,” as Harvard’s Pluralism Project suggested in 1991. Today, paeans to pluralism continue to sound throughout the halls of American universities, private institutions, left-leaning households and influential political circles.

However, pluralism has had unforeseen consequences. The art critic Craig Owens once wrote that pluralism is not a “recognition, but a reduction of difference to absolute indifference, equivalence, interchangeability.” Some millennials who were greeted by pluralism in this battered state are still feeling its effects. Unlike those adults who encountered pluralism with their beliefs close at hand, we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable. As a result, we continue to struggle when it comes to decisively avowing our most basic convictions.

Those of us born after the mid-1980s whose upbringing included a liberal arts education and the fruits of a fledgling World Wide Web have grown up (and are still growing up) with an endlessly accessible stream of texts, images and sounds from far-reaching times and places, much of which were unavailable to humans for all of history. Our most formative years include not just the birth of the Internet and the ensuing accelerated global exchange of information, but a new orthodoxy of multiculturalist ethics and “political correctness.”

These ideas were reinforced in many humanities departments in Western universities during the 1980s, where facts and claims to objectivity were eagerly jettisoned. Even “the canon” was dislodged from its historically privileged perch, and since then, many liberal-minded professors have avoided opining about “good” literature or “high art” to avoid reinstating an old hegemony. In college today, we continue to learn about the byproducts of absolute truths and intractable forms of ideology, which historically seem inextricably linked to bigotry and prejudice.

For instance, a student in one of my English classes was chastised for his preference for Shakespeare over the Haitian-American writer Edwidge Danticat. The professor challenged the student to apply a more “disinterested” analysis to his reading so as to avoid entangling himself in a misinformed gesture of “postcolonial oppression.” That student stopped raising his hand in class.

I am not trying to tackle the challenge as a whole or indict contemporary pedagogies, but I have to ask: How does the ethos of pluralism inside universities impinge on each student’s ability to make qualitative judgments outside of the classroom, in spaces of work, play, politics or even love?

In 2004, the French sociologist of science Bruno Latour intimated that the skeptical attitude which rebuffs claims to absolute knowledge might have had a deleterious effect on the younger generation: “Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.” Latour identified a condition that resonates: Our tenuous claims to truth have not simply been learned in university classrooms or in reading theoretical texts but reinforced by the decentralized authority of the Internet. While trying to form our fundamental convictions in this dizzying digital and intellectual global landscape, some of us are finding it increasingly difficult to embrace qualitative judgments.

Matters of taste in music, art and fashion, for example, can become a source of anxiety and hesitation. While clickable ways of “liking” abound on the Internet, personalized avowals of taste often seem treacherous today. Admittedly, many millennials (and nonmillennials) might feel comfortable simply saying, “I like what I like,” but some of us find ourselves reeling in the face of choice. To affirm a preference for rap over classical music, for instance, implicates the well-meaning millennial in a web of judgments far beyond his control. For the millennial generation, as a result, confident expressions of taste have become more challenging, as aesthetic preference is subjected to relentless scrutiny.

Philosophers and social theorists have long weighed in on this issue of taste. Pierre Bourdieu claimed that an “encounter with a work of art is not ‘love at first sight’ as is generally supposed.” Rather, he thought “tastes” function as “markers of ‘class.’ ” Theodor Adorno and Max Horkheimer argued that aesthetic preference could be traced along socioeconomic lines and reinforce class divisions. To dislike cauliflower is one thing. But elevating the work of one writer or artist over another has become contested territory.

This assured expression of “I like what I like,” when strained through pluralist-inspired critical inquiry, deteriorates: “I like what I like” becomes “But why do I like what I like? Should I like what I like? Do I like it because someone else wants me to like it? If so, who profits and who suffers from my liking what I like?” and finally, “I am not sure I like what I like anymore.” For a number of us millennials, commitment to even seemingly simple aesthetic judgments has become shot through with indecision.

Read the entire article here.