Opinion

Holy Cross Should Take No Pride in Fauci

Looking at the long list of College of the Holy Cross alumni, many names pop out as belonging to very accomplished individuals: Bob Cousy, a basketball great; Jon Favreau, an accomplished writer who worked as a speechwriter for President Obama; and Clarence Thomas, associate justice of the United States Supreme Court. Yet one man stands in front of them all as the pride of Holy Cross: Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases and Chief Medical Advisor to the President. Despite his de facto status as the pride of Holy Cross, he is the least deserving of the lot, advocating policies that infringe on the American people’s freedoms, like vaccine mandates, and lying to the American public, most notably in the case of gain-of-function research and the NIH’s involvement with the Wuhan Institute of Virology.

The name Anthony Fauci first entered the collective American vocabulary in the spring of 2020, when he became the figure Americans looked to for information on the COVID-19 pandemic. Students of Holy Cross, having just been sent home due to the pandemic, rallied behind Fauci in an effort to keep some connection to the campus from which they had just been separated. Soon, the Instagram trend “Fauci Friday” gained traction, in which students would photoshop Fauci into pre-pandemic pictures of themselves and their friends and post them to their Instagram stories. Unfortunately, Fauci has proved over his time as the face of the COVID-19 pandemic that he is not a man who should be celebrated.

Since that time, Fauci has become less a public health policymaker and more a talking head for the masses, appearing in interview after interview — some serious, many not — to the point where many on the Left have given him cult status, all while he collects his salary of $417,608 a year, higher than that of any other federal worker, including the president. It has become commonplace for merchandise to be sold with his face on it, like pillows or bobbleheads. Of course, one must ask, “What has he actually done to receive this much praise?” Some may point to his earlier work on the HIV/AIDS epidemic, which was essential in many ways and for which he received the Presidential Medal of Freedom. That work should be commended, but his leadership during the COVID-19 pandemic has been marred by bad policy, conflict, and lies.

First, concerning the policies that Fauci has espoused and approved of, many would point to his flip-flopping statements regarding both the danger of COVID-19 and his stance on masks as worthy of criticism. On both issues he changed his opinion in the first few months of the pandemic, but I would not count this as a failure as others would. It is important to acknowledge that science is an ever-evolving field in which the answers we once agreed upon as the truth are not always the ones we later find to be true. In this same vein, though, vaccine skepticism is surely justified: since science is ever evolving, the vaccine deemed beneficial by science now might not be so regarded in the future. I do not support anti-vaccine rhetoric, but the point is that science is not as set in stone as the so-called “experts” make it out to be. It is reasonable to be suspicious of a new, somewhat rushed vaccine under an administration which, before it took office, blasted the vaccine as untrustworthy. Despite this, Fauci still supports the overarching policy of vaccine mandates.

Fauci limits the freedom of Americans by supporting policies that compel them to have a substance they find untrustworthy injected into them. Fauci would respond that public health trumps freedom in this case: you may be skeptical of the vaccine, but by not taking it you run the risk of killing others by spreading the disease. This argument has very little merit, though, when one looks at the data comparing the deaths of vaccinated versus unvaccinated Americans. Andy Slavitt, a former adviser to the Biden administration on COVID-19, noted that 98% to 99% of those who died from COVID-19 in May were unvaccinated. With a vaccine to which we have near-universal access in America (excluding children, for whom the virus poses an extremely minimal threat), this leads to the conclusion that the unvaccinated do not pose a threat to the vaccinated. Why are we trying to save those who do not want to be saved? The United States is a nation constructed to stand against tyranny. This means that, in America, one should have the right to refuse a vaccine, as at this point in the pandemic the only people refusers stand to hurt are themselves. If they choose to risk their lives in this way, that is their choice, just as a smoker has the right to smoke despite increasing his chance of cancer by huge margins.

Fauci’s policies may fly in the face of freedom, but the most disgraceful acts Fauci has undertaken in his time as America’s doctor are his lies regarding gain-of-function research in Wuhan. In an exchange in May between Fauci and Senator Rand Paul (R-KY), Fauci said, “We did not fund gain-of-function research in the Wuhan Institute of Virology.” Wuhan, and the Wuhan Institute of Virology in particular, is the theorized origin of the virus. Gain-of-function research is research that increases the transmissibility and/or virulence of a pathogen, generally involving its transmissibility to humans.

On October 20th, this claim was utterly debunked by the National Institutes of Health (NIH), of which Fauci’s National Institute of Allergy and Infectious Diseases (NIAID) is a part, when it sent a letter to Congress admitting that gain-of-function research did occur at the Wuhan Institute of Virology. The letter details that at least $600,000 was given to EcoHealth, a U.S.-based group that used that money, along with researchers in Wuhan, to study bat coronaviruses. This completely contradicts the categorical denial that this type of research occurred, which Fauci issued despite working under the NIH.

The NIH blames the EcoHealth group, claiming that the appearance of a new, highly contagious form of the coronavirus was not reported to it until August of this year. Either Fauci lied and the NIH revealed that lie; or Fauci had no clue what was occurring under the supervision of his own agency; or there is a much larger cover-up, possibly involving the origins of COVID-19 and the United States government’s role in it. This comes amid increasing skepticism about whether the coronavirus lab leak theory is really as debunked as Fauci would like to make it seem.

Fauci has, time and again, decried the lab leak theory. And while the majority of government agencies still maintain either that the origin is uncertain or that it is natural (notably with self-reported low confidence), an increasing number of news organizations and even one US government intelligence agency have come to the conclusion that the lab leak theory is the most likely origin. Of course, geopolitical concerns are undoubtedly in the back of the minds of Fauci and the NIH. These concerns include not upsetting the Chinese Communist Party (CCP), similar to how the Director-General of the World Health Organization refused to criticize the CCP for fear that it would not cooperate with the pandemic response, despite its repeated efforts to undermine the information-gathering process. It is clear that the CCP has no intention of cooperating and no consideration for the rest of the world, yet Fauci still chooses to put his trust in it and its scientists.

The CCP has shown repeatedly that it cannot be trusted, yet Fauci’s NIAID and the NIH continue to fund research in China and other foreign countries with suspect ethics. These places do not have the same regulations and safety precautions regarding experiments, and so they are ideal places to conduct dangerous and ethically questionable research. The agencies can claim some degree of plausible deniability, as they have done through EcoHealth and its gain-of-function research in Wuhan. Why try to do this research in an American lab with ethical standards, when you could do it in China with no such concerns and the ability to blame someone else if you leak the most dangerous virus in a century?

Additionally, at the time of this writing, a new story is emerging that Fauci, through the NIH, sent funding to a Tunisian laboratory conducting inhumane experiments on beagles. According to the White Coat Waste Project, a bipartisan organization that seeks to end taxpayer-funded experiments on animals, Fauci’s NIAID sent $424,000 to a Tunisian laboratory that ran experiments in which beagles’ heads were enclosed in cages into which sand flies were introduced to feed on them. This story is still developing, so it should be treated with a fair amount of skepticism, but it is concerning nonetheless.

While many on campus take pride in Dr. Anthony Fauci as a Holy Cross alumnus, he has proven, especially through his most recent actions, that he should not be celebrated. His deliberate disregard for freedom in exchange for questionable policy is ignored and sometimes even encouraged by his supporters. The cult-like following he has garnered, not just on campus but in America as a whole, is concerning. Fueled by fanaticism, his supporters throw the very real concerns about suspect research out the window as nonsense conspiracy theories, rather than treating them as serious and possibly criminal allegations that should be investigated further. Holy Cross should take no pride in Fauci.

Big Tech Hegemony Threatens Civil Discourse

You have likely heard the term “Big Tech” thrown around from time to time. It refers to the dominant information technology companies like Amazon, Google, and Facebook, among others — basically, the platforms that the majority of Americans use on a daily, or even hourly, basis. And while these platforms are great tools, our reliance on them is becoming increasingly concerning. Our free speech has become dangerously tied to our ability to access them. These companies have begun selectively applying their terms of service and community standards against those with particular viewpoints, creating a litany of hypocrisy that must come to an end.

The most cited example of Big Tech censorship involves former President Donald Trump. In response to the January 6 attack on the U.S. Capitol, Twitter and Snapchat permanently banned POTUS #45, while Facebook and Instagram put him on an indefinite time-out. These companies believe Trump incited the attack with his social media activity and feared his posts would produce further violence. Regardless of whether these claims hold truth, one must wonder two things: 1. If Big Tech can censor the leader of the free world, is there a limit to its power? and 2. What does this say about the state of free speech in the United States? The President was completely de-platformed from communicating through Big Tech — he could not post even the most innocent material. The arbitrary application of community standards is extremely concerning, especially considering that heads of state sponsors of terror are still on social media. Those platforms are used to recruit new members and plan violent attacks, yet Big Tech has not been as quick to voice concerns over that potential violence as it was with a former president. A more consistent application of community standards is clearly necessary.

Beyond the example of #45, other public figures with views adverse to the left’s have also been censored. In May, conservative comedian Steven Crowder received his second strike from YouTube on claims of harassment and cyberbullying. The episode at issue covered the shooting death of Ma’Khia Bryant by a police officer. Crowder discussed how Bryant attacked someone with a knife before the officer shot her, arguing with his co-host that the shooting was justified. In a fashion fitting for a comedian, Crowder joked about the incident, eliciting the removal of the video from YouTube. Had he received one more strike within 90 days, he would have been cut off from the platform and his millions of subscribers. This October, Crowder was subject to a week-long channel freeze for an episode in which he suggested “trans people pose a rape threat to women” while discussing California’s decision to house biological males in women’s prisons. YouTube said Crowder violated its hate speech policy. Bill Maher’s video expressing his pleasure at David Koch’s death, however, is apparently completely fine. The issue is that even if the claims do violate YouTube standards, these platforms seem to apply their guidelines very selectively.

The oppressive leftist zeitgeist of today’s discourse even pervades the academic arena, which is perhaps the most dangerous place to censor civil discourse. Even Amazon censors certain books. In February 2021, for example, Ryan Anderson’s book When Harry Became Sally: Responding to the Transgender Moment became the first to be banned under Amazon’s new hate speech policy. The book had been a best-seller when it was first released. Further, Amazon recently prohibited ads for the book BLM: The Making of a New Marxist Revolution because it “contains book/s or content that is not allowed” under its “Creative Acceptance Policies.” The book dives into the difference between the statement “black lives matter” and the organization of the same name. These two examples deal with opinions contrary to the left’s agenda, which is reflective of a larger problem within today’s civil discourse. Rather than engaging in difficult discussions, many try to censor the right with claims of hate speech or offensive content. This steadily narrows what is considered an acceptable opinion, and Big Tech just adds to this growing problem.

Regardless of political leanings, the selective use of Big Tech censorship poses a threat to the country’s political discourse. If we cannot discuss difficult issues freely and openly, then we risk becoming a thoughtless nation. The “memory hole” of Orwell’s novel 1984 comes to mind: anything the government wanted wiped from the public record was put through a chute into an incinerator to revise history and promote Party dogma. At this point, Big Tech essentially has the power of the memory hole. It can promote the policy positions it favors and censor others under the guise of community standards violations. While some may claim that Big Tech censorship can be solved by switching to a different platform, it is not that simple. Amazon dominates the book-selling market, selling 65% of all new online book units. Getting blacklisted by Amazon thus poses a dramatic obstacle to your book sales. Further, if you’re de-platformed from somewhere like YouTube, it becomes incredibly difficult to maintain your following. The loss of your account also marks the loss of your subscribers. Losing access to these platforms means you will have to jump through hoops to maintain what you built.

If Big Tech is going to go censorship-crazy, it should create more transparent and definitive standards of behavior, applied evenly across the board. The future of civil discourse could very well depend upon it. Our republic was built through deliberation by great thinkers who brought different ideas to the table, with the best ultimately rising to the top. In the spirit of the American experiment, we need to return to true deliberation. A willingness to engage with controversial ideas is the only way to affirm or dispute them. We cannot label everything we disagree with as “hate speech” or “offensive,” even if it makes us uncomfortable. Instead, embrace debate. Prove others wrong. Technology and the media should be revolutionary means for us to do this. Instead, they have become enemies of free, unconventional thought. Big Tech’s memory hole must come to an end.

Holy Cross’ Empty Virtue Signaling

In recent decades, Holy Cross has undergone a dramatic transformation from a small, Catholic men’s college to the nationally recognized liberal arts institution it is today. Like any organization grappling with its identity amid a changing society, Holy Cross has had its fair share of successes and failures in this process. Tensions over the extent to which the college’s new, pluralistic identity can coexist with its religious heritage are still ongoing, and are unlikely to be resolved any time soon. But as Holy Cross’ Catholic, Jesuit character increasingly falls by the wayside, it is worth examining what the College’s traditional ethos is being replaced with. As many students and alumni know, social justice issues are a central focus of the College today. But the administration’s initiatives in these areas reveal an institution defined more by virtue signaling than substantive action.

Take, for instance, the administration’s commitment to racial justice. The College’s Anti-Racism Action Plan, adopted in June 2020, outlines the administration’s initiatives to transform Holy Cross into an “actively anti-racist institution.” The plan, then-President Boroughs wrote, is a “starting point” for “overcom[ing] the sin of racism, whether it be interpersonal or structural,” at Holy Cross. That the administration apparently believes Holy Cross to be an institution infected by structural racism is curious in itself. Of the three highest-profile members of the College’s executive team — President Vincent Rougeau, Provost Margaret Freije, and Vice President for Student Affairs Michele Murray — no fewer than two are black. Would this be possible at a structurally racist institution?

Nonetheless, this is the premise the College is working with. To be sure, racism is a serious issue — and indeed a sin — that should be taken seriously by any organization, and especially mission-driven institutions like Holy Cross. And although there are serious issues with the contemporary “anti-racist” movement, ideas for making Holy Cross a more diverse and welcoming community are certainly worth pursuing. So, even if the impetus for Holy Cross’ recent anti-racism efforts — the College’s supposed structural racism — is questionable, at least some positive impacts will come of it, right?

Not exactly. The College’s Anti-Racism Action Plan is many things, but “substantive” is not one of them. Its forty goals are mostly vague, cosmetic, myopic, or trivial — or some combination of the four. Workshops, seminars, reflection series, and ad hoc committees abound, including sessions on “Becoming a White Ally for Racial Justice” and a “Listen and Learn” book club. Other initiatives seem potentially problematic (such as a proposed reporting website for “microaggressions” on campus), or appear not to have been implemented (such as a planned “Anti-Racism Capacity Building Fund” for student organizations).

A major pillar of the plan calls for “recruit[ing] diverse communities — students, faculty, and staff — to our campus.” But, as I noted in a previous article, the Holy Cross student body is already 26 percent nonwhite — higher than the Massachusetts statewide figure (22 percent). Meanwhile, 36 percent of the College’s tenure-track faculty hires in the five years prior to the adoption of the anti-racism plan were people of color, already higher than the nonwhite proportion of recent doctoral graduates (33 percent). Evidently, Holy Cross doesn’t need an “Anti-Racism Action Plan” to recruit diverse talent.

The College’s fervor for racial justice appears even emptier when one considers the administration’s actions, in recent years, to backtrack on true efforts to provide opportunities for minority students. Most notably, Holy Cross in 2019 quietly abandoned its need-blind admissions policy, which had been in place for decades. Need-blind admissions, in which an applicant’s need for financial aid is not considered in admissions decisions, help equalize the college admissions process and give students from disadvantaged backgrounds a greater chance when applying to a selective institution like Holy Cross. Importantly, need-blind admissions disproportionately benefit students of color, and colleges with this policy experience measurable gains in student diversity.

Of course, need-blind admissions are not financially feasible for most colleges and universities in the United States. Indeed, Holy Cross gave this reason when it ended the policy two years ago, citing the burden of $67 million in annual financial aid costs. Nonetheless, with a $760 million endowment, and $420 million raised by the recent “Become More” capital campaign, it is hard not to feel that the College could devote more resources to financial aid if it wished to, especially given its willingness to spend, for instance, an exorbitant $107 million on a new performing arts center. It’s just a matter of priorities.

What good are “anti-racism action plans” and Diversity, Equity, and Inclusion (DEI) workshops when the administration has taken concrete steps to make Holy Cross less accessible to minority students? Adding insult to injury is the fact that even as Holy Cross claims it cannot afford the cost of need-blind admissions, the size of its DEI bureaucracy — and the associated costs — have multiplied. The Anti-Racism Action Plan announced the hiring of at least seven new administrators — three in the Office of Multicultural Education and four in the Office of Title IX and Equal Opportunity. This is in addition to the preexisting Office of Diversity, Equity, and Inclusion.

The duties of these staff include hosting student events like Gathered, a “self-care workshop” where students can “reclaim their space and energy,” and Spectrum, advertised as a “celebratory space that centers queer, trans, non-binary, and gender non-conforming BIPOC [black, indigenous, and people of color] experiences.” One cannot help but feel that the salaries of the staff leading such events could be put to better use by the College elsewhere. My own suggestion would be to combine these offices into one, cut the staff by two-thirds, and redirect the surplus funding to something that would actually benefit students — such as a scholarship fund for students of minority or disadvantaged backgrounds.

Holy Cross’ unwillingness to back up its professed commitments with substantive action is not limited to “anti-racism.” The College, for instance, proclaims that “at Holy Cross, sustainability isn’t a buzzword” — yet it continues to invest in fossil fuels, and rejected calls in 2016 to divest from dirty energy. And, in the midst of a significant dining services staff shortage, Holy Cross has left students to continue facing limited food options and shortened hours rather than raising wages to attract workers. As the College’s assistant director of employment, Margaret Rollo, noted in a recent Spire interview, “We currently don’t have wages or a salary that is competitive. That’s up to the College to make those decisions.” With its ample financial resources, surely Holy Cross could offer dignified wages to its dining staff if it wished to. Again, priorities.

Why is Holy Cross’ commitment to “social justice” reflected in its rhetoric, but not its actions? The answer is simple. It is much easier to be “virtuous” in ways that require little concrete sacrifice on the part of the College and its administrators. It is easier to hang rainbow flags, host “allyship” workshops, and install composting bins than it is to take on the financial and institutional costs necessary for the College to pursue real action on the causes it advocates for. To be sure, Holy Cross’ professed commitments to racial justice, environmental sustainability, and other causes are admirable — at least in theory. But virtue signaling is not virtue, and language without action is empty.

The Decline of Western Civilization at Holy Cross

The endangered species list is due for a new member: history of Western Civilization courses at Holy Cross. Out of 27 courses offered by the College’s History Department in the Spring 2022 semester, only two focus on pre-1500 Western history. In a department of 18 professors, only one specializes in pre-1500 Western history. The Department is currently in the process of hiring another Latin Americanist rather than a medievalist or ancient Mediterranean specialist. This might not seem objectionable at first glance, but it is a serious concern for anyone interested in a genuine liberal arts education. A robust schooling in Western Civilization’s origins is essential for the growth of responsible and informed citizens in a modern liberal democracy, and must be central to any liberal arts curriculum. 

Before delving into the body of the article, I want to make clear that, despite my criticism, I have a deep appreciation for the History Department. I am sincerely grateful for the opportunity to study under the professors with whom I have taken courses, for they epitomize the best of the historical profession. My quarrel is not with them (or any professor); indeed, I can only commend them for their work in the discipline. Nor do I want to denigrate non-Western areas of study — those are incredibly important to the discipline as well. I only desire the recognition that medieval European and ancient Mediterranean studies hold particular value for the Western citizen.

To the postmodern mind, it is entirely uncouth to suggest that a particular area of history is essential and should be prioritized. Yet, despite a popular aversion to admitting it, there is indeed a hierarchy of historical importance, particularly during the finite time of an undergraduate education. Walter Lippmann’s 1940 speech to Harvard University’s Phi Beta Kappa Society offers a cogent case for why universities must defend the necessity of educating students in the tradition and history of Western Civilization. It will serve as the basis for this article’s analysis and criticism of the decline in the study of Western Civilization, both at Holy Cross and around the country. Lest he be dismissed out of hand, it should be noted that Lippmann was hardly a conservative: he dabbled with socialism for a time, worked for the Wilson administration, and considered himself a progressive for much of his life.

Lippmann begins from a bird’s-eye view of education and its aims. The modern education system finds its roots in the 19th-century West, with the goal – quoting Jefferson – of providing the foundation for “the preservation of freedom and happiness.” In Lippmann’s judgment, that foundation has utterly failed. Indeed, it is the students of these schools who in the 20th century “have either abandoned their liberties, or have not known, until the last desperate moment, how to defend them.” One can only defend liberties if one is educated in the history and principles upon which liberty depends.

Lippmann understood that the individuals who built the United States, who constructed and maintained the freest society the world has known, did so with a deep understanding of the West’s past. Many of the concepts that undergird free societies – such as universal subjection to the law regardless of social station, the principle of representation, checks and balances, or respect for the human body (as created in the image of God) – were birthed in the ancient Mediterranean. These critical ideas, among many others, were then further developed and enriched in the medieval West. The institutions of a free society that are taken for granted today are but the tip of the stalagmite of Western Civilization. Lippmann quotes French philosopher Etienne Gilson:

“[Western culture] is essentially the culture of Greece, inherited from the Greeks by the Romans, transfused by the Fathers of the Church with the religious teachings of Christianity, and progressively enlarged by countless numbers of artists, writers, scientists and philosophers from the beginning of the Middle Ages up to the first third of the nineteenth century.” 

The American Founders were the heirs of this culture; they were manifestations of a continuous tradition and history stretching back thousands of years. Saint John Henry Cardinal Newman, in The Idea of a University, asserts that, at its foundation, the West is a synthesis of two great traditions, that of Athens and that of Jerusalem: reason and faith. Newman, however, stretches the West’s history back even further, seeing its origin in the great civilizations of the Near East, from Egypt to Mesopotamia. While the geographical center of Western Civilization has shifted in the course of history, its continuity is not in doubt. Hence the importance of educating Western citizens – and this includes all who inhabit the free world – in the tradition of Western Civilization: to fail to do so is to fail to preserve this great inheritance.

Lippmann defends the importance of preserving tradition – which requires understanding it – in a manner reminiscent of Edmund Burke. No individual or society can start from scratch or jettison the accumulated knowledge of generations and expect to progress as a civilization. Like a stalagmite, civilization grows upon a wide and ancient foundation. Lippmann analogizes this to the practice of modern science: “[Society is] able to do advanced experiments which increase knowledge because they do not have to repeat the elementary experiments.” Burke asserted much the same, although he termed respect for tradition “prejudice.” This is not the kind of prejudice one thinks of today; instead, it is prejudice in favor of deferring to the combined wisdom of generations past, for, as Russell Kirk affirms in The Conservative Mind, the knowledge of the common man “is a kind of collective wisdom” without which “he is thrown back upon his own private stock of reason, with the consequences which attend shipwreck.” This Burkean prejudice is the human mind’s response to one’s inability to discover every truth for oneself — it is not to be thrown out as backward or primitive; rather, it should be respected and utilized. Lippmann understood, however, that the educational system of 1940 – and this is even more true of 2021 – had no interest in strengthening the intellectual roots, or encouraging the Burkean prejudice, of Western society, as the curriculum had been progressively purged of pre-1500 Western Civilization.

It is not just that there are fewer courses offered on the history of Western Civilization in institutions of higher education; the problem is also that the subject is no longer required. The typical university student’s education is far more subjective than it once was, with undergraduates – in most colleges – having only to fulfill a short list of vague requirements. At Holy Cross, this takes the form of the Common Requirements, which include such expansive terms as ‘Studies in Religion’ or ‘Historical Studies’. These requirements are so vague that a student can graduate without having taken a single course on Christianity or Western history. More specifically, in the History Department, majors need take only two pre-modern courses – and those need not be Western, pre-1500, or ancient history either. Educational requirements should exist not just to give the student a breadth of experience in various areas of study, but also to educate the student in those areas which are essential to the society that he or she will inhabit.

For Lippmann, a society can only endure when there are common bonds, part of which is a common knowledge of shared heritage and tradition. Education fails in its civic duty – the preservation and furtherance of a free society – when it fails to have a standard of knowledge, when it fails to provide a common well from which members of a society can draw. Today, it is a concern for equity that has caused the teaching of Western Civilization — the common well — to be superseded and diminished. Rather than centering their offerings on the school’s duty to provide foundational knowledge for the student, history departments (Holy Cross’ included) have chosen to base them in part on equity, or equal representation of cultures and geographic areas — another symptom of contemporary relativism. There is nothing wrong with having culturally and geographically diverse history — indeed, it is a good — but when that comes at the expense of essential topic areas (such as the study of pre-1500 Western history), it does a deep disservice to the student’s education.

What is required to revivify the education system is a revitalization of studies in Western Civilization and a rejection of the postmodern attachment to relativism. Not all areas of historical study are of equal importance for the educated citizen. Some areas should be prioritized – indeed, required – and some should remain elective. This does not mean that the less traditional areas of historical study need to be removed from the curriculum – far from it. It does mean, however, that the College should construct its Common Requirements so as to educate the individual in the society he or she inhabits. The History Department should rebuild its medieval Western and ancient Mediterranean history programs, and rework its major requirements to specify that all history majors must take at least one course in each of these topic areas. Politically unpopular though they may be, these changes are necessary if the College truly desires to educate men and women “for and with others” in a shared society.

Why Classics is Valuable and Cancel Culture is Toxic

Over the summer, I worked as an intern at National Review and was fortunate enough to see many articles about a variety of different topics published in real time. At the beginning of my internship, I read an article by Cameron Hilditch called “Without the Classics, Our History is Incomprehensible.” In this article, Cameron discusses the underlying influence that ancient Greek and Roman culture has had on American history and the roots of our civilization as we know it. He also addresses the recent decision by Princeton University to drop the requirement that classics majors learn Greek and Latin. Expertly weaving in the influence that the “pagan classics” have had on the politics and intellectual beliefs of our ancestors (more so, he says, than even the Judeo-Christian Bible), Cameron forms a concrete argument that opposes the cancellation of classics in modern scholarship and makes the case for preserving a curriculum that educates students about the trajectory of Western civilization.

“Western civilization” as a term has become a polarizing concept these days. Mahatma Gandhi is said to have joked, when asked what he thought about Western civilization: “I think it might be a good idea.” And sure, that may be funny, but some have taken it literally, arguing that there is, in fact, no such thing as Western civilization. Scholars are doing it, and schools are doing it. A couple of years ago, the classics faculty at Oxford University, for example, recommended that Homer’s Iliad and Vergil’s Aeneid be removed from the “Literae Humaniores” (a famous undergraduate course at Oxford focused on the classics in particular). Apparently, this decision was made because of the differences in exam results between male and female undergraduates, as well as between privately and publicly educated students. However, seeing as all of these people are at Oxford, for goodness’ sake, the axing of two of the most influential epics of the Western canon is preposterous at best and disgraceful at worst. The effort to erase these two works from the curriculum is a microcosm of the wider attempt by modern scholars to do away with the concept of Western civilization — along with its art, culture, literature, and enduring ideas — as a whole.

In 2019, I wrote an article for WestView News, a newspaper based in the West Village of NYC, called “Keeping Ancient Greek and Latin Alive.” Back then, I was an idealistic senior in high school who was just beginning to appreciate all that the classics had to offer but was also aware that interest in classical languages and history was in decline. I had studied Latin since middle school, traveled to Rome through a summer “Latin immersion” program, and been exposed to Greek culture my entire life as a second-generation Greek American. At that time, I was also deciding the next major step in my life: where to go to college. I chose to attend Holy Cross not just for its welcoming community and high-caliber academics, but for its robust and expansive classics department. I was impressed by the sheer number of professors in the department and the fact that all of them were so supremely knowledgeable about a vast array of topics, ranging from Greek tragedy to classical archaeology to even gender in antiquity. The opportunities to expand my own capabilities as a classics student were seemingly endless, and looking back as a junior, they have proved to be more fruitful than I could have ever imagined. I wholeheartedly admit that I would not be as well-rounded and capable a classicist as I am today without the expertly crafted language and culture courses offered by the Holy Cross Classics Department.

Reading Cameron’s article, I was heartbroken to learn that Princeton University announced that its classics majors will no longer be required to learn Greek or Latin. As of June 2021, the “classics track” (which required intermediate proficiency in either Latin or Greek to enter) was eliminated altogether, and the general requirement of taking Greek or Latin was removed. According to members of the department, these changes to Princeton’s requirements were instituted in order to create a more “inclusive” and “equitable” program of study. Although the school claims that this change will incentivize more students to become majors, what are the true implications of its decision? Are the Princeton professors admitting that classics as a field is racist, thereby invalidating and tarnishing their entire academic careers spent studying and teaching the subject? Or are they saying that some students at Princeton University are in fact incapable of succeeding in these rigorous language courses? It is impossible to wrap your head around this issue without coming to one of these conclusions. Instead of doing away with a subject that is undoubtedly extremely difficult, shouldn’t a school like Princeton, and others following in similar footsteps, utilize the colossal endowments they have at their disposal to provide better resources for mastering Greek and Latin? Through tutoring, better-structured courses, and improved professor-student relationships, the problem of the subject’s difficulty could slowly but surely be eliminated altogether.

I certainly understand that the privilege of studying classics is not afforded to all, and I am grateful that I have been fortunate enough to be given the opportunity to study this subject for many years with countless resources at my disposal. It is no secret that communities of color and students in underrepresented groups in the United States have suffered from a lack of access to the classics; however, the classics community of late has begun to fundamentally change this, especially in the United States. There are a multitude of new initiatives in middle schools, high schools, and universities to incorporate more BIPOC (black, indigenous, and other people of color) and underprivileged students into their classics programs. I personally have been involved in these types of programs: in high school, I volunteered through the Paideia Institute’s Aequora program, which is driven by the belief that “Classics [is] an inclusive, diverse, and socially engaged field.” Every week, I would go to a public school in Flushing, NY, to teach Latin to elementary school kids from underprivileged families. This experience was not just helpful for them, but also gratifying for me, and I was able to witness firsthand the benefits that Latin had on these students: they absolutely adored both Latin and the act of learning itself. I am also currently on the Classics Inclusion Committee at Holy Cross, which upholds the same values as the Aequora program and is working to establish an equitable community here at Holy Cross without getting rid of or dumbing down the already established language requirements.

Classics has long been considered a niche and “gatekept” subject, but this doesn’t have to be the case: with enough effort, classics can become open to all who wish to study it. Simply giving up and saying that students of color are at a disadvantage in becoming successful classicists is plain wrong and, frankly, offensive. If the classics departments of Princeton or Oxford do not truly believe in their students’ intellectual abilities, along with their desire to fully immerse themselves in ancient languages, how are students expected to believe in themselves? Going forward, we have to be wary of the effects of the new and toxic cancel culture that is all too common in modern society. As Cameron so rightly wrote in his piece, “If anything, we need to expand the scope of classical education that kids receive, not further curtail it. Otherwise, we’re deliberately withholding from American children the conceptual tools necessary for contemplating our ancestors with sympathy and understanding.”

Why I Observe Columbus Day as an Indigenous-Blooded Woman

The DNA results are in. Should any angry readers search frantically for my ancestry records, they would indeed find that I have quite a bit of Indigenous blood. In fact, according to my father’s DNA test, I have more than enough to qualify for residence on most reservations, lest someone claim I am too removed from Indigenous people to comment on Columbus. This fact is not particularly shocking considering my last name is Esquivel and half my family traces its origins to Mexico, where virtually everyone is ethnically Mestizo. Still, it may come as a surprise to many that I do in fact choose to recognize Columbus Day, and am saddened to watch Indigenous Peoples’ Day be pitted against it annually.

In principle, I am not opposed to Indigenous Peoples’ Day; in fact, I am quite sanguine about the idea of having a national day of recognition for Indigenous people. Indigenous culture is central to North American history. We would be remiss as a society to ignore or downplay its place in that history, and the abuses suffered by Indigenous people at the hands of colonizers. I resent the fact, however, that the push for such a day of recognition has been transformed into the club with which Columbus’s legacy is assassinated. A “Columbus Day” and an “Indigenous Peoples’ Day” can and should peacefully coexist. I do not observe Columbus Day as a celebration of the man’s character, nor are his personal sins or virtues of particular interest to me. In the same way that I celebrate Dr. Martin Luther King, Jr. Day despite allegations made against his character, I celebrate Columbus Day. Both men changed the course of history, and our lives have been bettered because of their accomplishments.

Columbus’s arrival in the Americas represented the commencement of American society. No, Columbus was not the first man to “discover America,” but that semantic game does not succeed in watering down the gravity of what happened in October of 1492. Columbus brought Western values to a land that never had the opportunity to experience movements such as the Renaissance, benefit from the academic progress of the Middle Ages, or read Greek philosophy, all of which we are doing at Holy Cross on American soil thanks to Columbus. This is not to claim that there is no beauty to be found in Indigenous societies as well; Westerners are still dumbfounded by the architectural feats of the Inca, and tourists in Mexico pay just to glimpse an Aztec temple outside the major cities. What could be better than Indigenous people and Europeans finally coming into contact with one another?

Of course, it is not that simple. With conquest comes bloodshed, and the Europeans who came to the Americas were in fact engaged in conquest, some malicious, some well-intentioned. This was the sad fact of virtually every society’s history at the time: European borders were created through ethnic and religious conflict, and Indigenous tribes’ own lands were won through violent conflict, as were those of the Mongolian Empire, the Islamic empires, and so on. Though today we see a decrease in such traditional man-to-man warfare, we observe similar patterns executed more quietly through the threat of nuclear force (think of the USSR and modern China). The arrival of the Europeans in the Americas was thus no historical exception.

I still choose to observe Columbus Day. I am eternally grateful that I was not born into the Aztec society of my ancestors. Cultural relativism is a popular outlook in modern America, but I wholeheartedly believe that I enjoy life in a society objectively superior to that of my ancestors. Had I been born and raised in Tenochtitlan, I would by now have watched several chests slashed open to harvest beating hearts and spill blood in sacrifice to the god of the sun, had I not been the unlucky sacrificial victim myself. Were I a member of a high-ranking family in the Empire, I might have had the privilege of engaging in cannibalism as a sort of reward for my noble status. Were I from a low class, I might have had to work as a serf or a slave, and would have been the first to starve during a famine or poor harvest.

While it may seem as if I am simply berating the Aztecs and their brutal practices for the fun of it, it is not my intention to anachronistically hold them to my ethical standards — standards to which they had not been exposed. I can, however, state with full confidence that I believe the society in which we find ourselves is exceedingly preferable to the one I just described. Needless to say, I am thankful that a society arrived in the Americas to inform my ancestors that there is no need to sacrifice a compatriot to the sun god, because the sun operates independently. A vast number of Aztecs even seemed to agree with my evaluation, as masses willingly converted to the religion of their missionaries, namely Catholicism, which preached a message dramatically contrasting with the religion they had known for centuries.

My gratitude extends beyond my aforementioned points — I could not have been born without Columbus! As stated earlier, the majority of Mexicans are Mestizo, meaning they are an ethnic combination of European and Indigenous blood. Most Mexicans can rightly celebrate Columbus Day as marking a historic event that laid the groundwork for their own bloodline, not to mention the fact that Mexican society, a culture I love dearly, traces its foundations to the arrival of Christianity on the continent. A half-millennium later, I would be born, the product of a white mother and a Mexican father, in a society which gladly claims multiethnic people as its own due to its philosophically enlightened foundations, an import made possible by Columbus’s landfall in 1492.

Still, ironically enough, Columbus finds himself posthumously condemned by those who claim to hold the very values his expedition brought to our continent. As previously mentioned, I am not here to defend Columbus’s personal character. The refusal to appreciate the historical importance of Columbus’s landfall on the basis of ethical concerns, however, is unbelievable to me, as those criticizing him do not realize that their arguments are the product of a society which subscribes to the values Columbus had a large part in bringing to this side of the world. People ultimately fail to realize just how revolutionary the philosophical underpinnings of the West — brought to the Americas by Columbus — were and are in the face of world history. Though certainly not carried out flawlessly by all who came before us, the Western value system has proven the greatest facilitator of social progress in history. We must appreciate Indigenous people and the value of their respective cultures, and we should not gloss over the human rights violations suffered by Indigenous groups at the hands of unjust men. I cannot help but celebrate, however, the fact that Columbus made landfall so that I could live the way I do today. I therefore wish everyone a happy Columbus Day, and a happy Indigenous Peoples’ Day!

Laid Bare: The Reality of Pornography

Americans are generally wary of potential evils that could degrade the behavior, health, and thinking of their fellow citizens. This wariness is evidenced by Americans’ visceral responses to everything from communist infiltrators to narcotics. Yet one threat has gone largely unaddressed, in spite of its near-universal accessibility, toxicity to the mind, direct ties to human trafficking, and detrimental effects on the family and on public understanding of human sexuality. Instead, the evil of pornography has been praised and accepted in the news and by various public figures. However these individuals and organizations wish to frame it, few things are more abnormal, or pose a greater threat to the moral fabric of society, than pornography.

To understand pornography, it is necessary to understand its effects on the brain and body. Like most addictive drugs, pornography hijacks the brain’s reward system. When a person experiences seemingly beneficial stimuli, neurotransmitters produce feelings of pleasure, incentivizing further engagement with the stimuli. Further engagement releases more neurotransmitters, fortifying particular neural pathways. Unfortunately, this reward system cannot distinguish between superficially and truly beneficial stimuli. When exposed to pornographic videos, the brain is tricked into believing that the person is engaging in intercourse, rather than watching videos of others doing the same.

As the brain builds a stimulus tolerance, more stimulation is required to achieve the same amount of pleasure. Pornography provides an infinite supply of novelty, given the essentially endless store of pornographic media available for free at the user’s fingertips. And since its novelty makes pornography a supernormal stimulus, which has been shown to elicit a more intense response in humans and other animals than natural stimuli, pornography presents a more stimulating pleasure experience than real relationships.

As one might expect, this rewiring of the brain has disastrous consequences. Pornography leads to physical changes within the brain, with shrinkage of grey matter comparable to, or even greater than, that associated with heroin use. On a psychological level, pornography usage produces low self-esteem, loss of energy, and mood deterioration, and has been shown to weaken memory. Use of pornography even results in bodily changes, such as erectile dysfunction. The rising use of pornography has correlated with a 600 to 3,000 percent increase in erectile dysfunction among young men, a condition most young men would otherwise not experience for another few decades of their lives. An entire generation of young men who are collectively impotent, placid, and losing grey matter by the day would be a crisis in and of itself. Yet these immediate effects represent only part of the disastrous consequences associated with pornography.

The craving for novelty inevitably affects real relationships in devastating ways. Those with a pornography-consuming partner may see practices viewed by their significant other imposed in the bedroom — practices such as choking and spitting that would otherwise rightfully be considered degrading. Alternatively, the user may become bored and neglect their partner entirely, having built unreasonable expectations of sexual attractiveness and performance around their viewing habits. They then forsake human connection for artificial pleasure. It should therefore be no surprise that some 56 percent of divorces involve pornography consumption, according to the research of therapist Dr. Jill Manning.

While playing an active role in destroying families through divorce, pornography consumption is also becoming alarmingly common among children. The average age of first exposure to pornography is eleven, with some children being exposed even earlier. For many children, their first glimpse of sexual activity of any kind will be through the highly disordered lens of pornography. While most may be out of the reach of drugs, pornography is within reach of their keyboards, and the search engines are dealing.

The effects on those most intimately involved in the production of pornography — the actors and actresses — are equally grim. Actresses often come from backgrounds of sexual abuse, poor mental health, and financial instability that make the pornography industry appear a glamorous alternative to their present state of affairs. While they may earn a significant income initially, actresses are made to perform increasingly extreme material in order to continue receiving the same income. They are coerced into scenes that make them uncomfortable, or in which they would otherwise never engage.

Beyond exploitation, sex trafficking and rape go hand-in-hand with pornography. One of the most glaring examples is the case of producers Michael Pratt and Matthew Wolfe, whose productions were widely circulated among pornographic websites. They lured unsuspecting women with promises of a modeling shoot before coercing them into shooting pornographic videos, and sometimes sexually assaulting them. Unfortunately, this was not an isolated incident. According to Nicholas Kristof of the New York Times, at least a plurality, if not a majority, of the millions of videos on websites such as Pornhub are non-consensual, with videos depicting the confirmed rape of minors being monetized. Such was the extent of this content that the website was forced to remove almost half of its videos pending review. OnlyFans, held up by many as the paragon of individualized control over pornographic content, has also been linked by investigators to human trafficking, both as a way for traffickers to lure victims and as a way to profit from them.

The thoroughly terrible nature of pornography is only exacerbated by the attitudes of its proponents and producers. Al Goldstein, credited with normalizing hardcore pornography, stated that pornography was “a way of defiling Christian culture and, as it penetrates to the very heart of the American mainstream, its subversive character becomes more charged.” One might wonder how an industry with such consequences, wielded with the seeming intent of its producers to destroy the fabric of American culture, has been allowed to continue unhindered. Yet, this has not always been the case.

For decades prior to a number of questionable decisions by a liberalized SCOTUS in the 1960s and 70s, pornography was considered obscenity, subject to regulation and outright bans. Obscenity, which is not protected under the First Amendment, is defined as material that appeals to prurient interest, is devoid of scientific, political, educational, or social value, and violates local community standards. It is clear that pornography is indeed prurient, as it promotes excessive sexual interest by rewiring the brain. Further, it would be difficult to assert that it holds any value beyond appealing to carnal desire, and it violates local community standards by depicting degrading and deviant sexual activity. Clearly, it is time for the court to reexamine this issue.

Grim as all this may be, there is hope for those who have been affected in one way or another by pornography. Physically and psychologically, it has been shown that the brain will rewire itself normally in the absence of pornographic stimuli. There are a variety of resources available for those struggling with addiction and recovery, including coaching, content blockers, and widely available information about quitting. For those who struggle, know that you are not alone. Pornography can be fought... and beaten.

Social Media and the Abortion Debate

On September 1st, 2021, Senate Bill 8 went into effect in the state of Texas. The bill created a set of parameters designed to circumvent the language used in previous judicial decisions that allowed greater access to abortion. Since this bill was sent to the floor of the Texas Senate, and even more so after it became law, social media sites like Instagram and Twitter have run amok with various arguments and points that are widely held but lack reason. I will address a few of them here.

Instagram stories, in my experience, especially in the wake of George Floyd's death, have become a breeding ground for coarse opinions and flashy-looking infographics. The activist infographic is commonplace on Instagram: its creators make several slides and fill them with slogans and ideas (sometimes with supporting evidence), usually in a bubbly font and pastel colors. If you are unfamiliar with what I am referring to, the Instagram account @impact is, at the time of this writing, a good example of this type of content. A quick search on Instagram under #abortionrights turns up the following arguments, which I will address here.

“Men shouldn't be making laws about women’s bodies.” This argument is used to invalidate any male pro-life opinion by asserting that since abortion and unwanted pregnancy affect only women, men lack the empathy and relevance to legislate on the issue. It is in the same vein as the belief that only those of certain races or backgrounds can speak on certain issues because of their lived experience. This is wrong and, in this case, sexist. Rather than address the arguments of the opposition, it dismisses opponents as incapable of productive thought. The argument also completely ignores the 45% of women who support some restrictions on abortion and the 19% of women who support a total ban on abortion, according to data collected by Gallup in 2021. These numbers are strikingly similar to men's opinions on the subject, meaning sex plays less of a role in abortion views than the argument implies. If men took a step back from the abortion debate and let women handle it, the outcome would most likely not change.

“The pro-life movement is rooted in racism.” I have seen this argument presented in many different ways. The most compelling Instagram infographic used sources from NPR, Politico, and The Atlantic to make its case. It is predicated on the idea that evangelicals in the 1980s were angry about desegregation and only picked up the issue of abortion in order to gain more votes and galvanize support for their agenda. To show the irrelevance of this argument, one need only look at the modern data. The CDC reports that in the United States, black women have the highest abortion ratio of any racial group, with 335 abortions per 1,000 live births, as opposed to 110 per 1,000 live births for white women. Black women obtain abortions at more than three times the rate of white women. If an individual or group were racist, why would they pursue a policy that disproportionately saves the lives of black babies? If racism were truly the intention, those supposed racists would favor abortion, since fewer black babies would be born, leading to a demographic shift and, eventually, fewer black individuals turning 18 and exercising their right to vote. In modern times, the pro-life movement seeks to save people of all backgrounds, debunking these all too prevalent accusations of racism, which stem from an arbitrary claim about a racist past rather than any claim about the present.

“Pro-lifers are only pro-life until it is out of the womb.” This argument is usually coupled with the accusation that those who are pro-life are really only “pro-birth,” since pro-lifers are also generally conservative and reject large government welfare programs. The idea implies that in order not to be merely pro-birth, one must support extensive government assistance and, in turn, the greater tax burden needed to fund it. This is false because there is more than one way to assist people in need. Those on the right of the political spectrum find that the best way is not to fund bloated, bureaucratic government agencies that mismanage funds, but to give their money to religious groups and private charities. They find religious groups like their church more appealing as a destination for their money because they generally know to whom they are giving and can hold them accountable, with the faith that the money will go to a good cause due to a common set of values. Private charities are similar in that they can be held to greater account than a bureaucratic agency: if a charity acts in a manner that does not reflect what its donors believe it should be doing, those donors can withhold their money. The same cannot be said of the government, which is accountable only to officials who are laden with other responsibilities and constrained by lengthy procedure. To answer the claim, then: many pro-lifers simply find private means better than public means for assisting those out of the womb. The difference is that conservatives want to use their own money to support causes of import, while those on the left want to take others' money and allocate it as they see fit.

“Pro-lifers only want to control women’s bodies.” This claim seeks to invalidate any argument by reducing pro-lifers to bad-faith actors: anything a pro-lifer says can be discarded because they supposedly do not believe what they are saying or harbor ulterior motives. Rather than addressing the ideas put forth, the argument simply assumes the opposition is evil and not worthy of debate. To reach this conclusion, one must assume that pro-lifers do not actually believe that unborn babies are alive, and therefore wish to control others rather than protect life. On the contrary, pro-lifers genuinely believe what they say and seek only to protect innocent human life. The claim is lazy at best and conspiratorial at worst, implying that vast swaths of the population are bent on controlling women and have all agreed to lie about the same thing. I would never assume that a large group of people with whom I disagree politically is being disingenuous when they offer legitimate arguments, and neither should anyone else: that assumption does not lead to productive conversation and will never convince anyone governed by reason. It only breeds division.

Finally, “it is none of your business” or “if you are against abortion, don’t have one.” To believe someone is being murdered next door is none of my business? The vast majority of those on the left, as well as many on the right, believe George Floyd was murdered, but was that any of their business? Are white straight cisgender men not allowed to march for Black Lives Matter because they are not affected by the alleged systemic abuse and fall outside its intersectionality umbrella? Of course not. People support causes because they believe it is the right thing to do. Martin Luther King Jr., a man widely revered and quoted in America by both left and right, famously wrote, “Injustice anywhere is a threat to justice everywhere.” Abortion is an injustice, just as those on the left assert that systemic racism — which they believe to be enshrined in most, if not all, institutions — is an injustice, and it is every person’s imperative to fight injustice, even where one thinks it does not affect them. Abortion is murder, and I and many others are not going to sit by and let it happen.

This article, of course, does not have the opportunity to address the real substantive questions about abortion, such as when life begins, Catholicism’s view of abortion, or the nuances of cases involving rape, incest, or danger to the mother’s life. These questions have all been answered before, either by previous writers for The Fenwick Review or by other commentators on the topic. Here, I sought only to address the many lazy and coarse opinions spouted on social media, which detract from the real debate. Of course, not all who favor abortion access hold these views, as no group is a monolith. But these flashy statements should be retired, so that we as a society can engage in a civil discourse that leads to the truth, rather than in shouting matches and political theater governed by emotion and fanaticism.