The Death Penalty in the US: Legalized Murder?

On September 24, 2024, the state of Missouri executed an innocent Black man. Why did they kill him? 

Marcellus Williams was convicted and sentenced to death for murdering Felicia Gayle. There was no physical evidence linking Williams to her murder: fingerprints, footprints, hair, and DNA found at the crime scene did not match Williams. The only evidence against Williams was testimony from two witnesses whose accounts were inconsistent and unverifiable. Gayle’s family favored life imprisonment. The county prosecutor favored life imprisonment. Only Missouri’s Attorney General wanted Williams executed – and he got his wish. 

Williams was innocent of the crime for which he was executed. He never had a fair trial. The prosecution struck 6 of 7 Black jurors, one of whom was rejected “because he looked too much like Williams.” Missouri knew they were executing an innocent man – and they did it anyway. 

History of the Death Penalty in America 

Capital punishment has been a part of the American legal system since before the United States was a country. The first person executed in the British colonies was George Kendall, shot by firing squad for mutiny in 1608. By the early 1900s, public support for the death penalty was beginning to wane, and some states abolished the practice. 

Capital punishment was briefly illegal nationwide. The 1972 Supreme Court decision Furman v. Georgia ruled that existing death penalty statutes were discriminatory and therefore unconstitutional. That lasted until 1976, when the Court ruled in Gregg v. Georgia that Georgia’s updated death penalty statute was constitutional, and executions resumed. Since 1976, 1,601 people have been executed. Today, only 21 states still have the death penalty, and only ten have executed people in the last decade. 

Methods for capital punishment have varied greatly over the last two centuries. Early in American history, the most common were firing squad and hanging. Over time, hangings have become associated with lynchings. Despite that history, in 2023, a Tennessee lawmaker proposed that “hanging by a tree” be used as an alternative method of execution in the state. The first execution by electric chair took place in 1890, and the electric chair remained the most common method for several decades, until lethal injection overtook it after its first use in 1982.

Image 1: A white room with a gurney and several thick straps used to restrain prisoners. Source: Yahoo Images.

Lethal injection has faced challenges in recent years for a few reasons. Drug manufacturers do not want to be associated with homicide – and thus refuse to sell the required drugs to state governments – and medical professionals refuse to administer the drugs. In their place, correctional workers often struggle to find veins and sometimes fail entirely, delaying executions. Roughly 3% of executions are botched, and the people subjected to botched executions are disproportionately Black – one-third of executions nationwide are of Black prisoners, while half of botched executions are. Even when not botched, lethal injections have been shown to be less humane than originally believed: the drugs used are painful and cause the lungs to fill with fluid – typically without proper anesthesia. 

Black prisoners are also treated differently immediately before they are executed. Jeff Hood, who has witnessed six executions – three of Black prisoners, three of white – told NPR, “I can definitely tell you that the restraints that I have seen on Black folk have been unquestionably tighter than the restraints that I have seen on white folk.” 

More recently, there has been controversy over a new execution method: nitrogen hypoxia. The state of Alabama has executed two people – Kenneth Smith and Alan Eugene Miller – by nitrogen hypoxia in the last year. The state had previously attempted to execute both Smith and Miller by lethal injection, but correctional workers were unable to place IV lines in either man over the course of several hours. There is another Institute of Human Rights blog post, published in the fall of 2023, that extensively details execution methods. 

Problems of the Death Penalty

Two of the most common reasons given for keeping the death penalty are deterrence and justice. The justice argument is an eye for an eye – that, for some crimes, the only possible form of justice is death. That is a philosophical debate, and one I will not discuss today. Instead, I will focus on the effect of the death penalty on homicide rates – deterrence. Deterrence is the idea that the existence of the death penalty deters crime – it assumes that prospective murderers are rational people who will be less likely to kill if doing so could result in their own death. 

In 2012, the National Research Council conducted a literature review on studies examining any deterring effects executions – and the general presence of the death penalty – have on homicide rates. They concluded that studies had not yet demonstrated any effect capital punishment has on homicide rates and recommended that the “research… should not influence policy judgments about capital punishment.” 

One of the most powerful arguments used by death penalty abolitionists is about wrongful convictions. Someone who is sentenced to life in prison can be released if they are found innocent; that is not so with someone who is dead, such as Marcellus Williams. Wrongful convictions are common; for every eight executions in the United States since 1977, one person sentenced to death was exonerated. 82% of death penalty exonerations are due to official misconduct and 36% of death penalty sentences are overturned. 

Glynn Simmons was exonerated in December 2023 for a crime he did not commit. He spent 48 years in prison. The state knew when he was convicted in 1975 that Simmons was innocent; he was in Louisiana when the crime was committed in Oklahoma. Despite that, it took almost 50 years – 2/3 of Simmons’ life – for him to finally be exonerated. Imprisonment is reversible. Death is not.

Image 2: A large broken chain. Source: Yahoo Images

What Can Be Changed? 

Activists have worked for decades to reform or eliminate the death penalty. Two organizations that have been involved in numerous exonerations are the Innocence Project and the Equal Justice Initiative. Both organizations provide legal aid to innocent prisoners. Other ways to support change include petitioning state and federal legislators to end or reform the death penalty.

The Wine Industry: Years of Exploitation and Human Trafficking

by Caitlin Cerillo

Have you ever had a glass of wine and wondered how it’s made? Or pondered what it comes from and how long the wine-making process takes? Who is responsible for making it? Surely the wine industry has been modernized, with machines doing most of the handiwork to create the bottles of wine we open for birthdays, weddings, anniversaries, and other milestone celebrations.

Unfortunately, this isn’t the case. The wine industry has a history of exploiting its workers, forcing them to work grueling hours in extremely poor conditions. Wine-making follows an intricate process, starting with the harvesting of grapes in vineyards. Mechanical harvesting does exist and is generally quicker than harvesting by hand – the average person can harvest 1-2 tons a day, while a machine can harvest 80-200 tons. However, hand harvesting is still favored because it allows for more precise selection and reduces the oxidation caused by damaged grape skins.

A worker manually harvesting grapes for wine. Source: Yahoo Images

The quantity of grapes needed to produce a standard-size bottle of wine varies depending on the style, but experts give a general average of 1.25 to 1.50 kilograms, or 2.75 to 3.3 pounds. Given how much wine is produced worldwide in a single year, this adds up to a huge demand for grape pickers to supply the lucrative wine business. Two countries are responsible for the largest share of the world’s wine production: Italy and France. Both have come under fire for unethical practices in their wine production and for human rights violations that include human trafficking, exploitation, and extremely poor working and living conditions for workers.

What is Human Trafficking and Exploitation?

Human trafficking is a huge issue across the world. The United Nations Office on Drugs and Crime (UNODC) defines human trafficking as the “recruitment, transportation, transfer, harboring or receipt of people through force, fraud or deception, with the aim of exploiting them for profit.” Human trafficking can come in many different forms, like sex trafficking, forced labor, and child sex trafficking. Victims of human trafficking can come from any kind of age group, gender, and background.

However, specific groups may be more vulnerable than others. These groups include people separated from their families or other support systems, refugees or migrant workers, sexual and gender minorities, people with disabilities, and members of lower socio-economic groups. According to the Centers for Disease Control and Prevention (CDC), human traffickers will use manipulation tactics and exploit the vulnerabilities of their victims, which is why these specific groups are at heightened risk.

Italy

In September 2021, the humanitarian organization Oxfam released a Human Rights Impact Assessment (HRIA) on the Italian wine supply chain to assess its impact on human rights. The HRIA, titled “The Workers Behind Sweden’s Italian Wine,” focuses on the Italian wine supply chains of Systembolaget, Sweden’s state-owned alcohol retailer. The HRIA’s objectives were to perform a context analysis of Systembolaget in order to “build an understanding of the nature of the Systembolaget supply chains” and then to “identify the actual and potential human rights impacts in Systembolaget supply chains in practice.”

Oxfam’s HRIA goes into depth on the human rights violations currently occurring in the Italian wine industry, along with potential violations that are at high risk of materializing. To summarize, Oxfam found several serious violations: forced labor, low wages, excessive working hours, health and safety risks in vineyards and wineries, lack of access to remedy, restrictions on freedom of association, sexual harassment and gender discrimination, and unsanitary housing. To read more about Oxfam’s findings, follow this link.

France

One of France’s best-known wine-producing regions is Champagne, located east of Paris. In late 2023, a large portion of the region’s harvest operations was shut down by French authorities and put under investigation for human rights violations. Many of the grape pickers in the Champagne region are migrants, primarily from West African countries. Investigators discovered that the lodgings housing the migrant workers were of poor quality, with makeshift beds surrounded by electrical cables and extremely unsanitary bathroom facilities.

Workers picking grapes in a French vineyard. Source: Yahoo Images

The investigation also found that the contractors responsible for hiring the migrant workers exploited their vulnerability: the workers were willing to work even without proper contracts and for extremely low wages. At the end of the 2023 harvest season, authorities opened another trafficking investigation, this one involving 160 laborers from Ukraine living in poor conditions in another area of the Champagne region.

South Africa

Although South Africa isn’t at the very top of the list of wine-producing countries, its wine industry has been accused of human rights violations for years. In 2011, Human Rights Watch released a report titled “Ripe with Abuse: Human Rights Conditions in South Africa’s Fruit and Wine Industries,” detailing the problems in these industries. For over a decade, numerous attempts have been made to improve the industries and conditions on farms; for instance, the Wine Industry Ethical Trade Association was created in 2002. Unfortunately, significant improvements have yet to be made.

South African farmworkers who supply the grapes needed for wine are vulnerable to human rights violations such as exposure to pesticides and harmful chemicals, long working hours, and being forced to work in extreme weather conditions. Many farmworkers don’t even have access to safe drinking water, toilets, or livable housing. They face difficulty forming unions to bring attention to the injustices they experience. As in Italy and France, South African farmworkers receive low wages and little to no protection from the government.

The Future of the Wine Industry

There are many possible routes to improving working conditions for wine workers. One of the most productive is for wineries to adopt certifications that lay the groundwork for better standards, such as environmental sustainability and safe working conditions. These certifications help ensure that wineries are held to their promises. Several wineries across the world have turned to certification efforts, like Chile’s Emiliana Organic Vineyards, which is certified as a B Corp. B Corp certification was established in 2006 to encourage accountability, transparency, and environmental performance in business. Similarly, Italy founded the Equalitas standard, aimed specifically at the wine industry, in 2015.

Femicide in Kenya: A Silent Crisis

 

by Grace Ndanu

 

An image with a group of people holding up a banner that reads, “There is no honor in killing!” Source: Yahoo Images (free to share and use)

 

In recent years, Kenya has witnessed a horrifying increase in cases of femicide. The alarming statistics paint an ugly picture of the state of women’s safety in the country. This issue goes beyond simple statistics as it represents a deep-rooted problem that demands urgent attention. Femicide in Kenya is not just a crime against women but also a violation of basic human rights and an assault on the fabric of society.

Understanding Femicide

Femicide is not a new phenomenon, but the magnitude of the problem in Kenya is shocking. The term encompasses various forms of violence against women, including domestic violence, rape, honor killings, and dowry-related deaths. These acts are driven by deep-seated beliefs and cultural norms that perpetuate gender inequality and elevate toxic masculinity.

According to a 2020 report by the World Health Organization, Kenya experiences one of the highest rates of femicide in Africa, with an estimated 47 women killed each week. Shockingly, this represents a 50% increase in femicide cases over the past decade. Furthermore, the majority of these cases go unreported or unnoticed due to social and cultural factors, making the situation even more alarming.

The Cultural Factors Behind Femicide

An image of a Maasai woman from Kenya holding her baby at her hips. Source: Wikimedia Commons through Yahoo Images (free to use and share)

 

To tackle femicide in Kenya, it is crucial to dig into the cultural factors that contribute to this crisis. Some of these factors include gender roles, traditions, economic disparities, and the normalization of violence.

Gender roles deeply rooted in Kenyan society perpetuate a patriarchal system that devalues women. Women are expected to be submissive, nurturing, and bound by societal norms. Patriarchy creates a culture of power imbalance, where men feel entitled to control and dominate women, both within and outside the household.

Traditional practices, such as female genital mutilation (FGM), child marriage, and wife inheritance, further perpetuate the vulnerability and subordination of women. These practices condone violence against women in the name of cultural preservation and perpetuate harmful gender norms.

Economic disparities play a significant role in intensifying femicide in Kenya. Poverty and lack of access to education, healthcare, and employment opportunities disproportionately affect women. When women are economically dependent on their partners or families, they are often trapped in abusive relationships with no means of escape.

Society’s normalization and acceptance of violence against women contribute to the perpetuation of femicide. Many cases of domestic violence go unreported due to fear, stigma, or lack of trust in the justice system. In some cases, bystanders, instead of helping, record videos of women being attacked and post them on social media.

Addressing Femicide in Kenya

An image of a group of women from the Women’s Ministerial Breakfast in Nairobi, Kenya. Source: Natalia Mroz; UN Environment Programme through Flickr

 

To address femicide in Kenya, a comprehensive approach is necessary. It requires collaboration between the government, civil society, community leaders, and individuals alike. Here are some key steps that can be taken.

Legal Reforms and Enforcement

Strengthening the legal framework surrounding violence against women is paramount. Stricter laws targeting offenders, along with their effective implementation, are crucial. Adequate training for law enforcement officials and judicial personnel is also essential to ensure cases are dealt with sensitively and expeditiously.

Education and Awareness

Comprehensive educational programs should be implemented from an early age to challenge harmful gender norms, promote gender equality, and raise awareness about women’s rights. This includes teaching both boys and girls, as well as women and men, about healthy masculinity and respect for women.

Empowerment and Economic Independence

Efforts must be made to empower women economically. This can be achieved through vocational training, access to micro-financing, and opportunities for entrepreneurship. Women who are financially independent are better equipped to escape abusive relationships and have control over their lives.

Support Services and Safe Spaces

Accessible support services, including helplines, shelters, and counseling centers, are crucial for survivors of femicide and domestic violence. These safe spaces provide survivors with the support they need to rebuild their lives and break free from the cycle of abuse.

Community Mobilization

Community leaders, religious institutions, and local organizations play a vital role in challenging harmful cultural practices, promoting gender equality, and raising awareness about femicide. Mobilizing communities to change attitudes and behaviors towards women is essential to create a safer environment for all.

Conclusion

Femicide in Kenya is an urgent crisis that requires immediate attention. It is a reflection of deep-seated gender inequalities and cultural norms that perpetuate violence against women. Addressing this issue demands a comprehensive approach encompassing legal reforms, education, empowerment, and community mobilization. Only through collective efforts can we hope to build a society where women can live without fear, violence, and the threat of femicide. Together, we must strive to create a country that embraces gender equality, respect, and the protection of basic human rights for all.

Rethinking Museum Exhibitions in America

by Caitlin Cerillo

As an avid lover of visiting museums, I believe it is important to hold them accountable when their exhibitions have damaging implications. History and science museums can be among the most fascinating places to visit, as the world has such a rich scientific history. However, there is a fine line between preserving a piece of history and exploiting groups of people in the name of science. In recent years, several museums have come under fire for capitalizing on the exploitation of ethnic groups and glorifying the world’s hurtful history of colonialism, imperialism, and the oppression of marginalized peoples.

Particular attention has been paid to the sources of acquisition behind the collections of many popular museums in the United States. One of the most recent examples is the American Museum of Natural History in Manhattan, New York, whose exhibitions have contained the remains of Indigenous people.

What is Colonialism?

Colonialism is a practice in which a foreign state exercises domination over a specific area. It has been, and still is, used to consolidate political or economic gain, and it always leads to the complete subjugation, or conquest, of the people in the colonized area. The foundation of America was built on colonialism, dating back to before the nation was even established. While there are records of British colonies existing prior to the 1600s, the 17th century marked the beginning of the first permanent colonies. 

 

An illustration of what colonialism in the New World may have looked like, depicting a docked ship and settlers. Source: Yahoo Images

 

The Jamestown Colony was established in Virginia in 1607. Long before the establishment of any colonies in the New World, or present-day America, Native Americans lived on American soil. The region where the Jamestown colonists arrived was home to the Powhatan people, a Native American tribe, and violent encounters between the tribe and the colonists occurred on many occasions. In establishing colonies in the New World, colonists brought diseases like tuberculosis and smallpox; while the colonists had built up resistance to these microbes, the diseases were often fatal to the local Native American population.

As the 17th century progressed, the relationship between colonists and Native Americans deteriorated significantly. For instance, King Philip’s War broke out in 1675 after the government of the Plymouth Colony in Massachusetts executed three members of the Wampanoag people. The war is known as one of the deadliest conflicts in American history, with casualties mounting throughout its 14-month course.

Even after America was established as a country, harmful practices against Indigenous Americans remained legal. Hundreds of thousands of Indigenous people—particularly Indigenous youth—were forced to assimilate. Forced cultural assimilation is extremely damaging for multiple reasons: it normalizes public stigmatization of the affected groups and erases their cultural identity.

The American Museum of Natural History

 

The American Museum of Natural History, which has been criticized for its use of the remains of indigenous and enslaved people in exhibitions. Source: Yahoo Images

 

After facing public scrutiny, New York’s American Museum of Natural History created a policy calling for the removal of all exhibits containing human bones. The museum has promised to employ anthropologists to carry out comprehensive analyses to determine these remains’ origins and sources of acquisition.

The American Museum of Natural History has come under fire not only for exhibiting the remains of thousands of Native Americans but also for acquiring the bones of five Black adults who were buried in a cemetery for enslaved people. This raises an important conversation about eugenics, under which people’s bodies were exploited and used as “scientific property” without their consent. Eugenics and other schools of scientific thought entrenched in racism and white supremacy have enabled other forms of oppression against marginalized groups—specifically Black Americans—like medical racism and healthcare bias. These connections make the museum’s acquisition of these remains even more problematic.

The Smithsonian

 

Some of the Benin sculptures that originated from the Kingdom of Benin in current-day Nigeria and have been acquired by the Smithsonian. Source: Yahoo Images

 

Another museum that has come under fire for its exhibitions is the Smithsonian’s National Museum of Natural History in D.C. While this case does not involve human remains, it still reflects the exploitation of a marginalized people under colonialism. The museum held 29 bronze sculptures that originally belonged to the Kingdom of Benin, which was established during the pre-colonial period in what is now southern Nigeria. The sculptures were seized by British military and colonial forces during a raid in 1897. The raid also resulted in the burning of the city and the deaths of many of the people who inhabited it.

Real estate developers Paul and Ruth Tishman collected the Benin sculptures and sold them to the Walt Disney Company in 1984; in 2007, they were donated to the Smithsonian. By not considering the implications of its exhibition pieces’ sources of acquisition, the Smithsonian turned a blind eye to their hurtful histories. Fortunately, the Smithsonian recognized this problem and removed the sculptures from public display in late 2021. Museum director Ngaire Blankenberg also enlisted curators to trace the places of origin of all pieces with potential ties to the Kingdom of Benin raid.

Harvard’s Peabody Museum and Warren Anatomical Museum

The Peabody Museum of Archaeology and Ethnology and the Warren Anatomical Museum, both owned by Harvard University, recently repatriated the remains of over 300 Indigenous people to Wampanoag communities, completing the repatriation process in January of this year. Harvard has since sought to better understand and rethink the implications of sources of acquisition. For instance, the Peabody Museum created a virtual exhibit titled “Listening to Wampanoag Voices: Beyond 1620,” which includes oral histories given by various members of the Wampanoag community.

 

These are some of the faces of the Peabody Museum’s “Listening to Wampanoag Voices: Beyond 1620.” The exhibit includes oral histories from Jonathan James-Perry, Elizabeth James-Perry, Phillip Wynne, Zoë Harris, Linda Jeffers, and Alyssa Harris. Source: Yahoo Images

Why are Sources of Acquisition Important?

The term ‘acquisition’ refers to an object purchased by or given to an institution, such as a museum or library. ‘Sources of acquisition’ concern the background of these objects, such as their historical context and location of origin. Ignoring sources of acquisition can be harmful to the affected communities. It normalizes the idea that the oppression of people can be glossed over in the name of science or a glorified museum exhibit. In the case of museums collecting the remains of marginalized communities, it pushes the notion that the subjugation and exploitation of people are acceptable. As reflected earlier in this post, America was built on the institutions of white supremacy and colonialism, which makes the sources of acquisition of exhibition pieces even more important to note.

So, what can be done to right these museums’ wrongs? Taking the initiative to go through the repatriation process should always be considered. While this process entails a number of legal procedures that may not be completed within a specific timeframe, it is always worth returning exhibition pieces to the rightful institutions and people. The Native American Graves Protection and Repatriation Act (NAGPRA), enacted in 1990, is a US federal law that facilitates the repatriation process. As of 2022, many changes have been made to NAGPRA, including redefining how objects are categorized to better accommodate the cultural traditions and customs of the rightful descendants.

Similarly, hiring curators and anthropologists to analyze the origins of exhibition pieces can be helpful. Museums should also confront the shortcomings of the pieces they inherit by opening conversations about America’s history of colonialism, racism, and oppression of marginalized people. Giving a voice to those affected by these harmful practices, as the Peabody Museum’s Wampanoag exhibit does, is another way of allowing them to address the hurt that has been done.

International Day of Science and Peace

by Wajiha Mekki 

November 10 is the International Day of Science and Peace (IDSP), also known as the World Science Day for Peace and Development. The United Nations hosts this international event.

History of IDSP

Established in 1986, this historic day was initially intended to commemorate the birth of Marie Curie, a notable physicist and humanitarian. Curie was known for her innovative work in radioactivity, contributing to the discovery of radium and polonium. By 1999, the day’s purpose had changed to reflect the global needs of the scientific and humanitarian community, affirming the global commitment to attaining the goals of the Declaration on Science and the Use of Scientific Knowledge. The day and its annual summit bring together governmental, intergovernmental, and non-governmental organizations to promote international solidarity for shared science between countries and to renew the global commitment to use science for the benefit of the communities that need it most. 

The overall goal of IDSP is to help achieve the UN 2030 Agenda and the 17 Sustainable Development Goals, creating a plan for prosperity for people and the planet. 

 

IDSP 2023

The 2023 theme for IDSP will be “Bridging the Gap: Science, Peace, and Human Rights.” It emphasizes the interconnectedness of science and peace and their role in advancing human rights. Science is a valuable tool for making technological advancements, but it also helps address social issues, reduce conflict, and sustainably promote human rights.

 

Photo of a space shuttle near a body of water. Source: Flickr

Science and Human Rights

Science is frequently associated with improving medical interventions, fixing coding bugs, and solving mathematical equations. It is less often recognized, however, that science is also essential to human rights. First, science has a valuable role in promoting sustainable development. Using scientific methods, data can be collected to quantify progress toward the 17 UN Sustainable Development Goals. On issues ranging from climate change to poverty to infant mortality, scientific data collection and analysis are needed to respond to global problems efficiently and effectively. Research and innovation also help mobilize resources to historically underserved communities, allowing them to gain access to necessities. 

Within innovation, shared desires and interests help unite countries around common goals. Scientific diplomacy is valuable in bringing countries to the table for collaboration. This deepens connections between countries on trade and commercial interests and helps foster peaceful relationships that prioritize human rights.

With the appropriate distribution of resources, scientific advancements improve the quality of life for communities internationally, giving them a chance to live better lives in a cleaner environment.

The day is also an opportunity to educate the public about the vital role of science and to encourage innovation in solving global challenges.

How Countries Can Get Involved

Beyond participating in IDSP, countries can play a role in unifying science and human rights through many different avenues. One route is to protect and invest in scientific diplomacy. By allocating funding to scientific innovation and multilateral collaborations, governments can focus on shared goals with their international counterparts, working collaboratively to promote peace and cooperation. Another route is developing policies that protect innovation while establishing guardrails for its usage, ensuring it is mobilized to those who need it most. States have a responsibility to be advocates for and protectors of their citizens, and by working to ensure that scientific diplomacy is used for the betterment of people abroad, they can effect change in a meaningful way.

 

INTL and MAST Students Visit US Department of State. Source: GU Blog

How Citizens Can Get Involved

Citizens have a responsibility to promote peace through science as well. A community member’s primary role is to use their voice to advocate for innovation and peace; by doing so and by sharing their own story, they help hold organizations accountable for their actions. Stakeholders of all kinds – governmental entities, non-profit organizations, and grassroots movements – are supported by the citizenry. It is also important to have open conversations that further explore the nuanced intersection of science, peace, and human rights, continuing to promote awareness and understanding.

 

International Day for Disaster Reduction

by Wajiha Mekki 

October 12 is the International Day for Disaster Reduction (IDDR). This international event is hosted by the United Nations Office for Disaster Risk Reduction (UNDRR). In 2023, the focus is on fighting inequality and breaking the cycle of disaster.

History of IDDR

IDDR started in 1989 as a call to action by the United Nations General Assembly to educate people and mobilize resources to reduce the burden of disasters and increase resilience. Each year the event focuses on a different theme drawn from the “Sendai Seven Campaign,” established in 2015 at the third UN World Conference on Disaster Risk Reduction in Sendai, Japan. The framework proposed there helps mobilize resources to local communities to ensure they can act at full capacity in times of need; it also allows communities to prepare not only for small-scale and large-scale disasters but also for man-made, natural, environmental, and biological disasters.

 

People in hazmat suits tending to a chemical disaster during a mock drill.
Source: American Red Cross Flickr
IDDR 2023

In 2023, IDDR will focus on fighting inequality and will publish the results of the first-ever global survey on disability and disasters, which was commissioned in 2013 with the purpose of championing disability inclusion. 2023 also serves as a monumental year for IDDR because it comes right after the midterm review of the aforementioned Sendai Framework; this review is vital for ensuring that progress is made to accelerate action, reduce disaster disparities, and prioritize resilience.

Current Burden of International Disasters

Disasters can happen at any time. It is projected that by 2030, the world will face roughly 1.5 significant disasters per day – a total of about 560 disasters per year. A large proportion of these disasters are caused by environmental, technological, and biological hazards. Disasters don’t discriminate and affect all people; however, they have a disproportionate impact on people with disabilities. This compounded impact creates a perpetual cycle of disaster unless resources are efficiently invested to prevent and manage disasters.

For people with disabilities in particular, infrastructure is often not designed to be inclusive, and their needs are frequently overlooked at every stage of emergency management. This isolates those who have limited mobility or who require a caregiver or other health services, preventing them from accessing the resources that would allow them to recover effectively.

Within emergency responses, people with disabilities are often unnecessarily institutionalized during and after disasters; this further isolates them from their families, peers, and communities. 

Fourteen firefighters tending to a forest fire. Source: American Red Cross Flickr
Spotlight: Japan’s 2011 Earthquake

Though there are many examples of international disasters, the horrendous 2011 earthquake in Japan highlights the disparities people with disabilities face in times of national emergency. The earthquake, noted as the strongest in Japan’s recorded history, was not the only natural disaster to impact the community; it triggered a tsunami, which amplified the damage and the resources needed for recovery. The earthquake and tsunami destroyed hundreds of businesses and homes and damaged nuclear reactors, releasing toxic materials into the environment and surrounding communities. Thousands of lives were lost; approximately 25% of the victims were people with disabilities. The infrastructure developed for emergencies did not serve them: evacuation centers were often not accessible or lacked the needed facilities. All of these factors left many people with disabilities without adequate assistance. These disparities are not unique to Japan and are seen both internationally and domestically. 

How Countries Can Take Action

The nature of disasters is cyclical; the most effective solutions break that cycle and do so holistically. First, there is prevention: it is vital to understand how disasters occur and to take the actions needed to establish early warning systems. This allows countries to be prepared to make effective decisions with a positive global impact. Beyond this, countries and member states should invest in their current infrastructure to make it better prepared for disasters. Though disasters can be mitigated through these actions, they are not entirely preventable. Therefore, states should ensure their response is inclusive of all; they must build capacity to accommodate vulnerable populations in their emergency response, including people with disabilities, older persons, and women. 

How You Can Take Action

Acknowledging IDDR is the first step toward advocating for advancements in emergency responses and more equitable infrastructure during times of need. It is a two-pronged effort: communities should work to break the cycle of disaster by improving habits and holding entities responsible, while continuing to invest in making resources more equitable. As a community member, it is your responsibility to use your voice to advocate for both. Another way is to volunteer your time alongside community and international partners who are working to make improvements. Together, we can break the disaster cycle and make emergency responses more equitable.



The Excessive Nature of Overconsumption in American Culture

by Lexie Woolums

One of the things that dominate American society is what I like to call the “epitome of excess.”

We live in a capitalistic culture that thrives on consumers’ dissatisfaction. Our society’s culture defines American success as getting promoted to a position high enough that one can make enough money to purchase a big house in the suburbs, add a few cars, and have an annual family vacation.

Influencers on social media have added to this growing consumption. People have access to information via “Get Ready With Me” vlogs on TikTok, which feature various (expensive) products to desire based on trends that go in and out of style in just a few short months. This cultural desire to keep up with trends causes a constantly growing urge to have more. Nearly everything is capitalized on, giving us a concept initially coined by Herbert A. Simon in the 1960s known as the attention economy. Digital creators earn money based on views and engagement from their followers. People online regularly discuss strategies to “trick the algorithm” to further capitalize on this economy where time is one of the most valuable things someone can “give,” similar to how we have traditionally viewed money and, later, information. The phrase “time is money” comes to mind, but not in the same way that my grandparents would understand it.

Beyond seeking to maximize the number of seconds a viewer will stay on the video before swiping, this culture has other effects. It pushes for overconsumption. It has become common to see content creators post videos of six dresses they ordered while asking their followers to “help them choose which one to wear” to the event they have coming up. When I was in high school, everyone wanted the Hydro Flask. Today, it is the Stanley cup. As I was writing this article, I was notified that the newest cup fascination is the Owala.

Figure 1- Source: Yahoo Images; A girl taking a picture of a folded piece of pizza with her phone. Taking photos and videos to put on social media is easier than ever.

It has even become ordinary for content creators to try and capture views by “de-influencing” whatever the sought-after object is at the time. Spoiler alert: this is generally just pointing to a different brand of metal cup on Amazon that is better and cheaper than the almighty Stanley cup (and, coincidentally, listed in the person’s Amazon storefront, where they earn a commission on every purchase made).

This is just influencing—a system that attempts to capitalize on the attention that follows dissent.  The concept is not new, but it has changed how people earn money.

People run entire side hustles by making videos showcasing “Five Products You Need from Amazon,” with aesthetic videos of acrylic containers or trendy dresses.

It is normal to hear people joke about “doomscrolling” for hours online, highlighting the all-encompassing nature of modern social media and its role in our everyday lives. The pervasive nature of this beast has become an accepted fact of life, so we do not always think about questioning it. It takes a degree of separation before one might stop and think, what is the cost of this lifestyle? We do not generally stop to consider how the Amazon package made it to our house in two days. We rarely ask who made the trendy cup we found at Walmart or the skirt we found at American Eagle.

We rarely ask any questions about the actual cost of what we consume.

Figure 2- Source: Yahoo Images; Zara is a well-known fast fashion brand.

As a culture, we are so far disconnected from the places and communities that create the products we use that many Americans would struggle to imagine what life would be like if we did not have access to these things. As a culture, we love a bargain, especially when we get to tell someone else about the three-dollar T-shirt we found at Target. What a steal!

It is a culture of mass consumption, and no one is immune to it. From a nicer car to a bigger house to a new water bottle or wardrobe (even when you do not use most of what you have), the desire to have more continues, especially within fashion.

Overconsumption has more negative effects than I can effectively capture in one blog post. It exists in all aspects of life across all sectors of commerce. Based on my personal experience as a woman living in the world, fast fashion is one of the most pervasive of these issues – and one that could be addressed more effectively if more people stopped to question before they purchased.

For this reason, I am homing in on fashion today, but by no means is that to imply that fashion is the sole or most important issue of our insatiable, overconsuming culture.

 

History

To contextualize the history of fashion consumption, it is important to mention how the fashion industry has shifted its production model over time.

Historically, most clothing purchased in the United States was produced within the country, created by garment workers during the Industrial Revolution. While I will not delve into much of the history here, my colleague, Kala Bhattar, wrote a phenomenal blog that delves further into the history of fashion. I highly encourage people to check that out if they are interested.

Figure 3- Source: Yahoo Images; A child working on a textile machine in the industrial era

For the purpose of this blog, the critical thing to note is that this system of domestic production and consumption is no longer standard (and is actually pretty rare) and that most large fashion companies have shifted production to countries in the Global South so they can take advantage of cheaper labor.

 

Pollution

According to the United Nations, the fashion industry is the second largest polluting industry in the world, sitting right behind big oil. As of 2019, H&M was known to be holding $4.1 billion worth of unsold clothes. Some of that unsold clothing is used to fuel a power plant in Sweden, but H&M (and many other brands) still produce a high quantity of textile waste that never gets used and, in many places, is sent straight to landfills. People consume 400% more clothing today than twenty years ago. This excessive consumption contributes to human rights inequities like gender inequality, since most garment workers are women, and to the climate crisis, through chemical-intensive manufacturing and landfilled textile waste.

The entire business model of fast fashion companies exists based on the idea that consumers will buy things, wear them a few times, and then toss them out and buy more to try and keep up with cycling trends. This model relies on (and intends for) the products to only be used a few times before being thrown out.

With our current consumption habits, the best-case scenario is that an item will be purchased and worn a few times before being discarded. That is a pretty pitiful best-case scenario.

Figure 4- Source: Yahoo Images; Fast fashion is often landfilled after only a few wears

 

Varying Disparities

Fast fashion’s impact on human rights varies widely by location. In the United States, textile waste predominantly goes to landfills; a 2007 North Carolina study showed that solid waste landfills are disproportionately located in Black neighborhoods. Abroad, fast fashion companies like Zara and Forever 21 capitalize on cheaper labor in the Global South, resulting in what many have called “modern slavery.”

Extensive human rights violations are associated with fast fashion, from child labor to exposure to toxic chemicals to dangerous working conditions. For instance, in a 2022 undercover investigation, it was discovered that Shein employees work 18-hour days with one day off per month and make as little as 4 cents per garment.

I am keeping this section brief not because these problems are unimportant, but because many people already know about these issues; what we need now is action, so I want to spend more time on potential solutions.

 

Affordability

I would be remiss not to mention the most significant barrier to purchasing slow fashion: affordability.

Since we live in a culture that encourages overconsumption, some may scoff at spending more than twenty dollars on a pair of jeans. We are used to the cheap stuff, accustomed to buying something, using it a few times, and then pawning it off at the thrift store or throwing it in the trash can.

Sustainable brands are notoriously expensive by modern standards, and because they are the exception rather than the rule, not everyone can afford them. In the past, clothing was made to last for generations, so consumers expected to pay higher prices up front.

I want to be clear here that I am in no way trying to romanticize the fashion industry’s past; I highly doubt that many Americans today would want to abolish the minimum wage or have children working in our factories again. With that being said, we have lost the skills, knowledge, and willpower to make our purchases last in a way that respects the resources and labor it took to make each piece.

 

Conclusion and Solutions

In terms of solutions, there are some things that we can do to spark change within the fashion industry. These actions exist on two primary fronts: purchasing and—let me emphasize this one here—NOT purchasing.

Regarding true ethics and sustainability, relying on companies to make ethical decisions is not the best strategy since many of them are dishonest about their products’ true social and environmental sustainability. This includes many brands that some would consider to be “sustainable.” Fashion companies are notorious for greenwashing their products, making them appear a better option, even when most of their clothing is not produced ethically or sustainably.

Because of this, consumers should focus on reducing their overall consumption and, whenever possible, on not buying at all.

The best way to minimize the fashion industry’s exploitation of people and the planet is to stop buying from those brands. If you feel you need something new, I encourage you to return to your closet and shop from there first (because you probably do not need anything). This might sound crazy, but most of us have more than we need, and we must recognize that and act accordingly.

Another solution is to borrow something from a friend or family member. Thrifting or buying secondhand can also be good options to minimize your impact.

All of the examples mentioned so far fall under the “not purchasing” front. If a shirt has holes, learn to mend it so it can be re-worn. If you want to wear something new to an event, ask to borrow something from a friend or try styling something you own in a new way. Use what you have, and you will be forced to be more creative.

Figure 5- Source: Yahoo Images; People shopping at a clothing swap

It can also be helpful to consider the washing instructions for specific items. Many articles of clothing would last significantly longer if they were hang-dried or hand-washed.

When these options have been exhausted and you must purchase something new, be selective. As a consumer, making conscious choices when purchasing new clothing helps dramatically. If you cannot picture yourself wearing something often, or you know the item does not go with anything you have, it is probably a good idea to refrain from purchasing it.

If you cannot afford to spend a lot of money on clothes, fast fashion is going to be the obvious choice, so it is best to focus on making mindful purchases of items you will wear for a long time. Beyond that, the best thing is to take care of your clothes as well as possible to maximize the use you get out of them.

If you love a staple piece from a sustainable brand, try to save up and invest in it—it will likely last for years. I recommend this website for checking on brands you are interested in—it rates brands based on environmental impact, labor conditions, and animal welfare.

 

Final Thoughts

We all experience the desire to have more, and that is not always a bad thing. Still, our culture has a lot of work to do regarding setting realistic expectations about the number of things we think we need.

For better or worse, I am an optimist at heart, and I am confident we can do better.

Where is the Equity? How States Have Disproportionately Underfunded Historically Black Colleges and Universities.

by Jayla Carr

A group of logos of Historically Black College & University teams. Source: Yahoo Images

 

According to the United States Departments of Education and Agriculture, sixteen states have underfunded their land-grant Historically Black Colleges and Universities (HBCUs) by more than $13 billion over the last thirty years. A land-grant college or university is an institution designated by a state legislature to receive benefits under the Morrill Acts of 1862 and 1890. The acts were passed to ensure that higher education would be accessible to all, not only wealthy individuals; before their passage, many of the United States’ institutions of higher education were privately funded and selective about whom they admitted. The acts gave states the power to sell federal land to establish public institutions.

If HBCUs do not receive equitable funding, it can perpetuate inequities in educational outcomes and opportunities for underrepresented minority students. Understanding the history of HBCUs is essential to appreciate the significance of addressing underfunding. Many of these institutions were founded to address historical injustices, and chronic underfunding perpetuates these disparities, reinforcing the notion that Black students deserve fewer resources and opportunities than their white counterparts.

Two Black students looking at a device in a classroom. Source: Yahoo Images

The History of HBCUs

Historically Black Colleges and Universities (HBCUs) have a rich history of providing education to Black men and women in the United States. They emerged in the early 19th century, with institutions like Cheyney University of Pennsylvania in 1836 and Lincoln University in 1854 initially focusing on teacher training.  Over time, these institutions broadened their curricula and became vital education centers for Black individuals, offering various academic programs.

During the Jim Crow era, which lasted from the late 19th century into the mid-20th century, racial segregation laws enforced strict separation of Black and White individuals in public facilities, including schools. Predominantly white institutions were often closed to Black students, and even if they were nominally open, they were often unwelcoming and discriminatory. HBCUs filled this void by providing Black students access to higher education when other options were limited or nonexistent. These institutions offered a safe and nurturing environment where Black individuals could pursue education and intellectual growth. However, these institutions have faced persistent challenges, including funding disparities that hinder their mission of providing equitable education. State funding policies that allocate resources to public higher education institutions are at the heart of these disparities.

A group of people wearing graduation gowns and caps stands in front of a building. Source: Yahoo Images

Addressing the Disparities

In letters sent to the governors of Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Maryland, Mississippi, Missouri, Oklahoma, South Carolina, North Carolina, Texas, Tennessee, Virginia, and West Virginia, the Department of Education highlights the importance of HBCUs. The underinvestment in these institutions should be addressed, given that they generate close to $15 billion in economic impact and have considerable effects on the predominantly Black communities they serve.

In the letter addressed to Governor Kay Ivey of Alabama, the Department of Education highlights the stark contrast between Alabama A&M University, the state’s first land-grant institution for African Americans, and Auburn University, the state’s original land-grant institution, noting the differences in infrastructure and research capacity. U.S. Secretary of Education Miguel Cardona commented that “Unacceptable funding inequities have forced many of our nation’s distinguished Historically Black Colleges and Universities to operate with inadequate resources and delay critical investments in everything from campus infrastructure to research and development to student support services.”

Since the COVID-19 pandemic, HBCUs have seen a massive enrollment increase despite a national decrease in college enrollments. During an interview with PBS NewsHour, Dr. Helene Gayle, president of Spelman College, an all-women's HBCU, attributed the increase in enrollment to an entire generation of young African Americans who have witnessed historic events. The inauguration of the first Black president of the United States, the rise of movements such as Black Lives Matter, and numerous instances of social injustice have motivated and encouraged young people to seek higher education in environments where they are surrounded by their community.

The increase in enrollment has caused some issues for many HBCUs, one being the need for more housing to accommodate the influx of students. Tennessee State University is the best-known case: the university had to rent five hotels for the 2022-2023 academic year. This prompted the Tennessee Comptroller's office to audit the university's financial practices. The report found that TSU had a “lack of planning, management, and sound decision-making.” TSU's own financial decisions play a part in the case. Still, one cannot deny that Tennessee's underfunding of Tennessee State University by $2,147,784,704, the largest shortfall of any state, plays a role in these shortcomings. The University of Tennessee, the state's original land-grant institution, has sixteen housing halls, compared to Tennessee State's eight, including one that just opened in August 2022.

A white building with a star and a blue graduation cap
A white building with a star and a blue graduation cap. Source: U.S. Department of Education

Why HBCUs Matter

HBCUs have a rich history of contributing to research and innovation, often focusing on underrepresented areas in mainstream academia. Unfortunately, underfunding hampers their ability to invest in research projects, labs, and faculty development, affecting their capacity to compete for research grants and produce groundbreaking work. This lack of funding also hurts equity by limiting the contributions of Black professionals and academics in research, innovation, and industries like STEM.

Adequate funding is crucial for maintaining high educational standards, hiring qualified faculty, and offering up-to-date resources and facilities. When HBCUs receive less funding, it can lead to overcrowded classrooms, outdated technology, and limited course offerings. The disparity in educational quality can perpetuate inequities, particularly in the context of historically Black colleges and universities.

HBCUs have historically served as a pathway to higher education for Black students who were often excluded from predominantly white institutions due to racial segregation and discrimination. Inadequate funding can restrict their capacity to enroll and support students, limiting access to quality education. This impacts equity, making it harder for Black students, particularly those from low-income backgrounds, to pursue higher education and achieve social mobility.

Students at underfunded HBCUs may receive a different education and preparation for future opportunities than students at well-funded institutions. Therefore, providing adequate funding to HBCUs is essential for promoting equity and ensuring Black students have access to quality education and opportunities.

A group of people celebrating in front of a building
A group of people celebrating in front of a building. Source: Yahoo Images

Support HBCUs

Growing up, I was fortunate enough to be surrounded by the pride and tradition of HBCUs. Being a native of Birmingham, Alabama, I have had the pleasure of experiencing the biggest HBCU football game, The Magic City Classic, every year. The way the community comes together to support their teams, regardless of the weather, is truly a unique and unforgettable experience.

Funding HBCUs appropriately demonstrates a commitment to inclusivity and solidarity with marginalized communities. These institutions are essential to a more just and prosperous future for all, as they continue to play a vital role in American education and culture. By recognizing the pivotal role of state funding policies, we can work towards a more equitable future where HBCUs receive the resources they need to provide quality education and continue their legacy of empowerment and opportunity. Public policy decisions at the state and federal levels directly impact HBCUs' funding, support, and overall well-being. Advocacy, engagement with policymakers, and the development of equitable policies are essential to addressing funding disparities and promoting equity in higher education for HBCUs.

 

Here is the list of every federally recognized HBCU in the United States. If there is one close to you, I encourage you to support it in any way you can, whether by attending a sporting event or donating.

Understand the Impact of Poverty on Kenyan Society: Unveiling the Struggles Faced by Vulnerable Communities

By Grace Ndanu

An image of Diani Beach in Kenya to showcase some of the natural beauty of the nation
Diani Beach, Kenya; Source: Yahoo Images

Poverty is a deeply rooted issue that affects countless individuals and communities around the world. In Kenya, it is no different. Despite its natural beauty and richness, Kenya faces significant challenges when it comes to poverty, particularly among vulnerable communities.

The high cost of living under Kenya's new government makes the poverty issue even more pressing. Prices have doubled: taxes, food, fuel, and women's products now cost roughly twice what they did before.

An image of a Masai market in Kenya
Masai Market in Kenya; Source: Yahoo Images

One issue arising from poverty is limited access to basic necessities such as food, clean water, and health care. According to a United Nations Development Program report, approximately 36% of Kenyans live below the national poverty line. This means that millions of people struggle to afford even one meal a day, leading to malnutrition and adverse health conditions. Additionally, a lack of access to clean water and proper sanitation facilities further intensifies the spread of diseases, resulting in a higher mortality rate.

An image of Kenyans sorting through the food assistance provided by the United Nations World Food Program.
The UN World Food Program (WFP) assists many Kenyans who face food insecurity. Source: Yahoo Images

Another consequence of poverty is the limited educational opportunities available to children from disadvantaged backgrounds. Before the current government, a typical university student paid approximately 38,000 Kenyan shillings per year; today, that student pays 122,000 shillings per year. Many families cannot afford to send their children to school due to financial constraints, leaving a significant number of young people deprived of basic education. The lack of education perpetuates the cycle of poverty, as individuals without the necessary skills and knowledge struggle to find stable employment.

The impact of poverty is also evident in the housing conditions experienced by vulnerable communities in Kenya. Slums and informal settlements are common in urban areas, where individuals live in makeshift shelters with little to no access to basic amenities. Unsanitary living conditions in these areas increase health risks and disease vulnerability.

An image of a student in Kenya with school materials. Paying for school in Kenya has become increasingly expensive.
A Student with school materials. Nyeri Primary School, Nyeri County, Kenya; Source: Yahoo Images

These challenges are not insurmountable, however. While these issues persist, numerous organizations, both local and international, are working alongside the government of Kenya to tackle them and improve the overall well-being of the Kenyan people. Efforts such as community-based programs, microfinance initiatives, and educational campaigns have shown promising results in uplifting vulnerable communities and breaking the cycle of poverty.

To bring about lasting change, it is crucial for individuals, governments, and organizations to come together and address the root causes of poverty in Kenya. This includes investing in sustainable agriculture practices, promoting entrepreneurship and job creation, improving access to quality education, and providing support for health care and social welfare systems.

An image of the Parliament of Kenya.
Parliament of Kenya; Source: Yahoo Images

In conclusion, poverty remains a critical issue in Kenyan society, affecting vulnerable communities in various aspects of their lives. By understanding the impact of poverty and actively working towards its eradication, we can create a brighter future for all Kenyans.

Native American Lands and Their Children: A History

An image of two Native American children in their cultural garbs.
Image 1 – Source: Yahoo Images

I would like to start this piece off with a land acknowledgment, in which I acknowledge the truth of whom the lands of America truly belong to. The land on which I sit to write this article, as well as the lands occupied by all who reside in America, once belonged to the many diverse communities that existed long before America got its name. These once prosperous, thriving lands (belonging, in my case, to the Creeks and Choctaw) were respected and honored through the sacred relationship the various tribal communities held between themselves and their environment. It is in honor of their stewardship and resilience that I hope to shed light on some of the more gruesome, nefarious betrayals they have experienced at the hands of colonizers since their tribal ancestors witnessed the colonizers' arrival on their lands in 1492.

Before the European colonizers arrived on this land, there existed a diverse array of tribal communities, over a thousand different ones just in the mainland we call America today. Now, these tribes have been reduced to no more than 574 federally recognized ones, with dwindling tribal membership numbers, a fact that can only be blamed on the federally sanctioned behaviors of the colonists. So much has been stolen from the diverse groups of indigenous people since the colonization of the North American lands first began. The original indigenous peoples offered the newly arriving colonists hospitality and taught them how to cultivate the lands of America and brave the New Frontiers. Yet what they received in gratitude was bloodshed, tears, death, and betrayal. So many treaties and promises were broken. According to Howard Zinn, author of “A People’s History of the United States,” the indigenous population of the Americas by the time the explorers landed was anywhere between 25 and 75 million people. They had moved into these fertile lands 25,000 years ago, long before the explorers “founded” the Americas. For those interested in learning a truthful history of America, please check out his book. The book begins in 1492 and continues to examine historical events up to contemporary times and phenomena such as the “War on Terrorism.”

There is so much information to be covered on this topic, and the more I researched, the more I found. I want to do this topic justice, and I cannot do so until the historical context has been put in place. Hence, this will be a two-part deep dive into the Native American lands, their cultural lifestyles, their relationship with the environment, and what this means for their existence in a capitalist, contemporary society. Part one will focus on the history of Native American lands, the process of treaties and loss, and the cruel, scheming ways of the federal government that attempted to indirectly, yet forcibly, steal lands away from Native Americans by targeting the youngest members of their tribes. Part two will focus on the Indian Child Welfare Act, the fight (and entities involved) in support and against it, how the environment plays a role, and the vast consequences of the recent Supreme Court ruling on the matter, both in terms of the welfare of these indigenous children, as well as the issue of tribal sovereignty. There is a lot to unpack here, so without further ado, let’s begin with a deeper understanding of the relationship that indigenous communities share with their lands.

It’s All About the Land; It Always Has Been

An image depicting all the various different indigenous tribes that existed in America before the European Settlers landed
Image 2 – Source: Yahoo Images; An image depicting the various indigenous tribes that were present in America before the European Settlers landed.

The European settlers had a problem with the Native Americans from the moment they landed in America. For one, they thought the indigenous way of life to be “savagery” and believed that the Native Americans needed to be “civilized”, something they believed only Europeans could teach them about. They found the gods and spirituality of the various indigenous cultures to be blasphemous and nonsensical, and many Europeans attempted to convert the Natives to Christianity, a more “proper” religious belief. Most of all, though, the Europeans and the indigenous communities had vastly different concepts of property and land ownership. To the settlers, who came from the feudal systems of Europe, land was a commodity, purchased and sold by individuals, and prosperity (and social status) was determined by who owned the most properties, and the most prosperous lands. They became lords and could employ the less fortunate to work under them, paying them a fraction of their profits, while keeping the rest for themselves. This was how things worked in Europe back home, and this is the system they brought with them when settling in the New World.

Native Americans, however, had a different lifestyle and concept of ownership. To them, the thought of owning a piece of land was bizarre, as they viewed the land as belonging to the various energies and life forms that existed on it. The tribal lands of an indigenous community not only fed and nurtured the tribal members but also protected the tribe's history and held the ancestral burials of their people. The indigenous communities had a spiritual and emotional connection with their tribal lands, one that cannot be sold to another, much as you cannot sell the relationship you hold with your family. Many (if not most) Native tribes even practiced animism, a belief system that accepts all living and non-living things (and natural phenomena) as being capable of having a life force (or soul). For Native Americans, land ownership was a foreign concept, and everyone in their community held rights to the land their tribe lived on. In fact, when European settlers began purchasing lands from the Native Americans, the indigenous people believed they were only “leasing” the lands to the settlers, not giving up their rights to them. For the indigenous communities, the land was just as much a right of every human as sunlight, water, or air.

The Native Americans' relationship with their lands was also threatening to the European lifestyle of land ownership and individualism. This struggle, between an individualistic view of community and a collective one, is, as they say, a “tale as old as time.” For Europeans, who believed individual merit and hard work to be the true characteristics of a successful individual, success could only be displayed by the vastness of their empires, figuratively and physically. Hence, land ownership was a symbol of status and, in a way, a testament to a person's character. For Native Americans, who focused on collective success rather than individually standing out, the strength of the tribe was the result of the part each individual member played in ensuring that success. This meant that everyone had a role, and if each played it well, everyone in the tribe benefited. This was how tribes survived even as they warred against each other.

Treaties and Deals

An image depicting colonial men engaging in treaty making with a Native American tribe
Image 3 – Source: Yahoo Images; Many treaties such as this were brokered between various Indigenous tribes and colonists, yet they were seldom upheld and often violated or broken.

Due to these differences between the indigenous communities and the European settlers, many struggles broke out between the two groups between 1492 and 1700. In an attempt to keep the peace between the settlers and the indigenous communities, the British Crown issued the Proclamation of 1763, which awarded the colonists all the lands east of the Appalachian Mountains, while everything west was promised to the various Native American populations that lived in those regions. This did not make the colonists happy, as they believed the King was preventing them from expanding, and it was one of the grievances they listed in the Declaration of Independence. Many scholars claim that the Proclamation of 1763 helped push the colonists to pursue a revolution against the Crown. The diverse indigenous populations attempted to stay out of the Revolutionary War, as they believed it to be a family feud between the British King and his colonial subjects. Yet when they did take part in the war, their participation was varied: some joined the rebelling Americans, others joined the forces of the monarchy, and still others chose to remain neutral, not wanting to support either side of the struggle. When Britain lost the Revolutionary War, as part of the treaty signed between Britain and the newly established United States, Britain had to give up all the lands it laid claim to in America, including many of the lands that had been promised to the Native American tribes living west of the Appalachian Mountains. This happened without consent from, or discussion with, the Native Americans who resided in those parts. When the colonists came to take over much of the land that had been promised to the Native Americans through the Proclamation of 1763, they justified their brutality by blaming the Native Americans for supporting the British in the Revolutionary War, and when the Native Americans tried to fight back for their lands, the British were nowhere to be seen. This was yet another episode of betrayal experienced by the indigenous populations at the hands of the settlers and the British Crown. Yet this was just the beginning; the atrocities and betrayals were far from over.

Following the Revolutionary War, and as a result of the resilience shown by the many indigenous communities protecting their lands, the United States decided to create treaties with the various indigenous tribes in an attempt to set boundaries to their lands and “compensate” them for the lands taken from them. I have “compensate” in quotations because, first, no amount of money or goods can compensate for lost lives, which is what many tribes experienced; some tribes became extinct as a result. Second, on the indigenous side these treaties were often signed by members who did not have the authority to cede tribal lands, and Congress seldom ratified the treaties signed on the part of the United States. This meant the process was more of a theatrical expression than anything else, and the United States continued to steal the lands of indigenous people. Third, as discussed above, many indigenous people who did engage in treaty-making assumed they were simply “leasing” their lands to the colonists, not selling their rights to them outright, so there were miscommunications and misunderstandings as to what the treaties actually established. Finally, the United States Congress and Supreme Court established that the indigenous tribes were not capable of engaging in treaty-making and ended the whole process altogether in 1871, claiming that Congress had full control over “Native American Affairs.”

An image depicting the infamous Trail of Tears, where thousands of indigenous people were forcefully driven out of their ancestral homes and marched into Oklahoma.
Image 4 – Source: Yahoo Images; An image depicting the infamous Trail of Tears, where thousands of indigenous people were forcefully driven out of their ancestral homes and marched into Oklahoma.

In an attempt to hasten the process of transferring lands from Native American tribes to the hands of the government, the United States passed the Dawes Act of 1887. Many of the treaties made between the US and the various nations included provisions in which tribes were expected to distribute their lands among their members so that lands were held by individuals rather than by the tribe as a whole. For reasons explained earlier, the settlers were threatened by the communal lifestyles of the Native American tribes and believed that having individual members hold rights over smaller portions of land would make it easier for them to accept European lifestyles and give up their “backward” ways. The Dawes Act forced these indigenous members to choose a parcel of land for themselves and their families (the size of the parcel was determined by the government), and any excess land left after this process was sold to the government to be used by non-native residents and corporations alike. Millions of acres of land were stolen from various indigenous tribes as a result. This essentially acted as a way to separate individual Native Americans from their larger tribe and weaken their sense of community and tribal sovereignty as a whole.

Since the end of the Revolutionary War, the United States government has made about 374 treaties with various indigenous nations across the country. The United States has either violated or outright broken nearly all of these treaties it created as a promise of peacekeeping. Many of these treaties were obtained through coercion or force, such as threatening communities with starvation if they refused to sign. Of the various treaties that were violated and broken, the one that comes to mind for anyone even slightly familiar with American history is the actions of then-president Andrew Jackson and his Indian Removal Act of 1830. Although Jackson had negotiated treaties with various tribes in the Southeast in an attempt to get them to move west of the Mississippi River voluntarily, when he became president, he passed the Indian Removal Act of 1830, forcibly removing almost 50,000 people from their homes. Such forcible removal would today be recognized as the forcible deportation of a population, specifically a crime against humanity; under the United Nations Rome Statute of the International Criminal Court, it is one of the most heinous systematic crimes committed throughout history. Jackson did this to clear lands for cultivating cotton, which would lead to another atrocious development: the expansion of plantation slavery in the South.

History of the Forced Assimilation of Native American Children

An image of indigenous children dressed in military garb posing with an adult at one of the boarding schools set up across the country in efforts to assimilate the children.
Image 5 – Source: Yahoo Images; An image of indigenous children dressed in military garb posing with an adult at one of the boarding schools set up across the country in efforts to assimilate the children.

The government also used more treacherous means to acquire lands from the indigenous populations. Native American children were forcefully assimilated into American culture in an attempt to beat and torture their culture out of them. The existence of the Federal Indian Boarding School System was proof of this very thing. Recently, the United States government conducted an internal investigation of its treatment of Indigenous children, prompted by the discovery in Canada of over 215 unmarked graves at a residential school in 2021. This report, led by Deb Haaland, the Secretary of the Interior, highlighted many nefarious ways in which tribal lands were stolen from different indigenous nations and the atrocities that were forced upon the children of these nations.

To explore some of the details outlined in this report (specifically pages 20-40), plans for forcible assimilation had been in place since the days of George Washington. This plan to forcefully erase indigenous culture and assimilate the children into Western culture was seen as the “cheapest and safest way” to take the tribal lands, ensure a less violent relationship between the colonists and the Native Americans, and transform the tribal economy so tribes would be prepared to live off smaller and smaller parcels of land. The government found a way to weaponize education to accomplish this task.

Elaborating on George Washington's proposal, Thomas Jefferson, the third president of the United States, put forth a two-step solution to acquire more lands for the colonists. First, he argued that Native Americans could be forcefully assimilated into European culture, discouraged from living a nomadic lifestyle (which requires vast areas of land) and encouraged to adopt an agricultural lifestyle similar to the colonists' (which can be carried out on a few cultivated acres). Second, he proposed that the United States place indigenous populations in debt by encouraging their use of credit to purchase goods. This, he presumed, would make them default on their debts, and when they were unable to pay back their loans, they would be forced to give up their lands. The land acquired by the government would then be sold to non-native settlers, and the profits from these land sales would be put back into the education programs for the forceful assimilation of native children.

Sanctioned by the United States government, indigenous children were kidnapped from their homes, whether they wanted to go to boarding school or not (and with or without parental permission), and placed in schools located far away from their tribal lands. The plan was to erase the relationship these children had with their cultures, communities, and lands, and instead instill individualism in them, in the hope that the communal lands could be broken up into individual parcels, making them easier to be seized by the government and private entities alike. Officials called the different lifestyle and relationship tribal members held with their land and their community the “Indian Problem,” and Thomas Jefferson's two-part proposal was seen as a “key solution” to it. If Native American children were forced to become dependent on agricultural lifestyles, the thinking went, they could be “civilized.” The government believed that if you separated children from their families and tribal connections at a very young age, what they were introduced to would be all they knew, and they would become strangers to indigenous lifestyles. In turn, the government assumed, the children would not want to go back home and live on the reservations but would instead be much more likely to assimilate and live amongst the colonists.

As a result, indigenous families were broken apart, and indigenous children were placed with white families as part of the “outing system.” This meant that the children were forbidden to speak in their native languages and were required to speak English to communicate their needs. What’s worse, they were placed with children from other tribes, meaning that their common language of communication was English, and any children they would have would grow up learning English as their first language instead of the tribal languages of their ancestors.

To support the government in this endeavor, many churches were given legal power over reservations. The military was called in to enforce the orders of these religious institutions. The government often paid these institutions a sum for each child if they operated a boarding school. The churches went along with it because they believed that indigenous “paganism” kept the people from becoming “civilized,” and that to fully do so, the indigenous children needed to accept Christianity. The government worked with churches from many denominations, funding them to build the Federal Indian Boarding School System.

Treatment of Indigenous children in these boarding school systems

An image depicting three children before and after the assimilation process at the boarding school. On the left, the three children sit in their cultural garments, proud of their cultural identity. On the right, the same three children have had haircuts and been groomed (both physically and psychologically) to appear more Western.
Image 6 – Source: Yahoo Images; An Image depicting three children before and after the assimilation process at the boarding school. On the left, the three children sit in their cultural garments, proud of their cultural identity. On the right, the same three children have had hair cuts, and been groomed (both physically and psychologically) to appear more Western.

 The Federal Indian Boarding School System was problematic in so many ways. Not only did it forcefully assimilate indigenous children, but the system also took a militaristic approach to education, abusing and mistreating these children in the process. The living conditions at these Boarding schools were terrible. There was no access to basic health care needs, and diseases ran rampant across the schools. The children were malnourished, as they were provided with food and water of poor quality. There was an overcrowding issue, with many facilities forcing multiple children to share one bed as a result. There were not enough toilets to serve the number of children at each facility, and the toilets were not properly maintained.

The infrastructure in these facilities was so poor because they were not built specifically to house these children as facilities for education. Rather, these children were placed in abandoned government buildings or military forts to carry out their education. There was also the issue of child labor, where the children were expected to provide all the services required to run the facilities. This included looking after the livestock, chopping wood, making bricks, sewing garments to clothe the other children, working on the railway, cooking, and cleaning for the others in the facility, and so much more.

The children were expected to take care of themselves and the other children at the facilities. They were also tasked with work from various fields like carpentry, plumbing, blacksmithing, fertilizing, helping with the irrigation system, helping make furniture for use in the facilities (such as tables, chairs, and beds), and anything else that involved physical labor. These jobs the children were trained in would forever keep them at a lower socioeconomic level than their White counterparts. Here too they tried to instill the patriarchal norms of Western society, making sure to teach and employ young girls to work as assistants and cooks, while the young boys were expected to be farmers and industrial workers.

The Indigenous children were forcefully assimilated into American culture. They were told to stop practicing their faiths and were stopped from performing any spiritual and/or religious rituals. The children were expected to go by the English names they were given at the boarding school instead of their Native names given to them at birth. They were forced to cut their hair (which was sacred to many indigenous people as it represented their cultural identity), and were forbidden to wear their cultural clothes and instead were put in military garb.

Those who resisted the assimilation or tried to run away were caught and severely abused and punished. They were put in solitary confinement, whipped, slapped, starved, and abused for fighting to retain their culture. Many of the older children at the facilities were forced to punish the younger children, further dividing the children and destroying any opportunity they may have had to band together and resist the assimilation forces. As a result of what the Federal Indian Boarding School System put these children through, over 50 marked and unmarked burial sites have been found. These burial sites have so far yielded the remains of more than 500 indigenous children, and the numbers are expected to rise into the thousands. Many indigenous children who survived these boarding schools report long-lasting impacts on their health and their lives. As adults, they face higher risks of cancer, diabetes, and tuberculosis, experience heightened mental health issues, and many remain in a lower socioeconomic class as a result.

Cultural Genocide

This image depicts the number of indigenous people that exist today in comparison to what we saw before.
Images 7 & 8 – Source: Yahoo Images; This is a map that depicts how many indigenous members exist today, and what is left of their lands.

Many believe this forcible assimilation program conducted by the federal government to be a cultural genocide, a state-sanctioned attempt at the erasure of an entire culture. The official definition of genocide, as established by the Genocide Convention and reflected in the Rome Statute of the International Criminal Court, reads as follows: “…any of the following acts committed with intent to destroy, in whole or in part, a national, ethnical, racial, or religious group, as such: killing members of the group; causing serious bodily or mental harm to members of the group; deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part; imposing measures intended to prevent births within the group; forcibly transferring children of the group to another group.” As per this definition, the acts carried out by the United States government against the children of various Native American tribes fulfill most, if not all, of these categories. Native Americans have been killed by federally sanctioned actions throughout history. Serious bodily and mental harm has been caused to members of these various indigenous groups throughout American history. The government deliberately placed young children in conditions of life that ensured their destruction as members of the Native American tribes to which they belonged. The children within these facilities were not allowed to mingle with others from their own tribes, making it harder for them to retain and pass down their cultural identities, as well as to procreate with members of their own tribe. Finally, they were forcibly taken away from their parents and placed in these facilities and other non-native homes in an attempt to erase their cultural backgrounds. All these actions, as the recent report we explored at length in this blog discovered, were done with the intent to destroy the rich cultures of the various Native American tribes. So the forcible assimilation of Native American children can, without a doubt, be characterized as cultural genocide.

The main goal of this blog was to establish the historical context of what the various Native American tribes endured, as well as the intentions of the federal government in its dealings with the different native nations present in America. Part two of this conversation will focus on a specific piece of legislation that has gained a lot of attention in recent years, the Indian Child Welfare Act. At face value, this legislation simply addresses the long, detailed history of what Native American children have endured and sets guidelines to ensure that proper regulations are in place to prevent a repetition of history. Yet it has now been challenged, partly with the help of very shady moneyed interests, and its fate (and the overarching consequences that follow from it) was placed in the hands of the nine Supreme Court justices of the United States. We will explore more about this legislation and the case in the next blog.