
2025’s Must-Read Poetry And Novels

2025 brings a fresh wave of captivating literature—whether you’re drawn to enchanting origin stories, thought-provoking novels, or powerful poetry collections, this year promises a literary treat for readers of every genre to enjoy.

Vianne – Joanne Harris

Fans of the beloved novel Chocolat will be delighted by Joanne Harris’s prequel, Vianne, set for release in the UK on 22nd May 2025 by Orion Books. This enchanting tale explores the early life of Vianne Rocher—known then as Sylviane Rochas—as she embarks on a transformative journey from New York to Marseille. Pregnant and seeking a fresh start, she discovers her passion for crafting chocolate-infused delicacies, setting the stage for the magic that unfolds in Chocolat. Harris masterfully weaves themes of pleasure, heartbreak, and culinary artistry, promising readers a rich and immersive experience.

All I Know About A Heavy Heart Is How To Carry It – Chisom Okafor

Nigerian poet Chisom Okafor’s debut collection, winner of the Jacar Press New Voices Award, is set for release in November. Blending lyrical intensity with raw honesty, it offers a profound exploration of vulnerability, masculinity, and personal history.

Okafor’s unflinching narrative voice and deeply personal themes position this as one of the most powerful and anticipated poetry releases of the year.

Girl Dinner – Olivie Blake

Bestselling author Olivie Blake, known for The Atlas Six, ventures into satirical territory with Girl Dinner, set for release on 23rd October 2025.

This sharp, provocative novel critiques society’s obsession with wellness, food culture, and perfection, challenging conventional norms with Blake’s signature wit and inventive storytelling. With its timely themes and clever social commentary, Girl Dinner is poised to be a standout in contemporary literature.

The Emperor Of Gladness – Ocean Vuong

Acclaimed poet and novelist Ocean Vuong returns with The Emperor of Gladness, scheduled for release on 15th May this year.

The novel follows 19-year-old Hai, who, on the brink of suicide, is saved by Grazina, an elderly woman with dementia. As he becomes her caretaker, their unlikely bond explores intergenerational friendship, mental health, and the complexities of human connection. Vuong’s signature lyrical prose ensures a deeply moving experience that will resonate across generations.

There Lives A Young Girl In Me Who Will Not Die – Tove Ditlevsen

Danish author Tove Ditlevsen’s poetry collection There Lives a Young Girl in Me Who Will Not Die has been newly translated into English and was released in January.

A prominent literary figure in Denmark, Ditlevsen is celebrated for her unflinching portrayals of childhood, identity, and personal struggle. This collection charts her poetic evolution—from structured forms to more abstract expressions—offering raw honesty and emotional depth that will resonate with readers decades after her passing.

In today’s interconnected world, learning a second language is more valuable than ever. Among the many methods available, immersion stands out as one of the most effective for achieving fluency. This approach places learners in environments where they are surrounded by native speakers, similar to the natural process of acquiring a first language. By bridging the gap between theory and practice, immersion pushes learners to adapt quickly by engaging in conversations, asking for directions and completing everyday tasks in the new language.

What Is Language Immersion?

Language immersion is a method in which learners immerse themselves in real-world contexts, such as living in a country where the language is spoken or interacting extensively with native speakers. This approach prioritises effective communication and comprehension over strict grammatical accuracy, emphasising the practical application of language in everyday situations.

Through constant exposure and practical use, learners develop not only linguistic skills but also cultural understanding, gaining confidence in real-world situations. Immersion challenges individuals to think, respond, and express themselves entirely in the new language, making it an invaluable tool for anyone striving to achieve true fluency and cultural connection.

Benefits Of Immersion

Language immersion encourages learners to practice multiple skills simultaneously, such as speaking, listening, reading and writing. It often involves exposure to the culture associated with the language, helping learners understand cultural contexts, idiomatic expressions, local pronunciations and subtle nuances. This immersive environment fosters confidence by reducing the fear of making mistakes through constant practice.

By tying vocabulary to emotions and specific real-life events, immersion makes learning more memorable and speeds up the acquisition process. Other advantages include a more enjoyable and engaging way to learn, opportunities for networking and forming connections and cognitive benefits such as improved problem-solving skills, enhanced memory and greater mental flexibility.

Classroom Learning Enhancement

Both immersion and classroom-based learning have unique advantages: classroom-based methods offer structured learning, guided practice and a clear progression of skills, while immersion focuses on real-world communication and adaptability. When combined, these approaches can create a comprehensive and well-rounded language-learning experience.

Study abroad programs are an excellent example, allowing learners to live in another country, attend language classes and interact daily with native speakers, blending structure with practical application. For those unable to study abroad, alternatives like watching foreign films, listening to music or reading books in the target language can provide cultural context and reinforce vocabulary. Additionally, language exchange platforms offer opportunities for conversation practice, enabling learners to engage with native speakers and build confidence from anywhere in the world.

Challenges Of Immersion

Immersion, particularly when living in another country, can lead to culture shock, where adapting to a new culture may feel overwhelming. Communication barriers and the initial steep learning curve can be daunting. Without formal structure, tracking progress can become challenging and learners may experience a plateau once they reach an intermediate level of competence. Additionally, feelings of isolation, homesickness or financial constraints can add stress and make the experience more difficult. Despite these challenges, with the right support and mindset, immersion can still be an incredibly rewarding way to learn a language and gain cultural insight.

Folklore predates recorded history. At its most basic, folklore is the communal stories and traditions that have been passed down through generations by word of mouth.

That means many forms of oral storytelling can reasonably be defined as folklore: ballads, proverbs, riddles, oral epics (such as the Homeric poems) and plays, to name a few. The list goes on.

There are easy answers as to how some aspects of folklore have survived. After all, we all tell stories, and enjoy lending an ear to rumour and mythology. The preservation of cultural traditions and heritage, however, has been a more conscious effort. And rightly so!

Let’s explore why folklore has endured.

The Brothers Grimm

Jacob Ludwig Karl Grimm and Wilhelm Carl Grimm were young German librarians, pioneering linguists, and famed folklorists. You’ve likely heard of them.

The Grimms collected stories and published them in Kinder- und Hausmärchen (1812), their first volume. In English-speaking nations it was known as Children’s and Household Tales, but today we know the great work as Grimm’s Fairytales.

They didn’t write these fairytales, as is often believed. However, their work in rediscovering folkloric tales and setting them down as written narratives was, and still is, the best known. The Grimms also modified previously written work for their collections, as with Charles Perrault’s Cinderella of 1697.

So prolific were the Brothers Grimm that lost works of theirs are still being rediscovered; 27 of their books were recently unearthed in Poland, potentially opening up fresh avenues of folkloric study. Ultimately, however, it is that first volume of stories that is largely credited with inspiring the study of folklore, both past and present.

Academic Study

Academia often leads the charge on anything to do with cultural preservation. The topic of folklore is no exception.

While it has ancient origins, folklore became an avenue of study in the early 19th century. It was the Scottish scholar Andrew Lang and the English anthropologist Sir Edward Tylor, amongst others inspired by Grimm’s Fairytales, who set out to ‘reconstruct’ the beliefs and rituals of prehistoric man. As they collated all the oral folklore they could, museums and archives soon came to fruition. A nationalistic outlook underpinned these efforts.

In many institutions across the UK today, folklore is recognised as a field of study. It has its own academic body and methodology. There are Master’s degrees in the subject, and the University of Oxford offers an online course with pre-recorded lectures and weekly Microsoft Teams meetings.

Of course, folklore will often make appearances in other avenues of study, too. Much of art, music, history, and literature are derived from a starting point of traditional folklore. Scholars in the humanities frequently trace back these origins with glee and investigate how these influences all coalesce and continue today.

Folklore As Identity

Of course, folklore is more than just a series of books to be studied. For many people, folklore forms a huge part of their identities.

What is danger? What does love look like? Fairytales, and thus folklore, often form the basis of our comprehension from an early age. From literature to cinema, folklore is everywhere and has played a role in comforting us during hardship. It is always in our hearts and minds.

From that understanding comes social cohesion. People connect with one another through folklore’s involvement in their lives. Rich or poor, young or old. Friendships made, or careers chosen. Folklore has given people the ability to interpret their world through storytelling, and with it a truly timeless way to connect.

The Bering Land Bridge, or Beringia, was a massive landmass that once connected Asia and North America during the Ice Age. Spanning up to 1,000 miles from north to south and 600 miles wide at its peak, this expansive region served as a vital corridor for the migration of humans and animals. It played a crucial role in shaping the movement of species and the course of human history. Genetic studies reveal that individuals across North and South America share direct ancestral links with populations in present-day Eastern Russia, highlighting the impact of this ancient land bridge.

Formation Of The Bering Land Bridge

The Bering Land Bridge emerged as a result of dramatic climatic and geological changes during the Ice Age, particularly in the Pleistocene Epoch. During this time, glaciers trapped vast amounts of the Earth’s water, causing sea levels to drop by as much as 400 feet, which exposed the Bering Land Bridge. This vast landmass supported diverse plant and animal life, creating a rich ecosystem that provided essential resources for migrating species. Over millennia, the bridge was repeatedly exposed and submerged due to the cyclical nature of glaciations and interglacial periods.

Human Migration

Archaeological and genetic evidence suggests that the first humans to cross the Bering Land Bridge were small groups of hunter-gatherers in search of animal food sources and more favourable climates. According to the “Beringian Standstill Hypothesis,” these early migrants are believed to have spent thousands of years living on the bridge itself, adapting to its unique environment. Once ice-free corridors opened up, these populations moved southward, eventually spreading throughout the Americas.

Ecological Impact

The Bering Land Bridge served as a critical corridor for the exchange of plant and animal species between continents during the Ice Age. North American species such as horses and camels migrated into Asia, while Asian species like mammoths and saber-toothed tigers journeyed into North America. This bidirectional movement enriched biodiversity on both continents and played a pivotal role in shaping evolutionary pathways.

Bering Land Bridge Legacy

The disappearance of the Bering Land Bridge around 11,000 years ago led to the isolation of plant and animal populations, driving distinct evolutionary pathways. The earlier exchange of species had enriched the genetic pool and fostered ecological resilience on both continents, highlighting the transformative impact of such connections. The impact of the Bering Land Bridge serves as a reminder of the pivotal role geographical features play in shaping biodiversity. Even today, the shared ancestry of certain plants and animals in Asia and North America stands as evidence of these ancient migrations.

A Local Call For Change

Africa, often celebrated as the cradle of humanity, is a continent of immense diversity and potential. With its rich cultural heritage, vibrant young population driving innovation and productivity, and a wealth of the world’s natural resources, Africa stands as a cornerstone of global history and future growth.

Over recent decades, Africa has been a long-term recipient of foreign aid aimed at assisting with poverty, healthcare, education and infrastructure. There is, however, a call for change within Africa, a call that promotes long-term goals of empowerment, self-reliance and sustainable partnerships.

The Legacy Of Aid In Africa

Since 1960, Africa has received billions of dollars in financial aid. While these initiatives have undoubtedly played a large part in addressing pressing concerns, their impact has been both positive and negative. Economic support has been instrumental in humanitarian relief, saving countless lives during crises like the 2011 Somali famine and the 2014 Ebola outbreak in West Africa. It has also strengthened healthcare through vaccination programs and AIDS treatments, while funding schools and expanding access to education.

The Need For Change

Despite these successes, international aid has also received criticism and sparked debate about its long-term effectiveness. It has been argued that financial aid focuses on short-term relief that creates a cycle of dependency. For instance, free food supplies have sometimes undercut local farmers, negatively impacting their livelihoods. Additionally, corruption and mismanagement have often obstructed the effective use of funds, leaving those most in need without support.

There is a growing call for greater involvement from the local communities that aid is intended to support. Many donors often overlook the complexities, priorities and cultural nuances of the people they aim to assist, resulting in solutions that may not align with their needs.

Reimagining Aid

Many African leaders and communities are calling for more locally led initiatives. These bottom-up approaches can ensure that aid is better aligned with local needs as part of a long-term, sustainable solution. Many critics of traditional aid are pushing for equitable trade partnerships and fair access to global markets, investment in local industries and private sector involvement to help drive economic growth and job creation.

There is a clear need to shift the narrative of aid towards one that highlights Africa as a hub of innovation, entrepreneurship and cultural wealth. Partnership programs that promote skill-sharing, technology transfer and joint ventures will empower Africa to take a more active role in shaping its own development.

If you’ve ever wondered about the invention of computers, you might think of Alan Turing, or those massive banks of computers that sent Apollo to the Moon. But even by then the idea of a computer wasn’t exactly new. In the early 19th century, there was Charles Babbage—a man whose brilliant mind conceived the fundamental principles of modern computing a good century before they became reality. There was just one slight problem: he was more than a little ahead of his time.

Babbage The Problem Solver

Born in 1791 in London, Charles Babbage was a clever bloke. He was a mathematician, inventor, and philosopher. From a young age, he was a bit of a prodigy, showing an insatiable curiosity and a knack for solving problems. By the time he attended Cambridge University, Babbage was already questioning the limitations of existing mathematical methods.

In the early 19th century, mathematical calculations were laboriously done by hand, often riddled with errors. It was all long division and chunking – no phones or calculator papers in your GCSEs back then. Babbage, frustrated by these inaccuracies, asked himself a revolutionary question: Could a machine do this work more reliably? From this question arose his first major invention: the Difference Engine.

Making A Difference… Engine

The Difference Engine was designed to automate the production of mathematical tables, eliminating the errors inherent in manual calculation. Essentially, it was a calculator powered by steam, using gears and levers to perform those tricky calculations and keep track of all those ones to carry. Babbage envisioned it as a tool for astronomers, navigators, and engineers—anyone who relied on precise calculations.
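The trick the engine mechanised is the method of finite differences: for a polynomial, once the first few differences have been worked out, every further table entry can be produced by addition alone, which is exactly the kind of work a train of gears can do. Below is a minimal Python sketch of that principle; it is only an illustration of the underlying maths, not a model of Babbage’s hardware, and the helper names (difference_table, tabulate) are ours.

```python
# Minimal sketch of the method of finite differences, the principle behind
# the Difference Engine: a polynomial table is produced by repeated addition
# alone, with no multiplication needed once the initial differences are known.

def difference_table(f, start, step, order):
    """Return [f(x0), first difference, second difference, ...] at x0 = start."""
    values = [f(start + i * step) for i in range(order + 1)]
    diffs = []
    row = values
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(diffs, count):
    """Generate `count` successive table values using additions only."""
    diffs = list(diffs)
    out = []
    for _ in range(count):
        out.append(diffs[0])
        # Update each difference by adding the one below it, mimicking the way
        # the engine's columns of gears carried values upwards.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return out

poly = lambda x: 2 * x**2 + 3 * x + 5        # a degree-2 polynomial
seed = difference_table(poly, start=0, step=1, order=2)
print(tabulate(seed, 8))                     # [5, 10, 19, 32, 49, 70, 95, 124]
```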

Work on the Difference Engine began in 1822, with funding from the British government. However, the machine’s complexity soon became a stumbling block. With over 25,000 precision-engineered parts needed, the technology of the time simply couldn’t meet Babbage’s exacting standards. Sadly for Babbage, his project was eventually abandoned, unfinished, in 1833.

From Calculators To Computers

While the Difference Engine was impressive, it was the Analytical Engine that cemented Babbage’s place as the father of computing. Conceived in 1834, this machine wasn’t just a calculator—it was a general-purpose computer. The Analytical Engine had all the core components of a modern computer:

• A Mill: The equivalent of today’s CPU, it performed mathematical operations.
• A Store: A memory unit for holding numbers and intermediate results.
• Input and Output Devices: Punch cards would input data, and a printer would output results.
• Sequential Control: It could execute instructions in a specific order, akin to modern programming.

Perhaps the most remarkable feature of the Analytical Engine was its programmability. Babbage envisioned using punch cards, similar to those used in looms for weaving patterns into fabric, to instruct the machine. This meant it could solve a variety of problems, not just a single task—a revolutionary concept in the 1830s.
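To make that idea concrete, here is a toy sketch of the architecture listed above: a “store” acting as memory, a “mill” doing the arithmetic, and a sequential list of instructions standing in for punch cards. It is a loose illustration of the concept rather than Babbage’s actual card format or instruction set, and the operation names are invented for the example.

```python
# Toy illustration of the Analytical Engine's layout: a store (memory),
# a mill (arithmetic unit) and sequential control reading "cards" in order.
# This is a conceptual sketch, not Babbage's real instruction set.

def run(cards, store):
    """Execute each card in sequence, reading from and writing to the store."""
    mill = {                                   # the mill performs the arithmetic
        "ADD": lambda a, b: a + b,
        "SUB": lambda a, b: a - b,
        "MUL": lambda a, b: a * b,
    }
    for op, left, right, dest in cards:        # sequential control
        store[dest] = mill[op](store[left], store[right])
    return store

# Compute (a + b) * (a - b) for a = 7, b = 3, driven entirely by the cards.
cards = [
    ("ADD", "a", "b", "sum"),
    ("SUB", "a", "b", "diff"),
    ("MUL", "sum", "diff", "result"),
]
print(run(cards, {"a": 7, "b": 3})["result"])  # prints 40
```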

Charles Babbage Was His Own Worst Enemy

Despite his genius, Babbage’s perfectionism and constant tinkering meant that he never completed a single working model of either the Difference or Analytical Engine. Additionally, his difficult personality alienated potential supporters, and the British government eventually withdrew funding.

Another major obstacle was the technology of the time. Precision engineering in the 19th century wasn’t advanced enough to produce the intricate parts Babbage’s machines required. As a result, his designs remained theoretical blueprints, gathering dust in archives for decades.

Though he never saw his machines come to life, Babbage’s work profoundly influenced future generations. In the 20th century, his ideas were rediscovered and celebrated as the foundation of modern computing. Alan Turing, often considered the father of computer science, was heavily influenced by Babbage’s concept of a programmable machine.

In 1991, a team at the Science Museum in London built a working Difference Engine using Babbage’s original designs. The machine worked flawlessly, proving that his ideas were sound and that it was the limitations of his era that had held him back. So next time you boot up your computer or pull out the calculator on your phone, spare a thought for Charles Babbage. Without his temporally misplaced intellect, who knows how differently things could have turned out?

Before HMRC…

You may have heard the phrase, “The only things certain in life are death and taxes,” famously attributed to Benjamin Franklin. In the UK today, His Majesty’s Revenue and Customs (HMRC) ensures this certainty via taxation of citizens in a structured manner, typically through payroll (monthly) or self-assessment (annually).

However, taxation in medieval times was a markedly different affair, closely tied to feudalism and decentralised systems of governance. Monarchs, lords, and the Church acted as the primary tax authorities, each imposing levies of their own. Think of the infamous Sheriff of Nottingham from the Robin Hood legends, who bullied the Nottinghamshire folk with extortionate taxes, drawing the ire of Robin of Locksley.

While we don’t know whether this kind of tyrannical approach to tax collection was the norm in medieval times, the crown placed few regulations on how tax could be collected. If lords paid the crown up front for the right to extract dues from their territory, they were left to their own devices, meaning the system was ripe for corruption and abuse.

What we do know is that King John of England (r. 1199–1216) earned infamy for his heavy and arbitrary taxation. His financial demands led to widespread unrest, culminating in the Magna Carta in 1215, which sought to curb the crown’s taxing powers.

Also, in early medieval times, tax was not standardized; it was collected in an ad-hoc way to fund wars, build castles or cathedrals, or pay for royal weddings. There were several different types of tax collection under this model.

Feudal Dues

Under feudalism, peasants owed their lords various forms of payment, often in labour or goods rather than coin. This could include working on the lord’s land, providing a portion of their harvest, or paying a fee to marry off a daughter.

Tithes

The Church played a significant role in medieval taxation. Parishioners were required to pay a tithe, typically 10% of their annual produce or income, to the Church. By the 19th century, tithes were seen as outdated; today they exist only in the more voluntary form of charitable giving.

Scutage

As feudal obligations shifted, knights who owed military service to their lords were sometimes allowed to pay a tax called scutage in place of that service.

Tallage

A tax imposed on towns and royal lands, tallage was often arbitrary and could be levied whenever a king or lord needed funds. This made it deeply unpopular among the burgeoning class of townsfolk and merchants, and it may well have been the kind of taxation the Sheriff of Nottingham was using. It was so unjust that it was condemned by the barons in Magna Carta in 1215 and abolished in England in 1340. It continued in France for far longer and was only abolished after the French Revolution.

Customs Duties

Trade was an increasingly vital part of the medieval economy, and customs duties were levied on goods entering and leaving a lord’s or king’s domain based on tonnage and poundage. Originating in the 14th century, these taxes were an early precursor to modern VAT and import/export duties.

Taxation in medieval times was deeply rooted in feudal structures, marked by decentralization and little central oversight. While it funded important projects such as castle building, wars, and cathedrals, it often placed an unjust burden on the lower classes.

Does language shape the way we think and perceive the world? According to the Sapir-Whorf hypothesis, language is not just a tool for communication; the structure and vocabulary of a language can directly impact our cognition, from our experience of perceiving colours to our concept of time and space.

Sapir-Whorf Hypothesis Origins

Since its introduction by Edward Sapir and Benjamin Lee Whorf in the early 20th century, the Sapir-Whorf Hypothesis has evolved into two versions. The stronger but largely discredited version claims that language completely determines our perception of reality, asserting that if a language lacks the vocabulary or concepts to describe something, its speakers cannot perceive it. The weaker, more widely accepted version is studied in fields such as psychology and linguistics and proposes that language simply influences perception rather than defining it outright.

Linguistic Relativity

Different languages categorise colours in unique ways, shaping how individuals perceive them. For example, the Himba people of Namibia use four specific terms for shades that an English speaker might simply group under “green”, reflecting a more refined colour perception. Similarly, Russian differentiates between “light blue” and “dark blue”, assigning a unique term to each. These linguistic distinctions sharpen speakers’ ability to notice subtle colour differences that may be overlooked by speakers of other languages.

Many languages assign grammatical genders to nouns, which can shape how speakers perceive objects or concepts. For example, in Spanish, the word for “key” is feminine, and studies show that Spanish speakers are more likely to describe keys as small, shiny or pretty. In contrast, the German word for “key” is masculine, leading German speakers to use adjectives like hard, heavy or useful. These descriptions often align with the grammatical gender of the noun, reflecting how language influences perception.

Sapir-Whorf Hypothesis Implications

The Sapir-Whorf hypothesis helps explain language’s role in shaping cultural norms and has relevance in areas such as anthropology and diplomacy. It also plays a role in current AI development, where linguistic nuances and cognitive biases can be taken into account in language processing.

The Sapir-Whorf Hypothesis has, however, faced much criticism, especially of its stronger version. Many critics argue that the evidence presented for the hypothesis can instead be explained by biological, social, cultural and environmental factors. For example, it has been noted that individuals without access to spoken language, such as deaf individuals or young infants, can still demonstrate basic cognitive abilities that language speakers possess.

With Donald Trump returning to office as the 47th President of the United States today, attention naturally turns to the historic and strategic “special relationship” between the UK and the US. Popularised by Winston Churchill in 1944, this term highlights the exceptionally close political, security, cultural, historical, and economic ties shared by the two nations.

Yet, despite their many similarities, including a shared language and commitment to democracy, the political systems of the UK and the US differ in profound ways, as explored below.

Republic vs. Constitutional Monarchy And Parliamentary Democracy

One of the most significant differences lies in the form of government. The UK is a constitutional monarchy and parliamentary democracy, where the monarch serves as a ceremonial head of state. Real political power rests with Parliament, led by the Prime Minister, who acts as head of government.

In contrast, the US is a federal republic and presidential democracy. The President is both head of state and head of government, with power distributed between the federal and state governments.

Written vs. Unwritten Constitution

The two countries also diverge in their approach to constitutional law. The US operates under a written constitution, a single codified document ratified in 1788 and in force since 1789, which explicitly defines the structure, powers, and limitations of government.

The UK, by contrast, relies on an unwritten constitution, a collection of statutes, conventions, legal judgments, and historical documents, such as the Magna Carta. This flexible framework allows for greater adaptability but lacks the rigidity and clarity of a codified system.

Devolved vs. Federal Power

The UK, as a unitary state, centralizes power in Parliament, though some powers are devolved to Scotland, Wales, and Northern Ireland. The US, on the other hand, operates under a federal system, sharing authority between the national government and individual states.

This federal system reflects the sheer scale and diversity of the US, with a population of 340 million, five times that of the UK, and a landmass roughly 40 times larger. Centralized governance on this scale would be impractical, making federalism a necessity.

Judicial Power

Judicial authority is another key area of distinction. In the UK, the judiciary, including the Supreme Court, is independent but lacks the power to overturn parliamentary legislation, as Parliament is sovereign.

In the US, the judiciary wields significant influence through judicial review, allowing courts to invalidate laws or executive actions deemed unconstitutional.

Parliamentary vs. Presidential Elections

The electoral processes of the two nations also differ. In the UK, general elections are held at least every five years using a first-past-the-post system. The Prime Minister is not directly elected by the public but is the leader of the majority party in Parliament.

In the US, federal elections occur every two years, with presidential elections held every four years. Citizens vote directly for electors in the Electoral College, who then elect the President. Members of Congress are directly chosen by voters, ensuring a clear separation between legislative and executive powers.

Legislative Structures

Both nations have bicameral legislatures (that is, legislatures divided into two chambers), but their compositions and powers vary.

In the UK, Parliament consists of the House of Commons, whose members are elected, and the House of Lords, an unelected chamber comprising life peers, bishops, and hereditary peers who provide legislative oversight.

In the US, Congress includes the House of Representatives, with members elected based on population, and the Senate, where each state elects two senators regardless of size. Unlike the UK Parliament, Congress operates independently of the executive branch.

While the UK and the US share a commitment to democracy and strong bilateral relations, their political systems reflect distinct historical contexts and governance philosophies. The UK’s parliamentary democracy emphasizes centralized authority and party discipline, whereas the USA’s presidential democracy prioritizes a separation of powers and federalism.

Red Rising by Pierce Brown

It’s January. It’s cold, it’s miserable, Christmas is over and there’s nothing good to look forward to for ages. So, this month’s recommendation is very much along the theme of ‘Things could be a lot worse’.

Pierce Brown’s debut novel, Red Rising, is very much a tale of things going from bad to worse. Darrow is a Red, a member of the lowest caste in the colour-coded society of the future. Like his fellow Reds, he works all day, believing that he and his people are making the surface of Mars livable for future generations. Yet he spends his life willingly, knowing that his blood and sweat will one day result in a better world for his children.

However, Darrow and his kind have been betrayed. Soon he discovers a terrible secret that has been kept from him and his kind for generations. Darrow learns that he—and Reds like him—are nothing more than slaves to a decadent ruling class.

Inspired by a longing for justice, and driven by the memory of lost love, Darrow sacrifices everything to infiltrate the legendary Institute, a proving ground for the dominant Gold caste, where the next generation of humanity’s overlords struggle for power. He will be forced to compete for his life and the very future of civilisation against the best and most brutal of Society’s ruling class. There, he will stop at nothing to bring down his enemies… even if it means he has to become one of them to do so.

Red Rising is a YA novel in the loosest sense of the word, as it treats its readers as wise beyond their years and deals with some mature themes. Darrow is a teenager when the story starts out, and does a lot of growing up over the course of the novel.

The language is simple, and while the plot isn’t going to win any awards for originality, Brown does a terrific job of not only making you care for Darrow but also persistently raising the stakes, right up to a conclusion that is an absolute belter. His world-building is on point too, creating a Mars that feels suitably bleak and lived-in, and very much a believable potential future.

Literary Influences

The novel delves into politics, inequality, and classism, and takes inspiration from 1984, Brave New World, and even a pinch of Game of Thrones. Fans of The Hunger Games are going to find plenty to like in this book, as it has the feel of a much more mature version of that story. Fair warning though: things do get quite brutal at times, so if you’re not one for that kind of thing, maybe give this one a miss.

And the good news is that if you do enjoy Red Rising, it’s part of a long-running series to keep you going through these bitter and cold months of 2025.
