Human Biology IGCSE Archives - Oxford Open Learning

Homeostasis: What Is It?

Human Body Self-Regulation

Our body is highly complex, with many physiological processes taking place within its tissues and organs. Every day, it is subjected to changes in its internal and external environment. Homeostasis is the process by which our bodies maintain balance and stability against these stresses. The word is derived from the Greek ‘homoios’, meaning similar, and ‘stasis’, meaning standing still. The process of homeostasis protects the body, helping it to survive what could otherwise be life-threatening situations, by maintaining a balance in such things as temperature, glucose, water and pH levels.

Blood Sugar Regulation

Perhaps the most widely known example of homeostasis is the regulation of blood sugar levels by the pancreas. If blood sugar is not regulated properly, hyperglycaemia (high blood sugar) or hypoglycaemia (low blood sugar) can result, and chronic failure of this regulation underlies conditions such as diabetes. The pancreas releases two key hormones to control sugar levels: insulin increases the rate of glucose uptake by cells, while glucagon triggers the release of glucose from the body’s glycogen stores. These hormones act in opposition, keeping blood sugar within a narrow range during meals or periods of exercise.
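Conceptually, this is a negative feedback loop: whenever blood glucose drifts away from its set point, the dominant hormone pulls it back. The short Python sketch below shows only that feedback logic, as a minimal illustration rather than a physiological model; the set point, gain and starting value are invented numbers chosen purely for demonstration.

# A minimal negative-feedback sketch (illustrative only, not a medical model).
# All values are invented for demonstration purposes.

SET_POINT = 90.0   # hypothetical target blood glucose (think mg/dL)
GAIN = 0.2         # hypothetical strength of the hormonal response per step

def hormonal_correction(glucose: float) -> float:
    """Return one time step of correction.

    Above the set point, 'insulin' dominates and the correction is negative
    (cells take up glucose); below it, 'glucagon' dominates and the
    correction is positive (glucose is released from glycogen stores).
    """
    return -GAIN * (glucose - SET_POINT)

glucose = 150.0  # e.g. shortly after a sugary meal
for step in range(20):
    glucose += hormonal_correction(glucose)
    print(f"step {step:2d}: glucose = {glucose:6.2f}")

Running this prints values that decay smoothly back towards the set point, which is the defining behaviour of homeostatic control.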

Thermo-regulation

In order to properly function, the body needs to be kept at around 37 degrees Celsius – each of our bodies has a very slight variation in this temperature. A deviation from this temperature, even by a few degrees, is potentially very dangerous.

A region of our brain known as the hypothalamus monitors our body’s temperature and triggers responses such as sweating, shivering or restricting blood flow to the extremities to help maintain its core temperature. Sometimes, our bodies override our normal temperature in the event of a viral or bacterial infection, creating a fever to help stimulate our immune system and impede the invading pathogen.

Osmoregulation

Maintaining our fluid levels and electrolyte balance is essential for our health, and our body controls this through the regulation of water intake and excretion via our kidneys. The average adult needs around 2.5 litres of water a day to achieve this balance. When low water levels are detected, the hypothalamus synthesises a hormone known as antidiuretic hormone (ADH), released from the posterior pituitary gland, which signals the kidneys to reabsorb more water.

Acid-Base Regulation

The pH levels of different parts of the human body vary widely, from gastric acid at around pH 1 to pancreatic fluid at around pH 8.1. Human blood needs a pH of between 7.35 and 7.45 (slightly alkaline) to be within a healthy range. An appropriate blood pH allows proper cellular and enzyme function, and it is regulated by the bicarbonate–carbonic acid buffer system together with the lungs and kidneys. The lungs can adjust blood pH rapidly by changing the rate at which carbon dioxide is exhaled. The kidneys, on the other hand, have a slower impact on pH, excreting acids or synthesising bicarbonate.
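The contribution of the bicarbonate buffer is often summarised by the Henderson-Hasselbalch equation. As a rough worked example, using typical textbook arterial values (these particular numbers are illustrative rather than taken from the article: pKa of about 6.1, bicarbonate of about 24 mmol/L, and carbonic acid from dissolved carbon dioxide of about 1.2 mmol/L):

\mathrm{pH} = \mathrm{p}K_a + \log_{10}\frac{[\mathrm{HCO_3^-}]}{[\mathrm{H_2CO_3}]} \approx 6.1 + \log_{10}\frac{24}{1.2} = 6.1 + \log_{10}20 \approx 6.1 + 1.3 = 7.4

This lands squarely within the healthy 7.35-7.45 range, and it shows why both organs matter: exhaling carbon dioxide lowers the carbonic acid term quickly, while the kidneys adjust the bicarbonate term more slowly.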

We can see that human body processes are complex, and that regulation is vital to ensure proper functioning and health. This is achieved by the body coordinating all of its systems to work in harmony, with the hypothalamus playing a key role. Homeostasis allows us to regulate ourselves in the often harsh conditions of the natural world, helping us to cope with extreme temperature variations or periods of famine. It has also been described as a driving force in the evolution of organisms.

What Does STEM Stand For?

To mark British Science Week, from the 8th to the 17th of March, let’s shine a light on some of the greatest contemporary British minds in Science, Technology, Engineering and Maths (or STEM, for short).

Sue Black

Sue Black is a Professor of Computer Science at Durham University. An outspoken and active social media campaigner, Sue led a campaign to save Bletchley Park and is one of the most influential women in tech. An advocate for equality, diversity, and inclusion, particularly for women in computing, she founded BCSWomen, an online network for women in tech, and #techmums, a social enterprise which empowers mothers and their families through technology. In the 2016 New Year Honours, Sue received an OBE for services to technology.

Timothy Berners-Lee

Timothy Berners-Lee is a computer scientist and software engineer who is most famous for inventing the Hypertext Transfer Protocol, or HTTP, and the World Wide Web. He also created the first web browser, the HTML language and the URL system, and in 1999 was named one of the 100 Most Important People of the 20th Century by Time Magazine. In 2004, Timothy was knighted by Queen Elizabeth II for his pioneering work, and he now works as a Professor of Computer Science at the University of Oxford. He is also a professor emeritus at the renowned Massachusetts Institute of Technology (often referred to as MIT).

Maggie Aderin-Pocock

Maggie Aderin-Pocock is a space scientist, educator, and communicator. Throughout her career, she has worked on some of the most prestigious projects at some of the UK’s top universities, and she is currently an honorary research associate within the Department of Physics and Astronomy at University College London and Chancellor of the University of Leicester. She is also a presenter of the TV show The Sky at Night and does much outreach work to engage young people in science. Her academic work now focuses on building instruments and equipment to aid the fight against climate change. Maggie received an MBE for services to science education in 2009, an honour that was upgraded to OBE in this year’s New Year Honours.

Donald Palmer

Donald Palmer is an Associate Professor of Immunology at the Royal Veterinary College where his current research interests focus on the ageing of the immune system. After completing his PhD at King’s College London, he took post-doctoral fellowship positions at Cancer Research UK and Imperial College where he carried out research on lymphocyte development. Donald is also a co-founder of the Reach Society – an initiative to inspire, encourage and motivate young people, particularly young Black men and boys, to achieve their full potential.

Roma Agrawal

Roma Agrawal is a structural engineer who is best known for her work on The Shard in London. Born in Mumbai, she completed her undergraduate degree in physics at the University of Oxford and gained an MSc in structural engineering from Imperial College London. She has received several awards for her work, including the Institution of Structural Engineers’ ‘Structural Engineer of the Year’ award in 2011 and, more recently, the Royal Academy of Engineering’s ‘Rooke Award for Public Promotion of Engineering’. She is an active public speaker and advocate for diversity and inclusion within STEM.

Saiful Islam

Saiful Islam is Professor of Materials Modelling at the University of Oxford. He gained a chemistry degree and PhD from University College London, and his research interests focus on gaining a deeper understanding of the processes within energy materials, particularly batteries. As well as numerous academic awards and honours, Saiful holds a Guinness World Record for the highest-voltage lemon battery (a simple, low-powered battery usually built for educational purposes).

To learn about more successful British scientists, visit the Inspiring Scientists website.

If you are interested in studying a Science or Maths subject, Oxford Open Learning offers the opportunity to do so at a variety of levels, listed below. You can also find advice via our Contact Us page here.

Maths A level

Biology A level

Maths GCSE

Biology IGCSE

Chemistry IGCSE

Human Biology IGCSE

Maths IGCSE

Physics IGCSE

Science (Double Award) IGCSE

Science (Single Award) IGCSE

Fast Track Biology IGCSE

Fast Track Chemistry IGCSE

Fast Track Human Biology IGCSE

Fast Track GCSE / IGCSE Maths

Fast Track Physics IGCSE

From Penicillin To Antibiotic Resistance

Since the discovery of penicillin by Alexander Fleming in 1928, antibiotics have revolutionised the field of medicine, saving countless lives and providing effective treatments for bacterial infections. However, the rise of antibiotic resistance has become a pressing global concern, posing a significant challenge in the battle against microbes.

Penicillin, the first antibiotic, was a breakthrough in the fight against bacterial infections. It was effective against a wide range of pathogens and played a pivotal role in reducing mortality rates from infectious diseases. The discovery of penicillin paved the way for the development of numerous other antibiotics, each targeting different types of bacteria and providing a diverse arsenal against infections.

From Not Enough To Too Much

For several decades, antibiotics were hailed as medical miracles, and their availability led to a sense of complacency. However, the misuse and overuse of antibiotics have contributed to the emergence of antibiotic-resistant bacteria. When antibiotics are used improperly or unnecessarily, bacteria can develop mechanisms to survive and grow despite the presence of these drugs. This has led to the rise of superbugs, such as methicillin-resistant Staphylococcus aureus (MRSA) and carbapenem-resistant Enterobacteriaceae (CRE), which are difficult to treat and pose a significant threat to public health.

The battle against antibiotic resistance involves a multi-pronged approach. Firstly, there is a need for responsible use of antibiotics. Healthcare professionals must prescribe antibiotics judiciously, ensuring that they are used only when necessary and that the appropriate dosage and duration are followed. Patients, too, play a crucial role by adhering to prescribed antibiotic regimens and not pressuring their doctors for unnecessary prescriptions.

Barriers To Development

In addition to responsible use, efforts are underway to develop new antibiotics and alternative treatments. However, the pipeline for new antibiotics has been dry in recent years, largely due to economic factors and the challenges associated with developing effective drugs. This highlights the need for increased investment in research and development of new antimicrobial agents.

Another important aspect of the battle against microbes is infection prevention and control. By implementing stringent hygiene practices in healthcare settings, such as hand hygiene, proper sterilisation, and effective waste management, the spread of antibiotic-resistant bacteria can be minimised. Public awareness campaigns play a crucial role in educating individuals about the importance of hygiene and responsible antibiotic use.

Gathering Further Data

Furthermore, surveillance and monitoring of antibiotic resistance patterns are essential for understanding the scope and impact of the problem. This information enables healthcare providers and policymakers to make informed decisions regarding treatment protocols and infection control strategies. Collaboration between healthcare professionals, researchers, policymakers, and the public is vital in combating antibiotic resistance.

The battle against microbes and antibiotic resistance is an ongoing and complex challenge. It requires a multifaceted approach that addresses responsible antibiotic use, research and development of new treatments, infection prevention and control, and surveillance. By taking collective action, we can preserve the effectiveness of antibiotics and ensure that future generations have access to effective treatments for bacterial infections. The fight against microbes is a reminder, as recent times have taught us, of the ever-evolving nature of infectious diseases, and of the need for continuous innovation and vigilance in the field of medicine.

Perfectionism is not, in and of itself, a negative trait. Perfectionists are often conscientious high achievers; our greatest weakness is also our greatest strength. But those trying to be constantly perfect can find that every task feels like an unconquerable burden and every essay a path to failure, however unlikely our friends and family might find our doom-laden predictions. Here are three thoughts to use to beat the unrealistic idealism that may currently be beating you.

1. “I am aiming for my own version of perfect.”

What is perfect, anyway? Maybe you could decide. Perhaps perfection could simply mean sitting down at your messy desk, ignoring the clothes on the floor, and spending 10 minutes planning the first half of your essay. In this deeply imperfect and challenging world, if you were to be reasonable with yourself, your definition of perfect should, and could, be different. Redefine perfection: make it doable and make it your own.

2. “I don’t HAVE to do it; I GET to do it.”

A to-do list is a depressing sight, if, at every item, we are telling ourselves that we ‘have to’ or ‘must’ do this or that. But turn ‘have to’ into ‘get to’ and suddenly life seems more joyful. Perhaps it is an irritating piece of advice, an unwelcome call to simply have more gratitude, but studying is essentially an overwhelmingly positive thing. You are learning and growing, and you have access to great materials and educated teachers; you are lucky. And so, even if it feels at first like you are lying to yourself, tell yourself, next time you inspect your to-do list: “I get to plan my essay today”.

3. “A perfect dissertation is a finished dissertation.”

We will do it, but we are waiting for the perfect time when we are in the mood. Because we know we can do it well, and not just well but REALLY well. And so that is the aim. This isn’t laziness, for the fear is real: we cannot bear to submit anything less than our best; we cannot tolerate failure; and we want to be proud of what we have achieved. We have visualised (or we think we have) the perfect essay or assignment. But the truth is that you have a deadline. Perhaps you could achieve perfection if you had eternity to complete it. But you don’t. Most tasks have a timeline, whether it is 6 years to complete a part-time PhD, or one night to finish an essay. And the test is not what you can achieve, but what you can achieve in the time you have to complete it. The definition of perfect might simply be this: finished.

Technology has been advancing rapidly in recent years, and it has had a profound impact on modern medicine. From electronic medical records to robotic surgery, it is changing the way we approach healthcare and improving patient outcomes. In this article, we will explore how technology is changing modern medicine.

Data

One of the most significant changes in modern medicine has been the widespread adoption of electronic medical records (EMRs). EMRs allow healthcare providers to access patient records from anywhere, reducing the risk of errors and improving the quality of care. EMRs also make it easier for patients to access their medical records and track their own health data, empowering them to take an active role in their healthcare.

Diagnosis

Technology is also changing the way we diagnose and treat diseases. Medical imaging technologies, such as MRI and CT scanners, provide detailed images of the body that can help doctors detect and diagnose conditions more accurately. Robotic surgery is also becoming more common, allowing doctors to perform procedures with greater precision and using less invasive techniques.

Remote Care

Additionally, telemedicine, which uses technology to provide medical care remotely, is becoming more popular, particularly in rural areas where access to healthcare can be limited.

Developing Treatments

Another area where technology is changing modern medicine is in the development of new drugs and treatments. Artificial intelligence (AI) is being used to analyse large amounts of medical data and identify new patterns and potential treatments. This has led to the development of personalised medicine, where treatments are tailored to a patient’s specific genetic makeup and health history.

Public Health

Technology is also changing the way we approach public health. Social media and mobile apps are being used to track disease outbreaks and monitor the spread of infectious diseases. Wearable technology, such as fitness trackers and smartwatches, can monitor a person’s health data and alert them to potential health issues.

The Challenges New Technology Poses

However, there are also challenges that come with these technological advancements. One concern is the potential for data breaches and cybersecurity threats, particularly with the widespread adoption of EMRs. There is also the risk that patients may become too reliant on technology, leading to a decrease in face-to-face interactions with healthcare providers.

Technology is changing modern medicine in profound ways. From electronic medical records to AI and robotic surgery, these advancements are improving patient outcomes and providing new opportunities for diagnosis and treatment. However, it is important to consider the potential challenges that come with these changes and ensure that we are using technology responsibly and in a way that benefits patients and the healthcare system as a whole.

When we think of viruses, usually the first things that come to mind are diseases or outbreaks such as COVID-19. However, these invisible microbes play an important role in all aspects of life, from influencing global biogeochemical cycles to having the potential to cure cancer, correct genetic defects and act as insecticides within the agricultural industry.

Natural Ecosystems

Viruses are essential in order to maintain an ecological balance within our ecosystems. They help control natural populations so that exponential growth is limited, creating a natural stability which leads to greater biodiversity. Phages (viruses that target bacteria), for example, are major regulators of harmful bacteria populations.

They also play a role in the cycling of nutrients in ecosystems, lysing (breaking down) cells and releasing organic matter in a process known as environmental cycling. This increases the productivity and overall function of vital ecosystems. Ocean microbes, for example, which produce more than half of the world’s oxygen, rely on these nutrients to enable high rates of photosynthesis.

Evolution

Viruses also play a major role in the advancement of evolution. They have the ability to transfer genetic material between organisms, which can lead to genetic diversity and variation. In a process known as horizontal gene transfer, viruses can integrate their own genetic material into a host’s genome, or carry genetic information from one host to another. Viral infections can also create diversity through mutations within the host organism caused by mistakes during viral replication. These factors have played a significant role in the evolution of many species.

Medical Research

Vital advancements in medical research and human health in the areas of genetics and disease have been made possible through viruses, with research leading to the development of vaccines, antiviral drugs and diagnostic tools. A process known as gene therapy uses ‘viral vectors’, viruses which have been modified to transport genetic material into cells for the treatment of genetic disorders or for disease therapy. Many discoveries in the field of virology have also provided insights into broader molecular biological processes, including gene expression, cell signalling and immune response.

Viral Biocontrol

Viruses have been used as biocontrol agents to target specific species that pose a threat to agriculture, forestry or ecosystems without interfering with beneficial species. By suppressing chosen target populations, viruses can act as environmentally friendly alternatives to pesticides, in a process aligned with the principles of sustainable agricultural practice. Resistance to pesticides can develop over time, but lasting resistance to a virus is less likely to arise, because viruses are able to evolve and adapt alongside their targets.

Viruses have been shown to be important drivers of evolution and play important roles within ecosystems, agriculture and medical advancement. If they were to suddenly disappear, complex and potentially unforeseen ecological consequences would likely follow; some experts believe that life would cease to exist without them. While viruses remain the cause of many serious diseases, understanding their roles and interactions is crucial to managing their negative impacts while taking advantage of their potential positive contributions.

You can learn more about cells, microbiology and pathogens in Oxford Open Learning’s flexible Biology IGCSE or A-level accredited distance learning courses. Get in touch with us today to find out more.

The International Day Of Forests Falls On The 21st Of March

Think forests and you may think of lush greenery, long walks and dense canopies of trees with wildlife species aplenty. And you’d be right; forests are great green spaces which provide vital habitats to support life on Earth. The 21st of March is the International Day of Forests and will celebrate our beautiful forests across the world. As such, here are some interesting facts about our world’s diverse woodland areas.

Coverage

Forests are home to animals and plants and cover 31 percent of the world’s total land area. According to the UN Environment Programme, they are home to 60,000 different tree species. They help to reduce the amount of greenhouse gases in the atmosphere by absorbing carbon dioxide. According to Forest Research, the area of woodland in the UK as at 31 March 2022 is estimated to be 3.24 million hectares, representing 13% of the total land area of the UK. But how does this compare to our neighbours in Europe? The UK is the second least wooded country in Europe after Ireland, whereas Europe’s average tree cover is 44%.

The largest forest in the UK is believed to be Galloway Forest Park in Dumfries & Galloway, Scotland, at 297 square miles. Home to over a million trees, it is a designated Dark Sky Park and shelters golden eagles, hen harriers, otters, red deer, red squirrels, wild goats, roe deer and fallow deer, amongst other species.

Sweden, at the other end of the spectrum, is a true land of forests; it has the largest forest cover in Europe at about 28 million hectares, representing almost 70 percent of the country’s total surface area. Globally, forests are believed to cover around 4 billion hectares, or 30 percent of Earth’s land surface.

Around 1.6 billion people around the world depend on forests for their livelihoods and daily subsistence needs. Quantifying how many people are employed in the forestry sector is not an easy exercise. However, it is believed, according to Forest Machine Magazine, that 33 million people are employed in forestry throughout the world.

The largest forested area is located in the Russian Federation, constituting a huge 81% of Europe’s forests. The Taiga, also known as the Russian Boreal Forest, at approximately 12 million km², represents the largest forested region on Earth. The Taiga comprises birch, pine, spruce and fir trees, along with some deciduous species.

Trees Support Us, We Need To Support Them

A tree can sequester up to 150 kilograms of carbon dioxide per year. We know forests are home to more than three-quarters of the world’s life on land. Approximately 750 million people, including 60 million Indigenous people, live in forests, too. Everything that lives in a forest makes up its ecosystem, and the biodiversity found in these ecosystems is truly amazing: more than 80% of the plants, animals and insects living on land can be found in the world’s forests.

However, as we know, the world’s forests are in need of our help. The world loses almost six million hectares of forest each year to deforestation. That’s similar to an area the size of Portugal every two years. The main cause of forest degradation is illegal and unsustainable logging.

To find out what you can do to help protect the world’s forests, visit the International Day of Forests’ website here.

Neurodiversity, or ND, refers to variations in the human brain regarding attention, learning, mood, sociability and other mental functions in a non-pathological sense. The term was coined by Australian sociologist Judy Singer, who first used it in her Sociology honours thesis (1996-1998) and who, along with journalist Harvey Blume, helped popularise the concept. Her work on autism and neurodiversity became widely known as a result of her chapter “Why Can’t You be Normal for Once in Your Life?”, based on her thesis and published in the UK in 1999.

Concept

Neurodiversity is the concept that all human beings have variances in terms of our neurocognitive abilities. This relates to the ability to think and reason, and includes the ability to remember things, to concentrate, process information, learn, speak, and understand things. The term acknowledges that we each have talents as well as things we struggle with. For some people, this variation between strengths and weaknesses is more obvious, which can bring positives as well as negatives. The term Neurodiversity also describes the idea that people experience and interact with the world around them in many different ways. There is no one “right” way of thinking, learning, and behaving, and differences are not viewed as deficits.

Numbers For Neurodiversity

It is believed that around 1 in 7 people has a neurodivergent condition. Neurodivergent conditions include Dyslexia (approximately 10% of the global adult population), Dyspraxia (approx. 5%), ADHD (approx. 4%), Autism (approx. 1-2%), Asperger’s (approx. 0.5%) and Tourette Syndrome (1 to 10 in 1,000 children). Neurodivergence can also arise from acquired brain injury, mental health conditions and health conditions such as chronic fatigue.

Research has found that people who are neurodivergent, and specifically those with ADHD, show better performance on a divergent thinking task (a measure of creative potential) and have more creative achievements compared to the general population (White and Shah, 2011). Additionally, a study published in The Journal of Autism and Developmental Disorders found a strong link between autism and creativity.

A growing recognition of neurodiversity has created a greater level of awareness. Progress has been made in organisations and educational institutions recognising and supporting neurodiversity, and this is helping to create a more diverse and inclusive workforce and education system. As always though, there is still room for further understanding, awareness, and embracing of neurodiversity in our society.

Whatever subject you are studying or qualification you are studying for, contact with your teacher or tutor – even when remote – is an invaluable part of that process. They are usually the subject experts, have a full understanding of the assessment process and have, more often than not, supported many other students who felt exactly the same as you do now about their learning. Whether you are confident in your subject knowledge and looking for ways to stretch yourself in order to achieve the very best results, or are still a little uncertain and unsure how you might secure the grade you need, your tutors can provide you with the support you require. Here are a few simple strategies every student should try in order to get the most from that contact.

Get Organised

Put simply, meet their expectations! If they provide a task, complete it. If they set a deadline, meet it. If you have a meeting, be there. Programmes of study and assessment schedules are in place to meet the needs of everyone, ensuring that there is adequate time for covering all of the content, assessing progress and providing feedback. A tutor works with many students, and if you don’t adhere to the plan then you are unlikely to get the time you deserve. If there is a problem with the schedule set out for you, talk to your tutor in advance so that they can make whatever amendments are possible while still meeting everyone’s needs. If a tutor sees you are committed to your learning and doing what is required, they are likely to go above and beyond in the ways in which they support you.

Respect Their Knowledge (but don’t be afraid to ask!)

As already mentioned, the tutor is the subject expert. They have the knowledge of the subject but also the ways it is assessed and how to ensure you can demonstrate it when required to do so. Listen to their advice. Take notes where required. Follow their suggestions. However, if there is something you are unsure about, don’t be afraid to ask! Questioning is key to developing a deeper understanding and mastery of a subject but is also a great tool in ensuring there have been no miscommunications or misunderstandings. Your tutor will respect your ability to really engage with the content you are covering together and look for ways to address your questions in more detail.

Know The Value Of Tutor Feedback

Receiving feedback is one of the most important parts of the learning journey. However, many of us find taking feedback really, really hard! Instead of thinking about what is said by your tutor as being ‘good’ or ‘bad’, try to consider what you can learn from it instead. If you are given praise for a certain aspect of your work, think about what you did that made this so effective. If there are comments relating to something that hasn’t worked out so well, then think about what you might do differently next time. Reflection is key to making progress. Also, apply the same thought process when it comes to your attitude to learning. If a tutor comments on this, avoid taking it personally and think of how you might use what they have said to become a more effective learner.

Plan Your Agenda

Don’t forget that any contact that you have with your tutor is designed to benefit YOU. If you are in need of something specific from that contact then, again, do not be afraid to ask! In reality, this involves planning and preparing for any contact you have before you have it. Make a note of any questions you have when studying independently. If you need to revisit any material with them, ask in advance. If you have found a subject area particularly easy or hard, let them know. Remember, your tutor will be looking to support you in a way that is personalised to meet your needs too, so the more effectively you’re able to communicate these, the better they will be able to do this.

Pandemics

On the 11th of March 2020, the WHO (World Health Organisation) declared the coronavirus, or Covid-19, a global pandemic. The WHO defines a pandemic as:

‘an epidemic occurring worldwide, or over a very wide area, crossing international boundaries and usually affecting a large number of people.’

Fifteen months on, coronavirus is still with us and has changed most of our lives beyond recognition. Face masks are now mandatory in shops, banks and other indoor settings. Words such as ‘bubble,’ ‘social distancing,’ ‘lockdown,’ and ‘isolating’ have infiltrated our everyday vocabulary. Yet, we may, ever so cautiously, be beginning to feel that the darkest days are behind us. Tuesday the 1st of June saw zero Covid deaths recorded across England and Northern Ireland, with infection and hospitalisation rates having fallen rapidly since the winter. The roll-out of several successful vaccine programmes is thought to have contributed to the drop in cases and transmissions, with over 40 million adults having received their first dose and over half of us having received both doses.

Progress and Postponement

Monday the 17th of May marked another key stage in our roadmap out of lockdown. In a press conference the previous Friday, Prime Minister Boris Johnson confirmed that step three of England’s roadmap out of lockdown would go ahead as planned. Stage three has seen the hospitality sector welcome diners indoors, hotels reopen, and people even able to hug close family and friends once more. These changes feel like a step closer to normality and to the return of everyday activities that many have missed over the past year. However, SAGE, who have provided the British government with scientific advice throughout the pandemic, urged caution after the more transmissible Delta variant was found in the UK. Subsequently, the PM warned this could compromise the fourth step, scheduled for June 21st, which would see a complete end to social distancing, and unfortunately, this has come to pass.

These continual knock-backs and delays make it difficult to keep up our morale. Still, during moments of despair, it’s good to remind ourselves that most pandemics do come to an end or become endemic, meaning that we can contain the level of infection. Although the coronavirus pandemic is undoubtedly unique, perhaps by looking at the pandemics of the past we can gain a greater insight into, and understanding of, how pandemics are ultimately brought under control.

Past Pandemics

The Plague (1347-1945)

A bacterium called Yersinia pestis causes the plague. Although there have been several plague outbreaks over the years, the most devastating of all was the Black Death, a bubonic plague pandemic that brought tragedy across Europe and Asia. It was referred to as ‘the Black Death’ as symptoms included black buboes, or swellings, in the groin and armpits. It is thought that the disease was transmitted from rodents to humans, though some modern epidemiologists dispute this theory. The horror caused by the Black Death is unparalleled: an estimated 200 million people lost their lives. Many measures that we have used during the coronavirus pandemic, such as improved sanitation and quarantine, are thought to have reduced transmission. One myth that persists today is that the Great Fire of London (1666) wiped out the plague. This theory has been continually disproven by multiple sources but remains a popular story. Although science has not eradicated the plague (there was an outbreak in Inner Mongolia just last year), only a fraction of those who would once have died of it die today. Increased access to antibiotics and strict isolation of those infected have subdued this particular pandemic.

Smallpox (1520-1980)

Smallpox first began to take hold in Mexico in 1520, after being brought in by Spanish settlers. Caused by the variola virus, smallpox was responsible for the deaths of 300 million people in the 20th century alone, and it remains the only human pandemic disease to have been completely eradicated. Thanks to a staggering vaccination campaign, initially conceived by English physician Edward Jenner, the 33rd World Health Assembly declared smallpox eradicated in 1980. The elimination of smallpox remains one of the greatest scientific achievements in human history.

Influenza (1918-1920)

Known as the ‘Spanish Flu’ (despite historians agreeing that it did not originate in Spain), the 1918 flu was the most devastating in modern history, infecting up to 40% of individuals globally and killing up to 50 million. Spanish Flu was caused by an H1N1 virus with genes of avian origin; a later strain of H1N1 was also responsible for the Swine Flu pandemic of 2009 and 2010. Just as the world was coming to the end of World War 1, and the devastation it wrought on millions, an even deadlier killer was on the loose. In the words of Charles River Editors:

“However, as bad as things were, the worst was yet to come, for germs would kill more people than bullets. By the time that last fever broke and the last quarantine sign came down, the world had lost 3-5% of its population.”― Charles River Editors, The 1918 Spanish Flu Pandemic: The History and Legacy of the World’s Deadliest Influenza Outbreak

Although we are yet to witness an influenza pandemic as horrifying as the Spanish Flu, the Hong Kong flu of 1968 killed an estimated 1 million people, and Swine Flu infected an estimated 21% of the world’s population in 2009. Influenza may have become less serious for the general population, but an estimated 650,000 people still die each year of seasonal flu. This means that, barring major advances in science, the virus that causes influenza is unlikely ever to be eradicated.

The HIV/Aids Pandemic (1981-1997)

There remains some debate amongst scientific scholars as to whether HIV/Aids was a global pandemic, with the WHO currently classifying it as a ‘global epidemic.’ In 1981, a large number of formerly healthy homosexual men began to fall seriously ill. The virus responsible was the Human Immunodeficiency Virus (HIV). Aids has claimed an estimated 32 million lives globally, with the WHO reporting that 690,000 individuals died from Aids-related causes in 2019.

Although the virus was initially thought to affect gay men and drug users primarily, in 1983 the WHO also acknowledged that it could be transferred between heterosexual couples, via blood transfusions, and be passed on from infected mothers to their babies. Although the picture seems bleak, the world of science has made some incredible leaps since the 1980s, with treatment options and life-expectancy for sufferers now being better than ever before. The combination of Public Health Campaigns, increased education on how HIV/Aids is transmitted, and powerful antiretroviral treatment, means that many people who develop HIV go on to live long and healthy lives.

What Now?

This raises the question: what, if anything, can past pandemics teach us about how the current Covid-19 pandemic might come to an end? As we’ve seen, all pandemics are unique and, except for Smallpox, none have been wiped out. Looking to the past cannot give us definite answers about the future of this pandemic, but it can certainly give us clues. It is likely that the continued roll-out of a global vaccination programme, greater access to antiviral drugs and enhanced hygiene will all play a significant role in ending this pandemic.

Although the first case of coronavirus was reported to the WHO in December 2019, the sudden onslaught of infections and deaths seemed to appear almost out of nowhere. Our return to normality will not happen overnight. But, as with previous pandemics, if we are patient, these cautious baby-steps out of lockdown will ultimately add up to a return to the freedom and spontaneity that we have all greatly missed.
