
Contagious Vaccines

Harsha Pendyala

The history of vaccines


Vaccines have been humanity's most effective weapon for suppressing the spread of viruses for over two centuries, serving as our saving grace in the midst of many devastating pandemics. The polio vaccine, for example, introduced in 1955, has reduced the incidence of polio by over 99%, to just six cases worldwide in 2021 (Louten, 2016).

The beginnings of vaccines date back to the smallpox epidemics, when variolation (the intentional inoculation of an individual with virulent material) was the method of choice for battling smallpox. Doctors noticed that ground-up smallpox scabs, or fluid from smallpox pustules, infected the patient with a much milder form of the disease and provided immunity when they were exposed to the virus at a later date. This was a very effective method, but it had two major drawbacks: the inoculated virus could still be transmitted to others after variolation had taken place, and the method was not flawless, with 2-3% of cases proving fatal (History of Smallpox, 2022). A viable alternative would soon be discovered. Edward Jenner, a physician conducting research in a small village, noticed that milkmaids who had been infected with cowpox (a virus similar to smallpox but much milder) did not contract smallpox. Connecting this observation to his knowledge of variolation, he theorised that inoculating people with cowpox could give them immunity to smallpox. When tested on the young James Phipps, his method worked and protected the boy from smallpox (Louten, 2016). This, in 1796, was the first vaccine, and it quickly caught on in the medical community as an ideal alternative to variolation without its disadvantages.

Fast-forward to the present day and vaccines have not changed much since that first smallpox vaccine, apart from incremental changes made to increase the breadth of infectious diseases they can protect against, and despite how revolutionary they were, they retain some fundamental disadvantages. As illustrated by the COVID-19 vaccination programmes, suppressing a pandemic this way is very expensive: the UK's vaccination programme spent more than £8bn over the 2019-2022 period (Global Justice Now, 2022), largely because the vaccines are made and sold by private companies, in some cases with large profit margins, although manufacturing and planning costs were also a substantial part of this. With each vaccine costing just shy of £5 to produce and being sold for considerably more (Global Justice Now, 2022), it is clear our current vaccine technology has major shortcomings.

What are contagious vaccines and how were they developed?

The possibility of a contagious vaccine promises a better solution, with a faster rollout and lower costs than the traditional alternative. The first idea for a contagious vaccine came in 1999 from the veterinarian José Manuel Sánchez-Vizcaino, who was faced with the insurmountable task of trapping and vaccinating an entire population of wild rabbits, animals notorious for breeding quickly (Craig, 2022). He was simply unable to keep up with the rate of population increase, while the virus, on the other hand, ravaged the population with relative ease; the rabbits were dying faster than they could be vaccinated, so he needed an approach that could keep pace with the infections. Hence the first contagious vaccine was born: a hybrid virus vaccine combining rabbit haemorrhagic disease and myxomatosis. His team sliced a gene out of the rabbit haemorrhagic disease virus and inserted it into the genome of a mild strain of the myxoma virus, which causes myxomatosis. The vaccine would then incite an immune response against both viruses, which the rabbits' immune systems could easily overpower because the myxoma virus was present only in a very mild form. Although the virus had been modified to weaken it and to carry the rabbit haemorrhagic disease gene, Sánchez-Vizcaino hypothesised that, because the vaccine was still similar enough to the original disease-causing myxoma virus, it would still spread among wild rabbits (Maclachlan et al., 2017).

A proof-of-concept field test was then carried out on a sample of 147 wild rabbits, with about half being inoculated with the viral vaccine. The test showed positive results: the percentage of chipped wild rabbits carrying antibodies against rabbit haemorrhagic disease and myxomatosis increased from 50% to 56% over a 32-day period (Maclachlan et al., 2017). While this may not initially seem like a significant difference, the test was done on an island that neither the rabbit haemorrhagic disease virus nor the myxomatosis virus had yet reached, so the only way rabbits could have gained the antibodies was through the viral vaccine. Moreover, the rabbits chipped in the test were only a small fraction of those inhabiting the island, so the number of rabbits the vaccine had spread to over the month was likely much larger than nine. This initial test showed great promise for the technology; however, the EMA (European Medicines Agency) noted technical issues with the vaccine's safety evaluation and requested that the team decode the myxoma genome, which had never been done before. As a result, the concept was dropped and the team's funding was cut, amid concerns that the EMA would never approve such a technology (Craig, 2022). After this, research into self-spreading vaccines went largely dormant: pharmaceutical companies were not interested in investing in research and development for a technology that, by design, would reduce its own profit margins and was unlikely to be approved for use.
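For scale, the figure of roughly nine newly protected rabbits follows directly from those percentages (a simple back-of-the-envelope calculation, assuming the 147 chipped rabbits are the whole measured group):

\[ (0.56 - 0.50) \times 147 \approx 9 \ \text{rabbits} \]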

The revival of contagious vaccine research

However, in recent years there has been renewed interest and funding in this line of research, inspired by the devastation caused by zoonotic disease epidemics such as Lassa fever in West Africa. A new class of viral vectors, cytomegaloviruses (CMVs), is being used to create viral vaccines, with significant advantages over the previously used myxoma vectors (Varrelman et al., 2022). These spark hope that a contagious vaccine could one day be approved for use in reservoir populations, to combat the tight grasp zoonotic disease has on many countries. CMVs are better suited because they infect a host for life, inducing strong immune responses without causing severe disease, and they are uniquely species-specific: for example, the CMV that spreads among Mastomys natalensis, the rat species that spreads Lassa fever, cannot infect any animal other than M. natalensis (Craig, 2022). This alleviates the ethical concern of a viral vaccine mutating and jumping into humans, where, unlike with wild animal populations, informed consent is essential before vaccinating. Field tests using CMVs have been carried out with promising results, and after extensive mathematical modelling, a prediction was made of how long such a vaccine would take to reduce pathogen incidence by 95% in reservoir populations. If the technology works as expected, releasing the Lassa fever vaccine could reduce disease transmission among rodents by 95% in less than a year, significantly reduce the predicted annual death toll of 5,000 people, possibly even to zero, and perhaps eradicate the disease given a long enough time period (Varrelman et al., 2022).
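The Varrelman et al. predictions come from detailed epidemiological models of rodent populations; the sketch below is not their model, only a minimal toy illustration, with invented parameters and a hypothetical simulate helper, of the underlying idea that a transmissible vaccine and a pathogen compete for the same susceptible hosts.

```python
# Toy compartmental model of a transmissible vaccine (illustrative only).
# S = susceptible, V = vaccinated (caught the spreading vaccine),
# I = infected with the pathogen, R = recovered/removed.
# Both the vaccine and the pathogen spread by contact; vaccinated animals
# can no longer be infected, so as V grows, pathogen transmission falls.

def simulate(days=365, beta_pathogen=0.20, beta_vaccine=0.15, recovery=0.10):
    S, V, I, R = 0.98, 0.01, 0.01, 0.0   # fractions of the rodent population
    for _ in range(days):
        new_vacc = beta_vaccine * S * V   # susceptibles contracting the vaccine virus
        new_inf = beta_pathogen * S * I   # susceptibles contracting the pathogen
        recovered = recovery * I
        S -= new_vacc + new_inf
        V += new_vacc
        I += new_inf - recovered
        R += recovered
    return S, V, I, R

if __name__ == "__main__":
    S, V, I, R = simulate()
    print(f"After one year: {S:.1%} susceptible, {V:.1%} vaccinated, "
          f"{I:.1%} currently infected, {R:.1%} recovered/removed")
```

How quickly the vaccinated fraction overtakes the infected fraction depends entirely on the contact and transmission rates chosen, which is exactly why the published work relies on careful parameter estimation rather than a sketch like this.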

How realistic is this technology?

The technology clearly shows huge potential, but these predictions are ultimately only predictions, based on the assumption that the technology works exactly as modelled. Many believe technology should always be developed and tested from a pessimist's point of view, ensuring that every potential problem has been addressed so that a release does not end in catastrophic failure. This mindset has been applied in many fields, such as AI, but it is especially important in immunology, which deals with ecosystems we do not yet understand well. Many experts warn that too little is known about zoonotic disease transmission and viral evolution to accurately predict what might happen if a self-spreading vaccine were released into the wild, or what the consequences could be for an ecosystem.

Historical instances show that humans using and modifying viruses for their own purposes has had devastating effects on ecosystems. For instance, a man in France intentionally released the myxoma virus in 1952 to keep rabbits out of his garden, and in the process decimated France's rabbit population, wiping out 90% of its rabbits within just two years (Maclachlan et al., 2017). Furthermore, almost 70 years later the same myxoma virus started killing wild hares, having recombined with another poxvirus in a way that allowed it to jump species (Maclachlan et al., 2017). Many experts cite this evidence and warn that we cannot accurately predict, even with mathematical models, what problems might arise from the release of a viral vaccine. When natural ecosystems and animal populations are involved, many say the stakes are simply too high to use such a vaccine at all.

Furthermore, the ethical and social issues that surround a programme like this are daunting. Most experts in the field accept that a viral vaccine could never be used on human populations, because universal informed consent could never be achieved. Another problem, one that accompanies all dangerous technology research, is that even if the technology were banned and never approved by medical regulators, underground, unregulated research would likely still take place, and if the technology fell into the wrong hands it could wreak havoc on countries. The potential scale of devastation is enormous, with the process of making such a vaccine bearing an uncanny resemblance to the creation of a bioweapon capable of causing global pandemics. Even Bárcena, a scientist who was part of Sánchez-Vizcaino's original research group, has shifted his view of self-spreading vaccines after seeing how previous strategies involving the intentional release of viruses had unforeseen consequences, citing the evidence that the myxoma virus had combined with another poxvirus, enabling it to jump species (Craig, 2022).

This ethical argument, however, is an age-old one that accompanies the discovery of any new technology. At its core is the question of whether the potential risks a technology poses are worth taking in order to reap the benefits it could provide mankind. Some of the riskiest technologies, like AI, are still being allowed to develop because millions consider them the next step in human evolution; the benefits are judged to outweigh the risks. A similar logic can be applied to contagious vaccines, and the conclusion most come to is that the technology might never be used because of its multitude of problems, but it should still be developed in case we ever need it. Alec Redwood expresses this well: “it’s better to have something in the cupboard that can be used and is mature if we need it than let’s just not do this research because it’s too dangerous, to me, that makes no sense at all” (Craig, 2022).

Reference list

Craig, J., 2022. The controversial quest to make a ‘contagious’ vaccine. [online] Science. Available at: <https://www.nationalgeographic.com/science/article/the-controversial-quest-to-make-a-contagious-vaccine> [Accessed 1 October 2022].

Varrelman, T., Remien, C., Basinski, A., Gorman, S., Redwood, A. and Nuismer, S., 2022. Quantifying the effectiveness of betaherpesvirus-vectored transmissible vaccines. Proceedings of the National Academy of Sciences, 119(4).

Louten, J., 2016. Clinical Microbiology Newsletter, 38(13), p.109.

Centers for Disease Control and Prevention, 2022. History of Smallpox. [online] Available at: <https://www.cdc.gov/smallpox/history/history.html> [Accessed 1 October 2022].

Global Justice Now, 2022. Pfizer's £2 billion NHS rip-off could pay for nurses' pay rise SIX TIMES over. [online] Available at: <https://www.globaljustice.org.uk/news/pfizers-2billion-nhs-rip-off-could-pay-for-nurses-pay-rise-six-times-over/> [Accessed 1 October 2022].

Maclachlan, N., Dubovi, E., Barthold, S., Swayne, D. and Winton, J., 2017. Fenner's Veterinary Virology. 5th ed. Amsterdam: Academic Press, an imprint of Elsevier, pp.157-174.

Brink, S., 2018. What's the real story about the milkmaid and the smallpox vaccine? [online] NPR. Available at: <https://www.npr.org/sections/goatsandsoda/2018/02/01/582370199/whats-the-realstory-about-the-milkmaid-and-the-smallpox-vaccine> [Accessed 13 March 2023].

Myxomatosis, 2023. [online] Wikipedia. Available at: <https://en.wikipedia.org/wiki/Myxomatosis> [Accessed 13 March 2023].

Shore, J., 2020. TEM images of our virus-like particles. [online] The Native Antigen Company. Available at: <https://thenativeantigencompany.com/tem-images-of-our-virus-like-particles/> [Accessed 13 March 2023].

What is an antibiotic?

Teodor Wator

[1]An antibiotic is any substance that inhibits the growth and replication of a bacterium or kills it outright. Antibiotics are a type of antimicrobial designed to target bacterial infections within the body. This makes antibiotics different from the other main kinds of antimicrobials widely used today:

• Antiseptics are used to sterilise surfaces of living tissue when the risk of infection is high,

• Disinfectants are non-selective antimicrobials that kill a wide range of micro-organisms, including bacteria, and are used on non-living surfaces.

Of course, bacteria are not the only microbes that can be harmful to us. Fungi and viruses can also be a danger to humans, and they are targeted by antifungals and antivirals, respectively. Only substances that target bacteria are called antibiotics, while antimicrobial is a broader term for anything that inhibits or kills microbial cells, including antibiotics, antifungals, antivirals and chemicals such as antiseptics. Most antibiotics used today are produced in laboratories, but they are often based on compounds scientists have found in nature. Some microbes, for example, produce substances specifically to kill other nearby bacteria to gain an advantage when competing for food, water or other limited resources. However, some microbes only produce antibiotics when grown in the laboratory.

WHY ARE ANTIBIOTICS IMPORTANT?

[1]The introduction of antibiotics into medicine revolutionised the way infectious diseases were treated. Between 1945 and 1972, average human life expectancy jumped by eight years, with antibiotics used to treat infections that were previously likely to kill patients. Today, antibiotics are one of the most common classes of drugs used in medicine and make possible many of the complex surgeries that have become routine around the world. If we ran out of effective antibiotics, modern medicine would be set back by decades. Relatively minor surgeries, such as appendectomies, could become life-threatening, as they were before antibiotics became widely available. Antibiotics are also given to a limited number of patients before surgery to ensure that they do not contract any infections from bacteria entering open cuts. Without this precaution, the risk of blood poisoning would become much higher, and many of the more complex surgeries doctors now perform may not be possible.

PRODUCTION

Fermentation

[3]Industrial microbiology can be used to produce antibiotics via the process of fermentation, where the source microorganism is grown in large containers (100,000–150,000 litres or more) containing a liquid growth medium. Oxygen concentration, temperature, pH and nutrients are closely controlled. As antibiotics are secondary metabolites, the population size must be controlled very carefully to ensure that maximum yield is obtained before the cells die. Once the process is complete, the antibiotic must be extracted and purified to a crystalline product. This is easier to achieve if the antibiotic is soluble in an organic solvent. Otherwise, it must first be removed by ion exchange, adsorption or chemical precipitation.
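As a rough illustration of why the timing of the harvest matters, the toy simulation below (all parameters invented, not taken from any real process) models biomass growing logistically, an antibiotic titre that accumulates as a non-growth-associated secondary metabolite and slowly degrades, and a crude death phase, after which the titre eventually peaks and then falls.

```python
# Toy batch-fermentation sketch (illustrative numbers only): track biomass and
# antibiotic titre over a run and report when the titre peaks, i.e. roughly
# when the batch should be harvested.

def simulate(hours=200, dt=1.0, mu=0.2, capacity=40.0,
             beta=0.05, degradation=0.02, death_rate=0.03, stationary_after=100):
    biomass, titre = 0.5, 0.0
    best_titre, best_hour = 0.0, 0
    for h in range(int(hours / dt)):
        if h < stationary_after:
            biomass += mu * biomass * (1 - biomass / capacity) * dt  # growth phase
        else:
            biomass -= death_rate * biomass * dt                     # death phase
        # secondary metabolite: made in proportion to biomass, slowly degraded
        titre += (beta * biomass - degradation * titre) * dt
        if titre > best_titre:
            best_titre, best_hour = titre, h
    return best_titre, best_hour

if __name__ == "__main__":
    titre, hour = simulate()
    print(f"Titre peaks at about hour {hour} (arbitrary units: {titre:.1f})")
```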

Semi-synthetic

A common form of antibiotic production in modern times is semi-synthetic. Semi-synthetic production combines natural fermentation with laboratory work to maximise the antibiotic's usefulness, whether in the efficacy of the drug itself, the amount of antibiotic produced, or the potency of the antibiotic being made. Which of these is being optimised depends on the drug in question and its ultimate usage. An example of semi-synthetic production involves the drug ampicillin. A beta-lactam antibiotic just like penicillin, ampicillin was developed by adding an amino group (NH2) to the R group of penicillin.[2] This additional amino group gives ampicillin a broader spectrum of use than penicillin. Methicillin is another derivative of penicillin and was discovered in the late 1950s;[3] the key difference between penicillin and methicillin is the addition of two methoxy groups to the phenyl group.[4] These methoxy groups allow methicillin to be used against penicillinase-producing bacteria that would otherwise be resistant to penicillin.

WHAT ARE THEY MADE OF?

The compounds that make up the fermentation broth are the primary raw materials required for antibiotic production. This broth is an aqueous solution containing all of the ingredients necessary for the proliferation of the microorganisms. Typically, it contains a carbon source such as molasses or soy meal, which supplies sugars like lactose and glucose as a food source for the organisms. Nitrogen is another compound necessary for the organisms' metabolic cycles, and for this reason an ammonium salt is typically used. Additionally, trace elements needed for the proper growth of the antibiotic-producing organisms are included, such as phosphorus, sulfur, magnesium, zinc, iron and copper, introduced through water-soluble salts. To prevent foaming during fermentation, anti-foaming agents such as lard oil, octadecanol and silicones are used.

WHAT ARE DIFFERENT ANTIBIOTICS PRODUCED BY?

• [4]Some antibiotics are produced naturally by fungi. These include the cephalosporin-producing Acremonium chrysogenum.

• Geldanamycin is produced by Streptomyces hygroscopicus.

• Erythromycin is produced by what was called Streptomyces erythreus and is now known as Saccharopolyspora erythraea.

• Streptomycin is produced by Streptomyces griseus.

• Tetracycline is produced by Streptomyces aureofaciens.

• Vancomycin is produced by Streptomyces orientalis, now known as Amycolatopsis orientalis.

DOES SILVER MAKE ANTIBIOTICS MORE EFFECTIVE?

[2]Bacteria have a weakness: silver. It has been used to fight infection for thousands of years, and silver can disrupt bacterial cells in ways that could help to deal with the thoroughly modern scourge of antibiotic resistance.

Silver, in the form of dissolved ions, attacks bacterial cells in two main ways: it makes the cell membrane more permeable, and it interferes with the cell's metabolism, leading to the overproduction of reactive, and often toxic, oxygen compounds. Both mechanisms could be exploited to make modern antibiotics more effective against resistant bacteria.

Many antibiotics are thought to kill their targets by producing reactive oxygen compounds, and when boosted with a small amount of silver the drugs could kill between 10 and 1,000 times as many bacteria. The increased membrane permeability also allows more antibiotics to enter the bacterial cells, which may overwhelm the resistance mechanisms that rely on pushing the drug back out.

References

[1] https://microbiologysociety.org/members-outreach-resources/outreach-resources/antibiotics-unearthed/antibiotics-and-antibiotic-resistance/what-are-antibiotics-and-how-do-they-work.html

[2] https://www.nature.com/articles/nature.2013.13232

[3] http://www.madehow.com/Volume-4/Antibiotic.html

[4] https://en.wikipedia.org/wiki/Production_of_antibiotics

In December 1951, news broke of the first-ever nuclear reactor capable of producing electricity. Since then, nuclear energy has been considered by many to be an excellent replacement for fossil-fuel-based sources, producing more energy per kilogram of fuel while releasing no carbon dioxide. In light of recent world events, with tragic bushfires ravaging Australia in 2020, the melting of glaciers in Iceland and, most recently, a heatwave in which temperatures in London reached 40.3 degrees Celsius, we must take drastic steps to slow the effects of climate change, which includes finding cleaner sources of energy. This article sheds light on both the widely known and lesser-known consequences of nuclear fission reactors and asks whether it is possible to find a completely green source of energy.

How do Nuclear Fission reactors work?

Comprehending the impact of these reactors requires an understanding of how they work. They operate on the principle of radioactive decay, in which an unstable nucleus releases radiation in order to become more stable. Instability in a nucleus can arise for several reasons, such as an excess or shortage of protons or neutrons, which unbalances the forces within the nucleus (the case we will focus on is the former). Nuclear fission is the process by which a neutron strikes a large nucleus, causing it to split into two smaller nuclei along with two or three neutrons and a copious amount of energy. If a neutron produced in this fission reaction strikes another nucleus, it causes that nucleus to split as well, resulting in a chain reaction. One of the most famous equations ever written, E = mc², states that energy and mass are different forms of the same thing: energy equals mass times the speed of light squared, so a very small mass can be converted into a large amount of energy. This principle is critical in fission reactions: if one were to measure the combined mass of the product nuclei and neutrons after the fission reaction, one would find it is slightly less than the mass of the original nucleus and neutron. This difference, called the mass defect, is the mass that has been converted into energy, which is then harnessed.
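As an illustrative worked example (using rounded atomic masses, so the figures are approximate), consider one commonly quoted fission of uranium-235:

\[ ^{235}\mathrm{U} + n \rightarrow {}^{141}\mathrm{Ba} + {}^{92}\mathrm{Kr} + 3n \]

\[ \Delta m \approx (235.044 + 1.009) - (140.914 + 91.926 + 3 \times 1.009) \approx 0.186 \,\mathrm{u} \]

\[ E = \Delta m c^{2} \approx 0.186 \times 931.5 \,\mathrm{MeV} \approx 173 \,\mathrm{MeV} \approx 2.8 \times 10^{-11} \,\mathrm{J} \ \text{per fission} \]

Only about 0.1% of the original mass disappears, yet it reappears as millions of times more energy per atom than a typical chemical reaction releases.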

Within a nuclear reactor:

Nuclear reactors create electricity in a similar way to fossil-fuel power stations: heat is used to boil water into steam, which turns a turbine that in turn rotates a generator. Such reactors consist of four main components: fuel rods, control rods, a moderator and a coolant, each of which has a distinct function in producing this energy.

The components:

The fuel rods contain the uranium-235 nuclei and are surrounded by the moderator. When a neutron strikes a uranium-235 nucleus, it produces a uranium-236 nucleus, which is unstable and therefore splits. The moderator plays an important role here, slowing the neutrons from roughly 13,200 km/s down to about 2,200 m/s; these slower neutrons are far more likely to be absorbed by a uranium-235 nucleus, making a successful fission reaction more likely. The moderator, most commonly ordinary or heavy water, is carefully chosen: it must not absorb the neutrons, but merely slow them down considerably for the reason stated above. The control rods are the most crucial part of the reactor, as they control the rate of the fission reactions taking place. When they are not lowered, more neutrons collide with the fuel rods, so the rate of nuclear fission is higher; however, an uncontrolled rate of fission can prove dangerous. When the control rods are lowered, they absorb neutrons and therefore reduce the number of neutron collisions with the fuel rods, decreasing the rate of fission; the electricity output is lower, but the reactor is safer. There is therefore always a fine balance between keeping the rate of reaction high enough (to meet output and financial goals) and not letting it climb too high (to ensure the safety of the workers). Finally, the coolant carries the thermal energy produced in this process to where steam is raised to drive the turbine and generator, producing electricity.
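A minimal numerical sketch of the balance the control rods maintain: if each fission leads on average to k further fissions (k is often called the effective multiplication factor), the neutron population dies away, holds steady or grows exponentially depending on whether k is below, at or above one. The figures below are illustrative only.

```python
# Toy model of a fission chain reaction: n_next = k * n_current per generation.
# Lowering control rods absorbs neutrons and pushes k below 1 (sub-critical);
# raising them lets k rise above 1 (super-critical, exponential growth).

def neutron_population(k, n0=1000.0, generations=20):
    """Neutron count after each generation for a fixed multiplication factor k."""
    counts = [n0]
    for _ in range(generations):
        counts.append(counts[-1] * k)
    return counts

if __name__ == "__main__":
    for label, k in [("rods lowered, k = 0.95", 0.95),
                     ("critical,     k = 1.00", 1.00),
                     ("rods raised,  k = 1.05", 1.05)]:
        print(f"{label}: {neutron_population(k)[-1]:,.0f} neutrons after 20 generations")
```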

The impact on the environment of Fission reactors:

Nuclear reactor accidents:

Nuclear accidents are catastrophic events, releasing significant amounts of radioactive material into the surrounding environment and causing tragic destruction of marine and land ecosystems. Despite the numerous safety features implemented, events such as Chernobyl have still taken place (mainly due to a mix of human error and unexpected equipment failure), and they have had a profound impact. This section focuses on the effects on agriculture and farming, on lakes, and finally on plants and animals.

After the Chernobyl accident, the amount of radioactive iodine in the soil reached an all-time high. Although this level decreased over time (due to factors such as the wind carrying the isotopes away and iodine's short half-life), it still had a profound effect. For example, iodine levels in the human thyroid gland increased, resulting in thyroid disease. Furthermore, it affected the reproductive abilities of trees, especially those in the exclusion zone. The Chernobyl accident also heavily contaminated water bodies with radioactive caesium, iodine and strontium. There were large concerns about the accumulation of radioactive caesium in aquatic food webs, as the amount of caesium-137 in fish was at a peak. Although these levels eventually decreased, in certain lakes in Russia the concentration of caesium remains high, as shown by persistently high levels of radioactive caesium in fish. The accident had disastrous effects on human health as well as on other wildlife. Workers at the plant near Pripyat received high doses of gamma radiation (2-20 Gy), resulting in acute radiation syndrome, the illness that follows exposure to large amounts of penetrating ionising radiation (which can pass through the skin to reach sensitive organs and cause cell death), and in many cases death. Furthermore, increased uptake of radioactive iodine by the thyroid (largely through the milk of cows that had grazed on contaminated grass) meant that ionising radiation caused mutations and therefore thyroid cancer. Finally, the Chernobyl accident caused visible deformities in many animals in the exclusion zone.

Mining for Uranium

Obtaining uranium involves mining and processing, both of which have a severe effect on the environment. There are several different mining methods, such as underground mining, in which deep shafts and tunnels are constructed and the uranium ore is broken up and brought to the surface. The energy demand of mining is very large and is often met by burning fossil fuels, which releases CO2 and other greenhouse gases. Processing the uranium ore by leaching also has a considerable impact, mainly because of the waste products created. The mill tailings, the waste left behind, contain radioactive elements such as radium, which decays to produce radioactive radon gas. There have been particular concerns within the scientific community about the effect of radon gas on the environment, and especially on human health. As explained in an article published by Stanford University, radon gas is linked to lung cancer: radon decays into smaller radioactive particles that become stuck in the lining of the lungs and continue to decay, releasing ionising radiation, which can eventually lead to cancer through mutation of the DNA in these cells. Furthermore, because radon is easily carried away from the mill tailings by the wind, and because the parent isotopes in the tailings have long half-lives, the tailings have to be managed safely for very long periods of time.

Environmental issues concerning nuclear reactors:

Nuclear waste:

Furthermore, nuclear waste presents a prominent problem due to its effects on the environment and the difficulties of disposal and storage. There are numerous proposed solutions for storing nuclear waste, such as burial underground or in the oceans. However, if a sealed container were damaged, this could result in leakages of highly radioactive elements with long half-lives. The ionising radiation released could cause genetic deformities and mutations, and the death of plants and animals. Because of the long half-lives of the waste nuclei (for example strontium-90), the waste remains radioactive, and keeps releasing this dangerous ionising radiation into the environment, for decades or longer. Nuclear waste likewise has adverse effects on human health. For example, caesium-137, a soluble radioactive isotope, may be released as a waste product; it can be absorbed by internal organs, including the reproductive organs, where it decays, releasing high-energy gamma photons (gamma radiation). These can cause damage and inhibit reproductive ability. Furthermore, it is widely known that the ionising radiation released by such isotopes is capable of causing cancer.
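To make the half-life point concrete, the standard radioactive decay law gives the fraction of an isotope remaining after a time t; taking strontium-90, with a half-life of roughly 29 years, as an example:

\[ N(t) = N_{0} \left( \tfrac{1}{2} \right)^{t / t_{1/2}} \]

\[ \frac{N(100\,\mathrm{yr})}{N_{0}} = \left( \tfrac{1}{2} \right)^{100 / 29} \approx 0.09 \]

So even after a full century, roughly 9% of the strontium-90 is still present and still emitting radiation, which is why such waste has to be stored securely for generations.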

What is nuclear fusion and could we generate electricity from this?

Nuclear fusion is the process by which two lighter nuclei fuse together to produce a larger, more stable nucleus, releasing energy. This is the process that takes place in the centre of stars and, in fact, keeps them in a stable condition (the outward radiation pressure balancing the inward gravitational force). Like fission, fusion works on the principle of mass-energy equivalence. If you were to measure the masses of the two lighter nuclei, their sum would be greater than the combined mass of the products formed; this difference is again called the mass defect. By E = mc², we know that this mass has been converted into energy, which is released because the product nucleus is more tightly bound than the reactants. At present, no nuclear fusion reactor has been built that can produce electricity on a commercial scale; however, there have been big strides in the technology. There are two main designs for a nuclear fusion reactor: magnetic confinement reactors and inertial confinement reactors.
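As an illustrative calculation (using rounded atomic masses), the deuterium-tritium reaction that most reactor designs aim to exploit is:

\[ ^{2}\mathrm{H} + {}^{3}\mathrm{H} \rightarrow {}^{4}\mathrm{He} + n \]

\[ \Delta m \approx (2.01410 + 3.01605) - (4.00260 + 1.00866) \approx 0.0189 \,\mathrm{u} \]

\[ E = \Delta m c^{2} \approx 0.0189 \times 931.5 \,\mathrm{MeV} \approx 17.6 \,\mathrm{MeV} \]

Here roughly 0.4% of the fuel's mass is converted to energy, compared with roughly 0.1% in a fission event, which is the origin of the claim that fusion releases several times more energy per unit mass of fuel.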

How close are we to a viable nuclear fusion reactor?

A viable nuclear fusion reactor would release more energy from the fusion reactions than the energy put in to sustain them. Several promising designs have been announced, and scientists hope that by 2040 we will have such a reactor that is economically viable and operational.
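This breakeven condition is usually expressed through the fusion energy gain factor Q, the ratio of the fusion power produced to the external heating power supplied to the plasma; net energy gain requires Q to exceed one, and a commercial plant would need Q well above that:

\[ Q = \frac{P_{\text{fusion}}}{P_{\text{heating}}}, \qquad Q > 1 \ \text{for net energy gain} \]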

A large difficulty in building a nuclear fusion reactor is holding the fuel in place. These reactors work by heating deuterium and tritium (isotopes of hydrogen) to very high temperatures, producing a plasma, a soup of ionised particles. However, the plasma cannot be allowed to touch the wall of the container, as it would simply vaporise it. To hold the plasma in place, this type of reactor uses strong magnetic fields: because the plasma contains ions, which are electrically charged, the magnetic fields can guide the ions and keep the plasma in a fixed position. A magnetic confinement reactor works in the following way:

1. Inside the reactor, a stream of neutral particles is released by the accelerator. This stream heats the deuterium and tritium to very high temperatures.

2. The plasma is contained within a tokamak (a doughnut-shaped vessel). A transformer connected to the tokamak produces magnetic fields that squeeze and confine the plasma so that the deuterium and tritium nuclei fuse to produce helium.

3. The blanket modules absorb the heat produced in the fusion reaction and transfer it to the heat exchanger, where water is converted into steam. This steam drives the turbine, which in turn rotates the generator, producing electricity.

Inertial confinement is a newer, experimental approach that physicists are still testing; so far, magnetic confinement is the more established method. In inertial confinement, 192 laser beams are fired at a small container called a hohlraum, inside which sits a capsule of hydrogen fuel. The lasers heat the inside of the hohlraum, which emits intense X-rays; these X-rays strike the fuel capsule and cause it to implode. This is meant to recreate the high-pressure, high-temperature conditions at the centre of stars and cause the deuterium and tritium nuclei to fuse and form helium.

Furthermore, other promising projects are being developed, such as SPARC, which is also a magnetic confinement tokamak but would be much smaller and could be built in a shorter timeframe (potentially allowing us to achieve viable fusion by 2025).

How would our planet benefit from the production of viable nuclear fusion?

One reason we would benefit is that fusion releases no carbon dioxide. In 1900 the concentration of CO2 in the atmosphere was roughly 0.029% (just under 300 parts per million), whereas by 2020 it had risen to roughly 0.041% (over 410 parts per million). Such numbers may not seem significant, but they are deeply worrying when viewed in the context of global temperature. An increase in greenhouse gas levels results in an enhanced greenhouse effect, in which heat radiated from the Earth's surface is absorbed by these gases rather than escaping into space. This causes an increase in the average temperature of the Earth, which drives climate change. The drastic effects of this on our environment have been seen recently, for example in the melting of glaciers, which has resulted in the flooding of coastal areas and the disruption of ecosystems. This is avoided when using energy made from nuclear fusion. In addition, the fusion reaction itself releases no radioactive isotopes that emit alpha, beta or gamma radiation (all three of which are destructive forms of ionising radiation). Furthermore, per unit mass of fuel, nuclear fusion releases roughly four times as much energy as nuclear fission, allowing for more electricity production in fusion reactors.

What limitations will we face in the development of nuclear fusion:

The main difficulty we have had in producing a viable fusion reactor is the input-to-output ratio, i.e. making sure the energy released is greater than the energy taken in (with the current record being only around 65%). The challenges being faced are:

- Initiating a burning plasma (a plasma that can maintain its own temperature and sustain fusion) requires heating it to temperatures higher than the core of the Sun, which needs technologies we have not yet fully developed.

- If the superconducting magnets surrounding the reaction chamber are not kept at extremely low temperatures (near absolute zero), the resulting failure could damage components such as the blanket modules and force the decommissioning of the facility.

In conclusion, the stark reality of the destructive nature of climate change has led us to re-evaluate the way we live, especially our energy consumption. Attempts to reduce our energy use, such as turning off the lights when leaving a room, help slow the rate at which the situation worsens, but more radical approaches are needed to reverse it. It is clear that current forms of energy production, such as burning fossil fuels, have severe effects on the environment, and nuclear fission has serious drawbacks of its own. Although nuclear fusion is not perfect (recent articles indicate that the radioactive waste produced when neutrons collide with the blanket requires careful disposal), it is the best option we have, and much of the scientific community is optimistic that we can reverse climate change and sustain this beautiful, unique planet for future generations to come.
