Category: Biology

The CRISPR Revolution: How will Gene Editing change our World?

CRISPR will offer unprecedented control in gene editing. Image Source:

The age of gene editing is upon us! Or it will be soon thanks to the revolutionary new technology known as CRISPR. I would be VERY surprised if you haven’t at least heard of it by now, especially when you consider the attention it gets from the science media. Attention that is very understandable once you start looking at exactly what this technology can do, and what that potentially means.

The story began in 2013, when some researchers claimed that they had used CRISPR-Cas9 to successfully slice the genome in human cells at sites of their choosing. This understandably triggered a massive ethics debate which is still going on today. A huge amount of the conversation focussed on how it could be used to fight genetic diseases or even edit human embryos, but there are many more potential applications. As the debate continued, researchers set about editing the genomes of many other organisms, including plants and animals, and even exploring its potential for studying specific gene sequences. The range of applications is truly remarkable.

Some claim that the real revolution right now is in the lab, where CRISPR has made the study of genetics significantly easier. There are two main components to the CRISPR-Cas9 system: a Cas9 enzyme that acts as a pair of molecular scissors, cutting through the DNA strand, and a small RNA molecule that directs the system to a specific point. Once the cut is made, the cell’s own DNA repair mechanisms will often mend it, but not without making the occasional mistake.
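For the programmers among you, the targeting step can be sketched in a few lines of Python. This is a heavily simplified toy model, not real bioinformatics: the genome and the 20-nucleotide guide sequence are invented, and I’ve only kept the two rules that matter here, a matching spacer and an adjacent “NGG” PAM motif, with the cut made roughly 3 bases upstream of the PAM:

```python
# Toy model of Cas9 targeting: the guide RNA's 20-nt spacer must match
# the DNA, and Cas9 needs an adjacent "NGG" PAM motif, cutting ~3 bases
# upstream of it. All sequences are invented for illustration.
def find_cut_site(dna: str, spacer: str) -> int:
    """Index where Cas9 would cut, or -1 if there is no valid target."""
    i = dna.find(spacer)
    if i == -1:
        return -1
    pam = dna[i + len(spacer): i + len(spacer) + 3]
    if len(pam) == 3 and pam[1:] == "GG":   # PAM = any base + "GG"
        return i + len(spacer) - 3          # blunt cut 3 bp before the PAM
    return -1

genome = "ATGCC" + "TACGGATTCACGTGATCGGA" + "AGG" + "TTC"
spacer = "TACGGATTCACGTGATCGGA"  # hypothetical 20-nt guide sequence
cut = find_cut_site(genome, spacer)       # -> 22
left, right = genome[:cut], genome[cut:]  # the two severed ends
```

Running it on the toy genome finds the target and splits the strand into two fragments, which is exactly the point where the cell’s own repair machinery takes over.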

Even a small error during DNA repair can completely alter the structure of the protein a gene codes for, or stop its production altogether. By exploiting these traits and errors, scientists can study what happens to a cell or organism when a specific gene or protein is altered. Such a level of control will likely make these studies less prone to error, and lead to a much better understanding of the role played by specific genes and proteins.
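To see why a tiny repair error is such a big deal, consider this toy translation sketch in Python (the codon table is truncated to just the codons used here). Deleting a single base shifts the reading frame, so every codon downstream of the error, and therefore every amino acid, changes:

```python
# Toy translation: delete one base and watch the reading frame shift.
CODON = {"ATG": "Met", "AAA": "Lys", "GGC": "Gly", "TGC": "Cys",
         "AAG": "Lys", "GCT": "Ala", "TGA": "STOP"}

def translate(dna):
    """Read the strand three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON.get(dna[i:i + 3], "?")
        if aa == "STOP":
            break
        protein.append(aa)
    return protein

original = "ATGAAAGGCTGC"              # Met-Lys-Gly-Cys
mutated = original[:4] + original[5:]  # one base lost during repair
# translate(original) -> ['Met', 'Lys', 'Gly', 'Cys']
# translate(mutated)  -> ['Met', 'Lys', 'Ala']  (everything downstream changed)
```

One lost base out of twelve, and two-thirds of the protein is different.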

But it doesn’t stop there! There exists a second repair mechanism that mends the DNA strand according to a template provided by the cell. If researchers instead supply a template of their own design, they could potentially insert nearly any DNA sequence at whatever site they desire. A prospect that would allow genomes to not just be edited, but actively designed.
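Sticking with the toy Python model, this template-directed repair can be sketched as a simple paste operation: a donor template carries “homology arms” matching the DNA on either side of the cut, with the new sequence in between. All sequences here are invented:

```python
# Toy template-directed repair: the donor's homology arms must match the
# DNA flanking the cut, and the payload between them gets written in.
def hdr_edit(genome: str, cut: int, left_arm: str,
             payload: str, right_arm: str) -> str:
    """Insert `payload` at `cut`, checking both homology arms first."""
    assert genome[cut - len(left_arm):cut] == left_arm, "left arm mismatch"
    assert genome[cut:cut + len(right_arm)] == right_arm, "right arm mismatch"
    return genome[:cut] + payload + genome[cut:]

genome = "ATGCCGTTAGGCATT"  # invented sequence
edited = hdr_edit(genome, cut=7, left_arm="CCGT",
                  payload="GAATTC", right_arm="TAGG")
# edited -> "ATGCCGTGAATTCTAGGCATT": the payload now sits at the cut site
```

The real biology is of course messier (the donor usually replaces a stretch of sequence rather than purely inserting), but the principle of designing what goes where is the same.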

That idea may sound somewhat futuristic at this point, but in reality it’s already being put to use. Due to the relative precision and ease that CRISPR offers, scientists have already made a start on editing the genes of animals for applications ranging from agriculture to the revival of extinct species. Some CRISPR modified animals are even being marketed as pets! As you can imagine, regulators are still working out how to deal with such animals, as there are some obvious safety and ecological concerns, but that hasn’t stopped the science from happening.

Disease resistance is one of the more popular uses for CRISPR, and it can provide a variety of agricultural and ecological benefits. For example, there is hope that such an application could help stem the huge loss of honey bee populations around the world. This loss can in part be attributed to disease and parasites, but biotechnology entrepreneur Brian Gillis may have found a solution. He has been studying the genomes of so-called “hygienic bees”, which get their name from their obsessive hive cleaning habits and removal of sick or infested larvae.

Gillis’ idea is that, if the genes responsible for this behaviour can be identified, they could be edited into the genomes of other populations to significantly improve hive health. But whether or not this can be done remains to be seen, as no such genes have been found as of yet, and the roots of the behaviour may prove to be far more complex. At this point we’ll just have to wait and see.

Another potential application, one that I personally find much more interesting, is the revival of extinct species. Around 4000 years ago the woolly mammoth went extinct, an event that was at least partially due to hunting by humans. Well it now looks like we might be able to make up for that mistake! This is thanks to scientist George Church from Harvard Medical School in Boston, who has plans to use CRISPR to bring back this long-lost species. He hopes to transform some endangered Indian elephants into a breed of cold-resistant elephants that will closely resemble the mammoth, and release them into a reserve in Siberia where they will have space to roam.

But the process of editing, birthing, and then raising these mammoth-like elephants is no easy task. The first problem is how to go from an edited embryo to a developed baby, and Church has said it would be unethical to implant the embryos into endangered elephants for the sake of an experiment. Since that option is off the table, his lab is currently looking into the possibility of an artificial womb; a device that does not currently exist.

It’s worth pointing out that I’ve just barely scratched the surface of what CRISPR can do for animals, let alone the many other organisms it can be applied to. I could very easily write an entire post about it to be honest, but there is one final point that definitely deserves some attention. We’ve already seen how amazingly versatile CRISPR is, and it stands to reason that, if you can edit the genomes of animals in this way, you can almost certainly do the same to humans.

As I’m sure you can imagine, there is a very heated debate about how it could and should be used to modify the genomes of human embryos. One key argument in favour is that many currently available technologies already allow parents to do something similar. These include prenatal genetic screening to check for conditions like Down syndrome, and in-vitro fertilisation, which allows parents to select embryos that don’t have certain disease-causing mutations. One could say that direct genome editing is simply the next step in technology of this nature.

On the other hand, one needs to consider what genome editing would mean for society as a whole. For example, by allowing parents to edit out traits they see as debilitating, we could potentially create a less inclusive society. In such a world even the tiniest of flaws might be seen as a huge disadvantage, with everyone being subjected to much harsher judgement. Would that be beneficial for the human race? Unfortunately that’s not a question we can answer, but it doesn’t sound like a pleasant world to live in.

Whether or not you’re in favour of the human race dictating the genetics and characteristics of future generations seems to be a matter of opinion right now, but it’s certainly not fair to say that either side of the debate outweighs the other. To me, there doesn’t seem to be an obvious answer. On the one hand, we have the chance to truly improve the human race by ensuring that we continue to improve and adapt as time goes on, assuming of course that we have the knowledge to do so. But I cannot say for sure what type of society that would create, or if it’s one I’d really like to live in.

Regardless of your opinion on CRISPR and gene editing, you can’t deny that this new technology has the potential to completely change our world and our society. Given that it can improve our understanding of genetics and allow us to physically alter the DNA of living creatures, one could easily describe it as the beginning of a genetic revolution. We’ll have to just wait and see if it will be put to use in the ways I’ve explored here, but it’s certainly something I will be keeping an eye on. Hopefully I got you interested enough to do the same.


New Fuel Cell Technology keeps the Environment in mind!

I imagine you’re all pretty familiar with fuel cell technology at this point. It’s been around for quite some time and is often heralded as the answer to green, renewable energy. For the most part that is quite true, as there are a number of advantages that this technology has over current combustion-based options. It not only produces smaller amounts of greenhouse gases, but also none of the air pollutants associated with health problems.

That being said, the technology isn’t perfect, with many improvements still to be made. One problem is that a fuel cell’s environmental impact depends greatly on how the fuel is acquired. For example, the by-products of a hydrogen (H2) fuel cell may only be heat and water, but if electricity from the power grid is used to produce the H2 fuel, then significant CO2 emissions are still generated upstream.
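To put some rough numbers on that (and I stress these are my own back-of-envelope assumptions, not figures from any study): electrolysis takes on the order of 55 kWh of electricity per kg of H2, and a fossil-heavy grid emits roughly 0.5 kg of CO2 per kWh:

```python
# Back-of-envelope upstream emissions for grid-powered electrolysis.
# Both constants are assumptions for illustration, not measured values.
KWH_PER_KG_H2 = 55.0     # assumed electricity demand of electrolysis
GRID_CO2_PER_KWH = 0.5   # assumed grid carbon intensity (kg CO2/kWh)

def co2_per_kg_h2(kwh_per_kg: float = KWH_PER_KG_H2,
                  grid_intensity: float = GRID_CO2_PER_KWH) -> float:
    """kg of CO2 emitted upstream for each kg of H2 fuel produced."""
    return kwh_per_kg * grid_intensity

fossil_grid = co2_per_kg_h2()                    # ~27.5 kg CO2 per kg H2
clean_grid = co2_per_kg_h2(grid_intensity=0.05)  # near zero-carbon H2
```

The fuel cell itself stays clean either way; the emissions live entirely in the grid intensity term, which is exactly why how the fuel is made matters so much.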

The technology also requires the use of expensive or rare materials. Platinum (Pt) is easily the most commonly used catalyst in current fuel cell technology, and this rare metal often costs around 1,000 US dollars per ounce. This really hurts the commercial viability of the fuel cell, but research into alternative materials is progressing.

While I’m certain these kinks will be worked out eventually, it is still worth considering other options. One such option is the Microbial Fuel Cell (MFC), a bio-electrochemical device that uses respiring microbes to convert an organic fuel into electrical energy. These already have several advantages over conventional fuel cell technology, primarily due to the fact that bacteria are used as the catalyst.

The basic structure of an MFC is shown in Figure 1, and you can see that it closely resembles that of a conventional fuel cell. In fact, the method by which it produces electricity is exactly the same; the only differences are the fuel and the catalyst.

Figure 1: The basic structure of an MFC. Source:

The fuel for an MFC is often an organic molecule that can be used in respiration. In the figure it is shown to be glucose, and you can see that its oxidation yields both electrons and protons. It is worth noting that the species shown as “MED” is a mediator molecule used to transfer the electrons from the bacteria to the anode. Such molecules are no longer necessary, as most MFCs now use electrochemically active bacteria known as “Exoelectrogens”. These bacteria can directly transfer electrons to the anode surface via a specialised protein.

As I mentioned before, this technology has several advantages over conventional fuel cell technology in terms of cost and environmental impact. Not only are bacteria both common and inexpensive when compared to Pt, but some can respire waste molecules from other processes. This not only means that less waste would be sent to landfill, but that the waste would actually become a source of energy. This has already been applied in some waste-water treatment plants, with the MFCs producing a great deal of energy while also removing waste molecules.

Now you’re probably thinking, “Nathan, this is all well and good, but it’s not exactly new technology”. You’d be right there, but some scientists from the University of Bristol and the University of the West of England have made a big improvement. They have designed an MFC that is entirely biodegradable! The research was published in the journal ChemSusChem in July of 2015, and it represents a great improvement in further reducing the environmental impact of these fuel cells.

Many materials were tried and tested during the construction process. Natural rubber was used as the membrane (see Figure 1), the frame of the cell was produced from polylactic acid (PLA) using 3D printing techniques, and the anode was made from carbon veil with a polyvinyl alcohol (PVA) binder. All of these materials are readily biodegradable with the exception of the carbon veil, but this is known to be benign to the environment.

The cathode proved to be more difficult, with many materials being tested for conductivity and biodegradability. The authors noted that conductive synthetic latex (CSL) can be an effective cathode material, but lacks the essential biodegradability. While this meant it couldn’t be used in the fuel cell, it was used as a comparison when measuring the conductivity of other materials.

Testing then continued with an egg-based and a gelatin-based mixture as the next candidates. While both of these were conductive, they weren’t nearly good enough to be used; CSL actually performed 5 times better than either of them. But science cannot be beaten so easily! Both mixtures were improved by modification with lanolin, a fatty substance found in sheep’s wool, which is known to be biodegradable. This caused a drastic increase in performance for both mixtures, with the egg-based one outperforming CSL! This increase easily made it the best choice for the cathode.

With all the materials now decided, it was time to begin construction on the fuel cell. A total of 40 cells were made and arranged in various configurations. These are shown in Figure 2, and each configuration was tested to determine its performance. Of these three, the stack shown in Figure 2C was found to be able to continuously power an LED that was directly connected. It was also connected to some circuitry that harvested and stored the energy produced, and the authors report that the electricity produced by this method could power a range of applications.
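The logic behind these configurations is just the usual series/parallel arithmetic: cells wired in parallel add their currents, and parallel sets stacked in series add their voltages. The per-cell figures below are invented for illustration; the paper reports its own measurements:

```python
# Series/parallel arithmetic for the stacks in Figure 2.
# Per-cell values are hypothetical placeholders, not measured data.
CELL_VOLTAGE = 0.5    # volts per MFC (invented)
CELL_CURRENT = 0.002  # amps per MFC (invented)

def stack_output(cells_per_set: int, sets_in_series: int):
    """Voltage, current, and power of a stack of parallel sets."""
    voltage = CELL_VOLTAGE * sets_in_series   # series: voltages add
    current = CELL_CURRENT * cells_per_set    # parallel: currents add
    return voltage, current, voltage * current

# Figure 2c: 8 parallel sets of 5 cells each (40 cells in total)
v, i, p = stack_output(cells_per_set=5, sets_in_series=8)
```

With these made-up numbers the Figure 2c arrangement gives eight times the single-cell voltage at five times the single-cell current, which is why the biggest stack was the one that could keep an LED lit.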

Figure 2: a) A set of 5 fuel cells connected in parallel. Known as a “parallel set”. b) A stack of 4 parallel sets. c) A stack of 8 parallel sets. Source:

While there is much to celebrate here, the authors also address some of the concerns associated with this technology. The most notable concern is how long the fuel cells can operate, and the authors report that after 5 months of operation the stacks were still producing power. This could potentially be longer in an application, as the operational environment of a fuel cell rarely mimics natural conditions.

They also discuss how these MFCs didn’t perform as well as some produced in other studies, but these were the first to be made from cheap, environmentally friendly materials. If anything, this research shows that such fuel cells can at least be functional, and are an excellent target for further research.

So we’ll have to wait for more research to see if this technology will actually take off, and given the timescale of this study it’s likely that we’ll be waiting quite some time. Even so, this is an important step on the road to completely sustainable living, as it shows that even our power sources could be made from completely environmentally friendly materials. Now we just have to hope people take notice. Let’s make sure they do!


New Evidence of Internet Gaming Disorder

Various consoles and controllers. Source:

Internet gaming is still a relatively new concept, yet it is one we are already very familiar with as it bounced into the limelight over the past decade. And I really mean it when I say “bounced”, with the market in China alone being estimated to be worth $12 billion!

There is also a growing amount of literature looking at how video games can affect both our physical and mental health, and it looks like online gaming may have brought about a new kind of mental illness.

This new condition is known as “Internet Gaming Disorder” (IGD), and it’s more than a simple enjoyment of online games. People with IGD play to the detriment of other areas in their life, neglecting their health, school work, even their family and friends. They also experience withdrawal symptoms if they are prevented from getting their fix.

All that being said, the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition (DSM-5) does not currently list IGD, stating that it’s a “condition warranting more clinical research” before it can be included. Well, this new research might be exactly what was asked for, as it provides new evidence of brain differences between people who do and do not have the disorder.

The study participants, all adolescent males between 10 and 19 years of age, were screened in South Korea where online gaming is an even greater social activity than it is in the US. In fact, most research on this matter comes from young males all around Asia since it’s where the disorder is most commonly found. The Korean government also supported the research, hoping to be able to identify and treat addicts.

The research was a collaboration between the University of Utah School of Medicine and Chung-Ang University in South Korea, and was published online in Addiction Biology. It involved taking MRI scans of all participants, 78 of whom were seeking treatment for IGD and 73 who were not.

What they found was that participants suffering from IGD showed hyperconnectivity between several pairs of brain networks. Some of these changes could help gamers respond to new information, whereas others are associated with being easily distracted and having poor impulse control.

One of the potentially beneficial changes was improved coordination between areas that process vision or hearing and the Salience Network. This is the area of the brain responsible for focussing a person’s attention on important events and preparing them to react. You can probably see why that would be useful to an online gamer, allowing them to dodge a hail of bullets or react to a charging foe.

According to author Jeffrey Anderson, M.D., Ph.D., this could lead to “a more robust ability to direct attention towards targets and recognise novel information in the environment”, and “could essentially help someone to think more efficiently”. But without follow-up studies to determine if performance is actually improved, this remains only a hypothesis.

A more worrying find was that participants with IGD showed weaker coordination between the Dorsolateral Prefrontal Cortex and the Temporoparietal Junction than those without the disorder. The same changes are seen in patients with schizophrenia, Down syndrome, and autism, as well as in people with poor impulse control. It is thought that this could also lead to increased distractibility.

But despite all these findings it is currently unclear if chronic video game playing causes these changes in the brain, or if people with these differences are drawn to the gaming world. Much more research will be required before that question can be answered.

So should you spend less time in the virtual world of video games? Well at this point we don’t really know. Gaming might be harming you, but it might also turn out that the benefits outweigh the drawbacks. Either way, this is an area of research that is continuing to grow, and it’s certainly worth keeping an eye on. I know I will be.


An Ocean of Problems

You can be sure, or at least hope, that the many effects of climate change will be addressed this week in Paris, and I’ve got my fingers crossed for some truly meaningful progress to be made. But there is one problem that many people remain startlingly unaware of: the effect that climate change is having on the world’s oceans.

At first glance that might not seem like much of a problem. I mean, what does the ocean do for us? Right? Well it turns out it actually does an awful lot for us humans, and all these services are at risk as the effects of rising temperatures mount up.

The ocean is actually an integral part of the climate system, taking up around 90% of the climate’s excess energy in the form of heat. It continues to take up heat to this day, and is an important factor in slowing the atmospheric warming we are so much more concerned about. This heat uptake causes the ocean water column to warm as well, an effect now detectable around the globe at depths greater than 2000 m. This not only harms ocean ecosystems, but weakens the ocean’s ability to absorb heat in the future.

This is due to the phenomenon known as “Thermohaline Circulation”, meaning the circulation of both heat (thermo) and salt (haline) within the ocean. The mixing occurs due to differences in density, which is determined by both the temperature and salinity of the sea water. The colder and more saline the water, the greater the density. This means that colder water will sink, rising again as it travels the world’s ocean currents and warms.
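For those who like equations, a common textbook simplification captures this: density decreases with temperature and increases with salinity. The coefficients below are rough, generic values, not the full oceanographic equation of state:

```python
# A linear approximation to seawater density (rough textbook coefficients,
# not the full TEOS-10 equation of state used by oceanographers).
RHO0 = 1027.0    # reference density (kg/m^3)
T0, S0 = 10.0, 35.0
ALPHA = 0.00017  # thermal expansion per degree C (approximate)
BETA = 0.00076   # haline contraction per salinity unit (approximate)

def density(temp_c: float, salinity: float) -> float:
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity - S0))

cold_salty = density(2.0, 35.5)   # polar-style water
warm_fresh = density(18.0, 34.5)  # warmer surface water
# cold_salty > warm_fresh: the cold, salty water sinks and drives the mixing
```

You can also see from the formula why warming weakens the circulation: as temperatures rise everywhere, the density differences between water masses shrink.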

Because a given volume of water can only absorb so much heat, the excess heat at the surface is taken up by the less dense water being mixed downwards, causing the lower, colder layers of the ocean to warm. Over time this warms the entire water column, and the Thermohaline Circulation becomes more and more stable, as the rising temperatures reduce the density differences that drive it.

This means that the mixing process will be slowed, maybe even stopped altogether if the warming continues, and the transportation of heat energy around the ocean will become much less efficient. This would make the ocean much less capable of absorbing heat from the atmosphere, as there would be fewer areas of water cold enough to absorb a meaningful amount. So where would that heat go? Well… nowhere. It would remain in the atmosphere above the ocean, and atmospheric warming would proceed at a much faster rate due to the loss of this regulatory system.

But ocean circulation would not be the only thing affected by the ocean warming. The intensity and frequency of extreme weather would also change, as well as the extent of the areas affected by them. Cyclones and extreme weather events pick up a lot of energy from the ocean in the form of heat. The air above the ocean’s surface contains a great deal of water vapour, and as this air rises and the vapour condenses, the heat absorbed during evaporation is released into the surrounding air. This causes an expansion of the air and a decrease in pressure, which then facilitates the rising of more air from the ocean’s surface. This process feeds more energy into the cyclone or weather system, increasing its intensity.

A warming ocean not only increases the amount of heat energy available to these weather events, but since the warming is occurring across the globe, the energy exchange can occur over a much larger area. This means that previously unaffected areas of the world may have to rapidly adapt to dealing with these storms, and a poleward shift in the zones of maximum intensity has already been observed.

I hope you’re now thinking “Wow, this could actually really fuck things up”. Well there is more bad news to come my friends, as the ocean is not only getting warmer, it is also getting more acidic. The ocean also does us the service of removing some of our CO2 emissions from the atmosphere, having absorbed around 28% of human-produced CO2 since the start of the industrial revolution. Doesn’t sound like much? Well it’s equal to approximately 150 billion tons of the stuff.

The trend in ocean acidification is now 30 times greater than the natural variation thanks to us, and the average surface ocean pH has dropped by 0.1 units, which is a significant increase in acidity. While the large-scale effects of acidification remain unknown, it is already clear that it is affecting marine wildlife.
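A drop of 0.1 pH units sounds tiny, but the pH scale is logarithmic, so the jump in hydrogen-ion concentration is anything but:

```python
# pH is -log10 of the hydrogen-ion concentration, so a 0.1-unit drop
# means the concentration rises by a factor of 10**0.1.
def acidity_increase(ph_drop: float) -> float:
    """Fractional increase in [H+] for a given decrease in pH."""
    return 10 ** ph_drop - 1

# A 0.1-unit pH drop is roughly a 26% increase in hydrogen ions.
print(f"{acidity_increase(0.1):.0%}")
```

That’s roughly a quarter more hydrogen ions in surface waters than before the industrial revolution.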

Certain organisms rely on Calcium Carbonate (CaCO3) to form their skeletons or shells, and it is known that CaCO3 formation is disrupted if the environment is too acidic. This can also have indirect effects on other organisms, as some CaCO3 reliant structures, such as coral reefs, provide homes for many other forms of marine life.

It is also known to be slowing the release of sulphur from the ocean and into the atmosphere. This will directly increase the amount of atmospheric warming, as gaseous sulphur contributes to the reflection of solar radiation back into space.

But it has to end there, right?! There can’t possibly be more problems. Did you even read the title? There are many more. The last issue we’ll be discussing affects us more directly, as it has to do with our food supply. Fisheries currently generate $195 billion for the US every year, and fish is a key food source for many people worldwide. Fishing stocks have usually been quite predictable and reliable, as certain populations tend to stay in certain areas. But fish populations are beginning to move, flourish, or wither, depending on the species, due to the many effects we have already discussed.

It is estimated that around 70% of fish species are shifting their ranges, according to a major survey led by ecologist Malin Pinsky of Rutgers University. This makes fish stocks much less predictable, and it can have surprising economic and political implications.

Over the past decade, huge amounts of Mackerel began appearing off the coast of Iceland, indicating that the populations were moving further north. Iceland took advantage of this during its financial crisis in 2009 and increased the amount of Mackerel it was catching. This was not taken well by competing fleets in the EU and Norway, who had rights to the majority of the catch and claimed that Iceland’s increased Mackerel haul was affecting their own stocks.

This prompted quite a fierce debate on the science of monitoring fish populations. Parties disagreed on the size of the whole population, whether competing fleets were even catching from the same population, and even what waters should be included in the Mackerel’s range.

Luckily, it would appear that this “Mackerel War” has come to a close, with new fishing quotas being agreed on by all parties involved. But it remains a very real example of how the changing environment of the ocean can affect the world of us landlubbers.

I hope that by now you have a good idea of the problems the ocean is facing, but I’d like to point out that there is much I didn’t mention to make this post a reasonable length. Given the prominent role of the ocean in the climate system I’m surprised we haven’t heard about this in the past, and I encourage you to go and find out more. Our ignorance of what’s going on in the ocean is what allowed things to get this bad, and once we’ve educated ourselves we need to start setting up efforts to better understand and counteract these problems.

Let’s hope this at least gets mentioned in Paris, and that someone there decides that enough is enough.


The Flowers of the Future!

What do I mean by flowers of the future? I mean cyberplants! Researchers working at Linköping University in Sweden have found a way to create a rose plant with electronic circuitry running through its tissue, and the effects and implications are very interesting indeed.

Now the idea of cyberplants is not entirely new, as some research by Michael Strano at the Massachusetts Institute of Technology revealed that spinach chloroplasts can incorporate carbon nanotubes (CNTs) into their structure. The report stated that this boosted the rate of photosynthesis, as the CNTs could absorb wavelengths of light that the chloroplasts could not. So if this has been done before, what makes this new discovery special?

Well, this research is the first example of someone incorporating a working electronic circuit into a plant’s anatomy. This was done with a polymer known as PEDOT, or Poly(3,4-ethylenedioxythiophene) if you’re one of our chem nerds, and the structure can be seen below. This material is an excellent conductor when hydrated and is commonly used in printable electronics, making it an excellent contender for the cyberplant project.

The repeating unit structure of PEDOT. Source:

The researchers tested many materials before they made their choice, but none of them were ideal. Some caused the plant to produce toxic compounds, essentially poisoning it, while others clogged the plant’s water transportation systems. They eventually settled on PEDOT, which didn’t cause any noticeable problems, and found a way to incorporate it into the plant’s stem and leaves. They created the world’s first cyber-rose!

This was done by soaking each component in separate PEDOT solutions, and then manipulating them in some way to cause the polymer to migrate into the plant tissue.

In the case of the stem, natural capillary action pulled the polymer out of solution and into the plant’s vascular tissue. The natural structure of the stem then allowed the polymer to self-assemble into wires up to 10 cm long! The conductivity of these structures was then measured using two gold (Au) probes, and the performance was found to be on par with conventional printed PEDOT circuits, according to Magnus Berggren, one of the team members.

The leaves proved trickier. They were first placed in a mixed solution of PEDOT and cellulose nanofibres, and then a vacuum was applied. This expelled all the air in the leaf tissue, and the PEDOT then moved out of the solution and into the empty space the air left behind. This gave the leaves a very interesting property: their colour shifts between blueish and greenish hues when a voltage is applied.

Now, while this all seems very interesting, some scientists have questioned what the practical implications of this research could be. “It seems cool, but I am not exactly sure what the implication is” says Zhenan Bao, who works with organic electronics at Stanford University in California.

But Berggren suggests that these electronics could provide an alternative to genetic engineering for monitoring and regulating plant behaviour. While the genetic modification of plants is well established, changes to certain traits, such as flowering times, might be too disruptive to an ecosystem if they are permanent, especially if those changes get passed on to other plants in the area. Electronic switches would not carry this risk, and could be easily reversed when needed.

However, if this research progresses to practical applications, the team would have to show that the polymers they use are not harmful to the environment in any way, and in the case of food crops, that the material doesn’t end up in any edible portions of the plant. But this may not be a problem in the future, as the team hopes to make use of biological chemicals to create the circuitry, bypassing any potential environmental and health hazards.

Given the success of their initial study, the team is now collaborating with biologists to develop their research further and investigate any new directions it could take. For example, Berggren is apparently investigating whether these PEDOT devices could be used to develop a system that allows the plant to act as a living fuel cell, a project he has rather amusingly named “flower power”.

Regardless of how well this research pans out in the future, it does have the value of being inherently interesting, a trait that drives a great deal of scientific research. But what really interests me, as is the case with a lot of the stuff I write about, is that this is yet another step closer to the world of science fiction. We’re getting closer people! All we have to do is wait.


The Age of Antibiotics Could Soon be Over

Antibiotic awareness week has been given a whole new meaning this year due to one particularly eye-opening discovery. We have been slowly emptying our armoury of antibiotics for a while now, with few new examples being developed in the past two decades and new infectious diseases being discovered almost every year. We’re also living in a time when bacteria are evolving and becoming increasingly resistant to antibiotics, and it looks like they’ve now breached our last line of defence.

A report in The Lancet Infectious Diseases has just revealed the existence of bacteria with resistance to the most aggressive of our antibiotics; a drug known as Colistin. Colistin has had a rough history, being deemed too toxic for human use not long after its discovery due to the damage it caused to kidney cells. But it made a comeback in the early 2000s when more drug resistant bacteria began to emerge, and kidney damage started to seem like the lesser of two evils. And by 2012, the World Health Organisation had classified Colistin as critically important for human health.

But, unknown to many medical professionals in the West, Colistin was also being used in China. While it was never approved for human use there, understandable considering its toxicity, it was approved for use in animals. It has been known for quite some time that feeding animals low doses of antibiotics fattens them up, and local pig farmers took to using large quantities of Colistin for this very reason.

This near-constant use of Colistin meant that bacteria were being repeatedly exposed to it; long enough for them to learn how to fight back. Colistin resistance has occurred in the past, but the relevant gene was found in the chromosomal DNA, and could not be passed on to non-resistant bacteria. But these guys were a cut above the rest. This time the resistance gene, now dubbed MCR-1, was found on a plasmid: a small, circular loop of DNA that many bacteria possess, which can be passed between cells in a process called horizontal gene transfer. This is outlined in the graphic below.

The process of horizontal gene transfer. Source:
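If you’re code-minded, the key difference between chromosomal and plasmid-borne resistance can be captured in a toy Python sketch (entirely my own illustration, with made-up gene names): during the transfer, only the plasmid’s contents get copied into the recipient.

```python
def horizontal_gene_transfer(donor, recipient):
    """Toy model of horizontal gene transfer: plasmid genes are copied
    into the recipient, while chromosomal genes stay with their owner."""
    recipient = dict(recipient)  # don't mutate the caller's bacterium
    recipient["plasmid"] = recipient["plasmid"] | donor["plasmid"]
    return recipient

# A resistant donor: old-style resistance on the chromosome, MCR-1 on a plasmid.
donor = {"chromosome": {"old_colistin_resistance"}, "plasmid": {"mcr-1"}}
recipient = {"chromosome": set(), "plasmid": set()}

after = horizontal_gene_transfer(donor, recipient)
# Only the plasmid-borne gene spreads; the chromosomal one stays put.
```

In this toy picture, a chromosomal resistance gene dies with its own lineage, while a plasmid-borne one like MCR-1 can hop between unrelated species.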

This means there is now potential for the resistance gene to end up in the DNA of many different species of bacteria, and it has already been found in some known to cause infections in humans, such as E. coli and Klebsiella pneumoniae. Now, this wouldn’t be so bad if the gene could be effectively quarantined, but the researchers report that it is already widespread in southern China. The study found the gene in 15% of meat samples and 21% of animals tested between 2011 and 2014, as well as in 16 of 1,322 samples taken from humans. To make matters worse, there is also some evidence that the gene has managed to spread outside of China into countries such as Laos and Malaysia.

If this gene continues to spread, which is highly likely since reports state that it has an extremely high transfer rate, then we could see the emergence of “pan-resistant bacteria” – bacteria resistant to all known methods of treatment. This is a very frightening prospect for modern medicine, and if MCR-1 combines with other resistance genes, then medicine could be launched back into the dark ages. As Professor Timothy Walsh told BBC News, “At that point if a patient becomes seriously ill, say with E. coli, then there is virtually nothing you can do”.

But the apocalypse is not upon us yet! Although the prospect of the MCR-1 gene going global seems to be a case of when, not if, we still have time to prevent a complete breakdown of modern medicine if we act fast enough. There are even new antibiotics currently being researched, such as Teixobactin, that could help delay the onset of the post-antibiotic era. But this is not something we should rely on, as it is still a long way from being ready for medical use.

This is one hell of a wake-up call, and the authors of the report know it, stating that their findings “emphasise the urgent need for coordinated global action” in the fight against antibiotic resistance. Whether it’s through the discovery of new antibiotics or entirely new methods of treatment, we need to work together to restock our armoury and find new weapons to combat this new breed of superbug. If not, deaths from routine surgeries and minor infections could become commonplace once again due to the lack of treatment options. So let’s hope our scientists are on the case! They have quite the challenge ahead.


vOICe: Helping People See with Sound

A demonstration of the vOICe experiment. Photo credit: Nic Delves-Broughton/University of Bath Source:

It seems like there is an almost constant stream of awesome new technology these days, and there has been a rather fantastic addition! A device is being researched at both the California Institute of Technology (Caltech) in the US and the University of Bath in the UK, with a very noble goal in mind: to build better vision aids for the blind.

Now, it has long been known that blind people often rely on sound as a substitute for sight, with some individuals’ sense of hearing heightened to the point of being able to use echolocation. Well, it turns out that sound can also be designed to convey visual information, allowing people to form a kind of mental map of their environment. This is achieved by a device known as “vOICe”: a pair of smart glasses capable of translating images into sounds.

The device itself consists of a pair of dark glasses with a camera attached, all of which is connected to a computer. The system converts the pixels in the camera’s video feed into a soundscape, mapping vertical location to pitch and brightness to volume. This means that a bright cluster of pixels at the top of the frame will produce a loud, high-pitched sound, and a dark area toward the bottom will give the opposite: a quiet, low-pitched sound.
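To make the mapping concrete, here’s a rough Python sketch of the idea (my own toy reconstruction, not the actual vOICe algorithm; the frequency range and brightness cutoff are arbitrary choices):

```python
import numpy as np

def frame_to_soundscape(frame, f_min=500.0, f_max=5000.0):
    """Map a grayscale frame (rows x cols, brightness 0-1) to tone lists.

    Each column of pixels becomes one time-step of the soundscape:
    row position sets pitch (top = high), brightness sets volume.
    """
    rows, cols = frame.shape
    freqs = np.linspace(f_max, f_min, rows)  # top row -> highest frequency
    soundscape = []
    for col in range(cols):
        column = frame[:, col]
        # Keep only pixels bright enough to be audible; volume = brightness.
        tones = [(float(f), float(b)) for f, b in zip(freqs, column) if b > 0.1]
        soundscape.append(tones)
    return soundscape

# A tiny 3x2 frame with one bright pixel at the top-left:
frame = np.array([[1.0, 0.0],
                  [0.0, 0.0],
                  [0.0, 0.0]])
scape = frame_to_soundscape(frame)
# First time-step: one loud, high-pitched tone; second time-step: silence.
```

The real device sweeps across the frame left to right about once per second, so each column of pixels becomes a moment in time; the sketch above keeps only the pitch/volume part of that mapping.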

But what is really impressive about this technology is that this soundscape appears to be intuitively understood, requiring little to no training! In a test performed by researchers Noelle Stiles and Shinsuke Shimojo at Caltech, blind people with no experience of using the device were able to match shapes to sounds just as well as those who had been trained, with both groups performing 33% better than pure chance. In contrast, when the coding was reversed (high point = low pitch, bright pixels = quiet, etc.), volunteers performed significantly worse. So how did they achieve such an intuitive system?

Well, it began with Stiles and Shimojo working to understand how people naturally map sounds to other senses. Both blind and sighted volunteers were involved in the system’s development, with sighted people asked to match images to sounds, and blind volunteers asked to do the same with textures. The pattern of choices during these trials directly shaped vOICe’s algorithm, and appeared to produce an intuitive result. This seemed to surprise the researchers, who wrote that “the result that select natural stimuli could be intuitive with sensory substitution, with or without training, was unexpected”.

All this had me excited, and itching to learn more. It was then that I found out that the research at the University of Bath further emphasised the importance of having such an intuitive system. Here the researchers claim that some users are exceeding the level of visual performance often achieved with more invasive restoration techniques, such as stem cell implants or prosthetics. While people who receive such surgery are rarely able to make out more than abstract images, some long-term users of vOICe claim to form images in their brains somewhat similar to sight, as their brains become rewired to “see” without the use of their eyes.

Michael J Proulx, an associate professor in the university’s department of psychology, gave the example of a man in his 60s who had been born blind. Proulx reports that he initially thought the idea was a joke, too sci-fi to be real, but “after 1 hour of training, he was walking down the hall, avoiding obstacles, grabbing objects on a table. He was floored by how much he could do with it”. He also reports that after a few weeks of use, some people were able to achieve vision levels of 20/250. To put that into perspective, a short-sighted person without their glasses would be at around 20/400. That’s right: this tech could allow the completely blind to see better than those who are still partially sighted! That’s something to wrap your head around.

But slow down there with your excitement! While this technology is truly revolutionary, it is worth pointing out that there is a huge gulf between distinguishing patterns in a lab environment and using vOICe to actually observe and understand the real world. For example, we don’t know how a busy street, with its already large amount of visual and auditory information, would affect both the device’s signals and how they are interpreted. But there is no denying that this work represents an important step on the road to developing better vision aids, and given that the World Health Organisation estimates a total of 39 million blind people in the world, this technology could bring about a dramatic increase in quality of life across the globe.

But that’s not all this technology could do, as the results are challenging the concept of what being able to “see” actually means. This is illustrated by a quote from Shimojo at Caltech, who mentions that “our research has shown that the visual cortex can be activated by sound, indicating that we don’t really need our eyes to see”. This has profound implications for the field of neuroscience, and has led to another study beginning at the University of Bath to examine exactly how much information is required for a person to “see” in this way. This could lead not only to optimisation of this technology, but to a deeper understanding of how the human brain processes sensory information.

Now I don’t know about you, but I remember when stuff like this was considered to be firmly stuck in the realm of science fiction, and the fact that such talented scientists keep bringing it closer to reality still surprises me. Combine this with an incredible rate of progress, and there really is no way of knowing what futuristic tech they’ll come up with next. This can make keeping up with it all one hell of a challenge, but fear not, my scientific friends! I shall remain here to shout about anything new that comes along.

Sources not mentioned in text:

What’s the deal with GMOs?

Okay… I was hoping I wouldn’t have to write about this any time soon, but with some countries within the EU deciding to ban the cultivation of genetically modified crops, I think the time has come.

A total of 17 European countries announced this ban at the beginning of October, and it exposes just how far Europe has gone in setting itself against the modern scientific consensus. In fact, the decision seems to have been made without considering the science at all, but we’ll get into that later. I think we should begin by educating ourselves on what GMOs actually are.

A GMO (genetically modified organism) can be defined as an organism that has acquired, by artificial means, one or more genes from another species or from another variety of the same species.

Humans have been modifying the genomes of plants and animals for thousands of years through the process of artificial selection (selective breeding), which involves selecting organisms with desirable traits and breeding them so that these characteristics are passed on. An example of this is the “Belgian Blue” cow, which has been selectively bred to have greater muscle mass.

Unfortunately, this is limited to naturally occurring variations of a gene, but genetic engineering allows for the introduction of genes from completely unrelated species. These genes could lead to resistance to certain diseases and pesticides, or to enhanced nutritional content. The list goes on; but why do we need these organisms?

Given that the Food and Agriculture Organization of the UN predicts that we’ll need to produce 70% more food by 2050 to feed the growing population, we will need to find new ways to meet food demands. There are several ways to do this, but options such as increased deforestation and giving up meat consumption to better utilise the required crops are not appealing for a number of reasons.

The more realistic options are investing more in hydroponics (growing crops indoors), which many countries are doing, or growing GM crops. This is because the central idea behind GM crops is to combat problems that threaten food security, such as pests, climate change, or disease. Modifications that remove these problems could allow certain foods to be effectively grown in locations where it was previously not possible, as well as improving the chances of the crops surviving in harsh conditions.

Despite these benefits, there are still many controversies surrounding GMOs, such as the unintended spread of modified genes, but this excellent story on clearly outlines many of these problems, as well as pointing out why they are really no cause for concern.

So why do so many people still have a problem with GMOs? The benefits are clearly huge and many of the legitimate concerns have already been addressed. Well, many people seem to believe that GMOs are somehow bad for their health, even poisonous, and that they can damage the environment, despite overwhelming scientific evidence to the contrary.

Some researchers published a paper in the journal “Trends in Plant Science”, arguing that the negative representations of GMOs are popular because they are intuitively appealing. In other words, many people oppose GMOs because it “makes sense” that they would pose a threat. The paper is also very well summarised by one of the authors in an article from Scientific American.

One reason they give is the concept of “psychological essentialism”, which makes us perceive an organism’s DNA as its “essence”. Following this logic, DNA is an unobservable core that causes an organism’s behaviour and determines its identity. This means that, when a gene is transferred between two distantly related species, people are likely to believe that some characteristics of the source organism will emerge in the recipient.

They also report that an opinion survey in the US showed that more than half of the respondents thought that a tomato modified with fish DNA would taste like fish. This is NOT how DNA works.

However, it is worth pointing out that not all criticisms of GMOs are unfounded, as many people are skeptical of how the business world will change with their introduction. It has already been reported that the US Supreme Court has ruled in favour of Monsanto’s claim to patent GMO seeds, as well as its ability to sue farmers whose fields become contaminated with Monsanto products, whether accidentally or not.

Now, I don’t feel I can safely comment on all of this, as the world of business is not something I am educated in. But, while Monsanto’s business practices may be ethically questionable, they are not the only company involved in GMO research and distribution. Many academic institutions and non-profit organisations are also involved, and such groups are responsible for the introduction of Golden Rice, a GMO developed purely to benefit society.

Knowing this, to dismiss all such organisms simply because one questionable company produces some of them is extremely narrow-minded. Another valid criticism is that it is not possible to guarantee that future GMOs will be safe, and that each organism should be evaluated individually. I would agree with this, as a newly created GMO may well have problems associated with it.

But these will be addressed in the research phase, in the same way that a newly synthesised drug has to undergo trials to determine and correct problems, and any product that gets a commercial release will have been thoroughly evaluated by the scientists involved with the research. The problem appears when people claim the gene editing techniques themselves are dangerous, which has no scientific grounding whatsoever.

So, now we come back to the problem of the European Union’s decision to ban GM crops. It is worth noting that this ban doesn’t apply to scientific research, so these countries are clearly not opposed to the development of new GMOs, just the cultivation of ones that have already been proven safe. Sounds confusing, right? I should also point out that this decision was made without consulting the scientific advisor of the European Commission (EC), because the EC currently doesn’t have one!

Last November, the EC’s president, Jean-Claude Juncker, chose not to appoint a chief scientific advisor due to lobbying from Greenpeace and other environmental groups, who seemed to have a problem with what the previous advisor was saying about GMOs. Ignoring the fact that the advisor’s comments reflected the scientific consensus, they wrote: “We hope that you as the incoming commission president will decide not to nominate a chief scientific advisor”.

This is extremely worrying, especially since the scientific consensus on the safety of genetic engineering is as solid as that which underpins human-caused climate change. This is especially strange as Greenpeace appears to support the consensus on climate change. You can’t pick and choose which science you agree with; you either support science, or you don’t.

I would assume this ban is due to the negative public opinion of GMOs, and the idea that all scientists that advocate for them have somehow been “bought” by large corporations like Monsanto. Speaking as someone who has experience with scientists and scientific research, I can say that the process has no agenda. Yes, the researchers may prefer one outcome to another, but if the evidence contradicts what they want to find, then they accept that. To do otherwise goes against the very nature of science, and given the amount of work and studying that goes into such a career, very few people go into science without a great deal of passion and respect for the process. You don’t have to trust the corporations, but you should trust the science.

Sources not mentioned in text:

A New Kind of Nature Reserve!

A Family of Elk. Credit: Valeriy Yurko. Source:

There’s no question that the Chernobyl disaster in 1986 was… well… a disaster! Due to a flawed reactor design and inadequately trained personnel, a huge explosion occurred and there were many fires, with at least 5% of the radioactive reactor core being released into the atmosphere and downwind. Both the explosion and the resulting radiation killed around 30 people within a few weeks of the accident, and the local wildlife in the area was all but destroyed at the time. People were evacuated and relocated, an effort still ongoing, and the area was left abandoned. But it seems, some three decades later, that wildlife is finding its way back.

It’s no secret that plant-life has been flourishing in the immediate vicinity of the explosion for years, as some drone footage filmed by Danny Cooke revealed in 2014. But a new study published in the journal “Current Biology” seems to indicate that some mammalian wildlife is starting to call the area home once again. While previous studies showed a large reduction in the wildlife population, this study not only shows that numbers have increased, but that some species are actually thriving in the now human-free 4,200 km² exclusion zone. Measurements were taken both from aerial surveys and by assessing the number and density of animal tracks in the area, and they show that the populations of Elk, Roe Deer, Red Deer, and Wild Boar are similar to those of four uncontaminated nature reserves in the region, while the Wolf population is around seven times higher! This would suggest that there are abundant mammal communities in the area regardless of the potential effects of radiation.

The study determined this by proposing and testing three different hypotheses.

  1. Mammal abundances are negatively correlated with levels of radioactive contamination in the area.
  2. Densities of large mammals are suppressed in the exclusion zone compared to four uncontaminated nature reserves in the area.
  3. Density of large mammals declined in the period between 1 and 10 years after the accident.

In all three cases, the hypothesis was rejected by the evidence the research group collected, with a special note that the huge increase in the Wolf population was likely due to the large amounts of prey now available. The paper also reports that “this represents unique evidence of wildlife’s resilience in the face of chronic radiation stress”. But I feel this claim is not fully supported by their evidence, as they also point out that the data cannot separate the possible positive effects of a human-free environment from the potential negative effects of radiation. While determining that for sure would require more studies focussing on each factor, it seems that removing human activity from an area does animal populations more good than radiation does them harm. As Jim Smith, a professor of environmental science at the University of Portsmouth, told The Guardian, “What we do, our everyday inhabitation of an area – agriculture, forestry – they’ve damaged wildlife more than the world’s worst nuclear accident”, adding that “It doesn’t say that nuclear accidents aren’t bad, of course they are. But it illustrates that the things we do every day, the human population pressure, damages the environment. It’s kind of amazing isn’t it.”

Amazing it certainly is! I mean, it’s quite a kick in the teeth to learn that simply the presence of humans in an area can have more persistently damaging effects on the environment than chronic radiation exposure. But don’t get too depressed! As Timothy Mousseau, a professor of biological sciences at the University of South Carolina, has pointed out to the BBC, “This study only applies to large animals under hunting pressure, rather than the vast majority of animals – most birds, small mammals, and insects – that are not directly influenced by human habitation.” So maybe we aren’t all that bad. Still, it’s something to think about.

In any case, I think we can all agree that it’s nice to see that we essentially have a new form of nature reserve, albeit a radioactive one. But if and when future studies truly identify the scope of the negative impact we humans have on the environment, I think it might be time to re-think how we behave on this planet. Don’t you?

Sources not mentioned in text:

DNA is Unstable! Luckily your Cells can handle that.

Another Nobel Prize story?! DAMN RIGHT! This time it’s the prize for chemistry, and Tomas Lindahl, Paul Modrich, and Aziz Sancar will collectively bask in the glory for their outstanding work in studying the mechanisms of DNA repair. Given the billions of cell divisions that have occurred in your body between conception and today, the DNA that is copied each time remains surprisingly similar to the original created in the fertilised egg you once were. Why is that strange? Well, from a chemical perspective it should be impossible, with all chemical processes being subject to random errors from time to time. On top of that, DNA is subjected to damaging radiation and highly reactive substances on a daily basis. This should have led to chemical chaos long before you even became a foetus! Now, I would hope that’s not the case for you, so how do our cells prevent this descent into madness? I’ll tell you! It’s because DNA is constantly monitored by various proteins that all work to correct these errors. They don’t prevent the damage from occurring; they just hang around waiting for something to fix, and all three of the winning scientists contributed to our understanding of how our cells achieve this. So! Where do we begin?

A good place to start would be a brief description of the structure of DNA, as this will make things much clearer when we start discussing the research. DNA is primarily a chain of nucleotides, which are themselves made up of three components: a deoxyribose sugar, a phosphate group, and a nitrogenous base. These components are shown bonded together in Figure 1. It is also worth noting that there are four possible bases, each with a slightly different structure, and the one shown in the image is specifically Adenine. The others are known as Thymine, Cytosine, and Guanine, and all attach to the sugar in the same place. The two negative charges on the phosphate group allow it to form another bond to the adjacent nucleotide, and this continues on to form a long chain. Two separate chains are then joined together as shown in Figure 2, and voila! A molecule of DNA is formed!
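If it helps to see the structure laid out another way, here is a minimal data-structure sketch of the chain described above (a hypothetical Python illustration of my own, not anything from the research):

```python
from dataclasses import dataclass

@dataclass
class Nucleotide:
    """One link in the DNA chain: a base attached to a sugar-phosphate unit."""
    base: str                  # "Adenine", "Thymine", "Cytosine", or "Guanine"
    sugar: str = "deoxyribose"
    phosphate: bool = True     # the phosphate bonds this unit to the next one

# A short single strand: nucleotides chained via their phosphate groups.
strand = [Nucleotide(b) for b in ("Adenine", "Thymine", "Guanine")]
```

A second chain would then line up base-by-base against this one (Adenine opposite Thymine, Cytosine opposite Guanine) to give the full two-stranded molecule.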

Figure 1: The basic components of DNA. Source:
Figure 2: Representation of how the two chains of Nucleotides bond together to form a molecule of DNA. Source:
Figure 3: A comparison of Cytosine and its Methylated equivalent. Source:

Now that we have a basic understanding of the structure of DNA, the research should make a hell of a lot more sense, and it begins with Tomas Lindahl. In the 1960s, Lindahl found himself asking a question: how stable is our DNA, really? At the time, the general consensus among scientists was that it was amazingly resilient. I mean… how else could it remain so constant? If genetic information was in any way unstable, multicellular organisms like us would never have come into existence. Lindahl began his experiments by working with RNA, another molecule found in our cells with a lot of structural similarities to DNA. Surprisingly, the RNA rapidly degraded during these experiments. Now, it was known that RNA is the less stable of the two molecules, but if it was destroyed so easily and quickly, could DNA really be all that stable? Continuing his research, Lindahl demonstrated that DNA does, in fact, have limited chemical stability, and can undergo many reactions within our cells. One such reaction is methylation, in which a CH3 (methyl) group is added to one of the bases in the DNA strand. The difference this causes is shown in Figure 3, and the reaction can occur with or without the aid of an enzyme. It will become relevant later on, as will the fact that it changes the shape of the base, affecting how other proteins can bind to it. All of these reactions can alter the genetic information stored in DNA, and if they were allowed to persist, mutations would occur much more frequently than they actually do.

Realising that these errors had to be corrected somehow, Lindahl began investigating how DNA was repaired, and by 1986 he had pieced together a molecular image of how “base excision repair” functions. The process involves many enzymes (and I don’t have the time or patience to describe them all), but a certain class known as “DNA glycosylases” are what actually break the bond between the defective base and the deoxyribose sugar, allowing the base to be removed. Our cells contain many enzymes of this type, each of which targets a different type of base modification. Several more enzymes then work together to fill the gap with the correct, undamaged base, and there we have it! A mutation has been prevented. To help you visualise all this, you’ll find a graphical representation below in Figure 4.
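As a purely illustrative sketch (my own, in Python; real base excision repair is enzymatic chemistry, not string manipulation), the logic can be caricatured like this, with damaged bases marked in lowercase:

```python
# Watson-Crick pairing: Adenine-Thymine, Cytosine-Guanine.
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def base_excision_repair(strand, template):
    """Toy model of base excision repair.

    Damaged bases are marked lowercase. A 'glycosylase' step excises them,
    leaving a gap; a 'polymerase' step then fills the gap with the base
    complementary to the intact partner (template) strand.
    """
    repaired = []
    for base, partner in zip(strand, template):
        if base.islower():                   # glycosylase spots the damage
            repaired.append(PAIRS[partner])  # fill in the correct base
        else:
            repaired.append(base)
    return "".join(repaired)

# 'c' marks a damaged Cytosine; the partner strand still reads G there,
# so the repair machinery knows a C belongs in the gap.
print(base_excision_repair("ATcG", "TAGC"))  # → ATCG
```

The key point the sketch captures is that the intact partner strand is what lets the cell restore the original information after the damaged base is cut out.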

Figure 4: Graphical representation of the process of Base Excision Repair. Source:

But the science doesn’t end there, folks! Remember, there were three winners, the second of whom is Aziz Sancar, who discovered another method of DNA repair. This one is called “nucleotide excision repair”, and involves the removal of entire sets of nucleotides, rather than individual bases. Sancar’s interest was piqued by one phenomenon in particular: when bacteria are exposed to deadly doses of UV radiation, they can suddenly recover if exposed to visible blue light. This was termed “photoreactivation” for… obvious reasons. He was successful in identifying and isolating the genes and enzymes responsible, but it later became clear that bacteria had a second repair mechanism that didn’t require exposure to light of any kind. But Sancar wasn’t about to let these bacteria out-fox him and, after more investigation, he managed to identify, isolate, and characterise the enzymes responsible for this process as well. The bacteria were no match for his chemical prowess!

“But how does it work?!” I hear you shout. Well, calm the f**k down and I’ll tell you! UV radiation can be extremely damaging, and can cause two adjacent Thymine bases in a DNA strand to bind directly to each other, which is WRONG! A certain endonuclease enzyme, known as an “excinuclease”, is aware of this wrongness, and decides that the damage must be fixed. It does this by making an incision on each side of the defect, and a fragment roughly 12 nucleotides long is removed. DNA polymerase and DNA ligase then fill in and seal the gap, respectively, and now we have a healthy strand of bacterial DNA! Sancar later investigated this repair mechanism in humans in parallel with other research groups, and while it is much more complicated, involving many more enzymes and proteins, it functions very similarly in chemical terms. You want a picture to make it easier? You’ll find it below in Figure 5!

Figure 5: Graphical representation of Nucleotide Excision Repair. Source:

The final recipient of the Nobel Prize this year was Paul Modrich, who identified YET ANOTHER repair system (there are loads, you know), which he named the “mismatch repair” mechanism. Early in his career, Modrich was examining various enzymes that affect DNA, eventually focussing on “Dam methylase”, which couples methyl groups to DNA bases (I TOLD YOU THAT REACTION WOULD BE RELEVANT!). He showed that these methyl groups could behave as labels, helping restriction enzymes cut the DNA strand at the right location. But, only a few years earlier, another scientist, Matthew Meselson, had suggested that they also indicate which strand to use as a template in DNA replication. Working together, the two scientists synthesised a virus with DNA that had incorrectly paired bases, and methylated only one of the two DNA strands. When the virus infected the bacteria, injecting its DNA, the mismatched pairs were corrected by altering the unmethylated strand. It would appear that the repair mechanism recognised the defective strand by its lack of methyl groups. Does it work that way in humans? Probably not. Modrich did manage to map the mismatch repair mechanism in humans, but DNA methylation serves many other functions in human cells, particularly those to do with gene expression and regulation. It is thought that strand-specific “nicks” (the lack of a bond between a phosphate group and a deoxyribose sugar) or ribonucleotides (nucleotide components of RNA) present in DNA may direct repair, but the mechanism remains to be found.
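The logic of that experiment can also be caricatured in a few lines of Python (again, my own toy sketch, not the real enzymology): treat the methylated strand as the trusted template, and rewrite any mismatched base on the unmethylated strand.

```python
# Watson-Crick pairing: Adenine-Thymine, Cytosine-Guanine.
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def mismatch_repair(methylated, unmethylated):
    """Toy mismatch repair: the methylated strand is trusted as the
    template, so any base on the unmethylated strand that fails to pair
    with it is replaced by the correct complementary base."""
    repaired = []
    for m_base, u_base in zip(methylated, unmethylated):
        if PAIRS[m_base] != u_base:          # mismatch detected
            repaired.append(PAIRS[m_base])   # correct the new strand
        else:
            repaired.append(u_base)
    return "".join(repaired)

# The methylated strand ATCG should pair with TAGC; the new, unmethylated
# strand has a mismatched A in the third position.
print(mismatch_repair("ATCG", "TAAC"))  # → TAGC
```

Notice that the repair only ever edits the unmethylated strand, which is exactly what Modrich and Meselson observed in their methylated-virus experiment.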

Figure 6: Structure of Olaparib. Source:

But why should we care? Granted, it is nice to know this stuff (at least I think so), but what can this information be used for? Well, it actually has applications within the world of medicine, as errors in repair mechanisms can often lead to cancer. In many forms of cancer these mechanisms have been at least partially turned off, leaving the cells heavily reliant on the mechanisms that remain active. As we mentioned earlier, a lack of these mechanisms leads to chemical chaos, which would cause the cancer cells to simply die. This has led to drugs designed to inhibit the remaining repair systems, slowing down or stopping cancer growth entirely! One such drug is Olaparib, whose structure you can see in Figure 6. This drug functions by inhibiting two specific proteins (PARP-1 and PARP-2), which are integral in detecting certain flaws in replicated DNA and directing repair proteins to the site of damage. Cancer cells treated with this drug have been shown to be more sensitive to UV radiation, making one form of treatment much more effective.

And with that, we bring our Nobel Prize stories for this year to an end! I think it’s safe to say that the work described here deserved a prize of some sort, as it not only takes a lot of skill and dedication, but it has led to new medical treatments and a MUCH greater understanding of how our DNA behaves. Have you enjoyed our time spent on the science of the Nobel Prize? DAMN RIGHT YOU HAVE. O_O