Category: Medicine

The CRISPR Revolution: How will Gene Editing change our World?

CRISPR will offer unprecedented control in gene editing. Image Source: http://gizmodo.com/everything-you-need-to-know-about-crispr-the-new-tool-1702114381

The age of gene editing is upon us! Or it will be soon thanks to the revolutionary new technology known as CRISPR. I would be VERY surprised if you haven’t at least heard of it by now, especially when you consider the attention it gets from the science media. Attention that is very understandable once you start looking at exactly what this technology can do, and what that potentially means.

The story began in 2013, when some researchers claimed that they had used CRISPR-Cas9 to successfully slice the genome in human cells at sites of their choosing. This understandably triggered a massive ethics debate which is still going on today. A huge amount of the conversation focussed on how it could be used to fight genetic diseases or even edit human embryos, but there are many more potential applications. As the debate continued, researchers set about editing the genomes of many other organisms, including plants and animals, and even exploring its potential for studying specific gene sequences. The range of applications is truly remarkable.

Some claim that the real revolution right now is in the lab, where CRISPR has made the study of genetics significantly easier. There are two main components to the CRISPR-Cas9 system: a Cas9 enzyme that acts as a pair of molecular scissors, cutting through the DNA strand, and a small RNA molecule that directs the system to a specific point. Once the cut is made, the cell’s own DNA repair mechanisms will often mend it, but not without making the occasional mistake.
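
Since the guide RNA essentially works like a search query run against the genome, a toy illustration can help make that concrete. The Python sketch below scans a DNA string for a perfect 20-letter match to a guide sequence followed by an “NGG” PAM and reports where Cas9 would cut; the sequences are invented, and real targeting tolerates mismatches and involves far more biochemistry than this simple string match.

```python
# Toy model only: CRISPR targeting reduced to string matching.
# Real Cas9 binding involves RNA-DNA hybridisation and mismatch tolerance.

def find_cut_sites(genome, guide):
    """Return indices where Cas9 would cut, assuming a perfect match to the
    guide followed by an 'NGG' PAM, with the cut ~3 bp upstream of the PAM."""
    sites = []
    for i in range(len(genome) - len(guide) - 2):
        target = genome[i:i + len(guide)]
        pam = genome[i + len(guide):i + len(guide) + 3]
        if target == guide and pam[1:] == "GG":   # 'NGG' PAM: any base, then GG
            sites.append(i + len(guide) - 3)      # cut site 3 bp before the PAM
    return sites

# Invented sequences, purely for illustration.
genome = "ATGCCGGTTAGCTTGACCTGAAGTCCATGGTGGCAATTC"
guide = "GCTTGACCTGAAGTCCATGG"   # hypothetical 20-nt guide sequence
print(find_cut_sites(genome, guide))   # -> [27]
```

In a real cell, it is the error-prone repair of that cut, rather than the cut itself, that disrupts the gene, which is the effect described in the next paragraph.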

Even a small error during DNA repair can completely alter the structure of the protein it codes for, or stop its production altogether. By exploiting these traits and errors, scientists can study what happens to a cell or organism when a specific gene or protein is altered. Such a level of control will likely make these studies less prone to error, and lead to a much better understanding of the role played by specific genes and proteins.

But it doesn’t stop there! There exists a second repair mechanism that mends the DNA strand according to a template, normally another copy of the sequence found in the cell. If researchers were to provide a template of their own design instead, then they could potentially insert nearly any DNA sequence at whatever site they desire. A prospect that would allow genomes to not just be edited, but actively designed.

That idea may sound somewhat futuristic at this point, but in reality it’s already being put to use. Due to the relative precision and ease that CRISPR offers, scientists have already made a start on editing the genes of animals for applications ranging from agriculture to the revival of extinct species. Some CRISPR modified animals are even being marketed as pets! As you can imagine, regulators are still working out how to deal with such animals, as there are some obvious safety and ecological concerns, but that hasn’t stopped the science from happening.

Disease resistance is one of the more popular uses for CRISPR, and it can provide a variety of agricultural and ecological benefits. For example, there is hope that such an application could help stem the huge loss of honey bee populations around the world. This loss can in part be attributed to disease and parasites, but biotechnology entrepreneur Brian Gillis may have found a solution. He has been studying the genomes of so-called “hygienic bees”, which get their name from their obsessive hive cleaning habits and removal of sick or infested larvae.

Gillis’ idea is that, if the genes responsible for this behaviour can be identified, they could then be edited into the genomes of other populations to significantly improve hive health. But whether or not this can be done remains to be seen, as no such genes have been found as of yet, and the roots of the behaviour may prove to be much more complex. At this point we’ll just have to wait and see.

Another potential application, one that I personally find much more interesting, is the revival of extinct species. Around 4,000 years ago the woolly mammoth went extinct, an event that was at least partially due to hunting by humans. Well, it now looks like we might be able to make up for that mistake! This is thanks to scientist George Church from Harvard Medical School in Boston, who has plans to use CRISPR to bring back this long-lost species. He hopes to transform some endangered Indian elephants into a breed of cold-resistant elephants that will closely resemble the mammoth, and release them into a reserve in Siberia where they will have space to roam.

But the process of editing, birthing, and then raising these mammoth-like elephants is no easy task. The first problem is how to go from an edited embryo to a developed baby, and Church has said it would be unethical to implant the embryos into endangered elephants for the sake of an experiment. Since that option is off the table, his lab is currently looking into the possibility of an artificial womb: a device that does not currently exist.

It’s worth pointing out that I’ve just barely scratched the surface of what CRISPR can do for animals, let alone the many other organisms it can be applied to. I could very easily write an entire post about it to be honest, but there is one final point that definitely deserves some attention. We’ve already seen how amazingly versatile CRISPR is, and it stands to reason that, if you can edit the genomes of animals in this way, you can almost certainly do the same to humans.

As I’m sure you can imagine, there is a very heated debate about how it could and should be used to modify the genomes of human embryos. One key argument in favour is that many currently available technologies already allow parents to do this. These include prenatal genetic screening to check for conditions like Down syndrome, and in-vitro fertilisation allowing parents to select embryos that don’t have certain disease-causing mutations. One could say that direct genome editing is simply the next step in technology of this nature.

On the other hand, one needs to consider what genome editing would mean for society as a whole. For example, by allowing parents to edit out traits they see as debilitating, we could potentially create a less inclusive society. In such a world even the tiniest of flaws might be seen as a huge disadvantage, with everyone being subjected to much harsher judgement. Would that be beneficial for the human race? Unfortunately that’s not a question we can answer, but it doesn’t sound like a pleasant world to live in.

Whether or not you’re in favour of the human race dictating the genetics and characteristics of future generations seems to be a matter of opinion right now, but it’s certainly not fair to say that either side of the debate outweighs the other. To me, there doesn’t seem to be an obvious answer. On the one hand, we have the chance to truly improve the human race by ensuring that we continue to improve and adapt as time goes on, assuming of course that we have the knowledge to do so. But I cannot say for sure what type of society that would create, or if it’s one I’d really like to live in.

Regardless of your opinion on CRISPR and gene editing, you can’t deny that this new technology has the potential to completely change our world and our society. Given that it can improve our understanding of genetics and allow us to physically alter the DNA of living creatures, one could easily describe it as the beginning of a genetic revolution. We’ll have to just wait and see if it will be put to use in the ways I’ve explored here, but it’s certainly something I will be keeping an eye on. Hopefully I got you interested enough to do the same.


New Evidence of Internet Gaming Disorder

Various consoles and controllers. Source: http://www.telegraph.co.uk/news/science/science-news/9088262/Playing-video-games-improves-eyesight.html

Internet gaming is still a relatively new concept, yet it is one we are already very familiar with as it bounced into the limelight over the past decade. And I really mean it when I say “bounced”, with the market in China alone being estimated to be worth $12 billion!

There is also a growing amount of literature looking at how video games can affect both our physical and mental health, and it looks like online gaming may have brought about a new kind of mental illness.

This new condition is known as “Internet Gaming Disorder” (IGD), and it’s more than a simple enjoyment of online games. People with IGD play to the detriment of other areas of their life, neglecting their health, school work, even their family and friends. They also experience withdrawal symptoms if they are prevented from getting their fix.

All that being said, the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition (DSM-5) does not currently list IGD, stating that it’s a “condition warranting more clinical research” before it can be included. Well, this new research might be what was asked for, as it provides new evidence of brain differences between people who do and do not have the disorder.

The study participants, all adolescent males between 10 and 19 years of age, were screened in South Korea where online gaming is an even greater social activity than it is in the US. In fact, most research on this matter comes from young males all around Asia since it’s where the disorder is most commonly found. The Korean government also supported the research, hoping to be able to identify and treat addicts.

The research was a collaboration between the University of Utah School of Medicine and Chung-Ang University in South Korea, and was published online in Addiction Biology. It involved taking MRI scans of all participants, 78 of whom were seeking treatment for IGD and 73 who were not.

What they found was that participants suffering from IGD showed hyperconnectivity between several pairs of brain networks, and you can find a list of all the pairs here. Some of these changes could help gamers respond to new information, whereas others are associated with being easily distracted and having poor impulse control.

One of the potentially beneficial changes was improved coordination between areas that process vision or hearing and the Salience Network. This is the area of the brain responsible for focussing a person’s attention on important events and preparing them to react. You can probably see why that would be useful to an online gamer, allowing them to dodge a hail of bullets or react to a charging foe.

According to author Jeffrey Anderson, M.D., Ph.D., this could lead to “a more robust ability to direct attention towards targets and recognise novel information in the environment”, and “could essentially help someone to think more efficiently”. But without follow-up studies to determine if performance is actually improved, this is only a hypothesis.

A more worrying find was that participants with IGD showed weaker coordination between the Dorsolateral Prefrontal Cortex and the Temporoparietal Junction than those without the disorder. These same changes are seen in patients with schizophrenia, Down syndrome, and autism, as well as in people with poor impulse control. It is thought that this could also lead to increased distractibility.

But despite all these findings it is currently unclear if chronic video game playing causes these changes in the brain, or if people with these differences are drawn to the gaming world. Much more research will be required before that question can be answered.

So should you spend less time in the virtual world of video games? Well, at this point we don’t really know. It might even be good for you, but there might also be more drawbacks than benefits. Either way, this is an area of research that is continuing to grow, and it’s certainly worth keeping an eye on. I know I will be.


The Age of Antibiotics Could Soon be Over

Antibiotic awareness week has been given a whole new meaning this year due to one particularly eye-opening discovery. We have been slowly emptying our armory of antibiotics for a while now, with few new examples being developed in the past two decades and new infectious diseases being discovered almost every year. We’re also living in a time when diseases are evolving and becoming increasingly resistant to antibiotics, and it looks like they’ve now breached our last line of defense.

A report in The Lancet Infectious Diseases has just revealed the existence of bacteria with resistance to the most aggressive of our antibiotics; a drug known as Colistin. Colistin has had a rough history, being deemed too toxic for human use not long after its discovery due to the damage it caused to kidney cells. But it made a comeback in the early 2000s when more drug resistant bacteria began to emerge, and kidney damage started to seem like the lesser of two evils. And by 2012, the World Health Organisation had classified Colistin as critically important for human health.

But, unknown to many medical professionals in the West, Colistin was also being used in China. While it was never approved for human use, understandable considering its toxicity, it was approved for use in animals. It has been known for quite some time that feeding animals low doses of antibiotics fattens them up, and local pig farmers took to using large quantities of Colistin for this very reason.

This near-constant use of Colistin meant that bacteria were being repeatedly exposed to it; long enough for them to learn how to fight back. Colistin resistance has occurred in the past, but the relevant gene was found in the chromosomal DNA, and could not be passed on to non-resistant bacteria. But these guys were a cut above the rest. This time the mutation, now dubbed MCR-1, occurred in a plasmid: a circular loop of DNA that many bacteria possess, which can be passed on in a process called horizontal gene transfer. This is outlined in the graphic below.

The process of horizontal gene transfer. Source: http://www.bbc.co.uk/news/health-34857015

This means there is now potential for the resistance gene to end up in the DNA of many different species of bacteria, and it has already been found in some known to cause infections in humans, such as E. coli and Klebsiella pneumoniae. Now, this wouldn’t be so bad if the gene could be effectively quarantined, but the researchers report that it is already widespread in southern China. The study found that the gene was present in 15% of meat samples and 21% of animals tested between 2011 and 2014, as well as in 16 of 1,322 samples taken from humans. To make matters worse, there is also some evidence that the gene has managed to spread outside of China into countries such as Laos and Malaysia.

If this gene continues to spread, which is highly likely since reports state that it has an extremely high transfer rate, then we could see the emergence of “pan-resistant bacteria” – bacteria resistant to all known methods of treatment. This is a very frightening prospect for modern medicine, and if MCR-1 combines with other resistance genes, then medicine could be launched back into the dark ages. As Professor Timothy Walsh told BBC News, “At that point if a patient becomes seriously ill, say with E. coli, then there is virtually nothing you can do”.

But the apocalypse is not upon us yet! Although the prospect of the MCR-1 gene going global seems to be a case of when not if, we still have time to prevent a complete breakdown of modern medicine if we act fast enough. There are even new antibiotics that could help delay the onset of the post-antibiotic era, such as Teixobactin, that are currently being researched. But this is not something we should rely on, as it is still a long way from being ready for medical use.

This is one hell of a wake-up call, and the authors of the report know this, stating that their findings “emphasise the urgent need for coordinated global action” in the fight against antibiotic resistance. Whether it’s through the discovery of new antibiotics or entirely new methods of treatment, we need to work together to restock our armory and find new weapons to combat this new breed of superbug. If not, deaths from routine surgeries and minor infections could become commonplace once again due to the lack of treatment options. So let’s hope our scientists are on the case! They have quite the challenge ahead.


vOICe: Helping People See with Sound

A demonstration of the vOICe experiment. Photo credit: Nic Delves-Broughton/University of Bath Source: http://www.theguardian.com/society/2014/dec/07/voice-soundscape-headsets-allow-blind-see

It seems like there is an almost constant stream of awesome new technology these days, and there has been a rather fantastic addition! A device is being researched at both the California Institute of Technology (Caltech) in the US and the University of Bath in the UK, with a very noble goal in mind: to build better vision aids for the blind.

Now, it has long been known that blind people often rely on sound as a substitute for sight, with some individuals’ sense of hearing heightened to the point of being able to use echolocation. Well, it turns out that sound can also be designed to convey visual information, allowing people to form a kind of mental map of their environment. This is achieved by the device known as “vOICe”: a pair of smart glasses capable of translating images into sounds.

The device itself consists of a pair of dark glasses with a camera attached, all of which is connected to a computer. The system can then convert the pixels in the camera’s video feed into a soundscape that maps brightness and vertical location to an associated pitch and volume. This means that a bright cluster of pixels at the top of the frame will produce a loud sound with a high pitch, and a dark area toward the bottom will give the opposite; a quiet sound with a low pitch.
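
To make that mapping a little more concrete, here is a rough Python sketch of the same idea: the frame is scanned from left to right through time, the row of each pixel sets the pitch of a tone, and its brightness sets the volume. The frequency range, scan duration, and logarithmic spacing are my own illustrative choices, not the actual vOICe parameters.

```python
# Illustrative sketch of an image-to-sound mapping in the spirit of vOICe.
# Parameter values are assumptions, not the real device's settings.
import numpy as np

def image_to_soundscape(image, duration=1.0, sample_rate=44100,
                        f_low=200.0, f_high=5000.0):
    """image: 2D array of brightness values in [0, 1], row 0 = top of frame."""
    n_rows, n_cols = image.shape
    samples_per_col = int(duration * sample_rate / n_cols)
    freqs = np.geomspace(f_high, f_low, n_rows)    # top rows -> highest pitch
    t = np.arange(samples_per_col) / sample_rate
    audio = []
    for col in range(n_cols):                      # left-to-right scan in time
        brightness = image[:, col][:, None]        # brightness -> loudness
        tones = brightness * np.sin(2 * np.pi * freqs[:, None] * t)
        audio.append(tones.sum(axis=0))
    audio = np.concatenate(audio)
    return audio / (np.abs(audio).max() + 1e-9)    # normalise to [-1, 1]

# A single bright dot near the top-left becomes a brief, loud, high-pitched
# tone early in the sweep, matching the description above.
frame = np.zeros((64, 64))
frame[5, 8] = 1.0
soundscape = image_to_soundscape(frame)
```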

But what is really impressive about this technology is that this soundscape appears to be intuitively understood, requiring little to no training at all! In a test performed by researchers Noelle Stiles and Shinsuke Shimojo at Caltech, blind people with no experience of using the device were able to match shapes to sounds just as well as those who had been trained, with both groups performing 33% better than pure chance. In contrast, when the coding was reversed (high point = low pitch, bright pixels = quiet etc.) volunteers performed significantly worse. So how did they achieve such an intuitive system?

Well, it began with Stiles and Shimojo working to understand how people naturally map sounds to other senses. Both blind and sighted volunteers were involved in the system’s development, with sighted people being asked to match images to sounds, and blind volunteers being asked to do the same with textures. The pattern of choices during these trials directly shaped vOICe’s algorithm, and appeared to produce an intuitive result. This seemed to be a surprise to the researchers, as they wrote that “the result that select natural stimuli could be intuitive with sensory substitution, with or without training, was unexpected”.

This information successfully managed to get me excited, and already had me itching to learn more. It was then that I found out that the research at the University of Bath further emphasised the importance of having such an intuitive system. Here the researchers claim that some users are exceeding the level of visual performance often achieved by more invasive restoration techniques, such as stem cell implants or prosthetics. While people who receive such surgery are rarely able to make out more than abstract images, some long-term users of vOICe claim to form images in their brain somewhat similar to sight, as their brains become rewired to “see” without use of their eyes.

Michael J Proulx, an associate professor at the university’s department of psychology, gave the example of a man in his 60s who had been born blind. Proulx reports that he initially thought the idea was a joke, too sci-fi to be real, but “after 1 hour of training, he was walking down the hall, avoiding obstacles, grabbing objects on a table. He was floored by how much he could do with it”. He also reports that after a few weeks of use, some people were able to achieve levels of vision of 20/250. To put that into perspective for you, a short-sighted person who removed their glasses would have a level around 20/400. That’s right, this tech could allow the completely blind to see better than those who are still partially sighted! That’s something to wrap your head around.

But slow down there with your excitement! While this technology is truly revolutionary, it is worth pointing out that there is a huge gulf between distinguishing patterns in a lab environment and using vOICe to actually observe and understand the real world. For example, we don’t know how a busy street, with an already large amount of visual and auditory information, would affect both the device’s signals and how they are interpreted. But there is no denying that this work represents an important step on the road to developing better vision aids, and given that the World Health Organisation estimates a total of 39 million blind people in the world, this technology could bring about a dramatic increase in quality of life across the globe.

But that’s not all this technology could do, as the results are challenging the concept of what being able to “see” actually means. This is illustrated by a quote from Shimojo at Caltech, who mentions that “our research has shown that the visual cortex can be activated by sound, indicating that we don’t really need our eyes to see”. This has profound implications for the field of neuroscience, and has led to another study beginning at the University of Bath to examine exactly how much information is required for a person to “see” in this way. This could not only lead to optimisation of this technology, but to a deeper understanding of how the human brain processes sensory information.

Now I don’t know about you, but I remember when stuff like this was considered to be firmly stuck in the realm of science fiction, and the fact that such talented scientists keep bringing it closer to reality still surprises me. Combine this with an incredible rate of progress, and there really is no way of knowing what futuristic tech they’ll come up with next. This can make keeping up with it all one hell of a challenge, but fear not, my scientific friends! I shall remain here to shout about anything new that comes along.


Nobel Prize! Drugs to take on parasites bring home the award!

As you may or may not have heard, the winners of the Nobel Prize for Medicine and Physiology were announced today, and a total of three scientists received this award: Satoshi Ōmura (a Japanese microbiologist), William C. Campbell (an expert on parasite biology), and Youyou Tu (a Chinese medical scientist and pharmaceutical chemist). But more important than who they are, is what they achieved, and each of these great minds has somehow contributed immensely to the treatment of infections caused by some nasty parasites.

Both Ōmura and Campbell contributed to the discovery of a new drug that is extremely effective at killing microfilariae, the larvae of the parasitic worm that causes Onchocerciasis, or River Blindness as it might be known to us laymen. The story begins with Ōmura, described as an expert in isolating natural products, deciding to focus on a particular group of bacteria known as Streptomyces. These bacteria are known to produce many compounds with antibacterial properties when isolated, including Streptomycin, the drug initially used to treat Tuberculosis. Ōmura, using his biological powers, successfully isolated and cultivated new strains of Streptomyces, and from thousands of cultures he selected the 50 best candidates that merited further analysis. One of these turned out to be Streptomyces avermitilis.

Skip forward a bit, and we see that Campbell acquired some of these samples and set about exploring their efficacy against parasites. He was able to show that a component produced by Streptomyces avermitilis was very effective against parasites in both domestic and farm animals, which he then purified and named Avermectin. Given Campbell’s findings, this was initially used as a veterinary drug, but subsequent modification on the molecular level (an addition of only two hydrogen atoms!) gave us the “wonder drug” Ivermectin. This was then shown to out-perform the previously used DEC (Diethylcarbamazine for the chemistry nerds), primarily due to the lack of side effects such as inflammation. This has made the drug extremely safe for human use, allowing it to be administered to patients by non-medical staff and even by individuals in small rural communities with no hospital experience at all (provided some very basic training). This is what makes this drug so special, as it can be safely used in some less developed parts of Africa and South America, where River Blindness is most common and advanced medical care may be unreachable or unaffordable for some individuals.

The other major advancement in this field worthy of the Nobel Prize was the work of Youyou Tu, who developed an effective treatment for another well-known parasitic infection: Malaria! Inspired by traditional Chinese medicine, she identified an interesting extract from the plant Artemisia annua, or as you may know it, Sweet Wormwood. Despite initial results appearing inconsistent, Tu revisited some literature and found clues that led to the successful extraction of the active component, Artemisinin. She then became the first to demonstrate that this compound was highly effective against Malaria, killing the parasite even at early stages of development. The precise mechanism by which Artemisinin achieves this is still debated, but many current hypotheses suggest that the drug forms a highly reactive compound within the Malaria parasite, which then irreversibly modifies and damages proteins and other important molecules, and the parasite goes down!

The consequences of these discoveries have been felt across the globe, but the countries most affected by these diseases stand to gain the most. River Blindness is nowhere near the huge problem it used to be, with treatment scenarios moving away from control and towards eradication and elimination. This could be MASSIVE for the global economy, with estimates stating a potential saving of US$ 1.5 billion over a few decades! Eradication could also be great for local economies, as such serious illnesses can affect employee attendance, whether through actual infection or having to care for a relative. This decrease in workforce can potentially lead to an economic downturn and further unemployment. When combined with the more direct costs of treatment, the losses can be huge. Malaria alone is thought to cost Africa around US$ 12 billion a year in lost GDP, and continues to slow growth by more than 1% each year.

Just imagine what eliminating these diseases could do for these countries. Not only could their economies rise to a more globally competitive level, it could also lead the way to alleviating the more poverty-stricken areas. Families would be freer to go out and earn a living, with no need to worry about potential infection or having to care for sick family members. This could afford more food, better healthcare, and leisure activities, drastically increasing quality of life. Granted, it would also cause significant population growth in areas with already high birth rates, and the current food crises would not be helped by this, but these problems would be easier to control and solve in a disease-free society.

That is not to say there are no concerns associated with disease eradication, although it is extremely unlikely that these would outweigh the benefits. It could be that the process of natural selection would be halted, as these diseases weed out weaker immune systems in the population. But advances in technology and medicine are already allowing new treatments for both genetic and infectious diseases, so if such an effect were possible, its causes are already present across the globe; why should it influence our decision in this case? I mean, Malaria was eradicated in the US many years ago and there have been no obvious downsides. You could also look at the example of smallpox, whose eradication not only saves the world around US$ 1.35 billion a year, but has had no clear effect on our immune systems. Even if it were to have such an effect, it would take many thousands of years for such a change to occur, and it is possible that our technology and medical treatments will have advanced enough to counter it. While this question does remain to be answered, there is no evidence of a decline in our immune systems, and I think it is safe to say that the benefits of eliminating these diseases would go way beyond the realm of public health. So should we let a completely hypothetical downside influence our decision? Answer! No, we shouldn’t.

Sources:

The Key Publications mentioned by the Nobel Assembly at Karolinska Institutet:

  • Burg et al., Antimicrobial Agents and Chemotherapy (1979) 15:361-367.
  • Egerton et al., Antimicrobial Agents and Chemotherapy (1979) 15:372-378.
  • Tu et al., Yao Xue Xue Bao (1981) 16:366-370 (in Chinese).