With extreme weather becoming increasingly frequent, one question logically follows: did climate change cause these events? As recently as a decade ago, scientists would have confidently argued that this question cannot be answered, but thanks to rapid improvements in both the understanding of weather systems and the analytical methods used to study them, they are now able to provide some more meaningful responses.
But even with this new knowledge it’s still not possible to answer the exact question above, as the question itself is flawed. No weather event ever has a single cause; there are always multiple, independent factors at play, most of which are natural. Climate change is but one variable among many, and its influence can be quite subtle.
So what can the scientists tell us? Well, according to a report from the National Academies of Sciences, Engineering, and Medicine (NASEM) released on March 11th, they can now examine how the likelihood and intensity of an event have been altered. And thus, the science of “event attribution” is born!
However, even when scientists are armed with their new understanding and analytical methods, statements and predictions can’t be made without a HUGE amount of data to back them up. This can be obtained in many ways, with some studies using observational data from similar events in the past, and others using climate and weather models to compare conditions in worlds with and without human-caused climate change. But no single data set is perfect, and the NASEM report states that results are often most reliable when multiple methods are used.
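To give a feel for the model-comparison approach, here’s a toy numerical sketch. Attribution studies often summarise the change in likelihood with a probability ratio, or a “fraction of attributable risk” (FAR = 1 − P0/P1). The temperature distributions below are entirely made up for illustration; real studies use large ensembles of climate model runs.

```python
import random

random.seed(42)

# Hypothetical simulated peak summer temperatures (°C) from two model worlds.
# The +1 °C shift in the "factual" world is an illustrative assumption.
counterfactual = [random.gauss(30.0, 2.0) for _ in range(100_000)]  # no human influence
factual = [random.gauss(31.0, 2.0) for _ in range(100_000)]         # with human influence

threshold = 35.0  # define the "extreme event" as exceeding 35 °C

p0 = sum(t > threshold for t in counterfactual) / len(counterfactual)
p1 = sum(t > threshold for t in factual) / len(factual)

probability_ratio = p1 / p0  # how much more likely the event became
far = 1 - p0 / p1            # fraction of attributable risk

print(f"P0={p0:.4f}, P1={p1:.4f}, ratio={probability_ratio:.2f}, FAR={far:.2f}")
```

Even a modest shift in the average makes the far tail of the distribution several times more likely to be exceeded, which is exactly why temperature extremes are where attribution statements are strongest.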
SO! We now have a rough idea of what event attribution is and how this new area of science works. Let’s start looking at what it can do! So far, the most reliable attribution findings are those related to temperature. There is little doubt in the scientific community that human activities have had a noticeable impact on this aspect of the climate, and its effects on various weather events are already known.
Apart from increasing the likelihood of extremely hot days and doing the opposite for cold days, a warmer climate can have some rather unexpected effects. Such warming can cause greater evaporation of water from the Earth’s surface, which not only increases the intensity of droughts, but also the amount of atmospheric moisture available to storms. This could lead to more severe heavy rainfall and snowfall events, and you can find an explanation of how that would work in another post I’ve written on the formation of snow.
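The moisture argument above has a well-known rule of thumb behind it: by the Clausius–Clapeyron relation, the amount of water vapour the air can hold rises by roughly 6–7% for every degree of warming. A quick sketch using the standard Magnus approximation for saturation vapour pressure shows this:

```python
import math

def saturation_vapour_pressure(t_celsius: float) -> float:
    """Approximate saturation vapour pressure over water in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

for t in (20.0, 21.0):
    print(f"{t} °C -> {saturation_vapour_pressure(t):.2f} hPa")

# Fractional increase in moisture capacity for one extra degree of warming
increase = saturation_vapour_pressure(21.0) / saturation_vapour_pressure(20.0) - 1
print(f"~{increase * 100:.1f}% more moisture per extra degree")
```

So a warmer atmosphere doesn’t just feel muggier; it gives storms measurably more raw material to work with.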
But the implications of event attribution go beyond simply determining the influence of human-caused climate change. By gaining a deeper understanding of what causes extreme weather events, scientists can improve their ability to accurately predict and project future weather and climate states. If they’re able to predict both the frequency and intensity of extreme events, it could help lessen their impact by avoiding the destruction they would cause. For example, an accurate prediction may allow a community to evacuate long before the event even arrives, and knowledge regarding its frequency can help them decide between rebuilding and relocating.
It should now be quite clear what the science of event attribution can offer us, but we should also be aware of the challenges that this relatively new science faces. According to the NASEM report, statements about event attribution are quite sensitive to the way the questions are framed, as well as their context. Given this sensitivity, many choices need to be made: defining the duration of the event, the geographical area impacted by it, which variables to study, and more. These decisions can drastically affect the reliability of the results, as they all influence how the findings are interpreted.
But despite its problems, the science of event attribution has a lot to offer society, both in limiting the impact of extreme weather and in drawing people’s attention to the reality of climate change. Extreme weather events get people’s attention, and attribution studies could bring an end to the notion of climate change as a distant threat, helping people realise that there is a very real need for us to act on it now. Let’s just hope it can do so fast enough.
Welcome! In this episode we discuss the concept of a biobased economy, how it could be achieved, and look at some of the industries that would be affected by it.
We also talk about some new gene sequencing technology, potentially the largest solar power plant in North America, and how AlphaGo is defeating humanity. Our funny story is about someone called Brian.
The age of gene editing is upon us! Or it will be soon thanks to the revolutionary new technology known as CRISPR. I would be VERY surprised if you haven’t at least heard of it by now, especially when you consider the attention it gets from the science media. Attention that is very understandable once you start looking at exactly what this technology can do, and what that potentially means.
The story began in 2013, when some researchers claimed that they had used CRISPR-Cas9 to successfully slice the genome in human cells at sites of their choosing. This understandably triggered a massive ethics debate which is still going on today. A huge amount of the conversation focussed on how it could be used to fight genetic diseases or even edit human embryos, but there are many more potential applications. As the debate continued, researchers set about editing the genomes of many other organisms, including plants and animals, and even began exploring its potential for studying specific gene sequences. The range of applications is truly remarkable.
Some claim that the real revolution right now is in the lab, where CRISPR has made the study of genetics significantly easier. There are two main components to the CRISPR-Cas9 system: a Cas9 enzyme that acts as a pair of molecular scissors, cutting through the DNA strand, and a small RNA molecule that directs the system to a specific point. Once the cut is made, the cell’s own DNA repair mechanisms will often mend it, but not without making the occasional mistake.
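The targeting step can be illustrated with a toy string search. Cas9 will only cut where the guide sequence sits next to a short “NGG” motif (the PAM); the sequences below are made up, and real guide RNAs are around 20 nucleotides long rather than the 12 used here for readability.

```python
def find_cut_sites(genome: str, guide: str) -> list[int]:
    """Return indices where `guide` occurs immediately followed by an NGG PAM."""
    sites = []
    for i in range(len(genome) - len(guide) - 2):
        if genome[i:i + len(guide)] == guide:
            pam = genome[i + len(guide):i + len(guide) + 3]
            if len(pam) == 3 and pam[1:] == "GG":  # N can be any base
                sites.append(i)
    return sites

genome = "TTACGGATCCGATTGCATGGTT"  # toy DNA sequence
guide = "GATCCGATTGCA"             # toy guide sequence
print(find_cut_sites(genome, guide))
```

The real system is of course biochemical rather than a string match, but the principle is the same: the guide RNA gives you an address, and the enzyme does the cutting.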
Even a small error during DNA repair can completely alter the structure of the protein it codes for, or stop its production altogether. By exploiting these traits and errors, scientists can study what happens to a cell or organism when a specific gene or protein is altered. Such a level of control will likely make these studies less prone to error, and lead to a much better understanding of the role played by specific genes and proteins.
But it doesn’t stop there! There exists a second repair mechanism that mends the DNA strand according to a template provided by the cell. If the researchers were to remove this template and provide one of their own, then they could potentially insert nearly any DNA sequence at whatever site they desire. A prospect that would allow genomes to not just be edited, but actively designed.
That idea may sound somewhat futuristic at this point, but in reality it’s already being put to use. Due to the relative precision and ease that CRISPR offers, scientists have already made a start on editing the genes of animals for applications ranging from agriculture to the revival of extinct species. Some CRISPR modified animals are even being marketed as pets! As you can imagine, regulators are still working out how to deal with such animals, as there are some obvious safety and ecological concerns, but that hasn’t stopped the science from happening.
Disease resistance is one of the more popular uses for CRISPR, and it can provide a variety of agricultural and ecological benefits. For example, there is hope that such an application could help stem the huge loss of honey bee populations around the world. This loss can in part be attributed to disease and parasites, but biotechnology entrepreneur Brian Gillis may have found a solution. He has been studying the genomes of so-called “hygienic bees”, which get their name from their obsessive hive cleaning habits and removal of sick or infested larvae.
Gillis’s idea is that, if the genes responsible for this behaviour can be identified, they could then be edited into the genomes of other populations to significantly improve hive health. But whether or not this can be done remains to be seen: no such genes have been identified as of yet, and the roots of the behaviour may prove to be much more complex. At this point we’ll just have to wait and see.
Another potential application, one that I personally find much more interesting, is the revival of extinct species. Around 4000 years ago the woolly mammoth went extinct, an event that was at least partially due to hunting by humans. Well, it now looks like we might be able to make up for that mistake! This is thanks to scientist George Church from Harvard Medical School in Boston, who has plans to use CRISPR to bring back this long-lost species. He hopes to transform some endangered Indian elephants into a breed of cold-resistant elephants that will closely resemble the mammoth, and release them into a reserve in Siberia where they will have space to roam.
But the process of editing, birthing, and then raising these mammoth-like elephants is no easy task. The first problem is how to go from an edited embryo to a developed baby, and Church has said it would be unethical to implant the embryos into endangered elephants for the sake of an experiment. Since that option is off the table, his lab is currently looking into the possibility of an artificial womb; a device that does not currently exist.
It’s worth pointing out that I’ve barely scratched the surface of what CRISPR can do for animals, let alone the many other organisms it can be applied to. I could very easily write an entire post about it, to be honest, but there is one final point that definitely deserves some attention. We’ve already seen how amazingly versatile CRISPR is, and it stands to reason that, if you can edit the genomes of animals in this way, you can almost certainly do the same to humans.
As I’m sure you can imagine, there is a very heated debate about how it could and should be used to modify the genomes of human embryos. One key argument in favour is that many currently available technologies already allow parents to do something similar. These include prenatal genetic screening to check for conditions like Down syndrome, and in-vitro fertilisation, which allows parents to select embryos that don’t carry certain disease-causing mutations. One could say that direct genome editing is simply the next step in technology of this nature.
On the other hand, one needs to consider what genome editing would mean for society as a whole. For example, by allowing parents to edit out traits they see as debilitating, we could potentially create a less inclusive society. In such a world even the tiniest of flaws might be seen as a huge disadvantage, with everyone being subjected to much harsher judgement. Would that be beneficial for the human race? Unfortunately that’s not a question we can answer, but it doesn’t sound like a pleasant world to live in.
Whether or not you’re in favour of the human race dictating the genetics and characteristics of future generations seems to be a matter of opinion right now, and it’s certainly not fair to say that either side of the debate clearly outweighs the other. To me, there doesn’t seem to be an obvious answer. On the one hand, we have the chance to truly improve the human race by ensuring that we continue to improve and adapt as time goes on, assuming of course that we have the knowledge to do so. But I cannot say for sure what type of society that would create, or whether it’s one I’d really like to live in.
Regardless of your opinion on CRISPR and gene editing, you can’t deny that this new technology has the potential to completely change our world and our society. Given that it can improve our understanding of genetics and allow us to physically alter the DNA of living creatures, one could easily describe it as the beginning of a genetic revolution. We’ll have to just wait and see if it will be put to use in the ways I’ve explored here, but it’s certainly something I will be keeping an eye on. Hopefully I got you interested enough to do the same.
Supercomputers are truly marvellous examples of what technology can accomplish, being used in many areas of science to work through some incredibly complex calculations. Their computational power is a real feat of human engineering. But, unfortunately, they’re not perfect. Not only are they absolutely huge, often taking up an entire room, but they’re also expensive, prone to overheating, and a huge drain on power; so much so that they often need their own power plant to function.
But fear not! As always, science is capable of finding a solution, and this one comes in the form of a microchip concept that uses biological components found inside your own body. It was developed by an international team of researchers, and it uses proteins in place of electrons to relay information. Their movement is powered by adenosine triphosphate (ATP), the fuel that provides energy for all the biological processes occurring in your body right now. It can quite literally be described as a living microchip.
The chip’s size may not seem like much, measuring only 1.5 cm², but if you zoom in you get a very different picture. Imagine, if you will, that you’re in a plane looking down at an organised and very busy city. The streets form a sort of grid spanning from one end of the city to the other, which closely resembles the layout of this microchip. The proteins are then the vehicles that move through this grid, consuming the fuel they need as they go. The main difference is that, in this case, the streets are actually channels that have been etched into the chip’s surface.
“We’ve managed to create a very complex network in a very small area,” says Dan Nicolau Sr., a bioengineer from McGill University in Canada, adding that the concept started as a “back of an envelope idea” after what he thinks was too much rum. I guess some innovative ideas require a little help getting started.
Once the rum was gone and the model created, the researchers then had to demonstrate that the concept could actually work. They did this by setting it a mathematical problem, with success defined as the microchip identifying all the correct solutions with minimal errors.
The process begins with the proteins in specific “loading zones” that guide them into the grid network. Once there, the journey through the microchip begins! The proteins move through the grid, via various junctions and corners, processing the calculation as they go. Eventually, they emerge at one of many exits, each of which corresponds to one possible solution to the problem. In the specific case described by the researchers, analysis of the results revealed that correct answers were found significantly more often than incorrect ones, indicating that the model can work as intended.
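The logic of the grid can be sketched in a few lines of code. The published demonstration used a combinatorial problem of the subset-sum kind, where each row of junctions corresponds to one number: an agent either turns (adding that number to its running total) or passes straight on, so every possible route ends at an exit labelled by one achievable sum. The set of values below is just a small illustrative example, not necessarily the one used on the actual chip.

```python
from itertools import product

def reachable_exits(values: list[int]) -> list[int]:
    """Enumerate every exit an agent can reach: one per distinct subset sum."""
    exits = set()
    for choices in product([0, 1], repeat=len(values)):  # one route per agent
        exits.add(sum(v for v, take in zip(values, choices) if take))
    return sorted(exits)

print(reachable_exits([2, 5, 9]))  # exits labelled by achievable sums
```

The clever part is that the chip explores all of these routes in parallel, one protein per route, rather than checking them one at a time as this loop does.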
The researchers claim that this new model has many advantages over existing technology, including a reduction in cost, better energy efficiency, and minimal heat output, making it ideal for the construction of small, sustainable supercomputers. They also argue that this approach is much more scalable in practice, but recognise that there is still much to do to move from the model they have to a fully functioning supercomputer. It’s early days, but we know the idea works.
So, while it may be quite some time before we see these biological supercomputers actually put to use, it certainly seems like a fruitful path to follow. Society would no doubt benefit from the reduced cost and power usage that this new technology would bring, and those same qualities would also make its application in scientific research much easier.
In fact, if the decrease in cost and power usage is a dramatic one, then scientists could potentially use far more of these computers than they do at the moment. This is a change that would have a huge impact on the kinds of calculations that could be performed, and could potentially revolutionise many areas of science. Even though we’ll have to wait, that’s something I am very much looking forward to.
Nicolau, D., Lard, M., Korten, T., van Delft, F., Persson, M., Bengtsson, E., et al. (2016). Parallel computation with molecular-motor-propelled agents in nanofabricated networks. Proceedings of the National Academy of Sciences. http://dx.doi.org/10.1073/pnas.1510825113
So I recently got started on a podcast project with a friend of mine, focussing on climate change, green energy, sustainability and science of that nature. If all goes to plan this will be a weekly series, so keep an eye out for the posts here or subscribe on YouTube if you’re interested.
Episode 1 just went live yesterday, and if you could check it out and leave any feedback it would be greatly appreciated! You can find the link below!
In this episode we discuss climate change, the rising sea levels, and what you can do about it. We also mention an exciting new medical trial going on and look at some more amusing science stories. Hope you enjoy!
Apologies for the unbalanced audio! Lessons were learned and it will be fixed on the next episode 🙂