New genome research at Oxford University could change the way scientists view our evolution.
The relationships among, and emergence of, the three ‘domains’ of life – the three founding branches of the Tree of Life to which all living cells belong – have been much disputed. Two of these domains, the Bacteria and the Eukaryotes (which include all animals, plants and fungi), are familiar, but less is known of the third: organisms collectively called the Archaea.
Some species of Archaea are adapted to live in extremes such as the boiling sulphur springs of Yellowstone National Park or the high salt concentrations of the Dead Sea. Others, such as the group Thaumarchaea, are found in more moderate environments including the warm surface waters of oceans.
Steven Kelly, of Oxford University's Department of Plant Sciences, tracked the evolutionary history of the three domains by analysing more than 3,500 families of genes in the Archaea, Bacteria and Eukaryotes. He and his colleagues found that Eukaryotes are most closely related to the Thaumarchaea.
The study, recently published in Proceedings of the Royal Society B, also suggests that the metabolism of the earth’s first organisms was based on methane production. 'That’s a really important discovery because it gives us a real insight into how life got started, which is one of the biggest questions in evolutionary biology,' Steven said. 'This is a step change in the way people think about how life on earth developed.'
The ability to link advances in our knowledge of evolution to changes in past atmospheric and environmental conditions will improve our understanding of how life is adapting to the changing environmental conditions we see today, Steven believes.
The new research indicates that Archaea are as ancient as their name suggests. Evidence from geology and genetics, coupled with the findings, suggests that Eukaryotes evolved between 2 and 2.5 billion years after Archaea, which emerged around 3.5 billion years ago.
An experiment to test if slime moulds can design efficient railway networks has won a team, including Oxford University researchers, an Ig Nobel Prize.
We reported on the original research back in January, but I asked team member Mark Fricker of Oxford University’s Department of Plant Sciences why scientists study these strange organisms, what they can teach us and how they take 'networking' to a whole new level:
OxSciBlog: What makes slime moulds so interesting to study?
Mark Fricker: The acellular slime moulds represent a very unusual life form. The whole organism is one single giant cell, albeit containing many nuclei, that can grow to many centimetres in size. In the wild, it spreads as a pulsing network, seeking out food sources such as bacteria, fungi or dead insects that it engulfs and then digests.
Even with a low-power microscope or a hand lens it is possible to watch the shuttle flow of cytoplasm coursing through the system, which somehow manages to resolve into an efficient transport network. Although it has no brain or nervous system, its exploratory behaviour and the network itself are highly responsive and continuously adapt to whatever is happening around them.
It's a great system to then challenge with different stimuli to see how it reacts. If things get really bad, it simply dries out and waits until things get better or forms spores that can spread to other sites.
OSB: Why is it useful to compare their networks with manmade ones?
MF: We already know that the slime mould is capable of solving certain abstract problems that are computationally difficult, such as finding the shortest path through a maze or the most efficient way to connect geometric arrangements of different food sources using Steiner points.
However, we wanted some way to determine whether understanding such behaviour could have utility beyond simple fascination with such elegant biology. Providing a real-world test problem to which we already know the answer seemed to be one way to discover whether the lessons we might learn from the slime mould could have applications elsewhere.
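For comparison, when a computer tackles the maze problem mentioned above it typically uses an explicit graph search such as Dijkstra's algorithm, rather than anything slime-mould-like. A minimal sketch, with the maze layout invented purely for illustration:

```python
import heapq

# Toy maze: 0 = open cell, 1 = wall (layout invented for illustration)
MAZE = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def shortest_path_length(maze, start, goal):
    """Dijkstra's algorithm on a grid maze; every step costs 1."""
    rows, cols = len(maze), len(maze[0])
    dist = {start: 0}
    queue = [(0, start)]                      # priority queue of (distance, cell)
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                          # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and maze[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(queue, (nd, (nr, nc)))
    return None                               # goal unreachable

print(shortest_path_length(MAZE, (0, 0), (4, 4)))  # → 8
```

The slime mould, of course, arrives at a comparable answer with no explicit search at all, which is what makes the comparison with engineered algorithms interesting.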
OSB: What can we learn from how slime moulds build networks?
MF: As there is no obvious distinct communication system within the organism, we infer that the network is able to form and adapt based solely on local information. The overall behaviour emerges from the collective interaction of the constituent parts.
Control by such a decentralised system is in marked contrast to management through a central control centre, which has to assimilate all the necessary information, process it and then send out instructions to achieve a co-ordinated response. We also infer that the lack of a "brain" means the rules governing local behaviour are likely to be simple, but iteratively give rise to apparently sophisticated problem-solving behaviour, very similar in principle to the way complex behaviour can emerge in social insects such as termites or bees.
Decentralised control systems running on simple rules offer attractive possibilities for establishing readily scalable, low-maintenance, robust and adaptable network architectures. Equally, we have to be careful not to push these analogies too far: although the slime mould networks match the infrastructure networks at one level, they develop through very different processes that would be completely impractical to replicate in all but a limited number of real-world scenarios.
Nevertheless, interesting general concepts may emerge, such as the communication of fuzzy information over long distances and the transfer of information through conservation laws that are intrinsically associated with physical flows. It is also interesting that many biological systems show oscillatory behaviour that may assist in co-ordinating behaviour, whilst most man-made control strategies deliberately try to suppress such phenomena.
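The 'simple local rules' idea can be caricatured in a few lines of code. Published current-reinforcement models of Physarum let each tube's conductivity grow with the flow it carries and decay when idle; the toy below applies a nonlinear version of that feedback to just two parallel tubes. It is a deliberately crude sketch, not the team's actual model, and the function name and parameter values are invented for illustration:

```python
def reinforce(d1, d2, total_flow=1.0, dt=0.1, steps=200):
    """Toy current-reinforcement rule for two parallel tubes.

    The tubes share a fixed total flow in proportion to their
    conductivities; each conductivity then relaxes towards the
    square of the flow it carries (grow when used, decay when idle).
    The nonlinearity means a slightly wider tube takes over.
    """
    for _ in range(steps):
        q1 = total_flow * d1 / (d1 + d2)  # flow splits by conductivity
        q2 = total_flow - q1
        d1 += dt * (q1 ** 2 - d1)         # reinforcement plus decay
        d2 += dt * (q2 ** 2 - d2)
    return d1, d2

# A small initial advantage is amplified into winner-take-all:
d1, d2 = reinforce(0.55, 0.45)            # d1 approaches 1, d2 approaches 0
```

The purely local update, with no global view of the network, is the point of the caricature: selection of efficient routes emerges from iterated feedback between flow and tube size.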
OSB: How did you feel when you heard you’d won an Ig Nobel?
MF: Great. I think they are a wonderful vehicle to make science accessible and entertaining.
OSB: What do you hope to investigate next as part of this research?
MF: We have a number of different organisms, including fungal mycelia, that also produce elegant networks that appear to be tuned to a different balance between cost, efficiency, resilience and control complexity.
They also form their networks by completely different methods from slime moulds at the molecular level, yet there are already interesting similarities, including the conserved flows and pulsating behaviour at a macroscopic level. This again hints at universal principles governing this type of network formation, principles that can be realised with a wide variety of different components. This is important because it suggests the control principles could be transferred to non-biological systems as well.
Unravelling these processes and modelling the critical components needs creative links between biologists, physicists, mathematicians and engineers. This network of network people is what we are currently building.
Mark Fricker and colleagues won this year's Transportation Planning Prize.
How do you accurately simulate the Universe on a computer?
A new programme at the Oxford Martin School, led by Pedro Ferreira of Oxford University's Department of Physics, aims to see how this and other seemingly impossible tasks can be tackled by developing new ways of handling data.
Pedro will be explaining this new approach as part of a showcase at the Royal Society this evening, but before that I caught up with him to ask about data, supercomputers and the biggest problems in science:
OxSciBlog: Why do we need new approaches to handling data?
Pedro Ferreira: We are at the threshold of a new era in Cosmology. Over the past few years we have developed very powerful instruments - ground-based, balloon-borne and satellite telescopes - with a phenomenal capacity for collecting data. New surveys of galaxies and maps of cosmic radiation at different frequencies all probe the Universe on a wide range of scales.
We want to learn the fundamental properties of the Universe, such as what it's made of and how it is evolving. The new data sets are so massive that this can't be done using conventional methods.
We need to be clever, innovative, pushing the boundaries of data management and statistical analysis. In particular we need to come up with radically different methods. Otherwise we won’t be able to extract the knowledge we want from the data we have in hand. And it is going to get much, much worse.
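One concrete sense in which conventional methods break down is memory: a catalogue of billions of objects won't fit on a single machine, so even basic statistics have to be computed in one pass over a stream of records. Welford's one-pass algorithm for mean and variance is a standard example of this style; the numbers below are made up for illustration:

```python
def streaming_mean_var(values):
    """Welford's one-pass algorithm: mean and sample variance
    computed incrementally, without holding the data in memory."""
    n, mean, m2 = 0, 0.0, 0.0
    for x in values:              # 'values' can be any iterable or stream
        n += 1
        delta = x - mean
        mean += delta / n         # update running mean
        m2 += delta * (x - mean)  # update sum of squared deviations
    variance = m2 / (n - 1) if n > 1 else 0.0
    return mean, variance

# Works identically whether the input is a list or a huge file stream:
mean, var = streaming_mean_var([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

The same pattern, updating a small summary as each record streams past and never storing the full data set, underlies much of the statistical machinery needed for large surveys.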
OSB: What lessons have been learnt from astrophysics & cosmology about processing/searching large amounts of data?
PF: We have learnt a lot. For example, we have had to come up with very powerful ways of simulating the data sets.
Think about it: we want to simulate the Universe on a computer, something immense, but with enough resolution that we can recognise the fine details we see around us. That has proven a challenge, but as a result members of our team have taken part in developing codes that are among the most powerful in the world.
By this I mean that they can run on the largest supercomputers in the world, making the most of the gigantic computing capacity of these machines. Indeed, these codes can be used to benchmark the next generation of supercomputers.
OSB: What sort of new techniques will you look to develop?
PF: We have three strands of research that we want to pursue. First of all, we want to develop methods to deal with the upcoming surveys of galaxies, such as those that will come out of the Square Kilometre Array (SKA) or the Large Synoptic Survey Telescope (LSST). We need to be able to deal with data sets containing billions, not millions, of galaxies.
Second, we want to harness the capacity of the public to take part in the analysis of these data sets. Our Citizen Science project has been very successful in harnessing the creativity of hundreds of thousands of individuals online. We want to explore the potential of this new way of doing science.
Finally, we really want to push our ability to simulate the Universe using the most advanced computer codes in the field, developed by our team. These codes have to correctly simulate the largest scales, the overall properties of the visible Universe, while at the same time picking out the fine details: how galaxies interact, merge and evolve to build up the complex cosmic ecology we observe.
OSB: How might this help researchers in fields such as oceanography, climate science and medicine?
PF: The problems we are facing in Cosmology are present in many other fields. In climate science we need to be able to simulate incredibly complex systems on a wide range of scales. In oceanography, there are experiments that will try to map out the oceans in real time at tens of thousands of different points.
Imaging can and will play a crucial role in medicine and is amenable to the use of novel statistical methods. Citizen Science, as developed by our group, is already being deployed in a range of fields, from the classification of weather logs to the reconstruction of classical papyri.
Professor Pedro Ferreira is Director of the Computational Cosmology Programme at the Oxford Martin School and a Professor of Astrophysics.
Scientists are expecting grim news in the forthcoming Comprehensive Spending Review, where funding for science and research is expected to be cut significantly. This is despite arguments that science and innovation should be at the heart of future economic growth, not least in a Royal Society report from March.
At the beginning of the month, Business Secretary Vince Cable said universities will have to do ‘more with less’, and angered researchers by suggesting that up to 45% of grants went to research that wasn’t of excellent standard (with the implication that mediocre science could reasonably be cut). Now the heads of leading research universities are getting involved.
Lord Krebs, head of the House of Lords’ science and technology select committee, has warned today that cuts to the government’s science research budget will affect the ability of UK universities to attract and retain the best researchers from around the world.
He spoke this morning on Radio 4’s Today programme after sending the science minister David Willetts a letter setting out the views of the heads of six leading universities, including the Vice-Chancellor of Oxford Professor Andrew Hamilton.
The Times and BBC News Online have both covered the story and publish the letter in full.
David Willetts gave evidence to the Lords’ science and technology committee in July, when he invited the committee to provide evidence that the UK is becoming a less attractive place for science research.
Lord Krebs then wrote to the vice-chancellors of six leading research universities – Oxford, Cambridge, Manchester, Imperial, Edinburgh and UCL – asking about their experiences. Their responses provide the material for the letter to David Willetts, including the suggestion that a handful of top researchers have already returned to the US given the outlook for research funding here.
In Oxford’s submission, Professor Andrew Hamilton says: ‘We have very real concerns that the brightest and best researchers at all stages of their career could accept offers of study or employment at our international competitor institutions should the national funding environment become more challenging ... We are of the firm view that it is less expensive to retain our [leading UK research intensive universities’] current quality endeavour than it would be if it had to be rebuilt in the future.’
Lord Krebs’ letter and the statements from the universities have now been published on the science and technology committee’s website.
It’s clear that discussion of the effect of science cuts will continue over the coming days and weeks.
How can you measure the economic benefits of university research?
One way is to look at the number of spin-out firms universities create in order to exploit new scientific ideas and techniques.
As Richard Tyler writes in The Telegraph, a new report into life science start-ups suggests that Oxford University is doing particularly well in turning good ideas into companies. The report states that 'if all the Russell Group universities were operating to the same level as Oxford, in theory there would have been an additional 78 university spin-outs over the period [2005-09].'
It goes on to show that, in measures of numbers of and investment in spin-outs, Oxford comes 'way out in front of the pack' ahead of Imperial and Cambridge. It also highlights the importance of university technology transfer companies, such as Oxford's own Oxford University Innovation.
Of course, as the report also makes clear, spin-outs are not the only measure of success in terms of the commercialisation of research. But these new companies are the most obvious sign that good science is being turned into good business.
