Probing what fuels cancer

Jonathan Wood | 3 Aug 2012

Cancer is often described as a genetic disease. After all, the transition a cell goes through in becoming cancerous tends to be driven by changes to the cell's DNA.

But genes, though hugely important, might not be the whole story. Researchers at Oxford University are interested in understanding how changes in cells' metabolism – the chemical processes through which cells get the energy they need – could also prime them to become cancerous.

They have just started collaborating with a lab at Keio University in Japan to bring large-scale techniques to the study of the metabolic processes going on in cancer cells, much as genomic technologies have given insight into the DNA changes involved in cancers.

'Altered cellular metabolism is a hallmark of cancer,' says Dr Patrick Pollard, who is leading this effort in the Nuffield Department of Clinical Medicine at Oxford.

This is not a new finding. The biochemist and Nobel laureate Otto Warburg pointed it out in the 1920s, observing that most cancer cells get the energy they need predominantly through a high rate of glycolysis (the metabolic process that breaks down glucose to release energy). This reliance on glycolysis helps cancer cells deal with the low oxygen levels that tend to be present in a tumour.

But whether dysfunctional metabolism causes cancer, as Warburg believed, or is something that happens afterwards is a different question.

In the meantime, gene studies progressed rapidly and gave us a picture of how genetic changes lead to cancer.

It goes something like this: DNA mutations spring up all the time in the body's cells, but most are quickly repaired, or else the cell shuts down or is killed off before any damage is done. The repair machinery is not perfect, however. If changes occur that bypass parts of the repair machinery or sabotage it, the cell can escape the body's normal controls on growth, and further DNA changes begin to accumulate as the cell turns cancerous.

So what has metabolism got to do with this? We get the energy we need from food, of course, and we talk about our metabolism as the way our bodies use that food as fuel for everything we do during the day. Our cells are the same. They have whole series of chemical reactions going on simultaneously to keep them working, wherever they are and whatever they are doing in the body – from heart cells to neurons in the brain, liver cells or pancreatic cells. Cellular metabolism is a constant process, with thousands of metabolic reactions happening at the same time, all of which need to be regulated to keep our cells ticking over healthily.

It's what happens when the regulation of these cellular metabolic processes goes wrong that could be of interest. And it's only much more recently that techniques to probe the entirety of metabolic processes in the cell have advanced far enough to be useful. The result is something of a return to vogue for studies of how altered cellular metabolism and cancer are linked.

Studies of the genetic basis of cancer and dysfunctional metabolism in cancer cells are complementary, Patrick believes. 'Genomic data is very important, but certain changes in cells can’t always be accounted for by genetics.'

He is now collaborating with Professor Tomoyoshi Soga's large lab at Keio University in Japan, which has been at the forefront of developing the technology for metabolomics research over the past couple of decades (metabolomics being the ugly-sounding term used to describe research that studies all metabolic processes at once, just as genomics is the study of the entire genome).

The Japanese lab's ability to screen samples for thousands of compounds and metabolites at once, coupled with the access to tumour material and cell and animal models of disease in Oxford, should give great power to probe the metabolic changes that occur in cancer.

There is reason to believe that dysfunctional cell metabolism is important in cancer. Several genes with metabolic functions are associated with particular cancers, and changes in the function of a metabolic enzyme have been implicated in the development of gliomas.

These results have led to the idea that some metabolic compounds, or metabolites, when they accumulate in cells, can cause changes to metabolic processes and set cells off on a path towards cancer.

Patrick Pollard and colleagues have now published a perspective article in the journal Frontiers in Molecular and Cellular Oncology that proposes fumarate as such an 'oncometabolite'. Fumarate is a standard compound involved in cellular metabolism.

In that article, the researchers summarise evidence, much of it from their own lab, showing how fumarate accumulates when an enzyme goes wrong, and how that accumulation affects various biological pathways in the cell. It shifts the balance of metabolic processes and disrupts the cell in ways that could favour the development of cancer.

This work on metabolic pathways involving fumarate has already led to a cheap and reliable diagnostic test for a rare form of cancer caused by accumulation of fumarate within cells. Their test for hereditary leiomyomatosis and renal cell cancer (HLRCC) involves screening tumour samples for a particular molecular fingerprint unique to this type of cancer. The Oxford researchers are now hoping to develop their test for clinical use, largely to help with genetic counselling for families as the condition can be inherited.

While HLRCC is a rare type of cancer, Patrick Pollard says: 'Metabolic changes are observed in most cancers, so there could be wider implications. Lots of findings about pathways that are important in cancer come from studying rare cancers.'

This is where the collaboration with Keio University comes in. The Keio group is able to label glucose or glutamine, basic biological fuels for cells, and track the pathways cells use to burn up that fuel. This allows the scientists to work out which metabolic pathways are being used preferentially by different cell types, including cancer-derived cell lines.

Patrick gives an example of how the research might progress: they could profile the metabolites in a cohort of tumour samples and matched normal tissue. This would produce a dataset of the concentrations of hundreds of different metabolites in each group. Statistical approaches could suggest which metabolic pathways were abnormal. These would then be the subject of experiments targeting the pathways to confirm the relationship between changed metabolism and uncontrolled growth of the cancer cells.
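To make that concrete, here is a minimal sketch in Python of what the first statistical pass over such a dataset might look like: one significance test per metabolite, followed by a correction for having tested hundreds of compounds at once. Everything here (the simulated concentrations, group sizes and thresholds) is an illustrative assumption, not the researchers' actual pipeline.

    # Hypothetical metabolomics screen: tumour vs matched normal tissue.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_metabolites, n_samples = 300, 20                      # made-up study size
    tumour = rng.lognormal(0.0, 1.0, (n_metabolites, n_samples))
    normal = rng.lognormal(0.0, 1.0, (n_metabolites, n_samples))
    tumour[0] *= 5.0              # pretend metabolite 0 accumulates in tumours

    # Welch's t-test on log concentrations, one test per metabolite
    t, p = stats.ttest_ind(np.log(tumour), np.log(normal),
                           axis=1, equal_var=False)

    # Benjamini-Hochberg correction: with hundreds of tests, some small
    # p-values arise by chance, so control the false discovery rate instead
    order = np.argsort(p)
    adjusted = p[order] * n_metabolites / (np.arange(n_metabolites) + 1)
    q = np.minimum.accumulate(adjusted[::-1])[::-1]
    print("candidate metabolites:", order[q < 0.05])

Hits from a screen like this would only ever be candidates; as described above, targeted experiments on the implicated pathways would still be needed.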

Patrick and colleagues write in their latest article that the shift in focus of cancer research to include cancer cell metabolism 'has highlighted how woefully ignorant we are about the complexities and interrelationships of cellular metabolic pathways'.

Hopefully, research efforts like this large-scale approach to understanding cell metabolism can give insight into how cells respond to shifted metabolic processes and how this is associated with the development of some cancers.

Glucose sensor science recognised

Pete Wilton | 12 Jul 2012

Research at Oxford University that led to a new type of sensor enabling people with diabetes to easily and accurately monitor their own blood sugar (glucose) levels has been celebrated with the unveiling of a special plaque.

The National Chemical Landmark plaque from the Royal Society of Chemistry [RSC] recognises work by Allen Hill, Tony Cass, and Graham Davis at Oxford's Department of Chemistry. In the 1980s they developed a new technique that enabled a range of proteins to be investigated electrochemically, paving the way for a new type of monitoring device.

Previously glucose monitoring had been done using 'colorimetric' methods in which a drop of blood was applied to a strip that changed colour to indicate the concentration of glucose in the blood. But these methods required a large droplet of blood and were not very accurate.

The new approach started with work on how proteins gain or lose electrons when linked into an electrical circuit. Studying such electrochemical behaviour was tricky as the proteins tended to stick to the electrodes, creating a build-up which prevented a current from flowing.

Allen and undergraduate student Mark Eddowes developed a way of protecting the electrodes by binding another molecule to them which did not interfere with the current. The protein could now pick up an electron from one electrode via the surface-bound molecule and lose it at the other electrode, allowing a huge range of proteins to be investigated electrochemically for the first time.

In 1982 Allen, working with Tony Cass and Graham Davis, overcame a major drawback of the new method when applied to enzymes that use oxygen: the oxygen interfered with the electron transfer processes. They used ferrocene as a carrier of the electrons, thereby creating a much more stable system. By using the enzyme that breaks down glucose (glucose oxidase) as the protein component in the device, they were able to construct a new type of sensor.

Glucose oxidase reacts with any glucose in a sample, giving up electrons that are passed on to the electrode via the oxidised ferrocene: the larger the concentration of glucose, the larger the current measured by the device.
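That proportionality is the whole trick: a meter only needs a one-off calibration line to turn a measured current back into a glucose concentration. A minimal sketch in Python, with entirely made-up numbers rather than real device data:

    import numpy as np

    # Hypothetical calibration standards: known glucose, measured current
    glucose_mmol_l = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
    current_ua     = np.array([0.9, 1.8, 2.9, 3.8, 5.1])

    # Fit current = m * glucose + b, then invert it for unknown samples
    m, b = np.polyfit(glucose_mmol_l, current_ua, 1)

    def glucose_from_current(i_ua):
        return (i_ua - b) / m

    print(f"3.3 uA reads as {glucose_from_current(3.3):.1f} mmol/L glucose")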

Importantly, the new device needed just the tiniest pinprick of blood, would work with blood straight from the body, and could measure glucose levels much more accurately than colorimetric methods.

Attracting investors to commercialise the new sensor proved difficult until Ron Zwanziger provided financial backing and helped to secure the funds to found a new company, Medisense. The new glucose monitoring device finally went on sale in 1989.

Medisense was a great success, and was sold to Abbott Laboratories in the mid-1990s for around $800m. However, the greatest legacy of the work is that the many variants of the original Oxford-developed glucose monitor are used by people with diabetes around the world today.

Commenting on the original breakthrough, and how the technology has developed since, Allen told me:

'The initial reaction was a mixture of excitement and frustration. The former because we realised how useful the system we had discovered could be: frustration because it took a very long time parading our wares before one company after another before we were fortunate to meet Mr Zwanziger. Now the devices have developed enormously but they are still related to what we found in the '80s.'

Tony Cass is now Professor of Chemical Biology at Imperial College London.

Higgs hunt: new particle found

Pete Wilton | 4 Jul 2012

A wave of excitement is spreading across the world's media today as scientists at the Large Hadron Collider (LHC) announce the latest results in their search for the Higgs boson.

So has this elusive particle been found? And why is finding it so important?

I asked Alan Barr of Oxford University's Department of Physics, UK physics coordinator for the LHC's ATLAS experiment, what the world's highest energy particle accelerator saw, what it means for science, and what life might be like once physicists' Most Wanted has been safely consigned to the particle zoo…

OxSciBlog: Why is the search for the Higgs boson important?
Alan Barr: Over the last hundred years, physicists have studied the subatomic world in exquisite detail, and from their findings have constructed a very beautiful mathematical theory of nature.

This remarkable achievement can be likened to a machine in which each of the cogs is required to have its place to make the whole work. The Higgs particle is the "cog" responsible for mass – it is the physical manifestation of the field which is theorised to give weight to all of the other fundamental particles.

Without that field, the electrons and quarks would be massless, and would zip around at the speed of light.

OSB: What do the latest results tell us?
AB: The latest results show a significant excess of a particular type of event – collisions in which the detector has spotted two high-energy particles of light – photons – which have similar total energies.

That special energy is crucially important, because Einstein tells us that mass and energy are interchangeable. The total energy of the photons is equal to the mass of some new particle (times the speed of light squared).

The Higgs boson is expected to have a very brief lifetime before decaying into other particles, and in particular into a pair of photons. The total energy of the photons – about 126 billion electron volts – should correspond to the mass of the Higgs particle.
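The arithmetic behind that reconstruction is short enough to sketch. For two (effectively massless) photons, the parent particle's mass follows from the photon energies and the opening angle between them; the numbers below are toy values chosen to land near 126 GeV, not ATLAS data:

    import math

    def diphoton_mass_gev(e1, e2, opening_angle_rad):
        # Invariant mass of two massless photons, in GeV/c^2:
        # m^2 c^4 = 2 * E1 * E2 * (1 - cos(theta))
        return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle_rad)))

    # Two 70 GeV photons about 128 degrees apart reconstruct to ~126 GeV/c^2
    print(diphoton_mass_gev(70.0, 70.0, math.radians(128)))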

OSB: At what point can we say, definitively, if it exists or not?
AB: Physicists require high standards of proof, to ensure that their interesting results are not just lucky – or indeed unlucky – statistical flukes. The level of confidence required is a "five-sigma" excess, which means that there is less than a one-in-a-million probability that the signal could happen by chance. That's the level of significance that we require to claim discovery of a new particle.
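For a sense of what those sigmas mean numerically, here is a minimal sketch using the one-sided Gaussian convention that particle physicists quote:

    from scipy.stats import norm

    for sigma in (3, 4, 5):
        p = norm.sf(sigma)             # P(a Gaussian fluctuation > sigma)
        print(f"{sigma} sigma: p = {p:.2e} (about 1 in {1 / p:,.0f})")

Three sigma, at roughly 1 in 740, is conventionally only 'evidence'; five sigma, at about 1 in 3.5 million, is the discovery threshold he describes.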

OSB: What more work needs to be done?
AB: Whether this particle is the Higgs boson – or even a Higgs boson (many theories propose more than one) – will take a little more time to work out. For the moment we have reached what seems to be the topmost summit: we know that there is a new particle there.

We will next have to survey the ground and find out precisely what that something is. That means measuring its properties – like its spin, and how often it decays in different ways. Only by probing its interactions with the world around it will we know whether or not it is the Higgs particle that many were expecting.

OSB: How will finding/not finding the Higgs change physics?
AB: The full consequences will take years to work out in detail, but some things we can be sure of already. Firstly, the Standard Model has made a far-reaching prediction and has very much stood up to the test. That speaks volumes for the power of elegant mathematical theories to describe the intimate details of the inner workings of the universe.

Secondly, it means that there will have to be a programme of detailed measurements of the particle's properties, to be sure that they do not differ from those that have been predicted. Many theories predict just such subtle discrepancies as hints of a new theory yet to be uncovered.

And finally, the existence of a particle like the Higgs boson raises deep questions about why the universe seems to be so exquisitely finely set up for us to inhabit, almost as if it had us in mind from the beginning.

OSB: What about life after Higgs: what else could the LHC find out?
AB: If this particle really is the Higgs boson, then we will have precise knowledge about the particles and forces of the Standard Model. That really deserves something of a celebration. But we should be cautious. Remember that, if the astronomers are correct, Standard Model particles make up only about 5% of the contents of the universe.

The "Dark Matter" and "Dark Energy" which are believed to control the evolution of the universe as a whole have never yet been made in the lab. Understanding this other 95% of the universe is an exciting prospect, and is sure to exercise the combined intellects and technological prowess of astronomers and particle physicists for many decades to come.

With luck, the first clues may come in the next few years, as we turn up the LHC to even higher energies and probe even further into new and undiscovered terrain.

Summer of pests, motors, & spin

Pete Wilton | 3 Jul 2012

Find out about insect birth control, machines made from DNA, and weird quantum properties that could power supercomputers, in Oxford University exhibits at this year’s Royal Society Summer Science Exhibition.

The exhibition, which starts today in London and runs until 8 July, also features Oxford research into creating a genetic map of Britain.

The insect control exhibit explains how genetically modified (GM) insects can help to tackle dengue fever, a potentially fatal mosquito-borne disease which infects 50-100 million people each year in over 100 countries.

A new technique creates GM 'sterile' male insects which are released so that wild females mate with them. These females then have fewer offspring or none at all and, if enough sterile males are released over a long period, this can significantly reduce, or even eliminate, the population.
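The population logic in that paragraph is simple enough to capture in a toy model. This sketch assumes an even sex ratio and that a female's chance of a fertile mating is just the wild males' share of all males; all numbers are invented for illustration and far cruder than the exhibitors' real ecological models:

    def simulate(wild=10_000, sterile_release=50_000,
                 offspring_per_female=3.0, generations=10):
        # Each generation, wild females mate at random among wild + sterile males
        for g in range(1, generations + 1):
            wild_males = wild / 2
            fertile = wild_males / (wild_males + sterile_release)
            wild = int((wild / 2) * offspring_per_female * fertile)
            print(f"generation {g}: {wild} wild insects")
            if wild == 0:
                break

    simulate()

With these made-up numbers the wild population collapses within a handful of generations, which is the qualitative point: swamping the wild males with sterile ones is what does the work.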

Michael Bonsall of Oxford University’s Department of Zoology, lead exhibitor of insect control, said: 'Our approach uses advances in genetic engineering in a bid to address the challenge of insect pests. Our research draws on many scientific disciplines, from ecology to health economics, and demonstrates that it can be effective'.

The nano-scale transport exhibit, meanwhile, looks at what we could learn from self-assembled motors found in the natural world and about how we could build our own molecular machines.

The flagellar motors that power bacteria, and the kinesin motors that transport cargoes within cells, are just two of the systems Oxford scientists are investigating.

The exhibit also explores how DNA can be used as a construction material: making tiny 'walkers' that can control chemical reactions, as well as containers and geometric figures.

Quantum of spin is an exhibit all about the science of 'spin', a weird quantum property possessed by electrons and atomic nuclei.

Visit this stand to find out how technology harnessing spin is used in hospital MRI scanners, how Oxford research is examining whether it could be used to build the ultimate supercomputer, and even how robins and other birds find their way during their long migrations.

OK, computation

Pete Wilton | 29 Jun 2012

'It seems like Nature has some secret that lets it make complicated stuff in an effortless way,' Stephen Wolfram recently told an audience at Oxford University’s Mathematical Institute.

In his talk, which you can now watch online, Wolfram, the scientist behind Mathematica and Wolfram Alpha, explored how advances in computation could benefit mathematics.

One of the key ideas he put forward was 'computational irreducibility' – the idea that some computations cannot be sped up by any shortcut: the only way to figure out what is going to happen is to simulate each step.
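The textbook illustration of this, and one of Wolfram's own favourites (an assumption here; the talk summary does not name it), is the Rule 30 cellular automaton: a one-line update rule whose long-term behaviour, as far as anyone knows, can only be found by running every step. A minimal sketch in Python:

    # Rule 30: each cell's next value is left XOR (centre OR right)
    def rule30_step(cells):
        n = len(cells)
        return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
                for i in range(n)]

    cells = [0] * 31
    cells[15] = 1                          # start from a single black cell
    for _ in range(16):
        print("".join("#" if c else "." for c in cells))
        cells = rule30_step(cells)

Even from this trivial starting state the pattern shows no evident shortcut; predicting row one million seems to require computing the 999,999 rows before it.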

'People sometimes say that the reason the mathematics that we have is the way it is, is because that's what we need to describe the natural world, I think that's just not true,' he commented.

He suggested that much of the reason mathematics covers the areas it does is historical, building on work begun by the first mathematicians in ancient Babylon.

Computational irreducibility, he said, is a 'junior version' of undecidability – the idea that when you ask what will ultimately happen, the answer is something that is undecidable. Whilst there are over three million theorems in mathematics, these are all things that turned out to be decidable and provable.

There isn’t much undecidability in mathematics because maths is set up to examine those things its methods can make progress on: 'mathematics has navigated through these kind of narrow paths in which you don't run into rampant undecidability all over the place.'

Ask mathematical questions at random, he suggested, and you would soon run into undecidability. But perhaps through exploring the space of all possible theorems, using tools such as Wolfram Alpha, you might find new paths.

He described the point of Wolfram Alpha as being 'to collect as much knowledge as possible and make it computable', and said that this approach could be applied to find out which theorems about a particular structure or system were 'interesting' or 'powerful'.

A pilot study focusing on one particular area of maths, continued fractions, is already showing that the process of organising theorems in a way that’s systematically computable is leading to new advances, he said.

In contrast to the days when mathematicians did all of their calculations by hand, the future of mathematical practice could be that, by entering some details of a system, they would within seconds automatically see a range of theorems about it.

This would give a window on what he called a 'vast ocean of unexplored generalisation of mathematics that exists in this computational universe of possible systems.'

The talk took place at the Mathematical Institute on 12 June 2012.