Over the last four years, solar cells made from materials called perovskites have reached efficiencies that other technologies took decades to achieve, but until recently no-one quite knew why.
Since perovskite was first used in 2009 to produce 3% efficient photovoltaic (PV) cells, scientists have rapidly developed the technology to achieve efficiencies of over 15%, overtaking other emerging solar technologies which have yet to break the 14% barrier.
Scientists at Oxford University, reporting in Science, have revealed that the secret to perovskites' success lies in a property known as the diffusion length, and worked out a way to make it ten times better.
'The diffusion length gives us an indication of how thick the PV film can be,' explains Sam Stranks, who led the discovery in Henry Snaith's group at Oxford University's Department of Physics. 'If the diffusion length is too low, you can only use thin films, so the cell can't absorb much sunlight.'
So why is the diffusion length so important?
PV cells are made from two types of material, called p-type and n-type semiconductors. P-type materials mainly contain positively-charged 'holes' and n-type materials mainly contain negatively-charged electrons. They meet at a 'p–n junction', where the difference in charge creates an electric field.
The cells generate electricity when light particles (photons) are absorbed, exciting electrons and leaving behind positively-charged holes. The electric field of the p–n junction guides excited electrons towards the n-side and holes towards the p-side. There they are picked up by metal contacts (electrodes), which allow them to flow around the circuit as an electric current.
'The diffusion length tells you the average distance that charge-carriers (electrons and holes) can travel before they recombine,' explains Sam. 'Recombination happens when excited electrons and holes meet, leaving behind a low-energy electron which has lost the energy it gained from the sunlight.
'If the diffusion length is less than the thickness of the material, most charge-carriers will recombine before they reach the electrodes so you only get low currents. You want a diffusion length that is two to three times as long as the thickness to collect almost all of the charges.'
The thickness of a solar cell is always a compromise – if they're too thin they won't absorb much light, but if they're too thick the charge carriers inside won't be able to travel through. Longer diffusion lengths allow for more efficient cells overall, as they can be made thicker without losing as many charge carriers. Scientists can get around this by arranging cells into complex structures called 'mesostructures', but this is a time-consuming and complicated process which has yet to be proven commercially.
Previously, researchers were able to get mesostructured perovskite cells to 15% efficiency, using a perovskite compound with a diffusion length of around 100 nanometres (nm). But by adding chloride ions to the mix, Henry's group achieved diffusion lengths over 1000nm. These improved cells can reach 15% efficiency without the need for complex structures, making them cheaper and easier to produce.
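To see why a tenfold jump in diffusion length matters, here is a toy numerical sketch (not the researchers' actual model): it assumes carriers survive the journey to an electrode with probability exp(-x/L), assumes light absorption follows a simple Beer–Lambert law with an illustrative ~100 nm absorption depth, and then finds the film thickness that maximises a crude 'absorbed × collected' output proxy for each diffusion length.

```python
import math

def collection_fraction(d_nm, L_nm):
    # Average survival probability for a carrier generated uniformly
    # through the film, assuming simple exponential decay exp(-x/L)
    # over the distance x to the electrode (a toy model only).
    return (L_nm / d_nm) * (1.0 - math.exp(-d_nm / L_nm))

def absorbed_fraction(d_nm, absorption_depth_nm=100.0):
    # Beer-Lambert absorption; the ~100 nm absorption depth is an
    # assumed illustrative value, not a figure from the article.
    return 1.0 - math.exp(-d_nm / absorption_depth_nm)

def output_proxy(d_nm, L_nm):
    # Crude proxy for cell output: light absorbed x carriers collected.
    return absorbed_fraction(d_nm) * collection_fraction(d_nm, L_nm)

def best_thickness(L_nm, thicknesses=range(10, 2001, 10)):
    # Scan candidate film thicknesses and keep the best one.
    return max(thicknesses, key=lambda d: output_proxy(d, L_nm))

for L in (100, 1000):  # diffusion lengths before and after the chloride tweak
    d = best_thickness(L)
    print(f"L = {L:>4} nm: best thickness ~{d} nm, "
          f"output proxy {output_proxy(d, L):.2f}")
```

Under these assumptions the optimum thickness for L = 1000 nm comes out several times larger than for L = 100 nm, with a much higher output proxy – the qualitative point behind the chloride result, though the real device physics is far more involved.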
'Being able to make 15% efficient cells in simple, flat structures makes a huge difference,' says Sam. 'We've made hundreds just for research purposes – it's such an easy process. I expect we'll be seeing perovskite cells in commercial use within the next few years. They're incredibly cheap to make, have proven high efficiencies and are also semi-transparent. We can tune the colour too, so you could install them in aesthetically-pleasing ways in office windows.'
That perovskite cells are showing commercial potential after such a short time is a testament to their fantastic properties. We could well be seeing perovskite cells with efficiencies of 20-30% within the next few years, offering the same power as standard silicon-based cells at a fraction of the cost.
'Now is a truly exciting time to be working in the field,' says Sam. 'It's such a rapidly-emerging field, I expect to see it evolve even further over the next couple of years. What's incredible is that all of these advances have been made in academic environments so far, but it won't be long before industrial manufacturers start looking at perovskite cells as serious contenders.'
This week, scientists and engineers from Oxford University and around the world will start work on the final designs for the Square Kilometre Array (SKA), soon to become the world's largest and most sensitive radio telescope.
The SKA will cover a combined collecting area equivalent to a dish of about one square kilometre, using thousands of dishes and millions of linked antennae spread across Australia and Southern Africa. It will be able to detect radio waves more accurately and sensitively than ever before, helping to answer some of the biggest questions in physics and astronomy. These include questions about dark matter and dark energy, and perhaps even the biggest question of all: is humanity alone in the universe?
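As a back-of-envelope check on what 'about one square kilometre' means, the sketch below works out how wide a single circular dish with that collecting area would have to be; the ~13.5 m individual dish size used for comparison is an illustrative assumption, not a figure from the article.

```python
import math

# One square kilometre of collecting area, in square metres.
area_m2 = 1_000_000.0

# Diameter of a single circular dish with that area: A = pi * (d/2)^2.
diameter_m = math.sqrt(4.0 * area_m2 / math.pi)
print(f"Equivalent single dish: ~{diameter_m:.0f} m across")

# How many individual dishes of an assumed ~13.5 m diameter would be
# needed to match that collecting area.
dish_diameter_m = 13.5
dish_area = math.pi * (dish_diameter_m / 2) ** 2
print(f"~{area_m2 / dish_area:.0f} such dishes")
```

The equivalent single dish would be well over a kilometre across – far beyond anything that could be built and steered as one structure, which is why the area is spread over thousands of smaller dishes and antennae instead.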
'After many years of planning and preparation it is very exciting that the SKA project is now moving into the detailed design phase,' said Professor Michael Jones, principal investigator of SKA at Oxford. 'In a few years this amazing scientific instrument will no longer be the stuff of dreams but will start to become a reality.'
The Oxford team will lead the design of electronic systems to digitise and combine signals from millions of low-frequency antennae and allow the telescope to point in multiple directions at once. This will be done in collaboration with the Rutherford Appleton Laboratory along with partners from industry and other universities.
Oxford is also one of the key universities involved in preparation for the scientific exploitation of the SKA, with members on several of the SKA Science Working Groups. They will play a major role in the development of the signal processing systems that will search for pulsars, one of the SKA's key science goals, and in the development of software and high-performance computing systems for the project.
Proteins which reside in the membrane of cells play a key role in many biological processes and provide targets for more than half of current drug treatments. These membrane proteins are notoriously difficult to study in their natural environment, but scientists at the University of Oxford have now developed a technique to do just that, combining the use of sophisticated nanodiscs and mass spectrometers.
Mass spectrometry is a technique which allows scientists to probe molecular interactions. Using a high-tech 'nanoflow' system, molecules are carried into the instrument in charged water droplets; as the droplets evaporate, the molecules are released into the gas phase of the mass spectrometer.
But membrane proteins are difficult to measure in this way because they are hydrophobic: they don't dissolve in water. One way to overcome this problem is to mix them with detergents, which surround insoluble substances with a water-friendly shell. Each detergent molecule has two ends – a head that is attracted to water and a tail that is attracted to insoluble regions of the membrane protein. The tails stick to the hydrophobic parts, leaving a shell of water-loving heads around the outside, so the protein can then dissolve easily in water.
Although detergents can be used to get membrane proteins to dissolve in water, these artificial chemicals can damage protein structures and do not faithfully mimic the natural environments in which they are normally found. The Oxford group, led by Professor Carol Robinson, has utilised a technique which allows them to study membrane protein structures by mass spectrometry from their natural environment. Their new method, published in Nature Methods, uses tiny disc-like structures made from molecules called lipids, as first author Dr Jonathan Hopper explains:
'Membrane proteins are naturally found in flat structures called lipid bilayers. Lipids are a bit like nature's detergents, in that they have water-loving heads and fat-loving tails. Lipid bilayers are made up of two sheets of lipids with their tails pointing inwards.
'The nanodiscs we use are made from lipids, the same material that surrounds membrane proteins in the body. It's essentially as if you took a round cookie cutter and removed a section of the natural bilayer, so the conditions are just as they would be in the body. The discs are stabilised by wrapping a belt of proteins around them to shield the exposed lipid tails from the water.
'Aside from the nanodiscs, we actually got great results from 'bicelles', which are made in a similar way. The main difference is that instead of putting a belt of proteins around the edge, we plug the gap with short-chain lipids instead. This actually gives us much more control over the size and structure of the disc.'
These innovations enable researchers to study membrane protein structures using sophisticated mass spectrometry, in environments as close to the human body as possible.
'I am delighted that this has worked, it is completely unexpected given the difficulties we have had in the past in studying these complexes in lipidic environments,' says study leader Professor Carol Robinson. 'The breakthrough enables us to study membrane proteins in a natural environment for the first time. We believe this will have a great impact on structural biology approaches, and could in turn lead to better-designed drug treatments.'
'The world urgently needs new medicines for many diseases such as Alzheimer's, depression, diabetes and obesity,' says Professor Chas Bountra. 'Yet the pharmaceutical industry's success rate for generating truly novel medicines remains low, despite investing tens of billions of dollars.'
What's going wrong? Why can't we depend on the vast commercial pharma industry to deliver the new treatments we need? Professor Bountra is in an ideal position to answer. He came from the drug firm GSK to lead the Structural Genomics Consortium at Oxford University, a public-private partnership that bridges academia and industry and produces data that is directly relevant for coming up with new drugs.
'What the pharma industry has done is recruit some of the smartest people on the planet, invested tens of billions in technology and infrastructure, and acquired promising companies,' he says. 'It's not that industry is doing anything wrong. The problem is that it's so difficult. The fundamental bottleneck is our ability to identify new targets for drug discovery.'
Those working in this area talk about 'targets'. If you have a biological molecule, most often a protein, that you find is critical in a disease process in the body, this is a target.
It is a target because you can throw tens and hundreds of thousands of small chemical compounds at it and see which of these would-be drugs stick. You might come away with a handful of compounds that bind your target protein and block the disease process. Now you have somewhere to start, you have some candidate drugs against this disease.
You'll want to optimise the chemical compound and run toxicology checks, and there will be years of clinical trials to determine whether it is safe and beneficial. But the starting point turns out to be crucial. If you don't know enough about the target and the disease process it affects, you may waste billions of pounds and years of effort, and expose patients to something that has no medical benefit – or worse, discover side effects you didn't know about.
Professor Bountra explains: 'There are around 22,000 different proteins in humans, any of which could be a target for a drug. There are hundreds of diseases and hundreds of subsets of diseases. What we can't do right now is say this protein will work in this subset of Alzheimer's patients.
'Pharma is extremely good at taking a candidate drug molecule through to market. None of us – and I include the whole global biomedical community in this – is good at selecting the right target for drug discovery.'
Peter Ratcliffe, Nuffield Professor of Medicine at Oxford University, is of exactly the same mind: 'It's almost self-evident that in starting drug development you need to start in the right place. We need to have the right molecular target.'
He is the director of the new Target Discovery Institute at Oxford University, an institute whose whole purpose is validating targets for drug discovery.
Researchers have just started moving into the TDI's impressive new building on the Old Road Campus. All clean lines, sharp angles and a glass frontage to guide you in, it brings the best biologists and chemists together with the latest genetic and cell biology technologies.
Modern biology research is delivering thousands of potential targets, Professor Ratcliffe says, but it is currently hard or impossible for scientists in pharma to know which are the most promising to pursue for new drugs. He believes that at least a portion of academic research should be more aligned to what industry needs to take things forward.
One of the examples Professor Ratcliffe gives is a set of enzymes called histone demethylases. These are involved in switching genes on and off in cells, and drugs targeting these proteins may be useful in cancer and inflammatory disease. But this work is still at a relatively early stage and there is a lot to be done to determine the range of effects that blocking these enzymes can have, and whether discrete medical benefits can be achieved. That's where the interest of the TDI comes in.
Forging successful partnerships between academia and industry is exactly what Professor Bountra has done at the SGC. This not-for-profit group, which with academic and industry partners worldwide determines the three-dimensional structures of proteins of importance to human health, places the data in the public domain, open and free to all. Knowing the structure of a protein is important in finding candidate drugs that bind this target.
More recently, the SGC began working further along the drug discovery chain in coming up with novel chemical compounds that block target proteins. Again the data and reagents are openly available to allow anyone to investigate them. Some novel drug compounds are already being taken forward by new biotech companies.
'We need to pool the strengths of academia and industry,' Professor Bountra believes, 'to create a more efficient, more flexible way of discovering new drugs. It is only by pooling resources and by working with the best people that we can hope to reduce costs and reduce risks in this very difficult task of discovering new drugs.'
Professor Ratcliffe adds: 'The failure of drug candidates at a late stage in large-scale trials is reasonably held to be the thing killing the pharma industry. We have to secure the rationale for developing a drug in the first place, and we have to make sure we don't find untoward aspects at a late stage.'
Both professors believe the work has wider importance for the British economy, particularly after many drug companies have downsized their research capacity in the UK. Making these Oxford projects a success, they say, can bring in drug-company investment, spin off new biotechnology companies and help retain highly skilled people in this country.
'I honestly think what is happening in Oxford is phenomenal,' says Professor Bountra. 'In the next one to two years, Oxford will be the academic drug discovery centre in the UK. What distinguishes Oxford is a culture that makes all of this work. We are all pulling in the same direction to help industry develop new medicines because society desperately needs new medicines.'
This article was originally published in Blueprint, the University's staff magazine.
The health of the ocean is spiralling downwards far more rapidly than previously thought, according to a new review of marine science.
The latest results from the International Programme on the State of the Ocean (IPSO) suggest that pollution and overfishing are compromising the ocean's ability to absorb excess carbon dioxide (CO2) from the atmosphere. IPSO's scientific team warns that the oceans won't be able to shield us from accelerating climate change for much longer and that mass extinctions of some species may be inevitable.
'What the report points to is our lack of understanding of both the role of the ocean in taking up CO2 and the impact of human activity on marine ecosystems,' Alex Rogers of Oxford University's Department of Zoology, Scientific Director of IPSO, told me.
The findings are published as a set of five papers in the journal Marine Pollution Bulletin; the papers came out of meetings hosted at Somerville College, Oxford.
'Our research at Oxford is trying to fill in these gaps in our knowledge about how carbon is transported in the deep ocean,' Alex explains. 'We need more research in particular into the active processes taking place as animals migrate up and down in the ocean every day.
'Animals such as deep-water fish will feed in surface waters at night, then migrate back down as deep as 1,600 metres. Animals like jellyfish repackage carbon ingested during feeding and excrete it as faecal pellets. We also see mass die-offs of deep-sea animals – how this contributes to the carbon cycle, and how it might be affected by climate change, is very poorly understood.'
Alex highlights how estimates of the biomass of fish in the 'twilight zone' (200-1,000 metres deep) were recently found to be out by a factor of ten, because it was not realised that these mesopelagic fish were actively avoiding underwater nets.
'That we can get the numbers out by this amount just demonstrates the poor level of knowledge about our oceans,' Alex comments.
Much more research is needed, he believes, if we are to understand how climate change both affects and is influenced by marine ecosystems.
Read more about Oxford research into the deep ocean in our stories about a 'lost world' of new species discovered deep beneath the Southern Ocean, and the origins of the hairy 'Hoff' crab.
