Frankenfuels

Under Steven Chu, the US Department of Energy (DoE) has never been so science-friendly. The DoE has created and funded an ideas-incubation unit and has been busy with, in the energy secretary’s oft-quoted phrase, its “hunt for miracles” to transform America’s energy landscape.

The most substantial backing from the DoE’s Advanced Research Projects Agency-Energy (ARPA-E) so far has been the package of grants, totalling more than US$92m, that it announced in July. The largest single grant among them, of more than US$5m, went to HRL Laboratories (formerly Hughes Research Laboratories), whose partners include GM, for a project to develop gallium nitride, a semiconductor material, to make small, efficient battery switches so that electric vehicles can interact more efficiently with the grid. Most of the grants on that list go to similar projects aimed at better batteries, more efficient grids and the like.

Impressive as these innovations promise to be, others working outside the DoE’s aegis offer truly inventive breakthroughs. A team led by Michael Strano of the Massachusetts Institute of Technology has used a combination of nanotechnology and bioengineering to reproduce nature’s “self-repairing” mechanism for solar cells. As Science News describes the process on its website, it could lead to solar cells with an indefinite lifetime:

The researchers began with light-harvesting reaction centers from a purple bacterium. Then they added some proteins and lipids for structure, and carbon nanotubes to conduct the resulting electricity.

These ingredients were added to a water-filled dialysis bag — the kind used to filter the blood of someone whose kidneys don’t work — which has a membrane that only small molecules can pass through. The soupy solution also contained sodium cholate, a surfactant to keep all the ingredients from sticking together. 

When the team filtered the surfactant out of the mix, the ingredients self-assembled into a unit, capturing light and generating an electric current.

Just as remarkable, Synthetic Genomics is researching DNA engineering that may lead, as the New York Times reports, to a way of transforming coal into natural gas, or of cutting out the middleman, as it were, so that plants produce fuel directly rather than via an expensive and energy-intensive refining process. It is getting substantial backing from industry:

Exxon Mobil is giving Synthetic Genomics $300 million in research financing to design algae that could be used to produce gasoline and diesel fuel. (The new greenhouse will be used for that research.)

BP has invested in the company itself, turning to Synthetic Genomics to study microbes that might help turn coal deposits into cleaner-burning natural gas. Another investor, the Malaysian conglomerate Genting, wants to improve oil output from its palm tree plantations, working toward what its chief executive calls a “gasoline tree.”


Hydrogen’s long road ahead

A hydrogen-powered version of London’s black cab was launched today, with plans for a small fleet by 2012. Although there is optimism surrounding the prospects for the technology, there are still plenty of hurdles to leap before it becomes viable.

Hopes for an eco-vehicle revolution have taken another step forward with the recent launch of a hydrogen fuel cell hybrid version of London’s iconic black cab. But although a number of concerns have been dealt with, there is still a long way to go to fulfil a pledge by London’s mayor, Boris Johnson, to see all of the city’s taxis operating with zero emissions by 2020, and even further before hybrids can begin to challenge traditional fuel sources on the road.

The hydrogen fuel cell works in tandem with rechargeable lithium batteries. Either can power the cab’s engine, and the fuel cell recharges the batteries as well as providing power directly. The prototype has plenty going for it. The main attraction is the absence of the traditional cough and splutter of an exhaust pipe – the hybrid vehicle emits only heat and water vapour. With no particulate emissions (which degrade air quality), the new cab has the potential to win over Londoners, while its lack of CO2 emissions should prove attractive in the political context of the search for solutions to climate change.

However, there are still several obstacles. The cab’s zero emissions refer to what happens “at point of use” (i.e. what comes directly from the vehicle). The hydrogen itself must be created. If that is not done in a renewable way (and the infrastructure to do so does not currently exist), then the overall saving in carbon emissions will be far lower, since traditional hydrogen manufacture relies on burning fossil fuels.
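To put rough numbers on that caveat, the back-of-the-envelope sketch below compares a hydrogen cab’s well-to-wheel CO2 with a conventional cab’s under different assumptions about how the hydrogen is made. Every figure in it (hydrogen consumption, reforming emissions, the petrol cab’s output) is an illustrative assumption for the sake of the arithmetic, not a measurement of the actual vehicles.

```python
# Illustrative well-to-wheel comparison of a hydrogen cab with a conventional
# one. Every figure below is an assumption made for the arithmetic, not a
# measurement of the actual vehicles.

H2_PER_100KM_KG = 1.0         # assumed fuel-cell consumption, kg of hydrogen per 100 km
SMR_CO2_PER_KG_H2 = 10.0      # assumed CO2 from steam-methane reforming, kg per kg of H2
PETROL_CAB_CO2_PER_KM = 0.22  # assumed tailpipe CO2 of a conventional cab, kg per km

def hydrogen_co2_per_km(renewable_fraction: float) -> float:
    """Well-to-wheel CO2 per km when a given fraction of the hydrogen is made
    renewably (assumed zero-carbon) and the rest by reforming natural gas."""
    fossil_h2_per_km = (H2_PER_100KM_KG / 100.0) * (1.0 - renewable_fraction)
    return fossil_h2_per_km * SMR_CO2_PER_KG_H2

for renewable in (0.0, 0.5, 1.0):
    h2_co2 = hydrogen_co2_per_km(renewable)
    saving = 1.0 - h2_co2 / PETROL_CAB_CO2_PER_KM
    print(f"{renewable:>4.0%} renewable hydrogen: {h2_co2:.2f} kg CO2/km "
          f"({saving:.0%} saving versus the petrol cab)")
```

On these assumed figures, hydrogen made entirely from natural gas still cuts emissions per kilometre, but nowhere near to zero – which is the point the “at point of use” caveat is making.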

Transport is also tricky. Natural-gas pipelines cannot be used because hydrogen makes the steel tubing brittle and attacks the welds. Special production processes are needed to make pipes that can carry hydrogen, so few exist. The alternative is to liquefy the hydrogen and transport it in road tankers refrigerated with liquid nitrogen, at great expense.

Problems with a lack of infrastructure do not end at the manufacturing and transport level. Hydrogen hybrids face the same ‘chicken and egg’ problem as plug-in electric vehicles when it comes to fuelling: there are not enough fuelling points to create demand for vehicles, and too few vehicles to justify building fuel stations. London taxis are owned by their drivers, most of whom depend on their vehicles for their livelihoods. Although a great deal of work has been done to make sure the hybrid will be as reliable and durable as its petrol counterpart, a wholesale transfer to hydrogen will need more incentive.

Money is a major obstacle. John Russell, CEO of Manganese Bronze, the holding company for LTI Vehicles (which manufactures the traditional black cab), has said that there will be “tremendous funding issues” before a hybrid fleet becomes a reality. Creating enough fuelling stations to make it worthwhile to own a hydrogen hybrid will take a lot of investment. So will supporting the energy infrastructure that is necessary if the technology is to have any environmental impact beyond its improvement of air quality. Although that benefit ought not to be taken lightly, it will be difficult to justify expenditure on a “green fuel source” that is in some ways just as polluting as its predecessor.

Bacterial biofuels

The creation of “synthetic life” by American biologists has excited the world. But can it be used to revolutionise the biofuels industry?

The claim by American geneticists to have created synthetic life has, understandably, been picked up across the globe, with the main focus being on the ethical implications (although some scientists have argued that the “life” tag is an exaggeration). But the new technology is not merely a victory over nature. Claims have been made for its revolutionary implications in other areas, not least in the possibility of next-generation biofuels.

Craig Venter, who along with Hamilton Smith was responsible for the technique which placed entirely synthetic DNA inside a pre-existing bacterial cell, is clearly among the most optimistic about the prospects of his breakthrough. In an interview with the American Association for the Advancement of Science, Venter cited a deal signed with Exxon Mobil to investigate the possibility of creating algae that could manufacture hydrocarbons.

An important aspect of this aspiration is the fact that algae, unlike traditional sources of biofuel, would not be in direct competition with agriculture, and would therefore not be a drain on increasingly stretched food production. Exxon has faith in the concept; the company’s research and development deal with Venter’s Synthetic Genomics Inc is worth a potential $600 million.

Venter has admitted that the technique represents only “the final proof of concept”, rather than a substantive step towards designing life, but apparently believes the potential commercial applications of algae engineered to make biofuel from atmospheric carbon dioxide could be worth over a trillion dollars.

However, disagreement over the breakthrough is not confined to ethical debate. Some in the scientific community have questioned the real value of the technology compared with current biofuel applications, which can be realised using pre-existing organisms with just a few genes knocked out rather than via the painstaking process of Venter’s technique. Sir Paul Nurse, president-to-be of the Royal Society, has said he does not see the discovery transforming “the way we do biotech”, adding that pre-existing technology has much greater potential to transform the lives of individuals.

Others have pointed out that the new “life” is merely a copy of an already existing organism, the Mycoplasma bacterium. It was chosen because it is one of the simplest forms of life on the planet, and it is not suitable for biofuel production. Before the technology can be harnessed for human benefit, scientists would need to devise a specific genetic code that causes an organism to, for instance, produce biofuel.

The danger, then, is that the headline-grabbing nature of “synthetic life” may divert attention, and money, away from potentially more efficient means of producing biofuels. If Venter is right, however, the new technology will presumably allow far more subtle manipulation of organisms, giving humanity greater control than ever in the production of biofuels.

Brazil’s dilemma

Brazil is having to deal with the downside of success, on both the oil and the biofuel fronts.

As our recently updated Brazil Energy report points out, the country is still formulating its policy to tighten its grip on the oil sector, to ensure that the vast bulk of the wealth expected from its “pre-salt” offshore oil riches accrues to the public sector.

Don't be jealous

Even before production has begun to ramp up significantly, however, The Economist notes that proposals to distribute the oil wealth more evenly among Brazil’s states – rather than concentrating it in Rio de Janeiro and Espírito Santo, which have received the bulk of it so far – have caused consternation among the politicians who will lose out.

The country faces the downside of success on another front too. Helping Brazil to maximise the revenue from its oil windfall has been its achievement — originally a response to the oil crises of the 1970s — in becoming the biggest domestic user of biofuels in transport and the world’s largest exporter of sugar-cane ethanol. As our country report forecasts, government figures show that “flex-fuel” cars account for nearly 90% of car production, putting Brazil on track to have three-quarters of its car fleet capable of running on gasoline, biofuels or a combination of the two by 2020.

However, a story on Foreign Policy magazine’s website points out that this is a double-edged achievement:

As a result of its turn toward ethanol, Brazil avoided emitting 600 million tons of carbon between 1974 and 2004. So what’s with environmentalists who complain about ethanol — won’t they ever be satisfied?

While sugar cane ethanol is certainly less ecologically destructive than some other biofuels, the industry’s boosters have overlooked one key fact: You’ve got to plant sugar cane somewhere. One couldn’t pick a worse place to harvest cane than Brazil’s Atlantic rainforest. There, sugar cane crops have led to deforestation and, paradoxically, more carbon emissions.

In typically forthright fashion, President Luiz Inácio Lula da Silva has dismissed the criticisms – mainly, but by no means exclusively, coming from western environmentalists and like-minded governments – as motivated by jealousy. And it’s unlikely his successor will hold back the country’s march toward a biofuelled future.

Hydrogen vs EV, Round 5

At the very least, Suzuki’s Burgman Fuel Cell Scooter prototype (pictured) passes the embarrassment test: you wouldn’t feel like a fool tooling around the city on it, as you probably would in some electric vehicles like the Smart Car or a Toyota iQ, or certainly the old Sinclair C5 tricycle (remember them?).

Indeed, the bike is intended to look just like a conventional Suzuki Burgman and was developed with Intelligent Energy, a UK-based fuel-cell developer, to address some of the other issues that have held back commercial deployment of fuel-cell vehicles — its makers claim a range of 350km, a refuelling time of just a few minutes and engine power comparable to that of its conventional stablemate.

It is expected to be rolled out commercially sometime around 2015 as a realistic alternative to a standard combustion bike, and is designed for mass appeal, targeted specifically at urban use by London’s commuters. The unveiling of the prototype in London, with the backing of city authorities, also comes amid talk of a European Union move to crack down on motorbike emissions.

However, there are plenty of questions still to be answered, and the impetus in both the EU and the USA heavily favours electric vehicles (EVs) over hydrogen.

The initial price of the Suzuki Burgman fuel cell scooter will be relatively high — perhaps twice the current price tag of US$6,000 for a conventional bike — though that would fall if the bike found a large enough market. Bigger issues include the fragility of the engines, which can be highly susceptible to bumps and vibrations. Also, hydrogen is enormously expensive to store, and refuelling pumps are scarce.

The EV-versus-fuel-cell debate has not gone the fuel cell’s way recently, especially in the US, where the Nobel prize-winning energy secretary, Steven Chu, has come out forcefully against a viable future for the technology.

Intelligent Energy’s chief executive, Henri Winand, argues that a hydrogen fuelling network will be cheaper to develop in the long run than an electric one, as pumps can be built into existing petrol stations. EVs face infrastructure questions of their own: many countries still produce most of their electricity from coal, the dirtiest fuel source, and grids would require huge upgrades to cope with large-scale EV deployment.

For motorcycles, the lighter fuel cell also has a distinct advantage over heavier EV battery alternatives. Several vehicle manufacturers, including GM, Honda and Toyota, have cited 2015 as the year when affordable hydrogen vehicles will start to come onto the market, and investment in infrastructure is picking up pace accordingly. California already has a nascent hydrogen network, with over 25 refuelling stations; Germany has 30; and Sweden and Denmark are working to keep up with Norway, aiming to link up to create a Scandinavian hydrogen highway. Japan is also investing heavily in infrastructure, hoping to have built up to 60 stations by 2015.

Now, encouraged by the Suzuki launch, London is scheduled to have at least six refuelling points built in the run-up to 2012, as well as a fleet of fuel-cell taxis, buses and police cars planned for roll-out. The hope is that focusing on London’s commuters will strengthen the practical case for building a hydrogen fuelling network. Further backing in the UK has come from the energy minister, Philip Hunt, who has laid out a £7m investment plan to fund a hydrogen highway along a stretch of motorway into Wales.

EVs are still favoured among policy-makers (See Cleaning up cars) and roll-out in the US and elsewhere is well-advanced. But Suzuki’s new scooter is a stylish ambassador for the hydrogen case.

Hottest year on record?

[From The Economist]

The betting is that 2010 will be the hottest year on record. But understanding how the planet’s temperature changes is still a challenge to science.

It may seem implausible at the moment, as northern Europe, Asia and parts of America shiver in the snow, but 2010 may well turn out to be the hottest year on record. Those who doubt that greenhouse gases are quite the problem they have been cracked up to be by most of the world’s climatologists have taken comfort from the fact that the Hadley Centre, part of Britain’s Meteorological Office, reckons the warmest year since records began was 1998 (see chart 1). Twelve years without a new record would, the sceptics reckon, be rather a large lull in what is supposed to be a rising trend. Computer modelling by the Met Office, though, gives odds-on chances of the lull being broken.

The fact that no record high happened in the 2000s does not mean that there was no warming over the decade – trends at scales coarser than the annual continued to point upwards, and other authorities suggest there have been record years during the period. Nor was the length of time without an annual record exceptional. Models simulating centuries of warming normally have the occasional decade in which no rise in surface temperatures is observed. This is because heat can be stored in other parts of the system, such as the oceans, for a time, and thus not show up on meteorologists’ thermometers.

Indeed, one reason for thinking that the coming year will be hotter than all known previous ones is that the tropical Pacific is currently dumping heat. This phenomenon, by which heat that has been stored up in the sea over the previous few years is released into the atmosphere, is known as El Niño. A strong Niño contributed to the record temperatures in 1998. In 2007 and 2008 the opposite phenomenon, a cooling Niña, was happening. That goes some way to explaining why those years were chilly by the standards of the 2000s.

And on top of El Niño, there is the sun. The sun’s brightness fluctuates over an 11-year cycle. Though the fluctuation is not vast, it is enough to make a difference from peak to trough. In 2009 the sun was at the bottom of its cycle. Unless it is behaving particularly strangely, it should, over the next 12 months, begin to brighten.

The Met Office’s forecast was made using the Decadal Prediction System, or DePreSys (pronounced “depresses”, which is what it sometimes does to Doug Smith and his colleagues, who run it). Climate models are normally used to show how the climate’s behaviour will respond to changes in things like greenhouse-gas levels. But though a model’s response will, it is hoped, be similar to the real climate’s, models are caricatures, not portraits. Trying to force one into a state that looks exactly like the real climate at a specific time, as prediction requires, will distort it, and it is likely to misbehave as a result.

DePreSys is an attempt to work round this “initialisation” problem – to give the model’s caricature not just an all-purpose resemblance to the way the real climate behaves, but one that captures its pose and expression at a particular moment. In 2007 the first study using DePreSys correctly predicted that there would be a few more years which would set no records. After this, it said, there would be a definite rise in temperature. More recently, Dr Smith and his team have been using clusters of computers around Britain to run multiple models with slightly different initial conditions. Four-fifths of these runs suggest 2010 will be warmer than any previous year – which could be taken as odds of four-to-one on. The techniques are still in their infancy. But they are at least making predictions that can be checked.
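The “four-to-one on” figure is simply that ensemble fraction restated as betting odds. A minimal sketch of the conversion, using a made-up set of runs rather than actual DePreSys output, might look like this:

```python
# Turn the fraction of ensemble runs that predict a record year into betting
# odds. The list of runs is invented for illustration; it is not DePreSys output.
from fractions import Fraction

# True = this model run puts 2010 above every previous annual record.
runs = [True, True, True, True, False] * 4   # 16 of 20 runs, i.e. four-fifths

p = Fraction(sum(runs), len(runs))   # probability of a record year
odds_on = p / (1 - p)                # "odds on": chance of happening vs not happening

print(f"Probability of a record year: {float(p):.0%}")
print(f"Odds: {odds_on.numerator}-to-{odds_on.denominator} on")
```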

Balancing the books

Dr Smith and his colleagues are trying to predict some of the natural variability to come. Kevin Trenberth of America’s National Centre for Atmospheric Research wants to understand in detail the natural variability just seen. His quest gained unexpected prominence when one of his forcefully expressed e-mails on the subject – “The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t” – found its way into the public domain as one of thousands of e-mails from the Climatic Research Unit of the University of East Anglia in the “climategate” furore of November 2009.

Dr Trenberth was not, he has since been at pains to stress, saying that the relatively unwarmed 2000s were particularly out of the ordinary. Instead, he was saying that, given the panoply of satellites and measurement networks that are being installed to monitor the climate, it should now be possible to identify the places and processes that hide energy from the prying eyes of climatologists. That would make it possible to determine what has actually happened to the energy trapped by increasing levels of greenhouse gases.

For the first part of the decade this turns out to be possible. From 1998 to 2003, although surface temperatures were not rising, a lot of energy was mopped up by the oceans (see chart 2). This is borne out by the rise in sea level during the period, which matches (once the additional effects of melting glaciers and ice sheets are taken into account) the expansion of the water in the oceans caused by this heating. Until the middle of the 2000s, therefore, the sums seem to balance.

It is after that that the problem comes. Runoff’s role in the rising sea level has increased, meaning that the fraction attributable to expansion, and thus the amount of heat taken up by the sea, has fallen (and the chart therefore levels out). The missing heat must therefore be going somewhere else. One possibility is that it is being reflected back into space by changes in cloud cover. The data, however, seem to say no. America’s CERES programme, the result of observations by seven different instruments on six different satellites, suggests the Earth has actually absorbed more energy and reflected away less over the past few years, rather than the other way round. It is all rather mysterious.

Nevertheless, while there is a lot of scepticism in, around and about climate science, none of it is aimed at the first law of thermodynamics, which says that energy cannot be created or destroyed. The energy that the sun delivers to the Earth must therefore be equal to the energy that is reflected back into space, plus that re-emitted as infra-red radiation, plus that stored in some part of the atmosphere, the oceans or the land.
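In accounting terms the first law is a simple identity, sketched below with placeholder figures rather than the real satellite and ocean measurements; the point is the structure of the sum, not the numbers.

```python
# The planetary energy books as the first law requires them: incoming solar
# energy equals reflected energy plus re-emitted infra-red plus whatever is
# stored in the oceans, land and atmosphere. The figures are placeholders,
# not real CERES or ocean-heat-content data.

incoming_solar   = 340.0   # W per square metre, global average, top of atmosphere
reflected        = 100.0   # assumed shortwave reflected straight back to space
emitted_infrared = 239.0   # assumed outgoing longwave (infra-red) radiation

# Whatever is left over must be accumulating somewhere in the system.
implied_storage = incoming_solar - reflected - emitted_infrared

measured_ocean_uptake = 0.5   # assumed heat actually located by ocean measurements
missing = implied_storage - measured_ocean_uptake

print(f"Implied storage: {implied_storage:.1f} W/m^2")
print(f"Unaccounted for ('missing heat'): {missing:.1f} W/m^2")
```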

The fact that the books cannot currently be balanced is therefore an admission of ignorance – an ignorance that better, future measurements should help abolish. That, in turn, should allow predictions of what the climate will do next, for good or ill, to become significantly better, and thus deprive climatic bookies of their trade.

Summit fallout

The Guardian (UK) carried a fascinating insight into the failed Copenhagen summit from Mark Lynas, an Oxford-based climate-change consultant and activist who attended the inter-governmental sessions there. The piece was titled “How do I know China wrecked the Copenhagen deal? I was in the room”, which really requires little further explanation.

Whether or not one shares Mr Lynas’ position on anthropogenic climate change, CO2 policies and the rest, his account of the talks has a very convincing ring to it in terms of the process involved in one of these events. One doesn’t even have to agree with his conclusion that China is “to blame” for the failure to get the kind of deal that he, his fellow climate activists and indeed most governments appear to have wanted out of Copenhagen to understand what went wrong. For the truth is that all of these processes, whatever the “One World” pronouncements from the politicians involved, are ultimately a complex set of countervailing negotiations around national interests. The bigger the number of countries involved — 193 for Copenhagen, for goodness’ sake — and the more divergent those national interests, the less likely is the prospect of a meaningful outcome. In the case of Copenhagen, it was clear long before it commenced that there was little prospect of achieving anything conclusive. But the completeness of its failure exceeded even the most pessimistic expectations.

In a way, however, that failure — and the nature of the failure, as spelled out in Mr Lynas’ account — can be seen as a good thing. It is likely to focus governments more on what can — and should — be done at the national level, even if only rationalised on the grounds of pollution control, energy security, the promotion of economic and technological development and so on. There will, of course, be much debate at the national and inter-governmental levels about costs, monitoring, effective delivery systems and the like. But it may well be more rational if done in more manageable, less amorphous groups than the one seen to fail at the Copenhagen shambles. Already there is talk of a much smaller group of the most powerful — and most polluting — countries negotiating outside of a UN process, perhaps at a G30 level.

The UN’s top climate man, Yvo de Boer, generally a voice of reason in the process, could have had Mr Lynas in mind (as well as Ed Miliband, the UK’s energy and climate change secretary, others from the west, and those responding for China) when he called for the blame game to end and for the parties to move on to constructive talks. But sometimes a big, noisy row is needed so that everyone can see what really is what before moving on to the business at hand. And maybe the UN forum for debating and managing the climate-change process has run its course.