THE WASTE-TO-ENERGY PLANT OF THE FUTURE

The EU energy system will become more decentralised, decarbonised and locally integrated, and society will become more circular, in order to achieve a climate-neutral economy by 2050.

Waste-to-Energy plants will store energy; they will be able to desalinate water; they will capture and recover CO2; and much more. Circularity means that Waste-to-Energy plants will take care of the fraction of waste that cannot be directly recycled. All usable materials left over after the Waste-to-Energy process will then be recycled into products. Contributing to a sound use of natural resources, Waste-to-Energy plants will systematically recover minerals and metals to build roads, buildings, etc.; produce biological fertilisers for agriculture; and much more. Waste-to-Energy plants will therefore be integrated with recycling plants in waste management centres. Landfills will be minimised all over Europe and separate waste collection will be the rule everywhere.

Waste-to-Energy plants will be increasingly integrated into the urban fabric and will generate multiple opportunities for citizens, while safeguarding the environment. Their large scale will allow them to integrate sports centres (such as ski slopes, rock climbing gyms, skate parks, tennis courts, outdoor swimming pools, etc.) and edutainment activities to raise students’ and citizens’ awareness of waste management, energy production, engineering, etc. Restaurants, picnic spots, panoramic points and organised activities, including concerts, open-air cinemas and theatres, will be potential features of both the plants and the parks surrounding them. Finally, synergies with research centres will allow for the development of innovative technologies furthering the contribution of Waste-to-Energy to resource efficiency and decarbonisation.

Waste-to-Energy plants will generate jobs spanning a huge variety of professions, contribute to the economic growth of the city and keep the environment clean, fully aligned with the goals set by the United Nations in the 2030 Agenda for Sustainable Development. Increased feed-in from variable generators will mean that Waste-to-Energy plants will have to make sure they can use the energy they generate even when their electricity cannot be fed into the grid.

Text and image from ESWET Vision 2050 - https://www.eswet.eu/tl_files/eswet/5.%20Documents/ESWET_2050_Vision.pdf


In Sync: How A Steam-Era Machine Can Upgrade The 21st Century Electric Grid

The peculiar machine described in a 1920 issue of GE Review — essentially a giant engine designed to produce no mechanical power — seems like nothing more than a charming relic from the early years of electrification. Yet the device, known as the “synchronous condenser,” turns out to be far more than a steam-era oddity, as the engineers in charge of today’s electric grid are discovering. GE supplied its first large transmission grid condenser a century ago, a machine big enough to fill a room built for Ontario Hydro in Canada, and remains a leader in the field.

The first device was a hit. It worked by injecting the right amount of “reactive power” into the grid to keep alternating current (AC) flowing smoothly. You’ve never heard of reactive power? You are not alone. It is sometimes referred to as “phantom power” due to the critical but often overlooked role it plays in maintaining grid stability. By keeping voltage and current in sync, it helps prevent surges and brownouts and allows utilities to transmit power from faraway places. Gushed the author of a 1920 review: “As a means of controlling voltage on high potential long-distance transmission lines, the synchronous condenser is an absolute necessity.”

Cut to a hundred years later, when the engineers running today’s electric grid find themselves confronting oddly similar problems.

GE’s first large synchronous condenser helped move electric power generated by Niagara Falls to customers in distant cities. Today, GE remains a leader in the field, supplying the technology to utilities changing their energy mix and embracing solar and wind power. Top image credit: Getty Images. Above: GE Power.

Electric grids in 1920 and 2020 have much in common. Today’s solar plants and wind farms face the same dilemma as the pioneering hydroelectric utility: lots of renewable energy to generate electricity, but also big obstacles to moving that power to customers when they want it. And as more users and producers connect to an expanding grid, the grid’s current and voltage often get knocked out of phase.

This is a big problem. You may remember from school that alternating current (the type of current in your home socket) oscillates around zero and periodically reverses its direction. As a result, the work done by an AC grid’s power can fall to zero if the voltage and current are completely out of phase and cancel each other out; the closer they are to being perfectly in sync, the more efficiently the power flows. The job of synchronous condensers is to ensure the grid stays in phase by injecting reactive power into it as needed.
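
To make the phase relationship concrete: the average real power delivered equals the RMS voltage times the RMS current times the cosine of the phase angle between them. Here is a minimal sketch of that standard textbook formula; the voltage and current values are illustrative, not from the article.

```python
import math

def average_ac_power(v_rms: float, i_rms: float, phase_deg: float) -> float:
    """Average real power of an AC supply: P = V_rms * I_rms * cos(phi)."""
    return v_rms * i_rms * math.cos(math.radians(phase_deg))

v, i = 230.0, 10.0  # a typical UK socket voltage and an example current
for phase in (0, 30, 60, 90):
    print(f"phase {phase:2d} deg -> {average_ac_power(v, i, phase):7.1f} W")
# phase  0 deg ->  2300.0 W  (in sync: all the power does useful work)
# phase 90 deg ->     0.0 W  (fully out of phase: no net power is delivered)
```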

The modern version of the issue is particularly acute because of the increasing number of giant renewable energy projects being planned in areas far removed from the population centers that will use that energy. Record-setting wind farms, in locations such as the North Sea and Wyoming, need efficient ways to deliver their power to big cities. Further, solar and wind farms generate power in a very different way from conventional big power plants, and they do not provide significant grid strength and inertia.

Think of the conventional power plant as a big freight train and the solar and wind farms as many race cars. When a “disturbance,” like a large animal, steps in front of the freight train, the train continues down the track with hardly any change in speed. But when the same animal steps in front of the race cars, there is a big crash and the cars are greatly affected. A disturbance on the transmission grid is similar: without the large conventional power plants, the grid will crash. By adding synchronous condensers to a grid with lots of renewable energy generation, we improve its strength and inertia, allowing the grid to continue operating in a stable fashion when a disturbance occurs. With all those gigawatts zipping around the planet, a technology that can improve stability on the transmission grid becomes ever more valuable.

Synchronous condensers inject “reactive power” into the grid to keep alternating current (AC) flowing smoothly. Image credit: GE Power.

Researchers said in a recent Journal of Electrical Engineering article that the location of synchronous condensers on the grid has a major effect on their effectiveness. Their models show that adding a combination of a new wind farm and a condenser to the edge of the existing grid can dramatically improve grid stability.

The modern version of the synchronous condenser offers some 21st century improvements over its predecessors. While the 1920 version of the machine relied on rheostats and other analog control systems, today’s condenser uses digital controls to dispatch the reactive power in an instant.

When a tree branch falls on a main power line — or some other event causes a fault on the main grid — the voltage plummets temporarily and the entire grid can collapse, unless a condenser steps in with a large and rapid injection of reactive power to recover the voltage.

The long distances between generation and users can also lead to an increase in the network’s vulnerability to disturbances such as short-circuits, which is another problem modern synchronous condensers are suited to countering.

Might the century-old technology reveal other surprising talents as the electric grid continues to evolve? The authors of the Journal of Electrical Engineering article recommend optimism: “Although being an old technology, the synchronous condenser remains an interesting topic for future work.”


Moixa, Honda and Islington Council launch ‘innovative’ V2G project

Honda's Waite, Cllr Champion and Moixa's Wright with one of the new chargers. Image: Moixa.

Islington Council has launched a vehicle to grid (V2G) project with energy technology firm Moixa and automotive giant Honda.

Five new bidirectional V2G chargers have been installed in the car park behind Islington Town Hall, allowing the council’s new Nissan e-NV200 electric vans not only to charge, but also to provide storage for the building.

The council has a net zero target of 2030 and a fleet of nearly 500 vehicles that it is now transitioning away from internal combustion engines to help meet this goal.

At a launch event at the building yesterday (16 January), Councillor Rowena Champion, Islington Council’s executive member for environment and transport, welcomed the project, saying:

“Decarbonisation of our own fleet is absolutely imperative, not just in relation to climate change, but also to air quality. This project is very much part of that, but it's also much more.

“Smart charging allows power withdrawn from vehicles to be utilised where it’s needed most. Crucially, it allows us to make use of the greenest energy available, with the benefit that brings.”

The chargers were made by Swiss company EVTEC and jointly developed with Honda. They have both Combined Charging System (CCS) and CHAdeMO connections: while CCS remains dominant in the UK, Honda suggested at the event that CHAdeMO is starting to take over in Europe. As such, the chargers are designed to be interoperable and forward-thinking.

They have been installed with Moixa’s GridShare software, which will allow the vehicles to work as storage for the council building. They will be able to charge when electricity is cheap and green, usually at night when there is a large amount of wind and nuclear on the grid and a low level of demand.

This energy can then be discharged back into the council building’s system as and when it is needed. That can help balance the council’s electricity system, which can at times be stretched, especially when there are music concerts taking place at the onsite venue.
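
Moixa has not published GridShare’s internals, but the charge-when-cheap-and-green, discharge-on-demand behaviour described above can be sketched as a simple threshold rule. Everything below is illustrative: the decide() helper and its price and carbon thresholds are hypothetical stand-ins, not Moixa’s actual logic.

```python
def decide(price_p_per_kwh: float, carbon_g_per_kwh: float,
           building_demand_kw: float, supply_limit_kw: float) -> str:
    """Toy charge/discharge rule for a V2G fleet (not Moixa's algorithm).

    Charge when electricity is cheap and low-carbon (e.g. windy nights);
    discharge when the building's demand approaches its supply limit,
    such as during a concert at the town hall venue.
    """
    if building_demand_kw > supply_limit_kw:
        return "discharge"  # vehicle-to-building support
    if price_p_per_kwh < 8 and carbon_g_per_kwh < 150:
        return "charge"     # cheap, green energy available
    return "idle"

print(decide(price_p_per_kwh=5, carbon_g_per_kwh=120,
             building_demand_kw=40, supply_limit_kw=250))   # -> charge
print(decide(price_p_per_kwh=14, carbon_g_per_kwh=250,
             building_demand_kw=260, supply_limit_kw=250))  # -> discharge
```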

Events within this space have been known to trip the electrics in the town hall, leading Moixa’s chief technology officer Chris Wright to quip that the chargers can be seen as “vehicle to party".

Flexitricity is helping to provide the demand response services for the project, and says it is “delighted” to be involved in the innovative project.

The company’s chief strategy officer and founder Alastair Martin said: “There is huge value in flexibility and Flexitricity’s job is to make sure all types of energy users have access to it – from large commercial and industrial energy users down to every EV customer.

“It’s very exciting to see how partnerships like this one are moving us closer to our vision of a greener and fairer energy system. The initial tests went very smoothly and the operators in our 24/7 control room in Edinburgh are poised and ready to dispatch the flexibility from the EV fleet to help National Grid balance the system, driving revenue and savings for Islington Council in the process.”

The chargers will operate within the Town Hall's capacity limits, and received approval from UK Power Networks (UKPN) prior to installation. Moixa is currently in talks with UKPN about providing further constraint management services to other parties in the future.

The chargers in Islington Town Hall have a capacity of up to 10kW each, while the building has a baseload of 50kW and a total site capacity of 250kW.
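
Those figures suggest the installation sits comfortably within the site’s limits; a quick back-of-the-envelope check, using only the numbers quoted in this article:

```python
num_chargers = 5         # bidirectional V2G chargers, per the article
charger_kw = 10          # capacity of each charger
baseload_kw = 50         # town hall baseload
site_capacity_kw = 250   # total site capacity

peak_load_kw = baseload_kw + num_chargers * charger_kw
print(f"All chargers drawing at once: {peak_load_kw} kW "
      f"({site_capacity_kw - peak_load_kw} kW of headroom)")  # 100 kW (150 kW headroom)

# Discharging instead, the fleet can in principle cover the entire baseload:
print(f"Max V2G discharge: {num_chargers * charger_kw} kW "
      f"vs {baseload_kw} kW baseload")                        # 50 kW vs 50 kW
```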

It is hoped that the chargers will be able to increase energy security, lower costs and provide environmental benefits for the council. When the entire fleet has transitioned to electric, it will cut 1,400 tonnes of carbon dioxide emissions a year.

Councillor Champion added: “We’re working to ensure our residents have clean air to breathe, while also saving money that can be spent on delivering essential services for the people of Islington. We’re working with industry leaders – Honda and Moixa – to electrify our fleet in the most effective way for our residents and acting as a pioneer for others to follow.”

The council is currently on target to have reduced emissions by 40% from 2005 levels by 2020.

The project will last for twelve months, and is the first in Honda and Moixa’s partnership, after they announced that Moixa would be the automotive company’s European smart charging partner in early 2019.

Jorgen Pluym, project leader of energy management at Honda Motor Europe, said: “As the shift towards electrification accelerates, we must continue to innovate with projects like these – helping to drive awareness and uptake of charging solutions and advanced vehicle-to-grid technologies. Honda is committed to promoting sustainable future energy management in Europe, and this project in Islington represents an important part of our vision for future energy solutions.”

Since their partnership was announced, Honda has led an £8.6 million funding round in Moixa, as the latter targets ‘millions’ of batteries under management.

At the event, Honda's Matthew Waite, department manager for energy management projects, said that the company would be making further energy announcements in 2020.

The project is designed to be an exemplar for other councils and government organisations, to help electrify fleets in an effort to meet net zero. There are 4,844 council-managed vehicles in London alone, 90% of which are still diesel powered.

Moixa’s Wright added: “The EV revolution will put millions of ‘batteries on wheels’ on our roads in the next decade. By using AI-driven charging technology, we can intelligently manage these fleets of batteries, securing lowest-cost charging and highest-impact carbon savings.

“Our project with Honda and Islington shows what is possible and provides a blueprint for all large organisations to follow.”


Green Alliance warns of ‘plastic alternative’ impacts

9 JANUARY 2020 by Claudia Glover at letsrecycle.com 

Environmental think-tank the Green Alliance has warned that companies are on the verge of switching from plastic to materials with a “greater environmental impact”, because of public pressure.

The warning comes in the wake of a report – Plastic Promises – which was released this morning (9 January). It is based on interviews with UK supermarkets and brands and forms part of Green Alliance’s work programme for the Circular Economy Taskforce.

Questioning the practicalities for retailers of moving away from single-use packaging, the report suggests that some plastic alternatives could turn out to be harmful in the long run, when looking at factors such as carbon emissions.

One of the examples highlighted was plastic bags. The report explained that UK supermarkets, including Morrisons, Tesco and Sainsbury’s, have recently switched away from single-use plastic bags for loose produce and bakery items, for example, replacing them with single-use paper bags.

“This is a worrying trend, as paper bags, which are often just as unnecessary as their plastic counterparts, can have much higher carbon impacts, though this can depend on material sources and product specification,” the report noted.

Summing up the findings of the report, Green Alliance said that “in the absence of government direction, a disjointed and potentially counterproductive approach to solving plastic pollution is emerging”.

Biodegradable

A “particular concern” raised by the Green Alliance in the report is compostable or ‘biodegradable’ plastic. While the research suggests that over 80% of consumers think this is an environmentally friendly product, said the think-tank, there is “little understanding” of what the terms mean and how the material should be dealt with once used.

The report explained that the retailers consulted “wanted a clearer approach to where it should be used and how it should be marked to avoid causing more problems”.

Unintended consequences

The Green Alliance said that all of the interviewees who took part in the report felt that decisions to switch away from plastic are often made without considering the environmental impact of the substitute materials chosen, or whether or not there is adequate collection and treatment infrastructure in place for them.

One respondent called the process “fairly quick and fairly cut and dry”, prompted by a mandate to office managers to “be more environmentally friendly” which results in “a kneejerk reaction to exit plastic,” the study claimed.

“Avoiding unintended consequences should be at the forefront of everyone’s minds.”

Adam Read, Suez

‘Careful’

The research was conducted for the Circular Economy Task Force, a business group convened by the Green Alliance which includes major waste companies such as Suez, Veolia and Viridor.

Commenting on the report from a Suez perspective, Adam Read, external affairs director at SUEZ recycling and recovery UK, said short term decisions could cause problems in creating a “true circular economy”.

Mr Read added: “As the war on plastics continues to rage, avoiding unintended consequences should be at the forefront of everyone’s minds, and that includes government, industry and, of course, consumers. Change must be managed and planned if we’re to move towards fully closed-loop systems for recycling and, more importantly, reuse.”

‘Reality check’

Richard Kirkman, Veolia UK’s chief technology and innovation officer, said: “This report is a reality check – it shows what’s happening with plastics on the ground and why we need to keep a level head.

“Let’s follow the science and ensure producers and consumers make sound material choices in line with the progressive resources and waste strategy.”

Dan Cooke, head of sustainability at Viridor, said kneejerk reactions can cause ‘frustration’ for recycling companies

‘Kneejerk’

Dan Cooke, head of sustainability at Viridor, said: “The often kneejerk reactions of some buyers and brands can cause frustration for recycling companies as they move away from inherently recyclable packaging types into materials like coated cardboard and composites that are less recyclable and that can have a worse environmental impact.

“We work closely with supermarkets and brand owners on recyclability and to align recycling services with their requirements.”



Blackout investigation: What went wrong at Hornsea One and Little Barford

Image: Getty.


Ofgem's investigation into the 9 August blackout has detailed why Hornsea One and Little Barford - the offshore wind farm and CCGT plant at the heart of the event - remained disconnected from the grid.

As a result, Orsted and RWE have agreed to make a voluntary payment of £4.5 million to Ofgem’s redress fund.

The investigation has also made recommendations for the distribution network operators (DNOs), as well as shedding light on the role of distributed generators in the blackout and the need for greater visibility of these assets.

Hornsea One

At 16:52:33, a lightning strike caused a fault on the Eaton Socon – Wymondley 400kV line. Whilst this was rectified within 80 milliseconds, around 150MW of distributed generation disconnected from the local distribution networks due to a safety mechanism known as vector shift protection.

At Hornsea One, the onshore control system operated as expected during the incident. However, the offshore wind turbine controllers reacted incorrectly to voltage fluctuations on the offshore network. This caused instability between the onshore control system and the individual wind turbines, and it is this instability that triggered two modules to shut down, with Hornsea One de-loading from 799MW to 62MW.

Orsted’s internal investigation identified that the stability issue had first occurred around ten minutes prior to the incident on 9 August, but had not caused de-loading at that time.

Orsted also provided Ofgem with information on its modelling prior to 9 August, which showed problems with the voltage control system when operating at the wind farm’s full capacity of 1,200MW. These findings were not shared with the ESO at the time. A software update to resolve the issue had been scheduled for 13 August; following the blackout, it was brought forward and implemented on 10 August.

Orsted also did not notify the ESO when the wind farm de-loaded by 737MW, and it began the process of restarting two of its modules without coordinating with the ESO.

Hornsea One has acknowledged that it did not meet its Grid Code requirement to remain connected and transiently stable following a fault on the transmission system, with power output recovering to at least 90% within 0.5 seconds, having de-loaded following the fault.

In addition, it has accepted that it did not meet the Grid Code requirement to have an overall voltage control system that appropriately dampens or limits swings.

Little Barford

Within a second of the fault, Little Barford’s steam turbine, which was generating 244MW, tripped. The trip occurred because of a discrepancy between the three independent speed sensors on the turbine, which exceeded the tolerance of the control system. However, the root cause of the discrepancy has not been established.

This resulted in a total loss of generation of over 1,131MW within one second of the fault (737MW at Hornsea One, 244MW at Little Barford and around 150MW of distributed generation). This caused the frequency of the electricity system to fall at a rate of change of frequency (RoCoF) above 0.125Hz/s, which then resulted in an estimated 350-430MW of distributed generation tripping off unnecessarily, according to the investigation.
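
The RoCoF figure follows from the classic swing equation: the initial rate of frequency fall equals the lost power times the nominal frequency, divided by twice the system’s stored kinetic energy. A rough sketch of that relationship; the kinetic energy value is an assumed, illustrative number, not one taken from Ofgem’s report.

```python
def initial_rocof(lost_mw: float, f0_hz: float, kinetic_energy_mws: float) -> float:
    """Initial rate of change of frequency after a sudden generation loss,
    from the swing equation: df/dt = dP * f0 / (2 * E_kinetic)."""
    return lost_mw * f0_hz / (2 * kinetic_energy_mws)

# ~1,131 MW lost within a second of the fault; 200,000 MW.s of stored kinetic
# energy is an assumed, illustrative figure for the GB system at the time.
print(f"{initial_rocof(1131, 50, 200_000):.3f} Hz/s")
# -> 0.141 Hz/s, above the 0.125 Hz/s setting at which some embedded
#    generators' loss-of-mains protection tripped
```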

Frequency response was then activated, and the fall in frequency was arrested 25 seconds after the fault at 49.1Hz, with the frequency plateauing at 49.2Hz after 45 seconds – below the minimum of 49.5Hz set in the Security and Quality of Supply Standard (SQSS).

However, Ofgem criticised the frequency response as "inadequate", with primary responders under-delivering by 17% and secondary by 14%. Despite this, Ofgem doesn't believe better response and reserve delivery would have stopped demand being disconnected.

Mandatory response providers and commercial Fast Frequency Response providers of dynamic primary response under-delivered by approximately 25%.

Around a minute after the fault, a gas turbine generating 210MW at Little Barford was shut down for safety due to too much steam pressure in its pipework.

The second gas turbine at Little Barford generating 187MW was manually tripped by plant staff around a minute and a half after the initial fault due to safety concerns.

RWE Generation - which owns Little Barford - has acknowledged the role it played in contributing to the power outage by not continuing to provide power to the system following the fault, Ofgem said in its report.

Distributed generation and the DNOs

Ofgem’s investigation also found an estimated distributed generation loss across the event of between 1,300MW and 1,500MW. At least 500MW was lost due to loss of mains protection settings in the first second of the event, and over 200MW tripped when the system reached 49Hz.

When the system frequency dropped below 48.8Hz, the DNOs disconnected 892MW of net demand in a process known as Stage 1 of Low Frequency Demand Disconnection (LFDD).

The ESO reported that the net demand reduction seen by the transmission system was 350MW, meaning around 550MW of additional distributed generation was lost.

While most DNOs met requirements regarding LFDD, there were some issues with essential services such as trains and hospitals being disconnected.

However, it is difficult to isolate individual sites during LFDD, so such essential sites should have their own backup generation in place.

There was also some “concerning” evidence that some DNOs disconnected generation via LFDD that was providing frequency response or reserve services.

Ofgem also concluded that DNOs don’t collect and record enough data on distributed generation, which highlights the “substantial improvements” needed for DNOs to transition into distribution system operators (DSOs).

More granular data collection is needed, as the majority of data the DNOs provided Ofgem only offered a partial view of which distributed generators tripped.

The regulator is continuing to review the behaviour of distributed generators during the event, and will consider the appropriateness of opening investigations into any licensed parties’ compliance with Distribution Code requirements regarding distributed generators’ protection settings.

Recommendations for the DNOs and distributed generation:

  • The ESO and DNOs should review the timescales for the Accelerated Loss of Mains Change Programme and consider widening its scope to include other distributed generation that unexpectedly disconnected or de-loaded on 9 August. This should be done through the Energy Networks Association (ENA), which should put forward its recommendations to the E3C by April.
  • Ofgem and BEIS should undertake a joint review of the regulatory compliance and enforcement framework for distributed generators, engaging with the industry in Spring 2020.
  • The E3C, through the DNOs and ENA, should undertake a fundamental review of the LFDD scheme, reporting its progress to Ofgem and BEIS on a quarterly basis.
  • Ofgem should consider options to improve the real-time visibility of distributed generation to the DNOs and the ESO.

Britain’s electricity since 2010: wind surges to second place, coal collapses and fossil fuel use nearly halves

In 2010, Great Britain generated 75% of its electricity from coal and natural gas. But by the end of the decade*, these fossil fuels accounted for just 40%, with coal generation collapsing from the decade’s peak of 41% in 2012 to under 2% in 2019.

The near disappearance of coal power – the second most prevalent source in 2010 – underpinned a remarkable transformation of Britain’s electricity generation over the last decade, meaning Britain now has the cleanest electrical supply it has ever had. Second place now belongs to wind power, which supplied almost 21% of the country’s electrical demand in 2019, up from 3% in 2010. As at the start of the decade, natural gas provided the largest share of Britain’s electricity in 2019 at 38%, compared with 47% in 2010.

Chart: Dr Grant Wilson, University of Birmingham. Source: Elexon and National Grid. Author provided.

As we predicted last January, 2019 saw the annual total for coal generation drop below solar and into seventh place for the first time. Britain’s renewables also generated more electricity than coal and natural gas combined over a month for the first ever time in August.

Besides the reduction in carbon emissions, there was another remarkable shift in Britain’s electrical system during the 2010s. The amount of electricity consumed fell by nearly 15% between 2010 and 2019, with the economy using 50 terawatt hours (TWh) less electricity in 2019 than it did in 2010. That’s enough electricity to power half of Britain’s cars and taxis, if they were all electric vehicles.

Some of this reduction can be attributed to greater energy efficiency, such as more LED lighting, and the fact that more goods were imported, rather than manufactured within Britain. With wages stagnant since 2010, it’s likely that lower economic demand also contributed.

Chart: Dr Grant Wilson, University of Birmingham. Source: Elexon and National Grid. Author provided.

The rise of renewable generation and the fall in electrical demand allowed coal power to be transitioned off the system. Britain’s electrical grid was coal-free for over 3,700 hours in 2019, something that would have been unthinkable ten years ago.

Winds of change

Wind energy set a new record in December 2019, providing 26.5% of the UK’s generation for the month. Including solar, hydroelectric and biomass, renewables provided nearly 37% of the month’s electricity overall, with wind energy reaching a peak of nearly 17 gigawatts (GW) during the afternoon of December 10.

Since August 2018, renewables have produced more electricity than nuclear power for 17 months straight. Nuclear fell to less than a fifth of electricity generation in 2019, its lowest level since 2008 due to extended maintenance periods at six nuclear power stations. This helped the annual output of wind energy to surpass nuclear for the first time in 2019.

But the 2020s will prove an even greater challenge for decarbonisation, not least as Britain’s economy is still heavily dependent on fossil fuels for transport, heating and hot water. Sales of electric vehicles in Britain are accelerating, with a quarter of a million now on Britain’s roads – but how to decarbonise heating is still up for debate.

Eggborough coal power station, which was decommissioned in 2018. Phil Silverman/Shutterstock

Encouragingly, due to cleaner electricity, a major milestone for electric heating is likely to have been reached in 2019: a kilowatt hour (kWh) of electricity taken from the grid to heat buildings or hot water is now, on average, less carbon intensive than getting a kWh of heat from a modern gas boiler burning natural gas. This means that even a simple electric heater releases, on average, less carbon than burning natural gas.
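
The crossover can be illustrated with rough carbon intensities. All of the numbers below are ballpark assumptions chosen for the arithmetic, not figures from this article.

```python
GAS_G_PER_KWH_FUEL = 184   # direct emissions from burning natural gas (assumed)
BOILER_EFFICIENCY = 0.85   # modern condensing gas boiler (assumed)
GRID_G_PER_KWH = 200       # rough GB grid average around 2019 (assumed)

gas_boiler = GAS_G_PER_KWH_FUEL / BOILER_EFFICIENCY  # ~216 gCO2 per kWh of heat
electric_heater = GRID_G_PER_KWH / 1.0               # resistive heating, ~100% efficient
heat_pump = GRID_G_PER_KWH / 3.0                     # heat pump with a COP of ~3

for name, g in [("gas boiler", gas_boiler),
                ("electric heater", electric_heater),
                ("heat pump", heat_pump)]:
    print(f"{name:15s} {g:4.0f} gCO2 per kWh of heat")
```

On these assumptions the plain electric heater just beats the gas boiler, and a heat pump beats both by a wide margin.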

But natural gas demand varies far more over a day and between seasons than electricity demand does, so a wholesale shift from natural gas to electricity is a significant challenge. Using low-carbon gases such as hydrogen is one option for decarbonising Britain’s heat supply; electric heat pumps are another. Without a sustained focus on shifting heat and transport away from fossil fuels, Britain will fail to become a net-zero carbon economy by 2050.

What lies ahead in the 2020s?

Scaling up renewable energy generation has catapulted Britain through a decade of electrical system change, but to capitalise on this momentum in the 2020s, low-carbon energy must be complemented with low-carbon flexibility. That must mean the growth of industries focused on energy storage, demand reduction and management, and local control systems, ensuring the system can continue to meet demand at all times.

After a promising decade of decarbonisation, despite policy setbacks like the Green Deal, the race is on to be the first G7 country to attain a net-zero carbon economy. Showing that it is possible to fully decarbonise a large economy while remaining internationally competitive would send an important message to the world.

The next decade will see even more renewable energy deployed, such as the Hornsea Project One – a 1.2 GW offshore wind farm, due to be completed in 2020. But what else do the 2020s hold? Here are our energy predictions for the next ten years:

  1. Britain will install an additional 30 GW of marine energy generation, including offshore wind, wave, tidal flow and tidal range.
  2. Over 10,000 “active buildings” will be built. These are highly energy efficient buildings integrating renewable energy technologies for heat, power, and transport with different types of heat and electrical storage.
  3. Over 80% of new cars sold will be battery electric vehicles.

We would very much welcome your predictions for 2030 in the comments below, as a snapshot of current thinking at the beginning of 2020. It will be great to look back at these in the future.


*The electrical generation data is from Elexon and National Grid. Data from other analyses (such as BEIS or DUKES) will differ due to methodologies and additional data, particularly by including combined heat and power, and other on-site generation which is not monitored by Elexon and National Grid. Renewables in this analysis = wind + solar + hydro + biomass.



Climate change hope for hydrogen fuel

Gas hob. Natural gas can be blended with hydrogen for a greener mix.

A tiny spark in the UK’s hydrogen revolution has been lit – at a university campus near Stoke-on-Trent.

Hydrogen is a relatively green alternative to fuels that produce greenhouse gases.

The natural gas supply at Keele University is being blended with 20% hydrogen in a trial that's of national significance.

Adding the hydrogen will reduce the amount of CO2 that’s being produced through heating and cooking.

Critics fear hydrogen will prove too expensive for mass usage, but supporters of the technology have high hopes.

Using natural gas for heating generates about a third of the UK emissions that are driving global warming.

But the only product of burning hydrogen is water.

How does it work?

As a fuel, hydrogen functions in much the same way as natural gas. So staff in the university canteen say cooking on the 20% hydrogen blend has made no difference to their cooking regime.

The project – known as HyDeploy - is the UK’s first live trial of hydrogen in a modern gas network. Keele was chosen because it has a private gas system.

Its hydrogen is produced in an electrolyser - a device that splits water (H2O) into its constituents: hydrogen and oxygen. The machine is located in a glossy green shipping container in the corner of the university’s sports field.

The gas distribution firm Cadent, which is leading the project, says that if a 20% blend were to be rolled out across Britain, it would reduce emissions of CO2 by six million tonnes - equivalent to taking 2.5 million cars off the road.

The hydrogen could be generated pollution-free by using surplus wind power at night to split water molecules using electrolysis.

Why not add more than 20% hydrogen?

The 20% proportion was chosen because it’s an optimal blend that won’t affect gas pipes and appliances.
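
One consequence of blending by volume is worth spelling out: hydrogen carries roughly a third of natural gas’s energy per cubic metre, so a 20% blend by volume supplies only around 7% of the energy, and the CO2 saving per unit of heat is similarly modest. A quick check using approximate volumetric heating values; these are standard reference numbers, not figures from this article.

```python
HHV_H2_MJ_M3 = 12.7    # hydrogen, higher heating value per cubic metre (approx.)
HHV_CH4_MJ_M3 = 39.8   # methane / natural gas (approx.)

h2_vol = 0.20          # 20% hydrogen by volume, as in the HyDeploy trial
h2_energy = h2_vol * HHV_H2_MJ_M3
total_energy = h2_energy + (1 - h2_vol) * HHV_CH4_MJ_M3
print(f"Hydrogen's share of delivered energy: {h2_energy / total_energy:.1%}")
# -> about 7.4%, which is roughly the CO2 saving per unit of heat delivered
```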

Currently, the UK has only small supplies of hydrogen, but the firm says increasing production would offer a quick way of cutting emissions from heating.

Consultant engineer Ed Syson told BBC News: “The prize is a large one. If we were to roll this system out across the UK it would be on broadly the same scale as offshore wind is today. So it’s a significant technology.

“What’s more, it makes those carbon savings without having customers change their behaviour in any way.”

How long before we see 100% hydrogen boilers?

Some boiler manufacturers are already producing prototype boilers that use 100% hydrogen.

Worcester Bosch, for instance, has a “hydrogen-ready” design. It can run on natural gas, but it’s capable of converting to 100% hydrogen following a one-hour visit by an engineer.

The firm wants the government to stipulate that by 2025, all new boilers on sale should be hydrogen-ready.

It says this would allow households to switch painlessly to clean boilers when existing boilers reach the end of their lives. The extra cost of the hydrogen-ready boiler would be about £50, it says.

Wind farm, Thanet. Hydrogen could be generated using surplus energy from renewables.

How clean is hydrogen?

Hydrogen can be produced from water through electrolysis, or from natural gas.

Electrolysis from surplus renewable energy is unambiguously beneficial for the environment – but it’s not very efficient.

For the foreseeable future it may be cheaper to produce hydrogen from natural gas. However, CO2 is released in the industrial process (steam methane reforming) used to extract hydrogen from the gas.

The resulting CO2 would need to be captured and stored underground with carbon capture and storage (CCS) - a technology not yet established at scale.

Is the hydrogen revolution inevitable?

About 85% of homes have gas central heating, and some experts believe it will prove more cost-effective to switch boilers to hydrogen than to install heat pumps, which would require the UK’s ageing housing stock to be highly insulated.

A recent study for the government raised the possibility that homes could be warmed by a hybrid system: electric heat pumps doing most of the work, topped up with hydrogen on cold days.

Major drawbacks to hydrogen are cost and availability. The costs are much higher than for natural gas, although the differential should shrink over the coming decades as carbon taxes raise the price of burning gas to combat climate change.

The environmental think tank E3G said in a statement: “Going for hydrogen entails massive infrastructure expenditure. In many cases the additional costs make it look unattractive compared with alternatives (like renewables).”

Richard Black from the Energy and Climate Intelligence Unit (ECIU) told BBC News: “We will and should have hydrogen in the mix of energy options, but it’s not a wonder solution to everything, which is sometimes the impression you get from the rhetoric. There is hope – but too much hype.”

Meanwhile, in the corner of a sports field in Keele, the container of hope has just supplied enough hydrogen to cook 20% of Christmas dinners.


What's in the Air?

Composition of Earth's atmosphere by volume
Composition of Earth's atmosphere by volume. Lower pie represents trace gases that together compose about 0.038% of the atmosphere (0.041197% at March 2019 concentration). Numbers are mainly from 1987, with carbon dioxide and methane from 2009, and do not represent any single source. Credit: Public domain

By volume, the dry air in Earth’s atmosphere is about 78.09 percent nitrogen, 20.95 percent oxygen, and 0.93 percent argon.

A brew of trace gases accounts for the other 0.03 percent, including the greenhouse gases carbon dioxide, methane, nitrous oxide and ozone. Yet while these greenhouse gases make up just a tiny percentage of our atmosphere, they play major roles in trapping Earth’s radiant heat and keeping it from escaping into space, thereby warming our planet and contributing to Earth’s greenhouse effect.
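
As a quick sanity check, the three main gases quoted above leave only a sliver for everything else:

```python
# Composition of dry air by volume, using the percentages quoted above.
major_gases = {"nitrogen": 78.09, "oxygen": 20.95, "argon": 0.93}
trace = 100.0 - sum(major_gases.values())
print(f"Left over for all trace gases: {trace:.2f}%")
# -> 0.03%: carbon dioxide, methane, nitrous oxide and ozone all fit in this
#    sliver, yet they do a disproportionate share of the heat trapping.
```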

The largest greenhouse gas by volume is actually the one most people tend to overlook: water vapor, whose concentration varies significantly depending on temperature. As the temperature of the atmosphere increases, the amount of humidity in the atmosphere also goes up, further heating our planet in a vicious cycle.

Tiny solid or liquid particles known as aerosols, which are produced both naturally and by human activities, are also present in variable amounts, along with human-produced industrial pollutants and natural and human-produced sulfur compounds.


The Atmosphere: Getting a Handle on Carbon Dioxide

Part Two

Earth’s atmosphere is resilient to many of the changes humans have imposed on it. But, says atmospheric scientist David Crisp of NASA’s Jet Propulsion Laboratory in Pasadena, California, that doesn’t necessarily mean that our society is.

“The resilience of Earth’s atmosphere has been proven throughout our planet’s climate history,” said Crisp, science team lead for NASA’s Orbiting Carbon Observatory-2 (OCO-2) satellite and its successor instrument, OCO-3, which launched to the International Space Station on May 4. “Humans have increased the abundance of carbon dioxide by 45 percent since the beginning of the Industrial Age. That’s making big changes in our environment, but at the same time, it’s not going to lead to a runaway greenhouse effect or something like that. So, our atmosphere will survive, but, as suggested by UCLA professor and Pulitzer-Prize-winning author Jared Diamond, even the most advanced societies can be more fragile than the atmosphere is.”

NASA’s OCO-3 instrument sits on the large vibration table (known as the "shaker") in the Environmental Test Lab at NASA’s Jet Propulsion Laboratory. Thermal blankets were later added to the instrument at NASA’s Kennedy Space Center, where a Space-X Dragon capsule carrying OCO-3 launched on a Falcon 9 rocket to the space station on May 4, 2019. Credit: NASA/JPL-Caltech

Changes to our atmosphere associated with reactive gases (gases that undergo chemical reactions) like ozone and ozone-forming chemicals such as nitrogen oxides are relatively short-lived. Carbon dioxide is a different animal, however. Once it’s added to the atmosphere, it hangs around for a long time: between 300 and 1,000 years. Thus, as humans change the atmosphere by emitting carbon dioxide, those changes will endure on the timescale of many human lives.

Earth’s atmosphere is associated with many types of cycles, such as the carbon cycle and the water cycle. Crisp says that while our atmosphere is very stable, those cycles aren’t.

“Humanity’s ability to thrive depends on these other planetary cycles and processes working the way they now do,” he said. “Thanks to detailed observations of our planet from space, we’ve seen some changes over the last 30 years that are quite alarming: changes in precipitation patterns, in where and how plants grow, in sea and land ice, in entire ecosystems like tropical rain forests. These changes should attract our attention.

“One could say that because the atmosphere is so thin, the activity of 7.7 billion humans can actually make significant changes to the entire system,” he added. “The composition of Earth’s atmosphere has most certainly been altered. Half of the increase in atmospheric carbon dioxide concentrations in the last 300 years has occurred since 1980, and one quarter of it since 2000. Methane concentrations have increased 2.5 times since the start of the Industrial Age, with almost all of that occurring since 1980. So changes are coming faster, and they’re becoming more significant.”

The concentration of carbon dioxide in Earth’s atmosphere is currently at nearly 412 parts per million (ppm) and rising. This represents a 48 percent increase since the beginning of the Industrial Age, when the concentration was near 280 ppm, and an 11 percent increase since 2000, when it was near 370 ppm. Crisp points out that scientists know the increases in carbon dioxide are caused primarily by human activities because carbon produced by burning fossil fuels has a different ratio of heavy-to-light carbon atoms, so it leaves a distinct “fingerprint” that instruments can measure. A relative decline in the amount of heavy carbon-13 isotopes in the atmosphere points to fossil fuel sources. Burning fossil fuels also depletes oxygen and lowers the ratio of oxygen to nitrogen in the atmosphere.
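
That fingerprint is usually expressed in delta notation: δ13C compares a sample’s ratio of carbon-13 to carbon-12 against a reference standard, in parts per thousand (per mil). Below is a simplified mixing sketch using typical literature values; the numbers are illustrative, not from this article.

```python
def mix_delta13c(d_air: float, d_added: float, added_fraction: float) -> float:
    """Approximate delta-13C (per mil) of air after mixing in extra CO2.

    Linear mixing of delta values is a simplification, but it is good
    enough to show the direction of the effect.
    """
    return (1 - added_fraction) * d_air + added_fraction * d_added

# Typical values: today's air is near -8 per mil; fossil-fuel carbon is
# isotopically "light", near -28 per mil.
print(mix_delta13c(d_air=-8.0, d_added=-28.0, added_fraction=0.05))  # -> -9.0
# Adding light fossil carbon drags atmospheric delta-13C downward, which is
# exactly the decline that instruments have measured.
```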

A chart showing the steadily increasing concentrations of carbon dioxide in the atmosphere (in parts per million) observed at NOAA's Mauna Loa Observatory in Hawaii over the course of 60 years. Measurements of the greenhouse gas began in 1959. Credit: NOAA

OCO-2, launched in July 2014, gathers global measurements of atmospheric carbon dioxide with the resolution, precision and coverage needed to understand how this important greenhouse gas — the principal human-produced driver of climate change — moves through the Earth system at regional scales, and how it changes over time. From its vantage point in space, OCO-2 makes roughly 100,000 measurements of atmospheric carbon dioxide every day.

Artist’s rendering of NASA’s Orbiting Carbon Observatory (OCO)-2 in orbit above the U.S. upper Great Plains. Credit: NASA-JPL/Caltech

Crisp says OCO-2 has already provided new insights into the processes emitting carbon dioxide to the atmosphere and those that are absorbing it.

Map of the most persistent carbon dioxide “anomalies” seen by OCO-2 (i.e. where the carbon dioxide is always systematically higher or lower than in the surrounding areas). Positive anomalies are most likely sources of carbon dioxide, while negative anomalies are most likely to be sinks, or reservoirs, of carbon dioxide. Credit: NASA/JPL-Caltech

“For as long as we can remember, we’ve talked about Earth’s tropical rainforests as the ‘lungs’ of our planet,” he said. “Most scientists considered them to be the principal absorber and storage place of carbon dioxide in the Earth system, with Earth’s northern boreal forests playing a secondary role. But that’s not what’s being borne out by our data. We’re seeing that Earth’s tropical regions are a net source of carbon dioxide to the atmosphere, at least since 2009. This changes our understanding of things.”

Measurements of atmospheric carbon dioxide in the tropics are consistently higher than anything around them, and scientists don’t know why, Crisp said. OCO-2 and the Japan Aerospace Exploration Agency’s Greenhouse gases Observing SATellite (GOSAT) are tracking plant growth in the tropics by observing solar-induced fluorescence (SIF) from chlorophyll in plants. SIF is an indicator of the rate at which plants convert light from the Sun and carbon dioxide from the atmosphere into chemical energy.

“We’re finding that plant respiration is outstripping their ability to absorb carbon dioxide,” he said. “This is happening throughout the tropics, and almost all of the time. When we first launched OCO-2, our first two years of on-orbit operations occurred during a strong El Niño event, which had a strong impact on global carbon dioxide emissions. Now we have more than five years of data, and we see that the tropics are always a source (of carbon dioxide), in every season. In fact, the only time we see significant absorption of carbon dioxide in the tropics is in Africa during June, July and August. So that’s half the story.

The last El Niño, in 2015-16, impacted the amount of carbon dioxide that Earth’s tropical regions released into the atmosphere, leading to Earth’s recent record spike in atmospheric carbon dioxide. The effects of the El Niño were different in each region. Credit: NASA-JPL/Caltech

“The other half is also quite interesting,” he added. “We’re seeing northern mid- and high-latitude rainforests becoming better and better absorbers for carbon dioxide over time. One possible explanation for this is that the growing season is getting longer. Things that didn’t used to grow well at high latitudes are growing better and things that were growing well there before are growing longer. We’re seeing that in our data set. We see that South America’s high southern latitudes — the so-called cone of South America — are also strong absorbers for carbon. We don’t know if it was always this way and our previous understandings were incomplete or wrong, or if climate change has increased the intensity of the growing season. So we’ve established a new baseline, and it appears to be somewhat of a paradigm shift. Our space-based measurements are beginning to change our understanding of how the carbon cycle works and are providing new tools to allow us to monitor changes in the future in response to climate change.”

Crisp says OCO-2, OCO-3 and other new satellites are giving us new tools to understand how, where and how much carbon dioxide human activities are emitting into the atmosphere and how those emissions are interacting with Earth’s natural cycles. “We’re getting a sharper picture of those processes,” he said.

Impacts from agricultural activities also seem to be changing, he says. During summer in the U.S. upper Midwest, scientists are seeing an intense absorption of carbon dioxide associated with agricultural activities. The same thing is being observed in Eastern and Southern Asia. The strong absorption of carbon dioxide across China is erasing all but a thin strip of fossil fuel emissions along the coast, with Central China now functioning as a net absorber of carbon dioxide during the growing season. Thanks to the development of big, sophisticated computer models combined with wind and other measurements, we’re able to quantify these changes for the first time.

In response to the rapid changes observed in carbon dioxide concentrations and their potential impact on our climate, 33 of the world’s space agencies, including participants from the United States, Europe, Japan and China, are now working together to develop a global greenhouse gas monitoring system that could be implemented as soon as the late 2020s, Crisp added. The system would include a series of spacecraft making coordinated measurements to monitor these changes. Key components of the system would include the OCO-2 and OCO-3 missions, Japan’s GOSAT and GOSAT-2, and Europe’s Copernicus missions. The system would be complemented by ground-based and aerial research.

Crisp said he and his fellow team members are eagerly poring over the first science data from OCO-3. The new instrument, installed on the exterior of the space station, will extend and enhance the OCO-2 data set by collecting the first dawn-to-dusk observations of variations in carbon dioxide from space over tropical and mid-latitude regions, giving scientists a better view of emission and absorption processes. This is made possible by the space station’s unique orbit, which carries OCO-3 over locations on the ground at slightly different times each orbit.

NASA’s OCO-3 mission launched to the International Space Station on May 4, 2019. This follow-on to OCO-2 brings new techniques and new technologies to carbon dioxide observations of Earth from space. Credit: NASA-JPL/Caltech

The Copernicus CO2 Mission, scheduled for launch around 2025, will be the first operational carbon dioxide monitoring satellite constellation. Crisp, who’s a member of its Mission Advisory Group, said the constellation will include multiple satellites with wide viewing swaths that will be able to map Earth’s entire surface at weekly intervals. While its basic measurement technique evolved from the GOSAT and OCO-2 missions, there’s a key difference: the earlier satellites are sampling systems focused on improving understanding of Earth’s natural carbon cycle, while Copernicus will be an imaging system focused on monitoring human-produced emissions. In fact, it will have the ability to estimate the emissions of every large power plant in every city around the world.

Crisp says as time goes on the objective is to build an operational system that will monitor all aspects of Earth’s environment. Pioneering satellites like OCO-2, OCO-3, GOSAT and GOSAT-2 are adding greenhouse gas measurements to the data on temperature, water vapor, cloud cover, air quality and other atmospheric properties that have been collected for decades.

“We know our atmosphere is changing and that these changes may affect our civilization,” he said. “We now have the tools to monitor our atmosphere very carefully so that we can give policymakers the best information available. If you’ve invested in a carbon reduction strategy, such as converting from coal to natural gas or transitioning from fossil fuels to renewables, wouldn’t you like to know that it worked? You can only manage what you can measure.”

For more on OCO-2, visit https://ocov2.jpl.nasa.gov/.

For more on OCO-3, visit https://ocov3.jpl.nasa.gov/.


The 2010s wrecked the planet, but don’t despair yet

Steam and exhaust rise from different companies on a cold winter day on January 6, 2017 in Oberhausen, Germany.


