The deadline for organisations participating in the Carbon Reduction Commitment (CRC) scheme to submit their final reports has now passed, providing a good opportunity to reflect on the whole process.
Here at Local Energy we have been deeply involved with helping public sector organisations navigate their CRC requirements, and so the last few months have been particularly busy for us.
One element of this service has been our offer of a mini-audit for organisations within our Carbon Saving Public Sector (CSPS) network. Essentially, the audit was used to highlight areas of good practice and points for improvement, in a risk-free and constructive environment for organisations.
In the run-up to this year’s reporting deadline we have completed a number of mini-audits. On analysis of these audits, Local Energy consultant Peter Chasmer found that a number of general learning points could be highlighted, with the issues below being representative of how organisations have been approaching their CRC reporting.
1) Any solar panels or renewables that organisations may have tend not to be managed by, or in any way connected to, the energy management team of that organisation. Instead they are usually installed for PR purposes. As a result, feed-in tariff (FIT) payments are being lost because the necessary paperwork is not being completed.
2) Commercial properties out of contract, for example void shops, usually account for less than one per cent of an organisation’s energy consumption. Despite this, it is taking a disproportionately long time to accurately calculate this data.
3) Because putting in place a CRC report and fulfilling the necessary requirements has proven to be an often complex process, Local Energy has been stressing the need for proper recording of roles and responsibilities. This simply requires a document that clearly identifies the personnel responsible for each relevant area of CRC, from the senior reporting officer down to all contacts within the organisation working on CRC. The mini-audits that we have done showed that external contact lists were generally not available, and that wider CRC personnel lists were generally minimal – making this an important area for improvement in the future.
4) Connected to the previous point, we found that while data collection by organisations was comprehensive, their processes, procedures and policies were generally incomplete. This will cause problems in the coming years, when changing personnel and procedures may mean a comprehensive record is never produced.
5) A general theme amongst participant organisations was that “actual” data received from different sources (annual statements, customer reads and automated meter readings (AMR)) was never consistent. Indeed, there was often a substantial difference between readings from supplier annual statements and locally recorded data. The term “actual” is in practice a misnomer and needs to be treated with caution, and Peter Chasmer has been advising that organisations should focus primarily on ensuring their own recorded data is accurate.
6) Within the annual report form, the Environment Agency has included a 2,000-character “plain text” space for organisations to submit any further information they may want to include. In the reports that we have audited, organisations rarely used this space to talk about their performance against the AMR and Carbon Trust Standard league table metrics. Instead it was used to talk more broadly about their commitment to carbon reduction and renewables.
7) As well as the plain text field, the report also contains a “4 questions” field which looks at the organisation’s environmental policy. Participant organisations are asked: do they have a carbon reduction mission statement; have they provably reduced their emissions over the last three years; do they have a senior person within the organisation specifically responsible for environmental issues; and do they have schemes in place for actively reducing the organisation’s emissions. Our mini-audits highlighted that organisations were not considering these questions until they came to submit their reports, at which point they were looking around for a quick answer to fill the space. This raises questions about the role and effectiveness of this part of the report.
8) Regarding the ‘source list tool’ on the Environment Agency’s website, larger organisations (500+ meters) rarely used it, finding it cumbersome. For example, the use of drop-down menus requiring individual submission for each meter reading made the tool impractical for larger organisations. For smaller organisations, however, it has been more useful, and some larger organisations have exported the macro values from the source tool for use within their own data sheets.
9) It seems that the rules around residual gas and electricity are still not fully understood by councils, meaning that the 90 per cent rule, which will become obsolete with the planned simplification of the CRC, does not really work in practice at present.
10) One of the biggest problems faced by organisations was the number of energy suppliers they were dealing with, usually the result of various historical arrangements. The added complication of multiple data formats caused issues with both collation and analysis.
11) Finally, organisational structures tended to be over-complicated, raising the question of whether it would be more appropriate to include only significant group undertakings (SGUs) in the CRC.
Overall, we believe that the mini-audit process has been a very useful exercise, not only in helping public sector organisations with their CRC requirements, but also in understanding the trends and learning points in the CRC process so far.
For the organisations within our network, there has been an improvement in meeting CRC requirements, alongside a continuing need to work on other areas of the scheme. The need for ongoing learning, networking and improvement is further necessitated by the continued changes to the CRC system, and the uncertainty around its future.