Estimated page impact: 0.39g CO2e per page view
Each website (or app) is unique and will need its own solutions to decrease its digital impact. This blog post summarises the technical solutions I have found most helpful in my journey so far. Check the series for detail on what can be done from a content and design perspective as well as by raising awareness.
Architect the website to be as efficient as possible. There are well established practices out there that are all really valuable:
Event driven architecture
Extreme content delivery network use
Static site generation
These aren’t buzzwords, they’re genuinely vital for efficiency and also UX. A lightning fast site will undoubtedly reduce digital impact.
Think about the future by ensuring the architecture can evolve too. You need to be able to take advantage of new frameworks, software and hardware as they develop to continuously increase efficiency.
We’re experimenting with the best combination of the above and have recently decided not to use a static site generator because we believe we can get comparable results with event driven architecture, and using a content delivery network to deliver a very significant portion of site code.
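One practical lever in the CDN-heavy approach above is aggressive caching headers. As a minimal sketch (assuming fingerprinted asset filenames like `app.3f2a1c.js`; the naming convention and policy values are illustrative, not prescriptive), a per-asset `Cache-Control` policy lets a CDN serve the vast majority of requests without ever touching the origin server:

```javascript
// Choose a Cache-Control policy per asset path. Fingerprinted assets
// (e.g. app.3f2a1c.js) never change, so a CDN can cache them for a year;
// HTML should stay fresh so content updates propagate immediately.
function cacheControlFor(path) {
  if (/\.[0-9a-f]{6,}\.(js|css|woff2|png|jpg|webp)$/.test(path)) {
    return "public, max-age=31536000, immutable"; // one year, no revalidation
  }
  if (/\.(js|css|woff2|png|jpg|webp|svg)$/.test(path)) {
    return "public, max-age=86400"; // one day for non-fingerprinted assets
  }
  return "public, max-age=0, must-revalidate"; // HTML: always revalidate
}
```

Set the returned value as the `Cache-Control` response header in whatever server or edge function serves the asset.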
Host with an efficient green energy powered data centre that’s as close to the people who use your site as possible. I admit this is a sweeping statement, but only because there is so much to this discussion. So let’s start with the big players.
Google: buys renewable energy to 100% match the amount it uses in its data centres as well as rolling out some very innovative efficiency and carbon intensity initiatives.
Microsoft (and therefore Azure): has an amazing carbon capture commitment whilst allowing people to have access to data about the impact of what they host.
This is a huge topic that I'm nowhere near forming a strong opinion on, and I would love to debate it more with other people. I will be writing a dedicated blog post on it – keep an eye out!
Aggressively block bots to reduce server usage. Human activity accounts for only 48% of internet traffic. This means a massive 52% is generated by bots, and half of that number is deemed bad bots: scraping content, crashing websites, probing for security loopholes, bulk buying products on eCommerce sites, simulating advert clicks, the list goes on…
On average, any site can expect a quarter of its traffic to be from bad bots, and the most sophisticated of them behave in ways that are almost indistinguishable from humans.
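The crudest bots still announce themselves in the user-agent string, so a cheap first line of defence is rejecting those before any expensive rendering or database work happens. A minimal sketch (the patterns here are illustrative examples, not a complete blocklist, and serious bot mitigation needs more signals such as request rate and IP reputation):

```javascript
// Illustrative user-agent check for common crawler/scraper signatures.
// These patterns are examples only; real bot mitigation layers in rate
// limiting, IP reputation and behavioural analysis as well.
const BAD_BOT_PATTERNS = [/scrapy/i, /python-requests/i, /curl\//i, /headlesschrome/i];

function isLikelyBadBot(userAgent) {
  if (!userAgent) return true; // a missing user-agent is itself suspicious
  return BAD_BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}
```

Wired into server middleware, this lets you return a 403 early and skip the full page render for traffic that offers no value.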
If images are absolutely necessary, compress and cache them and don’t load what people can’t see. Compress and zip everything you possibly can!
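One way to avoid sending pixels people can't see is a `srcset`, so the browser downloads only the smallest variant that fits the viewport. A small sketch, assuming resized variants exist on a naming convention like `hero-640.webp` (the convention is hypothetical; adapt it to your image pipeline):

```javascript
// Build a srcset string from a list of available image widths, so the
// browser fetches only the size it actually needs for the viewport.
// Assumes variants named "<basename>-<width>.<ext>" exist (hypothetical).
function buildSrcset(basename, ext, widths) {
  return widths.map((w) => `${basename}-${w}.${ext} ${w}w`).join(", ");
}
```

Combine this with the native `loading="lazy"` attribute on `<img>` so off-screen images aren't fetched at all until they're about to scroll into view.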
Clean code only
Write (c)lean code, particularly JavaScript and jQuery. Being the wonderful humans that we are, we're prone to making a few mistakes, bowing to pressure and cutting the odd corner. We have to prove to people how valuable well-written code is in the short term (performance, carbon, SEO, UX improvements) and in the longer term (ease of handover, upgrade and improvement).
Also, review plugin code before using, and question whether the functionality you need is better done without one. Social media plugins are a great example. A huge amount of their code offers no benefit to the person using the site or the website owner, and the continued tracking they fire off causes even more emissions. Ad software is also a particularly bad offender here.
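Social sharing is a good example of functionality that rarely needs a plugin at all: a plain anchor pointing at each network's share endpoint involves no third-party script and no tracking. A sketch (the endpoint URLs below are the widely used share links, but verify them against each network's current documentation before relying on them):

```javascript
// Plugin-free social sharing: build a plain share URL to use as an anchor
// href. No third-party script loads, no tracking fires. Endpoint formats
// are the commonly used ones; check each network's docs for changes.
function shareLink(network, pageUrl) {
  const u = encodeURIComponent(pageUrl);
  const endpoints = {
    twitter: `https://twitter.com/intent/tweet?url=${u}`,
    facebook: `https://www.facebook.com/sharer/sharer.php?u=${u}`,
    linkedin: `https://www.linkedin.com/sharing/share-offsite/?url=${u}`,
  };
  return endpoints[network];
}
```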
Setting CO2e KPIs will empower developers to push back on common asks that are detrimental to website impact.
Use the design system as part of the pipeline build/reporting process. This is a great way to start conversations about the governance of key files. As websites grow they're often owned and worked on by different people; this leads to design creep, which is bad for both page weight and UX.
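A concrete way to enforce this in the pipeline is a page-weight budget check that fails the build when an asset category exceeds its agreed limit. A minimal sketch (the category names and budget figures are illustrative; set your own with the team):

```javascript
// CI page-weight budget check: compare measured asset sizes against agreed
// budgets and report any categories that exceed them. Figures illustrative.
function budgetViolations(sizesKb, budgetsKb) {
  return Object.entries(sizesKb)
    .filter(([name, size]) => size > (budgetsKb[name] ?? Infinity))
    .map(([name, size]) => `${name}: ${size}KB exceeds ${budgetsKb[name]}KB budget`);
}
```

Run it against the built output in CI and fail the pipeline when the returned list is non-empty, so regressions are caught before deployment rather than after.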
Consider carbon intensity when planning times to run regular jobs. Traditionally we ran batch jobs in the middle of the night so they didn't add to server load. However, given that Octopus Energy recently started paying their customers to use renewable energy when it was very sunny and windy, a better way to plan these jobs is to monitor the carbon intensity of electricity and run the jobs when it is at its lowest. This concept is articulated much better as a green software principle.
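In code, that means choosing a run time from an intensity forecast rather than hard-coding 3am. A small sketch (the data shape is simplified; in the UK a forecast like this is available from the National Grid carbon intensity API at api.carbonintensity.org.uk, though the exact response format differs):

```javascript
// Given a carbon-intensity forecast (intensity in gCO2/kWh), pick the
// slot with the lowest figure to schedule a batch job. The forecast shape
// here is a simplified assumption; adapt to your intensity data source.
function greenestSlot(forecast) {
  return forecast.reduce((best, slot) =>
    slot.intensity < best.intensity ? slot : best
  );
}
```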
Automatically run site scanners (e.g. Sitebulb, Monsido, Siteimprove) to pick up issues that might have slipped through other quality checking processes. These products bring up some false positives, but that's a small price to pay for the impact that making these continuous improvements has.
Given the mind-blowing amount of data that an internet user generates every day, it is vital to think about data minimisation. This is a way of thinking contrary to the tradition of harvesting as much data as possible (sorry marketeers). Instead, first define why you want to know something; how will knowing that help your audience? Then define the bare minimum data you need to capture to answer that question.
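The "bare minimum" mindset can be enforced mechanically: strip every event down to an explicit allowlist before it is stored, so new fields can't creep in unreviewed. A minimal sketch (the field names are illustrative examples, not a recommendation of what to collect):

```javascript
// Data minimisation in practice: keep only explicitly allowlisted fields
// from an analytics event before storage. Field names are illustrative.
const ALLOWED_FIELDS = ["page", "referrer", "timestamp"];

function minimiseEvent(rawEvent) {
  return Object.fromEntries(
    Object.entries(rawEvent).filter(([key]) => ALLOWED_FIELDS.includes(key))
  );
}
```

Because anything not on the list is dropped by default, adding a new field becomes a deliberate, reviewable decision rather than a side effect.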
This kind of approach might lead to products such as Cabin Analytics. The Panoply Group website will soon have this instead of Google Analytics, I’ll keep you posted on how it goes.
Make a checklist of the above and embed it into your team's best practices.