
12 programming tricks developers use to cut cloud bills

Cutting cloud costs is a team effort, and that includes developers. Here are 12 tricks for developing software that is cheaper to run in the cloud.

Nothing lifts a development team's spirits like watching an application go viral. It's a wonderful feeling, at least until the monthly cloud bill comes in.

Some developers believe that managing the cost of computing is a responsibility for the devops squad. Coders write the software, toss it over the wall, and let someone else worry about paying for it. Nothing could be further from the truth.

Smart developers know that their coding decisions make a big difference to the company's bottom line. Bulky code is slower and requires more cloud resources to run. Choosing better algorithms and writing tighter code is about more than just speed: well-written code costs less to run.

Developers don't always see the connection. It's easy to write code on a personal machine, where the RAM and extra disk space were paid for when the machine was purchased. Developers with two terabytes of disk space might never notice how much of it their code consumes.

If a new algorithm takes twice as long to run, their desktop might not even blink, and who notices a few extra milliseconds anyway? But it's almost certain that doubling the computation will result in a larger cloud bill.

Modern cloud computing excels at converting resource utilisation into a line-item charge. Good cloud developers understand that they have the power to make smarter decisions when writing their code. It can be as simple as running a profiler to identify slow spots, or trimming unnecessary data to shrink the storage and memory footprint.

Here are 12 ways that developers streamline code so that it is leaner, faster, and cheaper to run.

Write faster code

Most developers don't spend much time optimising their code. If it runs in a split second on their laptop, they don't notice whether it's 20 per cent, 30 per cent, or even 300 per cent slower than it could be. The program still responds in a split second. But these differences add up when the code runs millions of times on a server. Careful profiling can flag the slow parts, and rewriting them could reduce the number of instances an application needs.
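
For a concrete (and entirely hypothetical) illustration of what a profiler tends to surface, consider a membership check done with a list scan inside a loop; switching to a HashSet turns a quadratic pass into something close to linear.

```java
// Hypothetical hotspot a profiler might flag: List.contains() inside a
// loop is O(n) per call, so the whole pass is O(n^2) on large inputs.
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DuplicateFilter {
    // Before: quadratic, burns CPU (and cloud dollars) on large inputs.
    static long countKnownSlow(List<String> ids, List<String> known) {
        return ids.stream().filter(known::contains).count();
    }

    // After: build a HashSet once, then each lookup is O(1) on average.
    static long countKnownFast(List<String> ids, List<String> known) {
        Set<String> lookup = new HashSet<>(known);
        return ids.stream().filter(lookup::contains).count();
    }
}
```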

Lower RAM footprint

The amount of RAM being used is an important parameter for pricing cloud instances. In many cases, doubling the RAM doubles the cost. Programmers can lower their RAM footprint by keeping less data in memory at any one time.

Streaming tools, like Java's Stream classes, are designed to work through large files of data without loading them into memory all at once. The Apache DataSketches project produces approximate answers to complex big-data statistics without holding the full dataset in memory.

As a side benefit, careful RAM consumption can also speed up algorithms. When memory runs short, the operating system starts offloading data onto the disk using virtual memory. That prevents crashing, but it can slow programs down dramatically.
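
Here is a minimal sketch of the streaming idea using the JDK's Files.lines, which walks a large file one line at a time instead of loading it whole; the file name and the filter are placeholders.

```java
// Sketch: stream a large log file line by line instead of reading it
// into one giant in-memory list. Only one line (plus some buffering)
// is resident at a time. File name and filter are placeholders.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class ErrorCounter {
    public static void main(String[] args) throws IOException {
        try (Stream<String> lines = Files.lines(Path.of("access.log"))) {
            long errors = lines.filter(line -> line.contains(" 500 ")).count();
            System.out.println("server errors: " + errors);
        }
    }
}
```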

Use lower resolution images and video

Using lower resolution images and video can pay off in multiple ways. First, storing them is cheaper. Second, any data egress charges will be lower. Third, the application will feel snappier to users.

All static images should be minimised from the beginning. How far to shrink them, alas, is not a simple call, because at some point the visual quality degrades enough to be apparent to users. Finding the right tradeoff is a design decision that some programmers aren't prepared to make.

Applications that accept uploaded images can also create smaller thumbnails and reduced-resolution versions after receiving the original. Toolkits like ImageMagick and formats like WebP were developed for exactly this purpose.
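
The JDK alone can handle the basic case. The sketch below uses javax.imageio to write a small thumbnail of an uploaded image; the file names and the 200-pixel width are arbitrary, and tools like ImageMagick or a WebP encoder can squeeze the bytes further.

```java
// Sketch: write a small JPEG thumbnail for an uploaded image so the
// full-resolution original is only fetched when it is really needed.
// File names and the 200-pixel target width are arbitrary choices.
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class Thumbnailer {
    public static void main(String[] args) throws IOException {
        BufferedImage original = ImageIO.read(new File("upload.jpg"));
        int width = 200;
        int height = original.getHeight() * width / original.getWidth();

        BufferedImage thumb = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = thumb.createGraphics();
        g.drawImage(original, 0, 0, width, height, null);  // scale down
        g.dispose();

        ImageIO.write(thumb, "jpg", new File("upload-thumb.jpg"));
    }
}
```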

Dump unnecessary data

Many developers are digital pack rats who store information just in case they might need it someday. They fill out tables with endless columns and then never delete the rows. The extra data doesn't cost anything if they own the hardware and the disk drive has plenty of space. But the cloud charges for everything. 

Will they really need all those values in the future? Does the user even want so many details? Dumping some of that old data saves money on both storage and data egress.
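
A retention policy can be enforced with a small scheduled job. The JDBC sketch below assumes a hypothetical events table with a created_at column, a PostgreSQL connection string, and a 90-day window, all of which are placeholders.

```java
// Sketch of a retention sweep: delete rows older than a cutoff so the
// table (and the bill) stops growing forever. The table name, column,
// connection string, and 90-day window are all hypothetical.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class RetentionSweep {
    public static void main(String[] args) throws SQLException {
        Instant cutoff = Instant.now().minus(90, ChronoUnit.DAYS);
        try (Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost/app");
             PreparedStatement stmt = conn.prepareStatement(
                     "DELETE FROM events WHERE created_at < ?")) {
            stmt.setTimestamp(1, Timestamp.from(cutoff));
            int removed = stmt.executeUpdate();
            System.out.println("removed " + removed + " stale rows");
        }
    }
}
```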

Limit disk storage

Using the local disk on cloud instances is not only dangerous, it can also be expensive. Local disk space is often engineered to be fast enough to keep the operating system running efficiently. Many developers write their code on a personal machine with one or more terabytes of storage.

Cloud machine storage is rarely so cheap or so readily available. Clouds often bill for storage directly by size, so the best approach is to use as little as possible. Consider ways to minimise not only the temporary files the application creates, but also the system libraries and software packages it requires.
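
For temporary files, the cheapest byte is the one that never lingers. Here is a small sketch, with a made-up workload, that writes scratch data to a temp file and guarantees it is removed afterwards.

```java
// Sketch: keep scratch data off persistent storage by using a temp file
// and deleting it as soon as the work is done. The workload is made up.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ScratchSpace {
    public static void main(String[] args) throws IOException {
        Path scratch = Files.createTempFile("report-", ".tmp");
        try {
            Files.writeString(scratch, "intermediate results...");
            // ... process the scratch file ...
        } finally {
            Files.deleteIfExists(scratch);  // don't leave bytes behind to be billed
        }
    }
}
```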

Clean logs

Log files are great for identifying problems and debugging software during development. But once the code is in production, developers don't need to keep all of them. The extra information clogs up either the local disk or object storage. When designing the logging system, configure it to rotate logs frequently. Many logging packages, such as Log4j, can be set to keep a minimal number of log files and delete the oldest on a rolling basis.
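
As one hedged example, a Log4j 2 configuration can roll the log file by size and keep only a handful of compressed archives; the paths, the 10 MB threshold, and the cap of five files below are arbitrary choices.

```xml
<!-- Sketch of a Log4j 2 rolling configuration: roll at 10 MB, compress
     old files, and keep at most five archives. Paths, sizes, and counts
     here are arbitrary. -->
<Configuration status="warn">
  <Appenders>
    <RollingFile name="app" fileName="logs/app.log"
                 filePattern="logs/app-%i.log.gz">
      <PatternLayout pattern="%d{ISO8601} %-5level %logger - %msg%n"/>
      <Policies>
        <SizeBasedTriggeringPolicy size="10 MB"/>
      </Policies>
      <DefaultRolloverStrategy max="5"/>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="app"/>
    </Root>
  </Loggers>
</Configuration>
```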

Go serverless

Serverless plans bill only when code is actually running, which can save plenty when loads are intermittent. Even applications with a constant stream of users have more dead time than their developers might expect.

Many serverless pricing plans reward careful coding: fast responses and minimal RAM consumption translate directly into lower charges. The billing formula counts response time in milliseconds and charges only for the time the processor is actually occupied. Developers get immediate feedback, because they can track response times directly and see how their code changes affect the bill.

The serverless approach is ideal for smaller or more experimental projects and the bill can often be as low as a few cents per month. If an application runs some features only occasionally, it might make sense to go serverless.
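
Serverless platforms differ, but on AWS Lambda, for example, an entire Java deployment can be a single handler class. The sketch below is hypothetical: the class name and event shape are made up, and the handler does almost nothing.

```java
// Sketch of a Java AWS Lambda handler (aws-lambda-java-core). The class
// name and event shape are hypothetical. Billing covers the milliseconds
// this method runs and the memory the function is allotted, so leaner,
// faster code shows up directly on the invoice.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import java.util.Map;

public class ResizeRequestHandler implements RequestHandler<Map<String, String>, String> {
    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        String imageId = event.getOrDefault("imageId", "unknown");
        // Do the minimum work per invocation; anything extra is billed.
        return "queued resize for " + imageId;
    }
}
```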

Archive old data

As data grows older, it is accessed less frequently. Developers can anticipate this by setting up the application to migrate older data to a cheaper location. Some clouds charge much less for so-called “cold storage,” which can take minutes or even hours to deliver the bits.

Other providers, like Wasabi or Backblaze, specialise in S3-compatible object storage and charge dramatically less than the major clouds. In some cases, they don't even charge for data egress. Offloading data as soon as it is no longer in high demand can be extremely cost effective.
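
The sweep itself can be plain application code. The simplified sketch below moves anything untouched for 90 days out of the hot path; the local directories stand in for whatever cold-storage bucket or archival provider a team actually uses, and the cutoff is arbitrary.

```java
// Simplified sketch: move anything untouched for 90 days out of hot
// storage. The local "archive" directory stands in for a cold-storage
// bucket or an archival provider; the 90-day cutoff is arbitrary.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.stream.Stream;

public class ColdStorageSweep {
    public static void main(String[] args) throws IOException {
        Path hot = Path.of("data/hot");
        Path archive = Files.createDirectories(Path.of("data/archive"));
        Instant cutoff = Instant.now().minus(90, ChronoUnit.DAYS);

        try (Stream<Path> files = Files.list(hot)) {
            for (Path file : (Iterable<Path>) files::iterator) {
                if (Files.getLastModifiedTime(file).toInstant().isBefore(cutoff)) {
                    Files.move(file, archive.resolve(file.getFileName()),
                            StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }
}
```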

Simplify CSS layouts

Anyone who has looked at the HTML generated by some frameworks knows how ridiculous the layouts can get: DIV tags nested inside DIV tags all the way down, and every one of them costs money to generate and deliver. A web designer I know brags about cutting their bandwidth bill by 30 per cent just by creating a simpler layout with more judicious use of CSS.

Build static sites

Some frameworks like React require quite a bit of computational power, especially if they use features like server-side rendering. All that code drives up the monthly cloud bill. The opposite philosophy is to create a static site, built from unchanging blocks of HTML, CSS, and JavaScript that are served up from a cache verbatim. Using a content-delivery network can speed up delivery even more by moving caches closer to the user.

Various frameworks embrace this static philosophy. Jekyll, Hugo, Gridsome, and Pelican are just a few of the tools that package content into a set of compact, unchanging files. Developers can still build personalisation into the pages with AJAX calls, but the bulk of the site generates little load on the servers.

Externalise computation and storage

As browsers get more powerful, some frameworks make it simpler to move computation directly to the client. Good JavaScript or WebAssembly code can push more of the load onto the user's machine and off the cloud servers.

Some developers are reducing their cloud layer to little more than a database with a bit of business logic for authentication. One friend runs everything with static HTML and a PostgreSQL instance with stored procedures that output JSON.

Browsers also offer more elaborate options for storing information locally, like the HTML Web Storage standard and the W3C Indexed Database API. It's not just short strings and cookies anymore. That data is available faster because it doesn't travel over the internet, and it gives users some comfort to know their data isn't sitting in a centralised, hackable database. Why pay for storage and data egress when the data can live on a user's machine for free?

Appoint a cost engineer

Some developers specialise in taking care of databases. Some like creating beautiful first impressions with a well-designed front end. Now that cloud costs are so flexible, some teams are officially appointing “cost engineers” to manage code costs and efficiency. 

A cost engineer's first focus is getting application code to run cleaner, faster, and lighter, and thus cheaper. Making this task part of someone's job sends a clear message that managing code costs is part of the development team's responsibility.