Being fairly new to the workforce, I had a lot of questions when my seniors came back from a meeting and ordered us to spin down all the resources we weren't using. Why was this decided all of a sudden? Didn't we have an unlimited amount of resources to test and implement solutions with? We had never done this before, nor had we been required to in any other project. Why did we have to take care of it now?
The simple answer to all my questions came down to cost. The CSO of the company had gotten the cloud bill for the month, and we were way over budget. But how were we, as developers, supposed to know the cost associated with these resources? And by resources, I mean things like AWS EC2 instances, AWS RDS databases, and AWS S3 storage. Why was my team, the consulting team, being yelled at for something the company's own developers didn't have insight into? We day-to-day developers are focused not on how many resources we are using, but on how to implement a solution. It becomes easy to forget that the resources we use are actually being paid for by the company, not simply there at our disposal.
Coming to Harness and learning about the different features of the platform, I was surprised by, and eager to learn more about, the Cloud Cost Management tool. It gives developers visibility into their cloud costs by empowering them to manage the resources they use, rather than leaving that responsibility solely to finance teams.
Because of the scenario above, and other instances where something similar had happened, I realized this was a pain point I had experienced and, at the time, didn't know how to resolve. It also made me think: of course the cloud providers themselves wouldn't create a feature like this!
They claim their services are cost-efficient because of their pay-per-use pricing, but why hasn't a feature like this become more widespread? It's one that every company could use to save on costs.
A common misconception about moving to the cloud is that it will automatically save the company money. There is no hardware to buy, manage, or maintain, and all the cloud providers boast a pay-per-use model, which tricks companies into thinking they will only be charged for the resources they actively use. They fail to take into account resources that are kept running but not actually utilized. At AWS re:Invent, it was stated that 35% of all cloud spend is waste. 35% - that's a lot of wasted resources and money! And that 35% is why developers should handle cloud costs and manage them themselves.
Developers need transparency and visibility into what they are using. Letting developers provision and use cloud resources without knowing how much it costs is like letting someone else run the taxi meter when you're paying for the ride.
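To make that meter concrete, here is a minimal sketch of the kind of visibility developers need. Everything in it is invented for illustration - the instance names, the 5% idle threshold, and the hourly prices are assumptions, not figures from any real bill or API:

```python
# Hypothetical utilization report: instance name, average CPU %, hourly cost (USD).
# All names, thresholds, and prices below are made up for illustration.
instances = [
    {"name": "web-prod-1", "avg_cpu": 62.0, "hourly_cost": 0.192},
    {"name": "test-db-old", "avg_cpu": 1.5, "hourly_cost": 0.380},
    {"name": "demo-env", "avg_cpu": 0.8, "hourly_cost": 0.096},
]

IDLE_CPU_THRESHOLD = 5.0   # below this average CPU %, treat the instance as idle
HOURS_PER_MONTH = 730      # average number of hours in a month

def idle_spend(instances, threshold=IDLE_CPU_THRESHOLD):
    """Return (names of idle instances, estimated monthly waste in USD)."""
    idle = [i for i in instances if i["avg_cpu"] < threshold]
    waste = sum(i["hourly_cost"] * HOURS_PER_MONTH for i in idle)
    return [i["name"] for i in idle], round(waste, 2)

names, waste = idle_spend(instances)
print(f"Idle: {names}, estimated monthly waste: ${waste}")
```

Even a rough report like this turns "spin down what you aren't using" from a vague order into a concrete list with a dollar figure attached.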
On average, companies spend $4,695,000 a year on the cloud. That is a lot of money to spend on something that was supposed to save you money. The only way to bring that cost down is to go to the source of the spend and see what can be cut. But how would someone unfamiliar with the cloud even do that? Simply put, they wouldn't. You would need someone familiar with the cloud and the resources being provisioned. You would need a developer.
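Putting the two numbers from above together as a back-of-the-envelope calculation (not a figure from any specific company's bill): if roughly 35% of cloud spend is waste, that average annual bill implies well over a million dollars of potential savings.

```python
annual_spend = 4_695_000  # average annual cloud spend cited above (USD)
waste_fraction = 0.35     # share of cloud spend cited as waste at AWS re:Invent

estimated_waste = annual_spend * waste_fraction
print(f"Estimated annual waste: ${estimated_waste:,.0f}")  # roughly $1.64M per year
```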
With the widespread adoption of DevOps and the effort to eliminate the silos that exist in software engineering, the silos around cloud costs need to go as well. The more often developers deploy code, the more cloud costs they incur.
Developers need that visibility and transparency - not only so they can efficiently handle the costs of underutilized and unallocated resources, but also so they can monitor spend and help the business as a whole cut costs. Other than the finance department paying the bills, developers are the only ones who really care about cloud costs.
They are the ones using the resources. If those resources were cut - or if, say, the finance department refused to pay the cloud bill at the end of the month - developers would be the first ones impacted, because their resources would no longer be available.
As an engineer, it's easy to get stuck in the code, focused on one small method or piece of functionality rather than the big picture. With visibility into cloud costs, developers can see a broader overview of what's going on in the company. They can see the actual number of dollars going out the door.
Allowing developers to manage cloud costs isn't a new or profound concept. I believe it's a concept that companies either haven't realized yet, or have realized but don't have the resources to remediate. Enter Harness Cloud Cost Management.
Enjoyed reading this blog post or have questions or feedback?
Share your thoughts by creating a new topic in the Harness community forum.