How To Reduce the Cost and Improve the Performance of Your GCP Infrastructure

Are you currently using 50TB or more in Google Cloud Platform (GCP)? Do those workloads consist primarily of analytics and transactional databases powering high-performance, critical applications, along with some dev/test environments?

In our experience here at Kaminario, these environments typically require high-performance SSDs combined with powerful compute engines. The resulting footprint can cost $80,000 a month or more, depending on how many snapshots you use.

Reducing Cloud Spend with Kaminario on GCP

Kaminario’s Cloud Data Platform can reduce the cost of your GCP infrastructure while also increasing data performance. We do this by separating compute from capacity, removing the underlying interdependencies, and implementing rich Tier 1 data services.

This dramatically reduces your footprint, cutting your GCP costs by 30% or more.

Let’s walk through a typical Kaminario customer profile: a high-performance data footprint of around 250TB, costing about $70,000 a month. Even with a small number of snapshots, the GCP bill often balloons to over $80,000 a month. By deploying Kaminario on GCP, customers are able to turn off expensive cloud resources, reducing the monthly spend to about $20,000. That’s a huge savings, paired with a significant increase in performance.
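
To make the math concrete, here’s a quick back-of-envelope sketch in Python. The figures are the illustrative ones from the profile above, not a quote; your numbers will vary:

```python
# Back-of-envelope monthly cost comparison (illustrative figures only).
before_base = 70_000       # high-performance 250TB footprint, per month
before_snapshots = 10_000  # snapshot overhead pushes the bill past $80K
before_total = before_base + before_snapshots

after_total = 20_000       # same workloads after deploying Kaminario on GCP

savings = before_total - after_total
print(f"Monthly savings: ${savings:,} ({savings / before_total:.0%})")
# -> Monthly savings: $60,000 (75%)
```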

How does our platform do this? Our secret sauce is leveraging existing cloud infrastructure assets in an extremely efficient manner, increasing performance and reducing footprint at the same time.

We deliver rich Tier 1 data services like inline variable deduplication, inline compression, pattern removal, zero detect, and thin provisioning. Combined, these services deliver major reductions in resource utilization (2-4x) while increasing performance and reducing latency.
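
To see what a 2-4x reduction means in practice, here’s a minimal sketch. The ratios are the range quoted above; actual reduction depends on how compressible and duplicated your data is:

```python
# How the quoted 2-4x data-reduction range shrinks provisioned capacity.
raw_data_tb = 250  # the footprint from the customer profile above

for reduction_ratio in (2, 3, 4):
    provisioned_tb = raw_data_tb / reduction_ratio
    print(f"{reduction_ratio}x reduction -> provision ~{provisioned_tb:.0f} TB "
          f"instead of {raw_data_tb} TB")
```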

And there are more savings…

Our platform also eliminates the need to overprovision capacity to support high compute requirements. In GCP, performance thresholds are tied to the amount of compute and capacity provisioned. To support high-performance but low-capacity data sets, organizations must radically overprovision capacity, leading to additional cost and waste.
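
Here’s a rough sketch of that dependency. The 30 IOPS/GB figure is an assumed zonal pd-ssd rate used for illustration; check current GCP documentation for exact per-disk and per-instance limits:

```python
# Why high-IOPS, low-capacity workloads force overprovisioning:
# persistent disk performance scales with provisioned size.
IOPS_PER_GB = 30        # assumed zonal pd-ssd rate; verify against GCP docs

data_set_gb = 500       # actual data you need to store
target_iops = 60_000    # performance the application needs

required_gb = target_iops / IOPS_PER_GB   # capacity needed just for IOPS
overprovision_gb = max(0, required_gb - data_set_gb)

print(f"Provision {required_gb:,.0f} GB to reach {target_iops:,} IOPS; "
      f"{overprovision_gb:,.0f} GB of that is pure overprovisioning.")
# -> Provision 2,000 GB to reach 60,000 IOPS;
#    1,500 GB of that is pure overprovisioning.
```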

Kaminario solves this problem by disaggregating compute and capacity, removing the performance dependencies between them. You can now scale compute or capacity independently, allowing more granular control of provisioned infrastructure. With Kaminario, the smallest (or largest) data sets get the same high performance.

Rich Tier 1 Data Services Without a Hit to Your Budget

In GCP, creating copies of your data means taking snapshots. The more snapshots you create, the more costs you incur, and it’s not a small increase. Snapshots can also add to your performance requirements, further increasing costs. Kaminario gives you unlimited snapshots at no cost, with no performance impact. A win-win.
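
A rough model of how snapshot costs stack up. The $/GB rate and per-snapshot change rate are assumptions for illustration; actual GCP snapshot pricing varies by region and storage class:

```python
# Rough snapshot cost model (illustrative rate; actual GCP snapshot
# pricing varies by region, storage class, and how much data changes).
RATE_PER_GB_MONTH = 0.026   # assumed snapshot storage rate, $/GB/month

footprint_gb = 250 * 1024   # the 250TB footprint from the profile above
snapshots = 10
changed_fraction = 0.10     # assumed incremental change per snapshot

snapshot_gb = footprint_gb * (1 + (snapshots - 1) * changed_fraction)
print(f"~${snapshot_gb * RATE_PER_GB_MONTH:,.0f}/month "
      f"in snapshot storage alone")
# -> ~$12,646/month in snapshot storage alone
```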

Finally, GCP uses a shared-nothing architecture that natively restricts data resources and applications to one-to-one mappings: you must add both compute and capacity resources for each new application. Kaminario provides a shared-resource capability supporting one-to-many mappings, allowing the same resources to serve hundreds of applications simultaneously, as the sketch below illustrates. Reducing resource overprovisioning delivers even more savings.
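
Here’s a simplified illustration of the difference. The unit counts and pool sizes are made-up assumptions, not GCP or Kaminario figures; real sizing depends on your workload profiles:

```python
import math

# Shared-nothing (one-to-one) vs. shared-resource (one-to-many) scaling.
def dedicated_units(apps, units_per_app=1):
    # Shared-nothing: each new application gets its own compute + capacity.
    return apps * units_per_app

def shared_units(apps, apps_per_pool=50, units_per_pool=5):
    # Shared pool: one set of resources serves many applications.
    return math.ceil(apps / apps_per_pool) * units_per_pool

for n in (10, 100, 500):
    print(f"{n:>3} apps: dedicated={dedicated_units(n):>3} units, "
          f"shared={shared_units(n):>2} units")
```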

The bottom line: we can save a minimum of 30% off your public cloud bill.

In today’s climate, where we are all looking to reduce costs, these savings flow right to your bottom line. That’s the power of Kaminario.

Ready to try Kaminario for yourself? Click here to start a 30-day trial of Kaminario, at no cost. In as little as 60 minutes, you can start seeing up to 30% cost savings and significantly improved performance.
