5 Reasons to love GKE (Google Kubernetes Engine)


A good technology pairing is like a good relationship: the two parties should not just get along well, they should also support one another in ways that help each become the best it can be. In the tech world, many technologies complement each other, but some of these relationships can be a bit one-sided. Kubernetes and Google Kubernetes Engine, however, are setting some serious relationship goals. And so I present to you my top 5 reasons why K8s + GKE = ❤️:

Kubernetes is Native

You can think of Kubernetes and Google Kubernetes Engine as that couple who have known each other since freshman year of college. K8s has been working with GKE since day one, which makes certain things simple to do together. For example, Kubernetes has new releases every couple of months; if you’ve ever had to upgrade on your own, you know the hassle of spinning up a new cluster, migrating your pods over, and so on. Frustrating, no? Well, GKE to the rescue: the master is upgraded automatically, and all you have to do is run a simple command to upgrade the nodes:

 

gcloud container clusters upgrade $CLUSTER_NAME --cluster-version=$CLUSTER_VERSION

 

You can read more about it here.
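If you’re curious which Kubernetes versions are currently available to upgrade to, you can ask GKE directly; the zone below is just an example, so swap in your own:

# List the master and node versions supported in a given zone
gcloud container get-server-config --zone=us-central1-f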

 

GKE and Docker Work Well Together

Every good relationship needs a fun and supportive friend group. Enter Google Kubernetes Engine’s good friend Docker. GKE schedules your containers onto the cluster and manages them automatically based on the requirements you define (such as CPU and memory). And since it’s built on Kubernetes, you have the flexibility to take advantage of on-premises, hybrid, or public cloud infrastructure. GKE supports the standard Docker container format and makes it simple to store and access your private Docker images in Google Container Registry. In other words, you’re able to run Docker containers on GKE, powered by K8s.
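To make that concrete, here’s a minimal sketch of the workflow, assuming a locally built image called myapp and your project ID in $PROJECT_ID (both placeholders):

# One-time setup: let Docker push to Google Container Registry with your gcloud credentials
gcloud auth configure-docker

# Tag the local image and push it to your project's private registry
docker tag myapp gcr.io/$PROJECT_ID/myapp:v1
docker push gcr.io/$PROJECT_ID/myapp:v1

# Run it on the cluster and declare the CPU/memory requests the scheduler should honor
kubectl create deployment myapp --image=gcr.io/$PROJECT_ID/myapp:v1
kubectl set resources deployment myapp --requests=cpu=100m,memory=128Mi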

Networking is Easier

As I mentioned before, Google Kubernetes Engine loves to help out Kubernetes any way it can by taking on complicated tasks that would otherwise be frustrating. Having to think about scaling the network, or running an overlay like Weave or Flannel yourself? GKE has you covered and manages all of it for you. After all, that’s what a supportive partner would do, right?
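For example, exposing a deployment to the internet is a single command, and GKE quietly provisions a Google Cloud load balancer and external IP for you. The deployment name and ports here are placeholders:

# Put the (hypothetical) myapp deployment behind a cloud load balancer
kubectl expose deployment myapp --type=LoadBalancer --port=80 --target-port=8080

# Watch for the external IP that GKE provisions
kubectl get service myapp --watch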

All Managed in One Place

No need to go back and forth between apartments: Google Kubernetes Engine and Kubernetes live in one place. With GKE’s Cluster Autoscaler, scaling has never been easier: if a node is underutilized and all of the pods running on it can be rescheduled elsewhere, the node is deleted. Conversely, if a newly created pod cannot be scheduled because the cluster is out of resources, a new node is added. You can enable autoscaling with the following gcloud command:

gcloud container clusters update mycluster --enable-autoscaling --min-nodes=1 --max-nodes=10 --zone=us-central1-f --node-pool=default-pool
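Once that command finishes, you can double-check the node pool’s autoscaling limits (same placeholder cluster, zone, and pool as above):

# Inspect the node pool, including its autoscaling min/max settings
gcloud container node-pools describe default-pool --cluster=mycluster --zone=us-central1-f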

 

Price

Relationships can get pricey: dates, vacations, rent, buying a house. Whoa, that escalated quickly! With pay-as-you-go pricing, GKE is a fantastic deal: you pay only for what you use, when you use it. With no upfront costs and no termination fees, it truly is one of the simplest and best-priced options for running your Docker workloads.

All ridiculous Google Kubernetes Engine & Kubernetes relationship metaphors aside, they truly are a match made in heaven when it comes to running K8s in the cloud. Native integration, first-class Docker support, simpler networking, centralized management, and pay-as-you-go pricing are just the top 5 reasons we think GKE & K8s are a great pair.

 

Automating deployment to GKE is easy too! Check out our documentation here for more info on how Codefresh can help.
