According to a recent survey from KeyBanc Capital Markets, 32 percent of CIOs say they plan to use multiple vendors to create internal private cloud systems, while 27 percent plan hybrid cloud arrangements.
And it’s no secret why. When making the leap to the cloud, organizations have two goals in mind: achieving economies of scale not typically available to smaller organizations and gaining access to computing resources that free up their own staff and infrastructure.
However, attaining these highly sought-after benefits isn’t always easy. As with any innovation, a move to the cloud brings a new set of challenges. For database managers, complex data integrations spanning multiple environments can make for a messy, complicated scenario.
Below, we outline the most common issues database managers experience and the best practices customers have implemented to manage them.
Challenge 1: Resetting your thinking
Cloud environments are far more fluid and dynamic than a typical server room, so you must adjust the habits you developed working with on-premise technologies. Any single cloud server may suddenly disappear, for any reason, at any time, and there’s a good chance it will never come back. The flip side is that the cloud makes it easy to get an identical server up and running almost instantly. Even for the simplest applications, keeping your entire application fault tolerant and highly available means embracing distributed computing principles.
Unlike a static, immovable on-premise infrastructure, a cloud deployment can’t simply be set up and forgotten. Review and revise your cloud configurations regularly. Returning to the example of the disappearing server, make sure your data lives in multiple locations, and understand and apply cloud-specific accommodations for high availability, persistent storage of data and active monitoring of your systems.
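As a rough illustration of this distributed mindset, the sketch below writes each record to several replicas and only treats the write as durable once a quorum acknowledges it. The `Replica` class is a hypothetical in-memory stand-in for real storage nodes; in practice you would lean on your cloud provider’s replicated storage or your database’s own replication features.

```python
# A minimal sketch of quorum-style replicated writes. Replica is an in-memory
# stand-in for a storage node that, like any cloud instance, can vanish at any time.
import random

class Replica:
    """Hypothetical storage node that may become unreachable without warning."""
    def __init__(self, name):
        self.name = name
        self.data = {}

    def write(self, key, value):
        # Simulate the ever-present chance that a cloud instance disappears.
        if random.random() < 0.2:
            raise ConnectionError(f"{self.name} is unreachable")
        self.data[key] = value

def replicated_write(replicas, key, value, quorum):
    """Write to every replica; succeed only if a quorum acknowledges."""
    acks = 0
    for replica in replicas:
        try:
            replica.write(key, value)
            acks += 1
        except ConnectionError:
            continue  # a vanished server is expected, not exceptional
    if acks < quorum:
        raise RuntimeError(f"only {acks}/{len(replicas)} replicas acknowledged")
    return acks

replicas = [Replica(f"node-{i}") for i in range(3)]
try:
    acks = replicated_write(replicas, "order:1001", {"status": "shipped"}, quorum=2)
    print(f"write durable on {acks} replicas")
except RuntimeError as err:
    print(f"write failed, retry or alert: {err}")
```

A quorum of two out of three tolerates the loss of any single node, which is exactly the failure mode the cloud asks you to plan for.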
Challenge 2: Automating the cloud
Working with many servers, networks and machines, anything can go on the fritz at any time. The common fix for anything that isn’t working, “Have you tried turning it off and on again?”, has an interesting cloud version. In the cloud, everything is disposable: when things aren’t working, you simply wipe the machine and start from a clean state. This shift in thinking requires an organization to employ automation strategies that make it easy to deploy, update, monitor and manage servers, and to rebuild everything from scratch whenever needed.
One example in which we’ve seen an increased need for (and knowledge of) automation among clients is the deployment of a new release to production. In the cloud, such an action typically requires little more than a single docker pull command, a deceptively simple operation that hides enormous complexity behind it. Before the cloud, you would log onto the server, stop the application, update the binaries and start the application again. Now you package the application in Docker, push it to a registry and deploy it; everything else is managed for you.
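Here is a minimal sketch of that dispose-and-redeploy style, assuming a hypothetical image name and driving the Docker CLI directly; real pipelines usually hand this work to an orchestrator such as Kubernetes rather than raw docker commands.

```python
# A minimal "dispose and redeploy" sketch. The image and container names are
# hypothetical; adjust them to your registry and naming conventions.
import subprocess

IMAGE = "registry.example.com/myapp:1.4.2"  # hypothetical image tag
CONTAINER = "myapp"

def deploy():
    # Fetch the new release from the registry.
    subprocess.run(["docker", "pull", IMAGE], check=True)
    # Throw away the old instance instead of patching it in place.
    # check=False: it's fine if no old container exists yet.
    subprocess.run(["docker", "stop", CONTAINER], check=False)
    subprocess.run(["docker", "rm", CONTAINER], check=False)
    # Start fresh from a known-clean state.
    subprocess.run(["docker", "run", "-d", "--name", CONTAINER, IMAGE], check=True)

if __name__ == "__main__":
    deploy()
```

Because the old container is thrown away rather than patched, every deployment starts from the same known-clean state, the cloud version of turning it off and on again.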
Challenge 3: Migrating your data
Moving your company’s assets to the cloud is a long, gradual process. No organization migrates its on-premise systems one day and runs them entirely in the cloud the next; the process usually takes years to complete. That’s why many organizations run hybrid environments instead.
Depending on your systems and infrastructure, navigating a hybrid system can be as easy as setting up a VPN tunnel between the two locations and expanding your pre-existing distributed architecture to the new location. But if your system presumes a single source of truth (SSOT), you will run into issues now that you have multiple sources. Most organizations handle this with extract, transform and load (ETL) and data flow processes, which require the data team to carefully delineate the ownership of each data item and its flow through the organization before it can be integrated into the new cloud environment.
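As a simplified sketch of one such ETL pass, the snippet below copies a customer table from an on-premise database into its cloud counterpart, normalizing records on the way so both sides agree on their shape. In-memory sqlite3 databases stand in for both environments, and the table and column names are hypothetical.

```python
# A minimal ETL sketch for a hybrid setup. sqlite3 stands in for both the
# on-premise source and the cloud target; in practice these would be two
# different drivers and connection strings.
import sqlite3

def extract(source):
    # Pull the rows the cloud side is allowed to own.
    return source.execute("SELECT id, email, updated_at FROM customers").fetchall()

def transform(rows):
    # Normalize on the way across so both environments agree on the shape.
    return [(rid, email.strip().lower(), updated_at) for rid, email, updated_at in rows]

def load(target, rows):
    # Upsert so repeated runs of the flow stay idempotent.
    target.executemany(
        "INSERT OR REPLACE INTO customers (id, email, updated_at) VALUES (?, ?, ?)",
        rows,
    )
    target.commit()

# In-memory stand-ins for the on-premise and cloud databases.
on_prem = sqlite3.connect(":memory:")
on_prem.execute("CREATE TABLE customers (id INTEGER, email TEXT, updated_at TEXT)")
on_prem.execute("INSERT INTO customers VALUES (1, ' Ada@Example.com ', '2019-01-01')")

cloud = sqlite3.connect(":memory:")
cloud.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, updated_at TEXT)")

load(cloud, transform(extract(on_prem)))
print(cloud.execute("SELECT * FROM customers").fetchall())
```

The important design choice is not the plumbing but the ownership rule: each table (or even each record) should have exactly one agreed source of truth, with flows like this one moving data strictly in that direction.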
Challenge 4: Securing the cloud (beyond your provider)
When primary servers and databases are safely locked in a vault deep beneath your offices, you have a certain expectation of physical security. In the cloud, you’re typically even safer, thanks to the ongoing intrusion detection, distributed denial of service (DDoS) prevention and active monitoring performed by the cloud infrastructure.
While cloud providers take care of many security issues for you, plenty can still go wrong. For example, if you make a server public without following the appropriate security protocols, nothing will prevent a hacker from waltzing into your innermost systems and wreaking havoc. Over the past few years there have been countless examples of companies having millions of users’ records stolen and ransomed back to them because a database was left open when it went into production.
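One cheap safeguard, sketched below under the assumption of a hypothetical public hostname, is to probe your own database ports from a machine outside your network and alert if any of them answer.

```python
# A quick exposure check: run it from OUTSIDE your network to verify that
# database ports are not publicly reachable. Host and ports are examples.
import socket

def is_port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False  # refused, filtered or timed out: not reachable

# Default ports for a few common databases.
for port, name in [(5432, "PostgreSQL"), (3306, "MySQL"), (27017, "MongoDB")]:
    if is_port_open("db.example.com", port):  # hypothetical public hostname
        print(f"WARNING: {name} port {port} is reachable from the internet")
    else:
        print(f"OK: {name} port {port} is not publicly reachable")
```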
As threats develop and evolve, security has become the bane of many organizations. Properly securing and protecting your database is easy to forget, but you don’t want to learn to lock the door the hard way, after the data is gone and your company’s name is in all the headlines.
See how CenturyLink can help you find the right cloud solution for your IT initiatives.