Nowadays, relocating databases to the cloud has become common practice. Lower maintenance costs are often the first priority when introducing cloud computing. It is very attractive for IT decision-makers to permanently eliminate a large part of the investment costs for hardware and software. It is even more attractive to reduce the operating costs for installation, maintenance, updates, patches, and the retirement of end-of-life databases, all without additional administrative effort. However, it is important to keep an overview of the expenses that are incurred instead. In addition to licenses and subscriptions, organizations often incur indirect costs such as high network-usage fees during migration, loss of customer information during outages, and lost revenue due to unexpected downtime during database migration.
Reliability and redundancy are important factors when adopting the cloud. With tens or hundreds of data centers worldwide, most cloud vendors offer high reliability. Providers also employ a large number of administrators to run their data centers and ensure that there is no single point of failure.
One of the key benefits of working in the cloud is elasticity: resources can be added and removed quickly, keeping capacity closely matched to demand. This means that development environments for databases can also be adapted flexibly.
A decisive advantage of the cloud is the level of security, which is usually higher than would be possible in a company’s own data center. Cloud providers employ dedicated staff who follow security bulletins and perform white-hat penetration tests on the servers to improve security. Few companies have the resources or technical depth to do this themselves.
The right speed
According to Oracle, all enterprise data and development and test projects will probably be in the cloud by 2025. IT decision-makers should therefore take this time horizon as a benchmark for taking appropriate action. Proper planning prevents poor performance. Exploiting the advantages of cloud computing is the reward for a conscious, committed approach.
With the exception of start-ups provisioning computing resources for the first time, moving to the cloud is a continuous journey. Most companies will be busy for the foreseeable future gradually moving local systems to the cloud. So start small: move the development environment to the cloud, or move discrete applications that have few hooks into other systems. Moving an app or part of the business, such as the helpdesk to ServiceNow or Zendesk, reduces the risk of business interruption. A function like payroll, on the other hand, is a candidate for later migration. No company that runs payroll accounting in-house wants to bite off a business function with so many internal processes first.
Of course, companies using SaaS applications such as Salesforce and Office 365 have already moved to the cloud, at least in part. Virtual Desktop Infrastructure (VDI) also reduces dependence on local infrastructure by allowing users to log on to a client and work entirely on a pre-configured virtual desktop with no local storage or installed software.
What should be moved?
In principle, it makes sense to move all existing databases to the cloud in order to maximize the benefits of the cloud. However, this is sometimes associated with unexpected costs.
This can be easily understood using the example of an Oracle implementation. When moving an Oracle Multicore Processor Licensing (MPL) deployment that runs on-premises, a 50% Oracle Core Factor is available. In the example, eight physical cores are running in the corporate server, so only four processor licenses must be paid for those eight cores. But if the company wants to move to an Amazon AWS environment with eight virtual cores, Oracle’s position is that the core factor does not apply in the cloud. This means that there are now fees for eight cores instead of the four in the previous on-premises Oracle implementation.
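The arithmetic behind this example can be sketched as follows. The function names and the fixed 50% factor are illustrative assumptions for this scenario, not an official Oracle licensing calculator:

```python
# Illustrative licensing arithmetic for the example above (not an official
# Oracle price calculation). ORACLE_CORE_FACTOR is the 50% factor assumed
# for the on-premises processors in this scenario.
ORACLE_CORE_FACTOR = 0.5

def licenses_on_prem(physical_cores: int) -> float:
    """Licenses required on-premises, where the core factor applies."""
    return physical_cores * ORACLE_CORE_FACTOR

def licenses_in_cloud(virtual_cores: int) -> float:
    """Licenses required in the cloud, where — per the example — the
    core factor does not apply and each virtual core counts in full."""
    return float(virtual_cores)

print(licenses_on_prem(8))   # 4.0 licenses for eight physical cores
print(licenses_in_cloud(8))  # 8.0 licenses for eight virtual cores
```

The same eight cores thus require twice as many licenses after the move, which is exactly the unexpected cost the example warns about.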
Development and quality assurance are good use cases for the cloud. It is an advantage to quickly create multiple instances for writing and testing apps, as long as you don’t forget to remove them when they are no longer needed. The absolute cost is low, but it adds up over time.
When migrating a database to the cloud, the first step should be to identify low priority tables and schemas, such as development or QA databases. These mark the starting point for the migration. Before moving an entire local database to a database in the cloud, database administrators should identify use cases such as data integration, disaster recovery, and outsourced reporting that require data availability but do not impact application availability.
For the sake of simplicity, some companies start from zero, i.e. without historical data. They take software comparable to Oracle E-Business Suite, set it up in the cloud with all its customizations and flexfields, and start at the beginning of a new fiscal year or quarter. The local version is retained for querying historical data. Not migrating the old database makes the move to the cloud easy.
“Big Bang” is an approach to get into the cloud in one fell swoop, say, over a weekend. This involves some disruption and risk. Migrating a system while it is not in use can be an attractive solution for applications with small databases and regular downtime windows. Database administrators back up the database and applications, restore them to the cloud, and start them there the next business day.
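The back-up-and-restore sequence above can be sketched as follows, using SQLite as a stand-in for both the local and the cloud database. In a real migration the vendor’s own dump and restore tools would be used; all names here are illustrative:

```python
# A minimal "big bang" sketch: back up the entire source database and
# restore it into the target in one step, as would happen over a
# migration weekend while the source is offline for users.
import sqlite3

def big_bang_migrate(source: sqlite3.Connection,
                     target: sqlite3.Connection) -> None:
    """Copy the full source database into the target in one operation."""
    source.backup(target)  # SQLite's built-in online backup API

# Demo: a small "local" database migrated to an empty "cloud" database.
local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE helpdesk (id INTEGER, ticket TEXT)")
local.execute("INSERT INTO helpdesk VALUES (1, 'reset password')")
local.commit()

cloud = sqlite3.connect(":memory:")
big_bang_migrate(local, cloud)
print(cloud.execute("SELECT ticket FROM helpdesk").fetchall())
# [('reset password',)]
```

The appeal and the risk are both visible here: everything moves in a single step, so the cutover is simple, but nothing is usable until the whole copy has finished.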
Cloud migration of databases should not affect the applications running on those databases before, during, or after migration. Users must be able to perform tasks such as reporting, querying, and analysis throughout the migration process. In addition, database administrators should be able to fall back to the production database in the event of a problem during migration without impacting user activity.
The four different approaches to migration must take this into account. The first approach is low-risk but slow if database administrators start with tables and schemas that support applications but are not business-critical. The second, clean approach means a hard cut by eliminating history. The challenge here is to build brand new applications with databases in the cloud without migrating old databases. The big bang approach is the most comprehensive, but generates longer downtime. The fourth approach is the most intelligent and innovative. It involves replicating data from a source database to a target database in the cloud. Replication is clearly recommended for a smooth and low-risk migration.
To apply these approaches, data replication solutions are available that record database changes and keep source and target instances in sync, making migration to the cloud easier and more secure.
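The replication approach can be sketched as a minimal change-data-capture loop: changes to the source are recorded in a log and replayed against the cloud target until the two are in sync. The classes and method names below are illustrative, not the API of any real replication product:

```python
# Minimal change-data-capture sketch: the source records every change in
# an ordered log; replication replays unseen log entries on the target.
class ReplicatedTable:
    def __init__(self):
        self.rows = {}        # primary key -> row data
        self.change_log = []  # ordered (operation, key, value) entries

    def upsert(self, key, value):
        self.rows[key] = value
        self.change_log.append(("upsert", key, value))

    def delete(self, key):
        self.rows.pop(key, None)
        self.change_log.append(("delete", key, None))

def replicate(source: ReplicatedTable, target: dict, applied: int) -> int:
    """Apply all source changes the target has not yet seen; return the
    new log position so replication can resume incrementally."""
    for op, key, value in source.change_log[applied:]:
        if op == "upsert":
            target[key] = value
        else:
            target.pop(key, None)
    return len(source.change_log)

source = ReplicatedTable()
target = {}
source.upsert(1, "orders")
pos = replicate(source, target, 0)    # initial sync
source.upsert(2, "invoices")
source.delete(1)
pos = replicate(source, target, pos)  # incremental sync during migration
print(target)  # {2: 'invoices'}
```

Because only the changes since the last sync are shipped, the source stays fully available while the target converges — which is why replication supports a smooth, low-risk migration.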