Matching systems with mission critical data

    Chung Heng Han, Senior Vice President for Systems in JAPAC and EMEA at Oracle, interviewed online.

    ENTERPRISES today run on data.

    Every era of industrial revolution has been powered by a different motive force: steam, then electricity, then computerization and automation. Now it is data that fuels Industrial Revolution 4.0.

    Just to put that in perspective, 2.5 exabytes of data are created every day. That is 2,500,000,000,000,000,000 (2.5 quintillion) bytes, which would take roughly 100 million Blu-ray (25GB) discs to store. Stacked on top of one another, those discs would reach around 120 kilometers high, the height of roughly 400 Eiffel Towers.
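    As a sanity check on those figures, the arithmetic can be sketched as follows (assuming the article's 2.5-exabyte daily rate, standard 25 GB single-layer Blu-ray discs, and the standard 1.2 mm optical-disc thickness):

```python
# Back-of-envelope check of the daily data volume figures quoted above.
# Assumptions: 2.5 exabytes/day (the article's figure), 25 GB per disc,
# 1.2 mm per disc (standard thickness for Blu-ray/DVD/CD media).

DAILY_BYTES = int(2.5e18)        # 2.5 exabytes = 2.5 quintillion bytes
DISC_BYTES = 25 * 10**9          # 25 GB single-layer Blu-ray
DISC_THICKNESS_M = 1.2e-3        # 1.2 mm per disc

discs_per_day = DAILY_BYTES // DISC_BYTES
stack_height_km = discs_per_day * DISC_THICKNESS_M / 1000

print(f"{discs_per_day:,} discs per day")          # 100,000,000
print(f"stack height: {stack_height_km:,.0f} km")  # 120 km
```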

    Experts say that 90% of this gigantic ocean of data has been created in the last two years alone, and the amount is growing exponentially.
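    To see why that 90% figure implies exponential growth, consider this illustrative calculation (it assumes only the statistic above, not any measured growth data):

```python
import math

# If 90% of all data was created in the last two years, everything that
# existed before amounts to only 10% of today's total, i.e. the total
# stock of data grew 10x in two years.
growth_over_two_years = 1 / 0.1                 # 10x
annual_factor = growth_over_two_years ** 0.5    # ~3.16x per year
doubling_time = math.log(2) / math.log(annual_factor)

print(f"~{annual_factor:.2f}x per year, doubling every {doubling_time:.2f} years")
```

    In other words, the statistic implies the world's data stock roughly doubles every seven months.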

    How will organizations manage this massive data growth?

    With all this enterprise data scattered and stored everywhere, in on-premises servers, data centers and the Cloud, how can organizations get the best of both worlds—the might of on-premises infrastructure with the elasticity of the Cloud?

    Malaya Business Insight asked Chung Heng Han, Senior Vice President for Systems in JAPAC and EMEA at Oracle, how the world will process, store, move, retrieve and transfer all this data. He points to a new data management model Oracle calls ‘Cloud Adjacency,’ which he says delivers that “best of both worlds” scenario in a simplified way.

    Q: What is the most critical role data plays in business today?
    A: It is increasingly clear that the role data plays within a business is what makes the difference between an organization being a business and being a digital business.  Those that are data-driven recognize that leveraging data lets them work differently and create different types of “engagement models”, and the difference is profound.

    But accessing all the data companies hold and are collecting can be very hard.  More often than not, it is locked in separate silos, both on-premises and on different Clouds.  In fact, recent Forrester research noted that 73% of organizations operate disparate and siloed data strategies, and 64% are still grappling with the challenge of managing a multi-hybrid infrastructure.

    No wonder 70% of organizations consider the need to simplify their processes a high or critical business priority.

    Q: Given that scenario, how are companies trying to cope?
    A: More often than not, people are turning to the public Cloud in the first instance to help them do this.  Its benefits are well known: improved agility, greater speed to market, faster innovation, elastic scalability, cost optimization, enhanced productivity and data-driven decision making. Today, the Cloud has become a core business enabler.

    But at the same time, it must be said that the Cloud is still in its early days, with analysts estimating less than 20% penetration, most of it for non-critical workloads.  The reason is that a one-size-fits-all, generic approach does not work, as no two organizations have the same infrastructure needs.  And moving is widely agreed to be especially hard for critical workloads, such as these valuable data stores.

    Q: The penetration rate for the Cloud is still very small. What should be done?
    A: So, while many CIOs may dream of a standardized and unified infrastructure based on one or two strategic vendors, the reality of enterprise infrastructure is that the different elements that create today’s key applications, including the databases they run on, will be split between disparate public Clouds, old-school on-premises resources, and private Clouds.

    And what we are seeing is that most public Cloud users (about 81%, according to a recent Gartner survey) are using multiple Cloud providers and are running either a hybrid or multi-Cloud strategy, or a mixture of both.

    Q: What’s the difference between hybrid and multi-Cloud?
    A: Hybrid Cloud is the combination of public infrastructure Cloud services with private Cloud infrastructure, generally with on-premises servers running Cloud software. The public and private environments basically operate independently of each other and communicate over an encrypted connection, either through the public internet or through a private dedicated link.  Hence, the term hybrid—two different models but still connected.

    Multi-Cloud, on the other hand, is 100% public Cloud, where infrastructure is spread between different Cloud providers or within regions on the same Cloud.

    Q: What factors should be considered when choosing one over the other?
    A: Multi-Cloud’s main advantage is that organizations and application developers can pick and choose components from multiple vendors and use the best fit for their intended purpose.

    This ability to be selective is critical for data-driven organizations that are using their data as an asset.  It can potentially enable them to move corporate data closer to key Cloud services, such as high-performance compute and new services that allow them to access things like AI, machine learning (ML), and advanced analytics so that they can construct new business models.

    In effect, this is not so much about moving data into the Cloud, this is about moving the Cloud and Cloud services to the data.

    Q: Why would you do that?
    A: Some workloads move to the public Cloud more easily than others.

    For enterprise applications, the public Cloud—whether by need, default or mandate—can be excellent.  In fact, recent research by Enterprise Strategy Group (ESG) shows a steady upward trend in the use of Infrastructure-as-a-Service (IaaS) such as Oracle’s Generation 2 platform.  Or companies are using the most mature area of Cloud, Software-as-a-Service (SaaS).

    But for other workloads, there can be real issues.  For example, moving critical databases into a generic public Cloud can introduce significant business risk, simply because these databases are so critical.

    It also brings challenges relating to performance, scalability, security, and data sovereignty.  There are also unintended consequences: increased latency that can lead to SQL time-outs, and high networking costs that can be a real shock to IT budgets.  This means it can be hard for organizations to maintain their application SLAs (service-level agreements).
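    The latency point is easy to quantify. As an illustrative sketch (the query count and round-trip times below are hypothetical figures, not measurements from any particular Cloud provider):

```python
# Illustrative effect of network round-trip time (RTT) on a "chatty"
# database workload that issues queries sequentially.
# All numbers below are hypothetical, chosen only to show the scaling.

def network_wait_s(num_queries: int, rtt_ms: float) -> float:
    """Time spent purely on network round trips for sequential queries."""
    return num_queries * rtt_ms / 1000

CHATTY_QUERIES = 500   # sequential queries in one business transaction
LAN_RTT_MS = 0.5       # app and database in the same facility
WAN_RTT_MS = 40.0      # database reached over a distant network link

print(f"same facility: {network_wait_s(CHATTY_QUERIES, LAN_RTT_MS):.2f} s")  # 0.25 s
print(f"distant Cloud: {network_wait_s(CHATTY_QUERIES, WAN_RTT_MS):.2f} s")  # 20.00 s
```

    Twenty seconds of pure network wait on a transaction that used to take a quarter of a second can easily blow past a statement timeout, which is exactly the SQL time-out effect described above.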

    Q: Given these choices, what is the answer?
    A: Many IT professionals are looking for ways to address the challenging—and ever more stringent—performance, scalability, availability, security, and cost requirements demanded of their mission-critical applications and technologies, such as databases.  And they are trying to do this while simultaneously enabling their organizations to embrace some amount of public Cloud services, optimally and on their terms.

    This can make them feel like they are between a rock and a hard place.

    The desired situation for many customers is to ‘have their IT Cloud cake and eat it too’. In other words, find a model that offers the elasticity of the Cloud and the processing power of on-premises IT to address this conundrum.

    A new model, providing ‘Cloud Adjacent Architecture’, can help offer a solution to those not yet willing or able to consider using the public Cloud. In effect, it puts their data on powerful Cloud-ready hardware close to the public Cloud across a globally interconnected exchange of data centers. This then enables enterprises to interconnect securely to the Cloud, as well as other business partners, whilst also directly lowering latency and networking costs.

    By doing this, enterprises can reduce their data center footprint and leverage the scale and variety of modern public Cloud services, while still retaining the control, precision, and data ownership of on-premises infrastructure.

    Q: What types of workloads is it best suited to?
    A: It suits several types of organization.  Enterprises trying to achieve better business and product development agility, or trying to get out of the data center business (a stepping stone to the Cloud).  Those with data sovereignty challenges on public Clouds, and those that have moved workloads to the Cloud and now face application integration and latency issues.  And companies with specialized workloads they want to move to the edge of the Cloud, or with performance, scalability and special capability requirements that Cloud providers cannot solve.

    Q: What is Oracle’s solution?
    A: We looked into how we could put those databases on Exadata, adjacent to the Cloud or to the Cloud-based applications connected to them.

    The result is what we believe is a very pragmatic approach.  Oracle is working with key hosting providers, such as Equinix, to place its next generation of Exadata Engineered Systems directly in their data centers.  This unites the competitive advantages of on-premises architecture with generic public Cloud services. Organizations gain on-premises levels of performance predictability, scalability, and high-availability features, as well as improved control, increased data sovereignty, and higher security, all while still being able to embrace the public Cloud.

    Additionally, by placing Exadata into such facilities, organizations can leverage the high-speed interconnections to any of the public Clouds available directly in that facility, eliminating the cost of the dedicated networking pipes that organizations must otherwise pay each public Cloud provider for to meet their required database SLAs.

    In the case of such a set up in Equinix, users of this solution have reported savings of up to 70% on their networking costs by connecting to public Clouds within an Equinix data center rather than trying to reach these Clouds from their own data centers.

    Moreover, and crucially, this Cloud Adjacent solution represents a zero-change architecture to optimally and simultaneously address on-premises performance and security needs and multi-Cloud operational strategies.

    Best of all, customers can choose who manages the data—whether it’s the customer themselves, a partner, or a systems integrator—and how it gets done, providing total flexibility and enabling customers to retain control of their Oracle Database licenses and their data.

    Q: Where might we expect to see multi-Cloud strategies being deployed?
    A:  There are many different types of use cases, but one of the most understandable is around the deployment of multi-Cloud architectures to support smart city initiatives. With the proliferation of sensors, cameras and other types of technologies, our urban infrastructure is changing from being purely physical to including data and technology. The convergence of the digital and physical worlds provides those driving smart cities initiatives with a unique opportunity to understand better the dynamics of a location on a real-time basis and then use insights to provide value back to residents and businesses through the provision of new or better services, often delivered by an app.

    But to do this means harnessing a proliferation of data, which given the number of parties involved, is likely to sit in different systems and even different Clouds.  And when you start bringing in big data, social and mobile services into the equation, it gets even more interesting.

    This new approach can be used to give the best of both worlds within a multi-Cloud deployment, hopefully removing some of the core drawbacks of data complexity and helping bring a more rapid roll-out of projects that could make a significant difference to all our lives.


