Intercloud: An Emerging Architecture for Cloud Analytics

As enterprises navigate an expanding sea of data and look to draw more advanced analytics from it, cloud computing architectures, with their immense scale, elasticity, and on-demand pricing, provide a natural solution.

The 2020 IDC Global DataSphere forecast estimated that 34% of all enterprise data was in the cloud in 2019, and that 51% of it would be there by 2024. Getting the most speed and value from cloud analytics requires thinking about what kind of cloud architecture your data will inhabit, and how you will orchestrate this information’s movement between cloud environments.

[Image: Intercloud platform for cloud data analytics]

Cloud analytics challenges — and the rise of intercloud

Simply moving data into the cloud, for example to take advantage of low-cost storage, is neither a good way to unlock that data’s full insights nor an optimal use of the advanced analytics capabilities of modern public clouds.

Moreover, as data spreads across multiple clouds, there is a real risk of creating a “data swamp”: a disconnected sprawl of disparate services, along with the tools and workflows for managing them. This setup can become costly because there is no connected multi-cloud platform to consistently ingest all of the data sources in play, integrate with the different clouds they live in, and optimize the pricing and performance of the associated workloads.

Enterprises recognize this overarching issue, and in response they’re seeking more seamless visibility and control across these multiple cloud ecosystems, per a 2021 survey from IDC and Seagate. The ideal solution connects numerous data sources to one platform while automatically using the best available cloud for each analytics use case.

Enter the concept of intercloud, one of the most important emerging cloud trends. Intercloud is an evolution of the increasingly popular multi-cloud deployment model, intended to overcome some key limitations of that architecture through tighter connectivity. Multi-cloud setups are inherently complex, with numerous considerations around:

  • Application portability and recoding
  • Data source locations and colocation
  • Costs, e.g. from data egress charges and spikes in demand
  • Movement of enterprise data, and the latency involved

Intercloud can simplify some of these issues by making multi-cloud analytics operations more automated and intelligent. In short, intercloud lets enterprises move data and workloads without having to think as much about how to optimize them.

What is intercloud, exactly?

Intercloud is a cloud deployment model that links multiple public cloud services together as one holistic, actively orchestrated architecture. Activity is coordinated across these clouds to move workloads (e.g., for data analytics) automatically and intelligently, based on criteria like cost and performance.

Intercloud vs. multi-cloud vs. hybrid cloud

An intercloud architecture requires a multi-cloud deployment. Multi-cloud means running applications in multiple clouds, i.e. using infrastructure from more than one cloud provider, such as AWS, Microsoft Azure, or Google Cloud.

Both intercloud and multi-cloud are different from hybrid cloud, which is the combination of a public cloud with on-premises infrastructure. Hybrid multi-cloud is a combination of multiple clouds, along with an on-premises environment.

To use an analogy, think of the three architectures like this:

  • Hybrid cloud: An apple (public cloud) and an orange (on-premises).
  • Multi-cloud: Multiple distinct apple varieties.
  • Intercloud: A new apple varietal, seamlessly combining the traits of multiple distinct varieties.

Although intercloud is an advanced form of multi-cloud computing, the way it works is actually most similar to a hybrid cloud.

An intercloud architecture moves data between the infrastructure of multiple cloud service providers (CSPs). This movement is analogous to how a hybrid cloud lets on-premises data move back and forth to the public cloud, depending on the use case and its associated cost and latency requirements.

Three reasons to explore intercloud

Intercloud is still in its infancy. Many enterprises are in the process of migrating their data sources to a multi-cloud architecture, or connecting their on-premises environments to a hybrid cloud. But intercloud could soon become a viable alternative to traditional multi-cloud and hybrid cloud, due to:

1. Active cloud management

This is the major value proposition of intercloud. Decisions about where to place a particular dataset or application no longer require human deliberation, as would happen with multi-cloud. The intercloud architecture “knows” where to send the information at any given moment.

Intercloud also takes advantage of tight integrations between public clouds and orchestrates activity on its own, on the fly. Think of it as the cloud architectural equivalent of a TV equipped with variable refresh rate (VRR). For example:

VRR-enabled TVs can dynamically change their refresh rate (e.g., anywhere from 60 Hz to 120 Hz) to provide the smoothest motion for the content on screen, based on the current frame rate. All of this happens automatically and doesn’t require digging through menus to change settings. Intercloud does something similarly streamlined for cloud analytics workloads.
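To make that automation concrete, here is a minimal, purely illustrative Python sketch of the kind of placement policy an intercloud orchestrator might apply, weighing each candidate cloud’s compute cost against its latency to the data. The cloud names, metrics, and weights are hypothetical, not a description of any particular product.

from dataclasses import dataclass

@dataclass
class CloudOption:
    # One candidate public cloud for a given analytics workload (illustrative values only)
    name: str
    cost_per_hour: float  # current compute price in USD
    latency_ms: float     # estimated latency to the workload's data sources

def choose_cloud(options, cost_weight=0.7, latency_weight=0.3):
    # Lower score is better; the weights express how much this workload
    # cares about price versus responsiveness.
    def score(opt):
        return cost_weight * opt.cost_per_hour + latency_weight * (opt.latency_ms / 100)
    return min(options, key=score)

# The orchestrator would re-run this decision as prices and latencies change,
# much as a VRR television adjusts its refresh rate to the incoming frame rate.
candidates = [
    CloudOption("cloud_a", cost_per_hour=3.20, latency_ms=40),
    CloudOption("cloud_b", cost_per_hour=2.75, latency_ms=65),
]
print(choose_cloud(candidates).name)  # prints "cloud_b" with these illustrative numbers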

2. Cost savings

The intelligent automation of intercloud makes it easier to get the best price for cloud resources moment by moment. For instance, an intercloud deployment may look up the spot price of compute in Cloud B, find that it is lower than the current equivalent cost in Cloud A, and move a workload from Cloud A to Cloud B automatically.
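To illustrate that spot-price scenario, here is a hedged Python sketch. The helpers get_spot_price and migrate_workload are hypothetical stand-ins for whatever pricing APIs and orchestration actions a real intercloud platform would expose, and the prices are invented for the example.

def maybe_move_workload(workload_id, current_cloud, candidate_cloud,
                        get_spot_price, migrate_workload, min_savings=0.10):
    # get_spot_price and migrate_workload are hypothetical callables standing in for
    # a pricing API and an orchestration action; min_savings guards against churn
    # from tiny price differences (and from migration costs such as data egress).
    current_price = get_spot_price(current_cloud)
    candidate_price = get_spot_price(candidate_cloud)
    if candidate_price < current_price * (1 - min_savings):
        migrate_workload(workload_id, source=current_cloud, target=candidate_cloud)
        return True
    return False

# Illustrative run: Cloud B is about 20% cheaper, so the workload moves automatically.
prices = {"cloud_a": 2.50, "cloud_b": 2.00}
maybe_move_workload(
    "nightly-analytics", "cloud_a", "cloud_b",
    get_spot_price=prices.get,
    migrate_workload=lambda wid, source, target: print(f"moving {wid}: {source} -> {target}"),
)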

Such functionality makes intercloud a powerful arrow in the cloud cost optimization quiver. The combination of a connected multi-cloud data platform and intercloud architecture lets enterprises find the right mix of consumption and reserved pricing for a wide range of possible analytics workloads and business requirements.

3. Performance optimization

By matching every workload with the best-suited cloud service provider, intercloud ensures optimal performance and pricing for analytics. Data from numerous sources can be reliably loaded, queried, and scaled across clouds.

Larry H. Miller Sports and Entertainment, former owner of the Utah Jazz, connected over 80 data sources to its cloud data platform deployed on AWS, hinting at the types of benefits a future intercloud architecture will deliver:

  • Data sources ranging from cameras to social media accounts were integrated onto its data platform.
  • AI and machine learning helped to extract deep insights from this information and create a comprehensive picture of the company’s customers.
  • The new cloud architecture let the company rapidly scale its analytics during the surge in activity on game days.

Looking ahead, intercloud will extend this level of performance and scalability for analytics across each company’s clouds of choice. A key enabler of intercloud will be a flexible data platform that works the same in every cloud and offers first-party integrations with the major cloud services.

Teradata Vantage delivers these capabilities. It can be deployed on any public cloud to ingest data from any source, deliver deep insights, and harness the power of multi-cloud and intercloud connectivity.