Choosing a Serverless Architecture: Advantages and Limitations

The desire to abandon the shackles of legacy software and infrastructure was a key driver—arguably, the key driver—of enterprises' full-speed-ahead move toward widespread use of the cloud. It redefined classic notions of how storage, compute, and networking operations were deployed, and gave considerable freedom to developers, allowing for the creation of truly dynamic apps.

This same sort of thinking is powering interest in a notable cloud trend—serverless architecture. DevOps is a natural use case for this technology as it allows for more efficient app development, but serverless computing can also offer a streamlined path toward agile cloud analytics and data governance. We'll delve into serverless architecture's fundamentals, potential benefits, and use cases, while also highlighting potential pitfalls that you must work to mitigate.

What is a serverless architecture?

The term serverless architecture is a slight misnomer because it implies there aren't servers anywhere in the ecosystem. However, apps that run on a serverless computing framework are still hosted on servers—but your enterprise isn't responsible for overseeing these servers in any way. The provider of your cloud service directly manages the serverless architecture and takes care of any related infrastructure needs, which helps you accelerate processes, achieve scalability more easily, and control costs.

Your organization, meanwhile, doesn't have to worry about provisioning, maintaining, or scaling servers to keep your cloud storage, database, and applications running effectively. In fact, many of these functions, such as scaling, are entirely automated with serverless architectures. The only aspect of app management your organization remains responsible for is writing and deploying code.

Function-as-a-service (FaaS) and backend-as-a-service (BaaS) are the two main types of serverless architecture:

FaaS

With this style of serverless platform, DevOps professionals write application code as a series of functions. Each function executes a specific task when it's triggered by a specific event, perhaps something as simple as receiving an email. Developers then deploy the functions and associated triggers to their cloud provider, and the vendor handles everything from there. Users can also reach apps written with serverless code through APIs, specifically via an API gateway managed by the cloud vendor. Unlike apps developed on platform-as-a-service (PaaS) products, FaaS apps don't require traditional server processes running in the background when the apps aren't being used.

All of the major public cloud providers have FaaS tools in their catalogs: Amazon was first to market with AWS Lambda, and Microsoft and Google followed soon after with Azure Functions and Google Cloud Functions, respectively. FaaS has become a hot topic these days because it provides an easy way to implement a microservices architecture.
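To make the model concrete, here's a minimal sketch of what a FaaS function can look like, written in the style of an AWS Lambda handler sitting behind an API Gateway trigger. The event shape, field names, and response format follow the common proxy-integration pattern, but treat the details as illustrative assumptions rather than a complete deployment.

```python
import json

# Minimal AWS Lambda-style handler, assuming an API Gateway proxy
# integration. The payload fields below are illustrative assumptions.
def handler(event, context):
    # API Gateway passes the HTTP request body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")

    # The function runs only for this invocation; no background
    # server process stays up between requests.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The same pattern applies to non-HTTP triggers such as a queue message or a file upload; only the shape of the event passed to the handler changes.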

BaaS

This type of serverless architecture doesn't offer the same degree of flexibility as its FaaS counterpart. Instead of independently writing and deploying code functions, members of the dev team have access to various pre-written tools from the organization's cloud provider—running entirely on provider-managed servers—to aid in app creation. Examples of these tools include database management, encryption, user authentication, remote updating, and hosting.

Developers who are short on time and can only focus on frontend function development may appreciate BaaS, whereas those looking to create from the ground up will find it limiting. Also, while numerous service providers implement BaaS on top of a serverless architecture, not all implementations of BaaS are truly serverless. Many require fixed resource locations, or run full time instead of on demand.
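By contrast, BaaS-style development leans on provider-managed building blocks rather than custom backend functions. The pseudocode-style sketch below illustrates the idea; the backend_sdk module, its methods, and the credentials are hypothetical placeholders, not a real library.

```python
# Hypothetical BaaS-style client; the module name, methods, and
# credentials are placeholders for whatever SDK your provider ships.
import backend_sdk

client = backend_sdk.connect(project="demo-project", api_key="...")

# User sign-up and session handling are provider-managed; the app
# never touches password storage or token issuance directly.
user = client.auth.sign_up(email="dev@example.com", password="s3cret")

# Persistence likewise goes through a managed datastore rather than
# a database server the team provisions and patches itself.
client.database.collection("profiles").insert({"user_id": user.id, "plan": "free"})
```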

4 key advantages of serverless computing

When properly deployed, serverless cloud computing can help enterprises realize a number of significant benefits. Some of the most notable include:

1. No infrastructure maintenance

Avoiding the responsibilities of infrastructure maintenance is a significant benefit. If the DevOps team has to pause in the middle of an application's dev cycle to ask their cloud vendor for more virtual machines (VMs) or compute nodes to ensure an app will run properly once deployed, that's time that could have been spent writing code. By removing those interruptions, agile teams get from dev to ops faster and shorten time to market.

With a serverless architecture, developers also won't have to handle any of the other frustrations that can come with cloud infrastructure oversight, such as container management. Although the serverless approach to cloud app development does involve containers for app functions, these containers are stateless, automated, and managed by the cloud provider. As such, they don't need container orchestration—though some may opt to use open-source tools to make a container orchestration platform compatible with serverless architecture.

2. Infinite scalability

All the serverless computing platforms offered by the three biggest public cloud providers feature automated scaling capabilities. This scalability gives developers virtually unlimited flexibility to craft applications vital to the enterprise without ever concerning themselves with partitioning servers or allocating additional compute resources. As traffic ebbs and flows, the serverless architecture automatically creates or eliminates function instances to ensure that app operations are always in sync with demand.
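Scaling itself requires no code, but most platforms let you put guardrails on it. As one hedged example, the sketch below uses boto3, AWS's Python SDK, to cap how far a single Lambda function can scale; the function name is a placeholder, and it assumes credentials are already configured.

```python
import boto3

# Sketch: scaling is automatic, but AWS Lambda (as one example) lets you
# cap it. Assumes boto3 credentials are set up and that a function named
# "report-generator" already exists.
lambda_client = boto3.client("lambda")

# Cap the function at 50 concurrent instances so a traffic spike
# can't consume the whole account-level concurrency pool.
lambda_client.put_function_concurrency(
    FunctionName="report-generator",
    ReservedConcurrentExecutions=50,
)
```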

3. Cost control

Most cloud service providers charge strictly based on use: for storage space, VMs, server allocation, compute power, and so on. But in traditional cloud computing, there are still situations in which servers or VMs keep running, and keep accruing charges, even when they aren't in use. Apps that run 24/7 and constantly call functions will drive costs up in a hurry.

By contrast, with a serverless architecture like FaaS, apps only run when their functions have been specifically triggered. This helps keep overall cloud costs down. Code can be verified and tuned to keep function calls at a minimum, which can improve cost efficiency and get rid of undesirable loops that can quickly add to the budget.
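As a rough illustration of how pay-per-invocation billing adds up, the sketch below estimates a monthly bill from request volume, average duration, and memory size. The rates are stand-in assumptions, not any provider's current prices, so substitute figures from your own vendor's pricing page.

```python
# Back-of-the-envelope cost sketch for a pay-per-invocation model.
# The rates below are illustrative assumptions, not current list prices.
requests_per_month = 3_000_000
avg_duration_s = 0.25          # average execution time per invocation
memory_gb = 0.5                # memory allocated to the function

price_per_million_requests = 0.20    # assumed request charge (USD)
price_per_gb_second = 0.0000167      # assumed compute charge (USD)

gb_seconds = requests_per_month * avg_duration_s * memory_gb
monthly_cost = (
    requests_per_month / 1_000_000 * price_per_million_requests
    + gb_seconds * price_per_gb_second
)
print(f"Estimated monthly cost: ${monthly_cost:.2f}")
```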

4. Greater productivity

A developer who does not have to worry about provisioning, server management, or any other aspect of infrastructure oversight will have more time to work on each new serverless application in their repertoire. This helps ensure that developers' apps are as powerful, effective, and agile as possible, giving them the freedom to flex their creative muscle. In the long run, this increased productivity can benefit the enterprise's overall growth, and boost customer satisfaction by speeding up production cycles.

4 possible challenges of serverless computing

As is the case with most technologies, serverless architectures have some hurdles to overcome, although none of these should be considered deal-breakers. The following are the most common challenges of a serverless platform:

1. Cold starts and latency

Reduced latency is sometimes cited as a potential advantage of serverless architecture, because code functions can run in any of the cloud provider's data centers rather than being tied to a specific origin server. However, serverless technologies also introduce a unique source of latency of their own: the cold start.

Cold starts occur when app functions are triggered for the first time, or after they've gone unused for a significant period. Because the FaaS platform must spin up a fresh instance to serve that initial request, the execution takes longer than a warm invocation. A single cold start may only add a few seconds, but if cold starts happen often enough they can adversely affect operations and the end-user experience, especially for operational or tactical applications whose service-level agreements set strict expectations.
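One common way to soften the penalty is to do expensive setup once, outside the request handler, so only the first invocation on a new instance pays for it. The sketch below assumes a Python FaaS runtime that keeps module-level state alive between warm invocations, which is how the major providers' runtimes generally behave; load_model() is a hypothetical stand-in for any slow initialization.

```python
import time

# Sketch of a common cold-start mitigation: do expensive setup once at
# module load (the cold start), then reuse it across warm invocations.
def load_model():
    time.sleep(2)  # stand-in for slow startup work such as opening connections
    return {"loaded_at": time.time()}

MODEL = load_model()  # runs once per instance, during the cold start

def handler(event, context):
    # Warm invocations skip load_model() entirely and reuse MODEL,
    # so only the first request on a new instance pays the penalty.
    return {"statusCode": 200, "body": f"model ready since {MODEL['loaded_at']}"}
```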

2. Security

Apps running on serverless platforms aren't inherently unsafe. But there will always be some risk, because cloud providers often serve several different customers at once from the same server. To minimize this risk, providers must make sure that shared servers are configured to securely isolate app function traffic from multiple enterprises, and there isn't always a guarantee that this will happen.

Additionally, if certain cyberattacks, such as distributed denial-of-service (DDoS) campaigns or function event-data injections, penetrate your cloud provider's defenses, the potential damage can be massive even though it wasn't your enterprise's security that faltered. And even when there's no real damage or data loss, functions executed with no business purpose still add to the final bill.
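You can't control the provider's side of the shared-responsibility line, but you can harden what your functions accept. The sketch below assumes an HTTP-style trigger and shows one way to screen event data before any business logic runs; the field names and allowed actions are illustrative assumptions.

```python
import json

# Sketch of defensive handling of function event data. Validating and
# whitelisting fields up front limits what an event-data injection attempt
# can reach, and rejecting bad input early also avoids paying for
# executions with no business purpose.
ALLOWED_FIELDS = {"customer_id", "action"}
ALLOWED_ACTIONS = {"activate", "deactivate"}

def handler(event, context):
    try:
        payload = json.loads(event.get("body") or "")
    except (json.JSONDecodeError, TypeError):
        return {"statusCode": 400, "body": "malformed request"}

    if (not isinstance(payload, dict)
            or set(payload) - ALLOWED_FIELDS
            or payload.get("action") not in ALLOWED_ACTIONS):
        return {"statusCode": 400, "body": "unexpected fields or action"}

    # ...business logic runs only after the event data has been screened...
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```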

3. Loss of control

Adopting a serverless technology framework for your cloud operations requires relying on your cloud provider to manage all infrastructure tasks. Even if you wanted to change something yourself, you wouldn't be able to. You likely knew this going in—but you may not have fully understood that the provider also has full control over the software stack that serves as the foundation for your apps. You cannot configure parameters as you please or have full visibility into app performance.

The infrastructure for serverless architecture may consist of many different server types, so there's no guarantee that the same app functions will perform identically every time, even after scaling up. Also, developers must note that the underlying infrastructure may not stay the same when moving from development to testing to production, as the provider assigns those resources on demand.
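Since you can't pin the underlying resources, a practical habit is to keep anything environment-specific out of the code itself. A minimal sketch, assuming the deployment injects environment variables (the names here are made up for illustration):

```python
import os

# Sketch: because the provider assigns resources on demand and environments
# can differ between development, testing, and production, keep
# environment-specific values in configuration rather than code.
STAGE = os.environ.get("STAGE", "dev")
TABLE_NAME = os.environ.get("TABLE_NAME", f"orders-{STAGE}")

def handler(event, context):
    # The function body stays identical across stages; only the injected
    # configuration changes, which makes differences easier to reason about.
    return {"statusCode": 200, "body": f"stage={STAGE}, table={TABLE_NAME}"}
```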

4. Vendor lock-in

Serverless architectures are one of the most common settings in which vendor lock-in can occur. You will rely very heavily on the specific complementary services your cloud provider offers, especially APIs. Mixing and matching elements from different vendors is not always simple, and migration to another provider will be difficult and expensive.
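One mitigation is structural: keep business logic provider-neutral and confine vendor-specific pieces, such as the event format, to a thin adapter. A sketch of that split, assuming an AWS-style HTTP event on the adapter side:

```python
import json

# Provider-agnostic core logic; no cloud SDK imports here, so it can move
# between vendors (or run locally) without changes.
def register_customer(customer_id: str, plan: str) -> dict:
    return {"customer_id": customer_id, "plan": plan, "status": "registered"}

# Thin adapter that translates the provider's event shape into plain
# arguments; only this layer would need rewriting on another platform.
def aws_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    result = register_customer(body.get("customer_id", ""), body.get("plan", "free"))
    return {"statusCode": 200, "body": json.dumps(result)}
```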

Moving toward a serverless architecture for analytics

Serverless architecture can provide a strong foundation for data analytics even in the largest, most complex enterprise cloud ecosystems. Data query and analysis functions can be made incredibly agile in a serverless context, especially for innovation hubs and for exploring data of unknown value. Analytics workloads that spike suddenly or run ad hoc can also benefit from this type of processing. Model training is another good example, since the infrastructure for the processing engine may only need to be provisioned once a month for a few hours.
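As one hedged example of the ad hoc pattern, the sketch below kicks off a pay-per-query job with Amazon Athena through boto3; the database, table, and output bucket are placeholders, and a comparable on-demand pattern can sit in front of other engines as well.

```python
import boto3

# Sketch of an ad hoc, pay-per-query analytics call using Amazon Athena.
# The database, table, and S3 bucket names are placeholders; assumes
# boto3 credentials and the underlying data already exist.
athena = boto3.client("athena")

response = athena.start_query_execution(
    QueryString="SELECT channel, COUNT(*) AS events FROM clickstream GROUP BY channel",
    QueryExecutionContext={"Database": "exploration_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/ad-hoc-results/"},
)
print("Query started:", response["QueryExecutionId"])
```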

With the assistance of Vantage, Teradata's agile, cloud-optimized data analytics engine, your serverless analytics operations can thrive. Vantage is provider-agnostic and compatible with numerous serverless tools from AWS, Azure and Google Cloud, allowing for robust data integration from all sources across the business.

Check out our blog to learn about Vantage's compatibility with AWS's serverless extract, transform, and load (ETL) platform.