Three tips to minimize development hassles in your cloud hybridization journey...
Given the ecosystem complexities of hybrid cloud deployments, API management can be tough.
Speaking at the Microsoft Inspire partner conference in 2017, Microsoft CEO Satya Nadella made an interesting observation: in the fight between public and private cloud, it is the hybrid cloud that has emerged the winner. Nadella's statement is borne out by the rise of hybrid cloud within the overall cloud market. According to MarketsandMarkets, the global hybrid cloud market is forecast to reach USD 91.74 billion by 2021, growing at a CAGR of 22.5%. Compare this with overall cloud computing revenues of USD 162 billion in 2020 (up from USD 67 billion in 2015).
Highly sought after by CIOs, the hybrid model offers organizations the flexibility to use cost-effective resources and innovative features from public cloud vendors as and when required, while retaining the tight management controls that their private cloud deployments assure. It can, however, be a challenge for the back-end API developer, who has to ensure that users can move from one model to another seamlessly without compromising the organization’s security and compliance mandates. Let’s explore three API management good practices companies may follow to get the best out of their API programs for hybrid cloud implementations.
- Streamline the database access needs of individual components
A hybrid cloud set-up generally presents a complex ecosystem of applications, business processes, and infrastructure components. In such an environment, when multiple components spread across business and application process flows access the database directly, the performance of the whole system can suffer; moving components into the cloud degrades performance further. One aspect to consider here is the type of data involved: whether it is persistent or non-persistent.
The API management exercise should therefore begin with mapping the database access needs of the various application components against the linked business processes and application workflows, as well as user access levels. Ensuring that database access is carried out by one or only a few components may be a good approach to follow. Creating controlled interfaces through which public-cloud components access applications and data in your private cloud may also prove useful.
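The idea of a controlled interface can be sketched as a single gateway component that is the only party allowed to touch the private-cloud database; all component and table names below are hypothetical, and a real system would use actual connections and authentication rather than in-memory stand-ins.

```python
# Minimal sketch: one data-access component mediates all database reads,
# so individual application components never open their own connections.

class DataAccessGateway:
    """Sole component permitted to query the private-cloud database."""

    def __init__(self, allowed_components):
        # Whitelist of components (e.g. public-cloud services) that may
        # request data through this controlled interface.
        self.allowed_components = set(allowed_components)
        # Stand-in for a real database connection.
        self._db = {"orders": [{"id": 1, "total": 99.0}]}

    def fetch(self, component, table):
        if component not in self.allowed_components:
            raise PermissionError(f"{component} may not access the database")
        if table not in self._db:
            raise KeyError(f"unknown table: {table}")
        return list(self._db[table])

gateway = DataAccessGateway(allowed_components={"billing-ui"})
print(gateway.fetch("billing-ui", "orders"))  # [{'id': 1, 'total': 99.0}]
```

Funneling every query through one whitelist-enforcing component keeps the access map small enough to audit when parts of the system migrate between clouds.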
- Do not ignore security and support
In a hybrid cloud set-up, APIs work as the pipe through which users, internal as well as external, access enterprise resources. Safeguarding sensitive business information is therefore an important task APIs must perform. A combination of security provisions, including basic authentication via API keys, an advanced authorization model using OAuth 2.0, and JSON Web Tokens (JWT) along with encryption, may be considered a reasonable stack to secure the API framework of a hybrid cloud implementation.
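To make the JWT piece of that stack concrete, here is a minimal sketch of issuing and verifying an HS256-signed token using only the standard library; the secret and claims are illustrative, and a production system should use a maintained JWT library rather than hand-rolled code.

```python
# Illustrative HS256 JWT signing/verification (stdlib only, demo values).
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign(header: dict, payload: dict, secret: bytes) -> str:
    signing_input = (b64url(json.dumps(header).encode()) + b"." +
                     b64url(json.dumps(payload).encode()))
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return (signing_input + b"." + b64url(sig)).decode()

def verify(token: str, secret: bytes) -> dict:
    signing_input, _, sig = token.rpartition(".")
    expected = b64url(hmac.new(secret, signing_input.encode(),
                               hashlib.sha256).digest()).decode()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    payload_b64 = signing_input.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore b64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

secret = b"demo-secret"
token = sign({"alg": "HS256", "typ": "JWT"}, {"sub": "user-42"}, secret)
print(verify(token, secret))  # {'sub': 'user-42'}
```

The constant-time comparison (`hmac.compare_digest`) matters because a naive `==` check can leak timing information about the signature.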
In a hybrid cloud environment, requests flow continuously between the public cloud and the data center or private cloud. Therefore, in addition to the standard functions of visibility, availability, and monitoring, your APIs, acting as gatekeepers, also need to support features such as throttling. When designed with adequate provisioning for caching and fair use as the guiding principle, throttling can help private cloud administrators optimize performance while weighing resource requests against their business criticality. Allowing developers to request access to a predefined data set through a developer portal can also build efficiency into support operations.
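One common way a gateway implements throttling is a token bucket applied per client; the capacity and refill figures below are hypothetical tuning parameters, not recommendations.

```python
# Sketch of a token-bucket throttle an API gateway might apply per client.
import time

class TokenBucket:
    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller would typically respond with HTTP 429

bucket = TokenBucket(capacity=3, refill_per_sec=1.0)
print([bucket.allow() for _ in range(5)])  # first 3 True, then False
```

Burst tolerance comes from the bucket's capacity, while the refill rate caps the sustained request rate; both can be tuned per client tier to reflect business criticality.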
- Know your organization’s hybrid cloud priorities
Hybrid cloud has various usage-driven models. Some organizations build strong data center capabilities and a private cloud, relying on public cloud resources only for analytics, especially when big data is involved. This method is similar to the model traditionally followed in third-party business intelligence deployments, wherein data fetched from multiple enterprise source systems is fed via ETL into a data warehouse and then into a BI system; here, the public cloud takes the place of the BI system. The public cloud's access is thus limited to the EDW layer alone, and the enterprise source systems remain out of its reach.
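The data flow described above can be sketched in a few lines: ETL copies records out of the enterprise source systems into an EDW table, and the public-cloud analytics step is handed only that table, never the sources themselves. All system and field names here are hypothetical.

```python
# Sketch: public-cloud analytics sees only the warehouse (EDW) layer.

source_systems = {
    "crm": [{"customer": "acme", "region": "east"}],
    "erp": [{"customer": "acme", "spend": 1200}],
}

def etl_to_warehouse(sources: dict) -> list:
    """Flatten records from every source system into one EDW table."""
    warehouse = []
    for system, records in sources.items():
        for record in records:
            warehouse.append({"source": system, **record})
    return warehouse

def public_cloud_analytics(warehouse: list) -> int:
    # Receives only the EDW extract; holds no reference to source_systems.
    return len(warehouse)

edw = etl_to_warehouse(source_systems)
print(public_cloud_analytics(edw))  # 2
```

The boundary is enforced by construction: the analytics function's only input is the warehouse extract, mirroring how the public cloud is fenced off from the enterprise systems.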
Another practice is to limit the scope of the public cloud to spike-based provisioning, with the majority of provisioning handled by the private cloud. In this approach, popularly termed cloud bursting, the public cloud plays a purely augmentative role. It lets CIOs keep risks and costs under control while ensuring uninterrupted performance at all times.
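At its core, cloud bursting is a routing decision: serve from the private cloud until it nears capacity, then overflow to the public cloud. The threshold and capacity figures in this sketch are illustrative assumptions.

```python
# Sketch of a cloud-bursting routing decision (illustrative numbers).

def route_request(active_private_jobs: int,
                  private_capacity: int,
                  burst_threshold: float = 0.9) -> str:
    """Return which environment should serve the next request."""
    utilization = active_private_jobs / private_capacity
    return "public" if utilization >= burst_threshold else "private"

print(route_request(active_private_jobs=50, private_capacity=100))  # private
print(route_request(active_private_jobs=95, private_capacity=100))  # public
```

In practice the trigger would draw on live monitoring metrics rather than a simple job count, but the cost logic is the same: public-cloud capacity is paid for only when the spike actually arrives.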
Lastly, some organizations take a different view of public cloud adoption: they break their applications into back-end and front-end components, move the web-based front-end portions to the public cloud, and retain transaction data processing and analysis in the private cloud. An API developer should therefore thoroughly review the practices and risk-return priorities of the organization before designing a suitable API architecture.
The author is CEO & Co-founder, Postman