The ability to interoperate and communicate with other software is fundamental to the value a cloud system can offer, which is why most providers, including us, offer APIs. These allow different systems to talk to each other using a common language.
There is normally some means of controlling the number of data transfers, otherwise a small (or indeed large) number of customers could accidentally swamp the system with a massive number of requests, negatively affecting themselves and everyone else. Here’s an example of how one service applies bandwidth management to avoid this:
With the agileBase platform, communicating with third-party software is really easy – an administrator can enable a custom API for any view with one click, and tools such as Zapier then let you send and receive data from common cloud software tools with no need for programming.
That’s led to an increase in the number of customers using the API and in what they’re using it for. We love seeing that, but it also means we now need to think about protecting customers against misconfigurations in third-party systems that make them ‘overly chatty’. We need a way to smooth out demand.
For example, we’ve had one or two situations where the system has been flooded with requests from a third party due to a bug in their system, which repeatedly requested the same information in a continuous loop. Say a request takes on average 1 second to process and the system receives 2 requests per second. The backlog then grows by one unprocessed request every second, so the number of requests being handled at one time quickly spins out of control!
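To make that arithmetic concrete, here’s a minimal sketch (using the hypothetical numbers above: 1 second per request, 2 requests arriving per second) of how the backlog grows when arrivals outpace processing:

```python
# Hypothetical figures from the example above.
PROCESS_TIME = 1.0   # seconds to handle one request
ARRIVAL_RATE = 2     # requests arriving per second

def backlog_after(seconds: int) -> int:
    """Requests still waiting after `seconds`, with a single worker
    draining the queue at one request per PROCESS_TIME."""
    arrived = ARRIVAL_RATE * seconds
    processed = int(seconds / PROCESS_TIME)  # at most one per second
    return arrived - processed

for t in (1, 10, 60):
    print(f"after {t}s: {backlog_after(t)} requests waiting")
# The backlog grows linearly and never recovers on its own.
```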
When that happens, people using the data being requested will of course start to experience a slowdown and lack of responsiveness. If it goes on for long enough, all other customers and users will also be affected. As a customer, you obviously don’t want your service to suffer due to events unrelated to your use and outside your control. We’ve often been praised by users for the responsiveness and speed of the agileBase platform and we’re keen to keep it that way!
Smoothing data flows
To help avoid this possibility, we now employ a couple of simple bandwidth management measures.
- Queuing – API requests are queued for each customer, so only one request is processed at a time. This prevents a burst of requests all being processed at the same moment, which can happen just by chance when they come from different sources.
Each customer has their own queue, so one customer’s requests won’t affect another’s.
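A per-customer queue like this can be sketched roughly as follows. This is an illustrative implementation, not agileBase’s actual code – the class and method names are invented for the example – but it shows the key property: one FIFO queue and one worker per customer, so requests are processed one at a time and one customer’s traffic can’t hold up another’s:

```python
import queue
import threading
from collections import defaultdict

class PerCustomerQueue:
    """Illustrative sketch: serialise each customer's API requests
    through their own FIFO queue, drained by a dedicated worker."""

    def __init__(self):
        self._queues = defaultdict(queue.Queue)
        self._workers = {}
        self._lock = threading.Lock()

    def submit(self, customer_id, request):
        """Enqueue a request (any callable) for one customer."""
        with self._lock:
            q = self._queues[customer_id]
            if customer_id not in self._workers:
                worker = threading.Thread(
                    target=self._drain, args=(customer_id,), daemon=True
                )
                self._workers[customer_id] = worker
                worker.start()
        q.put(request)

    def _drain(self, customer_id):
        q = self._queues[customer_id]
        while True:
            request = q.get()
            request()        # process exactly one request at a time
            q.task_done()

    def wait(self, customer_id):
        """Block until all of this customer's queued requests are done."""
        self._queues[customer_id].join()
```

Because each customer gets an independent queue and worker, a flood from one misbehaving integration only lengthens that customer’s own queue.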
- Traffic shaping – after a request for a large amount of data, the system will pause for a short time before processing further requests for that customer. The pause is proportional to the number of records requested: 1,000 records cause a delay of 0.5 seconds, topping out at 5 seconds for 10,000 or more records.
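The shaping rule above can be expressed in a couple of lines. This sketch assumes the delay scales linearly between the two figures given (0.5 s per 1,000 records, capped at 5 s):

```python
def shaping_delay(records: int) -> float:
    """Seconds to pause before processing the customer's next request,
    assuming a linear scale: 0.5s per 1,000 records, capped at 5s."""
    return min(records * 0.5 / 1000, 5.0)

print(shaping_delay(1_000))   # small request -> short pause
print(shaping_delay(10_000))  # at or above 10,000 records -> maximum pause
```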
Additionally, for particularly intensive API calls which may have a detrimental effect on other users if used too frequently, we have the option, in consultation with customers, of rate limiting. That means setting a maximum frequency for calls, e.g. once every minute, 10 minutes or hour. In fact, if you like, you can set this yourself, to protect your systems from being overloaded by third-party apps. The option is available in the API settings ‘sync’ screen under the manage tab for a view.
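A rate limit of this kind – a minimum interval between accepted calls – can be sketched as below. The class and names are hypothetical, chosen for illustration; e.g. a limit of once per minute becomes a 60-second minimum interval:

```python
import time

class RateLimiter:
    """Illustrative sketch: accept a call only if at least
    `min_interval` seconds have passed since the last accepted call."""

    def __init__(self, min_interval: float):
        self.min_interval = min_interval
        self._last_accepted = None

    def allow(self, now=None) -> bool:
        """Return True and record the call if it's within the limit;
        `now` can be injected for testing, else a monotonic clock is used."""
        now = time.monotonic() if now is None else now
        if self._last_accepted is None or now - self._last_accepted >= self.min_interval:
            self._last_accepted = now
            return True
        return False

# 'Once every minute' expressed as a 60-second minimum interval:
limiter = RateLimiter(min_interval=60)
```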
With these measures in place, we’re confident that businesses can take advantage of the possibilities opened up by allowing easy communication with other cloud software, without needing to worry about whether there may be any detrimental performance effects. We encourage people to look into the possibilities of setting up syncs like these examples from customers:
- exporting data to Power BI to allow consolidated reporting from many sources
- transferring invoices and other financial data to an accounting package like Xero
- importing enquiries and orders from websites