Third-party APIs can streamline the time-consuming process of software or web development, saving considerable time and money, not to mention the effort and energy needed to build good software from scratch. It is then equally important to consider how best to integrate them into the software. Not every API is reliable; a poorly maintained API can adversely affect the layout and performance of the software or website. Among its many other responsibilities, a good integration should regularly test and monitor the API it depends on.
Designing and building an exceptional API integration also takes a lot of time and effort, and developers conscientiously focus on the finer details of operations, their parameters, and specific attributes. In doing so, they can inadvertently overlook other factors that considerably affect the efficiency, stability, and even future maintenance of the integration. Giving diligent attention to these equally important factors can make a big difference in the overall efficacy of the integration, so developers should be proactive and inquisitive enough to learn the tips and tricks that help an integration perform to its maximum potential. The knowledge and resources exist to build all the best practices into an integration, benefiting both those currently designing one and those maintaining an existing one.
Detailed below are some recommended best practices for ensuring the optimal performance of a Smartsheet API integration. The list is by no means exhaustive, and new practices are added regularly.
Improving Efficiency by Adopting Bulk Operations
The recommended practice for achieving maximum efficiency is to use bulk-enabled API operations whenever possible. A bulk-enabled operation lets one complete a task such as adding, updating, or deleting multiple items through a single API request. This has clear advantages over single-object operations, where each request can act on only one object at a time: with a bulk-enabled operation, a single request can act on many objects, for example updating 15 rows of a sheet at once. The most significant benefit is improved efficiency, as the integration makes a substantially smaller number of outbound calls.
At present, the following tasks can be completed in bulk:
- Adding columns: This request inserts one or more columns into the specified sheet. The Result object matches the request, in this case an array of Column objects.
- Rows: Adding rows, copying rows to a different sheet, deleting rows, moving rows to a different sheet, sending rows by email, and updating rows. Each of these requests can be executed against a single row or multiple rows.
- Adding and removing favorites: A popular feature that lets users 'star' sheets, reports, folders, and other objects to label them as favorites. The operations for adding and removing those labels support both single and bulk semantics.
- Adding group members: Available only to system and group administrators, this operation adds members to a group and supports bulk semantics.
- Sharing reports and sheets: This operation shares reports and sheets with specified users or groups, and also supports bulk semantics.
The above list is by no means conclusive; more bulk-enabled tasks could be added in the future, broadly increasing their usefulness. In short, using bulk-enabled API operations effectively improves overall efficiency.
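To make the efficiency gain concrete, here is a minimal Python sketch that combines many row updates into the body of a single bulk request instead of issuing one request per row. The row and column ids are made up for illustration; the row/cells shape follows the JSON the Smartsheet API documents for rows.

```python
import json

def build_bulk_row_update(updates):
    """Build the JSON body for one bulk 'update rows' request.

    updates maps a row id to {column id: new value}; the result is the
    array of row objects a single PUT request can carry."""
    return [
        {"id": row_id,
         "cells": [{"columnId": col_id, "value": value}
                   for col_id, value in cells.items()]}
        for row_id, cells in updates.items()
    ]

# Fifteen single-object calls collapse into one request body:
updates = {100 + i: {7001: "Done"} for i in range(15)}
body = build_bulk_row_update(updates)
print(len(body))            # 15 rows, one outbound call
print(json.dumps(body[0]))
```

The integration then sends this array in one request rather than making fifteen separate calls, which is exactly where the reduction in outbound traffic comes from.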
Support for optional bulk operations
Many endpoints support optional bulk operation alongside the basic single-object operation. Such endpoints accept either a single object or an array of objects, and the Result object returns either a single object or an array of objects, depending on the nature of the request. A good example of an optional bulk-enabled operation is updating rows: one can pass a single row object to update one row, or an array of row objects to update many rows in one request.
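Inside the integration, this optional bulk semantics can be mirrored by normalizing the input, so the same code path handles both shapes. A small sketch (the function name is illustrative):

```python
def normalize_to_list(payload):
    """Endpoints with optional bulk semantics accept a single object or an
    array; normalizing to a list lets the rest of the code treat both
    shapes uniformly."""
    return payload if isinstance(payload, list) else [payload]

single = {"id": 101, "cells": []}
many = [{"id": 101, "cells": []}, {"id": 102, "cells": []}]

wrapped = normalize_to_list(single)
print(len(wrapped))                  # 1: a single object becomes a one-element array
print(len(normalize_to_list(many)))  # 2: an array passes through unchanged
```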
A notable advantage of the optional bulk operation over a single request can be illustrated as follows: when one attempts to create a single object that already exists, the operation typically fails. A bulk operation, however, does not return an error when one or more objects in the array already exist; in this case, the existing items are simply ignored and the Result object bypasses them.
Setting up for partial success
By default, a bulk operation fails outright if even a single object in the request is invalid for any reason. Smartsheet either adds, updates, copies, or deletes all objects in a single request, or it skips execution altogether and changes none of them. In other words, when an error is thrown in the middle of a bulk operation, the default response is failure of the entire operation.
Provisioning for partial success
In this case, provisioning for partial success makes sense: the operation still executes even when one or more of the requested objects is invalid for some reason. To illustrate the point with an example: if one makes a request to send rows by email and one of the selected row objects is invalid, the entire operation is deemed invalid and none of the rows are sent. If, however, partial success is allowed, the valid portions of the operation are carried out despite the errors.
The response also informs the user about the end status of the operation; one only has to enable partial success.
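In practice, partial success is requested per call via the allowPartialSuccess query parameter on bulk endpoints, and a partial-success response identifies the items that failed. A sketch; the sheet id and the response body shown are illustrative, and the exact response fields should be checked against the Smartsheet API documentation:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.smartsheet.com/2.0"

def add_rows_url(sheet_id, allow_partial_success=True):
    """URL for a bulk 'add rows' request with partial success enabled."""
    query = urlencode({"allowPartialSuccess": str(allow_partial_success).lower()})
    return f"{BASE_URL}/sheets/{sheet_id}/rows?{query}"

# Illustrative shape of a partial-success response: the valid rows were
# added, and failedItems identifies the ones that were skipped.
response_body = {
    "message": "PARTIAL_SUCCESS",
    "resultCode": 3,
    "failedItems": [{"rowId": 103, "error": {"message": "invalid row"}}],
}

failed_rows = [item["rowId"] for item in response_body.get("failedItems", [])]
print(add_rows_url(4583173393803140))
print(failed_rows)  # [103]
```

The integration can then report or retry just the failed items instead of repeating the whole batch.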
Image Source: smartsheet.com
Paging through index endpoints
The Smartsheet API incorporates index endpoints designed to return an array of objects; they are easily identifiable in the documentation by titles such as "List" or "Get All". These endpoints can retrieve results in paged subsets, which means large result sets can be processed and accessed in manageable chunks. This operation is aptly named pagination.
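A minimal paginator sketch: fetch_page stands in for a real index call such as List Sheets, and the result fields used here (data, totalPages) follow the paged IndexResult shape the documentation describes.

```python
def iterate_all(fetch_page, page_size=100):
    """Walk an index endpoint page by page, yielding every object."""
    page = 1
    while True:
        result = fetch_page(page=page, page_size=page_size)
        yield from result["data"]
        if page >= result["totalPages"]:
            break
        page += 1

# Stand-in for a real index call, returning two pages of results.
def fake_list_sheets(page, page_size):
    pages = {1: ["sheet-a", "sheet-b"], 2: ["sheet-c"]}
    return {"data": pages[page], "totalPages": 2}

print(list(iterate_all(fake_list_sheets)))  # ['sheet-a', 'sheet-b', 'sheet-c']
```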
Conforming to the Rate Limit
The Smartsheet API presently enforces a rate limit of 300 requests per minute per access token, and every user has to comply with the Smartsheet rate-limiting guidelines. This feature, often called 'throttling', was incorporated to prevent misuse from putting unwarranted strain on the Smartsheet servers. Intensive operations, mainly those involving bulk-enabled API calls, tend to hit the stipulated rate limit; certain tasks, such as moving or copying rows to a different sheet, are resource-intensive and may count as multiple requests against the limit.
The rate limit is strictly enforced. If one surpasses it, subsequent API requests made within the same 60-second period are rejected outright with a 429 HTTP status code and a JSON response body containing errorCode 4003 and the message "Rate limit exceeded."
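A rate-limited response is therefore easy to recognize from the status code and error code together; a small sketch:

```python
def is_rate_limited(status_code, body):
    """True when a response is the documented rate-limit rejection:
    HTTP 429 with errorCode 4003 in the JSON body."""
    return status_code == 429 and body.get("errorCode") == 4003

print(is_rate_limited(429, {"errorCode": 4003,
                            "message": "Rate limit exceeded"}))  # True
print(is_rate_limited(200, {"resultCode": 0}))                   # False
```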
Smartsheet suggests designing the integration to work gracefully within the API rate limit. The most effective strategies are the following:
- One-minute sleep breaks: One suggestion is to let the integration sleep for one minute when the rate-limit error occurs, and only then reattempt the operation. This calls for patience, the reward being better throughput without straining the servers.
- Exponential backoff: The other option is to implement an exponential backoff strategy, retrying a failed request after progressively longer wait periods. Either the request goes through early, or it executes after a number of retries.
- Refraining from rapid-fire updates: To conform to the rate limit, it is highly recommended to refrain from making multiple API requests in short intervals to update the same task or Smartsheet object. Repeating the same operation many times within a one-minute period might technically comply with the rate limit, but such rapid successive updates will almost certainly generate save errors, which adversely affect the integration and the user experience.
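The exponential backoff strategy above can be sketched as follows; request_fn stands in for any callable that issues one API request and returns a (status, body) pair.

```python
import random
import time

def call_with_backoff(request_fn, max_retries=5, base_delay=1.0):
    """Retry a rate-limited call after progressively longer waits
    (base_delay * 1, 2, 4, ... seconds, plus a little jitter)."""
    for attempt in range(max_retries):
        status, body = request_fn()
        if status != 429:
            return status, body
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))
    return request_fn()  # one final attempt after all the waits

# Demo: the fake request fails twice with 429, then succeeds.
attempts = {"count": 0}
def flaky_request():
    attempts["count"] += 1
    if attempts["count"] < 3:
        return 429, {"errorCode": 4003}
    return 200, {"ok": True}

status, body = call_with_backoff(flaky_request, base_delay=0.01)
print(status)  # 200
```

The jitter term spreads retries out so that many clients hitting the limit at once do not all retry in lockstep.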
Design the integration intelligently
Instead of manually refraining from making multiple API requests in quick succession, an intelligent feature can be designed into the integration so that rapid successive requests are never issued against the same Smartsheet object. Better still, for maximum efficacy, one can aggregate the changes and submit them in a single request using a bulk-enabled operation.
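One way to implement this aggregation is a small buffer that collects individual cell changes and flushes them as a single bulk update body. The class and id values here are illustrative, not part of any official SDK:

```python
class RowUpdateBuffer:
    """Collect cell changes and emit them as one bulk 'update rows' body
    instead of firing a request per change."""

    def __init__(self):
        self._pending = {}

    def set_value(self, row_id, column_id, value):
        # Later writes to the same cell overwrite earlier ones.
        self._pending.setdefault(row_id, {})[column_id] = value

    def flush(self):
        body = [{"id": row_id,
                 "cells": [{"columnId": c, "value": v}
                           for c, v in cells.items()]}
                for row_id, cells in self._pending.items()]
        self._pending.clear()
        return body

buffer = RowUpdateBuffer()
buffer.set_value(101, 7001, "Done")
buffer.set_value(101, 7002, "2024-01-31")
buffer.set_value(102, 7001, "Blocked")
body = buffer.flush()
print(len(body))  # 2 rows in one request, despite 3 separate changes
```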
Execute requests serially, not in parallel
It is highly recommended to design the integration so that it executes multiple requests serially: only one request is in flight at any given time, and the next request does not start until the previous one completes. When several requests execute in parallel to update the same Smartsheet object, performance diminishes and the chance of errors increases substantially owing to save collisions.
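A sketch of one way to enforce this: serialize requests per target object with a lock, so two calls against the same sheet can never overlap. The class and object ids are illustrative.

```python
import threading

class SerialDispatcher:
    """Run requests aimed at the same object strictly one after another."""

    def __init__(self):
        self._registry_lock = threading.Lock()
        self._object_locks = {}

    def _lock_for(self, object_id):
        # Guard lock creation so concurrent first calls share one lock.
        with self._registry_lock:
            return self._object_locks.setdefault(object_id, threading.Lock())

    def submit(self, object_id, request_fn):
        # The request starts only after any in-flight request on the
        # same object has completed.
        with self._lock_for(object_id):
            return request_fn()

dispatcher = SerialDispatcher()
results = [dispatcher.submit("sheet-1", lambda i=i: f"update-{i}")
           for i in range(3)]
print(results)  # ['update-0', 'update-1', 'update-2']
```

Requests for different objects still proceed independently; only calls targeting the same object are forced into sequence.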
Image Source: smartsheet.com
The Capability to Appropriately Handle Errors
An executed API request usually returns a 200 HTTP status code along with the expected data in the response body. Sometimes, however, things do not go as planned and one gets back errors and an undesirable response. This is where the API integration's error handling is put to the test. The aim should be to design the integration so that it understands the difference between potentially resolvable errors, which can be fixed by retrying the request, and unresolvable errors, which should be permanently retired.
The API documentation details a recommendation for every Smartsheet error code. That information can be used to perform specific error handling along the following logic:
(1) If the error code indicates a permanent error, do not reattempt the request.
(2) If the error code indicates a problem that can be fixed, fix the problem before reattempting the request.
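That logic can be captured in a small classification table. The set of retryable codes below is illustrative; the authoritative mapping should be taken from the error-code tables in the Smartsheet API documentation.

```python
# Illustrative classification; consult the documentation's error-code
# tables for the authoritative retry recommendations.
RETRYABLE_CODES = {4001, 4002, 4003, 4004}  # service unavailable / rate limited

def next_action(error_code):
    """'retry' for transient errors, 'fail' for permanent ones."""
    return "retry" if error_code in RETRYABLE_CODES else "fail"

print(next_action(4003))  # 'retry' -- rate limit exceeded, safe to reattempt
print(next_action(1006))  # 'fail'  -- e.g. a not-found error; retrying cannot help
```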
Image Source: nibblessoftware.com
The Capability of the Integration to Log API Requests and Responses
It is unreasonable to expect an integration to function consistently without any issues every working day. Issues can and will arise, and it is important that the integration logs its API requests and responses. Access to the requests, responses, error codes, and messages helps organize troubleshooting and speeds up resolution time. When setting up the integration, one should therefore enable API request and response logging.
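A minimal logging wrapper sketch using Python's standard logging module; send_fn stands in for a real HTTP client call that returns a (status, body) pair.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("smartsheet.integration")

def logged_call(method, url, send_fn, **kwargs):
    """Log every outbound request and its response before returning it."""
    log.info("request: %s %s", method, url)
    status, body = send_fn(method, url, **kwargs)
    log.info("response: %s %s", status, body)
    return status, body

# Demo with a fake sender instead of a live HTTP client.
def fake_send(method, url, **kwargs):
    return 200, {"resultCode": 0}

status, body = logged_call("GET", "https://api.smartsheet.com/2.0/sheets",
                           fake_send)
print(status)  # 200
```

In a real integration the same wrapper would also record error codes and messages, giving a trail to consult when troubleshooting.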
The practices detailed above can be used effectively to design and build a robust integration with Smartsheet. Even for those with an existing integration, now is the right time to take these best practices into account and build them into their software.