API best practices

Performance

  1. Don’t request any fields you don’t need - each additional field adds overhead to the request (see the first example after this list).
  2. Similarly, make your filters as specific as possible. Wherever possible, filter in the API query itself rather than pulling back extra results and parsing them afterwards.
  3. Exact match filters will perform better than partial match filters, e.g. using “is” will perform better than “contains”.
  4. You can create a query that gets values from linked entities, but the API then needs to perform a database join, which can impact performance. Weigh up whether it’s quicker to run two separate queries and combine the data yourself, or to run one query (see the second example after this list).
  5. If you run into serious performance problems where you don’t expect them, custom indexes can be added to your site for fields you query often that aren’t currently indexed. This is most valuable for queries that run very often against custom fields that have been added to your site and aren’t covered by the default indexes, e.g. if you have added a field in Shotgun that links to your internal pipeline tools.
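As a rough illustration of points 1-3, here is a minimal Python sketch using the Shotgun API (the site URL, script credentials, project id, and status value are placeholders):

    import shotgun_api3

    # Placeholder site URL and per-tool script credentials.
    sg = shotgun_api3.Shotgun("https://yourstudio.shotgunstudio.com",
                              script_name="status_report",
                              api_key="YOUR_API_KEY")

    # Filter in the query itself, use exact ("is") matches, and request
    # only the fields the tool actually needs.
    shots = sg.find(
        "Shot",
        filters=[
            ["project", "is", {"type": "Project", "id": 123}],
            ["sg_status_list", "is", "ip"],   # exact match beats "contains"
        ],
        fields=["code", "sg_status_list"],    # no unnecessary fields
    )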
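For point 4, the two approaches look roughly like this (again with placeholder names; the dotted “field hopping” syntax tells the API to follow the Task’s entity link to the Shot):

    # One query using a linked ("dotted") field - convenient, but it
    # requires a join on the server.
    tasks = sg.find(
        "Task",
        filters=[["project", "is", {"type": "Project", "id": 123}]],
        fields=["content", "entity", "entity.Shot.sg_status_list"],
    )

    # Alternative: two simpler queries, combined in your own code.
    tasks = sg.find(
        "Task",
        filters=[["project", "is", {"type": "Project", "id": 123}]],
        fields=["content", "entity"],
    )
    shot_ids = [t["entity"]["id"] for t in tasks
                if t["entity"] and t["entity"]["type"] == "Shot"]
    shots = sg.find(
        "Shot",
        filters=[["id", "in", shot_ids]],
        fields=["code", "sg_status_list"],
    )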

Control and debugging

  1. Use separate keys for scripts - a unique key for every tool. This is invaluable for debugging (see the first example after this list). Also make sure that every script has an owner/maintainer and that the information is up to date on your Scripts page (under the Admin menu).
  2. Consider creating three different permission groups for API Users - read, write, and read/write/delete.  Many scripts only need read access and this can limit your exposure to accidental changes.
  3. Track which keys are in use so that old scripts can be removed. Some studios build auditing into their API wrapper to make this easier.
  4. If you’re running into script errors, check the syntax against our documentation - syntax mistakes are the most common cause of errors.
  5. Also make sure to check entity names and fields. Shotgun has two names for each field - a display name that’s used in the UI (and isn’t necessarily unique) and an internal field name that’s used by the API. Because the display name can be changed at any point, you can’t reliably predict the field name from the display name. You can see field names by going to the Fields option in the Admin menu, or you can use the schema_read(), schema_field_read(), and schema_entity_read() methods, as described here: https://github.com/shotgunsoftware/python-api/wiki/Reference%3A-Methods (see the second example after this list).
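A minimal sketch of point 1 (the URLs, script names, and keys here are placeholders): each tool connects with its own Script entity and key, so its activity shows up separately in the event log and a key can be revoked without affecting other tools.

    import shotgun_api3

    # Each tool gets its own script name and key.
    dailies_sg = shotgun_api3.Shotgun("https://yourstudio.shotgunstudio.com",
                                      script_name="dailies_publisher",
                                      api_key="DAILIES_PUBLISHER_KEY")

    farm_sg = shotgun_api3.Shotgun("https://yourstudio.shotgunstudio.com",
                                   script_name="render_farm_status",
                                   api_key="RENDER_FARM_STATUS_KEY")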
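And for point 5, a quick way to map internal field names to display names (assuming a connection sg like the one above; Shot is just an example entity type):

    # List every field on Shot with its internal name and display name.
    for field_name, info in sg.schema_field_read("Shot").items():
        print(field_name, "->", info["name"]["value"])

    # Or inspect a single field's properties.
    print(sg.schema_field_read("Shot", field_name="sg_status_list"))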

Design

  1. For larger studios especially, consider having an API isolation layer - a wrapper. This isolates your tools from changes in the Shotgun API. It also means that you can control API access, manage debugging, and track auditing in a more fine-grained way (a minimal sketch follows this list). There’s a great example wrapper here: https://github.com/Nvizible/sg_wrapper
  2. Try to use the latest version of the API - it’ll contain bug fixes and performance improvements.
  3. Be aware of where the script will be run from. It can be useful to have one person check that Shotgun scripts will be used sensibly (e.g. not in a context where they’re being started thousands of times per second) before new scripts are created.
  4. Scripts that operate on large data sets should be run during off-hours, especially on high-traffic sites.
  5. You can turn off event generation for scripts. This is most useful for scripts that run very often and whose events you won’t need to track later. For scripts that run extremely often, this is highly recommended, as the event log can otherwise become very large. A large event log won’t impact the overall performance of your Shotgun server, but it can impact performance when viewing or querying event histories, and it will take up disk space.
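As a rough sketch of the isolation layer from point 1 (the class and logger names are made up for illustration, and only find()/find_one() are shown - a real wrapper would also cover create, update, and so on):

    import logging
    import shotgun_api3

    log = logging.getLogger("sg_wrapper")

    class StudioShotgun(object):
        """Thin wrapper so tools never talk to shotgun_api3 directly."""

        def __init__(self, script_name, api_key,
                     url="https://yourstudio.shotgunstudio.com"):
            self._script_name = script_name
            self._sg = shotgun_api3.Shotgun(url, script_name=script_name,
                                            api_key=api_key)

        def find(self, entity_type, filters, fields):
            # One central place to log and audit every query from every tool.
            log.info("%s: find %s %s", self._script_name, entity_type, filters)
            return self._sg.find(entity_type, filters, fields)

        def find_one(self, entity_type, filters, fields):
            log.info("%s: find_one %s %s", self._script_name, entity_type, filters)
            return self._sg.find_one(entity_type, filters, fields)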