Creating a Jobber Data Dashboard

If you’re using Jobber as your field service management system, you may have run into the limitations of its reporting. Maybe you’ve tried downloading your data to build your own reports in Google Sheets, or considered switching to a different platform with better reporting.

Our client Graffitico approached us for a better solution to this problem. They had built sophisticated personnel reports in Google Sheets, but keeping them current meant manually downloading reports from Jobber every day and updating the sheets by hand.

Thankfully, Jobber has an open GraphQL API. This means that a developer can pull the data you need from the platform automatically so you don’t have to rely on downloading reports. 

This opens up the possibility of moving past Google Sheets: you can create dynamic dashboards to display your data visually and connect it to other systems.
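For instance, a minimal pull against the API might look like the sketch below (Python with requests). The access token comes from Jobber’s developer center; the version header value and the exact field names are assumptions to verify against Jobber’s current docs.

```python
import requests

# Jobber's GraphQL endpoint; the version header value is an assumption --
# check Jobber's developer docs for the current version date.
JOBBER_URL = "https://api.getjobber.com/api/graphql"
HEADERS = {
    "Authorization": "Bearer YOUR_ACCESS_TOKEN",  # OAuth token from Jobber's developer center
    "X-JOBBER-GRAPHQL-VERSION": "2023-11-15",     # assumed version date
}

# Field names are illustrative; confirm them against Jobber's GraphQL schema.
QUERY = """
query {
  invoices(first: 10) {
    nodes {
      id
      invoiceNumber
      total
    }
  }
}
"""

resp = requests.post(JOBBER_URL, json={"query": QUERY}, headers=HEADERS)
resp.raise_for_status()
print(resp.json()["data"]["invoices"]["nodes"])
```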

Challenges to overcome with Jobber’s API

While Jobber has a robust API, it has some surprising limitations. Certain data that you would expect to be available, such as the date a quote transitioned between statuses, can’t be pulled from the API even though it appears in Jobber’s built-in reports.

There are other things you’d expect to be able to do with the API that simply aren’t supported. 

Also, although Jobber offers webhooks, its webhook functionality is not optimized for scale. Jobber frequently sends two webhooks in the same second for a single update to an object, creating race conditions that can corrupt your data. To combat this, we used Hookdeck to receive the webhooks and queue them, delivering one to our automation system every 4 seconds so it has time to process each event individually.
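Hookdeck handled this for us, but the core idea is simple to sketch. Below is a minimal, hypothetical receiver that drops the duplicate event Jobber fires in the same second; the payload shape is an assumption, so inspect a real delivery before relying on it.

```python
from flask import Flask, request, jsonify
import time

app = Flask(__name__)

# Remember recently processed (topic, item_id) pairs so the duplicate webhook
# Jobber often fires in the same second gets dropped. (Unbounded dict is fine
# for a sketch; use a TTL cache in production.)
recently_seen = {}   # (topic, item_id) -> unix timestamp of last processing
DEDUPE_WINDOW = 5    # seconds; tune to taste

@app.post("/webhooks/jobber")
def jobber_webhook():
    # Payload shape is an assumption -- inspect a real delivery to confirm.
    event = request.get_json()["data"]["webHookEvent"]
    key = (event["topic"], event["itemId"])
    now = time.time()

    if now - recently_seen.get(key, 0) < DEDUPE_WINDOW:
        return jsonify(status="duplicate ignored"), 200

    recently_seen[key] = now
    # ... enqueue the event for processing here ...
    return jsonify(status="queued"), 200
```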

The most difficult aspect of the API is its strict rate limit. Every GraphQL query has a calculated “query cost,” and you can only spend up to 10,000 points before you’re rate limited. The budget refills at 500 points per second.

While this is totally fine if you’re just pulling information about a couple of invoices, it’s tough to deal with if you want to download all of your data, as our client did. To pull a lot of data, you first need to identify what data you actually need, then create a plan to stay within the rate limit.

The most frustrating aspect of the rate limit is that pulling related objects inflates the query cost very quickly. For example, if you want line items along with your invoices, the query cost skyrockets. The way to combat this is to cap how many related items you pull, for example only the first 10 line items. If a particular invoice happens to have more than 10, you’ll miss some of that data.
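As a rough illustration (field names here are approximations, not copied from Jobber’s schema), a cost-conscious query caps every nested connection:

```python
# Illustrative query: capping nested connections keeps the calculated cost down.
# If an invoice has more than 10 line items, the extras are silently missed,
# so pick limits based on what's actually in your data.
QUERY = """
query {
  invoices(first: 20) {
    nodes {
      id
      subject
      lineItems(first: 10) {   # raising this limit inflates the query cost fast
        nodes {
          name
          quantity
          unitPrice
        }
      }
    }
  }
}
"""
```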

Your options for working within the rate limit:

  1. Pull everything, but only a few items at a time to stay within the rate limit (depending on how much data you have, this could take several hours or even days).
  2. First explore your data to figure out reasonable limits, then optimize your query to pull only what you need (for example, if all your invoices have under 20 line items, set the limit to 20; if you don’t need custom fields, leave them out of the query).
  3. Pull everything, starting from parent objects and looping through related objects. For example, pull an invoice, then pull all of its line items in a separate query, using API pagination to stay within the rate limit. This approach is scalable but time-consuming to implement.

For this client, we ended up going with option 2. It’s a good balance of accuracy and speed. 
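Here’s a sketch of how option 2 can look in practice: paginating with tuned limits while pacing requests against the refill rate. The shape of the cost data in the response’s extensions is an assumption; check an actual response before using it.

```python
import time
import requests

JOBBER_URL = "https://api.getjobber.com/api/graphql"

# Field names are illustrative; confirm against Jobber's schema.
QUERY = """
query ($after: String) {
  invoices(first: 20, after: $after) {
    nodes { id invoiceNumber total }
    pageInfo { hasNextPage endCursor }
  }
}
"""

def fetch_all_invoices(headers):
    """Page through every invoice while pacing against the cost budget."""
    invoices, cursor = [], None
    while True:
        body = requests.post(
            JOBBER_URL,
            json={"query": QUERY, "variables": {"after": cursor}},
            headers=headers,
        ).json()

        page = body["data"]["invoices"]
        invoices.extend(page["nodes"])
        if not page["pageInfo"]["hasNextPage"]:
            return invoices
        cursor = page["pageInfo"]["endCursor"]

        # Assumed shape of the cost info in `extensions`. With ~500 points
        # restored per second, sleeping cost/500 seconds roughly keeps the
        # 10,000-point budget from draining.
        cost = body.get("extensions", {}).get("cost", {}).get("actualQueryCost", 500)
        time.sleep(cost / 500)
```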

Automating with Google Sheets and n8n

This project went through several iterations based on what the client wanted. Because they were already comfortable in Google Sheets and had built several reports there, we started by replicating their existing process with automation.

We used n8n to create an automation that would clear the sheets every night at midnight, then repopulate the data one object type at a time over a few hours until it finished. This gave them fresh data every day.
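We built this in n8n rather than code, but the nightly pattern is easy to sketch in Python with gspread (the spreadsheet and tab names here are placeholders):

```python
import gspread

def nightly_refresh(rows_by_tab):
    """Clear each worksheet, then repopulate it with freshly pulled data."""
    gc = gspread.service_account()               # service-account credentials
    sheet = gc.open("Jobber Dashboard")          # placeholder spreadsheet name

    for tab_name, rows in rows_by_tab.items():   # e.g. {"Visits": [...], "Invoices": [...]}
        ws = sheet.worksheet(tab_name)
        ws.clear()                               # wipe yesterday's data
        ws.append_rows(rows)                     # one batch call per object type
```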

One workflow pulled visits, for example. The workflows were daisy-chained: when visits finished, the next workflow started and pulled invoices, and so on.

While this would have been cost-prohibitive on a tool like Zapier, n8n’s pricing is very reasonable and doesn’t punish you for moving a lot of data.

The result worked pretty well, but we knew it wasn’t scalable. We wanted to deliver faster, more accurate data and a solution that wasn’t clearing and restarting from scratch every day. Google Sheets is not a database and lacks the features required to make a robust system.

We also ran into frequent errors with Google Sheets due to the volume of data we were pushing, and each error would crash the entire workflow. We added a step that retried the workflow, but we were patching a fundamentally inferior system.
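The retry step amounted to the classic backoff pattern, roughly like this sketch:

```python
import time

def with_retries(step, attempts=3, base_delay=5):
    """Run a workflow step, retrying with growing delays on transient errors."""
    for attempt in range(attempts):
        try:
            return step()
        except Exception:                      # e.g. Sheets API quota/5xx errors
            if attempt == attempts - 1:
                raise                          # out of retries; let the workflow fail
            time.sleep(base_delay * 2 ** attempt)
```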

Eventually, we landed on using Retool to store the data and create interactive dashboards for managers. Retool has a built-in Postgres database and makes it easy to create interfaces to display that data.

Automated dashboards using Retool

Retool is a visual development tool for quickly building internal tools for your company. It’s great for building dashboards that also let you manipulate your data or take actions, rather than just look at stats.

We recreated our data model from Google Sheets in the Retool database editor, then pointed our automations at Retool. It was relatively trivial to replace the Google Sheets action step with a Postgres step that inserts or updates data.
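In SQL terms, that step boils down to an upsert. Here’s a rough sketch with psycopg2; the connection string, table, and column names are placeholders (Retool’s built-in database speaks standard Postgres):

```python
import psycopg2

# Connection string and table/column names are placeholders.
conn = psycopg2.connect("postgresql://user:pass@host:5432/retool_db")

UPSERT = """
INSERT INTO invoices (jobber_id, invoice_number, total, updated_at)
VALUES (%s, %s, %s, now())
ON CONFLICT (jobber_id)
DO UPDATE SET invoice_number = EXCLUDED.invoice_number,
              total          = EXCLUDED.total,
              updated_at     = now();
"""

with conn, conn.cursor() as cur:
    cur.execute(UPSERT, ("inv_123", "1042", 350.00))
```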

But using Retool (or any proper database) allowed us to go a step further. Instead of relying on a daily sync, we could push data onto the dashboard in real time using webhooks.

To get all the data we wanted accurately and quickly, we ended up using a combination of approaches:

  1. Webhooks for real-time data transfer
  2. Weekly syncs, as a precaution, to catch anything that may have been missed
  3. A browser automation that logs into Jobber, triggers the quote report to be sent to an email address, pulls the .csv attachment from the email, and updates the dates on the quote objects in Retool. 

Theoretically, you could ditch the weekly sync, but then any failure on Jobber’s end or ours would leave mismatched data that might never get fixed. Having a weekly or monthly sync to clean things up is often a good idea, especially since we had already built it.

The browser automation allowed us to circumvent a limitation of Jobber’s API: it gives you no way to see when a quote moved from one status to another. This is an advanced approach that I’ve written about as a guest post here.
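Leaving the browser-automation trigger aside, the email-to-database leg of that pipeline might look roughly like this. The mailbox credentials, the report’s subject line, and the CSV contents are all placeholders.

```python
import csv
import email
import imaplib
import io

# Mailbox credentials and the report's subject line are placeholders.
imap = imaplib.IMAP4_SSL("imap.gmail.com")
imap.login("reports@example.com", "APP_PASSWORD")
imap.select("INBOX")

# Find the most recent report email Jobber sent us.
_, ids = imap.search(None, 'SUBJECT "Quote report"')
latest_id = ids[0].split()[-1].decode()
_, data = imap.fetch(latest_id, "(RFC822)")
msg = email.message_from_bytes(data[0][1])

# Pull the first .csv attachment out of the message.
for part in msg.walk():
    name = part.get_filename()
    if name and name.endswith(".csv"):
        raw = part.get_payload(decode=True).decode()
        rows = list(csv.DictReader(io.StringIO(raw)))
        break

# Each row carries the status-change dates the API won't expose; from here we
# upsert them onto the matching quote records in the Retool database.
```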

Augmenting Jobber with contact attempt tracking

When quotes get ignored or invoices go unpaid, it’s important to have a strategy for consistently keeping in touch with the customer until they take action. 

Jobber doesn’t have tools for this built in, but we created a workaround that enables Graffitico to keep track of attempts to reach their clients. 

Using a series of custom fields, sales reps can now update a quote or invoice to indicate that they reached out today and leave notes on the call. Our automation then copies those notes from the custom field onto the object’s timeline as a note.
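As a loudly hypothetical sketch of that flow: the mutation, field names, and the post_graphql helper below are placeholders, not Jobber’s real schema; the point is the shape of the automation.

```python
# Hypothetical sketch: the mutation and field names below are placeholders,
# not Jobber's real schema -- consult the GraphQL schema for the real calls.
CREATE_NOTE = """
mutation ($quoteId: ID!, $body: String!) {
  quoteCreateNote(quoteId: $quoteId, note: { message: $body }) {  # placeholder mutation
    note { id }
  }
}
"""

def sync_contact_attempt(quote):
    """When a rep fills the contact-notes custom field, mirror it to the timeline."""
    note_text = quote["customFields"].get("Contact notes")  # placeholder field name
    if note_text:
        # post_graphql is a thin wrapper around the request shown earlier.
        post_graphql(CREATE_NOTE, {"quoteId": quote["id"], "body": note_text})
```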

While it would be preferable to track calls and contact attempts as a separate object tied to the quote, this workaround lets Graffitico add that functionality to Jobber.

The result

We handled getting the data into Retool, and our client built their own dashboards using the Retool interface. 

They created multiple pages for different purposes, filling in the gaps left by Jobber’s reports.

“We use Jobber to manage our field services, but their native reporting features are limited. We needed a way to analyze the data beyond what Jobber was capable of doing. Giovanni created an automation to pull all the data every day into google sheets, using Jobber’s GraphQL API. He built the initial solution quickly and was responsive to any updates I wanted to make to the data model. In addition, he suggested further improvements we could make to our systems with his automation capabilities. Gio’s easy to work with and trained me on how to update what he built so I could maintain the system on my own if needed. I highly recommend working with Datos!” – James Perfect, Vice President of Growth & Operations at Graffitico.

Whatever system you choose to use, you usually have options for extending its capabilities with a little creativity. APIs like Jobber’s enable you to access your data and set up integrations so you’re not stuck with the solutions Jobber provides out of the box.

We love working with clients to solve problems like these. If you need better internal tooling to run your business, reach out and tell us what you’re looking for!
