FileMaker Native Dashboard
Updated: Mar 13
Consider the Business Request for a Dashboard.
That's the initial brief, no further details required at this point, just that the Executive Team knows that the Managers will know what they want, so we're going to build them one.
Key Points for Consideration:
What is a Dashboard?
Like most computer and software terms, the term Dashboard comes from the real world.
From vehicles, where the current speed, distance travelled, and water and oil levels are displayed - along with, of course, the infamous 'check engine' light that is routinely 'ignored' - but more on that later . . .
Wikipedia: A dashboard is a type of graphical user interface which often provides at-a-glance views of key performance indicators relevant to a particular objective or business process.
Those of us with websites should be familiar with Google Analytics, where we can view data regarding the traffic, and therefore the performance, of our website.
It's worth stating that a Dashboard is not simply a collection of Line and Pie Charts - to be truly effective, it takes into consideration any 'call to action' required by the User.
I can't stress enough how important having a clear method is for any Project.
Interview with the Operations Managers.
Now that we know the Executive Team has given the project the green light - It's time to drill down to the User Requirements - and create the Utopian experience for the User Teams.
In this case, we are fortunate to have an Operations Manager - who we've elected as 'Project Champion' for our current requirements gathering - who created an Excel document with examples of the 'numbers' required for the Dashboard.
Define Required Personas.
We have numerous 'unique' User Types, therefore numerous requirements and reporting Areas for the Dashboard.
There's a direct connection between the existing User's 'Default Values' for the rest of this Solution, and the User's requirement to filter the Dashboard with the same options.
In this case, our start up script will pre-populate the Dashboard Filtering, as per the pre-populating of the Search options in the other sections of the Solution.
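A sketch of that startup behaviour, assuming one global variable per filter and a hypothetical 'User Defaults' table (names here are illustrative, not from the Solution):

```
# Script: On First Window Open (startup)
# Pre-populate the Dashboard filters from the User's stored defaults,
# mirroring how the Search options are pre-populated elsewhere
Set Variable [ $$filter.state ; Value: User Defaults::Default.State ]
Set Variable [ $$filter.division ; Value: User Defaults::Default.Division ]
Set Variable [ $$filter.area ; Value: User Defaults::Default.Area ]
```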
Dashboard Users need to mix and match filters for the data presented, by their own choice:
The User Requirement for the Dashboard is to select Filters, and to 'see' the data update within the expected response time.
A good measure is to count '1, 2, 3, look at me' - and if your data is not ready, then you need to update your interface, code or back end - or sometimes even all three.
One requirement is to show all data - either with the expected behaviour of having to select every item, or with the 'cleaner' option of selecting no options at all to show all data.
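A minimal sketch of the 'no selection shows all' pattern, assuming a hypothetical global variable holding the selected filter values and a State field on the record being tested:

```
// Returns 1 (pass) when no filter is selected, otherwise
// passes only records whose State appears in the selection
If ( IsEmpty ( $$filter.states ) ;
    1 ;
    not IsEmpty ( FilterValues ( Dashboard::State ; $$filter.states ) )
)
```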
Real Live Data.
Our current solution provides data reporting for the following Sections:
20,000 Audit Actions
With most Data Modelling - the exception proves the rule. It's worth taking the time to study and recognise the Data Patterns from existing reporting, Dashboards are no exception.
There are multiple Areas that cross multiple States, and multiple Divisions that cross multiple States. The trick here is not to 'count' the same Person, Vehicle, etc. twice when multiple options are selected for the same filter - or when a filter has no selection, which assumes 'all' values are to be displayed.
Real Time Data.
Based on Executive and Operational Management interviews, along with Personnel interviews, data presented via the Dashboard 'can' be anywhere between 'live' and '24 hours old'. This is the norm, with the exception being that a script can be called from the Dashboard to refresh the data Server-Side, taking only minutes.
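That on-demand refresh can be sketched as a simple wrapper around Perform Script on Server (the script names here are hypothetical):

```
# Script: Refresh Dashboard Data (called from a Dashboard button)
# Hand the heavy lifting to the Server so the User isn't blocked
Perform Script on Server [ "Rebuild Dashboard Warehouse" ; Wait for completion: Off ]
Show Custom Dialog [ "Refresh started - the Dashboard data will update within minutes." ]
```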
Now we are starting to formulate a real world requirement for our Dashboard.
Data Publishing Options.
Before deciding that a Dashboard is the best fit for the User's requirements - and therefore meets the Business's requirements - we ask if a Dashboard is what is 'needed', not just what Management 'wants'.
One option is an email blast to all Users - with the data filtered based on their User Defaults.
Again, this can be an option of the User Defaults - to either receive this report via email, or to choose to navigate to the Dashboard and view the data - with the obvious advantage of being able to edit the filters 'in real time' beyond their User Defaults.
Which brings up an important feature - should the Solution publish, or broadcast, to the correct Managers (and Personnel) when data limits are reached?
Consider when the compliance of an Area falls below the required value - should the Manager 'find' the issue during their next 'visit' to the Dashboard - or should the system send a Notification (email or text) as soon as the live data limit is reached?
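As a sketch, a Server Schedule could run a check along these lines (the table and field names are hypothetical, not from the Solution):

```
# Script: Check Compliance Limits (run via a Server Schedule)
If [ Dashboard Report::Compliance.Percent < Dashboard Report::Compliance.Limit ]
    # Notify the responsible Manager as soon as the limit is breached
    Send Mail [ Send via SMTP Server ; To: Dashboard Report::Manager.Email ;
        Subject: "Compliance below limit" ;
        Message: "Area " & Dashboard Report::Area & " is at " &
                 Dashboard Report::Compliance.Percent & "%" ]
End If
```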
Once we have the User presented with our Dashboard Report - in this example, Event Actions that are overdue - what is the next User action?
I'm as much a fan of Charts as the next Developer - to communicate the whole story, in combination with real numbers, as per the example at the top of this article.
Each of the Items presented as numbers on the Dashboard allows the User to 'Click' and 'Navigate' to those records for viewing and/or actioning, as required.
For most Solutions, we can break down the creation of the function to three components - what the user sees, what the user interacts with, and where the data is stored (the back end).
Relationships vs ExecuteSQL.
Dashboards can present number and chart data - while this has always been achievable via 'native' relationships and fields, we now also have the option of the 'ExecuteSQL' function.
There will be some that now see 'ExecuteSQL' as a 'native' function - and like any other function, it is to be used as necessary, not just because it's the latest function to be added.
As part of your research on Development options, attempt to solve the same sample function via ExecuteSQL and/or Relationships to collect the related data.
To begin with, here's an example of the count returned via the ExecuteSQL function:
ExecuteSQL ( "SELECT COUNT ( * ) FROM \"Personnel\" WHERE \"ETS.Employee\" IS NOT NULL" ; "" ; "" )
In this case, we're setting a 'flag' in the Personnel record as a data label indicating the Person works for a particular company. We're also using this flag in other reports to count records.
Now things get interesting, as some records will have multiple flags set, or have multiple values for Division, State or Area - in this case, Vehicles. This prevents us from simply counting the records; we need to count only the unique records:
ValueCount ( UniqueValues ( List ( Dashboard Report.Date::Plant.Vehicles.IDs ) ) )
Note: It's interesting that the above calculation does not work with Count (or Sum, Max or Min) and only returns a value with ValueCount - the aggregate functions expect a field reference, whereas UniqueValues returns a return-delimited text list, which ValueCount can count. Also worth noting that the List function is required, as we are first returning a list of related records, and then counting.
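To illustrate the difference, using the same field as the example above:

```
// Counts every related record, duplicates included
Count ( Dashboard Report.Date::Plant.Vehicles.IDs )

// Counts distinct IDs - List flattens the related values into a
// return-delimited text list, UniqueValues removes the duplicates,
// and ValueCount counts what remains
ValueCount ( UniqueValues ( List ( Dashboard Report.Date::Plant.Vehicles.IDs ) ) )
```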
How we reach into the related records and count the values can be via various methods.
Without a Data Warehouse, the result for the ETS.Employee flag needs to be calculated for each and every record - and let's just say neither ExecuteSQL nor 'Native' functions return the data in an expected time - in this case, the above ExecuteSQL call returned in 20 seconds.
Data Warehouse Function.
For this example, we've set up a separate table for data to be displayed on the Dashboard.
A key feature of this method is that we include the 'Date' field with each collection of the data, which allows the storage of historical data - and therefore the ability for the User to 'go back in time' and view previous data.
Also consider that this data history opens up the prospect of reports covering time frames, along with reports on future trends, which leads us towards true business intelligence.
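A sketch of pulling a prior day's figures back out of such a date-stamped table (the table, field and date shown are illustrative, not from the Solution):

```
// Fetch the stored Vehicle count for a given snapshot date
ExecuteSQL (
    "SELECT \"Vehicles.Count\" FROM \"Dashboard Report\" WHERE \"Date\" = ?" ;
    "" ; "" ;
    Date ( 3 ; 13 ; 2019 )
)
```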
All that is required for the Data Warehouse is the same number of Tables - one for each section - related to the Data Warehouse Table; in this case, seven Table Occurrences in all.
Minimum Viable Product (MVP).
A critical part of the Project Brief and Timeline is to define the MVP and release it into the Production Environment as soon as possible, ensuring the release improves the workflow.
Just as critical is to announce the arrival of the Dashboard to the User Community - give credit to Stakeholders, define the current and future functionality, and empower Users to provide feedback and be part of the direction of future releases.
When it comes to development - needs analysis aside - years of experience have shown that 80% of the requirement can be achieved with the first 20% of effort. This is not to say that the last 20% of functionality is no longer required, as it can be the difference between good and great software.
Multiple Server Environments.
As discussed in previous blogs, and also discussed on the Fireside FileMaker Podcast, I consider a minimum of 'two' Server Environments as 'mandatory'.
Various Server/Client Environments
Running 'four' Server Environments may be considered extreme by some; the minimal compromise is an environment that can function with two Servers.
With two Servers, we have the Production Server - obviously - and coordinate the other three functions on the second Server.
User Acceptance Testing (UAT).
Creating various UAT Documents, along with the requirement to sign off on completion, brings the project closer to releasing versions of the Solution with minimal 'bugs' or unexpected behaviours - which would otherwise result in data errors.
The initial UAT Document is based on the Project Brief tasks, with tasks performed by the Developer - which are tasks the Developer should be performing as they craft the code.
Consider page two of the UAT Document - where the User performs tasks in the expected 'wrong' way. This can be reversing the order of operations, attempting to delete records, or entering invalid data, based on the information the Developer has collected.
Now consider page three of the UAT - and this becomes the 'fun' part for the Users - seriously attempting to 'break' the solution. That's right, 'break' the solution, beyond the functions and methods applied by the Developer, and their knowledge of the Solution.
You'll see a definite 'shift' in the culture of your Developer and User relations when a level of fun and competition is added to your Solution Crafting process . . .
"Great work with the continued development on the Ecosystem Dashboard and functionality. Really makes it easy to focus on improving our effectiveness and compliance (and close the gaps). Thank-you!" ~ Justin