Every day seems to bring yet another dashboard that not only reinvents the performance reporting wheel but also fails to support data sharing across organizations and systems.
Dashboards originated to protect passengers from being splashed by water and mud kicked up by horse-drawn carriages. With the advent of motorized carriages, dashboards featured gauges and controls reporting on the vehicle's functioning. More recently, the word has been applied to IT applications displaying performance indicators of myriad systems and organizations.
As vehicles become more intelligent and autonomous, their dashboards may increasingly come to resemble their IT cousins. However, unlike their vehicular instantiations, IT dashboards have not yet matured to provide relatively standardized means of reporting performance metrics. Thus, every day seems to bring yet another dashboard (YAD) that not only reinvents the performance reporting wheel but also fails to support data sharing across organizations and systems. Google turns up about 300 million hits on the term “dashboards.” USA.gov doesn’t provide a count but reveals hundreds, if not thousands, in the .gov domain, including attractive cattle and lamb dashboards requiring Adobe’s soon-to-be-deprecated Flash.
That is not to say those dashboards provide no value to their stakeholders. But to take the vehicular analogy a turn further, it is as if each organization is driving around its own parking lot and never thinks about traversing public thoroughfares. Public agencies, which are funded by taxpayers in accordance with politically driven mandates, should be not only socially and fiscally responsible but also well-coordinated, both from the top down and from the bottom up. Surely, our agencies -- across all levels of government -- ought to be able to work together as intelligently as our cars, shouldn’t they?
For them to do so, not only their leaders but also other stakeholders must be able to see, understand, and influence the results being generated -- via performance reporting and management systems.
Unfortunately, the record of government bureaucracy is not encouraging. More than 20 years ago, Raines’ Rules (No. 5) directed agencies to “specify standards that enable information exchange …” Similar guidance has been reiterated over the years, including the Office of Management and Budget's Circular A-130, which explicitly directs agencies to use open data standards.
When John Teeter was deputy CIO at the Department of Health and Human Services, he proposed the inclusion of a planning and accountability domain in the National Information Exchange Model. Yet little, if any, thought was given to data exchange when the George W. Bush administration’s ExpectMore.gov site was replaced by the Obama administration’s Performance.gov. That site, built on Drupal, is neither a data management nor a data sharing system, much less a performance management system based upon an applicable data standard.
Moreover, when taxpayer funding was used to develop an application programming interface for the Performance.gov site after the fact, it proved to be unusable because the performance indicators provided on the site were not linked to the goals and objectives they were meant to measure.
That’s a contemporary Catch-22: Why should we care that performance indicators cannot be linked to goals and objectives if a site is so alluring? Looking good is no substitute for performing well. Performance reporting systems should provide metrics stakeholders care about in ways that are readily discoverable, comprehensible and usable to them.
Since the GPRA Modernization Act requires a centralized site, some version of Performance.gov will likely persist. However, if past performance is any indicator of future results, the records on the site may not be maintained with any continuity across changing political administrations. According to a pop-up on the site, it is being reengineered, with the next release anticipated in February when GPRAMA requires agencies to update their plans and publish them on their own websites in machine-readable format. Encouragingly, the Aug. 1 release of OMB Circular A-11, section 230.18, parenthetically notes the Performance.gov site is merely one example of the types of services to be enabled by publishing plans and reports in machine-readable format.
Publishing performance plans and reports in standard, machine-readable format will enable intermediaries to add value to the data for myriad communities of interest, far beyond any capability that might be provided by a single, centralized site. Making the original, authoritative versions of agencies’ plans and reports available on their own websites in open, standard, machine-readable format will also address the problem of maintaining those important records with continuity across administrations. These plans should also be designated as permanent records in the agencies’ own records schedules.
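To make the point concrete, here is a minimal sketch of what a machine-readable plan with explicit goal-objective-indicator linkage might look like. The JSON schema below is purely illustrative -- it is an assumption for discussion, not the national or international standard the article references -- but it shows the key property the Performance.gov API lacked: every indicator carries a traceable link back to the objective and goal it measures, so the data can be consumed programmatically.

```python
import json

# Illustrative, hypothetical machine-readable performance plan.
# The schema is a sketch, not an official standard: the point is
# that each indicator is nested under (and thus linked to) the
# objective and goal it measures.
plan = {
    "organization": "Example Agency",
    "goals": [
        {
            "id": "G1",
            "name": "Improve service delivery",
            "objectives": [
                {
                    "id": "G1.O1",
                    "name": "Reduce average claim processing time",
                    "indicators": [
                        {
                            "id": "G1.O1.I1",
                            "name": "Median days to process a claim",
                            "unit": "days",
                            "target": 10,
                            "actual": 14,
                        }
                    ],
                }
            ],
        }
    ],
}

def indicators_with_context(plan):
    """Walk the plan and yield each indicator together with the goal
    and objective it is linked to -- the linkage that makes the data
    directly usable by intermediaries."""
    for goal in plan["goals"]:
        for objective in goal["objectives"]:
            for indicator in objective["indicators"]:
                yield {
                    "goal": goal["name"],
                    "objective": objective["name"],
                    **indicator,
                }

serialized = json.dumps(plan, indent=2)  # what an agency might publish
rows = list(indicators_with_context(json.loads(serialized)))
print(rows[0]["goal"], "->", rows[0]["name"])
```

Because the linkage is explicit in the published document itself, any stakeholder -- not just the publishing site -- can answer "which indicators measure which goals?" with a few lines of code.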
As agencies develop their strategic plans, they should keep in mind that effective dashboards depend on simplicity, readability and focus, according to BetterBuys, which offers resources to inform software purchases. Similarly, the data analysis and visualization firm Tableau identifies five practices along with seven mistakes to avoid. The good practices are:
- Choose metrics based on why they matter.
- Keep it visual.
- Make it interactive.
- Keep it current or don’t bother.
- Make it simple to access and use.
The mistakes to avoid are:
- Starting off with too much complexity.
- Using metrics no one understands.
- Cluttering the dashboard with unimportant graphics and unintelligible widgets.
- Waiting for complex technology and big business intelligence deployment projects.
- Underestimating the time or resources to create and maintain the dashboard.
- Failing to match metrics to the goal.
- Using ineffective, poorly designed graphs and charts.
Other guidelines for dashboards include Dashboard Design Best Practices – 4 Key Principles, by Ilan Hertz; A Guide to Creating Dashboards People Love to Use, by Juice Analytics; Socrata’s Solutions for Performance Improvement & Accountability; and common dashboard features from Capterra, which categorizes and provides advice on business software.
While data integration capabilities of dashboards are certainly useful in the near-term, they should not be taken as a long-term substitute for making performance reports available in open, standard, machine-readable format so the data they contain can be used directly, without the need for cumbersome and costly extraction, transformation and loading processes. Readily usable (user-centric) query/discovery and input/feedback features should also be supported. [See Dan Chenok’s thoughts on “Citizen engagement: a pathway for government reform.”]
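The difference a shared format makes can be sketched in a few lines. In the hypothetical example below, two agencies publish reports in the same (assumed, illustrative) JSON structure, and an intermediary consumes both directly with one parser -- no per-source extraction, transformation and loading pipeline. The field names and the lower-is-better convention are assumptions for the sketch, not a real standard.

```python
import json

# Hypothetical: two agencies publish performance reports in one
# shared machine-readable format (field names are illustrative).
published_reports = [
    json.dumps({
        "agency": "Agency A",
        "indicators": [
            {"name": "Permit backlog", "target": 100, "actual": 140},
        ],
    }),
    json.dumps({
        "agency": "Agency B",
        "indicators": [
            {"name": "Call wait time", "target": 5, "actual": 4},
        ],
    }),
]

def missed_targets(reports):
    """Flag indicators whose actual value exceeds the target
    (assuming lower-is-better metrics for this sketch)."""
    misses = []
    for raw in reports:
        report = json.loads(raw)  # the same parser works for every agency
        for indicator in report["indicators"]:
            if indicator["actual"] > indicator["target"]:
                misses.append((report["agency"], indicator["name"]))
    return misses

print(missed_targets(published_reports))
```

An intermediary running this kind of cross-agency query is exactly the sort of value-adding service a single centralized site cannot anticipate -- and it only works when the source data are published in a common, open format.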
The $10 million Steve Ballmer reportedly spent developing USAFacts, a YAD on Uncle Sam’s performance, is far more than would have been necessary if agencies had implemented GPRAMA’s machine-readability requirement.
Sadly, the scientific community, which should be leading the way, seems to be lagging. When former Vice President Joe Biden announced the Moon Shot to Cure Cancer, for example, he lamented the lack of coordination among cancer researchers and the prevalence of data silos.
While much lip service has been paid to data sharing for many years, the lowly regarded, so-called “do-nothing” Congress has recently demonstrated real leadership in that regard -- through legislation like the DATA Act and section 10 of GPRAMA, both of which require data to be shared in machine-readable format.
And with the recent release of OMB Circular A-11, it appears the Trump administration will now expect agencies to comply. If so, the next question is how much more of the taxpayers’ money will be wasted reinventing YADs that fail to implement the duly adopted national and international open data standard.
With reference to their namesake in the Jewish tradition, let’s hope YADs point to a more intelligent, if not necessarily a divinely inspired future.