PredictSmart: Use Data, Find Answers, Fulfill the Mission

On Point · 5 minute read · Jan 17th, 2020

[PredictSmart overview image]

Sometimes people only see what they want to see. The dichotomy of our times is that government collects voluminous rich data, with over 260,000 datasets being made public and many more considered too sensitive for eyes other than those with a need to know. Yet, too often, too many policy and operations decisions are made on gut instinct and political popularity rather than data.

Some questions are portentous. What are terrorists plotting, and what actions will thwart them? What strain of flu is coming this winter, what vaccine will protect the most vulnerable, and must it be rationed? With demands that the opioid crisis placed on the foster family network, what programs are needed by affected children? What highway investments will best reduce traffic accidents?

Some questions are urgent, and decisions cannot wait for an in-depth study. What havoc will tomorrow’s blizzard create at which airports? A massive forest fire burns out of control; what citizens and buildings are in its path, what actions should be taken, and how should they be prioritized?

Others are more customary, involving longstanding programs with massive budget outlays and enduring political interest group support. In a period of strong economic growth, what adjustments are needed in programs serving SNAP recipients to help them increase their incomes and self-sufficiency? As the population ages, what regional demographic shifts will occur, and what is the impact on federal agency regional operations?

There is a huge premium on arriving at the right answer. Even when lives are not at stake, budgets and credibility are at risk, and getting it wrong invites very public criticism, even recrimination. In OMB Memorandum M-19-23, Acting Director Russell Vought stated, “Despite previous efforts and resource commitments, federal agencies often lack the data and evidence necessary to make critical decisions about program operations, policy and regulations, and to gain visibility into the impact of resource allocation on achieving program objectives. Investing in and focusing on the management and use of data and evidence across the federal government will enable agencies to shift away from low-value activities toward actions that will support decision makers: linking spending to program outputs, delivering on mission, better managing enterprise risks and promoting civic engagement and transparency.”

Congress has passed laws (such as the Foundations for Evidence-Based Policymaking Act, the Government Performance and Results Act, and the DATA Act), and the Trump Administration’s Federal Data Strategy and Action Plans are focused on putting the people and tools in place. There is much work to do.

Skill sets and organizations will evolve, leading to better algorithms and analysis of existing data in order to improve program results and operating efficiency. That, in turn, will require integration of third-party data or new data collection, modification of existing systems or creation of new ones, and re-shaping of operations and policies. Transparency will bring new, independent analyses of the datasets, leading to new solutions that may benefit the public on one hand, while on the other hand threatening longstanding assumptions of groups that have had a key role in determining program budgets.

In the near term at Unisys, we are uncovering innovative ways every day for agencies to use technology to extract sound insights from oceans of data in furtherance of their mission.

  • Integrating child welfare data with other systems (birth and death certificates, public assistance programs, criminal records) and employing algorithms can enable child welfare agencies to rapidly match programs to needs, improving lives while allocating resources wisely.
  • Replacing manual paperwork with AI tools, such as logistics management best practices that reduce queues at border crossings and risk algorithms that reduce false positives. Travelers and trucks can be rapidly sorted by border agents who receive timely, tailored, detailed information that helps them spot risks and prioritize searches, making better and more efficient use of their time and easing the flow of travelers and cargo.
  • Restoring computer systems and preventing infection by doing rapid analysis of cyber threats, then using AI to apply tools that isolate those threats at points in the network where they can be assessed without harm to systems, operations or the privacy of citizen data held by that agency.
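The first bullet above, data integration across child welfare and public assistance systems, can be illustrated with a minimal sketch. All dataset names, identifiers, and fields here are hypothetical placeholders, not drawn from any real agency system; the point is simply that a left join surfaces cases that one program sees but another does not.

```python
# Hypothetical sketch: linking child-welfare case records to public
# assistance enrollments on a shared identifier, then flagging children
# with an open case but no assistance enrollment (a candidate coverage gap).
import pandas as pd

welfare_cases = pd.DataFrame({
    "child_id": [101, 102, 103],
    "case_type": ["foster", "kinship", "foster"],
})
assistance = pd.DataFrame({
    "child_id": [101, 103],
    "program": ["SNAP", "TANF"],
})

# Left join keeps every welfare case; unmatched rows get NaN for "program".
linked = welfare_cases.merge(assistance, on="child_id", how="left")

# Children whose household shows no assistance enrollment.
gaps = linked[linked["program"].isna()]["child_id"].tolist()
print(gaps)  # [102]
```

In practice, real record linkage is rarely this clean: shared identifiers are often missing, so agencies fall back on probabilistic matching over names, dates of birth, and addresses, which is where the algorithmic work lies.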

At this point, an acknowledgment is de rigueur: All these breakthroughs depend to some degree on widespread data sharing, which many agencies are reluctant to do. Discussions of AI and algorithms inevitably involve concerns over model quality and whether the level of false positives is so high as to make the predictive value more harmful than good. Collection, sharing and use of data are not automatic and will require consideration of privacy trade-offs, even as concerns about the security of data grow. These are all worthy concerns and are part of what the Federal Data Strategy is addressing under its “ethical governance” remit. Even countries that are more rivals than allies understand the important role of careful sharing of data in the interest of security.

As for keeping data safe from intruders, that will always and forever be a significant consideration in any use of data technology. That is why security is the subject of our fourth blog in this series, SecureSmart.

Next up: “SmartEnterprise: The reliable, available, and protected work environment.”

Click here to receive the latest updates on our Smart 2020 event.

Tags: AI, Analytics, Cloud, Data, ML