The analytics needs of government agencies differ from those of small and medium businesses (SMBs). The major difference stems from the objectives of the respective enterprises: SMBs, like any other business, are focused on profits, whereas government agencies are focused on efficiency (in theory, at least!). Beyond this fundamental split, there are few other differences.
Every enterprise, whether small business, large corporation, or government agency, collects data: increasingly larger amounts of it, at steadily decreasing cost. Once this data is available, the key question, of course, is how best to make use of it.
In late 2011, the President’s Council of Advisors on Science and Technology concluded that the US was under-investing in big data technologies for collection, management, analysis and effective utilization. As a result, in March 2012 the Office of Science and Technology Policy announced a $200 million initiative to improve the “ability to extract knowledge and insights from large and complex collections of digital data”.
In addition, agencies ranging from the Defense Advanced Research Projects Agency (DARPA) to the Department of Energy (DOE) are focusing on data analytics challenges specific to their missions. DARPA is beginning the XDATA program, which intends to invest approximately $25 million annually for four years to develop computational techniques and software tools for analyzing large volumes of semi-structured and unstructured data. The key issues XDATA will focus on include developing algorithms for processing data in distributed data stores, and creating effective human-computer interaction tools that enable rapidly customizable visual reasoning for diverse missions.
The DOE has recognized that the simulations running on its supercomputers are becoming increasingly sophisticated and complex; however, the analytics needed to digest the resulting data are not keeping pace. To address this, it is providing $25 million to fund the Scalable Data Management, Analysis and Visualization (SDAV) Institute to help develop data analytics technologies.
Clearly, all these efforts are predicated on the availability of large volumes of data. What is needed is a scalable method that allows a government agency to easily access relevant data; to set up processes to transform, cleanse and filter it; and finally to apply data reduction and visualization tools that automatically identify the key drivers of information content within the data. After applying such a process, agencies can dive deeper into cause-and-effect analysis by building simulation models on the reduced data set, or simply use the distilled information to inform policy decisions.
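To make the data-reduction step concrete, here is a minimal sketch of one classic technique, principal component analysis (PCA), which finds the direction along which a data set varies most; the function names and the tiny in-line data set are illustrative assumptions, not part of any product described here. A pure-Python power-iteration version looks like this:

```python
import math

def pca_first_component(rows, iters=200):
    """Return the first principal component (unit vector) of a list of
    numeric rows, found via power iteration on the covariance matrix.
    Illustrative only: real workloads would use an optimized library."""
    n, d = len(rows), len(rows[0])
    # Center each column on its mean.
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    # Sample covariance matrix (d x d).
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    # Power iteration converges to the dominant eigenvector of cov.
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def project(rows, v):
    """Reduce each row to a single number: its coordinate along v."""
    n, d = len(rows), len(v)
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    return [sum((r[j] - means[j]) * v[j] for j in range(d)) for r in rows]

# Toy example: points lying along the line y = 2x collapse to one dimension.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
component = pca_first_component(data)
reduced = project(data, component)
```

In this toy case the two-column data set is perfectly explained by a single direction, so the one-dimensional projection loses no information; on real agency data, the same idea surfaces the handful of directions that carry most of the information content.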
Sign up to beta test KeyConnect, our dimension reduction application.