Rebuilding Public Trust: The Case for Compliant Financial Data

The key to solving the problem is figuring out the best way to publish our data

In the aftermath of banking failures, subprime mortgages, and bailouts across multiple industry sectors, it is a good time to examine the strategies it will take to rebuild public trust in our government and in the world’s financial markets. I believe the answer depends both on what we can do and how we do it.

With all the financial information that corporations were obligated to report under existing government regulations, how could we not have foreseen this financial disaster? Did we misread the data? Was the information in the reports incorrect? Should more have been required in the reports? What was missing were regulations that properly addressed how the information was to be reported. Being specific about the “how” might well have provided warnings of the impending disaster, rather than leaving us to discover how bad it was in the midst of it.

A major step toward solving the problem of how to report financial information has been taken in the latest set of regulations from the U.S. SEC, which now require a growing number of companies to provide financial statement information in eXtensible Business Reporting Language (XBRL). Once all companies that are required to report use this format, analysts will be able to provide more accurate and timely warnings. Under the past rules, the demanded information could arrive locked inside spreadsheets, forms, PDFs, web pages, and other proprietary formats; specifying XBRL makes the data more usable and more easily gathered and analyzed.
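To make the point concrete, here is a minimal sketch, in Python using only the standard library, of what machine-readable XBRL buys an analyst: tagged facts can be pulled out of a filing programmatically instead of being retyped from a PDF. The instance fragment and the taxonomy namespace below are simplified assumptions for illustration, not a real filing.

```python
import xml.etree.ElementTree as ET

# A minimal, illustrative XBRL instance fragment. The namespace URI and
# concept name are simplified assumptions for this sketch.
INSTANCE = """<?xml version="1.0"?>
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2009-01-31">
  <us-gaap:Revenues contextRef="FY2008" unitRef="USD" decimals="-6">
    1500000000
  </us-gaap:Revenues>
</xbrl>"""

GAAP = "{http://fasb.org/us-gaap/2009-01-31}"

def extract_facts(xml_text):
    """Return (concept, context, value) for each tagged fact in the instance."""
    root = ET.fromstring(xml_text)
    facts = []
    for el in root:
        if el.tag.startswith(GAAP):
            concept = el.tag[len(GAAP):]  # strip the namespace prefix
            facts.append((concept, el.get("contextRef"), el.text.strip()))
    return facts

print(extract_facts(INSTANCE))
# [('Revenues', 'FY2008', '1500000000')]
```

With thousands of filings in a common format, the same few lines of parsing work for every company; that uniformity is what the old mix of PDFs and spreadsheets could never offer.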

If we can’t find the data, if we can’t figure out the problems that may be buried in the mountains of information, and if we haven’t any means to explore the data, we are no better prepared for the next financial crisis.

Creating Compliant Data

To complete the true financial picture of our economy, we need to finish the move to XBRL and ensure that all financial reporting is available as XBRL-tagged content. The government also needs to use XBRL reporting as part of creating a complete picture of the economy. As the bailout legislation, the Recovery Act, and other massive appropriations that include public-disclosure requirements continue, XBRL reporting could provide an excellent method of measuring their effects on the economy in real time. Economic indicators usually lag because surveys and reports take time to collect and process.

The initial impact of new regulations layered on top of existing ones like Sarbanes-Oxley will be that corporations are obligated to publish even more data, more frequently. The Obama administration’s effort to push more government data out via Recovery.gov is a good example of the movement to encourage more disclosure. If that information were in XBRL, we could use the data far more easily. As the government rolls out its mandate for corporations to submit their financials in XBRL to the U.S. SEC for closer scrutiny and better compliance, it could also take a page from its own book and make all government financial reporting available to the public in XBRL, facilitating a more open national dialogue on our government’s financial health.

As Obama said in his Memorandum on Transparency, “Government should be transparent. Transparency promotes accountability and provides information for citizens about what their Government is doing. Information maintained by the Federal Government is a national asset.” This information needs to be compiled in a harmonized, compliant fashion across agencies to facilitate its preservation, dissemination, and use, and to maximize the research derived from it.

Connecting the Data

Like pages on the web, we intuitively know that the public data being posted to sites like SEC.gov and Recovery.gov is interconnected with other sources of data. But discovering those connections when the data is in disparate formats is next to impossible for most people, and at times even for expert research analysts. The government has already started to require XBRL, with positive results in terms of access to data, so there is no reason not to settle on XBRL for all financial reporting by companies and government entities alike. Beyond any specific qualities of XBRL, simply sharing a common format makes reports faster to gather and analyze, and the analysis more likely to be accurate.

Further transformations like those proposed at recovery.gov (such as a service to transform the data to RDF-tagged semantic data) can allow the financial data in XBRL to be combined with data from other industry and government sectors — transforming the way we explore information.
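Such a transformation can be sketched very simply: each XBRL fact becomes an RDF triple that can then be joined with data from other sectors. The sketch below emits N-Triples with Python string formatting alone; the base URIs and vocabulary are illustrative assumptions, not an official scheme proposed by recovery.gov.

```python
# Sketch: turning XBRL facts into RDF triples (N-Triples syntax).
# The filing and vocabulary URIs below are illustrative assumptions.
BASE = "http://example.org/filing/2009-10K#"
VOCAB = "http://example.org/xbrl-vocab#"

def facts_to_ntriples(facts):
    """facts: iterable of (concept, context, value) tuples."""
    triples = []
    for concept, context, value in facts:
        subject = f"<{BASE}{context}>"      # the reporting period/context
        predicate = f"<{VOCAB}{concept}>"   # the financial concept
        obj = f'"{value}"'                  # the reported value, as a literal
        triples.append(f"{subject} {predicate} {obj} .")
    return "\n".join(triples)

print(facts_to_ntriples([("Revenues", "FY2008", "1500000000")]))
# <http://example.org/filing/2009-10K#FY2008> <http://example.org/xbrl-vocab#Revenues> "1500000000" .
```

Once the facts are triples, standard semantic-web tooling (SPARQL queries, triple stores) can combine them with datasets that were never designed to interoperate with financial filings.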

In addition, the data about the data needs to be discoverable. Having information in a usable format does little if it is not easily accessible. Any kind of data object or concept should be found at a specific Uniform Resource Locator (URL) so that people can look up specific names, get useful information, and discover more.
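As a sketch of that principle, the function below mints a lookup URL for a reporting entity from its SEC Central Index Key (CIK). The EDGAR URL pattern used here is an assumption for illustration, not a guaranteed interface; the point is that every entity gets one stable, predictable address.

```python
# Sketch: minting a stable, dereferenceable URL per reporting entity.
# The EDGAR query pattern below is an illustrative assumption.
EDGAR = "https://www.sec.gov/cgi-bin/browse-edgar?action=getcompany&CIK={cik}"

def entity_url(cik):
    """Zero-pad the CIK to ten digits and embed it in the lookup URL."""
    return EDGAR.format(cik=str(cik).zfill(10))

print(entity_url(320193))
```

Because the mapping from identifier to URL is deterministic, both people and machines can construct the address of any entity’s data without consulting an index first.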

As Tim Berners-Lee, director of the World Wide Web Consortium, put it, “It is about making links, so that a person or a machine can explore the web of data. With linked data, when you have some of it, you can find other related data.

Another consideration is trusting the provenance of the data itself: maintaining the connection to the source of the data, whether it is a filing on the SEC website or a Recovery.gov document listing all the grant recipients for bailout funds. If the link (or URL) to the primary data source is lost, the connection and the provenance of the data can no longer be assumed, and any derived research loses its authenticity. When posting financial information for public consumption, entities enter into an unspoken agreement with consumers to maintain those links. As more and more data becomes available online, the long-term stability of government sites and the continued maintenance of links must be guaranteed. The links are the mechanisms that will let us trace back to the sources and connect us to the authoritative literature, whether that content is a Financial Accounting Standards Board (FASB) ruling, a Senate bill allocating billions of bailout dollars, or the financial data related to an entity’s compliance with these rules and regulations.
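One simple way to preserve provenance is to record, alongside any downloaded filing, its source URL and a cryptographic hash of its content, so later consumers can verify that derived research still traces back to the authoritative document. The field names in this sketch are an assumption, not a standard schema.

```python
import hashlib
from datetime import datetime, timezone

# Sketch: a provenance record for a retrieved filing. The field names are
# illustrative assumptions, not a standard provenance schema.
def provenance_record(source_url, content: bytes):
    return {
        "source_url": source_url,                       # the authoritative link
        "sha256": hashlib.sha256(content).hexdigest(),  # content fingerprint
        "retrieved_at": datetime.now(timezone.utc).isoformat(),
    }

rec = provenance_record("https://www.sec.gov/Archives/example-10k.xml",
                        b"<xbrl>...</xbrl>")
print(rec["source_url"], rec["sha256"][:12])
```

If the publisher later moves or alters the document, the stored hash no longer matches, which is exactly the broken-provenance condition the paragraph above warns about.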

If governments and corporations publish financial information using harmonized data standards and ensure that relevant links are maintained and accessible, it will go a long way toward making relevant data easy to find. Public skepticism will diminish, and we will again trust the financial markets and the agencies that regulate them.

More Stories By Diane Mueller

Diane Mueller is a leading cloud technology advocate and is the author of numerous articles and white papers on emerging technology. At ActiveState, she works with enterprise IT and community developers to evangelize the next revolution of cloud computing - private platform-as-a-service. She is instrumental in identifying, building, and positioning ActiveState's Stackato cloud application platform. She has been designing and implementing products and applications embedded into mission critical financial and accounting systems at F500 corporations for over 20 years. Diane Mueller is actively involved in the efforts of OASIS/TOSCA Technical Committee working on Cloud Application Portability, works on the XML Financial Standard, XBRL, and has served on the Board of Directors of XBRL International. She currently works as the ActiveState Cloud Evangelist.
