
Trends in Federal Records Management

Three Principles for Successful Federal Records Management

The following is a summary of my comments provided on Wednesday, January 29, 2014, at the Alfresco Content.Gov event in Washington, DC.

In my 27 years of federal service, I've watched the growth in federal records and the implementation of new executive orders and regulations aimed at improving records management across the federal space. There are immense challenges associated with litigation, review and release, tracing factual evidence for analysis, managing information for legal proceedings, and overseeing a plethora of authorized and unauthorized disclosures of classified and/or sensitive information.

Federal records management professionals are true unsung heroes in helping our nation protect information while also protecting the civil liberties and privacy of our nation's citizens. The job has become increasingly difficult in today's era of "big data."  Records management and information management were hard in the 1980s, and that's when we thought big data was hundreds of gigabytes. As we consider today's generation of data, three decades later, federal records professionals are charged with managing petabytes and even zettabytes of data. It's an especially daunting task.

Three principles for records management are critical to future success for the federal space:

  1. Capture on creation;
  2. Manage and secure through the workflow; and
  3. Archive responsibly.

Point 1: Capture on Creation
The federal workforce creates content every second of every day. The content is created in formal and informal ways.  It's an email, a meeting maker, an instant message communication, a voice communication, a VTC session, PowerPoint deck, meeting minutes, collaborative engagement session, memorandum, written paper, analytic notes, and so forth.

The federal workforce stores this created content in just as many formal and informal ways.  It's stored on local hard drives, mobile phones, corporate storage, shadow IT storage, public clouds, and private clouds.

In short...it's a mess for the records management professional.

What is needed are solid systems and capabilities that demand capture at content creation.  Simple, non-intrusive ways to drive creators to label information will help tremendously.  Non-intrusive doesn't mean voluntary; labeling at content creation must be required and enforced.  Not everything is a record, but many things deserve to be preserved for after-action review, lessons learned, and knowledge management training over time.

Many of today's technologies make it far too easy to create content and far too difficult to manage it in perpetuity.  Content creation with longevity in mind is critical for the federal records management professional and for the federal government in general.

Implementing technologies that work together to achieve this longevity goal is paramount. No federal agency can survive on one tool; one tool rarely meets the variety of end-user needs and requirements. Agencies that discover and implement technologies with easy interfaces, open APIs, and purposeful data exchange standards will be most successful. Often this equates to open source tools, which are naturally built for easy expansion and integration with other tools.
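To illustrate what mandatory capture at creation could look like, here is a minimal sketch. Every name here (the `RecordLabel` fields, the `save_content` function) is hypothetical, not any specific product's API; the point is simply that a save routine can refuse to store content that lacks records metadata:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical records metadata that must accompany every piece of content.
@dataclass
class RecordLabel:
    creator: str
    classification: str      # e.g. "UNCLASSIFIED"
    retention_category: str  # e.g. "permanent", "temporary-7yr"

def save_content(body: str, label: Optional[RecordLabel]) -> dict:
    """Store content only if it carries a records label (labeling is not voluntary)."""
    if label is None:
        raise ValueError("Content cannot be saved without a records label.")
    return {
        "body": body,
        "creator": label.creator,
        "classification": label.classification,
        "retention_category": label.retention_category,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
```

The design choice worth noting is that the check lives in the storage layer, not in end-user training: the creator cannot opt out, which is exactly what "non-intrusive doesn't mean voluntary" implies.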

Point 2:  Manage and Secure Through the Workflow
Very little happens in the federal government without being attached to a workflow.

  • Employee time is a workflow that leads to paychecks.
  • Purchasing small and large goods is a workflow that leads to vendor payments and receipt of goods.
  • Asset management is a workflow from asset need to asset receipt to asset long-term disposition.
  • Analytic products are a workflow from inception to review to edit to publish.
  • Meetings are a workflow from establishment to agenda to minutes to action capture and tracking.
  • Federal budget creation is an uber-workflow spanning planning, programming, budgeting, and execution.
  • Grants management is a workflow from idea submission to review to approval to tracking progress.
  • Citizen services contain many workflows for social security payments, passport processing, visa approvals, small business loans, and so forth.

Introducing solid records management to these macro and micro workflow environments is necessary and important.

The federal government needs tools that understand these intricate workflow processes and seamlessly capture the changes, approvals, and actions throughout the entire process, from creation to retirement. A suite of tools, built on open platforms for easy data exchange, is likely to be required for any federal agency. Working through big ERP systems and small purpose-built systems alike, workflow foundations can capture the information necessary for approvals and for long-term retention.
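The idea of capturing every change and approval throughout a workflow can be sketched as an audit trail that grows with each state transition. This is a simplified illustration, assuming a hypothetical analytic-product workflow; the state names and record fields are my own, not any agency's:

```python
from datetime import datetime, timezone

# Allowed transitions for a simple analytic-product workflow:
# inception -> review -> edit -> review -> ... -> publish -> retirement.
TRANSITIONS = {
    "inception": {"review"},
    "review": {"edit", "publish"},
    "edit": {"review"},
    "publish": {"retirement"},
}

class WorkflowRecord:
    """Tracks a document through its workflow, capturing every transition for retention."""

    def __init__(self, doc_id: str):
        self.doc_id = doc_id
        self.state = "inception"
        self.audit_trail = []  # preserved for long-term records retention

    def transition(self, new_state: str, actor: str) -> None:
        # Reject transitions the workflow doesn't allow.
        if new_state not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"Illegal transition: {self.state} -> {new_state}")
        # Record who moved the document, from where, to where, and when.
        self.audit_trail.append({
            "from": self.state,
            "to": new_state,
            "actor": actor,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.state = new_state
```

Because every transition appends to the audit trail rather than overwriting state, the record of changes and approvals survives to retirement, which is the retention property the paragraph above calls for.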

Equally necessary are workflow tools that maintain data integrity, individual privacy, and agency security. The federal government demands absolute security in processing workflows, especially for citizen-facing services that span public and private information processing environments.  It's simply not enough to have workflow tools that are secure only within a private environment; federal agencies need confidence when exchanging data between a mobile citizen platform and a private agency platform.

Point 3:  Archive Responsibly
Fundamental to our form of government is trust.  Trust of our people is fundamental.  Trust by our federal workforce is fundamental. Trust in our records and information is equally fundamental. When the Administration or the Hill or the People want to know what we knew and when we knew it, federal agencies need to be at the ready to provide the truth - with facts and records to support the facts.

The Federal Government and its agencies aren't private institutions. Although there is information that we should not keep, federal agencies should continue to err on the side of caution and keep anything that seems worth keeping. We should be prepared to keep more information and more records than legally required to lend credibility and understanding of historical decisions and outcomes.

Again, we need tools and technologies that make responsible records management and archiving easier for everyone. The amount of resources the federal government spends on review and redaction of federal records is staggering. If technologies could cut those costs by just 10 percent, that would be awesome. Reaching 20 or 30 percent cost reductions would be phenomenal.

Key to reducing the manpower spent on archiving, review, and release is solid capture at the start. At the risk of creating a circular reference, I'll take you back to my initial point: capture on creation.

Summary

  • Federal agencies create more data and content than any of us cares to understand.
  • It's not all useful data, and finding our way through the mountains of data to know and keep what's important is a tough job.
  • Securing the data to prevent harmful use and unlawful disclosure needs to be easier for federal agencies.
  • Knowing when a leak is harmful also needs to be easier for federal agencies.
  • Responding to appropriate releases of information, whether through Freedom of Information Act requests or congressional inquiries, shouldn't be as hard as it is today.
  • Guaranteeing the safety and security of private citizen data isn't a desire...it's a demand.
  • The basic needs for federal agencies are:
    • Suites of tools that do a large amount of the content management;
    • Open interfaces and open source tools that allow affordable and extensible add-ons for special purposes;
    • Tools that facilitate reduced complexity for end users and IT departments; and
    • Tools that make a records management professional and an end user's job easier on a day-to-day basis.

More Stories By Jill Tummler Singer

Jill Tummler Singer is CIO for the National Reconnaissance Office (NRO), which, as part of the 16-member Intelligence Community, plays a primary role in achieving information superiority for the U.S. Government and Armed Forces. A DoD agency, the NRO is staffed by DoD and CIA personnel. It is funded through the National Reconnaissance Program, part of the National Foreign Intelligence Program.

Prior to joining the NRO, Singer was Deputy CIO at the Central Intelligence Agency (CIA), where she was responsible for ensuring CIA had the information, technology, and infrastructure necessary to effectively execute its missions. Prior to her appointment as Deputy CIO, she served as the Director of the Diplomatic Telecommunications Service (DTS), United States Department of State, and was responsible for global network services to US foreign missions.

Singer has served in several senior leadership positions within the Federal Government. She was the head of Systems Engineering, Architecture, and Planning for CIA's global infrastructure organization. She served as the Director of Architecture and Implementation for the Intelligence Community CIO and pioneered the technology and management concepts that are the basis for multi-agency secure collaboration. She also served within CIA’s Directorate of Science and Technology.
