By Jill Tummler Singer
February 3, 2014 08:15 AM EST
The following is a summary of my comments provided on Wednesday, January 29, 2014, at the Alfresco Content.Gov event in Washington, DC.
In my 27 years of federal service, I've watched the growth in federal records and the implementation of new executive orders and regulations aimed at improving records management across the federal space. There are immense challenges associated with litigation, review and release, tracing factual evidence for analysis, managing information for legal proceedings, and overseeing a plethora of authorized and unauthorized disclosures of classified and/or sensitive information.
Federal records management professionals are true, unsung heroes in helping our nation protect information while also protecting the civil liberties and privacy of our nation's citizens. The job has become increasingly difficult in today's era of "big data." Records management and information management in the 1980s were hard, and that's when we thought big data was hundreds of gigabytes. As we consider today's generation of data, three decades later, federal records professionals are charged with managing petabytes, even zettabytes, of data. It's an especially daunting task.
Three principles for records management are critical to future success for the federal space:
- Capture on creation;
- Manage and secure through the workflow; and
- Archive responsibly.
Point 1: Capture on Creation
The federal workforce creates content every second of every day. The content is created in formal and informal ways. It's an email, a meeting invitation, an instant message, a voice communication, a VTC session, a PowerPoint deck, meeting minutes, a collaborative engagement session, a memorandum, a written paper, analytic notes, and so forth.
The federal workforce stores this created content in just as many formal and informal ways. It's stored on local hard drives, mobile phones, corporate storage, shadow IT storage, public clouds, and private clouds.
In short...it's a mess for the records management professional.
What is needed are solid systems and capabilities that demand capture at content creation. Simple, non-intrusive ways to drive the creator to label information will help tremendously. Non-intrusive doesn't mean voluntary; labeling at content creation needs to be enforced. Not everything is a record, but many things deserve to be preserved for after-action review, lessons learned, and knowledge management over time.
Many of today's technologies make it far too easy to create content and far too difficult to manage it in perpetuity. Content creation with longevity in mind is critical for the federal records management professional and for the federal government in general.
Implementing technologies that work together to achieve the longevity goal is paramount. No federal agency can survive on one tool; one tool rarely meets the variety of end-user needs or requirements. Agencies that discover and implement technologies with easy interfaces, open APIs, and purposeful data exchange standards will be most successful. Often this equates to open source tools, which are naturally built for easy expansion and integration with other tools.
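To make the first principle concrete, here is a minimal sketch, in Python, of what "capture on creation" could look like in software: content is refused unless the creator supplies a valid label, and provenance metadata is recorded at the moment of creation. The label set, field names, and `capture` function are hypothetical illustrations, not any particular product's API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import hashlib

# Hypothetical label vocabulary the creator must choose from.
LABELS = {"record", "working-paper", "transitory"}

@dataclass
class CapturedContent:
    """Content plus the metadata captured at the moment of creation."""
    body: bytes
    creator: str
    label: str
    created_at: str = ""
    sha256: str = ""

def capture(body: bytes, creator: str, label: str) -> CapturedContent:
    """Refuse to store content unless the creator supplies a valid label."""
    if label not in LABELS:
        raise ValueError(f"unlabeled content rejected; choose one of {sorted(LABELS)}")
    return CapturedContent(
        body=body,
        creator=creator,
        label=label,
        created_at=datetime.now(timezone.utc).isoformat(),
        sha256=hashlib.sha256(body).hexdigest(),  # integrity fingerprint for later audit
    )

doc = capture(b"Meeting minutes, Jan 29", "jsinger", "record")
```

The design choice worth noticing is that the labeling step is enforced by the system rather than left voluntary: unlabeled content simply cannot be saved.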
Point 2: Manage and Secure Through the Workflow
Very little happens in the federal government without being attached to a workflow.
- Employee time is a workflow that leads to paychecks.
- Purchasing small and large goods is a workflow that leads to vendor payments and receipt of goods.
- Asset management is a workflow from asset need to asset receipt to asset long-term disposition.
- Analytic products are a workflow from inception to review to edit to publish.
- Meetings are a workflow from establishment to agenda to minutes to action capture and tracking.
- Federal budget creation is an uber-workflow spanning planning, programming, budgeting, and execution.
- Grants management is a workflow from idea submission to review to approval to tracking progress.
- Citizen services contain many workflows for social security payments, passport processing, visa approvals, small business loans, and so forth.
Introducing solid records management to these macro and micro workflow environments is necessary and important.
The federal government needs tools that understand intricate workflow processes and seamlessly capture the changes, approvals, and actions throughout the entire process, from creation to retirement. A suite of tools, built on open platforms for easy data exchange, is likely to be required for any federal agency. Working through big ERP systems and through small purpose-built systems, workflow foundations can capture the information necessary for approvals and for long-term retention.
Equally necessary are workflow tools that maintain data integrity, individual privacy, and agency security. The federal government demands absolute security in processing workflows, especially for citizen-facing services that span public and private information processing environments. It's simply not enough to have workflow tools that are fundamentally secure in a private environment; federal agencies need confidence when exchanging data from a mobile, citizen platform to a private, agency platform.
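As a sketch of the "manage through the workflow" principle, the following Python toy models the analytic-product example above (inception to review to edit to publish) as a small state machine in which every transition is appended to an audit log. The state names and fields are hypothetical; the point is that the record of changes and approvals is captured by the workflow itself, not reconstructed afterward.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical legal transitions for an analytic product.
TRANSITIONS = {
    "inception": {"review"},
    "review": {"edit", "publish"},
    "edit": {"review"},
    "publish": set(),  # terminal state
}

@dataclass
class Workflow:
    state: str = "inception"
    audit_log: list = field(default_factory=list)  # every change captured, never discarded

    def advance(self, new_state: str, actor: str) -> None:
        """Move to a new state, recording who did it and when."""
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.audit_log.append({
            "from": self.state,
            "to": new_state,
            "actor": actor,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.state = new_state

wf = Workflow()
wf.advance("review", "analyst")
wf.advance("publish", "editor")
```

Because illegal transitions are rejected and every legal one is logged, the audit trail doubles as the long-term record of approvals that a records manager would later need.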
Point 3: Archive Responsibly
Fundamental to our form of government is trust. Trust of our people is fundamental. Trust by our federal workforce is fundamental. Trust in our records and information is equally fundamental. When the Administration, the Hill, or the People want to know what we knew and when we knew it, federal agencies need to be at the ready to provide the truth, with facts and records to support the facts.
The federal government and its agencies aren't private institutions. Although there is some information we should not keep, federal agencies should err on the side of caution and retain anything that seems worth keeping. We should be prepared to keep more information and more records than legally required, to lend credibility and understanding to historical decisions and outcomes.
Again, we need tools and technologies that make responsible records management and archiving easier for everyone. The amount of resources the federal government spends on review and redaction of federal records is staggering. If technology could cut those costs by just 10 percent, that would be awesome. Reaching 20 or 30 percent cost reductions would be phenomenal.
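As an illustration of the kind of technology that could trim review-and-redaction effort, here is a deliberately simple Python sketch that masks a couple of sensitive patterns before human review. Real redaction requires far more than regular expressions; the patterns and placeholder format below are toy assumptions, not a production approach.

```python
import re

# Toy patterns for illustration only; real systems need much richer detection.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace each matched sensitive pattern with a labeled placeholder."""
    for name, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {name}]", text)
    return text

masked = redact("Contact jdoe@agency.gov; SSN 123-45-6789.")
```

Even a crude first pass like this could let human reviewers focus on the judgment calls, which is where the 10-to-30-percent savings would have to come from.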
Key to reducing the manpower spent on archival, review, and release is solid capture at the start. At the risk of creating a circular reference, I'll take you back to my initial point: capture on creation.
- Federal agencies create more data and content than any of us cares to understand.
- It's not all useful data, and finding our way through the mountains of data to know and keep what's important is a tough job.
- Securing the data to prevent harmful use and unlawful disclosure needs to be easier for federal agencies.
- Knowing when a leak is harmful also needs to be easier for federal agencies.
- Responding to appropriate releases of information, whether through Freedom of Information Act requests or congressional inquiries, shouldn't be as hard as it is today.
- Guaranteeing the safety and security of private citizen data isn't a desire...it's a demand.
- The basic needs for federal agencies are:
- Suites of tools that do a large amount of the content management;
- Open interfaces and open source tools that allow affordable and extensible add-ons for special purposes;
- Tools that facilitate reduced complexity for end users and IT departments; and
- Tools that make a records management professional and an end user's job easier on a day-to-day basis.