By Jill Tummler Singer
February 3, 2014 08:15 AM EST
The below is a summary of my comments provided on Wednesday, January 29, 2014, at the Alfresco Content.Gov event in Washington, DC.
In my 27 years of federal service, I've watched the growth in federal records and the implementation of new executive orders and regulations aimed at improving records management across the federal space. There are immense challenges associated with litigation, review and release, tracing factual evidence for analysis, managing information for legal proceedings, and overseeing a plethora of authorized and unauthorized disclosures of classified and/or sensitive information.
Federal records management professionals are true, unsung heroes in helping our nation protect information while also protecting the civil liberties and privacy of our nation's citizens. The job has become increasingly difficult in today's era of "big data." Records management and information management in the 1980s was hard, and that was when we thought big data meant hundreds of gigabytes. As we consider today's generation of data, decades later, federal records professionals are charged with managing tens of thousands of gigabytes, even petabytes and zettabytes, of data. It's an especially daunting task.
Three principles for records management are critical to future success for the federal space:
- Capture on creation;
- Manage and secure through the workflow; and
- Archive responsibly.
Point 1: Capture on Creation
The federal workforce creates content every second of every day. The content is created in formal and informal ways. It's an email, a meeting maker, an instant message, a voice communication, a VTC session, a PowerPoint deck, meeting minutes, a collaborative engagement session, a memorandum, a written paper, analytic notes, and so forth.
The federal workforce stores this created content in just as many formal and informal ways. It's stored on local hard drives, mobile phones, corporate storage, shadow IT storage, public clouds, and private clouds.
In short...it's a mess for the records management professional.
What's needed are solid systems and capabilities that demand capture at content creation. Simple, non-intrusive ways to drive the creator to label information will help tremendously. Non-intrusive doesn't mean voluntary; labeling at content creation needs to be required and enforced. Not everything is a record, but many things deserve to be preserved for after-action review, lessons learned, and knowledge management and training over time.
Many of today's technologies make it far too easy to create content and far too difficult to manage it in perpetuity. Content creation with longevity in mind is critical for the federal records management professional and for the federal government in general.
Implementing technologies that work together to achieve the longevity goal is paramount. No federal agency can survive on one tool; one tool rarely meets the variety of end-user needs or requirements. Agencies that discover and implement technologies with easy interfaces, open APIs, and purposeful data exchange standards will be most successful. Often this equates to open source tools, which are naturally built for easy expansion and integration with other tools.
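As a minimal sketch only (Python, with hypothetical field and schedule names), here is one way a capture service could refuse to store newly created content until the creator supplies a small set of records-management labels. It illustrates the "required, not voluntary" idea, not any particular agency system or product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical labels a creator must supply before content is accepted.
REQUIRED_LABELS = {"originator", "record_category", "disposition_schedule"}


@dataclass
class ContentItem:
    body: bytes
    labels: dict = field(default_factory=dict)
    captured_at: str = ""


class CaptureService:
    """Illustrative capture-on-creation gate: storage is refused until labeling is complete."""

    def __init__(self):
        self._store = []

    def capture(self, item: ContentItem) -> None:
        missing = REQUIRED_LABELS - item.labels.keys()
        if missing:
            # Non-intrusive does not mean voluntary: capture is blocked until labels exist.
            raise ValueError(f"content rejected, missing labels: {sorted(missing)}")
        item.captured_at = datetime.now(timezone.utc).isoformat()
        self._store.append(item)


if __name__ == "__main__":
    svc = CaptureService()
    draft = ContentItem(body=b"meeting minutes...", labels={"originator": "j.smith"})
    try:
        svc.capture(draft)          # rejected: category and schedule are missing
    except ValueError as err:
        print(err)
    draft.labels.update({"record_category": "meeting-minutes",
                         "disposition_schedule": "placeholder-schedule"})
    svc.capture(draft)              # accepted once labeling is complete
```

The point of the sketch is simply that the gate sits in the creation path itself, so the labeling burden is a few seconds at creation rather than an archaeology project years later.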
Point 2: Manage and Secure Through the Workflow
Very little happens in the federal government without being attached to a workflow.
- Employee time is a workflow that leads to paychecks.
- Purchasing small and large goods is a workflow that leads to vendor payments and receipt of goods.
- Asset management is a workflow from asset need to asset receipt to asset long-term disposition.
- Analytic products are a workflow from inception to review to edit to publish.
- Meetings are a workflow from establishment to agenda to minutes to action capture and tracking.
- Federal budget creation is an uber-workflow spanning planning, programming, budgeting, and execution.
- Grants management is a workflow from idea submission to review to approval to tracking progress.
- Citizen services contain many workflows for social security payments, passport processing, visa approvals, small business loans, and so forth.
Introducing solid records management to these macro and micro workflow environments is necessary and important.
The federal government needs tools that understand intricate workflow processes and seamlessly capture the changes, approvals, and actions for the workflow throughout the entire process, from creation to retirement. A suite of tools, built on open platforms for easy data exchange, is likely to be required for any federal agency. Working through big ERP systems and through small purpose-built systems, workflow foundations can capture the information necessary for approvals and for long-term retention.
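To make that concrete, the sketch below (Python, hypothetical names throughout) shows one way a workflow engine could append every submission, review, and approval to an audit trail as the work happens, so the record exists the moment a workflow step completes rather than being reconstructed afterward.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List


@dataclass(frozen=True)
class WorkflowEvent:
    """One immutable entry in the audit trail for a workflow instance."""
    workflow_id: str
    actor: str
    action: str          # e.g. "submitted", "reviewed", "approved"
    timestamp: str


class AuditedWorkflow:
    """Every transition is appended to a trail that can be retained long after the workflow closes."""

    def __init__(self, workflow_id: str):
        self.workflow_id = workflow_id
        self.trail: List[WorkflowEvent] = []

    def record(self, actor: str, action: str) -> None:
        self.trail.append(WorkflowEvent(
            workflow_id=self.workflow_id,
            actor=actor,
            action=action,
            timestamp=datetime.now(timezone.utc).isoformat(),
        ))


if __name__ == "__main__":
    wf = AuditedWorkflow("grant-2014-0042")   # hypothetical identifier
    wf.record("applicant", "submitted")
    wf.record("program_office", "reviewed")
    wf.record("approving_official", "approved")
    for event in wf.trail:
        print(event)
```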
Equally necessary are workflow tools that maintain data integrity, individual privacy, and agency security. The Federal Government demands absolute security in processing workflows, especially for citizen-based services that span public and private information processing environments. It's simply not enough to have workflow tools that are fundamentally secure in a private environment. Federal agencies need confidence when exchanging data from a mobile, citizen-facing platform to a private, agency platform.
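On the integrity point, here is a toy illustration (Python standard library, placeholder key, no transport or key management shown) of how an agency platform could verify that a payload handed over from a citizen-facing mobile platform was not altered in transit. Real exchanges would rely on the agency's approved cryptographic and identity services.

```python
import hmac
import hashlib
import json

# Placeholder key for illustration only; a real system would use managed, rotated keys.
SHARED_KEY = b"replace-with-managed-key"


def sign_payload(payload: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the receiver can detect tampering in transit."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "tag": tag}


def verify_payload(message: dict) -> bool:
    """Recompute the tag on the agency side and compare in constant time."""
    expected = hmac.new(SHARED_KEY, message["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])


if __name__ == "__main__":
    msg = sign_payload({"applicant_id": "A-12345", "service": "passport-renewal"})
    print(verify_payload(msg))                      # True: payload intact
    msg["body"] = msg["body"].replace("passport", "visa")
    print(verify_payload(msg))                      # False: tampering detected
```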
Point 3: Archive Responsibly
Fundamental to our form of government is trust. Trust of our people is fundamental. Trust by our federal workforce is fundamental. Trust in our records and information is equally fundamental. When the Administration or the Hill or the People want to know what we knew and when we knew it, federal agencies need to be at the ready to provide the truth - with facts and records to support the facts.
The Federal Government and its agencies aren't private institutions. Although there is information that we should not keep, federal agencies should continue to err on the side of caution and keep anything that seems worth keeping. We should be prepared to keep more information and more records than legally required to lend credibility and understanding of historical decisions and outcomes.
Again, we need tools and technologies that make responsible records management and archival easier for everyone. The amount of resources spent by the federal government on review and redaction of federal records is staggering. If we had technologies that cut those resources by just 10 percent, that would be awesome. Reaching 20 or 30 percent cost reductions would be phenomenal.
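As a purely hypothetical example of where some of that savings might come from, the sketch below pre-redacts obvious identifiers (Social Security number and email patterns) before a document ever reaches a human reviewer. It is a toy pattern matcher, not a substitute for the review and release process.

```python
import re

# Illustrative patterns only; a production system would use far richer detection.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}


def pre_redact(text: str) -> str:
    """Replace obvious identifiers with labeled redaction markers before human review."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text


if __name__ == "__main__":
    sample = "Applicant SSN 123-45-6789, contact jane.doe@example.gov."
    print(pre_redact(sample))
    # Applicant SSN [REDACTED-SSN], contact [REDACTED-EMAIL].
```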
Key to reducing the manpower spent on archival, review, and release is solid capture at the start. At the risk of creating a circular reference, I'll take you back to my initial point: capture on creation.
- Federal agencies create more data and content than any of us cares to understand.
- It's not all useful data, and finding our way through the mountains of data to know and keep what's important is a tough job.
- Securing the data to prevent harmful use and unlawful disclosure needs to be easier for federal agencies.
- Knowing when a leak is harmful also needs to be easier for federal agencies.
- Responding to appropriate releases of information, whether through Freedom of Information Act requests or congressional inquiries, shouldn't be as hard as it is today.
- Guaranteeing the safety and security of private citizen data isn't a desire...it's a demand.
- The basic needs for federal agencies are:
- Suites of tools that do a large amount of the content management;
- Open interfaces and open source tools that allow affordable and extensible add-ons for special purposes (see the sketch after this list);
- Tools that facilitate reduced complexity for end users and IT departments; and
- Tools that make a records management professional's and an end user's job easier on a day-to-day basis.
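To illustrate the open-interface point from the list above (all names hypothetical), here is a minimal sketch of a plugin registry through which special-purpose add-ons, such as a retention calculator, could be attached to a core records platform without modifying it.

```python
from typing import Callable, Dict

# Registry of add-ons keyed by name; each add-on receives a record's metadata dict.
_ADDONS: Dict[str, Callable[[dict], dict]] = {}


def register_addon(name: str):
    """Decorator that lets special-purpose tools plug into the core platform."""
    def wrapper(func: Callable[[dict], dict]) -> Callable[[dict], dict]:
        _ADDONS[name] = func
        return func
    return wrapper


@register_addon("retention")
def retention_addon(record: dict) -> dict:
    # Hypothetical rule: meeting minutes kept three years, everything else flagged for review.
    record["retention_years"] = 3 if record.get("category") == "meeting-minutes" else None
    return record


def process(record: dict) -> dict:
    """Run every registered add-on over a record as it moves through the platform."""
    for addon in _ADDONS.values():
        record = addon(record)
    return record


if __name__ == "__main__":
    print(process({"title": "Weekly staff meeting", "category": "meeting-minutes"}))
```

Sketches like this are only illustrations, but they point at the kind of open, extensible tooling that would make day-to-day federal records management sustainable for both the records professional and the end user.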