External Authentication and FIPS Compliance with Hybrid Data Pipeline

New security enhancements to Hybrid Data Pipeline include external authentication via LDAP, OAuth, Okta and more, plus FIPS support for federal compliance.

Hybrid Data Pipeline, the groundbreaking data access service from DataDirect, recently added several new features to meet market demand and stay on the cutting edge of data services. Security requirements are more demanding than ever, and Hybrid Data Pipeline continues to be at the forefront of data security.

What is Hybrid Data Pipeline?

Hybrid Data Pipeline is a lightweight, embeddable data access service that simplifies integration by connecting directly to the data. Applications can then use SQL or OData for real-time access to on-premises and cloud data, sparing developers from having to build ETL processes. Connecting directly to the data in real time is more agile than setting up a middle tier, and for many use cases it is a better fit than ETL.
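
For example, once a data source is exposed through Hybrid Data Pipeline's OData endpoint, any HTTP client can query it. The minimal Java sketch below issues an OData request using the JDK's built-in HttpClient; the server host, data source name (SalesDB), entity set (Customers) and credentials are hypothetical placeholders, not values taken from the product.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Base64;

    public class ODataQuery {
        public static void main(String[] args) throws Exception {
            // Hypothetical Hybrid Data Pipeline OData endpoint; substitute your
            // own server host, data source name, and entity set.
            String url = "https://hdp.example.com/api/odata4/SalesDB/Customers"
                    + "?$select=Name,City&$top=10";

            // Basic authentication against the service (credentials are placeholders).
            String auth = Base64.getEncoder()
                    .encodeToString("user:password".getBytes());

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Authorization", "Basic " + auth)
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            // The response body is a standard OData JSON payload.
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        }
    }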

What's New in Hybrid Data Pipeline?

  • External Authentication Support: In addition to its internal authentication, Hybrid Data Pipeline now supports external authentication methods, such as LDAP, OAuth and Okta, via a Java plugin. External authentication lets administrators call their existing authentication systems through APIs for an added layer of security, and users can write Java code to handle authentication in whatever way best fits their environment. This also adds flexibility: administrators can map multiple externally authenticated users to a single Hybrid Data Pipeline user to control data source access more easily. A minimal plugin sketch follows this list.
  • FIPS Compliance: Hybrid Data Pipeline Server can now be configured to run in a FIPS 140-2 compliant mode. FIPS, the Federal Information Processing Standard, is a cryptography standard that defines security requirements for both hardware and software. Why is FIPS important? Compliance means that software has met the security standards for deployment by U.S. federal agencies and federal contractors. FIPS is also an established security standard industry-wide, since it is accredited by both the U.S. and Canadian governments.
  • FedRAMP Account Lockout Policy: Hybrid Data Pipeline supports an account lockout policy that limits the number of consecutive failed authentication attempts permitted before a user account is locked. The user cannot authenticate again until a configurable period of time has passed or an administrator unlocks the account. The lockout policy is enabled by default, in accordance with Federal Risk and Authorization Management Program (FedRAMP) low- and medium-risk guidelines. Together with FIPS compliance, this makes Hybrid Data Pipeline easy to adopt for federal customers. A simplified illustration of the lockout mechanism follows the plugin sketch below.
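
To make the plugin model concrete, here is a minimal sketch of what an LDAP-backed authentication plugin might look like. The AuthPlugin interface is a hypothetical stand-in; the actual plugin contract (names, signatures, packaging) is defined by the Hybrid Data Pipeline documentation. The LDAP bind itself uses the JDK's standard JNDI API.

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.NamingException;
    import javax.naming.directory.InitialDirContext;

    // Hypothetical plugin interface: the real contract is defined by the
    // Hybrid Data Pipeline SDK and may differ in name and signature.
    interface AuthPlugin {
        boolean authenticate(String username, String password);
    }

    public class LdapAuthPlugin implements AuthPlugin {
        // Placeholder directory URL; substitute your own LDAP server.
        private static final String LDAP_URL = "ldap://ldap.example.com:389";

        @Override
        public boolean authenticate(String username, String password) {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, LDAP_URL);
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            // Bind as the user; a successful bind proves the credentials are valid.
            // The DN pattern below is a placeholder for your directory layout.
            env.put(Context.SECURITY_PRINCIPAL,
                    "uid=" + username + ",ou=people,dc=example,dc=com");
            env.put(Context.SECURITY_CREDENTIALS, password);
            try {
                new InitialDirContext(env).close();
                return true;  // bind succeeded: accept the login
            } catch (NamingException e) {
                return false; // bind failed: reject the login
            }
        }
    }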
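
The lockout mechanism itself is easy to picture. The sketch below is not Hybrid Data Pipeline code, only an illustration of the general policy it describes: count consecutive failures per user, lock the account at a threshold, and unlock after a configurable period or on administrator action. The threshold and lock period are placeholder values.

    import java.time.Duration;
    import java.time.Instant;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Illustration only: tracks consecutive failed logins per user and
    // locks the account once a threshold is reached.
    public class LockoutPolicy {
        private static final int MAX_FAILURES = 3;                    // placeholder
        private static final Duration LOCK_PERIOD = Duration.ofMinutes(30); // placeholder

        private final Map<String, Integer> failures = new ConcurrentHashMap<>();
        private final Map<String, Instant> lockedUntil = new ConcurrentHashMap<>();

        public boolean isLocked(String user) {
            Instant until = lockedUntil.get(user);
            return until != null && Instant.now().isBefore(until);
        }

        public void recordFailure(String user) {
            int count = failures.merge(user, 1, Integer::sum);
            if (count >= MAX_FAILURES) {
                // Lock the account; it unlocks automatically after LOCK_PERIOD,
                // or an administrator can clear it earlier via unlock().
                lockedUntil.put(user, Instant.now().plus(LOCK_PERIOD));
                failures.remove(user);
            }
        }

        public void recordSuccess(String user) { failures.remove(user); }

        public void unlock(String user) { lockedUntil.remove(user); }
    }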

Security Policy

Progress DataDirect is committed to providing secure data access to its customers. Upon identification of any security vulnerability that would impact one or more Progress product(s), Progress will exercise commercially reasonable efforts to address the vulnerability in accordance with the following guidelines:

Security Vulnerability Response Policy

PRIORITY* | TIME GUIDELINE | VERSION(S)
High Risk (CVSS 8+ or industry equivalent) | 30 days | Active (i.e. latest shipping version) and all Supported versions
Medium Risk (CVSS 5 to 8 or industry equivalent) | 180 days | Active (i.e. latest shipping version)
Low Risk (CVSS 0 to 5 or industry equivalent) | Next major release or best effort | Active (i.e. latest shipping version)

* Priority is established based on the current version of the Common Vulnerability Scoring System (CVSS), an open industry standard for assessing the severity of computer system security vulnerabilities. For additional information, refer to the official CVSS documentation.

How are Companies Using Hybrid Data Pipeline?

Progress partners are using Hybrid Data Pipeline to access data in the cloud or on-premises behind a firewall. In one example, a partner exposes standard SQL/REST interfaces over multiple data sources; the new release lets them leverage their existing LDAP security while continuing to access data across those sources. Another partner scenario involves a financial company with strict data governance requirements managed via OAuth.

Support for external authentication in the latest Hybrid Data Pipeline release enables both of these companies to access their data with minimal additional security work, while also delivering compliance with federal standards.

Learn More

To learn more about the latest innovations in enterprise security, join our webinar on Enterprise Security in Data Access, or get started with Hybrid Data Pipeline today.

Join Security Webinar

Try Hybrid Data Pipeline
