By Gilad Parann-Nissany
September 5, 2013 11:00 AM EDT
The HIPAA Omnibus Final Rule went into effect on March 26, 2013, and the deadline for complying with the new rules is September 23, 2013. Companies operating under existing business associate agreements (BAAs) may be allowed an extension until September 23, 2014.
As healthcare and patient data move to the cloud, HIPAA compliance issues follow. With many vendors, consultants, and internal and external IT departments at work, the question of who is responsible for compliance comes up often. Not every organization is equipped or experienced enough to meet the HIPAA compliance rules on its own. Given the nature of the data and patients' privacy rights, it is important to secure the data correctly the first time.
HIPAA and the Cloud
Do you have to build your own cloud HIPAA compliance solutions from scratch? The short answer is no. There are solutions and consulting companies available to help move patient data to the cloud as well as secure it following HIPAA compliance rules and best practices.
The following checklist provides a guide to help plan for meeting the new HIPAA compliance rules.
A Cloud HIPAA Compliance Checklist
1. Ensure “Business Associates” are HIPAA compliant
- Data Centers and cloud providers that serve the healthcare industry are in the category of “business associates.”
- Business Associates can also be any entity that “…creates, receives, maintains, or transmits protected health information (PHI) on behalf of a covered entity.” This means document storage companies and cloud providers now officially have to follow HIPAA rules as well.
- Subcontractors are also considered business associates if they create, receive, transmit, or maintain Protected Health Information (PHI) on behalf of another business associate.
- As business associates, they must meet the compliance rules for all privacy and security requirements.
What can you do?
Ensure business associates and subcontractors sign a business associate agreement and follow the HIPAA compliance rules for themselves and any of their subcontractors. A sample Business Associate Agreement is available on the HHS.gov website.
What happens if you are in violation?
The Office for Civil Rights (OCR) investigates HIPAA violations and can impose penalties of $100 to $50,000 per violation, capped at $1.5 million per year for identical violations. The penalties are harsh to help ensure that data is safe and companies are following the HIPAA rules.
2. Data Backup
- Health care providers, business associates, and subcontractors must have a backup contingency plan.
- Requirements state that the contingency plan must include a data backup plan, a disaster recovery plan, and an emergency mode operations plan
- The backup vendor needs to encrypt backup images in transit to its off-site data centers, so that the data cannot be read without the encryption key
- The end user/partner is required to encrypt the source data to meet HIPAA compliance
What can you do?
If you handle the data backup internally, set a plan to meet HIPAA compliance and execute it.
If you have external backup solution providers, ensure they have a working plan in place.
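The encrypt-before-it-leaves requirement above can be sketched in a few lines. This is a minimal illustration using the third-party `cryptography` package and AES-256-GCM; the function names and the sample payload are hypothetical, and a real backup pipeline would add key rotation, streaming for large images, and integrity checks on restore.

```python
# Sketch: client-side encryption of a backup image before it is sent
# off-site, so the stored copy is unreadable without the key.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a backup blob with AES-256-GCM; prepend the nonce."""
    nonce = os.urandom(12)                      # unique per encryption
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_backup(blob: bytes, key: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)       # keep OUT of the backup set
backup = encrypt_backup(b"patient-records.tar", key)
```

The key itself must never travel with the backup; where it lives is exactly the key-management question raised under Technical Safeguards below.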
3. Security Rules
- Physical safeguards need to be implemented to secure the facility, like access controls for the facility
- Develop procedures to address and respond to security breaches
- There are also 18 additional technical security standards and 36 implementation specifications
What can you do?
Put a plan in place to protect data from internal and external threats as well as limiting access to only those that require it.
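Limiting access "to only those that require it" is, in practice, a least-privilege access model. A minimal sketch, with illustrative roles and permissions not taken from the rule text:

```python
# Minimal least-privilege access check for PHI. Each role is granted
# only the permissions it needs; anything not listed is denied.
ROLE_PERMISSIONS = {
    "physician":  {"phi:read", "phi:write"},
    "billing":    {"phi:read"},
    "it_support": set(),                    # no PHI access by default
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: unknown roles and unlisted permissions fail."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert can_access("physician", "phi:write")
assert not can_access("it_support", "phi:read")
```

The deny-by-default shape matters more than the specific roles: adding access should require an explicit grant, never the absence of a rule.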
4. Technical Safeguards
Health care providers, business associates, and subcontractors must implement technical safeguards. While many technical safeguards are "addressable" rather than strictly required, they mitigate your risk in case of a breach. In particular, encrypting sensitive data allows you to claim "safe harbor" in the event of a breach.
- Study encryption and decryption of electronically protected health information
- Use AES encryption for data “at rest” in the cloud
- Use strong, well-protected encryption key management; this is the most sensitive and difficult item on this list. Consider using split-key cloud encryption or homomorphic key management
- Transmission of data must be secured: use SSL/TLS or IPSec
- When data is deleted in the cloud, any mirrored copies of that data must be deleted as well
- Limit access to electronically protected health information
- Implement audit controls and procedures that record and analyze activity in information systems containing electronically protected health information
- Implement technical security measures such as strong authentication and authorization, guarding against unauthorized access to electronically protected information transmitted over electronic communication networks
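The split-key idea mentioned in the key-management bullet can be illustrated concretely: the data-encryption key is never stored whole. One share stays with the customer and one with the provider, and the key exists only transiently when both are combined. This is a pure-stdlib sketch of the concept; a production scheme involves far more (hardware security modules, share rotation, and so on).

```python
# Split-key sketch: neither share alone reveals anything about the key.
import secrets

def split_key(key: bytes) -> tuple:
    """Split a key into two XOR shares, to be stored separately."""
    share1 = secrets.token_bytes(len(key))           # uniformly random
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def join_key(share1: bytes, share2: bytes) -> bytes:
    """Recombine the shares; the whole key exists only at this moment."""
    return bytes(a ^ b for a, b in zip(share1, share2))

master = secrets.token_bytes(32)                     # 256-bit AES key
s1, s2 = split_key(master)
assert join_key(s1, s2) == master                    # both shares needed
```

Because each share is indistinguishable from random bytes, compromising either the customer's store or the provider's store alone yields nothing.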
What can you do?
Adopt strong encryption technology and develop a plan to ensure data is transmitted, stored, and deleted securely. Develop a plan to monitor data access and control access.
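The monitoring half of that plan reduces to the audit-controls bullet above: every PHI access is recorded with who, what, and when. A minimal sketch with illustrative names (the record lookup is a stand-in):

```python
# Sketch of an audit trail for PHI access: each read emits a structured
# log line that a review process can later analyze.
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("phi.audit")

def read_record(user: str, patient_id: str) -> dict:
    """Fetch a patient record, logging the access first."""
    audit_log.info("PHI_READ user=%s patient=%s at=%s",
                   user, patient_id,
                   datetime.now(timezone.utc).isoformat())
    return {"patient_id": patient_id}    # stand-in for a real lookup

record = read_record("dr_smith", "P-1001")
```

In a real system the log sink would be append-only and tamper-evident, since the audit trail is itself evidence during an investigation.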
5. Administrative Safeguards
For organizations to meet HIPAA compliance they must have HIPAA Administrative Safeguards in place to “prevent, detect, contain and correct security violations.” Policies and procedures are required to deal with: risk analysis, risk management, workforce sanctions for non-compliance, and a review of records.
- Assign a privacy officer for developing and implementing HIPAA policies and procedures
- Ensure that business associates also have a privacy officer since they are also liable for complying with the Security Rule
- Implement a set of privacy procedures to meet compliance in four areas:
Risk Analysis
“Conduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information held by the covered entity”
Risk Management
“Implement security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level to comply with §164.306(a).”
Workforce Sanctions for Non-Compliance
“Apply appropriate sanctions against workforce members who fail to comply with the security policies and procedures of the covered entity.”
Review of Records
“Implement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports.”
- Provide ongoing administrative employee training on Protected Health Information (PHI)
- Implement a procedure and plan for internal HIPAA compliance audits
What can you do?
Develop an internal plan to meet HIPAA compliance and have a privacy officer to implement requirements. Ensure that policies and procedures deal with analysis of risk, management of risk, policy violations, and sanctions for staff or contractors in violation of the policy. Develop and maintain documentation for internal policies to meet HIPAA compliance as it will help define those policies to your organization and could assist during a HIPAA audit.
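The "review of records" requirement quoted above can be partly automated. As a sketch, assuming a simple `(user, ISO-timestamp)` log format of my own invention, this flags PHI access outside business hours for a reviewer to follow up on:

```python
# Sketch: scan an audit log and flag access outside business hours.
from datetime import datetime

def flag_after_hours(entries, start_hour=7, end_hour=19):
    """Return (user, timestamp) entries outside start..end local hours."""
    flagged = []
    for user, ts in entries:
        hour = datetime.fromisoformat(ts).hour
        if not (start_hour <= hour < end_hour):
            flagged.append((user, ts))
    return flagged

log = [("dr_smith", "2013-09-05T14:30:00"),
       ("temp_admin", "2013-09-05T02:15:00")]
assert flag_after_hours(log) == [("temp_admin", "2013-09-05T02:15:00")]
```

After-hours access is only one heuristic; a real review would also look at unusual volumes, unfamiliar accounts, and failed-login patterns.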
The post The HIPAA Final Rule and Staying Compliant in the Cloud appeared first on Porticor Cloud Security.