
Should Quantum Computers Cracking Encryption Matter to You?

If you are using encryption, use the strongest, most well-vetted techniques available to protect sensitive data

Many news organizations, including The Washington Post, are reporting that the latest documents leaked by former NSA contractor turned whistleblower Edward Snowden show the NSA is in the early stages of building a quantum computer that could possibly crack most types of encryption. The NSA does disclose on its website that it is working on quantum computing technology; however, the status of that research was previously unknown. According to the new documents, the agency is pursuing a "cryptologically useful quantum computer" as part of a research program called "Penetrating Hard Targets," with the goal of using it to crack encryption.

With headlines that scream, "NSA Secretly Funding Code-Breaking Quantum Computer Research," it's easy to see why many executives and enterprises are anxious and perhaps starting to lose faith in internet communications and transactions. Encryption is used to protect medical, banking, business and government records around the world. But, as many of the articles in the media point out, the reality is that quantum computing remains a theoretical research topic, many years away from being a usable real-world technology. The Washington Post article quotes Scott Aaronson, an associate professor of electrical engineering and computer science at the Massachusetts Institute of Technology: "It seems improbable that the NSA could be that far ahead of the open world without anybody knowing it."

As cryptography expert Bruce Schneier said in a piece in USA Today, "I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks. Those are where the real vulnerabilities are, and where the NSA spends the bulk of its efforts."

Mr. Schneier's comments reaffirm the importance of never locking in, intractably, to a single encryption algorithm or technique. If an organization loses faith in the integrity of a specific algorithm after that algorithm has become core to many or all of the systems it runs, it will be in a very difficult position. The systems used to protect information, like cloud encryption gateways, need to be flexible enough to do their job regardless of which encryption algorithms are in use. This design approach gives organizations the flexibility to swap algorithms in and out over time, based on their preference, without impacting the core capabilities of the solutions built on these encryption modules.
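To make the idea concrete, below is a minimal sketch in Python of what such a pluggable design can look like. It assumes the open-source cryptography package, and the class and registry names (CipherModule, MODULES) are purely illustrative; this is not a description of any vendor's gateway.

from abc import ABC, abstractmethod
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


class CipherModule(ABC):
    """Interface the rest of the system depends on; callers never name an algorithm."""

    @abstractmethod
    def encrypt(self, plaintext: bytes) -> bytes: ...

    @abstractmethod
    def decrypt(self, ciphertext: bytes) -> bytes: ...


class FernetModule(CipherModule):
    """AES-128-CBC plus HMAC-SHA256, via the cryptography package's Fernet recipe."""

    def __init__(self) -> None:
        self._fernet = Fernet(Fernet.generate_key())

    def encrypt(self, plaintext: bytes) -> bytes:
        return self._fernet.encrypt(plaintext)

    def decrypt(self, ciphertext: bytes) -> bytes:
        return self._fernet.decrypt(ciphertext)


class AesGcmModule(CipherModule):
    """AES-256-GCM, an authenticated mode standardized by NIST (SP 800-38D)."""

    def __init__(self) -> None:
        self._aead = AESGCM(AESGCM.generate_key(bit_length=256))

    def encrypt(self, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)                  # unique nonce per message
        return nonce + self._aead.encrypt(nonce, plaintext, None)

    def decrypt(self, ciphertext: bytes) -> bytes:
        nonce, body = ciphertext[:12], ciphertext[12:]
        return self._aead.decrypt(nonce, body, None)


# Swapping algorithms becomes a configuration change, not a rewrite.
MODULES = {"fernet": FernetModule, "aes-256-gcm": AesGcmModule}
cipher: CipherModule = MODULES["aes-256-gcm"]()
assert cipher.decrypt(cipher.encrypt(b"sensitive record")) == b"sensitive record"

Because calling code depends only on the interface, retiring a suspect algorithm is a matter of registering a new module and changing one configuration value rather than rewriting every system that touches the data.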

Even though quantum computers are years away, today's news is an important reminder: if you are using encryption, make sure you are using the strongest, most well-vetted techniques available to protect sensitive data. Bodies such as the National Institute of Standards and Technology (NIST) publish standards, including the Federal Information Processing Standards (FIPS), for use across the United States Federal Government. FIPS 140-2 is a security accreditation program for validating that cryptographic modules produced by private-sector companies meet well-defined security requirements. When protecting sensitive and private information, organizations should look for strong, industry-acknowledged encryption approaches that meet accredited standards such as FIPS 140-2 and that have well-documented, third-party peer-reviewed security proofs.

Enterprises should also strongly consider the inherent strength of an alternative data protection technique known as tokenization, which has no keys to crack. Tokenization is a process by which a sensitive data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While there are various approaches to creating tokens, they are typically randomly generated values that have no mathematical relationship to the original data field. The inherent security of tokenization is that it is nearly impossible to determine the original value of the sensitive data field from the surrogate token alone. If a criminal got access to the token (in a cloud environment, for example), there is no "quantum computer" that could ever decipher it back into its original form.
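As a rough illustration, here is a minimal sketch of vault-style tokenization in Python. The TokenVault class and its methods are hypothetical names invented for this example, not any vendor's API; a production system would persist the vault securely and enforce strict access controls.

import secrets


class TokenVault:
    """Maps random surrogate tokens to original values; the vault stays inside the enterprise."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}   # reuse the same token for repeat values

    def tokenize(self, sensitive_value: str) -> str:
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        token = secrets.token_urlsafe(16)           # random surrogate; no key to crack
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]          # possible only with access to the vault


vault = TokenVault()
token = vault.tokenize("4111111111111111")          # the PAN never leaves the enterprise
# Only the token travels to the cloud application; without the vault, no amount
# of computing power recovers the original PAN from the token itself.
assert vault.detokenize(token) == "4111111111111111"

The security of this approach rests on controlling access to the vault rather than on the hardness of a mathematical problem, which is why a future quantum computer changes nothing about it.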

For more information on encryption, tokenization and retaining control over sensitive data in the cloud, please visit our resource center.



PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information, please visit the PerspecSys website or follow @perspecsys on Twitter.

More Stories By Gerry Grealish

Gerry Grealish is Vice President, Marketing & Products, at PerspecSys. He is responsible for defining and executing PerspecSys’ marketing vision and driving revenue growth through strategic market expansion and new product development. Previously, he ran Product Marketing for the TNS Payments Division, helping create the marketing and product strategy for its cloud-based payment gateway and tokenization/encryption security solutions. He has held senior marketing and leadership roles for venture-backed startups as well as F500 companies, and his industry experience includes enterprise analytical software, payment processing and security services, and marketing and credit risk decisioning platforms.
