Should Quantum Computers Cracking Encryption Matter to You?

If you are using encryption, use the strongest, most well-vetted techniques available to protect sensitive data

Many news organizations, including The Washington Post, are reporting that the latest documents leaked by former NSA contractor turned whistleblower Edward Snowden show the NSA is in the early stages of an effort to build a quantum computer that could crack most types of encryption. The NSA does disclose on its website that it is working on quantum computing technology; what was previously unknown is how far that research had progressed. According to the new documents, the agency is pursuing a "cryptologically useful quantum computer" as part of a research program called "Penetrating Hard Targets," and the goal is to use it to crack encryption.

With headlines that scream "NSA Secretly Funding Code-Breaking Quantum Computer Research," it's easy to see why many executives and enterprises are anxious and perhaps starting to lose faith in Internet communications and transactions. Encryption is used to protect medical, banking, business and government records around the world. But, as many of the articles in the media point out, quantum computing remains a theoretical research topic today and is many years away from being a usable real-world technology. The Washington Post quotes Scott Aaronson, an associate professor of electrical engineering and computer science at the Massachusetts Institute of Technology: "It seems improbable that the NSA could be that far ahead of the open world without anybody knowing it."

As cryptography expert Bruce Schneier said in a piece in USA Today, "I worry a lot more about poorly designed cryptographic products, software bugs, bad passwords, companies that collaborate with the NSA to leak all or part of the keys, and insecure computers and networks. Those are where the real vulnerabilities are, and where the NSA spends the bulk of its efforts."

Mr. Schneier's comments reaffirm the importance of never locking in, intractably, to a single encryption algorithm or technique. An organization that loses faith in the integrity of a specific algorithm after that algorithm has become core to many or all of the systems it runs is in a very difficult position. The systems used to protect information, such as cloud encryption gateways, need to be flexible enough to do their job regardless of which encryption algorithms are in use. This design approach gives organizations the flexibility to swap algorithms in and out over time, based on their preferences, without affecting the core capabilities of the solutions built on these encryption modules.
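
To make that design approach concrete, here is a minimal sketch of algorithm agility in Python. The interface and registry names (CipherModule, AesGcmModule, REGISTRY) are illustrative assumptions, not PerspecSys's implementation; the AES-GCM primitive comes from the widely used cryptography package.

```python
# Illustrative sketch of crypto-agility: callers depend on an abstract
# interface, so the concrete algorithm can be swapped without touching them.
# Names (CipherModule, AesGcmModule, REGISTRY) are hypothetical.
import os
from abc import ABC, abstractmethod

from cryptography.hazmat.primitives.ciphers.aead import AESGCM


class CipherModule(ABC):
    """Interface every pluggable encryption module must satisfy."""

    @abstractmethod
    def encrypt(self, plaintext: bytes) -> bytes: ...

    @abstractmethod
    def decrypt(self, ciphertext: bytes) -> bytes: ...


class AesGcmModule(CipherModule):
    """AES-256-GCM module; one of potentially many registered algorithms."""

    def __init__(self, key: bytes):
        self._aead = AESGCM(key)

    def encrypt(self, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)  # unique nonce per message
        return nonce + self._aead.encrypt(nonce, plaintext, None)

    def decrypt(self, ciphertext: bytes) -> bytes:
        nonce, body = ciphertext[:12], ciphertext[12:]
        return self._aead.decrypt(nonce, body, None)


# A registry lets the gateway select (or later replace) an algorithm by name.
REGISTRY = {"aes-256-gcm": lambda: AesGcmModule(AESGCM.generate_key(bit_length=256))}

cipher = REGISTRY["aes-256-gcm"]()
secret = cipher.encrypt(b"patient record #1234")
assert cipher.decrypt(secret) == b"patient record #1234"
```

Because application code holds only a CipherModule, retiring a distrusted algorithm means registering a replacement and re-encrypting the data, with no changes to the callers.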

Even though quantum computers are years away, today's news is an important reminder: if you are using encryption, take care to use the strongest, most well-vetted techniques available to protect sensitive data. Bodies such as the National Institute of Standards and Technology (NIST) publish standards such as the Federal Information Processing Standards (FIPS) for use across the United States federal government. FIPS 140-2 is a security standard, with an associated accreditation program, for validating that cryptographic modules produced by private-sector companies meet well-defined security requirements. When protecting sensitive and private information, organizations should look for strong, industry-acknowledged encryption approaches that meet accredited standards such as FIPS 140-2 and that have well-documented, third-party peer-reviewed security proofs.
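
One lightweight way to operationalize that advice is to enforce an allow-list of approved algorithms at configuration time. The snippet below is a hypothetical sketch: the allow-list shows an illustrative subset of FIPS-approved AES modes, not an official catalog, and validate_cipher_choice is an invented helper name.

```python
# Hypothetical policy check: reject configurations that request an
# algorithm outside an approved allow-list (illustrative subset only).
APPROVED_ALGORITHMS = {"aes-128-gcm", "aes-192-gcm", "aes-256-gcm"}


def validate_cipher_choice(name: str) -> str:
    normalized = name.strip().lower()
    if normalized not in APPROVED_ALGORITHMS:
        raise ValueError(
            f"{name!r} is not on the approved-algorithm list; "
            "choose a vetted, standards-accredited cipher instead."
        )
    return normalized


validate_cipher_choice("AES-256-GCM")  # ok
# validate_cipher_choice("rot13")      # raises ValueError
```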

Also, enterprises should strongly consider the inherent strength of an alternative data-protection technique known as tokenization, which has no keys to crack. Tokenization is a process by which a sensitive data field, such as a primary account number (PAN) from a credit or debit card, is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated original value. While there are various approaches to creating tokens, they are typically just randomly generated values that have no mathematical relationship to the original data field; the mapping between token and original value lives only in a protected token vault. The inherent security of tokenization is that it is practically impossible to recover the original sensitive value from the surrogate token alone. If a criminal got access to the token (in a cloud environment, for example), there is no "quantum computer" that could ever decipher it back into its original form.
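
A toy illustration of why that holds, assuming a simple in-memory vault (a real tokenization system would use a hardened, access-controlled data store): because the token is generated randomly, no computation on the token itself can reveal the PAN.

```python
# Toy tokenization sketch: tokens are random, so the only way back to the
# original value is a lookup in the (protected) vault -- there is nothing
# to "crack" mathematically. The in-memory dict stands in for a real,
# hardened token vault.
import secrets

_vault: dict[str, str] = {}


def tokenize(pan: str) -> str:
    token = secrets.token_hex(16)  # 128 random bits, no relation to the PAN
    _vault[token] = pan
    return token


def detokenize(token: str) -> str:
    return _vault[token]  # vault access is the only route back


tok = tokenize("4111111111111111")
print(tok)  # random hex string, different on every run
assert detokenize(tok) == "4111111111111111"
```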

For more information on encryption, tokenization and retaining control over sensitive data in the cloud, please visit our resource center.

PerspecSys Inc. is a leading provider of cloud protection and cloud encryption solutions that enable mission-critical cloud applications to be adopted throughout the enterprise. Cloud security companies like PerspecSys remove the technical, legal and financial risks of placing sensitive company data in the cloud. PerspecSys accomplishes this for many large, heavily regulated companies across the world by never allowing sensitive data to leave a customer's network, while maintaining the functionality of cloud applications. For more information, follow @perspecsys on Twitter.

More Stories By Gerry Grealish

Gerry Grealish is Vice President, Marketing & Products, at PerspecSys. He is responsible for defining and executing PerspecSys’ marketing vision and driving revenue growth through strategic market expansion and new product development. Previously, he ran Product Marketing for the TNS Payments Division, helping create the marketing and product strategy for its cloud-based payment gateway and tokenization/encryption security solutions. He has held senior marketing and leadership roles for venture-backed startups as well as F500 companies, and his industry experience includes enterprise analytical software, payment processing and security services, and marketing and credit risk decisioning platforms.
