By Nikita Ivanov
December 30, 2014 05:00 PM EST
The Facts and Fiction of In-Memory Computing
In the last year, conversations about In-Memory Computing (IMC) have become increasingly prevalent in enterprise IT circles, especially as organizations feel the pressure to process massive quantities of data at the speed the Internet now demands. The hype around IMC is justified: tasks that once took hours to execute are streamlined down to seconds by moving the computation and data from disk directly to RAM. Through this simple adjustment, analytics happen in real time, and applications (as well as the development of applications) keep pace with this new standard of technology and speed.
Despite IMC becoming both more cost-effective and more widely accepted within enterprise computing, a small handful of falsehoods still confuses even the most technical individuals in enterprise IT.
Myth: In-memory computing is about databases, so this isn't really relevant to my business.
The best way to clear the air around IMC is to start with a simple explanation of what in-memory computing actually is. Because we are talking about RAM, many assume we are having a conversation about databases and storage, but this is not the case.
IMC, at its most basic level, means using middleware software that allows you to store data in RAM - across a broad cluster of computers - and do any and all processing where the data resides (in memory). With traditional methods, data processing is confined to spinning disks.
By comparison, in-memory computing speeds up this process by roughly 5,000 times. So we are not talking about storage only, but about active, fluid data and computing.
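The "do the processing where the data resides" idea can be sketched in a few lines. The following is an illustrative toy only - the class and method names are hypothetical, not any real IMC product's API - but it shows the pattern: records are partitioned across in-memory shards (standing in for the RAM of cluster nodes), and computation runs inside each partition rather than shipping data out to a central processor.

```python
# Toy "data grid": partition records across in-memory shards and run
# the computation where each record lives (compute-to-data pattern).
# Hypothetical names for illustration only - not a real IMC product API.

class ToyDataGrid:
    def __init__(self, num_partitions=4):
        # Each partition is an in-memory dict, standing in for one node's RAM.
        self.partitions = [dict() for _ in range(num_partitions)]

    def _partition_for(self, key):
        # Hash-based routing decides which "node" owns each key.
        return self.partitions[hash(key) % len(self.partitions)]

    def put(self, key, value):
        self._partition_for(key)[key] = value

    def map_reduce(self, map_fn, reduce_fn, initial):
        # Apply map_fn inside each partition, then fold the results together.
        result = initial
        for part in self.partitions:
            for value in part.values():
                result = reduce_fn(result, map_fn(value))
        return result

grid = ToyDataGrid()
for i in range(1000):
    grid.put(f"trade-{i}", i)

total = grid.map_reduce(lambda v: v * 2, lambda acc, x: acc + x, 0)
print(total)  # 2 * sum(0..999) = 999000
```

In a real deployment the partitions live in the RAM of separate machines and the map step executes on each node locally; the toy collapses that onto one process, but the data-placement and compute-routing logic is the same shape.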
Which brings me to another, more tangible point about computing efficiency. By incorporating in-memory computing, a handful of processes are streamlined to save time, resources, and money.
To start, in-memory computing requires much less hardware; the result is significantly decreased capital, operational, and infrastructure overhead.
Moreover, IT departments can also significantly extend the life of existing hardware and software through the increased performance that is inherent with IMC - thus amplifying the ROI on the machines that have already been purchased.
Surprisingly, in-memory computing is not a new phenomenon. Since the inception of RAM, IMC has been viewed as a reliable accelerant for high-performance computing - which brings us to the next crucial misconception about this technology.
Myth: In-memory computing is expensive, therefore not practical for my operation.
There is a reason this is one of the most common misunderstandings about IMC: the cost of memory was once quite high. That said, the cost of RAM has been dropping consistently, at a rate of about 30% a year, for the last five years.
Today, a 1-terabyte RAM cluster can go for anywhere between $20,000 and $40,000 - including all of the CPUs, networking, etc. A few years from now, that same setup will likely be available for half that price.
Regardless of the future price of RAM - which, based on current projections, will likely continue to fall - the current economics have already placed this technology well within the reach of the enterprise computing budgets that require this level of scale.
Myth: My needs are already being met by Flash.
There are three different reasons why IT folks hold this mentality, each of which is highly misinformed. I'll start with the most common: the idea that your business doesn't need the Lamborghini-esque supercomputing power of IMC.
The hard yet obvious reality is that if your business is in any way data-driven, you likely cannot survive without speed and agility in this department. As time goes on, the amount of data that businesses accumulate compounds, with new streams and new variations. This is a sink-or-swim reality.
Another myth commonly used to dismiss IMC is that businesses can simply mount a RAM disk and get in-memory processing. Unfortunately, it's not that easy. A RAM disk only speeds up file I/O on a single machine; as mentioned earlier, IMC works through middleware that distributes both the data and the computation across a cluster - that is what unlocks its power.
Finally, there's the notion that one can simply replace HDDs with SSDs to get this supercharged performance. In certain situations, the performance gain from flash storage in lieu of spinning disk is enough.
However, speed matters - and is rapidly becoming more of a requirement every day. At this point, comparing flash with RAM is like comparing apples to oranges: in-memory computing delivers speed improvements of 10 to 100x over SSDs.
Myth: Memory is not durable enough to be truly sustainable.
This is another notion that has been widely perpetuated - and is entirely false.
The fact is, almost all in-memory computing middleware (apart from very simplistic offerings) provides one or more durability strategies: in-memory backups, durable storage backups, disk-based swap space overflow, etc.
More sophisticated vendors provide a comprehensive tiered storage approach, where users decide what portion of the overall data set is stored in RAM, local disk swap space, or RDBMS/HDFS - each tier storing progressively more data at progressively longer latencies.
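The tiered read path described above can be sketched as follows. This is a hedged, minimal illustration - the `TieredStore` class and its tier names are hypothetical, not any vendor's actual API - but it captures the idea: check the fast RAM tier first, fall back to the slower disk tier, and finally to a backing store standing in for an RDBMS or HDFS, promoting hot entries upward as they are read.

```python
# Minimal sketch of tiered storage: RAM tier -> disk swap tier -> backing
# store (RDBMS/HDFS stand-in). Hypothetical names, for illustration only.

class TieredStore:
    def __init__(self, backing_store, ram_capacity=2):
        self.ram_tier = {}                   # smallest, fastest tier
        self.disk_tier = {}                  # stands in for local swap space
        self.backing_store = backing_store   # largest, slowest tier
        self.ram_capacity = ram_capacity

    def get(self, key):
        if key in self.ram_tier:             # tier 1: RAM hit
            return self.ram_tier[key]
        if key in self.disk_tier:            # tier 2: disk swap hit
            value = self.disk_tier.pop(key)
        else:                                # tier 3: backing store
            value = self.backing_store[key]
        self._promote(key, value)
        return value

    def _promote(self, key, value):
        # Evict an entry from RAM down to the disk tier when RAM is full.
        if len(self.ram_tier) >= self.ram_capacity:
            old_key, old_value = self.ram_tier.popitem()
            self.disk_tier[old_key] = old_value
        self.ram_tier[key] = value

backing = {f"row-{i}": i * 10 for i in range(100)}
store = TieredStore(backing)
print(store.get("row-7"))   # 70, served from the backing store
print(store.get("row-7"))   # 70 again, now answered from the RAM tier
```

Real products add expiration, write-through/write-behind, and smarter eviction policies, but the decision users make is the same one the constructor exposes here: how much of the working set earns a place in RAM.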
Yet another source of confusion is the difference between operational datasets and historical datasets. In-memory computing is not aimed at replacing enterprise data warehouses (EDW), backup, or offline storage services - Hadoop, for example. The goal of IMC is to accelerate the operational datasets that require mixed OLTP and OLAP processing and in most cases are less than 10 TB in size. That is to say, in-memory computing is not "all or nothing" - it does not require that every piece of data be housed in memory.
The in-memory computing revolution is by no means intended to obliterate disks from the enterprise. For now, the disk still serves a well-defined role for offline/backup use cases - tasks that are not the focus of IMC.
Myth: In-memory is inaccessible to my business because so few developers actually know how to use it.
Yes, in-memory computing is a highly complex technology that, for now, only a few vendors have successfully developed offerings for. However, like much of high technology, in-memory computing has entered the world of open source - bringing its capabilities and power to the fingertips of developers around the world.
Currently, with GridGain, developers have the ability to get their hands on IMC with a simple download at http://gridgain.org/.
In-memory computing is already being tapped across a broad range of functions and industries, including (but not limited to) financial trading systems, online gaming, bioinformatics, hyper-local advertising, cognitive computing, and geospatial analysis.
By raising awareness and bringing the capabilities of IMC to more developers and organizations, industries around the globe are poised to experience entirely new standards of speed, computing, and performance.