Oracle Database In-Memory Powers the Real-Time Enterprise

Larry Ellison Unveils Breakthrough Technology, Which Turns the Promise of Real-Time Into a Reality

REDWOOD SHORES, CA -- (Marketwired) -- 06/10/14 -- Oracle (NYSE: ORCL)

News Summary

In today's fast-paced, hyper-connected, and mobile/social world, businesses demand instantaneous information and responsiveness. In this environment, businesses must be able to move as fast as their customers, be they B2B or B2C, to deliver the experience those customers demand.

For years, technology companies have talked about the "real-time" enterprise. And for years, talk was all those vendors delivered, because they lacked the range of world-class technologies needed to make good on the real-time promise. Oracle is changing that, because only Oracle can bring together for customers optimized in-memory capabilities across applications, middleware, databases, and systems. Oracle Database In-Memory transparently extends the power of Oracle Database 12c, enabling organizations to discover business insights in real time while simultaneously increasing transactional performance. With Oracle Database In-Memory, users get immediate answers to business questions that previously took hours to obtain, and can deliver a faster, better experience to both their internal and external constituents.

Oracle Database In-Memory delivers leading-edge in-memory performance without restricting functionality or accepting compromises, added complexity, or risk. Deploying Oracle Database In-Memory with virtually any existing Oracle Database-compatible application is as easy as flipping a switch -- no application changes are required. It is fully integrated with Oracle Database's renowned scale-up, scale-out, storage-tiering, availability, and security technologies, making it the most robust offering in the industry.
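
In practice, the "flip a switch" enablement described above comes down to a few SQL statements. The sketch below is a minimal illustration: the `INMEMORY_SIZE` parameter and the `INMEMORY` table clause are part of the Oracle Database In-Memory option, but the table name and memory size shown here are hypothetical.

```sql
-- Reserve a portion of the SGA for the in-memory column store
-- (takes effect after a restart; the size is illustrative only).
ALTER SYSTEM SET INMEMORY_SIZE = 16G SCOPE=SPFILE;

-- Mark an existing table for in-memory columnar population.
-- No application changes are needed; the optimizer uses the
-- column store automatically when it benefits a query.
ALTER TABLE sales INMEMORY PRIORITY HIGH;

-- Analytic queries against SALES are now candidates for fast
-- in-memory columnar scans, while OLTP continues to run
-- against the unchanged row format.
SELECT prod_id, SUM(amount_sold)
FROM   sales
GROUP  BY prod_id;
```

Because the column store is an additional representation rather than a replacement, existing indexes, SQL, and applications continue to work unmodified.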

At a special event at Oracle's headquarters, CEO Larry Ellison described how the ability to combine real-time data analysis with sub-second transactions on existing applications enables organizations to become Real-Time Enterprises that quickly make data-driven decisions, respond instantly to customers' demands, and continuously optimize key processes.

What Customers are Saying

  • "As a consumer Internet pioneer and innovator, Yahoo is always at the leading edge of big data and database technology to deliver a responsive, seamless consumer experience. We joined Oracle's beta program to understand how memory optimization could sharpen our big data processing," said Sudhi Vijayakumar, Yahoo's Principal Oracle Database Architect. "Full support for Oracle Real Application Clusters' scale-out capabilities means Oracle Database In-Memory can be used even on our largest data warehouses."

News Facts

  • Oracle Database In-Memory enables customers to accelerate database performance by orders of magnitude for analytics, data warehousing, and reporting while also speeding up online transaction processing (OLTP).
  • An innovative, dual-format in-memory architecture combines the best of row format and column format to simultaneously deliver fast analytics and efficient OLTP.
  • Oracle Database In-Memory allows any existing Oracle Database-compatible application to automatically and transparently take advantage of columnar in-memory processing, without additional programming or application changes.
  • Oracle Database In-Memory demonstrated speedups of 100x to more than 1000x for enterprise application modules in performance tests, including Oracle E-Business Suite, Oracle's JD Edwards, Oracle's PeopleSoft, Oracle's Siebel, and Oracle Fusion Applications.
  • The ability to combine real-time data analysis with sub-second transactions on existing applications enables organizations to become Real-Time Enterprises that quickly make data-driven decisions, respond instantly to customers' demands, and continuously optimize all key processes.
  • Oracle Database In-Memory has undergone extensive validation testing by hundreds of end-users, ISV partners, and Oracle Applications teams over the past nine months.
  • Oracle Database In-Memory is scheduled for general availability in July and can be used with all hardware platforms on which Oracle Database 12c is supported.
  • Oracle PartnerNetwork (OPN) is also announcing that Oracle Database 12c Ready certification will soon include Oracle Database In-Memory.
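
The dual-format architecture and transparent enablement described in the facts above can be observed from the data dictionary. As a brief, hedged sketch (the `V$IM_SEGMENTS` view is part of the In-Memory option; exact column sets can vary by release), an administrator can confirm which segments have been populated into the column store:

```sql
-- List segments populated in the in-memory column store,
-- along with how far background population has progressed.
SELECT segment_name,
       inmemory_size,
       bytes_not_populated,
       populate_status
FROM   v$im_segments;
```

A `populate_status` of COMPLETED with `bytes_not_populated` at zero indicates the columnar copy is fully built; row-format data on disk and in the buffer cache is untouched throughout.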

Software and Hardware Engineered for the Real-Time Enterprise

  • Building on years of innovations and maturity, Oracle Database In-Memory inherits all Oracle Database capabilities including:
    • Maximum Availability Architecture to protect against data loss and downtime.
    • Industry leading security technologies.
    • Scalability to meet any requirement via scale-up on large SMP servers, scale-out across a cluster of servers, and storage tiering, to cost-effectively run databases of any size -- whether petabyte-scale data warehouses, big data processing, or database clouds.
    • Rich programmability: Java, R, Big Data, PHP, Python, Node.js, REST, Ruby, etc.
    • Full data type support: relational, objects, XML, text, spatial, and new integrated JSON support.
  • Oracle Engineered Systems are the ideal complement to Oracle Database In-Memory:
    • Oracle Engineered Systems, including Oracle Exadata Database Machine and Oracle SuperCluster, are optimized for Oracle Database In-Memory, featuring large memory capacity, extreme performance, and high availability while tiering less active data to flash and disk to deliver outstanding cost effectiveness.
    • In-Memory fault tolerance on Oracle Engineered Systems optionally duplicates in-memory data across nodes, enabling queries to instantly use a copy of in-memory data if a server fails. New Direct-to-Wire InfiniBand accelerates scale-out for in-memory workloads.
    • Oracle's M6-32 Big Memory Machine is the most powerful scale-up platform for Oracle Database In-Memory, providing up to 32 terabytes of DRAM and 3 terabytes per second of memory bandwidth for maximum in-memory performance.
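
The integrated JSON support noted in the capability list above can be illustrated with a short sketch. The `IS JSON` condition and the `JSON_VALUE` function are part of Oracle Database 12c's JSON support; the table and document below are hypothetical examples.

```sql
-- Store JSON documents in an ordinary VARCHAR2 column,
-- validated by an IS JSON check constraint.
CREATE TABLE orders (
  id  NUMBER PRIMARY KEY,
  doc VARCHAR2(4000) CHECK (doc IS JSON)
);

INSERT INTO orders VALUES
  (1, '{"customer":"Acme","total":1250.00}');

-- Query into the document with JSON_VALUE; tables holding
-- JSON can also be populated into the in-memory column store
-- like any other table.
SELECT JSON_VALUE(doc, '$.customer') AS customer
FROM   orders;
```

Because JSON lives in regular relational tables, it inherits the same availability, security, and in-memory capabilities as the rest of the database.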

Supporting Quotes

  • "We are delighted that our MicroStrategy Analytics Platform is among the first third-party applications to be certified with Oracle Database In-Memory," explained Paul Zolfaghari, President, MicroStrategy Incorporated. "Our participation in Oracle's beta program and integration with Oracle Database In-Memory builds on our long-standing relationship with Oracle, underscoring the importance of working together to optimize our platforms to extend the advanced functionality and speed performance improvements to our joint customers."
  • "Oracle is the only vendor in the industry to embrace in-memory computing from applications to middleware to database to systems, enabling businesses to maximize profitability by accelerating operations, quickly discovering new growth opportunities and making smarter, real-time decisions," said Andrew Mendelsohn, Executive Vice President, Database Server Technologies, Oracle. "Oracle Database 12c In-Memory uniquely delivers unprecedented performance for virtually all workloads with 100 percent application transparency and no data migration. Plus all the high availability, scalability, and security that customers have come to expect from the Oracle Database are fully preserved."
  • "Oracle Applications provide the foundation for our customers' mission-critical business operations, including sales, financials, supply chain and human resources. By raising the bar on speed, Oracle Database In-Memory enables customers to compound the value of their existing applications by deriving new insights and business opportunities faster," said Steve Miranda, Executive Vice President of Application Development, Oracle.

About Oracle
Oracle engineers hardware and software to work together in the cloud and in your data center. For more information about Oracle (NYSE: ORCL), visit www.oracle.com.

Trademarks
Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

Safe Harbor
The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle's products remains at the sole discretion of Oracle Corporation.

PDF Attachment Available: http://media.marketwire.com/attachments/201406/76191_DBIM_ComparChart_Vert.pdf

Image Available: http://www2.marketwire.com/mw/frame_mw?attachid=2613727

Contact Info

Letty Ledbetter
Oracle
+1.650.506.8071
Email Contact

Teri Whitaker
Oracle
+1.650.506.9914
Email Contact

