
Continuuity, AT&T Labs to Open Source Real-Time Data Processing Framework

Disruptive New Technology Will Enable Integrated, High-Quality and Consistent Streaming Analytics

PALO ALTO, CA -- (Marketwired) -- 06/03/14 -- Continuuity, creator of the industry's first Big Data Application Server for Apache Hadoop™, and AT&T Labs today announced plans to release into open source a disruptive new technology that will provide an integrated, high-quality, and consistent streaming analytics capability. Initially code-named jetStream, it will be made available to the market via open source in the third quarter of 2014.

jetStream will offer a simple, efficient and cost-effective way for businesses, OEMs/ISVs, system integrators, service providers, and developers to create a diverse range of big data analytics and streaming applications that address a broad set of business use cases, such as network intrusion detection and analytics, real-time analysis for spam filtering, social media market analysis, location analytics, and real-time recommendation engines that match relevant content to the right users at the right time.

"Continuuity's mission is to help enterprises to make their businesses truly data-driven. Given the wealth of data being consumed and processed, the ability to make informed, real-time decisions with data is critical," said Jonathan Gray, Founder & CEO of Continuuity. "Our collaboration with AT&T to open source jetStream is consistent with our vision of becoming the de facto platform for easily building applications powered by data and operated at scale."

To create jetStream, Continuuity joined forces with AT&T Labs to integrate Continuuity BigFlow, a distributed framework for building durable, high-throughput data processing applications, with AT&T's streaming analytics tool -- an extremely fast, low-latency streaming analytics database originally built out of the need to manage AT&T's network at scale.

jetStream brings together the complementary functionality of BigFlow and AT&T's streaming analytics tool to create a unified real-time framework that supports in-memory stream processing and model-based event processing, with direct integration for a variety of data systems including Apache HBase and HDFS. By combining AT&T's low-latency, declarative language support with BigFlow's durable, high-throughput computing capabilities and procedural language support, jetStream gives developers a new way to take in and store vast quantities of data, build massively scalable applications, and update applications in real time as new data is ingested. Specifically, jetStream will provide the following:

  • Direct integration of real-time data feeds and processing applications with Hadoop and HBase, with YARN used for deployment and resource management
  • Framework-level correctness, fault-tolerance guarantees, and application-logic scalability that reduce friction, errors, and bugs during development
  • A transaction engine that provides delivery, isolation, and consistency guarantees enabling exactly-once processing semantics (a brief illustrative sketch follows this list)
  • Scalability without increasing the operational cost of building and maintaining applications
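The exactly-once guarantee is the most technically specific of these claims, so a brief illustration may help. The Java sketch below is not the jetStream or BigFlow API (which had not been published at the time of this release); it is a minimal, self-contained example with hypothetical class names showing the underlying idea a transaction engine enables: state updates and the input offset are committed atomically, so replaying a batch after a failure never double-applies it.

    // Hypothetical sketch of exactly-once processing via atomic state + offset commits.
    // Names are illustrative only; this is NOT the jetStream/BigFlow API.
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class ExactlyOnceCounter {

        /** In-memory stand-in for a transactional store (e.g., an HBase-backed table). */
        static class TransactionalStore {
            private final Map<String, Long> counts = new HashMap<>();
            private long committedOffset = -1;

            /** Atomically apply a batch of increments and advance the committed offset.
             *  If processing fails before this call, no partial state becomes visible. */
            synchronized void commit(Map<String, Long> increments, long newOffset) {
                if (newOffset <= committedOffset) {
                    return; // batch already applied: a replay after a crash is a no-op
                }
                increments.forEach((k, v) -> counts.merge(k, v, Long::sum));
                committedOffset = newOffset;
            }

            synchronized long committedOffset() { return committedOffset; }
            synchronized Map<String, Long> snapshot() { return new HashMap<>(counts); }
        }

        public static void main(String[] args) {
            List<String> stream = List.of("spam", "ham", "spam", "spam", "ham");
            TransactionalStore store = new TransactionalStore();

            // Resume from the last committed offset, so reprocessing after a failure
            // never double-counts events: this is the exactly-once guarantee.
            long start = store.committedOffset() + 1;
            Map<String, Long> batch = new HashMap<>();
            for (long offset = start; offset < stream.size(); offset++) {
                batch.merge(stream.get((int) offset), 1L, Long::sum);
            }
            store.commit(batch, stream.size() - 1);

            System.out.println(store.snapshot()); // aggregated counts: spam=3, ham=2
        }
    }

In a framework of the kind described here, the same pattern would presumably be backed by a durable store such as HBase rather than an in-memory map, with the transaction engine coordinating the atomic commit across processing stages.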

"From financial services to global network management, there are industries with a need to conduct real-time business; this requires access to a real-time streaming analytics technology that also provides consistency, speed and scalability," said Christopher W. Rice, Vice President of Advanced Technologies at AT&T Labs. "We believe that making this technology available in the market will advance the industry, broaden the supplier base, and lower the cost of such technology. Putting this into open-source makes available the required tools for developing data-intensive, complex applications accessible to a broader base of developers across businesses and partners of all sizes."

For more information, please visit jetStream.io or www.research.att.com/projects.

About AT&T
AT&T Inc. (NYSE:T) is a premier communications holding company and one of the most honored companies in the world. Its subsidiaries and affiliates -- AT&T operating companies -- are the providers of AT&T services in the United States and internationally. With a powerful array of network resources that includes the nation's most reliable 4G LTE network, AT&T is a leading provider of wireless, Wi-Fi, high speed Internet, voice and cloud-based services. A leader in mobile Internet, AT&T also offers the best wireless coverage worldwide of any U.S. carrier, offering the most wireless phones that work in the most countries. It also offers advanced TV service with the AT&T U-verse® brand. The company's suite of IP-based business communications services is one of the most advanced in the world.

Additional information about AT&T Inc. and the products and services provided by AT&T subsidiaries and affiliates is available at http://about.att.com or follow our news on Twitter at @ATT, on Facebook at http://www.facebook.com/att and YouTube at http://www.youtube.com/att.

© 2014 AT&T Intellectual Property. All rights reserved. AT&T, the AT&T logo and all other marks contained herein are trademarks of AT&T Intellectual Property and/or AT&T affiliated companies. All other marks contained herein are the property of their respective owners.

Reliability claim based on data transfer completion rates on nationwide 4G LTE networks. 4G LTE availability varies.

About Continuuity
Continuuity makes it easy for any Java developer to build and manage data applications in the cloud or on-premise. Continuuity Reactor, its flagship product, is the industry's first Big Data Application Server for Apache Hadoop™. Based in Palo Alto, Calif., the company is backed by leading investors including Battery Ventures, Andreessen Horowitz and Ignition Partners. Learn more at www.continuuity.com.

