
TABB Examines Market Data Technology Innovation Related to Managed Services, Comprehensive Data Ecosystems and Minimizing Cost of Unused Market Data

The complexity and cost of accessing global market data continue to weigh heavily on capital markets firms competing in a “Cambrian Age” of data, where management faces an explosion of new data sources, a market structure in transformation and constant competitive pressure to digest gargantuan amounts of data faster than ever.

According to new TABB Group research, the hunt for new sources of performance and savings is reaching a fever pitch. “Solution-wise, significant improvements in market data fluency must be an integral part of the formula for chief data officers as well as their CIO, CTO and CFO colleagues,” says Paul Rowady, a TABB principal, director of data and analytics (DnA) research and author of “Market Data Technology Innovation: Bigger Data, Simpler Solutions, Better Functionality.” “We believe that innovations balancing lower costs with increased levels of performance, increasingly broad coverage and improved functionality will be integral to their firms’ success going forward.”

The new report delves into three areas of market data technology innovation: managed services and the balancing of both unified and bifurcated market data infrastructure; comprehensive data ecosystems where strong breadth and depth of global market data is coupled with application development technology to intelligently consume that data; and enhanced user management to minimize the costs of unused market data.
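The third of these areas lends itself to a simple illustration: if a firm keeps an inventory of per-user market data entitlements alongside usage records, subscriptions that nobody is using can be flagged and their cost reviewed. The sketch below is a hypothetical, minimal illustration of that idea; the data model, field names and costs are assumptions for the example, not TABB's methodology or any vendor's actual product.

```python
# Minimal sketch of flagging unused market data entitlements.
# All fields and figures are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class Entitlement:
    user: str
    feed: str               # e.g. "ultra-low-latency", "consolidated-feed", "screen"
    monthly_cost: float     # assumed per-seat cost in USD
    requests_last_90d: int  # usage count pulled from an inventory/usage log


def unused_entitlements(entitlements: list[Entitlement]) -> list[Entitlement]:
    """Return entitlements with zero recorded usage in the lookback window."""
    return [e for e in entitlements if e.requests_last_90d == 0]


def potential_monthly_savings(entitlements: list[Entitlement]) -> float:
    """Sum the monthly cost of entitlements that show no usage."""
    return sum(e.monthly_cost for e in unused_entitlements(entitlements))


if __name__ == "__main__":
    inventory = [
        Entitlement("trader_a", "ultra-low-latency", 1200.0, 54_000),
        Entitlement("analyst_b", "consolidated-feed", 300.0, 0),
        Entitlement("analyst_c", "screen", 150.0, 0),
    ]
    idle = unused_entitlements(inventory)
    print(f"{len(idle)} unused entitlements, "
          f"~${potential_monthly_savings(inventory):,.0f}/month reclaimable")
```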

In terms of data costs, TABB estimates that total global spending on market data, while flattening since 2011, was about $25 billion in 2013, with an additional $3.1 billion spent on market data infrastructure. Within that broad category, $12 billion was spent on real-time market data (RTMD), which can be further segmented into ultra-low-latency feeds, consolidated feeds and screens.

This combination of volume, velocity and variability in market-related data is putting increasing pressure on solutions to deliver more scope, functionality, performance and output configurations at lower cost. With these pressures comes a new challenge: the phenomenon of big data. But the real story, Rowady explains, is less about big or bigger data and more about the ability to consume it more effectively at lower cost, at a time when large banks and cutting-edge prop shops are growing their data by terabytes per day.

“The need for innovation in market data technology has never been greater,” he says. “The question is just how quickly new technology and methods can be adopted.”

The 13-page report, with 7 exhibits, is available to TABB Research Alliance DnA clients and qualified media; an executive summary is also available. To purchase the study, write to [email protected].

About TABB Group

Based in New York and London, TABB Group is a research and consulting firm focused exclusively on capital markets, founded on the interview-based, “first-person knowledge” research methodology developed by Larry Tabb.

