Top Trial Firm Cohen, Placitella & Roth, P.C. Investigating Merge Healthcare, Inc. for Falsification of Customer Contracts

PHILADELPHIA, Jan. 17, 2014 /PRNewswire/ -- Cohen, Placitella & Roth, P.C. is investigating claims on behalf of investors who purchased Merge Healthcare, Incorporated ("Merge Healthcare" or the "Company") (Nasdaq: MRGE) stock between August 1, 2012 and January 7, 2014, inclusive. The investigation concerns whether Merge Healthcare and certain of its officers and/or directors disseminated materially false and/or misleading information to investors in violation of Sections 10(b) and 20(a) of the Securities Exchange Act of 1934.

Merge Healthcare offers health stations, clinical trial software, and other health data and analytics services designed to engage consumers about their personal health. Beginning in May 2013, the Company made a series of negative disclosures. First, its General Counsel resigned, followed by its Chairman and CEO. Next, the Company reported disappointing second quarter 2013 results, including a 9% year-over-year decline in revenue to $57.2 million, despite an 82% increase in its subscription backlog from the second quarter of 2012; earnings per share and revenue both fell far short of market expectations. On this news, the price of the Company's stock fell more than $2.00 per share, or more than 45%. Then, on January 8, 2014, before the market opened, Merge Healthcare announced that the existence and/or value of millions of dollars of customer contracts had been falsified for the six quarters ended September 30, 2013. On this news, the Company's stock price, which had traded as high as $4.71 per share during the Class Period, fell to close at $2.31 per share on January 8, 2014.

If you have any information on the falsification of Merge Healthcare's customer contracts, or if you wish to discuss your rights related to a loss on your investment in Merge Healthcare, please contact Eduardo A. Texidor, Jr. at [email protected] or, toll free, at 1-888-375-7600. If you inquire via email, please include "Merge Healthcare" in the subject line and provide the number of shares purchased, your mailing address, and your telephone number.

Since 1973, Cohen, Placitella & Roth, P.C. has been recognized as one of the premier trial law firms in the country. The firm has extensive experience prosecuting securities litigation involving violations of the federal securities laws, state law derivative actions, and mergers and acquisitions cases, representing institutional investors such as public pension plans and union pension funds, as well as individual shareholders who have suffered substantial investment losses due to corporate misconduct. LexisNexis Martindale-Hubbell® annually reports Cohen, Placitella & Roth's peer rating of AV®, its highest, which it describes as "a testament to professional excellence." Since the inaugural "Best Law Firms" edition in 2010, U.S. News & World Report has annually listed Cohen, Placitella & Roth as one of the top-tier class action law firms in the country.

Contact:
Eduardo A. Texidor, Jr.
Cohen, Placitella & Roth, P.C.
Toll free: 1-888-375-7600
[email protected]

SOURCE Cohen, Placitella & Roth, P.C.

