By PR Newswire
September 2, 2014 10:27 AM EDT
BURLINGTON, Massachusetts, September 2, 2014 /PRNewswire/ --
Attunity Ltd. (NASDAQ CM: ATTU), a leading provider of information availability software solutions, today announced the release of Attunity Maestro 2.0, the latest version of its Big Data management and data distribution platform. Key new capabilities include logic- and event-driven process control, which enables increased control over business processes and greater IT efficiency and productivity across global and distributed infrastructures.
According to a June 2014 report by IDC*, the worldwide workload scheduling and automation software market totaled $3.82 billion in 2013 and is expected to reach $5.7 billion in 2018. Growth is driven by business needs that include, among others, support for larger-scale data sources, increases in the number and range of users at an organization, and requirements to optimize self-service data provisioning among business users. This release of Maestro enables Attunity to tap into this market and strengthens its offerings for customers seeking an automated replication workflow solution.
Attunity Maestro 2.0 orchestrates and automates complex, multi-stage IT processes and data flows into simple-to-use workflow 'composition' templates. In addition, it provides rules-driven process controls, including operator approval and event-based triggers, to govern the execution of these compositions. This combination of automation and interaction accelerates data transfers, deployment processes, and Big Data initiatives, so organizations can enjoy improved collaboration and increased productivity.
Using Attunity Maestro 2.0, organizations can:
- Automate multi-stage IT processes and data flows into streamlined workflows
- Schedule event-driven processes for better control
- Facilitate collaborative operations between organizational silos
- Enhance auditing and email notifications
- Monitor, record and report business processes & authorization for improved compliance
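To make the concepts above concrete, the following is a minimal illustrative sketch in Python. It is not Attunity's API; the `Composition` class, its methods, and the stage names are all hypothetical, modeling only the ideas described here: an ordered multi-stage workflow, an operator-approval gate, and an audit log of each event.

```python
# Hypothetical model of a workflow "composition" -- NOT Attunity's actual API.
class Composition:
    """A named, ordered workflow of stages with an optional approval gate."""

    def __init__(self, name, requires_approval=False):
        self.name = name
        self.requires_approval = requires_approval
        self.stages = []     # ordered (stage_name, callable) pairs
        self.audit_log = []  # records each event for compliance review

    def add_stage(self, stage_name, action):
        self.stages.append((stage_name, action))
        return self  # allow chaining when composing the workflow

    def run(self, approved=False):
        # Rules-driven control: block execution until an operator approves.
        if self.requires_approval and not approved:
            self.audit_log.append(("blocked", self.name))
            return False
        for stage_name, action in self.stages:
            action()
            self.audit_log.append(("completed", stage_name))
        return True

# Usage: a two-stage data-distribution workflow gated on operator approval.
results = []
flow = (Composition("nightly-replication", requires_approval=True)
        .add_stage("extract", lambda: results.append("extracted"))
        .add_stage("distribute", lambda: results.append("distributed")))

flow.run()               # blocked: no operator approval yet
flow.run(approved=True)  # both stages execute in order
```

The audit log doubles as the monitoring and compliance record the bullet list refers to: every blocked or completed step is captured for later reporting.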
For more details about Attunity Maestro, visit http://www.attunity.com/products/attunity-maestro.
"Attunity is encouraged by Attunity Maestro's growing momentum in the market to date," said Lawrence Schwartz, VP Marketing at Attunity. "Attunity Maestro provides an easy-to-use enterprise solution for managing and monitoring important data availability tasks that can be implemented quickly. These benefits are highly attractive to companies that need solutions to rapidly overcome growing information management challenges. Attunity Maestro can empower organizations with more control, visibility, higher efficiency and greater productivity - keys to ensuring a competitive advantage."
Attunity is a leading provider of information availability software solutions that enable access, management, sharing and distribution of data, including Big Data, across heterogeneous enterprise platforms, organizations, and the cloud. Our software solutions include data replication, data management, test data management, change data capture (CDC), data connectivity, enterprise file replication (EFR), managed-file-transfer (MFT), and cloud data delivery. Using Attunity's software solutions, our customers enjoy significant business benefits by enabling real-time access and availability of data and files where and when needed, across the maze of heterogeneous systems making up today's IT environment.
Attunity has supplied innovative software solutions to its enterprise-class customers for nearly 20 years and has successful deployments at thousands of organizations worldwide. Attunity provides software directly and indirectly through a number of partners such as Microsoft, Oracle, IBM and HP. Headquartered in Boston, Attunity serves its customers via offices in North America, Europe, and Asia Pacific and through a network of local partners. For more information, visit http://www.attunity.com or our In Tune blog and join our community on Twitter, Facebook, LinkedIn and YouTube, the content of which is not part of this press release.
Safe Harbor Statement
This press release contains forward-looking statements, including statements regarding the anticipated features and benefits of Attunity Maestro, within the meaning of the "safe harbor" provisions of the Private Securities Litigation Reform Act of 1995 and other Federal Securities laws. Statements preceded by, followed by, or that otherwise include the words "believes", "expects", "anticipates", "intends", "estimates", "plans", and similar expressions or future or conditional verbs such as "will", "should", "would", "may" and "could" are generally forward-looking in nature and not historical facts. Because such statements deal with future events, they are subject to various risks and uncertainties, and actual results, expressed or implied by such forward-looking statements, could differ materially from Attunity's current expectations. Factors that could cause or contribute to such differences include, but are not limited to: our reliance on strategic relationships with our distributors, OEM and VAR partners, and on our other significant customers; risks and uncertainties relating to acquisitions, including costs and difficulties related to integration of acquired businesses; timely availability and customer acceptance of Attunity's new and existing products, including Attunity Maestro; changes in the competitive landscape, including new competitors or the impact of competitive pricing and products; a shift in demand for products such as Attunity's products; the impact on revenues of economic and political uncertainties and weaknesses in various regions of the world, including the commencement or escalation of hostilities or acts of terrorism; and other factors and risks on which Attunity may have little or no control. This list is intended to identify only certain of the principal factors that could cause actual results to differ.
For a more detailed description of the risks and uncertainties affecting Attunity, reference is made to Attunity's latest Annual Report on Form 20-F which is on file with the Securities and Exchange Commission (SEC) and the other risk factors discussed from time to time by Attunity in reports filed with, or furnished to, the SEC. Except as otherwise required by law, Attunity undertakes no obligation to publicly release any revisions to these forward-looking statements to reflect events or circumstances after the date hereof or to reflect the occurrence of unanticipated events.
© Attunity 2014. All Rights Reserved. Attunity is a registered trademark of Attunity Inc. All other product and company names herein may be trademarks of their respective owners.
Melissa Kolodziej, Director of Marketing Communications, Attunity
SOURCE Attunity Ltd.