By PR Newswire
April 2, 2014 07:00 AM EDT
BURLINGTON, Massachusetts, April 2, 2014 /PRNewswire/ --
Attunity Ltd. (NASDAQ CM: ATTU), a leading provider of information availability software solutions, today announced the release of Attunity Maestro, a new and innovative Big Data management platform designed to help organizations automate the complex process of composing, conducting and monitoring information flow across the entire global enterprise - easily and efficiently. The solution is expected to enable organizations to increase business productivity by empowering them to design and monitor large Big Data transfer processes using its powerful, unified control platform.
"Mapping, executing and managing the flow of data among a myriad of heterogeneous systems - both analytical and operational/transactional systems - are critical functions for enterprises that want to become truly data-driven," said Jeffrey Kelly, Principal Research Contributor and lead Big Data Analyst at Wikibon, a leading open source IT research community focused on infrastructure, Big Data, cloud and software-led innovations. "Attunity Maestro is designed to simplify these otherwise complex functions so that enterprises can deliver the right data to the right system at the right time."
Supporting global data centers and cloud environments, Attunity Maestro is engineered for medium to large enterprises that need to integrate critical data transfer processes into daily business activity. The solution accelerates and coordinates data transmission and deployment processes of Big Data and large-file assets, delivering speed, simplicity and scalability to virtually any business or IT process that requires information availability.
Designed to meet the needs of a diverse portfolio of users - IT Operations, Lines of Business and Risk Management teams alike - Attunity Maestro provides unique controls for defining, executing, managing and auditing all transaction and automation initiatives. Common uses will include data distribution to remote locations, data consolidation for central analytics, enterprise-wide content management and sharing, and multi-stage content deployment.
Using Attunity Maestro, organizations can:
- Centrally manage and control information flow processes
- Benefit from quick time-to-value
- Enable higher efficiency and productivity
- Empower line-of-business staff with self-service capabilities
- Free up IT resources
For more details about Attunity Maestro, visit http://www.attunity.com/products/attunity-maestro.
"Global organizations implementing Big Data initiatives are increasingly challenged with managing and monitoring their most precious and fastest growing asset - their data," explained Shimon Alon, Chairman and CEO at Attunity. "With Attunity Maestro, we are pleased to address this critical need head-on and help organizations to better manage the flow of information throughout their globally-distributed enterprises. We believe that Attunity Maestro will become a strategic driver for our future growth and expand our addressable markets, immediately providing us with a competitive advantage."
Attunity Announces New #BigData Mgmt Platform for Conducting/Monitoring Info Flow Across Global Enterprises http://bit.ly/1kq3Afx #Maestro
Attunity is a leading provider of information availability software solutions that enable access, management, sharing and distribution of data, including Big Data, across heterogeneous enterprise platforms, organizations, and the cloud. Our software solutions include data replication, data management, change data capture (CDC), data connectivity, enterprise file replication (EFR), managed file transfer (MFT), and cloud data delivery. Using Attunity's software solutions, our customers enjoy significant business benefits by enabling real-time access and availability of data and files where and when needed, across the maze of heterogeneous systems making up today's IT environment.
Attunity has supplied innovative software solutions to its enterprise-class customers for nearly 20 years and has successful deployments at thousands of organizations worldwide. Attunity provides software directly and indirectly through a number of partners such as Microsoft, Oracle, IBM and HP. Headquartered in Boston, Attunity serves its customers via offices in North America, Europe, and Asia Pacific and through a network of local partners. For more information, visit http://www.attunity.com or our In Tune blog and join our community on Twitter, Facebook, LinkedIn and YouTube, the content of which is not part of this press release.
This press release contains forward-looking statements, including statements regarding the anticipated features and benefits of Attunity Maestro, within the meaning of the "safe harbor" provisions of the Private Securities Litigation Reform Act of 1995 and other federal securities laws. Statements preceded by, followed by, or that otherwise include the words "believes", "expects", "anticipates", "intends", "estimates", "plans", and similar expressions or future or conditional verbs such as "will", "should", "would", "may" and "could" are generally forward-looking in nature and not historical facts. For example, when we say that we expect Attunity Maestro to be a strong driver of our future growth, we use a forward-looking statement. Because such statements deal with future events, they are subject to various risks and uncertainties and actual results, expressed or implied by such forward-looking statements, could differ materially from Attunity's current expectations. Factors that could cause or contribute to such differences include, but are not limited to: our reliance on strategic relationships with our distributors, OEM and VAR partners, and on our other significant customers; risks and uncertainties relating to acquisitions, including costs and difficulties related to integration of acquired businesses; timely availability and customer acceptance of Attunity's new and existing products, including Attunity Maestro; changes in the competitive landscape, including new competitors or the impact of competitive pricing and products; a shift in demand for products such as Attunity's products; the impact on revenues of economic and political uncertainties and weaknesses in various regions of the world, including the commencement or escalation of hostilities or acts of terrorism; and other factors and risks on which Attunity may have little or no control.
This list is intended to identify only certain of the principal factors that could cause actual results to differ. For a more detailed description of the risks and uncertainties affecting Attunity, reference is made to Attunity's latest Annual Report on Form 20-F which is on file with the Securities and Exchange Commission (SEC) and the other risk factors discussed from time to time by Attunity in reports filed with, or furnished to, the SEC. Except as otherwise required by law, Attunity undertakes no obligation to publicly release any revisions to these forward-looking statements to reflect events or circumstances after the date hereof or to reflect the occurrence of unanticipated events.
© Attunity 2014. All Rights Reserved. Attunity is a registered trademark of Attunity Inc. All other product and company names herein may be trademarks of their respective owners.
Melissa Kolodziej, Director of Marketing Communications, Attunity
SOURCE Attunity Ltd.