News Feed Item

Fidelis Cybersecurity Boosts Detection and Shortens Response and Resolution Times for Security Incidents with Real-Time Attack Visualization and Monitoring for Endpoint Devices

Fidelis Cybersecurity™, the leading provider of solutions for detecting and stopping advanced cyberattacks, announces new time-saving features, enhancements and usability improvements to its Fidelis Endpoint™ product. Fidelis Endpoint 6.1 shortens the time to investigate and resolve security incidents and provides real-time insights into attackers when they infiltrate your endpoints and hide in your environment.

“The attacks are happening on laptops, servers and other endpoints,” says Fidelis Cybersecurity Senior Vice President of Products Brian Karney. “Immediate and long-term visibility is critical when it comes to limiting the damage attackers can do. With Fidelis Endpoint, security teams can immediately and retrospectively detect suspicious activity across endpoints and get one-click access to the related information they need to understand and act on that alert.”

Highlights of the enhancements in Fidelis Endpoint 6.1 include:

  • Real-Time Event Monitoring: The introduction of new centralized event monitoring provides real-time detection and visibility into what is happening on endpoints across the enterprise. With this release, Fidelis Endpoint now continuously records and streams key endpoint activities — including file, process, registry, network, URL and DNS events — into a centralized event repository. In addition to improved detection, the historical event data holds valuable clues that let you trace an alert back to its original source. When you receive new intelligence from Fidelis or your threat intelligence services, you can apply it to the historical events to determine whether you were compromised in the past.
  • Enhanced Detection Engine: A new detection engine built on top of the centralized event monitoring system provides real-time threat detection. Detections are driven by a growing set of behavioral rules (also known as indicators of attack), powered by the Fidelis Threat Research Team, that can be configured to take automated actions such as tagging for later review, isolating the machine, or acquiring RAM. The new detection engine also supports third-party and custom indicator feeds and lets users create custom behavioral rules.
  • Event Driven User Interface: When an attack occurs, a new event-driven user interface provides an interactive play-by-play view that shows how the incident unfolded so security teams can take appropriate action to resolve the issue. Users can also filter through data and quickly tag an event, see similar events, or easily create an alert rule when they discover something malicious to drive future and retrospective detections.
  • Fidelis Network Integration: The introduction of event monitoring enhances the product’s integration with Fidelis Network. Now, when Fidelis Endpoint receives an alert from Fidelis Network, it automatically queries the event repository to determine what took place and validate the alert. Results return within seconds, and an alert rule is dynamically created to watch other systems for the endpoint activity that triggered the Fidelis Network alert.
  • Script Support for All Jobs: All jobs are now executed using the peer-to-peer script engine, which enables users to perform queries/jobs and receive results in near real-time across hundreds of thousands of endpoints.
  • Enhanced Endpoint Context: Users can now quickly access additional context about endpoints of interest. This lets users quickly see who is currently logged into a system, the host name, IP address, OS, event data associated with a specific endpoint and the job history for a particular endpoint – all in one location.
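The retrospective-detection workflow described above — continuously recording endpoint events in a central repository, then replaying newly received threat intelligence against that history — can be sketched conceptually as follows. This is a minimal illustration only, not Fidelis Endpoint code; the `EndpointEvent` fields and the `match_indicators` helper are hypothetical names chosen for the example.

```python
# Hypothetical sketch of retrospective indicator matching against a
# centralized endpoint event repository. Illustrative only; not actual
# Fidelis Endpoint code or APIs.

from dataclasses import dataclass
from typing import Iterable, List, Set


@dataclass
class EndpointEvent:
    host: str
    kind: str       # e.g. "file", "process", "registry", "network", "url", "dns"
    value: str      # e.g. a file hash, process name, or queried domain
    timestamp: str


def match_indicators(events: Iterable[EndpointEvent],
                     indicators: Set[str]) -> List[EndpointEvent]:
    """Replay newly received indicators over historical events to surface
    past activity that predates the intelligence."""
    return [e for e in events if e.value in indicators]


# Example: a newly published malicious domain is applied retroactively
# to events that were recorded before the indicator existed.
history = [
    EndpointEvent("laptop-42", "dns", "evil.example.com", "2017-01-03T10:00Z"),
    EndpointEvent("laptop-42", "file", "a1b2c3", "2017-01-03T10:01Z"),
    EndpointEvent("server-7", "dns", "updates.example.org", "2017-01-04T09:30Z"),
]
new_intel = {"evil.example.com"}

hits = match_indicators(history, new_intel)
for e in hits:
    print(f"{e.host}: {e.kind} event matched indicator {e.value!r}")
```

In a real deployment the repository would be a purpose-built event store queried at scale rather than an in-memory list, but the principle is the same: intelligence that arrives today can still detect a compromise that happened weeks ago, because the raw event history was preserved.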

“Until this release, users have been forced to choose between endpoint products optimized for query speed, for real-time threat detection from centralized events, or for endpoint forensics,” says Fidelis Cybersecurity Chief Technology Officer Kurt Bertone. “Fidelis Endpoint 6.1 is the first and only endpoint detection and response product with an architecture optimized to support all three of these use cases in a single product.”

Fidelis Endpoint 6.1 is generally available today.

Learn More

- Contact Fidelis to schedule a demo
- Watch our Fidelis Endpoint video for an overview
- Read the new Endpoint blog post on Threat Geek
- Visit the Fidelis Endpoint product page
- Read the Fidelis Endpoint datasheet

About Fidelis Cybersecurity

Fidelis Cybersecurity is creating a world where attackers have no place left to hide. We reduce the time it takes to detect attacks and resolve security incidents. Our Fidelis Network™ and Fidelis Endpoint™ products look deep inside your traffic and content where attackers hide their exploits. Then, we pursue them out to your endpoints where your critical data lives. With Fidelis you’ll know when you’re being attacked, you can retrace attackers’ footprints and prevent data theft at every stage of the attack lifecycle. To learn more about Fidelis Cybersecurity products and incident response services, visit www.fidelissecurity.com and follow us on Twitter @FidelisCyber.


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
