How to Monitor Windows Share Access

Every detail and process on the network matters and needs to be monitored regularly. The operation of network devices and services has to be audited, hardware and software must be inventoried, and even shared files and folders have to stay under the system administrator's control. Why is it so important to audit access to shares?

Sooner or later, every company runs into the same problems: files disappear, and important documents are modified or replaced by users who tell no one. This happens especially often in large organizations with many employees. Managers and heads of departments know all too well that such incidents can bring an entire department's work to a standstill.

That obviously damages the company's reputation. Identifying the responsible employee helps find careless or irresponsible users and prevents similar incidents in the future. Without an audit of shared folder access it is very difficult, we would say practically impossible, to determine who is at fault, because few people willingly admit their mistakes. That is why the system administrator has to take on one more task on the network: continuous auditing of share access.

How can this process be performed?

Special share-auditing software can be used for this purpose. It is installed on the file server and runs as a service, so it is completely invisible to users. The program maintains a log file and answers the three main questions a system administrator needs answered (a scripted sketch based on Windows' built-in auditing follows the list below): who, what, and when?

1. Who uses shared resources: from which PC and under which user account.

2. What users do with the shared files: whether they simply view them, modify them, delete them, or copy them to their own PCs.

3. When employees access the shared folders.
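
As a rough illustration of how these three questions can be answered without a dedicated product, the sketch below pulls recent share-access events (Event ID 5140, "A network share object was accessed") from the Windows Security log using the built-in wevtutil tool. It assumes the "Audit File Share" advanced audit policy is enabled on the file server and that the script runs with administrator rights; it is not the commercial tool discussed in this article.

# Minimal sketch: list recent share-access events from the Security log.
# Assumes the "Audit File Share" policy is enabled and admin rights.
import subprocess
import xml.etree.ElementTree as ET

QUERY = "*[System[(EventID=5140)]]"  # 5145 gives per-file detail if enabled

def recent_share_access(max_events: int = 20):
    """Return (user, client_ip, share, time) tuples for recent accesses."""
    xml_out = subprocess.run(
        ["wevtutil", "qe", "Security", f"/q:{QUERY}",
         f"/c:{max_events}", "/rd:true", "/f:RenderedXml"],
        capture_output=True, text=True, check=True,
    ).stdout
    ns = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}
    events = []
    # wevtutil emits one <Event> element per record, concatenated
    for chunk in xml_out.split("</Event>"):
        chunk = chunk.strip()
        if not chunk:
            continue
        root = ET.fromstring(chunk + "</Event>")
        data = {d.get("Name"): d.text
                for d in root.findall(".//e:EventData/e:Data", ns)}
        when = root.find(".//e:System/e:TimeCreated", ns).get("SystemTime")
        events.append((data.get("SubjectUserName"),
                       data.get("IpAddress"),
                       data.get("ShareName"),
                       when))
    return events

if __name__ == "__main__":
    for user, ip, share, when in recent_share_access():
        print(f"{when}  {user}@{ip}  ->  {share}")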

On top of that, the system administrator can always disable access to a particular share, when necessary, in a couple of mouse clicks (or from a script, as sketched below).
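
For illustration only, here is a minimal Python wrapper around the standard net share command; "Reports" is a placeholder share name, and the command removes only the share definition, leaving the underlying folder and files intact.

# Minimal sketch: stop sharing a folder from a script rather than the GUI.
# "Reports" is a placeholder share name; requires administrator rights.
import subprocess

def disable_share(share_name: str) -> None:
    """Remove a Windows share by name without touching the folder itself."""
    subprocess.run(["net", "share", share_name, "/delete"], check=True)

if __name__ == "__main__":
    disable_share("Reports")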

An advanced network share access audit tool can offer another advantage: configurable notifications or alerts on connections to shared resources. The system administrator can receive an e-mail, on-screen, or sound notification when users connect to certain shared folders or files (a simple scripted e-mail alert is sketched after the list below), and so is always aware when:

- Files that contain important information have been edited.

- Critical documents have disappeared.

- Information that has to be kept in a particular folder has been moved elsewhere.

- Company documents that contain private information have been copied to users' computers.
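
As a rough sketch of such an alert, the loop below reuses the recent_share_access() helper from the earlier example and e-mails the administrator whenever a watched share is accessed. The SMTP host, addresses, and watched share names are placeholders; a full-featured tool would add on-screen and sound notifications as well.

# Minimal alerting sketch, assuming recent_share_access() from above.
# SMTP host, addresses, and share names are placeholders for illustration.
import smtplib
import time
from email.message import EmailMessage

WATCHED_SHARES = {"Finance", "HR"}      # placeholder shares worth an alert
SMTP_HOST = "mail.example.local"        # placeholder mail server
ALERT_FROM = "audit@example.local"
ALERT_TO = "sysadmin@example.local"

def send_alert(user, ip, share, when):
    msg = EmailMessage()
    msg["Subject"] = f"Share access: {share} by {user}"
    msg["From"] = ALERT_FROM
    msg["To"] = ALERT_TO
    msg.set_content(f"{when}: {user} from {ip} accessed share {share}")
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)

def watch(poll_seconds: int = 60):
    seen = set()  # remember already-reported events (unbounded in this sketch)
    while True:
        for user, ip, share, when in recent_share_access():
            key = (user, ip, share, when)
            if share in WATCHED_SHARES and key not in seen:
                seen.add(key)
                send_alert(user, ip, share, when)
        time.sleep(poll_seconds)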

Auditing all of these aspects allows the system administrator to act immediately to protect information and prevent data leaks. It also lets him start restoring deleted data from a backup copy right away. All it takes is a properly configured audit of shared resources.

Thus, preventing the unpleasant situations connected with shared resources does not require as much effort from the system administrator as it may seem. He just has to set up proper monitoring of shared folders and documents.

More Stories By Dmitriy Stepanov

Dmitriy Stepanov is the CEO of 10-Strike Software, a developer of network inventory, network monitoring, and bandwidth monitoring software. 10-Strike Software has been offering its networking products since 1999 and specializes in Windows network software for corporate users.
