How to Monitor Windows Share Access

Every detail and process on the network matters and needs to be monitored regularly. For instance, the operation of network devices and services has to be audited, and hardware and software must be inventoried. Even shared files and folders have to be under the system administrator's control. Why is it so important to audit access to shares?

Sooner or later, every company runs into the same problems: files disappear, and important documents are modified or replaced without anyone being notified. This happens especially often in large organizations with many employees. Managers and heads of departments know perfectly well that such incidents can sometimes halt an entire department's operation.

That obviously has a negative impact on the company's reputation. Identifying the responsible employee in such cases helps to find careless or irresponsible users and prevent similar situations in the future. Without a shared folder access audit, it is very difficult, if not impossible, to determine who is responsible: few people willingly admit their mistakes. That is why the system administrator has one more task on the network - performing a constant share audit.

How can this process be performed?

For this purpose, special share audit software can be used. It is installed on the file server and runs as a service, so it is completely invisible to users. The program maintains a log file and answers the three main questions the system administrator needs to know (a code sketch follows the list below): who, what, and when?

1. Who uses shared resources: from which PC and under which user account.

2. What users do with the shared files: whether they just view them, modify them, delete them, or copy them to their own PCs.

3. When employees access shared folders.
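
As an illustration, here is a minimal sketch (not any particular product's implementation) of how the "who" and "what" could be collected on a Windows file server with the standard NetSessionEnum and NetFileEnum calls from the Win32 Network Management API. The NULL server name (meaning the local machine) and the plain console output are assumptions; a real audit service would run this in a loop and append timestamped records to a log file - the "when".

/* Minimal "who and what" sketch for a share audit. Requires
 * administrative rights; build with MSVC and link netapi32.lib. */
#include <stdio.h>
#include <windows.h>
#include <lm.h>

#pragma comment(lib, "netapi32.lib")

int main(void)
{
    SESSION_INFO_10 *sess = NULL;
    FILE_INFO_3 *files = NULL;
    DWORD read = 0, total = 0, i;

    /* WHO: which client PCs are connected, and under which accounts.
       NULL server name = the local machine (an assumption here). */
    if (NetSessionEnum(NULL, NULL, NULL, 10, (LPBYTE *)&sess,
                       MAX_PREFERRED_LENGTH, &read, &total, NULL) == NERR_Success) {
        for (i = 0; i < read; i++)
            wprintf(L"session: \\\\%s as %s, active %u s\n",
                    sess[i].sesi10_cname, sess[i].sesi10_username,
                    sess[i].sesi10_time);
        NetApiBufferFree(sess);
    }

    /* WHAT: which shared files are open right now, and with which
       permissions (read/write). */
    if (NetFileEnum(NULL, NULL, NULL, 3, (LPBYTE *)&files,
                    MAX_PREFERRED_LENGTH, &read, &total, NULL) == NERR_Success) {
        for (i = 0; i < read; i++)
            wprintf(L"open file: %s by %s (perm 0x%x)\n",
                    files[i].fi3_pathname, files[i].fi3_username,
                    files[i].fi3_permissions);
        NetApiBufferFree(files);
    }
    return 0;
}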

On top of that, the system administrator can always disable access to particular shares when necessary, in a couple of mouse clicks.
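
The same thing can be done programmatically. The sketch below uses the Win32 NetShareDel call; the share name "Reports" is a hypothetical placeholder, and the equivalent command line is "net share Reports /delete". Note that this only stops sharing the folder - the files themselves stay on disk.

/* Hedged sketch: removing (unsharing) a folder with NetShareDel. */
#include <stdio.h>
#include <windows.h>
#include <lm.h>

#pragma comment(lib, "netapi32.lib")

int main(void)
{
    /* NULL = local server; administrative rights are required. */
    NET_API_STATUS rc = NetShareDel(NULL, L"Reports", 0);
    if (rc == NERR_Success)
        puts("Share removed; clients can no longer connect to it.");
    else
        printf("NetShareDel failed, error %lu\n", rc);
    return 0;
}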

An advanced network share access audit tool might have another advantage: it lets the system administrator configure notifications or alerts on connections to shared resources. He can receive e-mail, on-screen, or sound notifications when users connect to certain shared folders or files (a simple polling sketch follows the list below). Thus, the system administrator can always be aware when:

- Files that contain important information have been edited.

- Critical documents have disappeared.

- Information that has to be kept in a particular folder has been moved to another one.

- Company documents that contain private information have been copied to users' computers.
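
One simple way to implement such alerts, assuming the tool works by periodically sampling the session list, is to poll NetSessionEnum and report accounts it has not seen before. The sketch below prints a console message instead of sending e-mail or playing a sound; the five-second interval and the fixed-size table of seen users are assumptions made for brevity.

/* Simple polling sketch of connection alerts (an assumed design,
 * not any particular product's). Build with MSVC, link netapi32.lib. */
#include <stdio.h>
#include <wchar.h>
#include <windows.h>
#include <lm.h>

#pragma comment(lib, "netapi32.lib")

#define MAX_SEEN 256

static wchar_t seen[MAX_SEEN][UNLEN + 1];  /* users already reported */
static int nseen = 0;

/* Returns TRUE if this user was reported before; remembers new users. */
static BOOL already_seen(const wchar_t *user)
{
    int i;
    for (i = 0; i < nseen; i++)
        if (wcscmp(seen[i], user) == 0)
            return TRUE;
    if (nseen < MAX_SEEN)
        wcsncpy(seen[nseen++], user, UNLEN);
    return FALSE;
}

int main(void)
{
    for (;;) {
        SESSION_INFO_10 *s = NULL;
        DWORD read = 0, total = 0, i;
        if (NetSessionEnum(NULL, NULL, NULL, 10, (LPBYTE *)&s,
                           MAX_PREFERRED_LENGTH, &read, &total, NULL) == NERR_Success) {
            for (i = 0; i < read; i++)
                if (!already_seen(s[i].sesi10_username))
                    wprintf(L"ALERT: %s connected from \\\\%s\n",
                            s[i].sesi10_username, s[i].sesi10_cname);
            NetApiBufferFree(s);
        }
        Sleep(5000);  /* poll every 5 seconds */
    }
}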

Auditing all these aspects allows the system administrator to act instantly to protect information and prevent data leaks. In addition, he can start restoring deleted data from a backup copy immediately. All he needs to do is set up a correct audit of shared resources.

Thus, to prevent the unpleasant situations connected with the use of shared resources, the system administrator does not need to do as much as it may seem. He just has to organize proper monitoring of shared folders and documents.

More Stories By Dmitriy Stepanov

Dmitriy Stepanov is the CEO of 10-Strike Software, a developer of network inventory, network monitoring, and bandwidth monitoring software. The company has been offering high-quality networking products since 1999 and specializes in Windows network software for corporate users.
