
Gathering LDAP Identity Data with Splunk Cloud

When organizations consider moving their Splunk implementation to the cloud, they tend to run into a couple of pain points. The first is LDAP authentication. We won't address that issue in this article, since there is currently no workaround other than opening a firewall rule between Splunk Cloud and one of your domain controllers. The second sticking point, which we will address, is pulling identity data into the Splunk App for Enterprise Security.

Typically, organizations want an automated way to pull identity data from their LDAP infrastructure. In the past, this has meant setting up that same firewall rule we discussed earlier and then configuring Splunk Support for Active Directory in the cloud. Recently, however, I came across a way to bypass that requirement and ship identity information up to the cloud instead.

Of course, you can always set up a Read-Only Domain Controller, which will put some of your Windows admins at ease. You can also use firewalls to restrict connections to that domain controller to a single IP on a single port. These are sound strategies, but they're usually a lot of extra work for minimal gain if all you need is a list of identities.

You can get the same level of identity information by using Splunk's summary indexing together with a Heavy Forwarder in your environment. This involves setting up Splunk Support for Active Directory locally, eliminating the need for any inbound connections to your domain controllers. All data is sent out to Splunk Cloud over the same port as the rest of your data.

Step 1: Create an index in Splunk Cloud

To create the index in Splunk Cloud:

  1. Log in to your Splunk Cloud installation
  2. Navigate to Settings > Indexes
  3. Create a new index named “summary_ldap”

Step 2: Create an index in your Heavy Forwarder

Next you’ll want to create an index on your Heavy Forwarder:

  1. Log in to your Heavy Forwarder Splunk installation
  2. Navigate to Settings > Indexes
  3. Create a new index named “summary_ldap”
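If you prefer to manage the Heavy Forwarder from the command line rather than SplunkWeb, a minimal indexes.conf stanza for the same index might look like the following. This is a sketch assuming a default Splunk installation; the bucket paths shown are simply the Splunk defaults.

```ini
# $SPLUNK_HOME/etc/system/local/indexes.conf
# Minimal definition for the summary index; paths assume Splunk defaults.
[summary_ldap]
homePath   = $SPLUNK_DB/summary_ldap/db
coldPath   = $SPLUNK_DB/summary_ldap/colddb
thawedPath = $SPLUNK_DB/summary_ldap/thaweddb
```

Restart Splunk on the Heavy Forwarder after editing indexes.conf so the new index is picked up.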

Step 3: Schedule a search to send LDAP data to Splunk Cloud

You’ll want to schedule a search on your Heavy Forwarder with the following settings:

Search Box:

| ldapsearch domain=yourdomain search="(&(objectclass=user)(!(objectClass=computer)))" attrs="sAMAccountName,personalTitle,displayName,givenName,sn,mail,telephoneNumber,mobile,manager,department,whenCreated,userAccountControl"
| makemv userAccountControl
| search userAccountControl="NORMAL_ACCOUNT"
| eval suffix=""
| eval priority="medium"
| eval category="normal"
| eval watchlist="false"
| eval endDate=""
| table sAMAccountName,personalTitle,displayName,givenName,sn,suffix,mail,telephoneNumber,mobile,manager,priority,department,category,watchlist,whenCreated,endDate
| rename sAMAccountName as identity, personalTitle as prefix, displayName as nick, givenName as first, sn as last, mail as email, telephoneNumber as phone, mobile as phone2, manager as managedBy, department as bunit, whenCreated as startDate

Schedule: Select a schedule appropriate for how often you’d like your identities to be updated. The key here is to make sure to select “Summary Indexing” and to choose the “summary_ldap” index you created earlier.
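If you manage the Heavy Forwarder's configuration files directly, the scheduled search with summary indexing enabled corresponds to a savedsearches.conf stanza along these lines. This is a sketch: the stanza name and cron schedule are examples, and the search line is the full ldapsearch query shown above.

```ini
# $SPLUNK_HOME/etc/apps/search/local/savedsearches.conf (app location is an example)
[LDAP Identity Summary]
# Example schedule: run once per day at 02:00
enableSched = 1
cron_schedule = 0 2 * * *
# Route the results into the summary index created earlier
action.summary_index = 1
action.summary_index._name = summary_ldap
search = | ldapsearch domain=yourdomain search="(&(objectclass=user)(!(objectClass=computer)))" ...
```

Whatever schedule you choose here should match the time window used later when reading the summary index back, so each run picks up exactly one batch of results.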

Step 4: Create the lookup table file in Splunk Cloud


The base lookup table file can be created by uploading a skeleton file through SplunkWeb. Make sure that the destination app is SplunkEnterpriseSecuritySuite and that the sharing permissions are set to Global (the object should appear in all apps).


The header of the CSV should be as follows:

identity,prefix,nick,first,last,suffix,email,phone,phone2,managedBy,priority,bunit,category,watchlist,startDate,endDate
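One quick way to produce that skeleton file before uploading it is to write out just the header row. The filename here is only an example; use whatever name you plan to reference in the lookup definition.

```shell
# Create a skeleton CSV containing only the required header row.
# "ldap_identities.csv" is an example filename, not a required name.
printf '%s\n' 'identity,prefix,nick,first,last,suffix,email,phone,phone2,managedBy,priority,bunit,category,watchlist,startDate,endDate' > ldap_identities.csv
```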


Step 5: Create the lookup table definition in Splunk Cloud
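The lookup table definition is created in SplunkWeb under Settings > Lookups > Lookup definitions, again with the destination app set to SplunkEnterpriseSecuritySuite and sharing set to Global. For reference, the equivalent transforms.conf stanza would look roughly like this; the definition name is an example, and the filename must match the file you uploaded in the previous step.

```ini
# etc/apps/SplunkEnterpriseSecuritySuite/local/transforms.conf
# Lookup definition pointing at the uploaded CSV file.
[ldap_identities]
filename = ldap_identities.csv
```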

Step 6: Create the lookup table and verify that it is updated on the filesystem in Splunk Cloud

Step 7: Save the search as a report

This assumes that the lookup table is being generated properly. The report can also be scheduled later to update the lookup table automatically.
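The report search itself would pull the summarized events back out of the summary index and write them to the lookup file. As a sketch, assuming the example lookup filename from earlier and a daily schedule on the Heavy Forwarder (hence the 24-hour window), it would look something like:

```spl
index=summary_ldap earliest=-24h
| table identity,prefix,nick,first,last,suffix,email,phone,phone2,managedBy,priority,bunit,category,watchlist,startDate,endDate
| outputlookup ldap_identities.csv
```

The earliest= value should match the schedule of the summary-indexing search so that each run returns exactly one batch of identities.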

Step 8: Configure Identities in ES: Navigate to App: Enterprise Security -> Configuration -> Identity Management

Step 9: Create the merged identity file

Before you create the merged identity file, wait approximately 5 minutes for Splunk to automatically detect the change in the identity configuration.

The following search will show you the status:  

index=_internal source=*python_modular_input.log *identit*

Step 10: Schedule the report you created earlier to run on a semi-regular basis

This assumes everything is working as expected. Ensure that you set the time frame so you don’t get duplicate results: if you scheduled the search on your Heavy Forwarder to run once per day, this search should look back only over the past 24 hours and no further. As a final note, don’t forget to disable the sample identities that are enabled by default in Enterprise Security.


More Stories By Hurricane Labs

Christina O’Neill has been working in the information security field for 3 years. She is a board member for the Northern Ohio InfraGard Members Alliance and a committee member for the Information Security Summit, a conference held once a year for information security and physical security professionals.
