
Elasticsearch security: Authentication, Encryption, Backup

The recent ransom attacks on public Elasticsearch instances showed that Elasticsearch security is still a hot topic. Elasticsearch was not the only target: over the past week tens of thousands of poorly configured MongoDB databases were compromised as well – more than 27,000 servers where hackers stole and then deleted data from unpatched or poorly configured systems. The scenario is always the same: insecure instances are “hacked” and the data replaced with a note telling the owner to send payment to a Bitcoin address and then email the attacker to retrieve the data. Over the last few days we saw more than 4,000 Elasticsearch instances compromised, and the number is still growing. The attacks are rather simple: the attacker scans for services on port 9200 and, once such a service is found, fetches its data, deletes it, and stores the payment instructions as a document in the emptied Elasticsearch index. Because many Elasticsearch instances are not protected at all, they are very easy targets.

In this post, we are going to show not just what you should or shouldn’t do with Elasticsearch, but how to actually secure it, by sharing a few simple and free prevention methods.

Let’s start with the general attack ⇒ counter-measures:

  • Port scanning ⇒ minimize exposure:
    • Don’t use the default port 9200
    • Don’t expose Elasticsearch to the public Internet (put Elasticsearch behind a firewall)
    • Don’t forget the Kibana port
  • Data theft ⇒ secure access:
    • Lock down the HTTP API with authentication
    • Encrypt communication with SSL/TLS
  • Data deletion ⇒ set up backup:
    • Backup your data
  • Log file manipulation ⇒ log auditing and alerting:
    • Hackers might manipulate or delete system log files to cover their tracks. Sending logs to a remote destination increases the chances of discovering intrusion early.

Let’s drill into each of the above items with step-by-step actions to secure Elasticsearch:

Lock Down Open Ports

Firewall: Close the public ports

The first action should be to close the relevant ports to the Internet (in the examples below, eth0 is assumed to be the public-facing interface and {PUBLIC-IP-ADDRESS-HERE} stands for the server’s public IP address):

iptables -A INPUT -i eth0 -p tcp -d {PUBLIC-IP-ADDRESS-HERE} --destination-port 9200 -j DROP

iptables -A INPUT -i eth0 -p tcp -d {PUBLIC-IP-ADDRESS-HERE} --destination-port 9300 -j DROP

If you run Kibana, note that the Kibana server acts as a proxy to Elasticsearch and thus needs its port closed as well:

iptables -A INPUT -i eth0 -p tcp -d {PUBLIC-IP-ADDRESS-HERE} --destination-port 5601 -j DROP
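To verify that the rules are in place, you can list the INPUT chain and filter for the relevant ports (this assumes the iptables rules above and requires root privileges):

```shell
# list INPUT chain rules with packet counters and
# show only the Elasticsearch and Kibana ports
sudo iptables -L INPUT -n -v | grep -E '9200|9300|5601'
```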

After this you can relax a bit: Elasticsearch won’t be reachable from the Internet anymore.

Bind Elasticsearch ports only to private IP addresses

Change the configuration in elasticsearch.yml to bind only to private IP addresses or, for single-node instances, to the loopback interface:
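A minimal sketch of the relevant elasticsearch.yml setting (the private address 10.0.0.4 below is just a placeholder for illustration):

```yaml
# elasticsearch.yml
# single-node instance: bind only to the loopback interface
network.host: 127.0.0.1

# or, for a cluster, bind to a private interface address, e.g.:
# network.host: 10.0.0.4
```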


Add private networking between Elasticsearch and client services

If you need access to Elasticsearch from another machine, connect the two via VPN or any other private network. A quick way to establish a secure tunnel between two machines is an SSH tunnel:

ssh -Nf -L 9200:localhost:9200 [email protected]

You can then access Elasticsearch through the SSH tunnel from client machines, e.g.

curl http://localhost:9200/_search

Authentication and SSL/TLS with Nginx

There are several open-source and free solutions that provide Elasticsearch access authentication, but if you want something quick and simple, here is how to do it yourself with just Nginx:

Generate password file

printf "esuser:$(openssl passwd -crypt MySecret)\n" > /etc/nginx/passwords

Generate self-signed SSL certificates, if you don’t have official certificates:

sudo mkdir /etc/nginx/ssl

sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -keyout /etc/nginx/ssl/nginx.key -out /etc/nginx/ssl/nginx.crt

Add the proxy configuration with SSL and basic authentication to /etc/nginx/nginx.conf (the SSL certificate and key files are expected in /etc/nginx/ssl/). Example:

# define proxy upstream to Elasticsearch via the loopback interface
http {
  upstream elasticsearch {
    server 127.0.0.1:9200;
  }

  server {
    # enable TLS
    listen 443 ssl;
    ssl_certificate /etc/nginx/ssl/nginx.crt;
    ssl_certificate_key /etc/nginx/ssl/nginx.key;
    ssl_protocols TLSv1.2;
    ssl_prefer_server_ciphers on;
    ssl_session_timeout 5m;
    ssl_ciphers "HIGH:!aNULL:!MD5:!3DES";

    # Proxy for Elasticsearch
    location / {
      auth_basic "Login";
      auth_basic_user_file /etc/nginx/passwords;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header Host $http_host;
      proxy_set_header X-NginX-Proxy true;
      # use the upstream defined above with the name "elasticsearch"
      proxy_pass http://elasticsearch/;
      proxy_redirect off;
      if ($request_method = OPTIONS) {
        add_header Access-Control-Allow-Origin "*";
        add_header Access-Control-Allow-Methods "GET, POST, PUT, OPTIONS";
        add_header Access-Control-Allow-Headers "Content-Type,Accept,Authorization,x-requested-with";
        add_header Access-Control-Allow-Credentials "true";
        add_header Content-Length 0;
        add_header Content-Type application/json;
        return 200;
      }
    }
  }
}
Restart Nginx and try to access Elasticsearch via https://localhost/_search.
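For example, you can check access with curl (the -k flag skips certificate verification and is only needed for self-signed certificates; esuser/MySecret are the credentials created earlier):

```shell
# should return 401 Unauthorized without credentials ...
curl -k https://localhost/_search
# ... and search results with them
curl -k -u esuser:MySecret https://localhost/_search
```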

Free Security Plugins for Elasticsearch

Alternatively, you could install and configure one of the several free security plugins for Elasticsearch to enable authentication:

  • The HTTP Authentication plugin for Elasticsearch is available on GitHub. It provides basic HTTP authentication as well as an IP ACL.
  • SearchGuard is a free security plugin for Elasticsearch that includes role-based access control, document-level security, and SSL/TLS-encrypted node-to-node communication. Additional enterprise features like LDAP authentication or JSON Web Token authentication are available and licensed per Elasticsearch cluster. Note that SearchGuard support is also included in some Sematext Elasticsearch Support Subscriptions.

Auditing & Alerting

As with any type of system holding sensitive data, you have to monitor it very closely. This means not only monitoring its various metrics (whose sudden changes could be an early sign of trouble), but also watching its logs. Concretely, in the recent Elasticsearch attacks, anyone with alert rules that trigger when the number of documents in an index suddenly drops would have been notified immediately that something was going on. A number of monitoring vendors have Elasticsearch support, including Sematext (see Elasticsearch monitoring).

Logs should be collected and shipped to a log management service in real time, where alerting needs to be set up to watch for anomalous or suspicious activity, among other things. The log management service can be on premises or a 3rd-party SaaS like Logsene. Shipping logs off site has the advantage of preventing attackers from covering their tracks by changing the logs: once logs are off site, attackers won’t be able to get to them. Alerting on metrics and logs means you will become aware of a security compromise early and can take appropriate action to, hopefully, prevent further damage.

Backup and restore data

A very handy tool to backup/restore or re-index data based on Elasticsearch queries is Elasticdump.

To backup complete indices, the Elasticsearch snapshot API is the right tool. The snapshot API provides operations to create and restore snapshots of whole indices, stored in files, or in Amazon S3 buckets.

Let’s have a look at a few examples for Elasticdump and snapshot backups and recovery.

  1. Install elasticdump with the node package manager
    npm i elasticdump -g
  2. Backup by query to a zip file:
    elasticdump --input='http://username:[email protected]:9200/myindex' --searchBody '{"query" : {"range" :{"timestamp" : {"lte": 1483228800000}}}}' --output=$ --limit=1000 | gzip > /backups/myindex.gz
  3. Restore from a zip file:
    zcat /backups/myindex.gz | elasticdump --input=$ --output=http://username:[email protected]:9200/index_name
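Backups like these are typically scheduled. A hypothetical cron entry (added via crontab -e) running a nightly dump could look like the following; note that % must be escaped in crontab entries:

```shell
# nightly at 02:30: dump the index to stdout and keep one gzipped file per day
30 2 * * * elasticdump --input='http://username:[email protected]:9200/myindex' --output=$ --limit=1000 | gzip > /backups/myindex-$(date +\%F).gz
```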

Examples of backing up and restoring data with snapshots to Amazon S3 or files

First configure the snapshot destination:

1) S3 example

 curl 'localhost:9200/_snapshot/my_repository?pretty' -XPUT -d '{
   "type" : "s3",
   "settings" : {
     "bucket" : "test-bucket",
     "base_path" : "backup-2017-01",
     "max_restore_bytes_per_sec" : "1gb",
     "max_snapshot_bytes_per_sec" : "1gb",
     "compress" : "true",
     "access_key" : "<ACCESS_KEY_HERE>",
     "secret_key" : "<SECRET_KEY_HERE>"
   }
 }'

2) Local disk or mounted NFS example

 curl 'localhost:9200/_snapshot/my_repository?pretty' -XPUT -d '{
   "type" : "fs",
   "settings" : {
     "location" : "<PATH … for example /mnt/storage/backup>"
   }
 }'

3) Trigger snapshot

 curl -XPUT 'localhost:9200/_snapshot/my_repository/<snapshot_name>'

4) Show all backups

 curl 'localhost:9200/_snapshot/my_repository/_all'

5) Restore – the most important part of a backup is verifying that restore actually works!

curl -XPOST 'localhost:9200/_snapshot/my_repository/<snapshot_name>/_restore'
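To verify, you can check the snapshot’s state and compare document counts after the restore (index_name is a placeholder for the restored index):

```shell
# show snapshot status (state should be SUCCESS)
curl 'localhost:9200/_snapshot/my_repository/<snapshot_name>/_status?pretty'
# compare the restored index's document count with the original
curl 'localhost:9200/index_name/_count?pretty'
```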

What about Hosted Elasticsearch?

There are several hosted Elasticsearch services, with Logsene being a great alternative for time series data like logs.   Each hosted Elasticsearch service is a little different.  The list below shows a few relevant aspects of Logsene:

  • The Logsene API is compatible with the Elasticsearch API, with a few security-related exceptions
  • Logsene does not expose management APIs like index listing or global search via /_search
  • Logsene blocks scripting and index deletion operations
  • Logsene users can define and revoke access tokens for read and write access
  • Logsene provides role based access control and SSL/TLS
  • Logsene creates daily snapshots for all customers and stores them securely
  • Logsene supports raw data archiving to Amazon S3

If you have any questions, feel free to contact us. We provide professional Elasticsearch support, training, and consulting.

Or, if you are interested in trying Logsene, our hassle-free managed ELK stack that you don’t need to maintain and scale yourself, here is a short path to a free trial:



More Stories By Sematext Blog

Sematext is a globally distributed organization that builds innovative Cloud and On Premises solutions for performance monitoring, alerting and anomaly detection (SPM), log management and analytics (Logsene), and search analytics (SSA). We also provide Search and Big Data consulting services and offer 24/7 production support for Solr and Elasticsearch.
