Throttling Databases Using Rate Limits for SQL or REST

Need to use rate limiting to improve your database's quality of service? Our new database-agnostic Rate Limits API makes it easy.

When you are planning to expose your database to new users or tenants, one of the important areas to consider is resource governance. In production, there is a high probability that complex live queries for data visualization, or map-reduce jobs, will hit your analytical database and degrade it for other users. As with any web application, you scale by running a load balancer in front of your servers to distribute requests efficiently. But even then, a production environment often has a bad user who hurts your quality of service (QoS). To give you an idea of how a bad user can affect your service, here are a couple of abusive scenarios:

  • A naïve developer whose inefficiently written client requests keep hogging all the resources.
  • A low-priority user who keeps hogging resources, causing service outages for high-priority users.
  • A malicious user who keeps attacking your API endpoints, causing a denial of service for all other users.

It is not pragmatic to scale up your system to accommodate genuine requests whenever such abusive behavior causes a drop in QoS. Rate limiting is one technique for dealing with this: in essence, it defines the number of requests, or the amount of data, that a client can request within an interval of time. It is an effective way to mitigate the abusive scenarios discussed above, and you will find rate limits on almost all the SQL and REST APIs you would want to interact with.
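
To make the idea concrete, here is a minimal sketch of a fixed-window rate limiter in Python. It is illustrative only; the class and parameter names are our own and are not part of any product API.

import time
from collections import defaultdict

class FixedWindowRateLimiter:
    """Allow at most max_requests per client in each window_seconds window."""

    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.counters = defaultdict(int)  # (client_id, window index) -> count

    def allow(self, client_id):
        # Bucket each request by the fixed window the current time falls into.
        window = int(time.time() // self.window_seconds)
        self.counters[(client_id, window)] += 1
        return self.counters[(client_id, window)] <= self.max_requests

# Example: at most 5 requests per second for each client.
limiter = FixedWindowRateLimiter(max_requests=5, window_seconds=1)
print([limiter.allow("tenant-42") for _ in range(7)])  # last two are False

A real implementation would also prune counters from old windows; the sketch keeps them forever for brevity.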

Database Agnostic Throttling

At Progress, we are deeply interested in open analytics strategies that leverage Data API management to expose hosted databases to other users, customers and tenants for self-service analytics. We engineered DataDirect Cloud (a hosted service) and Hybrid Data Pipeline (a self-hosted service) to provision Data APIs on top of 30+ different data sources, including Oracle, SQL Server, Hadoop Hive, Postgres, MySQL, IBM DB2, Sybase, Greenplum and more, without having to configure the database itself. In this post, I will walk through the rate limiting features for SQL or REST APIs generated through Hybrid Data Pipeline, which protect your hosted database resources.

Introducing Rate Limits API

With the recent release of Hybrid Data Pipeline, admins can now throttle Data APIs (ODBC, JDBC or OData) with fine granularity to improve the overall QoS. With the Rate Limits API, you can configure the following parameters:

  1. MaxFetchRows: Maximum number of rows that can be fetched per query
  2. PasswordLockoutInterval: The duration, in seconds, for counting the number of consecutive failed authentication attempts
  3. PasswordLockoutLimit: The number of consecutive failed authentication attempts that are allowed before locking the user account
  4. PasswordLockoutPeriod: The duration, in seconds, for which a user account will not be allowed to authenticate to the system when the PasswordLockoutLimit is reached (the three lockout parameters are illustrated in the sketch after this list)
  5. CORSBehavior: Controls cross-origin resource sharing (CORS) behavior. Setting the value to 0 disables the CORS filter; setting it to 1 enables it.
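
To see how the three password lockout parameters work together, here is an illustrative Python sketch of the kind of logic they describe. This is our own sketch, not Hybrid Data Pipeline's implementation.

import time

class LockoutTracker:
    """Sketch of the account-lockout behavior the three parameters imply."""

    def __init__(self, lockout_interval=100, lockout_limit=10, lockout_period=100):
        self.interval = lockout_interval  # PasswordLockoutInterval, in seconds
        self.limit = lockout_limit        # PasswordLockoutLimit, in attempts
        self.period = lockout_period      # PasswordLockoutPeriod, in seconds
        self.failures = {}                # user -> timestamps of recent failures
        self.locked_until = {}            # user -> time at which the lockout expires

    def is_locked(self, user):
        return time.time() < self.locked_until.get(user, 0.0)

    def record_failure(self, user):
        now = time.time()
        # Only failures within the last `interval` seconds count toward the limit.
        recent = [t for t in self.failures.get(user, []) if now - t < self.interval]
        recent.append(now)
        self.failures[user] = recent
        if len(recent) >= self.limit:
            # Limit reached: refuse authentication for the lockout period.
            self.locked_until[user] = now + self.period
            self.failures[user] = []

    def record_success(self, user):
        # A successful login resets the consecutive-failure count.
        self.failures[user] = []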

Hybrid Data Pipeline lets you configure these parameters, wherever they are applicable, at three levels of granularity:

  1. System Level
  2. User Level
  3. Data Source Level

All of the parameters defined above can be set at the System level, while only the MaxFetchRows parameter can be set at the User and Data Source levels.

System Level Rate Limit

System level limits apply to the Hybrid Data Pipeline service as a whole. At the system level, you can configure the following parameters:

  1. MaxFetchRows (id = 1)
  2. PasswordLockoutInterval (id = 2)
  3. PasswordLockoutLimit (id = 3)
  4. PasswordLockoutPeriod (id = 4)
  5. CORSBehavior (id = 5)

Let’s look at the Rate Limits API. To fetch all the system level limits, send a GET request as shown below:

GET

http://<server>:<port>/api/admin/limits/system

RESPONSE

{
    "limits": [
        {
            "value": 1000,
            "id": 1
        },
        {
            "value": 100,
            "id": 2
        },
        {
            "value": 10,
            "id": 3
        },
        {
            "value": 100,
            "id": 4
        },
        {
            "value": 0,
            "id": 5
        }
    ]
}

To set a limit, you can send a POST request as shown below:

POST

http://<server>:<port>/api/admin/limits/system/<limit-id>

BODY

{
    "value": 100
}


RESPONSE

{
    "value": 100
}
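
If you prefer to script these calls, here is a minimal Python sketch using the requests library. The host, port and credentials are placeholders, and HTTP basic authentication is an assumption here; check your deployment's security configuration for the scheme it actually uses.

import requests

BASE = "http://myserver:8080/api/admin/limits"  # placeholder host and port
AUTH = ("d2cadmin", "secret")                   # assumption: HTTP basic auth

# Fetch all system level limits (the GET request shown above).
resp = requests.get(BASE + "/system", auth=AUTH)
resp.raise_for_status()
for limit in resp.json()["limits"]:
    print("limit", limit["id"], "=", limit["value"])

# Set MaxFetchRows (id = 1) to 100 (the POST request shown above).
resp = requests.post(BASE + "/system/1", json={"value": 100}, auth=AUTH)
resp.raise_for_status()
print(resp.json())  # {'value': 100}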


As simple as that, an admin can set limits for all the above parameters at the System level. You can also update and delete the system limits; visit the documentation here to learn how.

User Level Limits

At the user level, you can set a limit only on the number of rows that can be fetched by a single query. To fetch all the users and the max row limit for each of them, execute the GET request shown below:

GET

http://<server>:<port>/api/admin/limits/users

RESPONSE

{
    "userLimits": [
        {
            "limits": [
                {
                    "value": 10000000,
                    "id": 1
                }
            ],
            "userId": 1,
            "userName": "d2cadmin"
        },
        {
            "limits": [
                {
                    "value": 100,
                    "id": 1
                }
            ],
            "userId": 2,
            "userName": "d2cuser"
        }
    ]
}

To set a limit for a user, you can send a POST request as shown below:

POST

http://<server>:<port>/api/admin/limits/users/<user-id>/<limit-id>

BODY

{
    "value": 100
}

RESPONSE

{
    "value": 100
}
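
Scripted, the same user level calls might look like the sketch below (the same placeholder host, credentials and basic-auth assumption as before; user id 2 matches the sample response above).

import requests

BASE = "http://myserver:8080/api/admin/limits"  # placeholder host and port
AUTH = ("d2cadmin", "secret")                   # assumption: HTTP basic auth

# List every user's current limits (the GET request shown above).
for user in requests.get(BASE + "/users", auth=AUTH).json()["userLimits"]:
    print(user["userName"], user["limits"])

# Cap MaxFetchRows (limit id 1) at 100 rows for the user with id 2.
resp = requests.post(BASE + "/users/2/1", json={"value": 100}, auth=AUTH)
resp.raise_for_status()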


You can also update and delete the limits later. To learn how to do that, visit this documentation page.

Data Source Level Limits

Each user can have multiple data sources defined in their account, and you can set the limit for each data source individually. To get all the data sources and limits for a user, execute a GET request as shown below.

GET

http://<server>:<port>/api/admin/limits/users/<user-id>/datasources

RESPONSE

{
    "datasourceLimits": [
        {
            "limits": [],
            "dataSourceId": 1,
            "dataSourceName": "SQLServer",
            "isGroup": false
        }
    ]
}

To set a limit for a data source under a user, you can send a POST request as shown below.

POST

http://<server>:<port>/api/admin/limits/users/<user-id>/datasources/<datasource-id>/<limit-id>

BODY

{
    "value": 100
}


RESPONSE

{
    "value": 100
}
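
A matching sketch for the data source level follows (same placeholders and basic-auth assumption; user id 2 and data source id 1 echo the sample responses above).

import requests

BASE = "http://myserver:8080/api/admin/limits"  # placeholder host and port
AUTH = ("d2cadmin", "secret")                   # assumption: HTTP basic auth

# List the data sources, and any limits on them, for the user with id 2.
print(requests.get(BASE + "/users/2/datasources", auth=AUTH).json())

# Cap MaxFetchRows (limit id 1) at 100 rows for data source id 1 under user id 2.
resp = requests.post(BASE + "/users/2/datasources/1/1", json={"value": 100}, auth=AUTH)
resp.raise_for_status()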

As with user level limits, only the MaxFetchRows limit can be set at the data source level. You can also update and delete the limits later. To learn how to do that, visit this documentation page.

Data API Management

We hope this article gave you a glimpse into how Hybrid Data Pipeline can throttle the Data APIs you produce. In addition, you can now easily configure a load balancer for a cluster of Hybrid Data Pipeline servers, helping you handle and distribute requests properly and improving the QoS of the ODBC, JDBC and OData APIs you generate with Hybrid Data Pipeline.

Want to learn more about Hybrid Data Pipeline features for Data APIs? Just click the link below.

Learn More about Hybrid Data Pipeline
