
Throttling Your Database Using Rate Limits for SQL or REST

Need to use rate limiting to improve your database's quality of service? Our new database-agnostic Rate Limits API makes it easy.

When you plan to expose your database to new users or tenants, resource governance is one of the most important areas to consider. In production, there is a high probability that complex live queries for data visualization, or map-reduce jobs, will hit your analytical database and degrade the experience for other users. You can scale as you would with any web application by running a load balancer in front of your servers to distribute requests efficiently, but sooner or later a bad user will affect your quality of service (QoS). To give you an idea of how a single user can affect your service, here are a couple of abusive scenarios:

  • A naïve developer who hogs all the resources with an inefficiently written client request.
  • A low-priority user who hogs resources, causing service outages for high-priority users.
  • A malicious user who attacks your API endpoints, causing a denial of service for all other users.

It is not pragmatic to scale your system to accommodate genuine requests whenever QoS drops due to such abusive behavior. Rate limiting is one technique for dealing with this: it defines the number of requests, or the amount of data, that a client may request within an interval of time. It is an effective mitigation for the abusive scenarios above, and you will find rate limits on almost every SQL or REST API you interact with.
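To make the idea concrete, here is a minimal fixed-window rate limiter sketched in Python. This is purely illustrative of the general technique described above; it is not part of any product discussed in this post.

```python
import time

class FixedWindowLimiter:
    """Allow at most `limit` requests per `window` seconds (illustrative sketch)."""

    def __init__(self, limit, window):
        self.limit = limit
        self.window = window
        self.window_start = time.monotonic()
        self.count = 0

    def allow(self):
        now = time.monotonic()
        if now - self.window_start >= self.window:
            # The interval has elapsed: start a new window and reset the counter
            self.window_start = now
            self.count = 0
        if self.count < self.limit:
            self.count += 1
            return True
        return False

# Allow 3 requests per 60-second window; the 4th and 5th calls are rejected
limiter = FixedWindowLimiter(3, 60)
results = [limiter.allow() for _ in range(5)]
```

Production systems typically use more sophisticated variants (token bucket, sliding window) to avoid bursts at window boundaries, but the principle is the same.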

Database Agnostic Throttling

At Progress, we are deeply interested in open analytics strategies that leverage Data API Management to expose hosted databases to other users, customers and tenants for self-service analytics. We engineered DataDirect Cloud (a hosted service) and Hybrid Data Pipeline (a self-hosted service) to provision Data APIs on top of 30+ different data sources, including Oracle, SQL Server, Hadoop Hive, Postgres, MySQL, IBM DB2, Sybase, Greenplum and more—without having to configure the database itself. In this blog, I will walk through the rate limiting features for SQL or REST APIs generated through Hybrid Data Pipeline, which protect your hosted database resources.

Introducing Rate Limits API

With the recent release of Hybrid Data Pipeline, admins can now throttle Data APIs (ODBC, JDBC or OData) with fine granularity to improve the overall QoS. With the Rate Limits API, you can configure the following parameters:

  1. MaxFetchRows: Maximum number of rows that can be fetched per query
  2. PasswordLockoutInterval: The duration, in seconds, for counting the number of consecutive failed authentication attempts
  3. PasswordLockoutLimit: The number of consecutive failed authentication attempts that are allowed before locking the user account
  4. PasswordLockoutPeriod: The duration, in seconds, for which a user account will not be allowed to authenticate to the system when the PasswordLockoutLimit is reached
  5. CORSBehavior: Configuration parameter for CORS behavior. Setting the value to 0 disables the CORS filter; setting it to 1 enables the CORS filter.

Hybrid Data Pipeline lets you configure these parameters, wherever they apply, at three levels of granularity:

  1. System Level
  2. User Level
  3. Data Source Level

All of the parameters defined above can be set at the System level, while only the MaxFetchRows parameter applies at the User and Data Source levels.

System Level Rate Limit

System-level limits define the defaults for the entire Hybrid Data Pipeline service. At the system level, you can configure the following parameters:

  1. MaxFetchRows (id = 1)
  2. PasswordLockoutInterval (id = 2)
  3. PasswordLockoutLimit (id = 3)
  4. PasswordLockoutPeriod (id = 4)
  5. CORSBehavior (id = 5)

Let’s look at the Rate Limits API. To fetch all the system-level limits, all you need to do is send a GET request as shown below:

GET

http://<server>:<port>/api/admin/limits/system

RESPONSE

{
    "limits": [
        {
            "value": 1000,
            "id": 1
        },
        {
            "value": 100,
            "id": 2
        },
        {
            "value": 10,
            "id": 3
        },
        {
            "value": 100,
            "id": 4
        },
        {
            "value": 0,
            "id": 5
        }
    ]
}

To set a limit, you can send a POST request as shown below:

POST

http://<server>:<port>/api/admin/limits/system/<limit-id>

BODY

{
    "value": 100
}

 

RESPONSE

{
    "value": 100
}

 

As simple as that, an admin can set limits for any of the above parameters at the system level. You can also update and delete system limits; visit the documentation to learn how.
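Scripting this POST call is straightforward with Python's standard library. The sketch below builds the request shown above; the host, port, credentials, and the use of HTTP basic authentication are assumptions for illustration—substitute whatever applies to your own deployment.

```python
import base64
import json
import urllib.request

def build_set_limit_request(base_url, limit_id, value, user, password):
    """Build the POST request for /api/admin/limits/system/<limit-id>.

    base_url, user and password are placeholders for your own deployment.
    """
    url = f"{base_url}/api/admin/limits/system/{limit_id}"
    body = json.dumps({"value": value}).encode("utf-8")
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {credentials}",
        },
    )

# Example: cap MaxFetchRows (id = 1) at 100 rows per query
req = build_set_limit_request("http://localhost:8080", 1, 100, "d2cadmin", "secret")
# urllib.request.urlopen(req) would then send it to the server
```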

User Level Limits

At the user level, you can set a limit only on the number of rows that can be fetched by a single query. To fetch all users and the MaxFetchRows limit for each of them, execute a GET request as shown below:

GET

http://<server>:<port>/api/admin/limits/users

RESPONSE

{
    "userLimits": [
        {
            "limits": [
                {
                    "value": 10000000,
                    "id": 1
                }
            ],
            "userId": 1,
            "userName": "d2cadmin"
        },
        {
            "limits": [
                {
                    "value": 100,
                    "id": 1
                }
            ],
            "userId": 2,
            "userName": "d2cuser"
        }
    ]
}

To set a limit for a user, you can send a POST request as shown below:

POST

http://<server>:<port>/api/admin/limits/users/<user-id>/<limit-id>

BODY

{
    "value": 100
}

RESPONSE

{
    "value": 100
}

 

You can also update and delete the limits later. To learn how to do that, visit this documentation page.
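When auditing limits across many users, it can help to flatten a response like the one above into a simple per-user mapping. Here is a small sketch that parses the sample payload; the limit id of 1 for MaxFetchRows comes from the list earlier in this post.

```python
import json

# The sample response from the GET /api/admin/limits/users call above
response_text = """
{
  "userLimits": [
    {"limits": [{"value": 10000000, "id": 1}], "userId": 1, "userName": "d2cadmin"},
    {"limits": [{"value": 100, "id": 1}], "userId": 2, "userName": "d2cuser"}
  ]
}
"""

MAX_FETCH_ROWS = 1  # limit id for MaxFetchRows

def max_fetch_rows_by_user(payload):
    """Map each user name to its MaxFetchRows limit (None if unset)."""
    result = {}
    for user in payload["userLimits"]:
        values = [l["value"] for l in user["limits"] if l["id"] == MAX_FETCH_ROWS]
        result[user["userName"]] = values[0] if values else None
    return result

limits = max_fetch_rows_by_user(json.loads(response_text))
# limits now maps "d2cadmin" to 10000000 and "d2cuser" to 100
```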

Data Source Level Limits

Each user can have multiple data sources defined in their account, and you can set the limit for each data source individually. To get all the data sources and limits for a user, execute a GET request as shown below.

GET

http://<server>:<port>/api/admin/limits/users/<user-id>/datasources

RESPONSE

{
    "datasourceLimits": [
        {
            "limits": [],
            "dataSourceId": 1,
            "dataSourceName": "SQLServer",
            "isGroup": false
        }
    ]
}

To set a limit for a data source under a user, you can send a POST request as shown below.

POST

http://<server>:<port>/api/admin/limits/users/<user-id>/datasources/<datasource-id>/<limit-id>

BODY

{
    "value": 100
}

 

RESPONSE

{
    "value": 100
}

As with user-level limits, you can only set the MaxFetchRows limit at the data source level. You can also update and delete the limits later. To learn how to do that, visit this documentation page.
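The three levels share a common URL pattern, which can be captured in one small helper. The paths below mirror the endpoints shown throughout this post; base_url is a placeholder for your own server and port.

```python
def limit_endpoint(base_url, limit_id, user_id=None, datasource_id=None):
    """Build the Rate Limits endpoint for system, user, or data source level.

    - No user_id: system-level limit
    - user_id only: user-level limit
    - user_id and datasource_id: data-source-level limit
    """
    if user_id is None:
        return f"{base_url}/api/admin/limits/system/{limit_id}"
    if datasource_id is None:
        return f"{base_url}/api/admin/limits/users/{user_id}/{limit_id}"
    return (f"{base_url}/api/admin/limits/users/{user_id}"
            f"/datasources/{datasource_id}/{limit_id}")
```

For example, `limit_endpoint("http://host:8080", 1, user_id=2, datasource_id=1)` targets the MaxFetchRows limit of data source 1 under user 2.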

Data API Management

We hope this article gave you a glimpse into how Hybrid Data Pipeline can throttle the Data APIs you produce. In addition, you can now easily configure a load balancer for a cluster of Hybrid Data Pipeline servers, helping you distribute requests efficiently and improving QoS for the ODBC, JDBC and OData APIs you generate.

Want to learn more about Hybrid Data Pipeline features for Data APIs? Just click the link below.

Learn More about Hybrid Data Pipeline

More Stories By Progress Blog

Progress offers the leading platform for developing and deploying mission-critical, cognitive-first business applications powered by machine learning and predictive analytics.
