
OData FAQs: Why Should REST API Developers Use OData?

Why use OData? Who is adopting OData? In this quick FAQ, learn how OData relates to FHIR, IETF RFCs and security, and about its JSON support, batch requests and pagination.

In this blog, we compiled a set of FAQs on OData (the Standard for a REST API) based on our interactions with a diverse group of API developers across various events and meetups.

The exponential growth of SaaS applications has led to an explosion of REST APIs. As of today, there are almost 18,000 APIs registered on ProgrammableWeb, and research shows that around 40 new APIs are added every week. This means that developers today spend much of their time learning new APIs instead of building the application itself. To address this problem, Microsoft initiated the OData standard for building REST APIs.

OData (Open Data Protocol) defines a set of best practices for building and consuming RESTful APIs. OData helps you focus on your business logic while building RESTful APIs without having to worry about the various approaches to define request and response headers, status codes, HTTP methods, URL conventions, media types, payload formats, query options, etc. We are proud to serve on the OData Technical Committee (in fact, we were the first member of this committee) along with other technical giants such as CA and Citrix.

Most recently, I presented at a local meetup – TRI REST – to introduce the audience to OData. You can find my presentation here. This meetup was especially interesting because it helped me understand how developers evaluate a new standard like OData. We had a great discussion around this standard for REST. Here is a brief excerpt of that discussion:

  1. Why should I use OData?

    As APIs continue to explode, each organization exposes its own unique REST/SOAP/Bulk APIs for consuming its data. Some also come up with their own query languages, such as ROQL (Oracle Service Cloud) and SOQL (Salesforce). This makes it difficult for an enterprise and its development team to learn and code against all these different APIs.

    This is where OData is very useful. OData advocates a standard way of implementing REST APIs that allows for SQL-like querying capabilities over those APIs. OData is essentially SQL for the web, built on top of standard protocols and formats – HTTP, JSON and Atom – while leveraging the REST architectural style. Learn through code samples how OData can simplify your life in this tutorial blog: Marketo REST API vs Eloqua REST API vs OData
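As a sketch of that SQL-like querying, the snippet below builds an OData query URL from standard system query options ($filter, $select, $orderby, $top). The service root and property names are hypothetical, not a real endpoint.

```python
from urllib.parse import urlencode

# Hypothetical OData service root -- substitute your own endpoint.
SERVICE_ROOT = "https://services.example.com/odata/Customers"

# A SQL-like query expressed as OData system query options.
options = {
    "$filter": "Country eq 'USA' and Revenue gt 100000",  # WHERE clause
    "$select": "Name,Country,Revenue",                    # column projection
    "$orderby": "Revenue desc",                           # ORDER BY
    "$top": "10",                                         # LIMIT
}

# urlencode percent-encodes the reserved characters ($ becomes %24).
url = SERVICE_ROOT + "?" + urlencode(options)
print(url)
```

A GET on a URL like this asks the service for the ten highest-revenue US customers, returning only the three selected properties.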

  2. Which companies are adopting OData?

    Some of the developers were curious to know whether Microsoft was the only company pushing OData. They were surprised to learn that OData has been adopted by many technologies and companies, including SAP, IBM, Salesforce, Tableau, Databoom, Progress, Red Hat and Dell. The OData ecosystem lists some of its consumers and producers; we track a list of our own, but it's growing faster than we can keep up with.

  3. How is FHIR related to OData?

    FHIR, or the Fast Healthcare Interoperability Resources specification, is a standard for exchanging healthcare information electronically. To make FHIR truly interoperable, it is recommended that systems follow the rules of the OData specification for the $search parameter. Further, FHIR also uses OAuth to establish a trusted relationship with the client for an extra layer of security. Read more about this here.
  4. Is OData compliant with the Internet Standards?

    Yes, OData is an OASIS standard and has recently been ratified as an ISO/IEC standard. It also builds on a number of RFC standards from the IETF (Internet Engineering Task Force). Here are some of the RFCs it uses:

    RFC 2616 – Hypertext Transfer Protocol – HTTP/1.1

    RFC 5023 – The Atom Publishing Protocol

    RFC 2119 – Key Words for Use in RFCs to Indicate Requirement Levels

    RFC 5789 – PATCH Method for HTTP

    RFC 3629 – UTF-8, a Transformation Format of ISO 10646

    RFC 4627 – The application/json Media Type for JSON

    RFC 3986 – Uniform Resource Identifier (URI): Generic Syntax

    RFC 2046 – Multipurpose Internet Mail Extensions (MIME) Part Two: Media Types

  5. Is OData susceptible to SQL Injection or other security attacks?

    OData includes a query language that, like SQL, lets you query anything exposed by the model. As with SQL, if the application only wants to expose certain parts of the model, the application needs to enforce those restrictions itself.

    As for security attacks, this will depend on the implementation. I am not aware of any security flaws that are specific to the OData specification. Since OData is exposed as a REST API, the implementation must guard against security vulnerabilities like any other REST API.

    From a Progress DataDirect product perspective, our hybrid connectivity services follow the OWASP guidelines for protecting against known security vulnerabilities. DataDirect Cloud is also subject to routine security scans and penetration testing both by internal resources and independent external resources.
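One common way an implementation enforces such restrictions is to allow-list the query options it will honor before touching the data layer. The sketch below is illustrative, not part of any OData library; the option set and function name are assumptions.

```python
# Sketch: allow-list the system query options a service will honor,
# one way an implementation restricts what the model exposes.
ALLOWED_OPTIONS = {"$filter", "$select", "$top", "$skip", "$orderby"}

def validate_query(params):
    """Reject any system query option the service does not explicitly allow.

    Returns (ok, rejected_options).
    """
    rejected = [k for k in params
                if k.startswith("$") and k not in ALLOWED_OPTIONS]
    return (len(rejected) == 0, rejected)

# $expand is not allow-listed here, so this request would be refused.
ok, bad = validate_query({"$select": "Name", "$expand": "Orders"})
print(ok, bad)
```

Rejecting unexpected options up front, before translating a query into SQL with parameterized statements, is the same defense-in-depth posture any REST API needs.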
  6. How can I manage the JSON version according to the schema?

    The JSON that is returned from a query is defined by the model. If the model changes, the JSON in the response will change. In the OData 4.0 spec the CSDL syntax that is used to define the OData model does not have a way to assign a version to a model. The intent was that once an OData API was published at a given URL, its model would not change. If there was a change to the model, then a new (possibly versioned) URL would be provided.

    However, there were enough requests for versioning the model that a SchemaVersion annotation was added to the CSDL in the upcoming OData 4.0.1 specification. A specific version of the model can be requested with the SchemaVersion request header in OData 4.0.1.
  7. Can OData support batch requests like in an email?

    OData supports batch requests. Batch requests allow grouping multiple operations into a single HTTP request payload. A batch request is represented as a multipart MIME v1.0 message (RFC 2046), a standard format that allows multiple parts – each of which may have a different content type (as described in [OData-Atom] and [OData-JSON]) – within a single request.

    Batch requests are submitted as a single HTTP POST request to the batch endpoint of a service, located at the URL $batch relative to the service root. The batch request MUST contain a Content-Type header specifying a content type of multipart/mixed and a boundary specification as defined in RFC 2046.
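To make the multipart shape concrete, here is a minimal sketch of a batch body containing a single retrieve operation. The boundary value and the Customers entity set are illustrative; a real client would POST this body to the service's $batch endpoint.

```python
# Sketch of an OData $batch payload: a multipart/mixed body per RFC 2046.
boundary = "batch_36522ad7-fc75-4b56-8c71-56071383e77b"

# One retrieve operation, wrapped as an application/http part.
get_part = (
    "--" + boundary + "\r\n"
    "Content-Type: application/http\r\n"
    "Content-Transfer-Encoding: binary\r\n"
    "\r\n"
    "GET Customers(1) HTTP/1.1\r\n"
    "Accept: application/json\r\n"
    "\r\n"
)

# The closing delimiter (boundary followed by "--") ends the batch.
body = get_part + "--" + boundary + "--\r\n"

# POST the body to <service root>/$batch with this Content-Type header.
headers = {"Content-Type": "multipart/mixed; boundary=" + boundary}
```

Each additional operation becomes another part delimited by the same boundary, and the service answers with a matching multipart response, one part per operation.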
  8. What about pagination? Does pagination work for frequently changing content like Twitter?

    OData is designed as a set of conventions that can be layered on top of existing standards to provide common representations for common functionality. To aid client/server interoperability, the specification defines multiple levels of conformance for an OData service, as well as the minimal requirements for an OData client to be interoperable across OData services. For minimum conformance, an OData service must support server-driven paging. Beyond that, one can also apply client-driven paging through query options such as $orderby, $select, $skip, $top, $filter, $expand and $count ($inlinecount in earlier versions).

    This pagination is done on a per-query basis. Typically, if querying is supported on a streaming service like Twitter, the query is done for a particular time slice. If there is more data in that time slice, the data will be broken up into pages.
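With server-driven paging, the service returns a partial result plus a next-page link (@odata.nextLink in OData JSON) that the client follows until it is absent. The sketch below fakes the HTTP layer with canned responses; the Orders endpoint and page contents are hypothetical.

```python
# Sketch of client-side handling of server-driven paging:
# follow @odata.nextLink until the last page omits it.

def fetch(url):
    # Stand-in for an HTTP GET returning parsed JSON; a real client
    # would use an HTTP library here. Responses are canned for the demo.
    pages = {
        "https://services.example.com/odata/Orders?%24top=2": {
            "value": [{"id": 1}, {"id": 2}],
            "@odata.nextLink":
                "https://services.example.com/odata/Orders?%24top=2&%24skiptoken=2",
        },
        "https://services.example.com/odata/Orders?%24top=2&%24skiptoken=2": {
            "value": [{"id": 3}],  # last page: no @odata.nextLink
        },
    }
    return pages[url]

def all_items(url):
    """Accumulate every page of a paged OData result."""
    items = []
    while url:
        page = fetch(url)
        items.extend(page["value"])
        url = page.get("@odata.nextLink")  # None ends the loop
    return items

print(all_items("https://services.example.com/odata/Orders?%24top=2"))
```

The opaque $skiptoken in the next link is minted by the server, which is what lets paging stay consistent even when the underlying data is changing between requests.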
  9. Does OData support procedures? Can we perform JOINs across Federated databases?

    Yes, OData supports procedures. In RESTful APIs, there can be custom operations that contain complicated logic and are frequently used. For that purpose, OData supports defining functions and actions to represent such operations. These are themselves resources and can be bound to existing resources. Further, OData does not preclude federating data from multiple sources.
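As an illustration, function and action invocations are themselves just resource URLs. The service root, function name and action name below are hypothetical, not part of any real service.

```python
# Illustrative OData function/action URLs (all names are hypothetical).
service = "https://services.example.com/odata"

# Unbound function import, invoked with GET and inline parameters:
top_customers = service + "/GetTopCustomers(count=5)"

# Action bound to a single Order entity, invoked with POST
# (the namespace qualifier "Demo" disambiguates the bound name):
ship_order = service + "/Orders(42)/Demo.ShipOrder"

print(top_customers)
print(ship_order)
```

Functions are side-effect-free and composable into further queries; actions may have side effects, which is why they are invoked with POST.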

Getting Started with OData

Now that you have learned enough about OData, you can get started with it by using our hybrid connectivity services. You can dive a little deeper into OData with our quick guide, or check out our short tutorial for help getting started and playing with OData. And in case you have more questions around OData…

Talk to an OData Expert

More Stories By Progress Blog

Progress offers the leading platform for developing and deploying mission-critical, cognitive-first business applications powered by machine learning and predictive analytics.
