New API Academy Team Member: Irakli Nadareishvili

The API Academy team has a new member: Irakli Nadareishvili, who has joined CA Layer 7 as Director of API Strategy. Before joining CA, Irakli served as Director of Engineering for Digital Media at NPR, an organization noted for its leadership in API-oriented platform design. He has also participated in the creation of the Public Media Platform, worked with whitehouse.gov and helped a number of major media companies develop publishing solutions using open source software.

I recently sat down with Irakli to discuss what he has in mind as he joins API Academy.

MM: You once told me that you believe the future of Big Data is “linked APIs.” That sounds intriguing. Tell me more about it.

IN: In most people’s minds, “Big Data” is synonymous with “very large data”. You may hear: “Google-large” or “Twitter-large” or “petabytes”. The Wikipedia definition of Big Data is slightly more elaborate:

“Big data is the term for a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications”.

In my work, I see the “complex” part of the definition becoming more important than the size. We have gotten pretty good at taming the large sizes of data. Tooling for horizontal partitioning and parallel processing of large data sets is now abundant. Still, most Big Data sets are contained and processed in the isolation of single organizations. This is bound to change very soon. The end of siloed Big Data is near: I believe that the next phase of Big Data challenges will have to do with data sets that cross organizational boundaries.

APIs will play a major role in this. Web APIs represent the most effective available technology that allows data to cross organizational boundaries. APIs efficiently connect and link data at a distance.

MM: Can you give an example of what you mean by “data sets that cross organizational boundaries”? And what challenges do these pose?

IN: You see, a lot of people have the notion that the data they need to process can be stored in a database maintained by a single organization. This notion is increasingly inaccurate. More and more, organizations are having to deal with highly-distributed data sets.

This can be very challenging. The infamous healthcare.gov is a good example of such a distributed system. The main technical challenge of implementing healthcare.gov’s backend was that it had to integrate with data in many existing systems.

The $500 million initial public fiasco of healthcare.gov is also a vivid indication of just how complex it is to build truly distributed systems. Practically the only successful implementation of such a large, distributed information system is the World Wide Web. There’s a lot we can learn from the architecture of the Web. It’s a battle-tested blueprint for building distributed systems at scale.

I believe the Big Data challenges of the future will be solved at the intersection of APIs with Web/hypermedia architecture, linked data and what we currently call Big Data tooling. I call this intersection “Linked APIs”, to differentiate it from the current, siloed state of most Web APIs.
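To make the "Linked APIs" idea concrete, here is a minimal sketch of what a hypermedia-style payload might look like, loosely in the spirit of formats such as HAL. Every URL, field name, and resource in this example is hypothetical; the point is only that related data sets live in other organizations' systems and are reached by following links rather than joined in one local database.

```python
# Hypothetical "linked" API resource: related data is referenced by links
# into other organizations' APIs, not embedded or stored locally.
story = {
    "id": "npr-2013-1138",
    "title": "Public Media Platform launches",
    "_links": {
        "self": {"href": "https://api.example-media.org/stories/npr-2013-1138"},
        # These targets belong to *other* organizations (hypothetical hosts):
        "author": {"href": "https://api.example-partner.org/people/42"},
        "transcript": {"href": "https://api.example-archive.org/texts/9001"},
    },
}

def linked_resources(resource):
    """Return the cross-organizational links a client could follow next."""
    return {
        rel: link["href"]
        for rel, link in resource.get("_links", {}).items()
        if rel != "self"
    }

print(linked_resources(story))
```

A client never needs to know in advance where the author or transcript data is stored; it discovers that at runtime by reading the links, which is what lets the data set span organizational boundaries.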

MM: What practical advice would you give to the developers of future Big Data APIs?

IN: I think the most important thing is that we need to stop falsely assuming that all API data is local. It is not. Despite the name, an API for a distributed system is not really a “programming interface” to local data and assets. Rather, it is a programmable data index: think of an API as a programmable search index over a distributed collection of data sets.
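The "programmable search index" framing above can be sketched in a few lines. In this toy example (all topic names, organizations, and URLs are invented for illustration), the API holds only pointers to where data lives and resolves a query to those pointers, leaving the actual fetching to the client:

```python
# Toy sketch: an API as a programmable index over distributed data.
# It stores *pointers* to remote sources, not the data itself.
# All sources and URLs below are hypothetical.
INDEX = [
    {"topic": "enrollment", "org": "state-exchange",
     "href": "https://api.state-exchange.example/enrollment"},
    {"topic": "enrollment", "org": "irs",
     "href": "https://api.irs.example/income-verification"},
    {"topic": "coverage", "org": "insurer",
     "href": "https://api.insurer.example/plans"},
]

def resolve(topic):
    """Like a search index: answer with *where* to fetch, not with the data."""
    return [entry["href"] for entry in INDEX if entry["topic"] == topic]

print(resolve("enrollment"))
```

The design choice is the same one a web search engine makes: the index stays small and authoritative about locations, while the heavy data remains with the organizations that own it.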

I don’t like to think of the term “API” as an abbreviation anymore. Maybe it was one a while ago but it has since evolved way past that. Much like IBM doesn’t think of itself as “International Business Machines” anymore, APIs aren’t merely “application programming interfaces”. Most of what IBM does these days isn’t even necessarily about “machines”. Likewise, most of what we need out of APIs isn’t about any single application or an interface to it.

MM: Big Data represents one important challenge for computing today. What about IoT?

IN: The Internet of Things is already here, in small ways. The IoT we have today consists of a vast number of Web-connected devices, acting as sensors, sending myriads of signals to the cloud. That, by the way, is what creates many Big Data challenges. The future is much more interesting, however. Once connected devices start engaging in peer-to-peer interactions, bypassing any central authority, we will enter a significantly different realm. The most important challenge in that world, from my perspective, will be identity. Identity is always key in distributed systems, but especially so in peer-to-peer networks.

MM: What excites you the most about your new role at Layer 7?

IN: Thank you for asking this question. I will start by telling you what terrifies me the most. The API Academy and Layer 7 teams represent a gathering of “scary” amounts of world-class brainpower and expertise in the API space. It is extremely humbling to be part of such a distinguished group.

Obviously, it also means that there is a lot of very fundamental thinking and innovation that happens here. Especially now that Layer 7 is part of CA Technologies, there’s really very little that we couldn’t accomplish if we put our minds to it. That feels extremely empowering. I really care about all things related to APIs and distributed systems, and the role they can play in the future of technology. I am super excited about the possibilities that lie ahead of us.

More Stories By Matt McLarty

Matt McLarty is focused on customer success, providing implementation best practices and architectural guidance to ensure clients receive the maximum benefit from Layer 7’s products. Matt brings over 15 years of technology leadership to Layer 7, with a particular focus on enterprise architecture, strategy and integration. Prior to joining Layer 7, Matt led the global IBM technical sales organization responsible for application integration software and solutions, notably helping to grow the SOA Gateway business substantially over a five-year period. Before joining IBM, Matt worked as a Senior Director and Enterprise Architect focused on SOA, electronic payments and identity management. Follow him on Twitter at @MattMcLartyBC.
