Application Architecture With Azure Service Fabric

Is Azure the dominant cloud-based development infrastructure of the future? There’s some good evidence to support that claim. At last year’s Dell World conference in Austin, TX, Microsoft CEO Satya Nadella announced on stage that there are only two horses in the contest for control of the cloud. “It’s a Seattle race,” Nadella said. “Amazon clearly is the leader, but we are number two. We have a huge run-rate. All up, our cloud business last time we talked about it was over $8 billion of run-rate.”

Normally, you could dismiss that as typical marketing speak, but market analysts tend to agree with him. Gartner’s Magic Quadrant for Cloud Infrastructure as a Service Report found that there are only two leaders in the space. AWS is ahead, but Microsoft Azure’s offerings are growing faster. Gartner concluded, “Microsoft Azure, in addition to Amazon Web Services, is showing strong legs for longevity in the cloud marketplace, with other vendors falling further to the rear and confined to more of a vendor-specific or niche role.”

The Rundown on Azure Service Fabric and Microservices

Service Fabric is the new middleware layer from Microsoft designed to help companies scale, deploy, and manage microservices. Service Fabric supports both stateless and stateful microservices. With stateful microservices, Service Fabric co-locates your application code and its state, reducing latency, and it automatically replicates that state in the background to improve the availability of your services.
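
To make that distinction concrete, here is a minimal, purely conceptual sketch in TypeScript (not the Service Fabric SDK): the stateless service pays a trip to an external store on every request, while the stateful service keeps its state alongside the code, assuming the platform replicates that state behind the scenes. The OrderStore interface and the class names are illustrative.

```typescript
// Conceptual contrast only (not the Service Fabric programming model).

interface OrderStore {
  get(id: string): Promise<number>; // returns an order total, for illustration
}

// Stateless: every call pays a network hop to an external database.
class StatelessOrderService {
  constructor(private readonly store: OrderStore) {}
  async orderTotal(id: string): Promise<number> {
    return this.store.get(id); // remote read on the hot path
  }
}

// Stateful: state lives in-process next to the code, so reads are local;
// a platform like Service Fabric is assumed to replicate this state in the
// background so it survives node failures.
class StatefulOrderService {
  private readonly totals = new Map<string, number>();
  recordOrder(id: string, total: number): void {
    this.totals.set(id, total); // in a real system this write would be replicated
  }
  async orderTotal(id: string): Promise<number> {
    return this.totals.get(id) ?? 0; // local, low-latency read
  }
}
```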

Azure Service Fabric improves the deployment process for customers embracing DevOps with features like rolling upgrades and automatic rollback during deployments.
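
The sketch below illustrates the general idea behind a monitored rolling upgrade with automatic rollback; it is not Service Fabric's actual upgrade API. The UpgradeTarget interface, its methods, and the upgrade-domain names are hypothetical stand-ins for whatever the platform provides.

```typescript
// Hypothetical rolling-upgrade loop: upgrade one group of nodes at a time,
// check health after each step, and roll everything back if a check fails.

interface UpgradeTarget {
  upgradeDomains(): string[];                              // e.g. ["UD0", "UD1", "UD2"]
  applyVersion(domain: string, version: string): Promise<void>;
  isHealthy(domain: string): Promise<boolean>;
  rollback(version: string): Promise<void>;
}

async function monitoredRollingUpgrade(
  cluster: UpgradeTarget,
  newVersion: string,
  previousVersion: string
): Promise<void> {
  for (const domain of cluster.upgradeDomains()) {
    await cluster.applyVersion(domain, newVersion);
    if (!(await cluster.isHealthy(domain))) {
      // Automatic rollback: restore the previous version everywhere.
      await cluster.rollback(previousVersion);
      throw new Error(`Upgrade to ${newVersion} failed its health check in ${domain}`);
    }
  }
}
```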

Empowering customers to deliver microservices using Azure Service Fabric is a key contributor to Microsoft's revenue growth, with Azure expanding 102 percent year over year.

Top enterprises betting on Azure services today include global chocolatier The Hershey Company, Amazon e-commerce competitor Jet.com, digital textbook builder Pearson, GE Healthcare, and broadcaster NBC Universal. Azure is an optimized multi-platform cloud that can power solutions running on Windows and Linux using .NET, Node.js, and a host of other runtimes, making it easier for customers deploying applications that scale with microservices to adopt it regardless of language or underlying OS.

Why Microsoft Chose Microservices Over Monolithic

When Microsoft started running cloud-scale services such as Bing and Cortana, it ran into several challenges with designing, developing, and deploying apps at cloud scale. These services were always on and in high demand, and they required frequent updates with zero downtime. The microservices architecture made much more sense than a traditional monolithic approach.

Microsoft’s Mark Fussell defined the problem with monolithic: “During the client-server era, we tended to focus on building tiered applications by using specific technologies in each tier. The term ‘monolithic application’ has emerged for these approaches. The interfaces tended to be between the tiers, and a more tightly coupled design was used between components within each tier. Developers designed and factored classes that were compiled into libraries and linked together into a few executables and DLLs.”

There were certainly benefits to that methodology at the time in terms of simplicity and faster calls between components using inter-process communication (IPC). Everybody is on one team testing a single piece of software, so it's easier to coordinate tasks and collaborate without having to explain what each person is working on at any given moment.

Azure and Microservices

The monolithic approach started to fail when the app ecosphere turbocharged the speed of user expectations. If you want to scale a monolithic app, you have to clone it onto multiple servers or virtual machines (or containers, but that's another story). In short, there was no easy way to break out and scale individual components rapidly enough to satisfy the business needs of enterprise-level app customers. The entire development cycle was tightly interconnected by dependencies and divided into functional layers, such as web, business, and data. If you wanted to ship a quick upgrade or fix, you had to wait until testing was finished on the earlier work. Monoliths and agility didn't mix.

The microservices approach is to organize a development project based on independent business functionalities. Each can scale up or down at its own rate. Each service is its own unique instance that can be deployed, tested, and managed across all the virtual machines. This aligns more closely with the way that business actually works in the world of no latency and rapid traffic spikes.
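
As a toy illustration of what independent scaling buys you (the service names and replica counts below are made up), a monolith can only be cloned whole, while each microservice declares its own instance count:

```typescript
// Scaling a monolith: every copy carries the web, business, and data layers.
const monolithClones = 8; // eight full clones, even if only checkout is busy

// Scaling microservices: only the hot business function gets more instances.
const microserviceReplicas: Record<string, number> = {
  catalog: 2,
  checkout: 12, // a traffic spike here scales only this service
  invoicing: 1,
};

const totalInstances = Object.values(microserviceReplicas).reduce((a, b) => a + b, 0);
console.log(`monolith clones: ${monolithClones}, microservice instances: ${totalInstances}`);
```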

In reality, many development teams start with the monolithic approach and then break it up into microservices based on which functional areas need to be changed, upgraded, or scaled independently. Today, the DevOps teams responsible for microservices projects tend to be highly cost-effective but insular, and the APIs and communication channels to other microservices can suffer without strong leadership and foresight.

How Azure Service Fabric Helps

Azure Service Fabric is a distributed systems platform that assigns each microservice, whether stateless or stateful, a unique name. Service Fabric streamlines the management, packaging, and deployment of microservices, so DevOps teams and admins can forget about infrastructure complexities and get down to implementing workloads. Microsoft defined Azure Service Fabric as “the next-generation middleware platform for building and managing these enterprise-class, tier-1, cloud-scale applications.”
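
Here is a conceptual sketch of what addressing services by name looks like; it stands in for the platform's naming service and is not the Service Fabric client API. Only the "fabric:/Application/Service" URI form is borrowed from Service Fabric; the NamingService class, endpoints, and service names are invented for illustration.

```typescript
// A toy name-to-endpoint registry: callers resolve a logical service name
// and never need to know which node an instance landed on.
class NamingService {
  private readonly endpoints = new Map<string, string[]>();

  register(serviceName: string, endpoint: string): void {
    const list = this.endpoints.get(serviceName) ?? [];
    list.push(endpoint);
    this.endpoints.set(serviceName, list);
  }

  resolve(serviceName: string): string {
    const list = this.endpoints.get(serviceName);
    if (!list || list.length === 0) throw new Error(`unknown service: ${serviceName}`);
    return list[Math.floor(Math.random() * list.length)]; // naive load spreading
  }
}

const naming = new NamingService();
naming.register("fabric:/Shop/Checkout", "http://10.0.0.4:8080");
naming.register("fabric:/Shop/Checkout", "http://10.0.0.7:8080");
console.log(naming.resolve("fabric:/Shop/Checkout"));
```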

Azure Service Fabric is behind services like Azure SQL Database, Azure DocumentDB, Cortana, Microsoft Power BI, Microsoft Intune, Azure Event Hubs, Azure IoT Hub, and Skype for Business. You can create a wide variety of cloud-native services that can immediately scale across thousands of virtual machines. Service Fabric is flexible enough to run on Azure, on your own bare-metal on-premises servers, or on any third-party cloud. More important, especially if you're an open-source house, Service Fabric can also deploy services as processes or in containers.

Azure Container Services

Open-source developers can use Azure Container Service for Docker container orchestration and scaling operations. You're free to work with Mesos-based DC/OS, Kubernetes, or Docker Swarm and Compose, and Azure will optimize the configuration for .NET and Azure. The containers and your app configuration are fully portable. You can modify the size, the number of hosts, and which orchestrator tools you want to use, and then leave the rest to Azure Container Service.

The most popular development tools and frameworks are compatible because Azure Container Service exposes standard API endpoints for each orchestration engine. That opens the door for all of the most common visualizers, monitoring platforms, continuous integration tools, and whatever the future brings. For .NET developers, or those who have worked with the Visual Studio IDE, the Azure interface presents a familiar user experience. Developers can use Azure and a cross-platform fork of .NET known as .NET Core to create open-source ASP.NET applications that run on Linux, Windows, or even Mac.

Taking on New Challenges With Service Fabric

Microsoft’s role as a hybrid cloud expert gives Azure an edge over virtual-only competitors like AWS and Google Cloud. Azure’s infrastructure comprises hundreds of thousands of servers, content distribution networks, edge computing nodes, and fiber-optic networks. Azure is built and managed by a team of experts working around the clock to support services for millions of businesses all over the planet.

Developers experienced with microservices have found it valuable to architect around the concept of smart endpoints and dumb pipes. In this approach, the services in a microservices application function independently, staying as decoupled from one another and as internally cohesive as possible. Each should receive requests, act on its own domain logic, and then send off a response. Microservices can then be choreographed using RESTful protocols, as detailed by James Lewis and Martin Fowler in their microservices guide from 2014.
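
A minimal "smart endpoint, dumb pipe" sketch, assuming Node.js 18+ (for the global fetch) and an entirely hypothetical inventory service reachable at INVENTORY_URL: the pricing endpoint owns its domain logic and talks to its peer over a plain REST call, with nothing clever in the pipe between them.

```typescript
import * as http from "http";

// Assumed peer service; in a real deployment this would come from service discovery.
const INVENTORY_URL = process.env.INVENTORY_URL ?? "http://localhost:4000";

http
  .createServer(async (req, res) => {
    const url = req.url ?? "";
    if (req.method === "GET" && url.startsWith("/price/")) {
      const sku = url.split("/")[2];
      // Domain logic lives here, in the endpoint...
      let price = 19.99;
      try {
        // ...and the pipe is just a plain REST call to a peer service.
        const reply = await fetch(`${INVENTORY_URL}/stock/${sku}`);
        const { inStock } = (await reply.json()) as { inStock: number };
        if (inStock < 10) price *= 1.1; // simple surcharge when stock runs low
      } catch {
        // Peer unavailable: fall back to the base price rather than failing.
      }
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify({ sku, price: Number(price.toFixed(2)) }));
      return;
    }
    res.writeHead(404).end();
  })
  .listen(3000);
```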

If you’re dealing with workloads that have unpredictable bursting, you want an infrastructure that’s reliable and secure while knowing that the data centers are environmentally sustainable. Azure lets you instantly generate a virtual machine with 32TB of storage driving more than 50,000 IOPS. Then, your team can tap into data centers with hundreds of thousands of CPU cores to solve seemingly impossible computational problems.

AppDynamics for Azure

In the end, the user evaluates the app as a single experience. You need application monitoring that makes sure all the microservices are working together seamlessly and with no downtime. The AppDynamics App iQ platform is built to handle the flood of data coming from .NET and Azure applications. You can monitor all of the .NET performance data from inside Azure, as well as frameworks and runtimes like WebAPI, OWIN, MVC, and ASP.NET Core on the full framework, by deploying AppDynamics agents in Azure websites, worker roles, Service Fabric, and containers. In addition, you can monitor the performance of queues and storage for services like Azure SQL Server and Service Bus. This provides end-to-end visibility into your production services running in the cloud.

The asynchronous nature of microservices makes it nearly impossible to track down the root cause of a failure as it cascades through services unless you have solid monitoring in place. With AppDynamics, you can visualize the path of every single interaction end to end, from its origination through all of the downstream service calls. Otherwise, you'll get lost in the complexity of microservices and lose the benefits of building on the Azure infrastructure.
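
One simple mechanism behind that end-to-end view is correlation-ID propagation: tag each request and pass the tag along on every downstream call so a failure deep in the chain can be traced back to the interaction that started it. The sketch below shows the idea in plain TypeScript on Node.js 18+; the header name, URL, and service names are assumptions for illustration, not an AppDynamics-specific API.

```typescript
import { randomUUID } from "crypto";

const CORRELATION_HEADER = "x-correlation-id"; // assumed header name

async function callDownstream(
  url: string,
  incomingHeaders: Record<string, string | undefined>
): Promise<void> {
  // Reuse the caller's ID if one arrived, otherwise start a new trace.
  const correlationId = incomingHeaders[CORRELATION_HEADER] ?? randomUUID();
  console.log(`[trace ${correlationId}] calling ${url}`);
  const reply = await fetch(url, { headers: { [CORRELATION_HEADER]: correlationId } });
  console.log(`[trace ${correlationId}] ${url} responded ${reply.status}`);
}

// Example: forwarding a trace from an (assumed) orders service to billing.
callDownstream("http://localhost:5000/billing/charge", {}).catch((err) =>
  console.error("downstream call failed:", err)
);
```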

While we see many developers in the Microsoft space attracted to Azure, AppDynamics recognizes that Azure is a cross-platform solution supporting both Windows and Linux. In addition to the .NET runtimes, AppDynamics provides a rich set of monitoring capabilities for many of the modern technologies used in the Azure cloud, including Node.js, PHP, Python, and Java applications.

Learn more

Learn more about our .NET monitoring solution or take our free trial today.
