Process-Driven Architectures for WebSphere Application Server

As the industry moves toward a new, process-based application model, it is important to understand the terms and standards involved.

There have been many definitions of business process management (BPM) bandied about, but the one that is the easiest to understand and that most accurately reflects BPM is the following:

Business Process Management is the representation and enactment of business interactions that involve multiple steps, possibly over time, that can occur in parallel across multiple systems or people.

Within the BPM market itself there are different terms that tend to be referenced in BPM literature:

  • Business process: A business process is analogous to an end-to-end use case requirement within a business. It is the definition of the activities involved in fulfilling a business requirement regardless of the people, architecture, or systems involved.
  • Business process modeling: Business process modeling tools provide a graphical way to map out the steps required to complete a business process as described earlier. Note that business process modeling vendors do not necessarily provide an executable engine in which to run the process.
  • Executable business processes: Executable business processes are models that can be loaded and executed by a process engine. Process engines execute processes that are long running, as well as straight-through processing (STP)-type processes. The process engine is able to handle the services, integration, and mediation necessary to complete each activity step that forms the sum of the process. And, most leading process engines are certified to run within WebSphere Application Server.
  • Collaborative business processes: Collaborative business processes specify how two concurrent executable business processes interact at the business level. Examples of platforms and frameworks that support collaborative business processes include Microsoft BizTalk, RosettaNet, Ariba, and Commerce One.
  • Business process activities: A business process activity represents an interaction between users (worklists), systems, autonomous agents (EJBs), or possibly Web services. They are triggered by events.
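As a rough illustration of the "executable business processes" and "activities" terms above, the sketch below shows a process engine stepping through activity steps in order. All class and activity names here are illustrative, not any vendor's API; real engines certified for WebSphere Application Server add persistence, transactions, and worklists.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Minimal sketch of a process engine executing activity steps in sequence.
public class MiniProcessEngine {
    // An activity takes the process context (a plain string here) and returns it updated.
    private final List<Function<String, String>> activities = new ArrayList<>();

    public MiniProcessEngine addActivity(Function<String, String> activity) {
        activities.add(activity);
        return this;
    }

    // Walk the process activity by activity, threading the context through.
    public String run(String context) {
        String state = context;
        for (Function<String, String> activity : activities) {
            state = activity.apply(state);
        }
        return state;
    }

    public static void main(String[] args) {
        String result = new MiniProcessEngine()
            .addActivity(order -> order + " -> credit-checked")
            .addActivity(order -> order + " -> shipped")
            .run("order#1");
        System.out.println(result); // order#1 -> credit-checked -> shipped
    }
}
```

In a production engine each activity would be a worklist item, an EJB call, or a Web service invocation rather than an in-memory function, but the stepping model is the same.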

    Why Companies Are Embracing a Process-Based Architecture
    The industry is in the process of moving to a new application model, one that is fundamentally different from the one we're familiar with.

    Today, an "application" is a discrete piece of software that a user opens on his or her computer in order to perform a specific task. In the evolving model, the activities that users click on will call on an underlying layer of interconnected services running within WebSphere Application Server: an infrastructure layer that transparently provides the software automation needed to perform business processes as required. This infrastructure change is driven by:
    1.   The need to change applications rapidly to meet evolving requirements. This is not new, but has been exacerbated by the Internet and the real desire to have more loosely coupled external trading collaborations.
    2.   The need to integrate existing applications, data, and processes to automate value chains. This is in essence what the industry is describing when referring to "process driven architectures" - i.e., integrate rather than throw away. Corporations are driving this need to integrate in order to reduce total cost of ownership (TCO) and increase return on existing investments (ROI).

    The emergence of a reliable, shared, standardized infrastructure based on the J2EE standard, such as WebSphere Application Server, and a service-oriented architecture (SOA) has been a predominant force in enabling this application infrastructure change described above. And, today no company wants to write an application from scratch. The ability to use a large number of services common to all applications (security, transaction, connection pooling, messaging, etc.) is the nirvana on which the J2EE platform was originally sold.

    This same model can be applied to application service pieces to answer questions such as "Why can't I apply services that are common to my business across different business scenarios?" and "Why can't I do it at a higher level of abstraction?" Let me put it another way: enterprise development of WebSphere applications can involve as much as 18 million lines of code. How are you going to deliver this - on time and to budget, and then maintain and update it - without abstraction?

    This touches upon another industry nirvana, component-based development (CBD) and software component reuse. Reuse, a primary focus of process-driven architectures, is the ability to reuse frequently used processes, subprocesses, and frequently used activity or component patterns throughout an organization.

    Business process management sits squarely at the convergence of metadata-driven components, Web services used to solve EAI-type requirements, and process activities used as an interface to a service for reuse.

    Applications are more maintainable because the "process" is not hard-coded but instead described by an XML definition and deployed in real time. This enables flexibility and responsiveness to change at the most volatile architecture layer, that which implements business policies and procedures.
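To make the idea concrete, a deployed XML process definition might look something like the fragment below. The element names and structure are purely illustrative, not a specific vendor schema or standard; the point is that the flow and its decision points live in data, not in compiled code.

```xml
<process name="OrderFulfilment">
  <activity name="CheckCredit" service="CreditService"/>
  <decision name="CreditOk" expression="credit.approved">
    <onTrue next="ShipGoods"/>
    <onFalse next="NotifyCustomer"/>
  </decision>
  <activity name="ShipGoods" service="WarehouseService"/>
  <activity name="NotifyCustomer" service="MailService"/>
</process>
```

Because the flow is data, redeploying a changed definition alters behavior without recompiling or redeploying the underlying services.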

    If applications comply with the same business process metamodel, one has a greater chance to be able to compose and recompose these processes across applications, provided that their respective process engines interoperate (design-time composition/orchestration, runtime choreography).

    Declarative component reuse becomes achievable as activities become the interface to services, whether newly built components or "wrappered" legacy processes exposed in the middle tier and fulfilled by business rules. In other words, we achieve modular, reusable software components that can be manipulated visually in a declarative design tool.

    The idea here is that we take the object-oriented idea of reuse and essentially repackage it with reuse by "specification." Process activities can be customized and their properties altered to be reused in different scenarios.

    In addition, BPM addresses another aspect of application development, the ability to support long-running units of work. In the past, transaction processing systems have provided a synchronous link between the consumer of the data and the data itself. As applications are more and more integrated with their environment, one cannot expect to have the whole world synchronously (and transactionally) connected to applications. We need the ability to selectively couple our architecture, i.e., to make sensible choices about where it is necessary to have tightly coupled or loosely coupled endpoints.
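The choice between tight and loose coupling described above can be sketched with standard-library classes only. Here an in-memory queue stands in for a durable messaging provider such as JMS; all names are illustrative.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Toy contrast between a tightly coupled (synchronous) call and a loosely
// coupled (queued) hand-off between endpoints.
public class CouplingSketch {
    // Tightly coupled: the caller blocks until the service answers.
    static String synchronousCall(String request) {
        return "handled:" + request;
    }

    public static void main(String[] args) throws InterruptedException {
        // Tight coupling: caller and service are connected in one unit of work.
        System.out.println(synchronousCall("payment"));

        // Loose coupling: the caller enqueues work and continues; a consumer
        // (here, a background thread) processes it whenever it is ready.
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        Thread consumer = new Thread(() -> {
            try {
                String request = queue.take();
                System.out.println("eventually handled:" + request);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();
        queue.put("order");  // the caller is not blocked on the handler
        consumer.join();
    }
}
```

A long-running business process mixes both styles: synchronous calls where immediate answers are required, queued hand-offs where steps may complete hours or days later.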

    Ultimately in today's business and technology environment it's not just about connecting application A to application B; it's about taking a set of applications, applying a business process flow to control the integration and a set of rules to ensure data and transactional consistency, and then exposing this aggregate application using whatever middleware is appropriate.

    When to Implement a Separate Process Layer
    Many companies that have designed, or are thinking of designing, their middle tier using a granular service approach find that they need a layer that coordinates the services to provide an aggregate service. This has often been done in a piecemeal fashion in which complex code resides in session beans that are viewed as "fat" service objects in the middle tier. Moving "process" from the business logic layer into the process layer considerably increases flexibility for the following reasons:
    1.   A change in the process definition does not then necessarily require a modification of business services.
    2.   The processes become "visible" and can be rapidly changed, amended, and redeployed.
    3.   BPM provides the ability to monitor, audit, and escalate processes. This makes it possible to identify and deal with processes that are not working, while also providing real metrics to the business: Which are the most popular services? Which services are regularly escalated?
    4.   Volatile rules in the business services layer can be externalized. Examples of this are decision points that can be externalized in the process model and changed at runtime.
    5.   A process flow layer can be used to connect different user interfaces to the same business service. Business analysts can do process modeling and maintenance.
    6.   In using the worklist aspects of BPM, organizations have the ability to "push" work to users - this is being referred to as fulfillment of the "last mile" of workflow and EAI. Many companies that are looking to automate and drive back-office value chains use BPM in this way.
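Point 4 in the list above, externalizing volatile rules, can be sketched as follows. The class and rule names are hypothetical; a real process engine would load such rules from the deployed process model rather than from code.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Predicate;

// Sketch of a decision point externalized from the business service. The rule
// is looked up by name at runtime, so it can be replaced without recompiling
// or redeploying the service that uses it.
public class DecisionPoint {
    private final Map<String, Predicate<Double>> rules = new HashMap<>();

    public void deployRule(String name, Predicate<Double> rule) {
        rules.put(name, rule);  // in a real engine this comes from the process model
    }

    public boolean decide(String ruleName, double amount) {
        return rules.getOrDefault(ruleName, x -> false).test(amount);
    }

    public static void main(String[] args) {
        DecisionPoint dp = new DecisionPoint();
        dp.deployRule("autoApprove", amount -> amount < 1000.0);
        System.out.println(dp.decide("autoApprove", 250.0));  // true
        // The rule can be swapped at runtime without touching the service:
        dp.deployRule("autoApprove", amount -> amount < 100.0);
        System.out.println(dp.decide("autoApprove", 250.0));  // false
    }
}
```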

    The Relationship Between Process and Web Services
    One of the big questions within the IT industry, and particularly in the areas of EAI and BPM, is how do Web services help businesses? In a business process context, Web services themselves are in some ways at the very edges of the business processes in that they provide a loosely coupled invocation mechanism to the "units of work" that are implemented by back-office or legacy applications. These are traditionally exposed via proprietary APIs, but increasingly EAI vendors are allowing access to their connectors over SOAP, enabling them to be composed within the various business processes.

    Taken in this context, Web services won't always be appropriate, as it is likely that in certain scenarios you will require robust, persistent, durable messaging. In this case it is likely that you will implement a more traditional EAI hub-and-spoke infrastructure, perhaps using JMS as the standard access mechanism. Your application requirements will dictate the type of coupling that you use.

    However, there will also be cases when you have low-volume transactions with existing legacy systems that you are able to elegantly expose through a Web service entry point. This model enables organizations to quickly, and at little cost, allow connectivity to legacy systems, and it is increasingly being utilized within internal firewall boundaries.
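A minimal sketch of that wrapping pattern is shown below. `LegacyInventory` is a hypothetical stand-in for an existing back-office API; in practice the facade would be bound to a SOAP endpoint (for example through an EAI vendor's connector), which is omitted here.

```java
// Sketch of exposing a legacy operation behind a service-style interface.
public class LegacyServiceFacade {
    // Hypothetical legacy API with its own calling convention.
    static class LegacyInventory {
        int qtyOnHand(String sku) { return "WIDGET".equals(sku) ? 5 : 0; }
    }

    private final LegacyInventory legacy = new LegacyInventory();

    // Service-style entry point with a simple request/response contract,
    // suitable for binding to a Web service endpoint.
    public String checkAvailability(String sku) {
        int qty = legacy.qtyOnHand(sku);
        return qty > 0 ? "IN_STOCK:" + qty : "OUT_OF_STOCK";
    }

    public static void main(String[] args) {
        LegacyServiceFacade facade = new LegacyServiceFacade();
        System.out.println(facade.checkAvailability("WIDGET"));  // IN_STOCK:5
    }
}
```

The facade keeps the proprietary legacy calling convention out of the process layer, so the process only ever sees a clean service contract.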

    Consider the case in Figure 1. Here we can see an example of a business process designed in Versata's Process Logic Designer that utilizes Web service process activities to communicate with an external credit card verification system and an internal legacy goods ordering system whose Web service endpoint could be exposed through an EAI vendor's connector. In this scenario we are utilizing Web services within our process flow to rapidly satisfy application and business requirements.

    BPM vs EAI
    BPM is closely related to the problems that are traditionally addressed by EAI. However, BPM is built on a service-oriented architecture. This architecture, as discussed, changes the economics of integrating disparate applications and business processes.

    BPM lends itself to smaller initial projects, allowing companies to build incrementally. Also, most BPM solutions feature server-based pricing and drastically reduce the need for the specialized consulting services that are a necessity with proprietary adapter- and scripting-language-based EAI solutions.

    For any company wanting to invest in the service-oriented architecture approach, BPM adds the capability to automate long-running, multistep business transactions that span heterogeneous systems.

    BPM solutions based on Java and J2EE leverage existing skills within the enterprise. Analysts are able to declaratively specify business processes and in-house developers can, where needed, leverage their Java-based skill sets to implement them.

    BPM Standards
    Given that an end-to-end business process can extend across technology platforms and encompass several different vendors, it stands to reason that vendors and companies are keen to promote standardization of business process modeling, execution, and interactions.

    Currently there are many different standardization initiatives, which has resulted in confusion rather than the compatibility for which they were intended. These include:

    WSFL
    Web Services Flow Language (WSFL), defined by IBM, is a proposed standard that addresses workflow on two levels:
    1.   It takes a directed graph model approach to defining and executing business processes.
    2.   It defines a public interface that allows business processes to advertise themselves as Web services.

    WSFL can be used to model processes that move from one activity to the next, where decisions are made at each control point, using an XML syntax that can be read by both humans and machines. By consuming WSFL, a workflow engine can walk through business processes activity by activity, control point by control point. WSFL has now been superseded by BPEL4WS.
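The directed-graph walk described above can be sketched in plain Java (this is not WSFL syntax, and all node names are made up): the engine moves from activity to activity, choosing the next node at each control point.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of walking a directed graph of activities, deciding the next step
// at each control point. Illustrates the flow model, not the WSFL language.
public class FlowWalk {
    // Each node maps a boolean control-point outcome to [onTrue, onFalse] successors.
    static Map<String, String[]> graph = new HashMap<>();

    public static String walk(String start, boolean creditOk) {
        StringBuilder path = new StringBuilder(start);
        String node = start;
        while (graph.containsKey(node)) {
            String[] next = graph.get(node);  // [onTrue, onFalse]
            node = creditOk ? next[0] : next[1];
            path.append(" -> ").append(node);
        }
        return path.toString();
    }

    public static void main(String[] args) {
        graph.put("receiveOrder", new String[]{"checkCredit", "checkCredit"});
        graph.put("checkCredit", new String[]{"ship", "reject"});
        System.out.println(walk("receiveOrder", true));   // receiveOrder -> checkCredit -> ship
        System.out.println(walk("receiveOrder", false));  // receiveOrder -> checkCredit -> reject
    }
}
```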

    XLANG
    XLANG, an XML-based Microsoft specification, describes business process interactions. XLANG is implemented within Microsoft's BizTalk Server, whose BizTalk Orchestration Designer can compile XLANG schedule drawings into XML-structured XLANG schedule files. Because XLANG is XML-based, its schedules must be well-formed XML documents and must conform to the XLANG schema. XLANG has also been superseded by BPEL4WS.

    BPEL4WS
    BPEL4WS is a relatively new standard from Microsoft, IBM, and BEA. It supersedes a previous Microsoft standard, XLANG, and a previous IBM standard, WSFL. It describes the manner in which different participants in a process collaborate from the viewpoint of a single participant rather than from a holistic viewpoint.

    BPMI
    The Business Process Management Initiative (BPMI) promotes a standard to describe business processes called Business Process Modeling Language (BPML). BPML describes comprehensive control flow and data flow constructs, providing support for both short and long-running transactions, including compensating activities. BPML also provides support for exception handling and timeouts. However, it does not address such issues as authentication and nonrepudiation.

    UML 2.0
    It would be relatively easy to forget UML, but process definitions themselves are analogous to a UML activity diagram. UML is extensible and as such can be adapted to many applications. In fact Versata has used this ability to define its own "rules" stereotype.

    ebXML
    UN/CEFACT and OASIS have collaborated on ebXML, an end-to-end stack of protocols and specifications for conducting electronic business using the Internet. ebXML predates the Web services paradigm and has been discussed as an alternative to Web services; however, this somewhat misses the point. Web services technology currently does not provide adequate support for business transaction services. This, however, is what ebXML was designed for (think EDI-style information exchange) and where it clearly distinguishes itself from Web services.

    It is likely that at some point there will be a merging of some of the technologies used in the Web services model and ebXML. An example of this is UN/CEFACT and OASIS recently adopting SOAP as the basis of the ebXML messaging infrastructure. For its part, the W3C is evaluating the ebXML specification and will likely incorporate aspects of the specification that they feel meet requirements missing from the Web services architecture.

    EDOC
    The Object Management Group's (OMG) vision for EDOC (Enterprise Distributed Object Computing) is to simplify the development of component-based enterprise systems by means of a modeling framework based on UML 1.4 and conforming to the OMG Model Driven Architecture. In particular, they are striving to achieve:

  • A platform-independent, recursive collaboration-based modeling approach
  • A business component architecture that provides interoperable business components and services, reuse and composability of components, and reuse of designs and patterns while being independent of choice of technology (e.g., component models)
  • Modeling concepts for clearly describing the business processes and associated rules
  • A loosely coupled, reusable business collaboration architecture that can be leveraged by business-to-business (B2B) and business-to-customer (B2C) applications, as well as for enterprise application integration
  • A development approach that allows two-way traceability between the specification, implementation, and operation of enterprise computing systems and the business functions they are designed to support
  • Support for system evolution and the specification of collaboration between systems
  • A notation that is accessible and coherent

    This approach has a lot of merit and can be seen as the basis for achieving a Model Driven Architecture (MDA).

    Wf-XML
    The Wf-XML standard, defined and supported by the Workflow Management Coalition (WfMC), is designed to support process-level interoperability. Wf-XML defines a set of operations that can be carried out to start, interrogate, and terminate a process using an Internet/Web services-style paradigm. Although Wf-XML is not based on the SOAP standard, and its operations are not defined in WSDL, it is an applicable standard for Web services, specifically aimed at business process/workflow runtime interoperability. The WfMC recently announced Wf-XML SOAP bindings for interoperability standards support.

    The Versata Logic Server is based on a Wf-XML implementation and Versata has done some work in designing additional process artifacts that tie this in with Web services support.

    The Workflow Management Coalition has already spent over nine years facilitating this set of standards that enables processes to be used across organizations. We believe that what is essential is the description of the process and the ability for interoperability.

    Conclusion
    Although a standard business process language would be great to have, the reality is that, with competing standards, this will be in flux until a dominant one emerges. Make no mistake, however. This is fundamental to achieving business process integration and collaboration. The ability to have technology-independent metamodels "behind" BPM is where the "rubber meets the road."

    References
    BPEL4WS: www-106.ibm.com/developerworks/webservices/library/ws-bpelcol1
    BPML: www.bpmi.org
    Positioning paper comparing BPML and BPEL4WS: www.bpmi.org/downloads/BPML-BPEL4WS.pdf
    Proposal from the UML 2.0 members concerning process models: www.omg.org/cgi-bin/doc?ad/01-11-01
    ebXML: www.ebxml.org
    More information on EDOC and MDA: http://omg.org
    Wf-XML: www.wfmc.org

    About the Author
    Jim Liddle is CEO of Storage Made Easy. He has been a regular blogger at SYS-CON.com since 2004, covering mobile, Grid, and cloud computing topics.

