
IBM Chosen to Support BP's Global Workforce

Integrating global business processes and launching new collaboration capabilities to support 60,000 of BP's global employees

ARMONK, N.Y., March 24, 2014 /PRNewswire/ -- IBM (NYSE: IBM) today announced that it has been selected by BP (NYSE: BP) to integrate and manage the company's business applications globally, as well as provide enhanced service desk support for 60,000 employees and 80,000 devices in the Americas and Europe. IBM was chosen to support BP in enhancing user interaction with the service desk and improving business IT processes across all key BP IT operations.


End-user computing is the face of IT to employees in every corporation. BP wants to give its employees greater choice and flexibility in how they interact with its service desk, in order to improve first-time problem resolution rates. The IBM solution is focused on a personal service that lets each individual choose how and when to get help. Enhancements will include new services such as live online chat, through which employees can engage the help desk in real time and in their local language via a live agent or BP's self-help web portal. The self-help portal leverages IBM's knowledge base, deep analytics of IT incidents and natural-language search capabilities to rapidly deliver the most relevant results and reduce the need for on-site support.
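The release does not describe how the portal's search works internally. Purely as a rough, hypothetical sketch of the natural-language search idea it points to, the Python snippet below ranks knowledge-base articles against a free-text help request using TF-IDF cosine similarity; all article text, names, and functions are illustrative assumptions, not IBM's implementation.

```python
# Illustrative only: a tiny TF-IDF search over a made-up IT-incident
# knowledge base. This is NOT IBM's or BP's implementation; all data
# and names below are hypothetical.
import math
import re
from collections import Counter

KNOWLEDGE_BASE = [
    "Reset a forgotten password for the corporate VPN client",
    "Outlook keeps prompting for credentials after a password change",
    "Laptop will not connect to the office wireless network",
    "Shared drive is missing after the latest Windows update",
]

def tokenize(text):
    """Lowercase a string and split it into alphabetic tokens."""
    return re.findall(r"[a-z]+", text.lower())

def tf_idf_vectors(docs):
    """Build a sparse TF-IDF vector (term -> weight dict) per document."""
    tokenized = [tokenize(d) for d in docs]
    df = Counter(term for doc in tokenized for term in set(doc))
    n = len(docs)
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: tf[t] * math.log((1 + n) / (1 + df[t])) for t in tf})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    norm = math.sqrt(sum(w * w for w in a.values())) * math.sqrt(sum(w * w for w in b.values()))
    return dot / norm if norm else 0.0

def search(query, docs, top_k=2):
    """Return the top_k articles most similar to a free-text help request."""
    vectors = tf_idf_vectors(docs + [query])
    query_vec, doc_vecs = vectors[-1], vectors[:-1]
    scored = sorted(
        ((cosine(query_vec, vec), doc) for vec, doc in zip(doc_vecs, docs)),
        reverse=True,
    )
    return scored[:top_k]

if __name__ == "__main__":
    for score, article in search("my laptop will not connect to wifi", KNOWLEDGE_BASE):
        print(f"{score:.3f}  {article}")
```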

"BP's business requirements are changing, and the technologies that will enable us to meet our strategic objectives are evolving even faster," said Mark Bouzek, Vice President, Global Operations and Infrastructure, BP. "As a complex global organization, BP needs to continuously improve our business processes and the speed of service we deliver to our employees and external customers."

With operations around the world, BP selected IBM to provide the next generation of application management services for its global enterprise systems and additional connected applications. Central to these next-generation services is the IBM Command Centre, which monitors BP's enterprise systems and applications in real time and uses predictive analytics to prevent system outages and data-flow failures, maintaining a world-class level of application availability. IBM developed this next-generation application management solution for BP by harnessing the knowledge and insights gained from serving BP and from IBM's extensive global client base.
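The release gives no detail on how the Command Centre's predictive analytics work. Solely to illustrate the general pattern (watch a health metric in real time and alert on early drift before it becomes an outage), here is a toy exponentially weighted moving-average detector; the metric, thresholds, and class name are assumptions for illustration only.

```python
# Illustrative only: a toy early-warning detector over an application health
# metric. The real Command Centre's analytics are not described in the
# release; the metric, thresholds, and alerting rule here are assumptions.
from dataclasses import dataclass

@dataclass
class EwmaDetector:
    """Flags readings that drift far from an exponentially weighted baseline."""
    alpha: float = 0.2       # smoothing factor for the running mean/variance
    threshold: float = 3.0   # alert when a reading is this many std devs away
    warmup: int = 5          # observations to absorb before scoring anything
    mean: float = 0.0
    var: float = 0.0
    count: int = 0

    def observe(self, value: float) -> bool:
        """Return True if the new reading looks anomalous (possible incident)."""
        self.count += 1
        if self.count == 1:
            self.mean = value
            return False
        deviation = value - self.mean
        std = self.var ** 0.5
        anomalous = (
            self.count > self.warmup and std > 0 and abs(deviation) > self.threshold * std
        )
        # Update the running statistics after scoring the current point.
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        return anomalous

if __name__ == "__main__":
    # Simulated response-time samples in milliseconds: steady, then degrading.
    samples = [120, 118, 123, 121, 119, 122, 120, 180, 260, 410]
    detector = EwmaDetector()
    for minute, latency in enumerate(samples):
        if detector.observe(latency):
            print(f"minute {minute}: {latency} ms looks anomalous -- raise an alert")
```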

"IBM has consistently met high standards for quality delivery as well as delivering innovation for BP. We have proven that we are the right partner to serve BP with our global presence, relentless focus on service and our drive for continuous improvement," said David Marley, Managing Director, IBM. "Looking forward, we will continue to bring together IBM's integrated services management, research capabilities and innovative technologies to provide BP the right mix of skills and capabilities they need to support their business and drive future growth."

IBM is providing IT help desk services to BP from its delivery centers in Boulder, Colorado; Greenock, Scotland; Dublin, Ireland; Brno, Czech Republic; and Bangalore, India. IBM is providing application management services from its delivery facilities in Bangalore, Kolkata and Hyderabad in India; Houston and Tulsa in the U.S.; and in the UK, Germany and Australia.

About IBM
For more information about IBM, visit: ibm.com/services

Follow the conversation on Twitter at @IBMSourcing, @ibmoilandgas and #IBMSocialBiz, and on LinkedIn via the Smarter Chemicals and Petroleum Community.

Leslie Monreal-Feil
IBM Media Relations
[email protected]

Logo - http://photos.prnewswire.com/prnh/20090416/IBMLOGO


