Ubisoft API (powered by Intel) – The game plays you now!

By Andy Thurai (Twitter: @AndyThurai)

Remember the old days, when we were amazed by “graphical” games such as Tetris? Fast forward twenty years, and Ubisoft is enriching the user experience in amazing ways. Gone are the days when games shipped as static experiences whose outcomes were predictable if you played them a certain way. Today’s real-time games (such as Assassin’s Creed®) are sophisticated enough to track player movements and adjust outcomes based on each player’s actions and skill level.

[The most graphical modern game of the late 80s: Tetris]

As any teen can vouch, gaming is moving from a console-based model to a multi-device model (console, PC, mobile, and other devices). Games are no longer driven solely by keystrokes or game controllers, but by player movements captured by sensors such as cameras, motion-sensing gear, and other gadgets.

This change posed an interesting challenge for our recent customer Ubisoft. They needed to convert their existing legacy services into a cross-platform enabler to support these new modes of play, and they also needed to build a new gaming platform for the future, one that would deliver a richer, more connected, and more engaging user experience on a ubiquitous platform.

With 46 game development studios in 19 countries and thousands of developers producing real-time games such as Assassin’s Creed®, providing a cross-platform experience to gamers becomes paramount. Ubisoft needed to open its APIs to the community, both developers and fans, to create new value. For developers, open APIs foster innovation; for players, they enable sharing and creating, richer experiences such as stats pages that follow you across platforms, sharing with your follower community, and continuing a game seamlessly across multiple devices.

Ubisoft’s existing legacy services hold data on millions and millions of users that could previously be exposed only through legacy services and legacy controls. They wanted to convert that into a newer gaming platform that supports current and future controls by transitioning from the legacy model to an API model. And, of course, by adding Big Data analytics (with integrated Hadoop security), Ubisoft can understand a specific player’s behavior and adapt to that player’s needs and skills. This lets them provide an enriched, personalized gaming experience based on behavioral patterns instead of pre-decided development patterns, making the gaming experience dynamic and unique to each player.
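To make that legacy-to-API transition concrete, here is a minimal sketch, in Python with Flask, of what a REST facade over a legacy player-data store can look like. The names here (LegacyPlayerStore, the /v1/players route, the sample fields) are hypothetical illustrations; the post does not describe Ubisoft’s actual services or stack.

```python
# Hypothetical sketch only: class and route names are illustrative,
# not Ubisoft's actual services.
from flask import Flask, abort, jsonify

app = Flask(__name__)

class LegacyPlayerStore:
    """Stand-in for a legacy backend holding per-player data."""
    _players = {
        "p-1001": {"handle": "ezio", "skill": 0.82, "platform": "console"},
    }

    def fetch(self, player_id):
        return self._players.get(player_id)

store = LegacyPlayerStore()

@app.route("/v1/players/<player_id>/stats")
def player_stats(player_id):
    # One platform-agnostic JSON contract in front of the legacy store, so
    # console, PC, and mobile clients all consume the same endpoint.
    player = store.fetch(player_id)
    if player is None:
        abort(404)
    return jsonify({"id": player_id,
                    "skill": player["skill"],
                    "platform": player["platform"]})

if __name__ == "__main__":
    app.run()
```

The point of the facade is the uniform contract: once every device talks to the same versioned endpoint, new consoles and mobile clients can be onboarded without touching the legacy backend.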

In order to choose the right technology to support their future gaming platform, Ubisoft had the following goals in mind:

  • Help them create value around their brand by building REST web services APIs for newer devices and consoles.
  • Evolve their restrictive platform into a more open, fluid gaming platform.
  • Deliver these APIs either in the cloud or on-premises, based on need.
  • Provide an ecosystem that helps them build, collaborate on, and expose public, private, and community developer APIs.
  • Make those APIs mobile-enabled.
  • Secure those APIs to gaming and industry standards.
  • Smooth out the seasonal “Christmas” problem, when demand can be many times higher than normal (see the throttling sketch after this list).
  • Adopt a licensing and architectural model adaptive enough to scale seamlessly during peak demand such as the above.
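The post doesn’t say how the seasonal spike is actually absorbed; alongside elastic capacity, a common ingredient is throttling at the API tier so the backend degrades gracefully rather than falling over. Below is a minimal token-bucket sketch in Python; the TokenBucket name and the rates are assumptions for illustration, not the mechanism Intel’s product uses.

```python
# Hypothetical sketch of peak-load throttling at the API tier; rates are
# illustrative, not a description of the actual gateway configuration.
import time

class TokenBucket:
    """Token-bucket limiter: allows short bursts, caps the sustained rate."""

    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec
        self.capacity = burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, up to burst capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

limiter = TokenBucket(rate_per_sec=100.0, burst=200)
if not limiter.allow():
    pass  # e.g., return HTTP 429 so clients back off and retry after the spike
```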

In short, the chosen technology would be the heart of Ubisoft’s newer, modern gaming platform, making it more connected, more social, more mobile, and “ON” all the time. Given the sensitivity and centrality of these issues, Ubisoft wanted to make sure they chose the right technology partner. Essentially, this partner would not only help with the technology issues they face today, but would also help with future needs and evolve at a speed greater than Ubisoft’s own.

When they engaged with Intel to solve these specific issues and move their game platform into the next millennium, they created a set of use cases highlighting the issues they faced. They invited the usual players in this space (6 vendors in total who matched the required capabilities) to complete a POC for those specific use cases.

According to the Ubisoft program manager, “Our ‘wow’ moment came when Intel finished the POC in a couple of days, while the other players were still trying to figure out and strategize how to execute it. Intel had already finished and packed up.”

[A scene from Assassin’s Creed 3®]

In the end, Ubisoft was able to expose their backend services as a unified API to all their developer partners. This allows them to on-board new developer studio partners, expose more services, and surface new APIs quickly. Where exposing services to developers once took Ubisoft months (sometimes years), it now takes a matter of weeks on the new platform. The initial production release of these API programs was originally expected in 2015 and 2016, but given the faster enablement, Ubisoft moved the go-live up much sooner, to early 2014.

The greatest thing I gained from this experience is that my teenage son now thinks I am “cool.” That in itself speaks volumes about Intel’s modern technology: when a teen who lives 24×7 in the gaming world thinks Intel is doing cool stuff to enrich his gaming experience.

You can watch the video by Ubisoft here.



More Stories By Andy Thurai

Andy Thurai is Program Director for API, IoT and Connected Cloud with IBM, where he is responsible for solutionizing, strategizing, evangelizing, and providing thought leadership for those technologies. Prior to this role, he has held technology, architecture leadership and executive positions with Intel, Nortel, BMC, CSC, and L-1 Identity Solutions.

You can find more of his thoughts at www.thurai.net/blog or follow him on Twitter @AndyThurai.
