
Open Network Install Environment (ONIE) Opens New Certification Lab at University of Texas at San Antonio

Today the first ONIE certification lab, hosted and operated by the University of Texas at San Antonio (UTSA), opens. ONIE is an industry-standard network boot loader for installing software on network switches. Announced in conjunction with the Open BigCloud Symposium and OCP Workshop 2014, the lab will allow organizations to pursue network-level certification, defining test suites and test platforms in accordance with the specifics of their projects. The ONIE certification lab is an extension of UTSA’s existing Cloud and Big Data Laboratory, the first Open Compute Project (OCP) Certification Laboratory in North America.

“With each new platform and chipset, a significant amount of development work is involved to ensure compatibility. ONIE certification and compliance leverage best practices to validate this process as expediently as possible,” said Carlos Cardenas, Associate Director, Cloud and Big Data Lab, UTSA. “We are pleased to launch the certification lab as the demand for standardization and reliability across the entire data center ecosystem – from servers to switches and now networking – becomes standard protocol.”

“ONIE provides a standards-based method to add the appropriate OS on top of a Dell switch making it much easier to scale and manage large workloads,” said Subi Krishnamurthy, Executive Director, CTO Dell Networking. “This is a critical success factor in Open Networking Innovation.”

UTSA will provide ONIE logo certification to organizations, helping them meet the criteria defined by the growing community of IT and networking professionals who want an easy, open, and standard way to pair the hardware they want with the software they choose.

The UTSA certification lab represents a new milestone for ONIE – a project that originated one year ago as a contribution to the open networking community by Cumulus Networks™ and Big Switch, and was incubated and adopted by OCP soon thereafter. Over the course of the year, the ONIE community has expanded and now includes several hardware providers that have standardized their networking gear on ONIE, including Dell, Accton (Edge-Core Networks), Quanta, Penguin Computing, Celestica, Mellanox, Interface Masters, and Agema. The UTSA Lab makes ONIE certification simple and allows organizations to test and validate the performance of the hardware and software they choose. This makes it easier to buy and deploy standards-based network hardware, reducing the overall capital and operating cost of network switches and breaking the proprietary stranglehold of traditional architectures.

“As a long-standing member and supporter of the ONIE platform, Quanta has added ONIE to five Quanta switches and we are excited to see ONIE evolve into a clear-cut standard protocol,” said Mike Yang, General Manager of Quanta QCT. “Certification and the support of the UTSA Lab will allow Quanta to standardize more hardware on ONIE faster while maintaining the proven processes.”

“The addition of UTSA’s ONIE certification lab could not come at a better time,” said Min Chao, President of Edge-Core Networks. “Networking has been a rigid and closed environment, and our customers are demanding the ability to manage their infrastructures more effectively. The establishment of common standards and best practices that have proven successful on the compute side must resonate across the data center. Edge-Core has recognized this and is already shipping 1 GbE, 10 GbE, and 40 GbE data center switches with ONIE. Certification will be a significant value add in helping us standardize additional hardware as demand for open networking grows.”

Certification Process – ONIE Lab:

  • The hardware vendor sends a representative switch to UTSA, where it stays for the entire product lifecycle
  • UTSA will then run the test suite for a given ONIE release (the first targeted release is 2014.08)
  • Upon passing the tests, UTSA will issue an ONIE Certification Logo for the vendor to use
  • As new ONIE releases are made and a vendor wants to officially support them, the vendor need only send the updated ONIE image to UTSA for certification of the new release

How to achieve ONIE certification: For more details, contact Carlos Cardenas ([email protected])

How ONIE Works

ONIE combines a boot loader with a small Linux operating system for bare-metal network switches, providing an environment for automated software download, installation, and provisioning. As a component of the open hardware switch platform, ONIE will contribute to and advance standards that define the hardware/software interface. ONIE allows end users and channel partners to install the target network OS as part of data center provisioning, in the same fashion that servers are provisioned.
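To make the provisioning flow concrete, here is a minimal sketch of how an operator might stage a network OS installer so that ONIE's discovery phase can find it over HTTP. The paths, IP address, and installer file name are hypothetical; in ONIE's discovery process, the generic fallback file name is "onie-installer", and a DHCP server can alternatively hand the switch an explicit URL via the "default-url" option.

```shell
#!/bin/sh
# Sketch: staging a NOS installer image for ONIE discovery.
# WEB_ROOT stands in for an HTTP server's document root; the
# installer name and IP address below are illustrative only.

WEB_ROOT=$(mktemp -d)                         # stand-in docroot
INSTALLER="$WEB_ROOT/my-nos-installer.bin"    # hypothetical NOS image
echo "fake installer payload" > "$INSTALLER"

# ONIE probes a list of default file names on the install server;
# publishing the image under the generic fallback name is enough.
cp "$INSTALLER" "$WEB_ROOT/onie-installer"

# Alternatively, DHCP can point ONIE directly at an installer URL.
# An ISC dhcpd snippet (written here to a scratch file) might read:
cat > "$WEB_ROOT/dhcpd.conf.snippet" <<'EOF'
option default-url "http://192.168.1.10/onie-installer";
EOF

ls "$WEB_ROOT"
```

On boot, a switch running ONIE would fetch the published image, execute it as the installer for the chosen network OS, and then hand control to that OS – the same pattern used for automated server provisioning.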

Visit ONIE Certification Lab at UTSA

WHERE AND WHEN: The ONIE certification lab will launch May 7 at the Open BigCloud Symposium and OCP Workshop 2014, held at UTSA.

The full schedule for the Open BigCloud Symposium and OCP Workshop 2014 can be found at: http://openbigcloudsymposiumandocp2014a.sched.org/

For more information on ONIE visit: www.onie.org

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
