NEC Delivers 100Gbps Transponder for Optical Transmission in Latin America Through Internexa

Tokyo, Mar 17, 2014 - (JCN Newswire) - NEC Corporation (NEC; TSE: 6701) announced today that it has delivered a 100Gbps transponder to Internexa for optical transmission. This is the first such transponder to be provided to Internexa and has already started commercial operation in Colombia.

Internexa is a leading provider of telecommunications infrastructure and ICT services as well as an affiliate of ISA, a business group composed of 30 affiliates and subsidiaries that promote major infrastructure projects and development throughout 9 countries in Latin America.

The Internexa network is open to all operators and spans 28,000 km of terrestrial fiber optics, primarily mounted on electrical infrastructure, connecting Venezuela, Colombia, Ecuador, Peru, Chile, Argentina and Brazil. Internexa capitalizes on the latest technologies as a leader in Internet access services for operators, data transport and ICT solutions tailored to its customers. Moreover, Internexa helps to create an ecosystem for migrating digital content in support of "the real Latin-American internet," reaching approximately 380 million people.

The transponder that NEC delivered for the construction of these network projects incorporates 100Gbps digital coherent technology (1) developed by NEC and a DP-QPSK modulation system (2), supporting high-capacity transmission of up to 96 wavelengths. Prior to its introduction, a test was conducted in which NEC's 100Gbps transponder was integrated seamlessly into Internexa's existing optical transmission system, on the optical layer of the current 80 x 10Gbps DWDM system provided by a third-party vendor. The transponder earned higher marks than its competitors for its signal characteristics, implementation efficiency, footprint and energy-saving performance.

"In recent years, telecommunications traffic has been rapidly increasing in response to the growing use of smartphones, machine-to-machine communications, cloud computing, digital cities, CDNs, tablets and the launch of LTE services. As a result, the expansion of optical fiber networks that can accommodate this growing traffic is urgently required in Latin America," said Genaro Garcia, CEO, Internexa. "The 100Gbps transponder provided by NEC to Internexa in Colombia has demonstrated significant competitive advantages, such as rapid implementation, efficiency and excellent performance. This allows us to continue offering customers the best services through our networks."

"NEC is honored to be selected as the supplier of the first 100Gbps transponder for Internexa," said Shunichiro Tejima, Executive Vice President, NEC. "We thank Internexa for their trust and support and we are confident that this upgrade will effectively support the operations of Internexa's more than 28,000 km of fiber."

Going forward, NEC will continue capitalizing on its IT and network assets in order to concentrate on key areas that highlight its strengths in providing network systems and services to telecommunications operators and other customers around the world.

Based on its Mid-term Management Plan 2015, the NEC Group is working towards the safety, security, efficiency and equality of society. NEC aims to develop solutions for a wide range of social issues, as a company that creates value for society through the promotion of its "Solutions for Society," which provide advanced social infrastructure utilizing ICT.

Notes:

(1) Digital coherent technology is an optical transmission technology providing high-capacity transmission over long distances with a high degree of reliability through the use of optical phases in data transmission. At the time of receiving data, it mixes local light with signal light to decode the signal. It also performs digital signal processing to compensate for signal losses that occur in the process of transmission and to reproduce the original transmission data.
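The mixing step described in note (1) can be sketched numerically: the receiver multiplies the incoming signal by the conjugate of the local light and reads the phase of the product. This is a hypothetical, simplified illustration of the principle; the function name and values are not NEC's implementation, and real receivers add digital signal processing to compensate for transmission impairments.

```python
import cmath

def detect_phase(signal: complex, local_osc: complex) -> float:
    """Mix the received signal with the conjugate of the local
    oscillator ("local light") and return the recovered phase
    in degrees, in the range [0, 360)."""
    mixed = signal * local_osc.conjugate()
    return (cmath.phase(mixed) * 180 / cmath.pi) % 360

# A symbol transmitted at a 90-degree phase, detected against an
# in-phase local oscillator, is recovered as 90 degrees:
tx = cmath.exp(1j * cmath.pi / 2)
print(round(detect_phase(tx, 1 + 0j)))  # 90
```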

(2) Dual Polarization Quadrature Phase Shift Keying (DP-QPSK) modulation system: This is a modulation method in which four different two-bit signal values are assigned to four modulated optical phases (0, 90, 180 and 270 degrees) on each of two orthogonal polarized waves.
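The mapping in note (2) can be sketched as follows: each two-bit pair selects one of the four phases, and the same mapping is applied independently on each of the two orthogonal polarizations, so one DP-QPSK symbol carries four bits. The Gray-coded bit assignment below is an illustrative assumption; the release does not specify the exact mapping used.

```python
# Illustrative Gray-coded mapping of two-bit pairs to optical phases.
QPSK_PHASES = {"00": 0, "01": 90, "11": 180, "10": 270}

def dp_qpsk_symbol(bits: str) -> tuple:
    """Map four bits to (phase_x, phase_y) in degrees, one QPSK
    phase per orthogonal polarization."""
    assert len(bits) == 4, "one DP-QPSK symbol carries four bits"
    return QPSK_PHASES[bits[:2]], QPSK_PHASES[bits[2:]]

print(dp_qpsk_symbol("0110"))  # (90, 270)
```

Doubling the bits per symbol via the second polarization is what lets a 100Gbps channel fit in the same optical bandwidth as a lower-rate signal.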

About NEC Corporation

NEC Corporation (TSE: 6701) is a leader in the integration of IT and network technologies that benefit businesses and people around the world. By providing a combination of products and solutions that draw on the company's experience and global resources, NEC's advanced technologies meet the complex and ever-changing needs of its customers. NEC brings more than 100 years of expertise in technological innovation to empower people, businesses and society. For more information, visit NEC at http://www.nec.com.



Source: NEC Corporation

Contact:
Seiichiro Toda         
NEC Corporation                     
[email protected]
+81-3-3798-6511

Joseph Jasper         
NEC Corporation                     
[email protected]
+81-3-3798-6511


Copyright 2014 JCN Newswire. All rights reserved. www.japancorp.net
