
SCRA Applied R&D Announces Three New Projects

SCRA today announced three new projects for their Shipbuilding Center of Excellence, the Center for Naval Shipbuilding Technology (CNST). CNST, managed and operated by SCRA Applied R&D, is a Navy ManTech Center of Excellence, chartered by the Office of Naval Research (ONR) to develop advanced manufacturing technologies and deploy them in U.S. shipyards.

The first new project is a partnership with General Dynamics Electric Boat (GDEB) and the Institute for Manufacturing and Sustainment Technologies (iMAST) to develop a Trade-Friendly Dimensional Locating Metrology System. The project team recently began the 18-month effort to determine the feasibility and cost-effectiveness of new measurement technologies to support submarine platform manufacturing processes at GDEB facilities.

For the other two projects, SCRA/CNST has teamed with Huntington Ingalls Industries - Ingalls Shipbuilding (Ingalls) to support Ingalls' digital vision and strategy. CNST and Ingalls recently started two complementary projects focused on optimizing the use of electronic data to support complex shipbuilding industry processes. The first is a two-year project focused on cycle time reduction: it combines data entry with mobile scanning devices to ensure visibility, traceability and accountability for all materials throughout the logistics process.

The second project will develop an automated, flexible system for effectively allocating manufacturing and assembly spaces in the shipyard. This will enable planners to adapt to schedule changes and reassign work areas much more rapidly than is possible with the current, manual method.

"These recent projects further the Shipbuilding Center of Excellence's mission and are strong indicators of the Navy's sustained commitment to CNST," said SCRA CEO Bill Mahoney. "SCRA has proven that through applied research and advanced technology applications, we can substantially reduce costs and increase efficiencies for our clients such as the U.S. Navy and the Department of Defense."

The U.S. Navy sponsors nine Centers of Excellence that develop technologies to ensure it remains the most technologically advanced navy in the world. Two of the nine, CNST and the Composites Manufacturing Technology Center (CMTC), are led by SCRA and headquartered in South Carolina.

About SCRA

http://www.scra.org/

SCRA is an applied research corporation with over 30 years of experience delivering technology solutions with high returns on investment to federal and corporate clients. To fulfill its mission, SCRA operates three sectors: the Technology Ventures sector has helped over 290 early-stage companies commercialize innovations and create jobs; the Applied R&D sector manages over 100 national and international programs worth over $2 billion in contract value; and the R&D Facilities sector builds and manages research facilities that include wet labs, secure rooms for sensitive work, and advanced, high-tech manufacturing shops. Multiple economic impact studies place SCRA's cumulative contribution to South Carolina's economy at over $16.6 billion.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
