
TASC Report Warns of Risks of Lowest Price Acquisition Approach

Complex mission-critical services require technically rigorous evaluations

CHANTILLY, Va., Nov. 20, 2012 /PRNewswire/ -- Increased reliance by government customers on the Lowest Price Technically Acceptable (LPTA) acquisition strategy poses unnecessary risks such as budget overruns, delivery delays and, in the worst cases, mission failure.  According to a new report by TASC, Inc., the LPTA process can be appropriate for commoditized services with clearly defined requirements, but not for complex services that support sophisticated, high-risk missions.  In cases where the government does elect to use the LPTA process, it should rigorously define and evaluate technical acceptability and past performance to avoid compromising the performance of the program or system and, ultimately, the success of the mission.

(Logo: http://photos.prnewswire.com/prnh/20120730/PH48590LOGO )

"The lowest cost can fast become very expensive when looking at mission-support services important to national security," says Jerry Howe, senior vice president and general counsel of TASC.  "When mission requirements are constantly evolving – as is the case in many national security programs – the government should be seeking the 'best value' services, that is, services provided by specialists with highly sophisticated technical skills and deep domain expertise that exhibit the agility to adjust quickly to complex and changing needs."

The TASC report makes the case that using the LPTA approach to acquire complex mission services can compromise mission success and increase total program costs over time once rework and related costs are factored in.  To achieve the best value, TASC recommends that solicitations for complex services adopt a classic best-value, cost-technical tradeoff approach that considers innovation, scheduling rigor and program cost containment.  When the government does use the LPTA process, technical and past performance requirements should be defined precisely.  In addition, detailed technical criteria applied through scenario-based evaluations, high standards of past performance and price-realism analyses are essential to mitigating the risk of unsuccessful performance on a contract solicited on an LPTA basis.  The TASC report offers examples of solicitations that apply these recommendations.

"It is essential that contractors hired by the government to deliver complex services have the technical qualifications and domain knowledge to meet the challenges of today and tomorrow," says Howe.  "We live in an era where systems must work the first time and every time, not only to accomplish the mission, but also to meet affordability goals and budget objectives."

Download the TASC report, "The Challenge of Applying the LPTA Process to the Procurement of Complex Services."  For more information and career opportunities, visit http://www.tasc.com.

Media Contact:  Christine Nyirjesy Bragale, TASC Public Relations Manager, [email protected], (703) 653-5996.

 

SOURCE TASC, Inc.

