
Colville Tribes Win Long-Running Environmental Lawsuit against Teck Metals

Federal Court in Yakima Holds Canadian Company Liable for Decades of Contamination

NESPELEM, Wash., Dec. 14, 2012 /PRNewswire/ -- Today, a judge in the United States District Court in Yakima issued a ruling that Canadian mining and smelting giant Teck Metals, Ltd. is liable under United States environmental law for contaminating the Columbia River with millions of tons of smelting waste.

In finding Teck liable under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA, also known as Superfund), the Honorable Judge Lonny R. Suko ruled that "for decades Teck's leadership knew its slag and effluent flowed from Trail downstream and are now found in Lake Roosevelt, but nonetheless Teck continued discharging wastes into the Columbia River."  The court noted a Teck manager's acknowledgment that the company "had been treating Lake Roosevelt as a 'free,' 'convenient' disposal facility for its wastes."  Given this conduct and connection with Washington, Judge Suko decided that Teck could be tried in Washington, even though its smelter is located in Canada. 

"We are very pleased with this outcome," said John Sirois, Chairman of the Colville Business Council. "Now that the Court has found that Teck is liable for its contamination of the Columbia River, we look forward to its participation in cleaning it up and paying for any resulting damages."

Among the findings included in the decision, the judge determined:

  • Between 1930 and 1995, Teck intentionally discharged at least 9.97 million tons of slag, including heavy metals such as lead, zinc, mercury, cadmium, copper, and arsenic, directly into the Columbia River via outfalls at its Trail smelter.
  • Teck knew its disposal of hazardous waste into the Upper Columbia River was likely to cause harm, and was told by the Canadian government that its slag was toxic to fish and leached hazardous metals.
  • Pursuant to CERCLA, Teck is liable to the Tribes and the State in any subsequent action or actions to recover past or future response costs at the Upper Columbia River site.

The Court's finding that Teck is liable under CERCLA will give EPA the power to force Teck to fund necessary cleanup.  The Court's ruling will also make Teck liable for any natural resource damages resulting from its releases of hazardous substances to the environment in the Upper Columbia River. 

First filed in 2004, the lawsuit arose from Teck's refusal to comply with United States Superfund law to study the nature and extent of hazardous substances discharged by the mining company in and around Lake Roosevelt and the Upper Columbia River.  The smelter is located directly on the Columbia River just a few miles north of the United States border.  Teck's smelting wastes have been documented throughout the 150-mile reach of the Columbia River between the Canadian border and Grand Coulee Dam. 

The Confederated Tribes of the Colville Reservation joined the lawsuit in 2005.  For seven years, the Tribes, together with the State of Washington, litigated to obtain today's result – the determination that Teck is subject to United States environmental law and is obligated to investigate and clean up contamination in the Upper Columbia River and Lake Roosevelt.  After vigorously disputing that its wastes had ever deposited in the United States or released hazardous substances there, on the eve of trial Teck finally conceded that it dumped nearly 10 million tons of smelting waste into the Columbia River, some of which included hazardous substances that deposited in the United States, and that its wastes leached heavy metals into the environment of the United States.

"Today's court ruling has great meaning for our Tribes," said Sirois. "This river is the heart of our people.  It has always been and will always be our homeland, and damages to our natural resources must be addressed."

Together with other governmental entities, the Tribes and the State are actively planning scientific studies necessary to identify the extent of injury and resulting damage in the river, as well as in the upland areas.   Once these studies are complete, the Tribes and State will return for another trial to address the damages that have resulted from the decades of release of hazardous substances in the Upper Columbia River, Lake Roosevelt, and the upland region.

About Confederated Tribes of the Colville Reservation

With more than 9,000 descendants of 12 aboriginal tribes enrolled, the Colville Tribes is a sovereign nation and federally recognized Indian tribe. The existing Colville Reservation and the former North Half of the Reservation are bounded by the Columbia River (including Lake Roosevelt) and cover about 3 million acres in northeastern and central Washington. The area is rich in natural resources, including standing timber, streams, rivers, lakes, minerals, varied terrain, native plants, and wildlife.  The Columbia River has been the homeland of the Colville People since time immemorial.  Colville tribal members use the Columbia River for sustenance, recreation, economic, and spiritual purposes.  It is an integral part of their culture and tribal identity. 

SOURCE Confederated Tribes of the Colville Reservation

