
Garibaldi broadens mineralized corridor at Grizzly, prepares to accelerate 2014 exploration plans

TSXV: GGI
OTC: GGIFF
Frankfurt: RQM

VANCOUVER, March 13, 2014 /PRNewswire/ - Garibaldi Resources Corp. (TSXV: GGI) (the "Company" or "Garibaldi") is pleased to report that recent reconnaissance work carried out over western portions of its Grizzly Property in the Sheslay Valley, northwest British Columbia, has identified a new zone of porphyry copper mineralization 3 km south of its Grizzly West porphyry target and 3 km west-southwest of Prosper Gold Corp.'s Pyrrhotite Creek prospect, in an area referred to as West Kaketsa. Garibaldi is the largest landholder among juniors in the Sheslay district and controls approximately 26,200 hectares in claims. The results of this program suggest the potential to significantly broaden a NW/SE-trending corridor of porphyry targets that extends for over 30 km through the Sheslay Valley, from the western end of the Grizzly Property through Grizzly Central to the recently announced Grizzly East expansion claims.

Given highly encouraging results in the western and central parts of the Grizzly Property, Garibaldi is accelerating its 2014 plans at the Grizzly by launching an aggressive Phase 1 exploration program that includes detailed mapping, geochemistry, IP surveys and drilling. The initial stages of this work will commence in the next few weeks, and the results will determine the scale of a planned Phase 2 program. Garibaldi is in a strong working capital position and looks forward to advancing the Grizzly Project concurrently with its assets in Mexico.

"The importance of the discoveries recently announced by Prosper Gold and Doubleview Capital Corp., located approximately 10 km apart on properties within the Sheslay corridor, is the scale of mineralization over such wide distances. Confirmation of another significant porphyry target area several km south of Grizzly West underscores the world class potential of this growing mineralized Cu-Au porphyry corridor in the Sheslay Valley," explained Steve Regoci, Garibaldi President and CEO.  "Garibaldi has captured more than 50% of this very prospective corridor with multiple targets already identified from Grizzly West to Grizzly Central through geophysical and geochemical surveys. We're very excited about advancing the Grizzly to a first-ever drilling stage."

West Kaketsa Mineralization Similar To Grizzly West, Pyrrhotite Creek

The extent of the newly discovered mineralized area at West Kaketsa has yet to be determined, but it is located approximately 1 km north of the historic West Kaketsa prospect (B.C. Minfile # 104J-024) and appears to be related to a fault that extends at least 3 km to Pyrrhotite Creek on the eastern flank of Mount Kaketsa. Garibaldi's upcoming program at the Grizzly Property includes plans to further define this new zone with IP surveys and to identify potential drill targets.

Petrographic analysis of mineralization from the West Kaketsa and Grizzly West prospects has confirmed that both areas exhibit classic porphyry-style copper-gold mineralization, including hydraulic brecciation, disseminated chalcopyrite and intense alteration within a hydrothermal environment. Garibaldi's reconnaissance work identified the new mineralized zone while following up on an encouraging airborne magnetic and radiometric survey completed last fall over western portions of the Grizzly Property.

The airborne survey confirmed that West Kaketsa, like Grizzly West, lies in a region of strong magnetic activity and structure; the latter indicates faults and fractures in the intrusive bodies that are favorable for mineralizing fluids along the contact zones of the Mount Kaketsa monzonite-granodiorite stock. Historic technical reports describe porphyry mineralization at West Kaketsa as similar to mineralization observed at Pyrrhotite Creek. Prosper Gold has reported that Pyrrhotite Creek is a large mineralized zone with multiple porphyry targets located 3 km southwest of the Star porphyries in the SW corner of its Sheslay Property, which adjoins the Grizzly.

Grizzly West

Last fall's airborne survey shows that Grizzly West sits on the periphery of a large magnetic anomaly, in a setting similar to targets on the adjoining Sheslay Property. A soil geochemical survey over a grid of approximately 1.5 km x 1.5 km has defined several strong copper anomalies at Grizzly West that remain open in multiple directions. Fieldwork at this target has confirmed historical reports (Corona Resources) of mineralization, with reported grades ranging from 0.20% Cu to 6.7% Cu in rock chip samples.

Maps - West Kaketsa & Grizzly West

Maps showing the location, sampling areas and full results of recent work completed at West Kaketsa and Grizzly West are available on the Garibaldi website at www.GaribaldiResources.com.

To view a 2.5-minute video on the Grizzly Property and the Sheslay Valley, please visit the following URL: http://www.garibaldiresources.com/s/Media.asp#video1

Qualified Person

Carl von Einsiedel, P.Geo., a non-independent geological consultant and a Qualified Person as defined by NI 43-101, has reviewed this release and approved the content thereof.

We seek safe harbor.

GARIBALDI RESOURCES CORP.

Per: "Steve Regoci"   
       Steve Regoci, President

Neither the TSX Venture Exchange nor its Regulation Services Provider accepts responsibility for the adequacy or the accuracy of this release.

SOURCE Garibaldi Resources Corp.
