By Marketwired
February 11, 2014 12:25 PM EST
VANCOUVER, BRITISH COLUMBIA -- (Marketwired) -- 02/11/14 -- Kazax Minerals Inc. ("Kazax" or the "Company") (TSX VENTURE:KZX) is pleased to provide an update on its current exploration drilling programme at the Lomonosovskoye iron project located in Kostanay Oblast, Kazakhstan.
The work, which commenced in September 2013, has successfully completed 9,141 metres of a planned 10,360 metre programme. From the completed drilling, over 1,200 drill core samples have been delivered to an independent laboratory, ALS at Ust-Kamenogorsk, Kazakhstan. Assay results are expected by May 2014.
As a result of further processing of Soviet-era data, located in 2013 through government library information services and previously unknown to the Company, the Company plans to extend its drilling programme into 2014 with a further 11 exploration holes: 6 holes within the envelope of two adjacent projected pits and 5 holes outside this boundary, targeting magnetic anomalies and the presence of copper and zinc mineralisation. Of the 6 holes to be drilled within the pit boundary, 5 target a zone between the two pits previously thought to be potentially barren. The historic Soviet data demonstrate interesting mineralisation and magnetic anomalies lying between these two areas, which could positively influence pit design work being conducted as part of the Bankable Feasibility Study ("BFS") process. The examination of this historical information has been running concurrently with the drilling programme.
The Company benefits from the availability of historic drilling information from the Soviet era, comprising 412 boreholes drilled between 1956 and 1982 and totalling 131,441 metres. The results of this historic data are still being reviewed. As part of this review, the Company has already sent over 800 previously untested powder samples to the ALS laboratory in Ust-Kamenogorsk.
The Company currently anticipates completion of the initial drill programme by March 2014 and the extended drill programme by May 2014. The Company expects to produce a revised technical report in compliance with National Instrument 43-101 upon announcing the results from this drill programme.
The Company continues to advance work towards the completion of its BFS, including detailed infrastructure and cost model improvements.
The technical information provided in this news release was reviewed and approved by Dr. Juan Camus, the project's country manager and a qualified person for the purposes of National Instrument 43-101.
For further information readers are invited to review additional corporate and property information available on SEDAR at www.sedar.com.
ON BEHALF OF THE BOARD
Trevor Campbell Smith, President & CEO
Neither the TSX Venture Exchange nor its Regulation Services Provider (as that term is defined in the policies of the TSX Venture Exchange) accepts responsibility for the adequacy or accuracy of this release.
This news release contains forward-looking statements and forward-looking information within the meaning of applicable securities laws. The use of any of the words "expect", "anticipate", "continue", "estimate", "objective", "ongoing", "may", "will", "project", "should", "schedule", "believe", "plans", "intends" and similar expressions are intended to identify forward-looking information or statements. More particularly and without limitation, this news release contains forward-looking statements and information concerning the Company's future operations and prospects. The forward-looking statements and information are based on certain key expectations and assumptions made by the Company, including expectations and assumptions concerning equipment and crew availability, and joint venture partner financial capability. Although the Company believes that the expectations and assumptions on which such forward-looking statements and information are based are reasonable, undue reliance should not be placed on the forward-looking statements and information because the Company can give no assurance that they will prove to be correct. By its nature, such forward-looking information is subject to various risks and uncertainties, which could cause the Company's actual results and experience to differ materially from the anticipated results or expectations expressed. These risks and uncertainties include, but are not limited to, reservoir performance, labour, equipment and material costs, access to capital markets, interest and currency exchange rates, and political and economic conditions. Additional information on these and other factors is available in continuous disclosure materials filed by the Company with Canadian securities regulators.
Readers are cautioned not to place undue reliance on this forward-looking information, which is given as of the date it is expressed in this news release or otherwise, and to not use future-oriented information or financial outlooks for anything other than their intended purpose. The Company undertakes no obligation to update publicly or revise any forward-looking information, whether as a result of new information, future events or otherwise, except as required by law.