Open Source Storage Chairman: "Goal is to be the Market Leader"

Exclusive Interview with New Chairman Mark Iwanowski

Open Source Storage recently announced that Oracle veteran Mark Iwanowski was joining the company as its new chairman. What better time to find out what's going on there? We had a few questions for him...

Cloud Computing Journal: What are your key objectives as the new Chairman?

Mark Iwanowski: As Chairman of Open Source Storage, I'll help the company achieve success by bringing my business background and understanding of open source together with the innovations of CEO Eren Niazi and his executive management team. One of our objectives is to continue building the company as a market leader in the space.

Our goal is to grow Open Source Storage to be the market leader for secure, integrated, hardware/software, enterprise-grade open source storage solutions. In our view, we are the only company in the marketplace offering a true end-to-end storage solution, providing the hardware and software needed to deliver large-scale enterprise solutions.

CCJ: How do you see Big Data and the IoT impacting the company?

Mark: Big Data analytics and the emerging Internet of Things market are driving an exponential explosion of data. This has created the need for a cost-effective, secure, enterprise-class storage solution that seamlessly links certified Open Source Storage software with integrated, fully tested, low-cost hardware and offers an efficient way to connect private and public cloud storage infrastructure.

Open Source Storage ensures highly efficient, secure, real-time management of public and private cloud data sources.

CCJ: How so?

Mark: There are a few key points.

Open Source Storage's business model is about lower-cost enterprise open source solutions.

For companies manufacturing devices for the Internet of Things, we can provide a highly customizable and scalable platform to receive and process data from those devices.

Big Data covers a wide array of technologies, and Open Source Storage can partner with companies to solve the challenges those technologies present.

Whatever the Big Data task, Open Source Storage will develop a lower total-cost solution. This allows customers to quickly adopt, adapt, update, or change technologies to meet their needs.
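
For illustration only, here is a minimal sketch of what the device-data platform described above might involve: a subscriber that listens for device telemetry over MQTT and appends it to local storage. The broker address, topic layout, payload fields, and the choice of MQTT and SQLite are assumptions for the example, not details of Open Source Storage's product.

```python
# Minimal sketch of an IoT ingestion path: subscribe to device telemetry over
# MQTT and append each reading to local storage. All names here are illustrative.
import json
import sqlite3

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x style client shown;
                                 # 2.x also requires a CallbackAPIVersion argument)

db = sqlite3.connect("telemetry.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (device_id TEXT, payload TEXT)")

def on_message(client, userdata, msg):
    # Topics are assumed to look like devices/<device_id>/telemetry
    device_id = msg.topic.split("/")[1]
    data = json.loads(msg.payload)
    db.execute("INSERT INTO readings VALUES (?, ?)", (device_id, json.dumps(data)))
    db.commit()

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)   # hypothetical broker address
client.subscribe("devices/+/telemetry")
client.loop_forever()
```

In a real deployment the same pattern would typically feed a clustered or distributed store rather than a single SQLite file; the point is only to show the receive-and-process loop.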

CCJ: What are the key challenges, then, that need to be met to achieve the objectives in this emerging Big Data/IoT world?

Mark: Key challenges include the need to drive the cost of data management down dramatically while ensuring robust performance in a highly secure way.

As CIOs evolve their data storage strategies to include hybrid private/public cloud combinations, the integration challenges grow further, because real-time data analytics requires managing huge volumes of data almost instantaneously.

All of this has to be done in a highly secure manner, and Open Source Storage's solutions are specifically focused on addressing these increasing data management demands.

CCJ: What are top-of-mind issues for CIOs, in your opinion?

Mark: Specifically, here are a few points to keep in mind:

With Big Data, the challenges really depend on the technology the enterprise uses for Big Data, such as Hadoop or others. Customers are interested in leveraging Big Data in the most cost-effective way, and Open Source Storage addresses this need.

With the Internet of Things and IPv6 adoption, it is estimated there will be 50 billion devices with Internet access by 2020 (source: Cisco's Internet Business Solutions Group (IBSG) - http://share.cisco.com/internet-of-things.html), and customers must be prepared to embrace the IPv6 reality.
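
As a hedged, concrete illustration of what IPv6 readiness can mean at the application level (not a recommendation of any particular stack), the sketch below binds a simple TCP service to an IPv6 socket and, where the platform permits dual-stack sockets, also accepts IPv4 clients. The port number is arbitrary.

```python
# Minimal sketch: a TCP echo server bound to IPv6 that can also accept IPv4
# clients (as IPv4-mapped addresses) when the OS allows dual-stack sockets.
import socket

srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
# Setting IPV6_V6ONLY to 0 lets IPv4 clients connect too (platform-dependent).
srv.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
srv.bind(("::", 8080))   # "::" is the IPv6 wildcard address; the port is arbitrary
srv.listen(5)

while True:
    conn, addr = srv.accept()
    print("connection from", addr)   # IPv4 peers appear as ::ffff:a.b.c.d
    conn.sendall(conn.recv(1024))    # echo one chunk back, then close
    conn.close()
```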

With privacy, CIOs should ask how they can protect privacy when everything is talking to the world. It is essential to ensure a clearly defined opt-in policy for all data access and to guide customers toward holistic industry best practices for securing the data.
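
One hypothetical way to make such an opt-in policy enforceable in software, sketched here only as an illustration, is to gate every data access behind an explicit, per-purpose consent lookup and to record each access. The consent store, purpose names, and function names below are invented for the example.

```python
# Sketch: refuse any data access for which the owner has not explicitly opted in.
CONSENT = {"device-123": {"diagnostics"}}    # device_id -> purposes opted in to
TELEMETRY = {"device-123": [{"temp": 21.5}]}

class ConsentError(PermissionError):
    pass

def read_telemetry(device_id, purpose):
    """Return telemetry only if the owner opted in for this purpose."""
    if purpose not in CONSENT.get(device_id, set()):
        raise ConsentError(f"no opt-in from {device_id} for '{purpose}'")
    print("audit:", device_id, purpose)      # keep an audit trail of every access
    return TELEMETRY.get(device_id, [])

print(read_telemetry("device-123", "diagnostics"))   # allowed: opted in
try:
    read_telemetry("device-123", "analytics")        # blocked: no opt-in
except ConsentError as err:
    print("blocked:", err)
```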

Regarding obsolescence, changes in communication methods may render many devices obsolete. There are many device providers to choose from in the industry, and Open Source Storage will help drive industry standards, including backwards compatibility.

Then with security, this is something that must be built in from the ground up, from end-to-end encryption to ensuring the security of users' data that is stored by the device manufacturer.
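
How end-to-end encryption is implemented depends on the device and the backend. Purely as an illustration, the sketch below encrypts a user's payload before it is written to storage using the Python cryptography package's Fernet recipe, so the record at rest is unreadable without the key. This is an assumed approach for the example, not Open Source Storage's implementation, and key management is deliberately out of scope.

```python
# Minimal sketch: encrypt a user's payload before persisting it, so the data at
# rest cannot be read without the key. Uses Fernet (symmetric, authenticated).
from cryptography.fernet import Fernet   # pip install cryptography

key = Fernet.generate_key()   # in practice, load the key from a KMS/HSM instead
f = Fernet(key)

payload = b'{"user": "alice", "heart_rate": 72}'
token = f.encrypt(payload)    # this ciphertext is what actually gets stored

with open("record.enc", "wb") as fh:
    fh.write(token)

with open("record.enc", "rb") as fh:
    assert f.decrypt(fh.read()) == payload   # only key holders recover the data
```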

Finally, when it comes to standards, there needs to be a strong governance model to help standardize methods, and it's important to have open source APIs in place that the enterprise can leverage.

More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's Cloud Computing, Big Data & IoT Journals. He holds a BA from Knox College and conducted MBA studies at CSU-East Bay.


