|By Sathyanarayanan Muthukrishnan||
|February 14, 2014 11:30 AM EST||
It is now common practice for companies to invest considerable time engaging consultants and designers, and to spend colossal sums on capacity planning, to size infrastructure for their specific needs. There is no denying that skilled people and capacity planning tools help identify the resources required to size infrastructure correctly. However, it is also necessary to run a proof of concept (PoC), especially before making critical decisions.
Concerns are always raised about whether satisfactory performance can be achieved. Moreover, mergers and acquisitions have added their share of complexity to existing environments, producing technology-versus-application compatibility challenges. A PoC applies whether you are building new infrastructure for a business-critical application from scratch or addressing a specific IT requirement, for example data center consolidation, virtualizing a system, or moving to cloud-based solutions. A PoC helps companies define acceptance criteria and right-size infrastructure according to their specific needs. It supports business objectives by controlling budget overruns, and it helps IT management plan costs and procure resources to ensure successful completion of a project. Because the design phase is responsible for many critical decisions, many causes of cost overrun are tied to that phase. The most significant design-phase causes of cost overrun are blindly following theoretical evidence or placing complete trust in metrics obtained from unreliable capacity planning tools.
The purpose of a PoC is to showcase the benefits using real-world end-user scenarios and to calculate the TCO for individual cases. Against the key system performance metrics of processor, memory, disk, and network, workloads are usually classified into three types: (1) typical user, (2) power user, and (3) advanced power user. It is good practice to calculate load and system usage based on the power user profile. If funds permit, it is even better to use an upper bound by taking advanced power user usage into account.
A PoC helps determine sizing based on both average and peak loads. It enables consultants to account for anticipated future growth and to leave sufficient headroom across all the key system performance metrics discussed above.
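As a rough illustration of this sizing arithmetic, the sketch below estimates CPU and memory from a per-user workload profile, peak concurrency, and a growth headroom factor. The profile names follow the article's classification, but the per-user resource figures and the 30% headroom are hypothetical assumptions, not measurements:

```python
# Hypothetical per-user resource profiles; the numbers are illustrative
# assumptions, not measured values from any real PoC.
PROFILES = {
    "typical":        {"cpu_cores": 0.05, "mem_gb": 0.25},
    "power":          {"cpu_cores": 0.10, "mem_gb": 0.50},
    "advanced_power": {"cpu_cores": 0.20, "mem_gb": 1.00},
}

def size_infrastructure(profile, peak_users, growth_headroom=1.3):
    """Size CPU and memory from peak concurrency plus growth headroom.

    Sizing is driven by peak (not average) load, with extra room
    left for anticipated future growth.
    """
    p = PROFILES[profile]
    return {
        "cpu_cores": p["cpu_cores"] * peak_users * growth_headroom,
        "mem_gb":    p["mem_gb"] * peak_users * growth_headroom,
    }

# Size for 500 concurrent power users with 30% growth headroom.
estimate = size_infrastructure("power", peak_users=500)
print(estimate)  # {'cpu_cores': 65.0, 'mem_gb': 325.0}
```

A real PoC would replace the profile table with measured per-user consumption from representative test runs, which is precisely the data a proof of concept exists to gather.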
Gartner predicts that the portion of organizations using cloud services will reach 80% by the end of 2015. With cloud disaster recovery services becoming popular, companies want quick recovery of vital applications in case of failure by taking advantage of cloud-based DR solutions. It is therefore imperative for organizations to set their own PoC strategy, choose their own PoC clouds, navigate technical hurdles and compatibility challenges, and measure success.
In conclusion, to execute a project successfully, an organization must give maximum importance to the proof of concept, which defines the project's success criteria. A proof-of-concept template can be applied across projects, helping businesses bridge the gap between the visionary and delivery stages of production efforts.
Figure: Resource equals money