By Srinivasan Sundara Rajan
February 14, 2011 06:00 PM EST
Data Warehousing As A Cloud Candidate
Over the past year we have seen growing support for the Cloud from major vendors, and the Cloud is here to stay. The bigger impact is that the path is now clearly drawn for enterprises to adopt the Cloud. With this in mind, it is time to identify which existing data center applications are good candidates for migration to the Cloud.
Most of the major IT vendors predict that HYBRID delivery will be the future, whereby enterprises will adopt a delivery model in which certain workloads run on Clouds while others continue to run in data centers, together with a model that integrates the two.
Before we go further into a blueprint of how data warehouses fit within a HYBRID Cloud environment, let us look at the salient features of data warehouses and how the tenets of the Cloud make them a very viable workload to be moved to the Cloud.
A data warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision-making process.
Data Warehousing Usage | Cloud Tenet Value Proposition
The ETL (Extract, Cleansing, Transform, Load) process is subject to variable patterns; large files typically arrive over the weekend or at night to be processed and loaded. | It is better to consume compute resources on demand for ETL as they are required, rather than maintaining a fixed capacity.
OLAP (Online Analytical Processing) and the related processing needs of MOLAP (Multidimensional OLAP) and/or ROLAP (Relational OLAP) are highly compute-intensive and require strong processing power. | High-performance computing and the ability to scale up on demand are tenets of the Cloud that align closely with this need.
Physical architecture needs are complex in a data warehousing environment. | Most IaaS and PaaS offerings, such as the Azure platform and Amazon EC2, have built-in provisions for a highly available architecture, with most of the day-to-day administration abstracted from the enterprise; this is among the advantages of the SQL Azure platform.
Multiple software and platform needs. | The product stack of a data warehousing environment is very large, and most organizations find it difficult to arrive at an ideal list of software, platforms and tools for their BI platform. SaaS for applications such as data cleansing or address validation, and PaaS for reporting such as Microsoft SQL Azure Reporting, are ideal for solving the tools-and-platform maze.
The following are the ideal steps for migrating an on-premise data warehouse system to a cloud platform. For the sake of the case study, the Microsoft Windows Azure platform is chosen as the target platform.
1. Create Initial Database / Allocate Storage / Migrate Data
The STAR schema design of the existing data warehousing system can be migrated to the Cloud platform as-is, and migrating to a relational database platform like SQL Azure should be straightforward. To migrate the data, the initial storage allocation of the existing database in the data center needs to be calculated, and the same amount of storage resources allocated on the Cloud.
You can store any amount of data, from kilobytes to terabytes, in SQL Azure. However, individual databases are limited to 10 GB in size. To create solutions that store more than 10 GB of data, you must partition large data sets across multiple databases and use parallel queries to access the data.
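The partition-and-fan-out pattern above can be sketched in a few lines. This is a minimal illustration, not SQL Azure code: the shard lists below stand in for separate 10 GB databases, and the hash routing and parallel merge show the shape of the logic an application tier would implement over real connections. All names are illustrative.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

NUM_SHARDS = 4  # e.g. four separate SQL Azure databases

def shard_for(customer_id: str) -> int:
    """Route a row to a shard by hashing its partition key."""
    digest = hashlib.md5(customer_id.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# Stand-ins for the shard databases.
shards = [[] for _ in range(NUM_SHARDS)]

def insert(row):
    shards[shard_for(row["customer_id"])].append(row)

def query_shard(shard, predicate):
    return [r for r in shard if predicate(r)]

def parallel_query(predicate):
    """Fan the same query out to every shard and merge the results."""
    with ThreadPoolExecutor(max_workers=NUM_SHARDS) as pool:
        parts = pool.map(lambda s: query_shard(s, predicate), shards)
    return [row for part in parts for row in part]

for i in range(100):
    insert({"customer_id": f"C{i:04d}", "amount": i})

big_orders = parallel_query(lambda r: r["amount"] >= 90)
print(len(big_orders))  # 10
```

The key design point is that the partition key must appear in every routing decision, so that a given row always lands on (and is queried from) the same database.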
Once a highly scalable database infrastructure is set up on the SQL Azure platform, the following are some of the methods by which the data from the existing on-premise data warehouse can be moved to SQL Azure.
Traditional BCP Tool: bcp is a command-line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can use bcp to back up and restore your data on SQL Azure, to import large numbers of new rows into SQL Azure tables, or to export data out of tables into data files.
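As a sketch of what a bulk import invocation looks like, the helper below builds (but does not execute) a bcp command line for loading a fact table into SQL Azure. The server, database, table and credential values are placeholders; the `-n` (native format), `-S`, `-U` and `-P` switches are standard bcp options, and the `user@servername` login form is the one SQL Azure expects.

```python
def server_short(server):
    # SQL Azure logins use user@servername (first label of the host)
    return server.split(":")[-1].split(".")[0]

def build_bcp_import(server, database, table, datafile, user, password):
    """Assemble a bcp 'in' command for bulk-loading a data file."""
    return [
        "bcp", f"{database}.dbo.{table}", "in", datafile,
        "-n",                              # native data format
        "-S", server,                      # target server
        "-U", f"{user}@{server_short(server)}",
        "-P", password,
    ]

cmd = build_bcp_import(
    "tcp:myserver.database.windows.net", "SalesDW", "FactSales",
    "factsales.dat", "dwadmin", "secret")
print(" ".join(cmd))
# bcp SalesDW.dbo.FactSales in factsales.dat -n -S tcp:myserver.database.windows.net -U dwadmin@myserver -P secret
```

In practice this command would be run per table, typically scripted over the full list of dimension and fact tables.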
The following tools are also useful if your existing data warehouse is in SQL Server within the data center.
You can transfer data to SQL Azure by using SQL Server 2008 Integration Services (SSIS). SQL Server 2008 R2 or later supports the Import and Export Data Wizard and bulk copy for the transfer of data between an instance of Microsoft SQL Server and SQL Azure.
SQL Server Migration Assistant (SSMA for Access v4.2) supports migrating your schema and data from Microsoft Access to SQL Azure.
2. Set Up ETL & Integration With Existing On Premise Data Sources
After the initial load of the data warehouse on the Cloud, it needs to be continuously refreshed with operational data. This process extracts data from different data sources (such as flat files, legacy databases, RDBMSs, and ERP, CRM and SCM application packages).
This process also carries out the necessary transformations, such as joining tables, sorting, and applying various filters.
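The transform step can be sketched as follows. This is a toy extract-transform example, not tied to any particular ETL tool: it joins operational rows to a dimension lookup, filters, and sorts before the load. Table and column names are illustrative.

```python
orders = [  # rows extracted from an operational source
    {"order_id": 3, "cust_id": "C1", "amount": 250},
    {"order_id": 1, "cust_id": "C2", "amount": 75},
    {"order_id": 2, "cust_id": "C1", "amount": 120},
]
customers = {"C1": "Acme", "C2": "Globex"}  # dimension lookup

def transform(rows):
    """Join to the customer dimension, filter, and sort for loading."""
    joined = [
        {**r, "customer_name": customers[r["cust_id"]]}
        for r in rows
        if r["amount"] >= 100          # filter out small orders
    ]
    return sorted(joined, key=lambda r: r["order_id"])

for row in transform(orders):
    print(row["order_id"], row["customer_name"], row["amount"])
# 2 Acme 120
# 3 Acme 250
```

A production pipeline does the same three things (join, filter, sort) against staging tables rather than in-memory lists, but the shape of the logic is identical.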
The following are typical options available on the SQL Azure platform for building an ETL pipeline between on-premise sources and the data warehouse hosted on the Cloud. The tools mentioned above for the initial load of the data also hold good as ETL tools; they are not repeated here to avoid duplication.
SQL Azure Data Sync :
- Cloud to cloud synchronization
- Enterprise (on-premise) to cloud
- Cloud to on-premise.
- Bi-directional or sync-to-hub or sync-from-hub synchronization
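The hub-based topology in the list above can be illustrated with a minimal last-writer-wins sketch. This is not the SQL Azure Data Sync protocol itself, only the shape of it: member replicas push rows the hub has not seen (sync-to-hub), then pull back newer rows they are missing (sync-from-hub). Row keys and version numbers are illustrative.

```python
def sync(member, hub):
    """One bi-directional pass between a member replica and the hub."""
    for key, (value, version) in member.items():
        if key not in hub or hub[key][1] < version:
            hub[key] = (value, version)          # sync-to-hub
    for key, (value, version) in hub.items():
        if key not in member or member[key][1] < version:
            member[key] = (value, version)       # sync-from-hub

hub = {}
on_premise = {"row1": ("v1", 1)}
cloud = {"row1": ("v2", 2), "row2": ("x", 1)}

sync(on_premise, hub)
sync(cloud, hub)
sync(on_premise, hub)   # second pass picks up the cloud's newer rows

print(on_premise["row1"][0], sorted(on_premise))  # v2 ['row1', 'row2']
```

Note that each member converges only after a second pass, which is why hub-based sync services run on a schedule rather than as a one-shot copy.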
The following diagram, courtesy of the vendor, gives an overview of how SQL Azure Data Sync can be used for ETL purposes.
Windows Azure AppFabric Integration provides common BizTalk Server integration capabilities (e.g., pipeline, transforms, adapters) on Windows Azure, using out-of-the-box integration patterns to accelerate and simplify development. It also delivers higher-level business-user enablement capabilities such as Business Activity Monitoring and Rules, as well as a self-service trading partner community portal and provisioning of business-to-business pipelines. The following diagram, courtesy of the vendor, shows how Windows Azure AppFabric Integration can be used as an ETL platform.
3. Create Cubes & Other Analytic Structures
The multidimensional nature of OLAP requires an analytical engine to process the underlying data and create a multidimensional view. The success of OLAP has resulted in a large number of vendors offering OLAP servers built on different architectures:
MOLAP: A proprietary multidimensional database, with an emphasis on performance.
ROLAP : Relational OLAP is a technology that provides sophisticated multidimensional analysis that is performed on open relational databases. ROLAP can scale to large data sets in the terabyte range.
HOLAP : Hybrid OLAP is an attempt to combine some of the features of MOLAP and ROLAP technology.
SQL Azure Database does not support all of the features and data types found in SQL Server. Analysis Services, Replication, and Service Broker are not currently provided as services on the Windows Azure platform.
At this time there is no direct support for OLAP and cube processing on SQL Azure; however, using the HPC (High Performance Computing) attributes of the platform and multiple Worker roles, manual aggregation of the data can be achieved.
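The manual aggregation approach can be sketched as a fan-out/merge over workers. The threads below stand in for Windows Azure Worker roles, and the dimension and measure names are illustrative: each worker produces a partial roll-up of its chunk of fact rows, and the partials are merged into the final cube cell totals.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

fact_rows = [
    {"region": "East", "year": 2010, "sales": 100},
    {"region": "East", "year": 2011, "sales": 150},
    {"region": "West", "year": 2010, "sales": 200},
    {"region": "West", "year": 2010, "sales": 50},
]

def aggregate(chunk):
    """Partial roll-up of sales by (region, year) for one worker."""
    totals = Counter()
    for row in chunk:
        totals[(row["region"], row["year"])] += row["sales"]
    return totals

def parallel_rollup(rows, workers=2):
    chunks = [rows[i::workers] for i in range(workers)]
    grand = Counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(aggregate, chunks):
            grand.update(partial)   # merge the partial aggregates
    return dict(grand)

print(sorted(parallel_rollup(fact_rows).items()))
# [(('East', 2010), 100), (('East', 2011), 150), (('West', 2010), 250)]
```

Because sums merge cleanly, the partial results can be combined in any order; measures such as averages would instead need to carry (sum, count) pairs through the merge.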
4. Generate Reports
Reporting consists of analyzing the data stored in the data warehouse along multiple dimensions, generating standard reports for business intelligence, and generating ad-hoc reports. These reports present data in graphical or tabular form, provide statistical analysis features, and should render to Excel, PDF and other formats.
It is better to utilize a SaaS- or PaaS-based reporting infrastructure rather than custom-coding all the reports.
SQL Azure Reporting enables developers to enhance their applications by embedding cloud-based reports on information stored in a SQL Azure database. Developers can author reports using familiar SQL Server Reporting Services tools and then use those reports in applications that may be on-premises or in the cloud.
SQL Azure Reporting also currently connects only to SQL Azure databases.
The above steps provide a path for migrating on-premise data warehousing applications to the Cloud. Since we needed substantial vendor support across IaaS, PaaS and SaaS, the Microsoft Azure platform was chosen to support the case study. With several features integrated into it, the Microsoft Cloud platform is positioned to be one of the leading platforms for BI on the Cloud.
The following diagram indicates a blueprint of a typical Cloud BI organization on the Microsoft Azure platform.