
A Cloud Computing Business Intelligence Organization

Moving data warehouses to the cloud

Data Warehousing As A Cloud Candidate
Over the past year we have seen growing support for the cloud from major vendors, and the cloud is here to stay. The bigger impact is that the path is now clearly drawn for enterprises to adopt the cloud. With this in mind, it is time to identify which existing data center applications are candidates for migration to the cloud.

Most major IT vendors predict that hybrid delivery will be the future: enterprises will need a delivery model in which certain workloads run on clouds while others remain in data centers, together with a model that integrates the two.

Before we go further into a blueprint of how data warehouses fit within a hybrid cloud environment, we will look at the salient features of data warehouses and at how the tenets of cloud computing make them a very viable workload to move to the cloud.

A data warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision-making process.

Data Warehousing Usage vs. Cloud Tenant Value Proposition

Each data warehousing usage characteristic below is paired with the cloud tenant value proposition that addresses it.

Usage: The ETL (Extract, Clean, Transform, Load) process is subject to variable load patterns; large files typically arrive over the weekend or at night to be processed and loaded.

Value proposition: It is better to consume compute resources on demand as the ETL process requires them, rather than maintaining fixed capacity.

Usage: OLAP (Online Analytical Processing) and the related processing for MOLAP (Multidimensional OLAP) and/or ROLAP (Relational OLAP) are highly compute intensive and require substantial processing power.

Value proposition: High-performance computing and the ability to scale up on demand are tenets of the cloud that align closely with this need.

Usage: Physical architecture needs are complex in a data warehousing environment:

  • MPP Servers (Massively Parallel Processing)
  • Shared Nothing Data Architecture
  • Mirrored Copies of Disk Space
  • High Availability Clustering

Value proposition: Most IaaS and PaaS offerings, such as the Azure platform and Amazon EC2, have built-in provisions for a highly available architecture, with most of the day-to-day administration abstracted from the enterprise.

Below are some of the advantages of the SQL Azure platform:

  • No physical administration required - software installation and patching are included, as this is Platform as a Service (PaaS)
  • High availability and fault tolerance are built in

Usage: There are multiple software and platform needs:

  • Database Design Tools (STAR Schema Modeling)
  • ETL Tools
  • Data Cleansing Tools
  • OLAP Tools
  • Spatial Tools
  • Data Mining Tools
  • BI Reporting Tools

Value proposition: The product stack of a data warehousing environment is very large, and most organizations find it difficult to settle on an ideal list of software, platforms and tools for their BI platform. SaaS for applications like data cleansing or address validation, and PaaS for reporting such as Microsoft SQL Azure Reporting, are ideal for cutting through the tools and platform maze.


The following are the ideal steps for migrating an on-premise data warehouse system to a cloud platform. For the sake of the case study, the Microsoft Windows Azure platform is chosen as the target platform.

1. Create Initial Database / Allocate Storage / Migrate Data
The existing STAR schema design of the data warehousing system can be migrated to the cloud platform as-is, and migrating to a relational database platform like SQL Azure should be straightforward. To migrate the data, the initial storage allocation of the existing database in the data center needs to be calculated, and the same amount of storage is allocated on the cloud.

You can store any amount of data, from kilobytes to terabytes, in SQL Azure. However, individual databases are limited to 10 GB in size. To create solutions that store more than 10 GB of data, you must partition large data sets across multiple databases and use parallel queries to access the data.
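As a rough sketch of that partitioning approach, the following example (in Python, using the pyodbc library) fans an aggregate query out across several shard databases in parallel and combines the partial results. The connection strings, the FactSales table and the SalesAmount column are hypothetical placeholders, not part of any particular product.

```python
# Minimal sketch: querying data partitioned across multiple SQL Azure databases
# in parallel and combining the results. Connection strings, table and column
# names are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor

import pyodbc

SHARD_CONNECTION_STRINGS = [
    "Driver={SQL Server Native Client 10.0};Server=tcp:myserver.database.windows.net;"
    "Database=SalesShard1;Uid=user@myserver;Pwd=secret;Encrypt=yes;",
    "Driver={SQL Server Native Client 10.0};Server=tcp:myserver.database.windows.net;"
    "Database=SalesShard2;Uid=user@myserver;Pwd=secret;Encrypt=yes;",
]

QUERY = "SELECT SUM(SalesAmount) FROM dbo.FactSales WHERE OrderYear = ?"


def query_shard(conn_str, year):
    """Run the aggregate query against one shard and return its partial total."""
    with pyodbc.connect(conn_str) as conn:
        row = conn.cursor().execute(QUERY, year).fetchone()
        return row[0] or 0


def total_sales(year):
    """Fan the query out to every shard in parallel and add up the partial sums."""
    with ThreadPoolExecutor(max_workers=len(SHARD_CONNECTION_STRINGS)) as pool:
        partials = pool.map(lambda cs: query_shard(cs, year), SHARD_CONNECTION_STRINGS)
        return sum(partials)


if __name__ == "__main__":
    print("Total sales for 2010:", total_sales(2010))
```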

Once a highly scalable database infrastructure is set up on the SQL Azure platform, the following are some of the methods by which data from the existing on-premise data warehouse can be moved to SQL Azure.

Traditional BCP Tool: BCP is a command-line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can use BCP to back up and restore your data on SQL Azure, to import large numbers of new rows into SQL Azure tables, or to export data from tables into data files.
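A hedged sketch of how such a bulk copy might be scripted is shown below: the bcp utility is invoked once to export a table from the on-premise SQL Server in native format and once to load the resulting file into SQL Azure. All server, database, table and credential values are placeholders, and the exact options required may vary by SQL Server version.

```python
# Sketch: scripting bcp to move one table from an on-premise SQL Server to SQL Azure.
# All server, database, table and credential values below are placeholders.
import subprocess

TABLE = "SalesDW.dbo.FactSales"
DATA_FILE = "factsales.dat"

# Export from the on-premise server in native format (-n), using Windows auth (-T).
subprocess.run(
    ["bcp", TABLE, "out", DATA_FILE, "-n", "-S", "onprem-sql-server", "-T"],
    check=True,
)

# Import into the SQL Azure database with SQL authentication (-U / -P).
subprocess.run(
    [
        "bcp", TABLE, "in", DATA_FILE, "-n",
        "-S", "tcp:myserver.database.windows.net",
        "-U", "user@myserver", "-P", "secret",
    ],
    check=True,
)
```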

The following tools are also useful if your existing data warehouse runs on SQL Server within the data center.

You can transfer data to SQL Azure by using SQL Server 2008 Integration Services (SSIS). SQL Server 2008 R2 or later supports the Import and Export Data Wizard and bulk copy for the transfer of data between an instance of Microsoft SQL Server and SQL Azure.

SQL Server Migration Assistant (SSMA for Access v4.2) supports migrating your schema and data from Microsoft Access to SQL Azure.

2. Set Up ETL & Integration With Existing  On Premise Data Sources
After the initial load of the data warehouse on the cloud, it needs to be continuously refreshed with operational data. This process extracts data from different data sources (such as flat files, legacy databases, RDBMS, and ERP, CRM and SCM application packages).

This process also carries out the necessary transformations, such as joining tables, sorting, and applying various filters.
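To make this flow concrete, the sketch below reads a flat operational extract, applies a simple filter and transformation, and bulk-inserts the surviving rows into a staging table on SQL Azure. The file layout, the StageSales table and the connection string are assumptions made purely for illustration.

```python
# Minimal ETL sketch: flat-file extract, simple transform, load into SQL Azure.
# Connection string, file layout and table name are hypothetical.
import csv

import pyodbc

CONN_STR = (
    "Driver={SQL Server Native Client 10.0};Server=tcp:myserver.database.windows.net;"
    "Database=SalesDW;Uid=user@myserver;Pwd=secret;Encrypt=yes;"
)


def extract(path):
    """Extract: yield raw rows from a pipe-delimited operational extract file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f, delimiter="|")


def transform(rows):
    """Transform: keep only completed orders and normalize the amount to a float."""
    for row in rows:
        if row["Status"] != "COMPLETED":
            continue
        yield (row["OrderID"], row["CustomerID"], float(row["Amount"]))


def load(records):
    """Load: bulk-insert the transformed rows into a staging table."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True  # available in recent pyodbc versions
        cursor.executemany(
            "INSERT INTO dbo.StageSales (OrderID, CustomerID, Amount) VALUES (?, ?, ?)",
            list(records),
        )
        conn.commit()


if __name__ == "__main__":
    load(transform(extract("nightly_orders.txt")))
```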

The following are typical options available on the SQL Azure platform for building an ETL pipeline between on-premise sources and a data warehouse hosted on the cloud. The tools mentioned above for the initial data load also serve as ETL tools; they are not repeated here to avoid duplication.

SQL Azure Data Sync supports:

  • Cloud-to-cloud synchronization
  • Enterprise (on-premise) to cloud
  • Cloud to on-premise
  • Bi-directional, sync-to-hub, or sync-from-hub synchronization

The following diagram, courtesy of the vendor, gives an overview of how SQL Azure Data Sync can be used for ETL purposes.

Windows Azure AppFabric Integration provides common BizTalk Server integration capabilities (e.g., pipelines, transforms, adapters) on Windows Azure, using out-of-the-box integration patterns to accelerate and simplify development. It also delivers higher-level business user enablement capabilities such as Business Activity Monitoring and Rules, as well as a self-service trading partner community portal and provisioning of business-to-business pipelines. The following diagram, courtesy of the vendor, shows how Windows Azure AppFabric Integration can be used as an ETL platform.

3. Create CUBES & Other Analytics  Structures
The multidimensional nature of OLAP requires an analytical engine to process the underlying data and create a multidimensional view. The success of OLAP has resulted in a large number of vendors offering OLAP servers based on different architectures.

MOLAP: A proprietary multidimensional database with an emphasis on performance.

ROLAP: Relational OLAP is a technology that provides sophisticated multidimensional analysis performed directly on open relational databases. ROLAP can scale to large data sets in the terabyte range.

HOLAP: Hybrid OLAP is an attempt to combine some of the features of MOLAP and ROLAP technology.

SQL Azure Database does not support all of the features and data types found in SQL Server. Analysis Services, Replication, and Service Broker are not currently provided as services on the Windows Azure platform.

At this time there is no direct support for OLAP and cube processing on SQL Azure; however, using the HPC (High Performance Computing) capabilities of multiple Worker roles, manual aggregation of the data can be achieved.
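The sketch below illustrates one way such manual aggregation could look: a worker process streams fact rows from SQL Azure in batches, rolls them up by a pair of dimension keys in memory, and rewrites a pre-aggregated summary table for reports to query. The fact and summary tables, their columns and the connection string are hypothetical.

```python
# Sketch: worker-role-style manual aggregation in place of OLAP cube processing.
# Fact and summary table names, columns and the connection string are placeholders.
from collections import defaultdict

import pyodbc

CONN_STR = (
    "Driver={SQL Server Native Client 10.0};Server=tcp:myserver.database.windows.net;"
    "Database=SalesDW;Uid=user@myserver;Pwd=secret;Encrypt=yes;"
)


def aggregate_sales():
    totals = defaultdict(float)  # (ProductKey, OrderYear) -> summed sales amount
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT ProductKey, OrderYear, SalesAmount FROM dbo.FactSales")
        while True:
            batch = cursor.fetchmany(10000)  # stream the fact table in batches
            if not batch:
                break
            for product_key, order_year, amount in batch:
                totals[(product_key, order_year)] += float(amount)

        # Rebuild the pre-aggregated summary table that the reports will query.
        cursor.execute("DELETE FROM dbo.SalesByProductYear")
        cursor.executemany(
            "INSERT INTO dbo.SalesByProductYear (ProductKey, OrderYear, TotalAmount) "
            "VALUES (?, ?, ?)",
            [(p, y, t) for (p, y), t in totals.items()],
        )
        conn.commit()


if __name__ == "__main__":
    aggregate_sales()
```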

4. Generate Reports
Reporting consists of analyzing the data stored in the data warehouse across multiple dimensions, generating standard reports for business intelligence, and generating ad-hoc reports. These reports present data in graphical or tabular form, provide statistical analysis features, and should be rendered as Excel, PDF and other formats.
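For a sense of what an ad-hoc tabular report involves before turning to a managed reporting service, the following sketch runs an aggregate query against the warehouse and renders the result set as a CSV file that Excel can open. The query, table names and connection string are illustrative assumptions only.

```python
# Sketch: rendering an ad-hoc tabular report from SQL Azure as a CSV file.
# Query, table names and connection string are hypothetical.
import csv

import pyodbc

CONN_STR = (
    "Driver={SQL Server Native Client 10.0};Server=tcp:myserver.database.windows.net;"
    "Database=SalesDW;Uid=user@myserver;Pwd=secret;Encrypt=yes;"
)

REPORT_QUERY = """
SELECT d.Region, f.OrderYear, SUM(f.SalesAmount) AS TotalSales
FROM dbo.FactSales AS f
JOIN dbo.DimCustomer AS d ON d.CustomerKey = f.CustomerKey
GROUP BY d.Region, f.OrderYear
ORDER BY d.Region, f.OrderYear
"""


def export_report(path="sales_by_region.csv"):
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor().execute(REPORT_QUERY)
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cursor.description])  # header row
            writer.writerows(cursor.fetchall())


if __name__ == "__main__":
    export_report()
```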

It is better to utilize the SaaS based or PaaS based reporting infrastructure rather than custom coding all the reports.

SQL Azure Reporting enables developers to enhance their applications by embedding cloud based reports on information stored in a SQL Azure database.  Developers can author reports using familiar SQL Server Reporting Services tools and then use these reports in their applications which may be on-premises or in the cloud.

SQL Azure Reporting  also currently can connect only to SQL Azure databases.

Summary
The above steps provide a path for migrating on-premise data warehousing applications to the cloud. Because considerable vendor support is needed in terms of IaaS, PaaS and SaaS, the Microsoft Azure platform was chosen to support the case study. With several of these features integrated into the platform, the Microsoft cloud platform is positioned to be one of the leading platforms for BI on the cloud.

The following diagram indicates a blueprint of a typical cloud BI organization on the Microsoft Azure platform.

More Stories By Srinivasan Sundara Rajan

Srinivasan is passionate about ownership and driving things on his own; with his breadth and depth in enterprise technology, he can run any aspect of the IT industry and make it a success.

He is a seasoned enterprise IT expert, mainly in the areas of solution, integration and architecture across structured and unstructured data sources, especially in the manufacturing domain.

He currently works as Technology Head for GAVS Technologies.
