By Scott Middleton
January 7, 2014 08:30 AM EST
We recently completed a proof-of-concept (POC) that involved pulling data out of DynamoDB and into Redshift so that business users could analyze the data in an ad hoc manner with JasperSoft. JasperSoft and Redshift are touted as the new way to do data warehousing in the cloud and we were using DynamoDB as a sort of alternative to something Hadoop based.
We were amazed at how quickly and easily we could get a business-user-friendly view of the data we had stored in DynamoDB using Redshift and JasperSoft. The actual human effort required to copy some initial data from DynamoDB into Redshift and then view it in JasperSoft was barely a few hours. There were a few unforeseen technical challenges along the way, but none were insurmountable. Ultimately we will continue with this technology combination because, as well as being easy to deploy and use, it gives us confidence that we can scale the POC into a "real" solution and that our growing data needs will be taken care of.
Redshift's inbuilt copy from DynamoDB function makes getting data into Redshift fast but has limitations
Redshift provides an out-of-the-box copy function to copy data from DynamoDB into Redshift without the need to set up servers or write any code other than a few simple lines of SQL. We were able to copy many of our tables straight out of DynamoDB and into Redshift and start running ad hoc queries without having to fire up servers or create an entire data translation layer of software.
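For readers who haven't seen it, the copy looks roughly like the sketch below. Table names and credentials are placeholders, and the exact option syntax may differ by Redshift version; READRATIO caps what percentage of the DynamoDB table's provisioned read throughput the copy is allowed to consume.

```sql
-- Sketch only: table names and credential values are placeholders.
-- READRATIO 50 limits the copy to half the table's provisioned reads,
-- leaving headroom for a live application reading the same table.
COPY events
FROM 'dynamodb://Events'
CREDENTIALS 'aws_access_key_id=<key>;aws_secret_access_key=<secret>'
READRATIO 50;
```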
We couldn't copy all of our tables, however, as DynamoDB's String Set field type is not currently supported by Redshift's copy function. After attempts at various SQL hacks and closely reading the Redshift manual we realized that we would not be able to work with these tables in the POC. In the future we will write some simple copy scripts (unless Amazon beats us to it and updates the copy command which, given their continuous product improvement, is likely).
Copying takes time
As it was a POC we were only copying across 3.5GB, or 50 million rows, of data, but this process still took a while to complete: 37 hours. Both Redshift and DynamoDB were running on the lowest performance settings, and the DynamoDB instance was also servicing the needs of the live beta application we were trying to extract data from. We suspect this process could easily be made quicker by increasing the DynamoDB instance power and the power of Redshift, but we did not test this.
The time it takes to copy 3.5GB of data indicated to us that a considered approach is necessary for getting data from DynamoDB into Redshift, especially considering that the live data will be much larger in volume. For example, when this goes into production we are only going to copy new records on a daily basis instead of clearing the entire Redshift database and reloading it to stay up to date.
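One way to structure such an incremental load is to copy into a staging table and then insert only rows that don't yet exist in the target. This is a hedged sketch, not our production code: it assumes each record carries a unique `event_id`, and all names are illustrative.

```sql
-- Illustrative daily incremental load; assumes a unique event_id per record.
CREATE TEMP TABLE events_staging (LIKE events);

-- Placeholder table and credentials, as before.
COPY events_staging
FROM 'dynamodb://Events'
CREDENTIALS 'aws_access_key_id=<key>;aws_secret_access_key=<secret>'
READRATIO 50;

-- Append only the rows not already present in the target table.
INSERT INTO events
SELECT s.*
FROM events_staging s
LEFT JOIN events e ON e.event_id = s.event_id
WHERE e.event_id IS NULL;
```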
Working in SQL with Redshift makes life easy
Once we had pulled our initial copy of data into Redshift we needed to manipulate the data to get it into a form that business users could analyze and create reports with.
The great thing about Redshift is that you are working in an environment you are already familiar with: SQL. Redshift, at the time of writing, is based on PostgreSQL 8.0.2, so we were able to apply familiar string manipulation and math functions, as well as create and join new tables, to make the data much easier for a non-technical business user to understand.
Some SQL functions aren't yet supported by Redshift so we had to read through the documentation every now and then to find a suitable alternative. Sometimes it was just about trying to find the alternate name Redshift was using for a function we were used to using. Other times it meant creating some interesting workaround SQL. For example, Redshift doesn't support a function that can convert a Unix timestamp to a date so we had to manually convert our time stamps to dates using a mathematical formula.
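One common workaround of this kind, which may or may not match the exact formula we used, is to add the epoch seconds to the epoch timestamp using interval arithmetic; `unix_ts` here is a hypothetical column holding a Unix timestamp in seconds.

```sql
-- Convert a Unix timestamp (seconds since 1970-01-01) to a timestamp
-- without a dedicated conversion function, via interval arithmetic.
SELECT TIMESTAMP 'epoch' + unix_ts * INTERVAL '1 second' AS event_time
FROM events;
```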
Instant, non-technical user friendly data access with JasperSoft
We spun up a JasperSoft OnDemand instance and connected it to Redshift quite quickly, and were creating ad hoc views and reports within minutes.
We did have some issues analyzing one of our tables straight out of the box: one table held almost 3.5GB of data, and attempting to view reports on it caused JasperSoft to crash. With some tweaking of the way the reports ran, we were able to prevent analysis of this table from crashing JasperSoft.