

Containers Expo Blog: Tutorial

How to Provision Your First Virtual Machine in Windows Azure

Creating and provisioning a new virtual machine can be completed in just a few clicks

Windows Azure is currently offering an exclusive promotion: $200 in free credits just for test driving Microsoft's public cloud service. What's the catch? When you sign up, Microsoft asks for a credit card and a phone number so that it can verify your identity. Once verified, you can dive into the world of Windows Azure.

How to Create a New VM
Once you've signed up for Azure, find the "Virtual Machines" tab on the left. Click the NEW button in the bottom left-hand corner to begin the new virtual machine wizard. You will be asked to choose between a "Quick Create" and a "From Gallery" virtual machine. Select "From Gallery" so you can see everything that is available to you.

Choosing an Image
The featured images are all Microsoft operating system images, but you can also choose Ubuntu, CentOS, SUSE, and Oracle images. For this demonstration, click the Ubuntu tab on the left, select Ubuntu Server 13.10, and click next.
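If you prefer the command line, the classic Azure xplat CLI exposed the same gallery via `azure vm image list`. A minimal sketch of picking out the Ubuntu 13.10 entry; the listing below is a hypothetical stand-in for real CLI output so the filtering step can run without an Azure account:

```shell
# HYPOTHETICAL sample of 'azure vm image list' output -- run that command
# yourself (after signing in) to get the real image names.
cat > image-list.txt <<'EOF'
Windows-Server-2012-Datacenter
Ubuntu-12_04-LTS-amd64-server
Ubuntu-13_10-amd64-server
CentOS-6-5
EOF

# Pick out the Ubuntu 13.10 entry (case-insensitive match).
grep -i 'ubuntu.*13_10' image-list.txt
```

The name printed here is what you would pass to the create command in place of clicking through the gallery.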

VM Configuration
The VM configuration screen collects details about this specific machine. Here you name the machine, select the size of the server on which you'd like to deploy the image, and create a new username for the machine. Provide a password for that account to add an extra layer of protection.

After you click next, you'll be presented with an option to add this virtual server to a pool of cloud servers you have already built. If this is your first virtual machine, you probably won't have a cloud service yet; one will be created when this VM is complete. Regional affinity lets you choose where your data is stored, and Microsoft publishes a map of expected ping times for each region. For a storage account, you can use an automatically generated one. Under availability, the default is None; if you'd like this VM grouped with others so that the group stays reachable during hardware failures or planned maintenance, this is where you would configure an availability set.

The last VM configuration screen sets up the endpoints (ports) for your VM. In this case, Azure asks which port you'd like to use for SSH. Port 22 is the default, and that setting is suitable for most configurations. Click the checkmark and pat yourself on the back for creating your very first virtual machine in Windows Azure.
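The whole wizard can be compressed into a single call with the classic xplat CLI. A sketch under stated assumptions: the flag names (`--ssh`, `--vm-size`, `--location`) are from the classic `azure vm create` command and should be verified with `azure vm create --help`, and the image name, password, and DNS name are hypothetical placeholders. The script only prints the command so you can review it before running anything:

```shell
# Assemble the provisioning command (dry run: printed, not executed).
DNS_NAME="my-first-vm"              # becomes my-first-vm.cloudapp.net
IMAGE="Ubuntu-13_10-amd64-server"   # placeholder; take the real name from 'azure vm image list'
ADMIN_USER="azureuser"
ADMIN_PASS="ChangeMe123!"           # placeholder; choose your own strong password
LOCATION="East US"

CMD="azure vm create $DNS_NAME $IMAGE $ADMIN_USER $ADMIN_PASS --ssh 22 --vm-size Small --location \"$LOCATION\""
echo "$CMD"
# To provision for real, authenticate first (e.g. 'azure account import <publishsettings-file>'),
# review the printed command, then run it.
```

Keeping the command in a variable and echoing it first is a simple safeguard against provisioning (and billing for) a machine with a typo in its configuration.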

More Stories By Natalie Lerner

Natalie Lerner is a senior contributor for CloudWedge. In her spare time, Natalie enjoys exploring all things cloud and is a music enthusiast. Follow Natalie’s daily posts on Twitter: @Cloudwedge, or on Facebook.
