Wednesday 26 September 2012

AWS Week in Review - September 17th to September 23rd, 2012

Let's take a quick look at what happened in AWS-land last week:


Monday, September 17

§   AWS added four new checks to AWS Trusted Advisor: EC2 Reserved Instance Optimization, VPN Tunnel Redundancy, RDS Backup, and RDS Multi-AZ.

Tuesday, September 18

§   AWS introduced Auto Scaling termination policies to give you additional control over the scale-down process.

Friday, September 21

§ Developer Andy Chilton released version 0.11.0 of Awssum, a collection of node.js modules for AWS.


SOURCE
 

Infographic: Young Professionals & Risky Tech Behavior

Based on the 2011 Cisco Connected World Technology Report, BackgroundCheck.org has compiled an infographic looking at the risky tech-related behavior young professionals engage in (in terms of device and password management, extreme internet behavior, and lost or stolen devices).


Young Professionals & Risky Tech Behavior
From: BackgroundCheck.org

SOURCE
 

Infographic: Hypervisor Tug-of-War

 
We've described the battle between VMware and Microsoft as a hypervisor war going back as far as the first release of Microsoft Virtual Server. Unfortunately for Microsoft, that war was pretty ugly. On the one side, VMware had guns and cannons, and on the other side, Microsoft was throwing rocks. Ok, maybe it wasn't quite that bad, but you get the picture.
Fast forward and Microsoft has added a lot of weaponry with the introduction of Microsoft Hyper-V, the company's true hypervisor. And the Redmond giant's offering just keeps on getting better. But, so too does the ESX hypervisor from VMware. A virtual tug-of-war if you will.
Check out this latest infographic from the folks at SolarWinds. Instead of depicting this as a hypervisor war, it calls it the Hypervisor Tug-of-War.
The survey information clearly shows that VMware remains in control. However, there are some interesting data points in there as well.
  • Survey respondents believe that VMware is far and away the most trusted private cloud vendor, at a whopping 76.9%.
  • However, 76.1% believe Microsoft will close the functionality gap with VMware thanks to Hyper-V 3.0.
  • While trying to break beyond the barrier of VM stall, 56.8% say key applications will be deployed on a private cloud next year, while 57.7% have already moved beyond the 40% mark of their datacenter being virtualized.
  • 70% of end users still deploy more than one virtualization tool.
And at the bottom, we see that VMware still commands a lead in the hypervisor tug-of-war.
 
 
 
Reblogged from VMBlog
Infographic by SolarWinds
 

Monday 24 September 2012

Best Practices For Companies In The Cloud


Best Practices For Companies In The Cloud...



SOURCE

Infographic: The Big Data Boom

Our age has moved into a complex situation. On the one hand, we are experiencing an ever-increasing level of home clouds, while on the other, commodity data services are hungrily intensifying. In the midst of this chaos, big data and cloud computing have secured their bond, and their friendship has come to the fore. This has prompted a sigh of relief from the user group.

Data is a common problem in organizations, and hence they want to figure out how to get their enterprise data assets under control. The friendship of big data and the cloud can be verified by the emergence of cloud computing with the processing power to manage exabytes of information. Being able to handle such volumes is a priority for big enterprises across the industry. Moreover, learning from the great success of big data in the cloud world, many governments have also become active players in the cloud domain.

Emerging big data trends show that organizations are reaching the analytical stages of their processes, gaining the ability to determine value and to know what their data is doing at a particular state of the business. This also requires combining huge amounts of data into common, accessible points that provide mission-critical business intelligence (BI). Data warehousing and the ability to look at the value of information, whether in an operational state or in a decision-support state, are also factors of prime importance.

Nearly all the latest market trends point to the notion of being agile and being customer responsive. Interestingly, not all big data cloud servers are the same. The technology that Microsoft provides is completely different from the technology that Google provides. The time it takes to push a big data project to completion inevitably depends on the technology used by the server. This leads us to the question of which service provider to choose and which to ignore.

To support an adaptive organization, big data is one of the critical elements. The ability of cloud computing to provision computer resources, storage, network capacity and, above all, do all this magic at a moment’s notice makes big data and the cloud the dearest of friends. There is a long way to go, but for now they seem to be friends for life.



Source: cloudtweaks.com
Infographic Source: Netuitive
Write-up Source: Big Data And Cloud Computing – Friends For Life

Friday 21 September 2012

Pros and Cons of Hybrid Cloud

Benefits and Limitations of investing in a Hybrid Cloud


Pros and Cons of Hybrid cloud

SOURCE

Pros and Cons of Private Cloud

Benefits and Limitations of investing in a Private Cloud

Pros and Cons of Private Cloud

SOURCE

Pros and Cons of Public Cloud

Benefits and Limitations of investing in a Public Cloud

 

Pros and Cons of Public Cloud

SOURCE

Infographic: America's Most Innovative Colleges

In late June 2012, CampusTechnology.com revealed the winners of its 2012 Innovators Awards, which are given out to IT leaders who have come up with extraordinary technology solutions for campus challenges.

Below, the information is compiled into an infographic to showcase the projects and technologies used by the winning colleges.

Innovations on Campus




Put Together By: OnlineUniversities.com


Wednesday 19 September 2012

AWS Week in Review - September 10th to September 16th, 2012

 

Let's take a quick look at what happened in AWS-land last week:
Tuesday, September 11
Wednesday, September 12
Thursday, September 13


SOURCE

Sign in to and Use the AWS Account


In this guide, let's cover the following topics:
  • Understanding the AWS Management Console
  • The different ways of logging in to the AWS Management Console
  • Signing in to the AWS Management Console using the account-level credentials
  • Configuring the AWS Management Console to suit your needs

AWS Management Console

AWS's own simple definition of the AWS Management Console:
Access and manage Amazon’s growing suite of infrastructure web services through a simple and intuitive, web-based user interface. The AWS Management Console provides convenient management of your compute, storage, and other cloud resources.

AWS constantly pushes new features and support for various services into the console. If a feature is not yet available through the AWS Management Console, the user must employ the APIs and/or SDKs provided by AWS.

In AWS, there are basically two ways for a user to sign in to the AWS Management Console to manage services:
  • Using the account-level credentials
Consider this a "POWER USER LOGIN" (a term coined by me to set perspective, not by AWS).
A user can sign in using the typical AWS Console login URL. The user must use the email address with which the account was created and the password provided.
This way of signing in gives the user complete control over AWS services, resources, and account management.
In this guide we will be concentrating on account-level login to the AWS Management Console.
  • Amazon Identity and Access Management (IAM) user
Consider these "ACCESS CONTROLLED USERS" (a term coined by me to set perspective, not by AWS).
If more than one user needs to log in to the AWS account, you can use the IAM service. Each user may have the same or different access controls over the various AWS services and resources. The users sign in to the console using an alias specific to your account, with their own login name and password. These privileges are not only applicable to the AWS Management Console; the same can be applied to the use of SDKs and APIs, by creating user-specific access keys and secret keys.
IAM also enables identity federation between your corporate directory and AWS services.
I'm NOT covering IAM user login in this tutorial, but I will write a guide on the topic and provide updated links.
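As a rough sketch of how such an access key/secret key pair is used outside the console, here is what signing a query-API request looked like under AWS Signature Version 2 (the credentials, parameters, and helper function are made up for illustration; real code should use an AWS SDK):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_request_v2(params, secret_key, host, path="/", verb="GET"):
    """Compute an AWS Signature Version 2 for a query-API request.

    The access key travels in the request parameters; the secret key never
    leaves your machine -- it is only used to compute the HMAC signature.
    """
    params = dict(params)
    params.setdefault("SignatureMethod", "HmacSHA256")
    params.setdefault("SignatureVersion", "2")
    # Canonicalize: sort parameters by name and percent-encode them.
    canonical = "&".join(
        "%s=%s" % (quote(k, safe="-_.~"), quote(str(v), safe="-_.~"))
        for k, v in sorted(params.items())
    )
    string_to_sign = "\n".join([verb, host.lower(), path, canonical])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

# Hypothetical credentials -- never hardcode real keys.
signature = sign_request_v2(
    {"Action": "DescribeInstances", "AWSAccessKeyId": "AKIDEXAMPLE",
     "Version": "2012-08-15", "Timestamp": "2012-09-23T12:00:00Z"},
    secret_key="wJalrXUtnFEMI/EXAMPLEKEY",
    host="ec2.amazonaws.com")
print(signature)
```

Because the parameters are sorted before signing, the same request always yields the same signature, which is how AWS verifies the caller holds the secret key without it ever being transmitted.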

Sign in to the AWS Management Console using the account-level credentials:

Tuesday 18 September 2012

Amazon VPC - New Additions

 
AWS has added 3 new features/options to the Amazon Virtual Private Cloud (VPC) service.
 
 
Below is an extract from the two blog posts written by Jeff on the same:
 
 
The Amazon Virtual Private Cloud (VPC) gives you the power to create a private, isolated section of the AWS Cloud. You have full control of network addressing. Each of your VPCs can include subnets (with access control lists), route tables, and gateways to your existing network and to the Internet.
 
You can connect your VPC to the Internet via an Internet Gateway and enjoy all the flexibility of Amazon EC2 with the added benefits of Amazon VPC. You can also set up an IPsec VPN connection to your VPC, extending your corporate data center into the AWS Cloud. Today we are adding two options to give you additional VPN connection flexibility:
  1. You can now create Hardware VPN connections to your VPC using static routing. This means that you can establish connectivity using VPN devices that do not support BGP such as Cisco ASA and Microsoft Windows Server 2008 R2. You can also use Linux to establish a Hardware VPN connection to your VPC. In fact, any IPSec VPN implementation should work.
  2. You can now configure automatic propagation of routes from your VPN and Direct Connect links (gateways) to your VPC's routing tables. This will make your life easier as you won’t need to create static route entries in your VPC route table for your VPN connections. For instance, if you’re using dynamically routed (BGP) VPN connections, your BGP route advertisements from your home network can be automatically propagated into your VPC routing table.
If your VPN hardware is capable of supporting BGP, this is still the preferred way to go as BGP performs a robust liveness check on the IPSec tunnel. Each VPN connection uses two tunnels for redundancy; BGP simplifies the failover procedure that is invoked when one VPN tunnel goes down.
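The route-propagation behavior described above can be sketched as a toy model (the route targets are hypothetical IDs, and the static-over-propagated precedence rule is an assumption about how VPC resolves overlapping entries, not an AWS API):

```python
def effective_routes(static_routes, propagated_routes):
    """Toy model of a VPC route table with route propagation enabled.

    Propagated routes (learned via BGP from the VPN or Direct Connect
    gateway) are merged in automatically; a static entry for the same CIDR
    is assumed to take priority on overlap.
    """
    table = dict(propagated_routes)  # start with what the gateway advertised
    table.update(static_routes)      # explicit static entries win on overlap
    return table

static = {"0.0.0.0/0": "igw-1a2b3c4d"}        # hypothetical Internet Gateway
propagated = {"10.1.0.0/16": "vgw-5e6f7a8b",  # corporate network over the VPN
              "0.0.0.0/0": "vgw-5e6f7a8b"}    # gateway also advertises a default
print(effective_routes(static, propagated))
```

The point of propagation is exactly this merge step happening for you: the corporate CIDR shows up in the table without a hand-written static entry, while your deliberate default route to the Internet Gateway is left alone.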

Sunday 16 September 2012

Cancel an AWS Account

AWS allows users to cancel their AWS account.

If you wish to cancel your AWS account follow the below steps:

  • Log in to your AWS account as a returning user by selecting the option "I am a returning user and my password is:", then click the "Sign in using our secure server" button.



Create an AWS Account - Free Usage Tier

Amazon Web Services helps new customers get started in the cloud with a free usage tier. This tier is available to customers for 12 months.

Below are the highlights of AWS’s free usage tiers. All are available for one year (except SWF, DynamoDB, SimpleDB, SQS, and SNS, which are free indefinitely):



NOTE:
The image below is updated as of October 1st, 2012, following the AWS RDS announcement.
 

 

Do check AWS Free Tier for more details.

How to get started:



Saturday 15 September 2012

Infographic: Why are more businesses moving to the cloud?

The cloud is one of the quickest growth areas in the IT sector; more and more businesses are using the cloud for their day-to-day processes, and according to analysts TechMarketView, the UK market for cloud computing reached £1.2bn in 2011, 38 percent higher than the previous year. That’s some serious growth!

But why are businesses moving to the cloud? The reality is the cloud is fast becoming hard to ignore and in the infographic below we take a look at why:




SOURCE
 

Amazon RDS News - Oracle Data Pump

The Amazon RDS team is rolling out new features at a very rapid clip.
The most-awaited feature, Oracle Data Pump, is finally here.

Extract from blog post by Jeff:
Customers have asked us to make it easier to import their existing databases into Amazon RDS. We are making it easy for you to move data on and off of the DB Instances by using Oracle Data Pump. A number of scenarios are supported including:
  • Transfer between an on-premises Oracle database and an RDS DB Instance.
  • Transfer between an Oracle database running on an EC2 instance and an RDS DB Instance.
  • Transfer between two RDS DB Instances.
These transfers can be run in either direction. We currently support the network mode of Data Pump where the job source is an Oracle database. Transfers using Data Pump should be considerably faster than those using the original Import and Export utilities. Oracle Data Pump is available on all new DB Instances running Oracle Database 11.2.0.2.v5. To use Data Pump with your existing v3 and v4 instances, please upgrade to v5 by following the directions in the RDS User Guide. To learn more about importing and exporting data from your Oracle databases, check out our new import/export guide.

For those who are not aware what Oracle Data Pump is -

Oracle Data Pump is a feature of Oracle Database 11g Release 2 that enables very fast bulk data and metadata movement between Oracle databases. Oracle Data Pump provides new high-speed, parallel Export and Import utilities (expdp and impdp) as well as a Web-based Oracle Enterprise Manager interface.

  • Data Pump Export and Import utilities are typically much faster than the original Export and Import utilities. A single thread of Data Pump Export is about twice as fast as original Export, while Data Pump Import is 15 to 45 times faster than original Import.
  • Data Pump jobs can be restarted without loss of data, whether the stoppage was voluntary or involuntary.
  • Data Pump jobs support fine-grained object selection. Virtually any type of object can be included or excluded in a Data Pump job.
  • Data Pump supports the ability to load one instance directly from another (network import) and unload a remote instance (network export).

Friday 14 September 2012

Amazon EC2 Reserved Instance Marketplace

Superbly detailed blog Post by Jeff on Amazon EC2 Reserved Instance Marketplace

No more words need to be added....


EC2 Options
I often tell people that cloud computing is equal parts technology and business model. Amazon EC2 is a good example of this; you have three options to choose from:
  • You can use On-Demand Instances, where you pay for compute capacity by the hour, with no upfront fees or long-term commitments. On-Demand instances are recommended for situations where you don't know how much (if any) compute capacity you will need at a given time.
  • If you know that you will need a certain amount of capacity, you can buy an EC2 Reserved Instance. You make a low, one-time upfront payment, reserve it for a one or three year term, and pay a significantly lower hourly rate. You can choose between Light Utilization, Medium Utilization, and Heavy Utilization Reserved Instances to further align your costs with your usage.
  • You can also bid for unused EC2 capacity on the Spot Market with a maximum hourly price you are willing to pay for a particular instance type in the Region and Availability Zone of your choice. When the current Spot Price for the desired instance type is at or below the price you set, your application will run.
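The bid-versus-price rule behind the third option can be sketched in a few lines (the price samples and helper function are illustrative, not an AWS API):

```python
def spot_instance_running(price_history, bid):
    """For each Spot Price sample, report whether an instance requested
    with the given maximum bid would be running: it runs whenever the
    current Spot Price is at or below the bid, and is interrupted when
    the price rises above it."""
    return [price <= bid for price in price_history]

hourly_prices = [0.030, 0.045, 0.025, 0.060]  # hypothetical $/hour samples
print(spot_instance_running(hourly_prices, bid=0.040))  # [True, False, True, False]
```

This is why Spot suits interruption-tolerant workloads: capacity comes and goes with the market price, and your bid is simply the ceiling you are willing to pay.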
Reserved Instance Marketplace
Today we are increasing the flexibility of the EC2 Reserved Instance model even more with the introduction of the Reserved Instance Marketplace. If you have excess capacity, you can list it on the marketplace and sell it to someone who needs additional capacity. If you need additional capacity, you can compare the upfront prices and durations of Reserved Instances on the marketplace to the upfront prices of one and three year Reserved Instances available directly from AWS. The Reserved Instances in the Marketplace are functionally identical to other Reserved Instances and carry the then-current hourly rates; they simply have less than a full term remaining and a different upfront price.


AWS Expands in Japan

Amazon Web Services (AWS) is expanding in Japan with the addition of a third Availability Zone.
The move means that AWS will most likely be adding more data centers to keep up with the steady demand for its services since it first launched in Tokyo 18 months ago.

For people who are not aware of Availability zones and Regions of AWS -

Amazon Web Services serves hundreds of thousands of customers in more than 190 countries.
Currently, AWS has spanned across 8 regions around the Globe.
Each region has multiple availability zones.
Each availability zone can encompass multiple data centers.

See a detailed list of offerings at all AWS locations

Extracted below a nice blog post by Jeff:

We announced an AWS Region in Tokyo about 18 months ago. In the time since the launch, our customers have launched all sorts of interesting applications and businesses there. Here are a few examples:
    • Cookpad.com is the top recipe site in Japan. They are hosted entirely on AWS, and handle more than 15 million users per month.
    • KAO is one of Japan's largest manufacturers of cosmetics and toiletries. They recently migrated their corporate site to the AWS cloud.
    • Fukuoka City launched the Kawaii Ward project to promote tourism to the virtual city. After a member of the popular Japanese idol group AKB48 raised awareness of this site, virtual residents flocked to the site to sign up for an email newsletter. They expected 10,000 registrations in the first week and were pleasantly surprised to receive over 20,000.
Demand for AWS resources in Japan has been strong and steady, and we've been expanding the region accordingly. You might find it interesting to know that an AWS region can be expanded in two different ways. First, we can add additional capacity to an existing Availability Zone, spanning multiple datacenters if necessary. Second, we can create an entirely new Availability Zone.
Over time, as we combine both of these approaches, a single AWS region can grow to encompass many datacenters. For example, the US East (Northern Virginia) region currently occupies more than ten datacenters structured as multiple Availability Zones.
 
AWS Tokyo Region and Availability Zones
 
Today, we are expanding the Tokyo region with the addition of a third Availability Zone. 
This will add capacity and will also provide you with additional flexibility. As is always the case with AWS, untargeted launches of EC2 instances will now make use of this zone with no changes to existing applications or configurations. If you are currently targeting specific Availability Zones, please make sure that your code can handle this new option.
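The advice about handling the new zone can be illustrated with a small sketch: code that enumerates zones at run time spreads launches over a new zone automatically, while a hardcoded list would not (zone names here are hypothetical, and this is a simulation, not a live API call):

```python
def spread_across_zones(zones, instance_count):
    """Distribute launches round-robin over whatever Availability Zones
    are currently available, rather than over a hardcoded list, so a
    newly added zone is picked up with no code change."""
    counts = dict.fromkeys(zones, 0)
    for i in range(instance_count):
        counts[zones[i % len(zones)]] += 1
    return counts

# Before the expansion: two zones share six launches 3/3.
print(spread_across_zones(["ap-northeast-1a", "ap-northeast-1b"], 6))
# After a third zone appears, the same code spreads them 2/2/2.
print(spread_across_zones(["ap-northeast-1a", "ap-northeast-1b",
                           "ap-northeast-1c"], 6))
```

The design point is simply to treat the zone list as data fetched at run time, not as a constant baked into the deployment scripts.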

Thursday 13 September 2012

Infographic: Understanding Hybrid Cloud Computing

The term “hybrid” can hold a different meaning from industry to industry. Generally speaking, it describes the combination of two things to make one. In the cloud computing industry, there are a series of technologies you can use to run your web environments, from cloud to dedicated.

Users can have a hybrid environment at Rackspace by connecting Cloud Servers and dedicated servers via RackConnect™. Here is a fun illustration that highlights this technology in the simplest of terms.


Hybrid Hosting

This illustration is meant to be a general explanation of what “hybrid” means in the cloud computing world. With the Rackspace solution, RackConnect, you do in fact need a firewall to connect dedicated and cloud servers. The data transferred between the two can be done over a private network.
A better explanation can be seen here:
http://www.rackspace.com/hosting_solutions/hybrid_hosting/

SOURCE
 

Infographic: How Cloud Computing is Saving the Earth

Since the first Earth Day in 1970, the rise of information technology has meant progress but it has also raised many challenges. Even with green IT initiatives, global data center energy consumption is on the rise and IT departments are continuing to look for ways to become more efficient.

The nature of cloud computing solves for many IT problems and in the infographic below, we have illustrated how cloud computing is making a difference. In addition, projects like Open Compute, are helping IT industries create more efficient data centers.


Rackspace® — [INFOGRAPHIC] How Cloud Computing is Saving the Earth



SOURCE
 

Wednesday 12 September 2012

Infographic: API Adoption And The Open Cloud - What Is An API?

You probably use application programming interfaces (APIs) multiple times a day and aren’t even aware of it. They make it easier to share photos with friends, access massive data stores and drive new app development.

With the rise of APIs, including our own Open Cloud API, we’ve compiled an overview to help you understand how APIs work, how you’re already using them, and how businesses are finding big successes with APIs.


Rackspace® — API Adoption And The Open Cloud: What Is An API? [Infographic]



SOURCE



Infographic: Hosting to Storage - Why the Cloud is a Big Deal for Small Businesses

Cloud computing helps businesses get off the ground with little upfront investment and operate on a shoestring while accessing technology and applications traditionally reserved for big businesses.

Services like web hosting, hosted storage, and email hosting take the financial strain and workload off of business so they can focus on their core competencies. Take a look at what a big deal the cloud is becoming for small businesses.


Rackspace® — [INFOGRAPHIC] Hosting to Storage: Why the Cloud is a Big Deal for Small Businesses




SOURCE

Infographic: Healthcare in the Cloud


Cloud computing is changing the way healthcare delivers care. Over 70% of health care providers are planning to deploy cloud technologies like email, collaboration, scheduling, and other systems so they can focus on patient care, not caring for complex IT systems.

The growing healthcare IT industry is in the spotlight this week with National Health IT Week sponsored by Healthcare Information and Management Systems Society. In recognition of the partnership between the cloud and healthcare, we took a look at how cloud computing is impacting the healthcare industry.


Rackspace® — [Infographic] Healthcare in the Cloud


SOURCE
 

AWS Week in Review - September 3rd to September 9th, 2012

Let's take a quick look at what happened in AWS-land last week:

Tuesday, September 4
Wednesday, September 5
Friday, September 7

SOURCE
 

Monday 10 September 2012

Getting Started with Amazon Glacier



Amazon Glacier is an extremely low-cost storage service that provides secure and durable storage for data archiving and backup. In order to keep costs low, Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable. With Amazon Glacier, customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month, a significant savings compared to on-premises solutions.

Retrieving archives from Amazon Glacier requires initiating a job. Jobs typically complete in 3 to 5 hours. You organize your archives in vaults.
The quick start video for Amazon Glacier walks you through using the AWS Management Console to create vaults in Amazon Glacier.

To upload data to Amazon Glacier, users must use the SDKs/APIs provided by AWS.
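The vault/archive/retrieval-job flow can be sketched with a toy in-memory model; the class and method names below are illustrative, not the real Glacier SDK API:

```python
import uuid

class GlacierVaultModel:
    """Toy model of Glacier's asynchronous retrieval flow: uploads return
    an archive ID immediately, while downloads go through a retrieval job
    that only yields data once it has completed (3 to 5 hours in the real
    service)."""

    def __init__(self):
        self.archives = {}
        self.jobs = {}

    def upload_archive(self, data):
        archive_id = str(uuid.uuid4())
        self.archives[archive_id] = data
        return archive_id

    def initiate_retrieval_job(self, archive_id):
        job_id = str(uuid.uuid4())
        self.jobs[job_id] = {"archive_id": archive_id, "completed": False}
        return job_id

    def complete_job(self, job_id):
        # Stands in for the multi-hour wait in the real service.
        self.jobs[job_id]["completed"] = True

    def get_job_output(self, job_id):
        job = self.jobs[job_id]
        if not job["completed"]:
            raise RuntimeError("Job still in progress; retry later")
        return self.archives[job["archive_id"]]

vault = GlacierVaultModel()
archive_id = vault.upload_archive(b"backup-2012")
job_id = vault.initiate_retrieval_job(archive_id)
vault.complete_job(job_id)            # in reality: poll or wait for a notification
print(vault.get_job_output(job_id))
```

The asynchronous job step is the trade-off that keeps storage at around $0.01 per gigabyte per month: data is cheap to hold precisely because it is slow to fetch.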

If your data is too large to be uploaded over the internet, you can make use of another AWS service, AWS Import/Export. It accelerates moving large amounts of data into and out of AWS using portable storage devices (see supported devices) for transport. You ship your device, along with its interface connectors and power supply, to AWS. When your package arrives, it will be processed and securely transferred to an AWS data center, where your device will be attached to an AWS Import/Export station. After the data load completes, the device will be returned to you.

For more information, please visit the Amazon Glacier Product Page and the Amazon Glacier Developer Guide.

Building Highly Available, Scalable Web Properties with AWS

From the AWS Webinar Series: Building Highly Available, Scalable Web Properties with AWS 

A very nicely compiled webinar for understanding various AWS services and design principles.

This webinar recording focuses on basic properties for building highly available, scalable web applications on the AWS cloud.

These properties are:

  • Elasticity
  • Design for Failure
  • Loose Coupling
  • Security
  • Performance
 


Saturday 8 September 2012

The AWS Report - Colin Lazier, Amazon Glacier

Want to know what Amazon's new service Glacier is and how people are using it?

Check out the new AWS Report with Colin Lazier.

For today's episode of The AWS Report, I spoke to Colin Lazier, a Senior Development Manager on the AWS Storage Team. Colin and I talked about Amazon Glacier and how it can be used to archive data for long periods of time. I learned that Glacier uses anti-entropy techniques to guard against data loss. 
We also talked about Glacier's retrieval model, and our expectation that third parties will build archiving and indexing tools around Glacier's storage and retrieval functions.




Friday 7 September 2012

Infographic: How much do Apple, Google and Amazon spend on advertising?


Last year, U.S. tech company advertisers continued to increase ad and promo spending, so much so that Google, Amazon and Apple are listed among the six companies with the highest ad-spending growth rates. Please have a look at the complete list with a more detailed breakdown below.


Tech Company Ad Spends
SOURCE : OnlineBusinessDegree.org

Thursday 6 September 2012

Cloud Infographic: The World And Big Data





SOURCE
 

Installing Active Directory Domain Services on Windows Server 2008

Microsoft Active Directory provides the structure to centralize network management and store information about network resources across the entire domain. Active Directory uses Domain Controllers to keep this centralized storage available to network users. In order to configure a Windows Server 2008 machine to act as a Domain Controller, several considerations and prerequisites should be taken into account, and several steps should be performed. In this article I will guide you through the prerequisites and steps for creating a new Windows Server 2008 Domain Controller for a new Active Directory domain in a new forest.

Considerations when Installing a new Windows Server 2008 forest