Independent Telecommunications Consultants
Microwave and SCADA
Critical Communications
Two Way Radio, 911 and Dispatch
Converged Communications and Contact Centers

Continuing On – Cloud Computing

Why do they want to sell you cloud?

No surprise, it’s not always for your benefit.

In my previous article I explored a little history and came up with my own definition of cloud computing. For those who have not read it, or can’t remember, my definition was as follows:

Cloud Computing –

“An application that is provided to a computing device where the execution of the core application is done separately from the (application) client and outside the user’s private secure network, accessed instead over a shared public network.

The core application is owned and executed by a third party. The core application is also developed or configured such that it appears to the user, or to a specific group of users, to be a single instance of the application. The application client may be a standards-based interface, a custom software application, or embedded firmware.

The key difference between cloud computing and the client–server model is that in the client–server model, the core application and hardware are maintained (typically) by the user’s organization and located inside the private secure network.”

And checking a few web references, I was pretty close!

So now I will attempt to explain WHY there is such a push to sell everybody Cloud Computing, and to no one’s surprise, it’s all about money. But the money is coming in a different way.

Let me explain –

There has been a lot of talk lately about how PC sales have dropped off significantly. This has ripple effects. No new PCs means no new copies of Windows sold. How many people, when they purchase a new PC, also purchase a copy of the latest version of Office? And while we’re at it, let’s look at Office itself. What actual new features in Office do most people use? Frankly, I can do 99.9% of everything I need to do with my copy of Office 2003. This is the same reason that Windows XP is still in use, and frankly, I have yet to have any customer embrace Windows 8. Combine that with the availability of mature and free open source applications like OpenOffice that are generally equal in features, and it doesn’t take much to look out a few years and see your product becoming a commodity item.

But it’s more than just desktops, isn’t it? Server horsepower, memory and storage technology have continued to follow the Moore’s Law paradigm. However, with efficient hypervisors like VMware and Xen, combined with server operating system kernel improvements by both Microsoft and the open source world, older hardware and applications are remaining in service longer. After all, if a server based application works, and you can easily migrate it from one hardware platform to another, why replace it? Or, if I have to replace it, is there a low cost alternative that I can just drop in as a virtual machine? This too has not gone unnoticed in the industry.

As a real-world example: as an Independent Technology Consultant, I have done both scenarios for my customers. I have taken an older application that required (at the time) a dedicated server and migrated the entire system to operate as a virtual machine. I tell my customers that, generally, if the hard drive partitions are still good enough to boot and run the application, it can be migrated.

I have also recommended and helped customers evaluate and install open source alternatives to applications where the vendor was mandating an upgrade to maintain support. I don’t intend to denigrate the vendor here – I understand why this is done. In most of these situations, the customers themselves have ignored the problem for far too long, and have put themselves AND the vendor between a rock and a hard place. This is why engaging or recommending a business technology consultant as part of a long term business relationship is good for customers and vendors (so much for the shameless plug).

Now returning to your regularly scheduled program….

Hardware, and now software, is becoming a commodity item. Virtualization has allowed a large number of server applications to reside on a single efficient server. Storage has become cheap, which allows for large redundant arrays, to the point where the storage itself is virtualized in a storage network. But these two revelations are not enough to push a large portion of the industry to adopt a cloud based model. If these two items alone were the drivers, it would have happened long ago. No, there are two more pieces to this puzzle.

The final technology piece of this puzzle is bandwidth. I’m not talking about local area network bandwidth. The cost of enterprise grade gigabit switches capable of low latency, non-blocking operation has been well within reach for some time. I’m talking about last mile bandwidth, or in layman’s terms, high speed access.

Twenty years ago, when I told my friends and associates in the telecommunications carrier world that their biggest coming competitor would be the cable companies, they scoffed in derision. When I would try to explain the aggregate bandwidth available in an enclosed RF distribution system, and the concept of Frequency Division Multiplexing I would generally get blank stares. Some of these associates understood the theory, but would comment about the ability of the cable companies to execute and maintain such systems.
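
To put a rough number on that aggregate-bandwidth argument, here is a back-of-the-envelope sketch. All of the figures are assumed round numbers (a 750 MHz plant, 6 MHz FDM channel slots, and roughly 38 Mbit/s of 256-QAM payload per slot); any particular cable plant will differ:

```python
# Back-of-the-envelope aggregate capacity of a cable RF plant.
# All figures below are illustrative assumptions, not measurements.

usable_spectrum_mhz = 750 - 54       # downstream spectrum above the legacy split
channel_width_mhz = 6                # one FDM channel slot (NTSC channel width)
payload_per_channel_mbps = 38        # approximate 256-QAM payload per slot

channels = usable_spectrum_mhz // channel_width_mhz
aggregate_gbps = channels * payload_per_channel_mbps / 1000
print(channels, "channels, about", round(aggregate_gbps, 2), "Gbit/s aggregate")
# 116 channels, about 4.41 Gbit/s aggregate
```

Even with conservative assumptions, the enclosed coax plant carries orders of magnitude more than a telephone local loop of that era, which was exactly the point my carrier friends were missing.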

Fast forward to approximately five years ago. Technology proved me correct, and the broadband networks installed by cable companies are able to compete in available last mile bandwidth. However, 1.5 to 3 megabits per second still really isn’t enough to drive a nearly seamless application experience from a remote host. The last thing a hosting company or bandwidth provider wants is to make promises they can’t keep. Hence, the Application Service Provider and Software as a Service models of that era didn’t work.

Without going into a discussion of changes in head end and node routing and processing, Fiber to the Premises technology, et cetera, suffice it to say the bandwidth environment has changed. 3 to 5 megabits per second is the minimum service available, with standard services in excess of ten times that. Network latency is now low enough that it does not affect the application experience. Broadband providers now have no reason to dissuade a customer from a hosted application. On the contrary, they have a vested interest in an increased sale for a service that requires minimal additional infrastructure at the customer’s premises.
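
A quick, admittedly simplified calculation shows why the jump in last mile speeds matters. The payload size and link speeds below are illustrative assumptions, and latency is ignored entirely:

```python
# Time to move an assumed application payload over a last-mile link.
# Payload size and link speeds are illustrative; latency is ignored.

def transfer_seconds(megabytes, megabits_per_second):
    """Seconds to transfer a payload of the given size at the given line rate."""
    return (megabytes * 8) / megabits_per_second

payload_mb = 25  # a hypothetical hosted-application session's worth of data

print(round(transfer_seconds(payload_mb, 1.5), 1))  # ASP-era DSL: 133.3 seconds
print(round(transfer_seconds(payload_mb, 50), 1))   # modern broadband: 4.0 seconds
```

The same data that took over two minutes on a 1.5 megabit link moves in a few seconds today, which is the difference between a frustrating hosted application and a nearly seamless one.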

With buy-in from bandwidth providers, and hardware and software becoming commodity market items for the large software companies, we reach the last realization of why all the excitement in the industry is centered around cloud based products. It’s pretty obvious that it’s financial. What is not obvious is that there are two parts to the financial reasoning. Every business person knows that to make money, you have to sell something, and you have to continue selling something. It’s called cash flow. In the example I gave earlier in this article, if my old copy of Microsoft Office continues to work, after the initial sale, Microsoft garners no additional revenue. In contrast, if I purchase a three year subscription to Office 365 and I don’t renew at the end of the term, I don’t have an office application suite anymore. So, at a minimum, the user has to make a decision on renewing the subscription. While this is a rather simplistic example, expand the concept to, say, a hosted Customer Relationship Management application billed monthly based on the number of users, and the cash flow incentive is pretty large.
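
The cash flow difference is easy to sketch with round, hypothetical numbers (these are not actual Microsoft prices):

```python
# Revenue from one customer under two models, using hypothetical prices.

def perpetual_revenue(price, years, upgrade_cycle=None):
    """One up-front sale, plus optional repeat sales every upgrade_cycle years."""
    revenue = price
    if upgrade_cycle:
        revenue += price * ((years - 1) // upgrade_cycle)
    return revenue

def subscription_revenue(monthly_fee, years):
    """A customer who keeps renewing pays every month of the term."""
    return monthly_fee * 12 * years

# A $400 copy of an office suite, kept for ten years with no upgrades...
print(perpetual_revenue(400, 10))        # 400
# ...versus ten years of a $12.50/month subscription.
print(subscription_revenue(12.50, 10))   # 1500.0
```

And the vendor’s incentive only grows with per-user monthly billing, where every new seat is recurring revenue rather than a one-time sale.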

The second not so secret financial reason to promote a cloud based product is ownership. Not of the application, as alluded to above, but of the customer and the data. This isn’t some giant conspiracy theory. This is about inertia and change. To migrate from one application to another is effort (and cost) above and beyond the cost of the application itself. With an in-house application, a customer has access to the raw data. This allows an in-house programmer or a competitor to access and massage the raw data as part of a migration. With a cloud based application, this is substantially more difficult. There is no guarantee that your company’s data is stored in a separate instance of a database application rather than as just another set of tables in one large database. So, by increasing the cost and potential complexity of migrating away from a “service based/hosted application,” the odds of losing a revenue source are reduced, and hence, the vendor keeps a tighter hold on the end customer.
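
To see why “just another set of tables in one large database” matters, here is a minimal multi-tenant sketch. The table, column, and tenant names are all hypothetical:

```python
import sqlite3

# Minimal multi-tenant pattern: every customer's records share one table,
# separated only by a tenant_id column. Names here are hypothetical.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE contacts (tenant_id TEXT, name TEXT, phone TEXT)")
db.executemany(
    "INSERT INTO contacts VALUES (?, ?, ?)",
    [
        ("acme",    "Alice", "555-0100"),
        ("acme",    "Bob",   "555-0101"),
        ("initech", "Carol", "555-0199"),
    ],
)

# The application hands each tenant only a filtered view of the shared table...
acme = db.execute(
    "SELECT name, phone FROM contacts WHERE tenant_id = ?", ("acme",)
).fetchall()
print(acme)  # [('Alice', '555-0100'), ('Bob', '555-0101')]
```

An in-house customer could copy the database files and walk away; the hosted customer only ever sees what queries like this return, so a migration depends on whatever export the provider chooses to offer.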

So much for the second part of my overview of Cloud Computing.

In my next post, “To Cloud or Not to Cloud”, I’ll speak to how to determine if a cloud solution is right for your business, some of the pitfalls, and how to avoid them.