Editor's Note: The following is a guest post from Kash Iftikhar, vice president IaaS, Public Cloud Services at Oracle.
American businesses are well into the second decade of the cloud computing era. And yet many CEOs and others in the executive suite still hold common misconceptions about this fundamental change in how companies run their information technology.
Here are five of the biggest misconceptions of the modern cloud era:
1. All applications are equal
The public cloud has long been the go-to destination for developers who build and test brand-new applications for web deployment.
It's a no-brainer: Why write a big check to purchase servers and storage when you can rent what you need and stop paying for it when you're done? These so-called "born-to-cloud" apps are the proverbial low-hanging fruit for cloud providers.
But heavy-duty legacy applications, including the databases and transaction systems that power corporations, are a whole other thing. They are complex in part because they incorporate custom code and integrations with other software developed over time.
It's hard to rewrite those applications from the ground up and very difficult to forklift them to the cloud.
Once companies have started moving existing software to a cloud data center, they can look into rebuilding those apps as needed, or continue to run them as-is for the foreseeable future, perhaps also surrounding them with newer cloud-based applications and data sources.
Estimates of how many business-critical enterprise apps like these now run in the public cloud are all over the map. Some sources say 30%, but that likely includes the sorts of CRM and marketing applications that built Software as a Service into a major category over the last 20 years.
When it comes to hardcore database, finance and other enterprise apps, that number is probably closer to 10%. That leaves a lot of headroom for public cloud infrastructure built expressly for an enterprise customer base.
The key, however, is making sure businesses pick the cloud that can most easily accommodate a forklift of existing software.
2. Service Level Agreements will cover everything that ails you
Prospective cloud customers who believe that service level agreements (SLAs) will take care of any and every potential snafu should review them carefully before signing on.
Generally speaking, in the event of an outage, a cloud SLA will reimburse affected customers with credits. So, if your compute instances fail for X hours, you'll get X hours of credit for those same resources.
But a performance hit — which slows your work to a crawl — is not technically an outage, and may not trigger any sort of make-good agreement.
And if your cloud instances stop responding to the management console, if you're unable to scale them up and down as needed, or if other management tasks fail, there likewise will be no credit for you.
Customers should read the fine print carefully to ensure the SLA covers the elements that are mission-critical to their business, such as performance, and that they will be adequately reimbursed when there is a hiccup.
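To put numbers on that, here's a minimal sketch of how a typical credit-only SLA pays out. The uptime tiers, credit percentages, and dollar figures below are hypothetical, not any particular provider's terms:

```python
# Hypothetical illustration of a credit-only cloud SLA. Tier boundaries
# and credit percentages are invented for the example; real SLAs vary by
# provider and by service.

HOURS_PER_MONTH = 730  # approximate hours in a month

# (minimum uptime %, service credit as % of the monthly bill)
CREDIT_TIERS = [
    (99.9, 0),    # met the SLA: no credit
    (99.0, 10),   # below 99.9% but at least 99.0%: 10% credit
    (95.0, 25),   # below 99.0% but at least 95.0%: 25% credit
    (0.0, 100),   # below 95.0%: full credit
]

def sla_credit(outage_hours: float, monthly_bill: float) -> float:
    """Return the service credit owed for a month with the given downtime."""
    uptime_pct = 100 * (HOURS_PER_MONTH - outage_hours) / HOURS_PER_MONTH
    for floor, credit_pct in CREDIT_TIERS:
        if uptime_pct >= floor:
            return monthly_bill * credit_pct / 100
    return monthly_bill

# Four hours of hard downtime on a $10,000/month bill earns a credit...
print(sla_credit(outage_hours=4, monthly_bill=10_000))  # ~99.45% uptime -> $1,000

# ...but a month of crawling performance counts as zero outage hours,
# so the same SLA pays nothing at all.
print(sla_credit(outage_hours=0, monthly_bill=10_000))  # 100% "uptime" -> $0
```

The second case is the trap: a badly degraded month still registers as 100% uptime, so the formula returns nothing.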
3. It's all about the infrastructure
Ask any retailer or financial services firm where they derive real value from their IT and they will likely say the biggest payoff comes from software that performs the functions they need to run their business successfully. But like the human body, there are things far beneath the surface that make these applications tick.
For example, independent software vendors (ISVs) build and tailor the kinds of business-facing applications noted above on a database and middleware infrastructure.
Being able to run these bespoke applications with large databases is critical for these customers. However, some cloud services limit the size of the customer database to 16 terabytes of data.
There are many, many corporate databases larger than that, and many Fortune 1000 companies would find themselves bumping up against that limit almost immediately.
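As a back-of-the-envelope check, here's a minimal sketch of auditing a database estate against such a per-database cap. The inventory and sizes are invented for illustration:

```python
# Hypothetical illustration: flagging which databases in an estate would
# exceed a per-database size cap. The 16 TB figure comes from the article;
# the inventory below is made up for the example.

PER_DB_CAP_TB = 16  # per-database limit imposed by some cloud services

inventory_tb = {
    "orders_oltp": 4.2,
    "erp_financials": 22.0,
    "clickstream_warehouse": 140.0,
}

for name, size_tb in inventory_tb.items():
    if size_tb > PER_DB_CAP_TB:
        # Over the cap: this database can't move as-is. It would need to
        # be sharded, archived down, or hosted on a service without the cap.
        print(f"{name}: {size_tb} TB exceeds the {PER_DB_CAP_TB} TB cap")
    else:
        print(f"{name}: {size_tb} TB fits under the cap")
```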
So while the building blocks of compute, storage and networking are key, the real special sauce for customers is the smart software that runs atop that foundation and the ability of that software to fully utilize the cloud resources available to it.
4. Price is everything
When cloud computing launched as a category more than a decade ago, much of the positioning was about price: Get your cheap compute by the hour (later by the minute) and get tons of storage for very little money. But even then, some of these claims didn't stand up to scrutiny.
For example, it's dirt cheap (often free) to move corporate data into a public cloud. But once a company has lots of data up there, moving it out again costs substantial money and takes a great deal of time.
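A rough sketch makes the asymmetry concrete. The per-gigabyte egress rate and link speed below are hypothetical placeholders, not any provider's actual pricing:

```python
# Hypothetical illustration of cloud data-transfer asymmetry: ingress is
# typically free, while egress is metered. Rate and bandwidth are invented
# for the example; check your provider's current pricing.

def egress_estimate(data_tb: float, price_per_gb: float, gbps: float):
    """Rough cost and wall-clock time to pull data_tb terabytes back out."""
    data_gb = data_tb * 1000
    cost = data_gb * price_per_gb
    seconds = (data_gb * 8) / gbps  # gigabytes -> gigabits, over link speed
    return cost, seconds / 3600    # (dollars, hours)

# 500 TB goes in at no data-transfer charge. Getting the same 500 TB back
# out at a hypothetical $0.08/GB over a sustained 10 Gbps link:
cost, hours = egress_estimate(data_tb=500, price_per_gb=0.08, gbps=10)
print(f"~${cost:,.0f} in egress fees, ~{hours:,.0f} hours on the wire")
# -> ~$40,000 and roughly 111 hours (about 4.6 days) at full line rate
```

And the link rarely runs at full rate in practice, so the time estimate is a floor, which is why switching providers after years of data accumulation is rarely as simple as the entry price suggested.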
The real value of cloud is that it lets customers tap into the latest and greatest hardware running in super secure data centers. That makes it easier for companies to focus on fine-tuning their own software and running their businesses without the worry of maintaining server rooms.
5. The first to market will win it all
As mentioned above, reasonable people can argue about what percentage of the current information technology universe has moved to public cloud. But even the most aggressive estimates hold that less than half of business workloads have done so.
That means there is plenty of opportunity for a business-savvy provider to carve out a substantial piece of the market.
Customers should look for a partner that can furnish a cloud infrastructure offering that runs traditional workloads "as-is" for the foreseeable future while also supporting the creation of brand-new generations of applications. That's a pretty powerful pitch for C-suite executives.