Chief concerns when buying cloud storage include cost and capacity

As the market for disk-based arrays continues to slide, the shift in the data center to newer technologies such as the cloud is accelerating. According to a TechTarget research survey, a little more than half of respondents see the public cloud as a way to address growing data volumes. Other important reasons for buying cloud storage include the ability to more effectively meet compliance requirements, recover data, consolidate backup operations and deploy new enterprise applications.
Not surprisingly, cost is the most important consideration when buying cloud storage features and services. Sixty-eight percent of respondents listed cost as the No. 1 feature, placing it 25 percentage points ahead of No. 2, encryption. Data retention policy placed third at 22%, followed by protocols supported -- e.g., Amazon Simple Storage Service (S3), OpenStack, object and RESTful, NFS, WebDAV -- at 20%. When it comes to evaluating cloud storage vendors, pricing practices (67%) leads by 19 points over the No. 2 evaluation criterion, brand and reputation. And while that is followed closely by ease of implementation and product feature set, clearly -- when it comes to public cloud storage -- how much you pay and how you pay it matter most when selecting a service and provider.
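To make the protocol point concrete, here is a minimal sketch of storing and retrieving an object over the S3 REST API, the kind of interface buyers are weighing in that "protocols supported" criterion. It assumes Python with the boto3 library and configured AWS credentials; the bucket and key names are hypothetical.

```python
# Minimal sketch: writing and reading an object via the Amazon S3 API.
# Assumes boto3 is installed and AWS credentials are configured.
# The bucket name and object key below are hypothetical examples.
import boto3

s3 = boto3.client("s3")

# Upload a backup archive (a PUT over S3's RESTful protocol).
with open("db-snapshot.tar.gz", "rb") as f:
    s3.put_object(
        Bucket="example-archive-bucket",
        Key="backups/2017/db-snapshot.tar.gz",
        Body=f,
    )

# Retrieve it again (a GET).
response = s3.get_object(
    Bucket="example-archive-bucket",
    Key="backups/2017/db-snapshot.tar.gz",
)
data = response["Body"].read()
print(f"Retrieved {len(data)} bytes")
```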
The top six vendors under consideration for moving data to the public cloud were Microsoft (44%), Amazon (40%), Google (29%), Hewlett Packard Enterprise (17%), IBM (16%) and Rackspace (10%). All of those contenders except Hewlett Packard Enterprise appear in Gartner's latest "Magic Quadrant for Public Cloud Storage Services" report.
On average, our surveys show that enterprise cloud storage customers turn to these services for 269 TB of additional storage capacity. A little more than a third intend to acquire 10 TB or less, another third between 10 TB and 99 TB, and most of the remainder land in higher capacity ranges between 100 TB and a petabyte (PB). Among those buying cloud storage for backup, the vast majority (82%) use it to protect 199 TB of data or less.
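Since pricing practices top the evaluation criteria, a quick back-of-envelope calculation shows why capacity and cost are so tightly linked. The per-gigabyte price below is an assumption for illustration, not a figure from the survey; real object storage tiers and request charges vary by provider.

```python
# Back-of-envelope monthly cost for the survey's average 269 TB of
# additional capacity, at an assumed price of $0.023 per GB-month.
capacity_tb = 269
price_per_gb_month = 0.023  # hypothetical list price; actual tiers vary

capacity_gb = capacity_tb * 1024
monthly_cost = capacity_gb * price_per_gb_month
print(f"~${monthly_cost:,.0f} per month")  # roughly $6,335 at this rate
```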

Half of respondents intend to purchase cloud capacity for archiving. Among those with definite archiving plans, 40% expect to archive less than 10 TB, 41% between 10 TB and 99 TB, 7% between 100 TB and 499 TB, 5% between 500 TB and 999 TB, and 7% 1 PB or more. The average capacity acquired for archiving alone is 203 TB.

A little more than half (52%) don't anticipate making their next cloud storage purchase for nine months to a year. Around one quarter (23%) expect to buy fairly quickly, within three months. The rest said they would be buying cloud storage within three to nine months.
