More and more of our lives are lived online. Our music collections,
bookshelves, vacation memories and more are increasingly digitized and
uploaded into the cloud, the vast network of server farms that provide the
bulk of online storage today. Research firm Gartner projects that by 2016, 36
percent of consumer content will be stored in the cloud, up from a mere 7
percent in 2011.
Service providers, watching these trends warily, will have to accommodate
ever-increasing demand for storage as consumer appetite for cloud content
storage continues to grow. To adapt, many service providers are exploring new
options in data center architecture that permit greater flexibility and
control over hardware costs.
One such option is software-defined storage. By taking features typically
found in hardware and moving them to the software layer, a software... (more)
Cloud computing has given birth to a broad range of online services. To
maintain a competitive edge, service providers are taking a closer look at
their Big Data storage infrastructure to improve performance and reduce
costs.
Large enterprises hosting their own cloud servers are seeking ways to scale
and improve performance while maintaining or lowering expenditures. If the
current approach to scaling users and storage infrastructure persists, it
will become increasingly difficult to maintain low-cost cloud services, such
as online account management or data sto... (more)
Cloud computing has opened the doors to a vast array of online services. With
the emergence of new cloud technologies, both public and private companies
are seeing increases in performance gains, elasticity and convenience.
However, maintaining a competitive advantage has become increasingly
difficult. Service providers are taking a closer look at their data storage
infrastructure for ways to improve performance and cut costs.
If the status quo remains, maintaining low-cost cloud services will become
increasingly difficult. Service providers will incur higher costs, while... (more)
Legendary Intel co-founder Gordon Moore's eponymous law holds that the number
of transistors per square inch of chip doubles roughly every two years, a
prediction that has held remarkably firm as ever-smaller devices have grown
ever more powerful. In recent years, the same could be said of data itself.
A 2011 IDC study found that the amount of data created by every device and
person in the world doubles every two years, with a staggering 1.8 zettabytes
(a zettabyte is a billion terabytes) created in 2011 alone.
A significant driver for this trend... (more)
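The doubling trend above is easy to sanity-check with a little arithmetic. The sketch below is illustrative only (the function name and projection are our own, not from the IDC study): it projects annual data creation forward from the 2011 baseline of 1.8 zettabytes, doubling every two years.

```python
# Project global data creation under the "doubles every two years" trend.
# Baseline from the 2011 IDC study: 1.8 zettabytes created that year.
# (1 zettabyte = 1 billion terabytes = 10**21 bytes.)

BASE_YEAR = 2011
BASE_ZETTABYTES = 1.8

def projected_zettabytes(year: int) -> float:
    """Data created per year, assuming a doubling every two years."""
    return BASE_ZETTABYTES * 2 ** ((year - BASE_YEAR) / 2)

if __name__ == "__main__":
    for y in (2011, 2013, 2015):
        print(y, round(projected_zettabytes(y), 1))
    # 2011 -> 1.8, 2013 -> 3.6, 2015 -> 7.2 zettabytes
```

Under this simple compounding assumption, annual data creation quadruples every four years, which is why storage infrastructure planned for today's volumes is quickly outgrown.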