
Stefan Bernbo




The ‘Internet of Things’ Storage By @CompuverdeSDS | @ThingsExpo [#IoT]

As more businesses, entrepreneurs and government entities embrace the IoT, more data will be generated daily

Meeting the Storage Demands of the ‘Internet of Things’

Smart devices that use wireless technology to exchange information with each other and with their human owners: this is the Internet of Things (IoT). This incredible level of connectivity is already transforming how we exercise, treat diseases, park our cars and access business documents. Research firm IDC projects that the IoT world of connected devices will grow to 200 billion objects by 2020.

As more businesses, entrepreneurs and government entities embrace the IoT, daily data generation will climb well past the already mind-boggling 2.5 quintillion bytes produced each day. Yet even current levels of connectivity have pushed storage solutions to the brink. If service providers hope to stay competitive, they will need to find new approaches to storage that can meet their capacity needs without being cost-prohibitive.

Problems with Appliances
Server hardware bundled with proprietary, mandatory software makes up the bulk of today's data center architecture. The software is designed for the hardware and vice versa, and the two come tightly wedded together as a package. The benefits of this configuration include convenience and ease of use.

In this setup, redundant copies of expensive components are included in anticipation of inevitable hardware failures. These copies guard against outages caused by reliance on a single point of failure. The redundant extra components bring with them higher hardware costs, greater energy usage and additional layers of complexity. When companies, in anticipation of growth events like the IoT, begin to consider how to scale out their data centers, costs for this traditional architecture skyrocket.

Traditional appliances also bring with them the problem of vertical construction: all requests come in through a single point of entry and are then re-routed. Imagine a million users connected to that one entry point at the same time. That's a recipe for a bottleneck, one that prevents service providers from scaling to the capacity needed to support the Internet of Things.
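
To make the bottleneck concrete, here is a rough back-of-the-envelope sketch in Python. It is not taken from any real system; the request volume and per-node throughput are assumed figures chosen only to show the arithmetic of funneling everything through one entry point versus spreading the same load over many.

```python
# A rough, hypothetical model (not from the article): the request volume and
# per-node throughput are assumed figures, used only to show the arithmetic.

REQUESTS = 1_000_000        # e.g., a million devices hitting storage at once
NODE_THROUGHPUT = 10_000    # requests per second one entry node can re-route (assumed)

def seconds_to_drain(requests: int, entry_points: int) -> float:
    """Time to serve the backlog when load is spread evenly over the entry points."""
    return requests / (entry_points * NODE_THROUGHPUT)

print(seconds_to_drain(REQUESTS, entry_points=1))   # 100.0 s -- everything queues at one node
print(seconds_to_drain(REQUESTS, entry_points=16))  # 6.25 s -- same load, spread horizontally
```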

A Streamlined, Faster Approach
Software-defined storage is one option now available to meet the IoT's capacity demands. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates the dependency on server "appliances" with software hard-wired into the system. This option provides the scalability and speed that the IoT requires.

It's easy to become jaded about the latest buzzwords, so it may be a relief to realize that many everyday devices have been "software-defined" for years. Take the PC, for example: software can be installed on any hardware platform, allowing the user to custom-tailor both the hardware and the software according to his or her needs. The average PC can use Linux as an operating system if the owner so chooses. This gives the user greater freedom to allocate his or her budget precisely as needed for the task at hand - whether towards a high-powered graphic design setup, for example, or a lightweight Web browser.

Liberating the software from the hardware, software-defined storage provides a cost-reducing alternative to traditional appliances because it can be run on inexpensive commodity servers. When coupled with lightweight, efficient software solutions, the use of commodity servers can result in substantial cost savings for online service providers seeking ways to accommodate their users' growing demand for storage.

Software-defined storage is also scalable, making it suitable for a wide range of data centers across all industries. A major bank with branches in several countries will have different storage needs than a telco serving one particular area, and a cloud services provider will have different needs still. While appliances might be good enough for most of these needs, fully uncoupling the software from the hardware can yield substantial economies of scale.

Because the software and hardware are no longer hard-wired together, administrators can look at their business needs and choose only those components and software that will serve their growth goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger and more tailored data center for the company's needs.

In addition, the potential for bottlenecking common to vertical, single-entry-point models is eliminated with the software-defined approach. It uses a horizontal architecture that streamlines and redistributes data, meaning data is handled faster and more efficiently. This non-hierarchical construction can be scaled out easily and cost-effectively.
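
One common way to get that kind of non-hierarchical layout is to let every node locate data by hashing the object key, so no single gateway is involved. The sketch below is only an illustration of that general idea under assumed node names and a simple placement scheme; it is not Compuverde's implementation.

```python
# A minimal sketch, assuming a symmetric cluster where any node can take a
# request and locate data by hashing the object key. Node names and the
# placement scheme are illustrative, not a specific vendor's design.
import hashlib

NODES = ["node-01", "node-02", "node-03", "node-04"]  # hypothetical commodity servers

def owner(object_key: str) -> str:
    """Map an object key to a node deterministically; no central index, no single gateway."""
    digest = int(hashlib.sha256(object_key.encode()).hexdigest(), 16)
    return NODES[digest % len(NODES)]

# Any node can answer this locally, so devices may connect anywhere in the cluster.
for key in ["sensor-42/2015-03-01.log", "thermostat-7/state", "cam-9/frame-0001"]:
    print(key, "->", owner(key))
```

Real systems use more elaborate placement (for example, consistent hashing, so that adding a node moves only a fraction of the data), but the principle is the same: no request has to pass through one central point.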

With millions of devices needing to access storage, the current storage model that uses a single point of entry cannot scale to meet the demand. To accommodate the ballooning ecosystem of storage-connected devices all over the world, service providers, enterprises and telcos need to be able to spread their storage layers over multiple data centers in different locations worldwide. It's becoming increasingly clear that one data center is not enough to meet the storage needs of the Internet of Things; storage must instead be distributed in multiple data centers globally.
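
As a sketch of what that could look like, the snippet below assigns each object to replicas in more than one site using a deterministic rule. The data-center names, replica count and placement policy are assumptions for illustration only; the article does not prescribe any of them.

```python
# A minimal sketch of geo-distribution, assuming a deterministic placement
# policy and hypothetical data-center names; the article prescribes neither.
import hashlib

DATA_CENTERS = ["eu-stockholm", "us-east", "ap-singapore"]  # assumed sites
REPLICAS = 2  # keep each object in two geographically separate data centers

def placement(object_key: str) -> list:
    """Pick REPLICAS distinct data centers for an object, the same ones every time."""
    digest = int(hashlib.sha256(object_key.encode()).hexdigest(), 16)
    start = digest % len(DATA_CENTERS)
    return [DATA_CENTERS[(start + i) % len(DATA_CENTERS)] for i in range(REPLICAS)]

print(placement("device-123/telemetry/2015-03-01"))
# The same key always maps to the same pair of sites, e.g. two of the three above.
```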

A Future-Focused Storage Model
Few people could have predicted the changes that the Internet of Things would bring to daily life. Current storage solutions - with their hard-wired hardware and software and their redundant copies of components - are too costly to scale. Their vertical architecture also creates data bottlenecks that slow performance. Software-defined storage offers scalability and speed at a price that allows organizations to remain competitive as they distribute their data around the world.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years, he has designed and built numerous enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010 Stefan worked in this field at Storegate, the wide-reaching Internet-based storage solution for consumer and business markets, with the highest possible availability and scalability requirements. Previously, Stefan worked on system and software architecture for several projects at Swedish giant Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.
