A free and open internet is vital, and it’s under siege!

By Storj Labs

Today, we find ourselves in a rapidly evolving age where the internet plays a major role in our ability to communicate with one another and gather information. The internet now connects more than 3.2 billion people around the globe, more than 40% of the global population. This number only continues to grow, and it’s not hard to imagine that one day, everyone in the world will have consistent, regular internet access.

Already, the internet has provided a massive evolution in the way humanity learns and collaborates. Just like the revolutionary Gutenberg Press gave substantially more people access to books, the internet has presented humanity with a wealth of interconnectedness and information like never before. Having access to all of the world’s knowledge in your pocket used to be a science fiction dream; now it’s reality.

While the raw amount of data being transferred keeps growing, the time we spend consuming and processing it is scaling at nearly the same rate. We are connecting to the internet earlier in life and more often than ever before, further cementing the web as a critical, central component of our lives. Primary schools have incorporated web navigation and research strategies into their curricula, and online access is becoming increasingly necessary for early and continuing education. Some children are even learning coding concepts in elementary school, a clear sign that the internet is empowering individuals of all ages and that data consumption starts much earlier than it did a decade ago. The internet is a powerful tool with the ability to change the world for decades to come, and everyone should have an equal opportunity to connect.

When the internet was first created as a research project connecting universities, no one could have predicted how it would grow, morph, or transform into what it is today. A huge part of this exponential growth and adoption is that the internet gives everyone the ability to join and create. There’s no access fee and there’s no tier structure; if you have internet access, you can directly communicate with anyone else with internet access. This free access regime caused an explosion of creativity, opportunity, and new ideas. The creators of Google, MapQuest, Wikipedia, Craigslist, eBay, etc., didn’t need permission to start their life-changing projects.

Unfortunately, this open, transformative power is under threat and may start to become curtailed.

You may have heard of “net neutrality”: the idea that all sites and services on the internet should be delivered to every user on equal terms. Net neutrality proponents aim to keep internet traffic democratized and ensure that data from a small, one-person startup is treated with the same consideration as that of today’s Google, Microsoft, or any other cloud giant, the same footing Google itself enjoyed when it was starting out.

Net neutrality activists seek to preserve the right to freely use your internet access for whichever services you like. Net neutrality prevents ISPs from giving preferential treatment to certain services, such as big cloud corporations that pay providers a premium for faster delivery of their content. If we lose net neutrality, ISPs could throttle bandwidth to other, smaller websites and services that offer better prices, products, or utility. While this is great for big businesses, it limits innovation, because smaller platforms generally cannot afford to pay these premiums, even when the service they offer is better. Imagine what would have happened if Facebook had been difficult or impossible to access in its early days because MySpace had paid for preferential treatment.

A loss of net neutrality affects content providers as well. Without net neutrality, ISPs can charge extra fees for any services they want. In this scenario, if an end user streams media from a service like Netflix or YouTube, that service could also be forced to pay additional fees to reach its viewers.

Just imagine browsing an online store and, while scrolling through the listings, suddenly having your connection throttled (on either end) because of “unusual usage metrics” or because the service you are using doesn’t pay your ISP a premium for its “fast lane.” This may degrade your experience or even prevent you from buying the items you were searching for (at the price you wanted), simply because you cannot access the site offering them.

The potential loss of net neutrality isn’t the only threat to a free and open internet. Bandwidth caps constitute an additional, often artificially imposed, limitation on the future potential of the internet. Because bandwidth caps are typically applied evenly to all websites, they do not garner the same attention as net neutrality. Even so, bandwidth caps can still impose an unnecessary stifling effect on the growth of internet innovation.

Many ISP plans nowadays advertise their impressive bandwidth speeds, which any consumer would aspire to take full advantage of, especially as the amount of data we consume online continues to increase. Bandwidth caps can have a huge impact on ISP customers, depending on location, costs, and common practices employed in each region.

Without calling out specific ISPs or making a formal comparison, we have found that in more and more locations, gigabit fiber lines are being offered with a mere 1 TB traffic cap for private usage. A quick estimate shows that staying under this invisible ceiling for an entire month allows an average of only about 385 KB/s, or roughly 3 Mbit/s, of sustained bandwidth. ADSL connections over phone lines in 1998 were up to 2–4 times faster! If the full gigabit of throughput were constantly utilized, the cap would be hit in a little over 2 hours. To put it in perspective, users are offered a 1000 Mbit/s plan to access the web but are limited to using only about 0.3% of its maximum capacity.
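The cap arithmetic above is easy to verify with a quick script. This is a back-of-the-envelope sketch that assumes decimal units (1 TB = 10^12 bytes, 1 Gbit/s = 10^9 bits per second) and a 30-day month; ISPs that count in binary units will land on slightly different figures.

```python
# Back-of-the-envelope math for a 1 TB monthly cap on a 1 Gbit/s line.
# Assumes decimal units (1 TB = 10**12 bytes, 1 Gbit = 10**9 bits) and
# a 30-day month; binary (TiB) accounting shifts the results slightly.

CAP_BYTES = 10**12              # 1 TB monthly cap
LINE_BPS = 10**9                # advertised 1 Gbit/s line rate, in bits/s
SECONDS_PER_MONTH = 30 * 24 * 3600

# Sustained rate that stays just under the cap for the whole month
sustained_bytes_per_sec = CAP_BYTES / SECONDS_PER_MONTH
sustained_mbit_per_sec = sustained_bytes_per_sec * 8 / 10**6

# Time to burn through the cap at full line speed
seconds_at_full_speed = CAP_BYTES * 8 / LINE_BPS

# Fraction of the advertised capacity the cap actually lets you use
usable_fraction = sustained_mbit_per_sec / 1000

print(f"Sustained: {sustained_bytes_per_sec / 1000:.0f} KB/s "
      f"({sustained_mbit_per_sec:.1f} Mbit/s)")
print(f"Cap exhausted in {seconds_at_full_speed / 3600:.2f} hours at line rate")
print(f"Usable fraction of advertised speed: {usable_fraction:.2%}")
```

Running this confirms the ratio in the text: the cap restricts a gigabit line to a sustained average of roughly 3 Mbit/s, around 0.3% of what the plan advertises.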

Bandwidth caps disproportionately affect decentralized applications, and in many cases, ISPs have implemented bandwidth caps to reduce the usage of peer-to-peer software. Peer-to-peer applications often use more “edge” traffic, instead of the typical “hub-and-spoke” model used by centralized services today, where client computers only talk with a handful of central servers (“the cloud”).

Though it’s common practice for an ISP to try to limit peer-to-peer software usage, it would actually be in any ISP’s interest to work more closely with peer-to-peer services. ISPs typically pay upfront capital expenditures (one-time costs) to build their networks, and pay operational expenditures (ongoing costs) only for traffic that leaves their network for other networks. ISPs could therefore benefit significantly by encouraging users toward peer-to-peer services that prioritize keeping traffic within the ISP’s own network, letting the ISP avoid the ongoing operational expenditures of inter-network traffic.

The throttling and bandwidth-cap problems that impact the internet now will compound with web 3.0, as interaction with each service becomes more bidirectional. In the early days of the internet (web 1.0), almost all use was “read”: we didn’t interact with one another or with services much, if at all. With web 2.0, we began to “write,” through social media, blogs, and other platforms that let us interact with one another. This created an environment where we began uploading data in addition to downloading it, starting the trend of bidirectional usage. Web 3.0 will take this to the next stage, creating an internet that is programmable.

Throttling and bandwidth caps have the potential to impact one’s ability to interact with this emerging programmable internet that is web 3.0. Without the ability for everyone to consume these new web offerings freely, the goal of web 3.0 cannot be fully realized and those that do not have access will be left behind.

Net neutrality legislation has striven to keep the internet free and open, but removing bandwidth caps is an equally important issue to address. Net neutrality ensures all participants are treated equally; eliminating bandwidth caps ensures that all technologies and network architectures are treated equally.

In the US and other countries, net neutrality legislation has been struck down, essentially putting the power back into the hands of internet service providers. ISPs use this power to limit data transfers for specific applications and to throttle users’ access to data once they hit a certain usage level, which hurts consumers and stifles innovation. Bandwidth caps receive less attention, but their use has also spread significantly over the last decade.

Storage is the foundation of the cloud, and now that web 3.0 development is really gaining momentum, our decentralized cloud storage network will be a critical component of this evolving future. Our platform stores users’ personal, client-side-encrypted data across many nodes spread all over the globe, rather than in a single, siloed data center, which would be more prone to natural disasters or attacks and would cost several million dollars to build and maintain.

To achieve maximum performance and resilience in file storage over the coming decades, measures are required to ensure that node failures, network outages, and similar problems do not cause data loss. In our newly released V3 white paper, we outline how our new network aims to achieve much better bandwidth efficiency by switching from predominantly using replication across many nodes to pure erasure coding (see our recent blog post here). The goal is to use only the bare minimum of bandwidth needed to maintain that durability, and to reserve most of the available bandwidth for fast downloads and (especially) uploads of our users’ files.
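The bandwidth advantage of erasure coding over replication comes down to a simple expansion-factor comparison. The sketch below uses illustrative (k, n) parameters, not Storj’s actual production configuration, to show why an erasure code can match or exceed replication’s fault tolerance at a fraction of the upload overhead.

```python
# Compare the data-expansion factor of full replication vs. a (k, n)
# erasure code. The parameters here are illustrative, not Storj's
# actual production values.

def replication_expansion(copies: int) -> float:
    """Storing `copies` full replicas multiplies the bytes uploaded
    and stored by `copies`; it survives the loss of copies - 1 nodes."""
    return float(copies)

def erasure_expansion(k: int, n: int) -> float:
    """A (k, n) erasure code splits a file into n shares, any k of
    which can rebuild it; it survives the loss of n - k shares at an
    expansion factor of only n / k."""
    return n / k

# 3x replication: survives 2 lost nodes, costs 3x the bandwidth/storage.
# A hypothetical (20, 40) code: survives 20 lost shares, costs only 2x.
print(f"3x replication expansion:   {replication_expansion(3):.1f}x")
print(f"(20, 40) erasure expansion: {erasure_expansion(20, 40):.1f}x")
```

With these illustrative numbers, the erasure code tolerates ten times as many lost nodes as 3x replication while uploading a third less data, which is exactly the kind of bandwidth efficiency the V3 design targets.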

While net neutrality will help support the web 3.0 future, our new network is designed to minimize possible delivery bottlenecks by fetching and streaming data from many different locations across the world. The power of using small fragments called erasure shares (in the small kilobyte range) helps our platform succeed regardless of what conditions are placed upon the internet on which it runs.

Storj Labs remains committed to fighting for a free and open internet, both by supporting net neutrality and by opposing unnecessary bandwidth caps, which together present huge hurdles to humanity’s ongoing quest to innovate. With web 3.0 on the horizon, it is critical that we consider how failing to address these problems could affect people in less developed countries, impede individual innovators with little power and influence, and slow the ever-evolving internet.

By Stefan Benten, Strategy R&D Engineer at Storj Labs

Originally published at storj.io on December 13, 2018.