Block size and scalability, explained

Published at: Aug. 24, 2020

The debate is not settled.

Despite all the great work being done, there is no single clear solution for blockchain scaling, and it’s possible there never will be.

All of the possibilities explored below stand to help bring digital assets onto a global stage, but none of them is the final word yet. As we will discuss, there are pros and cons to each, and it isn’t implausible that there simply won’t be a single winner. Different projects with unique goals may need to scale in different ways, and it is even plausible that more than one of these ideas could be used in tandem to multiply the benefits of each.

In time, the history of how blockchains came to scale will be written, but we aren’t there just yet. By testing and reimagining solutions, developers should be getting closer each day to a system of global data processing that rivals or surpasses current offerings. For now, it is important to keep an open mind and be willing to try new things, as the answer that project leaders are looking for may already be under test in the field right now.

Learn more about ILCoin

Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim to provide you with all the important information we could obtain, readers should do their own research before taking any action related to the company and carry full responsibility for their decisions; nor can this article be considered investment advice.

How have different projects approached the issue?

No single solution has emerged as the best one, and projects are still actively exploring creative versions of all these philosophies in an attempt to make scalable networks.

At the time of writing, Bitcoin hasn’t natively upgraded the nature of its blocks since the implementation of SegWit. That being said, Lightning Network and sidechain research is still going strong, and many expect some form of it to be what makes everyday purchases with Bitcoin the norm. As mentioned, projects such as Bitcoin Cash have embraced the creation of bigger blocks, and Bitcoin SV has taken this further with an upper limit on its blocks of a whopping 2 GB. This has, admittedly, led to an increase in the cost of maintaining a node as well as more frequent issues with orphaned blocks.

Though 2 GB is impressive, there are even more ambitious platforms than this. A project called ILCoin has used a protocol known as RIFT that, as the team claims, allows for the creation of blocks up to 5 GB in size and throughput of up to 100,000 TPS. ILCoin developers claim this is possible because each block is composed of collections of 25 MB “Mini-Blocks,” which do not need to be mined individually, as they are generated automatically by parent blocks. The team says it is using this new type of system to create a Decentralized Cloud Blockchain, or DCB, which, “thanks to the ability to synchronize blocks simultaneously, will act as a global data storage solution that is trustless and completely resistant to manipulation.” ILCoin developers also believe that this will be the first project able to store users’ files on-chain.
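
To put the team’s figures in perspective, here is a rough back-of-the-envelope calculation in Python. Only the 5 GB block, 25 MB mini-block and 100,000 TPS numbers come from the claims above; the average transaction size and the use of decimal units are assumptions made purely for illustration.

```python
# Back-of-the-envelope check of the figures quoted above.
# Only the 5 GB block, 25 MB mini-block and 100,000 TPS numbers come from
# the article; the average transaction size is an illustrative assumption.

BLOCK_SIZE_BYTES = 5 * 10**9        # claimed 5 GB parent block
MINI_BLOCK_BYTES = 25 * 10**6       # claimed 25 MB mini-blocks
CLAIMED_TPS = 100_000               # claimed throughput
AVG_TX_BYTES = 250                  # assumption: a typical small transaction

mini_blocks_per_block = BLOCK_SIZE_BYTES // MINI_BLOCK_BYTES
txs_per_block = BLOCK_SIZE_BYTES // AVG_TX_BYTES
implied_block_interval_s = txs_per_block / CLAIMED_TPS

print(f"Mini-blocks per 5 GB block: {mini_blocks_per_block}")
print(f"Transactions per block (at {AVG_TX_BYTES} B each): {txs_per_block:,}")
print(f"Block interval needed to sustain {CLAIMED_TPS:,} TPS: {implied_block_interval_s:.0f} s")
```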

Not all projects are taking the larger-block approach, of course. Networks such as Zilliqa have joined Ethereum in pursuing sharding as their primary means of creating a scalable platform, while Ethereum itself is also looking to migrate to a new proof-of-stake system labeled Casper. On the other hand, the project Cardano has developed a new approach called Hydra, which sees each user generating 10 “heads,” with each head acting as a new channel for throughput on the network. This will hopefully allow for seamless scalability, as increased use of the network should also generate increased capacity.

What are the arguments for and against increasing block size?

Those who want to see block size increase argue that larger blocks not only improve capacity and speed but also push down fees. Detractors are concerned that larger blocks will lead to greater centralization.

There are many who feel that increasing block size is key to bringing Bitcoin (BTC) and other decentralized assets into mainstream adoption. It is certainly fair to point out that as block size increases, not only can more transactions be confirmed in each block, but the average transaction fee should also drop. This sounds like the best of both worlds, as the network would be both faster and cheaper. The case is made stronger when proponents point out that other scaling solutions, such as the aforementioned sidechains and sharding, are still being tested and aren’t ready to be mass-implemented yet.

These are important points, but of course, increasing the size of blocks does have consequences. Again, many see it as simply buying time rather than solving the real issue, and argue that more sophisticated solutions are necessary. The problem they point to is that node operators need to download each new block as it is propagated, which with current hardware is no major issue if blocks are 1 MB, 4 MB or even 32 MB in size. However, if a blockchain is to be adopted globally, then even this is not enough. Before long, blocks would need to be on the scale of gigabytes, and this could be a roadblock for many. If most average users cannot afford hardware or internet connections capable of handling this, then, presumably, fewer and fewer would run nodes, leading to increased centralization. As Bitcoin Core developer Gregory Maxwell has stated:

“There’s an inherent tradeoff between scale and decentralization when you talk about transactions on the network. […] You’d need a lot of bandwidth, on the order of a gigabit connection. It would work. The problem is that it wouldn’t be very decentralized, because who is going to run a node?”
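
As a rough illustration of the bandwidth concern raised above, the sketch below estimates how large blocks would need to be at various throughput levels, and the sustained download rate that implies for a node. The 10-minute block interval and 250-byte average transaction are simplifying assumptions chosen for illustration, not measured figures.

```python
# Rough estimate of block size and node bandwidth at different throughput
# levels. The 10-minute interval and 250-byte average transaction are
# simplifying assumptions; real figures vary with transaction type.

BLOCK_INTERVAL_S = 600     # assumed Bitcoin-style 10-minute blocks
AVG_TX_BYTES = 250         # assumed average transaction size

for tps in (7, 1_700, 100_000):
    block_bytes = tps * BLOCK_INTERVAL_S * AVG_TX_BYTES
    bandwidth_mbit = tps * AVG_TX_BYTES * 8 / 1e6   # sustained download rate
    print(f"{tps:>7,} TPS -> ~{block_bytes / 1e6:>9,.1f} MB blocks, "
          f"~{bandwidth_mbit:,.2f} Mbit/s sustained")
```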

Ultimately, the ones who decide on these changes to a network are the miners, who can “signal” that they support an upgrade to the network’s protocol. Because many miners are grouped into large pools, which ultimately all signal together, this can potentially be another form of centralization, as these conglomerates have far more say than lone miners ever could. Fortunately, there is more than one way to approach this issue, and not all projects want to see open-ended block sizes. Other developers sidestep the problem in clever ways in the hope of putting the scaling question to rest once and for all.

What are some ways blockchains can scale?

Scaling solutions come in two forms: on-chain and off-chain. Both come with pros and cons, but as of now, there is no agreement as to which is more promising for future growth.

On-chain scaling

On-chain scaling refers to the philosophy of changing something about the blockchain itself to make it faster. For example, one approach to scaling involves shrinking the amount of data used in each transaction so that more transactions fit into a block. This is akin to what Bitcoin achieved with its Segregated Witness update, otherwise known as SegWit. By altering how transaction data is handled, this upgrade allowed for a notable improvement in overall network capacity.
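
A simplified sketch of the accounting change behind SegWit: after the upgrade, a block is limited by weight (4,000,000 weight units) rather than raw size, with witness data such as signatures discounted, so more transactions fit per block. The example transaction sizes below are assumptions for illustration, not real measurements.

```python
# Simplified illustration of SegWit's weight accounting. Post-SegWit, a
# block is capped at 4,000,000 weight units (WU): non-witness bytes count
# 4 WU each, witness bytes count 1 WU each. The example transaction sizes
# are assumptions chosen purely for illustration.

MAX_BLOCK_WEIGHT = 4_000_000

def tx_weight(base_bytes: int, witness_bytes: int) -> int:
    """Weight of one transaction under the SegWit rules."""
    return base_bytes * 4 + witness_bytes * 1

# A hypothetical ~250-byte payment, of which ~100 bytes are signature data.
legacy = tx_weight(base_bytes=250, witness_bytes=0)     # every byte counted in full
segwit = tx_weight(base_bytes=150, witness_bytes=100)   # signature bytes discounted

print(f"Legacy-style tx: {legacy} WU -> {MAX_BLOCK_WEIGHT // legacy:,} txs per block")
print(f"SegWit-style tx: {segwit} WU -> {MAX_BLOCK_WEIGHT // segwit:,} txs per block")
```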

Another way to potentially boost the TPS of a network is to increase the rate of block generation. While this can be helpful up to a point, there are limitations to this method relating to the time it takes to propagate a new block through the network. Basically, you don’t want new blocks being created before the previous block has been communicated to all (or virtually all) of the nodes on the network, as this can cause issues with consensus.
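
The trade-off can be shown with a toy model: shortening the block interval raises TPS linearly, but if a block takes several seconds to propagate, the chance that a competing block appears during that window (producing an orphan) grows as the interval shrinks. The exponential approximation and every number below are illustrative assumptions, not properties of any real network.

```python
# Toy model of the block-interval trade-off described above. TPS grows as
# the interval shrinks, but so does the chance that a competing block is
# found while the latest one is still propagating. The exponential
# approximation and all of the numbers are illustrative assumptions.

import math

TXS_PER_BLOCK = 4_000        # assumed fixed block capacity
PROPAGATION_DELAY_S = 10     # assumed time for a block to reach most nodes

for interval_s in (600, 150, 60, 15):
    tps = TXS_PER_BLOCK / interval_s
    orphan_prob = 1 - math.exp(-PROPAGATION_DELAY_S / interval_s)
    print(f"{interval_s:>4d} s blocks -> {tps:6.1f} TPS, "
          f"~{orphan_prob:.1%} chance of a competing block during propagation")
```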

Creating seamless communication between discrete blockchains is another potential way that these systems could scale. If different chains can all transact with one another, then each individual network doesn’t have to handle as much data, and the throughput of each should improve. Of course, a system would be needed to ensure the data being sent between networks is 100% accurate, and this is what projects such as Polkadot are working on right now. By combining multiple native chains as well as smart contracts, the platform aims to make it possible for the entire decentralized ecosystem to scale together once fully implemented.

Then there’s a technique called sharding, in which transactions are broken up into “shards,” and different nodes only confirm certain shards, effectively performing parallel processing to speed up the system. This can be applied to proof-of-work or proof-of-stake systems and is set to form a major component of Ethereum 2.0. It offers the potential to improve the capacity and speed of the network, and developers are hoping that we will see upward of 100,000 TPS become a reality.
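
A minimal sketch of the core idea, assuming transactions are routed to shards deterministically (here, by hashing the sender’s address): each group of nodes only has to validate its own slice of traffic, so total capacity grows roughly with the number of shards. This is a conceptual illustration, not Ethereum 2.0’s actual design.

```python
# Minimal sketch of sharded transaction assignment. Each transaction is
# routed to a shard based on the sender's address, so a validator assigned
# to shard k only processes that slice of traffic. This illustrates the
# general idea only; it is not any project's real sharding design.

import hashlib
from collections import defaultdict

NUM_SHARDS = 4

def shard_for(sender: str) -> int:
    """Deterministically map a sender address to one of NUM_SHARDS shards."""
    digest = hashlib.sha256(sender.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

transactions = [("alice", "bob", 5), ("carol", "dave", 2),
                ("erin", "frank", 9), ("grace", "heidi", 1)]

shards = defaultdict(list)
for sender, receiver, amount in transactions:
    shards[shard_for(sender)].append((sender, receiver, amount))

for shard_id in range(NUM_SHARDS):
    print(f"Shard {shard_id} validates: {shards[shard_id]}")
```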

On the other hand, it should be noted that it will still take a few years before sharding is fully implemented in Ethereum, and detractors have pointed out that it also adds complexity and hurts security. This is because sharding increases the chance of a “double-spend” occurring as a result of an attack: It takes notably fewer resources to take over an individual shard than it does to perform a traditional 51% attack. This can lead to transactions being confirmed that would otherwise be seen as invalid, such as the same Ether (ETH) being sent to two different addresses.
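
A deliberately naive way to see the worry: if validating power were split evenly across shards and shard membership were static, an attacker would only need a majority of one shard rather than of the whole network. Real designs randomly reshuffle validators between shards precisely to blunt this, so the figures below are purely illustrative.

```python
# Naive illustration of why taking over a single shard is cheaper than a
# classic 51% attack when validating power is split evenly across shards.
# Real sharding designs randomly shuffle validators between shards to make
# this much harder; these figures are purely illustrative.

TOTAL_VALIDATORS = 10_000   # assumed size of the whole validator set

for num_shards in (1, 4, 16, 64):
    per_shard = TOTAL_VALIDATORS // num_shards
    majority_of_one_shard = per_shard // 2 + 1
    share_of_network = majority_of_one_shard / TOTAL_VALIDATORS
    print(f"{num_shards:>3d} shard(s): control {majority_of_one_shard:,} validators "
          f"(~{share_of_network:.1%} of the network) to dominate one shard")
```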

Some projects have attempted to improve network speeds by limiting the number of validating nodes, a very different philosophy from Ethereum’s. One example is EOS, which has limited its validators to just 21. These 21 validators are voted on by token holders in an attempt to maintain a fair, distributed form of governance, with mixed results. This has given the network a reported 4,000 TPS, and developers are confident that they can continue to scale, which has positioned the project as one of Ethereum’s main competitors in this space. However, limited validator sets are often looked down upon as a form of centralization, so not all users are sold on the model.

Of course, one of the most frequently discussed means of scaling a blockchain is to increase the size of individual blocks. This was the approach that Bitcoin Cash famously took when it forked away from Bitcoin in 2017. Not wanting a limit of 1 MB, the Bitcoin Cash community changed the rules so that the project could have 8 MB, and later 32 MB, blocks. While this certainly means there is more room in each block for added transaction data, some point out that it is infeasible to keep growing block sizes indefinitely. Many consider this solution to be merely pushing the problem down the road, and at worst, they see it as again primed to harm the decentralized nature of the blockchain. Given that, in practice, the average block on the Bitcoin Cash network is still under 1 MB, the debate is as yet unsettled, as explored in more detail above.
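
For a sense of the raw numbers involved, the sketch below compares theoretical throughput for 1 MB, 8 MB and 32 MB block limits, assuming Bitcoin-style 10-minute blocks and an average transaction of 250 bytes (both simplifying assumptions for illustration, and well above what the networks typically carry in practice).

```python
# Rough theoretical throughput for different block size limits, assuming
# 10-minute blocks and a 250-byte average transaction. Both assumptions
# are simplifications for illustration; actual blocks are rarely full.

BLOCK_INTERVAL_S = 600
AVG_TX_BYTES = 250

for label, block_mb in (("1 MB limit (Bitcoin)", 1),
                        ("8 MB limit (Bitcoin Cash, 2017)", 8),
                        ("32 MB limit (Bitcoin Cash, later)", 32)):
    txs = block_mb * 1_000_000 // AVG_TX_BYTES
    tps = txs / BLOCK_INTERVAL_S
    print(f"{label:<34} ~{txs:>7,} txs/block, ~{tps:6.1f} TPS")
```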

Off-chain scaling

There are also ways to improve network throughput that don’t directly change anything about the blockchain. These are often called “second-layer solutions,” as they sit “on top of” the blockchain. One of the best known of these projects is the Lightning Network for Bitcoin. Basically, Lightning Network nodes can open up “channels” between each other and transact back and forth directly, and only when the channel is closed does the Lightning Network transmit the final tally to be recorded on-chain. These nodes can also be strung together, making for a much faster, cheaper payment system that only interacts with the main network a fraction of the time.
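
A heavily simplified sketch of the payment-channel idea: two parties update their balances off-chain as often as they like, and only the final allocation is settled on-chain when the channel closes. Real Lightning channels involve funding and commitment transactions, signatures and penalty rules, none of which is modeled here.

```python
# Heavily simplified payment-channel sketch. Two parties shuffle balances
# off-chain, and only the final allocation is written on-chain at close.
# Real Lightning channels use commitment transactions, signatures and
# penalty rules that this illustration deliberately leaves out.

class PaymentChannel:
    def __init__(self, balance_a: int, balance_b: int):
        self.balances = {"A": balance_a, "B": balance_b}  # funded at opening
        self.offchain_updates = 0
        self.open = True

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        assert self.open and self.balances[sender] >= amount
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.offchain_updates += 1          # no on-chain transaction here

    def close(self) -> dict:
        self.open = False
        return dict(self.balances)          # only this final state hits the chain

channel = PaymentChannel(balance_a=100_000, balance_b=50_000)  # amounts in sats
for _ in range(1_000):
    channel.pay("A", "B", 10)               # a thousand off-chain micropayments
print("On-chain settlement:", channel.close())
print("Off-chain updates replaced by one settlement:", channel.offchain_updates)
```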

Ethereum, of course, also has solutions along these lines. For one, there is the Raiden Network, designed to be Ethereum’s version of the Lightning Network, as well as a more general blockchain product called the Celer Network. These projects implement not only off-chain transactions but also state changes, which allow for the processing of smart contracts. Currently, the biggest drawback with these systems is that they are a work in progress, and there are still bugs and other technical issues that can arise if channels aren’t created or closed correctly.

A similar idea is something called “sidechains.” These are basically blockchains that are “branched off” the main chain, with the ability to move the native asset between them. This means sidechains can be created for specific purposes, keeping that transaction activity off of the primary network and freeing up overall bandwidth for things that need to be settled on the main chain. This is implemented for Bitcoin through the Liquid sidechain, and Ethereum’s version is known as Plasma. One downside here is that each sidechain itself needs to be secured by nodes, which can lead to issues with trust and security if a user is unaware of who is running them behind the scenes.
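
A minimal sketch of the lock-and-mint mechanism usually behind a two-way peg: coins are locked on the main chain and an equivalent amount is issued on the sidechain, then burned and released to move back. This is a conceptual illustration only and does not reflect how Liquid or Plasma are actually implemented; real pegs rely on federations, fraud proofs or similar mechanisms to secure the locked funds.

```python
# Minimal lock-and-mint sketch of a two-way peg between a main chain and a
# sidechain. Conceptual only: real pegs use federations, fraud proofs or
# similar mechanisms to secure the locked funds, which is omitted here.

class TwoWayPeg:
    def __init__(self):
        self.locked_on_main = 0      # coins held by the peg on the main chain
        self.minted_on_side = 0      # equivalent coins issued on the sidechain

    def to_sidechain(self, amount: int) -> None:
        self.locked_on_main += amount
        self.minted_on_side += amount     # 1:1 representation on the sidechain

    def to_mainchain(self, amount: int) -> None:
        assert self.minted_on_side >= amount
        self.minted_on_side -= amount     # sidechain coins are burned...
        self.locked_on_main -= amount     # ...and the original coins released

peg = TwoWayPeg()
peg.to_sidechain(5)     # move 5 coins to the sidechain for fast, niche activity
peg.to_mainchain(2)     # settle 2 coins back on the main chain
print(f"Locked on main chain: {peg.locked_on_main}, "
      f"circulating on sidechain: {peg.minted_on_side}")
```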

Why is block size important?

The size of individual blocks on a blockchain can have a potentially large impact on the speed and capacity of the network, but there are always trade-offs.

As you are likely aware, blockchains get their name from the fact that they are literally composed of an ever-growing chain of blocks. Blocks themselves are batches of transaction data, and the amount of data contained in each block, combined with the chain’s block generation speed, determines the number of transactions per second, or TPS, that the network can handle. Obviously, a high TPS rate is more attractive, so developers are always looking for ways to improve this metric.

Actual rates vary based on network conditions, but Bitcoin currently maxes out at around seven TPS, and Ethereum isn’t much better at roughly 15 TPS. For comparison, Visa can process somewhere around 1,700 TPS, so it is imperative that improvements be made if these networks want to compete as global payment solutions. Because the TPS rate of a blockchain is deeply tied to the size of each block, block size becomes a major factor in finding a path to mainstream adoption. However, as discussed above, simply increasing the size indefinitely is only one way to approach the issue, and there are many different philosophies as to how to move forward.
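
As a quick sanity check of the “around seven TPS” figure, the snippet below derives it from Bitcoin’s 1 MB block size and 10-minute block interval; the 250-byte average transaction is an assumption for illustration, since real transaction sizes vary.

```python
# Sanity check of the "around seven TPS" figure for Bitcoin, assuming 1 MB
# blocks every 10 minutes and an average transaction of ~250 bytes (the
# transaction size is an assumption; real sizes vary).

BLOCK_SIZE_BYTES = 1_000_000
BLOCK_INTERVAL_S = 600
AVG_TX_BYTES = 250
VISA_TPS = 1_700                      # figure cited above for comparison

bitcoin_tps = (BLOCK_SIZE_BYTES / AVG_TX_BYTES) / BLOCK_INTERVAL_S
print(f"Estimated Bitcoin throughput: ~{bitcoin_tps:.1f} TPS")
print(f"Shortfall versus Visa: ~{VISA_TPS / bitcoin_tps:.0f}x")
```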
