Section V: Vertical Scaling Approaches
While Section II explored how modularity enables horizontal scaling through parallel chains, individual chains must also decide how to maximize their own capacity. Vertical scaling makes individual chains more powerful through hardware improvements, optimized fee markets, and clever data management.
Hardware Requirements
The hardware demands for running validators reveal one of the clearest balancing acts between accessibility and performance across blockchain architectures.
Bitcoin sets the lowest barrier to entry. A modest Raspberry Pi with adequate storage can fully validate the chain, enabling broad participation from nearly anyone with basic computing resources. This democratic approach comes at a cost: throughput maxes out around 5 transactions per second, depending on transaction size.
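That throughput ceiling can be sanity-checked with back-of-the-envelope arithmetic. The figures below (roughly 1 MB of transaction data per block, a 250-byte average transaction, a 600-second block interval) are rough assumptions for illustration, not protocol constants:

```python
# Rough throughput estimate for a Bitcoin-like chain.
# Assumed: ~1 MB of transaction data per block, ~250 bytes per
# average transaction, one block every 600 seconds.
BLOCK_BYTES = 1_000_000
AVG_TX_BYTES = 250
BLOCK_INTERVAL_S = 600

txs_per_block = BLOCK_BYTES // AVG_TX_BYTES   # ~4000 transactions per block
tps = txs_per_block / BLOCK_INTERVAL_S        # throughput ceiling in TPS

print(f"{txs_per_block} txs/block, ~{tps:.1f} TPS")
```

With these assumptions the ceiling lands in the mid-single-digit TPS range; real-world averages sit lower because transactions vary in size and blocks are not always full.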
Ethereum strikes a middle ground post-Merge. While validation requires more substantial resources than Bitcoin (32 GB RAM and 4 TB of fast solid-state storage are recommended, along with a 32 ETH stake), these requirements remain achievable for home operators. This balance has fostered a geographically distributed validator set, supporting roughly 20 TPS on Layer 1 (varying with gas usage and 12-second block times).
As mentioned in the introduction, Solana's architecture illustrates this trade-off starkly. The network prioritizes performance through demanding specifications: high-clock CPUs, 256+ GB RAM, fast NVMe drives, and at least 1 Gbps network connections. To manage storage, validators typically prune ledger history by default. In return, the network sustains thousands of transactions per second during normal operations. However, these steep requirements concentrate validation power among well-resourced entities.
This hardware spectrum illustrates the core tradeoff clearly. Higher performance demands deliver greater throughput but shrink the pool of potential validators, affecting both current participation and the barrier to entry for newcomers. Decentralization in practice exists on a spectrum; there is no crisp threshold for being 'decentralized enough.' A pragmatic lens is the cost and coordination it would take to shut the network down, whether through economic, legal, or operational means. Each network has chosen its position on this curve.
Fee Markets and Resource Allocation
Hardware requirements determine a chain's theoretical capacity, but fee markets determine how that scarce capacity gets allocated among competing users. Different chains have developed pricing mechanisms that reflect their underlying resource constraints.
Bitcoin pioneered the classic first-price auction, in which users bid fees and miners collect them directly. Ethereum combines a protocol-set base fee, adjusted automatically according to network congestion, with user-paid priority tips. Solana introduced localized fee markets with fixed base components and optional priority fees, reflecting its high-throughput architecture where different transaction types compete for different computational resources.
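Ethereum's base-fee mechanism can be sketched as follows. This is a simplified version of the EIP-1559 update rule (the real protocol works in wei and includes additional edge cases), but it captures the feedback loop: blocks above the gas target push the base fee up, blocks below it push the fee down.

```python
def next_base_fee(base_fee: int, gas_used: int, gas_target: int,
                  max_change_denominator: int = 8) -> int:
    """Simplified EIP-1559 update: the base fee tracks block fullness.

    A completely full block (gas_used == 2 * gas_target) raises the fee
    by up to 12.5%; an empty block lowers it by roughly the same fraction.
    """
    delta = base_fee * (gas_used - gas_target) // (gas_target * max_change_denominator)
    return max(base_fee + delta, 0)

# A block exactly at the gas target leaves the base fee unchanged.
assert next_base_fee(100, 15_000_000, 15_000_000) == 100
# A full block (twice the target) raises it by 12.5%.
assert next_base_fee(100, 30_000_000, 15_000_000) == 112
```

Because the adjustment compounds block by block, sustained congestion raises fees exponentially until demand backs off, which is what makes the base fee self-regulating rather than purely auction-driven.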
Newer networks are pushing toward more sophisticated, multi-market fee designs that align fees with specific bottlenecks and resource usage patterns. A transaction that consumes significant compute might be priced differently than one requiring substantial storage. This evolution from one-size-fits-all pricing to nuanced, resource-aware fee markets represents blockchain infrastructure maturing to serve diverse use cases.
Bigger Blocks and Faster Intervals
Bigger blocks are the simplest way to scale: they increase how much transaction data fits in each block. Bitcoin Cash took this approach, starting with 8 MB blocks in 2017 and expanding to 32 MB in 2018; today it has no hard limit, though blocks rarely exceed a few MB. BNB Chain scales by adjusting its block gas limit, currently around 100 megagas, and there is a proposal to increase it tenfold to 1 gigagas.
Shorter block times can boost throughput without making each block bigger. Ethereum's 12-second slots process more transactions per minute than Bitcoin's 10-minute blocks, even when the blocks are similar in size. But there is a catch with proof-of-work chains: very short intervals produce more competing blocks (called orphans or uncles), which wastes honest mining work and reduces security. Proof-of-stake systems like post-Merge Ethereum face different problems. When slots are too short, validators cannot receive, execute, and attest to blocks in time, which produces missed attestations and missed slots rather than uncle blocks and complicates the fork choice process.
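The block-time effect is simple rate arithmetic. The transactions-per-block figures below are rough assumptions for illustration, but they show how a chain with much smaller blocks can still clear more transactions per minute if its interval is short enough:

```python
# Throughput per minute as a function of block interval.
# The txs_per_block values are illustrative assumptions, not protocol data.
def tx_per_minute(txs_per_block: int, block_interval_s: float) -> float:
    return txs_per_block * 60 / block_interval_s

bitcoin_like  = tx_per_minute(txs_per_block=3000, block_interval_s=600)
ethereum_like = tx_per_minute(txs_per_block=150,  block_interval_s=12)

print(bitcoin_like, ethereum_like)  # 300.0 vs 750.0 per minute
```

Here the chain with blocks one-twentieth the capacity clears 2.5 times as many transactions per minute, purely because blocks arrive fifty times as often.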
The core limitation is simple: larger or faster blocks need more bandwidth and storage. This makes it harder for regular people to participate in the network. While techniques like pipelining and parallel execution help chains process blocks more efficiently, the fundamental trade-off between performance and accessibility remains.
State Growth and Storage
While transaction throughput gets most attention, state growth poses an equally serious scalability threat. State is the complete snapshot of all current blockchain data: account balances, smart contract variables, and stored information. Unlike transaction history, state must remain immediately accessible for validation.
The core problem: state only grows, never shrinks. Every new account, contract, or storage slot adds to the state permanently. As state expands from gigabytes to terabytes, hardware requirements increase, sync times lengthen, and running a node becomes prohibitively expensive. Without intervention, only data centers can afford to validate the chain, undermining decentralization.
Three main approaches have emerged to manage state growth. State rent charges ongoing fees for storing data on-chain, creating economic pressure to remove unnecessary state, though this risks disrupting applications built on assumptions of free permanent storage. State expiry automatically removes data that hasn't been accessed for a set period (users can later restore expired state with cryptographic proofs), capping state size but adding significant complexity. Advanced data structures can also help by dramatically shrinking the proofs needed to verify state. Ethereum is pursuing an approach called Verkle trees, which allow nodes to prove facts about the blockchain's state using much smaller proofs than current methods require. This enables lightweight nodes to participate without storing the full state, reducing the barrier to running a node.
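The state-expiry approach above can be sketched as a toy eviction policy. All names and the expiry period here are illustrative, and a real design would keep cryptographic commitments so evicted entries remain restorable with proofs:

```python
# Toy state-expiry policy: entries untouched for longer than
# EXPIRY_PERIOD blocks are evicted from active state.
# Names and the period length are illustrative, not a real protocol.
EXPIRY_PERIOD = 100  # blocks

class State:
    def __init__(self):
        self.entries = {}   # key -> (value, last_accessed_block)
        self.expired = {}   # evicted entries (restorable via proofs)

    def write(self, key, value, block):
        self.entries[key] = (value, block)

    def access(self, key, block):
        value, _ = self.entries[key]
        self.entries[key] = (value, block)   # touching resets the clock
        return value

    def expire(self, current_block):
        """Evict every entry idle for more than EXPIRY_PERIOD blocks."""
        stale = [k for k, (_, last) in self.entries.items()
                 if current_block - last > EXPIRY_PERIOD]
        for k in stale:
            self.expired[k] = self.entries.pop(k)  # active state shrinks

s = State()
s.write("alice", 10, block=0)
s.write("bob", 20, block=0)
s.access("alice", block=150)   # alice stays fresh
s.expire(current_block=150)    # bob, idle since block 0, is evicted
print(sorted(s.entries), sorted(s.expired))
```

The point of the sketch is the bound: active state holds only recently touched entries, so its size tracks recent activity rather than all history, at the cost of the restoration machinery for expired data.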
State management creates a stark decentralization constraint: aggressive solutions like expiry risk breaking applications, while inaction allows state bloat to gradually exclude ordinary node operators.
All the scaling techniques we've discussed (whether horizontal through modularity or vertical through hardware and optimization) ultimately fragment liquidity and users across chains. This creates the interoperability problem: how do we reconnect these isolated islands of value? Section VI examines the bridges and cross-chain infrastructure attempting to solve this challenge.