
Ethereum’s Big Switch Explained: Why the Blockchain is Ditching Proof of Work and Adopting Proof of Stake Instead

Recently, we posted a brief overview of the Ethereum ICO craze. Before that, we had reviewed how blockchains work in general.

Today, we’ll delve deeper into the process of mining.

In this article, we'll briefly introduce the concepts of Proof of Work (PoW) and Proof of Stake (PoS). We'll talk about why Ethereum, the second-largest blockchain, is planning to swap the former for the latter and share some forecasts as to the outcomes this decision might lead to.

Let’s get to it.

Byzantine Generals’ Problem

Among his many breakthrough accomplishments, Satoshi, the mysterious founder of Bitcoin, is praised for coming up with a practical solution to what's known as the Byzantine Generals' Problem.

The issue is basically this.

Suppose there’s a war. There’s an army that has a city encircled, but due to exhaustion of resources, the army’s generals are undecided whether it’s smarter to attack or to retreat. Suppose, also, that it’s the 15th century and the commanders, who are all in camps far apart, have no way of communicating effectively save for sending messengers.

How could these generals (let’s say there are 20 of them) reach a consensus?

Obviously, they'll have to vote. If a majority (more than 50%) decides to move forward with a strategy (attacking, or retreating), the whole army will have to get behind that choice. That's only fair and logical.

But then they’ll face another problem: how to ensure that no general involved in making a decision votes the wrong way on purpose, just to confuse things?
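
The voting rule above can be sketched as a toy simulation. This is only an illustration of majority voting, not an actual Byzantine-fault-tolerant protocol; the vote strings and general counts are made up:

```python
import random

def reach_consensus(honest_votes, traitor_count, seed=0):
    """Tally votes from honest generals plus traitors who vote arbitrarily.

    Returns the winning strategy, or None on a deadlock (tie)."""
    rng = random.Random(seed)
    votes = list(honest_votes)
    # Traitors vote arbitrarily, trying to muddy the outcome.
    votes += [rng.choice(["attack", "retreat"]) for _ in range(traitor_count)]
    attack = votes.count("attack")
    retreat = votes.count("retreat")
    if attack == retreat:
        return None  # no majority -- the army is stuck
    return "attack" if attack > retreat else "retreat"

# 15 honest generals favour attacking; 5 traitors cannot overturn them.
print(reach_consensus(["attack"] * 15, traitor_count=5))  # → attack
```

As long as honest voters outnumber the traitors, no combination of dishonest votes can flip the outcome, which is exactly the intuition behind the 50%+ threshold.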

Well, in the world of blockchains, the generals are miners. And choosing a war strategy for them is agreeing on a set of rules, a certain view of the history of digital events that are posted on the network.

The way that Bitcoin enables reaching a distributed consensus and punishes, or rather discourages, bad actors for acting dishonestly is by using the Proof of Work (PoW) algorithm.

What is Proof of Work (PoW)?

Initially, PoW was proposed as a means of protecting networks and systems from denial-of-service (DoS) attacks. One of its first implementations was Hashcash – a scheme engineered by cryptographer Adam Back, cited in the Bitcoin whitepaper, that still underpins the mining process on Bitcoin.

Essentially, PoW is just a piece of data that's both hard to produce, computationally, and easy to verify on the receiving end.

The PoW principle was first used to deter spam emails. A sender had to solve a puzzle of some kind, i.e., put in effort, before sending an email, and then attach the solution to the message's header so that the recipient could recognize it.

The main gist of the idea was to make it difficult and time-consuming for a spammer to send bulk, trashy ads. Emails without a proof of work were easily identifiable as spam, and, therefore, recipients never had to open them.

On Bitcoin, producing Proof of Work is the miner's responsibility. Miners compete to solve a mathematical problem (generating the Proof of Work) for the next block of transactions. The winner – the miner who figures out a valid cryptographic nonce first – gets to write that block to the blockchain's history and is then rewarded by the network with a certain amount of crypto coins.
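
Here is a toy, Hashcash-style sketch of that asymmetry. Real Bitcoin hashes a structured block header with double SHA-256 against a 256-bit target, so treat this as an illustration only; the transaction string is invented:

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce until the block hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # hard to find: many hashes on average...

def verify(block_data: str, nonce: int, difficulty: int) -> bool:
    """...but verification is a single hash call."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = mine("alice->bob:10", difficulty=4)  # ~65,000 hashes on average
assert verify("alice->bob:10", nonce, difficulty=4)
```

Finding the nonce takes thousands of hash attempts, while checking someone else's nonce takes exactly one – that lopsidedness is the whole point of PoW.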

The blockchain adjusts the difficulty of these mathematical problems so that it takes the network roughly 10 minutes to find a solution. Hence, we get the familiar 10-minute block time on the Bitcoin blockchain.
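
Bitcoin recalculates difficulty roughly every 2016 blocks. A simplified sketch of that retargeting rule (the real protocol works on a 256-bit compact target rather than a scalar difficulty, so this is an approximation):

```python
def retarget_difficulty(old_difficulty: float, actual_minutes: float,
                        blocks: int = 2016, target_block_minutes: int = 10) -> float:
    """Scale difficulty so blocks keep arriving every ~10 minutes.

    If the last window of blocks came in too fast, difficulty rises;
    too slow, and it falls. The adjustment is clamped (Bitcoin uses a
    factor of 4) so one window can't swing difficulty wildly."""
    expected_minutes = blocks * target_block_minutes
    ratio = expected_minutes / actual_minutes
    ratio = max(0.25, min(4.0, ratio))  # clamp to [1/4, 4]
    return old_difficulty * ratio

# The last 2016 blocks arrived in half the expected time -> difficulty doubles.
print(retarget_difficulty(1_000, actual_minutes=2016 * 5))  # → 2000.0
```

This feedback loop is what keeps block times stable even as total mining power grows or shrinks.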

What are the drawbacks of using Proof of Work?

The fairly high level of security provided by PoW comes at a cost. Some, including Ethereum's founder Vitalik Buterin, consider the algorithm too wasteful and costly.

Here are some of the most typical concerns people have about Proof of Work:

  1. The millions of hashes generated by miners around the world do not really solve anything. The immense work and resources put into securing blockchains are not, in any way, useful to society. And once a computational problem is solved, miners simply move on to the next one, discarding their previous effort. From an environmental standpoint, server farms full of powerful mining equipment consuming vast amounts of electricity to do basically nothing are not beneficial to our world.
  2. The process of solving these computational problems is complex and competitive, and those with the most advanced mining hardware have an edge over those without. This creates an arms race among miners and makes the validator community more exclusive (there aren't many people willing to buy up equipment and manage substantial amounts of computing power). And exclusivity goes against decentralization, the key principle behind blockchain technology.
  3. This leads to the most talked-about PoW-related concern: the possibility of a 51% attack. While a mining pool that holds majority control over the network can still play fair, nothing would stop it from attacking the blockchain, invalidating legitimate transactions and double-spending crypto coins.

To avoid ever facing these potential issues, the creators of Ethereum – the second-largest blockchain in the world – are planning to switch from PoW to Proof of Stake (PoS).

What is Proof of Stake?

Proof of Stake takes the computational labor out of the mining process. Instead of time and electricity – the external resources validators expend to generate PoW – the algorithm lets those who hold coins (an internal resource) write to the blockchain's history. The underlying principle behind PoS is that the more invested a validator is in the network (the bigger the stake they hold), the less likely they are to attack it, and, therefore, the more validating rights they should be given.

The only cryptographic calculations involved in PoS are those establishing whether a validator owns the required amount of cryptocurrency. On Ethereum, according to its developers, a person who holds 5% of all ether will be able to validate 5% of all the transactions happening on the blockchain.

The system will decide whose turn it is to commit a block pseudo-randomly, weighting the selection toward validators with the most coins at stake. And it will allow more people to participate in the validating process: there will no longer be a need to purchase expensive hardware to mine.
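
A minimal sketch of that stake-weighted pseudo-random selection (the validator names and stake figures are invented; Ethereum's actual proposer selection draws its randomness from on-chain sources and is considerably more involved):

```python
import random

def pick_validator(stakes: dict, seed: int) -> str:
    """Pseudo-randomly pick the next block proposer, weighted by stake.

    A validator holding 5% of the staked coins should be chosen ~5% of the time."""
    rng = random.Random(seed)  # in a real chain, the seed comes from on-chain randomness
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=1)[0]

stakes = {"alice": 50, "bob": 30, "carol": 20}
picks = [pick_validator(stakes, seed=s) for s in range(10_000)]
print(picks.count("alice") / len(picks))  # roughly 0.5, matching alice's 50% stake
```

Over many rounds, each validator's share of committed blocks converges to their share of the total stake – no hashing race required.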

What benefits can PoS bring to Ethereum?

Essentially, where PoW falls short, PoS is expected to thrive.

  • The issue of unnecessary energy waste will be forgotten, as no mining, in its traditional form, will take place.
  • No competition in solving computational puzzles will mean no demand for advanced mining hardware. Hence, more people will be encouraged to participate in the validation process.
  • Despite drastically reducing energy costs, PoS will make attacks on the blockchain even more expensive. Anyone who decides to buy up 51% of all ether to try to alter transaction blocks would have to pay millions of dollars for the coins (with limited supply and surging demand, the price of ether would rise drastically) and then risk losing that money by destabilizing the very blockchain they've put their funds into. It's hard to imagine a sane person doing that.

In conclusion

Besides assuming that a validator won't risk their money to hack the blockchain, PoS offers a scenario for defusing malicious activity if it does occur. If a chain takeover happens, the Ethereum community can simply hard-fork the network and destroy the attacking validators' deposits, no matter how many coins, i.e. how much validating power, they might possess.
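
Very loosely, the penalty amounts to zeroing out the attackers' entries in the deposit ledger while leaving honest stakes untouched. The names and balances here are hypothetical, and a real slashing mechanism involves much more protocol machinery:

```python
def slash(deposits: dict, attackers: set) -> dict:
    """Hard-fork penalty: wipe out the attackers' stake, keep honest deposits intact."""
    return {validator: (0 if validator in attackers else amount)
            for validator, amount in deposits.items()}

deposits = {"honest1": 100, "honest2": 80, "attacker": 500}
print(slash(deposits, {"attacker"}))  # → {'honest1': 100, 'honest2': 80, 'attacker': 0}
```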

It would take some healing, a few days probably, before the blockchain gets back on track. But, in the long run, no one except the offenders will suffer substantial losses. Conversely, the honest validators will end up richer, as the crunch in the supply of ether caused by the fork will make the coins' price rise even more.

Want to learn more about Ethereum, Bitcoin and the Blockchain technology in general? Contact our expert for a free consultation.

The post Ethereum’s Big Switch Explained: Why the Blockchain is Ditching Proof of Work and Adopting Proof of Stake Instead appeared first on Perfectial.


More Stories By Rostyslav Demush

Ross Demush is a digital marketing specialist at custom software development company Perfectial, a leading provider of web & mobile development services, specializing in FinTech, Real Estate, Media & Entertainment & eLearning.
