Discussion about Primecoin and its infrastructure. Primecoin is a very innovative cryptocurrency: it was the first non-Hashcash PoW crypto, it is naturally (not artificially) scarce, it has very fast confirmations (1 min), an elastic self-adjusting reward, and useful mining (the byproducts are prime numbers). Primecoin is sustainable (miners are guaranteed revenue) and decentralized (ASICs/FPGAs hold no particular advantage). A sidechain for decentralized data applications (e.g. Storj) is currently in development.
Living in a post-GPT-3 world: What will the consequences be?
As you have probably noticed, the recently published GPT-3 language model from OpenAI has captured the attention of this and some other communities, and deservedly so: another previously exclusive domain of human intellect, namely language generation, has seemingly fallen. Sure, the text isn't always coherent and the semantics can be lacking, but consider that we have been working on this problem since the 1960s and 70s; take ELIZA as an example. A lot of the work that later ended up in things like compilers was also about natural language generation and understanding. But, using the techniques of that time, language generation turned out to be more or less impossible; ELIZA was just a party trick. Now, AI has seemingly conquered this domain as well. Some people think that, with some tweaks, such models can achieve human-like intelligence: AGI as a sequence-generation problem. I personally am not quite that optimistic (or pessimistic?), but let's consider the direct impact of GPT-like models on our society.
Spam, SEO and fraud: A very high fraction of the texts generated are indistinguishable from low-effort human text; GPT-3 can basically write high-school essays. Spam has been a persistent problem for the internet since its inception; fraud and phishing are some of the most prevalent forms of cybercrime. SEO is slowly making the internet unusable. Automated SEO, spam, fraud and phishing might severely disrupt user-generated content on the internet. One of my bolder predictions is that we will see proof-of-work schemes like HashCash being used; these were the precursors to Bitcoin and were intended to solve the spam problem in the 90s by attaching a small cost to each e-mail. CAPTCHAs are already something like that; we might have to solve a lot more of them in the future. Maybe you will have to pay 0.00005 cents to write internet comments. Or, alternatively, half of all internet content might get written by a bot and we'll be none the wiser.
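The HashCash idea is simple to sketch. Below is a minimal, hypothetical Python version (real Hashcash stamps use SHA-1 and include date and random fields, omitted here): the sender burns CPU time finding a stamp whose hash has a required number of leading zero bits, while the receiver verifies it with a single hash.

```python
import hashlib
from itertools import count

def mint_stamp(resource: str, bits: int = 16) -> str:
    # Sender's cost: try counters until the stamp's SHA-256 digest
    # starts with `bits` zero bits (~2**bits hashes on average).
    for counter in count():
        stamp = f"1:{bits}:{resource}:{counter}"
        digest = hashlib.sha256(stamp.encode()).digest()
        if int.from_bytes(digest, "big") >> (256 - bits) == 0:
            return stamp

def check_stamp(stamp: str) -> bool:
    # Receiver's cost: a single hash.
    bits = int(stamp.split(":")[1])
    digest = hashlib.sha256(stamp.encode()).digest()
    return int.from_bytes(digest, "big") >> (256 - bits) == 0

stamp = mint_stamp("alice@example.com")
print(stamp, check_stamp(stamp))
```

The asymmetry is the point: minting is negligible for one e-mail but ruinous at spam volumes, while checking stays instant.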
Political propaganda: A very common move on Reddit is to accuse one's debate opponent of being a political shill; for the CCP, the Russians, the Democrats, whatever. Internet comment sections have certainly become a battleground for various state actors. Astroturfing refers to artificial grassroots movements; making people believe there is broad support for certain causes could certainly be a potent political tool. GPT-3 could help by auto-generating millions of plausible-seeming comments. It could generate normal, non-political comments as well, to make the account seem genuine. Lowering the cost of astroturfing might further inflame polarization; or it might lead to an abandonment of the internet as a political battleground, since there can be no trust that you are speaking to humans.
Fifty Years of Cypherpunk: History, Personalities, and the Spread of Its Ideas
In this review, we trace how cypherpunk ideas were born, how they influenced cryptocurrencies and modern technologies, who laid the movement's foundations, and why its popularity has grown again in recent years.
From the early days to today: a chronology of key events in cypherpunk history
In the early 1970s, James Ellis of the UK Government Communications Headquarters (GCHQ) put forward the concept of public-key cryptography. In the early 1980s, small groups of hackers, mathematicians and cryptographers began working on realizing this idea. One of them was the American cryptographer David Chaum, Ph.D., who is sometimes called the godfather of cypherpunk. This new culture proclaimed computer technology a means of dismantling state power and centralized management systems.

The key figure among the cypherpunks of the 80s was Intel specialist Timothy C. May. His dream was to create a global system allowing the anonymous exchange of information, and he developed the concept of the BlackNet system. In September 1988, May wrote The Crypto Anarchist Manifesto: people manage their own lives without politicians, using cryptography, digital currencies, and other decentralized tools.

In 1989, David Chaum founded DigiCash, an eCash digital money system with its own CyberBucks currency and blind digital signature technology.

From 1992, Timothy May, John Gilmore (Electronic Frontier Foundation), and Eric Hughes (University of California) began holding private meetings and regular PGP-encrypted mailings through anonymous remailer servers. Finally, in 1993, Eric Hughes published the fundamental document of the movement, A Cypherpunk's Manifesto. The importance of confidentiality, anonymous transactions, cryptographic protection: all these ideas were subsequently implemented in cryptocurrencies.

The term "cypherpunk" was first applied by hacker and programmer Jude Milhon to a group of crypto-anarchists.

In 1995, Julian Assange, the creator of WikiLeaks, published his first post on the cypherpunk mailing list.

In 1996, John Young and Deborah Natsios created Cryptome, which publishes material related to security, privacy, freedom, and cryptography. It is here that documents from the famous Edward Snowden would later be published.

In 1997, cryptographer Dr. Adam Back (you know him as the CEO of Blockstream) created Hashcash, a distributed anti-spam mechanism. In 1998, computer engineer Wei Dai published two concepts for creating a b-money digital payment system:
Each member of the system keeps a copy of the database of user fund balances (this idea later resurfaced in Bitcoin).
A distributed database of which not everyone keeps a copy; deposits, fines, and incentives maintain the honesty of participants. This was later implemented in the Proof-of-Stake consensus algorithm.
In April 2001, Bram Cohen developed the BitTorrent protocol and application. In 2002, Paul Syverson, Roger Dingledine and Nick Mathewson presented the alpha version of the Tor anonymity network. In 2004, cypherpunk Hal Finney created the Reusable Proof of Work (RPoW) algorithm; it was based on Adam Back's Hashcash, but its drawback was centralization. In 2005, cryptographer Nick Szabo, who had developed the concept of smart contracts in the 1990s, announced the creation of Bit Gold, a digital collectible and investment item. In October 2008, the legendary Satoshi Nakamoto published the manifesto "Bitcoin: A Peer-to-Peer Electronic Cash System", which references the works of the cypherpunk classics Adam Back and Wei Dai. In 2011, Ross William Ulbricht, aka Dread Pirate Roberts, created the Silk Road, the first major darknet market for illegal goods and services. In 2012, Julian Assange released the book "Cypherpunks: Freedom and the Future of the Internet". At the beginning of 2018, Pavel Durov, the creator of Telegram, announced the launch of the TON multi-blockchain platform and mentioned his plans for a TON ICO. In 2019, the Tor Project introduced an open anti-censorship team.
Plenty of services, products, and technologies were inspired by cypherpunk: cryptocurrencies, HD (Hierarchical Deterministic) crypto wallets, coin mixers, ECDHM addresses, and privacy coins. The ideas of distribution and anonymity were also implemented in torrents and VPNs, and you can see the embodiment of cybersecurity ideas in electronic signatures and protected messengers (Telegram, Signal, and many others).

Why was there so much talk about cypherpunk this spring? In April 2020, Reddit users suggested that a letter from the famous cypherpunks mailing list dated September 19, 1999, was written by Satoshi Nakamoto himself (or someone close to him). The letter is about the functioning of ecash: the anonymous author (the supposed Satoshi) discusses a "public double-spending database" and Wei Dai's b-money as a possible foundation for ecash. In addition, researchers of the "Who is Satoshi Nakamoto?" mystery periodically make some noise by uncovering the next "secret" about one or another legendary cypherpunk. In May 2020, Adam Back wrote, in response to videos and new hype discussions, that despite some coincidences he is not Satoshi. Other heroes of the scene are not idle either: in April 2020, David Chaum raised $9.7 million during the presale of the privacy coin xx, created to reward venture investors.
As you can see from the mentions of Satoshi Nakamoto and from the stories of DigiCash, Hashcash, RPoW, and Bit Gold, the cypherpunk movement strongly influenced the emergence of cryptocurrencies. As long as governments and corporations restrict freedom and interfere with confidentiality, cypherpunk ideas will periodically rise in popularity, and this confrontation will not end in the coming decades.
A proof-of-work (PoW) system (or protocol, or function) is a consensus mechanism first invented by Cynthia Dwork and Moni Naor, as presented in a 1993 journal article. In 1999, it was formalized in a paper by Markus Jakobsson and Ari Juels, who coined the name "proof of work". It was developed as a way to prevent denial-of-service attacks and other service abuse (such as spam on a network). It is the most widely used consensus algorithm, adopted by many cryptocurrencies such as Bitcoin and Ethereum.

How does it work? A group of users competes to find the solution to a computationally hard puzzle. Any user who finds the solution broadcasts the block to the network for verification; once the other users have verified the solution, the block is confirmed. The blockchain network consists of numerous decentralized nodes. These nodes act as miners, responsible for adding new blocks to the blockchain. A miner repeatedly picks a random number (a nonce), which is combined with the data present in the block and passed through a hash function. To get the newly generated block added to the main chain, the miner must find a nonce for which the resulting hash meets the network's criteria. The network pays a reward to the miner node that finds the solution, and the other nodes verify and validate the outcome.

Every new block holds the hash of the preceding block, forming a chain of blocks that together store the information within the network. Changing a block would require regenerating all of its successors, which is practically impossible; this protects the blockchain from tampering.

What is a hash function?
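The mining loop just described can be sketched in a few lines of Python. This is a simplification: the difficulty here is a count of leading zero hex digits rather than Bitcoin's full 256-bit target, and the block data is a placeholder string.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Search nonces until the block hash meets the difficulty target."""
    nonce = 0
    while True:
        h = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if h.startswith("0" * difficulty):  # simplified target check
            return nonce, h
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    # Verification recomputes a single hash -- cheap for every node.
    h = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return h.startswith("0" * difficulty)

nonce, block_hash = mine("prev_hash|tx_merkle_root|timestamp")
print(nonce, block_hash)
```

Finding the nonce takes thousands of hash attempts on average, while checking it takes one; that asymmetry is what lets every node cheaply verify a miner's work.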
A hash function maps data of any length to fixed-size values. The result of a hash function is known as a hash value, hash code, digest, or simply hash. The method is quite secure: any slight change in the input results in a completely different output, which is then discarded by network participants. A hash function produces output of the same fixed length regardless of the input's length, and it is a one-way function, i.e. it cannot be reversed to recover the original data; one can only check whether given input data produces a given output.

Implementations

Nowadays, proof-of-work is used in a lot of cryptocurrencies. It was first implemented in Bitcoin, after which it became so popular that it was adopted by several other cryptocurrencies. Bitcoin uses the Hashcash puzzle, with the difficulty adjusted to the total power of the network so that, on average, block formation takes approximately 10 minutes. Litecoin, a Bitcoin-based cryptocurrency, has a similar system, and Ethereum also implemented the same protocol.

Types of PoW

Proof-of-work protocols can be categorized into two parts:

· Challenge-response. This protocol creates a direct link between the requester (client) and the provider (server). The requester needs to find the solution to a challenge that the provider has given, and the provider then validates the solution for authentication. Since the provider chooses the challenge on the spot, its difficulty can be adapted to the provider's current load. If the challenge has a known solution or is known to lie within a bounded search space, the work on the requester's side may be bounded.
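The two properties described above, fixed-size output and the avalanche effect, are easy to observe with Python's hashlib:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Fixed-size output: a 1-byte and a 10,000-byte input both give
# 256-bit (64 hex character) digests.
assert len(sha256_hex(b"a")) == len(sha256_hex(b"a" * 10_000)) == 64

# Avalanche effect: changing one character yields an unrelated digest.
h1 = sha256_hex(b"hello world")
h2 = sha256_hex(b"hello worle")
differing_bits = bin(int(h1, 16) ^ int(h2, 16)).count("1")
print(h1)
print(h2)
print(differing_bits)  # roughly half of the 256 bits differ
```

The one-way property is what mining relies on: there is no shortcut from a desired output back to an input, so miners can only guess nonces.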
· Solution–verification. These protocols have no prior link between the sender and the receiver: the client poses a problem to itself, solves it, and sends the solution to the server, which checks both the problem choice and the outcome. Like Hashcash, such schemes are based on unbounded probabilistic iterative procedures.

These two methods are generally based on one of the following three techniques. CPU-bound: the computation speed depends on the processor; the higher the processor power, the faster the computation. Memory-bound: the computation speed is dominated by main-memory accesses (either latency or bandwidth). Network-bound: the client must perform a few computations, but must also wait to receive tokens from remote servers.

List of proof-of-work functions. Here is a list of known proof-of-work functions: o Integer square root modulo a large prime o Weakened Fiat–Shamir signatures o Ong–Schnorr–Shamir signature, broken by Pollard o Partial hash inversion o Hash sequences o Puzzles o Diffie–Hellman–based puzzle o Moderate o Mbound o Hokkaido o Cuckoo Cycle o Merkle tree-based o Guided tour puzzle protocol

A successful attack on a blockchain network requires a great deal of computational power and a great deal of time for the calculations. Proof of work makes attacks inefficient, since the cost incurred would be greater than the potential rewards for attacking the network, and miners are likewise incentivized not to cheat. It is still considered one of the most popular methods of reaching consensus in blockchains; it may not be the most efficient solution due to its high energy usage, but that is precisely what guarantees the security of the network.
Thanks to proof of work, it is practically impossible to alter any aspect of the blockchain, since any such change would require re-mining all subsequent blocks. It is also difficult for a user to take control of the network's computing power, since doing so requires enormous energy, making these hash computations expensive.
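The tamper-resistance argument can be demonstrated with a toy model of the hash chaining alone (no mining), where each block stores the hash of its predecessor:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block's full contents, including the previous block's hash.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def build_chain(payloads):
    chain, prev = [], "0" * 64
    for data in payloads:
        block = {"prev": prev, "data": data}
        prev = block_hash(block)
        chain.append(block)
    return chain

def is_valid(chain) -> bool:
    # Every block must reference the actual hash of its predecessor.
    return all(
        chain[i]["prev"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = build_chain(["tx1", "tx2", "tx3"])
print(is_valid(chain))       # True
chain[0]["data"] = "tx1-forged"
print(is_valid(chain))       # False: every later link now mismatches
```

Editing one block changes its hash, breaking every link after it; with real proof of work, repairing those links means redoing the mining for every subsequent block.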
In the past weeks I have heard a lot of pros and cons about IOTA, many of which I believe are not true (I'll explain why). I would like to start a serious discussion about IOTA and help people get into it. Before that, I'll contribute what I know; most claims I make will come with a source link providing some supporting content.
The pros and cons I heard most often are listed below; I'll discuss the items marked with *. Pros
Many users claim that the network scales infinitely, i.e. that the more transactions are on the network, the faster it gets. This is not entirely true, which is why we are seeing the network getting congested (pending transactions) at the moment (12/2017). The network is composed of full nodes (which store all transactions), each capable of sending transactions directly to the tangle. An arbitrary user can set up a light node (which does not store all transactions, and therefore has a reduced size), but since it cannot decide whether transactions conflict (among other things), it needs to connect to a full node (the Bitfinex node, for example) and request that the full node send a transaction to the tangle. The full node acts as a bridge for light-node users, and the number of simultaneous transactions a full node can push to the tangle is limited by its bandwidth. What is happening at the moment is that there are few full nodes and, more importantly, the majority of users are connected to basically the same full node, which cannot handle all of the transactions requested by the light nodes because of its bandwidth. If you are a light-node user experiencing slow transactions, you need to manually select another node to get better performance. Also, you need to verify that the minimum weight magnitude (the difficulty of the Hashcash-style proof of work) is set to at least 14. The network seems to be fine and it scales, but the steps a user has to take and understand are not user-friendly at all. It's necessary to understand that the technology involved is relatively new and still in early development. Do not buy IOTA if you haven't read about the technology; there is a high chance of losing your tokens for various reasons, and it will be your own fault. You can learn more about how IOTA works here.
There are some upcoming solutions that will bring the user experience to a new level: the UCL Wallet (expected to be released this month; I will soon talk about how it will help the network) and Nelson CarrIOTA (this week), besides the official implementations to come in December.
We all know that currently (2017) IOTA depends on the coordinator because the network is still in its infancy, and because of this it is considered centralized by the majority of users. The coordinator is a set of full nodes scattered across the world and run by the IOTA Foundation. It creates periodic milestones (zero-value transactions that reference valid transactions) which are validated by the entire network, and it sets the general direction for the tangle's growth. Every node verifies that the coordinator is not breaking consensus rules by creating iotas out of thin air or approving double-spends; nodes only tell other nodes about transactions that are valid, so if the coordinator started issuing bad milestones, nodes would reject them. The coordinator has been optional since summer 2017: you can choose not to implement it in your full node, and any talented programmer could replace the Coo logic in IRI with Random Walk Monte Carlo logic and go without its milestones right now. A new kind of distributed coordinator is about to come, followed, finally, by its complete removal. You can read more about the coordinator here and here.
These are blockchain-based cryptocurrencies (e.g. Bitcoin) that rely on miners to guarantee their security. Satoshi Nakamoto states several times in the Bitcoin whitepaper that "the system is secure as long as honest nodes collectively control more CPU power than any cooperating group of attacker nodes". We can see on Blockchain.info that nowadays half of the total hashpower in Bitcoin is controlled by 3 companies (maybe only 1 in the future?). Users must trust that these companies will behave honestly and will never use their >50% hashpower to attack the network. With all that said, it's reasonable to consider the IOTA network more decentralized (even with the coordinator) than any mining-based blockchain cryptocurrency. You can see a comparison between DAG cryptocurrencies here.
Some partnerships of the IOTA Foundation with big companies were well known even before they were officially announced. A few examples of confirmed partnerships are listed below; other confirmed partnerships can be seen in the link "Partnerships with big companies" in the pros section.
So what's up with all the alarm on social media about the IOTA Foundation faking partnerships with big companies like Microsoft and Cisco? On Nov. 28th, the IOTA Foundation announced the Data Marketplace with 30+ participating companies. Basically, it's a place for any entity to sell data (huge applications, therefore many interested companies); at the time of writing (11/12/2017) there is no API for common users, and only companies in touch with the IOTA Foundation can test it. A quote from Omkar Naik (a Microsoft worker) featured in the Data Marketplace blog post gave the impression that Microsoft was in a direct partnership with IOTA. Several news websites started writing headlines like "Microsoft and IOTA launch..." (the same news site later claimed that IOTA lied about the Microsoft partnership), when in fact Microsoft was just one of many participants in the Data Marketplace. Even though it's not a direct partnership, IOTA and Microsoft are in close touch, as seen in the IOTA, Microsoft and Bosch meetup on December 12th, the Microsoft IOTA meetup in Paris on the 14th, and "Microsoft Azure adds 5 new Blockchain partners" (May 2016). If you join the IOTA Slack channel you'll find that many other big companies, like BMW and Tesla, are in close touch with IOTA. This means that right now IOTA devs are working directly with scientists at these companies to help them integrate IOTA into their developments, even though no direct partnership has been published. I'll talk more about the use cases soon.
We are excited to partner with IOTA foundation and proud to be associated with its new data marketplace initiative... - Omkar Naik
IOTA's use cases
Every cryptocurrency can serve as a means of exchanging goods: you pay for something using the coin's token and receive the product. Some of them are more popular, have faster transactions or offer anonymity, while others offer better scalability or user-friendliness. But none of them (except IOTA) can transfer information at no cost (fee-less transactions), in a secure form (MAM), with the assurance that the network will not be harmed as adoption grows (it scales). These characteristics open the gates for several real-world applications; you have probably heard of Big Data and how important data is nowadays.
Data sets grow rapidly - in part because they are increasingly gathered by cheap and numerous information-sensing Internet of things devices such as mobile devices, aerial (remote sensing), software logs, cameras, microphones, radio-frequency identification (RFID) readers and wireless sensor networks.
It’s just the beginning of the data period. Data is going to be so important for human life in the future. So we are now just starting. We are a big data company, but compared to tomorrow, we are nothing. - Jack Ma (Alibaba)
There are enormous quantities of wasted data; often over 99% is lost to the void, data that could potentially contain extremely valuable information if allowed to flow freely in data streams, creating an open and decentralized data lake accessible to any compensating party. Some of the biggest corporations in the world, like Google, Facebook and Amazon, are purely digital. The data/information market will be huge in the future, and that's why so many companies are interested in what IOTA can offer. Several real-world use cases are being developed at the moment, and many of them, if successful, will revolutionize the world. You can check a list of some of them below.
Not having your wallet set up properly (minimum weight magnitude of 14, etc.)
These are problems that could easily be avoided with a better understanding of the network/wallet, or with a better wallet that handled these issues itself. As I explained before, some problems during the "congestion" of the network could simply be resolved if things were more user-friendly; the current situation leads many users to store their iotas on exchanges, which is not safe either. The upcoming (Dec 2017) UCL Wallet will solve most of these problems: it will switch between nodes automatically and auto-reattach transactions, among other things. You can get a full overview of it here and here. Also, the upcoming Nelson CarrIOTA will provide automatic peer discovery to help users set up their nodes more easily.
IOTA Vulnerability issue
On Sept 7th, 2017, a team from MIT reported a cryptographic issue in the hash function Curl. You can see the full response of IOTA members below.
Funds were never in danger, as the scenarios depicted in Neha's blog post were not practically possible, and the arguments used in the post lacked foundation; you can check the whole history for yourself in the responses. It was later pointed out that members of Neha Narula's team were involved in other, competing cryptocurrency projects. Currently, IOTA uses the relatively hardware-intensive NIST standard SHA-3/Keccak for crucial operations, for maximal security. Curl is continuously being audited by more cryptographers and security experts, and recently the IOTA Foundation hired Cybercrypt, a world-leading lightweight cryptography and security company from Denmark, to take the Curl cryptography to its next maturation phase.
It took me a couple of days to gather the information presented here; I wanted to make it easier for people who want to get into IOTA. It probably contains some mistakes, so please correct me if I said something wrong. Here are some useful links for the community.
This is my IOTA donation address, in case someone wants to donate I will be very thankful. I truly believe in this project's potential. I9YGQVMWDYZBLHGKMTLBTAFBIQHGLYGSAGLJEZIV9OKWZSHIYRDSDPQQLTIEQEUSYZWUGGFHGQJLVYKOBWAYPTTGCX
This is a donation address; if you want to do the same, you should pay attention to some important details:
Create a seed for donation purposes only.
Generate an address and publish it for everyone.
If you spend any iota from it, you must attach a new address to the tangle and update the donation address you published before.
If someone sends iota to your previous donation address after you have spent from it, the funds sent to that specific address will probably be lost.
You can visualize how addresses work in IOTA here and here.
This happens because IOTA uses Winternitz one-time signatures to be quantum resistant. Every time you spend iota from an address, part of that address's private key is revealed, which makes it easier for attackers to steal the address's balance: attackers can search the tangle explorer for reused addresses and try to brute-force the private key, since they already know part of it.
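The danger of reuse can be illustrated with a toy Lamport one-time signature, a close relative of the Winternitz scheme IOTA actually uses (the 16-bit toy size and all names below are illustrative, not IOTA's real parameters):

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()
BITS = 16  # toy size; real schemes sign the full message digest

def keygen():
    # Private key: one pair of secrets per message bit.
    # Public key: the hashes of those secrets.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(BITS)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(BITS)]

def sign(sk, msg: bytes):
    # Signing REVEALS one secret from each pair -- half of the private key.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"donation tx")
print(verify(pk, b"donation tx", sig))  # True
```

Signing a second, different message reveals secrets from the other side of the pairs, which is exactly why an address signed from twice becomes vulnerable and must never be reused for spending.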
An extensive list of blockchain courses, resources and articles to help you get a job working with blockchain.
u/Maximus_no and I spent some time at work collecting and analyzing learning material for blockchain development. The list contains resources for developers, as well as for business analysts/consultants looking to learn more about blockchain use cases and solutions.
Certifications and Courses
IIB Council. Link to course: IIB Council: Certified Blockchain Professional. C|BP is an in-depth, industry-agnostic, hands-on training and certification course specifically tailored for industry professionals and developers interested in implementing emerging technologies in data-driven markets and digitized economies. The IIB Council Certified Blockchain Professional (C|BP) course was developed to help aspiring professionals gain extensive knowledge of blockchain technology and its implications for business. WHO IS IT FOR:
C|BP is developed in line with the latest industry trends to help current and aspiring professionals evolve in their careers by applying the latest knowledge in blockchain technology. This course will help professionals understand the foundations of blockchain technology and the opportunities this emerging technology offers.
If you are a developer willing to learn blockchain technology, this course is for you. You will learn to build and model blockchain solutions and blockchain-based applications for enterprises and businesses across multiple blockchain technologies.
This exam is designed for non-technical business professionals who require basic knowledge about Blockchain and how it will be executed within an organization. This exam is NOT appropriate for technology professionals seeking to gain deeper understanding of Blockchain technology implementation or programming.
A person who holds this certification demonstrates their knowledge of:
· What is Blockchain? (What exactly is it?) · Non-Technical Technology Overview (How does it work?) · Benefits of Blockchain (Why should anyone consider this?) · Use Cases (Where and for what apps is it appropriate?) · Adoption (Who is using it and for what?) · Future of Blockchain (What is the future?)
A person who holds this certification demonstrates their ability to:
· Architect blockchain solutions · Work effectively with blockchain engineers and technical leaders · Choose appropriate blockchain systems for various use cases · Work effectively with both public and permissioned blockchain systems
This exam will prove that a student completely understands:
· The difference between proof of work, proof of stake, and other proof systems and why they exist · Why cryptocurrency is needed on certain types of blockchains · The difference between public, private, and permissioned blockchains · How blocks are written to the blockchain · Where cryptography fits into blockchain and the most commonly used systems · Common use cases for public blockchains · Common use cases for private & permissioned blockchains · What is needed to launch your own blockchain · Common problems & considerations in working with public blockchains · Awareness of the tech behind common blockchains · When is mining needed and when it is not · Byzantine Fault Tolerance · Consensus among blockchains · What is hashing · How addresses, public keys, and private keys work · What is a smart contract · Security in blockchain · Brief history of blockchain · The programming languages of the most common blockchains · Common testing and deployment practices for blockchains and blockchain-based apps
A person who holds this certification demonstrates their ability to:
· Plan and prepare production ready applications for the Ethereum blockchain · Write, test, and deploy secure Solidity smart contracts · Understand and work with Ethereum fees · Work within the bounds and limitations of the Ethereum blockchain · Use the essential tooling and systems needed to work with the Ethereum ecosystem
This exam will prove that a student completely understands how to:
· Implement web3.js · Write and compile Solidity smart contracts · Create secure smart contracts · Deploy smart contracts both the live and test Ethereum networks · Calculate Ethereum gas costs · Unit test smart contracts · Run an Ethereum node on development machines
Basic course with focus on Bitcoin. After this course, you’ll know everything you need to be able to separate fact from fiction when reading claims about Bitcoin and other cryptocurrencies. You’ll have the conceptual foundations you need to engineer secure software that interacts with the Bitcoin network. And you’ll be able to integrate ideas from Bitcoin in your own projects.
· A mid / basic understanding of blockchain technology and its long-term implications for business, coupled with knowledge of its relationship to other emerging technologies such as AI and IoT · An economic framework for identifying blockchain-based solutions to challenges within your own context, guided by the knowledge of cryptoeconomics expert Christian Catalini · Recognition of your newfound blockchain knowledge in the form of a certificate of completion from the MIT Sloan School of Management — one of the world’s leading business schools Orientation Module: Welcome to Your Online Campus Module 1: An introduction to blockchain technology Module 2: Bitcoin and the curse of the double-spending problem Module 3: Costless verification: Blockchain technology and the last mile problem Module 4: Bootstrapping network effects through blockchain technology and cryptoeconomics Module 5: Using tokens to design new types of digital platforms Module 6: The future of blockchain technology, AI, and digital privacy
· A mid / basic understanding of what blockchain is and how it works, as well as insights into how it will affect the future of industry and of your organization. · The ability to make better strategic business decisions by utilizing the Oxford Blockchain Strategic framework, the Oxford Blockchain Regulation framework, the Oxford Blockchain Ecosystem map, and drawing on your knowledge of blockchain and affiliated industries and technologies. · A certificate of attendance from Oxford Saïd as validation of your newfound blockchain knowledge and skills, as well as access to a global network of like-minded business leaders and innovators. Module 1: Understanding blockchain Module 2: The blockchain ecosystem Module 3: Innovations in value transfer Module 4: Decentralized apps and smart contracts Module 5: Transforming enterprise business models Module 6: Blockchain frontiers
[Proof of Work] - kept very short, since it's well-known.  Bitcoin - to generate a new block, a miner must produce a hash of the new block header that meets the given difficulty requirements. Others: Ethereum, Litecoin etc. [Hybrid of PoW and PoS]  Decred - hybrid of “proof of work” and “proof of stake”. Blocks are created about every 5 minutes. Nodes in the network look for a solution with a known difficulty to create a block (PoW). Once the solution is found, it is broadcast to the network. The network then verifies the solution. Stakeholders who have locked some DCR in return for a ticket* now have the chance to vote on the block (PoS). 5 tickets are chosen pseudo-randomly from the ticket pool, and if at least 3 of the 5 vote ‘yes’ the block is permanently added to the blockchain. Both miners and voters are compensated with DCR: PoW receives 60% and PoS 30% of the roughly 30 new DCR issued with each block. * 1 ticket = the ability to cast 1 vote. Stakeholders must wait an average of 28 days (8,192 blocks) to vote their tickets. [Proof of Stake]  Nxt - the more tokens an account holds, the greater the chance that account will earn the right to generate a block. The total reward received as a result of block generation is the sum of the transaction fees located within the block. Three values are key to determining which account is eligible to generate a block, which account earns the right to generate a block, and which block is taken to be the authoritative one in times of conflict: the base target value, the target value and the cumulative difficulty. Each block on the chain has a generation signature parameter. To participate in the block forging process, an active account digitally signs the generation signature of the previous block with its own public key. This creates a 64-byte signature, which is then hashed using SHA-256. The first 8 bytes of the resulting hash are converted to a number, referred to as the account's hit. The hit is compared to the current target value (which scales with the account's active balance).
If the computed hit is lower than the target, the next block can be generated.  Peercoin (chain-based proof of stake) - coin-age parameter; a hybrid PoW and PoS algorithm. The longer your Peercoins have sat unmoved in your account (up to a maximum of 90 days), the more power (coin age) they have to mint a block. The act of minting a block requires the consumption of coin-age value, and the network determines consensus by selecting the chain with the largest total consumed coin age. Reward - minting + 1% yearly.  Reddcoin (Proof of Stake Velocity) - quite similar to Peercoin; the difference is a non-linear coin-aging function (new coins gain weight quickly, and old coins gain weight increasingly slowly) to encourage node activity. The node with the most coin-age weight has a bigger chance to create a block. To create a block, a node must calculate the right hash. Block reward - interest on the weighted age of coins / 5% annual interest in the PoSV phase.  Ethereum (Casper) - uses a modified BFT consensus. Blocks will be created using PoW. In the Casper Phase 1 implementation for Ethereum, the “proposal mechanism" is the existing proof-of-work chain, modified to have a greatly reduced block reward. Blocks will be validated by a set of Validators. A block is finalised when 2/3 of validators have voted for it (weighted not by the number of validators but by their deposit size). The block creator is rewarded with the block reward + transaction fees.  Lisk (Delegated Proof of Stake) - Lisk stakeholders vote with a vote transaction (the weight of the vote depends on the amount of Lisk the stakeholder possesses) and choose 101 Delegates, who create all blocks in the blockchain.
One delegate creates 1 block within 1 round (1 round contains 101 blocks) -> at the beginning of each round, each delegate is assigned a slot indicating their position in the block-generation process -> the delegate includes up to 25 transactions in the block, signs it and broadcasts it to the network -> once >51% of available peers agree that the block is acceptable (Broadhash consensus), the new block is added to the blockchain. *Any account may become a delegate, but only accounts with the required stake (no info on how much) are allowed to generate blocks. Block reward - minted Lisk and transaction fees (the fees for all 101 blocks in a round are collected first and then divided between the delegates). Blocks appear every 10 sec.  Cardano (Ouroboros Proof of Stake) - blocks (slots) are created by slot leaders. Slot leaders for epoch N are chosen during epoch N-1. Slot leaders are elected from the group of ADA stakeholders who have enough stake. The election process consists of 3 phases: Commitment phase: each elector generates a random value (secret), signs it and commits it as a message to the network (the other electors), saved into a block. -> Reveal phase: each elector sends a special value to open their commitment; all these values (openings) are put into the block. -> Recovery phase: each elector verifies that commitments and openings match, extracts the secrets and forms a SEED (a randomly generated byte string based on the secrets). All electors get the same SEED. -> Follow the Satoshi algorithm: the elector who holds the coin corresponding to the SEED becomes the SLOT LEADER and gets the right to create a block. The slot leader is rewarded with minted ADA and transaction fees.  Tezos (Proof of Stake) - a generic and self-amending crypto-ledger. At the beginning of each cycle (2048 blocks), a random seed is derived from numbers that block miners chose and committed to in the penultimate cycle, and revealed in the last.
-> Using this random seed, a follow-the-coin strategy (similar to Follow the Satoshi) is used to allocate mining rights and signing rights to stakeholders for the next cycle*. -> Blocks are mined by a random stakeholder (the miner) and include multiple signatures of the previous block provided by random stakeholders (the signers). Mining and signing both offer a small reward but also require making a one-cycle safety deposit, forfeited in the event of double mining or double signing. · the more coins (rolls) you have, the greater your chance to be a miner/signer.  Tendermint (Byzantine Fault Tolerance) - a proposal is signed and published by the designated proposer at each round. The proposer is chosen by a deterministic and non-choking round-robin selection algorithm that selects proposers in proportion to their voting power. The proposer creates the block, which must be validated by >2/3 of Validators, as follows: Propose -> Prevote -> Precommit -> Commit. The proposer is rewarded with transaction fees.  Tron (Byzantine Fault Tolerance) - this blockchain is still in the development stage. Consensus algorithm = PoS + BFT (similar to Tendermint): the PoS algorithm chooses a node as Proposer; this node has the power to generate a block. -> The Proposer broadcasts the block that it wants to release. -> The block enters the Prevote stage; it takes >2/3 of nodes' confirmations to enter the next stage. -> Once the block is prevoted, it enters the Precommit stage and needs >2/3 of nodes' confirmations to go further. -> Once >2/3 of nodes have precommitted the block, it is committed to the blockchain with height +1. New blocks appear every 15 sec.
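The ">2/3 of voting power" commit rule used by the Tendermint-style protocols above can be sketched in a few lines. This is an illustrative sketch only; the function and variable names are assumptions, not the actual Tendermint implementation.

```python
# Sketch of the ">2/3 of voting power" commit rule used by Tendermint-style
# BFT consensus. Names and structure are illustrative assumptions.

def block_committed(votes, voting_power):
    """votes: set of validator ids that precommitted the block.
    voting_power: dict mapping validator id -> stake-weighted power."""
    total = sum(voting_power.values())
    voted = sum(power for v, power in voting_power.items() if v in votes)
    # The threshold counts voting power (deposit size), not node count.
    return voted * 3 > total * 2

powers = {"a": 10, "b": 10, "c": 10, "d": 70}
print(block_committed({"a", "b", "c"}, powers))  # False: only 30% of power
print(block_committed({"d", "a"}, powers))       # True: 80% of power
```

Note that three of the four validators agreeing is not enough here, because the rule weighs deposits, not heads.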
NEO (Delegated Byzantine Fault Tolerance) - consensus nodes* are elected by NEO holders -> the Speaker is identified (based on an algorithm) -> he broadcasts a proposal to create a block -> each Delegate (the other consensus nodes) validates the proposal -> each Delegate sends its response to the other Delegates -> a Delegate reaches consensus after receiving 2/3 positive responses -> each Delegate signs the block and publishes it -> each Delegate receives a full block. Block reward: 6 GAS, distributed among NEO holders proportionally to their NEO holding ratio. The Speaker is rewarded with transaction fees (mostly 0). * Stake 1000 GAS to nominate yourself for bookkeeping (consensus node).  EOS (Delegated Proof of Stake) - those who hold tokens on a blockchain adopting the EOS.IO software may select* block producers through a continuous approval voting system, and anyone may choose to participate in block production; they will be given an opportunity to produce blocks proportional to the total votes they have received relative to all other producers. At the start of each round, 21 unique block producers are chosen. The top 20 by total approval are automatically chosen every round, and the last producer is chosen proportionally to their number of votes relative to the other producers. A block must be confirmed by 2/3 or more of the elected block producers. Block producers are rewarded with block rewards.
*the more EOS tokens a stakeholder owns, the greater their voting power [The XRP Ledger Consensus Process]  Ripple - each node receives transactions from external applications -> each node forms a public list of all valid transactions not yet included in the last ledger (=block), aka its Candidate Set -> nodes merge their candidate sets with the candidate sets of their UNL (Unique Node List) and vote on the veracity of all transactions (1st round of consensus) -> all transactions that receive at least 50% of votes are passed on to the next round (many rounds may take place) -> the final round of consensus requires that a minimum of 80% of a node's UNL agree on the transactions, meaning that at least 80% of validating nodes must have the same candidate set of transactions -> after that, each validating node computes a new ledger (=block) with all the transactions (with 80% UNL agreement), calculates the ledger hash, signs it and broadcasts it -> all validating nodes compare their ledger hashes -> nodes of the network recognize a ledger instance as validated when 80% of the peers have signed and broadcast the same validation hash. -> The process repeats. Ledger creation lasts 5 sec(?). Each transaction includes a transaction fee (min 0.00001 XRP), which is destroyed. No block rewards. [The Stellar Consensus Protocol]  Stellar (Federated Byzantine Agreement) - quite similar to Ripple. Key difference - the quorum slice. [Proof of Burn]  Slimcoin - to get the right to write blocks, a node must “burn” an amount of coins. The more coins a node “burns”, the more chances it has to create blocks (over a long period) -> the node's address gets a score called Effective Burnt Coins that determines its chance to find blocks. The block creator is rewarded with block rewards. [Proof of Importance]  NEM - only accounts that hold a minimum of 10k vested coins are eligible to harvest (create a block). Accounts with higher Importance scores have higher probabilities of harvesting a block. The higher the amount of vested coins, the higher the account's Importance score.
The score also rises with the number of transactions that satisfy the following conditions: transactions totalling min 1k coins, made within the last 30 days, to recipients who also hold 10k vested coins. The harvester is rewarded with the fees for the transactions in the block. A new block is created approx. every 65 sec. [Proof of Devotion]  Nebulas (Proof of Devotion + BFT) - quite similar to PoI; PoD selects the accounts with high influence. All accounts are ranked according to their liquidity and propagation (Nebulas Rank) -> top-ranked accounts are selected -> the chosen accounts pay a deposit and are qualified as the block Validators* -> the algorithm pseudo-randomly chooses a block Proposer -> after a new block is proposed, the Validator Set (each Validator is charged a deposit) participates in a round of BFT-style voting to verify the block (1. Prepare stage -> 2. Commit stage; validators holding > 2/3 of the total deposits are needed to validate the block) -> the block is added. Block rewards: each Validator is rewarded with 1 NAS. *The Validator Set is dynamic; changes in the set may occur after an epoch change. [IOTA Algorithm]  IOTA - uses a DAG (Directed Acyclic Graph) instead of a blockchain (the TANGLE is the equivalent of the ledger). The graph consists of transactions (not blocks). To issue a new transaction, a node must approve 2 random other (unconfirmed) transactions. Each transaction should be validated n(?) times. By validating 2 past transactions, the whole network achieves consensus. In order to issue a transaction, a node: 1. signs the transaction with its private key 2. chooses two other transactions to validate based on the MCMC (Markov chain Monte Carlo) algorithm and checks that both transactions are valid (a node will never approve conflicting transactions) 3. performs some PoW (similar to Hashcash). -> The new transaction is broadcast to the network. Nodes receive no reward or fee. [PBFT + PoW]  Yobicash - uses PBFT and also PoW. Nodes reach consensus on transactions by querying other nodes.
A node asks its peers about the state of a transaction: whether it is known or not, and whether it is a double-spending transaction or not. As follows: a node receives a new transaction -> checks if it is valid -> queries all known nodes for missing transactions (checking if they are already in the DAG) -> queries 2/3 of nodes about double spending and validity -> if everything is OK, adds it to the DAG. Reward - nodes receive transaction fees + minted coins. [Proof of Space / Proof of Capacity]  Filecoin (Power Fault Tolerance) - the probability that the network elects a miner (Leader) to create a new block (referred to as the voting power of the miner) is proportional to its storage currently in use relative to the rest of the network. Each node has Power - storage in use, verified by other nodes using Proof-of-Spacetime. Leaders extend the chain by creating a block and propagating it to the network. There can be an empty block (when there is no leader). A block is committed if the majority of the participants add their weight to the chain the block belongs to, by extending the chain or by signing blocks. The block creator is rewarded with the block reward + transaction fees. [Proof of Elapsed Time (PoET)]  Hyperledger Sawtooth - goal: to solve the BFT validating-node limitation. Works only with Intel's SGX. PoET uses a random (lottery-based) leader-election model based on SGX, where the protocol randomly selects the next leader to finalize the block. Every validator requests a wait time from an enclave (a trusted function). -> The validator with the shortest wait time for a particular transaction block is elected the leader. -> The BlockPublisher is responsible for creating candidate blocks to extend the current chain. It takes direction from the consensus algorithm on when to create a block and when to publish it. It creates, finalizes, signs and broadcasts the block -> block validators check the block -> the block is added on top of the blockchain.
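The PoET lottery described above can be sketched as a shortest-wait-time election. In Sawtooth the wait times come from a trusted SGX enclave; here plain pseudo-random numbers stand in for the enclave, which is an explicit simplification.

```python
import random

# Illustrative sketch of PoET-style lottery election. The random draw here
# stands in for the SGX enclave's trusted wait-time function (an assumption
# for demonstration purposes only).

def elect_leader(validators, rng=random):
    # Each validator requests a wait time; exponential draws model a lottery.
    waits = {v: rng.expovariate(1.0) for v in validators}
    # The validator with the shortest wait time becomes leader for this block.
    return min(waits, key=waits.get)

leader = elect_leader(["node-1", "node-2", "node-3"])
print(leader)
```

Because every validator's draw is identically distributed, each has an equal chance of winning, which is the fairness property the enclave is meant to guarantee.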
Byteball (Delegated Byzantine Fault Tolerance) - only verified nodes are allowed to be validation nodes (list of requirements: https://github.com/byteball/byteball-witness). Users choose in each transaction a set of 12 validating nodes. Validating nodes (witnesses) receive transaction fees.  Nano - uses a DAG and PoW (Hashcash). Nano uses a block-lattice structure: each account has its own blockchain (account-chain) equivalent to the account's transaction/balance history. To add a transaction, the user must perform some Hashcash PoW -> when a user creates a transaction, a Send block appears on their chain and a Receive block appears on the recipient's chain -> peers in view receive the block -> peers verify the block (checking for double spending and whether it is already in the ledger) -> peers achieve consensus and add the block. In case of a fork (when 2 or more signed blocks reference the same previous block): the Nano network resolves forks via a balance-weighted voting system in which representative nodes vote for the block they observe; once >50% of weighted votes are received, consensus is achieved and the block is retained in the node's ledger (the block that loses the vote is discarded).  Holochain - uses a distributed hash table (DHT). Instead of trying to manage global consensus for every change to a huge blockchain ledger, every participant has their own signed hash chain. A multi-party transaction is signed to each party's chain: each party signs the exact same transaction with links to each of their previous chain entries. After data is signed to the local chains, it is shared to a DHT, where every neighbor node validates it. Any consensus algorithm can be built on top of Holochain.  Komodo ('Delegated' Delayed Proof of Work (dPoW)) - end-to-end blockchain solutions. The dPoW consensus mechanism does not use the longest-chain rule to resolve a conflict in the network; instead, dPoW looks to the backups it previously inserted into the chosen PoW blockchain.
The process of inserting backups of Komodo transactions into a secure PoW chain is “notarization.” Notarization is performed by the elected notary nodes. Roughly every ten minutes, the notary nodes perform a special block hash mined on the Komodo blockchain and take note of the overall Komodo blockchain “height”. The notary nodes process this specific block so that their signatures are cryptographically included within the content of the notarized data. There are sixty-four “notary nodes”, elected by a stake-weighted vote in which ownership of KMD represents stake. They are a special type of blockchain miner, having certain features in their underlying code that enable them to maintain an effective and cost-efficient blockchain, and they periodically receive the privilege to mine a block on “easy difficulty.” Source: https://www.reddit.com/CryptoTechnology/comments/7znnq8/my_brief_observation_of_most_common_consensus/ Whitepapers worth looking into: IOTA - http://iotatoken.com/IOTA_Whitepaper.pdf NANO - https://nano.org/en/whitepaper Bitcoin - https://bitcoin.org/bitcoin.pdf Ethereum - https://github.com/ethereum/wiki/wiki/White-Paper Ethereum Plasma (Omise-GO) - https://plasma.io/plasma.pdf Cardano - https://eprint.iacr.org/2016/889.pdf
It has been proposed that ProgPoW should replace Ethash; most notably, the EIP itself describes it this way. This framing is confusing and, considering the Ship of Theseus and the design rationale as the 'formal cause', wrong. The algorithm Ethash evolved from a research process that included something called Dagger-Hashimoto, an algorithm combining Dagger and Hashimoto that was considered to have flaws. After further research and improvements (read about the history here), this all eventually rolled into specification revision 23 (aka 'Ethash'). Viewed this way, any hashing algorithm that evolves from it and is used for the Ethereum mainnet should still be considered an increment of Ethash. So, similar to the famous ship, and in the style of Martin Swende, who recently described the difference in his stance on ProgPoW, it would be less confusing to describe ProgPoW as a revision of Ethash rather than a replacement. The changes could be referred to by revision numbers and a description of the part that has been updated, e.g. ethash-dashimoto to ethash-progpow. What are your thoughts on this?
In response to ProofOfResearch's misleading article on NEO.
Yesterday, I was made aware of an article published by ProofOfResearch based almost entirely on a Reddit post that I had written a few months ago. About a month ago, I was contacted by Randomshortdude (supposedly ProofOfResearch himself) asking for permission to use excerpts from the aforementioned post in his write-up about NEO. As an avid proponent of inclusivity and transparency, I gave permission to use the contents of my post (screenshots of the entire conversation are added below), providing him with links to the GitHub repos and updating him on the fixes and improvements that had happened since the post was published. I then went back to working on my projects, unaware that my post was being molded into the foundation of an entirely misleading and unfathomably unscientific article. This post consists of a list of excerpts from the article and a corresponding rebuttal for each of the listed excerpts.
"This is a semantic issue (example: $BTC having a 1 MB block size + 10 min block time limits TPS; no way around that) meaning that this is immutable"
Bitcoin doesn't have a 10-minute block time limit coded into the platform. The 10-minute average block production time is maintained by a difficulty adjustment formula that retargets the difficulty of the underlying Hashcash-style PoW every 2016 blocks, based on how long the preceding blocks took to produce (due to an off-by-one bug that was never fixed, the calculation actually spans only 2015 block intervals).
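The retargeting rule can be sketched in a few lines. This is a simplified model (the real rule operates on the compact "bits" encoding of the target and includes the off-by-one quirk noted above); the function and constant names are my own.

```python
# Sketch of Bitcoin's difficulty retargeting (simplified; names are
# illustrative, and the real rule works on the compact "bits" target).

TARGET_SPACING = 10 * 60      # desired seconds per block
RETARGET_INTERVAL = 2016      # blocks between adjustments

def retarget(old_target, actual_timespan):
    expected = TARGET_SPACING * RETARGET_INTERVAL
    # Bitcoin clamps the adjustment to a factor of 4 in either direction.
    actual_timespan = max(expected // 4, min(actual_timespan, expected * 4))
    # A larger target means easier PoW; blocks came too fast -> shrink target.
    return old_target * actual_timespan // expected

# Blocks arrived twice as fast as intended -> target halves, difficulty doubles.
old = 1 << 220
new = retarget(old, TARGET_SPACING * RETARGET_INTERVAL // 2)
print(new == old // 2)  # True
```

This is why there is "no way around" the average spacing by simply mining harder: added hashpower is absorbed into a smaller target at the next retarget.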
"I’m not sure it’s even possible to change the digital signature of a protocol without a major hard fork, and there isn’t an alternative digital signature (that I think of), that would make this any more secure."
This excerpt references Point 2 of my Reddit post, which criticizes the use of multisig as proof that a quorum (at least 2f + 1) of replicas has signed the block hash. Using multisig instead of signature batching via Schnorr signatures doesn't affect the security of the nodes or the cryptographic standards used; however, the security of the network as a whole can be compromised, because the resulting decrease in the number of operating full/light nodes increases the likelihood that a spam attack could degrade the platform's performance. Apart from that, the platform's digital signature algorithm can easily be changed by adjusting the versioning of the block and transaction structures.
"Therefore, the consensus algo itself would need to be changed to amend this issue."
The consensus protocol works independently from the cryptographic standards of the platform, so a switch to a different elliptic curve or digital signature algorithm will have zero impact on the consensus algorithm.
"This, in itself, might be what stops $NEO from ever being able to truly scale."
While digital signature algorithms vary in signing and verification speed, the difference in performance between the most popular signature schemes is small enough (except for BLS) to have a negligible impact on the efficiency of the consensus. As long as the nodes are running an efficient implementation, average network throughput will continue to be the main bottleneck of the platform.
"Digital signatures are somewhat complex, but not incomprehensible if you really take the time to sit down and understand it. Once again though, it’s going to rely on an understanding of blockchain tech as well to know how this impacts the signing feature of a TX itself as well as pub key creation"
Digital signature algorithms play no role in public key creation, as a public key is created simply by multiplying the generator point (G) by 256 bits of entropy (the private key). A screenshot of a tweet used in the article. Baffling. The ed25519 DSA does not impact the efficiency of BFT or "blockchain" (whatever the hell that means in this context) as a result. Please also note that NEO does not use ed25519. NEO uses secp256r1, a NIST-recommended elliptic curve (as opposed to secp256k1, the Koblitz curve used by Bitcoin).
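The "private key times generator" derivation is just elliptic-curve scalar multiplication, and can be sketched from scratch. The sketch below uses secp256k1 constants (Bitcoin's curve) for brevity; secp256r1 as used by NEO follows the identical procedure with different curve parameters. The toy private key is of course an illustrative assumption.

```python
# Sketch of public-key derivation: pubkey = privkey * G, i.e. scalar
# multiplication of the curve generator. secp256k1 parameters shown; the
# procedure is identical for secp256r1, only the constants change.

P  = 2**256 - 2**32 - 977  # field prime of secp256k1
Gx = 0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798
Gy = 0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8

def point_add(p1, p2):
    # Group law on y^2 = x^3 + 7; None represents the point at infinity.
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        m = (3 * x1 * x1) * pow(2 * y1, -1, P) % P  # tangent slope (a = 0)
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P     # chord slope
    x3 = (m * m - x1 - x2) % P
    return x3, (m * (x1 - x3) - y1) % P

def scalar_mult(k, point):
    # Standard double-and-add.
    result = None
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

priv = 0xC0FFEE                    # toy private key, for illustration only
pub = scalar_mult(priv, (Gx, Gy))  # the corresponding public key point
```

Since the signing algorithm never enters this computation, swapping out the DSA indeed leaves key generation untouched, as the rebuttal states.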
"Regular PoW algos are already designed to be Byzantine fault-tolerant already"
While technically correct, the author dismisses the fact that BFT algorithms offer Byzantine fault tolerance under rigid mathematical assumptions, in contrast to PoW algorithms, which offer Byzantine fault tolerance only under probabilistic assumptions.
"Byzantine Fault Tolerance is not an issue though. It’s actually really useful but for private blockchains."
A common misconception about the use of BFT algorithms in "public" (the author meant permissioned/permissionless) blockchains. BFT algorithms only require the validator set to remain fixed during the agreement phase (meaning that new candidates have to wait until the next consensus round to be able to participate in consensus) and can implement a round-robin algorithm to select the next pool of validators.
"Of course, in a decentralized protocol — something like that is very hard to achieve."
The research paper quoted in the article examines the efficiency of Castro and Liskov's PBFT (Practical Byzantine Fault Tolerance) algorithm, which differs from dBFT in that PBFT doesn't require a primary change after every consensus round, a difference that impacts performance in a decentralized network.
“At the other extreme, Hyperledger uses the classic PBFT protocol, which is communication bound: O(N²) where N is the number of nodes. PBFT can tolerate fewer than N/3 failures, and works in three phases in which nodes broadcast messages to each other. First, the pre-prepare phase selects a leader which chooses a value to commit. Next, the prepare phase broadcasts the value to be validated. Finally, the commit phase waits for more than two-thirds of the nodes to confirm before announcing that the value is committed. PBFT has been shown to achieve liveness and safety properties in a partially asynchronous model, thus, unlike PoW, once the block is appended it is confirmed immediately. It can tolerate more failures than PoW (which is vulnerable to 25% attacks). However, PBFT assumes that node identities are known, therefore it can only work in the permissioned settings. Additionally, the protocol is unlikely to be able to scale to the network size of Ethereum because of its communication overhead.”
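The O(N²) communication bound quoted above is easy to make concrete with a back-of-envelope message count. The phase breakdown below is a simplification (pre-prepare is leader-to-all, prepare and commit are all-to-all); the function is illustrative, not part of any PBFT implementation.

```python
# Rough per-instance message count for PBFT's three phases, illustrating
# the O(N^2) communication bound (simplified model, not an implementation).

def pbft_messages(n):
    pre_prepare = n - 1       # leader broadcasts the proposal to the others
    prepare = n * (n - 1)     # every replica messages every other replica
    commit = n * (n - 1)      # and again in the commit phase
    return pre_prepare + prepare + commit

for n in (4, 16, 64):
    print(n, pbft_messages(n))
```

Going from 4 to 64 replicas multiplies the per-block traffic by roughly 300x, which is exactly why the quoted paper doubts PBFT can scale to Ethereum-sized networks.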
This statement will require a separate post to examine the real-world "permission-lessness" of PoW chains.
"NEO codebase is virtually abandoned."
"This is purportedly in favor of $NEO 3.0, but there’s no GitHub for $NEO 3.0 (at least not any that I’ve found)"
"The idea of it being able to handle 1000 TPS has been thoroughly debunked and it is virtually impossible (probably entirely impossible) for $NEO to create a public blockchain based on DBFT (essentially POS+BFT semantically), that keeps the same encryption signatures (which are probably the only ones that will reliably serve the purpose of crypto where collision resistance must be all but a guarantee)."
dBFT cannot be equated to PoS + BFT as none of those are delegate-centered protocols. How was 1000 TPS thoroughly debunked? With the neo-sharp implementation and Akka being launched, I don't see a reason for dBFT to not be able to surpass 1,000 TPS during peak loads (not during sustained loads though). The excerpt about the collision resistance of "encryption signatures" (?) makes no sense to me. Here are the promised screenshots of our conversation: Screenshot 1 Screenshot 2 Screenshot 3 P.S. It is sad to see the so-called "researchers" attracting a mass following despite being clueless about the technology they are trying to review.
Bitcoin (₿) is a cryptocurrency. It is a decentralized digital currency without a central bank or single administrator that can be sent from user to user on the peer-to-peer bitcoin network without the need for intermediaries. Transactions are verified by network nodes through cryptography and recorded in a public distributed ledger called a blockchain. Bitcoin was invented by an unknown person or group of people using the name Satoshi Nakamoto and was released as open-source software in 2009. Bitcoins are created as a reward for a process known as mining. They can be exchanged for other currencies, products, and services. Research produced by the University of Cambridge estimates that in 2017 there were 2.9 to 5.8 million unique users using a cryptocurrency wallet, most of them using bitcoin. Bitcoin has been criticized for its use in illegal transactions, its high electricity consumption, price volatility, thefts from exchanges, and by reputable economists stating that "it should have a zero price". Bitcoin has also been used as an investment, although several regulatory agencies have issued investor alerts about bitcoin. Mining is a record-keeping service done through the use of computer processing power. Miners keep the blockchain consistent, complete, and unalterable by repeatedly grouping newly broadcast transactions into a block, which is then broadcast to the network and verified by recipient nodes. Each block contains a SHA-256 cryptographic hash of the previous block, thus linking it to the previous block and giving the blockchain its name. To be accepted by the rest of the network, a new block must contain a proof-of-work (PoW). The system used is based on Adam Back's 1997 anti-spam scheme, Hashcash. The PoW requires miners to find a number called a nonce, such that when the block content is hashed along with the nonce, the result is numerically smaller than the network's difficulty target.
This proof is easy for any node in the network to verify, but extremely time-consuming to generate, as for a secure cryptographic hash, miners must try many different nonce values (usually the sequence of tested values is the ascending natural numbers: 0, 1, 2, 3, ...) before meeting the difficulty target. Every 2,016 blocks (approximately 14 days at roughly 10 min per block), the difficulty target is adjusted based on the network's recent performance, with the aim of keeping the average time between new blocks at ten minutes. In this way the system automatically adapts to the total amount of mining power on the network. Between 1 March 2014 and 1 March 2015, the average number of nonces miners had to try before creating a new block increased from 16.4 quintillion to 200.5 quintillion. The proof-of-work system, alongside the chaining of blocks, makes modifications of the blockchain extremely hard, as an attacker must modify all subsequent blocks in order for the modifications of one block to be accepted. As new blocks are mined all the time, the difficulty of modifying a block increases as time passes and the number of subsequent blocks (also called confirmations of the given block) increases.
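The nonce search described above can be sketched as a toy Hashcash-style loop. The target here is drastically easier than any real Bitcoin target so the loop finishes instantly; the header bytes and helper name are illustrative assumptions, and real mining hashes a fully serialized 80-byte block header.

```python
import hashlib

# Toy Hashcash-style proof-of-work search: find a nonce such that the double
# SHA-256 of header+nonce is numerically below the target. The difficulty is
# drastically reduced here; real Bitcoin targets require on the order of
# quintillions of attempts on average.

def mine(header: bytes, target: int) -> int:
    nonce = 0
    while True:
        payload = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # verification is one hash; finding it took many
        nonce += 1

easy_target = 1 << 248  # requires roughly one leading zero byte on average
nonce = mine(b"example block header", easy_target)
print(nonce)
```

The asymmetry in the text is visible directly: `mine` loops over many hashes, while checking the returned nonce takes a single double-SHA-256.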
Core/AXA/Blockstream CTO Greg Maxwell, CEO Adam Back, attack dog Luke-Jr and censor Theymos are sabotaging Bitcoin - but they lack the social skills to even feel guilty for this. Anyone who attempts to overrule the market and limit or hard-code Bitcoin's blocksize must be rejected by the community.
AXA is trying to sabotage Bitcoin by paying the most ignorant, anti-market devs in Bitcoin: Core/Blockstream This is the direction that Bitcoin has been heading in since late 2014 when Blockstream started spreading their censorship and propaganda and started bribing and corrupting the "Core" devs using $76 million in fiat provided by corrupt, anti-Bitcoin "fantasy fiat" finance firms like the debt-backed, derivatives-addicted insurance mega-giant AXA. Remember:
Bitcoin was always intended to be upgraded honestly, overtly, explicitly, and transparently - by hard forks as proposed by Satoshi - where you must explicitly "opt in" by deliberately upgrading your code.
Smart, honest devs fix bugs. Fiat-fueled AXA-funded Core/Blockstream devs add bugs - and then turn around and try to lie to our face and claim their bugs are somehow "features" Recently, people discovered bugs in other Bitcoin implementations - memory leaks in BU's software, "phone home" code in AntMiner's firmware. And the devs involved immediately took public responsibility, and fixed these bugs. Meanwhile...
AXA-funded Blockstream's centrally planned blocksize is still a (slow-motion but nonetheless long-term fatal) bug, and
AXA-funded Blockstream's Anyone-Can-Spend SegWit hack/kludge is still a poison-pill.
People are so sick and tired of AXA-funded Blockstream's lies and sabotage that 40% of the network is already mining blocks using BU - because we know that BU will fix any bugs we find (but AXA-funded Blockstream will lie and cheat and try to force their bugs down everyone's throats).
So the difference is: BU's and AntMiner's devs possess enough social and economic intelligence to fix bugs in their code immediately when the community finds them. Meanwhile, most people in the community have been in an absolute uproar for years now against AXA-funded Blockstream's centrally planned blocksize and their deadly Anyone-Can-Spend hack/kludge/poison-pill. Of course, the home-schooled fiat-fattened sociopath Blockstream CTO One-Meg Greg u/nullc would probably just dismiss all these Bitcoin users as the "shreaking" [sic] masses. Narcissistic sociopaths like AXA-funded Blockstream CTO Greg Maxwell and CEO Adam Back and their drooling delusional attack dog Luke-Jr (another person who was home-schooled - which may help explain why he's also such a tone-deaf anti-market sociopath) are just too stupid and arrogant to have the humility and the shame to shut the fuck up and listen to the users when everyone has been pointing out these massive lethal bugs in Core's shitty code. Greg, Adam, Luke-Jr, and Theymos are the most damaging people in Bitcoin. These are the four main people who are (consciously or unconsciously) attempting to sabotage Bitcoin:
These toxic idiots are too stupid and shameless and sheltered - and too anti-social and anti-market - to even begin to recognize the lethal bugs they have been trying to introduce into Bitcoin's specification and our community. Users decide on specifications. Devs merely provide implementations. Guys like Greg think that they're important because they can do implementation-level stuff (like avoiding memory leaks in C++ code). But they are total failures when it comes to specification-level stuff (ie, they are incapable of figuring out how to "grow" a potentially multi-trillion-dollar market by maximally leveraging available technology).
Core/Blockstream is living in a fantasy world. In the real world everyone knows (1) our hardware can support 4-8 MB (even with the Great Firewall), and (2) hard forks are cleaner than soft forks. Core/Blockstream refuses to offer either of these things. Other implementations (eg: BU) can offer both.
https://np.reddit.com/btc/comments/5ejmin/coreblockstream_is_living_in_a_fantasy_world_in/ Greg, Adam, Luke-Jr and Theymos apparently lack the social and economic awareness and human decency to feel any guilt or shame for the massive damage they are attempting to inflict on Bitcoin - and on the world. Their ignorance is no excuse Any dev who is ignorant enough to attempt to propose adding such insidious bugs to Bitcoin needs to be rejected by the Bitcoin community - no matter how many years they keep on loudly insisting on trying to sabotage Bitcoin like this. The toxic influence and delusional lies of AXA-funded Blockstream CTO Greg Maxwell, CEO Adam Back, attack dog Luke-Jr and censor Theymos are directly to blame for the slow-motion disaster happening in Bitcoin right now - where Bitcoin's market cap has continued to fall from 100% towards 60% - and is continuing to drop.
When bitcoin drops below 50%, most of the capital will be in altcoins. All they had to do was increase the block size to 2MB as they promised. Snatching defeat from the jaws of victory.
u/FormerlyEarlyAdopter : "I predict one thing. The moment Bitcoin hard-forks away from Core clowns, all the shit-coins out there will have a major sell-off." ... u/awemany : "Yes, I expect exactly the same. The Bitcoin dominance index will jump above 95% again."
https://np.reddit.com/btc/comments/5yfcsw/uformerlyearlyadopter_i_predict_one_thing_the/ Market volume (ie, blocksize) should be decided by the market - not based on some arbitrary number that some ignorant dev pulled out of their ass. For any healthy cryptocurrency, market price and market capitalization and market volume (a/k/a "blocksize") are determined by the market - not by any dev team, not by central bankers from AXA, not by economically ignorant devs like Adam and Greg (or that other useless idiot - Core "Lead Maintainer" Wladimir van der Laan), not by some drooling pathological delusional authoritarian freak like Luke-Jr, and not by some petty tyrant and internet squatter and community-destroyer like Theymos. The only way that Bitcoin can survive and prosper is if we, as a community, denounce and reject these pathological "centralized blocksize" control freaks like Adam and Greg and Luke and Theymos who are trying to use tricks like fiat and censorship and lies (in collusion with their army of trolls organized and unleashed by the Dragons Den) to impose their ignorance and insanity on our currency. These losers might be too ignorant and anti-social to even begin to understand the fact that they are attempting to sabotage Bitcoin. But their ignorance is no excuse. And Bitcoin is getting ready to move on and abandon these losers. There are many devs who are much better than Greg, Adam and Luke-Jr. A memory leak is an implementation error, and a centrally planned blocksize is a specification error - and both types of errors will be avoided and removed by smart devs who listen to the community. There are plenty of devs who can write Bitcoin implementations in C++ - plus plenty of devs who can write Bitcoin implementations in other languages as well, such as:
Greg, Adam, Luke-Jr and Theymos are being exposed as miserable failures. AXA-funded Blockstream CTO Greg Maxwell, CEO Adam Back, their drooling attack dog Luke-Jr and their censor Theymos (and all the idiot small-blockheads, trolls, and shills who swallow the propaganda and lies cooked up in the Dragons Den) are being exposed more and more every day as miserable failures. Greg, Adam, Luke-Jr and Theymos had the arrogance and the hubris to want to be "trusted" as "leaders". But Bitcoin is the world's first cryptocurrency - so it doesn't need trust, and it doesn't need leaders. It is decentralized and trustless. C++ devs should not be deciding Bitcoin's volume. The market should decide. It's not surprising that a guy like "One-Meg Greg" who adopts a nick like u/nullc (because he spends most of his life worrying about low-level details like how to avoid null pointer errors in C++ while the second-most-powerful fiat finance corporation in the world AXA is throwing tens of millions of dollars of fiat at his company to reward him for being a "useful idiot") has turned out to be not very good at seeing the "big picture" of Bitcoin economics. So it also comes as no surprise that Greg Maxwell - who wanted to be the "leader" of Bitcoin - has turned out to be one of the most harmful people in Bitcoin when it comes to things like growing a potentially multi-trillion-dollar market and economy. All the innovation and growth and discussion in cryptocurrencies is happening everywhere else - not at AXA-funded Blockstream and r\bitcoin (and the recently discovered Dragons Den, where they plan their destructive social engineering campaigns). Those are the censored centralized cesspools financed by central bankers and overrun by loser devs and the mindless trolls who follow them - and supported by inefficient miners who want to cripple Bitcoin with centrally planned blocksize (and dangerous "Anyone-Can-Spend" SegWit).
Bitcoin is moving on to bigger blocks and much higher prices - leaving AXA-funded Blockstream's crippled censored centrally planned shit-coin in the dust. Let them stagnate in their crippled shit-coin with its centrally planned, artificial, arbitrary 1MB/1.7MB blocksize, and SegWit's Anyone-Can-Spend hack/kludge poison-pill. Bitcoin is moving on without these tyrants and liars and losers and sociopaths - and we're going to leave their crippled censored centrally planned shit-coin in the dust.
Core/Blockstream are now in the Kübler-Ross "Bargaining" phase - talking about "compromise". Sorry, but markets don't do "compromise". Markets do COMPETITION. Markets do winner-takes-all. The whitepaper doesn't talk about "compromise" - it says that 51% of the hashpower determines WHAT IS BITCOIN.
1 BTC = 64 000 USD would be > $1 trillion market cap - versus $7 trillion market cap for gold, and $82 trillion of "money" in the world. Could "pure" Bitcoin get there without SegWit, Lightning, or Bitcoin Unlimited? Metcalfe's Law suggests that 8MB blocks could support a price of 1 BTC = 64 000 USD
Bitcoin Original: Reinstate Satoshi's original 32MB max blocksize. If actual blocks grow 54% per year (and price grows 1.54² ≈ 2.37x per year - Metcalfe's Law), then in 8 years we'd have 32MB blocks, 100 txns/sec, 1 BTC = 1 million USD - 100% on-chain P2P cash, without SegWit/Lightning or Unlimited
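The arithmetic behind this projection is easy to check (a back-of-the-envelope sketch only; it assumes, as the post does, that price scales with the square of transaction volume per Metcalfe's Law):

```python
# Back-of-the-envelope check of the projection above: blocks grow 54% per
# year, so under Metcalfe's Law (value ~ n^2) price grows 1.54^2 ≈ 2.37x
# per year.

years = 8
block_growth = 1.54
price_growth = block_growth ** 2                 # ≈ 2.37x per year

final_blocksize_mb = 1 * block_growth ** years   # starting from ~1 MB blocks
price_multiple = price_growth ** years           # cumulative price growth

print(final_blocksize_mb)  # ≈ 31.6 MB, close to Satoshi's 32 MB cap
print(price_multiple)      # ≈ 1000x, i.e. roughly 1 BTC = 1 million USD
```

So the two round numbers in the post (32MB blocks, ~1000x price) do follow from the stated growth assumptions; whether those assumptions hold is of course another matter.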
I'm writing a series about blockchain tech and possible future security risks. This is the third part of the series introducing Quantum resistant blockchains.
Part 1 and part 2 will give you useful basic blockchain knowledge that is not explained in this part. Part 1 here Part 2 here

Quantum resistant blockchains explained.
- How would quantum computers pose a threat to blockchain?
- Expectations in the field of quantum computer development.
- Quantum resistant blockchains
- Why is it easier to change cryptography for centralized systems such as banks and websites than for blockchain?
- Conclusion

The fact that whatever is registered on a blockchain can't be tampered with is one of the great reasons for the success of blockchain. Looking ahead, awareness is growing in the blockchain ecosystem that quantum computers might make changes necessary to the cryptography blockchains use to prevent hackers from forging transactions.

How would quantum computers pose a threat to blockchain?

First, let's get a misconception out of the way. When talking about the risk quantum computers could pose for blockchain, some people think about the risk of quantum computers out-hashing classical computers. This, however, is not expected to pose a real threat when the time comes. This paper explains why: https://arxiv.org/pdf/1710.10377.pdf "In this section, we investigate the advantage a quantum computer would have in performing the hashcash PoW used by Bitcoin. Our findings can be summarized as follows: Using Grover search, a quantum computer can perform the hashcash PoW by performing quadratically fewer hashes than is needed by a classical computer. However, the extreme speed of current specialized ASIC hardware for performing the hashcash PoW, coupled with much slower projected gate speeds for current quantum architectures, essentially negates this quadratic speedup, at the current difficulty level, giving quantum computers no advantage. Future improvements to quantum technology allowing gate speeds up to 100GHz could allow quantum computers to solve the PoW about 100 times faster than current technology.
However, such a development is unlikely in the next decade, at which point classical hardware may be much faster, and quantum technology might be so widespread that no single quantum enabled agent could dominate the PoW problem." The real point of vulnerability is this: attacks on signatures wherein the private key is derived from the public key. That means that if someone has your public key, they can also calculate your private key - something which is unthinkable using even today's most powerful classical computers. So in the days of quantum computers, the public-private keypair will be the weak link. Quantum computers have the potential to perform specific kinds of calculations significantly faster than any normal computer. Besides that, quantum computers can run algorithms that take fewer steps to get to an outcome, taking advantage of quantum phenomena like quantum entanglement and quantum superposition. So quantum computers can run certain algorithms that could be used to crack the cryptography used today. https://en.wikipedia.org/wiki/Elliptic-curve_cryptography#Quantum_computing_attacks and https://eprint.iacr.org/2017/598.pdf Most blockchains use the Elliptic Curve Digital Signature Algorithm (ECDSA). Using a quantum computer, Shor's algorithm can be used to break ECDSA. (See for reference: https://arxiv.org/abs/quant-ph/0301141 and pdf: https://arxiv.org/pdf/quant-ph/0301141.pdf ) Meaning: it can derive the private key from the public key. So if someone gets your public key (and a quantum computer), they have your private key, and they can create a transaction and empty your wallet. RSA has the same vulnerability, although breaking RSA requires a more powerful quantum computer than breaking ECDSA. At this point in time, it is already possible to run Shor's algorithm on a quantum computer. However, the number of qubits available right now makes its application limited.
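To make "deriving the private key from the public key" concrete, here is a toy sketch (illustrative only: a tiny multiplicative group stands in for a real elliptic curve, and all names are mine). The public key is produced from the private key by a one-way operation; recovering the private key is a discrete-logarithm problem, infeasible classically at real key sizes but efficiently solvable by Shor's algorithm on a sufficiently large quantum computer.

```python
# Toy discrete-log demo (NOT real cryptography). Real schemes like ECDSA
# use elliptic-curve groups of size ~2^256; here the group is tiny so the
# classical brute-force attack actually finishes.

P = 2**13 - 1   # small prime modulus (8191)
G = 17          # base element

def derive_public(priv: int) -> int:
    # Fast in the forward direction, even for huge exponents.
    return pow(G, priv, P)

def brute_force_private(pub: int) -> int:
    # Classical attack: exhaustive search, O(P) work. At ~2^256 candidates
    # this is hopeless -- which is exactly what Shor's algorithm changes.
    for guess in range(P):
        if pow(G, guess, P) == pub:
            return guess
    raise ValueError("no discrete log found")

priv = 4242
pub = derive_public(priv)
recovered = brute_force_private(pub)   # equivalent to priv (mod the order of G)
```

The asymmetry between the two functions is the whole point: as long as only brute force exists, publishing `pub` is safe; once an efficient inversion exists, every exposed public key becomes a liability.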
But it has been proven to work, we have exited the era of pure theory and entered the era of practical applications:
2001: First execution of Shor's algorithm at IBM's Almaden Research Center and Stanford University. The paper here: (Experimental realization of Shor's quantum factoring algorithm using nuclear magnetic resonance Lieven M. K. Vandersypen, https://arxiv.org/abs/quant-ph/0112176 )
So far Shor's algorithm has the most potential, but new algorithms might appear which are more efficient. Algorithms are another area where progress is being made, pushing quantum computing forward. A new algorithm called Variational Quantum Factoring is being developed and it looks quite promising. "The advantage of this new approach is that it is much less sensitive to error, does not require massive error correction, and consumes far fewer resources than would be needed with Shor's algorithm. As such, it may be more amenable for use with the current NISQ (Noisy Intermediate Scale Quantum) computers that will be available in the near and medium term." https://quantumcomputingreport.com/news/zapata-develops-potential-alternative-to-shors-factoring-algorithm-for-nisq-quantum-computers/ It is however still in development, and only works for 18 binary bits at the time of this writing, but it shows new developments that could mean that, rather than a speedup in quantum computing development posing the most imminent threat to RSA and ECDSA, a speedup in the mathematical developments could be even more consequential. More info on VQF here: https://arxiv.org/abs/1808.08927 It all comes down to this: when your public key is visible, which is always necessary to make transactions, you are at some point in the future vulnerable to quantum attacks. (This also goes for BTC, which uses the hash of the public key as an address, but more on that in the following articles.) If you had keypairs based on post quantum cryptography, you would not have to worry about that, since in that case not even a quantum computer could derive your private key from your public key. The conclusion is that future blockchains should be quantum resistant, using post-quantum cryptography. It's very important to realize that post quantum cryptography is not just adding some extra characters to standard signature schemes.
It's the mathematical concept that makes it quantum resistant. To become quantum resistant, the algorithm needs to be changed. "The problem with currently popular algorithms is that their security relies on one of three hard mathematical problems: the integer factorization problem, the discrete logarithm problem or the elliptic-curve discrete logarithm problem. All of these problems can be easily solved on a sufficiently powerful quantum computer running Shor's algorithm. Even though current, publicly known, experimental quantum computers lack processing power to break any real cryptographic algorithm, many cryptographers are designing new algorithms to prepare for a time when quantum computing becomes a threat." https://en.wikipedia.org/wiki/Post-quantum_cryptography

Expectations in the field of quantum computer development.

To give you an idea of what the expectations of quantum computer development are in the field (take note of the fact that the type and error rate of the qubits is not specified in the article; it is not said these will be enough to break ECDSA or RSA, neither is it said these will not be enough - what these articles do show is that a huge speed-up in development is expected):
When will ECDSA be at risk? Estimates are only estimates; there are several to be found, so it's hard to really tell. The National Academy of Sciences (NAS) has made a very thorough report on the development of quantum computing. The report came out at the end of 2018. They brought together a group of over 70 scientists from different interconnecting fields in quantum computing who, as a group, have come up with a report of close to 200 pages on the development, funding, implications and upcoming challenges for quantum computing development. But, even though this report is one of the most thorough to date, it doesn't make an estimate on when the risk for ECDSA or RSA would occur. They acknowledge this is quite impossible due to the fact that there are a lot of unknowns, and due to the fact that they have to base any findings only on publicly available information, obviously excluding any non-public advancements from commercial companies and national efforts. So if this group of specialized scientists can't make an estimate, who can make that assessment? Is there any credible source to make an accurate prediction? The conclusion at this point of time can only be that we do not know the answer to the big question "when". Now if we don't have an answer to the question "when", then why act? The answer is simple. If we're talking about security, most take certainty over uncertainty. To answer the question of when the threat materializes, we need to guess. Whether you guess soon, or you guess not for the next three decades, both are guesses. Going for certainty means you'd have to plan for the worst and hope for the best. No matter how sceptical you are, having some sort of a plan ready is the responsible thing to do. Obviously not if you're just running a blog about knitting. But for systems that carry a lot of important, private and valuable information, planning starts today. The NAS describes it quite well. What they lack in guessing, they make up for in advice.
Their advice is very clear:
"Even if a quantum computer that can decrypt current cryptographic ciphers is more than a decade off, the hazard of such a machine is high enough—and the time frame for transitioning to a new security protocol is sufficiently long and uncertain—that prioritization of the development, standardization, and deployment of post-quantum cryptography is critical for minimizing the chance of a potential security and privacy disaster."
Another organization that looks ahead is the National Security Agency (NSA). They made a threat assessment in 2015. In August 2015, the NSA announced that it is planning to transition "in the not too distant future" (statement of 2015) to a new cipher suite that is resistant to quantum attacks. "Unfortunately, the growth of elliptic curve use has bumped up against the fact of continued progress in the research on quantum computing, necessitating a re-evaluation of our cryptographic strategy." The NSA advised: "For those partners and vendors that have not yet made the transition to Suite B algorithms, we recommend not making a significant expenditure to do so at this point but instead to prepare for the upcoming quantum resistant algorithm transition." https://en.wikipedia.org/wiki/NSA_Suite_B_Cryptography#cite_note-nsa-suite-b-1 What both organizations advise is to start taking action. They don't say "implement this type of quantum resistant cryptography now". They don't say when at all. As said before, the "when" question is a hard one to specify. It depends on the system you have, the value of the data, and the consequences of postponing a security upgrade. Like I said before: do you just run a blog, or a bank, or a cryptocurrency? It's an individual risk assessment that's different for every organization and system. Assessments do need to be made now though. What time frame should organisations think about when changing cryptography? How long would it take to go from the current level of security to fully quantum resistant security? What changes does it require to handle bigger signatures, and is it possible to use certain types of cryptography that require keeping state? Do your users need to act, or can all work be done behind the user interface? These are important questions that one should start asking. I will elaborate on these challenges in the next articles.
Besides the unanswered question of "when", the question of what type of quantum resistant cryptography to use is unanswered too. This also depends on the type of system you use. The NSA and NAS both point to NIST as the authority on developments and standardization of quantum resistant cryptography. NIST is running a competition right now that should end up in one or more standards for quantum resistant cryptography. The NIST competition uses criteria that should filter out a type of quantum resistant cryptography that is feasible for a wide range of systems. This takes time though. There are some new algorithms submitted, and assessing the new ones as well as the more well known ones must be done thoroughly. They intend to wrap things up around 2022 - 2024. From a blockchain perspective it is important to notice that a specific type of quantum resistant cryptography is excluded from the NIST competition: Stateful Hash-Based Signatures (LMS and XMSS). This is not because these are no good. In fact they are excellent, and XMSS is accepted to be provably quantum resistant. It's due to the fact that implementations will need to be able to securely deal with the requirement to keep state, and this is not a given for most systems. At this moment NIST intends to approve both LMS and XMSS for a specific group of applications that can deal with the stateful properties. The only loose end at this point is advice on which applications LMS and XMSS will be recommended for, and for which applications they are discouraged. These questions will be answered in the beginning of April this year: https://csrc.nist.gov/news/2019/stateful-hbs-request-for-public-comments This means that quite likely LMS and XMSS will be the first type of standardized quantum resistant cryptography ever. To give a small hint: keeping state is pretty much a naturally added property of blockchain.
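To give a feel for what a hash-based signature looks like, here is a minimal sketch of a Lamport one-time signature, the kind of building block that stateful schemes like LMS and XMSS manage at scale (illustrative only; the function names are mine). Its security rests solely on the preimage resistance of the hash function, and the reason "state" matters is visible directly in the code: each key pair may sign exactly one message.

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Two random 32-byte secrets per bit of the 256-bit message hash;
    # the public key is the hash of every secret.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    # Reveal one secret per message-hash bit. The key pair is now spent:
    # signing a second message would reveal enough secrets to enable
    # forgeries. THIS is the state a stateful scheme must track.
    return [sk[i][b] for i, b in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    # Each revealed secret must hash to the matching public-key entry.
    return all(H(s) == pk[i][b]
               for (i, b), s in zip(enumerate(msg_bits(msg)), sig))
```

XMSS and LMS build a Merkle tree over many such one-time key pairs, so a single public key can sign many messages, at the cost of having to remember which one-time keys have already been used.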
Quantum resistant blockchains

"Quantum resistant" is only used to describe networks and cryptography that are secure against any attack by a quantum computer of any size, in the sense that there is no algorithm known that makes it possible for a quantum computer to break the applied cryptography and thus that system. Also, to determine if a project is fully quantum resistant, you would need to take into account not only how a separate element that is implemented in that blockchain is quantum resistant, but also the way it is implemented. As with any type of security check, there should be no backdoors, in which case your blockchain would be just a cardboard box with bulletproof glass windows. Sounds obvious, but since this is kind of new territory, there are still some misconceptions. What is considered safe now, might not be safe in the age of quantum computers. I will address some of these in the following chapters, but first I will elaborate a bit on the special vulnerability of blockchain compared to centralized systems.

Why is it easier to change cryptography for centralized systems such as banks and websites than for blockchain?

Developers of a centralized system can decide from one day to the next that they make changes and update the system without the need for consensus from the nodes. They are in charge, and they can dictate the future of the system. But a decentralized blockchain will need to reach consensus amongst the nodes to update. Meaning that the majority of the nodes will need to upgrade and thus force the blockchain to only accept the new signatures as valid. We can't have the old signature scheme remain valid alongside the new quantum resistant signature scheme, because that would mean that the blockchain would still allow the use of vulnerable, old public and private keys, and thus the old vulnerable signatures for transactions.
So at least the majority of the nodes need to upgrade to make sure that blocks which are constructed using the old rules, and thus the old vulnerable signature scheme, are rejected by the network. This will eventually result in a fully upgraded network which only accepts the new post quantum signature scheme in transactions. So, consensus is needed. The most well-known example of how that can be a slow process is Bitcoin's need to scale. Even though everybody agrees on the need for a certain result, reaching consensus amongst the community on how to get to that result is a slow and political process. Going quantum resistant will be no different, and since it will mean lower performance due to bigger signatures, and quite likely hardware upgrades, it will probably be postponed rather than done fast and smoothly, due to lack of consensus. And because there are several quantum resistant signature schemes to choose from, agreement is not an automatic given. The discussion will be which one to use, and how and when to implement it. The need for consensus is exclusively a problem decentralized systems like blockchain will face. Another issue for decentralized systems that change their signature scheme is that users of decentralized blockchains will have to manually migrate their coins/tokens to a quantum safe address, and that way decouple their old private key and activate a new quantum resistant private key that is part of an upgraded quantum resistant network. Users of centralized networks, on the other hand, do not need to do much, since it would be taken care of by their centrally managed system. As you know, for example, if you forget the password of your online bank account, or some website, they can always send you a link, or a secret question, or in the worst case they can send you mail by post to your house address, and you would be back in business. With decentralized systems, there is no centralized entity who has your data.
It is you who has this data, and only you. So in the centralized system there is a central entity who has access to all the data, including all the private access data, and therefore this entity can pull all the strings. It can all be done behind your user interface, and you probably wouldn't notice a thing. And a third issue will be the lost addresses. Since no one but you has access to your funds, your funds will become inaccessible once you lose your private key. From that point, an address is lost, and the funds on that address can never be moved. So after an upgrade, those funds will never be moved to a quantum resistant address, and thus will always be vulnerable to a quantum hack. To summarize: banks and websites are centralized systems; they will face challenges, but decentralized systems like blockchain will face some extra challenges that won't apply to centralized systems.
Updating the signature scheme will need consensus in the sense that all nodes need to update after implementation of a quantum resistant signature scheme.
Users of blockchain will personally need to move their funds from old addresses to new quantum resistant addresses. You won't need to move your bank funds.
Lost addresses where people lost access to their funds will never be moved and will stay vulnerable to quantum hacks. A blockchain doesn't know its users, can't communicate with them, and won't be able to distinguish coins on lost addresses from coins of users who still have access but somehow have not migrated their coins after a quantum resistant update. So burning lost coins will be a big legal issue.
The sender prepares a header and appends a counter value initialized to a random number. It then computes the 160-bit SHA-1 hash of the header. If the first 20 bits (i.e. the 5 most significant hex digits) of the hash are all zeros, then this is an acceptable header.
They aren't even using the same hashing algorithm. Similar method, but not identical. More from the Wikipedia page:
While hashcash uses the SHA-1 hash and requires the first 20 of 160 hash bits to be zero, bitcoin's proof of work uses two successive SHA-256 hashes and originally required at least the first 32 of 256 hash bits to be zero.
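The difference the Wikipedia excerpt describes can be written out directly (an illustrative sketch; real headers have a specific binary layout, and Bitcoin actually compares against a full numeric target rather than just counting leading zero bits):

```python
import hashlib

def hashcash_ok(header: bytes, zero_bits: int = 20) -> bool:
    # Hashcash: one SHA-1 pass, first `zero_bits` of the 160 hash bits zero.
    value = int.from_bytes(hashlib.sha1(header).digest(), "big")
    return value >> (160 - zero_bits) == 0

def bitcoin_ok(header: bytes, zero_bits: int = 32) -> bool:
    # Bitcoin: two successive SHA-256 hashes, originally at least the
    # first 32 of the 256 hash bits zero.
    inner = hashlib.sha256(header).digest()
    value = int.from_bytes(hashlib.sha256(inner).digest(), "big")
    return value >> (256 - zero_bits) == 0
```

Same "leading zeros" method, but different hash functions and a different number of required zero bits, so one scheme's solutions are useless for the other.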
The fact Back claims on his Twitter bio that his hashcash is used in Bitcoin Proof-of-Work is fraudulent. As of the time of this post, Back's Twitter profile reads as:
I've always been skeptical of the value of bitcoin. I've been watching bitcoin for a long time, since back when you could buy 10000 bitcoins for like a dollar. Mainly because as a developer I would see it pop up in my news feed constantly. I thought it seemed like a novel idea, and as a reader of Cryptonomicon it kinda gave me a warm fuzzy to see it take off as a reality. But that was where my warm fuzzies ended. My first thought was it'll be a year before this gets shut down by the SEC or embroiled in some sort of lawsuits from the IRS. I was surprised when nothing happened, but overall it seemed to just go on. I heard about people using it on the dark web and that made sense; this was even mentioned in Cryptonomicon - in any frontier, the forerunners are often the criminals. Eventually though it saw some legitimacy: some websites started accepting it as currency, and the value started to rise, meeting parity with the dollar. I started to read some of the stuff about it. What is a bitcoin? Well, that's when I came across my first problem with how bitcoin works. Bitcoin mining is built on a concept called Proof of Work, from the Bitcoin wiki:
A proof of work is a piece of data which is difficult (costly, time-consuming) to produce but easy for others to verify and which satisfies certain requirements. Producing a proof of work can be a random process with low probability so that a lot of trial and error is required on average before a valid proof of work is generated. Bitcoin uses the Hashcash proof of work system.
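The trial-and-error process that quote describes fits in a few lines (a toy with a deliberately tiny difficulty so it finishes instantly; the names are mine):

```python
import hashlib

def pow_hash(data: bytes, nonce: int) -> int:
    # Bitcoin-style double SHA-256, interpreted as a big integer.
    inner = hashlib.sha256(data + str(nonce).encode()).digest()
    return int.from_bytes(hashlib.sha256(inner).digest(), "big")

def mine(data: bytes, zero_bits: int = 12) -> int:
    # Costly to produce: keep trying nonces until the hash starts with
    # `zero_bits` zero bits (~2**zero_bits attempts on average).
    nonce = 0
    while pow_hash(data, nonce) >> (256 - zero_bits) != 0:
        nonce += 1
    return nonce

def valid(data: bytes, nonce: int, zero_bits: int = 12) -> bool:
    # Easy for others to verify: a single hash check.
    return pow_hash(data, nonce) >> (256 - zero_bits) == 0

nonce = mine(b"example block")   # ~4096 hashes on average at 12 zero bits
```

The asymmetry is the whole design: finding the nonce takes thousands of hashes, checking it takes one.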
But is that data valuable? Well... no. The value of bitcoin is spent cpu cycles. If we really break it down the value of bitcoin is entropy. That's what people are trading. Bitcoin is a commodity currency where the commodity is... literally entropy. I thought surely people will realize that the value of this is nothing but spent energy. But the price kept rising. Now these days you see people defending the idea of bitcoin. When you point out any sort of flaw in how bitcoin is built you get back unintelligible answers that basically break down to, "The ontology of this ontology is the ontology of this ontology." I'm kind of fed up with listening to these people yammer nonsensically about what a great thing bitcoin is. Is anyone else experiencing the same thing?
The evolution of Distributed Ledger Technologies - Part 1
Really understanding something means looking at how it was created and how it has evolved. Blockchain technology was not created out of nowhere, or overnight by an anonymous crazy inventor called Satoshi Nakamoto, as some may believe. It was the outcome of collective human innovation through a very strange set of circumstances that laid the foundation for a new decentralized movement and a new and better concept of money. To grasp the origin of Bitcoin and Distributed Ledger Technologies, or plainly "Blockchain" in modern online literature, one has to look at the history and the combined influence of four elements: Cryptography, Open Source Software, Peer-to-Peer Sharing Networks, and Crypto-Economics.
Part 1 - Introduction to Cryptography
Cryptography is about solving the problem of transmitting information quickly, securely and covertly to an intended audience. The problem grew as new technology increased the potential of communication, and with it the danger of information being stolen. In the 1930s and during World War II, encryption and cryptography boomed as a result of military research and development, which provided a competitive advantage and eventually helped break almost every German and Japanese code. Formal information security and electronic surveillance organizations, such as the NSA, were born then and continue to this day. (Pictured: a military Enigma machine, model "Enigma I", used during the late 1930s and during the war; displayed at the Museo scienza e tecnologia, Milan, Italy.) Pioneering cryptographers James Ellis and Clifford Cocks conceived the idea of public key encryption, in which a publicly shared key lets anyone encrypt a message that only the holder of the matching private key can unlock. The idea was not yet feasible at the time, however, as it needed a public communications network such as the internet as a foundation, and such systems were not available to the public in the 1970s. Additionally, David Chaum was the first to propose a cryptocurrency, in 1983, in a paper called "Numbers can be a better form of cash than paper", along with other ideas like untraceable electronic mail, digital signatures and digital pseudonymous identities.
The Rise of the Cypherpunks
With the emergence of the internet, by the early 1990s a new movement called the Cypherpunks was born. These people wanted to use the encryption tools developed by the military-industrial complex to protect individuals and their privacy. In early 1991, proposed U.S. Senate legislation would have forced electronic communications service providers to hand over individuals' private messages. A little-known programmer called Phil Zimmermann decided to develop a tool that would help individuals communicate freely on the internet. Concerned that the American government would soon require service providers to turn over their users' communications, Phil developed the free software known as Pretty Good Privacy, or PGP, so that individuals could encrypt the contents of their own messages, texts and files. PGP quickly became the world's most popular email-encryption software and one of the world's first examples of public key encryption to gain any kind of widespread adoption. It was notably used by Edward Snowden in secretly transferring classified NSA documents to journalist Glenn Greenwald in 2013. (Pictured: NSA whistle-blower Edward Snowden in a still image taken from video during an interview.) In late 1992, Eric Hughes, Tim May and John Gilmore invited twenty of their closest friends to an informal meeting to discuss programming and cryptographic issues. The meeting was then held monthly at John Gilmore's company, Cygnus Solutions, and as the group grew they set up a mailing list to reach people elsewhere; the Cypherpunks were soon growing in numbers. The ideas and concepts shared on this mailing list ranged from cryptography, mathematics and computer science to political and philosophical debate, with privacy as one of the main founding principles.
“Privacy is necessary for an open society in the electronic age. Privacy is not secrecy. A private matter is something one doesn’t want the whole world to know, but a secret matter is something one doesn’t want anybody to know. Privacy is the power to selectively reveal oneself to the world.” (Eric Hughes, A Cypherpunk’s Manifesto, 1993)
An early attempt at an anonymous transaction system that introduced game theory and incentivised behaviour was Hashcash, proposed in 1997 by Dr. Adam Back: a system to prove that some computational power was spent to create a stamp in the header of an email, acting as an anti-spam mechanism, a concept that should sound familiar from the proof of work used in Bitcoin.

In 1998, Wei Dai published his proposal for B-Money, which included two methods of maintaining transaction data: one in which all participants hold a separate database or ledger, and a second in which only a specific group holds the database and is incentivized to act honestly because its members have deposited their own money into a special account and stand to lose it by acting dishonestly, also known as the "Proof of Stake" method. Ethereum is one of the cryptocurrencies considering a move to this method of transaction verification, since it provides efficiency benefits.

In 2004, Hal Finney created Reusable Proofs of Work based on the principles of Hashcash: unique cryptographic tokens you could only spend once, though validation and protection against double spending still relied on a central server. In 2005, Nick Szabo put forward his own proposal, BitGold, a system whose units would be valued differently based upon the amount of computational work performed to create them.

Finally, in 2008, Satoshi Nakamoto, a pseudonym for a still-unidentified individual or group, published the Bitcoin whitepaper, citing both Hashcash and B-Money and addressing many of the problems that the earlier designers had faced, including double spending. The whitepaper attracted plenty of criticism from sceptics, but Satoshi pressed on and mined the genesis block of Bitcoin on the 3rd of January 2009.
I think that’s enough condensed knowledge for one article. In the following article we’ll look at Open Source Software and study its influence in the development of Blockchain Technologies.
Adam Back, Greg Maxwell, and Pieter Wuille will be conducting an AMA about sidechains/Blockstream Thursday, October 23, 2014 at 9:00 AM PDT (16:00 UTC)
In case you're still looking for the AMA, it is here. Very exciting news just came in mod mail. Greg and Pieter are Bitcoin long-time core devs, and Adam Back proposed Hashcash, the proof-of-work system that was cited in Satoshi's whitepaper. Blockstream just published their sidechains whitepaper today.
We are long-time Bitcoin protocol developers who are convinced that finding an architecturally sound and permissionless way to extend Bitcoin is essential for cryptocurrency to reach its full potential. Bitcoin has inspired people since its inception five years ago as a new kind of money in digital form that exists independent of any government or institution. Bitcoin enables financial transactions without needing to trust any third party.
Read the rest of their statement here. Really interesting stuff. The only hitch is we don't yet know where they plan on posting their AMA. This post will be updated as soon as possible. Check your local time zone. Don't miss it!
Abstract

We propose a proof of work protocol that computes the discrete logarithm of an element in a cyclic group. Individual provers generating proofs of work perform a distributed version of the Pollard rho algorithm. Such a protocol could capture the computational power expended to construct proof-of-work-based blockchains for a more useful purpose, as well as incentivize advances in hardware, software, or algorithms for an important cryptographic problem. We describe our proposed construction and elaborate on challenges and potential trade-offs that arise in designing a practical proof of work.

References
Park, S., Kwon, A., Fuchsbauer, G., Gaži, P., Alwen, J., Pietrzak, K.: SpaceMint: A cryptocurrency based on proofs of space. In: FC’18. Springer (2018)
Back, A.: Hashcash-a denial of service counter-measure (2002)
Ball, M., Rosen, A., Sabin, M., Vasudevan, P.N.: Proofs of work from worst-case assumptions. In: CRYPTO 2018. Springer International Publishing (2018)
Barbulescu, R., Gaudry, P., Joux, A., Thomé, E.: A heuristic quasi-polynomial algorithm for discrete logarithm in finite fields of small characteristic. In: EUROCRYPT’14 (2014)
Barker, E., Chen, L., Roginsky, A., Vassilev, A., Davis, R.: SP 800-56A Revision 3. Recommendation for pair-wise key establishment schemes using discrete logarithm cryptography. National Institute of Standards & Technology (2018)
Biryukov, A., Pustogarov, I.: Proof-of-work as anonymous micropayment: Rewarding a Tor relay. In: FC’15. Springer (2015)
Bitansky, N., Canetti, R., Chiesa, A., Goldwasser, S., Lin, H., Rubinstein, A., Tromer, E.: The hunting of the SNARK. Journal of Cryptology 30(4) (2017)
Boneh, D., Bonneau, J., Bünz, B., Fisch, B.: Verifiable delay functions. In: Annual International Cryptology Conference. pp. 757–788. Springer (2018)
Bos, J.W., Kaihara, M.E., Kleinjung, T., Lenstra, A.K., Montgomery, P.L.: Solving a 112-bit prime elliptic curve discrete logarithm problem on game consoles using sloppy reduction. International Journal of Applied Cryptography 2(3) (2012)
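For intuition, here is a toy, single-machine sketch of the Pollard rho walk that the abstract proposes to distribute across provers. The group parameters below are illustrative (a real instance would use a cryptographically large group), and the brute-force fallback exists only so the demo always terminates:

```python
import random
from math import gcd

def dlog_rho(g: int, h: int, p: int, n: int, attempts: int = 50) -> int:
    """Solve g^x = h (mod p), where g has order n, via Pollard's rho walk."""
    def step(x, a, b):
        # Partition group elements into three classes to drive a pseudo-random
        # walk; track exponents so x == g^a * h^b (mod p) at every step.
        if x % 3 == 0:
            return (x * x) % p, (2 * a) % n, (2 * b) % n
        if x % 3 == 1:
            return (x * g) % p, (a + 1) % n, b
        return (x * h) % p, a, (b + 1) % n

    for _ in range(attempts):
        a = random.randrange(n)
        b = random.randrange(n)
        x = (pow(g, a, p) * pow(h, b, p)) % p
        X, A, B = x, a, b
        for _ in range(2 * p):  # Floyd cycle detection: tortoise and hare
            x, a, b = step(x, a, b)
            X, A, B = step(*step(X, A, B))
            if x == X:
                break
        # Collision: g^a * h^b == g^A * h^B, so x*(b - B) == (A - a) (mod n).
        r = (b - B) % n
        d = gcd(r, n)
        if r == 0 or (A - a) % d:
            continue  # degenerate collision; restart with a fresh random walk
        m = n // d
        x0 = ((A - a) // d * pow(r // d, -1, m)) % m
        for k in range(d):  # d candidate solutions; verify each one
            cand = (x0 + k * m) % n
            if pow(g, cand, p) == h:
                return cand
    # Fallback for tiny demo groups so the sketch always returns an answer.
    for cand in range(n):
        if pow(g, cand, p) == h:
            return cand
    raise ValueError("no discrete log exists")

# 2^10 = 1024 = 1019 + 5, so the log of 5 base 2 mod 1019 is 10.
print(dlog_rho(2, 5, 1019, 1018))
```

On this toy instance the walk recovers x = 10; the paper's contribution is coordinating many provers on one such walk so that hashes otherwise wasted on blockchain consensus become useful discrete-log work.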
Afternoon, All. Today marks the eighth anniversary of the publication of the Bitcoin white paper. As a special tribute, I will provide you with a short story on the origins of the Bitcoin tech. I've been out of the game for many years, however now I find myself drawn back - in part due to the energy that's being added by the incumbents, in part due to information that's become public over the past year. I haven't followed the Bitcoin and alt coin tech for the past five or six years. I left about six months before (2). My last communication with (2) was five years ago which ended in my obliteration of all development emails and long-term exile. Every mention of Bitcoin made me turn the page, change the channel, click away - due to a painful knot of fear in my belly at the very mention of the tech. As my old memories come back I'm jotting them down so that a roughly decent book on the original Bitcoin development may be created. The following are a few of these notes. This is still in early draft form so expect the layout and flow to be cleaned up over time. Also be aware that the initial release of the Bitcoin white paper and code was what we had cut down to from earlier ideas. This means that some of the ideas below will not correspond to what would end up being made public. Bitcoin Logo BitCoin Origins Six Months In A Leaky Boat Introduction I have always found that there’s a vast gulf between knowledge and understanding. Wherever I looked I’ve found very intelligent folks who had immense knowledge in their subject but with little understanding of what to do with it, how to mould it, how to create something new. They could only ever iterate incrementally to improve the knowledge in their given field. Understanding comes from experiences outside of knowledge in a particular subject. The following story is about a most unique project and the understanding that was used and applied to the e-cash problem which resulted in the experiment called Bitcoin. 
It is to show the thought process, stream of consciousness, arguments, examples, concerns and fears that went through our minds as we tussled with this beast and hammered out something that may actually work. There is no verification of truth here. There is absolutely no evidential proof that I had any part in the project. All evidence was purged in late 2011 - the reason will become apparent. Only (2) should know of my involvement (until now). Take this as just a fictional story if you wish. Who am I? I went by the 'net handle Scronty back then. scrontsoft.com I have always been interested in computer and electronic technology, since the age of eleven: seeing what others had made these machines do, and then trying to push it a little bit further out. Whenever there was a problem to be figured out I would always begin with what the current state of knowledge was - after all, we all stand on the shoulders of those who have gone before. Quite often I found that the assumptions folks hold for a particular problem are the things holding them back from figuring out a new solution. So I would begin by questioning people's basic assumptions on various subjects:
“What if that wasn’t true ?"
"If it didn’t exist what could it be replaced with ?”
This usually resulted in annoying all of these knowledgeable folks.
“That’s the way it’s always been”
“That’s the best industry standard for this”
“All the letters after my name means I’m right and you’re wrong”
“That’s what’s written in this book I’m holding”
“Everyone quotes from this person so he must be right - so I quote from him as well”
You get the idea. You see it on every single message board from the mid-nineties onwards. There are also a lot of egotistical chips on folks' shoulders: you'd find they'd look down on others and belittle them over topics they themselves had only just learned a few weeks earlier. This is particularly true in programming and crypto forums. Start A couple of guys worked with an online betting company. They had a problem. For punters to use their service they had to provide credit card details and pay for chip tokens. However, many times a punter would play the online pokies, lose all of their money and then reverse the credit card charge, saying "It's unauthorised. It wasn't me". Sometimes the company's network would not record the funds transfer correctly, so the punter's funds were removed from their credit account into the company's account but no record of it was made on the company's end - so the punter didn't receive any play tokens and, again, tried to reverse the charges. The large credit card issuers also actively stopped allowing credit cards to be used for online gambling and began refusing to reverse the charges. What these guys needed was a way to transfer funds between punters and the online betting companies such that both parties could trust everything was above board: a payment could not be made by mistake, and once a payment went through it was unchangeable, irreversible. (2) had been on the periphery of the cypherpunks group since the mid-1990s. When I entered the project in early 2008 he had been working on the problem part-time for the past five years, and over the previous year or so he'd been working on it full-time. He was writing a white paper for an e-cash system for the online betting/gambling company to use (or to license out to multiple companies) plus writing the code for it. He was attempting to implement a working example of electronic cash.
There were other cryptographers he was communicating with, however it just wouldn't "work". There were always too many attack vectors in the solution, and even though, from a cryptographic point of view, the white paper and code were appropriate, he found it unsatisfactory. After talking to his friend (3) it was decided that maybe they had their noses too close to the grindstone and should find someone who wasn't a cryptographer to look over the ideas. The problem is that such a person is very difficult to find. He'd have to be smart enough to understand cryptography (or learn it), and interested in the subject, but not currently a cryptographer. Usually the folks who were smart enough and had an interest were already cryptographers. Through various IRC (Internet Relay Chat) channels (3) came across me and I ended up being put in touch with (2). With my work in the Win32 Asm community I'd shown I was smart enough and could figure out solutions to difficult problems. Plus I'd made sure my public profile only ever dealt with grey-to-white topics (no online gambling stuff). Request For Help I was asked to look over what had been written in the white paper and see what needed to be changed, as the code implementing it just wasn't working - the pieces wouldn't fit together, or the whole thing would fail if certain pre-conditions in the network weren't met. (2) wanted to publish the white paper before the end of the year (2008). I began reading through the document - understanding very little. Hashing and encrypting and decrypting and private keys and public keys. Different types of hashing algorithms, encrypting then hashing and hashing then encrypting. Oh my! "Just tell me what I need to change to make it work", (2) kept asking me. "I dunno what the [redacted] I'm reading here", I replied. (2) thought that maybe he'd made a mistake and he'd just try to find someone else. I told him that he was going about fixing it the wrong way.
“How should it be fixed ?”, he asked. “Well, first I need to know what I’m reading. So you’re going to have to give me info on the various crypto stuff in here”, I said. “No no no”, he said. “If you learn the meaning of the cryptographic jargon you will be influenced by it and would no longer be the “non-cryptographer” that we need to look over the white paper”. I told him that without learning the jargon I couldn't read the paper in the first place. Also, as I learned I would understand more and would be able to tell him what needed to change. If or when it got to the stage where I'd learned too much and also had my nose too close to the grindstone, I could leave the project and he could find someone else to replace me. He agreed that having me learn a bit about cryptography might be a good idea (:roll-eyes:). He told me to get started. I asked where the information was. He said “Google it”. I said “Nope. You’ve been working in this area for the past few years, so you can give me links to the websites with the info.” He returned with a list of website links and said to go through it and look at the white paper. The list had about 109 links in it - bloody [redacted]. One by one I began going through the information. After a few weeks I’d gone through about half a dozen papers/websites, which hadn’t cleared up anything. Once three or four weeks had gone by I threw my hands up in disgust and told him “At this rate I’ll be here all year and still not understand all the pieces. You’ve got to filter this down for me. You’ve already read all of these documents and websites, so give me a list of the most important ones you think would help in understanding your white paper”. He came back with a list of about 23 white papers and websites. “Now list them in the order you think I should read them in”. He came back with a sorted and filtered list of crypto docs and websites. I began reading through them, starting with the first.
Transactions Given a computer network, there had to be transactions sent to a recipient. The initial white paper was pretty much a shuffling of the various cryptographic e-cash white papers of the time. We knew that when someone wanted to send a payment to another person it would have to be transmitted across a network securely. But how to solve the double-spend problem? A piece of physical paper cash can only be in one place at a time - you cannot double-spend a physical currency note. All current electronic cash solutions relied upon a central server to control the allocation of coin and to make sure no coin could be double-spent. But if that server went down, or was inaccessible due to a DDoS attack or government intervention (or someone just tripping over a power cord), then no more money. We knew that a coin would initially be minted somehow. I found most of the methods written in white papers and on websites to be rubbish (personal opinion here; no disrespect to those who wrote those white papers). They either pretended to act as central banks or allowed a "mates club" whereby they all agreed who was going to get coin at a particular time. Kind of like politicians using an "independent" third party to give themselves a pay rise. We knew that a piece of electronic cash would be minted somehow, but once it was minted, how could it be sent to someone else? (2) and I went back and forth with a few ideas, going through the physical process of different transaction types one by one and adjusting what a transaction data package would look like. We began with a single piece of e-cash. Like a piece of gold, you should be able to cut smaller pieces off of it. That means that by starting with one item we'd end up with two: the piece going to the recipient and the change coming back to the original owner. I told (2) that when drawn as a diagram it looks like electronic or computer logic gates.
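That split, one input piece becoming a payment plus change back to the sender, can be sketched as plain data structures. A hedged sketch (the field names and placeholder values are illustrative, not Bitcoin's actual wire format):

```python
from dataclasses import dataclass

@dataclass
class TxInput:
    prev_tx: str     # id of the transaction whose output is being spent
    index: int       # which output of that transaction
    signature: str   # made with the current owner's private key

@dataclass
class TxOutput:
    amount: int      # value in smallest units
    pubkey: str      # the new owner's public key

@dataclass
class Transaction:
    inputs: list     # one or more pieces being spent
    outputs: list    # payment(s) plus any change

# Paying 30 out of a single 100-unit piece: one input, two outputs
# (the payment, and 70 units of change back to the sender's own key).
tx = Transaction(
    inputs=[TxInput(prev_tx="<previous tx id>", index=0, signature="<sig by sender>")],
    outputs=[TxOutput(30, "<recipient pubkey>"), TxOutput(70, "<sender change pubkey>")],
)
assert sum(o.amount for o in tx.outputs) == 100
```

Combining several small pieces to cover a large payment is just more entries in `inputs`; every output either pays someone or returns change, which is why the diagram ends up looking like gates wired into a network.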
Logic Gates Except sometimes there can be more outputs than inputs. And in the end it looks like a neural network. If we had a large piece and were paying that entire amount to someone, then the input and output pieces would be the same. If we had a large piece and were paying a small amount to someone, then the input would be the large piece and the outputs would be the amount being paid plus a small piece as change. As more people are paid, we'd end up with a lot of small pieces in our wallet. If we had a small piece and needed to pay someone a large amount, then we could combine multiple small pieces to equal or exceed the amount to be paid, and refund back to ourselves any change left over. This means a transaction would have to allow multiple inputs and multiple outputs, with each input signed by the current owner's private key and each output naming the new owner's public key. Transaction Types One day he came back to me saying his friend (3) wanted to communicate directly with me, but he was a super-paranoid fella and I had to encrypt any messages using private/public keys. It was a [redacted] nightmare. I had to:
Generate the private/public keys
Make sure the public key was sent to a very specific location so that we could “trust” that the public key was valid
Use this quirky little command line proggy where I included my email address plus a link to the private key
Embed the generated data into the email
This was all so he could confirm that the message was indeed from me and had not been intercepted or changed. Then he decided that I'd also have to generate new private/public keys for every single email, just in case a previous email had been intercepted. I told (2) that this just wasn't going to happen. I've always disliked using command line programs directly and always thought they should be executed from a GUI (Graphical User Interface). I said "You're going to be my filter for this project and the main conduit in this team. I send emails to you, you communicate with whoever you need to and send their replies back to me. Or you send their requests to me and I reply back through you. And what's this annoying command line proggy anyway? What the [redacted] is it doing?" (2) gave me the link to the information - it was in that list of 109 docs/websites but not in the filtered list of 23. It was to Hal's website, where he very clearly explained how something called "Hashcash" worked. Hal's RPOW From there I went on to Adam's site: Hashcash (which was not even in the original list at all). I read the Hashcash white paper sections until I hit the calculations and my eyes began to glaze over. Hashcash I read the first few paragraphs and knew this was something interesting. I asked (2) if he could check whether this document was the final version or if there had been improvements/amendments/updates to it. He said he thought I was wasting my time with this and should continue with the other docs/websites in the list he'd provided. I told him that I was the only one who would know what info was important, and to look into the Hashcash origin for me. He came back a couple of days later and said it was confirmed that the public document linked was the final version of the Hashcash paper. I asked how he could confirm it.
He told me that he'd contacted the original website author, Hal, and asked him for any updated document, and Hal had replied back with the exact same public link. He'd even copy/pasted Hal's reply into the email to me. I said "Wait… What?… You actually contacted the original author of the reference material?" He said "Yep. Who else would I go to to confirm the document, except the author themselves?" I told him it was really quite rare to have someone check with the original author or sources. Most folks read something and take it as fact, or read the reference documents and take those as fact. If someone reads about the Boyer-Moore search algorithm they take it as fact that what they've read is the official final solution. I haven't heard of anyone contacting Boyer or Moore to check for updates/improvements/amendments. (The Boyer-Moore search algorithm is something that went through the rounds on the Win32Asm community forum for a while.) I found this quite intriguing. Even with (2)'s occasionally grating personality, it would be very useful to have someone prepared to hunt down the original authors like this. I asked him if he'd contacted the Hashcash author, and he said he'd sent emails to every single author of all of the websites/white papers and only about a dozen or so had ever replied. I had begun writing up a list of the various problems in creating an e-cash system, drawn from the other e-cash white papers and websites I had been studying. I was still referring back to the white paper (2) had supplied me, however it was really just a mishmash of what everyone else had been doing over the years - hence it failed like all the others. One of the problems was a trusted time stamp, so that folks would know funds hadn't been double-spent. Another was the minting of the tokens in the system and trusting the minting source.
If I recall, practically every single white paper out there (including the one supplied to me) used a trusted third party as the source for a time stamp, with a convoluted method to check it hadn't been tampered with. And the minting either used a trusted third party to generate coins on a regular basis, or had a network of nodes agree on how many tokens to generate and give to each other. (2) said that we needed to use the trusted third parties, because how else could we trust the time stamp and the minting of the tokens? I told him he was thinking of it in the wrong way. You're assuming a trusted third party is needed just because every single other cryptographic white paper says that's how you do it. But you're also saying that you can't rely on a trusted third party, because that makes a single-point attack vector that can bring the whole system to its knees. "Remember Sherlock Holmes," I said. "'When you have eliminated the impossible, whatever remains, however improbable, must be the truth.' The assumption of a trusted third party in a functioning e-cash system must be eliminated as impossible for this to work. So if we cannot have a trusted third party, what are our other options?" "I have no idea," (2) replied. "Do you believe this proof-of-work thing you're looking into can be used for this somehow?" "I dunno. It definitely has some possibilities. It's made for making sure the data being sent and received comes from a known trusted source and hasn't been tampered with." It forces the user's computer to generate hashes of the data, looking for one with a prepended number of zeroes. If a qualifying hash isn't found, it increments a value and hashes again, repeating until a hash with the correct number of prepended zeroes turns up. This means the user's computer has to spend time working on hashes until it finds one, and only then can it stop.
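The loop described above, hash, check for the prepended zeroes, bump the value, hash again, is only a few lines. A sketch in Python (the payload and counter format are illustrative, not Hashcash's actual stamp format):

```python
import hashlib

def find_stamp(payload: str, zeros: int) -> int:
    """Increment a counter until the hex digest starts with `zeros` zero characters."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{payload}:{nonce}".encode()).hexdigest()
        if digest.startswith("0" * zeros):
            return nonce
        nonce += 1

# The throttle: each additional required zero multiplies the expected
# work by 16, so difficulty can be raised as general hardware improves.
stamp = find_stamp("recipient@example.com", zeros=4)
```

Checking a stamp takes a single hash, while producing one at `zeros=4` takes roughly 65,000 attempts on average; that gap is exactly the provable time-spent guarantee the story describes.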
It was designed to eliminate the email spam problem we all have, because a spam-sender would need a lot of computing resources to generate hashes for all the emails sent out (the data that's hashed includes the recipient's email address, so a new hash is required for every single recipient). It also has a throttle, so the difficulty of generating a hash can be increased over time as general computing hardware improves. The minting problem is also sorted: the electricity used in generating a hash can be what mints the e-cash and puts it into circulation. Effectively, the real fiat-currency cost (via electricity consumed) of generating the valid hash is how much e-cash is given to that minter. It also sets what the price of the minted e-cash should be, as there is a direct correlation between a real-world electricity bill and the digital e-cash amount minted. Taking the time used to generate the hash, how much energy the CPU used during the generation (only the time spent on hashing, not other computing resources), and the local electricity costs of the suburb/county/province/state/nation the minter resides within, each minter could have a locally-adjusted e-cash value added to their account. It would mean that someone minting in a country with cheap electricity due to state-subsidised support would receive less e-cash, because less real-world fiat currency was expended in generating the hash. So now we had a mechanism by which this e-cash could work. I'll stop the story here for now and post a follow-up depending upon its reception. The follow-up will contain some details of how the idea of a chain of blocks came about, plus some of the tech that was left out of the initial white paper and public code release (it was, after all, just the first experiment to check whether this tech would actually work).
Bitcoin Origins - part 2 As a side-note: when you read the Bitcoin white paper again, the Introduction, Calculations, Conclusion and References sections were written and edited by (2) and (3). The Transactions, Timestamp Server, Proof-of-Work, Network, Incentive, Reclaiming Disk Space, Simplified Payment Verification, Combining and Splitting Value and Privacy sections were text copy/pasted from emails from me to (2) explaining how each part worked as it was being figured out. I wrote the Abstract text when (2) asked me to write the Introduction; (2) used it as the Abstract section because he found it too terse for an introduction. (2) and (3) edited the entire document, removed any double-spaces, added titles to the various sections and adjusted between 2% and 5% of it for spelling errors and grammar/sentence structure. You can see the original Abstract with double-spacing here: Public Mailing-list Posting There was a huge misunderstanding between us all during the formation of the white paper, which I'll mention next time. Cheers, Phil (Scronty) vu.hn
Hashcash. Bitcoin uses the hashcash proof-of-work function as its mining core. All bitcoin miners, whether CPU, GPU, FPGA or ASIC, are expending their effort creating hashcash proofs-of-work, which act as votes in the blockchain's evolution and validate the blockchain transaction log. The bitcoin blockchain itself is a public ledger that records bitcoin transactions. It is implemented as a chain of blocks, each block containing a hash of the previous block back to the genesis block of the chain, and a network of communicating nodes running bitcoin software maintains it. The primary purpose of mining is to allow Bitcoin nodes to reach a secure, tamper-resistant consensus. Mining is also the mechanism used to introduce bitcoins into the system: miners are paid any transaction fees as well as a "subsidy" of newly created coins.