Factors That Influence The Bitcoin Price Total Bitcoin

[OWL WATCH] Waiting for "IOTA TIME" 30;

Disclaimer: this is my own rough edit of the conversation, so it may contain some misunderstandings.
I root for the continued good spirit and transparency of the IF.
Hans Moog [IF], yesterday 2:45 PM
So why don't we just copy Avalanche? Well that's pretty simple ...
Hans Moog [IF], yesterday 2:47 PM
1. It doesn't scale very well with the number of nodes in the network that have no say in the consensus process but are merely consensus-consuming nodes (i.e. sensors, edge devices and so on). If you assume that the network will never have more than a few thousand nodes then that's fine, but if you want to build a DLT that can cope with millions of devices then it won't work because of the message complexity.
2. If somebody starts spamming conflicts, then the whole network stops confirming transactions and grinds to a halt until the conflict spamming stops. Avalanche thinks this is not a huge problem because an attacker would have to pay fees for spamming conflicts, which means he couldn't do it forever and would at some point run out of funds.
IOTA tries to build a feeless protocol, and a consensus that stops functioning if somebody spams conflicts is really not an option for us.
3. If a medium-sized validator goes offline for whatever reason, then the whole network will again stop confirming transactions, because whenever a query for a node's opinion cannot be answered, nodes reset the counter for consecutive successful voting rounds, which prevents confirmations. Since nodes need to open some ports to be available for queries, it is super easy to DDoS validators and again bring network confirmations to 0.
Hans Moog [IF], yesterday 3:05 PM
4. Avalanche still processes transactions in "chunks/blocks" by only applying them after they have gone through some consensus process (gathered enough successful voting rounds), which means that nodes waste a significant amount of time "waiting" for the next chunk to be finished before the transactions are applied to the ledger state. IOTA tries to streamline this process by decoupling consensus from the booking of transactions using the "parallel reality based ledger state", which means that nodes in IOTA never waste any time "waiting" for decisions to be made. This will give us much higher throughput numbers.
Hans Moog [IF], yesterday 3:11 PM
5. Avalanche has some really severe game-theoretic problems where nodes are incentivized to attach their transactions to the already decided parts of the DAG, because then things like conflict spam won't affect their transactions as badly as the transactions issued by honest nodes. If, however, every node followed this "better" and selfish tip selection mechanism, the network would stop working altogether.
Overall, "being able to stop consensus" might not be too bad, since you can't really do anything truly harmful (i.e. double spend), which is why we might not see these kinds of attacks in the immediate future. But just wait until a few DeFi apps are running on their platform, where smart contracts actually rely on more or less real-time execution. Then there might be some actual financial gains to be made if the contract halts, and we might see a lot of these things appear (including selfish tip selection).
Avalanche is barely a top-100 project, and nobody attacks these kinds of low-value networks unless there is something to be gained from such an attack. Saying that the fact that it's live on mainnet and hasn't been attacked in 3 weeks is proof of its security is completely wrong.
Especially considering that 95% of all stake is controlled by Avalanche itself.
If you control > 50% of the voting power then you essentially control the whole network, and attacks can mostly be ignored.
I guess there is a reason for Avalanche only selling 10% of the token supply to the public, because then some of the named problems are less likely to appear.
Navin Ramachandran [IF], yesterday 3:21 PM
I have to say that wtf's suggestion is pretty condescending to all our researchers. It seems heavy on the troll aspect to suggest that we should ditch all our work because IOTA is only good for industrial adoption. Does wtf actually expect a response to this? Or is this grandstanding?
Hans Moog [IF], yesterday 3:22 PM
The whole argument of "why don't you just use X instead of trying to build a better version" is also a completely idiotic argument. Why did ETH write their own protocol if Bitcoin was already around? Because they saw problems in Bitcoin's approach and tried to improve on it.
Hans Moog [IF], yesterday 3:27 PM
u/Navin Ramachandran [IF] It's like most of his arguments ... remember when he said we should implement colored coins in 2nd-layer smart contracts instead of the base layer because they would be more expressive (i.e. Turing complete), completely disregarding that 2nd-layer smart contracts only really work if you have consensus on data and therefore on state, for which you need the "traceability" of funds to create these kinds of mini-blockchains in the Tangle?
Colored coins "enable" smart contracts, and it wouldn't work the other way round - unless you have a platform that works exactly like ETH, where all nodes validate a single shared execution platform for the smart contracts, which is not really scalable and is exactly what we are trying to solve with our approach.
Navin Ramachandran [IF], yesterday 3:28 PM
Always easier to criticise than to build something yourself. Yet he keeps posting these inflammatory posts.
At this point, is there any doubt whether he is making these comments constructively?
Hans Moog [IF], yesterday 3:43 PM
If he would at least try to understand IOTA's vision ... then maybe he wouldn't have to ask things like "Why don't you just copy a tech that only works with fees?"
Hans Moog [IF], yesterday 4:35 PM
u/Shaar
> I thought this would only be used to 'override' finality, e.g. if there were network splits. But not in normal consensus
That is not correct. Every single transaction gets booked on arrival using the parallel reality based ledger state. If there are conflicts, then we create a "branch" (a container in the ledger state) that represents the perception that this particular double spend will be accepted by consensus. After consensus is reached, the container is simply marked as "accepted" and all transactions associated with this branch are immediately confirmed as well. This allows a node to use all of its computing resources 24/7 without having to wait for any kind of decision to be made, and allows us to scale the throughput to its physical limits. That's the whole idea of the "parallel reality based ledger state": instead of designing a data structure that models the ledger state "after consensus" like everybody else does, it is tailored to model the ledger state "before consensus", and then you just flip a flag to persist your decision. The "resync mechanism" also uses the branches to measure the amount of approval a certain perception of the ledger state receives. So if my own opinion is not in line with what the rest of the network has accepted (i.e. because I was eclipsed or because there was a network split), then I can use the weight of these branches to detect this "being out of sync" and can do another, larger query to re-evaluate my decision. (edited)
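To illustrate the branch idea described above, here is a minimal Python sketch of booking on arrival and confirming a whole branch later by flipping a flag. This is not GoShimmer code; all names and structures are hypothetical.

```python
# Minimal sketch of the "parallel reality" booking idea described above.
# Not GoShimmer code; names and structures are hypothetical.

class Branch:
    def __init__(self, branch_id):
        self.branch_id = branch_id
        self.transactions = []   # transactions booked under this perception of the ledger
        self.accepted = False    # flipped once consensus decides in favour of this branch

class ParallelRealityLedger:
    def __init__(self):
        self.master = Branch("master")   # the non-conflicting reality
        self.conflict_branches = {}      # conflict_id -> Branch

    def book(self, tx, conflict_id=None):
        """Book every transaction immediately on arrival; conflicts get their own branch."""
        if conflict_id is None:
            branch = self.master
        else:
            branch = self.conflict_branches.setdefault(conflict_id, Branch(conflict_id))
        branch.transactions.append(tx)   # no waiting for consensus here

    def accept(self, conflict_id):
        """Consensus picked a winner: mark the branch accepted; its transactions confirm with it."""
        branch = self.conflict_branches[conflict_id]
        branch.accepted = True
        return branch.transactions
```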
> Also, what happens in IOTA if DRNG nodes fall out? Does the network continue if no new random numbers appear for a while? Or will new nodes be added to the DRNG committee sufficiently fast that no one notices?
It's a committee and not just a single DRNG provider. If a few nodes fail, it will still produce random numbers. And even if the whole committee fails, there are fallback RNGs that would be used instead.
Hans Moog [IF], yesterday 4:58 PM
And multiverse doesn't use FPC but only the weight of these branches, in the same way a blockchain uses longest-chain-wins consensus to choose between conflicts. So nodes simply attach their transactions to the transactions they saw first, and if there are conflicts, you simply monitor which version receives more approval and adjust your opinion accordingly.
Hans Moog [IF], yesterday 5:07 PM
We started integrating some of the non-controversial concepts (like the approval reset switch) into FPC and are currently refactoring GoShimmer to support this.
We are also planning to have the big mana holders publish their opinions in the Tangle as public statements, which allows us to measure the rate of approval in a similar way to how multiverse would do it.
So it's starting to converge a bit, but we are still using FPC as a metastability-breaking mechanism.
Once the changes are implemented, it should be pretty easy to simulate and test both approaches in parallel.
Serguei Popov [IF], yesterday 5:53 PM
> So the ask is that we ditch all our work and fork Avalanche because it has not been attacked in the month or so it has been up?
u/Navin Ramachandran [IF] yeah, that's hilarious. Avalanche consensus (at least their WP version) is clearly scientifically unsound.
Hans Moog [IF], yesterday 9:43 PM
u/wtf maybe you should research Avalanche before proposing such a stupid idea
and you will see that what I wrote is actually true
Hans Moog [IF], yesterday 9:44 PM
paying fees is what "protects" them atm
and simply the fact that nobody uses the network for anything of value yet
we can't rely on fees making attack vectors "unattractive"
Serguei Popov [IF], yesterday 10:17 PM
> well (1.) very obviously the metastability problems are not a problem in practice,
Putting "very obviously" before questionable statements very obviously shows that you are seeking a constructive dialogue. (To make metastability work, the adversary needs to more-or-less know the current opinion vectors of most of the honest participants; I don't see why a sufficiently well-connected adversary cannot query enough honest nodes frequently enough to achieve that.)
> (2.) .... you'd need an unpredictable number every few tens/hundreds of milliseconds, but your DRNG can only produce one every O(seconds).
The above assumption (about "every few tens/hundreds of milliseconds") is wrong.
> We've had this discussion before, where you argued that the assumptions in the FPC-BI paper (incl. "all nodes must be known") are not to be taken 100% strictly, and that the results are to be seen more as an indication of overall performance.
Aham, I see. So, unfortunately, all that time I invested into explaining that stuff during our last conversation was for nothing. Again, very briefly: the contents of the FPC-BI paper are not "an indication of overall performance". Rather, they show (to someone who actually read and understood the paper) why the approach is sound and robust, as they make one understand the mechanism that causes the consensus phenomenon to occur.
> Yet you don't allow for that same argument to be valid for the "metastability" problem in Avalanche,
Incorrect. It's not "that same argument". FPC-BI is a decent academic paper that has precisely formulated results and proofs. The Ava WP (the probabilistic part of it), on the other hand, does not contain proofs of what they call results. More importantly, they don't even show a clear path to those proofs. That's why their system is scientifically unsound.
> even when there's a live network that shows that it doesn't matter.
No, it doesn't show that it doesn't matter. It only shows that it works when not properly attacked. Their WP doesn't contain any insight on why those attacks would be difficult/impossible.
Hans Moog [IF], yesterday 10:56 PM
That proposal was so stupid - Avalanche does several things completely differently, and we are putting quite a bit of effort into our solution to pretty much fix all of Avalanche's shortcomings.
If we just wanted to have a working product and didn't care about security or performance, then we could have just forked a blockchain.
I am pretty confident that once we are done, it's going to be extremely close to the best theoretical thresholds that DLTs will ever be able to achieve for an unsharded base layer.
​-------------------------------------------------------------------------------------------------------------
Bas, yesterday 2:43 AM
Yesterday I was asked how a reasonably big company no one has heard of could best move forward with implementing Access for thousands of locations worldwide. (Sorry for the vagueness, it's all confidential.) They read the article and want to implement it because it seems to fit a problem they're currently trying to solve. Such moves will vastly increase the utility of protocols like IOTA, and are what the speculation is built on. I do not think you can overestimate the impact Access is going to have. It's cutting out the middleman for simple things; no server or service needed. That's huge.
So yes, I think this space will continue to grow u/Coinnave

--------------------------------------------------------------------------------------------------------------
Angelo Capossele [IF], 2020-10-02
In short: we are planning a new v0.3.0 release that should happen very soon. This version will bring fundamental changes to the structure of the entire codebase (but without additional features) so that progressing with development will be easier and more consistent. We have also obtained outstanding results with the dRNG committee managed by the GoShimmer X-Team, so that will also be an integral part of v0.3.0. After that, we will merge the Value Tangle with the Message Tangle, so as to have only one Tangle and make the TSA and orphanage easier to manage. We are also progressing really well with Mana, which will be the focus after the merge. More or less, this is what is going to happen this month.
We will release further details in the upcoming Research Status Update.

submitted by btlkhs to r/Iota

Proposal: The Sia Foundation

Vision Statement

A common sentiment is brewing online; a shared desire for the internet that might have been. After decades of corporate encroachment, you don't need to be a power user to realize that something has gone very wrong.
In the early days of the internet, the future was bright. In that future, when you sent an instant message, it traveled directly to the recipient. When you needed to pay a friend, you announced a transfer of value to their public key. When an app was missing a feature you wanted, you opened up the source code and implemented it. When you took a picture on your phone, it was immediately encrypted and backed up to storage that you controlled. In that future, people would laugh at the idea of having to authenticate themselves to some corporation before doing these things.
What did we get instead? Rather than a network of human-sized communities, we have a handful of enormous commons, each controlled by a faceless corporate entity. Hey user, want to send a message? You can, but we'll store a copy of it indefinitely, unencrypted, for our preference-learning algorithms to pore over; how else could we slap targeted ads on every piece of content you see? Want to pay a friend? You can—in our Monopoly money. Want a new feature? Submit a request to our Support Center and we'll totally maybe think about it. Want to backup a photo? You can—inside our walled garden, which only we (and the NSA, of course) can access. Just be careful what you share, because merely locking you out of your account and deleting all your data is far from the worst thing we could do.
You rationalize this: "MEGACORP would never do such a thing; it would be bad for business." But we all know, at some level, that this state of affairs, this inversion of power, is not merely "unfortunate" or "suboptimal" – No. It is degrading. Even if MEGACORP were purely benevolent, it is degrading that we must ask its permission to talk to our friends; that we must rely on it to safeguard our treasured memories; that our digital lives are completely beholden to those who seek only to extract value from us.
At the root of this issue is the centralization of data. MEGACORP can surveil you—because your emails and video chats flow through their servers. And MEGACORP can control you—because they hold your data hostage. But centralization is a solution to a technical problem: How can we make the user's data accessible from anywhere in the world, on any device? For a long time, no alternative solution to this problem was forthcoming.
Today, thanks to a confluence of established techniques and recent innovations, we have solved the accessibility problem without resorting to centralization. Hashing, encryption, and erasure encoding got us most of the way, but one barrier remained: incentives. How do you incentivize an anonymous stranger to store your data? Earlier protocols like BitTorrent worked around this limitation by relying on altruism, tit-for-tat requirements, or "points" – in other words, nothing you could pay your electric bill with. Finally, in 2009, a solution appeared: Bitcoin. Not long after, Sia was born.
Cryptography has unleashed the latent power of the internet by enabling interactions between mutually-distrustful parties. Sia harnesses this power to turn the cloud storage market into a proper marketplace, where buyers and sellers can transact directly, with no intermediaries, anywhere in the world. No more silos or walled gardens: your data is encrypted, so it can't be spied on, and it's stored on many servers, so no single entity can hold it hostage. Thanks to projects like Sia, the internet is being re-decentralized.
Sia began its life as a startup, which means it has always been subjected to two competing forces: the ideals of its founders, and the profit motive inherent to all businesses. Its founders have taken great pains to never compromise on the former, but this often threatened the company's financial viability. With the establishment of the Sia Foundation, this tension is resolved. The Foundation, freed of the obligation to generate profit, is a pure embodiment of the ideals from which Sia originally sprung.
The goals and responsibilities of the Foundation are numerous: to maintain core Sia protocols and consensus code; to support developers building on top of Sia and its protocols; to promote Sia and facilitate partnerships in other spheres and communities; to ensure that users can easily acquire and safely store siacoins; to develop network scalability solutions; to implement hardforks and lead the community through them; and much more. In a broader sense, its mission is to commoditize data storage, making it cheap, ubiquitous, and accessible to all, without compromising privacy or performance.
Sia is a perfect example of how we can achieve better living through cryptography. We now begin a new chapter in Sia's history. May our stewardship lead it into a bright future.
 

Overview

Today, we are proposing the creation of the Sia Foundation: a new non-profit entity that builds and supports distributed cloud storage infrastructure, with a specific focus on the Sia storage platform. What follows is an informal overview of the Sia Foundation, covering two major topics: how the Foundation will be funded, and what its funds will be used for.

Organizational Structure

The Sia Foundation will be structured as a non-profit entity incorporated in the United States, likely a 501(c)(3) organization or similar. The actions of the Foundation will be constrained by its charter, which formalizes the specific obligations and overall mission outlined in this document. The charter will be updated on an annual basis to reflect the current goals of the Sia community.
The organization will be operated by a board of directors, initially comprising Luke Champine as President and Eddie Wang as Chairman. Luke Champine will be leaving his position at Nebulous to work at the Foundation full-time, and will seek to divest his shares of Nebulous stock along with other potential conflicts of interest. Neither Luke nor Eddie personally own any siafunds or significant quantities of siacoin.

Funding

The primary source of funding for the Foundation will come from a new block subsidy. Following a hardfork, 30 KS per block will be allocated to the "Foundation Fund," continuing in perpetuity. The existing 30 KS per block miner reward is not affected. Additionally, one year's worth of block subsidies (approximately 1.57 GS) will be allocated to the Fund immediately upon activation of the hardfork.
As detailed below, the Foundation will provably burn any coins that it cannot meaningfully spend. As such, the 30 KS subsidy should be viewed as a maximum. This allows the Foundation to grow alongside Sia without requiring additional hardforks.
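As a rough check on these figures, assuming Sia's ~10-minute block target (so roughly 52,560 blocks per year):

```python
# Back-of-the-envelope check of the subsidy figures above, assuming a
# ~10-minute block target (about 52,560 blocks per year).
BLOCKS_PER_YEAR = 365 * 24 * 6        # one block every 10 minutes
FOUNDATION_SUBSIDY = 30_000           # 30 KS per block to the Foundation Fund

one_year_of_subsidies = BLOCKS_PER_YEAR * FOUNDATION_SUBSIDY
print(one_year_of_subsidies / 1e9)    # ~1.577 GS, in line with the ~1.57 GS figure
```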
The Foundation will not be funded to any degree by the possession or sale of siafunds. Siafunds were originally introduced as a means of incentivizing growth, and we still believe in their effectiveness: a siafund holder wants to increase the amount of storage on Sia as much as possible. While the Foundation obviously wants Sia to succeed, its driving force should be its charter. Deriving significant revenue from siafunds would jeopardize the Foundation's impartiality and focus. Ultimately, we want the Foundation to act in the best interests of Sia, not in growing its own budget.

Responsibilities

The Foundation inherits a great number of responsibilities from Nebulous. Each quarter, the Foundation will publish the progress it has made over the past quarter, and list the responsibilities it intends to prioritize over the coming quarter. This will be accompanied by a financial report, detailing each area of expenditure over the past quarter, and forecasting expenditures for the coming quarter. Below, we summarize some of the myriad responsibilities towards which the Foundation is expected to allocate its resources.

Maintain and enhance core Sia software

Arguably, this is the most important responsibility of the Foundation. At the heart of Sia is its consensus algorithm: regardless of other differences, all Sia software must agree upon the content and rules of the blockchain. It is therefore crucial that the algorithm be stewarded by an entity that is accountable to the community, transparent in its decision-making, and has no profit motive or other conflicts of interest.
Accordingly, Sia’s consensus functionality will no longer be directly maintained by Nebulous. Instead, the Foundation will release and maintain an implementation of a "minimal Sia full node," comprising the Sia consensus algorithm and P2P networking code. The source code will be available in a public repository, and signed binaries will be published for each release.
Other parties may use this code to provide alternative full node software. For example, Nebulous may extend the minimal full node with wallet, renter, and host functionality. The source code of any such implementation may be submitted to the Foundation for review. If the code passes review, the Foundation will provide "endorsement signatures" for the commit hash used and for binaries compiled internally by the Foundation. Specifically, these signatures assert that the Foundation believes the software contains no consensus-breaking changes or other modifications to imported Foundation code. Endorsement signatures and Foundation-compiled binaries may be displayed and distributed by the receiving party, along with an appropriate disclaimer.
A minimal full node is not terribly useful on its own; the wallet, renter, host, and other extensions are what make Sia a proper developer platform. Currently, the only implementations of these extensions are maintained by Nebulous. The Foundation will contract Nebulous to ensure that these extensions continue to receive updates and enhancements. Later on, the Foundation intends to develop its own implementations of these extensions and others. As with the minimal node software, these extensions will be open source and available in public repositories for use by any Sia node software.
With the consensus code now managed by the Foundation, the task of implementing and orchestrating hardforks becomes its responsibility as well. When the Foundation determines that a hardfork is necessary (whether through internal discussion or via community petition), a formal proposal will be drafted and submitted for public review, during which arguments for and against the proposal may be submitted to a public repository. During this time, the hardfork code will be implemented, either by Foundation employees or by external contributors working closely with the Foundation. Once the implementation is finished, final arguments will be heard. The Foundation board will then vote whether to accept or reject the proposal, and announce their decision along with appropriate justification. Assuming the proposal was accepted, the Foundation will announce the block height at which the hardfork will activate, and will subsequently release source code and signed binaries that incorporate the hardfork code.
Regardless of the Foundation's decision, it is the community that ultimately determines whether a fork is accepted or rejected – nothing can change that. Foundation node software will never automatically update, so all forks must be explicitly adopted by users. Furthermore, the Foundation will provide replay and wipeout protection for its hard forks, protecting other chains from unintended or malicious reorgs. Similarly, the Foundation will ensure that any file contracts formed prior to a fork activation will continue to be honored on both chains until they expire.
Finally, the Foundation also intends to pursue scalability solutions for the Sia blockchain. In particular, work has already begun on an implementation of Utreexo, which will greatly reduce the space requirements of fully-validating nodes (allowing a full node to be run on a smartphone) while increasing throughput and decreasing initial sync time. A hardfork implementing Utreexo will be submitted to the community as per the process detailed above.
As this is the most important responsibility of the Foundation, it will receive a significant portion of the Foundation’s budget, primarily in the form of developer salaries and contracting agreements.

Support community services

We intend to allocate 25% of the Foundation Fund towards the community. This allocation will be held and disbursed in the form of siacoins, and will pay for grants, bounties, hackathons, and other community-driven endeavours.
Any community-run service, such as a Skynet portal, explorer or web wallet, may apply to have its costs covered by the Foundation. Upon approval, the Foundation will reimburse expenses incurred by the service, subject to the exact terms agreed to. The intent of these grants is not to provide a source of income, but rather to make such services "break even" for their operators, so that members of the community can enrich the Sia ecosystem without worrying about the impact on their own finances.

Ensure easy acquisition and storage of siacoins

Most users will acquire their siacoins via an exchange. The Foundation will provide support to Sia-compatible exchanges, and pursue relevant integrations at its discretion, such as Coinbase's new Rosetta standard. The Foundation may also release DEX software that enables trading cryptocurrencies without the need for a third party. (The Foundation itself will never operate as a money transmitter.)
Increasingly, users are storing their cryptocurrency on hardware wallets. The Foundation will maintain the existing Ledger Nano S integration, and pursue further integrations at its discretion.
Of course, all hardware wallets must be paired with software running on a computer or smartphone, so the Foundation will also develop and/or maintain client-side wallet software, including both full-node wallets and "lite" wallets. Community-operated wallet services, i.e. web wallets, may be funded via grants.
Like core software maintenance, this responsibility will be funded in the form of developer salaries and contracting agreements.

Protect the ecosystem

When it comes to cryptocurrency security, patching software vulnerabilities is table stakes; there are significant legal and social threats that we must be mindful of as well. As such, the Foundation will earmark a portion of its fund to defend the community from legal action. The Foundation will also safeguard the network from 51% attacks and other threats to network security by implementing softforks and/or hardforks where necessary.
The Foundation also intends to assist in the development of a new FOSS software license, and to solicit legal memos on various Sia-related matters, such as hosting in the United States and the EU.
In a broader sense, the establishment of the Foundation makes the ecosystem more robust by transferring core development to a more neutral entity. Thanks to its funding structure, the Foundation will be immune to various forms of pressure that for-profit companies are susceptible to.

Drive adoption of Sia

Although the overriding goal of the Foundation is to make Sia the best platform it can be, all that work will be in vain if no one uses the platform. There are a number of ways the Foundation can promote Sia and get it into the hands of potential users and developers.
In-person conferences are understandably far less popular now, but the Foundation can sponsor and/or participate in virtual conferences. (In-person conferences may be held in the future, circumstances permitting.) Similarly, the Foundation will provide prizes for hackathons, which may be organized by community members, Nebulous, or the Foundation itself. Lastly, partnerships with other companies in the cryptocurrency space—or the cloud storage space—are a great way to increase awareness of Sia. To handle these responsibilities, one of the early priorities of the Foundation will be to hire a marketing director.

Fund Management

The Foundation Fund will be controlled by a multisig address. Each member of the Foundation's board will control one of the signing keys, with the signature threshold to be determined once the final composition of the board is known. (This threshold may also be increased or decreased if the number of board members changes.) Additionally, one timelocked signing key will be controlled by David Vorick. This key will act as a “dead man’s switch,” to be used in the event of an emergency that prevents Foundation board members from reaching the signature threshold. The timelock ensures that this key cannot be used unless the Foundation fails to sign a transaction for several months.
On the 1st of each month, the Foundation will use its keys to transfer all siacoins in the Fund to two new addresses. The first address will be controlled by a high-security hot wallet, and will receive approximately one month's worth of Foundation expenditures. The second address, receiving the remaining siacoins, will be a modified version of the source address: specifically, it will increase the timelock on David Vorick's signing key by one month. Any other changes to the set of signing keys, such as the arrival or departure of board members, will be incorporated into this address as well.
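A minimal sketch of that monthly rotation, with key handling and transaction construction abstracted away; this is not Sia code and the names are hypothetical:

```python
# Sketch of the monthly treasury rotation described above (not Sia code;
# key handling and transaction details are abstracted away).
from dataclasses import dataclass

@dataclass
class FundAddress:
    board_keys: list                  # one signing key per board member
    threshold: int                    # signatures required to spend
    deadman_timelock_months: int      # timelock on the emergency (dead man's switch) key

def monthly_rotation(current: FundAddress, balance_sc: float, monthly_budget_sc: float):
    """Send one month of expenses to a hot wallet; roll the rest into a new address
    whose dead-man's-switch timelock is pushed one month further into the future."""
    hot_wallet_amount = min(monthly_budget_sc, balance_sc)
    remainder = balance_sc - hot_wallet_amount
    next_address = FundAddress(
        board_keys=list(current.board_keys),   # any board changes would be applied here
        threshold=current.threshold,
        deadman_timelock_months=current.deadman_timelock_months + 1,
    )
    return hot_wallet_amount, remainder, next_address
```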
The Foundation Fund is allocated in SC, but many of the Foundation's expenditures must be paid in USD or other fiat currency. Accordingly, the Foundation will convert, at its discretion, a portion of its monthly withdrawals to fiat currency. We expect this conversion to be primarily facilitated by private "OTC" sales to accredited investors. The Foundation currently has no plans to speculate in cryptocurrency or other assets.
Finally, it is important that the Foundation adds value to the Sia platform well in excess of the inflation introduced by the block subsidy. For this reason, the Foundation intends to provably burn, on a quarterly basis, any coins that it cannot allocate towards any justifiable expense. In other words, coins will be burned whenever doing so provides greater value to the platform than any other use. Furthermore, the Foundation will cap its SC treasury at 5% of the total supply, and will cap its USD treasury at 4 years’ worth of predicted expenses.
 
Addendum: Hardfork Timeline
We would like to see this proposal finalized and accepted by the community no later than September 30th. A new version of siad, implementing the hardfork, will be released no later than October 15th. The hardfork will activate at block 293220, which is expected to occur around 12pm EST on January 1st, 2021.
 
Addendum: Inflation specifics
The total supply of siacoins as of January 1st, 2021 will be approximately 45.243 GS. The initial subsidy of 1.57 GS thus increases the supply by 3.47%, and the total annual inflation in 2021 will be at most 10.4% (if zero coins are burned). In 2022, total annual inflation will be at most 6.28%, and will steadily decrease in subsequent years.
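These percentages can be reproduced from the stated figures, assuming roughly 52,560 blocks per year and an unchanged 30 KS miner reward:

```python
# Reproducing the inflation figures above (assumes ~52,560 blocks/year and
# that the existing 30 KS/block miner reward continues unchanged).
BLOCKS_PER_YEAR = 365 * 24 * 6
supply_jan_2021 = 45.243e9            # ~45.243 GS
initial_subsidy = 1.57e9              # one year's worth, allocated at fork activation

print(initial_subsidy / supply_jan_2021)               # ~0.0347 -> 3.47%

new_coins_2021 = initial_subsidy + 2 * 30_000 * BLOCKS_PER_YEAR   # Foundation + miner subsidies
print(new_coins_2021 / supply_jan_2021)                # ~0.104 -> at most 10.4% if nothing is burned
```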
 

Conclusion

We see the establishment of the Foundation as an important step in the maturation of the Sia project. It provides the ecosystem with a sustainable source of funding that can be exclusively directed towards achieving Sia's ambitious goals. Compared to other projects with far deeper pockets, Sia has always punched above its weight; once we're on equal footing, there's no telling what we'll be able to achieve.
Nevertheless, we do not propose this change lightly, and have taken pains to ensure that the Foundation will act in accordance with the ideals that this community shares. It will operate transparently, keep inflation to a minimum, and respect the user's fundamental role in decentralized systems. We hope that everyone in the community will consider this proposal carefully, and look forward to a productive discussion.
submitted by lukechampine to r/siacoin

Technical: Taproot: Why Activate?

This is a follow-up on https://old.reddit.com/Bitcoin/comments/hqzp14/technical_the_path_to_taproot_activation/
Taproot! Everybody wants it!! But... you might ask yourself: sure, everybody else wants it, but why would I, sovereign Bitcoin HODLer, want it? Surely I can be better than everybody else because I swapped XXX fiat for Bitcoin unlike all those nocoiners?
And it is important for you to know the reasons why you, o sovereign Bitcoiner, would want Taproot activated. After all, your nodes (or the nodes your wallets use, which, if you are on SPV, you hopefully can pester your wallet vendor/implementor about) need to be upgraded in order for Taproot activation to actually succeed instead of becoming a hot sticky mess.
First, let's consider some principles of Bitcoin.
I'm sure most of us here would agree that the above are very important principles of Bitcoin and that these are principles we would not be willing to remove. If anything, we would want those principles strengthened (especially the last one, financial privacy, which current Bitcoin is only sporadically strong with: you can get privacy, it just requires effort to do so).
So, how does Taproot affect those principles?

Taproot and Your Coins

Most HODLers probably HODL their coins in singlesig addresses. Sadly, switching to Taproot would do very little for you (it gives a mild discount at spend time, at the cost of a mild increase in fee at receive time (paid by whoever sends to you, so if it's a self-send from a P2PKH or bech32 address, you pay for this); mostly a wash).
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash, so the Taproot output spends 12 bytes more; spending from a P2WPKH requires revealing a 32-byte public key later, which is not needed with Taproot, and Taproot signatures are about 9 bytes smaller than P2WPKH signatures, but the 32 bytes plus 9 bytes is divided by 4 because of the witness discount, so it saves about 11 bytes; mostly a wash, it increases blockweight by about 1 virtual byte, 4 weight for each Taproot-output-input, compared to P2WPKH-output-input).
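For the arithmetic-inclined, here is the same accounting written out, using the post's own size figures (real signature sizes vary by a byte or two, so treat the result as approximate):

```python
# The parenthetical's accounting, using the post's own size figures.
WITNESS_DISCOUNT = 4

# Output side (non-witness data, counted at full weight):
p2wpkh_output = 1 + 20                 # version byte + 20-byte pubkey hash
taproot_output = 1 + 32                # version byte + 32-byte pubkey
extra_output_vbytes = taproot_output - p2wpkh_output          # +12

# Input side (witness data, discounted by a factor of 4):
pubkey_no_longer_revealed = 32         # P2WPKH must reveal the pubkey; Taproot does not
smaller_signature = 9                  # approximate Schnorr vs ECDSA size difference
saved_input_vbytes = (pubkey_no_longer_revealed + smaller_signature) / WITNESS_DISCOUNT  # ~10.25

print(extra_output_vbytes - saved_input_vbytes)   # ~1.75 vbytes net: "mostly a wash"
```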
However, as your HODLings grow in value, you might start wondering if multisignature k-of-n setups might be better for the security of your savings. And it is in multisignature that Taproot starts to give benefits!
Taproot switches to using Schnorr signing scheme. Schnorr makes key aggregation -- constructing a single public key from multiple public keys -- almost as trivial as adding numbers together. "Almost" because it involves some fairly advanced math instead of simple boring number adding, but hey when was the last time you added up your grocery list prices by hand huh?
With current P2SH and P2WSH multisignature schemes, if you have a 2-of-3 setup, then to spend, you need to provide two different signatures from two different public keys. With Taproot, you can create, using special moon math, a single public key that represents your 2-of-3 setup. Then you just put two of your devices together, have them communicate with each other (this can be done airgapped, in theory, by sending QR codes: the software to do this is not even being built yet, but that's because Taproot hasn't activated yet!), and they will make a single signature to authorize any spend from your 2-of-3 address. That's 73 witness bytes -- 18.25 virtual bytes -- of signatures you save!
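As a rough rule of thumb, using the post's ~73-byte signature figure (the old P2WSH script and pubkey overhead is ignored here, so the real savings are somewhat larger):

```python
# Rough witness savings when k separate signatures collapse into one aggregated
# signature of similar size (P2WSH script overhead ignored, so this understates it).
def multisig_witness_savings(k, sig_bytes=73):
    saved = (k - 1) * sig_bytes
    return saved, saved / 4            # raw witness bytes, virtual bytes

print(multisig_witness_savings(2))     # (73, 18.25) -- the 2-of-3 example above
print(multisig_witness_savings(11))    # the 11-of-15 custodial case mentioned below
```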
And if you decide that your current setup with 1-of-1 P2PKH / P2WPKH addresses is just fine as-is: well, that's the whole point of a softfork: backwards-compatibility; you can receive from Taproot users just fine, and once your wallet is updated for Taproot-sending support, you can send to Taproot users just fine as well!
(P2WPKH and P2WSH -- SegWit v0 -- addresses start with bc1q; Taproot -- SegWit v1 --- addresses start with bc1p, in case you wanted to know the difference; in bech32 q is 0, p is 1)
Now how about HODLers who keep all, or some, of their coins on custodial services? Well, any custodial service worth its salt would be doing at least 2-of-3, or probably something even bigger, like 11-of-15. So your custodial service, if it switched to using Taproot internally, could save a lot more (imagine an 11-of-15 getting reduced from 11 signatures to just 1!), which --- we can only hope! --- should translate to lower fees and better customer service from your custodial service!
So I think we can say, very accurately, that the Bitcoin principle --- that YOU are in control of your money --- can only be helped by Taproot (if you are doing multisignature), and, because P2PKH and P2WPKH remain validly-usable addresses in a Taproot future, will not be harmed by Taproot. Its benefit to this principle might be small (it mostly only benefits multisignature users) but since it has no drawbacks with this (i.e. singlesig users can continue to use P2WPKH and P2PKH still) this is still a nice, tidy win!
(even singlesig users get a minor benefit, in that multisig users will now reduce their blockchain space footprint, so that fees can be kept low for everybody; so for example even if you have your single set of private keys engraved on titanium plates sealed in an airtight box stored in a safe buried in a desert protected by angry nomads riding giant sandworms because you're the frickin' Kwisatz Haderach, you still gain some benefit from Taproot)
And here's the important part: if P2PKH/P2WPKH is working perfectly fine with you and you decide to never use Taproot yourself, Taproot will not affect you detrimentally. First do no harm!

Taproot and Your Contracts

No one is an island, no one lives alone. Give and you shall receive. You know: by trading with other people, you can gain expertise in some obscure little necessity of the world (and greatly increase your productivity in that little field), and then trade the products of your expertise for necessities other people have created, all of you thereby gaining gains from trade.
So, contracts, which are basically enforceable agreements that facilitate trading with people who you do not personally know and therefore might not trust.
Let's start with a simple example. You want to buy some gewgaws from somebody. But you don't know them personally. The seller wants the money, you want their gewgaws, but because of the lack of trust (you don't know them!! what if they're scammers??) neither of you can benefit from gains from trade.
However, suppose both of you know of some entity that both of you trust. That entity can act as a trusted escrow. The entity provides you security: this enables the trade, allowing both of you to get gains from trade.
In Bitcoin-land, this can be implemented as a 2-of-3 multisignature. The three signatories in the multisignature would be you, the gewgaw seller, and the escrow. You put the payment for the gewgaws into this 2-of-3 multisignature address.
Now, suppose it turns out neither of you are scammers (whaaaat!). You receive the gewgaws just fine and you're willing to pay up for them. Then you and the gewgaw seller just sign a transaction --- you and the gewgaw seller are 2, sufficient to trigger the 2-of-3 --- that spends from the 2-of-3 address to a singlesig the gewgaw seller wants (or whatever address the gewgaw seller wants).
But suppose some problem arises. The seller gave you gawgews instead of gewgaws. Or you decided to keep the gewgaws but not sign the transaction to release the funds to the seller. In either case, the escrow is notified, and it can sign with you to refund the funds to you (if the seller was a scammer) or sign with the seller to forward the funds to the seller (if you were a scammer).
Taproot helps with this: as mentioned above, it allows multisignature setups to produce only one signature, reducing blockchain space usage, and thus making contracts --- which by definition require multiple people; you don't make contracts with yourself --- cheaper (which we hope enables more of these setups to happen, for more gains from trade for everyone; also, moon and lambos).
(technology-wise, it's easier to make an n-of-n than a k-of-n, making a k-of-n would require a complex setup involving a long ritual with many communication rounds between the n participants, but an n-of-n can be done trivially with some moon math. You can, however, make what is effectively a 2-of-3 by using a three-branch SCRIPT: either 2-of-2 of you and seller, OR 2-of-2 of you and escrow, OR 2-of-2 of escrow and seller. Fortunately, Taproot adds a facility to embed a SCRIPT inside a public key, so you can have a 2-of-2 Taprooted address (between you and seller) with a SCRIPT branch that can instead be spent with 2-of-2 (you + escrow) OR 2-of-2 (seller + escrow), which implements the three-branched SCRIPT above. If neither of you are scammers (hopefully the common case) then you both sign using your keys and never have to contact the escrow, since you are just using the escrow public key without coordinating with them (because n-of-n is trivial but k-of-n requires setup with communication rounds), so in the "best case" where both of you are honest traders, you also get a privacy boost, in that the escrow never learns you have been trading on gewgaws, I mean ewww, gawgews are much better than gewgaws and therefore I now judge you for being a gewgaw enthusiast, you filthy gewgawer).
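Schematically, the escrow construction just described looks something like this (not real wallet code; the policy layout and names are purely illustrative):

```python
# Illustrative layout of the escrow Taproot policy described above (not wallet code).
escrow_taproot_policy = {
    "key_path": ["buyer", "seller"],                  # aggregated 2-of-2, used in the happy case
    "script_paths": [
        {"require_2_of_2": ["buyer", "escrow"]},      # refund path if the seller scams
        {"require_2_of_2": ["seller", "escrow"]},     # payout path if the buyer scams
    ],
}

def can_spend(cosigners):
    """Happy case: buyer + seller sign via the key path and the escrow never learns anything.
    Dispute cases: one of the hidden script paths is revealed and used instead."""
    signers = set(cosigners)
    if signers == set(escrow_taproot_policy["key_path"]):
        return "key path"
    for branch in escrow_taproot_policy["script_paths"]:
        if signers == set(branch["require_2_of_2"]):
            return "script path"
    return None
```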

Taproot and Your Contracts, Part 2: Cryptographic Boogaloo

Now suppose you want to buy some data instead of things. For example, maybe you have some closed-source software in trial mode installed, and want to pay the developer for the full version. You want to pay for an activation code.
This can be done, today, by using an HTLC. The developer tells you the hash of the activation code. You pay to an HTLC, paying out to the developer if it reveals the preimage (the activation code), or refunding the money back to you after a pre-agreed timeout. If the developer claims the funds, it has to reveal the preimage, which is the activation code, and you can now activate your software. If the developer does not claim the funds by the timeout, you get refunded.
And you can do that, with HTLCs, today.
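A toy model of the two spending conditions just described (Python, not Bitcoin Script; the parameter names are made up):

```python
# Toy model of an HTLC's two spending conditions (not Bitcoin Script).
import hashlib

def htlc_can_spend(spender, claimed_preimage, expected_hash, now, timeout):
    if spender == "developer":
        # Claim path: spending reveals the preimage, i.e. the activation code the buyer wanted.
        return hashlib.sha256(claimed_preimage).digest() == expected_hash
    if spender == "buyer":
        # Refund path: only valid once the pre-agreed timeout has passed.
        return now >= timeout
    return False
```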
Of course, HTLCs do have their problems.
Fortunately, with Schnorr (which is enabled by Taproot), we can now use the Scriptless Script construction by Andrew Poelstra. This Scriptless Script allows a new construction, the PTLC or Pointlocked Timelocked Contract. Instead of hashes and preimages, just replace "hash" with "point" and "preimage" with "scalar".
Or as you might know them: "point" is really "public key" and "scalar" is really a "private key". What a PTLC does is that, given a particular public key, the pointlocked branch can be spent only if the spender reveals the private key of the given public key to you.
Another nice thing with PTLCs is that they are deniable. What appears onchain is just a single 2-of-2 signature between you and the developer/manufacturer. It's like a magic trick. This signature has no special watermarks; it's a perfectly normal signature (the pledge). However, from this signature, plus some data given to you by the developer/manufacturer (known as the adaptor signature), you can derive the private key of a particular public key you both agree on (the turn). Anyone scraping the blockchain will just see signatures that look like every other signature, and as long as nobody manages to hack you and get a copy of the adaptor signature or the private key, they cannot get the private key behind the public key (point) that the pointlocked branch needs (the prestige).
(Just to be clear, the public key you are getting the private key of is distinct from the public key that the developer/manufacturer will use for its funds. The activation key is different from the developer's onchain Bitcoin key, and it is the activation key whose private key you will be learning, not the developer's/manufacturer's onchain Bitcoin key.)
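A toy, scalars-only illustration of why publishing the final signature gives you the secret (no real Schnorr signing happens here; the secp256k1 group order is the only real constant, every other value is made up):

```python
# Toy scalar arithmetic behind the PTLC trick: final = adaptor + secret (mod n),
# so whoever holds the adaptor signature learns the secret as soon as the final
# signature appears onchain. No real Schnorr signing; values are made up.
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # secp256k1 group order

secret_activation_key = 0xC0FFEE % N   # the scalar whose point both parties agreed on
adaptor_sig = 0xDEADBEEF % N           # handed to you off-chain by the seller

final_sig = (adaptor_sig + secret_activation_key) % N   # what eventually hits the chain

recovered = (final_sig - adaptor_sig) % N               # you learn the activation key
assert recovered == secret_activation_key
```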
So:
Taproot lets PTLCs exist onchain because they enable Schnorr, which is a requirement of PTLCs / Scriptless Script.
(technology-wise, take note that Scriptless Script works only for the "pointlocked" branch of the contract; you need normal Script, or a pre-signed nLockTimed transaction, for the "timelocked" branch. Since Taproot can embed a script, you can have the Taproot pubkey be a 2-of-2 to implement the Scriptless Script "pointlocked" branch, then have a hidden script that lets you recover the funds with an OP_CHECKLOCKTIMEVERIFY after the timeout if the seller does not claim the funds.)

Quantum Quibbles!

Now if you were really paying attention, you might have noticed this parenthetical:
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash...)
So wait, Taproot uses raw 32-byte public keys, and not public key hashes? Isn't that more quantum-vulnerable??
Well, in theory yes. In practice, they probably are not.
It's not that hashes can be broken by quantum computers --- they still can't be. Instead, you have to look at how you spend from a P2WPKH/P2PKH pay-to-public-key-hash.
When you spend from a P2PKH / P2WPKH, you have to reveal the public key. Then Bitcoin hashes it and checks if this matches with the public-key-hash, and only then actually validates the signature for that public key.
So an unconfirmed transaction, floating in the mempools of nodes globally, will show, in plain sight for everyone to see, your public key.
(public keys should be public, that's why they're called public keys, LOL)
And if quantum computers are fast enough to be of concern, then they are probably fast enough that, in the several minutes to several hours from broadcast to confirmation, they will have already cracked the public key that is openly broadcast with your transaction. The owner of the quantum computer can then replace your unconfirmed transaction with one that pays the funds to itself. Even if you did not opt in to RBF, miners are still incentivized to support RBF on RBF-disabled transactions.
So the extra hash is not as significant a protection against quantum computers as you might think. Instead, the extra hash-and-compare needed is just extra validation effort.
Further, if you have ever, in the past, spent from the address, then there exists already a transaction indelibly stored on the blockchain, openly displaying the public key from which quantum computers can derive the private key. So those are still vulnerable to quantum computers.
For the most part, the cryptographers behind Taproot (and Bitcoin Core) are of the opinion that quantum computers capable of cracking Bitcoin pubkeys are unlikely to appear within a decade or two.
So:
For now, the homomorphic and linear properties of elliptic curve cryptography provide a lot of benefits --- particularly the linearity property is what enables Scriptless Script and simple multisignature (i.e. multisignatures that are just 1 signature onchain). So it might be a good idea to take advantage of them now while we are still fairly safe against quantum computers. It seems likely that quantum-safe signature schemes are nonlinear (thus losing these advantages).

Summary

I Wanna Be The Taprooter!

So, do you want to help activate Taproot? Here's what you, mister sovereign Bitcoin HODLer, can do!

But I Hate Taproot!!

That's fine!

Discussions About Taproot Activation

submitted by almkglor to r/Bitcoin

Unpopular opinion - the economy has to become dynamic in order for it to have any longevity (and other musings on the progression)

Ain't no one gonna read this but here it goes!
The issue of progression has been gaining traction in the community, with Klean and DeadlySlob covering the topic recently.
Now any solution to this has an inherent issue associated with it - it'll be uncomfortable to someone. Whatever is done, it'll negatively affect someone, just by the fact of change alone. You cannot make something better by not changing anything. So anything you do or don't do, you will alienate a portion of your playerbase.
Early/Mid-game vs Late game.
Early and mid game are lauded; late game is considered boring. But why? For starters, firefights last longer and require more skill, movement, tactics and outsmarting your opponent. You value your life; you feel respect even for the shittiest of bullets. You have the feeling that the kill is earned. Guns have tons of recoil, so you need to pick your shots. It's... I know it's illegal... but it's fun.
Late game, however, is plagued by a number of issues. Gear gets dominated by very similar loadouts that cover approximately 10% of the gear in the game. There's nowhere to progress once you've reached the ceiling. The excitement of killing a kitted player diminishes as time goes on and the economy saturates. People start being picky with their loot, and only the good stuff brings any sort of satisfaction. The hideout provides a steady, predictable stream of income.
Let it run long enough and it becomes a mindless PvP battleground.
Side note - the black and white fallacy of the makeup of the community.
Casuals vs hardcores. Rats vs Chads. Whenever a discussion pops up, this dichotomy is always present. "Feature X hurts casuals but doesn't bother hardcore gamers playing 8h a day". No. Like anything in life, the population of EFT is subject to a bell-curve distribution. There are hardcore sweaties grinding out the Kappa within a week, and there are also Sunday gamers. Then there's everything else in between. Let's keep that in mind.
You don't need to be a streamer or play the game as a full time job to make money. We have a discord for 30+ yr old gamers with families and all of us were swimming in roubles and gear after 3 months of the past wipe. Sure it takes us longer than streamers, but still.
The meta
Taking weapons as an example: different items have different stats (recoil, ergonomics, etc.), and some are obviously better than others, which makes them more sought after. There are also different ammo types for every caliber. Then lastly we come to the guns themselves, which tie directly into the first point through their base stats and how much those can be brought down or up by attachments.
If you have a plethora of items that have different stats, there's sure to be an optimal loadout. If that optimal loadout is always available at an attainable price to the point where you can run it consistently, then there's really no reason to run anything less. This is the meta and at the moment it's basically a synonym for best in slot.
Appealing to a greater good such as gameplay variety is in vain because people will do everything to put themselves in the best possible position. If that means running whatever flavor of meta weapon that is - VAL, M4, FAL alongside top tier lvl 5 or 6 armor over and over and over and over again, so be it. We all know that's not the only way to get by in EFT, but all else being equal - top gear puts you on equal footing at minimum.
Trash contextualizes treasure. A rare item is not rare if everyone is running it. It's a normal item.
Gear min-maxing combined with a ceiling in progression creates a situation where the game becomes stale, people get bored, and we get chants for a wipe to relieve the pressure.
Wipes
Wipes however, even at set intervals, are not the solution. Every wipe, in the absence of something fundamentally new, gives you (rapidly) diminishing returns. Doing the same quests over and over is an absolute drag. It's my 7th wipe and this time around I've really hit a brick wall with them. Now imagine doing them every 3 months. Maybe just do an inventory and trader level wipe? Yeah, that's just skipping one part of it and arriving at the same point but even quicker, considering how quickly you can make money.
The endpoint being - having enough money to run anything you want all the time without the fear of getting broke. Or in the abstract, having a big enough cushion to make any blow from a bad streak become inconsequential.
All of that is just a perpetuation of the same sawtooth progression. Grind, saturate, wipe, grind, saturate, wipe.
Side note - persistent character vs wiped character
I know there have been talks about having two characters - one persistent that's not wiped and one seasonal that is. On paper this might look like a good solution, but there are some problems.
POE players would have to chip in, but I reckon that in a way this might become a form of matchmaking - the persistent character would be a mode for "Sunday" players, while the wiped one would be for the sweats. I mean, maybe that's the way to go, but if the game is to have any longevity, the persistent character will eventually face the same issues as the current game; it'll just take longer to develop.
Unpopular opinion - The economy is just a set of time and effort gated unlocks.
There have been multiple ideas to prolong a wipe, but in my view the fundamental issue with those is that they're based on the same linear progression - start from scratch and accumulate wealth until saturation. Some of these ideas include restricting Labs till level X, locking it behind a quest, or just disabling it for a month. The problem with these is that they're just delaying the inevitable, while also giving a direct buff to those who get there first, as they'll have the place virtually to themselves.
What follows is also the concept of "starting mid wipe", which essentially means that the gear disparity is so big that the further into a wipe you are, the more difficult it is to catch up. That effort is directly correlated with experience - the more experience you have, the easier it is for you to reset or jump in mid-wipe. Extending a wipe potentially alleviates that by giving people more opportunity to catch up, but it also pushes people away from coming back or getting into the game if they recognize that it has passed their personal breakpoint where it's too hard or frustrating.
Perpetual mid-game
So out of all of that, a clearer picture emerges. We have to somehow find a solution to always have something to work for, but also not give the impression that you're up against an impenetrable wall.
That means that the game needs to pivot around something colloquially known as mid game. How would we define mid-game? That's another debate, but for the sake of the argument we could define that as something in the range of:
That would be the sort of mean loadout you can run on a consistent basis and you'd see the majority of the time. From the sentiment across the community, this seems to be the most enjoyable state of the game, where the sweetspot is in terms of protection and vulnerability, but allowing a lot of headroom for both variety and
Solutions
Now we must remember that there are a number of changes inbound that will alleviate some of the issues:
But those are still far on the horizon.
The uncomfortable reality is that in order to truly balance this, you have only a few choices. One is to go down the route of typical FPS tropes where every weapon type is perfectly balanced (e.g. shotguns powerful but limited range, SMGs low recoil and high ROF but weaker, DMRs powerful but high recoil and low ROF, etc.). I don't think this will ever be a thing in the game.
Another one is to make attachments roughly equal and just attribute the differences to the tacticool visual factor. This would be realistic in a way, but would take away from the game.
The last one is to price them out. Literally. I'm of the unpopular opinion that endgame should not be a stage, it should be a state.
Dynamic pricing
I know I know, last time it failed spectacularly. However, that was a different flea market and the implementation was poorly thought out. Since it didn't have a pivot point to relate to, it caused widespread inflation of even the most basic items and was prone to manipulation.
However, the concept in principle has proven itself to work - M995 was essentially priced out of existence and forced people to look for alternatives like M855A1 or M856A1, or different calibers altogether. Even the sweatiest of sweats got a bit excited when they killed someone with three 60-rounders filled with M995. See where I'm going with this?
The problem was the execution, not the concept.
But how about a different implementation? Adjust the prices based on how much an item is (or is not) bought compared to other items of the same item type. The most popular items' prices (within a specific category) increase, while the least popular ones decrease.
This could also be coupled with (or replaced by) an additional rarity factor which would specify how volatile the price is. Continuing the ammo example, M995 would have the highest rarity factor and would be very prone to price increases, while the likes of M855 would be considered common and have a much more stable price.
Obviously this would be subject to long term trends and would not happen overnight. But the main aim is to dynamically scale the economy to the general wealth of the playerbase around a certain pivot point which we established before as the mid-game.
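To pin down what I mean by "adjust based on relative popularity", here's a rough C# sketch. Every name and number in it is made up by me for illustration (this is obviously not BSG code); it just shows the mechanism: prices drift toward each item's share of purchases within its category, and the rarity factor controls how hard they drift.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical offer record: name, current price, rarity factor (1.0 = common, 3.0 = very rare),
// and how many units were bought since the last repricing cycle.
record AmmoOffer(string Name, decimal Price, double Rarity, int UnitsBoughtThisCycle);

static class DynamicPricing
{
    // Nudge each item's price toward its popularity within the category.
    // An item bought more than its "fair share" gets more expensive; an unpopular one gets cheaper.
    public static List<AmmoOffer> Reprice(List<AmmoOffer> category, double maxStepPercent = 2.0)
    {
        int totalBought = Math.Max(1, category.Sum(o => o.UnitsBoughtThisCycle));
        double fairShare = 1.0 / category.Count;

        return category.Select(o =>
        {
            double share = (double)o.UnitsBoughtThisCycle / totalBought;
            // How far above/below its fair share this item is, scaled by rarity.
            double pressure = (share - fairShare) / fairShare * o.Rarity;
            // Clamp to a small step per cycle so nothing explodes overnight.
            double stepPercent = Math.Clamp(pressure, -maxStepPercent, maxStepPercent);
            decimal newPrice = Math.Round(o.Price * (decimal)(1.0 + stepPercent / 100.0));
            return o with { Price = newPrice, UnitsBoughtThisCycle = 0 };
        }).ToList();
    }
}
```

Run something like this once per restock or once per day, and because each step is capped at a couple of percent, the effect only shows up over the long-term trends described above.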
This would be a quite significant blow to the uberchads, as they would unironically struggle to maintain a profit from their runs. And yes, some of them would still probably be able to pull this off, but remember what we said about the bell curve? It's just about making them so insignificant in the global player pool that they'd be a very rare occurrence.
Global item pools
This idea has been floated around by Nikita some time ago, but we have no ETA on it. In short - for some items, there is only a set amount present in circulation. For example, there are only X ReapIRs in the entire economy - spawns, traders, player stashes. If everyone hoards them in their stashes - that's where they'll remain. They don't spawn on maps, they're not sold by traders. Only when they're lost do they get reinjected into the item pool.
This idea should be reserved only for the absolute top tier OP items. Something you'd get all giddy about if you found/looted it, and you'd contemplate whether to take it out at all.
Side note, the X amount should scale to the active playerbase, which could be something like a weekly or biweekly moving average of people actively playing the game in a set period.
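As a sketch of how that scaling could work (made-up names and ratios, purely to illustrate the moving-average idea):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class GlobalItemPool
{
    // Daily active player counts for the last N days come from some hypothetical data source.
    // The cap on a top-tier item is a fixed ratio of the moving average, e.g. 1 ReapIR per 500 active players.
    public static int CirculationCap(IReadOnlyList<int> dailyActivePlayers, int windowDays = 14, int playersPerItem = 500)
    {
        var window = dailyActivePlayers.TakeLast(windowDays).ToList();
        double movingAverage = window.Count > 0 ? window.Average() : 0;
        return (int)Math.Floor(movingAverage / playersPerItem);
    }

    // An item only re-enters the spawn/trader pool when the amount currently "alive"
    // (stashes + traders + raids) drops below the cap.
    public static bool CanReinject(int itemsInCirculation, int cap) => itemsInCirculation < cap;
}
```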
Insurance
This one is a bit controversial but also contributes to some of the in-game inflation and gear recirculation. If you run a large squad, even if one of you dies, there's a high chance someone will survive and secure the others' gear. And even if all of you die, something's bound to come back.
This might be a bit controversial, but I think group size should apply a debuff to the chance of getting your gear back - the bigger your squad, the lower the chance, for example an incremental 10% reduction for each additional squadmate.
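In formula form, that 10% idea is just (the numbers are mine, purely illustrative):

```csharp
using System;

static class InsuranceTweaks
{
    // Hypothetical numbers: a solo keeps the normal insurance return chance,
    // and every additional squadmate knocks 10 percentage points off it.
    public static double ReturnChance(double baseChance, int squadSize)
    {
        int extraMates = Math.Max(0, squadSize - 1);
        return Math.Max(0.0, baseChance - 0.10 * extraMates);
    }
}
```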
Hideout adjustments
Right now fuel consumption is static no matter how much stuff is going on. What if the fuel consumption rate was tied to the size of your bitcoin farm and the amount of crafting going on?
Additionally, hideout appliances could wear out and require maintenance; worn-out appliances would get performance debuffs like increased crafting time.
Dynamic stocks
Right now stocks are predictable. You have the same amount of items at a set interval. What about things like traders missing some items, or not getting a restock due to broken supply lines? This can be cheekily tied into...
Dynamic global events/quests
Such as getting rid of scavs on a particular location to remove a roadblock. These might be done per player or as a global event where everyone has to chip in.
Summary
The subject is difficult and the solutions are not simple, but what I do know is that eventually Tarkov will have to settle into an identity, and that will come with a sacrifice either at the expense of vision or of mainstream popularity.
Thank you for coming to my TEDTalk. I'd like to give a heartfelt thank you to the 5 people that read this wall of text.
submitted by sunseeker11 to EscapefromTarkov [link] [comments]

I'm kinda ok with the MCO -> CRO Swap; an in-depth personal view

EDIT: this post https://www.reddit.com/Crypto_com/comments/i2yhuz/open_letter_to_kris_from_one_of_cdcs_biggest/ from u/CryptoMines expresses my sentiments and concerns better than I could ever put into words myself. I'd say read his/her post instead.
Very long post ahead, but TL;DR, I actually see this swap as a positive change, despite fearing for what it may do to my portfolio, and having mixed feelings about its consequences for CDC's reputation. Before I start, for the sake of context and bias, here's my personal situation as a CDC user:
  1. I'm just an average Joe, with a 500 MCO Jade card. I bought 50 MCO at 5,22€ in September 2019 and staked for Ruby, then bought 440 MCO at 2.47€ in March 2020 and upgraded to Jade. The total amount of MCO I own is currently 515, and everything above the 500 stake is cashback rewards.
  2. I bought MCO exclusively for the card and bonus Earn interest benefits, and had no plans to unstake my MCO. Now with the swap, I definitely won't unstake.
  3. The MCO -> CRO conversion rates increased the fiat value of my MCO by about 1000€.
  4. I own a decent amount of CRO, which I bought at ~0,031€ in March 2020.
  5. The country where I live is crypto friendly and completely crypto-tax free; I only have to pay income tax if I deposit a certain threshold of fiat in my bank.
Take all these factors into account as possible (if not major) influences on or biases in my opinions, both the emotional and the economical ones. Call me a fool or a devil's advocate if you want, but keep your torches and pitchforks down. As we say here on Reddit: "Remember the human".
-----------------------------------------------------------------------------------------------------------------------------------------------------
Like all of you, I woke up to find this announcement, which came right the #$%& out of nowhere and gives you little to no options. Good or bad, this announcement arrived as basically a "comply or die" choice. Emotionally, this came across as both terrifying and disgusting; but rationally, I cannot blame CDC for it.
Because whether we like it or not, CDC is a centralized company, and the MCO tokens were never a stock or a legally binding contract; something which pretty much every crypto company or ICO warns about in their T&C and risk warnings. Not to mention the mostly unregulated status of the cryptocurrency space. I'll call this "dishonest" any day, but I cannot see it as "scammy", since I can't see how they broke any rules or terms.
A scammer would take your money/assets away, but CDC is offering to swap it for another asset which you can sell right away if you want. And at the current price, it is still worth more or less as much fiat as MCO cost at the 5 $/€ mark, which was more or less the community standard used for calculating the card prices. By that, I mean that the fiat value of 50/500/5000 MCO (as CRO) is actually not far from the 250/2500/25'000 $/€ that the community commonly used as a standard when calculating the ROI and (under)valuation of MCO.
So CDC is at least trying to give us the option to get (some of) our money back, and not at an unfair rate. If you happened to buy MCO at a price higher than this, I can't see how that's CDC's fault, just as I don't see anyone blaming Bitcoin or altcoins for getting them stuck at the top of the 2017 bubble burst.
I read many posts in this subreddit calling this a "backstab" and a "betrayal" of early investors and of the people who "believed in MCO". Emotionally, I share your sentiment. But after thinking about it for a while, I'd say this was actually very rewarding for early investors and long term MCO supporters. As CDC clearly states in the swap rules, nobody is going to lose their card tier or MCO stake benefits (at least not yet), and your stake DOES NOT unstake automatically after 180 days. Actually, so far they never did unstake automatically; you had to manually unstake yourself.
With this in mind, everyone who already got their cards, or at least staked MCO to reserve one, basically got them 3-5 times cheaper than future users; and IMHO, now the $/€ price of cards feels more fair and sustainable compared to their benefits. So in a sense, everyone who supported and believed in MCO for its utility (i.e. the card and app benefits) has been greatly rewarded with perks that they get to keep, but which are now out of reach for a lot of people. Likewise, the people who believed and invested in CRO (for whatever reason) have also been rewarded, as their CRO tokens now have more utility.
So either the price of CRO crashes down to around 0.05 $/€, or the people who bought MCO/CRO early or cheap are now massively benefited. But then again, so is everyone who bought or mined Bitcoin in its early days, or invested in Bitcoin at crucial points of its history... how is that unfair? Some people bought Ethereum at 1'400 $ on a mix of hopes/promises that it would continue to rise; it didn't. And even today with DeFi and ETH 2.0 ever closer, it is still far from that price.
And I know what some of you are thinking: "The cards aren't available in my country yet, that's why I didn't buy/stake." Well, they weren't available in my country either when I staked 50 MCO. Heck, the cards weren't available in anyone's country when MCO started, but many people still bought it and staked it. That's exactly what "early adopter", "long supporter" and "believing in MCO" means.
On the other hand, the people who invested in MCO as a speculative asset and decided to HODL and hoard MCO, hoping for its price to moon so they could sell MCO at a big profit, had their dreams mercilessly crushed by this swap... and good lord, I feel their pain. But this is also where I'll commit the sin of being judgemental, because IMHO, speculating on MCO never made any sense to me; MCO was a utility token, not a value token, so it should not (and could not) ever be worth more than the value of its utility. That's basically how stablecoins and PAXG are able to stay stable; nobody will pay more/less than the value of the asset/service they represent.
Though now that I'm looking at the new card stake tiers in CRO, I have to give credit to the MCO hodlers I just criticised; maybe you were right all along. Unless the price of CRO crashes or corrects, in which case, I un-rest my case.
One thing I'll agree with everyone on, though, is that I feel CDC just sucker-punched its community. Because even if we have no vote in its decisions (which, again, we aren't necessarily entitled to, since they are a private and centralized business), they should/could have warned well in advance that this was in their plans; if anything, to allow those who wouldn't like it to exit this train calmly.
Also the CRO stake duration reset. The mandatory reset of your CRO stake for taking advantage of the early swap bonus feels like another gut-punch.
-----------------------------------------------------------------------------------------------------------------------------------------------------
Now that we've got the emotional feelings out of the way, here's my sentiment about how this will affect the overall CDC ecosystem.
One common criticism of the sustainability of MCO was that its supply cap could never allow a large number of cards to be issued, and the question of how CDC could keep paying the cashbacks and rebates. In the opposite corner, one of the major criticisms of the sustainability of CRO was its ridiculously huge supply cap and the inflation caused by the gradual un-freezing and release of more CRO into the system.
But now that MCO and CRO have become one, it might just have made both issues more sustainable. Now the huge supply cap of CRO makes more sense, as it allows a much larger number of future users to stake for cards (at higher costs, but still). And because most card cashback comes in small parcels, this large supply also ensures that CDC can keep paying said cashbacks for a long time; especially since it can be semi-renewable through the trading fees we pay in CRO.
Before this, the MCO you got as cashback had no use other than selling it for fiat or speculating on its price. But CRO can be used, at the very least, to receive a discount on trading fees. And every time you pay trading fees in CRO or spend CRO on a Syndicate event, some of that CRO goes back to CDC, which they can use to keep paying the cashbacks/rebates.
And keep in mind, the technicalities of CRO can be changed, as well as the perks and utilities it can be used for. So even if this current model doesn't fix everything (which it probably doesn't), it can still be changed to patch problems or expand its use.
Another obvious potentially positive outcome of this is that now CDC only has to focus on one token, which makes it easier to manage and drive its value. People complained that CDC was neglecting MCO in favour of promoting CRO, but now they can focus on both services (cards/exchange) at the same time. Sure, this might not bring much advantage to the common customer, but it's probably a major resource saver and optimizer at the corporate level; which in the long term ultimately benefits its customers.
Much like Ethereum is undergoing major changes to ensure its scalability, the crypto companies themselves also have to change to accommodate the growing number of users, especially as the crypto market and DeFi are growing and becoming more competitive. Business strategies that were once successful became obsolete, and exchanges that once held near-monopolies had to adjust to rising competitors. There is no reason why CDC shouldn't keep up with this, or at least try to.
Point is, the financial markets, crypto or otherwise, are not a status quo haven. And when something is wrong, something has to be changed, even if it costs. The very rise of cryptocurrencies and blockchain, which is why we are here in the first place, is a perfect example of this, as it experiments and provides alternatives to legacy/traditional products and technologies.
Was this the best solution to its current problems? Is this what will protect us as customers from a potentially unsustainable business model? I have no idea.
This change ripped me too from my previously more or less relaxed status quo (the safety of the value of the CRO I bought for cheap), along with late CRO investors who now probably fear for the devaluation of their CRO. To say nothing of the blow this represents for my trust (and, I believe, everyone else's trust) in CDC and its public relations. It's not what CDC did, it's how they did it.
------------------------------------------------------------------------------------------------------------------------------------------------
Whether you actually bothered to read everything I wrote or just skipped it all (can't blame you), I'm eager to hear your opinions and whatever criticisms of my opinions you may have.
If you just want to vent at me, you are welcome too; now you can raise your pitchforks and torches.
submitted by BoilingGarbage to Crypto_com [link] [comments]

Why Osana takes so long? (Programmer's point of view on current situation)

I decided to write a comment about «Why Osana takes so long?» somewhere and what can be done to shorten this time. It turned into a long essay. Here's TL;DR of it:
The cost of never paying down this technical debt is clear; eventually the cost to deliver functionality will become so slow that it is easy for a well-designed competitive software product to overtake the badly-designed software in terms of features. In my experience, badly designed software can also lead to a more stressed engineering workforce, in turn leading higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, due to the complexity in a given codebase, the ability to accurately estimate work will also disappear.
Junade Ali, Mastering PHP Design Patterns (2016)
Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to give one nonetheless. I am not much interested in Yandere Simulator nor in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers, to ensure that they'll never end up in Alex's situation, especially considering that he is definitely not the first one to get himself knee-deep in development hell (do you remember Star Citizen?) and he is definitely not the last one.
On the one hand, people see that Alex works incredibly slowly, the equivalent of, like, one hour per day, comparing it with, say, Papers, Please, the game that was developed in nine months from start to finish by one guy. On the other hand, Alex himself most likely thinks that he works until complete exhaustion each day. In fact, I highly suspect that both those sentences are correct! Because of the mistakes made during the early development stages, which are highly unlikely to be fixed due to the pressure put on the developer right now and due to his overall approach to coding, the cost to add any relatively large feature (e.g. Osana) can be pretty much comparable to the cost of creating a fan game from start to finish. Trust me, I've seen his leaked source code (don't tell anybody about that) and I know what I am talking about. The largest problem in Yandere Simulator right now is its super slow development. So, without further ado, let's talk about how «implementing the low hanging fruit» crippled the development and, more importantly, what would have been an ideal course of action from my point of view to get out of it. I'll try to explain things in the easiest terms possible.
  1. else if's and the lack of any sort of refactoring in general
The most «memey» one. I won't talk about the performance though (the switch statement is not better in terms of performance, it is a myth. If the compiler detects some code that can be turned into a jump table, for example, it will do it, no matter if it is a chain of if's or a switch statement. Compilers nowadays are way smarter than one might think). Just take a look here. I know that it's his older JavaScript code, but, believe it or not, this piece is still present in the C# version relatively untouched.
I refactored this code for you using the C language (mixed with C++ since there's no this pointer in pure C). Take note that the else if's are still there; else if's are not the problem by themselves.
The refactored code is just objectively better for one simple reason: it is shorter, while not being obscure, and now it should be able to handle, say, the Trespassing and Blood case without any input from the developer due to the usage of flags. Basically, the shorter your code, the more you can see on screen without spreading your attention too much. As a rule of thumb, the fewer lines there are, the easier it is for you to work with the code. Just don't overdo it, unless you are going to participate in the International Obfuscated C Code Contest. Let me reiterate:
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.
Antoine de Saint-Exupéry
This is why refactoring — the activity of rewriting your old code so it does the same thing, but does it quicker, in a more generic way, in fewer lines or more simply — is so powerful. In my experience, you can only keep one module/class/whatever in your brain if it does not exceed ~1000 lines, maybe ~1500. Splitting a 17000-line-long class into smaller classes probably won't improve performance at all, but it will make working with parts of this class way easier.
Is it too late now to start refactoring? Of course NO: better late than never.
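Since I can't paste the leaked code here, let me give a made-up C# sketch of what "use flags instead of enumerating every combination" looks like. This is not the author's refactor (mine above was in C) and not Yandere Simulator's real API, just the shape of the idea:

```csharp
using System;

// Instead of one else-if branch per combination (murder + trespassing, murder + blood, ...),
// describe what was witnessed as a set of flags and derive the reaction from them.
[Flags]
enum Witnessed
{
    Nothing      = 0,
    Murder       = 1 << 0,
    BloodyWeapon = 1 << 1,
    Trespassing  = 1 << 2,
    Insanity     = 1 << 3,
}

static class WitnessReaction
{
    // Severity just accumulates per flag; adding a new flag later means one new line here,
    // not another wall of else-if combinations.
    public static int Severity(Witnessed w)
    {
        int severity = 0;
        if (w.HasFlag(Witnessed.Murder))       severity += 100;
        if (w.HasFlag(Witnessed.BloodyWeapon)) severity += 40;
        if (w.HasFlag(Witnessed.Insanity))     severity += 30;
        if (w.HasFlag(Witnessed.Trespassing))  severity += 10;
        return severity;
    }
}
```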
  2. Comments
If you think that because you wrote this code, you'll always easily remember it, I have some bad news for you: you won't. In my experience, one week and that's it. That's why comments are so crucial. It is not necessary to put a ton of comments everywhere, but just a general idea will help you out in the future. Even if you think that It Just Works™ and you'll never ever need to fix it. Time spent writing and debugging one line of code almost always exceeds the time to write one comment in large-scale projects. Moreover, the best code is the code that is self-evident. In the example above, what the hell does (float) 6 mean? Why not wrap it in a constant with a good, self-descriptive name? Again, it won't affect performance, since the C# compiler is smart enough to silently remove this constant from the real code and place its value into the method invocation directly. Such constants are there for you.
I rewrote my code above a little bit to illustrate this. With those comments, you don't have to remember your code at all, since its functionality is outlined in two tiny lines of comments above it. Moreover, even a person with zero knowledge in programming will figure out the purpose of this code. It took me less than half a minute to write those comments, but it'll probably save me quite a lot of time of figuring out «what was I thinking back then» one day.
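A tiny made-up example of the same idea (the name and the meaning of the value are invented; the point is the pattern):

```csharp
// Before: a magic number nobody will remember the meaning of in a week.
//     if (distanceToTarget < (float)6) { ... }

// After: the intent is in the name and in one line of comment,
// and the compiler inlines the constant anyway, so it costs nothing at runtime.
static class VisionSettings
{
    // Maximum distance (in meters; the value here is invented) at which a student notices the player.
    public const float NoticeRadius = 6f;
}

//     if (distanceToTarget < VisionSettings.NoticeRadius) { ... }
```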
Is it too late now to start adding comments? Again, of course NO. Don't be lazy and redirect all your typing from the «debunk» page (which pretty much does the opposite of debunking, but who am I to judge you here?) into some useful comments.
  3. Unit testing
This is often neglected, but consider the following. You wrote some code, you ran your game, you saw a new bug. Was it introduced right now? Is it a problem in your older code which has only shown up now because you have never actually exercised that code until now? Where should you search for it? You have no idea, and you have one painful debugging session ahead. Just imagine how much easier it would be if you had some routines which automatically execute after each build and check that the environment is still sane and nothing broke on a fundamental level. This is called unit testing, and yes, unit tests won't be able to catch all your bugs, but even getting 20% of bugs identified at an earlier stage is a huge boon to development speed.
Is it too late now to start adding unit tests? Kinda YES and NO at the same time. Unit testing works best if it covers the majority of the project's code. On the other hand, a journey of a thousand miles begins with a single step. If you decide to start refactoring your code, writing a unit test before refactoring will help you prove to yourself that you have not broken anything, without the need to run the game at all.
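For illustration, here's a minimal NUnit-style test against the hypothetical WitnessReaction sketch from the refactoring section above (I'm assuming NUnit here; any test framework works the same way):

```csharp
using NUnit.Framework;

[TestFixture]
public class WitnessReactionTests
{
    [Test]
    public void MurderPlusBloodyWeapon_IsWorseThanMurderAlone()
    {
        int murderOnly = WitnessReaction.Severity(Witnessed.Murder);
        int murderAndBlood = WitnessReaction.Severity(Witnessed.Murder | Witnessed.BloodyWeapon);

        // If a refactor ever breaks how flags combine, this fails immediately after the build.
        Assert.Greater(murderAndBlood, murderOnly);
    }

    [Test]
    public void NothingWitnessed_HasZeroSeverity()
    {
        Assert.AreEqual(0, WitnessReaction.Severity(Witnessed.Nothing));
    }
}
```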
  4. Static code analysis
This is basically pretty self-explanatory. You set this thing up once, you forget about it. A static code analyzer is another piece of «free real estate» to speed up the development process by finding tiny little errors, mostly silly typos (do you think that you are good enough at finding them? Well, good luck catching x << 4; in place of x <<= 4; buried deep in C code by eye!). Again, this is not a silver bullet; it is another tool which will help you out with debugging a little bit, along with the debugger, unit tests and other things. You need every little bit of help here.
Is it too late now to hook up static code analyzer? Obviously NO.
  5. Code architecture
Say, you want to build Osana, but then you decided to implement some feature, e.g. Snap Mode. By doing this you have maybe made your game a little bit better, but what you have just essentially done is complicated your life, because now you should also write Osana code for Snap Mode. The way game architecture is done right now, easter eggs code is deeply interleaved with game logic, which leads to code «spaghettifying», which in turn slows down the addition of new features, because one has to consider how this feature would work alongside each and every old feature and easter egg. Even if it is just gazing over one line per easter egg, it adds up to the mess, slowly but surely.
A lot of people mention that the developer should have been doing it in an object-oriented way. However, there is no silver bullet in programming. It does not matter that much whether you are doing it the object-oriented way or the usual procedural way; you can theoretically write, say, AI routines in a functional language (e.g. LISP) or even a logic language if you are brave enough (e.g. Prolog). You can even invent your own tiny programming language! The only thing that matters is code quality and avoiding the so-called shotgun surgery situation, which plagues Yandere Simulator from top to bottom right now. Is there a way of adding a new feature without interfering with your older code (e.g. by creating a child class which will encapsulate all the things you need)? Go for it, this feature is basically «free» for you. Otherwise you'd better think twice before doing it, because you are going into «technical debt» territory, borrowing your time from the future by saying «I'll maybe optimize it later» and «a thousand more lines probably won't slow me down in the future that much, right?». Technical debt will incur interest of its own that you'll have to pay. Basically, the entire situation around Osana right now is just a huge tale about how the mere «interest» incurred by technical debt can control the entire project, like the tail wagging the dog.
I won't elaborate here further, since it'll take me an even larger post to fully describe what's wrong about Yandere Simulator's code architecture.
Is it too late to rebuild the code architecture? Sadly, YES, although it should be possible to split the Student class into descendants by using hooks for individual students. However, the code architecture can still be improved by a vast margin if you start removing easter eggs and features like Snap Mode that currently bloat Yandere Simulator. I know it is going to be painful, but it is the only way to improve code quality here and now. This will simplify the code, and this will make it easier for you to add the «real» features, like Osana or whatever you'd like to accomplish. If you ever want them back, you can track them down in the Git history and re-implement them one by one, hopefully without performing shotgun surgery this time.
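To show what I mean by hooks/child classes keeping features out of the core logic, here's a made-up C# sketch (again, not the real codebase, just the pattern):

```csharp
using System.Collections.Generic;

// Hypothetical core class: it knows nothing about easter eggs or Snap Mode.
class Student
{
    private readonly List<IStudentBehaviourHook> hooks = new();

    public void AddHook(IStudentBehaviourHook hook) => hooks.Add(hook);

    public void Update(float deltaTime)
    {
        // ... core AI: pathing, schedule, witness checks ...
        foreach (var hook in hooks)
            hook.OnUpdate(this, deltaTime);
    }
}

// Every optional feature/easter egg lives behind this interface instead of inside Student.Update().
interface IStudentBehaviourHook
{
    void OnUpdate(Student student, float deltaTime);
}

// Removing Snap Mode becomes "delete this one file", not shotgun surgery across a 17000-line class.
class SnapModeHook : IStudentBehaviourHook
{
    public void OnUpdate(Student student, float deltaTime)
    {
        // Snap Mode specific logic stays isolated here.
    }
}
```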
  6. Loading times
Again, I won't be talking about the performance, since you can debug your game on 20 FPS as well as on 60 FPS, but this is a very different story. Yandere Simulator is huge. Once you fixed a bug, you want to test it, right? And your workflow right now probably looks like this:
  1. Fix the code (unavoidable time loss)
  2. Rebuild the project (can take a loooong time)
  3. Load your game (can take a loooong time)
  4. Test it (unavoidable time loss, unless another bug has popped up via unit testing, code analyzer etc.)
And you can fix it. For instance, I know that Yandere Simulator generates all the students' photos during loading. Why should that be done there? Why not either move it to the project building stage by adding a build hook so Unity does that for you during a full project rebuild, or, even better, why not disable it completely or replace the photos with «PLACEHOLDER» text for debug builds? Each second spent watching the loading screen will be rightfully interpreted as «son is not coding» by the community.
Is it too late to reduce loading times? Hell NO.
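A made-up sketch of the "placeholder in debug builds" idea, using a custom define symbol (all the type and member names are invented stand-ins, not the real code):

```csharp
using System.Collections.Generic;

// All the names here are hypothetical; the point is the #if split between dev and release builds.
class StudentInfo
{
    public string Name = "";
    public object Portrait;   // would be a Texture2D in Unity
}

static class StudentPortraits
{
    public static void GenerateAll(IEnumerable<StudentInfo> students)
    {
#if DEBUG_FAST_LOAD   // a custom scripting define symbol, enabled only in dev builds
        // Debug builds: skip the expensive per-student renders entirely.
        foreach (var s in students)
            s.Portrait = "PLACEHOLDER";
#else
        // Release builds (or a dedicated build step) pay the real cost once.
        foreach (var s in students)
            s.Portrait = RenderPortrait(s);
#endif
    }

    static object RenderPortrait(StudentInfo s) => $"portrait-of-{s.Name}";   // stand-in for the slow render
}
```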
  7. Jenkins
Or any other continuous integration tool. «Rebuild a project» can take a long time too, and what can we do about that? Let me give you an idea. Buy a new PC. Get a 32-core Threadripper, 32 GB of the fastest RAM you can afford and a cool motherboard which would support all of that (of course, a Ryzen/i5/Celeron/i386/Raspberry Pi is fine too, but the faster, the better). The rest is not necessary, e.g. a barely functional second hand video card burned out by bitcoin mining is fine. You set up another PC in your room. You connect it to your network. You set up a ramdisk to speed things up even more. You properly set up Jenkins on this PC. From now on, Jenkins takes care of the rest: tracking your Git repository, the (re)building process, large and time-consuming unit tests, invoking the static code analyzer, profiling, generating reports and whatever else you can and want to hook up. More importantly, you can fix another bug while Jenkins is rebuilding the project for the previous one, et cetera.
In general, continuous integration is a great technology to quickly track down errors that were introduced in previous versions, attempting to avoid those kinds of bug hunting sessions. I am highly unsure if continuous integration is needed for 10000-20000 source lines long projects, but things can be different as soon as we step into the 100k+ territory, and Yandere Simulator by now has approximately 150k+ source lines of code. I think that probably continuous integration might be well worth it for Yandere Simulator.
Is it too late to add continuous integration? NO, albeit it is going to take some time and skills to set up.
  8. Stop caring about the criticism
Stop comparing Alex to Scott Cawthon. IMO Alex is very similar to the person known as SgtMarkIV, the developer of Brutal Doom, who is also a notorious edgelord who, for example, also once told somebody to kill himself, just like… However, despite being a horrible person, SgtMarkIV does his job. He simply does not care much about public opinion. That's the difference.
  9. Go outside
Enough said. Your brain works slower if you only think about games and if you can't provide it with enough oxygen supply. I know that this one is probably the hardest to implement, but…
That's all, folks.
Bonus: Do you think how short this list would have been if someone just simply listened to Mike Zaimont instead of breaking down in tears?
submitted by Dezhitse to Osana [link] [comments]

List of current UI/UX issues & possible QoL improvements (Megathread?)

As some of you know, I only make stupidly long posts and also like to humbly brag about being a software engineer with fairly decent experience in QA, automated testing and testing in general (6+ years a C# dev).
This is my personal list of things that either make no sense, are unpleasant, incoherent, or could be improved.
Please feel free to add to the list, I will come back and edit every day.
Numbers are also here to help you quote & provide your own criticism.
Note that this is done with the following optimization mindset, in order of importance:
As you can see, I worked under the assumption that the average player wants to spend more time in raid rather than in the inventory; obviously this falls apart if that is not the case. To do that, I try to reduce time spent on searching / arranging things without creating unnecessary automation or removing important/immersive aspects of the game, even in the inventory. I also try to reduce time spent clicking through various windows, as currently a lot of them are built to be fast & easy for the devs, not for the players. I want to emphasize that I'm okay with that. I know the importance of having sub-optimal navigation to help you find out what your better navigation is. I also know a complete rework is not always possible, which is why I made my list without changing too much of the menus, as well as keeping the vibe/current feel of those menus.
Keywords like should & could are used as intended; since this is not a professional report, I'm emphasizing it here: the meaning of the word matters. Should means it is an improvement over an existing issue, could means it's a possible improvement but requires further investigation. Would means investigation was done and this is just one possible outcome, usually relevant within the context.
Please note that most of us now are very used to the current UI/UX, which will generate two reactions:
- "It's fine as it is because I can do it quite fast."
- "I don't want it to change again, I'm used to it now."
I cannot emphasize enough how inefficient it is to let those emotions get the best of you. UI/UX is the study of common sense & ease of use in an interface. You should never have to get used to anything; it should be fluid and intuitive. If you think you're fast now, that means it's possible to be slow. This is extremely bad from a UI/UX standpoint. Everybody should be able to navigate/understand the menus just as fast the 1st time as the 100th time (ideally). Keep this in mind when you read everything down here, because some stuff you probably won't like at first glance, but you will get used to it very fast, and you will gain a lot of time in the future, as will new players.

1. Autostacking of items

Money & Ammo. When a stackable item or stack of items enter an inventory, it should autostack itself to an available non-full stack, then fill other available stacks until there aren't any. At that point, the item should just go at the top of the inventory as it is doing now. Autostacking should *not* browse for sublayers of inventory.
Items drag & dropped on an inventory slot should not be auto-stacked either (drag & drop overrides autostacking).
It would autostack when control clicking, or using "Receive all" from another inventory, or when dropping into a sublayer without selecting a specific slot.
Autostacking should only stack FiR items together and non FiR items together.

Example 1

Drag & Dropping would not stack in the same inventory layer. Drag & dropping would override auto stacking.

Dragging over the money case would auto stack in the inventory of the case.

Using Ammo as example here. If you drag & drop directly on a slot (even in an inventory sublayer), you would override autostacking.
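To pin down the stacking rule described above, here's a small C# sketch with made-up types (not game code, just the rule as I propose it):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical inventory model, only here to make the rule unambiguous.
class ItemStack
{
    public string ItemId = "";
    public bool FoundInRaid;
    public int Count;
    public int MaxStack;
}

static class AutoStacker
{
    // Distribute an incoming stack over existing non-full stacks of the same item
    // and the same FiR status (FiR never mixes with non-FiR), within one inventory layer only.
    // Whatever is left over is returned so the caller can place it at the top of the inventory as today.
    public static int AutoStack(List<ItemStack> inventoryLayer, ItemStack incoming)
    {
        int remaining = incoming.Count;
        foreach (var target in inventoryLayer.Where(s =>
                     s.ItemId == incoming.ItemId &&
                     s.FoundInRaid == incoming.FoundInRaid &&
                     s.Count < s.MaxStack))
        {
            int moved = Math.Min(remaining, target.MaxStack - target.Count);
            target.Count += moved;
            remaining -= moved;
            if (remaining == 0) break;
        }
        return remaining;   // > 0 means a new stack still has to be created at the top
    }
}
```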

2. Highlighting of full stacks

Stacks at full capacity could be highlighted for easier inventory management.
Many aspects could be used to highlight (either the name of the item, or the value, or the background of the cell)

Apologies for the poor Photoshop skills

This could be a highlighting method

This could be a highlighting method

3. Consistent item order in hideout craft list

Currently in the workbench (and I think other stations too, though now I'm not sure), the list of craftable items appears to be random. The order should always be the same, for consistency. Having to "look for the recipe" every time does not provide a meaningful gameplay experience.

4. Collecting crafts

Hideout stations could display the finished craft on top for easy collection, or there could be a "Get Items" or "Receive All" elsewhere to avoid unnecessary scrolling. This is unnecessary if ongoing crafts are moved to the top of the list, or if the list is autoscrolled to the ongoing craft.
"Collect All" on station level is not the best idea. If you go in a station, it's probably better that you know what you're collecting. I suggest moving the relevant craft on top or auto scrolling and not adding "receive all" on station level, although it would be a good help.
This should be investigated.

Receive All or Get Items could be moved or added at the top or bottom of the window.

5. "Receive All" could exist at hideout level

The same way we "receive all" from a trader, it would be nice to "Receive all" from the hideout. Either in the form of a trader (in which we can receive all / pick manually from) or by instantly putting it in inventory. If there is enough space it just works. If there isn't, it displays an error like it already does.
This is not mutually exclusive with the previous suggestion.

6. Display craft readiness/collection

6.1 Hideout
The current behaviour is partially coherent. You get notified when an item is sold, and you get notified when a craft is finished.
You have a display notification "Attachment" style when a trader has something for you, and you should get a display notification "attachment" style when the hideout has something for you.
Ideally, there should also be such notifications for currently unused stations.


Receive all on the right, Nutrition unit has finished crafting and Lavatory is currently NOT crafting

6.2 Traders
There should be a way of knowing if something is waiting in a trader's inventory on a global level (quest rewards, money, insurance, unsold market item returns), like the notification. The "new item" notification could stay visible as long as items are in the trader inventories, compared to now, where it disappears as soon as you either click it or visit the messenger. In this hypothesis, there could be a change of color in the notification to show that there are still items waiting, including some that haven't been seen yet (to still fulfill the current role of the notification).

7. CTA's

Note : CTA = Call to Action, it's the button your user will press 99.3% of the time. Example, in the launcher, it's the "Start Game". Clearly visible, easily accessible, highlighted, much bigger, and at a very common CTA spot. That one is great.
Some others are not.
7.1. "Receive All" should not be displayed when there is nothing to receive.
7.2 "Get" in single transaction messages from Ragman could be removed. There is no reason to take single items from the window when you can receive it all at once.
7.3 A "group collect" Receive all action could be added when you click on the attachment notification, or as an extra action next to the notification (just like shown on the Hideout in figure 6.0) that would specifically collect all. it would loop through all conversations and collect all and dump at the top of stash, either until its finished or there is not enough room, in which case it displays an error. It could also work like the scav case and not pick up anything until you have room, and in that case you would go in the window manually and/or make room (like we do now).
7.3 The "Receive all" is at the bottom when most CTAs in the game is at the top (dealer tabs, market tabs, character sheet tabs, settings...). Save in the settings is at the bottom too. It is incoherent. It would make more sense to have all CTAs at the bottom and options/tabs/menus at the top.
7.4 The "DEAL" button in trader view is much smaller and less visible than the "Fill Item" checkbox. The CTA should be getting more attention than a setting. New players pretty much *never* see it first and look around the "Fill Item" with eyes & mouse.

DEAL should be at the bottom in the current "Fill Item" box. Fill Items should be removed entirely.
7.6 Quests could be automatically accepted (no need for a CTA). I don't see a reason why someone would not accept a quest. The only reason we're accepting them now is to let the user know he has a new quest. There are other means of notifying players of new stuff: usually notifications. If not, that button should at least be more visible/highlighted. Every new player ( 100%! ) I coach does not see it at first and never looks at the right spot the first time.
7.7 "Insure All" is the most commonly used button in the insurance screen and could be emphasized more.

Example 7.6

8. Remove "Fill Items"

The "Fill Items" checkbox, which automatically fills the trader's requirements, should be removed and its behaviour made the default. There is no need to fill items manually, nor to tell the game to do so.

9. Expire / Delete pending requests

Friend requests should be cancellable and could expire. Requests should not be stuck until another user acts on them. Right clicking the request could display a "Cancel" or "Delete" request button.
Ideally, the cell should include a CTA on the right, as the only action I would ever do in a cell in this context is cancel.
Opening a submenu with only 1 item means you should not be opening a submenu, but displaying a button where the user right clicked instead.

I can only re-send a friend request to someone that already denied me. This is incoherent.

10. Market Rows

From my somewhat small sample (about 60 players), nobody uses the expand button on the top right of a cell (see below). I think everybody uses the right click on item instead.

An expanded cell with context menu opened, and a collapsed cell
The extra information available on the right is the exact same as a right click, but is hidden behind a left click. This is incoherent.
The only difference is the profile picture that I only get from expanding, but currently we all have the same one. This would need to be investigated.

This could be an improvement, displaying the CTA's immediately (although BUY is definitely way too small). Notice profile picture on the left
10.1 The expandable cell feature should be removed altogether, as the other options are available on right click.
10.2 The whole row should provide the same context menu (right click).
10.3 The "Send friend request" could be included in the row's context menu, or could be removed entirely, as right now most requests are missclicks. Adding the Send Friend Request at the bottom of the context menu on the row would reduce the amount of missclicks.
10.4 Left clicking should not open the context menu. This is mostly the reason behind missclick friend requests, people double clicking slightly off the item icon sending a friend request by mistake. Now I have 4 just because I was trying to make a screenshot. F's in the chat. This would be resolved with 10.2 and 10.3.
10.5 Barter items have a "Barter" icon that is redundant, the first and second column are completely irrelevant to the player.

Example 10.5

11. Filtering search

11.1 "Filter by Item" should not filter the browse list. If you're writing a valid keyword in the search field it should display the correct suggestions. Filtering content is good, filtering suggestions is incoherent.
11.2 Filters could be cleared as soon as you type text in the search field. This would resolve 11.1

Example 11
11.3 "My Offers" could not be affected by filters, or could reset filters. It is more trouble to remove the filter manually every time rather than browsing through the offer list. Currently we never have more than ~10 offers at the same time for most players, which is okay to display without filter.
11.4 Filters should not overlap with other UI elements, they could be resized to fit or the expandable filter list could include more elements so the visible ones fit.

Example 11.4
11.5 The Remember Selected Filter / Reset Filter is unclear. Looks great, feels weird, and should be investigated to be more useful.

12. Context Menu in player lobby


The current lobby with context menu open

All players in this list are looking for a group, so there is no need to write a "Looking for group" status; it's redundant. The exception is friends, which 99.633...% of the time is the group I'm about to play with. Those are displayed on top.
The only action we perform on this list is the "Invite to group" context menu action. It's a CTA and should not be hidden in a context menu, especially if the context menu only has one option. Recently it gained a second one, but we'll come to that in a minute.
12.1 The invite CTA should be on the player cell itself.
12.2 The report action should not be the default one from the context menu
12.3 Since there could be only one item in the context menu according to 12.1, the report action could be on the cell as well.

A low quality suggestion for 12.x

13. Trader Buy/Sell

The trader screen needs to be reworked. I won't provide a full solution here, because it can't be done without completely changing how everything looks/works, which, as I stated at the start of the post, I want to avoid. That being said, this should be improved.


Example 13.1
13.1 Buying UI should be reworked.
When buying, the price of the item is already displayed on the item itself in the trader view.
The price is also displayed a second time in the tooltip of the item if you mouse-over.
The price is also displayed a third time in the barter area on the right of the image (middle of the screen in game). This is redundant. I understand the item on the right is the physical item "Roubles" in a stack that is paid, like a barter, but it does not need to be displayed a third time.
13.2 The quantity limit (red box in the image) could be shown in the tooltip; most of the time people will hit "DEAL" until they get an error instead of actually reading the red box.
13.3 The red box looks like an error even when at 0/x, which is not intuitive. Limited items can be listed in different ways that are not so invasive. We could add "out of X" on the right side of the quantity box.
13.4 Barter item prices (if we assume 13.1) would also need to be displayed differently. This needs to be investigated.
13.5 Selling UI should be reworked

Example 13.5

Currently, selling an item still displays the full list of items available to BUY; this is incoherent, especially from the "Sell" tab. The whole left side of the screen is wasted and cannot be used.
13.6 Items on the left are not greyed out (even though I can't buy them), but items on the right are greyed out (because I can't sell them). This is incoherent.
13.7 Trader sell space should be infinite
13.8 Buy/Sell could be done in a single tab if the whole screen is reworked. There are different levels to this. An easy one I could think :
"Trade" Tab instead of "Buy". Displays the same as the current "BUY" tab. If you ctrl+click an item from your stash, it instantly sells without confirmation. The second tab would be a "Buyback" where you can see what you sold in the current trade session. If you leave the screen your buyback is reset and items cannot be recovered. Another way would be to keep buybacks for the last X items. You would need to pay what you received to get back. The item would not lose it's FiR status. This preleminary and simplistic rework has issues, notably that you have to know to right click to sell. One way to fix that would be to make right click sell to trader instead of control click, but that would definitely make missclicks the first few days (and buyback would be mandatory).
This could be investigated.
13.9 Currency exchange rates should be easily available in relevant areas (Peacekeeper, Therapist and flea market) for all currencies (Rouble, Euro, Dollar, Bitcoin)

14. Boxing

Items should be boxable and movable around. At the very least they should be droppable into boxes, ideally movable around freely.
There is a limit of 20 images. 🤷‍♂
Example 14

15. Quest inventory

If you loot too many quest items in a raid, you can end up not being able to loot the next one. I assume this is by design and it is why you have limited quest item space.
The quest inventory could be infinite, if it's not limited by design.
The quest inventory should be manageable. In my case I had a 1 slot item blocking me from taking the suitcase; I should have been allowed to move that 1 slot item to the top or to the right of my inventory, clearing a whole line and letting me take the case.
Quest items could be stored in backpack (and resized) ; since you lose them on death it's not relevant to the players looting you or you dying and that issue would be gone. Storing it in your stash would also prevent you from losing it by going in raid with it by mistake. Taking it in raid or giving it to trader would be a volontary action. It also makes much more sense that way as other quest items (that are also usable items) work that way.


Alright, this ended up taking more time than my lunch break, and there is *much* more to write, but for the time being I'll leave it at that and come back tomorrow to add your suggestions or mine. See you in 24 hours.
submitted by SixOneZil to EscapefromTarkov [link] [comments]

Bitcoin endgame

Hi everybody, I am still getting started in the world of cryptocurrencies, focusing on understanding how bitcoin works. I chose bitcoin because it is the most established cryptocurrency and the one with the highest chance of becoming a full-fledged currency worldwide, used by everyone (at least that's what I think at the moment, I might be wrong here). I know that the practical limit for bitcoin is 21 million bitcoins and each bitcoin can be divided into 100 million satoshi. My question is, assuming bitcoin takes over as a single global currency and everyone is using it, isn't the total amount of satoshi too small? I mean, if you split the total amount of bitcoin between everyone in the world, each person receives a relatively small amount of bitcoin (I did a rough estimate, and from my estimates, each person would have the equivalent of 30€ in bitcoin).
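For anyone who wants to check the arithmetic, this is roughly the back-of-the-envelope I mean (the population and BTC price are assumptions and shift the € figure around, but you land in the same tens-of-euros ballpark):

```csharp
// Assumed numbers: ~7.8 billion people and ~9,000 € per BTC; both are rough and move the result.
const double totalBitcoin = 21_000_000;
const double satoshiPerBitcoin = 100_000_000;
const double worldPopulation = 7_800_000_000;
const double euroPerBitcoin = 9_000;

double satoshiPerPerson = totalBitcoin * satoshiPerBitcoin / worldPopulation;  // ≈ 269,000 sats each
double euroPerPerson = totalBitcoin * euroPerBitcoin / worldPopulation;        // ≈ 24 € each at that price
System.Console.WriteLine($"{satoshiPerPerson:N0} sats ≈ {euroPerPerson:N2} € per person");
```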
Thanks in advance for dedicating the time to this weird question :)
EDIT:
Thank you all for the answers. Like it has been said, I agree that the main obstacle for bitcoin to become the one currency on a global scale is politics, because no one, in this case governments, would like to lose control of their money.
With the increase in market cap and the subdivision of satoshis (a part that I was unaware of), bitcoin could be used as the global currency because, at the same time, bitcoin would have enough "value" to represent the global economy and would be divisible enough to reach a reasonable value for cheap stuff like a bottle of water.
The main technical issue that I see at the moment is the difference from fiat in how bitcoin is stored (hardware wallets) and transferred between two entities (addresses and private keys). For me it is something that I am starting to understand, but I think it would be close to impossible for the majority of people that are older / not so tech oriented. I haven't yet bought bitcoin, just got a bit of exposure to it using Revolut and decided to explore it on a deeper level.
One other thing that was mentioned was that subdividing satoshis is similar to "printing" money and would lead to inflation. I understand why this is being said, because creating money or dividing the current supply into smaller units can be seen as having the same overall effect. I think that the key difference in the division of satoshis is that it is not controlled by a central authority. For example, if a new base unit that corresponded to 1/100 of a satoshi was created, everyone would be affected equally. When money is printed by a central bank or government, they are increasing their wealth by making everyone else poorer, since they are increasing the percentage of the money they have (note that I am not an economist and this explanation is probably flawed).
submitted by AlexDRibeiro to BitcoinBeginners [link] [comments]

Major Stories In Crypto This Week

Major Stories In Crypto This Week
https://preview.redd.it/1oaizrm3yho51.jpg?width=1280&format=pjpg&auto=webp&s=e8870de366cc62eaad247a7e8ca18252cb6da19b
Waiting for ETH fees to become cheaper
From now on, every Monday we will be doing a weekly news digest where we will be discussing the biggest stories on the crypto market over the previous 7 days. Why are we doing that? It’s simple — we want to create a useful news outlet for our community members’ convenience.
So, what events marked the third week of September?
Analysts predicting a Bitcoin price rally
“The whale exchange ratio is at the lowest level of the year — the fewer whales moving to the exchanges, the fewer spills and [a] higher BTC price”, said CryptoQuant CEO Ki Young Ju on Thursday. This indicates a possible price rally soon: the less coins whales send to exchanges, the less chance there is for the price to dump, therefore, the higher the chance for a price rally. This is proven by experience: the last time this figure dropped from the current level in April 2019, Bitcoin price grew from $4000 to $13000 over a few months.
FORSAGE.io — how is this information useful to us? It’s important to remember that whenever the first cryptocoin’s price goes up, altcoins invariably follow suit. This means that if BTC shoots up in value, the possibility of TRX and ETH going up also increases, and that’s great news for the entire FORSAGE community.
Is Ethereum still going the Proof-of-Stake route?
A leading Ethereum developer, Danny Ryan, published the official Ethereum Improvement Proposal EIP-2982, which proposes the launch of Ethereum 2.0 and the switch to Proof-of-Stake. If his proposal gets passed, it may be implemented into the network, solving the issue of high commission fees.
FORSAGE.io — how is this information useful to us? The high commission fees on the ETH network have been slowing down the growth of the FORSAGE community for the past month, because a price of tens or even hundreds of dollars per transaction is unacceptable to most people. We are all looking forward to the day this issue is solved.
DeFi projects boom
DeFi projects are becoming more and more popular — for example, Uniswap, a decentralized exchange, saw its token rise in value by 75% after listing on the world's leading exchanges, putting it among the top 50 assets by market capitalization.
FORSAGE.io — how is this information useful to us? It is DeFi projects and their popularity that catalyzed the unprecedented increase in transactions on the Ethereum network, driving the price of transactions further up. After hovering above 100 Gwei for the past few weeks, the price of gas for Ethereum transactions jumped all the way to 700 Gwei on September 17th. This is why we are very excited about the previous news story — the launch of Ethereum 2.0.
Over 10% of all crypto payments take place in Eastern Europe
Chainalysis has published another study that shows 12% of all cryptocurrency transactions that took place from July 2019 to June 2020 happened in Eastern Europe! This means the region is the fourth biggest market in terms of transaction volume, and while it is out-grossed by the giants that are the US and China, the region is growing quickly.
FORSAGE.io — how is this information useful to us? The Eastern European region is developing, in part, driven by the increasing commitment of many countries to decentralization and economic freedom. This positively affects the local levels of crypto activity. Let’s find out where the majority of the FORSAGE community comes from! Leave a comment with your country below!
submitted by Forsage_io to ico [link] [comments]

Cryptocurrencies in the Era of COVID-19 (Part One)

Cryptocurrencies in the Era of COVID-19 (Part One)

https://preview.redd.it/cscwryttr4o51.jpg?width=2560&format=pjpg&auto=webp&s=ddd90997810c0cc46cf8e6b5cac534cd8f9c796f
To speak of “post-COVID” is not only premature, but perpetuates the myth that the mere passage of time will lead to some kind of universal recovery. The reality is rather more harsh. Currently, the only positive dynamic at work is that the patient will learn to cope with the symptoms of a congenital condition, until, and if, the underlying problem can be resolved. While we would prefer otherwise, this is the Era of COVID.
The opening up of Europe’s Mediterranean tourist industry in the summer of 2020 was always going to increase the rate of COVID transmission, but the experiment was justified in terms of local economic dependency on foreign visitors vis-a-vis the health costs, the degree of disease impact, and overly testing the limits of voluntary social distancing.
From the perspective of the pathogen, however, absolutely nothing has changed. In terms of global polity, economic policy and social welfare, everything has changed, is changing, and may well end up creating scenarios out of all recognition.
Critical to appreciating the “why?” of this reorientation is the recognition that only a raft of temporary, but wholly unsustainable macroeconomic policies, have kept the global economy functioning. The problem, however, is that it is a bit like cheating a wise man. You only get away with it once. Thereafter you have to accept realities and manage how they play out as best as you can.
Central to the latter is the fact that until a vaccine is developed, ours is the era of socio-economic COVID-19 management. All other determinations derive from where they stand in regards this polarity; the spread of the disease on the one part and the damage done to the global economy on the other. The balance between lives and livelihoods. In reality the two are not finally distinct. The acceptance of higher COVID-19 infection will have economic costs both over the short and long term. The worry is that these could be far, far greater than many currently anticipate. Critically, that those people with mild or no symptoms today, could develop significant health problems in their tens of millions as they get older. That the virus lays dormant at a cellular level but surfaces to cause physical problems in the future, negatively impacting the functioning of vital organs, including the brain. As this happens the economic costs will become significant.
To restate. Temporary economic measures funded by quantitative easing have allowed the global economy to maintain a degree of normalcy, but over time these will inevitably weaken the economy they were designed to protect. Similarly, the temporary relief of putting short-term spending needs on the credit card eventually crashes into the wall of maximised indebtedness. The consequence is either the hardship of paying back what has been borrowed, or simply walking away from the debt and being cut off from credit thereafter.
The last time the global economy faced anything like this level of catastrophic dialectic was after the two world wars. For the people of Germany and France, coins and banknotes were minted with ever greater numbers of zeros but ever reduced buying power. In the end these currencies were simply abandoned—replaced with the Reichsmark and nouveau franc respectively. The former at a rate of one trillion (sic) to one! Stability resulted, but, it must be underscored, only because the printing presses were turned off.
The trick was to introduce a medium of exchange whose physical number was very tightly defined and limited. As long as the temptation to cheat when you run out of money is resisted, all will be well. All this may prefigure a nouveau dollar or a digital yuan, or an altogether different scenario may unfold.
This is where the current locus of speculation—financial and theoretical— currently lies.
Any considerations in these respects need to take into account the following factors as delimiting the parameters of probable outcomes:
  • Structural shifts in global economic activity away from travel, leisure, tourism, some automotive and manufacturing towards health, security, robotics, datacom and a range of advanced technologies. This not only portends shifts in investment between sectors, but more graphically, shifts in wealth between regions and nations.
  • Growing tensions within the European Union. With many of the southern states so highly dependent on tourism, significantly declining income will further exacerbate the north-south wealth gap, and thus tensions over budgetary redistribution.
  • Structural shifts in global geo-politics and trade away from multilateralism towards bilateralism, supply chain security, high-tech protectionism and hegemonic alliances.
  • A new era of Western statism necessary to reduce the threat of a severe economic depression. This will be directed to enhanced infrastructure projects, support for advanced, green and digital technologies, new strategies on preventative and remote health care, and internal security and surveillance.
  • Social acceptance of greater government intrusion and regulation as the price of minimising the impact of COVID, future pandemic threats and economic downturn.
More important than any of these is the underlying shift towards new orthodoxies at the expense of tearing up the old order. This not only includes the fundamentals of government macroeconomic theory (and thus policy) but the rules underpinning all commercial and currency infrastructures. “Fundamental” because the three are inextricably linked, yet autonomous enough for one to affect the others with a potential impact so dramatic it is difficult to overstate.
These paradigms are so new, and their final impact so remote, that the most significant element of their existence is easily missed: A year ago such a narrative would have been viewed as sheer lunacy. A year from now so obvious as to merit an historical footnote. Emerging from the rabbit hole everything will be different. Everything is up in the air and everyone is scrambling to find an anchor.
In the meantime, popular investment ethos is myopic, entirely oblivious to the undercurrents which will mark the end of the status quo. Somewhere along the line, a soaring stock market has become an end in itself. Wealth, the mere addition of fiat zeros.
The intention of the original cryptocurrency was to sidestep this fallacy. To extricate and preserve real wealth from constantly shifting foundations. Like all ideals, it has been imperfectly realised. No one can deny that the meteoric rise in Bitcoin’s value from $327 to almost $12,000 (at the time of writing) reflects some degree of speculation, but it also reflects substantive, intelligibly based doubts as to the fundamentals sustaining fiat currencies. They may still exist in five or ten years, but what will they tangibly be worth?
Eventual outcomes here—including which cryptocurrencies prove their worth —will be determined by our collective actions. History reveals that whatever divergences take place, in the end the solid and substantial always win out. Lies are exposed and tyranny eventually falls. Shaky assets yield to solid. Bad money drives good to a premium.
(Subsequent additions to this article will examine critical factors determining the path of cryptocurrency evolution in the era of COVID as these arise, including government regulations).
submitted by JamesFXF to FXF [link] [comments]

[ Bitcoin ] Technical: Taproot: Why Activate?

Topic originally posted in Bitcoin by almkglor [link]
This is a follow-up on https://old.reddit.com/Bitcoin/comments/hqzp14/technical_the_path_to_taproot_activation/
Taproot! Everybody wants it!! But... you might ask yourself: sure, everybody else wants it, but why would I, sovereign Bitcoin HODLer, want it? Surely I can be better than everybody else because I swapped XXX fiat for Bitcoin unlike all those nocoiners?
And it is important for you to know the reasons why you, o sovereign Bitcoiner, would want Taproot activated. After all, your nodes (or the nodes your wallets use, which, if you are on SPV, you can hopefully pester your wallet vendor or implementor about) need to be upgraded in order for Taproot activation to actually succeed instead of becoming a hot sticky mess.
First, let's consider some principles of Bitcoin.
I'm sure most of us here would agree that being in control of your own money, being able to trade and contract with anyone, and financial privacy are very important principles of Bitcoin, and that these are principles we would not be willing to remove. If anything, we would want those principles strengthened (especially the last one, financial privacy, which current Bitcoin is only sporadically strong with: you can get privacy, it just requires effort to do so).
So, how does Taproot affect those principles?

Taproot and Your Coins

Most HODLers probably HODL their coins in singlesig addresses. Sadly, switching to Taproot would do very little for you (it gives a mild discount at spend time, at the cost of a mild increase in fee at receive time (paid by whoever sends to you, so if it's a self-send from a P2PKH or bech32 address, you pay for this); mostly a wash).
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash, so the Taproot output spends 12 bytes more; spending from a P2WPKH requires revealing a 32-byte public key later, which is not needed with Taproot, and Taproot signatures are about 9 bytes smaller than P2WPKH signatures, but the 32 bytes plus 9 bytes is divided by 4 because of the witness discount, so it saves about 11 bytes; mostly a wash, it increases blockweight by about 1 virtual byte, 4 weight for each Taproot-output-input, compared to P2WPKH-output-input).
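If you want to sanity-check that arithmetic, here is a minimal sketch that simply reproduces the post's approximate figures (the 32-byte and 9-byte numbers above are the author's rough values, not exact serialization sizes):

```python
# Reproducing the post's approximate numbers (not exact serialization sizes).

WITNESS_DISCOUNT = 4                     # witness bytes count as 1/4 vbyte

extra_output_bytes = 32 - 20             # Taproot key vs P2WPKH hash: +12 non-witness bytes

# Spending P2WPKH later reveals a ~32-byte pubkey, and its signature is
# ~9 bytes bigger than a Schnorr signature; both live in the witness.
witness_bytes_saved_at_spend = 32 + 9
vbytes_saved_at_spend = witness_bytes_saved_at_spend / WITNESS_DISCOUNT  # ~10.25

net_vbytes = extra_output_bytes - vbytes_saved_at_spend
print(f"net change per output+input: about {net_vbytes:+.2f} vbytes")
# -> roughly +1.75 vbytes: "mostly a wash", slightly larger for Taproot
```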
However, as your HODLings grow in value, you might start wondering if multisignature k-of-n setups might be better for the security of your savings. And it is in multisignature that Taproot starts to give benefits!
Taproot switches to using Schnorr signing scheme. Schnorr makes key aggregation -- constructing a single public key from multiple public keys -- almost as trivial as adding numbers together. "Almost" because it involves some fairly advanced math instead of simple boring number adding, but hey when was the last time you added up your grocery list prices by hand huh?
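To get a feel for why aggregation is "almost as trivial as adding numbers", here is a toy sketch. It deliberately uses modular exponentiation instead of the real secp256k1 curve, and it skips the anti-rogue-key protections a production scheme such as MuSig2 adds, so treat it purely as an illustration of the linearity that Schnorr-style aggregation relies on:

```python
# Toy illustration of key aggregation linearity -- NOT secp256k1, NOT MuSig2.
# Real schemes add per-key coefficients to block rogue-key attacks.

p = 2**127 - 1     # a prime modulus for the toy group (arbitrary choice)
g = 5              # base element of the toy group

def pubkey(priv: int) -> int:
    # On Bitcoin this would be priv * G on the secp256k1 curve;
    # here it is g**priv mod p, which has the same linear structure.
    return pow(g, priv, p)

alice_priv, bob_priv = 123456789, 987654321

# The aggregate public key can be computed from the public keys alone...
agg_from_pubs = (pubkey(alice_priv) * pubkey(bob_priv)) % p

# ...and it corresponds to the simple *sum* of the private keys.
agg_from_privs = pubkey(alice_priv + bob_priv)

assert agg_from_pubs == agg_from_privs
print("aggregation really is just 'adding numbers' under the hood")
```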
With current P2SH and P2WSH multisignature schemes, if you have a 2-of-3 setup, then to spend, you need to provide two different signatures from two different public keys. With Taproot, you can create, using special moon math, a single public key that represents your 2-of-3 setup. Then you just put two of your devices together, have them communicate to each other (this can be done airgapped, in theory, by sending QR codes: the software to do this is not even being built yet, but that's because Taproot hasn't activated yet!), and they will make a single signature to authorize any spend from your 2-of-3 address. That's 73 witness bytes -- 18.25 virtual bytes -- of signatures you save!
And if you decide that your current setup with 1-of-1 P2PKH / P2WPKH addresses is just fine as-is: well, that's the whole point of a softfork: backwards-compatibility; you can receive from Taproot users just fine, and once your wallet is updated for Taproot-sending support, you can send to Taproot users just fine as well!
(P2WPKH and P2WSH -- SegWit v0 -- addresses start with bc1q; Taproot -- SegWit v1 --- addresses start with bc1p, in case you wanted to know the difference; in bech32 q is 0, p is 1)
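As a small illustration of that q/p detail, the sketch below reads the witness version straight off the first data character of a mainnet address using the bech32 character set. It does no checksum validation and the addresses in it are placeholder strings, not real addresses:

```python
# Sketch only: infer the SegWit version from the first data character of a
# mainnet address. No checksum validation; placeholder addresses below.

BECH32_CHARSET = "qpzry9x8gf2tvdw0s3jn54khce6mua7l"   # 'q' -> 0, 'p' -> 1

def witness_version(address: str) -> int:
    addr = address.lower()
    assert addr.startswith("bc1"), "mainnet bech32/bech32m addresses only"
    return BECH32_CHARSET.index(addr[3])

print(witness_version("bc1q" + "x" * 38))   # 0 -> SegWit v0 (P2WPKH / P2WSH)
print(witness_version("bc1p" + "x" * 58))   # 1 -> SegWit v1 (Taproot)
```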
Now how about HODLers who keep all, or some, of their coins on custodial services? Well, any custodial service worth its salt would be doing at least 2-of-3, or probably something even bigger, like 11-of-15. So your custodial service, if it switched to using Taproot internally, could save a lot more (imagine an 11-of-15 getting reduced from 11 signatures to just 1!), which --- we can only hope! --- should translate to lower fees and better customer service from your custodial service!
So I think we can say, very accurately, that the Bitcoin principle --- that YOU are in control of your money --- can only be helped by Taproot (if you are doing multisignature), and, because P2PKH and P2WPKH remain validly-usable addresses in a Taproot future, will not be harmed by Taproot. Its benefit to this principle might be small (it mostly only benefits multisignature users) but since it has no drawbacks with this (i.e. singlesig users can continue to use P2WPKH and P2PKH still) this is still a nice, tidy win!
(even singlesig users get a minor benefit, in that multisig users will now reduce their blockchain space footprint, so that fees can be kept low for everybody; so for example even if you have your single set of private keys engraved on titanium plates sealed in an airtight box stored in a safe buried in a desert protected by angry nomads riding giant sandworms because you're the frickin' Kwisatz Haderach, you still gain some benefit from Taproot)
And here's the important part: if P2PKH/P2WPKH is working perfectly fine with you and you decide to never use Taproot yourself, Taproot will not affect you detrimentally. First do no harm!

Taproot and Your Contracts

No one is an island, no one lives alone. Give and you shall receive. You know: by trading with other people, you can gain expertise in some obscure little necessity of the world (and greatly increase your productivity in that little field), and then trade the products of your expertise for necessities other people have created, all of you thereby gaining gains from trade.
So, contracts, which are basically enforceable agreements that facilitate trading with people who you do not personally know and therefore might not trust.
Let's start with a simple example. You want to buy some gewgaws from somebody. But you don't know them personally. The seller wants the money, you want their gewgaws, but because of the lack of trust (you don't know them!! what if they're scammers??) neither of you can benefit from gains from trade.
However, suppose both of you know of some entity that both of you trust. That entity can act as a trusted escrow. The entity provides you security: this enables the trade, allowing both of you to get gains from trade.
In Bitcoin-land, this can be implemented as a 2-of-3 multisignature. The three signatories in the multisignature would be you, the gewgaw seller, and the escrow. You put the payment for the gewgaws into this 2-of-3 multisignature address.
Now, suppose it turns out neither of you are scammers (whaaaat!). You receive the gewgaws just fine and you're willing to pay up for them. Then you and the gewgaw seller just sign a transaction --- you and the gewgaw seller are 2, sufficient to trigger the 2-of-3 --- that spends from the 2-of-3 address to a singlesig the gewgaw seller wants (or whatever address the gewgaw seller wants).
But suppose some problem arises. The seller gave you gawgews instead of gewgaws. Or you decided to keep the gewgaws but not sign the transaction to release the funds to the seller. In either case, the escrow is notified, and it can sign with you to refund the funds back to you (if the seller was a scammer) or sign with the seller to forward the funds to the seller (if you were a scammer).
Taproot helps with this: as mentioned above, it allows multisignature setups to produce only one signature, reducing blockchain space usage, and thus making contracts --- which by definition require multiple people; you don't make contracts with yourself --- cheaper (which we hope enables more of these setups to happen, for more gains from trade for everyone; also, moon and lambos).
(technology-wise, it's easier to make an n-of-n than a k-of-n, making a k-of-n would require a complex setup involving a long ritual with many communication rounds between the n participants, but an n-of-n can be done trivially with some moon math. You can, however, make what is effectively a 2-of-3 by using a three-branch SCRIPT: either 2-of-2 of you and seller, OR 2-of-2 of you and escrow, OR 2-of-2 of escrow and seller. Fortunately, Taproot adds a facility to embed a SCRIPT inside a public key, so you can have a 2-of-2 Taprooted address (between you and seller) with a SCRIPT branch that can instead be spent with 2-of-2 (you + escrow) OR 2-of-2 (seller + escrow), which implements the three-branched SCRIPT above. If neither of you are scammers (hopefully the common case) then you both sign using your keys and never have to contact the escrow, since you are just using the escrow public key without coordinating with them (because n-of-n is trivial but k-of-n requires setup with communication rounds), so in the "best case" where both of you are honest traders, you also get a privacy boost, in that the escrow never learns you have been trading on gewgaws, I mean ewww, gawgews are much better than gewgaws and therefore I now judge you for being a gewgaw enthusiast, you filthy gewgawer).
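To make that parenthetical a bit more concrete, here is an illustrative (non-cryptographic) layout of the escrow trade as a Taproot output: a key-path spend for the happy case and two hidden script leaves for the dispute cases. All names are made up for illustration; this is not real wallet code:

```python
# Illustrative layout only (plain data, no real cryptography, made-up names):
# the gewgaw trade as a Taproot output with a cooperative key path and two
# hidden dispute branches.

escrow_trade_output = {
    # Key path: aggregated 2-of-2 of buyer + seller. If both are honest,
    # one ordinary-looking signature spends the coin and the escrow never
    # even learns the trade existed.
    "internal_key": "aggregate(buyer_pubkey, seller_pubkey)",

    # Script path: revealed only if a dispute forces one of these branches.
    "script_leaves": [
        "2-of-2 checksig: buyer_pubkey  + escrow_pubkey  (refund the buyer)",
        "2-of-2 checksig: seller_pubkey + escrow_pubkey  (pay the seller)",
    ],
}

for leaf in escrow_trade_output["script_leaves"]:
    print("dispute branch:", leaf)
```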

Taproot and Your Contracts, Part 2: Cryptographic Boogaloo

Now suppose you want to buy some data instead of things. For example, maybe you have some closed-source software in trial mode installed, and want to pay the developer for the full version. You want to pay for an activation code.
This can be done, today, by using an HTLC. The developer tells you the hash of the activation code. You pay to an HTLC, paying out to the developer if it reveals the preimage (the activation code), or refunding the money back to you after a pre-agreed timeout. If the developer claims the funds, it has to reveal the preimage, which is the activation code, and you can now activate your software. If the developer does not claim the funds by the timeout, you get refunded.
And you can do that, with HTLCs, today.
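For clarity, here is the HTLC logic mirrored in plain Python rather than Bitcoin Script: pay the developer if they reveal the preimage, or let the buyer reclaim the funds after the timeout. All names and values are illustrative only:

```python
import hashlib
import time
from typing import Optional

# The HTLC conditions in plain Python (real HTLCs are Bitcoin Script; this
# only mirrors the logic described above).

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def can_spend_htlc(spender: str,
                   claimed_preimage: Optional[bytes],
                   agreed_hash: bytes,
                   timeout_unix: int,
                   now_unix: int) -> bool:
    # Branch 1: the developer claims the funds by revealing the preimage.
    if spender == "developer" and claimed_preimage is not None:
        return sha256(claimed_preimage) == agreed_hash
    # Branch 2: the buyer reclaims the funds after the agreed timeout.
    if spender == "buyer":
        return now_unix >= timeout_unix
    return False

activation_code = b"EXAMPLE-ACTIVATION-CODE"   # the preimage the buyer wants
code_hash = sha256(activation_code)            # what the developer shares up front

# Developer reveals the code -> gets paid; the buyer thereby learns the code.
assert can_spend_htlc("developer", activation_code, code_hash,
                      timeout_unix=1_700_000_000, now_unix=int(time.time()))
```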
Of course, HTLCs do have problems: the hash and its preimage end up visible onchain, so anyone scraping the blockchain can recognize these contracts for what they are, and can link together any payments that reuse the same hash.
Fortunately, with Schnorr (which is enabled by Taproot), we can now use the Scriptless Script construction by Andrew Poelstra. This Scriptless Script allows a new construction, the PTLC or Pointlocked Timelocked Contract. Instead of hashes and preimages, just replace "hash" with "point" and "preimage" with "scalar".
Or as you might know them: a "point" is really a public key and a "scalar" is really a private key. What a PTLC does is that, given a particular public key, the pointlocked branch can be spent only if the spender reveals to you the private key behind that given public key.
Another nice thing with PTLCs is that they are deniable. What appears onchain is just a single 2-of-2 signature between you and the developer/manufacturer. It's like a magic trick. This signature has no special watermarks, it's a perfectly normal signature (the pledge). However, from this signature, plus some data given to you by the developer/manufacturer (known as the adaptor signature), you can derive the private key of a particular public key you both agree on (the turn). Anyone scraping the blockchain will just see signatures that look just like every other signature, and as long as nobody manages to hack you and get a copy of the adaptor signature or the private key, they cannot get the private key behind the public key (point) that the pointlocked branch needs (the prestige).
(Just to be clear, the public key you are getting the private key from is distinct from the public key that the developer/manufacturer will use for its funds. The activation key is different from the developer's onchain Bitcoin key, and it is the activation key whose private key you will be learning, not the developer's/manufacturer's onchain Bitcoin key.)
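A heavily simplified sketch of the adaptor-signature bookkeeping follows. It shows only the scalar relation (completed signature = adaptor signature + secret, modulo the group order), with toy numbers standing in for real Schnorr signatures, nonces and challenges:

```python
# Scalar bookkeeping of an adaptor signature, heavily simplified: no nonces,
# no challenge hashing, no curve points -- just the one relation that matters.
# Toy values throughout; n stands in for the signature group order.

n = 2**127 - 1                       # stand-in group order (toy value)

activation_secret = 31337            # the scalar behind the agreed "point"
partial_sig = 424_242_424_242        # stand-in for the seller's signature share

adaptor_sig = (partial_sig - activation_secret) % n   # given to the buyer up front
completed_sig = partial_sig % n                       # what later appears onchain

recovered = (completed_sig - adaptor_sig) % n         # buyer recovers the secret
assert recovered == activation_secret
print("buyer learned the activation secret from two ordinary-looking numbers")
```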
So:
Taproot lets PTLCs exist onchain because it enables Schnorr, which is a requirement of PTLCs / Scriptless Script.
(technology-wise, take note that Scriptless Script works only for the "pointlocked" branch of the contract; you need normal Script, or a pre-signed nLockTimed transaction, for the "timelocked" branch. Since Taproot can embed a script, you can have the Taproot pubkey be a 2-of-2 to implement the Scriptless Script "pointlocked" branch, then have a hidden script that lets you recover the funds with an OP_CHECKLOCKTIMEVERIFY after the timeout if the seller does not claim the funds.)
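Continuing the illustrative layout from the escrow example, such an output could be pictured roughly like this, with the pointlocked 2-of-2 on the key path and the timelocked refund hidden as a script leaf (opcode names spelled out as strings; the block height is made up):

```python
# Illustrative only (made-up names and block height): the PTLC-style output
# with the pointlocked 2-of-2 on the key path and a timelocked refund leaf.

REFUND_HEIGHT = 850_000   # hypothetical timeout block height

ptlc_output = {
    "internal_key": "aggregate(buyer_pubkey, seller_pubkey)",   # key path
    "script_leaves": [
        # Refund path: after the timeout the buyer can sweep the funds alone.
        [REFUND_HEIGHT, "OP_CHECKLOCKTIMEVERIFY", "OP_DROP",
         "buyer_pubkey", "OP_CHECKSIG"],
    ],
}

print("hidden refund leaf:", ptlc_output["script_leaves"][0])
```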

Quantum Quibbles!

Now if you were really paying attention, you might have noticed this parenthetical:
(technical details: a Taproot output is 1 version byte + 32 byte public key, while a P2WPKH (bech32 singlesig) output is 1 version byte + 20 byte public key hash...)
So wait, Taproot uses raw 32-byte public keys, and not public key hashes? Isn't that more quantum-vulnerable??
Well, in theory yes. In practice, they probably are not.
It's not that hashes can be broken by quantum computers --- they still can't be. Instead, you have to look at how you spend from a P2PKH / P2WPKH pay-to-public-key-hash output.
When you spend from a P2PKH / P2WPKH, you have to reveal the public key. Then Bitcoin hashes it and checks if this matches with the public-key-hash, and only then actually validates the signature for that public key.
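Sketched in Python, the order of operations looks roughly like this (the signature check itself is left abstract; whether hashlib exposes 'ripemd160' depends on the local OpenSSL build):

```python
import hashlib

# Order of operations when spending a pay-to-public-key-hash output.
# The signature check is left abstract; 'ripemd160' availability in hashlib
# depends on the local OpenSSL build.

def hash160(data: bytes) -> bytes:
    # HASH160(x) = RIPEMD160(SHA256(x))
    return hashlib.new("ripemd160", hashlib.sha256(data).digest()).digest()

def validate_p2pkh_spend(revealed_pubkey: bytes,
                         stored_pubkey_hash: bytes,
                         signature_is_valid: bool) -> bool:
    # Step 1: the spender has to reveal the public key; hash and compare it.
    if hash160(revealed_pubkey) != stored_pubkey_hash:
        return False
    # Step 2: only now is the signature checked against that public key.
    return signature_is_valid

pubkey = b"\x02" + b"\x11" * 32            # placeholder 33-byte compressed key
print(validate_p2pkh_spend(pubkey, hash160(pubkey), signature_is_valid=True))
```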
So an unconfirmed transaction, floating in the mempools of nodes globally, will show, in plain sight for everyone to see, your public key.
(public keys should be public, that's why they're called public keys, LOL)
And if quantum computers are fast enough to be of concern, then they are probably fast enough that, in the several minutes to several hours from broadcast to confirmation, they have already cracked the public key that is openly broadcast with your transaction. The owner of the quantum computer can now replace your unconfirmed transaction with one that pays the funds to itself. Even if you did not opt in to RBF, miners are still incentivized to support RBF on RBF-disabled transactions.
So the extra hash is not as significant a protection against quantum computers as you might think. Instead, the extra hash-and-compare needed is just extra validation effort.
Further, if you have ever, in the past, spent from the address, then there exists already a transaction indelibly stored on the blockchain, openly displaying the public key from which quantum computers can derive the private key. So those are still vulnerable to quantum computers.
For the most part, the cryptographers behind Taproot (and Bitcoin Core) are of the opinion that quantum computers capable of cracking Bitcoin pubkeys are unlikely to appear within a decade or two.
So:
For now, the homomorphic and linear properties of elliptic curve cryptography provide a lot of benefits --- particularly the linearity property is what enables Scriptless Script and simple multisignature (i.e. multisignatures that are just 1 signature onchain). So it might be a good idea to take advantage of them now while we are still fairly safe against quantum computers. It seems likely that quantum-safe signature schemes are nonlinear (thus losing these advantages).

Summary

I Wanna Be The Taprooter!

So, do you want to help activate Taproot? Here's what you, mister sovereign Bitcoin HODLer, can do!

But I Hate Taproot!!

That's fine!

Discussions About Taproot Activation

almkglor your post has been copied because one or more comments in this topic have been removed. This copy will preserve unmoderated topic. If you would like to opt-out, please send a message using [this link].
submitted by anticensor_bot to u/anticensor_bot [link] [comments]

  • Bitcoin Q&A: What Will Be The Effect Of The 2020 Halving?
  • Factors that Determine the Price of Bitcoin?
  • What makes Bitcoin and other CryptoCurrencies go up in value?
  • INTERESTING!!! The current Bitcoin Price Pattern happened ...
  • Bitcoin Halving Explained Simple - Does it Affect Bitcoin ...

  • Why Bitcoin Value Fluctuates. ... Universal access is an example of what affects the price of Bitcoin. Lower fees; ... Thus, it is clear that the value of Bitcoin is based on current market demand. Bitcoin mining is becoming steadily more difficult, resulting in low supply in the market. Just like gold, acquiring Bitcoin in the initial days ...
  • That puts its total value at around $108.8 billion. While at first ordinary people could mine thousands of bitcoins, potentially now worth millions of pounds, bitcoin mining now requires a huge ...
  • Bitcoin Halving. In the backdrop of these economic conditions, the Bitcoin halving event has become even more interesting for investors and financial experts. All eyes are on what specific impact the Bitcoin halving will have on the future prices of the digital currency, which is the largest in terms of market capitalization.
  • How halving events also impact the value of crypto. Currently the bitcoin network rewards miners with 12.5 bitcoin every ten minutes. However, this started at 50 and has gone through a halving event every four years to bring it to the current number.
  • The current account balance is another major factor that affects the value of a currency. In simple terms, the current account balance is the total amount of goods, services, income and currency transfers of a nation with the rest of the world. Having a positive current account balance means that a country lends more to the world than it borrows.


Bitcoin Q&A: What Will Be The Effect Of The 2020 Halving?

  • Start trading Bitcoin and cryptocurrency here: http://bit.ly/2Vptr2X Every 4 years on average (210K blocks) the reward granted to Bitcoin miners for adding a...
  • The current Bitcoin price volatility is at a yearly low. What did this mean for the Bitcoin price in the past? Also in this episode: Generation Z is starting...
  • We all know Bitcoin is a roller coaster of price changes, but have you ever wondered what determines the value of Bitcoin? Today Maria walks you through how the value of bitcoin constantly changes ...
  • How will the halving affect the price of bitcoin? In this clip, Andreas examines the effect of the coming halving. These questions are from the January and February monthly patron session, which ...
  • How might Bakkt affect the price of Bitcoin BTC? What's going on with Litecoin LTC and Ripple XRP? These are some of today's topics covered by Mattie. ----- CHECK OUT OUR PODCAST: https://bit.ly ...
