Sidechain - Bitcoin Wiki

d down, k up, everybody's a game theorist, titcoin, build wiki on Cardano, (e-)voting, competitive marketing analysis, Goguen product update, Alexa likes Charles, David hates all, Adam in and bros in arms with the scientific counterparts of the major cryptocurrency groups, the latest AMA for all!

Decreasing d parameter
I just signed the latest change management document. I was the last in the chain, so I signed it today, changing the d parameter from 0.52 to 0.5. That means we are just about to cross the threshold for d to fall below 0.5, which means more than half of all the blocks will be made by the community and not the OBFT nodes. That's a major milestone, and at the current rate it looks like d will decrement to zero around March, so lots to do and lots to talk about. There's a product update two days from now and we'll go ahead and talk about that, but it crossed my desk today and I was really happy and excited about it. It seems like yesterday that d was equal to one and people were complaining that we delayed it by an epoch, and now we're almost at 50 percent. For those of you who want parameter-level changes, k-level changes, they are coming. There's an enormous internal conversation about it, and we've written up a PowerPoint presentation and a philosophy document about why things were designed the way they're designed.
Increasing k parameter and upcoming security video and everybody's a game theorist
My chief scientist has put an enormous amount of time into this. Aggelos is very passionate about this particular topic, and what I'm going to do is similar to the security video I did, where I gave an hour-and-a-half discussion about best practices for security. I'm going to do a screencast video where I talk about this philosophy document; I'll read the entire document with annotations with you guys and talk through it. It might end up being quite a long video, possibly several hours, but I think it's really important to talk through the design philosophy. It's kind of funny: when people see a cryptographic paper or a math paper, they tend to just say, okay, you guys figure that out. No one's an expert in cryptography or math, so you don't really get strong opinions about it. But with game theory, despite the fact that the topics are as complex and in some cases more complex, you tend to get a lot of opinions, and everybody's a game theorist. An enormous amount of thought went into the design of the system and its parameters, everything from the reward functions to other things, and it's very important that we explain that thought process in as detailed a way as possible. Once the philosophy behind it is explained, I feel the community will be in a really good position to start working on the change management. It is my position that I'd love to see k largely increased. I do think the software needs some improvements to get there, especially partial delegation, delegation portfolios, and some enhancements to the operation of staking.
E-voting
I'd love to see the existence of hybrid wallets where you have a cold part and a hot part. We've had a lot of conversations about that and we will present some of the progress in that matter at the product updates, if not this October then certainly in November. There's a lot of commercialization going on, a lot of things flowing around, and the commercial teams are working hard. As I mentioned, we have a lot of deals in the pipeline. The Wyoming event was half political, half sales. We were really looking into e-voting and we had very productive conversations along those lines. It is my goal that Cardano e-voting software is used in political primaries, and my hope is for it eventually to be used in municipal, state and then federal elections, and in national elections for countries like Ethiopia, Mongolia and other places. Now, there is a long, long road to get there and many little victories that have to come first, but this event in Wyoming was kind of the opener into that conversation. There were seven independent parties at the independent national convention and we had a chance to talk to the leadership of many of them. We will also engage in conversation with the Libertarian Party leadership, and at the very least we could talk about e-voting and blockchain-based voting for primaries, which would be a great start, and we'll also look into the state of Wyoming for that as well. We'll tell you guys about that in time. We've already gotten a lot of inquiries about e-voting software; we tend to get them along with the (Atala) Prism inquiries. It's actually quite easy to start conversations, but there are a lot of security properties that are very important, like end-to-end verifiability, hybrid ballots where you have both a digital and a paper ballot, delegation mechanics, as well as privacy mechanics that are interesting on a case-by-case basis.
Goguen, voting, future fund3, competitive marketing analysis of Ouroboros vs. EOS, Tezos, Algorand, ETH2 and Polkadot, new creative director
We'll keep chipping away at that. There's a lot of Goguen stuff to talk about, but I'm going to reserve all of that for the product update two days from now. We're right in the middle of it; Goguen metadata was the very first part, we already have some commercialization platform as a result of metadata, more to come, and then obviously lots of smart contract stuff to come. This update and the November update are going to be very Goguen focused, with a lot of alternatives as well. We're still on schedule for an HFC event in, I think, November or December, I can't remember, but that's going to carry a lot of things related to multisig and token locking. There are some ledger rule changes, so it has to be an HFC event, and that opens up a lot of the windows for Goguen foundations as well as voting on chain, so fund3 will benefit very heavily from that. We're right in the guts of Daedalus right now building the voting center, the identity center and the QR-code work. It's a lot of stuff. The cell phone app was released last week as an early beta; it'll go through a lot of rapid iterations and we'll update it every few weeks. Google Play is a great foundation to launch things on because it's so easy to push updates to people automatically, so you can rapidly iterate and be very agile in that framework. We've already had 3,500 people involved heavily in the innovation management platform IdeaScale, and we've got numerous bids from everyone, from John Buck and the sociocracy movement to others. A lot of people want to help us improve that and we're going to see steady and systematic growth there. We're still chipping away at product marketing. Liza (Horowitz) is doing a good job; I meet with her two or three times a week and right now it's Ouroboros, Ouroboros, Ouroboros... We're doing competitive analysis of Ouroboros versus EOS, Tezos, Algorand, ETH2 and Polkadot. We think that's a good set and we have a really good way of explaining it. David (David Likes Crypto, now at IOHK) has already made some great content. We're going to release that soon alongside some other content, and we'll keep chipping away at that.
We also just hired a creative director for IO Global. His name's Adam, an incredibly experienced creative director; he's worked for Mercedes-Benz and dozens of other companies, he does very good work and he's been doing this for well over 20 years. The very first set of things he's going to do is work with commercial and marketing on product marketing, in addition to building great content, where the hope is to make that content as pretty as possible. We have Rod heavily involved in that as well, to talk about distribution channels and see if we can amplify the distribution message and really get a lot of stuff done. Last thing to mention, oh yeah, iOS for Catalyst. We're working on that. We submitted it to the Apple App Store, but it takes a little longer to get approval there than it does with Google Play, so it's been submitted and now it's a question of whether and when Apple approves it. It takes a little longer for cryptocurrency stuff.
Wiki shizzle and battle for crypto, make crypto articles on wiki great again, Alexa knows Charles, Everipedia meets Charles podcast, holy-grail land of Cardano, wiki on Cardano, titcoin
Wikipedia... we kind of rattled the cage a little bit. Through an intermediary we got in contact with Jimmy Wales. Larry Sanger, the other co-founder, also reached out to me, and the Everipedia guys reached out to me. Here's where we stand: we have an article, it has solidified, and it's currently labeled as unreliable, saying you should not believe the things said in it, which is David Gerard's work if you look at the edits. We will work with the community and try to get that article to a fair and balanced representation of Cardano, especially after the product marketing comes through. Once we clearly explain the product, I think the Cardano article can be massively strengthened. I've told Rod to work with some specialized people to try to get that done, but we are also going to work very hard at a systematic campaign to improve all of the scientific articles related to blockchain technology in the cryptocurrency space. They're just terrible. If you go to the proof of work article, the proof of stake article or any of these things, they're just terrible: they're not well written, they're out of date and they don't reflect an adequate sampling of the science. I did talk to my chief scientist Aggelos, and what we're going to do is reach out to the scientific counterparts at most of the major cryptocurrency groups that are doing research and see if they want to work with us on an industry-wide effort to systematically improve the scientific articles in our industry, so that they are a fair and balanced representation of what the current state of the art is, the criticisms, the trade-offs, as well as the reference base. Obviously we'll do quite well in that respect because we've done the science; we're the inheritors of it. It's a shame, because when people search proof of stake on Google, the Wikipedia results are usually highly biased. We care about Wikipedia because Google cares about Wikipedia and Amazon cares about Wikipedia.
If you ask Alexa who Charles Hoskinson is, the reason Alexa knows is that it's reading directly from the Wikipedia page. If I didn't have a Wikipedia page, Alexa wouldn't know. So if somebody says, Alexa, what is Cardano, it's going to read directly from the Wikipedia page. We can either pretend that reality doesn't exist or we can accept it, and we as a community, working with partners in the broader cryptocurrency community, can universally improve the quality of cryptocurrency pages. There's been a pattern of commercial censorship on Wikipedia for cryptocurrencies in general since Bitcoin itself. In fact, I think the Bitcoin article was actually taken down once, back in, it might have been, 2010 or 2009. Basically Wikipedia has not been a friend of cryptocurrencies. That's why Everipedia exists, and actually their founders reached out to me. I talked to them over Twitter through PMs and we agreed to do a podcast. I'm going to do a StreamYard stream with these guys; they'll come on and talk all about Everipedia, what they do and how they're doing, and we'll go through the challenges they've encountered, how their platform works and so forth. Obviously, if they ever want to leave that terrible ecosystem EOS and come to the holy-grail land of Cardano, we'd be there to help them out. At the very least they can tell the world how amazing their product is and also the challenges they're having to overcome. We've also been in great contact with Larry Sanger.
He's going to do an internal seminar with us at some point and talk about some protocols he's been developing since he left Wikipedia, specifically to decentralize knowledge management and have a truly decentralized encyclopedia. I'm really looking forward to that, and I hope that presentation gives us some inspiration as an ecosystem for things we can do. That's a great piece of infrastructure regardless. After we learn a lot more about it and talk to a lot of people in the ecosystem, if we can't get people to move on over, it would be really good to see people use the dc fund, through IdeaScale and the innovation management platform, to build their own variant of Wikipedia on Cardano. In the coming months there will certainly be funding available. If you guys are so passionate about this particular problem that you want to go solve it, then I'd be happy to play Elon Musk with the hyperloop and write a white paper on a protocol design to give it a good first start, and then you guys can go and try to commercialize that technology as Cardano native assets and Plutus smart contracts, in addition to other pieces of technology that have to be brought in to make it practical.
Right now we're just in a let's-talk-to-everybody phase. We'll talk to the Everipedia guys, we're going to talk to Larry, and we're going to see whoever else is in this game, and of course we have to accept the incumbency as it is. So we're working with the Wikipedia side to improve the quality of not only our article but all of the articles on the scientific side of things, so that there's a fair and accurate representation of information. One of the reasons I'm so concerned about this is that I am very worried Cardano projects will get commercially censored like we were commercially censored. Yes, we do have a page, but it took five years to get there, and we're a multi-billion dollar project with hundreds of thousands of people. If you guys are doing cutting-edge, novel, interesting stuff, I don't want your experience to be the same as ours, where you have to wait five years for your project to get a page even after governments have adopted it. That's absurd; no one should be censored, ever. This is very much a fight for the entire ecosystem, the entire community, not just Cardano but all cryptocurrencies: Bitcoin, Ethereum and Cardano have all faced commercial censorship and article deletions during their tenure, so I don't want you guys to go through that. I'm hoping we can improve that situation, but you don't put all your eggs in one basket, and frankly the time has come for Wikipedia to be fully decentralized and liberated from a centralized organization and from the massively variable quality of the editor base. If Legends of Valor has a page, and titcoin, a pornography coin from 2015 that's deprecated and no one uses, has a page, but Cardano couldn't get one until recently, there's something seriously wrong with the quality control mechanism and we need to improve that. So it'll get done.
submitted by stake_pool to cardano [link] [comments]

Why Osana takes so long? (Programmer's point of view on current situation)

I decided to write a comment somewhere about «Why Osana takes so long?» and what can be done to shorten this time. It turned into a long essay. Here's the TL;DR of it:
The cost of never paying down this technical debt is clear; eventually the cost to deliver functionality will become so slow that it is easy for a well-designed competitive software product to overtake the badly-designed software in terms of features. In my experience, badly designed software can also lead to a more stressed engineering workforce, in turn leading to higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, due to the complexity in a given codebase, the ability to accurately estimate work will also disappear.
Junade Ali, Mastering PHP Design Patterns (2016)
Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to give one nonetheless. I am not much interested in Yandere Simulator nor in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers, to make sure they never end up in Alex's situation, especially considering that he is definitely not the first one to get himself knee-deep in development hell (do you remember Star Citizen?) and he is definitely not the last one.
On the one hand, people see that Alex works incredibly slowly, the equivalent of, like, one hour per day, comparing it with, say, Papers, Please, a game that was developed in nine months from start to finish by one guy. On the other hand, Alex himself most likely feels that he works until complete exhaustion each day. In fact, I highly suspect that both of those sentences are correct! Because of the mistakes made during the early development stages, which are highly unlikely to be fixed due to the pressure put on the developer right now and due to his overall approach to coding, the cost to add any relatively large feature (e.g. Osana) can be pretty much comparable to the cost of creating a fan game from start to finish. Trust me, I've seen his leaked source code (don't tell anybody about that) and I know what I am talking about. The largest problem in Yandere Simulator right now is its super slow development. So, without further ado, let's talk about how «implementing the low hanging fruit» crippled the development and, more importantly, what would have been an ideal course of action from my point of view to get out of it. I'll try to explain things in the easiest terms possible.
  1. else if's and the lack of any sort of refactoring in general
The most «memey» one. I won't talk about the performance though (a switch statement is not better in terms of performance; that is a myth. If the compiler detects some code that can be turned into a jump table, for example, it will do it, no matter whether it is a chain of if's or a switch statement. Compilers nowadays are way smarter than one might think). Just take a look here. I know that it's his older JavaScript code, but, believe it or not, this piece is still present in the C# version relatively untouched.
I refactored this code for you using the C language (mixed with C++, since there's no this pointer in pure C). Note that the else if's are still there; else if's are not the problem in themselves.
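Since the refactored snippet itself is only linked above, here is a minimal, made-up sketch of the same flag-based idea in C; the flag names and reactions are hypothetical and not taken from the actual code:

    #include <stdio.h>

    enum {
        SAW_TRESPASSING = 1 << 0,   /* least severe */
        SAW_BLOOD       = 1 << 1,
        SAW_WEAPON      = 1 << 2,
        SAW_MURDER      = 1 << 3    /* most severe */
    };

    static const char *reactions[] = {
        "carry on",                    /* nothing witnessed          */
        "ask the player to leave",     /* SAW_TRESPASSING is highest */
        "comment on the blood",        /* SAW_BLOOD is highest       */
        "back away nervously",         /* SAW_WEAPON is highest      */
        "run away and call the police" /* SAW_MURDER is highest      */
    };

    /* Pick the reaction for the most severe flag that is set, so a new
     * combination such as Trespassing + Blood needs no extra branch. */
    static const char *react(unsigned flags) {
        int highest = 0;                      /* 0 = nothing witnessed */
        for (int bit = 0; flags >> bit; bit++)
            if (flags & (1u << bit))
                highest = bit + 1;
        return reactions[highest];
    }

    int main(void) {
        printf("%s\n", react(SAW_TRESPASSING | SAW_BLOOD)); /* -> comment on the blood */
        return 0;
    }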
The refactored code is just objectively better for one simple reason: it is shorter, while not being obscure, and now it should be able to handle, say, the Trespassing and Blood case without any input from the developer, thanks to the usage of flags. Basically, the shorter your code, the more you can see on screen without spreading your attention too much. As a rule of thumb, the fewer lines there are, the easier it is to work with the code. Just don't overdo it, unless you are going to participate in the International Obfuscated C Code Contest. Let me reiterate:
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.
Antoine de Saint-Exupéry
This is why refactoring — the activity of rewriting your old code so it does the same thing, but does it quicker, in a more generic way, in fewer lines or more simply — is so powerful. In my experience, you can only keep a module/class/whatever in your brain if it does not exceed ~1000 lines, maybe ~1500. Splitting a 17000-line-long class into smaller classes probably won't improve performance at all, but it will make working with parts of this class way easier.
Is it too late now to start refactoring? Of course NO: better late than never.
  2. Comments
If you think that because you wrote this code you'll always easily remember it, I have some bad news for you: you won't. In my experience, one week and that's it. That's why comments are so crucial. It is not necessary to put a ton of comments everywhere, but just a general idea will help you out in the future, even if you think that It Just Works™ and you'll never ever need to fix it. In large-scale projects, the time spent writing and debugging one line of code almost always exceeds the time needed to write one comment. Moreover, the best code is code that is self-evident. In the example above, what the hell does (float) 6 mean? Why not wrap it in a constant with a good, self-descriptive name? Again, it won't affect performance, since the C# compiler is smart enough to silently remove this constant from the real code and place its value into the method invocation directly. Such constants are there for you.
I rewrote my code above a little bit to illustrate this. With those comments, you don't have to remember your code at all, since its functionality is outlined in two tiny lines of comments above it. Moreover, even a person with zero knowledge of programming will figure out the purpose of this code. It took me less than half a minute to write those comments, but it'll probably save me quite a lot of time figuring out «what was I thinking back then» one day.
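Since the rewritten snippet is only linked, a hypothetical standalone version of the same idea might look like this; the constant name and value are invented for the example:

    #include <stdio.h>

    /* A student notices the player once they are closer than this (in metres). */
    #define NOTICE_DISTANCE_METRES 6.0f

    /* Returns 1 when the player is close enough to be noticed, 0 otherwise. */
    static int player_is_noticed(float distance_to_player) {
        return distance_to_player < NOTICE_DISTANCE_METRES;   /* was: (float) 6 */
    }

    int main(void) {
        printf("%d\n", player_is_noticed(4.2f));   /* 1: within notice range */
        return 0;
    }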
Is it too late now to start adding comments? Again, of course NO. Don't be lazy and redirect all your typing from the «debunk» page (which pretty much does the opposite of debunking, but who am I to judge?) into some useful comments.
  3. Unit testing
This is often neglected, but consider the following. You wrote some code, you ran your game, you saw a new bug. Was it introduced just now? Is it a problem in your older code which has only shown up because you have never actually exercised that code until now? Where should you search for it? You have no idea, and you have one painful debugging session ahead. Just imagine how much easier it would be if you had some routines which automatically execute after each build and check that the environment is still sane and nothing broke on a fundamental level. This is called unit testing, and yes, unit tests won't be able to catch all your bugs, but even getting 20% of bugs identified at an earlier stage is a huge boon to development speed.
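As a sketch of how little ceremony this needs, here is a framework-free example in plain C; the function under test is a made-up stand-in for any small piece of game logic:

    #include <assert.h>
    #include <stdio.h>

    /* Reputation is clamped to the range [-100, +100]. */
    static int clamp_reputation(int value) {
        if (value > 100)  return 100;
        if (value < -100) return -100;
        return value;
    }

    /* Run after every build; if a refactor breaks the clamp, the run aborts
     * here instead of surfacing as a mysterious bug hours later in a play test. */
    static void test_clamp_reputation(void) {
        assert(clamp_reputation(0)    == 0);
        assert(clamp_reputation(250)  == 100);
        assert(clamp_reputation(-999) == -100);
    }

    int main(void) {
        test_clamp_reputation();
        printf("all tests passed\n");
        return 0;
    }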
Is it too late now to start adding unit tests? Kinda YES and NO at the same time. Unit testing works best if it covers the majority of a project's code. On the other hand, a journey of a thousand miles begins with a single step. If you decide to start refactoring your code, writing a unit test before refactoring will help you prove to yourself that you have not broken anything, without the need to run the game at all.
  4. Static code analysis
This is basically pretty self-explanatory. You set this thing up once, and you forget about it. A static code analyzer is another piece of «free real estate» to speed up the development process by finding tiny little errors, mostly silly typos (do you think that you are good enough at finding them? Well, good luck catching x << 4; in place of x <<= 4; buried deep in C code by eye!). Again, this is not a silver bullet; it is another tool which will help you out with debugging a little bit, along with the debugger, unit tests and other things. You need every little bit of help here.
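To illustrate the kind of slip meant here, a made-up two-line example (not from the actual codebase):

    /* The kind of slip that slides past the eye but not past the tooling:
     * the first statement computes a value and throws it away. */
    void scale_up(int *x) {
        *x << 4;    /* no effect: result is discarded -- an analyzer (or -Wall) flags this */
        *x <<= 4;   /* what was actually meant: multiply *x by 16 */
    }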
Is it too late now to hook up a static code analyzer? Obviously NO.
  5. Code architecture
Say you want to build Osana, but then you decide to implement some feature, e.g. Snap Mode. By doing this you have maybe made your game a little bit better, but what you have essentially done is complicate your life, because now you also have to write Osana code for Snap Mode. The way the game architecture is done right now, easter egg code is deeply interleaved with game logic, which leads to code «spaghettifying», which in turn slows down the addition of new features, because one has to consider how each new feature would work alongside every old feature and easter egg. Even if it is just glancing over one line per easter egg, it adds up to the mess, slowly but surely.
A lot of people mention that the developer should have been doing it in an object-oriented way. However, there is no silver bullet in programming. It does not matter that much whether you do it the object-oriented way or the usual procedural way; you could theoretically write, say, AI routines in a functional language (e.g. LISP) or even a logical language if you are brave enough (e.g. Prolog). You could even invent your own tiny programming language! The only thing that matters is code quality and avoiding the so-called shotgun surgery situation, which plagues Yandere Simulator from top to bottom right now. Is there a way of adding a new feature without interfering with your older code (e.g. by creating a child class which encapsulates all the things you need)? Go for it, this feature is basically «free» for you. Otherwise you'd better think twice before doing it, because you are going into «technical debt» territory, borrowing time from the future by saying «I'll maybe optimize it later» and «a thousand more lines probably won't slow me down in the future that much, right?». Technical debt will incur interest of its own that you'll have to pay. Basically, the entire situation around Osana right now is a huge tale about how just the «interest» incurred by technical debt can control the entire project, like the tail wagging the dog.
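Sticking with the C used earlier, the "don't interfere with old code" idea can be sketched as a table of optional feature hooks that the core loop calls blindly; all names here are invented for illustration:

    #include <stddef.h>
    #include <stdio.h>

    typedef struct {
        const char *name;
        void (*on_update)(void);   /* called once per frame if the feature is enabled */
    } GameFeature;

    static void snap_mode_update(void) { printf("snap mode tick\n"); }

    /* New features or easter eggs are added here, and only here. */
    static GameFeature features[] = {
        { "Snap Mode", snap_mode_update },
    };

    static void game_update(void) {
        /* core game logic goes here, untouched by feature additions ... */
        for (size_t i = 0; i < sizeof features / sizeof features[0]; i++)
            features[i].on_update();
    }

    int main(void) { game_update(); return 0; }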
I won't elaborate here further, since it'll take me an even larger post to fully describe what's wrong about Yandere Simulator's code architecture.
Is it too late to rebuild the code architecture? Sadly, YES, although it should be possible to split the Student class into descendants by using hooks for individual students. However, the code architecture can be improved by a vast margin if you start removing easter eggs and features like Snap Mode that currently bloat Yandere Simulator. I know it is going to be painful, but it is the only way to improve code quality here and now. This will simplify the code, and it will make it easier for you to add the «real» features, like Osana or whatever else you'd like to accomplish. If you ever want them back, you can track them down in the Git history and re-implement them one by one, hopefully without performing shotgun surgery this time.
  6. Loading times
Again, I won't be talking about performance, since you can debug your game at 20 FPS as well as at 60 FPS, but this is a very different story. Yandere Simulator is huge. Once you've fixed a bug, you want to test it, right? And your workflow right now probably looks like this:
  1. Fix the code (unavoidable time loss)
  2. Rebuild the project (can take a loooong time)
  3. Load your game (can take a loooong time)
  4. Test it (unavoidable time loss, unless another bug has popped up via unit testing, code analyzer etc.)
And you can fix this. For instance, I know that Yandere Simulator generates all the students' photos during loading. Why should that be done there? Why not either move it to the project building stage by adding a build hook so Unity does it for you during a full project rebuild, or, even better, why not disable it completely or replace it with «PLACEHOLDER» text for debug builds? Each second spent watching the loading screen will be rightfully interpreted as «son is not coding» by the community.
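As a sketch of the idea, expressed in C with a compile-time switch for brevity (the real project would do this in C#/Unity, and DEBUG_BUILD is a made-up flag):

    #include <stdio.h>

    static void generate_student_portrait(int student_id) {
    #ifdef DEBUG_BUILD
        /* Debug builds skip the expensive render and reuse one stock image,
         * shaving the per-student work off every single loading screen. */
        printf("student %d: using PLACEHOLDER portrait\n", student_id);
    #else
        printf("student %d: rendering full portrait...\n", student_id);
        /* expensive camera + render-to-texture work goes here */
    #endif
    }

    int main(void) {
        for (int id = 1; id <= 3; id++)
            generate_student_portrait(id);
        return 0;
    }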
Is it too late to reduce loading times? Hell NO.
  7. Jenkins
Or any other continuous integration tool. «Rebuild the project» can take a long time too, and what can we do about that? Let me give you an idea. Buy a new PC. Get a 32-core Threadripper, 32 GB of the fastest RAM you can afford and a cool motherboard which supports all of that (of course, a Ryzen/i5/Celeron/i386/Raspberry Pi is fine too, but the faster, the better). The rest doesn't matter much; e.g. a barely functional second-hand video card burned out by bitcoin mining is fine. You set up this second PC in your room. You connect it to your network. You set up a ramdisk to speed things up even more. You properly set up Jenkins on this PC. From now on, Jenkins takes care of the rest: tracking your Git repository, the (re)building process, large and time-consuming unit tests, invoking the static code analyzer, profiling, generating reports and whatever else you can and want to hook up. More importantly, you can fix another bug while Jenkins is rebuilding the project for the previous one, et cetera.
In general, continuous integration is a great technology for quickly tracking down errors that were introduced in previous versions, helping you avoid those kinds of bug hunting sessions. I am highly unsure whether continuous integration is needed for projects 10,000-20,000 source lines long, but things can be different as soon as we step into 100k+ territory, and Yandere Simulator by now has approximately 150k+ source lines of code. I think continuous integration might well be worth it for Yandere Simulator.
Is it too late to add continuous integration? NO, although it is going to take some time and skill to set up.
  8. Stop caring about the criticism
Stop comparing Alex to Scott Cawthon. IMO Alex is very similar to the person known as SgtMarkIV, the developer of Brutal Doom, who is also a notorious edgelord who, for example, also once told somebody to kill himself, just like… However, despite being a horrible person, SgtMarkIV does his job. He simply does not care much about public opinion. That's the difference.
  9. Go outside
Enough said. Your brain works slower if you only think about games and if you can't provide it with enough oxygen. I know that this one is probably the hardest to implement, but…
That's all, folks.
Bonus: Just think how short this list would have been if someone had simply listened to Mike Zaimont instead of breaking down in tears.
submitted by Dezhitse to Osana [link] [comments]

Eth 2.0 vs Polkadot and other musings by a fundamental investor

Spent about two hours on this post and I decided it would help the community if I made it more visible. The comment was made as a response to this
I’m trying to avoid falling into a maximalist mindset over time. This isn’t a 100% ETH question, but I’m trying to stay educated about emerging tech.
Can someone help me see the downsides of diversifying into DOTs?
I know Polkadot is more centralized, VC backed, and generally against our ethos here. On chain governance might introduce some unknown risks. What else am I missing?
I see a bunch of posts about how Ethereum and Polkadot can thrive together, but are they not both L1 competitors?
Response:
What else am I missing?
The upsides.
Most of the guys responding to you here are full Eth maxis who drank the "Parity is bad" Kool-Aid. They are married to their investment and basically emotional / tribal in an area where you should have a cool head. Sure, you might get more upvotes on Reddit if you do and say what the crowd wants, but do you want upvotes and fleeting validation or do you want returns on your investment? Do you want to be these guys or do you want to be the shareholder making bank off of those guys?
Disclaimer: I'm both an Eth whale and a Dot whale, and have been in crypto for close to a decade now. I originally bought ether sub $10 after researching it for at least a thousand hours. Rode to $1500 and down to $60. Iron hands - my intent has always been to reconsider my Eth position after proof of stake is out. I invested in the 2017 Dot public sale with the plan of flipping profits back to Eth but keeping Dots looks like the right short and long term play now. I am not a trader, I just take a deep tech dive every couple of years and invest in fundamentals.
Now as for your concerns:
I know Polkadot is more centralized
The sad truth is that the market doesn't really care about this. At all. There is no real statistic to show at what point a coin is "decentralized" or "too centralized". For example, bitcoin has been completely taken over by Chinese mining farms for about five years now. Last I checked, they control above 85% of the hashing power, they just spread it among different mining pools to make it look decentralized. They have had the ability to fake or block transactions for all this time but it has never been in their best interest to do so: messing with bitcoin in that way would crash its price, therefore their bitcoin holdings, their mining equipment, and their company stock (some of them worth billions) would evaporate. So they won't do it due to economics, but not because they can't.
That is the major point I want to get across; originally Bitcoin couldn't be messed with because it was decentralized, but now Bitcoin is centralized but it's still not messed with due to economics. It is basically ChinaCoin at this point, but the market doesn't care, and it still enjoys over 50% of the total crypto market cap.
So how does this relate to Polkadot? Well fortunately most chains - Ethereum included - are working towards proof of stake. This is obviously better for the environment, but it also has a massive benefit for token holders. If a hostile party wanted to take over a proof of stake chain they'd have to buy up a massive share of the network. The moment they force through a malicious transaction a proof of stake blockchain has the option to fork them off. It would be messy for a few days, but by the end of the week the hostile party would have a large amount of now worthless tokens, and the proof of stake community would have moved on to a version of the blockchain where the hostile party's tokens have been slashed to zero. So not only does the market not care about centralization (Bitcoin example), but proof of stake makes token holders even safer.
That being said, Polkadot's "centralization" is not that far off from Ethereum's. The Web3 Foundation kept 30% of the Dots while the Ethereum Foundation kept 17%. There are whales in Polkadot, but Ethereum has them too - 40% of all genesis Ether went to 100 wallets, and many suspect that the original Ethereum ICO was sybiled to make it look more popular and decentralized than it really was. But you don't really care about that, do you? Neither do I. Whales are a fact of life.
VC backed
VCs are part of the crypto game now. There is no way to get rid of them, and there is no real reason why you should want to get rid of them. They put their capital at risk (same as you and me) and seek returns on their investment (same as you and me). They are both in Polkadot and Ethereum, and have been for years now. I have no issue with them as long as they don't play around with insider information, but that is another topic. To be honest, I would be worried if VCs did not endorse chains I'm researching, but maybe that's because my investing style isn't chasing hype and buying SUSHI style tokens from anonymous (at the time) developers. That's just playing hot potato. But hey, some people are good at that.
As to the amount of wallets that participated in the Polkadot ICO: a little known fact is that more individual wallets participated in Polkadot's ICO than Ethereum's, even though Polkadot never marketed their ICO rounds due to regulatory reasons.
generally against our ethos here
Kool aid.
Some guy that works(ed?) at Parity (who employs what, 200+ people?) correctly said that Ethereum is losing its tech lead and that offended the Ethereum hivemind. Oh no. So controversial. I'm so personally hurt by that.
Some guy that has been working for free on Ethereum basically forever correctly said that Polkadot is taking the blockchain tech crown. Do we A) Reflect on why he said that? or B) Rally the mob to chase him off?
"I did not quit social media, I quit Ethereum. I did not go dark, I just left the community. I am no longer coordinating hard forks, building testnets, or contributing otherwise. I did not work on Polkadot, I never did, I worked on Ethereum. I did not hate Ethereum, I loved it."
Also, Parity locked their funds (and those of about 500+ other wallets not owned by them) and proposed a solution to recover them. When the community voted no, they backed off and did not fork the chain, even though they had the influence to do so. For some reason this subreddit hates them for that, even though Parity did the 100% moral thing. Remember, 500+ other teams or people had their funds locked, so Parity was morally bound to try its best to recover them.
It's just lame drama to be honest. Nothing to do with ethos, everything to do with emotional tribalism.
Now for the missing upsides (I'll also respond to random fragments scattered in the thread):
This isn’t a 100% ETH question, but I’m trying to stay educated about emerging tech.
A good quick intro to Eth's tech vs Polkadot's tech can be found on this thread, especially this reply. That thread is basically mandatory reading if you care about your investment.
Eth 2.0's features will not really kick in for end users until about 2023. That means every dapp that built on Eth's layer 1 (except DeFi, where the fees make sense due to returns and which is leading the fee market) is dead for three years. Remember the trading card games... Gods Unchained? How many players do you think are going to buy and sell cards when the transaction fee is worth more than the cards? All that development is now practically worthless until it can migrate to its own shard. This story repeats for hundreds of other dapp teams whose projects are now priced out for three years. So now they either migrate to one of the many unpopulated L2 options (which have their own list of problems and risks, but that's another topic) or they look for another platform, preferably one interoperable with Ethereum. Hence Polkadot's massive growth in developer activity. If you check out https://polkaproject.com/ you'll see 205 projects listed at the time of this post. About a week ago they had 202 listed. That means about one team migrated from another tech stack to build on Polkadot every two days, and trust me, many more will come in when parachains are finally activated, and it will be a complete no-brainer when Polkadot 2.0 is released.
Another huge upside for Polkadot is the Initial Parachain Offerings. Polkadot's version of ICOs. The biggest difference is that you can vote for parachains using your Dots to bind them to the relay chain, and you get some of the parachain's tokens in exchange. After a certain amount of time you get your Dots back. The tokenomics here are impressive: Dots are locked (reduced supply) instead of sold (sell pressure) and you still earn your staking rewards. There's no risk of scammers running away with your Ether and the governance mechanism allows for the community to defund incompetent devs who did not deliver what was promised.
Wouldn’t an ETH shard on Polkadot gain a bunch of scaling benefits that we won’t see natively for a couple years?
Yes. That is correct. Both Edgeware and Moonbeam are EVM compatible. And if the original dapp teams don't migrate their projects someone else will fork them, exactly like SUSHI did to Uniswap, and how Acala is doing to MakerDao.
Although realistically Ethereum has a 5 yr headstart and devs haven't slowed down at all
Ethereum had a five year head start but it turns out that Polkadot has a three year tech lead.
Just because it's "EVM compatible" doesn't mean you can just plug Ethereum into Polkadot or vice versa; it just means they both understand Ethereum bytecode and you can potentially copy/paste contracts from Ethereum to Polkadot, but you'd still need to add a "bridge" between the two chains, so it adds additional complexity and extra steps compared to using any of the existing L2 scaling solutions
That only applies if you are thinking from an Eth maximalist perspective. But if you think from Polkadot's side, why would you need to use the bridge back to Ethereum at all? Everything will be seamless, cheaper, and quicker once the ecosystem starts to flourish.
I see a bunch of posts about how Ethereum and Polkadot can thrive together, but are they not both L1 competitors?
They are competitors. Both have their strategies, and both have their strengths (tech vs time on the market) but they are clearly competing in my eyes. Which is a good thing, Apple and Samsung competing in the cell phone market just leads to more innovation for consumers. You can still invest in both if you like.
Edit - link to post and the rest of the conversation: https://www.reddit.com/ethfinance/comments/iooew6/daily_general_discussion_september_8_2020/g4h5yyq/
Edit 2 - one day later PolkaProject count is 210. Devs are getting the hint :)
submitted by redditsucks_goruqqus to polkadot_market [link] [comments]

[Twitter/Clubhouse/News Media?] Silicon Valley v The New York Times: Overpriced Suitcases, Insta Stories, Insular Apps and Bitcoin Bounties

Background:
What is Clubhouse?
You know all those stories of people interrupting Zoom calls by spamming the link and getting in? What if that, but as a business model. It is still in private beta, has only 1500 users and yet somehow venture capitalists have $12 million invested at a $100 million valuation in this.
What is Away?
Hard-shell suitcases that cost $225 and above. Hipsters seem to like it. "The brand is more than just luggage. It's about travel." It is treated like a tech company by VCs for some godforsaken reason (it has raised $100 million at a $1.4 billion valuation), and the CEO uses a lot of Lean In rhetoric (female led, inclusive etc.)
How New York Times?
The New York Times has hired a reporter, Taylor Lorenz, specifically for "Internet Culture" i.e. HobbyDrama reporting. (No, seriously, look at the stories she gets to write. For the NYT!)
Pre-Drama Events:
In December 2019, an elaborate investigation was posted by The Verge (not the NYT, important) about the toxic work culture at Away, with the CEO, Steph Korey, calling workers brain dead and firing someone based on chat in an internal private Slack channel called #Hot-Topics "filled with LGBTQ folks and people of color" (from the article).
Korey stepped down as CEO in December, with another CEO to be selected. She came back as co-CEO in January because she 'should not have fallen on the sword.'
Course of the Drama:
June 30/July 1: In an Instagram AMA, returned co-CEO Korey answered a question about "women being targeted by the media" (I presume the AMA went in that direction) by saying that the media has an incentive to clickbait in the social media era, that women (like her) are targeted because women are supposed to be motherly, that ambitious women like Hillary are targeted by the media, and that some millennial women who work in the media forgo their ethics to advance their careers because old media ethics are being eroded.
The Verge investigation was done by Zoë Schiffer, a “millennial woman.”
Incensed by this, Lorenz posts the IG pics on twitter (previous link from her) and speculates that this AMA exists because of a piece on the disgracing of the “girlboss” stereotype. To recap, neither the original story, nor the Atlantic op-ed were written by her.
Techbros start sharing the same pics of the AMA as a balanced perspective.
Until this point, #bothsides, let them fight, etc.
Enter Balaji Srinivasan. Here is a pompous bio.
He starts attacking Lorenz (again, not the writer of any of the stories). Lorenz says the guy has been obsessively attacking her for quite some time on Clubhouse gussied up public Webex calls (in tweets after the linked tweet).
Then anti-Lorenz sockpuppet accounts start being created to attack her.
An elaborate website is linked by the accounts, specifically to attack her. (Click the link, it is deranged as all hell.)
Taylor asks Ben Horowitz (of multi-billion dollar Andreessen Horowitz, where Balaji has worked before) to get him to stop. Gets blocked.
Then the Andreessen Horowitz batch have a conversation on Clubhouse Discord without texting with Lorenz. After Taylor leaves, (this part leaked to Vice, so you can go listen) Ben Horowitz’ wife, Felicia says that Taylor is playing the “woman card to defend herself.” Balaji implies that she may be “afraid of a brown man.”
And then the conversations ascend:
”the entire tech press was complicit in covering up the threat of COVID-19,”
relying on the press is “outsourcing your information supply chain to folks who are disaligned with you,”
”Media corporations are not the free press, any more than chain restaurants are food “
“why does press have a right to investigate private companies, let the market decide, I don’t understand who gives them that right” (Note: Probably from another conversation by some CEO)
Also something about Github, VC funding and Blockchains being a better model for journalism. (Bitconeeeect!!)
Then Vice reports on it.
Tech media rallies around Taylor [retweets on her twitter]. Glenn Greenwald pokes his nose and says otherwise, because Greenwald.
VCs support their own (along with MC Hammer?? because he’s also on Clubhouse the conference calls you join “for fun” app? So is Oprah???), with opinions like
I’ve never met a VC more powerful than a journalist. One can block me from accessing capital. The other has controlled the narrative / perception of my entire race for decades.
And other nuclear fucking takes retweeted by Felicia Horowitz
And Balaji?
When reached for comment, Balaji claims recording it was illegal (which, idk, haven’t seen the Terms of Service, only 1500 people can use Clubhouse the Twitch app, but you don’t have video and chat has audio)
And then he announces a $1000 bounty for memes and analysis of this event.
(paid in Bitcoin, obviously, this whole scenario is a damn meme)
This gets the creator of Ruby on Rails/Basecamp to defect to the media’s side
Aftermath? (This is a current story):
VCs (like Paul Graham) declare that the media hates them because they are losing power
Media Twitter decides VC Twitter is trying to reanimate the corpse of #GamerGate
Steph Korey, the instigator of this spiraling nonsense? Away says she has decided to step down (redux) because of an employee revolt over her IG post.
(Recap: Away sells hardcases to hypebeasts. They are worth a billion because of VCs)
Balaji, rich VC guy, has memes on his timeline?
[Elon Musk has not weighed in yet, if you are curious]
submitted by runnerx4 to HobbyDrama [link] [comments]

MetaDAO - update

MetaDAO - update
Wax on. Wax off.
Quick update on the MetaDAO concept and outreach to Edgeware, one of the first Substrate-based chains after Polkadot and Kusama.
Since the initial post, both Polkadot and Kusama have seen explosive growth, with Polkadot now #6 on Coingecko and Kusama #60. Edgeware is at #107, and all have treasuries to spend - c$8m in $EDG, c$6m in $KSM and c$38m in $DOT.
The reaction to both the proposal and building bridges to Decred from the (small) Edgeware community has been positive - you can see an overview of last week's community call here on Commonwealth, which is their expanded version of something like Politeia and hosts various conversations and threads.
There is another call today at 2:45pm EST - you can join a google hangout here if you're interested. Hopefully the call will progress the conversations again.
So what values / characteristics do Decred and Edgeware have in common?
Well both are focused on governance, both launched aiming to offer a wide and fair distribution (info on lockdrop here) and both have treasuries that can sustain them long into the future.
There are also key differences.
Decred is older (and perhaps a little jaded) prioritising slow, careful progress that considers every action through the lens of unintended consequences with the aim of maximising privacy and security.
It is a philosophy rooted in the project's origins at Company 0 - see 'return to zero' and the north star of building a fairer financial system.
Edgeware is new on the scene, filled with excitement, enthusiasm and buckets of shower thoughts that would be shot down in an instant by the Decred community. The project will not be as secure or private as Decred but will move faster and can potentially become a bridge to the wider ecosystem, taking chances the older project can't (or won't)...
Now I'm showing my age, but when seen side by side, the two projects reminds me of the central relationship in the 80's classic The Karate Kid.
Mr Miyagi is the old master weary of new things.
“Never put passion in front of principle, even if you win, you’ll lose”.
Daniel San is the young and impetuous grasshopper.
Together they achieve great things and teach each other a few things along the way.
Aaaaanyway....
The next stage is to move the concept to a formalised proposal for Edgeware's council to vote on (it is an NPOS system) and a request for funding that can officially kick off the MetaDAO project and potential areas for collaboration and co-financing.
After this, I will also submit a formal proposal to Politeia to hopefully gain stakeholder approval and gain extra insights along the way.
u/jet_user has suggested a few ideas here.
- Personally I'm interested in the "real stuff" that boosts autonomy and resilience of individuals: open hardware, UX improvement to solve that huge self-custody UX challenge, security audits (perhaps we use some common libraries), etc.
- devs in the smart contract land might borrow some elements of dcrdex's coin swap sequence (coordinating a peer-to-peer swap required to figure out a lot of ugly edge cases), or we could borrow something from them.
- new forum software with stronger transparency, integrity protection and data preservation properties that has Reddit-like UX but Politeia-like security and transparency.
- Another is crypto job market/bounty system/issue tracker combo to unlock the energy sitting on the sidelines and improve mass coordination.
I think Decred could integrate with Commonwealth for working group discussions ahead of Pi-Reddit and could perhaps also utilise the project's infrastructure for subDAOs ahead of being possible with Decred.
One other area that I'm particularly interested in is figuring out a way to bring alive the story of Open Source - a definitive history of the movement, its achievements and the path to Bitcoin, Decred, et al, but done in a contemporary (and cool) way that connects to a much bigger picture and is hosted on decentralised streaming infrastructure like Livepeer.
Right now it is still a story that most do not understand nor connect with.
Ultimately Decred was created to solve the funding issue in OSS so I feel it is an important piece of the puzzle that will aid in the project's broader awareness and understanding in the long term.
Until now all content about this revolution has been delivered in a very old school TV way - see Open Source Money.
submitted by monsieurbulb to decred [link] [comments]

Why i’m bullish on Zilliqa (long read)

Edit: TL;DR added in the comments
 
Hey all, I've been researching coins since 2017 and have gone through hundreds of them in the last 3 years. I got introduced to blockchain via Bitcoin of course, analyzed Ethereum thereafter, and from that moment I have had a keen interest in smart contract platforms. I'm passionate about Ethereum, but I find Zilliqa to have a better risk-reward ratio, especially because Zilliqa has found an elegant balance between being secure, decentralized and scalable in my opinion.
 
Below I post my analysis of why from all the coins I went through I’m most bullish on Zilliqa (yes I went through Tezos, EOS, NEO, VeChain, Harmony, Algorand, Cardano etc.). Note that this is not investment advice and although it's a thorough analysis there is obviously some bias involved. Looking forward to what you all think!
 
Fun fact: the name Zilliqa is a play on 'silica' (silicon dioxide), meaning "silicon for the high-throughput consensus computer."
 
This post is divided into (i) Technology, (ii) Business & Partnerships, and (iii) Marketing & Community. I’ve tried to make the technology part readable for a broad audience. If you’ve ever tried understanding the inner workings of Bitcoin and Ethereum you should be able to grasp most parts. Otherwise, just skim through and once you are zoning out head to the next part.
 
Technology and some more:
 
Introduction
 
The technology is one of the main reasons why I’m so bullish on Zilliqa. First thing you see on their website is: “Zilliqa is a high-performance, high-security blockchain platform for enterprises and next-generation applications.” These are some bold statements.
 
Before we deep dive into the technology let’s take a step back in time first as they have quite the history. The initial research paper from which Zilliqa originated dates back to August 2016: Elastico: A Secure Sharding Protocol For Open Blockchains where Loi Luu (Kyber Network) is one of the co-authors. Other ideas that led to the development of what Zilliqa has become today are: Bitcoin-NG, collective signing CoSi, ByzCoin and Omniledger.
 
The technical white paper was made public in August 2017 and since then they have achieved everything stated in the white paper and also created their own open source intermediate level smart contract language called Scilla (functional programming language similar to OCaml) too.
 
Mainnet has been live since the end of January 2019, with daily transaction rates growing continuously. About a week ago mainnet reached 5 million transactions and 500.000+ addresses in total, along with 2400 nodes keeping the network decentralized and secure. Circulating supply is nearing 11 billion and currently only mining rewards are left. The maximum supply is 21 billion, with annual inflation currently at 7.13%, and it will only decrease with time.
 
Zilliqa realized early on that the usage of public cryptocurrencies and smart contracts was increasing but decentralized, secure, and scalable alternatives were lacking in the crypto space. They proposed to apply sharding to a public smart contract blockchain where the transaction rate increases almost linearly with the number of nodes. More nodes = higher transaction throughput and increased decentralization. Sharding comes in many forms and Zilliqa uses network-, transaction- and computational sharding. Network sharding opens up the possibility of using transaction- and computational sharding on top. Zilliqa does not use state sharding for now. We'll come back to this later.
 
Before we continue dissecting how Zilliqa achieves this from a technological standpoint, it's good to keep in mind that making a blockchain decentralised, secure and scalable is still one of the main hurdles to widespread usage of decentralised networks. In my opinion this needs to be solved first before blockchains can get to the point where they can create and add large-scale value. So I invite you to read the next section to grasp the underlying fundamentals. Because after all, these premises need to be true, otherwise there isn't a fundamental case to be bullish on Zilliqa, right?
 
Down the rabbit hole
 
How have they achieved this? Let’s define the basics first: key players on Zilliqa are the users and the miners. A user is anybody who uses the blockchain to transfer funds or run smart contracts. Miners are the (shard) nodes in the network who run the consensus protocol and get rewarded for their service in Zillings (ZIL). The mining network is divided into several smaller networks called shards, which is also referred to as ‘network sharding’. Miners subsequently are randomly assigned to a shard by another set of miners called DS (Directory Service) nodes. The regular shards process transactions and the outputs of these shards are eventually combined by the DS shard as they reach consensus on the final state. More on how these DS shards reach consensus (via pBFT) will be explained later on.
 
The Zilliqa network produces two types of blocks: DS blocks and Tx blocks. One DS Block consists of 100 Tx Blocks. And as previously mentioned there are two types of nodes concerned with reaching consensus: shard nodes and DS nodes. Becoming a shard node or DS node is being defined by the result of a PoW cycle (Ethash) at the beginning of the DS Block. All candidate mining nodes compete with each other and run the PoW (Proof-of-Work) cycle for 60 seconds and the submissions achieving the highest difficulty will be allowed on the network. And to put it in perspective: the average difficulty for one DS node is ~ 2 Th/s equaling 2.000.000 Mh/s or 55 thousand+ GeForce GTX 1070 / 8 GB GPUs at 35.4 Mh/s. Each DS Block 10 new DS nodes are allowed. And a shard node needs to provide around 8.53 GH/s currently (around 240 GTX 1070s). Dual mining ETH/ETC and ZIL is possible and can be done via mining software such as Phoenix and Claymore. There are pools and if you have large amounts of hashing power (Ethash) available you could mine solo.
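For the curious, the GPU figures above can be sanity-checked with a couple of divisions; a quick back-of-the-envelope check using the rounded numbers quoted in this paragraph:

    #include <stdio.h>

    int main(void) {
        const double gtx1070_mhs    = 35.4;        /* Mh/s per GTX 1070            */
        const double ds_node_mhs    = 2000000.0;   /* ~2 Th/s for a DS node        */
        const double shard_node_mhs = 8530.0;      /* ~8.53 GH/s for a shard node  */

        printf("GPUs per DS node:    ~%.0f\n", ds_node_mhs / gtx1070_mhs);    /* ~56497 */
        printf("GPUs per shard node: ~%.0f\n", shard_node_mhs / gtx1070_mhs); /* ~241   */
        return 0;
    }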
 
The 60-second PoW cycle is a short burst of peak performance and acts as an entry ticket to the network. This entry ticket is a Sybil resistance mechanism and makes it incredibly hard for adversaries to spawn lots of identities and manipulate the network with them. After every 100 Tx blocks, which corresponds to roughly 1.5 hours, this PoW process repeats. In between, no PoW needs to be done, meaning Zilliqa’s energy consumption to keep the network secure is low. For more detailed information on how mining works click here.
Okay, hats off to you. You have made it this far. Before we go any deeper down the rabbit hole we first must understand why Zilliqa goes through all of the above technicalities and understand a bit better what a blockchain is on a more fundamental level. Because the core of Zilliqa’s consensus protocol relies on pBFT (practical Byzantine Fault Tolerance), we need to know more about state machines and their function. Navigate to Viewblock, a Zilliqa block explorer, and then come back to this article. We will use this site to walk through a few concepts.
 
We have established that Zilliqa is a public and distributed blockchain. Meaning that everyone with an internet connection can send ZILs, trigger smart contracts, etc. and there is no central authority who fully controls the network. Zilliqa and other public and distributed blockchains (like Bitcoin and Ethereum) can also be defined as state machines.
 
Taking the liberty of paraphrasing examples and definitions from Samuel Brooks’ Medium article: he describes a blockchain (like Zilliqa) as “a peer-to-peer, append-only datastore that uses consensus to synchronize cryptographically-secure data”.
 
Next, he states that "blockchains are fundamentally systems for managing valid state transitions”. For some more context, I recommend reading the whole Medium article to get a better grasp of the definitions and of state machines. Nevertheless, let’s try to simplify and compile it into a single paragraph. Take traffic lights as an example: all their states (red, amber, and green) are predefined, all possible outcomes are known, and it doesn’t matter whether you encounter the traffic light today or tomorrow; it will still behave the same. Managing the states of a traffic light can be done by triggering a sensor on the road or pushing a button, resulting in one traffic light’s state going from green to red (via amber) and another light’s from red to green.
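To make the state-machine idea concrete, here is a minimal traffic-light sketch in Python; the state names and single transition rule are illustrative only:

    # Minimal traffic-light state machine: all states and transitions are predefined,
    # so the same trigger always produces the same, verifiable state change.
    TRANSITIONS = {
        "green": "amber",   # trigger (sensor/button): prepare to stop
        "amber": "red",     # amber always falls through to red
        "red":   "green",   # red eventually releases back to green
    }

    def step(state: str) -> str:
        return TRANSITIONS[state]

    state = "green"
    for _ in range(4):
        state = step(state)
        print(state)        # amber, red, green, amber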
 
With public blockchains like Zilliqa this isn’t so straightforward and simple. The chain started with block #1 almost 1.5 years ago, and every 45 seconds or so a new block linked to the previous one is added, resulting in a chain of blocks with transactions that everyone can verify from block #1 up to the current #647,000+ block. The state is ever-changing and the states it can find itself in are infinite. And while a traffic light might work in tandem with various other traffic lights, that is rather insignificant compared to a public blockchain, because Zilliqa consists of 2400 nodes that need to work together to reach consensus on the latest valid state while some of these nodes may have latency or broadcast issues, drop offline, or deliberately try to attack the network.
 
Now go back to the Viewblock page, take a look at the number of transactions, addresses, the block height and the DS height, and then hit refresh. As expected, you will see new, incremented values for one or all of these parameters. So how did the Zilliqa blockchain manage to transition from the previous valid state to the latest valid state? By using pBFT to reach consensus on the latest valid state.
 
After having obtained the entry ticket, miners execute pBFT to reach consensus on the ever-changing state of the blockchain. pBFT requires a series of network communications between nodes, so no GPU is involved (only CPU). As a result, the total energy consumed to keep the blockchain secure, decentralized and scalable is low.
 
pBFT stands for practical Byzantine Fault Tolerance and is an optimization on the Byzantine Fault Tolerant algorithm. To quote Blockonomi: “In the context of distributed systems, Byzantine Fault Tolerance is the ability of a distributed computer network to function as desired and correctly reach a sufficient consensus despite malicious components (nodes) of the system failing or propagating incorrect information to other peers.” Zilliqa is such a distributed computer network and depends on the honesty of the nodes (shard and DS) to reach consensus and to continuously update the state with the latest block. If pBFT is a new term for you I can highly recommend the Blockonomi article.
 
The idea of pBFT was introduced in 1999 - one of the authors even won a Turing award for it - and it is well researched and applied in various blockchains and distributed systems nowadays. If you want more advanced information than the Blockonomi link provides, click here. And if you want something in between the Blockonomi article and the academic material, read the Zilliqa Design Story Part 2 from October 2017.
Quoting from the Zilliqa tech whitepaper: “pBFT relies upon a correct leader (which is randomly selected) to begin each phase and proceed when the sufficient majority exists. In case the leader is byzantine it can stall the entire consensus protocol. To address this challenge, pBFT offers a view change protocol to replace the byzantine leader with another one.”
 
pBFT can tolerate up to (just under) ⅓ of the nodes being dishonest (offline counts as Byzantine, i.e. dishonest) and the consensus protocol will keep functioning without stalling or hiccups. Once more than ⅓ but no more than ⅔ of the nodes are dishonest, the network stalls and a view change is triggered to elect a new DS leader. Only when more than ⅔ of the nodes are dishonest (>66%) do double-spend attacks become possible.
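To illustrate those thresholds, here is a minimal sketch using a committee of 600 nodes (the shard size mentioned later in this article); the cut-offs follow classic pBFT, where strictly fewer than one third of the nodes may be faulty:

    # pBFT safety/liveness thresholds for a committee of n nodes (illustrative numbers)
    n = 600                          # e.g. one Zilliqa shard
    f_max = (n - 1) // 3             # largest tolerated number of Byzantine nodes: 199

    def status(dishonest: int) -> str:
        if dishonest <= f_max:
            return "consensus proceeds normally"
        elif dishonest <= 2 * n // 3:
            return "network stalls, view change elects a new leader"
        else:
            return "safety lost: double-spends become possible"

    for d in (150, 250, 450):
        print(d, "->", status(d))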
 
If the network stalls no transactions can be processed and one has to wait until a new honest leader has been elected. When the mainnet was just launched and in its early phases, view changes happened regularly. As of today the last stalling of the network - and view change being triggered - was at the end of October 2019.
 
Another benefit of using pBFT for consensus, besides low energy consumption, is the immediate finality it provides. Once your transaction is included in a block and the block is added to the chain, it’s done. Lastly, take a look at this article where three types of finality are defined: probabilistic, absolute and economic finality. Zilliqa falls under absolute finality (just like Tendermint, for example). Although lengthy already, we only skimmed some of the inner workings of Zilliqa’s consensus: read the Zilliqa Design Story Part 3 and you will be close to having a complete picture of it. Enough about PoW, the Sybil resistance mechanism, pBFT, etc. Another thing we haven’t looked at yet is the degree of decentralization.
 
Decentralisation
 
Currently, there are four shards of 600 nodes each: one "shard" of 600 so-called DS nodes (Directory Service - they need to achieve a higher difficulty than shard nodes) and 1800 regular shard nodes, of which 250 are shard guards (centralized nodes controlled by the team). The number of shard guards has been steadily declining, from 1200 in January 2019 to 250 as of May 2020. On the Viewblock statistics you can see that many of the nodes are located in the US, but those are only the (CPU parts of the) shard nodes which perform pBFT; there is no data on where the PoW hash power comes from. And when the Zilliqa blockchain starts reaching its transaction capacity limit, a network upgrade needs to be executed to lift the current cap of 2400 nodes, allowing more nodes and the formation of more shards so the network can keep scaling according to demand.
Besides shard nodes there are also seed nodes. The main role of seed nodes is to serve as direct access points (for end-users and clients) to the core Zilliqa network that validates transactions. Seed nodes consolidate transaction requests and forward these to the lookup nodes (another type of nodes) for distribution to the shards in the network. Seed nodes also maintain the entire transaction history and the global state of the blockchain which is needed to provide services such as block explorers. Seed nodes in the Zilliqa network are comparable to Infura on Ethereum.
 
The seed nodes were at first only operated by Zilliqa themselves, exchanges and Viewblock. Operators of seed nodes like exchanges had no incentive to open them up to the greater public, so they were centralised at first. Decentralisation at the seed node level has been steadily rolled out since March 2020 (Zilliqa Improvement Proposal 3). Currently the number of seed nodes is being increased, they are public-facing, and at the same time PoS is applied to incentivize seed node operators and make it possible for ZIL holders to stake and earn passive yields. Important distinction: seed nodes are not involved in consensus! That is still PoW as the entry ticket and pBFT for the actual consensus.
 
5% of the block rewards are being assigned to seed nodes (from the beginning in 2019) and those are being used to pay out ZIL stakers. The 5% block rewards with an annual yield of 10.03% translate to roughly 610 MM ZILs in total that can be staked. Exchanges use the custodial variant of staking and wallets like Moonlet will use the non-custodial version (starting in Q3 2020). Staking is being done by sending ZILs to a smart contract created by Zilliqa and audited by Quantstamp.
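As a back-of-the-envelope sketch of how these numbers relate (the implied total block-reward figure is derived from the article’s own numbers, not an official statistic):

    # Relate the ~610 MM stakeable ZIL to the 5% seed-node share and the 10.03% yield
    stakeable_zil = 610_000_000          # ~610 MM ZIL quoted above
    annual_yield = 0.1003                # 10.03%
    seed_node_share = 0.05               # 5% of block rewards go to seed nodes

    rewards_paid_to_stakers = stakeable_zil * annual_yield          # ~61.2 MM ZIL per year
    implied_total_block_rewards = rewards_paid_to_stakers / seed_node_share  # ~1.22 B ZIL per year

    print(round(rewards_paid_to_stakers), round(implied_total_block_rewards))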
 
With a high number of DS and shard nodes, and with seed nodes becoming more decentralized too, Zilliqa qualifies for the label of decentralized in my opinion.
 
Smart contracts
 
Let me start by saying I’m not a developer and my programming skills are quite limited. So I‘m taking the ELI5 route (maybe 12) but if you are familiar with Javascript, Solidity or specifically OCaml please head straight to Scilla - read the docs to get a good initial grasp of how Zilliqa’s smart contract language Scilla works and if you ask yourself “why another programming language?” check this article. And if you want to play around with some sample contracts in an IDE click here. The faucet can be found here. And more information on architecture, dapp development and API can be found on the Developer Portal.
If you are more into listening and watching: check this recent webinar explaining Zilliqa and Scilla. Link is time-stamped so you’ll start right away with a platform introduction, roadmap 2020 and afterwards a proper Scilla introduction.
 
Generalized: programming languages can be divided into being ‘object-oriented’ or ‘functional’. Here is an ELI5 given by software development academy: “all programs have two basic components, data – what the program knows – and behavior – what the program can do with that data. So object-oriented programming states that combining data and related behaviors in one place, is called “object”, which makes it easier to understand how a particular program works. On the other hand, functional programming argues that data and behavior are different things and should be separated to ensure their clarity.”
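A tiny, purely illustrative Python sketch of that distinction: the object-oriented version bundles data and behaviour in one object, while the functional version keeps data separate and returns new values instead of mutating:

    # Object-oriented: data (balance) and behaviour (deposit) live together in one object
    class Account:
        def __init__(self, balance):
            self.balance = balance
        def deposit(self, amount):
            self.balance += amount        # mutates internal state

    # Functional: data is separate; the function returns a new value instead of mutating
    def deposit(balance, amount):
        return balance + amount

    acct = Account(100)
    acct.deposit(25)
    print(acct.balance)                   # 125
    print(deposit(100, 25))               # 125, original value left untouched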
 
Scilla is on the functional side and shares similarities with OCaml: OCaml is a general-purpose programming language with an emphasis on expressiveness and safety. It has an advanced type system that helps catch your mistakes without getting in your way. It's used in environments where a single mistake can cost millions and speed matters, is supported by an active community, and has a rich set of libraries and development tools. For all its power, OCaml is also pretty simple, which is one reason it's often used as a teaching language.
 
Scilla is blockchain agnostic and can be implemented on other blockchains as well; it is recognized by academics and won a Distinguished Artifact Award at the end of last year.
 
One of the reasons the Zilliqa team decided to create their own programming language focused on preventing smart contract vulnerabilities is that adding logic to a blockchain, i.e. programming, means you cannot afford to make mistakes. Otherwise, it could cost you. It’s all great and fun that blockchains are immutable, but updating your code because you found a bug isn’t as simple as it is for a regular web application. And smart contracts inherently involve cryptocurrencies in some form, and thus value.
 
Another difference with programming languages on a blockchain is gas. Every transaction you execute on a smart contract platform like Zilliqa or Ethereum costs gas. With gas you essentially pay for computational costs. Sending a ZIL from address A to address B currently costs 0.001 ZIL. Smart contracts are more complex, often involve various functions and require more gas (if gas is a new concept, click here).
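As a rough illustration of how a gas fee comes together (the gas-unit counts and gas price below are invented purely for illustration; only the 0.001 ZIL transfer cost comes from this article):

    # Fee = gas units consumed x price per gas unit (unit counts here are made up for illustration)
    def fee_zil(gas_units: int, gas_price_zil: float) -> float:
        return gas_units * gas_price_zil

    print(fee_zil(50, 0.00002))    # hypothetical simple transfer: 0.001 ZIL, matching the figure above
    print(fee_zil(5000, 0.00002))  # hypothetical contract call burning more gas: 0.1 ZIL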
 
So with Scilla, similar to Solidity, you need to make sure that “every function in your smart contract will run as expected without hitting gas limits. An improper resource analysis may lead to situations where funds may get stuck simply because a part of the smart contract code cannot be executed due to gas limits. Such constraints are not present in traditional software systems”. Scilla design story part 1
 
Some examples of smart contract issues you’d want to avoid are: leaking funds, ‘unexpected changes to critical state variables’ (example: someone other than you setting his or her address as the owner of the smart contract after creation) or simply killing a contract.
 
Scilla also allows for formal verification. Wikipedia to the rescue: In the context of hardware and software systems, formal verification is the act of proving or disproving the correctness of intended algorithms underlying a system with respect to a certain formal specification or property, using formal methods of mathematics.
 
Formal verification can be helpful in proving the correctness of systems such as: cryptographic protocols, combinational circuits, digital circuits with internal memory, and software expressed as source code.
 
“Scilla is being developed hand-in-hand with formalization of its semantics and its embedding into the Coq proof assistant — a state-of-the-art tool for mechanized proofs about properties of programs.”
 
Simply put, with Scilla and the accompanying tooling, developers can be mathematically sure, and can prove, that the smart contract they’ve written does what they intend it to do.
 
Smart contract on a sharded environment and state sharding
 
There is one more topic I’d like to touch on: smart contract execution in a sharded environment (and what is the effect of state sharding). This is a complex topic. I’m not able to explain it any easier than what is posted here. But I will try to compress the post into something easy to digest.
 
Earlier on we established that Zilliqa can process transactions in parallel due to network sharding. This is where the linear scalability comes from. We can define categories of transactions: a transfer from address A to B (Category 1), a transaction where a user interacts with one smart contract (Category 2), and the most complex ones, where triggering a transaction results in multiple smart contracts being involved (Category 3). The shards are able to process transactions on their own without interference from the other shards. For Category 1 transactions that is doable; for Category 2 transactions it sometimes is, if the address is in the same shard as the smart contract; but for Category 3 you definitely need communication between the shards. Solving that requires defining a set of communication rules the protocol needs to follow in order to process all transactions in a generalised fashion.
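A minimal sketch of that routing decision; the shard-assignment rule and category labels are simplified assumptions for illustration, not Zilliqa’s actual protocol logic:

    # Decide whether a transaction can stay inside one shard or needs cross-shard coordination
    NUM_SHARDS = 3

    def shard_of(address: str) -> int:
        return hash(address) % NUM_SHARDS          # toy assignment, not Zilliqa's real rule

    def category(sender, contracts=()):
        if not contracts:
            return "Category 1: simple transfer, single shard"
        if len(contracts) == 1 and shard_of(sender) == shard_of(contracts[0]):
            return "Category 2: contract call, can stay in one shard"
        return "Category 3: multiple contracts/shards, needs cross-shard communication"

    print(category("addrA"))
    print(category("addrA", ["contractX"]))
    print(category("addrA", ["contractX", "contractY"]))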
 
And this is where the trade-off around state sharding currently comes in. Zilliqa does not shard state: all shards have access to the complete state. Yes, the state (0.1 GB at the moment) grows and all of the nodes need to store it, but it also means they don’t need to shop around for information held by other shards, which would require more communication and add more complexity. Links that require computer science and/or developer knowledge if you want to dig further: Scilla - language grammar, Scilla - Foundations for Verifiable Decentralised Computations on a Blockchain, Gas Accounting, NUS x Zilliqa: Smart contract language workshop.
 
Easier-to-follow links on programming in Scilla: https://learnscilla.com/home and Ivan on Tech.
 
Roadmap / Zilliqa 2.0
 
There is no strictly defined roadmap, but here are the topics being worked on. Via the Zilliqa website there is also more information on the projects they are working on.
 
Business & Partnerships
 
It’s not only in technology that Zilliqa seems to be excelling, as their ecosystem has been expanding and starting to grow rapidly. The project is on a mission to provide OpenFinance (OpFi) to the world, and Singapore is the right place to be due to its progressive regulations and futuristic thinking. Singapore has taken a proactive approach towards cryptocurrencies by introducing the Payment Services Act 2019 (PS Act). Among other things, the PS Act will regulate intermediaries dealing with certain cryptocurrencies, with a particular focus on consumer protection and anti-money laundering. It will also provide a stable regulatory licensing and operating framework for cryptocurrency entities, effectively covering all crypto businesses and exchanges based in Singapore. According to PwC, 82% of the surveyed executives in Singapore reported blockchain initiatives underway and 13% of them have already brought those initiatives live to the market. There is also an increasing list of organizations that are starting to provide digital payment services. Moreover, Singapore-based blockchain developer Building Cities Beyond has recently created a $15 million innovation grant to encourage development on its ecosystem. This all suggests that Singapore is trying to position itself as (one of) the leading blockchain hubs in the world.
 
Zilliqa seems to already take advantage of this and recently helped launch Hg Exchange on their platform, together with financial institutions PhillipCapital, PrimePartners and Fundnel. Hg Exchange, which is now approved by the Monetary Authority of Singapore (MAS), uses smart contracts to represent digital assets. Through Hg Exchange financial institutions worldwide can use Zilliqa's safe-by-design smart contracts to enable the trading of private equities. For example, think of companies such as Grab, Airbnb, SpaceX that are not available for public trading right now. Hg Exchange will allow investors to buy shares of private companies & unicorns and capture their value before an IPO. Anquan, the main company behind Zilliqa, has also recently announced that they became a partner and shareholder in TEN31 Bank, which is a fully regulated bank allowing for tokenization of assets and is aiming to bridge the gap between conventional banking and the blockchain world. If STOs, the tokenization of assets, and equity trading will continue to increase, then Zilliqa’s public blockchain would be the ideal candidate due to its strategic positioning, partnerships, regulatory compliance and the technology that is being built on top of it.
 
What is also very encouraging is their focus on banking the un(der)banked. They are launching a stablecoin basket starting with XSGD. As many of you know, stablecoins are currently mostly used for trading. However, Zilliqa is actively trying to broaden the use case of stablecoins. I recommend everybody read this text that Amrit Kumar (one of the co-founders) wrote. These stablecoins will be integrated into the traditional markets and bridge the gap between the crypto world and the traditional world. This could potentially revolutionize and legitimise the crypto space if retailers and companies start to use stablecoins for payments or remittances, for example, instead of them solely being used for trading.
 
Zilliqa also released their DeFi strategic roadmap (dating from November 2019) which seems to align well with their OpFi strategy. A non-custodial DEX is coming to Zilliqa, made by Switcheo, which allows cross-chain trading (atomic swaps) between ETH, EOS and ZIL based tokens. They also signed a Memorandum of Understanding for a (soon to be announced) USD stablecoin. And as Zilliqa is all about regulations and being compliant, I’m speculating that it will be a regulated USD stablecoin. Furthermore, XSGD has already been created and is visible on the block explorer, and XIDR (an Indonesian stablecoin) is also coming soon via StraitsX. Here is also an overview of the Tech Stack for Financial Applications from September 2019. Further quoting Amrit Kumar on this:
 
“There are two basic building blocks in DeFi/OpFi though: 1) stablecoins as you need a non-volatile currency to get access to this market and 2) a dex to be able to trade all these financial assets. The rest are built on top of these blocks.
 
So far, together with our partners and community, we have worked on developing these building blocks with XSGD as a stablecoin. We are working on bringing a USD-backed stablecoin as well. We will soon have a decentralised exchange developed by Switcheo. And with HGX going live, we are also venturing into the tokenization space. More to come in the future.”
 
Additionally, they also have the ZILHive initiative that injects capital into projects. There have already been 6 waves of various teams working on infrastructure, innovation and research, and they are not from ASEAN or Singapore only but global: see the grantees breakdown by country. Over 60 project teams from over 20 countries have contributed to Zilliqa's ecosystem. This includes individuals and teams developing wallets, explorers, developer toolkits, smart contract testing frameworks, dapps, etc. As some of you may know, Unstoppable Domains (UD) blew up when they launched on Zilliqa. UD aims to replace cryptocurrency addresses with a human-readable name and allows for uncensorable websites. Zilliqa will probably be the only one able to handle all these transactions on-chain due to its ability to scale and the resulting low fees, which is why the UD team launched on Zilliqa in the first place. Furthermore, Zilliqa also has a strong emphasis on security, compliance, and privacy, which is why they partnered with companies like Elliptic, ChainSecurity (part of PwC Switzerland), and Incognito. Their sister company Aqilliz (Zilliqa spelled backwards) focuses on revolutionizing the digital advertising space and is doing interesting things like using Zilliqa to track outdoor digital ads with companies like Foodpanda.
 
Zilliqa is listed on nearly all major exchanges, has several different fiat gateways, and has recently been added to Binance’s margin trading and futures trading with really good volume. They also have a very impressive team with good credentials and experience. They don't just have “tech people”: they have a mix of tech people, business people, marketeers, scientists, and more. Naturally, it's good to have a mix of people with different skill sets if you work in the crypto space.
 
Marketing & Community
 
Zilliqa has a very strong community. If you follow their Twitter, their engagement is much higher than you would expect for a coin with approximately 80k followers. They have also been ‘coin of the day’ on LunarCrush many times. LunarCrush tracks real-time cryptocurrency value and social data, and according to their data it seems Zilliqa has a more fundamental and deeper understanding of marketing and community engagement than almost all other coins. While almost all coins have been a bit frozen in the last months, Zilliqa seems to be on its own bull run. It was somewhere in the 100s a few months ago and is currently ranked #46 on CoinGecko. Their official Telegram also has over 20k people and is very active, and their community channel, which is over 7k now, is more active and larger than many other official channels. Their local communities also seem to be growing.
 
Moreover, their community started ‘Zillacracy’ together with the Zilliqa core team (see www.zillacracy.com). It’s a community-run initiative where people from all over the world are now helping with marketing and development on Zilliqa. Since its launch in February 2020 they have been doing a lot and will also run their own non-custodial seed node for staking. This seed node will also allow them to start generating revenue so they can become a self-sustaining entity that could potentially scale up into a decentralized company working in parallel with the Zilliqa core team. Comparing this to the other smart contract platforms (e.g. Cardano, EOS, Tezos etc.), they don't seem to have started a similar initiative (correct me if I’m wrong though). This suggests, in my opinion, that these other smart contract platforms do not fully understand how to utilize the ‘power of the community’. This is something you cannot ‘buy with money’ and it gives many projects in the space a disadvantage.
 
Zilliqa also released two social products called SocialPay and Zeeves. SocialPay allows users to earn ZILs while tweeting with a specific hashtag. They have recently used it in partnership with the Singapore Red Cross for a marketing campaign after their initial pilot program. It seems like a very valuable social product with a good use case. I can see a lot of traditional companies entering the space through this product, which they seem to suggest will happen. Tokenizing hashtags with smart contracts to get network effect is a very smart and innovative idea.
 
Regarding Zeeves, this is a tipping bot for Telegram. They already have thousands of signups and they plan to keep upgrading it for more and more people to use (e.g. they recently added a quiz feature). They also use it during AMAs to reward people in real time. It’s a very smart approach to grow their communities and get people familiar with ZIL. I can see this becoming very big on Telegram. This tool suggests, again, that the Zilliqa team has a deeper understanding of what the crypto space and community need and is good at finding the right innovative tools to grow and scale.
 
To be honest, I haven’t covered everything (I’m also reaching the character limit, haha). So many updates have been happening lately that it's hard to keep up, such as the International Monetary Fund mentioning Zilliqa in their report, custodial and non-custodial staking, Binance margin, futures and the widget, entering the Indian market, and more. The Head of Marketing, Colin Miles, has also released this as an overview of what is coming next. And last but not least, Vitalik Buterin has mentioned Zilliqa lately, acknowledging the project and noting that both projects have a lot of room to grow. There is much more info of course and a good part of it has been served to you on a silver platter. I invite you to continue researching by yourself :-) And if you have any comments or questions please post here!
submitted by haveyouheardaboutit to CryptoCurrency [link] [comments]

Syscoin Platform’s Great Reddit Scaling Bake-off Proposal


We are excited to participate and present Syscoin Platform's ideal characteristics and capabilities towards a well-rounded Reddit Community Points solution!
Our scaling solution for Reddit Community Points involves 2-way peg interoperability with Ethereum. This will provide a scalable token layer built specifically for speed and high volumes of simple value transfers at a very low cost, while providing sovereign ownership and onchain finality.
Token transfers scale by taking advantage of a globally sorting mempool that provides probabilistically secure assurances of “as good as settled”. The opportunity here for token receivers is to have app-layer control over the speed/security tradeoff (99.9999% assurance within 10 seconds). We call this Z-DAG, and it achieves high throughput across a mesh network topology presently composed of about 2,000 geographically dispersed full nodes. These full nodes are similar to Bitcoin’s; unlike in Bitcoin, however, they are incentivized to run for the benefit of network security through a bonded validator scheme. These nodes do not participate in the consensus of transactions or block validation any differently than other nodes, and therefore do not degrade Bitcoin’s validate-first-then-trust security model across every node. Each token transfer settles on-chain. The protocol follows Bitcoin Core policies, so it has adequate code coverage and protocol hardening to qualify as production-quality software. It shares a significant portion of Bitcoin’s own hashpower through merged mining.
This platform as a whole can serve token microtransactions, larger settlements, and store-of-value in an ideal fashion, providing probabilistic scalability whilst remaining decentralized according to Bitcoin design. It is accessible to ERC-20 via a permissionless and trust-minimized bridge that works in both directions. The bridge and token platform are currently available on the Syscoin mainnet. This has been gaining recent attention for use by loyalty point programs and stablecoins such as Binance USD.

Solutions

Syscoin Foundation identified a few paths for Reddit to leverage this infrastructure, each with trade-offs. The first provides the most cost savings and scaling benefits at some sacrifice of token autonomy. The second offers more preservation of autonomy with a narrower scope of cost savings than the first option, but savings even so. The third introduces more complexity than the previous two yet provides the most overall benefits. We consider the third the most viable, as it enables Reddit to benefit even while retaining existing smart contract functionality. We will focus on the third option, and include the first two for good measure.
  1. Distribution, burns and user-to-user transfers of Reddit Points are entirely carried out on the Syscoin network. This full-on approach to utilizing the Syscoin network provides the most scalability and transaction cost benefits of these scenarios. The tradeoff here is distribution and subscription handling likely migrating away from smart contracts into the application layer.
  2. The Reddit Community Points ecosystem can continue to use existing smart contracts as they are used today on the Ethereum mainchain. Users migrate a portion of their tokens to Syscoin, the scaling network, to gain much lower fees, scalability, and a proven base layer, without sacrificing sovereign ownership. They would use Syscoin for user-to-user transfers. Tips redeemable in ten seconds or less, a high-throughput relay network, and onchain settlement at a block target of 60 seconds.
  3. Integration between Matic Network and Syscoin Platform - similar to Syscoin’s current integration with Ethereum - will provide Reddit Community Points with EVM scalability (including the Memberships ERC777 operator) on the Matic side, and performant simple value transfers, robust decentralized security, and sovereign store-of-value on the Syscoin side. It’s “the best of both worlds”. The trade-off is more complex interoperability.

Syscoin + Matic Integration

Matic and Blockchain Foundry Inc, the public company formed by the founders of Syscoin, recently entered a partnership for joint research and business development initiatives. This is ideal for all parties as Matic Network and Syscoin Platform provide complementary utility. Syscoin offers characteristics for sovereign ownership and security based on Bitcoin’s time-tested model, and shares a significant portion of Bitcoin’s own hashpower. Syscoin’s focus is on secure and scalable simple value transfers, trust-minimized interoperability, and opt-in regulatory compliance for tokenized assets rather than scalability for smart contract execution. On the other hand, Matic Network can provide scalable EVM for smart contract execution. Reddit Community Points can benefit from both.
Syscoin + Matic integration is actively being explored by both teams, as it is helpful to Reddit, Ethereum, and the industry as a whole.

Proving Performance & Cost Savings

Our POC focuses on 100,000 on-chain settlements of token transfers on the Syscoin Core blockchain. Transfers and burns perform equally with Syscoin. For POCs related to smart contracts (subscriptions, etc), refer to the Matic Network proposal.
On-chain settlement of 100k transactions was accomplished within roughly twelve minutes, well-exceeding Reddit’s expectation of five days. This was performed using six full-nodes operating on compute-optimized AWS c4.2xlarge instances which were geographically distributed (Virginia, London, Sao Paulo Brazil, Oregon, Singapore, Germany). A higher quantity of settlements could be reached within the same time-frame with more broadcasting nodes involved, or using hosts with more resources for faster execution of the process.
Addresses used: 100,014
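For context, a quick back-of-the-envelope of what that run implies in settled transactions per second, using only the figures stated above:

    # 100k on-chain settlements in roughly twelve minutes
    txs = 100_000
    minutes = 12
    print(round(txs / (minutes * 60)))   # ~139 settled transactions per second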
The demonstration was executed using this tool. The results can be seen in the following blocks:
612722: https://sys1.bcfn.ca/block/6d47796d043bb4c508d29123e6ae81b051f5e0aaef849f253c8f3a6942a022ce
612723: https://sys1.bcfn.ca/block/8e2077f743461b90f80b4bef502f564933a8e04de97972901f3d65cfadcf1faf
612724: https://sys1.bcfn.ca/block/205436d25b1b499fce44c29567c5c807beaca915b83cc9f3c35b0d76dbb11f6e
612725: https://sys1.bcfn.ca/block/776d1b1a0f90f655a6bbdf559ff5072459cbdc5682d7615ff4b78c00babdc237
612726: https://sys1.bcfn.ca/block/de4df0994253742a1ac8ac9eec8d2a8c8b0a6d72c53d6f3caa29bb6c171b0a6b
612727: https://sys1.bcfn.ca/block/e5e167c52a9decb313fbaadf49a5e34cb490f8084f642a850385476d4ef10d70
612728: https://sys1.bcfn.ca/block/ab64d989edc71890e7b5b8491c20e9a27520dc45a5f7c776d3dae79057f59fe7
612729: https://sys1.bcfn.ca/block/5e8b7ecd0e36f99d07e4ea6e135fc952bf7ec30164ab6f4d1e98b0f2d405df6d
612730: https://sys1.bcfn.ca/block/d395df3d31dde60bbb0bece6bd5b358297da878f0beb96be389e5f0e043580a3
It is important to note that this POC is not focused on Z-DAG. The performance of Z-DAG has been benchmarked within realistic network conditions: Whiteblock’s audit is publicly available. Network latency tests showed an average TPS around 15k with burst capacity up to 61k. Zero-latency control group exhibited ~150k TPS. Mainnet testing of the Z-DAG network is achievable and will require further coordination and additional resources.
Even further optimizations are expected in the upcoming Syscoin Core release, which will implement a UTXO model for our token layer, bringing further efficiency as well as opening the door to additional scaling technology currently under research by our team and academic partners. At present our token layer is account-based, similar to Ethereum. Opt-in compliance structures will also be introduced soon, which will offer some positive performance characteristics as well. It makes the most sense to implement these optimizations before performing another benchmark for Z-DAG, especially on mainnet, considering the resources required to stress-test this network.

Cost Savings

Total cost for these 100k transactions: $0.63 USD
See the live fee comparison for savings estimation between transactions on Ethereum and Syscoin. Below is a snapshot at time of writing:
ETH price: $318.55 ETH gas price: 55.00 Gwei ($0.37)
Syscoin price: $0.11
Snapshot of live fee comparison chart
Z-DAG provides a more efficient fee-market. A typical Z-DAG transaction costs 0.0000582 SYS. Tokens can be safely redeemed/re-spent within seconds or allowed to settle on-chain beforehand. The costs should remain about this low for microtransactions.
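A quick sanity check of how the quoted per-transaction fee lines up with the POC total, using only the prices in this snapshot:

    # Fee per Z-DAG transaction at the quoted SYS price, and the implied cost of the 100k-tx POC
    zdag_fee_sys = 0.0000582
    sys_price_usd = 0.11

    per_tx_usd = zdag_fee_sys * sys_price_usd   # ~$0.0000064 per transfer
    print(per_tx_usd)
    print(per_tx_usd * 100_000)                 # ~$0.64, in line with the ~$0.63 total above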
Syscoin will achieve further reduction of fees and even greater scalability with off-chain payment channels for assets, with Z-DAG as a resilience fallback. New payment channel technology is one of the topics under research by the Syscoin development team with our academic partners at TU Delft. In line with the calculation in the Lightning Network white paper, payment channels using assets with Syscoin Core will bring theoretical capacity for each person on Earth (7.8 billion) to have five on-chain transactions per year, per person, without requiring anyone to enter a fee market (aka “wait for a block”). This exceeds the minimum LN expectation of two transactions per person, per year; one to exist on-chain and one to settle aggregated value.
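As a rough translation of that claim into sustained throughput (arithmetic only, based on the figures in the paragraph above):

    # Five on-chain transactions per person per year for 7.8 billion people
    people = 7_800_000_000
    tx_per_person_per_year = 5
    seconds_per_year = 365 * 24 * 3600

    yearly_tx = people * tx_per_person_per_year   # 39 billion on-chain settlements per year
    print(round(yearly_tx / seconds_per_year))    # ~1,237 settlements per second sustained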

Tools, Infrastructure & Documentation

Syscoin Bridge

Mainnet Demonstration of Syscoin Bridge with the Basic Attention Token ERC-20
A two-way blockchain interoperability system that uses Simple Payment Verification to enable:
  • Any Standard ERC-20 token to be moved from Ethereum to the Syscoin blockchain as a Syscoin Platform Token (SPT), and back to Ethereum
  • Any SPT to be moved from Syscoin to the Ethereum blockchain as an ERC-20 token, and back to Syscoin

Benefits

  • Permissionless
  • No counterparties involved
  • No trading mechanisms involved
  • No third-party liquidity providers required
  • Cross-chain Fractional Supply - 2-way peg - Token supply maintained globally
  • ERC-20s gain vastly improved transactionality with the Syscoin Token Platform, along with the security of bitcoin-core-compliant PoW.
  • SPTs gain access to all the tooling, applications and capabilities of Ethereum for ERC-20, including smart contracts.

Source code

https://github.com/syscoin/?q=sysethereum
Main Subprojects

API

Tools to simplify using Syscoin Bridge as a service with dapps and wallets will be released some time after implementation of Syscoin Core 4.2. These will be based upon the same processes which are automated in the current live Sysethereum Dapp that is functioning with the Syscoin mainnet.

Documentation

Syscoin Bridge & How it Works (description and process flow)
Superblock Validation Battles
HOWTO: Provision the Bridge for your ERC-20
HOWTO: Setup an Agent
Developer & User Diligence

Trade-off

The Syscoin Ethereum Bridge is secured by Agent nodes participating in a decentralized and incentivized model that involves roles of Superblock challengers and submitters. This model is open to participation. The benefits here are trust-minimization, permissionless-ness, and potentially less legal/regulatory red-tape than interop mechanisms that involve liquidity providers and/or trading mechanisms.
The trade-off is that due to the decentralized nature there are cross-chain settlement times of one hour to cross from Ethereum to Syscoin, and three hours to cross from Syscoin to Ethereum. We are exploring ways to reduce this time while maintaining decentralization via zkp. Even so, an “instant bridge” experience could be provided by means of a third-party liquidity mechanism. That option exists but is not required for bridge functionality today. Typically bridges are used with batch value, not with high frequencies of smaller values, and generally it is advantageous to keep some value on both chains for maximum availability of utility. Even so, the cross-chain settlement time is good to mention here.

Cost

Ethereum -> Syscoin: Matic or Ethereum transaction fee for bridge contract interaction, negligible Syscoin transaction fee for minting tokens
Syscoin -> Ethereum: Negligible Syscoin transaction fee for burning tokens, 0.01% transaction fee paid to Bridge Agent in the form of the ERC-20, Matic or Ethereum transaction fee for contract interaction.

Z-DAG

Zero-Confirmation Directed Acyclic Graph is an instant settlement protocol that is used as a complementary system to proof-of-work (PoW) in the confirmation of Syscoin service transactions. In essence, a Z-DAG is simply a directed acyclic graph (DAG) where validating nodes verify the sequential ordering of transactions that are received in their memory pools. Z-DAG is used by the validating nodes across the network to ensure that there is absolute consensus on the ordering of transactions and no balances are overflowed (no double-spends).
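A highly simplified sketch of that “no overflowed balances” rule: walk transfers in mempool arrival order and drop any spend that would overdraw a sender (real Z-DAG involves much more; this only illustrates the ordering check):

    # Toy mempool check: apply transfers in arrival order and reject anything that overdraws a balance
    def validate_sequence(balances, transfers):
        accepted = []
        for sender, receiver, amount in transfers:
            if balances.get(sender, 0) >= amount:
                balances[sender] -= amount
                balances[receiver] = balances.get(receiver, 0) + amount
                accepted.append((sender, receiver, amount))
            # a second spend of the same funds fails this check and is dropped
        return accepted

    pool = [("alice", "bob", 10), ("alice", "carol", 10)]   # alice tries to spend 10 twice
    print(validate_sequence({"alice": 10}, pool))           # only the first transfer is accepted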

Benefits

  • Unique fee-market that is more efficient for microtransaction redemption and settlement
  • Uses decentralized means to enable tokens with value transfer scalability that is comparable to or exceeds that of credit card networks
  • Provides high throughput and secure fulfillment even if blocks are full
  • Probabilistic and interactive
  • 99.9999% security assurance within 10 seconds
  • Can serve payment channels as a resilience fallback that is faster and lower-cost than falling-back directly to a blockchain
  • Each Z-DAG transaction also settles onchain through Syscoin Core at 60-second block target using SHA-256 Proof of Work consensus

Source code

https://github.com/syscoin/syscoin

API

Syscoin-js provides tooling for all Syscoin Core RPCs including interactivity with Z-DAG.

Documentation

Z-DAG White Paper
Useful read: An in-depth Z-DAG discussion between Syscoin Core developer Jag Sidhu and Brave Software Research Engineer Gonçalo Pestana

Trade-off

Z-DAG enables the ideal speed/security tradeoff to be determined per use-case in the application layer. It minimizes the sacrifice required to accept and redeem fast transfers/payments while providing more-than-ample security for microtransactions. This is supported on the premise that a Reddit user receiving points does need security yet generally doesn’t want nor need to wait for the same level of security as a nation-state settling an international trade debt. In any case, each Z-DAG transaction settles onchain at a block target of 60 seconds.

Syscoin Specs

Syscoin 3.0 White Paper
(4.0 white paper is pending. For improved scalability and less blockchain bloat, some features of v3 no longer exist in current v4: Specifically Marketplace Offers, Aliases, Escrow, Certificates, Pruning, Encrypted Messaging)
  • 16MB block bandwidth per minute assuming segwit witness carrying transactions, and transactions ~200 bytes on average
  • SHA256 merge mined with Bitcoin
  • UTXO asset layer, with base Syscoin layer sharing identical security policies as Bitcoin Core
  • Z-DAG on asset layer, bridge to Ethereum on asset layer
  • On-chain scaling with prospect of enabling enterprise grade reliable trustless payment processing with on/offchain hybrid solution
  • Focus only on Simple Value Transfers. MVP of blockchain consensus footprint is balances and ownership of them. Everything else can reduce data availability in exchange for scale (Ethereum 2.0 model). We leave that to other designs, we focus on transfers.
  • Future integrations of MAST/Taproot to get more complex value transfers without trading off trustlessness or decentralization.
  • Zero-knowledge proofs are a new cryptographic frontier. We are dabbling here to generalize the concept of bridging and also to verify the state of a chain efficiently. We also apply it in our Digital Identity projects at Blockchain Foundry (a publicly traded company which develops Syscoin software for clients). We are also looking to integrate privacy-preserving payment channels for off-chain payments through a zkSNARK hub & spoke design which does not suffer from the HTLC attack vectors evident on LN. Many of the issues plaguing the Lightning Network can be resolved using a zkSNARK design whilst also providing the ability to do a multi-asset payment channel system. Currently we found a showstopper attack (American Call Option) on LN if we were to use multiple assets. This would not exist in a system such as this.

Wallets

Web3 and mobile wallets are under active development by Blockchain Foundry Inc as WebAssembly applications and expected for release not long after mainnet deployment of Syscoin Core 4.2. Both of these will be multi-coin wallets that support Syscoin, SPTs, Ethereum, and ERC-20 tokens. The Web3 wallet will provide functionality similar to Metamask.
Syscoin Platform and tokens are already integrated with Blockbook. Custom hardware wallet support currently exists via ElectrumSys. First-class HW wallet integration through apps such as Ledger Live will exist after 4.2.
Current supported wallets
Syscoin Spark Desktop
Syscoin-Qt

Explorers

Mainnet: https://sys1.bcfn.ca (Blockbook)
Testnet: https://explorer-testnet.blockchainfoundry.co

Thank you for close consideration of our proposal. We look forward to feedback, and to working with the Reddit community to implement an ideal solution using Syscoin Platform!

submitted by sidhujag to ethereum [link] [comments]

Update and Few Thoughts, a (Well-Typed) transcript: Liza&Charles the marketeers, Voltaire kick-off, PrisM and Ebb-and-Flow to fuck ETH2.0 Gasper, the (back)log of a man and a falcon, lots of companies, September Goguen time, Basho, 2021 Titans, Basho, Hydra and much more thoughts and prayers

Hi everybody this is Charles Hoskinson broadcasting live from warm sunny Colorado. I'm trying a new streaming service and it allows me to annotate a few things and simulcast to both periscope and youtube. Let's see how this works. I also get to put a little caption. I think for the future, I'm just for a while going to put: "I will never give away ada". So, when people repost my videos for giveaway scams they at least have that. First off, a thank you, a community member named Daryl had decided to carve a log and give his artistic impression of my twitter profile picture of me and the falcon so that always means a lot when I get these gifts from fans and also I just wanted to, on the back of the Catalyst presentation, express my profound gratitude and excitement to the community.
You know it's really really cool to see how much progress has been made in such a short period of time. It was only yesterday when we were saying "when Shelley"? Now Shelley's out and it's evolving rapidly. Voltaire is now starting to evolve rapidly and we're real close to Goguen. At the end of this month we'll be able to talk around some of the realities of Goguen and some of the ideas we have and give some dates for certain things and give you a sense of where that project is at. The good news is that we have gained an enormous amount of progress and knowledge about what we need to do and how to get that done and basically people are just executing and it's a much smaller task than getting us to Shelley. With Byron to Shelley we literally had to build a completely new cryptocurrency from the ground up. We had to have new ledger rules, new update system, we had to invent a way of transitioning from one system to another system and there's hundreds of other little innovations along the way: new network stack and so forth. Byron cosmetically looks like Shelley but under the hood it's completely different and the Shelley design was built with a lot of the things that we needed for Goguen in mind. For example, we built Shelley with the idea of extended UTXO and we built Shelley understanding what the realities were for the smart contract model and that's one of the advantages you get when you do this type of bespoke engineering. There's two consequences to that, one, the integration is significantly easier, and two, the integration is significantly faster. We won't look at that same complexity there.
The product update at the end of the month... We'll really start discussing around some of these things as well as talk about partners and talk about how the development ecosystem is going to evolve. There are a lot of threads throughout all three organizations that are happening simultaneously. Emurgo, they're really thinking deeply about DeFi and they've invited us to collaborate with them on things like stablecoins for example but we're also looking at oracles (oracle pools), DEX and these other things and because there are already people in market who have made mistakes, learned lessons, it gives us the benefit of hindsight. It means we can be much faster to market and we can build much more competitive things in market and the Cardano community gets first access to these next generation DeFi applications without a lot of the problems of the prior generations and that's super beneficial to us.
You know, the other side of it, is that Voltaire is going to have a systemic influence not just on community funding but also the overall evolution and direction of the platform. The longer it exists the more pervasive it will become. Probably first applied towards the Cardano foundation roadmap but later on it will definitely have a lot of influence and say over every element aspect of the system including the launch dApps and these other things. Basically, long term, the types of problems that Cardano solves so that's incredibly appealing to me and very exciting to me because it's like I have this giant community brain with the best and brightest of all of you working with us to get us where we need to go.
You know, another thing that was super encouraging, it's a small thing, but it shows us that we're definitely heading in the right direction, was that we recently got a demo from Pramod (Viswanath) and his team out of the University of Illinois on a protocol they created called PrisM which is a super fast proof-of-work protocol and they wrote this beautiful paper and they wrote code along with it that showed that PrisM is ten thousand times faster than Nakamoto consensus. If you take the bitcoin proof-of-work protocol, you strip it out, you put PrisM in, you can run the entire bitcoin system 10000 times faster. They have these beautiful benchmarks to show that. Even in bad network conditions. I'm promoting this team, they're real researchers, and they're real engineers, they use a lot of cool HPC concepts like springboarding and other things like that to accommodate that. Then I asked him in the presentation, I said well, how much faster if you replay the Ethereum chain? He says, well, it takes a big performance hit, it could be only maybe a hundred times, because that model is not as easy to optimize and shard with standard computer science concepts. In fact in some cases there are limitations there that really can't be overcome. It turns out that we're more on that UTXO side than we are on the account side. Whether by coincidence or by intent of the design of extended UTXO, we're gonna have a lot easier time getting much higher performance where and when it's necessary.
I also approved this week a scaling up of the Basho project. In particular, to build a hydra prototype team. The science has gotten to a point where we can make a really competitive push in that particular direction. What does that mean? It means that in just a few short months we can de-risk technological approaches that long-term will give us a lot of fruit where and when the community decides that they need infrastructure like hydra. Now, here's the beautiful thing about hydra. If you watch my whiteboard back in September of 2017 when Cardano first hit market with Byron I talked about this concept of looking at scalability with a very simple test which is as you get more people in the system it stays at the same performance or it gets faster. We all experience systems that do this, for example, bittorrent, more people downloading something you tend to be able to get it faster and we all experience the converse which is, the system gets slower when you get more people. What does this mean? It means that hydra is an actual approach towards true scalability in the system and it's a lot easier to do than sharding even though we have a beautiful approach to get the sharding on the ledger side if we truly desire to go down that way. There's beautiful ideas that we are definitely in deep discussions about. That's a very complex thing. There was recently a paper ("Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma") out of Stanford that showed that the Gasper protocol as proposed for ETH2.0 does have some security concerns and it's going to be the burden on the shoulders of the Ethereum 2.0 developers and Vitalik to address those concerns from those Stanford professors. Whenever you have these very complex protocols they have so many different ways they can break and things can go wrong so it's much more appealing when you don't have to embrace complexity to achieve the same. The elegance of hydra is that stake pool operators are very natural parties to put hydra channels on and every time we add one we get much more performance out of that and the system as it gets more valuable. The k factor increases which means you get more stake pool operators, which means you get more hydra channels, so with growth we get appreciation, with appreciation we get more decentralization, with more decentralization we get more performance. In essence, spiritually speaking, this is really what we meant when we said scalability. That the system will always grow to meet its particular needs and we have a very elegant way of moving in that direction that doesn't require us to embrace very sophisticated techniques. It's not to say that these techniques don't have a place and purpose but it says that the urgency of implementing these is gone and we then have the luxury to pick the best science when it's ready instead of rushing it to market to resolve a crisis of high fees. We'll never have that crisis so there's a beauty to Cardano that is missing, in my view, from many cryptocurrencies and blockchains in the marketplace and we're now seeing that beauty shine through. Not only through our community who are so passionate and amazing but in the science and the engineering itself and how easy it is for us to navigate the concepts. How easy it is for us to add more things, to take some things away, to clean some things up here and there and our ability to move through.
I never imagined when in 2015 I signed up to go in on this crazy ride and try to build a world financial operating system we would have made as much progress as we made today. We've written more than 75 research papers as an organization many of which are directly applicable to Cardano. We've got great partners who work with Nasa and Boeing and Pfizer, massive companies, that have 10 years of history and millions of users to come in and help us grow better. We've worked with incredible organizations, major universities like university of Wyoming, university of Edinburgh, Tokyo, tech professors all across the world. We've worked with incredible engineering firms like VacuumLabs and AtixLabs and Twig and Well-Typed, runtime verification, QuviQ and dozens of others along the years and despite the fact that at times there's been delays and friction throughout this entire journey we've mostly been aligned and we keep learning and growing. It gives me so much hope that our best days are ahead of us and an almost fanatical belief that success is inevitable in a certain respect. You see because we always find a way to be here tomorrow and we always find a way to make tomorrow a better day than today and as long as that's the trend you're monotonically increasing towards a better tomorrow, you're always going to have that outcome, you're always going to be in a position where Cardano shines bright. Towards the end of the month we'll have a lot more to say about the development side and that'll be a beginning just like Voltaire is the beginning and then suddenly you now notice the beautiful parallelism of the roadmap. Shelley continues to evolve, partial delegation is coming, in fact, I signed the contract with vacuumlabs to bring that to Ledger (and Trezor). The Daedalus team is hard at work to make that feature apparent for everyone as is the Yoroi team.
You see that, with now Voltaire, and soon was Goguen, and these are not endpoints, rather they're just beginnings and they're never over. We can always make staking better, more diverse, more merit-based and entertain different control models, have better delegation mechanics, have better user experience. The same for smart contracts, that's an endless river and along the way what we've discovered is it's easy for us to work with great minds and great people. For example with testing of smart contracts I would love to diversify that conversation above and beyond what we can come up with and bring in some firms who have done this for a long time to basically take that part with us shoulder to shoulder and build beautiful frameworks to assist us. For example, runtime verification is doing this with, the EVM with a beautiful project called Firefly to replace Truffle. I believe that we can achieve similar ends with Plutus smart contracts.
When you ask yourself what makes a system competitive in the cryptocurrency space, in my view there are four dimensions, and you have to have a good story for all four of them. You need security and correctness. A lot of people don't prioritize that, but when they get it wrong it hurts retail people, it hurts everyday people; billions of dollars have been lost due to the incompetence and ineptitude of junior developers making very bad mistakes, and oftentimes those developers faced no consequences. The people who lost money were innocent people who believed in cryptocurrencies and wanted to be part of the movement but didn't protect themselves adequately. That's a really sad thing, and it's unethical to continue pushing a model where that is the standard or the likely outcome rather than a rare edge case. You have to, as a platform, a third-generation platform, invest heavily in giving developers proper tools to ensure security and correctness. We've seen a whole industry grow up around this; there have been great innovations out of Quantstamp and ConsenSys and dozens of other firms in the space, including Runtime Verification, who have really made major leaps in the last few years in trying to improve that story. What's unique to Cardano is that we based our foundations on languages that were designed right the first time, and there's over 35 years of history behind the approach we're following on the Haskell side that allows us, and the developers in our ecosystem, to build high-assurance systems. We didn't reinvent the wheel, we found the best wheel and we're giving it to you.
I think we're going to be dominant in that respect as we enter 2021. Second, you look at things like ease of maintenance, ease of deployment, the life cycle of the software and upgrades to the software, as we've demonstrated with things like the hard fork combinator, and the fact that Voltaire is not just a governance layer for ada and Cardano but will eventually be reusable for any dApp deployed on our system. You have very natural tooling that's going to allow people to upgrade their smart contracts and their dApps and enable governance for their users at an incredibly low cost, without having to reinvent the governance wheel for each and every application. This is another property unique to our system, and it can be reused for the dApps that you deploy on it, as I've mentioned before. Performance is a significant concern, and this was often corrupted by marketers, especially ICO marketers, who really wanted to differentiate and say: "our protocol, tested on a single server in someone's basement, does 500,000 transactions per second", as if that somehow translates to real-life performance. That's antithetical to anyone who's ever studied distributed systems and understands the reality of these systems, where they go and what they do. In terms of performance, I think we have the most logical approach. You know, we have 10 years of history with bitcoin; it's a massive system, we've learned a huge amount, there are a lot of papers written about it and a lot of practical projects, and bitcoin is about to step into the world of smart contracts. We congratulate them on getting Schnorr sigs in and the success of Taproot. That means that entering 2021, 2022, we are going to start seeing legitimate dApps, DeFi projects, real applications choosing bitcoin instead of Ethereum or Algorand, EOS or Cardano, and they're adding a lot to that conversation. I think that ultimately that model has a lot of promise, which is why we built a better one. There are still significant limitations on what bitcoin can accomplish, from settlement time to the verbosity of the contracts that can be written.
The extended UTXO model was designed to be the fastest and most charitable accounting model ever, on and off chain, and Hydra was designed to allow you to flex between those two seamlessly. When you look at the foundations of where we're at; how we can extend this with domain-specific languages for domain experts, such as Marlowe for financial experts and the DSLs that will come later for others, like lawyers, supply chain experts, medical databases and so forth; how easy it is to write and deploy these; and Plutus being beautiful glue code for both on- and off-chain communication, I think we have an incredibly competitive offering for performance, and when Hydra comes, simply put, there'll be no one faster. If we need to shard, we're going to do that, and definitely better than anybody else, because we know where our security model sits and there won't be surprise Stanford papers to blindside us that require immediate addressing.
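As a rough illustration of the extended UTXO idea referenced above, here is a deliberately simplified sketch: each output carries a datum and a validator, and spending it means running that validator against a supplied redeemer. This is only the concept in miniature, not the Cardano ledger's actual data model or the Plutus API, and every name in it is made up for the example.

```python
# Simplified sketch of the extended UTXO idea: outputs carry a datum and a
# validator; spending means the validator accepts the supplied redeemer.
# This mirrors the concept only; it is not the real ledger rules or Plutus.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass(frozen=True)
class Output:
    value: int                                   # amount locked in this output
    datum: object                                # state carried by the output
    validator: Callable[[object, object], bool]  # (datum, redeemer) -> may spend?

UTxO = Dict[Tuple[str, int], Output]             # (tx_id, index) -> output

def spend(utxo: UTxO, ref: Tuple[str, int], redeemer: object) -> Output:
    out = utxo[ref]
    if not out.validator(out.datum, redeemer):
        raise ValueError("validator rejected the redeemer")
    return utxo.pop(ref)                         # consume the output

# Example: an output that can only be spent with the matching secret.
utxo: UTxO = {("tx0", 0): Output(100, datum="secret", validator=lambda d, r: d == r)}
print(spend(utxo, ("tx0", 0), "secret").value)   # prints 100
```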
In terms of operating costs, this is the last component in my view, and it's basically: how much does it cost you, the developer, to run your application? There are really two dimensions: one is predictability and the other is amount. It's not good enough to say it's a penny per transaction today. You need to know that after you spend millions of dollars and months or years of effort building something and deploying it, you're not going to wake up tomorrow and find it's now five dollars to do what used to cost a penny. You need that cost to be as low as possible and as predictable as possible, and again, the way we architected our system, as we turn things on towards the end of this year and as we enter into next year, we believe we have a great approach to achieve low operating cost.

When people ask why Cardano? Well, because we have great security and correctness in the development experience and tools, with 35 years of legacy that were built right the first time and don't put the burden of mistakes on your customers. They ask why Cardano, and we say: well, the chain itself is going to give you great solutions for identity, value transformation and governance, and as a consequence, when you talk about upgrading your applications, having a relationship with the customers of your applications, and the ease of maintenance of those applications, there's going to be a good story there, and we have beautiful frameworks like Voltaire that allow that story to evolve, and we keep adding partners who have decades of experience to help us along. We won't stop until it's much better. They ask why Cardano? We say: because at the moment we're 10 times faster than Ethereum today, and that's all we really need for this year and next year, to be honest, and in the future we can be as fast as we need to be because we're truly scalable. As the system gets more decentralized, the system improves in performance, and where and when we need to shard, we can do that. We'll have the luxury of time to do it right, the Cardano way. And when people ask why Cardano? Because the reality is, it's very cheap to do things on our platform, and with the way we're building things, that's going to continue being the case, and we have the governance mechanisms to allow the community to readjust fees and parameters so that it can continue being affordable for users. Everything in the system will eventually be customizable and parameterizable, from block size to transaction fees, and the community will be in a good position to dynamically adjust these things where and when needed so that we, as an ecosystem, can enjoy predictability in our costs.
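For context on why fees on Cardano are predictable: the protocol charges a simple linear function of transaction size, fee = a × size + b, where a and b are protocol parameters that governance can later adjust. The sketch below uses the commonly cited Shelley-era mainnet values (44 lovelace per byte plus a flat 155,381 lovelace); treat them as illustrative rather than authoritative, since the parameters can change.

```python
# Linear fee rule, fee = a * size_in_bytes + b, with Shelley-era mainnet
# parameter values (in lovelace). Illustrative: these are adjustable
# protocol parameters, not constants.
LOVELACE_PER_ADA = 1_000_000
MIN_FEE_A = 44         # lovelace per byte
MIN_FEE_B = 155_381    # flat lovelace component

def tx_fee_lovelace(tx_size_bytes: int) -> int:
    return MIN_FEE_A * tx_size_bytes + MIN_FEE_B

for size in (200, 400, 16_000):   # a small payment up to a very large transaction
    fee = tx_fee_lovelace(size)
    print(f"{size:6d} bytes -> {fee:9d} lovelace ({fee / LOVELACE_PER_ADA:.6f} ada)")
```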
In the coming weeks and months, especially in my company, we're going to invest a lot of time and effort into comparison marketing and product marketing. When I see people say, oh well, you've launched proof of stake, a lot of other people have done that, I don't think those people fully appreciate the magnitude of what we have actually accomplished as an ecosystem and the quality of the protocols that are in distribution. That's not their fault, it's our fault, because we didn't take the time, in simple terms, not scientific papers and deep code and formal specifications but everyday language, to really show why we're different. I admit that that's a product failing and it needs to be corrected, so we hired a great marketing director, named Liza (Horowitz?), and she is going to work full time with me and others in the ecosystem, a great team of people, every single day to get out there and explain why what we have done is novel, unique, competitive and special in our industry. Everything from Ouroboros, in contrast to the other major protocols from the EOSes and Algorands and Tezoses of the world, why we're different and the trade-offs we chose, to our network stack, to the extended UTXO model, to Plutus, to Marlowe; and we're going to keep hammering away at that until we get it right and everybody acknowledges and sees what has been accomplished.
I've spent five years of my life, good years of my life, and missed a lot to get this project where it needs to go. All of our employees have invested huge amounts of their personal lives, their time, their brand, their careers in trying to make this the most magical and special cryptocurrency and blockchain infrastructure around. No one signed up at this company, or the other companies working on Cardano, to work on a mediocre protocol, just another blockchain; they signed up to change the world. They signed up to build a system that can legitimately look you in the face and say: one day we have the potential to have a billion users! That's what they signed up for, and they showed up to play. They built technology that evolves in that direction with some certainty and great foundations, and we have an obligation to market it in a way that can show the world why, succinctly, with clarity. Understandably, this has been a failing in the past, but you know what? You can always be better tomorrow, that monotonically increasing improvement, and that's what we're going to do. We recognized it and we're going to invest in it, and with Voltaire, if we can't do it, you the community can do it and we'll work with you; if you can do a better job, the funding will be there to get that done. In addition to this, we think about 2021 and we ask where the future takes us. I've thought a lot about this, you know, about how we approach the next five years as we close out 2020, and here's the reality: we're not going to leave as a company until we have smart contracts and multi-asset support, until Voltaire has evolved to a point where the community can comfortably make decisions about the future of the protocol, and until the staking experience has solidified and is stable.
I don't care if this costs me millions or tens of millions of dollars out of my own pocket to make happen. I'm going to do that because that's my commitment to you, the community, and every product update will keep pushing our way there. We'll continue to get more transparent, we'll continue to get more aggressive, and we'll hire more and parallelize more where we can, to deliver that experience so that Cardano gets where it needs to go. Then, when we ask where we go next, the reality is that the science and the engineering of our industry have given us a menu of incredibly unique, attractive and sexy things that we can pursue. What we're going to do is work with the community and the very same tools that are turning on today, the Voltaire tools, the cardano.ideascale.com tools, and we're going to propose a consortium, bring the best and brightest together and give a vision of where we can take the system in another five years. With the benefit of hindsight, massively improved processes, better estimation capabilities and the fact that we're not starting with two people at IOG: we're starting with 250 people, the best scientific division in our industry and the legacy of nearly 100 scientific papers by the end of this year. And that's just us; there are dozens of companies throughout this history who have worked on Cardano. It's about time to scale them up too and get client diversity. So come next year, when the protocol has evolved to the point where it's ready for it, we'll have that conversation with you, the community, and that's going to be a beautiful conversation. At the conclusion of it, there's going to be certainty about how we're going to evolve over the next five years to take ourselves beyond the cryptocurrency space. I'm very tired of the conversations we keep having: are you going to go to (CoinDesk's) Consensus or not? Who's going to be the big winner? What about Libra, or what about this particular regulation, this crypto unicorn, this thing?
You know, I've been in the space a long time and I've noticed that people keep saying the same things year after year in the same venues. Yes, the crowd sizes get larger and the amount of value at risk gets larger, but I haven't seen a lot of progress in the places where I feel it is absolutely necessary for this technology to become permanent: the developing world. We need to see economic identity. People often ask what the mission for Cardano is. For us at IOG, you look at economic identity and you lay out a roadmap for it, and at each and every step along the way, from open data, to self-sovereign identity, to financial inclusion, you can keep going down: to decentralized lending, decentralized insurance, decentralized banking, each and every step along the way to economic identity. When you map that onto a blockchain, it tells you there's a collection of applications and infrastructure that you need to build.
My life's work is to get to a point where we have the technology to do that, the infrastructure to do that, with principles, and so we'll keep evolving Cardano, and we'll keep evolving the space as a whole and the science as a whole, until I can wake up and say: for each box on that road to economic identity, for all people and not just one group, we have a solution. I'm going to put those applications on Cardano, and success for me is not about us being king of the crypto hill, having a higher market cap than bitcoin, or being entrepreneur of the year or CoinDesk's most influential person. That's meaningless noise. Success for me is reflecting back on the things that we have accomplished together and recognizing that millions, if not billions, now live in a system where they all matter, they all have a voice, they all have an equal footing. The Jeff Bezoses of the world have the very same experience as the person born in Rwanda, and we're not done until that's the case. It's a long road, it's a hard road, but you know what? We're making progress. We have great people in Africa, we have great people in Eastern Europe, we have great people in Southeast Asia and great partners all along the way. Great people in Latin America, great people in South America, great people here in the United States.
When we talk about economic identity, there are millions, if not tens of millions, of Americans who don't have it. Same for Canadians: hundreds of thousands who don't have it. In developed Western countries it's the greatest blind spot of policy, and as we enter a depression as a result of coronavirus, add millions if not tens of millions more to that list. Generations are being disenfranchised by this legacy system, and we as an ecosystem, we as an entire community, are offering a different way forward. Not hyper-centralization, not social credit, but a way forward where you own your own money, your own identity, your own data; where you're not a victim of surveillance capitalism, not a victim of civil asset forfeiture, not shut out of society when you say the wrong things. Each and every human being matters, and I'm optimistic enough to believe that when you remind people that they matter, they're going to rise to the occasion. That is the point of my company and of the things that we do each and every day. Our mission is to give platforms to the world so that those who don't have economic identity can get it, keep it, have no one take it from them, and enjoy an ever-increasing standard of living, wealth and prosperity.
However you want to measure that, this is my goalpost. I couldn't care less about the cryptocurrency space. It was a great place to start, but the space needs to be reminded why it exists. Bitcoin was given a mandate on the back of the 2008 financial crisis to do something different. It was not given a mandate to become a new settlement layer for central banks, or a new way for the old guard to make more money, for banks to get bigger and for those in control to preserve their power. The whole point of doing something as crazy as buying a coin that doesn't even exist in real life, that's just a bunch of numbers in the cloud, was so that we as a society could do something different from the way we'd been doing things before. So each and every member of the cryptocurrency space needs to remind everyone else, from time to time, why we're here, where we came from and where we're going.
The beauty of Cardano is that we have already achieved, for the most part, a decentralized brain, and that momentum is pushing harder than ever. More and more scientists are waking up, more and more institutions are waking up and getting us there. We have the code and the right approach, and I think we have a great competitive offering for 2021 as we go and battle the titans, and that's going to be a lot of fun, but we know who we are, where we're going, and that we're in the right places. It's so incredibly encouraging to see the stake pool operators not just be from California or Texas or New York or Canada, but to see a lot of stake pool operators from the places that need the most help. Everybody does matter, and it means a lot to me for the people who are there, but it means a lot to everybody to be able to say that we have created an equal platform. It makes the participation of all of us so much more meaningful. We're not just talking to each other, we're talking to the world, and by working together on this platform we're lifting the world up and giving people hope. That's the point. There's a lot more to do; we didn't get everything done. You never do: you aspire, you work hard, you set a moonshot, and sometimes you just get to orbit on the first go, but you know what? When you build the next rocket you can go to Mars.
Thank you all for being with me, thank you all for being part of this. Today was a damn good day with the announcement of Voltaire. Go to cardano.ideascale.com and participate in that, so the end of September is going to be a good day too. There are a lot of good days to come, and in between a lot of hard days, doing tasks sometimes entirely forgettable but always necessary to keep the revolution and the movement going. I cannot wait for 2021; our best days are ahead of us, because of you. You all take care now.
Source: https://www.youtube.com/watch?v=BFa9zL_Dl_w
Other things mentioned:
https://cardano.ideascale.com/
https://www.atixlabs.com/blockchain
https://www.well-typed.com/
https://www.vacuumlabs.com/
https://medium.com/interdax/what-is-taproot-and-how-will-it-benefit-bitcoin-5c8944eed8da
https://medium.com/interdax/how-will-schnorr-signatures-benefit-bitcoin-b4482cf85d40
https://quantstamp.com/
https://bloxian.com/bloxian-platforms/ (TWIG)
https://runtimeverification.com/firefly/
https://www.trufflesuite.com/
https://experts.illinois.edu/en/publications/prism-deconstructing-the-blockchain-to-approach-physical-limits (PrisM and not our Prism https://atalaprism.io/)
Ebb-and-Flow Protocols: A Resolution of the Availability-Finality Dilemma (the paper analyzing Gasper / ETH 2.0) https://arxiv.org/abs/2009.04987
http://www.quviq.com/products/
https://en.wikipedia.org/wiki/Schnorr_signature
submitted by stake_pool to cardano [link] [comments]

Recap on CoinEx & Avalanche AMA Aug 5, 2020

Recap on CoinEx & Avalanche AMA Aug 5, 2020
Written by SatoshisAngels
Published by read.cash
On August 5th 2020, Satoshi’s Angels hosted an AMA for CoinEx on “How BCH and Avalanche Are Bringing Financial Freedom to 6 Billion People” on a Chinese platform Bihu. During the 100-minute event, Haipo Yang of ViaBTC and CoinEx, and Emin Gun Sirer of AVA Labs shared their in-depth views on such topics as different consensus mechanisms, community governance, IPFS, Defi. And Haipo explained why he wants to fork BCH. This is the full text.
You can check out the full AMA here (mostly in Chinese with some English translation).

Cindy Wang (Satoshi’s Angels): There is news saying that you are going to fork BCH. Is it a marketing makeover? Are you serious about it?
Haipo Yang: It’s definitely not a marketing makeover. But the details are not decided yet.
Over the past three years, the BCH community has gone through multiple discussions from reducing block time, changing mining algorithms, adding smart contracts, etc. But none of these disputes have been well settled.
BCH is a big failure in terms of governance. A lack of good governance has left it in disorder. It is too decentralized to make progress.
You may know that the first BCH block was mined by ViaBTC. And we gave a lot of support to it indeed. But we didn’t dominate the fork. The Chinese community in particular thought I had a lot of influence, but it was not true.
I think the whole community is very dissatisfied with Bitcoin ABC, but it is difficult to replace them or change the status quo. So I am thinking of creating a new branch of BCH. The idea is still in early stage. I welcome anyone interested to participate and discuss it with me.
Wang: Professor Emin, what’s your attitude to fork? Do you think it’s a good timing to fork BCH?
Emin Gun Sirer: I am a big fan of BCH. It adheres to the original vision of Satoshi Nakamoto, and I like the technical roadmap of BCH. But just as Haipo mentioned, BCH lacks a good governance mechanism. There is always something that can cause the BCH community to divide itself.
But I think it’s not enough to just have a good governance mechanism. There are many good proposals in the community that failed to be adopted in the end. I think BCH needs social leadership to encourage discussion when there are new proposals.
Wang: We are all curious to know How Avalanche got its name?
Sirer: I know that Avalanche doesn’t have a good meaning in Chinese, but in English it’s a very powerful word. Avalanche represents a series of algorithms piling together like a mountain. When decisions slowly form, the ball (nodes in the network) on top of the mountain starts going down the hill on one side, and it gets bigger and bigger, like an avalanche, and it becomes unstoppable, making the transaction final.
Wang: Prof. Emin, I know that you are a big blocker. Have you ever considered implementing Avalanche based on BCH? Why create another chain?
Sirer: Of course I considered that. Satoshi Nakamoto consensus is wonderful, but the proof-of-work mechanism and Nakamoto-consensus-based protocols have some shortcomings, such as network latency, and they are hard to scale. Avalanche, instead, is totally different, and is the biggest new breakthrough in the past 45 years. It is flexible, fast, and scalable. I’d love to implement BCH on top of Avalanche in the future, to make BCH even better by making 0-conf transactions much more secure.
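For readers unfamiliar with how Avalanche-family consensus differs from Nakamoto consensus, here is a toy, synchronous simulation of the repeated-subsampling idea (closest to "Slush", the simplest building block described in the Avalanche paper): every node repeatedly polls a small random sample of peers and adopts the super-majority colour it sees. The network size, parameters and peer model below are all made up for illustration; the real protocols add confidence counters and run asynchronously, so this is only a sketch of the intuition.

```python
# Toy "Slush"-style simulation: each node polls k random peers and adopts a
# colour when at least alpha of the sampled peers agree. All numbers are
# illustrative; this is not the production Avalanche protocol.
import random

def slush_round(prefs, k=10, alpha=7, rng=None):
    """One synchronous round over the whole network of node preferences."""
    rng = rng or random.Random()
    new = prefs[:]
    for i in range(len(prefs)):
        sample = rng.sample(range(len(prefs)), k)      # poll k random peers
        votes = {"A": 0, "B": 0}
        for j in sample:
            votes[prefs[j]] += 1
        for colour, n in votes.items():
            if n >= alpha:                             # alpha-majority observed
                new[i] = colour
    return new

prefs = ["A"] * 520 + ["B"] * 480                      # a near-even initial split
for r in range(15):
    prefs = slush_round(prefs, rng=random.Random(r))
    a = prefs.count("A")
    print(f"round {r + 1:2d}: A = {a:4d}  B = {len(prefs) - a:4d}")
# The network quickly tips to a single colour; that rapid convergence is the
# property the Avalanche family builds its fast finality on.
```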
Wang: As an old miner, why did CoinEx Chain choose to “abandon” POW, and turn to POS mechanism?
Haipo: Both POW and POS consensus algorithms have their own advantages. POW is not just a consensus algorithm, but also a more transparent and open distribution method of digital currency. Anyone can participate in it through mining.
POW is fairer. For a POS-based network, participants must have coins; for example, you need to invest in ICO projects to obtain coins, while developers can get a lot of coins almost for free. In addition, POW is more open: anyone can participate without holding tokens. For example, as long as you have a computer and mining rigs, you can participate in mining. Openness and fairness are two great features of POW. POS, on the other hand, is more advanced, safe and efficient.
POS is jointly maintained by the token holders, and there is no problem of 51% attacks. Those who hold tokens are more inclined to protect the network than to destroy it for their own interests. To disrupt the network, you need to buy at least two-thirds of the tokens, which is very difficult to achieve. And when you actually hold that many coins, it hardly makes sense for you to destroy the network.
POW has the problem of 51% attack. For example, ETC just suffered the 51% attack on August 3. And the cost to do that is very low. It can be reorganized with only tens of thousands of dollars. This is also a defect of POW.
In addition, in terms of TPS and block speed, POS can achieve block times on the order of seconds and higher TPS. Therefore, CoinEx Chain chose POS because it can bring a faster transaction experience, which is very important for a decentralized exchange. Both POW and POS have their own advantages; it's a matter of choice, and when choosing a consensus mechanism, the choice must be made according to the characteristics of the specific project.
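A back-of-the-envelope comparison of the two attack models discussed in this answer, with entirely hypothetical numbers (the hash rental price, network size, token supply and token price below are invented for illustration, and buying two-thirds of a supply would in reality push the price up against the attacker, making it even more expensive):

```python
# Hypothetical figures only: compares renting 51% of a PoW network's hash power
# for one day against buying two-thirds of a PoS token's circulating supply.

def pow_rental_cost(network_ghs: float, usd_per_ghs_day: float, share: float = 0.51) -> float:
    """Cost to rent `share` of the network's hash rate for one day."""
    return network_ghs * share * usd_per_ghs_day

def pos_acquisition_cost(supply: float, price_usd: float, share: float = 2 / 3) -> float:
    """Cost to buy `share` of the circulating supply at the current price."""
    return supply * share * price_usd

# A small PoW chain: 5,000 GH/s of network hash rate at $2 per GH/s-day.
print(f"PoW, 51% for a day : ${pow_rental_cost(5_000, 2.0):>15,.0f}")
# A PoS chain: 1 billion tokens in circulation at $0.10 each.
print(f"PoS, 2/3 of supply : ${pos_acquisition_cost(1_000_000_000, 0.10):>15,.0f}")
```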
Wang: Ethereum is switching to ETH 2.0. If they succeed, do you think it will lead the next bull market?
Sirer: If Ethereum 2.0 can be realized, it must be a huge success.
But I doubt it can be launched anytime soon, considering that it has been constantly delayed. And even if it comes out, I am not so sure it will address the core scaling problem. The main technology in Ethereum 2.0 is sharding, which divides the Ethereum network into small parallel groups, but I think what will happen is that everyone wants to be in the same "shard", so the sharding advantages might not be realizable in Ethereum 2.0.
Avalanche supports Ethereum’s virtual machine, and Avalanche can achieve one-second confirmation, while with sharding, finalizing a confirmation takes 5–6 seconds at best. Avalanche’s approach to making Ethereum scale is superior to Ethereum 2.0’s. There are many big players behind Ethereum 2.0, and I wish them success. But I believe that Avalanche will be the fastest and best smart contract platform in the crypto space, and it is compatible with Ethereum.
Wang: Why is Avalanche a real breakthrough?
Sirer: Avalanche is fundamentally different from previous consensus mechanisms. It's very fast, with TPS surpassing 6,500, which is three times that of VISA. Six confirmations can be achieved in one second. Compared with the POW mechanism of Bitcoin and Bitcoin Cash, Avalanche's participation threshold is very low, and it allows multiple virtual machines to be built on the Avalanche protocol.
Avalanche is not created to compete with Bitcoin or fiat currencies such as the US dollar and RMB. It’s not made to compete with Ethereum, which is defined as the “world’s computer”. Avalanche is positioned to be an asset issuance platform to tokenize assets in the real world.
Wang: How do you rank the importance of community, development, governance, and technology to a public chain?
Sirer: These four are like the legs of a table. Every foot is very important. The table cannot stand without strong support.
A good community needs to be open to welcome developers and people. Good governance is especially important, to figure out what users need and respect their voices. Development needs to be decentralized. Avalanche has developers all over the world. And it has big companies building on top of Avalanche.
Yang: From a long-term perspective, I think governance is the most important thing, which is the same as running a company.
In the long run, technology is not the most important thing. Blockchain technology is developed on open-source software that is freely available to the community. Community is also not the most important factor.
I think the most important thing is governance. Decentralization is more of a technical matter. For example, Bitcoin, through a decentralized network, ensures the openness and transparency of data and assets; the data on the chain cannot be tampered with, which guarantees that the total amount of coins has a fixed upper limit.
But at the governance level, all coins are centralized to some degree. For example, BCH developers can decide to modify the protocol. In a sense, it is the same as managing a company.
Historically, the successes and failures of companies have stemmed from governance. For example, Apple succeeded on the basis of Steve Jobs's charisma, leadership and pursuit of user experience. When Jobs was kicked out, Apple suffered great losses. After Jobs returned, he made Apple great again.
The issues behind Bitmain are also about governance. Simply put, governance requires leaders who have a longer-term vision and are more capable of coordinating and balancing the resources and interests of all parties to lead the community.
In the blockchain world, many people focus on technology. In fact, technology alone is not enough to make great products; user experience is most important. Users don't care about the blockchain technology itself; they are more concerned about whether a product is easy to use and whether it can solve their problems.
We need to figure out how to deliver a product like Apple. The pursuit of user experience is also governance in nature. And governance itself lies in the soul of key leaders in the community.
Realizing the tokenization of real-world assets
Wang: Speaking of asset tokenization, I would like to ask Haipo, do you think the market for assets on the chain is big?
Yang: It must be very big. We need to see which assets can be tokenized.
Assets that can be tokenized are standardized assets, such as currencies and securities.
  1. In terms of currency, Tether has issued over 10 billion U.S. dollars. Many people think that's too much, but I think this market is underestimated. The market for stablecoins in the future must be hundreds of billions or even trillions, especially after the release of Facebook's Libra. Even the US dollar might be issued on a blockchain in the future.
At present, the settlement of US dollars goes through the SWIFT system. But SWIFT itself is only a clearing and messaging network, not a settlement network; it takes a long time for clearing and settlement, and it is not reliable. Both USDT and USDC, by contrast, can achieve cross-border transfers in seconds and realize asset delivery. Even sovereign currencies are likely to be issued on the blockchain; I believe the RMB also has such a plan.
  2. Equity and securities markets are the largest market. But they have strict requirements for market access.
Whether a stock is to be listed on the A-share market or in the American markets, getting listed is hard to achieve. I believe that blockchain can fully unlock this demand through decentralization. It can allow any tiny company, or even a single project, to issue, circulate and raise financing with a token.
There may be only tens of thousands of stocks currently traded globally. There are also tens of thousands of tokens in the crypto space. I believe that millions or more of assets will be traded and circulated in the future. This can only be realized through decentralized technology and organization.
The market for asset tokenization will be huge. At present, blockchain technology as a whole is still very primitive: Bitcoin and Ethereum only handle a few to a dozen TPS, which is far from meeting market demand. This is why CoinEx is committed to building a public chain for a decentralized exchange (DEX).
Wang: Avalanche’s paper was first published on IPFS. What do you think of IPFS?
Sirer: I personally like IPFS very much. It is a decentralized storage solution.
Yang: There is no doubt that IPFS solves the problem of decentralized storage, can be robust in the blockchain world, and can replace HTTP services. But there are still three problems:
  1. IPFS is not for ordinary users. Everybody needs BCH and BTC, but only developers need IPFS, which is a relatively niche market;
  2. IPFS is more expensive than traditional storage solutions, which further reduces its practicality. In order to achieve decentralization, more copies must be stored and more hardware must be consumed, and in the end these costs will be passed on to users.
  3. There may be compliance issues. If you use IPFS to store sensitive information, such as info from WikiLeaks, it may end up threatening national security. I doubt that decentralized storage and decentralized public chains can survive under the joint pressure of global governments.
The IPFS project solves certain problems. But from the perspective of application prospects, I am pessimistic.
Wang: What do you think of Defi?
Yang: I want to talk about the concept first.
Broadly speaking, the entire blockchain industry is DeFi in nature: blockchain exists to realize the circulation of currency, equity and asset value through decentralization.
So in a broad sense, blockchain itself is DeFi. In a narrow sense, DeFi means financial agreements based on smart contracts. Through smart contracts, DeFi can build applications more flexibly. For example, before, we could only use Bitcoin to transfer and pay; now, with smart contracts, flexible functions such as lending, exchange, mortgages, etc. are available. The entire blockchain industry is gradually evolving in the direction of DeFi, and DeFi will definitely see greater development in the future.
Sirer: I think Defi will definitely have a huge impact. DeFi is not only an innovation in the cryptocurrency field, but also an innovation in the financial field. Wall Street companies have stagnated for years with no innovation. Avalanche fits different DeFi needs, including performance and compliance. In the future, not only will Wall Street simply adopt DeFi, but DeFi will grow into a huge market that will eventually replace the traditional financial system.
Questions from the community:
1. How does Avalanche integrate with DeFi?
Sirer: At present, all DeFi applications on Avalanche have surpassed Ethereum. What can be achieved on Ethereum can be achieved on Avalanche with better user experience. We are currently connecting with popular DeFi projects such as Compound and MakerDao to add part of or all of their functions.
At present, Avalanche is working on a decentralized exchange (DEX). Current DEXs are limited by speed and performance, but when they are built on top of Avalanche they will be real-time and very fast.
2. How many developers does BCH have?
Yang: I think it does not matter how many developers there are; what matters is what should be developed. I watched a Jobs video the other day, and it inspired me a lot. We are not piecing together technology to see what the technology can do; we figure out what we want first, and then we use the technology we need.
The entire blockchain community worships developers. For example, they call Vitalik "V God". It's not necessary to treat developers as wizards. Developers are programmers, and I myself am a programmer.
ViaBTC has a development team of over 100 people, including core members from Copernicus (a dev team that formerly belonged to Bitmain). Technically, we are very confident we can build faster, more stable products with a better user experience.
submitted by CoinExcom to btc [link] [comments]


If the second blockchain has agreed to be a Bitcoin sidechain, it now does something really special: it creates the exact same number of tokens on its own network and gives you control of them. So it's as if your bitcoins have been transferred to this second chain, and remember, they're immobilized on the Bitcoin network, so we haven't created or destroyed any, just "moved" them.

A sidechain is thus a way of linking separate blockchains. Instead of using only the primary blockchain, a user can transfer digital assets to a supplementary one. Various blockchain platforms, such as Ardor and RSK, operate with sidechains for faster transactions and payments.

Bitcoin is "firewalled" from the sidechain, meaning that any security issues that arise in a sidechain don't affect people who are only involved in the Bitcoin blockchain. Bitcoin sidechains use a system called "two-way pegging", which works as follows: in order to receive one unit of BTC2.0, one would need to take one unit of BTC1.0 and send it into a "script" which we will call X and leave undescribed for now. A script in Bitcoin is an address that, instead of being owned by a private key, essentially acts as a lockbox that ...
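To make the peg bookkeeping above concrete, here is a minimal sketch assuming a toy custodian that tracks locked mainchain coins and pegged sidechain balances. Real two-way pegs authorise these moves with SPV proofs or a federation and enforce maturity periods; none of that machinery is shown, and the class and method names are invented for the example.

```python
# Minimal two-way-peg bookkeeping sketch: peg-in immobilises coins on the main
# chain and credits an equal amount on the sidechain; peg-out reverses it.
# The authorisation machinery (SPV proofs, federation signatures) is omitted.
class TwoWayPeg:
    def __init__(self):
        self.locked_on_mainchain = 0.0   # BTC held by the peg script
        self.sidechain_balances = {}     # sidechain address -> pegged BTC

    def peg_in(self, amount: float, sidechain_addr: str) -> None:
        """Lock coins on the main chain and credit the same amount on the sidechain."""
        self.locked_on_mainchain += amount
        self.sidechain_balances[sidechain_addr] = (
            self.sidechain_balances.get(sidechain_addr, 0.0) + amount
        )

    def peg_out(self, amount: float, sidechain_addr: str) -> None:
        """Burn sidechain coins and release the same amount on the main chain."""
        if self.sidechain_balances.get(sidechain_addr, 0.0) < amount:
            raise ValueError("insufficient pegged balance")
        if self.locked_on_mainchain < amount:
            raise ValueError("peg invariant violated")
        self.sidechain_balances[sidechain_addr] -= amount
        self.locked_on_mainchain -= amount

peg = TwoWayPeg()
peg.peg_in(1.5, "side-addr-1")   # 1.5 BTC immobilised, 1.5 side-BTC created
peg.peg_out(0.5, "side-addr-1")  # 0.5 side-BTC destroyed, 0.5 BTC released
print(peg.locked_on_mainchain, peg.sidechain_balances)  # 1.0 {'side-addr-1': 1.0}
```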

A sidechain is a blockchain that runs in parallel to the main blockchain, extending its functionality through interoperable blockchain networks.
