Outlining a new Cryptocurrency Ratings Model

29.03.2018

Following up on my recent story about Weiss Ratings entering the cryptocurrency ecosystem, I want to briefly outline my work-in-progress theory of the properties we need to look for, and then map, in order to spot an undervalued cryptocurrency (not a security or utility token). I sort of danced around these points in an earlier story about what makes Bitcoin valuable, but I now think they need to be described in a more structured way.

Imagine a group of scientists pointing a strong light source straight at a prism. Light of different wavelengths is dispersed, casting colorful shards for us to marvel at. A fair ratings model should not focus on the blinding light source; it must carefully use a mental prism and analyse the results of the spectroscopy to properly understand what the light source actually consists of.

Now, rating cryptocurrencies is an evolving practice. There is no obvious answer to what the correct way of doing it will be going forward, and certain realities have changed and will continue to change. It is, however, my strong conviction that any rating must rest, at the very least, on a close look at the following four properties, or light shards if you will: Ecology, Technology, Decentralization and Valuation. If any of these properties is failing, the ledger is absolutely not a good place to store value.
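To make the four-pillar idea concrete, here is a minimal sketch of how such sub-indices might be combined into one rating. Everything here is my own illustrative assumption (the 0-10 scale, the equal weights, the "failing" cutoff of 3); the article itself prescribes no formula.

```python
# Hypothetical sketch of the four-pillar model. The scale, weights
# and failure cutoff are illustrative assumptions, not from the model.

PILLARS = ("ecology", "technology", "decentralization", "valuation")

def overall_rating(scores: dict[str, float]) -> float:
    """Combine four sub-index scores (each 0-10) into one rating.

    If any pillar is failing (below the cutoff), the ledger is not a
    good place to store value, so the overall rating collapses to the
    weakest pillar instead of being averaged away.
    """
    FAILING = 3.0  # assumed cutoff for a "failing" pillar
    worst = min(scores[p] for p in PILLARS)
    if worst < FAILING:
        return worst
    # Equal weights as a neutral default assumption.
    return sum(scores[p] for p in PILLARS) / len(PILLARS)

print(overall_rating({"ecology": 8, "technology": 7,
                      "decentralization": 6, "valuation": 5}))  # 6.5
```

The point of the min-based collapse is exactly the sentence above: a single failing property disqualifies the ledger, no matter how strong the other three are.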

Ecology

This property or index concerns the usage and utility of the coin, and ultimately to what extent it is, or realistically can become, a protocol or standard. Recently there has been a prolonged discussion, with regard to Bitcoin, about whether actual transaction usage or more indirect store-of-value usage is what's needed for a coin to establish itself as the standard of value over the internet. In short, I believe that in order to have a store-of-value property at all, high usage is a prerequisite. Any high valuation without prolonged and/or high usage is just temporary.

More specifically, the Ecology index looks at the number of transactions per day and the amount of transaction fees paid to miners/stakers of the network. It tries to measure how active the coin community is on different social media outlets and how the project roadmap ensures high usage. It also measures how active developers and stakeholders are in communicating roadmaps, creating documentation and guides, offering software with good user experiences, and so on. The fact that some developer teams have had a history of moving on to completely new projects, leaving the old blockchain to slowly fade in importance, is also mapped.
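As a rough illustration of how raw Ecology metrics like these could be normalized into one sub-score, here is a sketch. The reference values and the logarithmic scaling are my own assumptions for the example; the article does not specify them.

```python
# Illustrative normalization of Ecology metrics into a 0-10 sub-score.
# Reference values and log scaling are assumptions, not from the model.
import math

def normalize(value: float, reference: float) -> float:
    """Map a raw metric to 0-10, where `reference` scores a full 10."""
    if value <= 0:
        return 0.0
    return min(10.0, 10.0 * math.log1p(value) / math.log1p(reference))

def ecology_score(tx_per_day: float, daily_fees_usd: float,
                  active_devs: int) -> float:
    """Average three normalized metrics into one Ecology sub-score."""
    parts = [
        normalize(tx_per_day, 300_000),   # assumed "healthy" tx/day
        normalize(daily_fees_usd, 1e6),   # assumed fee reference, USD/day
        normalize(active_devs, 100),      # assumed dev-count reference
    ]
    return sum(parts) / len(parts)
```

The log scaling reflects diminishing returns: going from 1,000 to 10,000 transactions per day says more about adoption than going from 200,000 to 300,000.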

Usage and meaningful transactions will come from users and businesses transacting goods and services over the cryptocurrency network. A diverse integration of businesses and decentralized use cases acts as a hedge against what is, in this case, idiosyncratic ecology risk; if completely independent entities utilize the network heavily and over a long period of time, it solidifies the network as a value protocol or money. Usage slowly leads to protocol legitimacy (medium of exchange, unit of account, store of value).

The index takes into account how much time and resources are spent on marketing. I am convinced that the amount of marketing is inversely correlated with the long-term success rate of a cryptocurrency. If the code is good, people, businesses and capital will come organically, as in the long run they want to use and store value on a chain that is functional and secure, not one that attracts users synthetically through ads. Re-branding falls into this category as well. Coins with a heavy focus on marketing also seem to form the least loyal communities, and marketing may attract regulatory action.

Technology

Measuring a coin's Technology properties usually demands, first, a deep look into the workings of the consensus algorithm. There are many different algorithms, some common and others isolated to a single blockchain. Many coins have forked the Bitcoin Core code base, and others have forked the forks themselves. Copying open source code to start a new blockchain is relatively easy, but isn't necessarily a bad thing. Building a code base from scratch can act as a quality stamp on a developer team if they manage to pull it off, but on the other hand, a simple copy of a battle-tested code base with some tweaks is less prone to very serious bugs. This trade-off has to be seen in the context of what the project actually tries to build and achieve. A complete copy with no code alterations whatsoever is of course a much more dubious practice, as one is left to wonder what the purpose of the new chain was in the first place.

The maximum number of transactions per second has an impact on the index, as does which scaling solutions are being worked on. Transaction throughput is important now, as the most popular blockchains start to hit their limits. It is also important not to be blinded by a high headline number from some new, self-proclaimed revolutionary solution. More often than not, projects can present higher-than-usual throughput because they centralize certain aspects of the network, for example by restricting the number of validating nodes with a master node system or other kinds of node thresholds. This is not to say master-node-based consensus algorithms are automatically bad; it means the higher throughput does not immediately warrant a higher score on the Technology index.

The index also tries to measure if different incentives are aligned within the project. Miners or stakers need economic incentives to validate new transactions, store the ledger and pass on data to each other. Coin holders need incentives, like low inflation, safe cryptographic algorithms, safe hot-, light- or hardware wallets, not to place their capital elsewhere.

How actively the coin is developed on GitHub matters a lot, since it sets the pace and efficiency of finding and solving bugs, developing new features, and keeping the code responsive to different needs in general. The index also looks at how many skilled cryptographers and developers the coin has working on it, and what kind of recognized work they have done earlier. Any peer review of published work is helpful when measuring the index.

Decentralization

Fundamentals of a coin like a large, active community and great technology are worthless in this space unless the whole project is decentralized. The rating of a cryptocurrency should not reflect a short time frame, because capital needs to be stored on something that is reliable long term. Centralization combined with a decade-long perspective opens up the possibility of some kind of destructive event made possible only by a central point of failure. Among the attack vectors on a centralized cryptocurrency are regulatory crackdowns, theft, censorship, corporate or state interference, unfair changes of protocol rules, a narrow ecology abandoning the chain, the team/company moving on to a new project, and so on.

The Decentralization index scores high points with a broad developer base that codes for reasons beyond a company paycheck. The absence of a founding company or a coin foundation may hurt the Ecology index slightly if there is not enough coordination to meet needs, but it strengthens the Decentralization index due to the absence of one specific point of failure. Other influencing factors are multiple software clients, low coin concentration, many small mining/staking pools, and the ease of running a full node.

Valuation

The Valuation index, being the last of the four pillars, is also slightly different in the sense that it measures value protocol contenders like Bitcoin and Ethereum differently from the rest of the (alt)coins. This may seem ambiguous, but it reflects the reality of how I believe value will be stored in the future: one or a couple of hyper-utilized and secure protocol chains (think Internet) will carry the majority of the stored value, with other small chains (think LAN) competing for the rest. The Valuation index reflects this discrepancy. For smaller altcoins, the index compares the market capitalization to similar altcoins and/or Bitcoin and Ethereum. For Bitcoin and Ethereum, the index conservatively compares the valuation to a small percentage of the global broad money supply M3, among other things.

M3 is a good proxy for the amount of liquid value that could hypothetically be stored on one or several decentralized ledgers in the future. I am aware of the 1% fallacy, but we need to be able to look at the M3 stock in a neutral way and judge the comparison on its merits. It is a fact that some small percentage of M3 consists of value denominated in currencies that suffer from chronic inflation, mismanagement by authoritarian regimes or leaders, bad technical infrastructure and so on. It is also a fact that an even larger percentage of the broad money supply is exposed to very real risks of ending up absorbing systemic failures as global debt levels keep rising. And finally, there are only so many different assets where value can be stored today, and most of them are centralized.
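The M3 comparison can be sketched as simple arithmetic. The numbers below are assumed round figures for illustration only: neither the global M3 estimate nor the percentages come from the article.

```python
# Back-of-the-envelope M3 comparison with assumed round numbers.
GLOBAL_M3_USD = 90e12   # assumed ballpark for global broad money, USD
COIN_SUPPLY   = 21e6    # e.g. Bitcoin's maximum coin supply

def implied_price(m3_share: float, supply: float = COIN_SUPPLY) -> float:
    """Price per coin if `m3_share` of global M3 were stored on the chain."""
    return GLOBAL_M3_USD * m3_share / supply

for share in (0.001, 0.01, 0.05):
    print(f"{share:.1%} of M3 -> ${implied_price(share):,.0f} per coin")
```

The point of the exercise is not the specific output but the shape of the comparison: even a small assumed share of broad money implies a valuation far above today's, which is exactly why the index must apply it conservatively.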

On quantitative measurement

I want to end this article by talking about light obscuration. As we stand over the colorful light shards pondering the relevance of them all, it should be obvious that any interference with the light source would taint the corresponding results. The number of transactions per day on a blockchain can be propped up by Sybil attacks. So can followers on Twitter, subscribers on Reddit or YouTube, and so on. The number and pace of GitHub commits can be made to look high if each of them concerns only small, simple changes. There is also the other kind of GitHub obscuration, where the source code is closed with promises to make it public in the future.

These are just a few examples of how data can be tainted, and they show why a completely mechanical analysis is realistically unfeasible. We can't just push a bunch of public data points into our ratings algorithm and have a solid rating come out as a result. The rating process must use public data both quantitatively and descriptively. I do think of money as a value protocol or a value language. With that in mind, context, background and experience can be very important when reading the words in that favorite book of yours.