The value of reputation
Most individuals understand the value of reputation intuitively. That is why we humans strive to maintain a good reputation at all times.
But why is our reputation so valuable to us?
Let’s have a look at how Prof. Jordan Peterson, a clinical psychologist, explains this phenomenon:
We’ve certainly been searching forever for an incorruptible storehouse of value. [...] And the incorruptible storehouse of value is your reputation. [...] The more incorruptible you are as a person the better your reputation and that’s the most reliable storehouse of value you have. [...] We’re kind of enticed into believing that if we store value there’s something selfish about that. But that’s not the case at all, if it’s the case that the best place to store value is in your reputation and therefore you want to be the most ethical actor possible. [...] So, you store value in your reputation and that means you store value in the ethics of your behaviour.
The following research article provides another explanation for why reputation is so valuable:
Reputation is a piece of public information that summarizes how a person behaves towards others. Individuals often invest substantial resources to maintain a good reputation. These costs are incurred because having a good reputation is valuable: empirical and theoretical studies from evolutionary game theory and economics indicate that having a good reputation increases one’s expected payoff in future interactions with others. Therefore, reputation can incentivize cooperative behaviour, i.e. behaviour that is individually costly, but socially beneficial.
This is even more true for people, organisations and businesses whose success depends heavily on their reputation in society.
For example:
People
- actors
- talk show hosts
- internet celebrities
- musicians
- politicians
- business leaders
- athletes
- activists
Organisations
- charities
- religious organisations
- NPOs
- NGOs
- clubs
- associations
Businesses
- private companies / corporations
- public companies / corporations
When such individuals, organisations, or businesses participate in a trust-based system, they are very likely to follow the rules in order to protect their reputation, as their success depends strongly on it. And reputation, like trust, is hard to gain but easy to lose.
According to this Forbes article, reputation is the only currency that matters in business. This is why companies are willing to pay large sums in hush money: they fear that a loss of reputation would damage the company’s success.
But how can the fear of losing one’s reputation contribute to the success of a DLT system?
Every DLT system is made up of two distinct groups of people: operators and users. (In the vast majority of cases, operators are also users, while experience with existing DLT systems shows that only a small fraction of users participate in the operation of the network.)
A DLT system exists to serve its users, not its operators. Its primary purpose is to protect people against persons who wield excessive power and influence. But as previously detailed, operators are sadly unavoidable in digital money systems (as in all other financial systems). And because the welfare of the users depends on the behaviour of the operators, the operators are ultimately responsible for the system’s success or failure. Since a DLT system’s main mission is to serve its users, it is the operators’ duty to safeguard the system in such a way that it protects and benefits the users at all times. If the operators abuse their position of power, the users suffer, the mission has failed, and the raison d’être of the respective DLT system is forfeited. To safeguard users from exploitative behaviour by operators, operators must make themselves vulnerable to users in some way. This is the only way they can earn the users’ trust.
But how can operators make themselves vulnerable without making a financial commitment (as is the case with PoW and PoS)?
Operators should reveal their identity and intentions.
If an operator with a publicly known identity attempts to cheat, he has to pay with his reputation. However, as long as an operator is honest, upright, and trustworthy, he has nothing to worry about. On the contrary, he will be well rewarded, because this behaviour ensures that users can prosper, which furthers the DLT system’s objective. People, organisations, and businesses can, of course, participate in the network’s operations anonymously. However, an operator will only earn significantly if he has the trust of many users, which is only feasible if he reveals his identity and intentions, and if the users judge the operator’s behaviour to be excellent.
Thus, the operators of the system form a network of known and unknown individuals, organisations and companies.
As stated at the outset of this section, the majority of operators understand that, in the long term, their reputation is far more valuable than money, because money follows reputation. Being aware of this is a powerful incentive for system operators to play by the rules. This is precisely the basic principle that a trust-based system leverages.
The only two rules for the operators of the system are:
- Being entirely honest at all times (not cheating and also not approving or condoning any fraudulent activities by others in the system)
- Protecting the freedom and human rights of all users in the system (no oppression, exploitation, discrimination or surveillance of any user or group of users)
If an operator violates either of these two rules, he must be immediately blacklisted by all the other operators, irrespective of how many operators in the network have broken the rules. Whether a single entity, a group, or even the majority of operators violate one or both rules, a blacklisting procedure must always be initiated immediately to separate the wheat from the chaff.
This means that if an operator attempts to manipulate the system for his own benefit, he will be immediately locked out by all other operators and placed on a public list together with the reasons for his blacklisting. Of course, all of this happens automatically, without any human intervention. As a result, the attacker gains no advantage from the fraudulent activity (e.g. a fake transaction), as it is rejected by all other operators. Quite the opposite: such actions would make headlines and ruin the attacker’s reputation, discouraging other operators from attempting the same.
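The sketch below illustrates how such an automatic blacklisting step might look inside an operator node’s message-handling loop. It is only a rough outline under assumed names and structures (`Node`, `BlacklistEntry`, `check_rules` and the message format are invented for illustration), not the actual node software of any specific system.

```python
# Minimal sketch (hypothetical names): a node that automatically rejects and
# publicly blacklists any operator that violates one of the two rules.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class BlacklistEntry:
    operator_id: str  # publicly known identity of the offending operator
    reason: str       # e.g. "attempted to insert a fake transaction"
    timestamp: str


@dataclass
class Node:
    # operator_id -> BlacklistEntry; shared as a public list for all users
    blacklist: dict = field(default_factory=dict)

    def handle_message(self, sender_id: str, message: dict) -> bool:
        """Return True if the message is accepted, False if it is rejected."""
        if sender_id in self.blacklist:
            return False  # sender is already locked out; ignore entirely

        violation = self.check_rules(message)
        if violation is not None:
            # Automatic, without human intervention: lock out the operator
            # and record the reason so every user can see why.
            self.blacklist[sender_id] = BlacklistEntry(
                operator_id=sender_id,
                reason=violation,
                timestamp=datetime.now(timezone.utc).isoformat(),
            )
            return False  # the fraudulent message gains no effect

        return True  # honest message, processed normally

    def check_rules(self, message: dict) -> "str | None":
        """Placeholder for the system's real validation logic."""
        if not message.get("signature_valid", True):
            return "invalid signature (attempted fake transaction)"
        return None
```

Since every honest operator would run an equivalent check independently, a cheating operator is rejected by all of its peers at once, and the published entry makes the reason for the exclusion visible to everyone.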
Of course, this only applies if the attack was deliberately carried out by the operator. After all, the cause might just be a bug in the node software or a security vulnerability in the node that an external attacker was able to exploit. In that case, the operator can simply apologise publicly and rejoin the network, albeit with a minor penalty (e.g. a deduction of points from his trust score). If this occurs repeatedly within a certain period of time, the penalty increases each time, eventually leading to permanent exclusion from the network.
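For the non-deliberate case, the escalating penalty could be sketched roughly as follows. The window length, point deductions and exclusion threshold are invented for illustration and are not taken from any concrete specification.

```python
# Illustrative escalation of penalties for non-deliberate faults
# (e.g. a bug or an exploited vulnerability). All numbers are assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

WINDOW = timedelta(days=90)   # faults older than this are forgotten
BASE_PENALTY = 5              # points deducted for the first fault in the window
EXCLUSION_SCORE = 0           # at or below this score the operator is excluded


@dataclass
class OperatorRecord:
    trust_score: int = 100
    fault_times: list = field(default_factory=list)
    excluded: bool = False

    def register_fault(self, now: datetime = None) -> None:
        """Deduct an increasing penalty for each repeated fault within the window."""
        now = now or datetime.now(timezone.utc)
        # Keep only faults that are still inside the time window.
        self.fault_times = [t for t in self.fault_times if now - t <= WINDOW]
        self.fault_times.append(now)

        # 1st fault: 5 points, 2nd: 10, 3rd: 20, ... (doubles each time)
        penalty = BASE_PENALTY * 2 ** (len(self.fault_times) - 1)
        self.trust_score -= penalty

        if self.trust_score <= EXCLUSION_SCORE:
            self.excluded = True  # permanent exclusion from the network
```

With these assumed numbers, an isolated incident costs an operator only a few points, while repeated faults within the same 90-day window cost 5, 10, 20, 40, … points until the trust score drops to zero and the operator is permanently excluded.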
To summarise: the better an operator’s reputation, the more likely it is that this operator will act honestly and sincerely. And the greater the likelihood that an operator acts honestly and sincerely, the more trustworthy that operator is.