Early Adopters of Crypto

Attention is the scarcest thing in the world. On a macro level, the world is awash in capital; interest rates in some countries are below zero. Within our daily lives, though, there are always thousands of things competing for our attention. A question I like to think through is: where are the early adopters focusing their limited attention? Chris Dixon says it's people messing around in garages building something. A revised question along those same lines is:

Which nation/market is an early adopter of technology? How do their market dynamics predict what might happen in another geography?

First, a little theory. The world is a connected graph of people, and word of mouth is the thing that really gets people to adopt products. Facebook decreased the six degrees of separation down to around 4.5. However, the distribution of connections between people isn't even. When we think of information flow, it's more of a directed graph: person A can influence person B, but usually not vice versa.

When we think of how information spreads, I think of tinder on dry terrain. A spark doesn't catch 100% of the time, but when it does, there's the potential for a cascade of "catching fire". Within a network, there are early adopters and late adopters, differentiated by personality traits, sources of information, and how connected they are, both upstream and downstream, in terms of where they get their information. I usually split the adoption curve into three sets of people:

  • 1) people who do things because it is novel or cool. This is an intrinsic motivator. These are early adopters.
  • 2) people who do things because there’s an economic need. This is an extrinsic motivator. These are middle adopters.
  • 3) people who do things because everyone else is doing them. These are late adopters.

So now that we have that out of the way, this is my current mental model for crypto adoption.

I am increasingly looking towards Asia for technology, and more specifically towards Korea for cryptocurrencies. Because of the shape of its social graph, Korea shows interesting winner-take-all dynamics and early-adopter behavior. Information spreads quickly because of the connectedness and centrality of that graph. The whole nation uses Kakao, has high-speed internet access, has a high appetite for novelty and coolness, has very tight-knit business communities, and has historically been an early adopter of new technologies. Before the States got around to these things in Web 1.0, Korea was already on top of camera phones in the early 2000s, MMORPGs, and over-the-top streaming (think Netflix).

Bill Gurley and associates caught on to this trend and planned a trip to Korea to see what might be gleaned from this market. What resulted was a sharpening of their thesis around Social, Local, Mobile. When the iPhone hit everyone's hands in 2008, we had the confluence of the internet, GPS, and a camera in every pocket. And the rest is history: that Benchmark fund invested in a plethora of internet hits, most notably Uber and Snapchat.

The current environment in Korea is pretty telling. Around 30% of South Koreans own or hold some sort of crypto, past the tipping point for widespread social adoption. When the regulators tried to shut exchanges down, HODLers raised their voices. I'm excited to see how individuals interact with token-powered protocols as usability and scalability let us move down the marginal benefit curve of cryptocurrencies. We're still stuck in the store-of-value and speculation era of cryptocurrencies, but that should change soon.

Even now, as staking protocols begin to proliferate, crypto holders are looking to gain an edge in earning incremental tokens. We should start to see services like Vest and Compound.Finance gain adoption as the friction of using these protocols drops.

I'm personally not as bullish on developing countries as a leading indicator of early adoption. As weird as it sounds, they need cryptocurrencies too much. My mental model of early adopters is the people who like toys, the weirdos, the rich, and others willing to accept the flaws in a product. There's something about intrinsic motivation, as opposed to extrinsic motivation, that drives the stickiness and retention of a product or technology. I would much rather look towards high-risk-tolerance ICO investors than towards traditional businesses and crypto "enterprise alliances".

Research Coin v3



At this point, the thoughts contained in this post are quite old. However, I wanted to publish it because I've been tinkering with a new formulation of a protocol. It comes as an extension of thoughts by Nicola @ Protocol Labs.

Research is really expensive, a public good, and has nastier power-law returns than startups. The graph above shows revenues generated by patents, the step that comes after publicly funded research. It took 10k patents produced at Northwestern, at a yearly cost of at least $675 million, to produce one patent with licensing revenue of $1B/yr. That's a cost of roughly $67,500 per patent to get this holy grail.

Bell Labs spent over $10B in inflation-adjusted dollars on research and brought together the most incredible minds in an incredibly productive environment. The end results include the transistor, which we can all thank as the earliest baby step toward you reading this article.

Today, academics and funders complain about the misalignment of incentives for funding science. That's a discussion for another day. 

What we'll talk about today is a potential mechanism to fund basic research at scale, producing the research needed to generate these valuable patents while rewarding the scientists, the individuals who actually generate the ideas.

The core idea is recursive payments and ownership instead of just betting on getting accepted into a conference or something.

Units of the research coin system:
  • Paper 
    • Papers have owners
    • Papers have citations to other papers
    • Papers each have a token. This token is distributed to the paper's owners and to the papers it cites, via the mechanism described below.
  • Owners are types of people. An owner can be an individual contributor, an organization like MIT, or something else. (A rough data-model sketch follows this list.)
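
To make the units concrete, here is a minimal sketch of how they might be represented, assuming simple in-memory Python objects; the names and fields (Owner, Paper, token_balances) are illustrative only, not a spec.

```python
# Minimal sketch of the units above; names and fields are illustrative.
from dataclasses import dataclass, field

@dataclass
class Owner:
    name: str                   # an individual contributor, MIT, a DAO, etc.
    research_coin: float = 0.0  # balance of the global research coin

@dataclass
class Paper:
    title: str
    owners: list                # list of Owner objects holding an interest in the paper
    citations: list             # list of Paper objects this paper cites
    # each paper has its own token; balances map a holder to a share
    token_balances: dict = field(default_factory=dict)
```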

Why staking and markets instead of social-style peer review?
  • Peer review and prestigious journals are proxies for the long-term value of a paper. If we develop a market around each individual paper's value, that might be a good way to get rid of the social gatekeepers of conferences and journals that incentivize "flash in the pan" ideas.
  • There are already too many papers out there. Staking might open the door for algorithmic researchers as well as human peer reviewers.
  • It could induce more reproducibility studies, which are valuable but don't get published in the flashy journals, if people can figure out a way to capture value from them (by shorting the weak paper??? or by purchasing a share of a strong one).

Why papers and owners instead of organizations?
  • Owners can be anyone who holds an interest, perhaps an author or an organization or something else (a DAO)
  • Token could flow directly to researchers, who may do a better job of funding and finding talent than bureaucratic organizations.
  • Should owners have a token that people can purchase???
    • A market for organizations???? This may be out of scope for this, as it seems that organizations could just be wallets, and people could potentially own a share of these if they wanted to.

Why a token?
  • Tokens help align value. They establish a clear, unambiguous signal for a paper's value, while citations are social and a little bit messy (vanity citations, you-scratch-my-back-I-scratch-yours kind of thing)
  • Seems like this is a utility kind of thing. 
  • Maybe you could just use ETH to place bets and do payments, but it seems like you should definitely have some research coin for governance and staking.

Would people invest in research coin?
  • Researchers need to purchase or use research coin to review a paper.
  • People would put money into research coin because it should produce better science and more socially useful results than the current system.
  • You purchase research coin because you think it produces a better body of science.

What's the mechanism?

  1. Research coin distribution event (an ICO? Or an airdrop to researchers and other stakeholders?)
  2. Each paper, when it hits the preprint server, starts a game. Perhaps authors of papers have to stake money as well???
  3. Owners stake research coin so they can peer review this paper.
  4. The staked reviewers form a pool for the paper. They decide on the validity of the paper, and on how to initially distribute the paper's individual token among its owners and the papers it cites. This is some type of Schelling point game.
  5. Reviewers who land on the Schelling point are rewarded with some new research coin (in some proportion to how much was staked); bad reporters have their stake slashed.
  6. Once the paper's Schelling point is set, distribute the locked research token recursively to the owners of the paper's token, pro rata to the Schelling point (see the sketch after this list). The intuition is that peer reviewers want to review important papers and therefore will stake tokens to do this. More important papers accrue more staked token. More token flows recursively to the owners of the paper's token. I guess this is technically securitizing basic research IP, lol.
  7. Markets develop for each individual paper's token. A paper may, later on, yield great research results and therefore generate recursive payments of research token. As more papers get published, money flows recursively to the parent papers and their owners. The price of a paper's token, denominated in research coin, may come to reflect the market's estimate of its long-term value.
  • Recursive ownership is important because it incentivizes research with the greatest NPV in terms of research coin.
  • Researchers who publish should get steady payouts as more papers cite them, so they can continue to fund more research.
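
As a sketch of the distribution in step 6, here is a toy version of the recursive payout, assuming each paper carries a Schelling-point split such as {"owners": 0.75, "citations": 0.25}. The split values, the equal division among owners, and the depth cap are all made-up simplifications, not part of any spec.

```python
# Toy sketch of the recursive payout in step 6; all parameters are illustrative.
from types import SimpleNamespace as NS

def distribute(paper, amount, payouts, depth=0, max_depth=10):
    """Credit a paper's owners, then recurse into the papers it cites."""
    if amount <= 0 or depth > max_depth:
        return
    owner_pot = amount * paper.split["owners"]            # e.g. 75% to owners
    for owner in paper.owners:                            # equal split for simplicity
        payouts[owner] = payouts.get(owner, 0.0) + owner_pot / len(paper.owners)
    if paper.citations:
        cite_pot = amount * paper.split["citations"]      # e.g. 25% to cited papers
        for parent in paper.citations:
            distribute(parent, cite_pot / len(paper.citations), payouts, depth + 1, max_depth)

# Tiny example: a new paper citing one parent, with a 75/25 owner/citation split
parent = NS(split={"owners": 1.0, "citations": 0.0}, owners=["alice"], citations=[])
new = NS(split={"owners": 0.75, "citations": 0.25}, owners=["bob"], citations=[parent])
payouts = {}
distribute(new, 100.0, payouts)
print(payouts)   # -> {'bob': 75.0, 'alice': 25.0}
```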

Additional thoughts
  1. Bounties for research could also function within this space, i.e., "I am putting X research coin up for grabs if you can solve this problem and these people can verify its validity."
  2. Seems a bit complicated.



Economic Returns of Casper

I recently set up a small mining rig to mine Ethereum (my housemates have audio-visual evidence of this). However, the big thing looming over this particular foray into hardware is the switch to Casper, Ethereum's new proof-of-stake (PoS) protocol. When that switch happens, I want to start staking ETH and participating in that consensus protocol as well.

On ethresear.ch, you can find active discussion spurred on by Vitalik and Jon Choi on the potential economic outcomes of the switch and how they might drive monetary policy.

We can look at the current rate of return on PoW mining right now. While the profile of stakers vs. miners may be completely different, I wonder if the total deposit level will be adversely affected. We may have fewer deposits than has been posited, somewhere between 0.1%-0.5% of network value as the total deposit (TD) ratio, or $60-300M worth of deposits. I arrive at this conclusion by comparing the current rate of return that miners get to what is being discussed on the site. At the current return, we'd only have about $300M in staked deposits, which feels quite low as the cost to subvert a $70B chain.

The market-driven rate of return for securing relatively stable protocols (BTC, BCC, ETH) has remained relatively high compared to the ranges shown in the Google Spreadsheet that Jon shared. The range between 20% (equities) and multiples (cryptos/startups) is very large. The current PoW yield is closer to a startup's risk-reward profile than a public-market equity's, with an estimated yield of 150%; back-of-envelope math below.

Given the current hash rate, we can factor in fixed costs, variable costs (electricity), and non-recurring engineering costs such as physical space to find the current yield, excluding price appreciation. Right now, given the price of ETH, it's pretty damn profitable to mine: I arrive at an estimated yield of 150% per year. The total cost of securing the network, including the aforementioned costs, is $3-5 billion, for a security-to-network-value ratio of roughly 5%.

This checks out as well, given that the payback period for a single NVIDIA 1070 GPU is around 7.5 months.

It seems like we might see a much smaller TD ratio given the market rate of return on mining now. Given the stated target inflation rate of 0.5%, I'm afraid we might see a much lower participation rate than the modeled yield assumes. PoS with the 4-month lockup seemingly offers the same risk/reward and liquidity profile as PoW, and PoW is potentially even more liquid, since I can start mining some other token if the price of this one drops. Of course, the biggest driver of this is perhaps that returns from HODLing have been so extraordinary; after all, the price of ETH has basically gone up 100x YTD. When the returns for crypto assets start to stabilize, PoS might return to being a stable source of returns; 15%-20%, not including appreciation of the underlying assets, seems pretty good [1]. (A small script reproducing the back-of-envelope numbers follows the bullets below.)

  • Looking at the current hash rate gives ~150,000 GH/s. An NVIDIA 1070 GTX gives ~30 MH/s, so there are approximately 5,000,000 GPUs working to secure ETH. These GPUs each cost about $500. If we estimate that overhead expenses are 1.2x the per-GPU cost, we arrive at an all-in fixed and NRE cost of $3 billion.
  • If each GPU draws roughly 150 W (~1/6 kW), electricity consumption is 5,000,000 × (1/6 kW) × 24 × 365, or about 7.3 billion kWh/y (up to ~10.5 billion kWh/y with overhead). At $0.05 per kWh, that's an all-in electricity cost of roughly $365,000,000 per year.
  • Taking that into account, we have $3-5.19 billion in cost against a $30 billion network, or a TD ratio of 11-16%
    • We're paying out $3,858,570,000 per year, i.e. 12,861,900 tokens per year (~13.83% issuance) at $300 per token
    • Yield of 74 - 100%
  • At a $700/ETH price, we have $3-5.19 billion of cost against a $70 billion network, or a TD ratio of roughly 4-7%. This is with the current inflation rate of 15%. Miners currently zero out at 7.5 months, leaving 4.5 months of profit. This gives a yield of roughly 55-165%, hmmm…
    • We're paying out $9,000,000,000 per year, i.e. 12,861,900 tokens per year (~13.83% issuance) at $700 per token
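
For reference, here is a small script that reproduces the rough arithmetic behind these bullets. The inputs are the post's ballpark figures plus an ETH supply of ~93M implied by the 13.83% issuance figure; treat the outputs as order-of-magnitude estimates, since the bullets themselves use a $3-5.19B cost range and won't line up exactly.

```python
# Back-of-envelope script for the bullets above; inputs are the post's rough figures.
hashrate_mh = 150_000 * 1_000              # ~150,000 GH/s expressed in MH/s
gpus = hashrate_mh / 30                    # ~30 MH/s per NVIDIA 1070 -> ~5,000,000 GPUs

capex = gpus * 500 * 1.2                   # $500/GPU plus ~1.2x overhead -> ~$3.0B
kwh_per_year = gpus * (1 / 6) * 24 * 365   # ~1/6 kW per GPU -> ~7.3B kWh/y
electricity = kwh_per_year * 0.05          # at $0.05/kWh -> ~$365M/y
annual_cost = capex + electricity

issuance_eth = 12_861_900                  # ETH paid to miners per year
total_supply = 93_000_000                  # assumed ETH supply (implied by 13.83%)

for price in (300, 700):
    payout = issuance_eth * price
    network = total_supply * price
    print(f"ETH ${price}: payout ${payout / 1e9:.2f}B/yr, "
          f"cost/network {annual_cost / network:.1%}, "
          f"payout/cost {payout / annual_cost:.0%}")
```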

[1] https://ethresear.ch/t/casper-validator-yield-as-a-function-of-td-and-issuance/222



Crypto's Ladder of Abstraction

Like all good blog posts, this one starts with a tweet. In this case, I can point to Nicola for spurring this one. Niraj and I previously collaborated on a post called “Merging Chains”. You can think of this in the same spirit as that post.


Portability has been a great side effect of abstractions for computation, such as higher-level languages, and it will have the same effect in the decentralized world as well. In the centralized world, we have Dropbox, Google Drive, and Evernote. These all let us take our information wherever we want it, whereas before, the existing model was a thumb drive or clunky data transfers. The internet helped pave the way for user-side abstraction: when we wanted to upgrade devices, we didn't have to worry about our data. On the dev side, we've seen the evolution of serverless. Preceding that were IaaS plays, namely AWS, and preceding that you had to rent hardware and co-locate it yourself.

Right now, a lot of effort is spent building on a base-layer, Turing(ish)-complete, stack-based machine like Ethereum. While Ethereum remains a market leader right now, things might change. A 0-day exploit might appear, someone very influential in the organization might die, or a switch to PoS might actually prove to have a bad security model. Those don't necessarily reflect what I believe; rather, they're stated to show some "existential" type risks that might compromise a base-layer protocol.

In theory, a mature dapp built on top of Ethereum shouldn't derive much of its value from the security model of Ethereum; it should be able to move its contract state to another base-layer protocol. Another way to look at it is again through the lens of history and the greater abstraction previously mentioned. A user doesn't really care whether Dropbox uses its own servers or is hosted on AWS. Of course, they care if their information is lost or stolen, but that's up to the developers to worry about.

As mentioned previously, developers today don't have to deal with renting servers. Developers on Ethereum don't have to write EVM bytecode either. We've already seen people build on different platforms; Kin, for instance, is moving to Stellar rather than building on Ethereum, at least initially. I have a gut feeling that the switching costs may be lower than people think, especially since new base-layer protocols are already taking the tack of supporting the EVM, like RSKSmart. Also, the Ethereum state trie is already publicly available, which lets people do airdrops and the like, as with EtherMint.

And of course, Ethereum abstracted away the messy world of bootstrapping your own blockchain secured by miners. However, as we build upon this world of abstractions, it's easy to forget that these abstractions rest on real components: while you can write in a high-level language, your code is still executed by self-interested miners, and that leads to interesting side effects and security concerns.

Ryan Shea and co. spent time thinking about migrating state for Onename, so this isn't a thought that is completely out of the blue. Of course, we're seeing protocols such as Cosmos, Polkadot, and aelf now being presented as partial scaling solutions. Hopefully, they'll allow protocols now built only on Ethereum to work on other base-layer world computers with ease.

In the formally verified future, dapps and protocols will compile down to multiple VMs. Users and developers might not have to worry about a breakdown in the consensus mechanism of any one base-layer protocol. A "meta"-token will exist that wraps both the native ERC20 and whatever the token specification is for another base-layer protocol. Maybe these token prices will be pegged to each other, or value will accrue in proportion to the amount of state each chain actually keeps. In this way, the different base-layer protocols may just be different shards on which protocols interact. Already, some tokens are looking at building on both Ethereum and NEO.

If this vision of dapps on multiple chains does play out, competition between base-layer protocols based solely on the dapps they host may not be a long-term competitive advantage. Again, that hypothesis is premised on the belief that the switching costs of state are low, and it does look like that is becoming the case. If a protocol advertises a competitive advantage just because it's building on a certain VM, that isn't going to be a long-term advantage.

I don't really offer up much in the way of analysis here, just the observation that we're in the early days of crypto. There are so many rungs on the ladder of abstraction yet to be formalized and built. It's not immediately clear how scaling will play out, or where the points of friction, and therefore economic value, will be long term. You might say that the tokens with the largest network effect will win out, i.e. Ethereum. Yet the network effect argument is self-referential, intrinsically so: the more people use it, the better Ethereum gets, but the more people leave an ecosystem, the more unstable it gets. With flows between addresses on-chain and economic value exchanged cross-chain all visible in real time on decentralized exchanges and on blockchains, we could see, in real time, the shift in network effects from Ethereum to a hypothetical competitor. We won't have to wait for Facebook to release its latest earnings report to show that it churned some X% of users. Please talk to me if you think I'm right or wrong :)

So What's in a PhD

I remember watching Dragonball-Z, where Gohan's mom, Chi-Chi, always wanted him to get a PhD. This really hammered home the importance of the credential, the PhD, for being recognized as an expert. Since that time, though, I've become somewhat of an autodidact who learns just for the sake of it. However, I recently tweeted this:


The response was surprising. However, I stand by the statement. I first stumbled across this quote while reading "The Mathematical Experience." The "80 book benchmark" shattered the final remnants of the childhood illusion that you need a PhD, some mystical level of achievement, to become an expert. In its place now stands a new belief: becoming expert-level is not that hard. It's a concrete milestone that anyone sufficiently motivated can achieve.

I really like this 80 book expert benchmark because it has all the classic signs of a good goal. It’s measurable, achievable, but still decently ambitious, especially if you love books. Becoming an “expert” is not that hard, especially if you don’t need the credentials. And thankfully, if you work in startups or are creating something, credentials are not that important. If you really do need credentials, you can always hire someone with the right three letter acronym.

80 books, while seemingly daunting, is not that bad. An average US working citizen spends almost an hour commuting to and from work. If she decided to use that time to read instead of 'gramming or texting, she'd get through a decent number of books per year. For a book printed in a normal-sized font, a reader of this blog could probably read a page per minute, including the appropriate in-text highlights for subject-matter retention. That means you, dear reader, could probably finish an average-sized book of ~360 pages per week, or ~50 books per year. You could get a PhD in 2 years with time to spare! [1]
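
A quick sanity check on that arithmetic; the inputs (one page a minute, an hour of commute, ~250 working days, ~360 pages per book) mirror the assumptions above, and the result lands in the same ballpark as the ~50 books/year claim.

```python
# Back-of-envelope check of the commute-reading math; inputs are assumptions.
pages_per_minute = 1
commute_minutes_per_day = 60
workdays_per_year = 250
pages_per_book = 360                         # "average-sized book"

pages_per_year = pages_per_minute * commute_minutes_per_day * workdays_per_year
books_per_year = pages_per_year / pages_per_book
print(books_per_year, 80 / books_per_year)   # ~42 books/yr, ~1.9 years to 80 books
```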

The eighty-book mark is also great because it illustrates how little knowledge an individual needs to become an expert. Within startups specifically, the low barrier to becoming an expert makes investment decisions in "inexperienced" or "young" founders less risky than they might seem. I've already written on how young founders often found the biggest, baddest, and best companies. If you believe the thesis of this piece, then being young is less of a disadvantage because it's so easy to get up to speed in an industry.

Expert-level specialization is very real and necessary. Even a small-town library will usually have at least a few thousand books waiting to be checked out. If you only know 80 books' worth of knowledge, it's hard to imagine how you'd be able to build a multi-faceted business, and with knowledge expanding at an exponential rate, it seems even more daunting. This is one of the reasons why being an expert or getting things done in the world still requires you to collaborate with others and/or use tools to manage knowledge.

Of course, the 80 book goal doesn't cover all the nuances of being an expert. On Twitter, others brought up several counterpoints. First, books aren't always the best source of knowledge. I think this is certainly true. To the original goal, I would then add the caveat that you need to read 16,000 pages, or 80 books' worth of material at 200 pages per book. This is especially true in fast-growing fields such as blockchain or AI, where preprints, blogs, and Twitter are where the action is. Where you choose to get the 16k pages certainly makes a difference in what you learn. The best practitioners are often the ones that aren't teaching the subject. Their knowledge is either much more implicit, or codified in a much more free-flowing form factor such as a blog post. Take, for example, some of Vitalik's writings on cryptocurrencies. If you're getting into crypto, his posts will serve you much better than any book proclaiming that blockchains are the second coming of the internet.

Another common retort to "80 books" was that being an expert is mostly about creation. However, I'd say people still need some base level of knowledge to be productive in a field, and as we've established, 80 book-length pieces of information, or 16,000 pages, or two years of learning seems about right to me. You're probably familiar with the "Whartonite Seeks Code Monkey" or "I can handle the business side" meme pages. In short, they both poke fun at B-school students who don't really understand the mechanics of product or startups. When I first read TechCrunch and watched The Social Network, I 100% asked a technical friend of mine the same questions. I didn't have the requisite mental models of what a "startup" was to know why this was a bit silly of a request. Yet after reading blogs, working on products, and talking to folks to get the implicit domain knowledge, I now do. More generally, understanding a domain lets you know what's in the "adjacent possible", the stuff that's hard enough that no one's done it yet but not impossible. In physics this would be the difference between working on gravitational waves and working on time travel.

I look forward to getting my PhDs in bio, brains, and blockchains soon :)

---
[1] The speed at which a person reads will definitely depend on the subject matter. While reading Molecular Biology of the Cell, I read at approximately 15 pages per hour while taking detailed notes. At 1000+ pages, MBOC would take me ~70 hours to read cover to cover. A normal college-level bio class probably covers half the material in the book, so I could cover a semester in ~40 hours, or a normal work week. The caveat is that this assumes I can read 8 hours per day. Of course, I don't, but a sufficiently motivated individual who finds the subject matter interesting could. Warren does it.

Some Thoughts on "Confessions of a Sociopath"

While browsing bookstores in NYC, I stumbled across a striking cover. A porcelain mask. Female. Red lipstick, with an attached popsicle-stick handle. My eyes wandered down to the title in the bottom left-hand corner: "Confessions of a Sociopath". Intrigued yet hesitant, as I don't normally read pop psych, I picked the book up. I put it down twenty pages later. I didn't purchase it. It was a little bit too spooky for me. When you're left with a new lens through which to view your friends, colleagues, and possibly yourself, you'd feel the same way.

I ended up purchasing it at another bookstore later in the same day.

M. E. Thomas, a pseudonym, writes in an extremely readable, transparent style. The compact volume of three hundred or so pages reads a bit like a diary, which is exactly what a sociopath would want: we want to feel like we know the other person. Yet, true to her sociopathic nature, the prose is lightweight, easy to read, and a bit detached. Just what we'd want in a fling: to be drawn in, to imprint our own desires onto, and to be left wanting to know more. An early moment we experience is mom and dad driving away and forgetting us at the park. A moment that "normiopaths" or "empaths" would regard with fear, tears, or some other visceral reaction, M. E. takes as a chance to prove that she can live without them. M. E. reveals nothing, and with this style, she draws us into her inner world.

We follow M. E. as she navigates growing up in a somewhat dysfunctional household and matures into a beautiful, intriguing, and cold young woman. Some of her experiences as a child, I think readers may be able to relate to, especially anyone who has been an outsider or immigrant to a new community. When you come in as an outsider, there are cultural norms, language cues, body-language differences, and inside jokes picked up innately by some and intentionally learned by the outsiders. The difference here is that, for M. E., the language to be learned is that of emotion, something we might take for granted. The only strong desire she expresses is for power, for control over her environment and all the people around her.

We discover how she manipulates the people around her, often without their knowing; we learn, as she does, that emotions play no part in her mental world and that rules that don't advantage her can be broken. We're often reminded of rebels, criminals, and vampires, the darker archetypes of our mythology, characters we are enthralled with, at least in the sense that they have freedom from internal and societal retribution. By continually drawing on examples from literature, particularly from Steinbeck, she reminds us of favorite characters and perhaps of people in our own lives who fit this sociopathic mold. Not only does M. E. draw from these sketches, but she also draws from brain imaging and clinical research, as well as clinical definitions from psychiatry. This gives this extremely transparent, personal narrative a touch of scientific authority without being too drawn out.

The worlds of work and love figure heavily in this book. Sociopaths, as we learn, turn out to be tailor-made for corporate capitalism. Money, that impartial thing so much of daily life is centered around, is a sociopathic object: it can be transformed into whatever desire we may hold. In jobs that involve stress, acting, or even normal office politics, sociopaths are able to lie and win their way into higher and higher positions. They're better able to deal with the stress of firing someone or launching a new product than we are. However, we find through personal anecdotes that a cutthroat character isn't always as good as it seems; the same impulsive behavior becomes a liability in building the long-term relationships needed for management positions. I often thought of Steve Jobs as a possible archetypal sociopathic CEO, driven by a great product, through a path of scattered emotional breakdowns.

The book later turns to the subject of love, and as noted before, it's more than tough for her to maintain a long-term relationship, because her default position is to be whatever her lover wants her to be. But as we know, vulnerability, that is, being your true self, or at least acting and speaking as if you have nothing to hide, is the key to long-term relationships.

When we look around the office, or our college campus, or even into our loved ones' heads, we often wonder what is going on behind their eyes. In a certain way, while reading, I was reminded of the Turing Test: how can you tell whether this thing producing some output is intelligent and/or conscious? To extend the metaphor, this is a Turing Test for emotions: "Do I actually know what this person is feeling at this moment?" It's a bit frustrating. There will always be a lack of understanding when dealing with other people, simply because we haven't lived the exact same experiences they have.

How do we know our lover's smile is genuine? What if, like a chameleon, our lover is producing this contortion of facial muscles to provoke the response they're after? Their ends may not be just to please us; they may be planning, plotting three steps ahead, using the goodwill generated by that smile to cajole us into changing the channel to whatever they want.

At the end of the book, we're left with M. E. as she goes about her life without a care in the world, without attachment, yet desiring a real connection and wanting kids. And we're struck by the normalcy of it all. These are desires that all of us feel; our mental worlds have just happened to mold our perceptions into slightly different arrangements. Our biological drives, along with our upbringings, can really make a difference in our lives.

If you read "Confessions of a Sociopath", you will wear the sociopath's mask. Some of you may find that it fits your face perfectly, and you may gain answers to some pesky questions you've always wondered about yourself. If not, you may be disgusted and put off, but you will certainly wonder more about the man on the train with a certain glint in his eye. What is he thinking? How does he feel, if anything?

Biocomputers

A mostly speculative post on the far-ish future of biology.

This essay's a spiritual successor to my previous post on the subject. If you're an investor, feel free to invest with that essay's thesis in mind :) . I'd like to take a few steps forward into the future and try to reason backwards to where we are now. I began the other essay with a comparison to the mainframe era, and I'd still like to draw on the computing metaphor.

Most people identify Intel and the microprocessor as a key innovation in the computing revolution. The same could be said about the Apple II, which finally incorporated the microprocessor into a consumer-ready, integrated product. I won't argue for or against either as marking a new age. Either way, those technologies were unequivocally tied together; they bookended the period where the microprocessor led the way to general-purpose computing for everyone.

The integrated circuit was the culmination of billions of dollars in R&D, and today the heir to that technology is the iPhone 8, which holds transistors that would have cost some $150 trillion at 1957 prices. These devices let you do essentially anything and are the cornerstones of global communications and global money. A person could live their life with just a phone.

I wonder what set of innovations might allow for the equivalent exponential jump in biology, the microprocessor for biology. What’s the equivalent of a general purpose computing device in biology, and why would we even want one? 

First, let’s look at the definition of the microprocessor according to Wikipedia.

"The microprocessor is a multipurpose, clock driven, register based, digital-integrated circuit which accepts binary data as input, processes it according to instructions stored in its memory, and provides results as output.”

If we swap out binary data for DNA, that sounds a lot like what a nucleus does. The speed and accuracy with which we can create new strands of DNA is limited right now. Biology is, of course, general purpose: the same DNA that codes for humans can be used to code for algae. However, most DNA is assembled for a specific purpose. The software, the ACGTs of DNA, is still way too expensive to sequence. Additionally, de novo gene synthesis and assembly, making long DNA strands from scratch, is even more expensive. While we herald a $1,000 human genome sequence, and soon a $100 one, it really needs to be close to zero. And while a single base pair costs $0.02 to synthesize, that also needs to be close to zero.

Why do I think $0.02 is way too high? Well, think about it this way: if every line of code cost $0.02, we would not have operating systems or any of the wonderful things we depend on today. To get to truly ubiquitous DNA manipulation, the cost has to be effectively zero, like manipulating electrons in a personal computer.
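
To make the scale concrete, here is the comparison in numbers. The genome size is the standard ~3.2 billion base pairs; the ~20 million lines for the Linux kernel is a ballpark assumption used only for the lines-of-code analogy.

```python
# Rough scale comparison for the $0.02-per-unit analogy above.
cost_per_bp = 0.02
human_genome_bp = 3_200_000_000        # ~3.2 billion base pairs
print(f"One human genome at $0.02/bp: ${cost_per_bp * human_genome_bp:,.0f}")
# -> about $64,000,000

linux_kernel_lines = 20_000_000        # ballpark line count, an assumption
print(f"'Writing' the Linux kernel at $0.02/line: ${0.02 * linux_kernel_lines:,.0f}")
# -> about $400,000
```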

In short, a biological microprocessor, a bioprocessor for short, would be able to manipulate DNA and spit out the results, the biological and chemical components of whatever we wanted, at near-zero cost. An integrated biocomputer would take the inputs (single cells, small molecules, blood drawn from individuals, other enzymes) and return new cells with the right genes inserted. Attached to the main bioprocessor would be other modules such as microscopes, perturbation devices, electroporation devices, incubators, bioprinters, and fluid and solid handling devices (think needles and other things), as well as connections to traditional chips.

Fundamentally, having a digital bioprocessor, or some personal-computer equivalent, could lower the cost of creation by several orders of magnitude. The tabletop sets we have today for home biology are the equivalent of ham radio sets, so it will be some time before we have anything really cool. Still, biology has the same property of being an information science. As in the pre-personal-computer, pre-internet era, we have to go to separate sources to gather all of our biological material. We travel to the grocery store, we go to the mall to buy creams synthesized by snails, we get surgery and pay money to look different, we go to the pet store to get pets; even the clothes on our backs are made from organic materials. If we could download creams, seeds for food to be grown, and drug treatments, we could enable biological creativity like we have in bits.

One use case that bioprocessors could dramatically influence is human drug and medical treatment. The Martin Shkreli and EpiPen pricing snafus could be avoided by at-home production of molecules and treatments. But if the marginal cost of a treatment is zero, then how are drug development costs to be amortized? Creating a blockbuster drug today costs billions, so what happens when individuals are able to "download" medicine for free? Of course, this is a moral dilemma. Gene therapy treatments for orphan diseases cost consumers $500,000 for one treatment. That seems a bit outrageous.

Business Models for Biology

Bioprocessors should have two first-order effects on biology: decreasing the cost of production and the cost of distribution. We just have to look at software. As we've seen with the internet, a radical shift in the cost of distribution has reshaped industries and will continue to do so; 10-1000x cost reductions lead to startups disrupting industries. With the internet, everything either became free, had a SaaS/API model attached, or birthed a marketplace. Each download or use will cost some amount, like hitting an API endpoint.
  • Music -> piracy (zero cost distribution) + lower production cost = free initially, but now SaaS model, litigious for sure.
  • Movies -> high production costs, lower discovery/distribution cost =  SaaS model (Netflix)
  • Banking -> high production/integration cost = Now have an API for this. We have Stripe.
  • Housing -> high production cost, high discovery cost = Marketplace Model (Airbnb)
The same will happen with biology. The effect on food will be different from the effect on pharma, and that's related to the market dynamics of production, distribution, and reputation. All these elements add to transaction costs, and as we know, transaction costs govern where fat businesses are made: sit on top of a fat pipe of transaction costs and win money for a long time. A worry people have is drug piracy. If the cost of downloading a drug effectively drops to zero, then what happens to the dollars that need to go into research?

There are a few effects a bioprocessor and its associated peripheral devices could have on drug development. The cost of research should be way lower, allowing more drugs to come onto the market. However, determining efficacy will still be hard, so brands or marketplaces should establish themselves.

However, free in biology isn't necessarily bad. People don't always need to be motivated by (direct) monetary ends to contribute: the Debian ecosystem has had ~$20 billion of work put into free software. And this isn't just random stuff; it runs on almost every internet-connected server, and we depend on it for critical infrastructure. We could potentially have freely designed, pest-resistant seeds that farmers could use instead of ones controlled by huge companies.

We might have a SaaS-like business model for individuals to purchase treatments (Illumina and the gene therapy market??? -> have the right idea). However, we'll have to deal with data security: medical records are worth 20x your credit card information on the black market, and there is no way I would want my health information hacked. A more fun SaaS business might be a custom-designed hair product and colorizer. First, input a strand of your hair, enter the desired hairstyle and texture, and out comes a specially designed set of creams that actually changes biological hair growth from the follicles. If we can actually change the follicles, then we can change the color and texture of our hair at will for longer, cheaper, and more safely than we do now.

If we go to space, we'll certainly need and want different biological tools. Space radiation can kill, just as scurvy killed sailors, and its effects could potentially be curtailed by 4 SNPs that could be free. A digital biocomputer would be a necessary tool: we're not going to have a lot of room on those spaceships, and we're going to need to bring a lot of things. The best way to compress things is as information.

All of these are possible arrangements for how the bioprocessor changes the production and distribution of organic materials. But we're sadly still a ways away.

Today: Complexity

Computer scientists severely underestimate the complexity of even single cells. These things are really, really complex to model and build, especially if you want to get to atomic-scale precision. Atomic-scale precision is often what you'll need; after all, polymerase is atomically precise. It manipulates individual atoms into place, and we can thank evolution for that: we only get a handful of mutations per few billion base pairs. To do that level of simulation for one cell, we need to assume Moore's Law continues 50 years into the future (so we'll basically need quantum computers to continue that trend). For a whole-brain simulation, we'd need 100 years. Another example of complexity is protein structure.

We'll either need to reduce the modeling accuracy of our systems (as we've done with deep learning) or use biological techniques in addition to computational models. We can use bioprocessors as a way of studying cells, directing their evolution, and creating anything we want. On our way to a glorious, biologically infused future, we have many roadblocks to creating the components for a bioprocessor and/or personal biocomputer.

A future post will speculate in detail on 1) what a bioprocessor actually looks like 2) who’s working on this stuff now and 3) what else is holding us back.

Merging Chains

Written by Niraj and Dillon

If you posit that bitcoin has a network effect, then the more people use the currency to transact, the more valuable the coin becomes; the more valuable the coin becomes, the more users you get and the stronger the network effect. Additionally, if longer chain history means better security and more miners mean better security, then in the long run, is there a way to increase the network effect by merging chains?

Right now, we've only got people doing forks. Forks are important: they allow for experimentation on rule sets. However, they potentially reduce the overall network effect of any single token. Forks are really good because they align incentives with the people who have already done work on the master chain. In the example of ETH and ETC, we argue that it's a feature that the Ethereum Foundation automatically held both ETH and ETC without anyone's permission; they could gain from the economic value created by another development team. The new development team wins because they get an input customer set, the set of pub-priv key pairs that already holds ETH. This is a subtle shift in incentives; we'll write more about this later...

While we're not advocating for a maximalist approach, the idea that there should only ever be one token, it seems like there needs to be a process for merging chains just as there is a process for forking. There is an argument to be made that "Core" teams, or the foundations bearing the base token's name, centralize development resources: in BTC and ETH respectively, only 5 and 2 developers make up the majority of commits. Forking seems to have become a way for talented devs to work on protocols. Just look at LTC and @satoshilite.

Additionally, we see that experimentation has been a net positive for society in other areas. Allowing for experimentation and merging isn't limited to blockchains. Just to name a few:
  • Policy experimentation within a federal system of government. I.e. adoption of a precursor to the Affordable Care Act before it became national law.
  • Startups as new entrants that can be acquired or grow to be large companies.
  • Spin offs from large corporations. Standard Oil became several smaller companies and Rockefeller was richer for it.
  • Mitochondria being swallowed to become the powerhouse of the cell.

In blockchain terms, you could conceive of merge mining as extended uncle resolution. In the GHOST protocol, an uncle's hash power is added to the winning block's score, and the uncle miner is still incentivized: they get some proportion of the block reward. Likewise, people who contributed to the "losing token" are still incentivized. When you merge chains, you're still incentivizing the smaller chain's absorption into the larger chain. While protocols can directly implement the necessary hard/soft forks to include a fork's rule-set changes, they won't have the now-differentiated user base, etc.

How to Do Merges

There are two methods for potentially doing a merge for tokens (and probably more that we haven't thought of).

The first method is pegging token A to token B.
  1. Agree on a price/exchange rate for A:B
    1. Oracle to determine price
    2. Hash power signaling/ratio
    3. Market pricing on exchanges
  2. Hard fork both protocols to have the same block + rule set
    1. Enforce a specific block height for the rule change, include the pegged price ratio
    2. Price converges
  3. Before the rule set is implemented, people are free to trade out of token B
  4. Allow for atomic cross chain swaps
    1. Using Decred or 0x → hard code this into the rule set change
The second method involves one chain "absorbing" the value of the other, meaning that token A remains and token B is never used again.
  1. Agree on a price/exchange ratio for A:B
    1. Oracle to determine price
    2. Hash power signaling/ratio
    3. Market pricing on exchanges
  2. Acquire buy out funds for A to purchase B
  3. Post a public address where all B tokens can be sent to
  4. Before the rule set is implemented, people are free to trade out of token B
  5. Burn the B tokens; each B-token holder will get the agreed-upon amount of token A in proportion to how much they sent to the specified address (a toy sketch of this conversion follows the list)
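
As the forward-referenced toy sketch of the absorption method: assume an agreed exchange ratio, a buy-out pool of token A, and a simple ledger of who sent B tokens to the burn address. Everything here (names, numbers, the pro-rata cap) is illustrative, not a protocol spec.

```python
# Toy sketch of the "absorption" settlement described above; all values illustrative.
def settle_absorption(burn_ledger, exchange_ratio_a_per_b, buyout_pool_a):
    """Burn B tokens and credit each sender with token A at the agreed ratio."""
    total_b_sent = sum(burn_ledger.values())
    if total_b_sent == 0:
        return {}
    payouts = {}
    for holder, b_sent in burn_ledger.items():
        claim = b_sent * exchange_ratio_a_per_b           # A owed at the agreed ratio
        # cap total payouts at the buy-out pool, pro rata to contribution
        payouts[holder] = min(claim, buyout_pool_a * b_sent / total_b_sent)
    return payouts

# Example: 1 B = 0.25 A, with a 1,000,000 A buy-out pool
ledger = {"alice": 4_000, "bob": 6_000}
print(settle_absorption(ledger, 0.25, 1_000_000))
# -> {'alice': 1000.0, 'bob': 1500.0}
```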

Roadblocks to putting this in practice.

Both of these scenarios involve a lot of coordination. Imagining trying to do a protocol merge without some kind of explicit voting mechanism other than hash-power signaling induces a headache right away. The future of decentralized governance will definitely play a large part in how these things happen.

Also, as we see in centralized mergers and acquisitions, the larger company often has to purchase the shares of the smaller company at a price premium. We'll have to establish a better pricing mechanism beyond hash-power signaling. Ari Paul and Chris Burniske have been doing a lot of great work on fundamental valuations for this.
Additionally, atomic cross-chain swaps are not the only potential way to transfer a token from one chain to another; a protocol such as Polkadot or Cosmos might allow for this sort of thing as well.

Real World Protocols That Could Benefit

These wouldn't just have to be currency tokens; you could potentially merge utility tokens as well. For example, look at Sia and Filecoin. If Filecoin were to establish a dominant market cap and share position, it might behoove them to purchase the Sia network. An additional step would need to be taken: before they can acquire any of token A, individuals would need to transfer their files over to the new blockchain. Once this is performed, they can claim their Filecoin tokens.
  • Small cap token mergers
  • Prediction markets (Augur and Gnosis)
  • File storage markets (Filecoin, Sia, and Storj)
  • BTC variant mergers (BTC, LTC, BCC)

----


Cryptobanking: Brain Dump

I was recently having a conversation with Niraj about the upcoming Raiden release, and we got into a bit of a debate about the upcoming layer 2 protocols for off-chain transactions. The conversation centered around the risks of centralization within these networks, since the resulting graphs are distributed but not necessarily decentralized. We also noticed that there seems to be growing competition for crypto assets. With this token or that token to purchase, and with the influx of talent into the space, a future seems guaranteed where many different things will be competing for your crypto dollars.

Competition for Crypto Dollars
  • Storage -> Filecoin, Sia, Storj
  • Computation -> Golem, Truebit, and other things?? vs something else
  • Bandwidth -> Source, or some other wifi coin.
  • Economic value (staking/tokens) -> Ethereum, you transform it to other tokens, lock it in RanDAO, or stake a Livepeer node.
  • Purchasing new tokens -> NEO, DASH etc.
  • Crypto Hedge Funds -> Prism
  • Layer 2 Protocols -> Lightning Network, Raiden, Plasma, Polkadot.

Since crypto assets are extremely liquid and can be instantaneously changed into some other digital asset, it's tempting to do so. Unless you're a day trader in the top 1%, you'll probably lose money. With all the complexity in dealing with crypto assets, a person's best bet is usually to hold, or rather HODL. In this case, individuals are hoping that the base crypto asset that they purchase appreciates in the future. HODLing is basically stashing your coins under your mattress, which many of my friends have expressed as their dominant investment strategy.

However, if we compare this to a traditional asset like cash, which can earn interest by sitting in a bank, stashing your cryptos under a mattress doesn't seem too enticing. No one's really figured out a way to earn interest by HODLing. Of course, we're not the only ones to have this thought.

Lending Right Now

If your intention is to earn interest on crypto, right now you can lend tokens on Poloniex. When the markets are volatile, you can earn up to 0.1% per day for essentially tapping a button. However, there are two downsides to this. First, you don't earn that much money because of the fragmentation of order books: fragmented order books drive down liquidity and potentially the demand for your lent crypto assets, which drives down any interest you could earn. Second, lending (and investing) on centralized exchanges carries high counterparty risk. A centralized exchange could be hacked and your funds stolen, the exchange could shut down, leaving your assets locked up on the platform, or the exchange could invest your crypto and lose it.

Instead of holding my tokens on a centralized exchange, I'd be interested in depositing my ETH, or whatever token, in a smart contract that would allow me to earn an interest rate denominated in whatever token I'd deposited. All depositors pool their funds together, which can then be lent to another party through a loan administered by a separate contract. Of course, this is exactly what a traditional bank does: lots of people deposit their money, banks lend it out at interest and split some of the proceeds with depositors.
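
Here is a toy, off-chain model of that pooled-deposit idea, not a smart contract and not any particular protocol: depositors pool one token, the pool lends it out, and interest is credited pro rata. The names and the 1% rate are made up.

```python
# Toy, off-chain model of a pooled lending contract; names and rates are made up.
class TokenPool:
    def __init__(self):
        self.deposits = {}                # depositor -> amount of the base token

    def deposit(self, who, amount):
        self.deposits[who] = self.deposits.get(who, 0.0) + amount

    def total(self):
        return sum(self.deposits.values())

    def accrue_interest(self, rate):
        """Credit interest earned by lending out the pool, pro rata to deposits."""
        total = self.total()
        interest = total * rate
        for who, amount in list(self.deposits.items()):
            self.deposits[who] = amount + interest * (amount / total)
        return interest

# Example: two depositors, and the pool earns 1% on a loan it made
pool = TokenPool()
pool.deposit("alice", 70.0)               # e.g. 70 ETH
pool.deposit("bob", 30.0)
pool.accrue_interest(0.01)
print(pool.deposits)                      # -> roughly {'alice': 70.7, 'bob': 30.3}
```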

In crypto, there's already a movement to do this. Mining is effectively an operation that allows you to convert BTC into more BTC, and miners often finance the purchase of their equipment. By pooling your hardware in a mining pool, you can turn this from a lottery into something more like a fixed-interest annuity on an emerging commodity. Decentralized mining pools potentially allow a decentralized, smart-contract implementation of these things. Already we see mining pools with algorithms that optimize which alts to mine at any given time. This carries the additional risk/annoyance of having to exchange the base token you'd prefer to HODL into another, potentially more volatile currency to generate a return. And since we're going to keep seeing a proliferation of protocols, managing this will get unwieldy.

Even more related to our cryptobank concept is a decentralized mining pool for staked tokens. You don't have to convert your crypto into dollars to purchase hardware to earn crypto; instead, you give up the time value of your crypto assets for more crypto in the future, and that's okay if we're HODLing. Rocket Pool and 1Protocol are both staking pools, and both implement a token on top. In Rocket Pool, every ETH deposited generates a token redeemable one-to-one for ETH. This harkens back to the private currencies and bearer bonds of years past. However, cryptobanks could be used for more than just staking protocols and letting individuals margin trade.

Furthermore, lending to purchase other tokens is only a slice of activities someone might undertake with a lent crypto asset. Crypto banks could drive liquidity by lending into all these different protocols.
  • Staked mining on Ethereum
    • Things that let you earn money -> RanDAO, Swarm
  • Staked level 2 (Raiden or other things)
  • Delegated staking on protocols that share this design
    • Livepeer.
  • Exchanges to provide liquidity in general, and for margin trading.
  • Operating a 0x node
  • Augur/prediction markets
  • Oracles
  • Numeraire -> (if I really believe my model is superior), then I should go out and purchase token
    • But what if I don’t have assets??? 
  • NEO -> interest for holding token
  • Node operators -> Lightning Network/Truebit/Coinjoin-as-a-service
  • Polkadot/other scalability tokens

History Applied to Crypto

Fractional-reserve banking serves two purposes. One role is operational: banks make transferring money easy. Instead of moving my gold bar to Timmy, I can ask the bank to change a ledger entry. The other role is economic: providing investment to individuals. If we look at the history of banking, we went from warehousing money and not lending it, to lending it out for a fee while depositors held paper slips that let them transfer money to one another.

The operational role of banks is greatly lessened in a crypto economy, as individuals can transfer tokens directly to one another using the built-in asymmetric-key cryptography. However, individuals may still want to earn interest as they hold, and this is where a cryptobank might come in.

In the far, far future, when on-chain transactions are too expensive (hopefully this never happens), cryptobank contracts/DAOs may enable scaling of transactions. They could operate as layer 2 hubs, with connections to other cryptobanks and then finally to individuals through payment channels. Although we should continue to work on scaling the base protocols, banks could potentially keep the operational role of transferring money (the traditional role of banks). We'll leave the discussion of whether this is a good thing, given the centralization it implies, for a separate post.

A common bitcoin debate is whether it needs to be used as a medium of exchange as well as a store of value. The one-sentence argument for bitcoin needing to be a medium of exchange is that if everyone holds bitcoin and never spends it, there will be no transactions, miners won't be incentivized to secure the network, and it'll fall apart. With cryptobanking, however, bitcoin HODLers who are in it for the long haul are able to lend their bitcoin to others, driving transactions, paying fees to miners, and keeping the bitcoin-as-a-medium-of-exchange dream alive.

And that’s about it! Thanks for reading.

----


Some Work to be Done

Here are a few things that we don't know. If you are working on any of them, I'd be curious to learn more.

  • What consciousness is and feels like to other entities that are not ourselves
  • How the components of our brains work together to remember, generalize, learn, and act
  • How to safely augment our mental capabilities
  • How to create machines that can remember, generalize, learn, and act
    • How we can do this safely
  • How to create robots that can generalize and can act in a home
  • How to best help those suffering from mental illnesses
  • How to safely create powerful electronics that can remain inside the body indefinitely
  • How to reverse dementia, CTE, Alzheimer's, and more
  • How to reverse cancer
  • How to reverse heart disease
  • Why we keep getting fat and how to stop it
  • The best way to reverse diabetes (and other metabolic disorders)
  • The root causes of aging
    • How to measure biological age
  • How to model the large-scale systems of biology
  • How to cheaply mass produce DNA
  • We don’t know how to rearrange biological components to do useful things in a safe manner
    • We don't know what individual biological components do at various times
  • A cheap, safe, precise way of delivering drugs or other biologics
  • A safer, cheaper way of developing treatments
  • The best way of keeping good monopolies (those that create lots of consumer surplus early in their lifetime) from turning into rent-seeking organizations later in their lifetime
  • How to accurately model large-scale social systems 
  • We don't know the best way to organize and make decisions at a scale of 7+ billion people
  • The best way of solving the tragedy of the commons
    • Climate change
  • How to organize large-scale groups of people, capital, and knowledge in systems to create value
  • How to correctly allocate resources to those doing research
  • The best method of learning, an inefficient, difficult, and sometimes unenjoyable process even for the smart and self-motivated
  • The best way to motivate people to stay happy, work on interesting problems, and to contribute to society
  • How to scale-up self-sustaining fusion reactions
  • How to scale up generalized quantum computation
  • What lies beyond our universe
  • What the fundamental nature of our reality is, at the lowest levels
  • Whether the universe, at the lowest levels, is continuous or discrete
  • How to get a lot of people off the planet onto another planet safely
  • How to get minerals and other resources from off the planet onto the planet