Is Cardano the Ethereum Killer?


If you watch the video of Charles Hoskinson — founder of Cardano — in which he explains the Cardano project in a whiteboard session from 2017, it becomes clear that this is not a simple “we do everything better than XY” project. No, Cardano is in many ways a completely new approach.
Cardano describes itself as a third-generation platform and aims to significantly improve on the shortcomings identified in first-generation (Bitcoin — decentralized money) and second-generation (Ethereum — smart contracts) blockchain platforms. This ambition is not reduced to a few technical parameters; it is reflected above all in a very scientific approach.

Cardano has the advantage here of learning from the “mistakes” — especially with regard to the basic architecture — of existing, competing platforms such as Ethereum.

Charles Hoskinson, as one of the co-founders of Ethereum, certainly knows how crucial it is to consider the long-term evolution of a platform in terms of scalability, extensibility and governance when making architectural decisions. Ethereum currently shows how difficult it is to retrofit scalability, e.g. via sharding, or to change the consensus mechanism from proof-of-work to proof-of-stake.

The peer review is not limited to the Cardano developers or the community: Cardano also seeks exchange and feedback at conferences and from universities.

This is of course much more time consuming than reviews which only take place within the developer community.


A very important point, which Cardano identified as a deficit, especially with Bitcoin and Ethereum, is the lack of scalability.

Scalability is divided into three areas:
1. Transactions per second (Tx/s)
2. Network bandwidth (required to handle the transaction volume)
3. Amount of data (which data must be exchanged with or stored in the blockchain)

Bitcoin, with its approx. 14 Tx/s, would of course not be able to serve as a global payment network. The main limitation here is the proof-of-work consensus. Cardano therefore relies on its own proof-of-stake consensus, called Ouroboros.
Ouroboros divides the time in which consensus is created into epochs and, based on a random number, chooses a slot leader, which performs a function similar to a miner but without the same effort (hardware and energy). Accordingly, no time is lost finding a slot leader, and blocks can be created in very quick succession.
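The slot-leader selection described above can be pictured as a stake-weighted random draw. The following sketch is a simplified illustration only (names and numbers are invented); the real Ouroboros protocol derives its randomness from a secure multiparty protocol rather than a plain seeded generator:

```python
import random

def pick_slot_leader(stakes, slot_seed):
    """Choose a slot leader with probability proportional to stake.

    Simplified illustration -- not the actual Ouroboros algorithm.
    """
    rng = random.Random(slot_seed)
    point = rng.uniform(0, sum(stakes.values()))
    cumulative = 0.0
    for holder, stake in stakes.items():
        cumulative += stake
        if point <= cumulative:
            return holder

# hypothetical stake distribution; one epoch = a fixed number of slots
stakes = {'alice': 60.0, 'bob': 30.0, 'carol': 10.0}
leaders = [pick_slot_leader(stakes, slot) for slot in range(10)]
```

Because selection is just a cheap pseudo-random draw weighted by stake, no work (and no time) is spent competing for the right to produce a block, in contrast to proof-of-work mining.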

Slot leaders can also perform functions for sidechains.

Network bandwidth is also a key factor in terms of scalability. The aim is to reduce the bandwidth required for data dissemination and communication. For this purpose, RINA (Recursive InterNetwork Architecture) is to be used. RINA is a counter-proposal to TCP/IP. TCP/IP is a protocol stack with a layered structure based on the OSI model, where each layer fulfils a different task; by now, however, there is a multitude of protocols for each layer. RINA, on the other hand, is not based on such a layer model but on a recursive structure for the network protocol architecture. It comprises only a few protocols, but they cover everything that today's services need.

With the increasing amount of data exchanged, the size of the data to be stored grows. The goal is to reduce this amount of data. Not every participant needs all data from other transactions. When Hans and Maria exchange/transfer ADA tokens, the other users are actually only interested in the fact that the token ownership is then correctly recorded. Also in terms of privacy, a reduction of data that other participants receive about a transaction in which they are not involved is advantageous.

Techniques for reducing the amount of data:

  • Pruning (truncation, simplification of data sets)
  • Subscriptions (Subscribing to information)
  • Compression (use of compression methods)
  • Partitioning (dividing the network into shards)

Reducing this amount of data is certainly one of the biggest challenges in the area of scaling!

Additional complexity results from the development of Cardano's own cryptographic procedures. Even though Charles Hoskinson has a lot of experience in the field of cryptography, self-developed cryptographic procedures always carry considerable risk and significantly increase the development and testing effort. Satoshi, by contrast, used well-known procedures wherever he could, including in the area of cryptography.

It is very likely that more than one blockchain or cryptocurrency will prevail. It will therefore be necessary to enable exchange between the different networks without central points such as exchanges.

Cardano wants to establish interoperability not only between blockchains such as BTC, ETH and XRP, but also with legacy systems of the banking world (SWIFT etc.)!

But this is very ambitious and highly complex. Consider the regulatory requirements, for example, if money from an Ethereum ICO is to be deposited via SWIFT into an account with JP Morgan. What metadata is required, and how can it be checked? KYC, AML, etc. must be taken into account. And when exchanging values, how can one verify that they really exist on the other blockchain or network?

Cardano’s goal here is an “Internet of Blockchains”.

The goal Cardano has set itself here is another Herculean task. Moreover, it must keep up with possible changes in the other networks, so that exchange remains guaranteed even after they evolve.

The central question here is: “How does the system manage and finance itself?”

How can it be ensured that sufficient funds are available for operation and further development? Cardano does not want the development to be influenced by companies that assign developers (e.g. Blockstream — BTC) and influence the development of Cardano in the interest of their customers.

ICOs are a one-off funding instrument; even if the revenue from them is spread over the first parts of the roadmap, the funds will eventually be exhausted.

A long-term financing model must be created that feeds a “treasury”. Here, the model that Dash has created is used as a guide.

Similar to Dash, the financing of improvements and extensions (Cardano Improvement Proposals (CIP), marketing etc.), which are submitted by independent developers, for example, will be decided by voting.
To do this in a decentralized way, to provide the right incentives to participate in the voting and to distribute the funds correctly, is by no means trivial.

Which changes should be implemented, and how, without splitting the community and possibly risking a split of the network after a soft or hard fork? Cardano does not want to see a multitude of splits like Bitcoin has. A voting system is to be implemented to ensure that changes are only adopted with a high level of approval. Of course, this can also be a hindrance if too large a majority is required, but that is preferable to a network split.
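The kind of stake-weighted approval vote described above can be sketched in a few lines. This is an entirely hypothetical illustration; the article does not specify Cardano's actual voting rules, and the threshold value here is invented:

```python
def approve(votes, threshold=0.75):
    """Approve a proposal only if the 'yes' stake reaches the threshold.

    Hypothetical sketch; Cardano's real CIP voting rules may differ.
    """
    total = sum(stake for stake, _ in votes)
    yes = sum(stake for stake, choice in votes if choice == 'yes')
    return total > 0 and yes / total >= threshold

# (stake, choice) pairs; 80% of stake votes yes -> passes a 75% hurdle
votes = [(50, 'yes'), (30, 'yes'), (20, 'no')]
```

The trade-off mentioned in the text shows up directly in the threshold parameter: the higher it is set, the harder a contentious change is to pass, and the less likely a network split becomes.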

The long-term goal in terms of governance is a self-governing system without human intervention.

Cardano basically consists of three organizations, whereby the Cardano Foundation and IOHK are closely interwoven.

The Cardano Foundation itself is responsible for community support and for work on regulatory and commercial issues. IOHK (Input/Output Hong-Kong) has been commissioned to develop the Cardano platform until 2020. IOHK was founded in 2015 by Charles Hoskinson and Jeremy Wood.

The third part of the organizational setup is Emurgo. Emurgo is to promote the adoption of Cardano through cooperation with, and consulting for, companies and partners who want to implement Cardano.
The focus, however, is on IOHK's management team, which is responsible for the development of the platform. The CEO of IOHK is Cardano founder Charles Hoskinson. Hoskinson brings experience from his time as a co-founder of Ethereum, which he left in May 2014. Before that, he was involved in the concept of BitShares under the leadership of Dan Larimer. Given his strong technical background, he can certainly be seen as a suitable CTO. Leadership experience with complex projects completed on time and within budget, however, is not evident.

The second man at IOHK is co-founder & Chief Strategy Officer Jeremy Wood. He, too, has little experience qualifying him for this position; six months as Executive Assistant at Ethereum does not seem like a lot.

The Cardano community currently comprises around 490,000 members. Furthermore, there are 142,000 followers on Twitter and 77,000 members on Reddit. According to Santiment's 2019 Market Report, Cardano was the project with the most development activity in 2019 (ahead of Ethereum).

Due to the very science-oriented approach, a cooperation with the European Union was established to research use cases for blockchain and distributed ledger technology (DLT). There is also a cooperation with the state of Georgia in the fields of public administration and education.

A recent report on the cooperation between Cardano and PwC with regard to the development of a commercial strategy for Cardano caused a sensation. However, all the information on this seems to be based essentially on statements by Charles Hoskinson on this subject.

When Charles Hoskinson presented the Cardano project in the video mentioned at the beginning of this article, a time frame within 2018 had been given for completing all major building blocks of the project.

The following five versions are planned:

Byron: First version (Foundation). Focus: purchase/sale of ADA, Ouroboros consensus protocol, Daedalus desktop wallet, block explorer, testnet (in addition to mainnet), exchange interfaces & support

Shelley: Second version (Decentralization). Focus: decentralization

Goguen: Third version (Smart Contracts). Focus: smart contracts & dApps, creation of fungible and non-fungible (NFT) tokens

Basho: Fourth version (Scaling). Focus: performance improvement, sidechains (offloading work to sidechains via sharding)

Voltaire: Fifth version (Governance). Focus: autonomous system, implementation of voting & treasury, especially for Cardano Improvement Proposals (CIPs)

So far, only Byron has been implemented, i.e. the basic functions, and even its final version will not be activated until March/April 2020. In his last update to the community, Charles Hoskinson could not give a more exact time frame for the release of Shelley (decentralization); in the coming weeks he wants to name a two-month window in which Shelley will become available.

The current revised roadmap shows that the Cardano project is already one year behind schedule in terms of the basic functions (Byron). The first version of Shelley, which will bring decentralization, is currently expected for Q3 2020. The other versions (Goguen, Basho, Voltaire) are certainly not to be expected in 2020. As mentioned before, it is the Basho and Voltaire versions in particular that bring the features considered the biggest challenges. An implementation of Voltaire in a mature version is therefore not expected before 2022.

An ICO, which was concluded on January 1, 2017, has brought Cardano a total of 62.2 million USD through the sale of 57.6% of the tokens. More detailed information can be found in the ADA Distribution Audit. Information on the use of the collected funds is not included in the audit.
The current market capitalisation is USD 1.05 billion and the token price is USD 0.04. This is certainly based on the expectation of a successful course of the project, even if probably few ADA token owners expect the planned milestones for the outstanding versions to be met.

As mentioned at the beginning, Cardano is an extremely ambitious project! The motto is: “Only the best”. While the vision of Cardano is being realized, the developments of the other blockchains do not stand still, of course. In the end Cardano has to compete with the current state of Bitcoin, Ethereum & Co.
In particular, increased Bitcoin scalability through L2 technologies such as the Lightning Network, and PoS and sharding for Ethereum, have changed a lot compared to 2017. Bitcoin and Ethereum can already rely on a functioning network, a very active community and corresponding network effects. As we know from other examples (e.g. Facebook, Twitter), network effects are often more important than individual features and can hardly be caught up on. Since Cardano has so far only released basic functions with Byron, the first of five versions, and has thus not yet solved any of its major self-imposed challenges, the gap to the dominant cryptocurrencies is currently probably growing rather than shrinking.

Pros:
– Scientific and structured approach
– Focus on scalability (Tx/s, network bandwidth, data)
– Interoperability as an essential objective
– Financial security of the project

Cons:
– Far behind its own plan
– The main challenges are still open (decentralisation, smart contracts, scaling, governance)
– Charles Hoskinson as an absolutely central figure

Now I would be interested in your conclusion about Cardano — let me know in the comments and which coin we should look at next!

Your I-Unlimited Team

*Disclaimer: This article is purely an analysis of Cardano. It is not an investment recommendation. As with any investment, your capital is at risk and the return is not guaranteed. Before you decide on an investment, please read our risk statement or contact a financial advisor.


Related posts

Dependency Injection to Make Your Code Testable [A How-To Guide]


Have you ever wanted to write unit tests for your code, but you’ve found that it’s difficult to do so? Often this is the result of not writing code with testing in mind. An easy way to solve this is through utilizing test-driven development, a development process in which you write your tests before your app code.

But, even if you’re not a fan of test-driven development, you can still make your code easier to test by employing a simple technique, dependency injection, which we’ll discuss in this article.

What is Dependency Injection?

Dependency injection is a pretty straightforward yet incredibly powerful technique. In short, rather than a function having its dependencies hard-coded into it, the function instead allows the developer using the function to pass it any needed dependencies through arguments.

To help solidify the concept, let’s look at an example together.

Parsing a Cookie String

Let's say you want to write a JavaScript function that can parse individual cookie key-value pairs out of the document.cookie string.

For example, say you want to check if there is a cookie called enable_cool_feature, and if it has the right value, then you want to enable some cool feature for that user browsing your site.

Unfortunately, the document.cookie string is absolutely terrible to work with in JavaScript. It'd be nice if we could just look up a property value with a simple object lookup, but alas, we cannot.

So, we'll resort to writing our own cookie-parsing function that will provide a simple facade over some potentially complicated underlying code.

So, we’ll resort to writing our own cookie-parsing function that will provide a simple facade over some potentially complicated underlying code.

(For the record, there are several JavaScript libraries and packages out there that have done exactly this, so don’t feel the need to re-write this function yourself in your own app unless you want to.)

As a first pass, we might want to have a simple function defined like this:

function getCookie(cookieName) { /* body here */ }

This function would allow us to find a specific cookie's value by calling it like this:

getCookie('enable_cool_feature')

A Sample Solution

A Google search on “how to parse the cookie string in JavaScript” reveals many different solutions from various developers. For this article, we’ll look at the solution provided by W3Schools. It looks like this:
export function getCookie(cookieName) {
  var name = cookieName + '='
  var decodedCookie = decodeURIComponent(document.cookie)
  var ca = decodedCookie.split(';')
  for (var i = 0; i < ca.length; i++) {
    var c = ca[i]
    while (c.charAt(0) == ' ') {
      c = c.substring(1)
    }
    if (c.indexOf(name) == 0) {
      return c.substring(name.length, c.length)
    }
  }
  return ''
}

Criticism of the Sample Solution

Now, what’s wrong with this? We won’t criticize the main body of the code itself, but rather we’ll look at this one line of code:

var decodedCookie = decodeURIComponent(document.cookie)
The function getCookie has a dependency on the document object and its document.cookie property! This may not seem like a big deal at first, but it does have some drawbacks.

First, what if for whatever reason our code didn't have access to the document object? For instance, in the Node environment, document is not defined. Let's look at some sample test code to illustrate this.

Let’s use Jest as our testing framework and then write two tests:

import { getCookie } from './get-cookie-bad'

describe('getCookie - Bad', () => {
  it('can correctly parse a cookie value for an existing cookie', () => {
    document.cookie = 'key2=value2'
    expect(getCookie('key2')).toEqual('value2')
  })

  it('can correctly parse a cookie value for a nonexistent cookie', () => {
    expect(getCookie('bad_key')).toEqual('')
  })
})

Now let’s run our tests to see the output.

ReferenceError: document is not defined
Oh no! In the Node environment, the document object is not defined. Luckily, we can change our Jest config in our jest.config.js file to specify that our environment should be jsdom, and that will create a DOM for us to use in our tests.

module.exports = {
  testEnvironment: 'jsdom'
}
Now if we run our tests again, they pass. But we still have a bit of a problem. We're modifying the document.cookie string globally, which means our tests are now interdependent. This can make for some odd test cases if our tests run in different orders.

For instance, if we were to log document.cookie in our second test, it would still contain the key2=value2 cookie set by the first test. Oh no! That's not what we want. Our first test is affecting our second test. In this case, the second test still passes, but it's very possible to get into some confusing situations when you have tests that are not isolated from one another.

To solve this, we could do a bit of cleanup after our first test's assertion:

it('can correctly parse a cookie value for an existing cookie', () => {
  document.cookie = 'key2=value2'
  expect(getCookie('key2')).toEqual('value2')
  document.cookie = 'key2=; expires = Thu, 01 Jan 1970 00:00:00 GMT'
})

(Generally I'd advise you do this kind of cleanup in an afterEach method, which runs the code inside it after each test. But deleting cookies isn't as simple as just saying document.cookie = ''.)


A second problem with the W3Schools solution presents itself if you want to parse a cookie string that is not currently set in the document.cookie property. How would you even do that? In this case, you can't!

There is a Better Way

Now that we’ve explored one possible solution and two of its problems, let’s look at a better way to write this method. We’ll use dependency injection!

Our function signature will look a little different from our initial solution. This time, it will accept two arguments:

function getCookie(cookieString, cookieName) { /* body here */ }

So we can call it like this:

getCookie(<someCookieStringHere>, 'enable_cool_feature')

A sample implementation might look like this:

export function getCookie(cookieString, cookieName) {
  var name = cookieName + '='
  var decodedCookie = decodeURIComponent(cookieString)
  var ca = decodedCookie.split(';')
  for (var i = 0; i < ca.length; i++) {
    var c = ca[i]
    while (c.charAt(0) == ' ') {
      c = c.substring(1)
    }
    if (c.indexOf(name) == 0) {
      return c.substring(name.length, c.length)
    }
  }
  return ''
}
Note that the only difference between this function and the original function is that the function now accepts two arguments, and it uses the cookieString argument when decoding the cookie on line 3.

Now let’s write two tests for this function. These two tests will test the same things that our original two tests did:

import { getCookie } from './get-cookie-good'

describe('getCookie - Good', () => {
  it('can correctly parse a cookie value for an existing cookie', () => {
    const cookieString = 'key1=value1;key2=value2;key3=value3'
    const cookieName = 'key2'
    expect(getCookie(cookieString, cookieName)).toEqual('value2')
  })

  it('can correctly parse a cookie value for a nonexistent cookie', () => {
    const cookieString = 'key1=value1;key2=value2;key3=value3'
    const cookieName = 'bad_key'
    expect(getCookie(cookieString, cookieName)).toEqual('')
  })
})

Note how we can completely control the cookie string that our method uses now.

We don't have to rely on the environment, we don't run into any testing hangups, and we don't have to assume that we're always parsing a cookie directly from document.cookie.

Much better!


That’s it! Dependency injection is incredibly simple to implement, and it will greatly improve your testing experience by making your tests easy to write and your dependencies easy to mock. (Not to mention it helps decouple your code, but that’s a topic for another day.)

Thanks for reading!

(Originally published here)

Related posts

Flax: Google’s Open Source Approach To Flexibility In Machine Learning

Thinking of Machine Learning, the first frameworks that come to mind are Tensorflow and PyTorch, which are currently the state-of-the-art frameworks if you want to work with Deep Neural Networks. Technology is changing rapidly and more flexibility is needed, so Google researchers are developing a new high performance framework for the open source community: Flax.

The base for the calculations is JAX instead of NumPy, which is also a Google research project. One of the biggest advantages of JAX is the use of XLA, a special compiler for linear algebra, that enables execution on GPUs and TPUs as well.

For those who do not know, TPU (tensor processing unit) is a specific chip optimized for Machine Learning. JAX reimplements parts of NumPy to run your functions on a GPU/TPU.

Flax focuses on key points like:

  • easy-to-read code
  • preference for duplication over bad abstractions or bloated functions
  • helpful error messages (it seems they learned from the Tensorflow error messages)
  • easy extensibility of the basic implementations

Enough praises, now let’s start coding.

Because the MNIST example has become boring, I will build an image classifier for the Simpsons family; unfortunately, Maggie is missing from the dataset 🙁 .

Sample Images of the Dataset

First, we install the necessary libraries and unzip our dataset. Unfortunately, you will still need Tensorflow at this point, because Flax lacks a good data input pipeline.

pip install -q --upgrade`nvcc -V | sed -En "s/.* release ([0-9]*)\.([0-9]*),.*/cuda\1\2/p"`/jaxlib-0.1.42-`python3 -V | sed -En "s/Python ([0-9]*)\.([0-9]*).*/cp\1\2/p"`-none-linux_x86_64.whl jax
pip install -q git+[email protected]
pip install tensorflow
pip install tensorflow_datasets

Now we import the libraries. You see we have two “versions” of numpy, the normal numpy lib and the one part of the API that JAX implements. The print statement prints CPU, GPU or TPU out according to the available hardware.

from jax.lib import xla_bridge
import jax
import flax
import numpy as onp
import jax.numpy as jnp
import csv
import tensorflow as tf
import tensorflow_datasets as tfds

print(xla_bridge.get_backend().platform)

For training and evaluation we first have to create two Tensorflow datasets and convert them to numpy/jax arrays, because FLAX doesn’t take TF data types. This is currently a bit hacky, because the evaluation method doesn’t take batches.

I had to create one large batch for the eval step and create a TF feature dictionary from it, which is now parsable and can be fed to our eval step after each epoch.

def train():
    train_ds = create_dataset(tf.estimator.ModeKeys.TRAIN)
    test_ds = create_dataset(tf.estimator.ModeKeys.EVAL)
    test_ds = test_ds.prefetch(tf.data.experimental.AUTOTUNE)
    # test_ds is one giant batch
    test_ds = test_ds.batch(1000)
    # test_ds is a feature dictionary!
    test_ds = tfds.as_numpy(test_ds)
    test_ds = {'image': test_ds[0].astype(jnp.float32),
               'label': test_ds[1].astype(jnp.int32)}
    _, initial_params = CNN.init_by_shape(
        jax.random.PRNGKey(0), [((1, 160, 120, 3), jnp.float32)])
    model = flax.nn.Model(CNN, initial_params)
    optimizer = flax.optim.Momentum(
        learning_rate=0.01, beta=0.9, weight_decay=0.0005).create(model)
    for epoch in range(50):
        for batch in tfds.as_numpy(train_ds):
            optimizer = train_step(optimizer, batch)
        # optimizer.target is the current model
        metrics = eval(optimizer.target, test_ds)
        print('eval epoch: %d, loss: %.4f, accuracy: %.2f'
              % (epoch + 1, metrics['loss'], metrics['accuracy'] * 100))

The Model

The CNN class contains our convolutional neural network. If you are familiar with Tensorflow/PyTorch, you will see it's pretty straightforward. Every call of flax.nn.Conv defines a learnable kernel.

I used the MNIST example and extended it with some additional layers. At the end, we have a Dense layer with four output neurons, because we have a four-class problem.

class CNN(flax.nn.Module):
    def apply(self, x):
        x = flax.nn.Conv(x, features=128, kernel_size=(3, 3))
        x = flax.nn.relu(x)
        x = flax.nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = flax.nn.Conv(x, features=128, kernel_size=(3, 3))
        x = flax.nn.relu(x)
        x = flax.nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = flax.nn.Conv(x, features=64, kernel_size=(3, 3))
        x = flax.nn.relu(x)
        x = flax.nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = flax.nn.Conv(x, features=32, kernel_size=(3, 3))
        x = flax.nn.relu(x)
        x = flax.nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = flax.nn.Conv(x, features=16, kernel_size=(3, 3))
        x = flax.nn.relu(x)
        x = flax.nn.avg_pool(x, window_shape=(2, 2), strides=(2, 2))
        x = x.reshape((x.shape[0], -1))
        x = flax.nn.Dense(x, features=256)
        x = flax.nn.relu(x)
        x = flax.nn.Dense(x, features=64)
        x = flax.nn.relu(x)
        x = flax.nn.Dense(x, features=4)
        x = flax.nn.softmax(x)
        return x

Unlike in Tensorflow, the activation function is called explicitly, which makes it very easy to test new, self-written activation functions. Flax is based on the module abstraction, and both initializing and calling the network are done with the apply function.

Metrics in FLAX

Of course, we want to measure how good our network becomes. Therefore, we compute metrics like loss and accuracy. Our accuracy is computed with the JAX library instead of NumPy, because JAX can run on GPU/TPU.

def compute_metrics(logits, labels):
    loss = jnp.mean(cross_entropy_loss(logits, labels))
    accuracy = jnp.mean(jnp.argmax(logits, -1) == labels)
    return {'loss': loss, 'accuracy': accuracy}
To measure our loss we use the cross-entropy loss. Unlike in Tensorflow, it is calculated by hand; we do not yet have the possibility to use ready-made loss objects. As you can see, we use jax.vmap as a function decorator for our loss function. This vectorizes our code so it runs efficiently on batches.

@jax.vmap
def cross_entropy_loss(logits, label):
    return -jnp.log(logits[label])
How does the vmap magic work? vmap takes both arrays, logits and labels, and applies our cross_entropy_loss to each pair, thus allowing the parallel calculation of a batch. The cross-entropy formula for a single example is:

H(y, ŷ) = −Σᵢ yᵢ · log(ŷᵢ)

Our ground truth y is 0 or 1 for each of the four output neurons; therefore we do not need the sum in our code, because we just calculate log(ŷ) of the correct label. The mean in our loss calculation is used because we work with batches.
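The per-example calculation and the batch mean can be checked with plain Python (the probabilities and labels below are invented for illustration):

```python
import math

def cross_entropy(probs, label):
    # negative log of the probability the model assigns to the true class
    return -math.log(probs[label])

# hypothetical softmax outputs for a batch of two examples, four classes
batch_probs = [[0.7, 0.1, 0.1, 0.1],
               [0.1, 0.1, 0.1, 0.7]]
batch_labels = [0, 3]
losses = [cross_entropy(p, y) for p, y in zip(batch_probs, batch_labels)]
mean_loss = sum(losses) / len(losses)  # the batch mean, as in compute_metrics
```

Each example contributes only the log of the probability assigned to its true class, which is exactly what the one-hot ground truth reduces the sum to.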


In our train step, we again use a function decorator, @jax.jit, to speed up our function. This works very similarly to Tensorflow. Keep in mind that batch[0] is our image data and batch[1] our labels.

@jax.jit
def train_step(optimizer, batch):
    def loss_fn(model):
        logits = model(batch[0])
        loss = jnp.mean(cross_entropy_loss(logits, batch[1]))
        return loss
    # differentiate the loss with respect to the current model
    grad = jax.grad(loss_fn)(optimizer.target)
    optimizer = optimizer.apply_gradient(grad)
    return optimizer
The loss function loss_fn returns the loss for the current model, and jax.grad calculates its gradient. After the calculation, we apply the gradient, just like in Tensorflow.

The eval step is very simple and minimalistic in Flax. Please note that the complete evaluation dataset is passed to this function.

def eval(model, eval_ds):
    logits = model(eval_ds['image'])
    return compute_metrics(logits, eval_ds['label'])

After 50 epochs we reach a very high accuracy. Of course, we can continue to tweak the model and optimize the hyperparameters.

For this experiment, I used Google Colab, so if you want to test it yourself create a new environment with a GPU/TPU and import my notebook from Github. Please note that FLAX is not working under Windows at the moment.


It is important to note that FLAX is currently still in alpha and is not an official Google product.

The work so far gives hope for a fast, lightweight and highly customizable ML framework. What is completely missing so far is a data-input pipeline, so Tensorflow still has to be used.

The current set of optimizers is unfortunately limited to Adam and SGD with momentum. I especially liked the framework's very straightforward usage and its high flexibility.

My next plan is to develop some activation functions that are not yet available. A speed comparison between Tensorflow, PyTorch and FLAX would also be very interesting.


Related posts

Quantum Programming: Getting from 0 to 1 using Cirq


Let’s get Started!

So, what are Quantum computers?

Understanding quantum computers is pretty complicated and quite confusing, so I'm going to break it down in an easy-to-understand way.

We all know that regular computers use bits 0 and 1 for storing data and processing tasks so for example if I have four bits in a row I can represent a bunch of numbers.

Quantum bits, known as qubits, can be 0 or 1, but they can also be 0 and 1 at the same time.

I know! I know! It sounds very strange and confusing, because it's not easy to grasp. For example, let's imagine that a bit is sort of like a coin: it can either be heads or tails, just like 0 or 1.

Lying still, it can only be either heads or tails, right? But now imagine the coin is flipped into the air.

What is it right now? While it's in the air, is it heads or tails?

In a strange manner, it's heads and tails at the same moment. That doesn't seem to make sense until the coin lands in my palm and I look at it; only then can I see what it is. That's the idea behind a quantum bit: it can be a 0 and a 1 at the same moment.

In simple words, qubits are bits with two states, and each state has some probability, just like the coin.

You get it right!

How it is going to change the world?

Well, now is where the fun begins. Let's suppose we have 4 bits; with those we have 16 possibilities (we can represent 16 different numbers).
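The counting argument is easy to verify with a couple of lines of Python:

```python
import itertools

# every value representable with four classical bits: 2^4 = 16 patterns
patterns = [''.join(bits) for bits in itertools.product('01', repeat=4)]
print(len(patterns))  # 16
```

Each additional bit doubles the number of representable values, which is why the search space a classical computer must step through grows exponentially.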

Let's say I'm trying to crack a password, and the password is one of the numbers we can represent with these bits.

A normal computer will take one number at a time and try it, one by one, until it hits the right answer.

What if we use a quantum computer? Instead of putting in these four regular bits, we put in four quantum bits. Remember, each qubit is both a 0 and a 1, which means these quantum bits represent all the numbers at the same time.

So, when I put the quantum bits into my machine to find the right password, what comes out the other end says I'm both right and wrong, because we gave it both right and wrong answers at the same time.

We still want to know what the correct password is, right?

Well, there's a technique called the Grover operator (from Grover's search algorithm). This is a real thing where you can sweep away all the wrong answers, and what you're left with is the right answer.

So that’s the beauty of Quantum computing

I have heard people say that it would take the age of the universe to crack these codes. That's how secure they are. But with a quantum computer you can try them all at the same time, use the Grover operator to sweep away all the wrong answers, and what you're left with is the right answer.

So instead of taking millions of years on a regular computer, you could, in principle, do it dramatically faster on a quantum one.
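
A quick back-of-the-envelope sketch of why this helps: Grover’s algorithm needs roughly √N steps to search N possibilities, versus roughly N steps classically, and that gap grows enormously with key size.

```python
import math

# classical search needs ~N guesses; Grover search needs ~sqrt(N) steps
for bits in (4, 32, 64):
    n = 2 ** bits
    print(f"{bits}-bit key: classical ~{n:,} vs Grover ~{math.isqrt(n):,}")
```

For a 64-bit key that is about 1.8 × 10¹⁹ classical guesses versus about 4.3 × 10⁹ Grover steps.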

Are you excited to write your first quantum program?

Let’s get started!

# install the latest version
!pip install cirq

import cirq

# create a row of 5 qubits
length = 5
qubits = [cirq.GridQubit(0, i) for i in range(length)]

Apply a Hadamard operation to every qubit:

H1 = cirq.H(qubits[0])
H2 = cirq.H(qubits[1])
H3 = cirq.H(qubits[2])
H4 = cirq.H(qubits[3])
H5 = cirq.H(qubits[4])

Apply CNOT operations on the pairs (0, 1), (1, 2), (2, 3) and (3, 4), a SWAP on (0, 4), and an X gate on every qubit:

C1 = cirq.CNOT(qubits[0], qubits[1])
C2 = cirq.CNOT(qubits[1], qubits[2])
C3 = cirq.CNOT(qubits[2], qubits[3])
C4 = cirq.CNOT(qubits[3], qubits[4])

# swap
S1 = cirq.SWAP(qubits[0], qubits[4])

# X gate on every qubit
X1 = cirq.X(qubits[0])
X2 = cirq.X(qubits[1])
X3 = cirq.X(qubits[2])
X4 = cirq.X(qubits[3])
X5 = cirq.X(qubits[4])

Create the moments and print the circuit:

moment1 = cirq.Moment([H1])
moment2 = cirq.Moment([H2])
moment3 = cirq.Moment([H3])
moment4 = cirq.Moment([H4])
moment5 = cirq.Moment([H5])
moment6 = cirq.Moment([C1])
moment7 = cirq.Moment([C2])
moment8 = cirq.Moment([C3])
moment9 = cirq.Moment([C4])
moment10 = cirq.Moment([S1])
moment11 = cirq.Moment([X1])
moment12 = cirq.Moment([X2])
moment13 = cirq.Moment([X3])
moment14 = cirq.Moment([X4])
moment15 = cirq.Moment([X5])

# build and print the circuit
circuit = cirq.Circuit((moment1, moment2, moment3, moment4, moment5, moment6, moment7, moment8, moment9, moment10, moment11, moment12, moment13, moment14, moment15))
print(circuit)

This is the quantum circuit you get. I recommend you try it and play with it.

I hope this helped you in some way. Thanks for reading!


Margin Trading: What You Don’t Think About Today Will Bite You Hard Tomorrow

How many chats on your phone are discussing where the Bitcoin price is heading right now? Even if you only touch the crypto industry with your left pinky, the Bitcoin price talk is hard to miss. Add to this the stories about someone who always seems to forecast the price right. A friend of a friend who moved to Bali and now makes a ton of money by trading straight out of his pool. The wizard sees in the price charts what others don’t see and makes profits when the markets are good, bad, and even ugly. 

Trading appeals by its promise of independence – do it from anywhere you want, whenever you want. Margin trading (the use of leverage) adds an extra edge – you can go big starting with little. With the magnitude of Bitcoin price moves and leverage available as high as 100x, the potential of cryptocurrency margin trading becomes hard to ignore, at least in theory.

From the dream of securing the bag while living on an exotic island to the idea of becoming a cool crypto trader – margin trading can be a magnet for a diverse group of people. Once reserved for professionals in pressed white shirts in traditional markets, margin trading in cryptocurrencies is available to almost everyone.

Though democratizing financial instruments is a nice step towards more open finance, there is a flip side to this availability – we jump into something new and then wing it until we figure out how things work. The cost of this “winging it” is losses that could have been avoided.

This post is not about the risks of trading with leverage. If you are reading this, likely you understand that leverage works both ways. It can shoot your gains up on a rocketship and it can also eat your deposit like a hungry bodybuilder on a competition prep. What happens is determined by the direction of the price movement. 

But there are aspects of margin trading more nuanced than the formula for gains or losses. Most people do not think about them, until their deposit is liquidated. 

In this post, we’ll go through the key aspects of margin trading that can bite you (and your profits) hard unless you pay attention to them. Good news – each of those aspects starts with “L”, and knowing them will help you avoid catching your own Ls in margin trading. Understanding them, you will be able to judge any margin trading platform more scrupulously than your girlfriend judges your ex. 

So let’s dive in:


Leverage

“Oh, leverage I already know!” – you’d say. That’s right, but let’s look deeper! Which instrument is this leverage provided with?

Not all margin trading platforms are created equal. The size of the leverage is the simplest thing that differentiates them. Yet the type of instrument that provides the leverage on each platform affects a lot of things in your trading.

For example, users trade perpetual futures on BitMEX. Options were recently added to OKEx. CEX.IO Broker offers margin trading based on CFDs (Contracts for Difference). All these instruments are traded with leverage, but they are different and represent different rights and liabilities of the parties involved. What’s more, the type of instrument determines which reference price your profit or loss is calculated against.

Here’s what it means:

When you open a leveraged position with BitMEX futures, you buy from or sell to some other participant on this derivatives market. Who those other participants are – you don’t know. And there is no matching order of the same size that goes out to the spot market for execution. The gains and losses of your position are calculated against a price that represents a weighted average index of prices on other exchanges.

In contrast, when you place an order with CFDs on CEX.IO Broker, the Broker is an intermediary of every transaction. The Broker quotes you the best price at which it can execute your order on the spot market. So the quoted price, in this case, is not an index. It is rather a representation of the combined liquidity of exchanges where the Broker fulfills your order. And it is in the Broker’s interest for the price to be representative of the market. Otherwise, nobody would use it.

Further on, leverage, by definition, means that the positions you open are several times larger than the amount of your own funds you put into them. Hence, leverage involves borrowing. What exactly you borrow is important.

On some platforms, your borrowed capital equals the difference between the position size and your own funds. On others, your own funds serve as collateral, and your borrowed capital equals the entire size of the position.

So, in two different cases the financing fee is applied to a different amount. Hence, the cost of trading is different. Not to mention that the financing fee itself can vary, as does the frequency with which the financing fee is applied.  
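
Here is a toy calculation, with all numbers hypothetical, showing how the two borrowing models change the financing cost of the same position:

```python
# hypothetical numbers, for illustration only
own_funds = 1_000                  # your own money, in USD
leverage = 10
position = own_funds * leverage    # a $10,000 position

daily_fee = 0.0005                 # an assumed financing rate of 0.05% per day

# model A: you borrow only the difference between the position and your funds
fee_a = (position - own_funds) * daily_fee

# model B: your funds are collateral; the entire position is financed
fee_b = position * daily_fee

print(fee_a, fee_b)  # 4.5 5.0 -- dollars per day
```

Small per-day differences like this compound if a position stays open for weeks.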

So leverage is more than the number before the “x” – 3x, 10x, 100x. The instrument that provides that leverage affects your trading in multiple regards. So if you didn’t know, now you know! Moving on!


Liquidity

Liquidity is something a lot of people like talking about, but not many know exactly what it means and why it matters. In the context of trading, liquidity represents the “depth” of the market.

And the depth means the volume of the working (currently active) limit orders on a platform within a narrow range around the market price. To put it simply, you can add 0.1% to and subtract 0.1% from the market price, and the value of all limit orders falling into this price range will represent your liquidity.  

A liquid orderbook means that even a big market order from a trader will be executed without a significant change from the price level at which it was placed.

Lack of liquidity means that even a moderately sized order will move the price. This is called “slippage” – when the price at which a market order was placed differs from the price at which the order was actually executed. A small slippage is fine and quite common. Significant slippage bites off a big piece of your potential profit.
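
A small sketch, with a made-up order book, of how slippage arises when a market order walks through the available liquidity:

```python
# hypothetical ask side of an order book: (price, amount) pairs
asks = [(9000, 0.5), (9010, 1.0), (9050, 2.0)]

def avg_fill_price(order_size, book):
    """Walk the book and return the volume-weighted fill price."""
    filled, cost = 0.0, 0.0
    for price, amount in book:
        take = min(amount, order_size - filled)
        filled += take
        cost += take * price
        if filled >= order_size:
            break
    return cost / filled

best = asks[0][0]
fill = avg_fill_price(2.0, asks)
print(fill)                            # 9017.5 -- worse than the best ask
print((fill - best) / best * 100)      # slippage of about 0.19%
```

The deeper the book around the market price, the less a given order moves your average fill.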

Because there are various types of margin trading platforms (see above), they can solve the liquidity issue differently. For example, on the spot market (e.g. Kraken margin trading) and the futures market (e.g. BitMEX), liquidity is provided by the participants of the specific market. The limit orders they place are precisely what forms the liquidity there.

But a different setup is also possible. The liquidity can be supplied by other exchanges, liquidity providers, with whom the trading platform has agreements in place. For example, that’s the case with CEX.IO Broker. An order there is routed to one or a combination of liquidity providers for execution. 

Whether the liquidity is supplied by the market participants or by the liquidity providers, neither option is superior to the other. What’s important is to understand how liquidity works on a platform you are trading on. You don’t need an MBA from an Ivy League school for that – just need to know where to look. 

In the first case (liquidity is formed by the market participants) – the orderbook is your reflection of the liquidity situation. Significant order volume within a small range around the market price – great! Not too much – then expect slippages. 

In the second case (with liquidity providers), some platforms show an order book formed by the consolidated liquidity from other exchanges. If so, see the above. If not, only testing how your orders are executed on the platform will help you assess the liquidity.

Note, however, that your own order size matters here. If you trade with the money you’ve saved up skipping the $13 salads for lunch – you got nothing to worry about. If you sold your car to go all in with 100x leverage (don’t do it!) – think twice! Big market orders can eat up the liquidity on a platform quickly and execute with big slippages. 

From slippages to wide spreads, typical to illiquid markets, liquidity on a trading platform is something that affects the cost of your trading and your resulting ROI. So instead of being enticed by the sexy features a platform offers, spend some time understanding the liquidity situation on it. Liquidity is the pulse of trading – check this pulse first so that you don’t end up playing on a dead market.


Latency

Latency is another aspect of trading, not apparent from the home page of a trading platform but affecting the quality of your experience of the market.

Latency is a time interval (delay) between an instruction to send data and the delivery of that data. And now let’s put it simply: remember your conference calls with an office across the ocean? Your foreign colleague opens his mouth saying something and it takes time (forever) before you can hear him. That’s your latency.   

Now, closer to trading. Imagine Bitcoin is nosediving, but when the price updates on your screen – it is already at a different level in the market. Then you want to open a position at a good price, but when the system actually sends your order to the market, the price is no longer relevant.

As a result, you either opened a position at a bad price (if that was a market order), or didn’t open a position at all (with a limit order). In another scenario, your position simply gets liquidated, and you did not even have a chance to do anything about it. You get the idea.   

Latency happens because the little gnomes in computers can only spin the wheel with a limited speed. They spin the wheel to realize some logic – what data to send between the parts of the system, what price to show, how to execute an order, etc.

But sometimes the speed of the gnomes’ spinning is inadequate relative to changes in the market. Add the fact that you are trading with leverage, and even a split second is enough to make you feel like you are losing control.

Whether it’s latency on overseas conference calls or latency on your trading platform, you can get used to anything. Yet well-trained gnomes who spin fast are always better. What this depends on is the quality of the infrastructure and the logic behind order execution. Both can be done poorly and overcomplicated, even on the prettiest of trading platforms.

The problem is, you, as a user, cannot come to a trading platform with the question “What’s up with your gnomes?” Some platforms advertise their low or ultra-low latency as an advantage.

But in practice, you can only verify that by using the platform. If trading reminds you of dial-up modem internet times, you’ve got latency. In trading, especially margin trading, that is unacceptable and impractical.


Liquidation

One way or another, the scary liquidation touches every trader. Liquidation means a forced closure of a trader’s position. Why does a platform close a position without asking the trader? To manage the risk of potentially uncontrollable losses.

Since a trader’s position is opened with leverage (hence, some of the capital is borrowed), a certain pre-specified minimum level of funds needs to be maintained on it at all times in order to keep the position open.

If a position suffers losses that leave the funds guaranteeing the open positions insufficient, and the trader does nothing to remedy the situation (e.g. adds more funds or closes the money-losing positions), the platform has to initiate liquidation.

Nobody likes to swallow losses, especially if the price bounces right back after your liquidation – a sure way to start believing in conspiracy theories! Yet liquidation itself is part of the price of using margin trading facilities.

Many beginner traders think of liquidation as a bedtime scary story or a break time meme theme. Yet until they see zeros (or, better yet, minuses) on their trading account, they do not understand how liquidation works specifically on a platform they use. And that’s a process that can vary greatly from place to place. 

Respectable trading platforms provide detailed information about their liquidation process and warn a trader when the liquidation is nearing with margin calls. Others make liquidation akin to some sorcery that leaves you naked at the end.  

So you’d better not fool yourself that liquidation will never touch you. No matter how lucky you are or how savvy you get, understanding what initiates liquidation and how it unfolds will always put you well ahead of your fellow traders (and save you a ton of money and hidden tears in the process).

What initiates the liquidation is usually a metric in your trading account that crosses a certain threshold. For example, in CEX.IO Broker, this metric is the Margin Level. The Margin Level needs to stay above a certain level, otherwise liquidation starts. On other platforms, indicators like the liquidation price or the maintenance margin will usually point you in the right direction. Do not just learn which indicator you need to track – understand how it works, better than you understood your favorite subject at school!

How liquidation unfolds is a question of whether the liquidation is partial or full. Partial liquidation closes just enough positions to restore the indicator mentioned above to a defined level. So some of your positions may still survive. Full liquidation – simply liquidates all your positions. Ouch.
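
As a simplified, hypothetical sketch of such a check (every platform defines its own formula and thresholds, so the names and numbers here are assumptions for illustration):

```python
# a toy margin-level check; real platforms use their own formulas
def margin_level(equity, used_margin):
    """Equity as a percentage of the margin locked by open positions."""
    return equity / used_margin * 100

equity = 1_200        # your funds plus unrealized P&L, in USD
used_margin = 1_000   # margin locked by open positions
threshold = 50        # an assumed liquidation threshold, in percent

level = margin_level(equity, used_margin)
print(level)              # 120.0
print(level > threshold)  # True -- the position stays open
```

If losses pushed equity down to, say, $450, the level would drop to 45% and cross the assumed threshold.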

Another important thing:

The rollercoaster of price movements can lead to a situation when your deposit does not just liquidate to zero. You end up with a negative number on your account.

Some platforms absorb the negative balances resulting from the liquidation (or guarantee that negative balances won’t happen). Others – don’t. You need to know this in advance. One thing is to cry over a liquidated deposit, another is to cry while having to hide from your creditors trying to collect your debt. 

And, of course, there is a cherry on top! Liquidation fee – ta-da! Oh yea, after all the sadistic pain you’ve been put through with the liquidation of your positions, some platforms (e.g. BitMEX) charge you a (sizable) liquidation fee. That fee goes to a fund which is supposed to guarantee that, after you are liquidated, you do not owe anything. Logically, the liquidation fee is charged at the moment of liquidation. Double ouch.   

We could talk a lot about liquidation. Or you can search for liquidation memes for inspiration. But instead of letting liquidation spectacularly undress you at the most inopportune moment, you are much better off making an effort to understand how it works on a platform before you use it.

The dream, the nightmare, and the messy middle 

Leverage, Liquidity, Latency, and Liquidation are the terms that you will definitely come across in your margin trading. They all affect both your trading experience and your experience of the market in multiple ways. How these things are implemented can make trading on two similar-looking platforms feel like two different worlds. So get to know them. 

Trading is something that can put you on an adventurous quest, professionally and personally. And as with any quest, there is a dream that makes the hero venture onto a new path.

Then there is a challenge – a nightmare of sorts – that can stop the hero from getting what he wants. And there is also a messy middle – not-so-apparent plot twists that all turn out to be key to the journey.

Similarly, in trading, leverage may be the dream of making it big straight out of a fabulous exotic location. Liquidation is the nightmare that will one day make you question your path. And liquidity and latency are the things you may not notice at the beginning of your trip, but they will be there to trip you up midway. Know your dream, your nightmare, and your messy middle – and you are one good hero, ready for your journey.


New York calls for tech volunteers to fight COVID-19

New York State has put out a call for volunteers with technology expertise to create “technology SWAT teams” to boost the state’s response to COVID-19. 

The official site for the COVID-19 Technology SWAT Team, as it’s called, is light on details but broad in scope. The state seeks volunteers with a wide range of tech skills: “professionals with experience in product management, software development / engineering, hardware deployment and end-user support, data science, operations management, design, or other similar areas.”

While individuals can and are encouraged to volunteer, the state explicitly asks for “teams or cohorts of individuals from a single institution.” And while the state accepts remote volunteers from any timezone, it prefers those in the Eastern and Central U.S. timezones, with top preference given to those who can work locally. A minimum of 90 days’ commitment is required.

According to the sign-up form, volunteers can list one of four areas of focus: application/web development, data science and analytics, multimedia (or “digital content strategy”), and end-user support. Volunteers can indicate whether they want to provide full-time technical aid, build external platforms that will be provided without charge, provide free hardware or software, or provide some other kind of service.

New York State has been the epicenter of COVID-19 infections in the United States. As of March 25, 2020, the state had 30,811 confirmed cases of COVID-19, with thousands added daily over the past week. Calls have already been put out for health and mental health volunteers, with 40,000 people already enlisted according to state officials.


Blockchain Use Cases: Cutting Through the Hype

Since 2013, blockchain startups have raised over $23 billion, with the vast majority of that investment coming from Initial Coin Offerings (ICOs). Between eliminating intermediaries and having short investment timeframes, the advantages of crypto fundraising proved to be massive.

Over the past couple of years, a variety of new fundraising methods have also become available to entrepreneurs and blockchain developers. As a result, many are now considering alternative options such as Initial Exchange Offerings (IEOs) and the more regulated Securities Token Offerings (STOs). For some companies, raising the traditional way through equity raises has also proven to become more popular and sustainable. 

Needless to say, new blockchain ventures are now a dime a dozen. This is further evident by the fact that there are now over 5,000 cryptocurrencies available on the market. In 2017, that figure was closer to 2,000.

Despite volatile market conditions, blockchain entrepreneurs have powered through regulatory uncertainty and changing markets to continue building decentralized services, products, and technology. 

A Noisy Landscape With a Hopeful Future

While the abundance of blockchain-based projects may seem like the industry is flourishing with promising new ventures, that is unfortunately far from the truth. Similar to the conventional startup ecosystem, many blockchain ventures struggle to attract and maintain a healthy user base and are abandoned after a few short months.

A Deloitte report titled ‘Evolution of Blockchain Technology: Insights from the GitHub Platform’ claimed that approximately 92 percent of open-source blockchain projects launched in the year 2016 had been abandoned by the end of 2017. Furthermore, regulators such as the United States Securities and Exchange Commission (SEC) have noticed an uptick in malicious activity surrounding blockchain fundraising. The SEC has halted and prosecuted dozens of blockchain startups for allegedly misleading investors, embezzlement, and selling unregistered securities.

As a result of these factors, investor uncertainty with token-based raises has never been higher. Many projects, even those being launched in 2020, lack clear vision and utility. This has made the entire crypto and blockchain asset class seem less desirable to investors and early adopters than only a couple years prior. Like with any emerging asset class, education and trust are essential to gain new interest.

One way of doing this is by living up to the industry’s core belief of decentralization. For example, there is a small class of cryptocurrencies that were launched out of foundations that act as innovation hubs and often community-driven authorities. One example is the Telos Foundation, which is in charge of promoting the Telos blockchain and network, helping connect purpose-driven people and resources. The foundation is set up as a not-for-profit and, unlike most blockchains, does not hold any voting power or big reserves of the tokens, but instead works for the benefit of the chain and its contributors.

The responsibility of the foundation is to act as the marketing arm and to help promote adoption through education and awareness. By lowering entry barriers and making the technology relevant to the mass market, the foundation is ultimately helping solve blockchain’s biggest challenge. 

It is also important to note that there are still a multitude of verticals and industries where the potential of blockchain is quietly gaining momentum. In the following sections of this article, we take a closer look at which sectors could face massive disruption from this technology.

Media and Entertainment

Centralized media and distribution platforms such as YouTube are increasingly finding themselves in the midst of controversies that range from opaque monetization practices to less than ideal recommendation algorithms that are constantly changing.

Blockchain aims to solve these issues, and many others, by offering a more decentralized and democratic approach to the entertainment industry. Similar to how cryptocurrencies did away with the need for banks and financial institutions, blockchain-enabled media platforms are offering a platform for content creators and entertainers that is free of intermediaries and third parties.

Naturally, this increases the level of transparency, which in turn increases the compensation the artist receives. A recent report projected that the blockchain-backed entertainment industry will reach $1.5 billion by 2024, led by notable names such as Disney, Audius (backed by Kleiner Perkins), FilmChain and more.

Other forms of media, including journalism, also stand to benefit immensely from the technology. Most news organizations currently rely on advertisers for funding, which can sometimes lead to conflicts of interest.

With blockchain, however, platforms like Brave Browser are offering a new revenue model that is reliable, efficient, and less intrusive than modern ads. This has attracted more than 10 million active users for Brave.


Healthcare

Healthcare is another industry riddled with inefficiency and ambiguity. In addition to offering more transparency by reducing the number of intermediaries, blockchain could disrupt the industry by increasing automation.

Smart contracts, a relatively new technology that allows programmable contracts to be executed without human intervention, could revolutionize several aspects of the healthcare supply chain, such as reporting.

The potential of blockchain to disrupt healthcare was highlighted in a recent report by Deloitte, which noted how the industry could benefit from the technology.

For example, insurance providers could benefit from smart contracts, which will ultimately make tasks such as claims management, underwriting, and policy issuance far more trivial.

The industry currently relies heavily on paper records, which blockchain can entirely replace with a secure and immutable digital ledger. This means records would be more organized, secure, and accessible. 


Communication

As free speech becomes an increasingly important consideration for many countries around the world, blockchain and cryptocurrencies could help decentralize communication. Bitcoin, for one, has always been lauded for its censorship resistance. In other words, data on a blockchain is etched in permanently. It can be read by anyone, but never modified.

Security and convenience are other aspects of communication that could be improved with the technology. Aloha, for one, is a blockchain-based project that aims to help its users monetize their unused mobile data by sharing bandwidth with people around them. In exchange for their spare data, Aloha grants these users loyalty tokens.

Aloha tokens can later be redeemed for other currencies or for products and services on partner platforms. In this way, Aloha is not only addressing blockchain adoption, but also internet penetration. As we can see, convenience is a key aspect of applying blockchain in the real world and is arguably the most important request from users.


While we explored three industries where blockchain could prove disruptive, this list is far from exhaustive. Like other emerging technologies, decentralization and blockchain technology will likely have a far-reaching positive impact in almost every major sector. While it is still too early to tell which application will lead to mass adoption, the number of companies with millions of users is growing consistently, which suggests blockchain is here to stay.


The Key Differences Between DeFi apps and NeoBanks

2019 was marked by a boom of DeFi apps and neobanks. Both form part of the wider fintech industry, and both can change the way people view and use money. But you shouldn’t lump them together. In this post, I’ll explain the subtle differences between decentralized finance and neobanking apps.

A very basic definition

The problem with neobanks and DeFi apps is that it’s hard to define where they start and end. Fintech is a spectrum, where different types of services merge into one another. Here’s a simplified definition:

A neobank is an app (mobile, web-based or desktop) that lets you manage your fiat money in the same way as banks do, but doesn’t have any physical branches; everything is done online.

Service range: money transfers, loans, payments, ATM withdrawals, trading, currency conversion.

A DeFi service is an app (mobile, web-based or desktop) that lets you manage your crypto assets in a decentralized way – that is, trustlessly via smart contracts. 

Service range: lending, transfers, payments, trading, asset storage.

Apart from these two types, there’s a wealth of fintech apps that can’t be categorized as either DeFi or neobanks. I’ll return to them at the end of the article.

Now, let’s compare our two kinds of apps using various criteria. In the process, it will hopefully become clear how they are similar in many ways, yet different in others. You’ll also see that there are no sharp black-and-white divisions, but rather many shades of grey.

1) Asset type

This is perhaps the most obvious difference. Neobanking services focus on transactions with fiat money – USD, euro and so forth. For instance, Monzo supports 140 fiat currencies. 

At the same time, a neobank can offer cryptocurrency exposure. For example, Revolut allows its Premium clients to convert fiat into crypto and send it to others within the app, though you can’t withdraw this crypto to an external wallet.

By contrast, DeFi apps are designed to manage crypto. A 100% decentralized DeFi app can’t offer any fiat services, because it’s not possible to convert crypto into fiat and back using smart contracts. 

However, DeFi projects need to give customers a way to deposit and withdraw fiat money if they want to achieve mass adoption. This means involving centralized organizations in the process. For example, Maker has just signed an agreement with the payment provider Simplex to create a fiat on-ramp for its stablecoin DAI.

2) Regulation and licensing

Most countries don’t regulate virtual banking, so a neobanking app doesn’t need to be registered as a bank. Instead, it can establish partnerships with real banks and simply act as an interface. 

Still, there is a trend for larger neobanks to obtain full banking licenses. Revolut, Monzo and N26 have all gone this route.

With a license comes full KYC and a proper account opening procedure. You’ll need to fill in a registration form and provide an ID photo and a selfie for the KYC. It can take over an hour for your account to get approved.

DeFi projects generally operate without any financial license. You can take out a loan or lend your crypto assets and earn interest – all without so much as providing your email address.

So far, DeFi apps are flying under the radar of regulators, but it’s not clear how long this will continue. If governments introduce obligatory registration as financial institutions for DeFi, it could endanger the whole decentralized business model. We’ll just have to wait and see.

3) Availability

The differences in availability stem from the licensing and regulation rules.

DeFi apps are unregulated, so they are available in all countries where you can use the Internet and crypto. You don’t need to tell the app your name or address.

By contrast, neobanks are regulated, so they face certain restrictions on who they can and can’t serve. For instance, Monzo is only available to UK residents, while Chime works solely in the US. Revolut can be used by residents of the European Economic Area (EEA), Australia, Canada, Singapore, Switzerland, and the United States. N26 supports most EU countries, plus the US. 

4) Volumes 

In DeFi, the main criterion of growth is the amount of funds locked in the smart contract of each app. You can find this data on DeFi Pulse, for example. The amount is in USD equivalent, so it changes together with crypto prices. 

At the time of writing, a total of $732m were locked in all DeFi apps. A year ago, in March 2019, it was just $313m. This means an increase of 133% in just one year – an impressive figure. (In mid-February 2020, when the price of Bitcoin exceeded $10,000, the value reached $978m – a 212% year-on-year increase.)
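
For the curious, the year-on-year figures above are easy to verify with the numbers quoted in the text:

```python
# year-on-year growth of total value locked in DeFi (USD millions, from the text)
def growth_pct(old, new):
    return (new - old) / old * 100

print(growth_pct(313, 732))  # roughly the 133% increase quoted above
print(growth_pct(313, 978))  # roughly the 212% increase at the Feb 2020 peak
```

The same one-liner works for any before/after pair, e.g. user counts or address counts.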

5) User numbers

Instead of the number of individual users, DeFi operates with the number of unique addresses (active wallets). This number also exploded last year. In February 2019, Maker – by far the largest DeFi app – had only 7,300 active DAI addresses. By September, there were already 66,000 addresses – a 9x growth.

At the same time, the number of active Ethereum wallets that interacted with DeFi apps rose by over 500% in 2019 to reach 19,000, according to DappRadar. In absolute terms, though, the numbers aren’t so impressive: no more than 1,000 unique addresses are active daily.

When we look at neobanks, the number of users is estimated through the number of app downloads. The growth rate here isn’t quite as impressive as with DeFi, but the absolute numbers are far greater. According to Accenture, neobanks’ user base grew by 150% in 2019, from 7.7 million to almost 20 million – roughly 1,000 times the number of wallets interacting with DeFi services.

6) Interface

Neobanking apps are known for their crisp minimalist design and great UX. They are also mobile-first, meaning that the mobile version always takes priority. Some, like Revolut, can seem complex at first sight, because there are so many different services. But the overall UI/UX quality is high. Neobanks clearly allocated big budgets for design and testing.

By comparison, many DeFi apps look and feel unpolished and not too user-friendly. Few of them have native mobile apps, and there aren’t enough tutorials and beginner guides. Considering how complex the process of lending, trading or transfers can be on DeFi platforms, this can scare off some potential users. 

Compare these two apps – Monzo and Maker:

Monzo mobile app

Maker interface

Monzo clearly looks like a ‘real’ app, designed to be intuitive and pleasant to use. Maker has a bare interface with many confusing terms that aren’t properly explained. Clearly, it’s aimed at experienced crypto users. 

7) Payments and cards

Most people use neobanks as an easier, faster, cheaper way to pay, especially internationally. You can pay abroad and send money overseas at the interbank rate, without paying a conversion fee. You can either link your neobanking app to Google Pay or Apple Pay or use the neobank’s prepaid card.

Meanwhile, the DeFi industry is still working on ways to make crypto transactions faster and cheaper. As long as a single Bitcoin transaction takes up to an hour to complete, crypto won’t be able to compete with fiat as a universal means of payment. Examples of projects include Lightning Network (BTC, instant), xDai (ETH and DAI, under 5 seconds) and Connext (ETH, instant).

There are also DeFi-friendly interfaces and wallets that let users manage different DeFi protocols (Compound, Maker, etc.) from a single, better-designed dashboard. Essentially, they try to imitate the stylish look and good UX of neobanks. In this category, we can name Zerion, Argent and DeFiSaver.

Credit: Zerion

8) Borrowing money

Crypto lending is the most popular DeFi service, while it still plays a minor role in the neobanking business. Usually you borrow from other users (P2P lending), who get most of the interest you pay. DeFi loans are always in crypto, and you need to provide collateral in a different crypto asset. The collateral is always larger than the loan itself. But you don’t need to provide any personal or financial information, and the loan is instant and automatic.

DeFi borrowing rates vary from 2% to over 20%, depending on the asset. Here is a sample:


You can explore current rates here.

With neobanks like Revolut or Monzo, you can get a fiat loan on the same day, though it’s not instant. The procedure is easier than with a regular bank: the neobank will review your profile and credit history and perhaps ask you to provide an income statement. APRs range from 3.5% to 20%.

9) Earning interest

In DeFi, lending is just the other side of borrowing: users lend to each other to earn interest. Interest rates can be very attractive: for example, you can earn 8.5% when lending DAI on dYdX.

Since you don’t know who you are lending to, your funds are protected by a large collateral. If the borrower doesn’t repay, or if the price of the asset falls, you can liquidate (sell) the collateral to get your money back.
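The collateral mechanism described above can be sketched in a few lines. This is an illustrative model, not any specific protocol’s code; the 150% minimum ratio is an assumption (a threshold in the range commonly used by DeFi lending platforms):

```python
MIN_COLLATERAL_RATIO = 1.5  # assumed 150% threshold, for illustration

def collateral_ratio(collateral_amount: float,
                     collateral_price: float,
                     debt_usd: float) -> float:
    """Market value of the collateral divided by the outstanding debt."""
    return collateral_amount * collateral_price / debt_usd

def should_liquidate(collateral_amount: float,
                     collateral_price: float,
                     debt_usd: float) -> bool:
    """Liquidation is triggered once the collateral's value drops
    below the required multiple of the debt."""
    ratio = collateral_ratio(collateral_amount, collateral_price, debt_usd)
    return ratio < MIN_COLLATERAL_RATIO

# Borrow 1,000 DAI against 10 ETH while ETH trades at $200 (ratio 2.0):
print(should_liquidate(10, 200.0, 1000))  # False – position is safe

# If ETH falls to $140, the ratio drops to 1.4 and the collateral
# can be sold off to repay the lender:
print(should_liquidate(10, 140.0, 1000))  # True
```

This is why the collateral is always larger than the loan: the buffer gives the system time to liquidate before the debt exceeds the collateral’s value.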

Some neobanks also offer saving accounts that pay interest. You can earn 1.35% with Revolut, 1.30% with Monzo, and 1.10% with Chime.
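To put those rates side by side, here is a quick comparison of a year’s interest on $1,000 at the figures quoted above (simple interest only, for illustration; real DeFi rates float continuously):

```python
def yearly_interest(principal: float, apr_percent: float) -> float:
    """Simple interest earned over one year."""
    return principal * apr_percent / 100

print(yearly_interest(1000, 8.5))   # $85.00 – lending DAI at 8.5%
print(yearly_interest(1000, 1.35))  # $13.50 – a Revolut savings account
```

The gap is large, but so is the difference in risk: the DeFi rate comes with smart-contract and collateral-price risk rather than a deposit guarantee.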

Bottom Line

Neobanks and DeFi apps both tackle the same problem: helping users manage their money independently, without going to a bank or doing paperwork. The main difference is that neobanks target fiat users, while DeFi services are aimed at crypto holders. But gradually neobanks are adding cryptocurrencies to their services, while DeFi platforms are starting to integrate fiat support. 

Apart from the fiat/crypto divide, the range of services offered by the two types of apps is similar. You can send, borrow and lend money, pay for purchases, and trade assets. Does this mean that eventually the two will come together? 

Will we see unified crypto/fiat neobanks that allow you to pay in hundreds of digital and fiat currencies, freely convert them into each other, or buy tokenized securities? Perhaps this is the best route for fintech to follow – a very exciting possibility indeed.



Bodhi Linux 5.1.0 Released based on Ubuntu 18.04.4 LTS

A new version of Bodhi Linux is available to download based on the recent Ubuntu 18.04.4 point release.

While Bodhi Linux isn’t a so-called headline distro, it has gained a solid following over the years thanks to its combination of low system-resource requirements and solid performance with the quirky Moksha desktop environment and popular lightweight desktop apps.

And truth be told, I have a bit of a soft spot for it, too. I like distros that ‘do things differently’ and, amidst a sea of pale Ubuntu spins sporting minor cosmetic changes, Bodhi Linux does just that.

Bodhi Linux Screenshot

Bodhi 5.1.0

Bodhi Linux 5.1.0 is the first major update to the distro in almost two years, succeeding the Bodhi 5.0 release from 2018.

The update, aside from being based on the recent Ubuntu 18.04.4 LTS release with HWE, makes some software substitutions. The Leafpad text editor is replaced with Bodhi’s own lightweight ePad. Likewise, the Midori web browser is supplanted by Epiphany (aka GNOME Web).

To help promote the new release, the Bodhi devs have put together a short video ‘trailer’.

Bodhi Linux runs well on low-end machines (though not exclusively; it’s perfectly usable for gaming rigs too). If you’re minded to give an old Celeron-powered netbook a new purpose then a Bodhi install wouldn’t be a bad way to go about it.

Fair warning though: the Moksha desktop environment, which is based on Enlightenment libraries, is not for everyone. The modular nature of Moksha means it works rather differently to vertically-integrated DEs like GNOME Shell and KDE Plasma.

But different isn’t necessarily bad.

You can learn more about the Bodhi Linux 5.1 release on the distro’s official blog. To download the latest release as a 64-bit .iso, hit the button below, or grab the official torrent:

Download Bodhi Linux 5.1.0 (64-bit .iso)

If you have a 32-bit-only machine, you can download and use the Bodhi Linux 5.1 legacy release. This features Linux kernel 4.9 and no PAE extension:

Download Bodhi Linux 5.1.0 (32-bit .iso)




Lowering The Electricity Costs Of Mining Bitcoin [A How-To Guide]

Bitcoin remains an incredibly tantalizing digital asset, as those who invest wisely in this cryptocurrency can earn huge sums of money for themselves. Bitcoin miners and investors are nevertheless forced to contend with the fact that their market is incredibly volatile, and that technological changes are constantly upsetting industry practices. New and more efficient bitcoin mining processes, for instance, can help lower the energy costs associated with producing the digital tokens.

How can Bitcoin miners reduce their energy burden in order to save money and become more environmentally sustainable? Here’s a breakdown of how to lower the energy costs of mining Bitcoin, and why else the cryptocurrency has undergone serious changes recently.

Ongoing disruption is rife

It’s important to establish that there’s much ongoing disruption in the cryptocurrency marketplace right now that’s preventing miners and investors from achieving their ideal outcomes. Bitcoin mining is being substantially upended by supply chain difficulties arising from the global coronavirus pandemic, for instance, as certain equipment makers are finding it impossible to produce the goods they ship to eager customers around the globe. 

According to one recent report, for instance, Chinese companies producing Bitcoin mining equipment are finding it impossible to do their jobs following a crackdown by authorities meant to diminish the spread of the coronavirus. Workers who could be busy producing this equipment are instead self-quarantining at home or finding themselves prevented from heading into work by government or company regulations. As such, Bitcoin miners around the globe who are desperate to get their hands on the latest mining equipment are finding themselves sorely out of luck. The ultimate impact of the pandemic on Bitcoin could have serious implications for the broader marketplace in the years to come. 

Just because there are supply chain difficulties affecting the world of Bitcoin right now doesn’t mean things will always be this bad. Indeed, there are plenty of reasons to believe that Bitcoin will bounce back from this crisis, as it has from so many others, and emerge stronger than before. Digital cryptocurrencies remain an enticing asset in an era of eroding privacy, but they’ll only be obtainable if they can be mined efficiently, without wasting huge sums of power.

Lowering your energy costs

So, how do you lower your energy costs when mining Bitcoin? The first tip is to avoid illegal or unethical shortcuts. Many Bitcoin enthusiasts consider tapping into their work computers to mine, since doing so would pass the electricity costs of mining on to their employers. More than being unethical, this is flagrantly illegal in many areas; Uzbekistan recently announced a massive electricity tax aimed explicitly at the cryptocurrency community because of such abuses. Individual miners have been fired from their jobs and fined heavily for using work computers for mining, so don’t think lowering your electricity costs this way is worth it.

Another option is to look at products marketed specifically to Bitcoin miners that claim to make the process easier. You have to be careful, though, as these are private companies trying to turn a profit. Gigabit Magazine recently reviewed Node, a commercial product aimed at helping Bitcoin enthusiasts mine the currency without using too much energy. Scrutinize such products carefully before depending on them, as they vary greatly in quality and features. The last thing you want is to find yourself unable to check cryptocurrency prices because a faulty product fried your home’s wiring.

The only surefire way to mitigate the immense amount of electricity that Bitcoin operations need is cryptogovernance: wisely conceived policies that enable the sustainable production of digital assets without harming the environment. As this argument points out, government regulations are effectively the only measure that enough Bitcoin miners will respect to make a difference. Barring such measures, the continued pursuit of cryptocurrencies like Bitcoin could become cost-prohibitive in the near future.

Between a rock and a hard place

Bitcoin miners are going to find the forthcoming days to be difficult and costly. Many people who are unethically and illegally using company machines to mine Bitcoin will find it impossible to do so as they quarantine in order to avoid the ongoing global pandemic. Others will simply find that mining Bitcoin at home is cost-prohibitive, as there’s very little that the average miner can do to mitigate their electricity usage short of setting up a massive facility that requires a huge amount of startup capital. 
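A back-of-the-envelope calculation shows why home mining so quickly becomes cost-prohibitive. The hardware figures below are illustrative assumptions (a single ASIC-class rig drawing roughly 3.25 kW at a residential rate of $0.13/kWh), not quotes from the article:

```python
def daily_power_cost(power_watts: float, price_per_kwh: float) -> float:
    """Electricity cost of running a rig 24 hours at constant draw."""
    kwh_per_day = power_watts / 1000 * 24
    return kwh_per_day * price_per_kwh

cost = daily_power_cost(3250, 0.13)
print(f"${cost:.2f} per day")  # about $10.14/day, or roughly $300/month
```

At that rate, electricity alone eats hundreds of dollars a month per rig before any hardware or cooling costs, which is why miners gravitate toward regions with cheap industrial power.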

Despite efforts to lower its costs and make it more accessible, Bitcoin mining will remain dominated by the select few with enough financial resources, or access to the huge computer farms, needed to survive in this industry.

