Qubic status update October 3rd 2018

Eric Hop:

September was all about testing the Qubic programming language Abra and creating its initial support library, which is written in Abra of course. While creating the support library, the need arose to be able to verify all kinds of ideas. This resulted in a parallel trajectory where a simple Abra language parser was created in Java. The parser allowed us to run syntactical sanity checks on the library code even before we had a running Abra compiler. To facilitate the building of the parser we created an EBNF syntax diagram for the Abra language.

The process of building this parser directly resulted in a few changes to the Abra language syntax that make it easier to parse and analyze the language. In addition, while building the support library, it became clear that there was a lot of repetitive programming going on due to the fixed-size nature of the Abra trit vector data type. This resulted in the addition of a template-like feature which allows us to create generic functions and then instantiate them for the required trit vector sizes.

In the meantime, one of our Discord community members, ben75, managed to use the EBNF syntax diagram to implement an awesome syntax highlighter for Abra on the IntelliJ IDEA platform. That turned out to be a great help for us while creating the support library code. This community never ceases to amaze me.

Once the parser/analyzer worked correctly, we decided to give it a quick-and-dirty ability to run Abra code as an interpreter. This allows us to run and test Abra code already, even before being able to compile it for a specific platform. It also allowed us to debug the support library code by stepping through the Java interpreter code in the debugger while it executes the Abra code.

We’re happy to report that most basic library functions worked exactly as designed, and only a few minor details needed fixing. The most astonishing thing that happened during this phase was that *by far* the most complex function that was written, the integer multiplication function, worked flawlessly right off the bat! Astonishing because when we wrote this code there was no way to run the code other than in our minds.

While we haven’t yet created the corresponding integer division function, the functions that implement arithmetic, logical, and conditional operations have already proven to work correctly in practice when we used them to implement several test functions. The most impressive part of the support library is probably the way we can tailor those functions to any required trit vector size, which allows us to perform integer arithmetic natively on a vast selection of integer ranges that is unmatched by most other programming languages. For example, we defined an integer data type we named Huge, which is a trit vector 81 trits long and can represent values in the range of minus to plus 221,713,244,121,518,884,974,124,815,309,574,946,401! We even tested with a 6561-trit data type that is supposed to hold an IOTA signature and found that it can represent integer numbers that are a whopping 3131 digits in length. And the integer arithmetic functions will work correctly with all of them!
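
For readers who want to check these figures, here is a small sketch in plain Python (not Abra) of the value range of a balanced-ternary trit vector; the type names in the comments are taken from the text, everything else is just illustration:

```python
# Plain Python illustration (not Abra): the value range of a balanced-ternary
# trit vector of n trits, where every trit is -1, 0 or +1.
def trit_vector_range(n):
    # n trits encode 3**n distinct values, symmetric around zero.
    max_value = (3 ** n - 1) // 2
    return -max_value, max_value

_, huge_max = trit_vector_range(81)           # the "Huge" type from the text
print(huge_max)                               # 221713244121518884974124815309574946401
print(len(str(trit_vector_range(6561)[1])))   # 3131 -- digits of the 6561-trit maximum
```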

Further additions to the parser allow it to generate a ternary representation of the Abra code ready for inclusion in a Qubic message to be sent through the Tangle, and convert that ternary code back into the original representation that can be run by the interpreter. This could prove very helpful in speeding up the process of getting to a working proof of concept for Qubic until a more robust version of the end-to-end functionality can be completed.
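
The exact ternary serialization format for Abra code has not been published yet, but as a rough idea of how ternary data usually travels inside a transaction, here is the standard IOTA trits-to-trytes conversion (purely illustrative; this is not the Abra encoding itself):

```python
# Standard IOTA trit-to-tryte mapping, shown only to illustrate how ternary data
# is typically packed for transport in a transaction; NOT the Abra serialization.
TRYTE_ALPHABET = "9ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def trits_to_trytes(trits):
    assert len(trits) % 3 == 0
    trytes = []
    for i in range(0, len(trits), 3):
        # balanced-ternary value of a 3-trit group, in the range -13..13
        value = trits[i] + 3 * trits[i + 1] + 9 * trits[i + 2]
        trytes.append(TRYTE_ALPHABET[value % 27])
    return "".join(trytes)

print(trits_to_trytes([1, 0, 0, -1, 1, 0]))   # -> "AB"
```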

The document about the Qubic computational model is coming along nicely. It has grown so large that we decided to split it into multiple parts. The first two parts are currently being reviewed, and the third part is expected to be ready in the first week of October. The following parts are planned at the moment:

  1. A conceptual overview of the Abra processing model.
  2. An overview of the basic entities in Abra.
  3. An overview of the Abra programming constructs.
  4. Some example Abra functions with details on how they work.
  5. The Qubic Dispatcher and its interaction with Abra.

It has been very exciting for us to finally see the first working Abra programs in action this month! We hope to be able to share the documentation and interpreter with the community as soon as possible, so that you can start playing with it and contributing to our Abra efforts as well.

Bonus: Abra Syntax Diagram

The link above leads to a nice railroad syntax diagram of the Abra language.

Source: https://blog.iota.org/qubic-status-update-october-3rd-2018-1cb7db5c850d

Validity in the Tangle: The good, the bad, and the ugly

Clara Shikhelman:

This article discusses how we can assess the validity of a transaction in a distributed ledger, and the particular issues we face in the case of the Tangle.

To begin with, let’s look at blockchain data structures. In a blockchain, before users attach a new block to the end of the chain, they make sure that all of the transactions in the chain are valid. Validity means that in each transaction the payer has the required funds, and that no coins are created outside the protocol specification. This check is simple because blocks are ordered in time and transactions are ordered within blocks (by the miners), so comparing the timing of two transactions is straightforward. For example, if Alice starts with an empty wallet, the transaction “Alice gets 3 coins” must precede the transaction “Alice gives 3 coins,” otherwise the latter is invalid.

How is The Tangle Different?

Just as in a blockchain, a necessary condition for a transaction to be valid is that the payer received the coins before making the payment. However, since the Tangle is a Directed Acyclic Graph (DAG), the question of time-ordering becomes non-trivial. The DAG structure makes it hard, or sometimes impossible, to resolve whether a payer received coins before spending them.

Let’s take the case of two transactions, A and B. In a blockchain there are only two options regarding their order: A happened before B, or A happened after B. In a DAG, however, there is a third option: the two transactions may be incomparable (i.e., we do not know which happened first).

To continue our discussion, let us agree on a few preliminaries. First, we treat validity as a state of a transaction, examined in the context of the current state of the Tangle. That is, any function that decides whether a transaction is valid takes into account both the transaction itself and the Tangle it lives in.

Second, the only thing a transaction is accountable for is its cone of past transactions — that is, all of the transactions that it approves directly or indirectly.

When a user chooses which transactions to approve, they check the validity of the transactions by analysing their cones (the part of the Tangle which these transactions reference), and can ignore the rest of the Tangle.

Every transaction in a cone implies the movement of coins from one account to another. To assess validity, we apply all of these movements to the Genesis state to generate an updated state of wallets and their balances. We will call this state acceptable if all of the balances are non-negative (i.e., address balances cannot contain negative values, only zero or positive) and the sum of the balances is equal to the number of coins created in the Genesis.
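
As a minimal sketch of that definition (the data structures here are hypothetical and not taken from any IOTA implementation; the supply constant is the actual number of IOTA created in the Genesis), applying a cone to the Genesis state and checking acceptability could look like this:

```python
# Minimal sketch with made-up data structures: a cone is a list of
# (payer, payee, amount) movements; the Genesis state is an address -> balance map.
GENESIS_SUPPLY = 2_779_530_283_277_761   # total number of IOTA created in the Genesis

def apply_cone(genesis_balances, cone):
    balances = dict(genesis_balances)
    for payer, payee, amount in cone:
        balances[payer] = balances.get(payer, 0) - amount
        balances[payee] = balances.get(payee, 0) + amount
    return balances

def is_acceptable(balances):
    # Acceptable: no negative balances, and the total still equals the Genesis supply.
    return all(b >= 0 for b in balances.values()) and sum(balances.values()) == GENESIS_SUPPLY
```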

Now let’s dive deeper into what properties a transaction and its cone should have, for the transaction to be considered valid. There are many options for such properties. Here, we consider three of those:

1. The minimum property

The moment that an honest (even if selfish) user sends a transaction, they should own the funds they are spending. That is, in a healthy system, we expect users to have the money “in their pocket” before trying to give it to someone else.

Having sufficient funds is the most basic condition. For a transaction to be considered valid, we expect that after all of its cone transactions are applied to the Genesis state, we get an acceptable state. This property is minimal in the sense that any other reasonable property we come up with will imply it.

2. The “good random walk” property

This property comes up rather naturally when implementing the Tangle. As we know, to choose two transactions to approve, the user performs two random walks starting from the Genesis until they reach a tip that they consider “valid.” It would be a pain to go all the way from the Genesis to a tip just to discover that it is no good.

To save time, and broken hearts over bad tips, a user can check that every transaction that the random walk steps on has the “minimum property” stated above. This means that for each transaction along the way, applying its cone to the Genesis state yields an acceptable state.

3. The maximum property

The strictest property we can demand from a transaction is that every transaction in its cone has the minimum property. In addition to getting an acceptable state for the end transaction, or for the transactions we saw on the walk, we also check that we get an acceptable state for every transaction that we approve, directly or indirectly.

So if users approve a transaction, they must take some responsibility for the funds in that transaction, even if those funds are not involved in the new transaction they are adding. This property is maximal in the sense that any reasonable property we can think of cannot ask more from the cone than what is stated above.
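
To make the contrast concrete, here is a hedged sketch of the two properties, reusing the apply_cone and is_acceptable helpers from the sketch above; the transaction model (a transaction carrying transfers and approving some parents) is invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tx:
    transfers: tuple = ()   # (payer, payee, amount) movements carried by this transaction
    parents: tuple = ()     # the transactions this transaction approves

def past_cone(tx):
    """All transactions approved directly or indirectly, including tx itself."""
    seen, stack = set(), [tx]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(t.parents)
    return seen

def has_minimum_property(tx, genesis_balances):
    # Only the combined effect of the whole cone matters.
    transfers = [m for t in past_cone(tx) for m in t.transfers]
    return is_acceptable(apply_cone(genesis_balances, transfers))

def has_maximum_property(tx, genesis_balances):
    # Stricter: every transaction in the cone must itself satisfy the minimum property.
    return all(has_minimum_property(t, genesis_balances) for t in past_cone(tx))
```

The “good random walk” property is then simply has_minimum_property checked for every transaction that the walk visits on its way to a tip.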

So which property is good, which is bad and which is ugly? Well, the one thing we can clearly say is that the “good random walk” property is bad, even if it is appealing from an implementation point of view. The problem with it is the following: if all of the users agree on this property, it might be that two users disagree on the validity of a given transaction because the randomness of their walks took them along two different paths. As we would like honest users to agree with each other as much as possible, this property is definitely bad.

As for the other two — numerous hours and gallons of e-ink went into the discussion of which is the good one (and which is the ugly one). We were not able to find an attack that the minimum property allows or a use case that the maximum property prevents, which leaves a lot of room for discussion without a clear tiebreaker. I have my ideas, but I encourage you to think and share yours and maybe we’ll come up with the one property to rule them all.

Source: https://blog.iota.org/validity-in-the-tangle-the-good-the-bad-and-the-ugly-98bd3b53408a

Welcome Florian Doebler to the IOTA Foundation

David Sønstebø: Florian Doebler is an Economist and Entrepreneur with a focus on behavioral science and international development. He joins the IOTA Foundation as Social Impact & Donor Relations Coordinator at our headquarters in Berlin. During his studies in Munich and Vienna, his academic interests centered on the sustainable governance of commons throughout economic history, and on the intersection of innovation, cultures, and institutions.

Florian is a co-founder of a social start-up with the aim of bringing transparency to ecological accounting while incentivizing reductions in resource use. He is a specialist on the behavioral consequences of digitalization, and contributed to the implementation of experimentality in governments and private enterprises, as an employee at a global consultancy.

After he graduated in January, Florian supported the newly established Blockchain Lab at the German Development Agency (GIZ), furthering the maturity of Distributed Ledger Technologies with a view to the United Nations Development Goals. Tapping his interdisciplinary skill set and passion for life on earth, he will help to harness the potential of IOTA for positive impact on people, communities, and ecosystems.

On joining IOTA

When I first learned about IOTA, it immediately sparked my imagination as a key enabler of the sharing economy. In my eyes, the answer to the challenges of today can only be the more sensible use of our resources and increased cooperation on a global scale. IOTA promises to pave the way towards these goals and opens up new avenues to build a more inclusive and equal society.

I could not be more thrilled to support the development of the future economy, in such an ambitious and passionate team.

Florian’s profound understanding of economics, technology, and human behavior will complement our Social Impact & Regulatory Affairs Team, and help to anchor IOTA as the backbone of the sharing economy. We are very happy that he is part of our journey and welcome him warmly to the IOTA Foundation!

Source: https://blog.iota.org/welcome-florian-doebler-to-the-iota-foundation-9b03498364c

ENGIE Lab CRIGEN and the IOTA Foundation to drive DLT innovation in the smart energy ecosystem

Welcome Mark Nixon to the IOTA Foundation

David Sønstebø:

Mark Nixon joins IOTA to head up the Smart City Program. He brings a wealth of experience gained across the TMT sector, having held senior commercial and operational roles for 3Com, Verizon, Nokia, O2 and, most recently, Huawei, where he led the Business Consulting Practice in MENA.

Mark started his career in Telecommunications 30 years ago with the British Army “Royal Corps of Signals” as a Systems Engineer working on some of the earliest Mobile Data Communications Networks deployed globally to support British Military Operations. Early successful forays into near-realtime Intelligence Systems working on personality tracking and vehicle identification / tracking systems led to his leaving the Royal Signals and joining the TCP-IP Switching Company 3Com. Here he held several Technical and Product Marketing roles, leading key projects with the UK academic community on the development of “JANET” and “SuperJANET”, a global secure IP network supporting research and academic institutions.

Throughout his career he has looked to be innovative in how technology can be applied to real-world business and consumer lifestyles, seeking to solve problems that make a difference to our lives. He launched the first Blackberry service outside the US with O2, and quickly followed that by leading the launch of the first Windows mobile data solution with the O2 XDA, successfully introducing secure, scalable, data-centric solutions to the UK market.

Mark led the O2 platform solutions business, securing breakthrough contracts with Camelot (the UK National Lottery) and Pop Idol (for cross-carrier interactive voting services which have revolutionized how we interact with our TV viewing). His team pioneered the early M2M market creating an ecosystem of development and innovation for the mobility security market, with the introduction of a smart car tracker service in the UK and Europe. He played an integral role in the establishment of a mobile data application development ecosystem (O2 Litmus), enabling innovation and collaboration for the creation of data applications that can be used in 3–4G carrier networks.

At Nokia Mark drove the successful launch of the OVI Store, offering customers content that was compatible with their mobile devices and relevant to their tastes and locations. Ovi introduced the first carrier billing capability to the Nokia partner network of operators, to seamlessly simplify consumers accessing and paying for relevant content.

Most recently Mark has been leading Huawei Business Consulting Practices in Europe and the Middle East. He was responsible for driving new innovative engagements with the TMT sector, looking to enable the Digital Transformation of traditional connectivity-centric mobile network operators to becoming viable Digital Services Businesses. As a TMForum Ambassador, Mark is at the heart of introducing Industry standards and frameworks that look to accelerate transformation by utilising new open transparent technologies. For example the Open API Program, Digital Maturity Model and FrameworX — the structured building blocks for carriers to address the transition from legacy processes, skills and operating systems, to new agile experience-based business models.

Mark holds a BEng in Electronic Engineering from City of London University and an MBA from Said Business School, Oxford.

On joining IOTA

I’m so excited to be joining the IOTA team, where I will be leading the Smart City Program. I am looking forward to continuing the great work that IOTA has done to date in engaging with the Smart City ecosystem, including government, academia and industry. We hope to continue to leverage the IOTA Foundation to partner in the creation of real-world DLT solutions that enhance the lives of citizens globally. Utilising the IOTA Tangle, R&D teams and Partner Network in the development of PoCs and use cases, we can drive open-source, permissionless development.

Mark Nixon brings a massive wealth of experience and a significant network of contacts, to lead the Smart City Program at the IOTA Foundation. Give him a warm welcome!

Source: https://blog.iota.org/welcome-mark-nixon-to-the-iota-foundation-63badefdb311

Modelling New Business Models with IOTA

Jan Pauseback:

People often ask me what Blockchain or Distributed Ledger Technology is. As a technical analyst, it is part of my job to explain it to them. However, some technologies are extremely difficult to explain in a simple fashion.

DLT or Distributed Ledger Technology is one of those cases. Even for people with a software engineering or computer science background, it is sometimes difficult to understand the technology itself, and more so the implications for current and future business models.

Sometimes it really helps to show how it all fits together…


Inspired by Volkswagen@CEBIT

…something like Volkswagen’s miniature world showcase at CEBIT this year, which demonstrated how IOTA may be used in different automotive use cases, such as:

  • Vehicle identification. Allowing a vehicle to identify itself to parking garages, smart home systems, other vehicles, etc.
  • Over the air updates. Allowing verification that the software running in your car is correct and up-to-date.
  • Function on demand. Allowing payments to upgrade your car with features such as higher maximum speed or extended range.
  • Autonomous vehicles that earn money. Allowing autonomous vehicles to act as fee-earning taxis when not used by their owners.
  • Payment for tolls and EV charging. Allowing automatic, transparent and frictionless payment.

When I first saw the Volkswagen showcase I was taken aback, as were many others. This approach actually made people understand the implications of the machine-to-machine economy vision of IOTA.


Showcasing IOTA’s vision with the EV3

We saw that physical demonstrations help people to understand the vision. So how could we as a foundation model our vision? With Lego!

Many people have childhood memories of building their dreams with Lego. So we wanted to see if we too could build our dreams with it. Lego EV3 is a computing device / platform with which you can build programmable robots. Since its release in 2013, people have implemented some sophisticated applications on the platform.

In this project we installed ev3dev, a Debian-based operating system, on our EV3 and ran a simple shell script. This script checks whether the balance on a specific IOTA address has increased, indicating that someone paid the truck. If the funds on the address increase, the truck starts its engine and drives for a set amount of time: 1Ki equals one second of driving. In the demo video, you can see the truck start driving when money is sent to the right address.
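
The original script was a few lines of shell running on the EV3; the Python sketch below conveys the same idea. The node URL, the address and the drive() stub are placeholders, and getBalances is the standard public IRI API call for reading address balances:

```python
# Rough Python sketch of the demo logic (the original was a shell script on ev3dev).
import time
import requests

NODE = "https://nodes.example.org:443"       # placeholder IRI node URL
ADDRESS = "DEMO9ADDRESS9PLACEHOLDER"         # replace with the truck's real 81-tryte address
IOTA_PER_SECOND = 1000                       # 1 Ki buys one second of driving

def get_balance():
    payload = {"command": "getBalances", "addresses": [ADDRESS], "threshold": 100}
    headers = {"Content-Type": "application/json", "X-IOTA-API-Version": "1"}
    return int(requests.post(NODE, json=payload, headers=headers).json()["balances"][0])

def drive(seconds):
    # On the real EV3 this would start the motors via ev3dev; here it is only a stub.
    print(f"driving for {seconds:.1f}s")
    time.sleep(seconds)

last_balance = get_balance()
while True:
    balance = get_balance()
    if balance > last_balance:
        drive((balance - last_balance) / IOTA_PER_SECOND)
        last_balance = balance
    time.sleep(5)
```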

To summarize: We built a toy truck that drives only when it gets paid and stops when it ‘runs out of money’. That is the most basic use case but it already has real world relevance. With this simple example it becomes clear that IOTA:

  • Decreases complexity. There are no middle men involved. Only you and the truck.
  • Has no transaction fees. What you send is what the truck receives.
  • Simplifies cross-country payments. No need to deal with foreign currencies any more.
  • Can enforce sanctions in case of a default. The truck will only drive as long as you are paying for it [1].

These features, even today, could help to reduce fraud, risks and costs.


We believe that by bringing together IOTA and such elegant and familiar modelling tools, we can make our new technology more approachable, and help people to understand its possibilities and implications.

The next step could be towards supply chain tracking, e.g. by putting a Bosch XDK (https://xdk.bosch-connectivity.com/) into a toy shipping container, and using the recently published xdk2mam (https://xdk2mam.io/) code to stream environmental data to the Tangle. Furthermore, one could let the shipping container pay the truck to start driving, which would bring this miniature world showcase even closer to IOTA’s vision of a machine-to-machine economy.

There is a whole world yet to be built from these components, so that everyone can see what the future will look like!

If you want to build this future together with us, then apply for our tech analyst positions! We are hiring! (https://iota.bamboohr.co.uk/jobs/view.php?id=32)

Please note:

IOTA does not have any partnership with LEGO to support this development. Other similar toys with embedded electronics can potentially be used for these demonstrations.

Source: https://blog.iota.org/modelling-new-business-models-with-iota-fadd53c6a192

Welcome Michele Nati to the IOTA Foundation

David Sønstebø:

Michele Nati, PhD, has 15 years of experience in the fields of data and IoT, working on research and development and, more recently, on innovation in a number of roles, from academia to SMEs to government organisations. During his career he pioneered and touched all the technologies and aspects that helped to create the connected world we live in today.

In 2007, Michele obtained a PhD in Computer Science from the University of Rome La Sapienza. He researched, designed and deployed novel cross-layer communication protocols for Wireless Sensor Networks and other resource-constrained embedded devices, dividing his time between Rome and Boston, where he was a visiting researcher at Northeastern University. After his PhD, he worked for academia and a number of SMEs as a research scientist in the area of remote sensing and monitoring.

In 2010 Michele moved to the UK to work at the University of Surrey, where he was a Senior Researcher at the 5G Innovation Centre. As part of the first large-scale European smart city project, SmartSantander, he led the development and deployment of a campus-wide Internet of Things testbed comprising thousands of sensor devices for energy monitoring applications. He led a group of PhD students investigating the impact of IoT and data in creating more sustainable buildings and campuses. He developed proposals for and led a number of other European projects, researching the field of privacy-aware communication and crowdsensing in the mobile Internet of Things.

Since 2015 Michele has worked as the Lead Technologist for Data and Trust at Digital Catapult London, driving open innovation in the area of data and trust. He worked on a number of projects and initiatives to increase transparency and individual control over how personal data are collected and shared. He championed the introduction of a standardized Personal Data Receipt to increase transparency, track personal data sharing transactions and support General Data Protection Regulation compliance.

In his research into trust and transparency, Michele was exposed to blockchain and distributed ledger technologies. He researched DLTs as a means to build an infrastructure for sharing data with trust, and to create innovation in the digital manufacturing and creative industries.

Currently he is an active member of the Mydata community, actively liaising with academia, SMEs, large organizations and governments to identify open innovation activities in the field of data and IoT.

Beyond technology Michele is a keen trail runner and meetup organiser.

On joining IOTA

The IOTA technology and vision bring together all the aspects that have filled my technical and research interests for the last 15 years: IoT (from its early stage), data and trust. I believe these are the same three pillars on top of which innovation should now be created. I like the way the IOTA Foundation is working and establishing its presence in different vertical domains, through open innovation, demonstrators and partnerships, rather than capitalising only on the value associated with a cryptocurrency.

I am very excited to use my experience in managing multi-stakeholder initiatives to find real-world problems and to help with the adoption of IOTA as a trust infrastructure for the creation of new data sharing ecosystems.

Michele has deep technical knowledge and multi-sector exposure, and has participated in different stages of research and innovation activities. These qualities, together with his innate ability to manage multiple stakeholders, will be of great value for the adoption and evolution of the IOTA protocol.

Michele is based in London, where he works as the Lead Technical Analyst to support innovation in Global Trade and Supply Chains. In light of his deep domain knowledge and network of contacts, Michele also serves as the Personal Data Lead, working with the Business Development team to define a strategy for the Personal Data arena. Give him a warm welcome!

Source: https://blog.iota.org/welcome-michele-nati-to-the-iota-foundation-ed023f9840c2

Coming Up: Local Snapshots

Hans Moog: A development status update

Over the last few months, the IOTA network has seen a significant increase in activity as more and more developers start to implement solutions based on the Tangle. While this is a very promising development, reflecting the increasing adoption of IOTA, it also results in an increase in database size, which may be problematic for nodes with limited hardware resources.

The IOTA Foundation has been performing global snapshots on a regular basis, whereby the transaction history is pruned, and the resulting balances are consolidated into a new genesis state that allows nodes to start over, with an empty database. However, this way of dealing with a growing ledger size is becoming more and more impractical as it requires us to:

  • Temporarily stop the coordinator.
  • Generate the snapshot state.
  • Give the community time to verify the generated files.
  • And finally restart the coordinator.

The Solution — Local Snapshots

To solve this problem, we have been working on implementing a feature called Local Snapshots. This has always been a central part of IOTA’s Roadmap. The initial implementation is now being tested internally, and we will keep you up to date with next steps, but first we must review all the implemented changes and gather sufficient metrics about the behaviour of this new feature.

Console output during local snapshot

What does this mean for node operators?

Before we dive into the technical aspects of Local Snapshots, we want to give a short summary of the changes that this new feature brings for node operators:

  • When spinning up a new node, it is possible to sync based on a small local snapshot file, which allows nodes to be fully synced within minutes (rather than bootstrapping the node with a large database file).
  • The disk requirements for nodes are massively reduced — in fact we already have nodes running with just a few hundred megabytes of hard disk space.
  • Since there will no longer be a need for global snapshots, nodes could theoretically run for years without maintenance.
  • Nodes should be in a position to handle thousands of transactions per second, without the database size ever becoming a problem.

How does it work?

To understand the way Local Snapshots work, we first need to clarify a few things about the way the Tangle works:

  • The Tangle is a data structure that has a lot of uncertainty at its tips but gains certainty as time progresses.
  • Consequently, the further you go back in time the less likely it is for an unconfirmed transaction to suddenly become part of consensus. This is the reason why it is necessary to “reattach” transactions if they have been pending for too long.
  • To verify transactions and take part in IOTA’s consensus, it is only necessary to know the recent history of pending transactions and the current state (balances) of the ledger.

The basic principle behind Local Snapshots is relatively easy to understand and can be divided into different aspects:

Pruning of old transactions and persisting the balances

  • We first choose a confirmed transaction that is sufficiently old and use this transaction as an “anchor” for the local snapshot.
  • We then prune all transactions that are directly or indirectly referenced by this transaction and clean up the database accordingly.
  • Before we remove these old transactions, we check which balances they affected and persist the resulting state of the ledger in a local snapshot file, which IRI subsequently uses as its new starting point (a rough sketch of this flow follows below).
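
A much simplified sketch of that flow, with made-up data structures rather than the actual IRI implementation, might look as follows:

```python
# Much simplified sketch, not the actual IRI implementation. Transactions are plain
# dicts; in reality the cone is walked through the database and pruning deletes rows.

def take_local_snapshot(snapshot_state, all_transactions, anchor_cone):
    """anchor_cone: every transaction directly or indirectly referenced by the anchor."""
    # 1. Fold the balance changes of the cone into a new consolidated ledger state.
    new_state = dict(snapshot_state)
    for tx in anchor_cone:
        for address, delta in tx["balance_changes"]:     # signed amount per address
            new_state[address] = new_state.get(address, 0) + delta
    # 2. Persist the new state (this is what the local snapshot file stores).
    snapshot_file = {"balances": new_state}
    # 3. Prune: keep only the transactions outside the anchor's cone.
    remaining = [tx for tx in all_transactions if tx not in anchor_cone]
    return snapshot_file, remaining
```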

Solid Entry Points (fast sync for new nodes)

While pruning old transactions is no problem for nodes that are already fully synced with the network, it poses a problem for new nodes that try to enter the network, since they are no longer able to easily retrieve the complete transaction history dating back to the last global snapshot.

Even if we assume that they are able to retrieve the full history by asking permanodes for the missing transactions, it would still take a very long time to catch up with the latest state of the ledger. This problem is not new, and it is one of the reasons why a lot of node operators bootstrap their nodes with a copy of the database from another synchronized node.

To solve this problem, we use the local snapshot files not just as a way to persist the state of the node but also to allow new nodes to start their operations based on the exact same file (which can be shared by the community and the IF at regular intervals).

To be able to bootstrap a new node with a local snapshot file we need to store a few more details than just the balances:

  • First of all, a newly syncing node needs to know at which point it can stop solidifying a chain of transactions and simply consider the subtangle solid. To enable this, we determine which of the transactions that we deleted had approvers that did not become orphaned, and store their hashes in a list of “solid entry points”.
  • Once a node reaches one of those hashes it stops asking for its approvees and marks the transaction as solid (like the 999999….99 transaction after a global snapshot).

This enables us to use local snapshot files as a bootstrap mechanism to get a new node synced very quickly (within minutes), which at the same time is much easier to provide and retrieve than a copy of the whole database.
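
As a rough illustration of how such a list could be derived during pruning (again with invented structures rather than IRI code): a pruned transaction becomes a solid entry point if at least one transaction that survives the pruning still references it, so a syncing node may run into its hash again.

```python
# Rough illustration with made-up structures, not IRI code.
def solid_entry_points(pruned_txs, remaining_txs):
    pruned_hashes = {tx["hash"] for tx in pruned_txs}
    entry_points = set()
    for tx in remaining_txs:
        for approved_hash in tx["approves"]:        # trunk and branch references
            if approved_hash in pruned_hashes:
                entry_points.add(approved_hash)     # solidification may stop here
    return entry_points
```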

Seen Milestones (even faster sync)

  • While solid entry points allow us to stop the solidification process as soon as possible, it can still take a while to learn about all subsequent milestones that happened after our chosen cut-off point.
  • Since we want the local snapshot files to be a viable and efficient way of bootstrapping a new node we also save all subsequent milestone hashes in the local snapshot files, so that new nodes can immediately ask for missing milestones, without having to passively wait for their neighbours to broadcast them.
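
Putting the three ingredients together, a local snapshot file conceptually needs to carry at least the following pieces of information (the layout below is purely illustrative and not the actual IRI file format):

```python
# Purely illustrative layout, not the actual IRI snapshot file format.
local_snapshot = {
    "anchor_milestone_hash": "SOME9MILESTONE9HASH",       # the chosen cut-off point
    "anchor_milestone_index": 123456,
    "balances": {"ADDRESS9A": 1000, "ADDRESS9B": 2500},   # the consolidated ledger state
    "solid_entry_points": ["HASH9X", "HASH9Y"],           # where solidification may stop
    "seen_milestones": ["HASH9M1", "HASH9M2"],            # milestones after the cut-off
}
```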

Permanodes

Since the pruning of data will be controlled by a simple configuration setting, it will now be possible to run permanodes that keep the full history of transactions, which has so far been impossible because global snapshots were a network-wide event.


The whole procedure of taking local snapshots is then automatically repeated to allow nodes to run with a relatively constant space requirement without unnecessary maintenance.

Summary

The upcoming feature of Local Snapshots will not just solve the space problems that arise with the growing adoption of IOTA, but will also simplify the setup of new nodes and allow organisations and community members to operate permanodes.

We will be opening this up for beta testing in the coming weeks. More information will be posted in the #snapshots channel on the IOTA Discord.

Source: https://blog.iota.org/coming-up-local-snapshots-7018ff0ed5db

Welcome Sam Chen to the IOTA Foundation

David Sønstebø:

陳志誠 (Sam Chen) is a software engineer based in Taiwan, with experience in embedded systems, the Android framework, and software integration. He holds a Bachelor of Science with a specialization in computer science and studied electronic engineering at an industrial high school.

Sam has helped IC design companies to develop smart TVs, set-top boxes and media players, and has been dedicated to software architecture and performance analysis for 10 years. He has an interest in open source software and open hardware, in particular using the Arduino and Raspberry Pi platforms to solve problems in daily life.

On joining IOTA

My passion lies in IoT. After reading the IOTA Whitepaper, I believe IOTA can take IoT further. Based on the Tangle, IoT devices can communicate with each other in a more secure way. Though I am new to distributed ledger technologies, I am eager to contribute and help realize the promise of IoT. I am glad to join the IOTA Foundation and work with smart people around the world.

We are very happy to officially announce Sam Chen joining the project. His experience as a software engineer in embedded systems is of great value to the IOTA Foundation. Give him a warm welcome!

Economics in Cryptoland

Gur Huberman:

Two forms of money are familiar: token- and ledger-based. The former consists of coins and bills. Possession of the tokens amounts to their ownership. The history of their possession is irrelevant to their current status. Ideally the tokens are scarce, verifiable, durable, portable, unforgeable, fungible. Usually governments produce them.

Traditional ledger-based money relies on one or more trusted parties which maintain the ledger. They keep and update records of balances. In doing so, they are trusted to follow accounting rules and to satisfy account holders’ requests to reduce their balances and credit the balances of other accounts, as long as these requests are consistent with the accounting rules.

Tokens and ledger-based money are equivalent and trade at parity because of a government decree that obligates banks to convert one form of money to another at the client’s request.

A ledger-based currency is associated with one or more payment systems, i.e., ways of maintaining balances, transferring its units from one individual to another and updating balances. Cryptocurrencies are similar — they are computer networks which maintain and update balances. A notable difference is that a cryptocurrency’s payment system is for its own native coin whereas a traditional payment system is for a coin which is viable outside the payment system, e.g., the USD.

A more important difference between a traditional payment system and a cryptocurrency is the mechanism determining the rules and changes in them, i.e., the governance structure.

A traditional system is maintained by an organization which operates within an established legal jurisdiction, with clearly stated objectives — profit maximization is natural for privately owned enterprises — and formal decision-making mechanisms. These determine, e.g., what its products are, how it makes them, how much and in what way the organization charges for these products, and how much and in what way it pays for its inputs. Market conditions affect these choices, and changes in market conditions often inspire changes in these choices.

The superficial resemblance of cryptocurrencies to traditional payment systems notwithstanding, the differences are crucial.

In its purest and most common form, a cryptocurrency is founded on a protocol, i.e., a set of rules that, when effective, each participant follows because, believing that the other participants follow the rules, it is in his own self-interest to follow them as well. No legal system is necessary, or even able, to support the protocol of a cryptocurrency.

A cryptocurrency is born with its protocol. An effective protocol anticipates the prevailing market conditions. A protocol which supports a certain service quality at a given price is effective if the users’ demand is compatible with these parameters. For instance, Bitcoin was designed to handle up to 2,000 transactions — a block — per ten minutes; when transactions arrive at a rate close to the limit, users will offer fees to see their transactions processed earlier than those of users who offer lower fees; when transactions consistently arrive at a rate higher than the limit, some of them will not be processed.

A cryptocurrency may be functional and even prosperous as long as conditions are stationary. However, protocol changes are difficult to orchestrate because they require the participants’ consensus, and divergence of the participants’ interests undermines such a consensus. Consider, for instance, what will happen when the system gains popularity and its throughput is deemed insufficient. Users who send time-sensitive transactions would prefer the system to process blocks faster, whereas other users may prefer the system to increase the block size.

The upshot is that to understand how a payment system functions, one needs to understand incentives. How does a participant act in a given situation, presumably trying to advance his self-interest? Beyond individuals’ incentives, one needs to understand equilibrium. What’s the outcome, given everybody’s actions?

These issues are difficult enough in a traditional context where a potent government pronounces and enforces the rules. It is far more challenging to understand which rules are self-enforcing in the absence of a government, and what behaviors and outcomes they engender. These questions are challenging, with the challenge compounded when one asks how the incentives, actions and equilibria evolve over time.

These very issues are high on the agenda of IOTA’s research team, which looks at questions such as: What is the best way to apply MCMC for tip selection? How is protocol-following behavior rewarded? How does the speed of communications affect incentives and behavior?

The agenda is rich, the questions challenging. Establishing a protocol for IoT and Internet payment and communication will be enormously rewarding.

Source: https://blog.iota.org/economics-in-cryptoland-3ba2d9e401d0