Richard Brown is the Chief Technology Officer at R3. He leads the team that invented, designed and brought to market the Corda blockchain platform. He now also leads the team building out R3’s second major product line, Conclave, a platform for securely sharing and analysing data using confidential computing.
What is blockchain?
Previously, Richard gave Insureblocks a definition of blockchain from an enterprise perspective. A blockchain like Corda is all about allowing multiple firms in a market to be in sync with each other about facts they care about, such as loans and trade deals. Documents shared between firms, such as a notification of loss for an insurance policy, will invariably evolve over time: the claim gets reviewed, processed and authorised. All those business processes are executed within a firm, yet other firms across the ecosystem that have a stake in those documents need to be in consensus about their status.
For Richard, blockchain is all about ensuring that all the participants in an inter-firm business process are in sync and remain so. The key value proposition is that “what you see is what I see”. Since our last podcast together in April 2019, Richard believes that his original definition of blockchain has been largely validated by projects R3 has successfully run, such as Spunta by ABI (the Italian Banking Association). Spunta is about ensuring Italian banks are in sync with each other: that their balances reconcile and that all the details are correct.
Security on the web – the padlock on your browser
When browsing the web, including social media sites like Facebook, we are trained to look out for the padlock next to the URL in our internet browser, as it gives us a sense of security. What that padlock tells us is that the connection between ourselves and Facebook is secure: that we are talking to the real Facebook.com and that the connection is with servers controlled by Facebook.com.
This means that whatever data you exchange with Facebook is protected in transit as it leaves your computer and travels to Facebook’s servers.
However, what it doesn’t say is what Facebook will do with the data; it simply tells you that they are the ones who will receive it. Once Facebook receives that data they can do whatever they like with it, something which has of course led to press scandals such as the Cambridge Analytica one.
Social media sites today haven’t deployed any technological measure to constrain or control how they use your data. As consumers we rely entirely on social and legal measures to constrain what they do with that data.
The padlock in the browser effectively gives us a false sense of security, because whilst it gives protection to the data as it moves it doesn’t do anything about how the data is ultimately used by the receiving party.
This problem, of course, isn’t just one for consumers but also for businesses. Banks route client orders to exchanges to buy or sell shares. Insurance companies send data to government agencies or third-party credit agencies. A lot of the data being sent may include personally identifiable and otherwise sensitive information. As a company, the only way you can get comfortable with that is by investigating the reputation and procedures of the receiving firm.
Firms that want a better understanding of their market share, or of how they compare with their competitors, have two ways of getting it. The first is to share information directly with their competitors, which most wouldn’t want to do and which, even where the parties were willing, is prohibited in numerous jurisdictions. The other is to share it with third parties such as, in the financial industry, Bloomberg. Financial institutions share with Bloomberg information about the trades they’ve done and at what price; Bloomberg aggregates, processes and anonymises all that data to produce useful market-level metrics, which it sells back to the market.
If a company can’t get comfortable with that, it will simply not share its data with third parties, because of the risk of what happens if they do something with it that they shouldn’t, in spite of the value it could get from sharing that data.
The real opportunity is thus to give senders the ability to control or constrain how their data is used by the receiving party before they send it out.
Moving the trust from third parties to silicon chips
The padlock lets you know who you’re sending your data to, and that only they can receive it. Confidential computing goes a step further: it allows a computer that somebody is running to prove to somebody else what programme it is running. If we use Facebook as an example, Facebook would have to prove to your browser what it will use your data for. The attestation would prove that the data can be used to find your friends or to play games, but that Facebook cannot sell your data to advertisers or send it to a government agency.
This is a paradigm shift: you move trust based on the goodwill and privacy policies of firms such as Facebook or Bloomberg to the implementation, in chips from manufacturers like Intel and AMD, of specific cryptographic techniques, such that even if Facebook wanted to, it couldn’t change what an algorithm did, and it couldn’t see the underlying data.
There are two sides to artificial intelligence (AI). One side is the definition and the training of the models. The second side is the use of those models for some business purpose.
The challenge is that a model is only as good as the data on which it is trained. Getting sufficient good-quality data from the right sources to train a model is a significant challenge: it requires sourcing data from outside your organisation and reaching out to other players and third parties. The problem is that they may be very reluctant to share that data with you. They may be fine with you using the data only for training your model, but how can they be sure you won’t use the data for other purposes?
By using the cryptographic techniques mentioned earlier, a model creator can prove to third parties that their data will only be used to train the model. This enables the building of better-quality, better-performing models.
The second side, of course, is the execution of the model for some business purpose. If you wish to license out the model, or give it to your customers to use or integrate into their applications, you need to make sure they can’t reverse engineer it: you don’t want them to be able to do anything other than execute the model. The same cryptographic techniques can be applied to this use case, ensuring the model can only be used to execute its function and not for any other purpose.
If you send data to somebody else’s computer, they can do what they like with it; it doesn’t matter what they tell you, because they control the computer and they get to say what happens. They might tell you they’re running a particular programme, but in general you’ve got no way of knowing that they are, and if they change it or inspect what it’s doing, you’ve got no way of knowing.
Confidential computing is a term for a general set of techniques that allows you, the owner of a computer, to relinquish the right to control, change or inspect what it is doing in a given context, and to prove to other people that you have done so.
Consequently, senders of data will be comfortable sharing data with your computer, because they know it will be processed in a secure enclave: a protected zone within your computer where you can neither change nor see the data, and where it is processed only in the agreed manner.
There are a few subtleties which need to be pointed out. The owner of the computer gets to choose the programme that runs within that secure enclave. The sender of the data gets to audit it, to say whether or not they are happy to send their data to that programme within that secure enclave. Most senders will rely on third party auditors to offer such services.
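The attestation flow described above can be sketched in a few lines. This is an illustrative sketch only: real remote attestation (for example on Intel SGX) involves CPU-signed quotes checked against a vendor attestation service, and every name below is invented rather than taken from any real API.

```python
import hashlib

# Illustrative sketch only: names are invented, not a real SGX or Conclave API.

def measure(program_bytes: bytes) -> str:
    # The "measurement" is a cryptographic hash of the code the enclave loaded.
    return hashlib.sha256(program_bytes).hexdigest()

# The programme the receiver claims to run, audited in advance by the sender
# (or by a third-party auditor acting on the sender's behalf).
AUDITED_PROGRAM = b"def analyse(data): return sum(data) / len(data)"
expected_measurement = measure(AUDITED_PROGRAM)

def willing_to_send(expected: str, reported: str) -> bool:
    # The sender releases data only if the enclave provably runs the audited
    # programme (the CPU's signature over the report is omitted here).
    return expected == reported

# The enclave reports the measurement of whatever code it actually loaded.
assert willing_to_send(expected_measurement, measure(AUDITED_PROGRAM))

# If the receiver swapped in different code, the measurement changes and the
# sender refuses to send.
tampered = measure(b"def analyse(data): leak(data)")
assert not willing_to_send(expected_measurement, tampered)
```

The point of the sketch is the trust inversion: the sender’s decision rests on a hash of the code, checked before any data leaves their machine, rather than on the receiver’s promises.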
Introduction to Conclave
Conclave is R3’s confidential computing platform that enables multiple parties to contribute data for analysis without revealing the actual data to anyone. It leverages Intel’s SGX (Software Guard Extensions) technology. With Conclave you can:
- Analyse data from multiple parties in a protected algorithm, and verify how data is used
- Protect customer data from misuse and provide assurance that the data remains protected when collected, shared and analysed
- Reduce time-to-market for privacy-enhancing applications
- Access previously inaccessible customer data to deliver new insights and AI without compromising on confidentiality
- Streamline business processes between firms, while sharing processing costs and infrastructure
- Increase effectiveness of Financial Crime detection by eliminating false positives
- Promote transparent and fair price discovery in Dark Pools
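The joint-analysis pattern in the list above can be made concrete with a minimal sketch. The function below stands in for audited code running inside an attested enclave; it is plain Python with invented names, not the Conclave API.

```python
# Illustrative sketch of multi-party joint analysis: the function stands in
# for code running inside an attested enclave; names are invented.

def enclave_average(submissions: dict) -> float:
    # Runs "inside the enclave": it sees each party's raw figure, but only
    # the aggregate ever leaves the protected zone.
    return sum(submissions.values()) / len(submissions)

# Three competing firms each contribute a confidential figure, e.g. a price.
submissions = {"FirmA": 120.0, "FirmB": 80.0, "FirmC": 100.0}

# The only output any party (including the machine's operator) ever sees:
market_average = enclave_average(submissions)
assert market_average == 100.0
```

Because the enclave can prove it runs exactly this analysis and nothing else, each firm knows its individual figure can never be extracted by the others or by the operator, only the market-level aggregate.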
iPhone, SIM cards and EMV cards – examples of confidential computing
The core technology of confidential computing is enabled by hardware, rather than software. The core concepts are not new at all. An example of a confidential computer is a SIM card. SIM cards within mobile phones hold a tiny computer that stores some data and can run some programmes. Some of the data it holds is a private key which acts as your identity allowing you to connect to the network.
The chips on EMV cards, such as debit or credit cards, also use confidential computing techniques. They are tiny computers that store data, such as your PIN and your spending limits for contactless offline transactions, and can run basic programmes.
iPhones also implement confidential computing techniques. Whilst you may own the iPhone, and can install apps and perform a number of actions on it, you can’t change the operating system. You can’t, for example, install the Android operating system onto an iPhone, unless of course you jailbreak the phone. That means that even though you physically possess the device, Apple, as the manufacturer, can prevent you from doing certain things.
With confidential computing in the enterprise world, such as with R3’s Conclave, you can choose to use this technology to restrict your own freedoms and thereby create massive confidence among your own customers, giving them the confidence to send you their data. And to the extent you’re in a competitive market, where you can prove how you’ll use a customer’s data and your competitor can’t, the customer will be more willing to deal and trade with you, because you can prove, to a higher standard of proof, how you will process that data.
Whilst confidential computing isn’t new, what is new is the emergence of this technology into the mainstream and it becoming accessible to regular business developers to build applications.
Confidential computing enhancing blockchain
Richard and his team started working on Conclave several years ago for their own internal use to solve a residual privacy problem on Corda.
In a collaborative business network of Corda nodes, those nodes communicate and collaborate to bring participants within the network in sync. The problem all blockchains suffer from is that sometimes the way you confirm the authenticity or validity of data is by reviewing its provenance. For example, if you’re sent a token that represents cash, you need to ensure that it was really issued by a specific bank. This is known as the back-chain problem.
There’s a chain of transactions leading up to the current one that you need to verify to check that the current token is entirely legitimate. To verify it you have to receive it, and from it you may be able to infer something about your counterparty’s previous business dealings. That is a residual piece of privacy that needs to be solved.
To solve it, Richard and his team encrypted the back chain: instead of sending information on the provenance of the data, the sender can remotely attest that the data has been verified by an agreed algorithm running on a confidential computing chip. The confidential computing platform performs the verification, removing the recipient’s need to review the provenance of the data and consequently eliminating the residual privacy problem.
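A minimal sketch of that idea follows, with invented names and a drastically simplified transaction model (Corda’s actual transaction format and Conclave’s attestation format are far more involved): instead of shipping the whole back chain to the recipient, verification happens inside an enclave and only an attestation bound to the final transaction is sent.

```python
import hashlib

# Invented, simplified model: each entry stands in for one transaction.
chain = [
    {"tx": "issue 100 tokens", "by": "BankA"},
    {"tx": "transfer to BankB", "by": "BankA"},
    {"tx": "transfer to BankC", "by": "BankB"},  # the current transaction
]

def naive_verify(back_chain) -> bool:
    # Traditional approach: the recipient receives and inspects every prior
    # transaction, and so learns the counterparty's past dealings.
    return all("tx" in t and "by" in t for t in back_chain)

def enclave_verify_and_attest(back_chain) -> dict:
    # Enclave approach: the same check runs inside the enclave; only a
    # statement bound to the final transaction ever leaves it.
    assert naive_verify(back_chain)
    tip = hashlib.sha256(repr(back_chain[-1]).encode()).hexdigest()
    return {"verified_tip": tip, "verifier": "measurement-of-audited-code"}

attestation = enclave_verify_and_attest(chain)
# The recipient checks the attestation matches the transaction they received,
# without ever seeing BankA's and BankB's earlier history.
assert attestation["verified_tip"] == hashlib.sha256(
    repr(chain[-1]).encode()).hexdigest()
```

The recipient trusts the result for the same reason as in the earlier attestation sketch: the enclave’s measurement proves which verification code ran, so the full history never needs to leave the sender.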
Within the Corda ecosystem, they realised that lots of parties were using Corda to sync data that they cared about. However, it became clear that parties wouldn’t be willing to share some data under any circumstances but would be happy to bring their datasets together with others for some joint analysis.
Early clients from the financial industry have expressed interest in using Conclave in fraud and financial crime analytics, such as detecting money laundering.
Conclave is being built as a standalone product, but it will also be integrated into Corda Enterprise. It is presently in Beta 4 and is expected to ship in Q1 2021.