Ctrl+Alt+Regulate
3 Techies Banter #3TB · May 17, 2024 · 00:52:31

Tech has a "pacing problem" (believe us - this is an accepted term 😎). Innovation is fast outpacing the ability of laws and regulations to keep up. Regulators are trying to govern Hogwarts with rules written for a Muggle school. Good luck stopping magical mischief with detention and a dress code. Balancing benefits/risks, ethical considerations, determining liability and lack of expertise are just some of the considerations in play. "To regulate or not to regulate" is the constant dilemma. Or should we just let the market decide? So who better than Rahul Matthan, a partner at Trilegal and one of the country's foremost technology lawyers, to walk us all through this nuanced world. Join our delightful conversation on any platform of your choosing (not regulated 😉). Learn more about your ad choices. Visit megaphone.fm/adchoices


[00:00:02] Hi, I'm Samiran. Hi, I'm Nilesh. Hi, I'm Sheetal. And you're listening to 3TB. 3 Techies Banter

[00:00:12] Hi everyone and welcome to another episode of 3 Techies Banter. We have kind of stayed with

[00:00:19] thematic episodes and as you must have realized this time around we've kind of picked on this whole issue of

[00:00:27] privacy, security, data and what it is doing to us as individuals, what our rights

[00:00:33] might be and stuff like that. In fact if you see some of our artwork around it

[00:00:39] it's pretty cool and slightly corny if I may add, it's actually got a cookie jar

[00:00:44] for whatever reason. But having said that, Nilesh, Sheetal and I obviously

[00:00:51] discussed this topic and looked at it from various angles. We thought it would be super

[00:00:57] super useful to get someone who's kind of been in that space, got his hands dirty

[00:01:03] and, you know, knows it like the back of his hand. And in fact we are very very delighted to have

[00:01:10] Rahul Matthan. He's a partner at Trilegal. And I think from quite a few of the

[00:01:19] introductions to him that I have read, he's one of those rare people who apparently

[00:01:25] marries the East Coast of the US to the West Coast, which is, actually, legal people

[00:01:30] to the technologists. So you know, he kind of is able to kind of very very comfortably

[00:01:35] straddle both spaces. So he's been in the technology field and in the technology

[00:01:39] practice of Trilegal for the longest time. And of course he is a lawyer by profession

[00:01:45] I think other than that what is also interesting is that I think in the last several

[00:01:50] years he's also branched into this whole larger issue of data governance, especially

[00:01:58] in the, you know, rapidly accelerating digital world that we live in today, because, you know,

[00:02:05] like we discussed earlier we had this whole issue of you know we want to be private

[00:02:09] people but we have no harm, no issue giving away our data online. So it's kind of

[00:02:13] really strange. I think he's going to be helping us through that. Obviously he's had

[00:02:18] a super, super illustrious career, and my claim to his illustrious career is that he used

[00:02:24] to be in school with me, though a senior. But he's been on the RBI Governor's

[00:02:28] advisory board, he was a G20 advisor on DPI and, you know, many, many things

[00:02:35] like that. But, you know, thank you again Rahul for so kindly taking this afternoon

[00:02:42] to come and speak with us. So I'll kind of just kick this off by maybe asking you to

[00:02:48] just, you know, talk a little bit about, I won't say your journey, but you know just

[00:02:54] just how it is, you know, a little bit about your hobbies maybe, you know, what's

[00:03:00] exciting in this technology come law business that keeps you kind of going all day

[00:03:06] and then we can jump into whatever questions we might have.

[00:03:09] Sure, look, I've listened to 3TB for some time so very, very glad to be part of this

[00:03:18] conversation. I've heard many of your conversations around privacy and I guess the

[00:03:24] intersections between tech and law and policy, and every now and then I felt like, look, I

[00:03:30] felt like putting my hand up and saying, look, I also have something to say, so you know

[00:03:34] how it is. I'm glad I'm here to be part of the conversation. But yeah, look

[00:03:38] I mean, it's strange. I've been doing this for over 25 years now; for all of that

[00:03:44] time I've been doing technology law. And 25 years ago there was no such thing as

[00:03:49] technology law, so it was very lonely just doing something that no one thought

[00:03:55] was useful or interesting, and going on and on doing it. And why I did it was, I guess,

[00:04:01] I've always had a sort of an affinity for technology. I was never afraid of technology

[00:04:07] if there was some new technology I would go in and try and sort of figure out how it worked

[00:04:12] and where it all started, as I recall, is in my very first job with a

[00:04:16] law firm called [inaudible]. There was one computer that was connected to the internet

[00:04:21] on one of those screeching modems, and this was the backup for the fax:

[00:04:26] if the fax didn't work they would have a way to send documents, and no one used it.

[00:04:32] And this is pre-internet, literally this is before VSNL came in, and the only way that I could use

[00:04:38] it was, in those days we had something called a bulletin board system in Bangalore, a

[00:04:42] BBS, and this is essentially a modem connected to some guy's computer. He had a computer

[00:04:49] dedicated to this, with the modem on it; he also had little programs and games

[00:04:54] and also a sort of old email system. So the way the email used to work was you would go there,

[00:05:00] download everything, see all the messages that people had left for you,

[00:05:04] and go offline, because it was considered really rude to hog this one line; there would be about

[00:05:10] 20 or 30 people who wanted to get onto this. So you would quickly download your messages, go offline,

[00:05:16] answer them at your leisure, and then wait your turn for the modem to become free again,

[00:05:20] and then upload all your replies, and then you'd do this whole thing again the next day.

[00:05:25] So this is really, you know, an old asynchronous kind of internet in those days.

[00:05:32] but the reason I'm mentioning this is because it was fairly non-trivial for a lawyer

[00:05:38] to make this work, and I did; I somehow or the other figured out how to make this work

[00:05:45] and that's sort of the journey. I mean, whenever there's a new tech toy that comes out

[00:05:52] I buy it and I play around with it and I see, you know, what it is. I bought my first and only Bitcoin

[00:05:59] when it was, I don't know, maybe 200 dollars to the Bitcoin or something; the only Bitcoin I ever had.

[00:06:06] It started as an investment for me, but then, of course, you know what the price of Bitcoin is now, and

[00:06:11] I've sort of converted that into Ether. I've done a whole bunch of different things, played with the L2s,

[00:06:17] you know, some really interesting L3s. Right now my social media service of choice is

[00:06:25] one that works on the Farcaster protocol. So look, I dabble with all sorts of things. Why do I do that?

[00:06:31] because I can't advise clients on how existing laws and policy will intersect with their business

[00:06:40] if I don't know how their business works and so I can't advise on cryptocurrencies and

[00:06:46] blockchains if I don't understand how cryptocurrency and blockchain works and that's true for everything

[00:06:52] so I guess that's how I did it. I mean, I've sort of learnt it the hard way. And so if you say marry the East Coast

[00:06:59] to the West Coast, yeah, that's I guess the only way. I think the East Coast and the West Coast

[00:07:04] stay too far apart, and in today's world where technology is eating everything we've got to all

[00:07:12] be able to understand each other and so I think that's sort of the way I have.

[00:07:17] So in fact, the East Coast-West Coast thing, I remember when earlier Trump was running for

[00:07:22] President the first time around, they said that, you know, there's the East Coast and there is the West Coast

[00:07:28] and there are all the people in between who actually vote for Trump. So the East Coast and West Coast

[00:07:33] are actually something to do with America, because they're like, I mean, super liberal or, you know, whatever

[00:07:39] more enlightened or more educated or you know doing other stuff but it's those people in between who determine the fate of America

[00:07:46] so I'll just jump into the conversation and I know that you know there's been so much conversation

[00:07:52] especially because I do research there is so much conversation that's happening around data

[00:07:58] and it's becoming more and more, how should I say more and more consumer focused earlier it used to be policy

[00:08:06] focused, it used to be, you know, people-within-the-industry focused, and now today you find that

[00:08:12] every consumer is being asked about data and what they think about data and how they feel

[00:08:17] about sharing data and all of that. And the funny thing is that, you know, very often in research

[00:08:23] I don't even think the consumer knows how data is being defined, right? And therefore maybe a very, very

[00:08:29] fundamental question for you, but the question is, when everybody talks about data privacy

[00:08:36] and when everybody talks about your rights, what is the data that one is talking about?

[00:08:41] Let's start with that and then we can maybe go into the more complicated questions around, you know, the

[00:08:47] ethical issues and the legal issues etc. But how would one define this?

[00:08:52] Yeah, so I think what's of concern to people is what we call personal data, and personal data

[00:09:00] around the world is defined as any data that relates to a person and personally identifies

[00:09:07] that person. So if you just think about traffic data, you know, the number of cars that are going

[00:09:13] down the street doesn't make a difference, but if they can put some sort of tracker on your car and they

[00:09:19] know that you have gone from point A to point B, then it becomes personally identifiable information. And you can see

[00:09:24] any type of information will transform in this manner the moment it becomes associated or aligned

[00:09:30] with an individual. And why is this a concern? Because a single stray line of data, that is, you went

[00:09:38] from point A to point B, doesn't make that much of a difference, but then over a period of, like, ten years

[00:09:47] if you find that every morning you go from point A to point B and every evening you go from point B to point A

[00:09:54] it is a reasonable presumption that one of those is your place of residence and the other is your place of work

[00:09:59] and then from there you can find more and more information and so I think the concern around personal data

[00:10:06] is one that is capable of being analyzed and manipulated at such a high degree of fidelity

[00:10:15] that it can provide a lot of information that we normally assume would be kept private. And in my first book,

[00:10:25] which is called Privacy 3.0, I actually explore this concept in some detail.
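The home/work inference Rahul describes is worryingly easy to reproduce. Here is a minimal illustrative sketch (the data and function names are hypothetical, not from the episode): each trip record is innocuous on its own, but the aggregate pattern labels a person.

```python
from collections import Counter

def infer_home_and_work(trips):
    """Guess home and work from (hour, origin, destination) trip records.

    Heuristic: the most common origin of morning trips is likely home;
    the most common morning destination is likely work.
    """
    morning = [(o, d) for hour, o, d in trips if 5 <= hour < 12]
    home = Counter(o for o, d in morning).most_common(1)[0][0]
    work = Counter(d for o, d in morning).most_common(1)[0][0]
    return home, work

# Ten workdays of "anonymous" location traces are enough to label a commuter.
trips = [(8, "A", "B")] * 10 + [(19, "B", "A")] * 10 + [(11, "A", "C")]
print(infer_home_and_work(trips))  # → ('A', 'B')
```

This is exactly the transformation he points to: non-personal traffic data becomes personally identifiable the moment it is linked to one individual over time.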

[00:10:29] I say, look, the concept of privacy, I argue, is a relatively new concept; it is a concept

[00:10:39] that is unnatural in nature, because animals are generally not private and early humans were not private

[00:10:45] in fact I argued that it was dangerous for the species to be private because you didn't want one of your tribe

[00:10:53] going off doing something when you actually need that person to be fighting against the enemies when they came

[00:10:57] or against, like, the herd of elephants when they charged in, or whatever it was. And so we actually did not like private people;

[00:11:06] we did not like people who went off and did their own thing. But that all changed when societies specialized

[00:11:12] and then we realized that we really actually needed to have space to think

[00:11:17] and that's where privacy comes from. We sort of all need like a private space to go and think and reflect

[00:11:23] and in the book I describe how art and science and culture actually come out of this ability to be with ourselves

[00:11:31] in a private space. So if privacy is important for these really foundational civilizational things

[00:11:39] then when data about us that is associated with us can be used outside of our will

[00:11:48] to describe things about us that we were quite sure no one knew it starts to become dangerous

[00:11:56] but at the same time as you've said many times on this show we trade this off with the many benefits that technology provides

[00:12:04] and a lot of those benefits actually come out of the conveniences that data provides.

[00:12:10] Recommendations, for instance. I've very recently started going back to physical book stores

[00:12:17] because I finally found a bookseller who understands what I like, and whenever I go I select a few books and he says,

[00:12:24] you might also like this book. For the longest time the only bookseller who understood me like that

[00:12:30] was Amazon, because Amazon's got a perfect memory of every book that I've read

[00:12:35] and pretty good accurate recommendations of what I should read next. So I think that's the benefit of data

[00:12:41] The downside of data is exactly that: they know all of these things, you know, some really deep dark secrets that you really don't want anyone else

[00:12:48] to know. And if we can't control that, and if for some pernicious reason the people who control these companies

[00:12:56] misuse it to harm us, it could be very dangerous. So that's really the root of this issue,

[00:13:05] and in a lot of my writing I describe this as a trade off. We are trading off the good with the harm

[00:13:11] and we each make the trade-off in our own special way. And I think the point that people don't understand

[00:13:19] is that this is a trade off and your trade off is different from my trade off and I don't think we can impose it on anyone

[00:13:27] but there are some particular things where we say, look, this sort of stuff is not okay. We don't, for instance, want children to be subject

[00:13:33] to this because we know that children perhaps don't have this kind of full understanding and so they can cause themselves harm

[00:13:40] unknowingly. And so that's really the role of data protection law and policy: to try and find ways in which to

[00:13:48] strike this very fine balance between these two things. So Rahul, since we've defined data, and you know you touched

[00:13:58] upon laws and regulations, and I'm sure you have multinational clients right from all across the world.

[00:14:05] So I think by now most of us are very well aware of GDPR, right, and the European Union mostly follows it quite

[00:14:14] stringently. Now we have our DPDP. So, I mean, when you advise clients across the world, are these laws essentially

[00:14:25] all similar, or are there some very specific differences between the global and Indian perspectives?

[00:14:31] And just a quick view on that if you could.

[00:14:34] Yeah, I think data protection laws are roughly similar on the basis of the principles that they follow.

[00:14:41] And yeah, these started actually in the US, not in Europe; the first data protection principles were called

[00:14:48] FIPs, the Fair Information Practices, which was very much a US framework. Then it was articulated in the OECD, it

[00:14:56] subsequently got picked up by Europe with the Data Protection Directive, and then that got converted in 2016 into the GDPR.

[00:15:04] So the principles are roughly the same. It's very much based on consent: you can collect personal data, but collect it on the basis of consent;

[00:15:13] only collect data for a specified purpose, so before you collect the data, tell me what you are going to use it for;

[00:15:20] only collect as much data as is required to fulfill the purpose, not any more; once the purpose has been served, delete the data,

[00:15:28] don't keep it for longer than it is required; have the ability to revoke consent; and things like that. And GDPR has all of these things.
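The principles Rahul lists map quite directly onto code. A toy sketch, purely illustrative (the class and method names are my own, not from any law or library), of a store that enforces consent-based collection, purpose limitation, revocation, and deletion once the purpose is served:

```python
class ConsentStore:
    """Toy data store enforcing consent, purpose limitation,
    revocation, and storage limitation."""

    def __init__(self):
        # user -> {"data": ..., "purpose": ..., "revoked": bool}
        self._records = {}

    def collect(self, user, data, purpose):
        # Collection requires a stated purpose up front.
        self._records[user] = {"data": data, "purpose": purpose, "revoked": False}

    def use(self, user, purpose):
        rec = self._records[user]
        if rec["revoked"]:
            raise PermissionError("consent revoked")
        if purpose != rec["purpose"]:
            raise PermissionError("not the purpose consented to")
        return rec["data"]

    def revoke(self, user):
        self._records[user]["revoked"] = True

    def purpose_served(self, user):
        # Storage limitation: delete once the purpose has been fulfilled.
        del self._records[user]

store = ConsentStore()
store.collect("asha", {"phone": "98xxxxxx"}, purpose="delivery")
print(store.use("asha", "delivery"))   # allowed: matches the stated purpose
# store.use("asha", "marketing")       # would raise PermissionError
```

This is also a miniature of the "techno-legal" idea discussed later in the episode: the rule is checked by the code path itself, not by an after-the-fact audit.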

[00:15:37] I think the difference and I write about that in my most recent book is in the way in which these are implemented.

[00:15:45] Now, even though the US started out with these principles in FIPs, the US actually does not have a central federal data protection regulation.

[00:15:53] Some states have got data protection regulations, notably California, but try as they might, it seems unlikely that they're going to pass a nationwide data protection regulation.

[00:16:05] In the US all the US tech companies actually leverage their ability to use data with low supervision.

[00:16:15] And so in the book I say that this is the classic US laissez-faire approach.

[00:16:21] They leave it in the hands of the tech companies to decide what can be done and so what is the law as far as US tech companies are concerned.

[00:16:29] It's the terms of service and privacy policy that you sign up with when you sign into that service.

[00:16:33] And then every now and then, once in a while, like when WhatsApp updated, they ask you to agree to a new set of terms and conditions, and somehow we just agree to it; you know, we keep having to click that agree

[00:16:46] whenever we use WhatsApp. In Europe it's very different.

[00:16:50] There are no big tech companies in Europe, and Europe is a very rights-driven kind of a region.

[00:16:59] And I don't know if it's because they feel particularly guilty about what happened in Nazi Germany, but they really moved very much in the direction of ensuring that there is personal autonomy.

[00:17:11] And so in Europe GDPR essentially prescribes what can and cannot be done through the laws and the regulations.

[00:17:18] They don't allow tech companies to determine what can and cannot be done solely on the basis of their terms of service and privacy policy.

[00:17:28] But they say that those things need to be aligned with the privacy regulation.

[00:17:35] So these are the two different frameworks.

[00:17:38] And so, in a sense, in my book, which is called The Third Way: the US laissez-faire way is the first way; Europe's is the second way, which is the regulation-heavy way.

[00:17:48] And the book is called The Third Way because I argue in the book that India has proposed a third way.

[00:17:54] And that third way is essentially a marriage of these two.

[00:18:00] And I argue that if you use the European method alone, you will get Cambridge Analytica and things like that.

[00:18:07] And that's because technology moves much faster than laws can ever move.

[00:18:10] And so if you want to have any hope of keeping big tech in check, you're going to have to build legal and regulatory principles directly into the technology architecture that we all use.

[00:18:25] And my argument in the book, in summary, is that India's digital public infrastructure has been built using design principles, governance infrastructure, institutional infrastructure, and actually embedding law into code,

[00:18:41] in a way that we actually have what I have started to call a techno-legal approach to the regulation of technology.

[00:18:50] And I argue that this is the only way that we can do it because we've seen that the US model doesn't work.

[00:18:55] If you leave it to the tech companies, the tech companies will answer to what their stakeholders and the shareholders want them to do, which is more profits.

[00:19:02] And that leads to, as you mentioned on the show surveillance capitalism and all sorts of things are there.

[00:19:07] If you allow the European model to work, then it's always going to lag behind the latest greatest technology as we've seen with the AI Act.

[00:19:15] I mean, they passed the AI Act last year.

[00:19:18] But they had to rewrite the entire thing, because after the whole thing was written, generative AI came along, and this is literally how fast technology changes.

[00:19:26] So the idea is to actually have laws and principles embedded directly into the technology infrastructure.

[00:19:34] And if you do something like that, you have the ability to change quickly and rapidly.

[00:19:39] That's the thesis and that's my argument.

[00:19:43] So in fact, that's probably also in keeping with what we know is good software practice, that you kind of catch a problem early.

[00:19:52] So if it's in the design, or if it's right at the beginning, you know, you prevent the ramifications of it being spread all over.

[00:20:02] If it's in the finished product, then it's much more difficult to fix; then you have product recalls and so on.

[00:20:07] But if it's in the design, which in this case is the design of the public infrastructure, then it'll be great.

[00:20:14] And I think that's also the same parallel we use for vaccination: you know, if you vaccinate, it is easier than if you try to cure a disease when it actually comes up.

[00:20:24] But before we kind of go into that discussion about India, there's a point that you mentioned about individual freedom and trade-offs, and how each person may be

[00:20:36] comfortable with a different degree of freedom or data sharing.

[00:20:43] So while regulation is a general guide, in my case I might choose that, you know, I'm very comfortable giving away more data because I think there's a great benefit that I get, whether it's personalization, whether it's better service.

[00:20:56] While somebody else may not. But then the unit of administration literally becomes one, right?

[00:21:04] I mean if you take a world view and in India, if it's 1.4 billion people, literally could have different views.

[00:21:10] Then if you look at it from the lens of literacy and the understanding of what you're signing up for, I mean how do you kind of then you know,

[00:21:19] I mean, it's this whole combination.

[00:21:23] And no one said it was easy. Really, you know, there are lots of terms for this; they call it the privacy paradox,

[00:21:32] consent fatigue, many sorts of terms of art that have come around because of this. And look, just to address the privacy paradox, which you might find interesting:

[00:21:43] The privacy paradox essentially says that in order for your consent to be informed, you need to have a lot of information.

[00:21:53] But if I give you all the information you need to make informed consent, that's more information than our tiny minds can conceptualize.

[00:22:01] I listened to your privacy episode where you kept talking about the Microsoft terms of service. [inaudible]

[00:22:12] But look, that's the amount of information they need to give you in order for your consent to be informed consent.

[00:22:18] Now this is like an unbeatable paradox and why is that because they have only one place to provide these terms.

[00:22:25] And if we could just break up the products into many small parts, we could say, look, as far as Microsoft Word is concerned, I don't really need to give any personal data to them because it's like an app that's installed on my computer.

[00:22:37] So I won't give you any personal information. But if I'm using Bing or, you know, Copilot or something like that, I want Copilot to be looking through my emails so it can give me a better answer and sort of save me time.

[00:22:51] And there's a trade-off there. Now, that trade-off, you've got to know that Bing is going to trawl through,

[00:22:57] you know, whatever, your affairs with your mistresses as well as your emails to your wife, and if you're okay with that, yeah, I mean, that's the trade-off that you make.

[00:23:06] But you've got to go into that with your with your eyes open. So look, these are these are difficult things.

[00:23:12] And then if you layer on top of that the challenges of literacy, and even literacy is sort of a fairly elitist way of thinking about it.

[00:23:21] Even among the very literate, there is a digital capacity gap; these are people who, you know, may be flying first class wherever they go, but they don't have any clue as to

[00:23:33] the decisions they made with regard to their personal data on some obscure app, and that could cause them trouble.

[00:23:42] So it's really, it's all of these sorts of things. And I think that tech policy and privacy policy needs to keep all these sorts of people in mind.

[00:23:52] But I think there's another really important thing that we've got to keep in mind in particularly Indian context.

[00:23:57] And that is that, you know, Nandan says this quite often: India is the first country that is going to be data rich before its people are economically rich.

[00:24:07] And if that is the case, and if we have an obligation as a nation to improve the economic wealth of our people, should we not find a way in which we can use their data richness and translate that into economic wealth?

[00:24:23] And if you're doing that, we are necessarily making trade-offs that perhaps the elite among us would never make.

[00:24:30] You know, I don't think any of you would really allow a fintech company to, you know, take a look at all the messages on your phone or the battery status or whether you type in all caps or not.

[00:24:46] But a poor person who has no physical collateral, no house, no vehicle — if he or she turns that on on their phone, then a fintech company could look at that and say, look, even though this person has got no collateral, I think this person is creditworthy enough.

[00:25:05] That's a trade-off that none of us here might be willing to make, but that's probably the only way that that person will get a loan.

[00:25:12] And just understand that when they get their first loan, they get into the formal financial system.

[00:25:17] And then the second loan will be based on how well they have repaid the first loan, and what a change that will make to their entire way of life.
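The kind of alternative-data underwriting described here can be caricatured in a few lines. The signals and weights below are entirely hypothetical, just to make the privacy-for-credit trade-off concrete:

```python
def thin_file_score(signals):
    """Toy credit score from device signals a borrower chooses to share.

    Hypothetical weights; real underwriting models are far more involved.
    The point is that each shared signal trades privacy for access to credit.
    """
    score = 300  # base score for a borrower with no credit history
    if signals.get("bill_sms_paid_on_time"):
        score += 150  # utility/recharge SMSes showing timely payment
    if signals.get("battery_rarely_flat"):
        score += 50   # a (crude) proxy for an organised routine
    if signals.get("stable_contact_list"):
        score += 100  # a (crude) proxy for social stability
    return score

print(thin_file_score({"bill_sms_paid_on_time": True,
                       "stable_contact_list": True}))  # → 550
```

Each `if` branch is a privacy concession the borrower opts into; sharing nothing leaves them at the uncreditworthy baseline, which is exactly the dilemma Rahul describes.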

[00:25:27] And so if you think about it, 300 million people in this country are in the formal financial system.

[00:25:32] Over one billion need to be brought in and how do we do that?

[00:25:37] And we can't just think about privacy in this very Western framing of it; in that context we should try and see how our laws and policies can take into account this very complex problem for this very complex country.

[00:25:52] So, Rahul, you know, since we are on the topic of India, the Indian problem statement is totally different, and you mentioned something that

[00:26:00] I was not really aware of: that DPI is using something like a third way, which you have mentioned in your book.

[00:26:07] So in this third way, you mentioned this whole techno-legal framework where the law is embedded in the code, right? And I come from a very blockchain kind of a world; the last 8 years I've been doing that.

[00:26:19] So in blockchain, code is law; code is everything, right?

[00:26:24] So could you, you know, briefly explain, I mean how, how is this third way working?

[00:26:30] Because technology will still change, right? And if the code is the law, are we saying that because it is in the framework it upgrades with the technology, and that is why it is the third way?

[00:26:43] Yeah, look, I mean, so actually the blockchain world is very much on this code-is-law kind of framework, but I think we go even before that, right? The very first popular science book written by a lawyer that I read was a book by Professor

[00:27:00] Lessig called Code and Other Laws of Cyberspace. This was 23 years ago, and Larry Lessig says exactly the same thing; he talks about technology, just doesn't call it techno-legal. And he says that on the internet,

[00:27:17] And he said that if we want to be smart, rather than write laws, we will get involved in

[00:27:24] looking at the protocols that define the internet because we will be able to do things like

[00:27:30] control speech in the way we want it to evolve, or whatever. Now, as it happened, after Larry

[00:27:36] Lessig wrote that, and this is the time when, you know, this was before GeoCities, this is

[00:27:41] when you had to literally make your own website using HTML, and yes, it was possible. But today,

[00:27:46] all of us access the internet through Facebook and things like that. And so I think that coming back

[00:27:52] to that in the pure internet sense of it, the first time that that happened was when Vitalik

[00:27:58] Buterin created Ethereum, right? So that's blockchain 2.0. I would say even in blockchain 1.0,

[00:28:06] which is Bitcoin and the Satoshi Nakamoto paper, you can see the hints of that in

[00:28:13] there because there is a very, very simple law in blockchain 1.0 which is the double spending

[00:28:19] law and then the fact that any contract that you make on the blockchain is irrefutable

[00:28:25] and that's really powerful. But as far as law is concerned, that's really basic. That's

[00:28:29] just contract law, and that's not like an involved, advanced thing. But what Vitalik did with

[00:28:34] Ethereum was create this virtual machine that was capable of doing all sorts of other

[00:28:40] programmable transactions. And so you're right, that is very much the direction in which my

[00:28:48] thinking goes for technology. I think the only problem with this is that and this is the problem

[00:28:53] that blockchain faces around the world is that countries like to exert sovereign control

[00:29:00] over laws, and they like to have sovereign variations as to how laws need to be interpreted.

[00:29:08] And you've got to give them some ability to do that. Now, if you build a digital public

[00:29:13] infrastructure that is and as you can see now we've got DPI for identity credentials which is

[00:29:20] essentially who am I what can I do, what are my skills, you've got DPI for payments

[00:29:25] which means this is how I can conduct transactions; you've got the Beckn protocol, which is the

[00:29:32] Open Network for Digital Commerce. So this is the full stack for how to do digital commerce.

[00:29:37] And then you've got DPI for data exchange and sharing which is, how do I transfer information

[00:29:43] from one person to the other with protocols and of course all of this can be taken out

[00:29:47] to the financial sector, moved into health, education, agri, all these various sectors. So what we're

[00:29:54] seeing right now is that you've got a technology underpinning for virtually every type of

[00:30:01] interaction that you have between human beings. You can sit down, and when I meet you I can

[00:30:08] sell you a physical book that I have, but I can conclude the transaction

[00:30:14] entirely digitally, and I can also give you the book entirely digitally, in an ebook

[00:30:20] format. And so the world has got to a point, which was not the case when Satoshi Nakamoto wrote his paper;

[00:30:26] at that time, the digital was not as pervasive as it is today. Now it is utterly pervasive,

[00:30:33] and at this point in time if we can have national digital infrastructure that aligns with certain

[00:30:42] codes and laws, and then we put the design of those protocols in the hands of regulators.

[00:30:49] The same regulators who write laws will also have control over the protocols, which they can

[00:30:54] then tweak to determine how your interactions work in real time and the example that I give often

[00:31:01] is that NPCI came out with this idea that there should be no player that has

[00:31:07] more than 33% of the total volume of transactions in the digital sphere. They didn't implement it,

[00:31:14] but if they wanted to implement it, they have control over the infrastructure and they could.

[00:31:21] Now what are they saying? Essentially, they are trying to give effect to a competition regulation,

[00:31:28] which is that we cannot have a duopoly; we need to introduce a third player into the ecosystem.

[00:31:35] Now, if they were doing it in the old-fashioned way, they would have passed a law, then

[00:31:40] waited to see what happens, and then, you know, done an audit at the end of the year to see,

[00:31:44] did you do one third, or did you do, you know, half or whatever, and then they'll fine you.

[00:31:49] This is how it has worked; it's all after the fact. But if you have control over the

[00:31:55] technology at the switch level, you can in real time ensure that no one gets more than a third. You could

[00:32:03] route every third transaction somewhere else; literally, you could program it in whichever way.
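The switch-level enforcement being described could look something like this toy sketch in Python. The `Switch` class, the provider names, and the routing rule are all hypothetical illustrations of a real-time volume cap, not NPCI's actual UPI infrastructure.

```python
from collections import defaultdict

class Switch:
    """Toy transaction switch that enforces a market-share cap in real time.

    cap is the maximum fraction of total volume any one provider may handle.
    """

    def __init__(self, providers, cap=1 / 3):
        self.providers = list(providers)
        self.cap = cap
        self.counts = defaultdict(int)
        self.total = 0

    def route(self, preferred):
        # Would routing to the preferred provider push its share over the cap?
        projected_share = (self.counts[preferred] + 1) / (self.total + 1)
        if projected_share > self.cap and self.total > 0:
            # Divert to the least-loaded provider instead.
            choice = min(self.providers, key=lambda p: self.counts[p])
        else:
            choice = preferred
        self.counts[choice] += 1
        self.total += 1
        return choice

switch = Switch(["A", "B", "C"])
for _ in range(300):
    switch.route("A")  # everyone asks for the dominant player

shares = {p: switch.counts[p] / switch.total for p in switch.providers}
print(shares)  # each provider ends up with roughly one third
```

The point of the sketch is the contrast with after-the-fact audits: the cap is checked on every single transaction, so no provider's share can drift above it in the first place.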

[00:32:08] now that is extraordinarily powerful and that is sort of the way that in real time you can give

[00:32:13] effect to laws and that's the idea and I think just the difference between blockchain and this is

[00:32:18] that blockchain is supposed to be self-executing it's supposed to work on its own and I have

[00:32:25] spoken to lots of governments, and not just here; at the G20 I had the opportunity to speak to

[00:32:29] others, and everyone doesn't like it, because, you know, real life is not like that. I think in a very

[00:32:35] evolved economy this is probably possible, but they keep telling me that at the time of COVID, if we

[00:32:41] didn't have control over our macroeconomic forces, we would not have survived

[00:32:47] COVID. Now, I'm not an economist; I don't know whether that's true or not.

[00:32:51] I've spoken to a lot of economists who say we can certainly be a little less controlling than we

[00:32:57] currently are but the fact of the matter is that this is I think the via media between the two

[00:33:02] so in fact this is very interesting when you kind of look at it this way and especially I think

[00:33:08] one of the things that came out of the G20 was this 50-in-5 kind of movement, where they said that,

[00:33:14] you know, in the next five years we will implement some aspect of DPI in 50 selected countries.

[00:33:22] So I mean, I'm just thinking, like, we kind of think about DPI in India, and if we really have

[00:33:29] programmable infrastructure which is kind of rolling out in 50 countries then we could potentially

[00:33:36] be laying the foundation for a privacy-ready kind of society. And I don't want

[00:33:44] to make it sound too utopian, and, you know, we've not solved the world's problems, but

[00:33:48] literally, the fact is that if more and more countries were to choose this way of working, I mean,

[00:33:53] it's slightly like democracy: it's not perfect, but at least it's better than anything else you have, right?

[00:33:58] so I mean it is kind of really a game changing opportunity window to kind of roll something out

[00:34:07] like this, and it will happen more in the developing countries, because, obviously, I mean, the

[00:34:11] US is not going to suddenly do something very strange so it's probably like a great opportunity to

[00:34:17] have something like that roll out pretty much globally yeah look I mean I think the other

[00:34:23] fact that you've got to recognize is that digitization of governments or digital governments

[00:34:30] are happening right now whether we like it or not there was a study that I saw somewhere that

[00:34:34] the total amount of money earmarked for digital governance is somewhere in the

[00:34:39] region of $700 billion so this is budgeted this is not even aspirational which means

[00:34:46] like it or not someone's going to spend this amount of money now you have two options the way in which

[00:34:51] the $700 billion gets spent. And of the $700 billion, I don't know what the exact numbers

[00:34:56] are, but about 500 billion is spent in the US and Europe, and so there's about 200 billion for, I guess,

[00:35:03] you know five billion people and there's 500 billion for maybe a billion people or so right so

[00:35:10] so that's the way it is getting spent you have two options and the way it's done in the majority of

[00:35:15] the world including and in particular in the West is that you hire some tech company one of you guys

[00:35:23] and they will build out a solution. And we've had many of these examples where big tech companies

[00:35:31] have come in and built out an identity system and then the tech company whatever doesn't get paid

[00:35:37] or doesn't like the terms, and it just packs up and leaves, and you have seven million

[00:35:41] people whose identity is just gone; this sort of private company now holds their identity,

[00:35:47] and you have many of these sorts of challenges and then you know you've got incompatibility someone's

[00:35:52] building the identity system another person's building the payment system a third person's building

[00:35:55] the credentials system, and none of them talk to each other. I mean, the best example is the

[00:35:59] healthcare system in the United States, where you cannot get electronic medical records from a

[00:36:07] hospital on one side of Madison Avenue to a hospital on the other side,

[00:36:11] just across the road. You can get the patient on a gurney to the other hospital quicker than

[00:36:16] you can get his records. So this literally is the problem with the way in which we are building

[00:36:25] these systems and so you're right I think if we can choose this other approach which is the

[00:36:30] DPI approach. The DPI approach, the way we talk about it, is you have strong foundations; you have

[00:36:35] them utterly interoperable, so that you never rebuild something which is already built and can be

[00:36:41] deployed in another part of the stack. The reason you can do that is because it is fully interoperable,

[00:36:46] and so all the features that one part has been designed for, you never have to

[00:36:52] build those again, or you just sort of simply change the XML a bit to get a few more features, and

[00:36:57] it's sort of as simple as that if you can use this kind of an approach it does a couple of things one

[00:37:03] is it is very sovereign because each country can then own that spec it can then get a bunch of

[00:37:11] private sector players to build out on top of it but if a particular private sector player doesn't

[00:37:17] work out or whatever, nothing happens, because, you know, it's the same spec that anyone can pick up,

[00:37:22] anyone can build on. And I think this approach, as an alternative to this very vendor-driven approach,

[00:37:29] is much more democratic, and it is much more in the control of the country that wants to sort

[00:37:37] of providing this. Sir, I'm going to digress a bit, but every time I do research, right, if you do research

[00:37:44] in India and you talk about the government, for whatever reasons, however much the fights

[00:37:51] may be happening, whatever you think of the existing government, those outside the government, etc., there

[00:37:57] is an inherent belief that the government works for the greater good of the population, right, an

[00:38:03] inherent belief. But almost anywhere else around the globe, when you talk to them about government, there

[00:38:10] is an inherent disbelief that the government works for the greater good of the population.

[00:38:16] Now, DPI works in the Indian context, or may work in the emerging-market context, if

[00:38:24] there is belief that the government is doing it for the greater good: you know, it's going to look at

[00:38:30] wealth creation for me, it's going to give me more subsidies, etc. When you try and do it

[00:38:37] the way you're proposing in countries where there is no belief in the government, and no belief

[00:38:43] that the government is doing it for the greater good, how do you then see this coming into play, or this

[00:38:49] method coming into play? Because to me it's quite a challenging thing, because in India it works.

[00:38:55] I mean, everybody went and got their Aadhaar, and, you know, nobody was worried about giving away their

[00:38:59] information and blah blah blah, all of it, because there is a belief that the government is like the

[00:39:04] karta, right, and its job is to take care of you. That's not true of the US; that's not true of

[00:39:09] Europe. So where do you see that dialogue heading if we look at something like a digital program?

[00:39:16] I just go back to the point I made earlier which is that today governance is digital everywhere

[00:39:23] in the world and just this morning I had a conversation with someone in Australia and Australia's

[00:39:30] after all the opposition that everyone in the Western world had to India's Aadhaar, is now about to embark on

[00:39:37] a digital identity program, and I was like, oh wow, you know, I mean, you gave us such grief

[00:39:43] for a decade and it looks like this is now the really smart idea and everyone seems to be moving

[00:39:48] to it. You're right I think the fact is that there is a little bit of skepticism in some parts of

[00:39:54] the world but the fact is that they are embarking on digital governance projects and the only

[00:40:02] point that I make is that if you have the option of choosing a vendor-driven approach or a

[00:40:07] DPI approach, to me the DPI approach makes more sense. Now you can go and do what you want, and I think

[00:40:12] still the 500 billion is being spent to line the pockets of some really large corporations

[00:40:18] but as we get the idea out that this is a way to do it people are sort of thinking and saying okay

[00:40:25] you know this is perhaps another way to do it but you know I think your question is often framed

[00:40:31] to me in a very different way which is that what happens if you build these big massive

[00:40:38] population scale digital systems and it gets into the hands of a repressive government.

[00:40:43] Leave aside someone, you know, who's not really going to have my interest in mind, but

[00:40:48] someone who actively is going to use this information to harm me, or to harm

[00:40:56] a community like me or you know for any other reason and for those people I say that at least

[00:41:04] in the design of India's digital public infrastructure there are fundamental design choices

[00:41:10] that we have made that privilege democratic and liberal values, and, you know, in the book I go

[00:41:17] through some detail of it, but I'll just give you the example of Aadhaar, which we all know. Look,

[00:41:23] this is an identity system and the point of an identity system is to identify you.

[00:41:28] Now how would you identify someone? How do you identify someone who you want to call?

[00:41:35] I mean we're all old enough to remember when we used to have these very very fat telephone books

[00:41:43] and you could go into the telephone book and you could look it was arranged by alphabetical

[00:41:48] order, and you could look and find anyone's number; it was there in print and you could use it.

[00:41:53] Now, what is that? That is fundamentally search. We could have built Aadhaar with search built in,

[00:41:58] because it is very convenient. But if you build Aadhaar with search built in, and that too the kind of

[00:42:04] Boolean search that we're all used to, we can index on, you know, all people whose surname is something.

[00:42:11] In India, you know, if I know your surname, I can tell you exactly what caste you are and which

[00:42:15] region of the country you are from; in Kerala, where I'm from, it'll actually tell you everything about

[00:42:20] you, you know, who your father was, etc., from just this one question: where are you from? And that would be

[00:42:28] deeply invasive of privacy, and so Aadhaar was built without a search feature. You cannot search;

[00:42:36] you can authenticate, and the authentication is a yes/no: does this Aadhaar number match this name?
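That yes/no design can be sketched as an interface that stores only digests and answers only match queries. This is a toy model of the principle; the `IdentityRegistry` class, the hashing scheme, and the sample ID are assumptions for illustration, not the actual UIDAI authentication API.

```python
import hashlib

class IdentityRegistry:
    """Toy identity registry modeled on the design idea described above:
    it can answer "does this ID match this name?" but deliberately
    exposes no search or listing capability."""

    def __init__(self):
        self._records = {}  # id_number -> digest of name (kept private)

    def enroll(self, id_number: str, name: str) -> None:
        digest = hashlib.sha256((id_number + name.lower()).encode()).hexdigest()
        self._records[id_number] = digest

    def authenticate(self, id_number: str, name: str) -> bool:
        # Yes/no answer only: no record contents are ever returned.
        digest = hashlib.sha256((id_number + name.lower()).encode()).hexdigest()
        return self._records.get(id_number) == digest

registry = IdentityRegistry()
registry.enroll("1234-5678-9012", "Asha Kumar")

print(registry.authenticate("1234-5678-9012", "Asha Kumar"))    # True
print(registry.authenticate("1234-5678-9012", "Someone Else"))  # False
```

The privacy property lives in the shape of the interface: there is no method to list, enumerate, or query by name, so the telephone-book style lookup is impossible by construction.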

[00:42:43] And if you are in a much smaller category of people, you could also get KYC,

[00:42:48] and that is just for demographic attributes. Now, this design choice: if you think about an

[00:42:55] autocratic ruler, an autocratic ruler would have loved to get an identity system where they've got

[00:43:00] the identities of 1.4 billion people and can choose to identify the 200 million that they want to

[00:43:07] choose, because they want to persecute them or something like that. But the way in which these systems

[00:43:13] are built is to ensure that that doesn't happen. The second important design choice, which

[00:43:19] you will appreciate because it's very much, once again, a blockchain thing, is federated data.

[00:43:24] You know, other than Aadhaar, where, because it's the foundational

[00:43:29] DPI, the identity information, and that too the minimal identity information, was aggregated,

[00:43:34] all of our DPI keeps the data federated. And if you think about identity once again, we've got

[00:43:40] various types of identity. People kept telling me, during the Aadhaar case,

[00:43:46] when it was fought in the Supreme Court, a question that kept being thrown up is that, with Aadhaar

[00:43:52] linked everywhere, the government knows everything, and, you know, when I buy a pizza, the government

[00:43:57] will know that I like pizza. The fact is that when Aadhaar is seeded into databases, UIDAI

[00:44:03] gets no information; UIDAI does not know that you like pizzas. Now, the bank which has added your

[00:44:10] Aadhaar number to your bank account information has your Aadhaar number, but UIDAI does not have your

[00:44:16] bank account, and so there is no centralization of the information happening. If the government wants

[00:44:23] to find out all of this information, yes, the government can go to every bank and say,

[00:44:27] now I want you to give me all the various account holders that you have and all their Aadhaar

[00:44:34] numbers, and then it can, in this very laborious way, link all of this together. Yes, it can, but

[00:44:38] look, the government can do something even more pernicious, which is it can just go to a

[00:44:44] telco; you know, today you use a telephone number everywhere. So I'm way more scared of my mobile

[00:44:49] number than I am of my Aadhaar number, because it's used for absolutely everything. So,

[00:44:55] the point I'm trying to make is that when you think about the DPI approach and you think about

[00:44:59] an autocratic government, a government that doesn't really care about us, that doesn't have this

[00:45:04] karta approach to us, there are many technology systems that it could have chosen

[00:45:11] if it wanted to do all of those things, if it wanted to be repressive, if it wanted to be completely,

[00:45:15] you know, unconcerned about us. Of all the systems it could choose, this would be the last system

[00:45:21] they should choose, because it doesn't serve their purpose at all. In fact, I

[00:45:26] was just thinking that maybe this federated design is built into our DNA, because I was

[00:45:31] listening to that electoral bonds debate, and they were talking about how they have to try to match

[00:45:35] those two lists of what was bought and redeemed, and for the life of them they can't. And I think the

[00:45:40] finance secretary and others have said, you know, it's set up such that they're not numbered,

[00:45:45] and apparently on the bonds there is some number written which can be read only in ultraviolet light,

[00:45:51] but you will still not know, because that number is not related to who bought it

[00:45:54] or who redeemed it. I mean, it's just like everything is federated in our country,

[00:45:58] consciously kept firewalled from everything else. Look, I think we're giving them too much

[00:46:03] credit. I think the fact though that this is how it was built is really one of the features

[00:46:10] that is going to protect us from privacy violations. The other thing, in fact, is that even those

[00:46:17] name fields, I mean, I'm sure it must have come up in all conversations about Aadhaar, the fight to

[00:46:21] keep it to just so little data. There is literally nothing in there that you would

[00:46:26] really want. So, you know, like you said, it is the last place people would go to look for

[00:46:31] information; there is really nothing in there. You've had Pramod on the show, and

[00:46:37] I've had many conversations with him. He said they started with 29 fields, and you can understand the

[00:46:41] government: this is a mammoth exercise, I'm going to get people to collect this information, let me get

[00:46:45] everything that I can in one shot. And the fight to get them to collect only four pieces of

[00:46:51] information, I tell you, that is the biggest service that that gang did for us. Let me give you one

[00:47:04] other example of something that I was involved with, and that is, during COVID, the Aarogya Setu

[00:47:14] app, and this will give you a good example of privacy by design. You know, in the early days

[00:47:22] of the pandemic we had no idea what was happening; the only thing that we knew was that with this disease,

[00:47:22] you know after 14 days you're not infectious anymore and of course they built this COVID tracker

[00:47:28] I had no clue whether it could work; all my doctor friends said this is utter rubbish, this is just

[00:47:32] tech solutionism. But they were building it anyway, right, so

[00:47:35] it was being built, and they reached out to me and said, look, can you help us with the privacy,

[00:47:41] make sure that the privacy parts of this are sorted out? So the question I asked them was:

[00:47:46] look, we know that, you know, if you come in contact with someone who

[00:47:53] was infectious, after 14 days it doesn't matter; all the evidence is showing that. So why do you need

[00:47:59] to hold this contact information for more than 14 days? They said, you know, we don't know, this is

[00:48:05] so new. I said, okay, so what about 30 days? They said, yeah, okay, I guess after 30 days we could

[00:48:10] delete it. So I said, okay, then why don't we just build the system such

[00:48:14] that on the 31st day you delete the data that was held 30 days ago? And so they built that system.
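The 31st-day deletion rule described here amounts to a rolling retention window, which can be sketched as follows. The `ContactStore` class, the dates, and the purge trigger are invented for illustration; this is not the actual Aarogya Setu implementation.

```python
from datetime import datetime, timedelta

class ContactStore:
    """Toy contact log with privacy by design: records expire after a
    fixed retention window and are purged automatically."""

    RETENTION = timedelta(days=30)

    def __init__(self):
        self._events = []  # (timestamp, contact_id)

    def record(self, contact_id: str, when: datetime) -> None:
        self._events.append((when, contact_id))

    def purge(self, now: datetime) -> None:
        # Drop everything older than the retention window.
        cutoff = now - self.RETENTION
        self._events = [(t, c) for (t, c) in self._events if t >= cutoff]

    def contacts(self):
        return [c for (_, c) in self._events]

store = ContactStore()
day0 = datetime(2020, 4, 1)
store.record("phone-A", day0)
store.record("phone-B", day0 + timedelta(days=20))

store.purge(now=day0 + timedelta(days=31))  # day 31: the day-0 record expires
print(store.contacts())  # ['phone-B']
```

The design choice is that deletion is part of the data model itself, run on a schedule, rather than a policy someone has to remember to enforce later.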

[00:48:21] And if you just think about the counterfactual: if they hadn't built that, you know, there were

[00:48:28] 250 million people who had downloaded this app; it was sitting there silently on their mobile phones;

[00:48:33] they came in contact with I don't know how many other people. No one

[00:48:38] deletes apps on their phone, so all those apps would be collecting data today. But once they sunset

[00:48:44] the app, 30 days after that there was no data left on any system. Now, this is privacy by design,

[00:48:51] and I think this is the way in which we can design our technology, if we can keep privacy in mind,

[00:48:59] to actually achieve these sorts of democratic principles. That's amazing, Rahul. I think we could go on

[00:49:05] and on in this conversation with you. But keeping in mind the fact that we don't want to

[00:49:12] take up too much of your time, and that, you know, bite-sized information is much easier for people to

[00:49:18] absorb, we're going to bring this talk to a close. But we'd love to have you on once again

[00:49:23] at some point in the future to discuss another aspect of privacy, because there are so many

[00:49:28] aspects of privacy that one needs to decode for our listeners. So I don't know

[00:49:35] if you have anything to add to this; otherwise I thought maybe we could wrap this one

[00:49:40] up and invite Rahul again for another round. I have many things to say, so yes, you're right,

[00:49:46] the editing becomes a nightmare. I think, Rahul, it was a fantastic

[00:49:52] conversation; loved chatting with you. We are kind

[00:49:58] of doing a whole series on security, privacy, authenticity, integrity

[00:50:03] and non-repudiation, so probably, yeah, in another episode, you know, we'll take up some more

[00:50:09] of your time. But thanks a lot, thanks Rahul. Thanks, Rahul. My pleasure. Yeah, and just when you think

[00:50:14] you know everything, or all the stories are out there, you know, along comes someone with

[00:50:20] a new thing. I don't know, the Aarogya Setu bit was a real eye opener; I just didn't know this.

[00:50:24] Actually, it's a completely fantastic thing to do. No, thank you, thank you so much,

[00:50:30] very, very enjoyable conversation, and yeah, we could chat a lot. Maybe, if I have the courage,

[00:50:36] I'll take you up on a blockchain-type thing. I'm still a novice at blockchain, but I have some interest.

[00:50:41] On that one, Rahul, I will just be a latecomer to the list, because

[00:50:46] what you did many, many years ago, which is get into technology and legal, I am

[00:50:53] starting to do now. I'm not from the space of technology, I'm from the space of research, and I love

[00:50:59] technology, so like you I pick up whatever new comes up and I kind of play with it. But no, blockchain

[00:51:05] isn't something I've mastered yet, so I'll join you as a student on that conversation. I'm just as

[00:51:12] much a student, I'd say. You know, I've just read Chris Dixon's latest book, and, you know,

[00:51:19] once again it's opened my mind. I read a lot of Vitalik, so I have some sense of it, but

[00:51:23] it is just scratching the surface; it's such a fascinating new world. Yeah, so on that note, thank you

[00:51:30] so much, Rahul, an absolute pleasure having you here, and we do hope, as we said, we will keep

[00:51:35] inviting you back to have different conversations with us at different points. So thank you.