In today’s episode Monica Verma talks with an industry leader and a technology lawyer. Yup, hard to find those combinations. Monica sits down with Jan Sandtrø, a renowned legal expert who has been working with technology, privacy and their intersections for more than two decades. In this episode's engaging conversation, they cover Schrems I and Schrems II, the 'data transfer limbo' facing European companies, the CLOUD Act, and what you as a user can do about app tracking.
We talk to our expert about all of this and more. Let’s hop into the episode right away.
Looking for your dream job in cybersecurity?
Don't know where to start or how to go about it?
Follow Monica Verma (LinkedIn) and Monica Talks Cyber (Youtube) for more content on cybersecurity, technology, leadership and innovation, and 10x your career.
Hey folks, welcome back to We Talk Cyber with Monica, your one and only platform for real-world stories from some of the renowned global experts who are making a real impact in security and privacy every single day. Do you wonder what it takes to build a personal success story in cybersecurity and privacy? And how to make a real impact? What are some of the most important challenges that we're facing today within security and privacy? How do they affect our today and tomorrow? And what can we do to overcome them? Do you wish to build your career in cybersecurity, take it a step further, and even break into leadership? Well, if you're interested in all these things and more, then this podcast is the right place for you! So before we hop into the episode, make sure you subscribe to my YouTube channel: MonicaTalksCyber. You will find all my videos there, including the podcast videos. And if you listen to audio podcasts, then please subscribe and tune into We Talk Cyber in your favorite podcast app. Do it right away, so you don't miss any of these amazing conversations and stories. This is We Talk Cyber with Monica.
Monica Verma 1:29
So in today's episode, we'll be talking to an industry leader and a technology lawyer. Yep! Hard to find those combinations. We'll be talking to Jan Sandtrø, who is a renowned legal expert and has been working with technology and privacy for more than two decades. In today's engaging conversation we'll be covering: What are 'Schrems I' and 'Schrems II', and how did they come to be? Why are the entire EEA and all the European companies that work with American companies in a kind of 'data transfer limbo'? What is 'the CLOUD Act', and is it any less invasive? All the applications and apps that we're using today track us to a certain extent, and sometimes even more. What can you as a user do about it? So if you're interested in hearing more, let's meet our guest right away. Hi, how are you doing? Welcome to the podcast show.
Jan Sandtrø 2:24
Hi, thank you. Nice to be here.
Monica Verma 2:26
Really lovely to have you on the show today. Would you like to introduce yourself, say just a few words about yourself, and share a fun fact with the audience?
Jan Sandtrø 2:36
As I said, I'm a lawyer within the area of technology, working with all kinds of tech issues. I started working with privacy in '95/'96, I think. Then it was a bit boring; it was not that hot. But eventually we got the GDPR, which made it more sexy. You know, before GDPR, if you mentioned that you worked with privacy issues, it was like: "Okay, thank you! I don't want to talk to you anymore..." But now everybody wants GDPR. So thank you, EU, for providing this and making some lawyers more interesting. I've been working as a technology lawyer since '97, although it was already during my studies that I became interested in technology and law. So my main area in the last twenty-plus years has been technology. I worked for a period with a good lawyer in Silicon Valley, and she called me a 'geek with a law degree', and I find that quite cool! Back when I was first called a geek, that wasn't so cool; it's almost like 'nerd'. But today it's cool. Thank you.
Monica Verma 4:06
Fantastic! Welcome to the club then, I think. Lovely to have you on the show today! Let's hop right into it, because we have a lot going on with the GDPR that came out. In light of GDPR, and because of all the data transfers happening outside the EU and outside Europe, there were some cases going on. First we had 'Schrems I', and now recently 'Schrems II'. And there is something known as 'FISA 702', which basically is a US regulation that caused a lot of problems and led to 'Schrems II' specifically. Tell us, the audience: what are these things in layman's terms, and why should we care about them?
Jan Sandtrø 4:48
Yeah, well, the GDPR defines the European Economic Area (EEA) as safe. Some people think that GDPR is about privacy and protecting the personal data of people. Well, the first purpose of GDPR, the one mentioned first in the GDPR itself, is actually the free flow of personal data within the EEA. It's not that well known that that's the first and foremost purpose. So you have the defined EEA area, where all personal data is to flow freely around, for the best of business, probably to earn money on it. Then you have what EU law calls 'third countries'; it's not a definition within the GDPR itself, but a common term in EU law. Not a very pretty name for the rest of the world, but everything outside the EEA is a third country. In order to transfer personal data from the EEA to a third country, you need a basis for the transfer. Some countries are considered to be 'safe'.
Monica Verma 6:13
So adequate in comparison to what we have in Europe.
Jan Sandtrø 6:16
Yeah, I'm trying to make it understandable. But thank you: they have an adequate level of privacy laws. And then you have the rest of the world, where you need a basis for the transfer: the standard contractual clauses (the SCCs). But for transfers to the US we had some other bases for transfer. The first one was what we call 'Safe Harbor', an agreement between the European Commission (EC) and the US Department of Commerce. That worked like an approval: it served as a basis under both the privacy directive and later the GDPR, the privacy regulation. But then we have this Austrian law student, Maximillian Schrems. First, he requested all his data from Facebook, and he got a really huge stack of data. He discovered that, well, everybody knew it, Facebook was transferring personal data to the US as part of their services. But through the Snowden revelations it was learned that the US surveillance authorities were listening in on, or requiring access to, that personal data. Max Schrems understood that much of his data, and you know how much data, was transferred to the US and was available to the US surveillance authorities. And he didn't like that, of course! So, before he became a lawyer, he went to the Irish supervisory authority (the Irish Data Protection Commissioner) and requested that they impose a restriction on Facebook's transfer of the data. The matter then went to the European Court of Justice (the ECJ is the supreme court of the European Union, part of the Court of Justice of the European Union, the CJEU). And the ECJ found that it was not legal to transfer the personal data to the US. That was 'Schrems I' (Case C-362/14), and it terminated the Safe Harbor agreement. Then we had a period until the EC and the US Department of Commerce reached the next agreement, called the 'EU-US Privacy Shield'. I'm mostly impressed by the names of these deals.
I don't know what the next one will be... the 'Super Safe Something'? This agreement also went to the ECJ through the Irish supervisory authority. And recently, the ECJ found that this agreement was not okay either, because the US surveillance authorities still had access! And we have to take into consideration that the EC had approved this arrangement, or agreement, with its revisions. And still it was not good, or not sufficient, as the court found. So, that leads us to the 'Schrems II' decision (Case C-311/18), which said that the Privacy Shield is no longer a valid basis for transfer to the US. And that left us with the SCCs, which are used all over the world. But Max Schrems also questioned the SCCs... And the court then said: well, the Privacy Shield is off, and with regard to the SCCs, you have to do some tests. The first test is whether the country you transfer to has 'sufficient privacy regulation'. And under the Privacy Shield decision, they said that the US does not. So you have to rule out the US, and for the rest of the world you have to consider the level of privacy in each country. And the court left it to companies to decide what that level is. That's quite ironic! The EC, with ALL their lawyers, took two years to approve the legislation in Japan. Two years! And now every company around Europe has to do the same evaluation for countries like India and Brazil, which are not considered adequate. That's totally impossible.
Monica Verma 11:07
So you went through the history a bit, and you mentioned how we have come to Schrems II and where we stand today. You also mentioned some of the difficulties and challenges we have seen, especially the European Commission taking two years to evaluate whether Japan was safe and had adequate levels of privacy in place, in accordance with GDPR. And now, as you say, and as I understand from Schrems II, companies and organizations are left to evaluate these third countries themselves, and whether they can transfer personal data to them. In the US, as you said, the Privacy Shield is invalid, right? So they have to use standard contractual clauses; they don't have any other option. So when it comes to the evaluation, let's talk from a business perspective. First, what does the evaluation actually entail? How can they do it? How do they consider the risk before they can even think about what kind of organizational or technical controls to put in place?
Jan Sandtrø 12:06
That's a good trick question. Because when we got the decision by the court, we really didn't understand what to do... The court's decision was very short and cryptic, and we were quite puzzled about how to do the assessment. But the European Data Protection Board (the EDPB) announced that they would come with some recommendations on how to do the assessment. And that came, like, a month after the court's decision. We strongly awaited it, but when we got it, it didn't clarify much. It was very strict, and they had some examples, like: 'If you use a cloud provider, and your data is not (totally) encrypted, it's not legal to use that cloud provider.'
Monica Verma 13:16
Oh, that's challenging!
Jan Sandtrø 13:19
That's challenging. And now privacy lawyers and experts all over the world are saying: "They also put the SCCs to death." So, you have to do an assessment, but if you do the assessment as recommended by the EDPB, you will almost always end up concluding that you can't do the transfer. And this has led to big projects stopping, all over Norway and Europe.
Monica Verma 13:59
One of the things that I understand, and please correct me if I'm wrong, because I'm not a lawyer, is that requests, proceedings, or transfers under 'FISA 702' are not disclosed; they are not public. So in that case, if Microsoft had to comply and did the transfer in violation of GDPR, would businesses or data owners even know about it? Or come to know about it?
Jan Sandtrø 14:24
Yeah, that's correct. Under 'FISA' (the US Foreign Intelligence Surveillance Act, Section 702, together with Executive Order 12333 and Presidential Policy Directive 28), with the 'FISA court' (the United States Foreign Intelligence Surveillance Court), they could impose a 'gag order': you have to keep it confidential; you cannot even tell. But as I said, under the recommendations from the EDPB, it is said that you can put in place 'contractual requirements'. One of those contractual requirements is, for example, that Microsoft undertakes, if they receive a request from US authorities, to fight that request in court with ALL means possible. They will fight the request and they will fight the gag order, and then push it back. It could be before the FISA court, but they can also bring the FISA decision before another court, like the federal courts. And having a contractual obligation to do that could actually be a remedy under the SCCs.
Monica Verma 15:34
Right. Businesses still collect a lot of data without really having a reason or justification behind it. I mean, we have seen organizations take data from everywhere; we know the scandals that have happened. We know about Cambridge Analytica, we know how data is being sold... How do we hold companies accountable, so that a justification is actually provided? Is there really a need to collect the data in the first place, then process it and store it, and all these things?
Jan Sandtrø 16:12
Yeah, and that's a good point. Because under GDPR, we have these principles in Article 5 saying that you should not store, and cannot collect, more data than you need for the purpose. And you shall delete data whenever it's no longer necessary, or whenever you no longer have a legal basis or a purpose to process it. So these are quite strict principles. And as you know, we have these extremely high levels of fines under GDPR. So I think that there will be huge fines, and that will (maybe) get them into place. A fine was imposed on Google in Sweden a while ago, and that was... 75 million Swedish kronor or something like that... I think that's, maybe, twenty seconds of Google's revenue. So you have to have really huge fines to make them listen.
Monica Verma 17:23
Right. And we talked a bit about cloud service providers, right? We talked about Microsoft, Google, and so on. One of the things that confuses people, now that 'Schrems II' is there, is that they also don't understand the 'CLOUD Act' that came out. What is the main difference between the two? Can you explain a bit about what implications they have for data transfers?
Jan Sandtrø 17:48
The CLOUD Act came after 'the Microsoft decision', or 'the Microsoft ruling', from some years ago, where the prosecution authorities in the US needed some evidence. It went through six or seven courts until it made it to the Supreme Court. And before the Supreme Court's decision came, the CLOUD Act was passed, and the CLOUD Act was endorsed by Microsoft, etc., because then they didn't have a case anymore. They could hand over the data, or had to hand over the data, but there was a legal basis. So the CLOUD Act was passed to obtain data for criminal cases. Almost all countries in the world can request data based on criminal charges from any country in the world: they make a request, and then there is a decision in that country to hand it over. That's normally done under 'the Budapest Convention'. So the CLOUD Act in itself didn't invent much, or bring much new; it was more a formalization of the process.
Monica Verma 19:13
Okay, so in that sense, it seems that the CLOUD Act is at least less invasive than the ruling on the 'Privacy Shield' in 'Schrems II'.
Jan Sandtrø 19:24
It appears so, but consider the executive order and the FISA regulation that were revealed by Snowden. Those are more invasive for the privacy of people, because they could tap cables and everything. That's, I think, the Upstream program within the NSA, without my knowing too much about that; it's not really what we learn at law school. But I think they went after those regulations because they are so enormous. If somebody is very interested, you can google the expert opinion made by a US lawyer for the court in the Schrems case, which is a huge document describing everything about the US surveillance legislation. It's interesting and frightening. Yeah.
Monica Verma 20:39
It's interesting and frightening.
Jan Sandtrø 20:41
Yeah, but still we have to remember that almost all countries in the world do this. And most of the European countries cooperate with the US authorities in this area. But it's like, I think, that Europe reckons: well, we can do it, but not the Americans! That brings us to another area, and you are an expert on information security... If you use a cloud provider like AWS or Azure in Ireland, they've got top-notch security, right? That's some of the best security you get in the world, because they cannot be hacked, because that kills the business. At least, I figure that they've got very good security...
Monica Verma 21:38
But everybody can be hacked. As a security expert, I would never say nobody can be hacked. But I definitely agree: cloud service providers have much better security than companies on their own premises. Absolutely.
Jan Sandtrø 21:50
I didn't mean that they can't be hacked; I mean, more, that they do not want to be hacked, because that kills the trust.
Monica Verma 21:58
Absolutely! Their trust and their business.
Jan Sandtrø 22:01
So we cannot transfer personal data to the US, and to probably some of the best security you get in the world. You are left with having a server in Norway and having to patch and secure that yourself. Like the parliament, which had that email server in the basement that was hacked by the Russians...
Monica Verma 22:29
You're right. I mean, a lot of companies... you do get better physical security, better data center security, better security of the cloud, when you're moving data to the big cloud service providers. Definitely better than the data centers you might have in your own organization's basement, which you might not have the competency, expertise, time, or money for. And that's where the challenges are. Let's take another aspect as well. My entire life is on my mobile phone, whether it's healthcare apps, fitness apps, banking apps, or whatever. Everything that I do is on my mobile; if I don't have my mobile, it feels like my life will probably stop or cease to exist! And we obviously have a massive number of applications on our desktops, laptops, and mobiles, anywhere we go or browse on the net. All of these applications and services are obviously tracking where we are and when we are there: a lot of information about our lives is out there. And there are people who obviously know this. Let's look at it from the users' perspective: awareness about this is obviously important! Let's say we have given users the awareness that anything and everything they put out on the internet, or use in these applications, should be assumed to be tracked, available, and readable by everybody: it's public, even if you try to not make it public. And also the awareness that whatever you put out in public, or even share, can be misused in certain ways. Let's say they have that awareness. But still, is there something users can do, in addition to having the awareness, to reduce how much they're being tracked? Or is the only solution to just uninstall the application?
Jan Sandtrø 24:25
I think that the only solution we've got is to uninstall it, or hopefully wait for some of the authorities in Europe to do something. Like, you have 'Datatilsynet', the supervisory authority in Norway, which has also gone after some big companies and made them change their policies. And you have other public offices in Norway and in other countries that have been looking into this tracking, etc. But the problem is: yeah, you can force the big companies to stop tracking you, but then you have, like, ten or twenty other apps, like the fun apps, which are also tracking you. We have better security on the phones, but still, as we talked about, there is always a risk with technology. As a user, as a 'normal person', you don't have that many options. If you want to be safe, you probably have to move to a cabin in the woods without electricity or something. For almost everybody else: just accept the risk and hope it will be reduced. But I believe strongly in consumer power, so maybe that could be the way.
Monica Verma 26:07
Yeah, that's correct. I mean, that's what I said: this was based on a very big assumption that I made here, user awareness. But that's not really the reality. There are a few people that I know of who are aware of these things, but there is a majority of the population that doesn't care, doesn't understand, and doesn't know. So we definitely have to start with that. And the fact that there is consumer power is important; I hope that continues going forward. Because one of the biggest things we've seen now is that companies, with the digitalization that's happening, independent of which sector or field they are in, need to build more and more trust with their consumers. And building trust in technology will still be an important part of that. Because everything is digital now, no matter what you do, you will need to somehow provide that technology trust to your consumers, if you want to earn, and continue to have, the trust of your consumers and customers. Just to round off, maybe I'm going to ask you one thing, because we've talked a lot about privacy, and obviously we touched on topics of security as well. Where do you see the balance of security and privacy going forward? Just your thoughts on that.
Jan Sandtrø 27:17
So I feel that privacy and information security go hand in hand. Yeah, all thanks to GDPR, which stresses that very much. It's part of the big movement.
Monica Verma 27:34
Absolutely! And that's the thing: we don't have the solution and the answers to everything. But I believe GDPR was definitely one good step in the right direction, and hopefully we will see more work going forward. And as you say: "security and privacy go hand in hand". It's important for us, security leaders, lawyers, privacy advocates, organizations, national authorities, data supervisory authorities, everybody, to understand, accept, and acknowledge, and for the people to know, that privacy is a human right. And we need to continue this work going forward, on how we can keep this balance.
Jan Sandtrø 28:10
Yeah. So the future is very interesting.
Monica Verma 28:15
Absolutely. It was so wonderful, Jan, to have you on the podcast show today. I think I've learned quite a lot from you. It was really interesting to hear your thoughts from a technology-law perspective, and to talk about privacy with you: what's happening, and what's going to happen in the future. I really enjoyed the conversation. I hope you did as well.
Jan Sandtrø 28:36
I certainly did. Thank you for having me on the show.
Monica Verma 28:41
Lovely! So everybody, that was today's episode of We Talk Cyber with Monica. I'll be back with more episodes, fantastic guests, and amazing conversations. So continue tuning in, stay safe, and take care.