The Monica Talks Cyber Show

The Cyborg Era - Are we there yet?

October 19, 2020 · Monica Verma · Season 1, Episode 10

Join CISOs and Ethical Hackers, Chris Roberts and Monica Verma, on the latest episode of We Talk Cyber as they talk about

  • the era of the cyborg
  • the security risks, ethics and privacy implications
  • data security and awareness around augmentation, artificial intelligence and more.

Looking to become an influential and effective security leader? Don't know where to start or how to go about it? Follow Monica Verma (LinkedIn) and Monica Talks Cyber (YouTube) for more content on cybersecurity, technology, leadership and innovation, and 10x your career. Subscribe to The 10x Circle newsletter at https://www.monicatalkscyber.com.

Intro 00:00:00
You're tuning into the podcast series We Talk Cyber with Monica. Your platform for engaging discussions and expert opinions on All Things Cyber. For more information, check out Monica Talks Cyber dot com (MonicaTalksCyber.com) and let's hop right into today's episode.

Mon 00:00:18
Hi everyone! Good morning, good afternoon, good evening, wherever in the world you're tuning in from today. Welcome to yet another fantastic episode of We Talk Cyber with Monica. I'm your host Monica Verma, and I'm here with yet another amazing guest, who is none other than Chris Roberts. He needs absolutely no introduction. Hi Chris, welcome to today's episode of We Talk Cyber with Monica.

Chris 00:00:39
Thank you! Thanks Monica. Thanks for having me. Kind of fun to be here. It's nice just to hang out, get a chance to relax for a little bit and just talk on stuff.

Mon 00:00:49
It's so lovely to have you on the podcast episode today.

Chris 00:00:52
Thank you, I appreciate it.

Mon 00:00:53
Let's just jump right into it. So, today, I wanted to talk to you a bit about futurism and how the landscape and society are changing. Do you believe that we are already in the era of cyborgs, and what are cyborgs?

Chris 00:01:11
Yeah, I mean, it's kind of interesting when you look at it. Some of the research I'm doing - in fact, one of the machines behind me is doing a ton of work on it right now - is about pulling data signals out of the brain. That's the EEG research I'm doing, and there are other institutes doing the same thing. Then you take a look at how we're learning, especially within the military and government side of the world: because of the knowledge we have about the brain and because of what we're getting from technology, how are we actually educating people, or trying to educate them, more effectively? But if you take a look at the purely digital side of the world - you take away the human and look only at the digital side - you look at actual artificial intelligence, very specific AI situations, and we've got some really interesting things moving in that field. Unfortunately, the whole robotics, machine learning, AI space has been beaten up pretty badly by our industry, the security industry, especially because everybody's claiming they have the best. Nobody's actually put good metrics around what augmented intelligence is. That's one of the things; there are some good folks over at CalypsoAI that I'm hanging out with, and we're actually trying to put a yardstick and some decent measures and metrics around it. Because when you look at robotics, when you look at where the future is from humanity's standpoint, there are a couple of different ways we're going to augment ourselves, which gets into a really interesting piece. We're doing some of that at the moment from a healthcare standpoint, but look at it from a military standpoint: How do I augment the person that has to be in the field? Or can I take them out completely and, you know, go the Boston Dynamics route and throw all the autonomous systems in there? And if I can, what are they allowed to do, what are they not allowed to do, where are the ethics lines and all that stuff? Then you look at the stuff I'm messing around with, which is: at some point in time I want to look at this physical body and go - I don't need it anymore. And that's where it gets really interesting, because that's when you get into the humanity of it. Okay, what makes us human? Is it purely digital signals, or signals that I can convert into something a machine can read, or is there more to the human? And so that gets into some really, really interesting, very philosophical conversations.

Mon 00:03:48
Right, and let's talk about those a bit. So what I hear from you is that, yes, you could say the cyborg era has somehow started. We are looking into AI and its advancement, into augmentation of the human body. So can you define, in very layman's terms - what is a cyborg?

Chris 00:04:10
It's tough, because there are different entities, different ways of looking at it. You've got the cyborg as a human replacement. So take it as - let's go back to Hollywood or Bollywood, pick whichever one you want - that industry that says all of those menial jobs that we don't want the humans to do, let's give those to the machines. Now you take a step further forward and look at the advanced automation that's been put into vehicles. Do you have a cyborg element in a vehicle? It's something that's doing work for you, something that's transporting you from A to B. It has a level of intelligence of its own to make a certain amount of decisions, and depending on where you land on that autonomous vehicle scale, like one through five, that will determine how much of the cyborg element is there. But for me, it's either a human augmentation or a human replacement for various different things.

Mon 00:05:05
Right, so let's take an example here. Let's weigh two things. On one hand, everybody in the world today is using smartphones, right? On these smartphones we have all our data, almost our entire lives: banking apps, healthcare apps, all kinds of information that is crucial to our lives is somehow being stored, processed, managed and monitored through these smartphones, through different corporations or different apps, third parties and so on. On the other hand, we are talking about bionics. We're talking about augmentation. We're talking about devices, implants, pacemakers and so on, integrated into our bodies, which also hold some kind of data about our lives. What is really the difference between these two examples in terms of risk to security and privacy?

Chris 00:06:08
Oh yeah. So, I mean, the perfect example - I did a LinkedIn post the other day where I had this in one hand and that in the other hand. And when you think about it, everybody thinks of this as a weapon. Everybody goes, this is a weapon, this can harm people. But what they don't think of is this being a weapon, you know, the phone being a weapon, and how much damage an individual or a group of people can do with one of these. Now, to your point, this on its own has people's lives on it. You can ruin somebody's life just through one of these devices. You can enhance somebody's life through one of these devices. If you take a cyborg as something that takes us beyond normal human limitations, arguably this pushes us beyond our normal human limitations: the ability to communicate anywhere in the world, the ability to virtually visit anywhere in the world. But with that come responsibilities. How do I guard myself as I put myself into this? And can I guard myself, or do I have to rely on the vendors, the suppliers, the third parties and everybody else to do it? We all know how that's working out, unfortunately. Now you flip it around and go back to the human thing. So, the perfect example, for anybody that's had either an accident or the .mil/.gov stuff, that's had limb replacements: you start looking at the augmentation that's there and the ability to take an artificial limb and tie it in, not just to nerve function but to the neural functions, so those senses and those movements become much more natural, as we like to look at it. Or you take the brain implants that are in place for anybody on the Alzheimer's side of things, all the way through to anything for epilepsy and various other areas, where they're trying to calm a lot of that down. Amazing leaps forward in technology. But the flip side is, again, either one of these is in the way, or embedded tech is in the way, and that ends up introducing potential vulnerabilities. When you think about it, we are analog humans in a digital world. But when you extend that and start taking a look at the digital world, that's when I can reach out and literally touch anybody around the planet, or even off planet, potentially. So you look at the qualification, or the quantification, of what's a cyborg and you go - okay, well, we've always thought about it in the traditional analog sense. We've thought, oh, it has to be another machine that looks like us. No. It can, by far and away, be the digital side of the world. And I think that's where we're starting to really realize, as analog humans, how do we go more into the digital world? How do we move more into the digital world? I think that's a big part of it for me, especially with the EEG stuff. It isn't just looking at the digital world through a piece of glass anymore. It is now about - okay, how can I interact with that more effectively? How can that machine know what I'm thinking when I'm thinking it, how can it anticipate it, and potentially move from a reactive to a very proactive side of the world, so it can start doing thinking for me? That's when you start really looking at actual artificial intelligence. That's what I want to see. I want that machine to think for me, think past me and think further than I do.

Mon 00:09:49
Right. Let's talk about the ethical implications here. Do you believe we as humans, number one, have the right to decide for ourselves whether we should or want to get one of these devices? And two, if we do decide and do get these devices - let's say we agree and we decide to do that - do we have the right to really understand and approve how our data is stored, managed, processed, deleted and so on?

Chris 00:10:27
Do we have the right? Yes. Do we have the knowledge to be able to answer those questions? No. Do most people care? No. We know that from society. Look at a perfect example: look at society today. How many people are on Snapchat? How many people are on TikTok? How many people are on Facebook? And they're on it because it's convenient. It's a good communication channel. There's a lot of entertainment value. There's a lot of information-sharing value. And some people go into it knowing that they are being farmed. I mean, let's be perfectly honest, you are nothing more than an asset, and I will maximize the value of that asset. What can I sell you? What data of yours can I sell to somebody else? That's really what it's all about. But some people go into that knowingly. Some people just don't care, or don't know, or don't want to know. And I think that's probably part of it as well. But again, we have a separation at that point, because that's on this device (mobile). When you start looking at augmentation - a perfect example, brain capacity - if I look at what I have in here, some of the stuff I'm building behind me isn't just to pull signals out. It's about what signals can I push back in. So how could I enhance learning? How could I enhance various different things? What could I do with my physical body - elevate it through digital inputs, digital signals, memory manipulation? In other words, I'm giving myself an upgrade. That's great, but what are the consequences of that? What happens if the upgrade needs to be patched? What happens if I haven't paid my support maintenance for this year's patches? What happens if all of a sudden a researcher out in, let's say, Israel, Japan, China, India or the US or somewhere goes - hey, if I stand close to this person with an RFID, I can completely erase their memory? And all of a sudden a bunch of us hackers go - hey, that's really cool - and we start running around with freaking great big antennas, wiping everybody's memory as we run down the road. Not so cool, but it proves that we're capable of doing this. So then we get into the supply chain issues and the vendor issues that we currently have. But it's no longer a separation. There's no longer this device (mobile) that I can put down. It's us, right? And I think as an industry, we don't feel we have that responsibility. We don't think about it that way. The problem with our industry is, we have issues and we've always thought about it as - oh, it's a computer; if the computer breaks, who cares? But back to the cyborg thing, we're at a point, whether we do human augmentation, digital augmentation or we extend ourselves, where we have the ability to impact humans and their lives much more - now physically. We obviously had that with the heart rate monitors and the pacemakers four, five, six years ago. We've had it with insulin pumps, we've had it with various other things. But we haven't found a way to actually communicate that effectively to the population, let alone to the people that are making the devices. So yeah, there's a ton of risk. Not just in the devices, but who has the data, who uses it, what they're going to do with it, how they monetize it. And we haven't solved those issues yet. I mean, we really haven't solved them yet.

Mon 00:13:55
You said something about how our industry doesn't feel that we have the responsibility here. So who do you believe should be responsible for this? Who should own the accountability? Or maybe not the accountability, but rather the responsibility of understanding, educating and making sure that the implications are understood in general, and of defending and protecting humanity.

Chris 00:14:24
I think we have to be in that conversation. I don't necessarily believe that we should be leading that conversation, because our industry tends to approach things at a very technical level. But if you think about it, around the table you would need us. You would also need the development teams - back to that DevSecOps kind of mentality. You need the people who are doing the development, from the manufacturing standpoint to the software standpoint and all the other pieces in between; you need them at the table as well. And you need the business. You need the people that are selling this stuff at the table. The challenge there is, it becomes an ethical conversation, and let's face it, most companies will err on the side of - hey, I want to make money - as opposed to - hey, I actually want to do something ethical. Somehow you need those people. I think you also need the psychologists at the table. You need the humans, those people who understand humans. We don't have those at the table at the moment. More often than not, when we make decisions in our industry, we make them from a very technical viewpoint. We don't make them from a humanitarian viewpoint. We're not good at doing that. So I think we need more of those conversations, because what are the risks of putting out a brain implant? If I put a brain implant out that I think works and is great and all good, but it's got software and it's got code and it has flaws and it's sitting on GitHub, then people like me - let's face it - look at that stuff and go, hey, what can we do with that? And all of a sudden you realize you can do some fairly nefarious things with it. So at that point it's an ethics conversation. It's a human conversation. And there's a technical piece in there as well. So, I think we need more people in the conversations.

Mon 00:16:15
Should we already be having that conversation now?

Chris 00:16:18
Yes. Understatement. I gave a lecture in Oslo a couple of weeks ago. It was all about human augmentation. And, you know, to the point where the website is advertising - we've got millions of people that have used these. And it's a pill, like a little tablet, about that big, that you swallow. It has RFID in it. It also has a camera, so it's actually watching your digestive tract for a whole bunch of reasons. Then you can wear a patch, and that patch talks to your phone. User ID: admin. Password: admin. Default. On every single one of them. I'm like, what are you thinking? So we have that, you know. We have a large manufacturer of all of these types of technologies where the user ID is ge and the password is ge, across almost everything. Telling me that it's not your responsibility because it's the people that install it or the people who use it - that doesn't work. That doesn't cut it anymore. You cannot absolve yourself of responsibility. This is humanity. We're dealing with the potential of killing people. So step up and actually accept this. Put the lawyers in the back room, actually step forward and go - hey, yeah, you know what, we'll own some of this. So yeah, we really need to be having these conversations. Josh Corman and a lot of the folks over at "I Am The Cavalry" are, to their credit, doing an amazing job at the legal level. They're out in the US, in D.C., beating the heck out of the lawmakers to really start to look at the healthcare industry and what technology we're putting in. I have a ton of respect for those guys because they do things I can't do, which is talk to lawyers and deal with people on Capitol Hill. They are very nice and they try to do it the right way. So yeah, we in the industry are trying to have those conversations. The industry needs to listen more.

Mon 00:18:26
So what is one of the biggest surprises that you've had in the cybersecurity industry?

Chris 00:18:35
Ooh, that's an interesting one.

Mon 00:18:39
Because, I mean, we see [username] admin [and password] admin. It doesn't surprise us anymore. Somehow we've gotten so used to it that we're like - okay, that's there. Is there anything that still surprises you, or has surprised you recently, in the cybersecurity industry?

Chris 00:18:55
I think a lot of it comes down to the human aspect. The human aspect still surprises me, and, I think, frustrates me. Probably surprises slash frustrates. You think about our industry. We've grown up with this industry for how many years now? For me, I've been messing with this stuff for 40 years, for crying out loud. And when we started it, we did it because it was a passion. It's what we loved, it's what we did, and it's what we do. We do this because we enjoy doing it. But I think what has frustrated and surprised me over the last five or ten years or so - and I don't know how to combat it - is that our own industry, quite honestly, has been taken over by the marketers. We are no longer run as an industry to protect people or protect companies or protect industries. Our industry is run to make as much money and as much profit as possible. The marketing teams run the industry. We come out with new acronyms every single darn day to confuse everybody into buying our stuff. We're really good at not admitting when we don't know everything. We have so many companies telling everybody that they can fix everything. I think we've almost lost that honesty. You know, we had that. You look at the early days of DEF CON, the early days of some of the other conferences and the BSides: we actually stood up and were like - hey, you know, we're ugly and we understand it. This is what we are. What's happened, it feels like, over maybe the last 10 years or so, is that we've become much more polished. We haven't become polished to deal with business. We've become polished to sell more. The marketing has just gone ridiculous. The problem is, now you don't know who to trust. That's my frustration. When I go out there, if I want to go buy something, what am I buying? Antivirus? Am I buying endpoint? Am I buying EDR, EDN, ECMP, EMP?

Mon 00:20:52
You're probably buying it all in one, right?

Chris 00:20:54
Exactly, and then am I buying it all in one from this company or this company or this company? They're all telling me they can do everything. And my mentality from the hacker side is: CrowdStrike is useless, I can bypass it. Carbon Black, I can bypass. Symantec, McAfee and everything else is sitting there going - hey, he went that way. Yes, it's good as part of a solution. You need to have it as part of a solution. But we're like - oh no, it's perfect. It's hacker-proof.

Mon 00:21:22
Right, 100 percent security, right out of the box. So what we've understood today, what we've talked a bit about, is that yes, we should be having these discussions. We need the right people in the room. There are still things that surprise us quite a lot in this industry, as you say - especially the human aspect, the marketing aspect, and people just trying to sell things in the name of being hacker-proof or 100 percent secure. So education is still very important. It's quite key here, right?

Chris 00:21:58
Absolutely!

Mon 00:21:59
We need to educate users because, at the end of the day, if nobody is willing to take the responsibility and nobody's taking the accountability, which is even worse, then users need to be really educated on what they're getting themselves into. Whether it's using mobile phones, as mundane and normal as that is in today's world; or letting corporations just sell their data on different kinds of platforms and apps or whatever; or, tomorrow, going with bionic devices. I mean, there's an evolution in which your digital life changes, but nonetheless the risks are there. How do we educate the users? What would be your recommendations for people to understand this? How do you think non-technical people can learn a bit about it?

Chris 00:22:54
So, good question. I think, for me, it all comes down to one simple thing, which is: ask one more question. And I think that's really all we can do - for managers, for businesses, for individuals, for people in or out of the industry. Ask one more question. That's always what it came down to for me. I remember I had an amazing CFO many years ago, an Italian gentleman. If I couldn't explain to him what I needed, in a language he understood, within him asking three questions, there was no chance I was going to get it. And I think that's the thing. As individuals, as people who are there to make decisions: if you can't tell me why this is good or bad, what this application does, or what I'm embedding in myself, and you can't explain it to me simply, easily and readily, in three questions, then we've got to take a step back and go - why not? It's our problem, it's our issue, and it's also the industry's. So, to me, I would encourage anybody to just ask more questions. And let's face it, a lot of us that are in the industry are on the internet. I mean, you're amazing - you are all over the place, you're a wealth of knowledge. A bunch of us are out there that have got stuff. So find us and ask us questions.

Mon 00:24:05
Right, that sounds very good. Never be shy about asking questions. That's something people often feel shy about, but I believe it's quite important. The curiosity to understand is actually what helps us be better and smarter, and to understand aspects that we have never thought of before.

Chris 00:24:28
Absolutely, yes!

Mon 00:24:29
Let's sign off with me asking you one very last question. Do you have any recommended reading or listening for the audience that you really love or enjoy?

Chris 00:24:38
So, from a resource standpoint, obviously to me LinkedIn is great; there's a really good community there. I love Peerlyst, so Peerlyst is another one that's out there. Join the community; there are some amazing wikis and stuff on it. The other one that I absolutely love reading is Randall Munroe, who does xkcd. He has written a number of books, and there's one in particular that everybody inside our industry should literally read, called "Thing Explainer". It explains complex problems using basically only the top thousand most common words. To me, I love stuff like that. So I would go with Thing Explainer and stuff like that. So yeah, thank you!

Mon 00:25:20
Fantastic. Thank you, it was lovely to have you on the podcast today. We'll be signing off now. That was Chris Roberts in today's episode of We Talk Cyber with Monica. I'm your host Monica Verma, and I'll be back with more exciting episodes, fantastic guests and amazing conversations on All Things Cyber. Until then, take care!

Outro 00:25:39
Thanks for tuning in to We Talk Cyber with Monica. Don't forget to subscribe to We Talk Cyber in your favorite podcast app and on the YouTube channel MonicaTalksCyber. Take care and continue tuning in!

Chapter Markers

Intro
Podcast Start
Are we already in the Cyborg Era?
Cyborg, AI and Augmentation to the Body
Mobile vs. Bionics
Ethical Implications
Who is Responsible? Accountable?
What conversations should we be having?
Surprises and Challenges in the Cybersecurity Industry
How to Educate Users?
Recommended Reading
Outro