If you've spoken to a friend or a colleague for over an hour in the past month, it is highly likely that the words Artificial Intelligence (AI) have come up. While we have been hearing about these technologies for a while, some new AI tools have recently blown up on social media. Cases in point: ChatGPT, DALL-E, Midjourney and more.
So, we got together with Shamim Mokles, a graphic designer and YouTuber who recently did some interesting experiments with AI, and Dr. Anupam Guha, who works with AI policy and is a professor at IIT Bombay.
In this podcast, we talk all about how threatening these AI tools are for creatives, whether we need an AI policy and what it should look like, and ultimately we discover some lesser-known AI uses that are the real threat. And no, it is not Frankenstein or Chitti.
Tune in to the first episode of The Big Story!
(Disclaimer: The views expressed are the speakers' own. The Quint neither endorses nor is responsible for them.)
00:00:02
Speaker 1: Hello and welcome to The Big Story 2.0! Hello,
00:00:06
Speaker 1: this is the first episode of the second season and
00:00:10
Speaker 1: we are your hosts: I am Anjali, and I am Prateek.
00:00:14
Speaker 1: So Anjali, what's new in the season? What is so
00:00:18
Speaker 1: 2.0 about it? So you know how our news
00:00:20
Speaker 1: culture is quite noisy and there are so many headlines
00:00:25
Speaker 1: that come every day, this is happening, that is happening.
00:00:27
Speaker 1: So exactly what is happening,
00:00:29
Speaker 1: that is where The Big Story comes in. So just like
00:00:33
Speaker 1: season one, we will get experts on the podcast to
00:00:36
Speaker 1: answer all of our burning questions. But in this season
00:00:40
Speaker 1: since we wanted to spend more time on longer discussions,
00:00:44
Speaker 1: we are moving to a fortnightly series instead of a
00:00:47
Speaker 1: daily show and the episodes are longer as well. We
00:00:50
Speaker 1: have very free-wheeling, you know, long, long discussions
00:00:54
Speaker 1: with our experts so that we get the more nuanced
00:00:58
Speaker 1: and well-rounded take. That is all about the show.
00:01:02
Speaker 1: But let's talk about this episode before we go further.
00:01:05
Speaker 1: I just want to make an appeal to our amazing listeners.
00:01:09
Speaker 1: Please listen to and check out all the future episodes
00:01:12
Speaker 1: of The Big Story, and all the previous episodes of The Big
00:01:14
Speaker 1: Story, on your favorite audio streaming platforms like Apple Podcasts,
00:01:18
Speaker 1: Spotify, Gaana,
00:01:20
Speaker 1: everywhere. We are on The Quint's website, we're on The
00:01:22
Speaker 1: Quint's YouTube channel. We have taken over the world. And
00:01:26
Speaker 1: also check out our other podcasts. Yes, we have a
00:01:29
Speaker 1: movie and TV review show, and we have
00:01:35
Speaker 1: a show on Indian politics called Siyasat. So do
00:01:38
Speaker 1: check them out as well. Right? So for our first
00:01:42
Speaker 1: episode we sort of looked around and
00:01:45
Speaker 1: we tried to find out what is the one thing
00:01:47
Speaker 1: that is creating headlines and you might have guessed it.
00:01:52
Speaker 1: It is, it's artificial intelligence, machine learning. So there is
00:01:58
Speaker 1: a lot of curiosity about how it works, what it
00:02:01
Speaker 1: can do, is it good? Is it bad? There is
00:02:04
Speaker 1: a group of people who believe that it is going
00:02:05
Speaker 1: to change the world for the better.
00:02:08
Speaker 1: And then there is another group of people saying that
00:02:11
Speaker 1: AI is going to take our jobs, replace humans, robots
00:02:16
Speaker 1: will kill us, and so on. So there's a
00:02:19
Speaker 1: lot of talk, and so we wanted to cut through a
00:02:23
Speaker 1: lot of noise, like genuine, genuine noise, probably generated by AI itself.
00:02:29
Speaker 1: But yeah, so we thought, let us get experts
00:02:33
Speaker 1: from different walks of life but connected to AI, and
00:02:38
Speaker 1: ask them some questions that I would say we have
00:02:42
Speaker 1: and a lot of our listeners would have as well,
00:02:44
Speaker 1: like we start from the very basics and then we want
00:02:47
Speaker 1: to go into some, you know, deep
00:02:49
Speaker 1: questions about how it will impact the society at large.
00:02:52
Speaker 1: So Anjali, what were some of the questions that you
00:02:56
Speaker 1: have for our guests? So I think I would like
00:02:59
Speaker 1: to start by understanding how AI works. Like, in
00:03:02
Speaker 1: very simple language, how does it work? What is the
00:03:06
Speaker 1: thing behind it? There is also this emergence of these new tools,
00:03:11
Speaker 1: like ChatGPT, DALL-E and Lensa, in, you know, very
00:03:16
Speaker 1: common
00:03:17
Speaker 1: use cases like art and essay writing and all of that.
00:03:22
Speaker 1: So it's actually a really interesting thing, you know, I think,
00:03:27
Speaker 1: to deep dive upon this AI-and-art correlation, right?
00:03:33
Speaker 1: And for that, like we have
00:03:37
Speaker 1: our first guest, who is Shamim Mokles. He is a
00:03:41
Speaker 1: YouTuber and he's also a graphic designer, a working, real,
00:03:45
Speaker 1: human graphic designer, not an AI graphic designer. And
00:03:47
Speaker 1: we want to talk to him
00:03:51
Speaker 1: about his journey with art in general and how he
00:03:56
Speaker 1: sees this encroachment of AI in the space of art.
00:04:01
Speaker 1: Interestingly enough, he actually made a YouTube video about this
00:04:04
Speaker 1: recently,
00:04:05
Speaker 1: where he sent an AI-generated design to a client.
00:04:09
Speaker 1: So it would be interesting to explore: what was that
00:04:15
Speaker 1: like? What was the whole vibe of the clients? Did
00:04:19
Speaker 1: they know? Did they find out? So that was really nice.
00:04:21
Speaker 1: So we just want to get a peek into how
00:04:25
Speaker 1: the creative world looks at AI, what are the
00:04:29
Speaker 1: kinds of problems that AI is solving for them. Do
00:04:32
Speaker 1: they like it? Do they not like it?
00:04:34
Speaker 1: And at the same time we really want to zoom
00:04:36
Speaker 1: out and talk about a more macro view of AI,
00:04:39
Speaker 1: for which we have Dr. Anupam Guha, who is a professor at
00:04:43
Speaker 1: IIT Bombay and who works extensively in AI policy.
00:04:48
Speaker 1: He has spoken about this in a lot of places. So,
00:04:51
Speaker 1: I think he'd be the perfect fit for us to
00:04:53
Speaker 1: understand what a policy around AI should look like, what
00:04:57
Speaker 1: it is really doing to the world, and should we be
00:05:02
Speaker 1: worried about it. It's going to be so interesting to have
00:05:04
Speaker 1: an artist and a techie talk about this. All right. So
00:05:07
Speaker 1: let's jump right into the conversation.
00:05:08
Speaker 1: Let's jump right into it. Welcome, Shamim. Welcome, and hello,
00:05:13
Speaker 1: I want to start today's discussion, first of all by
00:05:16
Speaker 1: talking about AI, artificial intelligence itself; DALL-E and all can come later. Let us,
00:05:29
Speaker 1: you know, hold back a bit
00:05:31
Speaker 1: and discuss what artificial intelligence is and how it works
00:05:35
Speaker 1: in a sort of simple language. Right? But I was
00:05:38
Speaker 1: thinking instead of Anupam telling us what it is because
00:05:42
Speaker 1: he has been working closely with AI and he
00:05:44
Speaker 1: knows the internal, you know, workings of it. How about
00:05:48
Speaker 1: all of us
00:05:50
Speaker 1: tell him what we think Ai is and how it works.
00:05:54
Speaker 1: And then he takes our class and we're like, okay,
00:05:58
Speaker 1: we will do our version of it. And he'll give
00:06:01
Speaker 1: us marks. Who wants to start? I am at the
00:06:04
Speaker 1: bottom of this pyramid. But no, no,
00:06:08
Speaker 1: we'll do it alphabetically. So, Anjali.
00:06:12
Speaker 1: like I said, I'm at the bottom of this pyramid.
00:06:14
Speaker 1: So my definition is going to be the most basic.
00:06:18
Speaker 1: So how I think
00:06:19
Speaker 1: artificial intelligence works is, like I told you a while back,
00:06:23
Speaker 1: it is just very sophisticated coding where you feed
00:06:27
Speaker 1: into a program all the information that is on
00:06:30
Speaker 1: the internet, and you ask it to identify patterns. And
00:06:36
Speaker 1: then, when given a prompt, it uses those identified
00:06:40
Speaker 1: patterns to create a new result. Now this can be done
00:06:44
Speaker 1: for
00:06:48
Speaker 1: Yeah, so building off of what she is saying, I
00:06:54
Speaker 1: think artificial intelligence, how I understand it, is a
00:06:59
Speaker 1: way to, as the name suggests, artificially create the pattern
00:07:03
Speaker 1: of thinking of a human brain, I would say, because
00:07:06
Speaker 1: whatever we observe becomes sort of our data, and then
00:07:14
Speaker 1: we recognize some patterns,
00:07:20
Speaker 1: process information, and generate intelligence; over the years it gets more
00:07:29
Speaker 1: sophisticated.
00:07:30
Speaker 1: And that's how I understand it in the most
00:07:34
Speaker 1: basic sense. Shamim, what do you think? Yeah, obviously
00:07:37
Speaker 1: you guys went a little too technical, I think. I
00:07:40
Speaker 1: feel, Anupam will correct me if I'm wrong, but AI is
00:07:43
Speaker 1: basically pattern recognition. Like, it's very good at recognizing patterns.
00:07:48
Speaker 1: That is what AI is for me in my profession,
00:07:50
Speaker 1: because I'll only talk about my profession here. But
00:07:54
Speaker 1: more or less, like, when we talk about jobs and
00:07:56
Speaker 1: everything, and if you look at it from a
00:07:59
Speaker 1: bird's-eye view, then I think AI is a tool
00:08:02
Speaker 1: that will make us as a species more productive in
00:08:06
Speaker 1: the future and will make our lives easier. So that's
00:08:09
Speaker 1: what AI is, according to me. Okay, so
00:08:14
Speaker 2: so first of all, first of all, thank you for
00:08:19
Speaker 2: having me here
00:08:20
Speaker 2: and uh thank you for coming. So it's very interesting
00:08:24
Speaker 2: to talk to you
00:08:26
Speaker 2: Anjali's definition, uh, was quite close,
00:08:36
Speaker 1: actually
00:08:37
Speaker 2: Actually, let me try to explain. First of all,
00:08:50
Speaker 2: instead of explaining what AI is, I want to start with
00:08:54
Speaker 2: what AI is not, because there is a lot of
00:08:58
Speaker 2: cultural opinion on what it is, and I think we
00:09:01
Speaker 2: need to cut through a little bit there.
00:09:05
Speaker 2: So the first thing you have to understand is that artificial
00:09:08
Speaker 2: intelligence is a misnomer, or rather a marketing term; it
00:09:13
Speaker 2: has pretty much nothing to do with intelligence, which is
00:09:17
Speaker 2: where, Prateek, your definition flies out of the window. There
00:09:21
Speaker 2: is no
00:09:23
Speaker 2: neuroscientist on earth right now who knows why we are
00:09:28
Speaker 2: conscious, or whether we are really intelligent,
00:09:32
Speaker 2: and hence there is no technology which can artificially replicate
00:09:36
Speaker 2: intelligence. What it does is record and repeat patterns: think of AI as a parrot.
00:09:52
Speaker 2: Any machine which can autonomously make decisions is called an AI.
00:09:57
Speaker 2: And generally the technology which is being used in 95, 96%
00:10:02
Speaker 2: of AI is machine learning. What is machine learning? I
00:10:06
Speaker 2: would say Anjali's definition was more or less correct: machine learning takes data.
00:10:12
Speaker 2: The machine
00:10:19
Speaker 2: finds patterns in that data and uses them to make a
00:10:24
Speaker 2: decision. Say, classification: show a machine apples and oranges and it learns to tell them apart. That is classification machine learning,
00:10:50
Speaker 2: and a machine that can draw a fruit on request is generative,
00:10:54
Speaker 2: like DALL-E and ChatGPT: generative machine learning types. But
00:11:01
Speaker 2: fundamentally all machine learning algorithms, they eat a lot of
00:11:07
Speaker 2: data, which is called training data.
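(The "learn patterns from training data, then decide" picture described here can be sketched in a few lines of Python. This is only a toy illustration, a nearest-centroid classifier on invented fruit measurements; the numbers, features and labels are all made up for the example, and real systems like DALL-E or ChatGPT are vastly more complex.)

```python
# Toy illustration of "machine learning as pattern recognition":
# classify a fruit as apple or orange from (weight in grams, peel roughness)
# with a nearest-centroid rule. The "learned pattern" is just the average
# of the labelled examples (the training data).

training_data = {
    "apple":  [(150, 0.10), (170, 0.15), (160, 0.12)],
    "orange": [(140, 0.70), (155, 0.80), (150, 0.75)],
}

def centroid(points):
    # Average each coordinate over the example points.
    dims = len(points[0])
    return tuple(sum(p[i] for p in points) / len(points) for i in range(dims))

# "Training": summarise the data into one pattern per class.
patterns = {label: centroid(pts) for label, pts in training_data.items()}

def classify(sample):
    # Pick the class whose learned pattern is closest to the sample.
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(patterns, key=lambda label: sq_dist(sample, patterns[label]))

print(classify((165, 0.10)))  # -> apple
print(classify((148, 0.72)))  # -> orange
```

The "training" step just averages the examples, and classification picks whichever learned pattern the new sample sits closest to: the simplest possible version of the apple/orange classification mentioned above.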
00:11:09
Speaker 1: So sorry, first,
00:11:13
Speaker 1: are they new, things like DALL-E and ChatGPT?
00:11:16
Speaker 2: No they are not, they're not.
00:11:20
Speaker 1: I think they're 2nd, 3rd versions, like DALL-E, actually.
00:11:25
Speaker 2: More, more than 2nd, 3rd versions; those are just
00:11:28
Speaker 2: particular brand names of that technology. But
00:11:38
Speaker 2: so it is basically what is called in the technical
00:11:42
Speaker 2: field a language model: a model of the patterns of a language, say English. A model of those
00:11:54
Speaker 2: patterns is a language model. The basic technology of language models is not new.
00:12:03
Speaker 2: The major technologies have existed since the early 2000s,
00:12:06
Speaker 2: but the hardware and software changed: GPUs came to market, new hardware,
00:12:15
Speaker 2: new software. The research that earlier sat in AI
00:12:23
Speaker 2: laboratories, methodology which was something we were working on in
00:12:27
Speaker 2: our fields, suddenly started to stop remaining as scientific
00:12:33
Speaker 2: projects
00:12:34
Speaker 2: and became market commodities which started to be converted into products.
00:12:39
Speaker 2: I would say that was a moment. I mean, we
00:12:42
Speaker 2: were trying to solve some specific problems, but as AI
00:12:46
Speaker 2: became commercialized, people also started to realize that instead of
00:12:51
Speaker 2: trying to solve specific but hard problems, it is much
00:12:55
Speaker 2: easier to throw lots of data at big models; quantity has a
00:12:59
Speaker 2: quality of its own.
00:13:03
Speaker 2: it's a research career or what problem are they trying
00:13:16
Speaker 2: to solve?
00:13:17
Speaker 2: I
00:13:28
Speaker 1: I want to pause you for a second,
00:13:30
Speaker 1: because you bring up an interesting point now. And
00:13:36
Speaker 1: since we were talking about things like DALL-E and Midjourney
00:13:38
Speaker 1: and Stable Diffusion:
00:13:40
Speaker 1: Shamim, you're a working professional, right? You work with graphics
00:13:45
Speaker 1: on a day-to-day basis. So how do you
00:13:47
Speaker 1: think these tools have
00:13:50
Speaker 1: sort of infiltrated your industry? And the problems Anupam
00:13:55
Speaker 1: just discussed, do these tools actually solve problems that exist in your
00:14:01
Speaker 1: industry specifically? In my industry, at this point, it's making
00:14:06
Speaker 1: us more productive. Like I still remember in my initial days, 2007,
00:14:10
Speaker 1: '08, at the time Photoshop 7 was out, and I learned on
00:14:14
Speaker 1: like the very, like, vintage version
00:14:17
Speaker 1: there. If you have to remove people from a background,
00:14:20
Speaker 1: it was a very big task like it was like
00:14:22
Speaker 1: a full day task. Now it's just one click. If
00:14:25
Speaker 1: I have to make a brochure then if I'm using
00:14:29
Speaker 1: five images as such, then I can create a brochure
00:14:33
Speaker 1: in one hour, two, or maximum three hours; at one
00:14:36
Speaker 1: time, those three hours were three days. So as
00:14:40
Speaker 1: a designer I'm being more productive, my team is being
00:14:43
Speaker 1: more productive and
00:14:44
Speaker 1: if you talk about all these designing software, they are
00:14:47
Speaker 1: becoming, like, intelligent. Intelligence is not what they are, but for
00:14:52
Speaker 1: us they are becoming intelligent in a way that they
00:14:55
Speaker 1: give us solutions easier and faster. Do you think
00:14:58
Speaker 1: they seem intelligent to us looking at it as
00:15:02
Speaker 1: laymen, even if they're not truly intelligent? Like, they might not be
00:15:06
Speaker 1: intelligent in the way we are intelligent, but they seem
00:15:10
Speaker 1: pretty intelligent. Take
00:15:14
Speaker 1: background removal. I think also because for so long people
00:15:18
Speaker 1: have been removing background that is something that the software
00:15:22
Speaker 1: has also been learning that when somebody wants to remove
00:15:25
Speaker 1: a background, this is the distinction they are making
00:15:27
Speaker 1: between foreground and background. And, as we just discussed, after all
00:15:32
Speaker 1: of this data has been collected, now they've developed a
00:15:36
Speaker 1: thing where, with one click, the
00:15:38
Speaker 1: software knows what it thinks is the foreground and
00:15:41
Speaker 1: the background. And you'll see, a new update will come
00:15:43
Speaker 1: and that software will add something new beyond background removal.
00:15:47
Speaker 1: Background extension is just one option; it's very difficult
00:15:51
Speaker 1: to tell you everything, but you get multiple options in a
00:15:53
Speaker 1: very short amount of time. But I don't
00:15:57
Speaker 1: feel it's a threat, because, see, jobs will
00:16:01
Speaker 1: change; the perception of jobs will change.
00:16:05
Speaker 1: I cannot be rigid, like: no, no, I
00:16:07
Speaker 1: will do the best work, AI cannot do it the
00:16:10
Speaker 1: way I will do it, and I'll only do it myself.
00:16:14
Speaker 1: Then I will be kicked out, because another designer will
00:16:16
Speaker 1: come, he'll use AI, and he'll do multiple things in
00:16:20
Speaker 1: the same amount of time. Is something like that happening?
00:16:23
Speaker 1: Has it? No, no, I haven't seen it, because it's all
00:16:28
Speaker 1: about productivity, right? Who wants to work till four in
00:16:32
Speaker 1: the morning? Is AI being used very prominently everywhere? Yeah, I'll
00:16:37
Speaker 1: tell you one live example. I'll tell you suppose you
00:16:40
Speaker 1: have a holiday client. Okay suppose you are working with
00:16:43
Speaker 1: a client who sells holidays. It's not possible for us
00:16:46
Speaker 1: to go and click a hut in the mountain picture
00:16:48
Speaker 1: all the time. So what we do will create an
00:16:51
Speaker 1: image like that in AI, suppose, and then if
00:16:53
Speaker 1: necessary we'll try to regenerate it, because we
00:16:57
Speaker 1: don't have a reference beforehand, right? But the AI has,
00:17:00
Speaker 1: as Anupam said, no, the AI has been fed so many
00:17:04
Speaker 1: images of a hut in the mountains. It'll give us
00:17:08
Speaker 1: options. But AI work is still very patchy, like I
00:17:11
Speaker 1: don't see it as very, like, fine-fine. Okay, so
00:17:15
Speaker 1: we'll watch this image and we'll get inspiration of the
00:17:18
Speaker 1: lighting and set up and how the mountains and our
00:17:21
Speaker 1: this thing. But we'll create it manually so it will
00:17:24
Speaker 1: look better, it will look better,
00:17:26
Speaker 1: and it will be in our control. So that's how
00:17:30
Speaker 1: AI is being used now, so we can create a
00:17:34
Speaker 1: safer image, like it won't get copyright-struck, because we are
00:17:36
Speaker 1: not stealing the image, we are constructing it. So you're saying you are
00:17:40
Speaker 1: currently using AI as a starting point, a bouncing board for
00:17:44
Speaker 1: the first idea. And then, we can... Because you
00:17:50
Speaker 1: also had made a video about this where you said
00:17:53
Speaker 1: that you gave an AI-generated design to an actual client.
00:17:57
Speaker 1: So talk to us about that. What kind
00:18:00
Speaker 1: of brief did you get
00:18:02
Speaker 1: and how was that whole experience of you know actually
00:18:06
Speaker 1: going through it? And what was that like? See,
00:18:12
Speaker 1: how it works is, like... first of all, why did
00:18:15
Speaker 1: you do it? Did you just do it as an experiment? Yes. Yes, yes.
00:18:18
Speaker 1: It was an experiment, because, okay, we know,
00:18:21
Speaker 1: when we are pitching something, we are throwing things,
00:18:25
Speaker 1: like, in the dark; we don't know what will stick.
00:18:27
Speaker 1: So you show what you can do for them,
00:18:30
Speaker 1: And they'll say yeah we can do this also. So
00:18:32
Speaker 1: you do things to get rejected sometimes if you're lucky
00:18:36
Speaker 1: then something from that only will get selected okay. But
00:18:40
Speaker 1: like 99% of time though
00:18:42
Speaker 1: it's like okay, okay these are very good. But we
00:18:45
Speaker 1: had this in our mind and this is the hook
00:18:48
Speaker 1: point in the business, right? If you get
00:18:51
Speaker 1: to that point, then you have the job. And those
00:18:54
Speaker 1: two logo options, no, that was my way of, like,
00:18:58
Speaker 1: reducing the, what do you call it, rejection, because if I had told
00:19:01
Speaker 1: them at the initial stage exactly what I did, you know, quite
00:19:06
Speaker 1: possibly they would reject it.
00:19:09
Speaker 1: But at least the client saw it first. Yes, exactly. Because the
00:19:14
Speaker 1: best thing about AI in our business is, like, you're not
00:19:17
Speaker 1: stealing anything; it's not copyrighted yet. We will talk
00:19:21
Speaker 1: about today what the debate is about it being stealing. Because
00:19:27
Speaker 1: in the AI-generated options also, no, I did some tweaking
00:19:29
Speaker 1: there also; it was not used as-is.
00:19:32
Speaker 1: We showed it to them, and if they like it:
00:19:35
Speaker 1: okay, this is the good one, and we need 10
00:19:39
Speaker 1: poses of this one. At least we have one basic pose.
00:19:42
Speaker 1: So we'll do more changes. And you're saying AI helps
00:19:44
Speaker 1: in, you know, cracking that first step? Yes. There will
00:19:48
Speaker 1: be one graphic designer who's better at AI, and there
00:19:51
Speaker 1: will be one graphic designer who really wants
00:19:52
Speaker 1: to do things the old way.
00:19:55
Speaker 1: Okay, okay. It's simple: the film camera and the digital camera;
00:19:58
Speaker 1: the transition was very, very rocky. Okay. The old people
00:20:02
Speaker 1: who were, like, film camera users, they were very, they
00:20:06
Speaker 1: didn't want to go to the digital camera. But who did,
00:20:09
Speaker 1: who did go to the digital camera? They survived. Right.
00:20:12
Speaker 1: So this is what it is. The technologies will be
00:20:15
Speaker 1: evolving, and you need to ride the wave; basically, you
00:20:19
Speaker 1: need to be updated. That's what I feel.
00:20:24
Speaker 1: I was listening. I have one question for Shamim in
00:20:29
Speaker 1: your video. Also you mentioned that sometimes what an AI
00:20:33
Speaker 1: lacks is context. So and I think this is very
00:20:37
Speaker 1: cultural also that when you have an indian client you
00:20:42
Speaker 1: have the entire cultural background that this client is coming from.
00:20:46
Speaker 1: And so you know you might not get it in
00:20:50
Speaker 1: one go what the client wants but you have certain
00:20:53
Speaker 1: context which a lot of times because the air doesn't
00:20:56
Speaker 1: know who this person feeding in the prompt is might
00:21:00
Speaker 1: not be able to give a more accurate image. So
00:21:06
Speaker 1: how does that work? If you can explain more about
00:21:09
Speaker 1: context in graphic communication,
00:21:14
Speaker 1: See, your question, no, I want to give it
00:21:17
Speaker 1: to Anupam, because your question's answer is hidden in this: is Jarvis possible?
00:21:22
Speaker 1: And if Jarvis is possible, then is Skynet possible? Are Jarvis
00:21:27
Speaker 1: and Skynet possible? Before he answers this, I just
00:21:29
Speaker 1: want to interrupt and explain to our listeners what Skynet
00:21:33
Speaker 1: actually is. So in the Terminator movies, Skynet is
00:21:38
Speaker 1: like the villain of those films, and it is an
00:21:41
Speaker 1: AI, I would say, which gains self-awareness and which becomes
00:21:45
Speaker 1: sentient,
00:21:46
Speaker 1: and at the end it decides that humans are the
00:21:51
Speaker 1: biggest enemy in the world, and its mission is to,
00:21:56
Speaker 1: you know, kill humans. And essentially Skynet is the example
00:22:00
Speaker 1: that is used to illustrate what happens if AI turns on humans;
00:22:09
Speaker 1: this is like the first time in popular culture that we
00:22:13
Speaker 1: saw it at that scale. And all of this, obviously: I, Robot, or,
00:22:17
Speaker 1: in India, Shankar's Robot with Rajinikanth; those movies' premise is AI
00:22:24
Speaker 1: versus humans. So, um, yes: are Jarvis and Skynet possible?
00:22:32
Speaker 1: Yeah, then, then you have your
00:22:34
Speaker 2: answer. Again, I used a word
00:22:39
Speaker 2: when I started to describe what AI is and isn't, right?
00:22:43
Speaker 2: I used the word, and I used it very specifically:
00:22:46
Speaker 2: I said think of AI as a parrot. Jarvis is
00:22:51
Speaker 2: not a parrot, right? A parrot doesn't know what it
00:22:54
Speaker 2: is saying. A parrot just says things it has heard,
00:22:57
Speaker 2: patches from things it has heard. But I want to
00:23:01
Speaker 2: answer this in more detail, you know, because
00:23:04
Speaker 2: you guys were having this conversation and I was listening
00:23:07
Speaker 2: very closely, to try to sort things out. So there are,
00:23:11
Speaker 2: again, things which we have to be very clear about.
00:23:15
Speaker 2: Shamim, you used the word, or rather the phrase, that
00:23:20
Speaker 2: AI is "coming", right? And I would like to push back on
00:23:27
Speaker 2: that analogy. Technology never "comes". People choose to use technology,
00:23:32
Speaker 2: and the people who have power and money get to
00:23:35
Speaker 2: dictate which technology is used, and in what way; no
00:23:40
Speaker 2: technology comes on its own. Another phrase we keep hearing is "the genie is
00:23:44
Speaker 2: out of the bottle."
00:23:46
Speaker 2: Yeah, it's not a genie; it can't be out of
00:23:48
Speaker 2: the bottle. So the point is: there is a technology, it is
00:23:53
Speaker 2: used in particular ways,
00:23:55
Speaker 2: which ways will become popular and which ways will not
00:23:59
Speaker 2: become popular depends a lot on who controls the AI industry:
00:24:04
Speaker 2: what use cases they fund, what use cases they research,
00:24:08
Speaker 2: what use cases they develop. At an individual level, sure, a
00:24:14
Speaker 2: professional can acquire
00:24:16
Speaker 2: skills and survive, but at a macro level, and as a
00:24:29
Speaker 2: citizen: facial recognition tech
00:24:36
Speaker 2: can identify you; it erodes privacy. So remember, at the end of
00:24:45
Speaker 2: the day, every technology has a wider context: policy,
00:24:54
Speaker 2: politics, economics. So it matters who controls the technology. And
00:24:59
Speaker 2: the second thing: you guys were talking about it earlier,
00:25:03
Speaker 2: correct. Anjali said something which was actually correct about
00:25:08
Speaker 2: the background removal. In computer vision that problem is called
00:25:12
Speaker 2: segmentation, and segmentation became possible because humans had done it millions of times. That is the important point:
00:25:33
Speaker 2: whatever value any AI generates, that value is coming out
00:25:39
Speaker 2: of training on data which was at some point made
00:25:43
Speaker 2: by human beings.
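(The segmentation task mentioned here, separating foreground from background, can be faked in a toy way with a brightness threshold. This sketch only shows what a segmentation mask is; the tiny "image" and the threshold are invented for the example, and real background-removal tools learn the separation from millions of human-labelled images instead of using a fixed rule.)

```python
# Toy sketch of foreground/background segmentation on a 4x4 grayscale
# "image" (0 = dark, 255 = bright). A real model learns the rule from
# human-labelled data; here we hard-code a brightness threshold.

image = [
    [10, 12, 11, 13],
    [11, 200, 210, 12],
    [13, 205, 198, 11],
    [12, 10, 13, 12],
]

THRESHOLD = 100  # invented cut-off: bright pixels count as "foreground"

# The segmentation mask marks each pixel as foreground (1) or background (0).
mask = [[1 if pixel > THRESHOLD else 0 for pixel in row] for row in image]

for row in mask:
    print(row)  # the block of 1s in the middle is the detected "object"
```

The mask is exactly what a one-click background-removal tool produces, except that there the per-pixel decision comes from patterns learned from human annotations rather than a fixed threshold.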
00:25:44
Speaker 1: So doesn't this then bring up the question of ownership?
00:25:48
Speaker 2: Exactly, here the question comes in: no AI can exist
00:25:59
Speaker 2: without data. A lot of data. And
00:25:59
Speaker 2: it is a shame that right now that data is
00:26:03
Speaker 2: kind of being stolen; it's just taken off the internet. I'll
00:26:08
Speaker 2: give you one example. There is a very famous facial
00:26:13
Speaker 2: recognition tech company in America, valuation in the billions: Clearview. For Clearview's dataset,
00:26:21
Speaker 2: they just went on the internet: American law
00:26:24
Speaker 2: enforcement websites, undertrials' photos, the internet, social media, Facebook photos.
00:26:34
Speaker 2: They did not take permission from anyone.
00:26:46
Speaker 2: So yeah, the point is that the discussions we need
00:26:52
Speaker 2: to have concern the whole of society. You can't leave them to
00:27:00
Speaker 2: people who are going to make money from it. You
00:27:03
Speaker 2: have to have that discussion at a societal or at
00:27:06
Speaker 2: a policy level. On data protection, Europe has a law,
00:27:16
Speaker 2: it is called GDPR. You might have heard
00:27:19
Speaker 2: of it. It's a very strong law in force in Europe:
00:27:23
Speaker 2: your data cannot be taken; even if it is on the internet,
00:27:26
Speaker 2: some company can't come and randomly take it.
00:27:28
Speaker 1: So I'm just going to pause you here. So in
00:27:32
Speaker 1: some way would you think that what Shamim did with
00:27:37
Speaker 1: his experiment, that using somebody else's... like using AI-generated
00:27:42
Speaker 1: images... or even me: I have my profile picture as a Lensa-generated
00:27:46
Speaker 1: one, and I use a lot of AI art.
00:27:49
Speaker 1: So would you consider that
00:27:51
Speaker 1: unethical at some level?
00:27:54
Speaker 2: As a policy academic, I try to stay away from
00:28:00
Speaker 2: questions of individual ethics; ethics are not something an individual
00:28:05
Speaker 2: can decide. See, when Shamim did that, it would not have crossed his
00:28:11
Speaker 2: mind that: I am using an AI, and this AI was trained on millions
00:28:14
Speaker 2: of images. Where did those millions of images come from?
00:28:25
Speaker 1: in the past
00:28:29
Speaker 2: But the responsibility lies not with the individual but with the company.
00:29:06
Speaker 1: I actually had another question to ask.
00:29:09
Speaker 1: The AI tools are using the databases that are available
00:29:13
Speaker 1: on the internet virtually for free of cost. Right. So
00:29:17
Speaker 1: then
00:29:19
Speaker 1: is there a problem with the entire money-making strategy
00:29:24
Speaker 1: of these companies?
00:29:28
Speaker 2: They have collected the data from the internet without giving
00:29:32
Speaker 2: any money to the people who produced it. With that data
00:29:36
Speaker 2: they train a model and make a product.
00:29:59
Speaker 2: That is the problem you have there:
00:30:01
Speaker 2: in our culture, and when I say our culture I
00:30:04
Speaker 2: don't just mean Indian culture, I also mean American culture,
00:30:07
Speaker 2: there is a false narrative, a false narrative here:
00:30:11
Speaker 2: that regulation kills innovation. And the problem here in America and India,
00:30:18
Speaker 2: in our countries, is that we don't have any
00:30:23
Speaker 2: regulation on data, or
00:30:26
Speaker 1: even data protection
00:30:29
Speaker 2: A data protection bill has still not come through. But even an imperfect
00:30:34
Speaker 2: law would have been progress; I'm sorry to say, right now
00:30:38
Speaker 2: we don't have any law to protect your or my
00:30:41
Speaker 1: data. And what will happen if a law like
00:30:44
Speaker 1: this comes, to our daily life?
00:30:48
Speaker 2: depends on what
00:30:49
Speaker 2: depends on what is written in the law. Like right
00:30:53
Speaker 2: now, the last draft data protection bill drew criticism: all the
00:31:02
Speaker 2: responsibilities on citizens, but the state exempted from all responsibility.
00:31:08
Speaker 1: again, Shamim should be
00:31:11
Speaker 1: king of
00:31:12
Speaker 2: like professor science or humanities. Would
00:31:35
Speaker 1: you be right
00:31:35
Speaker 1: now, if you hadn't pivoted to policy, would you
00:31:39
Speaker 1: be working in one of these companies now? Yeah
00:31:41
Speaker 2: I would have been working in Silicon Valley in 2019.
00:31:54
Speaker 1: I wanted to go back to that thing, like, because
00:31:58
Speaker 1: after listening to Anupam,
00:32:00
Speaker 1: I'm completely with him and we are on the same page
00:32:04
Speaker 1: where we talk about laws and policies. Because whenever I
00:32:07
Speaker 1: meet somebody who wants to be a graphic designer and
00:32:09
Speaker 1: who meets me, somebody I know, I'll
00:32:12
Speaker 1: tell them: our profession is not just a job,
00:32:15
Speaker 1: it's a superpower because all the fake news, all the
00:32:18
Speaker 1: hysteria and everything, that is done by people like us.
00:32:22
Speaker 1: Okay
00:32:23
Speaker 1: and now there's all this like what you call Deepfakes
00:32:27
Speaker 1: and now the fake news is exactly defects and everything.
00:32:31
Speaker 1: This is done by Ai. You don't put a law
00:32:35
Speaker 1: in breathing right? You put a law in cutting a
00:32:37
Speaker 1: tree because that's something is dangerous, breathing is not dangerous.
00:32:41
Speaker 1: I don't have problem with Ai. Okay, I have a
00:32:43
Speaker 1: problem with the hysteria that is being created against a
00:32:47
Speaker 1: you will eat your job is dangerous this and that.
00:32:50
Speaker 1: So I'm going back to the same how dangerous is
00:32:53
Speaker 1: let's okay, let's talk about key danger in its worst form,
00:32:57
Speaker 1: what is the harm they can do? And since we
00:33:00
Speaker 1: are recording it on a friday the 13th let us go,
00:33:03
Speaker 2: Okay, let's actually talk about the danger of AI. The
00:33:11
Speaker 2: answer is a very contradictory answer, so have patience:
00:33:17
Speaker 2: yes, AI is extremely dangerous; no, Skynet is not possible,
00:33:22
Speaker 2: because it is not dangerous in the way people think
00:33:24
Speaker 2: of its danger, the way a common person thinks AI
00:33:27
Speaker 2: is dangerous.
00:33:49
Speaker 2: You
00:33:50
Speaker 1: make AI sound stupid also;
00:33:53
Speaker 1: you've taken all the power away from
00:33:55
Speaker 2: it. Yes, but I would also say this very
00:33:59
Speaker 2: stupid thing is extremely dangerous in very stupid ways. The
00:34:03
Speaker 2: problem here is that it is stupid. Let me give basic examples of how
00:34:12
Speaker 2: dangerous it is. In America,
00:34:15
Speaker 2: they have built programs that technically predict the chance that a criminal
00:34:37
Speaker 2: will reoffend, and judges and juries rely on them over common sense. Case after case has shown these systems are biased against the Black community,
00:34:59
Speaker 2: and that is the problem.
00:35:27
Speaker 1: When was this, what year was this?
00:35:29
Speaker 2: This has been going on for decades. This is a huge
00:35:32
Speaker 2: problem in America. Another example: Amazon
00:35:37
Speaker 2: warehouses. Amazon has moved over from a purely online presence
00:35:43
Speaker 2: into the brick-and-mortar industry of warehouses.
00:35:45
Speaker 2: To staff these warehouses, Amazon tried to replicate their
00:35:55
Speaker 2: whole hiring process by converting it to machine learning. The machine
00:35:59
Speaker 2: learning
00:36:03
Speaker 2: had the same problem: it learned the biases of people, and it discriminated against women and Black applicants.
00:36:45
Speaker 2: Okay, similarly,
00:36:59
Speaker 2: in general, in the army and in police departments:
00:37:14
Speaker 2: the Delhi, Punjab, Assam, Hyderabad, and Chennai police departments have
00:37:23
Speaker 2: started using facial recognition tech. Machine learning is probabilistic,
00:37:36
Speaker 2: so when AI produces an inaccuracy, it is not a
00:37:41
Speaker 2: technical fault; it is inaccurate by nature, and it will go on making inaccurate decisions. That is the whole idea. Okay,
00:37:54
Speaker 2: Recently, the ex-Chief Justice,
00:38:05
Speaker 2: he announced that he has decided, and the Supreme Court
00:38:09
Speaker 2: judges think, that Indian courts can use machine learning to manage resources and procedure,
00:38:27
Speaker 2: to simplify procedure. But justice is not a content, justice is not a product,
00:38:33
Speaker 2: and machine learning treats justice as a product. Sorry, now about jobs:
00:38:49
Speaker 2: when people remember jobs being replaced, they remember
00:38:54
Speaker 2: intelligent, creative jobs. But economically, 50 to 60 percent of jobs are manual labourers, especially,
00:39:10
Speaker 2: especially
00:39:11
Speaker 2: in a country like India. These jobs cannot be replaced, because the worth
00:39:16
Speaker 2: of a human being in India is very low. Let's
00:39:19
Speaker 2: be very clear about jobs: jobs are being created, but never
00:39:31
Speaker 2: white-collar jobs.
00:39:35
Speaker 2: Think of these platform companies and their jobs: worker after worker is not even an employee.
00:39:48
Speaker 2: You have an entire economy in this country where
00:39:54
Speaker 2: increasing automation and increasing machine learning have enabled platforms,
00:39:59
Speaker 2: platforms built on machine learning, and because of the popularity
00:40:04
Speaker 2: and profitability of these platforms,
00:40:08
Speaker 2: and because of
00:40:13
Speaker 2: no policy on gig work whatsoever,
00:40:16
Speaker 2: our entire workforce is slowly changing into gig work, and
00:40:21
Speaker 2: this is extremely dangerous, in the short and the long
00:40:37
Speaker 2: term, for jobs overall.
00:40:44
Speaker 1: But the biggest problem, and I would want to get
00:40:48
Speaker 1: Shamim into this, is that
00:40:51
Speaker 1: even us, when we were talking about it, we were
00:40:55
Speaker 1: only considering all of these higher-level jobs, like creatives
00:41:00
Speaker 1: and all of that. But the actual jobs at stake, we aren't
00:41:03
Speaker 1: talking about those kinds of jobs, right? And we
00:41:06
Speaker 1: also don't associate them with AI, that these are the jobs
00:41:12
Speaker 1: AI could replace. Especially
00:41:17
Speaker 1: after listening to him, after listening to him, my whole point of
00:41:22
Speaker 1: view has changed. This problem that he raised had never
00:41:25
Speaker 1: come to my mind. Exactly, yeah. The question coming
00:41:28
Speaker 1: to my mind is: what should we do?
00:41:31
Speaker 1: Because he said that individual contribution is very small, so
00:41:35
Speaker 1: that's why we need laws and policies. But how we
00:41:39
Speaker 1: can
00:41:39
Speaker 2: help, like, at an individual level:
00:41:42
Speaker 2: do what is best for you. At an individual level,
00:41:46
Speaker 2: I encourage Shamim, because he has a platform: you should encourage
00:41:48
Speaker 2: people to learn these skills, so that they benefit at an individual level.
00:42:01
Speaker 2: But as a, as a policy professor, as somebody who
00:42:08
Speaker 2: looks at the entire society,
00:42:11
Speaker 2: I look at masses of individuals, not individuals. Not all of us have English, an advanced degree. Exactly.
00:42:34
Speaker 2: Government policies and laws,
00:42:38
Speaker 2: at the end of the day, they also come from
00:42:41
Speaker 2: the popular discourse. Right now the popular topic is only whether AI will cancel jobs,
00:43:07
Speaker 2: or
00:43:08
Speaker 2: which jobs it will take. And online content creators
00:43:23
Speaker 2: shape our conversation, towards low-quality content or high-quality content. So
00:43:34
Speaker 2: the point I'm trying to make, especially with Shamim, because
00:43:36
Speaker 2: Shamim has a large audience on YouTube, right, is: encourage debates,
00:43:45
Speaker 2: not just scholarly debates. In academia there are major debates; there is even a conference on fairness, accountability, and transparency, the FAccT
00:43:56
Speaker 2: conference, about AI and policies and the actual dangers related to these technologies.
00:44:13
Speaker 2: The actual danger is boring, not sensational.
00:44:22
Speaker 1: Gradually, exactly. I was just saying, because maybe it is
00:44:43
Speaker 1: the same, the same way, in the medical field. Maybe, like,
00:44:49
Speaker 1: we should be careful, because in medicine it is said AI will be
00:44:56
Speaker 1: the most useful.
00:44:58
Speaker 2: Right now?
00:45:00
Speaker 2: Yes, you should be careful, for a simple reason. India publishes a
00:45:05
Speaker 2: lot of data for free on the open government data website; I think 52 percent of the
00:45:10
Speaker 2: downloads are health data. Health data
00:45:18
Speaker 2: is commercially the hottest field for machine learning.
00:45:24
Speaker 2: There are big companies betting on telemedicine: telemedicine means remotely chatting
00:45:33
Speaker 2: with a doctor through a chatbot, which spread with COVID. But basically, a doctor's value is the training,
00:45:40
Speaker 2: the science, the methodology, as much as the technology. Techno-solutionism,
00:45:52
Speaker 2: techno-solutionism, is the bad idea that any
00:45:56
Speaker 2: social problem can be solved with technology. If the problem is reaching a doctor,
00:46:09
Speaker 2: the chatbot answer assumes a stable internet connection, a smartphone, English.
00:46:27
Speaker 2: Is the problem solved? The problem is doctors; it is solved by colleges, by education, not by calling a chatbot
00:46:39
Speaker 2: a doctor. It is a medical capacity problem. And there is more:
00:46:56
Speaker 2: America makes the research, on American medical data. When hospitals use machine learning
00:47:04
Speaker 2: on cases, the patients have to be very patient:
00:47:17
Speaker 2: for a malnourished, poor, or Black patient, without local data, every time the machine
00:47:31
Speaker 2: made a medical decision, a lot of the time it
00:47:35
Speaker 2: was a medically incorrect decision.
00:47:39
Speaker 1: This is where the devil's advocate point comes in,
00:47:41
Speaker 1: where people say that the technology can reach the people healthcare excludes.
00:48:01
Speaker 2: The main difference is that healthcare is healthcare; it is not like any other product.
00:48:11
Speaker 2: So then,
00:48:12
Speaker 1: would you, would you advocate for removal of things like
00:48:16
Speaker 1: AI in matters of healthcare and education, or more?
00:48:20
Speaker 2: I would advocate for, I would advocate for,
00:48:24
Speaker 2: no, I would advocate for rational decision-making on who
00:48:28
Speaker 2: owns that right. Should AI's use
00:48:33
Speaker 2: cases be decided by individuals and private companies for profit-making,
00:48:38
Speaker 2: or should we as a society sit and decide the correct uses? There are
00:48:43
Speaker 2: examples of where
00:48:48
Speaker 2: it is genuinely useful, popular uses,
00:49:06
Speaker 2: elections, for example. But suppose the use is not okay:
00:49:31
Speaker 2: you can't, as a customer, ask for the company to
00:49:34
Speaker 2: change its policy, but as a citizen you can ask
00:49:37
Speaker 2: the government to change its policy. Services and
00:49:55
Speaker 2: citizenship are different things.
00:50:01
Speaker 1: So, to sort of conclude: the policy, where the government
00:50:06
Speaker 1: comes in on AI, is to regulate for what and for
00:50:11
Speaker 1: what not AI should be used. That is where the
00:50:15
Speaker 1: government should come in, and it should clearly dictate which
00:50:18
Speaker 1: use of AI
00:50:19
Speaker 1: is okay. So that would also cover all of our
00:50:23
Speaker 1: creative fields. Exactly. What in your opinion would be an
00:50:29
Speaker 1: okay use for AI?
00:50:30
Speaker 2: More than "okay", I would say that, first of
00:50:33
Speaker 2: all, this process, this process to decide what the government
00:50:37
Speaker 2: should do, you should not just hand it to the
00:50:39
Speaker 2: government like that.
00:50:45
Speaker 2: It has to be a democratic process, right down to
00:50:59
Speaker 2: the public being responsive. Is DALL-E
00:51:04
Speaker 2: the danger? I don't think so.
00:51:10
Speaker 1: With this chat also, that was our objective. Yes, we
00:51:15
Speaker 1: wanted to talk about DALL-E and ChatGPT, and all
00:51:18
Speaker 1: of these are, you know, certainly nice, popular issues, but
00:51:23
Speaker 1: as I took your views, and even from
00:51:30
Speaker 1: your understanding of it, these are not the biggest
00:51:33
Speaker 1: issues that we should be worried about. I think the
00:51:36
Speaker 1: episode should be titled "We Are Worried About the Wrong AI". Oh yeah,
00:51:41
Speaker 1: I would like to add one thing, which is that, yes,
00:51:44
Speaker 1: as I mentioned in my video also, okay, somehow I
00:51:48
Speaker 1: feel that we are being targeted, like the creative people
00:51:51
Speaker 1: are being targeted. Everybody is saying your job will go, your
00:51:54
Speaker 1: job will go. Today I understood what AI is and
00:51:57
Speaker 1: how AI is dangerous. Exactly. So I want to thank
00:52:00
Speaker 1: you guys, thank you for doing this with me, because
00:52:03
Speaker 1: it was such a privilege, because, um, you just opened
00:52:07
Speaker 1: my mind, truly.
00:52:09
Speaker 1: I'm feeling good also that the way I use AI
00:52:13
Speaker 1: is not the most dangerous one. Don't you think this
00:52:16
Speaker 1: is also like a self-romanticization? A lot of even
00:52:20
Speaker 1: creative people are thinking that we are the victims of
00:52:26
Speaker 1: it. I don't think that is the case. We are
00:52:29
Speaker 1: the most privileged people. Um, we use it as a toy;
00:52:34
Speaker 1: we have the option to choose to use it
00:52:37
Speaker 1: or not use it. This is a playground for
00:52:40
Speaker 1: us right
00:52:40
Speaker 2: now. Exactly. But does a taxi driver have that option, when Uber has
00:52:48
Speaker 2: out-competed the taxis from the market? So that is
00:52:51
Speaker 2: very important, this point of
00:52:54
Speaker 2: our ability to confuse the actual reality with our reality,
00:52:59
Speaker 2: the reality of a very small group of elite people who consume these tools.
00:53:07
Speaker 2: But I'm very happy, Prateek and Anjali, that you invited
00:53:11
Speaker 2: me.
00:53:12
Speaker 2: My own community criticizes us academics for this problem: very few
00:53:23
Speaker 2: of us talk to the common masses. We write papers upon
00:53:26
Speaker 2: papers, papers upon papers, and there are very few platforms that take these issues
00:53:33
Speaker 2: seriously. But okay, so first of all, I would really
00:53:36
Speaker 2: like to thank The Quint for inviting
00:53:38
Speaker 2: me, for giving me a small chance to translate
00:53:42
Speaker 2: some of these issues into the common language. I hope
00:53:45
Speaker 2: I was a little successful at that.
00:53:48
Speaker 1: Thank you. So, in closing, I would just like to
00:53:53
Speaker 1: ask, um, what is the first thing, because obviously the
00:53:56
Speaker 1: recommendations would be endless: what is the first
00:53:59
Speaker 1: thing that people who have just heard this chat should
00:54:02
Speaker 1: go and read up
00:54:04
Speaker 2: about? So, there is a
00:54:07
Speaker 2: paper I would like people to read. The paper is
00:54:11
Speaker 2: called "On the Dangers of Stochastic Parrots"; stochastic simply means probabilistic.
00:54:17
Speaker 2: This paper was written by a bunch of researchers in
00:54:22
Speaker 2: the U.S. It is a very nice introductory paper
00:54:26
Speaker 2: to understand the dangers of language models; a language model is ChatGPT,
00:54:31
Speaker 2: anything that generates language. Also, the paper is written, I
00:54:34
Speaker 2: think, in a very easy-to-understand English which anybody
00:54:38
Speaker 2: can read. You don't need to be a machine learning professional.
00:54:41
Speaker 2: This would be an interesting paper to start off with,
00:54:44
Speaker 2: to really realize that these artifacts are not
00:54:48
Speaker 2: operating in a vacuum, that they have a social impact,
00:54:51
Speaker 2: and what those social impacts can be. So this could
00:54:53
Speaker 2: be an interesting paper to begin with. I wouldn't say this
00:54:57
Speaker 2: is, you know, the end of the conversation, but the
00:55:00
Speaker 2: beginning of the conversation for somebody. I am not the
00:55:04
Speaker 2: first person who calls AI a parrot.
00:55:08
Speaker 1: From this conversation, one thing that I will carry for
00:55:15
Speaker 1: the rest of my life is "AI is a tota", a parrot. The starting
00:55:18
Speaker 1: question we started with was AI and art and
00:55:22
Speaker 1: creative jobs, whether the logic of creative work holds, and
00:55:31
Speaker 1: the point where we ended was the
00:55:35
Speaker 1: actual justice system, the labour market. So I would like to
00:55:51
Speaker 1: close with you as well, going back to our
00:55:54
Speaker 1: starting questions.
00:55:57
Speaker 1: What do you think, after you've heard all of this:
00:56:01
Speaker 1: will your art exist or not? What I personally feel,
00:56:07
Speaker 1: after this long conversation and after learning so much from
00:56:10
Speaker 1: them, is that,
00:56:11
Speaker 1: we are being targeted because we can be targeted, because
00:56:15
Speaker 1: some people want AI in everything without any regulation,
00:56:19
Speaker 1: and we are the easiest people to talk to, because they'll
00:56:22
Speaker 1: get people like me who will say AI is making my
00:56:25
Speaker 1: life easier. It is actually so interesting that
00:56:34
Speaker 1: you say that, because I consider myself a very, you know,
00:56:37
Speaker 1: tech-savvy person. Every time some Google Assistant feature comes, I really,
00:56:42
Speaker 1: I get excited. That's why, as soon as AI art, DALL-E,
00:56:46
Speaker 1: all of this came, I was very excited,
00:56:48
Speaker 1: and obviously I am not alarmist about it. I had
00:56:55
Speaker 1: a very favorable view of it. So I think,
00:57:08
Speaker 2: Open any newspaper and you will find alarmist article after article after article. Because I
00:57:16
Speaker 2: was talking about some actual issues, but they had
00:57:22
Speaker 2: reproduced the results they had read in some McKinsey report
00:57:26
Speaker 2: somewhere. Is that an academic organization? It's a company, a consulting company.
00:57:38
Speaker 2: Yes, sensationalism sells, but I prefer the boring truth to imagination. I wouldn't choose
00:57:59
Speaker 2: sensationalism.
00:58:16
Speaker 1: That is what we are trying to do with this podcast. The actual issue
00:58:21
Speaker 1: is, like, much deeper and, honestly speaking, boring, but still, yeah, interesting.
00:58:29
Speaker 1: Which is sad. But I think we tried our level
00:58:33
Speaker 1: best to make it an interesting conversation, and thanks a
00:58:36
Speaker 1: lot for that. Very
00:58:38
Speaker 2: happy to talk to you guys. Also very happy to
00:58:40
Speaker 2: meet you. I looked at the channel, I find it
00:58:43
Speaker 2: very nice. So I'm glad you are doing what you're doing.
00:58:46
Speaker 1: Thank you. Thank you so much. Thank you.
00:58:48
Speaker 1: Okay, so, Anjali, thoughts? First thoughts: more than, more than exciting. Yeah.
00:58:56
Speaker 1: What, what were your expectations before we started? I really
00:59:01
Speaker 1: was expecting just, you know, Shamim telling us how he's
00:59:05
Speaker 1: not threatened by AI, and me talking about "you should
00:59:09
Speaker 1: be, it might, it might get so powerful that it will
00:59:11
Speaker 1: take over your job." But
00:59:12
Speaker 1: I think it went into a whole other direction. I
00:59:15
Speaker 1: was expecting, like, a mystification of AI, because usually
00:59:19
Speaker 1: when people talk about AI in popular culture it is some scary entity. But
00:59:27
Speaker 1: now my opinion on AI has changed so much, after,
00:59:32
Speaker 1: obviously, the catchphrase of this episode: AI is a parrot.
00:59:40
Speaker 1: And every time, you know, I feel like, oh, it's
00:59:42
Speaker 1: going to take over the world, it's going to do
00:59:44
Speaker 1: this and that, the image of a cute little parrot will come to mind.
00:59:50
Speaker 1: At the same time, I feel that we've discovered it's
00:59:53
Speaker 1: not all that good, and that it solely depends on
00:59:58
Speaker 1: what it is used for. And I think we've drawn
01:00:00
Speaker 1: a very clear distinction between using AI for art
01:00:04
Speaker 1: and using AI to replace
01:00:06
Speaker 1: human consciousness, I would say. It was interesting, the case
01:00:13
Speaker 1: study that he talked about, the criminal trials in the U.S.
01:00:16
Speaker 1: That is scary, and it's way scarier than
01:00:21
Speaker 1: all your I, Robot-type stuff.
01:00:28
Speaker 1: This can have real-world implications. It's definitely
01:00:33
Speaker 1: something that these giant corporations are looking at, and we
01:00:38
Speaker 1: as citizens have to be aware, and we have to
01:00:40
Speaker 1: be worried about the right AI,
01:00:42
Speaker 1: and listen to more academicians. Yes. And that just goes
01:00:47
Speaker 1: beyond, I think, everything that is popularly discussed. Like,
01:00:53
Speaker 1: we should not be listening to influencers talk about it,
01:00:55
Speaker 1: but listen to academics. Or just listen to The Big Story! Yes,
01:01:00
Speaker 1: that's because we will get you access to these crazy,
01:01:05
Speaker 1: crazy intelligent academicians. So I think that's it for this episode.
01:01:10
Speaker 1: It was a great chat. Let us know in
01:01:14
Speaker 1: the comments what you thought about it, what you think about
01:01:17
Speaker 1: AI: do you think it is a parrot, or what animal do you
01:01:21
Speaker 1: think it is? And share this with your friends, share
01:01:25
Speaker 1: this with your parents, share this with anyone, anyone, anyone
01:01:29
Speaker 1: who talks about AI.
01:01:35
Speaker 1: What are we going to talk about in the next
01:01:38
Speaker 1: episode of The Big Story? So, after the grand opening
01:01:42
Speaker 1: of AI, art, and
01:01:45
Speaker 1: policy, let's talk about media. And we want to talk
01:01:49
Speaker 1: about a very interesting aspect of media which is media trials.
01:01:53
Speaker 1: It's like this unique combination of media and the judiciary,
01:01:57
Speaker 1: what role they play in each other's functioning and we're
01:02:02
Speaker 1: going to break down the entire concept of popular media
01:02:05
Speaker 1: trials that you've seen in the past decade.
01:02:12
Speaker 1: Okay, so, tune into the next episode. We have two
01:02:15
Speaker 1: amazing guests with us. So this was Prateek, this was Anjali,
01:02:20
Speaker 1: and this was The Big Story. Thank you guys, bye-bye.
01:02:26
Speaker 1: The Big Story is a Quint original podcast, executive produced
01:02:30
Speaker 1: by Shelly Walia. And this episode was hosted, produced, and
01:02:33
Speaker 1: edited by Prateek and Anjali, and it uses
01:02:37
Speaker 1: theme music from BMG Production Music. A special thanks to
01:02:40
Speaker 1: our guests, Dr. Anupam Guha and Shamim Mokles.
01:02:44
Speaker 1: Yeah,
01:02:47
Speaker 1: you were listening to The Quint's podcast.