S3 Finale Episode: Fun, Follies, and a Fresh Look at Tech
3 Techies Banter #3TB · October 22, 2024 · 00:40:19


Join us in the Season 3 Finale episode, where we reminisce about our favourite moments from the podcast! From the hilarious creation of the jingle episode (an all-round friends and family collaboration) to deep dives into the impact of AI on graphics and the fashion industry, this episode is packed with lots of laughs and a wee bit of insights. We also tackle the tricky business of AI "bullshitting" and how machines love to please, even if it means getting things wrong.

PS - Liars care about the truth (to hide it), bullshitters just want to impress or persuade, truth be damned! Welcome to GenAI.

Plus, a little humble bragging about the awards we have won and a sneak peek into plans for shaking things up in the next season.

We can’t wait to be back in Season 4. Until then…


[00:00:02] Hi, I'm Samiran. Hi, I'm Nilesh. Hi, I'm Sheetal. And you're listening to 3TB, 3 Techies Banter.

[00:00:10] So this is actually a recording after a very, very long time. So we, I think, have probably forgotten how to record only. I was actually having a hard time setting up my mic also. Anyway.

[00:00:24] So, in fact, when we embarked, you know, we had a logical end to season two, the good, the bad, the fun we had. And then we did some specials. And then we kind of just went on and the thinking was that this is going to be one continuum, you know, we'll kind of record, record, record, record.

[00:00:47] Somewhere down the line, we realized, you know, that if we have to do things differently, if we have to course correct, we need to take a break.

[00:00:57] So that begs the question, you know: how do you end something which you kind of never really started?

[00:01:05] Because when I started looking at the start of season three, I couldn't find an episode which started it. But anyway, here's our effort to try and end something that we probably didn't want to end or didn't start in the first place.

[00:01:22] So the way we are going to try and do this is, Sheetal, Nilesh, and I have kind of gone through this whole body of episodes.

[00:01:35] And we are going to talk about our favorite or our best, our most memorable episodes and, you know, why we had the most fun doing it.

[00:01:49] And what we felt would be best is, of course, for me to start.

[00:01:56] And I'm going to start with an episode which is not an episode really.

[00:02:01] So, I mean, I think that's probably expected of me.

[00:02:04] So, my favorite episode was actually the jingle episode.

[00:02:23] And before you kind of start wondering why I said that, so while the jingle itself, you know, was just a song.

[00:02:33] If any of you have kind of actually seen the jingle, it was a fantastic process.

[00:02:40] I mean, all of us were part of it.

[00:02:42] And in fact, it was probably one of the only things where a lot of our friends participated.

[00:02:49] So, you know, someone wrote the lyrics.

[00:02:54] It was outsourced.

[00:02:54] Someone set it to music.

[00:02:57] Totally.


[00:03:01] I mean, I think one day out of the blue, Saloni, my wife, she came up with the idea, oh, you should do a jingle.

[00:03:07] And, you know, I mean, as is typical, I got very excited.

[00:03:10] And that was the end of it.

[00:03:11] But then I didn't know what to do.

[00:03:12] Then someone said, okay, you know, I'll write the lyrics.

[00:03:16] And then someone said, you know, then we thought, let's just sing it.

[00:03:20] And then we were told, we have to set it to music.

[00:03:22] You've got to kind of get it recorded.

[00:03:24] Then I think Sheetal and I spent tons of time trying to kind of set the words back to the music, you know.

[00:03:31] And we didn't even know that was supposed to be done.

[00:03:33] And I think the most hilarious part was the video.

[00:03:37] I think we kind of made complete jackasses of ourselves doing that video.

[00:03:41] So, I mean, if you haven't seen the YouTube video, you must go check it out.

[00:03:44] So, I think all in all, the jingle, I felt, kind of captured the spirit of why we do the podcast.

[00:03:53] And also embodies the fact, you know, that how much fun we have doing it.

[00:03:58] And, of course, it was all about technology, if you care to listen to the lyrics.

[00:04:02] So, that's my absolute favorite.

[00:04:05] So, I absolutely love the process that we followed for the jingle.

[00:04:11] Relic joined us long distance.

[00:04:13] And, of course, he was part of the video because we shot that in Dublin.

[00:04:17] To the jingle and to the madness of the jingle.

[00:04:20] So, yes, the lyrics of the jingle make sense.

[00:04:22] And I think we just appropriated some stuff which couldn't have been appropriated otherwise.

[00:04:28] So, I think it was great fun all around.

[00:04:32] Coming to the episode.

[00:04:33] So, since we're talking about episodes, coming to the episodes, one of my favorite episodes really is amongst the first ones that we did.

[00:04:41] And we started with this whole theme called Roti Kapra Makaan.

[00:04:46] And true to our style, while the theme was Roti Kapra Makaan, we decided to start with Kapra and not with Roti.

[00:04:52] God only knows why.

[00:04:53] So, in the Roti Kapra Makaan theme, we covered something on Kapra, which was all about fashion tech.

[00:05:01] And I think that's one of my few favorites for two reasons.

[00:05:06] One is that that episode has so much of fun in it, but it has so much of fact.

[00:05:13] It has so much of data made available, right?

[00:05:17] So, whether the fact that it takes 2,700 liters of water to make a t-shirt, which is equivalent to 900 days of water for one person,

[00:05:27] kind of got you thinking, saying, should I really be buying more t-shirts today?

[00:05:32] So, that was the kind of data that was sitting in that episode.

[00:05:35] What was also interesting was we had Rajesh, Samiran's close friend, and I'm therefore not going to call him out by his first and last name.

[00:05:45] I'm just going to call him Rajesh, who came on the show.

[00:05:52] And he talked about how tech is being leveraged in so many ways in the fashion industry, right?

[00:05:58] From, you know, Gen AI today, which is like a buzzword, which is used to do 3D modeling,

[00:06:05] to RFID and geotagging, which actually adds to Nilesh's favorite subject called authenticity and authentication.

[00:06:14] So, it's really one of those few, it's one of those episodes which got me super excited.

[00:06:22] I love the fashion industry anyways.

[00:06:24] So, this was like the perfect episode for me.

[00:06:27] Nilesh, I don't know which one is your favorite, but...

[00:06:31] Before I jump into my favorite, I think the fashion one, at least two people told me that they loved the graphic also, right?

[00:06:43] And it's all thanks to Samiran.

[00:06:45] Two people actually told me that they really liked the graphic associated with the small thumbnail that comes on Spotify.

[00:06:53] And honestly, that kind of brings me to one of my favorite episodes, which is Gen AI.

[00:07:01] So, we have been using a lot of AI tools to generate graphics.

[00:07:06] It's not as simple as just telling ChatGPT or other tools to just spit out an image.

[00:07:16] It takes a lot of effort and Samiran does that and some of them are fantastic.

[00:07:21] So, for me, I don't know whether we did it immediately after Roti Kapra or Makaan, we did a lot of episodes on AI, right?

[00:07:31] And since then, honestly, there have been lots of ups and downs and many people are now calling that AI might also be like that whole .com buzz that had happened.

[00:07:41] And, you know, there must be some boom-followed-by-a-bust kind of stuff for Gen AI.

[00:07:49] But I think it has changed a lot of things and the way we do those things, right?

[00:07:58] So, for us, it was obviously generation of images.

[00:08:02] But one thing that's the interesting part is all these episodes lead to a lot of research, right?

[00:08:09] And for me, I mean, all of us, we do a lot of research to bring out these things.

[00:08:14] And in the Gen AI, I love the fact that, you know, through the readings and research, a beautiful, simple way to explain the world of AI and the available data was one of the articles which said that, you know, think of a master chef, you know, going to a forest.

[00:08:37] And he is told to cook, you know, Michelin class dish using just the produce that is fallen on the ground, right?

[00:08:48] So, which meant that even if you are in a forest and there is all the fresh produce, you are limited to forage what is on the ground.

[00:08:56] And that kind of brought us to the conclusion that, honestly, the whole Gen AI is actually working on 4% of data that is actually available on the internet, right?

[00:09:09] So, you're talking of 4% of data and that is how the things are being generated.

[00:09:14] When you think of that number, honestly, amazing stuff is being generated even at 4% and it begs all of us to think that, you know, what if that the availability of data, the richness of data goes up even by 10% and suddenly the output of Gen AI would be something else.

[00:09:36] So, in a nutshell, I think, I don't think, I mean, there may be naysayers, but I think Gen AI is here to stay.

[00:09:46] And I think, yeah, that was the whole idea of the research and understanding.

[00:09:50] You know, interesting, sorry Samiran, but Nilesh, what's interesting is that when we first started, right, a couple of years ago, we were talking about AI as a trend which is going to continue.

[00:10:02] And lo and behold, one and a half years later, Gen AI had kind of taken over and, you know, everybody was now starting to use ChatGPT, blah, blah, and all of that.

[00:10:12] And I was just reading, the other day I was reading an article by Pascal Bornet and he talks about AI obesity.

[00:10:19] Okay.

[00:10:19] So, look at the timelines, right, from in three years, we've been like on this podcast for three years.

[00:10:27] You had from not really everybody knowing and understanding AI to Gen AI becoming a boom.

[00:10:35] And now already the negativities of Gen AI are coming to the fore, right?

[00:10:41] And obesity, because you talked about food and foraging and things like that, you know, AI obesity is a term that Pascal Bornet uses.

[00:10:49] And I thought it was fascinating that in one and a half years, we've already become obese on our AI diet.

[00:10:59] And in fact, this foraging, so this, I think we probably talked about this, but there was this amazing cookbook that I had come across.

[00:11:06] And don't ask me why I came across it again, but it was called the Antarctica Cookbook.

[00:11:10] Okay.

[00:11:11] And this was written by these people who were living in this Antarctic base station.

[00:11:16] And lo and behold, obviously, you know, they had a, they had recipes, but they could never make everything, right?

[00:11:23] Because you had only 40% of the ingredients.

[00:11:26] So then they wrote a cookbook about, you know, how to make dishes with, you know, because they had to mix and match all kinds of things.

[00:11:33] So, you know, if you had to, I don't know, maybe make pasta, you probably had to use bananas.

[00:11:38] And, you know, if you had to make some omelette, you probably had to use apples.

[00:11:42] I mean, I have no idea, but it was such an amazing, you know, I mean, it was a comedy.

[00:11:47] And at the same time, it was like super, super interesting.

[00:11:49] And I think, remember the initial days of Midjourney, right?

[00:11:56] The thing that you are saying that in Antarctica, if you had 40% of stuff and you had to make a dish, you mix and match.

[00:12:02] I remember us creating those images, having maybe six fingers and two noses.

[00:12:09] Oh, yeah, yeah, yeah.

[00:12:11] Mixing.

[00:12:12] I mean, oh my God.

[00:12:13] I mean, in fact, we had this whole thing about how Midjourney was not able to create a single banana.

[00:12:20] So then they kind of went and fixed it.

[00:12:22] And then I tried with a single grape and then that was difficult.

[00:12:26] But maybe they fixed that also.

[00:12:27] But yeah, it is that the logic is very, very flawed in some way.

[00:12:34] But yeah, I guess it's work in progress.

[00:12:35] So in fact, that kind of, you know, so I think like Sheetal said, this naysaying or, you know, skepticism of AI and all is there.

[00:12:47] You know, I've been kind of just trying to figure out, you know, why is there so much worry?

[00:12:52] So very, very interesting piece of information that I gathered is that modern artificial intelligence is kind of, you know, Turing plus.

[00:13:04] It's like the 1950s, '56, you know, there was this big conference, blah, blah.

[00:13:08] So because we call it artificial intelligence, you know, we kind of attribute a lot of human qualities and cognitive, you know, qualities to this piece of technology.

[00:13:27] So there are a lot of people who kind of believe that that's not true.

[00:13:31] And in fact, there was a conference on AI in the, I think it's some, I don't know, maybe it was between technologists in the church.

[00:13:38] It was called the Pontifical Academy of Sciences in Rome or something.

[00:13:42] So they actually hypothesized that, you know, you should not be calling it artificial intelligence.

[00:13:48] You should kind of rename it and call it something which defines its scope.

[00:13:54] And so they came up with this very interesting name, which kind of defines it perfectly.

[00:14:00] It's called Systematic Approaches to Learning Algorithms and Machine Inferences.

[00:14:08] And if you take the acronym of that, it actually turns out to be SALAMI.

[00:14:15] And the whole intent there was that, you know, the moment you call, you know, when you call something artificial intelligence, it sounds profound.

[00:14:23] You call something salami and you can never then say the salami has emotions or salami will acquire personality or, you know, can you fall in love with the salami?

[00:14:33] Can a salami become more superior?

[00:14:34] So, I mean, that is one way of going with that.

[00:14:37] So, on to my second kind of favorite thing.

[00:14:42] And that also, you know, is kind of a bit overarching and it's a mix of what Nilesh talked about.

[00:14:48] It's got to do with this whole process of shorts, you know.

[00:14:54] So, I think the shorts is a bit like a jingle project for me because starting from, you know, coming up with, of course, coming up with the facts themselves, you know, that itself is kind of a whole different story.

[00:15:11] Then trying to say all kinds of crazy stuff and then trying to figure out a way to illustrate that.

[00:15:19] Because, you know, you can say anything.

[00:15:20] You can talk about baboons and bananas and, you know, mummies needing passports and what have you.

[00:15:25] But now you have to illustrate that.

[00:15:27] So, that used to be one challenge.

[00:15:28] The second challenge used to be come up with a title.

[00:15:32] You know, every time you came up with, said something ridiculous, you know, now you have to kind of compress that into a title which kind of explains it.

[00:15:41] And then, of course, there was the description and all of that.

[00:15:44] But I think the beauty of the shorts creation, I think we all kind of underwent that.

[00:15:50] You know, you read such interesting stuff and it sparks so many other, you know, crazy ideas.

[00:16:00] And I mean, I think it just develops and encourages a process of, you know, just reading, you know.

[00:16:11] I mean, and I remember I found this thing about the ass fruit and, you know, what have you, goats and Google grazing.

[00:16:19] I think we talked about James Bond.

[00:16:21] Why does James Bond have it shaken, not stirred?

[00:16:25] And it's got to do with temperature, blah, blah, and beer flooding and whatever.

[00:16:28] So, I think the whole process was just crazy.

[00:16:31] It's super fun.

[00:16:33] And I think, again, we've had a lot of people telling us that, you know, when we are stuck in traffic, we kind of listen to this.

[00:16:39] When we have, you know, when we're just bored.

[00:16:44] And in fact, the classic case, I think, was Sheetal's nephew who actually wanted to buy something that we talked about.

[00:16:50] And we had no clue until he said it was those inflatable jeans as a self-defense mechanism.

[00:16:58] Right.

[00:16:59] So, you know, Samiran, every time you kind of talk about naming our shorts, I always thought you were inspired by Young Sheldon.

[00:17:09] Because every time, because of all the episodes that Young Sheldon has, they're all similar sounding.

[00:17:16] So, you know, Young Sheldon has episodes like A Patch, a Modem, and a Zantac and, you know, I don't even know what.

[00:17:23] A Patch, a Modem, and a Zantac.

[00:17:25] Or, you know, there's another one which is called A Brisket, Voodoo, and Cannonball Run.

[00:17:32] And I always felt that our episodes, and it was quite strange because I was watching Young Sheldon at the time that we had just started.

[00:17:42] But you were the one who was putting together the names.

[00:17:45] And I kept thinking, my God, is this like Serendipity or is it that he's also...

[00:17:50] You know, actually, I had no clue.

[00:17:52] I had no clue.

[00:17:53] I mean, I used to just try to do it to kind of capture the, whatever the, you know, get people to, you know, give them a sense of the episode without revealing too much.

[00:18:02] Exactly what Young Sheldon is all about.

[00:18:04] Okay.

[00:18:06] So I'm not, I'm not so young Sheldon then.

[00:18:08] You're the old Sheldon.

[00:18:11] So I thought, I always used to enjoy the way we used to craft.

[00:18:16] Though I must admit that all our episodes are crafted that way, not just the shorts.

[00:18:21] I think all our episodes have very interesting names.

[00:18:24] And that's something that a lot of people have called out to me saying that it's the artwork and the name which makes it so interesting.

[00:18:32] It also communicates everything we stand for, right?

[00:18:36] The fact that it's a fun tech podcast and not a boring podcast where if you see a lot of the other names, they're very to the point, very jargonistic in their titling.

[00:18:50] Whereas ours are always fun.

[00:18:52] It creates a sense of curiosity and you kind of say, listen, I want to know what they're doing over there.

[00:18:59] So I think we should continue to keep that.

[00:19:01] So I think, for the longest time, I mean, we spent a lot of time doing that.

[00:19:08] And sometimes I wonder whether it's right.

[00:19:10] But I think the artworks and just all of that itself is a great repository of just, you know, if you kind of, if you look at all the artworks together now,

[00:19:21] then you really, oh, wow, you know, it's like there's so much fun, you know, just looking through it.

[00:19:26] And in fact, so I have to tell you this, so that, you know, you realize that how different tools behave.

[00:19:35] Okay. So I have been like playing around with multiple of them because, you know, the same thing doesn't give.

[00:19:41] So, you know, DALL-E behaves in a very different way from Midjourney.

[00:19:46] And then I've tried some of these, the Adobe Gen AI and all that.

[00:19:51] So each one of them, their prompts, and Runway and all of them.

[00:19:54] Each one of them, though you may think that it is some LLM tool, but it behaves differently.

[00:20:01] And at one point, I kind of get the unnerving feeling that they are probably kind of, you know, developing a sense of humor and becoming comical.

[00:20:10] Or they're kind of becoming like me, you know, because I say something and they kind of come up with something really stupid.

[00:20:15] That is specific to you.

[00:20:18] There's nothing to do with it.

[00:20:22] Absolutely.

[00:20:24] Do you want to, Nilesh, kind of talk about the shorts that you love the most?

[00:20:28] Because I have a lot of favorites, but honestly, finding, just as it was difficult finding episodes that I could say they are my favorites.

[00:20:36] Like I, on the episodes front, I love two other interviews we did.

[00:20:40] One was with Rahul Matthan on the entire thing of, you know, how should I say, on privacy, on GDPR,

[00:20:51] on the fact that India has developed a third way of, you know, looking at data privacy and data protection and things like that.

[00:21:00] That was a fabulous episode.

[00:21:03] Another episode was on the India FinTech, which we did.

[00:21:07] So I think there are so many such episodes which were great learnings, etc.

[00:21:12] But finding those episodes and saying, okay, this is amongst my favorites was really a tough one for me.

[00:21:18] And it became equally tough when I started doing that with the shorts.

[00:21:22] I mean, every short I would look at and say, I love this.

[00:21:24] And then I'd see the next one and say, I love this one even more.

[00:21:27] And it never ends, right?

[00:21:29] Because you had so much fun doing these shorts.

[00:21:32] And, you know, the most fun I used to have, at least for myself, is that when I think Varsha has to send us the episodes for proofing.

[00:21:42] So, you know, you record it and you think it's fun, and then you hear it back, and then I just start laughing.

[00:21:48] I said, my God, this is so funny.

[00:21:50] And then we feel, oh God, this is us.

[00:21:52] So, you know, I mean, when probably three of us had that chemistry and we enjoyed our banter, you know, the name itself says it.

[00:22:05] And but people have actually told me that, you know, it is very courageous that you, you know, to do a podcast, put something out there.

[00:22:15] And in our case, I mean, touch wood, it came naturally.

[00:22:19] And when we heard it back, it didn't sound so stupid at all, right?

[00:22:24] I mean, we enjoyed it.

[00:22:25] So, I think either we have lost it completely, you know.

[00:22:32] Or we are too full of ourselves.

[00:22:36] So, whenever people used to tell me, that's so courageous that you put something out there, you know, a podcast and all that.

[00:22:41] I thought, okay.

[00:22:42] I mean, I was just having fun.

[00:22:44] But, okay, courage was not really the part of it.

[00:22:49] But, yeah, many people think it's something, you know, because you are putting something of yours in the absolute public domain, right?

[00:22:58] And anyone, whosoever comes across this, looks at it, hears it and has thoughts and comments.

[00:23:04] But I think it never bothered us that much because two reasons.

[00:23:08] I think, A, most of our episodes, as Sheetal correctly said, fun tech episodes, they were well researched.

[00:23:15] And we were putting forth valid, clear-cut thoughts and ideas when it came to the bigger episodes.

[00:23:22] And in shorts, there were also some messages in it as well as fun facts, right?

[00:23:28] And they were really, I mean, like the episodes, we used to research our shorts, right?

[00:23:35] Because these were not usual stuff you will find on a Google feed or your Android phone.

[00:23:40] So, for me, I think if I had to, and you're right, it was very difficult to pick which is your favorite episode or favorite short.

[00:23:49] But one of them was the one where we talked of a few very serious things and one very wacky thing.

[00:23:58] Most of the time, it is a mix of all.

[00:24:00] But I spoke in that episode of geopolitical swing states, right?

[00:24:06] And honestly, I came across this article, which was amazing, which said that India is one of the geopolitical swing states now.

[00:24:15] It lends a lot of strength to Indian politics.

[00:24:19] And we can see that in our foreign minister, how we deal with international situations, because we know that we hold that kind of a sway in the international circles.

[00:24:30] So, that was one part in the shorts.

[00:24:33] The other part was completely wacky about a peanut butter sandwich being stuck in a VCR.

[00:24:41] So, it was like, wow, you're talking of ChatGPT, you know, telling in, I think, some very biblical style how to remove a peanut butter sandwich that is stuck in a VCR.

[00:24:55] And then you're talking of India as a geopolitical swing state.

[00:24:58] Either we are very, that tells us we are very courageous.

[00:25:01] Yes.

[00:25:03] No, no.

[00:25:03] And believe me, it was so crazy trying to come up with an image.

[00:25:07] Because I think we kind of talked, I think, about, I think in that same episode, I think we talked about Donald Duck's sister or something.

[00:25:16] Yes, yes.

[00:25:16] I said, my God.

[00:25:17] I said, my God.

[00:25:18] You know, like, how do you put all of this together?

[00:25:21] And you know, the worst thing that's happened with all of this, I mean, kind of is that because we tend to write everything in a tongue-in-cheek way, I have realized that I started writing everything like that, which is very dangerous.

[00:25:34] You know, I find it very difficult to write a serious LinkedIn post, you know.

[00:25:38] So, then I just say something stupid in the middle of that.

[00:25:42] In fact, I think that's a silly thing about that Times Square thing, right?

[00:25:49] So, just put something there.

[00:25:51] But I couldn't help but mention the fact that, you know, how I'm trying to convince Sheetal and you to put up our hoarding in Bandra.

[00:25:59] If I can't put up our hoarding there, put up our hoarding on Times Square.

[00:26:02] So, Nilesh, after he sends me this thing that he has this 0.8 seconds of what I'm saying on Times Square, he writes to me and says, nobody knows you in New York, right?

[00:26:12] Put up our hoarding on Times Square.

[00:26:17] If not Bandra Kurla Complex, right?

[00:26:19] Let's just go to Times Square, right?

[00:26:25] That was the madness.

[00:26:26] And I was just thinking, if I allow him to do that, I don't know what our next shorts is going to be like, right?

[00:26:33] So, I just put a pause on it.

[00:26:35] Every time he sends me an article, he writes and says, what do you think of this?

[00:26:39] I'm constantly looking for those moments where he's done all the things that I would edit on the show.

[00:26:46] I'm saying, okay, maybe you need to cut this out or maybe you need to edit that.

[00:26:51] I just play a role, I think, of making sure he doesn't take everything that we don't allow him to say on the show into his articles.

[00:27:00] So, it's quite funny to do that.

[00:27:02] But my favorite one, Nilesh, just like yours, so mine doesn't have, actually, I don't know if it has anything serious in it, right?

[00:27:11] But it is so amazing because it is, again, very well researched, right?

[00:27:18] So, the one that I like is the one which has bananas, brown M&Ms and dinosaur pee, right?

[00:27:24] And if you actually listen to the short, while whatever the title may say, you really talk in depth about the Millennium Camera,

[00:27:34] which is there in Tucson, Arizona, right?

[00:27:39] Where they're recording a Millennium in one go.

[00:27:45] And that's so fascinating because while the name is so different, it doesn't tell you that there is the Millennium Camera in that shorts,

[00:27:55] which has been covered and captured.

[00:27:57] So, I love that one because it has this very serious database thing, which is a Millennium Camera.

[00:28:03] It also has a very interesting thing, which is a little bit, again, of our reflection,

[00:28:08] which is we bring in music and art and culture and everything into it.

[00:28:12] And therefore, the story of the brown M&Ms, right?

[00:28:15] Which is one artist or one group which wants all the brown M&Ms removed from their rooms, which is so fascinating.

[00:28:23] So, and then, of course, Samiran's favorite, which says that part of our DNA is the DNA of a banana.

[00:28:34] And the most crazy one on that one is really the fact that we all may be drinking dinosaur pee

[00:28:41] because that water has kind of recycled and come through.

[00:28:45] It took me off water for a couple of days.

[00:28:47] But that's how extreme it is, right?

[00:28:51] And I love the shorts for that, for the fact that they're extremely well-researched.

[00:28:56] But they're facts that catch you by surprise.

[00:28:59] They're not things that you read in, you know, in regular places.

[00:29:04] So, it's not like a book.

[00:29:08] What a nice way of teaching people a water cycle, right?

[00:29:12] Dinosaur pee and the water cycle.

[00:29:13] I mean, normally we used to learn like evaporation.

[00:29:17] I'm going to give you an idea to create another Midjourney artwork.

[00:29:23] On the cycle of water and how dinosaur pee is part of the water we drink.

[00:29:28] Might be an interesting one.

[00:29:30] So, that's really my favorite one.

[00:29:32] And yeah, Samiran, go ahead.

[00:29:34] Which is your favorite shorts?

[00:29:38] So, no.

[00:29:39] So, I think my, I mean, I think like my, it's like a family of shorts only.

[00:29:43] So, in fact, so, in fact, for me, when I was kind of thinking about all this stuff around generative AI and all that.

[00:29:58] So, I said, you know, let's look at, so all of these generative AI tools, you know, basically they have this problem that they aim to please.

[00:30:12] Okay.

[00:30:13] So, you know, if you ask it a question, it never tells you I don't know.

[00:30:17] So, the classic case in point, you know, was that we were actually celebrating my wife's bua's (aunt's) 80th birthday.

[00:30:26] And we were, you know, she was looking for songs which talked about cooking, you know, Hindi movie songs.

[00:30:32] They talked about cooking and all that.

[00:30:34] And somebody came up with some, you know, we looked at that regular movie like, you know, Bawarchi.

[00:30:38] And, you know, there might be some song like that.

[00:30:41] And we asked ChatGPT, you know, to give us the name of the song.

[00:30:43] So, ChatGPT named a movie, gave us the lyrics and everything.

[00:30:48] And we said, wow, I mean, you know, we didn't even know.

[00:30:50] And lo and behold, no such movie existed.

[00:30:53] Wow.

[00:30:54] That movie didn't exist.

[00:30:56] It gave the full lyrics of the song.

[00:30:59] And we were super excited.

[00:31:01] We said, wow, man, you know, now we're going to play this song and we're going to do this stuff.

[00:31:04] There was no movie.

[00:31:07] So, I think the point there is that one of the things that people and to the naysayers that they say is that this need to please and the need for hallucination is kind of a bit of a danger point.

[00:31:21] So, I thought, you know, let me kind of dig deep into this problem of hallucination.

[00:31:26] So, what I discovered was that there was this philosopher, his name was Harry Frankfurt.

[00:31:34] And in 2000, he wrote this book or it was actually like a big pamphlet.

[00:31:42] It was called On Bullshit, in which he articulated the concept of bullshit and how it is distinct from lying.

[00:31:50] So, he said that lying actually involves hiding the truth while bullshitting involves a lack of regard for the truth.

[00:31:58] So, which means while a liar is, you know, kind of he knows the truth but he is hiding it, a bullshitter is actually a person who doesn't care whether it's true or false.

[00:32:11] So, in that sense, obviously, the recent article kind of goes on to talk about certain Western politicians and how they bullshit and Boris Johnson and Trump and they just say anything, right?

[00:32:26] And they said that, you know, so bullshit is worse than lying.

[00:32:30] And I found a couple of academic papers that actually talk about are LLMs liars or bullshitters and all that.

[00:32:38] I said, wow, I wish I was doing this.

[00:32:41] Therefore, are you saying that GenAI is a bullshitter or a liar?

[00:32:45] Or is it something?

[00:32:47] No, no, it is a bullshitter.

[00:32:48] So, the theory is it is a bullshitter actually.

[00:32:50] So, basically, it doesn't have any regard for the truth is the worry.

[00:32:54] Because of this need to please.

[00:32:57] It will never tell you that I don't know something.

[00:33:00] So, fascinating, Samiran, because from a research perspective, typically it has always been believed that women have been brought up, you know, with enough drilled into their heads that they are always trying to please people.

[00:33:15] But the people who code AI are really typically men.

[00:33:20] Men?

[00:33:21] I mean, undoubtedly.

[00:33:22] So, that is beyond question.

[00:33:24] Yes, yes, yes.

[00:33:26] I mean, that is.

[00:33:26] So, all these coders must have some inner woman or they are all bullshitters.

[00:33:31] I mean, one of the two is true.

[00:33:35] Now, it is, I mean, in that sense, it is something to be aware of.

[00:33:40] I mean, I won't say it is dangerous.

[00:33:41] But, yeah, don't believe everything.

[00:33:43] I guess before we kind of wrap up for this final episode of ours, a couple of things.

[00:33:51] And, Nilesh, you know, I'm going to pick up from where you were talking about the fact that people wrote and said that these are very different.

[00:33:58] It's fun tech.

[00:33:59] It's engaging.

[00:34:00] Also, Samiran and I have had people tell us, oh, you know, you're my companion on the car drive home and things like that.

[00:34:07] And I think the fact that we own the fun tech space, the fact that we make it easy for people, etc., is reflected in the votes that we've received for the various awards that we've received in this category.

[00:34:21] Where I think people have kind of said, okay, these guys don't bullshit.

[00:34:24] They at least put the facts.

[00:34:25] They bring intelligence to the table.

[00:34:28] Yes, they make technology fun, but they at least do it with enough homework at the back.

[00:34:34] And I think we should do a shout out and a call out and a thank you for all the awards that we have won.

[00:34:40] I think we've won almost every award in the podcast space over the last two years in the technology category, whether it's the gold or the silver, and we've picked up an award or more.

[00:34:52] And from what Samiran tells me, we are also out there in one of the global awards shortlist.

[00:35:00] We haven't yet won it, but...

[00:35:02] Yes, yes, we are.

[00:35:04] In a couple of them, actually.

[00:35:05] One is Signal, one is Podcast.

[00:35:07] So I think that...

[00:35:08] I think Sheetal has been our private joke about how we are winning awards.

[00:35:13] So we are kind of literally like the Om Puri, Naseeruddin Shah and Smita Patil of podcasting as opposed to the Shah Rukh Khan, Amitabh Bachchan and Alia.

[00:35:22] We are winning awards, but we are not getting any money.

[00:35:26] No, and you know, honestly, shout out to all our listeners.

[00:35:32] And none of these awards are bought awards, right?

[00:35:36] I mean, there is all that bullshit also there where people go and buy awards.

[00:35:39] We have no money.

[00:35:40] We have no money.

[00:35:40] There is no money to buy awards.

[00:35:44] So, yeah.

[00:35:45] I think before, just before wrapping up, I found it very interesting when you mentioned lying and bullshitting, right?

[00:35:53] And that whole movie name and lyrics of songs were created by ChatGPT just to please you.

[00:35:59] But it brought me...

[00:36:01] You know, when you are saying research, it's so interesting.

[00:36:03] And it's a very classical problem which I would love our listeners to go and research, right?

[00:36:13] And I'm sure they would have heard it.

[00:36:14] But there is a lot of study that has been done on that problem.

[00:36:18] The problem is very simple, okay?

[00:36:20] And it comes to lying and bullshitting.

[00:36:21] I'll come to that.

[00:36:22] If you are...

[00:36:24] There is a guy and you are lost in a jungle in some remote continent, right?

[00:36:31] And you are at a crossroad.

[00:36:33] This is a classic puzzle.

[00:36:35] It has mathematics, philosophy, everything.

[00:36:37] You are at a crossroad and you don't know whether you have to take right or you have to take left.

[00:36:41] And there is a person standing there.

[00:36:43] Now, the issue...

[00:36:45] You only know one thing that this area has two tribes.

[00:36:49] One always tells the truth and the other always tells the lie, right?

[00:36:53] You don't know who that person is.

[00:36:56] So what question should you ask so that the answer tells you the right way towards the city and you are not further lost, right?

[00:37:03] Now, the problem becomes complex when you come into the realm of lying and bullshitting.

[00:37:11] So there was that...

[00:37:12] The tribe which tells the truth is okay.

[00:37:14] It always tells the truth.

[00:37:16] But the tribe which lies, does it lie with similar kind of conviction?

[00:37:23] Meaning always say the opposite?

[00:37:26] Does it bullshit?

[00:37:28] Does it bullshit?

[00:37:29] Because both are looking like lies, right?

[00:37:31] So then you are in deep shit.

[00:37:34] Because if you find a liar bullshitter, then you are screwed.

[00:37:39] If you find a liar or you find a guy who tells the truth, you are okay.
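[Editor's aside, not part of the episode: the classic resolution of this crossroads puzzle is to ask a nested question — "Which road would the other tribe say leads to the city?" — and then take the opposite road. A minimal, purely illustrative Python sketch (all names here are hypothetical) shows why this works on liars but fails on bullshitters:]

```python
import random

CITY, OTHER = "left", "right"  # "left" is the road that actually leads to the city

def truth_teller(fact):
    # Always reports the fact as-is.
    return fact

def liar(fact):
    # Always reports the opposite of the fact.
    return OTHER if fact == CITY else CITY

def bullshitter(fact):
    # Ignores the fact entirely: says whatever, with no regard for the truth.
    return random.choice([CITY, OTHER])

def ask_nested(person, other_tribe):
    # "Which way would the other tribe say leads to the city?"
    return person(other_tribe(CITY))

# Both tribes point the WRONG way under the nested question,
# so taking the opposite road always works:
assert ask_nested(truth_teller, liar) != CITY  # truth about a lie
assert ask_nested(liar, truth_teller) != CITY  # lie about the truth

# A bullshitter's answer is uncorrelated with the truth, so no single
# question can extract reliable directions from them.
```
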

[00:37:43] So I think that was an interesting thing that you brought up, Samiran.

[00:37:49] And I think for our podcast, we can genuinely say that we are only telling the truth.

[00:37:54] There is no lying or bullshitting.

[00:37:57] It's just true facts not churned out by ChatGPT.

[00:38:01] And, you know, this kind of brings us to the end of another episode of 3TB.

[00:38:08] And this is the episode where we take a pause for multiple reasons.

[00:38:14] One is I am off for almost a month trying to set up my son in his US college and also that I am on a break.

[00:38:23] But at the same time, I think we are trying to also look at ways and means of, you know, enriching our content, right?

[00:38:34] With probably different kind of possibilities of mediums, right?

[00:38:38] We are just doing podcasts, but maybe there are other opportunities and avenues that we need to look at.

[00:38:44] So this is a kind of a period where we'll take a pause and we'll re-engage with our audience in a similar or a newer avatar and probably some newer formats.

[00:38:57] But for now, it's a short break.

[00:39:01] And please keep following the episodes.

[00:39:04] There is a lot of good stuff.

[00:39:05] You can go back and check it out.

[00:39:07] And please leave us a rating and review if you are listening on Apple Podcasts.

[00:39:14] So until next time.

[00:39:17] Bye-bye.

[00:39:17] In true Hollywood style, we'll be back.

[00:39:21] We'll be back.

[00:39:23] Hasta la vista.