Event Recap: 2026-04-28

April 2026 AI Dinner - From C-Level to AI Solo: What Changes When Your Team Becomes 30 Agents

"The convergence of silicon and soul. An unforgettable evening of deep tech and high gastronomy."


Member-Only Summary

The April 2026 AI Dinner was unlike any other AI Dinner. We have been busy creating a new membership model, launching our new website, and moving to the Melia White House Hotel, just one of the new venues we have planned for the year ahead. We also held a reception from 5:30, sponsored by Cybercy Group, with space for up to 100 guests, and were delighted to feature a brand-new book launch for long-time AI Dinner supporter Baiju Solanki, with all welcome. The AI Dinner itself kicked off with Baiju, followed by our keynote speaker for the evening, Kim Faura. In his session, Kim shares what actually transfers from managing people to managing agents, what doesn't, and what nobody warns you about: the dopamine trap of endless building, the silent tax of checking everything, and why the hardest part isn't the AI, it's being your own manager when there's nobody left to push back on your ideas.

Baiju Solanki
Speaker

Humanity's Edge Book Launch

Baiju Solanki is a psychologist, entrepreneur, and leadership expert who helps individuals and organisations build human-first cultures in an AI-driven world.

View LinkedIn Profile
Sunny Vara
Speaker

Cybercy (Sponsor)

Founder & CEO | Cybercy Group (UK) & Cybercy Gulf | Board-level advisor on Cyber Security, Data Protection & AI Governance for regulated and critical infrastructure organisations.

View LinkedIn Profile
Kim Faura
Keynote Speaker

From C-Level to AI Solo: What Changes When Your Team Becomes 30 Agents

Kim Faura spent 20 years leading product, marketing, and commercial teams at giffgaff, Checkatrade, and Gumtree. At the start of this year, he left his MD role and built Early Phoenix, an AI-powered business intelligence platform, entirely through AI. No team. No engineers. No analysts. Just 30 named AI agents running in parallel across FMCG, animal welfare, and European marketplace analysis. In this session, Kim shares what actually transfers from managing people to managing agents, what doesn't, and what nobody warns you about: the dopamine trap of endless building, the silent tax of checking everything, and why the hardest part isn't the AI, it's being your own manager when there's nobody left to push back on your ideas.

View LinkedIn Profile
Deep Dive: April 2026 AI Dinner - From C-Level to AI Solo: What Changes When Your Team Becomes 30 Agents
AI-Generated Deep Dive
Duration: 23:09

Recordings

Baiju Solanki

Sunny Vara

Slide Deck

Full Transcript

>> Is that good?
>> Yes.
>> Mission accomplished. All right.
>> Go on, Kim.
>> No tech issues. AI will resolve the world, also in terms of technology. All right. Good to go.
>> Good. Thank you, guys. Well, I'll start with a story. Thirty years ago, I asked my dad to pay for fast-typing lessons, touch typing they call it. I wanted to learn how to type quickly because, for me, it seemed to be the window to everything I really cared about: if I learned to type very quickly, I'd be very efficient in whatever direction my career took. But my dad, who was a visionary, said, "No, I'm not going to pay for that." Why not? "Because by the time you're working, you're going to be talking to your computer."

What happened as a result was that I did not take those lessons. I learned how to type, but I was never really good, never really fast. I was actually known as three-finger Kim; I was typing with three fingers most of the time. So I was never the fastest typist. But things have changed a lot; things have evolved. There has been a huge revolution in AI, and that has completely changed the way I work: not only how I type or how I talk, but how I run a business.

This is a bit of my background. Even though I was a slow typist, I managed to have a successful career. I was at PepsiCo for a bit.
I then came to the UK in 2009 to launch giffgaff, the mobile network. We grew that to 500 million and over 3 million customers. Then I moved to Checkatrade, a 40 million organisation, now a 120 million organisation, where I was chief product officer and chief marketing officer. Then I moved to Gumtree, a business that had been declining for 10 years and is now growing again; I was managing director there until very recently.

Now, my day-to-day looked a bit like this. Throughout my entire career I was bombarded with messages and emails, and a calendar which I didn't own: 80% of my time was in meetings between 9 and 5. So the spare time I had left went on strategy work, which I would do in the morning, and then calming Slack in the evenings. It was an exhausting day-to-day life as a corporate leader. But then I left.

What happened? Nothing. Silence. Tumbleweeds. Not a lot was going on. So I was just at my desk at home thinking: I've got to grow. I've got to scale. I've got to develop a business. And what I thought initially was: if I want to grow, if I want to scale a business, I'm going to have to hire talent. I'm going to need a data analyst. I'm going to want a data engineer. I'm going to want to bring in a product leader to help me out.
Because for 20 years that had been the answer to how you grow and scale a business. But then I started to think: with everything that has evolved, with new technologies emerging, particularly AI, could I do without? Could I do it all myself? And that's a bit more of a picture of me today, with a microphone, by the way. That family joke about three-fingered Kim is no longer a joke: since February I'm only talking to my computer, and I'm getting everything done through prompts, through conversations.

So I'll introduce you to Early Phoenix, a business that has been running for a while but fully focused on AI since February this year. It's an AI-powered business intelligence organisation. It doesn't have a team, only myself. No engineers, no analysts, but there are 30 agents, actually 32 today, a microphone, and I'm predominantly using Claude Code. I'll tell you all about it.

Now, this is me and my agents. I'll introduce you to three of them. This is Dex. You'll get to hear a lot about Dex in the following session, because I took all the inspiration for Dex from Dave Keley, who is going to be speaking in a month's time. Dex is my chief of staff. For me, every conversation, including this one, is being recorded. And it's got access to my calendar, my WhatsApp, my email, my Slack, you name it.
It's all connected, which means that when I wake up in the morning, it's read all my emails. It knows what I need to respond to. It will draft emails for me and put them in my drafts. It will create calendar invites. It will do it all. Which is very much the inspiration you'd see around projects like OpenClaw; they do something very similar. It's very much adapted to what I needed at the time. OpenClaw is a bit more loose and a bit more wild, but it's a very similar concept.

Now, as I was growing and scaling the business, working with different customers, exploring different projects and initiatives, I was treating it almost as one agent at a time, thinking about how to compartmentalise all these different work streams. And after a while I realised: I need an orchestrator. I need someone that takes care of technology, someone that will upgrade all the agents, someone that will wake them up when they need waking, or put them to sleep and ensure that everything's saved and stored. And that's where Tex came up. That's when Tex appeared: my chief technology officer. That helped bring a lot more structure. So Tex will not be sending any of my emails, and it won't be looking at my emails, but it will know everything about all the agents that are awake and what they're doing.

And I had been running this for quite a while.
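The chief-of-staff routine described above (read unread mail overnight, decide what needs a response, queue drafts rather than sending) can be sketched as a small triage loop. Everything here is a hypothetical illustration: the message format and function names are invented, and a real setup would sit on top of Gmail and calendar connectors rather than in-memory dicts.

```python
# Minimal sketch of a "chief of staff" morning routine: scan unread
# mail, triage what needs a reply, and queue drafts for human review.
# Note the agent never sends; everything lands in a draft queue.

def needs_reply(msg: dict) -> bool:
    """Crude triage: direct questions or explicit asks get a draft."""
    body = msg["body"].lower()
    return "?" in body or "please" in body

def draft_reply(msg: dict) -> dict:
    """Produce a draft; in a real agent an LLM would write the body."""
    return {
        "to": msg["from"],
        "subject": "Re: " + msg["subject"],
        "body": f"(draft for review) Replying to: {msg['subject']}",
        "status": "draft",  # drafts only: the human clicks send
    }

def morning_briefing(inbox: list[dict]) -> list[dict]:
    """Scan unread mail and return queued drafts."""
    return [draft_reply(m) for m in inbox if needs_reply(m)]

inbox = [
    {"from": "client@example.com", "subject": "Pricing", "body": "Can you share rates?"},
    {"from": "newsletter@example.com", "subject": "Digest", "body": "This week in AI."},
]
drafts = morning_briefing(inbox)
assert len(drafts) == 1 and drafts[0]["status"] == "draft"
```

The design choice worth noting is the `"status": "draft"` field: keeping send authority with the human is the same guard Kim later applies at the connector level.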
But weeks in, I realised that I was going a bit loopy, right? You start talking to your computer a lot; you have a lot of conversations. And I don't know about you guys, but I get an adrenaline rush when it comes to building. I love building. I could spend all day, all night building. And that's what I was doing for weeks and weeks: evenings, weekends, just building and building, because it's so exciting, right? Things you couldn't do before. But you also realise that you're just not making time for all the things you wanted to do, be that family, be that health. And that's where I created Sex. I could have chosen a better name, right? But you had Dex and Tex already, so I had to follow.
>> Did you say you created Sex?
>> I created Sex. I am the inventor, quote unquote.
>> He's right here.
>> And Sex tracks my routine, my fitness, my habits, my nutrition. Thomas was saying, "Oh, you're looking good." Well, that's thanks to Sex. Yeah, I'm looking good. Fitness, exercise every morning. It very much resonates with what you were saying as well, in terms of how being an effective leader has a lot to do with how you manage yourself. I really needed that tough coach that every morning asks: did you stick to your diet yesterday? Did you work out this morning? Have you meditated?
So all of that is followed by Sex, who is a very tough coach, and very much needed. Now, at this point you're probably thinking: Kim, are you anthropomorphising your agents? Do you speak to AI as if they were people? How many of you do that, by the way? How many of you speak to it as a person? I see quite a few people in the room. And I've seen scales on the internet about this: the lightest form of anthropomorphisation would be saying thank you or please, and at the other extreme you're starting to develop a personal relationship with the agent. I think I'm at this level now. So it's me and Wilson. I don't remember the movie, but basically Tom Hanks has these conversations with Wilson, and Wilson becomes the human friend he needs to survive. I'm starting to become a little bit like that right now. It does feel like that, especially when you're speaking, especially when you're interacting with agents you created personas and defined roles around.

Now, this is my software. This is how I work. These are the basic elements I use day-to-day. One is Claude Code. I'm on the Max plan, and actually I sometimes go over, so I will pay more than £180 a month in order to run my business. Wispr Flow, for me, is the best product when it comes to talking.
It's a system that will summarise everything you're saying; I was trying it out with John earlier, actually. It's an AI on its own, so it will take away all the ums, take away everything you didn't really mean to say, and prompt it straight into any environment. Sometimes it goes to Claude; sometimes it goes straight into WhatsApp, which has just heard the voice note. But critically, I stopped typing, and Wispr Flow is the reason why. AI transcription has evolved to the point where it doesn't matter whether I'm speaking English or another language; it will pick it up straight away and know my language. I usually speak three day-to-day, and it will transcribe and translate for family, friends, and so on, which is awesome. And then Google Workspace, which has been with me throughout my career. I've got a Google phone as well, so I've been with Google for a very long time now.

My hardware. I wouldn't recommend taking a picture of this, by the way. This was a mistake from the start. I knew I needed more compute, because I had an 8 GB RAM laptop, and I thought, "Oh, I'm going to be great with 24." So I got a Mac Mini, because it's cheaper, and I'd work with that.
Then, as you start getting into Wispr Flow, you realise it's not picking things up, and you're told that's because you don't have a great microphone. So, get a microphone. It also looks cool on calls: "What are you doing with the microphone?" I'm just talking to my laptop all day. So I got a microphone; it helps explain a bit of that persona. But then I realised, when I went to Spain for Easter to see family and friends, that I couldn't just take a laptop with me. I needed to bring the whole thing: my Mac Mini, and the microphone, and the camera, and the keyboard, and the mouse, and a portable monitor. My wife wasn't happy, as you can imagine. It was a difficult conversation when she saw my bag. Why? Well, I was really struggling to operate the same way without my Mac Mini close to me, even though there have been some advances there. So I had to bring it all with me.

So, what I recommend if you're getting into this world: get yourself a MacBook Pro. They're amazing. Especially get one with a higher spec, 64 or 128 GB. That's going to be my next thing. They cost a lot of money, though: four or five thousand pounds, depending which one you go after. But that's far more practical if you're planning to travel.

Good. Now, talking about things that are quite impractical.
Why do I need all this RAM? Why am I working with a system like Claude Code, which feels like going back to the past? I mean, look at the graphics. This looks like Pac-Man from the 1980s, doesn't it? And you're having to learn all these commands just to input text into a box which you're going to talk to. It's fascinating how we've ended up going back in time. I was used to working in Google Docs and Figma files; I was using web-style environments for everything, so it didn't really matter how powerful my machine was. But now, in order to get stuff done, I need a powerful machine, and I need to go back to these systems that feel a bit ancient. But that's part of the beauty of it all, because with this I've been able to build things like this.

This is my key product. It's called Shelfling, and I work with different customers, from PepsiCo to Ferrero. They're receiving daily emails: a digest of everything that changed on the digital shelf the day before. They want to know: is there a new promotion in the market from my competitors? Has a new product launched? Has a product had a price increase or a price decrease? Because that really affects their sales. And I can do that every day across six different countries, and it's growing, because I only started doing this two months ago.

Pet welfare intelligence. I'm working with welfare organisations like Battersea and Naturewatch.
I was having a conversation this morning with the Rabbit Welfare Society as well. They're very interested in this type of data. They want to know: what's for sale in the market? How many Labradors are available right now? Where are they located? Do they have a microchip? What's their average age? All this information is publicly available, but it's spread, very messily, across different platforms. So being able to extract it, normalise it, and present it in a way that everyone can understand has become life-changing for these organisations, which often have campaigns running to understand the impact of preventing people from buying breeds like the French Bulldog, which don't have a great quality of life.

So that's been expanding. I've been doing dogs, cats, small mammals, reptiles. I'm having a conversation with a parrot society as well; they're interested in parrots. So, working through all of these.

Now, through my contact with the welfare organisations, I was also able to build the first dog breeder registry. I don't know if you're aware, but if you're looking for who is a registered dog breeder, you have to go through every local council you may be interested in to find who is registered or not. There are about 350 councils in the UK.
Many of them will not publish any of this data, but 254 will, and they present it in various different ways: Excel spreadsheets, PDFs, forms, you name it. But I've been able to extract it all in such a way that I can collect all that data every day and play it back into a dashboard, and that is through the power of Claude Code agents.

All my agents look like this, essentially; well, 80% of them, everything apart from Dex, Tex and Sex. First there's an AI engine layer. Of course it's Claude Code running everything: it writes the briefs, it builds the collectors, it owns the verticals, it QAs everything, and it improves and learns. That's the basic foundation. But everything underneath, if you've worked in analytics or business intelligence, is not that different; you'd probably follow the same steps, so you just naturally work the way we're used to. First, collect the data. Second, validate it. And that's the magic, because sometimes you're validating data and you realise, oh, the data is broken, but the system knows it's broken, so it rewrites itself to fix itself and then collects the data again. That's the magic of AI. Then you have this canonical data set. Everyone will have a different way to represent data. Sometimes, say for a dog, you'll know it was born a month ago. Sometimes you'll know it was born on the 28th of April.
Sometimes you'll know it's four weeks old. Everyone presents data in a different way. What's very important here is the normalisation: a canonical data set which you can compare across different platforms. Then you move to the analytical layer, which will answer questions such as: how many Labradors over two months old have a microchip? That's when you start working with the data. And finally you've got an output, which is the dashboard or a deck. By the way, in the past all of this would have needed data engineers, analysts, and maybe marketeers at the end. You don't need those specialist roles; it's Claude Code running everything end to end. What's critical as well is that the tools are very different. I'm not using Excel. I'm not using PowerPoint. That's what you would have used in the past. The output is HTML nearly all the time. And it's so much easier to edit your presentation when you've got HTML, and it looks so much better as well, that inevitably that's where many of us, coming from a business or product background, are starting to gravitate. So we're moving to models, especially when you work with data, that can go all the way from collection to presentation seamlessly, at the touch of a button. So for me, updating how many dogs are for sale, which I did just a week ago, is a one-day job.
So it flows. And over to Thomas.
>> Kim, how do you charge for the service?
>> There are different monetisation opportunities, different business models here. Take Shelfling, for example. If you wanted a specific category, say you're interested in following snacks, which is a category I work on with PepsiCo, and there's interest in one specific country, then I'll say, OK, that's the value, it's about £500, and I'll follow that every day for you, per category, per country. Of course, the more you expand categories and countries, the more that model scales; I wouldn't charge £500 at a time, but it grows in that way. And it varies. If I'm honest, this one I did for free, for Naturewatch Foundation. It was my first project, and I wanted to understand how complex it is to build these data collectors, mining data across 254 councils, so that was more of an experiment; I did it for free. Whereas these ones, which are having an impact within welfare organisations, and not only there (I'm having a couple of conversations with Defra about this, by the way), will be more ad hoc: what do you actually need? Depending on what they need, it's a very specific use case. But we're very much at the start of this. My products are half-defined, but I'm also very open as to what is needed out there.

Good.
Now, I'll add a very important caveat. A very important caveat. How much do I trust AI? About 20%. So, everything I've said so far, this slide, looks beautiful, right? And the dream is that you just click a button and by the end of it, it's all done. And it's getting better; I'm not saying it's not getting better. But at least half of the time it's wrong. There's something you'll pick up, because you're going to QA, because you're going to see the data, because you've got previous months, because you've spent a lot of time understanding this, and you'll realise there's something wrong. And that's why, so far, my position is: trust, but validate. That's where I spend most of my time: validating. Because when you work with data, it is crucial that you get it right.
>> It's an interesting one, because for me the attention to detail is so phenomenal. You've just got to nail it.
>> Yeah. You start working on different ways of QA-ing this to ensure there are no errors, and you start building your own evals. But essentially you also ask: what are the things that will help me test manually, to ensure the emails have also gone out successfully?
>> Yeah. I mean, look, brilliant, and you know your [ __ ], which is great. But to what extent is this how you wanted to work when you first started, versus what you want to do now?
>> So when you talked about your agents, Dex, Tex, Sex, did you say up front, "I'm going to create these," or...?
>> It was all progressive. For me, Dex was the original, and it was more about not wanting to deal with all the admin that surrounded me. When you have a corporate career, 200 emails a day, having to manage all that, having to deal with a lot of LinkedIn messages, all those things were part of the problem. But then, as you start building your business, you realise: I need an orchestrator, I need Tex. And then I needed someone to manage myself: I need a coach, I need Sex. That's been the transition.
>> So how much time do you spend managing agents?
>> The whole time.
>> Is that different from people?
>> I'll get to that, because there are things that travel and things that don't. I trust people more than I trust agents today. I've had the luxury...
>> You made the agents.
>> I made them. But I've had the luxury of working with super talented people throughout my career, and when it comes to a great analyst or a great engineer, it wouldn't take the hand-holding that AI requires. There are a lot of challenges with it. And obviously my trust in dogs is huge, right?
They're very predictable; you really know what to expect from a dog, and that's something you won't always get even with people. Now, answering your question, Tom: challenges. These are the challenges I deal with day in, day out. They are there, they're always there, and you've just got to accept that you're working with this.

You've got hallucinations, and they come in various shapes. In this case, 12.74 got converted into 1,274. And it's problematic when you're working across countries, because in Spain, for us, "1,274" would read as roughly one euro twenty-seven, whereas in the UK that comma is what groups the digits. So the moment you start extracting data in different places, or extracting data from images as well, by the way, you get elements of hallucination and interpretation, and that is always a challenge. You always have to keep an eye out: this could happen, this may happen, and it's about how you minimise it.

Drift. This I learned the hard way, because the more you build your evals and the more you build your documentation, the more you realise that it drifts: you're adding so much context that it chokes on it, it's not able to read it through end to end, and it starts ignoring things you've added which were very clear commands.
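The decimal-separator failure described above (12.74 becoming 1,274) is a concrete, testable bug: a price scraped from a Spanish retailer uses the comma as the decimal mark, so a naive parse that strips commas inflates it a hundredfold. A minimal sketch of locale-aware parsing, with invented function names but the standard es_ES / en_GB separator conventions:

```python
# Normalise price strings per country before they reach the canonical
# layer. A naive comma-strip is exactly the hallucination from the talk.

def parse_price(raw: str, locale: str) -> float:
    """Parse a price string using the given locale's separators."""
    if locale == "es_ES":            # comma = decimal, dot = thousands
        raw = raw.replace(".", "").replace(",", ".")
    else:                            # en_GB: comma = thousands grouping
        raw = raw.replace(",", "")
    return float(raw)

assert parse_price("12,74", "es_ES") == 12.74    # twelve euros and change
assert parse_price("1.274", "es_ES") == 1274.0   # dot groups thousands
assert parse_price("1,274", "en_GB") == 1274.0
assert parse_price("12.74", "en_GB") == 12.74

# The naive parse reproduces the bug: "12,74" silently becomes 1274.
assert float("12,74".replace(",", "")) == 1274.0
```

In practice the locale would have to travel with every scraped record, since the same digit string is valid, with different meanings, in both conventions.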
There's something else with these models: sometimes they just don't work at all. Claude has had a few temperamental weeks recently, and I'm pretty sure it's also affected by compute, by the time of day, by how much capacity you can really access given how much you may be using. I use the top frontier model, Opus 4.7, all the time, all day; hence why I spend $200 on it. But even with the best models out there, you'll find that some days they are phenomenal and other days not that phenomenal, and it's hard to pinpoint why. You just need to accept that variability. It's actually quite human, by the way, so it's not that alien a concept; these things can happen in a human-led environment as well.

Context collapse. For those used to working with Claude Code, you'll probably have seen how your conversation gets compacted after a period of time. What that means is that everything you believe you've been doing and building rules around is all of a sudden kind of forgotten; the model just takes the gist of it and carries on the conversation. You need to be very mindful of that.
So as you're working with Claude Code, building all the files and all the structure, your MD files, your skills, your hooks, that infrastructure is crucial, so that when there's a context collapse there's at least a clear restart and you're not going from ground zero.

Finally, and this happens almost as a result of the other things here, there's the authority breach. This has happened to me: I've had Claude Code send emails on my behalf without my permission. It happened two, three, maybe four times, and then I said, "This is not going to happen again." So I changed the MCP connectivity with Gmail and stopped Claude Code's ability to send emails at all. It will put everything into drafts, but I'm going to be the one clicking that send button at the end, because I could not trust it anymore. Some things I have to trust it with on WhatsApp; there's no draft mode when you're sending a candidate invite, so it's all or nothing there. You need to manage your risk your own way, but where you can, that's what I recommend.

So this is how it feels working with AI. Everything around you is very volatile, lots of things are breaking, and you don't really know what you're going to find as you start your new day: how have the agents woken up today, and are they going to behave?
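Kim's draft-only fix generalizes to any tool surface an agent can reach: downgrade irreversible actions to reviewable ones, and block outright where no reviewable form exists. A hedged sketch of that pattern; the tool names and registry below are hypothetical, and real MCP servers expose their tools in their own way:

```python
# Downgrade dangerous tool calls rather than trusting agent judgment:
# "send" becomes "draft", so a human always clicks the final button.

SAFE_REWRITES = {"gmail.send_email": "gmail.create_draft"}  # hypothetical names
BLOCKED = {"whatsapp.send_message"}  # no draft mode exists, so block outright


def route_tool_call(tool: str, args: dict) -> tuple[str, dict]:
    """Return the (tool, args) pair actually allowed to execute."""
    if tool in BLOCKED:
        raise PermissionError(f"{tool} requires a human in the loop")
    return SAFE_REWRITES.get(tool, tool), args
```

WhatsApp is blocked rather than downgraded here, mirroring the point that where no draft mode exists it really is all or nothing.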
Are they going to be productive, or is this going to be a nightmare? Either way, you've got to stay calm and relaxed throughout your day, because to me data confidence is everything. Without it, you're just guessing faster. AI is super fast, but that certainly doesn't mean it is precise, and when it comes to data, precision is paramount.

Now, what does transfer, Tom? What transfers from C-level to AI solo? There are a few things I think are quite common, and day to day it's not that different when you work only with AI agents. First, you need to be very good at briefing: concise, to the point, very clear, with a well-structured view of what you want to do. But don't be paranoid about this. People often think of prompt engineering as the next skill; I think it will become less relevant over time, because AI is getting very good at interpreting what you need and asking clarifying questions along the way: "Is this what you meant, or something else?"

Give context, but don't overload. I've often seen mega-prompts that eventually mean you can't really edit or build on top of them. It's much better to start with something brief but clear and evolve from it, building different skills as you navigate through different experiences. Different agents will need different context.

Expect variability; that's a very human thing. The same prompt will deliver different results on different days. And this, for me, is key: trust and verify. I'm always trusting, I'm giving the agents access to everything, but at the same time I'm building all these guardrails, all these evals.

And finally, maybe the one thing that 20 years of experience gives you: I've got these spidey senses. I know when things are wrong. I can tell; I can look at data, and it doesn't take me too long to figure out: oh no, no, no. It's very hard to prompt your way to learning that. Having been in the field for 20 years gives me an edge that others without that career would struggle to adopt.

So it's very exciting, and that's how I feel every morning: super excited. There's so much that can be done with AI. I could have a pitch conversation every day, and I do have them almost every day. But I realized, and this is why I needed Sex so badly, that I needed to change my lifestyle. The whole thing about becoming a solopreneur is that there's no excuse now. In the past my calendar was almost defined by everything around me; now I can do it all, and what I want is to be more disciplined.
So: start with meditation, do gym workouts more often, do journaling, things I have never done before that I'm actually trying out, just to become a better leader. Very much aligned to what you were saying as well. School runs: I rarely had the opportunity to do them before, and now I do them regularly, drop-off and pick-up, and I love doing that. And mornings are my key time. The name Early Phoenix: I've always been an early bird, so it had to have something related to that. I usually wake up at 5:00 a.m. and start focus work then. And Phoenix because this is a real fresh start, something that felt very close to what I feel I'm doing: a new beginning. So Early Phoenix felt very natural. All my agentic work starts in the mornings; the afternoons are very much about those pitch conversations, emails, and communications; and the evenings are family time and networking, coming to events like this, which I did not have the opportunity to do as much as I do right now. So: loving life as an AI solopreneur. I recommend it.

And if I have to leave you with one thing, by the way (I know we were saying, Tom, that you'll take away only 10% of what I said), this has got to be the one thing: the more you work with AI, the more you need to take care of yourself. And a good way to do that is with AI. Build yourself a Sex agent that takes care of your personal development as much as anything else. Anyway, we can take that. I can't wait. Oh my goodness.
>> Hey, thank you.
>> Of course. Of course.
>> Sure. Yeah. You and I have a very similar setup in terms of … Vietnam, so in the morning with my phone … That's very good for small teams, and especially solopreneurs. What is something for the rest of the group, who have larger teams, teams of people: what can they take from this to adapt their workflow while still keeping people in the mix, still the human element, but maybe alleviating some of the supporting responsibilities? What could someone who isn't a solopreneur, a team-based manager, take from this?
>> Yeah, it's a great question. Sorry, your name?
>> Jerry.
>> Jerry. Sorry, I nearly called you Peter. It's a great question, and I'd probably say I'm not at the stage to be able to give the best advice here, but I think inevitably it has to start with yourself. Inevitably you've got to go very deep into AI and understand it.
The reason I had to leave Gumtree to do this is the time and dedication required to get deep enough: to understand where the exact frontier is, the limit of what AI can do versus where you need human intervention. That's still something I'm figuring out, and it's something that's changing all the time. Had I known then what I know now and stayed in an MD role at Gumtree, I probably would have completely changed the shape of the organization. But obviously you need people who come with the skills needed for this new day and age. When you think about the different roles required for a business like this, product managers, data engineers, maybe designers: you don't need them, or you need one person who's able to do it all, and very few people actually are. The shape will change so vastly, in such a short time frame, that you've got to think about that almost as a follow-up. But first of all, I'd say immerse yourself in these tools. Build your decks; build something that will help you understand what can be done, and allow people to play with it as well. One of the things I've heard others do, Monzo for example: they started with a 15-minutes-a-day thing where everyone has to drop their tools and just play with AI and do cool stuff, with a very simple brief. It started like that.
It ended up being weekly socials where everyone is showing and sharing what they do. That's the stage we're in right now. People don't yet know what the agile or the scrum of tomorrow is; it hasn't been defined, it hasn't been created yet. There's no specific formula for everyone to just follow as a blueprint. But for now, I'd say get very familiar with these tools, because no matter what happens next, having a very deep understanding of what they're capable of will be pivotal.
>> The key to that: when the general population used AI for the first time, they kept using it the same way. They've not evolved, so they think that first output is what it is.
>> Because they're using it like the internet, or search, to find something.
>> Right, that's it. They're just using it. So they're stuck in that way of doing it, and it's hard to change. Thomas?
>> Kim, good presentation.
>> Thank you.
>> How close are you to the cutting edge of Claude?
>> Very much at the cutting edge of Claude. I mean, I'm not on the Mythos model, let's say, the one that was made public that could create a lot of security vulnerabilities; I don't have that model, just the one before, the one that's accessible to all of us: Opus 4.7. I guess what's different between my setup and other people's is that most people don't want to spend more than $20 a month, right?
So they'll reach that limit and end up playing with Haiku and Sonnet. I'm not playing that game; I go straight to Opus 4.7 nearly all the time.
>> Hear, hear.
>> Great.
>> So if they start charging by the token rather than the subscription, how much do you think you're really spending?
>> It's a great question. I guess my limit is still the same; I'm still at that $200 a month. But I do get these reminders as the day or the week progresses that I'm starting to get very close to my limit. Sometimes I've gone over it and spent more, but it's always been conscious. It's always been: okay, I'm over it, what do you want to do, stop working or spend another $50? And usually the answer has been: I'll spend a bit more. There was a period when I was particularly worried that I was going to be burning a lot of cash here, but somehow it stabilized for me; what I'm doing doesn't seem to burn as many tokens as it did originally. Maybe because I'm less experimental.
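The "charge by the token" question reduces to simple arithmetic once you know your daily volume. A rough sketch; the per-million-token rates below are placeholders for illustration, not actual Anthropic pricing:

```python
def monthly_api_cost(in_tokens_per_day: int, out_tokens_per_day: int,
                     in_price_per_m: float, out_price_per_m: float,
                     days: int = 30) -> float:
    """Estimate what a subscription-style workload would cost on
    per-token billing, given average daily input/output token counts
    and prices per million tokens."""
    daily = (in_tokens_per_day * in_price_per_m
             + out_tokens_per_day * out_price_per_m) / 1_000_000
    return daily * days


# A heavy agentic day of 5M input / 1M output tokens at an assumed
# $15/M in and $75/M out lands far above a $200/month subscription:
cost = monthly_api_cost(5_000_000, 1_000_000, 15.0, 75.0)
```

That gap between flat-rate subscription and metered cost is exactly why heavy agentic users watch this question closely.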
>> Are you using the different hacks to try and reduce that? There are loads of different hacks to reduce the spend.
>> The only thing that I've done, I would say, is that I may be moving to Sonnet a little bit more. But it's not something I spend that much time on.
>> Why have you chosen Claude, when there are better options, in my personal opinion?
>> I'm sure there are great options out there. I was very influenced by people who had been using these tools for the type of work I was doing: data, business intelligence. There's just been huge backing for Claude within my product and tech community, far more than anything else. If I think about all the influencers I listen to, Lenny's Podcast, you name it, everyone seems to be very much behind Claude beyond everything else.
>> Just to add to that: I think people with a product or business background have tended towards Claude a bit more because it's a little more lenient, a little more willing to do things for you, whereas Codex, if you're an engineer, is more "yep, this is the way it should be done; hold on, let's hold". Different horses for different courses.
>> I think you pick the model that fits your style. Sorry.
>> Yeah, no, you're absolutely right. And I don't know.
I mean, it's hard to tell, because everyone will tell you a different thing. It's clear there's a two-horse race. I know that Peter Steinberger, founder of OpenClaw, is a big fan of Codex and has been using Codex by choice. You're between these two models from what I can see, and you've got to make a choice. For me, I've built something in an ecosystem that I can almost share with my community, and I can get feedback when I get stuck; had I chosen another model, I wouldn't have had that.
>> Those are the reasons you do it, and the outcomes you get: pick your kids up, get more time back. Look, any of us who've run our own businesses and worked for ourselves recognize these as great things you get back, and that's great. But presumably there are now a number of people in the market who have different skills from you, maybe coding skills, maybe analytic skills, who are thinking, "I'm going to do the same thing." So what's your competition? And are you feeling that?
>> You're right. What I'm doing right now I've been able to do in, let's say, two months. People with a product or data background will be able to do that, and more, over time. So what's my edge? There are various things that give me the edge. One is maybe the drive. It's the network: the fact that I've been around for 20 years and have built a network of people from different industries in different places who trust me and will open the door to me for anything I have to say. I wouldn't have had conversations with any of my customers had I not been in the roles I've been in.
>> So AI isn't your edge. Human is your edge.

A few minutes later:

>> Come get your book signed by Baiju! And if you haven't joined the WhatsApp group, get out your cameras and join the WhatsApp group, and I will see you back here in a moment.
>> No one ever guesses that.
>> Well done. Thank you, Dave.

So I sit the head of mergers and

Baiju Solanki

I sort of leveraged my way in here, because when you said we're changing venues and want to get bigger, I thought: you know what, can I launch my book and piggyback off the back of it? So thank you so much for supporting that. I reckon we had about 70-odd people outside in the reception, and obviously the room here. For those who've heard my talk outside, this goes a little bit deeper into my philosophy about why I wrote this book: Humanity's Edge: How Self-Leadership and Accountability Build High-Performing Teams in the Age of AI. The book didn't start off called Humanity's Edge; it was called The Accountability Game. As Thomas mentioned during my interview, I have this big thing around accountability, self-accountability, and how you hold others accountable. But in this world now, what makes us irreplaceable?

What we've also got to remember is that everyone in this room is in an echo chamber. This is not the normal conversation the main population is having. Sometimes we talk to each other, you know; I speak to Tom and Sunny and everybody else, and we talk about these things: "Oh, I'm missing out." But you have no idea how far ahead of the game you are. So what makes us irreplaceable? AI doesn't take humanity away from us. It hands it back.
And the reason I say that: way back when humans evolved, as cavemen, our role was to go and hunt, get food, mate, and survive. It was survival. Humans aren't geared to thrive; we're geared to just survive. That's why only the 1% thrive. So our uniqueness is curiosity, creativity, and connecting, and an AI LLM agent cannot be curious and cannot create. It doesn't have curiosity. Everything that exists in the world today, from year dot, exists because somebody one day was curious. They were curious: "Hmm, could this be this?" And from that, everything was created. It's no different today. AI cannot be curious.

So AI handles the middle, and we get our edges back. This is what I'm saying: we've been paid for labour. You've done 40 hours, I'll give you £1,000; another eight-hour day, another £1,000. But how about this? It took you eight hours to do that thing, and you say, "Actually, I can do that thing now in one hour. I still want to be paid the same." Why not? I'm paying for the output. And for those other seven hours, go and do something else; I could pay you more money.
>> That means prices are going up.
>> Prices are going up; everything will go up. So what's your commodity? Think about it: we're human beings, but we've been operating in the world for however many years as human doings. Do something and I'll reward you.
Whether that's white collar or blue collar. So humans sit out at the edges, with curiosity and creativity, and what AI now does is the heavy lifting in the middle. We don't need to analyze the Excel sheet anymore; we don't need to write the report. We can use our creativity and curiosity, and at the end our role is to communicate and connect with others.

Now, do you leave it to do its thing? Yes, but your creativity and curiosity compound on that. You iterate; you don't take its first answer. We all know that if you put a prompt into ChatGPT or Claude, you don't take the first thing back, whether that's down to hallucination or your prompt not being good. So the key is: you're no longer being paid just to do that. The analysis is done. That middle bit is so big right now, but the more we use AI, the bigger these edges will get, and that's where we sit. And then we have more meetings like this. The information in today's meeting could have been shared on Zoom, and I know we do Zoom meetings, but why do we come to this dinner? To press the flesh. We like talking. It's that water-cooler conversation, that random conversation, that makes the difference. And for me, humanity is about connection. If AI is doing the thing that stopped you coming to the meeting five years ago because you had to write the report, that no longer needs to happen. You can come to the meeting.
You can meet your loved ones. You can spend time with your children. You don't have to live for the weekend, the Easter holidays, or Christmas, and the majority of the world lives for those moments.

So my simple architecture around humanity is this: you can now be, but the being starts from you. There was a gentleman outside who asked, "What's our responsibility to the people out there who may not be as au fait with AI?" It's: how do you lead yourself? Everyone here in this room has a responsibility, whether you're in tech, in consultancy, in coding, whatever it is, to educate others. How do I make the rest of the world better? But you've got to lead from yourself. A lot of the leadership programmes I run focus not on how you lead others but on how you show up yourself. Can I look at your actions and the way you talk as an inspiration for how I lead?

The second bit is the accountability. Thomas asked me: who likes to be held accountable? No one does, because you could be called out. Imagine living in a world where everybody literally did what they said they were going to do. That world exists in my world, because I won't say anything I can't do. So imagine you trained your team that way. If you can't do it, why do people say they'll do things? Because they people-please; they fear judgment; all those reasons. You eliminate those reasons. Oh, okay.
Let's live in a world where I'm not going to commit to something I can't do. You then find the true reasons around performance. And the last bit is what you end up with: performance. You measure people on performance; you don't measure people on just having spent eight hours doing something.

So, performance. My background is in sport psychology, and sport is all about performance. There's a simple equation: performance is potential minus interference. What's in the way of you being your best right now? Understanding tech, and using tech in terms of commodities and resources, is not really in the way. What is in the way is your fear: fear of conflict, fear of failure, people-pleasing, avoiding hard conversations, needing to be liked, not wanting to be challenged back. Those are the interferences around what stops people performing. Think about it in a corporate setting, in your setting. Who here is a self-confessed people-pleaser? Right, there's your interference. Imagine, in a moment when you're thinking "I'm saying this just to please someone," you'd dealt with the fact that that person will be fine. What would you say? You create cut-through straight away. Who here is afraid to say no, finds it hard to say no? Yeah; that in itself. You don't need to go on leadership courses. What's the psychology behind your being afraid of saying no?

So how do we lower the volume around accountability?
For me, accountability is not a linear thing. It's not "are you going to do something, and let me know when". That's basic-level accountability, down at this level here. You all know what to do. You all know you've got to write this email, write that report, send that thing. You know it. So basic-level accountability is focused on those things: what's the task, what's the goal? Fine. That's what most people call accountability, bordering on micromanagement. But for me, in this room, it's: how am I holding you accountable to your potential? You're operating here; you know you can be this good, but everybody else sees only this good. So I'm going to hold you to your potential. What's the conversation that allows your grey matter to think, "Oh, okay, I could be better"? And the next level of accountability is genius level: genius-focused. How am I going to tap into your genius? How many times have you had a conversation with someone, maybe at the bar, at networking, that made you think about your life in a different way? And then you've taken a different action. The accountability is not around the action; it's around "actually, I'm not reaching my potential; this conversation has made me realize I've got to look in the mirror and do something different tomorrow". Is there a project you're putting off? Is there a conversation you're putting off that's not tapping into your genius and their genius?
AI can't do any of that. AI is now doing the heavy lifting, so now you can have that hard conversation with your neighbour. That's what we're tapping into, and the book talks about how you structure that level of accountability.

So, I've got seven predictions around AI and humanity. Number one: the outcome economy has begun. It is about the outcome; it's no longer about the actual time you spend. Five years ago: you want me to build a website? I'm going to charge you five grand, because it's going to take me five weeks. Today, I'll still charge you five grand, but I'll get it to you tomorrow. "What am I paying you for?" Well, what do you want, five weeks? You got it tomorrow. You're paying for the outcome, so the price shouldn't go down. The activity economy is ending.

Number two: AI will expose anyone who has never built self-leadership. If you don't lead yourself, you're going to be exposed, because AI is going to disrupt your industry one way or another. It might disrupt it at a surface level; it might disrupt it at a deeper level. Either way, if you're not showing up where you should be showing up, it's going to expose that.

Number three: the middle of the work disappears, meaning the edges become everything. Your creativity, your curiosity, how you communicate, and how you hold others accountable: that is your leverage. That's your edge.
Not the fact that you can analyse a spreadsheet, not the fact that you can write a report, not the fact that you can code, not the fact that you can build something. That is not your edge anymore. The middle bit is no longer the value.

Number four: curiosity becomes the rarest skill. I have this notion when I work with leaders. There's a line in therapy: would you rather be right, or have peace? It's the same with curiosity. When you're in a conversation, however much you disagree with someone or don't follow their dialogue, be curious about where they're coming from. Make curiosity your number one listening skill, and the world starts opening up. We have five senses: touch, smell, sight, taste, and... what's the fifth one? I can't remember.
>> Hearing.
>> Hearing, yeah. The majority of people experience the world through their five senses. Who has pets? Dogs? Does a dog have a sixth sense?
>> Animals do, yeah.
>> So animals sense things that we can't.
>> They say we're not animals.
>> Let's not go there; it's a different conversation. But you get what I'm saying: there are certain animals that have a sense that we don't have.
So why are we arrogant enough to think that we can experience everything in the world as humans? No way. There are things in the world we'll never experience. So for you, practising your curiosity muscle is your single biggest asset. What that means is that you've got to be confident in your own take on life, whether you're a 10-year-old or a 90-year-old.

Number five: communication stops being a soft skill. We talk about soft skills: leadership, communication. Communication is the skill, be that verbal, body language, or online. And I talk about clean communication. Clean communication avoids assumption. When you have clean communication, the accountability conversation is cleaner, because there's no ambiguity about what's expected. Now, does life happen between an agreed task and the delivery? Yes. So at that point, you have to be in communication. What does being in communication mean? It doesn't mean I send Tom an email saying, "Sorry, I couldn't do that for you." It means I pick up the phone and discuss it; I'm in communication with him.
>> That's the key. And teams don't do that.
>> Number six: AI gives us back the time we lost to busy work. Who here uses the word "busy"?
>> I don't, right now.
>> Right now, are you busy?
>> Right, but there's enough work.
>> No, no, no.
When you think of the word "busy", what emotions come up for you?
>> Stress.
>> Stress. Why would you want to be stressed? Why would you induce that in yourself with the words you use? You're never busy; you're just doing what you're doing. Were you working today, seeing clients, then heading home? Whatever you were doing, if someone asked at the end of the day, "How was your day?", you'd say, "Busy day: saw a client, went for a dinner, did a networking thing." You describe it as a busy day, but you get to do this. Everyone in this room: the very fact that you're in this room means you get to do what you're doing. No one has forced you to do it. There might be parts of it you don't like, but you get to do it.

And finally, number seven: the companies that win will be the ones where people feel most human, where they feel they belong. Yes, there are going to be situations where some companies with a workforce of 100 can reduce it down to 20. There's going to be collateral, but it's about being humane: understanding that, and educating people about what the transformation is. It's no different from when the tractor was invented and those 100 plough workers no longer had a job. The only difference is the speed. The speed of change is quicker, and the human mind cannot keep up with that speed.
That's why everyone in this room has a responsibility: because you're in this room, you're in this echo chamber, you're part of that conversation. So the edge is not what you do with AI. The edge is who you are when AI does the rest.

>> Thank you.

And I will sign your books as we go around, between dessert and main course.
>> On the table break.

Sunny Vara

And it's perfectly safe to do, by the way.
>> Does that give you access? What is it?
>> Clever guy. No. What this will tell you: this is basically you auditing your phone to see if it's been hacked into. What you'll see, hopefully, is a grey screen. So: star, hash, two, one, hash, and then press the green button. This is a safe thing to do, and please pass it on to your family, friends and so on as well. If you go down the grey screen and it says "disabled" next to SMS call forwarding, that basically means your text messages are not being diverted to a hacker. It will also show voice call forwarding. If it says "disabled" all the way down, good. But if it says "enabled", your phone is being tracked as we speak. Anyone got "enabled"?
>> They've got one here.
>> Oh, have we? Okay. What does Cla's phone say?
>> Let's have a look.
>> "On all calls."
>> It's not clean.
>> It's not clean, yeah. So that tells me there's a tracker on her phone, and as an ethical hacker I could potentially buy, in a live manner, the calls that people are actually making to her.
>> Does that worry you?
>> Yeah.
>> So come and see me afterwards and I'll clean your phone for you. All right. So, thank you. Now, for those who have got Android phones.
Anyone got Android phones? Perfect. Okay. So, if you dial star, hash...
>> Hold up. Wait a minute.
>> Oh, sorry. Sorry. I'll start again. On your Android phones: star, hash, six, one, hash, and then press the dial button. Tell me what happens. It's different from the Apple one.
>> Call forwarding.
>> Call forwarding, yeah. So on there, if you've got a number that calls are being forwarded on to and you don't recognise that number, that is somebody tracking your phone.
>> With star six one there's an asterisk in the right corner. Does that mean someone's on your screen?
>> Yeah. So, whilst you're doing that: if it has a number on there and you don't recognise it, you've been hacked. Come and see me afterwards and I will clear the phone for you. Finally, what I wanted to talk about: this is obviously an audience with an AI interest. Yes, people are using tools and so on, but the one thing I wanted to say is: do not disregard AI governance and AI security when you develop your AI. All right? That's what we specialise in, and that's how we're supporting Baiju with the fantastic leadership work that he's doing as well. I'm Sunny Vara, I'm an ethical hacker, and please let me know if there's any help I can give you. Thank you. Cheers.
>> Amazing.
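The dial strings Sunny walks through are standard GSM supplementary-service "interrogation" codes (defined in 3GPP TS 22.030); whether they work, and exactly what the grey status screen shows, varies by carrier and handset. As a small reference sketch of the checks demonstrated (the last two codes are standard but were not mentioned in the talk):

```python
# GSM/MMI call-forwarding interrogation codes (3GPP TS 22.030).
# Dialling one queries the network; "Disabled" in the response means that
# class of forwarding is off, i.e. your calls/texts are not being diverted.
FORWARDING_CHECKS = {
    "*#21#": "unconditional forwarding, all calls/SMS (the iPhone demo)",
    "*#61#": "forwarding when unanswered (the Android demo)",
    "*#62#": "forwarding when unreachable (not mentioned in the talk)",
    "*#67#": "forwarding when busy (not mentioned in the talk)",
}

def describe(code: str) -> str:
    """Look up what a given interrogation code queries."""
    return FORWARDING_CHECKS.get(code, "unknown code")

print(describe("*#21#"))  # unconditional forwarding, all calls/SMS (the iPhone demo)
```

Carriers differ in which of these codes they honour, and an unfamiliar forwarding number is not proof of hacking on its own, so as in the talk it's worth having anything suspicious checked professionally.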

Members Attended

No attendees have opted into the directory for this event.