Episode 39

Crafting the Future of Film with AI Insights with Tim Neeves

In this conversation, Director Tim Neeves discusses the impact of AI in the film industry, particularly in the areas of research, writing, audio production, and multilingual content. He shares his experience with AI tools in post-production, such as transcription and audio editing. Tim also highlights the importance of human connection in documentary filmmaking and the potential risks of AI replacing creativity.

Takeaways

  • AI tools have greatly improved efficiency and creativity in film production, particularly in research, writing, and post-production.
  • AI voice technology and translation tools have the potential to enhance multilingual content and reach a global audience.
  • While AI can assist in various aspects of filmmaking, human connection and creativity remain essential in storytelling.
  • The democratisation of video production has made high-quality content creation more accessible to a wider range of individuals.
  • The future of AI in camera technology may involve advancements in image stabilisation, auto-focusing, and intelligent exposure control.
  • AI tools can be used to automate the cataloguing and tagging of content, improving organisation and searchability.
  • AI can be used to analyse images and identify specific conditions, but there are challenges in accurately identifying complex scenarios.
  • AI models can be adapted and trained for various applications, making it a cost-effective solution for different industries.
  • AI tools can assist in financial modelling and provide insights that human analysts may overlook.
  • AI assistants may rely on specialised AI tools to provide accurate and detailed information in specific domains.
  • AI can be used for question generation and podcast editing, but there are challenges in maintaining the conversational flow and editorial decision-making.
  • AI predictions have the potential to revolutionise various fields, but caution must be exercised to ensure ethical and responsible use.

Links relevant to this episode:

Thanks for listening, and stay curious!

//david

--

Tools we use and recommend:

Riverside FM - Our remote recording platform

Music Radio Creative - Our voiceover and audio engineering partner

Podcastpage - Podcast website hosting where we got started

Transcript

00:00 - David Brown (Host)

Tim founded Prospect Arts in:

01:01 - Tim Neeves (Guest)

Thank you for having me. It's been a while since we caught up over coffee.

01:06 - David Brown (Host)

It is, yeah, it has been.

01:07 - Tim Neeves (Guest)

Talked about coming on your show.

01:09 - David Brown (Host)

It has been.

01:10 - Tim Neeves (Guest)

I think it's interesting, like even hearing you read through some of that little background. It makes me think I need to update my LinkedIn profile. It's now 22 years, whereas it still says 17, so, like, five years have progressed. But no, it's good to see you.

01:31 - David Brown (Host)

No, and same, and thanks for coming on. We met because you were working on a film project and you were looking for some funding, talking about grants and filling in grant applications and that sort of thing, and I'd had some recent experience with that through some of the public sector work that I do. But what struck me was that some of the projects you work on I found really, really interesting, and because you've got such a long history working in film and film production, I was really curious to get your thoughts on how the technology is moving forward, how AI is going to fit into that in a larger sense, how it fits into the workflow, where you think it might go and how it might improve the business, but also what the risks are. So that was kind of why I wanted to have you on, and I know we had a really interesting conversation to start off with, so we can just continue that, basically.

02:34 - Tim Neeves (Guest)

Yeah, I mean, yeah.

02:36

ng. I think, probably back in:

03:45

Yeah, sure. We were doing a documentary which went to Netflix called Brave Blue World, which looks at the global water crisis and the innovation that brings hope as we look to the future. And I think at that time, to be honest, I wasn't massively aware of AI. We were hearing about it more in the context of the water industry that we were filming, you know, how they're using AI and nanotechnology to help streamline processes, early warning systems, improving flow and all that kind of stuff. But in the film industry, certainly from our side of it, the AI stuff was probably much more rudimentary; things weren't really built into the platforms we were using. I mean, the kind of work that we do is not that super high-end VFX studio shoot. It's more on the ground, you know, documentary, talking to people, gathering story in that capacity. So from where we were, probably back before the pandemic, researching a documentary was a laborious effort of going on Google, searching for stuff, reading through, collating notes, moving things around, and that, speaking for me personally, has probably been the biggest transformation in my process and in terms of accelerating the creativity.

05:32

So I'm not the quickest writer in the world. I can write well, but it takes me time to get there, and so writing a proposal for a project, back prior to the tools that we all use now... Everybody uses them, and they do speed up the workflow. I would never use things like ChatGPT and Bard and others to do the creative, because I've just not found it works that well for me. But what I have found is, if you feed it with information and then work through it yourself, it means I'm having more time to put into the creative thinking and shaping of things and less time literally structuring things out. And silly little things, like cutting down a paragraph from 400 words to 100 words, all that kind of stuff is literally a dream.

06:37

So, you know, when I think about Brave Blue World, in pre-production we probably spent two to three months on research, and actually we kept on going through the process of production, researching stories and writing and developing ideas.

06:54

You know, I'm currently working on a really, really fast-turnaround project, which is another water documentary, but the timeline is absolutely insane, and if it weren't for these tools it would literally be impossible to do this project. So I think, and this is where I think the challenge comes, is that obviously when technology moves on, expectations shift as well. Like, I remember when I first started out 20-odd years ago, you'd turn up on a shoot with a camera with a fixed lens and a mic on it, and maybe a boom mic and a tripod. Nowadays, when we turn up on a shoot, you've got two or three camera bodies, probably about six or seven lenses, radio mics, boom mic, gimbal, drone, you name it, everyone wants it all. It's, I guess, the democratisation of technology: it all becomes cheaper, which is great, but then the expectations shift as well.

08:07

I think that's probably an initial little thought from my end, in that kind of producer-director role, in terms of research and writing and being able to get your head into things very, very quickly, particularly when you're doing something around technology. I'm no expert, I can't be an expert on everything I make a film about, so in terms of very quickly being able to get up to speed with things, it's amazing. I think it's absolutely brilliant.

08:47 - David Brown (Host)

Yeah, and it's interesting that you mention that, because I think what's happened is, I don't know if you know the hype curve, but basically you get this new technology and there's this fever pitch of excitement about it, and then shortly after that you have what's called the trough of disillusionment, and it basically falls off a cliff, where it's all fun to play with in the beginning, but then everybody says to themselves, okay, we've got to get real now.

09:22

Like, how do we actually use this? And what I've noticed over the past twelve months or so is that we've now gone through, I think, the peak of the hype cycle and it's now coming down the other side, and what I'm noticing, and it's just what you were talking about, is that a lot of businesses are now finding how they fit it into their actual process. Do you know what I mean? It's not the whizzy stuff. You're not doing the creative stuff with it. Or maybe you are.

09:53

Maybe you're using some of the tools, and I know we can talk about the tools that you use to do production and stuff, and I know there have been some huge technological tools to help with that, and I know things like the Adobe tools and everything are putting AI into everything.

10:11

So you've got Adobe Premiere Pro and Audition, you've got Resolve, you've got all these different tools, and they're now building AI elements into everything that they do. But that's almost a side effect of the real use of it. And I was talking to a guy named Steve a couple of weeks ago, and he has a company that does audio ads, and he said they don't use AI voice technology to read the ads. They use AI voice technology to read 14,000 addresses for each store, because no one wants to do that. So they have a real person do the voiceover for the ad, and then, when it comes time to read the address, they just clone the voice and have it read all the different addresses without a human having to do that.
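
For anyone curious, here's a minimal sketch of that batch read-out idea, using the open-source pyttsx3 library as a stand-in for the commercial voice-cloning tools an ad company would actually use; the store list and filenames are made up for illustration:

```python
import pyttsx3

# Hypothetical store addresses; a real campaign would load thousands from a database.
addresses = [
    "123 High Street, Reading",
    "45 Market Square, Leeds",
    "9 Harbour Road, Bristol",
]

engine = pyttsx3.init()  # offline text-to-speech; a cloned voice would slot in here

for i, address in enumerate(addresses):
    # One audio file per store, ready to be stitched onto the human-voiced ad.
    engine.save_to_file(f"Find us at {address}.", f"address_{i:05d}.wav")

engine.runAndWait()  # processes the queued jobs and writes the files
```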

11:04 - Tim Neeves (Guest)

That's brilliant, yeah. I mean, interestingly, we work for a number of different nonprofit organisations, and before Christmas we were doing a very short little piece on Haiti. I don't know whether you know what's going on there, but it's just become incredibly lawless. It's one of the most dangerous places currently on the planet, and this organisation, I think, is having to close down some of their operation because there's a lot of gang kidnappings and all that. But they'd produced this little film from there, and they'd used AI voices on it, and it just felt really weird.

11:49 - David Brown (Host)

I have to say like it just did not.

11:52 - Tim Neeves (Guest)

It just did not work. But I could see, from the other example you gave, how that would work when you're trying to mass-produce a whole lot of different things. Actually, one of the things I was talking with a friend about, and I'm not quite sure how much to go in on this, but there's this new startup that I'm working on, which you helpfully gave a bit of advice on in terms of the government grant, and we were talking about how to open up content to a global audience better. This could be relevant for a lot of the nonprofits that we work for.

12:32

But equally, from an entertainment perspective, or even e-learning videos or whatever, it's how you could create content and very, very quickly and inexpensively put it out into multiple languages, without what we've done historically, which is subtitling or overdubbing, having to go and source VOs from different regional accents and all that. But when you look at, I'm trying to remember the name of it now, what's that company out in the States where you can upload a clip and...

13:13

It can translate you. ElevenLabs, but it's that kind of thing, you know, and how that could be brought into the production world. It's only a matter of time before the likes of Premiere Pro and Resolve have a tool like that, where it can lip-sync, where you can be speaking on camera and then all of a sudden your voice comes out with the same tone, but in French, Arabic, you know.

13:44 - David Brown (Host)

Yeah, I think ElevenLabs is doing real-time translation now, right? I think I just saw that. They just got $80 million worth of funding as well to really grow the company. I used to subscribe in the beginning because I wanted to really play around with it; what I found is that I don't actually use it that much, so I stopped paying for the subscription, but I do keep up with the news on what's going on. And I think the other company that you're thinking of is maybe called HeyGen, which is H-E-Y-G-E-N. That's the one where, in a video, not only will they translate it, but they'll redo the mouth shape so it matches the shape of the word that you're saying, which is insane.

14:32 - Tim Neeves (Guest)

Yeah, I mean, you can imagine that would be an incredibly helpful tool, particularly for e-learning, because you think, take a university like, you know, an Oxford, who are probably putting out content, and I know a number of colleges out in the States, like Berkeley, do a lot of online courses that they want to get out to the world. But if you're a student out in India whose first language isn't English, to be able to have that immediately, all of a sudden you'll be able to reach a much, much bigger market.

15:08

From a documentary point of view, I think about Brave Blue World, which went to Netflix and was translated into I don't know how many languages, but most of that was subtitled. How much more immersive and engaging would that content be to an audience if they could hear it in their own language? I mean, I don't know whether you've watched many, but I hate watching shows when they're overdubbed, you know, 100%.

15:34

I remember watching Money Heist, which I thought was a great Netflix show, back in the pandemic. It's in Spanish, and I was watching it with subtitles, and then I saw that you could watch it in English. I switched over and I was like, this just does not work. But if it were the actor's own voice and tone and lip sync, I don't know, would that feel weird? I don't know.

16:07 - David Brown (Host)

I don't know, and it's a good question, and something I was talking about the other day.

16:13

I can't remember if it was on a podcast or not, so I apologise if I've talked about this already to anyone listening.

16:22

We were watching, what is it, that Married at First Sight show, but the Sweden version. And it could be anything, though; there was a French crime drama called Spiral, and there was The Bridge, which was a Nordic noir type show, and my wife and I both agree that we love to watch it in its natural language with the captions, because even though it's in a language that you don't know, you still get meaning from the intonation and the way that people say the words, and you can get the emotion even if you don't know what they're saying. You know if somebody's yelling at you in a bad way, or yelling at you in a good way, kind of thing. So, yeah, totally. I mean, she switched it over to the English dubbing for a while, because I was just sitting in the living room and she thought I might want to just hear it, and I was like, actually, no, just put it back. That's wrong.

17:24 - Tim Neeves (Guest)

But I don't know if AI would be.

17:26 - David Brown (Host)

I don't know how good it would get. I mean, at some point in the next five or ten years, yeah, you probably won't be able to tell the difference, and then you can put out anything in any language and it will get the intonation pretty much correct, but for now I think it is still quite awkward.

17:45 - Tim Neeves (Guest)

Yeah.

17:48 - David Brown (Host)

And you do mostly documentary-type stuff, don't you? Did I get that right?

17:54 - Tim Neeves (Guest)

Yeah, yeah, yeah. And that is where, you know, there's obviously a lot of fear around the exponential growth of AI, and there are other things one could talk about with that, but purely from a production point of view it doesn't massively worry me, in the sense that you can't replace human connection. And I think that's the thing. Sure, you could use virtual studios, LED walls, and create all of that kind of environment for TV commercials and feature films and stuff like that. But when you're talking about doing real-life storytelling around stuff like climate change, or human interest stories, which people want, seeing the world through other people's eyes, you can't replace that with AI. Thankfully. I'm sure someone will try.

18:55 - David Brown (Host)

Yeah.

18:57 - Tim Neeves (Guest)

But you know, yeah, exactly. But that's it from, you know... yeah, go on.

19:01 - David Brown (Host)

No, no, no, go ahead.

19:03 - Tim Neeves (Guest)

I don't know where I was going.

19:05 - David Brown (Host)

Sorry, sorry, I thought you were done.

19:11

No, it's interesting and I think, particularly in the documentary space, like you said, I can see where it's really helpful for doing all the behind the scenes stuff.

19:19

You know, writing up everything. But it's even helpful with doing things where, I guess, you need to learn about a particular person, or what they're involved in, or what their business is, or whatever, and it's really good at doing the summarisation for that and, like you said, cutting down paragraphs, which I think is totally ironic, because they put you in school and say you need to write a six-page paper on this particular topic, when in the real world, if you gave somebody a six-page paper, they'd be like, can you summarise this in six paragraphs please? It's totally backwards. But anyway, AI is brilliant at doing exactly that kind of thing, and I wonder, have you used it for things like trying to understand what questions you want to ask people? Or is that something that's just instinctive to you, and you know what to ask and how to do the interviews to get the right information out?

20:16 - Tim Neeves (Guest)

Yeah, I've personally not used it for that. I'm sure, obviously, you could. For me it's an instinctive process, and I guess, as part of the research phase, if it's something I really don't know much about and I need to know certain things, then, you know, I'd probably use it in that instance. But generally speaking, an interview is a bit like this: it's a conversation, and you see where that conversation goes. And even from a creative perspective, interestingly, on this project that we're currently working on, the producer-writer and I were working, very quickly, on some treatment ideas, and the client then ended up plugging it through ChatGPT to see what else could be done with it. And it came back, and it was...

21:14

It was so very obviously written by ChatGPT, and I think that's something I've become acutely aware of when I look through LinkedIn. I'll put my hand up and say I'm terrible at social media. It's just not something I particularly enjoy; it doesn't give life to me. But when I do have a little hunt around on LinkedIn, or very, very occasionally on Facebook, it's just amazing how obvious the AI posts are. It really is.

21:50

And I don't know whether people realise that, but I think that's where things can become a bit disappointing: where creativity ends up being sucked out and probably replaced by laziness. Because these tools are brilliant, they're really great, but I think there's no excuse for dropping the ball on creativity just because these tools are there. Does that make sense?

22:19 - David Brown (Host)

Yeah, 100%. And I was on a call the other day and somebody mentioned, they said, oh, I can always tell when it's AI because it puts a comma before "and" in a list of items. And I was like, no, but that's called the Oxford comma.

22:36

But it's interesting, though, because for someone from the UK, where you don't really use the Oxford comma, as opposed to the US, where that's what we're taught in school, it's a key giveaway to someone here, but it wouldn't be a clue to someone in the US, for example. And obviously the spelling: it's almost always US spelling. Even if you tell it that you're in the UK and you want it to use UK spelling, it still uses American spelling for some reason. It just can't figure it out. But those are the little clues that you can find, aside from the fact that it's written in a quite formal way, it's always quite formal, I think, and it's business speak most of the time. And I think that's because the core training was on academic papers, because those were the first bit of content that the researchers had access to. So they just trained it on academic stuff, and so it ends up sounding very academic, or at least most of it does to me.

23:38 - Tim Neeves (Guest)

And then if you ask it to do something a little bit more poetic or a little bit more creative, it just goes away to the other end and just does not sound real.

23:49 - David Brown (Host)

Yeah, have you tried Claude.ai?

23:53 - Tim Neeves (Guest)

No, I should check that out.

23:54 - David Brown (Host)

Try Claude. Claude's really interesting because, in my experience, it seemed... I don't know what's going on with my video either. Do you see it?

24:01

flashing? I have no idea why it does that. Anyway, it's slightly annoying. It's not doing it on my end. Sorry, we're totally off track now, but for anybody who actually saw this, my video is just flashing like mad, but it's not doing it on my camera, so I have no idea why it's doing that. Anyway, yeah, Claude.ai. I find if you're trying to write anything that's more emotional, or something that's a bit more personal, Claude is really, really good at that, and I have no idea what they've done in the background to make it work that way. So, top tip: if you're trying to do something like that, maybe try Claude and see what it's like, and see if maybe it does something a little bit better. But the few times I've asked it to do things like that, the results have been amazing.

24:56 - Tim Neeves (Guest)

Wow, oh, I'll check it out. Yeah, I think some of the tools, from a post-production point of view, are obviously a massive time saver for us, and also a cost saver. Thinking back again pre-pandemic, we'd come back from a shoot and we'd have to send off interviews to be transcribed with Rev or whoever else, spending money and time doing all of that. And we've switched over to Resolve, mainly because I think a lot of people finish films in Resolve, and since they've improved the editing stuff in there it feels a better tool for us. But to be able to create transcripts straight out in the timeline that are locked to the timecode, so that when you're doing a dialogue cut you can literally just highlight a bit of text, hit the little down arrow, and it drops it into the timeline. It's a very quick, simple tool that you don't even need to be a seasoned editor to use. So that is a huge time saver for breaking the back of a story in post-production. But there are so many other things. Obviously you mentioned Audition, and we've often used iZotope over the years, and there are so many great audio plugins out there that can remove... I literally have no idea how it works. It's all crazy.
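
For the curious, here's a minimal sketch of how that kind of timecode-locked transcript can be produced, using the open-source openai-whisper library as a stand-in (Resolve's built-in tooling isn't public, and the input file name here is made up):

```python
import whisper

def to_timecode(seconds: float, fps: int = 25) -> str:
    """Convert seconds to an HH:MM:SS:FF timecode string."""
    frames = int(round(seconds * fps))
    h, rem = divmod(frames, 3600 * fps)
    m, rem = divmod(rem, 60 * fps)
    s, f = divmod(rem, fps)
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

model = whisper.load_model("base")          # small, CPU-friendly model
result = model.transcribe("interview.wav")  # hypothetical input file

# Each segment carries start/end times, so a text selection maps straight back
# onto the timeline, which is what makes "highlight text, drop into edit" possible.
for seg in result["segments"]:
    print(f"{to_timecode(seg['start'])} - {to_timecode(seg['end'])}  {seg['text'].strip()}")
```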

26:30

Removing background sounds... and actually even the built-in one in Resolve is, I think, quite incredible. We've done a lot over the last couple of years, even still, since we're out of the pandemic. For some of our international nonprofits we're still doing edit projects that are based around Zoom interviews and stuff that's been shot locally and sent over to us. And I think that's been a really positive move from the pandemic: it's forced some of these nonprofits to think a bit more creatively about how to create content and not always fly crews out, spending all that money, when actually there are some things that could be done really well locally. Sometimes the quality of the content can be amazing, other times it can be quite extraordinary. But some of those AI tools in Resolve for cleaning up audio, like removing reverb and things like that, are amazing.

27:36

A huge time saver.

27:38 - David Brown (Host)

Yeah, and I think there's a lot of discussion these days about democratising AI and making sure that everybody can use it. But I think part of what AI is doing is democratising a lot of industries, for example video. I mean, you had YouTube, but even over the past, well, up until, let's say, the pandemic, and even a little bit after that, I mean in the last 12 months we've taken a massive step forward in what we're able to do. Because even 18 months ago you could get a transcript done automatically, but it was still kind of wonky, it was a bit expensive and the tools weren't great. Now, basically, that's free and everything.

28:28

If you've just got a piece of software like I use, Premiere Pro, it generates it. You can do editing straight in the text. Do you know what I mean? You can do the same thing in Adobe Audition now, and it's literally democratised that whole thing. And you combine that with mobile phones that can record in 4K, whether you've got an iPhone with a separate app or even just the camera app that comes with the phone.

28:59

I mean, the quality of the film that you can get out of that... you know, ten years ago that would be a $30,000 camera, and now you've just got it in your hand in a phone. And then you've got a free editing tool that you can go into, that has some settings and some features, or maybe you spend 30 pounds on some sort of plugin for it. And the next thing you know, somebody sitting at home who plays around with it for a few hours can make something that's... I won't say it's as good, because it won't be as good as what a professional can do, but it'll be damn close.

29:41 - Tim Neeves (Guest)

Oh, it really is, and that, I think, is really exciting, particularly when I think about our work in the nonprofit and sustainability space. The fact that those tools have become more readily available means that, okay, it reduces the amount of work that we potentially could do.

30:03

But from a world perspective, you know, if I don't need to travel to Rwanda to go and shoot a story because there's someone there who's able to shoot it, and we can collaborate on the storytelling, then why not? I actually do remember working with a friend of mine on a project ages ago; it was a last-minute project, it was the first time he'd ever done it, and he directed a shoot from the UK with an iPad. The DP who went out literally set up an iPad next to the camera, and then my friend over here was talking to the camera, having a conversation like we're doing right now. They had a really nice camera set up over there, but it meant that he didn't have to fly all the way over to Seattle or wherever it was, for the expense but also the carbon footprint. So I think there's something really good to be said about all of that.

31:15 - David Brown (Host)

The podcast that's coming out tomorrow, I was talking to a guy who has been an audio engineer for 20 years, used to work for the BBC and everything, and one of the things he said in the show, which in the back of my mind I kind of knew happened but had never really thought about, was, how many records do you think are recorded with the musicians in the same room? He's like, they don't have to do that anymore. We've done shows where you've had the person singing in LA and the person playing the piano in the UK, for example, and they're live at the same time and they're pulling this together, but they don't even have to be in the same room anymore.

31:59

And that just totally blew my mind, because of the technology behind it, and because I would think with something like that you need the interaction between the musician and the singer; there's a lot of feel to it, and if there's any delay in that, it can throw all of that off and make it really awkward. But he said, no, it happens all the time. And it's that kind of thing. I never really thought about someone directing remotely either, but I guess you can do anything remotely these days, so you've just got to crack on with it.

32:36 - Tim Neeves (Guest)

Yeah, that whole music production thing is brilliant. I remember, probably ten or so years ago, I think Logic Pro were trying to do something like that, and it was so jangly it just literally didn't work. And I even remember during the pandemic trying to do some kind of live music just for fun on Zoom, and it's like, this just does not work. The latency just becomes a horrible mess.

33:02 - David Brown (Host)

Yeah, yeah.

33:04 - Tim Neeves (Guest)

But I think the remote directing thing I think is pretty cool.

33:10 - David Brown (Host)

That's amazing. So, on that, actually, it makes me wonder. I mean, you've been doing this for, you know, 22 years, something like that. How have you seen the progression of technology in general? Do you know what I mean? Thinking about how far it's come over the past 20, 22, 25 years, what do you think the next evolution of this is? Where do you think we're going to go next?

33:41 - Tim Neeves (Guest)

Yeah, I mean, I think possibly where I started out, those early years, was one of the most exciting times in tech development. Like, I remember getting my first RED camera when they came out, the RED One, and that thing was just incredible. And I think it was the price point that was almost the most amazing thing, because there were super high-end cameras around at that time, but they were used on major Hollywood blockbuster movies, where to get a camera body would have been like 150 grand or whatever. And all of a sudden RED came out with this body. I mean, it was not the most stable thing, but it was just fine for us because we were doing documentary things with understanding clients, and it was cool. It was something like $17,000 or $18,000 for the body. I mean, you had to buy other bits and bobs for it. But it's like, this thing shoots 4K? What? That's amazing. And slow motion in 4K was so cool.

34:52

But I think what has happened since then, obviously, with the DSLRs, like the 5D Mark II when that came out, was a huge game changer for the industry: being able to use photographic lenses so you can get that nice cinematic depth of field. And I think what's interesting is we've been slightly pushing back on that. You can see a lot of content out there that's so shallow you don't really get to see the context of the setting, so I think people are now moving a little bit away from the super shallow stuff. But the tech has just got smaller and smaller, and you picked up earlier on the iPhone. I did a Channel 4 shoot a year or so ago where there were a few shots that went into the TV show that were shot on my iPhone, and literally you could not tell, because they just look great, they look really good.

35:49 - David Brown (Host)

It's crazy, isn't it?

35:51 - Tim Neeves (Guest)

And so I think that's probably where things are going to go in terms of just the base level. I mean, the lighting for us is also a huge thing. Up until a few years ago, to get daylight-quality light you'd need huge lighting fixtures with appropriate power supplies to go with them, whereas now you can travel with an LED that's light, doesn't get really hot, and you can plug it into a conventional power socket and get really good quality, controllable light where you can change the colour temperature. So all of that stuff is just making a difference. Even the flat LED panels, I remember when they came out, it was like, what? This is amazing.

36:39

I don't have to take that whopping great bag on the shoot; I can just roll this thing up and stick it in my carry-on. And more often than not we're able to take our principal kit as carry-on when we travel, for fear of stuff getting broken or lost, which, let's face it, has happened quite a few times for us. Tripods don't make it. So I think it's the kit getting smaller. Lenses, I'm not sure. I'm not sure I have the vision for how lenses can improve, because you look at a lens that's 50 years old and has been well looked after, and it just looks special.

37:28

And I think that was the bit of advice I was given when I started out, by a brilliant photographer from the States, who always said invest in good lenses, because the camera tech is always going to change. He wasn't even shooting digital at the time, I think he was just on film. But he was like, you invest in a good lens, it will serve you for life, and I've seen that. The lenses that we've got are as good today as they were 10, 15, 20 years ago, and the camera bodies, you can have a tiny little camera body, but that's where, yes, the difference, I think, with the iPhone is how they handle light, and all that kind of stuff doesn't match up to a really beautiful lens. You could have a really cheapo camera body with a $50,000 lens and it's going to look incredible, whereas if you have a 100-quid lens on a 50,000-quid body, it's not going to look great.

38:34 - David Brown (Host)

100%. And that's when I started off in photography. Much like you, I started off and was sort of an amateur, semi-professional photographer for a while, and I got taught the same lesson very quickly. And it's like, you've got to get the light... without geeking out too much, I'm really resisting the urge to just deep dive into all of that. But yeah, the lens makes such a difference. I have a small, tiny little... I've got a Sony ZV-E10 or whatever it is that I use now for my home studio, and it's perfect.

39:11

But I just bought a prime lens for it, which is really, really nice, and it gives a nice shallow depth of field, but it's not too shallow, and the quality that it gives is incredible compared to something even five years ago. And, like you said, the tech is just going and going and going, and I'm sure somebody's going to shove AI in, they're going to call it AI, but it's just going to be some sort of electronics that go in a lens so that maybe they can correct some of the colour, do some colour correction or something in the lens, or it's going to get better at following focus, or I don't know, something. I'm sure somebody will figure out a way to call it AI and put it in a lens.

39:56 - Tim Neeves (Guest)

Yeah, I mean, I think it's probably going to be more down to the camera bodies, and I think it'll be down to how to use AI for image stabilisation, and it could even be things like autofocusing and aperture, so being able to read the light balance in a space and make intelligent decisions, which actually, to be honest, would be brilliant, because that's always been a bugbear of mine. We very often use auto exposure, but auto exposure never really works, and so I think that would be an advancement: where it's intelligent, like, you know my background here, knowing that it's judging the aperture based on what you are obviously filming, as opposed to a general exposure of the whole image. So I think that kind of stuff could get smarter. I'm not sure what else they could do, really.
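
As a rough illustration of the subject-weighted metering Tim is describing, here's a minimal sketch using OpenCV's stock face detector to meter on the person rather than the whole frame; the file name and the "middle grey" target are illustrative assumptions, not how any camera maker actually implements it:

```python
import math
import cv2

frame = cv2.imread("frame.jpg")  # hypothetical still pulled from the camera feed
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Haar cascade face detector that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) > 0:
    # Meter on the subject: average luminance of the first detected face region.
    x, y, w, h = faces[0]
    measured = gray[y:y + h, x:x + w].mean()
else:
    # Fall back to a general average of the whole image.
    measured = gray.mean()

# Suggest an adjustment in stops towards a middle-grey target (~118 in 8-bit terms).
target = 118.0
stops = math.log2(target / max(measured, 1.0))
print(f"suggested exposure adjustment: {stops:+.2f} stops")
```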

40:56

Yeah, I guess... well, I think there could be cataloguing, cataloguing and naming of clips. That would be a brilliant thing, and that's obviously something that, from a post point of view, is just going to get better and better: as you're ingesting content, all the metadata surrounding it. So that could potentially be built into the camera tech, so that as you're going it's keywording stuff and starring images that maybe were performed better by the cameraman. I mean, you can imagine the DP saying, stop, stop coming up with these ideas.

41:33 - David Brown (Host)

I love it, though. Do you know what, I hadn't thought about that, and it feels like there's a startup in there somewhere, if no one's doing it already. But again, what's interesting for me is that you've already gone back to what's an actual business application for this, something that can take on the tedious tasks, right?

42:00

We're not talking about the creative part of it. What we're talking about is the tedious bullshit that you have to do, that takes hours and hours. Like, if anybody out there has ever shot hundreds or thousands of photos and then tried to categorise and tag all that stuff... I know Google and some of the other platforms are now trying to get a little bit better about that, but you're absolutely right, that's an insanely useful use for an AI tool, because it can interpret and come up with tags that you might not even think about. Do you know what I mean? It could tag the feel of a shot, it could tag all sorts of things that you may not think to tag, and then add all that stuff to the metadata of the images, which then creates its own problem, because now it's exploding the size of the image, because you've got all the metadata and all the other stuff. But that's a good business.
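
To make the idea concrete, here's a minimal sketch of zero-shot keyword tagging with the open-source CLIP model via Hugging Face transformers; the candidate tag list, file name and confidence threshold are illustrative assumptions rather than anything from a production workflow:

```python
from PIL import Image
import torch
from transformers import CLIPModel, CLIPProcessor

# Candidate keywords a producer might care about; purely illustrative.
candidate_tags = [
    "interview", "drone shot", "water", "cityscape",
    "close-up of hands", "crowd", "laboratory",
]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("still_frame.jpg")  # hypothetical frame grabbed from a clip
inputs = processor(text=candidate_tags, images=image, return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model(**inputs)

# Image-text similarity scores, turned into a probability over the candidate tags.
probs = outputs.logits_per_image.softmax(dim=1)[0]

# Keep any tag the model is reasonably confident about (threshold is arbitrary).
tags = [tag for tag, p in zip(candidate_tags, probs.tolist()) if p > 0.15]
print(tags)
```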

42:56 - Tim Neeves (Guest)

I know, that is a business, I think. I mean, it's probably more likely to be built into the post-production workflow. But I think it could be great from a set, like, if you imagine you've got a producer with an iPad that's wirelessly connected to a camera, and it's automatically feeding information, with sentiment analysis in there, figuring out, oh, the talent sounded a little bit moody in that shot, or, yeah, this one's good, let's give that one three stars. That kind of stuff would be great. But I think it's more likely to happen in a post-production workflow, and in many ways that is something around this new business that I've, you know... I mean, the process I finally submitted for this Innovate

43:44

UK grant, yeah, a couple of weeks ago. That was a big process, but it is around this whole idea of how to curate and ingest and catalogue content. So, yeah, lots there.

44:03 - David Brown (Host)

That's interesting. So I've worked with a lot of big data companies, particularly around public sector and transport, and one of the ways they've been using similar tools, now this was before AI became popular and everything was just machine learning, but they've been using machine learning to try and understand from autonomous vehicles when there's, say, a near miss, and then trying to analyse those near misses. You've got 15 terabytes of data from one vehicle for one day, so you've got vast amounts of data that you're trying to churn through, and you can't watch all that footage, so you have to process it somehow. So what you do is feed it into a machine learning tool, which goes through and identifies instances where it thinks something might be a near miss. And then they had people manually looking at those, and if they said, yes, that qualifies as a near miss, the people had to manually tag all of the things they could think of about that particular image, so that they could then go back and start to do some analysis to see if there was anything in common.

45:21

So, for example, does the vision on the vehicle struggle at dawn or dusk, because the lighting is quite awkward? It's okay if it's totally dark at night, and it's okay if it's totally light during the day. And there goes my video again. But it doesn't really work in those kinds of mixed conditions, or, you know...

45:46

Is it one of those roads that's got a lot of canopy over the top, where, when the sunlight comes through, it's almost strobing in your eyes? But they had to have people do that; the tool couldn't figure it out. And so it's interesting that you say that, because that could be another application you maybe hadn't thought of. But that's another instance where you really need to be able to analyse that stuff. And if you're churning through, you know, if you think you've got tens of thousands of vehicles on the road, or hundreds of thousands of Teslas, they must have some AI tools that they're using to churn through some of that stuff to find that.

46:27 - Tim Neeves (Guest)

Yeah, I think that's the thing, you know, a lot of this stuff has already been developed. I had an amazing conversation with a guy over the weekend who's actually local to us, who works in AI and machine learning. I was telling him a little bit about some of the stuff I'm wanting to do, and, again, a few years ago the cost to develop this kind of tech, like for that company you're talking about, would have been a colossal amount of money. But a lot of those models have been built and can be adapted and tested and trained on other things. I mean, it's a world I don't, you know, fully understand, but I think there's a lot that's possible in terms of gaffer-taping this with that. And what could we learn from the likes of a Tesla, where you've got models that are analysing driver profiles and things like that?

47:29 - David Brown (Host)

Yeah, exactly. So I have a question for you: when you use, like, a ChatGPT or something like that, do you feel the need to be polite to it?

47:44 - Tim Neeves (Guest)

Oh, do you know what? I'm quite British, and so I am. I'm like, would you mind, please?

47:50 - David Brown (Host)

Terribly.

47:53 - Tim Neeves (Guest)

Do it again. Yeah, the funny thing is, actually, I did have a bit of a go at ChatGPT. I think I might have told you this when we had coffee. So, as part of this government grant, I was trying to work out some quite complicated financial modelling. I did statistics at A level, but that was quite a long time ago, and I did do my MBA, but we didn't go hugely into the details of compound churn rates and all that stuff.

48:35

So anyway, I thought, do you know what, I bet ChatGPT could help me figure this out. I knew what I needed to do, but it was just the time to try and figure out the actual equation. I couldn't find anything suitable on Google, so I went on to ChatGPT and I fed it some of my numbers and said, okay, so how would this work out? And I was looking at it, and the actual results didn't add up. So I got it to do it a couple of times, and every time it came out slightly differently. And then, eventually, I went and found the proper scientific calculator and did it myself.

49:16

And then I had to go and tell ChatGPT off and say, surely the answer is this. And it came back saying, oh, my apologies, I must have got that wrong. But it did make me think, my goodness, I wonder how many financial institutions are missing a few decimal points somewhere, and how many trillions of dollars are being misrepresented around the world.
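
As an aside for anyone wrestling with the same thing, compound churn is one of those calculations a plain calculator handles more reliably than a chat model: with a monthly churn rate c, retention after n months is (1 - c) to the power n, not 1 - n x c. A quick sketch with made-up numbers:

```python
# Illustrative numbers only.
start_customers = 1_000
monthly_churn = 0.05   # 5% of the remaining customers leave each month
months = 12

compounded = start_customers * (1 - monthly_churn) ** months
naive = start_customers * (1 - monthly_churn * months)

print(round(compounded))  # ~540 customers left after a year
print(round(naive))       # 400 -- the tempting but wrong "5% x 12 months" answer
```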

49:42 - David Brown (Host)

I think what people are finding is that you can't ask it maths. I think now, basic arithmetic, it knows, but in the beginning you could say, what's 2 plus 2, and it'd be like, 7. Because it's not doing the maths, it's doing the language, right? It's looking at linguistic patterns, not mathematical patterns.

50:04

And this gets into something that I think is going to happen, which is, I think, what you're going to find is if you, in the future, you're going to interact with I don't know, some AI assistant and you can say to your AI assistant, you could give it that information and say, hey, I'm trying to accomplish this. Here are the numbers. How would you recommend that I approach it? And your assistant is not going to know. But your assistant is going to be connected to a mathematics AI that would know, and so what it'll do is, on your behalf, it will go out, it will reach out to that math AI, it will have it do that and come back to you with an answer that it gets from there. So it's not going to know, it's just going to know someone who does know.

50:51

And it'll be the same if you want to play chess, or Go, or whatever. You say, I'd like to play chess, and it'll say, okay, great, I'll play chess with you, and then instantly it just connects to a chess AI and passes through the information. I think that's where we're going to end up: all it's going to be good at is the conversational part, but if you ask it details about anything specific, like engineering or chemistry or physics or maths or anything like that, it will go out to another narrow AI tool that can answer those questions and handle the interaction.
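
Here's a toy sketch of that "it just knows someone who does know" routing idea. The keyword-based router and the eval-based maths tool are deliberately simplistic stand-ins; real assistants do this with function calling against proper specialist services:

```python
import re

def maths_tool(expression: str) -> str:
    # Narrow, reliable specialist: evaluate arithmetic exactly.
    # (Restricting the characters keeps eval() safe for this toy example.)
    if not re.fullmatch(r"[0-9+\-*/(). ]+", expression):
        return "not a pure arithmetic expression"
    return str(eval(expression))

def assistant(user_message: str) -> str:
    # Toy routing rule: if the message contains something that looks like arithmetic,
    # hand it to the specialist instead of guessing linguistically.
    match = re.search(r"[0-9+\-*/(). ]{3,}", user_message)
    if match:
        return f"The maths tool says: {maths_tool(match.group().strip())}"
    return "Chatting is my department; anything specialist gets routed to another tool."

print(assistant("What's 2 + 2?"))        # -> The maths tool says: 4
print(assistant("Tell me about Haiti"))  # -> conversational fallback
```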

51:32 - Tim Neeves (Guest)

Well, that sounds good.

51:35 - David Brown (Host)

I mean that's.

51:35 - Tim Neeves (Guest)

I don't know whether you've used AgentGPT? Yeah, so I don't know whether that's something a little bit like that, maybe, where it kind of goes off and uses different tools. And with ChatGPT, obviously, I think there are different plugins you can get now. I've not really experimented much with that.

52:00 - David Brown (Host)

Yeah, there are. I think you were limited, or at least you used to be. I do actually pay for it, but it's another one of those tools that I used a lot in the beginning, but I don't actually use it a lot now. I do use it two or three times a week, but I don't kind of use it like I used to like going back to something we talked about earlier, I used to use it when I very first started doing the podcast.

52:24

I used to use it to try and help me come up with what I thought were interesting questions to ask people and unusual questions to ask people, so I could, like you said, do a bit of research on their industry and what they did and on them and then say give me some interesting questions.

52:39

But what I found is that, even though it gave me interesting questions, that didn't help in a conversation, and it ended up breaking up the conversation. Do you know what I mean? It just turned into more of, like, I don't know, a news story; it felt more like a journalistic interview, and that's not what I want to do. I just want to talk to people, so I don't even use it for that anymore. I do use AI tools to do all the summaries of the episodes, though, and to do things like suggest titles and give me a summary. I have tools that go out and create all the timestamps and the chapters and all that stuff afterwards.

53:20 - Tim Neeves (Guest)

Do you use it for cutting as well, maybe? I mean, we've not tried that ourselves, but I know a number of editors who use, like, Premiere to do suggested cuts.

53:33 - David Brown (Host)

Here's an hour's interview.

53:34 - Tim Neeves (Guest)

Cut it down for me for 20 minutes or whatever.

53:37 - David Brown (Host)

I haven't. I don't have one that does that. I do have a plugin for Premiere that, if you have a multi-camera setup, will actually cut and do all your multi-cam stuff for you. What you do is basically just highlight all of it, then you can play through it, and at any point you can decide. It does speaker priority, basically, in the beginning, but then if you have a wide camera shot as well, it will incorporate those occasionally too. So if you don't do anything, it will literally give you a nice cut, almost with some L- and J-cut type joins to it as well, to give it a little bit more drama.

54:23

And it's not just jump cuts the whole time; it's really good at that. But if you want to then go in and play it back, you can literally just click on a button and decide which camera you want to use. And you can use it with a Riverside recording where I might have two cameras and you might have two cameras, do you know what I mean? We could have four different shots, and it would actually go back and redo that. So that's the tool I have, but I've not tried anything that actually edits the content down. I would think that'd be interesting, but also...
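
For the curious, the "speaker priority" behaviour David describes boils down to something like this toy sketch; the segment times, camera names and the wide-shot cadence are made up to show the logic, not how any particular plugin works:

```python
# Who is speaking in each window of the recording (start, end in seconds, speaker).
segments = [
    (0, 12, "host"),
    (12, 31, "guest"),
    (31, 40, "host"),
    (40, 75, "guest"),
]

camera_for = {"host": "CAM A", "guest": "CAM B"}
WIDE_EVERY_N_CUTS = 3  # arbitrary taste parameter: cut to the wide now and then

edit_decision_list = []
for i, (start, end, speaker) in enumerate(segments):
    # Default to the active speaker's camera, but break up the pattern with a wide shot.
    shot = "WIDE" if (i + 1) % WIDE_EVERY_N_CUTS == 0 else camera_for[speaker]
    edit_decision_list.append((start, end, shot))

for start, end, shot in edit_decision_list:
    print(f"{start:>4}s - {end:>4}s  {shot}")
```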

54:57 - Tim Neeves (Guest)

That's risky.

55:00 - David Brown (Host)

It's risky because, and you'll know this for sure, this is just something that I've learned, but hopefully I'm right: I think it's hard to edit something that you've done, maybe because you have your own ideas. I have my own ideas about what I think is interesting in a podcast episode and the things that I found interesting. But if I gave a conversation I had with somebody else to you and said, look, can your editor edit this down into half an hour, they would probably pull out different things than I would. And there's that whole creative piece of who's right. Well, no one's right, it's just different approaches. And I would find that challenging with an AI, I think, because I'd be like, well, why is it choosing this?

55:53 - Tim Neeves (Guest)

But I guess that's where one of the things that I am finding quite interesting comes in.

55:58

I'd love to understand more about how it could be implemented: how to use AI to guide editorial based on how it's going to perform with the end user. You know, I've often had this argument with editors, where they'll come to me like, oh, this is what I thought I'd put together here, and it'll be different from how I see it, and there's always that kind of power dynamic. But also, I've got to be okay sometimes with letting things go, in the sense that I'm not always going to be right. I know that, and I've got to be open to the fact that someone else's idea might actually be the right thing. In the same way with AI: what an AI engine comes up with could actually be on the money. But I guess that's how things could be tested, so kind of A/B testing on YouTube with multiple cuts of a story, and then figuring out how people interact, where the engagement levels are, and then adapting the edit based on that information. I think that could be quite interesting.

57:12 - David Brown (Host)

As far as I know, and I'm not a monetised creator on YouTube yet, so I don't know if maybe there are some tools I don't know about, but from what I understand there's no way on YouTube to A/B test things. So if you put two versions of the same bit of content up, they're going to strike you, because it's the same content and they don't like that. But I do wonder if, once you reach a certain sort of partnership level with them, they have a tool where you can put in the same video but with different titles and descriptions, and maybe thumbnails, and then A/B test those to see which one performs better. I'm sure people have asked for it, even if they don't have it. So this isn't my original idea, but you're absolutely right.

58:02

You know, and I often wonder, and I don't remember if we talked about this when we talked before, but it makes me think of when they were using AI to see if they could find women who had breast cancer, and I promise there is a link here. When they started, they thought that it was really inaccurate, because it kept giving all these false positives, and they were like, well, it's identifying this, but there's nothing there.

58:28

And it wasn't until four years later that they realised it was so far ahead: these women did actually have breast cancer, but it was identifying it so far in advance that they didn't even think that was possible.

58:40

So the doctors had to go back and rethink the whole thing, and then they started to doubt all their results, because they were like, well, maybe this thing is actually way smarter, or more accurate, than we gave it credit for. And your example is exactly what I've talked to someone about before, because a lot of times I would go into ChatGPT and say, hey, brainstorm, give me five titles that would convert really highly for this YouTube video. And it gives me back titles, and I look at them, and I go, but I wouldn't use that. But then there's this little bit in the back of my mind going, but maybe that's me not understanding what's going to convert, and maybe I should try it, even though it doesn't feel right to me. What's my goal? Is my goal to get people to click on the video and watch it, or is my goal to have it say something that I think it should say?

59:37 - Tim Neeves (Guest)

Yeah.

59:39 - David Brown (Host)

Yeah, it's interesting.

59:41 - Tim Neeves (Guest)

That is interesting. I don't think we'll find the answer until AI tells us, probably.

59:47 - David Brown (Host)

Exactly, exactly.

59:50 - Tim Neeves (Guest)

That cancer story kind of reminds me of another similar example.

59:56

I think it was an Israeli company who were able to predict a cholera outbreak somewhere in Central America, in a country that had got rid of cholera, I don't know, several decades earlier. It was tracking weather patterns and human traffic around the world, and it basically said, oh, in two years' time there's going to be a cholera outbreak. I can't remember the country off the top of my head right now. And everyone was like, oh, that'll never happen, we got rid of cholera 20 years ago. And then a year later it said there's going to be this cholera outbreak within a year, and then within six months, and sure enough, there was.

There was.

But what was amazing is that, because of that predictive information, they were able to have boots on the ground with the kit and the medicines available, to make sure it wasn't going to be that big a problem. It was just amazing to think that, using weather patterns and analysing human movement, they could predict that. Doesn't that, if you take...

David Brown (Host)

So that's a fantastic example. I'd never heard that, so I will research it, find a link to it, and put it in the show notes, so everybody who wants to can go and read it. But that kind of stuff just scares the crap out of me, because if they can predict that sort of thing, what can intelligence agencies predict, for example, and what can bad actors predict? Makes you think, doesn't it? It does. This can all get quite scary if we let it.
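To give the prediction idea a concrete shape, here is a toy sketch of a model that scores outbreak risk from weather and population-movement features. The features, data, and thresholds are entirely synthetic; it only illustrates the general approach, not the system mentioned in the episode.

```python
# Toy illustration: predicting outbreak risk from weather and mobility
# features. All data here is synthetic; this is not the real system,
# just a sketch of the general shape of such a model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per region-month: rainfall anomaly, average
# temperature, flooding index, and inbound traveller volume (scaled).
X = rng.normal(size=(500, 4))
# Synthetic labels: outbreaks made more likely by heavy rain, flooding,
# and high inbound movement, so the model has a pattern to learn.
risk = 1.5 * X[:, 0] + 2.0 * X[:, 2] + 1.0 * X[:, 3]
y = (risk + rng.normal(scale=0.5, size=500) > 2.0).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new, hypothetical region-month with unusual rain, flooding,
# and travel volumes.
new_region = np.array([[2.1, 0.3, 1.8, 1.2]])
print(f"Predicted outbreak risk: {model.predict_proba(new_region)[0, 1]:.2f}")
```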

Tim, I'm conscious we're over an hour already, so thank you very much for your time. Like I said to Adam the other day, we could go full Joe Rogan and probably chat for three hours, but I'm not sure that'd be great for either one of us. But thank you very much for your time today. Do you want to give a shout-out to any of the projects you're working on? Can people go and see some of your work? Is Brave Blue World still on Netflix, or is there somewhere else they can see it, or anything you want people to watch?

Tim Neeves (Guest)

So Brave Blue World is still on Netflix. One of the projects I still love to shout out, and one we're hopefully going to do a little more work on this coming year because the story has developed, is this: just heading into the pandemic, we finished a doc called The Final Fix, which looks at the opioid crisis, predominantly in the US, but also at a revolutionary treatment developed 50 years ago by a Scottish doctor, which claims it can bring you off any drug of addiction within maybe a week or two. It's been held back by the powers that be over many, many years, but as a result of the screenings we did just before the pandemic in the US capital, we managed to get it in front of some fairly influential people. So they are now awaiting the data from a quadruple-blind FDA trial. It's the first time in 50 years that it's managed to get onto that path, because every time before, the drug got pulled.

So hopefully within the next couple of months that treatment will be approved by the FDA, unless there's some other nefarious thing that goes on. But anyway, I think that project is hugely exciting. We were thrilled to have Ewan McGregor narrating it, which I thought was brilliant: you've got the Scottish connection, but also a well-known heroin addict, not in real life. And we feature some guys out in a rehab facility in Kentucky who, in front of our eyes, came clean, having tried for many, many years, using Suboxone, using abstinence, using methadone and various other things, and failed. I thought it sounded too good to be true, but within 15 minutes of these guys plugging into this neuro-electrical therapy, which I'm sure, in fact almost certain, uses some kind of AI in there...

But it targets the neural pathways, and it's a gentle tapping stimulation, a bit like a TENS machine, right? So the brain can produce serotonin and endorphins to provide the natural pain relief that's been subdued by addiction. So, anyway, within 15 minutes these guys are... even without asking them how they are, you can see it. They're like, oh man. It's amazing. So that's really exciting.

David Brown (Host)

So is that... sorry, is that out already, or is that still waiting to come out?

Tim Neeves (Guest)

Well, there is a version behind a paywall on Amazon Prime, but it's something I would like to pick up again, because the story has developed quite a lot. I would just encourage your listeners to keep their eyes out for what we're working on, because we've got some pretty exciting stuff that I can't really go into at the moment, in terms of both documentary projects and this new startup, Change Creators, which is going to be very exciting.

David Brown (Host)

Awesome. Well, for anybody that's curious: obviously we'll stay in touch, we're both local and in the media group together, so I'm sure we'll see each other around. But as soon as anything comes out, or there's anything you want me to mention, I'm happy to do that. We'll put it on the LinkedIn page and on our social media accounts for you when it comes out. And, like I said, I'll put links to all of this in the show notes, so if anybody's curious they can find them: some Brave Blue World material, The Final Fix, all that sort of thing. And if anybody wants to find you and needs a documentary produced, I assume you're the man.

Tim Neeves (Guest)

Yes, well, thank you, it's really good to chat with you.

David Brown (Host)

Awesome. Thanks, Tim. Well, have a good afternoon and I'll speak to you soon.

Tim Neeves (Guest)

Yeah, take care.

Cheers, bye-bye.

Thank you.

About the Podcast

Creatives With AI
The spiritual home of creatives curious about AI and its role in their future

About your hosts


Lena Robinson

Lena Robinson, the visionary founder behind The FTSQ Gallery and F.T.S.Q Consulting, hosts the Creatives With AI podcast.

With over 35 years of experience in the creative industry, Lena is a trailblazer who has always been at the forefront of blending art, technology, and purpose. As an artist and photographer, Lena's passion for pushing creative boundaries is evident in everything she does.

Lena established The FTSQ Gallery as a space where fine art meets innovation, championing artists who dare to explore the intersection of creativity and AI. Lena's belief in the transformative power of art and technology is not just intriguing, but also a driving force behind her work. She revitalises brands, clarifies business visions, and fosters community building with a strong emphasis on ethical practices and non-conformist thinking.

Join Lena on Creatives With AI as she dives into thought-provoking conversations that explore the cutting edge of creativity, technology, and bold ideas shaping the future.

David Brown

A technology entrepreneur with over 25 years' experience in corporate enterprise, working with public sector organisations and startups in the technology, digital media, data analytics, and adtech industries. I am deeply passionate about transforming innovative technology into commercial opportunities, ensuring my customers succeed using innovative, data-driven decision-making tools.

I'm a keen believer that the best way to become successful is to help others be successful. Success is not a zero-sum game; I believe what goes around comes around.

I enjoy seeing success — whether it’s yours or mine — so send me a message if there's anything I can do to help you.