Josh Horneman (00:00)
There's an inevitability that people in jobs, certain people in jobs, will be replaced by this technology. This is just the result of industrial revolutions. And I feel like we're at that point yet again: there is another industrial revolution taking place. That replacement of people, I think, will be slower than the hype that is being thrown around at the moment.
Brad Eather (00:18)
Hello and welcome to the Creative Business Podcast, the podcast exploring the intersection where creativity meets commerce. I'm your host, Brad Eather, a marketing communications specialist helping businesses bring their strategic message to market. Remember, if you enjoy the conversation you can support us by subscribing wherever you are, or go that step further and find me on LinkedIn.
We're currently living through what many call the AI revolution, but for the average business leader, it feels less like a revolution and more like a high-stakes gamble. We're told AI is going to automate everything, yet 70% of the workforce is looking at these tools with genuine hesitation or outright fear. The promise is that AI will make things easier and more efficient, but the reality for most enterprises is a trust gap as wide as the ocean.
tech giants? Do we have to start thinking about this in a completely different context or is there an alternative way to harness these tools while keeping our data and our humanity firmly in our own hands? My guest today sits at this fascinating intersection of high level philosophical debate
and get-your-hands-dirty operational reality. Josh Horneman is the co-founder of HOWLL, an AI operating system designed specifically for the private enterprise. With a background spanning over a decade in business coaching, consulting, and even feature film production, Josh helps leaders bridge the gap between cutting-edge tech and mission-driven growth, and helps them contextualize this new reality.
Today, we're gonna unpack this trust gap in AI, why local hardware might be the last frontier for IP protection, and why, despite the hype around autonomous agents, the human expert must remain in the loop. Please welcome to the show, Josh Horneman.
Josh Horneman (02:22)
Thanks mate, thanks for having me.
Brad Eather (02:24)
Josh, you're not the typical AI guy. You haven't come from a computer science background or a scientific pathway. Why are you the one sitting across from me today? And what do you think you saw early that led you to working in this space?
Josh Horneman (02:43)
The why is probably curiosity. I am innately curious and always fascinated by new and interesting things. That has been just how I've led life, and yeah, it's really what started me on this journey. And I think it's actually something I'd challenge anybody that uses this tech to lean into, to actually get the most out of it. So yeah, it's definitely my why, and it's probably something I'd push others to try and make their why as well.
Brad Eather (03:08)
So how did you end up here? Like, I understand your why, but yeah, go for it.
Josh Horneman (03:11)
Yeah, for sure. So,
So no, no, of course. So I, as you mentioned, yeah, consulting, coaching for a decade or so now, the traditional, you know, going in to help businesses understand what their strategy is, where they're going, how to get there. And we can set a strategy, and you can write it on a big piece of paper, it goes in the top drawer, and people try to execute to it. And that has been the kind of repeating iteration across a whole bunch of clients. We then get to this point where, for me, ChatGPT, probably a month in,
was when I first sort of started using it, and I got to this point of, wow, this may change absolutely everything. For me, the response you got, the feedback loop, the iteration opportunities at that point in time really felt unlike anything I'd experienced before. As you mentioned, I'm not like a coder or a tech guy or anything like that. And so for me, it was really interesting interacting with technology in that way. And I actually just saw
a trajectory ahead of, like, humans that this would really impact. And here we are, three or so, four, actually I don't even know what it is now, a long time later. And it's really, really starting to take hold. But at the same time, it feels like only 1% of the world is using this stuff. And so you've got this kind of balance of excitement and interest alongside the trepidation, which I'm sure we'll talk about at length.
And so for me, curiosity is what drives me. I was like, this is phenomenal, this could be the next big thing. And then I started exploring how it impacts the world that I live in: the world of business, the world of helping people build their businesses. And that's really what evolved into the journey to build out the platform that we have now, and to kind of acknowledge the value of privacy, IP, all these sorts of things,
as it relates to large language models, or AI as we currently know it.
Brad Eather (05:01)
You mentioned that you're not a tech person. And for me, my experience with AI has been similar, although I've come from a technology background. It hasn't necessarily been software development or anything like that. But what AI has allowed me to do is from a non-technical lens, access the world of technology in a completely different way and start experimenting. And all of a sudden,
you have IP that's been locked away or gatekept for a long time from a technical perspective now being able to just flood out. What do you think the ramifications of that are, or how do you see that playing out in real life?
Josh Horneman (05:39)
Yeah, I mean, slop is the word at the moment, right? This technology has the ability to create both phenomenal things and hilarious things and terrible things. And people are basically having a free-for-all around all of it at the moment. I think there are, you know, horses for courses, obviously, different models for different things. And I would argue that we're way further ahead
of any potential regulation, any potential brakes that can be put on this. And so the paradigm is, and/or has, shifted. And so people, who often are the biggest challenge when it comes to change, are either gonna have to jump on board or feel really uncomfortable as a result. I think we're starting to see, and I actually shared something
this morning around this, where there's now this growing perspective, because of how good models are getting, especially this year with some of the latest model releases, that the sophistication of scams and phishing and all these sorts of things is just going to get crazy good, fast. And so yes, you will always have bad actors and good actors in the world. Technology is something they will always leverage.
And so one, you know, potentially negative aspect of this technology is that, but then at the same time, you know, you can teach your children phenomenal things so much faster. You can learn yourself so much faster. You can iterate and create an app in a weekend, right? Like there's all these phenomenal things we can also do with this, which can benefit humans.
Brad Eather (07:05)
You mentioned AI slop and scamming. I suppose when it comes to AI, you're either in it or you're out of it. And from an outsider's perspective, AI slop and scams are the things that are frightening. To bring it back to a business context, when you're speaking to leaders,
where's the hesitancy coming from? Is it this sort of, because we don't understand it, we're feeling the pressure? Walk me through how these...
Josh Horneman (07:34)
Yeah, totally. I think AI slop is a generalist term for the different outputs that come from this technology. A lot of people originally knew it as hallucination, right? AI hallucinates, it lies. The reality of these tools is that, as probability engines, they don't actually know truth. They give you their best guess as the output, and they're getting better and better at being accurate with that. From a business standpoint, there's a lot of risk attached there. So there's this attitude of,
cool, I'm just gonna trust everything that comes out, gonna go use that in whatever way I feel like I should, and hope that it doesn't impact us. For organizations that have anything around, like, regulation, basically I ask people: what are you insured against? Yeah, and if you are not protected correctly or properly, if you let a technology solution take over, what's the impact of that on your business?
And so the slop piece, yes, it feels weird when you see, like, Will Smith eating spaghetti, but in a business environment it can actually be the numbers that are generated. It can be the sort of paragraph that turns out to be the thing that implicates you down the track. There are so many different layers around how I define it in that world. And it's a large part of why people are hesitant. If they've got risk teams or governance teams, they're the ones sitting there going, well, what's the impact?
Brad Eather (08:52)
So you're talking there about the output that you can get from these technologies. When it comes to value in a business context, historically there's been a lot of emphasis placed on intellectual property. You're heavily in the data sovereignty space. Walk me through
what data sovereignty looks like in this world of AI from an enterprise perspective.
Josh Horneman (09:20)
Yeah, totally. I think it's starting to become probably the hottest topic, which is great. I think when ChatGPT first kind of came out and everyone was jumping into a free tool that felt great, and they threw whatever they wanted at it, there was a lot less restriction or safety in place. It's now evolving; the major labs are focusing heavily on safety and guardrails, and the ability to offer you isolated instances, et cetera.
So sovereignty in that sense is evolving. Our perspective was that basically the same people that created, sort of, social media and the world of cookies, the "we want to understand everything about you and potentially profit from it" crowd, are largely the same sorts of people, with a similar business model, creating most of the large, let's just call it closed-source, AI at the moment. And so there will always be some form of a trade-off in that world. Now,
there is also a huge swath of the AI space generating in an open-source capacity, which means you can have and run the entirety of a model on your laptop, your phone, or a few GPUs that sit in a computer. And it never even needs to touch the internet at that point. And so the data that it's interacting with can be air-gapped or siloed in a way. Which is not to say someone can't walk in the door and steal the thing, but you're
in a larger sort of environment of control. And so for me, a lot of large organizations have actually started to implement this. Some big mining companies here where I am in WA have made partnerships to build their own data centers and try to really ring-fence and control, while others are just, you know, all in with Copilot or ChatGPT. And so it's not to say that there is never a risk, or there are larger or smaller risks; it's just, what are we doing to actually manage those?
And this is just the data sovereignty piece. There's a whole other realm of things around this that we're really passionate about: visibility of who's using it and where, the right models for the right solutions. If a person walks out the door, do they take all that knowledge with them on how they were using it, versus you actually understanding the process they've integrated AI into, so it actually benefits you and your business? There are layers to it. But one of the keys for me is an appreciation of where the data is going, where it's stored, and how it's being processed
when it comes to using AI.
Brad Eather (11:33)
This is, I don't disagree with you, but I do want to just introduce a differing perspective for the sake of the conversation. There's obviously an argument that intellectual property is something that we need to keep and control. But there's another argument that says that in a new world of AI, when things are open source, what is the value of intellectual property? In that argument,
Josh Horneman (11:40)
Let's go. Yeah.
Mm-hmm.
Brad Eather (12:01)
there's a sense that the value a business creates is no longer in the intellectual property, but rather in how it's packaged and presented to market. How do you see those two differing arguments playing out in your sphere?
Josh Horneman (12:12)
Mm-hmm.
Yeah, totally. I think when I come back to this change, this paradigm shift that we're having, you're spot on, right? Like, most of the large language models we interact with have been trained on large data sets, a lot of them pirated, which has come out over and over again as an issue with how they exist in the first place. And so we are beneficiaries of the exploitation of IP every time we interact with them. When it comes back to,
I guess, IP as it relates to, let's say, an engineering firm with a unique processing methodology, or a lawyer who's doing something in a certain way that makes sense for them and their clients: that point of difference in market, that ability to say, I can generate you this outcome, is at the moment attached to expertise, but sometimes can be documented in a way that should be protected, or should be rewarded, for that person's creativity or the idea that they brought to fruition. These are kind of the shoulders we all stand on.
The society we've built is on the back of this structure. I see the potential, if we continue down a path where society as a whole relies on a very small few providers of this technology, for risk attached to the individual's ability to be rewarded for whatever it is they generate or create, unless the fundamental economic system shifts.
And so this is where we go deep and get philosophical and start thinking about what change needs to happen. And at the moment, we're not there yet. And so I think there's a risk that on the pathway there, on the journey there, we lose the value of that creativity, the thing that somebody builds and then takes to market or maybe should or could protect because it holds value. Yeah, it's an interesting conundrum.
Brad Eather (13:56)
Because I think at the core here is that this is where we start getting into the futurist sort of expectations, right? There are two worlds: a world that we've come from and the world that we're going to. And in the world that I've just described, where everything's a free-for-all, we're relying on things like societal reform and ideology to change
to actually reach that, and that's where AI gets scary. What does the world look like when the fundamental principles that we've grown up in and know disappear from underneath where we stand?
Josh Horneman (14:29)
Yeah. Do you want me to give you my answer? So, okay. So I'm usually a safe describer of what I think's coming, because fear is not something I love throwing at humans. And also, I think that paradigm shift that people need to go through is going to have to take some time. It's very difficult for people to be like, yep, great, overnight everything is different. I actually think that's an inevitability with the trajectory that we're headed on.
Brad Eather (14:31)
Yeah, please, please.
Josh Horneman (14:55)
And it'll be similar to the overnight changes that happened because of, like, world wars, major conflicts in the past, big things that happened that basically shook the foundations of humanity's future. This has the potential to do that in a couple of ways. One is dystopian and insane, and everyone all of a sudden overnight is like, what, how did that occur? And the other is abundant, and everyone goes, well, I can now achieve whatever I want, do whatever I want.
And so, of the arguments for both of those things that people have at the moment, let's say abundance is usually attached to some form of universal basic income, some form of everybody's got their basic needs and wants met, so on and so forth. That feels difficult in a country like Australia at the moment, where we have such a massive housing crisis, right? People can't find a roof over their heads. And then the dystopia is the, well, you have to do whatever you're told by whoever's in charge, because they know what's best for you. And so those are the two places that
a lot of people go to, from an extremes perspective. In reality, we'll land somewhere down the middle, and it's going to take a bit of time. But I still wouldn't be surprised if one or the other of those extremes happens overnight.
Brad Eather (16:01)
Let's bring it back to reality and talk about the now. I think we're both in agreement, we're both advocates, that people should be using this technology to benefit themselves. You've mentioned to me before, in a previous conversation, that in a lot of the conversations you have with the workforce there's a psychological sort of barrier, there's a hesitancy from staff. They're getting these two versions of the future.
Well, they're getting mostly the dystopia, and they're having people in the media telling them that they're going to lose their jobs. How do you address that psychological barrier? How do you get them to see the reality of where we are now and what this technology is actually able to do for them?
Josh Horneman (16:32)
Hmm.
Yeah, for sure. So yeah, for me, if I go into an organization, say to run a training session, I'll do an anonymous survey initially. Everybody gets to say how they're using it and how they're feeling about it. And probably 70% on average are either apprehensive or, like, unsure, and rarely is a heavy percentage of that population super positive. And it's because of the, I don't understand it, I haven't used this yet, everyone's telling me that I should be afraid of it.
I will then run my session with the intention of showing them the potential, the possibilities. And most of the time, the outcome is that those that have never touched it before come around, because they have an initial light-touch experience relevant to how they do what they do, even if it's just in their segment of a business, right? Show someone what deep research can do for them. Or show someone what quick iteration on an idea to generate a piece of copy can do for them. It's a soft, light touch to this tech.
A lot of people aren't given that chance, given that opportunity; they just see, like, a video online and they're like, is that real or is that fake? And they immediately go, I don't know what to believe, what's happened to reality. And so a lot of people are sitting with this fear, challenged by it, and haven't had the chance or feel like they don't know how to interact with it. So for me, it's like a first step. My challenge to anybody is: just download one of these things and go back and forth with it, but do so with
some knowledge, do so with the understanding that, yeah, it doesn't quite know what truth is, or some of these other things that we've talked about, so you can kind of get a feel for what its impact is gonna be on your life.
Brad Eather (18:18)
I suppose the underlying fear for some people, within an organizational context, I mean, we're talking about jobs, is the fear that you are essentially training your own replacement, or could be. You know, you're very much an advocate for human in the loop. What does that...
Josh Horneman (18:26)
Mm, mm.
Brad Eather (18:40)
What does that look like? How do you challenge that idea that we're not training a replacement, we're actually supercharging what you already do?
Josh Horneman (18:49)
I think there's an inevitability that people in jobs, certain people in jobs, will be replaced by this technology. This is just the result of industrial revolutions. And I feel like we're at that point yet again: there is another industrial revolution taking place. That replacement of people, I think, will be slower than the hype that is being thrown around at the moment, because implementation, change, all those things take time in an organizational sense. But my
usual conversations with people who lead businesses is: they might have an appreciation of a team. Let's say there's 10, 20 people within that team. There might be one or two, like, absolute weapons, people that are great at delivering, constantly showing up, doing everything they possibly can to execute. And then there may be a whole bunch of other people in that team who have a specific role. It might be attached to a repeating task, or an outcome, or a skillset expertise,
and they're required as it relates to an existing business process. But if we look at that holistically, and then understand what's possible with AI in its current iteration, or potentially near-future iterations, you can start to see: wow, do I actually need someone doing that task every day, over and over again, as the definition of their role? Most of the time, the answer is no. And it's not, cool, great, I'm going to now fire that person. It's usually, well, what could they be doing?
What other expertise do they have? And I'll use an example here, where a client we're talking to at the moment has two internal accountants. And one of those internal accountants basically shuffles papers as it relates to, like, travel and reconciliation of receipts and things across the team. It's just nonstop for them. And they look at that and go, I would love that person to be empowered to do what their skillset actually is and bring more value to this organization. So how do we replace all the stuff that they're currently responsible for
with this technology, for them to sit above as an expert and have oversight of, but ultimately unlock them to go do other things? That's how that business is thinking about it, and that's, for me, the right way to approach this. It's either, how do I amplify what my people can output and generate for me, or, how can I replace the things that, I mean, effectively just come back to tasks, movement of information, data, to unlock them from that, to then go do something else?
Brad Eather (20:59)
What you're essentially saying is that, from these big ideas that we explored earlier, we're linking this back to ROI via auditing the mundane tasks and making them more efficient so that we can actually be more productive. Is that fair to say?
Josh Horneman (21:05)
Mm-hmm.
100 %
Brad Eather (21:25)
So, what, yeah, let's go on: how?
Josh Horneman (21:28)
How?
Yeah,
you articulated that perfectly. I think productivity is the inevitable uplift here. And if you look at Australia, right, we have effectively been a zero-productivity country for a very long time now. People spin it different ways and argue differently, but the reality is that's just where we're at. If you're allowed to, or if you're able to, empower the people in a business, the broader economy, to create, to generate, to take on more, to achieve more with
the same amount of time, that is a productivity uplift. And so if I go into an organization and I ask someone in charge, well, where are we at today? What are our margins? What are our revenues? What's the potential? Most organizations are touching, I don't know, 10, 15, maybe 20% of their addressable market. And they go, oh, it'd be awesome to get to 25 or 50, but I'm going to have to hire a whole bunch more people, it's going to take a whole lot of time, there's all these rigmaroles. Okay, cool. Well, can we improve the way your existing business runs
to allow you to attack another 5% or 10% of that addressable market? And then there's your productivity uplift, but not necessarily by bringing more people into the business. And so again, there's another conundrum: what happens to the youngsters that would otherwise have come in and done the photocopying tasks while they're in their apprenticeships, within an accounting firm or whatever it might be? And so there is a lot of challenge around how this impacts the wider societal structure.
But right now that's the immediate opportunity for people. And then what I say to those young people is jump into this technology. Most of them are going, sweet, what can I build with it? What can I do with this? What can I create? Because they're seeing it as the inevitable future that they face. And so you're kind of balancing these two things. One, which is how can existing organizations leverage it in a way to sort of support themselves in a growth journey, et cetera. But then how can that next generation compete, right?
Like, if a young upstart accountant graduate comes out and builds a, I don't know, brand new Xero for construction, and takes that to market because he was able to code it over a weekend and get a few people using it? Epic. Like, that's going to have all these amazing productivity gains. Yeah. So again, I'm an optimist, right? Like, I'm always and will forever be an optimist about this stuff, and so I come to those opportunities first.
Brad Eather (23:46)
What I was thinking there is, can you give some concrete examples of mundane tasks that you've experienced, that you've optimized for?
Josh Horneman (23:58)
Yeah, totally. I mean, I'll use myself first. So I used to do strategy workshops, sort of setting a strategy for a business: go in, run a few different workshops, do some interviews, conversations. It takes a good few hours over a couple of days to two weeks, and you generate an outcomes document that then goes around, and everybody goes, yeah, I like this and agree with it. I can achieve the same outcome that used to take me two weeks in probably about three hours of total time now. And I can
represent it visually, in say an interactive HTML environment, for example, in two or three prompts from a transcript. And so you now have this ability to quickly generate much more palatable outcomes, and that's just one instance, in a way that we've never been able to do before. And then if you take that into a company, so, an example: I've just run that with one of my clients.
They're now off and running with that strategy, but they have a heavy report-writing step within their existing service offering. So they have a huge swath of prior reports. That becomes a knowledge base that AI can look at and learn from, or take direction from. And then they have a relatively formulaic report outcome, and so sections of those reports are repeating,
like an introduction about who we are, or whatever it might be. And maybe it tweaks a little bit with some sort of updates, but most of their existing process around that is still very manual. And so the journey becomes, well, great: can we look at that prior knowledge, which in my opinion is your IP, and can we learn from it, through the assistance of a large language model, to generate the next report outcome? And instead of it taking you two or three days, can it take you two or three hours? But
Brad Eather (25:27)
Yeah.
Josh Horneman (25:39)
an expert who understands how to do this process, and knows what that outcome should look like, reviews in detail what is effectively a draft and turns it into the final, client-ready document. So very simple steps within our processes can benefit in exactly this way. And that's just prompt-and-response right now, right? Then you get into agents, then you get into full automation and all the layers. But I would argue most of the Australian market isn't even ready for that yet.
Brad Eather (26:06)
Yeah. The thing that I was thinking about is just bringing it back to the human level, like motivation in the workforce. When you look at the ability to automate these things, especially when you're dealing with a large group of people, now that you can move faster, it becomes a motivation problem whether or not you can actually execute.
Josh Horneman (26:12)
Mmm.
Mm-hmm.
Brad Eather (26:31)
You could argue that this frees businesses up, if they choose to use their time wisely, to actually take on more leadership responsibilities and empower their teams, to motivate them through a change process like this. Change has become simpler. How are you finding the motivation piece?
Josh Horneman (26:31)
Yeah.
It's still human, right? And you still need to go through an exercise of understanding who's involved and what their interest levels are. And so one of the reasons for doing, like, an anonymous survey is you get a vibe for what's going on, where everyone's at. And then you maybe run a couple of actual in-person human sessions, which further lets you get a vibe for who's in the room, and who's keen and who's not. Yeah, there'll be the same perspectives from the leaders in the organization. And often you'll start with
Josh Horneman (27:19)
a power user, someone who's already really motivated and interested, working quickly towards showing others: I can now do X or Y, and I'm doing it this way, and it's achieving these outcomes. One of my favorite places for this is the not-for-profit sector, because they are so impact-driven but so admin-heavy in most circumstances. And so if you can help a power user in that sector get to an outcome faster,
removing a lot of those, you know, simple and repeating steps, or simple but challenging admin steps that they don't want to be doing; they're all about impact, and they've got an end individual or group that they're trying to help, and they can show quickly that they have. So I still think, and it's not just human in the loop in how we use it, I think it's about maintaining an understanding of our humanity in how we roll it out and how we help people upskill, learn, and benefit from the broader technology.
Brad Eather (28:08)
From a societal perspective, I think we've landed there. From my own experiences using AI and the iterations that it's been through, I don't know, probably the last two iterations, things for me at least have seemed to stagnate a little bit. And we're sort of at a point, for now at least, where
Josh Horneman (28:16)
Mmm.
Mm-hmm.
Brad Eather (28:37)
we have a pretty good understanding of the capabilities and the limitations of this technology, and now we can actually start to implement it without this whole, where's it going next, idea. Talk me through the physical limitations of AI growth and what you think it might mean for the future of work, and from a societal perspective as well.
Josh Horneman (28:50)
Mm-hmm.
Yeah, of course. I think, you know, there's an inevitability at the moment around access to power, right? That power is what we need for the compute, and the scale of compute, that everybody's currently talking about. And different countries are approaching this in different ways. So there's the potential for a pinch point there that means people are restricted in their access. And that could mean that, you know, rate limiting in some of the big public models becomes more significant, because they can't actually give you what you want.
But it could also mean you run something locally on a GPU that you own, and you can go hot as long as you want, as long as you can keep paying for the power attached to it. So I think the power piece is a real thing, and there are obviously lots of people trying pretty crazy things to fulfill the projected need for it. And yeah, where that goes, I'm not a hundred percent sure. But I do think, at the same time, I kind of try to stick
really close to the labs, especially the independent research labs, trying to build new ways of creating these models. They are quickly developing way faster, way more capable models that are smaller and smaller and smaller. And so then you have this kind of battle of, do we need that much compute to still get the same technology or intelligence outcomes? And so they're pushing both levers at the same time. So the growth is there.
Hopefully we can sustain it, and we don't get to a point where people are restricted from its use. But I would say a couple of things. The first being power. The second being the ability to actually access this. So at the moment, you need a decent computer with a decent GPU to run something locally. Your iPhone, using a great app, one of my favorites called Locally AI, can run small models at the moment,
and you can ask it questions and get answers. They'll obviously be quite restricted in their size, but you've still effectively got the entirety of human knowledge in your pocket on airplane mode, disconnected from the internet. So we are at that point. When it comes to a business that wants to serve 200 users, 500 users on a completely localized instance, that's a big investment still. That's racks of A100s, especially if they've got people hitting them in different ways, wanting all sorts of different outcomes.
And so the scale of it, especially from an investment perspective, can still be pretty significant. But I think we're going to come to a tipping point soon. And this is the argument a lot of the major guys are making, where it becomes deflationary. You're hearing that a lot now, that this technology will drive a deflationary outcome or impact on the market. I don't know what timeframe that'll be in, but I actually do think that's possible. I think it will get to a point where its accessibility, and its ability to
impact us as individuals and the way we exist, let alone the way we work, will have pretty phenomenal results.
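To put rough numbers on Josh's "keep paying for the power" point, here's a minimal back-of-envelope sketch. Every figure in it is an illustrative assumption (GPU wattage, electricity rate, hours of use), not a quote from the conversation or a vendor spec:

```python
# Back-of-envelope sketch: what "paying for the power" of a local GPU
# might look like. All numbers are illustrative assumptions only.

GPU_POWER_WATTS = 350    # assumed draw of one consumer GPU under load
ELECTRICITY_RATE = 0.30  # assumed cost per kWh (local currency)
HOURS_PER_DAY = 8        # assumed heavy daily usage

def daily_power_cost(watts=GPU_POWER_WATTS,
                     rate_per_kwh=ELECTRICITY_RATE,
                     hours=HOURS_PER_DAY):
    """Electricity cost of running one GPU flat out for `hours` a day."""
    kwh = watts / 1000 * hours          # watts -> kilowatt-hours
    return kwh * rate_per_kwh

cost = daily_power_cost()
print(f"~${cost:.2f}/day to 'go hot as long as you want'")
```

Swap in your own tariff and hardware draw; the point is just that local compute trades a metered API bill for a predictable power bill.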
Brad Eather (31:52)
From a power perspective, what are the limitations? I think, yeah, I think...
Josh Horneman (32:00)
What? Cooling. Cooling at the
moment. Yeah. Yeah. Cooling is the biggest limitation, because of how the compute component works. If you're sitting your laptop on your lap doing a task that makes it start humming crazily, and then all of a sudden it gets really hot, that's it working hard. A GPU is what sits behind a large language model at the moment, and when you turn that on, because you've asked it a question, it starts working hard, and that generates heat.
Right? So for them to not burn out, blow up, explode, stop working, you need to constantly implement some form of cooling. And there are a whole bunch of different ways that this is possible. Nvidia, obviously being a leader in how this is happening in the market at the moment, are getting better and better at deploying AI hardware that doesn't overheat, that can run at, say, a higher temperature, or deliver the same intelligence at a lower temperature. So
The cooling element, especially when you hear people talking about like using all our water and all these sorts of things is because that's the reality. We need to cool the base technology that gives us these outcomes. So that's why power is important because all of that takes from the grid in some way.
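Josh's cooling point can be sketched the same way: essentially all the electrical power a GPU draws ends up as heat that the cooling system must remove. The numbers below (per-GPU wattage, overhead multiplier) are illustrative assumptions only, not datasheet figures:

```python
# Rough sketch of why cooling scales with compute: the electrical power
# a GPU draws is dissipated as heat. Figures are assumptions only.

def cooling_load_kw(num_gpus, watts_per_gpu=700, overhead=1.3):
    """Approximate continuous heat load, in kW, for a rack of GPUs.

    `watts_per_gpu` assumes a datacenter accelerator under load;
    `overhead` is an assumed multiplier for CPUs, networking, and
    power-conversion losses (a PUE-style factor).
    """
    return num_gpus * watts_per_gpu * overhead / 1000

# A hypothetical rack of 8 accelerators:
print(f"{cooling_load_kw(8):.2f} kW of heat to remove, around the clock")
```

That continuous heat load is why data centers turn to water and other cooling schemes, and why the power and cooling questions are really the same question.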
Brad Eather (33:09)
I think what I wanted to cover today is everything from the AI infrastructure piece to the futuristic sort of, what's the word I'm looking for? Philosophy. From the philosophy perspective and the human perspective.
Josh Horneman (33:21)
Yep, philosophy. Yep.
Mm-hmm.
Brad Eather (33:30)
There are three different core debates that we could be having when it comes to this technology. For the average person, I don't think we need to know about the power stuff. We don't need to know about the limitations. For the average person, what's your advice? What should the average person be focusing on when it comes to this technology?
Josh Horneman (33:38)
Hmm.
Amplifying their intelligence. So really and fundamentally for me, the way I look at AI as we currently know it is as a mechanism for amplifying our individual intelligence. And so that might mean, you know, previously we turned to Google to ask a question and get a response. Now, my kids just go, "Can I ask Grok a question please?" They don't even use Google anymore. They'll pull it up and whack a question in, and they'll get an answer really quickly. It's the least rate limited, gives you really fast responses, it searches
all sorts of different sources, and they're just bang, bang, bang, back and forth with it. I mean, it's teaching my son to play piano at the moment, because he'll just sit there talking to it, and it's like, put this finger on middle C and then do these things, and he's learning songs. So the ability for us to interact with it as a tutor or a researcher, something that challenges our thinking, all of these different ways of using it and thinking about it, are what amplifies our intelligence. Now, the majority of the population
are still doing their jobs, raising their kids, trying to get to the gym, trying to enjoy some, I don't know, swimming, golf, art, whatever. Life, yeah, consumes us. So my challenge often is: if you start to just experiment, and start to think about where it can complement what you're already doing, you will start to see benefits. And so that could be you're going to the gym with it as your coach or your guide,
or the person that runs your training sessions for you, because you're like, well, what do I do next? And that's a great starting point. Or it could be, you know, my daughter's learning about the First Fleet at the moment, and she's just going hard on it. She's learning all sorts of different perspectives on it, finding a video here, an article there, or a book to go get from the library. So it's this kind of guide for her from a research perspective. Now, where this steps into interesting territory is when people rely on it too much.
And so we're seeing more and more of what they're calling AI psychosis, where a human gets way too attached to the technology because it feels so human. And herein lies this tipping point, and also one of the biggest issues that I have with how it's being served by the big guys: we are users in their perspective, and they want us to be users as much as possible. Right? The only two contexts in which we use the word "user" in current society are digital technology and drug abuse.
So there's a fascinating conundrum there, right? In how we manage this and balance this ourselves as individuals to make sure it amplifies our intelligence.
Brad Eather (36:22)
Just reflecting on how I've used it, I think the way that I've sort of gone about it is not necessarily finding out new things, but challenging what I already know and exploring things that I know to be true to find new areas to build upon that. So,
Josh Horneman (36:35)
Mm-hmm.
Brad Eather (36:44)
Again, I mean, when you're talking about AI, I think it's fair to say you can use it smartly and you can use it completely stupidly. But I think maybe that's a good place to start: challenging the things that you fundamentally know to be true, finding where the holes are in this technology, so that you can truly iterate on what it's doing, because you can see the gaps in the knowledge that it's giving back to you.
Josh Horneman (37:17)
100%. You remain the expert though, right? You remain the person in control.
Brad Eather (37:20)
Yeah. Yeah.
Josh, we've come to the end of the conversation, and at the end of every episode I like to have a conversation about creativity. I think your answer is gonna be pretty cool. From your whole experience, from coaching, and, brief little teaser, some work you've done in film,
a truly creative industry and then coming all the way through to what you're doing now and building a business. What's your definition of creativity?
Josh Horneman (37:53)
Yeah, this was a really cool question. I think for me it is the coordination of chaos. That could be our thoughts, it could be the combination of people, humans, technology, to truly create something unique, different, beautiful in the world. And that can be physical and tactile, or it could be a technology-led piece that never actually gets to be touched by anybody. Yeah. So it's that kind of chaos, the coordination of that chaos, to create something
truly unique and beautiful.
Brad Eather (38:22)
Creativity is chaos. I haven't had that one before, but that's a nice analogy.
what's an example of chaos that you've manufactured into creative output?
Josh Horneman (38:31)
Well, a feature film is a great example. That is one of the most chaotic things you can ever try to turn into a physical thing that exists, when you actually appreciate what goes into it. But I guess for me, this also came from, when I was sort of thinking about this, the human mind, right? We have how many thoughts a day, and how many thoughts before we do a thing, take action, create a thing? We have how many overthoughts, right? They get in the way of us actually doing that.
Brad Eather (38:33)
Hahaha
Josh Horneman (38:57)
And so for me, that is the chaos. And I think in the modern world there is this cacophony, a great word, this cacophony of chaos that we have to navigate to be creative now, right? We have to be bored, disconnect, whatever that might be, to allow us to actually create the next thing, to build the next thing, to achieve, to invent, whatever it might be. And there are some awesome tools now that can help us get there, but even then, that can be a distraction. Yeah, therein lies the chaos.
Brad Eather (39:23)
So do you think that, in the world that we're talking about, I mean, AI technology, how much presence do you need to have to allow yourself that silence in order to achieve that? Yeah.
Josh Horneman (39:39)
I think it's everything. I think
the essence of what it means to be human will stay with us forever. Like I don't actually think that we will be able to continue on our path as a species without that. I think it is so important to disconnect. I think it is so important to leave your phone at home and go for a walk with a pen and paper. I think all of that is still at the core of...
everything that we're going to achieve as beings. And if you talk to any guru or any successful human or any of that sort of stuff, most of them have some sort of practice around that, which has supported them in getting there. So a lot of the noise is just distraction. If we can navigate it, we're in a pretty phenomenal space.
Brad Eather (40:21)
great answer mate. Josh it's been a pleasure having you on the show where can people find you if they're looking for you?
Josh Horneman (40:29)
Yeah, I mean, joshhorniman.com is my coaching website. HOWLL, H-O-W-L-L dot A-I, is the platform that we're building from an AI perspective. I spend a fair bit of time on LinkedIn. As much as it's turning into Facebook, I do still enjoy it and love interacting and connecting with people there. So yeah, that's me, and I'm based in Perth, WA. So if you're over here, shout out, I'd love a coffee.
Brad Eather (40:52)
Awesome. All right, thanks for joining us on this episode of the Creative Business Podcast. Still getting used to saying that. If you enjoyed the conversation and wanna help us grow, make sure to subscribe wherever you are. And in the meantime, stay creative.