
Future Shocks

There are really positive futures and really negative futures and it's about us choosing the right one.

Toby Walsh

Floods. Fires. Plague. We’ve seen them all in the past few years, fuelling a renewed sense of an unpredictable world. Add to this the galloping pace of technological change (ChatGPT anyone?) and it is easy to feel overwhelmed. What shocks lie ahead? And what kind of resilience do we need to build to ensure we are prepared? Hear from three thinkers as they discuss climate change, health, technology, and look over the horizon to explore what is to come. Joëlle Gergis, Norman Swan, and Toby Walsh appeared with Julianne Schultz.

This event was presented by the Sydney Writers' Festival and supported by UNSW Sydney. 

Transcript

UNSW Centre for Ideas: Welcome to the UNSW Centre for Ideas podcast – a place to hear ideas from the world's leading thinkers and UNSW Sydney's brightest minds. The talk you are about to hear, Future Shocks, features UNSW Sydney AI expert Professor Toby Walsh, author Julianne Schultz, journalist Norman Swan and climate scientist Joëlle Gergis, and was recorded live at the 2023 Sydney Writers’ Festival.

Julianne Schultz: Good evening, and thank you for being with us this evening for this session on Future Shocks. My name is Julianne Schultz. It's my pleasure to be hosting this session today. Before I start, I'd like to acknowledge the traditional owners, the Gadigal people of the Eora Nation, and pay my respects to elders past and present. I'd also like to pay particular acknowledgement to any First Nations people who may be with us this evening. This is a time in this country where the level of public discourse is as violent and aggressive as any that I can remember, probably since Pauline Hanson first came onto the stage in 1996. And I know that's taking a great toll on many people. So I just hope that at this time, we have leaders and people across the nation who can actually grind this disgusting talk into the dust, and for those of you who are feeling the pressure greatly at this moment, I'd just like to say that, you know, you have a lot of allies, and I'm sure that there are many of them here in this audience. 

So our discussion this evening is about Future Shocks, and our three distinguished speakers really have a lot to say; they bring very strong, strongly informed and unique perspectives to this discussion. Let me introduce them. Dr Joëlle Gergis is an award-winning climate scientist who is based at the Australian National University. She served as a lead author – and one of the very few women, and even fewer people from this part of the world – on the United Nations IPCC Sixth Assessment Report that was released last year, and she's the author of Sunburnt Country: The History and Future of Climate Change in Australia and, most recently, Humanity's Moment: A Climate Scientist's Case for Hope, which is up for an award tonight at the Australian Book Industry Awards. Joëlle has contributed chapters to Greta Thunberg's The Climate Book and Not Too Late, the book that was edited by Rebecca Solnit and Thelma Young… I've got this. I've printed this in 10-point, can you believe it? You'd think I'd know by now. Can you pronounce her name?  

Joëlle Gergis: Lutunatabua, I think it is.  

Julianne Schultz: There you go. And she co-hosts a wonderful podcast that The Conversation has recently put online called Fear and Wonder, which I would recommend to you. Please welcome Joëlle. 

Professor Toby Walsh is a man who's found his moment. He's now Chief Scientist at the University of New South Wales' new AI Institute, and he's a strong advocate for limits to ensure that AI improves our lives. He's spoken to the UN, heads of state in many places, parliamentary bodies, company boards, and many others around the world, and you see him now as almost a regular fixture on Australian media, for which we can be very grateful to have his expertise in the public forum. His advocacy has led to him being banned indefinitely from Russia. He was named on the international ‘Who's Who in AI’ list of influencers, and he's written a trilogy of books exploring the risks and opportunities of artificial intelligence, the most recent of which is called Machines Behaving Badly: The morality of AI. Please welcome Toby. 

And finally, I'd like to welcome Dr Norman Swan, who's an old friend of mine, who hosts ABC Radio National's Health Report and co-hosts the Coronacast podcast on the coronavirus, which, rather like the virus, has had a longer life than we initially expected. Norman trained as a doctor in Scotland before becoming a journalist in Australia and has become the most trusted voice of medical reporting and commentary in this country. You've heard him on the radio, you've seen him on TV and online. He's received many awards, including a Gold Walkley, the Medal of the Australian Academy of Science, fellowship of the Australian Academy of Health and Medical Sciences, and an honorary MD from the University of Sydney. His first book, So You Think You Know What's Good for You?, was a bestseller, as was his latest, which asks an even bigger question: So You Want to Live Younger Longer? Please welcome Norman. 

So this session is called, Future Shocks but maybe it should be called Now Shocks. We've been fed a diet of apocalypse for decades. It's there in our public debates and in popular culture. Movies, TV, and games have made disasters commonplace. Generally, though, the goodie survives, but not always. The camera has lovingly captured the natural world or its virtual proxy out of control as it freezes, burns, floods, is contaminated, or otherwise implodes. And if that wasn't enough, there was always an imagined world of robots, slime, computers, and aliens taking control, all on the screen somewhere near you. It's almost as though we've been paying to try and scare ourselves to death, maybe to prepare ourselves for real catastrophes, or maybe like the boy who cried wolf, we've seen it so often, we don't believe it. We've become inured to the crises. The phrase ‘existential threat’ has almost become a cliché, without a moment's thought that it means a threat to our very existence.  

We're good at catastrophising but not so good at preparing for real catastrophes. Our three speakers have thought deeply about this and have some really interesting perspectives to share. We've seen this up close over the past few years, in particular. First, there were the fires, then the pandemic, then the floods, all accompanied by a drumbeat of misdirected early-stage AI, with a snappy name, Robodebt, devastating thousands of lives. When disasters happen in real life, people say it seems surreal, like something that happens on a screen. But sadly, it's real. 

Although, as Toby Walsh says, these days, unless you are there and you can touch and smell the event or the person or the environment, you can't be sure anything is real, such is the speed and transformative nature of AI. So, Toby, to get the scare level high, I'll start with you. ‘Do no evil’ used to be Google's slogan, but it's now slipped to the end of the corporate mission. Do you have a reasonable expectation that this ‘doing no evil’ is possible with AI? 

Toby Walsh: It is just a tool at the end of the day; it is just doing what we design it to do. And we are... maybe the wrong people are doing the design and we're being careless in the design. But ultimately, I believe we can use it for good, in the sense that we can amplify what we do. And humans – I mean, it's worth pointing out, let's be a bit positive here – humans are terrible at making decisions. We're full of behavioural biases. I mean, if you look at behavioural psychology, it's a catalogue of the ways that we make irrational decisions, and we have the potential to make more evidence-based, better decisions.  

Equally, we also have the same potential with the same algorithms to embed the biases of the past, because we're training on the historical data of the past, embed the racism and the sexism, and ageism, and all the other isms that we've been trying to rid from our society for the last 100 years. Embed them into algorithms that are lacking in transparency, lacking in accountability, and are going to further increase the inequality within our society. So there are really positive futures and there are incredibly negative futures, and it’s about us choosing the right one. 

Julianne Schultz: So why do you think that there is this sort of feeling that it’s like the ultimate Frankenstein, Frankenstein’s monster, at the moment? I mean, there is a real palpable fear that’s around. 

Toby Walsh: I think those myths, those first stories that we’ve told, go back through Frankenstein, go back through our history of telling stories. We’re always worried about whether what we create will get the better of us – ever since we invented fire we’ve worried. Fire was, of course, the thing that allowed us – I mean, Norman’s the better person to talk about this – it allowed us to cook food, and to change our diet and to live on the planet. But equally, fire was the thing that allowed us to fight war and has caused immense harm. It was meant to do good, and it’s the same deal, the Faustian deals that we make all the time. But I think the challenge, the difference – and this is the fundamental difference with a technology like AI – is the speed. We’ve never had technology where you could so rapidly disseminate it and put it into the hands of billions of people. And it’s no surprise that ChatGPT, this AI chatbot that everyone’s talking about, everyone’s excited about, everyone’s nervous about, was the fastest growing app ever. In the hands of a million people at the end of the first week, 100 million at the end of the first month. And now, less than six months later, it’s in the hands of billions of people. It’s part of Snapchat, so that’s three quarters of a billion people; it’s part of Bing, that’s another half a billion people. We’ve never had technologies where you could get them out there so quickly. And so, even small harms multiplied by a billion, and happening so quickly, before regulation, before anything can happen, are going to be really significant. 

Julianne Schultz: So, Norman, do you think that there’s something qualitatively different in terms of AI, not just in terms of its speed of adaptation, but its potential impact? 

Norman Swan: So at the moment, as far as I understand it, whilst we’re impressed with its potential and its current power, and it surprises us all the time, it doesn’t yet have general intelligence. So it’s still a machine that does one thing after another and predicts the next word – it just does it incredibly well and gathers all this stuff together – and we’ve probably got a moment or two before general intelligence develops. Maybe Toby thinks that it never will. But when it does, that’s when we need to be really worried, and the problem always with new technologies is regulation.  

We, the regulators, the laws, always fall behind. But also throughout history, we've had a moral panic about new technologies. There was a moral panic about the printing press, there was a moral panic about the fountain pen, a moral panic about television, the laptop, the mobile phone, social media, and so on. And exactly as Toby says, there's a yin and yang to everything, and there's a moral panic going on now. But the problem is also that regulation has always chased the technology. So we have the bomb, from the splitting of the atom. And we've also had nuclear power and nuclear pharmaceuticals, but we're still living under the shadow of the split atom, which Einstein predicted would happen. 

Julianne Schultz: The regulation issue is an interesting one, isn't it? Because we're seeing now leaders of AI companies, you know, in the US, for instance, saying very publicly that regulation is needed. Do you have a clearer sense of what that would actually look like, Toby? I mean, what sort of regulation are we talking about that would be meaningful in this context? 

Toby Walsh: There are a lot of things where we don't actually need new regulation, we just need to apply existing rules. We have quite a few existing laws, but somehow we got sold a bit of a lie, in the sense that those existing rules didn't really apply to the digital space. And actually, they do. There was this idea that you couldn't, because somehow it was different – it wasn't physical, it was these bits, and these companies crossed international borders, so existing rules didn't apply to them. And that you shouldn't, because that would stifle innovation. And I think we're discovering actually quite the opposite. You can – existing rules do actually apply just as well to these companies, and the fact that they cross international borders isn't actually that important; you can actually apply laws in nation states, and they actually have to sit up and pay attention. And that you should – actually, it's harming our democracy not to, and it's harming our economies not to… 

Norman Swan: But will those rules stop bad actors? I mean, that's what people are frightened of, bad actors. And we're worried once the technology is out there, you've got bad actors who can use that technology and just ignore the law and ignore the rules. 

Toby Walsh: But laws apply to those bad actors, right? So maybe we're talking about bad states or rogue states – that's why cybersecurity is incredibly important these days, we do have to worry about those bad actors – but within a state, the existing laws apply to those bad actors. But also, I think there are fresh harms that are going to be done, and there are some laws that we should be thinking about. One is holding the platforms a bit more accountable.  

Julianne Schultz: It's interesting, isn't it? I mean, you know, you see something like Facebook starting with a slogan that says, ‘move fast and break things’. I mean, the intention of these corporations was that the rules shouldn't apply to them. And if they did, they should be broken.  

Toby Walsh: It was, I mean, it is one of these boiling frog problems, where it slowly crept up. Facebook was, remember, a dating site for people who went to Harvard. And so yeah, you don't really need too many rules for a dating site for people attending Harvard, but it slowly encroached into our lives and has now become so important that elections are decided, and violence is created, by posts on Facebook. So we do need to think carefully about it. The fake images that AI is now generating are not in themselves particularly harmful; it's the fact that they get put in front of millions of people. The harms are going to be significant. So if we held the platforms a bit more accountable for the content that they are distributing to so many people, I think… 
 
Julianne Schultz: Are you optimistic that that can be done? 

Toby Walsh: I don't think it's going to be perfect, I think it's going to be challenging to do. But equally, the idea that the platforms have no accountability for their content, that they're just disseminating the content and have no responsibility – we hold traditional media responsible for the content they put up. These platforms make serious money; the only businesses that make more money than them are illegal. So I think they could perhaps spend a bit more of it on policing – you know, simple things, like having a little tick saying, "This comes from a verified source", or a little question mark saying, "This comes from a source that we have no…". 

Julianne Schultz: It's interesting, I mean, because the resistance is so great. I mean, you know, they will do everything to resist this. 

Toby Walsh: The interesting thing is they are not resisting now, they are actually calling for regulation. So I think it's pretty clear that we should regulate them when they themselves say they should be regulated.  

Julianne Schultz: So, Joëlle, I'm interested in whether there are lessons from the climate space in regard to this because, you know, clearly there have been regulations that have not worked. How do you sort of square that? 

Joëlle Gergis: Yeah, well, I guess I mean, the climate change issue is really an example of failed regulation. I mean, we know the causes of a warming planet, and yet we continue to pump industrial quantities of carbon into the atmosphere that have basically destabilised the planetary system. And so we find ourselves in a situation where we actually know the cause of the problem, and we know what we need to do. But, you know, we haven't really done that. So I mean, I'm listening with great interest to this conversation. And I must confess it does scare me a little bit, because we have real-world problems that we haven't been able to face. And the prospect of not being able to distinguish truth from fiction with artificial intelligence really worries me as a scientist because, I mean, our whole discipline is really built on being able to replicate and to have veracity in the integrity of the data and the interpretation of that data. And it does worry me a little bit, I must say, but I'm so immersed in my field that I am only really just starting to dip my toe into thinking about what a world of artificial intelligence might mean for a very troubled time and a very fractured society. 

Toby Walsh: Can I ask a particular question? 
 
Julianne Schultz: Sure.  

Toby Walsh: Part of the problem seems to be Big Oil capturing the debate. So what should we have learnt from trying to control Big Oil about regulating Big Tech? 

Joëlle Gergis: Well look, just on the whole fossil fuel lobbyist issue – at COP26, the big UN summit in Glasgow a few years ago, basically, the largest delegation were fossil fuel lobbyists. And they outnumbered the Pacific Island delegates 12 to one. I remember when I absorbed that piece of information, I had just finished working on the Intergovernmental Panel on Climate Change's Sixth Assessment Report, which António Guterres had said was basically a code red for humanity. And yet, here we are, in a situation where there are vested interests that really have a stranglehold on the political debate, which obviously influences the ability to regulate. So I mean, there are long and deeply entrenched positions out there that are really hard to upend. 

Norman Swan: But I mean, the thing is that we get in a car, or an electric car, as an assist device to get us from A to B much faster than we otherwise would on a bus or a train. So we have this assistive technology. We have our laptops and our phones, which are assistive technology to help with my dementia when I can't remember a word and I go into my phone, and so we have these assist devices. My question to you as a scientist, or, I think, just more generally, is: are we being paralysed by the fear and not using AI as the assist device? Because when you're dealing with biological complexity, it's enormous, and you need these supercomputers. Could AI enhance the power of the individual scientist to actually solve intractable problems? 

Joëlle Gergis: 100%, I do see a lot of potential in that space. And I think what you said earlier is right – there is a resistance, I guess, when this new technology comes about, and even someone like me, I do feel a little bit sceptical, and I need to know more, I need to read Toby's book and really think deeply about it. But I do feel like it's probably a polarised perspective of 'it's all bad' or 'it's going to save the world'. I think there's probably a nuanced position in the middle. And I guess I'm thinking about that. 

Julianne Schultz: One of the things that I'm interested in, just as I hear you talk, is that one of the big mechanisms for discrediting climate science, especially in the last decade, has been to deny its veracity. It's been to say, 'this is not true.' And we've all been subjected to that in various forms. I'm interested in how, when you add a layer of AI into that, it becomes even more complex than competing journal articles or competing viewpoints – when, Toby, as you say, you can't be sure. 

Toby Walsh: No, not only won't we be sure, I mean, we run the risk of being drowned in a sea of robotic voices, and human voices will be too few and far between to be heard. 

Julianne Schultz: So, just bring in the pandemic, which, you know, I know nobody likes. 
 
Norman Swan: It’s all over, forget it.  

Julianne Schultz: When we're doing catastrophes, we've got to get pandemics in there. I mean, Norman, how do you put this sort of pandemic – the anticipation of, and the dealing with, pandemics – into this sort of conversation? 

Norman Swan: Well, I think you have to introduce politics into this conversation, because that's been the undertone of what we've been talking about for the last few minutes. Throughout history, the least important part of a pandemic has been the bug. Yes, you need a bug which transmits easily, spreads through the community, and to which we don't have natural immunity. Yep, got to have that. But it's politics that cause pandemics. It's humans that cause pandemics. 

It's actually eerily similar to both Toby's area and Joëlle's area. In Toby's area, they've been talking about the potential of AI for many, many years, possibly 100 years in science fiction. And there was no surprise that we had a coronavirus pandemic. It was expected – they war-gamed a coronavirus pandemic not long before the pandemic hit. We've been talking for years about what's needed in terms of international surveillance and controls, international cooperation. But what happened when this burst onto the scene?  

If it had burst onto the scene 10 years beforehand – a counterfactual, I know – I don't think it would have turned into the pandemic it was, because 10 years beforehand SARS-CoV-1 was fresh in the memory of the Chinese, and Xi was not in the powerful position he is in now, creating fear. You did not have a world run by weak authoritarian men where every country was there for itself, closing international borders – countries circling their wagons rather than cooperating and putting early intervention in place.  

And by the time they woke up to it, it had gone. The outbreak in New York, for example, was caused by a three-week delay by the Governor of New York. You know, we responded quite quickly, but it was a close call. We shut the borders to China, but only to China, and then there was that notorious week in March where the government was saying to people, “Oh, you've got to separate. You've got to use social distancing. You've got to be careful”. There was a basketball match in Western Australia with 14,000 people, there was the Grand Prix with 250,000 people, probably with drivers who had COVID, from Monza. And you had the notorious Sharks’ game that the then Prime Minister tripled down on, that he was proudly going to attend.  

We came very close, and there were active conversations in government in Canberra, with the naysayers saying, “Don't do this. This is encroaching on people's civil rights. It's going to be a disaster for the economy”. And we came very close to not doing it. 

Julianne Schultz: So when you say politics, what do you mean? Are you talking about political interests, or are you talking about poor leadership?  

Norman Swan: It's a combination of all those things and vested interests sitting behind it. In Australia, you had News Limited, which has just been consistently against public health measures, both here and internationally. Luckily, we ended up not listening to them. Boris Johnson did and in Britain, they decided they were just going to let it rip and get natural immunity. They were just sold a pup on that. So after six months and 45,000 deaths, the level of natural immunity in the population was six percent. And so it's vested interest, it's the media, and it is fearful leadership not wanting to stand up and take positions.  

Toby Walsh: Norman, how do we get politics to look longer ahead? Because I mean, it seems to me, we're locked in a cycle where the attention cycle of politicians just gets shorter and shorter and shorter, and they're not prepared to make the long-term decisions that go beyond the next election, to the planet that our children are going to have. 

Julianne Schultz: Or even, Toby, in this case, where that planning had been happening. But it stops, you know, so how do you join it up? 

Norman Swan: The Americans wrote the textbook on how to deal with a pandemic, and then they threw it out because you had Trump in power. So I think that's a key conversation for us all, which is what Joëlle is dealing with every day: why does it take a catastrophe for us to act, when we don't perceive a slow-moving catastrophe? And I think Danny Kahneman has got a lot to say about that – it's the way our brains work, in part – we get fired up by the rare and dramatic rather than the steady and [unintelligible]. And we've got a classic example right now with the pandemic: it's not over.  

So every day we were getting the statistics and we saw those graphs going way up. I was doing them on 7.30, you know, these terrible graphs going way up, you've got a new variant coming out, etcetera, etcetera. But now we haven't got the big spike; we've got slow waves, and we're in the middle of one now. There's something in maths called the area under the curve. You can have a big spike like that and not that many people are affected by it, whereas the slow one, which goes on forever, has got a big area under the curve, and a lot of people get infected, and a lot of people die. And because there's not that big spike, we are not fearful of what we should be fearful of. 1,300 people in New South Wales right now are in hospital – that is 1.3, 1.4 times a very large teaching hospital – at a time when people need to be in hospital for cancer, for heart care. 1.4 teaching hospitals are blocked. But we don't – I think it's human psychology… 
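[Editor's note: to make Norman's area-under-the-curve point concrete, here is a minimal illustrative sketch. It is not part of the talk, and the curves and numbers are invented purely for illustration: it compares a short, sharp spike in daily infections with a long, low wave, and shows that the slower wave can accumulate a larger total, because total infections are the area under the daily-infections curve.]

```python
import numpy as np

# Hypothetical daily-infection curves over six months (illustrative only, not real data)
days = np.arange(0, 181)

# A sharp spike: very high peak, but over within a few weeks
spike = 20_000 * np.exp(-((days - 20) ** 2) / (2 * 7 ** 2))

# A slow wave: peak five times lower, but it drags on for months
slow_wave = 4_000 * np.exp(-((days - 90) ** 2) / (2 * 60 ** 2))

# Total infections = area under each curve (daily counts summed over the period)
print(f"Sharp spike total infections: {spike.sum():,.0f}")
print(f"Slow wave total infections:   {slow_wave.sum():,.0f}")
```

Run as written, the script prints a larger total for the slow wave even though its peak is five times lower than the spike's – which is exactly the point about slow waves quietly filling hospitals.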

Toby Walsh: It's that evolution has poorly equipped us, correct? 

Norman Swan: That's absolutely right. 

Julianne Schultz: But we talked before about regulation. I mean, regulation should snap in. I mean, we've got this sort of difference between the regulatory framework that people are talking about in all of these areas, hopefully preventing something, but then we're saying that that is vulnerable to an individual bad leader. I mean, what does that say about the whole robustness of the structure in which we're… 

Toby Walsh: I think we've seen a couple of good examples of individual bad leaders. I mean, I'm spoilt for choice. Which one should we talk about? 

Julianne Schultz: But would you expect that individual bad leaders would be able to overturn these complex systems of regulation and, you know, the whole architecture that goes into building a society – that they are so vulnerable to one small group of bad actors? 

Norman Swan: But in a democracy, a bad leader has a hinterland. Look at the thousands that turned out for Modi in Australia, who is a fairly authoritarian leader in the Indian context. He's got a hinterland of people who buy into it, and you've got Trump, who's taken over the Republicans – you know, it's not Trump alone. It's Trump with a hinterland. So they buy into that and feed our worst selves. That's what the bad leader does. 

Julianne Schultz: So, in terms of the mounting catastrophes, how do we intervene at that level? Because if it's not sufficient to do it at the level of the leader and the regulation, I mean, how do you switch this? 

Toby Walsh: But are we saying we need to elect good leaders?  

Julianne Schultz: Pardon? 

Toby Walsh: Oh, we are saying we need to elect better leaders. 

Julianne Schultz: That's obvious, but if you take Norman's point, you've got to, then… 
 
Toby Walsh: But then the question he raises is: why is our politics, and politics in the US, politics in the UK, politics in lots of countries, so broken that we're electing such terrible leaders?  

Norman Swan: What are you saying? Get rid of democracy, you know, have a benevolent dictator?  

Toby Walsh: No, I think our democracy is a very flawed system. But it's the best form of government that you can have. 

Julianne Schultz: Joëlle, you've been watching this. In terms of the climate stuff, on the one hand, there is this extraordinary scientific effort that's been going on now for several decades. And in some societies, in some states, there has been leadership which has grappled with that, which has embraced it and taken it seriously. But in others, that's not been the case. And the capacity to not take people with you – you know, I think Julian Leeser said at one point, before he dropped out of the shadow ministry for Indigenous Australians that he had, he said, "I've not been persuasive. I've not been persuasive." So what is the flaw in the persuasiveness? 

Joëlle Gergis: Yeah, it's a great question. And I think, just listening to what Norman was saying – the climate science community has really tried to, and it's a terrible way to think about it, but to make the most of these catastrophes that have played out.  

So, you know, the catastrophic flooding of the east coast or the Black Summer bushfires – we really try and use that as a public awareness moment, where it is so clear to everybody that the world is changing really rapidly, it's changing in nonlinear ways that are shocking the scientific community, and it's having really long-lasting and very, very destructive impacts on our communities, on our ecosystems, on our economy. And so, scientists like myself get up and we try and help people join the dots between what they're seeing out their window and the realisation that the world is changing. But then you see the inadequate response – for instance, with the flooding of the east coast, where the emergency services were just inadequate. We literally saw people in northern New South Wales rescuing their neighbours from the rooftops in dinghies, in kayaks, and that sort of thing. The community rose up and basically took matters into their own hands, but it shows the institutional failure that's there, which is effectively a refusal to accept that the planet is warming.  

And these are the sorts of extremes that are going to continue to play out. It's a preview of what's to come. And this is what we're seeing with 1.2 degrees of global warming, which is where we are right now. So what do you get with 1.5, with two, with three, with four degrees? To be honest, I don't really want to find out. But that's why the IPCC does all of this modelling out to the end of the century, with a range of different scenarios based on different emission pathways. And I guess it's one of those things where we still have an opportunity to decide which path we're going to go down – that is one of the key messages from the IPCC report: how bad we let things get is still very much in our hands. And as a really well-resourced, industrialised country, we are still grappling with the basics of, firstly, climate mitigation, in terms of regulating the amount of carbon we are pouring into the atmosphere and the restoration of landscapes, but also climate change adaptation, because there's just an inevitable amount of warming that's already locked in.  

And beyond a certain level of warming, it becomes not possible to adapt. And so I guess, as a scientist, I fear for the future if we do not get the urgency of this. In the last session, which I spoke at just earlier today with Saul Griffith, he was basically saying, “You know, your book is really good in terms of getting the urgency out there. But it's still almost not enough”. But then, people like me – what more can someone like me actually say? And this is where it moves into the political realm.  

Norman Swan: So, one of the dampeners – now I'm not speaking from the point of view of the media – one of the dampeners is that Western countries are full of second-rate contrarians who make a living from this, and elements of the media are influenced by the peddlers of it. And the second-rate contrarians have the ears of politicians. In this country, because of media ownership, there are whole swathes of the country where the second-rate contrarians have a free kick and influence those areas, and you're seeing that with the Voice, and you see that with climate change, and you see that still with the pandemic, and they sit there on Sky and in the papers that we know about.  

And they are second-raters, they don't understand the evidence, but they do understand how to switch people on. I mean, one of the most interesting 90 minutes of my life was a treat from my son – I get a treat every time I go to Washington; my son is a journalist – and one treat of many, several years ago, was going to Steve Bannon's house for coffee. So I had 90 minutes with Steve Bannon, who was just out of power – and the only reason it was amusing is that he was just out of power. 

Toby Walsh: So it was just before he was arrested and charged.  

Norman Swan: Yeah, all that, and you could just see this sort of mad thinking that goes on – if you don't actually understand that what he's saying is complete bullshit, it sounds very, very appealing. And, of course, the people in Australia who are second-rate contrarians aren't very different from Bannon. Bannon is actually an anarchist, and they're not anarchists. They're very much pro-establishment. But they're creating anarchy without realising it. Bannon is deliberately creating anarchy. The second-rate contrarians in this country aren't creating anarchy deliberately – they don't know what they're doing, really. 

Toby Walsh: But to pick on the media now, isn't the media partly responsible? Because this idea that you have to have two sides to the argument, when for something like climate change, there are not two sides to… 

Norman Swan: Well, the ABC went through a very rough phase with a very conservative board, who slammed the ABC editorially – and Julianne knows this well – for its climate change coverage, which was scientific, it was accurate. Robyn Williams was really slammed for it. It was appalling and outrageous, and when you get pummelled, you start looking over your shoulder. And for the public broadcaster, it's very hard now, in the current environment, to be brave. 

Julianne Schultz: And I was on the board of the ABC at that time, and the chairman was absolutely a climate change denier. He would get up early in the morning and read his blogs and he'd read his notes from, you know, your secondary contrarians and would relay them. And he would bring them into the boardroom to try and ensure that that view was one that prevailed, and it was quite difficult for those of us on the board who didn't share his view.  

And I mean, at one point – telling tales out of school – but it's a long time ago. Now, at one point, it was so sort of intractable because the pressure was going down directly from the chair, not from the board. The board didn't share the view, to reach into editorially, and at one point, I said, "I don't know what to do." And so, I sought a briefing from Martin Parkinson, who was at that stage the head of the Ministry for Climate Change, or Department of Climate Change, or whatever, before he got sacked from that when governments changed. And I said, "Look, I need some tactics on how to deal with this." And the best that I could come up with, which worked for a minute and a half, was to say to the chair, "Look, you’ve been involved in some really big fundamental changes in Australia. I mean, the whole neoliberal project has been one that you’ve been close to, and you had your day, and that’s had its fruit. It’s time to let somebody else have a go.” And it sort of gave him pause for a moment, only a moment, but it was sort of interesting.  

I mean, and I think this is what we're talking to, in a sense: the level of emotion that you have to bring in to try and counter those sorts of arguments is actually something which people who are working in a policy space or in a science space, or, you know, a very rational, brain-driven area, find quite difficult – to hang things onto those levers. But in a sense, essentially, it's about getting that emotional content in, that connects. 

Toby Walsh: I want to inject a little piece of optimism here, because I know you'll all be getting rather depressed. Amongst all the gloom and doom and sadness of the pandemic, there were a few bright lights, and one of them, I thought, was that we had epidemiologists on the team. We had Norman on the TV, we had politicians actually, in many cases, saying they were stepping back, letting the scientists, letting the epidemiologists say what to do. I thought that was such a welcome breath of fresh air. Unfortunately, it stopped. 

Norman Swan: Not so much that it stopped, it's that they haven't learned the lesson.  

Toby Walsh: Yes.  

Norman Swan: And it looked as if they were telling the truth when they weren't. That's the point. It looked transparent. So, for example, it's clear this is spread by air and aerosol.  

Toby Walsh: Yes. 

Norman Swan: And well, Victoria is doing better than other states, but we are not regulating indoor air. We regulate outdoor air, but not indoor air. We're still building hospitals without adequate ventilation built into them. There was a hospital in Victoria where they put patients in a single room, which is what's supposed to be done, with negative pressure so the air is being sucked out. They didn't realise that when the wind changed, it became positive pressure and vented into the corridor, and infected nurses several metres away. So we haven't learned, because the so-called scientists – who are still there, by the way, advising the Government – did not believe that it was aerosol spread. They were stuck in 1945. 

Toby Walsh: We're back to this bit of short-termism as well. It would make sense to improve the quality of our air, just like it made sense to improve the quality of our water, which improved the health of the country greatly. But that's going to cost money, and that's a long-term investment for something that politicians are not prepared to make. 

Norman Swan: I mean, the money we've spent so far compared to that, and there is going to be another respiratory pandemic sometime – flu or another… 

Toby Walsh: But a politician looking at that and saying, "That's someone else's problem”. 

Julianne Schultz: Norman, you mentioned before about catastrophe promoting change, and sometimes that happens, sometimes it doesn't. I mean, the great example is the nuclear stuff at the end of the Second World War – mind you, they kept on testing and doing stuff with nuclear devices for a long time before the various treaties were implemented. I wonder whether there is something now in the pandemic that was similarly observed globally, in the same way as the end of that war was observed globally. Whereas climate change tends to be something we're familiar with in the immediate environment, you know, 'my area flooded, yours burnt', whatever – and I just wonder whether there's something in the pandemic moment which gives us that global heft. I mean, I would draw a connection between pandemics and climate change, for instance; they're very, very closely related. I wonder whether there's something in that global catastrophe that gives this a new energy?  

Norman Swan: I'm not seeing it, unfortunately. I think that people are relieved, they think it's over. Politicians want it to be over and we're back to business as usual. I just don't see that happening, because of the moment of the pandemic and who was in power and this authoritarianism that was there, which is a bit less prevalent now. I think we lost that moment, unfortunately, so gradually it's back to pleading. 

Julianne Schultz: Joëlle, do you think there's anything that comes from this sort of pandemic moment? With the IPCC, in this final report, there was a very clear warning being expressed, and it came against the backdrop of the pandemic… 

Joëlle Gergis: That's right. I think it was a moment where we realised just how interconnected our whole society is and I think, for me, as a scientist working at that UN level, you get an opportunity to really delve into what is happening everywhere. And it's a shocking thing for me. It was an incredible professional experience, but it completely reconfigured my worldview and I guess it was one of those moments where you just realise how interconnected everything is. I think when the Coronavirus pandemic really hit, and you're watching the news and it's just a cascade around the world, it's a very clear reminder of just how our planet is really connected. 

Norman Swan: I mean the contrarians who say, “Well, you know, it came out of a lab". Well, this is designed to actually get us going because, like a rare event, dramatic… 

Julianne Schultz: Surreal.  

Norman Swan: That’s right and it feeds into the China prejudice and all that stuff but to admit that it comes from the environment is to admit climate change.  

Joëlle Gergis: Exactly.  

Norman Swan: Too many people on the planet are encroaching on new environments and we’re in an environment where the whole evolution of viruses, microbiology, microbial organisms is changing. 

Julianne Schultz: One of the things that I'm interested in – all of your different perspectives on this – and troubled by, is this process of change and how big change happens. Now, your comment before, Norman, on catastrophe triggers – it's true, but that's the exception; you know, there have rarely been catastrophes of sufficient moment to rewrite whole global orders, and it was something I was really grappling with when I was writing my book, The Idea of Australia.  
 
I was trying to figure out why some things happen quickly, but most things didn't. I ended up drawing on a British-American historian called Linda Colley, whose argument is that basically significant changes take three-score years and 10 – that is a lifetime. Some of them, like we're seeing with the recognition and Voice debate at the moment, we're talking about 11 score years and three, and we're still not anywhere near there. So I'm interested in that process that drives change and makes it real. I mean, Joëlle, you've written a bit about this in relation to some of the studies that have been done in terms of how people are trying to accommodate the reality of climate change. 

Joëlle Gergis: Yeah, I mean, a whole discipline of science known as detection and attribution has come about to try and help people join the dots between what is playing out – what we're seeing – and its connection to the burning of fossil fuels. So as scientists, we're doing everything we possibly can to help people understand those connections, but I think it comes back down to human psychology.  

Julianne Schultz: The German study that you quote in your book, where it talks about, you know, you get to about 25%.  

Joëlle Gergis: Yeah, that's true. After delving into a lot of scientific material, I started to read a lot of history, and I read about social movements, and basically I came across a study that shows that it only takes about 25% of a population to shift a social norm. Once that happens, the rest of the population starts to go with it, and so it makes you realise that it's about that critical mass. Sometimes those social tipping points happen, as we saw with the federal election last year, where we saw a progressive element come in, which has been a breath of fresh air. And I think it's been really inspiring for a lot of us to see that our political leaders can start to reflect our shared values, but we still have a long way to go. And obviously it's an imperfect victory, I suppose, but it's certainly a step in the right direction, but that's, I guess… 

Norman Swan: It's communities that change, it's communities that control pandemics, and it's communities that will control this too – and we are beyond that 25% with climate change, it's just that the politicians haven't listened to the community.  

So you've got people working in land care. They're working in the local communities, cleaning up their environment, recycling, trying to use less energy, so the communities are there. The politicians haven't quite appreciated that. I mean, the interesting thing is, how do you harness artificial intelligence for this to enhance the power of the community, so the power of good that could come from this assisted device in some ways? 

Toby Walsh: I'll just add that science itself is not perfect. There's lots of inertia in science, and scientists can be very conservative – there's that wonderful saying that science advances one obituary at a time. 

Julianne Schultz: Yes, well, that's true. I mean, that's the other part of this thing, isn't it, that science is contested. I mean, that's the whole point of it, and so to expect a sort of scientific truth to be handed down – I mean, there aren't many that last through the annals of time. 

Norman Swan: But you've got an interesting moment in AI – maybe you disagree with me, Toby – which is that at the moment we're not seeing it commercialised. I mean, there's this whole tension with OpenAI about this being for the common good, in the commons. The investors are making them create a commercial entity, but at the moment, there is quite a lot in the commons for the community to use, is there not?  

Toby Walsh: There is, but the door is closing very rapidly. I mean, OpenAI is no longer open. When it started, it was a not-for-profit corporation. Everything they did was published openly, and the datasets they trained on were public knowledge. And in the last while – this is the worrying trend, I had a conversation a few weeks ago about this – because of the commercial pressures, because we're witnessing a trillion-dollar industry being created before our eyes, people are becoming more closed off. We no longer know what data is being trained on, and we no longer know how many parameters are in the latest algorithm. We are witnessing commercial capture. But you're right, until recently I knew everything that was going on. I knew I could build a ChatGPT, and any decent AI researcher on the planet could go off and recreate what had been done; it was a very open, collaborative, global enterprise. And now you're seeing huge sums of money, billions of dollars, being invested, trillions of dollars of value being created in front of our eyes, and that openness is going to disappear. 

Julianne Schultz: And it's intruding into every commercial enterprise. It's not as though it sits over there in some special ChatGPT box. I mean, the tools of AI are there in every commercial activity that you deal with every day. 

Toby Walsh: It's going to be as pervasive as electricity. I mean, it’s everywhere.  

Julianne Schultz: I wonder, in terms of the change discussion – I'm not trying to be excessively negative about it, but I am interested in whether so much of this AI functionality will get built into everything, and you've referred to the danger of entrenching all the old ways of doing things. In a way, we run the risk of AI becoming like the epigenetic stuff of the whole society. Does what gets built into all those thousands of little interactions that we have with AI-enabled activities now make that level of change so much more difficult? 

Toby Walsh: Yes, I mean, it's a mistake to think there's going to be one big AI, one AI that's going to be this godlike authority with ultimate intelligence. Artificial intelligence is going to be like human intelligence. There are eight billion intelligences on the planet; intelligence is well distributed, and AI is going to be like that, very diffused, and therefore much harder to be in charge of, in control of, and to understand. 

Julianne Schultz: So as we wrap this up, in terms of talking about the role of the individual, the individual scientists, the individual AI developers, the doctors, there is the possibility of having big regulations, professional standards, and so on. But how much of it falls back onto the individual to ensure that the right thing gets done?  

Joëlle, you’ve written very movingly about your struggle, your battle with this – maybe you can share some of that with the audience. 

Joëlle Gergis: Yeah, I guess it becomes a bit of a professional dilemma. Because I feel that my role at the moment is not only to contribute to the actual research in terms of furthering my discipline, but it's also about communicating to the public. It has become an increasingly large part of what I do because unless people truly understand what's important and the urgency behind it, then we won't see the response that we need. 

And I think, if people like me don't step in, then the void will get filled by these second-rate people who just really don't have the qualifications, but they might have a louder voice. But also, to be a commentator in this space is difficult. Ten years ago, I'd speak out in public and I'd get all sorts of freedom of information requests and harassment and the whole thing, which made me really reconsider whether or not that was a safe thing for someone like me to do. But increasingly, I just feel like the stakes are so high that if people like me don't speak out – even though I'm really introverted, and that's why I write books, and I'm not really that comfortable on a stage, to be honest – at least I'm trying to do something. But it's ultimately up to policy. I'm hoping that people like me are helping inform the public conversation. 

Toby Walsh: Can I ask you a question? How do you square this idea that as a scientist, you're just trying to find what's true, you're just trying to understand the universe, but when you step into these public conversations, inevitably it turns to questions of politics, questions of values? I mean, there are going to be harms done, the planet is going to warm, various people are going to pay the price, and we have to make choices as to who those people are, and who the people are that are going to benefit. Those ultimately become political conversations, and as a scientist, I find myself really uncomfortable that I'm now in a political conversation, no longer a scientific one. 

Joëlle Gergis: Me too. It's a really difficult thing to grapple with because it's one of those things that people will say, "Well, you know, is it the job of a scientist to be an advocate, and to engage in advocacy?”. But, you know, as was raised earlier, I think if we don't step in, then other people will fill the void and I think we don't have the time for that.  

Norman Swan: Well, let me tell you about a conversation I had during the pandemic, because I was asked to give a lot of talks to medical students and so on, and one of my favourite talks I give to medical students at the beginning of their training is: never forget that medicine is a social science. That was actually preached by a pathologist and doctor in the 19th century called Rudolf Virchow, who was actually a radical, a political radical – “Never forget that it's a social science”. 
 
And I had the head of a major institute come clattering down on me, because this person – I won't tell you their agenda – was absolutely disagreeing with me: “This is a hard science, it's a biological science”. But it is a social science, because social factors affect it too. So coming back to this role of the individual – it's complicated, because we get fired up by individual stories. 

So we can identify with a person who might have suffered at the hands of fraud or of AI, or somebody whose farm is drowned by a flood. We've got to be able to connect the dots, but in terms of the power of the individual to change things, that's much less. We have to gather in communities, get to know the people around us and our shared interests, and act at the ground level, so the politicians realise that they've got no choice here – this is the way we're going.  

Julianne Schultz: But there is an interesting lesson from medicine, because doctors are bound to both do good and do no harm, and I'm interested in whether – when we talk about AI – not the high-level sort of stuff which is only available to the absolute boffins involved in it, but the places where AI is being applied in enterprises every day of the week – there's not a similar sort of code of professional practice that needs to apply. We saw during the Robodebt evidence that there were scientists saying this is doing harm, but that had no sway in the end.  

Toby Walsh: Yeah, but Robodebt is a wonderful example. We need a Code of Practice for CEOs, parliament and civil servants.  

Julianne Schultz: But as in the case of doctors, I mean there may not be a Code of Practice for… 

Norman Swan: You had doctors in Auschwitz, you had psychiatrists in the Soviet Union locking up people for political reasons. You know, it does help, it does put a brake on it, but there's not a guarantee. 

Julianne Schultz: I think, Joëlle, you didn't talk precisely to the depths of depression that this work threw you into as you were dealing with the IPCC stuff. But what I would like to just end on is: have you found hope?  

Joëlle Gergis: Yeah, well, I guess it's what Norman was just saying about collective action. As I was finishing writing the book, we had our federal election, and that social tipping point played out – I mean, I wept watching the election results coming through, and it made me realise that even in a fossil-fuel-obsessed country like Australia, we can bring about the change that we want. And I just had a session earlier in the day with Simon Holmes à Court, who helped with the Climate 200 programme, which brought a lot of different people into the political arena, and that gives me hope, because I think that's how the system ends up changing.  

And the fact that it could happen in a country like Australia means that I think it can happen anywhere. I mean, we're the largest exporter of coal and liquefied natural gas on the planet, so what we do really makes a difference in terms of our stance here – ethically, whether we continue to exploit fossil fuels or whether we decide that actually, we've had enough: we've lost 50% of the Great Barrier Reef, the koala is now an endangered species on the east coast of this country, and so on and so on.  

We've had enough and we want better. And I think that really, for me, it was a moment of hope, and I think that we're still riding that wave, and it's been a changing of the guard. I'm not going to say that it's a done deal, but none of these huge transformative moments in human history are ever just one shot. It's going to take a lot of time. But I guess, as a younger person, I have to have faith that we can do this, and I do have faith that there is the community support and that eventually the political will will come about. 

Julianne Schultz: A very fine note to end on. Thank you. 

UNSW Centre for Ideas: Thanks for listening. This event was presented by the UNSW Centre for Ideas and Sydney Writers’ Festival. For more information, visit centreforideas.com and don't forget to subscribe wherever you get your podcasts. 

Speakers

Joëlle Gergis

Dr Joëlle Gergis is an award-winning climate scientist and writer at the Australian National University. She served as a lead author for the United Nations' IPCC Sixth Assessment Report and is the author of Sunburnt Country: The History and Future of Climate Change in Australia. Joëlle has also contributed chapters to Greta Thunberg's The Climate Book, and Not Too Late edited by Rebecca Solnit and Thelma Young Lutunatabua. Her latest book is Humanity's Moment: A Climate Scientist's Case for Hope. 

Norman Swan

Norman hosts ABC RN's Health Report and co-hosts Coronacast, a podcast on the coronavirus. Norman is a reporter and commentator on the ABC's 7.30, Midday, News Breakfast and Four Corners and a guest host on RN Breakfast. He is a past Gold Walkley winner and has won other Walkleys including one in 2020. Norman has been awarded an AM, the medal of the Australian Academy of Science, a Fellowship of the Academy of Health and Medical Sciences and an honorary MD from the University of Sydney. His book, So You Think You Know What's Good For You? was a bestseller and his latest book So You Want To Live Younger Longer? has also been on the bestseller list. Norman trained in medicine and paediatrics in Aberdeen, London and Sydney before joining the ABC. 

Toby Walsh

Toby Walsh is Chief Scientist of UNSW.AI, UNSW Sydney’s new AI Institute. He is a strong advocate for limits to ensure AI is used to improve our lives, having spoken at the UN and to heads of state, parliamentary bodies, company boards and many others on this topic. This advocacy has led to him being "banned indefinitely" from Russia. He is a Fellow of the Australian Academy of Science and was named on the international "Who's Who in AI" list of influencers. He has written four books on AI for a general audience, the most recent of which is Faking It! Artificial Intelligence in a Human World.

Julianne Schultz

Julianne Schultz AM FAHA is the author of The Idea of Australia: A search for the soul of the nation. She is an Emeritus Professor of Media and Culture at Griffith University, where she was the founding editor of Griffith Review.
