This week: our 300th episode. We’re joined by Professor Genevieve Bell to settle once and for all – if data is not the new oil, then what is it?

Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Futures Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.

Our guest this week

Distinguished Professor Genevieve Bell

Genevieve’s Twitter, @feraldata

Healthcare data is the new oil, water data is the new oil, connected car data is the new oil, marketing data is the new oil, analytics is the new oil, privacy is the new ‘new oil’, high-quality data is the new oil

Our previous discussion on how data is not the new oil on The Future, This Week

Our previous discussion on how cobalt is also not the new oil on The Future, This Week

Harmeet Sawhney’s 1996 paper on metaphors as midwives

Kevin Kelly’s 2010 book, What Technology Wants

Genevieve’s talk on the overland telegraph line

Nielsen TV ratings

Recognising appliances using smart meter data

Mad Max Beyond Thunderdome – two men enter, one man leaves

Gartner Hype Cycle

Our previous discussions on the hype cycle on The Future, This Week in 2017, 2018 and 2021

Facebook engineers say they don’t know what the company does with user data

Genevieve’s TED talk on the 6 big ethical questions about the future of AI


Follow the show on Apple Podcasts, Spotify, Overcast, Google Podcasts, Pocket Casts or wherever you get your podcasts. You can follow Sydney Business Insights on Flipboard, LinkedIn, Twitter and WeChat to keep updated with our latest insights.

Send us your news ideas to sbi@sydney.edu.au.

Music by Cinephonix.

Image: generated by DALL-E 2

Dr Sandra Peter is the Director of Sydney Executive Plus at the University of Sydney Business School. Her research and practice focus on engaging with the future in productive ways, and on the impact of emerging technologies on business and society.

Kai Riemer is Professor of Information Technology and Organisation, and Director of Sydney Executive Plus at the University of Sydney Business School. Kai's research interest is in Disruptive Technologies, Enterprise Social Media, Virtual Work, Collaborative Technologies and the Philosophy of Technology.

Genevieve is Distinguished Professor and Director of the School of Cybernetics at the Australian National University and Vice President and Senior Fellow at Intel. She is interested in new technology and how we respond to it, particularly how personal computers, mobile phones and the internet have reshaped our lives.

Disclaimer We'd like to advise that the following program may contain real news, occasional philosophy, and ideas that may offend some listeners.

Kai So Sandra, what is data?

Sandra Apparently it's the new oil.

Kai Still the new oil?

Sandra Yeah, we had this conversation, like, four years ago.

Kai Doesn't seem to go away. Also what kind of oil?

Sandra Crude oil?

Kai Olive oil?

Sandra Snake oil?

Kai I think we need help.

Sandra I think we need the big guns.

Kai One of our favourite people.

Sandra Professor Genevieve Bell.

Intro From The University of Sydney Business School, this is Sydney Business Insights, an initiative that explores the future of business. And you're listening to The Future, This Week, where Sandra Peter and Kai Riemer sit down every week to rethink trends in technology and business.

Genevieve Bell I can tell you my name, it's Genevieve Bell. My title is an ever-increasing list of randomness.

Sandra Genevieve is one of those mythical creatures that inhabit multiple universes: academia, where she's a brilliant anthropologist on a mission to build a new branch of engineering.

Genevieve Bell I'm the interim Dean for the College of Engineering and Computer Science, soon to be named the College of Engineering, Computing and Cybernetics. I'm the Director of the School of Cybernetics.

Sandra Business, she's been at Intel.

Genevieve Bell I'm a Vice President and Senior Fellow at Intel Corporation.

Sandra She's basically been their chief strategist and chief futurist.

Genevieve Bell I'm a non-Executive Director of the Commonwealth Bank. I'm the inaugural Doug Engelbart fellow.

Sandra And government.

Genevieve Bell Oh yeah, and I'm on the Prime Minister's Science and Tech Council. Also a member of two learned academies. And a post-nominal. Mostly I just go by Genevieve, because that's all a bit long.

Sandra I think more fitting for our conversation today is her Twitter handle.

Kai @feraldata, like, feral data. That is so cool.

Sandra So she should know a thing or two about data.

Kai Let's ask her.

Sandra I almost feel bad that we're just talking about data again. But there is this resurgence of the 'data is the new oil' argument in various flavours, right? High-quality data is the new oil, healthcare data is the new oil, analytics is the new oil, plenty more where that came from. But there's this recognition that data is not enough. There is a call for better data, more data, cleaner data, better analytics, but the argument still seems to hold: data is the new oil. And we had it five, six years ago, and we had a chat on the podcast around 2018, 2019, which seemed to be the height of 'data is not the new oil'. But it seems to be back like a bad rash. And we're hoping to unpack that a bit. Why this simplistic narrative in the first place? And what are better ways to think about data?

Genevieve Bell Gosh, it's such a good question. I mean, I'm always really struck by the ways that people want to make sense of technological phenomena by mapping them onto something else, right, of kind of trying to give it something that we grasp. And there's plenty of other examples, right? So computer vision, machine learning, artificial intelligence, the information superhighway. On the one hand, all of those are descriptive of particular technological phenomena. On the other hand, they attempt to make it manageable, comprehensible. They give it a kind of, well, a flavour if you want. There's a lovely paper that got written, gosh, nearly 30 years ago, by a man named Harmeet Sawhney, who is these days a Professor at Indiana University in Bloomington. And he wrote a paper, the title of which I don't remember, but the subtitle of which is hugely important, because he says that "metaphors are midwives". And his argument in this paper is that the language we choose to frame up emerging technical phenomena then shapes our imagination for how those phenomena in turn unfold. So it becomes, in some ways, a kind of self-reinforcing prophecy, right? And he takes the information superhighway language, which is obviously dominant in the 1990s, and says, in describing the internet as an information superhighway, we immediately privileged speed. We wanted to talk about the throughput. We became really interested in how things moved from place to place, and a whole lot of things became invisible in calling it a superhighway, right? We didn't really have to think about things that weren't on the highway. We didn't have to think necessarily about safety, because that's not often a conversation you have about highways; we didn't really think about construction or about the materiality of it. It became this kind of story about speed. And of course, those of us who've grown up around superhighways, I'm looking at you Kai.

Kai Are loud.

Genevieve Bell Actually, you know, we might talk about safety, we talk about rules and regulations, superhighways are part of dominant narratives about nation building and reconstruction, among other things.

Kai And you looked at me because I'm German. The Autobahn had a particular...

Genevieve Bell I was thinking of the Autobahn as a place that is seen as both a superhighway but one where regulation would necessarily be a part of the conversation, but in America, not so much. And so I think, you know, Sawhney's argument...

Kai There's still no speed limit in many parts of Germany on the Autobahn so, you know.

Genevieve Bell But there's also an incredible set of pre-existing regulations that prescribe how you get a driver's licence.

Kai That is true.

Genevieve Bell An expectation about how to be a safe navigator of it, and a set of rules about vehicles that can traverse it, that mean that before you ever entered the superhighway, there's a whole lot of preconditions that would be necessary. And I think, you know, Sawhney would argue, less true about the American highway system, which is not burdened with quite the same regulatory framework. And the reason I'm kind of bringing that up is that I think his analysis of saying, the metaphors we choose in turn structure our thinking, is a helpful tool for saying, what is it about comparing data to oil that's so seductive? Because you may say 2017/2018, but I remember it from even earlier, probably back in about 2010.

Kai No, I'm in two minds about this, because on the one hand, of course, the connotation that people want to evoke is that it's this commodity, this resource that is incredibly important and valuable, and can be traded, and we can make all kinds of things out of it. But on the other hand, oil has really fallen out of favour, right? Because it has all these kinds of dangerous consequences for society, pollution, climate change, and all that. Maybe after all, that part is a fitting metaphor for data because we have all kinds of problems now, but let's start with the commodity argument. How is data different from oil?

Genevieve Bell I'd want to frame that question slightly differently, I'd want to frame the question to be...

Kai We reject the premise of the question. I like it. Very good.

Genevieve Bell Absolutely. I'm an anthropologist, that is what we do. That's the question you're asking me; I think an equally interesting question...

Kai You brought your own question.

Genevieve Bell No no, an equally interesting question would be, why is that the metaphor that has persisted for a decade plus? Like, why is it that we've kept coming back to it when clearly, as you reasonably say, it's hard not to think about oil in the 21st century and think about, or at least, you know, in my world, think about the Exxon Valdez spilling oil into the ocean, hard not to think about war, about pricing, hard not to think about oil fields on fire.

Sandra Oil crisis.

Genevieve Bell Oil crisis, the kind of by-products of petroleum in our atmosphere. I mean, there's sort of too much dystopia over here. Hello, Blade Runner, and, you know, parts one and two.

Kai So you've turned to the right place for dystopia.

Genevieve Bell But sort of it's interesting that, much like the information superhighway, where we could talk about the superhighway-ness and the speediness of the superhighway, and not all the other pieces of the puzzle, there is also a way that those images get evoked because of the work they do, but not because of all of their work. And so there's something about the fact that oil feels like a shorthand, the way a superhighway did. And of course, the thing about shorthand is as soon as you go, but hey, wait a minute, do you mean this kind of highway, this kind of oil? What about this? Of course, the entire thing collapses. But my suspicion is the seduction of calling it oil is that it makes it seem really straightforward. Because one of the things those stories do when you use that kind of language is that it automatically assumes that everyone has the shared understanding. It's a really easy shorthand; it makes a thing that seems otherwise as yet unknowable, knowable. Now, of course, you've just rightly pointed out most of the perils in that, which is that actually, oil is not straightforward and simple. Knowing it doesn't actually help us understand data necessarily, except in only one way. So what is the work that it does do, and why do people keep perpetuating that? Well, one way is to say, well, actually, the reason people like to think about data as the new oil is because they would like to imagine that it was a commodity, because that's actually incredibly convenient. It is convenient to say that you could trade data, that there would be stockpiles of data, that data could be a thing that came in a crude form, and then was refined, and then was put into a machine that then delivered some kind of value on the basis of it. I mean, that...

Kai Makes it easy for vendors, right, to say, you just need more data, we give you the tools to refine the data, and off to the races.

Genevieve Bell And it also does a nice job of making questions about refining processes seem unnecessary. Because truthfully, for most people, you don't really think about where your oil is refined and what the consequences of that are. You don't really have to think about what are the extractive principles by which much oil comes into existence, think shale oil as kind of an awful extension of that. It does lend itself, I suspect, for many citizens, to an immediate kind of thought about the cost of it, because we go from oil to petrol pumps relatively quickly, I suspect. So I imagine part of the seduction of calling it the new oil is that it unlocks a story about value really quickly. So it lets us talk about it as a commodity. It lets us imagine that the accumulation of that commodity will convert to wealth. It suggests that people who have the commodity are going to be well positioned in a conversation about how value is derived. I suspect it implies supply chains without necessarily having to explicate them, and it implicitly suggests a differential between kind of the unprocessed and the processed. So, I mean, it makes some really nice metaphoric sense; it does useful work there. My question would be: useful work for whom? And why is it that people keep coming back to it? And what's getting lost in that conversation? What would a more productive conversation look like? What are the other things you could talk about? And I come back to people like Kevin Kelly, who, like, 20-plus years ago wrote that piece about what does technology want. How does a technology kind of grow? And what does it want? And he was arguing, you know, it wanted a path to complexity, a path to sustainability, a path to being a dynamic system; that was always his sort of set of things. And I remember, more than 10 years ago, asking myself, what the hell does data want? So rather than asking what it is, could we imagine what it wanted? And speculating on that was interesting too.
You know, my sort of sense was of data wanting, or needing, a country that it comes from; data always comes from somewhere, right? One of the things about calling it oil is that it makes it easy to forget that it has to start in a place, because oil is also just a flowing thing, rather than a from-a-place thing. So I was always interested, could you talk about, did data have a country? Did data want relationships? Because, you know, it comes from somewhere, it goes somewhere, it is related to something. I suspect data wants a network. Data needs to be connected, right? Like, it's not actually useful if it doesn't flow. I thought that data liked to keep secrets. Data doesn't always want to reveal all of its pieces of the puzzle. My suspicion was that data actually also liked to be abundant, liked surplus, right, like data likes other data. I did think data wanted to be feral. Data wants to jump the fence like the bunnies and just be gone. And that notions of kind of domesticating data were probably foolish. Now, of course, that entire thing then presupposes that you can anthropomorphise data and give it wants, when it really isn't those things. I think these days, the place I've gotten to with data is thinking that one of the pieces of the puzzle about it is that it's, for me at least, it's always retrospective, as in data is what happened, not what is going to happen. And as a result, I see it as being in some ways conservative, not in a political sense, but in a, it's always what's already happened. So it's contained in that way, right. And then using it, and imagining you can build a world on what's already been, in some ways circumscribes what's coming by what's already happened.

Kai In a very weird sense, that is what oil is. Because oil is just stuff that was and was you know, fossilised and...

Genevieve Bell Deeply distant.

Kai Deeply distant.

Genevieve Bell In that case, it's a metaphor that actually works. But I think it's not one of the ones people usually think of.

Kai Producing insights here.

Sandra But in that sense, all we could ever do with it is perform autopsies. And probably, autopsies are not the best way to move forward.

Genevieve Bell The one exception to that might be weather data. So weather data sits very much in that sort of interesting world between what has been and what we might know is happening. So, you know, we can look at weather data across Australia, which has a lovely history, right? Our collection of weather data dates back to the overland telegraph line; it's a happy Charles Todd invention. When he was stringing the telegraph line across Australia, from north to south, he turned every repeater station that was necessary to broadcast the signal, to move the data around, into a weather collection centre, and basically created weather data. That's the reason we have it. But when he did that, in the 1880s, he started to work out that the weather on one side of South Australia would be the weather on the other side of South Australia mere hours later. And it's one of the few places I can think of where past data is actually someone else's present data, soon to be someone's future data. And there's a nice piece where it's not actually, in quite that way, backward looking. But yeah, I mean, one sense of the way data functions is that because it's always in the past, how you then mobilise it relies on being able to imagine that the future, or even the present, might be more than the past has been. So an autopsy is one way of thinking about it. The other way, that, you know, Kai suggests, is burying it in the ground and waiting for it to become something over a protracted period of time, not quite transmogrification.

Sandra But in that way, it's not like oil, right? Because in many sectors, many industries, old data would not be very useful data. If I'm looking at shopping behaviour for instance, I'm not really all that interested in what people were buying in the early 80s. I'm more interested in what they were buying last year or two months ago.

Genevieve Bell Yeah, my suspicion is the periodicity or the time-sensitiveness of data is hugely variable. Like in some places data that's 500 years old is incredibly useful, in some cases, data that's five minutes old is incredibly not useful.

Kai So it ages, it is time-bound. It's also very contextual.

Genevieve Bell Absolutely.

Kai Because my data might not be useful for you and vice versa. And what is useful to an electricity company is not useful to a fashion retailer. So it's not a commodity that trades easily.

Genevieve Bell And not everyone would even imagine that it was a commodity to trade in the first place, right. So in calling it oil, it has a levelling effect, it makes the artefact into a thing that seems to be the same and could move, not necessarily seamlessly in barrels perhaps, but between things. But you know, the reality is there are plenty of sectors, communities, people, organisations who collect material who may not necessarily think it is data, right? The notion of things becoming data is actually quite complicated too. Like, at what point does something stop being a different kind of thing and become data? Because not everything makes that leap, right?

Kai I want to make the leap to another of those metaphors, artificial intelligence or deep learning, the idea that we exploit those past patterns that are hidden in the data to make predictions about the future, which of course we can do because much of our social life is incredibly patterned and follows certain pathways, which can change, and then we might be in trouble. But in that respect, many people who are building these technologies believe that the future is already hidden in that past data. And then how do we think productively about the role of data when we think about those new practices that are emerging, where we try to control, predict, or even shape and manipulate the future, for whoever benefits from it, by exploiting patterns in data?

Genevieve Bell I remember sitting with a bunch of bankers in about 2009/2010 in New York City, it's early days of big data. We were there talking about big data and the cloud. Big data and the cloud, it's kind of like a bad fairy tale.

Sandra Still is.

Genevieve Bell Yeah exactly. One of these guys, he looked at me, and he's like, 'I don't think you understand, Genevieve, like, more data equals more truth'. And I thought, 'wow, so modernism is back', which I said to him, and he didn't understand it. The guy who was sitting next to me did, and we had a bit of a laugh about why that was a conversation that was just never going to resolve. And I'm like, 'look, more data doesn't equal more truth. It just equals bigger storage bills'. And he was like, 'no, no, if you have more data, we will be able to ultimately discern the truth'. And I think there's something about that fantastical imagining of data that suggests that if only we could have complete knowledge, we would have complete truth, where knowledge and truth are stable and singular. And I'm of a vintage and an intellectual training, boringly I suppose, that suggests to me that that's actually not possible; there will always be multiple truths, and they will be contested. And there are lots of ways of constituting knowledge, and the idea that simply having more data will get you there is kind of a delightful fallacy. I can think of two really contrasting examples there, from not that many years ago. So, a large organisation in the United States that tracks TV viewing habits, Nielsen, which does, you know, basically TV ratings, and also TV viewing habits, in order to work out, obviously, how to sell advertising; it's partly their business model. Nielsen knows an enormous amount about American households, because they instrument the households with a simple remote control to your television: you have to fill in some demographic information, and you press the button when you use your television, so they know what you're watching. What they have there is your age, your gender, some education level, what your zip code or your postcode is, your television viewing habits, and the occupants of your house.
Usually on the basis of that, they can determine to about 95% accuracy what you'll be watching on any given day. They were offered the ability to have real-time data capture on the set-top boxes, or what do we call them now, personal video recorders. So the kind of box that the cable company or the television company would sell you. And they said, 'why would we need that? We don't need more data, because we actually have the only pieces that really matter. We already know the pieces that give us the right answer; more data will just mean basically more storage to manage'. And they were quite clear that more data didn't get them to more truth, right; they actually knew what they needed to know. And this was not a place where having more data was going to be helpful. The contrast to that is that there are multiple places where I'm not sure people understand that data is being collected about them that is deeply revelatory. So smart electrical meters sitting on the outside of your home would be an obvious one, where, I suspect, for a long time in people's lives your energy bill, which turns up maybe once a quarter, was not granular in a particular kind of way. It doesn't tell you minute by minute what you're up to. And that minute by minute what you're up to wouldn't be interesting to you or your energy company. Smart electrical meters make it slightly easier to have a much greater level of granularity about what's going on in the home. There are some globally, not yet here in Australia, where you would be able to tell by the electronic signature what the brand of your appliance was. So, you know, televisions have different signatures. So do fridges, based on brand, based on configuration. So you would be able to know just by looking at them.

Kai Yeah, oil doesn't reveal that about you, does it?

Genevieve Bell That is correct. It doesn't reveal that you like to drive in third and you don't change into fourth when you should.

Kai No, it does not.

Genevieve Bell But this would basically say, you seem to stand with your fridge door open 20 minutes every day. What's that about? Or, ha, you like to vary the temperature in your house significantly, in a manner that suggests what? Menopause or a temperature, or possibly COVID, we don't know. But suddenly a whole series of things that would...

Sandra You keep the lights on in the basement.

Kai Very strong lights in the basement.

Genevieve Bell Lights on all the time, what are you doing down there? And on the one hand, that all sounds like science fiction, but you know, we did some work in one of my previous jobs where, with householders' permission, we instrumented their homes and then came back to them and said, 'you know, can you talk us through what you were doing on any given day'. And there is a vivid example we all remember, of this household where an argument unfolded between a husband and wife about why the washing machine went every day. Which had obviously been going on this way for 20 years, but he'd never known. And she was like, 'well, where do you think the clean clothes come from every day?' And he's like, 'I don't know what you're talking about'. She's like, 'how do you think that happens?' He's like, 'yes, but why is the washing machine going every day?' And we were like, 'oh dear'. And what the data in that context made visible was a set of tacit social practices in the home that had never been visible to one party, and were hyper-visible to the other, which tells you an enormous amount about domestic politics and labour, and ideas about how work is visible and not visible; and making it into data surfaced all of that. My suspicion is, while we know how to think about a television-ratings organisation knowing something about us, it's very different to contemplate what an electricity company might know about us, in a place where something that didn't look like data before, an electricity bill, becomes a real-time portal into our home. And the challenge there is, of course, our legislation knows how to manage the first but not the second. We don't yet know how to think about the 'you stand in front of your fridge with the door open for 20 minutes' as a privacy violation. We know how to think about the former as one.
And so there's a little bit here where the oil metaphor doesn't help you with either of those things, or with imagining how we might get better at starting to think them through. In both of those instances there's more data; I'm not sure in either of those instances there's more truth. I think what there is in both of those instances is other patterns revealed, which then create social complexity that people aren't necessarily equipped to deal with, and regulatory challenges.

Kai What's the truth there, right, because depending on how I look at it, it's either completely creepy and horrifying, or full of business opportunity.

Sandra But as you said, I think in this case, the oil metaphor doesn't get us anywhere. And if anything, it blinds us to what's actually important about it, and to the inequalities that are embedded in how we know about this.

Genevieve Bell Where I was starting from, I would suggest, might be part of why it is constantly being gone back to. Much like the superhighway for the internet, oil for data is a really helpful way of managing the conversation in a particular kind of direction. And when I say helpful, you should probably imagine that's in air quotes, "helpful".

Kai What's a better way of thinking about it?

Genevieve Bell I think you were going there, Kai, and I think it's that data is cultural, data is contextual, data is almost inevitably constrained in some way or another. You know, because you need to be able to start to talk about all the other things that a metaphor should let you go to, and this one often doesn't. Like, what are the conditions of its production? How is it collected? What was the intent with which it was collected? By whom and where? Even if it's collected by a machine, what are the affordances of that machine? Because those machines themselves have intentionality and constraints built into them. I think, you know, there's a series of questions we should be asking. Where is it stored? Under what circumstances, accessed by whom? Are there protocols for how it is stored, are there protocols for how it's recirculated? How do we talk about its curation, its circulation, its creation? All of those become, I think, more interesting, but also, let's be really clear, much more complicated and complicating conversations to have, and oil is just easier. Because oil says pump it, store it, move it, power something. As opposed to going, 'gosh, who collected that? And what were they doing? And what were the tools they used to get that there? And do we know the histories of those tools? And where's it going now?'

Kai Who is it for?

Genevieve Bell And what were the conditions under which it was extracted? On whose land? Were there, you know, relationships of, you know, reasonable production enacted there, you know, what are the politics of the places that it's being stored? What is the technology that's being used? How's it being transported? What are the consequences of that. Like, you could do all of that, but most of that's not what people want.

Sandra Let me challenge you to stay with this, for a simple reason. I share a household with a consultant, which makes my life quite difficult sometimes, because I will bring up something like, 'but these people with "data is the new oil", how dare they? The conversation is so much more complex'. And he says, 'well, give me another shortcut'. And I say, 'well, it's complex and complicated, and there are all these questions you need to be asking'. He says, 'well, that doesn't help me keep people in conversations around it'. And I think that's a question you've asked before as well. How do we hold people in these conversations? Because I think what happened with 'data is not the new oil' is that we got tired of that conversation, we let it go. We come back to it occasionally, when, you know, we have a conversation about the ethics of data or the inequalities in the data, or some small aspect. What's a way to hold people in the data conversation?

Kai Well, I think we actually have to be willing to say to ourselves and others, 'You're right. It isn't that simple.'

Sandra I can't say that at home, you understand this, right?

Genevieve Bell That's why you have a podcast. You can say it somewhere else, to a whole lot more people. Again, I think it's the piece where, it's why I started by saying, 'there's another question you might ask, which is, what is the seduction of those kinds of metaphors?' Right? It's exactly what your consultant partner says. They make it easier to have conversations, and for people to feel like they are in those conversations. And they give everyone a shorthand. Now, of course, the reality about shorthands and metaphors is that they then, of course, both constrain our thinking, limit who's in those conversations, make it easy for everyone to think they're having the conversation, and then are incredibly vexing to the rest of us going, 'it's not that simple. Oh, God, really? Like, there's 17 things that are wrong with that'. And like, 'oh, shut up, we just need it to be simple'. And I don't think there is an easy way to hold people in that conversation. But I think it is reasonable to want to continue to try, and to say, 'well, okay, if you want to call it the new oil, here are three other things that it immediately makes me think of. So what does "data is the new oil spill" look like? What is "data is the next oil slick"? What does it mean if, you know, data is the next oil crisis? Is that what you mean?' And then you can sort of say, 'well, okay, if you're determined to have that one'.

Kai Pollution.

Genevieve Bell Yeah, well, or, you know, data is the next oil leak, data is fracking. What's the kind of the logical piece of that right?

Kai Your smart meter. That's a little bit like fracking.

Genevieve Bell Well it's a little bit like something that's extractive. Do you mean data is extractive? Do we mean that data is a colonial power regime? I mean, for me, at least one of the more useful interventions is then to play out all the obvious extensions of that. So if it's the information superhighway, who are the muscle cars? Who was selling insurance? Who is Henry Ford in that story?

Kai Who is the highway police?

Genevieve Bell That is correct. Who is running the roadhouse? What are they serving? So there's a little bit about, you know, can you be, playful is the wrong word, but can you push that? It's like, 'okay, if that's the metaphor you're wedded to, here's the next set of metaphors or extensions of that, that come into being'. And truthfully, for me, it's very hard to ever, in an Australian context, think about that without immediately going to Mad Max. So then I get to, you know...

Kai Fury Road.

Sandra Thunderdome, usually.

Kai Oh, that's a good movie.

Genevieve Bell Two men enter, one man leaves.

Sandra Well it always makes me think of this whole overhead we have now to be good citizens. And to have healthy democracy, you need to understand so much more about so many different things. I think it's a particularly interesting moment in history in that respect, because we're asked to make decisions that will impact our lives in the very short term that we don't fully understand, that set us on trajectories that are very, very difficult to...

Genevieve Bell Absolutely. Though I suspect in some ways, that's the condition of contemporary democracies over the last 100 years, when I think about the efflorescence of emerging technologies and the appearances of things that have turned up and profoundly reconfigured the way we did things, right. I mean, I haven't seen you guys in a while. So you have no way of knowing that I'm currently, although maybe you do, obsessed with the overland telegraph line and telegraphy systems more broadly.

Sandra and Kai Oh we know.

Genevieve Bell I know, that's right, you were there when I did that 'overland telegraph line, she's completely lost the plot' moment. So. But one of the pieces of unexpected information that I started to realise was working out that for Australia, there's this day in, well, right about now, August 1872, when it goes from taking 47 days, or 45 days, for a boat to get from London to Adelaide, to taking four hours by telegraph. That shift in speed, 45 days to four hours, there's not been a technological shift since then where the diminution of the speed, or the size, or the throughput, ever changed quite that rapidly in quite such a short period of time. And, frankly, for almost everyone simultaneously, so maybe currency conversion. We experienced that collectively. But there's something about our desire to say 'it's so much more complicated now', which isn't untrue. But there are these other moments where people's imaginations and their lived experiences had to be, not instantaneously, four hours, reconfigured and reimagined. And, you know, if we think about that piece, and for me interesting in its relationship to data, hugely important, because August 1872, you suddenly can tease apart communication and information, they no longer have to happen with a physical artefact, right. So, before 1872 in Australia, if you wanted to get information from somewhere else in the world, it had to come through a transportation means in a physical artefact, like a book or a newspaper or a journal, on a boat and then on a train, or possibly with a pony or a camel or a bike, to get to you. It was a quite sophisticated sort of supply chain of things. And then when you get telegraphy, you suddenly have information that is now flowing without a physical artefact in quite that way, down a wire. And a couple of things happen, right? Not all information makes that leap, pretty obviously.
So you create two classes of information: information that can be telegraphic, aha, bad joke there, and information that still needs to be, you know, physically bound, literally on a boat. We know that we then configured things to flow through the telegraph line more readily. Hence things like Reuters, the AAP, The Daily Telegraph are all, you know, consequences of making things telegraphic. But it also made data in a really particular kind of way that hadn't existed before, right. So large, dense ideas are turned into, effectively, not zeros and ones, but certainly dots and dashes, to move through a wire. And the nature of who controlled that, where it went, how it flowed, was very different. And delightfully, you can see back then, people desperately trying to work out how to describe the phenomenon, and the metaphors they come to and the language they are trying out.

Sandra Telegraph's the new oil?

Genevieve Bell No!

Sandra Telegraphic data is the new oil.

Genevieve Bell So, no, no, nope. They want to have things like 'the wire is singing'. So they want to talk about the nature of that transaction, they want to make it more human, less mechanical, so they call it the singing wire, because of the noise the wire makes. They want to talk about it as a girdle, so a way of binding the whole world together, exactly. They want to talk about it as sort of referencing some Shakespearean ideas about connecting the whole world together and holding it in conversation. They definitely evoke God, oh yeah, and the Holy Spirit and a whole lot of other things about it, an invisible word. They are very convinced that if you can connect everything with a single wire, the singing wire will create, and this should sound really familiar to you, will make democracy a global phenomenon, that everyone will be able to be equal because they'll have access to the same kinds of information at the same kinds of times, that there'll be no more great fights between nations, because there will be clear, transparent articulations of shared value.

Sandra This is the internet, isn't it?

Kai So utopian ideas, I think have always preceded...

Genevieve Bell But then your question is going to be is oil a utopian vision or a dystopian vision?

Kai Oh, yeah, well I raised that in the beginning.

Genevieve Bell But you said dystopian, I guess I'm saying in this case is it actually meant to be utopian?

Kai I think it's meant utopian, but we forget that there is a dystopian angle. And I think we tend to learn over time that the utopia doesn't quite come through. And sometimes we go full-scale dystopia.

Genevieve Bell Well, that's the Gartner Hype curve.

Kai Right. But it's not quite dystopia if it's on the hype curve, except that it doesn't really work. But what I want to raise with you is, and I put it to you as an open question, I think...

Genevieve Bell It was always a worry when it's framed that way.

Kai It was still possible to explain that the telegraph would allow you to send certain words, and they would then arrive, and then I have them at the other end. With these big data collection, distribution, manipulation machines that we've been building collectively, many thousands of engineers are working on these machines. There was a recent article about Facebook engineers who admit that, first of all, they don't know where our data goes, where it is really stored, and really how these algorithms actually bring about these, you know, sometimes dystopian phenomena. So do we have a kind of technology that, and this goes back to Sandra's comment about what you need to know to actually make good decisions in this democracy, have we created systems of a complexity that are very hard for anyone to understand, really? Is that not different to the telegraph of the day?

Genevieve Bell So, yes and no.

Kai I take the yes.

Genevieve Bell Well, but I think you have to think about both of them, right. Has technological machinery gotten more complicated over the last 150 years? Absolutely. Has that complexity rendered some of the outputs less comprehensible? I suspect so. Do we imagine that the moment that we are in now is a proxy for the future that is coming? I'm not so convinced of that. So one of the ways I sometimes think about some of the stuff that's happening now in lots of different places is that there are multiple forms of experimentation going on with, here's a data set, here's a set of tools we're using to manipulate it. Does that generate answers that are useful? Does it generate stuff where we're like, 'we have no idea what that is'? Will we necessarily continue everything that is present right now? History suggests to us no, market economics suggests to us absolutely not. We know they're sitting inside a capitalist, sort of late-capitalist, post-Enlightenment society, so we know the ones that will persist are the ones that people find value in. Whether that value is economic or political is more complicated. Will that be the same globally? No. Should we be worried that there are untested algorithms being used to extract value out of data? Probably. But, you know, I think the question that is interesting there has to do with, what are the mathematical models that are being utilised here? Because it's easy to wave our hands at machine learning and deep learning as though the computer is just going to go and find meaning in the data. And the reality is, that's actually not what's happening, right, we have a series of mathematical and statistical models that are being deployed to attempt to find patterns. And the thing about those mathematical models is they all have really particular histories. Many of them are models designed to find things that are like other things. So they're designed to fit...

Kai Cat pictures.

Genevieve Bell Or linear regression is about finding things that are more like each other rather than less like each other. Stochastic analysis, not dissimilar. Semi-Markov and Markov models are a mildly different form of statistical analysis. But most of the tools that are being utilised inside machine learning are not mysterious. And we do ourselves a disservice when we talk about them as though they are. They are, in fact, based on a series of well-litigated mathematical, statistical models. There are some emerging ones that that's not true about, but many of them sit inside, you know, long-standing statistics, topology, bits of maths that I don't really understand, but, you know, I do know are parts of the world that other people operate in continuously. I think one of the things we aren't good at talking about in all of this is, we are going through a cycle currently, and this sits inside regimes of value for computer scientists and people who work in this area, where in order to accrue status, you want to publish papers in particular peer-reviewed journals. This will be true as much for people who are sitting inside industry as in the academy. Having papers produced at the moment is entirely predicated on your ability to produce a novel result from data, not to ask questions about the models themselves. So at the moment, the fashion, and it is fashion, in certain communities is to say, 'I've used a novel mathematical model on a novel form of data and produced a novel answer'. And you're like, 'okay, what's the theory that you're building?'

Kai What's the question?

Genevieve Bell What on earth are you doing? And much the same was true in all of our disciplines at earlier points in time: either granting bodies, journals, or the pressures of a particular kind of well-rounded CV caused us to do certain kinds of work. We're in a particular kind of moment of that currently, in that community. And we aren't good, as people who sit in a very different community, at recognising that some of that production is not as sinister as we'd like to read it. It is, in fact, epiphenomenal, about CVs, and about what keeps you employed. Which is very different than saying, you know, will those machine learning models ultimately create, you know, an internet of entirely cat pictures? Or is it more that this guarantees you, you know, you'll get your next promotion? And both of you are just looking at me, like, really? It's an unsatisfactory answer, I can tell. Dear listeners, were you in the room, you would be greeted by the fact that I have two people staring across the table at me going, what?

Sandra Not going what. I think, for me, there's still that point where I think we're excellent at that kind of unpacking things for ourselves. But I think many of these conversations benefit from people who are not currently in them entering them.

Genevieve Bell Yep, don't disagree.

Sandra What are the avenues by which we bring them in? Not just the conversation around data or, don't get me started, around papers and publishing, what we choose to write about and get promoted on. But I think many conversations around what's changing, technology, all the conversations around artificial intelligence, around ethics, around data, would benefit from other voices being in the room and staying in those conversations, when people just want to go to the practicalities of it, to 'let's experiment with it, let's figure out what works'. How do we keep the people who are doing these things involved in bigger conversations?

Genevieve Bell Which room?

Sandra This one would be a good one, but ahh...

Genevieve Bell Because I think this is an interesting kind of piece about the notion that people have to be in, quote, unquote, "the conversation" or "a conversation". And I'm always kind of curious about what we think that looks like. I know, when I sat inside large tech companies, the inordinate frustration I heard from my colleagues about dealing with my other colleagues. So, like, sitting as a social scientist inside a large tech company, trying to get social scientists outside my large tech company to talk to the people I was working with in a way that was comprehensible, was a really hard form of trying to create a shared discursive arena. And I think part of that is exactly where you're going, which is that we all have localised, specific vocabularies, and localised, specific ways of talking about the questions that we care about. And, for me at least, as a practitioner sitting inside, when I did, you know, technology companies, it wasn't enough that I got to show up knowing what I knew. It wasn't enough to show up with my, you know, fairly bizarre collection of knowledge in the Science and Technology Studies space and, you know, ethno-history, a bit of feminist and queer theory, and postcolonial literature thrown in for good measure, and whatever else was in my toolkit of crazy, and, you know, not just my toolkit, my theoretical kind of training. It wasn't simply enough to show up with all that and expect people were going to talk to me. It became part and parcel, at least in my working life, of having to spend an enormous amount of time actually making sense of the work that was being done by my colleagues, and of working out that I needed to be able to understand what they were talking about, and not expect they were going to meet me halfway, because that wasn't the way the nature of that exchange was going to be.
And that meant that, you know, for me, at least, in order to create a conversation that I sustained people in, I had to be far more literate about the technological pieces than they ended up being literate about social sciences. And I also had to be willing to work out which of the pieces of social science theory you could bring, without necessarily having to do in-text citations or footnoting, and make them part of the broader conversation. And, you know, I have seen that work, but I also know it requires work. And so when I think about, you know, what I do now with students, that would be about ensuring that all of our students know how to build a technological system. So, you know, for better or worse, that means they've all been exposed to coding, so they can all code in Python. Some of them don't love us for that, in fact some of them may hate us for that. But, you know, learning how to code, at least to a beginner's level, is a really useful way of understanding how that particular technological apparatus functions. The same way learning how to, in fact, write a programme, look at data visualisations, think about how algorithms work. It's not enough to understand those as socio-technological artefacts; it helps to actually understand them as someone who has at least built one of them, and attempted to make it do something. And so part of the way we have formulated the curriculum for ourselves and our students is to say, look, we need to give you not just the building blocks of critical thinking, but the building blocks of the technical apparatus itself, so that it is not an abstract object of your technological inquiry, it is actually an object that you have built. Because it turns out, when you have done that, there are some slightly different questions about affordances you almost ask instantaneously, that aren't necessarily the same as your theoretical constructs. That obviously doesn't scale.
So I think, you know, then we have to deal with the fact that, while notionally we would like to imagine everyone should be in the room having the conversation, not everyone wants to be in the room having "the conversation".

Kai You're not going to get members of parliament, who have to decide about technology regulation, to first take a course in Python.

Genevieve Bell Ah, you'd be surprised, actually.

Kai I would be.

Sandra Let's.

Genevieve Bell So in multiple places in the world, we actually see notions about technological literacy as a threshold test for certain kinds of political activity, more common in Europe than elsewhere. It's certainly been the case at earlier points in time that many of our regulators, both government and politicians, had backgrounds in law, in engineering, in the sciences. So on one hand, maybe it seems insane to suggest that; on the other hand, I'm pretty certain if we said to the current minister, the Right Honourable Minister Ed Husic...

Genevieve Bell Minister Husic, you know, would you like to learn Python? I suspect he'd say yes.

Kai Then he should do it.

Sandra I would agree with you on that. We've done quite a bit of work on trying to teach various types of fluencies to people who are involved in making these decisions, and in thinking about these issues, I still remember when, a few years ago, we were at one of our conferences, and there was someone from Google there, who said, 'I'm here and I want to understand what you're doing. But I do not, I cannot understand the language that you're using, what are you talking about?' And so on. And this was directed to all our colleagues, of course, we'd never use that language.

Genevieve Bell Right, when I rocked up at Intel, I'd come out of a discipline, so American cultural anthropology, where you communicated through papers. So you gave people your papers, but if you were at an event, you read your paper out loud. So you would stand in front of an audience, often not that many people, because see above, and read your paper out loud, right? And then you would stop. And when I first got to Intel, that's the only way I knew how to talk, was to read my academic work. So I would write things and then get into conference rooms and read them. And my colleagues were like, 'what the hell is wrong with you?' And I'm like, 'What do you mean?' And they're like, 'don't you know your ideas?' And I'm like, 'what?' And they're like, 'well, why do you need to read that? Don't you know what they are?' I'm like, well, because...

Sandra The word I'm using here is important. And it is this word I want to use.

Genevieve Bell I said that. And they said to me, 'but why are you reading?' In their fields, reading to us read as if I didn't actually know my own ideas; that I couldn't communicate them without reading meant that I didn't know them. And so it was this twofold way they tuned me out, right? On the one hand, I was using a vocabulary they didn't understand. But on the other hand, I clearly didn't actually know my stuff, because I insisted on reading it, and to them, that meant I didn't know. And it was fascinating that something that, for me, was a thing where, you know what, I am trained and I do exquisitely care about the words I use, still do. But not reading them out loud was a very sort of unexpected disciplinary boundary that I had to traverse in order to be comprehensible in a different kind of way. So I get the bit that it says, I imagine there are lots of people in lots of places that are technology companies that we wouldn't think of as being tech companies anymore. Because, let's remember, the thing about the data puzzle, new oil or otherwise, is that there are many companies that are data companies that weren't necessarily tech companies. And the expansion of people that are data companies, who need a technological substrate in order to have their data flow, means that there's a whole lot of other people who are going to be in this conversation that are even less close to us than some of our colleagues sitting inside large tech companies.

Sandra Qantas is a tech company.

Kai Every bank, most of the mining companies, pretty much any energy company, Coles, Woolworths.

Sandra So a bit like electricity before.

Genevieve Bell Yeah, well, or at least we wouldn't have talked about them as being electrical companies. Very excited about Coles or Woolworths, they're electrical companies, and you're like, 'Woohoo!' So there was that sort of bit where, you know, at some point, we're going to realise that, you know, who was it, it was Andrew Ng, right, who was the Chief Scientist at Baidu, and before that, Google, Stanford, I think, who used to talk about, you know, data as a strategic asset. For companies, it wasn't just a thing you had, it was an asset, sort of building on...

Kai Makes it slightly different to a commodity like oil, though. Maybe that's...

Genevieve Bell Because he was part of that whole set of conversations. Jeff Immelt used to say this at GE, that, you know, 'most companies came through the 20th century without being technology companies, but none would get through the 21st without being one'. And Ng was like, 'look, I think it's actually about data and how you mobilise it as a strategic asset'. Yes, he didn't think it was oil, he thought it was, you know, could you use it as an asset.

Kai If it's a strategic asset, then it's highly specific, and you can defend it, and you can make it your own in a way that no one else can, right. So there's a lot of skill that goes into that.

Genevieve Bell And I thought that was a better framing of it in that way, right? Because it at least understood it as being more specific.

Kai It's a good way of arriving at this insight. So, data as a strategic asset.

Sandra So maybe it's not 'data is the new oil', but 'data is the new strategic asset'. A better way to think about it.

Genevieve Bell Or just 'data is a strategic asset'. No 'new' necessary, because, of course, we didn't spend any time talking about what the word 'new' is doing in there, which we could have.

Kai That is true also. Next time.

Sandra Next time, we hope to have you back again. It was such a pleasure talking to you today. Thanks for being here with us.

Genevieve Bell I think it was my pleasure. And I'm pretty certain I should be saying congratulations to the pair of you, and your eminently hyper-qualified people behind us, for 300 episodes. Well done.

Kai That is true.

Genevieve Bell Woohoo.

Sandra Woohoo. Well done, Megan.

Megan Woohoo!

Kai Thank you, Megan. And thanks for listening.

Sandra Thanks for listening.

Outro You've been listening to The Future, This Week from The University of Sydney Business School. Sandra Peter is the Director of Sydney Business Insights, and Kai Riemer is Professor of Information Technology and Organisation. Connect with us on LinkedIn, Twitter, and WeChat, and follow, like, or leave us a rating wherever you get your podcasts. If you have any weird or wonderful topics for us to discuss, send them to sbi@sydney.edu.au.
