This week: why the tax office is interested in social media, robots in education and why phones that are not so smart are suddenly appealing. Sandra Peter (Sydney Business Insights) and Kai Riemer (Digital Disruption Research Group) meet once a week to put their own spin on news that is impacting the future of business in The Future, This Week.

The stories this week

Tax office and social media

Robots in education

Not-so-smart iPhones

The IRS and social media

Dubai’s drone taxis

More on simple phones from MIT


You can subscribe to this podcast on iTunes, SoundCloud, Stitcher, Libsyn or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au.

Send us your news ideas to sbi@sydney.edu.au.

For more episodes of The Future, This Week see our playlists.

Dr Sandra Peter is the Director of Sydney Executive Plus at the University of Sydney Business School. Her research and practice focus on engaging with the future in productive ways, and the impact of emerging technologies on business and society.

Kai Riemer is Professor of Information Technology and Organisation, and Director of Sydney Executive Plus at the University of Sydney Business School. Kai's research interest is in Disruptive Technologies, Enterprise Social Media, Virtual Work, Collaborative Technologies and the Philosophy of Technology.

Introduction: The Future, This Week. Sydney Business Insights. Do we introduce ourselves? I'm Sandra Peter and I'm Kai Riemer. Once a week we're going to get together and talk about the news of the week. There's a whole lot I can talk about. OK let's do this.

Kai: Today we look at why the tax office is interested in social media, robots in education and why phones that are not so smart are all of a sudden appealing. 

Sandra: I'm Sandra Peter. I'm the Director of Sydney Business Insights. 

Kai: I'm Kai Riemer. I'm Professor here at the business school. I'm also the leader of the Digital Disruption Research Group. So Sandra what happened in the future this week?

Sandra: Our first exciting story is about tax. 

Kai: We're talking about tax for a second week in a row. 

Sandra: Second week in a row. This is an exciting podcast about tax. The Sydney Morning Herald talks a little bit about who cares what you post on social media and it seems now everybody cares including the tax office. 

Kai: So they're actually reading our social media. 

Sandra: The Tax Office is. The tax office has apparently employed a team of data mining specialists, or data doctors, and their role is to look at your Facebook and Instagram pictures and try to figure out whether you're dodging taxes or not.

Kai: Indeed, the Sydney Morning Herald came out with an article that should scare everyone whose social media feed is not quite in line with what they reported to the Tax Office in their tax returns.

Sandra: They're trying to do this really in two ways. One of them is they're trying to look at Facebook and Instagram to see if you have a business that you haven't declared - are you selling anything online that you haven't declared?

Kai: Are you spending money that you're claiming you don't have?

Sandra: Exactly. Are you displaying bling that you couldn't afford on what you're telling the tax office you're making?

Kai: They're not actually reading this in person, right? They're employing data analytics and pattern matching techniques to, you know, throw up those exceptions that they will then investigate and audit. Right.

Sandra: That's apparently what we're being told. And again, this is not something new. Data matching is something that the tax office has been doing for a long while online, not necessarily with social media feeds, so the fact that they are now branching out into images and other types of data is new but...

Kai:...it shouldn't surprise us.

Sandra: It shouldn't surprise us, especially since this has happened before. So interestingly enough, even though this story has come out this year, in 2014 there was an interesting article on CNET that looked at the IRS in the US, the Internal Revenue Service, and how they were looking at your Facebook feed, and that story came out around March/April, which is tax season. And if we go further back, in 2013 Fox News came out with a scary news report about how the IRS was looking at your Facebook feed to try to figure out if you're dodging taxes. So this seems to be a recurring news story around tax season.

Kai: Is this just the ATO catching up with what is happening in the US, or is this just a story that pops up around a particular time of year?

Sandra: We think it's a story that pops up every year, but this is something that we can actually do a bit of research on, which we will.

Kai: This is an interesting story for a number of different reasons. What it points to for me, what it points out again, is that the data we're creating in social media will more and more be taken out of context and used for other purposes. And you know it already happens, it's being used for marketing purposes and things like that. And it really points to the fact that we need to be mindful of what we're doing on social media. The point is that when people are talking to their friends on social media, they're oriented towards talking to their friends, talking to their family, and they might not necessarily think of all the consequences of what they're doing, the traces that they're leaving. So they're innocently leaving traces that can later be taken out of context or used for other purposes, as we see here. So I think this points to the need for each and every one of us to become more mindful of what we're doing, maybe to look into the privacy settings of the platforms that we're using, and to really gain a form of digital literacy that many of us have not acquired because we did not grow up with these kinds of media. So to me that's really an interesting aspect there: we should never forget that a lot of the data we leave is publicly available and will be used for other purposes.

Sandra: I think you point to a very interesting aspect there with how we perceive these spaces. There are some nice studies looking at the fact that quite often places like Facebook, where we talk to our friends, are perceived, especially by students or younger generations, as private spaces even though they are public. And I think the other interesting aspect for me in that story was the way we perceive institutions. We seem to have different trust relationships with different institutions, in that if we look at law enforcement organisations, or if you look at the FBI, they have been using Facebook and social media to track down criminals or to find other illicit activities online. And we feel perfectly comfortable with that. We even encourage it to happen and we welcome it. But once the IRS or the ATO start looking into our own taxes and our own finances, we see that as an infringement of...

Kai: I mean none of this is unproblematic, right, so the FBI has followed up on leads that turned out to be fake, and that can have dire consequences for individuals. But you're raising a good point about the way in which people perceive these spaces to be private. I remember a story back in Germany about a social worker who was going into schools and high schools to teach young people about digital literacy. What he used to do is find out who's in the class, then see which of the kids had their Facebook settings on public, and then print out the Facebook stream or the Facebook wall of one or two of the kids, bring them in as posters and pin them up in class. And the kids would be shocked and they would, you know, scream murder that this is private information, you can't put this up here in the classroom. And that would then throw them into the middle of a discussion about, hey, I'm just putting it up here with your classmates; you've put this up for everyone to see. Imagine who is out there on the internet who can possibly see this, right? And you have a very powerful case to make to then sit down together and look into the privacy settings of those platforms. And I think this is the point: those platforms do not put those privacy settings front and centre, of course, because they want you to share as widely as possible so that the data is available, because the data is the product in the end, isn't it?

Sandra: Exactly, if you're not paying anything for the service, you're the thing that's being sold.

Kai: Yes. Or as someone once said if Facebook was a butcher you know we wouldn't be the customers we'd be the sausages.

Sandra: So since we've done tax again this week, should we talk again about robots? They've been all over the news in the past week.

Kai: Oh there's been a lot. I mean we could talk about this for an hour, I guess. Robots also seem to have become a shortcut for anything to do with machine learning, AI, computers in general, right, so we have to keep an eye on that. But there have been some interesting stories about taxi drones in Dubai; there was an article in Recode. There's a whole controversy around self-driving cars and how Uber is being accused of developing their robocars from stolen Google technology.

Sandra: And that's one we'll keep a close eye on as it develops. Yeah it might have interesting implications for the future of autonomous vehicles. 

Kai: Absolutely, it's just unfolding. But there have been a couple of really interesting stories about robots and children and education. There was one in MIT Technology Review about robots as role models for children, and one on Futurism.com about how your next teacher could be a robot, and I think that's fascinating, because when robots mix with children we learn a lot, not necessarily only about robots but also about humans and how we relate to robots. So let's talk about that.

Sandra: Let's talk about robots teaching our children. So the article talks about having them in schools. And I think, as much as we like to think of robots as looking like us or closely related to us and sitting at the front of the class, this is really about technology going into the classroom, and especially adaptive technology going into the classroom. I claim that to some extent we have that already...

Kai: Not entirely new right? 

Sandra: It's not entirely new. UNSW has developed software called Smart Sparrow, which made its way into universities: adaptive learning technology that tries to figure out what mistakes you're making, tries to correct those mistakes, and presents you with increasingly difficult problems, or problems that make you more comfortable in areas that you're struggling with or responding too slowly to. So we have those sorts of technologies, but that's not quite what they meant, is it?

Kai: No, not quite. They went much further than that, saying that your teacher could be a robot, that we could actually have computers teach children more or less entirely. And often the argument that is being put forward is that this is a really good technology for areas where there aren't enough teachers available. We can talk about that as a separate topic, but what I find quite interesting is the model of education or learning that goes into making claims like that, the assumptions that are being made. The Futurism article quotes a guy named Thomas Frey of the Da Vinci Institute saying it learns what your interests are, your reference points, and figures out how to teach you in a faster and faster way over time. So what we're saying is the computer is almost treating the human as a thing to be figured out and then to be filled with knowledge in a more efficient way, right? Now I find that quite a narrow approach to learning and also to education. And my point would be: isn't education and learning about gaining a positive perspective on life? Really about becoming a person and learning how to learn and appreciating what education is all about? There's so much more than stuffing your head with knowledge that goes into learning. So having adaptive technology that helps with certain learning tasks, right, I get that point to a certain extent, but claiming that a teacher is just doing that, I find a very dangerous argument to make.

Sandra: I would have to agree. Extending this to anything other than mastering, let's say, certain mathematical techniques or coding, or other areas where you can rely on technology to correct certain types of mistakes, extending it to a more well-rounded education, especially in primary or secondary school, is a dangerous argument. So is defending it on the grounds that you might be able to take it into areas where there is a lack of teachers. These would often be the areas that would most benefit from actually having a human being at the front of the class, someone trying to figure out how students from backgrounds that would not normally have access to education develop and how to instruct them. Or look at countries where education is difficult, say rural parts of India or Africa, where transplanting a robot that has been taught in a Western context is quite difficult, because it assumes that these students, whoever they are, will have the critical literacies to engage with the type of content we're giving them, or the cultural competencies to even know how we learn and what we're used to.

Kai: It takes a very mechanistic view of learning, doesn't it? To me education is about creating situations, right? It's emotional. It needs you to be there. It needs to be risky to a certain extent, and it has to have a point. To reduce teaching and learning to the mastery of certain, you know, mechanistic tasks is a very narrow view. It certainly has a place. But if I had to make a point or sum this up: how does a robot teach a child to be a person?

Sandra: Well actually I have a comment on that, and that would be that we can look at how that's happening, and it's not necessarily in a good way. If you think of Amazon's Alexa, the little speaker that you talk to, there were a couple of interesting studies looking at what happens if you put one in a house where there is a child, and the child is two years old. Well, you can ask Alexa any question and you don't have to say please or thank you, and you can ask it 20 different times and you can yell at it and it will still respond. So they're actually teaching children to be rude at that age.

Kai: So what it actually means is that we're becoming more like robots, right?

Sandra: Or they're becoming more like us. And I think that was the point with the robot that the Stanford University researchers had developed to try to learn to speak and behave naturally: it went on Twitter and apparently within 24 hours it became rude and anti-semitic and...

Kai: Yeah, yeah, people taught it to do all kinds of naughty things.

Sandra: So overall I think robot teachers are something to keep an eye on. And it might not necessarily be about the next robot in the classroom, although I think there is a fascination that we have as humans with that. There is a Channel 4 TV series called Humans where these humanoid robots called Synths actually do teach in class, based on a really good Swedish series, actually, Real Humans.

Kai: It points to a whole other discussion that's going to be had, not today, which is our fascination with robots and the role science fiction plays in the way in which we now talk about the robots coming for us and stealing our jobs, right, as if they had human agency and were actually living things that are coming for us. We've just got to keep in perspective that we're still talking about algorithms that have to learn from us. So we're not talking real human intelligence in robot bodies yet.

Sandra: Yet. Something for next week. 

Kai: We have one more topic for today which is dumb phones. Phones that are not smart and I'm hearing Nokia is making a comeback. 

Sandra: Yes, apparently your next phone could be not a smartphone but a dumb phone. Interesting article from the MIT Tech Review.

Kai: Yeah, and one in The Atlantic as well. Yes, again multiple articles approaching this topic from different angles.

Sandra: They're looking at phones that will only have one function, and that will be to allow you to make calls and receive text messages. No connectivity, no apps and an actual physical keyboard.

Kai: Yes, those old-fashioned phones that are not connected to the Internet, that will not annoy you with Facebook posts, Instagrams, all the Snapchat messages flowing into our phones, and notifications, but that are really just there to have a chat.

Sandra: So is there a future market for dumb phones in the era of smartphones, bigger and bigger screens, more apps, more connectivity, more features?

Kai: That's a good question. I don't know. The Atlantic says the contemporary citizen of the developed world has almost no choice but to own and operate one, a smartphone that is, and yet the joy and the utility of doing so has declined if not ceased entirely. I think that's the point, right? Smartphones used to be new. They used to be shiny. They opened up a whole new world of possibility and we were excited. We embraced them, and in a matter of seven, eight, ten years since the launch of the iPhone these smartphones have enabled us to come up with whole new forms of communication. But the shine has come off a little, right? For a lot of people smartphones just add to the busyness of life to an extent that they become nuisances, they really become a burden.

Sandra: We still want you to have them to listen to our podcast, but indeed these dumb phones come in this context of angst about technology, about the amount of time it takes from us, and about the number of times we check it. I remember that saying that the thing you touch first in the morning is the thing you're in a relationship with, and for many of us our phone is the first thing we touch in the morning.

Kai: Indeed. So many people sleep with their phones under their pillows. But at the same time it's almost like a love-hate relationship for many people these days. And yes, indeed, many of us cannot do without them, but we do not necessarily love them anymore. I think it's more than just a technophobia that is developing. I think people's lives are getting so busy and life has become very complex. There's so much information streaming in, and the solutions don't seem to appeal, because outsourcing control over what you read and what you consume to an algorithm, I think, makes people uneasy, right? Because you're losing control, to a certain extent, over what you take in as information.

Sandra: It seems that in terms of both joy and utility, smartphones have reached a certain plateau and are declining, and that's probably also one of the reasons companies like LG, or even Apple with the iPhone, are mentioned as looking more and more at how to improve battery life and the screen rather than add more features, as those seem to have plateaued. But besides this constant infringement on our personal lives and on our time, I think we should also consider the matter of cost. These dumb phones, besides not offering us any of the apps or any of the connectivity that a smartphone would have, also come at a very, very low cost. So that is also something that opens up a completely new market, if you ask what the job is that this phone does for you.

Kai: Yes. The Atlantic makes an interesting point there. They said it used to be called a mobile phone, right, because it added something to the phone: it made the phone mobile. And I find that etymology quite interesting, because then the mobile phone became smart, it became a smartphone, and now what used to be the mobile phone we call a dumb phone, as if it was a downsized version of the smartphone. So that in itself tells you something. But I think this idea of the phone being dumb points to the relationship that we have with it, in the sense that phones and the apps and the notifications and the way in which they throw things at us seem to take on a certain control over us. I would be really interested in a study, whether it exists or we'd have to do it, about how much we actually act on our phones and how much we react: to what extent are we just stimulus-response automatons reacting to notifications and messages coming in and things that the phone brings up that we are supposed to like or retweet, and to what extent are we actually enacting our own human agency and our own human control over these devices? And I think that's at play here for some people who say I really want a break from this. I want to just have a simple phone. I want to see what it's like to have a phone which only does things when I tell it to, like make a phone call. I think that's probably one angle to explaining this.

Sandra: And we can also think of environments or industries or contexts in which this would actually be a benefit. One example would be the construction industry, where construction workers definitely benefit from not having something that is large, that breaks easily, that is a distraction...

Kai: That you can operate with gloves on.

Sandra: Exactly. That you can operate with gloves on. And best of all if you have a dumb phone the ATO can't track what you're doing on social media.

Kai: Exactly. And I think that's all we have time for today. Thanks for listening.

Sandra: Thanks for listening. 

Outro: This was The Future, This Week brought to you by Sydney Business Insights and the Digital Disruption Research Group. You can subscribe to this podcast on SoundCloud, iTunes or wherever you get your podcasts. You can follow us online on Twitter and on Flipboard. If you have any news you want us to discuss, please send it to sbi@sydney.edu.au.
