Sam is a digital rights activist and writer based in Melbourne. They are currently the program lead at Digital Rights Watch, where they advocate for a liberatory digital future in which everyone can thrive. They're going to talk to us about human rights and tech policy in Australia with "Digital Rights and Digital Wrongs". Please welcome Sam.

[Applause]

I'd like to start with some stories.

A teenager and her mother discuss accessing vital medical care on a popular social media platform. This form of medical care is widely recognized as essential, but it has recently become criminalized. They think they're having a private conversation, but their messages are not encrypted. The social media platform hands over their communications data to law enforcement, the teenager goes to prison, and it looks like the mother will follow.

A woman has built her entire community online. She has spent hours, weeks, months building up her brand, finding her community, making friends; she's earning a living. Now, she knows she works in a pretty stigmatized area, so she makes sure she's really careful to abide by the rules of the platform, but they are vague and they do seem to change all the time. She's got a whole community there, she's got access to a support network, they're sharing crucial health and safety information together. And then, in an instant, she loses it all: all of her friends, all of her community, all of her pictures, her videos, her memories. It's all gone, and there's no pathway to appeal.

A man is looking for a home. It's the middle of a housing crisis and he's really stressed out, so he's going to rental inspections and using whichever app the real estate agent asks him to.
He starts to become a little uncomfortable with the kinds of questions they're asking, especially as they become more and more invasive and seem irrelevant to his ability to pay rent. But he doesn't want to be troublesome or annoying, because he knows that will make it all the harder to secure a rental. He misses out on securing a home again and again and again, because an algorithmic scoring system downranked him, and he has no idea why and no way to do anything about it.

A young woman is living her life. Given that it's the modern digital economy, she has a mobile phone. Then one day her telecommunications provider has a massive data breach. They blame hackers, but she later finds out that maybe their digital security wasn't quite up to scratch. Her details are online and she has no idea how to protect herself or her identity. A few weeks later her health insurer also has a massive data breach. More of her details are online, creating an ever more complete picture of her and her life. These companies say things in the news like, well, we don't know of anybody who's been harmed as a result of these breaches. And yet she lives every day in fear that her ex-partner will be able to find out, with a little bit of effort, where she lives.

I could fill my entire time slot with stories like these. These stories are all connected: they're all digital rights issues. And all of them are based on real events, with very real impacts on people's rights, safety and wellbeing.

There are countless examples of ways in which digital technologies are intersecting with systems of oppression, injustice and inequality, and in many ways making things worse.
Conversely, digital technology has immense potential to improve the way that we live, work and connect. Given that you're here today, I don't think I need to convince you of the good that technology can do and its immense possibilities. But the risks and possible consequences when it goes wrong are immense, and that's where digital rights come in.

Digital rights are human rights. The only real difference is that they specifically take digital technologies into account.

Too often, when people talk about digital rights and harms, they think about them in an abstract sense, sort of ethereal, happening in cyberspace but not in the real world. But as you've just heard from the stories I shared to begin with, they have real, tangible, material impacts on people's lives. And as the divide between online and offline continues to collapse, digital rights become ever more important.

The fight for digital rights, at its core, is about protecting people from these kinds of harms and more. But more than just protecting people from the bad stuff, it's also about envisaging, working towards and ushering in a world where technology is really a force for social good.

So today we're going to delve into the world of digital rights, where tech, politics and human rights combine. In this session we're going to do a little crash course on digital rights to begin with. Then I'm going to go through the digital wrongs and give you a roundup of some tech policy in Australia. And then we're going to finish up with some tips on how you can get involved, if I've done my job well and you feel so inclined.

Before we go much further, you probably want to know a little bit more about me. That's my cat. She's my biggest fan and I love her so much. My name is Sam, as Daisy said.
Thank you for your introduction, and thank you so much to PyCon and all of the organizers and volunteers for putting this on today. It's a real honor to be here, and thank you to all of you for coming along and coming with me on this journey.

In my day-to-day life I have the privilege of living and working on the unceded lands of the Wurundjeri people of the Kulin Nations, and it always was and always will be Aboriginal land.

My background is in politics and data science, and I've worked in privacy in some way or another for about 10 years: in the public sector, in the private sector, and now in the not-for-profit space. I also did a stint as the program director at Code Like a Girl, an organization all about creating pathways for women and non-binary people into tech careers and advocating for more inclusive gender representation in the tech industry. And now I'm the program lead at Digital Rights Watch, a civil society organization that exists to defend and protect digital rights in Australia.

We do a couple of things at Digital Rights Watch. We advocate for rights-respecting, socially progressive tech policy. That means doing things like writing submissions, going to parliamentary hearings, talking with industry, talking with government, and most of that work happens behind the scenes. We also contribute to the public tech discourse: things like coming here today to chat with you about digital rights, talking to the media, or running events and campaigns.

And we are working to build the digital rights movement in Australia. Sadly, the digital rights movement in Australia is pretty small, to be honest. It's filled with brilliant, dedicated people, but it's chronically underfunded and under-resourced.
So what we do is try to work with different organizations and people across law, tech, social justice and other community groups to build that movement, because we know that we're more powerful together.

Okay. As I said before, digital rights are human rights as realized in the digital age. That's a pretty big concept, so I thought we'd break some of it down.

One aspect of fighting for digital rights is protecting and upholding human rights that existed long before the advent of the internet: things like the right to privacy, freedom of expression, freedom of speech, freedom of assembly and so on. And something to be aware of as we go on is that Australia doesn't have a federal Human Rights Act or charter. We're the only Western liberal democracy that doesn't have one, and this sadly makes the fight for digital rights all the more challenging.

Now, for me, the right to privacy is absolutely pivotal in this fight, and that's because it's what we can call an enabling right. Privacy is really important in and of itself, but it also enables us to enjoy other rights and freedoms. For example, think of any social movement that's pushed back against powerful institutions for social change. It is incredibly difficult to organize a protest, coordinate a union, or build a community of resistance if you have no privacy, if you're under constant surveillance. Privacy enables us to gather, share and organize outside the view of those who might want to be listening in.

It's also tied to notions of self-determination, personal agency, dignity, autonomy, and the ability to develop one's sense of self and personality.
And it's closely linked to ideas of anonymity, which is really important for keeping lots of people safe online, especially in a digital ecosystem where it's increasingly possible to track everything that we do. These are all really powerful concepts, and they're really important to us as individuals, but also collectively as a community, and more broadly for society at large.

Another aspect of fighting for digital rights is fighting for the establishment and protection of newer rights, or reimagined rights, that are fit for the digital age. For instance, many now argue that access to the internet should be considered a right. Given that so much of our lives happens online, accessing services, work, socializing, it makes sense that this shouldn't necessarily be considered a luxury anymore. This is connected to the idea of digital inclusion and the digital divide, and just shy of 10% of Australians are still greatly excluded from the digital economy. Digital security is another thing that people are starting to consider in terms of a right.

Now, all of these things are actually really connected, and as I was working away on this talk I kept wondering how I could possibly communicate just how connected all of these things are. It got pretty hectic in my brain, to be honest, and where we're going is going to look a little bit like this, so I apologize in advance. I promise I'm not going to peddle a conspiracy theory or anything.

So this is my mind map. It's a map of my mind, it's what keeps me up at night, I have nightmares about this, it haunts me. There are probably infinite ways that we could create a map like this, so if you've got your binoculars out and you want to pull me up on something, don't at me.
It's just one example, and the idea is to demonstrate how these things are connected.

In the blue nodes we've got rights. Not all of them, I can't fit them all on this screen, but you can see the right to privacy, the right to protest, yada yada. In the purple we have concepts: ideas like surveillance capitalism, which is the idea of data extraction and accumulation for profit. You can see that it's connected to datafication, the idea that everything we do these days can be transformed into data. And that's connected to data positivism, an ideology that basically purports that we can solve any complex social problem if we just have enough information, enough data. It's based on the assumption that everything in the world is knowable and measurable and therefore predictable. This ideology is really popular in Silicon Valley; for example, the ex-CEO of Google once said that they could solve practically any complex social problem with just enough data and the ability to crunch it.

That's connected to the idea of techno-solutionism, a word to describe the phenomenon where we try to solve complex social problems by throwing technology at them. And "try", or "attempt", is the key word there, because in a lot of instances it actually makes things worse. That's not to say that tech and data can't play a really important role in addressing critical social problems, but it is to say that when we are too quick to jump to them without considering the social and political consequences and contexts, they can end up just creating more problems.

So data positivism, combined with datafication, creates an environment that incentivizes and justifies collecting more and more data, including personal information.
This fantasy of data positivism establishes a kind of moral mandate for ever more intrusive data collection and invasion of privacy; it's justified as a way to solve these social problems. And so what we end up with is a kind of cognitive dissonance, where you have some companies operating under a very clear surveillance capitalism business model while at the same time spouting rhetoric about how they're helping people.

Now, perhaps you're really interested in the climate, the environment, the world that we live in. Perhaps you want to see climate action. It might not be immediately obvious how that connects to digital rights, but it does. Think about the right to protest and political organizing. Climate protesters are some of the most heavily criminalized activists in Australia. For climate activists, or any activists really, to do their work they need to be able to communicate securely, privately and safely with each other through encrypted services. Activists and protesters also often need to be really aware of state surveillance, through things like facial recognition technology or other biometric surveillance. It was only last year that the New South Wales police imposed bail conditions that prohibited climate activists from using encrypted services like WhatsApp or Signal.

All of this sits within a broader war on encryption and a rise in the use of facial surveillance technologies, and I think it's worth pointing out how this can have flow-on effects. A thriving and functioning democracy relies on our ability to hold those in power accountable, and again, that's very hard to do if you don't have any privacy or you're under constant surveillance. But we'll get to more on that shortly.
This is also a good moment, I think, to acknowledge that safeguarding encryption and resisting facial surveillance is important for everyone and benefits everyone. You don't need to be doing anything even remotely sus to benefit from this. In fact, framing it as something that only benefits people who have something to hide is really problematic and damaging. Everybody deserves digital security and privacy, and to be able to go about their day-to-day lives safely and securely, even if it's just using Signal to chat to your dad.

That being said, a short word on criminality. In a moment I'm going to talk about surveillance powers, and something to keep in mind is that these are often justified around ideas of criminality. You'll hear people say things like, oh, what does it matter, they're just going to use these powers on criminals, it'll never impact us. But I think it's important to note that laws can change, and so too can the notion of criminality. Here in Australia we've seen that directed specifically towards climate activists in the last 18 months. In the US we've witnessed how access to pregnancy termination healthcare became criminalized really quickly. In many places around the world, simply being part of the LGBTQ community is enough. We also can't ignore the racial and classist elements when it comes to considering what's criminal and what's not, who deserves to be under surveillance and who doesn't. So I would strongly suggest that we exercise real caution when we think about when it is or is not appropriate to use invasive technologies for surveillance, and to justify them in those terms.

So maybe climate activism isn't really your bag. Let's use one more example before we get out of this nightmare mind map.
Perhaps you have kids, or you know kids, or you just care about the safety and wellbeing of children and young people. And maybe you've heard about algorithmic rabbit holes; maybe you saw a Four Corners episode on it. So we can go down into this section of our map.

Often in discussions about online safety we really focus in on these rabbit holes, and to be fair, they are a real problem: they do send people down to some really awful areas of the internet. But often the logic goes, well, I don't want children to see this content, so we need to get rid of that content. That only really considers part of the issue. When we focus too much on policing and removal of content, we end up playing a kind of game of whack-a-mole, and it often overlooks a lot of the complexities that come with content moderation, especially automated content moderation. It can also overlook some of the harms that arise when it goes wrong, for example by disproportionately removing imagery of people in larger bodies or gender-diverse people, or by taking down documentation of human rights abuses. And of course, on the other end, when content doesn't get taken down, that also causes harm. So you're walking on a knife's edge; it's really hard to get right. But that doesn't stop politicians from spouting things like, well, why can't we just use AI to get rid of all of the bad stuff? And it's because it doesn't work that way: humans are complex and context is really important.
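To make that last point concrete, here is a minimal, entirely hypothetical sketch of what "just remove the bad stuff" tends to look like in practice: a blunt keyword filter with no sense of context. The blocklist and posts are made up for illustration, not drawn from any real platform.

```python
# A deliberately naive "remove the bad stuff" filter (hypothetical blocklist).
BLOCKED_TERMS = {"breast", "overdose", "queer"}

posts = [
    "how to book a breast cancer screening",                # health information
    "what to do if a friend has an overdose",               # harm reduction advice
    "our local queer youth support group meets tuesdays",   # community support
    "a post written to harass and intimidate someone",      # the actual target
]

def naive_moderate(post: str) -> bool:
    """Return True if the post would be removed by simple keyword matching."""
    return any(term in post.lower() for term in BLOCKED_TERMS)

for post in posts:
    print("REMOVE" if naive_moderate(post) else "KEEP  ", "|", post)

# The three posts that get removed are exactly the kind of health, safety and
# community content described above as being disproportionately taken down,
# while the genuinely harmful post (no blocked keyword) sails straight through.
```

Real moderation systems are far more sophisticated than this, but the underlying problem is the same: without context, the same rule removes vital information and leaves harm untouched.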
So we also need to go in the other direction and consider the engagement and recommendation algorithms. And when we start thinking about those, we start thinking about the attention economy, which is predicated on targeted advertising, which takes us right back to surveillance capitalism.

Now, that's certainly an oversimplification of all of the issues happening here, but what I think is important to note is that if we spend too much time thinking just about the symptoms and dealing with things on that level, we miss out on dealing with the underlying causes, the business models, the incentives, and so we just end up having the same issues over and over again.

The point of showing you this mind map wasn't to lead you to apathy or despair, like, oh my god, there's just so much stuff happening. Rather, I'm hoping you can take away two things. Firstly, all of these issues are connected, and not always in ways that are immediately obvious, including being connected with other social causes or issues. And secondly, there's no need to tackle all of this at once. If you're interested in a particular social cause, or a particular issue, or a particular technology, you can pick that spot and focus on it, and it will play a role in the larger digital rights ecosystem.

Okay, so now we come to the digital wrongs section of this talk, where I tell you about just a handful of tech policy and legislation in Australia that was, well, not great. There's heaps of things we could talk about here, but I picked out three key areas where I think it's really important to understand the context and a bit of the history, so you know where we're at now, where we've come from, and where we're going next.

So, first things first: privacy reform, my true love and arch-nemesis. Protecting privacy, as I said, is about so much more than just keeping our information secret or to ourselves.
It's about addressing individual and collective power imbalances and information asymmetries, it's about reining in corporate dominance, and it's about upholding democracy and safety.

In Australia, the main piece of legislation that governs how our personal information is collected, used and shared is the Privacy Act. It was drafted in 1988, based on a set of OECD principles, and aside from a few amendments here and there it hasn't really been significantly changed since then. The Act is designed to be principles-based and technology-neutral, but even so, a lot has changed since the '80s, and it's just woefully ill-equipped to deal with the modern challenges of the digital economy.

Advocates and experts have been calling for reform of the Privacy Act for a very long time, and here you can see a very simplified timeline of the process of privacy reform. You can see that there have been so many rounds of consultation, and yet no real change, no real improvements that would better protect people and their personal information.

Key things that we're fighting for, just so you have a sense of what's on the table, are these. Updating the definition of personal information: this might not sound super sexy, but it has the potential to be really powerful, because the definition of personal information acts as a gatekeeper to the protections in the Act. If we expand the definition, we expand the protections. We need it to better include things like technical information, such as metadata, as well as inferred or generated information, which is becoming increasingly more of an issue. We also need it to include instances where you can distinguish a person from a group without necessarily knowing their name, where you can point them out in the crowd, because privacy harms can happen even if you don't know the person's name.
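As a rough illustration of that singling-out point, here is a small sketch with entirely made-up records: no names anywhere, just a handful of seemingly innocuous attributes, which is often enough to pick one person out of a dataset.

```python
from collections import Counter

# Hypothetical "de-identified" records: no names, just a few attributes each.
records = [
    {"postcode": "3000", "birth_year": 1994, "gender": "F"},
    {"postcode": "3000", "birth_year": 1994, "gender": "M"},
    {"postcode": "3056", "birth_year": 1987, "gender": "F"},
    {"postcode": "3056", "birth_year": 1987, "gender": "F"},
    {"postcode": "3181", "birth_year": 2001, "gender": "NB"},
]

# Count how many people share each exact combination of attributes.
combo_counts = Counter(tuple(sorted(r.items())) for r in records)
unique = [dict(combo) for combo, count in combo_counts.items() if count == 1]

print(f"{len(unique)} of {len(records)} records are unique on just three attributes")
# Anyone holding another dataset with the same attributes (a breach dump,
# a loyalty database) can point those individuals out in the crowd, name or not.
```

The attributes and values here are invented, but the general pattern is well documented in re-identification research: a few quasi-identifiers combined are frequently enough to single a person out, which is exactly why a name-centric definition of personal information leaves a gap.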
There are also currently political party and small business exemptions in the Privacy Act, which create a huge gap in the protections it offers.

We also need to create a direct right of action, or a statutory tort, for serious invasions of privacy. That would enable people to take their right to privacy into their own hands and take action if it gets violated.

And lastly, a fair and reasonable test, to put the onus of responsibility onto organizations. Organizations that collect and handle our personal information should be bearing the brunt of making sure they're doing so fairly and responsibly. Currently we operate under a model that places the responsibility onto individuals to manage our own privacy, through things like privacy policies, really long terms of service, clicking "I agree" and "I accept", collection notices, all that stuff that we hate to read and barely ever do. It doesn't work, it's manipulative, and so we need to shift the responsibility back onto organizations.

The next issue is surveillance powers. Over the past decade we've seen huge increases in electronic surveillance powers in Australia, which have really wide-reaching impacts on privacy, digital security, and the functioning of our democracy more broadly. Again, I've put together a little timeline here to visualize what's been going on. Generally speaking, these powers are justified as necessary for national security purposes, using quite scary rhetoric about terrorism and crime, as I mentioned before. And in Australia we have a pretty well-established tradition that once repressive powers are introduced, they are really rarely wound back.
For example, following the September 11 attacks, Australia passed just shy of 100 national security laws, many of which contained unprecedented rights-infringing powers. Over 20 years later, only one of those significant powers has been repealed. And since then, three major pieces of legislation have been passed which are highly controversial and detrimental to digital rights in Australia.

The first is the metadata retention scheme, which you may remember. It was established in 2015 and basically requires telco providers to retain certain forms of data, metadata, for two years. Again, this was passed on grand assurances that it would only be used for the most terrifying criminals. Fast forward to 2016, and over 60 agencies applied to access metadata, including local councils following up on fines. Jump ahead to 2021, and the Ombudsman found that every single agency it investigated, every one of them, had accessed Australians' metadata without proper authorization. And jump ahead to this year, and it was reported that metadata was used to check up on the relationship status of people receiving welfare: an incredibly punitive use of these powers, and not exactly the terrifying criminals we were sold this law on.

Then in 2018 we had the Assistance and Access Act, or TOLA, the so-called anti-encryption Act, which you might remember from old mate Turnbull. This was super controversial, the debate was really ferocious, and it had echoes of the crypto wars in it. Basically, it enables law enforcement and intelligence agencies to compel tech companies to assist them in accessing the content of encrypted communications. It was rushed through in the Christmas period, and Labor said they had concerns about it, but they passed it anyway.
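For a sense of why "assistance" with encrypted communications is such a fraught idea, here is a minimal end-to-end encryption sketch using the PyNaCl library. The names and message are invented; the point is that the service relaying the message only ever handles ciphertext, so there is nothing meaningful it can hand over without the scheme itself being weakened.

```python
from nacl.public import PrivateKey, Box

# Each person generates a keypair on their own device;
# private keys never leave the device.
alice_secret = PrivateKey.generate()
bob_secret = PrivateKey.generate()

# Alice encrypts directly to Bob using her secret key and his public key.
sending_box = Box(alice_secret, bob_secret.public_key)
ciphertext = sending_box.encrypt(b"appointment details, keep this between us")

# This is all the platform (or anyone compelling it) ever sees in transit.
print("relayed by the service:", ciphertext.hex()[:48], "...")

# Only Bob, holding his own secret key, can recover the message.
receiving_box = Box(bob_secret, alice_secret.public_key)
print("read on Bob's device:", receiving_box.decrypt(ciphertext).decode())
```

Any mandated access to the plaintext has to change this picture somewhere, either by weakening the cryptography or by inspecting messages on the device before they are encrypted, and that change applies to every user of the system, not just the targets of an investigation.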
In 2020, an independent monitor reviewed the Act and suggested a major overhaul, which still hasn't happened. Today, politicians and others are still using the same rhetorical tools to argue against encryption, although the emphasis has shifted away from terrorism and towards people who disseminate child sexual abuse material, or CSAM. But what these politicians and other people in power fail to address is that strong encryption helps keep all of us safe, including children, and weakening it in any way actually undermines that safety for all of us in the long term.

And then the last one: the Identify and Disrupt Act. This passed in 2021, and it introduced a range of new powers that enable law enforcement and intelligence agencies to add, copy, delete and alter data on devices, take over accounts and lock people out, and access entire networks, with minimal oversight or accountability. Before it passed, the Parliamentary Joint Committee on Intelligence and Security noted that there were serious flaws in the bill and recommended significant changes, including narrowing the powers and establishing oversight mechanisms. But did they listen? No. They passed it just under a week later.

An important thing to remember with all of this is that private and public surveillance are functionally intertwined. What I mean by that is that government use of surveillance relies upon commercial data collection, the data market, and private tech companies, be it through telco providers, commercial facial recognition services, databases, digital platforms, et cetera. It's all really connected.

On to online safety. Around the end of 2020, the Australian government started to really get a taste for taking on big tech. You might remember at the time there were a lot of stories about Scott Morrison cracking down on social media.
Alongside this there was a lot of talk about protecting women and children as well, and so we ended up with the Online Safety Act. This was surrounded by a huge amount of debate and controversy.

The bill is complex and we don't need to get into the weeds of it here, but there are some things in it that are really valuable and reasonable, and there are also a range of really broad, vaguely defined powers given to the regulator, the eSafety Commissioner, to require the removal of certain content. This applies to illegal content like CSAM or pro-terror material, but it also extends to other categories of content, like sexual material. And this is where it starts getting really tricky, starts getting wrapped up in conservative ideology, and has real implications for freedom of speech and freedom of expression for many groups. What's more, to put these things into action, a lot of it incentivizes or requires increased policing, monitoring and data collection online, which usually means increased surveillance.

So in addition to digital rights advocates, there was serious pushback from sex worker groups and the LGBTQ community, because these groups are already subject to disproportionate censorship and surveillance online, and the Act stood to increase that.

Fast forward, and there's a whole range of things happening in the online safety space at the moment. We've got proposals for age verification that come with huge privacy and digital security risks, and are also a kind of implementation nightmare. We've also got proposals to reduce anonymity online, which, as I said earlier, risks the safety of a lot of people who rely on anonymity to be able to be online in a safe way.
We also have increasingly complex requirements for tech companies, including the Basic Online Safety Expectations as well as the online safety industry codes. At the moment we're waiting for the eSafety Commissioner to deliver a couple of standards, and we're expecting that those standards will include requirements for proactive detection, or client-side scanning, which is a really controversial and quite dangerous use of technology to sidestep encryption.

So the online safety space is a complex regulatory spider's web, and most of the time it ends up being a kind of tug of war between big tech and the government. Small tech companies and organizations are really starting to feel huge amounts of pressure to meet requirements that are really designed with big tech in mind. For example, in 2022 Switter, a sex-worker-friendly social media platform, had to shut down in response to increasingly hostile regulation. Things like this make many communities less safe, but they also contribute to the consolidation of power and influence into a handful of large companies, because they're the only ones able to meet the requirements.

Now, none of that is to say that tech companies should be off the hook when it comes to online safety. But it is to say that we need to think really carefully about the kinds of regulations we put in place, and we need a technologically competent government that meaningfully engages with the tech community, the digital rights and civil society space, and other impacted communities.

It is really concerning when the vision for online safety in Australia is one of increased automated content moderation, privacy-invasive age verification, a reduction in anonymity, and a crackdown on end-to-end encryption.
So the online safety space is a complex regulatory spiderweb, and most of the time it ends up being a kind of tug of war between Big Tech and the government. Small tech companies and organizations are really starting to feel huge amounts of pressure to meet requirements that are designed with Big Tech in mind. For example, in 2022 Switter, a sex-worker-friendly social media platform, had to shut down in response to increasingly hostile regulation.

Now, things like this make many communities less safe, but they also contribute to the consolidation of power and influence into a handful of large companies, because they're the only ones able to meet the requirements.

None of that is to say that tech companies should be off the hook when it comes to online safety, but it is to say that we need to think really carefully about the kinds of regulation we put in place, and we need a technologically competent government that meaningfully engages with the tech community, the digital rights civil society space and other impacted communities.

It is really concerning when the vision for online safety in Australia is one of increased automated content moderation, privacy-invasive age verification, a reduction in anonymity and a crackdown on end-to-end encryption. These approaches increase surveillance, control and monitoring, and in doing so threaten to actually undermine safety for a lot of people.

I also think it's important that we recognize that all of this sits within a broader context of anti-sex, anti-queer and anti-trans sentiment, both online and offline. Much of the public discourse from politicians and others in positions of power has emphasized a view of online safety that centers on moralism, puritanism and sanitizing online space.

It becomes genuinely very scary when the idea of "offensive" or "harmful" content is your very existence, especially when it starts to impact the internet that many of my fellow LGBTQ community members rely upon to access support, friends and vital health information.

We must vehemently reject the notion that the safety of children is in opposition to the freedom and safety of queer people and sex workers, because it's not true, and it's a dichotomy that is designed to divide us and lead us to punitive and harmful internet regulation.

Thank you.

So I picked out the three key areas that I think are really important to focus on, but I also wanted to flag a few things that are coming up, now that you've got all the context and know what to keep your eye out for as we move forward. At the moment we've got misinformation and disinformation legislation on the cards, which is creating some controversy around when something is and isn't misinformation, and what to take down and what not to, because it's very complex. We've got digital identity on the cards, which includes the use of biometric data. And there has been a lot of talk about AI governance recently; there's an inquiry happening at the moment into different mechanisms for the regulation and governance of AI, so you can expect to see a lot more of that in the near future.
Okay, so my hope is that this talk has helped bring you up to speed on some of the current digital rights issues and the state of tech policy in Australia. Even better, I really hope that at least one thing has resonated with you and maybe moved you to get involved with the digital rights movement in Australia.

Technologists and tech workers like yourself are in such a wonderful position to get involved with this movement. As people who understand and actively work in tech, you have a lot of power to help others understand, and to influence the direction of technology.

So here's a little list of things you can do if you want to get involved. Not all of these will apply; some will work for you and some won't, so take what fits and ignore what doesn't.

Firstly, educating and helping to demystify technology. As I said, you're in a great position to do this. There's a lot of misunderstanding of many technologies, how they work and how they impact people. For example, I'm sure a lot of you have seen firsthand the misinformation and misunderstanding around how large language models work and what they are and are not capable of. That kind of confusion leads people to misunderstand what AI can and can't do, and that can end up having real impacts on decisions and policy-making directions. So playing a role in demystifying how these technologies work, for the people around you who don't necessarily have your level of understanding, can be really helpful.
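If it helps to have something tangible for that kind of conversation, here is a deliberately tiny, made-up sketch of the core loop: a language model repeatedly samples a plausible next token from a probability distribution. The probability table below is invented for illustration; a real model learns these probabilities with a neural network over a huge vocabulary, but it is not looking facts up or reasoning about truth, which is exactly the misconception worth correcting.

```python
import random

# A made-up next-token table standing in for a trained model's learned
# probabilities. The loop is the same idea as a real LLM: pick a plausible
# next token, append it, repeat.
NEXT_TOKEN_PROBS = {
    "the": {"cat": 0.5, "dog": 0.3, "bill": 0.2},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"barked": 1.0},
    "bill": {"passed": 0.6, "stalled": 0.4},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(prompt: str, max_new_tokens: int = 3) -> str:
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        options = NEXT_TOKEN_PROBS.get(tokens[-1])
        if not options:
            break  # the toy table has nothing plausible to add
        words, weights = zip(*options.items())
        tokens.append(random.choices(words, weights=weights)[0])
    return " ".join(tokens)

if __name__ == "__main__":
    # Output is fluent-sounding and probabilistic, not retrieved or "understood".
    print(generate("the"))
```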
You could speak up within the industry. You have the potential to be a really influential voice within the tech industry in Australia, so this might mean things like advocating internally in your workplace, or just within your tech community, for rights-respecting practices: privacy, digital security, inclusive design and things like that.

Submission writing. This isn't for everybody, but if you're interested, we could really use the help. One of the main challenges we find with submission writing and engaging with the government on tech legislation is what seems to be a phenomenal lack of technical understanding in our government. Often they'll put forward proposals that don't really make sense, or are technically inappropriate, or lead to technical consequences they haven't foreseen, or are really hard to implement. So if you're inclined, weighing in on those kinds of things is really, really helpful.

If you're somebody who builds products, perhaps you could play an active role in prioritizing digital rights in their design, development and deployment. This might mean being a champion for privacy, it might mean engaging with inclusive design methodologies, and you could advocate for features that empower your users to control their data and to make informed choices about how they interact with it.
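As one concrete, entirely hypothetical sketch of what that kind of feature can look like, a product might expose straightforward export and delete operations over whatever it actually holds about a person, rather than hiding them behind a support queue. The class and data layout below are invented for illustration:

```python
import json
from dataclasses import dataclass, field

@dataclass
class UserDataStore:
    # Hypothetical in-memory store; in a real product this would be the
    # database, and deletion would also need to cover backups, logs and
    # analytics copies to be meaningful.
    records: dict = field(default_factory=dict)

    def export_my_data(self, user_id: str) -> str:
        """Let a user download everything held about them, in a portable format."""
        return json.dumps(self.records.get(user_id, {}), indent=2)

    def delete_my_data(self, user_id: str) -> bool:
        """Treat a deletion request as a real delete, not a soft 'deactivate'."""
        return self.records.pop(user_id, None) is not None

if __name__ == "__main__":
    store = UserDataStore({"u1": {"email": "me@example.com", "posts": ["hello"]}})
    print(store.export_my_data("u1"))
    print("deleted:", store.delete_my_data("u1"))
```

The hard part in practice is making sure "delete" really does mean delete across backups and third-party integrations, which is exactly where advocacy inside a team matters.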
If you're doing side projects, you could consider volunteering some of your technical expertise to organizations that are doing good, or you could contribute to digital-rights-focused or open source projects.

This is the spicy one: tech unionization. Some tech workers overseas have started exploring unionization as a way to collectively negotiate for better working practices for themselves, but also as a way to negotiate for a seat at the table, to play a part in decision-making processes. They want to use this to influence not only the company they work for, but also society at large and how the two fit together. Sadly, we don't have a dedicated tech workers' union in Australia yet. But, I mean, what if you started one? It's just food for thought. Or you could just work with your fellow colleagues and your friends. I don't know, think about it.

And lastly, supporting digital rights organizations. As I mentioned right at the beginning, the digital rights community and movement in Australia is wildly under-resourced, and as you have just heard, there is a phenomenal amount of work in different areas coming at us from all angles, all the time. Advocacy work might not be your cup of tea, and that's totally fine, not everybody needs to do it. But please don't underestimate how much of a difference you can make by supporting the people who are doing that work.

This could look like a few different things. You could make a one-off donation, which is good, or a recurring donation, which is even better because it helps us plan ahead. You could organize something in your company to provide recurring support. Or you could get in touch with the organizations and see if you can help out in other ways: maybe with technical projects, maybe by writing content or helping to run workshops, things like that; it really depends on the organization. Obviously I hope that you will support Digital Rights Watch, but there are other organizations in Australia doing really great work as well, including Electronic Frontiers Australia and the Australian Privacy Foundation, and there are state-based civil liberties organizations too, which you can look up if you're interested.

Okay, so that brings us to the end. Thank you so much for your attention, and again to PyCon for having me. I hope that I have left you with some little sparks of ideas of things that you might be interested in following up, and that hopefully you'll want to get involved in the digital rights movement in Australia.
Thank you again.

[Applause]
[Music]

Am I alive now? Can you hear me? Yay, lovely. Thank you so much, Sam. We have the traditional sponsored-speaker gift of a mug and a thank-you card.

Thank you, thank you so much, thank you.

[Applause]