Welcome back. We're now joined by Sam, to get us all radical. Sam works at the intersection of feminism, human rights and technology. She's currently Program Lead for Digital Rights Watch and a privacy and technology specialist with Salinger Privacy, with experience across the public, private and not-for-profit sectors and a background in both politics and data science. Sam thinks an interdisciplinary approach to privacy is vital. As former Program Director for Code Like a Girl, Sam is dedicated to the ethics of technology in all its forms, from gender equity in the tech industry to upholding privacy in an increasingly surveillance-obsessed world.

Sam is also going to use all the time, because there's just so much amazing information to cram in. So if you have any questions, please make a note of them and ask them in the hallway track after the talk. Take it away, Sam.

Awesome, thanks so much. I'm just going to take these out, okay. So, hi. Thanks so much for having me; I'm so thrilled to be here. I want to say thank you to PyCon, and in particular to the Snake Oil Academy, for creating this space where we can really nerd out about privacy and security. It's a really special thing to have, and to be able to participate in.

As this slide suggests, I'm going to be talking about privacy as a collective issue. This is one of my favourite topics, and I'm really excited to have you here and to bring you along on this journey.

As far as I see it, as governments and companies continue to use technology to meet their thirst for surveillance, protecting privacy is one of the key ways that we can push back. I think that's really, really cool, and I hope that by the end of this talk
you think so too. And if you already feel that way, I hope I leave you feeling energised and excited to embed really good privacy practices and protections into any projects you might work on in the future. As we'll get into a little later, I think that using privacy as a tool can mean so much more than just meeting our compliance requirements. It can also help shape the kind of technology that we build, and in turn the world that we create.

I am coming at you from the stolen lands of the Wurundjeri Woi-wurrung people of the eastern Kulin nation. I want to acknowledge that the sovereignty of these lands was never ceded, and pay my respects to Elders past, present and emerging, and to any First Nations people who might be here today: thank you so much for being here.

I also want to highlight that a lot of what I'm going to talk about today is about surveillance, and I think it's really important to keep in mind that surveillance has intrinsic connections with colonisation and colonial policing. This was, and continues to be, about policing race and class. We need to remember, throughout this talk and whenever we think about privacy, that Aboriginal and Torres Strait Islander peoples continue to be disproportionately surveilled and policed. I'm also going to talk a bit about this idea of extractivism: this obsession with collecting ever more data and personal information. This too is a concept rooted in white supremacy, capitalism and colonisation. We should never forget that, and throughout this talk it's good to keep that context in mind.

So today's talk is essentially my love letter to privacy. Privacy isn't a perfect tool; it can't solve all of the problems.
In fact, sometimes it can cause problems; it can be a double-edged sword. But I think it's absolutely essential if we want to build technology that fosters the kind of future that we want: one that is free and fair and sustainable, one that respects people and communities rather than exploiting and policing them.

I am, of course, making a few assumptions here. Maybe you want the kind of future depicted in some dystopian sci-fi novel, where there is no privacy and everyone knows everything about everyone at all times. But I think that you being here indicates that you really want to create a better world, and believe that technology can be a really important way to propel that. I certainly do.

At the same time, the current trajectory we're on is one that relies on mass data collection and surveillance. If we can agree that this causes immense harm, both when it's done by governments as part of a broader surveillance and policing agenda and when it's done by companies as part of surveillance capitalism, then privacy is one of the key ways that we can push back. But we can't push back in any kind of meaningful way if we fixate on privacy as an individual issue. We need to see it as a collective one.

So my goal is to get you all thinking about privacy, maybe in a different way than you have before, or maybe not. What I ideally want is for you to leave this talk feeling emboldened about privacy, really wanting to incorporate it into your work, rather than it being a burden, something that's limiting, or, the worst, just a compliance tick-box exercise.
So here's what I have planned. Let's just dig in.

We've reached the dreaded portion of any talk on privacy where the speaker tries to define it. The reason it's dreaded is that privacy is such a notoriously difficult concept to define, and I think this is one of the problems we face: in any discussion about privacy, we're using one word to try to describe all different kinds of concepts, theories and, essentially, approaches to information management. When people talk about privacy, it could really be in reference to any, or even many, of these words and others. And I think we can agree that this is a massive undertaking for one word.

Looking here, to pick out a few: the right to be let alone is kind of the classic definition, this idea that we should be free from interference. Control: being able to exercise control over where our personal information goes, who gets it and what happens to it, which is also tied up in ideas of consent and choice. Self-determination and dignity: being able to develop one's personhood and personality.

Secrecy is, in my opinion, a problematic one. Privacy and secrecy often get used interchangeably, and I think it's honestly one of the least nuanced and most problematic ways to think about privacy. bell hooks, a renowned social activist and feminist, once said that open, honest, truth-telling individuals value privacy; we all need spaces where we can be alone with our thoughts and feelings, where we can experience healthy psychological autonomy and choose to share what we want to. Keeping secrets is usually about power, about hiding and concealing information. And when
we think about it like this, I would say that the government is way more interested in secrecy than we are as a collective.

Intimacy is an interesting one, because it starts to play in the space of privacy being able to expand and contract, to let people in. So it's not just an individual thing; it's something we can welcome others into.

And of course anonymity, which is not only really hard to say but is also, I think, worth paying a little more attention to, because privacy and anonymity often get conflated, and in my experience working as a privacy consultant this is often where people fall down. Anonymity generally refers to information, or actions, being kept separate from our name or other identifying features. But anonymity is also really fragile. It's really not that challenging to identify someone using only a few data points. This was shown way back in 2000 by Latanya Sweeney in the US, who showed that 87% of Americans could be uniquely identified by just their ZIP code, gender and date of birth. More recently, I'm sure many of you are aware of the work that Vanessa Teague, Ben Rubinstein and Chris Culnane have done at the University of Melbourne, where they exposed just how simple it was to re-identify individuals from de-identified data.
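To make that fragility concrete, here is a minimal sketch in Python of the counting idea behind results like Sweeney's. The handful of records and all their values are invented for illustration; the point is simply that once you count how many people share a given combination of quasi-identifiers, "anonymous" rows turn out to be unique surprisingly often.

from collections import Counter

# A toy "de-identified" dataset: no names, just quasi-identifiers.
# All five records are made up for illustration.
people = [
    {"zip": "3000", "gender": "F", "dob": "1985-03-14"},
    {"zip": "3000", "gender": "F", "dob": "1991-07-02"},
    {"zip": "3053", "gender": "M", "dob": "1985-03-14"},
    {"zip": "3000", "gender": "M", "dob": "1978-11-30"},
    {"zip": "3053", "gender": "M", "dob": "1985-03-14"},
]

# Count how many records share each (zip, gender, dob) combination.
combos = Counter((p["zip"], p["gender"], p["dob"]) for p in people)

# A record is re-identifiable if nobody else shares its combination.
unique = sum(1 for p in people if combos[(p["zip"], p["gender"], p["dob"])] == 1)
print(f"{unique} of {len(people)} records are uniquely identifiable")  # 3 of 5

Scaled up to a whole population, that same count is how you arrive at figures like Sweeney's 87%.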
And the protection offered by anonymity is really minimal when it comes to something like surveillance capitalism, because frankly, companies don't care about our identity as much as they care about our profile. This is where people fall into the trap I mentioned: in the process of designing an initiative or a project, they'll say something to me like, well, we don't actually know who they are, we don't collect their name, so it's not invading privacy. But the trouble is that you can cause all kinds of harm to people without ever actually knowing who they are, and this can still be an invasion of privacy.

Targeted marketing and behavioural advertising is an example of this, of using what is called individuation. It means that the company may not know who I am; they may not know my name is Sam. But they know my social, political and economic preferences, determined by the data that has been collected about me and the inferences they've made from that data. They use this to create what is essentially an abstract identity, which is then in turn used to influence me, to offer me highly personalised advertising or messaging, which is really just a fancy form of manipulation. And because all of this happens behind the scenes, generally without people ever really knowing about it, it curtails our ability to control the information that's out there about us.

These kinds of companies and data brokers are not interested in who I am as a person. They're interested in me as an entity that emits data they can vacuum up. They're interested in me as someone who buys things, and someone who can be influenced to buy more things.
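As a rough sketch of what individuation can look like in code, assuming nothing more than an opaque tracking ID: the IDs, events and interest labels below are all made up, but notice that a profile gets built and targeted without a name appearing anywhere.

from collections import Counter, defaultdict

# A stream of tracked browsing events keyed on an opaque ID; no name anywhere.
# The IDs and topics are invented for illustration.
events = [
    ("cookie:a1f3", "running-shoes"),
    ("cookie:a1f3", "marathon-training"),
    ("cookie:a1f3", "running-shoes"),
    ("cookie:a1f3", "energy-gels"),
    ("cookie:9c2e", "home-loans"),
]

# Build the "abstract identity": a bag of inferred interests per tracking ID.
profiles = defaultdict(Counter)
for tracking_id, topic in events:
    profiles[tracking_id][topic] += 1

# Target the profile, not the person: serve whatever the dominant interest is.
for tracking_id, interests in profiles.items():
    top_interest, _ = interests.most_common(1)[0]
    print(f"{tracking_id}: target with {top_interest} ads")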
And when you start thinking about this kind of power to influence, and how it can be used in areas other than commercial and retail spaces, that's when it starts getting particularly scary. There's quite a difference between being influenced to buy a pair of shoes, for example, and being influenced to believe a certain political ideology, or being led to believe particular kinds of health information or misinformation, or to vote for a particular politician. But the underlying approach is the same. So anonymity, while essential in lots of respects, can create a bit of a false sense of privacy.

One thing is clear: there is so much going on here. This is one of the troubles, as I mentioned: this wobbly, non-fixed social construct we call privacy morphs and changes and shifts and evolves over time. One of the questions I think we need to consider is whether we should narrow it down, focusing on one particular aspect that could be more easily regulated, or take a more expansive view, which offers lots of opportunities but also makes it harder to grapple with. As you'll see, I lean towards expansive, but it's certainly a question to think about.

I think it's worth having a look at some history to contextualise what I mean by privacy as a collective issue. There's plenty to dig into in ancient history about ideas of privacy and the personal and the political, but that's going a little too far back for our purposes. Suffice to say that the concept of privacy has been around for a really long time, but the idea of the right to privacy started to emerge in the late 1800s, particularly when American lawyers Samuel Warren and Louis Brandeis wrote a famous article on the right to privacy, which they described as the right to be let alone. So it was closely tied up in ideas of private property and ownership. Fast forward to the 1960s, and Alan Westin published Privacy and Freedom, where he described the right to privacy in terms of self-determination and
personal autonomy. Obviously there are lots more milestones along the way, but the point I want to get to is that all of these ideas are focused on privacy being seen as an individual issue. And along the way, this has formed a strong association between privacy and libertarianism.

Libertarianism, as a refresher, is a political philosophy which holds, as the name suggests, liberty as a core principle. Libertarians seek to maximise autonomy and political freedom; they emphasise freedom of association, freedom of choice and individualism; and they are generally really sceptical of authority and state power. There are lots of good things in there, and it sounds okay when you put it like that. But of course, as with any ideology, there are many schools of thought under the one umbrella, and while libertarianism does have a history of association with the left, over the last century or so it has become very strongly associated with more right-wing politics, with ideas of conservatism and free-market capitalism, and an emphasis on property rights and ownership. In my opinion, where it falls down is that it centres the individual to a fault, overlooking our collective interests, intersectionality, and the needs and experiences of marginalised groups.

So privacy has, for a really long time, hung out in this libertarian club. That's not necessarily a bad thing, but it's not the only way we can think about privacy. Personally, I think that if we want to leverage privacy as a way to facilitate more positive social change, we need to think about it as a collective issue. That means that instead of focusing on me and my data and my right to be let alone, we frame privacy as a collective
social good, worthy of protection not just to protect ourselves as individuals, but to protect and empower communities. When you think about it, the whole argument of "well, I've got nothing to hide, so I've got nothing to fear" is squarely caught up in individualism. It's not always about you as an individual, because loss of privacy impacts all of us.

Thankfully, we're starting to see a shift in the way privacy is spoken about: it's being framed as a public good. Shoshana Zuboff, in The Age of Surveillance Capitalism, said that the belief that privacy is private is the most treacherous hallucination of them all. And that's because the impact of losing privacy is shared.

There are a number of ways this can happen. Perhaps the easiest place to start is the sense that your privacy will be impacted by the choices or actions of those around you. For example, you might decide not to be on social media to protect your privacy, but that doesn't prevent other people from sharing information about you, and it doesn't prevent the platforms from tracking you around the internet.

Another example comes with the increased use of machine learning. Merely sharing characteristics with people who may or may not have willingly shared their data can result in you being impacted by inferences or decisions made on that data. The data wasn't collected about you; it was collected about other people who are similar to you. But this can play out in things like applications for home loans or insurance, or even applying for work.

And then, possibly the biggest issue of all: invasions or loss of privacy can have societal-level
impacts. If we take Cambridge Analytica as an example, it shows that when privacy is threatened, our democratic processes can falter. All of this suggests that privacy is a collective, networked good, rather than a purely individualistic one.

Okay. A lot of talk about privacy focuses on it as a state of being, as something that we have: I have privacy, or I don't have privacy. It also gets very caught up in theory and philosophy, and we've already done that portion of the talk. But I think there is value in considering privacy, and the act of upholding it, as something more active: as a tool that we can use.

We all know that governments and companies are busy accumulating masses of data, including our personal information, and as they do, the existing power imbalance between them and us grows larger. More and more of our lives are online: our personal lives, dating, health, work, accessing essential government services. All of these aspects of our lives are becoming integrated with digital technology, and as this happens, power shifts towards those who control the technological systems and those who collect and hold the data.

Alongside this is the sense that digital technology is treated like a force of nature, inevitable and unstoppable, and that we, as individuals and communities, are treated as objects that technology happens to, rather than active participants with agency who want to shape technology and the future.

So I think we can use privacy as a tool, as a way to examine and interrogate information flows, which is in turn a good method for examining
power structures. Imagine these structures as something that relies on a complex framework of information gathering and sharing: I like to picture a construction of pipes pumping data around, sucking it out of individuals and communities and pumping it towards governments and companies. Looked at this way, protecting privacy can be used as a tool to examine these structures, the pipes, and to renegotiate the bounds of where the pipes go and what data flows through them. And by prioritising privacy protections, we can redistribute this power, I hope, in a more even way.

As such, upholding privacy can be an act of resistance. Digital surveillance is highly efficient as a way to enforce social order. It makes us conform, it helps create a culture of compliance, and it generates social consequences for any behaviour that deviates from the norm. Protecting privacy is one way that we can resist this. Upholding privacy enables us to take some power back: to see how governments and companies are using our personal information for their own profit or agenda, and to say no; to maintain a level of autonomy and dignity in a world where, increasingly, everything is knowable and traceable.

Many people lament the end of privacy: ah, privacy is dead, why bother, why resist? But it's not. At least, I don't think it is. In fact, the ongoing moves to undermine it by governments and companies just go to show how much it isn't dead, and how much it threatens them.

This tweet from Eva Galperin came around the same time that the Pegasus Project came to light, exposing coordinated surveillance of
journalists, activists and politicians by governments around the world using NSO Group's Pegasus spyware, and I think it rings really true. We really cannot succumb to privacy fatalism, because that is exactly what would benefit the governments and companies who want us to give up on privacy and allow them to continue exploiting the information they hold. Protecting our privacy is how we resist.

Privacy is also an essential component of any political movement, any organising, any attempt to challenge the status quo. In this sense, protecting privacy is both an act of resistance in and of itself, and a tool that enables organisations and communities to push for important social progress. Just think about any social movement, any progress we've made over the past century: would any of it have been possible if people hadn't had the ability to retain some level of privacy? We know that governments do not appreciate people who challenge the power structures of the status quo. It's no coincidence that activists are some of the most heavily surveilled among us. Protecting privacy is how we ensure that we maintain a healthy amount of political engagement and pressure.

And not only that: the right to privacy can enable other rights and freedoms. For example, did you know that the right to privacy is the reason the US Supreme Court decided that prohibiting access to abortion and contraception was unconstitutional? A pivotal moment in the history of reproductive rights was achieved based on the notion of privacy. And while there is certainly debate about whether that was the right rationale for the decision, it does go to show that the right to privacy can really pave the way
for social progress.

This brings me to my next point: privacy can be a way to facilitate intersectional harm reduction. It should go without saying that the harms caused by loss or invasion of privacy, or by privacy breaches, don't impact everyone equally. Systems of oppression are increasingly enabled by technologies such as facial recognition, automated decision-making and machine learning, all of which require immense amounts of personal information in order to function. What we're seeing at the moment is an exacerbation of inequality, amplified by technology and fuelled by invasive and extractive data practices. So upholding privacy, by limiting the collection, use, storage and sharing of personal information, is a vital part of reducing harm across intersections.
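As a toy illustration of what limiting collection can look like in practice (the field names, the allow-list and the generalisation rule here are all invented for this sketch): decide up front which fields a feature actually needs, generalise what you can, and drop the rest before it's ever stored.

# Toy data-minimisation filter: keep only what a feature needs, generalise
# the rest. Field names and rules are invented for illustration.
ALLOWED_FIELDS = {"postcode", "year_of_birth"}

def minimise(record: dict) -> dict:
    cleaned = dict(record)
    # Generalise before storing: a full date of birth becomes just a year.
    if "dob" in cleaned:
        cleaned["year_of_birth"] = cleaned.pop("dob")[:4]
    # Drop everything not on the allow-list (name, gender, and so on).
    return {k: v for k, v in cleaned.items() if k in ALLOWED_FIELDS}

raw = {"name": "Sam", "gender": "F", "dob": "1985-03-14", "postcode": "3000"}
print(minimise(raw))  # {'postcode': '3000', 'year_of_birth': '1985'}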
Let's look at a couple of examples, just to wrap our heads around this. Women have been policed and observed for centuries; until very recently they were considered to have no privacy at all when it comes to sexual propriety. Sex workers continue to be policed and surveilled more than most groups online, and when their privacy is invaded it's generally not taken seriously by those in power. And privacy has played a really big role in the history of LGBTQ+ rights. Remember how I mentioned that privacy helped in the fight for reproductive rights? That same argument was later used to decriminalise homosexuality in the US as well. But it's not all good: privacy-invasive laws and policies have also been used to oppress and stigmatise people based on their sexuality and gender expression for decades.

As I mentioned at the very beginning, surveillance and colonisation go hand in hand. Pictured here is my friend Kat, who highlighted, in a discussion about facial recognition technology, that a future in which Aboriginal and Torres Strait Islander peoples are not overrepresented in prisons is incompatible with rampant surveillance. Strong privacy protections are not the only way to stop this from happening (there are lots of factors at play here), but they are a key ingredient in reducing the harm caused by over-policing, which relies on surveillance.

So when we consider privacy as a collective issue rather than just an individual one, we can start to see how the impacts are so much bigger than just me and my data, and we can start to use it as a tool to push back, to protect communities, to look after one another, and, I think, in turn to create a more equitable future.

But all of this is not without its tensions. A little content note: I'm about to mention some gendered violence, so just a heads up; if that's not for you today, you can jump out.

I mentioned right at the beginning that privacy can be a double-edged sword, and I think a prime example is in the realm of gendered and family violence. For a long time, privacy was weaponised against women and gender-diverse folks as a cover for widespread abuse and violence: keeping things behind closed doors. But now we're seeing a shift in how we think about privacy, and also a shift in our gender politics and understanding, and modern ideas of privacy are certainly not a call to go back to hiding abusive behaviour from view.
What makes things a little more complicated is that privacy is also absolutely vital for many victim-survivors and their protection. So privacy, like many things, can be weaponised if we let it, but on the flip side it can be a tool for progress. We need a liberating, rather than a restrictive, view of privacy. In Net Privacy, Sacha Molitorisz says that privacy has the potential to be both liberating and stifling, and that the challenge is to conceptualise more clearly a positive privacy: that is, we want a privacy that affords liberty instead of confinement; we want a privacy in service of equity. I just think that nails it on the head.

We're getting towards the end now, so let's quickly reflect on some of the values currently embedded in privacy-invasive approaches. There's this huge focus on extractivism, as I said at the beginning: this need to collect everything, to mine and extract and exploit and profit. It clearly aligns with how the environment is being exploited as well; if data is the new oil, then we are fracking people for their personal information. It's also grounded in policing, in watching and being watched, in treating everyone as a possible suspect, in carceral and punitive approaches. And there is a complete lack of respect for boundaries: privacy-invasive technology just storms in, forces consent, takes your personal information, influences you, and then tells you that you consented to it all.

But there's nothing inevitable about this. We can change the way things are going, and, you guessed it, I think that upholding privacy as a collective issue is the way to get there. I've got some quotes here from adrienne
from adrian 805 00:28:40,320 --> 00:28:43,600 murray brown who is a writer and 806 00:28:41,760 --> 00:28:45,840 activist who i admire and she really 807 00:28:43,600 --> 00:28:47,919 emphasizes this idea of 808 00:28:45,840 --> 00:28:50,399 imagining and being creative about how 809 00:28:47,919 --> 00:28:51,840 the how we want to build futures 810 00:28:50,399 --> 00:28:54,159 so what might the world look like where 811 00:28:51,840 --> 00:28:56,159 privacy was recognized and respected and 812 00:28:54,159 --> 00:28:58,000 upheld as a collective good 813 00:28:56,159 --> 00:29:00,240 would we be able to think more radically 814 00:28:58,000 --> 00:29:02,159 about the possibilities of technology to 815 00:29:00,240 --> 00:29:04,320 build things that would not be for the 816 00:29:02,159 --> 00:29:06,799 purpose of selling things but to enrich 817 00:29:04,320 --> 00:29:09,279 and uplift the collective well-being 818 00:29:06,799 --> 00:29:11,200 would we be able to reject this idea 819 00:29:09,279 --> 00:29:14,960 that we must trade our privacy for 820 00:29:11,200 --> 00:29:16,880 safety convenience and connection 821 00:29:14,960 --> 00:29:19,120 so to wrap up i want to leave you to 822 00:29:16,880 --> 00:29:21,360 consider not just how we might use 823 00:29:19,120 --> 00:29:23,200 privacy as a tool to push back against 824 00:29:21,360 --> 00:29:25,520 current power imbalances 825 00:29:23,200 --> 00:29:27,679 but how we might use privacy 826 00:29:25,520 --> 00:29:29,279 and perceive privacy as a guiding 827 00:29:27,679 --> 00:29:31,440 principle towards creating better 828 00:29:29,279 --> 00:29:33,440 structures altogether based on something 829 00:29:31,440 --> 00:29:35,520 that isn't centered on extractivism 830 00:29:33,440 --> 00:29:37,120 policing and control 831 00:29:35,520 --> 00:29:39,679 i don't want to you know just build more 832 00:29:37,120 --> 00:29:42,240 walls with privacy to be a blocker to 833 00:29:39,679 --> 00:29:44,000 say no to technological innovation 834 00:29:42,240 --> 00:29:46,720 rather i want privacy to be seen as a 835 00:29:44,000 --> 00:29:49,279 collective good to help us imagine new 836 00:29:46,720 --> 00:29:51,200 ways of thinking about tech and data and 837 00:29:49,279 --> 00:29:53,520 how we can shape it to build a better 838 00:29:51,200 --> 00:29:55,200 future for all of us 839 00:29:53,520 --> 00:29:57,919 thank you so much for listening and 840 00:29:55,200 --> 00:29:57,919 paying attention 841 00:29:59,440 --> 00:30:03,760 thank you so much it was so good there 842 00:30:02,000 --> 00:30:07,039 are some really cool chats going on as 843 00:30:03,760 --> 00:30:09,840 well so i'm trying not to be totally off 844 00:30:07,039 --> 00:30:09,840 camera 845 00:30:10,000 --> 00:30:14,159 it's really good if you guys have any 846 00:30:11,600 --> 00:30:15,440 questions please jump into the hallway 847 00:30:14,159 --> 00:30:17,200 afterwards 848 00:30:15,440 --> 00:30:20,760 and we'll be back in about 15 minutes 849 00:30:17,200 --> 00:30:20,760 with our next talk 850 00:30:24,159 --> 00:30:26,240 you