Welcome everyone, and welcome back after afternoon tea. Firstly, thank you to Catherine for the amazing conversation that we had about shovel-ready GLAM graduates.

Next up we have Ingrid and Catherine. Ingrid drives practice change in the digital transformation of humanities research and cultural heritage through the development of new technologies and national infrastructure. She's a leader and volunteer in the international LODLAM (Linked Open Data for Libraries, Archives and Museums) and AI4LAM (AI for Libraries, Archives and Museums) communities, she's a metadata nerd and a bit of a tech head, and she's also fabulous.

Catherine is also joining us. She is an AI4LAM committee volunteer who has worked with archives since 2003 and is currently Associate Director at the Royal Melbourne Institute of Technology (RMIT) Library. She is a PhD student in her final year at Monash University, a past editor of the Australian Society of Archivists' journal, Archives and Manuscripts, and is currently an editorial board member for the Archives and Records journal. So over to you, lovely ladies.

Thanks, Sarah, lovely to be here with you today. I'm coming to you today from Ngunnawal land, and I'd like to pay my respects to Elders past, present and emerging. And I'd like to introduce my co-speaker, Catherine Jarvie.

Thank you, great to be here everyone, nice to meet you. I would also like to pay my respects to Elders past, present and future, and to anybody here today.

Great, so we're going to talk to you today, and it's going to be a bit of a conversation between Catherine and me, about the grassroots action that's been going on in the regional chapter of AI4LAM: AI for libraries, archives and museums. We spoke at the GO GLAM miniconf last year, and we just thought we'd provide a bit of an update to let you know what we've been up to, where we think we're heading, and why we think you might like to be involved in
some way, shape or form. So we'll just kick off.

Okay, so there's a bunch of us that have come together to do some coordination, and you can see that there's quite a sprinkling there of people from Australia who've taken on coordination activities. Not everybody's active at the same time; people's lives, of course, interrupt. Their participation is welcomed whenever it's available, and they can take a back seat whenever they need to, so we've had people move in and out of the coordination group. And people come to this idea of AI4LAM, and what we might do as a community, a little bit differently, which is great.

So we're going to cover four questions today: why take a grassroots approach? Why does community building across group boundaries matter? Who have we heard from and what have we learned? And what happens next? So we're going to enter into a bit of a conversation, and we really look forward to your questions or comments as we go through.
So Catherine, I'm going to hand this one to you: why do you think a grassroots approach was appealing?

Yeah, so for me, I thought I'd talk to my motivation for joining, and that was because, being a grassroots group, you get a whole different cohort of people, from various backgrounds, different levels of organisation, different types of people. My particular interest in AI, and applying it in my workplace setting, was to see what other people were doing, how they were applying AI within their personal projects or their technical projects at work. Because I work in a library, and we hadn't got any AI projects underway when I began with AI4LAM a year ago, I was hoping to learn from my peers and really bring that back to the librarians within the university, to say: hey, there's this great project that others are doing, we could do something similar, or these are the skills that we could acquire. Particularly because my role is Associate Director of Engagement, we want to engage our staff to really upskill in AI where it's necessary for their role, or where their role can be built and expanded upon its current form, and also to recruit into the library the skills required. But we need the projects to attract those people, create those position descriptions, and create opportunities for people to come into the library. So that was my motivation, and the reason why I thought joining a grassroots group like AI4LAM would be useful to me. It hasn't necessarily panned out yet; as we grow and expand, I'm still looking for that perfect position description to create within the library, or the perfect project to start and kick off, but I'm sure it will happen.

Yeah, that's really good, it's nice and concrete. When we talked about this, I really liked that this was a really clear motivation, and it was quite different to mine, because I'm not in a management role; I'm
managing an infrastructure project at ANU. But I became really interested in AI because I went overseas to talk to some colleagues in Norway who'd kicked off this whole community, and it just kind of blew my mind. I really could see the need to have a cross-cutting agenda, because there were so many different viewpoints to try and draw together and learn from, to complement whatever I might be able to bring. And I really liked the grassroots approach because I could see that it would create a nice complement to the professional associations that many in our community belong to. It's quite hard to pull all those different parts of the GLAM equation together, let alone bring them together with the research community, and bring them together with the technical community. So that was what really appealed to me about the whole idea of a grassroots approach.

Also, I guess in relation to those different viewpoints, to be able to translate AI technology well was really important to me personally, probably because I've had a lot of involvement over the last I-don't-know-how-many years with technology development, and it's really important not to take technologies at face value. That critical kind of thinking seemed to be a really nice niche to enter into with others, because you get so much use from other perspectives. So yeah, that's kind of why the grassroots approach, and Catherine and I thought that this was a good way of conveying to you the reasons that people have stepped up, whether just to be part of the Slack channel that we've got, or also to help with coordination.

So Catherine, I'll hand it back to you around community building.

Yeah, so as I said, we're looking to build our AI4LAM community, and I find that the events that we have, and the conversations that we have within AI4LAM, are a good conversation starter within my library. So I can share the links to our YouTube
videos with them. We're really big users of Yammer within RMIT Library, so I can share the links there and have local conversations about it. It has been hard to get people to talk about AI; I think they might be wary about it, so I'd be interested to have a chat later, in the questions. But I think it's about making people feel less scared about AI. I think there's this impression that maybe robots will take over librarians' jobs or archivists' roles, and it's about talking about the opportunities, I feel, to expand what we're doing and create new roles that incorporate both the human and the technology together. So I feel that being part of AI4LAM has really helped me have that conversation and bring that conversation into my workplace.

Also, I'm studying a grassroots animal rights organisation, so I'm really interested in communities as part of my PhD research. That research is about appraisal within communities: appraisal being how archives are created and stored, and why, and for how long, not only by people but by technologies. And that, I think, has really led to an interest in, and an alignment of that study with, linked open data and the open web, and really thinking about how communities can empower themselves to be their own archivists. And again, while you're challenging the traditional archival models, you're not necessarily putting the archivist out of a job; it's partnering with archivists. So that's really aligned there. I feel that in my workplace it's about making people feel comfortable about the AI conversation, and in my studies it's thinking about how we can empower communities in their own archival practice without putting the archivists out in the cold: it's a partnership.

Yeah, I think the comfort... we had a bit of a chat in prepping for this talk about how important feeling comfortable with technologies is. And I guess because my background is
a library background, literacy and dexterity have always been elements of library practice, where you're teaching people about new ways to find information, or new types of information, or new ways to create information. That's definitely been an element of practice. And yeah, I guess the community building for me was really important, because I think we get the best out of each other because of our slightly different approaches to things. And because I've worked in e-research, probably for about 12 years now, it's really altered the way that I think about services around collections, GLAM collections, and of course around research, because I've worked with such a different range of people, and I've really benefited. There's been the occasional argument around what AI can and should do, and it's sometimes been intimidating for me, especially when I've had a really toe-to-toe conversation with one of my physics colleagues from my time at AARNet, because he comes at things in a very particular way. And I really have seen the benefit of that, and that's why I think working across the boundaries is so important.

And because it's really clear in research, and it's just come out in the Australian National Collaborative Research Infrastructure Strategy roadmap, that there is a role being seen for those who are perhaps not technical to come in and bring their different types of expertise to AI technology development and application. It's there in the draft that was put out in December last year: they see a specific role for social science researchers, and I would argue for humanities researchers as well, to bring that expertise to this equation. So I guess that's really underlined the importance for me. But also because that knowledge transfer does need someone that's trusted to help
facilitate a conversation, or to test ideas against. And I gave a talk to some NSLA colleagues last year, because I could see a massive opportunity to bring the social skill sets to this use of AI technology and its developments.

Yeah. Okay.

So this is the matrix that we put together for the community, and I guess it just gives you a bit of a breakdown of the areas that we're trying to work in. We're trying to understand how we can work with our colleagues across the Tasman, and we've had some success with that through the speaking program, which is terrific. We've caught up with each other on a regular basis; it's pretty relaxed, I think, and most people sort of come and go as they can. We're really looking for more community leaders to come forward, and people to pick up an initiative that they're interested in and drive that forward. And we've had some really interesting moments where we've discovered that, because we're on the other side of the world from where a lot of the action is happening (a lot of the AI4LAM action is happening in the northern hemisphere), we've got real practical challenges around how we work with our colleagues in the rest of the world, who are doing some terrific project work. So I guess this is just a bit of a snapshot of what we're trying to do.

So Catherine, listening and learning: what have you listened to and learned about?

Yeah, so there have been so many really interesting presentations, and the recordings are available on YouTube. I just picked one out as an example, one that really stood out (and there may be so many more that we could delve into this year as well): the CSIRO, hearing about their collections. They've got a living collection, with biological specimens and organisms that are alive and having to be
catalogued, through to a sound library that has the sounds of birds. So the diversity of the collection is really immense, and that struck a chord with me, thinking about the different databases and datasets that need to be brought together to really make those accessible. I can relate to that: we have archival records in our TRIM/Content Manager system, and then we've got the library's Alma/Primo system separated out, so it's a sort of perennial problem of how to bring those together and make them more easily accessible to our users. And I know the State Library of Queensland have done some really interesting things to bring their archives and library systems together. So that one in particular struck a chord with me.

Okay, so I'm going to skip across my areas of interest because we're running short of time. But there is a very clear sentiment amongst the group of coordinators to really support teaching and learning, and there was some working group activity last year to try and understand how to bring people into that process of learning about AI, and perhaps playing with AI technologies. That was just a bunch of people who decided to come together and work on that, but we managed to forge a bit of a link with the wider community who are doing teaching and learning work, and I've got a link there to one of their outputs, because they really went for it, and it looks like a terrific handbook for people to go through. But I think this really was kind of an underlying motivation: for many people to expand their understanding, and to try and lift up others as they're trying to find their way into this technology.

So, last but not least: what are we up to next? So Catherine, I'll throw the ball to you just quickly.

Well, mine's an easy one, because all the details are there on screen and people can message Ingrid; that's Ingrid's email address. We've got a website:
ai4lam.org is the broader global website, so we are connected into the global AI4LAM group, but we have a Slack channel as well as a Google Drive that is open too. So you all are welcome and encouraged to reach out to us and join us in our quest to talk about all things AI.

Yeah, nice one. So we're looking to have some more events and talks, and Alexis Tindall has told me today that she's definitely up for driving ahead with that, and I think we're going to have a planning meeting in February between Australia and New Zealand, because we've got some great colleagues from the National Library of New Zealand working with us. We're looking for more cross-sector collaborations; we had a nice collaboration with NSLA, and also with VALA last year, just to make sure that we leverage what we're doing. We've been putting forward this idea of a big data global challenge, and anyone's welcome to talk to me about that. And we're looking to try and see how we can work with our peers on the other side of the world. I've just had an email from Tom Cramer at Stanford, and Svein Arne from the Norwegian National Library, to see how we can work with each other, because they had a terrific Fantastic Futures conference on the other side of the world last year, in Paris, and of course none of us could go. They're really open and wonderful colleagues to be working with.

And just to give you a bit of a flavour of what the people on the other side of the world were thinking about when they talked about a global dataset: Abbey Potter, from the Library of Congress, took this idea that I put forward to that conference, to think about what a global dataset might be and why we might want to do something like that, and these are the kinds of themes that came out through the consultation. But I'll put a pin in it there, and thank you all for your attention, and thank you, Catherine, for jumping in with me at the
last minute. Yeah, let's hear what your questions are. Thanks, everyone.

Thank you both, that was fantastic, and I love that cats were in the word cloud there; it was the first thing I noticed, that cats was there. So we've got a couple of questions for you. What are the key challenges in getting grassroots organisations up and running in this space?

I think, well, from my perspective, it was about taking a risk. It started in 2020, after being at that conference. I self-funded and went overseas to the conference at Stanford, had an amazing time, and there was one person from Australia there, Sarah Graham from the University of Sydney. I was so thrilled. And then I thought, I can't bear the idea of not having an opportunity to get together with other people, so I just took a punt. I think for me the biggest challenge is actually creating a space for people to do what they think is a good thing to do, and to help each other, and not to put too much structure around it, and to support people to come and go, in particular through this period, because people's lives have been going up and down, and people have been amazing. But you know, I'm comfortable with, I think, a reasonably high level of uncertainty. So that's my answer. What about you, Catherine?

Yeah, I was going to say the same, and particularly that this group has been quite different compared to, say, other groups, where you feel the weight of obligation to be attending and to do so many things for, say, professional societies or other groups that you attend. Whereas here Ingrid is the leader, and the culture that we have is very supportive of each other, of people going in and out. So that is the difference that I feel has really made a difference and encouraged people to come along and have a go.

Yeah, definitely, because
sometimes people have the best ideas; all they need is someone to be a little engine behind them, and we can all do that for each other.

Okay, the next question is from Donna Benjamin, who's in the chat: we have an innovation program at Red Hat around AI and ML, around the use cases. I'd be interested in exploring if there are any opportunities for us to collaborate with AI4LAM on some kind of experiment. How could we start this conversation?

Oh, just... there's my email address. Just contact us anytime. That's kind of why we've jumped into these forums, because we're looking either to be partners or to help broker partnerships.

Yep. And the next one is: for extending into education, have you thought about reaching out to IT teachers' associations?

Not me personally. Catherine? We've mainly reached out to the GLAM sector, so that could be something we can put on our to-do list for this year. Yeah, or whoever asked that question, you could come along and drive it.

How can we get involved, and are there any prerequisites?

No, no prerequisites, just put your hand up. And it's really great if you can jump into the Slack space, and if you'd like to coordinate, just get in touch. I'll be delighted to announce that Sarah Jermaine has put her hand up, and I've already got a little list of things that she might like to do.

And one last question: have you seen AI or ML being used in any hackathons using information archives at all, and do you see that as a positive or a negative?

Well, not me personally, I haven't seen hacking. I've been to a workshop, I guess; I went to a workshop hosted by colleagues at the Norwegian National Library, just to give TensorFlow a bit of a run over with a bunch of images, and that was really interesting, because they've got computer scientists in their library team, and they're really
working hard to unlock their collections, in particular around Norwegian language, because many of the, I guess, datasets and technologies are built heavily around English, and they need Norwegian, they don't need English. So I think there are really productive reasons to get in and have a hack and to get people around a problem. What about you, Catherine?

I've been to one, a European one; there's an OpenGLAM forum, and I found out about it through there. And again, the time: it's very Northern-centric, so the time wasn't very Australia-friendly, so we find that's a bit of a hump, getting everybody together from across the globe at the right time frame. So I was just willing to stay up and see what that was all about. In terms of any downsides, well, there was a call-out for archival collections to be involved, so people willingly had collections that they wanted to be hacked, and the risk assessment, I guess, was left with those archival institutions putting their hand up, which I think were typically already open and published archival content, so relatively risk-free in terms of hacking.

Yeah, but I guess the question means that there is a gap here for us, given your answer, Catherine; a gap here for us to look at whether someone wants to come forward and work with others to get a hack up with cultural collections. We had a great talk earlier today, because ACMI have made their API openly available and have put some Jupyter notebooks out there on Colab, so I do think it's ripe for the picking. Oh, I'll help out a little bit, but there might be other people who might want to jump up and pitch in.

That's wonderful, thank you so much, both, for speaking to us today. At 4:20, next up we've got "Moving to self-managed open access publishing" from Jesse Lim, and that will be a fantastic talk, I'm sure. So thank you both again, and hopefully everyone joins AI4LAM now
that I have.

Oh, please do join us! Yeah, thanks. Thank you.
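For anyone wanting to try the kind of collection hack discussed above, querying an open GLAM API like ACMI's from a notebook only takes a few lines. The sketch below is illustrative, not ACMI's own example code: the endpoint URL and the `results`/`title` field names are assumptions about the JSON shape, so check the API's published documentation and notebooks before relying on them.

```python
# Minimal sketch of querying a public collection API from a notebook.
# The endpoint URL and the "results"/"title" JSON field names are
# assumptions about the response shape, not a documented schema.
import json
from urllib.request import urlopen

API_URL = "https://api.acmi.net.au/works/"  # assumed public endpoint


def titles_from_payload(payload):
    """Pull the title of each work out of one page of decoded JSON."""
    return [work.get("title", "") for work in payload.get("results", [])]


def fetch_titles(url=API_URL):
    """Fetch one page of works from the API and return their titles."""
    with urlopen(url) as resp:
        return titles_from_payload(json.load(resp))
```

In a Colab notebook you would simply call `fetch_titles()` and eyeball the list; the parsing step is split into its own function so it can be exercised offline, without hitting the live API.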