1
00:00:00.230 --> 00:00:01.420
Liam Wyatt: I try to.
2
00:00:02.790 --> 00:00:21.980
Liam Wyatt: Hello. My name is Liam Wyatt, and this is a recording of a session on, what day is it, May 5th, 2023 (my God, what year is it for you, future audiences!), a discussion of the Wikimedia Foundation annual plan and the concept of future
3
00:00:21.980 --> 00:00:23.480
audiences in general
4
00:00:23.720 --> 00:00:32.880
Liam Wyatt: we have a variety of Wikimedia Foundation staff and staff from several Wikimedia affiliate chapters as well as volunteers in the room.
5
00:00:32.940 --> 00:00:41.110
Liam Wyatt: The notes document I will link in the chat here, and link from the recording metadata when I upload the video.
6
00:00:41.280 --> 00:00:50.240
Liam Wyatt: As a community-Foundation event, this falls within the scope of the friendly space policies, which are linked from the notes that I just shared there as well.
7
00:00:50.380 --> 00:00:54.110
Liam Wyatt: You can see the agenda and attendees list. You're welcome to add your name
8
00:00:54.140 --> 00:00:58.080
Liam Wyatt: to the page, and a rough description of what we'll discuss today.
9
00:00:58.290 --> 00:01:07.190
Liam Wyatt: Please feel free to add comments in the chat, questions you might have to be added to the agenda as we go, or raise your hand
10
00:01:07.230 --> 00:01:09.410
Liam Wyatt: in the reactions.
11
00:01:09.960 --> 00:01:21.690
Liam Wyatt: I'll default to the discussion itself. Otherwise, I will start by handing over to Maryana Pinchuk, also from the Wikimedia Foundation, on the Partnerships team, to talk about
12
00:01:21.900 --> 00:01:24.950
Liam Wyatt: the concept of future audiences as we see it.
13
00:01:25.730 --> 00:01:44.180
Maryana Pinchuk: Yeah, thanks, Liam. So just a quick intro for those of you who don't know me: I'm Maryana Pinchuk, not to be confused with Maryana Iskander, our CEO. Same name, different pronunciation. I've been in the movement, oh gosh, since 2010,
14
00:01:44.200 --> 00:01:59.870
Maryana Pinchuk: worked at the Wikimedia Foundation in my first tour mostly in product management, and now, in my second tour, doing some partnerships things, and I've been a volunteer editor for that time as well.
15
00:01:59.980 --> 00:02:19.030
Maryana Pinchuk: I am really excited to talk to you all about future audiences. It is a concept that was actually originated by my colleague Marshall Miller, who is also on this call, and who I will hand over to, to introduce himself and talk more about what future audiences is, in case you haven't been
16
00:02:19.030 --> 00:02:21.860
eagerly following all of our Meta updates.
17
00:02:23.620 --> 00:02:42.660
Marshall Miller: Thanks. Okay, hey, everyone. I'm Marshall Miller. There's a lot of Foundation staff here, which just goes to show how excited we are about this, and how much we care about the thoughts of all of you who are not Foundation staff. So I'm hoping we have even more conversations and get in touch with more volunteers and more affiliate staff in the future, and
18
00:02:42.660 --> 00:02:48.950
Marshall Miller: we'll kind of ask you at the end if you know anyone else who should join these talks, please send them
19
00:02:49.240 --> 00:03:09.120
Marshall Miller: to our collective way. So yeah, I'm Marshall Miller. I'm a Director of Product at the Wikimedia Foundation. I've been at the Foundation for 5 years. My volunteer username is Cloud Atlas, and I joined the Foundation after just falling in love with editing Wikipedia and wanting to work on it all the time. So now I'm happy that I get to. And,
20
00:03:09.120 --> 00:03:11.950
Marshall Miller: Maryana, it's time to talk about future audiences, right?
21
00:03:12.000 --> 00:03:22.510
Marshall Miller: Cool. Okay. So I'm going to share a couple of slides to help us understand this, but I'm hoping that this is going to be mostly discussion. So, can you see the slides?
22
00:03:25.120 --> 00:03:26.960
Marshall Miller: Cool. And
23
00:03:27.190 --> 00:03:32.470
Marshall Miller: I'm going to push slideshow, and then it'll be even better. Okay. So,
24
00:03:33.450 --> 00:03:46.130
Marshall Miller: briefly, the concept behind future audiences is that we are a really large movement that already has millions of people consuming our content and thousands of people contributing content.
25
00:03:46.330 --> 00:03:59.680
Marshall Miller: But we know we want to reach even more people. And so this is the part of the Wikimedia Foundation's annual plan that is about experimenting to figure out how to reach new kinds of people that we've never reached before,
26
00:03:59.890 --> 00:04:10.870
Marshall Miller: and for that reason this is sort of our riskiest part of the annual plan. It has some of our most controversial thoughts in it, and frankly, some of the ideas that
27
00:04:10.870 --> 00:04:28.220
Marshall Miller: we think there are probably parts of our community that are going to have concerns about, questions about. And that's great; that's what we're here to talk about. So I just kind of wanted to preface all this by saying: really glad you're here, and I hope that you all are willing to kind of dream with us and experiment with us,
28
00:04:28.220 --> 00:04:33.790
Marshall Miller: even as we talk about some of the things that maybe the Wikimedia Foundation or our movement hasn't tried in the past.
29
00:04:34.210 --> 00:04:46.310
Marshall Miller: So, to get to some of the intuition: our goal has always been to bring free knowledge to everyone in the world, and so far we've done a pretty good job, you know. Maybe a billion people or so are exposed to Wikimedia content.
30
00:04:46.650 --> 00:04:53.540
Marshall Miller: Movement strategy tells us that we want to be the essential infrastructure of the ecosystem of free knowledge
31
00:04:54.010 --> 00:05:02.260
Marshall Miller: by providing knowledge as a service and ensuring knowledge equity. That's a really tall order, you know: the essential infrastructure of free knowledge for everyone in the world.
32
00:05:03.360 --> 00:05:18.540
Marshall Miller: And I think a lot of us are here today because we know the world has been changing. But the goal stays the same, right? Even though technology has changed, the population of the world has grown, and Internet access has changed, we're still trying to bring free knowledge to everyone on Earth.
33
00:05:18.660 --> 00:05:22.930
Marshall Miller: But because the world is changing, the way that we achieve that goal
34
00:05:23.090 --> 00:05:25.480
Marshall Miller: probably has to evolve along with it.
35
00:05:25.570 --> 00:05:43.400
Marshall Miller: And the Wikimedia movement and the Wikimedia software has changed in a lot of ways. Over the last 20 years many improvements opened up access to a lot of new people, amazing programs that we run in our communities. But the way that we deliver knowledge is roughly the same as it's always been
36
00:05:43.430 --> 00:05:56.020
Marshall Miller: Right? Wikipedia is made up of pages that have Wikipedia articles on them, and that has done a lot for a lot of people in the world. But we're starting to learn that it's also not working for a lot of people in the world.
37
00:05:57.960 --> 00:06:14.350
Marshall Miller: And here are three bullet points about the three biggest trends that people like Maryana and me and others at the Foundation research to help us understand the changing ecosystem. I'll just touch on them briefly, because I bet a lot of you already think about this.
38
00:06:14.420 --> 00:06:32.630
Marshall Miller: The first and probably biggest one: online content creation has fragmented, and search has fundamentally changed. What this is referring to is the main ways that people have used the Internet in the past: Googling things, clicking on links from Google, coming to Wikipedia.
39
00:06:32.710 --> 00:06:45.470
Marshall Miller: That's all changing a lot right now, because people are spending a lot more time on other platforms. We hear a lot about TikTok, about YouTube, about Instagram, people spending time in messaging apps, and that's where they learn.
40
00:06:45.580 --> 00:06:57.260
Marshall Miller: And especially now that we're all realizing how big of a deal conversational AI is becoming, that's an even bigger change to this landscape of how people seek and receive information.
41
00:06:58.180 --> 00:07:03.500
Marshall Miller: The second big bullet point: disinformation and misinformation are on the rise.
42
00:07:03.640 --> 00:07:21.950
Marshall Miller: Even before conversational AI, this was already a pretty big concern: it is becoming easy to create content on the Internet, and all kinds of content, whether true or untrue, is spreading all around the Internet. That is an important trend for us in the Wikimedia movement, because what we care about is truth
43
00:07:21.950 --> 00:07:26.540
Marshall Miller: and reliable information. And so it means that we have a role to play here.
44
00:07:26.920 --> 00:07:39.350
Marshall Miller: And then, finally, the way that governments are regulating the Internet and regulating information is also changing. There are new laws in new countries that put pressure on the way that knowledge
45
00:07:39.350 --> 00:07:58.300
Marshall Miller: moves around, the way that knowledge has to be copyrighted and copied. And so we are thinking about how to react to that as well. There's a lot more we could talk about with trends, but this is just to kind of remind us and put us on the same page about some of the main things changing in the outside world that mean we should be thinking about
46
00:07:58.300 --> 00:07:59.970
Marshall Miller: how and if we need to change.
47
00:08:00.950 --> 00:08:08.420
Marshall Miller: And so what we want to do with future audiences in general is, we know we're trying to become the essential infrastructure of free knowledge
48
00:08:08.470 --> 00:08:11.740
Marshall Miller: Movement strategy tells us to innovate in free knowledge.
49
00:08:12.220 --> 00:08:24.190
Marshall Miller: And how might we do that? Well, we probably can't pursue every great idea we have. There are a lot of great people in our movement, but we probably don't have enough people or enough resources to do it all.
50
00:08:24.370 --> 00:08:30.430
Marshall Miller: And so what we're trying to think about is if we wanted to paint a picture of where we want to end up
51
00:08:30.460 --> 00:08:38.039
Marshall Miller: in 2030 or beyond, and say: here's the future we want to live in, and what we want the world to be like. Could we do that,
52
00:08:38.120 --> 00:08:49.990
Marshall Miller: and then figure out how we might take steps to get there? That's what we're trying to do with future audiences: we're trying to do the kinds of experiments that help us figure out what the future should look like, and whether we can achieve it.
53
00:08:50.590 --> 00:09:02.260
Marshall Miller: So that's the high-level motivation, and I don't even think we need to go into any more slides. What I'm going to switch over to now is our notes document, which many of you have open,
54
00:09:02.470 --> 00:09:12.820
Marshall Miller: and it lays out these objectives and key results. Objectives and key results are the planning framework that we use in the Wikimedia Foundation Product and Technology department,
55
00:09:13.230 --> 00:09:14.020
Marshall Miller: and
56
00:09:14.150 --> 00:09:18.770
Marshall Miller: there are two objectives for future audiences. One is:
57
00:09:18.810 --> 00:09:21.430
Marshall Miller: describe multiple potential strategies.
58
00:09:21.670 --> 00:09:31.920
Marshall Miller: So this one means: let's think about the different kinds of future that we could end up in. Do we want Wikimedia content to be like this, or like that, or like that?
59
00:09:32.480 --> 00:09:36.440
Marshall Miller: And then the second objective is to test hypotheses.
60
00:09:36.540 --> 00:09:39.460
Marshall Miller: to validate or invalidate those strategies.
61
00:09:39.690 --> 00:09:50.510
Marshall Miller: So the idea there is: if there are three different futures we think we could go toward, which ones look like the best ways to go, and how can we test and find out whether that's right?
62
00:09:51.020 --> 00:09:51.960
Marshall Miller: And so
63
00:09:52.080 --> 00:10:04.550
Marshall Miller: Nested under those are the two kind of meaty things we wanted to talk about today. We think that we want to test a hypothesis around reaching global youth audiences
64
00:10:04.930 --> 00:10:14.850
Marshall Miller: on third-party content platforms. So this is about, you know, thinking back on those trends and recognizing that there are a lot of young people in the world
65
00:10:14.880 --> 00:10:21.480
Marshall Miller: who are not going through Google to Wikipedia the way that people did in the past, and thinking: how can they be reached?
66
00:10:22.290 --> 00:10:29.040
Marshall Miller: And then the second key result is to test a hypothesis around conversational AI knowledge-seeking,
67
00:10:29.180 --> 00:10:38.700
Marshall Miller: so recognizing that that's also a major change in the way people look for knowledge, and we want to figure out how Wikipedia should fit in in the right way.
68
00:10:39.140 --> 00:10:46.480
Marshall Miller: And so that's all of the table setting I wanted to do. This is the way that we're trying to think about this inside the Wikimedia Foundation,
69
00:10:46.740 --> 00:11:02.700
Marshall Miller: and we've been talking about this on Meta a little bit. But as we discuss today, if you all think this is the wrong way to think about it, or that we aren't thinking about the future correctly, that's what we need your help with, so please speak up. We're also going to dig into some of these
70
00:11:02.700 --> 00:11:07.580
Marshall Miller: two key results today, and some of the ideas we have.
71
00:11:07.700 --> 00:11:14.130
Marshall Miller: We want to get this conversation started and going beyond today. So that's the table setting.
72
00:11:14.170 --> 00:11:25.450
Marshall Miller: So I wanted to pause here to see if this makes sense, if people have questions about it at a high level before we get into the details, or also if Liam or Maryana think that I
73
00:11:25.770 --> 00:11:28.300
Marshall Miller: missed something or something else needs to be described.
74
00:11:33.050 --> 00:11:34.270
Maryana Pinchuk: Nothing for me.
75
00:11:34.720 --> 00:11:42.660
Liam Wyatt: There is a practical housekeeping question about these slides: can these slides be made available, and can you link them into the documentation itself,
76
00:11:42.800 --> 00:11:44.310
Liam Wyatt: when it's.
77
00:11:44.370 --> 00:11:45.460
Liam Wyatt: if that's possible?
78
00:11:47.730 --> 00:11:49.280
Marshall Miller: So yes.
79
00:11:49.960 --> 00:11:59.880
Liam Wyatt: The first question that did come up in the chat regards a more theoretical question of the definition of what we are counting as free knowledge in this context. I want to hold that
80
00:11:59.940 --> 00:12:05.680
Liam Wyatt: in case anyone has a direct answer to it, as it's a more meta
81
00:12:06.120 --> 00:12:15.440
Liam Wyatt: question, very valid, and one I'd like to discuss myself. But to dive directly into the questions raised by Marshall's presentation here:
82
00:12:15.560 --> 00:12:21.140
Liam Wyatt: we have two broad areas, and I'd like it if we could get some
83
00:12:21.300 --> 00:12:26.130
Liam Wyatt: general feelings, no hard answers required today, or no firm
84
00:12:27.280 --> 00:12:44.740
Liam Wyatt: answers that we have to come up with within the next 45 minutes. But the two general areas are AI and the content these kinds of platforms generate, and third-party platforms where the content is human-generated but not within our own purview.
85
00:12:45.940 --> 00:12:54.200
Liam Wyatt: Does anyone have immediate questions or feedback comments on either of those two areas, noting that
86
00:12:54.880 --> 00:12:57.900
Liam Wyatt: AI and generated text
87
00:12:59.030 --> 00:13:09.620
Liam Wyatt: are hot-button issues across a variety of policy, ethical and technological areas these days, that they'd like to share with us now,
88
00:13:09.890 --> 00:13:11.120
Liam Wyatt: to get things going.
89
00:13:24.240 --> 00:13:36.170
Liam Wyatt: To read the comment from Nada in the chat here: "I think one of the concerns about the AI-generated content is its reliability and sourcing. Sometimes I think it's great if Wikimedia has AI with reliable content
90
00:13:36.210 --> 00:13:39.780
Liam Wyatt: and reliable chat, I mean, from the featured content."
91
00:13:39.990 --> 00:13:43.030
Liam Wyatt: That is, this is with reference to the idea that,
92
00:13:44.110 --> 00:13:49.940
Liam Wyatt: for the AI content that is generated, it's unclear where it comes from,
93
00:13:51.460 --> 00:13:55.190
Liam Wyatt: and whether it can be relied upon.
94
00:13:55.640 --> 00:14:03.400
Liam Wyatt: Sometimes it comes from, with reference to Wikipedia, content that itself is good quality, but you can't trace it back to where it came from.
95
00:14:11.520 --> 00:14:13.430
Liam Wyatt: Yes, thank you.
96
00:14:13.810 --> 00:14:23.730
Liam Wyatt: I'm happy to read things out for those who don't wish to speak on the recording themselves, that's fine, but you are welcome to open your microphone and speak directly.
97
00:14:23.780 --> 00:14:25.320
Liam Wyatt: Marshall or
98
00:14:25.330 --> 00:14:29.550
Liam Wyatt: or Maryana, or indeed anyone else from the Wikimedia Foundation:
99
00:14:29.860 --> 00:14:34.190
Liam Wyatt: there have been discussions with regard to attribution and reliability of content.
100
00:14:34.520 --> 00:14:39.050
Liam Wyatt: Do you wish to speak to that, or to the questions being raised?
101
00:14:40.080 --> 00:14:44.050
Marshall Miller: Yeah, I think we can get into that as we start to talk about the AI part.
102
00:14:44.170 --> 00:14:50.210
Marshall Miller: So just before we dive in, Luis, I see your hand's up. What do you say?
103
00:14:50.810 --> 00:14:57.160
Luis Villa (he/him): You know, I was actually just curious to probe a little more this question of,
104
00:14:59.520 --> 00:15:05.040
Luis Villa (he/him): I guess the question is about that key result on knowledge-seeking.
105
00:15:08.650 --> 00:15:12.730
Luis Villa (he/him): There are so many different KRs that could have been chosen around AI,
106
00:15:12.970 --> 00:15:25.850
Luis Villa (he/him): and I'm curious why you settled on that one, which is not to say it's a bad one, but I'm just wondering if there were other routes that you considered but discarded,
107
00:15:25.960 --> 00:15:40.180
Luis Villa (he/him): or other routes that seem interesting but might be for future quarters, just to help understand what you're prioritizing, why you prioritized this particular thing, and what you're thinking about in that space.
108
00:15:40.190 --> 00:15:42.140
Marshall Miller: Great, thank you. That's a great question.
109
00:15:43.510 --> 00:15:50.300
Marshall Miller: And I see two other people raised their hands, so I'm going to try to talk about Luis's question first and then get to you.
110
00:15:50.350 --> 00:15:56.430
Marshall Miller: Okay, so when we're thinking about, or at least when I was thinking about, conversational AI,
111
00:15:56.520 --> 00:16:02.170
Marshall Miller: I think that there are two main sides of the coin here.
112
00:16:02.630 --> 00:16:07.380
Marshall Miller: One is the fact that, outside of the walls of our movement,
113
00:16:07.880 --> 00:16:13.650
Marshall Miller: Obviously this technology is getting really popular and being used in so many different places.
114
00:16:13.670 --> 00:16:19.180
Marshall Miller: Some of the most popular ones are things like ChatGPT or Bing AI,
115
00:16:19.440 --> 00:16:26.670
Marshall Miller: and we know that they use a ton of Wikimedia content and have the potential to spread a lot of that content,
116
00:16:26.760 --> 00:16:33.810
Marshall Miller: much the same way that Google Search did over the years by surfacing Wikimedia content on the search results page.
117
00:16:33.900 --> 00:16:36.670
Marshall Miller: And so this one side of the coin is:
118
00:16:36.790 --> 00:16:44.840
Marshall Miller: should we be thinking about how people are receiving Wikimedia content through all these external players?
119
00:16:45.070 --> 00:16:47.170
Marshall Miller: And also how might they
120
00:16:47.190 --> 00:16:50.720
Marshall Miller: contribute back, or get involved through those places.
121
00:16:50.870 --> 00:16:55.390
Marshall Miller: Because if we think again about the analogy of Google Search through the years,
122
00:16:55.800 --> 00:17:09.339
Marshall Miller: the Google search results page has been great for spreading knowledge content. A lot of people get knowledge through those searches, but it's potentially caused issues for people getting to our sites, learning they can contribute, learning they can donate.
123
00:17:09.339 --> 00:17:16.160
Marshall Miller: And so there's this moment where we can think: what is the right way that Wikimedia content should be presented outside?
124
00:17:16.390 --> 00:17:27.220
Marshall Miller: And then the other side of the coin is more inside of the walls of our movement: what are we, or what should we be doing, to think about how we use AI
125
00:17:27.500 --> 00:17:42.700
Marshall Miller: to either build up the knowledge store, get people involved, or change the offering that is on our websites or on our apps? How do we play a role in ethical AI and become leaders in the way people use this in general?
126
00:17:43.030 --> 00:17:49.220
Marshall Miller: So there are these two sides, right, the outside one and the inside one. And for this key result
127
00:17:49.410 --> 00:17:51.610
Marshall Miller: we chose the outside one.
128
00:17:51.660 --> 00:18:03.500
Marshall Miller: because future audiences is primarily about reaching all of these billions of people we don't reach yet. And we thought: these technologies have the potential to reach all these people, so let's think
129
00:18:03.640 --> 00:18:08.740
Marshall Miller: how can we help reach them better as the knowledge spreads outside our walls?
130
00:18:09.060 --> 00:18:22.940
Marshall Miller: But that's not to say that there's no one at the Foundation thinking about the other side of the coin. So, for instance, Chris Albon is here, who's our Director of Machine Learning. And, you know, Chris is active in the AI community and is thinking about and writing about
131
00:18:23.090 --> 00:18:34.600
Marshall Miller: what role we play in using machine learning inside our projects, and how we have a voice in the broader ethical AI space. So, Luis, does that make sense?
132
00:18:36.210 --> 00:18:37.510
Luis Villa (he/him): That was great. Thank you.
133
00:18:39.660 --> 00:18:41.490
Liam Wyatt: I see a hand from Sage.
134
00:18:42.600 --> 00:18:49.080
Sage Ross: Yeah, I wanted to, to some extent, bring it back to Luis's first question, which I think
135
00:18:49.160 --> 00:18:58.140
Sage Ross: doesn't have to be a sidetrack so much. And this is about sort of what the definition of free knowledge is.
136
00:18:58.210 --> 00:19:00.630
Sage Ross: I think about that in terms of:
137
00:19:00.710 --> 00:19:10.280
Sage Ross: well, what do we do on Wikipedia, and what's our relationship to the knowledge that we're giving people access to? And,
138
00:19:10.320 --> 00:19:19.460
Sage Ross: by and large, we are curating knowledge. We are taking things from all over the place, which definitely does not have to be, you know, free-culture knowledge,
139
00:19:19.460 --> 00:19:39.300
Sage Ross: and curating that in a way that makes it accessible to people. And so I would encourage us to think about our forays into video and into AI in the exact same way: what we're actually good at is giving people tools to make sense of all of the knowledge that's out there, all of it,
140
00:19:39.350 --> 00:19:42.800
Sage Ross: or, you know, the best of it, right? And, like,
141
00:19:42.880 --> 00:19:54.720
Sage Ross: there's this overwhelming flood of knowledge in print, and has been for centuries, and now also in video, and also in sort of aggregated text that's just kind of
142
00:19:54.730 --> 00:20:11.170
Sage Ross: almost statistical knowledge about what many, many people have said. And so if we take the same approach, that our job is to curate and provide a free thing, whatever we make is going to be free in the free-culture sense, but
143
00:20:11.250 --> 00:20:18.830
Sage Ross: our goal here is to just curate the best of it, and do the best that we can to
144
00:20:19.020 --> 00:20:27.000
Sage Ross: provide a lens and sort of a filtering and, you know, tools for dealing with that overload of knowledge.
145
00:20:27.300 --> 00:20:38.090
Sage Ross: And if we think about it like that, then it becomes less of a sort of existential problem of: oh gosh, can we work with, like, YouTube content or whatever, because it's not free?
146
00:20:40.170 --> 00:20:41.710
Liam Wyatt: Thanks so much. I think
147
00:20:41.850 --> 00:20:46.620
Liam Wyatt: there is a big difference, like you say, between all of it and the best of it, or even perhaps not
148
00:20:46.630 --> 00:20:53.570
Liam Wyatt: good but still necessary content that ought to be free, as opposed to everyone's phone number.
149
00:20:54.570 --> 00:21:02.290
Liam Wyatt: The sum of all human knowledge might be more of a secret service kind of mission, rather than ours about free culture.
150
00:21:03.220 --> 00:21:05.050
Marshall Miller: So, Klara,
151
00:21:05.150 --> 00:21:13.550
Marshall Miller: if you don't mind, I'd like to ask you about your plus 100, because that would help us make this less of a Q&A and more of a discussion.
152
00:21:13.740 --> 00:21:32.130
Klara Sielicka-Baryłka (Wikimedia Polska): No, I was listening to you, and I think someone has already written that, and I was thinking, wow, me too, and that I have to change, because I was so
153
00:21:32.130 --> 00:21:47.740
Klara Sielicka-Baryłka (Wikimedia Polska): general and idealistic, like almost 10 years ago, when I was stepping into the Wikimedia world. And now I can see, as we were saying at the beginning, that everything is changing so fast
154
00:21:48.110 --> 00:21:53.020
Klara Sielicka-Baryłka (Wikimedia Polska): that I really can't believe, like for 100%,
155
00:21:53.190 --> 00:22:07.150
Klara Sielicka-Baryłka (Wikimedia Polska): that we can be the one voice telling people: yes, this is good, because this is from us. People have to see for themselves that this is good for them, and use the tools because they want to. So we have to provide,
156
00:22:07.160 --> 00:22:08.890
Klara Sielicka-Baryłka (Wikimedia Polska): as you said.
157
00:22:10.370 --> 00:22:20.550
Klara Sielicka-Baryłka (Wikimedia Polska): good tools, good resources like databases or educational materials
158
00:22:20.680 --> 00:22:24.150
Klara Sielicka-Baryłka (Wikimedia Polska): provided by experts, but in a light,
159
00:22:24.180 --> 00:22:38.690
Klara Sielicka-Baryłka (Wikimedia Polska): even funny way or something. And this is also the discussion that we have, I think, in every country, especially in Europe: what do we have to do with these young people who are coming to Wikipedia and ruining things,
160
00:22:39.060 --> 00:22:47.590
Klara Sielicka-Baryłka (Wikimedia Polska): and what about the next edit, and maybe not, you know, all these discussions. But
161
00:22:47.890 --> 00:22:57.510
Klara Sielicka-Baryłka (Wikimedia Polska): they really... When I run lessons about Wikipedia, I can see how they use it,
162
00:22:57.720 --> 00:22:59.760
Klara Sielicka-Baryłka (Wikimedia Polska): but
163
00:22:59.790 --> 00:23:06.150
Klara Sielicka-Baryłka (Wikimedia Polska): not by looking at the article and changing it and thinking about the five pillars,
164
00:23:06.180 --> 00:23:10.750
Klara Sielicka-Baryłka (Wikimedia Polska): not exactly, I think, but they appreciate our work.
165
00:23:10.820 --> 00:23:26.580
Klara Sielicka-Baryłka (Wikimedia Polska): They're very happy to dive into it, so we can, we have to use that. So my plus-100 was about this: that maybe our role has to change, because
166
00:23:26.770 --> 00:23:34.720
Klara Sielicka-Baryłka (Wikimedia Polska): I really don't know, from my maybe low level of expertise,
167
00:23:36.280 --> 00:23:46.910
Klara Sielicka-Baryłka (Wikimedia Polska): how we can battle Microsoft, Google and everything. We have to focus on what we are good at, like I've heard here.
168
00:23:47.080 --> 00:23:59.570
Klara Sielicka-Baryłka (Wikimedia Polska): And okay, TikTok will be really okay. We also started, we are also thinking about this and making content, and so on.
169
00:23:59.720 --> 00:24:04.980
Klara Sielicka-Baryłka (Wikimedia Polska): Even museums are already on this stuff. Yes, like
170
00:24:05.390 --> 00:24:17.820
Klara Sielicka-Baryłka (Wikimedia Polska): oh yeah, and have been for many years. So that's, that's from me.
171
00:24:17.900 --> 00:24:26.560
Marshall Miller: So the question is what our value will be in a world where people can easily generate so much content with AI. Chris, do you want to speak to that for a second?
172
00:24:28.090 --> 00:24:36.310
chrisalbon: Sure. So, sorry, I'm Chris, I'm the Director of ML. I think about AI a lot, obviously.
173
00:24:36.640 --> 00:24:45.250
chrisalbon: I think one of the things that I wish we could say a thousand times, on every medium, to every single person, is:
174
00:24:45.510 --> 00:24:54.010
chrisalbon: I firmly believe, like I 100% believe, not just, you know, because this is where I work at my job, but I firmly believe
175
00:24:54.170 --> 00:24:55.150
chrisalbon: that
176
00:24:55.490 --> 00:25:13.840
chrisalbon: in an Internet where you can spend 10 bucks and create a million articles of, you know, whatever quality, like they're probably really bad quality, you can make them really biased, you can make them pro-dictatorship, you can make them pro some corporation, whatever, you can just make infinite amounts of content,
177
00:25:13.840 --> 00:25:22.950
chrisalbon: the Internet just becomes flooded with all this really low-quality or misinformation, or actively disinformation, content. And
178
00:25:23.340 --> 00:25:29.240
chrisalbon: in that Internet, which is 100% where we're going, and there are already articles about how it's happening right now,
179
00:25:29.300 --> 00:25:32.400
chrisalbon: that Wikipedia
180
00:25:32.510 --> 00:25:35.470
chrisalbon: and Wikidata become
181
00:25:36.120 --> 00:25:41.750
chrisalbon: an island in, you know, an ocean of mud,
182
00:25:42.350 --> 00:25:58.360
chrisalbon: where it becomes a safe place for people to go to get information. It becomes a safe place for, say, search engines or people who create models to find reliable information. It becomes a safe place for Internet readers to go and find information.
183
00:25:58.490 --> 00:25:59.390
chrisalbon: And
184
00:25:59.580 --> 00:26:03.200
chrisalbon: in that world the work of
185
00:26:03.340 --> 00:26:15.600
chrisalbon: the volunteers doesn't become less useful, right, isn't less valuable because AI exists. It is more valuable, way more valuable, because the thing that it provides is more rare.
186
00:26:15.760 --> 00:26:19.500
chrisalbon: There are fewer efforts at trying to create,
187
00:26:19.520 --> 00:26:26.700
chrisalbon: you know, another place where all this information is fact-checked; it's more trying to just generate tons of content of bad quality.
188
00:26:26.820 --> 00:26:42.430
chrisalbon: And in that world, you know, I think it flips your thinking, because you don't say: oh, AI is going to come and then there's going to be no role. It's like: no, no, we're even more important now than before. It becomes like: what do we do?
189
00:26:42.430 --> 00:26:59.970
chrisalbon: The phrase I've been using is "defend the island". If you're in the sea of muck and filth and bad content, and you have this island that is of high value, of reliable content, where you can discuss things and, through consensus, you know, reach high-quality information, we need to defend the island.
190
00:26:59.970 --> 00:27:23.580
chrisalbon: How do we do that? You make tools that make it easy for editors to check content. You make the task of editing more enjoyable, you make it easier, you remove, you sort of automate, the work that takes a lot of time. And then, in addition to that, you make it easy for people who want to use the content in something else, on some other platform, to grab it and go use it and discover our content.
191
00:27:23.580 --> 00:27:46.040
chrisalbon: But, you know, the thing I just keep on pushing is this thing that I really believe: the future is that Wikipedia becomes way more valuable than it has been in the past. And it's not that it hasn't been valuable in the past; it's just that we become more rare and more useful when there's a sea of crappy content everywhere.
192
00:27:47.540 --> 00:27:48.590
Liam Wyatt: Thank you, Chris.
193
00:27:48.990 --> 00:27:54.740
Liam Wyatt: In the context of defending this island and finding a way to defend it,
194
00:27:55.270 --> 00:28:03.650
Liam Wyatt: I like the question coming up, by the way, in the chat here: can we, or have we ever, been on the offensive rather than merely defensive in our work?
195
00:28:03.770 --> 00:28:12.200
Liam Wyatt: I'd like to raise a specific point and ask Maryana or Marshall to speak to the question of AI plugins.
196
00:28:12.490 --> 00:28:16.060
Maryana Pinchuk: I think, before we go to that, there's a hand from Nanour.
197
00:28:16.180 --> 00:28:18.020
Liam Wyatt: I'm sorry. Yes, I know.
198
00:28:19.400 --> 00:28:46.690
Nanour: No, it's only just a comment about the conversation. I listened to the presentation that Marshall did, and I just took a moment here, so to say: who will decide that in the future we will be the first platform for free knowledge? Are there any metrics, or who will, how are we going to evaluate this,
199
00:28:46.690 --> 00:29:05.040
Nanour: to say that, yes, we are the first platform for free knowledge? And what about the other platforms that are working on free knowledge too? Are we collaborating with them? Are they our alliances or partnerships in the future,
200
00:29:05.040 --> 00:29:18.100
Nanour: or are we going to do it alone? Which, also, I don't think so. And about what Chris said, this island and this trusted place:
201
00:29:18.100 --> 00:29:35.620
Nanour: it's just like when I'm buying from the shop that I trust, that product. So everyone will come to me because I know this place, I'm going there, and I'm going there after 10 years, and they always have the same product, the same
202
00:29:35.620 --> 00:29:54.740
Nanour: parameters that keep it so. We have to keep our five pillars, that's for the first one. For the second one, to keep our volunteers, we need to empower more and more our communities all over the world, actually, by,
203
00:29:54.820 --> 00:29:55.850
Nanour: yeah.
204
00:29:56.150 --> 00:30:08.160
Nanour: encouraging the underrepresented communities, and they will go and bring the other types of audiences to our movement. Thank you.
205
00:30:09.540 --> 00:30:18.730
Liam Wyatt: Thank you very much. I really would like to hear an answer to the first question in particular: how do we expect to be able to measure success
206
00:30:18.800 --> 00:30:25.240
Liam Wyatt: against the strategic priority of becoming the essential infrastructure of free knowledge.
207
00:30:25.440 --> 00:30:30.650
Liam Wyatt: which is one of the two pillars of our strategy. How can that be measured?
208
00:30:31.140 --> 00:30:33.000
Liam Wyatt: I feel like I want to send that to Margeigh.
209
00:30:40.580 --> 00:31:00.180
Margeigh Novotny: Wow, thanks. Well, so, a couple of things. The first is, we have to be able to agree on what being the essential infrastructure of free knowledge means. That's been kind of a stumbling block, I think, that we've
210
00:31:00.240 --> 00:31:07.800
Margeigh Novotny: felt over the last couple of years. It means different things to different people. But I think
211
00:31:08.100 --> 00:31:18.570
Margeigh Novotny: this strategy of articulating possible futures that this future audiences discussion is raising
212
00:31:19.010 --> 00:31:21.780
Margeigh Novotny: gives us the opportunity to
213
00:31:24.250 --> 00:31:27.990
Margeigh Novotny: basically test what we think we mean by that.
214
00:31:28.160 --> 00:31:41.750
Margeigh Novotny: And then also the strategy of breaking it down: if we can articulate three futures, future one, future two, future three, and we can test hypotheses about
215
00:31:43.060 --> 00:31:53.220
Margeigh Novotny: how to get to any of those futures, then in the process of doing this work that we're talking about here today,
216
00:31:53.460 --> 00:32:08.220
Margeigh Novotny: we are going to be able to... it doesn't become such a giant, amorphous question of how you measure it. We're measuring each of the baby steps to getting to any one possible future,
217
00:32:08.350 --> 00:32:13.080
Margeigh Novotny: And that's how we know whether we're getting there. So I think it's going to be something like
218
00:32:13.090 --> 00:32:15.750
Margeigh Novotny: a grand process of elimination.
219
00:32:17.260 --> 00:32:22.940
Margeigh Novotny: And actually, you know, that's exciting, I think, for us to be getting there.
220
00:32:23.670 --> 00:32:31.880
Marshall Miller: Thank you, Margeigh. I think that makes a lot of sense. And, Nanour, I think it's really cool that we keep asking these big questions, so I really appreciate that.
221
00:32:32.050 --> 00:32:47.230
Marshall Miller: I wanted to get into something specific around AI. So, like we were just talking about, there are so many different aspects of this challenge; like, Luis, you brought up: how did we decide on this aspect? So what I want to do is show you one of the ideas we're working on now.
222
00:32:47.330 --> 00:33:02.600
Marshall Miller: I want to remind us this is experimental, it's going to be small at first, and I hope this is like a safe space to share and get feedback. And it kind of addresses this question that Sage was bringing up about being on the offensive.
223
00:33:02.650 --> 00:33:03.430
Marshall Miller: So
224
00:33:03.620 --> 00:33:07.010
Marshall Miller: the context is, let me share my screen again.
225
00:33:09.290 --> 00:33:12.580
Marshall Miller: Which one
226
00:33:14.440 --> 00:33:15.150
Marshall Miller: this one.
227
00:33:19.210 --> 00:33:19.920
Marshall Miller: Okay.
228
00:33:20.430 --> 00:33:29.170
Marshall Miller: So the context is, I bet a lot of people here have at least tried out ChatGPT, which is the most famous and popular of these chatbots right now,
229
00:33:29.560 --> 00:33:30.830
Marshall Miller: and
230
00:33:30.880 --> 00:33:38.680
Marshall Miller: one of the thoughts we could naturally have is: we know that this model is trained on Wikipedia content already.
231
00:33:38.740 --> 00:33:44.350
Marshall Miller: If you've played with it at all, it's pretty easy to tell that it's using Wikipedia and spreading the knowledge.
232
00:33:44.560 --> 00:33:55.950
Marshall Miller: But unfortunately it doesn't do anything to attribute to Wikipedia or its sources, or to tell people that the knowledge came from us, or to encourage them to help contribute back to the knowledge commons.
233
00:33:56.100 --> 00:34:09.130
Marshall Miller: And so that's one of the things we've been thinking and worrying about, and it's very similar to the challenge we've had with search engines: that they surface Wikipedia content, not always with great attribution, not always with opportunities to get involved.
234
00:34:09.409 --> 00:34:13.360
Marshall Miller: And so this is a screenshot from ChatGPT now.
235
00:34:13.550 --> 00:34:26.440
Marshall Miller: If you ask a current events question, something like "who won the 2023 women's basketball tournament", it says: I'm sorry, I do not know, my knowledge cutoff date is in 2021,
236
00:34:26.500 --> 00:34:35.080
Marshall Miller: the tournament hasn't taken place yet, etc. And that's because of the way these models are built; they're not able to incorporate current events.
237
00:34:35.159 --> 00:34:54.330
Marshall Miller: Well, you know who's great at incorporating current events? Our volunteer communities, who can update articles within minutes of information occurring. And it's very common that some news outlets or search engines just get it straight from Wikipedia; that's how they find out that something changed. So
238
00:34:54.440 --> 00:35:00.190
Marshall Miller: OpenAI, who makes ChatGPT, is starting up this new program called plugins,
239
00:35:00.280 --> 00:35:11.730
Marshall Miller: which allows outside websites to install these pieces of software into ChatGPT that help users of ChatGPT access that kind of content directly.
240
00:35:12.020 --> 00:35:17.500
Marshall Miller: So, for example, there's a plugin with Expedia, the travel website.
241
00:35:17.640 --> 00:35:35.300
Marshall Miller: And so if you have this plugin, you can plan a trip through ChatGPT and say: I'm flying from London to Rome next week, or I want to fly next week, what are some ticket options? And then it goes and asks Expedia, brings the information back, and helps you plan your trip with that kind of current information.
242
00:35:35.620 --> 00:35:39.600
Marshall Miller: And so we've been talking about what a Wikipedia plugin might look like.
243
00:35:40.130 --> 00:35:48.090
Marshall Miller: And so the idea here is: we already know that Wikipedia information is being used in these models,
244
00:35:48.140 --> 00:35:57.170
Marshall Miller: and right now we don't have any influence over how it's used. But perhaps with an opportunity like building a plugin,
245
00:35:57.290 --> 00:36:05.120
Marshall Miller: we could shape how people see Wikipedia content, whether they know it's from Wikipedia, and whether they are able to get involved.
246
00:36:05.310 --> 00:36:13.980
Marshall Miller: And so Chris, who's here, and a couple of the rest of us have been playing with this idea, and it's one of our first experiments. So I want to show you what this is starting to look like.
247
00:36:14.020 --> 00:36:18.220
Marshall Miller: So, this is another screenshot from our plugin, asking the same question
248
00:36:18.750 --> 00:36:20.150
Marshall Miller: in ChatGPT,
249
00:36:20.210 --> 00:36:26.230
Marshall Miller: and it's using our plugin, which nobody has access to, just us, because we're still playing with it,
250
00:36:26.320 --> 00:36:38.780
Marshall Miller: and this one is able to give the current information: according to Wikipedia, the LSU Tigers won the tournament, it was on April 2nd, 2023. This is all summarized content from the Wikipedia article.
251
00:36:39.050 --> 00:36:42.620
Marshall Miller: It says you can find out more information in the Wikipedia article, with the link,
252
00:36:42.830 --> 00:36:51.240
Marshall Miller: and we can even ask the plugin to tell people: anyone can edit Wikipedia, and if you want to get involved, here's how you can find out more.
253
00:36:51.270 --> 00:36:55.130
Marshall Miller: Now, there are definitely better ways to do this; this was just our first try,
254
00:36:55.330 --> 00:37:05.120
Marshall Miller: but it shows the idea that if we work to partner with organizations like this quickly, we might have an opportunity to shape how Wikipedia content
255
00:37:05.360 --> 00:37:09.630
Marshall Miller: appears to others through these platforms. So
256
00:37:09.960 --> 00:37:16.060
Marshall Miller: I'm hoping that showing something specific like this kind of generates some thoughts and reactions from all of you
257
00:37:16.090 --> 00:37:35.210
Marshall Miller: about whether this direction feels right. We could easily brainstorm many things that could go wrong; we already have been trying to do that, thinking: how could this go wrong, how could this cause harm? And we want to hear more about this from all of you. But just showing this, and there's one more screenshot along these lines I want to show:
258
00:37:35.210 --> 00:37:48.070
Marshall Miller: obviously this is a question about something that happened in the United States, and the answer is in English. We've been experimenting with what this can do in other languages, and what this can do with information from other Wikipedias.
259
00:37:48.080 --> 00:38:01.640
Marshall Miller: Here is an example of asking a question in Spanish and getting an answer in Spanish. You can see a bug right here: it starts in English, saying "According to Wikipedia"; this is a bug on our end. So this is how we're thinking about this and testing it.
260
00:38:01.910 --> 00:38:05.920
Marshall Miller: So I'll pause there. I bet you all get the idea, and I want to hear what you all think.
261
00:38:11.830 --> 00:38:27.780
Liam Wyatt: Thank you, Marshall. That's quite an inspiring, what's the word, brave new world of potential ways of interacting with Wikimedia content. I see one hand up from Abhishek, and I know that that question is not related to this specifically; I have that on
262
00:38:27.940 --> 00:38:31.790
Liam Wyatt: hold. Thank you very much, I will call on you next.
263
00:38:31.820 --> 00:38:36.030
Liam Wyatt: Sage, did you have a response to this or a reaction to this plugin? Go for it.
264
00:38:36.120 --> 00:38:47.330
Sage Ross: Yeah, I just have a question about how this would work: how would we know when Wikipedia information is being used? It was my understanding that,
265
00:38:47.520 --> 00:38:50.030
Sage Ross: well, you know, a human can often
266
00:38:50.160 --> 00:38:59.190
Sage Ross: tell that obviously this came from there, but that at its heart the GPT system itself doesn't know.
267
00:38:59.260 --> 00:39:04.580
Marshall Miller: That's a question about, technically, how plugins work. So, Maryana or Chris, do you want to explain that?
268
00:39:07.670 --> 00:39:09.670
chrisalbon: Yeah, Maryana, you can, or I can.
269
00:39:09.880 --> 00:39:18.790
Maryana Pinchuk: Sure, yeah. So the thing that's different about plugins versus sort of vanilla ChatGPT is that one of the things you can do with plugins is
270
00:39:18.790 --> 00:39:35.160
Maryana Pinchuk: specify essentially the knowledge base that ChatGPT will draw from. So it's still applying its sort of conversational agent to the content; it might be doing some light summarizing here, like if you go to the Wikipedia article on the Final Four, you might see slightly different wording here. So it's still sort of
271
00:39:35.160 --> 00:39:44.120
Maryana Pinchuk: taking the content and running it through its dialogue filter, but it is using Wikipedia specifically as its knowledge base, because we've told it to do so with this plugin system.
272
00:39:44.510 --> 00:39:50.400
Sage Ross: It's sort of out of scope to go into it in more detail, but I'm really curious about
273
00:39:50.850 --> 00:39:52.750
Sage Ross: how it differs from the normal one.
274
00:39:53.240 --> 00:39:57.630
Marshall Miller: I'll just add, like, literally:
275
00:39:57.750 --> 00:40:16.250
Marshall Miller: in the plugin world, if you ask this question, who won the women's basketball tournament, ChatGPT sees that question and it goes: this is a knowledge question, I'm going to use the Wikipedia plugin, just like if it sees that this is a travel question, it would use the Expedia plugin. So then it goes to our plugin,
276
00:40:16.250 --> 00:40:24.540
Marshall Miller: and it decides what query to ask it, and it might have used something like "2023 women's basketball tournament",
277
00:40:24.630 --> 00:40:27.140
Marshall Miller: and then sends it over API to us,
278
00:40:27.740 --> 00:40:39.310
Marshall Miller: and then it gets information back from us, which in this case was content from Wikipedia articles. It receives all of that, all the articles, and then it starts to summarize it and turn it into this answer.
279
00:40:40.460 --> 00:40:49.300
Liam Wyatt: So to clarify, the plugin doesn't require the person asking the question to say: I only want content from Wikipedia.
280
00:40:49.490 --> 00:40:54.980
Liam Wyatt: The chatbot itself makes that decision. It's not like, as was suggested in the chat,
281
00:40:55.010 --> 00:40:57.410
Liam Wyatt: you go into a search engine saying:
282
00:40:57.420 --> 00:41:01.960
Liam Wyatt: search only within the site Wikipedia, and now here's my query.
283
00:41:06.800 --> 00:41:13.950
Marshall Miller: That's right, but you can also say that you want it from Wikipedia, and it will take the hint in case it hasn't, like, realized that already.
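(For readers following along: the flow Marshall describes, the model routing a knowledge question to a plugin, the plugin querying Wikipedia, and the model summarizing what comes back with a link, can be sketched in a few lines. The sketch below is illustrative only; it is not the Foundation's experimental plugin, and the function name, structure, and the choice to go through the public MediaWiki Action API are assumptions for illustration. The API endpoint and parameters shown do exist.)

```python
# Hedged sketch: one way a plugin-style backend *could* supply Wikipedia
# content for a knowledge question. Not the actual experimental plugin;
# the MediaWiki Action API calls below are real and public.
import requests

API = "https://{lang}.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "wikipedia-plugin-sketch/0.1 (illustration only)"}


def fetch_wikipedia_context(query: str, lang: str = "en", max_articles: int = 3) -> list[dict]:
    """Search Wikipedia for the query and return intro extracts plus links,
    so a calling chat model can summarize, attribute, and point people to the article."""
    api = API.format(lang=lang)

    # Step 1: full-text search to find candidate articles for the query.
    search = requests.get(api, headers=HEADERS, params={
        "action": "query", "list": "search",
        "srsearch": query, "srlimit": max_articles, "format": "json",
    }).json()
    titles = [hit["title"] for hit in search["query"]["search"]]

    results = []
    for title in titles:
        # Step 2: fetch the plain-text introduction of each article.
        extract = requests.get(api, headers=HEADERS, params={
            "action": "query", "prop": "extracts",
            "explaintext": 1, "exintro": 1, "redirects": 1,
            "titles": title, "format": "json",
        }).json()
        page = next(iter(extract["query"]["pages"].values()))
        results.append({
            "title": title,
            "url": f"https://{lang}.wikipedia.org/wiki/{title.replace(' ', '_')}",
            "extract": page.get("extract", ""),
        })
    return results


if __name__ == "__main__":
    # A chat model would summarize these extracts, cite the URLs, and
    # (as discussed above) could mention that anyone can edit Wikipedia.
    for item in fetch_wikipedia_context("2023 NCAA Division I women's basketball tournament"):
        print(item["title"], "->", item["url"])
```

The attribution and "anyone can edit" messaging Marshall describes would then be added by the calling model or by the plugin's response text, which is where much of the design discussion in this call applies.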
284
00:41:18.120 --> 00:41:22.120
Liam Wyatt: The other question we have in the chat here, and in the queue here,
285
00:41:22.180 --> 00:41:31.720
Liam Wyatt: is from Abhishek, and I'm sorry if I'm mispronouncing your name. Would you like me to read out your question, or feel free to speak it yourself?
286
00:41:31.920 --> 00:41:48.870
Abhishek Suryawanshi: Yeah, sure. Thank you so much for hosting this call. I have a basic question: does the future audiences initiative team have any resources for the community to get involved in this, in terms of a plan or support, where community members also want to get involved and then help you guys?
287
00:41:54.350 --> 00:41:56.120
Marshall Miller: Maryana?
288
00:41:56.360 --> 00:41:57.110
Liam Wyatt: Yes.
289
00:41:57.350 --> 00:42:09.660
Maryana Pinchuk: So yeah, this is a pretty small piece of the overall product and tech portfolio; sort of the plan for this year is to mainly focus our energy and attention, at the broad scale, on existing audiences,
290
00:42:09.660 --> 00:42:21.760
Maryana Pinchuk: for a lot of really good reasons. However, we do have, sort of separate from this, a whole community grants process, which probably some of you on this call are already familiar with,
291
00:42:21.790 --> 00:42:40.880
Maryana Pinchuk: and one of the things that we can certainly do is encourage our community resources team to fund innovative new ideas that really dovetail with this whole future audiences concept. So we don't have anything specific set aside as far as a separate grants track.
292
00:42:40.880 --> 00:42:52.310
Maryana Pinchuk: But in terms of other ways that community members can participate: I think probably the most valuable thing you can do at this stage, and across the whole year as we get into things like this,
293
00:42:52.350 --> 00:43:00.040
Maryana Pinchuk: is make time, if you're interested in this kind of work, to give us your thoughts, your feedback, your reactions,
294
00:43:00.040 --> 00:43:17.700
Maryana Pinchuk: to help us test these things. We will probably touch on this in a second, but, you know, quality of output here is going to be very important if we're associating this content with our brand very strongly, so we're going to have to be very judicious about making sure that the output is good.
295
00:43:17.700 --> 00:43:32.860
Maryana Pinchuk: And for that we're just going to need help, and the people who know what is good are Wikimedians. So there are a lot of different ways that I think community members can get involved in this work, and we'll also talk about this at the end, when we have a little bit of time carved out for next steps.
296
00:43:32.860 --> 00:43:39.590
Maryana Pinchuk: But I hope that sort of answers your question, or let me know if you have more thoughts there.
297
00:43:45.310 --> 00:43:55.370
Marshall Miller: Thanks. So we wanted to get to one other topic, although this one's really interesting and exciting, so I get why we're talking about this so much. I think we wanted to get to one other topic, and
298
00:43:55.510 --> 00:44:05.710
Marshall Miller: before we transition, I guess I wanted to check with all of you on the call that are not Wikimedia Foundation staff. I'm interested in us detecting, like:
299
00:44:05.780 --> 00:44:23.160
Marshall Miller: do people have major concerns about this idea, or is it sort of cautious optimism, like: seems like a good idea, but there are pitfalls? If it's that, then that's awesome; let's convene again and keep talking about it. But I would love to find out if there are people that, you know, think this is the wrong track.
300
00:44:27.790 --> 00:44:29.810
Liam Wyatt: That's a weird case.
301
00:44:30.440 --> 00:44:38.460
Luis Villa (he/him): I'm weird, I'm a weird case, as I'm ex-Foundation. But I think the biggest risk here is moving too slow,
302
00:44:39.510 --> 00:44:45.710
Luis Villa (he/him): and I know, I'm keenly aware, that there are very good reasons
303
00:44:45.730 --> 00:44:51.550
Luis Villa (he/him): why the Foundation often moves at a very measured pace. But
304
00:44:53.890 --> 00:44:58.330
Luis Villa (he/him): I'd urge all of you to think big and think bold in this moment,
305
00:44:58.360 --> 00:45:08.530
Luis Villa (he/him): because it's both a big opportunity and a very big risk if we don't react boldly and bigly enough to this.
306
00:45:08.900 --> 00:45:14.710
Liam Wyatt: We might quote you on that.
307
00:45:14.800 --> 00:45:21.440
Luis Villa (he/him): Put it on freaking banners; do whatever you need with that. And, yeah.
308
00:45:23.050 --> 00:45:27.080
Liam Wyatt: There is a response from Klara in the chat agreeing with you.
309
00:45:27.230 --> 00:45:29.890
Liam Wyatt: I see two other hands up.
310
00:45:29.950 --> 00:45:44.040
Liam Wyatt: obviously so. I think the it's a great initiative. But we are talking about future audiences, but the current audiences and the past audiences are developed by the community itself.
311
00:45:44.210 --> 00:45:54.440
Abhishek Suryawanshi: So I would like to see the community leading this initiative, supported by the Foundation, as compared to the Foundation doing everything and just updating the community on what's happening,
312
00:45:54.470 --> 00:45:56.720
Abhishek Suryawanshi: if that makes sense. Thanks.
313
00:45:57.980 --> 00:46:00.150
Liam Wyatt: Thank you. And, Martin?
314
00:46:01.240 --> 00:46:08.290
Martin: It's probably the same question as: should we have, as the Wikimedia Foundation or as Wikimedia
315
00:46:08.300 --> 00:46:11.700
Martin: communities, a page on Facebook?
316
00:46:11.820 --> 00:46:17.090
Martin: Should we have channels on Telegram? So, like, using
317
00:46:17.990 --> 00:46:21.220
Martin: tools which are not greatly
318
00:46:22.310 --> 00:46:28.740
Martin: working the way we want, as part of the ecosystem which we want to have.
319
00:46:29.270 --> 00:46:30.940
Martin: and
320
00:46:31.010 --> 00:46:35.640
Martin: I can imagine that some people would answer this question with:
321
00:46:36.100 --> 00:46:37.580
Martin: maybe we shouldn't do this
322
00:46:39.840 --> 00:46:44.480
Martin: and what we do, this plugin, looks cool.
323
00:46:44.560 --> 00:46:48.600
Martin: On the one hand, we don't solve the problem with the general
324
00:46:48.750 --> 00:46:52.710
Martin: AI which uses our content without
325
00:46:52.990 --> 00:46:54.170
Martin: giving credit.
326
00:46:55.230 --> 00:47:01.900
Martin: and we also more or less accept the way they
327
00:47:03.090 --> 00:47:07.680
Martin: gather information and produce content
328
00:47:08.110 --> 00:47:09.410
Martin: for them all.
329
00:47:09.440 --> 00:47:15.440
Martin: Yeah, I'm German, so we also had problems with Wikipedia Zero and such,
330
00:47:15.530 --> 00:47:17.710
Martin: because of net neutrality.
331
00:47:19.670 --> 00:47:26.020
Martin: I'm not sure if we, as the one and only Wikipedia, should —
332
00:47:26.230 --> 00:47:28.390
Martin: well use our powers.
333
00:47:28.590 --> 00:47:31.010
Martin: and to
334
00:47:31.890 --> 00:47:41.550
Martin: say we are the ones — we give "this is according to Wikipedia," like we are the credible sources here: don't trust the other ones, we have the plugin.
335
00:47:41.730 --> 00:47:48.420
Martin: These are concerns that I partly share, and I'm pretty sure that others will share them too.
336
00:47:48.590 --> 00:47:51.630
Martin: And so
337
00:47:53.090 --> 00:47:58.860
Martin: I just wanted to say that probably not 100% of people are happy about this.
338
00:47:58.970 --> 00:48:03.260
Martin: Nonetheless, I hear Luis, and I see the problems here.
339
00:48:04.880 --> 00:48:07.090
Martin: If we don't do anything.
340
00:48:07.350 --> 00:48:08.570
Martin: This will
341
00:48:11.250 --> 00:48:16.370
Martin: overtake us so fast that we will lose our position,
342
00:48:16.460 --> 00:48:19.380
Martin: and in all regards.
343
00:48:19.530 --> 00:48:20.390
Martin: so
344
00:48:20.480 --> 00:48:23.980
Martin: I'm pretty undecided on this. But there are definitely concerns.
345
00:48:28.390 --> 00:48:36.920
Liam Wyatt: Thank you, Martin. I think your point of reticence — we need to do this fast, but we need to do this appropriately —
346
00:48:37.070 --> 00:48:51.210
Liam Wyatt: That needs to be captured, but it actually also draws us into one of the other areas of the objectives about future audiences which we have not spoken about much. You mentioned other platforms, Facebook, for example.
347
00:48:51.230 --> 00:48:56.890
Liam Wyatt: In the couple of minutes that we have remaining — we've spent a lot of time talking about AI, but
348
00:48:57.000 --> 00:49:02.860
Liam Wyatt: one of the other areas relates to Objective 2, Key Result 1
349
00:49:02.900 --> 00:49:13.130
Liam Wyatt: in our annual plan document, which talks about third-party content platforms and reaching youth audiences within them.
350
00:49:13.390 --> 00:49:14.630
Liam Wyatt: You mentioned
351
00:49:14.650 --> 00:49:26.290
Liam Wyatt: the difficulties in choosing where and when the Wikimedia movement should have an official presence on third-party platforms, especially if we don't control the policies, or we don't approve of
352
00:49:26.530 --> 00:49:32.190
Liam Wyatt: the organization that owns those platforms. But that is where the audience might be.
353
00:49:34.500 --> 00:49:39.090
Liam Wyatt: How do you, Martin, or others think we should
354
00:49:39.540 --> 00:49:47.480
Liam Wyatt: appropriately approach this goal of reaching the people, particularly young people, where they are.
355
00:49:47.500 --> 00:49:50.280
Liam Wyatt: rather than expecting them to come to us.
356
00:49:57.750 --> 00:49:58.900
Liam Wyatt: Go, please, Martin.
357
00:50:00.310 --> 00:50:11.500
Martin: Oh, there is not an easy answer. If we look across Wikimedia and the discussions about whether we should have a presence on Twitter or not —
358
00:50:11.900 --> 00:50:12.580
Martin: It's.
359
00:50:13.030 --> 00:50:15.160
Martin: We will have to have
360
00:50:16.660 --> 00:50:19.360
Martin: reviews, reflections, and such
361
00:50:21.520 --> 00:50:25.900
Martin: from time to time to see if we are still on the right path.
362
00:50:26.100 --> 00:50:31.990
Martin: and this also has to apply to AI, of course.
363
00:50:33.010 --> 00:50:39.090
Liam Wyatt: So how do we, as a movement, have that kind of conversation that does not require
364
00:50:39.300 --> 00:50:41.070
Liam Wyatt: everyone getting a veto
365
00:50:41.420 --> 00:50:51.310
Liam Wyatt: over it? There will never be a circumstance where everyone agrees with every decision, particularly on technical decisions like third-party platform integration.
366
00:50:51.550 --> 00:50:56.020
Liam Wyatt: Obviously, inside the Wikimedia Foundation there is a formal
367
00:50:56.050 --> 00:51:06.650
Liam Wyatt: hierarchy — your boss can say yes or no — but in a movement, how do we have the fast decision-making that Luis speaks of, and the cautious
368
00:51:07.260 --> 00:51:10.290
Liam Wyatt: approach to doing the right thing.
369
00:51:10.340 --> 00:51:19.670
Liam Wyatt: which is also necessary? How do we connect those two things without giving every single person a right to veto every single opportunity?
370
00:51:19.960 --> 00:51:22.520
Martin: Vetoes are not
371
00:51:22.550 --> 00:51:25.240
Martin: constructive.
372
00:51:26.040 --> 00:51:31.860
Martin: Yeah, of course, concerns must be heard. They must be taken into consideration,
373
00:51:32.010 --> 00:51:36.790
Martin: taken into account.
374
00:51:37.040 --> 00:51:44.150
Martin: but we've always been very slow on improvements and moving forward.
375
00:51:44.260 --> 00:51:49.850
Martin: and I feel this will also be the case with AI, and already is.
376
00:51:49.990 --> 00:51:56.560
Martin: I'm coming from German Wikipedia. We don't have ORES, for example, because we are quite skeptical about
377
00:51:57.060 --> 00:52:00.760
Martin: technological improvements and tools like that.
378
00:52:00.880 --> 00:52:02.420
Martin: So
379
00:52:03.880 --> 00:52:09.670
Martin: I've been asking myself the same question for a couple of weeks, months:
380
00:52:10.140 --> 00:52:13.140
Martin: how do we really get
381
00:52:13.720 --> 00:52:15.760
Martin: technological improvements
382
00:52:17.840 --> 00:52:20.330
Martin: with the community heard? And
383
00:52:20.550 --> 00:52:25.190
Martin: to date, my conclusion is maybe to have some kind of
384
00:52:26.670 --> 00:52:27.930
Martin: council.
385
00:52:30.340 --> 00:52:32.470
Martin: Yeah, maybe
386
00:52:33.570 --> 00:52:38.250
Martin: a subcommittee of the Global Council which will focus on
387
00:52:38.460 --> 00:52:51.340
Martin: technological improvements. We can't get everything through requests for comment on a global level. This will not scale with the speed of the world around us.
388
00:52:51.630 --> 00:52:55.770
Martin: And we missed so many options in the past,
389
00:52:55.950 --> 00:52:57.630
Martin: and
390
00:52:57.650 --> 00:53:00.690
Martin: the technological improvements are slow,
391
00:53:00.990 --> 00:53:02.770
Martin: and
392
00:53:03.050 --> 00:53:08.650
Martin: there must be a change to that, which also implies that
393
00:53:08.720 --> 00:53:14.720
Martin: local communities should realize that there's a need for this.
394
00:53:15.140 --> 00:53:19.200
Martin: and maybe send out people to make decisions
395
00:53:19.380 --> 00:53:20.880
Martin: on their behalf.
396
00:53:21.640 --> 00:53:25.940
Martin: But it's a question of governance once again.
397
00:53:26.120 --> 00:53:27.960
Martin: we'll see how it will develop.
398
00:53:29.170 --> 00:53:34.900
Liam Wyatt: Thank you, Martin. I definitely appreciate the desire for some kind of oversight and governance,
399
00:53:35.060 --> 00:53:43.820
Liam Wyatt: balanced with the risk of inventing governance structures for the sake of it, and elections and committees, and so forth.
400
00:53:44.140 --> 00:53:46.570
Liam Wyatt: But yes, there needs to be a way to
401
00:53:47.820 --> 00:53:50.610
Liam Wyatt: balance these two risks and benefits.
402
00:53:51.280 --> 00:54:00.380
Maryana Pinchuk: So we've got about two minutes left, and a few folks asking about social apps and video and sort of what our approach there is,
403
00:54:00.430 --> 00:54:15.430
Maryana Pinchuk: and I really don't want to drop that, because I think it's a really important second track of testing that we really want to get into. Marshall has queued up a little video that he could play, maybe as a parting thought and inspiration for thinking about how
404
00:54:15.430 --> 00:54:26.140
Maryana Pinchuk: our content — our whole model of verifying knowledge — could play in this space. So I'll just let him play this. This is a
405
00:54:26.410 --> 00:54:28.510
Maryana Pinchuk: TikTok account that he's a big fan of.
406
00:54:29.010 --> 00:54:32.650
Marshall Miller: Yeah. So this is a TikTok account called Uncovering California,
407
00:54:32.770 --> 00:54:52.320
Marshall Miller: and what the people who make this account do is make videos about the geography of California that get tens of thousands — hundreds of thousands — of views. You will see why I'm showing this in a second. Actually, I bet the sound is not going to play unless I share it a little bit differently. Just a second.
408
00:54:53.290 --> 00:54:57.100
Marshall Miller: Well, I'm not sure if the sound is going to play; let's try it.
409
00:55:02.880 --> 00:55:56.170
[Video plays: a clip from the Uncovering California TikTok series on Madera County — part of a series covering all 58 California counties — describing when the county was incorporated, its area and bordering counties, its semi-arid climate, a population of roughly 156,000, and its county seat and largest city, Madera, whose name comes from the Spanish word for wood.]
415
00:55:56.660 --> 00:56:12.610
Marshall Miller: Okay, so I'll pause there. You can probably guess why we shared this. We talked to these creators, the people that made this video, and we asked them about their process, and of course the first thing they said was: first, we go to the Wikipedia article and we take notes.
416
00:56:13.780 --> 00:56:15.060
Marshall Miller: Did somebody say something
417
00:56:16.640 --> 00:56:25.870
Marshall Miller: Sorry — the process was: first they go to the Wikipedia article and take notes, and they use that as the basis for the video, and they also take images from Commons.
418
00:56:26.240 --> 00:56:32.950
Marshall Miller: And so it kind of shows us that Wikimedia content is spreading through these other platforms.
419
00:56:33.130 --> 00:56:35.170
Marshall Miller: which is great for spreading knowledge.
420
00:56:35.360 --> 00:56:48.890
Marshall Miller: But a few things that it tells us: they don't mention Wikipedia here anywhere, so there's no attribution; they don't talk about contributing back knowledge, or donating, or anything like that. And
421
00:56:48.990 --> 00:57:02.830
Marshall Miller: also we don't know if they're using the knowledge in a totally accurate way, and whether they've processed it correctly and reflected it correctly. So the question — the food for thought, since we're coming to the end of the meeting — is:
422
00:57:02.860 --> 00:57:12.170
Marshall Miller: given that this is already happening and reaching people that probably would never be reached by Wikipedia, how might we think about encouraging this to happen in a good way,
423
00:57:12.290 --> 00:57:29.310
Marshall Miller: in a way that spreads knowledge with high fidelity and allows people to understand where it came from, so they can get involved? It's a similar line of thinking to the one we just talked about with AI: the knowledge is spreading through AI — how should we get involved to help it spread the right way?
424
00:57:29.360 --> 00:57:34.680
Marshall Miller: And this is an example that comes from social media. The knowledge is spreading through social media.
425
00:57:34.690 --> 00:57:37.170
Marshall Miller: How might we encourage it to happen the right way.
426
00:57:42.950 --> 00:57:55.860
Maryana Pinchuk: So I think we're over time, but one thing we definitely want to get to is asking you all to sign up on our Meta page, the link to which someone can drop in the chat,
427
00:57:55.860 --> 00:58:08.170
Maryana Pinchuk: because I think it's clear that this is not one conversation and then we're done, see you next year. This is going to be a very interesting, rich area, with many, many different paths,
428
00:58:08.180 --> 00:58:25.920
Maryana Pinchuk: and we would love to get a sense generally from you all of which areas in particular you're interested in. Is it the social app and video space? Is it the conversational AI space? Is it thinking about future trends overall and helping us understand where things are pointing us as a movement?
429
00:58:25.920 --> 00:58:37.250
Maryana Pinchuk: That way we can follow up more with each of you, really leverage your great thought partnership and ideas, and just continue this conversation.
430
00:58:37.440 --> 00:58:51.650
Maryana Pinchuk: So please, please do sign up there. I know you've already signed up to come talk to us here; please sign up there again, indicate your interest area, and we'll be following up soon to keep this going. Thank you all so much for being here.
431
00:58:52.320 --> 00:58:56.270
Liam Wyatt: Thank you, Maryana. Yes, that page — I put a link in the chat —
432
00:58:56.320 --> 00:59:10.750
Liam Wyatt: is currently a draft page. It was put on Meta today to begin looking at. There is also obviously the future Trends section of the current Wikimedia Foundation Annual Plan draft document
433
00:59:11.030 --> 00:59:14.890
Liam Wyatt: which this speaks to, and which this results from,
434
00:59:15.170 --> 00:59:27.220
Liam Wyatt: and hopefully this page — the one we just mentioned — will become the hub and home of all Future Audiences work over the next year and beyond, which will
435
00:59:27.390 --> 00:59:35.990
Liam Wyatt: doubtless spread into a series of interrelated but separate hypotheses and
436
00:59:36.170 --> 00:59:49.560
Liam Wyatt: trial activities in different directions, from the AI plugin to partnerships with TikTok creators, and everything in between: different kinds of API use and attribution work,
437
00:59:49.600 --> 00:59:54.310
Liam Wyatt: lots of different public policy and technology as well as
438
00:59:54.370 --> 00:59:57.790
Liam Wyatt: ethical and partnership activities.
439
00:59:58.070 --> 01:00:08.230
Liam Wyatt: The best place will probably be that page, though no doubt there'll be various blog posts and email threads and complaints that will come in different ways
440
01:00:08.370 --> 01:00:09.880
Liam Wyatt: over time.
441
01:00:11.220 --> 01:00:22.930
Liam Wyatt: Particularly, as Martin mentioned, there's this challenge of doing it the right way versus doing it the fast way — doing something that is both right and fast will be hard to achieve.
442
01:00:23.560 --> 01:00:31.870
Liam Wyatt: We are slightly over time, so I will thank you all for your time. I'll put this recording on Wikimedia Commons hopefully tonight or tomorrow, and
443
01:00:31.910 --> 01:00:43.860
Liam Wyatt: future comments can still be added to the talk pages of the various things I mentioned today, notably the OKRs — which is the
444
01:00:44.880 --> 01:00:49.940
Liam Wyatt: acronym for Objectives and Key Results — the
445
01:00:50.790 --> 01:00:56.580
Liam Wyatt: draft document for the entire Wikimedia Foundation Annual Plan's technology work,
446
01:00:56.810 --> 01:01:07.120
Liam Wyatt: which is currently still in draft. This conversation today is a supplement to that on-wiki work; it is not a replacement for it, for those who could not come
447
01:01:07.560 --> 01:01:19.070
Liam Wyatt: or did not want to come to a live conversation like this. With that, thank you all for your time, and I'll see you around the wikis. Good evening.
448
01:01:20.610 --> 01:01:21.430
Maryana Pinchuk: Thanks All.