1
00:00:00,000 --> 00:00:03,000
And now we are going to begin with our first keynote
2
00:00:03,000 --> 00:00:06,000
Sumana Harihareswara (User:Sumanah)
3
00:00:06,000 --> 00:00:08,500
Did I get that right?
4
00:00:09,000 --> 00:00:09,600
Ok
4
00:00:10,000 --> 00:00:11,500
Sumana is
5
00:00:11,500 --> 00:00:13,500
a Senior Technical Writer
6
00:00:13,500 --> 00:00:15,000
at the Wikimedia Foundation,
7
00:00:15,250 --> 00:00:19,000
where she works in the Engineering Community Team
8
00:00:19,500 --> 00:00:22,500
engaging volunteer developers within the Wikimedia movement
9
00:00:22,500 --> 00:00:24,500
and rocking hackathons.
10
00:00:25,000 --> 00:00:27,250
Sumana has worked for Collabora, GNOME,
11
00:00:27,250 --> 00:00:31,000
QuestionCopyright.org, Fog Creek Software,
12
00:00:31,000 --> 00:00:32,700
Behavior, and Salon.com.
13
00:00:32,700 --> 00:00:34,000
She has contributed to MediaWiki,
14
00:00:34,000 --> 00:00:37,500
AltLaw, Empathy, Miro, and Zeitgeist
15
00:00:37,500 --> 00:00:39,000
open source projects.
16
00:00:39,250 --> 00:00:41,500
She’s also a blogger at Geek Feminism,
17
00:00:41,500 --> 00:00:43,200
and a member of the Board of Directors
18
00:00:43,200 --> 00:00:45,500
of the Ada Initiative,
19
00:00:46,000 --> 00:00:48,500
and was an editor and release organizer
20
00:00:48,500 --> 00:00:50,500
of the GNOME Journal.
21
00:00:51,000 --> 00:00:52,500
She’s truly a badass
22
00:00:52,500 --> 00:00:55,250
and a friend who has supported
23
00:00:55,250 --> 00:00:59,250
and helped me as I’m sure she has for many other people
24
00:00:59,250 --> 00:01:01,000
in this room and around the world,
25
00:01:01,500 --> 00:01:05,000
feel welcome within the Wikimedia, free culture,
26
00:01:05,000 --> 00:01:08,250
open knowledge, techno-activist and feminist communities,
27
00:01:08,250 --> 00:01:10,700
and I’m very excited to be introducing Sumana
28
00:01:10,700 --> 00:01:13,300
to all of you as your first keynote speaker.
29
00:01:14,000 --> 00:01:19,000
(applause)
30
00:01:22,000 --> 00:01:25,000
Hey there. You can all hear me all right?
31
00:01:25,000 --> 00:01:26,000
Great.
32
00:01:27,000 --> 00:01:32,000
Well, thank you to everyone who worked on creating this conference
33
00:01:32,000 --> 00:01:34,000
and bringing it here.
34
00:01:34,000 --> 00:01:36,000
And thank you for inviting me to open this conference.
35
00:01:36,000 --> 00:01:38,000
I'm speaking for myself today,
36
00:01:38,000 --> 00:01:42,500
and not the Wikimedia Foundation or the Ada Initiative,
37
00:01:42,500 --> 00:01:45,700
which is good because I might swear a little bit.
38
00:01:46,000 --> 00:01:48,000
And thank you to NyQuil
39
00:01:48,000 --> 00:01:49,500
for making it possible for me to be here.
40
00:01:49,500 --> 00:01:51,000
(laughter)
41
00:02:02,500 --> 00:02:04,500
Another acknowledgement.
42
00:02:05,000 --> 00:02:07,700
I acknowledge the Lenape people
43
00:02:08,700 --> 00:02:10,000
on whose ancestral land I stand,
44
00:02:10,000 --> 00:02:13,000
the African people whose bones were buried unmarked
45
00:02:13,000 --> 00:02:14,000
in this city’s foundations,
46
00:02:14,000 --> 00:02:17,000
and the involuntary and exploited labor of people
47
00:02:17,000 --> 00:02:20,000
from every land which helped create New York City.
48
00:02:21,000 --> 00:02:25,000
And I want to thank all of you who are here,
49
00:02:25,000 --> 00:02:28,000
and everyone who shares in imagining
50
00:02:28,000 --> 00:02:30,000
a world in which every single human being
51
00:02:30,000 --> 00:02:32,500
can freely share in the sum of all knowledge.
52
00:02:32,500 --> 00:02:34,500
You might have heard that line before.
53
00:02:34,500 --> 00:02:36,000
That’s our commitment together.
54
00:02:36,000 --> 00:02:39,500
And Wikimedia is a wonderful place to be
55
00:02:39,500 --> 00:02:42,000
because I can help everyone,
56
00:02:42,000 --> 00:02:44,500
and everyone can help me.
57
00:02:45,000 --> 00:02:48,000
All of us here have goals,
58
00:02:48,000 --> 00:02:50,000
about what we want Wikimedia to be,
59
00:02:50,000 --> 00:02:52,500
about how we see that mission unfolding.
60
00:02:52,500 --> 00:02:55,000
And we each have our own different perspectives,
61
00:02:55,000 --> 00:02:56,000
which is great,
62
00:02:56,000 --> 00:03:00,500
because no one of us has all the answers to how to achieve this mission,
63
00:03:00,500 --> 00:03:03,000
which means we have to learn from each other.
64
00:03:03,000 --> 00:03:05,000
Every single one of us –
65
00:03:05,000 --> 00:03:06,000
every single one of you –
66
00:03:06,000 --> 00:03:10,000
has stuff we need to learn to help achieve the mission.
67
00:03:10,000 --> 00:03:13,000
For some of us, we need to learn how to use new tools,
68
00:03:13,000 --> 00:03:15,700
we need to learn certain aspects, little bits of domain knowledge,
69
00:03:15,700 --> 00:03:18,500
we need to learn how to talk to each other in new ways,
70
00:03:18,500 --> 00:03:23,500
or we need to learn how to teach each other, and teach new people, better.
71
00:03:23,500 --> 00:03:26,000
And so that’s why I’m going to be talking today
72
00:03:26,000 --> 00:03:31,000
about making learning environments, nurturing learning environments.
73
00:03:31,000 --> 00:03:34,000
I'm going to tell you about one that worked for me,
74
00:03:34,500 --> 00:03:36,500
a place where I found I could learn things
75
00:03:36,500 --> 00:03:40,000
that I had not been able to learn elsewhere, including at Wikimedia.
76
00:03:40,000 --> 00:03:42,500
And I want you to see how the essential features of that place
77
00:03:42,500 --> 00:03:47,000
helped everyone learn more effectively than some people can learn right now
78
00:03:47,000 --> 00:03:49,500
within the Wikimedia community.
79
00:03:49,500 --> 00:03:52,000
Then I'll talk about how we apply that to Wikimedia as a whole.
78
00:03:53,000 --> 00:03:54,500
In fall 2013
79
00:03:54,500 --> 00:03:59,200
I took three months off from my Wikimedia Foundation work to improve my programming skills.
78
00:03:59,200 --> 00:04:02,000
I did this at an experimental place called Hacker School,
79
00:04:02,000 --> 00:04:07,500
where people who already have some programming experience get together to improve their skills.
80
00:04:07,500 --> 00:04:08,800
And there's no set curriculum;
81
00:04:08,800 --> 00:04:11,500
it's a face-to-face place, not too far from here,
82
00:04:11,500 --> 00:04:14,000
right at Canal and Broadway, right here in Manhattan,
83
00:04:14,000 --> 00:04:18,200
where people work on their own projects, or contribute to open source together,
84
00:04:18,200 --> 00:04:20,250
or work through textbooks, or online courses.
85
00:04:20,250 --> 00:04:23,000
It's free to attend, but to encourage gender diversity,
86
00:04:23,000 --> 00:04:26,000
there are grants for women, to cover living expenses.
87
00:04:26,000 --> 00:04:28,750
And in my batch, which was 59 people,
88
00:04:28,750 --> 00:04:32,000
gathered together for three months, it was 42% women.
89
00:04:32,000 --> 00:04:36,500
So I’ll start off by talking about ways in which Hacker School is unlike us,
90
00:04:36,500 --> 00:04:40,000
but then talk about why I think we can still learn from it,
91
00:04:40,000 --> 00:04:43,000
and there are some ways in which Hacker School is very Wikimedian.
92
00:04:44,000 --> 00:04:47,700
Now, it’s unlike us in that every single person there has a
93
00:04:47,700 --> 00:04:50,750
really concrete, shared goal.
94
00:04:50,750 --> 00:04:53,500
And you might say “Hold on, Sumana, you just talked about this shared mission!”
95
00:04:53,500 --> 00:04:57,000
Well, yeah, but I think a lot of us contribute to that mission in very different ways
96
00:04:57,000 --> 00:05:00,800
and have somewhat different conceptions of what it is;
97
00:05:00,800 --> 00:05:03,500
some of us contribute to Wikisource, or to Commons,
98
00:05:03,500 --> 00:05:05,250
some of us care more about editing,
99
00:05:05,250 --> 00:05:06,500
some about programming,
100
00:05:06,500 --> 00:05:08,000
some about design, some about outreach –
101
00:05:08,000 --> 00:05:14,000
it can be fuzzy, and Hacker School is much more concentrated.
102
00:05:14,000 --> 00:05:17,500
Every single person there is just there to become a better programmer,
103
00:05:17,500 --> 00:05:19,250
even if they’re learning different languages, and so on.
104
00:05:19,250 --> 00:05:21,200
And it’s small. And young.
105
00:05:21,200 --> 00:05:23,500
Hacker School has only been around for a few years,
106
00:05:23,500 --> 00:05:27,300
and there’s a total of maybe 350 people who have ever passed through it.
107
00:05:27,300 --> 00:05:30,000
It’s very face-to-face.
108
00:05:30,000 --> 00:05:35,000
Now, there is some online chat with alumni, but it’s in New York City,
109
00:05:35,000 --> 00:05:37,000
and there’s very little remote component.
110
00:05:37,500 --> 00:05:42,500
And possibly most importantly, they’re willing to exclude.
111
00:05:42,500 --> 00:05:48,500
Hacker School has a selection process, and not everyone gets in.
112
00:05:49,500 --> 00:05:55,000
They aren’t that big on you having to be some kind of hotshot programmer.
113
00:05:55,000 --> 00:05:57,000
They are totally fine with letting in people who have,
114
00:05:57,000 --> 00:05:59,500
let’s say, two months of programming experience,
115
00:05:59,500 --> 00:06:01,500
or twenty years of programming experience.
116
00:06:01,500 --> 00:06:04,250
In my batch, there were two PhDs.
117
00:06:04,250 --> 00:06:11,000
They balance having that basic skills check with valuing heterogeneity,
118
00:06:11,000 --> 00:06:16,000
but there is a selection process, and I’ll talk more about what they’re checking for later.
119
00:06:16,710 --> 00:06:22,860
But it's also like us. Hacker School appeals to me
in much the same way that Wikimedia does,
120
00:06:22,860 --> 00:06:29,860
in that it's experimental and it's very consensus-driven.
There's a lot of ad-hocracy. Many things have
121
00:06:29,900 --> 00:06:36,360
changed from month to month and year to year
at Hacker School, iteratively and experimentally,
122
00:06:36,360 --> 00:06:40,180
and there are different people on different
projects and everyone has something to learn
123
00:06:40,180 --> 00:06:44,819
and teach each other. It's an ad-hocracy,
and a do-ocracy, as you might be familiar
124
00:06:44,819 --> 00:06:51,819
with, and there's a balance between peer mentoring
and paid help. And this probably sounds familiar
125
00:06:52,960 --> 00:06:59,879
to you: over the 10-plus years of the Wikimedia
movement, we've grown to have that. The vast
126
00:06:59,879 --> 00:07:05,810
majority of the stuff that's happening is
being done ad hoc, by volunteers, people helping
127
00:07:05,810 --> 00:07:11,960
each other, but we agree there are some things
it's really useful for paid people to do.
128
00:07:11,960 --> 00:07:17,580
At Hacker School there are five paid facilitators,
who do things like helping people set and
129
00:07:17,580 --> 00:07:21,449
reach their learning goals, and directing
them to resources, and starting them off with
130
00:07:21,449 --> 00:07:28,449
things like pair programming. And there are
diversity goals at Hacker School, and that
131
00:07:28,659 --> 00:07:33,000
feels familiar to you, probably. You've seen
that in the past several years Wikimedia,
132
00:07:33,000 --> 00:07:38,569
the movement, has set various diversity goals
and tried various ways to achieve them. And
133
00:07:38,569 --> 00:07:45,289
there's a lot of transparency and open source
at Hacker School. For instance, their user
134
00:07:45,289 --> 00:07:49,590
manual is public. You can just go to hackerschool.com
and read it.
135
00:07:49,590 --> 00:07:53,099
(to photographer) Any time I see a camera pointed
at me, I start making more hand gestures,
136
00:07:53,099 --> 00:07:54,629
I hope you don't mind.
137
00:07:54,629 --> 00:07:55,449
(laughter)
138
00:07:55,449 --> 00:08:02,449
Camera Operator: Don't worry; I'll make you
look beautiful!
139
00:08:02,849 --> 00:08:09,439
Sumana: Make me look smart, that's more important.
140
00:08:09,439 --> 00:08:10,259
(cheers)
141
00:08:10,259 --> 00:08:16,560
In fact, a lot of Hacker Schoolers do their
work either by making small open source projects
142
00:08:16,560 --> 00:08:22,580
or contributing to them, and so it felt very
familiar to me in various ways, and I think
143
00:08:22,580 --> 00:08:28,000
we can learn from how they're achieving certain
goals that we couldn't, necessarily.
144
00:08:28,000 --> 00:08:33,010
So let's look at how the environment is set
up at Hacker School, and see what we can borrow.
145
00:08:33,010 --> 00:08:38,040
And here, as I talk about how people learn,
I'm going to be taking bits and pieces from
146
00:08:38,040 --> 00:08:43,510
cognitive apprenticeship theory, from a book
called How Learning Works, and various other
147
00:08:43,510 --> 00:08:50,510
citations that I already put on my blog earlier
today at harihareswara.net, and I'll be tweeting
148
00:08:51,090 --> 00:08:54,650
and denting and linking to later.
[Section: How we learn]
149
00:08:54,650 --> 00:09:00,080
People learn differently, for one thing. I
mean, really, different people learn incredibly
150
00:09:00,080 --> 00:09:06,850
differently. If you were making shoes, you
would have to make more than one size.
151
00:09:06,850 --> 00:09:11,330
If you were making almost anything, you would
have to think about that. And different people
152
00:09:11,330 --> 00:09:18,330
really learn incredibly differently. For instance,
the Felder-Silverman Engineering Learning
153
00:09:18,340 --> 00:09:25,340
Styles model suggests four main axes along which
people vary. Either style is fine, but if
154
00:09:26,460 --> 00:09:30,780
you can tailor your teaching
to someone's style, they'll learn even better.
155
00:09:30,780 --> 00:09:34,560
Some people are more active learners: they
want to bump into things and make mistakes,
156
00:09:34,560 --> 00:09:37,820
and that's how they learn. Some people are
more reflective learners: they prefer to look
157
00:09:37,820 --> 00:09:42,970
at a map first. Some people are more sensing
learners: they prefer concrete examples. Some
158
00:09:42,970 --> 00:09:47,730
people are more intuitive learners, and they
like to look at patterns and think about how
159
00:09:47,730 --> 00:09:53,670
things connect together. Visual versus verbal:
do people want to make diagrams, or read or
160
00:09:53,670 --> 00:09:59,270
listen to words? And sequential and global:
does your learning proceed along LEGO brick
161
00:09:59,270 --> 00:10:04,830
by LEGO brick, or do you sort of fall in and
then have a big epiphany a little bit later?
162
00:10:04,830 --> 00:10:10,030
I have changed the way that I teach and the
way that I mentor and the way that I even
163
00:10:10,030 --> 00:10:15,630
answer little questions
in IRC, based on knowing this. My mentee
164
00:10:15,630 --> 00:10:21,530
Frances is here, for the Outreach Program
for Women, and as she was writing her application
165
00:10:21,530 --> 00:10:25,710
for her internship, I basically said "Would
you mind taking this assessment so that I
166
00:10:25,710 --> 00:10:29,690
can know how you learn so I can teach you
better?" And things have gone a lot better,
167
00:10:29,690 --> 00:10:34,430
because I'm not going to send her to videos,
because she hates videos! And interpersonal
168
00:10:34,430 --> 00:10:40,060
conflict actually goes a lot smoother, because
sussing out how other people are learning,
169
00:10:40,060 --> 00:10:44,840
and how they're listening to me, and maybe
what their mental model is that's different
170
00:10:44,840 --> 00:10:48,990
from mine, makes a massive difference. It
decreases friction and increases the productivity
171
00:10:48,990 --> 00:10:51,500
of that encounter.
172
00:10:51,500 --> 00:10:56,700
Another way that people learn is through legitimate
peripheral participation, which is quite a
173
00:10:56,700 --> 00:11:01,100
mouthful. If there are any members of Wikimedia
Foundation's Growth Team here they might be
174
00:11:01,100 --> 00:11:06,310
kind of bouncing up and down, because that's
a lot of what they help enable. The idea of
175
00:11:06,310 --> 00:11:12,870
legitimate peripheral participation is "Here's
a main, hard, complex activity that only the
176
00:11:12,870 --> 00:11:18,290
most expert people in a community can do"
but sort of coming out in ripples from that
177
00:11:18,290 --> 00:11:23,740
are smaller, less complex, easier to learn
tasks, where if people do the easier to learn
178
00:11:23,740 --> 00:11:30,290
tasks first while they can look over the shoulders
of the experts, they'll learn more. You know,
179
00:11:30,290 --> 00:11:35,920
if you're sweeping up sawdust in a woodworking
shop and then learning to measure while you
180
00:11:35,920 --> 00:11:40,830
watch other people doing the really complicated
cuts, you'll learn more. You'll see how long
181
00:11:40,830 --> 00:11:46,290
things take, what the rhythm is, what kinds
of decisions you have to make. And the Growth
182
00:11:46,290 --> 00:11:50,850
Team has been helping people find legitimate
peripheral participation in editing through
183
00:11:50,850 --> 00:11:57,390
things like typo fixes. And it seems to me
like mobile editing, and increasing people's
184
00:11:57,390 --> 00:12:03,810
ability to do quick photo uploads to Commons
and add to Wikidata are very similar.
185
00:12:03,810 --> 00:12:10,300
One thing that I think we can learn from legitimate
peripheral participation, as that idea, is
186
00:12:10,300 --> 00:12:17,300
-- do we actually have good pathways for people
to do that in other parts of Wikimedia, the
187
00:12:17,850 --> 00:12:24,850
more social ways? Like being on the Grants
Committee, or WikiProjects, and other more
188
00:12:25,210 --> 00:12:31,190
complicated forms of contribution that involve
more interpersonal interaction. We can redesign
189
00:12:31,190 --> 00:12:36,140
tasks to reduce the cognitive load on learners
so they can focus on key aspects of the task
190
00:12:36,140 --> 00:12:41,500
they are deliberately practicing. There's
a summary of this process in the How Learning
191
00:12:41,500 --> 00:12:42,260
Works book.
192
00:12:42,260 --> 00:12:46,620
To become self-directed learners, students
must learn to assess the demands of the task,
193
00:12:46,620 --> 00:12:50,350
evaluate their own knowledge and skills, plan
their approach, monitor their progress, and
194
00:12:50,350 --> 00:12:56,620
adjust their strategies as needed. And when
you're doing something you're an expert at,
195
00:12:56,620 --> 00:13:01,770
you do that without thinking about it too
much, but helping people get that into their
196
00:13:01,770 --> 00:13:06,140
muscle memory is a process in itself, and
it's one worth designing.
197
00:13:06,140 --> 00:13:10,050
My friend Mel Chua, who's a Wikipedian and
an education researcher, once summarized these
198
00:13:10,050 --> 00:13:16,070
three lessons as super important for us to
learn, as: one, learning is designable like
199
00:13:16,070 --> 00:13:22,600
code; two, our brains are snowflakes, we learn
differently; and three, we do not function
200
00:13:22,600 --> 00:13:27,970
standalone, we learn in communities. So I
lived through all these lessons in the autumn,
201
00:13:27,970 --> 00:13:28,950
at Hacker School.
202
00:13:28,950 --> 00:13:34,120
I thought about how I learn. I learned a lot
about how I learn. I use little rituals. I
203
00:13:34,120 --> 00:13:41,120
listen to certain music -- for me, it's the
Tron Legacy soundtrack -- or I take a break
204
00:13:42,320 --> 00:13:47,630
every 90 minutes. I learn best by setting
small goals for myself, to combine textbook
205
00:13:47,630 --> 00:13:54,630
learning with making little apps or websites.
And I learn with and from others. It is important
206
00:13:54,670 --> 00:13:59,760
for me to be around other people in person
and online, so that I can learn from them
207
00:13:59,760 --> 00:14:04,260
and I can teach them. For me, it has to be
reciprocal. And it worked! I learned a lot
208
00:14:04,260 --> 00:14:09,440
at Hacker School. I came in a dabbler and
I came out a much better programmer.
209
00:14:09,440 --> 00:14:12,450
One reason is Hacker School was a place I
could show up and I could ask what the hell
210
00:14:12,450 --> 00:14:17,990
an array was, and someone would help me and
give me an answer. I could have looked up
211
00:14:17,990 --> 00:14:24,810
individual definitions on my own, but conversation
was part of what helped me build the conceptual
212
00:14:24,810 --> 00:14:30,240
models for those definitions to fit into.
So you might think about the next time someone
213
00:14:30,240 --> 00:14:37,100
asks a question: answering the question
is partly about figuring out what their conceptual
214
00:14:37,100 --> 00:14:42,480
model is, so you can help them build it.
215
00:14:42,480 --> 00:14:49,480
And nothing is magic. That's
something that all of us sometimes have to
216
00:14:50,500 --> 00:14:54,040
remember: that thing that someone
else is doing that seems impossibly hard,
217
00:14:54,040 --> 00:14:59,320
or the thing someone else knows that's full
of all this jargon -- if we try different
218
00:14:59,320 --> 00:15:05,760
ways, if we ask the right questions and set
up nurturing learning environments, we can
219
00:15:05,760 --> 00:15:11,110
learn it. It's not magic. And I didn't have
that belief before. I think I came into Hacker
220
00:15:11,110 --> 00:15:16,140
School a little bit afraid of certain buzzwords,
as though they were just impossibly hard.
221
00:15:16,140 --> 00:15:20,510
And that was a change for me. You may have
heard of Carol Dweck's research on the fixed
222
00:15:20,510 --> 00:15:24,670
versus the growth models of how we look at
the world and learning. The short summary
223
00:15:24,670 --> 00:15:30,560
is, if you believe that talent is nurture
and practice, then you'll grow. But if you
224
00:15:30,560 --> 00:15:37,560
believe that some people are just good at
X, if it's inborn or nature, then you won't
225
00:15:38,120 --> 00:15:44,610
learn. And I was able to learn at Hacker School
because it was safe to fail. If you're going
226
00:15:44,610 --> 00:15:48,640
to try things, you're going to fail sometimes,
you're going to make mistakes in front of
227
00:15:48,640 --> 00:15:55,420
other people, and people learn a lot slower
if they're afraid to fail, if they're slower
228
00:15:55,420 --> 00:15:56,540
to ask questions.
[Section: The No Asshole Zone]
229
00:15:56,540 --> 00:16:00,880
So how does that work? How do you make people
feel more okay about working in public, which
230
00:16:00,880 --> 00:16:07,880
includes sometimes failing or showing ignorance?
Well, a No Asshole zone really helps.
231
00:16:08,720 --> 00:16:10,850
(laughter)
232
00:16:10,850 --> 00:16:14,870
So remember when I talked about the selection
process? Part of the interview and admissions
233
00:16:14,870 --> 00:16:19,510
process was a pair programming interview where
you tried to solve a small programming problem
234
00:16:19,510 --> 00:16:24,050
over the internet, and the main point was
not "How good are you as a programmer?" It's
235
00:16:24,050 --> 00:16:28,870
"How well do you deal with frustration, and
do you turn into a jerk when you're trying
236
00:16:28,870 --> 00:16:33,180
to solve a problem with someone else or teach
someone something?" 'Cause it's kind of hard
237
00:16:33,180 --> 00:16:38,070
to really keep the jerkitude inside, I think,
when you're, like, a little bit frustrated
238
00:16:38,070 --> 00:16:42,810
and you're trying to work with somebody for
that. And those people got rejected. It was
239
00:16:42,810 --> 00:16:47,300
amazing what a pleasure it was to be in a
room with 58 other people, all of whom had
240
00:16:47,300 --> 00:16:52,620
specifically been chosen for their ability
to collaborate with others.
241
00:16:52,620 --> 00:16:59,620
Also, to keep us from accidentally discouraging
other people from doing the things they need
242
00:17:01,660 --> 00:17:06,399
to do to learn, at Hacker School there are
four social rules. These are social rules
243
00:17:06,399 --> 00:17:13,399
to help everyone feel okay with failure and
ignorance. No feigned surprise. No well-actuallys.
244
00:17:13,500 --> 00:17:19,189
No back-seat driving. And no sexism, racism,
homophobia, and so on. Now, the user manual,
245
00:17:19,189 --> 00:17:22,069
which is available online, does a great job
explaining all these, and I'm going to talk
246
00:17:22,069 --> 00:17:25,289
about the first two, because they're most
important for our context.
247
00:17:25,289 --> 00:17:31,519
Feigning surprise. When someone says "I don't
know what X is", you don't say "You don't
248
00:17:31,519 --> 00:17:34,980
know what X is?!" or "I can't believe you
don't know what X is!" Because that's just
249
00:17:34,980 --> 00:17:38,690
a dominance display. That's grandstanding.
That makes the other person feel a little
250
00:17:38,690 --> 00:17:43,190
bit bad and makes them less likely to show
you vulnerability in the future. It makes
251
00:17:43,190 --> 00:17:48,409
them more likely to go off and surround themselves
in a protective shell of seeming knowledge
252
00:17:48,409 --> 00:17:51,960
before ever contacting you again.
253
00:17:51,960 --> 00:17:58,230
Well-actuallys. Those are the pedantic corrections
that don't make a difference to the conversation
254
00:17:58,230 --> 00:18:04,240
that's happening. Sometimes it's better to
err on the side of clarity rather than precision.
255
00:18:04,240 --> 00:18:11,240
Well-actuallys break that rule. You sometimes
see, when people actually start trying to
256
00:18:12,259 --> 00:18:19,110
take this rule in, that in a conversation,
if they have a correction, they struggle and
257
00:18:19,110 --> 00:18:24,070
think about it. Is it worth making? Is this
actually important enough to break the flow
258
00:18:24,070 --> 00:18:28,039
of what other people are learning and getting
out of this conversation. Kind of like I think
259
00:18:28,039 --> 00:18:34,250
we in Wikimedia world will say "This might
be bikeshedding but -". It's a way of seeing
260
00:18:34,250 --> 00:18:38,320
that this rule actually has soaked in.
261
00:18:38,320 --> 00:18:41,879
I think it's also important to ask:
how do these rules get enforced? Well, all
262
00:18:41,879 --> 00:18:48,879
of us felt empowered to say to anyone else,
quickly and a bit nonchalantly, "Hey, that
263
00:18:49,960 --> 00:18:53,649
was a well-actually," or "That's kind of feigned
surprise, don't you think?" And the other
264
00:18:53,649 --> 00:19:00,649
person said sorry, and moved on. I can't tell
you how freeing it felt that first week, to
265
00:19:01,370 --> 00:19:07,049
say "I don't know" a million times. Because
I had been trained not to display ignorance
266
00:19:07,049 --> 00:19:09,250
for fear of being told I didn't belong.
267
00:19:09,250 --> 00:19:15,529
We have the four social rules up on the wall,
framed, at Hacker School, and sometimes people
268
00:19:15,529 --> 00:19:21,049
will, while referencing them, unconsciously
turn their bodies towards them, because they're
269
00:19:21,049 --> 00:19:25,210
that much a part of our core values. If you don't
understand why something you did broke the
270
00:19:25,210 --> 00:19:28,850
rules, you don't ask the person who corrected
you. You ask a facilitator. You ask someone
271
00:19:28,850 --> 00:19:32,940
who's paid to do that emotional labor, and
you don't bring everyone else's work to a
272
00:19:32,940 --> 00:19:39,940
screeching halt. This might sound a little
bit foreign to some of us right now. Being
273
00:19:40,590 --> 00:19:47,590
able to ask someone to stop doing the thing
that's harming everyone else's work and knowing
274
00:19:47,990 --> 00:19:53,419
that it will actually stop and that there's
someone else who's paid to do that emotional
275
00:19:53,419 --> 00:19:57,210
labor who will take care of any conversation
that needs to happen.
276
00:19:57,210 --> 00:19:59,850
[Section: Vulnerability]
277
00:19:59,850 --> 00:20:04,899
Community management is a first-class responsibility
at Hacker School. Every one of the five facilitators
278
00:20:04,899 --> 00:20:08,509
who are the only employees at Hacker School
are community managers. That's a big part
279
00:20:08,509 --> 00:20:13,240
of their job. They will help with any kind
of problem, including what's going on in the brains of everyone
280
00:20:13,240 --> 00:20:18,600
trying to work, and including helping you
with having a bad day, with your emotional
281
00:20:18,600 --> 00:20:23,029
fears and anxieties as well. Emotions are
first class citizens. If you are having a
282
00:20:23,029 --> 00:20:27,279
bad day, if you are worried about being good
enough, if you find it demoralizing that someone
283
00:20:27,279 --> 00:20:31,940
told you that you were wasting your time trying to
contribute to some open source project, you
284
00:20:31,940 --> 00:20:37,700
are not weak for having these problems and
talking about them openly. If we are publicly
285
00:20:37,700 --> 00:20:42,789
vulnerable then we can also help each other.
286
00:20:42,789 --> 00:20:46,370
Speaking about being vulnerable, now's when
I talk about what it's like to be a woman
287
00:20:46,370 --> 00:20:53,370
in Wikimedia. Especially when I'm the only
woman in the room. There's an xkcd about what
288
00:20:57,090 --> 00:21:01,470
it feels like to be the only woman in the
room. That's number 385, for those of you
289
00:21:01,470 --> 00:21:07,600
who do XKCD by number. A guy makes a mistake
solving a math problem, and another guy says,
290
00:21:07,600 --> 00:21:11,490
"Wow, you suck at math." A woman makes the
same mistake, and a guy says, "Wow, girls
291
00:21:11,490 --> 00:21:14,019
suck at math."
292
00:21:14,019 --> 00:21:20,190
Which happens to me -- I feel that way, I
worry about that -- a fair amount in Wikimedia
293
00:21:20,190 --> 00:21:27,190
world. Especially in the engineering spaces,
where I spend most of my time. The Zurich
294
00:21:27,940 --> 00:21:33,279
hackathon was 14% women, I believe, earlier
this month, and that was the most I've ever
295
00:21:33,279 --> 00:21:35,350
seen [author's note: meaning, the most I've
seen at a Wikimedia hackathon]. It was amazing
296
00:21:35,350 --> 00:21:42,350
to not always be the only woman in the room.
But at Hacker School there were 42% women
297
00:21:42,450 --> 00:21:47,029
in my batch. There were dozens of other women.
It made a tremendous difference to me. I didn't
298
00:21:47,029 --> 00:21:53,379
know all the women's names at the end of the
first week! That was amazing to me! And there
299
00:21:53,379 --> 00:21:58,370
were so many different kinds of women, as
there were different kinds of people. Some
300
00:21:58,370 --> 00:22:03,440
of them had been programming for two months,
some for twenty years, doing things like kernel hacking.
301
00:22:03,440 --> 00:22:10,399
Some of them were interested in back-end development,
in machine learning, in visualization, various
302
00:22:10,399 --> 00:22:14,309
different kinds of things. No matter who I
wanted to become, there was someone who looked
303
00:22:14,309 --> 00:22:19,379
like me. There was someone who could talk
to me in my register. And we had conversations
304
00:22:19,379 --> 00:22:23,789
with everybody, but because half the people
were women, half my conversations were with
305
00:22:23,789 --> 00:22:27,809
women, and if I failed at something, it was
very unlikely that I was carrying the banner
306
00:22:27,809 --> 00:22:32,110
for all womankind.
307
00:22:32,110 --> 00:22:36,759
The How Learning Works book points out that
we have known about stereotype threat since
308
00:22:36,759 --> 00:22:43,759
1995. We have known that if you point out
to members of a marginalized group "Hey, hi
309
00:22:46,539 --> 00:22:51,070
there Member of Marginalized Group, did you
know you're marginalized here?", that's going
310
00:22:51,070 --> 00:22:56,220
to decrease performance and their willingness
to be vulnerable. And there are different
311
00:22:56,220 --> 00:23:01,990
kinds of accepting climates for marginalized
groups. DeSurra and Church, also in How Learning
312
00:23:01,990 --> 00:23:08,990
Works, talk about the climate for people who
are LGBT. A community might be explicitly
313
00:23:10,249 --> 00:23:16,679
marginalizing, overtly discriminatory; implicitly
marginalizing, subtly excluding certain groups;
314
00:23:16,679 --> 00:23:21,399
implicitly centralizing, welcoming of alternate
perspectives, can validate them, but it's
315
00:23:21,399 --> 00:23:27,440
on the minority group still to bring the topic
up even though it's okay when they do. And
316
00:23:27,440 --> 00:23:32,710
then there's explicitly centralizing the alternate
perspective. Bringing up and welcoming alternate
317
00:23:32,710 --> 00:23:37,610
perspectives without those minority students
needing to do that work. A teacher, for instance,
318
00:23:37,610 --> 00:23:41,789
bringing it up in syllabi, in the first discussion.
319
00:23:41,789 --> 00:23:48,779
In the book Women's Ways of Knowing around
community confirmation, there's also an observation
320
00:23:48,779 --> 00:23:55,580
that some groups, especially many women, find
that confirmation and community are prerequisites
321
00:23:55,580 --> 00:24:00,929
rather than consequences of learning certain
hard things. When you look at how our editathons
322
00:24:00,929 --> 00:24:05,809
have been able to increase attendance by women
by starting with the social aspect, I think
323
00:24:05,809 --> 00:24:10,179
you can see how this plays out in practice.
Software Carpentry, another learning outreach
324
00:24:10,179 --> 00:24:14,990
project, has also been able to increase attendance
by underrepresented groups at their bootcamps
325
00:24:14,990 --> 00:24:18,779
by suggesting that people bring their friends.
[Section: Liberty and hospitality]
326
00:24:18,779 --> 00:24:25,399
So Hacker School provided a relaxing learning
community for me where I could fail safely
327
00:24:25,399 --> 00:24:32,399
and I had role models. It was great. I learned
a lot. And then in January, when I came back
328
00:24:33,639 --> 00:24:38,759
to work, I felt like a fish who had taken
a three-month break from the water she swims
329
00:24:38,759 --> 00:24:45,759
in, and wow, it was demoralizing. It is -- we
have demoralizing people in the Wikimedia
330
00:24:50,039 --> 00:24:54,539
community, and we have some demoralizing processes
in place, and some of us have gotten used
331
00:24:54,539 --> 00:24:58,590
to it, but then there's the people who are
leaving or who are thinking of leaving, or
332
00:24:58,590 --> 00:25:04,330
who never even come in. It's super demoralizing
to be in a world where some people seem to
333
00:25:04,330 --> 00:25:09,210
follow the opposite of those four social rules,
like those are the key tactics in how they
334
00:25:09,210 --> 00:25:11,679
relate to others.
335
00:25:11,679 --> 00:25:18,499
I was able to articulate this to myself
as the spectrum of liberty versus hospitality.
336
00:25:18,499 --> 00:25:24,470
The Wikimedia movement really privileges liberty,
way over hospitality. And for many people
337
00:25:24,470 --> 00:25:28,840
in the Wikimedia movement, free speech, as
John Scalzi put it, is the ability to be a
338
00:25:28,840 --> 00:25:35,840
dick in every possible circumstance. Criticize
others in any words we like, change each other's
339
00:25:37,139 --> 00:25:41,159
words, and do anything that is not legally
prohibited.
340
00:25:41,159 --> 00:25:47,919
Hospitality, on the other hand, is thinking
more about right speech, just speech, useful
341
00:25:47,919 --> 00:25:53,519
speech, and compassion. We only say and do
things that help each other. The first responsibility
342
00:25:53,519 --> 00:25:59,399
of every citizen is to help each other achieve
our goals, and make each other happy.
343
00:25:59,399 --> 00:26:03,330
I think these two views exist on a spectrum,
and we are way over to one side, and moving
344
00:26:03,330 --> 00:26:09,379
closer to the middle would help everyone learn
better and would help us keep and grow our
345
00:26:09,379 --> 00:26:12,470
contributor base.
[Section: What we should do]
346
00:26:12,470 --> 00:26:18,080
So what should we do? Well, I'm going to point
to a few sets of recommendations now, at a
347
00:26:18,080 --> 00:26:24,350
very high level, and only talk about a few
of them. And, as I mentioned, there's a bunch
348
00:26:24,350 --> 00:26:31,210
of links on my blog at this very moment that
I'll also be linking around.
349
00:26:31,210 --> 00:26:36,639
There are recommendations from Ada Initiative's
Valerie Aurora, in her session at the Wikimedia
350
00:26:36,639 --> 00:26:41,869
Diversity Conference in October. The slides
and notes from that session are up. And one
351
00:26:41,869 --> 00:26:48,480
of them is to think carefully about what we
do in super-public spaces versus how we act
352
00:26:48,480 --> 00:26:54,980
in invite-only space or quite private spaces,
and to think about what those spaces are.
353
00:26:54,980 --> 00:27:01,980
I think of the spaces that are more secret
or private as places where certain people
354
00:27:02,259 --> 00:27:07,990
can sort of rest and vent and collaborate,
and ask the questions they feel afraid of
355
00:27:07,990 --> 00:27:12,509
asking in public, so they can gain the strength
and confidence to go further out, into the
356
00:27:12,509 --> 00:27:19,509
invite-only spaces or the very public spaces.
I think we've seen this in my own experience
357
00:27:20,690 --> 00:27:27,669
at Hacker School, and we see also that the
invite-only spaces, or spaces where everybody
358
00:27:27,669 --> 00:27:33,679
coming in agrees to follow the same rules
so it's a place where you feel safer -- these
359
00:27:33,679 --> 00:27:39,090
are like tidepools, places where certain kinds
of people and certain kinds of behaviour can
360
00:27:39,090 --> 00:27:44,499
be nurtured and grown so that it's ready to
go out into the wider ocean.
361
00:27:44,499 --> 00:27:51,499
We can also modify existing spaces. We can
set up informal but real contracts or promises
362
00:27:53,690 --> 00:27:59,679
with specific people or in specific larger
spaces. I've done this. I've said "Hey, for
363
00:27:59,679 --> 00:28:04,299
this conversation -- I know in the past we've
had trouble assuming good faith of each other.
364
00:28:04,299 --> 00:28:07,779
Will you try -- I will try extra hard to assume
good faith of you if you'll assume good faith
365
00:28:07,779 --> 00:28:13,779
of me." And that actually made things go a
lot better.
366
00:28:13,779 --> 00:28:20,450
Valuing hospitality: another thing I'd like
us to do. When someone is criticized for doing
367
00:28:20,450 --> 00:28:26,960
something inhospitable, the first response
needs to not be "Oh, but remember their edit
368
00:28:26,960 --> 00:28:32,639
count. Remember he's done X or she's done
Y for this community." We need to start treating
369
00:28:32,639 --> 00:28:38,460
hospitality as a first class virtue, and see
that it is the seed of everything else. Alberto
370
00:28:38,460 --> 00:28:43,739
Brandolini said "The amount of energy necessary
to refute bullshit is an order of magnitude
371
00:28:43,739 --> 00:28:50,739
bigger than to produce it." It has a big cost
when someone treats others badly. If someone
372
00:28:52,769 --> 00:28:57,119
is ruining the hospitality of a place by using
their liberty in a certain way, we need to
373
00:28:57,119 --> 00:29:04,119
stop making excuses, and start on the path
of exclusion. If we exclude no one explicitly,
374
00:29:04,450 --> 00:29:10,499
we are just excluding a lot of people implicitly.
Including people like me.
375
00:29:10,499 --> 00:29:16,720
The Hacker School social rules. I personally
have started following them all the time,
376
00:29:16,720 --> 00:29:20,840
not just at Hacker School, and I'd encourage
you to consider doing the same. Maybe we could
377
00:29:20,840 --> 00:29:22,570
even get userboxes!
378
00:29:22,570 --> 00:29:23,629
(laughter)
379
00:29:23,629 --> 00:29:29,989
Is that how everything changes, right? In
Wikimedia? Just get the right userbox.
380
00:29:29,989 --> 00:29:36,989
Also let's go to Karen Sandler's presentation
this afternoon, here at Wiki Conference USA,
381
00:29:38,179 --> 00:29:43,419
called "Bringing More Women to Free Software:
What's Working for Us". That's about another
382
00:29:43,419 --> 00:29:48,879
different kind of learning environment that's
more like Wikimedia in some ways than Hacker
383
00:29:48,879 --> 00:29:52,700
School is. The Outreach Program for Women
is a mentorship and learning project that
384
00:29:52,700 --> 00:29:57,980
MediaWiki, Wikidata, and a number of other
aligned groups and organizations participate
385
00:29:57,980 --> 00:30:04,980
in. So let's go over there and talk and learn
more! And I'm also happy to talk more about
386
00:30:05,080 --> 00:30:11,100
this stuff in the hallways, and during the
Unconference day. Possibly I'll offer you
387
00:30:11,100 --> 00:30:13,009
my hand sanitizer, because I am still a little
bit sick.
388
00:30:13,009 --> 00:30:19,080
And any time that you're talking with someone
who knows less than you about something, you
389
00:30:19,080 --> 00:30:23,950
have an opportunity to teach them. And this
includes in those on-wiki conversations, in
390
00:30:23,950 --> 00:30:30,950
IRC, in the bug tracker, in RT, in the outreach
-- behind the scenes on those outreach days,
391
00:30:32,169 --> 00:30:32,570
everything!
392
00:30:32,570 --> 00:30:38,269
There are three really important things that
will help you teach them. Learning is designable
393
00:30:38,269 --> 00:30:44,249
like code. Our brains are snowflakes; different
people learn differently. And we do not function
394
00:30:44,249 --> 00:30:49,980
stand-alone. We learn in communities. We learn
from and with each other. And we're all doing
395
00:30:49,980 --> 00:30:56,980
this together. Thank you.