Miller test

Why bother with the Miller test text? I believe it's an unhelpful tangent that just invites people to play amateur lawyer.

The simple fact of the matter is that if we succeed at fulfilling COM:SCOPE, in that the purpose of the work on Commons is _educational_, then we will have prima facie satisfaction of the Miller test. I am aware of no example where any work with a case for educational or scientific value (as required by COM:SCOPE) has even been _argued_ to fail this test. It makes sense to refer to the effective law when the law is more restrictive than what we might otherwise allow, since in those cases it is the substance of the law that is limiting our policy. But where our own behaviour is more restrictive than the law, going deeply into the legal issues just invites confusion, as it will encourage people to try to rules-lawyer the law in order to allow or forbid something. Rules-lawyering is less problematic over Commons policies because the Commons community is able to shut a rules lawyer down by saying "No, that's not what the policy means, here— I'll go fix that ambiguous wording for you", but we have no such ability over US statutes or court precedents.

Moreover, I don't think we want to invite a lot of speculative interpretations of the law. If someone in an ultra-conservative town went to some podunk kangaroo court and tried to bring an obscenity claim against Wikimedia or its contributors (say, perhaps, over some religious iconography that they found objectionable), we'd probably begin with a hard-line argument that our educational and scholarly mission provides an absolute protection against the application of obscenity law. I wouldn't want to see random uninformed speculation on Commons policy pages about the implications of the law used as a weapon to make that kind of legal fight harder or more costly. So for that reason we really should avoid going into the law except where the law is defining our behaviour.

Thoughts? --Gmaxwell (talk) 19:54, 7 July 2010 (UTC)

The main purpose of this section is not to invite users to nominate files for deletion under obscenity law, but actually to avert spurious nominations under obscenity law. The idea is, if a user goes into legal panic mode over an image for obscenity, that panic will be defused if they are forced to calmly evaluate the image under the Miller test, which every image on Commons really ought to pass. Admittedly, the frustrating inability of the courts to make up their mind regarding the community standards of an international website makes this far less straightforward than it ought to be. A secondary justification for this section would be to address generalized concerns of critics that Commons "hosts obscene content." An alternative strategy would be to omit any mention of obscenity here, and if anyone ever does nominate an image for deletion for obscenity we bring out the test then and explain why their legal concern is inapplicable. Dcoetzee (talk) 20:14, 7 July 2010 (UTC)
I'd obviously prefer the last strategy of yours, though a note along the lines of "Adherence to the non-commercial educational mission of our project scope is also important because in the United States, where our servers are hosted, works with serious literary, artistic, political, or scientific value may not be prohibited by any obscenity law because they enjoy First Amendment protection[1]." might address the panic case you're worried about? --Gmaxwell (talk) 20:43, 7 July 2010 (UTC)
  • If the Commons were to be held to obscenity standards I would be confused as to how they would select a "local" jury. I guess they could pick people around whatever building hosts the servers, but since it's a web site that's pretty much ignoring the rest of the state/nation/warudo which also looks at it. This is why it only seems relevant to something that is actually printed and displayed in public as opposed to being on a computer screen. TY© (talk) 06:26, 8 July 2010 (UTC)
  • They'd find a jury wherever they thought they could get a conviction. Since we actively reach throughout the country, they could claim whatever jurisdiction they find convenient.--Prosfilaes (talk) 10:33, 8 July 2010 (UTC)
The difficulty in finding a "local" jury is commented upon in writings on the Miller test. As far as I recall, the prevalent interpretation of "local community standards" in the case of internet offerings is "national community standards". --JN466 16:37, 8 July 2010 (UTC)
I've taken Gmaxwell's suggestions and tried to dramatically shorten the obscenity section while keeping it reasonably accurate. Please take a look at the current text and offer feedback. Dcoetzee (talk) 22:14, 8 July 2010 (UTC)
That's a great improvement. I was going to suggest removing the full text of the Miller Farce Test myself but wasn't sure how that would go down. In particular, I think the statement "There is an open legal question about the community standards which are used in applying these tests, which we do not attempt to resolve here. Because of the complex, subjective nature of obscenity law, no work should be speedy deleted on the basis of perceived obscenity, or based on 18 U.S.C. § 1466A" is to be applauded. "Subjective" is the word (in the eyes of some people even pictures of women in bikinis are "obscene"). Anatiomaros (talk) 23:13, 8 July 2010 (UTC)
Looks reasonable to me. I think it's a clear improvement.--Gmaxwell (talk) 06:40, 9 July 2010 (UTC)
I think it's okay. Just for reference, what sort of support/oppose ratio are we looking for in polls for adoption? --JN466 20:26, 9 July 2010 (UTC)
Some people will oppose the adoption of such a policy entirely, and there's not much we can do about that, but I'd like, if possible, for us to get support from all other concerned users, and to address any major concerns people have, since a policy is very strong and affects a lot of things, and I think we're a small enough community to pull that off. Dcoetzee (talk) 21:35, 9 July 2010 (UTC)
Are there further actionable comments above that need to be addressed? --JN466 11:45, 11 July 2010 (UTC)

Consent for pictures taken at public events

What about pictures taken at events like the w:Folsom Street Fair, or at porn industry events? As we have seen, these may feature explicit sexual content, but they are public events. Should consent of the person depicted be required at the time of upload? Are personality rights an issue? --JN466 11:41, 11 July 2010 (UTC)

Well, the images there don't appear to be explicit, so I'd personally say that there is no expectation of privacy, and while it would be useful to PR tag, it's not necessary to get consent. -mattbuck (Talk) 15:34, 11 July 2010 (UTC)
See Category:Folsom_Street_Fair. That testicle cuff with weights image that we discussed for half a year came from that fair. As for porn shows, see File:Forced_Orgasm.jpg; similar: File:And_the_second_glass_must_be_filled.jpg. I am asking this because as presently written, this proposed policy would require consent at upload from the persons depicted. Now the question is, if it is a public event that allows photography, should we require consent or not? If we do, then such snaps are out. If we want to allow snaps like that, we have to tweak the "Sexual content uploaded without the consent of the participants" section and write an exception for public events into it. --JN466 23:10, 11 July 2010 (UTC)
My personal opinion would be that if there is no reasonable expectation of privacy then there is no need for consent for photos, as with any other image. -mattbuck (Talk) 23:33, 11 July 2010 (UTC)
I think this is covered in Commons:Photographs of identifiable people#What are 'public' and 'private' places?, part of the guideline referenced in the section on prohibited material. I don't think this policy should make any unique specifications beyond what that offers. Wnt (talk) 03:40, 13 July 2010 (UTC)

I've thought more about this and I think the right way to deal with this is to treat all photographs of identifiable persons which require consent equally - while sexual content frequently falls into the "requires consent" category, it does not always, and even when it leads to deletion, this can be contentious enough to require discussion, as in the testicle cuffs case. As such, I've revised to eliminate the speculative noconsent tag and advised a normal deletion request instead, where such matters can be duly discussed. I imagine in straightforward cases, the request will be closed quickly. Do other people like the current language? Dcoetzee (talk) 20:02, 20 July 2010 (UTC)

I believe it's appropriate to support at the very least an assertion of consent being required for all sexual content. Personally, I'd accept such an assertion solely on the basis that the picture was taken at a notable street event (like Folsom) - but not somewhere like a beach. I'll catch up and have more to say / maybe re-edit the page.... cheers, Privatemusings (talk) 09:52, 21 July 2010 (UTC)
I believe that Dcoetzee is correct in that we should treat all media containing identifiable persons equally. And seeing as Commons:Photographs of identifiable people#What are 'public' and 'private' places? is already sufficient to accomplish this, anything more than a reference to existing policy would simply be a discriminatory Trojan horse.   — C M B J   02:04, 22 July 2010 (UTC)
Which wouldn't seem to preclude protection for some classes of media featuring unidentifiable people, no? (i.e. pictures of a girl having sex doggy style which don't show her face - I believe we should require her consent to publish on this project) - on the general point, I do believe that it's unethical to publish photographs of people sunbathing topless without their consent. You may disagree, but I'm sure you'll respect that, and I'd ask you to give it some thought. Privatemusings (talk) 07:19, 22 July 2010 (UTC)
Nothing precludes anyone from supporting a change in our existing identifiable persons guideline, but doing so in a new proposal first is tantamount to forum shopping.   — C M B J   10:12, 31 July 2010 (UTC)
Why would the identifiable persons guideline say anything about unidentifiable persons? Sexual content seems like the perfect place to discuss consent for sexual content. --99of9 (talk) 10:22, 31 July 2010 (UTC)
The second half of his comment implied that nude, non-sexual photographs of identifiable persons should be handled differently by this proposal, even if the subject is in a public place as defined by existing policy.   — C M B J   19:25, 31 July 2010 (UTC)
Ok, I understand now. If stricter standards are being worked out here for sexual content, it still seems like a fine place to then ask how widely they should be applied. --99of9 (talk) 05:41, 1 August 2010 (UTC)
My original messages were composed in somewhat of a blitz; I did not mean to come off as sounding so harsh. There's nothing wrong with having a discussion here. But hypothetical implementations of certain restrictions would circumvent consensus building elsewhere.   — C M B J   08:46, 3 August 2010 (UTC)
I don't see any reason why someone sunbathing in a public place would require any more consent than anyone else in a public place.--Prosfilaes (talk) 17:42, 31 July 2010 (UTC)
The world being what it is, very little that is "explicitly sexual" is going to occur in public. Nudity is another matter, and I would hope that (at least in countries like the U.S. where the law is quite clear that one may take and publish photos of pretty much anything that happens in public) we are not going to even consider introducing consent requirements of our own that are not called for by the law. To take as an example a category that is mainly my work, consider Category:Solstice Cyclists. This is a very public event, much photographed and viewed by Seattleites of all ages; the handful of people who don't want to be identifiable wear masks; and most of them are naked, or nearly so, although they are also mostly wearing body paint. These should no more require evidence of individual consent than photos of a Memorial Day parade. - Jmabel ! talk 21:34, 31 July 2010 (UTC)
That's an interesting (and colourful!) example, thanks. Certainly those subjects imply a great deal more consent than a sunbather snapped with a long lens. Perhaps there is a delicate line that could be drawn around exhibitions? 99of9 (talk) 05:51, 1 August 2010 (UTC)

Purpose?

What's the purpose of the "Purpose" section? Is there a way that we can shorten it to, say, one sentence or less?  ;) Wnt (talk) 14:17, 15 July 2010 (UTC)

Mainly, to provide a broad overview of the intended audience (should I be reading this?) and what topics it will discuss. This helps people decide whether to read on or not bother. Dcoetzee (talk) 20:03, 20 July 2010 (UTC)

2010 Wikimedia Study of Controversial Content

Was this already linked? It's quite important, the WMF will give the community some help to clarify the issue and obviously they need help from the community. Nemo 01:35, 20 July 2010 (UTC)

Hello, I'm Robert Harris, the consultant who is conducting the above-mentioned study. I've been following your discussions here quite closely during the last month, but thought it inappropriate to intrude on this space while a vote was being held on the suggested new policy. But Nemo has captured the spirit of what I'm trying to do accurately -- to create a two-way street where the Foundation and the community can share insights on how to deal with the problems they're struggling with and you've been discussing. To that end, I've posted a series of questions to survey opinion and continue discussion throughout the Commons and Wikimedia communities on the Meta page which hosts the study (see link above). I'll be especially interested in the views of those of you who have contributed to (or followed along with) this discussion. Obviously, the issue of sexual content within Wikimedia is something to which you've given a lot of thought. Robertmharris (talk) 12:06, 22 July 2010 (UTC)
Thank you for your note Mr. Harris. I hope all participants here will accept your invitation. I certainly will. TheDJ (talk) 12:58, 22 July 2010 (UTC)

Featuring sexually explicit content on the Commons main page

I would like to see some discussion on the issue of featuring sexually explicit content on the Commons main page. When the German Wikipedia featured a photo of a vagina on their main page it caused a great deal of contention and disruption. Knowing the culture of Commons, I think we have the potential to do far worse than featuring a photo of a vagina. To me this is not an issue of censorship, but of making sure that our main page is welcoming to as wide a variety of contributors as possible (and avoiding more bad PR if possible). So my question is: Are there any types of images that we should discourage people from featuring on the main page of Commons? Kaldari (talk) 17:32, 22 July 2010 (UTC)

Are we assuming that the administrators have the brain worms? The people who put images on the front page of Commons need to have wisdom and discretion. I wouldn't put an image of Imam Ali on the front page on the 21st day of Ramadan in 2009, as putting an Islamic martyr on the front page on 9/11 wouldn't have been a good choice. There's lots of non-sexual examples like that. Instead of making up rules, why don't we choose people of good taste to put things on the front page and if they wish they can make up some guidelines?--Prosfilaes (talk) 17:47, 22 July 2010 (UTC)
Nudity/sexuality, violence, politics and religion. Possibly better avoid spiders too. -mattbuck (Talk) 17:47, 22 July 2010 (UTC)
Picture of the day (which is featured on the Main Page) is not restricted to admins. Anyone, even anonymous IPs, can add pictures to the POTD queue which then show up on the Main Page. @Mattbuck: I'm familiar with the slippery slope argument. I believe, however, that some basic common sense guidelines could be created which do not lead to us restricting spider photos from the Main Page. Kaldari (talk) 19:07, 22 July 2010 (UTC)
No, I really think that those should be left out. The first four for being generally offensive to lots of people, the last because close-ups of spiders are just creepy. -mattbuck (Talk) 21:46, 22 July 2010 (UTC)
POTD comes from Featured Pictures, which has its own standards. Restricting sexual content from featured pictures has been discussed above and seems to be a de-facto reality. I'm strongly opposed to any policy that fails to take into account that sex is not the only thing people find offensive.--Prosfilaes (talk) 20:20, 22 July 2010 (UTC)
Good point. How would you suggest that such a policy be worded to take that into account (but also avoiding the "slippery slope")? Kaldari (talk) 20:40, 22 July 2010 (UTC)
And FWIW, there is currently nothing preventing a well-shot hardcore pornographic image from becoming a featured image on Commons. We've already had numerous nude and sexually explicit pictures nominated.[2][3][4][5][6][7][8][9][10] In every case they have failed due to purely technical reasons. Why would it be any different for a hardcore image? The only reason it's a de-facto reality is because it hasn't happened yet. The situation is currently a catch-22 since the featured picture people say they are not responsible for what gets put on the Main Page, and the POTD people say they are not responsible for what gets featured picture status. So effectively, no one has responsibility for this issue. Kaldari (talk) 21:09, 22 July 2010 (UTC)
Your final link is to a successful nomination, not a failed one. There is also Commons:Featured picture candidates/Image:Anime Girl.svg, which is not only featured, but was a finalist for PotY 2008 and was Picture of the Day on July 13, 2009. Not very explicit, of course, but I could certainly see some people objecting to it. Did anyone? Powers (talk) 13:37, 23 July 2010 (UTC)
I'm not sure, but I do know that people complained when we featured the Michele Merkin photo on the main page (which was also not very explicit). Kaldari (talk) 18:43, 23 July 2010 (UTC)
Well, I guess that just goes to show someone will complain about nearly anything. My preference would be to not engage in the futility of trying to appease everyone. Powers (talk) 00:27, 24 July 2010 (UTC)
I'm not trying to appease everyone. I would just like to see Commons be a place that people from any culture, age-group, background, etc. can feel comfortable using. Obviously if we put Goatse on the main page that would probably drive some people away. Keeping Goatse off the main page isn't censorship or "trying to appease everyone", it's just common sense. Similarly, keeping other images that would offend thousands of potential contributors off the main page is probably a good idea. We don't have to appease everyone, but at the same time, we don't have to ignore everyone either. Kaldari (talk) 01:13, 24 July 2010 (UTC)
Obviously we need to replace COM:POTD with COM:FK. -mattbuck (Talk) 01:44, 24 July 2010 (UTC)
But how many people have to object before it's a bad idea to offend them? As you noted, the Merkin photo got complaints; how many thousands of potential contributors were turned off? How many thousands would be so offended by a painting of Mohammed on the front page that they'd never come back? Unless we've discovered some way to predict offense, I don't see how we can draw a line. Powers (talk) 14:23, 24 July 2010 (UTC)
As I said at FP, it's not about censorship; the issue everybody is ignoring is that in most countries viewing sexually explicit material from your workplace can and does result in dismissal and/or substantial financial damages. This has been upheld in the courts of those countries, including the US. Until there is a technical solution that is an effective solution (aka my preferred option), our first concern should be about making Commons accessible to more people. If the cost of that is limiting images in FP or even on the Main page then it's a small price to pay. Gnangarra 09:55, 25 July 2010 (UTC)
It's called "not getting caught". You can easily switch tabs with one click of the mouse... If you can't do that, you probably shouldn't be looking at Commons instead of working in the first place. You're wasting company time. There is a technical solution: most browsers have an option to block images. If you want to selectively filter them, there's some good extensions you can get for Firefox (like Ad Block Plus). Rocket000 (talk) 04:50, 26 July 2010 (UTC)
No one should have to worry about "getting caught" looking at the Commons Main Page. A teacher should be able to pull up Commons on a projector to show their class without worrying about what sort of photos we're featuring today. If that's not the case, I don't think we're doing a very good job of living up to our mission. Kaldari (talk) 23:12, 26 July 2010 (UTC)

(unindent) Why do we need any sort of written policy for this? Commons generally thrives (relative to en) on a lack of written rules, which allows for a degree of flexibility in implementation. Having a hard rule saying images of X, Y and Z are not allowed on POTD goes beyond what en does here - it's one area that they use common sense for; are we so worried about porn now that we can't do the same? It's interesting to see how the 5 main page processes work on en.wp:

  • WP:TFA - Main page articles are selected by a specific trusted user from the pool, after being suggested by users.
  • WP:POTD - Displays the enwp FP pool in strict order of promotion (not sure if it would skip specific images or not).
  • WP:DYK and WP:ITN - A number of admins select suggestions from users after discussion.
  • WP:SA - Specific days can be edited by any user in advance; same as our POTD.

TFA is useful to compare, and shows how POTD and FP should differ here, as there are examples of featured articles that will never be on the main page. For example, there are some very short featured articles, which won't be put on the main page simply because they are short - it's got nothing to do with the subject or how it's dealt with (apart from the fact there isn't a great deal to say), because a short example can't really display the "best work". However, that rule isn't codified and could theoretically be broken. It isn't because Raul knows what he is doing... Anyone can edit our POTD, but we should trust the community to get it right without telling them exactly... That's leaving aside the comment that we don't have "explicit imagery" as a featured picture at present, so why are we wasting time debating a theoretical problem? If that ever happened, anyone could add such an image to the queue, and anyone could replace it with something else (insert common sense here)...--Nilfanion (talk) 10:58, 25 July 2010 (UTC)

As I demonstrated above, this isn't an entirely theoretical problem. The issue will arise at some point; it's only a matter of time. If we can prevent a potential PR disaster (and substantial community disruption) by writing down a couple of common sense rules, why shouldn't we? What if we had written down the BLP policy at en.wiki before someone vandalized John Seigenthaler's article? We don't have to wait for a disaster to address the problem. Kaldari (talk) 23:22, 26 July 2010 (UTC)
Well the POTD queue isn't selected on the spot, but a number of days (or weeks even sometimes) in advance. The existing rules do state "don't swap it without good reason". If someone puts an explicit image as a POTD, then it can be discussed well in advance of it actually being on main page and not as a theoretical - so we don't have to make sweeping statements about explicit images (and what that is), but the precise one in question.--Nilfanion (talk) 23:38, 26 July 2010 (UTC)
There are only a small handful of people who actually add images to the queue. So in practice images are only discussed once they hit the Main Page. We don't have to make sweeping statements about explicit images and what that is. We can just add a simple guideline saying "POTD should be safe for work". That doesn't seem overly bureaucratic to me. Kaldari (talk) 17:19, 27 July 2010 (UTC)
What's safe for work in the U.S., France, or Saudi Arabia is all different - and even more different between individual companies in those countries. Besides, it's a stupid workplace that allows people to go to Wikimedia Commons but punishes employees if there happens to be a "naughty" picture out front that day. If there's a culture clash there, we don't have to be the ones to back down. Wikimedia has a mission to make content freely available to the people of the world — but no obligation to make it available to companies that enforce asinine policies. The employees can access from home if they're that concerned, and probably ought to anyway.
In any case, the featured content is not free of moral censorship, because people are free to vote yes or no according to their aesthetic sensitivity, which is not independent of moral reactions. The ongoing discussion shows very little support for any further restriction. Most likely, purely vulgar images will receive a frosty reception and lose the vote. "Indecent" images that win a vote probably have some obvious redeeming quality to them. Wnt (talk) 01:04, 28 July 2010 (UTC)
"Aesthetics" is not a featured picture criterion. The closest criteria are probably balance and composition. If someone were to vote No on a picture due to "aesthetic sensitivity" or a "moral reaction" as you suggest, their vote would be completely ignored as not conforming to the criteria (as has happened before). So the safeguards that you suggest exist, do not, in fact, apply. Kaldari (talk) 20:35, 28 July 2010 (UTC)
Please review Commons:Featured picture candidates. Note for example, "Value - our main goal is to feature most valuable pictures from all others." and "Symbolic meaning or relevance…. Opinion wars can begin here…. A bad picture of a very difficult subject is a better picture than a good picture of an ordinary subject." These criteria quite literally call for value judgments and lower ratings for vulgar (= ordinary) images. Admittedly there is a weirdness that the "complete guidelines" are much more limited than the general overview. But I've seen plenty of value judgments recently - people have asked what is so special about a brick wall, or said that there have been too many images of time-lapsed water. It's a subjective judgment. I think that the fairest resolution is to live with this and let people vote up and down on individual images as they choose - not to disqualify images ahead of time, nor to throw away votes because you dislike their motivation. Otherwise the process ends up imposing censorship or encouraging people to vote disingenuously, adding conflict and mutual loss of respect into what could be an open process. Wnt (talk) 03:39, 29 July 2010 (UTC)
I think you're conflating two definitions of the word "vulgar". Powers (talk) 12:18, 29 July 2010 (UTC)
Not really. Issues of size aside, why might someone be underwhelmed by a proposal for a featured penis on Commons? Well, because we're not ooohing and ahhhing, saying I can't believe you managed to get such a good closeup of a penis, you know how rare they are and how long you have to stay in the bird blind to even get a split-second opportunity to take a shot with the telephoto lens... It's not interesting because it is, well, common. And in truth, the pejorative use of "vulgar" was never anything more than "common", and all the dirty words and dirty deeds of the Victorians are based on the abhorrence of the plain and simple speech and actions of the common folk. Wnt (talk) 12:52, 29 July 2010 (UTC)
Yes, there is a connection etymologically, but the meanings have diverged since then. A vulgar (as in pertaining to reproductive functions) image will not necessarily be so vulgar (as in common) as to preclude the oohing and aahing you mention. In many cases, certainly, but not all. Powers (talk) 00:40, 30 July 2010 (UTC)
Yes, there's some divergence in meaning, and the criterion did use the word "ordinary", and you can find a case where that matters - but my whole point is to argue for the right of the editors to make their own case-by-case judgments, and I argue against trying to impose a Procrustean notion of vulgarity (or ordinariness) either on the candidate image or the editor's reason for voting. Wnt (talk) 04:05, 31 July 2010 (UTC)

Redundancy

How do people feel about allowing the deletion of high quality and otherwise potentially educational images because there are already several other images of comparable quality and educational value? Is there any reason to keep more than a handful of essentially indistinguishable nude pictures? 71.198.176.22 20:33, 29 July 2010 (UTC)

Humanity is vast and varied. Ethnic groups times body shapes times ages times gender is a heck of a lot more than a handful.--Prosfilaes (talk) 20:54, 29 July 2010 (UTC)
I didn't mean to suggest excluding different ethnicities or body shapes; only to exclude redundant images which are essentially indistinguishable because they fall into the same such categories. "Handful" is probably not the right term, but the question needs to be asked: How many genitalia close-up photos, for example, do we need to fulfill our educational mission? 71.198.176.22 21:00, 29 July 2010 (UTC)
In some cases we have multiple images of the same person doing the same thing, just at slightly different angles or taken a few seconds apart from each other. I think in these cases, the best image should be chosen and the rest deleted. Kaldari (talk) 00:13, 31 July 2010 (UTC)
I previously filed such a deletion request at Commons:Deletion requests/File:Explicit wmf.OGG, citing Commons:Deletion policy against "Files that add nothing educationally distinct to the collection of images we already hold covering the same subject". I think that because such a policy exists, there is no reason to write it here anew, and especially no reason to set up some slightly different version only for this narrow subcategory of files. Wnt (talk) 03:57, 31 July 2010 (UTC)
  • Kaldari, are you saying this ("multiple images of the same person doing the same thing") specifically about sexual content, or in general? Using my own work as an example, would you say that the 5 images I put in Category:Nick Hornby are excessive because they are not different enough from one another? - Jmabel ! talk 05:46, 31 July 2010 (UTC)

Would it be reasonable to reference the Commons:Deletion policy passage which Wnt cites above explicitly in this policy? Would it be wise to explicitly state that it might be used to remove the several dozen unquestionably redundant genitalia photos which are uploaded to Commons daily? 71.198.176.22 04:26, 1 August 2010 (UTC)

Several dozen, huh? I checked back to the start of the month, which is four and a half hours old, and the only vaguely sexual pictures I saw were File:Venus Restraining Cupid by François Boucher.png and File:Bouguereau, William Adolphe - Putto sur un monstre marin.png, a couple of old-school paintings. It'd be nice if we could use realistic numbers about the problem.--Prosfilaes (talk) 04:55, 1 August 2010 (UTC)
Back in March, during a meetup planning ideas for public displays, an admin showed me an automatically generated thumbnail gallery of the most recent 120 or so roughly square image uploads from Commons, and there were at least five pictures of people's genitals included in that snapshot. Perhaps it was just a coincidence, and if the proportion of genitalia photos being uploaded has substantially decreased, great, but I still say it's worth explicitly stating that potentially offensive images ought to be deleted if they are highly redundant with existing images of comparable subjects and quality. 208.54.14.26 08:03, 5 August 2010 (UTC)
It's useful to have something vague enough not to be checkable. I went back 480 pictures right now, and only the last page had a single picture of penis, in use on the German Wikipedia. "Potentially offensive" is never a standard we use. Ever. It's so massively not neutral it's not even funny. There's no need to repeat general policy here.--Prosfilaes (talk) 15:54, 5 August 2010 (UTC)Reply
Assuming every 1 in 480 uploads is such a picture, per [11] that would still indicate over a dozen such images per day. I'm glad if the rate of such uploads has fallen, but I can't imagine why it would, or why it wouldn't return if it has. I think there is in fact a need to repeat general policy here, so people involved in related deletion decisions are sure to know it. However, I agree there's no reason to refer to images as potentially offensive. Why not say: "Media ought to be deleted if they are redundant with existing media of comparable subjects and quality."? 71.198.176.22 12:10, 6 August 2010 (UTC)Reply

I agree with Wnt here, and think we should stick with existing policy on this. It is useful to have multiple high-quality pictures, but not useful to have swathes of actually-redundant low-quality rubbish. Ultimately redundancy has to be decided in a deletion review. 99of9 (talk) 06:00, 1 August 2010 (UTC)

more on consent

Per several above threads, and further head-scratching, I've done some editing to the 'consent' section. Feedback most welcome. Privatemusings (talk) 00:51, 30 July 2010 (UTC)

For anyone not following closely, the change made is this. It is certainly a very substantive change, the first in a while. Indeed, it is so substantive that I wonder if (presuming we are bringing this to a poll) this difference might merit a separate poll. - Jmabel ! talk 01:23, 30 July 2010 (UTC)
I'll need to check the diff.s but the substance of the change is actually a change back, no? - I believe the poll was largely running prior to D's edits on the issue of consent? (will check and get back to all....) Privatemusings (talk) 01:36, 30 July 2010 (UTC)
With the proviso that I feel I'm a bit rubbish at diving into all the diff.s - so anyone who feels they're good at such things could no doubt produce something perhaps more authoritative - what I found was that on 3rd July, when the poll kicked off, the consent aspect of the policy referred to evidence of consent being required, if challenged, and that uploaders 'should include an assertion of consent in the image description' (see here for one of the edits). I accentuated the need for consent here, which I think was mid-poll (6th July), and after some improvements the section pretty much stabilised the same day. It stayed that way until D revisited the issue on the 20th - and I was the next editor to the page to restore and tweak today (the 30th) - hope that helps in some way - it's certainly clarified some things for me :-) cheers, Privatemusings (talk) 01:52, 30 July 2010 (UTC)
(ec) If I got that correctly, the change in your diff is a carte blanche for users like The Cleaner to delete a whole cat subtree. How should consent be proven? Especially if someone likes to be anonymous? What's needed for images from Flickr? --Saibo (Δ) 01:57, 30 July 2010 (UTC)
Some sources of Flickr images should be fine: in particular, accounts known to handle these things carefully (e.g. Suicide Girls). But certainly, for accounts about which we don't know much, the issue of Flickrwashing is particularly sensitive in this area. With or without any special rules, I'd be pretty hesitant to view a basically anonymous Flickr account as a good source for a photo where issues of subjects' consent might reasonably be both relevant and in question.
Something not really addressed in the proposed policy, and which perhaps should be, is that an uploader/photographer with an established good reputation for their adherence to rules and policies should not be challenged at every turn. I would hope that here, as elsewhere, we would have a presumption that people who have clearly been "good actors" in the past continue to be "good actors". For example, in the unlikely event that I uploaded a photo in this area, I would hope that my actions would be treated with a considerably stronger presumption that I know what I'm doing and am adhering to policy than if a comparable image were uploaded by someone basically anonymous, and where their first uploads are sexually oriented photographs of a nature that raises questions about the subject's consent. - Jmabel ! talk 06:22, 30 July 2010 (UTC)
I agree with this, but feel it's more likely to be an issue of culture / practice rather than policy (see our current brouhahas over folk tagging images contributed by long-term commonsers with various copyvio / unsourced templates, for an example of less-than-stellar practice in this area) - you can't legislate a healthy community, unfortunately. Privatemusings (talk) 06:47, 30 July 2010 (UTC)
I agree with this change back to requiring a statement of consent. Sexual content on web archives has the potential to be used for long term damage to a person's reputation (personal/career/etc). So I think it is only responsible to be very certain that we have consent (even beyond the legal requirements). Personally, I would extend this to nudity. --99of9 (talk) 11:13, 30 July 2010 (UTC)

So now anybody can delete all the sexual pictures uploaded to Commons in the past within only a few days, saying "OK, this photo uploaded X years ago by a user who doesn't contribute anymore and never saw the template within these few days must be deleted"? If we assume good faith when a user uploads an image, for copyright, national law (like freedom of panorama), etc., why can't we assume it also for this "consent"? And I don't think that a user who, in bad faith, knowingly uploads a photo where people didn't give their consent to redistribution, exposing himself to legal action, would then have any problem also putting a sentence like "consent given" in the description... --Yoggysot (talk) 14:36, 30 July 2010 (UTC)

It explicitly says that such images will be given a longer period of grace. Regarding good faith: Freedom of Panorama is verifiable, and gets checked all the time, so that's hardly a direct comparison. Copyright is also either checkable (if not own-work) or requires an assertion (if own-work), and in my opinion consent should be brought into line with it.--99of9 (talk) 06:02, 31 July 2010 (UTC)

@Jmabel: Sure, Flickr images from suspicious Flickr accounts are not okay - for copyvio reasons, too. --Saibo (Δ) 15:24, 30 July 2010 (UTC)

This issue has been kicking around for quite some time here. Originally several of the people supporting a more restrictive policy were arguing for 2257 documentation at Wikimedia Commons; after it appeared that WMF's counsel doesn't think Wikimedia needs to keep such records this seemed to be pretty well dropped, as it would create privacy issues without really accomplishing much. This leaves us with the ghost of the issue, which is whether we should simply get some extra statement, "honest, this is an upload of a photo taken with consent". While such an assertion may do no harm, obviously it isn't very strong proof, and losing images by imposing the requirement retroactively would be a waste.
I think we don't need to reinvent the wheel here - honest-to-goodness sexual content, as defined here, is only a small slice of the potentially humiliating content that can be uploaded here about an individual. There's already an existing policy on photographs of identifiable people, which explains when consent is needed, and which doesn't require some special template for the uploader to say that honest-to-god he means it. So I think that the wording before the diff referenced above was just about exactly what I would have said. I think we do need to recognize that the system we have (like for many other online sites) is not at all immune from abuse, so we should be quick and deferential about deleting photos when a complaint comes up from someone who doesn't want to be depicted here in such a pose, without quibbling over claims about irrevocable consent. Wnt (talk) 03:45, 31 July 2010 (UTC)
Requiring a statement is exactly in line with how we handle copyright for "own work". It can later be challenged if it becomes suspicious, but it is usually good enough. One extra benefit is that it provides clear evidence of direct fraud whenever it is actually discovered that consent was not present. The uploader cannot claim "I was never aware that I needed consent". --99of9 (talk) 04:29, 31 July 2010 (UTC)
What's the use of having evidence of fraud? If we somehow actually find out that someone is uploading sexual photos without consent, I assume they're going to be blocked for a very long time no matter what. (Which of course doesn't stop them from getting a new account and uploading a new sex picture tomorrow) Wnt (talk) 14:00, 2 August 2010 (UTC)

Current issues under review?

Can someone recap the issues being addressed / revisited since the poll ended? So far I see 1) clarifications of purpose, expected outcome, and consent requirements, 2) review of focus on Miller, and 3) review of the amount of legal jargon used. --SJ+ 07:32, 30 July 2010 (UTC)

I think it would be wise to say something explicit about #Redundancy. 71.198.176.22 04:28, 1 August 2010 (UTC)

longer grace period

I specified a longer period - saying that until January 2011, media will have a month's grace period - my thought is that at that time (Jan) all media will be treated the same. Privatemusings (talk) 02:47, 2 August 2010 (UTC)

I've posted a counter-proposal for the language, which does not specify any hard-and-fast deadlines. I don't think you can count on a contributor from 2006 to log in even a month later, nor does everyone currently active see a message within three days. I'll allow that participants in a deletion discussion might fairly exercise "reasonable suspicion" regarding whether there was consent for the upload, and that providing or not providing a response may weigh on this. But we're not talking about an affidavit under penalties of perjury here! The importance of providing such a statement seems highly exaggerated, and the length of time needed for a normal deletion discussion is quite a bit more than your grace periods. Wnt (talk) 13:36, 2 August 2010 (UTC)
I've taken a first stab at Template:noconsent (and Template:consent, which I think will be needed and mentioned in my draft of the former). The second template is just a skeleton - it wants little check and X icons, and categories to put the image in, and so on. I have no idea if this is what the other people want, which is why I think we need to have something to look at. Wnt (talk) 00:17, 3 August 2010 (UTC)
I like most of what you did there. What about the case where no overt consent is needed because something was entirely public (e.g. some material in Category:Folsom Street Fair)? - Jmabel ! talk 21:54, 4 August 2010 (UTC)
I've updated the dating of the template (might take more than a month to ratify this ;-) - and restored the 'prod' type functionality of the noconsent tag, which I feel is important. I've also extended the proposed 'extra grace' period for pre-existing media, because it seems sensible to compromise.
@jmabel - I believe it would be entirely reasonable to assert that someone would have no expectation of privacy at a street fair such as Folsom, and therefore has consented to being photographed etc. To reiterate another matter, I feel no such consent has been given by folk at a beach, for example, but this has more to do with nudity, as opposed to what we're talking about here as 'sexual content' :-) Privatemusings (talk) 01:20, 5 August 2010 (UTC)
I don't actually see much in the Folsom Street Fair category that counts as "sexual content" - I don't think a picture of a woman walking around with a cord around her breasts really counts even under the "sadistic and masochistic abuse" part, which always seemed a bit dodgy to me to start with. The template I designed isn't meant to be an exhaustive list; besides adding categories, you can simply write in your own explanation for something fairly unusual like public sex. I suppose at some point the more problematic answers and write-in explanations will get categorized somewhere that someone can look over them and see if they make sense by policy, but I don't think that should be the main goal of the template - the main goal should be to make the uploader think about the policy and any legal issues. Wnt (talk) 13:08, 5 August 2010 (UTC)
I just read your consent tag, and am already impressed! I think we need clarity on what kind of publication implies consent. I presume you don't mean Flickr?? Would the phrase "professionally published" work? --99of9 (talk) 12:28, 5 August 2010 (UTC)
Well, that's the kind of policy question I thought people might start asking once they had a look at a prototype. It's possible to add new categories to the template just by adding new lines in the switch statements. The question with Flickr is: if a person uploads something with his free Flickr account, then "finds" it there and uploads it to Wikimedia Commons under a different account name, he has plausible deniability here and avoids any consent requirement. A similar but weaker issue applies to someone who uploads a photo taken by a friend, who assures him that it was taken with consent. I should note however that such questions also apply to any other photograph of an identifiable person, and I'm not sure what Commons standards are about that. Maybe by not having a consent template for those, we avoid prying too closely into matters we'd rather not know about. But it may be best to simply accept any published work, because otherwise you have a situation where a pool of public domain photos exists on commercial sites that Commons can't touch, which is contrary to its mission. Wnt (talk) 13:01, 5 August 2010 (UTC)
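To make the "adding new lines in the switch statements" mechanism concrete for anyone who hasn't looked at the template source: a consent template could route each declared basis of consent to a tracking category with an ordinary parser-function switch. This is only a rough sketch; the parameter name `basis`, the case values, and the category names are made up for illustration and are not the actual Template:consent code:

```wikitext
{{#switch: {{{basis|}}}
 | self      = [[Category:Consent asserted by subject-uploader]]
 | published = [[Category:Consent presumed from prior publication]]
 | public    = [[Category:Consent implied by public setting]]
 | #default  = [[Category:Consent basis unclear]] {{{explanation|}}}
}}
```

A new case (say, one for Flickr imports) would then be a one-line addition, which is exactly where the policy question of what counts as acceptable publication comes in.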

Another poll?

Should we have another poll? The proposal got more than 2/3 support last time round, some improvements have been made since then, and perhaps we can get this wrapped up now. --JN466 14:18, 4 August 2010 (UTC)

The consent section is still swinging back and forward, so I think we should wait until we figure that out. --99of9 (talk) 20:21, 4 August 2010 (UTC)
No poll until this is stable. I'd also like to see a clear agreement that no major amendments or significant rewording takes place once a poll is actually opened (unless agreement is reached here first). Anatiomaros (talk) 00:19, 5 August 2010 (UTC)
It's been a week since even the discussion page here was edited. The main lingering issue regards consent, but I think that the difference of opinion on this has been narrowed somewhat, and the remaining point of contention, as 99of9 and I have expressed below, regards photographs of identifiable people in general — so they should probably be addressed in discussions pertaining to that policy rather than here.
Because this document is mostly a guide to other relevant policies, I don't think its passage is essential, but one doesn't like to see over 2/3 of people vote for something only to be stymied because a few editors make some changes during the vote. I would suggest that we should make the next vote formally about a fixed historical version, such as the current version ([12]), and that those who voted previously should be recontacted to vote. (Because the vote was uneven, that may technically be a biased canvass, yet it seems fairer than discarding their votes) Wnt (talk) 13:56, 19 August 2010 (UTC)
Support (what Wnt just said). - Jmabel ! talk 15:18, 19 August 2010 (UTC)
Hmmmm. I still think we can do better than this in terms of protecting subjects. I believe this policy should actually have meat to it, rather than just summarizing other policies. Wnt, I don't remember ever agreeing that new consent rules should only apply to identifiable people, and hence should be discussed elsewhere. This is partly because later information or additional pictures can make a subject identifiable, even if the picture itself originally didn't identify the subject. This is much more dangerous for sexual content than most other images. We also still need a clear statement on imported consent. That discussion only got answers from Wnt and me, and clearly needs more input. --99of9 (talk) 12:22, 20 August 2010 (UTC)
I'm concerned that extra criteria are being introduced (evidence) that would make it very easy to attack an image and very difficult to defend against such an attack. Consent rules must only apply to identifiable people in non-public situations. Subjects' rights need to be protected, but so do those of editors and illustrators against attempts to censor and destroy their work. --Simonxag (talk) 11:47, 21 August 2010 (UTC)
99of9 raises a point I hadn't thought about, which deserves some consideration. On one hand, I don't think we should consider deleting photos of isolated genitalia for lack of consent, because no one's going to recognize them. Even if the uploader later gives them a name and address, this is a matter of "posting private personal information" irrelevant to the encyclopedia - not an issue with the image itself. Yet a two-photo series, taken in the same room of a person wearing the same limited attire, might make it easy to trace an identifiable clothed photo to a "non-identifiable" sexual one. I think, though, that the key here is that the latter is identifiable from the photos alone, while the former requires that one believe the uploader. Why should you assume someone making an underhanded comment has told you the truth? So I would say that the broad term "identifiable" fairly covers this: the point is, if the detail of the photo (not just the face) tells you who it is, then it may require consent.
The problem with delaying the vote until the discussion is over is that only the prospect of a vote seems to get it going, and if we can't reach some community consensus, one way or another, then community consensus may not be respected in future policy consideration on the issue. Maybe we'll have to put up two versions and see which is better favored. I'll try to work out a fair concession to this newest point in an edit, but I doubt it will satisfy 99of9. Wnt (talk) 18:03, 21 August 2010 (UTC)
A lot of work has been done on this draft; it needs to be put up to a vote, so it can become effective, and then put into the hands of the community for further fine tuning. --JN466 05:24, 26 August 2010 (UTC)
This process has taken way too long. There seems to be much stonewalling from a small number of detractors. Minor improvements or changes can be made after this policy has taken effect. - Stillwaterising (talk) 19:46, 1 September 2010 (UTC)
I've added two more issues below - the deletion of the office action section is major, but I think most people would agree to keep it in; the terms under which consent is not required from identifiable persons are relatively minor, and I think we can reach consensus on them anyway. We may well kick up coarser sand than this once the first five voters visit the page. Wnt (talk) 06:13, 2 September 2010 (UTC)
Is there a chance we might manage to agree on this draft? Someone might still disagree about the "actual sexual acts" part, but that's one of four criteria recently added regarding photos of identifiable people taken without consent but during an intended public display. Wnt (talk) 04:08, 5 September 2010 (UTC)

Maybe a practical test

A week ago I found some new images from user:midnight68. Since I found them to be either out of scope, fakes, or even illegal, I filed a mass deletion request. Maybe this case can be used to verify the wording of the guideline and maybe future policy? --Niabot (talk) 07:13, 5 August 2010 (UTC)

Actually, I don't think this proposal would affect that case. In theory, the deletion discussion could be based on whether the images could be found "obscene", but I think this would be extraordinarily unlikely. (See [13] for a more realistic view of the current situation, in which even pornography involving children unknown to NCMEC is not being prosecuted - and unfortunately, it doesn't sound like they are tracking down such children by ordinary means of investigation either) I think that images from a noted Japanese comics company would be clearly artistic or educational and exempted by Miller, etc. That said... the images are from a noted company, unless you believe the uploader's statement that he drew them in pencil and altered them in photoshop to make it look like they came from there. I think the sexual part is really just a distraction. Wnt (talk) 12:45, 5 August 2010 (UTC)

Automatic deletions by noconsent template

It looks like privatemusings and I have reached something of an impasse regarding whether an unanswered noconsent tag should lead to automatic deletions of material.[14] He views it as something like a "prod" that allows a file to be deleted without discussion; however, a true Wikipedia prod can be challenged by anyone, while this is only answerable by the original uploader. I view it as a means for the uploader to confirm that he is aware of existing policy, so that if a proposal for deletion comes up, people know that he didn't just make a mistake.

The biggest sticking point for me is that I think there are people who uploaded things to Commons in 2006 who have temporarily lost interest and won't be back for years, and when they get back I don't want them to find that all their previous work was thrown out because of some silly form that didn't even exist when they were here last. More generally, I don't think that Commons should be deleting such files unless they're proposed for deletion and people actually look at them and decide that there really is a problem with them.

To avoid any more reverting back and forth, I'd like to hear some third opinions about this issue. Wnt (talk) 13:28, 5 August 2010 (UTC)

In #consent again...., Dcoetzee included a before-July-2010 exception. What happened to that? Wknight94 talk 15:30, 5 August 2010 (UTC)
I'm certainly opposed to any "evidence of consent" requirements that don't have an exception for old uploads (and old images). --Carnildo (talk) 19:45, 5 August 2010 (UTC)
Whilst I appreciate people's right to respect for their privacy, which I feel quite strongly about myself, I think this creates far more problems than it solves. Some comments: 1. It would be unfair, to say the least, to apply this to old uploads, for the reasons noted above. 2. Even for recent uploads - assuming this was passed - this would in effect amount to a near-certainty that most of the tagged files would be deleted without discussion. 5 days? How many people, apart from "wiki anoraks" like us, are here several times a week? 3. If this is about consent, why limit it to sexual content? Any picture of somebody uploaded here without their consent is a breach of privacy. If we used this proposed template across the board it would inevitably lead to many thousands of file deletions, most of them unnecessary. We should seriously reconsider this. Apart from files that are obviously illegal for us to host, we should not be automatically deleting images like this. There must be a better way. Anatiomaros (talk) 23:57, 5 August 2010 (UTC)
I'm not sure why the principle that consent is required for sexual content would only apply to newly uploaded material, to be honest, and @wknight - the 'july 2010' bit became 'August' and now 'whenever ratified' - so older material gets a 2-month time period for an assertion of consent to be provided prior to any 'prod' type deletion. I totally concur with wnt that some more opinions would be a good thing though - and to that end I dropped a note in to the foundation and commons mailing lists :-) cheers, Privatemusings (talk) 07:36, 6 August 2010 (UTC)

While I haven't had time to read all the debates, there are only three possible options that I could even consider supporting, because these are the only ones that adhere to the "absolute and non-negotiable" NPOV doctrine of the Wikimedia Foundation projects -

  1. All media (including video and sound files) that contain one or more identifiable or potentially identifiable living persons, recently deceased persons, or potentially living persons (for cases where it is not known whether they are alive or dead) must include an explicit declaration of consent from each such person, regardless of when the media were uploaded. All media not including such a declaration to be deleted after a reasonable length of time after notification to the uploader.
  2. Declarations of consent from each identifiable or potentially identifiable person within a picture, video, sound file, etc. are encouraged but not required.
  3. Consent is not mentioned or required anywhere (basically as now)

Anything that depends on the subject matter of the media, or any subjective interpretation of anything, is completely incompatible with NPOV. Any system that exempts media uploaded before any date doesn't actually provide any significant amount of any of the benefits of the system, but does provide most of the disbenefits. Thryduulf (talk) 09:04, 6 August 2010 (UTC)

NPOV really doesn't apply to media as such; bear in mind that maps that are highly POV can be within Commons scope. The applicability of NPOV on Commons primarily relates to the descriptions, and those POV-charged maps should state that they are POV - "The border of China claimed by the Republic of China", for example. As for processes, NPOV really doesn't matter as such - it's reasonable to require certain classes of images to have additional protection - such as the whole COM:FLICKR process. The problem with this proposal is: when does an image become sexual and require that it be treated as such? There is a grey area between sexual content and non-sexual content, and everyone's opinion as to where the border is will be different.
The date exemption needs to be considered too - it's OK to grandfather some things in, but not necessarily here.--Nilfanion (talk) 11:13, 6 August 2010 (UTC)

My position is that eventually all sexual content should show evidence of consent. To get to that, perhaps give existing own-work by wikipedians a very long grace period (even 6 months would be ok by me). 99of9 (talk) 13:06, 6 August 2010 (UTC)

I do not think so. There are many old works we will never get consent for (as we do not know whom to ask, or because the persons are dead). Many are valuable works, some not replaceable. And if the persons depicted do not want to have them here they can always get them deleted (or should be able to), in the cases where we want to respect their wishes.
(We might need some more text about what cases should be handled by office actions, speedy deletion and deletion discussion respectively: a person finding herself here shouldn't be given false promises, and I think (s)he shouldn't have to read and interpret the policy about identifiable persons either.)
The reason to have a declaration of consent when new media are uploaded is that we do not want to have them here until the person finds out about them, after his/her friends (and non-friends) have found them. If the image has been on the net for years, the person probably knows about it and accepts it, or will never know about it (and neither will the person's acquaintances). Then little will be won by deleting it from here.
--LPfi (talk) 15:31, 6 August 2010 (UTC)
    • In my opinion we really have to get away from the idea that it's ok to wait for a humiliated person to find their own image on the internet before they get our protection. At that stage serious damage may have already been done to their personal/professional reputation. How many people, if their ex-lover had compromising pictures that they had *not* given their consent to publish, would want to have to monitor the entire internet of sexual imagery just to preserve their rights?--99of9 (talk) 00:57, 7 August 2010 (UTC)
      • If there were a solution, then I would agree. But all we can do is force the uploader to declare consent, and stop honest uploaders from uploading without consent. We have no viable plan for stopping dishonest uploaders from uploading these pictures to Commons, and we have no hope of affecting "the entire internet of sexual imagery".--Prosfilaes (talk) 06:15, 7 August 2010 (UTC)
        • We just have to do our bit to be part of the solution, not part of the problem. By raising one extra barrier for the dishonest (statement of consent), it proves that they are indeed dishonest and they cannot claim they were simply ignorant or ill-informed. It could even provide the victim with an extra legal hook to catch them with. --99of9 (talk) 13:28, 12 August 2010 (UTC)

Thryduulf: does your remark above mean that you reject the distinction as to whether the picture was taken in a public place? In the U.S., at least, there is implicit consent simply by doing something in public. For an obvious example, an identifiable picture of someone in a parade or speaking on a panel at a conference clearly (at least to me) has absolutely no requirement of explicit consent. - Jmabel ! talk 18:25, 6 August 2010 (UTC)

99of9: how do you propose to get explicit consent for a 19th century French engraving? - Jmabel ! talk 18:27, 6 August 2010 (UTC)

  • My understanding is that modelling for an engraving was laborious work, which is very strong evidence of implied consent, since engravings were almost universally intended for release (unlike boyfriend-girlfriend photographs). So in that particular case I'd say the evidence is there. However I take your point that I haven't really thought through consent after death. I am probably willing to accept the "living or recently deceased" language from COM:PEOPLE, but then again, it doesn't seem right for a photographer to release a non-consenting private image even long after death... --99of9 (talk) 00:50, 7 August 2010 (UTC)

Previously published images without consent

The comment about 19th century French engravings above returns us to the huge unknown in the policy regarding what to do with published images. If you have something like Category:Silvana Suárez, it's relatively recent sexual content that has lapsed to the public domain from a professional publisher. One assumes Playboy got her consent for the photo. At the same time, she might not have given them consent to republish the photo anywhere, any time, for any purpose. So for Commons purposes her consent may be incomplete. A more extreme case comes to us from Category:Lynndie England, where we know that there is sexual content present that is utterly and absolutely without consent, which we carry because it was worldwide news. Now the stand I'd take is that a published image that makes its way to the public domain or free license does not require consent, just as a photo taken originally in a public place does not require consent. This makes the consent policy a burden on the Wikimedia Commons contributor who is using his own camera and lights, not a general inquisition into the provenance of every image. I think that censoring the Lynndie England pictures, even if it is in the spirit of protecting the dignity of prisoners of war under the Geneva Convention, is still not actually Wikimedia Commons' duty — our duty is to fairly support and report the full variety of world events. Wnt (talk) 19:22, 6 August 2010 (UTC)

However, that said, anyone can slap up an image on Flickr or a private web site with a free license. So someone might or might not want to have a more demanding notion of "publication", depending on whether checking consent in advance or having access to the full range of free content is our top priority. I can see the argument here that Commons, not wanting to get sucked into something particularly squalid, might make a stricter standard regarding sexual content per se, while adopting a less stringent standard for ordinary photographs of identifiable persons, which seems to be the status quo (I don't think Flickr images are banned there, provided they have confirmation of CC license). But that's a decision no one has made yet. Wnt (talk) 19:22, 6 August 2010 (UTC)

Obviously I favour more demanding notions of publication if we are going to import private images without any evidence of consent, especially for sexual content, but also for all private images. An absolute bare minimum would be a professional editorial process, and a verifiable physical address and contact details for the publisher. 99of9 (talk) 13:41, 12 August 2010 (UTC)

Anchor vs. headings

This edit looks like a mess to me. Either we should be using {{Anchor}} or headings for this. I personally think in this case {{Anchor}} makes more sense. - Jmabel ! talk 23:32, 5 August 2010 (UTC)

yeah - I made a real pig's ear of that! - I'll try and fix it up now.... cheers, Privatemusings (talk) 07:11, 6 August 2010 (UTC)
There are many times when both headings and anchors are a good thing. An anchor is meant to be a hopefully stable way to link to a specific section of the document, even though the heading may be revised. In the messy edit, the anchor tag wasn't actually to blame - it was just an empty <span> that has no effect on the text. The culprit was the ";* ", which makes a line look like some sort of bottom level heading on its own, but which when enclosed in ==='s just prints out as ;* - thus the mess. I would have tidied it up by deleting the ";*" but I was thinking more about reverting the whole edit, per discussion above. Wnt (talk) 19:30, 6 August 2010 (UTC)

"Any purpose" edit

Recently added: "The consent must not be conditional, but allow for the use of the photograph by anyone and for any purpose." I don't think that's right. Commons photos can still have personality rights restrictions. For example, most Commons photos of identifiable individuals - not just sexual ones - could not freely be used in an advertisement. - Jmabel ! talk 00:50, 22 August 2010 (UTC)

Hmmm. I was trying to work toward a consensus there.  ;) The problem I see is that if you were a photographer, taking a sexually explicit picture of someone to upload to Commons, saying that "this is just for Wikimedia Commons" would be a lie. Obviously the picture is going to end up plastered everywhere from 4chan to Encyclopedia Dramatica, and if that's not what the person has in mind, then that consent isn't really consent. I'll see if I can think of a way to separate personality rights from Commons copyright consent. Wnt (talk) 19:33, 22 August 2010 (UTC)

Commons is not an amateur porn site

Is there any way we could stop using this as a deletion reason? Can't we assume that when someone uploads a photograph of their penis, it was a serious attempt at adding to Commons? It seems to be what AGF demands, and we don't lose anything by using "low-quality penis picture" in the summary instead of "Commons is not an amateur porn site".--Prosfilaes (talk) 08:52, 22 August 2010 (UTC)

This may be true, but the phrase is coded in COM:PORN (a paragraph of COM:NOT). Because that's part of an existing guideline, we can't change it here - you'll have to start the discussion over there. Wnt (talk) 19:47, 22 August 2010 (UTC)
I agree with you both. Regarding this "amateur" label, surely that means that unless a professional photographer donates some of his/her work to us we will only have amateur pictures and therefore they are all at risk of deletion simply by invoking "Commons is not an amateur porn site"? This needs amending at COM:PORN, but I'm not sure about the best wording. The conundrum is that whilst we certainly do not want Commons to be an "amateur porn site", nevertheless we rely on "amateur" pictures for most actual photographs with sexual content. Anatiomaros (talk) 19:18, 23 August 2010 (UTC)

once more unto consent, my friends....

I've culled and tightened the consent section again, intending to focus more on the principles rather than additional processes. I believe there will be a consensus for commons policy on sexual content to include assertions of consent for all sexual content; the practice that follows we can discuss anon... otoh, if there is no consensus that consent is required for sexual content, we can probably spell that out too - the more eyes on this the better, so be sure to tell folk you bump into! cheers, Privatemusings (talk) 04:14, 23 August 2010 (UTC)

But how would we control that consent? OTRS is already seriously backlogged, and we won't be able to accurately confirm the consent. Furthermore an email from bunny1212@gmail.com (made up) containing something like "I hereby declare consent to be pictured in File:ABC.jpg" wouldn't even prove that "bunny1212" is the actual depicted person. Except for some very few cases we would not be able to actually prove that the email is from the depicted person. Thus   Disagree--DieBuche (talk) 09:16, 23 August 2010 (UTC)
The key phrase is "an assertion of consent". That first step is not the same as proof. That is exactly the same as the first step we ask for in own-copyright issues, an assertion. --99of9 (talk) 11:40, 23 August 2010 (UTC)
So it would be ok if I uploaded an explicit file and wrote "Subject has agreed to the publication" or something similar?--DieBuche (talk) 12:43, 23 August 2010 (UTC)
Yes, that would be adequate unless someone had reason to later believe that you were a liar (again just like copyvios). Wnt has developed a template that you could add easily during the upload process (presumably with a drop down option on the upload form). --99of9 (talk) 12:48, 23 August 2010 (UTC)
An assertion of consent (as opposed to evidence of consent) is not a show stopper for me. I do however question its usefulness. Such a requirement is already there (by law even), and adding it through a separate templated statement is just a form of adding disclaimers in my opinion. TheDJ (talk) 14:34, 23 August 2010 (UTC)
The problem is all old works that are going to be deleted because the uploader and the subjects will never notice the new policy. This is not only a question of 19th century engravings but also a lot of newer works. Do we need to delete those? I would suggest that old works would need assertions only if there is reason to believe consent was never given. And even then only in the cases we think the subjects may actually be hurt (not anonymous persons dead half a century ago) --LPfi (talk) 19:36, 23 August 2010 (UTC)
I've reverted. We host sexual content that we know was uploaded without the consent of all subjects, but for which there is a compelling case for hosting it (see, for example, Category:Lynndie England). Any sexual content proposal that bans these images is unacceptable. --Carnildo (talk) 20:37, 23 August 2010 (UTC)
The prisoner abuse shots were already explicitly mentioned as permissible under this proposal (p'raps you missed that?) - but I've clarified the language a bit. Robert Harris' update on meta is worth a look too for those interested in this area. Privatemusings (talk) 21:04, 23 August 2010 (UTC)

I still think this is a mess. It seems that a positive requirement of an explicit statement of consent will wreak havoc on a lot of content that is already there and probably unproblematic. For example, the Suicide Girls images come from Flickr, and are almost certainly unproblematic.

I think it would be more straightforward to say something like "When uploading photographic sexual content to the Commons (or when uploading other sexual content that realistically depicts potentially identifiable individuals), it is the uploader's responsibility to assure that adequate consent has been given to allow redistribution and re-use of these images. In particular, this means that any living or recently deceased individuals depicted (1) were capable of giving consent (e.g. of legal age in the relevant jurisdiction) and (2) gave such consent either explicitly or implicitly. Examples of explicit consent are (a) a model release, (b) photographer/uploader is him- or herself the subject, (c) <whatever else we want to include, including the possibility - which I favor - that for trusted uploaders' own work we can accept the upload as an assertion of consent, barring evidence to the contrary>. Implicit consent exists in some countries including the United States if the photo was taken in a public place (e.g. the Folsom Street Fair)." That's rough, which is why I'm not proposing a precise edit at this time. - Jmabel ! talk 23:20, 23 August 2010 (UTC)

I've made one more effort to try to reach consensus by recognizing that self-published sources can be evaluated skeptically on a case by case basis. I still am not inclined to put a whole lot of emphasis on an assertion of consent - even if one is actually given. I don't know whether we're going to be able to come up with a single consensus draft, or if we'll have to vote on two side-by-side, but for now I think that the debate is still productive. Wnt (talk) 19:09, 24 August 2010 (UTC)
Jmabel: my concern with that language is that it doesn't protect uploads of published photos. I think we should keep a strong wall of separation between what one might pejoratively call the pornographer's art, i.e. staging a sexual photograph in person, and the more traditional Wikimedian role of collecting together freely licensed content. Wnt (talk) 19:14, 24 August 2010 (UTC)
I have no problem with where you went with this. I've made what I hope will be seen as a friendly edit to increase clarity. - Jmabel ! talk 00:25, 25 August 2010 (UTC)
I prefer Jmabel's version, because I do not think we should collate freely licensed unconsenting sexual content from any sites which do not have sufficiently high standards of consent requirements. (public domain lapses excepted) --99of9 (talk) 12:02, 25 August 2010 (UTC)
My trouble with this revision is that I want to limit the identifiability to what can actually be worked out from the full contents of the File: page at Commons. In particular, I don't want people to be proposing an image for deletion because someone posted on Twitter that this is so-and-so's miscellaneous body part. If you can work out the information from the photo annotation, metadata, or even by comparing moles and scars to a reliably identified public image (including on Commons) that is one thing, but I don't want files to be open to deletion by innuendo. Such deletions, in fact, might tend to give credibility to such rumors anyway, and would certainly make them more widely known. Wnt (talk) 16:14, 25 August 2010 (UTC)

Regarding identifiability

You've captured some of what I meant when I said that identifiability was broader than the content of the image itself. The fact that other images can identify a person also leads to the obvious conclusion that images released later can also identify a person. This is troubling, because once we've had it on a public archive for a significant period, it may well have gone viral, and if it is only identified later, it could cause just as much damage to the unconsenting person's reputation etc. This is why I favour applying the consent rule to all sexual content (apart from valuable historic images of those long dead). --99of9 (talk) 12:02, 25 August 2010 (UTC)

I have to recognize that there are many examples of seemingly non-identifiable sexual material that we still would not want without the consent of the subject. For example, the Florida case that I mentioned here some months ago involved someone taking surreptitious photographs of girls' genitals, which we would not want to touch with an eleven foot pole. Yet there are a handful of cases in which an editor plausibly could upload no-consent sexual images without apparent wrongdoing. For example, a retired surgeon might upload a photo of hypospadias and its surgical correction from his archives, even though the subject is an infant and no consent was obtained to post the photo to Commons, provided that only the genital was shown and the child is not named.
While such cases are narrow and hard to predict, I want to make sure that policy leaves room for them, because some such content may only be offered to us once, if at all, before history moves on and the disease or mode of treatment no longer exists. Wnt (talk) 17:11, 26 August 2010 (UTC)
There is no need to strengthen existing privacy policy with regard to sexual images. In the past we've had a good record of respecting photographic subjects' privacy (at least once an issue with an image was recognized). Useful images should not be deleted and educational projects damaged, and rules should not be in place that would give justification for doing this. Concerns about anonymous images (which could be claimed to be anybody) and pictures of the now dead are a particular threat to medical and educational illustrations. --Simonxag (talk) 21:35, 26 August 2010 (UTC)
I'm not convinced of the assertion in your second sentence when images like this survive a deletion review: Commons:Deletion requests/File:Biamyinmd relaxing.jpg! --99of9 (talk) 09:13, 31 August 2010 (UTC)
Gosh how moral. Amy's privacy should be protected!! Most of her (constructive and useful) contributions to the Commons have been deleted, she has been anonymously threatened (see User talk:Biamyinmd), it seems she's been forced off Flickr, and now her remaining work should be deleted in order to protect her "rights". What this does show is that the Commons has not (even in the past) been good enough at protecting the work of genuine contributors against censors. "Privacy" has become the censor's weapon of choice, for harassing photographers in the street, censoring the UK's Parliament and here, for attacking controversial images on the Commons. --Simonxag (talk) 10:25, 1 September 2010 (UTC)
I think that some argument might be made for deleting some of her stuff on the grounds that it isn't really within project scope - i.e. "pictures of yourself" are typically excluded. I'd prefer to see a reason to keep some of this stuff. That said, some of the currently-deleted pregnancy pictures might deserve a second look - for example, it might be argued that a photo of a pregnant woman masturbating might be of relevance to the sexuality of pregnant women, oxytocin and so on.
The privacy issue does seem plausible here - the shots do seem quite poorly posed for a model - she doesn't even have a smile or an eager expression on her face. Someone else is holding the camera, and I have to consider the possibility that User:bi-Amy-in-MD might not be the woman. There's nothing in the photo description or her account to tell us why she's posting this stuff. The statement that she has been "forced off Flickr" is particularly disturbing, since it might indicate that she found the pictures and filed a complaint - can someone post a link to her former account name? Maybe the Flickr admins could let us know if there was any such complaint.
In general, I think we'll find that debatable consent and project scope violation go together. A consenting participant is trying to achieve some effect, which may well be educational or artistic; a non-consenting participant is just sitting there. Wnt (talk) 07:42, 2 September 2010 (UTC)
It seems her new Flickr ID is [15]. The new content is really toned down: she says she had her old ID "deleted by the flickr controllers". Some of her old pictures have been used (and hence useful). --Simonxag (talk) 22:38, 2 September 2010 (UTC)

In other words

I have removed the following sentence because IMO it has a number of problems:

In other words, a photograph of a young girl undergoing circumcision found on an anthropologist's web site will be treated with less suspicion than an anonymous Flickr snapshot of a young naked couple staring at the camera with a startled expression.

  • If the image of the young girl is considered sexual content, then elsewhere in the article we have already stated: Photographic depictions of sexually explicit conduct where there is a legitimate concern that not all participants appear to be at least 18 years of age should be proposed for deletion per our precautionary principle.
  • Children are not legally considered to be capable of giving consent for this kind of material, so no matter how professional the anthropologist seems, we cannot take his word on the subject's consent.
  • The startled expression of the couple is not necessary to reject an anonymous Flickr shot, since consent to take the picture is quite different from consent to upload it to a public archive.

Removal of this sentence doesn't mean I agree with the rest, but this was particularly problematic IMO. --99of9 (talk) 12:16, 25 August 2010 (UTC)

I knew there was some reason why that example came to mind. I think that such an image would be an example of content that might deserve to be scrutinized, but obviously isn't child pornography - which in fact has great social and political importance and may have the potential to help save millions of children from a practice which, at least to this observer, more than rivals rape for its brutality. There are a number of "Körper des Kindes" ("body of the child") photos already on Commons which involve naked children but were published by others, which also should not be subject to the consent policy and also (with less certainty to me, but at least by consensus votes) are not remotely likely to be ruled as child pornography. I think the consent should apply only if Commons people themselves are going out and taking the photographs - or where a self-published third-party site might with reasonable likelihood be used as a loophole. Wnt (talk) 16:05, 25 August 2010 (UTC)
Still, examples aren't strictly necessary, and if we somehow manage to agree on a single draft I can live without that sentence. Wnt (talk) 16:05, 25 August 2010 (UTC)

Rocket000 and office actions

I reinserted a section deleted by User:Rocket000 ([16]) about office actions. His edit summary says that "copyvios are illegal too", but obviously, if anyone ever runs into genuine child pornography being distributed on Wikimedia Commons the situation is going to be more serious. [I think that noncommercial copyright violation is still regarded as fundamentally a civil matter, but someone proposes a new law every month, each worse than the last. And DMCA takedown notices are coming out as office actions.] I think that if anything ever requires an office action to be used, child pornography would be the top candidate. It was also important to me during those early edits to stress that Wikimedia had always had a working process to comply with laws against child pornography, as certain people had been suggesting it didn't.

I see Rocket000 deleted Commons:Office actions and COM:OFFICE, which I put up some time ago after learning that (as I understand it) Meta:Office actions was a policy that applied to Commons, and wanting to have every policy that actually applied listed in Category:Commons policies. I don't want to get distracted by the latter question, but so far as I know, the policy still applies here whether or not a local copy is displayed, and having people contact Wikimedia by the formal front door phone or email would still appear to be the fastest and most discreet way to deal with serious sexual content issues. Wnt (talk) 05:29, 2 September 2010 (UTC)

Of course it still applies. The Foundation has many policies that apply to us but we don't need to create pages for them. The page didn't offer any information really. Linking directly to the Foundation's site would be more helpful I think. I only mentioned "copyvios" because you only mentioned "illegal". I wasn't comparing the two at all. Anyway, I think anything regarding the Foundation's actions should come from a foundation member. We don't have the authority to say what they will do or don't do. If you wish to get them involved, Commons is not the right place. Try wmf:, meta:, or the best way: their mailing list. Rocket000 (talk) 06:16, 2 September 2010 (UTC)
I made COM:OFFICE into a soft redirect, you can do the other if you think we need it. Rocket000 (talk) 07:10, 2 September 2010 (UTC)
I don't feel that the paragraph is telling them what to do. It simply tells users that there's an existing policy, which makes it clear that the Foundation handles such things. I see that policy, and the various contact numbers, as a clear existing invitation. If there's anything said here about the office action policy that you don't think is true, please explain.
In any case, I suppose they deserve an invite to the vote when it comes. Wnt (talk) 07:48, 2 September 2010 (UTC)
I don't see anywhere in their policy ("Office actions", I assume you mean) that they always handle this stuff. They deal more with the public. We have local oversights that can suppress stuff. Rocket000 (talk) 08:26, 2 September 2010 (UTC)
To the best of my knowledge, child pornography is not oversighted, which is reversible and leaves a copy on the servers in an inaccessible location. Instead, a developer removes the image from the server entirely. --Carnildo (talk) 19:20, 2 September 2010 (UTC)
What makes you think that? I've only ever seen oversight actions. Wouldn't it then be best to contact the devs instead of the foundation members, who don't have direct access to the servers? Of course they can tell the devs to remove something but it would be faster to go directly to them. Rocket000 (talk) 19:44, 2 September 2010 (UTC)
Such a deletion is not just a technical matter. Remember that the Foundation would be caught between a) a requirement to completely expunge the file, and b) a requirement not to destroy evidence or interfere with an investigation. I really don't know what they're required to do, but I think it's something "above our pay grade" that would be damaging to the project and perhaps Wikimedia as a whole if mishandled. So far as I know, the office action mechanism is a) capable of dealing with the situation, b) the only existing mechanism to deal with this situation, c) current Commons policy. Wnt (talk) 05:55, 3 September 2010 (UTC)
I'm asking why you believe that. I mean, did you read that somewhere? Rocket000 (talk) 06:07, 3 September 2010 (UTC)
My information on the mechanism comes from a combination of a comment made during an old discussion of how to handle uploads of child pornography, and looking at old cases where images were deleted with a comment related to child pornography. I don't know what the decision-making chain involved was. --Carnildo (talk) 19:25, 3 September 2010 (UTC)

The last quibble

There are two sections in the requirements about which photos of identifiable people taken by Wikipedians don't require consent that might deserve examination:

  • The surrounding events or actions are of a sort where somewhat-sexualised behaviour might be expected, e.g. a gay pride parade.
  • The actions, while sexualised, do not go so far as actual sexual acts.

The first of the two seems like the primary concern - for example, it would seem to ban a photo of a heckler running naked through a football game or political rally. Also, there's something that rubs me the wrong way about the example of a gay pride parade as a place where sexualized behavior would be expected, even if it is true, because it doesn't seem like it has to be true by definition, and as such, it seems to set a stereotype. It also illustrates the sort of subjectivity that such a rule will bring out if made policy.

The second is unlikely to come up often, but if there is ever a political protest of this type, I'd prefer not to have it banned in advance. There might be also some weird exhibitions of a girl going for the Guinness record that might be vaguely within project scope.

In addition, I will make clearer in the text now that these exceptions apply to photographs of identifiable people, which I assume is intended. Failure to specify this clearly could lead to problems with things such as the medical photos that I discussed above, since few of those were taken specifically with us in mind. Wnt (talk) 06:03, 2 September 2010 (UTC)

I agree with your comment about gay pride parades. The gay pride parade in Nashville is about as conservative as you can imagine, certainly a lot less sexualized than the Mardi Gras parade. I think a better example would be a porn industry conference, or something of that nature. The current example is merely perpetuating a stereotype. Kaldari (talk) 20:39, 3 September 2010 (UTC)

Has been in use

The wording "The material is or has been in use on the Commons or another Wikimedia project for educational purposes." is easily gamed. We have had a number of cases where an uploader inserted their image in some project in response to a DR. We used to have a requirement in the draft that the image had been in use for a substantial amount of time; that wording may not be perfect, but we need something that makes the passage less gameable. --JN466 23:00, 8 September 2010 (UTC)

I agree that it is "gameable" but that goes across the board, not just for sexual content. We can not opt out of a clause which is rightly regarded as one of the cornerstones of Commons policy or we will be undermining that policy itself. The draft may well have included "in use for a substantial amount of time" at some point, but I suspect that the reason it was not included is because that too is inherently "gameable". Anatiomaros (talk) 23:52, 8 September 2010 (UTC)
A Wikipedia should have regular policing, and any larger one will have a number of people watching these articles. If an image gets added to an article, those patrollers or watchers will remove it if it's not in scope for that article. Personally, I'm more concerned with the fact that people go after images that could very well be used in an article on the grounds that it's out of scope, and then get upset when people find an article it can go in and place it there.--Prosfilaes (talk) 23:53, 8 September 2010 (UTC)
Well, I remember cases where an uploader stuck their image into small projects whose languages they didn't even speak ... anyone can paste an upload in the Letzebuergesch, Swabian, Latin, Marathi and Korean Wikipedia, and keep their fingers crossed that at least one of those low-traffic projects will fail to spot and remove the image during the course of the DR. --JN466 01:39, 12 September 2010 (UTC)
Korean is hardly a low-traffic project; it's the 21st largest Wikipedia and has more than 2000 active users. I do see the problem, but just because a Wikipedia isn't English, doesn't mean it can't take care of itself. And the case I recall didn't have the image put into any low-traffic projects, and the image was not removed from any of the Wikipedias, indicating that it was in fact in scope.--Prosfilaes (talk) 04:33, 12 September 2010 (UTC)
Prosfilaes, you know that even in en:WP some vandalism sticks for months: example. To assume that a sexual image inserted in some article in a low-traffic project, and not being removed for a week or two, legitimises the image as educational, is over-optimistic. If an editor participates in a project just to insert his or her uploads, without a history of good-faith contributions in that project, and without discussion in that project, the picture being present there for a short while should not be seen as an indication of that project's consensus that the image is in fact of educational use. --JN466 21:41, 14 September 2010 (UTC)
Why do you say "even in en:WP"? Between the size of the English Wikipedia and the lack of patrolled edits and controlled revisions, I suspect it has an unusually low accuracy on patrolling. Frankly, I'm unappreciative of the desire to delete images that have a claim of usefulness as being out of scope so I'm happy to err on the side of keeping the images.--Prosfilaes (talk) 06:31, 15 September 2010 (UTC)

Good, valuable, sex articles and projects are present and growing on many different language Wikiprojects. They are formed around different consensuses and have different needs in terms of illustration. Recently these were vandalized by image deletion. We should thank those who have worked so hard to undo the damage and prevent its repeat. --Simonxag (talk) 09:10, 15 September 2010 (UTC)

Another poll

I would like to start another poll, as this draft seems to have stabilised. Are there any objections? --JN466 23:00, 8 September 2010 (UTC)

  •   Comment If there is a consensus for that over the next week or so I'll go along with it. However, as I stated above, I think we should have a clear agreement that no major amendments or significant rewording should be made once a poll is actually opened (unless agreement is reached here first). Anatiomaros (talk) 23:59, 8 September 2010 (UTC)
In the light of the way this document continues to be amended, especially the attempts to make Consent a catch-all deletion rationale, I'm inclined to withdraw my comment above. Sadly, some people seem to be trying to make sexual content of any sort almost impossible for us to host. I will not be party to the undermining of one of the cornerstones of Wikipedia. Anatiomaros (talk) 16:23, 19 September 2010 (UTC)
  •   Comment Talking about a poll seems to cause new changes, but the changes here are very small. I think that several recent versions [17] [18] [19] are quite similar, differing only in the amount of skepticism implied for evaluation of consent in old files that get proposed for AfD, and any one of these drafts is close enough to a stable consensus to vote on. Wnt (talk) 07:05, 11 September 2010 (UTC)
  •   Comment If we are going to consider running this proposal through another round of polling, then there needs to be the stipulation of a sitewide notice. Otherwise, we're going to have the same handful of proponents and detractors arguing the same points, and the results are going to be skewed in favor of whichever side happens to care enough to be checking back here weekly for the past three months, which, presumably, would be supporters.   — C M B J   18:52, 12 September 2010 (UTC)
  •   Comment The proposed policy has developed a lot. There's been plenty of debate and many issues that weren't thought through have been addressed. I support taking a fixed version and having a well publicized vote. --Simonxag (talk) 11:45, 13 September 2010 (UTC)
  •   Comment to be honest, I have seen little progress since the last poll. The page is still edited almost every other day, so it is far from stable (a requirement for becoming a policy). I don't know. TheDJ (talk) 12:43, 17 September 2010 (UTC)

Providing evidence of consent: not only the uploader can do this

From the current draft: "Media should be removed due to lack of consent only if the uploader cannot provide sufficient evidence that consent was given." Clearly, in some cases not only the uploader can provide this evidence. For example, if the picture came from Suicide Girls, anyone, not only the uploader, could clarify that Suicide Girls are an established operation that can be reasonably presumed to have obtained proper consent for an image they made available under a CC license. - Jmabel ! talk 05:08, 12 September 2010 (UTC)

Fair enough. I've adjusted it. See if it works now. --99of9 (talk) 11:03, 12 September 2010 (UTC)
No it doesn't. In most cases only the uploader knows that consent has been given, and there is often no way to tell him or her about the problem. Therefore we must choose whether to delete even totally unproblematic files, only because the uploader is no longer active on Commons and therefore will not state that consent was given, or to keep problematic files that were uploaded before this policy.
Personally I suppose we have quite a lot of images in the first category and that files in the latter category are less of a problem. People uploading ex-girlfriend pictures maliciously probably do tell the subjects about them, and so they have done the harm already (and been removed). This is a wild guess of course, and I'd be grateful if somebody with a grasp of the reality would comment.
--LPfi (talk) 19:00, 12 September 2010 (UTC)
This is the latest volley in a long-running dispute regarding the validity of plain, undocumented images. I originally wrote the sentence as
"Media should be removed due to lack of consent only if there is some reason to suspect that consent was not given."
because I think that there needs to be some iota of suspicion that an image was made without consent before it is put up for deletion. I don't see any justification for images being deleted just because of paperwork issues, especially past images but also present versions. This paperwork is just a user-submitted promise that he isn't doing something that he already knows is unethical. It's meant to prevent accidents. Wnt (talk) 02:24, 15 September 2010 (UTC)

The WMF policy of unconditional removal

I'm not participating too much in this draft at the moment, but I did want to relate a relevant discussion I had with Mike Godwin. In short, he asserted that if a person depicted in a nude or sexual image asked WMF for its removal, it would be removed, regardless of the circumstances, even if detailed documentation had been provided and kept on file.

While I don't imagine anyone here at Commons agrees with this policy, we don't really have a say in it - so rather than divert discussion towards a futile attempt to modify this policy, I suggest we keep it in mind when discussing issues like consent. Our aim should not be to prevent removal of these materials at a later time (which we cannot do) but rather to avoid damage to the depicted subjects. This suggests among other things that an informal assertion that they are "okay with it" would be as good as any kind of legal release statement. Dcoetzee (talk) 02:15, 13 September 2010 (UTC)

Honestly? I haven't read that as a major issue here. In the cases where someone has objected to such material being on here--one of the Suicide Girls, and a German S&Mer--the files have been deleted. The question is how much evidence we need that the person ever consented.--Prosfilaes (talk) 07:23, 13 September 2010 (UTC)
Actually, I do agree with that policy. It's just good manners. One might object in a case where the image has historical importance and has been widely published, something along the lines of Abu Ghraib, but other than that, it is really off to insist on making available someone's nude or sexual pictures against their will. As for an informal assertion that "they are okay with it" – we have anonymous contributors here, and there are precedents where people have simply lied, and posted stolen nude images for the lulz, or as part of a vendetta. Not the sort of thing we should make easy. --JN466 21:49, 14 September 2010 (UTC)

Unfortunately, it seems that much of our careful thought and difficult argument may be undone by the journalistically guided parallel project of the Wikimedia Foundation. A sex symbol releases an iconic image but later converts to a fundamentalist religion, a rapist in a historically significant atrocity (think Abu Ghraib or the forced "marriages" of Islamic terrorist groups) objects to public denunciation, an "anti-porn" campaigner claims that it is his penis in a condom use illustration. All images will be deleted regardless of circumstances. --Simonxag (talk) 10:42, 13 September 2010 (UTC)

I would hope that the WMF can differentiate between previously published images and user-submitted images. The whole "photographs of identifiable persons" policy is aimed at Wikimedia Commons participants who pick up the camera, which is very different from compilation of PD images. I don't want to see Wikimedia as the one place in the world where you can't have a published image, leaving the field to copyrighted enterprise. But it is fair to respond to complaints about user-submitted photos with deletion, because it's just about always going to be plausible that proper consent was never really obtained. Wnt (talk) 02:29, 15 September 2010 (UTC)
There have already been cases of people making false claims about images they found objectionable, e.g. File:Annie Sprinkle Neo Sacred Prostitute.jpg. So far they have only been limited by their lack of success. --Simonxag (talk) 08:42, 15 September 2010 (UTC)

small note

I came here to point out this update from the consultant preparing a report loosely related to this page (and there'll be more tomorrow, apparently) - I also edited the proposal with the rationale stated in the edit summary. cheers, Privatemusings (talk) 11:09, 19 September 2010 (UTC)

Again on consent

If I understand this recent edit correctly, it seems to insist that there must always be positive evidence of consent for any image that is considered sexual content, even where there is no reasonable basis to doubt consent. I'll assume good faith and presume that this is not intended simply as an arbitrary rule that will have the effect of removing most sexual content, but it seems to me that it would have that effect.

Side note: there is a bit of a paradox here. In general, I would presume we'd rather have most sexual content be of unidentifiable individuals. However, clear evidence of consent often requires that the individuals depicted be identifiable as the ones that gave the consent!

Let's look at some scenarios:

1) Are we really saying that for each image deemed to be "sexual content", even a longtime reliable contributor must provide positive evidence that there is consent from all parties to a photo taken by that contributor, even if those parties are unidentifiable in the photo? What would constitute positive evidence that the unidentifiable individuals in the photo gave their consent?

2) Similar case, but where the image was uploaded five years ago, and the uploader is no longer an active contributor. Picture is of what uploader asserted at the time to be his or her own genitalia. Is something further needed by way of evidence?

3) Published erotic photo from 100 years ago. No one involved in the making of the photo is still alive. What would evidence of consent even mean?

4) Drawing. Nothing obviously identifiable about any individual in the image. What consent, if any, could possibly be required?

5) Medical book illustration, now in public domain. Can we presume that the published work already dealt appropriately with consent issues?

Comments on any of these scenarios would be welcome, as would a suggestion of wording that would make it clear if some of these would not require positive evidence of consent, or if such consent could be presumed, etc. - Jmabel ! talk 16:17, 19 September 2010 (UTC)

I agree with all these points, Jmabel. This is like having justice based on the assumption "Guilty until proved innocent". A blanket rule like this would in effect result in a very high proportion of sexual content being deleted. Anatiomaros (talk) 16:29, 19 September 2010 (UTC)
Full agreement as well. Consent in these cases is an unusable criterion that should not be applied. TheDJ (talk) 16:54, 19 September 2010 (UTC)
99of9 says that he is only defending an existing policy. Does such a policy exist? I have never heard of it, even in the form of "case law" or something. Trycatch (talk) 17:04, 19 September 2010 (UTC)
The policy I was referring to is Commons:Project_scope/Evidence: the burden of proof lies on the uploader or other person arguing for the file to be retained to demonstrate that ... any required consent has been obtained. If case law is also important to you, consider Commons:Deletion_requests/File:Lying_on_back_in_underwear.jpg (which had age issues mixed in, but appears primarily about consent). --99of9 (talk) 12:03, 22 September 2010 (UTC)
That is with reference to copyrights, not the permission of the picture's subjects. - Jmabel ! talk 14:50, 22 September 2010 (UTC)
Hm, no, as far as I see it, the plain reading of "any required consent" includes both the copyright owner and the only other mention of the word consent on that page "the subject's consent is also required". --99of9 (talk) 12:13, 23 September 2010 (UTC)
Yes, that case law is about consent, but a case where consent was explicitly denied. I don't think any version of this policy has suggested that when the photographer tells us the subject asked for the photograph not to be shown, that we should keep it anyway. I don't think that we should have the same rules for copyright as for permission, because the rules for nudity are going to be much more strictly applied in practice.--Prosfilaes (talk) 19:08, 22 September 2010 (UTC)
Ok, you're right, I didn't look carefully enough, that was a bad example. Sometime I'll look for more if you still want case law. --99of9 (talk) 12:18, 23 September 2010 (UTC)
Also curious: on this one Simonxag voted delete, but he is the one reverting my wording now. What have I missed, Simon? --99of9 (talk) 12:56, 23 September 2010 (UTC)
I agree; there should always need to be some reason to take action before we start deleting.--Prosfilaes (talk) 19:10, 19 September 2010 (UTC)
It seems to me that almost everyone involved here recently has been in agreement with the broad outlines of the policy on sexual content (why Commons needs to host some sexual content, what sort of criteria would be applied to determine whether an image is in scope), even if inevitably some of us would make different judgments than others as to whether certain images have, for example, any educational value. The most important remaining area of disagreement is precisely this one: how explicitly consent of the depicted individuals needs to be demonstrated, and whether the consent standards might be different for different types of images (different media, different provenance, etc.).
I would still like to see if we can hammer out a bit more consensus on the consent issues. I was really hoping my list above would provoke a response from the people who want the stricter consent rules, not just a piling on from those of us who think those strict rules are problematic. 99of9, Privatemusings, or others who want the stricter criteria for consent: can you tell us how you would address any of the 5 cases above? I'm trying to work out whether we disagree over what the policy should mean or over how to word the policy, and insofar as we disagree about what it should mean I'm trying to work out if we can define the disagreement more narrowly, so that we can itemize what is actually in dispute and possibly eventually poll about each of the issues that are in dispute. If you don't chime in here, we have no way to know whether we have a deep disagreement about intent, or merely a shallow one about wording. - Jmabel ! talk 21:37, 19 September 2010 (UTC)

< I'm happy to chime in (patience! :-) - I'll go through the scenarios later, when I get the chance, and agree that it's worth being careful on these, and other potential, 'boundary' issues. Do you mind if I reply 'in thread' above? cheers, Privatemusings (talk) 01:43, 20 September 2010 (UTC)

Privatemusings, I am repeating the list below as a place to reply in-thread, because people will inevitably reply to your replies, and if they did that above it would soon be very hard to see what I'd originally written. - Jmabel ! talk 18:54, 20 September 2010 (UTC)

I will also comment on these 5 points, but life has just got very busy for a few days. --99of9 (talk) 23:38, 21 September 2010 (UTC)

Space for in-thread discussion follows; please sign your comments.

1) Are we really saying that for each image deemed to be "sexual content", even a longtime reliable contributor must provide positive evidence that there is consent from all parties to a photo taken by that contributor, even if those parties are unidentifiable in the photo? What would constitute positive evidence that the unidentifiable individuals in the photo gave their consent? - Jmabel ! talk

Whilst I may see the merits in requiring 'positive evidence', what I support currently is simply the presence of an assertion of the consent of all parties, combined with an approach akin to our examination of copyvio (i.e. is it credible or not). Privatemusings (talk) 03:34, 22 September 2010 (UTC)
I don't believe the idea of positive evidence for copyright can be extrapolated to this situation. It is very easy to copy and paste something from the Web, and it is very easy for someone who hosts content online to send a communication to OTRS to prove consent to put it in Wikimedia Commons — the mere fact that the e-mail comes from an official address at a site which (apparently) has the right to host the content is a pretty strong piece of evidence. But taking a naked photograph of someone and uploading it to the internet is not so simple without consent, and obviously involves some substantial ethical transgressions to begin with. Lying to us and saying "yeah bro, I had consent, sure" is just not a barrier by comparison. An email from some random account doesn't really tell us much either. There's really not much to do but assume good faith, just as countless other sites do that host images on the Web. Wnt (talk) 19:54, 22 September 2010 (UTC)
An assertion counts as something to me. If that was present, then I personally would not nominate it for deletion unless something else looked dodgy. If it came up for deletion, I would consider the contributor's standing, and if they are in good standing, I would accept their assertion, just as I accept those in good standing who assert statements about the ownership of copyright. However, I believe policy is also clear that when challenged, they should be responsive to the challenge and provide more evidence if there are some doubts. This could come in many forms of course, and could be looked at on a case by case basis. --99of9 (talk) 13:12, 23 September 2010 (UTC)
Per 99of9. A clear, unambiguous assertion that the person depicted has consented to the media being uploaded (not just the picture being taken) is the minimum. --JN466 12:29, 30 September 2010 (UTC)

2) Similar case, but where the image was uploaded five years ago, and the uploader is no longer an active contributor. Picture is of what uploader asserted at the time to be his or her own genitalia. Is something further needed by way of evidence? - Jmabel ! talk

If the uploader asserted it was their genitalia, then consent is clearly stated, in my view. Again, we should examine such media on a similar standard to how we examine copyvio for credibility. Privatemusings (talk) 03:34, 22 September 2010 (UTC)
I agree with PM. If on the other hand it said "my girlfriend" without an assertion of consent, it should be deleted. --99of9 (talk) 13:12, 23 September 2010 (UTC)
Agree with PM. --JN466 12:31, 30 September 2010 (UTC)

3) Published erotic photo from 100 years ago. No one involved in the making of the photo is still alive. What would evidence of consent even mean? - Jmabel ! talk

Personally, I would take an image over a certain age to strongly imply the consent of all parties too. I would accept an explicit exemption from any new criteria for media over 100 years old with no worries :-) Privatemusings (talk) 03:34, 22 September 2010 (UTC)
I agree with PM. This is quite similar to the engraving question you asked earlier. In those days boyfriend-girlfriend snaps were not common, and all photographs implied the consent to publish. In 100 years I will not accept a private digipic from 2010 because cameras have moved out of the professional realm. --99of9 (talk) 13:12, 23 September 2010 (UTC)
Per PM. --JN466 12:34, 30 September 2010 (UTC)

4) Drawing. Nothing obviously identifiable about any individual in the image. What consent, if any, could possibly be required? - Jmabel ! talk

Claims that media is 'drawn' should be evaluated with the same rigour with which we examine copyvio (again!) - we have had crudely filtered photos uploaded claiming to be drawings, and I hope we're sensible enough to assess credibility fairly. Consent of a fictional / non-existent person depicted in a drawing is not, of course, required :-) (there is no individual to supply consent) Privatemusings (talk) 03:34, 22 September 2010 (UTC)
I agree with PM, and he makes some interesting points that I had not thought about. --99of9 (talk) 13:12, 23 September 2010 (UTC)
Per PM. --JN466 12:34, 30 September 2010 (UTC)

5) Medical book illustration, now in public domain. Can we presume that the published work already dealt appropriately with consent issues? - Jmabel ! talk

A medical book, or indeed any previously published work, which has lapsed, or been released, into the public domain is likely to find a good home here on Commons. Again, such claims should be assessed in the same manner we address copyvio, for credibility etc. cheers, Privatemusings (talk) 03:34, 22 September 2010 (UTC)
Book yes, Flickr no. --99of9 (talk) 13:12, 23 September 2010 (UTC)
Per PM. --JN466 12:34, 30 September 2010 (UTC)

I strongly oppose the recent edit because it would result in the deletion of the vast majority of sexual photographs, most of which are not likely to involve consent issues. It would therefore be very difficult to implement, mostly unnecessary, and would probably result in the proposed policy failing to gain consensus or being discredited or otherwise ignored. I am replacing it with a compromise proposal saying that evidence of consent is necessary only when the subject of the media is likely to be publicly identifiable. That would still allow distinctive genital birthmarks but not full faces. 71.198.176.22 10:49, 21 September 2010 (UTC)

I also oppose that edit, and I don't support the deletion of unlabeled old images whether they are identifiable or not. I think that we should just take both versions to a vote and get a broader group of people to pass judgment on the issue (see below). Wnt (talk) 19:46, 22 September 2010 (UTC)

Attempt at hammering out more of a consensus

OK, still stumbling toward something here so I don't want to edit the project page—unlike some, I prefer to develop something closer to consensus first—but maybe we need something like the following in the policy:

  • Future uploaders of their own photographs that fall under the heading of sexual content are expected to make a positive assertion that the subjects of the photos consented to the photograph and to its public release. Provision of further evidence of consent is welcome (using the usual COM:OTRS process) but is not normally required.
  • There can be no presumption of consent for uploads of sexual content of somewhat unclear provenance (e.g. from a "random Flickr account"). These require positive evidence of subjects' consent.
  • The issue of subject's consent does not arise for drawings and other non-photographic representations (unless they are of identifiable individuals).
  • Material from reputably published sources that has passed into the public domain or is available on a free-licensed basis will generally be presumed to have dealt appropriately with issues of the subject's consent unless there is evidence to the contrary. This would apply, for example, to illustrations from a medical textbook or a sex manual that has passed into the public domain, or to images that an organization such as Suicide Girls has released under a Creative Commons license.
  •   Because of the near-impossibility of verification, the subject's consent will generally be presumed for older photographs (before 1950), unless either (1) the individual in the photo is identifiable and is still alive or (2) there is concrete evidence that the picture was taken without consent.

This doesn't yet address the issue of existing images that are uploaders' own photos, especially if that uploader is no longer available to discuss the matter. We are farther from consensus on that. But I think that, possibly with minor tweaks, the above is at least very close to a consensus (which is not to say unanimity—there are some irreconcilables on both sides here—but I think there will be very broad agreement on these). - Jmabel ! talk 19:36, 23 September 2010 (UTC)

I think that we should worry about consent on living and recently dead individuals only. If a picture was taken surreptitiously in 1942, of someone now long dead, it no longer matters whether they consented or not.--Prosfilaes (talk) 19:59, 23 September 2010 (UTC)
looks pretty good to me, giving it a go.... Privatemusings (talk) 00:40, 24 September 2010 (UTC)
I added three cases: public events, historically significant events (Abu Ghraib) and the disputed uploads from users no longer active.
  • Public events where photography is expected is probably unproblematic, but the case should be mentioned.
  • In the Abu Ghraib case the images in question are regarded to be in public domain, but there might be events where images circulated in the press are under copyright while a Commons user has taken similar pictures. I think Commons might in some cases be the first to publish such pictures, but then not only the uploader should think about whether public interest or privacy is more important. Some procedure should be developed, but that is a later issue.
  •   Photographs by Commons users no longer active are hard to reach consensus on. I think that is no reason to forget about them. My wording is an attempt at being neutral until we come nearer consensus (or consensus on where our positions differ).
--LPfi (talk) 07:27, 24 September 2010 (UTC)
The problem with these changes is that they add thresholds like "historically significant", "reputably published", "before 1950" for which I really don't see justification. I'm not making changes to your text at this time because I've decided to favor a poll for each of the two positions on consent, and I don't just want to be edit-warring about all these new conditions while you're developing your version, but I don't think they're a good idea. Wnt (talk) 01:56, 25 September 2010 (UTC)
  •   Also, the current version is very messy. The bit that lists "additional restrictions" has only ONE additional restriction as far as I can tell, and then a long list of subjects that are or are not assumed to be subject to that restriction. I think we should still be aiming for one version, because if we're going to vote, there needs to be more than two versions (e.g. the current version does not include my preference that old uploads be treated consistently with new uploads apart from an extended grace period).--99of9 (talk) 11:08, 27 September 2010 (UTC)

Threat to important images

The way the discussion is developing shows that a requirement of consent is a serious danger to sex educational images.

Take the whole range of our user generated artwork, which is both useful as it stands and needs to be expanded. If there is no specific model, no consent can be forthcoming. But, as people have already started to mention, questions can always be raised. Any image that isn't identifiable might be anyone. The impossible requirement of consent can eliminate any artwork that a deletion proposer chooses to target.

As for our educational photographs of sex organs etc., the requirement of consent will remove a good number. We have just one photograph of the results of female genital cutting (warning: distressing image!). It is in use on a number of projects and documents an important subject. The description reads "photo of my 29-year old egyptian girlfriend who had FGM as a child" and is the sole contribution of the uploader. 95% of Egyptian women have undergone some form of FGC. I know nothing of the circumstances of the taking of this picture, nor do I or anyone else have the right to.

"Consent" may actually be a threat to the anonymity of the subject and artist. In many countries the production of sexually explicit material can lead to serious legal consequences (in Iran the death penalty but long prison terms in many places) and it is not uncommon for sex educators and AIDS activists to face persecution and even jail. The USC 2257 legislation actually has provisions preventing its use against the subjects of the artwork (it is supposed to be there to protect young people from being exploited by the porn industry), but such protection will only work in the US. I am happy that our contributors can hide behind effective anonymity. --Simonxag (talk) 18:05, 2 October 2010 (UTC)Reply

Basically, your argument is that we can only be educational by stealing pictures of other people and exploiting their sexuality. That is really immoral and not acceptable practice on WMF websites. No one is forced to produce pornography of themselves, so any legal problems they may be subject to are their own choice. By hiding behind anonymity, you are destroying thousands of innocent people who do not want people surfing Wikipedia to masturbate to them because some pervert stole their image and uploaded it to show off to their friends. Ottava Rima (talk) 23:37, 3 October 2010 (UTC)
The image Simonxag is discussing is certainly not pornographic. His point is that statements of consent seem to require individual identification of models. I have to say, I largely agree with that concern: images where the person photographed is not identifiable by our readers seem preferable for most sexually-related topics, and I don't think we have the security systems in place to safeguard identification if we had it. - Jmabel ! talk 00:59, 4 October 2010 (UTC)

I think I've identified two important groups of images here: one, where anonymity is chosen, where evidence of consent will not be forthcoming; the other, artwork, where evidence of consent is literally impossible. If suspicion is not required to be reasonable, all such works will be liable to be deleted. --Simonxag (talk) 11:00, 4 October 2010 (UTC)

I don't think the reasonability clause is at issue in either of these cases. In the artwork case, which is similar to jmabel's issue #4 above (drawings), if there's evidently no human subject, there's no requirement. In the case of preferred anonymity, an uploader's statement will still be possible. In anonymity cases it may also help to somehow state in the consent template that the reason that all further evidence will be anonymized is that the subject has requested this. --99of9 (talk) 12:50, 4 October 2010 (UTC)
Artwork may very well depict identifiable persons. In fact mostly it does.
Request for anonymity can be stated by a parameter in the template, but I think it mostly adds to complexity: if the model does not want to be anonymous, the name (more likely: a pseudonym) is probably given in the image description. In the case of people not active as models I suppose most would prefer anonymity.
There are now no guidelines in the policy proposal about how to weigh (un)reasonable suspicion of non-consent against the very vague "evidence" of consent we will get in the typical cases. What evidence can reasonably be expected about an anonymous model, other than the uploader saying "yes, the model consented", or stronger, "yes, really, the model did, I did not misunderstand and I am not lying"?
Talking of evidence is misleading here. We can discuss evidence that the uploader has understood the issues and says (s)he's got consent. Other than that, the only thing that really is relevant (for the anonymous cases, which I feel are very important) is the trustworthiness of the uploader or anybody claiming to know the uploader or the model. Calling an impression of trustworthiness "evidence" is not calling a spade a spade.
Even if any deletion will go through the full deletion process, the wording here is important, as it (or the impression of it) will be used in the debates. It will not be nice to discuss the trustworthiness of specific users, but (as I see it) that is what the consent issue boils down to, if a statement or assumption of consent is contested.
--LPfi (talk) 09:26, 5 October 2010 (UTC)
I hear what you're saying, but I still think the best word is evidence. If I am willing to defend a file I upload with evidence, that will prevail no matter whether I'm untrustworthy. For example, if it gets to a deletion debate, my subject may be willing to put in an OTRS job. Or I might have a signed contract with the model that I could provide on request. I'm not saying that's required, but I'm saying that evidence trumps trustworthiness. --99of9 (talk) 10:00, 5 October 2010 (UTC)
You need to establish the connection between the model and the consenting person. Even if the model trusts the OTRS people not to break anonymity (and would you, if you were e.g. a politician?) that is hard to do. How would you suggest proving the e-mail was sent or the contract signed by the anonymous model? --LPfi (talk) 12:02, 5 October 2010 (UTC)
Evidence is not the same as proof. If someone is willing to say that it's them in a picture, that is evidence. Then we have reached at least the same standard of evidence as when the uploader says "this is my genitalia", see jmabel's case #2, except now we have two people saying it: the photographer and someone claiming to be the model. Politicians posing anonymously for sexual content on wikipedia sounds like an over-stretched test case to me. By the way, another alternative to OTRS is to break anonymity to one or two well-chosen and communally trusted editors. --99of9 (talk) 13:07, 5 October 2010 (UTC)
Except without a complicated process we do not know even that. The person sending the e-mail may be the photographer or may have nothing to do with the image. In the case of the uploader we can at least presume it is the same person (or people trusting each other) every time the same account is used.
For breaking anonymity to trusted editors, this is what I would do: I would upload sexual content under a separate account and ask somebody that trusts me, that I trust and that the community trusts to declare (s)he knows consent has been given (after my having disclosed at least some of the circumstances to that user). But that is about trust, not evidence. Using a subgroup of the OTRS team will help keep the confidentiality, but will not help with the problem of knowing whether the consent was given by the model or by somebody else.
--LPfi (talk) 16:24, 5 October 2010 (UTC)

Every time the consent gets discussed, the threat looks more serious. Nude artwork is based on the artist's work on real people (perhaps many life models, perhaps one lover) and any non-identifiable image (art or photo) can be claimed to be anyone (gender etc. permitting). Add a bit of moral claptrap about thousands of people masturbating and you have a requirement for "evidence". This, as I said before, will be impossible for the artwork and simply not forthcoming (for good reasons) for the rest. All artwork and (at least) our existing educational photos need to be protected from this new mass deletion rule. --Simonxag (talk) 11:32, 5 October 2010 (UTC)

strongly recommend taking a look

Robert Harris, the consultant hired by the Wikimedia Foundation to examine the issue of 'controversial content', has now made some recommendations. I broadly support the measures he's suggesting, and recommend taking a look. Privatemusings (talk) 03:36, 22 September 2010 (UTC)

I think it is at least interesting. However, I wonder about how we are to determine whether "the intent" of a particular image "is merely to arouse, not educate". The problem is that this is not inherent in the image. Courbet's famous painting L'Origine du monde is not all that visually different from a porn image. Now, in this case, the exception for "art" clarifies the matter - few would deny that Courbet was an important artist - but in most cases the line between art and dreck is one different people will draw in different places. I have no idea how this will help us decide whether this Second Life BDSM image is in scope. It is clearly intended at least partly to arouse, but is also a possibly useful illustration for an article about the use of a St. Andrew's Cross in bondage. So, I'm not sure whether, with respect to imagery, he is offering anything clearer than what we started with. - Jmabel ! talk 05:12, 22 September 2010 (UTC)
I think the proposal is terrible, but it actually has relatively little to do with sexual content as defined here - topless women, for example, are not covered by this proposal. I think that a better policy was made here, and the last quibbles are really hovering pretty much in the same place. Wnt (talk) 19:33, 22 September 2010 (UTC)Reply
I haven't had the time to digest and dissect every clause and word, but the wording of parts of the section headed 'Recommendations on “Controversial” Images' (and in my opinion the proposed designation of "controversial categories" is itself "controversial") should put all of us who believe in the central tenet 'Commons is not censored' on our guard. Read between the lines (my italics):
(4.) "That Commons editors and administrators review the application of the existing Commons policy on educational scope to images of nudity in Commons, where breasts, genital areas (pubis, vulva, penis) and/or buttocks are clearly visible, and the intent of the image, to a reasonable person, is merely to arouse, not educate, with a view to deleting such images from Commons." As Jmabel notes, how do we decide that, and would somebody care to define who exactly is "a reasonable person"?
(5.) "That historical, ethnographic and art images be excluded from such a review, and be considered in virtually all cases to be in scope." "Virtually all" means "some [which?!] should be deleted". That opens the door to all sorts of attempts at censorship of the world's artistic heritage. This seems contrary to the consensus we have here and is not acceptable. Moral deletionists would seize on that (further defined as "perhaps with the occasional exception") to nominate for deletion all sorts of artistic images; even if we are likely to reject all possible deletion requests under this clause, do we really want to have to waste our time and resources dealing with that, for no reason at all?
"Beware the wolf in sheep's clothing"! Anatiomaros (talk) 23:01, 22 September 2010 (UTC)
Here's another gem: "And, although not our primary motivation in making this recommendation, it must be noted that they [i.e. sexual images] are offensive to many people, men and women alike, and represent with their inclusion a very clear bias, and point of view – that of woman as sexual object [presumably "man as sexual object" is OK?]. They are far from neutral. We would never allow this point of view untrammeled and unreflexive presence on any Wikipedia site as a clear violation of NPOV – we should not allow it on Commons either." Poor grammar aside, that is a very worrying statement. Many of our images clearly do portray a POV, e.g. propaganda posters, political cartoons. Does their inclusion here also come under NPOV, which in any case is not a rule on Commons? Anatiomaros (talk) 23:19, 22 September 2010 (UTC)Reply
Yeah, the person who proposes censorship for a single category of offensive pictures (offensive from a North-American WASP point of view) does this in the glory of NPOV. Most of the good photographs (of buildings, landscapes, cars, ships, aircraft, etc.) are not neutral, because they arouse (and are intended to arouse) some feelings, but only one certain sort of feeling should be censored for The Great Neutrality. A pretty bit of doublethink. And does he not even know that there is no Wikipedia-style NPOV policy on Commons? After all these months of research? Trycatch (talk) 01:31, 23 September 2010 (UTC)

I've only just noticed part three is out now. --Simonxag (talk) 22:11, 11 October 2010 (UTC)

Let's have a vote on two versions (withdrawn)

The debate over whether unannotated images should be deleted by default has gone on for months, and I don't see any progress (from my perspective) since my September 11 version.[20] I think we should just go ahead and hold a vote on two versions, i.e. mine and whatever version the delete-by-default faction would propose, assuming they agree amongst themselves on a single wording. The existence of such disagreements is inevitable, and we should not expect debate over the policy to stop once a vote is taken, but we're as finished as we're ever going to be, and I'd like to have a sensible policy proposal go to the public before someone else tries to redo the whole thing from scratch. Wnt (talk) 19:43, 22 September 2010 (UTC)

I could support such a plan. Progress is not in sight. TheDJ (talk) 23:59, 22 September 2010 (UTC)
Well, there is some progress in #Again on consent. Privatemusings' intent as stated there in response to more specific examples is apparently a lot closer to my view (and that of what I believe is a majority of people who have been working on this) than at least one of Privatemusings' edits to the project page has suggested. I think there is still a chance of broadening the group who are in consensus by working out a clearer wording for the project page. I am waiting, however, for 99of9 to weigh in, because that seems to me to be the most likely source of actual disagreement about intent. - Jmabel ! talk 03:00, 23 September 2010 (UTC)
It's possible to divide the proposal into sections and collect votes on each section. - Stillwaterising (talk) 10:49, 25 September 2010 (UTC)
As far as I can tell, the evidence-of-consent issue is the only one that we've failed to come to an agreement about. It's easier to vote on just two drafts and get a vote in favor of a complete policy than to vote on nine sections and end up with a "this is policy-but-this-is-essay" hybrid, with the disagreement still unresolved. Wnt (talk) 15:56, 25 September 2010 (UTC)Reply
  • Could one of you please summarise what they perceive to be the key difference between Wnt's version and Privatemusings' version, as represented in this diff? I can't get my head round it. :( --JN466 12:42, 30 September 2010 (UTC)
This draft isn't so bad as some of the others. Mainly I was concerned about the part where it says "Because of near-impossibility of verification, the subject's consent will generally be presumed for older photographs (before 1950), unless either (1) the individual in the photo is identifiable and is still alive or (2) there is concrete evidence that the picture was taken without consent." I have to say, initially I interpreted it to apply to previously published photographs, which made me take it badly, but looking it over again I think that it should already be referring only to old self-made photographs. Also, it takes a more restrictive view of Flickr photographs than I would have preferred, and I worry about how "reputably" will be interpreted (after all, some would dispute that any pornography is reputable...). And I would have preferred to see these things limited to interpreting the photographs of identifiable persons policy rather than adding to it. And I'm worried about what the OTRS documentation process could come to. These things said, it could be worse, and the progress of discussion here and external events (the Robert Harris study and the Tyler Clementi tape) drive me to accept this draft, with one small change to make it clear that the 1950 thing doesn't apply to published photographs. Wnt (talk) 19:09, 5 October 2010 (UTC)

Assertion of consent

We were discussing assertions of consent, above. Apparently, Flickr have a system whereby uploaders are required to click a box, asserting that they are not violating anyone's copyright, and are uploading images with the consent of those depicted. I am not a Flickr user, and have no experience of this, but it seems a useful idea. Clicking a box is an action that has a legal validity; it's commonly used in online interfaces to assert that the user has read licence information, disclaimers etc., and assents to them. Thoughts? --JN466 23:59, 30 September 2010 (UTC)Reply

I'm a frequent Flickr user with a "pro" account. FWIW, I can't remember ever being prompted for this. - Jmabel ! talk 04:10, 1 October 2010 (UTC)
Interesting; perhaps I was misinformed. Could you have a look at whether something like this comes up when you indicate you want to upload an image with adult content? --JN466 13:59, 1 October 2010 (UTC)
Flickr's TOS is rather clear on a proactive enforcement that they take in terms of fraudulent uploads: "Flickr expressly reserves the right to immediately modify, delete content from, suspend or terminate your account and refuse current or future use of any Yahoo! service, including Flickr pro, if Flickr, in its sole discretion believes you have: (i) violated or tried to violate the rights of others; or (ii) acted inconsistently with the spirit or letter of the TOS, the Community Guidelines or these Additional Terms. In such event, your Flickr pro account may be suspended or cancelled immediately in our discretion, all the information and content contained within it deleted permanently and you will not be entitled to any refund of any of the amounts you've paid for such account. Flickr accepts no liability for information or content that is deleted."
Flickr used to have, at least, a verification process similar to Facebook's current one. If they got rid of it, then they are making themselves more liable. Ottava Rima (talk) 14:22, 1 October 2010 (UTC)
In my experience they are a useless distraction (other than as a legal formality in USA): you have already made the decision to upload the file, the text you are supposed to read is either too long for people to bother or short enough that you read it already – and there is no possibility to ask for clarification if you do not understand the issues. Sometimes the text you are supposed to accept doesn't even show up (in a readable way).
The box, though, is typically reduced to “To upload the image you must click here. That means a bunch of legal obligations, but never mind. Nobody else cares either.” There is no reason to believe that people used to such boxes would treat the Commons' one differently.
It is better to have a box that you can either check (which adds the template) or leave unchecked. If unsure, leave it unchecked, and real people will ask you about the details, as they relate to the specific image. Or just suppose serious users uploading sexual content will have read the policy and be able to add the assertion or template by hand.
--LPfi (talk) 11:08, 1 October 2010 (UTC)
Speaking for myself, if the upload didn't work until I had clicked a box saying, essentially, "I assert that I have the consent of the people depicted for uploading this picture", it would make me think twice (and would also make me think about possible repercussions if I were lying). That wouldn't be a bad thing for media featuring sexual content. --JN466 13:59, 1 October 2010 (UTC)Reply
I haven't objected to the use of an advisory consent template, used to help make sure uploaders know relevant policies. I would also agree that the current warning on the upload screen ("Compromising or embarrassing images of non-public people taken without their knowledge are often problematic. Use good judgment.") may be too laid-back. But I don't support a box to click, because the interface can't tell if the freshly uploaded image is sexual, and there are millions and millions of uploads of insects and waterfalls and scanned documents that would be burdened by it. Bear in mind that even if clicking a box takes only one second, and no one is ever confused and gives up the upload, for every 2,628,000 images that is still a solid month of time spent, day and night, on the little clicks. And as explained previously I also oppose any dumping of good content over a new paperwork obstacle. Wnt (talk) 14:47, 1 October 2010 (UTC)
Well, you could add a step on the "entirely my own work" upload screen asking the uploader whether the file they want to upload features sexual content or nudity, and have the click box only appear for those users who say yes. Sexual content uploaded without passing through that step would be deleted on sight, and the uploader asked to upload it again, going through the proper screen, and clicking that box. --JN466 15:28, 1 October 2010 (UTC)Reply
You know what will happen if you add a checkbox and require people to check the box in order to upload an image? They'll check the box. It doesn't matter what the box says -- people are so conditioned to clicking boxes saying things like "Yes, I have read the terms and conditions" that they'll do it on autopilot. --Carnildo (talk) 19:14, 1 October 2010 (UTC)Reply
And if the uploader checks the box but it turns out they stole the image, guess what happens? They go to jail under various fraud charges. Ottava Rima (talk) 23:38, 3 October 2010 (UTC)Reply
Please don't be ridiculous. The chance of anyone "going to jail" over inaccurately representing whether they had consent to use an image is close to nil. For starters, like you (and unlike me), most of our contributors work under pseudonyms that leave them effectively anonymous. While it is possible that someone could get in legal trouble for distributing certain sexual imagery without the consent of the parties depicted, it would not make one whit of difference that they had checked a box that claimed to have such consent: it's not like it was a sworn statement in a court of law. - Jmabel ! talk 01:03, 4 October 2010 (UTC)Reply
Fraud convictions happen quite regularly, especially when there is a notable situation. Fraud would be an additional charge as prosecutors tend to stack charges. Ottava Rima (talk) 20:34, 4 October 2010 (UTC)Reply
This is a bogus argument. You can't defraud Wikipedia by uploading unusable or illegal content — you're not getting paid for it! And Wikipedia's check box has nothing to do with any other victim of fraud. That's like saying that the wag who posted pictures of a naked girl spraying herself with whipped cream under "People eating" committed fraud by putting it in the wrong category. Wnt (talk) 18:39, 5 October 2010 (UTC)Reply
"You can't defraud Wikipedia by uploading unusable or illegal content" Yes, you can. You can defraud anyone that has you claim that what you publish is legally yours to publish when it is not. Ottava Rima (talk) 20:08, 6 October 2010 (UTC)Reply
Anti-fraud laws deal with getting money (or equivalent) in naughty ways; at the best of times they are hard to enforce, and they do not apply to mere liars. Pranksters can be prosecuted if they, for example, mislead the courts or even just cause the police (or equivalent) to waste their time; but if they merely lie, "fraud" laws won't get them. Defamation and intellectual property are generally civil matters. The criminal laws that you might hope apply are privacy-related "peeping tom" laws, but they'll apply in the jurisdiction where the "offense" took place: depending on the country, the authorities there may be chiefly interested in having the victim of the peeping tom stoned to death for adultery. --Simonxag (talk) 23:34, 6 October 2010 (UTC)
Moreover, if someone wants to upload "stolen" photos to the Internet for their own purposes (revenge, defamation, etc.), there are thousands of sex-related web sites with bigger visibility than Wikipedia where they can do it... --Yoggysot (talk) 02:29, 4 October 2010 (UTC)
I agree that a checkbox won't send you to jail. But evidence of malice is additive, and IM(non-lawyer)O, it may occasionally be useful to a defamation case to have clear evidence that an uploader was actively made aware of the requirements, and chose to flout them. Perhaps I shouldn't speak outside my area of expertise. Is yours a legal opinion jmabel? --99of9 (talk) 05:40, 4 October 2010 (UTC)Reply
If there are thousands of other websites, then they can upload any of the images there and we should close off Wikimedia to the possibility. Let criminal activity happen elsewhere. Ottava Rima (talk) 20:34, 4 October 2010 (UTC)Reply
Making users think for one second about consent is a perfectly healthy thing to do for a site with a serious mission like ours, just like they have to think about copyright for at least one second. I think the best method is to handle it like copyright, with an additional drop down option box (that adds templates). I don't think it's a good idea to actually prevent upload, since most media doesn't even involve people. Obviously we would only chase up missing consent if there was a person in the sexual (or identifiable private) photo. --99of9 (talk) 05:34, 4 October 2010 (UTC)Reply
If we add a dropdown, how do we have it not affect the 99%+ of users who never add anything even remotely sexual? - Jmabel ! talk 16:17, 4 October 2010 (UTC)Reply
Doing nothing to the dropdown could either (a) not add any template to the file (just like I routinely leave the "other versions" blank), or (b) by default be set to "This file has no identifiable people in it". Remember that consent will also help those users who upload photos of identifiable people in private locations - perhaps more than 1%. --99of9 (talk) 09:48, 5 October 2010 (UTC)Reply
I like 99of9's idea of combining this with how we handle identifiable people in private locations, and I also like the idea of a drop down box defaulting to "This file has no identifiable people in it". Excellent idea. --JN466 14:27, 5 October 2010 (UTC)Reply
I think that a dropdown that doesn't hinder uploads is entirely acceptable and only makes this easier. I don't think that no action should lead to an explicit assertion (such as "no identifiable people") because otherwise the point of such a template, such as it is, is lost. Eventually someone is going to go through sexual content categories looking for consent violations, and they should know which pictures the poster really said have no identifiable persons. That way (for example) someone can specialize in looking for license plates, distinctive tattoos, watermarks etc. in images dubbed non-identifiable, without wasting a lot of time. Wnt (talk) 19:13, 5 October 2010 (UTC)Reply

On evidence

99of9 linked "evidence" to Commons:Project scope/Evidence. I had not noticed that policy before, and I do not like this. It is mostly about copyright, where a request for anonymity is uncommon. The parts that most concern us say:

  • "Also, the copyright owner/author should be identified, if known or reasonably ascertainable."
  • "In all cases, the burden of proof lies on the uploader or other person arguing for the file to be retained to demonstrate that so far as can reasonably be ascertained: [...] that any required consent has been obtained."

There is the "reasonable", which might be enough, but I think the linked policy is not written with our issues in mind and should either not be referenced from here or rewritten (its relation to this policy should in any case be stated explicitly).

I think that stub policy adds nothing to our effort. The question of what level of evidence is needed should either be discussed here or left to discussions about individual files.

--LPfi (talk) 11:47, 9 October 2010 (UTC)

I think the one part that most concerns us is "Where the file is a photograph which shows an identifiable person, the subject's consent is also required". Our case is strongly parallel to this: we now simply also require it for most sexual content even if the individual is not identifiable. This was discussed briefly in the section above called Again on Consent. I presumed the silence after I gave the "case-law" was indicative that most had accepted my reading of the consensus. 99of9 (talk) 12:18, 9 October 2010 (UTC)
The issue is what kind of evidence we will need. With copyright we usually have either 1) a link to a source, which we can check, 2) an OTRS mail or 3) an author claiming own work with no earlier publication. If the image is found elsewhere, with an earlier date of publication, we can usually assume copyright infringement. No problem here.
With identifiable persons there is of course the paparazzi problem. My understanding is that photos that seem to have been taken in a private place are deleted if there is no evidence at all that there is consent, or if the uploader is clearly untrustworthy. But if the uploader is a regular contributor and states he has consent, I think that is usually enough. The policy is not used to require signed documents etcetera.
For sexual content where the person is not identified but could be identified given similar photos, we have a real problem: some would like to have the image deleted because it is sexual in nature, and proving consent without giving away the identity is hard. We had that discussion and no viable means were suggested.
So requiring proof will be equal to forbidding anonymous models photographed by contributors without professional book keeping. That would be a disaster. And for very little good, as the "see my ex girlfriend's pussy" images can be published elsewhere and are probably seldom of such quality that they cannot be deleted as being out of scope.
I see no problem with the near consensus here about not deleting sexual content without real worries about (e.g.) consent issues. The evidence policy is written for copyright issues and is not suitable for sexual content. As we are demanding consent also for images for which evidence was not required before and as we have discussed these matters thoroughly, I think it is totally acceptable to amend that policy to exclude images that fall under this policy, where consent also is handled.
(In the DR discussions referenced above, consent was stated only in one and that one had other issues. If lack of consent was the reason for deletion, then one might note that consent was handled by OTRS. I think statement of consent by the uploader has been enough in most cases.)
--LPfi (talk) 07:51, 12 October 2010 (UTC)
Why are you repeatedly caricaturing me as "requiring proof"?? I already replied the first time you said this that evidence is different to proof. All I am doing here is linking to our evidence policy. Maybe it's because of the phrase "burden of proof", which, in context, is just talking about whose responsibility it is to convince the other side, not about actual proof. Obviously we do not want to reverse that! What do they have to do according to our evidence policy? "demonstrate that so far as can reasonably be ascertained ... that any required consent has been obtained." That seems perfectly fair to me. If the community is usually happy with a trusted uploader demonstrating by saying "yes I have obtained permission to release this", then the community will continue to accept this as a demonstration whether we link to that policy or not. Anyway, if you really want to argue for a change of that other policy, that will really have to be done over there, and we can argue about it after this one gets approved. (P.S. I also do not accept your assertion that the evidence policy is "written for copyright". It clearly spells out that it also applies to consent issues.) --99of9 (talk) 12:06, 16 October 2010 (UTC)
Let me see if I've got this straight. You want to link a page which indicates that evidence of consent is only ever needed for identifiable people (as certainly would be the case for non-sexual content). I thought it was a position you were arguing against before. I would agree that for much sexual content that is, indeed, the case, but clearly there are some images (e.g. upskirt photos) where internal evidence argues that consent probably was not obtained. I just don't think the content on that page is relevant. - Jmabel ! talk 16:34, 16 October 2010 (UTC)Reply
No it doesn't say "only ever needed", it says for identifiable/public that consent is "also needed". I agree once this sexual content policy is agreed, that page would naturally add a line something like "in some cases defined by COM:SEX consent is also needed". The point of the page is that it's the uploader's responsibility to show enough evidence for "all consent" (since they're the one with all the information). 99of9 (talk) 21:55, 16 October 2010 (UTC)Reply
I would take the position that the evidence policy does not, or at least should not, require proof that the subject has consented, since otherwise a very large number of photos on Commons would be under threat. But it is an existing relevant policy, whether well worded or not, and I can't seriously argue not to link to it. Wnt (talk) 06:27, 17 October 2010 (UTC)Reply

Are we done?

I can't promise the draft is bug-free — I just spotted a provision that would have appeared to ban ribald political cartoons, which I changed (hope you don't mind). But since I've abandoned my proposal to put up my old draft independently, while others have moderated the more objectionable terms of theirs, I think we're at a single consensus version ready to vote, unless there are further objections. It doesn't seem to trigger any mass deletions of old content (though the content may be scrutinized), but does demand a positive assertion of consent — something I've opposed, but which might help uploaders avoid legal trouble that seems distinctly more likely than it was a few weeks ago, and may offer reusers of the content a little protection under certain unlikely conditions. Are there any controversies still outstanding at this time? Wnt (talk) 19:01, 8 October 2010 (UTC)Reply

I certainly think this is a reasonable draft to put before the community. - Jmabel ! talk 05:15, 9 October 2010 (UTC)
If we have a rough compromise on meaning, perhaps we should have a short period where we all concentrate on style and presentation. It still seems quite rough around the edges. But apart from that, I am in favour of putting this to the community soon. --99of9 (talk) 10:06, 9 October 2010 (UTC)Reply
Yes Wnt, please let's go with it. I don't think I have any major problems with it, and even if it is not perfect: all our policies evolve over time, and so will this one; the important thing is to make sure it gains policy status in the first place, and actually gets to be used by the community. --JN466 21:33, 11 October 2010 (UTC)Reply
I should note that at the moment the proposal describes itself as a policy in one place and a guideline in another. Even Commons:Photographs of identifiable people is actually marked as a guideline rather than policy. The last vote was whether to make it a policy, and the consensus seemed to be leaning in that direction last time, but I should check if there's still interest in promoting it to guideline rather than policy. Wnt (talk) 06:49, 17 October 2010 (UTC)Reply
It should be a policy in my view. --JN466 01:56, 18 October 2010 (UTC)

I just read over the last poll results, and one of the biggest concerns was that we were too deep in legal jargon. Have we reduced that enough? --99of9 (talk) 13:10, 16 October 2010 (UTC)Reply

Hmmm, actually, in my last edit I added to the jargon a little, by explicitly referencing the site disclaimer - but hopefully this reduces the fear of "getting too deep in lawyer territory" as one voter put it. I don't know if people will agree with me on that last edit... Wnt (talk) 06:49, 17 October 2010 (UTC)Reply
In the last poll, the page had a full 2/3 support. I was actually surprised the poll was closed as unsuccessful. --JN466 01:56, 18 October 2010 (UTC)Reply
I've been boycotting Wikimedia since the end of June waiting until Com:sex is approved. I spent WAY too much time and energy doing legal research and engaging in useless debate. Will somebody please email me when this is actually going to happen? I rarely log on anymore - I have more important things to do. - Stillwaterising (talk) 19:24, 19 October 2010 (UTC)

Scientific merit of pornography

I'm concerned that the mention of "scientific" in Commons:Sexual content#Obscenity law would tend to imply what is, essentially, a serious misrepresentation of the peer reviewed secondary medical literature.[21][22] These kinds of mistakes have led to, for example, minors accused of "sexting" being forced to appear on sex offender registries[23] with which actual adult sex offenders are very familiar. We all have a responsibility to avoid catastrophic unintended consequences. After those issues are addressed, I would ask Robert Harris if he felt there were any issues which had not been sufficiently addressed. I understand he would like some kind of metadata about controversial content categories which could allow for a javascript UI to collapse controversial content. I think it is far more important to resolve the scientific issues first, because of the unintended consequences we have seen from those who have been unable or unwilling to do so. 71.198.176.22 00:39, 18 October 2010 (UTC)Reply

What? The mention of "scientific" is a quote from the law, and I do not understand what you're going on about. Certainly, Commons has no connection to what happens to minors accused of sexting. --Prosfilaes (talk) 00:58, 18 October 2010 (UTC)
If we are going to set up a standard which specifically refers to scientific merit, I think we need to say exactly what that merit is in the most reliable sources we have. Otherwise we risk fueling moral panics. Teens are generally as able to upload illegal content as anyone else. If we leave in place the ability of someone supporting deletion because of "no scientific merit" without explaining how vacuous that argument may be, then we do a tremendous disservice to the wider community as well as ourselves. 71.198.176.22 01:17, 18 October 2010 (UTC)Reply
The courts have already set up a standard that specifically refers to scientific merit. It's union, not intersection; anything deleted would be without all of serious literary, artistic, political and scientific value and appealing to prurient interests, in which case we could be forced by law to delete it. Once again, I don't see why this has anything to do with moral panics. --Prosfilaes (talk) 01:36, 18 October 2010 (UTC)
The "scientific" quote is from the w:Miller test. Wikipedia can't alter these terms from this legal interpretation. What Wikipedia might (possibly) have control over are 'local community standards'. It's not clear whether an Internet obscenity case would involve the standards of a city, a state, the U.S. as a whole, "Internet users", or -just maybe- Wikipedia users.
In early drafts I sought to include an explicit statement that, because Wikimedia has a core policy opposing censorship, and because it has its own much more effective and less damaging mechanisms for evaluating and removing content the community didn't decide to tolerate, this policy should explicitly proclaim a community standard that rejects legal prosecution of people for posting obscene content. Very few people, if anyone, actually seemed to want Wikipedia editors to be subject to obscenity prosecution; but a majority worried that declaring such a standard would sound defiant and therefore would actually increase the risk that someone would face obscenity prosecution; or that it is overly confusing; or that the chance that courts would recognize Wikipedia as a local community capable of setting standards was just too remote to bother talking about. Wnt (talk) 07:11, 23 October 2010 (UTC)

Lead paragraph

I guess we should discuss this edit to the lead [24]. I agree yours is more faithful to the original lead, I wasn't just shortening, I was trying to make our case for a policy clearer. We have to say somewhere in the lead why sexual content deserves a different policy from the already existing Scope. In my opinion there are 3 categories of reasons that sexual content is different to say photos of buildings or celebrities, and each of them has influenced our choices of actual policy:

  1. legal rules are different (for both the uploader and the host), and we want to state/summarize them explicitly to anyone considering uploading
  2. ethically we have decided to strengthen our consent verification for the protection of non-consenting subjects, beyond what is legally required (What other term would you like to use for this? Certainly for me it was about weighing the ethics against the cost to the repository size)
  3. sensitivities are heightened in the general populace about this issue, hence decisions like no-surprise categorization.

I'm happy to work with you on how best to summarize, but within that we need to set up for our own conclusions! So I don't think we should be satisfied until we have a version of the lead that states why sexual content is in some sense distinct enough to require its own policy. --99of9 (talk) 09:09, 18 October 2010 (UTC)

I don't think we need to make any such case, because by and large we are trying to stay consistent with other policies. We're adhering to existing policy regarding genuinely illegal material, photographs of identifiable persons, and project scope. We clearly don't have much of a mandate to change the status quo of sexual material — the point of the policy is to disprove claims that Wikimedia allowed, e.g., child pornography (which was never true), and to bring together all the relevant policies in one place so that people wouldn't miss them.
I don't even want to use the words "ethics" and "sensitivity" because they mean a lot of different things to different people, and quite a few of those interpretations run counter to the inclusive mission I hope we seek. It's easier to say that photos of sex organs are relevant and informative than to argue that they are ethical to a group of people with a wide range of ethics. And there are very many images (the Abu Ghraib images still linger in mind) which hardly anyone could deny are "sensitive". Wnt (talk) 06:12, 19 October 2010 (UTC)
Rather than 'ethics', 'social impact on the subject' seems relevant here. --SJ+ 05:30, 20 October 2010 (UTC)
Ok, I agree with your change to the word ethics. What kind of mandate are you talking about, Wnt? I assume that the mandate comes when the policy is voted upon. We are a bunch of editors who've thought hard about this and made a proposal with plenty of clarification and some tightening (consent, categorization). --99of9 (talk) 12:55, 28 October 2010 (UTC)
What I meant by that is just that I don't think most voters will be looking for any great change here, just a clearer and more carefully considered policy that reduces the need to argue about deletions from first principles. Now that said, there is some change here - there is a consensus to set up some mechanism, perhaps a pull-down menu, to persuade uploaders to assert consent for future uploads. I spent some time opposing that, but was in a minority; more to the point, the Tyler Clementi case has changed how such things are likely to be viewed in the next few years, and its attendant prosecutions illustrate that Wikimedia participants may be safer if there is something they can point to as "proof" (however tenuous) that they believed they were handling material uploaded with consent. Still, I think the appetite for change is at least very limited. Wnt (talk) 21:58, 28 October 2010 (UTC)

I still think the sentence "The purpose of this policy is to define how Wikimedia Commons should exclude materials outside Commons' project scope, while retaining what is potentially useful for educational purposes." is silly for this page. That sentence describes the purpose of another page, COM:SCOPE. If the two had identical purposes, having two policies would be silly. --99of9 (talk) 13:02, 28 October 2010 (UTC)

I'm not really fond of the sentence myself. Someone added the purpose section, and it seemed to have some support at the time, so I tried to reduce it into the lead for brevity rather than eliminating it. If you want to change "The purpose of this policy... addresses" to "This policy addresses" I shan't complain. Wnt (talk) 21:58, 28 October 2010 (UTC)

"Obviously outside" v. "Not obviously outside"

The section under Obviously outside of scope is a bit confusing. At present it reads:

...Scope policy is very general and sexual content is frequently deleted under this policy; however, for certain categories of sexual content, any scope concerns should be discussed in a deletion discussion and the media should not be speedy deleted. These include the following:

The last half of this paragraph is actually about content that is not obviously outside of scope. It would be clearer to state the whole thing positively: move the phrase "These include the following:" and the following list to a new subheading under Normal deletions. --SJ+ 05:30, 20 October 2010 (UTC)

Hmmm, I tried to fix this once before and it got reverted for some reason. I just made another attempt. I'm trying not to change the logic as written, but let me know if I've failed. I'm willing to put up with the original, despite its convoluted structure, if this turns out to be contentious. Wnt (talk) 02:26, 21 October 2010 (UTC)
This is definitely clearer. 07:24, 23 October 2010 (UTC)

[PREVIEW] Second poll for promotion to policy (October 2010)

This is a preview of the poll text. I'd like informal comment from people following this page on whether we're ready to proceed with the poll, and whether any additional steps should be taken to ensure we get a conclusive result. If there are no further objections, this paragraph will be removed and the poll should go live, provided an admin then agrees to put up the site notice. Wnt (talk) 07:47, 23 October 2010 (UTC)

Proposed poll text:

This is a poll to adopt Commons:Sexual content as a Wikimedia Commons policy.

Please give your opinion below with {{Support}}, {{Oppose}}, or {{Neutral}}, with a brief explanation.

The poll concerns whether people accept or reject the November 26 revision. Edits made to the policy during the polling period may be temporarily reverted, and will need to be adopted by consensus, like changes to the other established policies.

Voting on the poll will proceed for ten days, beginning from the time that MediaWiki:Sitenotice is altered by an administrator to advertise this poll.

A previous poll was closed as "no consensus" at #Poll for promotion to policy above. Wnt (talk) 07:47, 23 October 2010 (UTC)

This process has taken a long time, more than six months since I reintroduced the original proposal by Privatemusings. I think the voting should run a little longer than 7 days, either 10 or 14 days. - Stillwaterising (talk) 00:47, 28 October 2010 (UTC)
I have no objection to 10 days provided an admin is willing to keep up the site notice for that long. I've changed the text above accordingly. If anyone disagrees with this, let us know. Wnt (talk) 01:19, 28 October 2010 (UTC)
Note: Because there was a lot of good copy editing, I've changed "October 23 revision" to "October 28 revision" and the associated link above. These revisions improve readability; one did make a substantive change in the speedy deletions for the out-of-scope section, but I don't think it actually changes how the policy would be used. Wnt (talk) 22:04, 28 October 2010 (UTC)
I think it'd be good to keep working on copy-editing. The content seems reasonably stable, but it is very long and quite wordy. I think effective paring could cut it down by about 30% in length without changing the meaning a bit. A policy is only as good as it is clear and readable. Feedback on recent edits welcome. Ocaasi (talk) 03:47, 2 November 2010 (UTC)
I support a longer time for the poll to run... 10 or even 14 days is far better than 7. I also support waiting a bit longer for further copyediting. I note that there was a big batch of it by Ocaasi (thanks!) on 2 Nov, and I agree that trying to trim wording without changing meaning would be beneficial. I plan to support this but would like to support the best, most clearly written, and most concise version possible. Thanks to all for your efforts. ++Lar: t/c 11:41, 4 November 2010 (UTC)
Is there any plan to move forward on this? I imagine the Foundation will be less likely to swoop in on Commons if the community can show that it is effectively addressing these issues on its own. Kaldari (talk) 23:56, 24 November 2010 (UTC)
I've now updated this preview to mention the November 26 revision. The poll may soon be proposed near the bottom of this page. Wnt (talk) 19:12, 29 November 2010 (UTC)
No objections yet and several supporters. Seems like a good time. Ocaasi (talk) 10:07, 2 December 2010 (UTC)