Commons:Village pump/Proposals/Archive/2012/12

Pictures of identifiable minors

Over at en.wp ianmacm made the following suggestion: "How about changing the rules on Commons so that any identifiable photograph of a child under 18 (...) would be speedily deleted unless e-mail permission was available?" Since it was suggested to "officially" make that suggestion here, I'm hereby doing so. The reasons for such a new rule should be, hopefully, blatantly obvious to everyone. Personally, I'd prefer OTRS permission, but the details are up for discussion, of course. Getting a rough consensus that this is a good idea would be a good first step. --Conti| 12:00, 28 November 2012 (UTC)

Regarding "identifiable photograph of a child under 18" - that's way too broad. Any picture taken with the 36 MP Nikon D800 of a kid-friendly tourist attraction would have to be deleted, because children the size of ants at preview size suddenly become identifiable at 100%. It must be clarified how prominent the child needs to be in the picture. -- King of 12:03, 28 November 2012 (UTC)
Yes. Also that the child needs to be a living person and the image needs to be previously unpublished. Probably some other stuff too. FormerIP (talk) 12:05, 28 November 2012 (UTC)
Whether the picture is previously unpublished is not a consideration in my view. To give an example, if a fetish account on Flickr uploads an image of a minor under a CC licence, we should not take that as licence to host the image here without parental consent: especially as such accounts are often blocked on Flickr a relatively short time after the upload there. In other words, children's personality rights do not end just because someone uploaded an image on Flickr. (If you meant something else by "previously unpublished", then accept my apologies.) Andreas JN466 12:26, 28 November 2012 (UTC)
True. We could exclude group shots or photos where the minor is obviously not the focus of the picture. Another exclusion rule that comes to mind are celebrities that are still minors. --Conti| 12:08, 28 November 2012 (UTC)
I would not exclude group shots if these groups are primarily composed of children. Even our local school asks parents for permission before using such images on their school website, let alone releasing them under a CC licence. Celebrities who are minors should be excluded if the pictures are taken in a public place, but generally, any picture of a minor that is taken in a setting where people would have a reasonable expectation of privacy should require parental OTRS consent. This includes pictures of kids taken at their homes, at school, and at other non-public events. Pictures of minors bathing, whether nude or otherwise, should likewise only be hosted with parental consent; I believe that's the right thing to do. Andreas JN466 12:21, 28 November 2012 (UTC)
What this proposal is aiming for is not to have pictures like File:S7301533 (6024774145).jpg without proper written permissions. The reason why, again, should be blatantly obvious. --Conti| 12:11, 28 November 2012 (UTC)
Are you suggesting it for any photo of a child under 18, or just ones that are potentially compromising, e.g. partial nudity, obesity, etc.? -- King of 12:16, 28 November 2012 (UTC)
Potentially compromising would be a good start, but I don't see why we wouldn't need permission for any photo. We can't just assume that anyone who takes pictures and puts them online under a free license has permission from the subject's parents, right? If people have a problem with that, though, we can focus on potentially compromising. I'd rather have a consensus for that than no consensus at all. --Conti| 12:22, 28 November 2012 (UTC)
I was aware that this proposal could be picked apart endlessly. However, since Flickr and similar sites may contain family photographs that were never intended for public distribution, a common-sense approach is needed before allocating a CC license on a permanent basis. Had I been a Commons admin, the images that set off the controversy on Jimbo's talk page would have been speedily deleted. There is no evidence that the person who transferred these images to Commons is either the child's parent or the person who took the photographs. Since some of these photos seem to have been taken on private property (e.g. a swimming pool changing room), they are pretty indefensible for a CC license on Commons.--Ianmacm (talk) 12:29, 28 November 2012 (UTC)
If I might ask a rather dumb question, why does the standard policy of COM:PEOPLE not apply to children? Why is a child in public more entitled to privacy than an adult in public? -mattbuck (Talk) 12:30, 28 November 2012 (UTC)
You might as well ask why a child is more entitled to protection than an adult. I'll leave the answer to that up to you. --Conti| 12:33, 28 November 2012 (UTC)
Agree with Mattbuck. At first I thought the proposal made sense, but I've reviewed COM:PEOPLE - if it's a picture in a public place, it should stay. --Cyclopia (talk) 12:35, 28 November 2012 (UTC)
I don't think anyone here is concerned with street scenes that may contain minors, but with pictures taken in private contexts and contexts where there is a reasonable expectation of privacy (an area where COM:PEOPLE is regularly broken in Commons with respect to adults as well). Where images are sexually suggestive, minors are obviously entitled to more privacy than adults, based on child protection laws. Andreas JN466 12:37, 28 November 2012 (UTC)
I may be missing something, but the images complained about do not seem to have been taken in a public place (?) FormerIP (talk) 12:38, 28 November 2012 (UTC)
See Commons:PEOPLE#What_are_.27public.27_and_.27private.27_places.3F. If you are on a school playground, or at a private swimming pool, or even at a camping trip in nature, you have a reasonable expectation of privacy. Andreas JN466 12:41, 28 November 2012 (UTC)
I'm not sure whether you are agreeing or disagreeing, Jayen. But it seems to me that, unless it is possible to buy spectator tickets to Scout camp, these are private photos taken in a private place. FormerIP (talk) 12:42, 28 November 2012 (UTC)
See for example the Editors' Code of Practice for the Press Complaints Commission in the UK. Children should have stronger protection than adults in this area.--Ianmacm (talk) 12:45, 28 November 2012 (UTC)
Sorry for the confusion. I am indeed agreeing with you, FormerIP. Andreas JN466 12:47, 28 November 2012 (UTC)
So that means two things: 1) per existing Commons policy and absent consent, these images are prohibited and should be deleted (if there have already been discussions closed as keep, those should be reviewed; 2) Commons may or may not need stronger rules in this area, but these images are not a good example of why. FormerIP (talk) 12:52, 28 November 2012 (UTC)
Given the fact that there are very active commons editors who vehemently argue to keep some of those images, I'd say they are indeed a very good example of why we need stronger rules in this area. (Or have the existing rules applied more consistently, I suppose). --Conti| 12:54, 28 November 2012 (UTC)
Well, if it's either, it must be the latter, surely? FormerIP (talk) 12:55, 28 November 2012 (UTC)
I certainly wouldn't be opposed to it. I just don't know how to achieve that, other than making the rules stricter. I'm always happy for other suggestions, though. --Conti| 12:57, 28 November 2012 (UTC)
Just like the PCC editors' code of practice has a special section detailing child-related concerns, so should COM:People. In the case of children, I believe we should promote speedy deletion if there is no credible parental consent – and in this context, these images are good examples. Andreas JN466 13:15, 28 November 2012 (UTC)
Well, there's a wider issue that Commons:Criteria for speedy deletion is just sitting there in limbo. That would take care of these images being speedy deleted, but I don't see that it's a question that relates specifically to children (or are you saying that photos of adults without permission should not be speedy deleted?). And I think there are ITN reasons not to worry too much about what the PCC does. FormerIP (talk) 13:35, 28 November 2012 (UTC)
It's not just the PCC, is it? This stance is universal. Also see Commons:Administrators'_noticeboard#Still_more_examples for an example of where COM:People is broken here with regard to adults. The uploader is a red link, the images are clearly taken in a private context, and there is clearly no OTRS consent from the subjects. Yet Commons admins have edited and categorised these files, rather than deleting them. Andreas JN466 13:41, 28 November 2012 (UTC)
OK, so that's about practice/enforcement (?). FormerIP (talk) 13:42, 28 November 2012 (UTC)
A lack of enforcement is the baseline we are starting from. Given that deletion discussions can take weeks or even months to close in Commons, and images of children are generally considered a more pressing problem, I believe a policy paragraph in COM:People ensuring that these can be speedily deleted without all of us having to jump through "voting delete" hoops as we both have done today is the foremost concern. But it is unquestionably true that Commons' performance with respect to pictures of adults taken in private contexts and hosted without their consent needs to improve as well. Andreas JN466 13:48, 28 November 2012 (UTC)
But don't you think a line saying "images without permission can be speedied if they are of children" might tend to enforce a position that, by implication, other images without permission will not be speedied?
If you don't see that as a problem, then I make it proposal o'clock. FormerIP (talk) 14:05, 28 November 2012 (UTC)
I don't see the implication. Just because images that meet criteria X can be speedily deleted does not mean that images that meet criteria Y can't be speedily deleted. --Conti| 14:12, 28 November 2012 (UTC)
How about a wording like this? "Images taken in non-public contexts that lack credible subject consent for hosting on Wikimedia Commons can be speedily deleted. This applies in particular to images of minors that were taken in non-public contexts and lack credible evidence of parental consent for hosting on Wikimedia Commons." Andreas JN466 14:19, 28 November 2012 (UTC)
That sounds good to me. Minor tweak, though: instead of "contexts" substitute "locations". A state school is (usually) a private location but it exists, in a sense, in a public context. Objections based on that would be stupid, obviously, but why risk the confusion? FormerIP (talk) 14:40, 28 November 2012 (UTC)
It doesn't seem good to me. Why "non-public", when "private" is simpler and easier to understand? Why "contexts" or "locations" when COM:PEOPLE talks about "private places"? When is it reasonable to assume consent? When is it unreasonable to require it? Taken literally, I think this suggestion would mean we have to delete all our copies of the Mona Lisa! --Avenue (talk) 15:00, 28 November 2012 (UTC)
Previously unpublished photographs of identifiable living people not taken in a public place and lacking verified subject consent for hosting should be speedily deleted. Particularly serious consideration should be given to speedy deletion in the case of images of minors. FormerIP (talk) 15:07, 28 November 2012 (UTC)
That's a lot better, but I think there are still some problems. Do we want to rule out speedily deleting uploads from Flickr? ("Previously unpublished" seems to.) The description of consent is very different from those produced by our {{Consent}} template. Is there good reason for this? If not, I'd think we should follow the existing wording. There are also grey areas between public and private places, which this doesn't allow for. Something like "clearly taken in a private place" would be better IMO. More generally, the whole phrasing should be more consistent with the existing speedy deletion criteria for files. Just describe the files that can be speedied; saying they "should be speedily deleted" will be redundant. --Avenue (talk) 15:41, 28 November 2012 (UTC)
I'm meaning "unpublished" to mean not published by an outlet with editorial oversight. I think most people will get that but, if not, I'm not sure what concise wording to use. I'm not a commons regular so I don't know about the template, but it seems to allow users to simply assert that their images are kosher, which may not be what we are after here. "clearly taken in a private place", IMO, would put the burden of proof the wrong way. Cases of doubt should surely be resolved conservatively?
Before spending too much more time discussing it, is anyone actually intending to make this proposal? FormerIP (talk) 16:08, 28 November 2012 (UTC)
Personally I like the idea of allowing speedy deletion of clear-cut COM:PEOPLE violations. This could be done by adding a new criterion, or by expanding an existing one (e.g. File criterion #5). I need to think more about which approach would be best. Speedy deletion criteria need broad consensus; I think an expansive view of the burden of proof is unlikely to pass muster, both on the public/private issue and establishing consent. I do see that some people are keen to require more formal proof of consent, but it's not just the template that says asserting consent is usually enough. It is also the WMF's position. And "reputably published" might work. --Avenue (talk) 17:04, 28 November 2012 (UTC)
I may not be appreciating how things work here, but I was under the impression that the speedy deletion criteria were a stale proposal and not actual policy. If they are supposed to be followed, then there is not much of a policy issue, except making that clearer. In fact, it seems to me that the Scouting files should have been speedied under criterion 5 as lacking a necessary permission. The issue in that case is not the policy, but the fact that it wasn't followed. FormerIP (talk) 17:13, 28 November 2012 (UTC)
Hmmm, foolish me. You're right, they're only a proposal. I still like the idea, but I guess it's probably better to put it into COM:PEOPLE at this stage. --Avenue (talk) 01:20, 29 November 2012 (UTC)
Oppose. COM:IDENT is already a restrictive policy, developed over some years. The mere fact that our High Inquisitor found someone breaking it is no reason to write a whole new policy that would ban any attempt to photograph a crowd outside the Vatican. Wnt (talk) 16:27, 28 November 2012 (UTC)
It would be nice if you would read the constructive discussion above and not turn this into a vote. None of what you say applies to the actual proposal, the crowds outside of the Vatican will be safe. --Conti| 16:35, 28 November 2012 (UTC)
That is what the blue text at the top of this discussion logically means. I think that Russavia's photo below is an example of what should be allowed based solely on it being a public street scene. The role of the policy is to keep Commons out of ethical trouble, not to make sure it doesn't interfere with the market for commercial stock photos. Wnt (talk) 20:45, 28 November 2012 (UTC)

Exclusions

If we are going to change Wikimedia Commons into a space that has special, beyond-legal and beyond-human-rights policies for photographs of minors (minority commonly being taken as under 18 years old in the United States), could we include some early practical criteria and implementation suggestions that can be part of an RFC? My guess would be that around 500,000 photographs on Commons would need to be evaluated and might be speedily deleted as a result of this suggested change in policy. With some agreed criteria, we can size and estimate the work involved. I would hope this would be planned along with a massive campaign to attempt to contact the original photographers to get sufficient information for valid un-deletions, keeping OTRS and our volunteers busy throughout next year and (I suggest) likely to require paid staff support under a Wikimedia project grant.

Practical basic exclusion criteria might help, such as:

  • photographs including minors taken before 1874 are exempt
  • photographs with non-portrait shots of minors with no identifiable characteristics (so a minor's arm with an unusual birth mark might be deleted, but a minor's arm without identifiable marks might not be) are exempt
  • photographs in a public place covered by FoP where the image is not in violation of other aspects of Photographs of identifiable people are exempt
  • photographs verified by OTRS ticket without legal issues are exempt
  • photographs published or released elsewhere and validly imported to Commons on a free license by an institution or established photographer with a recognized review process (such as The National Archives) are exempt

Thanks -- (talk) 12:43, 28 November 2012 (UTC)

"verified by OTRS" - only if the OTRS ticket contains permission from the depicted person or a parent, not if the ticket is only for copyright.
"public place covered by FoP" - ahem, FoP is specific to copyright issues, nothing else. In addition, the "public place" doctrine is (probably) from US law and likely not fully transferable to personality issues in every other country. --Túrelio (talk) 13:17, 28 November 2012 (UTC)
Thanks Túrelio, I'm not sure I understand your first point. Were you saying that OTRS agents do not currently accept copyright releases from minors? I am unclear how OTRS agents are expected to legally validate claims of age. As an OTRS agent I did validate claims by doing some background checks but I did not systematically check the age of each correspondent. -- (talk) 13:29, 28 November 2012 (UTC)
 
I, Russavia, uploaded this image from Flickr, and it presents ZERO problems
Not being an OTRS volunteer myself, I have the impression that "OTRS permissions" are mostly about copyright questions. So, the term "verified by OTRS" was a bit too generic for me. It should specify what has to be verified. --Túrelio (talk) 13:59, 28 November 2012 (UTC)
Okay, thanks. I agree, these suggestions are a long way from being anything more than conceptual and generic. Plenty of detail would need to be drilled through before we understand what action is needed to implement anything here. -- (talk) 14:21, 28 November 2012 (UTC)
I would hope that we are in agreement that Flickr will never be covered by that last point. Delicious carbuncle (talk) 16:11, 28 November 2012 (UTC)
No we are not in agreement. Refer to the photo at the right for why. russavia (talk) 17:16, 28 November 2012 (UTC)
That photo is unproblematic because it appears to have been taken in a public location, not because flickr is always unproblematic. FormerIP (talk) 17:21, 28 November 2012 (UTC)
I'm sorry, Russavia, but you will have to use words instead of images if you are trying to say something. Lacking some kind of positive confirmation that the guardians of these children gave consent, I do not believe that they should be permitted on Commons. This is kind of the point of the whole discussion. Delicious carbuncle (talk) 17:27, 28 November 2012 (UTC)
A picture says a thousand words DC. russavia (talk) 17:31, 28 November 2012 (UTC)
This one just says that someone's missing the point. FormerIP (talk) 18:08, 28 November 2012 (UTC)

  Comment It's concerning that people here appear not to know a basic part of a key Commons policy [well, a guideline if we're being precise], apparently in long-term overreliance on what US law says (i.e. publication doesn't need subject consent when the photo is taken in a public place). The table below from COM:IDENT (collapsed so as not to overwhelm the discussion) is overwhelmingly "Yes" or "Yes (with exceptions)" in the second column. That's "Yes" to "needs subject consent?" for photographs taken in a public place. I say again for emphasis: for photographs taken in a PUBLIC place. (The exceptions are typically de minimis type things.)

Commons:IDENT#Country-specific_consent_requirements says...
Consent required for action related to a picture of a person in a public place (by country)
Country/Territory | Take a picture | Publish¹ a picture | Commercially² use a published picture
Afghanistan | No | Yes (with exceptions) | Yes (with exceptions)
Argentina | No | Yes (with exceptions) | Yes (with exceptions)
Australia | No (with exceptions) | No (with exceptions) | Yes
Austria | No (with exceptions) | No (with exceptions) | Yes
Belgium | No | Yes (with exceptions) | Yes
Brazil | Yes | Yes | Yes
Bulgaria | No | No | Yes
Canada | Depends on province | Yes (with exceptions) | Yes
China, People's Republic of | No | No | Yes
China, Republic of | No | No (with exceptions) | Yes
Czech Republic | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Denmark | No | Yes (with exceptions) | Yes (with exceptions)
Ethiopia | No | Yes (with exceptions) | Yes
Finland | No | Yes (with exceptions) | Yes (with exceptions)
France | Yes (with exceptions) | Yes (with exceptions)[1] | Yes
Germany | No (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Greece | No | No | Yes (with exceptions)
Hong Kong SAR | Depends on circumstances | Depends on circumstances | Depends on circumstances
Hungary | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Iceland | No | No (with exceptions) | Yes
India | No | No (with exceptions) | Yes (with exceptions)
Indonesia | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Iran | No (with exceptions) | No (with exceptions) | No (with exceptions)
Ireland | No (with exceptions) | No (with exceptions) | No (with exceptions)
Israel | No | No (with exceptions) | Yes
Italy | No | Yes (with exceptions)[2][3][4] | Yes[5]
Japan | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Libya | No | Yes (with exceptions) | Yes
Macau SAR | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Mexico | No | Yes | Yes
Netherlands | No | No (with exceptions) | No (with exceptions)
New Zealand | No | No | Yes
Norway | No | Yes (with exceptions) | Yes (with exceptions)
Peru | No | Yes (with exceptions) | Yes (with exceptions)
Philippines | No | Yes (with exceptions) | Yes
Poland | No | Yes (with exceptions) | Yes
Portugal | No (with exceptions) | Yes (with exceptions) | Yes
Romania | No | Yes (with exceptions) | Yes (with exceptions)
Russian Federation | No | Yes (with exceptions) | Yes (with exceptions)
Singapore | No (with exceptions) | No (with exceptions) | No (with exceptions)
Slovakia | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Slovenia | No | No | Yes
South Africa | No | No | Yes
South Korea | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
Spain | Yes | Yes | Yes
Sweden | No | No | Yes
Switzerland | Yes | Yes | Yes
Turkey | Yes (with exceptions) | Yes (with exceptions) | Yes (with exceptions)
United Kingdom | No (with exceptions) | No (with exceptions) | Yes
United States | No | No | Usually (although laws differ by state)
¹ In this context of consent requirements, "publish" refers to "making public" and is separate from the term "publish" as may be defined elsewhere (e.g. U.S./U.K. copyright law).

² In this context of consent requirements, "commercial use" is separate from, and not in reference to, licensing conditions that may prohibit commercial use (non-commercial licenses). Often commercial use in this context is contrasted with "editorial use", with the former referring to advertising and marketing purposes and the latter referring to news reporting and education, even if made with a profit motive.

Now the table may be wrong, at least in part. As part of all this debate, a review of it wouldn't hurt. But as of now, this is the table in COM:IDENT. Rd232 (talk) 23:25, 28 November 2012 (UTC)

Thanks for the reminder Rd232. I think it probably does need a bit of polishing. For example, the United Kingdom seems to have been overlooked, which is very odd, though maybe my country is better on this form of free speech than the rest ;-) The exceptions explained by country are useful, and could do with some case studies to illustrate them. We could usefully tease out interpretation, such as when "picture of a person" may be interpreted as a person in a group of people, whether this only applies if the person is the 'focus' of the photograph even when in a crowd, or when the group is large enough to no longer require permission from everyone in the photo. In relation to the current fracas, a number of these sections mention special rules for minors, and one step forward might be to promote this information up to the main table to ensure it is not overlooked by being squirrelled away in footnotes. -- (talk) 23:48, 28 November 2012 (UTC)
I don't think the UK has been overlooked, Fae, it just doesn't have any consent requirements. Just for clarification, though, the table may affect you if you live in one of the countries listed, but it doesn't affect what Commons policy is (?). FormerIP (talk) 23:54, 28 November 2012 (UTC)
Ah good old UK, land of the free (and subject of the Crown). That means I can take lots of photos of people in the park, on the bus, in their gardens (from the pavement), topless or not, without anyone later making accusations that I must be some sort of criminal, terrorist or pervert because I did not get their consent in writing. Time to try out the Flickr upload bot for all profiles based in the UK... Now, I wonder how many benefits there would be in hosting a server here in the UK? -- (talk) 00:13, 29 November 2012 (UTC)
Don't get too hopeful, because the Leveson Report comes out tomorrow. FormerIP (talk) 00:44, 29 November 2012 (UTC)
Yes, above the table it says that it's "a list of countries where consent is needed for one or more of the mentioned situations", so we shouldn't read anything into the absence of countries where consent isn't needed. External links covering some of these countries are listed in Commons:Photographs of identifiable people#Photography in public places. The effect of the table on Commons would come from reading it together with Commons:Photographs of identifiable people#Legality. --Avenue (talk) 01:45, 29 November 2012 (UTC)
Regardless of what the law is in any particular jurisdiction, the prime question is whether Commons wants to be known as the new 'Creepshot' site or not. Just because it might be lawful does not mean that Commons should host it, and catalog it. Going back to Flickr, there are many photos that, whilst taken in the US and legal there, Flickr will not host, an example being stalkerish candid photos of people in the street. John lilburne (talk) 08:01, 29 November 2012 (UTC)
AFAICT, "Creepshot"-type images would be deleted under COM:MORAL - it wouldn't be a question of consent (although, I guess, the existence of consent would likely be a consideration). FormerIP (talk) 22:43, 29 November 2012 (UTC)
Just do a search for upskirt, and you'll find images like http://commons.wikimedia.org/wiki/File:Upskirt_woman.jpg -- or have a look at Category:Upskirt and ask yourself if all the women shown there are likely to be aware of and have consented to the Commons upload. I see no evidence that admins are patrolling and deleting anything per COM:MORAL. I believe policy should allow and encourage them to do so. Andreas JN466 10:28, 30 November 2012 (UTC)
The image you point to is gone (do you know how long it had been there?), and I can't see anything in the upskirt (for Christ's sake? OK, I'll agree with you on that...) category that appears to have been taken without the subject's consent. FormerIP (talk) 21:00, 30 November 2012 (UTC)

Community consequences

I believe if this is created as an RFC, as suggested, it would have to specifically address the following consequential changes to Commons and propose these areas as changes as part of the RFC:

  • Site Administrators would have to be verified as over 18 years old, as deletion and undeletion actions and access are not restricted by age. Existing Admins under 18 would have their status removed.
  • The Upload wizard and all other upload tools would have to include an explanation of age related policies and uploaders would probably have to declare if they were over 18 in order to accept any license release and automatically reject their upload if not. Uploading may well have to become an age verified process itself.
  • Images of minors may have policy based limitations on usage to avoid demeaning use in other projects. It is unclear how this would be patrolled or enforced.

Thanks -- (talk) 13:25, 28 November 2012 (UTC)

I don't see what any of that has to do with the initial suggestion. --Conti| 13:28, 28 November 2012 (UTC)
It would be very odd indeed to have a programme of vetting and speedily deleting photographs of minors (including photographs that cannot be accepted because there is no verified legal evidence on file of the age of self-releasers, or that the photographer is a legal guardian or parent) when that programme itself would be expected to be run by minors. -- (talk) 13:32, 28 November 2012 (UTC)
Not any more odd than minors being able to delete (and see already deleted) pornographic images. :) --Conti| 13:35, 28 November 2012 (UTC)
I agree that deletions of pornographic images should not be handled by minors, and that it is inappropriate for admins who are in their early teens to have (as they currently do) access to the entire corpus of deleted pornographic images ever uploaded to Commons, including those considered too extreme to be kept in the public's view. But this is a more general governance issue, and it does not make sense to mix the two issues here. As a separate proposal, I would support the principle that Commons admins should be at least 18. Andreas JN466 14:02, 28 November 2012 (UTC)
So long as we remain committed to not deleting educational material due to its sexually explicit nature, there is no hidden hoard of "pornography too extreme for Commons" to bar minors from viewing. Wnt (talk) 20:57, 28 November 2012 (UTC)
I believe the bestiality video deleted a few months ago for example is still on the server, for admins' viewing pleasure. So will be the simulated picture of a naked woman in her bathtub having her throat cut with a knife, not to mention hundreds of amateur porn images deleted over the past few years. Andreas JN466 03:21, 29 November 2012 (UTC)
The second link doesn't seem to be a deleted image. I opposed deletion of the first - the rationale given for that by the closing admin, that it was illegal in Florida, would apply to both adults and children, and to "deleted" images still available to a small audience. The fact that Commons hasn't been raided is pretty good evidence that this is a bogus rationale, as it is historical material of artistic, educational, and political significance to encyclopedic coverage of the original films and their modern re-release, and firmly within the realm protected by the Miller Test so far as a non-lawyer can see. Wnt (talk) 17:11, 30 November 2012 (UTC)
CC'd images of people are problematic as even the photographer loses control over how the image is used. Some people commenting here know that to be true as they have removed images of themselves from flickr when they were used for mocking purposes. Whilst not limited to children, images of kids are particularly problematic, and it would be preferable that the guardians signed off on having their child feature in a CC licensed image. A snatched photo of a child walking in the street and tagged obesity is not nice, nor helpful even if the face is blurred. Neither are photos of kids on the beach or at a swim meet where someone has snapped a cameltoe, or where the point of the photo is the fact that their feet or chest is bare. There is a whole world of exploitation of photos of kids which WMF projects should not, even innocently, be participating in or enabling. John lilburne (talk) 13:56, 28 November 2012 (UTC)
The upload wizard should provide whatever information is required to ensure that uploads meet our scope and other policies (i.e., COM:IDENT). If this is not already the case, it should be changed as soon as reasonably possible. All upload tools need to have similar information available. In the case of the Flickr upload tool, it is likely to be used by more experienced users, but it would not be prohibitive to have for example, checkboxes that the user could check to confirm their belief that all images uploaded are (a) within scope and (b) meet COM:IDENT restrictions. Delicious carbuncle (talk) 16:08, 28 November 2012 (UTC)
There is no need for any assertion that the photo is in scope. And a check box for "meets COM:IDENT" demands too much information processing. It would not be overly unreasonable, and more dispositive, to have a check box to say whether it is known to you as the uploader that the subject has consented to public distribution of the photograph, and perhaps a field to say how you know (release, personal communication, prior publication, etc.) Checking "no" might yield a warning digest from COM:IDENT, though we would have to avoid Wikimedia giving any actual legal advice. Wnt (talk) 20:53, 28 November 2012 (UTC)
Let's be clear about the purpose of "verified" consent. Surely it is so that the folks at Wikipediocracy can get their hot little hands on a leaked list of all the email addresses and names of people who are subjects of something sexually explicit, not merely the uploaders, so that they can out them at great length on their site until some unknown scallywags spam their employers and families with copies of the photos for great victory. Wnt (talk) 21:05, 28 November 2012 (UTC)
The verification would ideally be handled by OTRS. So, no. --Conti| 21:09, 28 November 2012 (UTC)
Wnt, it is my impression that Wikipediocracy generally leans toward protecting the privacy of people who end up on Wikipedia or Commons. Perhaps you are confusing it with Encyclopedia Dramatica, where you are a contributor? Delicious carbuncle (talk) 21:38, 28 November 2012 (UTC)
I can pee in the same pool as a Republican without being a Republican; nonetheless the pool will still be one which has been peed in by Republicans. Wnt (talk) 17:14, 30 November 2012 (UTC)

I'm not sure that there's an urgent need for broad sweeping policy changes, but I would hope that at least File:Two_little_girls.jpg could be deleted (which was not possible for some reason in 2007). AnonMoos (talk) 11:28, 29 November 2012 (UTC)

I've started a deletion request. --Avenue (talk) 12:58, 29 November 2012 (UTC)

Further to the above discussions, in particular the one between FormerIP and Avenue, please note this edit.

Thanks. Andreas JN466 18:59, 29 November 2012 (UTC)

I reverted your change to Commons:Photographs of identifiable people. Speedy deletions because of missing subject consent are not in line with any Commons deletion policy. Not with Commons:Deletion policy, not with the proposed Commons:Criteria for speedy deletion. --Rosenzweig τ 19:14, 29 November 2012 (UTC)
That's what the discussion above was for, wasn't it? If you have objections to the change, feel free to raise them, but so far there seems to be a consensus for it. --Conti| 19:30, 29 November 2012 (UTC)
No, such major changes need to be proposed and discussed more carefully and over a longer period (and be announced more widely). So far much of the discussion has been quite messy, and with not that many people involved. We should start drafting some clear proposals, yes, but not declaring consensus on any. Rd232 (talk) 19:40, 29 November 2012 (UTC)
I agree completely with Rd232. Much more drafting work is needed before we can take something to a wider audience. --Avenue (talk) 20:10, 29 November 2012 (UTC)

Here's a new version: Photographs and videos of identifiable living people clearly taken in a private place with no claim of consent for publication can be speedily deleted, unless previously published by a reputable organisation. Such content may alternatively be tagged with {{consent|query}} and given a grace period of seven days before being deleted, except for images of minors. Any thoughts? --Avenue (talk) 21:51, 29 November 2012 (UTC)

Sounds good to me. --Conti| 22:28, 29 November 2012 (UTC)
Me too. Support. Andreas JN466 10:57, 30 November 2012 (UTC)
a) What about files transferred from Flickr or other platforms, should this be valid for those too?
b) What about files that show models (not photographed in public) that posed for the images?
c) What about files that show celebrities/notable people? Not taken in any compromising situations or something like that, but for example posing with or for fans (again, not photographed in public)?
d) What about files that are not newly uploaded, but are already here on Commons, some of them for years? Just how many files are we talking about here? Do you envision that anybody can just put them up for speedy deletion, any admin can speedily delete them? If yes, how do you propose to keep this manageable? --Rosenzweig τ 22:29, 29 November 2012 (UTC)
a) Yes. b) Good point. Also private events to which photographers were invited (e.g. press conference). c) If they're in a private place, I don't see why this should be an exemption. d) We could have a grandfather clause for some categories of image. FormerIP (talk) 22:51, 29 November 2012 (UTC)
As for b) and c), if they're obviously models or pose for the photo, then they are most likely not taken at a private place as defined by COM:IDENT, such as a party or a public event. As for d), I'd support a grandfather clause. People are free to nominate such pictures for deletion, of course, or mark them with the appropriate template. But they can already do that, so there wouldn't be any change. --Conti| 23:07, 29 November 2012 (UTC)
Thanks, those are good questions.
(a) Perhaps we need to cover "reputable organisation" in more depth elsewhere, but FormerIP's concept of an "outlet with editorial oversight" seems reasonable to me. Flickr does not exercise nearly enough editorial oversight to qualify IMO.
(b) Models seem a grey area, possibly unsuitable for speedy deletions. FormerIP, there's no expectation of privacy at a press conference, so according to COM:PEOPLE#What are 'public' and 'private' places?, that is a public place (whether on private property or not).
(c) Some laws have exceptions for public figures, and COM:MORAL also indicates an exception, so yes, speedy deletion doesn't seem appropriate here.
(d) A grandfather clause could be a good idea, not least as a way to keep things manageable. I don't know how many files are affected by COM:PEOPLE, but I suspect a lot. Yes, a tag to request such speedy deletions would seem a sensible idea. Speedy deletion seems more manageable than individual deletion requests, but we should probably also encourage mass DRs where appropriate (>5-10 images?). Another way to keep things manageable is to reduce the inflow of problematic images, and some ideas for this have been put forward, but I think that's a separate issue. --Avenue (talk) 23:35, 29 November 2012 (UTC)
What about a photo taken in a photographer's studio? FormerIP (talk) 23:32, 29 November 2012 (UTC)
All sorts of photos are taken in photographers' studios, many of which would be for private use (e.g. family portraits). --Avenue (talk) 23:47, 29 November 2012 (UTC)
b): Where models have posed for a picture in a private location, we absolutely should have model consent to host the image here. And of course any ethical photographer will make sure they have that consent before uploading such images here. Andreas JN466 01:50, 30 November 2012 (UTC)
I was thinking of situations where there might be reasons to assume the model's consent, e.g. when similar photos from the same shoot have been reputably published elsewhere. --Avenue (talk) 11:12, 30 November 2012 (UTC)
c): If a fan spends a night in a pop star's hotel room and takes a morning-after picture of the pop star, then I think the pop star is as entitled to the consent requirement as anyone else who has just gotten out of bed. If they are photographed with the fan in the hotel lobby, then the picture can be assumed to have been taken in public. There are edge cases, but you cannot legislate for all of them. Andreas JN466 01:55, 30 November 2012 (UTC)
Hm. Why are we reinventing the wheel? The basic premise of COM:PEOPLE is to respect local laws on subject consent. Such laws already have worked out the various exceptions to make the issue manageable. So why don't we start by clarifying what those different laws are, and maybe then identify some commonalities. We could then more carefully consider whether to take those commonalities, and the spirit of COM:PEOPLE, and apply a higher standard than local laws require in some cases. Also, throwing "speedy deletion" into the mix is not entirely a good idea, because COM:SPEEDY is not policy. Finally, making "private" place the problem obscures the fact that in many countries consent is (often if not always) required for "public" places as well. Rd232 (talk) 23:24, 29 November 2012 (UTC)
I see SPEEDY not being policy as a bit of an issue. Why is there a process without a policy? It would seem sensible that speedy deletion should be applied in any black/white case. Why wait around for a foregone conclusion? This just happens to be the example that someone has brought up.
The private/public distinction is already written into policy in IDENT. There may be issues around that (personally, I don't think so, but it's not an unreasonable thing to discuss), but this proposal is building on existing policy, rather than seeking to change it, in that regard. FormerIP (talk) 23:30, 29 November 2012 (UTC)
There is a policy for speedy deletions because they are part of the Commons:Deletion policy (Commons:Deletion policy#Speedy deletion, to which COM:SPEEDY currently redirects). There is also a proposal (which is obviously not the same as a policy) at Commons:Criteria for speedy deletion. --Rosenzweig τ 23:43, 29 November 2012 (UTC)
Mm, since 21 Sep COM:SPEEDY again points at the (vague) deletion policy subsection rather than the (detailed but stale) proposal. The generic {{Speedydelete}} tag isn't used much AFAIK. Rd232 (talk) 00:15, 30 November 2012 (UTC)
If we have a picture that is in clear violation of COM:PEOPLE, it should be speedied, rather than be kept hanging around for weeks on end. People can always request a deletion review. Andreas JN466 02:00, 30 November 2012 (UTC)

  Comment I think we really need to slow down a bit. This is not a small, simple problem that a quick fix can be had for; so there's no point in rushing. Let's try and be a bit more systematic and have a wider view of the entire problem, and how different things we can do might fit together. So

  1. establish what it is we're aiming to achieve. Is it simply properly enforce COM:PEOPLE, which requires us to respect local subject consent requirements? Or something more, or something else?
  2. how are we currently failing?
  3. what conclusions do we draw about different things that need to be changed about the system?

For instance, a systemic perspective points to the need for education improvement (during upload/Flickr import process) to prevent problems arising in the first place, as well as how to more efficiently run around fixing them. I think we might get better results, actually, if we started an RFC to answer these questions, kick some ideas around, and then start drafting. COM:VPR is better suited to smaller problems or more well-developed proposals than to this kind of very expansive general discussion, I think. Rd232 (talk) 23:49, 29 November 2012 (UTC)

Rd232, I agree with all of your points here, but I wonder if there might not be some things that could be accomplished in the short-term. If we are entering what might be a long process, it would be a shame to wait on adding effective guidance about COM:PEOPLE when using upload tools, for example. Delicious carbuncle (talk) 03:39, 30 November 2012 (UTC)
Agreed, we could start drafting that guidance now - a good draft would really help, and a brief line linking to it from appropriate places probably doesn't need major discussion (or if it does, we'll soon find out...). I'm running low on time, or I would start doing it now. Rd232 (talk) 13:02, 30 November 2012 (UTC)

Flickr as a problematic source

At the moment, some people are thinking "If it is on Flickr, it must be OK". This is not the case, as COM:PEOPLE is a separate issue. If an identifiable person is the main subject of an image, its Flickr status should not be the sole guideline for its acceptability on Commons. It is basic politeness to ask if the uploader consents to transferring the image to Commons, and any bot process should ask whether this has been done. Also, Flickr has an obvious loophole, because the fact that an image was certified as CC on date X does not automatically mean that it will have a CC license on future date Y. Many Flickr images are deleted for a range of reasons, including copyvio and blatant lying about authorship of the image. COM:PEOPLE needs to be tightened to take this into account.--Ianmacm (talk) 11:17, 30 November 2012 (UTC)

Indeed. Otherwise Commons is nothing but a Flickrwashing machine. Andreas JN466 21:48, 2 December 2012 (UTC)

RFC

Now at Commons:Requests for comment/images of identifiable people. I hope it will prove a useful way to move the discussion forward - there are plenty of ideas being kicked around, and some enthusiasm for systemic improvement, so let's have at it. Rd232 (talk) 13:02, 30 November 2012 (UTC)

It would be good to get in black-and-white precisely what the issues are. The discussions here and on en.wp have successfully identified a question about speedy deletion. But what else is there? Is policy inadequate? Are decisions routinely made that go against policy? Is there some other problem? FormerIP (talk) 21:20, 30 November 2012 (UTC)
There is a "Problems" section in the RFC partly for this. But it's fine to "pre-discuss" them here too :) I don't know whether decisions are "routinely" made that go against policy; it's not something that can be evaluated easily. A factor too is that policy in this area has only developed in the last year or two, so if you go back to 2010 or earlier decisions, then it's a different ballgame anyway. Rd232 (talk) 22:32, 30 November 2012 (UTC)
What I meant was it would be good for people to concisely lay out what they see as issues in the section for that in the RfC. It's not easy to either come up with or consider solutions until the problems are clearly set out. Shoulda been clearer. FormerIP (talk) 22:45, 30 November 2012 (UTC)

Vietnam War photo heading

Short and simple -

Page - Vietnam War

Photo - File:Communistvillagers1966.jpg

'Fine print' associated with photo states 'suspected', i.e. is not as definitive as the photo's bold type heading per above.

Having been in combat in rural areas I know that most villagers, 'suspected' or not, were/are not Communists other than in a relatively primitive sense. Certainly not Marxist-Leninists.

Where I happened to be, even 'suspected' was a rationale to kill.

Please change photo's header.

Addendum - The war in Vietnam and the spillover conflicts in Laos and Cambodia were even more lethal[than Korea]. These numbers are also hard to pin down, although by several scholarly estimates, Vietnamese military and civilian deaths ranged from 1.5 million to 3.8 million, with the U.S.-led campaign in Cambodia resulting in 600,000 to 800,000 deaths, and Laotian war mortality estimated at about 1 million. http://articles.washingtonpost.com/2012-01-06/opinions/35439668_1_civilian-deaths-wars-three-year-conflict

  Done. I've renamed it to "File:Vietnamese villagers suspected of being communists by the US Army - 1966.jpg". — Cheers, JackLee talk 13:46, 17 December 2012 (UTC)

Interface search box - add one for categories

 
Current search box with two buttons: "Go" and "Search"

I would like to have an improvement for the search box. Proposal: Install (a) a third button or (b) a second search box just for categories. --Mattes (talk) 23:55, 9 December 2012 (UTC) ... tired of checking and unchecking boxes in a following menu, modifying default settings and scrolling around... P.S. Is it possible to create a JavaScript based personal search box[1]? I don't mean something like this :-P

Support. Alternate solution: autoreplace "c:" or similar by "category:". --Foroa (talk) 02:26, 10 December 2012 (UTC)
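Foroa's "autoreplace" idea could be prototyped as a small user-script helper. The sketch below is purely illustrative (the function name and prefix map are mine, not an existing gadget): a pure function expands a short prefix such as "c:" into the full "Category:" namespace before the query is submitted; a personal script would then hook it into the search form's submit event.

```javascript
// Hypothetical sketch of the "autoreplace" idea: expand short prefixes
// into full namespace names before the search query is sent.
function expandSearchPrefix(query) {
  // Prefix map is an assumption for illustration; extend as desired.
  const prefixes = { "c:": "Category:", "f:": "File:" };
  for (const [short, full] of Object.entries(prefixes)) {
    if (query.toLowerCase().startsWith(short)) {
      // Keep the rest of the query exactly as typed.
      return full + query.slice(short.length);
    }
  }
  return query; // no known prefix: leave the query untouched
}

console.log(expandSearchPrefix("c:Paris")); // "Category:Paris"
console.log(expandSearchPrefix("Paris"));   // "Paris" (unchanged)
```

A user script could call this on the search input's value in a submit handler; the element ids involved would depend on the skin, so that wiring is left out here.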
...Or maybe just set Category as the default search namespace. And the current search box has only one button for pretty much all users, by the way. --Yair rand (talk) 04:40, 10 December 2012 (UTC)
Might well be confusing as a default option, but would be useful as an optional gadget, yes. Rd232 (talk) 14:43, 10 December 2012 (UTC)
I find "go" and "search" really more confusing. If I remember right, we had two search boxes some years ago but I don't recall the labelling... It's a matter of common sense. How should users know what to get after choosing "go" or "search" BTW? --Mattes (talk) 15:12, 11 December 2012 (UTC)

I prefer to start at Special:Search. But Special:Search does not have category search in the line of items at the top. This line of items only shows up after a search is made:

Help • Search categories • Show other tools

"Search categories" should be an option from the beginning. Go to Special:Search to see what I mean. I have it bookmarked. I have also enabled the gadget that lets me Ctrl-click the blank search form at the top right of all pages. That takes me to Special:Search in a new tab.

See related category search requests at Commons:Requests for comment/improving search. --Timeshifter (talk) 15:04, 14 December 2012 (UTC)

I will be eternally thankful to any one who implements Mattes's (or Foroa's) suggestion. I have typed the word "category:" thousands of times. Assuming it takes five seconds to move the cursor and type the word, I have spent almost an entire day in total writing "category:" - which is nothing, though, compared to the frustration it begets to do it over and over every time you work on Commons. --Jonund (talk) 16:45, 22 December 2012 (UTC)

Automatic redirecting, disallowal of duplicate names

Consider the situation where a single image name (let us say test.jpg) can refer to 8 different files:

  • test.jpg
  • test.jpG
  • test.jPg
  • test.jPG
  • test.Jpg
  • test.JpG
  • test.JPg
  • test.JPG

And that's before we consider JPEG, PNGs, etc. Is there no way we can simply make file names unique, so that all variants on .jpg redirect to .JPG or whatever? I am aware this will require a lot of file moving, but it would make life easier for everyone. -mattbuck (Talk) 02:41, 18 December 2012 (UTC)

Maybe there are things we can do with bots to recognise name conflicts (eg test.jpg /test.jpG both exist! please rename one of them!) and maybe even mime-type conflicts (this .jpg file is a .png! what gives?), but a more fundamental fix needs MediaWiki work. There is Bugzilla:32660 File extensions for the same file type should not allow variations of a file name (File:X.jpg, File:X.jpeg, File:X.JPG should all refer to the same file) and its cousins. Rd232 (talk) 02:55, 18 December 2012 (UTC)
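The bot-side conflict check suggested above could look something like the sketch below. This is a hypothetical illustration, not MediaWiki behaviour or an existing bot: filenames are grouped under a case-insensitive key (with common extension spellings like .jpeg/.jpg unified), and any key with more than one distinct spelling is flagged as a potential clash for a human to resolve.

```javascript
// Illustrative conflict detector: group filenames by a normalized key so
// case-only variants (test.jpg / test.jpG) and extension-spelling variants
// (test.jpeg) land in the same bucket. Key scheme is an assumption.
function findNameConflicts(filenames) {
  const groups = new Map();
  for (const name of filenames) {
    const key = name
      .toLowerCase()
      .replace(/\.jpeg$/, ".jpg")  // treat .jpeg and .jpg as one type
      .replace(/\.tiff$/, ".tif"); // likewise for .tiff and .tif
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(name);
  }
  // Only buckets holding more than one file are actual conflicts.
  return [...groups.values()].filter(g => g.length > 1);
}

console.log(findNameConflicts(["test.jpg", "test.jpG", "test.JPEG", "other.png"]));
// → [["test.jpg", "test.jpG", "test.JPEG"]]
```

A real bot run would feed this a namespace listing and open rename requests for each flagged group; a mime-type check (comparing the declared extension against the file's magic bytes) would need server-side data and is a separate step.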
One really dirty solution might be to force the last 3 letters of new upload filenames to lowercase. Not sure we can do that ourselves (Javascript?), but it would be an easy MediaWiki / UploadWizard bug. Rd232 (talk) 02:57, 18 December 2012 (UTC)
I'm not so sure that the mixed-case alternatives are allowed... AnonMoos (talk) 19:53, 21 December 2012 (UTC)