I'm sure we've all seen the story about the racist picture of Michelle Obama; it's even made it onto the demotiv thread. There is a popular perception that Google initially barely reacted to this incident, other than posting a warning/apology when visitors clicked on the link in Google Images, but has now removed the image. Others think that Google didn't remove the image, but merely reduced its ranking so it didn't appear near the top of the listing. I think the reality is perhaps slightly different; I've also heard that the original site has taken the image down and that is the reason for the drop in rankings. http://news.bbc.co.uk/1/hi/world/americas/8377922.stm

Either way, my point is hypothetical. I was wondering to what extent Google should get involved in censorship... Regardless of the reality, at first I was horrified that Google should tamper with their search results in such a partisan way, but then again, wouldn't we expect them to do something if other unsavoury listings/images managed to float to the top of the pile? What would they do if the KKK managed to google-bomb their way to the top of the listings for some innocuous searches? What if a CP or other illegal image was manipulated to the top of the pile?

If we accept that there are some cases where we would expect Google to manually intervene, where would we draw the line? And how would we know when Google had intervened? For the record, as distasteful as the Obama image is, I'd certainly not alter its ranking; though I expect the White House might disagree with me...
That's an intriguing question. Google tends to be very pragmatic when it comes to censorship: note its involvement with China, under the argument that censored access to the internet is still better than no access at all. To an extent Google has a point. Google also has a point that, as a search engine, it just reflects what is out there on the internet, as unbiased as possible. Nevertheless one could see the merits of it censoring some extreme images (e.g. child porn), if only not to collude with the mentality that seeks such imagery out.

The problem, as always, is practical: who decides what to censor? We have already seen the hash made of the censorship of the Virgin Killer (by the Scorpions) record cover. Although IMO it was quite legit to challenge that image, I think that the process by which it occurred only served to obfuscate the reasons why. Flickr tried its hand at censorship and met with huge protests. Given the fairly innocuous (even if graphic) material it subjected to censorship, I can see why users made a fuss.

Some ideas just need challenging, because they are damaging to people in all sorts of insidious ways. But the challenge needs to be better than a knee-jerk response by some bureaucrat. Perhaps the most Google-esque way to do it is to let the browsing public have the final say: give people the option to report an image as unacceptable. If enough people protest (preferably from a varied demographic, not just, say, some uptight cluster of killjoys in the Mid-Western States or the Middle East), then the image is reviewed and, if necessary, removed.
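Just to make the "varied demographic" idea concrete, here's a minimal sketch in Python of what such a report queue might look like. All the names and thresholds are invented for illustration; a real system would obviously tune these and verify regions properly.

```python
from collections import defaultdict

# Hypothetical thresholds -- a real system would tune these carefully.
REPORT_THRESHOLD = 100   # minimum number of distinct reports
MIN_REGIONS = 3          # reports must come from at least this many regions

class ReportQueue:
    """Collects user reports and flags an image for human review only when
    complaints are both numerous and spread across several regions, so one
    cluster of users can't force a takedown on its own."""

    def __init__(self):
        # image_id -> set of (user_id, region); a set deduplicates repeat
        # reports from the same user.
        self.reports = defaultdict(set)

    def report(self, image_id, user_id, region):
        self.reports[image_id].add((user_id, region))

    def needs_review(self, image_id):
        entries = self.reports[image_id]
        regions = {region for _, region in entries}
        return len(entries) >= REPORT_THRESHOLD and len(regions) >= MIN_REGIONS
```

The point of the two-part check is exactly the one above: volume alone isn't enough; the protest has to come from more than one corner of the demographic before a human reviewer gets involved.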
Interesting question, censorship. Most of the "big minds" of the past, at least classically speaking, were all for it; there were always things they considered simply unfit for general consumption. Plato argued for it, especially for the defenders of the city in The Republic (although he also went as far as to defend institutionalized homosexuality and pederasty as well), and his allusions to societally acceptable sodomy aside, he made a somewhat decent point. Summed up, he argues that certain influences will have undesired effects on the defense of the city. He cites certain musical modes, the influence of women, politics, gossip, and family: much of what we take away from our own military in order to mold them into what we "want" as a civil defense force.

To an extent, censorship is a positive thing, shielding people from things they often have no need to know about (and before you ask me what, who NEEDS to know about CP?) or otherwise hindering the dissemination of said articles. However, censoring simply because you don't like something needs to be examined with far more scrutiny. Do you dislike it because it offends you? Because it's morally wrong? Because it's legally wrong? Or because it's a loose temptation, running rampant in your patio, occasionally scratching on the wall to let you know it's dropped a deuce on your porch so you'll have to come out there and clean it up, and it'll get you then? Do you flag it as offensive because you fear for others? What constitutes an acceptable reason to censor?

As far as dealing with people goes, there's a good reason to censor a lot of things just to keep the dumb ones from trying them; many don't have an original thought in their heads, and the fewer we inject, the better. To be honest, that's a failing in other areas of education, but it can at least be band-aided with censorship (yes, I know what that's doing to my argument; it's intentional).
But then again, look at China, where ideas and freedom are the taboo subjects, and look at the consequences: who is fit to judge what is censored and what isn't? Is it based on absolute moral certitude (wanna give me a GOOD reason that CP shouldn't be censored, to revisit a running theme?) or merely on the whims of a quite possibly corrupt state or an ignorant populace? Before anyone strikes, I have no clear answer. I do know that lax censorship leads to all kinds of teaching experiences for my wife and me, and as a result my kids know, in my opinion, absolutely too much, but have no interest in trying any of it (thank God!). I was sheltered as a child, and while I think it's my moral compass that ended up saving me, I surely wish I had seen some more of what was out there before being let go into the world... As always, opinions expressed above are just that, the result of my life and experiences; the only facts cite their source, and all complaints can be directed to the helpdesk.
To address the only point I have time for at the moment: there doesn't seem to be any reason to censor child pornography. Viewing child porn harms no-one and doesn't break anyone's essential freedoms, and although it's morally distasteful and perhaps fuels an already immoral and illegal industry, there's not much wrong with it. It's the act of creation which is the serious moral problem, not the act of viewing.
Well, if someone has to manufacture it for it to be viewed, it's kinda cyclical. I know what you're trying to say, but just because it doesn't seem to have an adverse effect on me doesn't mean that the same holds true for others, especially if the .jpg you're viewing directly required the degradation of fellow humans to produce. Censorship in this case is less about the concept of children as the objects of pornography than it is an attempt to lessen the demand for the production thereof, a process which IS morally contemptible. Though even your thesis statement is debatable; assuming its validity, someone has to suffer to create that image, so the argument still stands. Nexxo, Spec, and a good conversation topic. It's like Christmas came early.
You're right that, theoretically speaking, viewing child pornography is not harmful. But the second it becomes legal to view, you're legalising the production. If there were streaming websites for child porn similar to the porn streaming sites that exist today, then these websites would be ad-driven with a view to making a profit. And the way to keep people viewing would be to produce more. Aside from that, legalising the viewing of something sends a message that the state is covertly sanctioning its use/production, just as when the cannabis laws were relaxed.
Not true. It's not illegal to witness a murder or rape, but it's completely illegal (and immoral) to do such things. Now, take this further: it's morally repugnant to enjoy viewing a murder, but it's not illegal. By enjoying viewing a murder you harm no-one. Similarly with child pornography: it's morally repugnant to enjoy viewing the horrific sexual abuse of a child, but it's not inherently immoral, for in viewing child pornography you harm no-one. So imagine this scenario. No more child pornography is produced. Everyone in the world agrees to stop recording instances of child abuse so that they can be viewed later. The world's entire collection of child pornography is amassed into one single huge archive, enough so that no paedophile will ever be dissatisfied by what's there. Is it wrong for anyone to view the pornography then? I have a feeling most people would say yes, even though I can't think of any moral reason to say no. It seems a case of us not liking something, and so demanding that no-one may enjoy it simply because of our dislike. Sorry for focussing on the one issue here; I intend to address more points once I have time to sit down and spend a little time thinking.
Following your supposition to its grim end, would not the reason for the distaste at that point be the fact that people WERE degraded to get those images? Some acts are inherently wrong, and the fact that they are past does not alter that, a factor that would likely account for the offense taken even at things that we know happened in the past. Your argument ends up being the equivalent of "Well, the Holocaust happened in the past, so it's no longer important; let's dig up every painful memory, every death record, stolen filling and shameful photograph and generally treat it like it's no big deal." Except it is a big deal, one that people fought and bled and died to put a stop to, and to denigrate it in such a fashion reduces it to a cheap, shabby image of itself that lacks all the meaning and leaves none of the gore, grit and horror.

To those people victimized by the child pornography industry: don't you think that even if we put a stop to it now, and banned it all effectively but still allowed access to the library of work still extant, they would have an issue? That the person who was forced into it at the age of 11 and now wants a job has their application turned down by someone who saw their image online and thought they were a morally reprehensible person, or even worse, that someone gets an idea and starts to stalk their "favorite" subject? To drag it out further, what if someone got an eyeful, noticed a child nearby and decided "hey, that looks like fun. I should try that"? Censorship in this circumstance is able to function as a preventative, even if the efficacy is debatable. We lock up items in the cabinets when our children are small so that they do not ingest them and kill themselves; in the same way, some universally harmful stuff probably ought to go under the rug, as it were. It's just as bad for you (if not as lethal) as the drain cleaner your parents told you not to drink.
Addressing the CP issue first (not that I want it to dominate the thread): some CP is peddled for profit; trading and viewing such stuff creates a market, and many unscrupulous people will take advantage of that, thus ensuring that more children are abused. Since the general consensus is that there is no place for CP in any developed society, and given the legal predicament you can be in merely by viewing it, there is every reason why Google and others should go to great lengths to purge such material from their directories.

However, the Michelle Obama case is less clear cut. Although this image was clearly offensive, and designed to be offensive, it requires human interpretation to reach this conclusion. In another context, I might 'shop an image of my friend as some sort of private joke and place it on my website; here there is no offense intended or received. Should it be censored, especially since an unconnected 3rd party might see it and not understand the context? What about the free speech angle? Even if something is tasteless and offensive, should Google be intervening? Should Google be required to interpret laws local to each domain? It is illegal to deny the Holocaust in Germany but it's pretty much legal everywhere else.

I think that in general, Google shouldn't censor stuff unless it fits one of a few exceptional categories (CP being a prime example), but it should classify content carefully, so it can easily be filtered out by those who wish to. But in general, they should use a very light touch. And in the case of the Obama image, it should be left, unedited and uncommented, in all its glory. If people want to remove it, they should check the AUP of the ISP, and take action that way. Or they could always just ignore it... If Google were to censor its content more actively, it would be at the top of a slippery slope.
I can't help but agree. There are few places where I think it's a great idea to let stupidity sort itself out, but this might be one of them. I've not finished with this subject, but I have to be up at 4 AM for 18 hours of Black Friday mayhem. I may come back to this late tomorrow night. As for the CP issue, well, it's a simple tangible. Because we can wrap our heads around an easy right or wrong, we use it as a launching board, but we're heading into deeper waters here... Can we keep this alive for a while?
Regarding the CP: pornography is something that stimulates the sexual urges. With regard to paedos, any image of a child, be that a portrait, playing in the garden, or sitting at the table with the family, could be deemed CP. The whole topic of censorship in general is a slippery slope argument, and you could spend the rest of your life discussing it. As for censorship on the internet, there are things in place, age warnings etc., but every country has different laws for different things.
Actually, it does. I have gone into this in another thread: viewing child porn harms children in that, after sufficient exposure (whether you're into that kind of thing or not), it normalises the idea of sexualising children. At some point, children do pay. Likewise, at some point, racist or anti-Semitic propaganda results in racist or anti-Semitic acts against people. You look at it too much as a dynamic between the producer, the consumer and the person(s) in the image. You also have to consider the impact of the idea on the people that the image relates to, and on society as a whole. Actually, I am almost 100% sure that you can be legally charged if you witnessed a murder and did nothing. Yes. Moral reason: unchallenged exposure to it normalises the inappropriate sexualisation of children. We see what material that sexualises women does for how women get regarded and treated. We see what racist materials do for how ethnic minorities get regarded and treated. Jean Charles de Menezes could say a bit about the harmfulness of stereotyped ideas, if he were still alive.
Whoever said ideas weren't dangerous... Yeah, I think we're all on the same page here. There's a quote, which I swear I'm gonna butcher, that goes something like "The greatest good is to know good, and do it. The greatest evil is to know good, and not do it." By that definition, censorship is a necessity. Of course, there is a difference between harm and discomfort. Discomfort can help us grow, teach us and uncover flaws in our worldview. On the other paw, intentionally harming someone is pretty much evil, no redemption to it.

Whoo, way too much time at work, and way too many dumb people afterwards. Censorship IS a slippery slope, and I don't think there's any easily definable right or wrong (not to say there isn't one; I don't believe in unanswerable questions), nor do I think we'll find the answer to everyone's question in a forum thread on Bit. For my part, there is little I'd censor, and it's pretty much all been covered. However, the more you censor, the more there is to censor, and the spiral never ends; there will always be someone offended by something.

In my broad travels across the US, I lived in Amish country for a while. Hard-working, honest, completely sheltered. They were truly offended by bright colors, calling them an affront to God. I kid you not. I was really glad I drove a silver car back then; I was mostly unnoticed. In Amish country I dressed mostly in black if I went into a town (their cooking rules, as does their woodwork) and I was treated pretty decently. Having quite the beard myself most of the time, I fit in pretty well. I was just missing the hat, but I didn't have a wife to give me one. I chose to voluntarily censor myself in that situation, a choice that isn't being explored. There is censorship by governing bodies, which I think is still the true theme of the thread, but there is always the concept of self-censorship. It's too bad we can't accurately metafilter search data based on our personal preferences.
I'd love for my searches to stop returning porn. I'd love for my daughters' searches to stop returning warez sites. They also have porn. I'm so absolutely certain that my daughters need to see yet another "throbbing teen waiting for your-*you know." That's such a positive influence on them (as was covered earlier by Nexxo). I'd love to be able to say to my computer "stop showing me results that contain giant wangs" or something and have it ACTUALLY WORK, and not also block Bit-Tech and half the rest of the internet. I know, I know, that really doesn't work, because some people would just set the search filter to "pr0n plz" and it would defeat the purpose.

Call it wishing for a simpler time, when people still had a moral compass and understood right from wrong. Before you say it: yes, I just stated that I believe the majority of people to be devoid of a moral compass. One of the main reasons that censorship is necessary is because people do not have a firmly seated "yes, this is right" or "no, this is wrong." Furthermore, they aren't willing to stand for that belief, especially when opposition is involved. Most of the people I encounter don't really mind right or wrong, or even worse don't believe in it, going on about "personal right and personal wrong." Personally, if I believed in my heart of hearts that 2+2=5, no matter the fervor of my belief, it's still wrong. Morality is no different. If morality is bound by logic (and it is; otherwise we'd never agree that anything was wrong) then it must dictate that certain actions are by nature wrong, certain actions are right, certain actions are right by intent, and so forth. Censoring the actions that are wrong by nature would leave us with stuff that's wrong by intent, but ill intent can be changed by instruction, if it wishes to be changed. Good Lord, I'm rambling. I bet I've made half the points I wanted to make, and I have to crash.
I'll rejoin this at a later time when people have properly ridiculed me.
The constant and already well-covered problem with censorship is: "Who will guard the guards?" Once something becomes censored, the question becomes "what next?" But if nothing is censored, the question transforms into "why not?"
Good point. I guess the most basic line to draw is to remove only what is illegal in a specific country. Hence in China they have stricter filters (actually, I think that might have been more a reaction to political pressure than actual law?). So CP is illegal and should be filtered out, whereas a deeply offensive image of America's First Lady may or may not be legal depending on which country it's being viewed in and that country's own laws.
The only problem with free speech is international boundaries. In Germany gore is totally unacceptable, but in the US it's fine. If we could have an international "censorship" organization, it would make things easier, although at the same time it might cause a near-totalitarian form of censorship, as we would have overly-concerned groups all around the world lobbying for things. Not such a good idea after all. It's more a matter of freedom of the press, but you have to realize that not all nations have said freedom. Even so, I personally am not too bothered by some censorship, like the ban on CP, and it doesn't surprise me that there would be censorship in other areas as well; as long as the ability to censor is exercised in severe moderation and our freedom of press and opinion is preserved to its fullest extent, I see no problem. On another note, Mr. Mario, the problem with filtering in each country is that regulation would cost too much and basically be pointless. However, it's a good step.
Nah, sorry, this stinks of "you can say and think whatever you want as long as it isn't offensive to anyone". As for saying that not all nations have this freedom: who cares (no offense)? Why should Google give a crap about other countries' laws? Following this reasoning, Google wouldn't exist; everything on Google is offensive to someone, somewhere. This is a clear-cut case of double standards.
Well, as far as child porn goes I agree, since viewed pornography can heavily shape perceived sexual norms, and the sexualisation of children leads to disturbing mainstream practices; refer to Baby Ballroom, child clothing of the decade and child beauty pageants. As far as women go, I have to disagree, because sexualisation, in pornography, of legal acts that are consensual leads, in my belief, to sexual activity that is not harmful to any participants. But this comes down to the Marks and Sparks advert: it is using what humans view as attractive to advertise a product.
On the topic of Google censoring, I think that the only acceptable solution would be an extension of their current filtering, so that offensive results never appear in innocuous searches, but the data is still out there if the user decides to purposefully look for it.
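The filter-not-delete idea above boils down to something very simple: flagged results are hidden by default but returned when the user explicitly opts in. A minimal sketch in Python, with invented data and function names (this is not Google's actual filtering API):

```python
# Hypothetical result set: each entry carries a "flagged" tag assigned by
# whatever classification process the search engine uses.
RESULTS = [
    {"url": "http://example.com/a", "flagged": False},
    {"url": "http://example.com/b", "flagged": True},
    {"url": "http://example.com/c", "flagged": False},
]

def search(results, include_flagged=False):
    """Return results, suppressing flagged entries unless the user opts in.

    Nothing is deleted from the index; the default view simply hides
    flagged material, and an explicit opt-in restores the full listing.
    """
    if include_flagged:
        return list(results)
    return [r for r in results if not r["flagged"]]
```

The design choice here is that censorship happens at display time, not at index time: the innocuous search stays clean, while a user who deliberately opts in still sees everything.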