{"id":319,"date":"2015-06-17T22:00:35","date_gmt":"2015-06-18T05:00:35","guid":{"rendered":"https:\/\/traverstodd.com\/?p=319"},"modified":"2015-06-29T23:14:46","modified_gmt":"2015-06-30T06:14:46","slug":"right-to-be-forgotten-misguided-law-or-only-avenue","status":"publish","type":"post","link":"https:\/\/traverstodd.com\/right-to-be-forgotten-misguided-law-or-only-avenue\/","title":{"rendered":"Right to be forgotten – misguided law or the only avenue?"},"content":{"rendered":"
In 2014, a top EU court ruled that Google must amend some search results at the request of ordinary people, in a test of the so-called “right to be forgotten”.<\/p>\n
The case was made that Google needs to comply with requests<\/a> to remove irrelevant or outdated data. The law currently applies only in Europe and has not spread to the US. Critics in the EU have rightly called the rule tantamount to censorship, while others have argued in favor of the ruling, feeling it is unfair for outdated data to remain present in search results.<\/p>\n
<h3>Who says it’s outdated?<\/h3>\n
The rule has extended far beyond “irrelevant or outdated” data: it has allowed ordinary people to remove results they would rather not have discovered in internet searches. Worse, pedophiles have successfully had unfavorable results removed from Google as well.<\/p>\n
So if Kanye doesn’t like stuff that’s said about him – for which Google probably has a couple terabytes of data reserved just for holding indexed results relating to negative-Kanye-speak – he can make them go poof.<\/p>\n
Lots of questions<\/a> abound about whether the law is necessary, or whether it’s a precedent-setting step toward increased censorship. Indeed, the internet can be a cruel place.<\/p>\n
<h3>Search engines index content.<\/h3>\n
But is censoring results in a search engine the best way to remove unwanted entries from the world wide web? I’d argue no.<\/p>\n
Hosting providers host that content. The buck doesn’t stop with Google (nor Bing, nor Yahoo, nor the others that are even less important than Yahoo…like, um, Infoseek or Netscape?). If I am Facebook, Twitter, or my-shameless-Tumblr-blog-of-filth, and I’m hosting terrible info about you, shouldn’t the onus be on me, the content provider, to comply with removal requests? If the EU government hosts a website that lists, say, information for worried parents about pedophiles in their neighborhood, shouldn’t I be able to search a suspected pedophile’s name in Google to see if it comes up?<\/p>\n
I’d argue yes.<\/p>\n
Google is showing us content we seek, but the provider is hosting that content for Google to index in the first place.<\/p>\n
<h4>Getting around the law<\/h4>\n
Some would argue the inevitability<\/a> of hosts needing to worry about removal requests coming their way, and to me this appears to be the more appropriate avenue for deleting content you don’t want.<\/p>\n