The IPKat has been perusing a Council of Europe Recommendation, this being the inelegantly named Recommendation CM/Rec(2008)6 of the Committee of Ministers to member states on measures to promote the respect for freedom of expression and information with regard to Internet filters (adopted by the Committee of Ministers on 26 March 2008 at the 1022nd meeting of the Ministers’ Deputies). According to the document,
"The Committee of Ministers, under the terms of Article 15.b of the Statute of the Council of Europe,
Considering that the aim of the Council of Europe is to achieve greater unity between its members for the purpose of safeguarding and realising the ideals and principles which are their common heritage;
Recalling that States Parties to the ... European Convention on Human Rights ... have undertaken to secure to everyone within their jurisdiction the human rights and fundamental freedoms defined in the Convention;
Reaffirming the commitment of member states to the fundamental right to freedom of expression and to receive and impart information and ideas without interference by public authorities and regardless of frontiers, as guaranteed by Article 10 of the European Convention on Human Rights [the IPKat reminds readers at this point that ideas and information float across borders pretty well, but copyright remains stubbornly territorial despite so many attempts at ironing out those dastardly frontiers];
Aware that any intervention by member states that forbids access to specific Internet content may constitute a restriction on freedom of expression and access to information in the online environment and that such a restriction would have to fulfil the conditions in Article 10, paragraph 2, of the European Convention on Human Rights and the relevant case law of the European Court of Human Rights [There's no problem interfering with online trade, is there? The IPKat wonders];
Recalling in this respect the Declaration on human rights and the rule of law in the information society, adopted by the Committee of Ministers on 13 May 2005, according to which member states should maintain and enhance legal and practical measures to prevent state and private censorship;
Recalling Recommendation Rec(2007)11 of the Committee of Ministers to member states on promoting freedom of expression and information in the new information and communications environment, according to which member states, the private sector and civil society are encouraged to develop common standards and strategies to promote transparency and the provision of information, guidance and assistance to the individual users of technologies and services concerning, inter alia, the blocking of access to and filtering of content and services with regard to the right to receive and impart information;
Noting that the voluntary and responsible use of Internet filters (products, systems and measures to block or filter Internet content) can promote confidence and security on the Internet for users, in particular children and young people, while also aware that the use of such filters can impact on the right to freedom of expression and information, as protected by Article 10 of the European Convention on Human Rights;
Recalling Recommendation Rec(2006)12 of the Committee of Ministers on empowering children in the new information and communications environment, which underlines the importance of information literacy and training strategies for children to enable them to better understand and deal with content (for example violence and self-harm, pornography, discrimination and racism) and behaviours (such as grooming, bullying, harassment or stalking) carrying a risk of harm, thereby promoting a greater sense of confidence, well-being and respect for others in the new information and communications environment;
Left: prevented from grooming and stalking, cats soon abandon their online pursuits for more traditional pastimes ...
Convinced of the necessity to ensure that users are made aware of, understand and are able to effectively use, adjust and control filters according to their individual needs;
Recalling Recommendation Rec(2001)8 of the Committee of Ministers on self-regulation concerning cyber content (self-regulation and user protection against illegal or harmful content on new communications and information services), which encourages the neutral labelling of content to enable users to make their own value judgements over such content and the development of a wide range of search tools and filtering profiles, which provide users with the ability to select content on the basis of content descriptors [labelling's a tricky concept, the IPKat notes: it enables those who want to avoid something to do so, but makes it much easier for those who want it to get it];
Aware of the public service value of the Internet, understood as people’s significant reliance on the Internet as an essential tool for their everyday activities (communication, information, knowledge, commercial transactions [at last!], entertainment) and the resulting legitimate expectation that Internet services be accessible, affordable, secure, reliable and ongoing and recalling in this regard Recommendation Rec(2007)16 of the Committee of Ministers on measures to promote the public service value of the Internet;
Recalling the Declaration of the Committee of Ministers on freedom of communication on the Internet of 28 May 2003, which stresses that public authorities should not, through general blocking or filtering measures, deny access by the public to information and other communication on the Internet, regardless of frontiers, but that this does not prevent the installation of filters for the protection of minors, in particular in places accessible to them, such as schools or libraries [If only things were so simple. Before the days of the internet, most libraries either didn't stock unsuitable works or limited their access -- alas, it only seems to make the forbidden fruit more desirable];
Reaffirming the commitment of member states to everyone’s right to private life and secrecy of correspondence [now this one is really tricky: everyone leaves an online trail of one sort or another -- and social networking sites seem to be creating a new level of access somewhere between the traditional zones of access-to-all and totally confidential], as protected by Article 8 of the European Convention on Human Rights, and recalling the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No. 108) and its Additional Protocol regarding supervisory authorities and transborder data flows (ETS No. 181) as well as Recommendation No. R (99) 5 of the Committee of Ministers on the protection of privacy on the Internet,
[The IPKat says, you've had the hors d'oeuvres. Now this is where the main course starts ...]
Recommends that member states adopt common standards and strategies with regard to Internet filters to promote the full exercise and enjoyment of the right to freedom of expression and information and related rights and freedoms in the European Convention on Human Rights, in particular by:
– taking measures with regard to Internet filters in line with the guidelines set out in the appendix to this recommendation [the IPKat says, you can read them below];
– bringing these guidelines to the attention of all relevant private and public sector stakeholders, in particular those who design, use (install, activate, deactivate and implement) and monitor Internet filters, and to civil society, so that they may contribute to their implementation.
Appendix to Recommendation CM/Rec(2008)6
Guidelines
I. Using and controlling Internet filters in order to fully exercise and enjoy the right to freedom of expression and information
Right: since the CoE is so keen on transparency, we thought that a colour X-ray of an Appendix would be appropriate
Users’ awareness, understanding of and ability to effectively use Internet filters are key factors which enable them to fully exercise and enjoy their human rights and fundamental freedoms, in particular the right to freedom of expression and information, and to participate actively in democratic processes. When confronted with filters, users must be informed that a filter is active and, where appropriate, be able to identify and to control the level of filtering the content they access is subject to. Moreover, they should have the possibility to challenge the blocking or filtering of content and to seek clarifications and remedies [the Council of Europe has to look both ways here. In reality, filters work best when people don't know they're there].
In co-operation with the private sector and civil society, member states should ensure that users are made aware of activated filters and, where appropriate, are able to activate and deactivate them and be assisted in varying the level of filtering in operation, in particular by:
i. developing and promoting a minimum level of information for users to enable them to identify when filtering has been activated and to understand how, and according to which criteria, the filtering operates (for example, black lists, white lists, keyword blocking, content rating, etc., or combinations thereof) [so long as the blacklists include all the regular mis-spellings etc etc -- for a rough idea of how these criteria might fit together, see the Kat's little sketch at the end of this section];
ii. developing minimum levels of and standards for the information provided to the user to explain why a specific type of content has been filtered;
iii. regularly reviewing and updating filters in order to improve their effectiveness, proportionality and legitimacy in relation to their intended purpose;
iv. providing clear and concise information and guidance regarding the manual overriding of an activated filter, namely whom to contact when it appears that content has been unreasonably blocked and the reasons which may allow a filter to be overridden for a specific type of content or Uniform Resource Locator (URL);
v. ensuring that content filtered by mistake or error can be accessed without undue difficulty and within a reasonable time;
vi. promoting initiatives to raise awareness of the social and ethical responsibilities of those actors who design, use and monitor filters with particular regard to the right to freedom of expression and information and to the right to private life, as well as to the active participation in public life and democratic processes;
vii. raising awareness of the potential limitations to freedom of expression and information and the right to private life resulting from the use of filters and of the need to ensure proportionality of such limitations;
viii. facilitating an exchange of experiences and best practices with regard to the design, use and monitoring of filters;
ix. encouraging the provision of training courses for network administrators, parents, educators and other people using and monitoring filters;
Left: training courses are essential for parents if they are not to be over-protective
x. promoting and co-operating with existing initiatives to foster responsible use of filters in compliance with human rights, democracy and the rule of law;
xi. fostering filtering standards and benchmarks to help users choose and best control filters.
In this context, civil society should be encouraged to raise users’ awareness of the potential benefits and dangers of filters. This should include promoting the importance and significance of free and unhindered access to the Internet so that every individual user may fully exercise and enjoy their human rights and fundamental freedoms, in particular the right to freedom of expression and information and the right to private life, as well as to effectively participate in public life and democratic processes. [promoting free and unhindered access and promoting the need to filter are both desirables, but do tend to have a somewhat contradictory flavour]
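[The IPKat, ever curious about how these things might actually work, offers a very rough sketch of the combined criteria mentioned in point i above -- a black list, a white list and keyword blocking -- written so that the filter can also explain to the user why something was blocked, in the spirit of points ii and iv. Every name and list in it is the Kat's own invention for illustration; nothing here is prescribed by the Recommendation.]

```python
# A minimal, illustrative sketch (the IPKat's own, not part of the Recommendation)
# of the filtering criteria in point i above: a black list, a white list and
# keyword blocking, combined so that the user can be told *why* a page was blocked.
# All names and lists (WHITE_LIST, BLACK_LIST, BLOCKED_KEYWORDS, check_url) are hypothetical.

from urllib.parse import urlparse

WHITE_LIST = {"example-education.org"}          # always allowed
BLACK_LIST = {"example-harmful.com"}            # always blocked
BLOCKED_KEYWORDS = {"grooming", "self-harm"}    # blocked wherever they appear

def check_url(url: str, page_text: str) -> tuple[bool, str]:
    """Return (allowed, reason) so the decision can be explained to the user."""
    host = urlparse(url).hostname or ""
    if host in WHITE_LIST:
        return True, "host is on the white list"
    if host in BLACK_LIST:
        return False, "host is on the black list"
    hits = [kw for kw in BLOCKED_KEYWORDS if kw in page_text.lower()]
    if hits:
        return False, f"keyword blocking matched: {', '.join(hits)}"
    return True, "no filtering rule matched"

allowed, reason = check_url("http://example-harmful.com/page", "some text")
print(allowed, "-", reason)   # False - host is on the black list
```

[Real filters are, of course, far more elaborate; the only point of the sketch is that the "why" can be surfaced to the user as easily as the "whether".]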
II. Appropriate filtering for children and young people
The Internet has significantly increased the number and diversity of ideas, information and opinions which people may receive and impart in the fulfilment of their right to freedom of expression and information without interference by public authorities and regardless of frontiers. At the same time, it has increased the amount of readily available content carrying a risk of harm, particularly for children and young people. To satisfy the legitimate desire and duty of member states to protect children and young people from content carrying a risk of harm, the proportionate use of filters can constitute an appropriate means of encouraging access to and confident use of the Internet and be a complement to other strategies on how to tackle harmful content, such as the development and provision of information literacy.
In this context, member states should:
i. facilitate the development of strategies to identify content carrying a risk of harm for children and young people, taking into account the diversity of cultures, values and opinions;
ii. co-operate with the private sector and civil society to avoid over-protection of children and young people by, inter alia, supporting research and development for the production of “intelligent” filters that take more account of the context in which the information is provided (for example by differentiating between harmful content itself and unproblematic references to it, such as may be found on scientific websites);
iii. facilitate and promote initiatives that assist parents and educators in the selection and use of developmental-age appropriate filters for children and young people;
iv. inform children and young people about the benefits and dangers of Internet content and its filtering as part of media education strategies in formal and non-formal education.
Furthermore, the private sector should be encouraged to:
i. develop “intelligent” filters offering developmental-age appropriate filtering which can be adapted to follow the child’s progress and age while, at the same time, ensuring that filtering does not occur when the content is deemed neither harmful nor unsuitable for the group which the filter has been activated to protect; [this is creepy -- a child is rewarded for his or her maturation under a heavily filtered system by being progressively less heavily filtered; Merpel sketches what this might look like at the end of this section]
ii. co-operate with self- and co-regulatory bodies in order to develop standards for developmental-age appropriate rating systems for content carrying a risk of harm, taking into account the diversity of cultures, values and opinions;
iii. develop, in co-operation with civil society, common labels for filters to assist parents and educators in making informed choices when acquiring filters and to certify that they meet certain quality requirements;
iv. promote the interoperability of systems for the self-classification of content by providers and help to increase awareness about the potential benefits and dangers of such classification models.
Moreover, civil society should be encouraged to:
i. debate and share their experiences and knowledge when assessing and raising awareness of the development and use of filters as a protective measure for children and young people;
ii. regularly monitor and analyse the use and impact of filters for children and young people, with particular regard to their effectiveness and their contribution to the exercise and enjoyment of the rights and freedoms guaranteed by Article 10 and other provisions of the European Convention on Human Rights.
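[Merpel, intrigued (and, as noted above, slightly creeped out) by the idea of "developmental-age appropriate" filtering, sketches how such a thing might look: a set of age bands, each unblocking a few more content categories as the child matures. The categories and age thresholds below are entirely invented for illustration and are not drawn from the Recommendation.]

```python
# Another purely illustrative sketch (again the Kats' own, not the Council of Europe's):
# "developmental-age appropriate" filtering expressed as age bands, each allowing
# progressively more content, so the filter can be "adapted to follow the child's
# progress and age". Categories and thresholds are invented for illustration only.

AGE_PROFILES = [
    # (minimum age, categories that remain blocked for that band)
    (0,  {"violence", "pornography", "gambling", "social-networking"}),
    (13, {"violence", "pornography", "gambling"}),
    (16, {"pornography", "gambling"}),
    (18, set()),                      # nothing blocked once the user is an adult
]

def blocked_categories(age: int) -> set[str]:
    """Return the categories still blocked for a user of the given age."""
    blocked = AGE_PROFILES[0][1]
    for min_age, categories in AGE_PROFILES:
        if age >= min_age:
            blocked = categories
    return blocked

def is_allowed(age: int, content_category: str) -> bool:
    return content_category not in blocked_categories(age)

print(is_allowed(12, "social-networking"))   # False
print(is_allowed(14, "social-networking"))   # True
```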
III. Use and application of Internet filters by the public and private sector
Notwithstanding the importance of empowering users to use and control filters as mentioned above, and noting the wider public service value of the Internet, public actors on all levels (such as administrations, libraries and educational institutions) which introduce filters or use them when delivering services to the public, should ensure full respect for all users’ right to freedom of expression and information and their right to private life and secrecy of correspondence.
In this context, member states should:
i. refrain from filtering Internet content in electronic communications networks operated by public actors for reasons other than those laid down in Article 10, paragraph 2, of the European Convention on Human Rights, as interpreted by the European Court of Human Rights; [So we can't filter out those public actors? This might give us a foretaste of life in North Korea ...]
ii. guarantee that nationwide general blocking or filtering measures are only introduced by the state if the conditions of Article 10, paragraph 2, of the European Convention on Human Rights are fulfilled. Such action by the state should only be taken if the filtering concerns specific and clearly identifiable content, a competent national authority has taken a decision on its illegality and the decision can be reviewed by an independent and impartial tribunal or regulatory body, in accordance with the requirements of Article 6 of the European Convention on Human Rights;
iii. introduce, where appropriate and necessary, provisions under national law for the prevention of intentional abuse of filters to restrict citizens’ access to lawful content; [Would this enable users of the internet to access and copy works that are in the public domain, even if they are in private control?]
iv. ensure that all filters are assessed both before and during their implementation to ensure that the effects of the filtering are proportionate to the purpose of the restriction and thus necessary in a democratic society, in order to avoid unreasonable blocking of content;
v. provide for effective and readily accessible means of recourse and remedy, including suspension of filters, in cases where users and/or authors of content claim that content has been blocked unreasonably;
vi. avoid the universal and general blocking of offensive or harmful content for users who are not part of the group which a filter has been activated to protect, and of illegal content for users who justifiably demonstrate a legitimate interest or need to access such content under exceptional circumstances, particularly for research purposes;
vii. ensure that the right to private life and secrecy of correspondence is respected when using and applying filters and that personal data logged, recorded and processed via filters are only used for legitimate and non-commercial purposes. [But purposes can be legitimate while being commercial, for example the compilation and sale of anonymised data relating to national or regional purchasing habits]
Furthermore, member states and the private sector are encouraged to:
i. regularly assess and review the effectiveness and proportionality regarding the introduction of filters;
ii. strengthen the information and guidance to users who are subject to filters in private networks, including information about the existence of, and reasons for, the use of a filter and the criteria upon which the filter operates;
iii. co-operate with users (customers, employees, etc.) to improve the transparency, effectiveness and proportionality of filters.
In this context, civil society should be encouraged to follow the development and deployment of filters both by key state and private sector actors. It should, where appropriate, call upon member states and the private sector, respectively, to ensure and to facilitate all users’ right to freedom of expression and information, in particular as regards their freedom to receive information without interference by public authorities and regardless of frontiers in the new information and communications environment".
Thanks, Kristof Neefs, for drawing the IPKat's attention to this document. Kristof speculates, as does the IPKat, as to why this lengthy and very carefully worded document -- which is not without its merits -- made no explicit reference to intellectual property rights. Merpel says, of course this Recommendation is "not without its merits": any document that urges so firmly the adoption and implementation of two conflicting and contradictory policies is bound to be right somewhere along the line ...