Content-control software, commonly referred to as an Internet filter, is software that restricts or controls the content an Internet user is able to access, especially when used to limit material delivered over the Internet via the Web, e-mail, or other means. Content-control software determines what content will be available or blocked.
Such restrictions can be applied at various levels: a government may attempt to apply them nationwide (see Internet censorship), or they can, for example, be applied by an ISP to its customers, by an employer to its personnel, by a school to its students, by a library to its visitors, by a parent to a child's computer, or by individual users to their own computers.
The motive is often to prevent access to content which the computer's owner or other authorities may consider objectionable. When imposed without the consent of the user, content control can be characterized as a form of Internet censorship. Some content-control software includes time control functions that allow parents to set the amount of time that children may spend accessing the Internet, playing games, or engaging in other computer activities.
In some countries, such software is ubiquitous. In Cuba, if a computer user at a government-controlled Internet café typed certain words, the word processor or browser was automatically closed and a "state security" warning was given.
Terminology
The term "content control" is used on occasions by CNN, Playboy, San Francisco Chronicle, and The New York Times. However, some other terms, including "content filtering software", "secure web gateway", "censorware", "content security and control", "web filtering software", "content sensor software", and "blocking software content ", often used. "Nannyware" has also been used both in product marketing and by the media. Gartner's industry research company uses a "secure web gateway" (SWG) to describe the market segment.
Companies that make products that selectively block websites do not refer to these products as censorware, and prefer terms such as "Internet filter" or "URL filter"; in the special case of software designed specifically to allow parents to monitor and restrict their children's access, the term "parental control software" is also used. Some products log all the sites that a user accesses and rate them by content type, for reporting to an "accountability partner" of the person's choosing; the term "accountability software" is used for these. Internet filters, parental control software, and/or accountability software may also be combined into a single product.
Those critical of such software, however, use the term "censorware" freely: consider the Censorware Project, for example. The use of the term "censorware" in editorials criticizing makers of such software is widespread and covers many different varieties and applications: Xeni Jardin used the term in a March 9, 2006 editorial in The New York Times when discussing the use of US-made filtering software to suppress content in China; in the same month a high school student used the term to discuss the deployment of such software in his school district.
In general, outside of editorial pages as described above, traditional newspapers do not use the term "censorware" in their reporting, preferring instead less controversial terms such as "content filter", "content control", or "web filtering"; The New York Times and the Wall Street Journal both appear to follow this practice. On the other hand, Web-based newspapers such as CNET use the term in both editorial and journalistic contexts, for example "Windows Live to Get Censorware."
Types of filtering
Filters can be implemented in many ways: by software on a personal computer, or via network infrastructure such as proxy servers, DNS servers, or firewalls that provide Internet access. No single solution provides complete coverage, so most companies deploy a mix of technologies to achieve the proper content control in line with their policies.
- Browser-based filters
- Browser-based content filtering solutions are the most lightweight option for content filtering, and are implemented via a third-party browser extension.
- E-mail filters
- E-mail filters act on information contained in the mail body, in mail headers such as sender and subject, and in e-mail attachments to classify, accept, or reject messages. Bayesian filters, a type of statistical filter, are commonly used (a minimal sketch appears after this list). Both client-side and server-side filters are available.
- Client-side filters
- This type of filter is installed as software on each computer where filtering is required. It can usually be managed, disabled, or uninstalled by anyone with administrator-level privileges on the system.
- Content-restricted (or filtered) ISPs
- Content-restricted (or filtered) ISPs are Internet service providers that offer access to only a set portion of Internet content on either an opt-in or a mandatory basis. Anyone who subscribes to this type of service is subject to the restrictions. This type of filter can be used to implement governmental, regulatory, or parental control over subscribers.
- Network-based filtering
- This type of filter is implemented at the transport layer as a transparent proxy, or at the application layer as a web proxy. Filtering software may include data loss prevention functionality to filter outbound as well as inbound information. All users are subject to the access policy defined by the institution. Filtering can be customized, so a district's high school library can have a different filtering profile than the district's junior high school library.
- DNS-based filtering
- This type of filtering is implemented at the DNS layer and attempts to prevent lookups for domains that do not fit within a set of policies (either parental control or company rules); a minimal sketch appears after this list.
- Search-engine filters
- Many search engines, such as Google and Bing, offer users the option of turning on a safety filter. When this safety filter is activated, it filters out inappropriate links from all of the search results. If users know the actual URL of a website that features explicit or adult content, they can access that content without using a search engine. Some providers offer child-oriented versions of their engines that permit only child-friendly websites.
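To illustrate the statistical approach mentioned in the e-mail filter entry above, here is a minimal sketch of Bayesian message scoring in Python. The training data, tokenization, and the 0.9 threshold are all invented for the example; a real filter trains on large corpora and also weighs headers and attachments.

```python
from collections import Counter

# Invented training data for the example: (tokens, is_spam) pairs.
TRAINING = [
    (["cheap", "pills", "buy", "now"], True),
    (["meeting", "agenda", "attached"], False),
    (["buy", "now", "limited", "offer"], True),
    (["project", "status", "report"], False),
]

spam_counts, ham_counts = Counter(), Counter()
spam_msgs = ham_msgs = 0
for tokens, is_spam in TRAINING:
    (spam_counts if is_spam else ham_counts).update(set(tokens))
    if is_spam:
        spam_msgs += 1
    else:
        ham_msgs += 1

def spam_probability(tokens):
    """Combine per-token likelihoods naive-Bayes style, with add-one smoothing."""
    p_spam = p_ham = 1.0
    for token in set(tokens):
        p_spam *= (spam_counts[token] + 1) / (spam_msgs + 2)
        p_ham *= (ham_counts[token] + 1) / (ham_msgs + 2)
    return p_spam / (p_spam + p_ham)

message = ["buy", "cheap", "pills"]
# 0.9 is an arbitrary threshold chosen for the example.
print("reject" if spam_probability(message) > 0.9 else "accept")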
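A DNS-layer filter, by contrast, sees only the domain name being looked up, not the page content. The following is a minimal sketch of such a policy check, with an invented blocklist; a production resolver would typically answer blocked queries with NXDOMAIN or the address of a "blocked" landing page rather than printing a result.

```python
# Hypothetical policy: blocked domain suffixes (a real resolver would load
# these from a maintained, categorized list).
BLOCKED_SUFFIXES = {"adsite.example", "casino.example"}

def is_blocked(domain: str) -> bool:
    """Block a domain if it matches, or is a subdomain of, a listed suffix."""
    labels = domain.lower().rstrip(".").split(".")
    # Check every suffix of the queried name: www.casino.example -> casino.example
    return any(".".join(labels[i:]) in BLOCKED_SUFFIXES for i in range(len(labels)))

for name in ["www.casino.example", "en.wikipedia.org"]:
    print(name, "->", "blocked" if is_blocked(name) else "resolved")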
Reasons for filtering
Internet service providers (ISPs) that block material containing pornography or controversial religious, political, or news-related content are often utilized by parents who do not permit their children to access content not conforming to their personal beliefs. Content filtering software can, however, also be used to block malware and other content that is or contains hostile, intrusive, or annoying material, including adware, spam, computer viruses, worms, trojan horses, and spyware.
Most content-control software is marketed to organizations or parents. It is, however, also marketed on occasion to facilitate self-censorship, for example by people struggling with addictions to online pornography, gambling, chat rooms, and so on. Self-censorship software may also be used by some to avoid viewing content they consider immoral, inappropriate, or simply distracting. A number of accountability software products are marketed as self-censorship or accountability software. These are often promoted by religious media and at religious gatherings.
Opinions on when the use of such software is moral (and sometimes legal) vary widely, with people strongly supporting and strongly opposing the same software used in different scenarios.
Criticism
Filtering errors
Overblocking
Utilizing a filter that is overly zealous, or mislabeling content not intended to be censored, can result in overblocking, or over-censoring. Overblocking can filter out material that should be acceptable under the filtering policy in effect; for example, health-related information may unintentionally be filtered along with pornographic material because of the Scunthorpe problem (a minimal sketch appears below). Filter administrators may prefer to err on the side of caution by accepting overblocking to prevent any risk of access to sites that they determine to be undesirable. Content-control software was mentioned as blocking access to Beaver College before its name change to Arcadia University. Another example was the filtering of the Horniman Museum. As well, overblocking may encourage users to bypass the filter entirely.
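A minimal sketch of the Scunthorpe problem just described: a filter that matches banned strings anywhere inside a word blocks innocent text, while even whole-word matching still catches legitimate health-related material. The one-entry word list is purely illustrative.

```python
import re

BANNED = ["sex"]  # illustrative entry; real lists are far longer

def naive_block(text: str) -> bool:
    """Substring matching: triggers on banned strings inside longer words."""
    return any(bad in text.lower() for bad in BANNED)

def word_block(text: str) -> bool:
    """Whole-word matching: avoids the substring false positives."""
    return any(re.search(rf"\b{re.escape(bad)}\b", text.lower()) for bad in BANNED)

for phrase in ["Essex county council", "sex education resources"]:
    print(f"{phrase!r}: naive={naive_block(phrase)}, word={word_block(phrase)}")

# 'Essex county council' is blocked only by the naive substring filter
# (the Scunthorpe problem). 'sex education resources' is blocked by both,
# showing how health-related material is filtered along with pornography.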
Underblocking
Since new information is constantly uploaded to the Internet, filters can underblock, or under-censor, content if the parties responsible for maintaining them do not update them rapidly and accurately, and a blacklisting rather than a whitelisting filtering policy is in place.
Morality and opinion
Many would object to a government filtering viewpoints on moral or political issues, agreeing that this could become support for propaganda. Many would also find it unacceptable that an ISP, whether by law or by the ISP's own choice, should deploy such software without allowing users to disable the filtering for their own connections. In the United States, the First Amendment to the United States Constitution has been cited in calls to criminalize forced Internet censorship. (See the section below.)
Without adequate government oversight, content-filtering software could enable private companies to censor as they please. (See religious or political censorship, below.) Government utilization or encouragement of content-control software is a component of Internet censorship (not to be confused with Internet surveillance, in which content is monitored and not necessarily restricted). The governments of countries such as the People's Republic of China and Cuba are examples of countries where this ethically controversial activity is alleged to have taken place.
Legal action
In 1998, a United States federal district court in Virginia ruled that the imposition of mandatory filtering in a public library violates the First Amendment.
In 1996 the US Congress passed the Communications Decency Act, banning indecency on the Internet. Civil liberties groups challenged the law under the First Amendment, and in 1997 the Supreme Court ruled in their favor. Part of the civil liberties argument, especially from groups like the Electronic Frontier Foundation, was that parents who wanted to block sites could use their own content-filtering software, making government involvement unnecessary.
In the late 1990s, groups such as the Censorware Project began reverse-engineering content-control software and decrypting the blacklists to determine what kinds of sites the software blocked. This led to legal action alleging violation of the "Cyber Patrol" license agreement. They discovered that such tools routinely blocked unobjectionable sites while also failing to block intended targets. (See Overblocking, above.)
Some content-control software companies responded by claiming that their filtering criteria were backed by intensive manual checking. The companies' opponents argued, on the other hand, that performing the necessary checking would require resources greater than the companies possessed, and that therefore their claims were not valid.
The Motion Picture Association successfully obtained a UK ruling requiring ISPs to use content-control software to prevent copyright infringement by their subscribers.
Religious, anti-religious, and political censorship
Many types of content-control software have been shown to block sites based on the religious and political leanings of the company owners. Examples include blocking several religious sites (including the website of the Vatican), many political sites, and sites related to homosexuality. X-Stop was shown to block sites such as the Quaker website, the National Journal of Sexual Orientation Law, The Heritage Foundation, and parts of The Ethical Spectacle. CYBERsitter blocked sites such as the National Organization for Women. Nancy Willard, an academic researcher and attorney, pointed out that many US public schools and libraries use the same filtering software that many Christian organizations use. Cyber Patrol, a product developed by the Anti-Defamation League and Mattel's The Learning Company, has been found to block not only political sites deemed to be engaging in "hate speech" but also human rights websites, such as Amnesty International's web pages about Israel, and gay rights websites, such as glaad.org.
Content labeling
Content labeling may be considered another form of content-control software. In 1994, the Internet Content Rating Association (ICRA) - now part of the Family Online Safety Institute - developed a content rating system for online content providers. Using an online questionnaire, a webmaster describes the nature of their web content. A small file is generated that contains a condensed, computer-readable digest of this description, which can then be used by content filtering software to block or allow that site.
ICRA labels come in a variety of formats. These include the World Wide Web Consortium's Resource Description Framework (RDF) as well as Platform for Internet Content Selection (PICS) labels used by Microsoft Internet Explorer's Content Advisor.
The ICRA label is an example of self-labeling. Similarly, in 2006 the Association of Sites Advocating Child Protection (ASACP) initiated the Restricted to Adults (RTA) self-labeling initiative. ASACP members were concerned that various forms of legislation proposed in the United States would have the effect of forcing adult companies to label their content. The RTA label, unlike the ICRA label, does not require a webmaster to fill out a questionnaire or sign up to use it. Like the ICRA label, the RTA label is free. Both labels are recognized by a wide variety of content-control software.
The Voluntary Content Rating (VCR) system was devised by Solid Oak Software for their CYBERsitter filtering software, as an alternative to the PICS system, which some critics deemed too complex. It employs HTML metadata tags embedded within web page documents to specify the type of content contained in the document. Only two levels are specified, mature and adult, making the specification extremely simple; a sketch of how such a tag might be read appears below.
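The following sketch shows how filtering software might honor a self-rating metadata tag of the kind just described. The tag name "voluntary-content-rating" and the attribute layout are assumptions for illustration, not the exact VCR or RTA specification.

```python
import re

# Illustrative page carrying a self-rating tag; the tag name and values
# here are assumptions, not the published specification.
PAGE = """<html><head>
<meta name="voluntary-content-rating" content="adult">
</head><body>...</body></html>"""

BLOCKED_RATINGS = {"mature", "adult"}  # the two levels the text describes

def page_rating(html: str):
    """Extract the self-declared rating from the page, if any."""
    m = re.search(
        r'<meta\s+name="voluntary-content-rating"\s+content="([^"]+)"',
        html, re.IGNORECASE)
    return m.group(1).lower() if m else None

print("block" if page_rating(PAGE) in BLOCKED_RATINGS else "allow")  # -> block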
Use in public libraries
United States
The use of Internet filters or content-control software varies widely in public libraries in the United States, since Internet usage policies are established by the local library board. Many libraries adopted Internet filters after Congress conditioned the receipt of universal service discounts on the use of Internet filters through the Children's Internet Protection Act (CIPA). Other libraries do not install content-control software, believing that acceptable use policies and educational efforts address the issue of children accessing age-inappropriate content while preserving adult users' right to freely access information. Some libraries use Internet filters only on computers used by children. Some libraries that employ content-control software allow the software to be deactivated on a case-by-case basis on application to a librarian; libraries that are subject to CIPA are required to have a policy that allows adults to request that the filter be disabled without having to explain the reason for their request.
Many legal scholars believe that a number of legal cases, in particular Reno v. American Civil Liberties Union, established that the use of content-control software in libraries is a violation of the First Amendment. The Children's Internet Protection Act (CIPA) and the June 2003 case United States v. American Library Association, however, found CIPA constitutional as a condition placed on the receipt of federal funding, stating that First Amendment concerns were dispelled by the law's provision allowing adult library users to have the filtering software disabled without having to explain the reasons for their request. The plurality decision left open a future "as-applied" constitutional challenge, however.
In November 2006, a lawsuit was filed against the North Central Regional Library District (NCRL) in Washington State for its policy of refusing to disable restrictions upon requests of adult patrons, but CIPA was not challenged in that matter. In May 2010, the Washington State Supreme Court provided an opinion after it was asked to certify a question referred by the United States District Court for the Eastern District of Washington: "Whether a public library, consistent with Article I, § 5 of the Washington Constitution, may filter Internet access for all patrons without disabling Web sites containing constitutionally protected speech upon the request of an adult library patron." The Washington State Supreme Court ruled that NCRL's Internet filtering policy did not violate Article I, Section 5 of the Washington State Constitution. The Court said: "It appears to us that NCRL's filtering policy is reasonable and accords with its mission and that these policies are viewpoint neutral. It appears that no article I, section 5 content-based violation exists in this case. NCRL's essential mission is to promote reading and lifelong learning. As NCRL maintains, it is reasonable to impose restrictions on Internet access in order to maintain an environment that is conducive to study and contemplative thought." The case was returned to federal court.
In March 2007, Virginia passed a law similar to CIPA that requires public libraries receiving state funds to use content-control software. Like CIPA, the law requires libraries to disable the filters for an adult library user when requested to do so by the user.
Australia
The Australian Internet Security Advisory Board provides "practical advice on Internet security, parental controls and filters for the protection of children, students and families", which also covers public libraries.
NetAlert, software made available free of charge by the Australian government, was allegedly cracked by Tom Wood, a 16-year-old student, less than a week after its release in August 2007. Wood supposedly bypassed the $84 million filter in about half an hour to highlight problems with the government's approach to Internet content filtering.
The Australian Government introduced legislation requiring ISPs to "restrict access to age restricted content (commercial MA15+ content and R18+ content) either hosted in Australia or provided from Australia", which was to commence on 20 January 2008 and became known as Cleanfeed.
Cleanfeed is a proposed mandatory ISP-level content filtering system. It was proposed by the Beazley-led Australian Labor Party opposition in a 2006 press release, with the intention of protecting children who were vulnerable due to claimed parental computer illiteracy. It was announced on 31 December 2007 as a policy to be implemented by the Rudd ALP government, and initial tests in Tasmania produced a 2008 report. Cleanfeed is funded in the current budget, and is moving towards an Expression of Interest for live testing with ISPs in 2008. Public opposition and criticism have emerged, led by the EFA and gaining irregular mainstream media attention, with a majority of Australians reportedly "strongly against" its implementation. Criticisms include its cost, its inaccuracy (it will be impossible to ensure that only illegal sites are blocked), and the fact that it will be compulsory, which can be seen as an intrusion on the right to free speech. Another major point of criticism is that although the filter is claimed to stop certain materials, the underground rings dealing in such materials will not be affected. The filter may also provide a false sense of security for parents, who might supervise children less while they use the Internet, achieving the exact opposite effect. Cleanfeed is the responsibility of Senator Conroy's portfolio.
Denmark
In Denmark it is declared policy to "prevent inappropriate Internet sites from being accessible from children's libraries across Denmark". "It is important that every library in the country has the opportunity to protect children against pornographic material when they use library computers. It is a main priority for me as Culture Minister to make sure children can surf the Internet safely at libraries," stated Brian Mikkelsen in a press release from the Danish Ministry of Culture.
United Kingdom
Many libraries in the UK, such as the British Library and local authority public libraries, apply filters to Internet access. According to research conducted by the Radical Librarians Collective, at least 98% of public libraries apply filters, including categories such as "LGBT interest", "abortion" and "questionable". Some public libraries block payday loan websites.
Bypassing filters
Content filtering in general can "be bypassed entirely by tech-savvy individuals." Blocking content on a device "[will not]... guarantee that end users won't eventually be able to find a way around the filter."
Some software may be bypassed successfully by using alternative protocols such as FTP, telnet, or HTTPS, by conducting searches in a different language, or by using a proxy server or a circumventor such as Psiphon. Also, cached web pages returned by Google or other searches may bypass some controls as well. Web syndication services may provide alternate paths to content. Some of the more poorly designed programs can be shut down by killing their processes: for example, in Microsoft Windows through the Windows Task Manager, or in Mac OS X using Force Quit or Activity Monitor. Numerous workarounds, and counters to workarounds from content-control software creators, exist. Google services are often blocked by filters, but these blocks may most often be bypassed by using https:// in place of http://, since content filtering software is not able to interpret content carried over secure connections (in this case SSL).
Many content filters have an option that allows authorized people to bypass the content filter. This is especially useful in environments where computers are supervised and the content filter aggressively blocks websites that need to be accessed.
Encrypted VPNs can be used as a means of bypassing content control software, especially if content control software is installed on an Internet gateway or firewall.
Sometimes, antivirus software with web protection can stop content control filters.
Products and services
Some ISPs offer parental control options. Some offer security software that includes parental controls. Mac OS X v10.4 offers parental controls for several applications (Mail, Finder, iChat, Safari & Dictionary). Microsoft's Windows Vista operating system also includes content-control software.
Content filtering technology exists in two major forms: application gateway or packet inspection. For HTTP access the application gateway is called a web proxy, or just a proxy. Such web proxies can inspect both the initial request and the returned web page using arbitrarily complex rules, and will not return any part of the page to the requester until a decision is made. In addition, they can make substitutions in whole or in part of the returned result. Packet inspection filters do not initially interfere with the connection to the server, but inspect the data in the connection as it goes past; at some point the filter may decide that the connection is to be filtered, and will then disconnect it by injecting a TCP reset or similar forged packet. The two techniques can be used together: the packet filter monitors links until it sees an HTTP connection starting to an IP address that has content needing filtering, then redirects the connection to a web proxy that can perform detailed filtering on the website, without having to pass all unfiltered connections through the proxy. This combination is quite popular because it can significantly reduce the cost of the system.
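Reduced to its decision logic, the combined design just described might look like the following sketch. The address set, function name, and return values are invented placeholders; a real implementation operates on live packets, issues TCP resets, and hands flagged connections to an actual proxy process.

```python
# Hypothetical set of IP addresses known to host content that may need
# filtering (a real system would consult a categorized database).
SUSPECT_IPS = {"192.0.2.10", "192.0.2.11"}

def handle_connection(dst_ip: str, dst_port: int) -> str:
    """Two-stage filtering: a cheap packet-level check first, with detailed
    proxy inspection only for connections to suspect addresses."""
    if dst_port == 80 and dst_ip in SUSPECT_IPS:
        # Redirect through the web proxy, which sees the full request and
        # response and can block or rewrite individual pages.
        return "redirect-to-proxy"
    # Everything else passes untouched, which keeps the proxy load small
    # and the overall system cheap.
    return "pass-through"

print(handle_connection("192.0.2.10", 80))   # -> redirect-to-proxy
print(handle_connection("203.0.113.5", 80))  # -> pass-through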
Gateway-based content-control software may be more difficult to bypass than desktop software, since the user has no physical access to the filtering device. However, many of the techniques in the Bypassing filters section still work.
See also
- Adultism
- Ad filtering
- David Burt, a former librarian and advocate for content control software
- Comparison of content-control software and providers (including parental control software)
- Computer and network surveillance
- Deep content inspection
- Egress filtering, control of outbound network traffic
- The Financial Coalition Against Child Pornography
- Internet censorship
- Internet censorship circumvention
- Internet safety
- Opposition to pornography
- Parental controls
- Peacefire, a US-based website dedicated to "preserving First Amendment rights for Internet users, especially those younger than 18"
- Russian State Duma Bill 89417-6, a proposed bill that would mandate content-control software
- Wordfilter, the common name for a script typically used on Internet forums or chat rooms that automatically scans users' posts or comments as they are submitted and automatically changes or censors particular words or phrases