
Dis-Info Filter for Google Searches!

I got this idea after reading Ironic, My Prediction Has Gone Viral by TheBrushfire.

The implication, if true, is that the only remaining option to "hide" information on the internet is to saturate the web with disinfo, false news stories, etc., whose headlines and word content are littered with search terms relevant to the information the dis-info agents are trying to hide. They're basically doing SEO (search engine optimization) on viral stories so those results clog up Google's search results for people investigating the hidden story... in this particular case, a TB outbreak.

We need a Dis-Info Search Filter, and it seems like it would be fairly easy to build using Google's basic search operators. In fact, I do this all the time when searching in Google. My favorites are (-) and ("") to subtract exact words or phrases from the results, and I just keep drilling down.
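
For anyone who hasn't played with those operators, here's a rough sketch (in Python, purely for illustration; the keywords and excluded phrase below are just examples built from this thread) of how the minus sign and quoted phrases combine into a single query string:

    # Rough sketch of composing a Google query with the "-" and quoted-phrase
    # operators described above. Sample terms are illustrative only.
    from urllib.parse import quote_plus

    def build_query(keywords, exclude_phrases):
        """Join the words we want with -"..." exclusions for phrases we don't."""
        parts = list(keywords)
        parts += ['-"{}"'.format(p) for p in exclude_phrases]
        return " ".join(parts)

    query = build_query(["TB", "outbreak"], ["my prediction has gone viral"])
    print(query)  # TB outbreak -"my prediction has gone viral"
    print("https://www.google.com/search?q=" + quote_plus(query))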

So here's the idea...

Imagine a Firefox, IE, Safari, or Chrome add-on (or just a simple website) that lets us enter the URL of a known or suspected dis-info news article along with the keywords we're actually looking for. Our Google searches could then exclude results with content similar to the suspected or known dis-info article.
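
To make that concrete, here's a back-of-the-envelope sketch of what the add-on or site might do under the hood. I'm assuming that excluding an article's most frequent distinctive words is a reasonable stand-in for "excluding results with similar content"; the URL and word counts are placeholders, not a real case:

    # Hypothetical core logic: pull the suspected dis-info article, find its most
    # frequent distinctive words, and build a Google query that excludes them.
    import re
    from collections import Counter
    from urllib.parse import quote_plus
    from urllib.request import urlopen

    STOPWORDS = {"the", "and", "for", "that", "with", "this", "from", "are", "was", "have"}

    def distinctive_terms(article_url, top_n=5):
        """Return the most frequent non-stopword words in the suspected article."""
        html = urlopen(article_url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<[^>]+>", " ", html)              # crude tag stripping
        words = re.findall(r"[a-z]{4,}", text.lower())
        counts = Counter(w for w in words if w not in STOPWORDS)
        return [w for w, _ in counts.most_common(top_n)]

    def filtered_search_url(keywords, article_url):
        """Our keywords, minus the article's signature terms."""
        query = " ".join(keywords) + " " + " ".join("-" + t for t in distinctive_terms(article_url))
        return "https://www.google.com/search?q=" + quote_plus(query)

    # Example (placeholder URL):
    # filtered_search_url(["TB", "outbreak"], "http://example.com/suspect-article")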

Who's in?!! Does anyone know what language this could be written in?




we could create a site

we could create a site similar to reddit, but where you tag or list stories that you know are disinfo. Then a browser plugin could be created so that when you do a web search, it filters out anything on the disinfo story lists you've subscribed to.

Articles on the site could be rated from 0 (disinfo) to 5 (seems legit), and the search filter would take that rating into consideration when displaying them.

In order to rate an article you would have to pay a 0.001 BTC (Bitcoin) fee. This would fund the site and also limit the voting to tech-savvy geeks who understand how to use Bitcoin, spend a lot of time on the internet, and are invested in the reliability of internet information.
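
Just to sketch how the rating-plus-subscription part might fit together (class and field names are invented for illustration, cutoff is arbitrary, and the Bitcoin payment step is left out):

    # Toy model: stories carry ratings from 0 (disinfo) to 5 (seems legit); the
    # browser plugin drops search results that a subscribed list scores low.
    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class TaggedStory:
        url: str
        ratings: list = field(default_factory=list)   # 0 = disinfo, 5 = seems legit

        def average(self):
            return mean(self.ratings) if self.ratings else 5.0   # unrated stories pass through

    def filter_results(result_urls, subscribed_lists, cutoff=2.0):
        """Remove result URLs that a subscribed list rates as likely disinfo."""
        scores = {s.url: s.average() for lst in subscribed_lists for s in lst}
        return [url for url in result_urls if scores.get(url, 5.0) >= cutoff]

    suspect = TaggedStory("http://example.com/viral-prediction", ratings=[0, 1, 0])
    print(filter_results(["http://example.com/viral-prediction",
                          "http://example.com/real-report"], [[suspect]]))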

Really....

It would be easiest to write a Java procedure: an algorithm that's inversely weighted against the Google/Bing algorithm. (It wouldn't promote what you're looking for, but it would negate how theirs is overriding your search.)

What you have to do is have a control "search" term.

Example: "the brushfire"

Use google, bing, etc...

Search the term and document the hits on the first 15 pages, including all the meta elements in the code (as well as the many SEO-style attributes I'm not privy to).

Ultimately, you would document these characteristics and categorize them with numerical weights (just for categorization; the figures don't represent a true weighted value... yet) and then run an OLS regression.

Whatever comes out as statistically significant -- positive and negative coefficients alike -- I think you'd then invert, which would create at least a canceling effect and get you a clean search.
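
If it helps, here's a very rough sketch of that regress-then-invert step (Python/numpy rather than Java, with made-up feature values; it assumes you've already logged a feature matrix for the hits on the first 15 pages and each hit's ranking position as the response):

    # Toy illustration of "document the characteristics, run OLS, then invert".
    # The numbers are invented; real features would be the meta elements and
    # SEO-style attributes logged from the control search.
    import numpy as np

    # rows = search hits, columns = documented characteristics
    # (e.g. keyword density, meta-tag count, outbound links)
    X = np.array([[0.9, 12, 3],
                  [0.7, 10, 5],
                  [0.2,  2, 9],
                  [0.1,  1, 8]], dtype=float)
    y = np.array([1, 2, 14, 15], dtype=float)        # ranking position (1 = top of results)

    X1 = np.column_stack([np.ones(len(X)), X])       # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)    # ordinary least squares fit

    predicted_position = X1 @ beta                   # where their algorithm would place each hit
    inverse_order = np.argsort(-predicted_position)  # re-rank: what they bury, we surface first
    print(inverse_order)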

Anybody who follows, does that seem like the right place to start? Just throwing it out there...

Brilliant thought, though I suspect it may not be easy for me.

Did you consider eliminating the usual suspects?

    Example: {search token} -ABC -CBS -CNN -FOX -NBC -PBS

If this notion does not go far enough, consider excluding other suspect irritations:

    Example: ... -"Mark Twain"

I hope I was of some help.

Disclaimer: Mark Twain (1835-1910-To be continued) is unlicensed. His river pilot's license went delinquent in 1862. Caution advised. Daily Paul

Eliminate unwanted Web sites or domains w/ one line in the HOSTS file

Blocking Unwanted Connections with a Hosts File

You can use a HOSTS file to block ads, banners, 3rd-party cookies, 3rd-party page counters, web bugs, and even most hijackers. This is accomplished by blocking the connection(s) that supplies these little gems. The HOSTS file is loaded into memory (cache) at startup, so there is no need to turn on, adjust, or change any settings, with the exception of the DNS Client service (see below).

Windows automatically looks for the existence of a HOSTS file and, if found, checks it first for entries matching the web page you just requested. The address 127.0.0.1 is considered the location of your own computer, so when an entry listed in the MVPS HOSTS file is requested by a page you are viewing, your computer thinks 127.0.0.1 is where that file lives. Since nothing is served from there, the ad server is blocked from loading the banner, the cookie, or some unscrupulous ActiveX or JavaScript file.

Example: the following entry

    127.0.0.1 ad.doubleclick.net

blocks all files supplied by that DoubleClick server to the web page you are viewing. This also prevents the server from tracking your movements. Why? Because in certain cases "Ad Servers" like DoubleClick (and many others) will silently try to open a separate connection from the web page you are viewing, record your movements, and then, yes... follow you to additional sites you may visit.

Using a well designed HOSTS file can speed the loading of web pages by not having to wait for these ads, annoying banners, hit counters, etc. to load. This also helps to protect your Privacy and Security by blocking sites that may track your viewing habits, also known as "click-thru tracking" or Data Miners. Simply using a HOSTS file is not a cure-all against all the dangers on the Internet, but it does provide another very effective "Layer of Protection".

This all happens in microseconds, much faster than trying to fetch a file from halfway around the world. Another great feature of the HOSTS file is that it is a two-way file, meaning if some parasite does get into your system (usually bundled with other products), the culprit cannot get out (call home) as long as the necessary entries exist. This is why it's important to keep your HOSTS file up to date; the MVPS site explains how to get notified of HOSTS updates.
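
For anyone who wants to script the above, here's a small sketch (not part of the MVPS package) that appends blocking entries in the 127.0.0.1 format just described. It needs administrator rights, and the domain list is only an example:

    # Appends HOSTS entries that point suspect domains at your own machine,
    # mirroring the 127.0.0.1 format described above. Path is the standard
    # Windows location; the domains listed are illustrative only.
    HOSTS_PATH = r"C:\Windows\System32\drivers\etc\hosts"

    suspect_domains = ["ad.doubleclick.net", "tracker.example.com"]

    with open(HOSTS_PATH, "a") as hosts:              # run with administrator rights
        for domain in suspect_domains:
            hosts.write("127.0.0.1 {}\n".format(domain))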

Disclaimer: Mark Twain (1835-1910-To be continued) is unlicensed. His river pilot's license went delinquent in 1862. Caution advised. Daily Paul

Search Tokens: -ABC -CBS -CNN -FOX -PBS -NBC

The search tokens listed in the title above can be used to effectively block dis-information sites.

Disclaimer: Mark Twain (1835-1910-To be continued) is unlicensed. His river pilot's license went delinquent in 1862. Caution advised. Daily Paul

webpage

I'd create a webpage with a couple of text fields for the article location and the keywords, then pipe it into startpage.com (anonymous Google search).
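
Something like this would be the whole back end (it takes the exclusion terms directly, leaving the article-scanning step to the sketch earlier in the thread); the startpage.com "query" parameter name below is a guess on my part, so check the actual search form before wiring it up:

    # Sketch of that webpage's back end: combine the two fields into one
    # exclusion query and hand it to the engine. The "query" parameter name
    # for startpage.com is assumed, not verified.
    from urllib.parse import urlencode

    def make_search_link(keywords, exclude_terms,
                         base="https://www.startpage.com/do/search"):
        query = keywords + " " + " ".join("-" + t for t in exclude_terms)
        return base + "?" + urlencode({"query": query})

    print(make_search_link("TB outbreak", ["brushfire", "prediction"]))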

Even better

Do all those things somewhere other than google.com.

Basically...

...all we'd be doing is scanning known or suspected articles... and excluding unique strings of characters using "quotation marks" and -minus signs in a Google search.

I'm a serial entrepreneur and liberty activist from Texas!

www.RevolutionCarBadges.com
www.NonNetwork.com