Big Brothers

Post by Witness » Wed Mar 14, 2018 12:21 am

Stanford News wrote:An artificial intelligence algorithm developed by Stanford researchers can determine a neighborhood’s political leanings by its cars

Stanford researchers are using computer algorithms that can see and learn to analyze millions of publicly available images on Google Street View to determine the political leanings of a given neighborhood just by looking at the cars on the streets.

The NYT wrote:Why Stanford Researchers Tried to Create a ‘Gaydar’ Machine

Michal Kosinski felt he had good reason to teach a machine to detect sexual orientation.

An Israeli start-up had started hawking a service that predicted terrorist proclivities based on facial analysis. Chinese companies were developing facial recognition software not only to catch known criminals — but also to help the government predict who might break the law next.

And all around Silicon Valley, where Dr. Kosinski works as a professor at Stanford Graduate School of Business, entrepreneurs were talking about faces as if they were gold waiting to be mined.

Few seemed concerned. So to call attention to the privacy risks, he decided to show that it was possible to use facial recognition analysis to detect something intimate, something “people should have full rights to keep private.”

Wired wrote:AI Research Is in Desperate Need of an Ethical Watchdog

[big snip]

So researchers have to take ethics into their own hands. Take a recent example: Last month, researchers affiliated with Stony Brook University and several major internet companies released a free app, a machine learning algorithm that guesses ethnicity and nationality from a name to about 80 percent accuracy. They trained the algorithm using millions of names from Twitter and from e-mail contact lists provided by an undisclosed company—and they didn't have to go through a university review board to make the app.

The app, called NamePrism, allows you to analyze millions of names at a time to look for society-level trends. Stony Brook computer scientist Steven Skiena, who used to work for the undisclosed company, says you could use it to track the hiring tendencies in swaths of industry. “The purpose of this tool is to identify and prevent discrimination,” says Skiena.

"Prevent"? Well, it can be used differently…
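For the curious, here's a toy sketch of how a name-to-nationality guesser can work. This is not NamePrism's actual method (which trained on millions of Twitter and email-contact names); it's just a character-bigram naive Bayes classifier over a handful of invented example surnames, to show the general idea:

```python
# Toy sketch, NOT NamePrism's real algorithm: naive Bayes over
# character bigrams of surnames. Training data is invented.
from collections import defaultdict
import math

TRAIN = {
    "British": ["smith", "jones", "taylor", "brown", "wilson"],
    "Italian": ["rossi", "russo", "ferrari", "esposito", "bianchi"],
    "Japanese": ["sato", "suzuki", "takahashi", "tanaka", "watanabe"],
}

def ngrams(name, n=2):
    name = f"^{name.lower()}$"          # mark word boundaries
    return [name[i:i + n] for i in range(len(name) - n + 1)]

# Count bigram frequencies per class.
counts = {label: defaultdict(int) for label in TRAIN}
totals = {label: 0 for label in TRAIN}
for label, names in TRAIN.items():
    for name in names:
        for g in ngrams(name):
            counts[label][g] += 1
            totals[label] += 1

def classify(name):
    """Return the label whose bigram model gives the name the
    highest log-likelihood."""
    vocab = {g for c in counts.values() for g in c}
    scores = {}
    for label in TRAIN:
        score = 0.0
        for g in ngrams(name):
            # Laplace smoothing so unseen bigrams don't zero out a class.
            p = (counts[label][g] + 1) / (totals[label] + len(vocab))
            score += math.log(p)
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("takeda"))    # Japanese
print(classify("smithson"))  # British
```

Real systems train on millions of names with much richer features, but the principle is the same: surface statistics of spelling, not anything about the actual person. Which is exactly why the "nationalities" it spits out can look so arbitrary.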

Of course I tried it out:

[NamePrism result screenshots not preserved]
Needs some fine-tuning (and what the heck are these "nationalities"?). :mrgreen:

Post by Grammatron » Wed Mar 14, 2018 1:02 am

Science finally being put to good use.

Post by sparks » Wed Mar 14, 2018 1:56 am

FSM, what fucking madness.
You can lead them to knowledge, but you can't make them think.

Post by Abdul Alhazred » Wed Mar 14, 2018 9:07 am

Witness wrote: Needs some fine-tuning (and what the heck are these "nationalities"?). :mrgreen:
I wouldn't consider "white" a nationality, but I am what is called that, so the AI got it right.

Because it recognized the name and based the probability on what kind of smartass would adopt that moniker. :coolspecs:
"If I turn in a sicko, will I get a reward?"

"Yes! A BIG REWARD!" ====> Click here to turn in a sicko
Any man writes a mission statement spends a night in the box.
-- our mission statement plappendale