Coruscation: July 2016 Archives


July 31, 2016

Future is mining cloud data

The next big competition in cloud computing also involves artificial intelligence, fed by loads of data. Soon, Mr. Kurian said, Oracle will offer applications that draw from what it knows about the people whose actions are recorded in Oracle databases. The company has anonymized data from 1,500 companies, including three billion consumers and 400 million business profiles, representing $3 trillion in consumer purchases.

"Most of the world's data is already inside Oracle databases," said Thomas Kurian, Oracle's president of product development.

That's the kind of hold on people's information that perhaps only Facebook can match. But Mark Zuckerberg doesn't sell business software. At least, not yet.

July 12, 2016

Nature: Facebook experiment boosted US voter turnout in 2010

Social pressure:

The experiment assigned all US Facebook users who were over 18 and accessed the website on 2 November 2010 -- the day of the elections -- to one of three groups.


About 611,000 users (1%) received an 'informational message' at the top of their news feeds, which encouraged them to vote, provided a link to information on local polling places and included a clickable 'I voted' button and a counter of Facebook users who had clicked it. About 60 million users (98%) received a 'social message', which included the same elements but also showed the profile pictures of up to six randomly selected Facebook friends who had clicked the 'I voted' button. The remaining 1% of users were assigned to a control group that received no message.
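The three-arm design described above amounts to a weighted random assignment. A minimal sketch, assuming nothing about the study's actual implementation beyond the reported proportions (98% social message, 1% informational message, 1% control):

```python
import random

def assign_group(rng):
    """Assign one user to an experimental arm using the
    approximate proportions reported in the study."""
    r = rng.random()
    if r < 0.98:
        return "social"        # message plus friends' 'I voted' pictures
    elif r < 0.99:
        return "informational"  # message only, no friend faces
    return "control"            # no message at all

# Rough check that the split reproduces the reported group sizes.
rng = random.Random(0)
groups = [assign_group(rng) for _ in range(100_000)]
print(groups.count("social") / len(groups))  # close to 0.98
```

With ~61 million users, even the 1% arms contain hundreds of thousands of people, which is what gives the study the power to detect a fraction-of-a-percent turnout effect.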

The researchers then compared the groups' online behaviours, and matched 6.3 million users with publicly available voting records to see which group was actually most likely to vote in real life.

The results showed that those who got the informational message voted at the same rate as those who saw no message at all. But those who saw the social message were 2% more likely to click the 'I voted' button and 0.3% more likely to seek information about a polling place than those who received the informational message, and 0.4% more likely to head to the polls than either other group.

The social message, the researchers estimate, directly increased turnout by about 60,000 votes. But a further 280,000 people were indirectly nudged to the polls by seeing messages in their news feeds, for example, telling them that their friends had clicked the 'I voted' button. "The online social network helps to quadruple the effect of the message," says Fowler.
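A quick arithmetic check on the estimates quoted above (the figures are the researchers', not independently derived):

```python
# Back-of-the-envelope check of the quoted turnout estimates.
direct_votes = 60_000     # users directly mobilized by the social message
indirect_votes = 280_000  # friends nudged by news-feed stories about it

total = direct_votes + indirect_votes
multiplier = indirect_votes / direct_votes

print(total)                  # 340000 additional votes in all
print(round(multiplier, 1))   # 4.7: consistent with "quadruple the effect"
```

The indirect effect is roughly four to five times the direct one, which is the sense in which Fowler says the social network "quadruples" the message's impact.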

Nature's Facebook experiment boosts US voter turnout.

July 11, 2016

Boris, Have I Got News for You (HIGNFY)

ALEXANDER BORIS DE PFEFFEL JOHNSON is very sad



Mr. Johnson, who at 37 has the stentorian voice of an elderly 19th-century flâneur, subscribes to the English tradition of pretending that your work simply flows invisibly from you, with no actual effort. In an increasingly meritocratic England, it is an old-school affectation that does not fool anyone.

July 10, 2016

Autodesign, 1980-2010: the four-door fastback 'coupe'

BMW will sell you an X6 "coupe" which, properly speaking, should be called the X6-11 because it looks exactly like a Citation X-11 with the nose from a Pontiac Grand Am welded on as an afterthought.

In retrospect, it's fairly obvious why somebody would trade in a '79 Granada for an '84 Accord: You got twice the gas mileage and more than twice the longevity at virtually no cost in usable interior room. That's a practical, sensible decision.

It's not nearly as easy to understand why someone would trade a 2011 Accord for a 2016 Pilot or CR-V. There's a substantial price penalty to be paid for the "upgrade" to a crossover or SUV. Fuel economy suffers. Tires and brakes wear out quicker and cost more to replace. The handling of any lifted vehicle is always much, much worse than that of the car from which it's derived. Look at it this way: If you knew with absolute certainty that your morning commute tomorrow would feature a flatbed losing its cargo on the road ahead of you, scattering cars and trucks in every which direction while you tried to steer and brake your way to safety, would you rather be driving a Camry or a Highlander? A BMW 530i or a BMW X5? A Porsche Cayman, or a Cayenne?

To choose a crossover instead of a car is to willingly give back virtually all of the advances that American buyers gained when they went from Granadas to Accords. And what do you get in return? It can't be that customers demand all-wheel-drive; that was offered in everything from the Camry to the Tempo back in the Nineties and very few people stepped up to pay the extra money. Most of the "SUVs" I see on the freeway nowadays have an empty hole where the (optional) rear differential would go anyway.

-- Jack Baruth has won races on four different kinds of bicycles and in seven different kinds of cars. Everything he writes should probably come with a trigger warning.

July 8, 2016

Culture Digitally: Facebook Trending, it's made of people, but we should have already known that

Culture Digitally: Facebook Trending, it's made of people, but we should have already known that.

July 5, 2016

A watch list, which relies on the predictive judgments of anonymous analysts predisposed to err on the side of caution


The threats that the terrorist watch list and no-fly list pose to civil liberties -- indeed, to the very idea of citizenship -- are enormous. Watch lists are designed to circumvent the protections of due process and the separation of powers. They subvert a principle of our free society: Our rights aren't held on loan until a government official labels us suspect, at which point they are easily stripped away; our rights are ours unless and until a court concludes that we have violated the law.

This is not the case with a watch list, which relies on the predictive judgments of anonymous analysts predisposed to err on the side of caution. Their job is to stop something horrible from happening. Why would they be inclined to err the other way? Their decisions require no judicial approval, and their standard for labeling someone a suspected terrorist to be watch-listed is very low, a mere "reasonable suspicion."

As one federal judge noted in a case involving a plaintiff's challenge to being placed on the no-fly list, "an American citizen can find himself labeled a suspected terrorist because of a 'reasonable suspicion' based on a 'reasonable suspicion.' "

Some people who are tempted by watch lists but reluctant to deprive people of rights without due process propose combining them with the procedures used for search warrants or wiretaps. Why not just open up the watch list process to a judge who can assess these determinations? If it's good enough for the Fourth Amendment, isn't it good enough for the Second?

But this analogy doesn't work. The low standards and one-sided nature of warrant requests are only the first step in a longer, public, adversarial process. They satisfy public safety needs to investigate and stop suspected crimes, but this is followed by an opportunity for a trial with a higher burden of proof and a meaningful chance to confront and respond to the state's evidence.


-- Jeffrey Kahn, a law professor at Southern Methodist University, is the author of "Mrs. Shipley's Ghost: The Right to Travel and Terrorist Watchlists."

July 4, 2016

New nerd glasses? Low Bridge fit by Warby Parker

Warby Parker's Low Bridge.

Fit for everyone's eyes.

July 2, 2016

Facebook makes the news

According to a statement from Tom Stocky, who is in charge of the trending topics list, Facebook has policies "for the review team to ensure consistency and neutrality" of the items that appear in the trending list.

But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmic-selected news is that it might reinforce people's previously held points of view. If News Feed shows news that we're each likely to Like, it could trap us into echo chambers and contribute to rising political polarization. In a study last year, Facebook's scientists asserted the echo chamber effect was muted.

But when Facebook changes its algorithm -- which it does routinely -- does it have guidelines to make sure the changes aren't furthering an echo chamber? Or that the changes aren't inadvertently favoring one candidate or ideology over another? In other words, are Facebook's engineering decisions subject to ethical review? Nobody knows.

The other reason to be wary of Facebook's bias has to do with sheer size. Ms. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine if The Times is ignoring a certain story unfairly, look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn't, there could be something amiss about The Times's news judgment.

Such comparative studies are nearly impossible for Facebook. Facebook is personalized, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook's readers, what would you compare it to?

"Facebook has achieved saturation," Ms. Caplan said. No other social network is as large, popular, or used in the same way, so there's really no good rival for comparing Facebook's algorithmic output in order to look for bias.

What we're left with is a very powerful black box. In a 2010 study, Facebook's data scientists proved that simply by showing some users that their friends had voted, Facebook could encourage people to go to the polls. That study was randomized -- Facebook wasn't selectively showing messages to supporters of a particular candidate.

Facebook tinkered with users' emotions in 2014 news feed experiment

NY Times Technology on Facebook's tinkering with users' emotions in its 2014 news feed experiment: outcry stirred.

http://www.nytimes.com/2014/06/30/technology/facebook-tinkers-with-users-emotions-in-news-feed-experiment-stirring-outcry.html.