
If you’ve been hacked in recent years, odds are you fell for that perfectly crafted phishing message in your email. Even the most mindful individuals can slip up, but Google’s employees have reportedly had a flawless security record for more than a year thanks to a recent policy requiring them to use physical security keys.

Krebs on Security reports that in early 2017, Google started requiring its 85,000 employees to use a security key device to handle two-factor authentication when logging into their various accounts. Rather than just having a single password, or receiving a secondary access code via text message (or an app such as Google Authenticator), the employees had to use a traditional password as well as plug in a device that only they possessed. The results were stellar. From the report:

A Google spokesperson said Security Keys now form the basis of all account access at Google.
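As a sketch of why hardware keys beat one-time codes: the key answers a fresh server challenge with a response bound to the site's origin, so a phishing page on a look-alike domain cannot replay it. The snippet below is a toy model only, using an HMAC over a shared secret in place of the per-site ECDSA signatures that real U2F/WebAuthn keys compute; all names are invented.

```python
import hashlib
import hmac
import os

# Toy model of a second-factor challenge-response. Real security keys
# (U2F/WebAuthn) sign the challenge with a per-site private key that
# never leaves the device; an HMAC stands in for that signature here.

def register_key():
    """Simulate enrolling a hardware key: a credential shared at setup."""
    return os.urandom(32)          # stays inside the "device"

def server_challenge():
    return os.urandom(16)          # fresh nonce per login attempt

def device_sign(secret, challenge, origin):
    # The device binds its response to the site origin; this is what
    # defeats phishing, since a look-alike domain yields a different
    # response that the real server will reject.
    return hmac.new(secret, challenge + origin.encode(), hashlib.sha256).digest()

def server_verify(secret, challenge, origin, response):
    expected = hmac.new(secret, challenge + origin.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = register_key()
challenge = server_challenge()
resp = device_sign(secret, challenge, "accounts.google.com")
assert server_verify(secret, challenge, "accounts.google.com", resp)
# A phishing site at a different origin cannot produce a valid response:
assert not server_verify(secret, challenge, "accounts.goog1e.com", resp)
```

A text-message code, by contrast, is valid wherever the victim types it, which is exactly what a phishing page exploits.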

Read more

As for the coming death of explainability, our ability to understand how an AI reaches its conclusions, the writing is on the wall…


These divergent approaches, one regulatory, the other deregulatory, follow the same pattern as antitrust enforcement, which faded in Washington and began flourishing in Brussels during the George W. Bush administration. But there is a convincing case that when it comes to overseeing the use and abuse of algorithms, neither the European nor the American approach has much to offer. Automated decision-making has revolutionized many sectors of the economy and it brings real gains to society. It also threatens privacy, autonomy, democratic practice, and ideals of social equality in ways we are only beginning to appreciate.

At the simplest level, an algorithm is a sequence of steps for solving a problem. The instructions for using a coffeemaker are an algorithm for converting inputs (grounds, filter, water) into an output (coffee). When people say they’re worried about the power of algorithms, however, they’re talking about the application of sophisticated, often opaque, software programs to enormous data sets. These programs employ advanced statistical methods and machine-learning techniques to pick out patterns and correlations, which they use to make predictions. The most advanced among them, including a subclass of machine-learning algorithms called “deep neural networks,” can infer complex, nonlinear relationships that they weren’t specifically programmed to find.
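The distinction can be made concrete: in a hand-written algorithm every step is specified by the programmer, while a machine-learning algorithm extracts its rule from the data itself. A minimal illustrative sketch, with a one-variable least-squares fit standing in for "learning" and all names invented:

```python
# Contrast: an explicit, hand-written rule vs. a rule inferred from data.

def pots_of_coffee(grounds_g, water_ml):
    # Explicit algorithm: every step spelled out by the programmer
    # (assume 60 g of grounds and 1000 ml of water per pot).
    return min(grounds_g / 60, water_ml / 1000)

def fit_line(xs, ys):
    # One-variable least squares: the "pattern" (slope, intercept)
    # is extracted from the data rather than written by hand.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

slope, intercept = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
predict = lambda x: slope * x + intercept   # a learned predictor
```

Deep neural networks generalize the second pattern: many more parameters, fit to far more data, capturing nonlinear relationships no one wrote down explicitly.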

Predictive algorithms are increasingly central to our lives. They determine everything from what ads we see on the Internet, to whether we are flagged for increased security screening at the airport, to our medical diagnoses and credit scores. They lie behind two of the most powerful products of the digital information age: Google Search and Facebook’s Newsfeed. In many respects, machine-learning algorithms are a boon to humanity; they can map epidemics, reduce energy consumption, perform speech recognition, and predict what shows you might like on Netflix. In other respects, they are troubling. Facebook uses AI algorithms to discern the mental and emotional states of its users. While Mark Zuckerberg emphasizes the application of this technique to suicide prevention, opportunities for optimizing advertising may provide the stronger commercial incentive.

Read more

Xage (pronounced Zage), a blockchain security startup based in Silicon Valley, announced a $12 million Series A investment today led by March Capital Partners. GE Ventures, City Light Capital and NexStar Partners also participated.

The company emerged from stealth in December with a novel idea: secure the myriad devices of the industrial internet of things on the blockchain. Here’s how I described it in a December 2017 story:

Xage is building a security fabric for IoT, which takes blockchain and synthesizes it with other capabilities to create a secure environment for devices to operate. If the blockchain is at its core a trust mechanism, then it can give companies confidence that their IoT devices can’t be compromised. Xage thinks that the blockchain is the perfect solution to this problem.
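The "trust mechanism" at the core of a blockchain is, in essence, a tamper-evident log: each record commits to the hash of its predecessor, so altering any earlier entry invalidates everything after it. The sketch below illustrates that general technique only, not Xage's actual fabric; all names are invented.

```python
import hashlib
import json

# Minimal tamper-evident log in the blockchain style. Each block stores
# the hash of the previous block, so rewriting history breaks the chain.

def add_block(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        body = {"record": block["record"], "prev": block["prev"]}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != recomputed:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"device": "pump-17", "firmware": "v1.4"})
add_block(chain, {"device": "valve-03", "firmware": "v2.0"})
assert verify(chain)
chain[0]["record"]["firmware"] = "v9.9"   # tamper with device history
assert not verify(chain)
```

For IoT fleets, the appeal is that no single compromised gateway can silently rewrite which device configurations are trusted.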

Read more

Quantum communication and cryptography are the future of high-security communication. But many challenges lie ahead before a worldwide quantum network can be set up, among them propagating the quantum signal over long distances. One of the major challenges is to create memories with the capacity to store quantum information carried by light. Researchers at the University of Geneva (UNIGE), Switzerland, in partnership with CNRS, France, have discovered a new material in which an element, ytterbium, can store and protect fragile quantum information even while operating at high frequencies. This makes ytterbium an ideal candidate for future quantum networks, where such memories would act as repeaters to propagate the signal over long distances. These results are published in the journal Nature Materials.

Quantum cryptography today uses optical fibre over several hundred kilometres and is marked by its high degree of security: it is impossible to copy or intercept information without making it disappear.

However, the fact that the signal cannot be copied also prevents scientists from amplifying it to carry it over long distances, as is done with a Wi-Fi network.

Read more

But even though money is necessary, it’s not sufficient to provide human beings with a sense of satisfaction, Obama cautioned. As more and more tasks and services become automated with the rise of artificial intelligence, “that’s going to make the job of giving everybody work that is meaningful tougher, and we’re going to have to be more imaginative, and the pace of change is going to require us to do more fundamental re-imagining of our social and political arrangements, to protect the economic security and the dignity that comes with a job.”


The former president says “we’re going to have to consider new ways of thinking” as technology threatens current labor markets.

Read more

We may also process your information for legitimate reasons associated with your use or ownership of an Aston Martin car, for reasons concerning information or network security, to defend or pursue legal rights or to meet regulatory requirements. Any information processed for contacts based in the EU will not be transferred outside the EU.

You can update your contact details at any time by emailing [email protected], or withdraw your consent by clicking “unsubscribe” in any email you receive from us.

For further details on how your data is used and stored please refer to our PRIVACY POLICY.

Read more

“DATA SLAVERY.” Jennifer Lyn Morone, an American artist, thinks this is the state in which most people now live. To get free online services, she laments, they hand over intimate information to technology firms. “Personal data are much more valuable than you think,” she says. To highlight this sorry state of affairs, Ms Morone has resorted to what she calls “extreme capitalism”: she registered herself as a company in Delaware in an effort to exploit her personal data for financial gain. She created dossiers containing different subsets of data, which she displayed in a London gallery in 2016 and offered for sale, starting at £100 ($135). The entire collection, including her health data and social-security number, can be had for £7,000.

Only a few buyers have taken her up on this offer and she finds “the whole thing really absurd”. Yet if the job of the artist is to anticipate the Zeitgeist, Ms Morone was dead on: this year the world has discovered that something is rotten in the data economy. Since it emerged in March that Cambridge Analytica, a political consultancy, had acquired data on 87m Facebook users in underhand ways, voices calling for a rethink of the handling of online personal data have only grown louder. Even Angela Merkel, Germany’s chancellor, recently called for a price to be put on personal data, asking researchers to come up with solutions.

Read more