Recent quotes:

Google's Improbable Deal to Recreate the Real World in VR | WIRED

Improbable offers a new way of building virtual worlds, including not just immersive games à la Second Life or World of Warcraft, but also vast digital simulations of real cities, economies, and biological systems. The idea is that these virtual worlds can run in a holistic way across a practically infinite network of computers, so that they can expand to unprecedented sizes and reach new levels of complexity.
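The core idea — one simulation spread across many machines — can be sketched as spatial sharding. This is an entirely hypothetical toy, not Improbable's SpatialOS architecture or API; the grid size, names, and single-coordinate layout are all assumptions:

```python
# Hypothetical sketch: partition a world's entities by region so that each
# region could be simulated on a different machine. Adding machines then adds
# world capacity. Not Improbable's actual design; purely illustrative.

from collections import defaultdict

GRID = 100.0  # side length of one region; assume one worker per region


def region_of(x, y):
    # Map a world coordinate to the region (worker) responsible for it.
    return (int(x // GRID), int(y // GRID))


def shard(entities):
    # Group entities by region; each group can be stepped independently,
    # with cross-region interactions handled at the boundaries.
    workers = defaultdict(list)
    for name, (x, y) in entities.items():
        workers[region_of(x, y)].append(name)
    return dict(workers)


world = {"cab": (12.0, 30.0), "bus": (250.0, 40.0), "bird": (90.0, 95.0)}
print(shard(world))
```

The hard part such systems actually solve — entities interacting across region boundaries — is deliberately left out here; the sketch only shows why the workload divides across machines at all.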

Vélemény: Gyáva ("Opinion: Coward") - NOL.hu

Commando-style takedowns of "terrorists" at the bus terminal, capped by TEK chief János Hajdu presenting the late-night spectacle to the Prime Minister. The counterterrorism centre would probably reject our wording indignantly, since this was a serious exercise. But commando techniques should rarely, if ever, be shown off to the public and to foreign guests. In a live situation, surprise and well-rehearsed methods are the unit's most important advantages over the enemy, and public demonstration only endangers them.

Google and Facebook go after Go

Google DeepMind employs more than 200 AI researchers and engineers. Over the 18 months or so it's spent on AlphaGo, the team ballooned from two or three people to 15, Hassabis said. "Go is a pretty sizable project for us," he said. DeepMind recently hired Matthew Lai, a London researcher who developed a system capable of playing chess at the grandmaster level. His software was able to reason in a way similar to how humans do, a more efficient method than IBM's attempt to crunch every possible outcome before making a move in the 1990s.
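The contrast drawn here — estimating a position's value instead of crunching every possible outcome — can be shown with a toy game tree. This is a minimal sketch of the general idea, not DeepMind's or Lai's actual code; the tree and the evaluation function are illustrative assumptions:

```python
# Toy contrast: exhaustive game-tree search vs. depth-limited search that
# trusts an evaluation function, the more human-like (and cheaper) approach.


def minimax_full(node, maximizing):
    # Exhaustive: expand every line to the end of the game.
    if isinstance(node, int):  # leaf: exact game outcome
        return node
    values = [minimax_full(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)


def evaluate(node):
    # Stand-in for a learned evaluation: estimate a position's value
    # without playing it out (here: average of the leaves below it).
    if isinstance(node, int):
        return node
    return sum(evaluate(child) for child in node) / len(node)


def minimax_limited(node, maximizing, depth):
    # Depth-limited: stop early and trust the evaluation instead of
    # expanding every continuation.
    if isinstance(node, int):
        return node
    if depth == 0:
        return evaluate(node)
    values = [minimax_limited(c, not maximizing, depth - 1) for c in node]
    return max(values) if maximizing else min(values)


tree = [[3, [5, 1]], [[6, 2], 9]]  # nested lists = positions, ints = outcomes
print(minimax_full(tree, True))
print(minimax_limited(tree, True, 1))
```

In chess the full tree is astronomically large, so the quality of the evaluation function — which Lai's system learned rather than hand-coded — decides how well the cheap, shallow search plays.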

Facebook Open Sources Its AI Hardware as It Races Google | WIRED

Big Sur includes eight GPU boards, each loaded with dozens of chips while consuming only about 300 watts of power. Although GPUs were originally designed to render images for computer games and other highly graphical applications, they’ve proven remarkably adept at deep learning. […] Traditional processors help drive these machines, but big companies like Facebook and Google and Baidu have found that their neural networks are far more efficient if they shift much of the computation onto GPUs. […] After 18 months of development, Big Sur is twice as fast as the previous system Facebook used to train its neural networks.
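The reason that shift pays off: nearly all the work in training a neural network is large dense matrix multiplies, exactly the data-parallel arithmetic GPUs were built for. A toy illustration of one training step as matrix math — the layer sizes, learning rate, and single dense layer are assumptions for the sketch, not Big Sur's actual workload:

```python
import numpy as np

# One SGD step on a single dense layer, written out as the matrix multiplies
# that dominate neural-net training (and that GPUs accelerate).

rng = np.random.default_rng(0)
batch, d_in, d_out = 64, 128, 32
x = rng.standard_normal((batch, d_in))        # a batch of inputs
w = rng.standard_normal((d_in, d_out)) * 0.01  # layer weights
y_true = rng.standard_normal((batch, d_out))   # targets

y = x @ w                        # forward pass: one matmul
grad_y = 2 * (y - y_true) / batch  # gradient of mean-squared error
grad_w = x.T @ grad_y            # weight gradient: another matmul
w -= 0.01 * grad_w               # SGD update

print(grad_w.shape)
```

Running this on a GPU changes none of the math, only where the matmuls execute — which is why frameworks can move training onto GPU boards like Big Sur's without changing the model.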