MojoKid writes: Motorola's first-generation Moto 360 smartwatch was one of the first Android Wear smartwatches to hit the market and, because of its round display, became the immediate flag bearer for the Android Wear platform. As new competition has entered the fray — including entries from Apple with the Apple Watch and Samsung with the Gear S2 — Motorola is announcing a second-generation smartwatch that addresses most of the complaints about the previous model. Motorola has ditched the archaic Texas Instruments OMAP 3 processor used in the original Moto 360; the new second-generation Moto 360 brings a more credible 1.2GHz quad-core Qualcomm Snapdragon 400 processor and Adreno 305 graphics to the table. You'll also find 512MB of RAM and 4GB of storage. And if you didn't like the largish dimensions of the previous Moto 360, you'll be glad to know that Motorola is offering two sizes this time around: a 46mm-diameter case with a 360x330 display and a smaller 42mm-diameter case housing a 360x325 display. Motorola has also introduced a dedicated women's model of the Moto 360, which features a 42mm-diameter case and accepts smaller 16mm bands. As for battery life, Motorola says the men's and women's 42mm models come with a 300 mAh battery good for up to 1.5 days of mixed use, while the 46mm watch comes with a larger 400 mAh battery good for up to 2 days on a charge.
An anonymous reader writes: With Linux 4.3, AMD is adding the initial open-source driver for the R9 Fury graphics cards. Unfortunately for Linux gamers, the R9 Fury isn't yet in good shape on the open-source driver, and it's not good with the Catalyst Linux driver either, as previously discussed. With the initial code going into Linux 4.3, the $550 R9 Fury runs slower than graphics cards like the lower-cost and older R7 370 and HD 7950 GPUs, since AMD's open-source developers haven't yet found the time to implement power management / re-clocking support. The R9 Fury also only advertises OpenGL 3.0 support while the hardware is GL4.5-capable, and the other open-source AMD GCN driver ships OpenGL 4.1. It doesn't look like AMD has any near-term R9 Fury Linux fix for either driver, but at least their older hardware is performing well with the open-source code.
mikejuk writes to note that the Unicode Consortium has accepted 38 new emoji characters as candidates for Unicode 9.0, including characters depicting bacon and a duck. "Why could we possibly need a duck? Many of the new characters are the 'other half' of gender-matched pairs, so the Dancer emoji (which is usually rendered as Apple's salsa-dancing woman) gets a Man Dancing emoji, who frankly looks like a cross between John Travolta in Saturday Night Fever and your dad at the wedding disco. ... Other additions include carrot, cucumber, avocado, and bacon. ... The list of additions is rounded off with new animal emojis. Some are the 'missing' zodiac symbols (lion and crab). Others are as baffling as ever – is there *really* a demand for a mallard duck? Sorry: it's in fact a drake!"
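Python 3 strings are Unicode-native, so the existing DANCER character that the new Man Dancing candidate would pair with can be inspected directly. A small illustrative sketch (the candidate characters themselves had no assigned code points at the time of the announcement):

```python
import unicodedata

# The existing emoji the new "Man Dancing" candidate pairs with: U+1F483 DANCER
dancer = "\U0001F483"

print(unicodedata.name(dancer))   # DANCER
print(f"U+{ord(dancer):04X}")     # U+1F483
print(dancer.encode("utf-8"))     # b'\xf0\x9f\x92\x83' (4-byte UTF-8 sequence)
```

Once the Unicode 9.0 candidates are assigned code points, they become usable the same way on any platform whose fonts and Unicode database have been updated.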
alexvoica writes: The first general-purpose graphics processing unit (GPGPU) now available as open-source RTL was unveiled at the Hot Chips event. Although the GPGPU is in an early and relatively crude stage, it is another piece of an emerging open-source hardware platform, said Karu Sankaralingam, an associate professor of computer science at the University of Wisconsin-Madison. Sankaralingam led the team that designed the Many-core Integrated Accelerator of Wisconsin (MIAOW). A 12-person team developed the MIAOW core in 36 months. Their goal was simply to create a functional GPGPU without setting any specific area, frequency, power, or performance goals. The resulting GPGPU uses just 95 instructions and 32 compute units in its current design, and it only supports single-precision operations. Students are now adding a graphics pipeline to the design, a job expected to take about six months.
MojoKid writes: AMD today added a third card to its new Fury line that's arguably the most intriguing of the bunch, the Radeon R9 Nano. True to its name, the Nano is a very compact card, though don't be fooled by its diminutive stature. Lurking inside this 6-inch graphics card is a Fiji GPU core built on a 28nm manufacturing process paired with 4GB of High Bandwidth Memory (HBM). It's a full 1.5 inches shorter than the standard Fury X, and unlike its liquid cooled sibling, there's no radiator and fan assembly to mount. The Fury Nano sports 64 compute units with 64 stream processors each for a total of 4,096 stream processors, just like Fury X. It also has an engine clock of up to 1,000MHz and pushes 8.19 TFLOPs of compute performance. That's within striking distance of the Fury X, which features a 1,050MHz engine clock at 8.6 TFLOPs. Ars Technica, too, takes a look at the new Nano.
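The quoted compute figures follow from simple arithmetic: each GCN stream processor can retire one fused multiply-add (two floating-point operations) per cycle, so peak throughput is stream processors x clock x 2. A quick sanity check in Python (the helper function name is just for illustration):

```python
def peak_tflops(stream_processors: int, clock_mhz: float, flops_per_cycle: int = 2) -> float:
    """Peak single-precision TFLOPS: SPs x clock x FLOPs/cycle (2 for an FMA)."""
    return stream_processors * clock_mhz * 1e6 * flops_per_cycle / 1e12

# R9 Nano: 4,096 stream processors at up to 1,000MHz
print(round(peak_tflops(4096, 1000), 2))   # 8.19

# Fury X: 4,096 stream processors at 1,050MHz
print(round(peak_tflops(4096, 1050), 2))   # 8.6
```

The 8.19 vs. 8.6 TFLOPs gap in the story is thus entirely down to the 50MHz clock difference, since both cards carry the full 4,096-stream-processor Fiji GPU.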
An anonymous reader writes: Starting on September 1, Amazon will no longer support Flash across its advertising platform. The online retailer cites changes to browser support and a desire for customers to have a better experience as its reasons for blocking it. Google has been quite active recently in efforts to kill Flash; the Chrome beta channel has begun automatically pausing Flash, Google has converted ads from Flash to HTML5, and YouTube uses HTML5 by default now as well. Safari and Firefox also place limits on Flash content. Is Flash finally on its way out?
MojoKid writes: NVIDIA is launching a new mainstream graphics card today, the GeForce GTX 950, based on the company's GM206 GPU. The GM206 debuted on the GeForce GTX 960, which launched a few months back. As the new card's name suggests, though, the GM206 used on the GeForce GTX 950 isn't quite as powerful as the one used on the GTX 960. The company is targeting this card at MOBA (multiplayer online battle arena) players, who don't necessarily need the most powerful GPUs on the market, but want smooth, consistent framerates at resolutions of 1080p or below. It's being positioned as a significant, yet affordable, upgrade over cards like the GeForce GTX 650 Ti, which are a couple of generations old. NVIDIA's reference specifications for the GeForce GTX 950 call for a base clock of 1024MHz and a Boost clock of 1188MHz. The GPU is packing 768 CUDA cores, 48 texture units, and 32 ROPs. The 2GB of video memory on GeForce GTX 950 cards is clocked at 6.6GHz (effective GDDR5 data rate), and the memory links to the GPU via a 128-bit interface. At those clocks, the GeForce GTX 950 offers up a peak texture fillrate of 49.2 GTexels/s and 105.6 GB/s of memory bandwidth. At a $159 starting MSRP, in the benchmarks, the GeForce GTX 950 offers solid entry-level or midrange performance at 1080p resolutions. It's a bit faster than AMD's Radeon R9 270X but comes in just behind a Radeon R9 285.
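Both peak figures fall straight out of the quoted specs: one texel per texture unit per clock for fillrate, and effective data rate times bus width in bytes for memory bandwidth. A quick check (the helper function names are illustrative):

```python
def texel_fillrate_gtexels(texture_units: int, clock_mhz: float) -> float:
    """Peak texture fillrate in GTexels/s: one texel per TMU per clock."""
    return texture_units * clock_mhz * 1e6 / 1e9

def mem_bandwidth_gbs(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s: effective data rate x bus width in bytes."""
    return effective_clock_ghz * bus_width_bits / 8

# GeForce GTX 950: 48 TMUs at a 1024MHz base clock; 6.6GHz GDDR5 on a 128-bit bus
print(round(texel_fillrate_gtexels(48, 1024), 1))  # 49.2
print(round(mem_bandwidth_gbs(6.6, 128), 1))       # 105.6
```

Note that NVIDIA quotes fillrate at the base clock here; real-world throughput will vary with Boost behavior.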
MojoKid writes: Intel is still keeping a number of details regarding its complete Skylake microarchitecture and product line-up under wraps for a few more weeks, but at a public session at IDF, some of the design updates introduced with Skylake were detailed. Virtually every aspect of Skylake has been improved versus the previous-gen Haswell microarchitecture. I/O, Ring Bus, and LLC throughput have been increased; the graphics architecture has been updated to support DX12 and new eDRAM configurations; it has an integrated camera ISP, support for faster DDR4 memory, and more flexible overclocking features. All of these things culminate in a processor that offers higher IPC performance and improved power efficiency. There are also new security technologies dubbed Intel Software Guard Extensions (Intel SGX) onboard Skylake, which support new instructions to create enclaves and isolate them from malware and privileged-software attacks, along with Memory Protection Extensions (Intel MPX) to help protect stack and heap buffer boundaries as well. A new technology, dubbed Intel Speed Shift, also allows Skylake to switch power states faster than previous-gen products, controlling P-states fully in hardware, whereas previous-gen products required OS control. The end result is that Skylake can switch P-states in 1ms, whereas it takes roughly 30ms with older processors.
An anonymous reader writes: AMD's Linux gaming performance has been embarrassingly bad, and it doesn't look like there's any quick remedy. Virtual Programming just released Dirt: Showdown for Linux, and it's the latest example of AMD's Linux driver issues: AMD's GPU results are still far behind NVIDIA's, with even the Radeon R9 Fury running slower than NVIDIA's aging GTX 680 and GTX 760. If a racing game doesn't interest you, Feral Interactive confirmed they are releasing Company of Heroes 2 for Linux next week, but only NVIDIA and Intel graphics are supported.
MojoKid writes: Intel's Skylake is here, and the new architecture comprises Intel's 6th-generation Core line of CPUs. In recent testing it was confirmed that Intel's Skylake-based Core i7-6700K is the company's fastest quad-core desktop processor to date. However, one thing Intel kept a tight lid on was the underlying technology of the Gen9 Intel HD Graphics engine on board Skylake — that is, until now. An overview of the changes Intel made specific to Gen9 graphics notes the following, among other tweaks: Available L3 cache capacity has been increased to 768 Kbytes per slice (512 Kbytes for application data). Sizes of both the L3 and LLC request queues have been increased, which improves latency hiding to achieve better effective bandwidth relative to the architecture's theoretical peak. Gen9 EDRAM now acts as a memory-side cache between LLC and DRAM. Also, the EDRAM memory controller has moved into the system agent, adjacent to the display controller, to support power-efficient and low-latency display refresh. Gen9 has also been designed to enable products with 1, 2, or 3 slices, each with 24 EUs per slice and 8 EUs per subslice. Finally, Gen9 adds new power gating and clock domains for more efficient dynamic power management. This can particularly improve low-power media playback modes.
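The slice figures above imply three 8-EU subslices per slice (24 / 8 = 3), so the scalable 1-, 2-, and 3-slice configurations give 24, 48, and 72 execution units respectively. A quick sketch of that arithmetic (constant and function names are illustrative):

```python
EUS_PER_SUBSLICE = 8
SUBSLICES_PER_SLICE = 3   # 24 EUs per slice / 8 EUs per subslice

def total_eus(slices: int) -> int:
    """Total Gen9 execution units for a configuration with 1-3 slices."""
    return slices * SUBSLICES_PER_SLICE * EUS_PER_SUBSLICE

for s in (1, 2, 3):
    print(f"{s} slice(s): {total_eus(s)} EUs")  # 24, 48, 72
```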
harrymcc writes: Security and performance issues with Adobe's Flash Player have led to countless calls for its abandonment. But a significant percentage of major sites still use it — and many of those companies aren't eager to explain why. Over at Fast Company, Jared Newman investigates why Flash won't disappear from the web anytime soon. From the article: Despite the pressure from tech circles, the sites I spoke with said they simply weren't able to start moving away from Flash until recently, when better technology became available. And even now, it's going to take time for them to finish building the necessary tools. "Originally, Flash was necessary to solve a couple problems," says Adam Denenberg, chief technical officer for streaming music service iHeartRadio. "Streaming was difficult, especially for live stations, and there were no real http-supported streaming protocols that offered the flexibility of what was required a few years back."
SlappingOysters writes: Grab It discusses the loss of splitscreen gaming from the Halo series in this article, which asks developer 343 Industries to re-evaluate its position on cutting the feature. The developer has cited "increased visual and gameplay fidelity" as the reason for cutting the series' hallmark mode. In better news for couch co-op fans, the site does confirm that Gears of War 4 will have splitscreen gameplay when it releases in 2016.
Vigile writes: The future of graphics APIs lies in DirectX 12 and Vulkan, both built to target GPU hardware at a lower level than previously available. The advantages are better performance, better efficiency on all hardware, and more control for the developer who is willing to put in the time and effort to understand the hardware in question. Until today we have only heard or seen theoretical "peak" performance claims for DX12 compared to DX11. PC Perspective just posted an article that uses a pre-beta version of Ashes of the Singularity, an upcoming RTS built on the Oxide Games Nitrous engine, to evaluate DX12's performance claims and gains against DX11. In the story we find five different processor platforms tested with two different GPUs at two different resolutions. The results are interesting and show that DX12 levels the playing field for AMD, with its R9 390X gaining enough ground under DX12 to overcome the significant performance deficit it shows against the GTX 980 under DX11.
theodp writes: Plans for Rupert Murdoch & Co. to teach your children to code just hit a bump in the road. Murdoch's News Corp. last week announced plans to exit the education business, alongside a $371 million write-down of its investment in the Amplify education unit, which aimed to reinvent education via digital tools, tablets, and curriculum reinforced with snazzy graphics. The news may help to explain why Amplify MOOC, the entity that offered online AP Computer Science A to high school students, was re-dubbed Edhesive ("online education that sticks") a couple of months ago. Tech-backed Code.org, whose $1+ million "Gold Supporters" include the James and Kathryn Murdoch-led Quadrivium Foundation, announced a partnership with Edhesive to bring CS to schools in June, around the same time Edhesive LLC was formed.
jones_supa writes: As pointed out by a Redditor, it seems that suspending the machine is no longer officially supported by SteamOS. A SteamOS user opened a bug report after his controllers became unresponsive following a suspend cycle. To this, a Valve engineer bluntly replied that "suspend is no longer supported." He further explained the decision by saying that, given the state of hardware and software support throughout the graphics stack on Linux, the team didn't think they could make the feature work reliably.
An anonymous reader writes: Most security interfaces today leave a lot to be desired, and many security pros are gaming enthusiasts, accustomed to a sharp and engaging virtual world. ProtectWise CEO Scott Chasin and CTO Gene Stevens wanted to give them a helpful security tool with an interactive visual dashboard that looks straight out of Call of Duty. The UI is called ProtectWise Visualizer, and its creator is Jake Sargeant, an FX pro and visual designer at MN8 Studio. If his name sounds familiar, it's because he was the Lead Animated Graphics Artist for the movie TRON: Legacy. There's plenty of inspiration available for movie-style UIs; the problem with much of it is that not everyone likes an interface that looks like an especially busy video game.
An anonymous reader writes: Today Pixar announced their second major open-source project, Universal Scene Description. USD is the technology that enables "hundreds of artists to operate simultaneously on the same collections of assets in different contexts," says Pixar VP of software R&D Guido Quaroni. Pixar has been working with the industry to vet the new technology, gaining backing from VFX powerhouses MPC and Double Negative as well as high-end digital content creation tool maker The Foundry. The official source release is slated for summer 2016. Pixar released its RenderMan animation and rendering suite for free back in March.
An anonymous reader writes: Kicking off ACM SIGGRAPH '15, The Khronos Group came out with several big announcements, including the release of OpenGL ES 3.2 (which incorporates Android AEP functionality), confirmation that Google will support Vulkan on Android (when released), new desktop OpenGL extensions, and updates to the existing OpenCL 2.0 specification. They stopped short of releasing the heavily anticipated Vulkan graphics API and also refrained from releasing a new desktop OpenGL version. They hope to have the Vulkan specification and its implementations released before year's end.