An anonymous reader writes: AMD Windows customers were greeted this week with the new "Crimson" Radeon Software, which brought many bug fixes, performance improvements, and a brand new control panel. While AMD also released a Crimson driver for Linux, it really doesn't change much. The control panel is unchanged except for replacing "Catalyst" strings with "Radeon," and there are no performance improvements, just some isolated slowdowns. The Crimson Linux release notes mention only two changes: a fix for glxgears stuttering and a fix for mouse cursor corruption.
dmleonard618 writes: Google is gearing up to release Android Studio 2.0 with three key features. The company has released a preview version, and says the release focuses on speed of delivery and testing. The new features include Instant Run, which lets developers see the impact of their code changes immediately; a rebuilt Android Emulator with a new user interface; and an early preview of a new GPU Profiler that allows developers to record and replay graphics-intensive apps frame by frame.
MojoKid writes: Although AMD's mid-range GPU line-up has been relatively strong for a while now, the company is launching the new Radeon R9 380X today with the goal of taking down competing graphics cards like NVIDIA's popular GeForce GTX 960. The Radeon R9 380X has a fully-functional AMD Tonga GPU with all 32 compute units / 2048 shader processors enabled. AMD's reference specifications call for a 970MHz+ engine clock with 4GB of 1425MHz GDDR5 memory (5.7 Gbps effective). Typical board power is 190W, and cards require a pair of supplemental 6-pin power feeds. The vast majority of the Radeon R9 380X cards that hit the market, however, will likely be custom models that are factory overclocked and look nothing like AMD's reference design. The Radeon R9 380X, or more specifically the factory overclocked Sapphire Nitro R9 380X tested, performed significantly better than AMD's Radeon R9 285 or NVIDIA's GeForce GTX 960 across the board. The 380X, however, could not catch more powerful and more expensive cards like the GeForce GTX 970. Regardless, the Radeon R9 380X is easily the fastest graphics card under $250 on the market right now.
An anonymous reader writes: Yesterday marked the release of Star Wars Battlefront, EA DICE's attempt to resurrect a Star Wars video game series that had great success a decade ago, but gradually petered out over the course of several years. Early reviews for the game are mixed. Games Radar's video review gives it a lot of credit for being incredibly faithful to the feeling of Star Wars. Polygon's review praises the game's accessibility and its broad variety of PvP options, but acknowledges that it had to trade complexity to get there. Giant Bomb's review is much more blunt: "Slick production values, solid controls, and tons of fan service can't make up for mediocre progression and a lack of content." Many reviews rate the graphics highly, and performance is solid even on consoles. It's worth noting that user ratings on Metacritic come in significantly lower than critics' ratings, with the most common complaint being about the dearth of content.
An anonymous reader writes: Following last week's announcement of the Jetson TX1 development board, NVIDIA is now allowing independent reports of performance for their $599 USD 64-bit ARM development board. Linux results published by Phoronix show very strong performance for the Jetson TX1 when comparing the Cortex-A57's speed to the Tegra K1 and older Tegra SoCs, along with other ARM hardware like Calxeda and the Raspberry Pi. The Jetson TX1 was generally multiple times faster than ARM hardware from a few years ago. The graphics performance was twice as fast as the year-old Jetson TK1, thanks to the Maxwell GPU. Compared to x86 hardware, CPU-bound performance is comparable to an AMD Sempron/Phenom, though when utilizing GPGPU computing it pulls ahead of Intel Skylake and Xeon processors. The Jetson TX1 had a peak power consumption of 16 Watts and an average power use of under 10 Watts.
An anonymous reader writes: Linux 4.4-rc1 has been released. New features of Linux 4.4 include a Raspberry Pi kernel mode-setting driver, support for 3D acceleration by QEMU guest virtual machines, AMD Stoney APU support, Qualcomm Snapdragon 820 support, expanded eBPF virtual machine programs, new hardware peripheral support, file-system fixes, faster SHA crypto support on Intel hardware, and LightNVM / Open-Channel SSD support.
mcpublic writes: Today is the 44th anniversary of the Intel 4004, the pioneering 4-bit microprocessor that powered the first electronic taxi meters. According to the unaffiliated (and newly renamed) Intel 4004 45th Anniversary Project web site, they have just re-created the complete set of VLSI mask artwork for the 4004 using scalable vector graphics, and updated their Busicom 141-PF calculator replica aimed at collectors and hobbyists. Included is some interesting historical perspective: Back in the early 1970s, there was no electrical CAD software, design-rule checkers were people, and VLSI lithographic masks were hand-crafted on giant light tables by unsung "rubylith cutters."
An anonymous reader writes: NVIDIA has unveiled the Jetson TX1 development board powered by their Tegra X1 SoC. The Jetson TX1 has a Maxwell GPU capable of 1 TFLOP/s, four 64-bit ARM A57 processors, 4GB of RAM, and 16GB of onboard storage. NVIDIA isn't yet allowing media to publish benchmarks, but the company's reported figures show the graphics and deep learning performance to be comparable to an Intel Core i7-6700K while scoring multiple times better on performance-per-Watt. This development board costs $599 (or $299 for the educational version) and consumes less than 10 Watts.
MojoKid writes: Qualcomm held an event in New York City today to demonstrate for the first time its highly anticipated Snapdragon 820 System-on-Chip (SoC). More than just a speed bump and refresh of the Snapdragon 810, Qualcomm says it designed the Snapdragon 820 "from the ground up to be unlike anything else." Behind that marketing spin is indeed an SoC with a custom 64-bit quad-core Kryo processor clocked at up to 2.2GHz. Qualcomm says it delivers up to twice the performance and twice the power efficiency of its predecessor, which is in fact an 8-core chip. Qualcomm officials have quoted 2x the performance of their previous-gen Snapdragon 810 in single-threaded throughput alone, which is a sizable gain. Efficiency is also being touted here, and according to Qualcomm, the improvements it made to the underlying architecture translate into nearly a third (30 percent) less power consumption. That should help the Snapdragon 820 steer clear of the overheating concerns that the 810 wasn't able to avoid.
MojoKid writes: Intel's 6th Generation Skylake family of Core processors has been available for some time now for desktops. However, the mobile variant of Skylake is perhaps Intel's most potent incarnation of the new architecture, power-optimized on 14nm technology with a beefier graphics engine for notebooks. In late Q3, Intel started rolling out Skylake-U versions of the chip in a 15 Watt TDP flavor. This is the power envelope that most "ultrabooks" are built with, and it's likely to be Intel's highest-volume SKU of the processor. The Lenovo Yoga 900 tested here was configured with an Intel Core i7-6500U dual-core processor that also supports Intel HyperThreading, for four available logical processing threads. Its base frequency is 2.5GHz, but the chip will Turbo Boost to 3GHz and clocks down to 500MHz when idle. The chip also has 4MB of shared L3 cache, along with 512K of L2 and 128K of data cache in total. In the benchmarks, the new Skylake-U mobile chip is about 5 to 10 percent faster than Intel's previous-generation Broadwell platform in CPU-intensive tasks and 20+ percent faster in graphics and gaming, at the same power envelope, likely with better battery life, depending on the device.
An anonymous reader writes: Today marks three years since Valve's Steam client went into beta on Linux. In that time over 1,600 games have become natively available for Linux. Going beyond having many new Linux games, Phoronix recaps, "we've seen Valve make significant investments into the open-source graphics stack and other areas of Linux (in part through their sponsorship of Collabora and LunarG). Valve developers are significantly pushing SDL2. We've seen more mainstream interest in Linux gaming, and Valve has been heavily involved in the creation of the Vulkan graphics API. They have given away their entire game collection to the Mesa/Ubuntu/Debian upstream developers, and much more." The three-year anniversary is coincidentally just days before the release of Steam Machines.
An anonymous reader writes: VR is easy for video games, but hard for live action: you don't know where the viewer will be in the virtual world, so you can't put the camera in the right place in the real world. Light field cameras are perfect for VR though, because they're essentially holographic, and capture lots of positions at once. And Lytro has announced the first system that's both 'light field' and 'holographic', which changes everything. Wired seems similarly excited.
An anonymous reader writes: Software engineer Adrian Courrèges posted on his blog a breakdown of the rendering of a frame in Grand Theft Auto: V. Each rendering pass is explained in detail, with all the techniques and tricks Rockstar used to make the game run on 8-year-old consoles. It's a fascinating trip through the making of a frame and reminds us of how far GPU computing power has come. Here's a brief snippet from the beginning: "As a first step, the game renders a cubemap of the environment. This cubemap is generated in realtime at each frame; its purpose is to help render realistic reflections later. This part is forward-rendered. How is such a cubemap rendered? For those not familiar with the technique, this is just like you would do in the real world when taking a panoramic picture: put the camera on a tripod, imagine you're standing right in the middle of a big cube and shoot at the 6 faces of the cube, one by one, rotating by 90 degrees each time. This is exactly what the game does: each face is rendered into a 128x128 HDR texture."
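The "six shots, rotating 90 degrees each time" idea can be made concrete with a minimal sketch (this is illustrative Python, not Rockstar's actual code): the camera stays at a single point, and the scene is drawn once per cube face, each time with a different forward/up orientation and a 90-degree field of view. The face orientations below follow the common graphics-API cubemap convention; `render_face` is a hypothetical callback standing in for a real engine's draw call.

```python
# Each entry: face name, forward (look) direction, up vector.
# These are the six orthogonal orientations a cubemap capture uses.
CUBEMAP_FACES = [
    ("+X", ( 1,  0,  0), (0, -1,  0)),
    ("-X", (-1,  0,  0), (0, -1,  0)),
    ("+Y", ( 0,  1,  0), (0,  0,  1)),
    ("-Y", ( 0, -1,  0), (0,  0, -1)),
    ("+Z", ( 0,  0,  1), (0, -1,  0)),
    ("-Z", ( 0,  0, -1), (0, -1,  0)),
]

def render_environment_cubemap(render_face, resolution=128):
    """Capture all six faces from a single viewpoint.

    render_face(forward, up, resolution) is a stand-in for the engine's
    scene-draw call; it renders one 90-degree view into a
    resolution x resolution target (a 128x128 HDR texture in GTA V's case).
    """
    return [render_face(forward, up, resolution)
            for _name, forward, up in CUBEMAP_FACES]
```

The key property is that each face's up vector is perpendicular to its forward direction, so the six 90-degree frusta tile the full sphere of directions around the camera with no gaps.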
An anonymous reader writes: Apple's new set top box is on sale now, and has launched with several high profile games in the new tvOS App Store, including Guitar Hero Live and PS4 hit Transistor. However, as one writer points out, the Apple TV is still not an adequate console replacement, and it's not because of the graphics. Instead, several software issues and restrictions issued by Apple itself prevent developers from creating blockbuster exclusives for the platform, including the requirement that all games be playable using the bundled remote, lack of support for four players, and the 200MB initial app download limit. If these remain in place, can the Apple TV become a viable games platform, where the Ouya and PlayStation TV have failed before?
MojoKid writes: AMD has gone through significant changes as a company over the last few months. Recently, we've seen them enter into a joint venture with Nantong Fujitsu for final assembly and test operations. They've also formed the new Radeon Technologies Group, led by longtime graphics guru Raja Koduri. Today, AMD is announcing another big change, and this one affects a piece of software that you may have running on your system right now, if there's a Radeon graphics card on board. AMD is ditching Catalyst Control Center in favor of software dubbed Radeon Settings, which is a critical part of what AMD is calling the Radeon Software Crimson Edition. Radeon Software Crimson Edition is completely re-architected and is claimed to offer new features, improvements to stability and responsiveness, and performance improvements as well. The update will include a new Game Manager, video quality presets, social media integration, simplified Eyefinity setup, a system notifications tab, and more. It looks as though the first version of the software will be out this month.
An anonymous reader writes: The Linux 4.3 kernel was released as stable today. It brings Intel Skylake support, reworked NVIDIA open-source graphics support, and many other changes, with the code count hitting 20.6 million lines.
Espectr0 writes: YouTube user Hacking Jules would like you to see his collection of game emulators running on Android Wear. He manages to play classic 3D Mario and Zelda games in a Nintendo 64 emulator on the original LG G-Watch, while also running Monster Hunter on the PPSSPP emulator. As the linked article admits, this is a work of passion rather than practicality -- if you actually want to play those games enjoyably, don't trade your console or conventional emulator for a smartwatch.
yathosho writes with some good news for GitHub developers: GitHub's Atom editor sees its first big update in version 1.1. Character measurement has been improved, and fonts with ligatures and variable-width fonts are now supported. The biggest new feature is probably live Markdown preview, which matches the current theme. There's also a 1.2.0 beta available, for those who want a look into Atom's future.
An anonymous reader writes: Back in June, Warner Bros. removed Batman: Arkham Knight from sale after numerous graphics and performance issues were found in the PC version. Now, after spending five months trying to fix this mess, Rocksteady and Warner Bros. have re-released the game on Steam, with some free Batman titles for those who bought the launch edition. However, Warner Bros. noted there are still a few caveats, with Windows 10 users recommended to have 12GB of RAM to avoid paging issues: "For Windows 10 users, we've found that having at least 12GB of system RAM on a PC allows the game to operate without paging and provides a smoother gameplay experience." Some initial tests show no performance gains in the re-released version. Warner Bros. claims it is still working closely with its GPU partners to enable SLI/Crossfire for the game.
An anonymous reader writes: Phoronix's recent 22-Way SteamOS Graphics Card Comparison showed that NVIDIA wins across the board when it comes to closed-source OpenGL driver performance. However, when it comes to open-source driver performance for Steam Linux gaming, no one is really the winner. In a new article, "Are The Open-Source Graphics Drivers Good Enough For Steam Linux Gaming?", the author answers that question with a "heck no." While AMD is generally regarded as having better open-source support, its newer graphics cards still can't run at their rated clock frequencies due to missing power management support; incomplete OpenGL 4.x support means many AAA Linux games simply cannot run yet; and insufficient QA means regressions are common. Other issues were also noted when testing a number of modern graphics cards on the open-source drivers.