MojoKid (1002251) writes NVIDIA just officially announced the SHIELD Tablet (powered by its Tegra K1 SoC) and the SHIELD wireless controller. As the SHIELD branding implies, the new tablet and controller build upon the previously released, Android-based SHIELD portable to bring a gaming-oriented tablet to consumers. The SHIELD Tablet and wireless controller are something of a mashup of the SHIELD portable and the Tegra Note 7, but with updated technology and better build materials. You could think of them as an upgraded SHIELD portable gaming device with the screen decoupled from the controller. The device features NVIDIA's Tegra K1 SoC paired with 2GB of RAM and an 8" full-HD IPS display with a native resolution of 1920x1200. There is also a pair of 5MP cameras on the SHIELD Tablet (front and rear), an 802.11a/b/g/n 2x2 MIMO WiFi configuration, GPS, a 9-axis motion sensor, and Bluetooth 4.0 LE. In addition to the WiFi-only version (which features 16GB of internal storage), NVIDIA has a 32GB version with LTE connectivity coming as well. NVIDIA will begin taking pre-orders for the SHIELD Tablet and wireless controller immediately.
mpicpp (3454017) writes with news of a notoriously abused (basically "method of displaying images on a machine") software patent being declared invalid. From the article: The ruling from last week is one of the first to apply new Supreme Court guidance about when ideas are too "abstract" to be patented. ... The patents in this case describe a type of "device profile" that allows digital images to be accurately displayed on different devices. US Patent No. 6,128,415 was originally filed by Polaroid in 1996. After a series of transfers, in 2012 the patent was sold to Digitech Image Technologies, a branch of Acacia Research Corporation, the largest publicly traded patent assertion company. ... In the opinion, a three-judge panel found that the device profile described in the patent is a "collection of intangible color and spatial information," not a machine or manufactured object. "Data in its ethereal, non-physical form is simply information that does not fall under any of the categories of eligible subject matter under section 101," wrote Circuit Judge Jimmie Reyna on behalf of the panel.
An anonymous reader writes The much-anticipated X.Org Server 1.16 release is now available. The "Marionberry Pie" release features XWayland integration, GLAMOR support, systemd integration, and many other features. XWayland allows legacy X11 applications to run in Wayland environments with GL acceleration, GLAMOR provides generic 2D acceleration via OpenGL, and there are non-PCI GPU device improvements along with countless other changes. The systemd integration finally allows the X server to run without root privileges, something that has been in the works for a very long time. The non-PCI device improvements mean System-on-a-Chip graphics will work more smoothly, auto-enumerating just like PCI graphics devices do. As covered previously, GLAMOR (the pure OpenGL acceleration backend) has seen quite a bit of improvement, and now works with Xephyr and XWayland.
KDE Community (3396057) writes "KDE proudly announces the immediate availability of Plasma 5.0, providing a visually updated core desktop experience that is easy to use and familiar to the user. Plasma 5.0 introduces a new major version of KDE's workspace offering. The new Breeze artwork concept introduces cleaner visuals and improved readability. Central work-flows have been streamlined, while well-known overarching interaction patterns are left intact. Plasma 5.0 improves support for high-DPI displays and ships a converged shell, able to switch between user experiences for different target devices. Changes under the hood include the migration to a new, fully hardware-accelerated graphics stack centered around an OpenGL(ES) scenegraph. Plasma is built using Qt 5 and Frameworks 5." sfcrazy reviewed the new desktop experience. It would appear the semantic desktop search features finally work even if you don't have an 8-core machine with an SSD.
Dputiger (561114) writes "It has been almost two years since AMD launched the FirePro W9000 and kicked off a heated battle in the workstation GPU wars with NVIDIA. AMD recently released the powerful FirePro W9100, a new card based on the same Hawaii-class GPU as the desktop R9 290X but aimed at the professional workstation market. The W9100's GPU features 2,816 stream processors, and the card boasts 320GB/s of memory bandwidth and six mini-DisplayPorts, all of which support DP 1.2 and 4K output. The W9100 also carries more RAM than any other AMD GPU, a whopping 16GB of GDDR5 on a single card. Even NVIDIA's top-end Quadro K6000 tops out at 12GB, which puts AMD in a class by itself in this area. In terms of performance, this review shows that the FirePro W9100 doesn't always outshine its competition, but its price/performance ratio keeps it firmly in the running. If AMD continues to improve its product mix and overall software support, it should close the gap even more in the pro GPU market over the next 18-24 months."
Zothecula (1870348) writes "The Retina displays featured on Apple's iPhone 4 and 5 models pack a pixel density of 326 ppi, with individual pixels measuring 78 micrometers. That might seem plenty good enough given the average human eye is unable to differentiate between the individual pixels, but scientists in the UK have now developed technology that could lead to extremely high-resolution displays that put such pixel densities to shame."
MojoKid writes: Normally, the question of whether a game runs better on the PC or a console is a no-brainer, at least for PC users. Watch Dogs, however, with its problematic and taxing PC play, challenges that assumption. And since the gap between consoles and PCs is typically smallest at the beginning of a console generation, HotHardware decided to take the Xbox One out for a head-to-head comparison against the PC with this long-awaited title. What was found may surprise you. Depending on just how much horsepower your PC has, the Xbox One (and possibly the PS4, though that wasn't compared) might be the better option. There's no question that the PC can look better, even before you factor in the mods that have been released to date, but unless you've spent $300 or more on a fairly recent GPU, you're not going to be able to run the game at sufficiently high detail to benefit from the enhanced image quality and resolution. If you have a Radeon HD 7950 / R9 280, a GeForce GTX 780 / 780 Ti, or another NVIDIA card with more than 4GB of RAM, you can happily observe Watch Dogs make hash of the Xbox One — but statistically, only a minority of gamers have this sort of high-end hardware. This comparison should be viewed in light of the recent allegations that the PC version's graphics were deliberately handicapped.
An anonymous reader writes "Today, the CentOS project unveiled CentOS Linux 7 for 64-bit x86-compatible machines. CentOS conforms fully with Red Hat's redistribution policy and aims for full functional compatibility with the upstream product released last month. The new version includes systemd, firewalld, GRUB2, LXC, Docker, and XFS as the default filesystem instead of ext4. The Linux kernel has been updated to 3.10.0, and there is support for Linux Containers, 3D graphics drivers out of the box, OpenJDK 7, support for 40G Ethernet cards, installation in UEFI Secure Boot mode on compatible hardware, and more. See the complete list of features here and here. You can grab this release by visiting the official mirror site or via torrents. On a related note, there is also a CentOS Linux 7 installation screencast here."
benrothke writes There is a not-so-fine line between data dashboards and other information displays that provide pretty but otherwise useless, unactionable information, and those that provide effective answers to key questions. Data-Driven Security: Analysis, Visualization and Dashboards is all about the latter. In this extremely valuable book, authors Jay Jacobs and Bob Rudis show you how to find security patterns in your data logs and extract enough information from them to create effective information security countermeasures. By using data correctly and truly understanding what that data means, the authors show how you can achieve much greater levels of security. Keep reading for the rest of Ben's review.
mpicpp (3454017) writes Apple told news website The Loop that it has decided to abandon Aperture, its professional photo-editing software application. "With the introduction of the new Photos app and iCloud Photo Library, enabling you to safely store all of your photos in iCloud and access them from anywhere, there will be no new development of Aperture," Apple said in a statement to The Loop. "When Photos for OS X ships next year, users will be able to migrate their existing Aperture libraries to Photos for OS X." The new Photos app, which will debut with OS X Yosemite when it launches this fall, will also replace iPhoto. It promises to be more intuitive and user-friendly, but as such, likely not as full-featured as what Aperture currently offers.
Google I/O, the company's annual developer tracking^wdevelopers conference, opened today in San Francisco. This year the company has reduced the number of conference sessions to 80, but it has also promised a broader approach than in previous years -- in other words, the focus may shift a bit from Google's best-known platforms (Chrome/Chrome OS and Android). Given its wide-ranging acquisitions and projects (like the recent purchase of Nest, which itself promptly bought Dropcam, the ever-smarter fleet of self-driving cars, the growing number of Glass devices in the wild, and the announcement of a high-end 3D-scanning tablet quite unlike the Nexus line of tablets and phones), there's no shortage of edges to focus on. Judging from the booths set up in advance of the opening (like one with a sign announcing "The Physical Web"), expect some of the stuff that gets lumped into "the Internet of Things." Watch this space -- updates will appear below -- for notes from the opening keynote, or follow along yourself with the live stream, and add your own commentary in the comments. In the days to come, watch for some video highlights of projects on display at I/O, too. Update: 06/25 17:41 GMT by T : Updates rolling in below on Android, wearables, Android in cars, Chromecast, smart watches, etc. Keep checking back! (Every few minutes, I get another chunk in there.)
An anonymous reader writes At the non-profit where I work, there isn't a lot of money for buying stock photos or licensing professional images. So, we've turned to sources of 'free' imagery, notably Creative Commons-licensed photos on Flickr. While we're not a huge organization, we do have 100+ individuals creating content in one way or another. We're now wrestling with compliance with the CC licensing, like including attribution links for images licensed By Attribution, etc. Our legal counsel is also worried about photographers changing their licenses and suing us after the fact. How do you document that the images you found were licensed one way in the past, especially when numerous people from across the country are acquiring the images?
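One low-overhead approach to the "license changed after the fact" worry is to log a timestamped provenance record, including a hash of the exact file, at the moment each image is acquired. A minimal sketch (not legal advice; the log file name and record fields here are illustrative, not part of any standard):

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical shared log location; in practice this might live in a
# version-controlled repo or a shared database so all 100+ contributors use it.
LOG_FILE = Path("image_license_log.jsonl")

def record_image_license(image_path, source_url, license_name, license_url, attribution):
    """Append a timestamped provenance record for a downloaded image.

    The SHA-256 hash ties the record to the exact file acquired, and the
    timestamp documents which license was displayed at acquisition time.
    """
    digest = hashlib.sha256(Path(image_path).read_bytes()).hexdigest()
    record = {
        "file": str(image_path),
        "sha256": digest,
        "source_url": source_url,
        "license": license_name,
        "license_url": license_url,
        "attribution": attribution,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    # Append-only JSON Lines: one record per line, easy to grep and audit.
    with LOG_FILE.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")
    return record
```

Pairing each record with an archived snapshot of the source page (e.g. via the Wayback Machine) strengthens the evidence further, since the log alone only shows what your own staff recorded.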
An anonymous reader writes "With the Linux 3.16 kernel, the Nouveau driver now supports re-clocking, letting the NVIDIA GPU cores and video memory on this reverse-engineered NVIDIA driver run at their designed frequencies. Up to now the Nouveau driver has been limited to running at whatever (generally low) clock frequencies the video BIOS programmed the hardware to at boot time, but Linux 3.16 adds experimental support for up-clocking to the hardware-rated speeds. The results show the open-source NVIDIA driver running multiple times faster, but re-clocking doesn't work for all NVIDIA hardware, causes lock-ups for some GPUs at some frequencies, and isn't yet dynamically controlled. However, it appears to be the biggest breakthrough in years for this open-source NVIDIA driver, which until now has been too slow for most Linux games."
An anonymous reader writes Developed as part of a university master's thesis is this "truly 3D" windowing system environment. The 3D desktop was developed as a Qt Wayland compositor, output to an Oculus Rift display, and controlled using a high-precision Razer mouse. Overall, it's interesting research into bringing 2D windows into a 3D workspace using Wayland and the Oculus Rift. The code is hosted as the Motorcar Compositor. A video demonstration is on YouTube.
An anonymous reader writes "One of the oldest pieces of the Linux desktop stack still widely in use today is the X Window System that today is commonly referred to as X11 or in recent years the X.Org Server. The X Window System predates the Linux kernel, the Free Software Foundation, GCC, and other key pieces of the Linux infrastructure — or most software widely-used in general. Today marks 30 years since the announcement of X at MIT when it was introduced to Project Athena." X wasn't new when I first saw it, on Sun workstations the summer before I started college. When did you first encounter it?
angry tapir (1463043) writes Third-party developers will be able to build mobile applications that tap into the features of Adobe's Creative Cloud, including effects such as Photoshop's "content-aware fill" and PSD file manipulation, thanks to a new SDK the company is releasing as part of a major update to the suite of graphic design products. However, the company has been mum on important details, such as how much (if anything) it will cost and what the license is likely to be (at the very least it seems end users will need to be Creative Cloud subscribers). The company has also made a foray into hardware, releasing a pressure-sensitive stylus for tablets called Ink and a ruler called Slide.
Advocatus Diaboli writes: Many PC gamers were disappointed that Ubisoft's latest AAA game, Watch_Dogs, did not look as nice as when displayed at E3 in 2012. But this week a modder discovered that code to improve the game's graphics on the PC is still buried within the released game, and can be turned back on without difficulty or performance hits. Ubisoft has yet to answer whether (or why) their PC release was deliberately handicapped. Gaming commentator Total Biscuit has a video explaining the controversy.
An anonymous reader writes 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
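The refresh-rate limits quoted above follow from simple bandwidth arithmetic. A rough back-of-the-envelope check (the figures ignore blanking intervals, which only make the 60 Hz case tighter in practice):

```python
# HDMI 1.4's TMDS link tops out at a 340 MHz clock across 3 channels,
# each carrying 8 data bits per 10-bit symbol -> ~8.16 Gbit/s of video payload.
HDMI_14_PAYLOAD_BPS = 340e6 * 3 * 8  # ~8.16e9 bit/s

def raw_video_rate(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in bit/s, ignoring blanking intervals."""
    return width * height * bits_per_pixel * refresh_hz

rate_60 = raw_video_rate(3840, 2160, 60)  # ~11.9 Gbit/s: exceeds the link
rate_30 = raw_video_rate(3840, 2160, 30)  # ~6.0 Gbit/s: fits, with headroom for blanking
```

That is why 3840×2160 at 60 Hz needs HDMI 2.0 or DisplayPort 1.2, while 30 Hz squeaks through on HDMI 1.4 hardware.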
An anonymous reader writes "Phoronix last week tested 65 graphics cards on open source drivers under Linux and the best result was generally with the open source AMD Radeon drivers. This week they put out a 35-graphics-card comparison using the proprietary AMD/NVIDIA drivers (with the other 30 cards being too old for the latest main drivers) under Ubuntu 14.04. The winner for proprietary GPU driver support on Linux was NVIDIA, which shouldn't come as much of a surprise given that Valve and other Linux game developers are frequently recommending NVIDIA graphics for their game titles while AMD Catalyst support doesn't usually come to games until later. The Radeon OpenGL performance with Catalyst had some problems, but at least its performance per Watt was respectable. Open-source fans are encouraged to use AMD hardware on Linux while those just wanting the best performance and overall experience should see NVIDIA with their binary driver."
mrspoonsi (2955715) writes Mozilla, the organisation behind the Firefox browser, has announced it will start selling low-cost smartphones in India within the "next few months". Speaking to the Wall Street Journal, the firm's chief operating officer suggested the handsets, which will be manufactured by two Indian companies, would retail at $25 (£15) [note: full article paywalled]. They will run Mozilla's HTML5 web-based mobile operating system, Firefox OS. The firm already sells Firefox-powered phones in Europe and Latin America. Firefox OS has come a long way even in the year since we saw a tech demo at Linux Fest Northwest.