Supercomputing

First Demonstration of Artificial Intelligence On a Quantum Computer 98

Posted by Soulskill
from the teaching-a-new-dog-old-tricks dept.
KentuckyFC writes: Machine learning algorithms use a training dataset to learn how to recognize features in images, then use this "knowledge" to spot the same features in new images. The time required for this task grows polynomially with the number of images in the training set and with the complexity of the "learned" feature. So it's no surprise that quantum computers ought to be able to speed up the process dramatically. Indeed, a group of theoretical physicists last year designed a quantum algorithm that solves the problem in logarithmic rather than polynomial time, a significant improvement.

Now, a Chinese team has successfully implemented this artificial intelligence algorithm on a working quantum computer for the first time. The information processor is a standard nuclear magnetic resonance quantum computer capable of handling 4 qubits. The team trained it to recognize the difference between the characters '6' and '9' and then asked it to classify a set of handwritten 6s and 9s accordingly, which it did successfully. The team says this is the first time that this kind of artificial intelligence has ever been demonstrated on a quantum computer, and that it opens the way to more rapid processing of other big data sets — provided, of course, that physicists can build more powerful quantum computers.
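The classical version of this kind of classification is essentially a distance calculation: "training" reduces each class to a representative vector, and new samples are assigned to the nearest one. The quantum algorithm's speedup comes from estimating such distances on amplitude-encoded vectors, but the logic is the same. A minimal nearest-centroid sketch in Python (the toy feature vectors here are invented for illustration; real handwritten digits would be pixel arrays):

```python
import math

# Toy training data: each "image" is a flattened feature vector,
# labeled "6" or "9". These numbers are made up for illustration.
train = {
    "6": [[0.9, 0.1, 0.8, 0.2], [0.8, 0.2, 0.9, 0.1]],
    "9": [[0.1, 0.9, 0.2, 0.8], [0.2, 0.8, 0.1, 0.9]],
}

def centroid(vectors):
    # "Training" is just averaging each class into a representative vector.
    return [sum(col) / len(vectors) for col in zip(*vectors)]

centroids = {label: centroid(vecs) for label, vecs in train.items()}

def classify(x):
    # Assign x to the class whose centroid is nearest (Euclidean distance).
    return min(centroids, key=lambda lbl: math.dist(x, centroids[lbl]))

print(classify([0.85, 0.15, 0.85, 0.15]))  # prints "6"
```

With two well-separated classes even this trivial classifier works; the hard part, classically, is doing it at scale over huge training sets, which is where a logarithmic-time quantum algorithm would matter.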
Software

Brown Dog: a Search Engine For the Other 99 Percent (of Data) 23

Posted by Soulskill
from the because-it-fetches-data dept.
aarondubrow writes: We've all experienced the frustration of trying to access information on websites, only to find that the data is trapped in outdated, difficult-to-read file formats and that metadata — the critical data about the data, such as when, how, and by whom it was produced — is nonexistent. Led by Kenton McHenry, a team at the National Center for Supercomputing Applications is working to change that. Recipients in 2013 of a $10 million, five-year award from the National Science Foundation, the team is developing software that allows researchers to manage and make sense of vast amounts of digital scientific data currently trapped in outdated file formats. The NCSA team recently demonstrated two publicly available services that make the contents of uncurated data collections accessible.
Supercomputing

Supercomputing Upgrade Produces High-Resolution Storm Forecasts 77

Posted by samzenpus
from the clearer-pictures dept.
dcblogs writes: A supercomputer upgrade is paying off for the U.S. National Weather Service, with new high-resolution models that will offer better insight into severe weather. The National Oceanic and Atmospheric Administration, which runs the weather service, has put into production two new IBM supercomputers, each rated at 213 teraflops, running Linux on Intel processors. They replace four-year-old, 74-teraflop systems. More computing power means the models can run more mathematics and increase the resolution, or detail, of forecast maps from 8 miles to 2 miles.
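A rough back-of-the-envelope shows why a fourfold resolution increase demands far more than four times the compute (a sketch; the time-step factor assumes a CFL-limited explicit scheme, which is a simplification of real forecast models):

```python
old_dx, new_dx = 8.0, 2.0      # horizontal grid spacing, miles
refine = old_dx / new_dx       # 4x finer in each horizontal direction

cells = refine ** 2            # 16x more grid cells over the same area
# A finer grid typically also forces a proportionally smaller time step
# (the CFL stability condition), multiplying the cost again:
cost = cells * refine          # ~64x more arithmetic per forecast

print(cells, cost)             # 16.0 64.0
```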
Google

Google To Build Quantum Information Processors 72

Posted by Soulskill
from the remember-when-google-was-a-search-company dept.
An anonymous reader writes: The Google Quantum AI Team has announced that it's bringing in a team from the University of California at Santa Barbara to build quantum information processors within the company. "With an integrated hardware group the Quantum AI team will now be able to implement and test new designs for quantum optimization and inference processors based on recent theoretical insights as well as our learnings from the D-Wave quantum annealing architecture." Google will continue to work with D-Wave, but the UC Santa Barbara group brings its own expertise with superconducting qubit arrays.
Cloud

IBM Opens Up Its Watson Supercomputer To Researchers 28

Posted by samzenpus
from the try-it-out dept.
An anonymous reader writes: IBM has announced the "Watson Discovery Advisor," a cloud-based tool that will let researchers comb through massive troves of data, looking for insights and connections. The company says it's a major expansion in capabilities for the Watson Group, which IBM seeded with a $1 billion investment. "Scientific discovery takes us to a different level as a learning system," said Steve Gold, vice president of the Watson Group. "Watson can provide insights into the information independent of the question. The ability to connect the dots opens up a new world of possibilities."
Supercomputing

How a Supercomputer Beat the Scrap Heap and Lived On To Retire In Africa 145

Posted by Unknown Lamer
from the spread-the-computing dept.
New submitter jorge_salazar (3562633) writes: Pieces of the decommissioned Ranger supercomputer, 40 racks in all, were shipped to researchers in South Africa, Tanzania, and Botswana to help seed their supercomputing aspirations. They say they'll need supercomputers to solve their growing science problems in astronomy, bioinformatics, climate modeling and more. Ranger's own beginnings were described by the co-founder of Sun Microsystems as a "historic moment in petaflop computing."
Supercomputing

A Peek Inside D-Wave's Quantum Computing Hardware 55

Posted by Soulskill
from the hamsters-are-neither-alive-nor-dead dept.
JeremyHsu writes: A one-second delay can still seem like an eternity for a quantum computing machine capable of running calculations in mere millionths of a second. That delay represents just one of the challenges D-Wave Systems overcame in building its second-generation quantum computing machine known as D-Wave Two — a system that has been leased to customers such as Google, NASA and Lockheed Martin. D-Wave's rapid-scaling approach to quantum computing has plenty of critics, but the company's experience in building large-scale quantum computing hardware could provide valuable lessons for everyone, regardless of whether the D-Wave machines live up to quantum computing's potential by proving they can outperform classical computers. (D-Wave recently detailed the hardware design changes between its first- and second-generation quantum computing machines in the June 2014 issue of the journal IEEE Transactions on Applied Superconductivity.)

"We were nervous about going down this path," says Jeremy Hilton, vice president of processor development at D-Wave Systems. "This architecture requires the qubits and the quantum devices to be intermingled with all these big classical objects. The threat you worry about is noise and the impact of all this stuff hanging around the qubits. Traditional experiments in quantum computing have qubits in almost perfect isolation. But if you want quantum computing to be scalable, it will have to be immersed in a sea of computing complexity."
Supercomputing

Computing a Cure For HIV 89

Posted by Soulskill
from the petaflops-for-science dept.
aarondubrow writes: The tendency of HIV to mutate and resist drugs has made it particularly difficult to eradicate. But in the last decade scientists have begun using a new weapon in the fight against HIV: supercomputers. Using some of the nation's most powerful supercomputers, teams of researchers are pushing the limits of what we know about HIV and how we can treat it. The Huffington Post describes how supercomputers are helping scientists understand and treat the disease.
Bitcoin

NSF Researcher Suspended For Mining Bitcoin 220

Posted by Unknown Lamer
from the probably-shouldn't-do-that dept.
PvtVoid (1252388) writes "In the semiannual report to Congress by the NSF Office of Inspector General, the organization said it received reports of a researcher who was using NSF-funded supercomputers at two universities to mine Bitcoin. The computationally intensive mining took up about $150,000 worth of NSF-supported computer use at the two universities to generate bitcoins worth about $8,000 to $10,000, according to the report. It did not name the researcher or the universities."
Supercomputing

Electrical Control of Nuclear Spin Qubits: Important Step For Quantum Computing 42

Posted by Soulskill
from the now-you're-cooking-with-electricity dept.
Taco Cowboy writes: "Using a spin cascade in a single-molecule magnet, scientists at Karlsruhe Institute of Technology and their French partners have demonstrated that control of a single nuclear spin can be realized in a purely electric manner, rather than through the use of magnetic fields (abstract). For their experiments, the researchers used a nuclear spin-qubit transistor that consists of a single-molecule magnet connected to three electrodes (source, drain, and gate). The single-molecule magnet is a TbPc2 molecule — a single metal ion of terbium enclosed by organic phthalocyanine molecules of carbon, nitrogen, and hydrogen atoms. The gap between the electric field and the spin is bridged by the so-called hyperfine-Stark effect, which transforms the electric field into a local magnetic field. This quantum mechanical process can be transferred to all nuclear spin systems and hence opens up entirely novel perspectives for integrating quantum effects in nuclear spins into electronic circuits."
Supercomputing

Stanford Bioengineers Develop 'Neurocore' Chips 9,000 Times Faster Than a PC 209

Posted by Soulskill
from the i'll-order-a-dozen dept.
kelk1 sends this article from the Stanford News Service: "Stanford bioengineers have developed faster, more energy-efficient microchips based on the human brain – 9,000 times faster and using significantly less power than a typical PC (abstract). Kwabena Boahen and his team have developed Neurogrid, a circuit board consisting of 16 custom-designed 'Neurocore' chips. Together these 16 chips can simulate 1 million neurons and billions of synaptic connections. The team designed these chips with power efficiency in mind. Their strategy was to enable certain synapses to share hardware circuits. ... But much work lies ahead. Each of the current million-neuron Neurogrid circuit boards costs about $40,000. (...) Neurogrid is based on 16 Neurocores, each of which supports 65,536 neurons. Those chips were made using 15-year-old fabrication technologies. By switching to modern manufacturing processes and fabricating the chips in large volumes, he could cut a Neurocore's cost 100-fold – suggesting a million-neuron board for $400 a copy."
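The headline figures can be checked directly from the numbers quoted in the summary:

```python
neurocores = 16
neurons_per_core = 65_536

total_neurons = neurocores * neurons_per_core
print(total_neurons)           # 1048576 -- the "1 million neurons" per board

board_cost = 40_000            # current cost per Neurogrid board, in dollars
projected = board_cost / 100   # the claimed 100-fold cost reduction
print(projected)               # 400.0 -- the "$400 a copy" figure
```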
Space

Using Supercomputers To Predict Signs of Black Holes Swallowing Stars 31

Posted by samzenpus
from the hungry-hungry-black-holes dept.
aarondubrow (1866212) writes "A 'tidal disruption' occurs when a star orbits too close to a black hole and gets sucked in. The phenomenon is accompanied by a bright flare with a unique signature that changes over time. Researchers at the Georgia Institute of Technology are using Stampede and other NSF-supported supercomputers to simulate tidal disruptions in order to better understand the dynamics of the process. Doing so helps astronomers find many more possible candidates of tidal disruptions in sky surveys and will reveal details of how stars and black holes interact."
IBM

Fifty Years Ago IBM 'Bet the Company' On the 360 Series Mainframe 169

Posted by timothy
from the y'-tell-the-kids-that-today dept.
Hugh Pickens DOT Com (2995471) writes "Those of us of a certain age remember well the breakthrough that the IBM 360 series mainframe represented when it was unveiled fifty years ago, on 7 April 1964. Now Mark Ward reports at the BBC that the first System 360 mainframes marked a break with all general-purpose computers that came before, because it was possible to upgrade the processors while keeping the same code and peripherals from earlier models. "Before System 360 arrived, businesses bought a computer, wrote programs for it and then when it got too old or slow they threw it away and started again from scratch," says Barry Heptonstall. IBM bet the company when it developed the 360 series. At the time, IBM had a huge array of conflicting and incompatible computer lines, as did the industry in general, which was still largely a custom or small-scale design and production business. IBM was so large that the problems were becoming obvious: upgrading from one of the smaller IBM series to a larger one took so much effort that a customer might as well switch to a competing product from the "BUNCH" (Burroughs, Univac, NCR, CDC and Honeywell). Fred Brooks managed the development of IBM's System/360 family of computers and the OS/360 software support package, and based his software classic "The Mythical Man-Month" on his observation that "adding manpower to a late software project makes it later." The S/360 was also the first computer to use microcode to implement many of its machine instructions, as opposed to having all of its machine instructions hard-wired into its circuitry. Despite their age, mainframes are still in wide use today and are behind many of the big information systems that keep the modern world humming, handling such things as airline reservations, cash machine withdrawals and credit card payments. "We don't see mainframes as legacy technology," says Charlie Ewen. "They are resilient, robust and are very cost-effective for some of the work we do.""
Stats

Mystery MLB Team Moves To Supercomputing For Their Moneyball Analysis 56

Posted by timothy
from the stats-nerds-with-bats dept.
An anonymous reader writes "A mystery [Major League Baseball] team has made a sizable investment in Cray's latest effort at bringing graph analytics at extreme scale to bat. Nicole Hemsoth writes that what the team is looking for is a "hypothesis machine" that will allow them to integrate multiple, deep data wells and pose several questions against the same data. They are looking for platforms that allow users to look at facets of a given dataset, adding new cuts to see how certain conditions affect the reflection of a hypothesized reality."
Supercomputing

Pentago Is a First-Player Win 136

Posted by timothy
from the heads-I-win-tails-you-lose dept.
First time accepted submitter jwpeterson writes "Like chess and go, pentago is a two player, deterministic, perfect knowledge, zero sum game: there is no random or hidden state, and the goal of the two players is to make the other player lose (or at least tie). Unlike chess and go, pentago is small enough for a computer to play perfectly: with symmetries removed, there are a mere 3,009,081,623,421,558 (3e15) possible positions. Thus, with the help of several hours on 98,304 threads of Edison, a Cray supercomputer at NERSC, pentago is now strongly solved. 'Strongly' means that perfect play is efficiently computable for any position. For example, the first player wins."
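"Strongly solved" means the exact game value is computable for every position, not just from the opening. Pentago's 3e15 positions needed a Cray; the same idea fits in a screenful for tic-tac-toe (a sketch of exhaustive game solving in general, not of the actual Pentago solver's code):

```python
from functools import lru_cache

LINES = [(0,1,2), (3,4,5), (6,7,8),   # rows
         (0,3,6), (1,4,7), (2,5,8),   # columns
         (0,4,8), (2,4,6)]            # diagonals

def winner(board):
    for a, b, c in LINES:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    # Game value from `player`'s perspective: +1 win, 0 draw, -1 loss.
    w = winner(board)
    if w:
        return 1 if w == player else -1
    if "." not in board:
        return 0  # full board, no winner: draw
    opponent = "O" if player == "X" else "X"
    # Negamax: try every move; the opponent's best reply sets our value.
    return max(
        -value(board[:i] + player + board[i+1:], opponent)
        for i, cell in enumerate(board) if cell == "."
    )

print(value("." * 9, "X"))  # 0: tic-tac-toe is a draw under perfect play
```

Memoization collapses the search much as symmetry reduction does for Pentago: positions reached by different move orders are evaluated only once.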
IBM

IBM Dumping $1 Billion Into New Watson Group 182

Posted by samzenpus
from the eggs-in-one-basket dept.
Nerval's Lobster writes "IBM believes its Watson supercomputing platform is much more than a gameshow-winning gimmick: its executives are betting very big that the software will fundamentally change how people and industries compute. In the beginning, IBM assigned 27 core researchers to the then-nascent Watson. Working diligently, those scientists and developers built a tough 'Jeopardy!' competitor. Encouraged by that success on live television, Big Blue devoted a larger team to commercializing the technology—a group it made a point of hiding in Austin, Texas, so its members could better focus on hardcore research. After years of experimentation, IBM is now prepping Watson to go truly mainstream. As part of that upgraded effort (which includes lots of hype-generating), IBM will devote a billion dollars and thousands of researchers to a dedicated Watson Group, based in New York City at 51 Astor Place. The company plans on pouring another $100 million into an equity fund for Watson's growing app ecosystem. If everything goes according to IBM's plan, Watson will help kick off what CEO Ginni Rometty refers to as a third era in computing. The 19th century saw the rise of a 'tabulating' era: the birth of machines designed to count. In the latter half of the 20th century, developers and scientists initiated the 'programmable' era—resulting in PCs, mobile devices, and the Internet. The third (potential) era is 'cognitive,' in which computers become adept at understanding and solving, in a very human way, some of society's largest problems. But no matter how well Watson can read, understand and analyze, the platform will need to earn its keep. Will IBM's clients pay lots of money for all that cognitive power? Or will Watson ultimately prove an overhyped sideshow?"
Encryption

NSA Trying To Build Quantum Computer 221

Posted by Soulskill
from the looking-forward-to-quantum-leaks dept.
New submitter sumoinsanity writes "The Washington Post has disclosed that the NSA is trying to build a quantum computer for use in cracking modern encryption. The work is part of a research program aimed at penetrating the hardest encryption targets, which received $79.7 million in total funding. Another article makes the case that the NSA's quantum computing efforts are both disturbing and reassuring. The reassuring part is that public key infrastructure is still OK when done properly, since the NSA is still working so hard to defeat it. It's also highly unlikely that the NSA has achieved significant progress without outside awareness or help. More disturbing is that it may simply be a matter of time before public-key encryption fails, and our private messages are out there for all to see."
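The reassurance rests on a scaling argument: recovering an RSA private key classically means factoring the public modulus, whose cost explodes with key size, while Shor's algorithm on a sufficiently large quantum computer would do it in polynomial time. A toy classical illustration (the numbers here are absurdly small; real moduli are 2048+ bits, far beyond any trial division):

```python
def trial_factor(n):
    # Classical factoring by trial division: cost grows like sqrt(n),
    # i.e. exponentially in the bit-length of n.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime

# A toy RSA-style modulus built from two small primes.
p, q = 1009, 1013
n = p * q
print(trial_factor(n))  # (1009, 1013) -- knowing the factors breaks the key
```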
Supercomputing

Using Supercomputers To Find a Bacterial "Off" Switch 30

Posted by samzenpus
from the bug-crunching dept.
Nerval's Lobster writes "The comparatively recent addition of supercomputing to the toolbox of biomedical research may already have paid off in a big way: Researchers have used a bio-specialized supercomputer to identify a molecular 'switch' that might be used to turn off bad behavior by pathogens. They're now trying to figure out what to do with that discovery by running even bigger tests on the world's second-most-powerful supercomputer. The 'switch' is a pair of amino acids called Phe396 that helps control the ability of the E. coli bacteria to move under its own power. Phe396 sits on a chemoreceptor that extends through the cell wall, so it can pass information about changes in the local environment to proteins on the inside of the cell. Its role was discovered by a team of researchers from the University of Tennessee and the ORNL Joint Institute for Computational Sciences using a specialized supercomputer called Anton, which was built specifically to simulate biomolecular interactions among proteins and other molecules to give researchers a better way to study details of how molecules interact. 'For decades proteins have been viewed as static molecules, and almost everything we know about them comes from static images, such as those produced with X-ray crystallography,' according to Igor Zhulin, a researcher at ORNL and professor of microbiology at UT, in whose lab the discovery was made. 'But signaling is a dynamic process, which is difficult to fully understand using only snapshots.'"
Medicine

Google Supercomputers Tackle Giant Drug-Interaction Data Crunch 50

Posted by timothy
from the more-cubbies-for-more-index-cards dept.
ananyo writes "By analysing the chemical structure of a drug, researchers can see if it is likely to bind to, or 'dock' with, a biological target such as a protein. Researchers have now unveiled a computational effort that used Google's supercomputers to assess billions of potential dockings on the basis of drug and protein information held in public databases. The effort will help researchers to find potentially toxic side effects and to predict how and where a compound might work in the body, purely on the basis of its chemical structure. 'It's the largest computational docking ever done by mankind,' says Timothy Cardozo, a pharmacologist at New York University's Langone Medical Center, who presented the project at the US National Institutes of Health's High Risk–High Reward Symposium in Bethesda, Maryland. The result, a website called Drugable, is still in testing, but it will eventually be available for free."
