Security

New 'GoFetch' Apple CPU Attack Exposes Crypto Keys (securityweek.com)

"There is a new side channel attack against Apple 'M' series CPUs that does not appear to be fixable without a major performance hit," writes Slashdot reader EncryptedSoldier. SecurityWeek reports: A team of researchers representing several universities in the United States has disclosed the details of a new side-channel attack method that can be used to extract secret encryption keys from systems powered by Apple CPUs. The attack method, dubbed GoFetch, has been described as a microarchitectural side-channel attack that allows the extraction of secret keys from constant-time cryptographic implementations. These types of attacks require local access to the targeted system. The attack targets a hardware optimization named data memory-dependent prefetcher (DMP), which attempts to prefetch addresses found in the contents of program memory to improve performance.

The researchers have found a way to use specially crafted cryptographic operation inputs that allow them to infer secret keys, guessing them bits at a time by monitoring the behavior of the DMP. They managed to demonstrate end-to-end key extraction attacks against several crypto implementations, including OpenSSL Diffie-Hellman Key Exchange, Go RSA, and the post-quantum CRYSTALS-Kyber and CRYSTALS-Dilithium. The researchers have conducted successful GoFetch attacks against systems powered by Apple M1 processors, and they have found evidence that the attack could also work against M2 and M3 processors. They have also tested an Intel processor that uses DMP, but found that it's 'more robust' against such attacks.
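The bit-by-bit guessing described above can be sketched as a toy model (illustrative only, with hypothetical names; this is not the researchers' attack code). A simulated DMP "prefetches" any memory word whose value falls in a pointer-looking range, and the attacker crafts an input whose intermediate value looks like a pointer only when a secret bit is 1:

```python
# Toy model of the GoFetch principle. Assumption: the DMP treats values in
# a fixed range as pointers and dereferencing them leaves an observable trace.
POINTER_RANGE = range(0x100000, 0x200000)  # values the toy DMP treats as pointers

def dmp_prefetch_observed(memory_word: int) -> bool:
    # The DMP dereferences anything pointer-like; on real hardware that
    # dereference leaves a timing-visible trace in the cache.
    return memory_word in POINTER_RANGE

def victim_mix(secret_bit: int, attacker_input: int) -> int:
    # Stand-in for a constant-time crypto step whose *intermediate value*
    # (not its timing) depends on one bit of the secret key.
    return attacker_input if secret_bit else attacker_input >> 1

def recover_bit(secret_bit: int, attacker_input: int) -> int:
    # The attacker picks an input that is pointer-like only when the secret
    # bit is 1, then simply watches whether the DMP fired.
    return 1 if dmp_prefetch_observed(victim_mix(secret_bit, attacker_input)) else 0
```

On real hardware the observation is a cache-timing measurement rather than a boolean, and the experiment is repeated with fresh crafted inputs for each key bit.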

The experts said Apple is investigating the issue, but fully addressing it does not seem trivial. The researchers have proposed several countermeasures, but they involve hardware changes that are not easy to implement or mitigations that can have a significant impact on performance. Apple told SecurityWeek that it thanks the researchers for their collaboration as this work advances the company's understanding of these types of threats. The tech giant also shared a link to a developer page that outlines one of the mitigations mentioned by the researchers.
The researchers have published a paper (PDF) detailing their work.

Ars Technica's Dan Goodin also reported on the vulnerability.
Desktops (Apple)

Unpatchable Vulnerability in Apple Chip Leaks Secret Encryption Keys (arstechnica.com)

A newly discovered vulnerability baked into Apple's M-series of chips allows attackers to extract secret keys from Macs when they perform widely used cryptographic operations, academic researchers have revealed in a paper published Thursday. From a report: The flaw -- a side channel allowing end-to-end key extractions when Apple chips run implementations of widely used cryptographic protocols -- can't be patched directly because it stems from the microarchitectural design of the silicon itself. Instead, it can only be mitigated by building defenses into third-party cryptographic software that could drastically degrade M-series performance when executing cryptographic operations, particularly on the earlier M1 and M2 generations. The vulnerability can be exploited when the targeted cryptographic operation and the malicious application with normal user system privileges run on the same CPU cluster.

The threat resides in the chips' data memory-dependent prefetcher, a hardware optimization that predicts the memory addresses of data that running code is likely to access in the near future. By loading the contents into the CPU cache before it's actually needed, the DMP, as the feature is abbreviated, reduces latency between the main memory and the CPU, a common bottleneck in modern computing. DMPs are a relatively new phenomenon found only in M-series chips and Intel's 13th-generation Raptor Lake microarchitecture, although older forms of prefetchers have been common for years. Security experts have long known that classical prefetchers open a side channel that malicious processes can probe to obtain secret key material from cryptographic operations. This vulnerability is the result of the prefetchers making predictions based on previous access patterns, which can create changes in state that attackers can exploit to leak information. In response, cryptographic engineers have devised constant-time programming, an approach that ensures that all operations take the same amount of time to complete, regardless of their operands. It does this by keeping code free of secret-dependent memory accesses or structures.
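The constant-time idea can be illustrated with a generic table lookup (a sketch, not code from any real crypto library). In the leaky version, the memory address touched depends on the secret; in the branchless version, every entry is read and the result is selected with arithmetic masks, so the access pattern is identical for every key value:

```python
# Hypothetical 2-bit S-box table for illustration.
TABLE = [0x3A, 0x7F, 0x1C, 0xD4]

def leaky_lookup(secret: int) -> int:
    # The address touched reveals `secret` to cache/prefetcher side channels.
    return TABLE[secret]

def constant_time_lookup(secret: int) -> int:
    # Read every entry; select with a branchless mask instead of an index.
    result = 0
    for i, entry in enumerate(TABLE):
        # ((i ^ secret) - 1) >> 8 is -1 (all ones) iff i == secret, else 0,
        # for small non-negative values -- no branch, no secret-dependent address.
        mask = ((i ^ secret) - 1) >> 8
        result |= entry & mask
    return result
```

The DMP undermines exactly this defense: even though the addresses and timing are data-independent, the prefetcher reacts to the *values* passing through memory.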

Microsoft

Microsoft Unveils Surface Pro 10 and Surface Laptop 6 for Business, Its First AI PCs (theverge.com)

Microsoft has announced two new Surface devices, the Surface Pro 10 for Business and Surface Laptop 6 for Business, both featuring Intel's latest Core Ultra processors, a dedicated Neural Processing Unit (NPU), and a new Copilot key for AI-powered features in Windows 11.

The devices, which will start shipping to commercial customers on April 9th, have been designed exclusively for businesses and will not be sold directly to consumers. The Surface Pro 10 for Business, starting at $1,199, offers a choice between Core Ultra 5 135U and Core Ultra 7 165U options, with up to 64GB of RAM and a 256GB Gen4 SSD. It also features an improved 13-inch display with an antireflective coating and a 1440p front-facing camera with a 114-degree field of view.

The Surface Laptop 6 for Business, also starting at $1,199, is powered by Intel's Core Ultra H-series chips and is available with up to 64GB of RAM and a 1TB Gen4 SSD. The 15-inch model includes two USB-C Thunderbolt 4 ports, while the 13.5-inch model features a single USB-C Thunderbolt 4 port. Both devices have an optional smart card reader and are Microsoft's most easily serviceable Surface devices to date.

Further reading: Microsoft's official blog.
Intel

Intel Prepares For $100 Billion Spending Spree Across Four US States

After securing billions in federal grants and loans, Reuters reports that the company is "planning a $100-billion spending spree across four U.S. states" to build and expand its chip manufacturing factories. From the report: The centerpiece of Intel's five-year spending plan is turning empty fields near Columbus, Ohio, into what CEO Pat Gelsinger described to reporters on Tuesday as "the largest AI chip manufacturing site in the world," starting as soon as 2027. Intel's plan will also involve revamping sites in New Mexico and Oregon and expanding operations in Arizona, where longtime rival Taiwan Semiconductor Manufacturing Co is also building a massive factory that it hopes will receive funding from President Joe Biden's push to bring advanced semiconductor manufacturing back to the United States. [...]

Gelsinger said about 30% of the $100-billion plan will be spent on construction costs such as labor, piping and concrete. The remainder will go toward buying chipmaking tools from firms such as ASML, Tokyo Electron, Applied Materials and KLA, among others. Those tools will help bring the Ohio site online by 2027 or 2028, though Gelsinger warned the timeline could slip if the chip market takes a dive. Beyond grants and loans, Intel plans to make most of the purchases from its existing cash flows.

"It will still take three to five years for Intel to become a serious player in the foundry market" for cutting-edge chips, said Kinngai Chan, an analyst at Summit Insights. However, he warned more investment would be needed before Intel could overtake TSMC, adding that the Taiwanese firm could remain the leader for "some time to come." Gelsinger has previously said a second round of U.S. funding for chip factories would likely be needed to re-establish the U.S. as a leader in semiconductor manufacturing, which he reiterated on Tuesday. "It took us three-plus decades to lose this industry. It's not going to come back in three to five years of CHIPS Act" funding, said Gelsinger, who referred to the low-interest-rate funding as "smart capital."
Intel

Intel Awarded Up To $8.5 Billion in CHIPS Act Grants, With Billions More in Loans Available

The White House said Wednesday Intel has been awarded up to $8.5 billion in CHIPS Act funding, as the Biden administration ramps up its effort to bring semiconductor manufacturing to U.S. soil. From a report: Intel could receive an additional $11 billion in loans from the CHIPS and Science Act, which was passed in 2022. The awards will be announced by President Joe Biden in Arizona on Wednesday. The money will help "leading-edge semiconductors made in the United States" keep "America in the driver's seat of innovation," U.S. Secretary of Commerce Gina Raimondo said on a call with reporters. Intel and the White House said their agreement is nonbinding and preliminary and could change.

Intel has long been a stalwart of the U.S. semiconductor industry, developing chips that power many of the world's PCs and data center servers. However, the company has been eclipsed in revenue by Nvidia, which leads in artificial intelligence chips, and has been surpassed in market cap by rival AMD and mobile phone chipmaker Qualcomm.
Microsoft

Trying Out Microsoft's Pre-Release OS/2 2.0 (theregister.com)

Last month, the only known surviving copy of 32-bit OS/2 from Microsoft was purchased for $650. "Now, two of the internet's experts in getting early PC operating systems running today have managed to fire it up, and you can see the results," reports The Register. From the report: Why such interest in this nearly third-of-a-century old, unreleased OS? Because this is the way the PC industry very nearly went. This SDK came out in June 1990, just one month after Windows 3.0. If 32-bit OS/2 had launched as planned, Windows 3 would have been the last version before it was absorbed into OS/2 and disappeared. There would never have been any 32-bit versions: no Windows NT, no Windows 95; no Explorer, no Start menu or taskbars. That, in turn, might well have killed off Apple as well. No iPod, no iPhone, no fondleslabs. Twenty-first century computers would be unimaginably different. The surprise here is that we can see a glimpse of this world that never happened. The discovery of this pre-release OS shows how very nearly ready it was in 1990. IBM didn't release its solo version until April 1992, the same month as Windows 3.1 -- but now, we can see it was nearly ready two years earlier.

That's why Michal Necasek of the OS/2 Museum called his look The Future That Never Was. He uncovered a couple of significant bugs, but more impressively, he found workarounds for both, and got both features working fine. OS/2 2 could run multiple DOS VMs at once, but in the preview, they wouldn't open -- due to use of an undocumented instruction which Intel did implement in the Pentium MMX and later processors. Secondly, the bundled network client wouldn't install -- but removing a single file got that working fine. That alone is a significant difference between Microsoft's OS/2 2.0 and IBM's version: Big Blue didn't include networking until Warp Connect 3 in 1995.

His verdict: "The 6.78 build of OS/2 2.0 feels surprisingly stable and complete. The cover letter that came with the SDK stressed that Microsoft developers had been using the OS/2 pre-release for day-to-day work." Over at Virtually Fun, Neozeed also took an actual look at Microsoft OS/2 2.0, carefully recreating that screenshot from PC Magazine in May 1990. He even managed to get some Windows 2 programs running, although this preview release did not yet have a Windows subsystem. On his Internet Archive page, he has disk images and downloadable virtual machines so that you can run this yourself under VMware or 86Box.

Intel

Pentagon Scraps $2.5 Billion Grant To Intel (seekingalpha.com)

According to Bloomberg (paywalled), the Pentagon has reportedly scrapped its plan to allocate $2.5 billion in grants to Intel, causing the firm's stock to slip in extended-hours trading. From a report: The decision now leaves the U.S. Commerce Department, which is responsible for doling out the funds from the U.S. CHIPS and Science Act, to make up the shortfall, the news outlet said. The Commerce Dept. was initially only supposed to cover $1B of the $3.5B that Intel is slated to receive for advanced defense and intelligence-related semiconductors. The deal is slated to position Intel as the dedicated supplier for processors used for military and intelligence applications and could result in a Secure Enclave inside Intel's chip factory, the news outlet said. With the Pentagon reportedly pulling out, it could alter how much Intel and other companies receive from the CHIPS Act, the news outlet said.
AI

"We Asked Intel To Define 'AI PC.' Its reply: 'Anything With Our Latest CPUs'" (theregister.com)

An anonymous reader shares a report: If you're confused about what makes a PC an "AI PC," you're not alone. But we finally have something of an answer: if it packs a GPU, a processor that boasts a neural processing unit (NPU), and can handle VNNI and DP4a instructions, it qualifies -- at least according to Robert Hallock, Intel's senior director of technical marketing. As luck would have it, that combo is present in Intel's current-generation desktop processors -- 14th-gen Core, aka Core Ultra, aka "Meteor Lake." All models feature a GPU and an NPU, and can handle the Vector Neural Network Instructions (VNNI) that speed some -- surprise! -- neural networking tasks, and the DP4a instructions that help GPUs to process video.

Since AI PCs are, by that definition, just PCs with current processors, Intel doesn't consider "AI PC" to be a brand that denotes conformity with a spec or a particular capability not present in other PCs. Intel used the "Centrino" brand to distinguish Wi-Fi-enabled PCs, and did likewise by giving home entertainment PCs the "Viiv" moniker. Chipzilla still uses the tactic with "vPro" -- a brand that denotes processors that include manageability and security for business users. But AI PCs are neither a brand nor a spec. "The reason we have not created a category for it like Centrino is we believe this is simply what a PC will be like in four or five years time," Hallock told The Register, adding that Intel's recipe for an AI PC doesn't include specific requirements for memory, storage, or I/O speeds. "There are cases where a very large LLM might require 32GB of RAM," he noted. "Everything else will fit comfortably in a 16GB system."

AI

Jensen Huang Says Even Free AI Chips From Competitors Can't Beat Nvidia's GPUs

An anonymous reader shares a report: Nvidia CEO Jensen Huang recently took to the stage to claim that Nvidia's GPUs are "so good that even when the competitor's chips are free, it's not cheap enough." Huang further explained that Nvidia GPU pricing isn't really significant in terms of an AI data center's total cost of ownership (TCO). The impressive scale of Nvidia's achievements in powering the booming AI industry is hard to deny; the company recently became the world's third most valuable company thanks largely to its AI-accelerating GPUs, but Jensen's comments are sure to be controversial as he dismisses a whole constellation of competitors, such as AMD and Intel, as well as a range of companies building ASICs and other types of custom AI silicon.

Starting at 22:32 of the YouTube recording, John Shoven, Former Trione Director of SIEPR and the Charles R. Schwab Professor Emeritus of Economics, Stanford University, asks, "You make completely state-of-the-art chips. Is it possible that you'll face competition that claims to be good enough -- not as good as Nvidia -- but good enough and much cheaper? Is that a threat?" Jensen Huang begins his response by unpacking his tiny violin. "We have more competition than anyone on the planet," claimed the CEO. He told Shoven that even Nvidia's customers are its competitors, in some cases. Also, Huang pointed out that Nvidia actively helps customers who are designing alternative AI processors and goes as far as revealing to them what upcoming Nvidia chips are on the roadmap.
United States

How Much Energy Will New Semiconductor Factories Burn Through in the US? (theverge.com)

A new report warns that a boom in computer chip manufacturing in the US could fuel demand for dirty energy, despite companies' environmental claims. The solution for manufacturers, surprisingly, might be to act more like other big tech companies chasing climate goals. From a report: New semiconductor factories being built in the US by four of the biggest manufacturers -- Intel, TSMC, Samsung, and Micron -- could use more than twice as much electricity as the city of Seattle once they're operational. These companies claim to run on renewable energy, but according to an analysis by nonprofit Stand.earth, that's not entirely true. Semiconductors happen to make up a big chunk of a device's carbon footprint. And unless companies turn to clean energy, they could wind up driving up greenhouse gas emissions as domestic chip manufacturing makes a comeback.

The CHIPS and Science Act, which passed in 2022, set aside $52.7 billion in funding for domestic chip manufacturing. Now, the four companies scrutinized in the report have plans to build megafactories in Arizona, Ohio, Oregon, Idaho, Texas, and New York. Each of those megafactories alone could use as much electricity as a medium-sized town, according to the report. Cumulatively, nine facilities could eventually add 2.1 gigawatts in new electricity demand. "We're not slowing down on any of our sustainability commitments, even with our recently announced investments," Intel said in an email.

Social Networks

Threads' API Is Coming in June (techcrunch.com)

In 2005 Gabe Rivera was a compiler software engineer at Intel — before starting the tech-news aggregator Techmeme. And last year his Threads profile added the words "This is a little self-serving, but I want all social networks to be as open as possible."

On Friday, Threads engineer Jesse Chen posted that it was Rivera's post asking for an API, made when Threads launched, that "convinced us to go for it." And Techmeme just made its first post using the API, according to Chen. The Verge reports: Threads plans to release its API by the end of June after testing it with a limited set of partners, including Hootsuite, Sprinklr, Sprout Social, Social News Desk, and Techmeme. The API will let developers build third-party apps for Threads and allow sites to publish directly to the platform.
More from TechCrunch: Engineer Jesse Chen posted that the company has been building the API for the past few months. The API currently allows users to authenticate, publish threads and fetch the content they post through these tools. "Over the past few months, we've been building the Threads API to enable creators, developers, and brands to manage their Threads presence at scale and easily share fresh, new ideas with their communities from their favorite third-party applications," he said...

The engineer added that Threads is looking to add more capabilities to APIs for moderation and insights gathering.

Space

The Desert Planet In 'Dune' Is Plausible, According To Science (sciencenews.org)

The desert planet Arrakis in Frank Herbert's science fiction novel Dune is plausible, says Alexander Farnsworth, a climate modeler at the University of Bristol in England. According to Science News, the world would be a harsh place for humans to live, and they probably wouldn't have to worry about getting eaten by extraterrestrial helminths. From the report: For their Arrakis climate simulation, which you can explore at the website Climate Archive, Farnsworth and colleagues started with the well-known physics that drive weather and climate on Earth. Using our planet as a starting point makes sense, Farnsworth says, partly because Herbert drew inspiration for Arrakis from "some sort of semi-science of looking at dune systems on the Earth itself." The team then added nuggets of information about the planet from details in Herbert's novels and in the Dune Encyclopedia. According to that intel, the fictional planet's atmosphere is similar to Earth's with a couple of notable differences. Arrakis has less carbon dioxide in the atmosphere than Earth -- about 350 parts per million on the desert planet compared with 417 parts per million on Earth. But Dune has far more ozone in its lower atmosphere: 0.5 percent of the gases in the atmosphere compared to Earth's 0.000001 percent.

All that extra ozone is crucial for understanding the planet. Ozone is a powerful greenhouse gas, about 65 times as potent at warming the atmosphere as carbon dioxide is, when measured over a 20-year period. "Arrakis would certainly have a much warmer atmosphere, even though it has less CO2 than Earth today," Farnsworth says. In addition to warming the planet, so much ozone in the lower atmosphere could be bad news. "For humans, that would be incredibly toxic, I think, almost fatal if you were to live under such conditions," Farnsworth says. People on Arrakis would probably have to rely on technology to scrub ozone from the air. Of course, ozone in the upper atmosphere could help shield Arrakis from harmful radiation from its star, Canopus. (Canopus is a real star also known as Alpha Carinae. It's visible in the Southern Hemisphere and is the second brightest star in the sky. Unfortunately for Dune fans, it isn't known to have planets.) If Arrakis were real, it would be located about as far from Canopus as Pluto is from the sun, Farnsworth says. But Canopus is a large white star calculated to be about 7,200 degrees Celsius. "That's significantly hotter than the sun," which runs about 2,000 degrees cooler, Farnsworth says. But "there's a lot of supposition and assumptions they made in here, and whether those are accurate numbers or not, I can't say."
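Plugging the article's numbers into a crude CO2-equivalent estimate (a back-of-envelope heuristic, not a climate model; the ppm-scaling approach is an assumption for illustration) shows why Arrakis would run hot despite its lower CO2:

```python
# Figures from the article: ozone is ~65x as potent as CO2 over 20 years,
# Arrakis has 350 ppm CO2 and 0.5% ozone; Earth has 417 ppm CO2 and
# 0.000001% ozone in the lower atmosphere.
GWP20_OZONE = 65                 # 20-year warming potency relative to CO2
PPM_PER_PERCENT = 10_000         # 1% of the atmosphere = 10,000 ppm

arrakis_co2_ppm = 350
earth_co2_ppm = 417
arrakis_o3_ppm = 0.5 * PPM_PER_PERCENT        # 5,000 ppm
earth_o3_ppm = 0.000001 * PPM_PER_PERCENT     # 0.01 ppm

# Crude CO2-equivalent burden: CO2 plus ozone scaled by its potency.
arrakis_co2e = arrakis_co2_ppm + arrakis_o3_ppm * GWP20_OZONE
earth_co2e = earth_co2_ppm + earth_o3_ppm * GWP20_OZONE

print(round(arrakis_co2e))   # 325350
print(round(earth_co2e))     # 418
```

Even on this rough accounting, Arrakis carries hundreds of times Earth's greenhouse burden, dominated entirely by the ozone term.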

The climate simulation revealed that Arrakis probably wouldn't be exactly as Herbert described it. For instance, in one throwaway line, the author described polar ice caps receding in the summer heat. But Farnsworth and colleagues say it would be far too hot at the poles, about 70°C during the summer, for ice caps to exist at all. Plus, there would be too little precipitation to replenish the ice in the winter. High clouds and other processes would warm the atmosphere at the poles and keep it warmer than lower latitudes, especially in the summertime. Although Herbert's novels have people living in the midlatitudes and close to the poles, the extreme summer heat and bone-chilling -40°C to -75°C temperatures in the winters would make those regions nearly unlivable without technology, Farnsworth says. Temperatures in Arrakis' tropical latitudes would be relatively more pleasant at 45°C in the warmest months and about 15°C in colder months. On Earth, high humidity in the tropics makes it far warmer than at the poles. But on Arrakis, "most of the atmospheric moisture was essentially removed from the tropics," making even the scorching summers more tolerable. The poles are where clouds and the paltry amount of moisture gather and heat the atmosphere. But the tropics on Arrakis pose their own challenges. Hurricane-force winds would regularly sandblast inhabitants and build dunes up to 250 meters tall, the researchers calculate. It doesn't mean people couldn't live on Arrakis, just that they'd need technology and lots of off-world support to bring in food and water, Farnsworth says. "I'd say it's a very livable world, just a very inhospitable world."

Microsoft

Microsoft is Working With Nvidia, AMD and Intel To Improve Upscaling Support in PC Games (theverge.com)

Microsoft has outlined a new Windows API designed to offer a seamless way for game developers to integrate super resolution AI-upscaling features from Nvidia, AMD, and Intel. From a report: In a new blog post, program manager Joshua Tucker describes Microsoft's new DirectSR API as the "missing link" between games and super resolution technologies, and says it should provide "a smoother, more efficient experience that scales across hardware."

"This API enables multi-vendor SR [super resolution] through a common set of inputs and outputs, allowing a single code path to activate a variety of solutions including Nvidia DLSS Super Resolution, AMD FidelityFX Super Resolution, and Intel XeSS," the post reads. The pitch seems to be that developers will be able to support this DirectSR API, rather than having to write code for each and every upscaling technology.
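The "single code path" pitch can be sketched as a dispatch table (hypothetical names throughout; the real DirectSR is a Direct3D interface exposed by GPU drivers, not this Python):

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Frame:
    width: int
    height: int

# Stand-ins for vendor backends. In reality these would be driver-provided
# implementations of DLSS, FSR, and XeSS behind a common interface.
def _passthrough_upscaler(frame: Frame, w: int, h: int) -> Frame:
    return Frame(w, h)

BACKENDS: Dict[str, Callable[[Frame, int, int], Frame]] = {
    "DLSS": _passthrough_upscaler,
    "FSR": _passthrough_upscaler,
    "XeSS": _passthrough_upscaler,
}

def render_upscaled(backend_name: str, low_res: Frame) -> Frame:
    # The game ships this one code path; the runtime substitutes whichever
    # backend the installed GPU driver exposes.
    upscale = BACKENDS[backend_name]
    return upscale(low_res, 3840, 2160)
```

The design win is that adding a new vendor means registering a new backend, not patching every game.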

The blog post comes a couple of weeks after an "Automatic Super Resolution" feature was spotted in a test version of Windows 11, which promised to "use AI to make supported games play more smoothly with enhanced details." Now, it seems the feature will plug into existing super resolution technologies like DLSS, FSR, and XeSS rather than offering a Windows-level alternative.

Intel

Intel Puts 1nm Process (10A) on the Roadmap For 2027 (tomshardware.com)

Intel's previously-unannounced Intel 10A (analogous to 1nm) will enter production/development in late 2027, marking the arrival of the company's first 1nm node, and its 14A (1.4nm) node will enter production in 2026. The company is also working to create fully autonomous AI-powered fabs in the future. Tom's Hardware: Intel's Keyvan Esfarjani, the company's EVP and GM of Foundry Manufacturing and Supply, held a very insightful session that covered the company's latest developments and showed how the roadmap unfolds over the coming years. Here, we can see two charts, with the first outlining the company's K-WSPW (thousands of wafer starts per week) capacity for Intel's various process nodes. Notably, capacity typically indicates how many wafers can be started, but not the total output -- output varies based on yields. You'll notice there isn't a label for the Y-axis, which would give us a direct read on Intel's production volumes. However, this does give us a solid idea of the proportionality of Intel's planned node production over the next several years.

Intel did not specify the arrival date of its coming 14A node in its previous announcements, but here, the company indicates it will begin production of the Intel 14A node in 2026. Even more importantly, Intel will begin production/development of its as-yet-unannounced 10A node in late 2027, filling out its roster of nodes produced with EUV technology. Intel's 'A' suffix in its node naming convention represents Angstroms, and 10 Angstroms converts to 1nm, meaning this is the company's first 1nm-class node. Intel hasn't shared any details about the 10A/1nm node but has told us that it classifies a new node as at least having a double-digit power/performance improvement. Intel CEO Pat Gelsinger has told us the cutoff for a new node is around a 14% to 15% improvement, so we can expect that 10A will have at least that level of improvement over the 14A node. (For example, the difference between Intel 7 and Intel 4 was a 15% improvement.)

Robotics

Bezos, Nvidia Join OpenAI in Funding Humanoid Robot Startup (msn.com)

OpenAI, Microsoft, Nvidia, and Jeff Bezos are all part of a pack of investors in a business "developing human-like robots," reports Bloomberg, "according to people with knowledge of the situation..."

At the startup — which is named "Figure" — engineers "are working on a robot that looks and moves like a human. The company has said it hopes its machine, called Figure 01, will be able to perform dangerous jobs that are unsuitable for people and that its technology will help alleviate labor shortages." Figure is raising about $675 million in a funding round that carries a pre-money valuation of roughly $2 billion, said the people, who asked not to be identified because the matter is private. Through his firm Explore Investments LLC, Bezos has committed $100 million. Microsoft is investing $95 million, while Nvidia and an Amazon.com Inc.-affiliated fund are each providing $50 million... Other technology companies are involved as well. Intel Corp.'s venture capital arm is pouring in $25 million, and LG Innotek is providing $8.5 million. Samsung's investment group, meanwhile, committed $5 million. Backers also include venture firms Parkway Venture Capital, which is investing $100 million, and Align Ventures, which is providing $90 million...

The AI robotics industry has been busy lately. Earlier this year, OpenAI-backed Norwegian robotics startup 1X Technologies AS raised $100 million. Vancouver-based Sanctuary AI is developing a humanoid robot called Phoenix. And Tesla Inc. is working on a robot called Optimus, with Elon Musk calling it one of his most important projects. Agility Robotics, which Amazon backed in 2022, has bots in testing at one of the retailer's warehouses.
Bloomberg calls the investments in Figure "part of a scramble to find new applications for artificial intelligence."
Intel

Microsoft Will Use Intel To Manufacture Home-Grown Processor (yahoo.com)

Intel has landed Microsoft as a customer for its made-to-order chip business, marking a key win for an ambitious turnaround effort under Chief Executive Officer Pat Gelsinger. From a report: Microsoft plans to use Intel's 18A manufacturing technology to make a forthcoming chip that the software maker designed in-house, the two companies said at an event Wednesday. They didn't identify the product, but Microsoft recently announced plans for two homegrown chips: a computer processor and an artificial intelligence accelerator.

Intel has been seeking to prove it can compete in the foundry market, where companies produce custom chips for clients. It's a major shift for the semiconductor pioneer, which once had the world's most advanced chipmaking facilities and kept them to itself. These days, Intel is racing to catch up with companies like Taiwan Semiconductor Manufacturing Co., which leads the foundry industry. Microsoft, meanwhile, is looking to secure a steady supply of semiconductors to power its data-center operations -- especially as demand for AI grows. Designing its own chips also lets Microsoft fine-tune the products to its specific needs. "We need a reliable supply of the most advanced, high-performance and high-quality semiconductors," Microsoft CEO Satya Nadella said in a statement. "That's why we are so excited to work with Intel."

Intel

Intel Accused of Inflating Over 2,600 CPU Benchmark Results (pcworld.com)

An anonymous reader shared this report from PCWorld: The Standard Performance Evaluation Corporation, better known as SPEC, has invalidated over 2,600 of its own results testing Xeon processors in the 2022 and 2023 versions of its popular industrial SPEC CPU 2017 test. After investigating, SPEC found that Intel had used compilers that were, quote, "performing a compilation that specifically improves the performance of the 523.xalancbmk_r / 623.xalancbmk_s benchmarks using a priori knowledge of the SPEC code and dataset to perform a transformation that has narrow applicability."

In layman's terms, SPEC is accusing Intel of optimizing the compiler specifically for its benchmark, which means the results weren't indicative of how end users could expect to see performance in the real world. Intel's custom compiler might have been inflating the relevant results of the SPEC test by up to 9%...
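What "a priori knowledge" and "narrow applicability" mean can be sketched with a toy "compiler" (purely illustrative; this bears no resemblance to Intel's actual compiler internals). It recognizes one specific function by name and swaps in a specialized version, so the measured speedup says nothing about code it has never seen:

```python
def count_matches(text: str, needle: str) -> int:
    # Generic code path that every other program would get.
    return sum(1 for ch in text if ch == needle)

# "A priori knowledge": the compiler was tuned against this exact routine,
# analogous to optimizing specifically for xalancbmk's hot loop.
SPECIAL_CASES = {
    "count_matches": lambda text, needle: text.count(needle),
}

def compile_function(func):
    # Substitute the hand-tuned version only for the recognized name;
    # all other functions are left on the generic path.
    return SPECIAL_CASES.get(func.__name__, func)

optimized = compile_function(count_matches)
```

The specialized version is faster and produces identical answers, which is exactly why such results pass correctness checks while still failing to predict real-world performance.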

Slightly newer versions of the compilers used in the latest industrial Xeon processors, the 5th-gen Emerald Rapids series, do not use these allegedly performance-enhancing APIs. I'll point out that both the Xeon processors and the SPEC 2017 test are some high-level hardware meant for "big iron" industrial and educational applications, and aren't especially relevant for the consumer market we typically cover.

More info at ServeTheHome, Phoronix, and Tom's Hardware.
Communications

The US Government Makes a $42 Million Bet On Open Cell Networks (theverge.com)

An anonymous reader quotes a report from The Verge: The US government has committed $42 million to further the development of the 5G Open RAN (O-RAN) standard that would allow wireless providers to mix and match cellular hardware and software, opening up a bigger market for third-party equipment that's cheaper and interoperable. The National Telecommunications and Information Administration (NTIA) grant would establish a Dallas O-RAN testing center to prove the standard's viability as a way to head off Huawei's steady cruise toward a global cellular network hardware monopoly.

Verizon global network and technology president Joe Russo promoted the funding as a way to achieve "faster innovation in an open environment." To achieve the standard's goals, AT&T vice president of RAN technology Robert Soni says that AT&T and Verizon have formed the Acceleration of Compatibility and Commercialization for Open RAN Deployments Consortium (ACCoRD), which includes a grab bag of wireless technology companies like Ericsson, Nokia, Samsung, Dell, Intel, Broadcom, and Rakuten. Japanese wireless carrier Rakuten launched the first O-RAN network in 2020. The company's then-CEO, Tareq Amin, told The Verge's Nilay Patel in 2022 that Open RAN would enable low-cost network build-outs using smaller equipment rather than massive towers -- which has long been part of the promise of 5G.

But O-RAN is about more than that; establishing interoperability means companies like Verizon and AT&T wouldn't be forced to buy all of their hardware from a single company to create a functional network. For the rest of us, that means faster build-outs and "more agile networks," according to Rakuten. In the US, Dish has been working on its own O-RAN network, under the name Project Genesis. The 5G network was creaky and unreliable when former Verge staffer Mitchell Clark tried it out in Las Vegas in 2022, but the company said in June last year that it had made its goal of covering 70 percent of the US population. Dish has struggled to become the next big cell provider in the US, though -- prompting EchoStar, the satellite communications company that spun off from Dish in 2008, to purchase the company in January.
The Washington Post writes that O-RAN "is Washington's anointed champion to try to unseat the Chinese tech giant Huawei Technologies" as the world's biggest supplier of cellular infrastructure gear.

According to the Post, Biden has emphasized the importance of O-RAN in conversations with international leaders over the past few years. Additionally, it notes that Congress and the NTIA have dedicated approximately $2 billion to support the development of this standard.
Microsoft

Microsoft Working On Its Own DLSS-like Upscaler for Windows 11 (theverge.com) 42

Microsoft appears to be readying its own DLSS-like AI upscaling feature for PC games. From a report: X user PhantomOcean3 discovered the feature inside the latest test versions of Windows 11 over the weekend, with Microsoft describing its automatic super resolution as a way to "use AI to make supported games play more smoothly with enhanced details." That sounds a lot like Nvidia's Deep Learning Super Sampling (DLSS) technology, which uses AI to upscale games and improve frame rates and image quality. AMD and Intel also offer their own variants, with FSR and XeSS both growing in popularity in recent PC game releases.
AI

AI PCs To Account for Nearly 60% of All PC Shipments by 2027, IDC Says (idc.com) 70

IDC, in a press release: A new forecast from IDC shows shipments of artificial intelligence (AI) PCs -- personal computers with specific system-on-a-chip (SoC) capabilities designed to run generative AI tasks locally -- growing from nearly 50 million units in 2024 to more than 167 million in 2027. By the end of the forecast, IDC expects AI PCs will represent nearly 60% of all PC shipments worldwide. [...] Until recently, running an AI task locally on a PC was done on the central processing unit (CPU), the graphics processing unit (GPU), or a combination of the two. However, this can have a negative impact on the PC's performance and battery life because these chips are not optimized to run AI efficiently. PC silicon vendors have now introduced AI-specific silicon to their SoCs called neural processing units (NPUs) that run these tasks more efficiently.

To date, IDC has identified three types of NPU-enabled AI PCs:
1. Hardware-enabled AI PCs include an NPU that offers less than 40 tera operations per second (TOPS) performance and typically enables specific AI features within apps to run locally. Qualcomm, Apple, AMD, and Intel are all shipping chips in this category today.

2. Next-generation AI PCs include an NPU with 40 to 60 TOPS performance and an AI-first operating system (OS) that enables persistent and pervasive AI capabilities in the OS and apps. Qualcomm, AMD, and Intel have all announced future chips for this category, with delivery expected to begin in 2024. Microsoft is expected to roll out major updates (and updated system specifications) to Windows 11 to take advantage of these high-TOPS NPUs.

3. Advanced AI PCs are PCs that offer more than 60 TOPS of NPU performance. While no silicon vendors have announced such products, IDC expects them to appear in the coming years. This IDC forecast does not include advanced AI PCs, but they will be incorporated into future updates.
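IDC's three tiers above reduce to simple thresholds on NPU throughput. The sketch below encodes them; the 40- and 60-TOPS cutoffs come from the forecast text, while the function name and the requirement of a nonzero NPU rating are illustrative assumptions:

```python
def ai_pc_category(npu_tops: float) -> str:
    """Map NPU throughput (TOPS) to IDC's three AI PC tiers.

    Thresholds (40 and 60 TOPS) are taken from the IDC forecast;
    this helper itself is an illustrative sketch, not IDC's tooling.
    """
    if npu_tops <= 0:
        raise ValueError("not an NPU-enabled AI PC")
    if npu_tops < 40:
        return "hardware-enabled AI PC"   # e.g. current Qualcomm/Apple/AMD/Intel chips
    if npu_tops <= 60:
        return "next-generation AI PC"    # AI-first OS tier, chips expected from 2024
    return "advanced AI PC"               # >60 TOPS; no announced silicon yet

print(ai_pc_category(38))  # hardware-enabled AI PC
print(ai_pc_category(45))  # next-generation AI PC
print(ai_pc_category(80))  # advanced AI PC
```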
Michael Dell, commenting on X: This is correct and might be underestimating it. AI PCs are coming fast and Dell is ready.
