Intel

Intel Iris Xe Video Cards Now Shipping To OEMs: DG1 Lands In Desktops (anandtech.com)

Ryan Smith, writing at AnandTech: Following plans first unveiled last year during the launch of their DG1 GPU, Intel sends word this morning that the first Iris Xe video cards have finally begun shipping to OEMs. Based on the DG1 discrete GPU that's already being used in Intel's Iris Xe MAX laptop accelerators, the Iris Xe family of video cards is their desktop counterpart, implementing the GPU on a traditional video card. Overall, with specifications almost identical to Xe MAX, Intel is similarly positioning these cards for the entry-level market, where they are being released as OEM-only parts. As a quick refresher, the DG1 GPU is based on the same Xe-LP graphics architecture as Tiger Lake's integrated GPU. In fact, in broad terms the DG1 can be thought of as a nearly 1-to-1 discrete version of that iGPU, containing the same 96 EUs and 128-bit LPDDR4X memory interface as Tiger Lake itself. Consequently, while DG1 is a big first step for Intel -- marking the launch of their first discrete GPU of the modern era -- the company is planning very modestly for this generation of parts. The first DG1 GPUs were shipped in the fall as part of Intel's Iris Xe MAX graphics solution for laptops. At the time, Intel also indicated that a desktop card for OEMs would be coming in 2021, and now, right on schedule, those desktop cards have begun shipping out. Further reading: Intel's Iris Xe DG1 Graphics Cards Not Compatible with AMD, Older Systems.
Intel

Intel Has To Be Better Than 'Lifestyle Company' Apple at Making CPUs, Says New CEO (theverge.com)

Intel's new CEO, Pat Gelsinger, doesn't start his new role until February, but he's already prepping the company to take on Apple's M1 chips. From a report: The Oregonian, a local newspaper in Oregon where Intel maintains a large presence, reports that the chip maker held an all-hands company meeting yesterday, and Gelsinger attended. "We have to deliver better products to the PC ecosystem than any possible thing that a lifestyle company in Cupertino" makes, Gelsinger reportedly told Intel employees. "We have to be that good, in the future." Intel has been facing increased competition from both Apple and AMD recently. Apple announced its transition to its own silicon back in June, calling it a "historic day for the Mac." The transition has gone well, with M1-based Macs providing impressive performance and battery life compared to existing Intel-based Macs.
Businesses

Qualcomm To Acquire NUVIA: A CPU Magnitude Shift (anandtech.com)

Today, Qualcomm announced that it will be acquiring NUVIA for $1.4 billion -- the start-up of industry veterans who were originally behind the creation of Apple's high-performance CPU cores. AnandTech reports: NUVIA was originally founded in February 2019 and came out of stealth mode in November of that year. The start-up was founded by industry veterans Gerard Williams III, John Bruno and Manu Gulati, who have extensive industry experience at Google, Apple, Arm, Broadcom and AMD. Gerard Williams III in particular was a chief architect at Apple for over a decade, having been the lead architect on all of Apple's CPU designs up to the Lightning core in the A13 -- with the newer Apple A14 and Apple M1 Firestorm cores possibly also having been in the pipeline under his direction.

NUVIA had been able to recruit a lot of top talent from CPU design teams across the industry, and had planned to enter the high-performance computing and enterprise market with a new server SoC built around a new CPU core dubbed "Phoenix." NUVIA in particular had made aggressive claims about how its design would significantly outperform the competition in both raw performance and power efficiency once it came to market. Such claims are usually to be taken with scepticism; however, because the members of the design team have proven themselves in the form of Apple's very successful CPU microarchitectures, they carry a lot more weight and credibility than those of other start-ups.

Qualcomm's acquisition of NUVIA gives it the chance to take advantage of the start-up's early work in the server space, possibly reinvigorating the company's ambitions there and giving it a second shot at the market. It's to be noted, however, that today's press release about the acquisition made no mention of server or enterprise plans. Furthermore, the move also has larger repercussions in the consumer space, with Qualcomm saying that NUVIA CPU designs are expected to be deployed in flagship mobile SoCs and next-generation laptops, as well as in other applications such as digital cockpits and ADAS (advanced driver-assistance systems).

Intel

Intel CEO Bob Swan To Step Down in February, VMware CEO Pat Gelsinger To Replace Him (cnbc.com)

Intel CEO Bob Swan is set to step down effective Feb. 15. From a report: VMware CEO Pat Gelsinger will take over the position, sources told CNBC. Intel's stock was up about 13% in premarket trading following the news. VMware's stock was down nearly 5%. Swan was named CEO in January 2019 after serving as interim CEO for seven months. During Swan's tenure, Intel has suffered blows from competitors. Over the summer, Intel reported that its latest generation chips would be delayed while AMD's were already shipping inside laptops. Apple announced in the fall that it would use its own proprietary chips in its Mac computers, breaking a 15-year partnership with Intel for its chip supplies.
AMD

AMD Shows Off Impressive Ryzen 5000 Mobile Processors and 3rd Gen Epyc Server Chips (venturebeat.com)

Advanced Micro Devices showed off some impressive Ryzen 5000 mobile processors today and teased the performance of its 3rd Gen Epyc server chips. From a report: Those chips are aimed at keeping AMD's performance lead over its rival Intel in the mobile and server markets. AMD CEO Lisa Su showed off the new chips in a keynote speech at CES 2021, the online-only tech trade show. AMD is launching its Ryzen 5000 Series mobile processors for gaming laptops and thin-and-light notebooks. These eight-core x86 chips are built with a 7-nanometer manufacturing process (where key circuit features measure on the order of 7 billionths of a meter). They are also based on the Zen 3 design for processor cores, which can process instructions 19% faster per clock cycle than Zen 2 cores.

The H-Series focuses on top performance in laptops for gamers and content creators, while the U-Series focuses on thin-and-light notebooks with great battery life. The chips have four to eight cores and they range in power consumption from 15 watts to 45 watts. AMD said the 5000 Series will be available in PCs in February, and we'll see more than 150 systems using it. That compares to 100 systems for the Ryzen 4000 Series and 70 for the Ryzen 3000.

Intel

Intel Unveils New Core H-Series Laptop and 11th Gen Desktop Processors At CES 2021 (hothardware.com)

MojoKid writes: At its virtual CES 2021 event today, Intel's EVP Gregory Bryant unveiled an array of new processors and technologies targeting virtually every market, from affordable Chromebooks to enthusiast-class gaming laptops and high-end desktops. Intel's 11th Gen Core vPro platform was announced, featuring new Intel Hardware Shield technology with AI-enabled detection of ransomware and crypto-mining malware threats. In addition, the Intel Rocket Lake-S based Core i9-11900K 8-core CPU was revealed, offering up to a 19% improvement in IPC performance and the ability to out-pace AMD's Ryzen 9 5900X 12-core CPU in some workloads like gaming. Also, a new high-end hybrid processor, code-named Alder Lake, was previewed. Alder Lake packs both high-performance cores and high-efficiency cores on a single product, for what Intel calls its "most power-scalable system-on-chip" ever. Alder Lake will also be manufactured using an enhanced version of 10nm SuperFin technology with improved power and thermal characteristics, and targets both desktop and mobile form factors when it arrives later this year.

Finally, Intel launched its new 11th Gen Core H-Series Tiger Lake H35 parts that will appear in high-performance laptops as thin as 16mm. At the top of the 11th Gen H-Series stack is the Intel Core i7-11375H Special Edition, a 35W quad-core processor (8-threads) that turbos up to 5GHz and supports PCI Express 4.0, and is targeted for ultraportable gaming notebooks. Intel is claiming single-threaded performance improvements in the neighborhood of 15% over previous-gen architectures and a greater than 40% improvement in multi-threaded workloads. Intel's Bryant also announced an 8-core mobile processor variant leveraging the same architecture as the 11th Gen H-Series that is slated to start shipping a bit later this quarter at 5GHz on multiple cores, with 20 lanes of PCIe Gen 4 connectivity.

Hardware

Graphics Cards Are About To Get a Lot More Expensive, Asus Warns (pcworld.com)

Ever since Nvidia's GeForce RTX 30-series and AMD's Radeon RX 6000-series graphics cards launched last fall, the overwhelming demand and tight supply, exacerbated by a cryptocurrency boom, has caused prices for all graphics cards to go nuts. Brace yourself: It looks like it's about to get even worse. From a report: In the Asus DIY PC Facebook group, Asus technical marketing manager Juan Jose Guerrero III warned that prices for the company's components will increase in the new year. "We have an announcement in regards to MSRP price changes that are effective in early 2021 for our award-winning series of graphic cards and motherboards," Guerrero wrote, though he warned that "additional models" may wind up receiving price increases as well. "Our new MSRP reflects increases in cost for components, operating costs, and logistical activities plus a continuation of import tariffs. We worked closely with our supply and logistic partners to minimize price increases. ASUS greatly appreciates your continued business and support as we navigate through this time of unprecedented market change."
Intel

Linus Torvalds Rails At Intel For 'Killing' the ECC Industry (theregister.com)

An anonymous reader quotes a report from The Register: Linux creator Linus Torvalds has accused Intel of preventing widespread use of error-correcting memory and being "instrumental in killing the whole ECC industry with its horribly bad market segmentation." ECC stands for error-correcting code. ECC memory uses additional parity bits to verify that the data read from memory is the same as the data that was written. Without this check, memory is vulnerable to occasional corruption where a bit is flipped spontaneously, for example, by background radiation. Memory can also be attacked using a technique called Rowhammer, where rapid repeated reads of the same memory locations can cause adjacent locations to change their state. ECC memory solves these problems and has been available for over 50 years, yet most personal computers do not use it. Cost is a factor, but what riles Torvalds is that Intel has made ECC support a feature of its Xeon range, aimed at servers and high-end workstations, and does not support it in other ranges such as the Core series.
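The parity-bit idea behind ECC can be illustrated with a toy Hamming(7,4) code. This is a simplified sketch for illustration only: real ECC DIMMs apply a wider SECDED (single-error-correct, double-error-detect) code across 64-bit words, but the detect-and-correct principle is the same.

```python
# Toy Hamming(7,4) code: 4 data bits protected by 3 parity bits.
# A single flipped bit -- e.g. from background radiation or Rowhammer --
# is located by the parity "syndrome" and corrected.

def hamming74_encode(nibble):
    """Encode 4 data bits (list of 0/1) into a 7-bit codeword."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4          # covers bit positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers bit positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers bit positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

def hamming74_correct(code):
    """Recompute parity; the syndrome is the 1-based position of the error."""
    c = code[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 * 1 + s2 * 2 + s3 * 4
    if syndrome:
        c[syndrome - 1] ^= 1              # flip the bad bit back
    return [c[2], c[4], c[5], c[6]], syndrome

data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[4] ^= 1                              # simulate a spontaneous bit flip
decoded, pos = hamming74_correct(word)
print(decoded, pos)                       # -> [1, 0, 1, 1] 5
```

Without the parity bits, the flip would silently corrupt the stored value; with them, the data is recovered and the error's position is known.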

The topic came up in a discussion about AMD's new Zen 3 Ryzen 9 5000 series processors on the Real World Tech forum site. AMD has semi-official ECC support in most of its processors. "I don't really see AMD's unofficial ECC support being a big deal," said an unwary contributor. "ECC absolutely matters," retorted Torvalds. "Intel has been detrimental to the whole industry and to users because of their bad and misguided policies wrt ECC. Seriously. And if you don't believe me, then just look at multiple generations of rowhammer, where each time Intel and memory manufacturers bleated about how it's going to be fixed next time... And yes, that was -- again -- entirely about the misguided and arse-backwards policy of 'consumers don't need ECC', which made the market for ECC memory go away."

The accusation is significant particularly at a time when security issues are high on the agenda. The suggestion is that Intel's marketing decisions have held back adoption of a technology that makes users more secure -- though rowhammer is only one of many potential attack mechanisms -- as well as making PCs more stable. "The arguments against ECC were always complete and utter garbage. Now even the memory manufacturers are starting to do ECC internally because they finally owned up to the fact that they absolutely have to," said Torvalds. Torvalds said that Xeon prices deterred usage. "I used to look at the Xeon CPU's, and I could never really make the math work. The Intel math was basically that you get twice the CPU for five times the price. So for my personal workstations, I ended up using Intel consumer CPU's." Prices, he said, dropped last year "because of Ryzen and Threadripper... but it was a 'too little, much too late' situation." By way of mitigation, he added that "apart from their ECC stance I was perfectly happy with [Intel's] consumer offerings."

AMD

Xbox Series X and S Shortages Have Microsoft Asking AMD for Help (gizmodo.com)

Supply issues have hamstrung the rollout of the latest generation of video game consoles. Even now, nearly two months after the Xbox Series X and Xbox Series S released, Microsoft is still scrambling to meet demand and has reportedly reached out to chipmaker AMD to fast-track production on its end. From a report: AMD manufactures the GPU and CPU for both consoles, so if it's able to push out its chips faster, Microsoft could, in theory, churn out more consoles by extension. As spotted by VGC, Microsoft is "working as hard as we can" to pump out more systems and has even contacted AMD for help, according to Xbox head Phil Spencer in a recent appearance on the Major Nelson Radio podcast hosted by Xbox Live director of programming Larry Hryb. "I get some people [asking], 'why didn't you build more? Why didn't you start earlier? Why didn't you ship them earlier?' I mean, all of those things," Spencer said. "It's really just down to physics and engineering. We're not holding them back: We're building them as fast as we can. We have all the assembly lines going. I was on the phone last week with [CEO and president] Lisa Su at AMD [asking], 'How do we get more? How do we get more?' So it's something that we're constantly working on."
AMD

Speculation Grows As AMD Files Patent for GPU Design (hothardware.com)

Long-time Slashdot reader UnknowingFool writes: AMD filed a patent on using chiplets for a GPU, with hints at why it has waited this long to extend its CPU strategy to GPUs. The latency between chiplets poses more of a performance problem for GPUs, and AMD is attempting to solve it with a new interconnect called a high-bandwidth passive crosslink. This new interconnect will allow the GPU chiplets to communicate more effectively with one another and with the CPU.
"With NVIDIA working on its own MCM design with Hopper architecture, it's about time that we left monolithic GPU designs in the past and enable truly exponential performance growth," argues Wccftech.

And Hot Hardware delves into the details, calling it a "hybrid CPU-FPGA design that could be enabled by Xilinx tech." While they often aren't as great as CPUs on their own, FPGAs can do a wonderful job accelerating specific tasks... [A]n FPGA in the hands of a capable engineer can offload a wide variety of tasks from a CPU and speed processes along. Intel has talked a big game about integrating Xeons with FPGAs over the last six years, but it hasn't resulted in a single product hitting its lineup. A new patent by AMD, though, could mean that the FPGA newcomer might be ready to make one of its own...

AMD made 20 claims in its patent application, but the gist is that a processor can include one or more execution units that can be programmed to handle different types of custom instruction sets. That's exactly what an FPGA does...
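As a very loose software analogy for such programmable execution units, picture a dispatch table that can be "reconfigured" at runtime the way an FPGA fabric is reconfigured for a new custom instruction. Everything below is a hypothetical illustration, not a reflection of the patent's actual design.

```python
# Loose software analogy for a programmable execution unit: a tiny
# interpreter whose opcode table can be reprogrammed at runtime,
# much as an FPGA can be reconfigured for a new custom instruction.
# (Hypothetical sketch; AMD's design is described only in the patent.)

class ProgrammableUnit:
    def __init__(self):
        self.ops = {}                         # opcode -> implementation

    def program(self, opcode, fn):
        """'Reconfigure the fabric': install a custom instruction."""
        self.ops[opcode] = fn

    def execute(self, opcode, *args):
        if opcode not in self.ops:
            raise ValueError(f"illegal instruction: {opcode}")
        return self.ops[opcode](*args)

unit = ProgrammableUnit()
unit.program("MAC", lambda a, b, acc: acc + a * b)   # multiply-accumulate, an AI staple
mac_result = unit.execute("MAC", 3, 4, 10)
print(mac_result)                                    # -> 22

# Later, reprogram the same unit for a different workload:
unit.program("POPCNT", lambda x: bin(x).count("1"))
pop_result = unit.execute("POPCNT", 0b1011)
print(pop_result)                                    # -> 3
```

The point of the analogy is flexibility: the same "unit" accelerates different instruction sets depending on how it has been programmed, which is what distinguishes an FPGA from fixed-function silicon.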

AMD has been working on different ways to speed up AI calculations for years. First the company announced and released the Radeon Instinct series of AI accelerators, which were just big headless Radeon graphics processors with custom drivers. The company doubled down on that with the release of the MI60, its first 7nm GPU, ahead of the Radeon RX 5000 series launch in 2018. A shift to focusing on AI via FPGAs after the Xilinx acquisition makes sense, and we're excited to see what the company comes up with.

Bug

'Cyberpunk 2077' Players Are Fixing Parts of the Game Before CD Projekt (vice.com)

Cyberpunk 2077 is here in all its glory and pain. On some machines, it's a visual spectacle pushing the limits of current technology and delivering on the promise of Deus Ex, but open world. On other machines, including last-gen consoles, it's an unoptimized and barely playable nightmare. Developer CD Projekt Red has said it's working to improve the game, but fans already have a number of fixes, particularly if you're using an AMD CPU. From a report: Fans aren't waiting for the developer, however, and over the weekend AMD CPU users discovered that a few small tweaks could improve performance on their PCs. Some players reported performance gains of as much as 60 percent. Cyberpunk 2077 seems to be a CPU-intensive game and, at release, it isn't properly optimized for AMD chips. "If you run the game on an AMD CPU and check your usage in task manager, it seems to utilise 4 (logical, 2 physical) cores in frequent bursts up to 100% usage, whereas the rest of the physical cores sit around 40-60%, and their logical counterparts remain idle," Redditor BramblexD explained in a post on the /r/AMD subreddit. Basically, Cyberpunk 2077 is only utilizing a portion of any AMD chip's power.

Digital Foundry, a YouTube channel that does in-depth technical analysis of video games, noticed the AMD issue as well. "It really looks like Cyberpunk is not properly using the hyperthreads on Ryzen CPUs," Digital Foundry said in a recent video. To fix this issue, the community has developed three separate solutions. One involves altering the game's executable with a hex editor, the second involves editing a config file, and the third is an unofficial patch built by the community. All three do the same thing -- unleash the full power of AMD's processors. "Holy shit are you a wizard or something? The game is finally playable now!" one redditor said of the hex editing technique. "With this tweak my CPU usage went from 50% to ~75% and my frametime is so much more stable now."
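The hex-editing route boils down to locating a byte pattern in the executable and overwriting it in place. The sketch below demonstrates that general technique against a throwaway file; the byte patterns here are invented for illustration and are not Cyberpunk 2077's actual bytes -- and as always, patch a backup copy, never the original.

```python
# Generic binary search-and-replace, the mechanism behind "hex edit" fixes.
# The pattern and replacement below are made-up example bytes, NOT the
# real Cyberpunk 2077 patch. Only ever modify a backup of a binary.

from pathlib import Path

def hex_patch(path, old: bytes, new: bytes) -> int:
    """Replace the first occurrence of `old` with same-length `new` in a file."""
    assert len(old) == len(new), "patch must not change the file size"
    data = Path(path).read_bytes()
    offset = data.find(old)
    if offset < 0:
        raise ValueError("pattern not found -- wrong binary version?")
    Path(path).write_bytes(data[:offset] + new + data[offset + len(old):])
    return offset

# Demo against a dummy file standing in for the game executable:
demo = Path("dummy.exe")
demo.write_bytes(b"\x90\x90\x75\x30\x33\xc9\x90\x90")
off = hex_patch(demo, b"\x75\x30", b"\xeb\x30")   # e.g. turn a JNZ into a JMP
print(off, demo.read_bytes().hex())               # -> 2 9090eb3033c99090
```

Matching on a byte pattern rather than a fixed offset is why such community patches only work on the exact binary version they were written against.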

Hardware

'This Is a Bad Time to Build a High-End Gaming PC' (extremetech.com)

Joel Hruska, writing at ExtremeTech: It's not just a question of whether top-end hardware is available, but whether midrange and last-gen hardware is selling at reasonable prices. If you want to go AMD, be aware that Ryzen 5000 CPUs are hard to find and the 6800 and 6800 XT are vanishingly rare. The upper-range Ryzen 3000 CPUs available on Amazon and Newegg are also selling for well above their prices six months ago. If you want to build an Intel system, the situation is a little different. A number of the 9th and 10th-gen chips are actually priced at MSRP and not too hard to find. The Core i7-9700K has fallen to $269, for example, and it's still one of Intel's fastest gaming CPUs. At that price, paired with a Z370 motherboard, you could build a gaming-focused system, so long as you don't actually need a new high-end GPU. The Core i7-10700K is $359, which isn't quite as competitive, but it squares off reasonably well against chips like the 3700X at $325. Amazon and Newegg both report the 3600X selling for more, at $400 and $345, respectively.

But even if these prices are appealing, the current GPU market makes building a gaming system much above lower-midrange to midrange a non-starter. Radeon 6000 GPUs and RTX 3000 GPUs are both almost impossible to find, and the older, slower, and less feature-rich cards that you can buy are almost all selling for more today than they were six months ago. Not every GPU has been kicked into the stratosphere, but between the cards you can't buy and the cards you shouldn't buy, there's a limited number of deals currently on the market. Your best bet is to set up price alerts on specific SKUs you are watching with the vendor in question. There is some limited good news, though: DRAM and SSDs are both still reasonably priced. DRAM and SSD prices are both expected to decline 10-15 percent through Q4 2020 compared with the previous quarter, and there are good deals to be had on both. [...] Power supply prices look reasonable, too, and motherboard availability looks solid. If you don't need to buy a GPU right now and you're willing to or prefer to use Intel, there's a more reasonable case to be made for building a system. But if you need a high-end GPU and/or want a high-end Ryzen chip to go with it, you may be better off shopping prebuilt systems or waiting a few more months.

PlayStation (Games)

Is Sony Developing a Dual-GPU PS5 Pro? (collider.com)

According to a Sony patent spotted by T3, the console maker may be working on a new PlayStation 5 with two graphics cards. From the report: The patent describes a "scalable game console" where "a second GPU [is] communicatively coupled to the first GPU" and where the system is for "home console and cloud gaming" usage. To us here at T3 that suggests a next-gen PlayStation console, most likely a new PS5 Pro flagship, supercharged with two graphics cards instead of just one. These would both come in the APU (accelerated processing unit) format that the PlayStation 5's system-on-a-chip (SoC) does, with two custom-made AMD APUs working together to deliver enhanced gaming performance and cloud streaming.

The official Sony patent notes that "plural SoCs may be used to provide a 'high-end' version of the console with greater processing and storage capability," while "the 'high end' system can also contain more memory such as random-access memory (RAM) and other features and may also be used for a cloud-optimized version using the same game console chip with more performance." And, with the PlayStation 5 console only marginally weaker on paper than the Xbox Series X (the PS5 delivers 10.28 teraflops compared to the Xbox Series X's 12 teraflops), a new PS5 Pro console that comes with two APUs rather than one, improving local gaming performance as well as cloud gaming, would no doubt be a death blow to the Xbox Series X's claim to the next-gen console crown.
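For context, those teraflops figures follow from simple arithmetic: shader ALUs times two FLOPs per cycle (a fused multiply-add counts as two operations) times clock speed. The CU counts and clocks in this sketch come from the consoles' public spec sheets, not from the patent.

```python
# Reproducing the quoted console teraflops figures.
# FP32 FLOPS = shader ALUs x 2 (FMA = 2 ops) x boost clock.
# CU counts and clocks below are from the public spec sheets.

def tflops(compute_units, clock_ghz, shaders_per_cu=64):
    """Peak FP32 throughput in teraflops for an RDNA-style GPU."""
    return compute_units * shaders_per_cu * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)     # PS5: 36 CUs at up to 2.23 GHz
xsx = tflops(52, 1.825)    # Series X: 52 CUs at 1.825 GHz
print(f"PS5 {ps5:.2f} TF, Series X {xsx:.2f} TF")   # -> PS5 10.28 TF, Series X 12.15 TF
```

A hypothetical dual-APU PS5 Pro would not simply double that 10.28 TF figure in practice, which is why the patent's chiplet-coordination details matter more than the raw arithmetic.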

The cloud gaming part of the patent is particularly interesting, too, as it seems to suggest that this technology could find itself not just in a new flagship PS5 Pro console, but also in more streamlined cloud-based hardware. An upgraded PS5 Digital Edition seems a smart bet, as does the much-rumored PSP 5G. [...] Will we see a PS5 Pro anytime soon? Here at T3 we think absolutely not -- we imagine we'll get at least two straight years of the PS5 before we see anything at all. As for a cloud-based next-gen PSP 5G, though...

Hardware

NVIDIA Launches GeForce RTX 3060 Ti, Sets a New Gaming Performance Bar At $399 (hothardware.com)

MojoKid writes: NVIDIA expanded its line-up of Ampere-based graphics cards today with a new lower cost GeForce RTX 3060 Ti. As its name suggests, the new $399 NVIDIA GPU supplants the previous-gen GeForce RTX 2060 / RTX 2060 Super, and slots in just behind the recently-released GeForce RTX 3070. The GeForce RTX 3060 Ti features 128 CUDA cores per SM, for a total of 4,864, 4 Third-Gen Tensor cores per SM (152 total), and 38 Second-Gen RT cores. The GPU has a typical boost clock of 1,665MHz and it is linked to 8GB of standard GDDR6 memory (not the GDDR6X of the RTX 3080/3090) via a 256-bit memory interface that offers up to 448GB/s of peak bandwidth. In terms of overall performance, the RTX 3060 Ti lands in the neighborhood of the GeForce RTX 2080 Super, and well ahead of cards like AMD's Radeon RX 5700 XT. The GeForce RTX 3060 Ti's 8GB frame buffer may give some users pause, but for 1080p and 1440p gaming, it shouldn't be a problem for the overwhelming majority of titles. It's also par for the course in this $399 price band. Cards are reported to be shipping in retail tomorrow.
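The quoted 448GB/s figure falls out of simple arithmetic: bus width in bytes times per-pin data rate. The 14Gbps GDDR6 rate used below is an assumption consistent with the article's number rather than something stated in it.

```python
# Sanity-checking the 448 GB/s peak bandwidth claim:
# peak GB/s = (bus width / 8) bytes per transfer x per-pin data rate.
# The 14 Gbps GDDR6 rate is assumed; the article gives only the result.

def peak_bandwidth_gbps(bus_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_bits / 8 * data_rate_gbps

rtx_3060_ti = peak_bandwidth_gbps(256, 14)   # 256-bit bus, 14 Gbps GDDR6
print(rtx_3060_ti)                           # -> 448.0
```

The same formula explains the RTX 3080's higher figure: a wider bus plus faster GDDR6X raises both factors.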
Graphics

Radeon RX 6800 and 6800 XT Performance Marks AMD's Return To High-End Graphics (hothardware.com)

MojoKid writes: AMD officially launched its Radeon RX 6800 and Radeon RX 6800 XT graphics cards today, previously known in the PC gaming community as Big Navi. The company claimed these high-end GPUs would compete with NVIDIA's best GeForce RTX 30 series, and it appears AMD made good on its claims. AMD's new Radeon RX 6800 XT and Radeon RX 6800 are based on the company's RDNA 2 GPU architecture, with the former sporting 72 Compute Units (CUs) and a 2250MHz boost clock, while the RX 6800 sports 60 CUs at a 2105MHz boost clock. Both cards come equipped with 16GB of GDDR6 memory and 128MB of on-die cache AMD calls Infinity Cache, which improves bandwidth and latency by sitting in front of the 256-bit GDDR6 memory interface and feeding the GPU.

In the benchmarks, it is fair to say the Radeon RX 6800 is typically faster than an NVIDIA GeForce RTX 3070, just as AMD suggested. Things are not as cut and dried for the Radeon RX 6800 XT, though, as the GeForce RTX 3080 and Radeon RX 6800 XT trade victories depending on the game title or workload, but the RTX 3080 has an edge overall. In DXR ray tracing performance, NVIDIA has a distinct advantage at the high end. Though the Radeon RX 6800 wasn't too far behind an RTX 3070, neither the Radeon RX 6800 XT nor the 6800 came close to the GeForce RTX 3080. Pricing is set at $649 and $579 for the AMD Radeon RX 6800 XT and Radeon RX 6800, respectively, and the cards are on sale as of today. However, demand is likely to be fierce, as this new crop of high-end graphics cards from both companies has been quickly evaporating from retail shelves.

Desktops (Apple)

Apple's M1 Is Exceeding Expectations (extremetech.com)

Reviews are starting to pour in of Apple's MacBook Pro, MacBook Air and Mac Mini featuring the new M1 ARM-based processor -- and they're overwhelmingly positive. "As with the Air, the Pro's performance exceeds expectations," writes Nilay Patel via The Verge.

"Apple's next chapter offers strong performance gains, great battery and starts at $999," says Brian Heater via TechCrunch.

"When Apple said it would start producing Macs with its own system-on-chip processors, custom CPU and GPU silicon (and a bunch of other stuff) to replace parts from Intel and AMD, we figured it would be good. I never expected it would be this good," says Jason Cross in his review of the MacBook Air M1.

"The M1 is a serious, serious contender for one of the all-time most efficient and highest-performing architectures we've ever seen deploy," says ExtremeTech's Joel Hruska.

"Spending a few days with the 2020 Mac mini has shown me that it's a barnburner of a miniature desktop PC," writes Chris Welch via The Verge. "It outperforms most Intel Macs in several benchmarks, runs apps reliably, and offers a fantastic day-to-day experience whether you're using it for web browsing and email or for creative editing and professional work. That potential will only grow when Apple inevitably raises the RAM ceiling and (hopefully) brings back those missing USB ports..."

"Quibbling about massively parallel workloads -- which the M1 wasn't designed for -- aside, Apple has clearly broken the ice on high-performance ARM desktop and laptop designs," writes Jim Salter via Ars Technica. "Yes, you can build an ARM system that competes strongly with x86, even at very high performance levels."

"The M1-equipped MacBook Air now packs far better performance than its predecessors, rivaling at times the M1-based MacBook Pro. At $999, it's the best value among macOS laptops," concludes PCMag.

"For developers, the Apple Silicon Macs also represent the very first full-fledged Arm machines on the market that have few-to-no compromises. This is a massive boost not just for Apple, but for the larger Arm ecosystem and the growing Arm cloud-computing business," writes Andrei Frumusanu via AnandTech. "Overall, Apple hit it out of the park with the M1."

AMD

Microsoft Reveals Pluton, a Custom Security Chip Built Into Intel, AMD, and Qualcomm Processors (techcrunch.com)

An anonymous reader shares a report: For the past two years, some of the world's biggest chip makers have battled a series of hardware flaws, like Meltdown and Spectre, which made it possible -- though not easy -- to pluck passwords and other sensitive secrets directly from their processors. The chip makers rolled out patches, but the flaws forced the companies to rethink how they approach chip security. Now, Microsoft thinks it has the answer with its new security chip, which it calls Pluton. The chip, announced today, is the brainchild of a partnership between Microsoft and chip makers Intel, AMD, and Qualcomm. Pluton acts as a hardware root of trust, which in simple terms protects a device's hardware from tampering, such as from hardware implants or by hackers exploiting flaws in the device's low-level firmware. By integrating the chip inside future Intel, AMD, and Qualcomm central processing units, or CPUs, it makes it far more difficult for hackers with physical access to a computer to launch hardware attacks and extract sensitive data, the companies said. "The Microsoft Pluton design will create a much tighter integration between the hardware and the Windows operating system at the CPU that will reduce the available attack surface," said David Weston, director of enterprise and operating system security at Microsoft.
Programming

Why Apple Silicon Needs an Open Source Fortran Compiler (walkingrandomly.com)

"Earlier this week Apple announced their new, ARM-based 'Apple Silicon' machines to the world in a slick marketing event that had many of us reaching for our credit cards," writes Mike Croucher, technical evangelist at The Numerical Algorithms Group.

"Simultaneously, The Numerical Algorithms Group announced that they had ported their Fortran Compiler to the new platform. At the time of writing this is the only Fortran compiler publicly available for Apple Silicon although that will likely change soon as open source Fortran compilers get updated."

An anonymous Slashdot reader offers this analysis: Apple Silicon currently has no open source Fortran compiler and Apple themselves are one of the few silicon manufacturers who don't have their own Fortran compiler. You could be forgiven for thinking that this doesn't matter to most users... if it wasn't for the fact that sizeable percentages of foundational data science platforms such as R and SciPy are written in Fortran.
Croucher argues that "More modern systems, such as R, make direct use of a lot of this code because it is highly performant and, perhaps more importantly, has been battle tested in production for decades. Numerical computing is hard (even when all of your instincts suggest otherwise) and when someone demonstrably does it right, it makes good sense to reuse rather than reinvent..."

"The community needs and will demand open source (or at least free) Fortran compilers if data scientists are ever going to realise the full potential of Apple's new hardware and I have no doubt that these are on the way. Other major silicon providers (e.g. Intel, AMD, NEC and NVIDIA/PGI) have their own Fortran compiler that co-exist with the open ones. Perhaps Apple should join the club..."
AMD

AMD Ryzen 5000 Series Processors Set a New Performance Bar Over Intel (hothardware.com)

MojoKid writes: AMD made bold claims when the company unveiled its new Zen 3-based Ryzen 5000 series processors early last month. Statements like "historic IPC uplift" and "fastest for gamers" were waved about like flags of victory. However, as with most things in the computing world, independent testing is always the best way to validate claims. Today AMD lifted the embargo on 3rd-party reviews and, in testing, AMD's new Ryzen 5000 series CPUs set a new performance bar virtually across the board, and one that Intel currently can't touch. There are four processors in the initial Ryzen 5000 series lineup, though it's a safe bet more will be coming later. The current entry point is the Ryzen 5 5600X 6-core / 12-thread processor, followed by the 8-core / 16-thread Ryzen 7 5800X, 12-core / 24-thread Ryzen 9 5900X, and the flagship 16-core / 32-thread Ryzen 9 5950X. All of these new CPUs are backwards compatible with AMD socket AM4 motherboards. In comparison to Zen 2, Zen 3 has a larger L1 branch target buffer and improved bandwidth through multiple parts of its pipeline with additional load/store flexibility. Where Zen 2 could handle 2 loads and 1 store per cycle, Zen 3 can handle 3 loads and 2 stores. All told, AMD is claiming an average 19% increase in IPC with Zen 3, which is a huge uplift gen-over-gen. Couple that IPC uplift with stronger multi-core scaling and a new unified L3 cache configuration, and Zen 3's performance looks great across a wide variety of workloads, for both content creation and gaming especially. AMD's Ryzen 9 5950X, Ryzen 9 5900X, Ryzen 7 5800X and Ryzen 5 5600X will be priced at $799, $549, $449 and $299, respectively, and should be on retail and etail shelves starting today.
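What a "19% IPC uplift" means in practice is straightforward arithmetic: at the same clock speed, instruction throughput scales directly with instructions per cycle. The numbers in this sketch are normalized illustrations, not measured Zen 2 or Zen 3 IPC values.

```python
# Illustrating the claimed 19% IPC uplift: throughput = IPC x clock,
# so at equal clocks a 19% IPC gain is a 19% throughput gain.
# IPC values here are normalized for illustration, not measurements.

def instr_per_second(ipc, clock_ghz):
    """Instruction throughput for a given IPC and clock frequency."""
    return ipc * clock_ghz * 1e9

zen2 = instr_per_second(1.00, 4.6)   # normalized Zen 2 baseline
zen3 = instr_per_second(1.19, 4.6)   # +19% IPC at the same 4.6 GHz clock
print(f"{zen3 / zen2 - 1:.0%} more instructions per second")   # -> 19% ...
```

This is also why IPC gains compound with any clock-speed gains: both factors multiply into the final single-threaded performance.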
Linux

SiFive Unveils Plan For Linux PCs With RISC-V Processors (venturebeat.com) 42

SiFive today announced it is creating a platform for Linux-based personal computers based on RISC-V processors. VentureBeat reports: Assuming customers adopt the processors and use them in PCs, the move might be part of a plan to create Linux-based PCs that use royalty-free processors. This could be seen as a challenge to computers based on designs from Intel, Advanced Micro Devices, Apple, or Arm, but giants of the industry don't have to cower just yet. The San Mateo, California-based company unveiled HiFive Unmatched, a development design for a Linux-based PC that uses its RISC-V processors. At the moment, these development PCs are early alternatives, most likely targeted at hobbyists and engineers who may snap them up when they become available in the fourth quarter for $665.

The SiFive HiFive Unmatched board will have a SiFive processor, dubbed the SiFive FU740 SoC, a 5-core processor with four SiFive U74 cores and one SiFive S7 core. The U-series cores are Linux-capable 64-bit application processor cores implementing RISC-V. These cores can be mixed and matched with other SiFive cores, as they are in the SiFive FU740. These components are all leveraging SiFive's existing intellectual property portfolio. The HiFive Unmatched board comes in the mini-ITX standard form factor to make it easy to build a RISC-V PC. SiFive also added some standard industry connectors -- ATX power supplies, PCI-Express expansion, Gigabit Ethernet, and USB ports are present on a single-board RISC-V development system.

The HiFive Unmatched board includes 8GB of DDR4 memory, 32MB of QSPI flash memory, and a microSD card slot on the motherboard. For debugging and monitoring, developers can access the console output of the board through the built-in microUSB type-B connector. Developers can expand it using PCI-Express slots, including both a PCIe general-purpose slot (PCIe Gen 3 x8) for graphics, FPGAs, or other accelerators and M.2 slots for NVME storage (PCIe Gen 3 x4) and Wi-Fi/Bluetooth modules (PCIe Gen 3 x1). There are four USB 3.2 Gen 1 type-A ports on the rear, next to the Gigabit Ethernet port, making it easy to connect peripherals. The system will ship with a bootable SD card that includes Linux and popular system developer packages, with updates available for download from SiFive.com. It will be available for preorders soon.

For some more context: Could RISC-V processors compete with Intel, ARM, and AMD?
