Robotics

McDonald's Starts Testing Automated Drive-Thru Ordering (cnbc.com) 133

New submitter DaveV1.0 shares a report from CNBC: At 10 McDonald's locations in Chicago, workers aren't taking down customers' drive-thru orders for McNuggets and french fries -- a computer is, CEO Chris Kempczinski said Wednesday. Kempczinski said the restaurants using the voice-ordering technology are seeing about 85% order accuracy. Only about a fifth of orders need to be taken by a human at those locations, he said, speaking at AllianceBernstein's Strategic Decisions conference.

In 2019, under former CEO Steve Easterbrook, McDonald's went on a spending spree, snapping up restaurant tech. One of those acquisitions was Apprente, which uses artificial intelligence software to take drive-thru orders. Kempczinski said the technology will likely take more than one or two years to implement. "Now there's a big leap from going to 10 restaurants in Chicago to 14,000 restaurants across the U.S., with an infinite number of promo permutations, menu permutations, dialect permutations, weather — and on and on and on," he said. Another challenge has been training restaurant workers to stop themselves from jumping in to help.

Hardware

US PC Shipments Soar 73% In the First Quarter As Apple Falls From Top Spot (techcrunch.com) 76

An anonymous reader quotes a report from TechCrunch: With increased demand from the pandemic, Canalys reports that U.S. PC shipments were up 73% over the same period last year. That added up to a total of 34 million units sold. While Apple had a good quarter with sales up 36%, it was surpassed by HP, which sold 11 million units in total with annual growth up an astonishing 122.6%. As Canalys pointed out, the first quarter tends to be a weaker one for Apple hardware following the holiday season, but it's a big move for HP nonetheless. Other companies boasting big growth numbers include Samsung at 116% and Lenovo at 92.8%. Dell was up 29.2%, fairly modest compared with the rest of the group.

Overall, though, it was a stunning quarter as units flew off the shelves. Canalys Research Analyst Brian Lynch says some of this can be attributed to the increased demand from 2020 as people moved to work and school from home and needed new machines to get their work done, but regardless the growth was unrivaled historically. "Q1 2021 still rates as one of the best first quarters the industry has ever seen. Vendors have prioritized fulfilling U.S. backlogs before supply issues are addressed in other parts of the world," Lynch said in a statement. Perhaps not surprisingly, low-cost Chromebooks were the most popular item as people looking to refresh their devices, especially for education purposes, turned to the lower end of the PC market, which likely had a negative impact on higher-priced Apple products, as well as contributing to its drop from the top spot.
According to Canalys, Chromebook sales were up a whopping 548% with Samsung leading that growth with an astonishing 1,963% growth rate. "Asus, HP and Lenovo all reported Chromebook sales rates up over 900%," adds TechCrunch.
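Year-over-year growth percentages like these back out to prior-year unit counts with one division. A quick sketch using the HP figures quoted above (the function name is ours; rounding is approximate):

```python
def implied_prior_units(current_units: float, growth_pct: float) -> float:
    """Back out last year's shipments from this year's units and YoY growth."""
    return current_units / (1 + growth_pct / 100)

# HP: 11 million units at 122.6% annual growth implies roughly
# 4.9 million units shipped in the same quarter a year earlier.
print(f"{implied_prior_units(11_000_000, 122.6) / 1e6:.2f}M units")
```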
Power

Reducing Poverty Can Actually Lower Energy Demand, Finds Research (arstechnica.com) 196

An anonymous reader shares a report from The Conversation: As people around the world escape poverty, you might expect their energy use to increase. But my research in Nepal, Vietnam, and Zambia found the opposite: lower levels of deprivation were linked to lower levels of energy demand. What is behind this counterintuitive finding? [...] We found that households that do have access to clean fuels, safe water, basic education and adequate food -- that is, those not in extreme poverty -- can use as little as half the energy of the national average in their country. This is important, as it goes directly against the argument that more resources and energy will be needed for people in the global south to escape extreme poverty. The biggest factor is the switch from traditional cooking fuels, like firewood or charcoal, to more efficient (and less polluting) electricity and gas.

In Zambia, Nepal, and Vietnam, modern energy resources are extremely unfairly distributed -- more so than income, general spending, or even spending on leisure. As a consequence, poorer households use more dirty energy than richer households, with ensuing health and gender impacts. Cooking with inefficient fuels consumes a lot of energy, and even more when water needs to be boiled before drinking. But do households with higher incomes and more devices have a better chance of escaping poverty? Some do, but higher incomes and mobile phones are neither prerequisites nor guarantees of having basic needs satisfied. Richer households without access to electricity or sanitation are not spared from having malnourished children or health problems from using charcoal. Ironically, for most households, it is easier to obtain a mobile phone than a clean, nonpolluting fuel for cooking. Therefore, measuring progress via household income leads to an incomplete understanding of poverty and its deprivations.

So what? Are we arguing against the global south using more energy for development? No: instead of focusing on how much energy is used, we are pointing to the importance of collective services (like electricity, indoor sanitation and public transport) for alleviating the multiple deprivations of poverty. In addressing these issues we cannot shy away from asking why so many countries in the global south have such a low capacity to invest in those services. It has to do with the fact that poverty does not just happen: it is created via interlinked systems of wealth extraction such as structural adjustment, or high costs of servicing national debts. Given that climate change is caused by the energy use of a rich minority in the global north but the consequences are borne by the majority in the poorer global south, human development is not only a matter of economic justice but also climate justice. Investing in vital collective services underpins both.

Robotics

Sidewalk Robots are Now Delivering Food in Miami (msn.com) 74

18-inch-tall robots on four wheels zipping across city sidewalks "stopped people in their tracks as they whipped out their camera phones," reports the Florida Sun-Sentinel.

"The bots' mission: To deliver restaurant meals cheaply and efficiently, another leap in the way food comes to our doors and our tables." The semiautonomous vehicles were engineered by Kiwibot, a company started in 2017 to game-change the food delivery landscape...

In May, Kiwibot sent a 10-robot fleet to Miami as part of a nationwide pilot program funded by the Knight Foundation. The program aims to understand how residents and consumers will interact with this type of technology, especially as the trend of robot servers grows around the country. And though Broward County is of interest to Kiwibot, Miami-Dade County officials jumped on board, agreeing to launch robots around neighborhoods such as Brickell, downtown Miami and several others, in the next couple of weeks... "Our program is completely focused on the residents of Miami-Dade County and the way they interact with this new technology. Whether it's interacting directly or just sharing the space with the delivery bots," said Carlos Cruz-Casas, with the county's Department of Transportation...

Remote supervisors use real-time GPS tracking to monitor the robots. Four cameras are placed on the front, back and sides of the vehicle, which the supervisors can view on a computer screen. [A spokesperson says later in the article "there is always a remote and in-field team looking for the robot."] If crossing the street is necessary, the robot will need a person nearby to ensure there is no harm to cars or pedestrians. The plan is to allow deliveries up to a mile and a half away so robots can make it to their destinations in 30 minutes or less.

Earlier Kiwi tested its sidewalk-travelling robots around the University of California at Berkeley, where at least one of its robots burst into flames. But the Sun-Sentinel reports that "In about six months, at least 16 restaurants came on board making nearly 70,000 deliveries...

"Kiwibot now offers their robotic delivery services in other markets such as Los Angeles and Santa Monica by working with the Shopify app to connect businesses that want to employ their robots." But while delivery fees are normally $3, this new Knight Foundation grant "is making it possible for Miami-Dade County restaurants to sign on for free."

A video shows the reactions the sidewalk robots are getting from pedestrians on a sidewalk, a dog on a leash, and at least one potential restaurant customer looking forward to no longer having to tip human food-delivery workers.
AMD

RISC Vs. CISC Is the Wrong Lens For Comparing Modern x86, ARM CPUs (extremetech.com) 118

Long-time Slashdot reader Dputiger writes: Go looking for the difference between x86 and ARM CPUs, and you'll run into the idea of CISC versus RISC immediately. But 40 years after the publication of David Patterson and David Ditzel's 1981 paper, "The Case for a Reduced Instruction Set Computer," CISC and RISC are poor top-level categories for comparing these two CPU families.
ExtremeTech writes:
The problem with using RISC versus CISC as a lens for comparing modern x86 versus ARM CPUs is that it takes three specific attributes that matter to the x86 versus ARM comparison — process node, microarchitecture, and ISA — crushes them down to one, and then declares ARM superior on the basis of ISA alone. The ISA-centric argument acknowledges that manufacturing geometry and microarchitecture are important and were historically responsible for x86's dominance of the PC, server, and HPC market. This view holds that when the advantages of manufacturing prowess and install base are controlled for or nullified, RISC — and by extension, ARM CPUs — will typically prove superior to x86 CPUs.

The implementation-centric argument acknowledges that ISA can and does matter, but that historically, microarchitecture and process geometry have mattered more. Intel is still recovering from some of the worst delays in the company's history. AMD is still working to improve Ryzen, especially in mobile. Historically, both x86 manufacturers have demonstrated an ability to compete effectively against RISC CPU manufacturers.

Given the reality of CPU design cycles, it's going to be a few years before we really have an answer as to which argument is superior. One difference between the semiconductor market of today and the market of 20 years ago is that TSMC is a much stronger foundry competitor than most of the RISC manufacturers Intel faced in the late 1990s and early 2000s. Intel's 7nm team has got to be under tremendous pressure to deliver on that node.

Nothing in this story should be read to imply that an ARM CPU can't be faster and more efficient than an x86 CPU.

Google

How Reliable Are Modern CPUs? (theregister.com) 64

Slashdot reader ochinko (user #19,311) shares The Register's report about a recent presentation by Google engineer Peter Hochschild. His team discovered machines with higher-than-expected hardware errors that "showed themselves sporadically, long after installation, and on specific, individual CPU cores rather than entire chips or a family of parts." The Google researchers examining these silent corrupt execution errors (CEEs) concluded "mercurial cores" were to blame: CPUs that miscalculated occasionally, under different circumstances, in a way that defied prediction... The errors were not the result of chip architecture design missteps, and they're not detected during manufacturing tests. Rather, Google engineers theorize, the errors have arisen because we've pushed semiconductor manufacturing to a point where failures have become more frequent and we lack the tools to identify them in advance.

In a paper titled "Cores that don't count" [PDF], Hochschild and colleagues Paul Turner, Jeffrey Mogul, Rama Govindaraju, Parthasarathy Ranganathan, David Culler, and Amin Vahdat cite several plausible reasons why the unreliability of computer cores is only now receiving attention, including larger server fleets that make rare problems more visible, increased attention to overall reliability, and software development improvements that reduce the rate of software bugs. "But we believe there is a more fundamental cause: ever-smaller feature sizes that push closer to the limits of CMOS scaling, coupled with ever-increasing complexity in architectural design," the researchers state, noting that existing verification methods are ill-suited for spotting flaws that occur sporadically or as a result of physical deterioration after deployment.

Facebook has noticed the errors, too. In February, the social ad biz published a related paper, "Silent Data Corruption at Scale," that states, "Silent data corruptions are becoming a more common phenomena in data centers than previously observed...."

The risks posed by misbehaving cores include not only crashes, which the existing fail-stop model for error handling can accommodate, but also incorrect calculations and data loss, which may go unnoticed and pose a particular risk at scale. Hochschild recounted an instance where Google's errant hardware conducted what might be described as an auto-erratic ransomware attack. "One of our mercurial cores corrupted encryption," he explained. "It did it in such a way that only it could decrypt what it had wrongly encrypted."
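The standard defense against silent miscomputation is redundancy: run the same deterministic work more than once and compare the results. A minimal illustration of that idea (this is our sketch, not Google's tooling; the function name and checked workload are ours):

```python
import hashlib

def checked_compute(data: bytes, repeats: int = 3) -> str:
    """Run the same deterministic computation several times and compare.

    Because inputs and code are identical on every pass, any disagreement
    points at a hardware fault (a 'mercurial core') rather than a software bug.
    """
    results = {hashlib.sha256(data).hexdigest() for _ in range(repeats)}
    if len(results) != 1:
        raise RuntimeError("results diverged across runs: possible faulty core")
    return results.pop()

print(checked_compute(b"payload"))
```

On healthy hardware the runs always agree, so this only shows the shape of the check; a real deployment would pin each repeat to a different physical core and compare across cores over time.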

How common is the problem? The Register notes that Google's researchers shared a ballpark figure "on the order of a few mercurial cores per several thousand machines," similar to the rate reported by Facebook.
Hardware

Apple Working On iPad Pro With Wireless Charging, New iPad Mini (bloomberg.com) 11

An anonymous reader quotes a report from Bloomberg: Apple is working on a new iPad Pro with wireless charging and the first iPad mini redesign in six years, seeking to continue momentum for a category that saw rejuvenated sales during the pandemic. The Cupertino, California-based company is planning to release the new iPad Pro in 2022 and the iPad mini later this year [...]. The main design change in testing for the iPad Pro is a switch to a glass back from the current aluminum enclosure. The updated iPad mini is planned to have narrower screen borders while the removal of its home button has also been tested.

For the new Pro model, the switch to a glass back is being tested, in part, to enable wireless charging for the first time. Making the change in material would bring iPads closer to iPhones, which Apple has transitioned from aluminum to glass backs in recent years. Apple's development work on the new iPad Pro is still early, and the company's plans could change or be canceled before next year's launch [...]. Wireless charging replaces the usual power cable with an inductive mat, which makes it easier for users to top up their device's battery. It has grown into a common feature in smartphones but is a rarity among tablets. Apple added wireless charging to iPhones in 2017 and last year updated it with a magnet-based MagSafe system that ensured more consistent charging speeds.

The company is testing a similar MagSafe system for the iPad Pro. Wireless charging will likely be slower than directly plugging in a charger to the iPad's Thunderbolt port, which will remain as part of the next models. As part of its development of the next iPad Pro, Apple is also trying out technology called reverse wireless charging. That would allow users to charge their iPhone or other gadgets by laying them on the back of the tablet. Apple had previously been working on making this possible for the iPhone to charge AirPods and Apple Watches. In addition to the next-generation iPad Pro and iPad mini, Apple is also working on a thinner version of its entry-level iPad geared toward students. That product is planned to be released as early as the end of this year, about the same time as the new iPad mini.
Apple is still reportedly working on a technology similar to its failed AirPower, a charging mat designed to simultaneously charge an iPhone, Apple Watch and AirPods. People familiar with the matter said it's also internally investigating alternative wireless charging methods that can work over greater distances than an inductive connection.
Power

7-11 Is Opening 500 EV Charging Stations By the End of 2022 (cnet.com) 168

7-11 announced Tuesday that it will be placing 500 EV chargers at 250 stores in the U.S. and Canada by the end of 2022. CNET reports: OK, but if they can't keep the Slurpee machine up and running, what kind of charging can users expect? Well, we don't know, and 7-11 isn't saying, but we do know that they will be DC fast-chargers, and it looks like they'll be supplied by ChargePoint, so we'd bet on anything from 60-ish kilowatts to 125 kilowatts. These new chargers will join 7-11's small network of 22 charging stations at 14 stores in four states, and the whole thing is a part of 7-11's ongoing work to reduce its carbon footprint.
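The practical difference between the two ends of that speculated power range is easy to estimate, since charge time scales inversely with charger power. A quick sketch (the pack size and charging efficiency below are our illustrative assumptions, not figures from 7-11 or ChargePoint):

```python
def charge_minutes(energy_kwh: float, charger_kw: float,
                   efficiency: float = 0.9) -> float:
    """Minutes to deliver energy_kwh into a battery at charger_kw, with losses."""
    return energy_kwh / (charger_kw * efficiency) * 60

# Adding 40 kWh (roughly a 20% -> 80% top-up on a ~66 kWh pack):
print(f"60 kW:  {charge_minutes(40, 60):.0f} min")
print(f"125 kW: {charge_minutes(40, 125):.0f} min")
```

At the low end that's a long Slurpee break of around 45 minutes; at 125 kW it drops to roughly 20 minutes.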
Wireless Networking

Samsung Will Shut Down the v1 SmartThings Hub This Month (arstechnica.com) 86

Samsung is killing the first-generation SmartThings Hub at the end of the month, kicking off phase two of its plan to shut down the SmartThings ecosystem and force users over to in-house Samsung infrastructure. "Phase one was in October, when Samsung killed the Classic SmartThings app and replaced it with a byzantine disaster of an app that it developed in house," writes Ars Technica's Ron Amadeo. "Phase three will see the shutdown of the SmartThings Groovy IDE, an excellent feature that lets members of the community develop SmartThings device handlers and complicated automation apps." From the report: The SmartThings Hub is basically a Wi-Fi access point -- but for your smart home stuff instead of your phones and laptops. Instead of Wi-Fi, SmartThings is the access point for a Zigbee and Z-Wave network, two ultra low-power mesh networks used by smart home devices. [...] The Hub connects your smart home network to the Internet, giving you access to a control app and connecting to other services like your favorite voice assistant. You might think that killing the old Hub could be a ploy to sell more hardware, but Samsung -- a hardware company -- is actually no longer interested in making SmartThings hardware. The company passed manufacturing for the latest "SmartThings Hub (v3)" to German Internet-of-things company Aeotec. The new Hub is normally $125, but Samsung is offering existing users a dirt-cheap $35 upgrade price.

For users who have to buy a new hub, migrating between hubs in the SmartThings ecosystem is a nightmare. Samsung doesn't provide any kind of migration program, so you have to unpair every single individual smart device from your old hub to pair it to the new one. This means you'll need to perform some kind of task on every light switch, bulb, outlet, and sensor, and you'll have to do the same for any other smart thing you've bought over the years. Doing this on each device is a hassle that usually involves finding the manual to look up the secret "exclusion" input, which is often some arcane Konami code. Picture holding the top button on a paddle light for seven seconds until a status light starts blinking and then opening up the SmartThings app to unpair it. Samsung is also killing the "SmartThings Link for Nvidia Shield" dongle, which let users turn Android TV devices into SmartThings Hubs.

Power

Bill Gates' Next Generation Nuclear Reactor To Be Built In Wyoming (reuters.com) 334

Billionaire Bill Gates' advanced nuclear reactor company TerraPower LLC and PacifiCorp have selected Wyoming to launch the first Natrium reactor project on the site of a retiring coal plant, the state's governor said on Wednesday. Reuters reports: TerraPower, founded by Gates about 15 years ago, and power company PacifiCorp, owned by Warren Buffett's Berkshire Hathaway, said the exact site of the Natrium reactor demonstration plant is expected to be announced by the end of the year. Small advanced reactors, which run on different fuels than traditional reactors, are regarded by some as a critical carbon-free technology that can supplement intermittent power sources like wind and solar as states strive to cut emissions that cause climate change.

The project features a 345 megawatt sodium-cooled fast reactor with molten salt-based energy storage that could boost the system's power output to 500 MW during peak power demand. TerraPower said last year that the plants would cost about $1 billion. Late last year the U.S. Department of Energy awarded TerraPower $80 million in initial funding to demonstrate Natrium technology, and the department has committed additional funding in coming years subject to congressional appropriations.
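The boost works because the molten-salt store covers the gap between the reactor's steady 345 MW and the 500 MW peak output. TerraPower has described the boost as lasting several hours; taking 5.5 hours as an illustrative duration (our assumption, not a confirmed spec), the storage requirement follows directly:

```python
reactor_mw = 345   # steady output of the sodium-cooled fast reactor
peak_mw = 500      # system output with molten-salt storage discharging
boost_hours = 5.5  # illustrative boost duration, not an official figure

boost_mw = peak_mw - reactor_mw       # power drawn from storage at peak
storage_mwh = boost_mw * boost_hours  # energy the salt store must hold
print(f"Boost power: {boost_mw} MW, storage: {storage_mwh:.0f} MWh")
```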

AMD

AMD Unveils Radeon RX 6000M Mobile GPUs For New Breed of All-AMD Gaming Laptops (hothardware.com) 15

MojoKid writes: AMD just took the wraps off its new line of Radeon RX 6000M GPUs for gaming laptops. Combined with its Ryzen 5000 series processors, the company claims all-AMD powered "AMD Advantage" machines will deliver new levels of performance, visual fidelity and value for gamers. AMD unveiled three new mobile GPUs. Sitting at the top is the Radeon RX 6800M, featuring 40 compute units, 40 ray accelerators, a 2,300MHz game clock and 12GB of GDDR6 memory. According to AMD, its flagship Radeon RX 6800M mobile GPU can deliver 120 frames per second at 1440p with a blend of raytracing, compute, and traditional effects.

Next, the new Radeon RX 6700M sports 36 compute units, 36 ray accelerators, a 2,300MHz game clock and 10GB of GDDR6 memory. Finally, the Radeon RX 6600M comes armed with 28 compute units and 28 ray accelerators, a 2,177MHz game clock and 8GB of GDDR6 memory. HotHardware has a deep dive review of a new ASUS ROG Strix G15 gaming laptop with the Radeon RX 6800M on board, as well as an 8-core Ryzen 9 5900HX processor. In the benchmarks, the Radeon RX 6800M-equipped machine puts up numbers that rival GeForce RTX 3070 and 3080 laptop GPUs in traditional rasterized game engines, though it trails a bit in ray tracing enhanced gaming. You can expect this new breed of all-AMD laptops to arrive on the market sometime later this month.

Businesses

Instacart Bets on Robots To Shrink Ranks of 500,000 Gig Shoppers (bloomberg.com) 43

Instacart has an audacious plan to replace its army of gig shoppers with robots -- part of a long-term strategy to cut costs and put its relationship with supermarket chains on a sustainable footing. From a report: The plan, detailed in documents reviewed by Bloomberg, involves building automated fulfillment centers around the U.S., where hundreds of robots would fetch boxes of cereal and cans of soup while humans gather produce and deli products. Some facilities would be attached to existing grocery stores while larger standalone centers would process orders for several locations, according to the documents, which were dated July and December.

Despite working on the strategy for more than a year, however, the company has yet to sign up a single supermarket chain. Instacart had planned to begin testing the fulfillment centers later this year, the documents show. But the company has fallen behind schedule, according to people familiar with the situation. And though the documents mention asking several automation providers to build the technology, Instacart hasn't settled on any, said the people, who requested anonymity to discuss a private matter. In February, the Financial Times reported on elements of the strategy and said Instacart in early 2020 sent out requests for proposals to five robotics companies.

An Instacart spokeswoman said the company was busy buttressing its operations during the pandemic, when it signed up 300,000 new gig workers in a matter of weeks, bringing the current total to more than 500,000. But the delays in getting the automation strategy off the ground could potentially undermine plans to go public this year. Investors know robots will play a critical role in modernizing the $1.4 trillion U.S. grocery industry.

Hardware

The GeForce RTX 3080 Ti is Nvidia's 'New Gaming Flagship' (pcworld.com) 60

Nvidia officially announced the long-awaited GeForce RTX 3080 Ti during its Computex keynote late Monday night, and this $1,200 graphics card looks like an utter beast. The $600 GeForce RTX 3070 Ti also made its debut with faster GDDR6X memory. From a report: All eyes are on the RTX 3080 Ti, though. Nvidia dubbed it GeForce's "new gaming flagship" as the $1,500 RTX 3090 is built for work and play alike, but the new GPU is a 3090 in all but name (and memory capacity). While Nvidia didn't go into deep technical details during the keynote, the GeForce RTX 3080 Ti's specifications page shows it packing a whopping 10,240 CUDA cores -- just a couple hundred fewer than the 3090's 10,496 count, but massively more than the 8,704 found in the vanilla 3080.

Expect this card to chew through games on par with the best, especially in games that support real-time ray tracing and Nvidia's amazing DLSS feature. The memory system can handle the ride, as it's built using the RTX 3090's upgraded bones. The GeForce RTX 3080 Ti comes with a comfortable 12GB of blazing-fast GDDR6X memory over a wide 384-bit bus, which is half the ludicrous 24GB capacity found in the 3090, but more than enough to handle any gaming workload you throw at it. That's not true with the vanilla RTX 3080, which comes with 10GB of GDDR6X over a smaller bus, as rare titles (like Doom Eternal) can already use more than 10GB of memory when you're playing at 4K resolution with the eye candy cranked to the max. The extra two gigs make the RTX 3080 Ti feel much more future-proof.
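Those bus widths translate directly into peak bandwidth. Assuming the 19 Gbps per-pin rate commonly cited for this generation of GDDR6X (our assumption; the keynote didn't state it), the arithmetic looks like this:

```python
def memory_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: bus pins * per-pin rate / 8 bits per byte."""
    return bus_bits * gbps_per_pin / 8

print(f"RTX 3080 Ti (384-bit): {memory_bandwidth_gbs(384, 19):.0f} GB/s")
print(f"RTX 3080 (320-bit):    {memory_bandwidth_gbs(320, 19):.0f} GB/s")
```

At that pin rate the wider 384-bit bus yields 912 GB/s versus 760 GB/s for the vanilla 3080's 320-bit bus, so the Ti gains bandwidth as well as capacity.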

Data Storage

Seagate 'Exploring' Possible New Line of Crypto-Specific Hard Drives (techradar.com) 47

In a Q&A with TechRadar, storage hardware giant Seagate revealed it is keeping a close eye on the crypto space, with a view to potentially launching a new line of purpose-built drives. From the report: Asked whether companies might develop storage products specifically for cryptocurrency use cases, Jason M. Feist, who heads up Seagate's emerging products arm, said it was a "possibility." Feist said he could offer no concrete information at this stage, but did suggest the company is "exploring this opportunity and imagines others may be as well."
Intel

Intel's latest 11th Gen Processor Brings 5.0GHz Speeds To Thin and Light Laptops (theverge.com) 51

Intel made a splash earlier in May with the launch of its first 11th Gen Tiger Lake H-series processors for more powerful laptops, but at Computex 2021, the company is also announcing a pair of new U-series chips -- one of which marks the first 5.0GHz clock speed for the company's U-series lineup of lower voltage chips. From a report: Specifically, Intel is announcing the Core i7-1195G7 -- its new top of the line chip in the U-series range -- and the Core i5-1155G7, which takes the crown of Intel's most powerful Core i5-level chip, too. Like the original 11th Gen U-series chips, the new chips operate in the 12W to 28W range. Both new chips are four core / eight thread configurations, and feature Intel's Iris Xe integrated graphics (the Core i7-1195G7 comes with 96 EUs, while the Core i5-1155G7 has 80 EUs).

The Core i7-1195G7 features a base clock speed of 2.9GHz, but cranks up to a 5.0GHz maximum single core speed using Intel's Turbo Boost Max 3.0 technology. The Core i5-1155G7, on the other hand, has a base clock speed of 2.5GHz and a boosted speed of 4.5GHz. Getting to 5GHz out of the box is a fairly recent development for laptop CPUs, period: Intel's first laptop processor to cross the 5GHz mark arrived in 2019.

Supercomputing

World's Fastest AI Supercomputer Built from 6,159 NVIDIA A100 Tensor Core GPUs (nvidia.com) 57

Slashdot reader 4wdloop shared this report from NVIDIA's blog, joking that maybe this is where all NVIDIA's chips are going: It will help piece together a 3D map of the universe, probe subatomic interactions for green energy sources and much more. Perlmutter, officially dedicated Thursday at the National Energy Research Scientific Computing Center (NERSC), is a supercomputer that will deliver nearly four exaflops of AI performance for more than 7,000 researchers. That makes Perlmutter the fastest system on the planet on the 16- and 32-bit mixed-precision math AI uses. And that performance doesn't even include a second phase coming later this year to the system based at Lawrence Berkeley National Lab.
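The "nearly four exaflops" figure is consistent with simple multiplication: an A100's tensor cores are rated at 312 TFLOPS for FP16 math, or 624 TFLOPS with structured sparsity, which is the figure aggregate marketing totals typically use:

```python
gpus = 6_159
a100_fp16_sparse_tflops = 624  # A100 tensor-core FP16 peak with 2:4 sparsity

total_exaflops = gpus * a100_fp16_sparse_tflops / 1e6  # TFLOPS -> exaflops
print(f"Aggregate AI peak: {total_exaflops:.2f} exaflops")
```

That comes out to about 3.84 exaflops of mixed-precision peak, matching the headline claim.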

More than two dozen applications are getting ready to be among the first to ride the 6,159 NVIDIA A100 Tensor Core GPUs in Perlmutter, the largest A100-powered system in the world. They aim to advance science in astrophysics, climate science and more. In one project, the supercomputer will help assemble the largest 3D map of the visible universe to date. It will process data from the Dark Energy Spectroscopic Instrument (DESI), a kind of cosmic camera that can capture as many as 5,000 galaxies in a single exposure. Researchers need the speed of Perlmutter's GPUs to capture dozens of exposures from one night to know where to point DESI the next night. Preparing a year's worth of the data for publication would take weeks or months on prior systems, but Perlmutter should help them accomplish the task in as little as a few days.

"I'm really happy with the 20x speedups we've gotten on GPUs in our preparatory work," said Rollin Thomas, a data architect at NERSC who's helping researchers get their code ready for Perlmutter. DESI's map aims to shed light on dark energy, the mysterious physics behind the accelerating expansion of the universe.

A similar spirit fuels many projects that will run on NERSC's new supercomputer. For example, work in materials science aims to discover atomic interactions that could point the way to better batteries and biofuels. Traditional supercomputers can barely handle the math required to generate simulations of a few atoms over a few nanoseconds with programs such as Quantum Espresso. But by combining their highly accurate simulations with machine learning, scientists can study more atoms over longer stretches of time. "In the past it was impossible to do fully atomistic simulations of big systems like battery interfaces, but now scientists plan to use Perlmutter to do just that," said Brandon Cook, an applications performance specialist at NERSC who's helping researchers launch such projects. That's where Tensor Cores in the A100 play a unique role. They accelerate both the double-precision floating point math for simulations and the mixed-precision calculations required for deep learning.

Graphics

Resale Prices Triple for NVIDIA Chips as Gamers Compete with Bitcoin Miners (yahoo.com) 108

"In the niche world of customers for high-end semiconductors, a bitter feud is pitting bitcoin miners against hardcore gamers," reports Quartz: At issue is the latest line of NVIDIA graphics cards — powerful, cutting-edge chips with the computational might to display the most advanced video game graphics on the market. Gamers want the chips so they can experience ultra-realistic lighting effects in their favorite games. But they can't get their hands on NVIDIA cards, because miners are buying them up and adapting them to crunch cryptographic codes and harvest digital currency. The fierce competition to buy chips — combined with a global semiconductor shortage — has driven resale prices up as much as 300%, and led hundreds of thousands of desperate consumers to sign up for daily raffles for the right to buy chips at a significant mark-up.

To broker a peace between its warring customers, NVIDIA is, essentially, splitting its cutting-edge graphics chips into two dumbed-down products: GeForce for gamers and the Cryptocurrency Mining Processor (CMP) for miners. GeForce is the latest NVIDIA graphics card — except key parts of it have been slowed down to make it less valuable for miners racing to solve crypto puzzles. CMP is based on a slightly older version of NVIDIA's graphics card which has been stripped of all of its display outputs, so gamers can't use it to render graphics.

NVIDIA's goal in splitting its product offerings is to incentivize miners to buy only CMP chips and leave the GeForce chips for the gamers. "What we hope is that the CMPs will satisfy the miners...[and] steer our GeForce supply to gamers," said CEO Jensen Huang on a May 26 conference call with investors and analysts... It won't be easy to keep the miners at bay, however. NVIDIA tried releasing slowed-down graphics chips in February in an effort to deter miners from buying them, but it didn't work. The miners quickly figured out how to hack the chips and make them perform at full speed again.

Power

Is Natural Gas (Mostly) Good for Global Warming? (ieee.org) 139

Natural gas "creates less carbon emissions than the coal it replaces, but we have to find ways to minimize the leakage of methane."

That's the opinion of Vaclav Smil, a distinguished professor emeritus at the University of Manitoba and a Fellow of the Royal Society of Canada, writing in IEEE's Spectrum (in an article shared by Slashdot reader schwit1): Natural gas is abundant, low-cost, convenient, and reliably transported, with low emissions and high combustion efficiency. Natural-gas-fired heating furnaces have maximum efficiencies of 95 to 97 percent, and combined-cycle gas turbines now achieve overall efficiency slightly in excess of 60 percent. Of course, burning gas generates carbon dioxide, but the ratio of energy to carbon is excellent: Burning a gigajoule of natural gas produces 56 kilograms of carbon dioxide, about 40 percent less than the 95 kg emitted by bituminous coal.
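The quoted ratio is easy to sanity-check with the figures above:

```python
# Figures from the text: burning one gigajoule of natural gas emits
# about 56 kg of CO2, versus about 95 kg for bituminous coal.
gas_kg_per_gj = 56.0
coal_kg_per_gj = 95.0

reduction = 1.0 - gas_kg_per_gj / coal_kg_per_gj
print(f"{reduction:.0%}")  # 41%, i.e. "about 40 percent less"
```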

This makes gas the obvious replacement for coal. In the United States, this transition has been unfolding for two decades. Gas-fueled capacity increased by 192 gigawatts from 2000 to 2005 and by an additional 69 GW from 2006 through the end of 2020. Meanwhile, the 82 GW of coal-fired capacity that U.S. utilities removed from 2012 to 2020 is projected to be augmented by another 34 GW by 2030, totaling 116 GW — more than a third of the former peak rating.
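The capacity figures check out as well. In the sketch below, the former peak rating is an assumed round number (roughly the EIA's figure for U.S. coal capacity around 2011), not a value from the text:

```python
# Coal capacity removed: 82 GW (2012-2020) plus a projected 34 GW by 2030.
retired_total_gw = 82 + 34
print(retired_total_gw)  # 116 GW

# "More than a third of the former peak rating": with an assumed peak
# of about 315 GW (assumption, not from the text), the retired share
# comes out to roughly 37 percent.
assumed_peak_gw = 315
share = retired_total_gw / assumed_peak_gw
print(f"{share:.0%}")
```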

So far, so green. But methane is itself a very potent greenhouse gas, packing from 84 to 87 times as much global warming potential as an equal quantity of carbon dioxide when measured over 20 years (and 28 to 36 times as much over 100 years). And some of it leaks out. In 2018, a study of the U.S. oil and natural-gas supply chain found that those emissions were about 60 percent higher than the Environmental Protection Agency had estimated. Such fugitive emissions, as they are called, are thought to be equivalent to 2.3 percent of gross U.S. gas production...
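To see why that 2.3 percent matters, here is an illustrative back-of-envelope comparison. This is my own arithmetic using the figures above, treating natural gas as pure methane and ignoring coal-mine methane and end-use efficiency differences:

```python
# Per kilogram of natural gas produced:
leak_fraction = 0.023      # fugitive emissions, from the text
gwp_20yr = 86              # midpoint of the 84-87 range quoted above
co2_per_ch4 = 44.0 / 16.0  # kg CO2 per kg CH4 burned (molar masses)

leaked_co2e = leak_fraction * gwp_20yr            # warming from leaked CH4
burned_co2 = (1.0 - leak_fraction) * co2_per_ch4  # CO2 from combustion

# On a 20-year horizon the leaked methane adds a substantial fraction
# on top of the combustion CO2 -- hence the emphasis on reducing leaks.
print(f"{leaked_co2e / burned_co2:.0%}")
```

On the longer 100-year horizon (GWP of 28 to 36), the same leak fraction carries a much smaller penalty, which is consistent with the article's point that leakage diminishes but does not erase the benefit.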

Without doubt, methane leakages during extraction, processing, and transportation do diminish the overall beneficial impact of using more natural gas, but they do not erase it, and they can be substantially reduced.

China

China's 'Artificial Sun' Fusion Reactor Just Set a New World Record (scmp.com) 90

The South China Morning Post reports that China "has reached another milestone in its quest for a fusion reactor, with one of its 'artificial suns' sustaining extreme temperatures for several times longer than its previous benchmark, according to state media." State news agency Xinhua reported that the Experimental Advanced Superconducting Tokamak in a facility in the eastern city of Hefei registered a plasma temperature of 120 million degrees Celsius for 101 seconds on Friday. It also maintained a temperature of 160 million degrees Celsius for 20 seconds, the report said...

The facilities are part of China's quest for fusion reactors, which hold out hope of unlimited clean energy. But there are many challenges to overcome in what has already been a decades-long quest for the world's scientists. Similar endeavours are under way in the United States, Europe, Russia and South Korea. China is also among 35 countries involved in the International Thermonuclear Experimental Reactor (ITER) megaproject in France...

Despite the progress made, fusion reactors are still a long way from reality. Song Yuntao, director of the Institute of Plasma Physics of the Chinese Academy of Sciences, said the latest results were a major achievement for physics and engineering in China. "The experiment's success lays the foundation for China to build its own nuclear fusion energy station," Song was quoted as saying.

NASA notes that the core of the Sun is only about 15 million degrees Celsius.

So for 20 seconds, China's fusion reactor was more than 10 times hotter than the Sun's core, and eight times hotter for the full 101-second run.

Australia

Robots and AI Will Guide Australia's First Fully Automated Farm (abc.net.au) 41

"Robots and artificial intelligence will replace workers on Australia's first fully automated farm," reports Australia's national public broadcaster ABC.

The total cost of the farm's upgrade? $20 million. Charles Sturt University in Wagga Wagga will create the "hands-free farm" on a 1,900-hectare property to demonstrate what robots and artificial intelligence can do without workers in the paddock... The farm will use robotic tractors, harvesters, survey equipment and drones; artificial intelligence to handle sowing, dressing and harvesting; new sensors to measure plants, soils and animals; and carbon management tools to minimise the carbon footprint.

The farm is already operated commercially and grows a range of broadacre crops, including wheat, canola, and barley, as well as a vineyard, cattle and sheep.

Slashdot Top Deals