Apple

Apple Expanding Self-Service Repair Program To iPhone 14 Lineup and More Macs (macrumors.com) 16

Apple today announced that its self-service repair program will be expanding to the iPhone 14 lineup, 13-inch MacBook Air with the M2 chip, and 14-inch and 16-inch MacBook Pro models with M2 Pro and M2 Max chips starting June 21. From a report: First launched in April 2022, Apple's program provides customers with access to parts, manuals, and tools to repair select devices. Apple says the program is designed for anyone with "experience repairing electronic devices," but says the "vast majority" of customers are better off visiting an Apple Store or Apple Authorized Service Provider. Apple also announced that customers can now complete the post-repair System Configuration process by placing the device into Diagnostics Mode and following the on-screen prompts. Users no longer need to contact the program's support team to complete this step, which verifies that the parts are genuine and working properly.
Television

LCD TVs Won't See Any Further Development (tomsguide.com) 70

According to an industry insider, LCD TVs won't see any further development because all new R&D money is being spent on self-emissive displays like MicroLED and OLED, as well as on backlight technology like Mini-LED. Tom's Guide reports: According to Bob Raikes from Display Daily, it's all about OLED development. "I asked EMD (which is the US name of Merck KGaA and is by far the dominant supplier of LC materials), what they were doing to push LC materials for displays onto the next stage ... They are developing LCs for privacy windows and antennas, but they told us that 'there is no pull from clients' for significant development in LC materials," Raikes wrote in a recent article. "That shouldn't have been a surprise to me -- I have been talking about the switch to OLED and other emissive displays for the premium end (and later the mainstream) of the display market for a lot of years. Still, after decades of reporting on LC developments, it took a moment to sink in!"

As for what, specifically, manufacturers are working on, it's the production of QD-OLED panels for use in the high-end Samsung and Sony TVs like the Samsung S95C OLED and Sony A95K OLED as well as the development of PHOLED panels that use a blue phosphorescent material that has a longer shelf life and can go brighter than the traditional organic material in OLED panels. [...] Sadly, LCD TVs' days are coming to a close, but OLED TVs are still going strong.

Power

California's First Solar-Powered Microgrid Neighborhood Has a Giant Community Battery (theverge.com) 90

As part of a series of articles on smart homes, the Verge visits an energy-efficient home in the southern California desert that's "part of California's first planned smart, solar-powered residential microgrid community." A surprisingly small number of solar panels on the roof soak up the sun in the desert landscape... funneling power into the tightly designed building envelope. Here, a 13-kilowatt hour home battery sits beside a smart load panel that controls every electrical appliance in the home, from the hybrid electric heat-pump water heater and high-efficiency heat pump HVAC system — both Wi-Fi enabled to share data — to the light switches, EnergyStar fridge, and energy-efficient induction cooktop. Using software algorithms, the Schneider load center intelligently determines where to best draw power from — the SunPower solar panels, the battery, or the grid. It then makes recommendations the Conriques can use to set automations that change power sources or reduce energy use when prices and demand spike...

The 43 new residences in KB Home-built Shadow Mountain, which launched in November 2022, and the 176 more planned as part of two communities, Durango and Oak Shade, are all-electric, solar-powered smart homes. By next year they will be connected to a 2.3 megawatt-hour community battery, sending any excess energy their panels generate to the common power source and creating a community microgrid. When the power goes down, the microgrid will kick in, isolating all 219 homes from the grid and keeping their essential functions up and running. The homes will draw first from their own battery (and potentially their EV) and then from the community battery. "When the system hits a potential steady state, they can ride a power outage for days, if not in perpetuity, with proper solar production," explains Brad Wills of Schneider Electric, manufacturers of the home's smart load panel, the community's microgrid components, and the software that runs the system...

Developed as a partnership between SunPower, KB Home, University of California, Irvine, Schneider Electric, Southern California Edison, Kia America, and the US Department of Energy, Shadow Mountain is designed to be a blueprint for how we can build better, smarter communities in the future... A recent DOE study estimated that by 2030, grid-interactive efficient buildings like those at Shadow Mountain could save up to $18 billion per year in power system costs and cut 80 million tons of carbon emissions annually.

The article describes how the community helps the larger power grid:
  • They can send electricity back into the grid during periods of peak demand.
  • The local power company now also has the option to "island" the entire community off the grid in times of high demand.
  • The community "is also trialing high-output vehicle-to-home and vehicle-to-grid functions."

Power

A Finnish Firm Thinks It Can Cut Industrial Carbon Emissions By a Third (economist.com) 58

The Economist asks: How can we "green" the high-temperature chemical processes in industries like steelmaking or the production of chemicals and cement? "Because it is tricky or impossible to produce such temperatures for some industrial processes using electricity alone, firms rely on fossil fuels."

But a Finnish engineering firm called Coolbrook thinks it has an answer: The easiest way to think about Coolbrook's system is as a gas turbine in reverse. A conventional gas turbine — as used in power stations or jet engines — burns fossil fuel to create a hot, high-pressure gas that spins rotor blades. That rotational energy can be used to run a thrust-generating fan (as in jet aircraft) or converted to electricity in a generator (as in a power station). The new system begins instead with an electric motor. The motor spins the turbine's rotors. Gas or liquid is then fed to the turbine. Once inside, the rotors accelerate the stuff to supersonic speeds, and then rapidly slow it again. The sudden deceleration transforms the kinetic energy contained in the accelerated gas or liquid into heat. If the motor is powered by green electricity, then no carbon dioxide is produced...
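The underlying physics can be sketched with a quick back-of-the-envelope calculation: when a flow moving at velocity v is brought to rest, its kinetic energy reappears as a stagnation temperature rise of roughly v^2/(2*cp). A minimal Python illustration, assuming an air-like gas; Coolbrook's actual rotor speeds and working fluids are not specified in the article:

```python
# Back-of-the-envelope: how much a gas heats up when sudden deceleration
# converts its kinetic energy into heat. Assumes an air-like gas with
# constant specific heat; real turbomachinery adds losses and staging.

CP_AIR = 1005.0          # specific heat of air at constant pressure, J/(kg*K)
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def temperature_rise(velocity_m_s, cp=CP_AIR):
    """Stagnation temperature rise when flow at velocity_m_s is brought to rest."""
    return velocity_m_s ** 2 / (2 * cp)

for mach in (1.0, 2.0, 3.0):
    v = mach * SPEED_OF_SOUND
    print(f"Mach {mach:.0f}: decelerating from {v:.0f} m/s heats the gas by ~{temperature_rise(v):.0f} K")
```

Even modest supersonic speeds yield hundreds of kelvin of heating, which is why a purely mechanical, electrically driven device can plausibly reach industrial process temperatures.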

Laboratory trials have shown that yields from the electrified process could be significantly higher than what can be obtained with fossil fuels. Assuming that everything goes according to plan, the firm will try producing heat for several other industrial processes... Joonas Rauramo, Coolbrook's boss, reckons his firm's technology could eliminate perhaps 30% of heavy-industrial emissions. And, he says, it can do so without needing to invent anything fundamentally new. "It is a known science," says Mr Rauramo. "It has just not been applied in exactly the way we are doing it."

The article's subheading puts it succinctly. "Running a turbine backwards can produce green heat."

Thanks to long-time Slashdot reader SpzToid for sharing the article.
China

Cringely Predicts Moore's Law Will Continue -- Because of AI (cringely.com) 35

"I predict that Generative Artificial Intelligence is going to go a long way toward keeping Moore's Law in force," writes long-time tech pundit Robert X. Cringely, "and the way this is going to happen says a lot about the chip business, global economics, and Artificial Intelligence, itself." The current el cheapo AI research frenzy is likely to subside as LLaMA ages into obsolescence and has to be replaced by something more expensive, putting Google, Microsoft and OpenAI back in control. Understand, too, that these big, established companies like the idea of LLMs costing so much to build because that makes it harder for startups to disrupt. It's a form of restraint of trade, though not illegal...

[T]here is an opportunity for vertical LLMs trained on different data — real data from industries like medicine and auto mechanics. Whoever owns this data will own these markets. What will make these models both better and cheaper is they can be built from a LLaMA base because most of that data doesn't have to change over time... Bloomberg has already done this for investment advice using its unique database of historical financial information. With an average of 50 billion nodes, these vertical models will cost only five percent as much to run as OpenAI's one-trillion-node GPT-4...

[I]t ought to be pretty simple to apply AI to chip design, building custom chip design models to iterate into existing simulators and refine new designs that actually have a pretty good chance of being novel.

And who will be the first to leverage this chip AI? China... Look for fabless AI chip startups to spring up around Chinese universities and for the Chinese Communist Party to put lots of money into this very cost-effective work. Because even if it's used just to slim down and improve existing designs, that's another generation of chips China might otherwise not have had at all.

AI

A New Approach to Computation Reimagines Artificial Intelligence: Hyperdimensional Computing (quantamagazine.org) 43

Quanta magazine thinks there's a better alternative to the artificial neural networks (or ANNs) powering AI systems. (Alternate URL) For one, ANNs are "super power-hungry," said Cornelia Fermüller, a computer scientist at the University of Maryland. "And the other issue is [their] lack of transparency." Such systems are so complicated that no one truly understands what they're doing, or why they work so well. This, in turn, makes it almost impossible to get them to reason by analogy, which is what humans do — using symbols for objects, ideas and the relationships between them....

Bruno Olshausen, a neuroscientist at the University of California, Berkeley, and others argue that information in the brain is represented by the activity of numerous neurons... This is the starting point for a radically different approach to computation known as hyperdimensional computing. The key is that each piece of information, such as the notion of a car, or its make, model or color, or all of it together, is represented as a single entity: a hyperdimensional vector. A vector is simply an ordered array of numbers. A 3D vector, for example, comprises three numbers: the x, y and z coordinates of a point in 3D space. A hyperdimensional vector, or hypervector, could be an array of 10,000 numbers, say, representing a point in 10,000-dimensional space. These mathematical objects and the algebra to manipulate them are flexible and powerful enough to take modern computing beyond some of its current limitations and foster a new approach to artificial intelligence...
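The binding and bundling operations that make this algebra useful can be sketched in a few lines of Python using random bipolar (+1/-1) vectors. The "red car" encoding below is a toy illustration of the idea, not code from any particular hyperdimensional computing library:

```python
import numpy as np

# Toy hyperdimensional computing sketch with random bipolar (+1/-1) vectors.
# In high dimensions, two random hypervectors are almost orthogonal, which is
# what makes the encode/query trick below work.

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

def hypervector():
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Associate a role with a filler via elementwise multiplication."""
    return a * b

def bundle(*vs):
    """Superpose several vectors; the sign keeps components in {-1, 0, +1}."""
    return np.sign(np.sum(vs, axis=0))

def similarity(a, b):
    """Normalized dot product: ~0 for unrelated vectors, 1 for identical ones."""
    return float(a @ b) / D

# Encode "a red car": bind each role to its filler, then bundle the pairs.
COLOR, KIND = hypervector(), hypervector()
RED, CAR = hypervector(), hypervector()
record = bundle(bind(COLOR, RED), bind(KIND, CAR))

# Query the record: binding with COLOR again unbinds it (x * x = 1 componentwise).
probe = bind(record, COLOR)
print(f"probe vs RED: {similarity(probe, RED):.2f}")   # high (about 0.5)
print(f"probe vs CAR: {similarity(probe, CAR):.2f}")   # near zero
```

The query recovers "red" from the bundled record even though everything is stored in a single flat vector, which is the property the article credits with enabling symbol-like reasoning.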

Hyperdimensional computing tolerates errors better, because even if a hypervector suffers significant numbers of random bit flips, it is still close to the original vector. This implies that any reasoning using these vectors is not meaningfully impacted in the face of errors. The team of Xun Jiao, a computer scientist at Villanova University, has shown that these systems are at least 10 times more tolerant of hardware faults than traditional ANNs, which themselves are orders of magnitude more resilient than traditional computing architectures...
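That robustness claim is easy to demonstrate: flipping a fraction p of a bipolar hypervector's components drops its cosine similarity to the original only to 1 - 2p, so even heavy corruption leaves it far closer to the original than to any unrelated vector. A small sketch, with dimensions and flip rates chosen arbitrarily:

```python
import numpy as np

# Demonstrate the fault tolerance described above: flip a fraction of a
# bipolar hypervector's components and check it still resembles the original.

rng = np.random.default_rng(1)
D = 10_000
original = rng.choice([-1, 1], size=D)

def flip_fraction(v, fraction, rng):
    """Return a copy of v with the given fraction of components sign-flipped."""
    out = v.copy()
    idx = rng.choice(D, size=int(fraction * D), replace=False)
    out[idx] *= -1
    return out

def cosine(a, b):
    return float(a @ b) / D

for fraction in (0.05, 0.20, 0.40):
    noisy = flip_fraction(original, fraction, rng)
    # Flipping a fraction p of bipolar components gives cosine exactly 1 - 2p.
    print(f"{fraction:.0%} flipped -> similarity {cosine(original, noisy):+.2f}")
```

By contrast, a single flipped bit in a conventional binary encoding (an integer, a pointer) can change the value completely, which is why hypervector representations degrade so much more gracefully on faulty hardware.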

All of these benefits over traditional computing suggest that hyperdimensional computing is well suited for a new generation of extremely sturdy, low-power hardware. It's also compatible with "in-memory computing systems," which perform the computing on the same hardware that stores data (unlike existing von Neumann computers that inefficiently shuttle data between memory and the central processing unit). Some of these new devices can be analog, operating at very low voltages, making them energy-efficient but also prone to random noise.

Thanks to Slashdot reader ZipNada for sharing the article.
Power

Why EVs Won't Crash the Electric Grid (msn.com) 418

"If everyone has an electric car, will the electric grid be able to support all those cars being recharged?"

That's the question being answered this week in the Washington Post's "Climate Coach" newsletter: We can already see a preview of our electric future in Norway, one of the countries with the highest share of EVs. More than 90 percent of new cars sold in the country were plug-in electric, according to the latest data, from May. More than 20 percent of the country's overall vehicle fleet is electric, a share expected to rise to one-third by 2025. So far, the grid has essentially shrugged it off. "We haven't seen any issue of the grid collapsing," says Anne Nysæther, a managing director at Elvia, a utility serving Oslo and the surrounding areas with the nation's largest concentration of EVs. The country, now almost entirely powered by renewables, has easily met the extra demand from EVs while slashing greenhouse gas emissions. That's good, because Norway will ban all new petrol and diesel cars by 2025...

To electrify everything — all these expected EVs, heat pumps and other big power draws — [the U.S.] will need to start building up our grid, according to Jesse Jenkins, an energy modeling and engineering expert at Princeton University. The United States must at least double its electricity supply by 2050, while stringing up 75,000 miles of new high-voltage lines by 2035, the equivalent of 15 round trips from Los Angeles to New York City, and connect new wind and solar generation to the grid. That sounds like a lot. But something like this has already been done. From the 1970s to the 1990s, the U.S. built new transmission capacity at a speed close to what is required today, writes Jenkins in Mother Jones, even as electricity demand grew.
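The mileage comparison is easy to sanity-check, assuming a Los Angeles-to-New York driving distance of roughly 2,450 miles (the article does not state the figure it used):

```python
# Sanity-check the "15 round trips" comparison for 75,000 miles of new
# high-voltage lines. The LA-to-NYC distance is an assumed round number.

LA_TO_NYC_MILES = 2_450   # approximate one-way driving distance (assumption)
ROUND_TRIPS = 15

total_miles = 2 * LA_TO_NYC_MILES * ROUND_TRIPS
print(f"{ROUND_TRIPS} round trips = {total_miles:,} miles")  # 73,500 miles
```

That lands within a few percent of the quoted 75,000 miles, so the comparison holds up.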

Robotics

More AI is Coming to Fast-Food Restaurant Drive-Through Lanes (cnn.com) 103

An anonymous reader shared this report from CNN: Over the past few years, restaurants from White Castle to Wendy's have been investing in artificial intelligence tech for drive-thrus... [E]fforts have ramped up recently, with two announcements in May. CKE Restaurants (owner of Hardee's and Carl's Jr.) said it will roll out AI ordering capability more broadly after a successful pilot. Soon after, Wendy's said it had expanded its partnership with Google Cloud to include an AI ordering tool at the drive-thru. The chain is piloting the program in Columbus, Ohio this month.
Fast food restaurants "say it's a way to ease the burden placed on overworked employees, and a solution to bogged down drive-thrus overwhelmed by a surge of customers," according to the article. "But customers — and workers — may not be thrilled with the technology... " Frustrated customers have already documented cases of AI getting their orders wrong, and experts warn the noisy drive-thru is a challenging environment for the technology. And AI may swipe hours or even entire jobs away from fast-food workers... Out of ten orders placed by customers at an Indiana White Castle that uses AI in its drive-thru, three people asked to speak with a human employee, because of either an error or a desire to simply talk to a person, the Wall Street Journal recently reported.

That said, AI inherently improves as it collects more data. The experience may improve after tools take more orders and learn to better recognize voices.

For companies, a hiccup-y start seems to be well worth the potential boost to sales. One of the main benefits of using AI in the drive-thru is that it upsells relentlessly — leading customers to spend more, according to Presto Automation, an AI company that works with restaurants and has partnered with CKE... Some analysts are similarly bullish. "We believe that AI voice recognition and digital only lanes could speed up the average drive through service time by at least 20-30%," analysts wrote in a Bernstein Research note published in March. "We expect AI to augment the competitive advantages of restaurants with digital culture."

Short-staffed restaurants may see AI as a way to fill in the gaps... Some restaurants are still struggling to find staff. Meanwhile, dining trends have changed. The pandemic sent customers to drive-thrus in droves and some have kept the habit, contributing to slower drive-thru times.

Hardware

M2 Max Is Basically An M1 Ultra, and M2 Ultra Nearly Doubles the Performance (9to5mac.com) 42

The new Mac Studio started shipping to customers this week, giving product reviewers a chance to test Apple's "most capable chip ever." According to new benchmarks by YouTuber Luke Miani, the M2 Ultra features nearly double the GPU performance of last year's M1 Ultra, with notable performance improvements in other areas. 9to5Mac reports: While the M1 Max and M1 Ultra are blazing fast, the difference between the two wasn't as notable as some expected. In many tasks, the much cheaper M1 Max wasn't too far off from the top-end M1 Ultra variant, especially in video editing, photo editing, and 3D rendering. Despite the M1 Ultra literally being two M1 Max chips fused together, the performance was never doubled. For the M2 series, Apple has made some significant changes under the hood, especially in GPU scaling. In Luke's testing, he found that in some GPU-heavy applications, like Blender 3D and 3DMark, the M2 Ultra was sometimes precisely twice the performance of the M2 Max -- perfect GPU scaling! In Final Cut Pro exports, it nearly doubled again. He also found that the M2 Ultra doubled the GPU performance of the M1 Ultra in these same benchmarks -- a genuinely remarkable year-over-year upgrade.

The reason for the massive performance improvement is that Apple added a memory controller chip to the M2 generation that balances the load between all of M2 Ultra's cores -- M1 Ultra required the RAM to be maxed out before using all cores. M1 Ultra was very good at doing many tasks simultaneously but struggled to do one task, such as benchmarking or rendering, faster than the M1 Max. With M2 Ultra, because of this new memory controller, Apple can now achieve the same incredible performance without the memory buffer needing to be maxed out. It's important to note that some applications cannot take advantage of the M2 Ultra fully, and in non-optimized applications, you should not expect double the performance.
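A hypothetical scaling-efficiency calculation makes the "perfect scaling" claim concrete; the scores below are invented for illustration, not Luke Miani's actual benchmark numbers:

```python
# Toy scaling-efficiency calculation for benchmark claims like the ones above.
# Scores are made up for illustration; real numbers would come from tools
# such as Blender or 3DMark.

def scaling_efficiency(base_score, fused_score, chip_ratio=2.0):
    """1.0 means perfect scaling: the fused chip is chip_ratio times the base."""
    return (fused_score / base_score) / chip_ratio

# Hypothetical: M2 Ultra exactly doubles the M2 Max -> perfect scaling.
print(scaling_efficiency(100, 200))   # 1.0
# Hypothetical: an M1 Ultra only ~1.4x its M1 Max -> 70% scaling efficiency.
print(scaling_efficiency(100, 140))   # 0.7
```

Anything below 1.0 means the second fused die is partly idle, which is the behavior the new memory controller is said to fix.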

Despite this incredible efficiency and performance, the better deal might be the M2 Max. In Luke's testing, the M2 Max performed very similarly or outperformed last year's M1 Ultra. In Blender, Final Cut Pro, 3DMark, and Rise of the Tomb Raider, the M2 Max consistently performed the same or better than the M1 Ultra. Instead of finding an M1 Ultra on eBay, it might be best to save money and get the M2 Max if you're planning on doing tasks that heavily utilize the GPU. While the GPU performance is similar, the M1 Ultra still has the advantage of far more CPU cores, and will outperform the M2 Max in CPU heavy workloads.

Supercomputing

Intel To Start Shipping a Quantum Processor (arstechnica.com) 18

An anonymous reader quotes a report from Ars Technica: Intel does a lot of things, but it's mostly noted for making and shipping a lot of processors, many of which have been named after bodies of water. So, saying that the company is set to start sending out a processor called Tunnel Falls would seem unsurprising if it weren't for some key details. Among them: The processor's functional units are qubits, and you shouldn't expect to be able to pick one up on Newegg. Ever. Tunnel Falls appears to be named after a waterfall near Intel's Oregon facility, where the company's quantum research team does much of its work. It's a 12-qubit chip, which places it well behind the qubit count of many of Intel's competitors -- all of which are making processors available via cloud services. But Jim Clarke, who heads Intel's quantum efforts, said these differences were due to the company's distinct approach to developing quantum computers.

Intel, in contrast, is attempting to build silicon-based qubits that can benefit from the developments that most of the rest of the company is working on. The company hopes to "ride the coattails of what the CMOS industry has been doing for years," Clarke said in a call with the press and analysts. The goal, according to Clarke, is to make sure the answer to "what do we have to change from our silicon chip in order to make it?" is "as little as possible." The qubits are based on quantum dots, structures that are smaller than the wavelength of an electron in the material. Quantum dots can be used to trap individual electrons, and the properties of the electron can then be addressed to store quantum information. Intel uses its fabrication expertise to craft the quantum dot and create all the neighboring features needed to set and read its state and perform manipulations.

However, Clarke said there are different ways of encoding a qubit in a quantum dot (Loss-DiVincenzo, singlet-triplet, and exchange-only, for those curious). This gets at another key difference with Intel's efforts: While most of its competitors are focused solely on fostering a software developer community, Intel is simultaneously trying to develop a community that will help it improve its hardware. (For software developers, the company also released a software developer kit.) To help get this community going, Intel will send Tunnel Falls processors out to a few universities: The Universities of Maryland, Rochester, Wisconsin, and Sandia National Lab will be the first to receive the new chip, and the company is interested in signing up others. The hope is that researchers at these sites will help Intel characterize sources of error and which forms of qubits provide the best performance.
"Overall, Intel has made a daring choice for its quantum strategy," concludes Ars' John Timmer. "Electron-based qubits have been more difficult to work with than many other technologies because they tend to have shorter life spans before they decohere and lose the information they should be holding. Intel is counting on rapid iteration, a large manufacturing capacity, and a large community to help it figure out how to overcome this. But testing quantum computing chips and understanding why their qubits sometimes go wrong is not an easy process; it requires highly specialized refrigeration hardware that takes roughly a day to get the chips down to a temperature where they can be used."

"The company seems to be doing what it needs to overcome that bottleneck, but it's likely to need more than three universities to sign up if the strategy is going to work."
Power

World's Largest Fusion Project Is In Big Trouble, New Documents Reveal (scientificamerican.com) 157

An anonymous reader quotes a report from Scientific American: It could be a new world record, although no one involved wants to talk about it. In the south of France, a collaboration among 35 countries has been birthing one of the largest and most ambitious scientific experiments ever conceived: the giant fusion power machine known as the International Thermonuclear Experimental Reactor (ITER). But the only record ITER seems certain to set doesn't involve "burning" plasma at temperatures 10 times higher than that of the sun's core, keeping this "artificial star" ablaze and generating net energy for seconds at a time or any of fusion energy's other spectacular and myriad prerequisites. Instead ITER is on the verge of a record-setting disaster as accumulated schedule slips and budget overruns threaten to make it the most delayed -- and most cost-inflated -- science project in history.

The ITER project formally began in 2006, when its international partners agreed to fund an estimated [$6.3 billion], 10-year plan that would have seen ITER come online in 2016. The most recent official cost estimate stands at more than [$22 billion], with ITER nominally turning on scarcely two years from now. Documents recently obtained via a lawsuit, however, imply that these figures are woefully outdated: ITER is not just facing several years' worth of additional delays but also a growing internal recognition that the project's remaining technical challenges are poised to send budgets spiraling even further out of control and successful operation ever further into the future.

The documents, drafted a year ago for a private meeting of the ITER Council, ITER's governing body, show that at the time, the project was bracing for a three-year delay -- a doubling of internal estimates prepared just six months earlier. And in the year since those documents were written, the already grim news out of ITER has unfortunately only gotten worse. Yet no one within the ITER Organization has been able to provide estimates of the additional delays, much less the extra expenses expected to result from them. Nor has anyone at the U.S. Department of Energy, which is in charge of the nation's contributions to ITER, been able to do so. When contacted for this story, DOE officials did not respond to any questions by the time of publication.

Intel

Intel To Launch New Core Processor Branding for Meteor Lake: Drop the i, Add Ultra Tier (anandtech.com) 36

As first hinted at back in late April, Intel is embarking on a journey to redefine its client processor branding, the company's biggest such shift in 15 years. From a report: Having already made waves by altering its retail packaging on premium desktop chips such as the Core i9-11900K and Core i9-12900K, the tech giant aims to introduce a new naming scheme across its client processors, signaling a transformative phase in its client roadmap. This shift is due to begin in the second half of the year, when Intel will launch its highly anticipated Meteor Lake CPUs. Meteor Lake represents a significant leap forward for the company in regards to manufacturing, architecture, and design -- and, it would seem, is prompting the need for a fresh product naming convention.

The most important changes include dropping the 'i' from the naming scheme and opting for a more straightforward Core 3, 5, and 7 branding structure for Intel's mainstream processors. The other notable inclusion, which is now officially confirmed, is that Intel will bifurcate the Core brand a bit and place its premium client products in their own category, using the new Ultra moniker. Ultra chips will signify a higher performance tier and target market for the parts, and will be the only place Intel uses their top-end Core 9 (previously i9) branding.

Microsoft

Microsoft Now Sells Surface Replacement Parts, Including Displays, Batteries, and SSDs (theverge.com) 18

Microsoft is starting to sell replacement components for its Surface devices. The software giant now supplies replacement parts in the Microsoft Store, allowing Surface owners to replace their displays, batteries, SSDs, and more. From a report: "We are excited to offer replacement components to technically inclined consumers for out-of-warranty, self repair," says Tim McGuiggan, VP of devices services and product engineering at Microsoft. "When purchasing a replacement component, you will receive the part and relevant collateral components (such as screws if applicable)." Tools to help you repair a Microsoft Surface device are sold separately by iFixit, which Microsoft partnered with in 2021 to sell official Surface repair tools. iFixit supplies tools like battery covers to protect against punctures during repair, debonding cradles to help cut the adhesive that holds screen glass in place, and a tool to properly replace a screen.
Supercomputing

Iran Unveils 'Quantum' Device That Anyone Can Buy for $589 on Amazon (vice.com) 67

What Iran's military called "the first product of the quantum processing algorithm" of the Naval university appears to be a stock development board, available widely online for around $600. Motherboard reports: According to multiple state-linked news agencies in Iran, the computer will help Iran detect disturbances on the surface of water using algorithms. Iranian Rear Admiral Habibollah Sayyari showed off the board during the ceremony and spoke of Iran's recent breakthroughs in the world of quantum technology. The touted quantum device appears to be a development board manufactured by a company called Digilent. The brand "ZedBoard" appears clearly in pictures. According to the company's website, the ZedBoard has everything the beginning developer needs to get started working in Android, Linux, and Windows. It does not appear to come with any of the advanced qubits that make up a quantum computer, and suggested uses include "video processing, reconfigurable computing, motor control, software acceleration," among others.

"I'm sure this board can work perfectly for people with more advanced [Field Programmable Gate Arrays] experience, however, I am a beginner and I can say that this is also a good beginner-friendly board," said one review on Digilent's website. Those interested in the board can buy one on Amazon for $589. It's impossible to know if Iran has figured out how to use off-the-shelf dev boards to make quantum algorithms, but it's not likely.

China

US To Allow South Korean, Taiwan Chip Makers To Keep Operations In China (msn.com) 27

According to the Wall Street Journal, the Biden administration is expected to allow leading semiconductor manufacturers from South Korea and Taiwan to continue and expand their chipmaking operations in China. From a report: Alan Estevez, undersecretary of commerce for industry and security, announced the decision at an industry gathering last week. The exemptions, initially granted for one year in October last year, were provided to several companies, including South Korea's Samsung Electronics and Taiwan Semiconductor Manufacturing, which have invested billions in building plants in China.

The decision to extend the exemptions reflects the challenges faced by U.S. authorities in isolating China from high-tech goods in a highly integrated global industry. The U.S. has been trying to keep advanced chips out of Chinese hands by limiting exports not only from American manufacturers but also those made by allies. However, U.S. and foreign chip makers have resisted these efforts, and governments in Asia and Europe have also pushed back. The most vocal criticism has come from South Korea, whose largest export market is China.
Further reading: Ex-Samsung Executive Accused of Stealing Secrets for China Chip Factory
Data Storage

Western Digital Sparks Panic, Anger For Age-Shaming HDDs (arstechnica.com) 124

An anonymous reader quotes a report from Ars Technica: When should you be concerned about a NAS hard drive failing? Multiple factors are at play, so many might turn to various SMART (self-monitoring, analysis, and reporting technology) data. When it comes to how long the drive has been active, there are backup companies like Backblaze using hard drives that are nearly 8 years old. That may be why some customers have been panicked, confused, and/or angered to see their Western Digital NAS hard drive automatically given a warning label in Synology's DiskStation Manager (DSM) after they were powered on for three years. With no other factors considered for these automatic flags, Western Digital is accused of age-shaming drives to push people to buy new HDDs prematurely. The practice's revelation is the last straw for some users. Western Digital already had a steep climb to win back NAS customers' trust after shipping NAS drives with SMR (shingled magnetic recording) instead of CMR (conventional magnetic recording). Now, some are saying they won't use or recommend the company's hard drives anymore.

As users have reported online, including on Synology-focused and Synology's own forums, as well as on Reddit and YouTube, Western Digital drives using Western Digital Device Analytics (WDDA) are getting a "warning" stamp in Synology DSM once their power-on hours count hits the three-year mark. WDDA is similar to SMART monitoring and rival offerings, like Seagate's IronWolf Health Management, and is supposed to provide analytics and actionable items. The recommended action says: "The drive has accumulated a large number of power on hours [throughout] the entire life of the drive. Please consider to replace the drive soon." There seem to be no discernible problems with the hard drives otherwise.
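The behavior described above amounts to a rule keyed on a single SMART-style counter. The sketch below illustrates what such an age-only check looks like; the function name and threshold derivation are this sketch's own, not Western Digital's actual WDDA code.

```python
# Minimal sketch of an age-only warning rule like the one described above:
# flag a drive once its accumulated power-on hours pass three years,
# regardless of any actual health indicators. Illustrative only.

THREE_YEARS_HOURS = 3 * 365 * 24  # 26,280 hours

def age_only_warning(power_on_hours: int) -> str:
    """Return a status based solely on accumulated power-on hours."""
    if power_on_hours >= THREE_YEARS_HOURS:
        return "warning"  # the drive may be perfectly healthy otherwise
    return "healthy"

# A drive just past the three-year mark gets flagged even with zero errors:
print(age_only_warning(26_500))   # warning
print(age_only_warning(12_000))   # healthy
```

The objection users raise is visible in the sketch: no reallocated sectors, read errors, or temperature history factor into the result, only elapsed runtime.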

Synology confirmed this to Ars Technica and noted that the labels come from Western Digital, not Synology. A spokesperson said the "WDDA monitoring and testing subsystem is developed by Western Digital, including the warning after they reach a certain number of power-on-hours." The practice has caused some, like YouTuber SpaceRex, to stop recommending Western Digital drives for the foreseeable future. In May, the YouTuber and tech consultant described his outrage, saying three years is "absolutely nothing" for a NAS drive and lamenting that the flags have nothing to do with anything besides whether a drive has been in use for three years. A user on SynoForum discussed their "panic" upon seeing the label. And SpaceRex said one of his clients also panicked and quickly replaced the "warning" drives out of fear of losing business-critical data. "It is clearly predatory tactics by Western Digital trying to sell more hard drives," SpaceRex said in a June 10 video.
"Users are also concerned that this could prevent people from noticing serious problems with their drive," adds Ars. "Further, you can't repair a pool with a drive marked with a warning label."

Some of the affected products with WDDA include the WD Red Pro, WD Red Plus, and WD Purple. A discussion post about how to disable WDDA via SSH can be found here.
Databases

Will Submerging Computers Make Data Centers More Climate Friendly? (oregonlive.com) 138

20 miles west of Portland, engineers at an Intel lab are dunking expensive racks of servers "in a clear bath" made of motor oil-like petrochemicals, reports the Oregonian, where the servers "give off a greenish glow as they silently labor away on ordinary computing tasks." Intel's submerged computers operate just as they would on a dry server rack because they're not bathing in water, even though it looks just like it. They're soaking in a synthetic oil that doesn't conduct electricity. So the computers don't short out.

They thrive, in fact, because the fluid absorbs the heat from the hardworking computers much better than air does. It's the same reason a hot pan cools off a lot more quickly if you soak it in water than if you leave it on the stove.

As data centers grow increasingly powerful, the computers are generating so much heat that cooling them uses exorbitant amounts of energy. The cooling systems can use as much electricity as the computers themselves. So Intel and other big tech companies are designing liquid cooling systems that could use far less electricity, hoping to lower data centers' energy costs by as much as a third — and reducing the facilities' climate impact. It's a wholesale change in thinking for data centers, which already account for 2% of all the electricity consumption in the U.S... Skeptics caution that it may be difficult or prohibitively expensive to overhaul existing data centers to adapt to liquid cooling. Advocates of the shift, including Intel, say a transition is imperative to accommodate data centers' growing thirst for power. "It's really starting to come to a head as we're hitting the energy crisis and the need for climate action globally," said Jen Huffstetler, Intel's chief product sustainability officer...
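The two claims above, that cooling can consume as much electricity as the computers themselves and that liquid cooling could cut energy costs by up to a third, can be sanity-checked with power-usage-effectiveness (PUE) arithmetic. The PUE values below are common industry ballpark figures assumed for illustration, not numbers from the article.

```python
# Rough PUE arithmetic for the claims above.
# PUE = total facility power / IT equipment power. Numbers are illustrative.

def total_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility draw for a given IT load and PUE."""
    return it_load_kw * pue

it_load = 1000.0  # 1 MW of servers

# "Cooling can use as much electricity as the computers" implies PUE near 2.0
air_cooled = total_power_kw(it_load, 2.0)

# Immersion-cooling systems are often quoted near PUE 1.1
immersed = total_power_kw(it_load, 1.1)

savings = 1 - immersed / air_cooled
print(f"air-cooled: {air_cooled:.0f} kW, immersed: {immersed:.0f} kW")
print(f"energy reduction: {savings:.0%}")  # ~45% under these assumptions
```

With less aggressive starting assumptions (most facilities run well below PUE 2.0), the savings land closer to the "as much as a third" figure cited in the article.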

Cooler computers can be packed more tightly together in data centers, since they don't need space for airflow. Computer manufacturers can pack chips together more tightly on the motherboard, enabling more computing power in the same space. And liquid cooling could significantly reduce data centers' environmental and economic costs. Conventional data centers' evaporative cooling systems require tremendous volumes of water and huge amounts of electricity...

Many other tech companies are backing immersion cooling, too. Google, Facebook and Microsoft are all helping fund immersion cooling research at Oregon State... [T]he timing may finally be right for data centers operators to make the shift away from air cooling to something far more efficient. Intel's Huffstetler said she expects to see liquid cooling become widespread in the next three to five years.

The article notes other challenges:
  • Liquid adds more weight than some buildings' upper floors can support.
  • Some metals degrade faster in liquid than they do in air.
  • Engineers had to modify the servers by removing their fans — "because they serve no purpose while immersed."

Intel

Intel Demos Its New 'Backside' Power-Delivery Chip Tech (ieee.org) 28

Next year Intel introduces a new transistor — RibbonFET — and a new way of powering it called "PowerVia."

This so-called "backside power" approach "aims to separate power and I/O wiring, shifting power lines to the back of the wafer," reports Tom's Hardware, which "eliminates any possible interference between the data and power wires and increases logic transistor density." IEEE Spectrum explains this approach "leaves more room for the data interconnects above the silicon," while "the power interconnects can be made larger and therefore less resistive."
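The "larger and therefore less resistive" point follows directly from the resistance formula R = ρL/A: moving power rails to the backside frees them from competing with signal wires for space, so their cross-section can grow. The dimensions below are invented for illustration and are not Intel's actual design rules.

```python
# Why wider power wires help: R = rho * L / A, so backside rails with a
# larger cross-section suffer less resistive (IR) drop. All dimensions
# here are illustrative, not taken from any real process.

RHO_CU = 1.68e-8  # resistivity of bulk copper, ohm*m (thin wires are worse)

def wire_resistance(length_m: float, width_m: float, thickness_m: float) -> float:
    """Resistance of a rectangular wire segment."""
    return RHO_CU * length_m / (width_m * thickness_m)

length = 100e-6  # a 100 um power-rail segment

# Frontside rail squeezed in among signal wires: 20 nm x 40 nm
front = wire_resistance(length, 20e-9, 40e-9)

# Backside rail with room to grow: 100 nm x 200 nm
back = wire_resistance(length, 100e-9, 200e-9)

print(f"frontside: {front:.0f} ohm, backside: {back:.0f} ohm")
print(f"resistance ratio: {front / back:.0f}x")
```

Scaling both dimensions by 5x cuts resistance 25-fold in this toy example, which is the mechanism behind the reported 30 percent reduction in power loss.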

And Intel has already done some successful powering tests using it on Intel's current transistors: The resulting cores saw more than a 6 percent frequency boost as well as more compact designs and 30 percent less power loss. Just as important, the tests proved that including backside power doesn't make the chips more costly, less reliable, or more difficult to test for defects. Intel is presenting the details of these tests in Tokyo next week at the IEEE Symposium on VLSI Technology and Circuits...

[C]ores can be made more compact, decreasing the length of interconnects between logic cells, which speeds things up. When the standard logic cells that make up the processor core are laid out on the chip, interconnect congestion keeps them from packing together perfectly, leaving loads of blank space between the cells. With less congestion among the data interconnects, the cells fit together more tightly, with some portions up to 95 percent filled... What's more, the lack of congestion allowed some of the smallest interconnects to spread out a bit, reducing parasitic capacitance that hinders performance...

With the process for PowerVia worked out, the only change Intel will have to make in order to complete its move from Intel 4 to the next node, called 20A, is to the transistor... Success would put Intel ahead of TSMC and Samsung in offering both nanosheet transistors and backside power.

ISS

Adventure in Space: ISS Astronauts Install Fifth Roll-out Solar Blanket to Boost Power (cbsnews.com) 25

The International Space Station is equipped with four 39-foot (11.8-meter) blankets, reports CBS News. The first one was delivered in December of 2000 — and now it's time for some changes: Two astronauts ventured outside the International Space Station Friday and installed the fifth of six roll-out solar array blankets — iROSAs — needed to offset age-related degradation and micrometeoroid damage to the lab's original solar wings.

Floating in the Quest airlock, veteran Stephen Bowen, making his ninth spacewalk, and crewmate Woody Hoburg, making his first, switched their spacesuits to battery power at 9:25 a.m. EDT, officially kicking off the 264th spacewalk devoted to ISS assembly and maintenance and the seventh so far this year. NASA is in the process of upgrading the ISS's solar power system by adding six iROSAs to the lab's eight existing U.S. arrays. The first four roll-out blankets were installed during spacewalks in 2021 and 2022. Bowen and Hoburg installed the fifth during Friday's spacewalk and plan to deploy the sixth during another excursion next Thursday.

The two new iROSAs were delivered to the space station earlier this week in the unpressurized trunk section of a SpaceX cargo Dragon. The lab's robot arm pulled them out Wednesday and mounted them on the right side of the station's power truss just inboard the starboard wings... As the station sailed 260 miles above the Great Lakes, the 63-foot-long solar array slowly unwound like a window shade to its full length. Well ahead of schedule by that point, the spacewalkers carried out a variety of get-ahead tasks to save time next week when they float back outside to install the second new iROSA.

They returned to the airlock and began re-pressurization procedures at 3:28 p.m., bringing the six-hour three-minute spacewalk to a close. With nine spacewalks totaling 60 hours and 22 minutes under his belt, Bowen now ranks fifth on the list of the world's most experienced spacewalkers.

"Combined with the 95-kilowatt output of the original eight panels, the station's upgraded system will provide about 215,000 watts of power."
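The quoted power budget can be reconstructed with simple arithmetic. The roughly 20 kW per iROSA figure is NASA's published nominal output per roll-out array, assumed here for illustration; the 95 kW is the remaining contribution of the partially degraded original wings cited above.

```python
# Sanity check of the station's upgraded power budget. Each iROSA is
# rated at roughly 20 kW; the 95 kW is what the eight degraded original
# wings still contribute. Figures are approximate.

IROSA_KW = 20          # nominal output per roll-out array
NUM_IROSAS = 6         # the full upgrade set
ORIGINAL_KW = 95       # remaining output of the original eight wings

upgraded_total = ORIGINAL_KW + NUM_IROSAS * IROSA_KW
print(f"upgraded system: about {upgraded_total} kW")  # about 215 kW
```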
Power

Smoke Sends US Northeast Solar Power Plunging By 50% As Wildfires Rage In Canada (reuters.com) 90

Longtime Slashdot reader WindBourne writes: "A shroud of smoke has sent solar power generation in parts of the eastern US plummeting by more than 50% as wildfires rage in Canada," reports Bloomberg. "Solar farms powering New England were producing 56% less energy at times of peak demand compared with the week before, according to the region's grid operator. Electricity generated by solar across the territory serviced by PJM Interconnection LLC, which spans Illinois to North Carolina, was down about 25% from the previous week."

Not mentioned in the article is that the wind generator output has also dropped. ["Wind power also dropped to 5% of total generation so far this week versus a recent high of 12% during the windy week ended May 12," reports Reuters. "That forced power generators to boost the amount of electricity generated by gas to 45% this week, up from around 40% in recent weeks."]

If forest fires can cut PV output by 50%, what would happen in a real disaster, when a nation most needs its electricity -- especially as we convert from fossil fuels (stored energy) to electricity? Hopefully this will have politicians thinking in terms of national security, as well as anthropogenic global warming, when it comes to Western grids.
