Wireless Networking

Why Your Wi-Fi Router Doubles As an Apple AirTag (krebsonsecurity.com) 8

An anonymous reader quotes a report from Krebs On Security: Apple and the satellite-based broadband service Starlink each recently took steps to address new research into the potential security and privacy implications of how their services geo-locate devices. Researchers from the University of Maryland say they relied on publicly available data from Apple to track the location of billions of devices globally -- including non-Apple devices like Starlink systems -- and found they could use this data to monitor the destruction of Gaza, as well as the movements and in many cases identities of Russian and Ukrainian troops. At issue is the way that Apple collects and publicly shares information about the precise location of all Wi-Fi access points seen by its devices. Apple collects this location data to give Apple devices a crowdsourced, low-power alternative to constantly requesting global positioning system (GPS) coordinates.

Both Apple and Google operate their own Wi-Fi-based Positioning Systems (WPS) that obtain certain hardware identifiers from all wireless access points that come within range of their mobile devices. Both record the Media Access Control (MAC) address that a Wi-Fi access point uses, known as a Basic Service Set Identifier or BSSID. Periodically, Apple and Google mobile devices will forward their locations -- by querying GPS and/or by using cellular towers as landmarks -- along with any nearby BSSIDs. This combination of data allows Apple and Google devices to figure out where they are within a few feet or meters, and it's what allows your mobile phone to continue displaying your planned route even when the device can't get a fix on GPS.

With Google's WPS, a wireless device submits a list of nearby Wi-Fi access point BSSIDs and their signal strengths -- via an application programming interface (API) request to Google -- whose WPS responds with the device's computed position. Google's WPS requires at least two BSSIDs to calculate a device's approximate position. Apple's WPS also accepts a list of nearby BSSIDs, but instead of computing the device's location based on the set of observed access points and their received signal strengths and then reporting that result to the user, Apple's API will return the geolocations of up to 400 more BSSIDs near the one requested. It then uses approximately eight of those BSSIDs to work out the user's location based on known landmarks.
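To make the difference concrete, here is a rough sketch of the client-side step that Apple's design enables: given the geolocations the API returns for nearby BSSIDs, the device can estimate its own position locally. The weighted-centroid method below is purely illustrative, not Apple's actual algorithm, and all data is invented.

```python
# Illustrative sketch only: how a client could estimate its own position
# from a handful of BSSID geolocations plus received signal strengths.
# The weighting scheme is an assumption, not Apple's actual method.
from dataclasses import dataclass

@dataclass
class AccessPoint:
    lat: float
    lon: float
    rssi_dbm: float  # received signal strength; closer to 0 means nearer

def estimate_position(aps: list[AccessPoint]) -> tuple[float, float]:
    """Weighted centroid: stronger signals pull the estimate harder."""
    # Convert dBm (e.g. -40 strong ... -90 weak) to a positive linear weight.
    weights = [10 ** (ap.rssi_dbm / 20.0) for ap in aps]
    total = sum(weights)
    lat = sum(ap.lat * w for ap, w in zip(aps, weights)) / total
    lon = sum(ap.lon * w for ap, w in zip(aps, weights)) / total
    return lat, lon

if __name__ == "__main__":
    nearby = [
        AccessPoint(38.9897, -76.9378, -42.0),
        AccessPoint(38.9901, -76.9365, -67.0),
        AccessPoint(38.9889, -76.9390, -71.0),
    ]
    print(estimate_position(nearby))  # lands near the strongest AP
```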

In essence, Google's WPS computes the user's location and shares it with the device. Apple's WPS gives its devices a large enough amount of data about the location of known access points in the area that the devices can do that estimation on their own. That's according to two researchers at the University of Maryland, who theorized they could use the verbosity of Apple's API to map the movement of individual devices into and out of virtually any defined area of the world. The UMD pair said they spent a month early in their research continuously querying the API, asking it for the location of more than a billion BSSIDs generated at random. They learned that while only about three million of those randomly generated BSSIDs were known to Apple's Wi-Fi geolocation API, Apple also returned an additional 488 million BSSID locations already stored in its WPS from other lookups.
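The brute-force enumeration the researchers describe is simple to outline. Below is a minimal, hypothetical sketch: generate random BSSIDs, query the WPS, and collect the extra neighboring BSSIDs the API volunteers. The query_wps function is a stand-in, since the article does not document Apple's actual endpoint or request format.

```python
# Sketch of the UMD enumeration strategy described above. query_wps() is a
# hypothetical stand-in for the real WPS request; Apple's wire format is
# not documented in the article.
import random

def random_bssid() -> str:
    """Generate a random, globally-administered unicast MAC address."""
    octets = [random.randint(0, 255) for _ in range(6)]
    octets[0] &= 0b11111100  # clear multicast and locally-administered bits
    return ":".join(f"{o:02x}" for o in octets)

def query_wps(bssid: str) -> dict[str, tuple[float, float]]:
    """Hypothetical client: returns {bssid: (lat, lon)} for the queried
    BSSID plus up to ~400 nearby ones, mirroring the behavior described."""
    raise NotImplementedError("stand-in for the real WPS request")

def enumerate_wps(samples: int) -> dict[str, tuple[float, float]]:
    known: dict[str, tuple[float, float]] = {}
    for _ in range(samples):
        try:
            # Each hit also harvests its neighbors, so coverage compounds.
            known.update(query_wps(random_bssid()))
        except NotImplementedError:
            break
    return known
```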
"Plotting the locations returned by Apple's WPS between November 2022 and November 2023, Levin and Rye saw they had a near global view of the locations tied to more than two billion Wi-Fi access points," the report adds. "The map showed geolocated access points in nearly every corner of the globe, apart from almost the entirety of China, vast stretches of desert wilderness in central Australia and Africa, and deep in the rainforests of South America."

The researchers wrote: "We observe routers move between cities and countries, potentially representing their owner's relocation or a business transaction between an old and new owner. While there is not necessarily a 1-to-1 relationship between Wi-Fi routers and users, home routers typically only have several. If these users are vulnerable populations, such as those fleeing intimate partner violence or a stalker, their router simply being online can disclose their new location."

A copy of the UMD research is available here (PDF).
Earth

'Never-Ending' UK Rain Made 10 Times More Likely By Climate Crisis, Study Says (theguardian.com) 90

The seemingly "never-ending" rain last autumn and winter in the UK and Ireland was made 10 times more likely and 20% wetter by human-caused global heating, a study has found. From a report: More than a dozen storms battered the region in quick succession between October and March, which was the second-wettest such period in nearly two centuries of records. The downpour led to severe floods, at least 20 deaths, severe damage to homes and infrastructure, power blackouts, travel cancellations, and heavy losses of crops and livestock.

The level of rain caused by the storms would have occurred just once in 50 years without the climate crisis, but is now expected every five years owing to 1.2C of global heating reached in recent years. If fossil fuel burning is not rapidly cut and the global temperature reaches 2C in the next decade or two, such severe wet weather would occur every three years on average, the analysis showed. [...] The analysis, conducted by climate scientists working as part of the World Weather Attribution group, compared how likely and how intense the wet winter was in today's heated world with how likely it would have been in a world without high levels of carbon emissions. Warmer air can hold more water vapour and therefore produce more rain. Hundreds of "attribution studies" have shown how global heating is already supercharging extreme weather such as heatwaves, wildfires, droughts and storms across the world.
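A quick back-of-envelope check of the quoted figures -- this is only the arithmetic implied by the return periods; the study's actual attribution method compares large ensembles of climate simulations:

```python
# The "10 times more likely" claim follows directly from the return periods:
# a 1-in-50-year event becoming a 1-in-5-year event is a 10x jump in annual
# probability. The "20% wetter" figure is a separate intensity result.
p_preindustrial = 1 / 50   # 0.02 chance in any given year
p_today = 1 / 5            # 0.20 chance in any given year
print(f"likelihood ratio: {p_today / p_preindustrial:.0f}x")  # -> 10x

p_at_2C = 1 / 3            # projected return period at 2C of warming
print(f"at 2C: {p_at_2C:.1%} chance in any given year")       # -> 33.3%
```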

AI

Meta AI Chief Says Large Language Models Will Not Reach Human Intelligence (ft.com) 74

Meta's AI chief said the large language models that power generative AI products such as ChatGPT would never achieve the ability to reason and plan like humans, as he focused instead on a radical alternative approach to create "superintelligence" in machines. From a report: Yann LeCun, chief AI scientist at the social media giant that owns Facebook and Instagram, said LLMs had "very limited understanding of logic ... do not understand the physical world, do not have persistent memory, cannot reason in any reasonable definition of the term and cannot plan ... hierarchically."

In an interview with the Financial Times, he argued against relying on advancing LLMs in the quest to make human-level intelligence, as these models can only answer prompts accurately if they have been fed the right training data and are, therefore, "intrinsically unsafe." Instead, he is working to develop an entirely new generation of AI systems that he hopes will power machines with human-level intelligence, although he said this vision could take 10 years to achieve. Meta has been pouring billions of dollars into developing its own LLMs as generative AI has exploded, aiming to catch up with rival tech groups, including Microsoft-backed OpenAI and Alphabet's Google.

Power

California Exceeds 100% of Energy Demand With Renewables Over a Record 45 Days (electrek.co) 148

An anonymous reader quotes a report from Electrek: In a major clean energy benchmark, wind, solar, and hydro exceeded 100% of demand on California's main grid for 69 of the past 75 days. Stanford University professor of civil and environmental engineering Mark Z. Jacobson continues to track California's renewables performance – and it's still exciting. In an update today on Twitter (X), Jacobson reports that California has now exceeded 100% of energy demand with renewables over a record 45 days straight, and 69 out of 75. [...]

Jacobson predicted on April 4 that California will be entirely on renewables and battery storage 24/7 by 2035. California passed a law that commits to achieving 100% net zero electricity by 2045. Will it beat that goal by a decade? We hope so. It's going to be exciting to watch.
Further reading: California Exceeds 100% of Energy Demand With Renewables Over a Record 30 Days
Windows

Windows Now Has AI-Powered Copy and Paste 56

Umar Shakir reports via The Verge: Microsoft is adding a new Advanced Paste feature to PowerToys for Windows 11 that can convert your clipboard content on the fly with the power of AI. The new feature can help people speed up their workflows by doing things like copying code in one language and pasting it in another, although its best tricks require OpenAI API credits.

Advanced Paste is included in PowerToys version 0.81 and, once enabled, can be activated with a special key command: Windows Key + Shift + V. That opens an Advanced Paste text window that offers paste conversion options including plaintext, markdown, and JSON. If you enable Paste with AI in the Advanced Paste settings, you'll also see an OpenAI prompt where you can enter the conversion you want -- summarized text, translations, generated code, a rewrite from casual to professional style, Yoda syntax, or whatever you can think to ask for.
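For readers curious what such a conversion looks like under the hood, here is a rough sketch of the general pattern: send the clipboard text plus an instruction to a model and paste back the result. This assumes the official openai Python package and an API key in the environment; it is not Microsoft's implementation, and the model choice is arbitrary.

```python
# A sketch of the kind of clipboard transformation "Paste with AI" performs,
# not PowerToys' actual code. Requires `pip install openai` and an
# OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

def convert_clipboard(text: str, instruction: str) -> str:
    """E.g. instruction='translate this Python snippet to idiomatic Rust'
    or 'rewrite from casual to professional style'."""
    response = client.chat.completions.create(
        model="gpt-4o",  # arbitrary choice for illustration
        messages=[
            {"role": "system", "content": "Transform the user's clipboard "
             "text as instructed. Return only the converted text."},
            {"role": "user", "content": f"{instruction}\n\n{text}"},
        ],
    )
    return response.choices[0].message.content

print(convert_clipboard("print('hi')", "rewrite in Yoda syntax"))
```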
The Courts

Apple Says US Antitrust Lawsuit Should Be Dismissed 62

Apple said on Tuesday it plans to ask a U.S. judge to dismiss a lawsuit filed by the Justice Department and 15 states in March that alleged the iPhone maker monopolized the smartphone market, hurt smaller rivals and drove up prices. From a report: In a letter to U.S. District Judge Julien X. Neals in New Jersey, Apple said "far from being a monopolist, Apple faces fierce competition from well-established rivals, and the complaint fails to allege that Apple has the ability to charge supra-competitive prices or restrict output in the alleged smartphone markets." In the letter to the judge, Apple said the DOJ relies on a new "theory of antitrust liability that no court has recognized."

The government is expected to respond within seven days to the Apple letter, which the court requires parties to submit before filing a potentially more involved and expensive formal motion to dismiss. The Justice Department alleges that Apple uses its market power to get more money from consumers, developers, content creators, artists, publishers, small businesses and merchants. The civil lawsuit accuses Apple of an illegal monopoly on smartphones maintained by imposing contractual restrictions on, and withholding critical access from, developers.
United States

US Government Urges Federal Contractors To Strengthen Encryption (bloomberg.com) 20

Companies working with the US government may be required to start protecting their data and technology from attacks by quantum computers as soon as July. From a report: The National Institute of Standards and Technology, part of the Department of Commerce, will in July stipulate three types of encryption algorithms the agency deems sufficient for protecting data from quantum computers, setting an internationally-recognized standard aimed at helping organizations manage evolving cybersecurity threats. The rollout of the standards will kick off "the transition to the next generation of cryptography," White House deputy national security adviser Anne Neuberger told Bloomberg in Cambridge, England on Tuesday. Breaking encryption not only threatens "national security secrets" but also the way we secure the internet, online payments and bank transactions, she added.

Neuberger was speaking at an event organized by the University of Cambridge and Vanderbilt University, hosting academics, industry professionals and government officials to discuss the threats posed to cybersecurity by quantum computing, which vastly accelerates processing power by performing calculations in parallel rather than sequentially and will make existing encryption systems obsolete.
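For a flavor of what the new primitives look like in practice, below is a minimal key-encapsulation sketch using the open-source liboqs Python bindings. The algorithm name is an assumption (available names depend on the installed liboqs build), and this illustrates the category of algorithm being standardized rather than NIST's final specification.

```python
# Minimal post-quantum key encapsulation (KEM) sketch using liboqs
# (pip install liboqs-python). "Kyber768" -- since standardized as ML-KEM --
# is assumed to be available in the local build.
import oqs

alg = "Kyber768"
with oqs.KeyEncapsulation(alg) as receiver, oqs.KeyEncapsulation(alg) as sender:
    public_key = receiver.generate_keypair()                  # receiver publishes
    ciphertext, secret_tx = sender.encap_secret(public_key)   # sender derives key
    secret_rx = receiver.decap_secret(ciphertext)             # receiver derives key
    assert secret_tx == secret_rx  # both sides now share a quantum-safe secret
```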

Bitcoin

Vitalik Buterin Addresses Threats To Ethereum's Decentralization In New Blog Post (theblock.co) 25

In a new blog post, Ethereum co-founder Vitalik Buterin has shared his thoughts on three issues core to Ethereum's decentralization: MEV, liquid staking, and the hardware requirements of nodes. The Block reports: In his post, published on May 17, Buterin first addresses the issue of MEV, or the financial gain that sophisticated node operators can capture by reordering the transactions within a block. Buterin characterizes the two approaches to MEV as "minimization" (reducing MEV through smart protocol design, such as CowSwap) and "quarantining" (accepting some MEV but attempting to contain its centralizing effects through in-protocol techniques). While MEV quarantining seems like an alluring option, Buterin notes that the prospect comes with some centralization risks. "If builders have the power to exclude transactions from a block entirely, there are attacks that can quite easily arise," Buterin noted. However, Buterin championed the builders working on MEV quarantining through concepts like transaction inclusion lists, which "take away the builder's ability to push transactions out of the block entirely." "I think ideas in this direction - really pushing the quarantine box to be as small as possible - are really interesting, and I'm in favor of going in that direction," Buterin concluded.

Buterin also addressed the relatively low number of solo Ethereum stakers, as most stakers choose to stake with a staking provider, either a centralized offering like Coinbase or a decentralized offering like Lido or RocketPool, given the complexity, hardware requirement, and 32 eth minimum needed to operate an Ethereum node solo. While Buterin acknowledges the progress being made to reduce the cost and complexity around running a solo node, he also noted "once again there is more that we could do," perhaps through reducing the time to withdraw staked ether or reducing the 32 eth minimum requirement to become a solo staker. "Incorrect answers could lead Ethereum down a path of centralization and 're-creating the traditional financial system with extra steps'; correct answers could create a shining example of a successful ecosystem with a wide and diverse set of solo stakers and highly decentralized staking pools," Buterin wrote. [...]

Buterin finished his post by imploring the Ethereum ecosystem to tackle the hard questions rather than shy away from them. "...We should have deep respect for the properties that make Ethereum unique, and continue to work to maintain and improve on those properties as Ethereum scales," Buterin wrote. Buterin added today, in a post on X, that he was pleased to see civil debate among community members. "I'm really proud that ethereum does not have any culture of trying to prevent people from speaking their minds, even when they have very negative feelings toward major things in the protocol or ecosystem. Some wave the ideal of 'open discourse' as a flag, some take it seriously," Buterin wrote.

HP

HP Resurrects '90s OmniBook Branding, Kills Spectre and Dragonfly (arstechnica.com) 53

HP announced today that it will resurrect the "Omni" branding it first coined for its business-oriented laptops introduced in 1993. The vintage branding will now be used for the company's new consumer-facing laptops, with HP retiring the Spectre and Dragonfly brands in the process. Furthermore, computers under consumer PC series names like Pavilion will also no longer be released. "Instead, every consumer computer from HP will be called either an OmniBook for laptops, an OmniDesk for desktops, or an OmniStudio for AIOs," reports Ars Technica. From the report: The computers will also have a modifier, ranging from 3 up to 5, 7, X, or Ultra to denote computers that are entry-level all the way up to advanced. For instance, an HP OmniBook Ultra would represent HP's highest-grade consumer laptop. "For example, an HP OmniBook 3 will appeal to customers who prioritize entertainment and personal use, while the OmniBook X will be designed for those with higher creative and technical demands," Stacy Wolff, SVP of design and sustainability at HP, said via a press announcement today. [...] So far, HP has announced one new Omni computer, the OmniBook X. It has a 12-core Snapdragon X Elite X1E-78-100, 16GB or 32GB of LPDDR5x-8448 memory, up to 2TB of storage, and a 14-inch, 2240x1400 IPS display. HP is pointing to the Latin translation of omni, meaning "all" (or everything), as the rationale behind the naming update. The new name should give shoppers confidence that the computers will provide all the things that they need.

HP is also getting rid of some of its commercial series names, like Pro. From now on, new, lower-end commercial laptops will be ProBooks. There will also be ProDesktop desktops and ProStudio AIOs. These computers will have either a 2 modifier for entry-level designs or a 4 modifier for ones with a little more power. For example, an HP ProDesk 2 is less powerful than an HP ProDesk 4. Anything more powerful will be considered either an EliteBook (laptops), EliteDesk (desktops), or EliteStudio (AIOs). For the Elite computers, the modifiers go from 6 to 8, X, and then Ultra. A Dragonfly laptop today would fall into the Ultra category. HP did less overhauling of its commercial lineup because it "recognized a need to preserve the brand equity and familiarity with our current sub-brands," Wolff said, adding that HP "acknowledged the creation of additional product names like Dragonfly made those products stand out, rather than be seen as part of a holistic portfolio." [...]

As you might now expect of any tech rebranding, marketing push, or product release these days, HP is also announcing a new emblem that will appear on its computers, as well as other products or services, that substantially incorporate AI. The two laptops announced today carry the logo. According to Wolff, on computers, the logo means that the systems have an integrated NPU "at 40+ trillions of operations per second." They also come with a chatbot based on ChatGPT 4, an HP spokesperson told me.

Transportation

Some People Who Rented a Tesla from Hertz Were Still Charged for Gas (thedrive.com) 194

"Last week, we reported on a customer who was charged $277 for gasoline his rented Tesla couldn't have possibly used," writes the automotive blog The Drive.

"And now, we've heard from other Hertz customers who say they've been charged even more." Hertz caught attention last week for how it handled a customer whom it had charged a "Skip the Pump" fee, which allows renters to pay a premium for Hertz to refill the tank for them. But of course, this customer's rented Tesla Model 3 didn't use gas — it draws power from a battery — and Hertz has a separate, flat fee for EV recharges. Nevertheless, the customer was charged $277.39 despite returning the car with the exact same charge they left with, and Hertz refused to refund it until after our story ran. It's no isolated incident either, as other customers have written in to inform us that it happened to them, too....

Evan Froehlich returned the rental at 21 percent charge, expecting to pay a flat $25 recharge fee. (It's ordinarily $35, but Hertz's loyalty program discounts it.) To Froehlich's surprise, he was hit with a $340.97 "Skip the Pump" fee, which can be applied after returning a car if it's not requested beforehand. He says Hertz's customer service was difficult to reach, and that it took making a ruckus on social media to get Hertz's attention. In the end, a Hertz representative was able to review the charge and have it reversed....

A March 2023 Facebook post documenting a similar case indicates this has been happening for more than a year.

After renting a Tesla Model 3, another customer even got a $475.19 "fuel charge," according to the article — in addition to a $25 charging fee: They also faced a $125.01 "rebill" for using the Supercharger network during their rental, which other Hertz customers have expressed surprise and frustration with. Charging costs can vary, but a 75-percent charge from a Supercharger will often cost in the region of just $15.
China

China Uses Giant Rail Gun to Shoot a Smart Bomb Nine Miles Into the Sky (futurism.com) 127

"China's navy has apparently tested out a hypersonic rail gun," reports Futurism, describing it as "basically a device that uses a series of electromagnets to accelerate a projectile to incredible speeds."

But "during a demonstration of its power, things didn't go quite as planned." As the South China Morning Post reports, the rail gun test lobbed a precision-guided projectile — or smart bomb — nine miles into the stratosphere. But because it apparently didn't go up as high as it was supposed to, the test was ultimately declared unsuccessful. This conclusion came after an analysis led by Naval Engineering University professor Lu Junyong, whose team found with the help of AI that even though the winged smart bomb exceeded Mach 5 speeds, it didn't perform as well as it could have. This occurred, as Lu's team found, because the projectile was spinning too fast during its ascent, resulting in an "undesirable tilt."
But what's more interesting is the project itself. "Successful or not, news of the test is a pretty big deal given that it was just a few months ago that reports emerged about China's other proposed super-powered rail gun, which is intended to send astronauts on a Boeing 737-size ship into space.... which for the record did not make it all the way to space..." Chinese officials, meanwhile, are paying lip service to the hypersonic rail gun technology's potential to revolutionize civilian travel by creating even faster railways and consumer space launches, too.
Japan and France also have railgun projects, according to a recent article from Defense One. "Yet the nation that has demonstrated the most continuing interest is China," with records of railgun work dating back as far as 2011: The Chinese team claimed that their railgun can fire a projectile 100 to 200 kilometers at Mach 6. Perhaps most importantly, it uses up to 100,000 AI-enabled sensors to identify and fix any problems before critical failure, and can slowly improve itself over time. This, they said, had enabled them to test-fire 120 rounds in a row without failure, which, if true, suggests that they solved a longstanding problem that reportedly bedeviled U.S. researchers. However, the team still has a ways to go before mounting an operational railgun on a ship; according to one Chinese article, the projectiles fired were only 25mm caliber, well below the size of even lightweight naval artillery.

As with many other Chinese defense technology programs, much remains opaque about the program...

While railguns tend to get the headlines, this lab has made advances in a wide range of electric and electromagnetic applications for the PLA Navy's warships. For example, the lab's research on electromagnetic launch technology has also been applied to the development of electromagnetic catapults for the PLAN's growing aircraft carrier fleet...

While it remains to be seen whether the Chinese navy can develop a full-scale railgun, produce it at scale, and integrate it onto its warships, it is obvious that it has made steady advances in recent years on a technology of immense military significance that the US has abandoned.

Thanks to long-time Slashdot reader Tangential for sharing the news.
Earth

America Takes Its Biggest Step Yet to End Coal Mining (msn.com) 160

The Washington Post reports that America took "one of its biggest steps yet to keep fossil fuels in the ground," announcing Thursday that it will end new coal leasing in the Powder River Basin, "which produces nearly half the coal in the United States...

"It could prevent billions of tons of coal from being extracted from more than 13 million acres across Montana and Wyoming, with major implications for U.S. climate goals." A significant share of the nation's fossil fuels come from federal lands and waters. The extraction and combustion of these fuels accounted for nearly a quarter of U.S. carbon dioxide emissions between 2005 and 2014, according to a study by the U.S. Geological Survey. In a final environmental impact statement released Thursday, Interior's Bureau of Land Management found that continued coal leasing in the Powder River Basin would harm the climate and public health. The bureau determined that no future coal leasing should happen in the basin, and it estimated that coal mining in the Wyoming portion of the region would end by 2041.

Last year, the Powder River Basin generated 251.9 million tons of coal, accounting for nearly 44 percent of all coal produced in the United States. Under the bureau's determination, the 14 active coal mines in the Powder River Basin can continue operating on lands they have leased, but they cannot expand onto other public lands in the region... "This means that billions of tons of coal won't be burned, compared to business as usual," said Shiloh Hernandez, a senior attorney at the environmental law firm Earthjustice. "It's good news, and it's really the only defensible decision the BLM could have made, given the current climate crisis...."

The United States is moving away from coal, which has struggled to compete economically with cheaper gas and renewable energy. U.S. coal output tumbled 36 percent from 2015 to 2023, according to the Energy Information Administration. The Sierra Club's Beyond Coal campaign estimates that 382 coal-fired power plants have closed down or proposed to retire, with 148 remaining. In addition, the Environmental Protection Agency finalized an ambitious set of rules in April aimed at slashing air pollution, water pollution and planet-warming emissions spewing from the nation's power plants. One of the most significant rules will push all existing coal plants by 2039 to either close or capture 90 percent of their carbon dioxide emissions at the smokestack.

"The nation's electricity generation needs are being met increasingly by wind, solar and natural gas," said Tom Sanzillo, director of financial analysis at the Institute for Energy Economics and Financial Analysis, an energy think tank. "The nation doesn't need any increase in the amount of coal under lease out of the Powder River Basin."

AI

'Openwashing' 35

An anonymous reader quotes a report from The New York Times: There's a big debate in the tech world over whether artificial intelligence models should be "open source." Elon Musk, who helped found OpenAI in 2015, sued the startup and its chief executive, Sam Altman, on claims that the company had diverged from its mission of openness. The Biden administration is investigating the risks and benefits of open source models. Proponents of open source A.I. models say they're more equitable and safer for society, while detractors say they are more likely to be abused for malicious intent. One big hiccup in the debate? There's no agreed-upon definition of what open source A.I. actually means. And some are accusing A.I. companies of "openwashing" -- using the "open source" term disingenuously to make themselves look good. (Accusations of openwashing have previously been aimed at coding projects that used the open source label too loosely.)

In a blog post on Open Future, a European think tank supporting open sourcing, Alek Tarkowski wrote, "As the rules get written, one challenge is building sufficient guardrails against corporations' attempts at 'openwashing.'" Last month the Linux Foundation, a nonprofit that supports open-source software projects, cautioned that "this 'openwashing' trend threatens to undermine the very premise of openness -- the free sharing of knowledge to enable inspection, replication and collective advancement." Organizations that apply the label to their models may be taking very different approaches to openness. [...]

The main reason is that while open source software allows anyone to replicate or modify it, building an A.I. model requires much more than code. Only a handful of companies can fund the computing power and data curation required. That's why some experts say labeling any A.I. as "open source" is at best misleading and at worst a marketing tool. "Even maximally open A.I. systems do not allow open access to the resources necessary to 'democratize' access to A.I., or enable full scrutiny," said David Gray Widder, a postdoctoral fellow at Cornell Tech who has studied use of the "open source" label by A.I. companies.
Data Storage

WD Rolls Out New 2.5-Inch HDDs For the First Time In 7 Years (tomshardware.com) 62

Western Digital has unveiled new 6TB external hard drives -- "the first new capacity point for this hard drive form factor in about seven years," reports Tom's Hardware. "There is a catch, though: the HDD is slow and is unlikely to fit into any mobile PCs, so it looks like it will exclusively serve portable and specialized storage products." From the report: Western Digital's 6TB 2.5-inch HDD is currently used for the latest versions of the company's My Passport, Black P10, and G-Drive ArmorATD external storage devices and is not available separately. All of these drives (excluding the already very thick G-Drive ArmorATD) are thicker than their 5 TB predecessors, which may suggest that in a bid to increase the HDD's capacity, the manufacturer simply installed another platter and made the whole drive thicker instead of developing new platters with a higher areal density.

While this is a legitimate way to expand the capacity of a hard drive, it is necessary to note that 5TB 2.5-inch HDDs already feature a 15-mm z-height, which is the highest standard z-height for 2.5-inch form-factor storage devices. As a result, these 6TB 2.5-inch drives are unlikely to fit into any laptop. When it comes to specifications of the latest My Passport, Black P10, and G-Drive ArmorATD external HDDs, Western Digital only discloses that they offer up to 130 MB/s read speed (just like their predecessors), feature a USB 3.2 Gen 1 (up to 5 GT/s) interface using either a modern USB Type-C or Micro USB Type-B connector and do not require an external power adapter.

News

Robert Dennard, Inventor of DRAM, Dies At 91 20

necro81 writes: Robert Dennard was working at IBM in the 1960s when he invented a way to store one bit using a single transistor and capacitor. The technology became dynamic random access memory (DRAM), which, when implemented using the emerging technology of silicon integrated circuits, helped catapult computing by leaps and bounds. The first commercial DRAM chips in the late 1960s held just 1024 bits; today's DDR5 modules hold hundreds of billions.

Dr. Robert H. Dennard passed away last month at age 91.

In the 1970s he helped guide technology roadmaps for the ever-shrinking feature size of lithography, enabling the early years of Moore's Law. He wrote a seminal paper in 1974 relating feature size and power consumption that is now referred to as Dennard Scaling. His technological contributions earned him numerous awards and accolades from the National Academy of Engineering, IEEE, and the National Inventors Hall of Fame.
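For reference, the scaling relations commonly summarized from the 1974 paper: shrink all linear dimensions and the supply voltage by a factor of kappa > 1, and circuits get faster and denser while power density stays constant.

```latex
% Dennard scaling, as commonly summarized from the 1974 paper
\begin{align*}
\text{dimensions: }          & L,\ W,\ t_{ox} \to 1/\kappa \\
\text{voltage, current: }    & V \to V/\kappa, \quad I \to I/\kappa \\
\text{gate delay: }          & \tau \propto CV/I \to \tau/\kappa \\
\text{power per transistor: }& P \propto VI \to P/\kappa^{2} \\
\text{transistor density: }  & D \to \kappa^{2} D \\
\text{power density: }       & P \cdot D \to \text{constant}
\end{align*}
```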
AI

OpenAI's Long-Term AI Risk Team Has Disbanded (wired.com) 21

An anonymous reader shares a report: In July last year, OpenAI announced the formation of a new research team that would prepare for the advent of supersmart artificial intelligence capable of outwitting and overpowering its creators. Ilya Sutskever, OpenAI's chief scientist and one of the company's cofounders, was named as the colead of this new team. OpenAI said the team would receive 20 percent of its computing power. Now OpenAI's "superalignment team" is no more, the company confirms. That comes after the departures of several researchers involved, Tuesday's news that Sutskever was leaving the company, and the resignation of the team's other colead. The group's work will be absorbed into OpenAI's other research efforts.

Sutskever's departure made headlines because although he'd helped CEO Sam Altman start OpenAI in 2015 and set the direction of the research that led to ChatGPT, he was also one of the four board members who fired Altman in November. Altman was restored as CEO five chaotic days later after a mass revolt by OpenAI staff and the brokering of a deal in which Sutskever and two other company directors left the board. Hours after Sutskever's departure was announced on Tuesday, Jan Leike, the former DeepMind researcher who was the superalignment team's other colead, posted on X that he had resigned.

Power

In a Milestone, the US Exceeds 5 Million Solar Installations (electrek.co) 157

According to the Solar Energy Industries Association (SEIA), the U.S. has officially surpassed 5 million solar installations. "The 5 million milestone comes just eight years after the U.S. achieved its first million in 2016 -- a stark contrast to the four decades it took to reach that initial milestone since the first grid-connected solar project in 1973," reports Electrek. From the report: Since the beginning of 2020, more than half of all U.S. solar installations have come online, and over 25% have been activated since the Inflation Reduction Act became law 20 months ago. Solar arrays have been installed on homes and businesses and as utility-scale solar farms. The U.S. solar market was valued at $51 billion in 2023. Even with changes in state policies, market trends indicate robust growth in solar installations across the U.S. According to SEIA forecasts, the number of solar installations is expected to double to 10 million by 2030 and triple to 15 million by 2034.

The residential sector represents 97% of all U.S. solar installations. This sector has consistently set new records for annual installations over the past several years, achieving new highs for five straight years and in 10 out of the last 12 years. The significant growth in residential solar can be attributed to its proven value as an investment for homeowners who wish to manage their energy costs more effectively. California is the frontrunner with 2 million solar installations, though recent state policies have significantly damaged its rooftop solar market. Meanwhile, other states are experiencing rapid growth. For example, Illinois, which had only 2,500 solar installations in 2017, now boasts over 87,000. Similarly, Florida has seen its solar installations surge from 22,000 in 2017 to 235,000 today. By 2030, 22 states or territories are anticipated to surpass 100,000 solar installations. The U.S. has enough solar installed to cover every residential rooftop in the Four Corners states of Colorado, Utah, Arizona, and New Mexico.

AI

Hugging Face Is Sharing $10 Million Worth of Compute To Help Beat the Big AI Companies (theverge.com) 10

Kylie Robison reports via The Verge: Hugging Face, one of the biggest names in machine learning, is committing $10 million in free shared GPUs to help developers create new AI technologies. The goal is to help small developers, academics, and startups counter the centralization of AI advancements. [...] Hugging Face CEO Clement Delangue is concerned about AI startups' ability to compete with the tech giants. Most significant advancements in artificial intelligence -- like GPT-4, the algorithms behind Google Search, and Tesla's Full Self-Driving system -- remain hidden within the confines of major tech companies. Not only are these corporations financially incentivized to keep their models proprietary, but with billions of dollars at their disposal for computational resources, they can compound those gains and race ahead of competitors, making it impossible for startups to keep up. Hugging Face aims to make state-of-the-art AI technologies accessible to everyone, not just the tech giants. [...]

Access to compute poses a significant challenge to constructing large language models, often favoring companies like OpenAI and Anthropic, which secure deals with cloud providers for substantial computing resources. Hugging Face aims to level the playing field by donating these shared GPUs to the community through a new program called ZeroGPU. The shared GPUs are accessible to multiple users or applications concurrently, eliminating the need for each user or application to have a dedicated GPU. ZeroGPU will be available via Hugging Face's Spaces, a hosting platform for publishing apps, which has over 300,000 AI demos created so far on CPU or paid GPU, according to the company.

Access to the shared GPUs is determined by usage, so if a portion of the GPU capacity is not actively utilized, that capacity becomes available for use by someone else. This makes them cost-effective, energy-efficient, and ideal for community-wide utilization. ZeroGPU uses Nvidia A100 GPU devices to power this operation -- which offer about half the computation speed of the popular and more expensive H100s. "It's very difficult to get enough GPUs from the main cloud providers, and the way to get them -- which is creating a high barrier to entry -- is to commit on very big numbers for long periods of times," Delangue said. Typically, a company would commit to a cloud provider like Amazon Web Services for one or more years to secure GPU resources. This arrangement disadvantages small companies, indie developers, and academics who build on a small scale and can't predict if their projects will gain traction. Regardless of usage, they still have to pay for the GPUs. "It's also a prediction nightmare to know how many GPUs and what kind of budget you need," Delangue said.
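For developers, the usage pattern Hugging Face published for ZeroGPU at launch looks roughly like the sketch below: a decorator requests a GPU slice only for the duration of a call, then releases it for other Spaces. The model and details here are illustrative and may differ from current documentation.

```python
# Sketch of a ZeroGPU-enabled Space, following Hugging Face's published
# pattern: the spaces.GPU decorator attaches a shared A100 slice only while
# the decorated function runs. Model choice is arbitrary for illustration.
import spaces
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
)
pipe.to("cuda")  # under ZeroGPU, CUDA work is deferred until a GPU is held

@spaces.GPU  # GPU allocated on call, released on return
def generate(prompt: str):
    return pipe(prompt).images

# e.g. wire `generate` into a Gradio interface and launch it as a Space
```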

Microsoft

Microsoft's AI Push Imperils Climate Goal As Carbon Emissions Jump 30% (bnnbloomberg.ca) 68

Microsoft's ambitious goal to be carbon negative by 2030 is threatened by its expanding AI operations, which have increased its carbon footprint by 30% since 2020. To meet its targets, Microsoft must quickly adopt green technologies and improve efficiency in its data centers, which are critical for AI but heavily reliant on carbon-intensive resources. Bloomberg reports: Now to meet its goals, the software giant will have to make serious progress very quickly in gaining access to green steel and concrete and less carbon-intensive chips, said Brad Smith, president of Microsoft, in an exclusive interview with Bloomberg Green. "In 2020, we unveiled what we called our carbon moonshot. That was before the explosion in artificial intelligence," he said. "So in many ways the moon is five times as far away as it was in 2020, if you just think of our own forecast for the expansion of AI and its electrical needs." [...]

Despite AI's ravenous energy consumption, this actually contributes little to Microsoft's hike in emissions -- at least on paper. That's because the company says in its sustainability report that it's 100% powered by renewables. Companies use a range of mechanisms to make such claims, which vary widely in terms of credibility. Some firms enter into long-term power purchase agreements (PPAs) with renewable developers, where they shoulder some of a new energy plant's risk and help get new solar and wind farms online. In other cases, companies buy renewable energy credits (RECs) to claim they're using green power, but these inexpensive credits do little to spur new demand for green energy, researchers have consistently found. Microsoft uses a mix of both approaches. On one hand, it's one of the biggest corporate participants in power purchase agreements, according to BloombergNEF, which tracks these deals. But it's also a huge purchaser of RECs, using these instruments to claim about half of its energy use is clean, according to its environmental filings in 2022. By using a large quantity of RECs, Microsoft is essentially masking an even larger growth in emissions. "It is Microsoft's plan to phase out the use of unbundled RECs in future years," a spokesperson for the company said. "We are focused on PPAs as a primary strategy."

So what else can be done? Smith, along with Microsoft's Chief Sustainability Officer Melanie Nakagawa, has laid out clear steps in the sustainability report. High among them is to increase efficiency -- using the same amount of energy or computing to do more work. That could help reduce the need for data centers, which will reduce emissions and electricity use. On most things, "our climate goals require that we spend money," said Smith. "But efficiency gains will actually enable us to save money." Microsoft has also been at the forefront of buying sustainable aviation fuel, which has helped reduce some of its emissions from business travel. The company also wants to partner with those who will "accelerate breakthroughs" to make greener steel, concrete and fuels. Those technologies are starting to work at a small scale, but remain expensive and far from being available in commercial quantities. Cheap renewable power has helped make Microsoft's climate journey easier. But the tech giant's electricity consumption last year rivaled that of a small European country -- beating Slovenia easily. Smith said that one of the biggest bottlenecks for it to keep getting access to green power is the lack of transmission lines from where the power is generated to the data centers. That's why Microsoft says it's going to increase lobbying efforts to get governments to speed up building the grid.
If Microsoft's emissions remain high going into 2030, Smith said the company may consider bulk purchases of carbon removal credits, even though it's not "the desired course."

"You've got to be willing to invest and pay for it," said Smith. Climate change is "a problem that humanity created and that humanity can solve."
Power

US Regulators Approve Rule That Could Speed Renewables (npr.org) 23

Longtime Slashdot reader necro81 writes: The U.S. Federal Energy Regulatory Commission (FERC), which controls interstate energy infrastructure, approved a rule Monday that should boost new transmission infrastructure and make it easier to connect renewable energy projects.

Some 11,000 projects totaling 2,600 GW of capacity are in planning, waiting to break ground, or connect to the grid. But they're stymied by the need for costly upgrades, or simply waiting for review. The frustrations are many. Each proposed project undergoes a lengthy grid-impact study that assesses the cost of necessary upgrades. Each project is considered in isolation, regardless of whether similar projects are happening nearby that could share the upgrade costs or augur different improvements. The planning process tends to be reactive -- examining only the applications in front of them -- rather than considering trends over the coming years. It's a first-come, first-served queue: if one project is ready to break ground, it must wait behind another project that's still securing funding or permitting.

Two years in development, the dryly named Improvements to Generator Interconnection Procedures and Agreements directs utility operators to plan infrastructure improvements with a 20-yr forecast of new energy sources and increased demand. Rather than examining each project in isolation, similar projects will be clustered and examined together. Instead of a first-come, first-served serial process, operators will examine first-ready projects, allowing shovel-ready projects to jump the queue. The expectation is that these new rules will speed up and streamline the process of developing and connecting new energy projects through more holistic planning, penalties for delays, sensible cost-sharing for upgrades, and justification for long-term investments.
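A toy illustration of the queue-policy change, with all project data invented: group similar pending projects into clusters for joint study, then let shovel-ready projects go first instead of strict submission order.

```python
# Toy model (invented data) of the FERC rule's two changes: cluster similar
# nearby projects for joint study, and prioritize first-ready over
# first-come, first-served.
from dataclasses import dataclass
from itertools import groupby

@dataclass
class Project:
    name: str
    region: str         # proxy for "similar projects nearby"
    submitted: int      # year of application
    shovel_ready: bool  # permits and financing secured

queue = [
    Project("Wind A", "west", 2021, False),
    Project("Solar B", "west", 2022, True),
    Project("Solar C", "east", 2022, True),
    Project("Wind D", "east", 2023, False),
]

# Old policy: strict submission order, one isolated study per project.
serial = sorted(queue, key=lambda p: p.submitted)

# New policy: study regional clusters together, ready projects first.
clustered = sorted(queue, key=lambda p: (p.region, not p.shovel_ready, p.submitted))
for region, group in groupby(clustered, key=lambda p: p.region):
    print(region, [p.name for p in group])  # ready projects lead each cluster
```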
