Government

US Government Opens 22 Million Acres of Federal Lands To Solar 106

An anonymous reader quotes a report from Electrek: The Biden administration has updated its roadmap for solar development, covering 22 million acres of federal lands in the US West. The Bureau of Land Management (BLM) and the Department of Energy's National Renewable Energy Laboratory have determined that 700,000 acres of federal lands will be needed for solar farms over the next 20 years, so BLM recommended 22 million acres to give "maximum flexibility" to help the US reach its goal of a net-zero power sector by 2035. The plan is an update of the Bureau of Land Management's 2012 Western Solar Plan, which originally identified areas for solar development in six states -- Arizona, California, Colorado, Nevada, New Mexico, and Utah.

The updated roadmap refines the analysis in the original six states and expands to five more states -- Idaho, Montana, Oregon, Washington, and Wyoming. It also focuses on lands within 10 miles of existing or planned transmission lines and moves away from lands with sensitive resources. [...] BLM under the Biden administration has approved 47 clean energy projects and permitted 11,236 megawatts (MW) of wind, solar, and geothermal energy on public lands, enough to power more than 3.5 million homes.
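As a rough cross-check of those figures (illustrative arithmetic only, not from BLM): 11,236 MW spread across 3.5 million homes works out to about 3.2 kW of nameplate capacity per home.

```python
# Cross-check: permitted clean-energy capacity per home (illustrative arithmetic).
permitted_mw = 11_236        # MW of wind, solar, and geothermal permitted by BLM
homes = 3_500_000            # homes the article says this capacity can power
kw_per_home = permitted_mw * 1000 / homes   # ~3.2 kW of nameplate capacity per home
```

Since wind and solar run well below nameplate capacity on average, a few kilowatts per home is a plausible sizing.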
Ben Norris, vice president of regulatory affairs at the Solar Energy Industries Association (SEIA), said in response to BLM's announced Western Solar Plan updates: "The proposal ... identifies 200,000 acres of land near transmission infrastructure, helping to correct an important oversight and streamline solar development. Under the current policy, there are at least 80 million acres of federal lands open to oil and gas development, which is 100 times the amount of public land available for solar. BLM's proposal is a big step in the right direction and recognizes the key role solar plays in our energy economy."
AI

OpenAI CEO Sam Altman Is Still Chasing Billions To Build AI Chips 11

According to Bloomberg (paywalled), OpenAI CEO Sam Altman is reportedly raising billions to develop a global network of chip fabrication factories, collaborating with leading chip manufacturers to address the high demand for chips required for advanced AI models. The Verge reports: A major cost and limitation for running AI models is having enough chips to handle the computations behind bots like ChatGPT or DALL-E that answer prompts and generate images. Nvidia's value rose above $1 trillion for the first time last year, partly due to the virtual monopoly it holds: GPT-4, Gemini, Llama 2, and other models depend heavily on its popular H100 GPUs.

Accordingly, the race to manufacture more high-powered chips to run complex AI systems has only intensified. The limited number of fabs capable of making high-end chips drives Altman, or anyone else, to bid for capacity years ahead of need in order to produce new chips. And going against the likes of Apple requires deep-pocketed investors who will front costs that the nonprofit OpenAI still can't afford. SoftBank Group and Abu Dhabi-based AI holding company G42 have reportedly been in talks about raising money for Altman's project.
Hardware

Researchers Claim First Functioning Graphene-Based Chip (ieee.org) 4

An anonymous reader quotes a report from IEEE Spectrum: Researchers at Georgia Tech, in Atlanta, have developed what they are calling the world's first functioning graphene-based semiconductor. This breakthrough holds the promise to revolutionize the landscape of electronics, enabling faster traditional computers and offering a new material for future quantum computers. The research, published on January 3 in Nature and led by Walt de Heer, a professor of physics at Georgia Tech, focuses on leveraging epitaxial graphene, a crystal structure of carbon chemically bonded to silicon carbide (SiC). This novel semiconducting material, dubbed semiconducting epitaxial graphene (SEC) -- or alternatively, epigraphene -- boasts enhanced electron mobility compared with that of traditional silicon, allowing electrons to traverse with significantly less resistance. The outcome is transistors capable of operating at terahertz frequencies, offering speeds 10 times those of the silicon-based transistors used in current chips.

De Heer describes the method used as a modified version of an extremely simple technique that has been known for over 50 years. "When silicon carbide is heated to well over 1,000 °C, silicon evaporates from the surface, leaving a carbon-rich surface which then forms into graphene," says de Heer. This heating step is done with an argon quartz tube in which a stack of two SiC chips is placed in a graphite crucible, according to de Heer. Then a high-frequency current is run through a copper coil around the quartz tube, which heats the graphite crucible through induction. The process takes about an hour. De Heer added that the SEC produced this way is essentially charge neutral, and when exposed to air, it will spontaneously be doped by oxygen. This oxygen doping is easily removed by heating it at about 200 °C in vacuum. "The chips we use cost about [US] $10, the crucible about $1, and the quartz tube about $10," said de Heer. [...]

De Heer and his research team concede, however, that further exploration is needed to determine whether graphene-based semiconductors can surpass the current superconducting technology used in advanced quantum computers. The Georgia Tech team does not envision incorporating graphene-based semiconductors with standard silicon or compound semiconductor lines. Instead, they are aiming for a paradigm shift beyond silicon, utilizing silicon carbide. They are developing methods, such as coating SEC with boron nitride, to protect and enhance its compatibility with conventional semiconductor lines. Comparing their work with commercially available graphene field-effect transistors (GFETs), de Heer explains that there is a crucial difference: "Conventional GFETs do not use semiconducting graphene, making them unsuitable for digital electronics requiring a complete transistor shutdown." He says that the SEC developed by his team allows for a complete shutdown, meeting the stringent requirements of digital electronics. De Heer says that it will take time to develop this technology. "I compare this work to the Wright brothers' first 100-meter flight. It will mainly depend on how much work is done to develop it."

China

US To Ban Pentagon From Buying Batteries From China's CATL, BYD (bnnbloomberg.ca) 17

U.S. lawmakers have banned the Defense Department from buying batteries produced by China's biggest manufacturers. "The rule implemented as part of the latest National Defense Authorization Act that passed on Dec. 22 will prevent procuring batteries from Contemporary Amperex Technology Co. Ltd., BYD Co. and four other Chinese companies beginning in October 2027," reports Bloomberg. From the report: The measure doesn't extend to commercial purchases by companies such as Ford, which is licensing technology from CATL to build electric-vehicle batteries in Michigan. Tesla also sources some of its battery cells from BYD, which became the new top-selling EV maker globally in the fourth quarter. The four other manufacturers whose batteries will be banned are Envision Energy Ltd., EVE Energy Co., Gotion High Tech Co. and Hithium Energy Storage Technology Co.

The decision still requires Pentagon officials to more clearly define the reach of the new rule. It adds to previous provisions outlined by the NDAA that decoupled the Defense Department's supply chain from China, including restrictions on use of Chinese semiconductors. While the Defense Department bans apply strictly to defense procurement, industries and lawmakers closely follow the rules as a guide for what materials, products and companies to trust in their own course of business.

Japan

Japan's SLIM Probe Lands On Moon, But Suffers Power Problem (space.com) 17

Geoffrey.landis writes: The Japan SLIM spacecraft has successfully landed on the moon, but power problems mean it may be a short mission. The good news is that the landing was successful, making Japan only the fifth nation to successfully make a lunar landing, and the ultra-miniature rover and the hopper both deployed. The bad news is that the solar arrays aren't producing power, and unless they can fix the problem in the next few hours, the batteries will be depleted and it will die. But, short mission or long, hurrah for Japan for being the fifth country to successfully land a mission on the surface of the moon (on their third try; two previous missions didn't make it). It's a rather amazing mission. I've never seen a spacecraft concept that lands under rocket power vertically but then rotates over to rest horizontally on the surface.
Data Storage

30TB Hard Drives Are Nearly Here (tomshardware.com) 74

Seagate this week unveiled the industry's first hard disk drive platform that uses heat-assisted magnetic recording (HAMR). Tom's Hardware: The new Mozaic 3+ platform relies on several all-new technologies, including new media, new write and read heads, and a brand-new controller. The platform will be used for Seagate's upcoming Exos hard drives for cloud datacenters with a 30TB capacity and higher. Heat-assisted magnetic recording is meant to radically increase areal recording density of magnetic media by making writes while the recording region is briefly heated to a point where its magnetic coercivity drops significantly.

Seagate's Mozaic 3+ uses 10 glass disks with a magnetic layer consisting of an iron-platinum superlattice structure that ensures both longevity and smaller media grain size compared to typical HDD platters. To record the media, the platform uses a plasmonic writer sub-system with a vertically integrated nanophotonic laser that heats the media before writing. Because individual grains are so small with the new media, their individual magnetic signatures are lower, whereas the magnetic inter-track interference (ITI) effect is somewhat higher. As a result, Seagate had to introduce its new Gen 7 Spintronic Reader, which features the "world's smallest and most sensitive magnetic field reading sensors," according to the company. Because Seagate's new Mozaic 3+ platform deals with new media with a very small grain size, an all-new writer, and a reader that features multiple tiny magnetic field readers, it also requires a lot of compute horsepower to orchestrate the drive's work. Therefore, Seagate has equipped the Mozaic 3+ platform with an all-new controller made on a 12nm fabrication process.
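The 10-disk, 30 TB figures imply a per-platter density that is easy to sanity-check (an illustrative back-of-envelope calculation, not official Seagate specifications):

```python
# Back-of-envelope capacity per platter for a 30 TB drive built on 10 disks
# (illustrative assumptions, not official Seagate figures).
total_tb = 30
disks = 10
per_platter_tb = total_tb / disks      # 3.0 TB per platter
per_surface_tb = per_platter_tb / 2    # 1.5 TB per side, if both sides are recorded
```

Three terabytes per platter is roughly half again the per-platter capacity of current conventional drives, which is the step change HAMR's smaller grain size is meant to deliver.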

AI

Sam Altman Says AI Depends On Energy Breakthrough (reuters.com) 105

An anonymous reader quotes a report from Reuters: OpenAI's CEO Sam Altman on Tuesday said an energy breakthrough is necessary for future artificial intelligence, which will consume vastly more power than people have expected. Speaking at a Bloomberg event on the sidelines of the World Economic Forum's annual meeting in Davos, Altman said the silver lining is that more climate-friendly sources of energy, particularly nuclear fusion or cheaper solar power and storage, are the way forward for AI. "There's no way to get there without a breakthrough," he said. "It motivates us to go invest more in fusion."

In 2021, Altman personally provided $375 million to private U.S. nuclear fusion company Helion Energy, which since has signed a deal to provide energy to Microsoft in future years. Microsoft is OpenAI's biggest financial backer and provides it computing resources for AI. Altman said he wished the world would embrace nuclear fission as an energy source as well.
Further reading: Microsoft Needs So Much Power to Train AI That It's Considering Small Nuclear Reactors
Hardware

80 Years Later, GCHQ Releases New Images of Nazi Code-Breaking Computer (arstechnica.com) 79

An anonymous reader quotes a report from Ars Technica: On Thursday, the UK's Government Communications Headquarters (GCHQ) announced the release of previously unseen images and documents related to Colossus, one of the first digital computers. The release marks the 80th anniversary of the code-breaking machines that significantly aided the Allied forces during World War II. While some in the public knew of the computers earlier (PDF), the UK did not formally acknowledge the project's existence until the 2000s.

Colossus was not one computer but a series of computers developed by British scientists between 1943 and 1945. These 2-meter-tall electronic beasts played an instrumental role in breaking the Lorenz cipher, a code used for communications between high-ranking German officials in occupied Europe. The computers were said to have allowed the Allies to "read Hitler's mind," according to The Sydney Morning Herald. The technology behind Colossus was highly innovative for its time. Tommy Flowers, the engineer behind its construction, used over 2,500 vacuum tubes to create logic gates, a precursor to the semiconductor-based electronic circuits found in modern computers. While 1945's ENIAC was long considered the clear front-runner in digital computing, the revelation of Colossus' earlier existence repositioned it in computing history. (However, it's important to note that ENIAC was a general-purpose computer, and Colossus was not.)
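The Lorenz machine enciphered 5-bit teleprinter code by combining it with a pseudorandom key stream using XOR (modulo-2 addition), which is why Colossus' tube-based logic gates centered on that operation. A minimal sketch of the principle (illustrative only; not the actual cipher machinery or Colossus' circuitry):

```python
# Illustrative XOR (modulo-2) combination of a 5-bit teleprinter character
# with a key character -- the operation at the heart of the Lorenz cipher.
def xor5(char: int, key: int) -> int:
    """Combine two 5-bit values with XOR; applying the same key again decrypts."""
    return (char ^ key) & 0b11111

cipher = xor5(0b10110, 0b01101)
# XOR is self-inverse: applying the same key recovers the plaintext.
assert xor5(cipher, 0b01101) == 0b10110
```

Colossus exploited statistical biases in this key stream, counting XOR combinations at high speed across intercepted ciphertext.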

GCHQ's public sharing of archival documents includes several photos of the computer at different periods and a letter discussing Tommy Flowers' groundbreaking work that references the interception of "rather alarming German instructions." Following the war, the UK government issued orders for the destruction of most Colossus machines, and Flowers was required to turn over all related documentation. The GCHQ claims that the Colossus tech "was so effective, its functionality was still in use by us until the early 1960s." In the GCHQ press release, Director Anne Keast-Butler paid tribute to Colossus' place in the UK's lineage of technological innovation: "The creativity, ingenuity and dedication shown by Tommy Flowers and his team to keep the country safe were as crucial to GCHQ then as today."

Robotics

BMW Will Employ Figure's Humanoid Robot At South Carolina Plant (techcrunch.com) 91

Figure's first humanoid robot will be coming to a BMW manufacturing facility in South Carolina. TechCrunch reports: BMW has not disclosed how many Figure 01 models it will deploy initially. Nor do we know precisely what jobs the robot will be tasked with when it starts work. Figure did, however, confirm with TechCrunch that it is beginning with an initial five tasks, which will be rolled out one at a time. While folks in the space have been cavalierly tossing out the term "general purpose" to describe these sorts of systems, it's important to temper expectations and point out that they will all arrive as single- or multi-purpose systems, growing their skillset over time. Figure CEO Brett Adcock likens the approach to an app store -- something that Boston Dynamics currently offers with its Spot robot via SDK.

Likely initial applications include standard manufacturing tasks such as box moving, pick and place and pallet unloading and loading -- basically the sort of repetitive tasks for which factory owners claim to have difficulty retaining human workers. Adcock says that Figure expects to ship its first commercial robot within a year, an ambitious timeline even for a company that prides itself on quick turnaround times. The initial batch of applications will be largely determined by Figure's early partners like BMW. The system will, for instance, likely be working with sheet metal to start. Adcock adds that the company has signed up additional clients, but declined to disclose their names. It seems likely Figure will instead opt to announce each individually to keep the news cycle spinning in the intervening 12 months.

Education

'A Groundbreaking Study Shows Kids Learn Better On Paper, Not Screens. Now What?' (theguardian.com) 130

In an opinion piece for the Guardian, American journalist and author John R. MacArthur discusses the alarming decline in reading skills among American youth, highlighted by a Department of Education survey showing significant drops in text comprehension since 2019-2020, with the situation worsening since 2012. While remote learning during the pandemic and other factors like screen-based reading are blamed, a new study by Columbia University suggests that reading on paper is more effective for comprehension than reading on screens, a finding not yet widely adopted in digital-focused educational approaches. From the report: What if the principal culprit behind the fall of middle-school literacy is neither a virus, nor a union leader, nor "remote learning"? Until recently there has been no scientific answer to this urgent question, but a soon-to-be published, groundbreaking study from neuroscientists at Columbia University's Teachers College has come down decisively on the matter: for "deeper reading" there is a clear advantage to reading a text on paper, rather than on a screen, where "shallow reading was observed." [...] [Dr Karen Froud] and her team are cautious in their conclusions and reluctant to make hard recommendations for classroom protocol and curriculum. Nevertheless, the researchers state: "We do think that these study outcomes warrant adding our voices ... in suggesting that we should not yet throw away printed books, since we were able to observe in our participant sample an advantage for depth of processing when reading from print."

I would go even further than Froud in delineating what's at stake. For more than a decade, social scientists, including the Norwegian scholar Anne Mangen, have been reporting on the superiority of reading comprehension and retention on paper. As Froud's team says in its article: "Reading both expository and complex texts from paper seems to be consistently associated with deeper comprehension and learning" across the full range of social scientific literature. But the work of Mangen and others hasn't influenced local school boards, such as Houston's, which keep throwing out printed books and closing libraries in favor of digital teaching programs and Google Chromebooks. Drunk on the magical realism and exaggerated promises of the "digital revolution," school districts around the country are eagerly converting to computerized test-taking and screen-reading programs at the precise moment when rigorous scientific research is showing that the old-fashioned paper method is better for teaching children how to read.

Indeed, for the tech boosters, Covid really wasn't all bad for public-school education: "As much as the pandemic was an awful time period," says Todd Winch, the Levittown, Long Island, school superintendent, "one silver lining was it pushed us forward to quickly add tech supports." Newsday enthusiastically reports: "Island schools are going all-in on high tech, with teachers saying they are using computer programs such as Google Classroom, I-Ready, and Canvas to deliver tests and assignments and to grade papers." Terrific, especially for Google, which was slated to sell 600 Chromebooks to the Jericho school district, and which since 2020 has sold nearly $14bn worth of the cheap laptops to K-12 schools and universities.

If only Winch and his colleagues had attended the Teachers College symposium that presented the Froud study last September. The star panelist was the nation's leading expert on reading and the brain, John Gabrieli, an MIT neuroscientist who is skeptical about the promises of big tech and its salesmen: "I am impressed how educational technology has had no effect on scale, on reading outcomes, on reading difficulties, on equity issues," he told the New York audience. "How is it that none of it has lifted, on any scale, reading? ... It's like people just say, 'Here is a product. If you can get it into a thousand classrooms, we'll make a bunch of money.' And that's OK; that's our system. We just have to evaluate which technology is helping people, and then promote that technology over the marketing of technology that has made no difference on behalf of students ... It's all been product and not purpose." I'll only take issue with the notion that it's "OK" to rob kids of their full intellectual potential in the service of sales -- before they even get started understanding what it means to think, let alone read.

Apple

Apple Again Banned From Selling Watches In US With Blood Oxygen Sensor (cnbc.com) 75

A U.S. Court of Appeals said Apple will again be barred from selling the Apple Watch Series 9 and Ultra 2 beginning Thursday. These models both contain a blood oxygen sensor that infringes on the intellectual property of medical device company Masimo.

"The court order Wednesday did not rule on Apple's effort to overturn a U.S. International Trade Commission ban on the company selling the affected watches in the United States," notes CNBC. "But it lifted an injunction that had blocked the ban from taking effect while that appeal is pending." From the report: In December, Apple chose to briefly remove the affected watches from its online and retail stores, though retailers with those devices in stock may still sell them. Earlier this week, court filings suggested that Apple had received approval from U.S. Customs for a modified version of its Apple Watches that lack the blood oxygen feature and therefore no longer infringe on Masimo's intellectual property. It could open a path for a modified Apple Watch to return to U.S. store shelves.
Robotics

'Student Should Have a Healthy-Looking BMI': How Universities Bend Over Backwards To Accommodate Food Delivery Robots (404media.co) 125

samleecole writes: A food delivery robot company instructed a public university to promote its service on campus with photographs and video featuring only students who "have a healthy-looking BMI," [body mass index] according to emails and documents I obtained via a public records request. The emails also discuss how ordering delivery via robot should become a "habit" for a "captured" customer base of students on campus.

These highly specific instructions show how universities around the country are going to extreme lengths to create a welcoming environment on campus for food delivery robots that sometimes have trouble crossing the street and need traffic infrastructure redesigned for them in order to navigate campus, a relatively absurd cache of public records obtained by 404 Media reveals.

China

China's Chip Imports Fell By a Record 15% Due To US Sanctions, Globally Weaker Demand (tomshardware.com) 49

According to Bloomberg, China's chip import value dropped significantly by 15.4% in 2023, from $413 billion to $349 billion. "Chip sales were down across the board in 2023 thanks to a weakening global economy, but China's chip imports indicate that its economy might be in trouble," reports Tom's Hardware. "The country's inability to import cutting-edge silicon is also certainly a factor in its decreasing chip imports." From the report: In 2022, the value of chip imports to China stood at $413 billion, and in 2023 the country only imported chips worth a total of $349 billion, a 15.4% decrease in value. That a drop happened at all isn't surprising; even TSMC, usually considered to be one of the most advanced fabbing corporations in the world, saw its sales decline by 4.5%. However, a 15.4% decrease in import value is much more significant, and indicates China has particular issues other than weaker demand across the world.

China's ongoing economic issues, such as its persistent deflation, could play a part. Deflation is when currency increases in value, the polar opposite of inflation, when currency loses value. As inflation has been a significant problem for countries such as the U.S. and UK, deflation might sound much more appealing, but economically it can be problematic. A deflationary economy encourages consumers not to spend, since money is increasing in value, meaning buyers can purchase more if they wait. In other words, deflation decreases demand for products like semiconductors.
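The wait-to-buy incentive can be made concrete with a small illustrative calculation (the 3% rate is hypothetical, not China's actual figure):

```python
# Illustrative: under deflation, waiting increases purchasing power.
deflation_rate = 0.03                  # hypothetical 3% annual price decline
money = 1000.0                         # units of currency held
price_now = 100.0                      # price of a good today
price_next_year = price_now * (1 - deflation_rate)   # 97.0 after one year
units_now = money / price_now                        # 10.0 units if spent today
units_next_year = money / price_next_year            # ~10.31 units if the buyer waits
```

The same cash buys about 3% more goods after a year of waiting, which is exactly the incentive that suppresses current demand.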

However, shipment volume only decreased by 10.8% compared to the 15.4% decline in value, meaning the chips that China didn't buy in 2023 were particularly valuable. This likely reflects U.S. sanctions on China, which prevent it from buying top-end graphics cards, especially from Nvidia. The H100, H200, GH200, and the RTX 4090 are illegal to ship to China, and they're some of Nvidia's best GPUs. The moving target for U.S. sanctions could also make exporters and importers more hesitant, as it's hard to tell if more sanctions could suddenly upend plans and business deals.
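The gap between the two declines implies how much the average value per imported chip fell; a quick check using the article's numbers (illustrative arithmetic only):

```python
# Implied change in average unit value of China's chip imports, 2023 vs 2022.
value_change = -0.154    # total import value fell 15.4%
volume_change = -0.108   # shipment volume fell 10.8%
# Average value per chip = total value / volume, so its relative change is:
avg_unit_value_change = (1 + value_change) / (1 + volume_change) - 1   # ~-5.2%
```

The average unit value fell about 5.2%, consistent with the article's point that the foregone imports skewed toward expensive, sanctioned high-end parts.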

Power

World's First Floating Offshore Wind Farm To Be Taken Offline For Up To 4 Months (electrek.co) 142

An anonymous reader quotes a report from Electrek: The world's first floating offshore wind farm, Hywind Scotland, is coming offline for three to four months for "heavy maintenance." Hywind Scotland's operator, Norwegian power giant Equinor, says that operational data has indicated that its wind turbines need work. The pilot project has been in operation since 2017. The five Siemens Gamesa turbines will be towed to Norway this summer. An Equinor spokesperson said, "This is the first such operation for a floating farm, and the safest method to do this is to tow the turbines to shore and execute the operations in sheltered conditions."

Norwegian contractor Wergeland Group will undertake the work. The spokesperson added, "Wergeland is the closest port with offshore wind experience and sufficient water depth that can service these turbines." As the world's first floating offshore wind farm, Hywind Scotland has trailblazed for much larger floating wind farms now in the pipeline. Its five floating wind turbines have a total capacity of 30 megawatts (MW). It generates enough electricity to power the equivalent of 34,000 households in the UK. Each turbine's maximum height, base to turbine, is 253 meters (830 feet). [...] Equinor said in December 2022, when Hywind Scotland turned five, that it was the world's best-performing offshore wind farm, achieving a capacity factor of 54% over its five years of operations.
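The article's figures cross-check neatly (illustrative arithmetic, not Equinor's accounting): a 54% capacity factor on 30 MW over a year yields roughly 142 GWh, or about 4,200 kWh per household for 34,000 homes, close to typical UK annual household consumption.

```python
# Cross-check of Hywind Scotland's output figures (illustrative arithmetic).
capacity_mw = 30
capacity_factor = 0.54
hours_per_year = 8760
energy_mwh = capacity_mw * capacity_factor * hours_per_year   # ~141,912 MWh/year
per_household_kwh = energy_mwh * 1000 / 34_000                # ~4,174 kWh per home
```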

Earth

Can Pumping CO2 Into California's Oil Fields Help Stop Global Warming? (yahoo.com) 83

America's Environmental Protection Agency "has signed off on a California oil company's plans to permanently store carbon emissions deep underground to combat global warming," reports the Los Angeles Times: California Resources Corp., the state's largest oil and gas company, applied for permission to send 1.46 million metric tons of carbon dioxide each year into the Elk Hills oil field, a depleted oil reservoir about 25 miles outside of downtown Bakersfield. The emissions would be collected from several industrial sources nearby, compressed into a liquid-like state and injected into porous rock more than one mile underground.

Although this technique has never been performed on a large scale in California, the state's climate plan calls for these operations to be widely deployed across the Central Valley to reduce carbon emissions from industrial facilities. The EPA issued a draft permit for the California Resources Corp. project, which is poised to be finalized in March following public comments. As California transitions away from oil production, a new business model for fossil fuel companies has emerged: carbon management. Oil companies have heavily invested in transforming their vast network of exhausted oil reservoirs into long-term storage sites for planet-warming gases, including California Resources Corp., the largest nongovernmental owner of mineral rights in California...

[Environmentalists] say that the transportation and injection of CO2 — an asphyxiating gas that displaces oxygen — could lead to dangerous leaks. Nationwide, there have been at least 25 carbon dioxide pipeline leaks between 2002 and 2021, according to the U.S. Department of Transportation. Perhaps the most notable incident occurred in Satartia, Miss., in 2020 when a CO2 pipeline ruptured following heavy rains. The leak led to the hospitalization of 45 people and the evacuation of 200 residents... Under the EPA draft permit, California Resources Corp. must take a number of steps to mitigate these risks. The company must plug 157 wells to ensure the CO2 remains underground, monitor the injection site for leaks and obtain a $33-million insurance policy.

Canada-based Brookfield Corporation also invested $500 million, according to the article, with California Resources Corp. seeking permits for five projects — more than any company in the nation. "It's kind of reversing the role, if you will," says their chief sustainability officer. "Instead of taking oil and gas out, we're putting carbon in."

Meanwhile, there are applications for "about a dozen" more projects in California's Central Valley that could store millions of tons of carbon emissions in old oil and gas fields — and California Resources Corp says greater Los Angeles is also "being evaluated" as a potential storage site.
Robotics

The Global Project To Make a General Robotic Brain (ieee.org) 23

Generative AI "doesn't easily carry over into robotics," write two researchers in IEEE Spectrum, "because the Internet is not full of robotic-interaction data in the same way that it's full of text and images."

That's why they're working on a single deep neural network capable of piloting many different types of robots... Robots need robot data to learn from, and this data is typically created slowly and tediously by researchers in laboratory environments for very specific tasks... The most impressive results typically only work in a single laboratory, on a single robot, and often involve only a handful of behaviors... [W]hat if we were to pool together the experiences of many robots, so a new robot could learn from all of them at once? We decided to give it a try. In 2023, our labs at Google and the University of California, Berkeley came together with 32 other robotics laboratories in North America, Europe, and Asia to undertake the RT-X project, with the goal of assembling data, resources, and code to make general-purpose robots a reality...

The question is whether a deep neural network trained on data from a sufficiently large number of different robots can learn to "drive" all of them — even robots with very different appearances, physical properties, and capabilities. If so, this approach could potentially unlock the power of large datasets for robotic learning. The scale of this project is very large because it has to be. The RT-X dataset currently contains nearly a million robotic trials for 22 types of robots, including many of the most commonly used robotic arms on the market...

Surprisingly, we found that our multirobot data could be used with relatively simple machine-learning methods, provided that we follow the recipe of using large neural-network models with large datasets. Leveraging the same kinds of models used in current LLMs like ChatGPT, we were able to train robot-control algorithms that do not require any special features for cross-embodiment. Much like a person can drive a car or ride a bicycle using the same brain, a model trained on the RT-X dataset can simply recognize what kind of robot it's controlling from what it sees in the robot's own camera observations. If the robot's camera sees a UR10 industrial arm, the model sends commands appropriate to a UR10. If the model instead sees a low-cost WidowX hobbyist arm, the model moves it accordingly.
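The cross-embodiment idea described above can be sketched as a single policy interface: one model maps a camera observation (from which it infers the robot type) plus an instruction to an action, with no per-robot branching in the caller's code. This is a schematic sketch only; all names and shapes are hypothetical, and the real RT-X models are large neural networks, not the placeholder below.

```python
# Schematic sketch of a cross-embodiment policy: one model, many robots.
# Names and shapes are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    image: List[float]     # stand-in for the robot's own camera frame
    instruction: str       # natural-language task, e.g. "pick up the cup"

class CrossEmbodimentPolicy:
    """A single policy for many robots: it infers which robot it is
    controlling from the camera image alone, then emits actions in that
    robot's command space."""
    def act(self, obs: Observation) -> List[float]:
        # A real policy would run a neural network over obs.image and
        # obs.instruction; here we return a fixed-size placeholder action.
        return [0.0] * 7   # e.g. a 7-DoF arm command

policy = CrossEmbodimentPolicy()
action = policy.act(Observation(image=[0.0] * 4, instruction="pick up the cup"))
```

The key point is that the caller never tells the policy which robot it is; as the authors describe, the model recognizes a UR10 or a WidowX from its own camera view.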

"To test the capabilities of our model, five of the laboratories involved in the RT-X collaboration each tested it in a head-to-head comparison against the best control system they had developed independently for their own robot... Remarkably, the single unified model provided improved performance over each laboratory's own best method, succeeding at the tasks about 50 percent more often on average." And they then used a pre-existing vision-language model to successfully add the ability to output robot actions in response to image-based prompts.

"The RT-X project shows what is possible when the robot-learning community acts together... and we hope that RT-X will grow into a collaborative effort to develop data standards, reusable models, and new techniques and algorithms."

Thanks to long-time Slashdot reader Futurepower(R) for sharing the article.
Power

Chinese Company Announces Mass Production of Small Nuclear Battery With 50-Year Lifespan (tomshardware.com) 172

"Chinese company Betavolt has announced an atomic energy battery for consumers with a touted 50-year lifespan," reports Tom's Hardware: The Betavolt BV100 will be the first product to launch using the firm's new atomic battery technology, constructed using a nickel-63 isotope and diamond semiconductor material. Betavolt says that its nuclear battery will target aerospace, AI devices, medical, MEMS systems, intelligent sensors, small drones, and robots — and may eventually mean manufacturers can sell smartphones that never need charging...

[T]he BV100, which is in the pilot stage ahead of mass production, doesn't offer a lot of power. This 15 x 15 x 5mm battery delivers 100 microwatts at 3 volts. Betavolt notes that multiple BV100 batteries can be used together in series or parallel, depending on device requirements, and says it plans to launch a 1-watt version of its atomic battery in 2025. The new BV100 is claimed to be a disruptive product on two counts. Firstly, a safe miniature atomic battery with 50 years of maintenance-free stamina is a breakthrough. Secondly, Betavolt claims it is the only company in the world with the technology to dope large-size diamond semiconductor materials, as used by the BV100. It is using its 4th Gen diamond semiconductor material here...
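Some back-of-the-envelope arithmetic puts the BV100's rating in perspective. These helpers assume ideal cells combined in series/parallel as the article mentions; they rest only on the two quoted figures (100 microwatts, 3 volts), not on any further Betavolt specification.

```python
# Rough figures for the BV100's stated rating: 100 microwatts at 3 V.
# Assumes ideal cells; not based on a Betavolt datasheet.

SINGLE_POWER_W = 100e-6   # 100 microwatts per cell
SINGLE_VOLTAGE_V = 3.0    # volts per cell

def current_amps(power_w: float, voltage_v: float) -> float:
    """Current delivered at a given power and voltage (I = P / V)."""
    return power_w / voltage_v

def pack_power(n_series: int, m_parallel: int) -> float:
    """Total power of an n-series x m-parallel pack of ideal cells."""
    return SINGLE_POWER_W * n_series * m_parallel

# A single cell supplies only about 33 microamps at 3 V, which is why
# combining cells matters: a 10 x 10 pack would reach roughly 10 mW.
```

Even a hundred cells stay in the milliwatt range, which is why the battery targets sensors and MEMS devices rather than phone-scale loads for now.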

[T]he Betavolt BV100 is claimed to be safe for consumers and won't leak radiation even if subjected to gunshots or puncture... Betavolt's battery uses a nickel-63 isotope as the energy source, which decays to a stable isotope of copper. This, plus the diamond semiconductor material, helps the BV100 operate stably in environments ranging from -60 to 120 degrees Celsius, according to the firm...

Betavolt will be well aware of devices with a greater thirst for power and teases that it is investigating isotopes such as strontium-90, promethium-147, and deuterium to develop atomic energy batteries with higher power levels and even longer service lives — up to 230 years.

Thanks to long-time Slashdot reader hackingbear for sharing the news.
Power

Wind Turbines Are Friendlier To Birds Than Oil-and-Gas Drilling, Study Finds (yahoo.com) 80

A new analysis suggests that wind turbines have little impact on bird populations, according to the Economist — and that oil-and-gas extraction may be worse: Erik Katovich [an economist at the University of Geneva] combined bird population and species maps with the locations and construction dates of all wind turbines in the United States, with the exceptions of Alaska and Hawaii, between 2000 and 2020. He found that building turbines had no discernible effect on bird populations. That reassuring finding held even when he looked specifically at large birds like hawks, vultures and eagles that many people believe are particularly vulnerable to being struck.

But Dr. Katovich did not confine his analysis to wind power alone. He also examined oil-and-gas extraction. Like wind power, this has boomed in America over the past couple of decades, with the rise of shale gas produced by hydraulic fracturing, or fracking, of rocks. Production has risen from 37 billion cubic metres in 2007 to 740 billion cubic metres in 2020. Comparing bird populations to the locations of new gas wells revealed an average 15% drop in bird numbers when new wells were drilled, probably due to a combination of noise, air pollution and the disturbance of rivers and ponds that many birds rely upon. When drilling happens in places designated by the National Audubon Society as "important bird areas", bird numbers instead dropped by 25%. Such places are typically migration hubs, feeding grounds or breeding locations.

Wind power, in other words, not only produces far less planet-heating carbon dioxide and methane than do fossil fuels. It appears to be significantly less damaging to wildlife, too.

Thanks to long-time Slashdot reader SpzToid for sharing the article.
Earth

America Cracks Down on Methane Emissions from Oil and Gas Facilities (msn.com) 36

Friday America's Environmental Protection Agency "proposed steep new fees on methane emissions from oil and gas facilities," reports the Washington Post, "escalating a crackdown on the fossil fuel industry's planet-warming pollution."

Methane does not linger in the atmosphere as long as carbon dioxide, but it is far more effective at trapping heat — roughly 80 times more potent in its first decade. It is responsible for roughly a third of global warming today, and the oil and gas industry accounts for about 14 percent of the world's annual methane emissions, according to estimates from the International Energy Agency. Other large methane sources include livestock, landfills and coal mines.
So America's new Methane Emissions Reduction Program "levies a fee on wasteful methane emissions from large oil and gas facilities," according to the article: The fee starts at $900 per metric ton of emissions in 2024, increasing to $1,200 in 2025 and $1,500 in 2026 and thereafter. The EPA proposal lays out how the fee will be implemented, including how the charge will be calculated...
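The escalating schedule quoted above can be sketched in a few lines. This is a minimal illustration of the per-ton rates only ($900 in 2024, $1,200 in 2025, $1,500 from 2026 on); the real EPA charge applies only to wasteful emissions above facility thresholds, with exemptions for compliant operators, none of which is modeled here.

```python
# Minimal sketch of the Methane Emissions Reduction Program fee
# schedule as described in the article. Thresholds and exemptions
# in the actual EPA proposal are not modeled.

def methane_fee_usd(year: int, metric_tons: float) -> float:
    """Fee owed on a given tonnage of wasteful methane in a given year."""
    if year < 2024:
        return 0.0  # the fee begins with 2024 emissions
    rate_per_ton = {2024: 900, 2025: 1200}.get(year, 1500)
    return rate_per_ton * metric_tons

# Example: 100 metric tons of wasteful emissions in 2026 -> $150,000.
```

The flat $1,500 rate from 2026 onward means the incentive to cut leaks stops growing after that year; the pressure then comes from the fee base itself.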

At the U.N. Climate Change Conference in Dubai in December, EPA Administrator Michael Regan announced final standards to limit methane emissions from U.S. oil and gas operations. Fossil fuel companies that comply with these standards will be exempt from the new fee... Fred Krupp, president of the Environmental Defense Fund, said the fee will encourage fossil fuel firms to deploy innovative technologies that detect methane leaks. Such cutting-edge technologies range from ground-based sensors to satellites in space. "Proven solutions to cut oil and gas methane and to avoid the fee are being used by leading companies in states across the country," Krupp said in a statement...

In addition to methane, the EPA proposal could slash emissions of hazardous air pollutants, including smog-forming volatile organic compounds and cancer-causing benzene [according to an EPA official].

The federal government also gave America's fossil fuel companies nearly $1 billion to help them comply with the methane regulation, according to the article.

The article also includes this statement from an executive at the American Petroleum Institute, the top lobbying arm of the U.S. oil and gas industry, complaining that the fines create a "regime" that would "stifle innovation," and urging Congress to repeal it.
Hardware

Micron Displays Next-Gen LPCAMM2 Modules For Laptops At CES 2024 28

At CES 2024 this week, Micron demonstrated its next-gen LPCAMM2 memory modules based on LPDDR5X memory. Not only are they smaller and more capable than traditional SODIMMs, they can be "serviced during the manufacturing process and upgraded by the user," says Micron. Tom's Hardware reports: Micron's LPCAMM2 are industry-standard memory modules that will be available in 16 GB, 32 GB, and 64 GB capacities, with speed bins of up to a 9600 MT/s data transfer rate. These modules are designed to replace both conventional SODIMMs and soldered-down LPDDR5X memory subsystems while offering the best of both worlds: the flexibility, repairability, and upgradeability of modular memory as well as the high performance and low power consumption of mobile DRAM. Indeed, a Micron LPCAMM2 module is smaller than a traditional SODIMM despite having a 128-bit memory interface and up to 64 GB of LPDDR5X memory onboard. Needless to say, the module is massively smaller, in both height and physical footprint, than the two SODIMM sticks needed to provide a 128-bit memory interface.
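A quick sanity check on the headline numbers: peak transfer rate is transfers per second times bytes moved per transfer. The 153.6 GB/s figure below follows arithmetically from the quoted 9600 MT/s and 128-bit interface; it is a theoretical peak, not a figure from a Micron datasheet.

```python
# Theoretical peak bandwidth of a DRAM interface:
# transfers/second * bytes per transfer.

def peak_bandwidth_gb_s(mt_per_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a given transfer rate and bus width."""
    bytes_per_transfer = bus_width_bits / 8
    return mt_per_s * bytes_per_transfer / 1000  # MT/s * bytes -> GB/s

# One LPCAMM2 at 9600 MT/s over 128 bits peaks at 153.6 GB/s;
# a single 64-bit SODIMM at the same rate would peak at half that.
```

This is why one LPCAMM2 module can stand in for a pair of SODIMMs: the 128-bit interface matches two 64-bit sticks in aggregate width.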
