Microsoft

Microsoft Strikes Deal With Mistral in Push Beyond OpenAI (ft.com) 13

Microsoft has struck a deal with French AI startup Mistral as it seeks to broaden its involvement in the fast-growing industry beyond OpenAI. From a report: The US tech giant will provide the 10-month-old Paris-based company with help in bringing its AI models to market. Microsoft will also take a minor stake in Mistral, although the financial details have not been disclosed. The partnership makes Mistral the second company to provide commercial language models available on Microsoft's Azure cloud computing platform. Microsoft has already invested about $13 billion in San Francisco-based OpenAI, an alliance that is being reviewed by competition watchdogs in the US, EU and UK. Other Big Tech rivals, such as Google and Amazon, are also investing heavily in building generative AI -- software that can produce text, images and code in seconds -- which analysts believe has the capacity to shake up industries across the world. WSJ adds: On Monday, Mistral plans to announce a new AI model, called Mistral Large, that chief executive Arthur Mensch said can perform some reasoning tasks comparably with GPT-4, OpenAI's most advanced language model to date, and Gemini Ultra, Google's new model. Mensch said his new model cost less than 20 million euros, the equivalent of roughly $22 million, to train. By contrast, OpenAI Chief Executive Sam Altman said last year after the release of GPT-4 that training his company's biggest models cost "much more than" $50 million to $100 million.
Moon

Moon Landing's Payloads Include Archive of Human Knowledge, Lunar Data Center Test, NFTs (medium.com) 75

In 2019 a SpaceX Falcon 9 rocket launched an Israeli spacecraft carrying a 30-million-page archive of human civilization to the moon. Unfortunately, that spacecraft crashed. But thanks to this week's moon landing by the Odysseus, there's now a 30-million-page "Lunar Library" on the moon — according to a Medium post by the Arch Mission Foundation.

"This historic moment secures humanity's cultural heritage and knowledge in an indestructible archive built to last for up to billions of years." Etched onto thin sheets of nickel, called NanoFiche, the Lunar Library is practically indestructible and can withstand the harsh conditions of space... Some of the notable content includes:


The Wikipedia. The entire English Wikipedia containing over 6 million articles on every branch of knowledge.
Project Gutenberg. Portions of Project Gutenberg's library of over 70,000 free eBooks containing some of our most treasured literature.
The Long Now Foundation's Rosetta Project archive of over 7,000 human languages and The Panlex datasets.
Selections from the Internet Archive's collections of books and important documents and data sets.
The SETI Institute's Earthling Project, featuring a musical compilation of 10,000 vocal submissions representing humanity united.
The Arch Lunar Art Archive containing a collection of works from global contemporary and digital artists in 2022, recorded as NFTs.
David Copperfield's Magic Secrets — the secrets to all his greatest illusions — including how he will make the Moon disappear in the near future.
The Arch Mission Primer — which teaches a million concepts with images and words in 5 languages.
The Arch Mission Private Library — containing millions of pages as well as books, documents and articles on every subject, including a broad range of fiction and non-fiction, textbooks, periodicals, audio recordings, videos, historical documents, software sourcecode, data sets, and more.
The Arch Mission Vaults — private collections, including collections from our advisors and partners, and a collection of important texts and images from all the world's religions including the great religions and indigenous religions from around the world, collections of books, photos, and a collection of music by leading recording artists, and much more content that may be revealed in the future...


We also want to recognize our esteemed advisors, and our many content partners and collections including the Wikimedia Foundation, the Long Now Foundation, The SETI Institute Earthling Project, the Arch Lunar Art Archive project, Project Gutenberg, the Internet Archive, and the many donors who helped make the Lunar Library possible through their generous contributions. This accomplishment would not have happened without the collaborative support of so many...

We will continue to send backups of our important knowledge and cultural heritage — placing them on the surface of the Earth, in caves and deep underground bunkers and mines, and around the solar system as well. This is a mission that continues as long as humanity endures, and perhaps even long after we are gone, as a gift for whoever comes next.

Space.com has a nice rundown of the other new payloads that just landed on the moon. Some highlights:
  • "Cloud computing startup Lonestar's Independence payload is a lunar data center test mission for data storage and transmission from the lunar surface."
  • LRA (Laser Retroreflector Array) is a small hemisphere of light-reflectors built to serve as a precision landmark to "allow spacecraft to ping it with lasers to help them determine their precise distance..." (see the ranging sketch after this list)
  • ROLSES (Radio-wave Observations at the Lunar Surface of the photoElectron Sheath) is a radio spectrometer for measuring the electron density near the lunar surface, "and how it may affect radio observatories, as well as observing solar and planetary radio waves and other phenomena."
  • "Artist Jeff Koons is sending 125 miniature stainless steel Moon Phase sculptures, each honoring significant human achievements across cultures and history, to be displayed on the moon in a cube."
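Laser ranging of the kind the LRA enables reduces to a single formula: the reflector bounces a pulse straight back, so distance is the speed of light times half the round-trip time. A minimal sketch in Python (the echo time below is illustrative, not a mission figure):

    C = 299_792_458.0  # speed of light, m/s

    def range_from_echo(round_trip_seconds: float) -> float:
        """Distance to a retroreflector, from a laser pulse's round-trip time."""
        return C * round_trip_seconds / 2.0

    # A reflector ~2 km away returns an echo in roughly 13.3 microseconds.
    print(f"{range_from_echo(13.34e-6):,.1f} m")  # ~2,000 m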

Cloud

Service Mesh Linkerd Moves Its Stable Releases Behind a Paywall (techtarget.com) 13

TechTarget notes it was Linkerd's original developers who coined the term "service mesh" — describing their infrastructure layer for communication between microservices.

But "There has to be some way of connecting the businesses that are being built on top of Linkerd back to funding the project," argues Buoyant CEO William Morgan. "If we don't do that, then there's no way for us to evolve this project and to grow it in the way that I think we all want."

And so, TechTarget reports... Beginning May 21, 2024, any company with more than 50 employees running Linkerd in production must pay Buoyant $2,000 per Kubernetes cluster per month to access stable releases of the project...
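The pricing is per cluster, per month, so the arithmetic at fleet scale is simple; a quick sketch (the cluster counts here are hypothetical):

    PRICE_PER_CLUSTER_MONTH = 2_000  # USD, Buoyant's announced rate

    def annual_cost(clusters: int) -> int:
        """Yearly stable-release cost for a fleet of Kubernetes clusters."""
        return PRICE_PER_CLUSTER_MONTH * clusters * 12

    for clusters in (1, 5, 20):
        print(f"{clusters:>2} clusters: ${annual_cost(clusters):,}/year")
    #  1 clusters: $24,000/year
    #  5 clusters: $120,000/year
    # 20 clusters: $480,000/year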

The project's overall source code will remain available in GitHub, and edge, or experimental early releases of code, will continue to be committed to open source. But the additional work Buoyant developers put into stable releases (backporting minimal changes so they stay compatible with existing versions of Linkerd, and fixing bugs with reliability guarantees) will only be available behind a paywall, Morgan said... Morgan said he is prepared for backlash from the community about this change. In the last section of a company blog post FAQ about the update, Morgan included a question that reads, in part, "Who can I yell at...?"

But industry watchers flatly pronounced the change a departure from open source. "By saying, 'Sorry but we can no longer afford to hand out a production-ready product as free open source code,' Buoyant has removed the open source character of this project," said Torsten Volk, an analyst at Enterprise Management Associates. "This goes far beyond the popular approach of offering a managed version of a product that may include some additional premium features for a fee while still providing customers with the option to use the more basic open source version in production." Open source developers outside Buoyant won't want to contribute to the project — and Buoyant's bottom line — without receiving production-ready code in return, Volk predicted.

Morgan conceded that these are potentially valid concerns and said he's open to finding a way to resolve them with contributors... "I don't think there's a legal argument there, but there's an unresolved tension there, similar to testing edge releases — that's labor just as much as contributing is. I don't have a great answer to that, but it's not unique to Buoyant or Linkerd."

And so, "Starting in May, if you want the latest stable version of the open source Linkerd to download and run, you will have to go with Buoyant's commercial distribution," according to another report (though "there are discounts for non-profits, high-volume use cases, and other unique needs.") The Cloud Native Computing Foundation manages the open source project. The copyright is held by the Linkerd authors themselves. Linkerd is licensed under the Apache 2.0 license.

Buoyant CEO William Morgan explained in an interview with The New Stack that the changes in licensing are necessary to continue to ensure that Linkerd runs smoothly for enterprise users. Packaging the releases has also been demanding a lot of resources, perhaps even more than maintaining and advancing the core software itself, Morgan explained. He likened the approach to how Red Hat operates with Linux, offering Fedora as an early release while maintaining its core commercial offering, Red Hat Enterprise Linux (RHEL), for paying clients.

"If you want the work that we put into the stable releases, which is predominantly around, not just testing, but also minimizing the changes in subsequent releases, that's hard, hard work" requiring input from "world-leading experts in distributed systems," Morgan said. "Well, that's kind of the dark, proprietary side of things."

Businesses

Nvidia Posts Record Revenue Up 265% On Booming AI Business (cnbc.com) 27

In its fourth quarter earnings report today, Nvidia beat Wall Street's forecast for earnings and sales, causing shares to rise about 10% in extended trading. CNBC reports: Here's what the company reported compared with what Wall Street was expecting for the quarter ending in January, based on a survey of analysts by LSEG, formerly known as Refinitiv:

Earnings per share: $5.16 adjusted vs. $4.64 expected
Revenue: $22.10 billion vs. $20.62 billion expected

Nvidia said it expected $24.0 billion in sales in the current quarter. Analysts polled by LSEG were looking for $5.00 per share on $22.17 billion in sales. On a call with analysts, Nvidia CEO Jensen Huang addressed investor fears that the company may not be able to keep up this level of growth or sales for the whole year. "Fundamentally, the conditions are excellent for continued growth" in 2025 and beyond, Huang told analysts. He said demand for the company's GPUs will remain high due to generative AI and an industry-wide shift away from central processors to the accelerators that Nvidia makes.

Nvidia reported $12.29 billion in net income during the quarter, or $4.93 per share, up 769% versus last year's $1.41 billion or 57 cents per share. Nvidia's total revenue rose 265% from a year ago, based on strong sales for AI chips for servers, particularly the company's "Hopper" chips such as the H100, it said. "Strong demand was driven by enterprise software and consumer internet applications, and multiple industry verticals including automotive, financial services and health care," the company said in commentary provided to investors. Those sales are reported in the company's Data Center business, which now comprises the majority of Nvidia's revenue. Data center sales were up 409% to $18.40 billion. Over half the company's data center sales went to large cloud providers. [...]
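The growth figures are easy to sanity-check from the dollar amounts in the report (rounded inputs; the exact filing figures yield the reported 769%):

    def pct_growth(current: float, prior: float) -> float:
        """Year-over-year growth, as a percentage."""
        return (current - prior) / prior * 100.0

    print(f"Net income: +{pct_growth(12.29, 1.41):.0f}%")  # ~+772% on these rounded figures
    print(f"EPS:        +{pct_growth(4.93, 0.57):.0f}%")   # ~+765%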

The company's gaming business, which includes graphics cards for laptops and PCs, was merely up 56% year over year to $2.87 billion. Graphics cards for gaming used to be Nvidia's primary business before its AI chips started taking off, and some of Nvidia's graphics cards can be used for AI. Nvidia's smaller businesses did not show the same meteoric growth. Its automotive business declined 4% to $281 million in sales, and its OEM and other business, which includes crypto chips, rose 7% to $90 million. Nvidia's business making graphics hardware for professional applications rose 105% to $463 million.

Businesses

International Nest Aware Subscriptions Jump in Price, as Much As 100% (arstechnica.com) 43

Google's "Nest Aware" camera subscription is going through another round of price increases. From a report: This time it's for international users. There's no big announcement or anything, just a smattering of email screenshots from various countries on the Nest subreddit. 9to5Google was nice enough to hunt down a pile of the announcements. Nest Aware is a monthly subscription fee for Google's Nest cameras. Nest cameras exclusively store all their video in the cloud, and without the subscription, you aren't allowed to record video 24/7.

There are two sets of subscriptions to keep track of: the current generation subscription for modern cameras and the "first generation Nest Aware" subscription for older cameras. To give you an idea of what we're dealing with, in the US, the current free tier only gets you three hours of "event" video -- meaning video triggered by motion detection. Even the basic $8-a-month subscription doesn't get you 24/7 recording -- that's still only 30 days of event video. The "Nest Aware Plus" subscription, at $15 a month in the US, gets you 10 days of 24/7 video recording. The "first-generation" Nest Aware subscription, which is tied to earlier cameras and isn't available for new customers anymore, is doubling in price in Canada. The basic tier of five days of 24/7 video is going from a yearly fee of CA$50 to CA$110 (the first-generation sub has 24/7 video on every tier). Ten days of video is jumping from CA$80 to CA$160, and 30 days is going from CA$110 to CA$220. These are the prices for a single camera; the first-generation subscription will have additional charges for additional cameras. The current Nest Aware subscription for modern cameras is getting jumps similar to those in the US, with Nest Aware Plus, the mid-tier, going from CA$16 to CA$20 per month, and presumably similar increases across the board.
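Worked out as percentages, the first-generation Canadian jumps meet or exceed the "as much as 100%" in the headline; the five-day tier actually rises 120%:

    def pct_increase(old: float, new: float) -> float:
        return (new - old) / old * 100.0

    for days, old, new in ((5, 50, 110), (10, 80, 160), (30, 110, 220)):
        print(f"{days:>2}-day tier: CA${old} -> CA${new} (+{pct_increase(old, new):.0f}%)")
    #  5-day tier: CA$50 -> CA$110 (+120%)
    # 10-day tier: CA$80 -> CA$160 (+100%)
    # 30-day tier: CA$110 -> CA$220 (+100%)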

Sony

Sony's PlayStation Portal Hacked To Run Emulated PSP Games (theverge.com) 12

An anonymous reader shares a report: Sony's new PlayStation Portal has been hacked by Google engineers to run emulated games locally. The $199.99 handheld debuted in November but was limited to just streaming games from a PS5 console and not even titles from Sony's cloud gaming service. Now, two Google engineers have managed to get the PPSSPP emulator running natively on the PlayStation Portal, allowing a Grand Theft Auto PSP version to run on the Portal without Wi-Fi streaming required. "After more than a month of hard work, PPSSPP is running natively on PlayStation Portal. Yes, we hacked it," says Andy Nguyen in a post on X. Nguyen also confirms that the exploit is "all software based," so it doesn't require any hardware modifications like additional chips or soldering. Only a photo of Grand Theft Auto: Liberty City Stories running on the PlayStation Portal has been released so far, but Nguyen may release some videos to demonstrate the exploit this weekend.
Security

MIT Researchers Build Tiny Tamper-Proof ID Tag Utilizing Terahertz Waves (mit.edu) 42

A few years ago, MIT researchers invented a cryptographic ID tag — but like traditional RFID tags, "a counterfeiter could peel the tag off a genuine item and reattach it to a fake," writes MIT News.

"The researchers have now surmounted this security vulnerability by leveraging terahertz waves to develop an antitampering ID tag that still offers the benefits of being tiny, cheap, and secure." They mix microscopic metal particles into the glue that sticks the tag to an object, and then use terahertz waves to detect the unique pattern those particles form on the item's surface. Akin to a fingerprint, this random glue pattern is used to authenticate the item, explains Eunseok Lee, an electrical engineering and computer science (EECS) graduate student and lead author of a paper on the antitampering tag. "These metal particles are essentially like mirrors for terahertz waves. If I spread a bunch of mirror pieces onto a surface and then shine light on that, depending on the orientation, size, and location of those mirrors, I would get a different reflected pattern. But if you peel the chip off and reattach it, you destroy that pattern," adds Ruonan Han, an associate professor in EECS, who leads the Terahertz Integrated Electronics Group in the Research Laboratory of Electronics.

The researchers produced a light-powered antitampering tag that is about 4 square millimeters in size. They also demonstrated a machine-learning model that helps detect tampering by identifying similar glue pattern fingerprints with more than 99 percent accuracy. Because the terahertz tag is so cheap to produce, it could be implemented throughout a massive supply chain. And its tiny size enables the tag to attach to items too small for traditional RFIDs, such as certain medical devices...

"These responses are impossible to duplicate, as long as the glue interface is destroyed by a counterfeiter," Han says. A vendor would take an initial reading of the antitampering tag once it was stuck onto an item, and then store those data in the cloud, using them later for verification.

Seems like the only way to thwart that would be carving out the part of the surface where the tag was affixed — and then pasting the tag, glue, and what it adheres to all together onto some other surface. But more importantly, Han says they'd wanted to demonstrate "that the application of the terahertz spectrum can go well beyond broadband wireless."

"In this case, you can use terahertz for ID, security, and authentication. There are a lot of possibilities out there."
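MIT hasn't released matching code, but the verification step described above amounts to comparing a freshly measured reflection pattern against the enrolled reading and accepting only close matches. A minimal sketch of that idea, with patterns reduced to feature vectors and every name and threshold hypothetical:

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two reflection-pattern feature vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def authenticate(enrolled: np.ndarray, measured: np.ndarray,
                     threshold: float = 0.99) -> bool:
        # Peeling the tag rearranges the metal particles in the glue, so a
        # re-measured pattern should fall far below the threshold.
        return cosine_similarity(enrolled, measured) >= threshold

    rng = np.random.default_rng(0)
    enrolled = rng.normal(size=256)                          # vendor's initial reading, stored in the cloud
    rescanned = enrolled + rng.normal(scale=0.01, size=256)  # same tag, scanned again
    tampered = rng.normal(size=256)                          # tag peeled and reattached elsewhere
    print(authenticate(enrolled, rescanned), authenticate(enrolled, tampered))  # True False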
Earth

Ocean Temperatures Are Skyrocketing (arstechnica.com) 110

"For nearly a year now, a bizarre heating event has been unfolding across the world's oceans," reports Wired.

"In March 2023, global sea surface temperatures started shattering record daily highs and have stayed that way since..." says Brian McNoldy, a hurricane researcher at the University of Miami. "It's really getting to be strange that we're just seeing the records break by this much, and for this long...." Unlike land, which rapidly heats and cools as day turns to night and back again, it takes a lot to warm up an ocean that may be thousands of feet deep. So even an anomaly of mere fractions of a degree is significant. "To get into the two or three or four degrees, like it is in a few places, it's pretty exceptional," says McNoldy.

So what's going on here? For one, the oceans have been steadily warming over the decades, absorbing something like 90 percent of the extra heat that humans have added to the atmosphere...

A major concern with such warm surface temperatures is the health of the ecosystems floating there: phytoplankton that bloom by soaking up the sun's energy and the tiny zooplankton that feed on them. If temperatures get too high, certain species might suffer, shaking the foundations of the ocean food web. But more subtly, when the surface warms, it creates a cap of hot water, blocking the nutrients in colder waters below from mixing upwards. Phytoplankton need those nutrients to properly grow and sequester carbon, thus mitigating climate change...

Making matters worse, the warmer water gets, the less oxygen it can hold. "We have seen the growth of these oxygen minimum zones," says Dennis Hansell, an oceanographer and biogeochemist at the University of Miami. "Organisms that need a lot of oxygen, they're not too happy when the concentrations go down in any way — think of a tuna that is expending a lot of energy to race through the water."

But why is this happening? The article suggests less dust blowing from the Sahara desert to shade the oceans, but also 2020 regulations that reduced sulfur aerosols in shipping fuels. (This reduced toxic air pollution — but also some cloud cover.)

There was also an El Niño in the Pacific Ocean last summer — now waning — which complicates things, according to biological oceanographer Francisco Chavez of the Monterey Bay Aquarium Research Institute in California. "One of our challenges is trying to tease out what these natural variations are doing in relation to the steady warming due to increasing CO2 in the atmosphere."

But the article points out that even the Atlantic Ocean is heating up — and "sea surface temperatures started soaring last year well before El Niño formed." And last week the U.S. Climate Prediction Center predicted there's now a 55% chance of a La Niña developing between June and August, according to the article — which could increase the likelihood of Atlantic hurricanes.

Thanks to long-time Slashdot reader mrflash818 for sharing the article.
AI

Will 'Precision Agriculture' Be Harmful to Farmers? (substack.com) 61

Modern U.S. farming is being transformed by precision agriculture, writes Paul Roberts, the founder of securepairs.org and Editor in Chief at Security Ledger.

There are autonomous tractors and "smart spraying" systems that use AI-powered cameras to identify weeds, just for starters. "Among the critical components of precision agriculture: Internet- and GPS-connected agricultural equipment, highly accurate remote sensors, 'big data' analytics and cloud computing..." As with any technological revolution, however, there are both "winners" and "losers" in the emerging age of precision agriculture... Precision agriculture, once broadly adopted, promises to further reduce the need for human labor to run farms. (Autonomous equipment means you no longer even need drivers!) However, the risks it poses go well beyond a reduction in the agricultural work force. First, as the USDA notes on its website: the scale and high capital costs of precision agriculture technology tend to favor large, corporate producers over smaller farms. Then there are the systemic risks to U.S. agriculture of an increasingly connected and consolidated agriculture sector, with a few major OEMs having the ability to remotely control and manage vital equipment on millions of U.S. farms... (Listen to my podcast interview with the hacker Sick Codes, who reverse engineered a John Deere display to run the Doom video game, for insights into the company's internal struggles with cybersecurity.)

Finally, there are the reams of valuable and proprietary environmental and operational data that farmers collect, store and leverage to squeeze the maximum productivity out of their land. For centuries, such information resided in farmers' heads, or on written or (more recently) digital records that they owned and controlled exclusively, typically passing that knowledge and data down to succeeding generations of farm owners. Precision agriculture technology greatly expands the scope, and granularity, of that data. But in doing so, it also wrests it from the farmer's control and shares it with equipment manufacturers and service providers — often without the explicit understanding of the farmers themselves, and almost always without monetary compensation to the farmer for the data itself. In fact, the Federal Government is so concerned about farm data that it included a section (1619) on "information gathering" in the latest farm bill.

Over time, this massive transfer of knowledge from individual farmers or collectives to multinational corporations risks beggaring farmers by robbing them of one of their most vital assets, their data, and turning them into little more than passive caretakers of automated equipment that is managed by, controlled by, and accountable to distant corporate masters.

Weighing in is Kevin Kenney, a vocal advocate for the "right to repair" agricultural equipment (and also an alternative fuel systems engineer at Grassroots Energy LLC). In the interview, he warns about the dangers of tying repairs to factory-installed firmware, and argues that it's the long-time farmer's "trade secrets" that are really being harvested today. The ultimate beneficiary could end up being the current "cabal" of tractor manufacturers.

"While we can all agree that it's coming... the question is who will own these robots?" First, we need to acknowledge that there are existing laws on the books which, for whatever reason, are not being enforced. The FTC should immediately start an investigation into John Deere and the rest of the 'Tractor Cabal' to see to what extent farmers' farm data security and privacy are being compromised. This directly affects national food security because if thousands, or tens of thousands, of tractors are hacked and disabled or their data is lost, crops left to rot in the fields would lead to bare shelves at the grocery store... I think our universities have also been delinquent in grasping and warning farmers about the data-theft being perpetrated on farmers' operations throughout the United States and other countries by makers of precision agricultural equipment.
Thanks to long-time Slashdot reader chicksdaddy for sharing the article.
AI

Scientists Propose AI Apocalypse Kill Switches 104

A paper (PDF) from researchers at the University of Cambridge, supported by voices from numerous academic institutions and from OpenAI, proposes remote kill switches and lockouts as methods to mitigate risks associated with advanced AI technologies. It also recommends tracking AI chip sales globally. The Register reports: The paper highlights numerous ways policymakers might approach AI hardware regulation. Many of the suggestions -- including those designed to improve visibility and limit the sale of AI accelerators -- are already playing out at a national level. Last year US president Joe Biden put forward an executive order aimed at identifying companies developing large dual-use AI models as well as the infrastructure vendors capable of training them. If you're not familiar, "dual-use" refers to technologies that can serve double duty in civilian and military applications. More recently, the US Commerce Department proposed regulation that would require American cloud providers to implement more stringent "know-your-customer" policies to prevent persons or countries of concern from getting around export restrictions. This kind of visibility is valuable, researchers note, as it could help to avoid another arms race, like the one triggered by the missile gap controversy, where erroneous reports led to a massive build-up of ballistic missiles. While valuable, they warn, executing on these reporting requirements risks invading customer privacy and could even lead to sensitive data being leaked.

Meanwhile, on the trade front, the Commerce Department has continued to step up restrictions, limiting the performance of accelerators sold to China. But, as we've previously reported, while these efforts have made it harder for countries like China to get their hands on American chips, they are far from perfect. To address these limitations, the researchers have proposed implementing a global registry for AI chip sales that would track them over the course of their lifecycle, even after they've left their country of origin. Such a registry, they suggest, could incorporate a unique identifier into each chip, which could help to combat smuggling of components.

At the more extreme end of the spectrum, researchers have suggested that kill switches could be baked into the silicon to prevent their use in malicious applications. [...] The academics are clearer elsewhere in their study, proposing that processor functionality could be switched off or dialed down by regulators remotely using digital licensing: "Specialized co-processors that sit on the chip could hold a cryptographically signed digital 'certificate,' and updates to the use-case policy could be delivered remotely via firmware updates. The authorization for the on-chip license could be periodically renewed by the regulator, while the chip producer could administer it. An expired or illegitimate license would cause the chip to not work, or reduce its performance." In theory, this could allow watchdogs to respond faster to abuses of sensitive technologies by cutting off access to chips remotely, but the authors warn that doing so isn't without risk: if implemented incorrectly, such a kill switch could become a target for cybercriminals to exploit.
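Stripped to its essentials, the scheme the paper describes is a time-limited, signed certificate that an on-chip co-processor validates before allowing full operation. A toy sketch of that check (a symmetric HMAC stands in for the asymmetric signature a real design would use, and all names are illustrative):

    import hashlib, hmac, json, time

    REGULATOR_KEY = b"demo-signing-key"  # stands in for the regulator's key material

    def issue_license(chip_id: str, expires_at: float) -> dict:
        payload = json.dumps({"chip_id": chip_id, "expires_at": expires_at})
        tag = hmac.new(REGULATOR_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return {"payload": payload, "tag": tag}

    def chip_runs_at_full_speed(cert: dict, chip_id: str) -> bool:
        expected = hmac.new(REGULATOR_KEY, cert["payload"].encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, cert["tag"]):
            return False  # forged or corrupted certificate: throttle or disable
        claims = json.loads(cert["payload"])
        return claims["chip_id"] == chip_id and claims["expires_at"] > time.time()

    cert = issue_license("ACCEL-0001", time.time() + 30 * 86400)  # renewed periodically by the regulator
    print(chip_runs_at_full_speed(cert, "ACCEL-0001"))  # True until expiry, False afterward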

Another proposal would require multiple parties to sign off on potentially risky AI training tasks before they can be deployed at scale. "Nuclear weapons use similar mechanisms called permissive action links," they wrote. For nuclear weapons, these security locks are designed to prevent one person from going rogue and launching a first strike. For AI, however, the idea is that if an individual or company wanted to train a model over a certain threshold in the cloud, they'd first need to get authorization to do so. Though a potent tool, the researchers observe that this could backfire by preventing the development of desirable AI. The argument seems to be that while the use of nuclear weapons has a pretty clear-cut outcome, AI isn't always so black and white. But if this feels a little too dystopian for your tastes, the paper dedicates an entire section to reallocating AI resources for the betterment of society as a whole. The idea is that policymakers could come together to make AI compute more accessible to groups unlikely to use it for evil, a concept described as "allocation."
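That multiparty sign-off is, in effect, a k-of-n quorum rule, the digital analog of the two-man rule behind permissive action links. A sketch of the gating logic (the parties, quorum, and compute threshold are all hypothetical):

    APPROVERS = {"regulator", "cloud_provider", "external_auditor"}
    QUORUM = 2  # approvals required before a large training run may start

    def may_train(compute_flops: float, approvals: set[str],
                  threshold_flops: float = 1e26) -> bool:
        if compute_flops < threshold_flops:
            return True  # below the threshold, no sign-off needed
        return len(approvals & APPROVERS) >= QUORUM

    print(may_train(5e26, {"regulator"}))                      # False: only one approval
    print(may_train(5e26, {"regulator", "external_auditor"}))  # True: quorum reached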
Microsoft

Microsoft 'Retires' Azure IoT Central In Platform Rethink (theregister.com) 4

Lindsay Clark reports via The Register: In a statement on the Azure console, Microsoft confirmed the Azure IoT Central service is being retired on March 31, 2027. "Starting on April 1, 2024, you won't be able to create new application resources; however, all existing IoT Central applications will continue to function and be managed. Subscription {{subscriptionld} is not allowed to create new applications. Please create a support ticket to request an exception," the statement to customers, seen by The Register, said. According to a Microsoft "Learn" post from February 8, 2024, IoT Central is an IoT application platform as a service (aPaaS) designed to reduce work and costs while building, managing, and maintaining IoT solutions.

Microsoft's Azure IoT offering includes three pillars: IoT Hub, IoT Edge and IoT Central. IoT Hub is a cloud-based service that provides a "secure and scalable way to connect, monitor, and manage IoT devices and sensors," according to Microsoft. Azure IoT Edge is designed to allow devices to run cloud-based workloads locally. And Azure IoT Central is a fully managed, cloud-based IoT solution for connecting and managing devices at scale. Central is a layer above Hub in the architecture, and Hub itself may well continue. One developer told The Register there was no warning about Hub on the Azure console. As for IoT Edge, it is "a device-focused runtime that enables you to deploy, run, and monitor containerized Linux workloads." Microsoft has not said whether this would continue.
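For teams weighing a move from IoT Central down to the IoT Hub layer, the basic device-side path looks roughly like this with Microsoft's azure-iot-device Python SDK (a sketch based on the v2 SDK; the connection string is a placeholder):

    import json
    from azure.iot.device import IoTHubDeviceClient, Message  # pip install azure-iot-device

    # Placeholder; a real value comes from the device's registration in IoT Hub.
    CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

    client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
    client.connect()
    client.send_message(Message(json.dumps({"temperature": 21.5})))  # one telemetry message
    client.shutdown()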

Cloud

Nginx Core Developer Quits Project, Says He No Longer Sees Nginx as 'Free and Open Source Project For the Public Good' (arstechnica.com) 53

A core developer of Nginx, currently the world's most popular web server, has quit the project, stating that he no longer sees it as "a free and open source project... for the public good." From a report: His fork, freenginx, is "going to be run by developers, and not corporate entities," writes Maxim Dounin, and will be "free from arbitrary corporate actions." Dounin is one of the earliest and still most active coders on the open source Nginx project and one of the first employees of Nginx, Inc., a company created in 2011 to commercially support the steadily growing web server. Nginx is now used on roughly one-third of the world's web servers, ahead of Apache.

Nginx Inc. was acquired by Seattle-based networking firm F5 in 2019. Later that year, two of Nginx's leaders, Maxim Konovalov and Igor Sysoev, were detained and interrogated in their homes by armed Russian state agents. Sysoev's former employer, Internet firm Rambler, claimed that it owned the rights to Nginx's source code, as it was developed during Sysoev's tenure at Rambler (where Dounin also worked). While the criminal charges and ownership claims do not appear to have materialized, the implications of a Russian company's intrusion into a popular open source piece of the web's infrastructure caused some alarm. Sysoev left F5 and the Nginx project in early 2022. Later that year, due to the Russian invasion of Ukraine, F5 discontinued all operations in Russia. Some Nginx developers still in Russia formed Angie, developed in large part to support Nginx users in Russia. Dounin technically stopped working for F5 at that point, too, but maintained his role in Nginx "as a volunteer," according to Dounin's mailing list post.

Dounin writes in his announcement that "new non-technical management" at F5 "recently decided that they know better how to run open source projects. In particular, they decided to interfere with security policy nginx uses for years, ignoring both the policy and developers' position." While it was "quite understandable," given their ownership, Dounin wrote that it means he was "no longer able to control which changes are made in nginx," hence his departure and fork.

Software

VMware Admits Sweeping Broadcom Changes Are Worrying Customers (arstechnica.com) 79

An anonymous reader quotes a report from Ars Technica: Broadcom has made a lot of changes to VMware since closing its acquisition of the company in November. On Wednesday, VMware admitted that these changes are worrying customers. With customers mulling alternatives and partners complaining, VMware is trying to do damage control and convince people that change is good. Not surprisingly, the plea comes from a VMware marketing executive: Prashanth Shenoy, VP of product and technical marketing for the Cloud, Infrastructure, Platforms, and Solutions group at VMware. In Wednesday's announcement, Shenoy admitted that VMware "has been all about change" since being swooped up for $61 billion. This has resulted in "many questions and concerns" as customers "evaluate how to maximize value from" VMware products.

Among these changes is VMware ending perpetual license sales in favor of a subscription-based business model. VMware had a history of relying on perpetual licensing; VMware called the model its "most renowned" a year ago. Shenoy's blog sought to provide reasoning for the change, with the executive writing that "all major enterprise software providers are on [subscription models] today." However, the idea that "everyone's doing it" has done little to mollify impacted customers who prefer paying for something once and owning it indefinitely (while paying for associated support costs). Customers are also dealing with budget concerns, with already paid-for licenses set to lose support and the only alternative being a monthly fee.

Shenoy's blog, though, focused on license portability. "This means you will be able to deploy on-premises and then take your subscription at any time to a supported Hyperscaler or VMware Cloud Services Provider environment as desired. You retain your license subscription as you move," Shenoy wrote, noting new Google Cloud VMware Engine license portability support for VMware Cloud Foundation. Further, Shenoy claimed the discontinuation of VMware products so that Broadcom could focus on VMware Cloud Foundation and vSphere Foundation would be beneficial, because "offering a few offerings that are lower in price on the high end and are packed with more value for the same or less cost on the lower end makes business sense for customers, partners, and VMware."
VMware's Wednesday post also addressed Broadcom taking VMware's biggest customers direct, removing channel partners from the equation: "It makes business sense for Broadcom to have close relationships with its most strategic VMware customers to make sure VMware Cloud Foundation is being adopted, used, and providing customer value. However, we expect there will be a role change in accounts that will have to be worked through so that both Broadcom and our partners are providing the most value and greatest impact to strategic customers. And, partners will play a critical role in adding value beyond what Broadcom may be able [to provide]."

"Broadcom identified things that needed to change and, as a responsible company, made the changes quickly and decisively," added Shenoy. "The changes that have taken place over the past 60+ days were absolutely necessary."
Google

Google Rolls Out Updated AI Model Capable of Handling Longer Text, Video (bloomberg.com) 11

An anonymous reader shares a report: Alphabet's Google is rolling out a new version of its powerful artificial intelligence model that it says can handle larger amounts of text and video than products made by competitors. The updated AI model, called Gemini 1.5 Pro, will be available on Thursday to cloud customers and developers so they can test its new features and eventually create new commercial applications. Google and its rivals have spent billions to ramp up their capabilities in generative AI and are keen to attract corporate clients to show their investments are paying off. [...]

Gemini 1.5 can be trained faster and more efficiently, and has the ability to process a huge amount of information each time it's prompted, according to Google DeepMind's Oriol Vinyals. For example, developers can use Gemini 1.5 Pro to query up to an hour's worth of video, 11 hours of audio or more than 700,000 words in a document, an amount of data that Google says is the "longest context window" of any large-scale AI model yet. Gemini 1.5 can process far more data compared with what the latest AI models from OpenAI and Anthropic can handle, according to Google. In a pre-recorded video demonstration for reporters, Google showed off how engineers asked Gemini 1.5 Pro to ingest a 402-page PDF transcript of the Apollo 11 moon landing, and then prompted it to find quotes that showed "three funny moments."
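Developer access runs through Google's generative AI SDK; a sketch of the kind of long-document query Google demonstrated (the model identifier and file name are assumptions, per the SDK as described at launch):

    import google.generativeai as genai  # pip install google-generativeai

    genai.configure(api_key="YOUR_API_KEY")          # placeholder key
    model = genai.GenerativeModel("gemini-1.5-pro")  # assumed model identifier

    # With a context window this large, an entire transcript fits in one prompt.
    transcript = open("apollo11_transcript.txt").read()
    response = model.generate_content(
        [transcript, "Find three funny moments in this transcript, quoting each one."]
    )
    print(response.text)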

Earth

Scientists Resort To Once-Unthinkable Solutions To Cool the Planet 205

Dumping chemicals in the ocean? Spraying saltwater into clouds? Injecting reflective particles into the sky? Scientists are resorting to once unthinkable techniques to cool the planet because global efforts to check greenhouse gas emissions are failing. From a report: These geoengineering approaches were once considered taboo by scientists and regulators who feared that tinkering with the environment could have unintended consequences, but now researchers are receiving taxpayer funds and private investments to get out of the lab and test these methods outdoors. The shift reflects growing concern that efforts to reduce greenhouse gas emissions aren't moving fast enough to prevent the destructive effects of heat waves, storms and floods made worse by climate change. Geoengineering isn't a substitute for reducing emissions, according to scientists and business leaders involved in the projects. Rather, it is a way to slow climate warming in the next few years while buying time to switch to a carbon-free economy in the longer term.

Three field experiments are under way in the U.S. and overseas. This month, researchers aboard a ship off the northeastern coast of Australia near the Whitsunday Islands are spraying a briny mixture through high-pressure nozzles into the air in an attempt to brighten low-altitude clouds that form over the ocean. Scientists hope bigger, brighter clouds will reflect sunlight away from the Earth, shade the ocean surface and cool the waters around the Great Barrier Reef, where warming ocean temperatures have contributed to massive coral die-offs. The research project, known as marine cloud brightening, is led by Southern Cross University as part of the $64.55 million (100 million Australian dollar) Reef Restoration and Adaptation Program. The program is funded by a partnership between the Australian government's Reef Trust and the Great Barrier Reef Foundation and includes conservation organizations and several academic institutions.
Businesses

Nvidia Becomes Third Most Valuable US Company (cnbc.com) 75

Nvidia is now the third most valuable company in the U.S., surpassing Google parent Alphabet and Amazon. It's only behind Apple and Microsoft in terms of market cap. CNBC reports: Nvidia rose over 2% to close at $739.00 per share, giving it a market value of $1.83 trillion to Google's $1.82 trillion market cap. The move comes one day after Nvidia surpassed Amazon in terms of market value. The symbolic milestone is more confirmation that Nvidia has become a Wall Street darling on the back of elevated AI chip sales, valued even more highly than some of the large software companies and cloud providers that develop and integrate AI technology into their products.

Nvidia shares are up over 221% over the past 12 months on robust demand for its AI server chips that can cost more than $20,000 each. Companies like Google and Amazon need thousands of them for their cloud services. Before the recent AI boom, Nvidia was best known for consumer graphics processors it sold to PC makers to build gaming computers, a less lucrative market.

Privacy

US Military Notifies 20,000 of Data Breach After Cloud Email Leak (techcrunch.com) 11

An anonymous reader quotes a report from TechCrunch: The U.S. Department of Defense is notifying tens of thousands of individuals that their personal information was exposed in an email data spill last year. According to the breach notification letter sent out to affected individuals on February 1, the Defense Intelligence Agency -- the DOD's military intelligence agency -- said, "numerous email messages were inadvertently exposed to the Internet by a service provider," between February 3 and February 20, 2023. TechCrunch has learned that the breach disclosure letters relate to an unsecured U.S. government cloud email server that was spilling sensitive emails to the open internet. The cloud email server, hosted on Microsoft's cloud for government customers, was accessible from the internet without a password, likely due to a misconfiguration.

The DOD is sending breach notification letters to around 20,600 individuals whose information was affected. "As a matter of practice and operations security, we do not comment on the status of our networks and systems. The affected server was identified and removed from public access on February 20, 2023, and the vendor has resolved the issues that resulted in the exposure. DOD continues to engage with the service provider on improving cyber event prevention and detection. Notification to affected individuals is ongoing," said DOD spokesperson Cdr. Tim Gorman in an email to TechCrunch.

United States

FTC Chair Khan: Stop Monopolies Before They Happen (axios.com) 40

FTC chair Lina Khan is hunting for evidence that Microsoft, Google and Amazon require cloud computing spend, board seats or exclusivity deals in return for their investments in AI startups. From a report: At a Friday event, Khan framed today's AI landscape as an inflection point for tech that is "enormously important for opening up markets and injecting competition and disrupting existing incumbents." The FTC chair offered Axios' Sara Fischer new details of how she's handling a market inquiry into the relationship between Big Tech companies and AI startups, in an interview at the Digital Content Next Summit in Charleston, S.C.

In handling the surge in AI innovation and its impacts on the broader tech and media landscape, Khan said she aims to tackle monopoly "before it becomes fully fledged." She said the FTC is looking for chokepoints in each layer of the AI tech stack: "chips, compute, foundational models, applications." Khan said she's also paying close attention to vertical integration -- when players look to extend dominance over one tech layer into adjacent layers -- or when they attempt acquisitions aimed at solidifying an existing monopoly. That includes any potential integration between Sam Altman's nascent chip project and OpenAI, though she said she welcomes chip competition.

Data Storage

Backblaze's Geriatric Hard Drives Kicked the Bucket More in 2023 (theregister.com) 51

Backblaze has published a report on hard drive failures for 2023, finding that rates increased during the year due to aging drives that it plans to upgrade. From a report: Backblaze, which focuses on cloud-based storage services, claims to have more than three exabytes of data storage under its management. As of the end of last year, the company monitored 270,222 hard drives used for data storage, some of which are excluded from the statistics because they are still being evaluated. That still left a collection of 269,756 hard drives comprised of 35 drive models. Statistics on SSDs used as boot drives are reported separately.

Backblaze found one drive model exhibited zero failures for all of 2023, the Seagate 8 TB ST8000NM000A. However, this came with the caveat that there are only 204 examples in service, and these were deployed only since Q3 2022, so have accumulated a limited number of drive days (total time operational). Nevertheless, as Backblaze's principal cloud storage evangelist Andy Klein pointed out: "Zero failures over 18 months is a nice start."
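Backblaze's headline statistic is the annualized failure rate (AFR), which normalizes failures by accumulated drive days, and Klein's caveat falls straight out of the formula: with few drive days, even zero failures is weak evidence. A sketch (the drive-day figure is hypothetical):

    def annualized_failure_rate(failures: int, drive_days: int) -> float:
        """Backblaze-style AFR: failures per drive-year, as a percentage."""
        return failures / (drive_days / 365) * 100.0

    drive_days = 204 * 540  # 204 drives, ~18 months in service (hypothetical)
    print(f"{annualized_failure_rate(0, drive_days):.2f}% AFR")  # 0.00%, on only ~302 drive-years
    print(f"{annualized_failure_rate(1, drive_days):.2f}% AFR")  # a single failure would mean ~0.33%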

Communications

The US Government Makes a $42 Million Bet On Open Cell Networks (theverge.com) 26

An anonymous reader quotes a report from The Verge: The US government has committed $42 million to further the development of the 5G Open RAN (O-RAN) standard that would allow wireless providers to mix and match cellular hardware and software, opening up a bigger market for third-party equipment that's cheaper and interoperable. The National Telecommunications and Information Administration (NTIA) grant would establish a Dallas O-RAN testing center to prove the standard's viability as a way to head off Huawei's steady cruise toward a global cellular network hardware monopoly.

Verizon global network and technology president Joe Russo promoted the funding as a way to achieve "faster innovation in an open environment." To achieve the standard's goals, AT&T vice president of RAN technology Robert Soni says that AT&T and Verizon have formed the Acceleration of Compatibility and Commercialization for Open RAN Deployments Consortium (ACCoRD), which includes a grab bag of wireless technology companies like Ericsson, Nokia, Samsung, Dell, Intel, Broadcom, and Rakuten. Japanese wireless carrier Rakuten launched the first O-RAN network in 2020. The company's then-CEO, Tareq Amin, told The Verge's Nilay Patel in 2022 that Open RAN would enable low-cost network build-outs using smaller equipment rather than massive towers -- which has long been part of the promise of 5G.

But O-RAN is about more than that; establishing interoperability means companies like Verizon and AT&T wouldn't be forced to buy all of their hardware from a single company to create a functional network. For the rest of us, that means faster build-outs and "more agile networks," according to Rakuten. In the US, Dish has been working on its own O-RAN network, under the name Project Genesis. The 5G network was creaky and unreliable when former Verge staffer Mitchell Clark tried it out in Las Vegas in 2022, but the company said in June last year that it had made its goal of covering 70 percent of the US population. Dish has struggled to become the next big cell provider in the US, though -- leading satellite communications company EchoStar, which spun off from Dish in 2008, to purchase the company in January.
The Washington Post writes that O-RAN "is Washington's anointed champion to try to unseat the Chinese tech giant Huawei Technologies" as the world's biggest supplier of cellular infrastructure gear.

According to the Post, Biden has emphasized the importance of O-RAN in conversations with international leaders over the past few years. Additionally, it notes that Congress, along with the NTIA, has dedicated approximately $2 billion to support the development of this standard.
