iPhone

Apple Is Finally Willing To Make Gadgets Thicker So They Work Better (cnbc.com) 75

Apple has started to make its products thicker in an effort to give people what they want: functionality over form. This is a good thing. There are two recent examples: this year's iPhones and the new 16-inch MacBook Pro. Todd Haselton writes via CNBC: This is only a theory, but some of these design changes may stem from the departure of Apple's former chief design officer Jony Ive. Ive was known for creating gorgeous products but, as we've seen with the older MacBook keyboard, sometimes at the cost of functionality. Form over function, as they say. [...] Look back at the iPhone 8, for example: it measured just 7.3 mm thick, an example of Apple's seeming obsession with creating devices that were as thin as possible, often at the cost of battery life. But this year, Apple put a huge focus on battery life because it knows that's one of the top things people want from their phones (along with great cameras). As a result of the larger battery, this year's iPhone 11 is slightly fatter at 8.3 mm thick. The difference is barely noticeable, but it shows Apple knows people are willing to sacrifice thinness for a phone that lasts all day.

Then there's the 16-inch MacBook Pro that was announced on Wednesday. It's less than 1 mm thicker than the 15-inch MacBook Pro it replaces, and it weighs 4.3 pounds instead of the prior model's 4 pounds. It's about 2% larger than the 15-inch MacBook Pro, too. All of this helps Apple include what people want in a similar but slightly bigger form factor: a keyboard with keys you can actually press and that actually work, instead of one that's practically flat with very little key travel. The flat so-called butterfly keyboard was vulnerable to dust and debris, which could lead to keys not registering or repeating themselves and, ultimately, lots of typos. Apple also focused on battery life in its new laptop. It lasts an hour longer than last year's model and charges fully in just 2.5 hours. That's partly because Apple was able to increase the battery size, something that likely contributed to the larger and heavier form factor.

Open Source

GitHub Places Open-Source Code In Arctic Cave For Safekeeping (bloomberg.com) 50

pacopico writes: GitHub's CEO Nat Friedman traveled to Svalbard in October to stash Linux, Android, and 6,000 other open-source projects in a permafrost-filled, abandoned coal mine. It's part of a project to safeguard the world's software from existential threats and also just to archive the code for posterity. As Friedman says, "If you told someone 20 years ago that in 2020, all of human civilization will depend on and run on open-source code written for free by volunteers in countries all around the world who don't know each other, and it'll just be downloaded and put into almost every product, I think people would say, 'That's crazy, that's never going to happen. Software is written by big, professional companies.' It's sort of a magical moment. Having a historical record of this will, I think, be valuable to future generations." GitHub plans to open several more vaults in other places around the world and to store any code that people want included.
AI

Boston Dynamics CEO on the Company's Top 3 Robots, AI, and Viral Videos (venturebeat.com) 6

In a rare interview, Boston Dynamics CEO Marc Raibert talked about the three robots the company is currently focused on (today -- Spot, tomorrow -- Handle, and the future -- Atlas), its current customers, potential applications, AI, simulation, and of course those viral videos. An excerpt from the interview: "Today," for Raibert, refers to a time period that extends over the course of the next year or so. Spot is the "today" robot because it's already shipping to early adopters. In fact, it's only been shipping for about six weeks. Boston Dynamics wants Spot to be a platform -- Raibert has many times referred to it as "the Android of robots." Spot, which weighs about 60 pounds, "is not an end-use application robot," said Raibert. Users can add hardware payloads, and they can add software that interacts with Spot through its API. In fact, Raibert's main purpose in attending Web Summit was to inspire attendees to develop hardware and software for Spot. Boston Dynamics has an arm, spectrum radio, cameras, and lidars for Spot, but other companies are developing their own sensors. The "Spot" we're talking about is technically the SpotMini. It was renamed when it succeeded its older, bigger brother Spot. "The legacy Spot was a research project. We're really not doing anything with it at the moment. We just call it 'Spot' now; it's the product."

Spot can go up and down stairs by using obstacle-detection cameras to see railings and steps. It also has an autonomous navigation system that lets it traverse terrain. While Spot can be steered by a human, the onboard computers regulate the legs and balance. Spot travels at about 3 miles per hour, roughly human walking speed. It has cameras on its front, back, and sides that help it navigate, travel autonomously, and move omnidirectionally. It has different gaits (slow, walking, running, and even show-off), can turn in place, and has a "chicken head" mode. That last one means it can decouple the motion of its hand from its body, similar to how many animals can stabilize one part while the rest of the body moves.

Businesses

Dell Unveils Subscription Model To Counter Amazon, Microsoft (bloomberg.com) 29

Dell is planning to offer business clients a subscription model for products like servers and personal computers, "seeking to counter the lure of cloud services from Amazon and Microsoft," reports Bloomberg. From the report: Dell and its hardware peers have been under pressure to offer corporate clients the flexibility and simplicity of infrastructure cloud services. Public cloud titans such as Amazon Web Services and Microsoft Azure have cut demand for data-center hardware as more businesses look to rent computing power rather than invest in their own server farms. Rival Hewlett Packard Enterprise said in June that it would move to a subscription model by 2022. Research firm Gartner predicts 15% of data-center hardware deals will include pay-per-use pricing in 2022, up from 1% in 2019, Dell said.

Dell is making it easier for clients to upgrade their hardware since they don't have to make a large capital expenditure upfront, but can instead pay a smaller amount each month that counts toward operating expenditures. Under the consumption programs, customers pay for the amount of storage or computing power they use. Companies can also hire Dell to manage their hardware infrastructure for them entirely. While Dell's overall sales climbed 2% in the quarter that ended Aug. 2, demand for its servers and networking gear dropped 12%, a reversal from last year, when there was unprecedented customer interest in the products. Dell still expects the vast majority of customers to pay upfront for products in the next three to five years, Grocott said.

Medicine

UCLA Now Has the First Zero-Emission, All-Electric Mobile Surgical Instrument Lab 26

UCLA's new mobile surgical lab is a zero-emission, all-electric vehicle that will move back and forth between two UCLA campuses, collecting, sterilizing and repairing surgical instruments for the medical staff there. TechCrunch reports: Why is that even needed? The usual process is to send surgical instruments out to a third party for this kind of service, handled in a dedicated facility at a significant annual cost. UCLA Health Center estimates that it can save as much as $750,000 per year by using the EV lab from Winnebago instead. The traveling lab can operate for around eight hours, including round trips between the two hospital campuses, for a total distance of between 85 and 125 miles on a single battery charge, depending on usage. It also offers "the same level of performance, productivity and compliance" as a lab in a fixed-location building, according to Winnebago.
Intel

Intel Fixes a Security Flaw It Said Was Repaired 6 Months Ago (nytimes.com) 27

An anonymous reader quotes a report from The New York Times: Last May, when Intel released a patch for a group of security vulnerabilities researchers had found in the company's computer processors, Intel implied that all the problems were solved. But that wasn't entirely true, according to Dutch researchers at Vrije Universiteit Amsterdam who discovered the vulnerabilities and first reported them to the tech giant in September 2018. The software patch meant to fix the processor problem addressed only some of the issues the researchers had found. It would be another six months before a second patch, publicly disclosed by the company on Tuesday, would fix all of the vulnerabilities Intel indicated were fixed in May, the researchers said in a recent interview.

The public message from Intel was "everything is fixed," said Cristiano Giuffrida, a professor of computer science at Vrije Universiteit Amsterdam and one of the researchers who reported the vulnerabilities. "And we knew that was not accurate." While many researchers give companies time to fix problems before disclosing them publicly, tech firms can be slow to patch the flaws and sometimes attempt to muzzle researchers who want to inform the public about the security issues. Researchers typically agree to disclose vulnerabilities privately, stay quiet until the company can release a patch, and then coordinate with the company on a public announcement of the fix. But the Dutch researchers say Intel has been abusing the process, and is now doing so again: the new patch issued on Tuesday still doesn't fix another flaw they reported to Intel in May. The Intel flaws, like other high-profile vulnerabilities the computer security community has recently discovered in computer chips, allowed an attacker to extract passwords, encryption keys and other sensitive data from processors in desktop computers, laptops and cloud-computing servers.
Intel says the patches "greatly reduce" the risk of attack, but don't completely fix everything the researchers submitted.

The company's spokeswoman Leigh Rosenwald said Intel was publishing a timeline with Tuesday's patch for the sake of transparency. "This is not something that is normal practice of ours, but we realized this is a complicated issue. We definitely want to be transparent about that," she said. "While we may not agree with some of the assertions made by the researchers, those disagreements aside, we value our relationship with them."
Intel

Intel's Cascade Lake CPUs Impacted By New Zombieload v2 Attack (zdnet.com) 43

The Zombieload vulnerability disclosed this May has a second variant that also works against more recent Intel processors, not just older ones -- including Cascade Lake, Intel's latest line of high-end CPUs, which was initially thought to be unaffected. From a report: Intel is releasing microcode (CPU firmware) updates today to address this new Zombieload attack variant, as part of its monthly Patch Tuesday -- known as the Intel Platform Update (IPU) process. Back in May, two teams of academics disclosed a new batch of vulnerabilities that impacted Intel CPUs. Collectively known as MDS attacks, these are security flaws in the same class as Meltdown, Spectre, and Foreshadow. The attacks take advantage of speculative execution, an optimization technique that Intel added to its CPUs to improve data processing speed and performance. Vulnerabilities like Meltdown, Spectre, and Foreshadow showed that the speculative execution process was riddled with security holes. Disclosed in May, MDS attacks were just the latest line of vulnerabilities impacting speculative execution. They were different from the original Meltdown, Spectre, and Foreshadow bugs disclosed in 2018 because they attacked different areas of a CPU's speculative execution process. Further reading: Flaw in Intel PMx driver gives 'near-omnipotent control over a victim device'.
Robotics

American Robots Lose Jobs To Asian Robots as Adidas Shifts Manufacturing (nypost.com) 85

Adidas plans to close high-tech "robot" factories in Germany and the United States that it launched to bring production closer to customers, saying Monday that deploying some of the technology in Asia would be "more economic and flexible." Reuters: The Adidas factories were part of a drive to meet demand for faster delivery of new styles to its major markets and to counter rising wages in Asia and higher shipping costs. It originally planned a global network of similar factories. The German sportswear company did not give details on why it was closing the facilities, which have proved expensive and where the technology has been difficult to extend to different products. Martin Shankland, Adidas' head of global operations, said the factories had helped the company improve its expertise in innovative manufacturing, but it aimed to apply what it had learned with its suppliers.
Power

Honda Works On Second EV, Quits Diesel, and Puts Hydrogen On Hold (electrek.co) 148

Socguy writes: In late October, at Honda's "Electric Vision" event in Amsterdam, the company said it was "electrifying" its entire product line, which mostly means hybrids. "We will bring further battery-electric products to the market," they said. At the same time, it seems, diesel and hydrogen are on the way out. Katsushi Inoue, Honda Europe's president, said: "Maybe hydrogen fuel cell cars will come, but that's a technology for the next era. Our focus is on hybrid and electric vehicles now." Diesel is also on its way out: in September, Honda said it would phase out all diesel cars by 2021. In addition to the all-electric Honda E, which is launching in Europe next year, the company will introduce a second EV that's expected to be revealed by 2022.
Power

How Tech From Australia Could Prevent California Wildfires and PG&E Blackouts (ieee.org) 106

"Technology developed to combat Australia's deadly bushfires could slash California's fire risk and reduce the need for PG&E's 'public safety power shutoffs'," reports IEEE Spectrum.

"See the video to watch an advanced power diverter cut off 22,000 volts of power in less than 1/20th of a second, preventing ignition of dry brush," writes Slashdot reader carbonnation.

IEEE Spectrum reports: California utility Pacific Gas & Electric (PG&E) delivered a bitter pill last month when it said that deliberate blackouts to keep its lines from sparking wildfires could be the new normal for millions of customers for the next decade -- a dangerous disruption to power-dependent communities that California governor Gavin Newsom says "no state in the 21st Century should experience."

Grid experts say Newsom is right, because technology available today can slash the risk of grid-induced fires, reducing or eliminating the need for PG&E's "public safety power shutoffs...."

Some of the most innovative fire-beating grid technologies are the products of an R&D program funded by the state of Victoria in Australia, prompted by deadly grid-sparked bushfires there 10 years ago. Early this year, utilities in Victoria began a massive rollout of one solution: power diverters that are expected to protect all of the substations serving the state's high fire risk areas by 2024. "It's not cheap to put one in but once you do it, you've got 1,000 kilometers of network that's suddenly a lot safer," says Monash University professor Tony Marxsen, former chair of the Australian Energy Market Operator, Australia's power grid regulator, and chairman of Melbourne-based grid equipment developer IND Technology.

The power diverters -- known as Rapid Earth Fault Current Limiters (REFCLs) -- react to the surge of current unleashed when a power line strikes the ground or is struck by a tree. When this happens on one of Victoria's 22-kilovolt distribution circuits, the REFCL instantly begins collapsing the faulted line's voltage toward 100 volts, and can get there in as few as 40 milliseconds (ms). "If it can do it within 85 ms, you won't get fires," he says... Marxsen says 20 to 30 percent of the distribution circuits in PG&E's territory have the appropriate three-phase design for REFCLs, as do a similar proportion of circuits in the territory of Southern California Edison (which is also grappling with grid-sparked wildfires). "It would certainly offer the option of not shutting down the networks when there's high fire risk," he says.
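The quoted figures imply an extremely fast voltage collapse. As a rough sanity check (a toy exponential-decay model, not the actual REFCL control scheme), one can back out the decay constant implied by the 40 ms best case and confirm it sits well inside the 85 ms fire-prevention window:

```python
import math

# Toy model: treat the faulted line's voltage as an exponential decay.
# Figures from the article: a 22 kV feeder collapses toward ~100 V,
# best case 40 ms; fires are prevented if done within 85 ms.
V0, V_SAFE = 22_000.0, 100.0

# Decay constant implied by the 40 ms best case.
tau = 0.040 / math.log(V0 / V_SAFE)  # ~7.4 ms

def collapse_time(v0: float, v_target: float, tau: float) -> float:
    """Time for v0 * exp(-t / tau) to fall to v_target."""
    return tau * math.log(v0 / v_target)

t = collapse_time(V0, V_SAFE, tau)
print(f"tau ~ {tau * 1e3:.1f} ms, collapse in {t * 1e3:.0f} ms")
assert t <= 0.085  # inside the 85 ms window
```

The implied time constant of roughly 7 ms illustrates why the article describes the device acting "instantly" on a distribution circuit.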

Power

'Bring Back the Replaceable Laptop Battery' 216

"If you've gone shopping for a new laptop lately, you may notice something missing in all newer models regardless of make," writes Slashdot reader ikhider.

There's no removable battery. Whether from a mainstream or an obscure manufacturer, the fact that pretty much all laptops are made in the same region reflects a shared approach of soldering batteries in. While battery technology may have improved, it has not improved to the point that batteries no longer need replacing. A battery's ability to hold a full charge generally starts to deplete after a year or so. This limits the device's mobility and necessitates replacement. It also rules out the practical option of carrying a spare battery for when you need one.

While some high-end models may have better-quality batteries, that's no substitute for popping in a fresh, new one. This leads to one conclusion: planned obsolescence. If you want your laptop to still be mobile when the battery fizzles out, forget about it. Buy new instead. Pick your manufacturer, even those famed for building 'tank' laptops that last forever: all you used to need was a fresh battery, a RAM upgrade, and a new HD or SSD, and away you went. While the second-hand market still has good models with replaceable batteries, it is only a matter of time before that too fizzles away. If you had a limited budget, you could still get a good second-hand machine [in the past], but now you are stuck with the low end.

Consumers need to make their case to manufacturers; it is in their own best interest to extend the life of a machine on their own terms, not the manufacturer's. Bring back the removable laptop battery.
AMD

AMD Unveils the World's Most Powerful Desktop CPUs (zdnet.com) 187

ZDNet reports: In the never-ending war between the chip giants, AMD has fired a salvo by unveiling the world's most powerful desktop processors -- the new 24-core AMD Ryzen Threadripper 3960X and 32-core AMD Ryzen Threadripper 3970X... These 3rd-generation Ryzen Threadripper processors are built using AMD's 7-nanometer "Zen 2" core architecture, and both chips feature 88 PCIe 4.0 lanes with extraordinary power efficiency.

On the performance front, AMD claims that the new 32-core Ryzen Threadripper 3970X offers up to 90 percent faster performance than the competition... This doesn't mean the chips are power-hungry either, with AMD claiming they deliver up to 66 percent better power efficiency than previous-generation processors. The new chips do, however, need a new socket, called sTRX4, which offers expansion for serious multi-GPU and NVMe arrays, quad-channel DDR4, ECC support, and unlocked overclocking.... [T]hey will both be available starting Tuesday, November 19.

Engadget reports: After getting some wins against Intel in the desktop enthusiast processor race, AMD is trying to run up the score with its latest model, the Ryzen 9 3950X. It has 16 cores/32 threads, a 3.5 GHz base clock with up to 4.7 GHz boost (on two cores) and 105-watt power consumption (TDP), and costs $749, compared to $1,199 for Intel's 12-core i9-9920X. At the same time, AMD claims it outperforms the i9-9920X in gaming and even more so in content creation, where those extra cores can be best exploited.

According to the company, it'll do some Adobe Premiere tasks up to 26 percent quicker than an i9-9920X, and 42 percent faster than an 8-core i9-9900K. Better still, the Ryzen 9 3950X delivers 2.34 times more performance per watt than its Intel counterpart, and consumes 173W of absolute wall power compared to 304W for the i9-9920X. The power figures alone could be decisive for creators who run multiple workstations for 3D animation and rendering...
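Engadget's efficiency and wall-power figures can be cross-checked against each other. A back-of-envelope calculation, using only the numbers quoted above, recovers the raw performance advantage implied by the 2.34x performance-per-watt claim:

```python
# Figures quoted above: 2.34x performance per watt for the Ryzen 9 3950X
# vs. the Core i9-9920X, at 173 W vs. 304 W of measured wall power.
perf_per_watt_ratio = 2.34
watts_amd, watts_intel = 173.0, 304.0

# perf/watt ratio = (perf_amd / watts_amd) / (perf_intel / watts_intel),
# so the implied raw performance ratio is:
perf_ratio = perf_per_watt_ratio * watts_amd / watts_intel
print(f"implied raw performance advantage: {100 * (perf_ratio - 1):.0f}%")
```

That works out to roughly a one-third raw performance advantage, which is broadly consistent with the 26 to 42 percent Premiere speedups cited above.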

If $749 is $700 too much, AMD has another option -- the Athlon 3000G. The dual-core processor runs at 3.5 GHz, but AMD said it's "the only unlocked option in its segment," meaning you can push it to around 3.9 GHz. That'll boost its performance ahead of Intel's $73 Pentium G5400, AMD said. The Athlon 3000G will arrive November 19th for $49.

Data Storage

Ask Slashdot: Are There Storage Devices With Hardware Compression Built In? 120

Slashdot reader dryriver writes: Using a compressed disk drive or hard drive has been possible for decades now. But when you do this in software or in the operating system, the CPU does the compressing and decompressing. Are there any hard drives or SSDs that can work compressed using their own built-in hardware for this?

I'm not talking about realtime video compression using a hardware CODEC chip -- this does exist and is used -- but rather a storage medium that compresses every possible type of file using its own compression and decompression realtime hardware without a significant speed hit.

Leave your best thoughts and suggestions in the comments. Are there storage devices with hardware compression built in?
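To make the question concrete, here is what host-side compression looks like today: a minimal Python sketch using zlib, where the CPU pays the compression cost. A drive with inline compression hardware would perform the same compress/decompress step in its own controller, transparently to the host:

```python
import time
import zlib

def write_compressed(buf: bytes, level: int = 6) -> bytes:
    """What the host CPU does today; a drive with inline compression
    would perform this step in its own controller hardware instead."""
    return zlib.compress(buf, level)

def read_decompressed(blob: bytes) -> bytes:
    return zlib.decompress(blob)

# Highly repetitive data compresses well; already-compressed media would not.
data = b"some fairly repetitive log line\n" * 10_000

t0 = time.perf_counter()
blob = write_compressed(data)
elapsed = time.perf_counter() - t0

print(f"{len(data)} -> {len(blob)} bytes "
      f"({len(blob) / len(data):.1%}) in {elapsed * 1e3:.1f} ms of CPU time")
assert read_decompressed(blob) == data  # lossless round trip
```

The CPU time spent in `write_compressed` is exactly what the poster hopes a hardware-compressing drive would offload.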
Intel

Intel Performance Strategy Team Publishing Intentionally Misleading Benchmarks (servethehome.com) 42

An anonymous reader shares a post: This week something happened that many may not have seen. Intel published a set of benchmarks showing the advantage of a dual Intel Xeon Platinum 9282 system over the AMD EPYC 7742. Vendors present benchmarks to show that their products are good from time to time. There is one difference in this case: we checked Intel's work and found that the company presented a number intended to mislead would-be buyers about its relative performance versus AMD.
Displays

Screen Time Might Be Physically Changing Kids' Brains 56

An anonymous reader quotes a report from MIT Technology Review: A study published today in JAMA Pediatrics warns that kids' literacy and language skills suffer with screen use, and MRI scans of their brains appear to back up the findings. Forty-seven 3- to 5-year-olds took a test to measure their cognitive abilities, and their parents were asked to answer a detailed survey about screen time habits. Questions included: How frequently do they use that screen? What type of content are they viewing? And is there an adult sitting with the child talking about what they're watching? The answers were scored against a set of screen time guidelines put out by the American Academy of Pediatrics. The kids also had their brains scanned in an MRI machine.

The scans revealed that kids who spent more time in front of screens had what the authors call lower "white matter integrity." White matter can be roughly thought of as the brain's internal communications network -- its long nerve fibers are sheathed in fatty insulation that allows electrical signals to move from one area of the brain to another without interruption. The integrity of that structure -- how well organized the nerve fibers are, and how well developed the myelin sheath is -- is associated with cognitive function, and it develops as kids learn language. Lead author John Hutton of Cincinnati Children's Hospital told MIT Technology Review there's a clear link between higher screen use and lower white matter integrity in the children his team studied. That structural change appears to be reflected in the results of the cognitive test the kids took as well, which showed high screen time associated with lower levels of language and literacy skills.
Signe Lauren Bray, a researcher at the University of Calgary who was not involved in the study, downplays the findings by pointing out that it's a small and preliminary study. "It's absolutely not clear that screen time causes differences in brain development and there are many factors that could explain the association found here," she says.
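The association the study reports is, at heart, a correlation between screen-time scores and test scores. As a toy illustration (on synthetic data, not the study's actual cohort), a Pearson correlation for a group of 47 children can be computed like so:

```python
import random
import statistics as stats

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = stats.fmean(xs), stats.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (len(xs) * stats.pstdev(xs) * stats.pstdev(ys))

# Synthetic cohort of 47 children: more screen time, lower (noisy) score.
# The slope and noise level here are made up for illustration only.
random.seed(0)
screen_hours = [random.uniform(0, 5) for _ in range(47)]
literacy = [100 - 4 * h + random.gauss(0, 6) for h in screen_hours]

r = pearson_r(screen_hours, literacy)
print(f"r = {r:.2f}")  # negative association, mirroring the study's direction
```

As Bray's caveat above notes, a correlation like this, especially in a sample this small, cannot by itself establish that screen time causes the difference.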

Regardless, "Caution is warranted," Hutton says. "Children are not small grown-ups, and their needs change with development."
Power

An Energy Breakthrough Could Store Solar Power For Decades (bloomberg.com) 92

An anonymous reader quotes a report from Bloomberg: Scientists at Chalmers University of Technology in Gothenburg have figured out how to harness solar energy and keep it in reserve so it can be released on demand in the form of heat -- even decades after it was captured. The innovations include an energy-trapping molecule, a storage system that promises to outperform traditional batteries, at least when it comes to heating, and an energy-storing laminate coating that can be applied to windows and textiles. The breakthroughs, from a team led by researcher Kasper Moth-Poulsen, have garnered praise within the scientific community. Now comes the real test: whether Moth-Poulsen can get investors to back his technology and take it to market.

The system starts with a liquid molecule made up of carbon, hydrogen, and nitrogen. When hit by sunlight, the molecule draws in the sun's energy and holds it until a catalyst triggers its release as heat. The researchers spent almost a decade and $2.5 million to create a specialized storage unit, which Moth-Poulsen, a 40-year-old professor in the department of chemistry and chemical engineering, says has the stability to outlast the 5- to 10-year life span of typical lithium-ion batteries on the market today. The most advanced potential commercial use the team developed is a transparent coating that can be applied to home windows, a moving vehicle, or even clothing. The coating collects solar energy and releases heat, reducing the electricity required for heating spaces and curbing carbon emissions. Moth-Poulsen is coating an entire building on campus to showcase the technology. The ideal use in the early going, he says, is in relatively small spaces. "This could be heating of electrical vehicles or in houses."
Moth-Poulsen believes there's potential for the system to produce electricity, but his team is focused for now on heating.

"Moth-Poulsen plans to spin off a company that would advance the technology and says he's in talks with venture capital investors," adds Bloomberg. "The storage unit could be commercially available in as little as six years and the coating in three, pending the $5 million of additional funding he estimates will be needed to bring the coating to market."
Google

Buying Fitbit Won't Save Google's Failing Wear OS (androidpolice.com) 27

David Ruddock of the technology blog Android Police tries to make sense of last week's $2.1 billion acquisition of Fitbit by Google. He argues that Fitbit's offerings -- hardware, software, engineering talent, or even its patent wall -- can't save Google's wearable operating system Wear OS. From his column: Hardware is what Google is after, with a blog post clearly stating its acquisition of Fitbit is about future Wear OS devices, meaning you can probably kiss Fitbit's unloved smartwatch OS goodbye. So, that means we can count on Google leveraging Fitbit's renowned hardware to finally give Wear OS the horsepower and capabilities it needs to compete with Apple, right? Well, no. Fitbit's smartwatches have been most lauded for their long battery life, which has historically been enabled by extremely slow but highly power-efficient processors. The Versa 2 allegedly comes with significant performance improvements, but as a smartwatch, it just isn't very... smart. Michael Fisher points out in his review that the Versa 2's near week-long life on a single charge is only impressive when looked at in a very generous light. The Versa 2 doesn't have GPS, the battery only lasts that long when not using the always-on display (with AoD, it's closer to 3 days), the watch itself doesn't work for almost anything but fitness tracking on its own, and most of your interactions with it end up happening on your smartphone anyway. I can also tell you from experience that the Apple Watch Series 5 lasts about two days on a charge with the always-on display enabled (and Samsung's watches last even longer), so Fitbit managing a day more with a much less useful watch isn't exactly game-changing technology.

In short, Fitbit's products are not ones Google should be excited about buying. The hardware is nothing special, and the software is clearly going in the dumpster. What has Google bought, then? The sad, very practical truth is probably patents and engineers. Fitbit does develop at least some of its hardware in-house, and likely has a decent number of patents related to fitness tracking and basic wearable technology, including those stemming from its acquisition of Pebble. Its product engineers would receive resources and tools at Google that Fitbit may not have afforded them. In short: Google's purchase is almost certainly a speculative one. Google is hoping that Fitbit's technology portfolio and its engineering talent can create a better, faster, stronger Wear OS watch. That isn't the kind of acquisition that screams "our product is successful," it's one that looks far more like a Hail Mary from a company that is rapidly losing any hope of remaining relevant in the wearables space. A more cynical view of Google's acquisition might argue that this is more about Fitbit's brand and users than anything else. If Google simply markets its in-house smartwatches as Fitbits running Wear OS, it would be more able to tap into Fitbit's existing customer base and retail relationships. Customer base is something Wear OS is sorely missing at the moment, and Fitbit is a brand that many consumers recognize, albeit mostly for the company's "dumb" fitness trackers, not its smartwatches. Speaking of, given Google's focus on Wear OS as part of this acquisition, my guess is that those more popular but very basic trackers will be discontinued.

Data Storage

Microsoft and Warner Bros. Archived the Original 'Superman' Movie on a Futuristic Glass Disc (variety.com) 93

Microsoft has teamed up with Warner Bros. to store a copy of the 1978 "Superman" movie on a small disc of silica glass. The collaboration, which was unveiled at Microsoft's Ignite 2019 conference in Orlando, Florida, on Monday, is a first test case for a new storage technology that could eventually help safeguard Hollywood's movies and TV shows, as well as many other forms of data, for centuries to come. From a report: "Glass has a very, very long lifetime," said Microsoft Research principal researcher Ant Rowstron in a recent conversation with Variety. "Thousands of years." The piece of silica glass storing the 1978 "Superman" movie measures 7.5 cm x 7.5 cm x 2 mm. The glass contains 75.6 GB of data plus error redundancy codes. Microsoft began to investigate glass as a storage medium in 2016 in partnership with the University of Southampton Optoelectronics Research Centre. The goal of these efforts, dubbed "Project Silica," is to find a new storage medium optimized for what industry insiders like to call cold data -- the type of data you likely won't need to access for months, years, or even decades. It's data that doesn't need to sit on a server, ready to be used 24/7, but that is kept in a vault, away from anything that could corrupt it.
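The dimensions and capacity quoted above allow a quick back-of-envelope density figure for the Project Silica disc (the capacity includes error-correction codes, so this is payload density at best):

```python
# Figures from the article: 7.5 cm x 7.5 cm x 2 mm of silica glass
# holding 75.6 GB of data plus error redundancy codes.
w_cm, h_cm, d_cm = 7.5, 7.5, 0.2
capacity_gb = 75.6

volume_cm3 = w_cm * h_cm * d_cm       # 11.25 cm^3
density = capacity_gb / volume_cm3
print(f"~{density:.1f} GB per cubic centimeter")
```

That is modest next to a modern hard drive or tape cartridge, but the selling point here is longevity and immunity to corruption, not density.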

Turns out that Warner Bros. has quite a bit of this kind of cold data. Founded in the 1920s, the studio has been safeguarding original celluloid film reels, audio from 1940s radio shows and much more, for decades. Think classics like "Casablanca," "The Wizard of Oz" or "Looney Tunes" cartoons. "Our mission is to preserve those original assets in perpetuity," said Brad Collar, who is leading these efforts at Warner Bros. as the studio's senior vice president of global archives and media engineering. And while the studio is deeply invested in these classics, it also keeps adding an ever-increasing number of modern assets to its archives, ranging from digitally shot films and television episodes to newer forms of entertainment, including video games. To date, the Warner Bros. archive contains some 20 million assets, with tens of thousands of new items added every year. Each is stored in multiple locations, explained Collar. "We want to have more than one copy."

Power

Does California Need A More Decentralized Energy System? (vox.com) 198

"California's electricity system is failing," argues Vox, in an article shared by Slashdot reader nickwinlund77. But they're proposing a way "to make California's electricity system cleaner, more reliable, and more resilient." In a nutshell, it is accelerating the evolution from a centralized, top-down, long-distance, one-way energy system to a more decentralized, bottom-up, local, networked system. In the energy world, this is summed up as a more distributed energy system. It puts more power, both electrical and political, in local hands. Though it is still in early days, and only hints of what's to come are yet visible, the evolution to a more distributed system is inevitable...

Solar+storage+smart inverter systems work better and more seamlessly [than diesel generators] during a blackout. What's more, when they are connected together into a microgrid, their collective generation and consumption can be balanced out, maximizing backup power... The knock on microgrids has traditionally been that they're expensive, but they are already reaching cost parity with California grid power in some places. And while it is true that, on an upfront-capital basis, they are more expensive than diesel generators, they are not more expensive on a lifetime basis because clean distributed-energy resources, unlike diesel generators, can provide useful services even when there's no blackout... As Public Safety Power Shutoff events continue, emergency-backup benefits will be enough to kick-start a decent microgrid market. It's already happening, especially around Tier 1 loads. And customers are herding to solar+storage systems, as Tesla and other companies eye big growth...
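The claim that networked solar+storage systems can "balance out" collective generation and consumption reduces to simple energy accounting each interval: net out every member's surplus or deficit against shared storage. A hypothetical sketch (the controller logic, names, and numbers are illustrative, not from the article):

```python
# Hypothetical microgrid controller step: net member generation against
# member loads for one interval, using a shared battery to absorb surplus
# or cover shortfall. Real controllers also respect battery capacity,
# inverter limits, and load priorities (e.g. Tier 1 loads).

def balance_interval(generation_kw, load_kw, battery_kwh, hours=1.0):
    """Return (battery_kwh_after, unserved_kwh) for one interval."""
    net_kw = sum(generation_kw) - sum(load_kw)  # surplus if positive
    battery_kwh += net_kw * hours
    unserved = 0.0
    if battery_kwh < 0:
        unserved = -battery_kwh  # demand the microgrid couldn't meet
        battery_kwh = 0.0
    return battery_kwh, unserved

# Three homes during a blackout sharing a 10 kWh battery: two have solar
# output, one has none, yet all three loads are served from the pool.
battery, unserved = balance_interval([3.0, 2.5, 0.0], [1.0, 1.5, 2.0], 10.0)
```

The point the article makes is that isolated diesel generators can't pool surplus this way, while interconnected solar+storage systems can.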

The core problem with California's electricity system is that its millions of customers are overwhelmingly dependent on power generated by large, remote power plants and carried over long distances on overhead power lines, often through hilly, mountainous, and/or forested territory becoming drier and more fire-prone by the year... [U]tilities are still operating with a 20th-century hangover, a model that forces them to prefer big investments in big grid infrastructure.

The article also notes "vehicle-to-grid" technology which will offer electric cars bidirectional energy-storage and demand-shifting capabilities, and argues that a network of distributed-energy resources can ultimately be installed quickly and will lower the need for long-distance power transmission lines.

But it argues the transition won't happen until the state's government makes a more ambitious push.
EU

Germany's Giant Windmills Are Wildly Unpopular (financialpost.com) 287

"Local politics are a bigger problem for renewable energy growth than competition from fossil fuels," warns a Bloomberg columnist: It's getting harder to get permission to erect the turbine towers. Local regulations are getting stricter. Bavaria decided back in 2014 that the distance between a wind turbine and the nearest housing must be 10 times the height of the mast, which, given the density of dwellings, makes it hard to find a spot anywhere. Wind energy development is practically stalled in the state now. Brandenburg, the state surrounding Berlin, passed a law this year demanding that wind-farm operators pay 10,000 euros ($11,100) per turbine each year to communities within 3 kilometers of the windmills. Wind projects are also often rejected or stalled because they're deemed to interfere with military communications, air traffic control or broadcast radio stations.
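The two headline regulations reduce to simple arithmetic, which makes their bite concrete. A sketch (the turbine heights and farm sizes below are illustrative assumptions, not from the article):

```python
# The Bavarian "10H" setback rule and Brandenburg's per-turbine community
# levy, as described in the summary. Example inputs are hypothetical.

def min_setback_m(mast_height_m, factor=10):
    """Bavaria (2014): required distance from turbine to nearest housing
    is 10x the height of the mast."""
    return factor * mast_height_m

def annual_levy_eur(n_turbines, fee_per_turbine=10_000):
    """Brandenburg (2019): 10,000 euros per turbine per year, payable to
    communities within 3 km of the windmills."""
    return n_turbines * fee_per_turbine

# A ~200 m modern turbine needs 2 km of clearance in every direction,
# which in densely settled Bavaria leaves few viable sites:
assert min_setback_m(200) == 2000
# A hypothetical 15-turbine farm in Brandenburg owes 150,000 euros a year:
assert annual_levy_eur(15) == 150_000
```

The 10H rule's quadratic effect on land is the real constraint: doubling the setback radius quadruples the exclusion area around each dwelling.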

Besides, local opponents of the wind farms often go to court to stall new developments or even have existing towers dismantled. According to the wind-industry lobby BWE, 325 turbine installations with a total capacity of more than 1 gigawatt (some 2% of the country's total installed capacity) are tied up in litigation. The irony is that the litigants are often just as "green" as the wind-energy proponents -- one is the large conservation organization NABU, which says it's not against wind energy as such but merely demands that installations be planned with preserving nature in mind. Almost half of the complaints are meant to protect various bird and bat species; others claim the turbines make too much noise or emit too much low-frequency infrasound. Regardless of the validity of such claims, projects get tied up in the courts even after jumping through the many hoops necessary to get a permit.

Another reason for local resistance to the wind farms is a form of Nimbyism: People hate the way the wind towers change landscapes. There's even a German word for it, Verspargelung, roughly translated as "pollution with giant asparagus sticks"... This nasty political and regulatory climate creates too much uncertainty for investors, just as the German government prepares to phase out wind-energy subsidies...

Without technological breakthroughs -- for example, in energy storage, which would make fewer new turbines necessary -- Germany, and then other countries that try to build up renewable energy generation as it has done, will be hard put to push production to the level required to reach climate goals.
