Windows

iPad Launch Blindsided Windows Team, Reveals Former Microsoft Executive (twitter.com) 109

The launch of the iPad ten years ago was a big surprise to everyone in the industry -- including Microsoft executives. Steven Sinofsky, the former President of the Windows Division at Microsoft, shares Microsoft's perspective, as well as those of other industry figures and the press, on the iPad: The announcement 10 years ago today of the "magical" iPad was clearly a milestone in computing. It was billed to be the "next" computer. For me, managing Windows, just weeks after the launch of Microsoft's "latest creation" Windows 7, it was as much a challenge as it was magical. Given that Star Trek had tablets, it was inevitable that the form factor would make it to computing (yes, the Dynabook...). Microsoft had been working on tablets for more than 10 years, starting with "WinPad" and continuing through the Tablet PC. We were fixated on Win32, Pen, and more. The success of the iPhone (140K apps & 3B downloads announced that day) blinded us at Microsoft as to where Apple was heading. Endless rumors of Apple's tablet *obviously* meant a pen computer based on the Mac. Why not? The industry had chased this for 20 years. That was our context. The press, however, was fixated on Apple lacking an "answer" (pundits seem to demand answers) to Netbooks -- those small, cheap Windows laptops sweeping the world. Over 40 million sold. "What would Apple's response be?" We worried -- a cheap, pen-based Mac. Sorry, Harry!

Jobs said that a new computer needed to be better at some things, better than an iPhone/iPod and better than a laptop. Then he went right at Netbooks, the supposed answer to what could be better at those things. "Some people have thought that that's a Netbook." (The audience joined in a round of laughter.) Then he said, "The problem is ... Netbooks aren't better at anything ... They're slow. They have low quality displays ... and they run clunky old PC software ... They're just cheap laptops." "Cheap laptops" ... from my perch that was a good thing. I mean, inexpensive was a better word. But we knew that Netbooks (and Atom) were really just a way to make use of the struggling efforts to make low-power, fanless Intel chips for phones. A brutal takedown of 40M units. Sitting in a Le Corbusier chair, he showed the "extraordinary" things his new device did, from browsing to email to photos and videos and more. The real kicker was that it achieved 10 hours of battery life -- unachievable in PCs struggling for 4 hours with their whirring fans.

There was no stylus... no pen. How could one input or be PRODUCTIVE? PC brains were so wedded to a keyboard, mouse, and pen alternative that the idea of being productive without those seemed fanciful. Also instant standby, no viruses, rotate-able, maintained quality over time... As if to emphasize the point, Schiller showed "rewritten" versions of Apple's iWork apps for the iPad. The iPad would have a word processor, spreadsheet, and presentation graphics. Rounding out the demonstration, the iPad would also sync settings with iTunes -- content too. This was still early in the travails of iCloud, but really a game changer Windows completely lacked, except in the enterprise with crazy server infrastructure or "consumer" Live apps. The iPad had a 3G modem BECAUSE it was built on the iPhone. If you could figure out the device drivers and software for a PC, you'd need a multi-hundred-dollar USB modem and a $60/month fee at best. The iPad made this a $29.99 option on AT&T and a slight uptick in purchase price. Starting at $499, the iPad was a shot right across the bow of the consumer laptop. Consumer laptops were selling over 100 million units a year! Pundits were shocked at the price. I ordered mine, arriving in 60/90 days.

At CES weeks earlier, there were the earliest tablets -- made with no help from Google, a few fringe Chinese ODMs were shopping hacky tablets called "Mobile Internet Devices" or "Media Tablets". Samsung's Galaxy was 9 months away. Android support (for 4:3 screens) was further away still. The first looks and reviews a bit later were just endless (and now tiresome) commentary on how the iPad was really for "consumption" and not productivity. There were no files. No keyboard. No mouse. No overlapping windows. Can't write code! In a literally classically defined case of disruption, the iPad didn't do those things, but what it did, it did so much better that not only did people prefer it, they changed what they did in order to use it. Besides, email was the most-used tool, and the iPad was great for that. In its first year, 2010-2011, Apple sold 20 million iPads. That same year would turn out to be an historical high-water mark for PCs (365M, ~180M laptops). Analysts who had forecasted more than 500M PCs were now rapidly increasing tablet forecasts to the hundreds of millions and cutting PC forecasts. The iPad and iPhone were soundly existential threats to Microsoft's core platform business.

Without a platform Microsoft controlled that developers sought out, the soul of the company was "missing." The PC had been overrun by browsers, a change 10 years in the making. PC OEMs were deeply concerned about a rise of Android and loved the Android model (no PC maker would ultimately become a major Android OEM, however). Even Windows Server was eclipsed by Linux and Open Source. The kicker for me, though, was that keyboard stand for the iPad. It was such a hack. Such an obvious "objection handler." But it was critically important because it was a clear reminder that the underlying operating system was "real" ... it was not a "phone OS." Knowing the iPhone and now the iPad ran a robust OS under the hood, with a totally different "shell", interface model (touch), and app model (APIs and architecture), had massive implications for being the leading platform provider for computers. That was my Jan 27, 2010.
Further reading: The iPad's original software designer and program lead look back on the device's first 10 years.
Businesses

The iPad Awkwardly Turns 10 (daringfireball.net) 52

John Gruber: Ten years ago today, Steve Jobs introduced the iPad on stage at the Yerba Buena theater in San Francisco. [...] Ten years later, though, I don't think the iPad has come close to living up to its potential. [...] Software is where the iPad has gotten lost. iPadOS's "multitasking" model is far more capable than the iPhone's, yes, but somehow Apple has painted it into a corner in which it is far less consistent and coherent than the Mac's, while also being far less capable. iPad multitasking: more complex, less powerful. That's quite a combination.

Consider the basic task of putting two apps on screen at the same time, the basic definition of "multitasking" in the UI sense. To launch the first app, you tap its icon on the homescreen, just like on the iPhone, and just like on the iPad before split-screen multitasking. Tapping an icon to open an app is natural and intuitive. But to get a second app on the same screen, you cannot tap its icon. You must first slide up from the bottom of the screen to reveal the Dock. Then you must tap and hold on an app icon in the Dock. Then you drag the app icon out of the Dock to launch it in a way that it will become the second app splitting the display. But isn't dragging an icon out of the Dock the way that you remove apps from the Dock? Yes, it is -- when you do it from the homescreen.

So the way you launch an app in the Dock for split-screen mode is identical to the way you remove that app from the Dock. Oh, and apps that aren't in the Dock can't become the second app in split screen mode. What sense does that limitation make? On the iPhone you can only have one app on screen at a time. The screen is the app; the app is the screen. This is limiting but trivial to understand. [...] On iPad you can only have two apps on screen at the same time, and you must launch them in entirely different ways -- one of them intuitive (tap any app icon), one of them inscrutable (drag one of the handful of apps you've placed in your Dock). And if you don't quite drag the app from the Dock far enough to the side of the screen, it launches in "Slide Over", an entirely different shared-screen rather than split-screen mode. The whole concept is not merely inconsistent, it's incoherent. How would anyone ever figure out how to split-screen multitask on the iPad if they didn't already know how to do it?

[...] As things stand today, I get a phone call from my mom once a month or so because she's accidentally gotten Safari into split-screen mode when tapping links in Mail or Messages and can't get out. I like my iPad very much, and use it almost every day. But if I could go back to the pre-split-screen, pre-drag-and-drop interface I would. Which is to say, now that iPadOS has its own name, I wish I could install the iPhone's one-app-on-screen-at-a-time, no-drag-and-drop iOS on my iPad Pro. I'd do it in a heartbeat and be much happier for it. The iPad at 10 is, to me, a grave disappointment. Not because it's "bad", because it's not bad -- it's great even -- but because great though it is in so many ways, overall it has fallen so far short of the grand potential it showed on day one. To reach that potential, Apple needs to recognize they have made profound conceptual mistakes in the iPad user interface, mistakes that need to be scrapped and replaced, not polished and refined. I worry that iPadOS 13 suggests the opposite -- that Apple is steering the iPad full speed ahead down a blind alley.
Privacy

ProtonVPN Open Sources All Its Code (protonvpn.com) 29

ProtonVPN open sourced its code this week, ZDNet reports: On Tuesday, the virtual private network (VPN) provider, also known for the ProtonMail secure email service, said that the code backing ProtonVPN applications on every system -- Microsoft Windows, Apple macOS, Android, and iOS -- is now publicly available for review in what Switzerland-based ProtonVPN calls a "natural" progression.

"There is a lack of transparency and accountability regarding who operates VPN services, their security qualifications, and whether they fully conform to privacy laws like GDPR," the company says. "Making all of our applications open source is, therefore, a natural next step." Each application has also undergone a security audit by SEC Consult, which ProtonVPN says builds upon a previous partnership with Mozilla...

The source code for each app is now available on GitHub (Windows, macOS, Android, iOS). "As a community-supported organization, we have a responsibility to be as transparent, accountable, and accessible as possible," ProtonVPN says.

"Going open source helps us to do that and serve you better at the same time."

They're also publishing the results of an independent security audit for each app. "As former CERN scientists, publication and peer review are a core part of our ethos..." the company wrote in a blog post. They also point out that Switzerland has some of the world's strongest privacy laws -- and that ProtonVPN observes a strict no-logs policy.

But how do they feel about their competition? "Studies have found that over one-third of Android VPNs actually contain malware, many VPNs suffered from major security lapses, and many free VPN services that claimed to protect privacy are secretly selling user data to third parties."
PlayStation (Games)

'Rocket League' To Drop Linux and Mac Support (steamcommunity.com) 100

Long-time Slashdot reader Motor writes: Rocket League -- a very popular multiplayer game -- will no longer "be patched" for Linux and the Mac after March, says its publisher, Psyonix...

The publishers say it's motivated by the need to support unspecified "new technologies".

Thanks Psyonix.

The announcement says their final patch "will disable online functionality (such as in-game purchases) for players on macOS and Linux, but offline features including Local Matches, and splitscreen play will still be accessible."

"Players on Mac can try running Rocket League on Windows with Apple's Boot Camp tool," explains a support page, while adding in the next sentence that "Boot Camp is not something Psyonix officially supports." And if you play Rocket League on Linux, "you can try Steam's Proton app or Wine. These tools are not officially supported by Psyonix."

The support page also includes instructions on how to request a refund.
Desktops (Apple)

36 Years Ago Today, Steve Jobs Unveiled the First Macintosh (macrumors.com) 108

An anonymous reader quotes a report from MacRumors: On January 24, 1984, former Apple CEO Steve Jobs introduced the first Macintosh at Apple's annual shareholders meeting in Cupertino, California, debuting the new computer equipped with a 9-inch black and white display, an 8MHz Motorola 68000 processor, 128KB of RAM, a 3.5-inch floppy drive, and a price tag of $2,495. The now-iconic machine weighed in at a whopping 17 pounds and was advertised as offering a word processing program, a graphics package, and a mouse. At the time it was introduced, the Macintosh was seen as Apple's last chance to overcome IBM's domination of the personal computer market and remain a major player in the personal computer industry. Despite the high price at the time, which was equivalent to around $6,000 today, the Macintosh sold well, with Apple hitting 70,000 units sold by May 1984. The famous "1984" Super Bowl ad that Apple invested in and debuted days before the Macintosh was unveiled may have helped bolster sales.
Television

Apple TV Plus Reportedly Has More Subscribers Than Disney Plus (fastcompany.com) 39

If a report from The Wall Street Journal is correct, Apple's TV Plus service that launched late last year has 10 million more subscribers than Disney Plus, which launched at a similar time but with access to almost every TV show and movie Disney owns the rights to. For comparison, Apple TV Plus launched with only 11 titles. Fast Company reports: According to the Wall Street Journal, an Ampere Analysis study found that Apple's fledgling Apple TV Plus service garnered an astounding 33.6 million subscribers in the U.S. in Q4 2019. That puts it as the third-most-popular streaming service in America. Here are the top five streaming video services according to the report:

1. Netflix -- 61.3 million U.S. subscribers
2. Amazon Prime Video -- 42.2 million U.S. subscribers
3. Apple TV Plus -- 33.6 million U.S. subscribers
4. Hulu -- 31.8 million U.S. subscribers
5. Disney Plus -- 23.2 million U.S. subscribers

To be sure, Apple TV Plus is the video streaming service with the lowest monthly cost at just $4.99, but with only 11 series or movies available at launch in Q4 2019, how on earth did it leapfrog Disney Plus with its catalog of Marvel, Star Wars, and Pixar offerings (not to mention Baby Yoda)? The answer probably lies in the fact that Apple began giving away free subscriptions to its Apple TV Plus service to anyone who bought an iPhone, iPad, Mac, or Apple TV from mid-September onwards. Given that Apple sells tens of millions of those devices a month, it's no wonder Apple TV Plus has accumulated so many subscribers already. However, the real test for Apple will be how many of those subscribers stay on once their year-long free subscription of Apple TV Plus comes to an end.

Privacy

Bruce Schneier: Banning Facial Recognition Isn't Enough (nytimes.com) 90

Bruce Schneier, writing at New York Times: Communities across the United States are starting to ban facial recognition technologies. In May of last year, San Francisco banned facial recognition; the neighboring city of Oakland soon followed, as did Somerville and Brookline in Massachusetts (a statewide ban may follow). In December, San Diego suspended a facial recognition program before a new statewide law declaring it illegal came into effect. Forty major music festivals pledged not to use the technology, and activists are calling for a nationwide ban. Many Democratic presidential candidates support at least a partial ban on the technology. These efforts are well intentioned, but facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we're in the process of building. Ubiquitous mass surveillance is increasingly the norm. In countries like China, a surveillance infrastructure is being built by the government for social control. In countries like the United States, it's being built by corporations in order to influence our buying behavior, and is incidentally used by the government.

In all cases, modern mass surveillance has three broad components: identification, correlation and discrimination. Let's take them in turn. Facial recognition is a technology that can be used to identify people without their knowledge or consent. It relies on the prevalence of cameras, which are becoming both more powerful and smaller, and machine learning technologies that can match the output of these cameras with images from a database of existing photos. But that's just one identification technology among many. People can be identified at a distance by their heartbeat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses. Other things identify us as well: our phone numbers, our credit card numbers, the license plates on our cars. China, for example, uses multiple identification technologies to support its surveillance state.
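The MAC-address point can be made concrete with a small sketch (mine, not Schneier's): a 48-bit MAC address is globally unique per network interface, and its first three octets (the OUI) are assigned to a specific manufacturer, so a passive observer learns both a stable identifier and a vendor hint. The address below is a made-up example; checking whether its OUI is actually assigned, and to whom, would require the IEEE registry.

```python
def parse_mac(mac):
    """Split a MAC address like 'a4:83:e7:12:34:56' into its
    vendor prefix (OUI) and device-specific half. This is an
    illustrative sketch, not a registry lookup."""
    octets = mac.lower().split(":")
    if len(octets) != 6 or not all(len(o) == 2 for o in octets):
        raise ValueError("expected six colon-separated octets")
    oui = ":".join(octets[:3])      # assigned to the manufacturer
    device = ":".join(octets[3:])   # unique per interface
    return oui, device

print(parse_mac("A4:83:E7:12:34:56"))  # ('a4:83:e7', '12:34:56')
```

Because the identifier is both stable and broadcast in the clear, it works as a tracking handle regardless of whether any camera ever sees your face.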

Chrome

Google Will Wind Down Chrome Apps Starting in June (pcworld.com) 32

Google said this week that it will begin to phase out traditional Chrome apps starting in June, and winding down slowly over two years' time. Chrome extensions, though, will live on. From a report: Google said Tuesday in a blog post that it would stop accepting new Chrome apps in March. Existing apps could continue to be developed through June, 2022. The important dates start in June of this year, when Google will end support for Chrome Apps on the Windows, Mac, and Linux platforms. Education and Enterprise customers on these platforms will get a little more time to get their affairs in order, until December, 2020. Google had actually said four years ago that it would phase out Chrome apps on Windows, Mac, and Linux in 2018. The company appears to have waited longer than announced before beginning this process. The other platform that's affected by this, of course, is Google's own Chrome OS and Chromebooks, for which the apps were originally developed.
Desktops (Apple)

Low Power Mode for Mac Laptops: Making the Case Again (marco.org) 58

In light of this week's rumor that a Pro Mode -- which will supposedly boost performance on Macs running the Catalina operating system -- may be coming, longtime developer and Apple commentator Marco Arment makes the case for a Low Power Mode on macOS. He writes: Modern hardware constantly pushes thermal and power limits, trying to strike a balance that minimizes noise and heat while maximizing performance and battery life. Software also plays a role, trying to keep everything background-updated, content-indexed, and photo-analyzed so it's ready for us when we want it, but not so aggressively that we notice any cost to performance or battery life. Apple's customers don't usually have control over these balances, and they're usually fixed at design time with little opportunity to adapt to changing circumstances or customer priorities.

The sole exception, Low Power Mode on iOS, seems to be a huge hit: by offering a single toggle that chooses a different balance, people are able to greatly extend their battery life when they know they'll need it. Mac laptops need Low Power Mode, too. I believe so strongly in its potential because I've been using it on my laptops (in a way) for years, and it's fantastic. I've been disabling Intel Turbo Boost on my laptops with Turbo Boost Switcher Pro most of the time since 2015. In 2018, I first argued for Low Power Mode on macOS with a list of possible tweaks, concluding that disabling Turbo Boost was still the best bang-for-the-buck tweak to improve battery life without a noticeable performance cost in most tasks.

Recently, as Intel has crammed more cores and higher clocks into smaller form factors and pushed thermal limits to new extremes, the gains have become even more significant. [...] With Turbo Boost disabled, peak CPU power consumption drops by 62%, with a correspondingly huge reduction in temperature. This has two massive benefits: The fans never audibly spin up. [...] It runs significantly cooler. Turbo Boost lets laptops get too hot to comfortably hold in your lap, and so much heat radiates out that it can make hands sweaty. Disable it, and the laptop only gets moderately warm, not hot, and hands stay comfortably dry. I haven't done formal battery testing on the 16-inch, since it's so difficult and time-consuming to do in a controlled way that's actually useful to people, but anecdotally, I'm seeing similar battery gains by disabling Turbo Boost that I've seen with previous laptops: significantly longer battery life that I'd estimate to be between 30-50%.
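Arment's numbers can be sanity-checked with a back-of-the-envelope model (the assumptions here are mine, not his): if the CPU accounts for only part of total system power, a 62% cut in CPU power yields a smaller, but still large, runtime gain.

```python
def runtime_gain(cpu_share, cpu_power_cut):
    """Estimated fractional battery-runtime gain when CPU power
    drops by `cpu_power_cut` (fraction of CPU power) and the CPU
    accounts for `cpu_share` of total system power. Toy model:
    runtime is inversely proportional to average power draw."""
    new_power = (1 - cpu_share) + cpu_share * (1 - cpu_power_cut)
    return 1 / new_power - 1

# Assuming the CPU is ~50% of system power under load (my guess),
# a 62% cut in CPU power suggests roughly 45% more runtime --
# in the same ballpark as the 30-50% gains reported anecdotally.
print(round(runtime_gain(0.5, 0.62), 2))  # 0.45
```

The model ignores the display, radios, and workload variation, so it is only a plausibility check, but it shows why a 62% peak-power cut translates into a smaller overall battery gain.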

Programming

'We're Approaching the Limits of Computer Power -- We Need New Programmers Now' (theguardian.com) 306

Ever-faster processors led to bloated software, but physical limits may force a return to the concise code of the past. John Naughton: Moore's law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit." We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called "cores" -- in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.

But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart. There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed.
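The point about lean code can be made concrete with a toy comparison (my own illustration, not from the article): two routines that compute the same answer, where the leaner one does asymptotically less work.

```python
from collections import Counter

def count_duplicates_bloated(values):
    # O(n^2): for each element, rescan the whole list.
    # Correct, and fine when hardware keeps getting faster.
    return sum(1 for v in values if values.count(v) > 1)

def count_duplicates_lean(values):
    # O(n): one pass to tally occurrences, one pass to sum.
    # Counts every element whose value appears more than once,
    # exactly as the quadratic version does.
    counts = Counter(values)
    return sum(c for c in counts.values() if c > 1)

print(count_duplicates_lean([1, 2, 2, 3, 3, 3]))  # 5
```

On a thousand-element list the difference is invisible; on millions of elements the quadratic version is the kind of code that only survives because, for fifty years, Moore's law kept bailing it out.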

Firefox

Firefox 72 Arrives With Fingerprinting Blocked By Default, Picture-in-Picture on macOS and Linux (venturebeat.com) 49

Mozilla today launched Firefox 72 for Windows, Mac, Linux, and Android. Firefox 72 includes fingerprinting scripts blocked by default, less annoying notifications, and Picture-in-Picture video on macOS and Linux. There isn't too much else here, as Mozilla has now transitioned Firefox releases to a four-week cadence (from six to eight weeks).
The Internet

Apple News No Longer Supports RSS (mjtsai.com) 49

Mac developer Michael Tsai reports that Apple News no longer supports RSS. The news comes from user David A. Desrosiers, who writes: Apple News on iOS and macOS no longer supports adding RSS or Atom feeds from anywhere. Full-stop, period. It will immediately fetch, then reject those feeds and fail to display them, silently without any message or error. I can see in my own server's log that they make the request using the correct app on iOS and macOS, but then ignore the feed completely; a validated, clean feed. They ONLY support their own, hand-picked, curated feeds now. You can visit a feed in Safari, and it will prompt you to open the feed in Apple News, then silently ignore that request, after fetching the full feed content from the remote site. Simon Willison, creator of Datasette and co-creator of Django, points out that Apple News still hijacks links to Atom/RSS feeds -- "so if you click on one of those links in Mobile Safari you'll be bounced to the News app, which will then display an error."
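For readers curious what a "validated, clean feed" means in practice, here is a rough well-formedness check of the kind feed validators perform. This is emphatically not Apple's validation logic; the function name and the checks are my own sketch.

```python
import xml.etree.ElementTree as ET

def looks_like_valid_feed(xml_text):
    """Rough sanity check for RSS 2.0 or Atom: the XML parses,
    the root element is what the format requires, and there is
    at least one item/entry. A real validator checks far more."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError:
        return False
    tag = root.tag.lower()
    if tag == "rss":
        return root.find("channel/item") is not None
    if tag == "feed" or tag.endswith("}feed"):  # Atom, possibly namespaced
        ns = {"a": "http://www.w3.org/2005/Atom"}
        return (root.find("a:entry", ns) is not None
                or root.find("entry") is not None)
    return False

sample = """<rss version="2.0"><channel><title>t</title>
<item><title>first post</title></item></channel></rss>"""
print(looks_like_valid_feed(sample))  # True
```

The complaint above is that feeds passing checks like these (and stricter public validators) are fetched and then silently dropped, which points to a policy change rather than a parsing problem.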
Programming

State of Apple's Catalyst (daringfireball.net) 16

At its developer conference in June last year, Apple introduced Project Catalyst, which aims to help developers swiftly bring their iOS apps to Macs. Developers have had more than half a year to play with Catalyst. Here's where things stand currently: The crux of the issue in my mind is that iOS and Mac OS are so fundamentally different that the whole notion of getting a cohesive experience through porting apps with minimal effort becomes absurd. The problem goes beyond touch vs. pointer UX into how apps exist and interact within their wider OSes. While both Mac OS and iOS are easy to use, their ease stems from very different conventions. The more complicated Mac builds ease almost entirely through cohesion. Wherever possible, Mac applications are expected to share the same shortcuts, controls, windowing behavior, etc., so users can immediately find their bearings regardless of the application. This also means that several applications existing in the same space largely share the same visual and UX language. Having Finder, Safari, BBEdit and Transmit open on the same desktop looks and feels natural.

By comparison, the bulk of iOS's simplicity stems from a single app paradigm. Tap an icon on the home screen to enter an app that takes over the entire user experience until exited. Cohesion exists and is still important, but its surface area is much smaller because most iOS users only ever see and use a single app at a time. For better and worse, the single app paradigm allows for more diverse conventions within apps. Having different conventions for doing the same thing across multiple full screen apps is not an issue because users only have to ever deal with one of those conventions at a given time. That innocuous diversity becomes incongruous once those same apps have to live side-by-side.
Columnist John Gruber of DaringFireball adds: I think part of the problem is Catalyst itself -- it just doesn't feel like nearly a full-fledged framework for creating proper Mac apps yet. But I think another problem is the culture of doing a lot of nonstandard custom UI on iOS. As Wellborn points out, that flies on iOS -- we UI curmudgeons may not like it, but it flies -- because you're only ever using one app at a time on iOS. It cracks a bit with split-screen multitasking on iPadOS, but I've found that a lot of the iPad apps with the least-standard UIs don't even support split-screen multitasking on iPadOS, so the incongruities -- or incoherences, to borrow Wellborn's well-chosen word -- don't matter as much. But try moving these apps to the Mac and the nonstandard UIs stick out like a sore thumb, and whatever work the Catalyst frameworks do to support Mac conventions automatically doesn't kick in if the apps aren't even using the standard UIKit controls to start with. E.g. scrolling a view with Page Up, Page Down, Home, and End. Further reading: Apple's Merged iPad, Mac Apps Leave Developers Uneasy, Users Paying Twice (October 2019).
Portables (Apple)

Walt Mossberg: Tim Cook's Apple Had a Great Decade But No New Blockbusters (theverge.com) 59

Veteran tech columnist Walt Mossberg, who retired two years ago, returns with one story to cap the end of the decade: Apple hasn't said how many Watches and AirPods it's sold, but they're widely believed to be the dominant players in each of their categories and, in the grand Apple tradition, the envy of competitors that scramble to ape them. Neither of these hardware successes has matched the impact or scale of Jobs' greatest hits. Even the iPad, despite annual unit sales that are sharply down from its heyday, generated almost as much revenue by itself in fiscal 2019 as the entire category of "wearables, home and accessories" where the Apple Watch and AirPods are slotted by Apple. [...] Cook does bear the responsibility for a series of actions that screwed up the Macintosh for years. The beloved mainstream MacBook Air was ignored for five years. At the other end of the scale, the Mac Pro, the mainstay of professional audio, graphics, and video producers, was first neglected, then reissued in 2013 in a way that put form so far ahead of function that it enraged its customer base. Some insiders think Cook allowed Ive's design team far too much power and that the balance Jobs was able to strike between the designers and the engineers was gone, at least until Ive left the company earlier this year.

The design-first culture that took root under Cook struck again with the MacBook Pro, yielding new laptops so thin their keyboards were awful and featuring USB-C ports that required sleek Macs to be used with ugly dongles. Apple has only recently retreated to decent keyboards on the latest MacBook Pro, and it issued a much more promising Mac Pro. But dongles are still a part of the Apple experience across its product lines. Cook's other success this decade was to nurture the iPhone along as smartphone sales first plateaued and then began to decline. The biggest change he made came in 2014, before the dip, when Apple introduced two new iPhone 6 models, which belatedly adopted the big screens that Android phones had pioneered. Sales took off like a rocket, and there's been a big iPhone option every year since.

Software

Getting Drivers for Old Hardware Is Harder Than Ever (vice.com) 165

At least one major provider of hardware-level BIOS drivers is actively deleting old stuff it no longer supports, while old FTP sites where vintage drivers are often found are soon going to be harder to reach. Ernie Smith, writing for Motherboard: You've never lived until you've had to download a driver from an archived forum post on the Internet Archive's Wayback Machine. You have no idea if it's going to work, but it's your only option. So you bite the bullet. I recently did this with a PCI-based SATA card I was attempting to flash to support a PowerPC-based Mac, and while it was a bit of a leap of faith, it actually ended up working. Score one for chance. But this, increasingly, feels like it may be a way of life for people trying to keep old hardware alive -- despite the fact that all the drivers generally have to do is simply sit on the internet, available when they're necessary.

Apparently, that isn't easy enough for Intel. Recently, the chipmaker took BIOS drivers, a boot-level firmware technology used for hardware initialization in earlier generations of PCs, for a number of its unsupported motherboards off its website, citing the fact that the programs have reached an "End of Life" status. While it reflects the fact that Unified Extensible Firmware Interface (UEFI), a later generation of firmware technology used in PCs and Macs, is expected to ultimately replace BIOS entirely, it also leaves lots of users with old gadgets in the lurch. And as Bleeping Computer has noted, it appears to be part of a broader trend to prevent downloads for unsupported hardware on the Intel website -- things that have long lived past their current lives. After all, if something goes wrong, Intel can be sure it's not liable if a 15-year-old BIOS update borks a system.

Desktops (Apple)

Apple's New Mac Pro Can Cost $52,000. That's Without the $400 Wheels (bloomberg.com) 273

Apple started selling its new Mac Pro desktop computer on Tuesday, complete with eye-watering pricing options that can push the cost north of $50,000. From a report: The new machine, built in Austin, Texas, after Apple got tariff relief from the Trump administration, starts at $5,999 for specifications that some programmers, video editors, and photographers might consider measly. Fully loaded, the computer costs more than $52,000, and that's excluding the optional $400 wheels for easily moving the machine around an office. For some professional users, the cost of Apple's new computer is just part of doing business. But for most consumers, the Mac Pro's price is shocking. One of the most expensive personal computers in the world, it quickly drew comparisons from some Apple users to the cost of a car. The base product includes 256 gigabytes of storage, low for professional computers in the same price range. A 4-terabyte option is an extra $1,400. An 8-terabyte upgrade is coming later, according to Apple's website, but pricing hasn't been announced. Increasing the computer's RAM from 32 gigabytes to 1.5 terabytes costs an extra $25,000, the main reason the price can exceed $52,000. Apple said a version of the Mac Pro designed to be racked in data centers costs an extra $500 and will launch later. The Mac Pro does not include a display. Apple put a new Pro Display XDR on sale Tuesday for $4,999.
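The pricing arithmetic can be tallied from the figures the report quotes. The sketch below uses only those numbers; the gap between this subtotal and the full $52,000+ configuration comes from CPU, GPU, and other options whose prices aren't listed here:

```python
# Only the prices quoted in the report; this is not a full configurator.
BASE_PRICE = 5_999            # entry Mac Pro (256 GB storage, 32 GB RAM)
RAM_UPGRADE_1_5TB = 25_000    # 32 GB -> 1.5 TB, the single biggest line item
STORAGE_UPGRADE_4TB = 1_400   # 256 GB -> 4 TB
WHEELS = 400                  # optional, excluded from the $52,000 figure
RACK_MOUNT = 500              # data-center version premium, shipping later

listed_upgrades = BASE_PRICE + RAM_UPGRADE_1_5TB + STORAGE_UPGRADE_4TB
print(listed_upgrades)  # 32399 -- the rest of the $52,000+ is unlisted CPU/GPU options
```

The RAM line item alone is nearly half the maximum price, which is why the report singles it out as the main driver of the $52,000 figure.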

Chrome

Google Releases Chrome 79 With New Features Including an Option To Freeze Tabs and Back-Forward Caching (zdnet.com) 29

Google today released Chrome 79 for Windows, Mac, Linux, Chrome OS, Android, and iOS users. This release comes with security and bug fixes, but also with new features such as built-in support for the Password Checkup tool, real-time blacklisting of malicious sites via the Safe Browsing API, general availability of Predictive Phishing protections, a ban on loading "mixed content" on HTTPS pages, support for tab freezing, a new UI for the Chrome Sync profile section, and support for a back-forward caching mechanism. ZDNet has outlined each new feature in-depth.

Businesses

Apple Sues iPhone CPU Design Ace After He Quits To Run Data Center Chip Upstart Nuvia (theregister.co.uk) 100

Apple is suing the former chief architect of its iPhone and iPad microprocessors, who in February quit to co-found a data-center chip design biz. From a report: In a complaint filed in the Santa Clara Superior Court, in California, USA, and seen by The Register, the Cupertino goliath claimed Gerard Williams, CEO of semiconductor upstart Nuvia, broke his Apple employment agreement while setting up his new enterprise. Williams -- who oversaw the design of Apple's custom high-performance mobile Arm-compatible processors for nearly a decade -- quit the iGiant in February to head up the newly founded Nuvia. The startup officially came out of stealth mode at the end of November, boasting it had bagged $53m in funding. It appears to be trying to design silicon chips, quite possibly Arm-based ones, for data center systems; it is being coy right now with its plans and intentions.

[...] Apple's lawsuit alleged Williams hid the fact he was preparing to leave Apple to start his own business while still working there, and drew on his work steering iPhone processor design to create his new company. Crucially, Tim Cook & Co's lawyers claimed he tried to lure staff away from his former employer. All of this was, allegedly, in breach of his contract. The iGiant also reckoned Williams had formed the startup in the hope of being bought by Apple to produce future systems for its data centers. [...] Apple's side of the story, however, has been challenged by Williams, who accused the Mac giant of wrongdoing. Last month, his team hit back with a counterargument alleging that Apple doesn't have a legal leg to stand on. The paperwork states that Apple's employment contract provisions in this case are not enforceable under California law: they argue the language amounts to a non-compete clause, which is, generally speaking, a no-no in the Golden State. Thus, they say, Williams was allowed to plan and recruit for his new venture while at Apple. [...] They also allege that Apple's evidence in its complaint, notably text messages Williams exchanged with another Apple engineer and conversations with his eventual Nuvia co-founders, was collected illegally by the highly paranoid iPhone maker.

Desktops (Apple)

Apple's Activation Lock Will Make It Very Difficult To Refurbish Macs (ifixit.com) 178

Apple's Activation Lock is an anti-theft feature built into iOS, watchOS, and macOS Catalina that prevents people from restoring your Apple devices without your permission. "With the release of macOS Catalina earlier this fall, any Mac that's equipped with Apple's new T2 security chip now comes with Activation Lock," writes iFixit's Craig Lloyd. What this means is that there will likely be thousands of perfectly good Macs being parted out or scrapped instead of being put into the hands of people who could really use them. From the report: Activation Lock was designed to prevent anyone else from using your device if it's ever lost or stolen, and it's built into the "Find My" service on iPhones, iPads, and other Apple devices. When you're getting rid of an old phone, you want to use Apple's Reset feature to wipe the phone clean, which also removes it from Find My iPhone and gets rid of the Activation Lock. But if you forget, and sell your old iPhone to a friend before you properly wipe it, the phone will just keep asking them for your Apple ID before they can set it up as a new phone. In other words, they won't be able to do much with it besides scrap it for parts.

That seems like a nice way to thwart tech thieves, but it also causes unnecessary chaos for recyclers and refurbishers who are wading through piles of locked devices they can't reuse. This reduces the supply of refurbished devices, making them more expensive -- oh, and it's an environmental nightmare. [...] The T2 security chip, however, erases any such hope, making it impossible to do anything on a Mac without the proper Apple ID credentials. Attempting any kind of hardware tinkering on a T2-enabled Mac activates a hardware lock, which can only be undone by connecting the device to Apple-authorized repair software. It's great for device security, but terrible for repair and refurbishment. While recyclers may not be dealing with as many locked Macs as locked iPhones (especially since Activation Lock on Macs is still very new, and there are specific software criteria that need to be met), it's only a matter of time before thousands upon thousands of perfectly working Macs are scrapped or shredded, all for lack of a password.
