AMD

NVIDIA Launches GeForce RTX 2080 Super, RTX 2070 Super and RTX 2060 Super GPUs, Aims To One-Up AMD With More Power For the Same Price (hothardware.com) 63

MojoKid writes: NVIDIA just launched three new GeForce RTX gaming GPUs to battle AMD's forthcoming Radeon RX 5700 series. The GeForce RTX 2080 Super, GeForce RTX 2070 Super and GeForce RTX 2060 Super will all be shipping this month. GeForce RTX 2070 Super and RTX 2060 Super cards are already making the rounds in benchmark reviews, while the RTX 2080 Super will arrive in a couple of weeks. The GeForce RTX 2070 Super is more than just an overclocked RTX 2070; it is actually based on the GeForce RTX 2080's TU104 NVIDIA Turing GPU with 40 active SMs, for a total of 2,560 CUDA cores at 1,605MHz base and 1,770MHz boost clocks. The RTX 2060 Super is still based on the original TU106 GPU, but it has four additional SMs enabled, which brings the CUDA core count up to 2,176 (from 1,920) at a somewhat higher 1,470MHz base clock and a boost clock 30MHz lower, at 1,650MHz.

There is an additional 2GB of GDDR6 memory on the card too for a total of 8GB now. Performance-wise, both cards are significant upgrades over the originals, with roughly 10 -- 23 percent gains, depending on the resolution or application. The GeForce RTX 2070 Super is often faster than the pricier AMD Radeon VII, especially at 1440p. At 4K, however, the Radeon VII's memory bandwidth advantage often gives it an edge. The new GeForce RTX 2060 Super is faster than a Radeon RX Vega 64 more often than not. It will be interesting to see how these cards compete with AMD's Radeon RX 5700 Navi-based card when they arrive later this month. NVIDIA could have just thrown a wrench in the works for AMD.

AMD

Leaked Internal Intel Memo Acknowledges 'Resurgent', 'Formidable' AMD (hothardware.com) 162

Slashdot reader MojoKid writes: AMD announced its 3rd Gen Ryzen 3000 series processors at Computex earlier this month, and the company's Zen 2 architecture promises single-threaded performance parity with Intel along with markedly better multithreaded throughput in content creation and other high-end workloads.

Intel has obviously taken notice of AMD's Zen 2 advancements, and nowhere is its renewed focus more evident than in an internal memo that just leaked to public venues. The memo was originally posted on Intel's internal "Circuit News" employee portal, and it's quite revealing. The memo, entitled "AMD competitive profile: Where we go toe-to-toe, why they are resurgent, which chips of ours beat theirs," is a surprisingly frank look at how AMD has managed to get the best of Intel, at least currently, and how the company should manage this renewed or "resurgent" competitive threat.

What's most surprising about the memo, which was penned by Circuit News Managing Editor Walden Kirsch, is how flattering it is to AMD in general, pointing out that AMD was the best-performing stock on the S&P 500 for 2018. In terms of Zen 2 and AMD's Ryzen 3000 series, Kirsch writes in the memo, "Intel 9th Gen Core processors are likely to lead AMD's Ryzen-based products on lightly threaded productivity benchmarks as well as many gaming benchmarks. For multi-threaded workloads, such as heavy content creation workloads, AMD's Matisse is expected to lead." All in, the internal memo is a rather insightful and well-reasoned look at the threat AMD poses to Intel and how the company might respond.

AMD

AMD Cites 'Factual Errors', 'Omissions' in Critical Report on Its China Venture (forbes.com) 69

Thursday the Wall Street Journal published a piece about AMD's joint venture with Chinese holding company THATIC -- titled "How a Big U.S. Chip Maker Gave China the 'Keys to the Kingdom'." The article argues that AMD "essentially granted China access to advanced processor IP that could be used to threaten U.S. national security," reports Forbes.

But Forbes adds that the same day, AMD executive Harry Wolin wrote an angry blog post in response, complaining that the story "contains several factual errors and omissions and does not portray an accurate picture." From Wolin's post: "Starting in 2015, AMD diligently and proactively briefed the Department of Defense, the Department of Commerce and multiple other agencies within the U.S. Government before entering into the joint ventures. AMD received no objections whatsoever from any agency to the formation of the joint ventures or to the transfer of technology -- technology which was of lower performance than other commercially available processors. In fact, prior to the formation of the joint ventures and the transfer of technology, the Department of Commerce notified AMD that the technology proposed was not restricted or otherwise prohibited from being transferred. Given this clear feedback, AMD moved ahead with the joint ventures."

Not only does AMD claim it had the green light from multiple government entities to enter into the deal; the post also claims the WSJ article is simply wrong. "The Wall Street Journal story omits important factual details, including the fact that AMD put significant protections in place to protect its intellectual property (IP) and prevent valuable IP from being misused or reverse engineered to develop future generations of processors."

Intel

Intel Will Cut Desktop CPU Prices By 10-15% as Ryzen 3000 Draws Near, Report Says (techspot.com) 117

It's just over two weeks until AMD's full Ryzen 3000 family of processors arrives, and it appears Intel is concerned about the effect it may have on its own chip sales. From a report: As such, the company is reportedly planning to reduce the price of its eighth- and ninth-generation CPUs by 10 to 15 percent. The report comes from DigiTimes, citing sources at motherboard makers. It claims Intel has already notified its downstream PC and motherboard partners about the processor price drops, which could see anything from $25 to $75 knocked off the CPUs. If the report is accurate, the enthusiast eight-core/16-thread Core i9-9900K will be one of the chips to see a price reduction, as will the i7-9700K and the i5-9600K.
AMD

AMD Is Working On a Monster 64-Core Threadripper CPU, Landing As Early As Q4 2019 (wccftech.com) 206

AMD is preparing a monstrous 64-core/128-thread Threadripper CPU for launch in Q4 2019. "AMD's largest HEDT processor right now is the 2990WX, which tops out at 32 cores," reports Wccftech. "This is nothing to sneeze at and is already the highest-core-count HEDT part around, but because the world can't get enough of these yummy cores, AMD is planning to launch a 64-core version in Q4 2019." From the report: The platform is called X599 right now, although I am told AMD is considering changing the name to avoid confusion with Intel. This is not really surprising, since both Intel and AMD HEDT platforms share the same nomenclature and it can get really confusing. I am also told that they plan to retain the "99" suffix. AMD is planning to launch the 64-core Threadripper part and the corresponding platform in Q4 2019. In fact, that is when you can expect these motherboards to start popping up from various AIBs.

Now, my source did not mention a new socket, so as far as I know, this should be socket-compatible with existing TR4 motherboards, and only a BIOS update should be needed if you already own one. What I don't know right now is whether this is a 14nm part or a 7nm part. Conventional wisdom would dictate that this is a 14nm part trickling down from their server space, but who knows, maybe the company will surprise all of us? This is pretty exciting news, because knowing AMD, the 64-core Threadripper CPU will probably be priced in the $2,500 to $3,000 range, making it one of the most affordable workstation processors around with this many threads.

AMD

AMD Unveils Zen 2 CPU Architecture, Navi GPU Architecture and a Slew of Products (hothardware.com) 167

MojoKid writes: AMD let loose today with a number of high-profile launches at the E3 2019 Expo in Los Angeles, CA. The company disclosed its full Zen 2 Ryzen 3000 series microarchitecture, which AMD claims offers an IPC uplift of 15% generation over generation, thanks to better branch prediction, higher integer throughput, and reduced effective memory latency. Zen 2 also significantly beefs up floating-point throughput, with double the FP performance of the previous generation. AMD also announced a 16-core/32-thread variant, dubbed the Ryzen 9 3950X, that drops at $750 -- a full $950 cheaper than the similarly specced 16-core Intel Core i9-9960X. On the graphics side, AMD's RDNA architecture in Navi will power the company's new Radeon RX 5700 series, which is said to offer performance competitive with NVIDIA's GeForce RTX 2070 and 2060 series. The Navi-based GPU at the heart of the upcoming Radeon RX 5700 series is manufactured on TSMC's 7nm process node and features GDDR6 memory, along with PCI Express 4.0 interface support. Versus AMD's previous-generation GCN (Graphics Core Next) architecture, RDNA delivers more than 50% better performance-per-watt and 25% better overall performance. Greater than 50% of that improvement comes from architecture optimizations, according to AMD; the GPU also gets a boost from its 7nm process and frequency gains. Radeon RX 5700 and 5700 XT cards will be available on July 7th, along with AMD Ryzen 3000 chips, but pricing hasn't been established yet for the Radeon GPUs.
XBox (Games)

Project Scarlett is the Next Xbox Console, Launching in Holiday 2020 (polygon.com) 115

Project Scarlett is the next Microsoft video game console. Phil Spencer, executive vice president of gaming at the company, announced the hardware during Microsoft's E3 2019 press briefing. From a report: "The console should be optimized for one thing and one thing only," said Spencer, "gaming." Spencer explained that the console has been developed by the team responsible for the Xbox One X. A promotional video featuring various Xbox employees promised variable refresh rates, real-time ray tracing, 8K resolution, frame rates up to 120 frames per second, and a new SSD with upwards of 40 times better performance than the current generation. The tech at the heart of the console -- which Microsoft said is four times as powerful as the Xbox One X -- will be a custom chip based on AMD's Zen 2 and Navi technology.
Desktops (Apple)

Apple's Top Spec Mac Pro and Pro Display Will Cost At Least $50,000 (theverge.com) 335

Apple announced this week that its new Mac Pro starts at an already pricey $6,000, but the company neglected to mention how much the top-of-the-line model will cost. From a report on The Verge: So we shopped around for equivalent parts to the top-end spec that Apple's promising. As it turns out: $33,720.88 is likely the bare minimum -- and that's before factoring in the four GPUs, which could easily jack that price up to around $45,000. For all that dough, big-budget video editors and other creative types get a lot of firepower: a 28-core Intel Xeon W processor, an almost-impossible-to-comprehend 1.5TB of RAM, 4TB of SSD storage, and four AMD Radeon Pro Vega II Duo GPUs -- assuming you can afford one. Add in a Pro Display XDR monitor (and a Pro Stand to go with it), and you're looking at a workstation that could clear $50,000. Keep in mind too that these estimates are based on market prices for these (or similar) parts: Apple historically has charged far more for its pre-built configurations than for a computer you'd build on your own.
Desktops (Apple)

Apple Announces All-New Redesigned Mac Pro, Starting at $5,999 (theverge.com) 317

The long-awaited Mac Pro is here. From a report: The new Intel Xeon processor inside the Mac Pro will have up to 28 cores, with up to 300W of power and heavy-duty cooling, "so it can run unconstrained at full power at all times." System memory can be maxed out at an eyebrow-raising 1.5TB, says Apple. There are eight internal PCI Express slots, with four of them being double-wide. Two USB-C and two USB-A ports will grace the front of the system, which is at least one more USB-C port than you'll find on a majority of desktop PC systems and cases today. With this Mac Pro, Apple is launching a custom expansion module it calls an MPX Module. This is a giant quad-wide PCIe card that fits two graphics cards, has its own dedicated heatsink, and also has a Thunderbolt 3 connector on the bottom for extra bandwidth / power / display connectivity. Apple says you can spec that out with AMD's Radeon Pro Vega II or Radeon Pro Vega II Duo, the latter of which would get you four GPUs in total. The power supply of the new Mac Pro maxes out at 1.4kW. Three large fans sit at the front, just behind the new aluminum grille, blowing air across the system at a rate of 300 cubic feet per minute. It starts at $5,999.
AMD

Samsung and AMD Announce Multi-Year Strategic Graphics IP Licensing Deal For SLSI Mobile GPUs (anandtech.com) 17

Samsung and AMD announced today a new multi-year strategic partnership under which Samsung SLSI will license graphics IP from AMD for use in mobile GPUs. From a report: The announcement is a major disruptive move for the mobile graphics landscape, as it signifies that Samsung is going forward with the productization of its own in-house GPU architecture in future Exynos chipsets. Samsung is said to have started work on its own "S-GPU" at its research division back around 2012, with the company handing the new IP over to a new division called "ACL," or Advanced Computing Lab, in San Jose, which has a joint charter with SARC (Samsung Austin R&D Center, where Samsung currently designs its custom mobile CPU and memory controller IP). With today's announced partnership, Samsung will license "custom graphics IP" from AMD. What this IP means is a bit unclear from the press release, but we have some strong pointers on what it might be.

Samsung's own GPU architecture is already quite far along, having seen seven years of development, and is already being integrated into test silicon chipsets. Unless the deal was signed years ago and only publicly announced today, that would suggest the IP being discussed is a patent deal rather than new architectural IP from AMD that Samsung would integrate into its own designs. Samsung's new GPU IP is the first from-scratch design in over a decade, in an industry of very old incumbents with massive patent pools. What today's announcement likely means, then, is that Samsung is buying a patent chest from AMD to protect itself from possible litigation by other industry players.

Intel

Intel Boldly Claims Its 'Ice Lake' Integrated Graphics Are As Good as AMD's (pcworld.com) 147

While Intel is expected to detail its upcoming 10nm processor, Ice Lake, during its Tuesday keynote here at Computex, the company is already making one bold claim -- that Ice Lake's integrated Gen11 graphics engine is on par with, or better than, AMD's current Ryzen 7 graphics. From a report: It's a bold claim, and one that Ryan Shrout, a former journalist and now chief performance strategist for Intel, said Intel doesn't make lightly. "I don't think we can overstate how important this is for us, to make this claim and this statement about the one area that people railed on us for in the mobile space," Shrout said shortly before Computex began. Though Intel actually supplies the largest number of integrated graphics chipsets in the PC space, it does so on the strength of its CPU performance (and also thanks to strong relationships with laptop makers). Historically, AMD has leveraged its Radeon "Vega" GPUs to attract buyers seeking a more powerful integrated graphics solution. But what Intel is trying to do now, with its Xe discrete graphics on the horizon, is let its GPUs stand on their own merits. Referencing a series of benchmarks and games, from the 3DMark Sky Diver test to Fortnite to Overwatch, Intel claims performance that's 3 to 15 percent faster than the Ryzen 7's. Intel's argument is based on a comparison of a U-series Ice Lake part at 25 watts versus a Ryzen 7 3700U, also at 25 watts.
AMD

AMD Unveils the 12-Core Ryzen 9 3900X, at Half the Price of Intel's Competing Core i9 9920X Chipset (techcrunch.com) 261

AMD CEO Lisa Su today unveiled news about its chips and graphics processors that will increase pressure on competitors Intel and Nvidia, both in terms of pricing and performance. From a report: All new third-generation Ryzen CPUs, the first 7-nanometer desktop chips, will go on sale on July 7. The showstopper of Su's keynote was the announcement of AMD's 12-core, 24-thread Ryzen 9 3900X chip, the flagship of its third-generation Ryzen family. It will retail starting at $499, half the price of Intel's competing Core i9-9920X, which is priced at $1,189 and up. The 3900X has a 4.6GHz boost speed and 70MB of total cache, and uses 105 watts of thermal design power (versus the i9-9920X's 165 watts), making it more efficient. AMD says that in a Blender demo against the Intel i9-9920X, the 3900X finished about 18 percent more quickly. Starting prices for other chips in the family are $199 for the 6-core, 12-thread 3600; $329 for the 8-core, 16-thread Ryzen 3700X (with 4.4GHz boost, 36MB of total cache and a 65-watt TDP); and $399 for the 8-core, 16-thread Ryzen 3800X (4.5GHz, 32MB cache, 105W).
AMD

Intel Performance Hit 5x Harder Than AMD After Spectre, Meltdown Patches (extremetech.com) 170

Phoronix has conducted a series of tests to show just how much the Spectre and Meltdown patches have impacted the raw performance of Intel and AMD CPUs. While the patches have resulted in performance decreases across the board, ranging from virtually nothing to significant depending on the application, it appears that Intel got the short end of the stick, as its CPUs have been hit five times harder than AMD's, according to ExtremeTech. From the report: The collective impact of enabling all patches is not a positive for Intel. While the impacts vary tremendously from virtually nothing to significant on an application-by-application level, the collective whack is about 15-16 percent on all Intel CPUs with Hyper-Threading still enabled. Disabling Hyper-Threading increases the overall performance impact to 20 percent (for the 7980XE), 24.8 percent (8700K) and 20.5 percent (6800K).

The AMD CPUs are not tested with HT disabled, because disabling SMT isn't a required fix for the situation on AMD chips, but the cumulative impact of the decline is much smaller: AMD loses ~3 percent with all fixes enabled. The impact of these changes is enough to change the relative performance weighting between the tested solutions. With no fixes applied, across its entire test suite, the CPU performance ranking is (from fastest to slowest): 7980XE (288), 8700K (271), 2990WX (245), 2700X (219), 6800K (200). With the full suite of mitigations enabled, the CPU performance ranking is (from fastest to slowest): 2990WX (238), 7980XE (231), 2700X (213), 8700K (204), 6800K (159).
In closing, ExtremeTech writes: "AMD, in other words, now leads the aggregate performance metrics, moving from 3rd and 4th to 1st and 3rd. This isn't the same as winning every test, and since the degree to which each test responds to these changes varies, you can't claim that the 2990WX is now across-the-board faster than the 7980XE in the Phoronix benchmark suite. It isn't. But the cumulative impact of these patches could result in more tests where Intel and AMD switch rankings as a result of performance impacts that only hit one vendor."
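For readers who want to see which of these mitigations their own machine is actually running, the Linux kernel exposes a standard sysfs directory reporting per-vulnerability status; the exact entries and wording vary by kernel version and CPU, so treat this as a quick diagnostic sketch rather than a benchmark-grade audit:

```shell
# Print the kernel's reported mitigation status for each known CPU
# vulnerability (spectre_v1, spectre_v2, meltdown, etc.).
# Entries and wording vary by kernel version and CPU vendor.
for f in /sys/devices/system/cpu/vulnerabilities/*; do
    printf '%-24s %s\n' "$(basename "$f"):" "$(cat "$f" 2>/dev/null)"
done
```

On a patched Intel system the output typically includes a line such as `meltdown: Mitigation: PTI`, while AMD parts generally report `Not affected` for Meltdown, which matches the asymmetric costs measured above.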
Businesses

Ask Slashdot: Are the Big Players In Tech Even Competing With Each Other? 145

dryriver writes: For capitalism to work for consumers in a beneficial way, the big players have to compete hard against each other and innovate courageously. What appears to be happening instead, however, is that every year almost everybody is making roughly the same product at roughly the same price point. Most 4K TVs at the same price point have the same features -- there is little to distinguish manufacturer A from manufacturer B. Ditto for smartphones -- nobody suddenly puts a 3D-scanning-capable lightfield camera, shake-the-phone-to-charge-it or something similarly innovative into their next phone. Ditto for game consoles -- Xbox and PlayStation are not very different from each other at all. Nintendo does "different," but underpowers its hardware. Ditto for laptops -- the only major difference I see in laptops is the quality of the screen panel used and of the cooling system. The last laptop with an autostereoscopic 3D screen I have seen is the long-discontinued Toshiba Satellite 3D. Ditto for CPUs and GPUs -- it doesn't really matter whether you buy Intel, AMD, or Nvidia. There is nothing so "different" or "distinct" in any of the electronics they make that it makes you go "wow, that is truly groundbreaking." Ditto for sports action cameras, DSLRs, portable storage and just about everything else "tech." So where precisely -- besides pricing and build-quality differences -- is the competition in what these companies are doing? Shouldn't somebody be trying to "pull far ahead of the pack" or "ahead of the curve" with some crazy new feature that nobody else has? Or is true innovation in tech simply dead now?
Businesses

Hewlett Packard Enterprise To Acquire Supercomputer Maker Cray for $1.3 Billion (anandtech.com) 101

Hewlett Packard Enterprise will be buying supercomputer maker Cray for roughly $1.3 billion, the companies said this morning. HPE intends to use Cray's knowledge and technology to bolster its own supercomputing and high-performance computing offerings, and when the deal closes, it will become the world leader in supercomputing technology. From a report: Cray of course needs no introduction. The current leader in the supercomputing field and the founder of supercomputing as we know it, Cray has been part of the supercomputing landscape since the 1970s. Starting at the time with fully custom systems, in more recent years Cray has morphed into an integrator and scale-out specialist, combining processors from the likes of Intel, AMD, and NVIDIA into supercomputers and applying its own software, I/O, and interconnect technologies. The timing of the acquisition announcement closely follows other major news from Cray: the company just landed a $600 million US Department of Energy contract to supply the Frontier supercomputer to Oak Ridge National Laboratory in 2021. Frontier is one of two exascale supercomputers Cray is involved in -- the other being the 2021 Aurora system, on which Cray is a subcontractor -- and in fact Cray is involved in the only two exascale systems ordered by the US Government thus far. So in both a historical and a modern context, Cray was and is one of the biggest players in the supercomputing market.
AMD

World's Fastest Supercomputer Coming To US in 2021 From Cray, AMD (cnet.com) 89

The "exascale" computing race is getting a new entrant called Frontier, a $600 million machine with Cray and AMD technology that could become the world's fastest when it arrives at Oak Ridge National Laboratory in 2021. From a report: Frontier should be able to perform 1.5 quintillion calculations per second, a level called 1.5 exaflops and enough to claim the performance crown, the Energy Department announced Tuesday. Its speed will be about 10 times faster than that of the current record holder on the Top500 supercomputer ranking, the IBM-built Summit machine, also at Oak Ridge, and should surpass a $500 million, 1-exaflops Cray-Intel supercomputer called Aurora to be built in 2021 at Argonne National Laboratory. There's no guarantee the US will win the race to exascale machines -- those that cross the 1-exaflop threshold -- because China, Japan and France each could have exascale machines in 2020. At stake is more than national bragging rights: It's also about the ability to perform cutting-edge research in areas like genomics, nuclear physics, cosmology, drug discovery, artificial intelligence and climate simulation.
Software

Blender Developers Find Old Linux Drivers Are Better Maintained Than Windows (phoronix.com) 151

Unsurprisingly, old open-source Linux OpenGL drivers turn out to be better maintained than proprietary graphics drivers on Windows, where driver releases stop once support is retired. From a report: Blender developers, working to ship Blender 2.80 this July as the big update to the open-source 3D modeling software, today rolled out the Linux GPU requirements for the next release. The requirements themselves aren't too surprising: NVIDIA GPUs released in the last ten years, AMD GCN for best support, and Intel Haswell graphics or newer. NVIDIA tends to do a good job maintaining its legacy driver branches; with AMD Radeon and Intel graphics, Blender developers acknowledge older hardware may work better on Linux.
AMD

AMD Gained Market Share For 6th Straight Quarter, CEO Says (venturebeat.com) 123

Advanced Micro Devices CEO Lisa Su said during her remarks on AMD's first quarter earnings conference call with analysts today that she was confident about the state of competition with rivals like Intel and Nvidia in processors and graphics chips. She also pointed out that the company gained market share in processors for the 6th straight quarter. From a report: AMD's revenue was $1.27 billion for the first quarter, down 23% from the same quarter a year ago. But Su noted that Ryzen and Epyc processor and datacenter graphics processing units (GPUs) revenue more than doubled year-over-year, helping expand the gross margin by 5 percentage points. If there was a lag in the quarter, it was due to softness in the graphics channel and lower semi-custom revenue (which includes game console chips). Su said AMD's unit shipments increased significantly and the company's new products drove a higher client average selling price (ASP).
Graphics

Ask Slashdot: Why Are 3D Games, VR/AR Still Rendered Using Polygons In 2019? 230

dryriver writes: A lot of people seem to believe that computers somehow need polygons, NURBS surfaces, voxels or point clouds "to be able to define and render 3D models to the screen at all." This isn't really true. All a computer needs to light, shade, and display a 3D model is to know the answer to the question "is there a surface point at coordinate XYZ or not." Many different mathematical structures or descriptors can be dreamed up that can tell a computer whether there is indeed a 3D model surface point at coordinate XYZ or behind a given screen pixel XY. Polygons/triangles are a very old approach to 3D graphics that was primarily designed not to overstress the very limited CPU and RAM resources of the first computers capable of displaying raster 3D graphics. The brains who invented the technique back in the late 1960s probably figured that by the 1990s at the latest, their method would be replaced by something better and more clever. Yet here we are in 2019 buying pricey Nvidia, AMD, and other GPUs that are primarily polygon/triangle accelerators.

Why is this? Creating good-looking polygon models is still a slow, difficult, iterative and money intensive task in 2019. A good chunk of the $60 you pay for an AAA PC or console game is the sheer amount of time, manpower and effort required to make everything in a 15-hour-long game experience using unwieldy triangles and polygons. So why still use polygons at all? Why not dream up a completely new "there is a surface point here" technique that makes good 3D models easier to create and may render much, much faster than polygons/triangles on modern hardware to boot? Why use a 50-year-old approach to 3D graphics when new, better approaches can be pioneered?
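One family of "is there a surface point here" alternatives the submitter alludes to already exists: implicit surfaces defined by signed distance functions (SDFs), rendered by sphere tracing instead of rasterizing triangles. The sketch below is a deliberately tiny illustration (a hypothetical one-sphere scene, a toy camera, ASCII output), not how any shipping engine or GPU works:

```python
import math

def sphere_sdf(x, y, z, cx=0.0, cy=0.0, cz=3.0, r=1.0):
    """Signed distance to a sphere: negative inside, zero on the surface,
    positive outside. This one function fully answers "is there a surface
    point at coordinate XYZ" for the whole model."""
    return math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2) - r

def trace(px, py, width=32, height=32, max_steps=64, eps=1e-3, far=10.0):
    """Sphere-trace one ray through pixel (px, py); return hit depth or None."""
    # Camera at the origin looking down +z; map the pixel onto a [-1, 1] plane.
    dx = 2.0 * px / width - 1.0
    dy = 2.0 * py / height - 1.0
    dz = 1.0
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    t = 0.0
    for _ in range(max_steps):
        d = sphere_sdf(t * dx, t * dy, t * dz)
        if d < eps:      # close enough to the surface: the ray hit
            return t
        t += d           # safe step: the SDF guarantees no surface within d
        if t > far:
            return None  # ray left the scene without hitting anything
    return None

# Render a tiny ASCII image: '#' where a surface point exists behind the pixel.
for y in range(32):
    print(''.join('#' if trace(x, y) is not None else '.' for x in range(32)))
```

Swapping `sphere_sdf` for any other distance function (or a composition of them) changes the model without touching the renderer, which is exactly the appeal of point-query representations; the open question raised above is why such approaches remain niche while GPUs keep accelerating triangles.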
PlayStation (Games)

What To Expect From Sony's Next-Gen PlayStation (wired.com) 131

Daetrin writes: Sony is unwilling to confirm "PlayStation 5" as the name, but its next console is "no mere upgrade," according to a report from Wired, which cites Sony executives -- who spoke on the record:

"PlayStation's next-generation console ticks all those boxes, starting with an AMD chip at the heart of the device. (Warning: some alphabet soup follows.) The CPU is based on the third generation of AMD's Ryzen line and contains eight cores of the company's new 7nm Zen 2 microarchitecture. The GPU, a custom variant of Radeon's Navi family, will support ray tracing, a technique that models the travel of light to simulate complex interactions in 3D environments. While ray tracing is a staple of Hollywood visual effects and is beginning to worm its way into $10,000 high-end processors, no game console has been able to manage it. Yet."

The console will also have a solid-state drive and is currently planned to be backward-compatible with both PS4 games and PSVR.
