Minecraft RTX vs Minecraft: how much does ray tracing really matter?

We first caught a glimpse of Minecraft RTX way back at Gamescom 2019, where Nvidia was keen to show off how far ray tracing had come in the year since Nvidia Turing first hit the market. However, it's been close to a year since Nvidia first showed what ray tracing could look like in Minecraft, and we're only just now seeing it enter beta. 

We've been playing the beta for a couple of days now, and based on our first impressions, we can easily say that it, alongside Quake II RTX, makes for a stunning example of how much of a difference ray tracing can make. 

And, sure, it can be argued that ray tracing in a game as visually basic as Minecraft is a waste of effort, but it's that basic nature that really makes the difference in lighting pop. Ray tracing, in effect, makes Minecraft look like an entirely different game, and we've got the screenshots to prove it. 

When can I play Minecraft RTX?

First things first, if you're looking to get in on the Minecraft RTX action yourself, you're in luck. The Minecraft RTX beta begins today, and anyone can sign up for it – but it's a bit of a process. 

Quick note: if you plan to do any of this, be sure to back up your worlds by going to the following folder and copying any files you need to keep. 



  • %LOCALAPPDATA%\Packages\Microsoft.MinecraftUWP_8wekyb3d8bbwe\LocalState\games\com.mojang\minecraftWorlds
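If you'd rather script that backup than dig through Explorer, a few lines of Python will do it. This is a minimal sketch – the Documents destination is just our choice, and you'll want Minecraft closed while it runs:

```python
import os
import shutil
from datetime import datetime

# Expand the worlds folder path from the bullet above.
worlds = os.path.expandvars(
    r"%LOCALAPPDATA%\Packages\Microsoft.MinecraftUWP_8wekyb3d8bbwe"
    r"\LocalState\games\com.mojang\minecraftWorlds"
)

# Copy everything into a timestamped folder under Documents.
backup = os.path.join(
    os.path.expanduser("~"), "Documents",
    "minecraftWorlds-backup-" + datetime.now().strftime("%Y%m%d-%H%M%S"),
)

shutil.copytree(worlds, backup)
print(f"Backed up {worlds} -> {backup}")
```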

Minecraft RTX is only available on Windows 10, so you're going to need to own that version to access it. If you want in on the beta, download the Xbox Insider Hub app from the Windows 10 Store, open it and click the three horizontal lines in the top left corner.

From there, click on 'Insider Content', then click on Minecraft for Windows 10 and hit Join. Finally, click on 'Manage' and choose whether you want to be a part of the regular Minecraft Beta program or just the Minecraft RTX Beta.

Now that that's done, be sure to download the game ready driver from GeForce Experience or Nvidia's website. 

Nvidia GeForce RTX 2060 Super

What do you need to play Minecraft RTX?

Because we're talking about ray tracing here – and AMD RDNA 2 GPUs with hardware-accelerated ray tracing are releasing who-knows-when – you need an Nvidia RTX graphics card to really take advantage of these new effects. 

Nvidia told us that an Nvidia GeForce RTX 2060 is required for "playable" framerates at 1080p, which it defined as "30 fps". Keep in mind, however, that this is with DLSS 2.0 enabled, which, to the uninitiated, is Nvidia's AI-based upscaling technology. 

We didn't do full performance testing of Minecraft RTX like we typically do with major game launches (see Resident Evil 3 and Doom Eternal performance), because it was a beta. However, that doesn't mean we ignored performance entirely. 

We played the game on our home gaming PC, which is packed with an AMD Ryzen 9 3900X, 32GB of RAM and an Nvidia GeForce RTX 2080 Ti – as close to top-end as you can get right now without breaking into the HEDT world. And, even with this extravagant level of hardware, we didn't manage to break a 60 fps average at 3,440 x 1,440 with ray tracing enabled. 

That being said, when we turned off the DLSS upscaling, framerates tanked all the way down to the mid-20s, so we would say that enabling DLSS is basically required with Minecraft RTX. 

Keep in mind that this is just a beta, however, so performance can get a lot better over time as the talented developers over at Mojang work out the kinks and get optimization down. 

Until then, however, we'd advise sticking with 1080p or 1440p unless you're one of the few folks who has an RTX 2080 Ti installed. 

Does Minecraft RTX really make that much of a difference?

We definitely encourage you to download and install the beta if you really want to see how much of a difference ray tracing makes with Minecraft, but we can personally attest that it makes the game look entirely different. 

The inclusion of ray-traced global illumination, shadows and reflections here makes every single object in the game look so much better. Light-emitting items matter more, too, as their placement in your project determines whether a cave or a room will have enough light to see by. 

One of the demo worlds that Nvidia provided to us had this giant castle, and just for laughs we went through and removed all of the torches. It got so dark that we were reminded of the illumination tech found in Metro Exodus, which had really sold us on this whole ray tracing thing when we saw it in action. 

We could go on all day about how jaw-dropping the difference is, but screenshots are going to tell the story better anyway, so here you go. And, again, we encourage you to try it out for yourself, because this editor had never played Minecraft before writing this article and doesn't really know how to get the most out of the technology here. 

So, yeah: Minecraft with RTX is pretty incredible. While the game isn't as bombastic as Control or Metro Exodus, we think it's a neat way to show off what the tech can do. 

Plus, in a game that revolves so much around creating whatever folks want to create, having more tools to do that can't hurt. Frankly, while we probably won't be playing much Minecraft in the future, we're ecstatic to see what people create with it. 


Ready your graphics cards, Crysis might be getting remastered

Back in the late 2000s, it became a meme to test the best gaming PCs by asking ourselves "can it run Crysis?" And, while the game has stayed a visual heavyweight, it's not really much of an accomplishment to get it running smoothly these days – what with all the ray tracing GPUs around. 

However, the Crysis Twitter account broke its three-year-long silence, tweeting out simply "RECEIVING DATA". Sure, there's not much information there, but the internet is already swirling with Crysis remaster rumors. And, we're totally here for it. 

Crysis 3 was the last game in the series to torture our computers when it came out all the way back in 2013. Since then, the seminal PC game series has been lying dormant, even though Cryengine has remained hugely popular. According to TweakTown, this isn't the first word of a Crysis remaster, as some hints were apparently dropped in tech demos.

If the remastered Crysis can adopt all the latest technology like ray tracing and take advantage of multi-core processors like the Ryzen 9 3950X, we could definitely see Crysis return to punish an entirely new generation of PC hardware. 

Now, obviously this is all speculation based on a tweet and a tech demo, so it's important to take it all with a grain of salt. Still, as TweakTown points out, EA did say it has some pretty big aces up its sleeve this year – maybe Crysis: Remastered is one of them. 

Can it run Crysis?

2020 is going to be a year that'll be packed to the brim with exciting PC hardware. We've already heard AMD CEO Lisa Su assure us that Ryzen 4000 CPUs and RDNA 2 GPUs are going to be hitting the streets, and we're sure that those lineups won't go unanswered by Intel or Nvidia, respectively. 

A new Crysis game, even if it is just a remaster of the original (fingers crossed for the whole trilogy), could be the perfect PC game to show off the new tech. 

Even though neither Crysis 2 nor Crysis 3 carried the same graphics card-killing reputation as the first game in the series, all three games utilized the latest graphics technology to create some of the most beautiful games of all time. Hell, the whole trilogy still stands up 13 years after the original Crysis came out way back in 2007. 

Crytek, if nothing else, knows how to make a technically ambitious game, one that truly takes advantage of the best hardware on the market. If it remasters Crysis – and remember, this is far from a known fact – it could end up being one of the best-looking and most demanding games of the year. 

We know we can't wait to see what a 2020 Crysis looks like, and we're sure we're not alone. 



AMD RDNA 2 release date, specs and rumors: everything we know about ‘Big Navi’

Speculation and leaks surrounding AMD RDNA 2 are piling up as we move closer to its rumored release date. Thanks to the triumphant success of AMD Navi, which launched way back in July 2019 with the Radeon RX 5700 XT and RX 5700 leading the charge, people have had a taste of AMD’s mid-range champions. Now they’re hungry for ‘Big Navi' – the AMD graphics card that’s said to dethrone the reigning champ, the RTX 2080 Ti.

As we near a year since the first AMD RDNA graphics cards hit the streets, we’re anxious to know when we’ll see RDNA 2, and what it will look like. A ton of rumors have been surfacing, painting a picture of some incredibly powerful AMD graphics cards that might give Nvidia some much-needed competition at the high end. Though, sadly, they may not all feature ray tracing support.

The good news is that it will be the graphics architecture behind the next-generation consoles, the Xbox Series X and PlayStation 5, so it's safe to assume that new graphics cards are well on their way. 

AMD hasn't come out and officially announced many details about RDNA 2 or the supposed 'Big Navi' card that is rumored to be coming out. Although, what could be the AMD Big Navi graphics card was spotted in a Linux driver recently, and AMD CFO Devinder Kumar himself said that AMD is "on track to launch our next-generation Zen 3 CPUs and RDNA 2 GPUs in late 2020". They’re now expected to arrive before the PS5 and Xbox Series X.

We’ve got enough gossip, speculation and updates to tide us over until the AMD RDNA 2 cards hit the streets. So, we gathered it all in one place. Be sure to keep this page bookmarked, as we’ll keep it updated with all the latest AMD Big Navi rumors and information.

Cut to the chase

  • What is it? AMD's hopefully high-end graphics card
  • When is it out? Late 2020
  • What will it cost? No one knows, but it probably won't be cheap

AMD RDNA 2

AMD Big Navi release date

It seems like AMD Big Navi release date rumors have been everywhere since the very beginning of time – or at least since Navi rumors first started appearing in late 2018. However, here we are in 2020 still without a solid release date for AMD's high-end graphics card. 

Initially, we had heard that Big Navi would be launching at Computex – but that show has been pushed all the way back to September due to Covid-19, so that's a no go. But even if it does appear at the giant Taipei tech show, we probably won't see it on shelves until the end of 2020 – think October or November. 

At the AMD Financial Analyst Day back in March, Team Red said itself that it was targeting the end of 2020 for the launch of Big Navi. Kumar even reinforced this target release window, saying that AMD is "on track to launch our next-generation Zen 3 CPUs and RDNA 2 GPUs in late 2020". While that feels like it’s still a long way away, these AMD graphics cards seem like they will arrive before the PS5 and Xbox Series X.

Either way, we'll get an AMD RDNA 2 graphics card release date when AMD is ready. However, because Computex did get pushed back, we wouldn't be surprised if AMD came out of nowhere with a giant online-only event to show off the RDNA 2 architecture any day now. Stay tuned. 

AMD RDNA 2

AMD Big Navi price

AMD has traditionally enjoyed a reputation for providing more affordable products than its competition, but we're not so sure that's going to extend to Big Navi. 

And to back this up, we'd like to point to the AMD Radeon VII. With this graphics card, AMD genuinely provided performance that was pretty close to what the Nvidia GeForce RTX 2080 provided at the time, while sitting with a price tag of $679 (about £540, AU$970), which was very close to the RTX 2080's $699 (£649, AU$1,119) price tag at the time. 

Now, since launching its Navi lineup, AMD has put some price pressure on Nvidia's mid-range lineup, most notably baiting Nvidia into lowering its prices on its Super cards at the last minute. So, we might see something that challenges the RTX 2080 Ti for maybe $200 (£200, AU$300) less than that card's MSRP, but don't expect graphics card prices to drop to where they were before Nvidia Turing made everything more expensive. 

AMD RDNA 2

AMD Big Navi specs and features

We have heard so many different rumors about what the 'Big Navi' card will eventually look like, but we're only really diving into what's actually feasible here. 

What we know is that AMD is planning on dropping a 4K-ready graphics card that can handle ray tracing in 2020. That much comes directly from AMD CEO Lisa Su, who came out in an interview and said "I can say you’re going to see Big Navi in 2020." That's definitely a blunt way to tell us to expect the high-end GPU this year. 

Then, during AMD's Financial Analyst Day, AMD doubled down on RDNA 2, with a slide saying that the top-of-stack graphics cards would not only provide "Uncompromising 4K gaming," but that they will also feature hardware-based ray tracing and variable rate shading – two features already present in Nvidia Turing GPUs.

Unfortunately, we have to take this rumor with a grain of salt. Newer rumors suggest that only flagship variants of AMD’s RDNA 2 GPUs will feature ray tracing. It seems that the feature will now be reserved only for AMD’s high-end and enthusiast-grade Navi 2X GPUs. And, that’s apparently due to the lower-end and more mainstream options not having the capacity to support hardware-level ray tracing at an optimal frame rate.

Beyond that, we have seen rumors suggesting that the RDNA 2 flagship will feature 80 compute units. If each compute unit has the same number of Stream Processors (SPs) as the original RDNA graphics cards – 64 apiece – we could see 5,120 SPs on a single GPU, but it could be even more. 
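For the curious, the arithmetic behind that 5,120 figure is straightforward – here's a quick sketch using the RX 5700 XT's known configuration as the baseline:

```python
# Original RDNA baseline: the Radeon RX 5700 XT.
rx_5700_xt_cus = 40
rx_5700_xt_sps = 2560
sps_per_cu = rx_5700_xt_sps // rx_5700_xt_cus  # 64 SPs per compute unit

# Rumored Big Navi configuration.
big_navi_cus = 80
print(big_navi_cus * sps_per_cu)  # 5120 SPs, assuming the same CU layout
```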

Of course, that's a lot of silicon to cool, and AMD's current blower-style coolers probably wouldn't be sufficient. Even in the transition from Pascal to Turing, Nvidia abandoned blower-style coolers in favor of more efficient dual-fan designs, and word on the street is that AMD could be doing the same thing.

What is perhaps more important is that the rumors are saying that Big Navi could be up to 30% faster than the Nvidia GeForce RTX 2080 Ti, which would frankly be amazing if true – that card can already destroy pretty much every game at 4K, so a further 30% boost could probably mean that we'll see more accessible 4K gaming lower in the product stack. 

At the end of the day, while we do have all of this juicy speculation, we don't know what will ultimately end up being true and what won't be. And, even if AMD Big Navi ends up blowing Nvidia Turing out of the water, it's entirely possible that the RTX 3080 may be close behind. Either way, 2020 is going to be an exciting year for the best graphics cards. 


Gaming laptops will never be the same, and it’s all thanks to AMD

Even the best laptops are, traditionally speaking, a lot slower than a full desktop PC. The PC components that laptop manufacturers shove into a portable machine consume way less power, and are generally constrained by the thermal limits imposed by a tight space – especially with recent Ultrabooks and such getting thinner and thinner. However, with AMD Ryzen 4000, that has fundamentally changed. 

We've finally been able to get our hands on a Zen 2-powered laptop, in the shape of the Asus Zephyrus G14. That laptop is equipped with the Ryzen 9 4900HS, the 35W variant of the most powerful laptop chip in the lineup right now. And let us just say this: it's mighty impressive.

In fact, this processor blew our minds so much, that we did a full round of testing, putting it up against the most powerful consumer processors on the market right now from both Intel and AMD. It's definitely not faster than either the Intel Core i9-9900K or the Ryzen 9 3900X, but it comes close enough to definitely be classified as desktop-class performance. 



But first: battery life

However exciting it would be to just dump a bunch of performance graphs in your face (don't worry, that's coming in a bit), we want to note that there's a fundamental difference in what's valuable for a laptop and for a desktop. 

When you're picking up a desktop processor, power consumption does matter to a point. The more power you draw from the wall, the more you're typically going to spend on your power bill each month. The difference between a 65W Ryzen 7 3700X and a 95W Core i9-9900K each month will likely be trivial, but there's definitely a difference there. 
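To put a rough number on that, here's the back-of-the-envelope math – the four hours of daily load and the ~$0.13/kWh rate are just assumptions for illustration:

```python
# TDP gap between a Ryzen 7 3700X (65W) and a Core i9-9900K (95W).
delta_watts = 95 - 65

# Assume four hours of full load per day over a 30-day month,
# at a roughly US-average electricity rate of $0.13/kWh.
hours_per_month = 4 * 30
rate_per_kwh = 0.13

extra_kwh = delta_watts / 1000 * hours_per_month  # 3.6 kWh
print(f"~${extra_kwh * rate_per_kwh:.2f} extra per month")  # ~$0.47
```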

For a laptop, however, battery life comes into play in a huge way. One of the major appeals of a portable device is being able to carry it around wherever you go, and if your gaming laptop dies after an hour or two of just browsing the web or watching a movie on the train, what's the point of it even being a laptop?

For the longest time, that's kind of been the compromise you had to make for having a gaming laptop in the first place. Even the most power efficient gaming laptops out there only last 3-5 hours doing even the most lightweight tasks – but that's changed now. 

In our battery tests for the Zephyrus G14, the laptop has battery life that's right up there with the likes of the Dell XPS 13, a laptop that is designed first and foremost for working on the road. All in a gaming laptop that can play all the latest and best PC games.

Thanks to a controller that's built into the CPU die, AMD's firmware is able to dynamically adjust clock speeds depending on the actual workloads being imposed on the laptop. So, if you're just browsing the web or writing up a document in Microsoft Word, clock speeds come down, which also brings down power consumption. 

This is pretty much standard, but what sets AMD apart from either Intel's current Ice Lake chips or even AMD Ryzen 3000 mobile is how much faster these Zen 2 processors are at adjusting clock speeds – as the simple act of adjusting clock speed takes power, too. 

This was all done in the name of maximizing real-world battery life beyond what people even test for. AMD flew us out to their Austin, Texas campus to do a deep dive into what Ryzen 4000 would be capable of, and in an interview AMD director of product management Renato Fragale told us that rather than focusing on hard benchmarks, "we need to be looking at, or around, a cross-section of applications". 

We specifically asked a question for one of our own use cases, like traveling internationally to a big tech conference (even if they've been cancelled for a bit), and Fragale explained to us "if you know you're going to be unplugged for 14 hours, and you know you've gotta get some work done, you slide [the Windows Power Slider] towards battery life. Because realistically, if you're doing Word or PowerPoint, or whatever, you probably don't need a ton of performance to do that; so let's go save some battery."

Long story short, AMD has introduced laptop chips that can provide excellent performance for workloads like gaming – which we're about to dive into now – yet can also deliver battery life to get you through a long plane flight. It's finally the best of both worlds. 


AMD Ryzen 4000 crushing those benchmarks

So we teased the performance a bit earlier when we noted that the AMD Ryzen 9 4900HS is faster than the Intel Core i7-9700K in Cinebench R20. To be fair, this was a claim that AMD made way back when it revealed its lineup at CES 2020. We don't know about you, but we never trust manufacturer-delivered benchmarks – we like to test everything ourselves. 

And, well, that's exactly what we did. We tested the Ryzen 9 4900HS against the Intel Core i5-9600K, Core i7-9700K and Core i9-9900K. And, for parity's sake we also tested the AMD Ryzen 5 3600, Ryzen 7 3700X and Ryzen 9 3900X. The numbers are frankly astounding. 

Like we already mentioned in that earlier news story, the AMD Ryzen 9 4900HS scores a whopping 4,194 points in the Cinebench R20 multi-core test against the Intel Core i7-9700K's 3,726. But we had to go deeper for this one. 

In Geekbench 5, the 4900HS comes out ahead in multi-core with 7,820 points to the 9700K's 7,728, while coming close to the same single-core performance with a score of 1,170. Even in 3DMark, where Intel is historically very strong, the 4900HS beats the 9700K with a CPU score of 8,438 to Team Blue's 8,106.

What's even more impressive, however, is how close the AMD Ryzen 9 4900HS comes to the Intel Core i9-9900K – a chip that not only retails for $479 (£469, about AU$696) on its own, but also has a TDP of 95W, more than double that of the 4900HS. 

In Cinebench R20, the Core i9-9900K gets a multi-core score of 4,789 and a single-core score of 515. That's just around 14% and 7% faster, respectively, for a processor that consumes more than double the power and is being cooled by a 360mm AIO liquid cooler. 

Even in Geekbench, the 9900K is just around 14% faster than the AMD laptop chip, scoring 8,952 in the multi-core test. The higher numbers might at first look like a win for Intel, but we can assure you they're not.
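If you want to sanity-check those percentages yourself, the math is simple enough to script – a quick sketch using the scores quoted above:

```python
def pct_faster(a, b):
    """How much faster score a is than score b, as a percentage."""
    return (a / b - 1) * 100

# Cinebench R20 multi-core: Core i9-9900K vs Ryzen 9 4900HS.
print(f"{pct_faster(4789, 4194):.1f}%")  # ~14.2%

# Geekbench 5 multi-core: Core i9-9900K vs Ryzen 9 4900HS.
print(f"{pct_faster(8952, 7820):.1f}%")  # ~14.5%
```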

The 4900HS is even within striking distance of AMD's own desktop processors, namely the Ryzen 7 3700X. AMD's 8-core, 16-thread mainstream desktop processor gets a Cinebench R20 multi-core score of 4,802 and a Geekbench multi-core score of 9,037. 

The AMD Ryzen 9 4900HS certainly isn't faster than the Ryzen 9 and Core i9 desktop processors, but the fact that it is right up there with them should be applauded. 

This is excellent news for anyone who needs a powerful workstation that won't break their back while traveling. If you just need something that can handle video editing on the road, now's the time to pick up that new laptop. Best part? During our testing, the temperatures for the Ryzen 9 4900HS peaked at 91°C – short of the thermal limit that most MacBooks constantly bump up against. 

We are in a new age of high-end mobile computing, and it's all thanks to the amazing work AMD put in here. 

Razer Blade 15 Studio Edition

Looking to the future

However, it's not all sunshine and roses here. Right now, Intel has claim to all the shiny flagships on the market. The Razer Blade 15, the MSI GS66 Stealth and more are all still exclusively rocking Intel Comet Lake-H processors. 

To be clear, we haven't tested Comet Lake-H processors for ourselves yet, so we can't speak to their performance or the battery life offered. 

What we can say, however, is that Intel processors are the ones that are getting paired with an RTX 2080 Super to provide high-end gaming or video production laptops. 

Now, it's important to note that AMD isn't restricting manufacturers from pairing an AMD Ryzen 4000 processor with a higher-end GPU. In our Austin interview, AMD Senior Manager of Product Management Scott Stankard told us that "we didn't place a restriction. But in the context of: we are attacking the market, we are saying that we want to go over double the systems from 2019 to 2020."

In short, AMD wants to make sure it has the bulk of the market covered. And, it's true that when you're out there shopping for a laptop, a high-end device that can cost thousands of dollars is only going to appeal to a limited number of people. 

And again, as much as we'd like to see AMD processors in the halo laptops of the world, AMD isn't the only company that has to make that decision – laptop OEMs have to, well, put the processors into the halo laptops. 

Stankard tells us "in the notebook, it's a collaboration with the OEMs. So the OEM has to look at their portfolio, figure out what they want to go build and how they want to go attack it. So we don't have the full free rein to say, 'we will do this'." AMD can certainly influence laptop manufacturers, but the Asuses and Dells of the world have to actually make the decision to include the processors they want to include. 

Personally, we can't wait to see these processors paired with the best GPUs on the market. We absolutely love the idea of pairing a high-end AMD CPU with the top-end mobile GPU in a gaming laptop that can absolutely destroy games even at 4K. That day isn't here yet, but as more laptop manufacturers come around to just how good AMD Ryzen 4000 is, and the benefits it provides everyday users, we're sure that's going to change. 

Until then, we can just revel in the performance that these processors deliver. The best gaming laptops just got a whole lot better, and if things continue in this direction, we might even start recommending gaming laptops over an Ultrabook, thanks to the increased performance without having to compromise on battery life. It's just that good. 


Intel Core i9-10900K benchmarks have leaked, and it’s still slower than the Ryzen 9 3900X

For what feels like ages at this point, we've been waiting for Intel 10th-generation Comet Lake processors for desktop to make their appearance. And, while we have heard plenty of rumors about when we'll see them, we're starting to see info suggesting what they'll be capable of. 

The latest of these is a Geekbench 5 benchmark result spotted by renowned hardware leaker TUM_APISAK, and the results are pretty interesting. Notably, it lists the maximum frequency as 5.08GHz, which is lower than the 5.3GHz that previous leaks have suggested. This leads to a multi-core score of 11,296, which isn't quite as powerful as AMD's current-generation flagship. 

We actually just retested the AMD Ryzen 9 3900X the night before this leak appeared, where the 12-core processor managed a score of 12,060, which makes it still around 7% faster than the alleged Intel chip's result – keep in mind that the 3900X launched way back in July 2019, too. 

However, the AMD Ryzen 9 3900X does fall behind this leaked benchmark in single-core performance, scoring 1,268 points in last night's testing compared to the 1,408 in this leak. That's a pretty substantial 11% lead that Intel is potentially claiming here, which would maintain its position as the manufacturer behind the best processors for gaming. 

Obviously, we can't wait to get this little chunk of silicon in for our own in-house testing to see exactly how it stacks up against AMD, but we still have no idea when that will actually happen. Intel will launch its next-gen processors when it decides the time is right, and until then we're just going to have to wait and see. 

A temporary fix?

Intel's 10th-generation Comet Lake-S processors may narrow the massive gap that exists between AMD and Intel in the desktop world right now, but it may not last for long. Keep in mind that AMD CEO Lisa Su has said that Ryzen 4000 processors for desktop will be coming this year.

If the Intel Core i9-10900K comes up 7% short of the 3900X in multi-core and only beats it in single-core by around 11%, that doesn't bode well for Intel whenever Team Red manages to launch its next desktop platform. Word on the street, according to an AdoredTV leak, is that the Zen 3-based Ryzen 4000 lineup is going to see a 15% boost in IPC performance. If that's paired with higher clock speeds on AMD's next platform, Intel's single-core lead could vanish. 
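To see why that lead is fragile, remember that single-threaded performance scales roughly with IPC multiplied by clock speed, so the gains compound. A toy calculation – the 5% clock bump here is purely hypothetical, not a leaked figure:

```python
# Single-threaded performance ~ IPC x clock speed, so gains multiply.
def combined_uplift(ipc_gain, clock_gain):
    return ((1 + ipc_gain) * (1 + clock_gain) - 1) * 100

# Rumored 15% Zen 3 IPC uplift, with a hypothetical 5% clock increase.
print(f"{combined_uplift(0.15, 0.05):.1f}%")  # ~20.8% overall
```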

And now that we've seen AMD bring the Zen 2 improvements over to mobile, there's a lot of pressure on Intel to come up with something truly exciting. We said it in another piece touching on our brief testing of the AMD Ryzen 9 4900HS (more on that coming very soon), but we'd love to see Intel come up with its own Ryzen moment. 

Intel Comet Lake-H has just arrived and Comet Lake-S is likely right around the corner, so we're incredibly interested to see whether or not it can shake up AMD's stranglehold on the processor world. 

And if it does, you can bet we'll be diving into that when the time comes. 


Wow that PS5 controller is ugly, isn’t it?

The PS5 is still quite a ways out, and the marketing has been kind of a roller coaster. Back at CES 2020, Sony only offered a logo as its way of teasing a reveal, a month after Microsoft showed off the Xbox Series X design. 

Then, after GDC 2020 got canceled, both console makers revealed deep dives of the specs online in wildly different ways: Microsoft through a detailed blog post over at Digital Foundry, while Sony decided to stream lead system architect Mark Cerny talking about SSDs for about an hour.

 

But today, Sony revealed the DualSense PS5 controller and, well, it's basically the ugliest first-party controller ever designed. I know that right off the bat that might seem like a needlessly hot take, and ultimately if you love it, you're totally valid – but I mean look at it. 

A photo of the new PS5 DualSense Controller, for reference.

Why is the touch pad approximately the size of my forehead? Why is the PS button not a circle that's easy to just reach over and find? Why are all the buttons white? I have so many questions, and so many things that I absolutely abhor about this controller that I'm going to have to dive deep on this one. 

Aesthetics matter

In the blog post where Sony took the veil off of this controller, it said that "traditionally our base controllers have been a single color. As you can see, we went a different direction this time around." A 'different direction' is a bit of an understatement, but it actually kind of homes in on the major problem with how the controller appears.


In stark contrast to the DualShock 4, which was this one-tone gamepad with a familiar shape, the DualSense controller is this rounded affair that's white at the top and black at the bottom, with a light bar that surrounds the touch pad, making it look way bigger than it probably is. Why?

Across Twitter, right after this design was revealed, I started seeing people who A. hated the design, B. loved it for some reason, or C. were sharing an all-black Photoshop of a controller that looks way better.

Some other people started coming up with some other edits for the controller, and I'm totally here for those, too. Hey, we're in the middle of social distancing here, so I love the fact that people's creative blood is flowing. 

Another problem I have with the DualSense controller's design is the lack of color-coding on the face buttons. I'm pretty familiar with where all the buttons are on a PlayStation controller, but having green=triangle and blue=x makes it really easy for folks that are new to gaming to jump in and have a good time – especially for anyone that has issues with their sight. 

And speaking of accessibility, let's talk about the new PS button. I have pretty good vision, and at least judging by the images Sony shared, that new button kind of blends into the background of the controller. I obviously haven't had a chance to play with the PS5 or its controller yet, but I would go ahead and assume that the PS button will have similar functionality to what it has now, or to what the guide button does on the Xbox One controller – making it extremely important for people to be able to find easily.

A photo of a DualShock 4 controller.

See how you can easily see what all the buttons are? Color coding works.

There are some good things

I want to be the first out of the gate to applaud Sony on the inclusion of USB-C. Right now, pretty much every device I have in my apartment uses that connection standard, which means it will be super easy to use this controller with whatever charger I have lying around, and more importantly, I can more easily plug it in to my PC – which I fully intend to do. 

I also really like the idea of having a microphone built into the controller, and it absolutely baffles me that this hasn't happened before. If you look at the DualShock 4 controller, it has a 3.5mm audio jack on the bottom, which is awesome for plugging in a pair of headphones, but if you don't have the right kind of headset, you can be stuck without a microphone when you need it. 

With the built-in microphone, the DualSense controller will let you use basically whatever headphones you have lying around, while still being able to chat with your friends online. There's even a handy little mute button right under the PS button.

The DualSense also looks to be shaped pretty similarly to the Nintendo Switch Pro Controller, and that is pretty much my favorite controller to date – one of the most comfortable if you were to ask me. 

There seem to be a lot of improvements under the surface with the PS5's DualSense controller, and I definitely don't want to take away from that. If we are just talking about functionality, then this might be Sony's best controller to date. But, there's still quite a bit wrong here. 

Ultimately, it's probably fine

There are plenty of people who love this new PS5 controller look, and that's totally valid. Ultimately, as long as the controller does what it needs to do while being comfortable enough to play through PS5 games, it probably doesn't matter too much what the original controller looks like. 

Sony will likely be putting out dozens of different DualSense controllers just like it did with the DualShock 4, so while the design of this first one makes me want to vomit a little bit, there will probably be an all-pink controller at launch that I'll be jumping on pretty much immediately. 

But, I will say that I'm ready for Sony to lift the veil on what the PS5 is actually going to look like, because while I think it's ridiculous that the Xbox Series X looks like a mini-fridge, at least I know what it will look like.

Either way, it's only a matter of time before Sony unveils the full PS5 design in all its glory. And, if the DualSense controller is any indication, it's probably going to be white and black – just a guess.  


Resident Evil 3 PC performance: the Nvidia GeForce GTX 1060 is undead

Remakes of classic games are extremely popular right now, and titles like the Resident Evil 3 remake are an excellent example. The original version of the game never made its way to PC, so not only are PC gamers able to play this horror classic for the first time, but it looks like a brand new piece of software entirely. In fact, it's probably one of the best-looking games of the year. 

However, with all the visual eye candy here, Resident Evil 3 is pretty taxing on your gaming PC – particularly on VRAM. Before you go on Steam and slap down some cash to pick up Resident Evil 3, we thought it'd be good to explore just how it will perform on some of the most popular graphics cards. 

Luckily, we here at TechRadar have access to all the latest and greatest PC hardware, and we went ahead and tested a two-minute slice of the game, embedded below, on ten graphics cards, from the Nvidia GeForce GTX 1060 6GB to the Nvidia GeForce RTX 2080 Ti.

Pictured: our GPU running Resident Evil 3 at 4K

Resident Evil 3 PC performance: putting it to the test

The Resident Evil 3 system requirements, listed below, do a pretty decent job of telling you what kind of hardware you'll need to play this game at 1080p both at 30 fps (minimum) and 60 fps (recommended) levels. For the most part, our testing reveals that these requirements are generally accurate, but it goes a little deeper. 

For instance, Resident Evil 3 on PC is an extremely VRAM-dependent game, to the point where running the game at "Graphics Priority" – this game's way of saying "High" – recommends 5.54GB of VRAM at 1080p, which a lot of older graphics cards like the venerable Nvidia GeForce GTX 970 just won't have. 

We did find that you can go above the amount of VRAM you have available, provided you have enough system memory to handle some of the overflow. Just keep in mind that if you're playing this way, you could experience some occasional stuttering. For instance, the Nvidia GeForce GTX 1060 performed extremely well at 1080p with Graphics Priority settings, but when we turned it to Max, our average frame rate dropped just a little bit, yet our 0.1% low dropped all the way down to 13 fps from 37.
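If you're wondering what a '0.1% low' actually is: you record every frame time in a run, take the slowest 0.1% of frames, and express their average as a frame rate. A rough sketch of the calculation, with invented sample data:

```python
def fps_stats(frame_times_ms):
    """Average fps and 0.1% low from a list of frame times in ms."""
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    slowest = sorted(frame_times_ms, reverse=True)  # worst frames first
    count = max(1, len(slowest) // 1000)            # worst 0.1% of frames
    low_01 = 1000 / (sum(slowest[:count]) / count)
    return avg_fps, low_01

# Invented example: a mostly smooth run with one nasty stutter.
times = [14.0] * 2000 + [75.0, 80.0]
avg, low = fps_stats(times)
print(f"avg {avg:.0f} fps, 0.1% low {low:.0f} fps")  # avg 71, 0.1% low 13
```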

For the most part these drastic drops in frame rate are pretty rare, and only really happen when you're transitioning from area to area or getting out of a cutscene. The game is still completely playable maxed out, and for the most part is relatively smooth. 

If you want to max it out and get completely smooth gameplay, however, you're going to want to make sure you have a graphics card with at least 8GB of VRAM, which means an Nvidia GeForce RTX 2060 Super or an AMD Radeon RX 5700.

Your graphics card is only one part of the setup, however. We tested the game on a system equipped with an Intel Core i9-9900K and 32GB of HyperX Fury RGB RAM at 3,000MHz. (We mostly stuck to 3,000MHz because we forgot what it was actually rated for and didn't have time for a ton of trial and error). Watching system usage, all 16 threads of the processor were being hit pretty consistently by the game, with an even spread across all of them. 

Conventional wisdom points to games being single-threaded applications, but that's a reality we're getting further from with every major game release. The system requirements technically recommend an Intel Core i7-3770, but we'd recommend a more heavily threaded processor for this game. 

As for memory usage, the game is actually pretty forgiving, assuming you're not going over your VRAM budget. When we were within spec for the graphics card we were using, the Resident Evil 3 PC port only really used around 6-7GB of system memory. However, that rapidly got pushed upwards if we tried to max out the game on pretty much anything but the RTX 2080 Ti. System memory usage could then spike all the way up to 11 or 12GB, which means if you have the recommended 8GB of system memory, you could run into some serious Resident Evil 3 performance problems. 

Here are the Resident Evil 3 minimum system requirements: 

  • CPU: Intel Core i5-4460 or AMD FX-6300
  • RAM: 8GB
  • Graphics card: Nvidia GeForce GTX 760 or AMD Radeon R7 260x
  • DirectX 11
  • Storage: 45GB

And the Resident Evil 3 recommended system requirements: 

  • CPU: Intel Core i7-3770 or AMD FX-9590
  • RAM: 8GB
  • Graphics card: Nvidia GeForce GTX 1060 or AMD Radeon RX 480
  • DirectX 12
  • Storage: 45GB

Resident Evil 3 PC performance: by the numbers

Perhaps unsurprisingly, the only cards that can deliver a solid 60 fps at 4K are the Nvidia GeForce RTX 2080 Super and the RTX 2080 Ti. Even at the more balanced Graphics Priority setting, which is designed for GPUs with 6-8GB of VRAM, the RTX 2070 Super is just shy of hitting that vaunted 60 fps target with 59. Still, though, that's a very playable frame rate. 

However, for as much attention as 4K gets in the mainstream, not many people actually play at that resolution. 1080p is still the most common display resolution, and if you're still rocking an FHD display, you shouldn't have too much of a problem running this game at 60 fps. 

While the RTX 2080 Ti can easily exceed 200 fps with all the bells and whistles turned all the way up, things are more interesting at the lower end of the spectrum. 

Graphics cards like the AMD Radeon RX 5500 XT and Nvidia GeForce GTX 1660 Super are far more affordable, and can still max this game out with good frame rates. With everything turned up all the way, even the aging GeForce GTX 1060 6GB manages a respectable 70 fps. 

Again, as we mentioned earlier, maxing the game out with only 6GB of video memory can result in some jagged bits of gameplay, so we'd recommend sticking with the Graphics Priority preset. The difference in visual quality isn't that apparent, and average frame rate gets boosted up to 74 fps, while avoiding any massive spikes of latency. 

Finally, 1440p is becoming more and more popular every day, and it's an entirely attainable resolution for modern mid-range GPUs. With the Nvidia GeForce RTX 2060, we got around 83 fps with the Graphics Priority settings, which, again, is recommended for GPUs with only 6GB of VRAM. If you do have an 8GB card, however, the Nvidia GeForce RTX 2060 Super was able to get a solid 86 fps average, while the Radeon RX 5700 was close behind at 84 fps. 

At the end of the day, most graphics cards should be able to handle this game pretty easily, unless you're trying to run it at 4K. Performance drops in half at this high resolution, so we'd only recommend it if you have one of the absolute best GPUs on the market today. 
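The reason 4K is so punishing is simple pixel math – each jump in resolution multiplies the number of pixels the GPU has to shade:

```python
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f}M pixels ({w * h / base:.2f}x 1080p)")
# 4K pushes 4x the pixels of 1080p, and 2.25x the pixels of 1440p.
```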

However, because this game includes AMD's FidelityFX CAS (Contrast-Adaptive Sharpening), if you want to run the game on your 4K monitor, you should be able to upscale from 1440p without it looking awful. In fact, in our testing, even our Nvidia graphics cards provided excellent video quality on our 4K screen. That way, you kind of get the best of both worlds. 

Upgrade your rig for Resident Evil 3 PC performance

While Resident Evil 3 isn't the most demanding game on the market right now – looking at you, Control – it is still going to be hard for some older GPUs and CPUs to handle. 

If you're still rocking a graphics card with 4GB or less of VRAM, you're probably going to be in for some performance issues. Likewise, if you have an older Core i5 processor with just 4 cores and 4 threads, you'll similarly have some problems. And, if you haven't upgraded to 16GB of RAM, we'd advise you to do so while RAM is still affordable.

We're about to see new consoles and smartphones launch at the tail end of the year, and that will likely see memory and SSD prices increase. 

But, we get it. PC components are kind of confusing, especially if you're a newcomer to the space. So, we went ahead and put together a couple of build configurations for 1080p, 1440p and 4K that can run the game (and pretty much all the best PC games) at the Graphics Priority preset.  

1080p60 build:

  • CPU: AMD Ryzen 5 3600
  • RAM: 16GB HyperX Fury RGB
  • Motherboard: MSI B450 Gaming Pro Carbon AC
  • Graphics card: AMD Radeon RX 5500 XT or Nvidia GeForce GTX 1650 Super
  • SSD: Samsung 860 Evo 500GB 
  • Power supply: EVGA SuperNOVA 650 G1

1440p60 build:

  • CPU: AMD Ryzen 5 3600
  • RAM: 16GB HyperX Fury RGB
  • Motherboard: MSI B450 Gaming Pro Carbon AC
  • Graphics card: AMD Radeon RX 5700 or Nvidia GeForce RTX 2060 Super
  • SSD: Samsung 860 Evo 500GB
  • Power supply: EVGA SuperNOVA 650 G1

4K60 build:

  • CPU: AMD Ryzen 7 3700X
  • RAM: 16GB HyperX Fury RGB
  • Motherboard: MSI B450 Gaming Pro Carbon AC
  • Graphics card: AMD Radeon VII or Nvidia GeForce RTX 2080 Super
  • SSD: Samsung 860 Evo 500GB
  • Power supply: EVGA SuperNOVA 850 G1

Where to buy a printer: these retailers still have stock

If you're anything like us, you rarely need to think about where to buy a printer because your office has one. However, now that so many people are working remotely, potentially without the critical IT infrastructure they're used to, you might be in the market for a printer of your very own. We're here to help. 

But because so many people are currently on the lookout for supplies for working from home, supply is a little bit short at the moment. Luckily, we here at TechRadar are pros at scavenging the web to find the best deals around, and we went ahead and did just that. 

So, whether you're looking for something that can keep up with your office printer or you just want something that can spit out some shipping labels every now and then, we've got you covered. 

Where to buy a printer in the US

Learn more about what printer is best for you with our list of the best home printer 2020: the top printers for home use.

You can also shop more of the best cheap printer deals: our top budget picks.


Where to buy a router: work remotely without interruption

A lot of folks are working from home during the coronavirus crisis, which means that their wireless networks might be under a bit more stress than is typical – so you might need to pick up a more robust wireless router.

You see, a lot of ISPs may include a modem with a built-in router, but wireless performance generally suffers in these kinds of devices. If you're after a super reliable wireless network when working from home, picking up a dedicated wireless router can save you (and anyone else doing remote work in your home) a lot of frustration. 

However, because there are so many people scouring the web for work from home supplies, a lot of the best routers are selling out extremely quickly, making it hard to find one that will arrive in a timely fashion. 

Luckily, we here at TechRadar are online shopping pros, and we went ahead and found all the routers that are still available so you can spend less time shopping and more time working. 

The best routers still available right now


Where to buy a webcam: these retailers still have stock

When you have to work from home, the ability to hold an effective video conference is extremely important. And, while a lot of folks may have laptops with a built-in camera, the best webcam can boost video quality so much that it's essential for anyone making regular video calls. 

However, now that the Covid-19 crisis has driven so many people to work from home, supply is running out. Understandably, a lot of people are scrambling to get their hands on a solid webcam so they can keep doing their jobs, even if they have to be remote. 

If you're scrambling to get your hands on a solid webcam, however, don't worry – they're definitely still out there, even if they're not quite as easy to find. 

Luckily, we here at TechRadar are online shopping pros, and we've found the retailers that still have solid webcams on sale and gathered them all up down below. Just keep in mind that because webcams are moving so fast, there are certain models that may sell out super quickly.  

In the UK? We’ve just added some webcams for the UK market as well. Stock is moving really quickly right now, so make sure you jump on what's available now. Also check out our best webcams piece to see the current range and find out which are still in stock in your area.

Where to buy a webcam in the US

We scoured the web to find the webcams that still have stock available to buy right now. Because of the nature of online shopping, however, we'll likely see some of these sell out in the meantime. Some of these webcams are on backorder right now, but if you do buy one that's affected, you should still be able to get it in about a week. 

Where to buy a webcam in the UK

Stock is currently an issue for webcams in the UK, with many of the big retailers such as Amazon, Currys and Argos selling out quickly while people adjust to their new working from home situation. We've still found some cheap webcams that are available, so don't worry if you're still looking – we can help your search.

See more webcam offers with our roundup of the best cheap webcams and see the best webcams 2020: top picks for working from home.

You can also shop more deals with our roundup of Father's Day sales and the best 4th of July sales 2020 that are happening now.


Doom Eternal PC performance: 4K60 with an Nvidia GeForce RTX 2060 Super

Doom Eternal on PC is finally out, after months of not having a huge tent-pole game release, and it really couldn't have come at a better time. Because we're all about to be spending a lot of time indoors, it's good to finally have such a great title to sink our teeth into for a while.

But if you're on PC, it can be nerve-wracking when you're picking up the first AAA game of the year. How will it run? Will my graphics card be enough? Will it make my processor scream out in heat-induced pain?

Well, we have good news for you: Doom Eternal PC performance is pretty freaking excellent, delivering awesome frame rates across the entire gamut of current-generation cards. Even the Nvidia GeForce GTX 1060 can absolutely eat up this game at 1080p, with playable 1440p frame rates. 

Slaying those frame rates

Putting Doom Eternal PC performance to the test

Because it's hard to gauge just how well you can expect a game to perform based on the PC system requirements that developers will release ahead of a game's launch, we here at TechRadar decided to actually put it to the test. We have access to all of the best consumer-grade graphics cards on the market, and even managed to get most of them home with us.

Because of current events, there are a few GPUs that we would have liked to test but that aren't available to us – namely the Nvidia GeForce GTX 1660 and GTX 1660 Super, and the AMD Radeon RX 590. Maybe next time. 

As for the section of the game we tested, we found a checkpoint with plenty of action about 15 minutes into the first mission, right after you get the cannon. We set a timer for two minutes for each run. Because Doom Eternal doesn't have a canned benchmark on rails that we can run over and over again, we had to play through each run ourselves. 

While this does mean you can be 100% certain that these numbers represent actual gameplay, keep in mind that this is a chaotic game, and no two benchmark runs were identical.


We tested at 1080p, 1440p and 4K on both the High and Ultra Nightmare presets, but there are some caveats there. Doom Eternal won't even let you set the game to Ultra Nightmare if you have less than 8GB of VRAM. While we could have lowered the textures down to Ultra to get the game to run, that would add too many caveats to the results, so we just, well, didn't do it. 

And, finally, for the AMD Radeon RX 5500 XT and the Nvidia GeForce GTX 1060, performance was really choppy at 4K High, so we didn't actually run the benchmark. We were getting around a 30 fps average, but the game is so choppy that we simply cannot recommend that you play at this resolution with either of these cards – or anything with a lower spec.

Now, let's get into the results.

Let's talk numbers

Just a couple weeks prior to the launch of Doom Eternal, some system requirements leaked out for a moment that led folks to believe that lower-end hardware would be a no-go. Well, luckily, Doom Eternal runs like a dream. 

The game is incredibly fast-paced, as is the nature of Doom, so being able to get high frame rates is crucially important. Doom Eternal simply feels like garbage at less than 60 fps, so you're going to want to hit that number here. 

Luckily, every single graphics card we tested was able to hit well above that number at 1080p High settings, with even the aged Nvidia GeForce GTX 1060 scoring a healthy 86 fps. 

And, if you want to go up to 4K, you'll be happy to know that many graphics cards are also capable of this, with everything above and including an Nvidia GeForce RTX 2060 Super or an AMD Radeon RX 5700 XT able to stay above 60 fps at 4K High settings. 

The RTX 2060 and the AMD Radeon RX 5600 XT come close, but ultimately fail to break through that 60 fps barrier. Some more tweaking of the settings will likely get you there, though. However, with one of these cards, we'd recommend sticking to 1440p or 1080p, where they're able to deliver some high frame rate joy. 

If you're anything like the editor writing this, and you like to crank all of the graphics settings up to Ultra Nightmare at 4K while still getting a high frame rate, you should be happy to know that you don't need an Nvidia GeForce RTX 2080 Ti to get this done. While that GPU is obviously able to handle this, delivering a juicy 104 fps maxed out at 4K, you can actually just use an Nvidia GeForce RTX 2070 Super to get 65 fps with everything cranked up to 11. 

When we saw the CPU usage

Ok, what about CPU and RAM usage?

Doom Eternal's PC port runs on the Vulkan API, which we absolutely love. Our test system is equipped with an AMD Ryzen 9 3900X, and we spotted several occasions where the game was actually able to utilize all 12 cores and 24 threads. This is yet another game that is going to help end the age of single-threaded games.

The game's system requirements recommend an AMD Ryzen 7 1800X or an Intel Core i7-6700K for Doom Eternal, and yeah, that's basically what we'd recommend too. If you're able to get your hands on at least an 8-core CPU, that's what we'd advise going with. 

Though, we should be 100% clear here: we didn't test any other CPUs at this time. We can't tell you whether or not the game will chug on a quad-core processor, but we can pretty safely assume that it will. 

As for system memory, throughout all of our testing, with only Doom Eternal and MSI Afterburner open (to record frame rates), the system was using 9GB of RAM. Windows 10 will generally reserve more memory than it actually needs, so while you can probably get away with 8GB of RAM, we recommend 16GB. 

So, basically, Doom Eternal is one hell of a PC port, delivering amazing performance across the board. It is a CPU intensive game, however, so if you're rocking an old processor, you might start to run into some issues. But, if you have a modern processor and pretty much any graphics card from this generation, whether it's from AMD or Nvidia, you're in for some high frame rate fun, and that's exactly what we like to see from the best PC games. 

And, if you're looking for an upgrade to get ready for Doom Eternal, we went ahead and included our exclusive price comparison tool for our recommended Doom Eternal components. 


PS5 specs: why Sony faces an uphill battle

When it comes to game consoles like the PS5, one thing that truly sets them apart from, say, a gaming PC, is the fact that they're defined by games, rather than hardware. Sure, powerful hardware is necessary to power the best games, but that's not everything.

But how far does that go? 

The answer to that question likely varies from person to person, and from developer to developer, but power definitely plays an important role. And the PS5, at least on paper, is much less powerful than the Xbox Series X.

Again, a lot of people are going to look at the distinct lack of exclusives on Microsoft's platform, as its games all also come out on PC, and make the decision to flock to Sony's platform no matter what. But for anyone that's looking to choose the most powerful console, things just got a bit complicated, and we're here to help make sense of the mass of numbers that Sony just spat out. 

To turbo or not to turbo?

While Sony lead system architect Mark Cerny spent a long time talking about SSDs and the PS5 vs PS4 loading times – and that's something we'll definitely get into in a bit – something else takes priority here, and it's boost clocks. 

We know, we know – that doesn't sound totally exciting, and we get that. But, the difference in how the PS5 is going to handle boost clocks vs the Xbox Series X is one of the biggest distinctions between two platforms that are using pretty similar silicon.

The way the Xbox Series X operates, which we've gone into more detail with over here, is that developers can choose to utilize the CPU either in single-threaded mode with 8 cores and 8 threads, or in multi-threaded mode with 16 threads. The clock speed differs between these two modes: 3.8GHz for the former and 3.6GHz for the latter. 

The PS5 is doing things differently. The silicon in the PS5 will instead act more like a standard desktop processor, boosting up in clock speed whenever the temperature isn't too high and the current workload demands it. There's a catch, though – the max boost clock for the AMD Zen 2-based CPU is 3.5GHz, lower than the Xbox Series X in its multi-threaded mode.

The GPU operates in much the same way, boosting up to 2.23GHz when it can. Now, Cerny claims that the PS5 will sit at this clock speed for long periods of time – and it's not like we'd be able to actually measure that even when the system comes out – but that's incredibly optimistic. Typically, games hammer a graphics card, causing temps to go high and stay high. 

Sony still hasn't bothered revealing the system design yet (seriously), so we don't know what kind of cooling solution will be available, but things might get a little slow and heated over here. Especially as the console's lifecycle goes on and folks' systems inevitably get filled with dust, those turbo speeds are going to go down. And, if Sony is going to rely on that high clock speed to hit its performance targets, people could start to see performance drops the longer they have their console. 

Getting graphic

Both consoles will be using GPUs based on AMD's RDNA 2 architecture, but they're definitely not the same. The PS5's GPU will be rocking 36 compute units with a max boost clock of 2.23GHz. The Xbox Series X, on the other hand, will have 52 compute units with a constant clock speed of 1.8GHz. 

Now, it would probably help to clarify what a compute unit actually is. Basically, compute units are similar to CPU cores, in that a GPU has a bunch of them and they can act independently to complete workloads. Within each compute unit, however, are a lot of different elements, the most important of which are the Stream Processors – tiny processors in their own right. 

The GPU in the Xbox Series X has 3,328 Stream Processors across its 52 compute units, and because both GPUs are based on the same RDNA 2 architecture, it's safe to assume each compute unit contains the same number of Stream Processors – 64. That works out to 2,304 Stream Processors in the PS5's GPU, a pretty striking difference. Mark Cerny believes that clock speed is more important, but in our years of testing graphics cards, we've never seen a roughly 24% increase in clock speed make up for 31% fewer Stream Processors. 
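If you want to check that math yourself, here's the back-of-the-envelope version using the spec-sheet figures above – note that the 64-Stream-Processors-per-compute-unit figure is our assumption, carried over from existing RDNA parts:

```python
# Rough, spec-sheet-only comparison, not a benchmark.
sp_per_cu = 3328 // 52       # 64 Stream Processors per compute unit (assumed)
ps5_sps = 36 * sp_per_cu     # 2,304 Stream Processors in the PS5's GPU

print(2.23 / 1.8 - 1)        # ~0.24: the PS5's clock speed advantage
print(1 - ps5_sps / 3328)    # ~0.31: the PS5's Stream Processor deficit
```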

We obviously don't know how the GPU will perform in games right now, but this difference in hardware could mean the difference between the PS5 being capable of 4K60 or 4K30. Luckily, the PS5 will still be capable of ray tracing like its Microsoft-branded competitor, but if AMD RDNA 2 is built anything like Nvidia Turing – which, again, is a leap – this could mean there's less silicon dedicated to ray tracing, too. 

But who knows, maybe everything we've learned about computing is a lie. 

OK, about those SSDs

A large part of the PS5 specs presentation was the SSD. Cerny made it pretty clear that including lightning-fast storage was a huge deal for Sony with this console. But, while the 5.5GB/s of bandwidth sounds great for loading times, there's something else that SSDs could contribute. 

You see, when you're playing a big open-world game like Red Dead Redemption 2 or Skyrim, certain limitations come into effect thanks to the sheer amount of data required. 

Especially if you don't want loading screens all the time, you need to load a lot of data into your system memory, of which there's only a limited amount. However, because SSDs are monumentally faster than a hard disk drive, that limitation could theoretically vanish. 

Take Skyrim, for example. Let's say you're wandering around, merrily exploring the world in front of you, and a tempting little dungeon or fort appears that you want to plunder. The way things work now, that surrounding world data is about all the game can really load into memory without turning into a slideshow – but with an SSD this fast, the game could pull in that data as it's needed. 

This is all just theory, but we could end up seeing an open-world game that only needs a loading screen when you first boot it up. The most exciting part is that both consoles will be capable of this, and gaming PCs with speedy drives will be able to pull it off too. The next generation of games will look better, sure, but we're also going to start seeing open-world games that make The Witcher 3 look like a child's sandbox. 

But we should make one thing abundantly clear, too – loading times will significantly diminish. PCIe 4.0 SSDs like the one found in the PS5 are up to 100x faster than a spinning drive, and while it's unlikely that we'll see loading times get cut by that much, it does mean that we'll be waiting for games to load for seconds rather than minutes. 
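To put rough numbers on that claim, here's a quick sketch – the payload size and hard drive speed below are illustrative assumptions on our part, and real-world loads are rarely perfectly sequential:

```python
# Illustrative load-time math: reading one big chunk of world data.
game_data_gb = 40      # hypothetical amount of data a game has to load
hdd_mb_s = 150         # optimistic sequential read speed for a hard drive
ssd_gb_s = 5.5         # the PS5's quoted raw SSD bandwidth

print(game_data_gb * 1000 / hdd_mb_s)  # ~267 seconds on a spinning drive
print(game_data_gb / ssd_gb_s)         # ~7 seconds on the PS5's SSD
```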

PS5 vs Xbox Series X

Right now, on paper at least, the PS5 looks pretty inferior to the Xbox Series X in terms of pure hardware on offer. 

However, you need to remember that we still don't know everything about what these consoles will be able to do, how much they're going to cost and, most importantly, the games that will be available for them. 

It's entirely possible that Microsoft will be charging much more for the Xbox Series X than Sony is for the PS5 – there is that Project Lockhart system lying in wait, after all. And, taking into consideration the magic Sony has been able to pull off with first-party games like Horizon Zero Dawn and Death Stranding, the lack of power might not even matter. 

Both of the next-generation consoles are coming out at the end of 2020, so we're sure we'll find out more in the very near future, but either way, things are starting to heat up in the next console wars, and we're totally here for it. 


How you can use your PC to help scientists develop a treatment for the coronavirus

Right now the coronavirus is sweeping the entire world, and it's truly a big deal. One of the things keeping it at the center of everyone's attention is the lack of specific treatments to help folks – particularly the elderly – get through it. 

In order for these treatments to be developed, scientists from across the world have a lot of work to do – but you actually might be able to help.

The folks at Folding@Home have published a blog post asking people to run the F@H client to help scientists develop a treatment for the coronavirus. We encourage you to read the blog post, as it goes into far more detail than simple tech journalists like us feel comfortable doing, but we can help you get set up and make sure you're offering the most computing horsepower you can. 

We've been running F@H on our PCs for about a week now, and we encourage you to do the same. If you get it set up right, you can even help out while you're working on your computer. 

MacBook Pro (16-inch, 2019)

Take stock of your computer's capabilities

This is good advice pretty much any time, but if you're going to be seriously taxing your computer with something like Folding@Home, you need to be aware of what your computer can do. 

In particular, you should get an idea of what kind of CPU you're running. The easiest way to do this on Windows 10 is to hit Ctrl + Shift + Esc, which will bring up Task Manager. Once you're in there, click 'More details' at the bottom of the window if you see it, then head over to the Performance tab. Here you'll be able to easily get a pretty good idea of the hardware in your PC. 

If you're on a Mac, you can get an idea of what you're working with by clicking the Apple logo in the top left corner of your display and selecting 'About This Mac'. Unfortunately, Apple isn't exactly forthcoming with the exact specs if you're running macOS High Sierra or earlier. But if you've got a MacBook Pro from the last few years, you can assume its Core i7 is more than capable. 

Unfortunately, you'll have to look it up on your own to get the exact core count of your laptop – just google "MacBook Pro (15-inch, 2018) Core i7 specs" or whatever model you have to get more detailed spec information. 

Generally, if you're going to be running this program while working, you're going to want at least 6 cores and 12 threads, or 'logical processors' – which will naturally rule out a lot of laptops (don't worry, you can still help). If you'd rather check from a terminal, the quick snippet below does the same job. 
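Here's a minimal cross-platform sketch, assuming you have Python installed – psutil is a third-party package, so treat that half as optional:

```python
import os

# os.cpu_count() reports logical processors, i.e. threads, not physical cores.
print(f"Logical processors: {os.cpu_count()}")

# Physical core count needs the third-party psutil package (pip install psutil).
try:
    import psutil
    print(f"Physical cores: {psutil.cpu_count(logical=False)}")
except ImportError:
    print("Install psutil to see the physical core count.")
```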

You will get the option when you're setting up Folding@Home to choose how much of your hardware you want dedicated to fighting the disease, from 'Light' to 'Full'. If you are running on a laptop, or any other computer with a weaker CPU, we would advise against using the 'Full' preset when you're working, but if you want to offer constant help, the 'Light' preset should be relatively easy to run in the background. 


How to run Folding@Home

Once you're comfortable running Folding@Home, getting it going is actually fairly easy. Go to the Folding@Home page and select 'Start folding'. From there you can download it for Windows 10 right off the bat. 

If you're not running Windows 10, there's a link for alternative downloads – click that, and you can grab a client for macOS or several flavors of Linux. 

Go ahead and download the client appropriate for your system, open the downloaded file and follow the instructions on the screen. Once it's installed, it should run automatically (unless you specifically told it not to.) 

Once it's running – assuming you're on Windows 10 – you should be able to see the Folding@Home icon in your system tray, which looks like a protein made up of a bunch of different colors. You can right-click the icon to access various settings for the software, and even pause it if you're working on something particularly demanding. 

There are a lot of granular controls in the app, which you can use if you feel comfortable doing so. But, for the vast majority of people, you can essentially use the default settings, and everything will be smooth sailing. 

There are only two things we'd advise messing with at first. Right-click the icon and select 'Web Control'. From there, change the power slider to 'Medium' or 'Light' (unless you have sufficient cooling, which we'll go into further on). Also, if you have heavy-duty computing workloads of your own to run, go to the column called simply 'When' and set it to 'Only when idle'. Otherwise, Folding@Home will run in the background at all times, which may be problematic for less powerful systems. 

Keeping things cool

Because of the intensive workloads your CPU is going to be churning through, your device will start generating a ton of heat – which is totally fine. Luckily, most processors these days have plenty of fail-safes in place to slow the CPU down when things get a little too warm, so you're not really in danger of damaging your device. 

There is definitely an argument to be made that running your PC at high temperatures for extended periods of time can diminish the lifespan of your CPU, and that's something to keep in mind. This is why we wouldn't advise running Folding@Home on a laptop with the 'Full' performance preset at all times: you'll be up against that thermal limit for long stretches, and those temperatures can adversely affect other components in the laptop. 

However, if you have one of the best gaming PCs, equipped with a solid CPU cooler and decent airflow, you really don't have anything to worry about. If you can keep your CPU below 80 degrees Celsius at full load, you should feel totally comfortable running Folding@Home for extended periods of time. 

Still, at the end of the day, you should only do what you're comfortable doing with your computer. Unless you know you have that thermal headroom, it's probably best to avoid the 'Full' preset, just to preserve the longevity of your PC – but, again, that's entirely up to you. 


Idle or not so idle?

Here at TechRadar, we have access to some blindingly powerful computer hardware, so we've been able to keep our PC running Folding@Home pretty much non-stop. With an AMD Ryzen 9 3900X, we can typically finish a work unit every 15 minutes or so with the performance setting on 'Full', without it really affecting our work. 

Now, obviously, not everyone will have access to that level of hardware, but many desktop processors will be able to contribute a lot to the cause. However, no matter what level of performance you have, you'll have to decide whether or not to have it running while you're actually working. 

Depending on what kind of work you have to do, you may be able to get away with running it while working; if not, you can set it to run only when your computer is idle. To do this, right-click the Folding@Home icon in your system tray and select 'Web Control', which will open a browser window. Once you're there, you'll see a column that just says 'When', where you can select 'Only when idle' or 'While I'm working'. 

If you select 'Only when idle', Folding@Home will only do its thing when your computer isn't working on other tasks. If your daily workload involves a lot of computationally heavy tasks, this is what we'd advise. We'd only have it running while you work if you have a ton of performance headroom.

And, even if you do have it running while you're working, you can go into the advanced settings by right-clicking the icon in your system tray and selecting 'Advanced control'. 

That will open a window called 'FAHControl'. From there, click the 'Configure' button in the top left corner of the window. If you go over to the 'Advanced' tab, you'll be able to change the Folding Core Priority. It will default to 'Lowest possible', and we advise you to leave it there. 

This will basically allow any other application you're running to take priority, so that your PC doesn't lose a ton of resources to Folding@Home while you're working. Again, with this setting, we were able to keep working with F@H running, but your mileage may vary.
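If you're curious what 'Lowest possible' actually does, here's a minimal sketch of the same idea using the third-party psutil package – this isn't part of Folding@Home, just an illustration of OS-level scheduling priority:

```python
import psutil  # third-party: pip install psutil

def drop_to_lowest_priority(pid: int) -> None:
    """Ask the OS to schedule a process only when nothing else wants the CPU."""
    proc = psutil.Process(pid)
    if psutil.WINDOWS:
        proc.nice(psutil.IDLE_PRIORITY_CLASS)  # Windows 'idle' priority class
    else:
        proc.nice(19)  # niceness 19 is the lowest priority on Linux and macOS
```

Set this way, a process soaks up spare CPU cycles but steps aside the moment your foreground work needs them – which is exactly why F@H barely registers while you type.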



Xbox Series X will absolutely be able to do 4K60, here’s why

The next generation of consoles is looming somewhere over the horizon, taunting us with a late 2020 release. And up to now, there hasn't been a ton of information about what we could expect from the Xbox Series X and PS5, beyond some vague, fake-sounding stuff like "8K" and "instant loading times". 

Microsoft did reveal some information in a blog post a couple weeks ago, but that was largely limited to a vague statement that the GPU would be capable of 12 teraflops of compute performance, and didn't really tell us much beyond that. 

But now that Microsoft has revealed almost everything we need to know about the hardware through a lengthy post on Eurogamer, we have a much clearer picture of what the Xbox Series X will be capable of – and, really, the Xbox Series X seems to be quite an incredible piece of hardware. 

The Xbox Series X will have the equivalent of an AMD Ryzen 7 3700X - but without a Boost Clock

All about the numbers

We've known for a while that the Xbox Series X would be sporting an AMD Zen 2 processor and RDNA 2 graphics, but now we have some specific hardware information.

The Xbox Series X will have an 8-core, 16-thread processor with a maximum clock of 3.8GHz, but it won't quite operate like a desktop chip. You see, rather than automatically boosting core clocks when there's thermal headroom and demand for greater performance, the processor will stay at one clock speed at all times – and which clock speed depends on how developers use the CPU. 

What we mean is that developers can use the processor either with or without SMT (simultaneous multi-threading), and the clock speed will change depending on that choice: 3.8GHz with SMT disabled, which boosts per-core performance, or 3.6GHz with it enabled. 

Since most games today prioritize a few cores with high clock speeds, the former makes the most sense right now, but games could move toward prioritizing multi-threaded performance in the future, and this approach leaves the Xbox Series X open to that as time goes on. 

What's particularly interesting, however, is the GPU on offer here. Microsoft decided on a 12TFLOP GPU to target 4K60 gameplay – we already knew that – but now we know how Microsoft is hitting that target.

The Xbox Series X will be using a GPU with 3,328 Stream Processors spread across 52 compute units. This might not mean a lot to many folks, but to put it in perspective, the AMD Radeon RX 5700 XT – the current AMD RDNA flagship – is rocking 2,560 Stream Processors across 40 compute units. That means the Xbox Series X GPU has 30% more Stream Processors than the 5700 XT, which should translate to a substantial lead in performance.
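Those headline teraflop figures fall straight out of the Stream Processor counts, too. The usual rule of thumb – and it is only a rule of thumb – is two floating-point operations (one fused multiply-add) per Stream Processor per clock:

```python
def tflops(stream_processors: int, clock_ghz: float) -> float:
    # Two floating-point ops (one fused multiply-add) per SP per clock.
    return stream_processors * 2 * clock_ghz / 1000

print(tflops(3328, 1.8))   # ~12.0, Microsoft's quoted figure for the Series X
print(tflops(2560, 1.9))   # ~9.7, roughly the RX 5700 XT at its boost clock
```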

Raw compute performance is only part of the story for a GPU: memory is also extremely important. Now, the Xbox Series X will have 16GB of 14Gbps GDDR6, but that's shared between the CPU and GPU. However, because the console's operating system will be limited to a single CPU core, it's likely that the system won't be using too much of that RAM, leaving more resources for actual games to utilize. 

But to further expand memory capabilities, Microsoft is also leveraging the extreme bandwidth of PCIe 4.0 storage to expand system memory when needed. This could be the gateway to actual 8K gaming, as it would allow for much higher-resolution textures.

Is this 4K?

One of the major issues with the Xbox One X and the PS4 Pro is that they're not quite capable of handling true 4K gaming, even though both consoles were marketed around that resolution. Instead, Sony and Microsoft utilized checkerboard rendering to upscale games to 4K. 

For the most part this is fine, and looks enough like 4K to pass, but it's not quite the same as native 4K rendering. The problem is that native 4K rendering takes a lot of processing power. In fact, it's not until you drop something like $799 (£749, AU$1,199) on an Nvidia GeForce RTX 2080 that you can even approach '4K60' performance. 
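For a sense of scale, the raw pixel math alone tells the story – a rough sketch, since shading cost doesn't scale perfectly linearly with resolution, but it's close:

```python
pixels_1080p = 1920 * 1080       # ~2.1 million pixels per frame
pixels_4k = 3840 * 2160          # ~8.3 million pixels per frame

print(pixels_4k / pixels_1080p)  # 4.0: four times the shading work per frame
print(pixels_4k * 60 / 1e6)      # ~498 million pixels per second at 60 fps
```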

So, now that the Xbox Series X is rocking a GPU that's even more powerful than an RTX 2080, at a price point that's likely going to be way more approachable than a high-end graphics card, true 4K gaming is about to enter the mainstream. 

The Xbox Series X is already able to handle Minecraft with full path tracing

The future is ray traced

Now, of course, both major console manufacturers have promised that their consoles will support hardware-accelerated ray tracing, much like the Nvidia Turing line of graphics cards. 

We haven't seen this capability in AMD's first-generation RDNA cards like the Radeon RX 5700, and while we knew it was on the horizon, now we know a bit more about what to expect. Specifically, the Xbox Series X's hardware acceleration will be able to leverage 25 TFLOPs of performance for ray traced workloads. Now, it's hard to compare that to something like the Nvidia GeForce RTX 2080 Ti, as Nvidia measures its ray tracing performance in RTX-OPS and Giga Rays/s.

We don't know yet whether AMD's new GPUs will be comparable to Nvidia's in this regard, but if next-generation consoles start pushing ray tracing, we're going to see a huge increase in the number of games that support the technology. And because there are so few games with ray tracing right now, we frankly need the consoles to adopt it. 

But it doesn't end there. AMD and Microsoft also seem to be targeting Nvidia's DLSS technology with RDNA 2 and the Xbox Series X. If you're not familiar with DLSS, or deep learning super sampling, it's a technology that uses dedicated hardware on Turing graphics cards to upscale images through AI. 

Nvidia graphics cards have dedicated Tensor cores that handle this, but AMD is taking another approach. Instead, AMD will rely on the raw throughput of the GPU, executing the machine learning workloads through 8- and 4-bit integer operations – much lower precision than the 32-bit operations typically used in graphics workloads. This should deliver plenty of throughput for upscaling without sacrificing too much image quality. 
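To illustrate the precision trade-off being described – this is a toy example, not AMD's actual implementation – here's what squeezing 32-bit floats into 8-bit integers looks like:

```python
import numpy as np

weights = np.random.uniform(-1, 1, 8).astype(np.float32)
scale = np.abs(weights).max() / 127                    # one scale for the array
quantized = np.round(weights / scale).astype(np.int8)  # 8-bit representation
restored = quantized.astype(np.float32) * scale        # back to float

print(np.abs(weights - restored).max())  # small, but non-zero, rounding error
```

The restored values are close enough for many machine learning workloads, and each operation moves a fraction of the data a 32-bit one would – that's the raw-throughput trade AMD appears to be making.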

Looks like consoles are about to catch up with the best graphics cards

What does this mean for PC gaming?

The specs for the Xbox Series X are looking mighty juicy right now, but what about folks who prefer PC gaming? Well, it might mean you'll have to make some upgrades in the near future. 

We should be clear – the specs that Microsoft has revealed here don't outpace the most powerful consumer components on the market, but they do come close. Come the end of the year, you'll want to be rocking an 8-core CPU and at least an Nvidia GeForce RTX 2080 Super if you hope to stay ahead of the curve.

That's a tall order for a lot of people, as that hardware is incredibly expensive. The Xbox Series X, then, could very well be on par with the best gaming PCs right now, and given that it's going to be much cheaper for that level of performance, it could be an easy recommendation. 

Now, we'll totally admit that we were incredibly skeptical when Microsoft first started making claims about Xbox Series X performance, but these specs have won us over. It's going to be quite a while before we can get our hands on the console to see what it's really capable of, but it's looking pretty damn promising. 

We're about to see a true generational jump, and it's going to be felt no matter which platform you typically play games on. It's something to be excited about, even if it's several months away, and we're here for that. 


Doom Eternal PC system requirements and unlock times revealed

Just a day ago, we spotted some system requirements for Doom Eternal that were way higher than we expected. But they almost immediately got taken down, leading us to believe that they weren't quite right – and, it turns out, they weren't. 

Official Doom Eternal system requirements have now been revealed on Bethesda's blog, and they're much more reasonable than the brief requirements that surfaced earlier. That earlier post suggested that you would need an Nvidia GeForce GTX 1060 at the minimum, along with 8GB of RAM and an Intel Core i5 processor. 

But, luckily, you can get by with much more modest hardware. If you just want to run the game at 1080p and 60 fps, an Nvidia GeForce GTX 1050 Ti or an AMD Radeon RX 470 will do – a much more accessible level of hardware. We've listed the full Doom Eternal system requirements below, but it's a relief to know that more people can go to Hell (if they want to).

The only thing that really sticks out to us is the CPU requirement. For the recommended and 'Ultra-Nightmare' settings, Bethesda and id Software give specific CPUs to aim for, which we'll dive into below, but the minimum spec just gives a clock speed as a target. That may not be super helpful for people trying to figure out whether their CPU meets the minimum, as "Intel Core i5 @ 3.3GHz" could theoretically go all the way back to the Intel Core i5-2500K, if it has a nice little overclock on it. 

Either way, if you want to download the game early, you're in luck. You can preload the game up to 48 hours before the game launches on PC, and it will unlock at 12:00 AM in your local time zone. 

Minimum specs (1080p/60 fps/low quality settings)

  • CPU: Intel Core i5 @ 3.3GHz | AMD Ryzen 3 @ 3.1GHz
  • GPU: Nvidia GeForce GTX 1050 Ti, Nvidia GeForce GTX 1650 | AMD Radeon R9 280, AMD Radeon RX 470
  • Memory: 8GB 
  • HDD: 50GB
  • OS: Windows 7/Windows 10 (64-bit)

Recommended specs (1440p/60 fps/high quality settings)

  • CPU: Intel Core i7-6700K | AMD Ryzen 7 1800X
  • GPU: Nvidia GeForce RTX 2060, GTX 1080 | AMD Radeon RX Vega 56
  • Memory: 8GB
  • HDD: 50GB

Ultra-Nightmare specs (4K/60 fps/Ultra-Nightmare settings)

  • CPU: Intel Core i9-9900K | AMD Ryzen 7 3700X
  • GPU: Nvidia GeForce RTX 2080 Ti | AMD Radeon VII
  • Memory: 16GB
  • HDD: 50GB

What kind of PC do you need to run Doom Eternal?

The Doom Eternal system requirements are kind of all over the place, and it's hard to nail down the perfect system to play the game. Now, we obviously haven't gotten our hands on the game to test performance like we have with Red Dead Redemption 2 or Halo: Reach – yet – but we can get a pretty good idea of the right kind of hardware to run it. 

Because the game at minimum requires an Intel Core i5 or AMD Ryzen 3 at 3.3GHz or 3.1GHz, respectively, we can safely assume that the game isn't heavily threaded. For instance, the AMD Ryzen 3 1300X has a 3.5GHz base clock and just four cores without SMT. That means even old quad-core champs like the Intel Core i5-3570K should theoretically be fine. But, again, IPC (instructions per clock) has come a long way since either of those processors launched, so you can't be 100% sure. 

Things get a little more confusing when you move up to the Ultra-Nightmare specs. It calls for an Intel Core i9-9900K or an AMD Ryzen 7 3700X – both 8-core, 16-thread processors, but they perform very differently. Intel's processor has stronger single-core performance, thanks to its high 5GHz boost clock, whereas the Zen 2-based 3700X smokes it in heavily threaded workloads (at a cheaper price). 



The confusion continues when you look at the GPU requirements for this level, too, as it calls for either an Nvidia GeForce RTX 2080 Ti or an AMD Radeon VII – two GPUs that are definitely not on equal footing. 

At the end of the day, we won't actually know what performance will look like for Doom Eternal until we're able to actually run it through some testing ourselves, but for now we can help you make sure your system is up to date, using currently-available PC components. 

If you want to really embrace everything this game has to offer, soaking in the high-fidelity graphics, this is the hardware we recommend: 

  • CPU: AMD Ryzen 7 3700X or Intel Core i7-9700K
  • GPU: Nvidia GeForce RTX 2080 Super
  • Memory: 16GB
  • HDD: 50GB SSD

You may notice that we recommended the Nvidia GeForce RTX 2080 Super instead of the RTX 2080 Ti, but that comes down to the fact that Bethesda is recommending the Radeon VII for the same level of performance. Because the RTX 2080 Ti is so much more expensive, it's hard to recommend if you're able to get a similar level of performance out of a card that costs nearly half as much. 

Then again, it's entirely possible that Doom Eternal is going to be a huge VRAM drain, necessitating the 11GB of GDDR6 on the RTX 2080 Ti or the 16GB of HBM2 on the Radeon VII – but we seriously doubt it. 

If you need to upgrade your hardware to play Doom Eternal when it hits the street, we went ahead and included convenient price comparison widgets down below so you can find the best price on admittedly expensive PC hardware. 

Via PC Gamer
