I wouldn’t bet on an Nvidia GeForce RTX 3090 appearing

Recently, rumors surrounding the next Nvidia graphics cards have been everywhere, and it's extremely likely that we're going to see new ones before the year draws to a close. However, while some of the rumors surrounding the Nvidia GeForce RTX 3080 and other potential graphics cards like it seem extremely solid, others skew to the ridiculous side of the spectrum. 

One particular Nvidia Ampere rumor keeps catching my attention, however: a supposed Nvidia GeForce RTX 3090. But here's the thing: it's probably not going to happen.

It'd be easy to just point at the Nvidia Turing lineup and say "hey, there was no RTX 2090, so there won't be an RTX 3090", but it runs a bit deeper than that. So, let's do a nice bit of deconstruction.

Back to the 90s

The last Nvidia graphics card we got with the xx90 name was the Nvidia GeForce GTX 690 all the way back in 2012. The 690, just like the GTX 590 before it, was a dual-GPU card, which basically meant that users could have a sick SLI configuration while only taking up one PCIe slot. 

Nvidia wasn't alone in releasing graphics cards like this back then, as AMD was in on the action too. The AMD Radeon HD 7990, for instance, came out in 2013, and was probably the last dual-GPU card that could be considered mainstream – with products like the R9 295X2 being way too expensive for everyday users to even consider. 

It's not a coincidence that Nvidia dropped the xx90 naming scheme after the 690, either. The 690 wasn't Nvidia's last dual-GPU card – that honor goes to the Nvidia Titan Z – but that card, along with the rest of the Titan lineup, was meant for creative workloads rather than gaming. 

Some may point to the AMD Radeon RX 590, of course, as that GPU launched just a few years ago. But again, that card was essentially just a die-shrunk, overclocked AMD Radeon RX 580 – not a new top-end tier, and certainly not a dual-GPU product. 

If you want to know more about the history of these dual GPU graphics cards, here's a video from Tech Showdown that goes into some pretty exhaustive detail, going all the way back to the Voodoo II:

That single GPU life

Since the AMD Radeon HD 7990 and Nvidia GeForce GTX 690, any dual-GPU cards that have come out have sat apart from either manufacturer's main graphics card lineup, and there's definitely a reason for that. 

I reached out to Nvidia and it told me that the question ultimately boils down to diminishing returns. I was told that "dual GPUs are expensive to produce and require adequate cooling and thermal/acoustics. There’s also a limited market for them." 

The best graphics cards are already getting more and more expensive – you just have to look at the Nvidia GeForce RTX 2080 Ti for evidence of that. That graphics card launched at $1,199 (£1,099, AU$1,899), with a whopping 250W TDP. Nvidia even had to adopt a dual-fan design for its Founders Edition cards this generation, and in our testing, the card still runs super hot. 

But, it's a bit more complicated than "it's expensive and hot". Games are built differently than they were a decade ago, and with the advent of DirectX 12, the onus is on developers to bake in support for multi-GPU configurations. 

When I asked Nvidia about this, I was told: "SLI was a huge hit for us when we launched it 16 years ago. At that time however, there were no APIs to support game scaling, so NVIDIA did all of the heavy lifting in our software drivers." In fact, SLI was successful enough that "folks were eager to pair two GPUs in their system using the SLI bridge. A dual GPU like the 690, is essentially the same thing, just on a single PCB (board)."

Software support is where things have changed, though, with APIs moving from implicit mGPU support to explicit mGPU support. With the former, multi-GPU support came down to Nvidia: it would update graphics drivers with profiles for different games, making sure that SLI configurations worked well. With explicit mGPU, however, that responsibility shifts over to the game developers, who have to bake support straight into the game engine – while Nvidia still has to do a lot of heavy lifting on its end, too. 

So, as Nvidia puts it: "explicit mGPU is rarely worth the time for the developer since the scaling result isn’t worth the investment. Plus, as our GPUs have become more powerful, and capable of doing 4K over 60fps as an example, there’s really no compelling reason to build a card with two GPUs."
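
To make that implicit/explicit distinction concrete, here's roughly what the first step of explicit mGPU looks like in code: the engine itself, not the driver, walks the list of GPUs using DXGI before deciding how to split rendering work between them. This is a minimal Windows-only sketch (link against dxgi.lib), not anything out of an actual engine:

```cpp
// Minimal sketch: the first step of explicit mGPU under DirectX 12.
// Under implicit SLI the driver hid the second GPU from the game; with
// explicit mGPU the engine enumerates every adapter itself and must
// create a device per adapter and distribute work between them by hand.
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Each physical GPU (and the software rasterizer) shows up here;
        // everything past this point is the developer's problem.
        printf("GPU %u: %ls\n", i, desc.Description);
    }
    return 0;
}
```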

It's impossible to rule out a future dual-GPU card entirely, but it doesn't seem likely that we'll see one again any time soon – at least until 8K gaming becomes a thing, which is probably a long way off.

Does it have to be dual GPU?

Now that we're back in the world of speculation, it's possible that Nvidia will just put out a single-GPU card with xx90 branding and call it a day. I don't think that's going to happen either. 

One of my colleagues pointed to the Intel Core i9-9900K and AMD Ryzen 9 3900X as examples, saying they could point to a shift in the components industry toward using the number "9" as branding. But I think there's some important context missing here. 

Now, this is entirely my interpretation of the Coffee Lake Refresh release, but the Core i9-9900K appeared after AMD Ryzen 2nd generation processors like the Ryzen 7 2700X started threatening Intel's long-standing desktop superiority. 

The Intel Core i9 branding was around before that, but only on HEDT processors like the Intel Core i9-7980XE – there were no 8th-generation X-series chips... oops. However, when faced with AMD's rising competition and a manufacturing process that was – and still is – stagnant, Intel had to bring in the big guns with the Core i9-9900K. 

And, in turn, AMD responded with the AMD Ryzen 9 3900X and the Ryzen 9 3950X, both of which remain unchallenged by Intel nearly a full year after Ryzen 3000's initial launch. 

While the Core i9 and Ryzen 9 branding is probably here to stay, that moment hasn't really happened in the graphics card market yet. AMD RDNA 2 is somewhere off on the horizon promising to bring ray tracing and 4K gaming to AMD Radeon users, but it's not here yet. Assuming both AMD and Nvidia are dropping new graphics cards this year, Nvidia doesn't really have much of a reason to shift up that branding. 

The biggest rumor pointing to the RTX 3090 revolves around the idea that there will be three graphics cards using the top-end GA102 GPU – but, like, the Nvidia Titan is a thing. If I had to guess what those three graphics cards would be, I'd suggest an RTX 3080 Ti and, if anything, two Titan cards. 

Nvidia has been going all-in on creators over the last couple of years, especially with its RTX Studio program, so my money is still on one halo flagship gaming card, with any other GA102 GPUs – if that's even what it's called – being reserved for pro-level Nvidia Titan graphics cards. 

Still, this is all speculation. So, if this is wrong, Nvidia – prove it. 


No, Zen 3-based AMD Ryzen 4000 processors haven’t been delayed until 2021

AMD Ryzen 4000 desktop processors understandably have a lot of hype behind them, which is leading to a lot of rumors spilling out all over the internet – but not all of them are to be believed, especially when they run directly counter to what AMD itself is saying. 

Just yesterday, a rumor came out of DigiTimes saying that AMD Ryzen 4000 would be delayed. On the surface that even makes a lot of sense – AMD is clearly the performance leader right now, and the newly-launched Intel Comet Lake-S processors have failed to topple the Zen 2 lineup, with the flagship Core i9-10900K barely making a dent. 

However, we have seen AMD insist again and again that its Zen 3-based Ryzen 4000 processors will hit the streets before 2020 draws to an end. In fact, AMD has reached out to us yet again to tell us that it's still on track to launch the next-generation processors before the year is through. 

We still have no idea when AMD Ryzen 4000 processors will actually hit the streets, but with how insistent AMD has been that it's on track to launch Ryzen 4000 this year, it's pretty unlikely that we'll get hit with a major delay.

But, who knows, unforeseen circumstances may arise and push the AMD Ryzen 4000 release date back into 2021. Ultimately, we'll just have to wait and see what AMD is ready to share, and when. Until then, it looks like we'll just have to live with the minor Ryzen 3000XT refresh.


So, an AMD Ryzen 4000 APU can apparently run Crysis without a CPU cooler 🤷‍♀️

While AMD Ryzen 4000 laptop chips are probably one of the best things to happen to gaming laptops in a while, we haven't really heard much about the U-series chips. However, they might be a tad underrated. 

Fritzchens Fritz over on Twitter posted a video of an AMD Ryzen 3 4300U with no cooling whatsoever running through the entire Crysis benchmark. It's impressive that the board running this chip didn't immediately shut down due to heat, but the fact that it got through an entire benchmark run is seriously awesome. 

This was apparently achieved by going into the BIOS and setting the maximum temperature to 90C, down from 100C, which caused the processor to start throttling much earlier, stopping it from overheating immediately after getting into the operating system. 

Now, as should be totally expected with something like this, benchmark scores are unimpressive to say the least. The processor manages a measly 327 points in Cinebench R15, and 395 in 3DMark Time Spy. We've never tested this processor ourselves, but if this chip can put up numbers like that with no cooling, we can only imagine what it can do in a laptop with a robust cooling system. 

Either way, this is definitely not something you should try at home. Modern CPUs have failsafes in place that will stop a CPU from frying itself, but it's not impossible to brick your system by testing this out. That doesn't mean you can't be impressed by someone else putting their hardware at risk, though. 


Intel Lakefield is here, powering the future of computing

At CES 2020 it seemed like every laptop manufacturer wanted to show off foldable devices, but wouldn't reveal what was powering them. However, Intel Lakefield processors have now been launched, and will be powering a whole slew of inventive computers.

Intel Lakefield will only be behind two announced laptops at first: the Lenovo ThinkPad X1 Fold and the Intel version of the Samsung Galaxy Book S. The former doesn't have an official release date at the time of writing, but Samsung's Lakefield-powered device should be hitting the street this month. 

It's important to note that this isn't just another processor refresh – it's a completely new chip design. Intel has clearly taken some inspiration from ARM's big.LITTLE architecture: one 10nm Sunny Cove CPU core is paired with four lower-power Tremont cores. The bigger Sunny Cove core tackles heavy workloads that need a lot of power, while the Tremont cores more efficiently handle background tasks. 

What's even more impressive is the new Foveros 3D stacking technology, which stacks the entire SoC and memory into one tiny package measuring just 12 x 12 x 1mm – basically the size of a dime. This eliminates the need for separate RAM packages on the motherboard, and will lead to much smaller devices.

Coupled with the Intel LTE solution built into the package, Intel Lakefield is going to be behind some of the most portable devices we've seen, and we can't wait to get our hands on it. 

Don't expect a powerhouse

The two processors announced as part of Intel Lakefield are the Intel Core i5-L16G7 and the Intel Core i3-L13G4. Both are 5-core chips with no Hyper-Threading, and even the Core i5 has a max single-core speed of just 3.0GHz. 

Needless to say, hardcore productivity isn't the aim of these processors. In terms of raw performance, these CPUs are almost certainly going to be slower than Intel's Ice Lake processors, and are instead aimed at long battery life and portability. 

We obviously haven't had a chance to test any device with one of these processors quite yet, but we imagine that these chips will be ideal for folks who need an always-connected device that they can take with them wherever they go, and who only need something powerful enough to do light office work like checking email and loading up some spreadsheets. 

And, because of the smaller board size that these Intel Lakefield processors enable, this architecture is a natural fit for foldable devices, where there is less space available. 

For instance, when we reviewed the Samsung Galaxy Book S with the Qualcomm Kryo 495, that laptop weighed in at just 2.12 pounds (0.96kg) and was less than half an inch thick. The obvious benefit with the Intel Lakefield version will be that it can run all Windows apps, as it supports both 32-bit (x86) and 64-bit (x64) programs. 

So before you go out and preorder a Lakefield-powered device because it's billed as the future of mobile computing, you should seriously consider whether it's right for you. If you're a traveling businessperson, it might just be. 

 



New AMD Ryzen 3000XT CPUs spotted, but you can probably ignore them

Fresh off the release of the Ryzen 3 3100 and 3300X, not to mention the new Intel Comet Lake-S lineup, it looks like AMD may be launching some new versions of its hard-hitting Ryzen 3000 CPU lineup. 

Retail listings for new 'XT' iterations of AMD Ryzen 3000 processors have appeared on French tech retailer Materiel.net, making it more likely that these processors actually exist. However, as the folks over at VideoCardz point out, it's best to take the prices with a grain of salt. That said, with the AMD Ryzen 9 3900XT appearing at €499 (about $555, £455, AU$820), it would be in the same price range as the AMD Ryzen 9 3900X when that CPU launched back in July 2019. 


This new AMD leak doesn't paint a clearer picture of what we can expect from CPU performance, so it's probably still safe to assume that the Chiphell leak spotted last week is the closest we have to solid information. According to that leak, the Ryzen 9 3900XT will feature a boost clock of 4.8GHz, up from the 4.6GHz of the original 3900X. 

That's not a huge jump in specs, but because of the strong IPC (instructions per clock) performance of AMD Zen 2, it should lead to some substantial gains, especially in games.  

Either way, we won't know whether these chips exist until AMD confirms it, and even if they do exist, they could just exist as an OEM option for prebuilt gaming PCs. So, don't get too excited just yet. 

Temper expectations

While we would love to see AMD provide some more powerful processors, especially if the chips fall within the same price range as the existing silicon, it's important to note that they won't be that much better.

A 200MHz jump in boost clock is nothing to sneeze at, but it's not enough to suddenly make anyone's Ryzen 3000-equipped gaming PC irrelevant. According to these rumors, this would be roughly a 4% jump in clock speed while maintaining the same core counts and TDP (thermal design power). So, if you already have a PC with one of Team Red's latest chips, you have no real reason to pick one of these up. 

However, for folks that were on the fence about upgrading to the most recent processors, these chips, if real, would be an excellent choice – more performance is never a bad thing. 

Keep in mind that AMD Ryzen 4000 CPUs are likely right around the corner, so investing in what's essentially last year's tech right now may not be the best idea in the world. This all puts these XT processors in kind of a weird spot. 

If we're to put our speculation hats on for a second, we could see these processors acting as a special anniversary edition of the Ryzen 3000 processors – the only leaks so far have been of the most popular chips in the lineup. All the AMD rumors are pointing to a late 2020 launch for Ryzen 4000, so a limited anniversary run would make a lot of sense. Intel did it last year with the Core i9-9900KS, so this could be AMD taking a similar approach. 

We'll see what these AMD chips look like when we get a solid announcement, but it's equally possible that these processors will never actually see the light of day.  


Intel Comet Lake's lack of PCIe 4.0 support is a big missed opportunity

After almost a year and a half of waiting, Intel Comet Lake-S has finally arrived, bringing a 10-core processor to its mainstream lineup for the first time – one with pretty amazing clock speeds. 

Just on the performance front, it does everything it needs to – the Intel Core i9-10900K provides some pretty incredible single- and multi-threaded performance, even if it doesn't quite topple AMD's current hold on the desktop CPU throne. 

We're not living in such a simple world anymore, however, where performance is all that matters. Even if we're only talking about PC gaming, the exclusion of PCIe 4.0 doesn't bode well for future performance of systems with Comet Lake processors, especially where graphics cards and storage are concerned. 

How can something so small be so important?

PCIe 4.0 is the future, sorry

When I first heard that PCIe 4.0 was coming to the AMD X570 platform back at Computex 2019, I knew that it would be an improvement over PCIe 3.0, I just didn't realize how much of an improvement it would be. 

I've only had the chance to test one SSD that uses the interface, the Gigabyte Aorus NVMe Gen4 SSD that AMD provided in my testing kit when I reviewed the Ryzen 9 3900X and Ryzen 7 3700X. That SSD is currently locked up in an office building in Manhattan, but when I tested those processors, I was able to get speeds of 4,996MB/s in the CrystalDiskMark sequential read test. To put that in perspective, the Adata XPG SX8200 Pro I use on my test bench gets 3,720MB/s in the same test – that's already a 34% improvement, going from one of the fastest PCIe 3.0 SSDs to one of the first PCIe 4.0 SSDs.

And there's headroom to grow: PCIe 3.0 x4 tops out at roughly 3.9GB/s of usable bandwidth, while PCIe 4.0 x4 doubles that to around 7.9GB/s. The best SSDs will keep getting faster on this interface as time goes on, making PCIe 3.0 SSDs obsolete the same way NVMe SSDs pushed out SATA SSDs. 

That's even before you consider the other major component on PCIe – the best graphics cards. Sure, the only graphics cards that use PCIe 4.0 right now are AMD Navi cards like the Radeon RX 5700 XT, but that's going to change. We've already heard rumors that the RTX 3000 cards are going to use PCIe 4.0, and if that's not enough, Nvidia is already using PCIe 4.0 in its server-grade Ampere products.

There's more to gaming than a shiny GeForce RTX 2070 Super

Gaming gaming gaming

The PS5 and Xbox Series X are just a few months away at this point, and both Sony and Microsoft won't stop talking about the SSDs in these systems and how they're going to push gaming forward. Hell, remember that Unreal Engine 5 tech demo that looked incredible? The developers behind that are saying that if PC gamers want to be able to see stuff like that they're going to have to pick up an NVMe SSD.

And, yeah, you can do that with a Comet Lake-S processor, that's absolutely true. But if the consoles are all using PCIe 4.0 SSDs, it's not going to be long before that's the baseline of performance for AAA games – that's kind of how it goes with each console launch. 

Intel likes to claim that it makes the best processors for gaming, and it bases those claims on single-core performance being the most important factor. And, for the most part, it's right – at launch.

Even in my briefing for these 10th-generation Comet Lake-S processors, Intel compared the Intel Core i9-10900K to the Core i7-7700K, because people typically upgrade every 3-5 years. 

With a high-end product like the Core i9-10900K, there are going to be users that will upgrade as soon as Intel's 11th-generation desktop chip comes out, sure, but there are going to be many more that will be relying on that processor far longer than just a year or two.

Never thought a console would make me urge caution to people thinking about buying a top-end CPU

Intel should have waited

Back at CES 2020, when Intel teased Xe graphics and Tiger Lake, I said that Team Blue needed to launch a desktop processor if it wanted to stay relevant. And it did that. However, nothing exists in a vacuum. 

By launching a new processor that is not only expensive on its own, but also requires a whole new motherboard, Intel is essentially asking consumers to spend hundreds (if not more) on a system that's going to be, in some ways, inferior to a console in less than a year. 

Because lightning-fast storage is shaping up to be one of the battlegrounds of the upcoming generation of games – and you can be damn sure that will include the best PC games – Intel's lack of PCIe 4.0 seems incredibly short-sighted, especially for a company that puts so much value in PC gaming. 

Only time will tell if PCIe 4.0 becomes as big a deal as I think it will, but it's not looking so hot for Intel. Here's hoping the rumors that Rocket Lake will follow close behind Comet Lake are true – Intel's going to need PCIe 4.0 support if it wants to keep claiming the best CPUs for gaming. 


Nvidia Ampere has been announced: here’s what that means for you

The Nvidia GeForce RTX 3080 may be the most anticipated product in the computing world, and while the rumor mill was guessing (and probably still is) that the next consumer-facing graphics card would be based on Ampere, that hasn't quite happened yet. 

Instead, Nvidia announced Ampere much in the same way it did Volta a few years ago – built primarily for the data center, with no mention of PC gaming or GeForce. Now that AI has become so important, thanks in large part to the unprecedented rise of cloud computing, Nvidia has been hard at work developing the A100 GPU, which should deliver a whopping 20x improvement in raw compute power compared to Nvidia Volta.  

We don't know if Nvidia Ampere is going to be behind the next GeForce cards yet – and we wouldn't exactly rule it out – but that doesn't mean that the GPU architecture won't impact the lives of us normal folks who can't afford to drop hundreds of thousands of dollars on server equipment. 

These days IoT devices are piling up in everyone's homes, companies like Amazon are building grocery stores that automatically charge you for products as you toss them into your cart, and cars are getting ready to drive themselves. All of this requires a ton of compute power, which is why the new 7nm Ampere architecture, along with its third-generation Tensor Cores, is such a big deal. 

The GPU should be available soon for any business that could benefit from such sheer computing power, along with systems like the DGX A100, which packs eight A100 GPUs into a single rack that will cost an eye-watering $1 million. That sounds like a lot, but plenty of buyers for both the DGX A100 and the A100 on its own have been lined up, with the likes of Microsoft, Google, Dell and Amazon getting in on the Ampere action. 

What about GeForce?

So, Nvidia didn't announce a new gaming graphics card. It's unfortunate, sure, but we weren't really expecting to see a new GeForce product until fall anyway, whether or not it ends up being based on the Ampere architecture – which isn't exactly guaranteed. 

Back when we thought the Nvidia GeForce RTX 2080 was going to be the GeForce GTX 1180, word on the street was that it was going to be based on Volta, but that didn't end up being true. Instead, that architecture was dedicated to data center products like the Tesla V100 and the Titan V. Ampere could easily follow in those same footsteps. 

However, Volta played an important role in Turing's development either way, as Volta cards were the first products that used the Tensor Cores that proved to be so important in the Nvidia RTX 20-series cards. The Ampere GPUs revealed today are using the third-generation Tensor Cores, which are much more powerful than the second-generation ones found in Nvidia Turing graphics cards. 

Even if the Nvidia GeForce RTX 3080 isn't based on this Ampere architecture, it's very likely to be built on the same 7nm manufacturing process, which means similar efficiency and power improvements. We probably won't see a graphics card that's literally 20 times more powerful than the RTX 2080 Ti – but those rumors we saw earlier this week suggesting the 3080 Ti will be 40% faster than the current flagship are looking much more reasonable now. 

We don't know anything about the next GeForce graphics cards at the end of the day, and we won't know anything until Nvidia is ready to share some information. However, with how much more advanced its data center GPUs have become in the three years since Volta came out, we can only imagine what PC gaming is going to look like in just a few months. 


Intel and AMD both just made budget PC gaming so much better

AMD has just launched its Ryzen 3 3100 and 3300X processors, and they're basically the best thing that's happened to the $100-$150 price range in years. Not only are these processors extremely affordable, but thanks to the Zen 2 architecture they're built on, they provide some pretty significant performance improvements, too. 

But AMD isn't alone here. Intel gets a lot of hate these days, and we totally get it. We were stuck with 9th-generation Coffee Lake Refresh processors for nearly two years, and even those stopped being impressive a couple months after they hit the streets. 

Comet Lake-S is here, though, and while we haven't had the chance to test these processors, the entire product stack now has Hyper-Threading, along with much higher boost clocks (and power consumption). 

So, basically, we've reached the golden age of PC building, and there's a lot to be excited about.  

AMD Ryzen Threadripper

Threadripper has kind of helped push Intel, too.

Ryzen-fueled CPU competition has paid off

Ever since the first AMD Ryzen processors hit the market, people have been saying that the competition would drive both major CPU manufacturers to release better products for cheaper. And, while AMD processors kept getting better and better, Intel stayed stuck on a 14nm process, seemingly in defiance of the advancements AMD was making. 

Team Blue just kept upping TDP and clock speeds and doubling down on its status as the "best gaming CPU". But that can only go so far. 

Multi-threaded processors across the whole lineup were inevitable, however. While Intel isn't new to releasing processors with Hyper-Threading, this is the first time in a long time that Hyper-Threading is coming to the entire desktop stack – all the way down to the Pentium Gold G6400T.

This means that no matter what kind of budget you have for building a PC, you can get strong multi-core performance. Even if you only have a couple hundred bucks, you can build a PC that's not only strong enough for the best PC games, but also good enough to get some video editing done. 

 

Xbox Series X

Xbox Series X is going to push PC games forward, trust us.

A good sign for the future

For the longest time PC games were extremely single-threaded applications. Even to this day, there are a ton of popular titles that will only really use one or two of your CPU cores, ignoring all the rest. 

Because the vast majority of gaming PCs still use Intel processors – even if AMD has been killing it in sales over the last couple of years – this hasn't really been a huge problem. Intel processors generally feature very strong single-core performance, which has led to Team Blue's reputation as the company behind the best gaming processors. 

There have been plenty of games over the last few years, however, that have bucked that trend. Titles like Battlefield V and Assassin's Creed Odyssey are heavily threaded, leading to much stronger performance across the board. In fact, many of the PC games we test for our performance articles have started splitting processing among many cores, with Doom Eternal and Red Dead Redemption 2 being major examples. 

And when you consider that the next-generation consoles both feature AMD Zen 2 processors with 8 cores and 16 threads, we fully expect this to continue well into the future. With the huge install base that consoles bring with them, it's inevitable that games will be optimized for processors with many cores. 

Even before these consoles come out, though, multi-core processors are quickly becoming the baseline for gaming PCs. According to the latest Steam Hardware Survey, the number of CPU cores in gaming PCs is steadily growing. Quad-core processors are down to 48.89% of the market, from 52.49% in December. 

Hexa-core processors, thanks to mainstream heroes like the Ryzen 5 3600 and the Intel Core i5-9600, have grown from 20.13% of the Steam userbase to 22.58% in the same 5-month period, while 6.7% are now using high-end 8-core chips. If 6- and 8-core processors continue to grow in popularity like this, gaming will inevitably become multi-threaded – that's how technology works, after all. 

Build a PC

We can't wait to see all the budget PC builds

It's the perfect time to build your budget gaming PC

Building a gaming PC, especially if you want to tackle the latest and greatest PC games, is an expensive venture. Even if you go with solid mid-range hardware like the Ryzen 5 3600X and an Nvidia GeForce RTX 2060, you're still probably looking at a pretty sizable price tag. 

Anything that can bring down that barrier to entry is a good thing in our books, and both AMD and Intel have come out with 2020 processors that do just that. We haven't had the chance to test the newly-announced Intel Comet Lake-S desktop processors yet, but just taking a look at the Core i3 product stack, we can tell that they'll provide an excellent option for folks trying to save cash on their gaming PC. 

This is one of the odd times where both major CPU manufacturers are providing very compelling products, and it couldn't come at a better time. We predict that CPU requirements are going to change a lot over the next couple of years, so we couldn't be happier that Intel and AMD are providing this kind of power to everyone. 

So if you were on the fence about building that "cheap gaming PC" that people on internet forums are always telling you is possible, now's the time to do it. The best part, though: you no longer have to compromise on awesome CPU performance just because you're buying a $100 processor. 


This bidet toilet seat attachment is the at-home throne I needed in my life

I went to Taiwan for the first time to attend Computex in 2019, and when I was there I got to try a bidet for the first time. Before that, I had heard a ton of stories about how much more hygienic they are compared to using toilet paper – and it really makes sense if you think about it for a couple of minutes – but that hotel bathroom exposed me to a whole new world.

I used that toilet after a 16-hour plane ride from New York City to Taipei, literally a week after being hired as an editor at TechRadar. Needless to say, I was pretty exhausted. But using that bidet made me feel completely refreshed. 

Fast forward to the end of the conference, and I flew back to New York, then back to my home in Colorado, and I didn't get to use a bidet for nearly a year. I missed it every day – toilet paper kind of loses its appeal when you've seen just how good life can be.

So, when Coway reached out and asked if I wanted to try out the BidetMega 400, there was no way I was going to turn that down. This is my story. 


Computex did more than just show me some fancy RGB PC components

A little about the Coway BidetMega 400

When I first heard that I'd be getting a bidet in the mail, my expectations were pretty tame. I just thought it'd be a toilet seat with a hose running through it that would clean me up after I used the bathroom. I didn't know just how luxurious a toilet could be. 

The Coway BidetMega 400 isn't just a bidet, it's an experience. Yes, it has the core functionality of spraying water to clean, but it can do so much more. The bidet stores some water in a heated reservoir, at one of three user-selected heat levels. This means when you use the bidet, it's not just ice-cold water being sprayed at you – it's heated, and it feels amazing.



But that's not even everything that's heated. You can also select one of three heating levels for the seat, which would have been nice in the actual winter, but it's whatever. Still, even when the weather is moderately warm, like it is right now, there's a certain something about being able to sit down on a warm piece of plastic when I'm doing my business.

The BidetMega 400's warm streak also extends to a built-in blow dryer, almost completely eliminating the need for toilet paper – though it's still good to make sure that the bidet didn't, like, miss anything. That's right, this thing is seriously high-tech.


Finally, the BidetMega 400 has a freaking night light. Because the Covid-19 pandemic and the changing nature of the world have been keeping me up at night a bit, I've been using the bathroom more in the middle of the night, and now my toilet illuminates the restroom in this pleasant blue light so I don't have to blind myself by turning the lights on. Honestly, at this point I don't think I can live without it – it even self-cleans every time it senses someone sitting on it.

I'm in love.

The night light is seriously cool

Taking it a bidet at a time

Now of course, the world is in kind of a weird place right now. I'm no expert in pretty much anything but processors and ray tracing (and even that is debatable), but I can speak from my own personal experience. 

The Rite Aid by my apartment has been sold out of pretty much any paper product for like 6 weeks, and I've had to refresh my toilet paper and paper towel supply by going to the little bodega on my corner – and it really only has single-ply toilet paper on offer. It definitely gets the job done, but that's no way to live. 

This bidet came just in time, really – I don't need to worry about battling the crowds in order to get my hands on toilet paper, and I don't have to sacrifice hygiene to avoid it. I've been using the Coway BidetMega 400 for around three weeks at this point, and while I recognize the privilege that having such a luxurious hygiene product in the current climate is, I couldn't be more thankful for it. 

It's at the point where I legitimately look forward to using the bathroom. I feel so refreshed and clean afterwards, without really being in danger of touching my own waste. It's a bit of a pain to clean, and installing the bidet in a tiny New York City apartment was a bit of a chore, but I really would do it all again. 

That one hour of work that it took to install it early on a Saturday morning was nothing compared to the weeks of comfort I've experienced as a result. In the face of this pandemic, we're all under so much stress, but the BidetMega 400 is the creature comfort I needed in these hectic times. 

The world will probably eventually return to normal, but at the end of the day I don't think I will. It'll probably be a long time before I'm emotionally and mentally prepared to handle crowds again. The silver lining in all of it is that it gave me the opportunity to get a bidet in my life, and even when everything else goes back to the way it was before, I'll never go back to living a bidet-less life. It's seriously that good. 

I went ahead and included some links to buy a bidet down below. If you have the opportunity, I wholly recommend it, regardless of brand. 



Vampire: the Masquerade – Bloodlines 2 is coming to Xbox Series X

At the Inside Xbox event, Paradox showed off a haunting trailer for Vampire: the Masquerade – Bloodlines 2, but more importantly, it confirmed that the game is coming to next-generation consoles. 

The game will take place in Seattle, and while we've previously seen footage that places you in society's underbelly, the recent gameplay reveal shows a vampire that seems to be an executive. 


Beyond that, there weren't many details to be gleaned. The trailer did show some live first-person gameplay, with the player using vampire powers and jumping around a bunch of rooftops. One thing's for certain, though: this game isn't going to be shy about the gory details, and the next-generation hardware is going to bring it to life in a big way.

We still don't have an official launch date for Vampire: the Masquerade - Bloodlines 2, but given that the trailer was mostly live gameplay, we expect it to be an early title for the Xbox Series X, and, of course, PC. 

Now, we just have to wait. 


RTX Voice is the best Nvidia Turing feature, and it has nothing to do with PC games

When you think of Nvidia Turing cards like the RTX 2080, the first thing that probably comes to mind is ray tracing in the best PC games – but what if you could ray trace your voice?

Team Green just released the beta of RTX Voice, and it might just be the best thing you can do with an Nvidia graphics card.

The concept is simple: use the AI processing capabilities of Nvidia's Tensor Cores to block out any background noise, whether it's coming in from your speakers or going out through your microphone. 
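
Nvidia hasn't published how RTX Voice works under the hood, but conceptually it has the shape of any real-time noise filter. Here's a bare-bones sketch of that loop – the capture, inference and virtual-device functions are hypothetical stand-ins (stubbed so the sketch compiles), not Nvidia's actual API:

```cpp
// Conceptual sketch only – not RTX Voice's real internals.
// The general shape of a real-time noise filter: pull audio in small
// frames, run each frame through a denoising model, and expose the
// cleaned stream as a virtual device that apps pick as their microphone.
#include <array>
#include <cstddef>

constexpr std::size_t kFrameSamples = 480; // 10ms of audio at 48kHz
using Frame = std::array<float, kFrameSamples>;

// Hypothetical stand-ins: real code would use an OS audio API for I/O
// and a Tensor Core-backed neural model for inference.
Frame captureFromMicrophone() { return Frame{}; }        // stub
Frame denoiseOnGpu(const Frame& noisy) { return noisy; } // stub: pass-through
void  pushToVirtualDevice(const Frame&) {}               // stub

void audioLoop(const bool& running) {
    while (running) {
        Frame noisy = captureFromMicrophone();
        Frame clean = denoiseOnGpu(noisy); // keyboard clatter, sirens, fans go here
        pushToVirtualDevice(clean);        // Discord/Zoom/OBS read from this device
    }
}
```

The same loop run in reverse – filtering incoming audio before it hits your speakers – is why the feature works on other people's noise, too.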

In a time where millions of people are working from home, including us, a feature that gets rid of distracting noises is truly a godsend. The only potential problem is that this handy piece of software requires an Nvidia RTX graphics card to use. There is a potential workaround, but because we haven't tested it ourselves, we can't really recommend it. 

What we have tested, however, is RTX Voice itself. We tested it through streaming games on Twitch, a Discord chat with family and even a wealth of meetings. 

These are professional tools now, ok?

Gaming PC? More like meeting PC

One of the more tolerable things about having to work from home all the time is that we get to use our home PC for work – which is packed to the brim with powerful hardware like a Ryzen 9 3900X, 32GB of RAM and most importantly, an Nvidia GeForce RTX 2080 Ti.

So, when we heard that RTX Voice was here, and could utilize the power of our graphics cards in a way that's actually productive, we obviously jumped on that immediately. 

If you want to do the same, you can simply download the client by heading over to this Nvidia page and following the instructions. The whole thing took us maybe 5 minutes and then we were ready to go. 

Once you have it up and running, you'll want to change at least your input device to Nvidia RTX Voice in the sound settings of whatever app you want to use. Keep in mind that because this is in a pretty early beta state, not every app is going to be compatible. While we were generally able to use it with pretty much every program we actually wanted the technology to work with – like Discord, Google Hangouts, Zoom and OBS – setting Windows 10 audio to Nvidia RTX Voice meant that some things just didn't have audio. 

But, hey, it works in all the teleconferencing and streaming apps we have installed right now – which is a lot, so we'll call that a win. 

Works like a charm

Right after downloading RTX Voice, we did what any normal person would do: loaded up Audacity and started making a ton of noise with our keyboard while talking. 

Here's the miraculous part: it works like a freaking charm. Not only did we stop hearing the keyboard while we were talking, but all of the background noise – keep in mind TechRadar's US headquarters are in New York City – simply vanished. 

We then went on a video call with some family with RTX Voice enabled, and while it again worked like a charm, we noticed that our voice dropped in volume with it enabled – a small price to pay. 

Really, though, we've been in so many meetings where people either have sirens going off in the background, keyboard noises or even pets making all kinds of noises. This kind of software is truly a godsend. 

The best part? You can enable it on incoming sound. In our various meetings today, we enabled RTX Voice, and all the distracting noises were gone, both from our end and from other parties. This not only leads to better sound quality, but it's just easier to pay attention to what's happening in a presentation without having to wonder about what some random noise is. 

Should you download RTX Voice?

If you have an RTX-equipped computer that you can get some work done with, RTX Voice is absolutely worth a download. The clarity in both your voice and others makes meetings more comfortable for everyone involved, especially if you live in a typically loud environment. 

Downloading and enabling the software is an extremely painless process, too, so even if it's still in its beta phase, we recommend it to anyone that has the appropriate hardware. 

If you don't have the appropriate hardware, you may be out of luck, however. We already said that there's a workaround if you have a non-RTX Nvidia graphics card, but we haven't tested that fix ourselves, so we really can't make any recommendations on that front. 

All we're saying is that if you're comfortable messing around with some files, that option is open to you. Just keep in mind that this won't open up just any GPU to be used with RTX Voice. It reportedly only works well on Pascal Nvidia GPUs or newer – so if you have something like the Nvidia GeForce GTX 970 or an AMD graphics card, you're probably going to run into some trouble. 

Either way, in our experience, this software has been a dream come true. We'll be using it in pretty much every meeting we have going forward, and will be encouraging our colleagues to do the same. 

And if this convinced you to finally upgrade your GPU, we've got your back there, too. 


Team Fortress 2 source code has leaked, and you can apparently get malware by playing

Update: we have heard from Valve, who assures users that playing on official servers is perfectly safe. We've included their statement in the article below.

The source code for Team Fortress 2 has apparently been leaked, reportedly allowing hackers to deliver malware to other players through Remote Code Execution. 

This leak was initially reported by @SteamDB on Twitter, with the source code in question dating back to 2017 and 2018, affecting Counter-Strike: Source and Team Fortress 2. According to a report on the issue from PCGamesN, several Team Fortress 2 server communities have advised players to avoid the game until further notice. 

Valve has reached out with a comment, saying "We have reviewed the leaked code and believe it to be a reposting of a limited CS:GO engine code depot released to partners in late 2017, and originally leaked in 2018. From this review, we have not found any reason for players to be alarmed or avoid the current builds (as always, playing on the official servers is recommended for greatest security)."

Valve goes on to clarify that it's investigating the problem, and that anyone who has any information can report it on Valve's security page, which explains how to submit reports. 

However, according to @HeavyUpdateOut on Twitter, "Remote Code Execution exploits have already been found". It's important to note, however, that @HeavyUpdateOut is simply a fan account, and while it's unbelievably popular, you should take claims about the extent of the damage with a grain of salt. 

The community has taken the lead with this issue, with a post on the TF2 subreddit warning users away from playing TF2 or CS:GO until the problem is patched out. That post does state that "If you aren't playing on any multiplayer servers you are not at risk" – but it may be best to avoid the affected games entirely. 

We are also hearing unconfirmed reports that all current multiplayer Source-based games may be affected, including Garry's Mod. 

Until Valve comes out and makes a statement or updates the game in some way, this is unconfirmed. But, because this is potentially a danger to your data security, our advice would be to avoid playing until the problem has been properly addressed by Valve. 

We're going to be doing some further investigation on our end, as well, and will update as soon as we get any more information. Until then, maybe it's time to check out one of the best PC games just to play it safe for now. 

This is a developing story.

Why is this so dangerous?

We have to reiterate that reports of Remote Code Execution in Team Fortress 2 and other Valve games are unconfirmed. In fact, in that Reddit thread we mentioned earlier, mod Demoman clarifies that the source code is "an old version and was initially leaked about a year or two ago", and further that "it is unlikely but not impossible that security flaws such as RCE (Remote Code Execution) exist". 

Still, the risk of RCE in the first place is a pretty substantial threat. Through this particularly nasty class of exploit, an attacker can gain full control of your PC and execute any code without your permission. 

WannaCry was a major example of a cyberattack enabled through RCE back in 2017. This was a piece of ransomware that encrypted all the files on victims' PCs, demanding a substantial payment through cryptocurrency. 

So, even if RCE hasn't been actively confirmed, the fact that it's even a possibility in the present state of the game means that it's best avoided. If an attacker is able to pull it off, all of your data is potentially at risk. 



AMD could be launching budget Ryzen processors to challenge Intel Comet Lake

Word on the street right now is that Intel Comet Lake-S could be announced in the very near future, but AMD could be cooking something up to challenge Intel's next-generation desktop processors.

However, rather than pushing its Ryzen 4000 lineup to directly compete with the top end of the Comet Lake spectrum – which will likely involve a 10-core, 20-thread Intel Core i9-10900K – we could see a couple of budget-friendly Ryzen 3000 processors, according to a tweet from renowned hardware leaker @momomo_us. 

According to this leak, we may be seeing an AMD Ryzen 3 3100 and an AMD Ryzen 3 3300X, both of which would be 4-core, 8-thread entry-level 65W processors. This would make a lot of sense, as we haven't seen a Zen 2-based Ryzen 3 processor on desktop yet, as the Ryzen 3 3200G is based on the 12nm Zen+ architecture. 

With the large boost to IPC performance that AMD Zen 2 brings to the table, single-core performance sees a huge jump, which may be why AMD is saving these entry-level products to counter Intel's new chips. Intel's processors typically beat AMD's when it comes to single-threaded performance, and as one of the most common uses for these high-volume entry-level processors is budget gaming PCs, Intel Comet Lake potentially poses a much more substantial challenge to AMD.

Still, we don't even know when Intel Comet Lake processors are actually coming, what the real specs will end up being, or how AMD plans to respond. This is a pretty lightweight leak, so it's best to take it all with a grain of salt. We'll find out what Intel and AMD are planning soon enough – the original date for Computex 2020 is rapidly approaching, after all. 

The budget battle

Some of the most popular PC games on the market right now are still pretty lightly threaded. Games like Fortnite tend to rely on far fewer CPU cores than tech powerhouses like Battlefield V, as this approach keeps the barrier to entry lower for folks who may not have a 12-core Ryzen 9 3900X to slot into a gaming PC. 

If Intel does pad out its Core i3 lineup with budget-friendly processors sporting strong single-threaded performance, it could end up putting AMD in an interesting position. As much as we love talking about the Core i9s and Ryzen 9s of the world, it's impossible to overstate just how important budget-level silicon really is. 

Whenever Intel does finally come out and reveal its Comet Lake lineup, we doubt AMD will rest on its laurels. Whether Team Red comes out with some cheap processors or just drops prices on its existing chips is anyone's guess, but the market for processors is about to heat up. 

Via PC Gamer


Why Crysis Remastered is needed, even if the original still looks so good

Crysis originally hit the market back on November 13, 2007, just about a year after the notorious Windows Vista plagued PCs around the world. The game still looks gorgeous after all these years, managing to look like it belongs in this console generation. But it still needs to be remastered. Badly. Luckily Crysis Remastered is on the way.

For anyone that doesn't know, Crysis was essentially the spiritual successor to the original Far Cry, after Ubisoft bought the rights to that franchise. When Crysis came out, it was notorious for being extremely difficult to run. The game wasn't just demanding for the sake of it, though – it looked leagues better than anything else that came out at the same time. For context, Crysis released about two months after Halo 3 on Xbox 360. 

To this day, whenever I get a new graphics card for my personal use, I go through all three Crysis games just to see how much better the GPU is than my last – in addition to other graphical heavyweights like the Metro and Witcher series. 

However, over the last five or so years, or basically since Windows 10 hit the market, this has become a lot harder to do, as Crysis doesn't play nice with a 64-bit operating system. 

Tell me this doesn't look at least a little like a current-generation game

Jumping OS hurdles 

When I first saw that Crysis Remastered was basically confirmed by a leaked website (before being officially announced just an hour later), I immediately jumped up and installed the original game on my PC.

I realized that, well, it doesn't work right out of the gate. To begin with, the game installs the GameSpy client – a service that died off back in 2013 – which immediately crashed every time it tried to launch because... of course it does. The game requires the files to be there, though, so removing the dead client is a no-go. 

Because I have a bad memory – I went through this whole process a few years ago – I thought my ultrawide monitor's desktop resolution was causing Crysis to crash every time I tried launching it, as that has caused issues with other titles. I'm lucky enough to have access to a ton of hardware, so I just moved to my test bench, which is attached to a 16:9 4K monitor, to see if that would solve things. 

It didn't. PC gaming!

I then tried to set launch options in Steam to get the game to run in DirectX 9 mode, rather than the default DirectX 10. That didn't work either. Going to the forums next, I saw some people suggest trying compatibility mode for Windows XP and Vista, and neither of those worked.

Finally, I stumbled on a YouTube video that walked me through an incredibly convoluted process that involved downloading Crysis Warhead – the short spinoff title launched in 2008 – and moving folders between the two. That finally did it. Kind of. 

That got the game to launch, but after the ridiculous intro cutscene I was greeted with a black screen that wouldn't go away. I had to hit the '~' key to open up the command console and type in "con_restricted 0", hit Enter, then type "map island" and hit Enter again. That finally made the game load and let me play, as shown below. 
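
For anyone stuck at the same black screen, here are those console steps in one place – exactly as described above, though no guarantees on other setups:

```
~                   <- opens the Crysis console
con_restricted 0    <- unlocks restricted console commands
map island          <- manually loads the first level
```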

And, after a process that took me around an hour and a half to get through, I'm finally able to play Crysis. And even after all of that, running the game on modern hardware remains a buggy experience, with visual glitches in all the cutscenes and the game crashing every few checkpoints. 

I'm pretty happy that I get to play through this gem of a game again. But anyone who got caught up in the hype around the original Crysis and wanted to jump on it – especially if they weren't playing a lot of PC games 15 years ago – probably won't be willing to jump through all of these hoops. 


Even explosions break pieces of the environment, sending shrapnel everywhere

Crysis is still gorgeous and a blast to play

Once all the setup and bugs were sorted, I was reminded just how much I adore Crysis. Not only does it still stand up visually – which it absolutely does – but the gameplay somehow doesn't feel dated 13 years after its initial launch. 

The way you're able to seamlessly swap between suit modes to adapt to different situations, the way you're able to customize your weapons on the fly – this arguably ancient game somehow feels more modern than Far Cry 5.

Crysis is still such a blast to play through that I didn't just give up after the 5th time it crashed at a checkpoint. I just kept relaunching the game and soaking up that nostalgia deep into the evening. 

That being said, there are some caveats. Don't think that just because this game is 13 years old it's suddenly easy to run. Because of the constant crashes I couldn't get my benchmark software to play nicely, but I was observing around 60-70 fps at 4K with maxed settings – on an RTX 2080 Ti, and that's with full GPU utilization. 

To put that into perspective, Doom Eternal gets 104 fps on the same graphics card with equivalent settings – and that game hit the streets just a month ago. 

I'm not sure how much harder Crysis Remastered is going to hit graphics cards. But with features like software-based ray tracing, state-of-the-art Depth of Field and other pretty-but-demanding graphics settings, don't expect Crysis Remastered to get any less demanding. 


Shadow work this good shouldn't have been possible 13 years ago – and pretty much wasn't with that generation of hardware

Why Crysis Remastered is so important

Crysis has never sold particularly well, which is probably a major reason why we haven't heard from the franchise in almost seven years. However, Crysis, and games like it, serve a valuable purpose – pushing gaming technology further. 

We've seen a couple games recently that pushed tech just as hard over the last year or so – namely Metro Exodus and Control, both of which already look like next-generation games on PC. Crysis Remastered could see Crytek reclaim its title as the studio behind the most stunning games on the market. 

Beyond that – Crysis is an excellent game and better than either of the sequels. Not a lot of people were able to jump into the game when it originally came out because of the restrictive hardware requirements. Though, I do have to admit I laughed when I saw the Intel Core 2 Duo logo pop up on the intro screen – we've come a long way. 

Crysis Remastered coming out on PS4, Xbox One and Nintendo Switch is going to open up this classic to thousands of people who were never able to play the game themselves. 

So, I for one am looking forward to this game launching in a few months. If nothing else, at least I won't have to spend an hour of my precious time trying to get the game to run on a modern operating system. Even without the shiny graphics bells and whistles being added to the game, being able to get the new version running by just hitting the "play" button in Origin would be a huge improvement. 

Now I just have to wait. 


Crysis Remastered announced, reviving one of the best PC games of all time

In what might be the worst-kept secret in gaming history, Crysis Remastered has been officially announced by Crytek, with a release window and everything. 

After a whirlwind of leaks, Crytek announced the game for PC, Xbox One, PS4 and Nintendo Switch, coming Summer 2020 (June-August, for folks in the Southern Hemisphere). Crysis Remastered will be filled to the brim with technological advancements, with software-based ray tracing chief among them. 

We've reached out to both Crytek and Nvidia about what exactly this means, and will be sure to update this article if we hear back from either company, but it could be a game changer. Most ray tracing found in the best PC games right now is hardware-based, thanks to Nvidia Turing's RT cores that accelerate this computationally heavy workload. 

It's extremely likely that Nvidia's RT cores will still be able to accelerate Crysis Remastered's ray tracing, so that's not a huge deal. The big news here is that software-based ray tracing opens the door for other hardware manufacturers (read: AMD) to get ray tracing running on their hardware without dedicated silicon. 
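
To see why "software-based" is even possible, remember that ray tracing is ultimately just intersection math. Here's a toy example of the kind of test involved – the classic ray-sphere intersection – which RT cores accelerate in fixed-function hardware, but which any GPU or CPU can run as ordinary code. This is purely illustrative, and certainly not Crytek's implementation:

```cpp
// Toy example: the core math of ray tracing is intersection tests like
// this one. RT cores run tests like this (plus BVH traversal) in
// dedicated hardware; "software" ray tracing runs them as ordinary
// shader or CPU code, which any vendor's GPU can execute.
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Returns the distance along the ray to the nearest hit, if there is one.
std::optional<float> raySphereHit(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    Vec3 oc = sub(origin, center);
    float a = dot(dir, dir);
    float b = 2.0f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.0f * a * c;      // discriminant of the quadratic
    if (disc < 0.0f) return std::nullopt;   // the ray misses the sphere
    float t = (-b - std::sqrt(disc)) / (2.0f * a);
    if (t <= 0.0f) return std::nullopt;     // hit is behind the ray origin
    return t;
}
```

A real renderer runs billions of tests like this per frame, which is exactly why doing it without hardware acceleration is such a technical flex.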

Crysis with ray tracing is going to be quite the visual delight. The original game, launched way back in 2007, still looks good to this day, so coupled with all the other visual additions, Crysis Remastered could very well end up being 2020's graphics tech showcase. We'll just have to wait and see, though. 

Look at all that juicy tech

No matter what happens, you should just brace yourself for an onslaught of "can it run Crysis" jokes. 

Besides the always-demanding ray tracing, Crysis Remastered is going to include things like volumetric fog and god rays, along with what Crytek is calling "state-of-the-art" depth of field. Someone who doesn't spend their life surrounded by PC gaming jargon probably won't know what these mean, and that's fine. 

What you need to know is that all of this will make Crysis Remastered look really good, and may push the title more into remake territory than the straightforward remasters we're used to. 

Given that Crytek's teaser trailer opens with a bunch of clipped comments about Crysis's legacy as an extremely demanding PC game, we're putting our money on Crysis Remastered taking that legacy back. 

We were finally able to run the original Crysis at 4K 60fps with everything maxed out on an Nvidia GeForce RTX 2080 Ti – now it's time for Crysis to push that out of reach once more. And, really, we're totally OK with that. 
