
For the first time ever, I’m ready to switch to an AMD graphics card

Fine, AMD. You win. I’m jumping ship.

With the launch of the RX 7900 XTX and 7900 XT, this Nvidia fan was finally convinced to pick up an AMD graphics card as my next upgrade. I can’t believe I’m saying it, but for the first time ever, I couldn’t be more excited to be going Team Red.

I was never a fan of AMD

RX 7900 XTX lying on a textured background.

Yes, I admit — I was never much of an AMD fan. Seeing as PC hardware has always been my thing, I kept up to date with AMD and its rivals in equal measure, but one bad experience with an AMD processor years ago put me off enough that I never really went back. Around 15 years have passed — ancient history, as far as computing is concerned — and beyond testing and building for others, I never owned an AMD CPU or GPU in a build of my own.

Over time, this reluctance toward AMD grew into a habit, and it was often justified — I picked Intel and Nvidia because I trusted them more and their hardware was simply better. This was years before the GPU shortage, when components were still affordable enough that I was OK with spending a little more if it meant putting good parts into my new builds.

Of course, as time went by, AMD improved. With the launch of Ryzen CPUs and RDNA 2 GPUs, I was ready to acknowledge that AMD was solid again, but I still wasn't quite ready to cut the cord and say farewell to Nvidia.

So there I was, an Nvidia fan planning out my next build, until the last few weeks finally broke me. AMD’s launch of the RX 7900 XTX was the final nail in the coffin of my “no AMD” phase.

I tried to stick with Nvidia

Nvidia GeForce RTX 4090 GPU.
Jacob Roach / Digital Trends

Despite the soaring prices during the GPU shortage and the fact that AMD's range was more affordable (even though none of it really was at the time), my upgrade plan had involved an Nvidia card for months. I prepared different builds, ranging from an RTX 3070 Ti to an RTX 3090, and kept an eye on prices — still high in my area — waiting for a deal I'd consider worth it.

But my resolve was slowly melting. There I was, with AMD's graphics cards within reach; perhaps not quite as good as Nvidia's in areas like ray tracing, but still more than sufficient. Still, knowing that both manufacturers would be releasing new lineups this year, I made the common mistake of waiting to see what we'd get instead of building my PC right away.

Cue the RTX 4090. It’s a real beast of a graphics card, with a pretty high power requirement and a much, much higher price. In our testing, the card proved to be pretty incredible in terms of performance, but in my mind, that still wasn’t enough to sway me to spend $1,600 on a graphics card. Not that I had the option to, anyway — despite the price tag, the GPU sold out in minutes, and I’m not going to be giving a few hundred dollars extra to a scalper just to be able to play Cyberpunk 2077 in seamless 4K.

Of course, I could wait for the RTX 4080 — the 16GB version, that is, because Nvidia promptly “unlaunched” the overpriced mistake of a card that was the RTX 4080 12GB. Unfortunately, the version with more memory didn’t convince me, either. Maybe I’m being cheap, or maybe I just want to pay reasonable prices for my hardware; either way, I wasn’t feeling up to it.

A steady decline

Nvidia RTX 4090 power cable.

The last few weeks have been rough for Nvidia, despite the initial success of its new Ada Lovelace generation of GPUs. First came the EVGA controversy — no matter how you spin it, it's just not a good look. Then the controversy surrounding the RTX 40-series GPUs started, and I was quickly running out of ways to defend my own choices.

Jensen Huang, Nvidia’s CEO, said it himself: “The idea that the chip is going to go down in price is a story of the past.” The timing of that statement could not have been worse, given the fact that many Nvidia enthusiasts, myself included, were pretty unhappy with the way Nvidia chose to price the next generation of graphics cards. Huang basically made it clear that things are not going to get any better in that regard.

Now, it turns out that the RTX 4090, and therefore also the RTX 4080, may have some melting issues due to the power adapter. A quick PSA: don’t bend your cables if you want to avoid a fire hazard. Don’t get me wrong — despite these problems, the RTX 4090 does seem pretty outstanding in a lot of ways, and in all likelihood, the RTX 4080 will also be a significant upgrade over the previous gen.

Somehow, that just doesn’t matter to me anymore. After 15 years, it’s time to give AMD another shot.

AMD couldn’t have picked a better time

AMD RX 7900 XTX standing up on a red background.

With the disappointment in Nvidia leaving a bitter taste in my mouth, I found myself getting increasingly excited about the announcement of RDNA 3 GPUs. I'd already toyed with the idea of picking up an AMD CPU for my next PC, and I was ready to make the same choice for the graphics card.

Watching AMD’s announcement, I knew that I was on board. It’s sad that we’re at a time when a $1,000 GPU is a thrilling prospect, but it is — especially if we’re talking about a flagship that will likely become one of the best graphics cards of this generation.

The two new AMD flagships, the Radeon RX 7900 XTX and the RX 7900 XT, sound pretty great. We won't know their true performance until they land in the hands of eager reviewers, but AMD promises a 54% increase in performance per watt over RDNA 2, up to 1.7 times the RX 6950 XT's performance at 4K, DisplayPort 2.1 support (and, by extension, the 8K monitors supposedly coming soon), and second-generation ray tracing that could help it catch up to Nvidia in that regard. AMD also claims that AI performance will be up to 2.7 times better than in the previous generation of GPUs.
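For a sense of scale, here's a quick back-of-the-envelope sketch in Python. The multipliers are AMD's own claims quoted above; the 60 fps baseline is purely a hypothetical placeholder, not a measured result.

```python
# Back-of-the-envelope sketch of what AMD's claimed multipliers would
# mean if they held up in practice. The baseline 4K frame rate below is
# a hypothetical placeholder, not a measured result.

BASELINE_FPS_4K = 60          # hypothetical RX 6950 XT result in some 4K game
CLAIMED_SPEEDUP = 1.7         # AMD's "up to 1.7x faster at 4K" figure
CLAIMED_PERF_PER_WATT = 1.54  # AMD's claimed 54% performance-per-watt gain

projected_fps = BASELINE_FPS_4K * CLAIMED_SPEEDUP

# If both claims held at the same time, the implied power increase is
# roughly speedup / perf-per-watt gain, since performance is growing
# faster than efficiency.
implied_power_ratio = CLAIMED_SPEEDUP / CLAIMED_PERF_PER_WATT

print(f"Projected 4K frame rate: {projected_fps:.0f} fps")
print(f"Implied power draw vs. RX 6950 XT: {implied_power_ratio:.2f}x")
```

Nothing here is a benchmark; it just translates the marketing numbers into something easier to picture — a hypothetical 60 fps title jumping to around 102 fps, at roughly 1.1 times the power.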

AMD is keeping power requirements more conservative, capping the 7900 XTX's board power at 355W, and it isn't using Nvidia's ill-fated 12VHPWR adapter, which so far seems to be the cause of those melted RTX 4090s.

All of that is nice, but the best part is that AMD, unlike Nvidia, didn't raise its prices. The flagship will cost $999 for the reference model, with the 7900 XT coming in at $899.
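Since pricing is the crux of the argument, here's another tiny sketch using only the launch prices cited in this article (no benchmark data assumed): it works out how much of an RTX 4090's performance the RX 7900 XTX would need to deliver just to match it on price per frame.

```python
# Simple value math using only the launch prices cited above; no
# benchmark numbers are assumed.

RTX_4090_PRICE = 1600    # Nvidia's flagship launch price
RX_7900_XTX_PRICE = 999  # AMD's reference flagship launch price

# Break-even point: the fraction of the RTX 4090's performance the
# RX 7900 XTX needs to reach before it wins on price per frame.
break_even_ratio = RX_7900_XTX_PRICE / RTX_4090_PRICE

print(f"Break-even performance: {break_even_ratio:.0%} of an RTX 4090")
```

In other words, even if the RX 7900 XTX ends up well behind Nvidia's flagship, it only has to clear roughly 62% of the RTX 4090's performance to come out ahead on price per frame.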

We don’t all need an RTX 4090

Nvidia and AMD's CEOs side-by-side.
AMD/Nvidia

Some readers may chime in here and tell me that there’s no way the RX 7900 XTX will keep up with the RTX 4090, and in all likelihood, they’d be right. However, the truth is that not all of us need an RTX 4090 — in fact, most of us don’t. There still aren’t many games that really need that kind of power, and even if they do, you can still run them on a cheaper GPU if you sacrifice a little bit of frame rate or take the settings down a notch.

Not many people really need an RTX 4090. Some do, but I am certainly not one of them; at least, not at that price.

I believe that the market needs more of what AMD is serving up, meaning semi-affordable hardware that's accessible to more users, and fewer of the ultra-high-end components that most gamers just can't justify in their build budgets.

AMD’s flagships sound like the perfect middle ground between the expensive enthusiast-only sector and the mid-range segment where you have to compromise on some settings in certain games. They’re likely to run most AAA titles on max settings, but they’re still priced at a level I can get behind.

I’m ready, AMD. It’s going to be nice to see you again.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…