AMD fans be like:
FANcy
Looks more like Fan-A, Fan-B and Fan-C to me.
Just got this card as an upgrade to my 5700xt. It is so good, and REALLY pretty.
Hey, I also have 5700xt. What card is this? And how much of an upgrade is it?
Sapphire 7900xtx Nitro+, and 3x performance boost PLUS far more stable frametimes at the same framerates
You know, I'm kinda a PC fan myself…
Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.
I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.
I knew I should’ve been an engineer, how easy must it be to sit around and make shit that doesn’t work?
Fucking despicable. Do better or die, manufacturers.
Most of the time, the product itself comes out of engineering just fine and then it gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes - in my mind, it’s more of how they’re handled by the company (oftentimes poorly). One of the products my team worked on a few years ago was one that required us to spin up our own ASIC. We spun one up (in the neighborhood of ~20-30 million dollars USD), and a few months later, found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were nearly going to release the product, we discovered a bad flaw in the new ASIC. The products worked for the most part, but of course not always, as the bug would sometimes get hit. My company did the right thing and never released the product, though.
It’s almost never the engineers’ fault. That whole NASA spacecraft that exploded was due to bureaucracy and pushing the mission forward.
Capitalism: “Make as much as possible as fast as possible”
Capitalism: “Growth or die!”
Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥
It’s kind of gallows hilarious that for all the world’s religions worshipping ridiculous campfire ghost stories, we have a creator, we have a remarkable macro-organism mother consisting of millions of species, her story of hosting life going back 3.8 billion years, most living in homeostasis with their ecosystem.
But to our actual, not fucking ridiculous works of lazy fiction creator, Earth, we literally choose to treat her like our property to loot, rape, and pillage thoughtlessly, and continue to act as a cancer upon her eyes wide open. We as a species are so fucking weird, and not the good kind.
Not really, and I say this as a communist myself. Capitalism just requires extracting the maximum profit from the capital investment; sometimes that leads to what you said, sometimes it leads to the opposite (e.g. barely any difference between a 1st-gen i5 and an 8th-gen i5).
It’s not easy to make shit that doesn’t work if you care about what you’re doing. I bet there were angry debates between engineers and business majors behind many of these enshittifications.
Though, for these Intel ones, they might have been less angry and more “are you sure these risks are worth taking?” because they probably felt like they had to push them to the extreme to compete. The angry conversations probably happened 5-10 years ago before AMD brought the pressure when Intel was happy to assume they had no competition and didn’t have to improve things that much to keep making a killing. At this point, it’s just a scramble to make up for those decisions and catch up. Which their recent massive layoffs won’t help with.
I’ve put together 2 computers the last couple years, one Intel (12th gen, fortunately) and one AMD. Both had stability issues, and I had to mess with the BIOS settings to get them stable. I actually had to under-clock the RAM on the AMD (probably had something to do with maxing-out the RAM capacity, but I still shouldn’t need to under-clock, IMO). I think I’m going to get workstation-grade components the next time I need to build a computer.
So this doesn’t apply to the Intel situation, but a good lesson to learn is that the bleeding edge cuts both ways. Meaning that for anyone buying the absolute latest technology, there’s going to be some friction with usability at first. It should never amount to broken hardware like the Intel CPUs, but buggy drivers for a few weeks/months is kinda normal. There’s no way of knowing what’s going to happen when a brand new product is released. The producer must do their due diligence and test for anything catastrophic, but weird things happen in the wild that no one can predict. Like I said at the top, this doesn’t apply to Intel’s situation because it was a catastrophic failure, but if you’re ever on the bleeding edge, assume you’re eventually going to get cut.
Planned Obsolescence is a real problem. But Intel in this situation is just straight up incompetence.
Welcome to capitalism. Infinite growth is required, and when a market is well and truly saturated the next step is cutting more and more costs.
Incidentally, Cancer also pursues a similar strategy.
Ryzen gang
My 7800x3d is incredible, I won’t be going back to Intel any time soon.
deleted by creator
To put this into context, the zen5 X3D chips aren’t out yet so this isn’t really an apples to apples comparison between generations. Also, zen5 was heavily optimized for efficiency rather than speed - they’re only like 5% faster than zen4 (X series, not X3D ofc) last I saw but they do that at the zen3 TDPs, which is crazy impressive. I’m not disagreeing with you about the 7800X3D - I love that chip, it’s def a good one - just don’t want people to get the wrong idea about zen5.
I’m still staying with my 5950X. So many cores!
Not sure how much longer I’ll be using the 5950x tbh. We’ve reached a point where the mobile processors have faster multicore (for the AI 370) than the 5950X without gulping down boatloads of power.
Also on the 7800X3D. I think I switched at just the right time. I’ve been on Intel since the Athlon XP. The next buy would have been 13/14th gen.
Me who bought AMD cpu and gpu last year for my new rig cause fuck the massive mark up for marginal improvement on last gen stats.
Also glad I picked the AMD version when ordering a new laptop yesterday.
is still on the atom n270
So glad I went full AMD for my latest build.
tldr: Flaw can give a hacker access to your computer only if they have already bypassed most of the computer’s security.
This means continue not going to sketchy sites.
Continue not downloading that obviously malicious attachment.
Continue not being a dumbass.
Proceed as normal.
Because if a hacker got that deep your system is already fucked.
It’s more serious than normal because if your PC ever gets owned, a wipe and reinstall will not remove the exploit.
“Nissim sums up that worst-case scenario in more practical terms: “You basically have to throw your computer away.””
Okay, so what gets permanently “owned?” The BIOS on the motherboard, the CPU? is the GPU also hosed?
The CPU is hosed. I assume the throw away is in reference to laptops or mini PCs where the CPU isn’t socketed.
Yeah the Intel issue is definitely a bigger problem. Imo.
I agree.
I’m not that worried about it affecting me lol, I would be more concerned about my Intel CPU dying, especially since the flaw has been around for decades.
I'm a fan of no corporation, especially not fucking AMD, but they have been so much better than Intel recently that I'm struggling to understand why anyone still buys Intel.
Of all the CPU and GPU manufacturers out there, AMD is the most consistently pro-consumer with the least corporate fuckery, so I take mighty exception at your ‘especially not fucking amd’ comment.
Most of the shopping I’ve been helping people with lately has been for laptops. And while there are slightly more AMD options than before, laptops are still dominated by Intel for the most part. Especially if you’re trying to help someone pick something while on a tighter budget.
That's fair, if you're looking at the cheapest laptops basically nothing is AMD. Also, I bet most people don't know what those "powered by X" stickers even mean, nor care, and honestly why should they. I didn't consider that; I was thinking more about people building their own PCs. But it is also weird that laptop manufacturers and OEMs prefer Intel so much. Maybe efficiency is the biggest factor, I know AMD's CPUs tend to be more power hungry.
What’s so bad about AMD? They seem a lot better than Intel imo.
They are bad at writing software, and firmware support is sketchy. That second point is technically the motherboard vendors’ fault, but it could be due to confusing design and documentation on the AMD side. Hardware-wise they are great AFAIK.
AMD has always run really lean in terms of employees, which hurts their quality imo. In 2016 (a year before Ryzen 1 came out, AMD's lowest point quality-wise) Intel had ~100k employees; at the same time AMD had a little over 8,000 and supported a wider portfolio of products. Today AMD is up to about 30k and it shows (although until last week Intel was also up to 130k).
Researchers discover potentially catastrophic exploit present in AMD chips for decades
They’re both very flawed
Despite being potentially catastrophic, this issue is unlikely to impact regular people.
Doesn’t seem very similar to me.
Sounds like some precious and sweet intelboi feels bad holding the bag…
Don’t be a fan of one or the other, just get what’s more appropriate at the time of buying.
Intel has not halted sales or clawed back any inventory. It will not do a recall, period. The company is not currently commenting on whether or how it might extend its warranty.
They may be greedy but they are not stupid. Clearly they calculated that by just ignoring the issue and eating the lawsuits, they save money compared to trying to make an actual solution (whatever that would even look like in the first place)
Gotta love fucking over the consumer twice! They’re gonna get, what, $5 out of a class action? $5 and a burned out cpu, yay!
Is the issue the heat created by these high voltages? I noticed pretty quickly the 13600k was a lot hotter than I anticipated and slapped a much beefier cooler on it to bring that down.
This keeps getting slightly misrepresented.
There is no fix for CPUs that are already damaged.
There is a fix now to prevent it from happening to a good CPU.
But isn’t the fix basically underclocking those CPUs?
Meaning the “solution” (not even out yet) is crippling those units before the flaw cripples them?
They said the cause was a bug in the microcode making the CPU request unsafe voltages:
Our analysis of returned processors confirms that the elevated operating voltage is stemming from a microcode algorithm resulting in incorrect voltage requests to the processor.
If the buggy behaviour of the voltage contributed to higher boosts, then the fix will cost some performance. But if the clocks were steered separately from the voltage, and the boost clock is still achieved without the overly high voltage, then it might be performance neutral.
I think we will know for sure soon, multiple reviewers announced they were planning to test the impact.
Thanks for the clarification
That was the first “Intel Baseline Profile” they rolled out to mobo manufacturers earlier in the year. They’ve rolled out a new fix now.
As an i9-13900k owner, thanks. My chip has been great so far; I’d better update when I get home.
I have a i7-13700k that’s been sitting in the box since I got a deal on it last month. I was pondering returning it and spending the extra couple hundred to get an AMD setup.
I’ve been following all this then checked on the Asus site for my board and saw the BIOS updates…
Updated with microcode 0x125 to ensure eTVB operates within Intel specifications…
And this week there’s a beta release…
The new BIOS includes Intel microcode 0x129…
Remember Spectre? When they recommended disabling hyperthreading?
Not out yet. But you can manually set your clocks and disable boost.
Not out yet.
Actually, the 0x129 microcode was released yesterday; now it depends on which motherboard you have and how quickly they release a BIOS that packages it. According to AnandTech, Asus and MSI already released theirs before Intel made the announcement. I see some for Gigabyte and ASRock too.
So, not out yet. At least not fully.
If you prefer being right, rather than just accepting the extra information, then sure let’s go with that.
For CPUs nothing beats AMD
For years, Intel’s compiler, math library MKL and their profiler, VTune, really only worked well with their own CPUs. There was in fact code that decreased performance if it detected a non-Intel CPU in place:
https://www.agner.org/optimize/blog/read.php?i=49&v=f
That later became part of a larger lawsuit, but since Intel is not discriminating against AMD directly, but rather against all other non-Intel CPUs, the result of the lawsuit was underwhelming. In fact, it’s still a problem today:
https://medium.com/codex/fixing-intel-compilers-unfair-cpu-dispatcher-part-1-2-4a4a367c8919
Given that the MKL is a widely used library, people also indirectly suffer from this if they buy an AMD CPU and utilize software that links against that library.
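The dispatcher trick described above is easy to illustrate. Here's a minimal, hypothetical Python sketch (not Intel's actual code; all names are made up for illustration) of the difference between a fair feature-flag dispatcher and one that also keys off the CPUID vendor string:

```python
# Hypothetical sketch of a vendor-keyed dispatcher (not Intel's actual code).
# A fair dispatcher branches on feature flags alone; the unfair variant
# additionally requires the vendor string to read "GenuineIntel".

def fast_kernel(xs):
    # Stands in for an optimized code path (e.g. AVX2 vectorized).
    return sum(xs)

def generic_kernel(xs):
    # Stands in for the slow baseline path.
    total = 0
    for v in xs:
        total += v
    return total

def dispatch(vendor, has_avx2, fair=True):
    """Pick a kernel based on reported CPU info."""
    if fair:
        return fast_kernel if has_avx2 else generic_kernel
    # Unfair: the feature flag only counts if the vendor is Intel.
    if vendor == "GenuineIntel" and has_avx2:
        return fast_kernel
    return generic_kernel

# An AMD CPU ("AuthenticAMD") with AVX2 gets the slow path under the
# unfair policy, even though it supports every instruction the fast
# path needs.
assert dispatch("AuthenticAMD", True, fair=False) is generic_kernel
assert dispatch("GenuineIntel", True, fair=False) is fast_kernel
assert dispatch("AuthenticAMD", True, fair=True) is fast_kernel
```

This also shows why the lawsuit outcome was underwhelming: the check discriminates against every non-"GenuineIntel" vendor string equally, not against AMD by name.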
As someone working in low-level optimization, that was/is a shitty situation. I still bought an AMD CPU after the latest fiasco a couple of weeks ago.
Honestly, even with GPUs now too. I was forced to go team green for a few years because AMD was so far behind. Now though, unless you absolutely need a 4090 for some reason, you can get basically the same performance from AMD for 70% of the cost.
I haven’t really been paying much attention to the latest GPU news, but can AMD cards do ray tracing and dlss and all that jazz that comes with RTX cards?
DLSS is off the table, but you CAN raytrace. That being said I do not see the value of RT myself. It has the greatest performance impact of any graphical setting and often looks only marginally better than baked in lighting.
It depends greatly on the game. I’ve seen a huge difference in games like Control where the game itself was used to feature that… Well… Feature! You can see it in the quality of the lighting and the reflections. You also get better illumination on darker areas thanks to radiated lighting. It’s much more natural looking.
DLSS is a brand name; both AMD and Intel have their own version of the same thing, and they are only a little worse, if at all.
Yes, but by different names. They use FSR, which is basically the same thing; I haven’t noticed a difference in quality. Ray tracing too, just not branded as RTX.
There is analogous functionality for most of it, though it’s generally not quite as good across the board.
FSR is AMD’s answer to DLSS, but the quality isn’t quite as good. However the implementation is hardware agnostic so everyone can use it, which is pretty nice. Even Nvidia’s users with older GPUs like a 1080 who are locked out of using DLSS can still use FSR in supported games. If you have an AMD card then you also get the option in the driver settings of enabling it globally for every game, whether it has support built in or not.
Ray tracing is present and works just fine, though their performance is about a generation behind. It’s perfectly usable if you keep your expectations in line with that though. Especially in well optimized games like DOOM Eternal or light ray tracing like in Guardians of the Galaxy. Fully path traced lighting like in Cyberpunk 2077 is completely off the table though.
Obviously AMD has hardware video encoders. People like to point out that the visual quality of them is lower than Nvidia’s, but I always found them perfectly serviceable. AMD’s background recording stuff is also built directly into their driver suite, no need to install anything extra.
While they do have their own GPU-powered microphone noise removal, a la RTX Voice, AMD does lack the full set of tools found in Nvidia Broadcast, e.g. video background removal and whatnot. There is also no equivalent to RTX HDR.
Finally, if you’ve an interest in locally running any LLM or diffusion models they’re more of a pain to get working well on AMD as the majority of implementations are CUDA based.
I disagree. Processing power may be similar, but Nvidia still outperforms with raytracing, and more importantly DLSS.
What’s the point of having the same processing power, when Nvidia still gets more than double the FPS in any game that supports DLSS?
FSR exists, and FSR 3 actually looks very good when compared with DLSS. These arguments about raytracing and DLSS are getting weaker and weaker.
There are still strong arguments for nvidia GPUs in the prosumer market due to the usage of its CUDA cores with some software suites, but for gaming, Nvidia is just overcharging because they still hold the mindshare.
I had the 3090 and then the 6900xtx. The differences were minimal, if even noticeable. Ray tracing is about a generation behind from Nvidia to AMD, but they’re catching up.
As the other commenter said too, FSR is the same as DLSS. For me, I actually got a better frame rate with FSR playing Cyberpunk and Satisfactory than I did with DLSS!
Are you just posting this under every comment? This isn’t even a fraction as bad as the Intel CPU issue. Something tells me you have Intel hardware…
I switched to AMD largely for better battery performance, but this makes me feel like I dodged a bullet.
Just out of curiosity, when you say better battery performance, what kind of battery are we talking about? Is this in a laptop, or a desktop on some sort of remote/backup system?
Laptop.
I see, so is it a known thing that AMD CPU laptops generally have better battery life? I always see arguments for one CPU/GPU over another because of better power consumption, but I’ve never been in a position where I needed to worry much about it, so I’ve never looked much into the claims.
Seemed that way when I was shopping last, but that was over a year ago so I can’t cite sources. Supposedly their low mode uses less power and runs faster than Intel’s. I can’t confirm the faster part but it definitely lasts longer on battery power than any of the Intel laptops I’ve owned.
I see, thanks for the info
AMD CPUs indeed have better efficiency when it comes to energy used, or so I always hear.
I thought the point would be a depressed and self deprecating “I’m something of an Intel CPU myself”.
Glad my first self-built PC is full AMD (built about a year ago).
Screw Intel and Nvidia
7700X is what it was built with
This. Full AMD on my last build as well.
I don’t care about any corp, I was looking at best bang for buck at the time. I was shocked how everyone I knew was like you should get this intel or that Nvidia, and when I asked why not <comparable performance AMD at 2/3 the price>, all I was getting back was marketing blabber.
How many times are you going to post the same, unrelated link ??
I’d rather have an exploit than a hardware failure.
r/ayymd
Amd 5700u and fx8150 still going strong
I loved my FX cpu but I lived in a desert and the heat in the summer coming off that thing would make my room 100F or more. First machine I built a custom water loop for. Didn’t help with the heat in the room, but did stop it from shutting down randomly, so I could continue to sit in the sweltering heat in my underpants and play video games until dawn. Better times.
You might want to go through the trouble of extending that radiator loop all the way out through a window.
Trust me, if the parents would have let me punch some holes through the walls to get rid of some heat, I would have.
I had the FX8350 Black Edition, and that thing would keep my room at 70f… In the winter… With a window open.
Summer gaming was BSOD city. I miss it so much.
Of course it didn’t help the heat in the room; the heat from the CPU still has to go somewhere. Better coolers aren’t for the room, they’re for the CPU. In fact, a better cooler could make the room hotter, because it is removing heat from the CPU at a higher rate and dumping it into the room.
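To put a rough number on that point, here's a back-of-the-envelope sketch; the wattage, room size, and gaming session length are illustrative assumptions, not measurements, and a real room leaks heat, so this is an upper bound:

```python
# Back-of-the-envelope heat math with assumed, illustrative numbers.
# Every watt the PC draws eventually ends up as heat in the room,
# no matter how good the cooler is.

pc_power_w = 350          # assumed total draw: FX-era CPU + GPU under load
hours = 5                 # one evening of gaming

energy_j = pc_power_w * hours * 3600   # total heat dumped, in joules
heat_kwh = pc_power_w * hours / 1000   # same, in kWh

# A small sealed bedroom: ~30 m^3 of air.
room_volume_m3 = 30
air_density_kg_m3 = 1.2
air_heat_capacity = 1005  # J/(kg*K), specific heat of air

air_mass_kg = room_volume_m3 * air_density_kg_m3
temp_rise_c = energy_j / (air_mass_kg * air_heat_capacity)

print(f"{heat_kwh:.2f} kWh of heat; with zero ventilation the air "
      f"alone would rise ~{temp_rise_c:.0f} °C")
```

The no-ventilation number comes out absurdly large, which is the point: the only thing keeping the room livable is heat escaping through walls, windows, and airflow, not the CPU cooler.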
I wanna switch to amd some day
Can we talk about how utterly useless that default CPU cooler is? Like, for a relatively high-end gaming CPU, it really shouldn’t be legal for it to ship with something so useless.