crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 12, 2024 21:38:18 GMT
Wouldn't that mean any installs that hopped through that loophole now fall over and become e-waste?
I have 3 laptops and all of them are older hardware where I used Rufus to install windows. None have fallen over yet (not saying they won't at some point though)
|
|
malek86
Junior Member
Pomegranate Deseeder
Posts: 3,266
|
Post by malek86 on Sept 15, 2024 14:07:58 GMT
Finally got my Ryzen 5700X3D. Upgrading wasn't as hard as I expected - I forgot my aftermarket cooler is set up so that it can be removed without any need to take out the mobo.
All I've been able to try so far is the Assassin's Creed Odyssey Discovery Tour, and my CPU time went from 14-16ms (with the 3600) to 10-11ms. So that's a big improvement to be sure. Even if the slow RAM were a limiting factor, it would be fine for my 60hz monitor until the end of the generation at least. So with relatively little money spent, I can definitely press on for another few years. When UE6 games start coming out on the PS6, I'll just upgrade the entire system.
|
|
Blue_Mike
Full Member
Meet Hanako At Embers
Posts: 5,408
|
Post by Blue_Mike on Sept 15, 2024 16:44:14 GMT
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 17, 2024 8:06:17 GMT
Most of the 650 boards seem to be PCIe 4 for storage, with a select few offering 5. The ones that offer 5 seem to be up there in the £200 range, at which point isn't it just better to get a 670 board?
Am I missing something?
|
|
|
Post by Fake_Blood on Sept 17, 2024 8:41:59 GMT
Probably some common sense.
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 17, 2024 10:10:02 GMT
Helpful.
The last time I looked at boards it was the difference between the 570 and 550. The 550 still offered the same speed storage option, just less of it, whereas this appears not to offer the same depending on the board as far as I can tell, and then we have the E variants.
|
|
|
Post by uiruki on Sept 17, 2024 11:03:49 GMT
The ones offering PCIe 5 will be B650E, because AMD wanted another variant to upsell on the stack. The main other connectivity difference between B650E and X670 is USB and SATA, which have double the capacity on the X670 chipset: two 20Gbps USB ports and twelve at 10Gbps on the 670s, versus one and six on the 650. B650E gives you more PCIe 5 lanes than the non-E X670, but slightly fewer overall (B650E has 36 lanes, of which 24 can run at Gen5 speeds, while X670 is 44/8 and X670E is 44/24). A single NVMe drive uses up to 4 lanes and a graphics card 16, so you've got room for two full-speed PCIe 5 drives and your graphics card, then another 12 or 20 lanes of PCIe 4. Only 8 Gen5 lanes on the X670 seems bad at first, but even the 4090 only runs at PCIe 4 speeds, so you're unlikely to be hamstrung by that. That makes X670E hard to justify, depending on pricing. My take, as someone who isn't in the market: if you want to use PCIe 5 NVMe drives, then consider B650E only if it's cheaper than X670, otherwise just go with B650. If you need the extra lanes and USB on X670/E, then you already know you need them. After that, choose your board by the features you want and the number of M.2 sockets.
In short: Just want a computer that runs all your peripherals at full speed? B650. Do you really want PCIe 5 storage, something no games currently take advantage of? B650E or X670, whichever is cheaper.
Do you want a halo product that, unless you're doing workstation or heavy-duty networking stuff, you'll definitely never use at all? That's what X670E is for.
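The lane arithmetic above works out like this as a quick script. Rough sketch only: the numbers are the ones quoted in the post, and board vendors carve the lanes up differently in practice.

```python
# Lane budgets from the post: (total usable lanes, Gen5-capable lanes).
PLATFORMS = {
    "B650E": (36, 24),
    "X670":  (44, 8),
    "X670E": (44, 24),
}

def gen5_drive_slots(name, gpu_on_gen5):
    """Full-speed Gen5 NVMe drives (x4 each) left after wiring up the GPU."""
    total, gen5 = PLATFORMS[name]
    storage_gen5 = gen5 - 16 if gpu_on_gen5 else gen5  # a GPU takes up to x16
    return max(storage_gen5, 0) // 4

# B650E: Gen5 x16 GPU plus two full-speed Gen5 drives, 12 lanes left over for Gen4
assert gen5_drive_slots("B650E", gpu_on_gen5=True) == 2
# X670 non-E: GPU runs at Gen4 (fine even for a 4090), 8 Gen5 lanes = two drives
assert gen5_drive_slots("X670", gpu_on_gen5=False) == 2
# X670E: same two-drive budget as B650E, just with 20 Gen4 lanes spare instead of 12
assert gen5_drive_slots("X670E", gpu_on_gen5=True) == 2
```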
|
|
|
Post by Fake_Blood on Sept 17, 2024 11:51:49 GMT
helpful the last time i looked at boards its was the diff between 570 and 550. the 550 still offered the same speed storage option just less of it where-as this appears to not offer the same depending on board as far as I can tell and then we have E variants.
Only joking of course, didn't we have virtually the same enthusiast level pc?
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 17, 2024 12:03:53 GMT
Pretty much, I'm so enthusiastic.
I'm still AM4 with a 5800x3d and you went whole hog and got a 7800 didn't ya?
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 17, 2024 12:05:14 GMT
uiruki appreciate that, thanks. It's pretty much what I was understanding, especially after digging into it a bit more this morning, but I needed the sanity check.
|
|
|
Post by Fake_Blood on Sept 21, 2024 11:23:06 GMT
Pretty much, I'm so enthusiastic. I'm still AM4 with a 5800x3d and you went whole hog and got a 7800 didn't ya?
Yeah I went AM5 with the new PCI whatever thing so I can switch to the faster NVMe drives whenever that'll make sense or make a measurable difference. I was switching from Intel though so needed new everything anyways.
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 26, 2024 20:34:05 GMT
Pinch of salt time people
videocardz.com/newz/nvidia-geforce-rtx-5090-and-rtx-5080-specs-leaked
GeForce RTX 5090 to feature 21760 CUDA cores, 32GB GDDR7 memory and 600W, RTX 5080 gets 16GB VRAM. Coming from Kopite7kimi himself.
One of the most reliable NVIDIA leakers has now confirmed the specs for two of NVIDIA's upcoming Blackwell graphics cards, specifically the RTX 5090 and RTX 5080. According to Kopite7kimi, the RTX 5090 is now said to feature a 512-bit memory bus, not 448-bit as mentioned earlier. The RTX 5090 will also feature 32GB of GDDR7 memory.
RTX 5090 with 170 SMs, 32GB VRAM, return of 512-bit bus
The flagship Blackwell gaming model is expected to use the GB202-300 GPU with 21,760 FP32 (CUDA) cores, which is fewer than the 24,576 cores of the full chip (a reduction of 13%). Additionally, the card is rumored to have a 600W spec, though the leaker has not confirmed whether that's the TBP or TGP, as the two refer to different metrics. Interestingly, although the power has increased from
|
|
X201
Full Member
Posts: 5,148
|
Post by X201 on Sept 26, 2024 21:49:54 GMT
|
|
X201
Full Member
Posts: 5,148
|
Post by X201 on Sept 26, 2024 21:53:35 GMT
I’ve been saving for ages to make the leap from my old 1080, I’ll be getting one for my rebuild - after I’ve picked myself up from the floor after seeing the price. The extra ram will be great for me - my 3D rendering software collapses if there are too many assets in my scenes.
|
|
|
Post by Chopsen on Sept 26, 2024 22:21:59 GMT
The last two generations of GPUs have had little in the way of actual hardware innovation: they've just been physically bigger and drawn more power to achieve any performance improvement. Oh, and some fancy pants stuff on the driver side to make it look like there were more frames and details being drawn than there actually were.
|
|
|
Post by captbirdseye on Sept 27, 2024 7:23:52 GMT
32GB of GDDR7 will be expensive as hell.
|
|
Frog
Full Member
Posts: 7,304
|
Post by Frog on Sept 27, 2024 7:41:20 GMT
I'm not sure that's fair, the frame generation is one of the biggest innovations in GPU technology for quite a while.
|
|
|
Post by Vandelay on Sept 27, 2024 7:51:33 GMT
Frame gen is pretty great. Even just for efficiency rather than actual performance gains. I'm playing Frostpunk 2 at the moment and I have locked it at 60fps and enabled frame gen. Gone from using 90%+ of my GPU to only 50-60%. It's not something I would do for an action game, as it means the game is really running at 30fps-ish, but perfect for a strategy/management game.
Of course, great for performance too. Even on my 4090 it can be hard to get the most out of my 144hz screen, even though it is only 1440p (ultra wide). Stick on frame gen though and it is no problem.
|
|
|
Post by dfunked on Sept 27, 2024 8:14:11 GMT
My 750w PSU is sweating right now...
In fairness I probably don't use my PC enough to justify upgrading my trusty 3080. VRAM limits aside it's still more than adequate for me.
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 27, 2024 8:57:02 GMT
Some wild rumour going around that it'll require 2 of the new 16-pin power connectors, which surely can't be true.
|
|
Tomo
Junior Member
Posts: 3,542
|
Post by Tomo on Sept 27, 2024 9:09:31 GMT
My 750w PSU is sweating right now... In fairness I probably don't use my PC enough to justify upgrading my trusty 3080. VRAM limits aside it's still more than adequate for me.
Yeah, same. My 3080 still feels like a monster tbh, unless I want to see nose hairs from 50ft. I'm also not convinced there are enough games coming out that actually need these bonkers cards. GPUs feel ahead of the games themselves atm in my mind.
|
|
|
Post by Vandelay on Sept 27, 2024 9:30:20 GMT
I upgraded my 500w PSU when I went from my 1070 to a 2080 Super. Picked up an 850w thinking that will be more than enough... then I got a 4090 and it is only just meeting the requirements. I wouldn't want to put anything beefier in than that and the 5090 certainly looks like it is even more power hungry.
I do actually cap the 4090 at 80% power usage, as it becomes massively inefficient above that (I think some even drop it down to 70% with minimal effect). I did have it go full tilt initially and the PSU was fine, but I expect it would run into issues without the cap.
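For anyone wanting to try the same cap on Linux, here's a minimal sketch with nvidia-smi. The 450 W default below is an assumption for a stock 4090; query your own card's actual limits first with `nvidia-smi -q -d POWER`.

```shell
# Compute an 80% power cap from an assumed 450 W default limit.
DEFAULT_W=450
PCT=80
CAP_W=$(( DEFAULT_W * PCT / 100 ))
echo "Would set power limit to ${CAP_W} W"

# Applying it needs root, and it resets on reboot unless you script it:
# sudo nvidia-smi -pl "$CAP_W"
```

On Windows the usual route is the power limit slider in MSI Afterburner instead.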
|
|
X201
Full Member
Posts: 5,148
|
Post by X201 on Sept 27, 2024 12:44:37 GMT
I’ve just been reading about a 2200W PSU. It’s bonkers that we’re getting close to electric kettle levels.
|
|
|
Post by Fake_Blood on Sept 27, 2024 13:36:01 GMT
I also have 850 watts from my previous build. It’s one of the reasons why I went from a 9900K to a 7800X3D, that thing uses like 60 watts when gaming.
|
|
Blue_Mike
Full Member
Meet Hanako At Embers
Posts: 5,408
|
Post by Blue_Mike on Sept 27, 2024 16:17:47 GMT
Also on 850w. Fucking hell, we should start installing solar panels connected to independent generators before long if we want to keep our 'leccy bills down after upgrading to new hardware.
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 27, 2024 17:48:14 GMT
I swapped my 750 to a 1000w because my 4090 was causing it to whine plus the added benefit of a direct power cable without splitter/adapters.
Might need to have a separate PSU just for the GPU in the future at this rate
|
|
|
Post by Chopsen on Sept 27, 2024 18:59:32 GMT
You're going to need a separate *wall socket* for your GPU at this rate. It's fucking nuts.
|
|
crashV👀d👀
Junior Member
not just a game anymore...
Posts: 3,892
|
Post by crashV👀d👀 on Sept 27, 2024 23:12:18 GMT
|
|
|
Post by Hanimalle on Sept 28, 2024 9:13:45 GMT
Hey everyone, I'd be interested in hearing your advice about the new gaming pc I'm considering buying. Before that though, here's my current configuration so you get an idea of what I'm currently working with:
CPU: Intel Core i5-9600K
CPU Cooler: MSI Core Frozr L
RAM: 16 GB DDR4 3000 MHz
Graphics Card: NVIDIA GeForce RTX 2080
Case: Cooler Master MasterCase H500
Power Supply: EVGA 750 GQ
SSD: Crucial MX500 - 1 TB
Motherboard: MSI MPG Z390 GAMING EDGE AC
As for why I'm seriously considering investing in a new system, I've noticed that my computer does seem to struggle a little bit in more recent games such as Pacific Drive for example and there are a few upcoming games that I'm interested in that will probably be similarly too demanding for my current pc. My native resolution is 2560x1440 and my screen can go up to 144 Hz.
And so here's what my current hardware wishlist looks like, tell me if something could or should be changed:
CPU: Intel Core i5-14600KF
CPU Cooler: MSI MAG CORELIQUID 360R V2
RAM: G.Skill Ripjaws M5 RGB Black - 2 x 16 GB (32 GB) - DDR5 5600 MHz - CL30
Graphics Card: MSI GeForce RTX 4070 Ti SUPER GAMING X SLIM
Case: MSI MPG Gungnir 110R
Power Supply: MSI MPG A850G PCIE5 - Gold
SSD: Samsung 990 PRO - 2 TB
Motherboard: MSI MAG Z790 TOMAHAWK WIFI
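For what it's worth, a rough headroom check on that 850 W PSU against the list above. The wattage figures are ballpark assumptions on my part, not spec-sheet numbers, so treat this as a sketch:

```python
# Rough PSU headroom check -- the draws below are ballpark assumptions.
PSU_W = 850

draws = {
    "CPU (i5-14600KF, peak)": 180,
    "GPU (RTX 4070 Ti SUPER, TGP)": 285,
    "Motherboard/RAM/SSD/AIO/fans": 75,
}

peak = sum(draws.values())
headroom = PSU_W - peak
print(f"Estimated peak draw: {peak} W, headroom: {headroom} W")
assert headroom > 0.3 * PSU_W  # comfortable margin for transient spikes
```

By that rough maths the 850 W unit has plenty of margin for this build.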
|
|
|
Post by Chopsen on Sept 28, 2024 9:18:29 GMT
An Intel CPU wouldn't be my first choice, given that Intel's flagship range has a high failure rate, bugs, and awful power consumption.
Liquid cooling is a popular option. I have a Corsair AIO. If I had my time again, I'd go for a Noctua and stick with air cooling. I was hoping the AIO would give quieter cooling; it's actually quite loud.
|
|