Post by crashV👀d👀 on Apr 4, 2024 23:50:44 GMT
I think once they get a 38" 3840x1600 OLED I'll have to bite on one of those. I had that in LCD form briefly (sent back for dead pixels) and it's a fantastic size for working.

Explains why I can't find one. I've just been looking around for one as I assumed this was now a thing, but they're all 1440p. I want that step up in image but don't want to go down in res.

EDIT: I also want G-Sync Ultimate cos reasons. I think my monitors have fans in them but I've never heard them.
Post by barchetta on Apr 5, 2024 6:21:15 GMT
This MSI has been silent too.
Post by Blue_Mike on Apr 10, 2024 12:36:32 GMT
Overclockers taking no shit today.
Post by dfunked on Apr 10, 2024 12:41:16 GMT
Hah, what a shithead. Fair enough, try to scam someone if that's what floats your boat, but putting that shit on Reddit is just plain moronic.
Post by Derblington on Apr 26, 2024 21:31:09 GMT
So, I finally changed up from dual 27" 1440p monitors to a 42" LG C3 OLED. It's freaking huge. It's going to take a bit of getting used to for me but, like my TV in the Home Theater thread, my gf has immediately fallen for it and wants to play games in my office.
Post by muddyfunster on Apr 26, 2024 21:45:04 GMT
You are so lucky to get that reaction. I just get eye rolls.
Post by uiruki on Apr 26, 2024 21:54:42 GMT
If your eyes can handle it, 4K at 100% scaling is some great desktop room. That said, it's only a little bit more than two 1440p monitors!
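(A quick back-of-the-envelope check on that, purely for illustration: the pixel counts really are close.)

```python
# Rough pixel-count comparison: a single 4K desktop vs two 1440p monitors.
# Illustrative arithmetic only.
uhd = 3840 * 2160              # one 4K panel
dual_qhd = 2 * (2560 * 1440)   # two 27" 1440p panels

print(f"4K:         {uhd:,} px")             # 8,294,400
print(f"Dual 1440p: {dual_qhd:,} px")        # 7,372,800
print(f"Ratio:      {uhd / dual_qhd:.2f}x")  # ~1.13x more desktop area
```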
Post by Derblington on Apr 26, 2024 22:13:08 GMT
I'm currently running 4K at 100%, yeah. I'll try it out for a week. If I get eye strain or any other effects I'll bump it up to 125%.
Post by baihu1983 on May 2, 2024 14:00:45 GMT
Any sites that let you put in recommended specs and then list any gaming laptops that meet said requirements?
Post by X201 on May 29, 2024 15:44:02 GMT
Haven't built my own machine for ages, so it's nice to see that nothing has changed and motherboard manufacturers are still doing silly things like halving the bandwidth of PCIe slots should you happen to plug in a second card or M.2 drive.
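(For anyone wondering what that halving costs in practice, here's a rough sketch with generic, rounded per-lane figures; actual boards and lane splits vary, so treat the numbers as ballpark only.)

```python
# Ballpark one-direction PCIe bandwidth per lane, in GB/s (rounded figures).
PER_LANE_GBPS = {"3.0": 1.0, "4.0": 2.0, "5.0": 4.0}

def slot_bandwidth(gen: str, lanes: int) -> float:
    """Approximate total bandwidth for a slot running `lanes` lanes."""
    return PER_LANE_GBPS[gen] * lanes

# A GPU slot dropping from x16 to x8 when a second device shares its lanes:
print(slot_bandwidth("4.0", 16))  # ~32 GB/s with the full x16
print(slot_bandwidth("4.0", 8))   # ~16 GB/s once the lanes are split
```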
Post by Fake_Blood on May 29, 2024 16:38:06 GMT
That's very much on Intel and AMD; they don't want you to build a server with consumer hardware.
Post by brokenkey on May 29, 2024 18:00:10 GMT
Thoughts, please. My older computer has stopped doing video output. Fans are all spinning, and I've confirmed the graphics card is OK in another computer. I've tried a different cable to the monitor, and a different monitor with another cable. Still no image.
What else might be broken?
Post by Chopsen on May 29, 2024 18:20:52 GMT
Mobo set to output on integrated graphics?
Otherwise, PCIe slot broken?
Post by Fake_Blood on May 29, 2024 18:33:13 GMT
Blown PCIe power bus: the card is getting power from the slot, but not from the external power connectors. If you're lucky, the PSU will have another PCIe cable.
Post by malek86 on Jun 29, 2024 12:15:43 GMT
I'm amazed at how expensive B650 mobos still are. DDR5 sticks are getting cheaper, but 32GB still costs a fair bit of money. My old plan of getting a Ryzen 5700X3D as a holdout until the next console gen hits is starting to look somewhat appealing again.
Post by uiruki on Jun 29, 2024 12:36:29 GMT
I still think the biggest determining factor in CPU choice is the refresh rate of your monitor and the frame rate you're aiming for. Going from a 3900X to a 5800X3D was much more impactful than I expected because it made it so much easier to stay at or above 100fps, but if I was only aiming for 60 then the difference would be pretty marginal.
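(Putting rough numbers on that: the per-frame budget shrinks quickly as the target climbs, which is why CPU headroom matters far more at 100fps+ than at 60. Illustrative arithmetic only.)

```python
# Frame-time budget for a few common frame-rate targets.
for fps in (60, 100, 120, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
# 60 fps  -> 16.67 ms
# 100 fps -> 10.00 ms
# 120 fps ->  8.33 ms
# 144 fps ->  6.94 ms
```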
Post by malek86 on Jun 29, 2024 12:56:08 GMT
Given that I only have a 6600XT and a 60Hz monitor, it would barely make a difference, unless I started playing games like Flight Simulator and stuff. The fact that I'm still running 2666MHz RAM is annoying, so I might get fewer 1% low frametime spikes, but probably nothing too impactful.
Still, assuming I do upgrade my GPU in a couple of years to something at least on the level of a 7800XT, it could be mighty useful to have around. There's no guarantee this kind of CPU will still be available next year, for example.
But yes, right now it feels a bit like a waste of money. If I want to spend something, it would be better to get a new smartwatch to replace my aged Gear Sport. But that's another thing that feels more expensive than it needs to be. Man... it's not just computer parts, tech in general has got so expensive in the last few years.
Post by crashV👀d👀 on Jul 23, 2024 9:04:10 GMT
Really good testing video demonstrating just how shit low-VRAM (8GB) cards can be when dropped into older PCIe 2.0 or 3.0 systems.
Post by AgentHomer on Jul 23, 2024 11:08:53 GMT
Due to me mostly gaming on Xbox, and Microsoft seemingly being intent on doing their best to ruin the brand, I am still humming and hawing about switching over to PC, and the Steam Deck I bought just didn't hit the spot. So I'm thinking either a ROG Ally X or a proper PC. I fear my budget won't stretch beyond a system with a 4070 Super (and even that is pushing it) from the pre-builds I am looking at (don't judge, I've never built one before and, now being in my 40s, I feel inept at attempting it). Is a 12GB GPU going to be sufficient, or would I be better going down the AMD route and getting a card with more VRAM but less ray tracing goodness?
Post by Vandelay on Jul 23, 2024 11:36:17 GMT
I've not watched it, but I suspect the video above might help provide you with an answer about VRAM. The correct card obviously depends on the resolution and settings you are going for. Assuming we are just talking 60fps, I would think you would be set at 1440p with a 4070. Not sure about RT on there, but I expect with DLSS enabled it should be good, depending on the game and type of RT used. If you are going 4K, you might need DLSS dialled up there to be consistent and you would need to dial down the RT, but it could be fine for some light RT usage (such as shadows).

12GB should be OK, I think. It is only really at 8GB where you really have issues.

I would recommend sticking with Nvidia if you can, as DLSS is much better than the AMD equivalent. RT is pretty much a no-go on AMD too. Having said that, it's always worth checking benchmarks and seeing if your use case is going to be fine. AMD have some great hardware for rasterised rendering, and if you aren't going to need upscaling then they are a good choice (and the issues with upscaling are mostly game dependent and often may not be noticeable).
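(For context on why DLSS buys so much headroom: it renders internally at a lower resolution and reconstructs to the output size. The scale factors below are the commonly published per-axis values; treat the exact figures as approximate.)

```python
# Approximate DLSS internal render resolutions from published per-axis scale factors.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    s = MODES[mode]
    return round(width * s), round(height * s)

for mode in MODES:
    print(f"1440p {mode}: {internal_res(2560, 1440, mode)}")
    print(f"4K    {mode}: {internal_res(3840, 2160, mode)}")
# e.g. Quality mode renders 1440p at roughly 1707x960 and 4K at 2560x1440.
```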
Post by Chopsen on Jul 23, 2024 13:40:49 GMT
AMD have a tiny fraction of the market compared to Nvidia, so developers are generally not targeting higher VRAM requirements in their games. I don't think the higher VRAM is especially useful for gaming, as it's unlikely to be the limiting factor.
Ray tracing I'm sure is a load of bollocks. I can't tell the difference when the game is running. I feel like I'm taking crazy pills.
Early on this gen Nvidia were quite expensive compared to AMD; there's less in it now.
Post by crashV👀d👀 on Jul 23, 2024 14:05:31 GMT
Developers are targeting the memory allowances/constraints of consoles, so 12GB is the new norm.
Post by Vandelay on Jul 23, 2024 14:20:12 GMT
"Ray tracing I'm sure is a load of bollocks. I can't tell the difference when the game is running. I feel like I'm taking crazy pills."

It is certainly not bollocks, but the impact on image quality varies a lot between implementations. Using it just for shadows is the most lightweight use for it and is generally the least noticeable. Global illumination, on the other hand, can produce massive differences to the image and look noticeably better (not always though). A good implementation of reflections can also be a very big improvement to image quality, such as in Control. You also have the similar but slightly different path tracing, used by Alan Wake 2 and Cyberpunk 2077. Those games look truly next gen when maxed out and it is a night and day difference.

But designers and engine gurus have become experts at faking realistic lighting. A fantastically constructed rasterised scene can look better than a game that just uses RT shadows. Plus, some of the changes are going to be fairly subtle and only noticeable when you look at a screenshot, not when you are running around a game.

It will be the future though. The amount of time that needs to be invested in creating a scene without ray tracing is going to be days of work; with ray tracing, you just flip the switch, place your lights and away you go. This gen of consoles means we aren't going to see rasterised rendering ditched just yet (I believe the PS5 only had ray tracing hardware added a few months before release), but I fully expect the next gen to be built with ray tracing expected, and we will see multiplatform games requiring a ray-tracing-capable card on PC.
Post by Chopsen on Jul 23, 2024 15:15:24 GMT
Nah. It's bollocks.

I mean, I realise it's actually a thing and what it involves, obvs. But honestly, turning it off and on barely makes a difference that I can notice. Even with Cyberpunk 2077 in photo mode, I honestly really couldn't tell. I feel like it's a placebo. I fully appreciate this is likely a me thing. I suspect that doing all the lighting "by hand" is done so well in most games that it looks pretty good already, and with RT on I don't really get what else I should be looking for.

I see the point that it's harder to do it the old way though, and that it's likely to be the way forward given it's just easier to develop that way. And that's fine. But in the here and now, I don't really see that turning it off and on makes much odds to my experience of playing the game.
Post by AgentHomer on Jul 23, 2024 18:08:01 GMT
Cheers for the tips. I try to keep on top of PC tech, but obviously it changes rapidly. So, if I got something that had a CPU like the 7800X3D (not going near Intel after everything I've read lately, which is a shame as in the past I've always had Intel CPUs), 32GB of DDR5 RAM and a 4070 Super, I'd be fairly safe for the short/medium term?
Post by Vandelay on Jul 23, 2024 18:32:33 GMT
Definitely. CPU would likely be good for a long while. GPU is always the one that won't last as long, but 4070 will be good until the next gen, I would think.
Post by barchetta on Jul 24, 2024 12:07:09 GMT
I built my PC before Christmas to almost the same spec - just a vanilla 4070, not a Super version. Running it with a 32" ultrawide OLED at 1440p. More than happy at the moment. DLSS frame gen also helps.
Main use is MSFS, but I've dabbled with Cyberpunk, RDR2 etc. and it is very nice and quite a step up from my old i5-6600K/1070 combo.
I might look at a new gfx card next year and hand the 4070 down to my lad, but I'm hoping it will last a good while yet if not.
Post by zisssou on Jul 25, 2024 21:14:05 GMT
Not so much PC-building advice... but does anyone have any recommendations for a gaming laptop with a budget of up to 800-1k?
I'd bloody love to build a PC again, but my partner is set on getting a gaming laptop, as she prefers to game like that, playing Sims and strategy games etc. Now I know the budget is pretty mental for the games she plays, and she'd probably be fine with something that came out in 2021, but I guess it's future-proof for many years...