Same happened to me. I've seen some benchmarks on YouTube hitting 30-35 FPS at 1080p with FG, and mine is doing 10 FPS in the storm with FG at 720p. I'm wondering if it's my GPU or something else, idk.
I've got an i7-8700K, which is, well, old by now, but the benchmark video was using that same CPU, so the only thing I can think of is that my 1060 is the 3GB version. That might be causing the massive frame drops and poor performance. Regardless, I'm due an upgrade.
...and even 6GB is on the low side because the benchmark program was telling me that I might experience visual glitches when my settings shifted the game's VRAM allocation close to the 5GB mark.
I just got my gf a 4060 for her birthday/(late) Christmas because she had a 1060, wants to play Wilds, and had the same problem during the first beta. I installed it yesterday and it didn't turn on. I then found out that her power supply was really shitty, and now I need to change the power supply and rewire everything.
No matter what I do I'm pretty much locked into 60-80 fps. There's very very little optimization available to push to 120-170 fps when you have the power to do it.
I dropped DLSS to max framerate over quality, I dropped as many settings as I could other than resolution/texture quality, and I could BARELY impact the fps. And in the benchmark's city stages I'd drop 20-30 fps at random points.
This is on a 3080 with a 5800X3D.
This is VERY playable as is, but I want more optimization to actually pump my fps up if I need to
So did no one with a PC ever play Rise on Switch, which ran at 30 and was fine? Or is fps just the modern version of the bit wars? Like, over 120 is literally physically unnecessary.
It is, but the people running dual-core i3s do need to understand that even if it was optimised at all, it still probably wouldn't run well on their e-waste CPU.
The amount of 40-series cards paired with Skylake CPUs is hilarious though.
I upgraded my GPU from a GTX 1060 6GB to a 7900 XT just for Wilds. The beta ran like dogwater on the 1060, barely even playable. I'd built a new PC with a 13600K but kept the 1060 to save some money. And now I've spent that money lol.
Yeah, the beta is pre-optimization. Thankfully the benchmark has the most current level of optimization, which they were already working on when Beta 1 was released to the public, or started shortly after. And given how much support World and Iceborne got, I could very well see further optimization being done post-launch, allowing even more people to play who might not quite get there at launch.
Just bought my PS5 specifically for Wilds...and seeing what this sub has become, I'm really glad I didn't take my brother/hunting partner's advice to switch to PC.
If you bought a decent PC at the time the PS5 released, you wouldn't have to worry about any of this. These are mostly people talking about trying to run Wilds on graphics cards from a decade ago.
Except the hardware shown in the meme and in the lower-scoring benchmarks is graphics cards and/or CPUs from, at minimum, nearly a decade ago, from before even World released.
Were you to build a PC now, or even a while back, you'd be in a far better spot. A PC may require a hefty initial cost, but keeping it updated over the years is where the savings come in (while being smart about it, of course. I'm not gonna buy a 50-series Nvidia card yet, for example. My 4060 Ti can likely last a few more years).
Honestly I will probably go to PC next generation, but for the moment I'm fine. Also, that reminds me how I bought a PS4 in summer of 2016 to play TES6 (also Borderlands 3) because I was sure they were gonna announce it soon xd
Yeah, I got a bit curious and asked AI for a quick cost overview, and it gave me an estimate of around 4000 euros for some of the posted rigs needed to run it at 4K on highest settings at >60 fps (if built from scratch; I only have work-provided laptops).
To be fair, I did spend almost the same over these past years on my console media setup (PS5 Pro + VRR-capable 65" TV + 11.1.4 Dolby Atmos soundbar/system), but with the intent to use it across multiple gens and to ensure the high-end output is in place.
Do high-end PC players really budget double that amount, or do they just stick with gaming on a monitor and headphones?
Well, first off: playing a game in native 4K is extremely taxing on system resources. Consoles upscale from 1080p, so 4K on consoles is not really 4K. 4K on PC is true 4K, and you naturally need a much, much better rig to do it.
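For a sense of scale, here's the raw pixel math (nothing Wilds-specific, just resolution arithmetic):

```python
# Native 4K shades 4x the pixels of 1080p, every single frame.
pixels_4k = 3840 * 2160      # 8,294,400 pixels per frame
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame

print(pixels_4k / pixels_1080p)  # 4.0
```

That 4x per-frame pixel load is why an upscaled console "4K" image is so much cheaper to render than the real thing.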
Secondly, you can build a 4K-capable machine for like 1800ish or buy a prebuilt for around 2K.
A 4080S and 7800X3D, for example, can get like 60-70fps at native 4K and around 120-140 with frame gen, for Wilds specifically. A 4K monitor will run you about 200-300 USD too.
But high-end PCs last a very long time, and they can be upgraded piecemeal every now and then to keep top-end performance, with the caveat that you resell your old parts, since they maintain value well past their prime. 3080s, for example, are still holding around 80% of their original MSRP. So high-end PCs require more upfront investment, but you save money over time.
It's not like your PS5 is gonna magically run it better than a PC. The game runs like shit on everything, and the PS5 is several generations behind even older PCs at this point.
You know how fucking nuts a Mad Max event would be for MH? I would rock the hell out of an Immortan Joe-inspired armor set and a double-barreled bowgun, or a V8-inspired Gunlance.
I have never wanted anything more in my life, why must you do this (but also imagine the potential a dieselpunk makeover could have for the Hunting Horn's moveset!!!!!!)
It's slightly better. I have a 100W RTX 2060 in my laptop and it's almost like a 1660 Super. I have a friend with that GPU and we usually get about the same performance in almost all games, maybe a 5fps difference depending on the game.
I could run the benchmark at a stable 30 fps with DLSS enabled and textures and other options on medium. But, well, it depends on the laptop version of the 2070 and how many watts it can draw.
2080 with an 8600K. I did overclock, and I was getting an average of 57, but during the actual gameplay I was down to about 45-55 fps.
This was with performance DLSS, 1080p, low settings.
I might have to play this game at a 30 FPS cap... but honestly, if I can push some settings up slightly, like balanced DLSS and medium settings with a consistent 30 FPS, idm.
MH is a game where lower framerates are typically fine. A lot like Souls games.
I'm also overclocking, though only slightly (not a lot, I'm not pushing my limits).
This is the first time in my MonHun career I had to worry about these things. I'm honestly thinking of just sticking with RiseBreak until the next portable team's game.
For years PC guys were begging publishers to stop letting the PS4 gen hold gaming back, and now you need an RTX 4090 to hit a good framerate without relying on fake frames bullshit.
Even with a 4090, I have to rely on DLSS's fake frames to hit what I consider good framerates. Without it, I'm at like 95-100fps (which for the price of the system I built and my monitor's refresh rate is not nearly high enough).
The game definitely isn't optimized. I heavily sympathize with what many in this sub are saying.
That's still on the devs for relying on those to make their game run well instead of properly optimizing it. It's unfortunately becoming more widespread.
Mobile i5? Sorry bro, but I'm having a hard time believing you. Are we talking about a solid 60fps at 1080p or 30 at 720p? I'm asking because I genuinely have a PC stronger than yours and I can't run it.
Some bros out there with ripe computers getting huffy about a 2025 game not hitting 4k/120fps
Like I get it, optimization these days is dog water and that's entirely on devs, but be real - you can't seriously expect high performance from a machine that, were it a person, would need nurse assistance just to take a shit. Computers are like Labradors; anything over ten years is practically geriatric.
Again, it's on devs to optimize, and they should be railed on for constantly dropping the ball in that regard. I'm just saying, it's difficult to take someone seriously as they pontificate about the ills of modern game development standards from atop their mighty GTX 780 - a legitimately 12+ year old card.
Ya know what, I'm with you on that second line. 4K/120 is a lofty goal, especially for a game like this.
1080p/60 (solid 60, no dips) though, without fake frames? With a 5950X, a 4090, and 64GB of RAM? I should be able to expect that much and not be unrealistic, and yet the benchy still dips to the 40s in the stone-lookin' village toward the end.
I'm more annoyed that developers keep pushing for barely-better graphics instead of going for something more stylized that's way easier on a lot of people's computers.
The whole idea of "look at all this realistic-looking grass that will absolutely tank the shit out of your frames" and that kind of graphics chase is honestly not sustainable.
I have exactly the same system: 5950X, 4090, 64GB. There is no good reason for this setup to struggle to achieve a stable 60fps in any modern game. It's still a very powerful build and should only just be starting to feel some of its age now.
I really want to stress that nothing said here should be taken as excusing shitty optimisation standards from devs. They fumble optimisation constantly and should have their feet held to the fire for it.
100%. People thinking that we should expect the thing to struggle on a chip two gens newer (technically 1.5, but I digress) and substantially better than the 'recommended' are Capcom's biggest white knights for some reason right now. It's absurd.
The push for all of these fake performance workarounds from GPU makers was a mistake that has really harmed the optimization of games.
4070 Ti Super and Ryzen 7 7700X with 32GB RAM. 1440p, ultra settings, no fake frames, and I got 60 fps at the worst parts of the storm and the village. I also had Discord open during that, so idk what's wrong with your stuff.
I saw a benchmark with the literal most powerful gaming setup you could have, a 5090 + 9800X3D, and they STILL couldn't run full ultra, RT, no frame gen, at native 4K for a consistent 120.
Now, do I need all that? Absolutely not, but people buy those top-of-the-line GPUs to be able to go to 240 fps and shit.
Besides, most comments I see are people with 3060s and 3070s, and they're struggling too.
Compare MH Wilds to recent games like Elden Ring, Armored Core 6, Kingdom Come: Deliverance 2, Baldur's Gate 3, and Helldivers 2. All of those had a far better launch state than this game, and I'd be hard-pressed to see where the improvements are for all the performance sacrifices.
From all I've seen, this could very well just be a symptom of an engine that wasn't designed for wide-open areas. That choice is still up for criticism because, at the end of the day, it's hurt the game.
I'm sorry, but the specs stating 60fps at 1080p WITH frame gen are literally insane and show how incompetent these devs are. These graphics are nowhere near good enough to justify that.
That's exactly what I did. I'd wanted to upgrade for a while, but when I saw that I was below the minimum recommended, I knew I had to do it soon. When I tried the demo (with a 1070), I was barely able to play it (around 15 fps). I planned ahead and bought everything around Black Friday, saved a lot, so now I should be set for a couple of years.
I just realized a minute ago that the graphics card market right now is so deranged it would be cheaper to buy an entire PS5 Pro and a copy of the game than to try to upgrade to something in the current upper range :c
I have some pretty "decent" results, 30-40 fps with my 1070 at 800x600 lowest without framegen. With framegen it goes up to 60, but I don't want to add more latency on top of that. I can't afford to upgrade sadly :(
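The latency worry checks out, at least in a simplified model of interpolation-based frame gen (the numbers below are illustrative, not measured):

```python
# Simplified model: interpolated frame gen has to hold a real frame until
# the next one arrives, so input latency tracks the BASE framerate (plus
# roughly one extra base frame of delay), not the displayed framerate.
base_fps = 30
base_frame_ms = 1000 / base_fps      # ~33.3 ms between real frames
added_delay_ms = base_frame_ms       # assume ~one held frame for interpolation

print(f"displayed ~{base_fps * 2} fps, latency on the order of "
      f"{base_frame_ms + added_delay_ms:.0f} ms")
# Native 60 fps would put the baseline closer to 1000 / 60, about 16.7 ms.
```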
My 5600 and 2070S were already struggling in base camp during the beta. I'm not expecting it to perform better when all of the other players are loaded into that area. Also, frame generation should never be part of a game's recommended specs. The wind and hair simulation is a great addition, but many don't need it. The MHW engine performs much better than this and could probably achieve the same image quality if they tweaked it a bit.
That's an 8-year-old GPU. You're not too far under the minimum specs, so you may be able to figure something out, but it will barely be playable, if at all.
That's 'cos the game is CPU-bottlenecked, not GPU-bottlenecked.
This post is about graphics cards, but the game puts a low load on the graphics card and a high load on the CPU, a result of poor optimisation. Even the fancy-pants 5090s will struggle if they haven't got the best CPU on the market too.
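A rough way to check this on your own rig (a generic heuristic, not anything Wilds-specific): drop the render resolution and see whether fps actually moves.

```python
# If cutting resolution barely changes fps, the GPU wasn't the limiter:
# the CPU (or something else) is the bottleneck. Numbers below are made up.
def likely_cpu_bound(fps_native: float, fps_low_res: float,
                     min_gain: float = 0.10) -> bool:
    """True if dropping resolution gained less than min_gain (10%) fps."""
    return (fps_low_res - fps_native) / fps_native < min_gain

print(likely_cpu_bound(58.0, 61.0))  # True: ~5% gain, so GPU isn't the limit
```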
It's updated with new optimizations. A lot of people are reporting improvements over the beta. Keep in mind, though, a lot of the benchmark is a cutscene, which is always going to have a higher frame rate than gameplay. So when someone posts a screenshot with their average frame rate, it's not really accurate and should be bumped down a bit.
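To put illustrative numbers on that (made-up segment timings, not actual benchmark data), the time-weighted average can sit well above what you'd actually see in a hunt:

```python
# Time-weighted average fps across benchmark segments (example figures only).
segments = [
    ("cutscene", 120, 60.0),   # (name, seconds, fps)
    ("gameplay", 120, 35.0),
]

total_frames = sum(seconds * fps for _, seconds, fps in segments)
total_seconds = sum(seconds for _, seconds, _ in segments)

print(total_frames / total_seconds)  # 47.5: the reported "average", vs 35 in-hunt
```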
Yeah, the gameplay portion went down to around 25 fps for me during the storm, while the cutscenes were around 45-60. It's still playable, but I might try to upgrade sometime soon anyway.
Had written it off months ago while debating a new console or PC. But for the fun of it, I tried the benchmark on a 1070/i7-4770, 1080p lowest. Got an avg of 33 fps, "Playable"... but it also looks like playing through a Vaseline-smeared screen door.
Edit: but the cutscenes and characters all look good compared to the woodblock monstrosities some are apparently getting lol
I can run it at an average 52fps on the lowest settings on my laptop. More than I thought it'd be tbh. But I feel like it would be bad for my laptop at the end of the day
So many people are posting stuff from 10+ year old systems like "oh noes this game's terrible" or doing super custom settings on their systems to tank it for the upvotes. It's fucking annoying.
I'm so glad that when I bought my laptop I waited and went for the 4060 and not the 4050. Getting 90+ fps average with a minimum of 62. Frame gen is magick, black magick, but magick.
I literally just ran the benchmark over lunch, and my 3080 Ti and 12th Gen i9 just absolutely slapped: 70 FPS average with EVERYTHING maxed out and turned on.
Then I saw this, having just browsed the thread for 10 minutes, and spat out my tea.
True, so true. I accidentally thought that this was the beta version.
And when it started playing I was like (what the freak is this!!). Then I checked again in Steam and found out I was wrong, and this is not the beta version.
What I don't understand is how I'm getting 60fps without frame generation on a 1660 Ti 6GB laptop… I do have an i7-9750H, so I'm guessing the game is CPU-heavy.
I genuinely feel so bad that I can't run the game, I was so fucking hyped to play Wilds. Then in the beta I got the polygon monsters. Looks like my 3050 Ti is not gonna cut it. Can't even afford an upgrade rn :( Literally the only thing I looked forward to this year was playing this game FeelsBadMan
It amazes me how much my 3900X still carries in this day and age despite bottlenecking my 7900 XTX. It's been in since release; it really was future-proof.
I forgot what I've got, but it delights me that my 6yo laptop passes as "good."
What doesn't delight me is how everything kinda looks like it's made out of clay on medium settings, and honestly I would rather have a colorful game like World than a game that renders every hair on Zinogre's butt.
2070 here, gotta use DLSS and high/medium settings to play at around 50 fps. The game doesn't even look that great compared to other games my PC runs really well.
My GTX 1060 is going strong in the benchmark (14 fps)