Sturgeon's House

General PC games master race thread. Everything about games. EVERYTHING.



Yes to both I guess?

 

They did some things I like with the game:

 

-Guns are no longer pinpoint accurate to the sights. They were in BF4 and it was a mess. People could nail you full-auto from a hundred meters away with an SMG, and it created massive dead zones in the giant maps. It also created a gameplay meta really similar to Call of Duty, but more on that later. Now, semi-autos and SMGs disperse from center the more/faster you fire, while LMGs get more accurate the longer you hold down the trigger. The result is that you can use more of the map than before, because more shots miss.

 

-Every little escaping carbon vapor on your character doesn't mark you on the mini-map. People will only show up on the mini-map if they're manually spotted or spotted by a sniper's flare gun. This was the other thing that turned BF3 and especially BF4 into a CoD-like, where people were staring at their mini-map 90% of the time and shooting around corners because they knew where everyone was. I was wondering why I hated 4 so much, then a friend of mine who loves watching BF streamers linked me some of their videos: they were just sprinting and bunny hopping, it looked CoD as hell, and I didn't pick the game back up. Now, you and your team have to use actual effort and eyeballs to spot enemies.

 

-Vehicles are more self-contained. There's no longer a class with a repair tool that you can just pick. You can spawn on tanks, horses, or planes from the spawn menu only (cars just spawn at points around the map and you jump into one after spawning; you can also get into any abandoned tank/plane/horse), and when you do so, you spawn as a "pilot" class that gives you a C96 carbine (assuming more options in the full release), a repair tool, and some AT grenades. This results in fewer people wasting vehicles by using a plane as a personal taxi to go get infantry kills, since the carbine is kind of poopy. The pilot of a vehicle can also fix their vehicle by holding X to repair chunks of about 20 HP at a time. You can only look around while doing this, and any damage at all interrupts it and forces you to restart the process. Some people don't like it, but I prefer it to the old way of praying that there's an engineer nearby and that he spawned with a repair tool, or risking jumping out of your vehicle to do it yourself while some fucking dillhole jumps into it and runs off.

 

-A few other vehicle things. For one, tanks have a sort of "ready rack" feature. For cannons, you can hold about 5-7 rounds before you reach zero and have to start reloading to get another shot (loading to your "rack" takes more time than if you just had a fresh round waiting). The other thing is that it doesn't look like they're doing that stupid thing from 4 where you customized each chunk of the vehicle and got locked out of really useful stuff. There are pre-determined loadouts to do different things now.

 

-Melee is good and useful. The times I've played as a scout (sniper) class, I've been able to help my team in close quarters by using melee or bayonet charges.

 

-Some people hate that the new Conquest scoring system doesn't take kills into account, but I like it. Too many matches were lost because teams I was on played the objectives and still lost to campers (most notably in 3 and 4).

 

-The armored train is neat. In case anyone didn't know, certain conditions determine that once a round, a team will get a "behemoth": something like a zeppelin or armored train with lots of little killing stations on it. These vehicles are destructible and usually issued to a team that's not doing well. The train can park at three different cap zones to capture them while dealing out crazy damage, but that still leaves something like 4 other cap zones that the enemy team can cap without trouble, so it's not an "easy win" button, but more of a "chance to play catch-up" button.

 

Some bad things:

 

-I will breathe a sweet sigh of fucking relief when multiplayer gaming is rid of locking essential weapons and items behind progression. Tanks are unbalanced as hell right now because the only long-range personal AT weapon is locked behind a progression system (which is bugged in the beta, so it takes forever to unlock), and the other is a special pickup that's way out in the desert. It's one thing to say "Well, that's the Skinner box" (it actually isn't), but it's another to have a hamster demand to be given the Skinner box over any other option at all.

 

-They put a 20-minute timer on all rounds, so there's no room for comebacks, and Conquest rounds no longer play out to their natural ticket-based end.

 

-Rush game mode is really dull because it's down to 12v12 and they give each side tanks, so you can have both teams entirely sitting in tanks not doing anything.

 

-When DICE says "beta" they give you a fucking beta. It's bugged to hell and my early purchase is really hinging on seeing how many of these they fix.

 

-This can be a pro or a con, but tanks are really hard to kill, and it's exacerbated by the AT weapon progression thing. It's hard to get an idea of the balance when an essential balancing tool is almost non-existent in the game right now.

 

-The map's kind of boring. It's not bad, but the alpha got this really cool-looking French countryside map and we got a desert map.

Agreed with most, but you can unlock the AT rocket gun via the website pretty easily (www.battlefield.com/career afaik), and it's pretty freaking good versus tanks. If the tank isn't paying attention you can kill it without running out of ammo. Something which wasn't possible with the AT grenades, K bullets, etc.


Agreed with most, but you can unlock the AT rocket gun via the website pretty easily (www.battlefield.com/career afaik), and it's pretty freaking good versus tanks. If the tank isn't paying attention you can kill it without running out of ammo. Something which wasn't possible with the AT grenades, K bullets, etc.

 

Yeah, but that was also a glitch that was patched out. I'm just saying there's literally no point in locking out such an important piece of kit.

 

 

 

Well, that was very unimpressive. While I still enjoy CoD 4's campaign, that game pissed me off because it caused the death of immersion in shooters with its "hit marker, +100 points, +25 bonus points, you ranked up, master sergeant shooter person, bloody screen, you unlocked this gun and this scope" route to HUDs that even infects single-player shooters these days.


Yeah, but that was also a glitch that was patched out. I'm just saying there's literally no point in locking out such an important piece of kit.

Ah didn't know it was patched. And yes, it should be a default unlock. 

 

 

 

Well, that was very unimpressive. While I still enjoy CoD 4's campaign, that game pissed me off because it caused the death of immersion in shooters with its "hit marker, +100 points, +25 bonus points, you ranked up, master sergeant shooter person, bloody screen, you unlocked this gun and this scope" route to HUDs that even infects single-player shooters these days.

Exactly. I don't know why all these pop-ups have to be there, or the random shit noises. I just want to play a good shooter; fuck off with the random-ass shit, please, CoD.


So I've been practicing editing and such lately to try to get a Youtube thing off the ground. One project I'm working on is putting together a review of the Total War series, and so far I guess I'll put a condensed version here for kicks:

 

Rome 1: This one tends to age poorly and well at the same time for me. The battles have aged surprisingly well, and even outdo modern TW battles in some ways. My favorite feature is how a unit that's beating another unit keeps advancing, slowly pushing the frontline backwards and enveloping the enemy. It both looks really cool and creates tactical situations that allow for satisfying encirclements. The campaign map is much simpler than in later games. You can build any building anywhere, so there's less fucking around waiting forever to replenish units and build armies, which becomes really obnoxious starting with Napoleon.

 

Medieval 2: Kind of torn on this game. The battles can be clunky, as the intentionally slow movement speed of units can make it unclear if your orders went through, and the pathfinding (especially in towns) is absolute garbo. But they absolutely nailed the feel of armored units just wailing on each other. Like Rome 1, the grand strategy component is something I've grown to like very recently, as it is simple and just serves to get you into big battles.

 

Empire: I got this one surprisingly late, and I wish I hadn't waited. I understand a lot of the issues people have with this game, but I'm enjoying just how crazy it is in terms of unique units and how cartoonishly weird the campaign map can get when you take a turn and notice that Prussia just gifted Poland to the Iroquois Nations. This is a personal gripe, but I hate how this game introduced naval battles to the series. As a campaign thing, it's just more time I have to spend building and recruiting. It would be fine if the naval battles weren't like watching twenty blind sloths trying to mate in a pool of molasses. They're plodding, buggy, and I never understood why I won or lost a single one of them in terms of mechanics.

 

Napoleon: It's alright. The musket gameplay of Empire is really refined and expanded upon here, and is really fun to engage in. Unfortunately, something about the presentation is just very dull. It's all well put together and gets you into lots of battles and all that, but I feel like they made Empire and then felt like they had to make this game. This is kind of where I started to have gripes about the campaign map, too. It becomes very time-restrictive while also forcing you to spend much more time planning and clicking and planning and clicking, rather than getting into massive battles.

 

Shogun 2: Probably the most well-presented TW game. The music, art, graphics, and design just show that this was a passion project for the entire team. Unfortunately for me, this one upped the game in terms of piddling around on the campaign map, as you had to carefully plan cities to get the right types of units. This results in certain moments of the campaign forcing me into something like a 10:1 ratio of time spent carefully planning cities and building armies versus actually fighting battles. This is a shame, because the battles are beautiful, with some unique units that are fun to utilize in creative ways. Unfortunately, this game highlighted issues with the engine for me. As the first melee-focused game in the series to use Empire's engine, the battles are locked into these animated 1v1 duels that look like a bunch of inflatable punching clowns trying to have an orgy. The units will only fight on a pre-determined frontline. None of the push-back of Rome 1 and Medieval 2, and it just looks alien in certain situations. I do have a very positive opinion of this game overall, but there are a few things in retrospect that I wish it did better.

 

Rome II: I skipped this game at release after seeing how awful a reception it got, but I bought it recently and was pleasantly surprised by how much I like it. The campaign map is thankfully simplified again, and the game actually gives people nice tutorials. The UI is also very well laid out for both battles and the campaign map. The battles are a mixed bag. They apparently had a massive problem at launch with the "duel" system carried over from Shogun 2: the battles were hurt both in look and function by some AI features and the fact that units were hard-locked into 1v1 duels only. In new builds this is apparent in the fact that each unit has maybe one or two slow "attack" animations of them awkwardly jabbing their weapons, but you no longer have the issue of a unit of five spearmen holding out against a thousand surrounding swordsmen because none of the swordsmen can use numbers to gang up on them. This was improved, but the look of the battles is very disappointing. The static nature just means two units fight, one wins. Flanking is only really helpful for morale shocks, and doesn't give you as much of a noticeable killing edge as in Rome 1/Medieval 2. Also, what were once passive abilities, like heavier charges for cavalry, are now special activated abilities that you have to switch on each time you want to use them. So unlike earlier games, where a cavalry unit's charge was more powerful the faster it moved, you now have to charge, then activate a special ability one by one for each cavalry unit. A really dumb system. Overall, I really do like the game. I just wish the battles didn't look so bleh.

 

Attila: God, I wish I could like this game. They did some really neat things with the battles here, but it feels like the developers just didn't want us to play them this time. No, most of the time you will be on the campaign map, fussing over politics, building farms that piss people off more than empty lots, and trying to predict how the AI will cheat next. The AI has been pumped up to ridiculous levels, only engaging you if it has the auto-resolve completely in its favor. This results in enemies using pixel-perfect cheats and bullshit to sneak 5,000 Huns through a 1-mile space, torch your settlement, and move 4 times the normal distance to be out of reach of anyone. There's a reason why this is the only TW whose gameplay I've extensively modded. It's a shame, because the battles have interesting pace, ranged units are much more interesting/useful, and the changes to morale and exertion are really cool. That, and there are different types of factions that all play differently, like the nomadic tribes that can liberate or dominate factions as they tear west across Rome to find permanent settlements. It's just that the map is flat and shitty, and the enemy is given a plethora of cheats that make it unplayable unless you have them removed. Probably the game in the series with the most unrealized potential. There's a reason why so many TW playing channels have like 10 campaigns from all the other games, but a handful of unfinished ones for this. It just feels like they forgot the battles existed, and got too swept up in the post-Dark Souls "Please stomp my balls" wave.

 

Warhammer: Brought my faith back to the series. There's so much love and polish here. The battles once again have weight and clash to them, the map is strategic yet barebones, and I spend much more time fighting battles than I did with S2/R2/Attila. Factions play differently and units all have some kind of interesting use. The flat map returns from Attila, which kind of blows, but it's the only thing I can think of that I greatly dislike.


I played Rome 1 extensively, with and without mods.  It sounds like a lot of the things that frustrated me about it are still in later games.

 

The pathfinding is excruciating.  It makes me want to scream.  Open fields just with infantry aren't so bad, but in cities combat looks fake, fake, fake.  And nonsensical.  Add in cavalry, and the limitations of the engine will really show.  For some bizarre reason, in Rome 1 cavalry are terrible at running down fleeing units.  They have trouble targeting routing infantry, seriously, try it some time.  You will throw your keyboard skyward in exasperation.

 

Rome 1 may have been my first grand strategy game, so at first I was perfectly content with the interface.  Then I picked up Civ IV.  Then I put some interface mods on Civ IV.  Then I realized how poo RTW's interface was.  There was absolutely no attempt to streamline the turn-based portion of the game.  Important stat modifiers are completely hidden from the player.  Important functions (like moving retinue from character to character and determining income breakdown) are buried in nonsensical places, and it's not immediately obvious to a newbie that you can even do these things.  Optimal play involves a lot of tedious micro-management that the interface in no way helps you complete.

 

The AI is completely idiotic and predictable.  In open maps, once my play was polished, I could routinely kill infantry-centric armies with horse archers that the auto-resolve gave me 90%+ odds of losing, because the game is just that bad at handling anything that isn't a line-infantry slugfest.  While this did allow me to get my Genghis on, it got really old after a while.

 

And the worst part?  None of this aggravating shit can be fixed with mods.  New units, even new unit animations, new maps, and new cosmetics could be done, but the most grating parts of the experience were absolutely untouchable.

 

Tell us when you get the youtube channel going; I'm interested in seeing what you've got.


Ran into a new hack in Dark Souls 3 last night. 

 

I'm running through the game on a new build I want to try out (Daggers-only Dex build).  I'm naked, save for a shield and two daggers I swap between. 

 

Now, in the Dark Souls games there's a ring that grants you increased damage on ripostes and backstabs. These are critical attacks that can be performed when you parry an enemy or other player (with the shield/other parrying tool) or sneak around them for a backstab. Massive damage, cool animation, makes my dick hard, etc etc. 

 

This fast dagger character relies on critical attacks.

 

So in one area, I decide to try invading. Why not?  We'll see if I can get some decent PVP in. I haven't done any on this character yet. 

 

Very first world I invade, and I notice every enemy is dead. That's a bad sign. As an invader, you're working WITH the enemies to kill the host. 

 

When all enemies are dead, that's a sure sign of GANKING. 

 

I round the corner, and sure enough there's three other players hanging out by the boss door. Waiting for invaders to 3v1 into the dirt. 

 

That's called a gank. It's when multiple people team up against one. It's abysmal. 

 

But it gets worse. 

 

These people aren't decent players. They aren't even good players. They're all using overpowered, bland weapons and have atrociously bad tactics. I dodged around the three of them for probably four minutes, slashing here or there. But they can heal, and I can't punish a heal attempt while fending off two other players.

 

Then I parry one of the idiots. My riposte does enough damage to kill a player in one shot. 

 

BUT, the Hornet Ring animation doesn't happen. I get a normal riposte animation, for normal riposte damage.

 

WTF. 

 

After that, they cornered me and stun-locked me to death. The phantom/summon guy that I parried gives me a cheeky emote. 

 

------------------------------------

 

Which brings me to the meat of the issue: 

 

If you are going to cheat in an online game, do it like you've got a set of balls between your legs. 

 

Don't do this half-ass coy little shit, like doing critical-damage protection hacks. What the fuck is that? If you're GANKING, and HACKING, why not go all the way? Have some fun with it! Christ on a stick, hack an item that petrifies me in one hit. Make your rolls damage me and just roll into me to death. Or just don't try to hide it and give you and all your buddies infinite health or stamina. Or get rid of your hitbox! 

 

You're already Ganking, so obviously there's no honor or pride left in your sorry husk of a human shell. Just give in and hack like you mean it, you cowardly fucktards.



  • Similar Content

    • By Collimatrix
      What a Long, Strange Trip it's Been
       
      PC gaming has been a hell of a ride.  I say "has been" both in the sense that exciting and dynamic things have happened, but also in the sense that the most exciting and dynamic times are behind us.  No other form of video gaming is as closely tied to the latest developments in personal computing hardware, and current trends do not suggest that anything dramatically new and exciting is immediately around the corner.  Indeed, fundamental aspects of semiconductor physics suggest that chip technology is nearing, or perhaps already on, a plateau where only slow, incremental improvement is possible.  This, in turn, will limit the amount of improvement possible for game developers.  Gaming certainly will not disappear, and PC gaming will also not disappear, although the PC gaming share of the market may contract in the future.  But I think it is a reasonable expectation that, in the near term, future PC game titles will not be such dramatic technological improvements over older titles as was the case in the past.  In the long term, current technology and hardware design will eventually be replaced with something entirely different and disruptive, but as always it is difficult, maybe impossible, to predict what that replacement will be.
       
      The Good Old Days
       
      The start of the modern, hardware-driven PC gaming culture that we all know and love began with Id Software's early first person shooter titles, most importantly 1993's Doom.
       
      PC gaming was around before Doom, of course, but Doom's combination of cutting edge graphics technology and massive, massive appeal is what really got the ball rolling.

      Doom was phenomenally popular.  There were, at one point, more installs of Doom than there were installs of the Windows operating system.  I don't think there is any subsequent PC title that can claim that.  Furthermore, it was Doom, and its spiritual successor Quake that really defined PC gaming as a genre that pushed the boundaries of what was possible with hardware.
       
      Doom convincingly faked 3D graphics on computers that had approximately the same number-crunching might as a potato.  It also demanded radically more computing power than Wolfenstein 3D, but in those days computing hardware was advancing at such a rate that this wasn't really unreasonable.  This was followed by Quake, which was actually 3D, and demanded so much more of the hardware then available that it quickly became one of the first games to support hardware acceleration.
       
      Id Software disintegrated under the stress of the development of Quake, and while many of the original Id team have gone on to do noteworthy things in PC gaming technology, none of it has been earth-shaking the way their work at Id was.  And so, the next important development occurred not with Id's games, but with their successors.
       
      It had become clear, by that point, that there was a strong consumer demand for higher game framerates, but also for better-looking graphics.  In addition to ever-more sophisticated game engines and higher poly-count game models, the next big advance in PC gaming technology was the addition of shaders to the graphics.
       
      Shaders could be used to smooth out the low-poly models of the time, apply lighting effects, and generally make the games look less like spiky ass.  But the important caveat about shaders, from a hardware development perspective, was that shader code ran extremely well in parallel while the rest of the game code ran well in series.  The sort of chip that would quickly do the calculations for the main game, and the sort of chip that would quickly do the calculations for the graphics were therefore very different.  Companies devoted exclusively to making graphics-crunching chips emerged (of these, only Nvidia is left standing), and the stage was set for the heyday of PC gaming hardware evolution from the mid 1990s to the early 2000s.  Initially, there were a great number of hardware acceleration options, and getting everything to work was a bit of an inconsistent mess that only enthusiasts really bothered with, but things rapidly settled down to where we are today.  The important rules of thumb which have hitherto applied are:

      -The IBM-compatible personal computer is the chosen mount of the Glorious PC Gaming Master Race™. 
      -The two most important pieces of hardware on a gaming PC are the CPU and the GPU, and every year the top of the line CPUs and GPUs will be a little faster than before.
      -Even though, as of the mid 2000s, both gaming consoles and Macs were made of predominantly IBM-compatible hardware, they are not suitable devices for the Glorious PC Gaming Master Race™.  This is because they have artificially-imposed software restrictions that keep them from easily being used the same way as a proper gaming PC.
      -Even though they did not suffer from the same compatibility issues as consoles or Macs, computers with integrated graphics processors are not suitable devices for the Glorious PC Gaming Master Race™.
      -Intel CPUs are the best, and Nvidia GPUs are the best.  AMD is a budget option in both categories.
       
      The Victorious March of Moore's Law
       
      Moore's Law, which is not an actual physical law, but rather an observation about the shrinkage of the physical size of transistors, has held so true for most of the 21st century that it seemed like it was an actual fundamental law of the universe.
       
      The most visible and obvious indication of the continuous improvement in computer hardware was that every year the clock speeds on CPUs got higher.
       

       
      Now, clock speed itself isn't actually particularly indicative of overall CPU performance, since that is a complex interplay of clock speed, instructions per cycle, and pipeline length.  But at the time, CPU architecture was staying more or less the same, so the increase in CPU clock speeds was a reasonable enough, and very marketing-friendly, indicator of how swimmingly things were going.  In 2000, Intel was confident that 10 GHz chips were about a decade away.
       
      This reliable increase in computing power corresponded with a reliable improvement in game graphics and design year on year.  You can usually look at a game from the 2000s and guess, to within a few years, when it came out because the graphical improvements were that consistent year after year.
       
      The improvement was also rapid.  Compare 2004's Far Cry to 2007's Crysis.
       

       

       
      And so, for a time, game designers and hardware designers marched hand in hand towards ever greater performance.
       
      The End of the Low-Hanging Fruit
       
      But you know how this works, right?  Everyone has seen VH1's Behind the Music.  This next part is where it all comes apart after the explosive success and drugs and groupies, leaving just the drugs.  This next part is where we are right now.
       
      If you look again at the chart of CPU clock speeds, you see that improvement flatlines at about 2005.  This is due to the end of Dennard Scaling.  Until about 2006, reductions in the size of transistors allowed chip engineers to increase clock speeds without worrying about thermal issues, but that isn't the case anymore.  Transistors have become so small that significant amounts of current leakage occur, meaning that clock speeds cannot improve without imposing unrealistic thermal loads on the chips.
       
      Clock speed isn't everything.  The actual muscle of a CPU is a function of several things: the pipeline, the instructions per clock cycle, the clock speed, and, after 2005 with the introduction of the Athlon 64 X2, the core count.  And, even as clock speed remained the same, these other important metrics did continue to see improvement:



      The catch is that the raw performance of a CPU is, roughly speaking, a multiplicative product of all of these things working together.  If the chip designers can manage a 20% increase in IPC and a 20% increase in clock speed, and some enhancements to pipeline design that amount to a 5% improvement, then they're looking at a 51.2% overall improvement in chip performance.  Roughly.  But if they stop being able to improve one of these factors, then to achieve the same increases in performance, they need to cram in the improvements into just the remaining areas, which is a lot harder than making modest improvements across the board.
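      The compounding arithmetic above can be sketched in a few lines (a toy illustration; the percentages are the ones from the example in the text, not real chip data):

```python
# Chip performance gains across independent factors compound multiplicatively,
# not additively: (1 + g1) * (1 + g2) * ... - 1, rather than g1 + g2 + ...
def combined_improvement(*gains):
    """Combine fractional improvements (0.20 = +20%) into one overall gain."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# +20% IPC, +20% clock speed, +5% from pipeline enhancements:
overall = combined_improvement(0.20, 0.20, 0.05)
print(f"{overall:.1%}")  # 51.2%, versus only 45% if the gains merely added up
```

Losing any one of these levers means the remaining ones have to grow much faster to hit the same product.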
       
      Multi-core CPUs arrived to market at around the same time that clock speed increases became impossible.  Adding more cores to the CPU did initially allow for some multiplicative improvements in chip performance, which did buy time for the trend of ever-increasing performance.  The theoretical FLOPS (floating point operations per second) of a chip is a function of its IPC, core count and clock speed.  However, the real-world performance increase provided by multi-core processing is highly dependent on the degree to which the task can be parallelized, and is subject to Amdahl's Law:

      speedup(N) = 1 / ((1 - p) + p/N), where p is the fraction of the workload that can be parallelized and N is the number of cores.

      Most games can be only poorly parallelized.  The parallel portion is probably around the 50% mark for everything except graphics, which can be parallelized excellently.  This means that by the time CPUs hit 16 cores, there was basically no additional improvement to be had in games from multi-core technology.  That is, unless game designers start to code games specifically for better multi-core performance, but so far this has not happened.  On top of this, adding more cores to a CPU usually imposes a small reduction in clock speed, so the actual point of diminishing returns may occur at a slightly lower core count.
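      Amdahl's Law is easy to see numerically; a minimal sketch, using the rough 50% parallel fraction for game code mentioned above:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's Law: overall speedup on N cores when only part of the work parallelizes."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / cores)

# A workload that is only 50% parallelizable can never exceed a 2x speedup,
# and the curve is nearly flat well before high core counts:
for n in (2, 4, 8, 16, 64):
    print(f"{n:>2} cores: {amdahl_speedup(0.5, n):.2f}x")
```

The serial half of the work puts a hard 2x ceiling on the speedup no matter how many cores are added, which is why core counts beyond the low teens buy games almost nothing.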
       
      On top of all that, designing new and smaller chip architecture has become harder and harder.  Intel first announced 10nm chip architecture back in September 2017, and showed a timeline with it straddling 2017 and 2018.  2018 has come and gone, and still no 10nm.  Currently Intel is hopeful that they can get 10nm chips to market by the end of 2019.
       
      AMD have had a somewhat easier time of it, announcing a radically different mixed 14nm and 7nm "chiplet" architecture at the end of 2018, and actually brought a 7nm discrete graphics card to market at the beginning of 2019.  However, this new graphics card merely matches NVIDIA's top-of-the-line cards, both in terms of performance and in terms of price.  This is a significant development, since AMD's graphics cards have usually been second-best, or cost-effective mid-range models at best, so for them to have a competitive top-of-the-line model is noteworthy.  But, while CPUs and GPUs are different, it certainly doesn't paint a picture of obvious and overwhelming superiority for the new 7nm process.  The release of AMD's "chiplet" Zen 2 CPUs appears to have been delayed to the middle of 2019, so I suppose we'll find out then.  Additionally, it appears that the next-generation of Playstation will use a version of AMD's upcoming "Navi" GPU, as well as a Zen CPU, and AMD hardware will power the next-generation XBOX as well. 
       
      So AMD is doing quite well servicing the console gaming peasant crowd, at least.  Time will tell whether the unexpected delays faced by their rivals, along with the unexpected boost from crypto miners buying literally every fucking GPU known to man, will allow them to dominate the hardware market going forward.  Investors seem optimistic, however:
       
      [AMD stock price chart]
       
      With Intel, they seem less sanguine:
       
      [Intel stock price chart]
       
      and with NVIDIA, well...
       
      [NVIDIA stock price chart]
       
      But the bottom line is: don't expect miracles.  While it would be enormously satisfying to see Intel and NVIDIA taken down a peg after years of anti-consumer bullshit, the reality is that hardware improvements have become fundamentally difficult.  For the time being, nobody is going to be throwing out their old computers just because they've gotten slow.  As the rate of improvement dwindles, people will only replace their old PCs because they've broken.
       
      OK, but What About GPUs?
       
      GPU improvements took longer to slow down than CPU improvements, in large part because GPU workloads parallelize extremely well.  But the slowdown has arrived.
       
      This hasn't stopped the manufacturers of discrete GPUs from trying to innovate, of course.  Not only that; the market is about to become more competitive, with Intel announcing plans for a discrete GPU in the near future.  NVIDIA has pushed its new ray-tracing-optimized graphics cards for the past few months as well.  The cryptomining GPU boom has come and gone; GPUs turn out to be better than CPUs at cryptomining, but ASICs beat out GPUs by a lot, so that market is unlikely to be a factor again.  GPUs are still relatively cost-competitive for a variety of machine learning tasks, although in the long term these will probably be displaced by custom-designed chips like the ones Google is mass-ordering.
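The reason mining maps so well onto GPUs (and even better onto ASICs) is that every candidate hash is independent, so the search is embarrassingly parallel. A toy proof-of-work search makes this obvious (the function name, difficulty, and block data here are made up for illustration; real miners run this loop across thousands of GPU threads or dedicated hash units):

```python
# Toy proof-of-work: find a nonce whose SHA-256 hash of (data + nonce)
# starts with a given prefix. Each nonce is checked independently of
# every other, which is why the search parallelizes perfectly.
import hashlib

def find_nonce(block_data: bytes, difficulty_prefix: str = "000") -> int:
    """Return the first nonce whose hash starts with difficulty_prefix."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1  # no dependency between iterations: trivially parallel

print(find_nonce(b"example block"))
```

An ASIC just bakes that inner hash loop into silicon, which is why it crushes general-purpose hardware at this one task.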
       
      Things really do not look rosy for GPU sales.  Every time someone discovers some clever alternative use for GPUs like cryptomining or machine learning, they get displaced after a few years by custom hardware solutions even more fine-tuned to the task.  Highly parallel chips are the future, but there's no reason to think that those highly parallel chips will be traditional GPUs, per se.

      And speaking of which, aren't CPUs getting more parallel, with their ever-increasing core counts?  And doesn't AMD's "chiplet" architecture allow wildly differently optimized cores to be stitched together?  The CPU of a computer could therefore easily be made to accommodate capable on-board graphics muscle.  So why do we even need GPUs in the future?  After all, PCs used to have discrete sound cards and networking cards, and the CPU does all of that now.  The GPU has really been the last hold-out, and it will likely be swallowed by the CPU, at least on low- and mid-range machines, in the next few years.
       
      Where to Next?
       
      At the end of 2018, popular YouTube tech channel LinusTechTips released a video about Shadow.  Shadow is a company that plans to use centrally-located servers to provide cloud-based game streaming.  At the time, the video was received with a lot of (understandable) skepticism, and even Linus doesn't sound all that convinced by Shadow's claims.
       
      [LinusTechTips video embed]
       
      The technical problems with such a system seem daunting, especially with respect to latency.  This really did seem like an idea that would come and go; this is not its time, because the technology simply isn't good enough yet.

      And then, just ten days ago, Google announced that they had exactly the same idea:
       
      [Google announcement video]
       
      The fact that tech colossus Google is interested changed a lot of people's minds about the idea of cloud gaming.  Is this the way forward?  I am unconvinced.  The latency problems do seem legitimately difficult to overcome, even for Google.  Also, almost everything that Google tries to do that isn't search or Android fails miserably.  Remember Google Glass?  Google Plus?
       
      But I do think that games that are partially cloud-based will have some market share.  Actually, they already do.  I spent a hell of a lot of time playing World of Tanks, and that game calculates all line-of-sight checks and all gunfire server-side.  Most online games do calculate some things server-side, but WoT was an extreme example for its time.  I could easily see future games offloading a greater share of the computational load to centralized servers rather than the player's own PC.
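As a rough illustration of what "server-side line-of-sight" means, here is a minimal sketch on a 2D grid: the server walks the cells between shooter and target and rejects the shot if anything blocks the line. (The grid model and all names here are hypothetical; this is not WoT's actual implementation, which works in full 3D.)

```python
# Server-side visibility sketch: the client never learns whether a
# target is visible; the server traces the line itself and only then
# reveals the result. Uses Bresenham's line algorithm on a grid.

def line_cells(x0, y0, x1, y1):
    """Grid cells on the segment from (x0, y0) to (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            return cells
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy

def server_can_see(shooter, target, blocked):
    """True if no blocking cell lies strictly between shooter and target."""
    return not any(c in blocked for c in line_cells(*shooter, *target)[1:-1])

walls = {(2, 2)}
print(server_can_see((0, 0), (4, 4), walls))  # wall on the diagonal: False
print(server_can_see((0, 4), (4, 4), walls))  # clear horizontal line: True
```

Doing this check on the server costs CPU time per shot fired, which is exactly the kind of load a studio could shift off the player's machine.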
       
      But there are two far greater harbingers of doom for PC gaming than cloud computing.  The first is smartphones and the second is shitty American corporate culture.  Smartphones are set to saturate the world in a way desktop PCs never did.  American games publishers are currently more interested in the profits from gambling-esque game monetization schemes than they are in making games.  Obviously, I don't mean that in a generic anti-capitalist, corporation-bashing hippie way.  I hate hippies.  I fuck hippies for breakfast.  But if you look at even mainstream news coverage of Electronic Arts, it's pretty obvious that the AAA games industry, which had hitherto been part of the engine driving the games/hardware train forward, is badly sick right now.  The only thing that may stop their current sleaziness is government intervention.
       
      So, that brings us to the least important but most discussion-sparking part of the article: my predictions.  In the next few years, I predict that the most popular game titles will be things like Fortnite or Apex Legends.  They will be monetized on some sort of games-as-a-service model, and will lean heavily if not entirely on multiplayer modes.  They may incorporate some use of server-side calculation to offload the player's PC, but in general they will work on modest PCs because they will only aspire to have decent, readable graphics rather than really pretty ones.  The typical "gaming rig" for this type of game will be a modest and inexpensive desktop or laptop running built-in graphics with no discrete graphics card.  There will continue to be an enthusiast market for games that push the limits, but this market will no longer drive the majority of gaming hardware sales.  If these predictions sound suspiciously similar to those espoused by the Coreteks tech channel, that's because I watched a hell of a lot of his stuff when researching this post, and I find his views generally convincing.
       
      Intel's Foveros 3D chip architecture could bring a surge in CPU performance, but I predict that it will be a one-time surge, followed by a return to relatively slow improvement.  The reason is that the Foveros architecture allows for truly massive CPU caches, and these could be used to create enormous IPC gains.  But after the initial boost from the change in architecture, the same problems that are currently slowing down improvement would be back, the same as before.  It definitely wouldn't be a return to the good old days of Moore's Law.  Even further down the road, a switch to a different semiconducting material such as gallium nitride (which is already used in some wireless devices and military electronics) could allow further miniaturization and speedups where silicon has stalled out.  But those sorts of predictions stretch my limited prescience and knowledge of semiconductor physics too far.
       
      If you are interested in this stuff, I recommend diving into Coreteks' channel (linked above) as well as Adored TV.
    • By SergeantMatt
      tl;dr, does not appear to be great for gaming.
      http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks
      Removing GPU bottlenecking by running at low resolutions, the 1800X gets beaten in most games by the 7700K, which is half the price.


