Sturgeon's House

Popular Content
Showing content with the highest reputation on 03/30/2019 in all areas

  1. 8 points
    What a Long, Strange Trip it's Been

PC gaming has been a hell of a ride. I say "has been" both in the sense that exciting and dynamic things have happened, and in the sense that the most exciting and dynamic times are behind us. No other form of video gaming is as closely tied to the latest developments in personal computing hardware, and current trends do not suggest that anything dramatically new and exciting is immediately around the corner. Indeed, fundamental aspects of semiconductor physics suggest that chip technology is nearing, or is perhaps already on, a plateau where only slow, incremental improvement is possible. This, in turn, will limit the amount of improvement possible for game developers. Gaming certainly will not disappear, and PC gaming will not disappear either, although the PC gaming share of the market may contract in the future. But I think it is reasonable to expect that, in the near term, future PC titles will not be such dramatic technological improvements over older titles as was the case in the past. In the long term, current technology and hardware design will eventually be replaced with something entirely different and disruptive, but as always it is difficult, maybe impossible, to predict what that replacement will be.

The Good Old Days

The modern, hardware-driven PC gaming culture that we all know and love began with Id Software's early first person shooter titles, most importantly 1993's Doom. PC gaming was around before Doom, of course, but Doom's combination of cutting edge graphics technology and massive, massive appeal is what really got the ball rolling. Doom was phenomenally popular. There were, at one point, more installs of Doom than there were installs of the Windows operating system. I don't think any subsequent PC title can claim that.
Furthermore, it was Doom and its spiritual successor Quake that really defined PC gaming as a genre that pushed the boundaries of what was possible with hardware. Doom convincingly faked 3D graphics on computers that had approximately the same number-crunching might as a potato. It also demanded radically more computing power than Wolfenstein 3D, but in those days computing hardware was advancing at such a rate that this wasn't really unreasonable. This was followed by Quake, which was actually 3D, and which demanded so much more of the hardware then available that it quickly became one of the first games to support hardware acceleration.

Id Software disintegrated under the stress of Quake's development, and while many of the original Id team have gone on to do noteworthy things in PC gaming technology, none of it has been earth-shaking the way their work at Id was. And so the next important development occurred not with Id's games, but with their successors. It had become clear by that point that there was strong consumer demand for higher game framerates, but also for better-looking graphics. In addition to ever-more sophisticated game engines and higher poly-count game models, the next big advance in PC gaming technology was the addition of shaders to the graphics. Shaders could be used to smooth out the low-poly models of the time, apply lighting effects, and generally make the games look less like spiky ass. But the important caveat about shaders, from a hardware development perspective, was that shader code ran extremely well in parallel, while the rest of the game code ran well in series. The sort of chip that would quickly do the calculations for the main game and the sort of chip that would quickly do the calculations for the graphics were therefore very different.
Companies devoted exclusively to making graphics-crunching chips emerged (of these, only Nvidia is left standing), and the stage was set for the heyday of PC gaming hardware evolution, from the mid 1990s to the early 2000s. Initially there were a great number of hardware acceleration options, and getting everything to work was a bit of an inconsistent mess that only enthusiasts really bothered with, but things rapidly settled down to where we are today. The important rules of thumb which have hitherto applied are:

- The IBM-compatible personal computer is the chosen mount of the Glorious PC Gaming Master Race™.

- The two most important pieces of hardware on a gaming PC are the CPU and the GPU, and every year the top-of-the-line CPUs and GPUs will be a little faster than before.

- Even though, as of the mid 2000s, both gaming consoles and Macs were made of predominantly IBM-compatible hardware, they are not suitable devices for the Glorious PC Gaming Master Race™. This is because they have artificially-imposed software restrictions that keep them from easily being used the same way as a proper gaming PC.

- Even though they did not suffer from the same compatibility issues as consoles or Macs, computers with integrated graphics processors are not suitable devices for the Glorious PC Gaming Master Race™.

- Intel CPUs are the best, and Nvidia GPUs are the best. AMD is a budget option in both categories.

The Victorious March of Moore's Law

Moore's Law, which is not an actual physical law but rather an observation that the number of transistors that can economically be packed onto a chip doubles every couple of years, has held so true for most of the 21st century that it seemed like an actual fundamental law of the universe. The most visible and obvious indication of the continuous improvement in computer hardware was that every year the clock speeds on CPUs got higher.
Now, clock speed itself isn't actually particularly indicative of overall CPU performance, since that is a complex interplay of clock speed, instructions per cycle and pipeline length. But at the time, CPU architecture was staying more or less the same, so the increase in CPU clock speeds was a reasonable enough, and very marketing-friendly, indicator of how swimmingly things were going. In 2000, Intel was confident that 10 GHz chips were about a decade away. This reliable increase in computing power corresponded with a reliable improvement in game graphics and design year on year. You can usually look at a game from the 2000s and guess, to within a few years, when it came out, because the graphical improvements were that consistent year after year. The improvement was also rapid; compare 2004's Far Cry to 2007's Crysis. And so, for a time, game designers and hardware designers marched hand in hand towards ever greater performance.

The End of the Low-Hanging Fruit

But you know how this works, right? Everyone has seen VH1's Behind the Music. This next part is where it all comes apart after the explosive success and drugs and groupies, leaving just the drugs. This next part is where we are right now. If you look again at the chart of CPU clock speeds, you see that improvement flatlines at about 2005. This is due to the end of Dennard scaling. Until about 2006, reductions in the size of transistors allowed chip engineers to increase clock speeds without worrying about thermal issues, but that isn't the case anymore. Transistors have become so small that significant amounts of current leakage occur, meaning that clock speeds cannot improve without imposing unrealistic thermal loads on the chips. But clock speed isn't everything. The actual muscle of a CPU is a function of several things: the pipeline, the instructions per clock cycle, the clock speed and, after 2005 with the introduction of the Athlon 64 X2, the core count.
And even as clock speed remained the same, these other important metrics did continue to see improvement. The catch is that the raw performance of a CPU is, roughly speaking, the multiplicative product of all of these things working together. If the chip designers can manage a 20% increase in IPC, a 20% increase in clock speed, and some enhancements to pipeline design that amount to a 5% improvement, then they're looking at a 51.2% overall improvement in chip performance. Roughly. But if they stop being able to improve one of these factors, then to achieve the same increase in performance they need to cram the improvements into just the remaining areas, which is a lot harder than making modest improvements across the board.

Multi-core CPUs arrived on the market at around the same time that clock speed increases became impossible. Adding more cores to the CPU did initially allow for some multiplicative improvements in chip performance, which bought time for the trend of ever-increasing performance. The theoretical FLOPS (floating point operations per second) of a chip is a function of its IPC, core count and clock speed. However, the real-world performance increase provided by multi-core processing is highly dependent on the degree to which the task can be parallelized, and is subject to Amdahl's Law. Most games can be only poorly parallelized; the parallel portion is probably around the 50% mark for everything except graphics, which can be parallelized excellently. This means that by the time CPUs hit 16 cores, there was basically no additional improvement to be had in games from multi-core technology. That is, unless game designers start to code games specifically for better multi-core performance, but so far this has not happened. On top of this, adding more cores to a CPU usually imposes a small reduction in clock speed, so the actual point of diminishing returns may occur at a slightly lower core count.
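The two bits of arithmetic above, multiplicative gains and Amdahl's Law, can be sketched in a few lines of Python. Toy numbers from the text, not benchmarks, and the function names are mine:

```python
def combined_speedup(*factors):
    """Overall speedup when independent improvements multiply together."""
    total = 1.0
    for f in factors:
        total *= f
    return total

def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's Law: speedup of a task that is only partly parallelizable."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# 20% IPC gain x 20% clock gain x 5% pipeline gain ~= 51.2% overall
print(combined_speedup(1.20, 1.20, 1.05))  # 1.512

# A game that is 50% parallelizable tops out below 2x, no matter the core count
for n in (2, 4, 16, 64):
    print(n, round(amdahl_speedup(0.5, n), 2))
```

Note how little the 50%-parallel case improves between 16 and 64 cores; that is the diminishing-returns wall the text describes.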
On top of all that, designing new and smaller chip architectures has become harder and harder. Intel first announced 10nm chip architecture back in September 2017, and showed a timeline with it straddling 2017 and 2018. 2018 has come and gone, and still no 10nm. Currently Intel is hopeful that they can get 10nm chips to market by the end of 2019. AMD have had a somewhat easier time of it, announcing a radically different mixed 14nm and 7nm "chiplet" architecture at the end of 2018, and actually bringing a 7nm discrete graphics card to market at the beginning of 2019. However, this new graphics card merely matches NVIDIA's top-of-the-line cards, both in terms of performance and in terms of price. This is a significant development, since AMD's graphics cards have usually been second-best, or cost-effective mid-range models at best, so for them to have a competitive top-of-the-line model is noteworthy. But while CPUs and GPUs are different, it certainly doesn't paint a picture of obvious and overwhelming superiority for the new 7nm process. The release of AMD's "chiplet" Zen 2 CPUs appears to have been delayed to the middle of 2019, so I suppose we'll find out then. Additionally, it appears that the next-generation PlayStation will use a version of AMD's upcoming "Navi" GPU as well as a Zen CPU, and AMD hardware will power the next-generation Xbox as well. So AMD is doing quite well servicing the console gaming peasant crowd, at least. Time will tell whether the unexpected delays faced by their rivals, along with the unexpected boost from crypto miners buying literally every fucking GPU known to man, will allow them to dominate the hardware market going forward. Investors seem optimistic about AMD, less sanguine about Intel, and with NVIDIA, well... But the bottom line is: don't expect miracles.
While it would be enormously satisfying to see Intel and NVIDIA taken down a peg after years of anti-consumer bullshit, the reality is that hardware improvements have fundamentally become difficult. For the time being, nobody is going to be throwing out their old computers just because they've gotten slow. As the rate of improvement dwindles, people will start replacing their old PCs only because they've broken.

OK, but What About GPUs?

GPU improvements took longer to slow down than CPU improvements, in large part because GPU workloads parallelize well. But the slowdown has arrived. This hasn't stopped the manufacturers of discrete GPUs from trying to innovate, of course. Not only that; the market is about to become more competitive, with Intel announcing their plans for a discrete GPU in the near future. NVIDIA has pushed their new ray-tracing optimized graphics cards for the past few months as well. The cryptomining GPU boom has come and gone; GPUs turn out to be better than CPUs at cryptomining, but ASICs beat out GPUs by a lot, so that market is unlikely to be a factor again. GPUs are still relatively cost-competitive for a variety of machine learning tasks, although long-term these will probably be displaced by custom-designed chips like the ones Google is mass-ordering. Things really do not look rosy for GPU sales. Every time someone discovers some clever alternative use for GPUs, like cryptomining or machine learning, GPUs get displaced after a few years by custom hardware solutions even more fine-tuned to the task. Highly parallel chips are the future, but there's no reason to think that those highly parallel chips will be traditional GPUs, per se. And speaking of which, aren't CPUs getting more parallel, with their ever-increasing core counts? And doesn't AMD's "chiplet" architecture allow wildly differently optimized cores to be stitched together?
So the CPU of a computer could very easily be made to accommodate capable on-board graphics muscle. Why, then, do we even need GPUs in the future? After all, PCs used to have discrete sound cards and networking cards, and the CPU does all of that now. The GPU has really been the last hold-out, and will likely be swallowed by the CPU, at least on low and mid range machines, in the next few years.

Where to Next?

At the end of 2018, popular YouTube tech channel LinusTechTips released a video about Shadow, a company that is planning to use centrally-located servers to provide cloud-based game streaming. At the time, the video was received with (understandably) a lot of skepticism, and even Linus doesn't sound all that convinced by Shadow's claims. The technical problems with such a system seem daunting, especially with respect to latency. This really did seem like an idea that would come and go; this was not its time, and the technology simply wasn't good enough. And then, just ten days ago, Google announced that they had exactly the same idea. The fact that tech colossus Google is interested changed a lot of people's minds about cloud gaming. Is this the way forward? I am unconvinced. The latency problems do seem legitimately difficult to overcome, even for Google. Also, almost everything that Google tries to do that isn't search or Android fails miserably. Remember Google Glass? Google Plus? But I do think that games that are partially cloud-based will have some market share. Actually, they already do. I spent a hell of a lot of time playing World of Tanks, and that game calculates all line-of-sight checks and all gunfire server-side. Most online games do have some things that are calculated server-side, but WoT was an extreme example for its time. I could easily see future games offloading a greater share of the computational load to centralized servers rather than the player's own PC.
But there are two far greater harbingers of doom for PC gaming than cloud computing. The first is smartphones, and the second is shitty American corporate culture. Smartphones are set to saturate the world in a way desktop PCs never did, and American games publishers are currently more interested in the profits from gambling-esque monetization schemes than they are in making games. Obviously, I don't mean that in a generic anti-capitalist, corporation-bashing hippie way. I hate hippies. I fuck hippies for breakfast. But if you look at even mainstream news coverage of Electronic Arts, it's pretty obvious that the AAA games industry, which had hitherto been part of the engine driving the games/hardware train forward, is badly sick right now. The only thing that may stop their current sleaziness is government intervention.

So, that brings us to the least important, but most discussion-sparking, part of the article: my predictions. In the next few years, I predict that the most popular game titles will be things like Fortnite or Apex Legends. They will be monetized on some sort of games-as-a-service model, and will lean heavily if not entirely on multiplayer modes. They may incorporate some use of server-side calculation to offload the player's PC, but in general they will run on modest PCs, because they will only aspire to have decent, readable graphics rather than really pretty ones. The typical "gaming rig" for this type of game will be a modest and inexpensive desktop or laptop running built-in graphics with no discrete graphics card. There will continue to be an enthusiast market for games that push the limits, but this market will no longer drive the majority of gaming hardware sales. If these predictions sound suspiciously similar to those espoused by the Coreteks tech channel, that's because I watched a hell of a lot of his stuff when researching this post, and I find his views generally convincing.
Intel's Foveros 3D chip architecture could bring a surge in CPU performance, but I predict that it will be a one-time surge, followed by a return to relatively slow improvement. The reason is that the Foveros architecture allows for truly massive CPU caches, and these could be used to create enormous IPC gains. But after the initial boost from the change in architecture, the same problems that are currently slowing down improvement would be back, the same as before. It definitely wouldn't be a return to the good old days of Moore's Law. Even further down the road, a switch to a different semiconductor material such as gallium nitride (which is already used in some wireless devices and military electronics) could allow further miniaturization and speed-ups where silicon has stalled out. But that sort of prediction stretches my limited prescience and knowledge of semiconductor physics too far. If you are interested in this stuff, I recommend diving into Coreteks' channel (linked above) as well as AdoredTV.
  2. 6 points
    Toxn

    Competition: Californium 2250

    So this week I did my usual 'run hard with the first idea that pops into your head' approach. This is what I came up with: This is Brick junior. It fulfils the objective requirements for firepower and armour protection, and the threshold requirements for specific power (I don't know about the rest yet). The Brick series of vehicles are all built around the idea of the essential crew (see below) being housed in the hull/turret ring under the hull line. The turret superstructure, in turn, is thinly armoured except for the mantlet, although the addition of lightweight armour panels to the turret periphery can readily increase the protection provided to the gun and its associated equipment. As a result, Brick junior weighs 49 tonnes fully loaded and can run on a stretched T-72 derived suspension (basically one extra road wheel). The engine is an AVDS-series derived unit running to a rear transmission assembly (again, T-72 derived). The gun is a 150mm L/45 piece designed primarily to sling HEAT-FS and HE-FS. It also has an emergency supercharge APFSDS round which can penetrate a base-model Norman from the front out to 2000m if needed. The range of HEAT-FS ammunition (steel-coned HEDP, copper-coned and improved copper-coned tandem charge) means that no practical level of up-armouring will save the existing models of Cascadian tanks. The gun is fed by an autoloader unit running from an ammunition compartment in the left side of the turret. The compartment holds up to 20 charges and 20 shells, and feeds them to a loading rail in the turret rear. The ammunition compartment can be topped up from a secondary 20+20 compartment left of the driver when the turret is locked forwards. Both compartments have blow-off panels on top. The M-model Brick would use a simpler autoloader unit, so it would have to level the gun between shots. The more sophisticated version would be able to follow the gun through most of its elevation and depression range.
The gun itself can elevate/depress +30/-10 degrees. The coax is a 12.7mm piece. The most unusual feature of the Brick is the armoured pulpit mounted to the rear of the turret superstructure. This acts as a counterweight, and also houses the observer-gunner. Xis job is to observe and operate the rangefinder, freeing up some work from the commander and gunner. This crewmember is, however, not necessary to operate the tank and so only enjoys base levels of protection. A 12.7mm pintle-mounted gun may be provided for particularly trusted observer-gunners to operate. Brick junior identifies as gender-fluid demisexual polyamorous, and xer preferred greeting is 'please, oh god no'.
  3. 2 points
    GL-ATGM inside of 2A46
  4. 2 points
Just an improved-protection Type 59-1 prototype, nothing too fancy; what they are researching is likely used on the Type 69-2A.
  5. 2 points
    Heh, GL-ATGMs are not enough for us. 57 mm autocannons with guided projectiles, haha.
  6. 2 points
    Xoon

    Competition: Californium 2250

    So I have been working on a schematic for my hydrostatic transmission. In short, it's a hydraulic open circuit. A swash plate pump delivers the power to the system. An overpressure valve makes sure excess pressure and pressure spikes are vented into the reservoir. The power is split by a distributor valve, acting as a sort of differential lock in the case of a track slipping. The flow is then regulated by a flow valve that feeds into a direction changer valve. This feeds the hydraulic motor, also a swash plate unit. In parallel, it has a freewheel valve for when you simply want the vehicle to roll freely. Both feed back into the reservoir. An accumulator is also hooked into the high pressure side of the system, providing regenerative braking and smoothing out the power delivered to and from the motors. This is just a first draft; I will most likely add more safety features and refine the system. It also lacks the auxiliary equipment, like the suspension. I am also researching ways of making the pump more efficient than simply running it constantly with the excess pressure being vented into the reservoir, wasted. Maybe also making the pump into a starter for the engine. I also lack the oil cooler. I have been pondering using two engines to power the system, one for each motor, or two pumps. The issue would be that power could not be shared between the motors, and the same goes for the regenerative brakes.

I am also working on the hydropneumatic suspension. This is a schematic for the front- and rearmost road wheels. Pressure is fed into a 3/4 valve which is used to elevate the road wheel; an overpressure valve is used to regulate the amount it elevates. Beyond that, the pressure is fed through an overpressure valve that regulates the height of the suspension. Then an accumulator is coupled in parallel to provide the "springiness". Before the hydraulic cylinder is a flow valve, which regulates the amount of flow and is used to modify the stiffness of the suspension.
This system is fed by the open circuit shown above in the first picture. And yes, everything is written in Norwegian, because diversity, a core principle of the most supreme state. And yes, it was all hand-written while cutting steel, because I lack proper schematic software.
  7. 1 point
    Oops. https://www.somersetcountygazette.co.uk/news/17537164.all-knives-stolen-from-avon-and-somerset-police-knife-amnesty-bin/

"All knives stolen from Avon and Somerset Police knife amnesty bin", by Amy Cole: "POLICE are investigating after a knife amnesty bin was ransacked - and all the knives stolen."
  8. 1 point
    LoooSeR

    Turkish touch

    Already in shit:
  9. 1 point
    N-L-M

    Competition: Californium 2250

    0.45m.
  10. 1 point
    Sturgeon

    Competition: Californium 2250

    48.1 tonnes of hull so far.
  11. 1 point
    Sturgeon

    Competition: Californium 2250

    Here's the engine model I'll be going with: that is three V-55AM2s stretched out so that their cylinders each have a one-cylinder-sized gap between them; the center engine is mounted upside down, and the outer two are mounted right-side up and interleaved with it. It weighs 3,500 kg and produces a net of 2,340 hp. The current hull cross-section looks like this:
  12. 1 point
    Xoon

    Competition: Californium 2250

    Taking your idea, I came up with something like this. A pressure sensor sits on the accumulator. When the accumulator compresses, it compresses a hydraulic fluid which controls the engine throttle; the more the accumulator is compressed, the less throttle the engine gets. Also, when at dead bottom, it activates a valve which decouples the prime mover from the pump and locks it hydraulically, while also bypassing it with a check valve. I could add a hydraulic/pneumatic PID for more precise control. Also note that the control lines are simplified, and will probably be changed to better reflect their behavior later. And thanks, I almost flipped my table when I noticed the missing line. Luckily it is too heavy. EDIT: Updated my schematic with a PID; I am wondering if I should use an electric PID instead of a hydraulic one. A worry is that a lack of pressure could stop the system. I think I might need a separate flow chart for the control engineering: one schematic for the main hydraulics, and one for the control scheme. Edit: Damn, the spring in the clutch cylinder is on the wrong side. Please ignore.
  13. 1 point
    So I picked up a copy of Dave Hobbs' British Aircraft Carrier Design and read up a little on the "armored deck" carriers, and boy, I may have been wrong about them being overrated; they may in fact just be horrible designs. One thing he mentions is that the Royal Navy did not design the armored deck carriers with the Med in mind. That would at least have been a decent argument for these bad designs, but it's wrong if what Hobbs is saying is true. They designed these ships thinking that no amount of CAP could ever stop a raid from getting to the ship before it could attack. The carrier could not get its interceptors launched and high enough to stop the attack, so they decided armor and AA guns were the way to defend the ship. This was only a valid idea pre-radar, and even then they didn't get enough AA firepower or armor on these ships for it to help much. The few times the armored deck was tested, it didn't really live up to its reputation. Once radar was a thing, even the Brits realized it would allow enough time to launch interceptors once the tech matured, and by war's start radar was there. Now, this gets us into the Fleet Air Arm's aircraft choices, and this whole area is a nightmare of poor planning, doctrine, and interservice idiocy. So the only real test of the armor came when the Illustrious was attacked by Stukas, supposedly elite ship hunters, in January of 1941. Since her CAP got suckered down by a low-level attack, the Stukas had a free hand, and they hit the Illustrious six times: four 1100-pounders, one of them a dud, and three 550-pounders, one of them a near miss. What's interesting here is that only one bomb hit the armored deck, and it went right through the armor and blew up in the hangar, causing serious damage to the ship's structure. The near miss may have damaged the hull. She limped to Malta on fire and took another bomb a week later. Once they got her seaworthy, they eventually had to send her to the USA for a rebuild.
Even the US shipyards could not fix the ship all the way; she suffered vibration problems from these attacks that eventually required the center shaft to be removed and the ship limited to 26 knots. Later the vibrations got bad again and she had to be limited to 24 knots! She was out of action 10 months and was never right again. Even the argument that these carriers were good for the kamikaze threat is a myth, since the US Navy deemed them almost not worth the trouble of having around because of their small air groups, small bunker stores, and stupidly small avgas and ordnance storage. People do not think about the logistical side of the carrier much. The US Navy designed their carriers around an 80 to 90 plane air group, with enough gas and ordnance to operate them for about five days of moderate operations before they needed to refuel and rearm. The Essex class could do 20,000 miles at 15 knots on 6,160 tons of fuel oil. The Illustrious class did 12,000 NM at 14 knots on 4,640 tons of fuel oil. That means the Illustrious class had to pull off the line and refuel, a lot. The Essex class had 240,000 gallons of avgas; the Illustrious class only had 50,000 gallons! That's a small gas load even for a small air wing. It was stored very safely, though... Now, this problem is bigger than you think, because they realized the errors in their thinking and did everything they could to increase the air group size on the ships. They eventually got them up to about 60 planes, Corsairs and Avengers, and Spits later... They did this by adopting American-style deck storage and a multi-barrier landing system. This made problems worse in several ways for these ships. First, they were already cramped; by packing in more pilots and ground crew to work on the planes, they ended up packing these things like sardines, and their living standards were NOT up to US Navy standards. Maybe US Navy WWI standards. This also made the fuel problem almost unworkable.
They would have to take on avgas daily! Or they would if they could keep any airplanes working. So another problem with these ships is their layout. For some reason, the Brits decided these things needed two-story hangars. Why? Who knows. On the first four ships, the hangars were different heights, but still too short for good planes; one was 16 feet and one 14. Only the 16-foot hangar could take clipped Corsairs. Why not one larger, normal-sized hangar deck? No idea. So the British figured out these were not great ships after the first four, and in the next four tried to fix them, and messed them up much worse. They decided the armored box concept was too much and thinned out the sides. They also decided 30 knots was too slow, and added more boilers and a fourth shaft, in an only slightly bigger ship. This compounded the low living space problems. They did not really increase the bunker fuel or avgas loads much. Even better, they made both hangars 14 feet, so now they could operate Seafires or Hellcats, but the US Navy didn't have enough kitties to go around, so they operated the Spits, or rather crashed them over and over into the deck, destroying them far faster than enemy action did. If you look at these ships post-war, the ones that took damage didn't get rebuilds, and the ones that did still didn't operate long after the war. Granted, the Brits were broke, but the Essexes lasted in US service well into the 90s and were a bargain compared to a new Forrestal class. Another point that gets made is that the armored deck carriers were supposed to take bomb damage better, and then US carriers get shit-talked for their wooden decks, as if they didn't have an armored hangar deck. They also forget that the Enterprise, the greatest carrier in history, took three bombs and four near misses, retired under her own power, and was back in action in a little over a month.
Later, in the Battle of Santa Cruz, her terrible wooden deck took two bombs but was repaired during the fight, and she was able to land her aircraft and the Hornet's and continue to operate. When she retired from the fight, she was only laid up ten days for repairs before going back out for operations. One of the selling points of the wooden deck was ease of repair, and her machinery was all just fine after all that. Granted, two Essex class carriers caught bombs or kamikazes at the absolute worst possible time and suffered horrendous damage. That still doesn't make the case for the Brit armored CVs being good, since they never got tested with a whole deck park, loaded for a strike, going up on them. I bet neither the Bunker Hill nor the Franklin took ten months to fix either, and both were in "new" condition when mothballed. I think the US Navy was right; they did a bunch of studies that said a carrier would need to be 60k tons or more to have a viable armored deck and a usable air wing, and thus the Midways were born. Sources: Anatomy of the Ship: The Aircraft Carrier Victorious; Anatomy of the Ship: The Aircraft Carrier Intrepid; British Aircraft Carrier Design and History by David Hobbs; and Fleets of WWII by Richard Worth, plus that armored carriers apologist site.
  14. 0 points
    LoooSeR

    Collimatrix's Terrible Music Thread

    ... Black in gold. https://gunter-spb.livejournal.com/2531613.html
  15. 0 points
  16. 0 points
    DIADES

    Competition: Californium 2250

    To hear is to obey.