
[–] BeautifulInside ago 

It rarely pays off to optimize memory footprint or disk size in desktop applications. And similarly, performance doesn't sell. The balancing act is usually made to maximize long-term profits, or to minimize deficits. Quality is merely a factor in that equation, and a poorly understood one at that. In general, the people who actually make the product are the ones that want the product to be the best, regardless of costs.


[–] rwbj 0 points 1 point (+1|-0) ago 

As an aside to this, there's a subtle marketing benefit to inflated sizes. Gamers who know absolutely nothing about development often associate bigger file sizes with bigger or more impressive games. There was a terrible PS3 game, Heavenly Sword, that offered about 5 hours of mediocre gameplay. The marketing behind the game managed to use its, at the time, very large file size as a selling point. Sony was doing the same thing by framing Blu-ray as enabling all-new sorts of gaming experiences simply because of the extra disc space. Where'd all the disc space go? They used up 10GB on uncompressed audio alone. Completely unnecessary, and in that case done almost certainly for marketing purposes. And that's not just a distant memory. Titanfall may have played the same game, though its marketers weren't so overt about it. Of that game's 48GB install, a whopping 35GB was uncompressed audio!


[–] flope_de 0 points 1 point (+1|-0) ago  (edited ago)

Fallout 4 does the same shit. More than three quarters of the install is taken up by multiple copies of the same badly compressed textures, uncompressed audio, and badly compressed videos. Half of those videos aren't even used in the actual game. With proper compression it wouldn't be much larger than Skyrim.


[–] Salnax 0 points 2 points (+2|-0) ago 

Generally, these larger file sizes are cases of developers sacrificing a lot of storage space for a chunk of better presentation and performance. And on one hand, I welcome it, since storage is much better than it used to be in the days of 4 GB hard drives and 700 MB CDs. If your typical Blu-ray disc has over 15 times the capacity of a GameCube disc, those moderate improvements are harmless.

The main problem I've had is that although physical storage is progressing at least as quickly as file size, my ability to download large files isn't. Which is honestly one of my bigger gripes about PC gaming at the moment.


[–] thrus 0 points 3 points (+3|-0) ago 

Game drive space often goes to graphics assets that are stored uncompressed: load times are better if you don't have to wait for files to be unzipped every time you move around, and gameplay is smoother for the same reason.
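A toy sketch of that tradeoff (Python, with made-up repetitive data standing in for a game asset): the zlib-compressed copy is far smaller on disk, but every load of it pays a decompression step that a raw file skips entirely.

```python
import time
import zlib

# Hypothetical "texture" payload: repetitive pixel-like data, compresses well.
raw_asset = b"\x80\x40\x20\x10" * 1_000_000  # ~4 MB when stored raw

compressed = zlib.compress(raw_asset)
print(f"on disk, compressed: {len(compressed):,} bytes")
print(f"on disk, raw:        {len(raw_asset):,} bytes")

# Shipping the compressed copy means paying this CPU cost on every load:
start = time.perf_counter()
loaded = zlib.decompress(compressed)
elapsed_ms = (time.perf_counter() - start) * 1000
assert loaded == raw_asset
print(f"decompression took {elapsed_ms:.1f} ms per load")
```

Storing assets raw simply moves that cost from load time to disk space, which is the trade many games choose.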


[–] enneract ago 

What about S3TC? Textures compressed with formats like that are much smaller than uncompressed files (saving disk space) and can also be uploaded directly to the GPU in their compressed form (making loading times shorter).
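For reference, S3TC's savings are fixed by the format rather than by the image content: every 4x4 pixel block becomes 8 bytes (DXT1) or 16 bytes (DXT5), versus 4 bytes per pixel for uncompressed RGBA8. A quick sketch of the arithmetic:

```python
# S3TC (a.k.a. DXTn/BCn) compresses fixed 4x4 pixel blocks, so the
# compression ratio is constant no matter what the texture contains.
RGBA8_BYTES_PER_PIXEL = 4  # uncompressed 32-bit RGBA

def s3tc_size(width, height, bytes_per_block):
    """Bytes needed for a width x height texture in a 4x4-block format."""
    blocks_w = (width + 3) // 4  # round partial blocks up
    blocks_h = (height + 3) // 4
    return blocks_w * blocks_h * bytes_per_block

w, h = 2048, 2048
uncompressed = w * h * RGBA8_BYTES_PER_PIXEL
dxt1 = s3tc_size(w, h, 8)    # DXT1/BC1: 8 bytes per block (1-bit alpha)
dxt5 = s3tc_size(w, h, 16)   # DXT5/BC3: 16 bytes per block (full alpha)

print(f"RGBA8: {uncompressed // 1024} KiB")
print(f"DXT1:  {dxt1 // 1024} KiB ({uncompressed // dxt1}:1)")
print(f"DXT5:  {dxt5 // 1024} KiB ({uncompressed // dxt5}:1)")
```

So a 2048x2048 texture drops from 16 MiB to 2 MiB (DXT1) or 4 MiB (DXT5), and GPUs sample these formats natively, so there is no unpacking step at load time.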


[–] thrus ago 

I don't know enough about the nuts and bolts. I just know some games (Payday 2, for example) released patches that, while not huge themselves, drastically altered the install size with this in mind. It could have been something other than graphics, but the massive gain in size seemed too much to have been anything else (something like 10 or 15GB for Payday 2).


[–] 5770714? 0 points 3 points (+3|-0) ago 

Because it adds layers on layers on layers on layers of library code. Sometimes reinventing the wheel is worth it.


[–] enneract 0 points 3 points (+3|-0) ago 

It's not just layers of library code, but layers of abstraction in general. Tons of developers are using managed languages like C# that are significantly less efficient because of all the run-time checks and garbage collection they perform.


[–] 5772772? 0 points 1 point (+1|-0) ago  (edited ago)

C# produces very efficient code; if your program is slow, it usually has nothing to do with C# but with how you built your application. Just look at the IL the compiler emits, and especially at how the JIT translates that IL into machine code that is nearly as efficient as C++.

But of course, if you have a SAP background, then you will never get that kind of speed out of managed code.


[–] rwbj 0 points 1 point (+1|-0) ago 

C# is by no means less performant than even something like native C++ nowadays. That hasn't been the case for years; feel free to Google for benchmarks. You won't find many done in the past 6 or so years, as once the difference became negligible, benchmarks no longer served much purpose. You're right, though, that there has been a dramatic spike in the number of games and developers using C#, and so the proportion of all games using C# - including poorly optimized ones - has increased.


[–] Reubarbarian 0 points 6 points (+6|-0) ago  (edited ago)

As a Game Designer at a major international corporation, I can tell you that it has much more to do with budgets, schedules, and managers' decisions than anything to do with the (correct) desires of the developers to optimize.

All but the tiniest studios are on ridiculous schedules and suffer budgetary constraints/ mismanagement.

This isn't intended to excuse this behaviour (as it is horrible and it kills me inside every day), but it is an unfortunate consequence of needing large amounts of cash in order to create your favourite titles. I complain to my managers about this on a regular basis! ;)


[–] RevanProdigalKnight 0 points 10 points (+10|-0) ago  (edited ago)

As a software engineer, one of the first things I was taught in school was that there are always tradeoffs in programming. Want a program to run faster? You'll have to use more RAM. Need to save space in memory? The data will have to live on disk, which adds waits for I/O operations.
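The speed-for-RAM side of that trade can be shown with a small sketch (the function names are purely illustrative): caching every result makes repeated work fast, at the cost of holding all those answers in memory.

```python
from functools import lru_cache

# Uncached: recomputes the whole recursion tree on every call.
# Slow, but uses almost no extra memory.
def fib_slow(n):
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

# Cached: each result is computed once and then kept in RAM.
# Fast, but memory use grows with every distinct input seen.
@lru_cache(maxsize=None)
def fib_fast(n):
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(90))                    # returns instantly thanks to the cache
print(fib_fast.cache_info().currsize)  # entries now held in memory: 91
```

`fib_slow(90)` would take years of CPU time; `fib_fast(90)` is instant precisely because it spends memory on 91 cached results.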

In addition, most of the code running in a given program is part of a second- or third-party library, meaning the developer of the end product has no control over how it allocates its resources.

Now, is it impossible to work around these problems? No, not at all. We simply aren't given the time to do so by management.


[–] tomlinas 0 points 2 points (+2|-0) ago 

Not only are there tradeoffs, but there's also a largely unspoken tradeoff in a user-facing local application like Skype: you want the thing to be as responsive as possible to the user. If the user has 8GB of RAM and 5GB is free, I'm certainly not going to focus on keeping my memory footprint as small as possible.

With drive space and memory prices at all-time lows, it's really not worth optimizing for those things anymore, except in corner cases where you're crushing IOPS with a database or running a really unoptimized set of threads or something.