What goes into an HD texture pack?

I posted this in response to a question on the Daily thread. I think the response was a bit lost on the original supplicant (although it did answer his question!). So I’m reposting it here in case it helps someone else.

(I think search engines do a slightly better job indexing and reporting wordpress results than deeply-buried reddit comments).

Texture packs like Noble Skyrim and Skyrim HD only cover a tiny percentage of what’s in the HR DLC – Noble Skyrim covers 4.4% and Skyrim HD covers 2.6% of the textures.[1]

Running these packs without the high res DLC will actually decrease VRAM hit since so many textures will be reverted to low-res vanilla. With the high res DLC the increase in VRAM requirements is probably around 5-15% (a few hundred MB all told), which is consistent with what my actual tests have told me.[2] Obsessively retexturing every single object in Skyrim like I have[3] will increase VRAM usage by around a factor of 4, but in the end adding many new objects (if you triple the different kinds of animals and trees in one scene or give all 30 NPCs a different high res hair… woof) has the highest VRAM hit.

My recommendation is to use the optimized vanilla textures version that includes the HR DLC (since the high res DLC does have some inefficiencies), and to layer whatever retextures you please on top of that.


  1. Numbers include the DLC, normal maps, and all other types of dds files included in the HR DLC, such as glow maps, inactive parallax maps, etc.
  2. Math: 5% of the textures have been increased in size by somewhere between 2x and 4x. So 5% of the original size – 5% of 6.9 GB, or about 0.34 GB – has increased to around 1.3 GB with the 4x textures, for a total increase of about 1 GB, which is 14.5% of 6.9.[a]
  3. At this point Mod Organizer stops accurately counting overwrites – if I go to my virtual data folder I can’t find a single texture from the high res DLC, but it still says only around 60% of the files from it are overwritten.

  a. This does not explain why the total size on disk of Noble Skyrim is a whopping 3.43 GB. However, my tests show an increase of around 200-300 MB depending on scene, out of 1.6-1.8 GB usage total,[b] which is consistent with the math above.[c]

  b. Note that VRAM usage varies with hardware; these tests were done with a 970 and actual numbers were a lot different on my 650M.
  c. Noble Skyrim and Skyrim HD pick textures that make a big impact on the scene and focus on cities, dungeons… basically constructed features. So you’ll see a lot bigger difference in Solitude, where I have actual notes on the numbers, than in the middle of the forest in Falkreath, where neither mod does anything at all.
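If you want to play with the numbers yourself, here’s the footnote arithmetic as a quick Python sketch. The figures are the ones quoted in the notes (6.9 GB of textures, ~5% replaced); the 4x scale is the upper end of the stated range:

```python
# Back-of-the-envelope VRAM math from the footnotes above.
vanilla_total_gb = 6.9     # total size of vanilla + HR DLC textures
replaced_fraction = 0.05   # ~5% of textures replaced by these packs
scale_factor = 4           # replaced textures at ~4x their original size

original_gb = vanilla_total_gb * replaced_fraction   # ~0.34 GB replaced
replaced_gb = original_gb * scale_factor             # ~1.38 GB at 4x
increase_gb = replaced_gb - original_gb              # ~1 GB net increase
increase_pct = 100 * increase_gb / vanilla_total_gb  # ~15% of the total

print(f"{original_gb:.2f} GB -> {replaced_gb:.2f} GB "
      f"(+{increase_gb:.2f} GB, {increase_pct:.1f}%)")
```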

Skyrim Special Edition

Skyrim Special Edition, also sometimes called the Skyrim Remaster, is coming out in just over five weeks.

And we still know almost nothing about it.

What we do know is pretty easy to summarize though:

  • It will be 64bit and dx11. This *may* mean better memory handling (*if* they rewrote Skyrim’s faulty memory heap handler as part of the update), but it *definitely* means more than 3.1 GB of RAM available, and more than 4 GB VRAM available on Windows 8 and 10 (that was already the case on 7), and probably some snazzy dx11 features too.
  • It will have new shaders based on dx11. Shaders are things like: How the water and lighting looks. Like what ENB and SweetFX and SKGE do (actually it looks almost identical to SKGE).
  • They claim it will have new textures. There was no sign of new textures in the video. It may be that they’re just bringing the high res DLC to console.
  • There will be no new content. Period.
  • There almost certainly will not be any bug fixes, except for things that may have incidentally been fixed by recompiling the engine (such as the memory fix).
  • It will be a new game on the steam store. It will not overwrite or touch the old game.
  • All PC users who own all the DLC will get it for free.
  • Existing saves (presumably they mean vanilla saves) will work.

What about mod compatibility?

  • There will not be mods on PS4.
  • XBOX users will get a paltry 2 GB of storage for mods. Mods must be things that can be packed up in an esp + BSA in the Creation Kit. No loose files, no patchers, no dyndolod, skse, etc. And nothing that relies on those. Or is bigger than 2 GB.
  • “Old mods will mostly work” – they will need to be re-saved in the Creation Kit. Presumably this applies to mods that are an esp + BSA, and more complex mods may need reworking. We don’t know more than that.
  • It is possible for:
    • Havok to have updated (affecting all nifs with collision, HDT, etc.)
    • Scaleform to have updated (affecting all mods that use the UI)
    • Papyrus to have updated (not at all likely given that old saves will work, unknown impact)
  • Any mod that injects code will need to be redone. That is, any mod with a .dll file will need to be redone, by the author (or someone else with appropriate permissions and a lot of skill at looking up addresses).

The scale of this redoing is summarized best by Expired (author of Racemenu, EFF, F4SE, a bunch of other things, contributor to SKSE and SkyUI, just generally our best hope for updating literally everything with a .dll), in this comment.

They may not have to totally redo all the code from scratch.

But they will have to, regardless, look up each and every address that each and every dll accesses, find the new address that does the same thing (if it exists), and update. Without any documentation.

The sky is not falling, but it’s going to be a lot of work, done by the very few people who know how to do that work.

And even though this doesn’t apply to console at all, because you guys can’t even use the mods that discussion is about (dll files physically won’t work on console), it will affect the conversion and progress of the mods you will get. Only PC gamers can make mods, and there’s not much incentive for PC gamers to mod SSE if all the mods we can’t live without can’t work.

Addition: I forgot to mention ENB. It’s technically included in the “anything with a dll” statement, but since Boris will only let Boris work on it, it’s kind of a special case. Boris was not happy with the dx11 changes made to the Creation Engine for FO4 and said he couldn’t really do anything interesting ENB on that game. He says this is probably also going to be true for Skyrim Special Edition. Without seeing it in person it’ll be hard to say how big an impact that’s going to be on getting a beautiful game. Certainly many people are perfectly happy with FO4 graphics. But if you’re in love with how your game looks currently… just understand that even if ENB does get updated for SSE, it’s never going to look quite the same.

Another thing to note is that SKGE (which Alenet has promised will be updated for SSE) technically allows replacement of the vanilla shaders with any custom shader. This means better performance than ENB’s post-processing method, and potentially any shader that’s in ENB could also be implemented through SKGE. But that means someone rewriting the shader for use in SKGE, which not many people have the talent to do (or rather, those with the talent don’t have the time). And all of the updates above need to happen before SKGE can be implemented for SSE at all. However, many people are now seeing the potential in SKGE, and it’s possible the current (really terrible) shaders will get replaced with something that really can make a game beautiful. So it may be possible to get a beautiful game without ENB one day, but only time will tell.

SSE is releasing in just over five weeks.

It will not be reasonably moddable on PC for… I give it just over five months.
I’ll see you on 64bit in March, guys.

Update: It is now May and no sign of a vast number of my favorite mods being ported. GG – my estimate was the longest I saw and it’s taking even longer!

Scripts are good, mmkay?

There are 10x as many scripts in the vanilla game as there are in even a pretty heavy load order. Yes, mod scripts are likely to be doing much heavier things, things papyrus was not built to do particularly well. And modders aren’t all professional programmers and may make pretty bad scripts sometimes.

But overall, scripts aren’t out to eat your save games and shit on your bed. They’re an essential part of everything from dialogue and quests to casting a spell to that really cool swirling wind effect when Alduin shows up. Not to mention freezing/starving to death, draining stamina, placing weapons on your back, putting snowberries on a bookshelf, and so on and so forth.

So when I see someone say they heard “there’s a limit of only three or four scripted mods in a stable game” a single tear rolls down my cheek.

It’s like not eating baklava because you think it has fish in it.

Oh, and all my time troubleshooting, I’ve seen maybe 10 CTDs caused by scripts, and those were really, really obvious.

Even if scripts are badly written, they do not cause CTDs. Stack dumps, scripts failing to calculate, lag, save bloat, and even (eventually) save corruption can be caused by scripts, but just a straight-up CTD is exceedingly rare. The vast majority of people will never see a CTD caused by a haywire script.

If you are crashing, do not show me your papyrus log. Period. It is not relevant. I do want to see your memory blocks log though.

If you have some other issue that is not crashing, you can go ahead and share (your entire) papyrus log. Read this first though.

Myths and Legends: LOOT

I hear a lot of Myths about LOOT. I hear it’s the perfect load order tool, and if you run LOOT your game will never have any bugs. I hear it’s a horrible mess, and running LOOT will turn your load order into a massive pile of sewage. I even hear that LOOT never does anything and why do people recommend it? (On that last one: Learn the difference between mods and plugins, mmkay?)

Thing is LOOT isn’t Jesus come to save us and it’s not a pile of dog poo either. It’s a massive crowdsourcing project.

Like this piece of music: https://crowdsound.net/

It’s not a bad song, but there’s some places where you just don’t understand why it made the decisions it made. Got caught in a loop or something.

And it’s pretty cool for something that’s totally free and all crowd sourced.

Firstly:

LOOT does not solve bugs. LOOT sorts plugins to the best of its ability to do so. And it reports issues, if those issues have been reported to LOOT. Right? LOOT can’t know things it doesn’t know. It’s not magic.

LOOT cannot replace your own reading on incompatibility, looking in TES5edit at mod overwrites, looking for errors, etc. If you say “I ran LOOT, I don’t know why it’s buggy”, I’m going to take a quick look to see if you actually ran LOOT, then tell you to go look for bugs.

Secondly:

LOOT does not do a bad job sorting. It does a fantastic job sorting. I know just about all there is to know about mods 😉 and I still can’t sort a load order as well as LOOT can, with all the tools available to me.

Some people claim LOOT doesn’t sort mods correctly. This is also a Myth. Because guess what? Half the time, LOOT actually DOES sort the damned thing correctly and you just think otherwise because you’re wrong, or you didn’t actually try it, or some other reason. The other half the time, it’s because LOOT didn’t know it was doing anything wrong. LOOT only knows what it knows.

You are the crowd.

Go tell it what to do on Github and quit your pointless bitching.

The above thoughts also apply to Mod Picker (which I promise is coming out Soon TM and at this point that is a Riot Soon TM and not a Blizzard one and I’m sorry).

Myths and Legends: Papyrus Ini Settings

It’s well established in the community that some ini settings should never be changed. For the most part, this is true. Increasing uGridsToLoad WILL decrease the stability of your game. Adding HWHavokThreads WILL (probably) do literally nothing.

And yet, there is the pervasive Myth that some settings will improve your game’s performance, or at least prevent stack dumps. The Legends of the community will tell you the opposite – that changing these settings will actually INCREASE the likelihood of stack dumps.

The truth might be a little more subtle.

However, nothing will improve the stability of the scripting engine as much as using fewer, and better-written, mods.

fPostLoadUpdateTimeMS=500

No one thinks this setting is unsafe. It will increase your loading screen time by the amount listed, in this case half a second (the default value). I’d recommend keeping it under 1000, as large values may noticeably increase loading screen time.

fUpdateBudgetMS=1.2
fExtraTaskletBudgetMS=1.2

These settings control how much time per frame papyrus gets to do its thing. Each frame, when at 60 fps, is 16.67 ms. 1.2 ms of that is taken for Papyrus to do its calculations. The remainder goes to other calculations, the largest chunk being drawing the frame. If you’re struggling to stay at 60 fps, it’s because your computer can’t do everything it needs to do (calculations and rendering) in 16 ms. If you’re well over 60 fps and have to cap it, your computer has no problem drawing the frame in 16 ms.

My understanding is that any papyrus steps that do not get completed in the time set by this setting get pushed to the next frame. After getting pushed past a certain number of frames, the script may fail to run or will certainly fail to do what it was supposed to do in a timely fashion. If the game decides a script is frozen altogether because it hasn’t had a chance to do its thing in a very, very long time, it may dump it. That’s bad.

If your computer is running at lower frames, scripts will take longer to run too. Remember, they only get a certain amount of time per frame. If your computer is taking 30 ms to draw a frame (about 33 fps), papyrus only gets about 40 ms every second to do its thing. If your computer is taking only 16 ms to draw a frame, papyrus gets about 75 ms out of every second to do its thing. Everything is smoother at higher frame rate!
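That per-second arithmetic is just frames-per-second times the per-frame budget. A quick sketch, using the default 1.2 ms value (the frame times are examples):

```python
# How many milliseconds per second Papyrus gets, given the per-frame
# budget (fUpdateBudgetMS, default 1.2 ms) and how long a frame takes.

def papyrus_ms_per_second(frame_ms, budget_ms=1.2):
    frames_per_second = 1000 / frame_ms
    return frames_per_second * budget_ms

print(papyrus_ms_per_second(30))  # 30 ms/frame (~33 fps) -> ~40 ms/s
print(papyrus_ms_per_second(16))  # 16 ms/frame (62.5 fps) -> 75 ms/s
```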

However, more things to do in papyrus will not decrease frame rate, because of that little setting up there. It prevents a laggy or badly written script from taking over and freezing the game – the game will not wait for the script.

So. What happens when we increase that setting? Papyrus can use more time. Unlike the load screen thing, it doesn’t have to use more time. It just can.

Let’s say we increase it by 25%. A nice conservative change.

Papyrus gets 25% more time to process. Everything else you need to do to get a frame drawn in 16.67 ms gets 0.3 ms less to process. Doesn’t seem real significant, does it? It’s a big boon to papyrus if it happens to need all that time, and a tiny change in how much time your computer gets to draw a frame, if it uses all that time.

Let’s say your computer is really struggling. You’re at 30 frames, and you know scripts aren’t running in a timely fashion.

That change will reduce your framerate. But, it will overall increase how much time papyrus gets… at the cost of everything else!

If your computer’s really breezing along, and you’d be at 100 fps if you didn’t have to cap it to 60… your computer can process a frame in 10 ms. You have to cap it at 60. Why not give an extra 5 ms to papyrus? If it needs it, you’ll still be above 60 fps. If it doesn’t need it, you’ll be exactly where you were before. Your scripts may run in fewer frames (they may not), leading to an overall performance improvement and more stable gameplay.
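The headroom math from that 100 fps example is easy to sanity-check. These are hypothetical numbers: ~10 ms per frame total, of which the default 1.2 ms already belongs to papyrus:

```python
# Worst-case frame time after raising the Papyrus budget, assuming
# Papyrus actually uses its whole budget every frame.

FRAME_CAP_MS = 1000 / 60  # 16.67 ms per frame at a 60 fps cap

def worst_case_frame_ms(other_ms, papyrus_budget_ms):
    return other_ms + papyrus_budget_ms

# 100 fps machine: ~10 ms per frame including the default 1.2 ms budget,
# so ~8.8 ms for everything else. Give Papyrus an extra 5 ms:
frame = worst_case_frame_ms(8.8, 1.2 + 5.0)
print(frame, frame <= FRAME_CAP_MS)  # still under the 60 fps cap
```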

But if your game gets a particularly difficult to process scene, with a lot of scripts and a lot of things to compute… it’s suddenly going to chug much, much harder. Not only will it take more time to draw the frame, but papyrus will demand that time too instead of patiently waiting its turn.

And if you think that papyrus is going to take a full 800 ms to process anything, per frame? Just turn that shit off. You’re saying papyrus gets to take almost an entire second just to do its stuff if it needs it? If your game is ever at that degree of laggy, it is literally unplayable.

And that’s why you should never set these the way the popular Myths will have you do it.

But nor should you shun them in fear, despite what the Legends tell you, because they’re not actually that scary.

iMinMemoryPageSize=128
iMaxMemoryPageSize=512

These control how much memory is devoted to holding Skyrim’s script processes. Anyone who’s ever thought “Wildcat is only a 76 kb download? That can’t be right!” has come across the fact that scripts are really small. They don’t require that much memory to process, either.

The first two control the size of stacks for papyrus to allocate. The way this works might remind you of the way the familiar SKSE patch works, except this is only for papyrus (and it’s stacks, not heaps – see discussion below). Also, those values are literally six orders of magnitude smaller, because they’re in bytes, not megabytes. Also, unlike the SKSE patch, THIS ALLOCATOR IS NOT BROKEN.

Repeat after me:

Papyrus Memory allocation IS NOT BROKEN.

So there’s no need to increase the stack size! It will fucking allocate a new stack when it needs to! You don’t need to make bigger stacks, because it can make more stacks!

Anyways. No reason to mess with these then. According to the CK wiki:

“iMinMemoryPageSize is the smallest amount of memory the VM will allocate for a single stack page, in bytes. Smaller values will waste less memory on small stacks, but larger values will reduce the number of allocations for stacks with many small frames (which improves performance).”

Decreasing = less memory wastage, perhaps important if your Skyrim uses more than 3.1 GB (Seriously, if your Skyrim ever uses more than 3.1 GB of RAM, screenshot that shit, I want to see it! And your modlist! Please!)

Increasing = fewer allocations, better performance – which is probably not noticeable because allocation is not really the slow part of the skyrim engine in most cases.

“iMaxMemoryPageSize is the largest amount of memory the VM will allocate for a single stack page, in bytes. Smaller values may force the VM to allocate more pages for large stack frames. Larger values may cause the memory allocator to allocate differently, decreasing performance for large stack frames.”

Decreasing = more allocations, less performance

Increasing = broken memory allocation, less performance.
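If a toy model helps – and to be clear, this is an illustration of the min/max page idea, not Bethesda’s actual allocator – the number of page allocations a stack frame costs depends on where it falls relative to those bounds:

```python
import math

# Toy model: a frame smaller than the max page size fits in a single
# page (of at least iMinMemoryPageSize bytes); a larger frame has to
# be split across multiple max-size pages, meaning more allocations.

def pages_for_frame(frame_bytes, min_page=128, max_page=512):
    if frame_bytes <= max_page:
        return 1  # one page, somewhere between min_page and max_page
    return math.ceil(frame_bytes / max_page)

print(pages_for_frame(100))   # small frame: one allocation
print(pages_for_frame(2000))  # large frame: several allocations
```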

So. DON’T TOUCH THESE.

It doesn’t help, and it can certainly hurt.

I guess I just agreed with the Myth, eh? Well that’s the problem with Myths. Most of them have a grain of truth.

iMaxAllocatedMemoryBytes=76800

Last one. This is the maximum total memory for all stacks combined. So Skyrim can only use 75 kb of memory for Papyrus stacks at any given moment. That’s… not a whole lot.

But there’s a reason it’s not a whole lot. Scripts are tiny. Real tiny. Even all 76 kb of Wildcat doesn’t need 75 kb to process at any one time.

But still… Skyrim… with a lot of mods… that’s a lot of scripts. Even if they’re tiny that can all add up. What if papyrus needs more and can’t allocate it?

It waits for memory to be freed, and then it uses it. Waiting = slower scripts, and eventually, if your game is that overloaded, stack dumps.

So why don’t we increase it?

Because if you make it bigger, you get stack thrashing, and stack thrashing causes stack dumps, and that’s bad.

(I can’t actually explain stack thrashing, I read the whole wikipedia article on it and I’m still not really sure what it is other than “buffer overflow”, and I can’t explain what a buffer overflow is, other than it’s bad. It’s real bad. It’s a heck of a lot worse than waiting on a slow script).

Again, small increases might give papyrus a little more breathing room without harming it too much. 25%. Maybe 50%. But doubling it? Increasing it by 9 orders of magnitude? (I’m not kidding. I wish I was kidding). Don’t do that.
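For a sense of scale – the default is the 76800 bytes shown above, and “9 orders of magnitude” is not an exaggeration of what some circulated tweak guides suggest:

```python
DEFAULT_BYTES = 76800  # default iMaxAllocatedMemoryBytes (75 kb)

modest = int(DEFAULT_BYTES * 1.25)  # a 25% bump: 96000 bytes
bold = int(DEFAULT_BYTES * 1.5)     # a 50% bump: 115200 bytes
absurd = DEFAULT_BYTES * 10**9      # 9 orders of magnitude: ~76.8 TB

print(modest, bold, absurd)
```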

Probably it’s important right now to note the difference between Stacks (what all that discussion up there was about) and Heaps (which is what the SKSE memory patch edits).

Stacks are quick memory used for active calculations. They’re allocated, the calculation gets done, and they go away. Easy.

Heaps are slow memory used for storing things, like objects and variables. They’re allocated, lots of different stacks access them to do different things, and they don’t get cleared. They stay there and the heaps slowly grow as new objects get named.

That’s why stacks are so much smaller, and why it’s really not a big deal to have a slightly-too-small cap on stack memory (because a stack will just go away and then you can make a new one). And why it’s a big deal to have a slightly-too-small heap size.
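If an analogy in code helps – and this is strictly an analogy, not how the Papyrus VM is implemented:

```python
# Stacks are short-lived scratch space for a calculation in flight;
# the heap is long-lived storage that only grows as objects get named.

heap = {}          # long-lived: objects and variables
stack_in_use = 0   # short-lived: scratch space for running scripts

def run_script(name, stack_bytes, result):
    global stack_in_use
    stack_in_use += stack_bytes   # allocate a stack for the calculation
    heap[name] = result           # the result lives on in the heap
    stack_in_use -= stack_bytes   # the stack goes away when it's done

run_script("snowberries_placed", 256, True)
run_script("stamina_drain", 512, 3.5)

print(stack_in_use)  # the stacks are gone
print(len(heap))     # the heap keeps everything
```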

So – the Myth has some truth, and some falsehood. Or maybe depending on which Myth you read, it was all false or all true. There’s a lot of different Myths. But written here, is the truth, as best I know it.

Myths and Legends: Autosaves and Quicksaves

TES is a universe built on Myths and Legends. Or Legends that are Myths. Or Myths that are really Legends. Everything is true and not true at the same time. It’s a great universe to create in.

No surprise that Myths about modding are just as common.

There is an extremely popular myth that the default Skyrim autosaves and quicksaves do not properly stop scripts, and therefore will cause CTDs? Save bloat? Stack dumps? The actual symptom is different every time the myth is repeated (a common problem with myths). Only going to the Skyrim menu and saving from there will result in a safe save. Or the console. That works too.

This is most certainly a myth. First of all, anyone can test for themselves that quicksaves and autosaves do indeed stop scripts from running, identical to going to the main menu. If you turn on papyrus logging and go to make a quicksave, you can easily see the “VM is freezing” and “VM is thawing” messages that indicate scripts were stopped, recorded, and restarted. This happens regardless of the type of save.

No, the rendering engine doesn’t stop, but the rendering engine doesn’t get baked into a save, now does it?
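If you want to check your own log, those messages are easy to count. The log path below is the usual default; yours may differ:

```python
# Count "VM is freezing"/"VM is thawing" messages in a Papyrus log.
# The path is an assumption -- adjust for your own Documents folder.
from pathlib import Path

log = Path.home() / "Documents/My Games/Skyrim/Logs/Script/Papyrus.0.log"

def count_freeze_thaw(text):
    return (text.count("VM is freezing"), text.count("VM is thawing"))

if log.exists():
    freezes, thaws = count_freeze_thaw(log.read_text(errors="ignore"))
    print(f"freezes: {freezes}, thaws: {thaws}")
```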

Secondly, there’s this excellent breakdown of why this is an utterly ridiculous thing to even think in the first place, by Merad.

To quote:

“The whole ‘script running while saving’ thing, however, is moronic. Creating a save requires capturing a snapshot of the world state to a file. If you allow the world state to be altered while you are saving it, of course you will end up with blatant corruption everywhere. That’s the kind of mistake that a sophomore CS major should know to avoid. I find it hard to believe that Beth could have devs that stupid, and also hard to believe that the game would function at all if it was written that way.”

Now where did this myth come from?

There’s a few possibilities.

  1. Windows corrupted files when it was handling the IO of writing a save. While unlikely, Windows does occasionally corrupt files, especially on older computers. My understanding is that this is more likely if you’re overwriting files.

  2. The saves got corrupted because of skyrim/mod bugs, and people who relied only on autosaves and quicksaves didn’t have old backup saves to go to.

  3. Overwriting files (when files are named the same) takes longer because of an issue with SKSE. It just takes longer, there’s no actual harm in it.

What’s the real answer?

Autosaves and Quicksaves are perfectly fine, but you should never overwrite files, as that may cause problems. Plus, you always want an old save to return to in case something bad happens to your more recent saves. Don’t overwrite or delete saves!