
[BUG] Metadata shift of some blocks after adding Block of Granite #1361

Closed
ALongStringOfNumbers opened this issue Dec 21, 2020 · 10 comments · Fixed by #1549
Labels: open for discussion, status: help needed, type: bug

Comments

@ALongStringOfNumbers
Collaborator

Describe the bug
When granite was made into a GTCE material, a Block of Granite was added as a GregTech block. However, it was inserted near the end of the compressed block set, at gregtech:compressed_16:12, which bumped up the meta numbers of every block that was originally at gregtech:compressed_16:12 or beyond; gregtech:compressed_18:0, previously the final compressed block, is now gregtech:compressed_18:1 after the addition of the Block of Granite.

This shifts any blocks already stored in an existing world, for example in a Storage Drawer or the player inventory, transmuting each affected block into the one preceding it in meta order. It also affects the scripts of any pack maker who performed operations on blocks at compressed_16:12 or beyond.

Versions
Forge: 14.23.5.2847
GTCE:
Modpack: This was originally found in Omnifactory (See Nomifactory/Nomifactory#662 ), but was then replicated in a minimal environment
Addons: SoG was installed in Omnifactory, but this can be replicated without SoG

Setup
Playing Solo
New world generated

Steps To Reproduce
To replicate the material transmutation in a minimal environment:

  1. Join a world with Storage Drawers and a GregTech version before 1.10.5.582 (I used 1.10.4.574)
  2. Give yourself a block of Magnesia
  3. Place this block in a storage drawer, and keep one in your inventory
  4. Leave the world and update the GregTech version; I updated to 1.10.8.599 in this example
  5. The block of Magnesia in the storage drawer became a block of Material._Null in this case.

To replicate in a modpack environment:

  1. Install Omnifactory 1.2.2
  2. Give yourself a block with the command /give @p gregtech:compressed_16 1 12. This will be Block of Magnesia.
  3. Place this block into a storage drawer.
  4. Repeat this with any block desired after compressed_16:12. You can even do compressed_18:1, which does not exist in this version.
  5. Update Omnifactory to the latest dev preview and drop your 1.2.2 save into the dev preview instance.
  6. See that the items in the storage drawers have changed, and that even the compressed_18:1 block, which previously was a block of Material._null, is now a block of Neutronium. This is because the Block of Granite was added partway through the compressed block list.

Expected behavior
For no shift of existing material to occur. I do not know how well this could be accomplished, other than by adding new blocks at the end of the compressed block list. This would mean that instead of Block of Granite being compressed_16:12, it would be compressed_18:1, which would not shift the IDs of any of the blocks before it.

Alternatively, there might be a way to update the IDs of every block after the newly added one so that materials are not changed during an update; however, I do not know anything about how that would work.

@Exaxxion
Collaborator

I haven't found a way to hook into Forge's detection of the world being saved by an older version of GTCE, but there is an event that can be listened to if an ID is missing, which mods like AdvancedRocketry have used successfully to remap items. By changing instances of gregtech:compressed_## to some other name, you could trigger the event and remap the existing items to a more future-proof mapping scheme.

The trick of course is knowing what the old mappings were; the ordinals used for MetaBlock mappings aren't exactly informative on their own, particularly since GTCE's known materials are commingled with arbitrarily defined ones from CraftTweaker scripts or addon mods.
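
For reference, listening for that missing-ID event in Forge 1.12 looks roughly like the sketch below. This is only an illustration: the gregtech:meta_block_## target name stands in for "some other name" and is not an agreed-upon scheme.

    // Sketch of hooking Forge's missing-mapping event (1.12.2 API). Fired when a saved
    // registry name no longer exists; each mapping can be remapped or ignored.
    @SubscribeEvent
    public void onMissingBlocks(RegistryEvent.MissingMappings<Block> event) {
        for (RegistryEvent.MissingMappings.Mapping<Block> mapping : event.getAllMappings()) {
            ResourceLocation key = mapping.key;
            if (!"gregtech".equals(key.getResourceDomain())) continue;
            if (key.getResourcePath().startsWith("compressed_")) {
                // Illustrative target: gregtech:compressed_16 -> gregtech:meta_block_16
                ResourceLocation target = new ResourceLocation("gregtech",
                        key.getResourcePath().replace("compressed_", "meta_block_"));
                if (ForgeRegistries.BLOCKS.containsKey(target)) {
                    mapping.remap(ForgeRegistries.BLOCKS.getValue(target));
                } else {
                    mapping.ignore(); // leave untouched for a later conversion pass
                }
            }
        }
    }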

@LAGIdiot
Member

This is unexpected behavior that should not have occurred. It seems that compressed blocks are not tied to the material ID system; instead they use IDs based on the order of the blocks that need this variant, as shown here: https://github.com/GregTechCE/GregTech/blob/master/src/main/java/gregtech/common/blocks/MetaBlocks.java#L236 .
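
To make the failure mode concrete, here is a simplified, self-contained sketch of that encounter-order packing (illustrative only, not the actual GTCE code): every block-capable material takes the next ordinal, and the ordinal is split into the compressed_## index and the 0-15 metadata, so inserting Granite mid-list pushes every later material up by one slot.

    // Simplified sketch of encounter-order packing for compressed blocks.
    // The real logic lives in MetaBlocks; names here are illustrative.
    static Map<String, String> packByRegistrationOrder(List<String> materialsInRegistrationOrder) {
        Map<String, String> slots = new LinkedHashMap<>();
        int ordinal = 0;
        for (String material : materialsInRegistrationOrder) {
            int blockIndex = ordinal / 16; // which gregtech:compressed_## block
            int meta = ordinal % 16;       // metadata slot within that block
            slots.put(material, "gregtech:compressed_" + blockIndex + ":" + meta);
            ordinal++;                     // inserting a new material mid-list shifts every later one
        }
        return slots;
    }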

This problem occurred in version 1.10.5 released on 2020-11-10.

A quick fix for this issue could be registering Granite with the latest ID + 1, which would shift blocks back to their original positions, saving Omnifactory and a few other packs that have not upgraded yet. But at the same time it would destroy all Granite products and shift blocks for people who have already upgraded.

I am quite torn on how to handle this and would like to get some input on it.

@LAGIdiot added the status: help needed label and removed the status: unverified label on Jan 17, 2021
@gabor7d2

gabor7d2 commented Jan 25, 2021

Since Forge writes the versions of all mods a world was last opened with to the main level.dat file, we can figure out on load whether a save already has those blocks shifted or not. If the version of GregTech in the save is < 1.10.5, we shouldn't need to do anything (assuming the next GT release reverts the shift). Otherwise, if it is less than the version of the next GT release, some kind of live conversion must happen. (This "less than" check is really important, because otherwise the conversion might also occur when upgrading between two future versions.)

[Screenshot: level.dat contents showing the stored mod versions]

I've looked around in Forge's world data loading code, and unfortunately didn't find any place where one could tap into the FML named node of the nbt tree, unless the mod container could be made Injected which I think is reserved for FML and minecraft.jar (see FMLCommonHandler.java#handleWorldDataLoad).

So I've basically copied over the logic of checking the world versions from Forge (it starts in SaveFormatOld.java#loadAndFix, then FMLCommonHandler.java#handleWorldDataLoad, which then calls 2 implementations, one of which is in FMLContainer#readData):

    @SubscribeEvent
    public void onWorldLoad(WorldEvent.Load event) {
        if (!event.getWorld().isRemote) {
            try {
                File saveDir = event.getWorld().getSaveHandler().getWorldDirectory();
                if (!saveDir.exists()) return;
                File levelDat = new File(saveDir, "level.dat");
                if (!levelDat.exists()) {
                    logger.debug("level.dat doesn't exist!");
                    return;
                }
                // Read the same FML > ModList structure that Forge writes on world save
                NBTTagCompound nbt;
                try (FileInputStream in = new FileInputStream(levelDat)) {
                    nbt = CompressedStreamTools.readCompressed(in);
                }
                NBTTagCompound fmlTag = nbt.getCompoundTag("FML");

                if (fmlTag.hasKey("ModList")) {
                    NBTTagList modList = fmlTag.getTagList("ModList", 10); // 10 = compound tag
                    for (int i = 0; i < modList.tagCount(); i++) {
                        NBTTagCompound mod = modList.getCompoundTagAt(i);
                        if (!mod.getString("ModId").equals("examplemod")) continue;
                        logger.debug("Examplemod version in save is: " + mod.getString("ModVersion"));
                    }
                } else {
                    logger.debug("NBT doesn't have a ModList key");
                }
            } catch (Exception e) {
                logger.error("Error on WorldLoad", e);
            }
        }
    }

Now, I'm not really familiar with Forge's codebase, so sorry if this method is a bit hacky, but it should work. I've tested on macOS (AdoptOpenJDK 8) with both the integrated server and a dedicated server, and it only gets fired on the "server side" in both cases. It does get fired a lot of times throughout the game (at every world load), so a boolean flag needs to be put in place and reset when the entire world/session is closed.

It is only reading and not writing anything, and it seems like Forge correctly closes the file before we open it again. Also, Forge's global (per-save) MapStorage might point to the level.dat; if that is the case, it would be really easy to query this info without any workarounds. I didn't have time to fully research how that system works.

Here is the output after upgrading the test mod from 1.0 to 1.2, correctly reporting 1.0:
[Screenshot: log output reporting the stored version as 1.0]

I've been thinking more about the actual conversion. It could be implemented in a lot of ways; what I've imagined is the following:
For worlds which already have the blocks shifted and thus need shifting back, GregTech could write a "shiftedBlocks" flag into the main world data somewhere (e.g. into the global MapStorage). Then, as each chunk gets loaded, run through all the NBT data of all blocks/tiles, shift the gregtech:compressed_xx:yy IDs that come after granite back by one, and record somewhere that the chunk has been converted.
The "shiftedBlocks" flag is needed to distinguish worlds that need their chunks converted from ones that don't, because if the world gets closed and opened again, we otherwise wouldn't know that it still has chunks with wrong data (although it might be fine to just check for the presence of the list of converted chunks in storage). It would also be best if newly generated worlds were kept clean and didn't carry this flag or any chunk-conversion storage. (I think one boolean check per chunk load isn't going to affect performance.)
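
A rough sketch of what storing such a flag in the global MapStorage could look like in 1.12 (the class name and the data name are made up for illustration):

    // Hedged sketch: a persistent "shiftedBlocks" marker kept in the save's global MapStorage.
    public class ShiftMarkerData extends WorldSavedData {
        public static final String NAME = "gregtech_shift_marker"; // illustrative name
        public boolean shiftedBlocks;

        public ShiftMarkerData(String name) {
            super(name);
        }

        @Override
        public void readFromNBT(NBTTagCompound nbt) {
            shiftedBlocks = nbt.getBoolean("shiftedBlocks");
        }

        @Override
        public NBTTagCompound writeToNBT(NBTTagCompound nbt) {
            nbt.setBoolean("shiftedBlocks", shiftedBlocks);
            return nbt;
        }

        public static ShiftMarkerData get(World world) {
            MapStorage storage = world.getMapStorage(); // global (per-save) storage
            ShiftMarkerData data = (ShiftMarkerData) storage.getOrLoadData(ShiftMarkerData.class, NAME);
            if (data == null) {
                data = new ShiftMarkerData(NAME);
                storage.setData(NAME, data);
            }
            return data; // call markDirty() after changing shiftedBlocks so it gets saved
        }
    }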

This idea is the most feasible I've come up with so far. Converting all chunks in one go wouldn't work for large worlds; concurrent modification is a bad idea too (e.g. after the blocks are loaded and sent to the client), and modifying the physical files would probably break things as well. I don't even know if manipulating chunk contents is allowed by Forge before a chunk gets loaded, or whether drawers/AE2 cells, amongst other things, react well to changing their NBT data just before/while they are being initialized.

Thanks for coming to my TED talk, I hope some of this helps :D If I can help with anything I'm more than happy to try. I'm also going to test this level.dat reading on Windows tomorrow and will report back if it doesn't work. Sorry for the wall of text.

@Exaxxion
Collaborator

The solution for this needs several parts:

  1. Implementation of a new mechanism which does not exhibit the flaws of the existing mechanism,
  2. Detection of worlds saved prior to the new mechanism, and
  3. Conversion of blocks in worlds detected in (2) from their current state to the new mechanism in (1).

Part (3) necessarily entails retaining the existing logic for the purposes of figuring out how the blocks would have been mapped before the addition of the Granite material in 1.10.5. Custom materials are not possible for GTCE to know about at compile time, thus the old mapping must be computed at runtime.

The conversion also needs to be performed atomically, otherwise we risk leaving the world in an inconsistent state. Doing the conversion piecemeal seems risky to me, even if we mark chunks along the way. If the game crashes after a chunk is converted but before it is marked, reopening the game would run the update a second time and it's anyone's guess where things would end up. An atomic conversion, even if it takes a little while on larger worlds, also reduces the complexity since you perform it once and you're done; there's nothing else to check going forward.

I don't even know [...] whether drawers/AE2 cells, amongst other things, react well to changing their NBT data just before/while they are being initialized.

Storage Drawers seem to handle item registry changes fine, but when AE2 encounters an unregistered item it just deletes it. I ran into this a while ago in Omni when LibVulpes changed the registry name for Dilithium Crystals. Everything else converted their contents to the new ID just fine, except AE2 which just deleted all of it from the drive.

The ID shifting that happened both when we tried to use an ID in a gap range for Microversium and when Granite was added in 1.10.5 didn't break the drawers, it just changed what items were contained in them. I believe in the former case that blocks of plastic in a drawer suddenly turned into blocks of meat.


A simple improvement to the existing mapping scheme would be something like this:

MetaBlocks could be generated not using encounter order but rather a modulus based on the material ID. This will ensure that a particular material ID will always map to the same value, independently of other materials registered. The computation is also very simple, requiring no special mapping logic.

For example HSS-G has the material ID 302. You would perform integer division and modulus with 16 (the cap for MetaBlock subitems) resulting in 302 / 16 = 18 and 302 % 16 = 14. Thus the item "Block of HSS-G" would be (using the current scheme): gregtech:compressed_18:14.
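
As a small sketch of that arithmetic (the method name is just illustrative):

    // Sketch of the proposed ID-based packing: the slot depends only on the material ID,
    // never on what else happens to be registered, so it cannot shift.
    static String packByMaterialId(int materialId) {
        int blockIndex = materialId / 16; // e.g. 302 / 16 = 18
        int meta = materialId % 16;       // e.g. 302 % 16 = 14
        return "gregtech:compressed_" + blockIndex + ":" + meta; // HSS-G (302) -> gregtech:compressed_18:14
    }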

The substantial benefits of this approach come with the tradeoff of sparse population of MetaBlocks. If materials in a sequence frequently do not have a block form, there will be unused IDs. This means there will be a larger number of _## MetaBlocks registered than before, but given that the material system has a hard cap at 999, the expansion is bounded: there could be at most 62 registered MetaBlocks (ID 999 mapping to gregtech:compressed_62:7). This is also mitigated by the fact that MetaBlocks whose ranges contain no block-form materials could simply not be registered.

The above takes care of how to futureproof the mapping, except in cases where a mod or modpack registers an item in a range that is eventually reserved by GTCE. This would result in a registry conflict, with the custom material being rejected. That is a separate issue that intersects with this one, but it can be dealt with separately. It's mitigated by opting for high IDs that are unlikely to be added for a long time, but ultimately unavoidable without some way to segregate native materials from custom ones.

The above all addresses point (1) at the start of this comment. How do we apply this to the subsequent points?

I propose that it would make the most sense to leverage Forge's missing registry event which is fired on world load. Forge automatically creates a backup of the world when this happens, which improves the safety of the conversion process. It also ensures that it is applied all at once when the world is first loaded, after which no additional work needs to be done.

This event may be triggered on world load by changing the MetaBlock registry name assigned during the proposed new mapping process from gregtech:compressed_## to anything else, like gregtech:meta_block_##. Hooking this event allows us to remap the old items arbitrarily to new values. At this time we could read the GTCE version in the level.dat being loaded to see if shifting may have occurred, like how @gabor6505 describes in his comment above.

We would use the old MetaBlock packing algorithm to determine what the old order is (accommodating for shifting if applicable) and remap each MetaBlock to the corresponding new registry value.
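
Put together, a remap table could be built roughly like this (a sketch under the assumption that the old registration order can still be reproduced at runtime; the meta_block_## target name follows the illustration above):

    // Sketch: replay the old encounter order to recover each material's old slot,
    // then recompute its new slot from the material ID alone.
    static Map<String, String> buildRemapTable(List<Integer> materialIdsInOldOrder) {
        Map<String, String> table = new LinkedHashMap<>();
        for (int ordinal = 0; ordinal < materialIdsInOldOrder.size(); ordinal++) {
            int id = materialIdsInOldOrder.get(ordinal);
            String oldSlot = "gregtech:compressed_" + (ordinal / 16) + ":" + (ordinal % 16);
            String newSlot = "gregtech:meta_block_" + (id / 16) + ":" + (id % 16);
            table.put(oldSlot, newSlot);
        }
        return table;
    }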

That should suitably cover points (2) and (3).

@gabor7d2

gabor7d2 commented Jan 30, 2021

I've done quite a lot of research on this matter, just for fun and out of curiosity, and would like to share my progress on it so far.

I like the modulus-based metadata calculation and the move to a new registry name demonstrated by @Exaxxion as a way to deal with this and futureproof the system; in the future, custom registered materials could even have their own separate set of blocks/items etc. Though this isn't a must for fixing this issue.

Unfortunately, Forge's MissingMappings event doesn't support editing meta values; it just creates a one-to-one registry-name reference, so that anything accessing the old registry name gets served the Item/Block of the new registry name. So the most we can do using MissingMappings is to remap gregtech:compressed_## to something like gregtech:meta_block_##. This event by itself doesn't have too many advantages anyway: a backup is only made if at least one mapping wasn't processed (i.e. was left on MissingMappings.Action.DEFAULT), which results in a destructive operation, so a backup makes sense there. However, Forge assumes that if the mod processes all of the remaps it must be doing everything correctly, and doesn't create a backup before actually applying the collected remap definitions.


At this point, I believe there are 2 ways to tackle the problem:

  1. Remap compressed to meta_block with MissingMappings, and then remap the metadata using DataFixers. DataFixers could also be used to "remap" compressed to meta_block in the NBT data of ItemStacks (and then do the metadata remap), because stacks' registry names inside NBT don't get "replaced" by MissingMappings as long as they're not accessed; when they are, Forge serves the new registry name from the registry. This is not required, but it would make the conversion less susceptible to undefined behaviour of mods (like AE2, which mostly worked fine, but there was one case where I managed to have both the new and the old ItemStack on disk, though the old one wasn't accessible from the terminal, so it was probably just garbage data). This first pass would need to run for everything the DataFixer finds in the chunk before the second one, which remaps metadata, meaning two DataFixer instances with different version numbers are required.

  2. In MissingMappings, set the action of all entries to ignore, and do all the conversion with DataFixers. I believe this way would be more logical, have better performance, and would need fewer DataFixer definitions, because we can for example convert directly from compressed_12:3 to meta_block_24:2 instead of first remapping to meta_block with the wrong metadata, and then fixing metadata in another pass.

Using the first solution, a manual backup would need to be created on the first invocation of the MissingMappings event. Using the second solution, a manual world backup can be created at the WorldLoad event; at that point, even the level.dat should still be untouched. The order of the events is: MissingMappings -> WorldLoad -> Chunk/BlockEntity/Entity DataFixers invoked for each chunk that isn't yet fixed.


There is no way to remap metadata of blocks "atomically" in one go AFAIK, but DataFixers are a vanilla feature, and should be reasonably safe I'd assume. Fixing happens as each chunk gets loaded, most certainly before ticking begins, because otherwise it would be an issue with vanilla hoppers etc. as well. After fixes are run, the chunk is marked with the version specified in the used CHUNK type DataFixer implementation (paired with the MODID). While the chunk is being fixed, the system also invokes any fixers for the ITEM_INSTANCE, ENTITY and BLOCK_ENTITY FixTypes, probably storing the fix version that is returned by their respective fixers for these types for each of the entries encountered.

DataFixer documentation
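
For orientation, registering a fixer through Forge's 1.12 data-fixer API looks roughly like the sketch below (the data version number and the empty fix body are placeholders, not the eventual GTCE implementation):

    // Sketch of wiring an IFixableData into Forge's data-fixer system (1.12.2 API).
    public static final int DATA_VERSION = 2; // placeholder fix version

    public static void registerDataFixers() {
        ModFixs fixes = FMLCommonHandler.instance().getDataFixer().init("gregtech", DATA_VERSION);
        fixes.registerFix(FixTypes.BLOCK_ENTITY, new IFixableData() {
            @Override
            public int getFixVersion() {
                return DATA_VERSION;
            }

            @Override
            public NBTTagCompound fixTagCompound(NBTTagCompound compound) {
                // walk the tile entity's NBT here and rewrite old compressed_## stacks
                return compound;
            }
        });
    }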

Behaviour of FixTypes (according to my testing):
CHUNK: Runs fixes on the entire NBTTagCompound of the specific chunk.
ITEM_INSTANCE: Runs for each ItemStack encountered in any vanilla inventory/(tile) entity and in the small number of mods that registered their own containers for use with this system.
BLOCK_ENTITY: Runs for each block entity (tile entity) encountered.
ENTITY: Runs for each entity.
PLAYER: Runs for the player in the level.dat file of every world listed in the singleplayer menu when the menu is opened, but any changes made to the compound are not persisted (it wouldn't be useful for us anyway, because we don't know the GregTech version yet). It does not run when the host player of a singleplayer world joins, so this needs to be done with a PlayerLoggedIn event that is only run once, for the first joining player of a singleplayer world (IntegratedServer). But it can be used to fix data of players joining a LAN IntegratedServer or a DedicatedServer.

ITEM_INSTANCE would be convenient, but it doesn't really cover any inventories that are not vanilla. The way I got around this in my testing is to recursively go through each NBTTagCompound of the encountered BLOCK_ENTITY/ENTITY/PLAYER's whole NBT tree and modify the id and Damage values of all compounds that have both of these keys (if the id is something which should be remapped).
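
A sketch of that recursive walk (remapId/remapDamage stand in for whatever lookup against the mapping definitions is used):

    // Sketch of the recursive walk: any compound carrying both "id" and "Damage" is treated
    // as a serialized ItemStack and rewritten if its id is one of the affected blocks.
    static void fixItemStacksRecursively(NBTTagCompound compound) {
        if (compound.hasKey("id", 8) && compound.hasKey("Damage")) { // 8 = string tag
            String id = compound.getString("id");
            if (id.startsWith("gregtech:compressed_")) {
                int damage = compound.getShort("Damage");
                compound.setString("id", remapId(id, damage));
                compound.setShort("Damage", remapDamage(id, damage));
            }
        }
        for (String key : compound.getKeySet()) {
            NBTBase child = compound.getTag(key);
            if (child instanceof NBTTagCompound) {
                fixItemStacksRecursively((NBTTagCompound) child);
            } else if (child instanceof NBTTagList) {
                NBTTagList list = (NBTTagList) child;
                for (int i = 0; i < list.tagCount(); i++) {
                    if (list.get(i) instanceof NBTTagCompound) {
                        fixItemStacksRecursively((NBTTagCompound) list.get(i));
                    }
                }
            }
        }
    }

    // Placeholder lookups against the mapping definitions (illustrative only).
    static String remapId(String oldId, int oldDamage) { return oldId; }
    static short remapDamage(String oldId, int oldDamage) { return (short) oldDamage; }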

Some mods might get confused, but there is a backup made for reverting the changes. Applied Energistics 2 might be one of these mods; however, during my testing with a small network (a medium-voltage-level Omnifactory world, and a test world with just AE2, Actually Additions and Storage Drawers), I haven't experienced any data loss or other issues. As long as only the registry name and Damage are modified, AE should have no reason to work inconsistently, even though they claim it is not a good idea, and that AE's NBT might not be clean (whatever that means; looking at their code they just write normal NBT data) (also this). At the point the DataFixers are run, the TileEntities might not even be initialized yet, let alone ticking. (I even threw in a fix that replaces every found ItemStack with granite, and it did survive, though it wasn't ideal: the same number of types stayed occupied in each cell unless I modified its contents, and even after modification there was a lot of garbage data in the cell's NBT. This was just an extreme test case which doesn't apply to the current issue.)


Most mods don't go this far to implement data fixing, and just tell their users to use a new world, but it was fun to play around with it. My example mod for a complete implementation of the described DataFixing above can be found here. Explanation of the versions in the builds folder: version 1.7 shifts all stones' meta (1 through 6) down 1 both in world and in inventories. There is also a version with the -all suffix that replaces every found ItemStack that doesn't have NBT data and is not from AE2 with granite. 1.8-test1block, aside from the behaviour of 1.7, contains Test 1 Block, and remaps examplemod:test2 to examplemod:test1 using MissingMappings. 1.8-test2block does the inverse. There are also some logs which contain a lot of useful info and the order of the actions performed.

This implementation is hacky and needs more testing to be considered safe. Testing with my example mod is a bit difficult, because data fixes are only enabled if level.dat contains an examplemod version between 1.1 and 1.3. This can be manually added/edited with a preferred NBT editor, and the actual data fixing is guaranteed to run only once for already existing chunks, and never for newly generated ones (by the vanilla DataFixer design). I would consider the concept tested enough, so the rest of the testing would need to happen after it is implemented in GregTech. People reading this can still help by testing with my example mod if they have the time to experiment a bit, and report back any issues they find.

If somebody takes on implementing this, that's great. Otherwise, I might do it in a week or two, if somebody gives me the green light. But I can't promise anything right now. I've yet to set up a Gregtech dev environment and look through/understand its Material registry system.

@LAGIdiot
Member

Thank you both for this in-depth analysis.

The proposal by @Exaxxion regarding what needs to be fixed and when seems correct. The suggested implementation for handling compressed blocks is solid; I am for implementing it that way.

Regarding the conversion of current data, the proposal by @gabor6505 of using DataFixers sounds to me like the way to go, though I have only read about them here and in the provided case studies. From my point of view we need two sets of them: one for saves from before 1.10.5 and one for saves from after 1.10.5. Of the given proposals, option 2 seems more straightforward and reliable to me.

If anyone has other ideas, improvements, or concerns, please step forward; otherwise I will give it the go signal.

@LAGIdiot
Member

I don't see any concerns or additional discussion regarding this. So this is the go signal for implementing this as discussed above, in the way selected in my last comment.

@ALongStringOfNumbers
Collaborator Author

@gabor6505 I was taking a look at your examples and noticed that you used WRONG_DATA_VERSION_START and WRONG_DATA_VERSION_END in your example mod. In the case of the Granite shift in GTCE specifically, WRONG_DATA_VERSION_START would be 1.10.5, but what would WRONG_DATA_VERSION_END be? Would it be the version where the fix gets merged into GTCE, or would it have to be changed to the new version on every GTCE update, to catch players updating to versions beyond the one where the fix was originally introduced?

@gabor7d2

gabor7d2 commented Feb 14, 2021

@ALongStringOfNumbers WRONG_DATA_VERSION_END would need to be set to the last GTCE version in which the registry name/metadata fixing hasn't been implemented yet. If the data fixing were implemented in 1.11.4, then it would need to be set to 1.11.3. My code just checks whether the version of the mod the world was last opened with is inside the inclusive bounds specified by these START and END values. (If it is inside the bounds, dataFixes gets set to true, and this variable is checked in the fixTagCompound of the IFixableData instances. The DataFixers in this case should use mapping definitions (List<BlockFixDefinition> in my code) that directly convert the list of gregtech:compressed variants and meta (with Granite included) to gregtech:meta_block with modulus-based meta.)

In the case of GTCE, another variable should be created that is true when the mod version in the save is less than WRONG_DATA_VERSION_START (so less than 1.10.5), in which case Granite hasn't been introduced yet, so the DataFixers should use definitions that convert the list of gregtech:compressed variants and meta (pre-1.10.5, without Granite) to gregtech:meta_block.
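
A rough sketch of that two-flag check (compareVersions stands in for whatever version-comparison helper ends up being used, and 1.11.3 is the hypothetical END value from the example above):

    // Sketch of the version bounds that decide which mapping definitions to use.
    public static final String WRONG_DATA_VERSION_START = "1.10.5"; // first release with the shifted ids
    public static final String WRONG_DATA_VERSION_END = "1.11.3";   // hypothetical last release before the fix

    public static boolean useShiftedMappings;  // save last opened with 1.10.5 .. END (Granite included)
    public static boolean usePreShiftMappings; // save last opened with a version before 1.10.5

    public static void checkSavedVersion(String savedGtceVersion) {
        useShiftedMappings = compareVersions(savedGtceVersion, WRONG_DATA_VERSION_START) >= 0
                && compareVersions(savedGtceVersion, WRONG_DATA_VERSION_END) <= 0;
        usePreShiftMappings = compareVersions(savedGtceVersion, WRONG_DATA_VERSION_START) < 0;
    }

    // Placeholder: any comparison returning <0, 0, or >0 works here.
    static int compareVersions(String a, String b) {
        return a.compareTo(b); // naive; a real implementation should compare components numerically
    }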

The trickiest thing, imo, will be constructing the two mapping definitions, for which the mod needs to figure out what meta was given to each block in the given version (also including additions from external addons like CraftTweaker?).

Responding to @LAGIdiot,

From my point of view we need two sets of them: one for saves from before 1.10.5 and one for saves from after 1.10.5.

We could just use two different lists of mapping definitions in the same DataFixers (like I just described above); the plan is to directly convert from either state to the new, fixed state, if I'm not mistaken? If using two different sets of DataFixers for direct conversion, then the versions of both sets should be set to the same value, which indicates that the data they processed has been converted to the new state.

(If anything similar happens or something needs to be remapped in the future, it will be quite convenient to do so: just create new DataFixers with a higher version, and Minecraft will deal with first converting data with the lower-version fixer if needed, and then the higher one.)

Btw, I was using Oracle's GPLv2-licensed VersionUtil for version comparison; that class could probably be rewritten with a shorter implementation.

@ALongStringOfNumbers
Collaborator Author

The trickiest thing, imo, will be constructing the two mapping definitions, for which the mod needs to figure out what meta was given to each block in the given version (also including additions from external addons like CraftTweaker?).

The CraftTweaker materials are added in preInit, I believe, along with the registration of all MetaBlocks, so there should not be any special cases needed for them, as they would already have been registered and in the material system by the time the fixers are called in init.
