FileForums

FileForums (https://fileforums.com/index.php)
-   Conversion Tutorials (https://fileforums.com/forumdisplay.php?f=55)
-   -   [Dev]XTool (https://fileforums.com/showthread.php?t=101613)

Gupta 23-05-2020 22:40

Quote:

Originally Posted by FitGirl (Post 485989)
You may store some rare/large duplicated streams in a temp file, while storing small/frequent dupes in RAM - this way the excessive HDD load won't happen...

Maybe he can introduce a second phase in compression and store a forward reference count for each stream; that should ideally reduce the memory requirement beyond the window size, and it should improve compression too.

HDDs are very slow; I recently upgraded to NVMe-based storage and I can feel the speed difference.
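
A minimal sketch of the two-tier store described above - an illustration only, not xtool's code. A plain size threshold stands in for the rare/large vs. small/frequent split; a fuller version would also track the forward reference counts suggested here, so entries that can no longer be matched get evicted:

Code:

import hashlib
import tempfile

class HybridDedupStore:
    """Toy dedup index: small chunks stay in RAM, large ones spill to a temp file."""

    def __init__(self, ram_limit=64 * 1024):
        self.ram_limit = ram_limit  # chunks at or below this size stay in RAM
        self.in_ram = {}            # digest -> chunk bytes
        self.on_disk = {}           # digest -> (offset, length) in the temp file
        self.spill = tempfile.TemporaryFile()

    def add(self, chunk):
        digest = hashlib.sha256(chunk).digest()
        if digest in self.in_ram or digest in self.on_disk:
            return digest                      # duplicate: nothing new to store
        if len(chunk) <= self.ram_limit:
            self.in_ram[digest] = chunk        # small/frequent tier: RAM
        else:
            offset = self.spill.seek(0, 2)     # rare/large tier: temp file
            self.spill.write(chunk)
            self.on_disk[digest] = (offset, len(chunk))
        return digest

    def get(self, digest):
        if digest in self.in_ram:
            return self.in_ram[digest]
        offset, length = self.on_disk[digest]  # the rare case: one seek + read
        self.spill.seek(offset)
        return self.spill.read(length)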

panker1992 28-05-2020 11:48

Dedup
 
Quote:

Originally Posted by FitGirl (Post 485989)
Thanks for returning to the project, deduplication is a very useful feature.
I have an idea which will reduce the required RAM for dedup. You may store some rare/large duplicated streams in a temp file, while storing small/frequent dupes in RAM - this way the excessive HDD load won't happen, because reads will be rare and the RAM won't be used that much. 1-2 GB is a pretty big amount even for machines with 8 GB, and for users with 4 GB, installation will be almost impossible, considering srep and lolz/lzma, even with a page file. So reduction of, and control over, the RAM used is a must, I think.

I'd recommend Halo Reach for testing dedup; it has tons of duplicate streams of different sizes.

There is also a sorting trick that can reduce the RAM needed, and it works as follows.

srep does a very good job finding matches that are located far away!
For that to happen, it builds a dictionary.
If you sort the files you feed to srep, you can actually reduce the RAM it needs and improve its speed.

Sorting as a preprocessing step can speed up the process and cost less RAM!! And it removes I/O overhead, because there are NO temps.
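
A minimal sketch of such a sorting preprocess - an illustration, not srep's or xtool's code; the heuristic that same extension plus similar size means similar content is an assumption, and the "game_files" path is just a placeholder:

Code:

from pathlib import Path

def sorted_for_dedup(root):
    """Order files so likely-similar ones sit next to each other,
    shrinking the distance a match window has to cover."""
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    # Heuristic: same extension and similar size often means similar content.
    return sorted(files, key=lambda p: (p.suffix.lower(), p.stat().st_size))

# Feed the reordered list to the deduplicator instead of directory order.
for path in sorted_for_dedup("game_files"):
    print(path)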

Razor12911 29-05-2020 01:23

Quote:

Originally Posted by FitGirl (Post 485989)
Thanks for returning to the project, deduplication is a very useful feature.
I have an idea which will reduce the required RAM for dedup. You may store some rare/large duplicated streams in a temp file, while storing small/frequent dupes in RAM - this way the excessive HDD load won't happen, because reads will be rare and the RAM won't be used that much. 1-2 GB is a pretty big amount even for machines with 8 GB, and for users with 4 GB, installation will be almost impossible, considering srep and lolz/lzma, even with a page file. So reduction of, and control over, the RAM used is a must, I think.

I'd recommend Halo Reach for testing dedup; it has tons of duplicate streams of different sizes.

Quote:

Originally Posted by Gupta (Post 485991)
Maybe he can introduce a second phase in compression and store a forward reference count for each stream; that should ideally reduce the memory requirement beyond the window size, and it should improve compression too.


HDDs are very slow; I recently upgraded to NVMe-based storage and I can feel the speed difference.

Quote:

Originally Posted by panker1992 (Post 486059)
There is also a sorting trick that can reduce the RAM needed, and it works as follows.

srep does a very good job finding matches that are located far away!
For that to happen, it builds a dictionary.
If you sort the files you feed to srep, you can actually reduce the RAM it needs and improve its speed.

Sorting as a preprocessing step can speed up the process and cost less RAM!! And it removes I/O overhead, because there are NO temps.


Believe me, I have several ideas for reducing memory usage before even relying on virtual memory. Optimisation is my middle name.

bunti_o4u 29-05-2020 09:13

Quote:

Originally Posted by Edison007 (Post 472130)
I had time, and I added support for these files.
Also, I added parsing of the FAT from watch_dogs, but without (de)compression yet.
It takes time to deal with the xcompress library.

Does it support stdio?

If you add stdio, it would be great.

panker1992 29-05-2020 12:30

xcompress is pure Windows 10 compression, and I think it is supported by default.

Razor12911 10-07-2020 16:06

1 Attachment(s)
Can you guys test this? I added preflate as an alternative to reflate, in case you are having issues with CRC errors when you use reflate, or prolonged precompression times such as this:
Quote:

Originally Posted by dixen (Post 486161)
Dishonored: Death of the Outsider

*.resources - 17 GB > 25.1 GB in 23 minutes on 4 threads

For comparison: ZTool or XTool (v0.12) - 30% in 4 hours

Thanks BLACKFIRE69

test unpack

I'm no expert when it comes to C++, and I did my best to compile a library for xtool to use, so it may have bugs.

I ran a benchmark on "game1.resources" from "Dishonored: Death of the Outsider". Here are the results:
xtool_2019:
Code:

8.30 GB >> ?? >> ?? >> ??
precompression time using 4 threads = it's better to watch paint dry (takes hours)

xtool_2020 (WIP):
Code:

8.30 GB >> 11.3 GB >> 8.74 GB >> 6.24 GB
precompression time using 2 threads = 8 minutes, 48 seconds
precompression time using 4 threads = 5 minutes, 7 seconds

PrecompX:
Code:

8.30 GB >> 11.1 GB >> 8.79 GB >> 6.28 GB
precompression time using 4 threads = 8 minutes, 29 seconds


dixen 12-07-2020 13:13

Dishonored: Death of the Outsider

game1.resources

Quote:

Compressed 1 file, 8,920,763,754 => 11,774,785,195 bytes. Ratio 131.99%
Compression time: cpu 11.31 sec/real 385.94 sec = 3%. Speed 23.11 mB/s
All OK
When is the unpack function coming? :)

Razor12911 13-07-2020 21:01

1 Attachment(s)
Quote:

Originally Posted by dixen (Post 486764)
Dishonored: Death of the Outsider

game1.resources



When is the unpack function coming? :)

Soon, maybe this week. Still busy writing the main code while looking for more ways to speed up processing :)

lolaya 14-07-2020 06:34

What are ztool, xtool, and lolz?

Grumpy 14-07-2020 08:42

Quote:

Originally Posted by lolaya (Post 486806)
What are ztool, xtool, and lolz?

That information is already available on these forums; search and you will find. You can't expect others to always hold your hand and spoon-feed you all the time. ;)

lolaya 14-07-2020 08:48

[email protected]

There are so many posts here :D I can't find it easily.

BLACKFIRE69 14-07-2020 19:58

Quote:

Originally Posted by lolaya (Post 486812)
[email protected]

There are so many posts here :D I can't find it easily.

That's why you study hard... ;)

doofoo24 15-07-2020 06:22

can't wait to test the new xtool :D

Razor12911 18-07-2020 03:34

1 Attachment(s)
Here's an early working version of xtool. I'm still busy adding preflate, reflate and other things, so currently only the zlib function works. This means it will not work on Dishonored 2 or DOOM, but it will work on most titles.
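
For the curious, a rough Python sketch of what a zlib precompression function does conceptually - an illustration only, not xtool's actual code; real precompressors like preflate/reflate can also recover streams their own zlib cannot reproduce exactly, by storing correction data:

Code:

import zlib

def try_precompress(stream):
    """Inflate a zlib stream, then look for a compression level that
    reproduces it bit-for-bit. On success, the raw data plus the level
    can be stored instead (raw data compresses better afterwards), and
    the original bytes are rebuilt losslessly when unpacking."""
    raw = zlib.decompress(stream)
    for level in range(1, 10):
        if zlib.compress(raw, level) == stream:
            return raw, level        # restorable: keep raw data + recipe
    return None, None                # no exact match: leave the stream alone

# Round trip on a stream we made ourselves:
original = zlib.compress(b"example payload" * 1000, 6)
raw, level = try_precompress(original)
assert raw is not None and zlib.compress(raw, level) == original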

Drag and drop game files that are zlib-compressed onto the bat file.

Things to test:

* Stability
* Whether you can cancel an installation when xtool was used in the setup
* If you have 16 threads or more, make xtool use 100% of your CPU to see if it can handle it
* If you have 32 GB+ RAM, set a high chunk size, something like -c1gb, and see if xtool is able to allocate the necessary memory
* If you have a potato PC, don't worry, I've got a job for you :) Check that there are no problems when using 1 thread for both encoding and decoding

Whatever you do, don't place preflate_dll.dll next to the exe, not yet, or it will fail; the preflate code hasn't been added yet :D

Notes:

If the test went successfully, the file check must report
FC: no differences encountered
If it failed, kindly upload the file it failed on.
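
For reference, a hypothetical Python equivalent of that FC /B check (fc_binary is an illustrative helper, not part of xtool):

Code:

def fc_binary(path_a, path_b, chunk=1 << 20):
    """Byte-compare two files like `FC /B`: return the offset of the
    first difference, or None if the files are identical."""
    with open(path_a, "rb") as a, open(path_b, "rb") as b:
        offset = 0
        while True:
            block_a, block_b = a.read(chunk), b.read(chunk)
            if block_a != block_b:
                for i, (x, y) in enumerate(zip(block_a, block_b)):
                    if x != y:
                        return offset + i
                return offset + min(len(block_a), len(block_b))  # sizes differ
            if not block_a:          # both files hit EOF with no differences
                return None
            offset += len(block_a)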

This xtool uses slightly more memory than the 2019 version; I opted for stability over lower memory usage.

xtool now accepts maths expressions on the command line. If you want it to use all cores minus 1, you can just write -t100p-1 or -t100-2, totally up to you.
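
A guess at how such an expression might be evaluated, in Python - treating "100p" as 100 percent of the logical cores is an interpretation, not confirmed xtool behaviour:

Code:

import math
import os

def parse_thread_expr(expr):
    """Evaluate expressions like '100p-1': 'Np' is read as N percent of
    the logical cores; plain numbers and +/- are taken literally."""
    cores = os.cpu_count() or 1
    tokens = expr.replace("-", " - ").replace("+", " + ").split()
    total, sign = 0, 1
    for tok in tokens:
        if tok in ("+", "-"):
            sign = 1 if tok == "+" else -1
        elif tok.endswith("p"):
            total += sign * math.ceil(cores * int(tok[:-1]) / 100)
        else:
            total += sign * int(tok)
    return max(1, total)             # never ask for fewer than 1 thread

print(parse_thread_expr("100p-1"))   # on an 8-thread CPU: 7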

Next release/test focuses on:

+ preflate
+ reflate (reflate actually isn't replaced by preflate; it sometimes gives better output than preflate, so I'll keep it just in case you are after the best results)
+ depth setting

Masquerade 18-07-2020 05:14

Hello Razor, here is my test (I did make a post before, but I had to delete it since I used reflate files by mistake):

My spec: Ryzen 7 2700 (8c/16t) + 16GB RAM.

Bat file edited for a 128 MB chunk size.

Here is XTool using all 16 of my threads:

https://i.imgur.com/jGFDfgB.png

Testing on pak file from Astroneer:

Start size: 2.13 GB

End size: 3.75 GB

No differences encountered ;)

Not entirely sure how I'd go about testing it in a setup...

