144p Test Suite (was: STN torture test)

Discussion of programming and development for the original Game Boy and Game Boy Color.
nitro2k01
Posts: 252
Joined: Sat Aug 28, 2010 9:01 am

Re: STN torture test

Post by nitro2k01 »

Out of curiosity I tried comparing my own compression algorithm to PB16.

greenhillzone.u.chrgb:
Uncompressed: 1744 bytes
PB16: 921 bytes
Mine: 1082 bytes

I think this is basically down to encoding overhead, since I use a multibyte encoding. For example, it takes a 3-byte sequence to start a backtrack operation: the command byte, how many bytes to backtrack, and how many to copy.
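For illustration, decoding such a 3-byte backtrack command could look like the sketch below. This is a toy format with made-up opcodes, not nitro2k01's actual encoding; note that a real 512-byte backtrack reach would need a distance field wider than one byte.

```python
# Toy backtrack-style decoder (hypothetical opcodes, for illustration only):
#   0x00 <n> <n literal bytes>   copy n bytes verbatim
#   0x01 <dist> <len>            copy len bytes starting dist bytes back
def decode(data):
    out = bytearray()
    i = 0
    while i < len(data):
        op = data[i]
        if op == 0x00:                      # literal run
            n = data[i + 1]
            out += data[i + 2:i + 2 + n]
            i += 2 + n
        else:                               # 3-byte backtrack command
            dist, length = data[i + 1], data[i + 2]
            for _ in range(length):         # byte-at-a-time allows overlap
                out.append(out[-dist])
            i += 3
    return bytes(out)

print(decode(bytes([0x00, 3, 65, 66, 67, 0x01, 3, 6])))  # b'ABCABCABC'
```

The overlap case (length greater than distance) is what makes backtracking good at run-length-style repetition as well.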

greenhillzone.nam:
Uncompressed: 576 bytes
PB16: 400 bytes
Mine: 373 bytes

Mine does better, but this is expected since mine encodes sequences, which is beneficial for map data. All PB16 can do in this case is eliminate repeated data, which in this file is basically only repetitions of the same tile.

Note that there's one (perhaps irrelevant) example where mine radically beats PB16, namely the tiles that didn't get the unique treatment.

greenhillzone.chrgb:
Uncompressed: 9216 bytes
PB16: 3876 bytes
Mine: 1534 bytes

The reason is that mine has a 512-byte offset reach for backtracking. When encoding the non-unique tile version, a backtrack offset of 257–512 bytes was used a total of 35 times in this file. Of course, the non-unique'd tile data would not even fit in memory, so it's a moot point in this case. For context, I added deep backtrack to help compress music files that were 8 KiB big (i.e. would fill all of WRAM), where it made a huge improvement.
calima
Posts: 1745
Joined: Tue Oct 06, 2015 10:16 am

Re: STN torture test

Post by calima »

@tepples
Please don't filebomb, zip files should extract to a directory, not have a ton of top-level files.
calima
Posts: 1745
Joined: Tue Oct 06, 2015 10:16 am

Re: STN torture test

Post by calima »

More numbers, just because :P

greenhillzone.u.chrgb:
Uncompressed: 1744 bytes
PB16: 921 bytes
nitro: 1082 bytes
lz4hc: 1052 bytes
zlib: 877 bytes

greenhillzone.nam:
Uncompressed: 576 bytes
PB16: 400 bytes
nitro: 373 bytes
lz4hc: 400 bytes
zlib: 348 bytes
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: STN torture test

Post by tepples »

I admit that filebombing has been a problem with my automatic zip packaging dating back to at least 2009 with Concentration Room.

In man 1 zip I couldn't find an option to prepend a directory name to all files that shall be added to an archive. Is there a different program people should use that supports this? Or would I have to write a script that creates a new subdirectory, copies all files that shall be included in the zipfile to that subdirectory, adds that subdirectory to the zipfile using zip -r, and removes that subdirectory recursively?

EDIT: I wrote an anti-filebomb script, which will be included in the tools folder of the next version and hopefully my other projects.
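For the record, a copy-free variant of such a script is also possible: Python's zipfile module lets each file be stored under a prefix via arcname, so nothing needs to be duplicated on disk first. A sketch with placeholder filenames (not the actual script from the tools folder):

```python
import os
import tempfile
import zipfile

def make_prefixed_zip(out_path, prefix, files):
    # arcname controls the path stored in the archive, so every member
    # lands inside prefix/ without copying anything on disk first
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as z:
        for f in files:
            z.write(f, arcname=prefix + "/" + os.path.basename(f))

# Demo with throwaway files (names are placeholders)
tmp = tempfile.mkdtemp()
for name in ("game.gb", "README.md"):
    with open(os.path.join(tmp, name), "w") as f:
        f.write("dummy\n")
out = os.path.join(tmp, "release.zip")
make_prefixed_zip(out, "game-0.04",
                  [os.path.join(tmp, n) for n in ("game.gb", "README.md")])
print(zipfile.ZipFile(out).namelist())  # ['game-0.04/game.gb', 'game-0.04/README.md']
```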
calima
Posts: 1745
Joined: Tue Oct 06, 2015 10:16 am

Re: STN torture test

Post by calima »

Nowadays you'd have your projects in git, and use "git archive --prefix mydir/ --format zip HEAD > myfile.zip".
User avatar
koitsu
Posts: 4201
Joined: Sun Sep 19, 2004 9:28 pm
Location: A world gone mad

Re: STN torture test

Post by koitsu »

tepples wrote:In man 1 zip I couldn't find an option to prepend a directory name to all files that shall be added to an archive. Is there a different program people should use that supports this? Or would I have to write a script that creates a new subdirectory, copies all files that shall be included in the zipfile to that subdirectory, adds that subdirectory to the zipfile using zip -r, and removes that subdirectory recursively?
Sounds wasteful I/O-wise. I know you run *IX, so: if you can get a list of filenames you know definitively are supposed to be in the resulting zipfile, then just use a symlink to get what you want. zip by default will automatically follow symlinks (there's a weird extension that actually can back up symlinks, but uh, yeah, don't use that), so here's a crummy example:

Code: Select all

$ cd ~
$ mkdir -p example
$ touch example/include1 example/include2 example/include3 example/do_not_include
$ perl -e 'print "abc\n"x999999' > ~/example/hello4
$ ls -l example
total 135
-rw-------    1 jdc       users           0 Apr 18 00:54 do_not_include
-rw-------    1 jdc       users     3999996 Apr 18 00:54 hello4
-rw-------    1 jdc       users           0 Apr 18 00:54 include1
-rw-------    1 jdc       users           0 Apr 18 00:54 include2
-rw-------    1 jdc       users           0 Apr 18 00:54 include3
$ cd /tmp
$ ln -s ~/example prepend
$ zip result.zip prepend/hello4 prepend/include1 prepend/include2 prepend/include3
  adding: prepend/hello4 (deflated 100%)
  adding: prepend/include1 (stored 0%)
  adding: prepend/include2 (stored 0%)
  adding: prepend/include3 (stored 0%)
$ rm -r prepend
$ unzip -l result.zip
Archive:  result.zip
  Length     Date   Time    Name
 --------    ----   ----    ----
  3999996  04-18-18 00:54   prepend/hello4
        0  04-18-18 00:54   prepend/include1
        0  04-18-18 00:54   prepend/include2
        0  04-18-18 00:54   prepend/include3
You could probably use -r with this to include a directory as well (ex. prepend/directoryname), assuming you wanted everything in there.

The prepend/hello4 prepend/include1 prepend/include2 prepend/include3 part of the zip line is what you'd have to figure out how to provide in a scripted manner. If the number of files/arguments to zip becomes long, thus exhausting argv space, or if the names contain "painful characters" (like spaces), zip also has an -i/--include argument that can be given a value of @filename.lst. What to include pathname-wise will then be read from filename.lst instead of from the command line, so you could just make something that generates filename.lst and refer to it in the above manner (zip result.zip --include @/path/to/filename.lst).

The symlink method just acts as a crappy way to accomplish what you want, without having to copy a ton of data. You don't have to use the paths/methods I described, but it's one way. I imagine a filesystem loopback mount ("nullfs mount" on BSD) could work just as easily; on Linux this is done through mount --bind, IIRC. You still would need a list of the files though.

But if this is actually being done on Windows, then NTFS offers symlinks (I don't know how well they work). Alternatively, and probably a better idea, you can use NTFS junctions (I use these and they work well). mklink.exe, which comes with Windows Vista or later, can manage all this.
User avatar
TmEE
Posts: 960
Joined: Wed Feb 13, 2008 9:10 am
Location: Norway (50 and 60Hz compatible :P)
Contact:

Re: STN torture test

Post by TmEE »

I would say it is up to whatever extracts the files to provide a folder/directory for them to go into (which is the default behaviour in all the archival programs I use). I absolutely dislike archives that have the files in a folder/directory inside them: it's one extra step for me to get rid of the output dir, to avoid ending up with a new dir containing yet another one rather than just the files I'm after.
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: STN torture test

Post by tepples »

@calima
Thanks for pointing out the --prefix argument to git archive. I must have missed it the first time I read man git-archive. I had been using git ls-files to construct a list of files piped into zip -@; see for example the makefile for Pently.

I tend to dislike makefile rules that write the target file even if the program exits with an error, as Make then considers the file "built" even if it is empty or otherwise not valid. But I discovered that if you use -o instead of >, you don't need --format. The form with > is better for piping into things like gzip, though git archive already understands .tar.gz and can have additional compressors added through git config.

So something like the following rule (untested) might work:

Code: Select all

$(title)-$(version).zip: $(title) makefile README.md
<HT>git archive --prefix $(title)-$(version)/ -o $@ HEAD
(The <HT> represents a tab character, which doesn't survive phpBB.)

But the other problem with git archive is that it includes only files in the repo (that is, source code), with no way to specify files that exist only in the working copy and not in the repo (such as the executable). I had been producing zipfiles containing both ROM and source code to make distributing the complete corresponding source code easier than not doing so. Adding the file afterward with zip would put it at the top level, not inside the relevant folder.
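One possible workaround for that last point (my own assumed approach, not anything git archive provides): after git archive writes the prefixed zip, an untracked file such as the ROM can be appended under the same prefix with Python's zipfile, since append mode lets you pick the stored path. Names below are hypothetical:

```python
import zipfile

def add_under_prefix(zip_path, prefix, file_path, arc_basename):
    # Mode "a" appends to the archive; arcname places the new member
    # inside the same top-level folder that git archive created
    with zipfile.ZipFile(zip_path, "a", zipfile.ZIP_DEFLATED) as z:
        z.write(file_path, arcname=prefix + "/" + arc_basename)

# Demo: stand-ins for a `git archive --prefix demo-0.04/` zip and the ROM
with zipfile.ZipFile("demo.zip", "w") as z:
    z.writestr("demo-0.04/main.asm", "; source\n")
with open("demo.gb", "wb") as f:
    f.write(b"\x00" * 16)
add_under_prefix("demo.zip", "demo-0.04", "demo.gb", "demo.gb")
print(zipfile.ZipFile("demo.zip").namelist())  # ['demo-0.04/main.asm', 'demo-0.04/demo.gb']
```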

@koitsu
I considered ln -s briefly, until I remembered that contributors to some of my other projects use Windows. Windows gained symbolic links fairly recently in the form of the mklink command, available to normal users in Windows 10 Creators Update and later. But several aspects of mklink confuse me: why it uses the opposite argument order from ln -s (and from the existing copy command in Windows), why it needs the extra /D flag when a link happens to point at a directory, and why Windows required administrator privilege to create a symbolic link prior to Windows 10 Creators Update. I don't want to have to make the makefile call powershell Start-Process (something) -Verb RunAs (as described in this Super User question) to make the symbolic link, make the archive, remove the symbolic link, and then somehow fix the permissions on the resulting file. Nor do I want to require users of Windows 7 to upgrade to Windows 10.

@TmEE
During research for my previous post to this topic, I noticed that "tarbomb" and "zip bomb" have different meanings. A "tarbomb" is an archive with more than one top-level member, and a "zip bomb" is an archive that decompresses to a set of files several orders of magnitude larger, so as to confuse antivirus software that scans nested archives. Does this reflect different expectations of directory structure on the part of users of tar and zip formats?
User avatar
thefox
Posts: 3134
Joined: Mon Jan 03, 2005 10:36 am
Location: 🇫🇮
Contact:

Re: STN torture test

Post by thefox »

YMMV, but for .zip files I don't think it's standard in any way to have all of the contents in a single top-level directory. I've always seen far more .zip files which don't have the top-level directory. That's just the way things are. For .tar.gz and other unixy formats the situation is different.

I've seen some archivers which create a new directory for the contents unless one exists in the archive already. I'd just use something like that instead of expecting the rest of the world to change their habits. (Should be fairly easy to write a script to automate this.)
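Such a script is indeed short. A sketch of that assumed behaviour (hypothetical names): extract as-is when the archive already has a single top-level directory, otherwise wrap everything in a directory named after the zip:

```python
import os
import zipfile

def tidy_extract(zip_path, dest="."):
    # If members don't already share one top-level directory, extract into
    # a new directory named after the zip so nothing lands loose in dest
    with zipfile.ZipFile(zip_path) as z:
        tops = {name.split("/", 1)[0] for name in z.namelist()}
        if len(tops) != 1:
            stem = os.path.splitext(os.path.basename(zip_path))[0]
            dest = os.path.join(dest, stem)
        z.extractall(dest)
    return dest

# Demo: a "filebomb" with two top-level members gets wrapped
with zipfile.ZipFile("bomb.zip", "w") as z:
    z.writestr("a.txt", "a")
    z.writestr("b.txt", "b")
print(tidy_extract("bomb.zip"))
```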
Download STREEMERZ for NES from fauxgame.com! — Some other stuff I've done: fo.aspekt.fi
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: STN torture test

Post by tepples »

It has come to my attention that someone does want to make a GBC port. To prepare for this, I have made some behind-the-scenes technical changes to 144p Test Suite.

0.04 (2018-04-29)
  • No more tarbombing: Create zipfile with all files in an internal folder (requested by calima)
  • Skip logo fadeout and SGB detection on Game Boy Color/Advance
  • Overscan: Start border thickness at 2 instead of temporary values left in from testing
  • Overscan: Draw bottom border with WX instead of LCDC (requested by ISSOtm)
  • Stopwatch: Hide face with window instead of LCDC (requested by ISSOtm)
  • Hide incomplete first frame with BGP and OBP0 instead of LCDC (requested by ISSOtm)
  • Vertical scroll: Fix a buffer overflow causing the test to start paused
  • Use de facto standard hardware.inc, with 'r' in front of all port names (requested by ISSOtm)
  • Grid test pattern no longer uses Sharpness help screen
Attachments
gb240p-0.04.zip
(128.26 KiB) Downloaded 791 times
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: STN torture test

Post by tepples »

I have begun work on adding Game Boy Color enhancements to the following activities:
  • Activities using help engine (menu, About, Credits, Sound test)
  • Vertical scroll
  • Lame boy demo
I have determined that the following could benefit from enhancement (issue #2):
  • Gray ramp
  • Motion blur
  • Solid screen
  • Shadow sprite
  • Hill zone scroll
I have further determined that the following tests, which were left out due to lack of color, should be included for feature parity with ports on other consoles (issue 1):
  • PLUGE
  • Gradient color bars
  • SMPTE color bars
  • Color bars on gray
  • Color bleed
But I doubt that I will be able to fit everything in 32K. I started with 26K used and 6K free. After adding the GBC enhancements that I have added so far, I am up to 27.4K used and 4.6K free. Some of this is because some parts need GBC-specific graphics to distinguish, for example, white-as-backdrop from white-as-skin-color from white-as-shirt when two of them occur in the same tile. Once I near the 32K limit of easily obtained flash carts that are less expensive than SD adapters, could anyone here assist in code review to find things I could pack smaller?
Shonumi
Posts: 342
Joined: Sun Jan 26, 2014 9:31 am

Re: STN torture test

Post by Shonumi »

tepples wrote: Once I near the 32K limit of easily obtained flash carts that are less expensive than SD adapters, could anyone here assist in code review to find things I could pack smaller?
I'd be willing to take a look. No promises, but I'll take a peek when you're ready.
tepples
Posts: 22705
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Re: STN torture test

Post by tepples »

Building the gameboy subtree of the repository currently produces a binary in which at least 27 KiB is used. Do you want to start the review now, or would you prefer to wait until I hit 31 KiB?
Shonumi
Posts: 342
Joined: Sun Jan 26, 2014 9:31 am

Re: STN torture test

Post by Shonumi »

I'll start taking a look this weekend. Not sure how much I can help, but I'll try.
Post Reply