Because I can, I hacked together a sitemapper. It's at /home/silver/bin/sitemapper.pl (because I'm imaginative with my naming like that), and its output is at ~silver/sitemap.html (and sitemap2, and any others I think of).
It excludes anybody in the tildebot killfile at /home/brendn/bin/botify/killfile, respects robots.txt files (checking each directory for one), and has an exclusion file at /home/silver/bin/allowdeny.txt listing everything/everybody that's automatically generated (that I've noticed so far), including itself.
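For anyone curious how the exclusion logic fits together, here's a rough sketch in Python. The real script is Perl and I'm guessing at the file formats (one name per line, and a naive "Disallow: /" check standing in for full robots.txt parsing), so treat this as an illustration, not the actual code:

```python
from pathlib import Path

def load_names(path):
    """Read one name per line, skipping blanks and '#' comments.
    (Assumed format -- the real killfile/allowdeny.txt may differ.)"""
    try:
        lines = Path(path).read_text().splitlines()
    except OSError:
        return set()
    return {ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")}

def should_index(user, directory,
                 killfile="/home/brendn/bin/botify/killfile",
                 denyfile="/home/silver/bin/allowdeny.txt"):
    """True if a user's directory may appear in the sitemap."""
    # Skip anyone in the bot killfile or the allowdeny exclusion list.
    if user in load_names(killfile) | load_names(denyfile):
        return False
    # Per-directory robots.txt check; a blanket 'Disallow: /' opts out.
    robots = Path(directory) / "robots.txt"
    if robots.exists():
        for line in robots.read_text().splitlines():
            if (line.strip().lower().startswith("disallow:")
                    and line.split(":", 1)[1].strip() == "/"):
                return False
    return True
```

A proper implementation would also match User-agent sections and partial Disallow paths, but the shape is the same: killfile first, exclusion list second, then the directory's own robots.txt.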
At the moment I'm running it manually, but I could put it in a cron job if there's demand. It also wouldn't be hard to add a JSON dump if anyone would use that.