Swashbuckled

Subscriptions: 8

Total pages: 47 | First page | Last known page | RSS

Homepage: https://swashbuckled.darkbluecomics.com/

Added on: 2012-12-02 18:21:24

Categories: genre:fantasy genre:furry advisory:Web 14 archetype:pirates art:manga style format:episodic

Diego is sentenced to hang for piracy-related crimes, when a mysterious girl claiming to be the princess offers him a pardon... for a most mysterious price!

Crawl errors

The last 5 crawl errors from the past 30 days. An empty list doesn't necessarily mean the crawler is working correctly. I'll go through these eventually, but feel free to ask me to check whether the crawler is doing the right thing.

Page order Time URL HTTP status
46 2026-03-25 18:04:53 https://swashbuckled.darkbluecomics.com/comic/orders-are-just-orders/ HttpExceptionRequest Request { host = "swashbuckled.darkbluecomics.com" port = 443 secure = True requestHeaders = [("User-Agent","piperka.ne
46 2026-03-24 21:04:02 https://swashbuckled.darkbluecomics.com/comic/orders-are-just-orders/ HttpExceptionRequest Request { host = "swashbuckled.darkbluecomics.com" port = 443 secure = True requestHeaders = [("User-Agent","piperka.ne
46 2026-03-24 01:06:38 https://swashbuckled.darkbluecomics.com/comic/orders-are-just-orders/ HttpExceptionRequest Request { host = "swashbuckled.darkbluecomics.com" port = 443 secure = True requestHeaders = [("User-Agent","piperka.ne
46 2026-03-23 05:04:07 https://swashbuckled.darkbluecomics.com/comic/orders-are-just-orders/ HttpExceptionRequest Request { host = "swashbuckled.darkbluecomics.com" port = 443 secure = True requestHeaders = [("User-Agent","piperka.ne
46 2026-03-22 09:07:08 https://swashbuckled.darkbluecomics.com/comic/orders-are-just-orders/ HttpExceptionRequest Request { host = "swashbuckled.darkbluecomics.com" port = 443 secure = True requestHeaders = [("User-Agent","piperka.ne
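Each log row above packs four columns into one line: page order, a timestamp, the URL, and the (truncated) error text. A minimal sketch of parsing such a row into structured fields — the `CrawlError` type and field names are assumptions for illustration, not Piperka's actual schema:

```python
from datetime import datetime
from typing import NamedTuple


class CrawlError(NamedTuple):
    """One row of the crawl-error log (hypothetical structure)."""
    page_order: int
    time: datetime
    url: str
    error: str


def parse_crawl_error(line: str) -> CrawlError:
    # Columns: page order, date, time, URL, then the error text,
    # which may itself contain spaces, so split at most 4 times.
    order, date, time, url, error = line.split(maxsplit=4)
    return CrawlError(
        page_order=int(order),
        time=datetime.strptime(f"{date} {time}", "%Y-%m-%d %H:%M:%S"),
        url=url,
        error=error,
    )
```

Applied to the first row above, this yields page order 46, the `orders-are-just-orders` URL, and an error string beginning with `HttpExceptionRequest` (the error text in the log is cut off, so only its prefix is recoverable).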