100 Commits

Author SHA1 Message Date
ba1ebcd6ba Merge pull request #407 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.11.1
Bump docker/setup-buildx-action from 3.11.0 to 3.11.1
2025-06-20 14:21:48 +02:00
cc655b0acd Bump docker/setup-buildx-action from 3.11.0 to 3.11.1
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.11.0 to 3.11.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.11.0...v3.11.1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-version: 3.11.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-19 05:58:29 +00:00
3f5c9d0ca1 Merge pull request #403 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.11.0
Bump docker/setup-buildx-action from 3.10.0 to 3.11.0
2025-06-17 11:41:35 +02:00
538825f0ef Bump docker/setup-buildx-action from 3.10.0 to 3.11.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.10.0 to 3.11.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.10.0...v3.11.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-version: 3.11.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-17 06:03:00 +00:00
f0de0a29da Merge pull request #400 from C9Glax/dependabot/github_actions/docker/build-push-action-6.18.0
Bump docker/build-push-action from 6.17.0 to 6.18.0
2025-05-28 15:50:47 +02:00
d4227f2b8f Bump docker/build-push-action from 6.17.0 to 6.18.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.17.0 to 6.18.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.17.0...v6.18.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: 6.18.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-28 05:59:24 +00:00
cd00d35f22 Merge pull request #395 from C9Glax/dependabot/github_actions/docker/build-push-action-6.17.0
Bump docker/build-push-action from 6.16.0 to 6.17.0
2025-05-16 16:15:40 +02:00
4ef3e877ce Bump docker/build-push-action from 6.16.0 to 6.17.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.16.0 to 6.17.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.16.0...v6.17.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: 6.17.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-16 05:31:47 +00:00
7dba2518f9 Merge pull request #388 from C9Glax/dependabot/github_actions/docker/build-push-action-6.16.0
Bump docker/build-push-action from 6.15.0 to 6.16.0
2025-04-25 09:01:00 +02:00
7506a0201e Bump docker/build-push-action from 6.15.0 to 6.16.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.15.0 to 6.16.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.15.0...v6.16.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: 6.16.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-25 05:51:29 +00:00
91fb815153 Update README.md
2025-03-29 21:17:22 +01:00
6faf8bc733 Merge pull request #376 from C9Glax/cuttingedge
Weebcentral fixes
2025-03-18 18:13:47 +01:00
bdff5b7aec Merge pull request #375 from TheyCallMeTravis/webtoons-search_regex_fix
webtoons - fix search regex
2025-03-18 18:12:35 +01:00
5af8060d7b Merge pull request #374 from TheyCallMeTravis/weebcentral-fixsearch
Weebcentral - Fix Search Results Parse
2025-03-18 18:12:23 +01:00
6ed8ff1d52 webtoons - fix search regex parsing 2025-03-18 10:12:42 -05:00
3324ed6e4a Weebcentral - Fix Search Results Parse 2025-03-17 14:29:09 -05:00
67fd9d284b Merge pull request #369 from TheyCallMeTravis/WeebCentral-add_referrer
WeebCentral - add referer to DownloadChapterImages
2025-03-15 10:24:28 +01:00
08f26dd21d add referer to DownloadChapterImages 2025-03-14 21:18:51 -05:00
89ed500751 Update actions for Server-V2
2025-03-08 19:02:14 +01:00
b00b0ee030 Merge branch 'master' into cuttingedge-merge-ServerV2
2025-03-08 19:00:42 +01:00
e47c52ad48 Merge pull request #367 from C9Glax/cuttingedge
Cuttingedge merge
2025-03-08 18:58:40 +01:00
293f0af8e3 Merge pull request #366 from merlinmarijn/manganato-domain-switch
Manganato connector search fix
2025-03-08 07:40:11 +01:00
ebfa34e386 Update Manganato.cs 2025-03-07 22:33:24 +01:00
14524407f9 Update Manganato.cs 2025-03-07 22:29:40 +01:00
d56f0b383a Merge pull request #365 from merlinmarijn/manganato-domain-switch
Manganato fix chapter naming format in CBZ files (i am sorry)
2025-03-07 21:54:06 +01:00
70391c83c1 Update Manganato.cs
i found out, i am stupid
2025-03-07 21:39:17 +01:00
dc7696ee26 Merge pull request #364 from merlinmarijn/manganato-domain-switch
Enforce correct referrer check for access to Manganato
2025-03-07 20:19:52 +01:00
49dab9a670 Referrer policy changed
- Updated: the image hosting platform seems to have changed its policy and now requires the referrer to be sent from the actual site, instead of allowing any connection regardless of the referrer address
2025-03-07 19:57:27 +01:00
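As an illustration of the referrer requirement described in the commit above, here is a minimal C# sketch of sending a Referer header with HttpClient so an image host that checks the referrer serves the file. It is not code from this repository, and the URLs are placeholders.

```csharp
// Illustrative sketch only; not code from this repository. URLs are placeholders.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class RefererExample
{
    static async Task Main()
    {
        using HttpClient client = new();
        // Request an image, but identify the manga page it is embedded in via the Referer header.
        HttpRequestMessage request = new(HttpMethod.Get, "https://img.example.com/chapter/page-1.jpg");
        request.Headers.Referrer = new Uri("https://manga.example.com/title/chapter-1");
        HttpResponseMessage response = await client.SendAsync(request);
        Console.WriteLine($"HTTP {(int)response.StatusCode} {response.ReasonPhrase}");
    }
}
```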
c9bc79fbd5 Update new_connector.yml
2025-03-07 10:19:08 +01:00
83ce315f87 Merge pull request #357 from merlinmarijn/manganato-domain-switch
Update Manganato connector: Migrate from .com to .gg & Adjust HTML Parsing
#358 @merlinmarijn
2025-03-07 10:06:44 +01:00
59511056d0 added try around getting urls 2025-03-03 23:43:35 +01:00
ed3ca5dba8 removed leftover comment 2025-03-03 23:04:43 +01:00
8df05d7e8a fixed image referrer 2025-03-03 22:59:25 +01:00
95d1e37b47 Update Manganato.cs 2025-03-03 22:27:37 +01:00
b6494ab7f9 Merge pull request #354 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.6.0
Bump docker/setup-qemu-action from 3.5.0 to 3.6.0
2025-03-03 13:54:59 +01:00
1d1d01b6e5 Bump docker/setup-qemu-action from 3.5.0 to 3.6.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.5.0 to 3.6.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.5.0...v3.6.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-03 05:09:43 +00:00
5bb4977876 Merge pull request #353 from ale-ben/cuttingedge
Weebcentral: File name also depends on original chapter name
2025-03-02 16:20:07 +01:00
c6bb1c9180 [cuttingedge] fix(Chapter): Minor logic change to account for all chapterName cases 2025-03-02 16:06:21 +01:00
9a066e7ac7 [cuttingedge] fix(Weebcentral): Updated CheckChapterIsDownloaded logic to also consider chapter name if present 2025-03-02 15:57:09 +01:00
4bafffded4 [cuttingedge] feat(Weebcentral): When ordering chapters, order name desc to put special chapters first 2025-03-02 15:56:12 +01:00
942b43da67 Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-ServerV2 2025-03-02 10:06:48 +01:00
ce5538b352 Merge pull request #341 from Makhuta/cuttingedge
Added support for connector languages to HandleGet
2025-03-02 09:51:11 +01:00
0cfdf17bd4 Merge pull request #350 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.10.0
Bump docker/setup-buildx-action from 3.9.0 to 3.10.0
2025-03-02 09:50:21 +01:00
0c48c1e020 Merge pull request #351 from C9Glax/dependabot/github_actions/docker/build-push-action-6.15.0
Bump docker/build-push-action from 6.14.0 to 6.15.0
2025-03-02 09:50:17 +01:00
0638e75ed6 Merge pull request #349 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.5.0
Bump docker/setup-qemu-action from 3.4.0 to 3.5.0
2025-03-02 09:49:16 +01:00
5a4bc1c6de [cuttingedge] fix(Weebcentral): Handle case of chapter name with multiple number parts 2025-03-01 12:06:51 +01:00
71f663ca2f [cuttingedge] fix(Weebcentral): File name also depends on original chapter name 2025-03-01 11:39:12 +01:00
1b61a16061 Bump docker/build-push-action from 6.14.0 to 6.15.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.14.0 to 6.15.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.14.0...v6.15.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-27 05:25:23 +00:00
db81fdce39 Bump docker/setup-buildx-action from 3.9.0 to 3.10.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.9.0 to 3.10.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.9.0...v3.10.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-27 05:25:21 +00:00
fdb5451162 Bump docker/setup-qemu-action from 3.4.0 to 3.5.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.4.0 to 3.5.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.4.0...v3.5.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-27 05:25:19 +00:00
6b7632b071 Merge pull request #344 from C9Glax/dependabot/github_actions/docker/build-push-action-6.14.0
Bump docker/build-push-action from 6.13.0 to 6.14.0
2025-02-20 16:49:19 +01:00
06c080dfce Bump docker/build-push-action from 6.13.0 to 6.14.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.13.0 to 6.14.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.13.0...v6.14.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-20 05:50:25 +00:00
8130e11a9c Merge pull request #342 from C9Glax/cuttingedge
Cuttingedge master merge
2025-02-14 15:51:42 +01:00
659a42d370 Add
- ability to get the supported languages of a connector, for use in the monitoring language selector
2025-02-12 14:07:13 +01:00
9cef068785 Merge pull request #340 from Makhuta/cuttingedge
Fix the Webtoons connector fetching a few chapters multiple times
2025-02-11 21:24:14 +01:00
4ad3149523 Fix
- fixed chapter parsing: the page count was parsed incorrectly, which caused the chapters from the last page to be added multiple times (downloads still worked, but the chapters from the last page were queued repeatedly)
2025-02-11 20:16:30 +01:00
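The fix above concerns how the chapter-list page count is parsed. The following is a minimal, hypothetical sketch of the general pattern (GetChapterListPage and its return shape are made up, not the connector's real API): iterate the pages exactly once and de-duplicate by chapter URL.

```csharp
// Hypothetical sketch; GetChapterListPage and its return shape are made up, not the real connector API.
using System;
using System.Collections.Generic;

class ChapterPaginationSketch
{
    // Stand-in for "fetch one chapter-list page and return its chapter URLs plus the total page count".
    static (List<string> chapterUrls, int totalPages) GetChapterListPage(string mangaUrl, int page)
        => (new List<string>(), 1);

    static List<string> CollectChapters(string mangaUrl)
    {
        HashSet<string> seen = new();   // guards against adding the same chapter twice
        List<string> chapters = new();
        int page = 1;
        int totalPages;
        do
        {
            (List<string> urls, int total) = GetChapterListPage(mangaUrl, page);
            totalPages = total;
            foreach (string url in urls)
                if (seen.Add(url))
                    chapters.Add(url);
            page++;
        } while (page <= totalPages);   // stop after the actual last page instead of re-reading it
        return chapters;
    }

    static void Main() => Console.WriteLine(CollectChapters("https://example.com/manga/123").Count);
}
```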
e6d40a7b36 Remove unused code from Weebcentral
2025-02-09 18:37:55 +01:00
a95cb90561 Update Nuget packages 2025-02-09 18:37:41 +01:00
603e1b41d9 Add Chromium referer header 2025-02-09 18:37:33 +01:00
bb8a514830 Do not create .duplicate files anymore.
Just warn in log and delete (or attempt to delete)
2025-02-09 17:57:08 +01:00
edacaaba8a Update Readme 2025-02-09 17:38:54 +01:00
d97da26994 spelling error 2025-02-09 17:37:53 +01:00
8b923d73c4 Merge pull request #337 from Makhuta/cuttingedge
Add Manga Connector
2025-02-09 17:34:08 +01:00
814efd3528 Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge 2025-02-09 17:28:03 +01:00
2cd5d8bc4f Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-02-09 17:27:42 +01:00
5a864ab9b7 Remove Manga4Life 2025-02-09 17:27:35 +01:00
c700974693 Merge pull request #339 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.4.0
Bump docker/setup-qemu-action from 3.3.0 to 3.4.0
2025-02-09 17:18:42 +01:00
553b5558d3 Merge pull request #338 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.9.0
Bump docker/setup-buildx-action from 3.8.0 to 3.9.0
2025-02-09 17:18:25 +01:00
c9bbfee26b Merge pull request #331 from C9Glax/dependabot/github_actions/docker/build-push-action-6.13.0
Bump docker/build-push-action from 6.12.0 to 6.13.0
2025-02-09 17:18:10 +01:00
6e869eeb0d Bump docker/setup-qemu-action from 3.3.0 to 3.4.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.3.0...v3.4.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-07 05:13:18 +00:00
be7da69dbd Bump docker/setup-buildx-action from 3.8.0 to 3.9.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.8.0 to 3.9.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.8.0...v3.9.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-07 05:13:16 +00:00
7f13d9b1e6 Fix
- forgotten comma
2025-02-06 15:39:06 +01:00
0c9e3205c2 Add Manga Connector
- added [Webtoon](https://www.webtoons.com) manga connector
- modified/added support for saving covers with a referrer
2025-02-06 15:37:30 +01:00
8c3b70b32e Bump docker/build-push-action from 6.12.0 to 6.13.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.12.0 to 6.13.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.12.0...v6.13.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-27 06:02:55 +00:00
4f7031ecfc Merge pull request #328 from ale-ben/cuttingedge
fix: Add escape to Weebcentral regex
2025-01-25 23:26:48 +01:00
f7a285aabd [cuttingedge] fix: Add escape to Weebcentral regex 2025-01-25 11:40:00 +01:00
786482398c Merge pull request #327 from ale-ben/cuttingedge
Fix bug that prevented download of chapter 0
2025-01-24 22:09:53 +01:00
7921dcb1cb [cuttingedge] fix: Change condition for newChapters. Should solve #323 2025-01-24 21:52:52 +01:00
d0c9313279 Merge pull request #322 from C9Glax/dependabot/github_actions/docker/build-push-action-6.12.0
Bump docker/build-push-action from 6.11.0 to 6.12.0
2025-01-16 17:00:36 +01:00
58cf4cf4e0 Bump docker/build-push-action from 6.11.0 to 6.12.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.11.0 to 6.12.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.11.0...v6.12.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-16 05:52:33 +00:00
280d715a7c Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge
2025-01-15 23:14:20 +01:00
d0b775444d Change Chromium back to WaitUntilNavigation.Networkidle0 2025-01-15 23:14:15 +01:00
b4edcccafe Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-01-15 23:02:55 +01:00
268441a47d Trangasettings Json 2025-01-15 23:02:48 +01:00
1701881f4b Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-01-15 22:53:47 +01:00
78a9322036 ChromiumDownloadClient Change WaitUntilNavigation to Load instead of NetworkIdle 2025-01-15 22:53:39 +01:00
e5be5703f8 Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-01-15 22:25:16 +01:00
cc32b3dfae TrangaSettings Chromium Timeouts 2025-01-15 22:24:55 +01:00
ce217aae4f Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge 2025-01-15 22:18:11 +01:00
123a8b06b2 Job loading error message 2025-01-15 22:15:33 +01:00
2350c5a04b Remove Mangasee 2025-01-15 22:13:58 +01:00
f532e2ff76 JobBoss LoadJobsList change:
Fix: create the jobs directory when Directory.Exists(jobsFolderPath) is false
Fix: a job that fails to load no longer crashes the application.
2025-01-15 22:13:50 +01:00
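The two fixes named above correspond to the JobBoss changes shown in the file diff further down. As a standalone sketch under assumed types (JobRecord and the JSON layout are hypothetical, and the project itself uses Newtonsoft.Json rather than System.Text.Json), defensive job loading might look like this:

```csharp
// Hypothetical sketch; JobRecord and the JSON layout are made up, and the project itself uses Newtonsoft.Json.
using System;
using System.IO;
using System.Text.Json;

class JobLoaderSketch
{
    record JobRecord(string Id);

    static void LoadJobs(string jobsFolderPath)
    {
        Directory.CreateDirectory(jobsFolderPath); // fix 1: ensure the folder exists before enumerating it
        foreach (string file in Directory.GetFiles(jobsFolderPath, "*.json"))
        {
            try
            {
                JobRecord? job = JsonSerializer.Deserialize<JobRecord>(File.ReadAllText(file));
                if (job is null)
                    throw new JsonException("empty or invalid job file");
                Console.WriteLine($"Loaded job {job.Id}");
            }
            catch (Exception e) // fix 2: a broken file is renamed and skipped instead of crashing startup
            {
                Console.WriteLine($"Failed to load {file}: {e.Message}");
                File.Move(file, file + ".failed");
            }
        }
    }

    static void Main() => LoadJobs("jobs");
}
```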
3abf7224d0 Merge pull request #316 from ale-ben/cuttingedge
Fixed regex to capture chapters with decimal (1.5, ..)
2025-01-11 00:41:20 +01:00
b39dbd5671 [cuttingedge] fix(weebcentral): Fixed regex to capture chapters with decimal (1.5, ..) 2025-01-10 22:10:34 +01:00
375fad0c21 Merge pull request #314 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.3.0
Bump docker/setup-qemu-action from 3.2.0 to 3.3.0
2025-01-10 11:02:46 +01:00
ee0d17c24f Merge pull request #315 from C9Glax/dependabot/github_actions/docker/build-push-action-6.11.0
Bump docker/build-push-action from 6.9.0 to 6.11.0
2025-01-10 11:02:26 +01:00
36ab3c3fdb Bump docker/build-push-action from 6.9.0 to 6.11.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.11.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.9.0...v6.11.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-10 05:46:13 +00:00
c3d60c6586 Bump docker/setup-qemu-action from 3.2.0 to 3.3.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.2.0...v3.3.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-10 05:46:10 +00:00
b96ae4a2d2 Merge pull request #304 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.8.0
Bump docker/setup-buildx-action from 3.7.1 to 3.8.0
2024-12-17 17:38:07 +01:00
3a25c0b221 Bump docker/setup-buildx-action from 3.7.1 to 3.8.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.7.1 to 3.8.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.7.1...v3.8.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-12-17 05:59:19 +00:00
48 changed files with 1393 additions and 3727 deletions

View File

@@ -12,7 +12,7 @@ body:
   - type: checkboxes
     attributes:
       label: Is the Website free to access?
-      description: We can't support pay-to-use sites.
+      description: We can't support pay-to-use sites, or captcha-proxied sites as Cloudflare.
       options:
         - label: The Website is freely accessible.
           required: true

View File

@@ -17,12 +17,12 @@ jobs:
       # https://github.com/docker/setup-qemu-action#usage
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3.2.0
+        uses: docker/setup-qemu-action@v3.6.0
       # https://github.com/marketplace/actions/docker-setup-buildx
       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@v3.7.1
+        uses: docker/setup-buildx-action@v3.11.1
       # https://github.com/docker/login-action#docker-hub
       - name: Login to Docker Hub
@@ -33,7 +33,7 @@ jobs:
       # https://github.com/docker/build-push-action#multi-platform-image
       - name: Build and push API
-        uses: docker/build-push-action@v6.9.0
+        uses: docker/build-push-action@v6.18.0
         with:
           context: ./
           file: ./Dockerfile

View File

@@ -1,45 +0,0 @@
-name: Docker Image CI
-on:
-  push:
-    branches: [ "dev" ]
-  workflow_dispatch:
-jobs:
-  build:
-    runs-on: ubuntu-latest
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-      # https://github.com/docker/setup-qemu-action#usage
-      - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3.2.0
-      # https://github.com/marketplace/actions/docker-setup-buildx
-      - name: Set up Docker Buildx
-        id: buildx
-        uses: docker/setup-buildx-action@v3.7.1
-      # https://github.com/docker/login-action#docker-hub
-      - name: Login to Docker Hub
-        uses: docker/login-action@v2
-        with:
-          username: ${{ secrets.DOCKERHUB_USERNAME }}
-          password: ${{ secrets.DOCKERHUB_TOKEN }}
-      # https://github.com/docker/build-push-action#multi-platform-image
-      - name: Build and push API
-        uses: docker/build-push-action@v6.9.0
-        with:
-          context: ./
-          file: ./Dockerfile
-          #platforms: linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6
-          platforms: linux/amd64,linux/arm64
-          pull: true
-          push: true
-          tags: |
-            glax/tranga-api:dev

View File

@@ -17,12 +17,12 @@ jobs:
       # https://github.com/docker/setup-qemu-action#usage
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3.2.0
+        uses: docker/setup-qemu-action@v3.6.0
       # https://github.com/marketplace/actions/docker-setup-buildx
       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@v3.7.1
+        uses: docker/setup-buildx-action@v3.11.1
       # https://github.com/docker/login-action#docker-hub
       - name: Login to Docker Hub
@@ -33,7 +33,7 @@ jobs:
       # https://github.com/docker/build-push-action#multi-platform-image
       - name: Build and push API
-        uses: docker/build-push-action@v6.9.0
+        uses: docker/build-push-action@v6.18.0
         with:
           context: ./
           file: ./Dockerfile

View File

@@ -2,7 +2,7 @@ name: Docker Image CI
 on:
   push:
-    branches: [ "Server-V2" ]
+    branches: [ "postgres-Server-V2" ]
   workflow_dispatch:
 jobs:
@@ -17,12 +17,12 @@ jobs:
       # https://github.com/docker/setup-qemu-action#usage
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v3.2.0
+        uses: docker/setup-qemu-action@v3.6.0
       # https://github.com/marketplace/actions/docker-setup-buildx
       - name: Set up Docker Buildx
         id: buildx
-        uses: docker/setup-buildx-action@v3.7.1
+        uses: docker/setup-buildx-action@v3.11.1
       # https://github.com/docker/login-action#docker-hub
       - name: Login to Docker Hub
@@ -33,7 +33,7 @@ jobs:
       # https://github.com/docker/build-push-action#multi-platform-image
       - name: Build and push API
-        uses: docker/build-push-action@v6.9.0
+        uses: docker/build-push-action@v6.18.0
         with:
           context: ./
           file: ./Dockerfile

.gitignore (vendored)
View File

@@ -20,8 +20,6 @@ riderModule.iml
 cover.jpg
 cover.png
 /.vscode
-/.vs/
-Tranga/Properties/launchSettings.json
 /Manga
 /settings
 *.DotSettings.user

View File

@@ -1,8 +1,12 @@
+# Testers for V2 wanted!
+[Details](https://github.com/C9Glax/tranga/pull/355#issuecomment-2764217944)
 <!-- PROJECT LOGO -->
 <br />
 <div align="center">
-  <h3 align="center">Tranga v2</h3>
+  <h3 align="center">Tranga</h3>
   <p align="center">
     Automatic Manga and Metadata downloader
@@ -45,14 +49,13 @@ Tranga can download Chapters and Metadata from "Scanlation" sites such as
 - [MangaDex.org](https://mangadex.org/) (Multilingual)
 - [Manganato.com](https://manganato.com/) (en)
-- [Mangasee.com](https://mangasee123.com/) (en)
 - [MangaKatana.com](https://mangakatana.com) (en)
 - [Mangaworld.bz](https://www.mangaworld.bz/) (it)
 - [Bato.to](https://bato.to/v3x) (en)
-- [Manga4Life](https://manga4life.com) (en)
 - [ManhuaPlus](https://manhuaplus.org/) (en)
 - [MangaHere](https://www.mangahere.cc/) (en) (Their covers aren't scrapeable.)
 - [Weebcentral](https://weebcentral.com) (en)
+- [Webtoons](https://www.webtoons.com/en/)
 - ❓ Open an [issue](https://github.com/C9Glax/tranga/issues/new?assignees=&labels=New+Connector&projects=&template=new_connector.yml&title=%5BNew+Connector%5D%3A+)
 and trigger a library-scan with [Komga](https://komga.org/) and [Kavita](https://www.kavitareader.com/).
@@ -62,8 +65,7 @@ Notifications can be sent to your devices using [Gotify](https://gotify.net/), [
 ### What this does and doesn't do
 Tranga (this git-repo) will open a port (standard 6531) and listen for requests to add Jobs to Monitor and/or download specific Manga.
-The configuration is all done through HTTP-Requests. [Documentation](docs/API_Calls_v2.md)
+The configuration is all done through HTTP-Requests.
 _**For a web-frontend use [tranga-website](https://github.com/C9Glax/tranga-website).**_
 This project downloads the images for a Manga from the specified Scanlation-Website and packages them with some metadata - from that same website - in a .cbz-archive (per chapter).
@@ -91,8 +93,6 @@ That is why I wanted to create my own project, in a language I understand, and t
 - [PuppeteerSharp](https://www.puppeteersharp.com/)
 - [Html Agility Pack (HAP)](https://html-agility-pack.net/)
 - [Soenneker.Utils.String.NeedlemanWunsch](https://github.com/soenneker/soenneker.utils.string.needlemanwunsch)
-- [Sixlabors.ImageSharp](https://docs-v2.sixlabors.com/articles/imagesharp/index.html#license)
-- [zstd-wrapper](https://github.com/oleg-st/ZstdSharp) [zstd](https://github.com/facebook/zstd)
 - 💙 Blåhaj 🦈
 <p align="right">(<a href="#readme-top">back to top</a>)</p>

View File

@@ -44,7 +44,7 @@ public readonly struct Chapter : IComparable
         if (name is not null && name.Length > 0)
         {
             string chapterName = IllegalStrings.Replace(string.Concat(LegalCharacters.Matches(name)), "");
-            this.fileName = $"{chapterVolNumStr} - {chapterName}";
+            this.fileName = chapterName.Length > 0 ? $"{chapterVolNumStr} - {chapterName}" : chapterVolNumStr;
         }
         else
             this.fileName = chapterVolNumStr;
@@ -96,17 +96,20 @@ public readonly struct Chapter : IComparable
         if(mangaArchive is null)
         {
             FileInfo[] archives = new DirectoryInfo(mangaDirectory).GetFiles("*.cbz");
-            Regex volChRex = new(@"(?:Vol(?:ume)?\.([0-9]+)\D*)?Ch(?:apter)?\.([0-9]+(?:\.[0-9]+)*)");
+            Regex volChRex = new(@"(?:Vol(?:ume)?\.([0-9]+)\D*)?Ch(?:apter)?\.([0-9]+(?:\.[0-9]+)*)(?: - (.*))?.cbz");
             Chapter t = this;
             mangaArchive = archives.FirstOrDefault(archive =>
             {
                 Match m = volChRex.Match(archive.Name);
-                if (m.Groups[1].Success)
-                    return m.Groups[1].Value == t.volumeNumber.ToString(GlobalBase.numberFormatDecimalPoint) &&
-                           m.Groups[2].Value == t.chapterNumber.ToString(GlobalBase.numberFormatDecimalPoint);
-                else
-                    return m.Groups[2].Value == t.chapterNumber.ToString(GlobalBase.numberFormatDecimalPoint);
+                /*
+                 * 1. If the volumeNumber is not present in the filename, it is not checked.
+                 * 2. Check the chapterNumber in the chapter against the one in the filename.
+                 * 3. The chpaterName has to either be absent both in the chapter and the filename or match.
+                 */
+                return (!m.Groups[1].Success || m.Groups[1].Value == t.volumeNumber.ToString(GlobalBase.numberFormatDecimalPoint)) &&
+                       m.Groups[2].Value == t.chapterNumber.ToString(GlobalBase.numberFormatDecimalPoint) &&
+                       ((!m.Groups[3].Success && string.IsNullOrEmpty(t.name)) || m.Groups[3].Value == t.name);
             });
         }
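A standalone sketch of what the updated archive-name regex in the hunk above captures; this is illustrative only and not part of the repository. Group 1 is the optional volume, group 2 the chapter number, group 3 the optional chapter name.

```csharp
// Standalone illustration of the updated filename regex above; not part of the repository.
using System;
using System.Text.RegularExpressions;

class ChapterRegexDemo
{
    static void Main()
    {
        Regex volChRex = new(@"(?:Vol(?:ume)?\.([0-9]+)\D*)?Ch(?:apter)?\.([0-9]+(?:\.[0-9]+)*)(?: - (.*))?.cbz");
        foreach (string name in new[] { "Vol.1 Ch.5.cbz", "Ch.5.5 - Special Chapter.cbz" })
        {
            Match m = volChRex.Match(name);
            // Group 1: optional volume, Group 2: chapter number, Group 3: optional chapter name
            Console.WriteLine($"{name} -> vol='{m.Groups[1].Value}' ch='{m.Groups[2].Value}' name='{m.Groups[3].Value}'");
        }
    }
}
```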

View File

@@ -1,11 +1,8 @@
 using System.Globalization;
-using System.Runtime.InteropServices;
-using System.Text;
 using System.Text.RegularExpressions;
 using Logging;
 using Newtonsoft.Json;
 using Tranga.LibraryConnectors;
-using Tranga.MangaConnectors;
 using Tranga.NotificationConnectors;
 namespace Tranga;
@@ -17,7 +14,6 @@ public abstract class GlobalBase
     protected HashSet<NotificationConnector> notificationConnectors { get; init; }
     protected HashSet<LibraryConnector> libraryConnectors { get; init; }
     private Dictionary<string, Manga> cachedPublications { get; init; }
-    protected HashSet<MangaConnector> _connectors;
     public static readonly NumberFormatInfo numberFormatDecimalPoint = new (){ NumberDecimalSeparator = "." };
     protected static readonly Regex baseUrlRex = new(@"https?:\/\/[0-9A-z\.-]+(:[0-9]+)?");
@@ -27,7 +23,6 @@ public abstract class GlobalBase
         this.notificationConnectors = clone.notificationConnectors;
         this.libraryConnectors = clone.libraryConnectors;
         this.cachedPublications = clone.cachedPublications;
-        this._connectors = clone._connectors;
     }
     protected GlobalBase(Logger? logger)
@@ -36,7 +31,15 @@ public abstract class GlobalBase
         this.notificationConnectors = TrangaSettings.LoadNotificationConnectors(this);
         this.libraryConnectors = TrangaSettings.LoadLibraryConnectors(this);
         this.cachedPublications = new();
-        this._connectors = new();
+    }
+    protected void AddMangaToCache(Manga manga)
+    {
+        if (!this.cachedPublications.TryAdd(manga.internalId, manga))
+        {
+            Log($"Overwriting Manga {manga.internalId}");
+            this.cachedPublications[manga.internalId] = manga;
+        }
     }
     protected Manga? GetCachedManga(string internalId)
@@ -48,71 +51,9 @@ public abstract class GlobalBase
         };
     }
-    protected IEnumerable<Manga> GetAllCachedManga() => cachedPublications.Values;
-    protected void AddMangaToCache(Manga manga)
+    protected IEnumerable<Manga> GetAllCachedManga()
     {
-        if (!cachedPublications.TryAdd(manga.internalId, manga))
-        {
-            Log($"Overwriting Manga {manga.internalId}");
-            cachedPublications[manga.internalId] = manga;
-        }
-        ExportManga();
-    }
-    protected void RemoveMangaFromCache(Manga manga) => RemoveMangaFromCache(manga.internalId);
-    protected void RemoveMangaFromCache(string internalId)
-    {
-        cachedPublications.Remove(internalId);
-        ExportManga();
-    }
-    internal void ImportManga()
-    {
-        string folder = TrangaSettings.mangaCacheFolderPath;
-        Directory.CreateDirectory(folder);
-        foreach (FileInfo fileInfo in new DirectoryInfo(folder).GetFiles())
-        {
-            string content = File.ReadAllText(fileInfo.FullName);
-            try
-            {
-                Manga m = JsonConvert.DeserializeObject<Manga>(content, new MangaConnectorJsonConverter(this, _connectors));
-                this.cachedPublications.TryAdd(m.internalId, m);
-            }
-            catch (JsonException e)
-            {
-                Log($"Error parsing Manga {fileInfo.Name}:\n{e.Message}");
-            }
-        }
-    }
-    private static bool ExportRunning = false;
-    private void ExportManga()
-    {
-        while (ExportRunning)
-            Thread.Sleep(1);
-        ExportRunning = true;
-        string folder = TrangaSettings.mangaCacheFolderPath;
-        Directory.CreateDirectory(folder);
-        Manga[] copy = new Manga[cachedPublications.Values.Count];
-        cachedPublications.Values.CopyTo(copy, 0);
-        foreach (Manga manga in copy)
-        {
-            string content = JsonConvert.SerializeObject(manga, Formatting.Indented);
-            string filePath = Path.Combine(folder, $"{manga.internalId}.json");
-            File.WriteAllText(filePath, content, Encoding.UTF8);
-        }
-        foreach (FileInfo fileInfo in new DirectoryInfo(folder).GetFiles())
-        {
-            if(!cachedPublications.Keys.Any(key => fileInfo.Name.Substring(0, fileInfo.Name.LastIndexOf('.')).Equals(key)))
-                fileInfo.Delete();
-        }
-        ExportRunning = false;
+        return cachedPublications.Values;
     }
     protected void Log(string message)

View File

@@ -7,12 +7,12 @@ public class DownloadChapter : Job
 {
     public Chapter chapter { get; init; }
-    public DownloadChapter(GlobalBase clone, Chapter chapter, DateTime lastExecution, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, lastExecution, parentJobId: parentJobId)
+    public DownloadChapter(GlobalBase clone, MangaConnector connector, Chapter chapter, DateTime lastExecution, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, connector, lastExecution, parentJobId: parentJobId)
     {
         this.chapter = chapter;
     }
-    public DownloadChapter(GlobalBase clone, Chapter chapter, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, parentJobId: parentJobId)
+    public DownloadChapter(GlobalBase clone, MangaConnector connector, Chapter chapter, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, connector, parentJobId: parentJobId)
     {
         this.chapter = chapter;
     }
@@ -44,15 +44,11 @@ public class DownloadChapter : Job
             return Array.Empty<Job>();
     }
-    protected override MangaConnector GetMangaConnector()
-    {
-        return chapter.parentManga.mangaConnector;
-    }
     public override bool Equals(object? obj)
     {
         if (obj is not DownloadChapter otherJob)
             return false;
-        return otherJob.chapter.Equals(this.chapter);
+        return otherJob.mangaConnector == this.mangaConnector &&
+               otherJob.chapter.Equals(this.chapter);
     }
 }

View File

@@ -1,29 +1,29 @@
-using Newtonsoft.Json;
 using Tranga.MangaConnectors;
 namespace Tranga.Jobs;
 public class DownloadNewChapters : Job
 {
-    public string mangaInternalId { get; set; }
-    [JsonIgnore] private Manga? manga => GetCachedManga(mangaInternalId);
+    public Manga manga { get; set; }
     public string translatedLanguage { get; init; }
-    public DownloadNewChapters(GlobalBase clone, string mangaInternalId, DateTime lastExecution, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base(clone, JobType.DownloadNewChaptersJob, lastExecution, recurring, recurrence, parentJobId)
+    public DownloadNewChapters(GlobalBase clone, MangaConnector connector, Manga manga, DateTime lastExecution,
+        bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base(clone, JobType.DownloadNewChaptersJob, connector, lastExecution, recurring,
+        recurrence, parentJobId)
     {
-        this.mangaInternalId = mangaInternalId;
+        this.manga = manga;
         this.translatedLanguage = translatedLanguage;
     }
-    public DownloadNewChapters(GlobalBase clone, MangaConnector connector, string mangaInternalId, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base (clone, JobType.DownloadNewChaptersJob, recurring, recurrence, parentJobId)
+    public DownloadNewChapters(GlobalBase clone, MangaConnector connector, Manga manga, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base (clone, JobType.DownloadNewChaptersJob, connector, recurring, recurrence, parentJobId)
     {
-        this.mangaInternalId = mangaInternalId;
+        this.manga = manga;
         this.translatedLanguage = translatedLanguage;
     }
     protected override string GetId()
     {
-        return $"{GetType()}-{mangaInternalId}";
+        return $"{GetType()}-{manga.internalId}";
     }
     public override string ToString()
@@ -33,39 +33,27 @@ public class DownloadNewChapters : Job
     protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
     {
-        if (manga is null)
-        {
-            Log($"Manga {mangaInternalId} is missing! Can not execute job.");
-            return Array.Empty<Job>();
-        }
-        manga.Value.SaveSeriesInfoJson();
-        Chapter[] chapters = manga.Value.mangaConnector.GetNewChapters(manga.Value, this.translatedLanguage);
+        manga.SaveSeriesInfoJson();
+        Chapter[] chapters = mangaConnector.GetNewChapters(manga, this.translatedLanguage);
         this.progressToken.increments = chapters.Length;
         List<Job> jobs = new();
-        manga.Value.mangaConnector.CopyCoverFromCacheToDownloadLocation(manga.Value);
+        mangaConnector.CopyCoverFromCacheToDownloadLocation(manga);
         foreach (Chapter chapter in chapters)
         {
-            DownloadChapter downloadChapterJob = new(this, chapter, parentJobId: this.id);
+            DownloadChapter downloadChapterJob = new(this, this.mangaConnector, chapter, parentJobId: this.id);
             jobs.Add(downloadChapterJob);
         }
-        UpdateMetadata updateMetadataJob = new(this, mangaInternalId, parentJobId: this.id);
+        UpdateMetadata updateMetadataJob = new(this, this.mangaConnector, this.manga, parentJobId: this.id);
         jobs.Add(updateMetadataJob);
         progressToken.Complete();
         return jobs;
     }
-    protected override MangaConnector GetMangaConnector()
-    {
-        if (manga is null)
-            throw new Exception($"Missing Manga {mangaInternalId}");
-        return manga.Value.mangaConnector;
-    }
     public override bool Equals(object? obj)
     {
         if (obj is not DownloadNewChapters otherJob)
             return false;
         return otherJob.mangaConnector == this.mangaConnector &&
-               otherJob.manga?.publicationId == this.manga?.publicationId;
+               otherJob.manga.publicationId == this.manga.publicationId;
     }
 }

View File

@@ -4,6 +4,7 @@ namespace Tranga.Jobs;
 public abstract class Job : GlobalBase
 {
+    public MangaConnector mangaConnector { get; init; }
     public ProgressToken progressToken { get; private set; }
     public bool recurring { get; init; }
     public TimeSpan? recurrenceTime { get; set; }
@@ -12,15 +13,14 @@ public abstract class Job : GlobalBase
     public string id => GetId();
     internal IEnumerable<Job>? subJobs { get; private set; }
     public string? parentJobId { get; init; }
-    public enum JobType : byte { DownloadChapterJob = 0, DownloadNewChaptersJob = 1, UpdateMetaDataJob = 2, MonitorManga = 3 }
-    public MangaConnector mangaConnector => GetMangaConnector();
+    public enum JobType : byte { DownloadChapterJob, DownloadNewChaptersJob, UpdateMetaDataJob }
     public JobType jobType;
-    internal Job(GlobalBase clone, JobType jobType, bool recurring = false, TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
+    internal Job(GlobalBase clone, JobType jobType, MangaConnector connector, bool recurring = false, TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
     {
         this.jobType = jobType;
+        this.mangaConnector = connector;
         this.progressToken = new ProgressToken(0);
         this.recurring = recurring;
         if (recurring && recurrenceTime is null)
@@ -31,10 +31,11 @@ public abstract class Job : GlobalBase
         this.parentJobId = parentJobId;
     }
-    internal Job(GlobalBase clone, JobType jobType, DateTime lastExecution, bool recurring = false,
+    internal Job(GlobalBase clone, JobType jobType, MangaConnector connector, DateTime lastExecution, bool recurring = false,
         TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
     {
         this.jobType = jobType;
+        this.mangaConnector = connector;
         this.progressToken = new ProgressToken(0);
         this.recurring = recurring;
         if (recurring && recurrenceTime is null)
@@ -94,6 +95,4 @@ public abstract class Job : GlobalBase
     }
     protected abstract IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss);
-    protected abstract MangaConnector GetMangaConnector();
 }

View File

@@ -1,4 +1,5 @@
+using System.Diagnostics;
 using System.Runtime.InteropServices;
 using System.Text.RegularExpressions;
 using Newtonsoft.Json;
 using Tranga.MangaConnectors;
@@ -70,9 +71,11 @@ public class JobBoss : GlobalBase
             RemoveJob(job);
     }
-    public IEnumerable<Job> GetJobsLike(string? internalId = null, float? chapterNumber = null)
+    public IEnumerable<Job> GetJobsLike(string? connectorName = null, string? internalId = null, float? chapterNumber = null)
     {
         IEnumerable<Job> ret = this.jobs;
+        if (connectorName is not null)
+            ret = ret.Where(job => job.mangaConnector.name == connectorName);
         if (internalId is not null && chapterNumber is not null)
             ret = ret.Where(jjob =>
@@ -87,18 +90,18 @@ public class JobBoss : GlobalBase
             {
                 if (jjob is not DownloadNewChapters job)
                     return false;
-                return job.mangaInternalId == internalId;
+                return job.manga.internalId == internalId;
             });
         return ret;
     }
-    public IEnumerable<Job> GetJobsLike(Manga? publication = null,
+    public IEnumerable<Job> GetJobsLike(MangaConnector? mangaConnector = null, Manga? publication = null,
         Chapter? chapter = null)
     {
         if (chapter is not null)
-            return GetJobsLike(chapter.Value.parentManga.internalId, chapter.Value.chapterNumber);
+            return GetJobsLike(mangaConnector?.name, chapter.Value.parentManga.internalId, chapter.Value.chapterNumber);
         else
-            return GetJobsLike(publication?.internalId);
+            return GetJobsLike(mangaConnector?.name, publication?.internalId);
     }
     public Job? GetJobById(string jobId)
@@ -143,37 +146,44 @@ public class JobBoss : GlobalBase
     private void LoadJobsList(HashSet<MangaConnector> connectors)
     {
-        Directory.CreateDirectory(TrangaSettings.jobsFolderPath);
-        if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
-            File.SetUnixFileMode(TrangaSettings.jobsFolderPath, UserRead | UserWrite | UserExecute | GroupRead | OtherRead);
         if (!Directory.Exists(TrangaSettings.jobsFolderPath)) //No jobs to load
+        {
+            Directory.CreateDirectory(TrangaSettings.jobsFolderPath);
+            if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
+                File.SetUnixFileMode(TrangaSettings.jobsFolderPath, UserRead | UserWrite | UserExecute | GroupRead | OtherRead);
             return;
+        }
-        //Load Manga-Files
-        ImportManga();
         //Load json-job-files
         foreach (FileInfo file in Directory.GetFiles(TrangaSettings.jobsFolderPath, "*.json").Select(f => new FileInfo(f)))
         {
             Log($"Adding {file.Name}");
-            Job? job = JsonConvert.DeserializeObject<Job>(File.ReadAllText(file.FullName),
-                new JobJsonConverter(this, new MangaConnectorJsonConverter(this, connectors)));
-            if (job is null)
-            {
-                string newName = file.FullName + ".failed";
-                Log($"Failed loading file {file.Name}.\nMoving to {newName}");
-                File.Move(file.FullName, newName);
-            }
-            else
+            try
             {
+                Job? job = JsonConvert.DeserializeObject<Job>(File.ReadAllText(file.FullName),
+                    new JobJsonConverter(this, new MangaConnectorJsonConverter(this, connectors)));
+                if (job is null) throw new NullReferenceException();
                 Log($"Adding Job {job}");
                 if (!AddJob(job, file.FullName)) //If we detect a duplicate, delete the file.
                 {
-                    string path = string.Concat(file.FullName, ".duplicate");
-                    file.MoveTo(path);
-                    Log($"Duplicate detected or otherwise not able to add job to list.\nMoved job {job} to {path}");
+                    //string path = string.Concat(file.FullName, ".duplicate");
+                    //file.MoveTo(path);
+                    //Log($"Duplicate detected or otherwise not able to add job to list.\nMoved job {job} to {path}");
+                    Log($"Duplicate detected or otherwise not able to add job to list. Removed the file {file.FullName} {job}");
                 }
             }
+            catch (Exception e)
+            {
+                if (e is not UnreachableException or NullReferenceException)
+                    throw;
+                Log(e.Message);
+                string newName = file.FullName + ".failed";
+                Log($"Failed loading file {file.Name}.\nMoving to {newName}.\n" +
+                    $"If you think this is a bug, upload contents of the file to the Bugreport!");
+                File.Move(file.FullName, newName);
+                continue;
+            }
         }
         //Connect jobs to parent-jobs and add Publications to cache
@@ -186,24 +196,12 @@ public class JobBoss : GlobalBase
                 parentJob.AddSubJob(job);
                 Log($"Parent Job {parentJob}");
             }
+            if (job is DownloadNewChapters dncJob)
+                AddMangaToCache(dncJob.manga);
         }
-        string[] jobMangaInternalIds = this.jobs.Where(job => job is DownloadNewChapters)
-            .Select(dnc => ((DownloadNewChapters)dnc).mangaInternalId).ToArray();
-        jobMangaInternalIds = jobMangaInternalIds.Concat(
-            this.jobs.Where(job => job is UpdateMetadata)
-                .Select(dnc => ((UpdateMetadata)dnc).mangaInternalId)).ToArray();
-        string[] internalIds = GetAllCachedManga().Select(m => m.internalId).ToArray();
-        string[] extraneousIds = internalIds.Except(jobMangaInternalIds).ToArray();
-        foreach (string internalId in extraneousIds)
-            RemoveMangaFromCache(internalId);
         string[] coverFiles = Directory.GetFiles(TrangaSettings.coverImageCache);
         foreach(string fileName in coverFiles.Where(fileName => !GetAllCachedManga().Any(manga => manga.coverFileNameInCache == fileName)))
-            File.Delete(fileName);
-        string[] mangaFiles = Directory.GetFiles(TrangaSettings.mangaCacheFolderPath);
-        foreach(string fileName in mangaFiles.Where(fileName => !GetAllCachedManga().Any(manga => fileName.Split('.')[0] == manga.internalId)))
             File.Delete(fileName);
     }

View File

@@ -24,31 +24,52 @@ public class JobJsonConverter : JsonConverter
     {
         JObject jo = JObject.Load(reader);
-        if(!jo.ContainsKey("jobType"))
-            throw new Exception();
-        return Enum.Parse<Job.JobType>(jo["jobType"]!.Value<byte>().ToString()) switch
-        {
-            Job.JobType.UpdateMetaDataJob => new UpdateMetadata(_clone,
-                jo.GetValue("mangaInternalId")!.Value<string>()!,
-                jo.GetValue("parentJobId")!.Value<string?>()),
-            Job.JobType.DownloadChapterJob => new DownloadChapter(this._clone,
-                jo.GetValue("chapter")!.ToObject<Chapter>(JsonSerializer.Create(new JsonSerializerSettings()
-                {
-                    Converters = { this._mangaConnectorJsonConverter }
-                })),
-                DateTime.UnixEpoch,
-                jo.GetValue("parentJobId")!.Value<string?>()),
-            Job.JobType.DownloadNewChaptersJob => new DownloadNewChapters(this._clone,
-                jo.GetValue("mangaInternalId")!.Value<string>()!,
-                jo.GetValue("lastExecution") is {} le
-                    ? le.ToObject<DateTime>()
-                    : DateTime.UnixEpoch,
-                jo.GetValue("recurring")!.Value<bool>(),
-                jo.GetValue("recurrenceTime")!.ToObject<TimeSpan?>(),
-                jo.GetValue("parentJobId")!.Value<string?>()),
-            _ => throw new Exception()
-        };
+        if (jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.UpdateMetaDataJob)
+        {
+            return new UpdateMetadata(this._clone,
+                jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
+                {
+                    Converters =
+                    {
+                        this._mangaConnectorJsonConverter
+                    }
+                }))!,
+                jo.GetValue("manga")!.ToObject<Manga>(),
+                jo.GetValue("parentJobId")!.Value<string?>());
+        }else if ((jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.DownloadNewChaptersJob) || jo.ContainsKey("translatedLanguage"))//TODO change to jobType
+        {
+            DateTime lastExecution = jo.GetValue("lastExecution") is {} le
+                ? le.ToObject<DateTime>()
+                : DateTime.UnixEpoch; //TODO do null checks on all variables
+            return new DownloadNewChapters(this._clone,
+                jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
+                {
+                    Converters =
+                    {
+                        this._mangaConnectorJsonConverter
+                    }
+                }))!,
+                jo.GetValue("manga")!.ToObject<Manga>(),
+                lastExecution,
+                jo.GetValue("recurring")!.Value<bool>(),
+                jo.GetValue("recurrenceTime")!.ToObject<TimeSpan?>(),
+                jo.GetValue("parentJobId")!.Value<string?>());
+        }else if ((jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.DownloadChapterJob) || jo.ContainsKey("chapter"))//TODO change to jobType
+        {
+            return new DownloadChapter(this._clone,
+                jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
+                {
+                    Converters =
+                    {
+                        this._mangaConnectorJsonConverter
+                    }
+                }))!,
+                jo.GetValue("chapter")!.ToObject<Chapter>(),
+                DateTime.UnixEpoch,
+                jo.GetValue("parentJobId")!.Value<string?>());
+        }
+        throw new Exception();
     }
     public override bool CanWrite => false;

View File

@@ -10,7 +10,7 @@ public class ProgressToken
     public DateTime executionStarted { get; private set; }
     public TimeSpan timeRemaining => GetTimeRemaining();
-    public enum State : byte { Running = 0, Complete = 1, Standby = 2, Cancelled = 3, Waiting = 4 }
+    public enum State { Running, Complete, Standby, Cancelled, Waiting }
     public State state { get; private set; }
     public ProgressToken(int increments)

View File

@@ -1,21 +1,19 @@
-using System.Text.Json.Serialization;
 using Tranga.MangaConnectors;
 namespace Tranga.Jobs;
 public class UpdateMetadata : Job
 {
-    public string mangaInternalId { get; set; }
-    [JsonIgnore] private Manga? manga => GetCachedManga(mangaInternalId);
+    public Manga manga { get; set; }
-    public UpdateMetadata(GlobalBase clone, string mangaInternalId, string? parentJobId = null) : base(clone, JobType.UpdateMetaDataJob, parentJobId: parentJobId)
+    public UpdateMetadata(GlobalBase clone, MangaConnector connector, Manga manga, string? parentJobId = null) : base(clone, JobType.UpdateMetaDataJob, connector, parentJobId: parentJobId)
     {
-        this.mangaInternalId = mangaInternalId;
+        this.manga = manga;
     }
     protected override string GetId()
     {
-        return $"{GetType()}-{mangaInternalId}";
+        return $"{GetType()}-{manga.internalId}";
     }
     public override string ToString()
@@ -25,14 +23,8 @@ public class UpdateMetadata : Job
     protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
     {
-        if (manga is null)
-        {
-            Log($"Manga {mangaInternalId} is missing! Can not execute job.");
-            return Array.Empty<Job>();
-        }
         //Retrieve new Metadata
-        Manga? possibleUpdatedManga = mangaConnector.GetMangaFromId(manga.Value.publicationId);
+        Manga? possibleUpdatedManga = mangaConnector.GetMangaFromId(manga.publicationId);
         if (possibleUpdatedManga is { } updatedManga)
         {
             if (updatedManga.Equals(this.manga)) //Check if anything changed
@@ -41,9 +33,26 @@ public class UpdateMetadata : Job
                 return Array.Empty<Job>();
             }
-            AddMangaToCache(manga.Value.WithMetadata(updatedManga));
-            this.manga.Value.SaveSeriesInfoJson(true);
-            this.mangaConnector.CopyCoverFromCacheToDownloadLocation((Manga)manga);
+            this.manga = manga.WithMetadata(updatedManga);
+            this.manga.SaveSeriesInfoJson(true);
+            this.mangaConnector.CopyCoverFromCacheToDownloadLocation(manga);
+            foreach (Job job in jobBoss.GetJobsLike(publication: this.manga))
+            {
+                string oldFile;
+                if (job is DownloadNewChapters dc)
+                {
+                    oldFile = dc.id;
+                    dc.manga = this.manga;
+                }
+                else if (job is UpdateMetadata um)
+                {
+                    oldFile = um.id;
+                    um.manga = this.manga;
+                }
+                else
+                    continue;
+                jobBoss.UpdateJobFile(job, oldFile);
+            }
             this.progressToken.Complete();
         }
         else
@@ -56,19 +65,12 @@ public class UpdateMetadata : Job
             return Array.Empty<Job>();
         }
-    protected override MangaConnector GetMangaConnector()
-    {
-        if (manga is null)
-            throw new Exception($"Missing Manga {mangaInternalId}");
-        return manga.Value.mangaConnector;
-    }
     public override bool Equals(object? obj)
     {
         if (obj is not UpdateMetadata otherJob)
             return false;
         return otherJob.mangaConnector == this.mangaConnector &&
-               otherJob.manga?.publicationId == this.manga?.publicationId;
+               otherJob.manga.publicationId == this.manga.publicationId;
     }
 }

View File

@ -3,7 +3,6 @@ using System.Text;
using System.Text.RegularExpressions; using System.Text.RegularExpressions;
using System.Web; using System.Web;
using Newtonsoft.Json; using Newtonsoft.Json;
using Tranga.MangaConnectors;
using static System.IO.UnixFileMode; using static System.IO.UnixFileMode;
namespace Tranga; namespace Tranga;
@ -28,6 +27,8 @@ public struct Manga
// ReSharper disable once MemberCanBePrivate.Global // ReSharper disable once MemberCanBePrivate.Global
public int? year { get; private set; } public int? year { get; private set; }
public string? originalLanguage { get; } public string? originalLanguage { get; }
// ReSharper disable twice MemberCanBePrivate.Global
public string status { get; private set; }
public ReleaseStatusByte releaseStatus { get; private set; } public ReleaseStatusByte releaseStatus { get; private set; }
public enum ReleaseStatusByte : byte public enum ReleaseStatusByte : byte
{ {
@ -43,15 +44,14 @@ public struct Manga
public float ignoreChaptersBelow { get; set; } public float ignoreChaptersBelow { get; set; }
public float latestChapterDownloaded { get; set; } public float latestChapterDownloaded { get; set; }
public float latestChapterAvailable { get; set; } public float latestChapterAvailable { get; set; }
public string websiteUrl { get; private set; }
public MangaConnector mangaConnector { get; private set; } public string? websiteUrl { get; private set; }
private static readonly Regex LegalCharacters = new (@"[A-Za-zÀ-ÖØ-öø-ÿ0-9 \.\-,'\'\)\(~!\+]*"); private static readonly Regex LegalCharacters = new (@"[A-Za-zÀ-ÖØ-öø-ÿ0-9 \.\-,'\'\)\(~!\+]*");
[JsonConstructor] [JsonConstructor]
public Manga(MangaConnector mangaConnector, string sortName, List<string> authors, string? description, Dictionary<string,string> altTitles, string[] tags, string? coverUrl, string? coverFileNameInCache, Dictionary<string,string>? links, int? year, string? originalLanguage, string publicationId, ReleaseStatusByte releaseStatus, string? websiteUrl, string? folderName = null, float? ignoreChaptersBelow = 0) public Manga(string sortName, List<string> authors, string? description, Dictionary<string,string> altTitles, string[] tags, string? coverUrl, string? coverFileNameInCache, Dictionary<string,string>? links, int? year, string? originalLanguage, string publicationId, ReleaseStatusByte releaseStatus, string? websiteUrl = null, string? folderName = null, float? ignoreChaptersBelow = 0)
{ {
this.mangaConnector = mangaConnector;
this.sortName = HttpUtility.HtmlDecode(sortName); this.sortName = HttpUtility.HtmlDecode(sortName);
this.authors = authors.Select(HttpUtility.HtmlDecode).ToList()!; this.authors = authors.Select(HttpUtility.HtmlDecode).ToList()!;
this.description = HttpUtility.HtmlDecode(description); this.description = HttpUtility.HtmlDecode(description);
@ -72,7 +72,8 @@ public struct Manga
this.latestChapterDownloaded = 0; this.latestChapterDownloaded = 0;
this.latestChapterAvailable = 0; this.latestChapterAvailable = 0;
this.releaseStatus = releaseStatus; this.releaseStatus = releaseStatus;
this.websiteUrl = websiteUrl??""; this.status = Enum.GetName(releaseStatus) ?? "";
this.websiteUrl = websiteUrl;
} }
public Manga WithMetadata(Manga newManga) public Manga WithMetadata(Manga newManga)
@ -85,6 +86,7 @@ public struct Manga
authors = authors.Union(newManga.authors).ToList(), authors = authors.Union(newManga.authors).ToList(),
altTitles = altTitles.UnionBy(newManga.altTitles, kv => kv.Key).ToDictionary(x => x.Key, x => x.Value), altTitles = altTitles.UnionBy(newManga.altTitles, kv => kv.Key).ToDictionary(x => x.Key, x => x.Value),
tags = tags.Union(newManga.tags).ToArray(), tags = tags.Union(newManga.tags).ToArray(),
status = newManga.status,
releaseStatus = newManga.releaseStatus, releaseStatus = newManga.releaseStatus,
websiteUrl = newManga.websiteUrl, websiteUrl = newManga.websiteUrl,
year = newManga.year, year = newManga.year,
@ -98,6 +100,7 @@ public struct Manga
return false; return false;
return this.description == compareManga.description && return this.description == compareManga.description &&
this.year == compareManga.year && this.year == compareManga.year &&
this.status == compareManga.status &&
this.releaseStatus == compareManga.releaseStatus && this.releaseStatus == compareManga.releaseStatus &&
this.sortName == compareManga.sortName && this.sortName == compareManga.sortName &&
this.latestChapterAvailable.Equals(compareManga.latestChapterAvailable) && this.latestChapterAvailable.Equals(compareManga.latestChapterAvailable) &&
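The Manga hunks above add a status string, make websiteUrl nullable, and extend WithMetadata/Equals so freshly fetched metadata is merged in while locally tracked progress survives. A hedged usage sketch of that merge idea with a stand-in record (MangaStub is not the real Tranga.Manga struct):

using System;
using System.Collections.Generic;
using System.Linq;

public record MangaStub(string SortName, List<string> Authors, string Status, float LatestChapterDownloaded)
{
    // Union the new metadata in, but leave local progress fields untouched.
    public MangaStub WithMetadata(MangaStub fresh) => this with
    {
        Authors = Authors.Union(fresh.Authors).ToList(),
        Status = fresh.Status
    };
}

public static class MergeDemo
{
    public static void Main()
    {
        MangaStub cached = new("OnePunch", new() { "ONE" }, "ongoing", 42f);
        MangaStub fresh  = new("OnePunch", new() { "ONE", "Murata" }, "completed", 0f);
        MangaStub merged = cached.WithMetadata(fresh);
        Console.WriteLine($"{merged.Status} / {merged.LatestChapterDownloaded}"); // completed / 42
    }
}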

View File

@ -8,7 +8,7 @@ namespace Tranga.MangaConnectors;
public class AsuraToon : MangaConnector public class AsuraToon : MangaConnector
{ {
public AsuraToon(GlobalBase clone) : base(clone, "AsuraToon", ["en"], ["asuracomic.net"]) public AsuraToon(GlobalBase clone) : base(clone, "AsuraToon", ["en"])
{ {
this.downloadClient = new ChromiumDownloadClient(clone); this.downloadClient = new ChromiumDownloadClient(clone);
} }
@ -113,7 +113,7 @@ public class AsuraToon : MangaConnector
HtmlNode? firstChapterNode = document.DocumentNode.SelectSingleNode("//a[contains(@href, 'chapter/1')]/../following-sibling::h3"); HtmlNode? firstChapterNode = document.DocumentNode.SelectSingleNode("//a[contains(@href, 'chapter/1')]/../following-sibling::h3");
int? year = int.Parse(firstChapterNode?.InnerText.Split(' ')[^1] ?? "2000"); int? year = int.Parse(firstChapterNode?.InnerText.Split(' ')[^1] ?? "2000");
Manga manga = new (this, sortName, authors, description, altTitles, tags, coverUrl, coverFileNameInCache, links, Manga manga = new (sortName, authors, description, altTitles, tags, coverUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl); year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);
return manga; return manga;
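The year extraction above takes the last whitespace-separated token of the first chapter's heading and falls back to "2000" when the node is missing. A tiny worked example of that expression (the sample string is made up):

using System;

public static class YearDemo
{
    public static void Main()
    {
        string? headingText = "Released January 2021"; // hypothetical node text
        int year = int.Parse(headingText?.Split(' ')[^1] ?? "2000");
        Console.WriteLine(year); // prints 2021
    }
}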

View File

@ -8,7 +8,7 @@ namespace Tranga.MangaConnectors;
public class Bato : MangaConnector public class Bato : MangaConnector
{ {
public Bato(GlobalBase clone) : base(clone, "Bato", ["en"], ["bato.to"]) public Bato(GlobalBase clone) : base(clone, "Bato", ["en"])
{ {
this.downloadClient = new HttpDownloadClient(clone); this.downloadClient = new HttpDownloadClient(clone);
} }
@ -114,8 +114,8 @@ public class Bato : MangaConnector
case "pending": releaseStatus = Manga.ReleaseStatusByte.Unreleased; break; case "pending": releaseStatus = Manga.ReleaseStatusByte.Unreleased; break;
} }
Manga manga = new (this, sortName, authors, description, altTitles, tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(), Manga manga = new (sortName, authors, description, altTitles, tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(),
year, originalLanguage, publicationId, releaseStatus, websiteUrl); year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);
return manga; return manga;
} }

View File

@ -10,7 +10,6 @@ namespace Tranga.MangaConnectors;
internal class ChromiumDownloadClient : DownloadClient internal class ChromiumDownloadClient : DownloadClient
{ {
private static IBrowser? _browser; private static IBrowser? _browser;
private const int StartTimeoutMs = 10000;
private readonly HttpDownloadClient _httpDownloadClient; private readonly HttpDownloadClient _httpDownloadClient;
private static async Task<IBrowser> StartBrowser(Logging.Logger? logger = null) private static async Task<IBrowser> StartBrowser(Logging.Logger? logger = null)
@ -24,7 +23,7 @@ internal class ChromiumDownloadClient : DownloadClient
"--disable-dev-shm-usage", "--disable-dev-shm-usage",
"--disable-setuid-sandbox", "--disable-setuid-sandbox",
"--no-sandbox"}, "--no-sandbox"},
Timeout = StartTimeoutMs Timeout = TrangaSettings.ChromiumStartupTimeoutMs
}, new LoggerFactory([new LogProvider(logger)])); }, new LoggerFactory([new LogProvider(logger)]));
} }
@ -70,17 +69,20 @@ internal class ChromiumDownloadClient : DownloadClient
private RequestResult MakeRequestBrowser(string url, string? referrer = null, string? clickButton = null) private RequestResult MakeRequestBrowser(string url, string? referrer = null, string? clickButton = null)
{ {
if (_browser is null)
return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null);
IPage page = _browser.NewPageAsync().Result; IPage page = _browser.NewPageAsync().Result;
page.DefaultTimeout = 10000; page.DefaultTimeout = TrangaSettings.ChromiumPageTimeoutMs;
page.SetExtraHttpHeadersAsync(new() { { "Referer", referrer } });
IResponse response; IResponse response;
try try
{ {
response = page.GoToAsync(url, WaitUntilNavigation.Networkidle0).Result; response = page.GoToAsync(url, WaitUntilNavigation.Networkidle0).Result;
Log("Page loaded."); Log($"Page loaded. {url}");
} }
catch (Exception e) catch (Exception e)
{ {
Log($"Could not load Page:\n{e.Message}"); Log($"Could not load Page {url}\n{e.Message}");
page.CloseAsync(); page.CloseAsync();
return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null); return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null);
} }
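This hunk replaces the hard-coded Chromium timeouts with values from TrangaSettings, adds a guard for a missing browser instance, and forwards the Referer header to the page. Below is a hedged PuppeteerSharp sketch of that flow; the two constants stand in for TrangaSettings.ChromiumStartupTimeoutMs / ChromiumPageTimeoutMs, and the method is illustrative rather than the repository's download client.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using PuppeteerSharp;

public static class ChromiumSketch
{
    private const int StartupTimeoutMs = 30_000; // stand-in for TrangaSettings.ChromiumStartupTimeoutMs
    private const int PageTimeoutMs = 10_000;    // stand-in for TrangaSettings.ChromiumPageTimeoutMs

    // Assumes a local Chromium/Chrome installation is available to PuppeteerSharp.
    public static async Task<string> FetchAsync(string url, string? referrer = null)
    {
        IBrowser browser = await Puppeteer.LaunchAsync(new LaunchOptions
        {
            Headless = true,
            Args = new[] { "--disable-dev-shm-usage", "--no-sandbox" },
            Timeout = StartupTimeoutMs
        });
        IPage page = await browser.NewPageAsync();
        page.DefaultTimeout = PageTimeoutMs;
        if (referrer is not null)
            await page.SetExtraHttpHeadersAsync(new Dictionary<string, string> { ["Referer"] = referrer });

        IResponse response = await page.GoToAsync(url, WaitUntilNavigation.Networkidle0);
        string content = await page.GetContentAsync();
        Console.WriteLine($"{response.Status} {url}");

        await page.CloseAsync();
        await browser.CloseAsync();
        return content;
    }
}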

View File

@ -2,10 +2,6 @@
using System.Net; using System.Net;
using System.Runtime.InteropServices; using System.Runtime.InteropServices;
using System.Text.RegularExpressions; using System.Text.RegularExpressions;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats.Jpeg;
using SixLabors.ImageSharp.Processing;
using SixLabors.ImageSharp.Processing.Processors.Binarization;
using Tranga.Jobs; using Tranga.Jobs;
using static System.IO.UnixFileMode; using static System.IO.UnixFileMode;
@ -19,13 +15,11 @@ public abstract class MangaConnector : GlobalBase
{ {
internal DownloadClient downloadClient { get; init; } = null!; internal DownloadClient downloadClient { get; init; } = null!;
public string[] SupportedLanguages; public string[] SupportedLanguages;
public string[] BaseUris;
protected MangaConnector(GlobalBase clone, string name, string[] supportedLanguages, string[] baseUris) : base(clone) protected MangaConnector(GlobalBase clone, string name, string[] supportedLanguages) : base(clone)
{ {
this.name = name; this.name = name;
this.SupportedLanguages = supportedLanguages; this.SupportedLanguages = supportedLanguages;
this.BaseUris = baseUris;
Directory.CreateDirectory(TrangaSettings.coverImageCache); Directory.CreateDirectory(TrangaSettings.coverImageCache);
} }
@ -66,7 +60,7 @@ public abstract class MangaConnector : GlobalBase
return Array.Empty<Chapter>(); return Array.Empty<Chapter>();
Log($"Checking for duplicates {manga}"); Log($"Checking for duplicates {manga}");
List<Chapter> newChaptersList = allChapters.Where(nChapter => nChapter.chapterNumber > manga.ignoreChaptersBelow List<Chapter> newChaptersList = allChapters.Where(nChapter => nChapter.chapterNumber >= manga.ignoreChaptersBelow
&& !nChapter.CheckChapterIsDownloaded()).ToList(); && !nChapter.CheckChapterIsDownloaded()).ToList();
Log($"{newChaptersList.Count} new chapters. {manga}"); Log($"{newChaptersList.Count} new chapters. {manga}");
try try
@ -146,22 +140,6 @@ public abstract class MangaConnector : GlobalBase
return requestResult.statusCode; return requestResult.statusCode;
} }
private void ProcessImage(string imagePath)
{
if (!TrangaSettings.bwImages && TrangaSettings.compression == 100)
return;
DateTime start = DateTime.Now;
using Image image = Image.Load(imagePath);
File.Delete(imagePath);
if(TrangaSettings.bwImages)
image.Mutate(i => i.ApplyProcessor(new AdaptiveThresholdProcessor()));
image.SaveAsJpeg(imagePath, new JpegEncoder()
{
Quality = TrangaSettings.compression
});
Log($"Image processing took {DateTime.Now.Subtract(start):s\\.fff} B/W:{TrangaSettings.bwImages} Compression: {TrangaSettings.compression}");
}
protected HttpStatusCode DownloadChapterImages(string[] imageUrls, Chapter chapter, RequestType requestType, string? referrer = null, ProgressToken? progressToken = null) protected HttpStatusCode DownloadChapterImages(string[] imageUrls, Chapter chapter, RequestType requestType, string? referrer = null, ProgressToken? progressToken = null)
{ {
string saveArchiveFilePath = chapter.GetArchiveFilePath(); string saveArchiveFilePath = chapter.GetArchiveFilePath();
@ -200,14 +178,11 @@ public abstract class MangaConnector : GlobalBase
progressToken?.Complete(); progressToken?.Complete();
return HttpStatusCode.NoContent; return HttpStatusCode.NoContent;
} }
foreach (string imageUrl in imageUrls) foreach (string imageUrl in imageUrls)
{ {
string extension = imageUrl.Split('.')[^1].Split('?')[0]; string extension = imageUrl.Split('.')[^1].Split('?')[0];
Log($"Downloading image {chapterNum + 1:000}/{imageUrls.Length:000}"); Log($"Downloading image {chapterNum + 1:000}/{imageUrls.Length:000}"); //TODO
string imagePath = Path.Join(tempFolder, $"{chapterNum++}.{extension}"); HttpStatusCode status = DownloadImage(imageUrl, Path.Join(tempFolder, $"{chapterNum++}.{extension}"), requestType, referrer);
HttpStatusCode status = DownloadImage(imageUrl, imagePath, requestType, referrer);
ProcessImage(imagePath);
Log($"{saveArchiveFilePath} {chapterNum + 1:000}/{imageUrls.Length:000} {status}"); Log($"{saveArchiveFilePath} {chapterNum + 1:000}/{imageUrls.Length:000} {status}");
if ((int)status < 200 || (int)status >= 300) if ((int)status < 200 || (int)status >= 300)
{ {
@ -238,7 +213,7 @@ public abstract class MangaConnector : GlobalBase
return HttpStatusCode.OK; return HttpStatusCode.OK;
} }
protected string SaveCoverImageToCache(string url, string mangaInternalId, RequestType requestType) protected string SaveCoverImageToCache(string url, string mangaInternalId, RequestType requestType, string? referrer = null)
{ {
Regex urlRex = new (@"https?:\/\/((?:[a-zA-Z0-9-]+\.)+[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+))"); Regex urlRex = new (@"https?:\/\/((?:[a-zA-Z0-9-]+\.)+[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+))");
//https?:\/\/[a-zA-Z0-9-]+\.([a-zA-Z0-9-]+\.[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+)) for only second level domains //https?:\/\/[a-zA-Z0-9-]+\.([a-zA-Z0-9-]+\.[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+)) for only second level domains
@ -249,7 +224,7 @@ public abstract class MangaConnector : GlobalBase
if (File.Exists(saveImagePath)) if (File.Exists(saveImagePath))
return saveImagePath; return saveImagePath;
RequestResult coverResult = downloadClient.MakeRequest(url, requestType); RequestResult coverResult = downloadClient.MakeRequest(url, requestType, referrer);
using MemoryStream ms = new(); using MemoryStream ms = new();
coverResult.result.CopyTo(ms); coverResult.result.CopyTo(ms);
Directory.CreateDirectory(TrangaSettings.coverImageCache); Directory.CreateDirectory(TrangaSettings.coverImageCache);
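Two behavioural details in this file's hunks: the new-chapter filter now compares with >= against manga.ignoreChaptersBelow, so the boundary chapter itself is kept, and SaveCoverImageToCache gains an optional referrer that is passed through to the download client. A worked example of the boundary change (illustrative numbers, plain LINQ):

using System;
using System.Linq;

public static class FilterDemo
{
    public static void Main()
    {
        float ignoreChaptersBelow = 10f;
        float[] chapterNumbers = { 9.5f, 10f, 10.5f };

        var strict = chapterNumbers.Where(c => c > ignoreChaptersBelow);     // keeps only 10.5
        var inclusive = chapterNumbers.Where(c => c >= ignoreChaptersBelow); // keeps 10 and 10.5

        Console.WriteLine(string.Join(", ", strict));
        Console.WriteLine(string.Join(", ", inclusive));
    }
}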

View File

@ -32,14 +32,13 @@ public class MangaConnectorJsonConverter : JsonConverter
"MangaDex" => this._connectors.First(c => c is MangaDex), "MangaDex" => this._connectors.First(c => c is MangaDex),
"Manganato" => this._connectors.First(c => c is Manganato), "Manganato" => this._connectors.First(c => c is Manganato),
"MangaKatana" => this._connectors.First(c => c is MangaKatana), "MangaKatana" => this._connectors.First(c => c is MangaKatana),
"Mangasee" => this._connectors.First(c => c is Mangasee),
"Mangaworld" => this._connectors.First(c => c is Mangaworld), "Mangaworld" => this._connectors.First(c => c is Mangaworld),
"Bato" => this._connectors.First(c => c is Bato), "Bato" => this._connectors.First(c => c is Bato),
"Manga4Life" => this._connectors.First(c => c is MangaLife),
"ManhuaPlus" => this._connectors.First(c => c is ManhuaPlus), "ManhuaPlus" => this._connectors.First(c => c is ManhuaPlus),
"MangaHere" => this._connectors.First(c => c is MangaHere), "MangaHere" => this._connectors.First(c => c is MangaHere),
"AsuraToon" => this._connectors.First(c => c is AsuraToon), "AsuraToon" => this._connectors.First(c => c is AsuraToon),
"Weebcentral" => this._connectors.First(c => c is Weebcentral), "Weebcentral" => this._connectors.First(c => c is Weebcentral),
"Webtoons" => this._connectors.First(c => c is Webtoons),
_ => throw new UnreachableException($"Could not find Connector with name {connectorName}") _ => throw new UnreachableException($"Could not find Connector with name {connectorName}")
}; };
} }

View File

@ -10,7 +10,7 @@ public class MangaDex : MangaConnector
//https://api.mangadex.org/docs/3-enumerations/#language-codes--localization //https://api.mangadex.org/docs/3-enumerations/#language-codes--localization
//https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes //https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes
//https://gist.github.com/Josantonius/b455e315bc7f790d14b136d61d9ae469 //https://gist.github.com/Josantonius/b455e315bc7f790d14b136d61d9ae469
public MangaDex(GlobalBase clone) : base(clone, "MangaDex", ["en","pt","pt-br","it","de","ru","aa","ab","ae","af","ak","am","an","ar-ae","ar-bh","ar-dz","ar-eg","ar-iq","ar-jo","ar-kw","ar-lb","ar-ly","ar-ma","ar-om","ar-qa","ar-sa","ar-sy","ar-tn","ar-ye","ar","as","av","ay","az","ba","be","bg","bh","bi","bm","bn","bo","br","bs","ca","ce","ch","co","cr","cs","cu","cv","cy","da","de-at","de-ch","de-de","de-li","de-lu","div","dv","dz","ee","el","en-au","en-bz","en-ca","en-cb","en-gb","en-ie","en-jm","en-nz","en-ph","en-tt","en-us","en-za","en-zw","eo","es-ar","es-bo","es-cl","es-co","es-cr","es-do","es-ec","es-es","es-gt","es-hn","es-la","es-mx","es-ni","es-pa","es-pe","es-pr","es-py","es-sv","es-us","es-uy","es-ve","es","et","eu","fa","ff","fi","fj","fo","fr-be","fr-ca","fr-ch","fr-fr","fr-lu","fr-mc","fr","fy","ga","gd","gl","gn","gu","gv","ha","he","hi","ho","hr-ba","hr-hr","hr","ht","hu","hy","hz","ia","id","ie","ig","ii","ik","in","io","is","it-ch","it-it","iu","iw","ja","ja-ro","ji","jv","jw","ka","kg","ki","kj","kk","kl","km","kn","ko","ko-ro","kr","ks","ku","kv","kw","ky","kz","la","lb","lg","li","ln","lo","ls","lt","lu","lv","mg","mh","mi","mk","ml","mn","mo","mr","ms-bn","ms-my","ms","mt","my","na","nb","nd","ne","ng","nl-be","nl-nl","nl","nn","no","nr","ns","nv","ny","oc","oj","om","or","os","pa","pi","pl","ps","pt-pt","qu-bo","qu-ec","qu-pe","qu","rm","rn","ro","rw","sa","sb","sc","sd","se-fi","se-no","se-se","se","sg","sh","si","sk","sl","sm","sn","so","sq","sr-ba","sr-sp","sr","ss","st","su","sv-fi","sv-se","sv","sw","sx","syr","ta","te","tg","th","ti","tk","tl","tn","to","tr","ts","tt","tw","ty","ug","uk","ur","us","uz","ve","vi","vo","wa","wo","xh","yi","yo","za","zh-cn","zh-hk","zh-mo","zh-ro","zh-sg","zh-tw","zh","zu"], ["mangadex.org"]) public MangaDex(GlobalBase clone) : base(clone, "MangaDex", ["en","pt","pt-br","it","de","ru","aa","ab","ae","af","ak","am","an","ar-ae","ar-bh","ar-dz","ar-eg","ar-iq","ar-jo","ar-kw","ar-lb","ar-ly","ar-ma","ar-om","ar-qa","ar-sa","ar-sy","ar-tn","ar-ye","ar","as","av","ay","az","ba","be","bg","bh","bi","bm","bn","bo","br","bs","ca","ce","ch","co","cr","cs","cu","cv","cy","da","de-at","de-ch","de-de","de-li","de-lu","div","dv","dz","ee","el","en-au","en-bz","en-ca","en-cb","en-gb","en-ie","en-jm","en-nz","en-ph","en-tt","en-us","en-za","en-zw","eo","es-ar","es-bo","es-cl","es-co","es-cr","es-do","es-ec","es-es","es-gt","es-hn","es-la","es-mx","es-ni","es-pa","es-pe","es-pr","es-py","es-sv","es-us","es-uy","es-ve","es","et","eu","fa","ff","fi","fj","fo","fr-be","fr-ca","fr-ch","fr-fr","fr-lu","fr-mc","fr","fy","ga","gd","gl","gn","gu","gv","ha","he","hi","ho","hr-ba","hr-hr","hr","ht","hu","hy","hz","ia","id","ie","ig","ii","ik","in","io","is","it-ch","it-it","iu","iw","ja","ja-ro","ji","jv","jw","ka","kg","ki","kj","kk","kl","km","kn","ko","ko-ro","kr","ks","ku","kv","kw","ky","kz","la","lb","lg","li","ln","lo","ls","lt","lu","lv","mg","mh","mi","mk","ml","mn","mo","mr","ms-bn","ms-my","ms","mt","my","na","nb","nd","ne","ng","nl-be","nl-nl","nl","nn","no","nr","ns","nv","ny","oc","oj","om","or","os","pa","pi","pl","ps","pt-pt","qu-bo","qu-ec","qu-pe","qu","rm","rn","ro","rw","sa","sb","sc","sd","se-fi","se-no","se-se","se","sg","sh","si","sk","sl","sm","sn","so","sq","sr-ba","sr-sp","sr","ss","st","su","sv-fi","sv-se","sv","sw","sx","syr","ta","te","tg","th","ti","tk","tl","tn","to","tr","ts","tt","tw","ty","ug","uk","ur","us","uz","ve","vi","vo","wa","wo","xh","yi","yo","za","zh-cn","zh-hk","zh-mo","zh-ro","zh-sg","zh-tw","zh","zu"])
{ {
this.downloadClient = new HttpDownloadClient(clone); this.downloadClient = new HttpDownloadClient(clone);
} }
@ -129,10 +129,10 @@ public class MangaDex : MangaConnector
false => null false => null
}; };
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased; Manga.ReleaseStatusByte status = Manga.ReleaseStatusByte.Unreleased;
if (attributes.TryGetPropertyValue("status", out JsonNode? statusNode)) if (attributes.TryGetPropertyValue("status", out JsonNode? statusNode))
{ {
releaseStatus = statusNode?.GetValue<string>().ToLower() switch status = statusNode?.GetValue<string>().ToLower() switch
{ {
"ongoing" => Manga.ReleaseStatusByte.Continuing, "ongoing" => Manga.ReleaseStatusByte.Continuing,
"completed" => Manga.ReleaseStatusByte.Completed, "completed" => Manga.ReleaseStatusByte.Completed,
@ -176,7 +176,6 @@ public class MangaDex : MangaConnector
} }
Manga pub = new( Manga pub = new(
this,
title, title,
authors, authors,
description, description,
@ -188,8 +187,8 @@ public class MangaDex : MangaConnector
year, year,
originalLanguage, originalLanguage,
publicationId, publicationId,
releaseStatus, status,
$"https://mangadex.org/title/{publicationId}" websiteUrl: $"https://mangadex.org/title/{publicationId}"
); );
AddMangaToCache(pub); AddMangaToCache(pub);
return pub; return pub;
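The MangaDex hunk maps the API's status attribute onto Manga.ReleaseStatusByte through a switch over the lowercased string. A minimal sketch of that mapping; only the "ongoing" and "completed" cases are taken from the visible diff, while the hiatus/cancelled/default arms are assumptions based on the enum names appearing elsewhere in this changeset.

public enum ReleaseStatusSketch : byte { Continuing, Completed, OnHiatus, Cancelled, Unreleased }

public static class StatusMap
{
    public static ReleaseStatusSketch FromStatusString(string? status) => status?.ToLower() switch
    {
        "ongoing" => ReleaseStatusSketch.Continuing,
        "completed" => ReleaseStatusSketch.Completed,
        "hiatus" => ReleaseStatusSketch.OnHiatus,     // assumed mapping
        "cancelled" => ReleaseStatusSketch.Cancelled, // assumed mapping
        _ => ReleaseStatusSketch.Unreleased
    };
}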

View File

@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class MangaHere : MangaConnector public class MangaHere : MangaConnector
{ {
public MangaHere(GlobalBase clone) : base(clone, "MangaHere", ["en"], ["www.mangahere.cc"]) public MangaHere(GlobalBase clone) : base(clone, "MangaHere", ["en"])
{ {
this.downloadClient = new ChromiumDownloadClient(clone); this.downloadClient = new ChromiumDownloadClient(clone);
} }
@ -101,7 +101,7 @@ public class MangaHere : MangaConnector
.SelectSingleNode("//p[contains(concat(' ',normalize-space(@class),' '),' fullcontent ')]"); .SelectSingleNode("//p[contains(concat(' ',normalize-space(@class),' '),' fullcontent ')]");
string description = descriptionNode.InnerText; string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, coverFileNameInCache, links,
null, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl); null, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);

View File

@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class MangaKatana : MangaConnector public class MangaKatana : MangaConnector
{ {
public MangaKatana(GlobalBase clone) : base(clone, "MangaKatana", ["en"], ["mangakatana.com"]) public MangaKatana(GlobalBase clone) : base(clone, "MangaKatana", ["en"])
{ {
this.downloadClient = new HttpDownloadClient(clone); this.downloadClient = new HttpDownloadClient(clone);
} }
@ -141,8 +141,8 @@ public class MangaKatana : MangaConnector
year = Convert.ToInt32(yearString); year = Convert.ToInt32(yearString);
} }
Manga manga = new (this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links, Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl); year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);
return manga; return manga;
} }

View File

@ -1,206 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class MangaLife : MangaConnector
{
public MangaLife(GlobalBase clone) : base(clone, "Manga4Life", ["en"], ["manga4life.com"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = WebUtility.UrlEncode(publicationTitle);
string requestUrl = $"https://manga4life.com/search/?name={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://manga4life.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/(www\.)?manga4life.com\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[2].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if(requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNode resultsNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']/div[last()]/div[1]/div");
if (resultsNode.Descendants("div").Count() == 1 && resultsNode.Descendants("div").First().HasClass("NoResults"))
{
Log("No results.");
return Array.Empty<Manga>();
}
Log($"{resultsNode.SelectNodes("div").Count} items.");
HashSet<Manga> ret = new();
foreach (HtmlNode resultNode in resultsNode.SelectNodes("div"))
{
string url = resultNode.Descendants().First(d => d.HasClass("SeriesName")).GetAttributeValue("href", "");
Manga? manga = GetMangaFromUrl($"https://manga4life.com{url}");
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//img");
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//h1");
string sortName = titleNode.InnerText;
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Author(s):']/..").Descendants("a")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Genre(s):']/..").Descendants("a")
.ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Released:']/..").Descendants("a")
.First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Status:']/..").Descendants("a")
.ToArray();
foreach (HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Description:']/..")
.Descendants("div").First();
string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
RequestResult result = downloadClient.MakeRequest($"https://manga4life.com/manga/{manga.publicationId}", RequestType.Default, clickButton:"[class*='ShowAllChapters']");
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
return Array.Empty<Chapter>();
}
HtmlNodeCollection chapterNodes = result.htmlDocument.DocumentNode.SelectNodes(
"//a[contains(concat(' ',normalize-space(@class),' '),' ChapterLink ')]");
string[] urls = chapterNodes.Select(node => node.GetAttributeValue("href", "")).ToArray();
Regex urlRex = new (@"-chapter-([0-9\\.]+)(-index-([0-9\\.]+))?");
List<Chapter> chapters = new();
foreach (string url in urls)
{
Match rexMatch = urlRex.Match(url);
string volumeNumber = "1";
if (rexMatch.Groups[3].Value.Length > 0)
volumeNumber = rexMatch.Groups[3].Value;
string chapterNumber = rexMatch.Groups[1].Value;
string fullUrl = $"https://manga4life.com{url}";
fullUrl = fullUrl.Replace(Regex.Match(url,"(-page-[0-9])").Value,"");
try
{
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter, RequestType.MangaImage, progressToken:progressToken);
}
}

View File

@ -8,7 +8,7 @@ namespace Tranga.MangaConnectors;
public class Manganato : MangaConnector public class Manganato : MangaConnector
{ {
public Manganato(GlobalBase clone) : base(clone, "Manganato", ["en"], ["manganato.com"]) public Manganato(GlobalBase clone) : base(clone, "Manganato", ["en"])
{ {
this.downloadClient = new HttpDownloadClient(clone); this.downloadClient = new HttpDownloadClient(clone);
} }
@ -17,7 +17,7 @@ public class Manganato : MangaConnector
{ {
Log($"Searching Publications. Term=\"{publicationTitle}\""); Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join('_', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower(); string sanitizedTitle = string.Join('_', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower();
string requestUrl = $"https://manganato.com/search/story/{sanitizedTitle}"; string requestUrl = $"https://manganato.gg/search/story/{sanitizedTitle}";
RequestResult requestResult = RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default); downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300) if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
@ -32,13 +32,19 @@ public class Manganato : MangaConnector
private Manga[] ParsePublicationsFromHtml(HtmlDocument document) private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{ {
List<HtmlNode> searchResults = document.DocumentNode.Descendants("div").Where(n => n.HasClass("search-story-item")).ToList(); List<HtmlNode> searchResults = document.DocumentNode.Descendants("div").Where(n => n.HasClass("story_item")).ToList();
Log($"{searchResults.Count} items."); Log($"{searchResults.Count} items.");
List<string> urls = new(); List<string> urls = new();
foreach (HtmlNode mangaResult in searchResults) foreach (HtmlNode mangaResult in searchResults)
{ {
urls.Add(mangaResult.Descendants("a").First(n => n.HasClass("item-title")).GetAttributes()
    .First(a => a.Name == "href").Value);
try
{
    urls.Add(mangaResult.Descendants("h3").First(n => n.HasClass("story_name"))
        .Descendants("a").First().GetAttributeValue("href", ""));
} catch
{
    //failed to get a url, send it to the void
}
} }
HashSet<Manga> ret = new(); HashSet<Manga> ret = new();
@ -78,69 +84,57 @@ public class Manganato : MangaConnector
string originalLanguage = ""; string originalLanguage = "";
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased; Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode infoNode = document.DocumentNode.Descendants("div").First(d => d.HasClass("story-info-right")); HtmlNode infoNode = document.DocumentNode.Descendants("ul").First(d => d.HasClass("manga-info-text"));
string sortName = infoNode.Descendants("h1").First().InnerText; string sortName = infoNode.Descendants("h1").First().InnerText;
HtmlNode infoTable = infoNode.Descendants().First(d => d.Name == "table");
foreach (HtmlNode row in infoTable.Descendants("tr"))
{
    string key = row.SelectNodes("td").First().InnerText.ToLower();
    string value = row.SelectNodes("td").Last().InnerText;
    string keySanitized = string.Concat(Regex.Matches(key, "[a-z]"));
    switch (keySanitized)
    {
        case "alternative":
            string[] alts = value.Split(" ; ");
            for(int i = 0; i < alts.Length; i++)
                altTitles.Add(i.ToString(), alts[i]);
            break;
        case "authors":
            authors = value.Split('-');
            for (int i = 0; i < authors.Length; i++)
                authors[i] = authors[i].Replace("\r\n", "");
            break;
        case "status":
            switch (value.ToLower())
            {
                case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
                case "completed": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
            }
            break;
        case "genres":
            string[] genres = value.Split(" - ");
            for (int i = 0; i < genres.Length; i++)
                genres[i] = genres[i].Replace("\r\n", "");
            tags = genres.ToHashSet();
            break;
    }
}
foreach (HtmlNode li in infoNode.Descendants("li"))
{
    string text = li.InnerText.Trim().ToLower();
    if (text.StartsWith("author(s) :"))
    {
        authors = li.Descendants("a").Select(a => a.InnerText.Trim()).ToArray();
    }
    else if (text.StartsWith("status :"))
    {
        string status = text.Replace("status :", "").Trim().ToLower();
        if (string.IsNullOrWhiteSpace(status))
            releaseStatus = Manga.ReleaseStatusByte.Continuing;
        else if (status == "ongoing")
            releaseStatus = Manga.ReleaseStatusByte.Continuing;
        else
            releaseStatus = Enum.Parse<Manga.ReleaseStatusByte>(status, true);
    }
    else if (li.HasClass("genres"))
    {
        tags = li.Descendants("a").Select(a => a.InnerText.Trim()).ToHashSet();
    }
}
string posterUrl = document.DocumentNode.Descendants("span").First(s => s.HasClass("info-image")).Descendants("img").First() string posterUrl = document.DocumentNode.Descendants("div").First(s => s.HasClass("manga-info-pic")).Descendants("img").First()
.GetAttributes().First(a => a.Name == "src").Value; .GetAttributes().First(a => a.Name == "src").Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover); string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover, "https://www.manganato.gg/");
string description = document.DocumentNode.Descendants("div").First(d => d.HasClass("panel-story-info-description")) string description = document.DocumentNode.SelectSingleNode("//div[@id='contentBox']")
.InnerText.Replace("Description :", ""); .InnerText.Replace("Description :", "");
while (description.StartsWith('\n')) while (description.StartsWith('\n'))
description = description.Substring(1); description = description.Substring(1);
string pattern = "MMM dd,yyyy HH:mm"; string pattern = "MMM-dd-yyyy HH:mm";
HtmlNode? oldestChapter = document.DocumentNode HtmlNode? oldestChapter = document.DocumentNode
.SelectNodes("//span[contains(concat(' ',normalize-space(@class),' '),' chapter-time ')]").MaxBy( .SelectNodes("//div[contains(concat(' ',normalize-space(@class),' '),' row ')]/span[@title]").MaxBy(
node => DateTime.ParseExact(node.GetAttributeValue("title", "Dec 31 2400, 23:59"), pattern, node => DateTime.ParseExact(node.GetAttributeValue("title", "Dec-31-2400 23:59"), pattern,
CultureInfo.InvariantCulture).Millisecond); CultureInfo.InvariantCulture).Millisecond);
int year = DateTime.ParseExact(oldestChapter?.GetAttributeValue("title", "Dec 31 2400, 23:59")??"Dec 31 2400, 23:59", pattern, int year = DateTime.ParseExact(oldestChapter?.GetAttributeValue("title", "Dec 31 2400, 23:59")??"Dec 31 2400, 23:59", pattern,
CultureInfo.InvariantCulture).Year; CultureInfo.InvariantCulture).Year;
Manga manga = new (this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links, Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl); year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);
return manga; return manga;
} }
@ -148,7 +142,7 @@ public class Manganato : MangaConnector
public override Chapter[] GetChapters(Manga manga, string language="en") public override Chapter[] GetChapters(Manga manga, string language="en")
{ {
Log($"Getting chapters {manga}"); Log($"Getting chapters {manga}");
string requestUrl = $"https://chapmanganato.com/{manga.publicationId}"; string requestUrl = manga.websiteUrl;
RequestResult requestResult = RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default); downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300) if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
@ -166,21 +160,21 @@ public class Manganato : MangaConnector
{ {
List<Chapter> ret = new(); List<Chapter> ret = new();
HtmlNode chapterList = document.DocumentNode.Descendants("ul").First(l => l.HasClass("row-content-chapter")); HtmlNode chapterList = document.DocumentNode.Descendants("div").First(l => l.HasClass("chapter-list"));
Regex volRex = new(@"Vol\.([0-9]+).*"); Regex volRex = new(@"Vol\.([0-9]+).*");
Regex chapterRex = new(@"https:\/\/chapmanganato.[A-z]+\/manga-[A-z0-9]+\/chapter-([0-9\.]+)"); Regex chapterRex = new(@"https:\/\/chapmanganato.[A-z]+\/manga-[A-z0-9]+\/chapter-([0-9\.]+)");
Regex nameRex = new(@"Chapter ([0-9]+(\.[0-9]+)*){1}:? (.*)"); Regex nameRex = new(@"Chapter ([0-9]+(\.[0-9]+)*){1}:? (.*)");
foreach (HtmlNode chapterInfo in chapterList.Descendants("li")) foreach (HtmlNode chapterInfo in chapterList.Descendants("div").Where(x => x.HasClass("row")))
{ {
string fullString = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name")).InnerText; string url = chapterInfo.Descendants("a").First().GetAttributeValue("href", "");
var name = chapterInfo.Descendants("a").First().InnerText.Trim();
string url = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name")) string chapterName = nameRex.Match(name).Groups[3].Value;
.GetAttributeValue("href", ""); string chapterNumber = Regex.Match(name, @"Chapter ([0-9]+(\.[0-9]+)*)").Groups[1].Value;
string? volumeNumber = volRex.IsMatch(fullString) ? volRex.Match(fullString).Groups[1].Value : null; string? volumeNumber = Regex.Match(chapterName, @"Vol\.([0-9]+)").Groups[1].Value;
string chapterNumber = chapterRex.Match(url).Groups[1].Value; if (string.IsNullOrWhiteSpace(volumeNumber))
string chapterName = nameRex.Match(fullString).Groups[3].Value; volumeNumber = "0";
try try
{ {
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url)); ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
@ -221,7 +215,7 @@ public class Manganato : MangaConnector
string[] imageUrls = ParseImageUrlsFromHtml(requestResult.htmlDocument); string[] imageUrls = ParseImageUrlsFromHtml(requestResult.htmlDocument);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, "https://chapmanganato.com/", progressToken:progressToken); return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, "https://www.manganato.gg", progressToken:progressToken);
} }
private string[] ParseImageUrlsFromHtml(HtmlDocument document) private string[] ParseImageUrlsFromHtml(HtmlDocument document)

View File

@ -1,233 +0,0 @@
using System.Data;
using System.Net;
using System.Text.RegularExpressions;
using System.Xml.Linq;
using HtmlAgilityPack;
using Newtonsoft.Json;
using Soenneker.Utils.String.NeedlemanWunsch;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Mangasee : MangaConnector
{
public Mangasee(GlobalBase clone) : base(clone, "Mangasee", ["en"], ["mangasee123.com"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
private struct SearchResult
{
public string i { get; set; }
public string s { get; set; }
public string[] a { get; set; }
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string requestUrl = "https://mangasee123.com/_search.php";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
Log($"Failed to retrieve search: {requestResult.statusCode}");
return Array.Empty<Manga>();
}
try
{
SearchResult[] searchResults = JsonConvert.DeserializeObject<SearchResult[]>(requestResult.htmlDocument!.DocumentNode.InnerText) ??
throw new NoNullAllowedException();
SearchResult[] filteredResults = FilteredResults(publicationTitle, searchResults);
Log($"Total available manga: {searchResults.Length} Filtered down to: {filteredResults.Length}");
string[] urls = filteredResults.Select(result => $"https://mangasee123.com/manga/{result.i}").ToArray();
List<Manga> searchResultManga = new();
foreach (string url in urls)
{
Manga? newManga = GetMangaFromUrl(url);
if(newManga is { } manga)
searchResultManga.Add(manga);
}
Log($"Retrieved {searchResultManga.Count} publications. Term=\"{publicationTitle}\"");
return searchResultManga.ToArray();
}
catch (NoNullAllowedException)
{
Log("Failed to retrieve search");
return Array.Empty<Manga>();
}
}
private readonly string[] _filterWords = {"a", "the", "of", "as", "to", "no", "for", "on", "with", "be", "and", "in", "wa", "at", "be", "ni"};
private string ToFilteredString(string input) => string.Join(' ', input.ToLower().Split(' ').Where(word => _filterWords.Contains(word) == false));
private SearchResult[] FilteredResults(string publicationTitle, SearchResult[] unfilteredSearchResults)
{
Dictionary<SearchResult, int> similarity = new();
foreach (SearchResult sr in unfilteredSearchResults)
{
List<int> scores = new();
string filteredPublicationString = ToFilteredString(publicationTitle);
string filteredSString = ToFilteredString(sr.s);
scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredSString, filteredPublicationString));
foreach (string srA in sr.a)
{
string filteredAString = ToFilteredString(srA);
scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredAString, filteredPublicationString));
}
similarity.Add(sr, scores.Sum() / scores.Count);
}
List<SearchResult> ret = similarity.OrderBy(s => s.Value).Take(10).Select(s => s.Key).ToList();
return ret.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://mangasee123.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/mangasee123.com\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[1].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if((int)requestResult.statusCode < 300 && (int)requestResult.statusCode >= 200 && requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//img");
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//h1");
string sortName = titleNode.InnerText;
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Author(s):']/..").Descendants("a")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Genre(s):']/..").Descendants("a")
.ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Released:']/..").Descendants("a")
.First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Status:']/..").Descendants("a")
.ToArray();
foreach (HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Description:']/..")
.Descendants("div").First();
string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
try
{
XDocument doc = XDocument.Load($"https://mangasee123.com/rss/{manga.publicationId}.xml");
XElement[] chapterItems = doc.Descendants("item").ToArray();
List<Chapter> chapters = new();
Regex chVolRex = new(@".*chapter-([0-9\.]+)(?:-index-([0-9\.]+))?.*");
foreach (XElement chapter in chapterItems)
{
string url = chapter.Descendants("link").First().Value;
Match m = chVolRex.Match(url);
string? volumeNumber = m.Groups[2].Success ? m.Groups[2].Value : "1";
string chapterNumber = m.Groups[1].Value;
string chapterUrl = Regex.Replace(url, @"-page-[0-9]+(\.html)", ".html");
try
{
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, chapterUrl));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
catch (HttpRequestException e)
{
Log($"Failed to load https://mangasee123.com/rss/{manga.publicationId}.xml \n\r{e}");
return Array.Empty<Chapter>();
}
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
return DownloadChapterImages(urls.ToArray(), chapter, RequestType.MangaImage, progressToken:progressToken);
}
}

View File

@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class Mangaworld: MangaConnector public class Mangaworld: MangaConnector
{ {
public Mangaworld(GlobalBase clone) : base(clone, "Mangaworld", ["it"], ["www.mangaworld.ac"]) public Mangaworld(GlobalBase clone) : base(clone, "Mangaworld", ["it"])
{ {
this.downloadClient = new ChromiumDownloadClient(clone); this.downloadClient = new ChromiumDownloadClient(clone);
} }
@ -118,8 +118,8 @@ public class Mangaworld: MangaConnector
string yearString = metadata.SelectSingleNode("//span[text()='Anno di uscita: ']/..").SelectNodes("a").First().InnerText; string yearString = metadata.SelectSingleNode("//span[text()='Anno di uscita: ']/..").SelectNodes("a").First().InnerText;
int year = Convert.ToInt32(yearString); int year = Convert.ToInt32(yearString);
Manga manga = new (this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links, Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl); year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);
return manga; return manga;
} }

View File

@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class ManhuaPlus : MangaConnector public class ManhuaPlus : MangaConnector
{ {
public ManhuaPlus(GlobalBase clone) : base(clone, "ManhuaPlus", ["en"], ["manhuaplus.org"]) public ManhuaPlus(GlobalBase clone) : base(clone, "ManhuaPlus", ["en"])
{ {
this.downloadClient = new ChromiumDownloadClient(clone); this.downloadClient = new ChromiumDownloadClient(clone);
} }
@ -127,7 +127,7 @@ public class ManhuaPlus : MangaConnector
.SelectSingleNode("//div[@id='syn-target']"); .SelectSingleNode("//div[@id='syn-target']");
string description = descriptionNode.InnerText; string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl); year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga); AddMangaToCache(manga);

View File

@ -0,0 +1,273 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Webtoons : MangaConnector
{
public Webtoons(GlobalBase clone) : base(clone, "Webtoons", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
// Done
public override Manga[] GetManga(string publicationTitle = "")
{
string sanitizedTitle = string.Join(' ', Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string requestUrl = $"https://www.webtoons.com/en/search?keyword={sanitizedTitle}&searchType=WEBTOON";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300) {
Log($"Failed to retrieve site");
return Array.Empty<Manga>();
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<Manga>();
}
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
// Done
public override Manga? GetMangaFromId(string publicationId)
{
PublicationManager pb = new PublicationManager(publicationId);
return GetMangaFromUrl($"https://www.webtoons.com/en/{pb.Category}/{pb.Title}/list?title_no={pb.Id}");
}
// Done
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300) {
return null;
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return null;
}
Regex regex = new Regex(@".*webtoons\.com\/en\/(?<category>[^\/]+)\/(?<title>[^\/]+)\/list\?title_no=(?<id>\d+).*");
Match match = regex.Match(url);
if(match.Success) {
PublicationManager pm = new PublicationManager(match.Groups["title"].Value, match.Groups["category"].Value, match.Groups["id"].Value);
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, pm.getPublicationId(), url);
}
Log($"Failed match Regex ID");
return null;
}
// Done
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNode mangaList = document.DocumentNode.SelectSingleNode("//ul[contains(@class, 'card_lst')]");
if (!mangaList.ChildNodes.Any(node => node.Name == "li")) {
Log($"Failed to parse publication");
return Array.Empty<Manga>();
}
List<string> urls = document.DocumentNode
.SelectNodes("//ul[contains(@class, 'card_lst')]/li/a")
.Select(node => node.GetAttributeValue("href", "https://www.webtoons.com"))
.ToList();
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private string capitalizeString(string str = "") {
if(str.Length == 0) return "";
if(str.Length == 1) return str.ToUpper();
return char.ToUpper(str[0]) + str.Substring(1).ToLower();
}
// Done
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
HtmlNode infoNode1 = document.DocumentNode.SelectSingleNode("//*[@id='content']/div[2]/div[1]/div[1]");
HtmlNode infoNode2 = document.DocumentNode.SelectSingleNode("//*[@id='content']/div[2]/div[2]/div[2]");
string sortName = infoNode1.SelectSingleNode(".//h1[contains(@class, 'subj')]").InnerText;
string description = infoNode2.SelectSingleNode(".//p[contains(@class, 'summary')]")
.InnerText.Trim();
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[contains(@class, 'detail_body') and contains(@class, 'banner')]");
Regex regex = new Regex(@"url\('(?<url>.*?)'\)");
Match match = regex.Match(posterNode.GetAttributeValue("style", ""));
string posterUrl = match.Groups["url"].Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover, websiteUrl);
string genre = infoNode1.SelectSingleNode(".//h2[contains(@class, 'genre')]")
.InnerText.Trim();
string[] tags = [ genre ];
List<HtmlNode> authorsNodes = infoNode1.SelectSingleNode(".//div[contains(@class, 'author_area')]").Descendants("a").ToList();
List<string> authors = authorsNodes.Select(node => node.InnerText.Trim()).ToList();
string originalLanguage = "";
int year = DateTime.Now.Year;
string status1 = infoNode2.SelectSingleNode(".//p").InnerText;
string status2 = infoNode2.SelectSingleNode(".//p/span").InnerText;
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
if(status2.Length == 0 || status1.ToLower() == "completed") {
releaseStatus = Manga.ReleaseStatusByte.Completed;
} else if(status2.ToLower() == "up") {
releaseStatus = Manga.ReleaseStatusByte.Continuing;
}
Manga manga = new(sortName, authors, description, new Dictionary<string, string>(), tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(),
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
// Done
public override Chapter[] GetChapters(Manga manga, string language = "en")
{
PublicationManager pm = new PublicationManager(manga.publicationId);
string requestUrl = $"https://www.webtoons.com/en/{pm.Category}/{pm.Title}/list?title_no={pm.Id}";
// Leaving this in for verification if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
// Get number of pages
int pages = requestResult.htmlDocument.DocumentNode
.SelectNodes("//div[contains(@class, 'paginate')]/a")
.ToList()
.Count;
List<Chapter> chapters = new List<Chapter>();
for(int page = 1; page <= pages; page++) {
string pageRequestUrl = $"{requestUrl}&page={page}";
chapters.AddRange(ParseChaptersFromHtml(manga, pageRequestUrl));
}
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
// Done
private List<Chapter> ParseChaptersFromHtml(Manga manga, string mangaUrl)
{
RequestResult result = downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
Log("Failed to load site");
return new List<Chapter>();
}
List<Chapter> ret = new();
foreach (HtmlNode chapterInfo in result.htmlDocument.DocumentNode.SelectNodes("//ul/li[contains(@class, '_episodeItem')]"))
{
HtmlNode infoNode = chapterInfo.SelectSingleNode(".//a");
string url = infoNode.GetAttributeValue("href", "");
string id = chapterInfo.GetAttributeValue("id", "");
if(id == "") continue;
string? volumeNumber = null;
string chapterNumber = chapterInfo.GetAttributeValue("data-episode-no", "");
if(chapterNumber == "") continue;
string chapterName = infoNode.SelectSingleNode(".//span[contains(@class, 'subj')]/span").InnerText.Trim();
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
return ret;
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = chapter.url;
// Leaving this in to check if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, progressToken:progressToken, referrer: requestUrl);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)
{
RequestResult requestResult =
downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
return Array.Empty<string>();
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<string>();
}
return requestResult.htmlDocument.DocumentNode
.SelectNodes("//*[@id='_imageList']/img")
.Select(node =>
node.GetAttributeValue("data-url", ""))
.ToArray();
}
}
internal class PublicationManager {
public PublicationManager(string title = "", string category = "", string id = "") {
this.Title = title;
this.Category = category;
this.Id = id;
}
public PublicationManager(string publicationId) {
string[] parts = publicationId.Split("|");
if(parts.Length == 3) {
this.Title = parts[0];
this.Category = parts[1];
this.Id = parts[2];
} else {
this.Title = "";
this.Category = "";
this.Id = "";
}
}
public string getPublicationId() {
return $"{this.Title}|{this.Category}|{this.Id}";
}
public string Title { get; set; }
public string Category { get; set; }
public string Id { get; set; }
}
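Note: the publication identifier the Webtoons connector stores and later splits apart is the pipe-joined string produced by getPublicationId() above ("Title|Category|Id"). The following is a minimal, self-contained sketch of that round-trip; it is an illustration, not Tranga's own class, and the example values are hypothetical.

// Minimal illustration of the "Title|Category|Id" identifier format used by the
// Webtoons connector above. Standalone sketch, not Tranga's own class; the example
// values ("tower-of-god", "fantasy", "95") are hypothetical.
using System;

internal static class PublicationIdDemo
{
    // Compose the identifier the same way getPublicationId() does.
    private static string Compose(string title, string category, string id) => $"{title}|{category}|{id}";

    // Split it back apart; anything that is not exactly three parts falls back to empty values.
    private static (string Title, string Category, string Id) Parse(string publicationId)
    {
        string[] parts = publicationId.Split('|');
        return parts.Length == 3 ? (parts[0], parts[1], parts[2]) : ("", "", "");
    }

    private static void Main()
    {
        string id = Compose("tower-of-god", "fantasy", "95");
        Console.WriteLine(id);                  // tower-of-god|fantasy|95
        Console.WriteLine(Parse(id).Category);  // fantasy
    }
}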


@@ -1,7 +1,6 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
-using Soenneker.Utils.String.NeedlemanWunsch;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
@@ -22,10 +21,10 @@ public class Weebcentral : MangaConnector
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
const int limit = 32; //How many values we want returned at once
-var offset = 0; //"Page"
+int offset = 0; //"Page"
-var requestUrl =
+string requestUrl =
$"{_baseUrl}/search/data?limit={limit}&offset={offset}&text={publicationTitle}&sort=Best+Match&order=Ascending&official=Any&display_mode=Minimal%20Display";
-var requestResult =
+RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300 ||
requestResult.htmlDocument == null)
@@ -34,7 +33,7 @@ public class Weebcentral : MangaConnector
return [];
}
-var publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
+Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
@@ -43,15 +42,15 @@ public class Weebcentral : MangaConnector
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
if (document.DocumentNode.SelectNodes("//article") == null)
-return Array.Empty<Manga>();
+return [];
-var urls = document.DocumentNode.SelectNodes("/html/body/article/a[@class='link link-hover']")
+List<string> urls = document.DocumentNode.SelectNodes("/html/body/article/a[@class='link link-hover tooltip tooltip-bottom']")
.Select(elem => elem.GetAttributeValue("href", "")).ToList();
HashSet<Manga> ret = new();
-foreach (var url in urls)
+foreach (string url in urls)
{
-var manga = GetMangaFromUrl(url);
+Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
@@ -62,9 +61,9 @@ public class Weebcentral : MangaConnector
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/weebcentral\.com\/series\/(\w*)\/(.*)");
-var publicationId = publicationIdRex.Match(url).Groups[1].Value;
+string publicationId = publicationIdRex.Match(url).Groups[1].Value;
-var requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
+RequestResult requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 300 && (int)requestResult.statusCode >= 200 &&
requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
@@ -73,26 +72,26 @@ public class Weebcentral : MangaConnector
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
-var posterNode =
+HtmlNode? posterNode =
document.DocumentNode.SelectSingleNode("//section[@class='flex items-center justify-center']/picture/img");
-var posterUrl = posterNode?.GetAttributeValue("src", "") ?? "";
+string posterUrl = posterNode?.GetAttributeValue("src", "") ?? "";
-var coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
+string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
-var titleNode = document.DocumentNode.SelectSingleNode("//section/h1");
+HtmlNode? titleNode = document.DocumentNode.SelectSingleNode("//section/h1");
-var sortName = titleNode?.InnerText ?? "Undefined";
+string sortName = titleNode?.InnerText ?? "Undefined";
HtmlNode[] authorsNodes =
document.DocumentNode.SelectNodes("//ul/li[strong/text() = 'Author(s): ']/span")?.ToArray() ?? [];
-var authors = authorsNodes.Select(n => n.InnerText).ToList();
+List<string> authors = authorsNodes.Select(n => n.InnerText).ToList();
HtmlNode[] genreNodes =
document.DocumentNode.SelectNodes("//ul/li[strong/text() = 'Tags(s): ']/span")?.ToArray() ?? [];
HashSet<string> tags = genreNodes.Select(n => n.InnerText).ToHashSet();
-var statusNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Status: ']/a");
+HtmlNode? statusNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Status: ']/a");
-var status = statusNode?.InnerText ?? "";
+string status = statusNode?.InnerText ?? "";
Log("unable to parse status");
-var releaseStatus = Manga.ReleaseStatusByte.Unreleased;
+Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
@@ -101,19 +100,19 @@ public class Weebcentral : MangaConnector
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
-var yearNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Released: ']/span");
+HtmlNode? yearNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Released: ']/span");
-var year = Convert.ToInt32(yearNode?.InnerText ?? "0");
+int year = Convert.ToInt32(yearNode?.InnerText ?? "0");
-var descriptionNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Description']/p");
+HtmlNode? descriptionNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Description']/p");
-var description = descriptionNode?.InnerText ?? "Undefined";
+string description = descriptionNode?.InnerText ?? "Undefined";
HtmlNode[] altTitleNodes = document.DocumentNode
.SelectNodes("//ul/li[strong/text() = 'Associated Name(s)']/ul/li")?.ToArray() ?? [];
Dictionary<string, string> altTitles = new(), links = new();
-for (var i = 0; i < altTitleNodes.Length; i++)
+for (int i = 0; i < altTitleNodes.Length; i++)
altTitles.Add(i.ToString(), altTitleNodes[i].InnerText);
-var originalLanguage = "";
+string originalLanguage = "";
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
@@ -127,74 +126,54 @@ public class Weebcentral : MangaConnector
return GetMangaFromUrl($"https://weebcentral.com/series/{publicationId}");
}
-private string ToFilteredString(string input)
-{
-return string.Join(' ', input.ToLower().Split(' ').Where(word => _filterWords.Contains(word) == false));
-}
-private SearchResult[] FilteredResults(string publicationTitle, SearchResult[] unfilteredSearchResults)
-{
-Dictionary<SearchResult, int> similarity = new();
-foreach (var sr in unfilteredSearchResults)
-{
-List<int> scores = new();
-var filteredPublicationString = ToFilteredString(publicationTitle);
-var filteredSString = ToFilteredString(sr.s);
-scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredSString, filteredPublicationString));
-foreach (var srA in sr.a)
-{
-var filteredAString = ToFilteredString(srA);
-scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredAString, filteredPublicationString));
-}
-similarity.Add(sr, scores.Sum() / scores.Count);
-}
-var ret = similarity.OrderBy(s => s.Value).Take(10).Select(s => s.Key).ToList();
-return ret.ToArray();
-}
public override Chapter[] GetChapters(Manga manga, string language = "en")
{
Log($"Getting chapters {manga}");
-var requestUrl = $"{_baseUrl}/series/{manga.publicationId}/full-chapter-list";
+string requestUrl = $"{_baseUrl}/series/{manga.publicationId}/full-chapter-list";
-var requestResult =
+RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
-return Array.Empty<Chapter>();
+return [];
//Return Chapters ordered by Chapter-Number
if (requestResult.htmlDocument is null)
-return Array.Empty<Chapter>();
+return [];
-var chapters = ParseChaptersFromHtml(manga, requestResult.htmlDocument);
+List<Chapter> chapters = ParseChaptersFromHtml(manga, requestResult.htmlDocument);
Log($"Got {chapters.Count} chapters. {manga}");
-return chapters.Order().ToArray();
+return chapters.OrderByDescending(c => c.name).ThenBy(c => c.volumeNumber).ThenBy(c => c.chapterNumber).ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, HtmlDocument document)
{
-var chaptersWrapper = document.DocumentNode.SelectSingleNode("/html/body");
+HtmlNode? chaptersWrapper = document.DocumentNode.SelectSingleNode("/html/body");
-Regex chapterRex = new(@".* (\d+)");
+Regex chapterRex = new(@"(\d+(?:\.\d+)*)");
+Regex chapterNameRex = new(@"(\w* )+");
Regex idRex = new(@"https:\/\/weebcentral\.com\/chapters\/(\w*)");
-var ret = chaptersWrapper.Descendants("a").Select(elem =>
+List<Chapter> ret = chaptersWrapper.Descendants("a").Select(elem =>
{
-var url = elem.GetAttributeValue("href", "") ?? "Undefined";
+string url = elem.GetAttributeValue("href", "") ?? "Undefined";
if (!url.StartsWith("https://") && !url.StartsWith("http://"))
return new Chapter(manga, null, null, "-1", "undefined");
-var idMatch = idRex.Match(url);
+Match idMatch = idRex.Match(url);
-var id = idMatch.Success ? idMatch.Groups[1].Value : null;
+string? id = idMatch.Success ? idMatch.Groups[1].Value : null;
-var chapterNode = elem.SelectSingleNode("span[@class='grow flex items-center gap-2']/span")?.InnerText ??
+string chapterNode = elem.SelectSingleNode("span[@class='grow flex items-center gap-2']/span")?.InnerText ??
"Undefined";
-var chapterNumberMatch = chapterRex.Match(chapterNode);
+MatchCollection chapterNumberMatch = chapterRex.Matches(chapterNode);
-var chapterNumber = chapterNumberMatch.Success ? chapterNumberMatch.Groups[1].Value : "-1";
+string chapterNumber = chapterNumberMatch.Count > 0 ? chapterNumberMatch[^1].Groups[1].Value : "-1";
+MatchCollection chapterNameMatch = chapterNameRex.Matches(chapterNode);
+string chapterName = chapterNameMatch.Count > 0
+? string.Join(" - ",
+chapterNameMatch.Select(m => m.Groups[1].Value.Trim())
+.Where(name => name.Length > 0 && !name.Equals("Chapter", StringComparison.OrdinalIgnoreCase)).ToArray()).Trim()
+: "";
-return new Chapter(manga, null, null, chapterNumber, url, id);
+return new Chapter(manga, chapterName != "" ? chapterName : null, null, chapterNumber, url, id);
}).Where(elem => elem.chapterNumber != -1 && elem.url != "undefined").ToList();
ret.Reverse();
@@ -209,7 +188,7 @@ public class Weebcentral : MangaConnector
return HttpStatusCode.RequestTimeout;
}
-var chapterParentManga = chapter.parentManga;
+Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
@@ -218,26 +197,19 @@ public class Weebcentral : MangaConnector
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
-var requestResult = downloadClient.MakeRequest(chapter.url, RequestType.Default);
+RequestResult requestResult = downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
-var document = requestResult.htmlDocument;
+HtmlDocument? document = requestResult.htmlDocument;
-var imageNodes =
+HtmlNode[] imageNodes =
document.DocumentNode.SelectNodes($"//section[@hx-get='{chapter.url}/images']/img")?.ToArray() ?? [];
-var urls = imageNodes.Select(imgNode => imgNode.GetAttributeValue("src", "")).ToArray();
+string[] urls = imageNodes.Select(imgNode => imgNode.GetAttributeValue("src", "")).ToArray();
-return DownloadChapterImages(urls, chapter, RequestType.MangaImage, progressToken: progressToken);
+return DownloadChapterImages(urls, chapter, RequestType.MangaImage, progressToken: progressToken, referrer: "https://weebcentral.com/");
}
-private struct SearchResult
-{
-public string i { get; set; }
-public string s { get; set; }
-public string[] a { get; set; }
-}
}
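Note: the chapter-parsing change above replaces the single "last integer" regex with one that accepts decimal chapter numbers and then takes the last match found in the episode title (chapterNumberMatch[^1]). The following is a small, self-contained sketch of that extraction; the example titles are hypothetical.

// Sketch of the updated chapter-number extraction: decimals are accepted and the
// last numeric match in the title wins, mirroring chapterRex and the [^1] lookup above.
// Standalone illustration only.
using System;
using System.Text.RegularExpressions;

internal static class ChapterNumberDemo
{
    private static string ExtractChapterNumber(string title)
    {
        Regex chapterRex = new(@"(\d+(?:\.\d+)*)");
        MatchCollection matches = chapterRex.Matches(title);
        return matches.Count > 0 ? matches[^1].Groups[1].Value : "-1";
    }

    private static void Main()
    {
        Console.WriteLine(ExtractChapterNumber("Season 2 Chapter 12.5")); // 12.5
        Console.WriteLine(ExtractChapterNumber("Special"));               // -1
    }
}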

Tranga/Server.cs (new file, +772 lines)

@@ -0,0 +1,772 @@
using System.Net;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using Tranga.Jobs;
using Tranga.LibraryConnectors;
using Tranga.MangaConnectors;
using Tranga.NotificationConnectors;
namespace Tranga;
public class Server : GlobalBase
{
private readonly HttpListener _listener = new ();
private readonly Tranga _parent;
public Server(Tranga parent) : base(parent)
{
this._parent = parent;
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
this._listener.Prefixes.Add($"http://*:{TrangaSettings.apiPortNumber}/");
else
this._listener.Prefixes.Add($"http://localhost:{TrangaSettings.apiPortNumber}/");
Thread listenThread = new (Listen);
listenThread.Start();
Thread watchThread = new(WatchRunning);
watchThread.Start();
}
private void WatchRunning()
{
while(_parent.keepRunning)
Thread.Sleep(1000);
this._listener.Close();
}
private void Listen()
{
this._listener.Start();
foreach(string prefix in this._listener.Prefixes)
Log($"Listening on {prefix}");
while (this._listener.IsListening && _parent.keepRunning)
{
try
{
HttpListenerContext context = this._listener.GetContext();
//Log($"{context.Request.HttpMethod} {context.Request.Url} {context.Request.UserAgent}");
Task t = new(() =>
{
HandleRequest(context);
});
t.Start();
}
catch (HttpListenerException)
{
}
}
}
private void HandleRequest(HttpListenerContext context)
{
HttpListenerRequest request = context.Request;
HttpListenerResponse response = context.Response;
if (request.Url!.LocalPath.Contains("favicon"))
{
SendResponse(HttpStatusCode.NoContent, response);
return;
}
switch (request.HttpMethod)
{
case "GET":
HandleGet(request, response);
break;
case "POST":
HandlePost(request, response);
break;
case "DELETE":
HandleDelete(request, response);
break;
case "OPTIONS":
SendResponse(HttpStatusCode.OK, context.Response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private Dictionary<string, string> GetRequestVariables(string query)
{
Dictionary<string, string> ret = new();
Regex queryRex = new (@"\?{1}&?([A-z0-9-=]+=[A-z0-9-=]+)+(&[A-z0-9-=]+=[A-z0-9-=]+)*");
if (!queryRex.IsMatch(query))
return ret;
query = query.Substring(1);
foreach (string keyValuePair in query.Split('&').Where(str => str.Length >= 3))
{
string var = keyValuePair.Split('=')[0];
string val = Regex.Replace(keyValuePair.Substring(var.Length + 1), "%20", " ");
val = Regex.Replace(val, "%[0-9]{2}", "");
ret.Add(var, val);
}
return ret;
}
private void HandleGet(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, jobId, internalId;
MangaConnector? connector;
Manga? manga;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Connectors":
SendResponse(HttpStatusCode.OK, response, _parent.GetConnectors().Select(con => con.name).ToArray());
break;
case "Languages":
if (!requestVariables.TryGetValue("connector", out connectorName) ||
!_parent.TryGetConnector(connectorName, out connector))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
SendResponse(HttpStatusCode.OK, response, connector);
break;
case "Manga/Cover":
if (!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetPublicationById(internalId, out manga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
string filePath = manga?.coverFileNameInCache ?? "";
if (File.Exists(filePath))
{
FileStream coverStream = new(filePath, FileMode.Open);
SendResponse(HttpStatusCode.OK, response, coverStream);
}
else
{
SendResponse(HttpStatusCode.NotFound, response);
}
break;
case "Manga/FromConnector":
requestVariables.TryGetValue("title", out string? title);
requestVariables.TryGetValue("url", out string? url);
if (!requestVariables.TryGetValue("connector", out connectorName) ||
!_parent.TryGetConnector(connectorName, out connector) ||
(title is null && url is null))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (url is not null)
{
HashSet<Manga> ret = new();
manga = connector!.GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
SendResponse(HttpStatusCode.OK, response, ret);
}else
SendResponse(HttpStatusCode.OK, response, connector!.GetManga(title!));
break;
case "Manga/Chapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetConnector(connectorName, out connector) ||
!_parent.TryGetPublicationById(internalId, out manga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
requestVariables.TryGetValue("translatedLanguage", out string? translatedLanguage);
SendResponse(HttpStatusCode.OK, response, connector!.GetChapters((Manga)manga!, translatedLanguage??"en"));
break;
case "Jobs":
if (!requestVariables.TryGetValue("jobId", out jobId))
{
if(!_parent.jobBoss.jobs.Any(jjob => jjob.id == jobId))
SendResponse(HttpStatusCode.BadRequest, response);
else
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.First(jjob => jjob.id == jobId));
break;
}
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs);
break;
case "Jobs/Progress":
if (requestVariables.TryGetValue("jobId", out jobId))
{
if(!_parent.jobBoss.jobs.Any(jjob => jjob.id == jobId))
SendResponse(HttpStatusCode.BadRequest, response);
else
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.First(jjob => jjob.id == jobId).progressToken);
break;
}
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Select(jjob => jjob.progressToken));
break;
case "Jobs/Running":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob.progressToken.state is ProgressToken.State.Running));
break;
case "Jobs/Waiting":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob.progressToken.state is ProgressToken.State.Standby).OrderBy(jjob => jjob.nextExecution));
break;
case "Jobs/MonitorJobs":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob is DownloadNewChapters).OrderBy(jjob => ((DownloadNewChapters)jjob).manga.sortName));
break;
case "Settings":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.AsJObject());
break;
case "Settings/userAgent":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.userAgent);
break;
case "Settings/customRequestLimit":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.requestLimits);
break;
case "Settings/AprilFoolsMode":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.aprilFoolsMode);
break;
case "NotificationConnectors":
SendResponse(HttpStatusCode.OK, response, notificationConnectors);
break;
case "NotificationConnectors/Types":
SendResponse(HttpStatusCode.OK, response,
Enum.GetValues<NotificationConnector.NotificationConnectorType>().Select(nc => new KeyValuePair<byte, string?>((byte)nc, Enum.GetName(nc))));
break;
case "LibraryConnectors":
SendResponse(HttpStatusCode.OK, response, libraryConnectors);
break;
case "LibraryConnectors/Types":
SendResponse(HttpStatusCode.OK, response,
Enum.GetValues<LibraryConnector.LibraryType>().Select(lc => new KeyValuePair<byte, string?>((byte)lc, Enum.GetName(lc))));
break;
case "Ping":
SendResponse(HttpStatusCode.OK, response, "Pong");
break;
case "LogMessages":
if (logger is null || !File.Exists(logger?.logFilePath))
{
SendResponse(HttpStatusCode.NotFound, response);
break;
}
if (requestVariables.TryGetValue("count", out string? count))
{
try
{
uint messageCount = uint.Parse(count);
SendResponse(HttpStatusCode.OK, response, logger.Tail(messageCount));
}
catch (FormatException f)
{
SendResponse(HttpStatusCode.InternalServerError, response, f);
}
}else
SendResponse(HttpStatusCode.OK, response, logger.GetLog());
break;
case "LogFile":
if (logger is null || !File.Exists(logger?.logFilePath))
{
SendResponse(HttpStatusCode.NotFound, response);
break;
}
string logDir = new FileInfo(logger.logFilePath).DirectoryName!;
string tmpFilePath = Path.Join(logDir, "Tranga.log");
File.Copy(logger.logFilePath, tmpFilePath);
SendResponse(HttpStatusCode.OK, response, new FileStream(tmpFilePath, FileMode.Open));
File.Delete(tmpFilePath);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void HandlePost(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, internalId, jobId, chapterNumStr, customFolderName, translatedLanguage, notificationConnectorStr, libraryConnectorStr;
MangaConnector? connector;
Manga? tmpManga;
Manga manga;
Job? job;
NotificationConnector.NotificationConnectorType notificationConnectorType;
LibraryConnector.LibraryType libraryConnectorType;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Manga":
if(!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetPublicationById(internalId, out tmpManga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
SendResponse(HttpStatusCode.OK, response, manga);
break;
case "Jobs/MonitorManga":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!requestVariables.TryGetValue("interval", out string? intervalStr) ||
!_parent.TryGetConnector(connectorName, out connector)||
!_parent.TryGetPublicationById(internalId, out tmpManga) ||
!TimeSpan.TryParse(intervalStr, out TimeSpan interval))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
if (requestVariables.TryGetValue("ignoreBelowChapterNum", out chapterNumStr))
{
if (!float.TryParse(chapterNumStr, numberFormatDecimalPoint, out float chapterNum))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga.ignoreChaptersBelow = chapterNum;
}
if (requestVariables.TryGetValue("customFolderName", out customFolderName))
manga.MovePublicationFolder(TrangaSettings.downloadLocation, customFolderName);
requestVariables.TryGetValue("translatedLanguage", out translatedLanguage);
_parent.jobBoss.AddJob(new DownloadNewChapters(this, connector!, manga, true, interval, translatedLanguage: translatedLanguage??"en"));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/DownloadNewChapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetConnector(connectorName, out connector)||
!_parent.TryGetPublicationById(internalId, out tmpManga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
if (requestVariables.TryGetValue("ignoreBelowChapterNum", out chapterNumStr))
{
if (!float.TryParse(chapterNumStr, numberFormatDecimalPoint, out float chapterNum))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga.ignoreChaptersBelow = chapterNum;
}
if (requestVariables.TryGetValue("customFolderName", out customFolderName))
manga.MovePublicationFolder(TrangaSettings.downloadLocation, customFolderName);
requestVariables.TryGetValue("translatedLanguage", out translatedLanguage);
_parent.jobBoss.AddJob(new DownloadNewChapters(this, connector!, manga, false, translatedLanguage: translatedLanguage??"en"));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/UpdateMetadata":
if (!requestVariables.TryGetValue("internalId", out internalId))
{
foreach (Job pJob in _parent.jobBoss.jobs.Where(possibleDncJob =>
possibleDncJob.jobType is Job.JobType.DownloadNewChaptersJob).ToArray()) //ToArray to avoid modifying while adding new jobs
{
DownloadNewChapters dncJob = pJob as DownloadNewChapters ??
throw new Exception("Has to be DownloadNewChapters Job");
_parent.jobBoss.AddJob(new UpdateMetadata(this, dncJob.mangaConnector, dncJob.manga));
}
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
Job[] possibleDncJobs = _parent.jobBoss.GetJobsLike(internalId: internalId).ToArray();
switch (possibleDncJobs.Length)
{
case <1: SendResponse(HttpStatusCode.BadRequest, response, "Could not find matching release"); break;
case >1: SendResponse(HttpStatusCode.BadRequest, response, "Multiple releases??"); break;
default:
DownloadNewChapters dncJob = possibleDncJobs[0] as DownloadNewChapters ??
throw new Exception("Has to be DownloadNewChapters Job");
_parent.jobBoss.AddJob(new UpdateMetadata(this, dncJob.mangaConnector, dncJob.manga));
SendResponse(HttpStatusCode.Accepted, response);
break;
}
}
break;
case "Jobs/StartNow":
if (!requestVariables.TryGetValue("jobId", out jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
_parent.jobBoss.AddJobToQueue(job!);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/Cancel":
if (!requestVariables.TryGetValue("jobId", out jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
job!.Cancel();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/UpdateDownloadLocation":
if (!requestVariables.TryGetValue("downloadLocation", out string? downloadLocation) ||
!requestVariables.TryGetValue("moveFiles", out string? moveFilesStr) ||
!bool.TryParse(moveFilesStr, out bool moveFiles))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateDownloadLocation(downloadLocation, moveFiles);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/AprilFoolsMode":
if (!requestVariables.TryGetValue("enabled", out string? aprilFoolsModeEnabledStr) ||
!bool.TryParse(aprilFoolsModeEnabledStr, out bool aprilFoolsModeEnabled))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateAprilFoolsMode(aprilFoolsModeEnabled);
SendResponse(HttpStatusCode.Accepted, response);
break;
/*case "Settings/UpdateWorkingDirectory":
if (!requestVariables.TryGetValue("workingDirectory", out string? workingDirectory))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
settings.UpdateWorkingDirectory(workingDirectory);
SendResponse(HttpStatusCode.Accepted, response);
break;*/
case "Settings/userAgent":
if(!requestVariables.TryGetValue("userAgent", out string? customUserAgent))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateUserAgent(customUserAgent);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/userAgent/Reset":
TrangaSettings.UpdateUserAgent(null);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/customRequestLimit":
if (!requestVariables.TryGetValue("requestType", out string? requestTypeStr) ||
!requestVariables.TryGetValue("requestsPerMinute", out string? requestsPerMinuteStr) ||
!Enum.TryParse(requestTypeStr, out RequestType requestType) ||
!int.TryParse(requestsPerMinuteStr, out int requestsPerMinute))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/customRequestLimit/Reset":
TrangaSettings.ResetRateLimits();
break;
case "NotificationConnectors/Update":
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Gotify)
{
if (!requestVariables.TryGetValue("gotifyUrl", out string? gotifyUrl) ||
!requestVariables.TryGetValue("gotifyAppToken", out string? gotifyAppToken))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new Gotify(this, gotifyUrl, gotifyAppToken));
SendResponse(HttpStatusCode.Accepted, response);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.LunaSea)
{
if (!requestVariables.TryGetValue("lunaseaWebhook", out string? lunaseaWebhook))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new LunaSea(this, lunaseaWebhook));
SendResponse(HttpStatusCode.Accepted, response);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Ntfy)
{
if (!requestVariables.TryGetValue("ntfyUrl", out string? ntfyUrl) ||
!requestVariables.TryGetValue("ntfyUser", out string? ntfyUser)||
!requestVariables.TryGetValue("ntfyPass", out string? ntfyPass))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new Ntfy(this, ntfyUrl, ntfyUser, ntfyPass, null));
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
}
break;
case "NotificationConnectors/Test":
NotificationConnector notificationConnector;
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Gotify)
{
if (!requestVariables.TryGetValue("gotifyUrl", out string? gotifyUrl) ||
!requestVariables.TryGetValue("gotifyAppToken", out string? gotifyAppToken))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new Gotify(this, gotifyUrl, gotifyAppToken);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.LunaSea)
{
if (!requestVariables.TryGetValue("lunaseaWebhook", out string? lunaseaWebhook))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new LunaSea(this, lunaseaWebhook);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Ntfy)
{
if (!requestVariables.TryGetValue("ntfyUrl", out string? ntfyUrl) ||
!requestVariables.TryGetValue("ntfyUser", out string? ntfyUser)||
!requestVariables.TryGetValue("ntfyPass", out string? ntfyPass))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new Ntfy(this, ntfyUrl, ntfyUser, ntfyPass, null);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector.SendNotification("Tranga Test", "This is Test-Notification.");
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors/Reset":
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteNotificationConnector(notificationConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors/Update":
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (libraryConnectorType is LibraryConnector.LibraryType.Kavita)
{
if (!requestVariables.TryGetValue("kavitaUrl", out string? kavitaUrl) ||
!requestVariables.TryGetValue("kavitaUsername", out string? kavitaUsername) ||
!requestVariables.TryGetValue("kavitaPassword", out string? kavitaPassword))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddLibraryConnector(new Kavita(this, kavitaUrl, kavitaUsername, kavitaPassword));
SendResponse(HttpStatusCode.Accepted, response);
}else if (libraryConnectorType is LibraryConnector.LibraryType.Komga)
{
if (!requestVariables.TryGetValue("komgaUrl", out string? komgaUrl) ||
!requestVariables.TryGetValue("komgaAuth", out string? komgaAuth))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddLibraryConnector(new Komga(this, komgaUrl, komgaAuth));
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
}
break;
case "LibraryConnectors/Test":
LibraryConnector libraryConnector;
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (libraryConnectorType is LibraryConnector.LibraryType.Kavita)
{
if (!requestVariables.TryGetValue("kavitaUrl", out string? kavitaUrl) ||
!requestVariables.TryGetValue("kavitaUsername", out string? kavitaUsername) ||
!requestVariables.TryGetValue("kavitaPassword", out string? kavitaPassword))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector = new Kavita(this, kavitaUrl, kavitaUsername, kavitaPassword);
}else if (libraryConnectorType is LibraryConnector.LibraryType.Komga)
{
if (!requestVariables.TryGetValue("komgaUrl", out string? komgaUrl) ||
!requestVariables.TryGetValue("komgaAuth", out string? komgaAuth))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector = new Komga(this, komgaUrl, komgaAuth);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector.UpdateLibrary();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors/Reset":
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteLibraryConnector(libraryConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void HandleDelete(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, internalId;
MangaConnector connector;
Manga manga;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Jobs":
if (!requestVariables.TryGetValue("jobId", out string? jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out Job? job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
_parent.jobBoss.RemoveJob(job!);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/DownloadNewChapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
_parent.GetConnector(connectorName) is null ||
_parent.GetPublicationById(internalId) is null)
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
connector = _parent.GetConnector(connectorName)!;
manga = (Manga)_parent.GetPublicationById(internalId)!;
_parent.jobBoss.RemoveJobs(_parent.jobBoss.GetJobsLike(connector, manga));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors":
if (!requestVariables.TryGetValue("notificationConnector", out string? notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteNotificationConnector(notificationConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors":
if (!requestVariables.TryGetValue("libraryConnectors", out string? libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr,
out LibraryConnector.LibraryType libraryConnectoryType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteLibraryConnector(libraryConnectoryType);
SendResponse(HttpStatusCode.Accepted, response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void SendResponse(HttpStatusCode statusCode, HttpListenerResponse response, object? content = null)
{
//Log($"Response: {statusCode} {content}");
response.StatusCode = (int)statusCode;
response.AddHeader("Access-Control-Allow-Headers", "Content-Type, Accept, X-Requested-With");
response.AddHeader("Access-Control-Allow-Methods", "GET, POST, DELETE");
response.AddHeader("Access-Control-Max-Age", "1728000");
response.AppendHeader("Access-Control-Allow-Origin", "*");
try
{
if (content is not Stream)
{
response.ContentType = "application/json";
response.AddHeader("Cache-Control", "no-store");
response.OutputStream.Write(content is not null
? Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(content))
: Array.Empty<byte>());
response.OutputStream.Close();
}
else if (content is FileStream stream)
{
string contentType = stream.Name.Split('.')[^1];
response.AddHeader("Cache-Control", "max-age=600");
switch (contentType.ToLower())
{
case "gif":
response.ContentType = "image/gif";
break;
case "png":
response.ContentType = "image/png";
break;
case "jpg":
case "jpeg":
response.ContentType = "image/jpeg";
break;
case "log":
response.ContentType = "text/plain";
break;
}
stream.CopyTo(response.OutputStream);
response.OutputStream.Close();
stream.Close();
}
}
catch (Exception e)
{
Log(e.ToString());
}
}
}
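Note: GetRequestVariables above turns a raw query string such as "?connector=MangaDex&title=one%20piece" into a key/value map by splitting on '&' and '=', replacing %20 with a space and dropping other two-digit percent escapes. The following standalone sketch reproduces that behaviour for illustration; it is not the server's own helper, and the example values are hypothetical.

// Standalone sketch of the query-string parsing performed by GetRequestVariables:
// "?connector=MangaDex&title=one%20piece" -> { connector = MangaDex, title = one piece }.
// Illustrative only; like the handler above, %20 becomes a space and other two-digit
// numeric percent escapes are simply dropped.
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

internal static class QueryParsingDemo
{
    private static Dictionary<string, string> Parse(string query)
    {
        Dictionary<string, string> ret = new();
        if (!query.StartsWith("?") || !query.Contains('='))
            return ret;
        foreach (string keyValuePair in query.Substring(1).Split('&'))
        {
            string[] kv = keyValuePair.Split('=', 2);
            if (kv.Length != 2 || kv[0].Length == 0)
                continue;
            string val = kv[1].Replace("%20", " ");
            val = Regex.Replace(val, "%[0-9]{2}", "");
            ret[kv[0]] = val;
        }
        return ret;
    }

    private static void Main()
    {
        foreach ((string key, string value) in Parse("?connector=MangaDex&title=one%20piece"))
            Console.WriteLine($"{key} = {value}");
    }
}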


@@ -1,19 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
namespace Tranga.Server;
internal struct RequestPath
{
internal readonly string HttpMethod;
internal readonly string RegexStr;
internal readonly Func<GroupCollection, Dictionary<string, string>, ValueTuple<HttpStatusCode, object?>> Method;
public RequestPath(string httpHttpMethod, string regexStr,
Func<GroupCollection, Dictionary<string, string>, ValueTuple<HttpStatusCode, object?>> method)
{
this.HttpMethod = httpHttpMethod;
this.RegexStr = regexStr + "(?:/?)";
this.Method = method;
}
}


@@ -1,269 +0,0 @@
using System.Net;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats.Png;
using SixLabors.ImageSharp.Metadata.Profiles.Exif;
using SixLabors.ImageSharp.Metadata.Profiles.Iptc;
using ZstdSharp;
namespace Tranga.Server;
public partial class Server : GlobalBase, IDisposable
{
private readonly HttpListener _listener = new();
private readonly Tranga _parent;
private bool _running = true;
private readonly List<RequestPath> _apiRequestPaths;
public Server(Tranga parent) : base(parent)
{
/*
* Contains all valid Request Methods, Paths (with Regex Group Matching for specific Parameters) and Handling Methods
*/
_apiRequestPaths = new List<RequestPath>
{
new ("GET", @"/v2/Connector/Types", GetV2ConnectorTypes),
new ("GET", @"/v2/Connector/([a-zA-Z]+)/GetManga", GetV2ConnectorConnectorNameGetManga),
new ("GET", @"/v2/Mangas", GetV2Mangas),
new ("GET", @"/v2/Manga/Search", GetV2MangaSearch),
new ("GET", @"/v2/Manga", GetV2Manga),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/Cover", GetV2MangaInternalIdCover),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/Chapters", GetV2MangaInternalIdChapters),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/Chapters/Latest", GetV2MangaInternalIdChaptersLatest),
new ("POST", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/ignoreChaptersBelow", PostV2MangaInternalIdIgnoreChaptersBelow),
new ("POST", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/moveFolder", PostV2MangaInternalIdMoveFolder),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})", GetV2MangaInternalId),
new ("DELETE", @"/v2/Manga/([-A-Za-z0-9]*={0,3})", DeleteV2MangaInternalId),
new ("GET", @"/v2/Jobs", GetV2Jobs),
new ("GET", @"/v2/Jobs/Running", GetV2JobsRunning),
new ("GET", @"/v2/Jobs/Waiting", GetV2JobsWaiting),
new ("GET", @"/v2/Jobs/Monitoring", GetV2JobsMonitoring),
new ("GET", @"/v2/Jobs/Standby", GetV2JobsStandby),
new ("GET", @"/v2/Job/Types", GetV2JobTypes),
new ("POST", @"/v2/Job/Create/([a-zA-Z]+)", PostV2JobCreateType),
new ("GET", @"/v2/Job", GetV2Job),
new ("GET", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)/Progress", GetV2JobJobIdProgress),
new ("POST", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)/StartNow", PostV2JobJobIdStartNow),
new ("POST", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)/Cancel", PostV2JobJobIdCancel),
new ("GET", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)", GetV2JobJobId),
new ("DELETE", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)", DeleteV2JobJobId),
new ("GET", @"/v2/Settings", GetV2Settings),
new ("GET", @"/v2/Settings/UserAgent", GetV2SettingsUserAgent),
new ("POST", @"/v2/Settings/UserAgent", PostV2SettingsUserAgent),
new ("GET", @"/v2/Settings/RateLimit/Types", GetV2SettingsRateLimitTypes),
new ("GET", @"/v2/Settings/RateLimit", GetV2SettingsRateLimit),
new ("POST", @"/v2/Settings/RateLimit", PostV2SettingsRateLimit),
new ("GET", @"/v2/Settings/RateLimit/([a-zA-Z]+)", GetV2SettingsRateLimitType),
new ("POST", @"/v2/Settings/RateLimit/([a-zA-Z]+)", PostV2SettingsRateLimitType),
new ("GET", @"/v2/Settings/AprilFoolsMode", GetV2SettingsAprilFoolsMode),
new ("POST", @"/v2/Settings/AprilFoolsMode", PostV2SettingsAprilFoolsMode),
new ("GET", @"/v2/Settings/CompressImages", GetV2SettingsCompressImages),
new ("POST", @"/v2/Settings/CompressImages", PostV2SettingsCompressImages),
new ("GET", @"/v2/Settings/BWImages", GetV2SettingsBwImages),
new ("POST", @"/v2/Settings/BWImages", PostV2SettingsBwImages),
new ("POST", @"/v2/Settings/DownloadLocation", PostV2SettingsDownloadLocation),
new ("GET", @"/v2/LibraryConnector", GetV2LibraryConnector),
new ("GET", @"/v2/LibraryConnector/Types", GetV2LibraryConnectorTypes),
new ("GET", @"/v2/LibraryConnector/([a-zA-Z]+)", GetV2LibraryConnectorType),
new ("POST", @"/v2/LibraryConnector/([a-zA-Z]+)", PostV2LibraryConnectorType),
new ("POST", @"/v2/LibraryConnector/([a-zA-Z]+)/Test", PostV2LibraryConnectorTypeTest),
new ("DELETE", @"/v2/LibraryConnector/([a-zA-Z]+)", DeleteV2LibraryConnectorType),
new ("GET", @"/v2/NotificationConnector", GetV2NotificationConnector),
new ("GET", @"/v2/NotificationConnector/Types", GetV2NotificationConnectorTypes),
new ("GET", @"/v2/NotificationConnector/([a-zA-Z]+)", GetV2NotificationConnectorType),
new ("POST", @"/v2/NotificationConnector/([a-zA-Z]+)", PostV2NotificationConnectorType),
new ("POST", @"/v2/NotificationConnector/([a-zA-Z]+)/Test", PostV2NotificationConnectorTypeTest),
new ("DELETE", @"/v2/NotificationConnector/([a-zA-Z]+)", DeleteV2NotificationConnectorType),
new ("GET", @"/v2/LogFile", GetV2LogFile),
new ("GET", @"/v2/Ping", GetV2Ping),
new ("POST", @"/v2/Ping", PostV2Ping)
};
this._parent = parent;
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
this._listener.Prefixes.Add($"http://*:{TrangaSettings.apiPortNumber}/");
else
this._listener.Prefixes.Add($"http://localhost:{TrangaSettings.apiPortNumber}/");
Thread listenThread = new(Listen);
listenThread.Start();
while(_parent.keepRunning && _running)
Thread.Sleep(100);
this.Dispose();
}
private void Listen()
{
this._listener.Start();
foreach (string prefix in this._listener.Prefixes)
Log($"Listening on {prefix}");
while (this._listener.IsListening && _parent.keepRunning)
{
try
{
HttpListenerContext context = this._listener.GetContext();
//Log($"{context.Request.HttpMethod} {context.Request.Url} {context.Request.UserAgent}");
Task t = new(() =>
{
HandleRequest(context);
});
t.Start();
}
catch (HttpListenerException)
{
}
}
}
private void HandleRequest(HttpListenerContext context)
{
HttpListenerRequest request = context.Request;
HttpListenerResponse response = context.Response;
if (request.HttpMethod == "OPTIONS")
{
SendResponse(HttpStatusCode.NoContent, response);//Response always contains all valid Request-Methods
return;
}
if (request.Url!.LocalPath.Contains("favicon"))
{
SendResponse(HttpStatusCode.NoContent, response);
return;
}
string path = Regex.Match(request.Url.LocalPath, @"\/[a-zA-Z0-9\.+/=-]+(\/[a-zA-Z0-9\.+/=-]+)*").Value; //Local Path
if (!Regex.IsMatch(path, "/v2(/.*)?")) //Use only v2 API
{
SendResponse(HttpStatusCode.NotFound, response, "Use Version 2 API");
return;
}
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query); //Variables in the URI
Dictionary<string, string> requestBody = GetRequestBody(request); //Variables in the JSON body
Dictionary<string, string> requestParams = requestVariables.UnionBy(requestBody, v => v.Key)
.ToDictionary(kv => kv.Key, kv => kv.Value); //The actual variable used for the API
ValueTuple<HttpStatusCode, object?> responseMessage; //Used to respond to the HttpRequest
if (_apiRequestPaths.Any(p => p.HttpMethod == request.HttpMethod && Regex.Match(path, p.RegexStr).Length == path.Length)) //Check if Request-Path is valid
{
RequestPath requestPath =
_apiRequestPaths.First(p => p.HttpMethod == request.HttpMethod && Regex.Match(path, p.RegexStr).Length == path.Length);
responseMessage =
requestPath.Method.Invoke(Regex.Match(path, requestPath.RegexStr).Groups, requestParams); //Get HttpResponse content
}
else
responseMessage = new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, "Unknown Request Path");
SendResponse(responseMessage.Item1, response, responseMessage.Item2);
}
private Dictionary<string, string> GetRequestVariables(string query)
{
Dictionary<string, string> ret = new();
Regex queryRex = new(@"\?{1}&?([A-z0-9-=]+=[A-z0-9-=]+)+(&[A-z0-9-=]+=[A-z0-9-=]+)*");
if (!queryRex.IsMatch(query))
return ret;
query = query.Substring(1);
foreach (string keyValuePair in query.Split('&').Where(str => str.Length >= 3))
{
string var = keyValuePair.Split('=')[0];
string val = Regex.Replace(keyValuePair.Substring(var.Length + 1), "%20", " ");
val = Regex.Replace(val, "%[0-9]{2}", "");
ret.Add(var, val);
}
return ret;
}
private Dictionary<string, string> GetRequestBody(HttpListenerRequest request)
{
if (!request.HasEntityBody)
{
//Nospam Log("No request body");
return new Dictionary<string, string>();
}
Stream body = request.InputStream;
Encoding encoding = request.ContentEncoding;
using StreamReader streamReader = new (body, encoding);
try
{
Dictionary<string, string> requestBody =
JsonConvert.DeserializeObject<Dictionary<string, string>>(streamReader.ReadToEnd())
?? new();
return requestBody;
}
catch (JsonException e)
{
Log(e.Message);
}
return new Dictionary<string, string>();
}
private void SendResponse(HttpStatusCode statusCode, HttpListenerResponse response, object? content = null)
{
//Log($"Response: {statusCode} {content}");
response.StatusCode = (int)statusCode;
response.AddHeader("Access-Control-Allow-Headers", "Content-Type, Accept, X-Requested-With");
response.AddHeader("Access-Control-Allow-Methods", "GET, POST, DELETE");
response.AddHeader("Access-Control-Max-Age", "1728000");
response.AddHeader("Access-Control-Allow-Origin", "*");
response.AddHeader("Content-Encoding", "zstd");
using CompressionStream compressor = new (response.OutputStream, 5);
try
{
if (content is Stream stream)
{
response.ContentType = "text/plain";
response.AddHeader("Cache-Control", "private, no-store");
stream.CopyTo(compressor);
stream.Close();
}else if (content is Image image)
{
response.ContentType = image.Metadata.DecodedImageFormat?.DefaultMimeType ?? PngFormat.Instance.DefaultMimeType;
response.AddHeader("Cache-Control", "public, max-age=3600");
response.AddHeader("Expires", $"{DateTime.Now.AddHours(1):ddd\\,\\ dd\\ MMM\\ yyyy\\ HH\\:mm\\:ss} GMT");
string lastModifiedStr = "";
if (image.Metadata.IptcProfile is not null)
{
DateTime date = DateTime.ParseExact(image.Metadata.IptcProfile.GetValues(IptcTag.CreatedDate).First().Value, "yyyyMMdd",null);
DateTime time = DateTime.ParseExact(image.Metadata.IptcProfile.GetValues(IptcTag.CreatedTime).First().Value, "HHmmssK",null);
lastModifiedStr = $"{date:ddd\\,\\ dd\\ MMM\\ yyyy} {time:HH\\:mm\\:ss} GMT";
}else if (image.Metadata.ExifProfile is not null)
{
DateTime datetime = DateTime.ParseExact(image.Metadata.ExifProfile.Values.FirstOrDefault(value => value.Tag == ExifTag.DateTime)?.ToString() ?? "2000:01:01 01:01:01", "yyyy:MM:dd HH:mm:ss", null);
lastModifiedStr = $"{datetime:ddd\\,\\ dd\\ MMM\\ yyyy\\ HH\\:mm\\:ss} GMT";
}
if(lastModifiedStr.Length>0)
response.AddHeader("Last-Modified", lastModifiedStr);
image.Save(compressor, image.Metadata.DecodedImageFormat ?? PngFormat.Instance);
image.Dispose();
}
else
{
response.ContentType = "application/json";
response.AddHeader("Cache-Control", "private, no-store");
if(content is not null)
new MemoryStream(Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(content))).CopyTo(compressor);
else
compressor.Write(Array.Empty<byte>());
}
compressor.Flush();
response.OutputStream.Close();
}
catch (HttpListenerException e)
{
Log(e.ToString());
}
}
public void Dispose()
{
_running = false;
((IDisposable)_listener).Dispose();
}
}


@ -1,31 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2ConnectorTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, _parent.GetConnectors());
}
private ValueTuple<HttpStatusCode, object?> GetV2ConnectorConnectorNameGetManga(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.GetConnectors().Any(mangaConnector => mangaConnector.name == groups[1].Value)||
!_parent.TryGetConnector(groups[1].Value, out MangaConnector? connector) ||
connector is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, $"Connector '{groups[1].Value}' does not exist.");
if (requestParameters.TryGetValue("title", out string? title))
{
return (HttpStatusCode.OK, connector.GetManga(title));
}else if (requestParameters.TryGetValue("url", out string? url))
{
return (HttpStatusCode.OK, connector.GetMangaFromUrl(url));
}else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Parameter 'title' or 'url' has to be set.");
}
}


@ -1,176 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.Jobs;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2Jobs(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsRunning(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.progressToken.state is ProgressToken.State.Running)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsWaiting(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.progressToken.state is ProgressToken.State.Waiting)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsStandby(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.progressToken.state is ProgressToken.State.Standby)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsMonitoring(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.jobType is Job.JobType.DownloadNewChaptersJob)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK,
Enum.GetValues<Job.JobType>().ToDictionary(b => (byte)b, b => Enum.GetName(b)));
}
private ValueTuple<HttpStatusCode, object?> PostV2JobCreateType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out Job.JobType jobType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"JobType {groups[1].Value} does not exist.");
}
string? mangaId;
Manga? manga;
switch (jobType)
{
case Job.JobType.MonitorManga:
if(!requestParameters.TryGetValue("internalId", out mangaId) ||
!_parent.TryGetPublicationById(mangaId, out manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "'internalId' Parameter missing, or is not a valid ID.");
if(!requestParameters.TryGetValue("interval", out string? intervalStr) ||
!TimeSpan.TryParse(intervalStr, out TimeSpan interval))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "'interval' Parameter missing, or is not in correct format.");
requestParameters.TryGetValue("language", out string? language);
if (requestParameters.TryGetValue("customFolder", out string? folder))
manga.Value.MovePublicationFolder(TrangaSettings.downloadLocation, folder);
if (requestParameters.TryGetValue("startChapter", out string? startChapterStr) &&
float.TryParse(startChapterStr, out float startChapter))
{
Manga manga1 = manga.Value;
manga1.ignoreChaptersBelow = startChapter;
}
return _parent.jobBoss.AddJob(new DownloadNewChapters(this, ((Manga)manga).mangaConnector,
((Manga)manga).internalId, true, interval, language)) switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null),
false => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Conflict, "Job already exists."),
};
case Job.JobType.UpdateMetaDataJob:
if(!requestParameters.TryGetValue("internalId", out mangaId) ||
!_parent.TryGetPublicationById(mangaId, out manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "InternalId Parameter missing, or is not a valid ID.");
return _parent.jobBoss.AddJob(new UpdateMetadata(this, ((Manga)manga).internalId)) switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null),
false => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Conflict, "Job already exists."),
};
case Job.JobType.DownloadNewChaptersJob: //TODO
case Job.JobType.DownloadChapterJob: //TODO
default: return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"JobType {Enum.GetName(jobType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> GetV2JobJobId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, job);
}
private ValueTuple<HttpStatusCode, object?> DeleteV2JobJobId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
_parent.jobBoss.RemoveJob(job);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2JobJobIdProgress(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, $"Job with ID: '{groups[1].Value}' does not exist.");
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, job.progressToken);
}
private ValueTuple<HttpStatusCode, object?> PostV2JobJobIdStartNow(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
_parent.jobBoss.AddJobs(job.ExecuteReturnSubTasks(_parent.jobBoss));
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> PostV2JobJobIdCancel(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
job.Cancel();
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2Job(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(!requestParameters.TryGetValue("jobIds", out string? jobIdListStr))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Missing parameter 'jobIds'.");
string[] jobIdList = jobIdListStr.Split(',');
List<Job> ret = new();
foreach (string jobId in jobIdList)
{
if(!_parent.jobBoss.TryGetJobById(jobId, out Job? job) || job is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with id '{jobId}' not found.");
ret.Add(job);
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
}


@ -1,116 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.LibraryConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2LibraryConnector(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, libraryConnectors);
}
private ValueTuple<HttpStatusCode, object?> GetV2LibraryConnectorTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK,
Enum.GetValues<LibraryConnector.LibraryType>().ToDictionary(b => (byte)b, b => Enum.GetName(b)));
}
private ValueTuple<HttpStatusCode, object?> GetV2LibraryConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(libraryConnectors.All(lc => lc.libraryType != libraryType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {Enum.GetName(libraryType)} not configured.");
else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, libraryConnectors.First(lc => lc.libraryType == libraryType));
}
private ValueTuple<HttpStatusCode, object?> PostV2LibraryConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(!requestParameters.TryGetValue("url", out string? url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
switch (libraryType)
{
case LibraryConnector.LibraryType.Kavita:
Kavita kavita = new (this, url, username, password);
libraryConnectors.RemoveWhere(lc => lc.libraryType == LibraryConnector.LibraryType.Kavita);
libraryConnectors.Add(kavita);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, kavita);
case LibraryConnector.LibraryType.Komga:
Komga komga = new (this, url, username, password);
libraryConnectors.RemoveWhere(lc => lc.libraryType == LibraryConnector.LibraryType.Komga);
libraryConnectors.Add(komga);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, komga);
default: return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"LibraryType {Enum.GetName(libraryType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> PostV2LibraryConnectorTypeTest(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(!requestParameters.TryGetValue("url", out string? url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
switch (libraryType)
{
case LibraryConnector.LibraryType.Kavita:
Kavita kavita = new (this, url, username, password);
return kavita.Test() switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, kavita),
_ => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.FailedDependency, kavita)
};
case LibraryConnector.LibraryType.Komga:
Komga komga = new (this, url, username, password);
return komga.Test() switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, komga),
_ => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.FailedDependency, komga)
};
default: return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"LibraryType {Enum.GetName(libraryType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> DeleteV2LibraryConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(libraryConnectors.All(lc => lc.libraryType != libraryType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {Enum.GetName(libraryType)} not configured.");
else
{
libraryConnectors.Remove(libraryConnectors.First(lc => lc.libraryType == libraryType));
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}
}


@ -1,166 +0,0 @@
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Processing;
using System.Net;
using System.Text.RegularExpressions;
using SixLabors.ImageSharp.Processing.Processors.Transforms;
using Tranga.Jobs;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2Mangas(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, GetAllCachedManga().Select(m => m.internalId));
}
private ValueTuple<HttpStatusCode, object?> GetV2Manga(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(!requestParameters.TryGetValue("mangaIds", out string? mangaIdListStr))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Missing parameter 'mangaIds'.");
string[] mangaIdList = mangaIdListStr.Split(',').Distinct().ToArray();
List<Manga> ret = new();
foreach (string mangaId in mangaIdList)
{
if(!_parent.TryGetPublicationById(mangaId, out Manga? manga) || manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with id '{mangaId}' not found.");
ret.Add(manga.Value);
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaSearch(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(!requestParameters.TryGetValue("title", out string? title))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Missing parameter 'title'.");
List<Manga> ret = new();
List<Thread> threads = new();
foreach (MangaConnector mangaConnector in _connectors)
{
Thread t = new (() =>
{
ret.AddRange(mangaConnector.GetManga(title));
});
t.Start();
threads.Add(t);
}
while(threads.Any(t => t.ThreadState is ThreadState.Running or ThreadState.WaitSleepJoin))
Thread.Sleep(10);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, manga);
}
private ValueTuple<HttpStatusCode, object?> DeleteV2MangaInternalId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
Job[] jobs = _parent.jobBoss.GetJobsLike(publication: manga).ToArray();
_parent.jobBoss.RemoveJobs(jobs);
RemoveMangaFromCache(groups[1].Value);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalIdCover(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
string filePath = manga.Value.coverFileNameInCache!;
if(!File.Exists(filePath))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "Cover-File not found.");
Image image = Image.Load(filePath);
if (requestParameters.TryGetValue("dimensions", out string? dimensionsStr))
{
Regex dimensionsRex = new(@"([0-9]+)x([0-9]+)");
if(!dimensionsRex.IsMatch(dimensionsStr))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Requested dimensions not in required format.");
Match m = dimensionsRex.Match(dimensionsStr);
int width = int.Parse(m.Groups[1].Value);
int height = int.Parse(m.Groups[2].Value);
double aspectRequested = (double)width / (double)height;
double aspectCover = (double)image.Width / (double)image.Height;
Size newSize = aspectRequested > aspectCover
? new Size(width, (int)(image.Height * ((double)width / image.Width)))
: new Size((int)(image.Width * ((double)height / image.Height)), height);
image.Mutate(x => x.Resize(newSize, CubicResampler.Robidoux, true));
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, image);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalIdChapters(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
Chapter[] chapters = requestParameters.TryGetValue("language", out string? parameter) switch
{
true => manga.Value.mangaConnector.GetChapters((Manga)manga, parameter),
false => manga.Value.mangaConnector.GetChapters((Manga)manga)
};
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, chapters);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalIdChaptersLatest(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
float latest = requestParameters.TryGetValue("language", out string? parameter) switch
{
true => manga.Value.mangaConnector.GetChapters(manga.Value, parameter).Max().chapterNumber,
false => manga.Value.mangaConnector.GetChapters(manga.Value).Max().chapterNumber
};
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, latest);
}
private ValueTuple<HttpStatusCode, object?> PostV2MangaInternalIdIgnoreChaptersBelow(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
if (requestParameters.TryGetValue("startChapter", out string? startChapterStr) &&
float.TryParse(startChapterStr, out float startChapter))
{
Manga manga1 = manga.Value;
manga1.ignoreChaptersBelow = startChapter;
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Parameter 'startChapter' missing, or failed to parse.");
}
private ValueTuple<HttpStatusCode, object?> PostV2MangaInternalIdMoveFolder(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
if(!requestParameters.TryGetValue("location", out string? newFolder))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Parameter 'location' missing.");
manga.Value.MovePublicationFolder(TrangaSettings.downloadLocation, newFolder);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}


@ -1,33 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2LogFile(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (logger is null || !File.Exists(logger?.logFilePath))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "Missing Logfile");
}
FileStream logFile = new (logger.logFilePath, FileMode.Open, FileAccess.Read);
FileStream content = new(Path.GetTempFileName(), FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite, 0, FileOptions.DeleteOnClose);
logFile.Position = 0;
logFile.CopyTo(content);
content.Position = 0;
logFile.Dispose();
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, content);
}
private ValueTuple<HttpStatusCode, object?> GetV2Ping(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, "Pong!");
}
private ValueTuple<HttpStatusCode, object?> PostV2Ping(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, "Pong!");
}
}


@ -1,136 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.NotificationConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2NotificationConnector(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, notificationConnectors);
}
private ValueTuple<HttpStatusCode, object?> GetV2NotificationConnectorTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK,
Enum.GetValues<NotificationConnectors.NotificationConnector.NotificationConnectorType>().ToDictionary(b => (byte)b, b => Enum.GetName(b)));
}
private ValueTuple<HttpStatusCode, object?> GetV2NotificationConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
if(notificationConnectors.All(nc => nc.notificationConnectorType != notificationConnectorType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {Enum.GetName(notificationConnectorType)} not configured.");
else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, notificationConnectors.First(nc => nc.notificationConnectorType == notificationConnectorType));
}
private ValueTuple<HttpStatusCode, object?> PostV2NotificationConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
string? url;
switch (notificationConnectorType)
{
case NotificationConnector.NotificationConnectorType.Gotify:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("appToken", out string? appToken))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'appToken' missing.");
Gotify gotify = new (this, url, appToken);
this.notificationConnectors.RemoveWhere(nc =>
nc.notificationConnectorType == NotificationConnector.NotificationConnectorType.Gotify);
this.notificationConnectors.Add(gotify);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, gotify);
case NotificationConnector.NotificationConnectorType.LunaSea:
if(!requestParameters.TryGetValue("webhook", out string? webhook))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'webhook' missing.");
LunaSea lunaSea = new (this, webhook);
this.notificationConnectors.RemoveWhere(nc =>
nc.notificationConnectorType == NotificationConnector.NotificationConnectorType.LunaSea);
this.notificationConnectors.Add(lunaSea);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, lunaSea);
case NotificationConnector.NotificationConnectorType.Ntfy:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
Ntfy ntfy = new(this, url, username, password, null);
this.notificationConnectors.RemoveWhere(nc =>
nc.notificationConnectorType == NotificationConnector.NotificationConnectorType.Ntfy);
this.notificationConnectors.Add(ntfy);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ntfy);
default:
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"NotificationType {Enum.GetName(notificationConnectorType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> PostV2NotificationConnectorTypeTest(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
string? url;
switch (notificationConnectorType)
{
case NotificationConnector.NotificationConnectorType.Gotify:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("appToken", out string? appToken))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'appToken' missing.");
Gotify gotify = new (this, url, appToken);
gotify.SendNotification("Tranga Test", "It was successful :3");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, gotify);
case NotificationConnector.NotificationConnectorType.LunaSea:
if(!requestParameters.TryGetValue("webhook", out string? webhook))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'webhook' missing.");
LunaSea lunaSea = new (this, webhook);
lunaSea.SendNotification("Tranga Test", "It was successful :3");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, lunaSea);
case NotificationConnector.NotificationConnectorType.Ntfy:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
Ntfy ntfy = new(this, url, username, password, null);
ntfy.SendNotification("Tranga Test", "It was successful :3");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ntfy);
default:
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"NotificationType {Enum.GetName(notificationConnectorType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> DeleteV2NotificationConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
if(notificationConnectors.All(nc => nc.notificationConnectorType != notificationConnectorType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {Enum.GetName(notificationConnectorType)} not configured.");
else
{
notificationConnectors.Remove(notificationConnectors.First(nc => nc.notificationConnectorType == notificationConnectorType));
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}
}


@ -1,137 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2Settings(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.AsJObject());
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsUserAgent(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.userAgent);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsUserAgent(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("value", out string? userAgent))
{
TrangaSettings.UpdateUserAgent(null);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, null);
}
else
{
TrangaSettings.UpdateUserAgent(userAgent);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsRateLimitTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, Enum.GetValues<RequestType>().ToDictionary(b =>(byte)b, b => Enum.GetName(b)) );
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsRateLimit(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.requestLimits);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsRateLimit(GroupCollection groups, Dictionary<string, string> requestParameters)
{
foreach (KeyValuePair<string, string> kv in requestParameters)
{
if(!Enum.TryParse(kv.Key, out RequestType requestType) ||
!int.TryParse(kv.Value, out int requestsPerMinute))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, null);
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.requestLimits);
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsRateLimitType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, out RequestType requestType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"RequestType {groups[1].Value}");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.requestLimits[requestType]);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsRateLimitType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, out RequestType requestType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"RequestType {groups[1].Value}");
if (!requestParameters.TryGetValue("value", out string? requestsPerMinuteStr) ||
!int.TryParse(requestsPerMinuteStr, out int requestsPerMinute))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Errors parsing requestsPerMinute");
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsAprilFoolsMode(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.aprilFoolsMode);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsAprilFoolsMode(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("value", out string? trueFalseStr) ||
!bool.TryParse(trueFalseStr, out bool trueFalse))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Errors parsing 'value'");
TrangaSettings.UpdateAprilFoolsMode(trueFalse);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsCompressImages(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.compression);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsCompressImages(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("value", out string? valueStr) ||
!int.TryParse(valueStr, out int value)
|| value != int.Clamp(value, 1, 100))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Errors parsing 'value'");
TrangaSettings.UpdateCompressImages(value);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsBwImages(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.bwImages);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsBwImages(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("value", out string? trueFalseStr) ||
!bool.TryParse(trueFalseStr, out bool trueFalse))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Errors parsing 'value'");
TrangaSettings.UpdateBwImages(trueFalse);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsDownloadLocation(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("location", out string? folderPath))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "Missing Parameter 'location'");
try
{
bool moveFiles = requestParameters.TryGetValue("moveFiles", out string? moveFilesStr) switch
{
false => true,
true => bool.Parse(moveFilesStr!)
};
TrangaSettings.UpdateDownloadLocation(folderPath, moveFiles);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
catch (FormatException)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Error Parsing Parameter 'moveFiles'");
}
}
}


@ -8,7 +8,8 @@ public partial class Tranga : GlobalBase
 {
     public bool keepRunning;
     public JobBoss jobBoss;
-    private Server.Server _server;
+    private Server _server;
+    private HashSet<MangaConnector> _connectors;
     public Tranga(Logger? logger) : base(logger)
     {
@ -17,22 +18,21 @@ public partial class Tranga : GlobalBase
         _connectors = new HashSet<MangaConnector>()
         {
             new Manganato(this),
-            new Mangasee(this),
             new MangaDex(this),
             new MangaKatana(this),
             new Mangaworld(this),
             new Bato(this),
-            new MangaLife(this),
             new ManhuaPlus(this),
             new MangaHere(this),
             new AsuraToon(this),
-            new Weebcentral(this)
+            new Weebcentral(this),
+            new Webtoons(this),
         };
         foreach(DirectoryInfo dir in new DirectoryInfo(Path.GetTempPath()).GetDirectories("trangatemp"))//Cleanup old temp folders
             dir.Delete();
         jobBoss = new(this, this._connectors);
         StartJobBoss();
-        this._server = new Server.Server(this);
+        this._server = new Server(this);
         string[] emojis = { "(•‿•)", "(づ \u25d5‿\u25d5 )づ", "( \u02d8\u25bd\u02d8)っ\u2668", "=\uff3e\u25cf \u22cf \u25cf\uff3e=", "(ΦωΦ)", "(\u272a\u3268\u272a)", "( ノ・o・ )ノ", "(〜^\u2207^ )〜", "~(\u2267ω\u2266)~","૮ \u00b4• ﻌ \u00b4• ა", "(\u02c3ᆺ\u02c2)", "(=\ud83d\udf66 \u0f1d \ud83d\udf66=)"};
         SendNotifications("Tranga Started", emojis[Random.Shared.Next(0,emojis.Length-1)]);
         Log(TrangaSettings.AsJObject().ToString());
@ -52,9 +52,9 @@ public partial class Tranga : GlobalBase
         return connector is not null;
     }
-    public List<MangaConnector> GetConnectors()
+    public IEnumerable<MangaConnector> GetConnectors()
     {
-        return _connectors.ToList();
+        return _connectors;
     }
     public Manga? GetPublicationById(string internalId) => GetCachedManga(internalId);


@ -10,13 +10,10 @@
   <ItemGroup>
     <PackageReference Include="GlaxArguments" Version="1.1.0" />
-    <PackageReference Include="HtmlAgilityPack" Version="1.11.71" />
+    <PackageReference Include="HtmlAgilityPack" Version="1.11.72" />
     <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
-    <PackageReference Include="SixLabors.ImageSharp" Version="3.1.5" />
-    <PackageReference Include="PuppeteerSharp" Version="20.0.5" />
+    <PackageReference Include="PuppeteerSharp" Version="20.1.0" />
     <PackageReference Include="Soenneker.Utils.String.NeedlemanWunsch" Version="2.1.301" />
-    <PackageReference Include="System.Drawing.Common" Version="9.0.0-preview.7.24405.4" />
-    <PackageReference Include="ZstdSharp.Port" Version="0.8.1" />
   </ItemGroup>
   <ItemGroup>


@ -17,14 +17,11 @@ public static class TrangaSettings
     public static string userAgent { get; private set; } = DefaultUserAgent;
     public static bool bufferLibraryUpdates { get; private set; } = false;
     public static bool bufferNotifications { get; private set; } = false;
-    public static int compression{ get; private set; } = 40;
-    public static bool bwImages { get; private set; } = false;
     [JsonIgnore] public static string settingsFilePath => Path.Join(workingDirectory, "settings.json");
     [JsonIgnore] public static string libraryConnectorsFilePath => Path.Join(workingDirectory, "libraryConnectors.json");
     [JsonIgnore] public static string notificationConnectorsFilePath => Path.Join(workingDirectory, "notificationConnectors.json");
     [JsonIgnore] public static string jobsFolderPath => Path.Join(workingDirectory, "jobs");
     [JsonIgnore] public static string coverImageCache => Path.Join(workingDirectory, "imageCache");
-    [JsonIgnore] public static string mangaCacheFolderPath => Path.Join(workingDirectory, "mangaCache");
     public static ushort? version { get; } = 2;
     public static bool aprilFoolsMode { get; private set; } = true;
     [JsonIgnore]internal static readonly Dictionary<RequestType, int> DefaultRequestLimits = new ()
@ -38,6 +35,8 @@ public static class TrangaSettings
     };
     public static Dictionary<RequestType, int> requestLimits { get; set; } = DefaultRequestLimits;
+    public static int ChromiumStartupTimeoutMs { get; set; } = 30000;
+    public static int ChromiumPageTimeoutMs { get; set; } = 30000;
     public static void LoadFromWorkingDirectory(string directory)
     {
@ -51,9 +50,7 @@ public static class TrangaSettings
         ExportSettings();
     }
-    public static void CreateOrUpdate(string? downloadDirectory = null, string? pWorkingDirectory = null,
-        int? pApiPortNumber = null, string? pUserAgent = null, bool? pAprilFoolsMode = null,
-        bool? pBufferLibraryUpdates = null, bool? pBufferNotifications = null, int? pCompression = null, bool? pbwImages = null)
+    public static void CreateOrUpdate(string? downloadDirectory = null, string? pWorkingDirectory = null, int? pApiPortNumber = null, string? pUserAgent = null, bool? pAprilFoolsMode = null, bool? pBufferLibraryUpdates = null, bool? pBufferNotifications = null)
     {
         if(pWorkingDirectory is null && File.Exists(settingsFilePath))
             LoadFromWorkingDirectory(workingDirectory);
@ -64,8 +61,6 @@ public static class TrangaSettings
         aprilFoolsMode = pAprilFoolsMode ?? aprilFoolsMode;
         bufferLibraryUpdates = pBufferLibraryUpdates ?? bufferLibraryUpdates;
         bufferNotifications = pBufferNotifications ?? bufferNotifications;
-        compression = pCompression ?? compression;
-        bwImages = pbwImages ?? bwImages;
         Directory.CreateDirectory(downloadLocation);
         Directory.CreateDirectory(workingDirectory);
         ExportSettings();
@ -105,67 +100,33 @@ public static class TrangaSettings
         ExportSettings();
     }
-    public static void UpdateCompressImages(int value)
-    {
-        compression = int.Clamp(value, 1, 100);
-        ExportSettings();
-    }
-    public static void UpdateBwImages(bool enabled)
-    {
-        bwImages = enabled;
-        ExportSettings();
-    }
     public static void UpdateDownloadLocation(string newPath, bool moveFiles = true)
     {
         if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
-            Directory.CreateDirectory(newPath, GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
+            Directory.CreateDirectory(newPath,
+                GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
         else
             Directory.CreateDirectory(newPath);
-        if (moveFiles)
-            MoveContentsOfDirectoryTo(TrangaSettings.downloadLocation, newPath);
-        TrangaSettings.downloadLocation = newPath;
+        if (moveFiles && Directory.Exists(downloadLocation))
+            Directory.Move(downloadLocation, newPath);
+        downloadLocation = newPath;
         ExportSettings();
     }
     public static void UpdateWorkingDirectory(string newPath)
     {
         if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
-            Directory.CreateDirectory(newPath, GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
+            Directory.CreateDirectory(newPath,
+                GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
         else
             Directory.CreateDirectory(newPath);
-        MoveContentsOfDirectoryTo(TrangaSettings.workingDirectory, newPath);
-        TrangaSettings.workingDirectory = newPath;
+        Directory.Move(workingDirectory, newPath);
+        workingDirectory = newPath;
         ExportSettings();
     }
-    private static void MoveContentsOfDirectoryTo(string oldDir, string newDir)
-    {
-        string[] directoryPaths = Directory.GetDirectories(oldDir);
-        string[] filePaths = Directory.GetFiles(oldDir);
-        foreach (string file in filePaths)
-        {
-            string newPath = Path.Join(newDir, Path.GetFileName(file));
-            File.Move(file, newPath, true);
-        }
-        foreach(string directory in directoryPaths)
-        {
-            string? dirName = Path.GetDirectoryName(directory);
-            if(dirName is null)
-                continue;
-            string newPath = Path.Join(newDir, dirName);
-            if(Directory.Exists(newPath))
-                MoveContentsOfDirectoryTo(directory, newPath);
-            else
-                Directory.Move(directory, newPath);
-        }
-    }
     public static void UpdateUserAgent(string? customUserAgent)
     {
         userAgent = customUserAgent ?? DefaultUserAgent;
@ -208,8 +169,8 @@ public static class TrangaSettings
         jobj.Add("requestLimits", JToken.FromObject(requestLimits));
         jobj.Add("bufferLibraryUpdates", JToken.FromObject(bufferLibraryUpdates));
         jobj.Add("bufferNotifications", JToken.FromObject(bufferNotifications));
-        jobj.Add("compression", JToken.FromObject(compression));
-        jobj.Add("bwImages", JToken.FromObject(bwImages));
+        jobj.Add("chromiumStartTimeout", JToken.FromObject(ChromiumStartupTimeoutMs));
+        jobj.Add("chromiumPageTimeout", JToken.FromObject(ChromiumPageTimeoutMs));
         return jobj;
     }
@ -234,9 +195,9 @@ public static class TrangaSettings
         bufferLibraryUpdates = blu.Value<bool>()!;
         if (jobj.TryGetValue("bufferNotifications", out JToken? bn))
             bufferNotifications = bn.Value<bool>()!;
-        if (jobj.TryGetValue("compression", out JToken? ci))
-            compression = ci.Value<int>()!;
-        if (jobj.TryGetValue("bwImages", out JToken? bwi))
-            bwImages = bwi.Value<bool>()!;
+        if (jobj.TryGetValue("chromiumStartTimeout", out JToken? cst))
+            ChromiumStartupTimeoutMs = cst.Value<int>();
+        if (jobj.TryGetValue("chromiumPageTimeout", out JToken? cpt))
+            ChromiumPageTimeoutMs = cpt.Value<int>();
     }
 }


@ -1,383 +0,0 @@
## Tranga API Calls
This document outlines the HTTP API calls that Tranga accepts. Tranga expects a specific HTTP method for each call, so careful attention must be paid to the method when making them.
In the examples below, `{apiUri}` refers to your `http(s)://TRANGA.FRONTEND.URI/api`. Parameters are included in the HTTP request URI, request bodies are JSON, and Tranga always responds with JSON in the response body, as in the sketch below.
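As a minimal sketch (the host, port, and program scaffolding below are placeholders, not anything Tranga prescribes), the `/Ping` endpoint can be called from C# with `HttpClient`:
```
// Minimal sketch: GET {apiUri}/Ping. Replace the base address with your own instance.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class PingExample
{
    static async Task Main()
    {
        using HttpClient client = new();
        string apiUri = "http://localhost:6531/api"; // placeholder {apiUri}
        HttpResponseMessage response = await client.GetAsync($"{apiUri}/Ping");
        // Depending on the server build, the body may be zstd-compressed (Content-Encoding: zstd).
        Console.WriteLine($"{(int)response.StatusCode} {await response.Content.ReadAsStringAsync()}");
    }
}
```
Any HTTP client works the same way; only the method, path, and body change per endpoint.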
#### [GET] /Connectors
Retrieves the available manga sites (connectors) that Tranga is currently able to download manga from.
- Parameters:
None
- Request Body:
None
#### [GET] /Jobs
Retrieves all jobs that Tranga is keeping track of, including running jobs, waiting jobs, and manga-tracking (monitoring) jobs.
- Parameters:
None
- Request Body:
None
#### [DELETE] /Jobs
Removes the job specified by the job ID.
- Parameters:
None
- Request Body:
```
{
jobId: ${Tranga Job ID}
}
```
#### [POST] /Jobs/Cancel
Cancels a running job or prevents a queued job from running.
- Parameters:
None
- Request Body:
```
{
jobId: ${Tranga Job ID}
}
```
#### [POST] /Jobs/DownloadNewChapters
Manually adds a Job to Tranga's queue to check for and download new chapters for a specified manga
- Parameters:
None
- Request Body:
```
{
connector: ${Manga Connector to Download From}
internalId: ${Tranga Manga ID}
translatedLanguage: ${Manga Language}
}
```
#### [GET] /Jobs/Running
Retrieves all currently running jobs.
- Parameters:
None
- Request Body:
None
#### [POST] /Jobs/StartNow
Manually starts a configured job
- Parameters:
None
- Request Body:
```
{
jobId: ${Tranga Job ID}
}
```
#### [GET] /Jobs/Waiting
Retrieves all currently queued jobs.
- Parameters:
None
- Request Body:
None
#### [GET] /Jobs/MonitorJobs
Retrieves all jobs for Mangas that Tranga is currently tracking.
- Parameters:
None
- Request Body:
None
#### [POST] /Jobs/MonitorManga
Adds a new manga for Tranga to monitor (see the sketch below).
- Parameters:
None
- Request Body:
```
{
connector: ${Manga Connector to download from}
internalId: ${Tranga Manga ID}
interval: ${Interval at which to run job, in the HH:MM:SS format}
translatedLanguage: ${Supported language code}
ignoreBelowChapterNum: ${Chapter number to start downloading from}
customFolderName: ${Folder Name to save Manga to}
}
```
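A hedged sketch of queueing such a job from C#; the connector name, `internalId`, interval, and other values are made-up placeholders, and Newtonsoft.Json is used only because the project already depends on it:
```
// Sketch: POST {apiUri}/Jobs/MonitorManga with a JSON body. Every value below is a placeholder;
// a real internalId comes from /Manga/FromConnector.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

class MonitorMangaExample
{
    static async Task Main()
    {
        using HttpClient client = new();
        string apiUri = "http://localhost:6531/api"; // placeholder {apiUri}
        string body = JsonConvert.SerializeObject(new
        {
            connector = "MangaDex",                 // placeholder connector name
            internalId = "placeholder-internal-id", // placeholder Tranga Manga ID
            interval = "03:00:00",                  // HH:MM:SS
            translatedLanguage = "en",
            ignoreBelowChapterNum = 0,
            customFolderName = "Example Manga"
        });
        HttpResponseMessage response = await client.PostAsync(
            $"{apiUri}/Jobs/MonitorManga",
            new StringContent(body, Encoding.UTF8, "application/json"));
        Console.WriteLine((int)response.StatusCode);
    }
}
```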
#### [GET] /Jobs/Progress
Retrieves the current completion progress of a running or waiting job. Tranga's ID for a job is returned by each of the `GET /Jobs` API calls; a short sketch follows below.
- Parameters:
- `{jobId}`: Tranga Job ID
- Request Body:
None
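A small sketch of polling progress, assuming the `jobId` parameter is passed as a query-string value (the ID below is a placeholder):
```
// Sketch: GET {apiUri}/Jobs/Progress?jobId=... ; the job ID is a placeholder taken from an earlier GET /Jobs call.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class JobProgressExample
{
    static async Task Main()
    {
        using HttpClient client = new();
        string apiUri = "http://localhost:6531/api"; // placeholder {apiUri}
        string jobId = "placeholder-job-id";         // placeholder Tranga Job ID
        HttpResponseMessage response =
            await client.GetAsync($"{apiUri}/Jobs/Progress?jobId={Uri.EscapeDataString(jobId)}");
        // Depending on the server build, the body may be zstd-compressed (Content-Encoding: zstd).
        Console.WriteLine($"{(int)response.StatusCode} {await response.Content.ReadAsStringAsync()}");
    }
}
```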
#### [POST] /Jobs/UpdateMetadata
Updates the metadata for all monitored mangas
- Parameters:
None
- Request Body:
None
#### [GET] /LibraryConnectors
Retrieves the currently configured library servers
- Parameters:
None
- Request Body:
None
#### [DELETE] /LibraryConnectors/Reset
Resets or clears a configured library connector
- Parameters:
None
- Request Body:
```
{
libraryConnector: Komga/Kavita
}
```
#### [POST] /LibraryConnectors/Test
Tests a library connector with the provided settings before saving it, verifying that the connection to the library server is active.
- Parameters:
None
- Request Body:
```
{
libraryConnector: Komga/Kavita
libraryURL: ${Library URL}
komgaAuth: Only for when libraryConnector = Komga
kavitaUsername: Only for when libraryConnector = Kavita
kavitaPassword: Only for when libraryConnector = Kavita
}
```
#### [GET] /LibraryConnectors/Types
Retrieves key-value pairs for all of Tranga's currently supported library servers.
- Parameters:
None
- Request Body:
None
#### [POST] /LibraryConnectors/Update
Updates or Adds a Library Connector to Tranga
- Parameters: None
- Request Body:
```
{
libraryConnector: Komga/Kavita
libraryURL: ${Library URL}
komgaAuth: Only for when libraryConnector = Komga
kavitaUsername: Only for when libraryConnector = Kavita
kavitaPassword: Only for when libraryConnector = Kavita
}
```
#### [GET] /LogFile
Retrieves the log file from the running Tranga instance
- Parameters:
None
- Request Body:
None
#### [GET] /Manga/FromConnector
Retrieves the details about a specified manga from a specific connector. If the manga title returned by Tranga is a URL (determined by the presence of `http` in the title), the call should use the `url` parameter rather than `title`; the sketch below shows the switch.
- Parameters:
- `{connector}`: Manga Connector
- `{url/title}`: Manga URL/Title
- Request Body:
None
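A sketch of the title-versus-url switch described above; the connector name and search text are placeholders, and the query-string parameter names are assumed to match the parameter names listed here:
```
// Sketch: choose between the 'title' and 'url' parameter for /Manga/FromConnector.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class FromConnectorExample
{
    static async Task Main()
    {
        using HttpClient client = new();
        string apiUri = "http://localhost:6531/api"; // placeholder {apiUri}
        string connector = "MangaDex";               // placeholder connector
        string titleOrUrl = "One Piece";             // could also be a full manga URL
        string parameter = titleOrUrl.Contains("http") ? "url" : "title";
        string requestUri = $"{apiUri}/Manga/FromConnector" +
                            $"?connector={Uri.EscapeDataString(connector)}" +
                            $"&{parameter}={Uri.EscapeDataString(titleOrUrl)}";
        HttpResponseMessage response = await client.GetAsync(requestUri);
        Console.WriteLine((int)response.StatusCode);
    }
}
```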
#### [GET] /Manga/Chapters
Retrieves the currently available chapters for a specified manga from a connector. The `{internalId}` is how Tranga uniquely recognizes and distinguishes different Manga.
- Parameters:
- `{connector}`: Manga Connector
- `{internalId}`: Tranga Manga ID
- `{translatedLanguage}`: Translated Language
- Request Body:
None
#### [GET] /Manga/Cover
Retrieves the URL of the cover image for a specific manga that Tranga is tracking.
- Parameters:
- `{internalId}`: Tranga Manga ID
- Request Body:
None
#### [GET] /NotificationConnectors
Retrieves the currently configured notification providers
- Parameters:
None
- Request Body:
None
#### [DELETE] /NotificationConnectors/Reset
Resets or clears a configured notification connector
- Parameters:
None
- Request Body:
```
{
notificationConnector: Gotify/Ntfy/LunaSea
}
```
#### [POST] /NotificationConnectors/Test
Tests a notification connector with the supplied settings, verifying that they are correct before the connector is saved.
- Parameters:
None
- Request Body:
```
{
notificationConnector: Gotify/Ntfy/LunaSea
gotifyUrl:
gotifyAppToken:
lunaseaWebhook:
ntfyUrl:
ntfyAuth:
}
```
#### [POST] /NotificationConnectors/Update
Updates or Adds a notification connector to Tranga
- Parameters:
None
- Request Body:
```
{
notificationConnector: Gotify/Ntfy/LunaSea
gotifyUrl:
gotifyAppToken:
lunaseaWebhook:
ntfyUrl:
ntfyAuth:
}
```
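A sketch of the test-then-save flow for an Ntfy connector; the URLs and credentials are placeholders, the body is assumed to be JSON, and the fields for the other providers are simply left empty here (omitting them may work as well):
```
import requests

BASE_URL = "http://localhost:6531"   # assumption: your Tranga host and apiPortNumber

body = {
    "notificationConnector": "Ntfy",
    "gotifyUrl": "",
    "gotifyAppToken": "",
    "lunaseaWebhook": "",
    "ntfyUrl": "https://ntfy.sh/<your-topic>",   # placeholder topic URL
    "ntfyAuth": "<user:password>",               # placeholder credentials
}

# Test first, then persist the same body.
if requests.post(f"{BASE_URL}/NotificationConnectors/Test", json=body).ok:
    requests.post(f"{BASE_URL}/NotificationConnectors/Update", json=body).raise_for_status()
```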
#### [GET] /NotificationConnectors/Types
Retrieves key-value pairs for all of Tranga's currently supported notification providers.
- Parameters:
None
- Request Body:
None
#### [GET] /Ping
Used periodically by the web frontend to confirm that the connection to the server is active.
- Parameters:
None
- Request Body:
None
#### [GET] /Settings
Retrieves the content of Tranga's `settings.json`
- Parameters:
None
- Request Body:
None
#### [GET] /Settings/customRequestLimit
Retrieves the configured rate limits for different types of manga connector requests.
- Parameters:
None
- Request Body:
None
#### [POST] /Settings/customRequestLimit
Sets the rate limits for different types of manga connector requests.
- Parameters:
None
- Request Body:
```
{
requestType: ${Request type (byte value)}
requestsPerMinute: ${Rate limit, in requests per minute}
}
```
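A sketch of reading and then adjusting a limit; the host is a placeholder, and the `requestType` value `0` is purely illustrative since the byte-to-category mapping is not listed here:
```
import requests

BASE_URL = "http://localhost:6531"   # assumption: your Tranga host and apiPortNumber

# Read the current limits; the keys should correspond to the requestLimits
# block in the Settings schema further down.
current = requests.get(f"{BASE_URL}/Settings/customRequestLimit").json()
print(current)

# Set a limit of 60 requests per minute for one request category.
requests.post(
    f"{BASE_URL}/Settings/customRequestLimit",
    json={"requestType": 0, "requestsPerMinute": 60},   # 0 is an illustrative placeholder
).raise_for_status()
```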
#### [POST] /Settings/UpdateDownloadLocation
Updates the root directory where Tranga downloads manga
- Parameters:
None
- Request Body:
```
{
downloadLocation: ${New Root Directory}
moveFiles: "true"/"false"
}
```
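A sketch with placeholder paths; note that `moveFiles` is documented as the strings `"true"`/`"false"` rather than a boolean:
```
import requests

BASE_URL = "http://localhost:6531"   # assumption: your Tranga host and apiPortNumber

requests.post(
    f"{BASE_URL}/Settings/UpdateDownloadLocation",
    json={"downloadLocation": "/Manga", "moveFiles": "true"},   # placeholder directory
).raise_for_status()
```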
#### [POST] /Settings/userAgent
Updates the user agent that Tranga uses when scraping the web
- Parameters:
None
- Request Body:
```
{
userAgent: ${User Agent String}
}
```

## Connector
```
{
"name": string,
"SupportedLanguages": string[],
"BaseUris": string[]
}
```
## Manga
```
{
"sortName": string,
"authors": string[],
"altTitles": string[][],
"description": string,
"tags": string[],
"coverUrl": string,
"coverFileNameInCache": string,
"links": string[][],
"year": int,
"originalLanguage": string,
"releaseStatus": ReleaseStatus, see ReleaseStatus
"folderName": string,
"publicationId": string,
"internalId": string,
"ignoreChaptersBelow": number,
"latestChapterDownloaded": number,
"latestChapterAvailable": number,
"websiteUrl": string,
"mangaConnector": Connector
}
```
## Chapter
```
{
"parentManga": IManga,
"name": string | undefined,
"volumeNumber": string,
"chapterNumber": string,
"url": string,
"fileName": string,
"id": string?
}
```
### ReleaseStatus
```
{
Continuing = 0,
Completed = 1,
OnHiatus = 2,
Cancelled = 3,
Unreleased = 4
}
```
## Job
```
{
"progressToken": IProgressToken,
"recurring": boolean,
"recurrenceTime": string,
"lastExecution": Date,
"nextExecution": Date,
"id": string,
"jobType": number, //see JobType
"parentJobId": string | null,
"mangaConnector": IMangaConnector,
"mangaInternalId": string | undefined, //only on DownloadNewChapters
"translatedLanguage": string | undefined, //only on DownloadNewChapters
"chapter": IChapter | undefined, //only on DownloadChapter
}
```
### JobType
```
{
DownloadChapterJob = 0,
DownloadNewChaptersJob = 1,
UpdateMetaDataJob = 2,
MonitorManga = 3
}
```
## ProgressToken
```
{
"cancellationRequested": boolean,
"increments": number,
"incrementsCompleted": number,
"progress": number,
"lastUpdate": Date,
"executionStarted": Date,
"timeRemaining": Date,
"state": number //see ProgressState
}
```
### ProgressState
```
{
Running = 0,
Complete = 1,
Standby = 2,
Cancelled = 3,
Waiting = 4
}
```
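To illustrate how a `ProgressToken` and `ProgressState` fit together, a small helper (the values in the example call are made up) that turns a token from `GET /Jobs/Progress` into a readable line:
```
# Maps the numeric ProgressState values listed above to their names.
PROGRESS_STATE = {0: "Running", 1: "Complete", 2: "Standby", 3: "Cancelled", 4: "Waiting"}

def describe(token: dict) -> str:
    done = token["incrementsCompleted"]
    total = token["increments"] or 1      # guard against division by zero on fresh jobs
    state = PROGRESS_STATE.get(token["state"], "Unknown")
    return f"{state}: {done}/{total} increments ({done / total:.0%})"

# Example with made-up values:
print(describe({"incrementsCompleted": 42, "increments": 120, "state": 0}))
# -> Running: 42/120 increments (35%)
```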
## Settings
```
{
"downloadLocation": string,
"workingDirectory": string,
"apiPortNumber": number,
"userAgent": string,
"bufferLibraryUpdates": boolean,
"bufferNotifications": boolean,
"version": number,
"aprilFoolsMode": boolean,
"compressImages": boolean,
"bwImages": boolean,
"requestLimits": {
"MangaInfo": number,
"MangaDexFeed": number,
"MangaDexImage": number,
"MangaImage": number,
"MangaCover": number,
"Default": number
}
}
```
## LibraryConnector
```
{
"libraryType": number, //see LibraryType
"baseUrl": string,
"auth": string
}
```
### LibraryType
```
{
Komga = 0,
Kavita = 1
}
```
## NotificationConnector
```
{
"notificationConnectorType": number, //see NotificationConnectorType
"endpoint": string, //only on Ntfy, Gotify
"appToken": string, //only on Gotify
"auth": string, //only on Ntfy
"topic": string, //only on Ntfy
"id": string, //only on LunaSea
}
```
### NotificationConnectorType
```
{
Gotify = 0,
LunaSea = 1,
Ntfy = 2
}
```