218 Commits

Author SHA1 Message Date
12a542da39 Merge pull request #404 from C9Glax/master
Last merge of master/cuttingedge before V2 transition.
2025-06-18 19:10:40 +02:00
3f5c9d0ca1 Merge pull request #403 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.11.0
Bump docker/setup-buildx-action from 3.10.0 to 3.11.0
2025-06-17 11:41:35 +02:00
538825f0ef Bump docker/setup-buildx-action from 3.10.0 to 3.11.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.10.0 to 3.11.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.10.0...v3.11.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-version: 3.11.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-06-17 06:03:00 +00:00
f0de0a29da Merge pull request #400 from C9Glax/dependabot/github_actions/docker/build-push-action-6.18.0
Bump docker/build-push-action from 6.17.0 to 6.18.0
2025-05-28 15:50:47 +02:00
d4227f2b8f Bump docker/build-push-action from 6.17.0 to 6.18.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.17.0 to 6.18.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.17.0...v6.18.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: 6.18.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-28 05:59:24 +00:00
cd00d35f22 Merge pull request #395 from C9Glax/dependabot/github_actions/docker/build-push-action-6.17.0
Bump docker/build-push-action from 6.16.0 to 6.17.0
2025-05-16 16:15:40 +02:00
4ef3e877ce Bump docker/build-push-action from 6.16.0 to 6.17.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.16.0 to 6.17.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.16.0...v6.17.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: 6.17.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-05-16 05:31:47 +00:00
7dba2518f9 Merge pull request #388 from C9Glax/dependabot/github_actions/docker/build-push-action-6.16.0
Bump docker/build-push-action from 6.15.0 to 6.16.0
2025-04-25 09:01:00 +02:00
7506a0201e Bump docker/build-push-action from 6.15.0 to 6.16.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.15.0 to 6.16.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.15.0...v6.16.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-version: 6.16.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-25 05:51:29 +00:00
91fb815153 Update README.md
2025-03-29 21:17:22 +01:00
6faf8bc733 Merge pull request #376 from C9Glax/cuttingedge
Weebcentral fixes
2025-03-18 18:13:47 +01:00
bdff5b7aec Merge pull request #375 from TheyCallMeTravis/webtoons-search_regex_fix
webtoons - fix search regex
2025-03-18 18:12:35 +01:00
5af8060d7b Merge pull request #374 from TheyCallMeTravis/weebcentral-fixsearch
Weebcentral - Fix Search Results Parse
2025-03-18 18:12:23 +01:00
6ed8ff1d52 webtoons - fix search regex parsing 2025-03-18 10:12:42 -05:00
3324ed6e4a Weebcentral - Fix Search Results Parse 2025-03-17 14:29:09 -05:00
67fd9d284b Merge pull request #369 from TheyCallMeTravis/WeebCentral-add_referrer
WeebCentral - add referer to DownloadChapterImages
2025-03-15 10:24:28 +01:00
08f26dd21d add referer to DownloadChapterImages 2025-03-14 21:18:51 -05:00
89ed500751 Update actions for Server-V2
2025-03-08 19:02:14 +01:00
b00b0ee030 Merge branch 'master' into cuttingedge-merge-ServerV2
2025-03-08 19:00:42 +01:00
e47c52ad48 Merge pull request #367 from C9Glax/cuttingedge
Cuttingedge merge
2025-03-08 18:58:40 +01:00
293f0af8e3 Merge pull request #366 from merlinmarijn/manganato-domain-switch
Manganato connector search fix
2025-03-08 07:40:11 +01:00
ebfa34e386 Update Manganato.cs 2025-03-07 22:33:24 +01:00
14524407f9 Update Manganato.cs 2025-03-07 22:29:40 +01:00
d56f0b383a Merge pull request #365 from merlinmarijn/manganato-domain-switch
Manganato fix chapter naming format in CBZ files (i am sorry)
2025-03-07 21:54:06 +01:00
70391c83c1 Update Manganato.cs
i found out, i am stupid
2025-03-07 21:39:17 +01:00
dc7696ee26 Merge pull request #364 from merlinmarijn/manganato-domain-switch
Enforce correct referrer check for access to Manganato
2025-03-07 20:19:52 +01:00
49dab9a670 Referrer policy changed
- Updated: the image hosting platform seems to have changed its policy and now requires the referrer to be sent from the actual site, instead of allowing connections regardless of the referrer address
2025-03-07 19:57:27 +01:00
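In practice the change amounts to sending the chapter page's URL as the HTTP Referer header when requesting images, so the image host accepts the download. A minimal sketch of that idea using a plain HttpClient; the method and parameter names are illustrative, not Tranga's actual code:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ReferrerDownloadSketch
{
    private static readonly HttpClient Client = new();

    // Hypothetical helper: fetch an image while presenting the chapter page as the referrer,
    // instead of sending no referrer (or an arbitrary one), which the host now rejects.
    public static async Task<byte[]> DownloadImageAsync(string imageUrl, string chapterUrl)
    {
        using HttpRequestMessage request = new(HttpMethod.Get, imageUrl);
        request.Headers.Referrer = new Uri(chapterUrl);
        using HttpResponseMessage response = await Client.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsByteArrayAsync();
    }
}
```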
c9bc79fbd5 Update new_connector.yml
2025-03-07 10:19:08 +01:00
83ce315f87 Merge pull request #357 from merlinmarijn/manganato-domain-switch
Update Connector for Manganato connector: Migrate from .com to .gg & Adjust HTML Parsing
#358 @merlinmarijn
2025-03-07 10:06:44 +01:00
59511056d0 added try around getting urls 2025-03-03 23:43:35 +01:00
ed3ca5dba8 removed leftover comment 2025-03-03 23:04:43 +01:00
8df05d7e8a fixed image referrer 2025-03-03 22:59:25 +01:00
95d1e37b47 Update Manganato.cs 2025-03-03 22:27:37 +01:00
b6494ab7f9 Merge pull request #354 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.6.0
Bump docker/setup-qemu-action from 3.5.0 to 3.6.0
2025-03-03 13:54:59 +01:00
1d1d01b6e5 Bump docker/setup-qemu-action from 3.5.0 to 3.6.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.5.0 to 3.6.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.5.0...v3.6.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-03-03 05:09:43 +00:00
5bb4977876 Merge pull request #353 from ale-ben/cuttingedge
Weebcentral: File name also depends on original chapter name
2025-03-02 16:20:07 +01:00
c6bb1c9180 [cuttingedge] fix(Chapter): Minor logic change to account for all chapterName cases 2025-03-02 16:06:21 +01:00
9a066e7ac7 [cuttingedge] fix(Weebcentral): Updated CheckChapterIsDownloaded logic to also consider chapter name if present 2025-03-02 15:57:09 +01:00
4bafffded4 [cuttingedge] feat(Weebcentral): When ordering chapters, order by name descending to put special chapters first 2025-03-02 15:56:12 +01:00
942b43da67 Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-ServerV2 2025-03-02 10:06:48 +01:00
ce5538b352 Merge pull request #341 from Makhuta/cuttingedge
Added support for connector languages to HandleGet
2025-03-02 09:51:11 +01:00
0cfdf17bd4 Merge pull request #350 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.10.0
Bump docker/setup-buildx-action from 3.9.0 to 3.10.0
2025-03-02 09:50:21 +01:00
0c48c1e020 Merge pull request #351 from C9Glax/dependabot/github_actions/docker/build-push-action-6.15.0
Bump docker/build-push-action from 6.14.0 to 6.15.0
2025-03-02 09:50:17 +01:00
0638e75ed6 Merge pull request #349 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.5.0
Bump docker/setup-qemu-action from 3.4.0 to 3.5.0
2025-03-02 09:49:16 +01:00
5a4bc1c6de [cuttingedge] fix(Weebcentral): Handle case of chapter name with multiple number parts 2025-03-01 12:06:51 +01:00
71f663ca2f [cuttingedge] fix(Weebcentral): File name also depends on original chapter name 2025-03-01 11:39:12 +01:00
1b61a16061 Bump docker/build-push-action from 6.14.0 to 6.15.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.14.0 to 6.15.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.14.0...v6.15.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-27 05:25:23 +00:00
db81fdce39 Bump docker/setup-buildx-action from 3.9.0 to 3.10.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.9.0 to 3.10.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.9.0...v3.10.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-27 05:25:21 +00:00
fdb5451162 Bump docker/setup-qemu-action from 3.4.0 to 3.5.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.4.0 to 3.5.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.4.0...v3.5.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-27 05:25:19 +00:00
6b7632b071 Merge pull request #344 from C9Glax/dependabot/github_actions/docker/build-push-action-6.14.0
Bump docker/build-push-action from 6.13.0 to 6.14.0
2025-02-20 16:49:19 +01:00
06c080dfce Bump docker/build-push-action from 6.13.0 to 6.14.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.13.0 to 6.14.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.13.0...v6.14.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-20 05:50:25 +00:00
8130e11a9c Merge pull request #342 from C9Glax/cuttingedge
Cuttingedge master merge
2025-02-14 15:51:42 +01:00
659a42d370 Add
- ability to get the supported languages of a connector, for use in the monitoring-language selector
2025-02-12 14:07:13 +01:00
9cef068785 Merge pull request #340 from Makhuta/cuttingedge
Fix the Webtoons connector getting a few chapters multiple times
2025-02-11 21:24:14 +01:00
4ad3149523 Fix
- fixed chapter parsing: pagination was handled incorrectly, causing the chapters from the last page to be added multiple times (downloads still worked, but the chapters from the last page were queued for download repeatedly)
2025-02-11 20:16:30 +01:00
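In other words, the fix is about walking the chapter-list pages exactly once and not re-parsing the last page. A rough sketch of the corrected loop, with hypothetical helper names (the real connector parses HTML instead of taking a delegate):

```csharp
using System;
using System.Collections.Generic;

public static class ChapterPagingSketch
{
    // Walk pages 1..totalPages once and deduplicate by chapter URL, so the chapters
    // from the last page cannot be added (and later queued for download) more than once.
    public static List<string> CollectChapterUrls(int totalPages, Func<int, IEnumerable<string>> parsePage)
    {
        List<string> chapterUrls = new();
        HashSet<string> seen = new();
        for (int page = 1; page <= totalPages; page++)
            foreach (string url in parsePage(page))
                if (seen.Add(url))          // ignore anything we already collected
                    chapterUrls.Add(url);
        return chapterUrls;
    }
}
```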
e6d40a7b36 Remove unused code from Weebcentral
2025-02-09 18:37:55 +01:00
a95cb90561 Update Nuget packages 2025-02-09 18:37:41 +01:00
603e1b41d9 Add Chromium referer header 2025-02-09 18:37:33 +01:00
bb8a514830 Do not create .duplicate files anymore.
Just warn in log and delete (or attempt to delete)
2025-02-09 17:57:08 +01:00
edacaaba8a Update Readme 2025-02-09 17:38:54 +01:00
d97da26994 spelling error 2025-02-09 17:37:53 +01:00
8b923d73c4 Merge pull request #337 from Makhuta/cuttingedge
Add Manga Connector
2025-02-09 17:34:08 +01:00
814efd3528 Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge 2025-02-09 17:28:03 +01:00
2cd5d8bc4f Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-02-09 17:27:42 +01:00
5a864ab9b7 Remove Manga4Life 2025-02-09 17:27:35 +01:00
c700974693 Merge pull request #339 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.4.0
Bump docker/setup-qemu-action from 3.3.0 to 3.4.0
2025-02-09 17:18:42 +01:00
553b5558d3 Merge pull request #338 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.9.0
Bump docker/setup-buildx-action from 3.8.0 to 3.9.0
2025-02-09 17:18:25 +01:00
c9bbfee26b Merge pull request #331 from C9Glax/dependabot/github_actions/docker/build-push-action-6.13.0
Bump docker/build-push-action from 6.12.0 to 6.13.0
2025-02-09 17:18:10 +01:00
6e869eeb0d Bump docker/setup-qemu-action from 3.3.0 to 3.4.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.3.0...v3.4.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-07 05:13:18 +00:00
be7da69dbd Bump docker/setup-buildx-action from 3.8.0 to 3.9.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.8.0 to 3.9.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.8.0...v3.9.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-02-07 05:13:16 +00:00
7f13d9b1e6 Fix
- forgotten comma
2025-02-06 15:39:06 +01:00
0c9e3205c2 Add Manga Connector
- added [Webtoon](https://www.webtoons.com) manga connector
- modified/added support for saving covers with referrer
2025-02-06 15:37:30 +01:00
8c3b70b32e Bump docker/build-push-action from 6.12.0 to 6.13.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.12.0 to 6.13.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.12.0...v6.13.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-27 06:02:55 +00:00
4f7031ecfc Merge pull request #328 from ale-ben/cuttingedge
fix: Add escape to Weebcentral regex
2025-01-25 23:26:48 +01:00
f7a285aabd [cuttingedge] fix: Add escape to Weebcentral regex 2025-01-25 11:40:00 +01:00
786482398c Merge pull request #327 from ale-ben/cuttingedge
Fix bug that prevented download of chapter 0
2025-01-24 22:09:53 +01:00
7921dcb1cb [cuttingedge] fix: Change condition for newChapters. Should solve #323 2025-01-24 21:52:52 +01:00
d0c9313279 Merge pull request #322 from C9Glax/dependabot/github_actions/docker/build-push-action-6.12.0
Bump docker/build-push-action from 6.11.0 to 6.12.0
2025-01-16 17:00:36 +01:00
58cf4cf4e0 Bump docker/build-push-action from 6.11.0 to 6.12.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.11.0 to 6.12.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.11.0...v6.12.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-16 05:52:33 +00:00
280d715a7c Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge
2025-01-15 23:14:20 +01:00
d0b775444d Change Chromium back to WaitUntilNavigation.Networkidle0 2025-01-15 23:14:15 +01:00
b4edcccafe Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-01-15 23:02:55 +01:00
268441a47d Trangasettings Json 2025-01-15 23:02:48 +01:00
1701881f4b Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-01-15 22:53:47 +01:00
78a9322036 ChromiumDownloadClient Change WaitUntilNavigation to Load instead of NetworkIdle 2025-01-15 22:53:39 +01:00
e5be5703f8 Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2025-01-15 22:25:16 +01:00
cc32b3dfae TrangaSettings Chromium Timeouts 2025-01-15 22:24:55 +01:00
ce217aae4f Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge 2025-01-15 22:18:11 +01:00
123a8b06b2 Job-loading error message 2025-01-15 22:15:33 +01:00
2350c5a04b Remove Mangasee 2025-01-15 22:13:58 +01:00
f532e2ff76 JobBoss LoadJobsList change:
Fix: create the jobs directory if the Directory.Exists check on jobsFolderPath fails
Fix: a job that fails to load no longer crashes the application.
2025-01-15 22:13:50 +01:00
3abf7224d0 Merge pull request #316 from ale-ben/cuttingedge
Fixed regex to capture chapters with decimal (1.5, ..)
2025-01-11 00:41:20 +01:00
b39dbd5671 [cuttingedge] fix(weebcentral): Fixed regex to capture chapters with decimal (1.5, ..) 2025-01-10 22:10:34 +01:00
375fad0c21 Merge pull request #314 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.3.0
Bump docker/setup-qemu-action from 3.2.0 to 3.3.0
2025-01-10 11:02:46 +01:00
ee0d17c24f Merge pull request #315 from C9Glax/dependabot/github_actions/docker/build-push-action-6.11.0
Bump docker/build-push-action from 6.9.0 to 6.11.0
2025-01-10 11:02:26 +01:00
36ab3c3fdb Bump docker/build-push-action from 6.9.0 to 6.11.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.9.0 to 6.11.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.9.0...v6.11.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-10 05:46:13 +00:00
c3d60c6586 Bump docker/setup-qemu-action from 3.2.0 to 3.3.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.2.0 to 3.3.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.2.0...v3.3.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-10 05:46:10 +00:00
6aa8413c40 Fix #311 MangaWorld now requires Javascript
2025-01-09 01:48:13 +01:00
b96ae4a2d2 Merge pull request #304 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.8.0
Bump docker/setup-buildx-action from 3.7.1 to 3.8.0
2024-12-17 17:38:07 +01:00
3a25c0b221 Bump docker/setup-buildx-action from 3.7.1 to 3.8.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.7.1 to 3.8.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.7.1...v3.8.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-12-17 05:59:19 +00:00
e1f1a05724 Merge pull request #302 from ale-ben/feature/weebcentral_build_error
Fix build error in Weebcentral
2024-12-14 21:54:28 +01:00
72d9bda0e8 [feature/weebcentral_build_error] fix type in equality check 2024-12-14 20:44:43 +01:00
a40a9c84df Merge pull request #298 from ale-ben/feature/weebcentral
Weebcentral implementation
2024-12-14 18:42:47 +01:00
825b945ad1 AsuraToon Crash on no Artists or Authors
Fix #296
2024-12-14 18:02:41 +01:00
b8c624f3ea AsuraToon crash when there are no search results #296 2024-12-14 17:55:20 +01:00
93cfdddd19 Possible fix #300 chromium startup "Failed to launch browser! chrome_crashpad_handler: --database is required" 2024-12-14 17:51:22 +01:00
4c8d9bfaf2 [feature/weebcentral] Added Weebcentral to readme 2024-12-14 16:29:43 +01:00
dd988658c0 [feature/weebcentral] Added Weebcentral to connectors 2024-12-14 16:18:15 +01:00
cf4c84a47f [feature/weebcentral] Working download logic 2024-12-14 00:58:52 +01:00
5d9bfc3adf [feature/weebcentral] Get chapters 2024-12-14 00:45:10 +01:00
5a770c8e9f [feature/weebcentral] Working search 2024-12-13 23:42:35 +01:00
e3bd7620aa Fix #296 AsuraToon
AsuraComic does not serve static pages; use Chromium instead.
Make Puppeteer log less verbosely.
2024-12-13 18:53:25 +01:00
428d6e13d1 Fix UpdateJobFile with oldFile:
oldFilePath was the full (absolute) path, not a relative one
2024-12-12 22:41:28 +01:00
1e6a65c0fd Chapter volume and chapter number as float instead of string.
Possible fix #293
2024-12-12 22:33:13 +01:00
025d43b752 Fix duplicate job check.
We were still adding duplicate jobs if not *every* field in the Manga matched.
We now only compare publicationId.
2024-12-12 22:18:06 +01:00
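The corresponding diff to DownloadNewChapters.Equals appears further down in this changeset; conceptually, two jobs are now considered duplicates when they use the same connector and target the same publicationId, rather than requiring every Manga field to match. A simplified sketch of that comparison (types reduced for illustration):

```csharp
using System;

public record MangaRef(string PublicationId, string SortName);

public class DownloadNewChaptersJob
{
    public required string ConnectorName { get; init; }
    public required MangaRef Manga { get; init; }

    // Duplicate check: same connector, same publicationId. Other Manga fields
    // (title, cover, tags, ...) no longer influence equality.
    public override bool Equals(object? obj) =>
        obj is DownloadNewChaptersJob other &&
        other.ConnectorName == ConnectorName &&
        other.Manga.PublicationId == Manga.PublicationId;

    public override int GetHashCode() => HashCode.Combine(ConnectorName, Manga.PublicationId);
}
```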
113c0abba7 Merge pull request #294 from C9Glax/cuttingedge
Merge cuttingedge into master
2024-12-12 22:07:13 +01:00
747df0bde5 Add Puppeteer Logger 2024-12-12 21:42:21 +01:00
463f360808 Dependency updates 2024-12-12 21:28:58 +01:00
85d7c07b13 Mangaworld add decimal-chapters (686.5) to regex
#289
2024-12-04 19:55:31 +01:00
553f56ecaf Longer ExceptionMessage when Chapter comparison fails
#289
2024-12-04 19:49:38 +01:00
9cc4f8c090 Merge pull request #283 from C9Glax/cuttingedge-merge-candiate
AsuraToon merge
2024-11-28 21:41:19 +01:00
204fb7614d Fix #281 Manganato errors when there are no chapters uploaded 2024-11-28 21:35:29 +01:00
d6e73ffcdf Merge pull request #276 from C9Glax/cuttingedge-merge-candiate
Cuttingedge merge candidate
2024-11-28 21:23:56 +01:00
5a8202f872 More logging 2024-11-11 17:59:48 +01:00
55cc2a2e84 Merge pull request #277 from C9Glax/asuratoon
Asuratoon
2024-11-02 17:51:12 +01:00
b619109ea1 fix #141 chapternames 2024-11-02 17:48:18 +01:00
72943330c3 Merge branch 'refs/heads/cuttingedge' into asuratoon 2024-11-02 17:45:13 +01:00
38bc1e4d53 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-11-02 17:44:30 +01:00
47479f7a0d Fix chaptermarkers.
Don't create one if Chapter does not have an ID
2024-11-02 17:44:23 +01:00
b2381be860 #141 fix ParsePublicationsFromHtml, statusNode, titleNode, firstChapterNode
fix ParseChaptersFromHtml nodeCollection of ChapterURls
fix ParseImageUrlsFromHtml xPath
fix Chapterparsing names
2024-11-02 17:42:26 +01:00
657e1b338b resolves #141 Asuratoon connector 2024-11-02 17:19:17 +01:00
ee265a7519 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-11-02 16:24:55 +01:00
5b0624654b rename duplicates to append ".duplicate" 2024-11-02 16:24:44 +01:00
a75549c699 Only try loading .json files on startup (exclude .failed for example) 2024-11-02 16:24:25 +01:00
f46244cb9c Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-10-31 20:43:11 +01:00
9db3f1b0da Extend logging on startup 2024-10-31 20:42:56 +01:00
dc9cd4b1dd Append ".failed" to job-files that weren't successfully added. 2024-10-31 20:41:46 +01:00
3566ad774d Moved logging to actually say if we added a job to the list 2024-10-31 20:41:21 +01:00
94b81969c7 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-10-30 22:40:31 +01:00
bd8cb86c52 Always set directory-permissions 2024-10-30 22:29:32 +01:00
34c5436b33 Always set directory-permissions 2024-10-30 22:29:16 +01:00
4690394437 Formatting 2024-10-30 22:27:55 +01:00
02cf8578c9 Explicitly set File/Directory permissions for jobs 2024-10-30 22:27:50 +01:00
067497ddd0 Delete duplicate files on startup. 2024-10-30 20:38:53 +01:00
4b88cdbd90 When updating Jobfiles, don't write a new file if we weren't able to successfully delete the old one 2024-10-30 20:31:16 +01:00
420013f07b Delete chapterMarkers if the file doesn't exist anymore. 2024-10-30 18:23:14 +01:00
8cee11aa22 Fix #272 Manhuaplus missing year string 2024-10-29 19:15:19 +01:00
198bbdcf94 Set hidden Attribute to Markerfiles 2024-10-27 02:58:50 +02:00
c58adf64fa #271 Create Marker-files for Chapters.
If a Connector provides a unique ID for a chapter, Tranga will create a marker file containing the current name of the chapter.
This should prevent duplicates or missing chapters.
2024-10-27 02:41:28 +02:00
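The Chapter.cs diff below shows the actual implementation; condensed, the idea is a hidden marker file named after the connector-supplied chapter ID, stored next to the archives and containing the current archive path, so a renamed chapter can still be matched instead of being re-downloaded. A stripped-down sketch:

```csharp
using System.IO;

public static class ChapterMarkerSketch
{
    // Write (or refresh) the marker: a hidden ".<chapterId>" file that records where
    // the chapter's .cbz archive currently lives.
    public static void WriteMarker(string mangaDirectory, string chapterId, string archivePath)
    {
        string markerPath = Path.Combine(mangaDirectory, $".{chapterId}");
        File.WriteAllText(markerPath, archivePath);
        File.SetAttributes(markerPath, FileAttributes.Hidden);
    }

    // Resolve the archive via the marker; a stale marker (archive gone) is treated as missing.
    public static string? ResolveArchive(string mangaDirectory, string chapterId)
    {
        string markerPath = Path.Combine(mangaDirectory, $".{chapterId}");
        if (!File.Exists(markerPath))
            return null;
        string recordedPath = File.ReadAllText(markerPath);
        return File.Exists(recordedPath) ? recordedPath : null;
    }
}
```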
957debea01 Mangahere change list-2 to list-1 in selector 2024-10-27 02:22:58 +02:00
5186ae66c9 Merge pull request #270 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.7.1
Bump docker/setup-buildx-action from 3.6.1 to 3.7.1
2024-10-23 16:11:06 +02:00
c35e1ef517 Merge pull request #269 from C9Glax/dependabot/github_actions/docker/build-push-action-6.9.0
Bump docker/build-push-action from 6.7.0 to 6.9.0
2024-10-23 16:10:52 +02:00
8f6891142b Bump docker/setup-buildx-action from 3.6.1 to 3.7.1
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.6.1 to 3.7.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.6.1...v3.7.1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-23 05:49:09 +00:00
b52e6d4908 Bump docker/build-push-action from 6.7.0 to 6.9.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.7.0 to 6.9.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.7.0...v6.9.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-23 05:49:07 +00:00
30c44760e7 Merge pull request #256 from C9Glax/cuttingedge-merge-candidate
Cuttingedge merge candidate
2024-09-29 01:13:56 +02:00
a3ae3c320d Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-candidate 2024-09-29 01:07:59 +02:00
ea262889e6 It's late. Set TARGETPLATFORM in base 2024-09-29 01:02:50 +02:00
445542b653 Set --platform to BUILDPLATFORM for dotnet 2024-09-29 00:58:24 +02:00
b7718220ef Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-candidate 2024-09-29 00:54:28 +02:00
34c62e8658 Remove cache step from cuttingedge workflow, set --platform to TARGETPLATFORM instead 2024-09-29 00:50:53 +02:00
a9fcc93670 Merge pull request #257 from C9Glax/master
Update docker-image-cuttingedge.yml
2024-09-29 00:44:17 +02:00
68d7ef258f Update docker-image-cuttingedge.yml
Clear Cache on build
2024-09-29 00:40:59 +02:00
fdea4f5ea5 Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 17:09:19 +02:00
ac3039e587 Add Star-Graph to README 2024-09-27 17:08:59 +02:00
3829a1cf26 Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-candidate 2024-09-27 15:03:51 +02:00
c3daa0b751 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 15:03:44 +02:00
3a072beea3 Update Readme:
* Fix dotnet Version
* Link directly to new issue for new Connectors
* Add Ntfy as Notification Connector
* Remove Roadmap
2024-09-27 15:03:06 +02:00
8e6f2798a9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge-merge-candidate 2024-09-27 14:58:07 +02:00
9cbde9a6b4 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 14:57:57 +02:00
0870aa9fdb Merge branch 'refs/heads/master' into cuttingedge-merge-ServerV2 2024-09-27 14:57:36 +02:00
172650e644 Merge pull request #254 from C9Glax/cuttingedge-merge-candidate
Cuttingedge merge candidate
2024-09-27 14:53:24 +02:00
52ff2e54a8 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 14:51:11 +02:00
61d80a93cf Fix #255 MangaKatana sanitization. 2024-09-27 14:50:57 +02:00
7be3ee52e9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-23 15:40:53 +02:00
981eb0fd9f Fix notification batching:
Do not resend old notifications.
2024-09-23 15:40:43 +02:00
47f3044a6d Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-22 00:15:59 +02:00
6d03cc5f8d Fix incorrect setting check for notificationsbuffer 2024-09-22 00:15:50 +02:00
290c405f52 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-22 00:09:54 +02:00
fcdbd32872 Include the number of notifications of each type in the title 2024-09-22 00:09:45 +02:00
eb6c37cc53 Output settings.json on startup 2024-09-22 00:05:09 +02:00
d922842186 Add NotificationBuffer, so Notifications are not spammed on every chapter. 2024-09-22 00:02:43 +02:00
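Taken together with the commits above ("Fix notification batching" and "Include the number of notifications of each type in the title"), the buffer queues per-chapter notifications and later flushes one summary per title with the count included. A rough sketch of that behaviour; the class and method names are illustrative, not Tranga's API:

```csharp
using System.Collections.Generic;
using System.Linq;

public class NotificationBufferSketch
{
    private readonly List<(string Title, string Text)> _buffer = new();

    // Queue a notification instead of sending it immediately.
    public void Add(string title, string text) => _buffer.Add((title, text));

    // Flush: one entry per title, with the count in the title, e.g. "Chapter downloaded (12)".
    public List<(string Title, string Text)> Flush()
    {
        var batched = _buffer
            .GroupBy(n => n.Title)
            .Select(g => ($"{g.Key} ({g.Count()})", string.Join("\n", g.Select(n => n.Text))))
            .ToList();
        _buffer.Clear();
        return batched;
    }
}
```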
69323d6d60 Add LibraryBuffer, so Libraries are not spammed with scans on every download. 2024-09-21 21:02:55 +02:00
46a0fb8c48 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-21 20:34:57 +02:00
ec8eb40941 Allow Versions to lose their volume number, if the site no longer lists it. 2024-09-21 20:30:55 +02:00
d2074fae35 Readable CheckChapterIsDownloaded check 2024-09-21 20:23:21 +02:00
713bbc230f Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-18 18:56:09 +02:00
32ab9a552f Also delete files on UpdateJobFile if we don't provide a filepath 2024-09-18 18:56:01 +02:00
c11c68d6d7 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-18 18:46:02 +02:00
09fdb6e5f1 Fix #250 old jobs getting re-exported. 2024-09-18 18:45:55 +02:00
e86ad03b1e Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-17 00:51:30 +02:00
9dfbe89e87 include --platform=$BUILDPLATFORM in Dockerfile 2024-09-17 00:51:22 +02:00
98e75af486 Merge branch 'cuttingedge' of ssh://git.bernloehr.eu:222/glax/Tranga into cuttingedge 2024-09-16 23:21:13 +02:00
e2f5c3badc Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 23:18:57 +02:00
cda07bb9aa Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 23:09:43 +02:00
7c18466e95 Fix NETSDK1194 on build 2024-09-16 23:09:34 +02:00
ce1c4d3f65 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 22:48:06 +02:00
52d0489a1b Fix duplicate mangas on startup 2024-09-16 22:47:55 +02:00
f89aea6ac8 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 21:19:27 +02:00
5f05ba1049 Make SupportedLanguages public. 2024-09-16 21:19:19 +02:00
a20ee01cfa Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 21:17:18 +02:00
cf5cbba9a8 #247 Add supported languages to Mangaconnectors 2024-09-16 21:17:07 +02:00
600b56033d Upgrade to Dotnet 8.0 LangVer 12 2024-09-16 21:11:50 +02:00
fdea3659f1 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 20:38:19 +02:00
7f3754fb64 Fix startup issue/issue with existing chapters: ProgressToken would not complete 2024-09-16 20:36:40 +02:00
2dac5db4da Create a single Chromium instance that is shared between all Connectors.
Fix pages staying open when a page could not be loaded.
2024-09-16 20:30:23 +02:00
3456fc6564 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 19:52:39 +02:00
35f2625f05 Fix #249 Manhuaplus where author/tags are not set. 2024-09-16 19:52:25 +02:00
0b9948e367 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 18:32:45 +02:00
96f3dbce65 Throw more readable exceptions if deserialization fails for Mangaconnectors.
#249
2024-09-16 18:32:34 +02:00
895128a462 Merge remote-tracking branch 'origin/cuttingedge-merge-ServerV2' into cuttingedge-merge-ServerV2 2024-09-16 18:24:39 +02:00
a94186455b Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-11 14:41:35 +02:00
7d3deee74c Remove unused constant 2024-09-11 14:40:28 +02:00
5980b64caa Readable Chapter comparison 2024-09-11 14:40:03 +02:00
cbecb257ef Remove unused constant 2024-09-11 14:39:16 +02:00
8316ed08a7 Merge pull request #245 from C9Glax/cuttingedge
Prod didn't break, nice
2024-09-09 10:10:36 +02:00
9b8b80cd24 Fix response closed on OPTIONS request 2024-09-07 20:44:15 +02:00
15f3e2b8ec Use current time as internalId for Manga instead of BASE64 string of title
#232
Fix #237
2024-09-07 20:33:03 +02:00
2be29e4019 MangaDex: only download a single release per chapter.
Fix #219
2024-09-07 20:16:05 +02:00
59 changed files with 2218 additions and 3623 deletions

View File

@ -12,7 +12,7 @@ body:
- type: checkboxes
attributes:
label: Is the Website free to access?
description: We can't support pay-to-use sites.
description: We can't support pay-to-use sites, or captcha-proxied sites as Cloudflare.
options:
- label: The Website is freely accessible.
required: true

View File

@ -17,12 +17,12 @@ jobs:
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
uses: docker/setup-qemu-action@v3.6.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.6.1
uses: docker/setup-buildx-action@v3.11.0
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
@ -33,7 +33,7 @@ jobs:
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.7.0
uses: docker/build-push-action@v6.18.0
with:
context: ./
file: ./Dockerfile

View File

@ -1,45 +0,0 @@
name: Docker Image CI
on:
push:
branches: [ "dev" ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.6.1
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.7.0
with:
context: ./
file: ./Dockerfile
#platforms: linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6
platforms: linux/amd64,linux/arm64
pull: true
push: true
tags: |
glax/tranga-api:dev

View File

@ -17,12 +17,12 @@ jobs:
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
uses: docker/setup-qemu-action@v3.6.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.6.1
uses: docker/setup-buildx-action@v3.11.0
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
@ -33,7 +33,7 @@ jobs:
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.7.0
uses: docker/build-push-action@v6.18.0
with:
context: ./
file: ./Dockerfile

View File

@ -2,7 +2,7 @@ name: Docker Image CI
on:
push:
branches: [ "Server-V2" ]
branches: [ "postgres-Server-V2" ]
workflow_dispatch:
jobs:
@ -17,12 +17,12 @@ jobs:
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
uses: docker/setup-qemu-action@v3.6.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.6.1
uses: docker/setup-buildx-action@v3.11.0
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
@ -33,7 +33,7 @@ jobs:
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.7.0
uses: docker/build-push-action@v6.18.0
with:
context: ./
file: ./Dockerfile

.gitignore
View File

@ -20,8 +20,6 @@ riderModule.iml
cover.jpg
cover.png
/.vscode
/.vs/
Tranga/Properties/launchSettings.json
/Manga
/settings
*.DotSettings.user

View File

@ -2,13 +2,14 @@
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net7.0</TargetFramework>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>12</LangVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Spectre.Console.Cli" Version="0.47.1-preview.0.11" />
<PackageReference Include="Spectre.Console.Cli" Version="0.49.1" />
</ItemGroup>
<ItemGroup>

View File

@ -1,16 +1,18 @@
# syntax=docker/dockerfile:1
ARG DOTNET=7.0
ARG DOTNET=8.0
FROM mcr.microsoft.com/dotnet/runtime:$DOTNET AS base
FROM --platform=$TARGETPLATFORM mcr.microsoft.com/dotnet/runtime:$DOTNET AS base
WORKDIR /publish
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium
ENV XDG_CONFIG_HOME=/tmp/.chromium
ENV XDG_CACHE_HOME=/tmp/.chromium
RUN apt-get update \
&& apt-get install -y libx11-6 libx11-xcb1 libatk1.0-0 libgtk-3-0 libcups2 libdrm2 libxkbcommon0 libxcomposite1 libxdamage1 libxrandr2 libgbm1 libpango-1.0-0 libcairo2 libasound2 libxshmfence1 libnss3 chromium \
&& apt-get autopurge -y \
&& apt-get autoclean -y
FROM mcr.microsoft.com/dotnet/sdk:$DOTNET AS build-env
FROM --platform=$BUILDPLATFORM mcr.microsoft.com/dotnet/sdk:$DOTNET AS build-env
WORKDIR /src
COPY Tranga.sln /src
@ -20,9 +22,9 @@ COPY Tranga/Tranga.csproj /src/Tranga/Tranga.csproj
RUN dotnet restore /src/Tranga.sln
COPY . /src/
RUN dotnet publish -c Release -o /publish -maxcpucount:1
RUN dotnet publish -c Release --property:OutputPath=/publish -maxcpucount:1
FROM base AS runtime
FROM --platform=$TARGETPLATFORM base AS runtime
EXPOSE 6531
ARG UNAME=tranga
ARG UID=1000

View File

@ -1,9 +1,10 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0</TargetFramework>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>12</LangVersion>
</PropertyGroup>
</Project>

View File

@ -1,8 +1,12 @@
# Testers for V2 wanted!
[Details](https://github.com/C9Glax/tranga/pull/355#issuecomment-2764217944)
<!-- PROJECT LOGO -->
<br />
<div align="center">
<h3 align="center">Tranga v2</h3>
<h3 align="center">Tranga</h3>
<p align="center">
Automatic Manga and Metadata downloader
@ -45,23 +49,23 @@ Tranga can download Chapters and Metadata from "Scanlation" sites such as
- [MangaDex.org](https://mangadex.org/) (Multilingual)
- [Manganato.com](https://manganato.com/) (en)
- [Mangasee.com](https://mangasee123.com/) (en)
- [MangaKatana.com](https://mangakatana.com) (en)
- [Mangaworld.bz](https://www.mangaworld.bz/) (it)
- [Bato.to](https://bato.to/v3x) (en)
- [Manga4Life](https://manga4life.com) (en)
- [ManhuaPlus](https://manhuaplus.org/) (en)
- [MangaHere](https://www.mangahere.cc/) (en) (Their covers suck)
- ❓ Open an [issue](https://github.com/C9Glax/tranga/issues)
- [MangaHere](https://www.mangahere.cc/) (en) (Their covers aren't scrapeable.)
- [Weebcentral](https://weebcentral.com) (en)
- [Webtoons](https://www.webtoons.com/en/)
- ❓ Open an [issue](https://github.com/C9Glax/tranga/issues/new?assignees=&labels=New+Connector&projects=&template=new_connector.yml&title=%5BNew+Connector%5D%3A+)
and trigger a library-scan with [Komga](https://komga.org/) and [Kavita](https://www.kavitareader.com/).
Notifications can be sent to your devices using [Gotify](https://gotify.net/) and [LunaSea](https://www.lunasea.app/).
Notifications can be sent to your devices using [Gotify](https://gotify.net/), [LunaSea](https://www.lunasea.app/) or [Ntfy](https://ntfy.sh/).
### What this does and doesn't do
Tranga (this git-repo) will open a port (standard 6531) and listen for requests to add Jobs to Monitor and/or download specific Manga.
The configuration is all done through HTTP-Requests. [Documentation](docs/API_Calls_v2.md)
The configuration is all done through HTTP-Requests.
_**For a web-frontend use [tranga-website](https://github.com/C9Glax/tranga-website).**_
This project downloads the images for a Manga from the specified Scanlation-Website and packages them with some metadata - from that same website - in a .cbz-archive (per chapter).
@ -93,6 +97,16 @@ That is why I wanted to create my own project, in a language I understand, and t
<p align="right">(<a href="#readme-top">back to top</a>)</p>
## Star History
<a href="https://star-history.com/#c9glax/tranga&Date">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=c9glax/tranga&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=c9glax/tranga&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=c9glax/tranga&type=Date" />
</picture>
</a>
<!-- GETTING STARTED -->
## Getting Started
@ -108,14 +122,9 @@ access the folder.
### Prerequisites
#### To Build
[.NET-Core 7.0 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/7.0)
[.NET-Core 8.0 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/8.0)
#### To Run
[.NET-Core 7.0 Runtime](https://dotnet.microsoft.com/en-us/download/dotnet/7.0) scroll down a bit, should be on the right the second item.
<!-- ROADMAP -->
## Roadmap
- [ ]
[.NET-Core 8.0 Runtime](https://dotnet.microsoft.com/en-us/download/dotnet/8.0) scroll down a bit, should be on the right the second item.
See the [open issues](https://github.com/C9Glax/tranga/issues) for a full list of proposed features (and known issues).

View File

@ -1,5 +1,7 @@
using System.Text.RegularExpressions;
using System.Runtime.InteropServices;
using System.Text.RegularExpressions;
using System.Xml.Linq;
using static System.IO.UnixFileMode;
namespace Tranga;
@ -12,33 +14,37 @@ public readonly struct Chapter : IComparable
// ReSharper disable once MemberCanBePrivate.Global
public Manga parentManga { get; }
public string? name { get; }
public string volumeNumber { get; }
public string chapterNumber { get; }
public float volumeNumber { get; }
public float chapterNumber { get; }
public string url { get; }
// ReSharper disable once MemberCanBePrivate.Global
public string fileName { get; }
public string? id { get; }
private static readonly Regex LegalCharacters = new (@"([A-z]*[0-9]* *\.*-*,*\]*\[*'*\'*\)*\(*~*!*)*");
private static readonly Regex IllegalStrings = new(@"(Vol(ume)?|Ch(apter)?)\.?", RegexOptions.IgnoreCase);
private static readonly Regex Digits = new(@"[0-9\.]*");
public Chapter(Manga parentManga, string? name, string? volumeNumber, string chapterNumber, string url)
public Chapter(Manga parentManga, string? name, string? volumeNumber, string chapterNumber, string url, string? id = null)
: this(parentManga, name, float.Parse(volumeNumber??"0", GlobalBase.numberFormatDecimalPoint),
float.Parse(chapterNumber, GlobalBase.numberFormatDecimalPoint), url, id)
{
}
public Chapter(Manga parentManga, string? name, float? volumeNumber, float chapterNumber, string url, string? id = null)
{
this.parentManga = parentManga;
this.name = name;
this.volumeNumber = volumeNumber is not null ? string.Concat(Digits.Matches(volumeNumber).Select(x => x.Value)) : "0";
this.chapterNumber = string.Concat(Digits.Matches(chapterNumber).Select(x => x.Value));
this.volumeNumber = volumeNumber??0;
this.chapterNumber = chapterNumber;
this.url = url;
this.id = id;
string chapterVolNumStr;
if (volumeNumber is not null && volumeNumber.Length > 0)
chapterVolNumStr = $"Vol.{volumeNumber} Ch.{chapterNumber}";
else
chapterVolNumStr = $"Ch.{chapterNumber}";
string chapterVolNumStr = $"Vol.{this.volumeNumber} Ch.{chapterNumber}";
if (name is not null && name.Length > 0)
{
string chapterName = IllegalStrings.Replace(string.Concat(LegalCharacters.Matches(name)), "");
this.fileName = $"{chapterVolNumStr} - {chapterName}";
this.fileName = chapterName.Length > 0 ? $"{chapterVolNumStr} - {chapterName}" : chapterVolNumStr;
}
else
this.fileName = chapterVolNumStr;
@ -58,29 +64,14 @@ public readonly struct Chapter : IComparable
public int CompareTo(object? obj)
{
if (obj is Chapter otherChapter)
{
if (float.TryParse(volumeNumber, GlobalBase.numberFormatDecimalPoint, out float volumeNumberFloat) &&
float.TryParse(chapterNumber, GlobalBase.numberFormatDecimalPoint, out float chapterNumberFloat) &&
float.TryParse(otherChapter.volumeNumber, GlobalBase.numberFormatDecimalPoint,
out float otherVolumeNumberFloat) &&
float.TryParse(otherChapter.chapterNumber, GlobalBase.numberFormatDecimalPoint,
out float otherChapterNumberFloat))
{
switch (volumeNumberFloat.CompareTo(otherVolumeNumberFloat))
{
case < 0:
return -1;
case > 0:
return 1;
default:
return chapterNumberFloat.CompareTo(otherChapterNumberFloat);
}
}
else throw new FormatException($"Value could not be parsed");
}
if(obj is not Chapter otherChapter)
throw new ArgumentException($"{obj} can not be compared to {this}");
return volumeNumber.CompareTo(otherChapter.volumeNumber) switch
{
<0 => -1,
>0 => 1,
_ => chapterNumber.CompareTo(otherChapter.chapterNumber)
};
}
/// <summary>
@ -89,25 +80,56 @@ public readonly struct Chapter : IComparable
/// <returns>true if chapter is present</returns>
internal bool CheckChapterIsDownloaded()
{
if (!Directory.Exists(Path.Join(TrangaSettings.downloadLocation, parentManga.folderName)))
string mangaDirectory = Path.Join(TrangaSettings.downloadLocation, parentManga.folderName);
if (!Directory.Exists(mangaDirectory))
return false;
FileInfo[] archives = new DirectoryInfo(Path.Join(TrangaSettings.downloadLocation, parentManga.folderName)).GetFiles().Where(file => file.Name.Split('.')[^1] == "cbz").ToArray();
Regex volChRex = new(@"(?:Vol(?:ume)?\.([0-9]+)\D*)?Ch(?:apter)?\.([0-9]+(?:\.[0-9]+)*)");
FileInfo? mangaArchive = null;
string markerPath = Path.Join(mangaDirectory, $".{id}");
if (this.id is not null && File.Exists(markerPath))
{
if(File.Exists(File.ReadAllText(markerPath)))
mangaArchive = new FileInfo(File.ReadAllText(markerPath));
else
File.Delete(markerPath);
}
if(mangaArchive is null)
{
FileInfo[] archives = new DirectoryInfo(mangaDirectory).GetFiles("*.cbz");
Regex volChRex = new(@"(?:Vol(?:ume)?\.([0-9]+)\D*)?Ch(?:apter)?\.([0-9]+(?:\.[0-9]+)*)(?: - (.*))?.cbz");
Chapter t = this;
string thisPath = GetArchiveFilePath();
FileInfo? archive = archives.FirstOrDefault(archive =>
mangaArchive = archives.FirstOrDefault(archive =>
{
Match m = volChRex.Match(archive.Name);
string archiveVolNum = m.Groups[1].Success ? m.Groups[1].Value : "0";
string archiveChNum = m.Groups[2].Value;
return archiveVolNum == t.volumeNumber && archiveChNum == t.chapterNumber ||
archiveVolNum == "0" && archiveChNum == t.chapterNumber;
/*
* 1. If the volumeNumber is not present in the filename, it is not checked.
* 2. Check the chapterNumber in the chapter against the one in the filename.
* 3. The chpaterName has to either be absent both in the chapter and the filename or match.
*/
return (!m.Groups[1].Success || m.Groups[1].Value == t.volumeNumber.ToString(GlobalBase.numberFormatDecimalPoint)) &&
m.Groups[2].Value == t.chapterNumber.ToString(GlobalBase.numberFormatDecimalPoint) &&
((!m.Groups[3].Success && string.IsNullOrEmpty(t.name)) || m.Groups[3].Value == t.name);
});
if(archive is not null && thisPath != archive.FullName)
archive.MoveTo(thisPath, true);
return archive is not null;
}
string correctPath = GetArchiveFilePath();
if(mangaArchive is not null && mangaArchive.FullName != correctPath)
mangaArchive.MoveTo(correctPath, true);
return (mangaArchive is not null);
}
public void CreateChapterMarker()
{
if (this.id is null)
return;
string path = Path.Join(TrangaSettings.downloadLocation, parentManga.folderName, $".{id}");
File.WriteAllText(path, GetArchiveFilePath());
File.SetAttributes(path, FileAttributes.Hidden);
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(path, UserRead | UserWrite | UserExecute | GroupRead | GroupWrite | GroupExecute | OtherRead | OtherExecute);
}
/// <summary>
/// Creates full file path of chapter-archive
/// </summary>

View File

@ -1,11 +1,8 @@
using System.Globalization;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Logging;
using Newtonsoft.Json;
using Tranga.LibraryConnectors;
using Tranga.MangaConnectors;
using Tranga.NotificationConnectors;
namespace Tranga;
@ -17,7 +14,6 @@ public abstract class GlobalBase
protected HashSet<NotificationConnector> notificationConnectors { get; init; }
protected HashSet<LibraryConnector> libraryConnectors { get; init; }
private Dictionary<string, Manga> cachedPublications { get; init; }
protected HashSet<MangaConnector> _connectors;
public static readonly NumberFormatInfo numberFormatDecimalPoint = new (){ NumberDecimalSeparator = "." };
protected static readonly Regex baseUrlRex = new(@"https?:\/\/[0-9A-z\.-]+(:[0-9]+)?");
@ -27,7 +23,6 @@ public abstract class GlobalBase
this.notificationConnectors = clone.notificationConnectors;
this.libraryConnectors = clone.libraryConnectors;
this.cachedPublications = clone.cachedPublications;
this._connectors = clone._connectors;
}
protected GlobalBase(Logger? logger)
@ -36,7 +31,15 @@ public abstract class GlobalBase
this.notificationConnectors = TrangaSettings.LoadNotificationConnectors(this);
this.libraryConnectors = TrangaSettings.LoadLibraryConnectors(this);
this.cachedPublications = new();
this._connectors = new();
}
protected void AddMangaToCache(Manga manga)
{
if (!this.cachedPublications.TryAdd(manga.internalId, manga))
{
Log($"Overwriting Manga {manga.internalId}");
this.cachedPublications[manga.internalId] = manga;
}
}
protected Manga? GetCachedManga(string internalId)
@ -48,71 +51,9 @@ public abstract class GlobalBase
};
}
protected IEnumerable<Manga> GetAllCachedManga() => cachedPublications.Values;
protected void AddMangaToCache(Manga manga)
protected IEnumerable<Manga> GetAllCachedManga()
{
if (!cachedPublications.TryAdd(manga.internalId, manga))
{
Log($"Overwriting Manga {manga.internalId}");
cachedPublications[manga.internalId] = manga;
}
ExportManga();
}
protected void RemoveMangaFromCache(Manga manga) => RemoveMangaFromCache(manga.internalId);
protected void RemoveMangaFromCache(string internalId)
{
cachedPublications.Remove(internalId);
ExportManga();
}
internal void ImportManga()
{
string folder = TrangaSettings.mangaCacheFolderPath;
Directory.CreateDirectory(folder);
foreach (FileInfo fileInfo in new DirectoryInfo(folder).GetFiles())
{
string content = File.ReadAllText(fileInfo.FullName);
try
{
Manga m = JsonConvert.DeserializeObject<Manga>(content, new MangaConnectorJsonConverter(this, _connectors));
this.cachedPublications.TryAdd(m.internalId, m);
}
catch (JsonException e)
{
Log($"Error parsing Manga {fileInfo.Name}:\n{e.Message}");
}
}
}
private static bool ExportRunning = false;
private void ExportManga()
{
while (ExportRunning)
Thread.Sleep(1);
ExportRunning = true;
string folder = TrangaSettings.mangaCacheFolderPath;
Directory.CreateDirectory(folder);
Manga[] copy = new Manga[cachedPublications.Values.Count];
cachedPublications.Values.CopyTo(copy, 0);
foreach (Manga manga in copy)
{
string content = JsonConvert.SerializeObject(manga, Formatting.Indented);
string filePath = Path.Combine(folder, $"{manga.internalId}.json");
File.WriteAllText(filePath, content, Encoding.UTF8);
}
foreach (FileInfo fileInfo in new DirectoryInfo(folder).GetFiles())
{
if(!cachedPublications.Keys.Any(key => fileInfo.Name.Substring(0, fileInfo.Name.LastIndexOf('.')).Equals(key)))
fileInfo.Delete();
}
ExportRunning = false;
return cachedPublications.Values;
}
protected void Log(string message)
@ -125,10 +66,10 @@ public abstract class GlobalBase
Log(string.Format(fStr, replace));
}
protected void SendNotifications(string title, string text)
protected void SendNotifications(string title, string text, bool buffer = false)
{
foreach (NotificationConnector nc in notificationConnectors)
nc.SendNotification(title, text);
nc.SendNotification(title, text, buffer);
}
protected void AddNotificationConnector(NotificationConnector notificationConnector)

View File

@ -7,12 +7,12 @@ public class DownloadChapter : Job
{
public Chapter chapter { get; init; }
public DownloadChapter(GlobalBase clone, Chapter chapter, DateTime lastExecution, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, lastExecution, parentJobId: parentJobId)
public DownloadChapter(GlobalBase clone, MangaConnector connector, Chapter chapter, DateTime lastExecution, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, connector, lastExecution, parentJobId: parentJobId)
{
this.chapter = chapter;
}
public DownloadChapter(GlobalBase clone, Chapter chapter, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, parentJobId: parentJobId)
public DownloadChapter(GlobalBase clone, MangaConnector connector, Chapter chapter, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, connector, parentJobId: parentJobId)
{
this.chapter = chapter;
}
@ -37,22 +37,18 @@ public class DownloadChapter : Job
if (success == HttpStatusCode.OK)
{
UpdateLibraries();
SendNotifications("Chapter downloaded", $"{chapter.parentManga.sortName} - {chapter.chapterNumber}");
SendNotifications("Chapter downloaded", $"{chapter.parentManga.sortName} - {chapter.chapterNumber}", true);
}
});
downloadTask.Start();
return Array.Empty<Job>();
}
protected override MangaConnector GetMangaConnector()
{
return chapter.parentManga.mangaConnector;
}
public override bool Equals(object? obj)
{
if (obj is not DownloadChapter otherJob)
return false;
return otherJob.chapter.Equals(this.chapter);
return otherJob.mangaConnector == this.mangaConnector &&
otherJob.chapter.Equals(this.chapter);
}
}

View File

@ -1,29 +1,29 @@
using Newtonsoft.Json;
using Tranga.MangaConnectors;
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class DownloadNewChapters : Job
{
public string mangaInternalId { get; set; }
[JsonIgnore] private Manga? manga => GetCachedManga(mangaInternalId);
public Manga manga { get; set; }
public string translatedLanguage { get; init; }
public DownloadNewChapters(GlobalBase clone, string mangaInternalId, DateTime lastExecution, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base(clone, JobType.DownloadNewChaptersJob, lastExecution, recurring, recurrence, parentJobId)
public DownloadNewChapters(GlobalBase clone, MangaConnector connector, Manga manga, DateTime lastExecution,
bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base(clone, JobType.DownloadNewChaptersJob, connector, lastExecution, recurring,
recurrence, parentJobId)
{
this.mangaInternalId = mangaInternalId;
this.manga = manga;
this.translatedLanguage = translatedLanguage;
}
public DownloadNewChapters(GlobalBase clone, MangaConnector connector, string mangaInternalId, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base (clone, JobType.DownloadNewChaptersJob, recurring, recurrence, parentJobId)
public DownloadNewChapters(GlobalBase clone, MangaConnector connector, Manga manga, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base (clone, JobType.DownloadNewChaptersJob, connector, recurring, recurrence, parentJobId)
{
this.mangaInternalId = mangaInternalId;
this.manga = manga;
this.translatedLanguage = translatedLanguage;
}
protected override string GetId()
{
return $"{GetType()}-{mangaInternalId}";
return $"{GetType()}-{manga.internalId}";
}
public override string ToString()
@ -33,38 +33,27 @@ public class DownloadNewChapters : Job
protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
{
if (manga is null)
{
Log($"Manga {mangaInternalId} is missing! Can not execute job.");
return Array.Empty<Job>();
}
manga.Value.SaveSeriesInfoJson();
Chapter[] chapters = manga.Value.mangaConnector.GetNewChapters(manga.Value, this.translatedLanguage);
manga.SaveSeriesInfoJson();
Chapter[] chapters = mangaConnector.GetNewChapters(manga, this.translatedLanguage);
this.progressToken.increments = chapters.Length;
List<Job> jobs = new();
manga.Value.mangaConnector.CopyCoverFromCacheToDownloadLocation(manga.Value);
mangaConnector.CopyCoverFromCacheToDownloadLocation(manga);
foreach (Chapter chapter in chapters)
{
DownloadChapter downloadChapterJob = new(this, chapter, parentJobId: this.id);
DownloadChapter downloadChapterJob = new(this, this.mangaConnector, chapter, parentJobId: this.id);
jobs.Add(downloadChapterJob);
}
UpdateMetadata updateMetadataJob = new(this, mangaInternalId, parentJobId: this.id);
UpdateMetadata updateMetadataJob = new(this, this.mangaConnector, this.manga, parentJobId: this.id);
jobs.Add(updateMetadataJob);
progressToken.Complete();
return jobs;
}
protected override MangaConnector GetMangaConnector()
{
if (manga is null)
throw new Exception($"Missing Manga {mangaInternalId}");
return manga.Value.mangaConnector;
}
public override bool Equals(object? obj)
{
if (obj is not DownloadNewChapters otherJob)
return false;
return otherJob.manga.Equals(this.manga);
return otherJob.mangaConnector == this.mangaConnector &&
otherJob.manga.publicationId == this.manga.publicationId;
}
}
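The job above fans out into one DownloadChapter per new chapter plus a single UpdateMetadata job. A compact sketch of that fan-out shape, using illustrative types rather than the project's job model:

    using System;
    using System.Collections.Generic;

    class FanOutSketch
    {
        abstract class Job { }
        class DownloadChapterJob : Job { public string Chapter = ""; }
        class UpdateMetadataJob : Job { }

        // One child download job per new chapter, plus a single metadata refresh job,
        // mirroring the fan-out shape of ExecuteReturnSubTasksInternal above.
        static IEnumerable<Job> FanOut(string[] newChapters)
        {
            List<Job> jobs = new();
            foreach (string chapter in newChapters)
                jobs.Add(new DownloadChapterJob { Chapter = chapter });
            jobs.Add(new UpdateMetadataJob());
            return jobs;
        }

        static void Main()
        {
            foreach (Job job in FanOut(new[] { "12", "13", "13.5" }))
                Console.WriteLine(job is DownloadChapterJob d ? $"download chapter {d.Chapter}" : "update metadata");
        }
    }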

View File

@ -4,6 +4,7 @@ namespace Tranga.Jobs;
public abstract class Job : GlobalBase
{
public MangaConnector mangaConnector { get; init; }
public ProgressToken progressToken { get; private set; }
public bool recurring { get; init; }
public TimeSpan? recurrenceTime { get; set; }
@ -12,15 +13,14 @@ public abstract class Job : GlobalBase
public string id => GetId();
internal IEnumerable<Job>? subJobs { get; private set; }
public string? parentJobId { get; init; }
public enum JobType : byte { DownloadChapterJob, DownloadNewChaptersJob, UpdateMetaDataJob, MonitorManga }
public MangaConnector mangaConnector => GetMangaConnector();
public enum JobType : byte { DownloadChapterJob, DownloadNewChaptersJob, UpdateMetaDataJob }
public JobType jobType;
internal Job(GlobalBase clone, JobType jobType, bool recurring = false, TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
internal Job(GlobalBase clone, JobType jobType, MangaConnector connector, bool recurring = false, TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
{
this.jobType = jobType;
this.mangaConnector = connector;
this.progressToken = new ProgressToken(0);
this.recurring = recurring;
if (recurring && recurrenceTime is null)
@ -31,10 +31,11 @@ public abstract class Job : GlobalBase
this.parentJobId = parentJobId;
}
internal Job(GlobalBase clone, JobType jobType, DateTime lastExecution, bool recurring = false,
internal Job(GlobalBase clone, JobType jobType, MangaConnector connector, DateTime lastExecution, bool recurring = false,
TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
{
this.jobType = jobType;
this.mangaConnector = connector;
this.progressToken = new ProgressToken(0);
this.recurring = recurring;
if (recurring && recurrenceTime is null)
@ -94,6 +95,4 @@ public abstract class Job : GlobalBase
}
protected abstract IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss);
protected abstract MangaConnector GetMangaConnector();
}

View File

@ -1,6 +1,9 @@
using System.Text.RegularExpressions;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using Tranga.MangaConnectors;
using static System.IO.UnixFileMode;
namespace Tranga.Jobs;
@ -17,7 +20,7 @@ public class JobBoss : GlobalBase
Log($"Next job in {jobs.MinBy(job => job.nextExecution)?.nextExecution.Subtract(DateTime.Now)} {jobs.MinBy(job => job.nextExecution)?.id}");
}
public bool AddJob(Job job)
public bool AddJob(Job job, string? jobFile = null)
{
if (ContainsJobLike(job))
{
@ -26,11 +29,12 @@ public class JobBoss : GlobalBase
}
else
{
if (!this.jobs.Add(job))
return false;
Log($"Added {job}");
this.jobs.Add(job);
UpdateJobFile(job);
return true;
UpdateJobFile(job, jobFile);
}
return true;
}
public void AddJobs(IEnumerable<Job> jobsToAdd)
@ -67,9 +71,11 @@ public class JobBoss : GlobalBase
RemoveJob(job);
}
public IEnumerable<Job> GetJobsLike(string? internalId = null, string? chapterNumber = null)
public IEnumerable<Job> GetJobsLike(string? connectorName = null, string? internalId = null, float? chapterNumber = null)
{
IEnumerable<Job> ret = this.jobs;
if (connectorName is not null)
ret = ret.Where(job => job.mangaConnector.name == connectorName);
if (internalId is not null && chapterNumber is not null)
ret = ret.Where(jjob =>
@ -77,25 +83,25 @@ public class JobBoss : GlobalBase
if (jjob is not DownloadChapter job)
return false;
return job.chapter.parentManga.internalId == internalId &&
job.chapter.chapterNumber == chapterNumber;
job.chapter.chapterNumber.Equals(chapterNumber);
});
else if (internalId is not null)
ret = ret.Where(jjob =>
{
if (jjob is not DownloadNewChapters job)
return false;
return job.mangaInternalId == internalId;
return job.manga.internalId == internalId;
});
return ret;
}
public IEnumerable<Job> GetJobsLike(Manga? publication = null,
public IEnumerable<Job> GetJobsLike(MangaConnector? mangaConnector = null, Manga? publication = null,
Chapter? chapter = null)
{
if (chapter is not null)
return GetJobsLike(chapter.Value.parentManga.internalId, chapter.Value.chapterNumber);
return GetJobsLike(mangaConnector?.name, chapter.Value.parentManga.internalId, chapter.Value.chapterNumber);
else
return GetJobsLike(publication?.internalId);
return GetJobsLike(mangaConnector?.name, publication?.internalId);
}
public Job? GetJobById(string jobId)
@ -143,31 +149,42 @@ public class JobBoss : GlobalBase
if (!Directory.Exists(TrangaSettings.jobsFolderPath)) //No jobs to load
{
Directory.CreateDirectory(TrangaSettings.jobsFolderPath);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(TrangaSettings.jobsFolderPath, UserRead | UserWrite | UserExecute | GroupRead | OtherRead);
return;
}
Regex idRex = new (@"(.*)\.json");
//Load json-job-files
foreach (FileInfo file in new DirectoryInfo(TrangaSettings.jobsFolderPath).EnumerateFiles().Where(fileInfo => idRex.IsMatch(fileInfo.Name)))
foreach (FileInfo file in Directory.GetFiles(TrangaSettings.jobsFolderPath, "*.json").Select(f => new FileInfo(f)))
{
Log($"Adding {file.Name}");
try
{
Job? job = JsonConvert.DeserializeObject<Job>(File.ReadAllText(file.FullName),
new JobJsonConverter(this, new MangaConnectorJsonConverter(this, connectors)));
if (job is null)
{
string newName = file.FullName + ".failed";
Log($"Failed loading file {file.Name}.\nMoving to {newName}");
File.Move(file.FullName, newName);
}
else
{
Log($"Adding Job {job}");
this.jobs.Add(job);
}
}
if (job is null) throw new NullReferenceException();
//Load Manga-Files
ImportManga();
Log($"Adding Job {job}");
if (!AddJob(job, file.FullName)) //If we detect a duplicate, delete the file.
{
//string path = string.Concat(file.FullName, ".duplicate");
//file.MoveTo(path);
//Log($"Duplicate detected or otherwise not able to add job to list.\nMoved job {job} to {path}");
Log($"Duplicate detected or otherwise not able to add job to list. Removed the file {file.FullName} {job}");
}
}
catch (Exception e)
{
if (e is not UnreachableException or NullReferenceException)
throw;
Log(e.Message);
string newName = file.FullName + ".failed";
Log($"Failed loading file {file.Name}.\nMoving to {newName}.\n" +
$"If you think this is a bug, upload contents of the file to the Bugreport!");
File.Move(file.FullName, newName);
continue;
}
}
//Connect jobs to parent-jobs and add Publications to cache
foreach (Job job in this.jobs)
@ -179,65 +196,47 @@ public class JobBoss : GlobalBase
parentJob.AddSubJob(job);
Log($"Parent Job {parentJob}");
}
if (job is DownloadNewChapters dncJob)
AddMangaToCache(dncJob.manga);
}
string[] jobMangaInternalIds = this.jobs.Where(job => job is DownloadNewChapters)
.Select(dnc => ((DownloadNewChapters)dnc).mangaInternalId).ToArray();
jobMangaInternalIds = jobMangaInternalIds.Concat(
this.jobs.Where(job => job is UpdateMetadata)
.Select(dnc => ((UpdateMetadata)dnc).mangaInternalId)).ToArray();
string[] internalIds = GetAllCachedManga().Select(m => m.internalId).ToArray();
string[] extraneousIds = internalIds.Except(jobMangaInternalIds).ToArray();
foreach (string internalId in extraneousIds)
RemoveMangaFromCache(internalId);
string[] coverFiles = Directory.GetFiles(TrangaSettings.coverImageCache);
foreach(string fileName in coverFiles.Where(fileName => !GetAllCachedManga().Any(manga => manga.coverFileNameInCache == fileName)))
File.Delete(fileName);
string[] mangaFiles = Directory.GetFiles(TrangaSettings.mangaCacheFolderPath);
foreach(string fileName in mangaFiles.Where(fileName => !GetAllCachedManga().Any(manga => fileName.Split('.')[0] == manga.internalId)))
File.Delete(fileName);
}
internal void UpdateJobFile(Job job, string? oldFile = null)
{
string newJobFilePath = Path.Join(TrangaSettings.jobsFolderPath, $"{job.id}.json");
string oldFilePath = oldFile??Path.Join(TrangaSettings.jobsFolderPath, $"{job.id}.json");
if (!this.jobs.Any(jjob => jjob.id == job.id))
//Delete old file
if (File.Exists(oldFilePath))
{
Log($"Deleting Job-file {oldFilePath}");
try
{
Log($"Deleting Job-file {newJobFilePath}");
while(IsFileInUse(newJobFilePath))
while(IsFileInUse(oldFilePath))
Thread.Sleep(10);
File.Delete(newJobFilePath);
File.Delete(oldFilePath);
}
catch (Exception e)
{
Log(e.ToString());
Log($"Error deleting {oldFilePath} job {job.id}\n{e}");
return; //Don't export a new file when we haven't actually deleted the old one
}
}
else
//Export job (in new file) if it is still in our jobs list
if (GetJobById(job.id) is not null)
{
Log($"Exporting Job {newJobFilePath}");
string jobStr = JsonConvert.SerializeObject(job, Formatting.Indented);
while(IsFileInUse(newJobFilePath))
Thread.Sleep(10);
File.WriteAllText(newJobFilePath, jobStr);
}
if(oldFile is not null)
try
{
Log($"Deleting old Job-file {oldFile}");
while(IsFileInUse(oldFile))
Thread.Sleep(10);
File.Delete(oldFile);
}
catch (Exception e)
{
Log(e.ToString());
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(newJobFilePath, UserRead | UserWrite | GroupRead | OtherRead);
}
}
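UpdateJobFile above deletes the stale per-job JSON (waiting while it is locked), re-exports the job, and widens the file mode on Linux. A self-contained sketch of that write path, with a naive lock probe standing in for the project's IsFileInUse helper:

    using System;
    using System.IO;
    using System.Runtime.InteropServices;
    using System.Threading;
    using Newtonsoft.Json;

    class JobFileSketch
    {
        // Naive stand-in for IsFileInUse: try to open the file exclusively.
        static bool IsFileLocked(string path)
        {
            if (!File.Exists(path)) return false;
            try { using FileStream fs = File.Open(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None); return false; }
            catch (IOException) { return true; }
        }

        static void WriteJobFile(object job, string oldPath, string newPath)
        {
            if (File.Exists(oldPath))
            {
                while (IsFileLocked(oldPath))
                    Thread.Sleep(10);
                File.Delete(oldPath);               // drop the stale file first
            }
            File.WriteAllText(newPath, JsonConvert.SerializeObject(job, Formatting.Indented));
            if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
                File.SetUnixFileMode(newPath, UnixFileMode.UserRead | UnixFileMode.UserWrite | UnixFileMode.GroupRead | UnixFileMode.OtherRead);
        }

        static void Main()
        {
            string dir = Directory.CreateTempSubdirectory("jobs-sketch").FullName;
            WriteJobFile(new { id = "example-job" }, Path.Combine(dir, "old.json"), Path.Combine(dir, "example-job.json"));
            Console.WriteLine(File.ReadAllText(Path.Combine(dir, "example-job.json")));
        }
    }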

View File

@ -24,31 +24,52 @@ public class JobJsonConverter : JsonConverter
{
JObject jo = JObject.Load(reader);
if(!jo.ContainsKey("jobType"))
throw new Exception();
return Enum.Parse<Job.JobType>(jo["jobType"]!.Value<byte>().ToString()) switch
if (jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.UpdateMetaDataJob)
{
Job.JobType.UpdateMetaDataJob => new UpdateMetadata(_clone,
jo.GetValue("mangaInternalId")!.Value<string>()!,
jo.GetValue("parentJobId")!.Value<string?>()),
Job.JobType.DownloadChapterJob => new DownloadChapter(this._clone,
jo.GetValue("chapter")!.ToObject<Chapter>(JsonSerializer.Create(new JsonSerializerSettings()
return new UpdateMetadata(this._clone,
jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
{
Converters = { this._mangaConnectorJsonConverter }
})),
DateTime.UnixEpoch,
jo.GetValue("parentJobId")!.Value<string?>()),
Job.JobType.DownloadNewChaptersJob => new DownloadNewChapters(this._clone,
jo.GetValue("mangaInternalId")!.Value<string>()!,
jo.GetValue("lastExecution") is {} le
Converters =
{
this._mangaConnectorJsonConverter
}
}))!,
jo.GetValue("manga")!.ToObject<Manga>(),
jo.GetValue("parentJobId")!.Value<string?>());
}else if ((jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.DownloadNewChaptersJob) || jo.ContainsKey("translatedLanguage"))//TODO change to jobType
{
DateTime lastExecution = jo.GetValue("lastExecution") is {} le
? le.ToObject<DateTime>()
: DateTime.UnixEpoch,
: DateTime.UnixEpoch; //TODO do null checks on all variables
return new DownloadNewChapters(this._clone,
jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
{
Converters =
{
this._mangaConnectorJsonConverter
}
}))!,
jo.GetValue("manga")!.ToObject<Manga>(),
lastExecution,
jo.GetValue("recurring")!.Value<bool>(),
jo.GetValue("recurrenceTime")!.ToObject<TimeSpan?>(),
jo.GetValue("parentJobId")!.Value<string?>()),
_ => throw new Exception()
};
jo.GetValue("parentJobId")!.Value<string?>());
}else if ((jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.DownloadChapterJob) || jo.ContainsKey("chapter"))//TODO change to jobType
{
return new DownloadChapter(this._clone,
jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
{
Converters =
{
this._mangaConnectorJsonConverter
}
}))!,
jo.GetValue("chapter")!.ToObject<Chapter>(),
DateTime.UnixEpoch,
jo.GetValue("parentJobId")!.Value<string?>());
}
throw new Exception();
}
public override bool CanWrite => false;
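The converter above loads the raw JObject, checks a discriminator (the jobType byte or a telltale property), and constructs the matching concrete job. A minimal Newtonsoft.Json sketch of the same discriminator pattern with made-up types:

    using System;
    using Newtonsoft.Json;
    using Newtonsoft.Json.Linq;

    abstract class Animal { public string Name = ""; }
    class Dog : Animal { }
    class Cat : Animal { }

    class AnimalConverter : JsonConverter
    {
        public override bool CanConvert(Type objectType) => objectType == typeof(Animal);
        public override bool CanWrite => false;

        public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
        {
            JObject jo = JObject.Load(reader);
            // Dispatch on a discriminator field, like the jobType byte above.
            return jo["kind"]!.Value<string>() switch
            {
                "dog" => new Dog { Name = jo["name"]!.Value<string>()! },
                "cat" => new Cat { Name = jo["name"]!.Value<string>()! },
                _ => throw new JsonSerializationException("Unknown kind")
            };
        }

        public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
            => throw new NotImplementedException();
    }

    class AnimalConverterDemo
    {
        static void Main()
        {
            Animal a = JsonConvert.DeserializeObject<Animal>("{\"kind\":\"dog\",\"name\":\"Rex\"}", new AnimalConverter())!;
            Console.WriteLine($"{a.GetType().Name}: {a.Name}");
        }
    }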

View File

@ -1,21 +1,19 @@
using System.Text.Json.Serialization;
using Tranga.MangaConnectors;
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class UpdateMetadata : Job
{
public string mangaInternalId { get; set; }
[JsonIgnore] private Manga? manga => GetCachedManga(mangaInternalId);
public Manga manga { get; set; }
public UpdateMetadata(GlobalBase clone, string mangaInternalId, string? parentJobId = null) : base(clone, JobType.UpdateMetaDataJob, parentJobId: parentJobId)
public UpdateMetadata(GlobalBase clone, MangaConnector connector, Manga manga, string? parentJobId = null) : base(clone, JobType.UpdateMetaDataJob, connector, parentJobId: parentJobId)
{
this.mangaInternalId = mangaInternalId;
this.manga = manga;
}
protected override string GetId()
{
return $"{GetType()}-{mangaInternalId}";
return $"{GetType()}-{manga.internalId}";
}
public override string ToString()
@ -25,14 +23,8 @@ public class UpdateMetadata : Job
protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
{
if (manga is null)
{
Log($"Manga {mangaInternalId} is missing! Can not execute job.");
return Array.Empty<Job>();
}
//Retrieve new Metadata
Manga? possibleUpdatedManga = mangaConnector.GetMangaFromId(manga.Value.publicationId);
Manga? possibleUpdatedManga = mangaConnector.GetMangaFromId(manga.publicationId);
if (possibleUpdatedManga is { } updatedManga)
{
if (updatedManga.Equals(this.manga)) //Check if anything changed
@ -41,9 +33,26 @@ public class UpdateMetadata : Job
return Array.Empty<Job>();
}
AddMangaToCache(manga.Value.WithMetadata(updatedManga));
this.manga.Value.SaveSeriesInfoJson(true);
this.mangaConnector.CopyCoverFromCacheToDownloadLocation((Manga)manga);
this.manga = manga.WithMetadata(updatedManga);
this.manga.SaveSeriesInfoJson(true);
this.mangaConnector.CopyCoverFromCacheToDownloadLocation(manga);
foreach (Job job in jobBoss.GetJobsLike(publication: this.manga))
{
string oldFile;
if (job is DownloadNewChapters dc)
{
oldFile = dc.id;
dc.manga = this.manga;
}
else if (job is UpdateMetadata um)
{
oldFile = um.id;
um.manga = this.manga;
}
else
continue;
jobBoss.UpdateJobFile(job, oldFile);
}
this.progressToken.Complete();
}
else
@ -56,18 +65,12 @@ public class UpdateMetadata : Job
return Array.Empty<Job>();
}
protected override MangaConnector GetMangaConnector()
{
if (manga is null)
throw new Exception($"Missing Manga {mangaInternalId}");
return manga.Value.mangaConnector;
}
public override bool Equals(object? obj)
{
if (obj is not UpdateMetadata otherJob)
return false;
return otherJob.manga.Equals(this.manga);
return otherJob.mangaConnector == this.mangaConnector &&
otherJob.manga.publicationId == this.manga.publicationId;
}
}

View File

@ -61,7 +61,7 @@ public class Kavita : LibraryConnector
return "";
}
public override void UpdateLibrary()
protected override void UpdateLibraryInternal()
{
Log("Updating libraries.");
foreach (KavitaLibrary lib in GetLibraries())

View File

@ -25,7 +25,7 @@ public class Komga : LibraryConnector
return $"Komga {baseUrl}";
}
public override void UpdateLibrary()
protected override void UpdateLibraryInternal()
{
Log("Updating libraries.");
foreach (KomgaLibrary lib in GetLibraries())

View File

@ -17,6 +17,9 @@ public abstract class LibraryConnector : GlobalBase
public string baseUrl { get; }
// ReSharper disable once MemberCanBeProtected.Global
public string auth { get; } //Base64 encoded, if you use your password everywhere, you have problems
private DateTime? _updateLibraryRequested = null;
private readonly Thread? _libraryBufferThread = null;
private const int NoChangeTimeout = 2, BiggestInterval = 20;
protected LibraryConnector(GlobalBase clone, string baseUrl, string auth, LibraryType libraryType) : base(clone)
{
@ -28,8 +31,47 @@ public abstract class LibraryConnector : GlobalBase
this.baseUrl = baseUrlRex.Match(baseUrl).Value;
this.auth = auth;
this.libraryType = libraryType;
if (TrangaSettings.bufferLibraryUpdates)
{
_libraryBufferThread = new(CheckLibraryBuffer);
_libraryBufferThread.Start();
}
public abstract void UpdateLibrary();
}
private void CheckLibraryBuffer()
{
while (true)
{
if (_updateLibraryRequested is not null && DateTime.Now.Subtract((DateTime)_updateLibraryRequested) > TimeSpan.FromMinutes(NoChangeTimeout)) //If no updates have been requested for NoChangeTimeout minutes, update library
{
UpdateLibraryInternal();
_updateLibraryRequested = null;
}
Thread.Sleep(100);
}
}
public void UpdateLibrary()
{
_updateLibraryRequested ??= DateTime.Now;
if (!TrangaSettings.bufferLibraryUpdates)
{
UpdateLibraryInternal();
return;
}else if (_updateLibraryRequested is not null &&
DateTime.Now.Subtract((DateTime)_updateLibraryRequested) > TimeSpan.FromMinutes(BiggestInterval)) //If the last update has been more than BiggestInterval minutes ago, update library
{
UpdateLibraryInternal();
_updateLibraryRequested = null;
}
else if(_updateLibraryRequested is not null)
{
Log($"Buffering Library Updates (Updates in latest {((DateTime)_updateLibraryRequested).Add(TimeSpan.FromMinutes(BiggestInterval)).Subtract(DateTime.Now)} or {((DateTime)_updateLibraryRequested).Add(TimeSpan.FromMinutes(NoChangeTimeout)).Subtract(DateTime.Now)})");
}
}
protected abstract void UpdateLibraryInternal();
internal abstract bool Test();
protected static class NetClient
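The buffering above is a debounce with a cap: UpdateLibrary only records a timestamp, a background loop flushes once the pending request is NoChangeTimeout minutes old, and a request older than BiggestInterval minutes is flushed immediately. A compact standalone sketch of that policy (seconds instead of minutes, Console.WriteLine standing in for the real library call):

    using System;
    using System.Threading;

    class UpdateBufferSketch
    {
        static readonly TimeSpan NoChangeTimeout = TimeSpan.FromSeconds(2);   // flush once a pending request is this old
        static readonly TimeSpan BiggestInterval = TimeSpan.FromSeconds(20);  // never let a request wait longer than this
        static DateTime? _requested;

        static void Flush() => Console.WriteLine($"UpdateLibraryInternal() at {DateTime.Now:T}");

        static void RequestUpdate()
        {
            _requested ??= DateTime.Now;           // remember only the first pending request; repeats do not reset it
            if (DateTime.Now - _requested > BiggestInterval)
            {
                Flush();                           // forced flush: the pending request is too old
                _requested = null;
            }
        }

        static void Main()
        {
            new Thread(() =>
            {
                while (true)
                {
                    if (_requested is not null && DateTime.Now - _requested > NoChangeTimeout)
                    {
                        Flush();                   // pending request is old enough: flush it
                        _requested = null;
                    }
                    Thread.Sleep(100);
                }
            }) { IsBackground = true }.Start();

            for (int i = 0; i < 5; i++) { RequestUpdate(); Thread.Sleep(500); }   // a burst collapses into one flush
            Thread.Sleep(3000);                    // give the background loop time to flush
        }
    }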

View File

@ -3,7 +3,6 @@ using System.Text;
using System.Text.RegularExpressions;
using System.Web;
using Newtonsoft.Json;
using Tranga.MangaConnectors;
using static System.IO.UnixFileMode;
namespace Tranga;
@ -28,6 +27,8 @@ public struct Manga
// ReSharper disable once MemberCanBePrivate.Global
public int? year { get; private set; }
public string? originalLanguage { get; }
// ReSharper disable twice MemberCanBePrivate.Global
public string status { get; private set; }
public ReleaseStatusByte releaseStatus { get; private set; }
public enum ReleaseStatusByte : byte
{
@ -43,15 +44,14 @@ public struct Manga
public float ignoreChaptersBelow { get; set; }
public float latestChapterDownloaded { get; set; }
public float latestChapterAvailable { get; set; }
public string websiteUrl { get; private set; }
public MangaConnector mangaConnector { get; private set; }
public string? websiteUrl { get; private set; }
private static readonly Regex LegalCharacters = new (@"[A-Za-zÀ-ÖØ-öø-ÿ0-9 \.\-,'\'\)\(~!\+]*");
[JsonConstructor]
public Manga(MangaConnector mangaConnector, string sortName, List<string> authors, string? description, Dictionary<string,string> altTitles, string[] tags, string? coverUrl, string? coverFileNameInCache, Dictionary<string,string>? links, int? year, string? originalLanguage, string publicationId, ReleaseStatusByte releaseStatus, string? websiteUrl, string? folderName = null, float? ignoreChaptersBelow = 0)
public Manga(string sortName, List<string> authors, string? description, Dictionary<string,string> altTitles, string[] tags, string? coverUrl, string? coverFileNameInCache, Dictionary<string,string>? links, int? year, string? originalLanguage, string publicationId, ReleaseStatusByte releaseStatus, string? websiteUrl = null, string? folderName = null, float? ignoreChaptersBelow = 0)
{
this.mangaConnector = mangaConnector;
this.sortName = HttpUtility.HtmlDecode(sortName);
this.authors = authors.Select(HttpUtility.HtmlDecode).ToList()!;
this.description = HttpUtility.HtmlDecode(description);
@ -67,11 +67,12 @@ public struct Manga
while (this.folderName.EndsWith('.'))
this.folderName = this.folderName.Substring(0, this.folderName.Length - 1);
string onlyLowerLetters = string.Concat(this.sortName.ToLower().Where(Char.IsLetter));
this.internalId = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{onlyLowerLetters}{this.year}"));
this.internalId = DateTime.Now.Ticks.ToString();
this.ignoreChaptersBelow = ignoreChaptersBelow ?? 0f;
this.latestChapterDownloaded = 0;
this.latestChapterAvailable = 0;
this.releaseStatus = releaseStatus;
this.status = Enum.GetName(releaseStatus) ?? "";
this.websiteUrl = websiteUrl;
}
@ -85,6 +86,7 @@ public struct Manga
authors = authors.Union(newManga.authors).ToList(),
altTitles = altTitles.UnionBy(newManga.altTitles, kv => kv.Key).ToDictionary(x => x.Key, x => x.Value),
tags = tags.Union(newManga.tags).ToArray(),
status = newManga.status,
releaseStatus = newManga.releaseStatus,
websiteUrl = newManga.websiteUrl,
year = newManga.year,
@ -98,6 +100,7 @@ public struct Manga
return false;
return this.description == compareManga.description &&
this.year == compareManga.year &&
this.status == compareManga.status &&
this.releaseStatus == compareManga.releaseStatus &&
this.sortName == compareManga.sortName &&
this.latestChapterAvailable.Equals(compareManga.latestChapterAvailable) &&

View File

@ -0,0 +1,217 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class AsuraToon : MangaConnector
{
public AsuraToon(GlobalBase clone) : base(clone, "AsuraToon", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join(' ', Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
string requestUrl = $"https://asuracomic.net/series?name={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<Manga>();
}
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://asuracomic.net/series/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return null;
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return null;
}
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, url.Split('/')[^1], url);
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNodeCollection mangaList = document.DocumentNode.SelectNodes("//a[starts-with(@href,'series')]");
if (mangaList is null || mangaList.Count < 1)
return [];
IEnumerable<string> urls = mangaList.Select(a => $"https://asuracomic.net/{a.GetAttributeValue("href", "")}");
List<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string? originalLanguage = null;
Dictionary<string, string> altTitles = new(), links = new();
HtmlNodeCollection genreNodes = document.DocumentNode.SelectNodes("//h3[text()='Genres']/../div/button");
string[] tags = genreNodes.Select(b => b.InnerText).ToArray();
HtmlNode statusNode = document.DocumentNode.SelectSingleNode("//h3[text()='Status']/../h3[2]");
Manga.ReleaseStatusByte releaseStatus = statusNode.InnerText.ToLower() switch
{
"ongoing" => Manga.ReleaseStatusByte.Continuing,
"hiatus" => Manga.ReleaseStatusByte.OnHiatus,
"completed" => Manga.ReleaseStatusByte.Completed,
"dropped" => Manga.ReleaseStatusByte.Cancelled,
"season end" => Manga.ReleaseStatusByte.Continuing,
"coming soon" => Manga.ReleaseStatusByte.Unreleased,
_ => Manga.ReleaseStatusByte.Unreleased
};
HtmlNode coverNode =
document.DocumentNode.SelectSingleNode("//img[@alt='poster']");
string coverUrl = coverNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(coverUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode =
document.DocumentNode.SelectSingleNode("//title");
string sortName = Regex.Match(titleNode.InnerText, @"(.*) - Asura Scans").Groups[1].Value;
HtmlNode descriptionNode =
document.DocumentNode.SelectSingleNode("//h3[starts-with(text(),'Synopsis')]/../span");
string description = descriptionNode?.InnerText??"";
HtmlNodeCollection authorNodes = document.DocumentNode.SelectNodes("//h3[text()='Author']/../h3[not(text()='Author' or text()='_')]");
HtmlNodeCollection artistNodes = document.DocumentNode.SelectNodes("//h3[text()='Artist']/../h3[not(text()='Artist' or text()='_')]");
IEnumerable<string> authorNames = authorNodes is null ? [] : authorNodes.Select(a => a.InnerText);
IEnumerable<string> artistNames = artistNodes is null ? [] : artistNodes.Select(a => a.InnerText);
List<string> authors = authorNames.Concat(artistNames).ToList();
HtmlNode? firstChapterNode = document.DocumentNode.SelectSingleNode("//a[contains(@href, 'chapter/1')]/../following-sibling::h3");
int? year = int.Parse(firstChapterNode?.InnerText.Split(' ')[^1] ?? "2000");
Manga manga = new (sortName, authors, description, altTitles, tags, coverUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://asuracomic.net/series/{manga.publicationId}";
// Leaving this in for verification if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
//Return Chapters ordered by Chapter-Number
List<Chapter> chapters = ParseChaptersFromHtml(manga, requestUrl);
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, string mangaUrl)
{
RequestResult result = downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
Log("Failed to load site");
return new List<Chapter>();
}
List<Chapter> ret = new();
HtmlNodeCollection chapterURLNodes = result.htmlDocument.DocumentNode.SelectNodes("//a[contains(@href, '/chapter/')]");
Regex infoRex = new(@"Chapter ([0-9]+)(.*)?");
foreach (HtmlNode chapterInfo in chapterURLNodes)
{
string chapterUrl = chapterInfo.GetAttributeValue("href", "");
Match match = infoRex.Match(chapterInfo.InnerText);
string chapterNumber = match.Groups[1].Value;
string? chapterName = match.Groups[2].Success && match.Groups[2].Length > 1 ? match.Groups[2].Value : null;
string url = $"https://asuracomic.net/series/{chapterUrl}";
try
{
ret.Add(new Chapter(manga, chapterName, null, chapterNumber, url));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
return ret;
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = chapter.url;
// Leaving this in to check if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)
{
RequestResult requestResult =
downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
return Array.Empty<string>();
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<string>();
}
HtmlNodeCollection images =
requestResult.htmlDocument.DocumentNode.SelectNodes("//img[contains(@alt, 'chapter page')]");
return images.Select(i => i.GetAttributeValue("src", "")).ToArray();
}
}
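The new connector follows the usual scrape flow: fetch the page, then pull fields out with HtmlAgilityPack XPath queries. A tiny self-contained example of that extraction step, run against static HTML instead of a live request (the markup and XPath are illustrative, not AsuraToon's exact structure):

    using System;
    using HtmlAgilityPack;

    class XPathSketch
    {
        static void Main()
        {
            const string html = @"<html><body>
                <h3>Status</h3><h3>Ongoing</h3>
                <img alt='poster' src='https://example.org/cover.png'/>
                <a href='series/solo-leveling-abc123'>Solo Leveling</a>
            </body></html>";

            HtmlDocument doc = new();
            doc.LoadHtml(html);

            // The same kind of XPath lookups the connector runs against the real page.
            string status = doc.DocumentNode.SelectSingleNode("//h3[text()='Status']/following-sibling::h3").InnerText;
            string cover  = doc.DocumentNode.SelectSingleNode("//img[@alt='poster']").GetAttributeValue("src", "");
            var seriesLinks = doc.DocumentNode.SelectNodes("//a[starts-with(@href,'series')]");

            Console.WriteLine($"status={status}, cover={cover}, series links={seriesLinks.Count}");
        }
    }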

View File

@ -8,7 +8,7 @@ namespace Tranga.MangaConnectors;
public class Bato : MangaConnector
{
public Bato(GlobalBase clone) : base(clone, "Bato")
public Bato(GlobalBase clone) : base(clone, "Bato", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
@ -114,8 +114,8 @@ public class Bato : MangaConnector
case "pending": releaseStatus = Manga.ReleaseStatusByte.Unreleased; break;
}
Manga manga = new (this, sortName, authors, description, altTitles, tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(),
year, originalLanguage, publicationId, releaseStatus, websiteUrl);
Manga manga = new (sortName, authors, description, altTitles, tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(),
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
@ -150,7 +150,7 @@ public class Bato : MangaConnector
HtmlNode chapterList =
result.htmlDocument.DocumentNode.SelectSingleNode("/html/body/div/main/div[3]/astro-island/div/div[2]/div/div/astro-slot");
Regex numberRex = new(@"\/title\/.+\/[0-9]+(-vol_([0-9]+))?-ch_([0-9\.]+)");
Regex numberRex = new(@"\/title\/.+\/([0-9])+(?:-vol_([0-9]+))?-ch_([0-9\.]+)");
foreach (HtmlNode chapterInfo in chapterList.SelectNodes("div"))
{
@ -158,12 +158,20 @@ public class Bato : MangaConnector
string chapterUrl = infoNode.GetAttributeValue("href", "");
Match match = numberRex.Match(chapterUrl);
string id = match.Groups[1].Value;
string? volumeNumber = match.Groups[2].Success ? match.Groups[2].Value : null;
string chapterNumber = match.Groups[3].Value;
string chapterName = chapterNumber;
string url = $"https://bato.to{chapterUrl}?load=2";
try
{
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
return ret;
}
@ -190,10 +198,7 @@ public class Bato : MangaConnector
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://mangakatana.com/", progressToken:progressToken);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)

View File

@ -2,20 +2,19 @@
using System.Text;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Microsoft.Extensions.Logging;
using PuppeteerSharp;
namespace Tranga.MangaConnectors;
internal class ChromiumDownloadClient : DownloadClient
{
private IBrowser browser { get; set; }
private const string ChromiumVersion = "1154303";
private const int StartTimeoutMs = 30000;
private static IBrowser? _browser;
private readonly HttpDownloadClient _httpDownloadClient;
private async Task<IBrowser> StartBrowser()
private static async Task<IBrowser> StartBrowser(Logging.Logger? logger = null)
{
Log($"Starting Browser. ({StartTimeoutMs}ms timeout)");
logger?.WriteLine("Starting ChromiumDownloadClient Puppeteer");
return await Puppeteer.LaunchAsync(new LaunchOptions
{
Headless = true,
@ -24,14 +23,40 @@ internal class ChromiumDownloadClient : DownloadClient
"--disable-dev-shm-usage",
"--disable-setuid-sandbox",
"--no-sandbox"},
Timeout = StartTimeoutMs
});
Timeout = TrangaSettings.ChromiumStartupTimeoutMs
}, new LoggerFactory([new LogProvider(logger)]));
}
private class LogProvider : GlobalBase, ILoggerProvider
{
public LogProvider(Logging.Logger? logger) : base(logger) { }
public void Dispose() { }
public ILogger CreateLogger(string categoryName) => new Logger(logger);
}
private class Logger : GlobalBase, ILogger
{
public Logger(Logging.Logger? logger) : base(logger) { }
public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func<TState, Exception?, string> formatter)
{
if (logLevel <= LogLevel.Information)
return;
logger?.WriteLine("Puppeteer", formatter.Invoke(state, exception));
}
public bool IsEnabled(LogLevel logLevel) => true;
public IDisposable? BeginScope<TState>(TState state) where TState : notnull => null;
}
public ChromiumDownloadClient(GlobalBase clone) : base(clone)
{
this.browser = StartBrowser().Result;
_httpDownloadClient = new(this);
if(_browser is null)
_browser = StartBrowser(this.logger).Result;
}
private readonly Regex _imageUrlRex = new(@"https?:\/\/.*\.(?:p?jpe?g|gif|a?png|bmp|avif|webp)(\?.*)?");
@ -44,17 +69,21 @@ internal class ChromiumDownloadClient : DownloadClient
private RequestResult MakeRequestBrowser(string url, string? referrer = null, string? clickButton = null)
{
IPage page = this.browser.NewPageAsync().Result;
page.DefaultTimeout = 10000;
if (_browser is null)
return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null);
IPage page = _browser.NewPageAsync().Result;
page.DefaultTimeout = TrangaSettings.ChromiumPageTimeoutMs;
page.SetExtraHttpHeadersAsync(new() { { "Referer", referrer } });
IResponse response;
try
{
response = page.GoToAsync(url, WaitUntilNavigation.Networkidle0).Result;
Log("Page loaded.");
Log($"Page loaded. {url}");
}
catch (Exception e)
{
Log($"Could not load Page:\n{e.Message}");
Log($"Could not load Page {url}\n{e.Message}");
page.CloseAsync();
return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null);
}
@ -85,9 +114,4 @@ internal class ChromiumDownloadClient : DownloadClient
page.CloseAsync();
return new RequestResult(response.Status, document, stream, false, "");
}
public override void Close()
{
this.browser.CloseAsync();
}
}
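The client now shares a single static Chromium instance across all connectors instead of one per client. A minimal PuppeteerSharp sketch of launching one shared headless browser and loading a page (the target URL and the browser-download step are illustrative):

    using System;
    using System.Threading.Tasks;
    using PuppeteerSharp;

    class SharedBrowserSketch
    {
        // One browser for the whole process, started lazily, like the static _browser above.
        static IBrowser? _browser;

        static async Task<IBrowser> GetBrowserAsync()
        {
            if (_browser is not null) return _browser;
            await new BrowserFetcher().DownloadAsync();              // fetch a compatible Chromium build
            _browser = await Puppeteer.LaunchAsync(new LaunchOptions
            {
                Headless = true,
                Args = new[] { "--disable-dev-shm-usage", "--no-sandbox" },
                Timeout = 30000
            });
            return _browser;
        }

        static async Task Main()
        {
            IBrowser browser = await GetBrowserAsync();
            IPage page = await browser.NewPageAsync();
            page.DefaultTimeout = 10000;
            IResponse response = await page.GoToAsync("https://example.org", WaitUntilNavigation.Networkidle0);
            string html = await page.GetContentAsync();
            Console.WriteLine($"{response.Status}: {html.Length} bytes of HTML");
            await page.CloseAsync();
        }
    }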

View File

@ -41,5 +41,4 @@ internal abstract class DownloadClient : GlobalBase
}
internal abstract RequestResult MakeRequestInternal(string url, string? referrer = null, string? clickButton = null);
public abstract void Close();
}

View File

@ -72,9 +72,4 @@ internal class HttpDownloadClient : DownloadClient
return new RequestResult(response.StatusCode, document, stream);
}
public override void Close()
{
Log("Closing.");
}
}

View File

@ -14,15 +14,12 @@ namespace Tranga.MangaConnectors;
public abstract class MangaConnector : GlobalBase
{
internal DownloadClient downloadClient { get; init; } = null!;
public string[] SupportedLanguages;
public void StopDownloadClient()
{
downloadClient.Close();
}
protected MangaConnector(GlobalBase clone, string name) : base(clone)
protected MangaConnector(GlobalBase clone, string name, string[] supportedLanguages) : base(clone)
{
this.name = name;
this.SupportedLanguages = supportedLanguages;
Directory.CreateDirectory(TrangaSettings.coverImageCache);
}
@ -63,8 +60,7 @@ public abstract class MangaConnector : GlobalBase
return Array.Empty<Chapter>();
Log($"Checking for duplicates {manga}");
List<Chapter> newChaptersList = allChapters.Where(nChapter => float.TryParse(nChapter.chapterNumber, numberFormatDecimalPoint, out float chapterNumber)
&& chapterNumber > manga.ignoreChaptersBelow
List<Chapter> newChaptersList = allChapters.Where(nChapter => nChapter.chapterNumber >= manga.ignoreChaptersBelow
&& !nChapter.CheckChapterIsDownloaded()).ToList();
Log($"{newChaptersList.Count} new chapters. {manga}");
try
@ -83,79 +79,6 @@ public abstract class MangaConnector : GlobalBase
return newChaptersList.ToArray();
}
public Chapter[] SelectChapters(Manga manga, string searchTerm, string? language = null)
{
Chapter[] availableChapters = this.GetChapters(manga, language??"en");
Regex volumeRegex = new ("((v(ol)*(olume)*){1} *([0-9]+(-[0-9]+)?){1})", RegexOptions.IgnoreCase);
Regex chapterRegex = new ("((c(h)*(hapter)*){1} *([0-9]+(-[0-9]+)?){1})", RegexOptions.IgnoreCase);
Regex singleResultRegex = new("([0-9]+)", RegexOptions.IgnoreCase);
Regex rangeResultRegex = new("([0-9]+(-[0-9]+))", RegexOptions.IgnoreCase);
Regex allRegex = new("a(ll)?", RegexOptions.IgnoreCase);
if (volumeRegex.IsMatch(searchTerm) && chapterRegex.IsMatch(searchTerm))
{
string volume = singleResultRegex.Match(volumeRegex.Match(searchTerm).Value).Value;
string chapter = singleResultRegex.Match(chapterRegex.Match(searchTerm).Value).Value;
return availableChapters.Where(aCh => aCh.volumeNumber is not null &&
aCh.volumeNumber.Equals(volume, StringComparison.InvariantCultureIgnoreCase) &&
aCh.chapterNumber.Equals(chapter, StringComparison.InvariantCultureIgnoreCase))
.ToArray();
}
else if (volumeRegex.IsMatch(searchTerm))
{
string volume = volumeRegex.Match(searchTerm).Value;
if (rangeResultRegex.IsMatch(volume))
{
string range = rangeResultRegex.Match(volume).Value;
int start = Convert.ToInt32(range.Split('-')[0]);
int end = Convert.ToInt32(range.Split('-')[1]);
return availableChapters.Where(aCh => aCh.volumeNumber is not null &&
Convert.ToInt32(aCh.volumeNumber) >= start &&
Convert.ToInt32(aCh.volumeNumber) <= end).ToArray();
}
else if (singleResultRegex.IsMatch(volume))
{
string volumeNumber = singleResultRegex.Match(volume).Value;
return availableChapters.Where(aCh =>
aCh.volumeNumber is not null &&
aCh.volumeNumber.Equals(volumeNumber, StringComparison.InvariantCultureIgnoreCase)).ToArray();
}
}
else if (chapterRegex.IsMatch(searchTerm))
{
string chapter = chapterRegex.Match(searchTerm).Value;
if (rangeResultRegex.IsMatch(chapter))
{
string range = rangeResultRegex.Match(chapter).Value;
int start = Convert.ToInt32(range.Split('-')[0]);
int end = Convert.ToInt32(range.Split('-')[1]);
return availableChapters.Where(aCh => Convert.ToInt32(aCh.chapterNumber) >= start &&
Convert.ToInt32(aCh.chapterNumber) <= end).ToArray();
}
else if (singleResultRegex.IsMatch(chapter))
{
string chapterNumber = singleResultRegex.Match(chapter).Value;
return availableChapters.Where(aCh =>
aCh.chapterNumber.Equals(chapterNumber, StringComparison.InvariantCultureIgnoreCase)).ToArray();
}
}
else
{
if (rangeResultRegex.IsMatch(searchTerm))
{
int start = Convert.ToInt32(searchTerm.Split('-')[0]);
int end = Convert.ToInt32(searchTerm.Split('-')[1]);
return availableChapters[start..(end + 1)];
}
else if(singleResultRegex.IsMatch(searchTerm))
return new [] { availableChapters[Convert.ToInt32(searchTerm)] };
else if (allRegex.IsMatch(searchTerm))
return availableChapters;
}
return Array.Empty<Chapter>();
}
public abstract HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null);
/// <summary>
@ -217,8 +140,10 @@ public abstract class MangaConnector : GlobalBase
return requestResult.statusCode;
}
protected HttpStatusCode DownloadChapterImages(string[] imageUrls, string saveArchiveFilePath, RequestType requestType, string? comicInfoPath = null, string? referrer = null, ProgressToken? progressToken = null)
protected HttpStatusCode DownloadChapterImages(string[] imageUrls, Chapter chapter, RequestType requestType, string? referrer = null, ProgressToken? progressToken = null)
{
string saveArchiveFilePath = chapter.GetArchiveFilePath();
if (progressToken?.cancellationRequested ?? false)
return HttpStatusCode.RequestTimeout;
Log($"Downloading Images for {saveArchiveFilePath}");
@ -234,12 +159,15 @@ public abstract class MangaConnector : GlobalBase
Directory.CreateDirectory(directoryPath);
if (File.Exists(saveArchiveFilePath)) //Don't download twice.
{
progressToken?.Complete();
return HttpStatusCode.Created;
}
//Create a temporary folder to store images
string tempFolder = Directory.CreateTempSubdirectory("trangatemp").FullName;
int chapter = 0;
int chapterNum = 0;
//Download all Images to temporary Folder
if (imageUrls.Length == 0)
{
@ -253,9 +181,9 @@ public abstract class MangaConnector : GlobalBase
foreach (string imageUrl in imageUrls)
{
string extension = imageUrl.Split('.')[^1].Split('?')[0];
Log($"Downloading image {chapter + 1:000}/{imageUrls.Length:000}"); //TODO
HttpStatusCode status = DownloadImage(imageUrl, Path.Join(tempFolder, $"{chapter++}.{extension}"), requestType, referrer);
Log($"{saveArchiveFilePath} {chapter + 1:000}/{imageUrls.Length:000} {status}");
Log($"Downloading image {chapterNum + 1:000}/{imageUrls.Length:000}"); //TODO
HttpStatusCode status = DownloadImage(imageUrl, Path.Join(tempFolder, $"{chapterNum++}.{extension}"), requestType, referrer);
Log($"{saveArchiveFilePath} {chapterNum + 1:000}/{imageUrls.Length:000} {status}");
if ((int)status < 200 || (int)status >= 300)
{
progressToken?.Complete();
@ -269,23 +197,23 @@ public abstract class MangaConnector : GlobalBase
progressToken?.Increment();
}
if(comicInfoPath is not null){
File.Copy(comicInfoPath, Path.Join(tempFolder, "ComicInfo.xml"));
File.Delete(comicInfoPath); //Delete tmp-file
}
File.WriteAllText(Path.Join(tempFolder, "ComicInfo.xml"), chapter.GetComicInfoXmlString());
Log($"Creating archive {saveArchiveFilePath}");
//ZIP-it and ship-it
ZipFile.CreateFromDirectory(tempFolder, saveArchiveFilePath);
chapter.CreateChapterMarker();
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(saveArchiveFilePath, UserRead | UserWrite | UserExecute | GroupRead | GroupWrite | GroupExecute);
File.SetUnixFileMode(saveArchiveFilePath, UserRead | UserWrite | UserExecute | GroupRead | GroupWrite | GroupExecute | OtherRead | OtherExecute);
Directory.Delete(tempFolder, true); //Cleanup
Log("Created archive.");
progressToken?.Complete();
Log("Download complete.");
return HttpStatusCode.OK;
}
protected string SaveCoverImageToCache(string url, string mangaInternalId, RequestType requestType)
protected string SaveCoverImageToCache(string url, string mangaInternalId, RequestType requestType, string? referrer = null)
{
Regex urlRex = new (@"https?:\/\/((?:[a-zA-Z0-9-]+\.)+[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+))");
//https?:\/\/[a-zA-Z0-9-]+\.([a-zA-Z0-9-]+\.[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+)) for only second level domains
@ -296,7 +224,7 @@ public abstract class MangaConnector : GlobalBase
if (File.Exists(saveImagePath))
return saveImagePath;
RequestResult coverResult = downloadClient.MakeRequest(url, requestType);
RequestResult coverResult = downloadClient.MakeRequest(url, requestType, referrer);
using MemoryStream ms = new();
coverResult.result.CopyTo(ms);
Directory.CreateDirectory(TrangaSettings.coverImageCache);
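DownloadChapterImages above now owns the whole packaging step: images land in a temp folder, ComicInfo.xml is written beside them, the folder is zipped into the chapter archive, and the mode bits are widened on Linux. A standalone sketch of just that packaging tail, with placeholder files instead of real downloads:

    using System;
    using System.IO;
    using System.IO.Compression;
    using System.Runtime.InteropServices;
    using static System.IO.UnixFileMode;

    class ArchiveSketch
    {
        static void PackChapter(string archivePath, string comicInfoXml)
        {
            string tempFolder = Directory.CreateTempSubdirectory("trangatemp-sketch").FullName;
            // Stand-in for the downloaded page images.
            for (int i = 0; i < 3; i++)
                File.WriteAllBytes(Path.Join(tempFolder, $"{i}.png"), new byte[] { 0x89, 0x50, 0x4E, 0x47 });

            File.WriteAllText(Path.Join(tempFolder, "ComicInfo.xml"), comicInfoXml);

            Directory.CreateDirectory(Path.GetDirectoryName(archivePath)!);
            ZipFile.CreateFromDirectory(tempFolder, archivePath);    // ZIP it and ship it

            if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
                File.SetUnixFileMode(archivePath,
                    UserRead | UserWrite | UserExecute | GroupRead | GroupWrite | GroupExecute | OtherRead | OtherExecute);

            Directory.Delete(tempFolder, true);                      // cleanup
        }

        static void Main()
        {
            string archive = Path.Combine(Path.GetTempPath(), "archive-sketch", "Chapter 001.cbz");
            if (File.Exists(archive)) File.Delete(archive);          // CreateFromDirectory refuses to overwrite
            PackChapter(archive, "<ComicInfo><Title>Example chapter</Title></ComicInfo>");
            Console.WriteLine($"wrote {archive}");
        }
    }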

View File

@ -1,4 +1,6 @@
using Newtonsoft.Json;
using System.Data;
using System.Diagnostics;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
namespace Tranga.MangaConnectors;
@ -22,29 +24,23 @@ public class MangaConnectorJsonConverter : JsonConverter
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
switch (jo.GetValue("name")!.Value<string>()!)
string? connectorName = jo.Value<string>("name");
if (connectorName is null)
throw new ConstraintException("Name can not be null.");
return connectorName switch
{
case "MangaDex":
return this._connectors.First(c => c is MangaDex);
case "Manganato":
return this._connectors.First(c => c is Manganato);
case "MangaKatana":
return this._connectors.First(c => c is MangaKatana);
case "Mangasee":
return this._connectors.First(c => c is Mangasee);
case "Mangaworld":
return this._connectors.First(c => c is Mangaworld);
case "Bato":
return this._connectors.First(c => c is Bato);
case "Manga4Life":
return this._connectors.First(c => c is MangaLife);
case "ManhuaPlus":
return this._connectors.First(c => c is ManhuaPlus);
case "MangaHere":
return this._connectors.First(c => c is MangaHere);
}
throw new Exception();
"MangaDex" => this._connectors.First(c => c is MangaDex),
"Manganato" => this._connectors.First(c => c is Manganato),
"MangaKatana" => this._connectors.First(c => c is MangaKatana),
"Mangaworld" => this._connectors.First(c => c is Mangaworld),
"Bato" => this._connectors.First(c => c is Bato),
"ManhuaPlus" => this._connectors.First(c => c is ManhuaPlus),
"MangaHere" => this._connectors.First(c => c is MangaHere),
"AsuraToon" => this._connectors.First(c => c is AsuraToon),
"Weebcentral" => this._connectors.First(c => c is Weebcentral),
"Webtoons" => this._connectors.First(c => c is Webtoons),
_ => throw new UnreachableException($"Could not find Connector with name {connectorName}")
};
}
public override bool CanWrite => false;

View File

@ -7,14 +7,17 @@ using JsonSerializer = System.Text.Json.JsonSerializer;
namespace Tranga.MangaConnectors;
public class MangaDex : MangaConnector
{
public MangaDex(GlobalBase clone) : base(clone, "MangaDex")
//https://api.mangadex.org/docs/3-enumerations/#language-codes--localization
//https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes
//https://gist.github.com/Josantonius/b455e315bc7f790d14b136d61d9ae469
public MangaDex(GlobalBase clone) : base(clone, "MangaDex", ["en","pt","pt-br","it","de","ru","aa","ab","ae","af","ak","am","an","ar-ae","ar-bh","ar-dz","ar-eg","ar-iq","ar-jo","ar-kw","ar-lb","ar-ly","ar-ma","ar-om","ar-qa","ar-sa","ar-sy","ar-tn","ar-ye","ar","as","av","ay","az","ba","be","bg","bh","bi","bm","bn","bo","br","bs","ca","ce","ch","co","cr","cs","cu","cv","cy","da","de-at","de-ch","de-de","de-li","de-lu","div","dv","dz","ee","el","en-au","en-bz","en-ca","en-cb","en-gb","en-ie","en-jm","en-nz","en-ph","en-tt","en-us","en-za","en-zw","eo","es-ar","es-bo","es-cl","es-co","es-cr","es-do","es-ec","es-es","es-gt","es-hn","es-la","es-mx","es-ni","es-pa","es-pe","es-pr","es-py","es-sv","es-us","es-uy","es-ve","es","et","eu","fa","ff","fi","fj","fo","fr-be","fr-ca","fr-ch","fr-fr","fr-lu","fr-mc","fr","fy","ga","gd","gl","gn","gu","gv","ha","he","hi","ho","hr-ba","hr-hr","hr","ht","hu","hy","hz","ia","id","ie","ig","ii","ik","in","io","is","it-ch","it-it","iu","iw","ja","ja-ro","ji","jv","jw","ka","kg","ki","kj","kk","kl","km","kn","ko","ko-ro","kr","ks","ku","kv","kw","ky","kz","la","lb","lg","li","ln","lo","ls","lt","lu","lv","mg","mh","mi","mk","ml","mn","mo","mr","ms-bn","ms-my","ms","mt","my","na","nb","nd","ne","ng","nl-be","nl-nl","nl","nn","no","nr","ns","nv","ny","oc","oj","om","or","os","pa","pi","pl","ps","pt-pt","qu-bo","qu-ec","qu-pe","qu","rm","rn","ro","rw","sa","sb","sc","sd","se-fi","se-no","se-se","se","sg","sh","si","sk","sl","sm","sn","so","sq","sr-ba","sr-sp","sr","ss","st","su","sv-fi","sv-se","sv","sw","sx","syr","ta","te","tg","th","ti","tk","tl","tn","to","tr","ts","tt","tw","ty","ug","uk","ur","us","uz","ve","vi","vo","wa","wo","xh","yi","yo","za","zh-cn","zh-hk","zh-mo","zh-ro","zh-sg","zh-tw","zh","zu"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
Log($"Searching Publications. Term={publicationTitle}");
const int limit = 100; //How many values we want returned at once
int offset = 0; //"Page"
int total = int.MaxValue; //How many total results are there, is updated on first request
@ -54,7 +57,7 @@ public class MangaDex : MangaConnector
if(MangaFromJsonObject(mangaNode.AsObject()) is { } manga)
retManga.Add(manga); //Add Publication (Manga) to result
}
Log($"Retrieved {retManga.Count} publications. Term=\"{publicationTitle}\"");
Log($"Retrieved {retManga.Count} publications. Term={publicationTitle}");
return retManga.ToArray();
}
@ -126,10 +129,10 @@ public class MangaDex : MangaConnector
false => null
};
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
Manga.ReleaseStatusByte status = Manga.ReleaseStatusByte.Unreleased;
if (attributes.TryGetPropertyValue("status", out JsonNode? statusNode))
{
releaseStatus = statusNode?.GetValue<string>().ToLower() switch
status = statusNode?.GetValue<string>().ToLower() switch
{
"ongoing" => Manga.ReleaseStatusByte.Continuing,
"completed" => Manga.ReleaseStatusByte.Completed,
@ -173,7 +176,6 @@ public class MangaDex : MangaConnector
}
Manga pub = new(
this,
title,
authors,
description,
@ -185,8 +187,8 @@ public class MangaDex : MangaConnector
year,
originalLanguage,
publicationId,
releaseStatus,
$"https://mangadex.org/title/{publicationId}"
status,
websiteUrl: $"https://mangadex.org/title/{publicationId}"
);
AddMangaToCache(pub);
return pub;
@ -244,8 +246,17 @@ public class MangaDex : MangaConnector
continue;
}
if(chapterNum is not "null")
chapters.Add(new Chapter(manga, title, volume, chapterNum, chapterId));
try
{
if(!chapters.Any(chp =>
chp.volumeNumber.Equals(float.Parse(volume??"0", numberFormatDecimalPoint)) &&
chp.chapterNumber.Equals(float.Parse(chapterNum, numberFormatDecimalPoint))))
chapters.Add(new Chapter(manga, title, volume, chapterNum, chapterId, chapterId));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNum}: {e.Message}");
}
}
}
@ -287,10 +298,7 @@ public class MangaDex : MangaConnector
foreach (JsonNode? image in imageFileNames)
imageUrls.Add($"{baseUrl}/data/{hash}/{image!.GetValue<string>()}");
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
//Download Chapter-Images
return DownloadChapterImages(imageUrls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
return DownloadChapterImages(imageUrls.ToArray(), chapter, RequestType.MangaImage, progressToken:progressToken);
}
}
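The chapter loop above parses volume and chapter strings with an explicit decimal-point format and skips pairs it has already added. A short sketch of that invariant parse-and-dedupe step; numberFormatDecimalPoint mirrors the field name used above:

    using System;
    using System.Collections.Generic;
    using System.Globalization;

    class ChapterDedupSketch
    {
        static readonly NumberFormatInfo numberFormatDecimalPoint = new() { NumberDecimalSeparator = "." };

        static void Main()
        {
            var seen = new HashSet<(float volume, float chapter)>();
            // (volume, chapter) pairs as an API might return them: strings, possibly null volume, possible repeats.
            (string? volume, string chapter)[] raw = { ("1", "10.5"), (null, "11"), ("1", "10.5") };

            foreach ((string? volume, string chapter) in raw)
            {
                float vol = float.Parse(volume ?? "0", numberFormatDecimalPoint);
                float chp = float.Parse(chapter, numberFormatDecimalPoint);
                if (seen.Add((vol, chp)))                      // Add returns false for a duplicate
                    Console.WriteLine($"keep vol {vol} ch {chp}");
                else
                    Console.WriteLine($"skip duplicate vol {vol} ch {chp}");
            }
        }
    }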

View File

@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class MangaHere : MangaConnector
{
public MangaHere(GlobalBase clone) : base(clone, "MangaHere")
public MangaHere(GlobalBase clone) : base(clone, "MangaHere", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
@ -101,7 +101,7 @@ public class MangaHere : MangaConnector
.SelectSingleNode("//p[contains(concat(' ',normalize-space(@class),' '),' fullcontent ')]");
string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
null, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
@ -117,7 +117,7 @@ public class MangaHere : MangaConnector
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300 || requestResult.htmlDocument is null)
return Array.Empty<Chapter>();
List<string> urls = requestResult.htmlDocument.DocumentNode.SelectNodes("//div[@id='list-2']/ul//li//a[contains(@href, '/manga/')]")
List<string> urls = requestResult.htmlDocument.DocumentNode.SelectNodes("//div[@id='list-1']/ul//li//a[contains(@href, '/manga/')]")
.Select(node => node.GetAttributeValue("href", "")).ToList();
Regex chapterRex = new(@".*\/manga\/[a-zA-Z0-9\-\._\~\!\$\&\'\(\)\*\+\,\;\=\:\@]+\/v([0-9(TBD)]+)\/c([0-9\.]+)\/.*");
@ -129,8 +129,16 @@ public class MangaHere : MangaConnector
string volumeNumber = rexMatch.Groups[1].Value == "TBD" ? "0" : rexMatch.Groups[1].Value;
string chapterNumber = rexMatch.Groups[2].Value;
string fullUrl = $"https://www.mangahere.cc{url}";
try
{
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
@ -181,12 +189,9 @@ public class MangaHere : MangaConnector
}
} while (downloaded++ <= images);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
if (progressToken is not null)
progressToken.increments = images; //we blip to normal length; in DownloadChapterImages it is increased by the amount of urls again
return DownloadChapterImages(imageUrls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
return DownloadChapterImages(imageUrls.ToArray(), chapter, RequestType.MangaImage, progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(HtmlDocument document)

View File

@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class MangaKatana : MangaConnector
{
public MangaKatana(GlobalBase clone) : base(clone, "MangaKatana")
public MangaKatana(GlobalBase clone) : base(clone, "MangaKatana", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
@ -15,7 +15,7 @@ public class MangaKatana : MangaConnector
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join('_', Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
string sanitizedTitle = string.Join("%20", Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
string requestUrl = $"https://mangakatana.com/?search={sanitizedTitle}&search_by=book_name";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
@ -141,8 +141,8 @@ public class MangaKatana : MangaConnector
year = Convert.ToInt32(yearString);
}
Manga manga = new (this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl);
Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
@ -186,8 +186,15 @@ public class MangaKatana : MangaConnector
string? volumeNumber = volumeRex.IsMatch(url) ? volumeRex.Match(url).Groups[1].Value : null;
string chapterNumber = chapterNumRex.Match(url).Groups[1].Value;
string chapterName = chapterNameRex.Match(fullString).Groups[1].Value;
try
{
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
return ret;
}
@ -214,10 +221,7 @@ public class MangaKatana : MangaConnector
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://mangakatana.com/", progressToken:progressToken);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)


@ -1,199 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class MangaLife : MangaConnector
{
public MangaLife(GlobalBase clone) : base(clone, "Manga4Life")
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = WebUtility.UrlEncode(publicationTitle);
string requestUrl = $"https://manga4life.com/search/?name={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://manga4life.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/(www\.)?manga4life.com\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[2].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if(requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNode resultsNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']/div[last()]/div[1]/div");
if (resultsNode.Descendants("div").Count() == 1 && resultsNode.Descendants("div").First().HasClass("NoResults"))
{
Log("No results.");
return Array.Empty<Manga>();
}
Log($"{resultsNode.SelectNodes("div").Count} items.");
HashSet<Manga> ret = new();
foreach (HtmlNode resultNode in resultsNode.SelectNodes("div"))
{
string url = resultNode.Descendants().First(d => d.HasClass("SeriesName")).GetAttributeValue("href", "");
Manga? manga = GetMangaFromUrl($"https://manga4life.com{url}");
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//img");
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//h1");
string sortName = titleNode.InnerText;
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Author(s):']/..").Descendants("a")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Genre(s):']/..").Descendants("a")
.ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Released:']/..").Descendants("a")
.First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Status:']/..").Descendants("a")
.ToArray();
foreach (HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Description:']/..")
.Descendants("div").First();
string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
RequestResult result = downloadClient.MakeRequest($"https://manga4life.com/manga/{manga.publicationId}", RequestType.Default, clickButton:"[class*='ShowAllChapters']");
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
return Array.Empty<Chapter>();
}
HtmlNodeCollection chapterNodes = result.htmlDocument.DocumentNode.SelectNodes(
"//a[contains(concat(' ',normalize-space(@class),' '),' ChapterLink ')]");
string[] urls = chapterNodes.Select(node => node.GetAttributeValue("href", "")).ToArray();
Regex urlRex = new (@"-chapter-([0-9\\.]+)(-index-([0-9\\.]+))?");
List<Chapter> chapters = new();
foreach (string url in urls)
{
Match rexMatch = urlRex.Match(url);
string volumeNumber = "1";
if (rexMatch.Groups[3].Value.Length > 0)
volumeNumber = rexMatch.Groups[3].Value;
string chapterNumber = rexMatch.Groups[1].Value;
string fullUrl = $"https://manga4life.com{url}";
fullUrl = fullUrl.Replace(Regex.Match(url,"(-page-[0-9])").Value,"");
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
}


@ -8,7 +8,7 @@ namespace Tranga.MangaConnectors;
public class Manganato : MangaConnector
{
public Manganato(GlobalBase clone) : base(clone, "Manganato")
public Manganato(GlobalBase clone) : base(clone, "Manganato", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
@ -17,7 +17,7 @@ public class Manganato : MangaConnector
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join('_', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower();
string requestUrl = $"https://manganato.com/search/story/{sanitizedTitle}";
string requestUrl = $"https://manganato.gg/search/story/{sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
@ -32,13 +32,19 @@ public class Manganato : MangaConnector
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
List<HtmlNode> searchResults = document.DocumentNode.Descendants("div").Where(n => n.HasClass("search-story-item")).ToList();
List<HtmlNode> searchResults = document.DocumentNode.Descendants("div").Where(n => n.HasClass("story_item")).ToList();
Log($"{searchResults.Count} items.");
List<string> urls = new();
foreach (HtmlNode mangaResult in searchResults)
{
urls.Add(mangaResult.Descendants("a").First(n => n.HasClass("item-title")).GetAttributes()
.First(a => a.Name == "href").Value);
try
{
urls.Add(mangaResult.Descendants("h3").First(n => n.HasClass("story_name"))
.Descendants("a").First().GetAttributeValue("href", ""));
} catch
{
//failed to get a url, send it to the void
}
}
HashSet<Manga> ret = new();
@ -78,69 +84,57 @@ public class Manganato : MangaConnector
string originalLanguage = "";
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode infoNode = document.DocumentNode.Descendants("div").First(d => d.HasClass("story-info-right"));
HtmlNode infoNode = document.DocumentNode.Descendants("ul").First(d => d.HasClass("manga-info-text"));
string sortName = infoNode.Descendants("h1").First().InnerText;
HtmlNode infoTable = infoNode.Descendants().First(d => d.Name == "table");
foreach (HtmlNode li in infoNode.Descendants("li"))
{
string text = li.InnerText.Trim().ToLower();
foreach (HtmlNode row in infoTable.Descendants("tr"))
if (text.StartsWith("author(s) :"))
{
string key = row.SelectNodes("td").First().InnerText.ToLower();
string value = row.SelectNodes("td").Last().InnerText;
string keySanitized = string.Concat(Regex.Matches(key, "[a-z]"));
switch (keySanitized)
{
case "alternative":
string[] alts = value.Split(" ; ");
for(int i = 0; i < alts.Length; i++)
altTitles.Add(i.ToString(), alts[i]);
break;
case "authors":
authors = value.Split('-');
for (int i = 0; i < authors.Length; i++)
authors[i] = authors[i].Replace("\r\n", "");
break;
case "status":
switch (value.ToLower())
{
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
case "completed": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
authors = li.Descendants("a").Select(a => a.InnerText.Trim()).ToArray();
}
break;
case "genres":
string[] genres = value.Split(" - ");
for (int i = 0; i < genres.Length; i++)
genres[i] = genres[i].Replace("\r\n", "");
tags = genres.ToHashSet();
break;
else if (text.StartsWith("status :"))
{
string status = text.Replace("status :", "").Trim().ToLower();
if (string.IsNullOrWhiteSpace(status))
releaseStatus = Manga.ReleaseStatusByte.Continuing;
else if (status == "ongoing")
releaseStatus = Manga.ReleaseStatusByte.Continuing;
else
releaseStatus = Enum.Parse<Manga.ReleaseStatusByte>(status, true);
}
else if (li.HasClass("genres"))
{
tags = li.Descendants("a").Select(a => a.InnerText.Trim()).ToHashSet();
}
}
string posterUrl = document.DocumentNode.Descendants("span").First(s => s.HasClass("info-image")).Descendants("img").First()
string posterUrl = document.DocumentNode.Descendants("div").First(s => s.HasClass("manga-info-pic")).Descendants("img").First()
.GetAttributes().First(a => a.Name == "src").Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover, "https://www.manganato.gg/");
string description = document.DocumentNode.Descendants("div").First(d => d.HasClass("panel-story-info-description"))
string description = document.DocumentNode.SelectSingleNode("//div[@id='contentBox']")
.InnerText.Replace("Description :", "");
while (description.StartsWith('\n'))
description = description.Substring(1);
string pattern = "MMM dd,yyyy HH:mm";
string pattern = "MMM-dd-yyyy HH:mm";
HtmlNode oldestChapter = document.DocumentNode
.SelectNodes("//span[contains(concat(' ',normalize-space(@class),' '),' chapter-time ')]").MaxBy(
node => DateTime.ParseExact(node.GetAttributeValue("title", "Dec 31 2400, 23:59"), pattern,
CultureInfo.InvariantCulture).Millisecond)!;
HtmlNode? oldestChapter = document.DocumentNode
.SelectNodes("//div[contains(concat(' ',normalize-space(@class),' '),' row ')]/span[@title]").MaxBy(
node => DateTime.ParseExact(node.GetAttributeValue("title", "Dec-31-2400 23:59"), pattern,
CultureInfo.InvariantCulture).Millisecond);
int year = DateTime.ParseExact(oldestChapter.GetAttributeValue("title", "Dec 31 2400, 23:59"), pattern,
int year = DateTime.ParseExact(oldestChapter?.GetAttributeValue("title", "Dec 31 2400, 23:59")??"Dec 31 2400, 23:59", pattern,
CultureInfo.InvariantCulture).Year;
Manga manga = new (this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl);
Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
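As a quick check on the new timestamp handling, a minimal sketch of parsing a Manganato chapter title attribute with the "MMM-dd-yyyy HH:mm" pattern introduced above (the sample attribute value is hypothetical):
using System;
using System.Globalization;
class ChapterDateDemo
{
    static void Main()
    {
        const string pattern = "MMM-dd-yyyy HH:mm"; // format string used by the updated connector
        string titleAttribute = "Apr-03-2024 18:45"; // hypothetical title attribute of a chapter row
        DateTime uploaded = DateTime.ParseExact(titleAttribute, pattern, CultureInfo.InvariantCulture);
        Console.WriteLine(uploaded.Year); // 2024 -- only the year is kept for the Manga record
    }
}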
@ -148,7 +142,7 @@ public class Manganato : MangaConnector
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://chapmanganato.com/{manga.publicationId}";
string requestUrl = manga.websiteUrl;
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
@ -166,23 +160,30 @@ public class Manganato : MangaConnector
{
List<Chapter> ret = new();
HtmlNode chapterList = document.DocumentNode.Descendants("ul").First(l => l.HasClass("row-content-chapter"));
HtmlNode chapterList = document.DocumentNode.Descendants("div").First(l => l.HasClass("chapter-list"));
Regex volRex = new(@"Vol\.([0-9]+).*");
Regex chapterRex = new(@"https:\/\/chapmanganato.[A-z]+\/manga-[A-z0-9]+\/chapter-([0-9\.]+)");
Regex nameRex = new(@"Chapter ([0-9]+(\.[0-9]+)*){1}:? (.*)");
foreach (HtmlNode chapterInfo in chapterList.Descendants("li"))
foreach (HtmlNode chapterInfo in chapterList.Descendants("div").Where(x => x.HasClass("row")))
{
string url = chapterInfo.Descendants("a").First().GetAttributeValue("href", "");
var name = chapterInfo.Descendants("a").First().InnerText.Trim();
string chapterName = nameRex.Match(name).Groups[3].Value;
string chapterNumber = Regex.Match(name, @"Chapter ([0-9]+(\.[0-9]+)*)").Groups[1].Value;
string? volumeNumber = Regex.Match(chapterName, @"Vol\.([0-9]+)").Groups[1].Value;
if (string.IsNullOrWhiteSpace(volumeNumber))
volumeNumber = "0";
try
{
string fullString = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name")).InnerText;
string url = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name"))
.GetAttributeValue("href", "");
string? volumeNumber = volRex.IsMatch(fullString) ? volRex.Match(fullString).Groups[1].Value : null;
string chapterNumber = chapterRex.Match(url).Groups[1].Value;
string chapterName = nameRex.Match(fullString).Groups[3].Value;
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
ret.Reverse();
return ret;
}
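To illustrate the chapter parsing above, a small self-contained sketch that applies the same three regexes to a hypothetical chapter entry (the link text and URL are made up to mirror the expected markup):
using System;
using System.Text.RegularExpressions;
class ManganatoChapterDemo
{
    static void Main()
    {
        Regex volRex = new(@"Vol\.([0-9]+).*");
        Regex chapterRex = new(@"https:\/\/chapmanganato.[A-z]+\/manga-[A-z0-9]+\/chapter-([0-9\.]+)");
        Regex nameRex = new(@"Chapter ([0-9]+(\.[0-9]+)*){1}:? (.*)");
        string fullString = "Vol.3 Chapter 24.5: The Long Night"; // hypothetical chapter-name text
        string url = "https://chapmanganato.to/manga-ab123456/chapter-24.5"; // hypothetical href
        string? volumeNumber = volRex.IsMatch(fullString) ? volRex.Match(fullString).Groups[1].Value : null;
        string chapterNumber = chapterRex.Match(url).Groups[1].Value;
        string chapterName = nameRex.Match(fullString).Groups[3].Value;
        Console.WriteLine($"Vol.{volumeNumber} Ch.{chapterNumber}: {chapterName}"); // Vol.3 Ch.24.5: The Long Night
    }
}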
@ -214,10 +215,7 @@ public class Manganato : MangaConnector
string[] imageUrls = ParseImageUrlsFromHtml(requestResult.htmlDocument);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://chapmanganato.com/", progressToken:progressToken);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, "https://www.manganato.gg", progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(HtmlDocument document)


@ -1,229 +0,0 @@
using System.Data;
using System.Net;
using System.Text.RegularExpressions;
using System.Xml.Linq;
using HtmlAgilityPack;
using Newtonsoft.Json;
using Soenneker.Utils.String.NeedlemanWunsch;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Mangasee : MangaConnector
{
public Mangasee(GlobalBase clone) : base(clone, "Mangasee")
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
private struct SearchResult
{
public string i { get; set; }
public string s { get; set; }
public string[] a { get; set; }
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string requestUrl = "https://mangasee123.com/_search.php";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
Log($"Failed to retrieve search: {requestResult.statusCode}");
return Array.Empty<Manga>();
}
try
{
SearchResult[] searchResults = JsonConvert.DeserializeObject<SearchResult[]>(requestResult.htmlDocument!.DocumentNode.InnerText) ??
throw new NoNullAllowedException();
SearchResult[] filteredResults = FilteredResults(publicationTitle, searchResults);
Log($"Total available manga: {searchResults.Length} Filtered down to: {filteredResults.Length}");
string[] urls = filteredResults.Select(result => $"https://mangasee123.com/manga/{result.i}").ToArray();
List<Manga> searchResultManga = new();
foreach (string url in urls)
{
Manga? newManga = GetMangaFromUrl(url);
if(newManga is { } manga)
searchResultManga.Add(manga);
}
Log($"Retrieved {searchResultManga.Count} publications. Term=\"{publicationTitle}\"");
return searchResultManga.ToArray();
}
catch (NoNullAllowedException)
{
Log("Failed to retrieve search");
return Array.Empty<Manga>();
}
}
private readonly string[] _filterWords = {"a", "the", "of", "as", "to", "no", "for", "on", "with", "be", "and", "in", "wa", "at", "be", "ni"};
private string ToFilteredString(string input) => string.Join(' ', input.ToLower().Split(' ').Where(word => _filterWords.Contains(word) == false));
private SearchResult[] FilteredResults(string publicationTitle, SearchResult[] unfilteredSearchResults)
{
Dictionary<SearchResult, int> similarity = new();
foreach (SearchResult sr in unfilteredSearchResults)
{
List<int> scores = new();
string filteredPublicationString = ToFilteredString(publicationTitle);
string filteredSString = ToFilteredString(sr.s);
scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredSString, filteredPublicationString));
foreach (string srA in sr.a)
{
string filteredAString = ToFilteredString(srA);
scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredAString, filteredPublicationString));
}
similarity.Add(sr, scores.Sum() / scores.Count);
}
List<SearchResult> ret = similarity.OrderBy(s => s.Value).Take(10).Select(s => s.Key).ToList();
return ret.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://mangasee123.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/mangasee123.com\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[1].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if((int)requestResult.statusCode < 300 && (int)requestResult.statusCode >= 200 && requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//img");
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//h1");
string sortName = titleNode.InnerText;
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Author(s):']/..").Descendants("a")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Genre(s):']/..").Descendants("a")
.ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Released:']/..").Descendants("a")
.First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Status:']/..").Descendants("a")
.ToArray();
foreach (HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Description:']/..")
.Descendants("div").First();
string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
try
{
XDocument doc = XDocument.Load($"https://mangasee123.com/rss/{manga.publicationId}.xml");
XElement[] chapterItems = doc.Descendants("item").ToArray();
List<Chapter> chapters = new();
Regex chVolRex = new(@".*chapter-([0-9\.]+)(?:-index-([0-9\.]+))?.*");
foreach (XElement chapter in chapterItems)
{
string url = chapter.Descendants("link").First().Value;
Match m = chVolRex.Match(url);
string? volumeNumber = m.Groups[2].Success ? m.Groups[2].Value : "1";
string chapterNumber = m.Groups[1].Value;
string chapterUrl = Regex.Replace(url, @"-page-[0-9]+(\.html)", ".html");
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, chapterUrl));
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
catch (HttpRequestException e)
{
Log($"Failed to load https://mangasee123.com/rss/{manga.publicationId}.xml \n\r{e}");
return Array.Empty<Chapter>();
}
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
}


@ -7,9 +7,9 @@ namespace Tranga.MangaConnectors;
public class Mangaworld: MangaConnector
{
public Mangaworld(GlobalBase clone) : base(clone, "Mangaworld")
public Mangaworld(GlobalBase clone) : base(clone, "Mangaworld", ["it"])
{
this.downloadClient = new HttpDownloadClient(clone);
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
@ -118,8 +118,8 @@ public class Mangaworld: MangaConnector
string yearString = metadata.SelectSingleNode("//span[text()='Anno di uscita: ']/..").SelectNodes("a").First().InnerText;
int year = Convert.ToInt32(yearString);
Manga manga = new (this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl);
Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
@ -149,19 +149,28 @@ public class Mangaworld: MangaConnector
document.DocumentNode.SelectSingleNode(
"//div[contains(concat(' ',normalize-space(@class),' '),'chapters-wrapper')]");
Regex volumeRex = new(@"[Vv]olume ([0-9]+).*");
Regex chapterRex = new(@"[Cc]apitolo ([0-9]+(?:\.[0-9]+)?).*");
Regex idRex = new(@".*\/read\/([a-z0-9]+)(?:[?\/].*)?");
if (chaptersWrapper.Descendants("div").Any(descendant => descendant.HasClass("volume-element")))
{
foreach (HtmlNode volNode in document.DocumentNode.SelectNodes("//div[contains(concat(' ',normalize-space(@class),' '),'volume-element')]"))
{
string volume = Regex.Match(volNode.SelectNodes("div").First(node => node.HasClass("volume")).SelectSingleNode("p").InnerText,
@"[Vv]olume ([0-9]+).*").Groups[1].Value;
string volume = volumeRex.Match(volNode.SelectNodes("div").First(node => node.HasClass("volume")).SelectSingleNode("p").InnerText).Groups[1].Value;
foreach (HtmlNode chNode in volNode.SelectNodes("div").First(node => node.HasClass("volume-chapters")).SelectNodes("div"))
{
string number = Regex.Match(chNode.SelectSingleNode("a").SelectSingleNode("span").InnerText,
@"[Cc]apitolo ([0-9]+).*").Groups[1].Value;
string number = chapterRex.Match(chNode.SelectSingleNode("a").SelectSingleNode("span").InnerText).Groups[1].Value;
string url = chNode.SelectSingleNode("a").GetAttributeValue("href", "");
ret.Add(new Chapter(manga, null, volume, number, url));
string id = idRex.Match(chNode.SelectSingleNode("a").GetAttributeValue("href", "")).Groups[1].Value;
try
{
ret.Add(new Chapter(manga, null, volume, number, url, id));
}
catch (Exception e)
{
Log($"Failed to load chapter {number}: {e.Message}");
}
}
}
}
@ -169,10 +178,17 @@ public class Mangaworld: MangaConnector
{
foreach (HtmlNode chNode in chaptersWrapper.SelectNodes("div").Where(node => node.HasClass("chapter")))
{
string number = Regex.Match(chNode.SelectSingleNode("a").SelectSingleNode("span").InnerText,
@"[Cc]apitolo ([0-9]+).*").Groups[1].Value;
string number = chapterRex.Match(chNode.SelectSingleNode("a").SelectSingleNode("span").InnerText).Groups[1].Value;
string url = chNode.SelectSingleNode("a").GetAttributeValue("href", "");
ret.Add(new Chapter(manga, null, null, number, url));
string id = idRex.Match(chNode.SelectSingleNode("a").GetAttributeValue("href", "")).Groups[1].Value;
try
{
ret.Add(new Chapter(manga, null, null, number, url, id));
}
catch (Exception e)
{
Log($"Failed to load chapter {number}: {e.Message}");
}
}
}
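A short sketch of how the new idRex extracts the chapter id from a Mangaworld read link (the URL below is hypothetical and only mirrors the /read/<id> shape expected above):
using System;
using System.Text.RegularExpressions;
class MangaworldIdDemo
{
    static void Main()
    {
        Regex idRex = new(@".*\/read\/([a-z0-9]+)(?:[?\/].*)?");
        string href = "https://www.mangaworld.bz/manga/1234/some-title/read/abc123def?style=list"; // hypothetical link
        string id = idRex.Match(href).Groups[1].Value;
        Console.WriteLine(id); // abc123def
    }
}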
@ -207,10 +223,7 @@ public class Mangaworld: MangaConnector
string[] imageUrls = ParseImageUrlsFromHtml(requestResult.htmlDocument);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://www.mangaworld.bz/", progressToken:progressToken);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage,"https://www.mangaworld.bz/", progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(HtmlDocument document)


@ -7,7 +7,7 @@ namespace Tranga.MangaConnectors;
public class ManhuaPlus : MangaConnector
{
public ManhuaPlus(GlobalBase clone) : base(clone, "ManhuaPlus")
public ManhuaPlus(GlobalBase clone) : base(clone, "ManhuaPlus", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
@ -82,21 +82,36 @@ public class ManhuaPlus : MangaConnector
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//h1");
string sortName = titleNode.InnerText.Replace("\n", "");
List<string> authors = new();
try
{
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//a[contains(@href, 'https://manhuaplus.org/authors/')]")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
}
catch (ArgumentNullException e)
{
Log("No authors found.");
}
try
{
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//a[contains(@href, 'https://manhuaplus.org/genres/')]").ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText.Replace("\n", ""));
}
catch (ArgumentNullException e)
{
Log("No genres found");
}
string yearNodeStr = document.DocumentNode
.SelectSingleNode("//aside//i[contains(concat(' ',normalize-space(@class),' '),' fa-clock ')]/../span").InnerText.Replace("\n", "");
int year = int.Parse(yearNodeStr.Split(' ')[0].Split('/')[^1]);
Regex yearRex = new(@"(?:[0-9]{1,2}\/){2}([0-9]{2,4}) [0-9]{1,2}:[0-9]{1,2}");
HtmlNode yearNode = document.DocumentNode.SelectSingleNode("//aside//i[contains(concat(' ',normalize-space(@class),' '),' fa-clock ')]/../span");
Match match = yearRex.Match(yearNode.InnerText);
int year = match.Success && match.Groups[1].Success ? int.Parse(match.Groups[1].Value) : 1960;
status = document.DocumentNode.SelectSingleNode("//aside//i[contains(concat(' ',normalize-space(@class),' '),' fa-rss ')]/../span").InnerText.Replace("\n", "");
switch (status.ToLower())
@ -112,7 +127,7 @@ public class ManhuaPlus : MangaConnector
.SelectSingleNode("//div[@id='syn-target']");
string description = descriptionNode.InnerText;
Manga manga = new(this, sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
@ -140,8 +155,15 @@ public class ManhuaPlus : MangaConnector
string volumeNumber = "1";
string chapterNumber = rexMatch.Groups[1].Value;
string fullUrl = url;
try
{
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
catch (Exception e)
{
Log($"Failed to load chapter {chapterNumber}: {e.Message}");
}
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
@ -176,9 +198,6 @@ public class ManhuaPlus : MangaConnector
HtmlNode[] images = document.DocumentNode.SelectNodes("//a[contains(concat(' ',normalize-space(@class),' '),' readImg ')]/img").ToArray();
List<string> urls = images.Select(node => node.GetAttributeValue("src", "")).ToList();
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
return DownloadChapterImages(urls.ToArray(), chapter, RequestType.MangaImage, progressToken:progressToken);
}
}


@ -0,0 +1,273 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Webtoons : MangaConnector
{
public Webtoons(GlobalBase clone) : base(clone, "Webtoons", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
// Done
public override Manga[] GetManga(string publicationTitle = "")
{
string sanitizedTitle = string.Join(' ', Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string requestUrl = $"https://www.webtoons.com/en/search?keyword={sanitizedTitle}&searchType=WEBTOON";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300) {
Log($"Failed to retrieve site");
return Array.Empty<Manga>();
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<Manga>();
}
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
// Done
public override Manga? GetMangaFromId(string publicationId)
{
PublicationManager pb = new PublicationManager(publicationId);
return GetMangaFromUrl($"https://www.webtoons.com/en/{pb.Category}/{pb.Title}/list?title_no={pb.Id}");
}
// Done
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300) {
return null;
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return null;
}
Regex regex = new Regex(@".*webtoons\.com\/en\/(?<category>[^\/]+)\/(?<title>[^\/]+)\/list\?title_no=(?<id>\d+).*");
Match match = regex.Match(url);
if(match.Success) {
PublicationManager pm = new PublicationManager(match.Groups["title"].Value, match.Groups["category"].Value, match.Groups["id"].Value);
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, pm.getPublicationId(), url);
}
Log($"Failed match Regex ID");
return null;
}
// Done
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNode mangaList = document.DocumentNode.SelectSingleNode("//ul[contains(@class, 'card_lst')]");
if (!mangaList.ChildNodes.Any(node => node.Name == "li")) {
Log($"Failed to parse publication");
return Array.Empty<Manga>();
}
List<string> urls = document.DocumentNode
.SelectNodes("//ul[contains(@class, 'card_lst')]/li/a")
.Select(node => node.GetAttributeValue("href", "https://www.webtoons.com"))
.ToList();
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private string capitalizeString(string str = "") {
if(str.Length == 0) return "";
if(str.Length == 1) return str.ToUpper();
return char.ToUpper(str[0]) + str.Substring(1).ToLower();
}
// Done
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
HtmlNode infoNode1 = document.DocumentNode.SelectSingleNode("//*[@id='content']/div[2]/div[1]/div[1]");
HtmlNode infoNode2 = document.DocumentNode.SelectSingleNode("//*[@id='content']/div[2]/div[2]/div[2]");
string sortName = infoNode1.SelectSingleNode(".//h1[contains(@class, 'subj')]").InnerText;
string description = infoNode2.SelectSingleNode(".//p[contains(@class, 'summary')]")
.InnerText.Trim();
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[contains(@class, 'detail_body') and contains(@class, 'banner')]");
Regex regex = new Regex(@"url\('(?<url>.*?)'\)");
Match match = regex.Match(posterNode.GetAttributeValue("style", ""));
string posterUrl = match.Groups["url"].Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover, websiteUrl);
string genre = infoNode1.SelectSingleNode(".//h2[contains(@class, 'genre')]")
.InnerText.Trim();
string[] tags = [ genre ];
List<HtmlNode> authorsNodes = infoNode1.SelectSingleNode(".//div[contains(@class, 'author_area')]").Descendants("a").ToList();
List<string> authors = authorsNodes.Select(node => node.InnerText.Trim()).ToList();
string originalLanguage = "";
int year = DateTime.Now.Year;
string status1 = infoNode2.SelectSingleNode(".//p").InnerText;
string status2 = infoNode2.SelectSingleNode(".//p/span").InnerText;
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
if(status2.Length == 0 || status1.ToLower() == "completed") {
releaseStatus = Manga.ReleaseStatusByte.Completed;
} else if(status2.ToLower() == "up") {
releaseStatus = Manga.ReleaseStatusByte.Continuing;
}
Manga manga = new(sortName, authors, description, new Dictionary<string, string>(), tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(),
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
// Done
public override Chapter[] GetChapters(Manga manga, string language = "en")
{
PublicationManager pm = new PublicationManager(manga.publicationId);
string requestUrl = $"https://www.webtoons.com/en/{pm.Category}/{pm.Title}/list?title_no={pm.Id}";
// Leaving this in for verification if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
// Get number of pages
int pages = requestResult.htmlDocument.DocumentNode
.SelectNodes("//div[contains(@class, 'paginate')]/a")
.ToList()
.Count;
List<Chapter> chapters = new List<Chapter>();
for(int page = 1; page <= pages; page++) {
string pageRequestUrl = $"{requestUrl}&page={page}";
chapters.AddRange(ParseChaptersFromHtml(manga, pageRequestUrl));
}
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
// Done
private List<Chapter> ParseChaptersFromHtml(Manga manga, string mangaUrl)
{
RequestResult result = downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
Log("Failed to load site");
return new List<Chapter>();
}
List<Chapter> ret = new();
foreach (HtmlNode chapterInfo in result.htmlDocument.DocumentNode.SelectNodes("//ul/li[contains(@class, '_episodeItem')]"))
{
HtmlNode infoNode = chapterInfo.SelectSingleNode(".//a");
string url = infoNode.GetAttributeValue("href", "");
string id = chapterInfo.GetAttributeValue("id", "");
if(id == "") continue;
string? volumeNumber = null;
string chapterNumber = chapterInfo.GetAttributeValue("data-episode-no", "");
if(chapterNumber == "") continue;
string chapterName = infoNode.SelectSingleNode(".//span[contains(@class, 'subj')]/span").InnerText.Trim();
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
return ret;
}
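For clarity, a minimal sketch of the same episode-list parsing against a hypothetical HTML fragment, assuming the HtmlAgilityPack package already used by these connectors:
using System;
using HtmlAgilityPack;
class WebtoonsEpisodeDemo
{
    static void Main()
    {
        // Hypothetical fragment shaped like the episode list the parser above expects.
        string html = """
            <ul>
              <li class="_episodeItem" id="episode_3" data-episode-no="3">
                <a href="https://www.webtoons.com/en/fantasy/some-title/ep-3/viewer?title_no=95&episode_no=3">
                  <span class="subj"><span>Episode 3 - The Gate</span></span>
                </a>
              </li>
            </ul>
            """;
        HtmlDocument doc = new();
        doc.LoadHtml(html);
        foreach (HtmlNode li in doc.DocumentNode.SelectNodes("//ul/li[contains(@class, '_episodeItem')]"))
        {
            HtmlNode a = li.SelectSingleNode(".//a");
            string url = a.GetAttributeValue("href", "");
            string chapterNumber = li.GetAttributeValue("data-episode-no", "");
            string chapterName = a.SelectSingleNode(".//span[contains(@class, 'subj')]/span").InnerText.Trim();
            Console.WriteLine($"{chapterNumber}: {chapterName} -> {url}");
        }
    }
}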
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = chapter.url;
// Leaving this in to check if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
return DownloadChapterImages(imageUrls, chapter, RequestType.MangaImage, progressToken:progressToken, referrer: requestUrl);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)
{
RequestResult requestResult =
downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
return Array.Empty<string>();
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<string>();
}
return requestResult.htmlDocument.DocumentNode
.SelectNodes("//*[@id='_imageList']/img")
.Select(node =>
node.GetAttributeValue("data-url", ""))
.ToArray();
}
}
internal class PublicationManager {
public PublicationManager(string title = "", string category = "", string id = "") {
this.Title = title;
this.Category = category;
this.Id = id;
}
public PublicationManager(string publicationId) {
string[] parts = publicationId.Split("|");
if(parts.Length == 3) {
this.Title = parts[0];
this.Category = parts[1];
this.Id = parts[2];
} else {
this.Title = "";
this.Category = "";
this.Id = "";
}
}
public string getPublicationId() {
return $"{this.Title}|{this.Category}|{this.Id}";
}
public string Title { get; set; }
public string Category { get; set; }
public string Id { get; set; }
}
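To make the publicationId convention explicit, a tiny usage sketch of the PublicationManager round-trip defined above (the concrete title, category and id values are hypothetical):
using System;
class PublicationIdDemo
{
    static void Main()
    {
        PublicationManager pm = new("tower-of-god", "fantasy", "95"); // hypothetical values
        string publicationId = pm.getPublicationId();   // "tower-of-god|fantasy|95"
        PublicationManager parsed = new(publicationId); // splits on '|' back into Title/Category/Id
        Console.WriteLine($"{parsed.Title} / {parsed.Category} / {parsed.Id}");
    }
}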


@ -0,0 +1,215 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Weebcentral : MangaConnector
{
private readonly string _baseUrl = "https://weebcentral.com";
private readonly string[] _filterWords =
{ "a", "the", "of", "as", "to", "no", "for", "on", "with", "be", "and", "in", "wa", "at", "be", "ni" };
public Weebcentral(GlobalBase clone) : base(clone, "Weebcentral", ["en"])
{
downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
const int limit = 32; //How many values we want returned at once
int offset = 0; //"Page"
string requestUrl =
$"{_baseUrl}/search/data?limit={limit}&offset={offset}&text={publicationTitle}&sort=Best+Match&order=Ascending&official=Any&display_mode=Minimal%20Display";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300 ||
requestResult.htmlDocument == null)
{
Log($"Failed to retrieve search: {requestResult.statusCode}");
return [];
}
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
if (document.DocumentNode.SelectNodes("//article") == null)
return [];
List<string> urls = document.DocumentNode.SelectNodes("/html/body/article/a[@class='link link-hover tooltip tooltip-bottom']")
.Select(elem => elem.GetAttributeValue("href", "")).ToList();
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/weebcentral\.com\/series\/(\w*)\/(.*)");
string publicationId = publicationIdRex.Match(url).Groups[1].Value;
RequestResult requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 300 && (int)requestResult.statusCode >= 200 &&
requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
HtmlNode? posterNode =
document.DocumentNode.SelectSingleNode("//section[@class='flex items-center justify-center']/picture/img");
string posterUrl = posterNode?.GetAttributeValue("src", "") ?? "";
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode? titleNode = document.DocumentNode.SelectSingleNode("//section/h1");
string sortName = titleNode?.InnerText ?? "Undefined";
HtmlNode[] authorsNodes =
document.DocumentNode.SelectNodes("//ul/li[strong/text() = 'Author(s): ']/span")?.ToArray() ?? [];
List<string> authors = authorsNodes.Select(n => n.InnerText).ToList();
HtmlNode[] genreNodes =
document.DocumentNode.SelectNodes("//ul/li[strong/text() = 'Tags(s): ']/span")?.ToArray() ?? [];
HashSet<string> tags = genreNodes.Select(n => n.InnerText).ToHashSet();
HtmlNode? statusNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Status: ']/a");
string status = statusNode?.InnerText ?? "";
Log("unable to parse status");
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode? yearNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Released: ']/span");
int year = Convert.ToInt32(yearNode?.InnerText ?? "0");
HtmlNode? descriptionNode = document.DocumentNode.SelectSingleNode("//ul/li[strong/text() = 'Description']/p");
string description = descriptionNode?.InnerText ?? "Undefined";
HtmlNode[] altTitleNodes = document.DocumentNode
.SelectNodes("//ul/li[strong/text() = 'Associated Name(s)']/ul/li")?.ToArray() ?? [];
Dictionary<string, string> altTitles = new(), links = new();
for (int i = 0; i < altTitleNodes.Length; i++)
altTitles.Add(i.ToString(), altTitleNodes[i].InnerText);
string originalLanguage = "";
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://weebcentral.com/series/{publicationId}");
}
public override Chapter[] GetChapters(Manga manga, string language = "en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"{_baseUrl}/series/{manga.publicationId}/full-chapter-list";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return [];
//Return Chapters ordered by Chapter-Number
if (requestResult.htmlDocument is null)
return [];
List<Chapter> chapters = ParseChaptersFromHtml(manga, requestResult.htmlDocument);
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.OrderByDescending(c => c.name).ThenBy(c => c.volumeNumber).ThenBy(c => c.chapterNumber).ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, HtmlDocument document)
{
HtmlNode? chaptersWrapper = document.DocumentNode.SelectSingleNode("/html/body");
Regex chapterRex = new(@"(\d+(?:\.\d+)*)");
Regex chapterNameRex = new(@"(\w* )+");
Regex idRex = new(@"https:\/\/weebcentral\.com\/chapters\/(\w*)");
List<Chapter> ret = chaptersWrapper.Descendants("a").Select(elem =>
{
string url = elem.GetAttributeValue("href", "") ?? "Undefined";
if (!url.StartsWith("https://") && !url.StartsWith("http://"))
return new Chapter(manga, null, null, "-1", "undefined");
Match idMatch = idRex.Match(url);
string? id = idMatch.Success ? idMatch.Groups[1].Value : null;
string chapterNode = elem.SelectSingleNode("span[@class='grow flex items-center gap-2']/span")?.InnerText ??
"Undefined";
MatchCollection chapterNumberMatch = chapterRex.Matches(chapterNode);
string chapterNumber = chapterNumberMatch.Count > 0 ? chapterNumberMatch[^1].Groups[1].Value : "-1";
MatchCollection chapterNameMatch = chapterNameRex.Matches(chapterNode);
string chapterName = chapterNameMatch.Count > 0
? string.Join(" - ",
chapterNameMatch.Select(m => m.Groups[1].Value.Trim())
.Where(name => name.Length > 0 && !name.Equals("Chapter", StringComparison.OrdinalIgnoreCase)).ToArray()).Trim()
: "";
return new Chapter(manga, chapterName != "" ? chapterName : null, null, chapterNumber, url, id);
}).Where(elem => elem.chapterNumber != -1 && elem.url != "undefined").ToList();
ret.Reverse();
return ret;
}
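As an illustration of the chapter-number and id extraction above, a small standalone sketch with hypothetical input values shaped like Weebcentral's markup:
using System;
using System.Text.RegularExpressions;
class WeebcentralChapterDemo
{
    static void Main()
    {
        Regex chapterRex = new(@"(\d+(?:\.\d+)*)");
        Regex idRex = new(@"https:\/\/weebcentral\.com\/chapters\/(\w*)");
        string chapterNode = "Chapter 105.5";                     // hypothetical span text
        string url = "https://weebcentral.com/chapters/01ABCDEF"; // hypothetical chapter link
        MatchCollection numberMatches = chapterRex.Matches(chapterNode);
        string chapterNumber = numberMatches.Count > 0 ? numberMatches[^1].Groups[1].Value : "-1";
        string? id = idRex.IsMatch(url) ? idRex.Match(url).Groups[1].Value : null;
        Console.WriteLine($"chapter {chapterNumber}, id {id}"); // chapter 105.5, id 01ABCDEF
    }
}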
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument? document = requestResult.htmlDocument;
HtmlNode[] imageNodes =
document.DocumentNode.SelectNodes($"//section[@hx-get='{chapter.url}/images']/img")?.ToArray() ?? [];
string[] urls = imageNodes.Select(imgNode => imgNode.GetAttributeValue("src", "")).ToArray();
return DownloadChapterImages(urls, chapter, RequestType.MangaImage, progressToken: progressToken, referrer: "https://weebcentral.com/");
}
}


@ -24,7 +24,7 @@ public class Gotify : NotificationConnector
return $"Gotify {endpoint}";
}
public override void SendNotification(string title, string notificationText)
protected override void SendNotificationInternal(string title, string notificationText)
{
Log($"Sending notification: {title} - {notificationText}");
MessageData message = new(title, notificationText);


@ -20,7 +20,7 @@ public class LunaSea : NotificationConnector
return $"LunaSea {id}";
}
public override void SendNotification(string title, string notificationText)
protected override void SendNotificationInternal(string title, string notificationText)
{
Log($"Sending notification: {title} - {notificationText}");
MessageData message = new(title, notificationText);


@ -3,14 +3,72 @@
public abstract class NotificationConnector : GlobalBase
{
public readonly NotificationConnectorType notificationConnectorType;
private DateTime? _notificationRequested = null;
private readonly Thread? _notificationBufferThread = null;
private const int NoChangeTimeout = 3, BiggestInterval = 30;
private List<KeyValuePair<string, string>> _notifications = new();
protected NotificationConnector(GlobalBase clone, NotificationConnectorType notificationConnectorType) : base(clone)
{
Log($"Creating notificationConnector {Enum.GetName(notificationConnectorType)}");
this.notificationConnectorType = notificationConnectorType;
if (TrangaSettings.bufferLibraryUpdates)
{
_notificationBufferThread = new(CheckNotificationBuffer);
_notificationBufferThread.Start();
}
}
private void CheckNotificationBuffer()
{
while (true)
{
if (_notificationRequested is not null && DateTime.Now.Subtract((DateTime)_notificationRequested) > TimeSpan.FromMinutes(NoChangeTimeout)) //If no updates have been requested for NoChangeTimeout minutes, update library
{
string[] uniqueTitles = _notifications.DistinctBy(n => n.Key).Select(n => n.Key).ToArray();
Log($"Notification Buffer sending! Notifications: {string.Join(", ", uniqueTitles)}");
foreach (string ut in uniqueTitles)
{
string[] texts = _notifications.Where(n => n.Key == ut).Select(n => n.Value).ToArray();
SendNotificationInternal($"{ut} ({texts.Length})", string.Join('\n', texts));
}
_notificationRequested = null;
_notifications.Clear();
}
Thread.Sleep(100);
}
}
public enum NotificationConnectorType : byte { Gotify = 0, LunaSea = 1, Ntfy = 2 }
public abstract void SendNotification(string title, string notificationText);
public void SendNotification(string title, string notificationText, bool buffer = false)
{
_notificationRequested ??= DateTime.Now;
if (!TrangaSettings.bufferNotifications || !buffer)
{
SendNotificationInternal(title, notificationText);
return;
}
_notifications.Add(new(title, notificationText));
if (_notificationRequested is not null &&
DateTime.Now.Subtract((DateTime)_notificationRequested) > TimeSpan.FromMinutes(BiggestInterval)) //If the last update has been more than BiggestInterval minutes ago, update library
{
string[] uniqueTitles = _notifications.DistinctBy(n => n.Key).Select(n => n.Key).ToArray();
foreach (string ut in uniqueTitles)
{
string[] texts = _notifications.Where(n => n.Key == ut).Select(n => n.Value).ToArray();
SendNotificationInternal(ut, string.Join('\n', texts));
}
_notificationRequested = null;
_notifications.Clear();
}
else if(_notificationRequested is not null)
{
Log($"Buffering Notifications (Updates in latest {((DateTime)_notificationRequested).Add(TimeSpan.FromMinutes(BiggestInterval)).Subtract(DateTime.Now)} or {((DateTime)_notificationRequested).Add(TimeSpan.FromMinutes(NoChangeTimeout)).Subtract(DateTime.Now)})");
}
}
protected abstract void SendNotificationInternal(string title, string notificationText);
}
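As an illustration of the new override surface: concrete connectors (Gotify, LunaSea, Ntfy) now override the protected SendNotificationInternal, while callers go through the public SendNotification wrapper and may pass buffer: true so the base class batches messages per title and flushes them after the timeouts above. A minimal hypothetical connector sketch:
// Hypothetical connector; ConsoleNotification does not exist in the codebase and the
// NotificationConnectorType value is reused purely for illustration.
public class ConsoleNotification : NotificationConnector
{
    public ConsoleNotification(GlobalBase clone)
        : base(clone, NotificationConnectorType.Ntfy)
    {
    }
    protected override void SendNotificationInternal(string title, string notificationText)
    {
        // Only log instead of calling an external service.
        Log($"[notification] {title}: {notificationText}");
    }
}
// Usage (buffered): connector.SendNotification("Manga updated", "Chapter 12 downloaded", buffer: true);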


@ -54,7 +54,7 @@ public class Ntfy : NotificationConnector
return $"Ntfy {endpoint} {topic}";
}
public override void SendNotification(string title, string notificationText)
protected override void SendNotificationInternal(string title, string notificationText)
{
Log($"Sending notification: {title} - {notificationText}");
MessageData message = new(title, topic, notificationText);

Tranga/Server.cs (new file, 772 lines)

@ -0,0 +1,772 @@
using System.Net;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using Tranga.Jobs;
using Tranga.LibraryConnectors;
using Tranga.MangaConnectors;
using Tranga.NotificationConnectors;
namespace Tranga;
public class Server : GlobalBase
{
private readonly HttpListener _listener = new ();
private readonly Tranga _parent;
public Server(Tranga parent) : base(parent)
{
this._parent = parent;
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
this._listener.Prefixes.Add($"http://*:{TrangaSettings.apiPortNumber}/");
else
this._listener.Prefixes.Add($"http://localhost:{TrangaSettings.apiPortNumber}/");
Thread listenThread = new (Listen);
listenThread.Start();
Thread watchThread = new(WatchRunning);
watchThread.Start();
}
private void WatchRunning()
{
while(_parent.keepRunning)
Thread.Sleep(1000);
this._listener.Close();
}
private void Listen()
{
this._listener.Start();
foreach(string prefix in this._listener.Prefixes)
Log($"Listening on {prefix}");
while (this._listener.IsListening && _parent.keepRunning)
{
try
{
HttpListenerContext context = this._listener.GetContext();
//Log($"{context.Request.HttpMethod} {context.Request.Url} {context.Request.UserAgent}");
Task t = new(() =>
{
HandleRequest(context);
});
t.Start();
}
catch (HttpListenerException)
{
}
}
}
private void HandleRequest(HttpListenerContext context)
{
HttpListenerRequest request = context.Request;
HttpListenerResponse response = context.Response;
if (request.Url!.LocalPath.Contains("favicon"))
{
SendResponse(HttpStatusCode.NoContent, response);
return;
}
switch (request.HttpMethod)
{
case "GET":
HandleGet(request, response);
break;
case "POST":
HandlePost(request, response);
break;
case "DELETE":
HandleDelete(request, response);
break;
case "OPTIONS":
SendResponse(HttpStatusCode.OK, context.Response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private Dictionary<string, string> GetRequestVariables(string query)
{
Dictionary<string, string> ret = new();
Regex queryRex = new (@"\?{1}&?([A-z0-9-=]+=[A-z0-9-=]+)+(&[A-z0-9-=]+=[A-z0-9-=]+)*");
if (!queryRex.IsMatch(query))
return ret;
query = query.Substring(1);
foreach (string keyValuePair in query.Split('&').Where(str => str.Length >= 3))
{
string var = keyValuePair.Split('=')[0];
string val = Regex.Replace(keyValuePair.Substring(var.Length + 1), "%20", " ");
val = Regex.Replace(val, "%[0-9]{2}", "");
ret.Add(var, val);
}
return ret;
}
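For clarity, a standalone sketch of the same query-string parsing, with the regexes copied from GetRequestVariables above and a hypothetical request query:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;
class QueryParseDemo
{
    static void Main()
    {
        string query = "?connector=Manganato&title=one%20piece"; // hypothetical request query
        Dictionary<string, string> vars = new();
        Regex queryRex = new(@"\?{1}&?([A-z0-9-=]+=[A-z0-9-=]+)+(&[A-z0-9-=]+=[A-z0-9-=]+)*");
        if (queryRex.IsMatch(query))
        {
            foreach (string keyValuePair in query.Substring(1).Split('&').Where(str => str.Length >= 3))
            {
                string key = keyValuePair.Split('=')[0];
                string val = Regex.Replace(keyValuePair.Substring(key.Length + 1), "%20", " ");
                val = Regex.Replace(val, "%[0-9]{2}", "");
                vars.Add(key, val);
            }
        }
        Console.WriteLine($"{vars["connector"]}, {vars["title"]}"); // Manganato, one piece
    }
}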
private void HandleGet(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, jobId, internalId;
MangaConnector? connector;
Manga? manga;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Connectors":
SendResponse(HttpStatusCode.OK, response, _parent.GetConnectors().Select(con => con.name).ToArray());
break;
case "Languages":
if (!requestVariables.TryGetValue("connector", out connectorName) ||
!_parent.TryGetConnector(connectorName, out connector))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
SendResponse(HttpStatusCode.OK, response, connector);
break;
case "Manga/Cover":
if (!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetPublicationById(internalId, out manga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
string filePath = manga?.coverFileNameInCache ?? "";
if (File.Exists(filePath))
{
FileStream coverStream = new(filePath, FileMode.Open);
SendResponse(HttpStatusCode.OK, response, coverStream);
}
else
{
SendResponse(HttpStatusCode.NotFound, response);
}
break;
case "Manga/FromConnector":
requestVariables.TryGetValue("title", out string? title);
requestVariables.TryGetValue("url", out string? url);
if (!requestVariables.TryGetValue("connector", out connectorName) ||
!_parent.TryGetConnector(connectorName, out connector) ||
(title is null && url is null))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (url is not null)
{
HashSet<Manga> ret = new();
manga = connector!.GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
SendResponse(HttpStatusCode.OK, response, ret);
}else
SendResponse(HttpStatusCode.OK, response, connector!.GetManga(title!));
break;
case "Manga/Chapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetConnector(connectorName, out connector) ||
!_parent.TryGetPublicationById(internalId, out manga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
requestVariables.TryGetValue("translatedLanguage", out string? translatedLanguage);
SendResponse(HttpStatusCode.OK, response, connector!.GetChapters((Manga)manga!, translatedLanguage??"en"));
break;
case "Jobs":
if (!requestVariables.TryGetValue("jobId", out jobId))
{
if(!_parent.jobBoss.jobs.Any(jjob => jjob.id == jobId))
SendResponse(HttpStatusCode.BadRequest, response);
else
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.First(jjob => jjob.id == jobId));
break;
}
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs);
break;
case "Jobs/Progress":
if (requestVariables.TryGetValue("jobId", out jobId))
{
if(!_parent.jobBoss.jobs.Any(jjob => jjob.id == jobId))
SendResponse(HttpStatusCode.BadRequest, response);
else
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.First(jjob => jjob.id == jobId).progressToken);
break;
}
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Select(jjob => jjob.progressToken));
break;
case "Jobs/Running":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob.progressToken.state is ProgressToken.State.Running));
break;
case "Jobs/Waiting":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob.progressToken.state is ProgressToken.State.Standby).OrderBy(jjob => jjob.nextExecution));
break;
case "Jobs/MonitorJobs":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob is DownloadNewChapters).OrderBy(jjob => ((DownloadNewChapters)jjob).manga.sortName));
break;
case "Settings":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.AsJObject());
break;
case "Settings/userAgent":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.userAgent);
break;
case "Settings/customRequestLimit":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.requestLimits);
break;
case "Settings/AprilFoolsMode":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.aprilFoolsMode);
break;
case "NotificationConnectors":
SendResponse(HttpStatusCode.OK, response, notificationConnectors);
break;
case "NotificationConnectors/Types":
SendResponse(HttpStatusCode.OK, response,
Enum.GetValues<NotificationConnector.NotificationConnectorType>().Select(nc => new KeyValuePair<byte, string?>((byte)nc, Enum.GetName(nc))));
break;
case "LibraryConnectors":
SendResponse(HttpStatusCode.OK, response, libraryConnectors);
break;
case "LibraryConnectors/Types":
SendResponse(HttpStatusCode.OK, response,
Enum.GetValues<LibraryConnector.LibraryType>().Select(lc => new KeyValuePair<byte, string?>((byte)lc, Enum.GetName(lc))));
break;
case "Ping":
SendResponse(HttpStatusCode.OK, response, "Pong");
break;
case "LogMessages":
if (logger is null || !File.Exists(logger?.logFilePath))
{
SendResponse(HttpStatusCode.NotFound, response);
break;
}
if (requestVariables.TryGetValue("count", out string? count))
{
try
{
uint messageCount = uint.Parse(count);
SendResponse(HttpStatusCode.OK, response, logger.Tail(messageCount));
}
catch (FormatException f)
{
SendResponse(HttpStatusCode.InternalServerError, response, f);
}
}else
SendResponse(HttpStatusCode.OK, response, logger.GetLog());
break;
case "LogFile":
if (logger is null || !File.Exists(logger?.logFilePath))
{
SendResponse(HttpStatusCode.NotFound, response);
break;
}
string logDir = new FileInfo(logger.logFilePath).DirectoryName!;
string tmpFilePath = Path.Join(logDir, "Tranga.log");
File.Copy(logger.logFilePath, tmpFilePath);
SendResponse(HttpStatusCode.OK, response, new FileStream(tmpFilePath, FileMode.Open));
File.Delete(tmpFilePath);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void HandlePost(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, internalId, jobId, chapterNumStr, customFolderName, translatedLanguage, notificationConnectorStr, libraryConnectorStr;
MangaConnector? connector;
Manga? tmpManga;
Manga manga;
Job? job;
NotificationConnector.NotificationConnectorType notificationConnectorType;
LibraryConnector.LibraryType libraryConnectorType;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Manga":
if(!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetPublicationById(internalId, out tmpManga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
SendResponse(HttpStatusCode.OK, response, manga);
break;
case "Jobs/MonitorManga":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!requestVariables.TryGetValue("interval", out string? intervalStr) ||
!_parent.TryGetConnector(connectorName, out connector)||
!_parent.TryGetPublicationById(internalId, out tmpManga) ||
!TimeSpan.TryParse(intervalStr, out TimeSpan interval))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
if (requestVariables.TryGetValue("ignoreBelowChapterNum", out chapterNumStr))
{
if (!float.TryParse(chapterNumStr, numberFormatDecimalPoint, out float chapterNum))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga.ignoreChaptersBelow = chapterNum;
}
if (requestVariables.TryGetValue("customFolderName", out customFolderName))
manga.MovePublicationFolder(TrangaSettings.downloadLocation, customFolderName);
requestVariables.TryGetValue("translatedLanguage", out translatedLanguage);
_parent.jobBoss.AddJob(new DownloadNewChapters(this, connector!, manga, true, interval, translatedLanguage: translatedLanguage??"en"));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/DownloadNewChapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetConnector(connectorName, out connector)||
!_parent.TryGetPublicationById(internalId, out tmpManga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
if (requestVariables.TryGetValue("ignoreBelowChapterNum", out chapterNumStr))
{
if (!float.TryParse(chapterNumStr, numberFormatDecimalPoint, out float chapterNum))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga.ignoreChaptersBelow = chapterNum;
}
if (requestVariables.TryGetValue("customFolderName", out customFolderName))
manga.MovePublicationFolder(TrangaSettings.downloadLocation, customFolderName);
requestVariables.TryGetValue("translatedLanguage", out translatedLanguage);
_parent.jobBoss.AddJob(new DownloadNewChapters(this, connector!, manga, false, translatedLanguage: translatedLanguage??"en"));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/UpdateMetadata":
if (!requestVariables.TryGetValue("internalId", out internalId))
{
foreach (Job pJob in _parent.jobBoss.jobs.Where(possibleDncJob =>
possibleDncJob.jobType is Job.JobType.DownloadNewChaptersJob).ToArray()) //ToArray to avoid modifying the collection while adding new jobs
{
DownloadNewChapters dncJob = pJob as DownloadNewChapters ??
throw new Exception("Has to be DownloadNewChapters Job");
_parent.jobBoss.AddJob(new UpdateMetadata(this, dncJob.mangaConnector, dncJob.manga));
}
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
Job[] possibleDncJobs = _parent.jobBoss.GetJobsLike(internalId: internalId).ToArray();
switch (possibleDncJobs.Length)
{
case <1: SendResponse(HttpStatusCode.BadRequest, response, "Could not find matching release"); break;
case >1: SendResponse(HttpStatusCode.BadRequest, response, "Multiple releases??"); break;
default:
DownloadNewChapters dncJob = possibleDncJobs[0] as DownloadNewChapters ??
throw new Exception("Has to be DownloadNewChapters Job");
_parent.jobBoss.AddJob(new UpdateMetadata(this, dncJob.mangaConnector, dncJob.manga));
SendResponse(HttpStatusCode.Accepted, response);
break;
}
}
break;
case "Jobs/StartNow":
if (!requestVariables.TryGetValue("jobId", out jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
_parent.jobBoss.AddJobToQueue(job!);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/Cancel":
if (!requestVariables.TryGetValue("jobId", out jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
job!.Cancel();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/UpdateDownloadLocation":
if (!requestVariables.TryGetValue("downloadLocation", out string? downloadLocation) ||
!requestVariables.TryGetValue("moveFiles", out string? moveFilesStr) ||
!bool.TryParse(moveFilesStr, out bool moveFiles))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateDownloadLocation(downloadLocation, moveFiles);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/AprilFoolsMode":
if (!requestVariables.TryGetValue("enabled", out string? aprilFoolsModeEnabledStr) ||
!bool.TryParse(aprilFoolsModeEnabledStr, out bool aprilFoolsModeEnabled))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateAprilFoolsMode(aprilFoolsModeEnabled);
SendResponse(HttpStatusCode.Accepted, response);
break;
/*case "Settings/UpdateWorkingDirectory":
if (!requestVariables.TryGetValue("workingDirectory", out string? workingDirectory))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
settings.UpdateWorkingDirectory(workingDirectory);
SendResponse(HttpStatusCode.Accepted, response);
break;*/
case "Settings/userAgent":
if(!requestVariables.TryGetValue("userAgent", out string? customUserAgent))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateUserAgent(customUserAgent);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/userAgent/Reset":
TrangaSettings.UpdateUserAgent(null);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/customRequestLimit":
if (!requestVariables.TryGetValue("requestType", out string? requestTypeStr) ||
!requestVariables.TryGetValue("requestsPerMinute", out string? requestsPerMinuteStr) ||
!Enum.TryParse(requestTypeStr, out RequestType requestType) ||
!int.TryParse(requestsPerMinuteStr, out int requestsPerMinute))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/customRequestLimit/Reset":
TrangaSettings.ResetRateLimits();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors/Update":
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Gotify)
{
if (!requestVariables.TryGetValue("gotifyUrl", out string? gotifyUrl) ||
!requestVariables.TryGetValue("gotifyAppToken", out string? gotifyAppToken))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new Gotify(this, gotifyUrl, gotifyAppToken));
SendResponse(HttpStatusCode.Accepted, response);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.LunaSea)
{
if (!requestVariables.TryGetValue("lunaseaWebhook", out string? lunaseaWebhook))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new LunaSea(this, lunaseaWebhook));
SendResponse(HttpStatusCode.Accepted, response);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Ntfy)
{
if (!requestVariables.TryGetValue("ntfyUrl", out string? ntfyUrl) ||
!requestVariables.TryGetValue("ntfyUser", out string? ntfyUser)||
!requestVariables.TryGetValue("ntfyPass", out string? ntfyPass))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new Ntfy(this, ntfyUrl, ntfyUser, ntfyPass, null));
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
}
break;
case "NotificationConnectors/Test":
NotificationConnector notificationConnector;
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Gotify)
{
if (!requestVariables.TryGetValue("gotifyUrl", out string? gotifyUrl) ||
!requestVariables.TryGetValue("gotifyAppToken", out string? gotifyAppToken))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new Gotify(this, gotifyUrl, gotifyAppToken);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.LunaSea)
{
if (!requestVariables.TryGetValue("lunaseaWebhook", out string? lunaseaWebhook))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new LunaSea(this, lunaseaWebhook);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Ntfy)
{
if (!requestVariables.TryGetValue("ntfyUrl", out string? ntfyUrl) ||
!requestVariables.TryGetValue("ntfyUser", out string? ntfyUser)||
!requestVariables.TryGetValue("ntfyPass", out string? ntfyPass))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new Ntfy(this, ntfyUrl, ntfyUser, ntfyPass, null);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector.SendNotification("Tranga Test", "This is Test-Notification.");
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors/Reset":
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteNotificationConnector(notificationConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors/Update":
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (libraryConnectorType is LibraryConnector.LibraryType.Kavita)
{
if (!requestVariables.TryGetValue("kavitaUrl", out string? kavitaUrl) ||
!requestVariables.TryGetValue("kavitaUsername", out string? kavitaUsername) ||
!requestVariables.TryGetValue("kavitaPassword", out string? kavitaPassword))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddLibraryConnector(new Kavita(this, kavitaUrl, kavitaUsername, kavitaPassword));
SendResponse(HttpStatusCode.Accepted, response);
}else if (libraryConnectorType is LibraryConnector.LibraryType.Komga)
{
if (!requestVariables.TryGetValue("komgaUrl", out string? komgaUrl) ||
!requestVariables.TryGetValue("komgaAuth", out string? komgaAuth))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddLibraryConnector(new Komga(this, komgaUrl, komgaAuth));
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
}
break;
case "LibraryConnectors/Test":
LibraryConnector libraryConnector;
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (libraryConnectorType is LibraryConnector.LibraryType.Kavita)
{
if (!requestVariables.TryGetValue("kavitaUrl", out string? kavitaUrl) ||
!requestVariables.TryGetValue("kavitaUsername", out string? kavitaUsername) ||
!requestVariables.TryGetValue("kavitaPassword", out string? kavitaPassword))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector = new Kavita(this, kavitaUrl, kavitaUsername, kavitaPassword);
}else if (libraryConnectorType is LibraryConnector.LibraryType.Komga)
{
if (!requestVariables.TryGetValue("komgaUrl", out string? komgaUrl) ||
!requestVariables.TryGetValue("komgaAuth", out string? komgaAuth))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector = new Komga(this, komgaUrl, komgaAuth);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector.UpdateLibrary();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors/Reset":
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteLibraryConnector(libraryConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void HandleDelete(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, internalId;
MangaConnector connector;
Manga manga;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Jobs":
if (!requestVariables.TryGetValue("jobId", out string? jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out Job? job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
_parent.jobBoss.RemoveJob(job!);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/DownloadNewChapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
_parent.GetConnector(connectorName) is null ||
_parent.GetPublicationById(internalId) is null)
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
connector = _parent.GetConnector(connectorName)!;
manga = (Manga)_parent.GetPublicationById(internalId)!;
_parent.jobBoss.RemoveJobs(_parent.jobBoss.GetJobsLike(connector, manga));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors":
if (!requestVariables.TryGetValue("notificationConnector", out string? notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteNotificationConnector(notificationConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors":
if (!requestVariables.TryGetValue("libraryConnectors", out string? libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr,
out LibraryConnector.LibraryType libraryConnectoryType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteLibraryConnector(libraryConnectoryType);
SendResponse(HttpStatusCode.Accepted, response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void SendResponse(HttpStatusCode statusCode, HttpListenerResponse response, object? content = null)
{
//Log($"Response: {statusCode} {content}");
response.StatusCode = (int)statusCode;
response.AddHeader("Access-Control-Allow-Headers", "Content-Type, Accept, X-Requested-With");
response.AddHeader("Access-Control-Allow-Methods", "GET, POST, DELETE");
response.AddHeader("Access-Control-Max-Age", "1728000");
response.AppendHeader("Access-Control-Allow-Origin", "*");
try
{
if (content is not Stream)
{
response.ContentType = "application/json";
response.AddHeader("Cache-Control", "no-store");
response.OutputStream.Write(content is not null
? Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(content))
: Array.Empty<byte>());
response.OutputStream.Close();
}
else if (content is FileStream stream)
{
string contentType = stream.Name.Split('.')[^1];
response.AddHeader("Cache-Control", "max-age=600");
switch (contentType.ToLower())
{
case "gif":
response.ContentType = "image/gif";
break;
case "png":
response.ContentType = "image/png";
break;
case "jpg":
case "jpeg":
response.ContentType = "image/jpeg";
break;
case "log":
response.ContentType = "text/plain";
break;
}
stream.CopyTo(response.OutputStream);
response.OutputStream.Close();
stream.Close();
}
}
catch (Exception e)
{
Log(e.ToString());
}
}
}
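The handlers above are keyed purely on the local path and on query-string variables. A hypothetical client sketch follows; the host and port (whatever TrangaSettings.apiPortNumber resolves to; 6531 is only a placeholder) plus the connector name and manga id are assumptions, while the paths and parameter names are taken verbatim from the switch statements above.

// Hypothetical client for the API above; host, port, connector and id are placeholders.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class TrangaClientDemo
{
    static async Task Main()
    {
        using HttpClient client = new() { BaseAddress = new Uri("http://localhost:6531/") };

        // GET Ping -> "Pong"
        Console.WriteLine(await client.GetStringAsync("Ping"));

        // GET Jobs/Running -> JSON array of currently running jobs
        Console.WriteLine(await client.GetStringAsync("Jobs/Running"));

        // POST Jobs/MonitorManga with the variables HandlePost expects;
        // "interval" must parse with TimeSpan.TryParse, e.g. "03:00:00".
        HttpResponseMessage resp = await client.PostAsync(
            "Jobs/MonitorManga?connector=SomeConnector&internalId=SomeMangaId&interval=03:00:00&translatedLanguage=en",
            content: null);
        Console.WriteLine(resp.StatusCode); // Accepted on success, BadRequest otherwise
    }
}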


@@ -1,19 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
namespace Tranga.Server;
internal struct RequestPath
{
internal readonly string HttpMethod;
internal readonly string RegexStr;
internal readonly Func<GroupCollection, Dictionary<string, string>, ValueTuple<HttpStatusCode, object?>> Method;
public RequestPath(string httpHttpMethod, string regexStr,
Func<GroupCollection, Dictionary<string, string>, ValueTuple<HttpStatusCode, object?>> method)
{
this.HttpMethod = httpHttpMethod;
this.RegexStr = regexStr + "(?:/?)";
this.Method = method;
}
}


@@ -1,237 +0,0 @@
using System.Net;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
namespace Tranga.Server;
public partial class Server : GlobalBase, IDisposable
{
private readonly HttpListener _listener = new();
private readonly Tranga _parent;
private bool _running = true;
private readonly List<RequestPath> _apiRequestPaths;
public Server(Tranga parent) : base(parent)
{
/*
* Contains all valid Request Methods, Paths (with Regex Group Matching for specific Parameters) and Handling Methods
*/
_apiRequestPaths = new List<RequestPath>
{
new ("GET", @"/v2/Connector/Types", GetV2ConnectorTypes),
new ("GET", @"/v2/Connector/([a-zA-Z]+)/GetManga", GetV2ConnectorConnectorNameGetManga),
new ("GET", @"/v2/Mangas", GetV2Mangas),
new ("GET", @"/v2/Manga/Search", GetV2MangaSearch),
new ("GET", @"/v2/Manga", GetV2Manga),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})", GetV2MangaInternalId),
new ("DELETE", @"/v2/Manga/([-A-Za-z0-9]*={0,3})", DeleteV2MangaInternalId),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/Cover", GetV2MangaInternalIdCover),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/Chapters", GetV2MangaInternalIdChapters),
new ("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/Chapters/Latest", GetV2MangaInternalIdChaptersLatest),
new ("POST", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/ignoreChaptersBelow", PostV2MangaInternalIdIgnoreChaptersBelow),
new ("POST", @"/v2/Manga/([-A-Za-z0-9]*={0,3})/moveFolder", PostV2MangaInternalIdMoveFolder),
new ("GET", @"/v2/Jobs", GetV2Jobs),
new ("GET", @"/v2/Jobs/Running", GetV2JobsRunning),
new ("GET", @"/v2/Jobs/Waiting", GetV2JobsWaiting),
new ("GET", @"/v2/Jobs/Monitoring", GetV2JobsMonitoring),
new ("GET", @"/v2/Job/Types", GetV2JobTypes),
new ("POST", @"/v2/Job/Create/([a-zA-Z]+)", PostV2JobCreateType),
new ("GET", @"/v2/Job", GetV2Job),
new ("GET", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)", GetV2JobJobId),
new ("DELETE", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)", DeleteV2JobJobId),
new ("GET", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)/Progress", GetV2JobJobIdProgress),
new ("POST", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)/StartNow", PostV2JobJobIdStartNow),
new ("POST", @"/v2/Job/([a-zA-Z\.]+-[-A-Za-z0-9+/]*={0,3}(?:-[0-9]+)?)/Cancel", PostV2JobJobIdCancel),
new ("GET", @"/v2/Settings", GetV2Settings),
new ("GET", @"/v2/Settings/UserAgent", GetV2SettingsUserAgent),
new ("POST", @"/v2/Settings/UserAgent", PostV2SettingsUserAgent),
new ("GET", @"/v2/Settings/RateLimit/Types", GetV2SettingsRateLimitTypes),
new ("GET", @"/v2/Settings/RateLimit", GetV2SettingsRateLimit),
new ("POST", @"/v2/Settings/RateLimit", PostV2SettingsRateLimit),
new ("GET", @"/v2/Settings/RateLimit/([a-zA-Z]+)", GetV2SettingsRateLimitType),
new ("POST", @"/v2/Settings/RateLimit/([a-zA-Z]+)", PostV2SettingsRateLimitType),
new ("GET", @"/v2/Settings/AprilFoolsMode", GetV2SettingsAprilFoolsMode),
new ("POST", @"/v2/Settings/AprilFoolsMode", PostV2SettingsAprilFoolsMode),
new ("POST", @"/v2/Settings/DownloadLocation", PostV2SettingsDownloadLocation),
new ("GET", @"/v2/LibraryConnector", GetV2LibraryConnector),
new ("GET", @"/v2/LibraryConnector/Types", GetV2LibraryConnectorTypes),
new ("GET", @"/v2/LibraryConnector/([a-zA-Z]+)", GetV2LibraryConnectorType),
new ("POST", @"/v2/LibraryConnector/([a-zA-Z]+)", PostV2LibraryConnectorType),
new ("POST", @"/v2/LibraryConnector/([a-zA-Z]+)/Test", PostV2LibraryConnectorTypeTest),
new ("DELETE", @"/v2/LibraryConnector/([a-zA-Z]+)", DeleteV2LibraryConnectorType),
new ("GET", @"/v2/NotificationConnector", GetV2NotificationConnector),
new ("GET", @"/v2/NotificationConnector/Types", GetV2NotificationConnectorTypes),
new ("GET", @"/v2/NotificationConnector/([a-zA-Z]+)", GetV2NotificationConnectorType),
new ("POST", @"/v2/NotificationConnector/([a-zA-Z]+)", PostV2NotificationConnectorType),
new ("POST", @"/v2/NotificationConnector/([a-zA-Z]+)/Test", PostV2NotificationConnectorTypeTest),
new ("DELETE", @"/v2/NotificationConnector/([a-zA-Z]+)", DeleteV2NotificationConnectorType),
new ("GET", @"/v2/LogFile", GetV2LogFile),
new ("GET", @"/v2/Ping", GetV2Ping),
new ("POST", @"/v2/Ping", PostV2Ping)
};
this._parent = parent;
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
this._listener.Prefixes.Add($"http://*:{TrangaSettings.apiPortNumber}/");
else
this._listener.Prefixes.Add($"http://localhost:{TrangaSettings.apiPortNumber}/");
Thread listenThread = new(Listen);
listenThread.Start();
while(_parent.keepRunning && _running)
Thread.Sleep(100);
this.Dispose();
}
private void Listen()
{
this._listener.Start();
foreach (string prefix in this._listener.Prefixes)
Log($"Listening on {prefix}");
while (this._listener.IsListening && _parent.keepRunning)
{
try
{
HttpListenerContext context = this._listener.GetContext();
//Log($"{context.Request.HttpMethod} {context.Request.Url} {context.Request.UserAgent}");
Task t = new(() =>
{
HandleRequest(context);
});
t.Start();
}
catch (HttpListenerException)
{
}
}
}
private void HandleRequest(HttpListenerContext context)
{
HttpListenerRequest request = context.Request;
HttpListenerResponse response = context.Response;
if (request.HttpMethod == "OPTIONS")
{
SendResponse(HttpStatusCode.NoContent, response);//Response always contains all valid Request-Methods
return;
}
if (request.Url!.LocalPath.Contains("favicon"))
{
SendResponse(HttpStatusCode.NoContent, response);
return;
}
string path = Regex.Match(request.Url.LocalPath, @"\/[a-zA-Z0-9\.+/=-]+(\/[a-zA-Z0-9\.+/=-]+)*").Value; //Local Path
if (!Regex.IsMatch(path, "/v2(/.*)?")) //Use only v2 API
{
SendResponse(HttpStatusCode.NotFound, response, "Use Version 2 API");
return;
}
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query); //Variables in the URI
Dictionary<string, string> requestBody = GetRequestBody(request); //Variables in the JSON body
Dictionary<string, string> requestParams = requestVariables.UnionBy(requestBody, v => v.Key)
.ToDictionary(kv => kv.Key, kv => kv.Value); //The actual variable used for the API
ValueTuple<HttpStatusCode, object?> responseMessage; //Used to respond to the HttpRequest
if (_apiRequestPaths.Any(p => p.HttpMethod == request.HttpMethod && Regex.Match(path, p.RegexStr).Length == path.Length)) //Check if Request-Path is valid
{
RequestPath requestPath =
_apiRequestPaths.First(p => p.HttpMethod == request.HttpMethod && Regex.Match(path, p.RegexStr).Length == path.Length);
responseMessage =
requestPath.Method.Invoke(Regex.Match(path, requestPath.RegexStr).Groups, requestParams); //Get HttpResponse content
}
else
responseMessage = new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, "Unknown Request Path");
SendResponse(responseMessage.Item1, response, responseMessage.Item2);
}
private Dictionary<string, string> GetRequestVariables(string query)
{
Dictionary<string, string> ret = new();
Regex queryRex = new(@"\?{1}&?([A-z0-9-=]+=[A-z0-9-=]+)+(&[A-z0-9-=]+=[A-z0-9-=]+)*");
if (!queryRex.IsMatch(query))
return ret;
query = query.Substring(1);
foreach (string keyValuePair in query.Split('&').Where(str => str.Length >= 3))
{
string var = keyValuePair.Split('=')[0];
string val = Regex.Replace(keyValuePair.Substring(var.Length + 1), "%20", " ");
val = Regex.Replace(val, "%[0-9]{2}", "");
ret.Add(var, val);
}
return ret;
}
private Dictionary<string, string> GetRequestBody(HttpListenerRequest request)
{
if (!request.HasEntityBody)
{
//Nospam Log("No request body");
return new Dictionary<string, string>();
}
Stream body = request.InputStream;
Encoding encoding = request.ContentEncoding;
using StreamReader streamReader = new (body, encoding);
try
{
Dictionary<string, string> requestBody =
JsonConvert.DeserializeObject<Dictionary<string, string>>(streamReader.ReadToEnd())
?? new();
return requestBody;
}
catch (JsonException e)
{
Log(e.Message);
}
return new Dictionary<string, string>();
}
private void SendResponse(HttpStatusCode statusCode, HttpListenerResponse response, object? content = null)
{
//Log($"Response: {statusCode} {content}");
response.StatusCode = (int)statusCode;
response.AddHeader("Access-Control-Allow-Headers", "Content-Type, Accept, X-Requested-With");
response.AddHeader("Access-Control-Allow-Methods", "GET, POST, DELETE");
response.AddHeader("Access-Control-Max-Age", "1728000");
response.AppendHeader("Access-Control-Allow-Origin", "*");
try
{
if (content is Stream stream)
{
response.ContentType = "image/jpeg";
response.AddHeader("Cache-Control", "max-age=600");
stream.CopyTo(response.OutputStream);
response.OutputStream.Close();
stream.Close();
}
else
{
response.ContentType = "application/json";
response.AddHeader("Cache-Control", "no-store");
response.OutputStream.Write(content is not null
? Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(content))
: Array.Empty<byte>());
}
response.OutputStream.Close();
}
catch (HttpListenerException e)
{
Log(e.ToString());
}
}
public void Dispose()
{
_running = false;
((IDisposable)_listener).Dispose();
}
}
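The deleted dispatcher above registers every v2 endpoint as a (method, regex, handler) triple and routes to the first entry whose regex match covers the entire path, passing the capture groups to the handler. A condensed, standalone sketch of that idea follows (simplified types and hard-coded routes; not the removed Tranga.Server code itself).

// Standalone sketch of the removed regex-table routing; routes and path are examples.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;

record Route(string HttpMethod, string RegexStr, Func<GroupCollection, string> Handler);

class RoutingDemo
{
    static void Main()
    {
        List<Route> routes = new()
        {
            new("GET", @"/v2/Ping(?:/?)", _ => "Pong!"),
            new("GET", @"/v2/Manga/([-A-Za-z0-9]*={0,3})(?:/?)", g => $"manga {g[1].Value}"),
        };

        string method = "GET", path = "/v2/Manga/U29tZUlk";

        // The first route whose match spans the whole path wins, as in HandleRequest above.
        Route? hit = routes.FirstOrDefault(r =>
            r.HttpMethod == method && Regex.Match(path, r.RegexStr).Length == path.Length);

        Console.WriteLine(hit is null
            ? "Unknown Request Path"
            : hit.Handler(Regex.Match(path, hit.RegexStr).Groups)); // prints: manga U29tZUlk
    }
}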


@@ -1,31 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2ConnectorTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, _parent.GetConnectors());
}
private ValueTuple<HttpStatusCode, object?> GetV2ConnectorConnectorNameGetManga(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.GetConnectors().Contains(groups[1].Value) ||
!_parent.TryGetConnector(groups[1].Value, out MangaConnector? connector) ||
connector is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, $"Connector '{groups[1].Value}' does not exist.");
if (requestParameters.TryGetValue("title", out string? title))
{
return (HttpStatusCode.OK, connector.GetManga(title));
}else if (requestParameters.TryGetValue("url", out string? url))
{
return (HttpStatusCode.OK, connector.GetMangaFromUrl(url));
}else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Parameter 'title' or 'url' has to be set.");
}
}


@@ -1,169 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.Jobs;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2Jobs(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsRunning(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.progressToken.state is ProgressToken.State.Running)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsWaiting(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.progressToken.state is ProgressToken.State.Waiting)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobsMonitoring(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, _parent.jobBoss.jobs
.Where(job => job.jobType is Job.JobType.DownloadNewChaptersJob)
.Select(job => job.id));
}
private ValueTuple<HttpStatusCode, object?> GetV2JobTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK,
Enum.GetValues<Job.JobType>().ToDictionary(b => (byte)b, b => Enum.GetName(b)));
}
private ValueTuple<HttpStatusCode, object?> PostV2JobCreateType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out Job.JobType jobType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"JobType {groups[1].Value} does not exist.");
}
string? mangaId;
Manga? manga;
switch (jobType)
{
case Job.JobType.MonitorManga:
if(!requestParameters.TryGetValue("internalId", out mangaId) ||
!_parent.TryGetPublicationById(mangaId, out manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "'internalId' Parameter missing, or is not a valid ID.");
if(!requestParameters.TryGetValue("interval", out string? intervalStr) ||
!TimeSpan.TryParse(intervalStr, out TimeSpan interval))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "'interval' Parameter missing, or is not in correct format.");
requestParameters.TryGetValue("language", out string? language);
if (requestParameters.TryGetValue("customFolder", out string? folder))
manga.Value.MovePublicationFolder(TrangaSettings.downloadLocation, folder);
if (requestParameters.TryGetValue("startChapter", out string? startChapterStr) &&
float.TryParse(startChapterStr, out float startChapter))
{
Manga manga1 = manga.Value;
manga1.ignoreChaptersBelow = startChapter;
}
return _parent.jobBoss.AddJob(new DownloadNewChapters(this, ((Manga)manga).mangaConnector,
((Manga)manga).internalId, true, interval, language)) switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null),
false => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Conflict, "Job already exists."),
};
case Job.JobType.UpdateMetaDataJob:
if(!requestParameters.TryGetValue("internalId", out mangaId) ||
!_parent.TryGetPublicationById(mangaId, out manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "InternalId Parameter missing, or is not a valid ID.");
return _parent.jobBoss.AddJob(new UpdateMetadata(this, ((Manga)manga).internalId)) switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null),
false => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Conflict, "Job already exists."),
};
case Job.JobType.DownloadNewChaptersJob: //TODO
case Job.JobType.DownloadChapterJob: //TODO
default: return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"JobType {Enum.GetName(jobType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> GetV2JobJobId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, job);
}
private ValueTuple<HttpStatusCode, object?> DeleteV2JobJobId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
_parent.jobBoss.RemoveJob(job);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2JobJobIdProgress(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, $"Job with ID: '{groups[1].Value}' does not exist.");
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, job.progressToken);
}
private ValueTuple<HttpStatusCode, object?> PostV2JobJobIdStartNow(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
_parent.jobBoss.AddJobs(job.ExecuteReturnSubTasks(_parent.jobBoss));
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> PostV2JobJobIdCancel(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!_parent.jobBoss.TryGetJobById(groups[1].Value, out Job? job) ||
job is null)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with ID: '{groups[1].Value}' does not exist.");
}
job.Cancel();
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2Job(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(!requestParameters.TryGetValue("jobIds", out string? jobIdListStr))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Missing parameter 'jobIds'.");
string[] jobIdList = jobIdListStr.Split(',');
List<Job> ret = new();
foreach (string jobId in jobIdList)
{
if(!_parent.jobBoss.TryGetJobById(jobId, out Job? job) || job is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Job with id '{jobId}' not found.");
ret.Add(job);
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
}
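Because HandleRequest merges query-string variables with a JSON body of string pairs (GetRequestBody above), the job-creation parameters can also be posted as JSON. Below is a hypothetical call to the MonitorManga branch of PostV2JobCreateType; the host, port and manga id are placeholders, while the parameter names come from the handler above.

// Hypothetical v2 client call; host, port and internalId are placeholders.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CreateJobDemo
{
    static async Task Main()
    {
        using HttpClient client = new() { BaseAddress = new Uri("http://localhost:6531/") };

        // "interval" must be TimeSpan-parsable, e.g. three hours; "language" is optional.
        string body = "{\"internalId\":\"SomeMangaId\",\"interval\":\"03:00:00\",\"language\":\"en\"}";
        HttpResponseMessage resp = await client.PostAsync(
            "v2/Job/Create/MonitorManga",
            new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(resp.StatusCode); // OK, Conflict ("Job already exists.") or NotFound
    }
}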


@@ -1,120 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.LibraryConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2LibraryConnector(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, libraryConnectors);
}
private ValueTuple<HttpStatusCode, object?> GetV2LibraryConnectorTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK,
Enum.GetValues<LibraryConnector.LibraryType>().ToDictionary(b => (byte)b, b => Enum.GetName(b)));
}
private ValueTuple<HttpStatusCode, object?> GetV2LibraryConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(libraryConnectors.All(lc => lc.libraryType != libraryType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {Enum.GetName(libraryType)} not configured.");
else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, libraryConnectors.First(lc => lc.libraryType == libraryType));
}
private ValueTuple<HttpStatusCode, object?> PostV2LibraryConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(!requestParameters.TryGetValue("URL", out string? url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
switch (libraryType)
{
case LibraryConnector.LibraryType.Kavita:
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
Kavita kavita = new (this, url, username, password);
libraryConnectors.RemoveWhere(lc => lc.libraryType == LibraryConnector.LibraryType.Kavita);
libraryConnectors.Add(kavita);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, kavita);
case LibraryConnector.LibraryType.Komga:
if(!requestParameters.TryGetValue("auth", out string? auth))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'auth' missing.");
Komga komga = new (this, url, auth);
libraryConnectors.RemoveWhere(lc => lc.libraryType == LibraryConnector.LibraryType.Komga);
libraryConnectors.Add(komga);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, komga);
default: return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"LibraryType {Enum.GetName(libraryType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> PostV2LibraryConnectorTypeTest(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(!requestParameters.TryGetValue("URL", out string? url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
switch (libraryType)
{
case LibraryConnector.LibraryType.Kavita:
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
Kavita kavita = new (this, url, username, password);
return kavita.Test() switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, kavita),
_ => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.FailedDependency, kavita)
};
case LibraryConnector.LibraryType.Komga:
if(!requestParameters.TryGetValue("auth", out string? auth))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'auth' missing.");
Komga komga = new (this, url, auth);
return komga.Test() switch
{
true => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, komga),
_ => new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.FailedDependency, komga)
};
default: return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"LibraryType {Enum.GetName(libraryType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> DeleteV2LibraryConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out LibraryConnector.LibraryType libraryType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {groups[1].Value} does not exist.");
}
if(libraryConnectors.All(lc => lc.libraryType != libraryType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"LibraryType {Enum.GetName(libraryType)} not configured.");
else
{
libraryConnectors.Remove(libraryConnectors.First(lc => lc.libraryType == libraryType));
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}
}


@@ -1,173 +0,0 @@
using System.Drawing;
using System.Drawing.Imaging;
using System.Net;
using System.Text.RegularExpressions;
using Tranga.Jobs;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2Mangas(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, GetAllCachedManga().Select(m => m.internalId));
}
private ValueTuple<HttpStatusCode, object?> GetV2Manga(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(!requestParameters.TryGetValue("mangaIds", out string? mangaIdListStr))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Missing parameter 'mangaIds'.");
string[] mangaIdList = mangaIdListStr.Split(',');
List<Manga> ret = new();
foreach (string mangaId in mangaIdList)
{
if(!_parent.TryGetPublicationById(mangaId, out Manga? manga) || manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with id '{mangaId}' not found.");
ret.Add(manga.Value);
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaSearch(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(!requestParameters.TryGetValue("title", out string? title))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Missing parameter 'title'.");
List<Manga> ret = new();
List<Thread> threads = new();
foreach (MangaConnector mangaConnector in _connectors)
{
Thread t = new (() =>
{
ret.AddRange(mangaConnector.GetManga(title));
});
t.Start();
threads.Add(t);
}
while(threads.Any(t => t.ThreadState is ThreadState.Running or ThreadState.WaitSleepJoin))
Thread.Sleep(10);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, manga);
}
private ValueTuple<HttpStatusCode, object?> DeleteV2MangaInternalId(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
Job[] jobs = _parent.jobBoss.GetJobsLike(publication: manga).ToArray();
_parent.jobBoss.RemoveJobs(jobs);
RemoveMangaFromCache(groups[1].Value);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalIdCover(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
string filePath = manga.Value.coverFileNameInCache!;
if(!File.Exists(filePath))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "Cover-File not found.");
Bitmap bitmap;
if (requestParameters.TryGetValue("dimensions", out string? dimensionsStr))
{
Regex dimensionsRex = new(@"([0-9]+)x([0-9]+)");
if(!dimensionsRex.IsMatch(dimensionsStr))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Requested dimensions not in required format.");
Match m = dimensionsRex.Match(dimensionsStr);
int width = int.Parse(m.Groups[1].Value);
int height = int.Parse(m.Groups[2].Value);
double aspectRequested = (double)width / (double)height;
using Image coverImage = Image.FromFile(filePath);
double aspectCover = (double)coverImage.Width / (double)coverImage.Height;
Size newSize = aspectRequested > aspectCover
? new Size(width, (width / coverImage.Width) * coverImage.Height)
: new Size((height / coverImage.Height) * coverImage.Width, height);
bitmap = new(coverImage, newSize);
}
else
{
FileStream coverStream = new(filePath, FileMode.Open);
bitmap = new(coverStream);
}
using MemoryStream ret = new();
bitmap.Save(ret, ImageFormat.Jpeg);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ret);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalIdChapters(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
Chapter[] chapters = requestParameters.TryGetValue("language", out string? parameter) switch
{
true => manga.Value.mangaConnector.GetChapters((Manga)manga, parameter),
false => manga.Value.mangaConnector.GetChapters((Manga)manga)
};
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, chapters);
}
private ValueTuple<HttpStatusCode, object?> GetV2MangaInternalIdChaptersLatest(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
float latest = requestParameters.TryGetValue("language", out string? parameter) switch
{
true => float.Parse(manga.Value.mangaConnector.GetChapters(manga.Value, parameter).Max().chapterNumber),
false => float.Parse(manga.Value.mangaConnector.GetChapters(manga.Value).Max().chapterNumber)
};
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, latest);
}
private ValueTuple<HttpStatusCode, object?> PostV2MangaInternalIdIgnoreChaptersBelow(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
if (requestParameters.TryGetValue("startChapter", out string? startChapterStr) &&
float.TryParse(startChapterStr, out float startChapter))
{
Manga manga1 = manga.Value;
manga1.ignoreChaptersBelow = startChapter;
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Parameter 'startChapter' missing, or failed to parse.");
}
private ValueTuple<HttpStatusCode, object?> PostV2MangaInternalIdMoveFolder(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!_parent.TryGetPublicationById(groups[1].Value, out Manga? manga) ||
manga is null)
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"Manga with ID '{groups[1].Value} could not be found.'");
if(!requestParameters.TryGetValue("location", out string? newFolder))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.BadRequest, "Parameter 'location' missing.");
manga.Value.MovePublicationFolder(TrangaSettings.downloadLocation, newFolder);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}


@@ -1,33 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2LogFile(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (logger is null || !File.Exists(logger?.logFilePath))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "Missing Logfile");
}
FileStream logFile = new (logger.logFilePath, FileMode.Open, FileAccess.Read);
FileStream content = new(Path.GetTempFileName(), FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite, 0, FileOptions.DeleteOnClose);
logFile.Position = 0;
logFile.CopyTo(content);
content.Position = 0;
logFile.Dispose();
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, content);
}
private ValueTuple<HttpStatusCode, object?> GetV2Ping(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, "Pong!");
}
private ValueTuple<HttpStatusCode, object?> PostV2Ping(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, "Pong!");
}
}
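GetV2LogFile above hands back a temporary copy opened with FileOptions.DeleteOnClose, so the copy is removed automatically once the response stream is disposed. A minimal standalone sketch of that trick follows; the file name and buffer size here are arbitrary.

// Sketch of the DeleteOnClose temp-copy pattern; paths are placeholders.
using System;
using System.IO;

class TempCopyDemo
{
    static void Main()
    {
        using FileStream source = new("example.log", FileMode.OpenOrCreate, FileAccess.ReadWrite);

        // The OS deletes the temp file as soon as this stream is closed.
        FileStream copy = new(Path.GetTempFileName(), FileMode.OpenOrCreate,
            FileAccess.ReadWrite, FileShare.ReadWrite, 4096, FileOptions.DeleteOnClose);

        source.Position = 0;
        source.CopyTo(copy);
        copy.Position = 0;

        // Hand "copy" to whatever writes the HTTP response; disposing it cleans up.
        Console.WriteLine($"copied {copy.Length} bytes into a self-deleting temp file");
        copy.Dispose();
    }
}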


@@ -1,136 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.NotificationConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2NotificationConnector(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, notificationConnectors);
}
private ValueTuple<HttpStatusCode, object?> GetV2NotificationConnectorTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK,
Enum.GetValues<NotificationConnectors.NotificationConnector.NotificationConnectorType>().ToDictionary(b => (byte)b, b => Enum.GetName(b)));
}
private ValueTuple<HttpStatusCode, object?> GetV2NotificationConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
if(notificationConnectors.All(nc => nc.notificationConnectorType != notificationConnectorType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {Enum.GetName(notificationConnectorType)} not configured.");
else
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, notificationConnectors.First(nc => nc.notificationConnectorType != notificationConnectorType));
}
private ValueTuple<HttpStatusCode, object?> PostV2NotificationConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
string? url;
switch (notificationConnectorType)
{
case NotificationConnector.NotificationConnectorType.Gotify:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("appToken", out string? appToken))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'appToken' missing.");
Gotify gotify = new (this, url, appToken);
this.notificationConnectors.RemoveWhere(nc =>
nc.notificationConnectorType == NotificationConnector.NotificationConnectorType.Gotify);
this.notificationConnectors.Add(gotify);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, gotify);
case NotificationConnector.NotificationConnectorType.LunaSea:
if(!requestParameters.TryGetValue("webhook", out string? webhook))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'webhook' missing.");
LunaSea lunaSea = new (this, webhook);
this.notificationConnectors.RemoveWhere(nc =>
nc.notificationConnectorType == NotificationConnector.NotificationConnectorType.LunaSea);
this.notificationConnectors.Add(lunaSea);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, lunaSea);
case NotificationConnector.NotificationConnectorType.Ntfy:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
Ntfy ntfy = new(this, url, username, password, null);
this.notificationConnectors.RemoveWhere(nc =>
nc.notificationConnectorType == NotificationConnector.NotificationConnectorType.Ntfy);
this.notificationConnectors.Add(ntfy);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ntfy);
default:
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"NotificationType {Enum.GetName(notificationConnectorType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> PostV2NotificationConnectorTypeTest(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
string? url;
switch (notificationConnectorType)
{
case NotificationConnector.NotificationConnectorType.Gotify:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("appToken", out string? appToken))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'appToken' missing.");
Gotify gotify = new (this, url, appToken);
gotify.SendNotification("Tranga Test", "It was successful :3");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, gotify);
case NotificationConnector.NotificationConnectorType.LunaSea:
if(!requestParameters.TryGetValue("webhook", out string? webhook))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'webhook' missing.");
LunaSea lunaSea = new (this, webhook);
lunaSea.SendNotification("Tranga Test", "It was successful :3");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, lunaSea);
case NotificationConnector.NotificationConnectorType.Ntfy:
if(!requestParameters.TryGetValue("url", out url))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'url' missing.");
if(!requestParameters.TryGetValue("username", out string? username))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'username' missing.");
if(!requestParameters.TryGetValue("password", out string? password))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotAcceptable, "Parameter 'password' missing.");
Ntfy ntfy = new(this, url, username, password, null);
ntfy.SendNotification("Tranga Test", "It was successful :3");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, ntfy);
default:
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.MethodNotAllowed, $"NotificationType {Enum.GetName(notificationConnectorType)} is not supported.");
}
}
private ValueTuple<HttpStatusCode, object?> DeleteV2NotificationConnectorType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, true, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {groups[1].Value} does not exist.");
}
if(notificationConnectors.All(nc => nc.notificationConnectorType != notificationConnectorType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"NotificationType {Enum.GetName(notificationConnectorType)} not configured.");
else
{
notificationConnectors.Remove(notificationConnectors.First(nc => nc.notificationConnectorType == notificationConnectorType));
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}
}

View File

@@ -1,108 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using Tranga.MangaConnectors;
namespace Tranga.Server;
public partial class Server
{
private ValueTuple<HttpStatusCode, object?> GetV2Settings(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.AsJObject());
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsUserAgent(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.userAgent);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsUserAgent(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("value", out string? userAgent))
{
TrangaSettings.UpdateUserAgent(null);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.Accepted, null);
}
else
{
TrangaSettings.UpdateUserAgent(userAgent);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsRateLimitTypes(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, Enum.GetValues<RequestType>().ToDictionary(b =>(byte)b, b => Enum.GetName(b)) );
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsRateLimit(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.requestLimits);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsRateLimit(GroupCollection groups, Dictionary<string, string> requestParameters)
{
foreach (KeyValuePair<string, string> kv in requestParameters)
{
if(!Enum.TryParse(kv.Key, out RequestType requestType) ||
!int.TryParse(kv.Value, out int requestsPerMinute))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, null);
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
}
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.requestLimits);
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsRateLimitType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, out RequestType requestType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"RequestType {groups[1].Value} does not exist.");
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.requestLimits[requestType]);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsRateLimitType(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if(groups.Count < 1 ||
!Enum.TryParse(groups[1].Value, out RequestType requestType))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, $"RequestType {groups[1].Value} does not exist.");
if (!requestParameters.TryGetValue("value", out string? requestsPerMinuteStr) ||
!int.TryParse(requestsPerMinuteStr, out int requestsPerMinute))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Errors parsing requestsPerMinute");
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> GetV2SettingsAprilFoolsMode(GroupCollection groups, Dictionary<string, string> requestParameters)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, TrangaSettings.aprilFoolsMode);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsAprilFoolsMode(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("value", out string? trueFalseStr) ||
!bool.TryParse(trueFalseStr, out bool trueFalse))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Errors parsing 'value'");
TrangaSettings.UpdateAprilFoolsMode(trueFalse);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
private ValueTuple<HttpStatusCode, object?> PostV2SettingsDownloadLocation(GroupCollection groups, Dictionary<string, string> requestParameters)
{
if (!requestParameters.TryGetValue("location", out string? folderPath))
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.NotFound, "Missing Parameter 'location'");
try
{
bool moveFiles = requestParameters.TryGetValue("moveFiles", out string? moveFilesStr) switch
{
false => true,
true => bool.Parse(moveFilesStr!)
};
TrangaSettings.UpdateDownloadLocation(folderPath, moveFiles);
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.OK, null);
}
catch (FormatException)
{
return new ValueTuple<HttpStatusCode, object?>(HttpStatusCode.InternalServerError, "Error Parsing Parameter 'moveFiles'");
}
}
}

View File

@@ -8,7 +8,8 @@ public partial class Tranga : GlobalBase
{
public bool keepRunning;
public JobBoss jobBoss;
private Server.Server _server;
private Server _server;
private HashSet<MangaConnector> _connectors;
public Tranga(Logger? logger) : base(logger)
{
@@ -17,22 +18,24 @@ public partial class Tranga : GlobalBase
_connectors = new HashSet<MangaConnector>()
{
new Manganato(this),
new Mangasee(this),
new MangaDex(this),
new MangaKatana(this),
new Mangaworld(this),
new Bato(this),
new MangaLife(this),
new ManhuaPlus(this),
new MangaHere(this),
new AsuraToon(this),
new Weebcentral(this),
new Webtoons(this),
};
foreach(DirectoryInfo dir in new DirectoryInfo(Path.GetTempPath()).GetDirectories("trangatemp"))//Cleanup old temp folders
dir.Delete();
jobBoss = new(this, this._connectors);
StartJobBoss();
this._server = new Server.Server(this);
this._server = new Server(this);
string[] emojis = { "(•‿•)", "(づ \u25d5‿\u25d5 )づ", "( \u02d8\u25bd\u02d8)っ\u2668", "=\uff3e\u25cf \u22cf \u25cf\uff3e=", "(ΦωΦ)", "(\u272a\u3268\u272a)", "( ノ・o・ )ノ", "(〜^\u2207^ )〜", "~(\u2267ω\u2266)~","૮ \u00b4• ﻌ \u00b4• ა", "(\u02c3ᆺ\u02c2)", "(=\ud83d\udf66 \u0f1d \ud83d\udf66=)"};
SendNotifications("Tranga Started", emojis[Random.Shared.Next(0,emojis.Length-1)]);
Log(TrangaSettings.AsJObject().ToString());
}
public MangaConnector? GetConnector(string name)
@@ -49,9 +52,9 @@ public partial class Tranga : GlobalBase
return connector is not null;
}
public IEnumerable<string> GetConnectors()
public IEnumerable<MangaConnector> GetConnectors()
{
return _connectors.Select(c => c.name);
return _connectors;
}
public Manga? GetPublicationById(string internalId) => GetCachedManga(internalId);

View File

@@ -1,19 +1,19 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0</TargetFramework>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<OutputType>Exe</OutputType>
<LangVersion>12</LangVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="GlaxArguments" Version="1.1.0" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.46" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.72" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
<PackageReference Include="PuppeteerSharp" Version="10.0.0" />
<PackageReference Include="PuppeteerSharp" Version="20.1.0" />
<PackageReference Include="Soenneker.Utils.String.NeedlemanWunsch" Version="2.1.301" />
<PackageReference Include="System.Drawing.Common" Version="9.0.0-preview.7.24405.4" />
</ItemGroup>
<ItemGroup>

View File

@@ -15,12 +15,13 @@ public static class TrangaSettings
public static string workingDirectory { get; private set; } = Path.Join(RuntimeInformation.IsOSPlatform(OSPlatform.Linux) ? "/usr/share" : Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "tranga-api");
public static int apiPortNumber { get; private set; } = 6531;
public static string userAgent { get; private set; } = DefaultUserAgent;
public static bool bufferLibraryUpdates { get; private set; } = false;
public static bool bufferNotifications { get; private set; } = false;
[JsonIgnore] public static string settingsFilePath => Path.Join(workingDirectory, "settings.json");
[JsonIgnore] public static string libraryConnectorsFilePath => Path.Join(workingDirectory, "libraryConnectors.json");
[JsonIgnore] public static string notificationConnectorsFilePath => Path.Join(workingDirectory, "notificationConnectors.json");
[JsonIgnore] public static string jobsFolderPath => Path.Join(workingDirectory, "jobs");
[JsonIgnore] public static string coverImageCache => Path.Join(workingDirectory, "imageCache");
[JsonIgnore] public static string mangaCacheFolderPath => Path.Join(workingDirectory, "mangaCache");
public static ushort? version { get; } = 2;
public static bool aprilFoolsMode { get; private set; } = true;
[JsonIgnore]internal static readonly Dictionary<RequestType, int> DefaultRequestLimits = new ()
@@ -34,6 +35,8 @@ public static class TrangaSettings
};
public static Dictionary<RequestType, int> requestLimits { get; set; } = DefaultRequestLimits;
public static int ChromiumStartupTimeoutMs { get; set; } = 30000;
public static int ChromiumPageTimeoutMs { get; set; } = 30000;
public static void LoadFromWorkingDirectory(string directory)
{
@@ -47,15 +50,17 @@ public static class TrangaSettings
ExportSettings();
}
public static void CreateOrUpdate(string? downloadDirectory = null, string? pWorkingDirectory = null, int? pApiPortNumber = null, string? pUserAgent = null, bool? pAprilFoolsMode = null)
public static void CreateOrUpdate(string? downloadDirectory = null, string? pWorkingDirectory = null, int? pApiPortNumber = null, string? pUserAgent = null, bool? pAprilFoolsMode = null, bool? pBufferLibraryUpdates = null, bool? pBufferNotifications = null)
{
if(pWorkingDirectory is null && File.Exists(settingsFilePath))
LoadFromWorkingDirectory(workingDirectory);
TrangaSettings.downloadLocation = downloadDirectory ?? TrangaSettings.downloadLocation;
TrangaSettings.workingDirectory = pWorkingDirectory ?? TrangaSettings.workingDirectory;
TrangaSettings.apiPortNumber = pApiPortNumber ?? TrangaSettings.apiPortNumber;
TrangaSettings.userAgent = pUserAgent ?? TrangaSettings.userAgent;
TrangaSettings.aprilFoolsMode = pAprilFoolsMode ?? TrangaSettings.aprilFoolsMode;
downloadLocation = downloadDirectory ?? downloadLocation;
workingDirectory = pWorkingDirectory ?? workingDirectory;
apiPortNumber = pApiPortNumber ?? apiPortNumber;
userAgent = pUserAgent ?? userAgent;
aprilFoolsMode = pAprilFoolsMode ?? aprilFoolsMode;
bufferLibraryUpdates = pBufferLibraryUpdates ?? bufferLibraryUpdates;
bufferNotifications = pBufferNotifications ?? bufferNotifications;
Directory.CreateDirectory(downloadLocation);
Directory.CreateDirectory(workingDirectory);
ExportSettings();
@@ -91,7 +96,7 @@ public static class TrangaSettings
public static void UpdateAprilFoolsMode(bool enabled)
{
TrangaSettings.aprilFoolsMode = enabled;
aprilFoolsMode = enabled;
ExportSettings();
}
@@ -103,10 +108,10 @@ public static class TrangaSettings
else
Directory.CreateDirectory(newPath);
if (moveFiles && Directory.Exists(TrangaSettings.downloadLocation))
Directory.Move(TrangaSettings.downloadLocation, newPath);
if (moveFiles && Directory.Exists(downloadLocation))
Directory.Move(downloadLocation, newPath);
TrangaSettings.downloadLocation = newPath;
downloadLocation = newPath;
ExportSettings();
}
@@ -117,26 +122,26 @@ public static class TrangaSettings
GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
else
Directory.CreateDirectory(newPath);
Directory.Move(TrangaSettings.workingDirectory, newPath);
TrangaSettings.workingDirectory = newPath;
Directory.Move(workingDirectory, newPath);
workingDirectory = newPath;
ExportSettings();
}
public static void UpdateUserAgent(string? customUserAgent)
{
TrangaSettings.userAgent = customUserAgent ?? DefaultUserAgent;
userAgent = customUserAgent ?? DefaultUserAgent;
ExportSettings();
}
public static void UpdateRateLimit(RequestType requestType, int newLimit)
{
TrangaSettings.requestLimits[requestType] = newLimit;
requestLimits[requestType] = newLimit;
ExportSettings();
}
public static void ResetRateLimits()
{
TrangaSettings.requestLimits = DefaultRequestLimits;
requestLimits = DefaultRequestLimits;
ExportSettings();
}
@@ -155,13 +160,17 @@ public static class TrangaSettings
public static JObject AsJObject()
{
JObject jobj = new JObject();
jobj.Add("downloadLocation", JToken.FromObject(TrangaSettings.downloadLocation));
jobj.Add("workingDirectory", JToken.FromObject(TrangaSettings.workingDirectory));
jobj.Add("apiPortNumber", JToken.FromObject(TrangaSettings.apiPortNumber));
jobj.Add("userAgent", JToken.FromObject(TrangaSettings.userAgent));
jobj.Add("aprilFoolsMode", JToken.FromObject(TrangaSettings.aprilFoolsMode));
jobj.Add("version", JToken.FromObject(TrangaSettings.version));
jobj.Add("requestLimits", JToken.FromObject(TrangaSettings.requestLimits));
jobj.Add("downloadLocation", JToken.FromObject(downloadLocation));
jobj.Add("workingDirectory", JToken.FromObject(workingDirectory));
jobj.Add("apiPortNumber", JToken.FromObject(apiPortNumber));
jobj.Add("userAgent", JToken.FromObject(userAgent));
jobj.Add("aprilFoolsMode", JToken.FromObject(aprilFoolsMode));
jobj.Add("version", JToken.FromObject(version));
jobj.Add("requestLimits", JToken.FromObject(requestLimits));
jobj.Add("bufferLibraryUpdates", JToken.FromObject(bufferLibraryUpdates));
jobj.Add("bufferNotifications", JToken.FromObject(bufferNotifications));
jobj.Add("chromiumStartTimeout", JToken.FromObject(ChromiumStartupTimeoutMs));
jobj.Add("chromiumPageTimeout", JToken.FromObject(ChromiumPageTimeoutMs));
return jobj;
}
@@ -171,16 +180,24 @@ public static class TrangaSettings
{
JObject jobj = JObject.Parse(serialized);
if (jobj.TryGetValue("downloadLocation", out JToken? dl))
TrangaSettings.downloadLocation = dl.Value<string>()!;
downloadLocation = dl.Value<string>()!;
if (jobj.TryGetValue("workingDirectory", out JToken? wd))
TrangaSettings.workingDirectory = wd.Value<string>()!;
workingDirectory = wd.Value<string>()!;
if (jobj.TryGetValue("apiPortNumber", out JToken? apn))
TrangaSettings.apiPortNumber = apn.Value<int>();
apiPortNumber = apn.Value<int>();
if (jobj.TryGetValue("userAgent", out JToken? ua))
TrangaSettings.userAgent = ua.Value<string>()!;
userAgent = ua.Value<string>()!;
if (jobj.TryGetValue("aprilFoolsMode", out JToken? afm))
TrangaSettings.aprilFoolsMode = afm.Value<bool>()!;
aprilFoolsMode = afm.Value<bool>()!;
if (jobj.TryGetValue("requestLimits", out JToken? rl))
TrangaSettings.requestLimits = rl.ToObject<Dictionary<RequestType, int>>()!;
requestLimits = rl.ToObject<Dictionary<RequestType, int>>()!;
if (jobj.TryGetValue("bufferLibraryUpdates", out JToken? blu))
bufferLibraryUpdates = blu.Value<bool>()!;
if (jobj.TryGetValue("bufferNotifications", out JToken? bn))
bufferNotifications = bn.Value<bool>()!;
if (jobj.TryGetValue("chromiumStartTimeout", out JToken? cst))
ChromiumStartupTimeoutMs = cst.Value<int>();
if (jobj.TryGetValue("chromiumPageTimeout", out JToken? cpt))
ChromiumPageTimeoutMs = cpt.Value<int>();
}
}

View File

@@ -1,383 +0,0 @@
## Tranga API Calls
This document outlines the HTTP API calls that Tranga accepts. Tranga expects a specific HTTP method for each call, so pay careful attention to the method when making them.
In the examples below, `{apiUri}` refers to your `http(s)://TRANGA.FRONTEND.URI/api`. Parameters are included in the HTTP request URI, request bodies are sent as JSON, and Tranga always responds with JSON in the response body.
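To make the pattern concrete, here is a minimal C# sketch of a client call, assuming a reachable Tranga instance; the base URI is only a stand-in for your `{apiUri}`:
```csharp
// Minimal sketch of calling the Tranga API; the base URI below is a placeholder for your {apiUri}.
using System;
using System.Net.Http;

string apiUri = "http://tranga.example.com/api";

using HttpClient client = new();

// Parameters (when a call takes any) are appended to the request URI;
// Tranga answers with JSON in the response body.
string connectorsJson = await client.GetStringAsync($"{apiUri}/Connectors");
Console.WriteLine(connectorsJson);
```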
#### [GET] /Connectors
Retrieves the available manga sites (connectors) that Tranga is currently able to download manga from.
- Parameters:
None
- Request Body:
None
#### [GET] /Jobs
Retrieves all jobs that Tranga is keeping track of, includes Running Jobs, Waiting Jobs, Manga Tracking (Monitoring) Jobs.
- Parameters:
None
- Request Body:
None
#### [DELETE] /Jobs
Removes the job specified by the given job ID.
- Parameters:
None
- Request Body:
```
{
jobId: ${Tranga Job ID}
}
```
#### [POST] /Jobs/Cancel
Cancels a running job or prevents a queued job from running.
- Parameters:
None
- Request Body:
```
{
jobId: ${Tranga Job ID}
}
```
#### [POST] /Jobs/DownloadNewChapters
Manually adds a Job to Tranga's queue to check for and download new chapters for a specified manga
- Parameters:
None
- Request Body:
```
{
connector: ${Manga Connector to Download From}
internalId: ${Tranga Manga ID}
translatedLanguage: ${Manga Language}
}
```
#### [GET] /Jobs/Running
Retrieves all currently running jobs.
- Parameters:
None
- Request Body:
None
#### [POST] /Jobs/StartNow
Manually starts a configured job
- Parameters:
None
- Request Body:
```
{
jobId: ${Tranga Job ID}
}
```
#### [GET] /Jobs/Waiting
Retrieves all currently queued jobs.
- Parameters:
None
- Request Body:
None
#### [GET] /Jobs/MonitorJobs
Retrieves all jobs for Mangas that Tranga is currently tracking.
- Parameters:
None
- Request Body:
None
#### [POST] /Jobs/MonitorManga
Adds a new manga for Tranga to monitor. An example request is shown below.
- Parameters:
None
- Request Body:
```
{
connector: ${Manga Connector to download from}
internalId: ${Tranga Manga ID}
interval: ${Interval at which to run job, in the HH:MM:SS format}
translatedLanguage: ${Supported language code}
ignoreBelowChapterNum: ${Chapter number to start downloading from}
customFolderName: ${Folder Name to save Manga to}
}
```
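As a hedged C# sketch of such a request (every value is a placeholder; `connector` should be one of the names returned by `GET /Connectors`):
```csharp
// Sketch of adding a manga to monitoring; all values below are placeholders.
using System;
using System.Net.Http;
using System.Text;

string apiUri = "http://tranga.example.com/api"; // your {apiUri}

string body = """
{
  "connector": "MangaDex",
  "internalId": "<Tranga Manga ID>",
  "interval": "03:00:00",
  "translatedLanguage": "en",
  "ignoreBelowChapterNum": 0,
  "customFolderName": "My Manga"
}
""";

using HttpClient client = new();
HttpResponseMessage response = await client.PostAsync(
    $"{apiUri}/Jobs/MonitorManga",
    new StringContent(body, Encoding.UTF8, "application/json"));
Console.WriteLine(response.StatusCode);
```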
#### [GET] /Jobs/Progress
Retrieves the current completion progress of a running or waiting job. Tranga's ID for the job is returned by each of the `GET /Jobs` API calls. An example request is shown below.
- Parameters:
- `{jobId}`: Tranga Job ID
- Request Body:
None
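Because `jobId` is a parameter, it belongs in the request URI rather than in a body; a small sketch with a placeholder ID:
```csharp
// Sketch of polling a job's progress; the job ID is a placeholder taken from a /Jobs response.
using System;
using System.Net.Http;

string apiUri = "http://tranga.example.com/api"; // your {apiUri}
string jobId = Uri.EscapeDataString("<Tranga Job ID>");

using HttpClient client = new();
string progressJson = await client.GetStringAsync($"{apiUri}/Jobs/Progress?jobId={jobId}");
Console.WriteLine(progressJson);
```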
#### [POST] /Jobs/UpdateMetadata
Updates the metadata for all monitored mangas
- Parameters:
None
- Request Body:
None
#### [GET] /LibraryConnectors
Retrieves the currently configured library servers
- Parameters:
None
- Request Body:
None
#### [DELETE] /LibraryConnectors/Reset
Resets or clears a configured library connector
- Parameters:
None
- Request Body:
```
{
libraryConnector: Komga/Kavita
}
```
#### [POST] /LibraryConnectors/Test
Verifies the behavior of a library connector before saving it; the connection should be checked to confirm that it is active. An example request is shown below.
- Parameters:
None
- Request Body:
```
{
libraryConnector: Komga/Kavita
libraryURL: ${Library URL}
komgaAuth: Only when libraryConnector = Komga
kavitaUsername: Only when libraryConnector = Kavita
kavitaPassword: Only when libraryConnector = Kavita
}
```
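A sketch of testing a Komga connection (the URL and auth value are placeholders; for Kavita, send `kavitaUsername` and `kavitaPassword` instead of `komgaAuth`):
```csharp
// Sketch of testing a Komga library connector before saving it; values are placeholders.
using System;
using System.Net.Http;
using System.Text;

string apiUri = "http://tranga.example.com/api"; // your {apiUri}

string body = """
{
  "libraryConnector": "Komga",
  "libraryURL": "http://komga.example.com:25600",
  "komgaAuth": "<Komga auth value>"
}
""";

using HttpClient client = new();
HttpResponseMessage response = await client.PostAsync(
    $"{apiUri}/LibraryConnectors/Test",
    new StringContent(body, Encoding.UTF8, "application/json"));
Console.WriteLine(response.StatusCode);
```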
#### [GET] /LibraryConnectors/Types
Retrieves Key-Value pairs for all of Tranga's currently supported library servers.
- Parameters:
None
- Request Body:
None
#### [POST] /LibraryConnectors/Update
Updates or Adds a Library Connector to Tranga
- Parameters: None
- Request Body:
```
{
libraryConnector: Komga/Kavita
libraryURL: ${Library URL}
komgaAuth: Only when libraryConnector = Komga
kavitaUsername: Only when libraryConnector = Kavita
kavitaPassword: Only when libraryConnector = Kavita
}
```
#### [GET] /LogFile
Retrieves the log file from the running Tranga instance
- Parameters:
None
- Request Body:
None
#### [GET] /Manga/FromConnector
Retrieves the details of a specified manga from a specific connector. If the manga title returned by Tranga is a URL (determined by the presence of `http` in the title), the call should pass it as `url` rather than `title`. Both forms are sketched below.
- Parameters:
- `{connector}`: Manga Connector
- `{url/title}`: Manga URL/Title
- Request Body:
None
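A sketch of both forms of the call (the connector name and lookup values are placeholders):
```csharp
// Sketch of looking a manga up on a connector, by title or directly by URL; values are placeholders.
using System;
using System.Net.Http;

string apiUri = "http://tranga.example.com/api"; // your {apiUri}
string connector = "MangaDex";                   // any name returned by GET /Connectors
string title = Uri.EscapeDataString("<manga title>");
string url = Uri.EscapeDataString("<manga url>");

using HttpClient client = new();

// Usual case: search the connector by title.
string byTitle = await client.GetStringAsync($"{apiUri}/Manga/FromConnector?connector={connector}&title={title}");

// If the title Tranga reports is itself a link (contains "http"), pass it as url instead.
string byUrl = await client.GetStringAsync($"{apiUri}/Manga/FromConnector?connector={connector}&url={url}");

Console.WriteLine(byTitle);
Console.WriteLine(byUrl);
```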
#### [GET] /Manga/Chapters
Retrieves the currently available chapters for a specified manga from a connector. The `{internalId}` is how Tranga uniquely recognizes and distinguishes different Manga.
- Parameters:
- `{connector}`: Manga Connector
- `{internalId}`: Tranga Manga ID
- `{translatedLanguage}`: Translated Language
- Request Body:
None
#### [GET] /Manga/Cover
Retrieves the URL of the cover image for a specific manga that Tranga is tracking.
- Parameters:
- `{internalId}`: Tranga Manga ID
- Request Body:
None
#### [GET] /NotificationConnectors
Retrieves the currently configured notification providers
- Parameters:
None
- Request Body:
None
#### [DELETE] /NotificationConnectors/Reset
Resets or clears a configured notification connector
- Parameters:
None
- Request Body:
```
{
notificationConnector: Gotify/Ntfy/LunaSea
}
```
#### [POST] /NotificationConnectors/Test
Tests a notification connector with the supplied settings; the connector's behavior should be checked to verify that the settings are correct.
- Parameters:
None
- Request Body:
```
{
notificationConnector: Gotify/Ntfy/LunaSea
gotifyUrl:
gotifyAppToken:
lunaseaWebhook:
ntfyUrl:
ntfyAuth:
}
```
#### [POST] /NotificationConnectors/Update
Updates or Adds a notification connector to Tranga
- Parameters:
None
- Request Body:
```
{
notificationConnector: Gotify/Ntfy/LunaSea
gotifyUrl:
gotifyAppToken:
lunaseaWebhook:
ntfyUrl:
ntfyAuth:
}
```
#### [GET] /NotificationConnectors/Types
Retrieves Key-Value pairs for all of Tranga's currently supported notification providers.
- Parameters:
None
- Request Body:
None
#### [GET] /Ping
This call is used periodically by the web frontend to confirm that the connection to the server is active.
- Parameters:
None
- Request Body:
None
#### [GET] /Settings
Retrieves the content of Tranga's `settings.json`
- Parameters:
None
- Request Body:
None
#### [GET] /Settings/customRequestLimit
Retrieves the configured rate limits for different types of manga connector requests.
- Parameters:
None
- Request Body:
None
#### [POST] /Settings/customRequestLimit
Sets the rate limits for different types of manga connector requests. An example request is shown below.
- Parameters:
None
- Request Body:
```
{
requestType: {Request Byte}
requestsPerMinute: {Rate Limit in Requests Per Minute}
}
```
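A sketch that sets one limit and reads the configured limits back (both numbers are placeholders; `requestType` is the byte identifying the request category):
```csharp
// Sketch of updating a rate limit and reading the configured limits back; numbers are placeholders.
using System;
using System.Net.Http;
using System.Text;

string apiUri = "http://tranga.example.com/api"; // your {apiUri}

string body = """
{
  "requestType": 0,
  "requestsPerMinute": 60
}
""";

using HttpClient client = new();
await client.PostAsync(
    $"{apiUri}/Settings/customRequestLimit",
    new StringContent(body, Encoding.UTF8, "application/json"));

// Read the limits back to confirm the change took effect.
Console.WriteLine(await client.GetStringAsync($"{apiUri}/Settings/customRequestLimit"));
```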
#### [POST] /Settings/UpdateDownloadLocation
Updates the root directory where Tranga downloads manga.
- Parameters:
None
- Request Body:
```
{
downloadLocation: {New Root Directory}
moveFiles: "true"/"false"
}
```
#### [POST] /Settings/userAgent
Updates the user agent that Tranga uses when scraping the web.
- Parameters:
None
- Request Body:
```
{
userAgent: {User Agent String}
}
```

File diff suppressed because it is too large

View File

@@ -1,41 +0,0 @@
## Manga
```json
{
}
```
## Chapter
```json
{
}
```
## Job
```json
{
}
```
## ProgressToken
```json
{
}
```
## Settings
```json
{
}
```
## LibraryConnector
```json
{
}
```
## NotificationConnector
```json
{
}
```