Compare commits

792 Commits
1.4 ... master

Author SHA1 Message Date
5186ae66c9
Merge pull request #270 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.7.1
Bump docker/setup-buildx-action from 3.6.1 to 3.7.1
2024-10-23 16:11:06 +02:00
c35e1ef517
Merge pull request #269 from C9Glax/dependabot/github_actions/docker/build-push-action-6.9.0
Bump docker/build-push-action from 6.7.0 to 6.9.0
2024-10-23 16:10:52 +02:00
dependabot[bot]
8f6891142b
Bump docker/setup-buildx-action from 3.6.1 to 3.7.1
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.6.1 to 3.7.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.6.1...v3.7.1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-23 05:49:09 +00:00
dependabot[bot]
b52e6d4908
Bump docker/build-push-action from 6.7.0 to 6.9.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.7.0 to 6.9.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.7.0...v6.9.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-23 05:49:07 +00:00
30c44760e7
Merge pull request #256 from C9Glax/cuttingedge-merge-candidate
Cuttingedge merge candidate
2024-09-29 01:13:56 +02:00
a3ae3c320d Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-candidate 2024-09-29 01:07:59 +02:00
ea262889e6 It's late. Set TARGETPLATFORM in base 2024-09-29 01:02:50 +02:00
445542b653 Set --platform to BUILDPLATFORM for dotnet 2024-09-29 00:58:24 +02:00
b7718220ef Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-candidate 2024-09-29 00:54:28 +02:00
34c62e8658 Remove cache step from cuttingedge workflow, set --platform to TARGETPLATFORM instead 2024-09-29 00:50:53 +02:00
a9fcc93670
Merge pull request #257 from C9Glax/master
Update docker-image-cuttingedge.yml
2024-09-29 00:44:17 +02:00
68d7ef258f
Update docker-image-cuttingedge.yml
Clear Cache on build
2024-09-29 00:40:59 +02:00
fdea4f5ea5 Merge branch 'cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 17:09:19 +02:00
ac3039e587 Add Star-Graph to README 2024-09-27 17:08:59 +02:00
3829a1cf26 Merge branch 'refs/heads/cuttingedge' into cuttingedge-merge-candidate 2024-09-27 15:03:51 +02:00
c3daa0b751 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 15:03:44 +02:00
3a072beea3 Update Readme:
* Fix dotnet Version
* Link directly to new issue for new Connectors
* Add Ntfy as Notification Connector
* Remove Roadmap
2024-09-27 15:03:06 +02:00
8e6f2798a9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge-merge-candidate 2024-09-27 14:58:07 +02:00
9cbde9a6b4 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 14:57:57 +02:00
0870aa9fdb Merge branch 'refs/heads/master' into cuttingedge-merge-ServerV2 2024-09-27 14:57:36 +02:00
172650e644
Merge pull request #254 from C9Glax/cuttingedge-merge-candidate
Cuttingedge merge candidate
2024-09-27 14:53:24 +02:00
52ff2e54a8 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-27 14:51:11 +02:00
61d80a93cf Fix #255 MangaKatana sanitization. 2024-09-27 14:50:57 +02:00
7be3ee52e9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-23 15:40:53 +02:00
981eb0fd9f Fix notification batching:
Do not resend old notifications.
2024-09-23 15:40:43 +02:00
47f3044a6d Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-22 00:15:59 +02:00
6d03cc5f8d Fix incorrect setting check for notificationsbuffer 2024-09-22 00:15:50 +02:00
290c405f52 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-22 00:09:54 +02:00
fcdbd32872 Include amount of notifications of type in title 2024-09-22 00:09:45 +02:00
eb6c37cc53 Output settings.json on startup 2024-09-22 00:05:09 +02:00
d922842186 Add NotificationBuffer, so Notifications are not spammed on every chapter. 2024-09-22 00:02:43 +02:00
69323d6d60 Add LibraryBuffer, so Libraries are not spammed with scans on every download. 2024-09-21 21:02:55 +02:00
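
The two buffering commits above share one idea: collect events and flush them as a batch instead of firing one notification per chapter, with the per-type count placed in the title (as in the commit a few lines up). A minimal C# sketch of that pattern, using hypothetical `Notification`/`NotificationBuffer` names rather than Tranga's actual classes:

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative notification shape; not Tranga's actual type.
public record Notification(string Type, string Title, string Body);

public class NotificationBuffer
{
    private readonly List<Notification> _pending = new();

    public void Add(Notification notification) => _pending.Add(notification);

    // Flush merges all buffered notifications into one per type and puts the
    // count of that type into the title, instead of sending one per chapter.
    public List<Notification> Flush()
    {
        List<Notification> merged = _pending
            .GroupBy(n => n.Type)
            .Select(g => new Notification(
                g.Key,
                $"{g.Key} ({g.Count()})",
                string.Join("\n", g.Select(n => n.Body))))
            .ToList();
        _pending.Clear();
        return merged;
    }
}
```
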
46a0fb8c48 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-21 20:34:57 +02:00
ec8eb40941 Allow Versions to lose their volume number, if site no longer lists it. 2024-09-21 20:30:55 +02:00
d2074fae35 Readable CheckChapterIsDownloaded check 2024-09-21 20:23:21 +02:00
713bbc230f Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-18 18:56:09 +02:00
32ab9a552f Also delete files on UpdateJobFile if we don't provide a filepath 2024-09-18 18:56:01 +02:00
c11c68d6d7 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-18 18:46:02 +02:00
09fdb6e5f1 Fix #250 old jobs getting re-exported. 2024-09-18 18:45:55 +02:00
e86ad03b1e Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-17 00:51:30 +02:00
9dfbe89e87 include --platform=$BUILDPLATFORM in Dockerfile 2024-09-17 00:51:22 +02:00
98e75af486 Merge branch 'cuttingedge' of ssh://git.bernloehr.eu:222/glax/Tranga into cuttingedge 2024-09-16 23:21:13 +02:00
e2f5c3badc Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 23:18:57 +02:00
cda07bb9aa Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 23:09:43 +02:00
7c18466e95 Fix NETSDK1194 on build 2024-09-16 23:09:34 +02:00
ce1c4d3f65 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 22:48:06 +02:00
52d0489a1b Fix duplicate mangas on startup 2024-09-16 22:47:55 +02:00
f89aea6ac8 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 21:19:27 +02:00
5f05ba1049 Make SupportedLanguages public. 2024-09-16 21:19:19 +02:00
a20ee01cfa Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 21:17:18 +02:00
cf5cbba9a8 #247 Add supported languages to Mangaconnectors 2024-09-16 21:17:07 +02:00
600b56033d Upgrade to Dotnet 8.0 LangVer 12 2024-09-16 21:11:50 +02:00
fdea3659f1 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 20:38:19 +02:00
7f3754fb64 Fix startup issue/issue with existing chapters: ProgressToken would not complete 2024-09-16 20:36:40 +02:00
2dac5db4da Create single Chromium Instance that is shared between all Connectors.
Fix pages staying open when page could not be loaded.
2024-09-16 20:30:23 +02:00
3456fc6564 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 19:52:39 +02:00
35f2625f05 Fix #249 Manhuaplus where author/tags are not set. 2024-09-16 19:52:25 +02:00
0b9948e367 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-16 18:32:45 +02:00
96f3dbce65 Throw more readable exceptions if deserialization fails for Mangaconnectors.
#249
2024-09-16 18:32:34 +02:00
895128a462 Merge remote-tracking branch 'origin/cuttingedge-merge-ServerV2' into cuttingedge-merge-ServerV2 2024-09-16 18:24:39 +02:00
a94186455b Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-09-11 14:41:35 +02:00
7d3deee74c Remove unused constant 2024-09-11 14:40:28 +02:00
5980b64caa Readable Chapter comparison 2024-09-11 14:40:03 +02:00
cbecb257ef Remove unused constant 2024-09-11 14:39:16 +02:00
8316ed08a7
Merge pull request #245 from C9Glax/cuttingedge
Prod didn't break, nice
2024-09-09 10:10:36 +02:00
7ff9ac53ee Build all docker images with new workflow #233 2024-09-09 09:42:52 +02:00
6faaaf4139 Fix #243 Moving Publication folders, overwrite files, merge folders 2024-09-09 09:23:25 +02:00
9b8b80cd24 Fix response closed on OPTIONS request 2024-09-07 20:44:15 +02:00
15f3e2b8ec Use current time as internalId for Manga instead of BASE64 string of title
#232
Fix #237
2024-09-07 20:33:03 +02:00
2be29e4019 MangaDex only download single release for chapter.
Fix #219
2024-09-07 20:16:05 +02:00
e8dbf7a718
Merge pull request #233 from vonProteus/arm64
Added support for ARM
2024-08-31 20:57:44 +02:00
vonProteus
a968f4328d
Added support for ARM 2024-08-31 20:38:10 +02:00
398b6fff05
Merge pull request #230 from C9Glax/cuttingedge-merge-candidate
Cuttingedge merge candidate
2024-08-31 20:25:33 +02:00
f5da2f8526
Merge pull request #231 from C9Glax/dependabot/github_actions/docker/build-push-action-6.7.0
Bump docker/build-push-action from 6.6.1 to 6.7.0
2024-08-31 20:24:43 +02:00
dependabot[bot]
73093ab86c
Bump docker/build-push-action from 6.6.1 to 6.7.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.6.1 to 6.7.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.6.1...v6.7.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-08-27 05:55:58 +00:00
fccaf9fcbe Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 20:47:06 +02:00
3122aa32e8 fix #223 wrong selector 2024-08-26 20:46:50 +02:00
02fad2dd44 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 20:28:51 +02:00
e0a7d1a187 Fix #220 Mangaworld Chapter number parsing 2024-08-26 20:28:40 +02:00
d0f9a4102c Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 20:18:44 +02:00
9f178821b6 Fix #223 Manganato chapter relative dates. 2024-08-26 20:18:35 +02:00
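
For context on the relative-date fix above: sites like Manganato list chapters with strings such as "3 days ago", which have to be converted to absolute timestamps. A hedged sketch of that kind of parsing; the accepted formats and names are assumptions for illustration, not code taken from Tranga:

```csharp
using System;
using System.Text.RegularExpressions;

public static class RelativeDate
{
    // Turns "3 days ago"-style strings into absolute timestamps relative to 'now'.
    public static DateTime Parse(string text, DateTime now)
    {
        Match m = Regex.Match(text.ToLowerInvariant(), @"(\d+)\s+(minute|hour|day)s?\s+ago");
        if (!m.Success)
            return DateTime.Parse(text); // fall back to an absolute date string
        int amount = int.Parse(m.Groups[1].Value);
        return m.Groups[2].Value switch
        {
            "minute" => now.AddMinutes(-amount),
            "hour"   => now.AddHours(-amount),
            _        => now.AddDays(-amount),
        };
    }
}
```
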
682fd0bc2a Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 13:22:09 +02:00
dfa8e66f34 Fix try-block in Server.cs 2024-08-26 13:21:54 +02:00
8f51d22303 Fix try-block in Server.cs 2024-08-26 13:21:34 +02:00
d41de84262 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge
# Conflicts:
#	Tranga/Server.cs
2024-08-26 13:21:05 +02:00
1bd20791b8 Add Cache-Control headers 2024-08-26 13:18:48 +02:00
03aeab44cd Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 13:11:41 +02:00
6d723b6355 Fix Settings not returning as JSON 2024-08-26 13:11:00 +02:00
7b91bb699f Fix Settings not loading on reload 2024-08-26 13:10:47 +02:00
14e33cc496 Fix Settings not loading on reload 2024-08-26 13:09:33 +02:00
6f3bba99b0 Fix Settings not returning as JSON 2024-08-26 12:59:19 +02:00
2d848843d0 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 12:37:03 +02:00
63b493fa9c Rework TrangaSettings 2024-08-26 12:36:35 +02:00
001a37b8ef Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 11:18:12 +02:00
69d6884517 #227 Fix wrong filtering, only return top 10 results 2024-08-26 11:17:59 +02:00
db73af3bdd Fix crash when outputstream closes before response could be sent.
#227
2024-08-26 10:38:45 +02:00
59547efab2 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-26 10:35:37 +02:00
f4336f9777 #227 Mangasee Return results that have similarity over 95% or at least top ten results 2024-08-26 10:35:16 +02:00
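
The filtering rule described in the commit above ("similarity over 95% or at least top ten results") can be sketched in a few lines; the `SearchResult` type and the 0–1 similarity scale are assumptions for illustration, not the actual Mangasee connector code:

```csharp
using System.Collections.Generic;
using System.Linq;

public record SearchResult(string Title, float Similarity); // Similarity assumed in [0,1]

public static class SearchFilter
{
    public static List<SearchResult> Filter(IEnumerable<SearchResult> results)
    {
        List<SearchResult> ordered = results.OrderByDescending(r => r.Similarity).ToList();
        List<SearchResult> close   = ordered.Where(r => r.Similarity >= 0.95f).ToList();
        // Keep everything above the threshold, but never fewer than the top ten matches.
        return close.Count >= 10 ? close : ordered.Take(10).ToList();
    }
}
```
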
bec3ac52a9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-20 20:53:09 +02:00
ea37e81ece Fix last commit 2024-08-20 20:53:03 +02:00
6a20783d48 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-20 20:47:21 +02:00
21af75f410 Faster download for image URLs.
#224
2024-08-20 20:47:13 +02:00
a629792818 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-08 21:09:26 +02:00
34dd78810d Update README.md 2024-08-08 21:09:08 +02:00
e1c504226c Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-08 21:04:09 +02:00
200a22228f add log output for Mangahere
https://github.com/C9Glax/tranga/issues/69
2024-08-08 21:02:13 +02:00
bc10136331 MangaHere image download sucks, you have to iterate over all images one by one. Have some extra traffic then, idc.
https://github.com/C9Glax/tranga/issues/69
2024-08-08 21:00:37 +02:00
06df6e0767 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-08-08 19:00:26 +02:00
ba029b71f5 Merge branch 'refs/heads/manhuaplus' into cuttingedge-merge-ServerV2 2024-08-08 19:00:20 +02:00
082802ddbe Merge branch 'refs/heads/master' into cuttingedge-merge-ServerV2 2024-08-08 19:00:09 +02:00
d5f1df0400
Merge pull request #216 from C9Glax/dependabot/github_actions/docker/build-push-action-6.6.1
Bump docker/build-push-action from 6.5.0 to 6.6.1
2024-08-08 18:59:46 +02:00
d00881e611 Add Connector ManhuaPlus
https://github.com/C9Glax/tranga/issues/213
2024-08-08 18:58:40 +02:00
dependabot[bot]
72bc7ec07b
Bump docker/build-push-action from 6.5.0 to 6.6.1
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.5.0 to 6.6.1.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.5.0...v6.6.1)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-08-08 05:08:32 +00:00
89b5aa266e Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-07-31 19:25:03 +02:00
926c0d5833 fix #214 foldernames 2024-07-31 19:24:59 +02:00
80e2568113 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-07-31 17:48:21 +02:00
3b6417eff2 Fix #214 HTML encoded Characters 2024-07-31 17:48:15 +02:00
2812a6dff1 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-07-31 17:44:37 +02:00
1991862a42 Merge remote-tracking branch 'refs/remotes/github/master' into cuttingedge-merge-ServerV2 2024-07-31 17:44:22 +02:00
40e4d5c203
Merge pull request #215 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.6.1
Bump docker/setup-buildx-action from 3.4.0 to 3.6.1
2024-07-31 17:44:05 +02:00
49e9731184
Merge pull request #212 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.2.0
Bump docker/setup-qemu-action from 3.1.0 to 3.2.0
2024-07-31 17:43:57 +02:00
a4e85f254f
Merge pull request #210 from C9Glax/dependabot/github_actions/docker/build-push-action-6.5.0
Bump docker/build-push-action from 6.3.0 to 6.5.0
2024-07-31 17:43:48 +02:00
dependabot[bot]
4f47aeadcf
Bump docker/setup-buildx-action from 3.4.0 to 3.6.1
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.4.0 to 3.6.1.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.4.0...v3.6.1)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-30 05:45:04 +00:00
dependabot[bot]
e0c1356fea
Bump docker/setup-qemu-action from 3.1.0 to 3.2.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 3.1.0 to 3.2.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v3.1.0...v3.2.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-23 06:02:31 +00:00
dependabot[bot]
0d9b3d2499
Bump docker/build-push-action from 6.3.0 to 6.5.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.3.0 to 6.5.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.3.0...v6.5.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-23 06:02:27 +00:00
8e5d15ead9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-07-11 15:46:27 +02:00
b8c28e6d21
Merge pull request #207 from C9Glax/master
Update active dev branch with changes to master
2024-07-11 15:45:33 +02:00
9ea5e436fe
Merge pull request #204 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.4.0
Bump docker/setup-buildx-action from 3.3.0 to 3.4.0
2024-07-11 15:44:39 +02:00
b4c310638a
Merge pull request #205 from C9Glax/dependabot/github_actions/docker/build-push-action-6.3.0
Bump docker/build-push-action from 6.1.0 to 6.3.0
2024-07-11 15:44:17 +02:00
159341ff3c
Merge pull request #206 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.1.0
Bump docker/setup-qemu-action from 2.2.0 to 3.1.0
2024-07-11 15:43:58 +02:00
dependabot[bot]
29338b9b17
Bump docker/setup-qemu-action from 2.2.0 to 3.1.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 2.2.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v2.2.0...v3.1.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-10 05:46:20 +00:00
dependabot[bot]
0eda8913b0
Bump docker/build-push-action from 6.1.0 to 6.3.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.1.0 to 6.3.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.1.0...v6.3.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-10 05:46:17 +00:00
dependabot[bot]
5ca50630e4
Bump docker/setup-buildx-action from 3.3.0 to 3.4.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.3.0...v3.4.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-10 05:46:15 +00:00
d0bfb262bf Merge remote-tracking branch 'refs/remotes/github/master' into cuttingedge-merge-ServerV2 2024-07-09 11:22:05 +02:00
4f14f15ade
Merge pull request #200 from C9Glax/dependabot/github_actions/docker/setup-qemu-action-3.1.0
Bump docker/setup-qemu-action from 2.2.0 to 3.1.0
2024-07-09 11:20:29 +02:00
d89a24fd11
Merge pull request #201 from C9Glax/dependabot/github_actions/docker/build-push-action-6.3.0
Bump docker/build-push-action from 6.1.0 to 6.3.0
2024-07-09 11:20:14 +02:00
a5859e3c82
Merge pull request #203 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.4.0
Bump docker/setup-buildx-action from 3.3.0 to 3.4.0
2024-07-09 11:19:55 +02:00
dd2fa3fbd7 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-07-09 11:17:58 +02:00
33e5d65785 fix Kavita GetLibraries 2024-07-09 11:17:50 +02:00
dependabot[bot]
d60ed77dbe
Bump docker/setup-buildx-action from 3.3.0 to 3.4.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.3.0 to 3.4.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.3.0...v3.4.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-05 05:11:09 +00:00
dependabot[bot]
e15c6816b5
Bump docker/build-push-action from 6.1.0 to 6.3.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 6.1.0 to 6.3.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v6.1.0...v6.3.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-04 05:56:47 +00:00
dependabot[bot]
4a4fe4b40d
Bump docker/setup-qemu-action from 2.2.0 to 3.1.0
Bumps [docker/setup-qemu-action](https://github.com/docker/setup-qemu-action) from 2.2.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-qemu-action/releases)
- [Commits](https://github.com/docker/setup-qemu-action/compare/v2.2.0...v3.1.0)

---
updated-dependencies:
- dependency-name: docker/setup-qemu-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-07-04 05:56:42 +00:00
4881789970 Merge branch 'refs/heads/cuttingedge' 2024-06-29 22:50:07 +02:00
be1e6fe988 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-29 22:49:56 +02:00
f61e51e506 Fix crash when moving files, now overwrites. 2024-06-29 22:49:39 +02:00
eba511749b
Merge pull request #199 from C9Glax/cuttingedge
Merge cuttingedge to latest.
2024-06-29 19:49:06 +02:00
de4c57a0cd Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-29 19:37:09 +02:00
e368c3c98a Fix https://github.com/C9Glax/tranga/issues/193
Mangaworld Volume and Chapter number Parsing.
2024-06-29 19:37:02 +02:00
d17ca1d97a
Merge pull request #197 from C9Glax/master
Merge Github Actions
2024-06-29 19:22:59 +02:00
e9376e3782
Merge pull request #196 from C9Glax/master
Merge Github Actions
2024-06-29 19:21:41 +02:00
7c217a7e33 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-29 19:20:16 +02:00
a437fcbca1 Possible fix https://github.com/C9Glax/tranga/issues/185
Mangaworld publication id had invalid path characters.
2024-06-29 19:20:04 +02:00
1dcfecd66f Create CoverImageCache when saving coverimages. 2024-06-29 19:14:37 +02:00
6db4646336 Move/rename archives if volume number gets updated. 2024-06-29 19:11:18 +02:00
8a6298e3fd
Merge pull request #157 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.3.0
Bump docker/setup-buildx-action from 3.1.0 to 3.3.0
2024-06-27 00:08:31 +02:00
194705c124
Merge pull request #194 from C9Glax/dependabot/github_actions/docker/build-push-action-6.1.0
Bump docker/build-push-action from 5.3.0 to 6.1.0
2024-06-27 00:06:28 +02:00
dependabot[bot]
f4d5969003
Bump docker/build-push-action from 5.3.0 to 6.1.0
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.3.0 to 6.1.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v5.3.0...v6.1.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-06-24 05:58:28 +00:00
9d92069a4b #187 NTFY JsonConverter 2024-06-15 21:39:53 +02:00
5614729eab #187 Server v1 NTFY username password 2024-06-15 21:33:42 +02:00
d52ec8d36f NTFY username and password usage instead of auth. 2024-06-15 21:24:28 +02:00
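
"Username and password usage instead of auth" presumably means building an HTTP Basic Authorization header from the credentials instead of storing a pre-built auth string. A rough sketch of what such an Ntfy push could look like with `HttpClient`; the endpoint/topic parameters and the use of ntfy's `Title` header are assumptions, not Tranga's actual connector code:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class NtfyExample
{
    public static async Task SendAsync(string endpoint, string topic,
                                       string username, string password,
                                       string title, string message)
    {
        using HttpClient client = new();
        // Basic auth header built from user:password at send time.
        string credentials = Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}"));
        using HttpRequestMessage request = new(HttpMethod.Post, $"{endpoint}/{topic}")
        {
            Content = new StringContent(message)
        };
        request.Headers.TryAddWithoutValidation("Authorization", $"Basic {credentials}");
        request.Headers.TryAddWithoutValidation("Title", title); // ntfy reads the notification title from this header
        await client.SendAsync(request);
    }
}
```
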
37dfb4df02 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-02 01:05:20 +02:00
42feea3ad5 Fix covers returning wrong fileLocation if cover already exists. 2024-06-02 01:05:08 +02:00
bbc750d731 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-02 00:23:23 +02:00
08dd01942f #183 Fix NTFY not exporting topic to notificationConnectors.json 2024-06-02 00:23:16 +02:00
351144e763 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-02 00:09:18 +02:00
aea4c0c61b Add GlaxArguments to fetch Runtime-Args 2024-06-02 00:09:03 +02:00
7b9e935db7 Commented optional second level only domains for cover-image-names 2024-06-01 22:10:09 +02:00
048b165d76 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-06-01 22:09:18 +02:00
ebe3012c69 NTFY check endpoint URI and add optional custom topic #183 2024-06-01 22:09:08 +02:00
a5dbed9525 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-05-26 23:04:27 +02:00
811ddd903f fix missing minus-sign from domain names in coverimages 2024-05-26 23:04:16 +02:00
f948809bcd Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-05-26 22:51:59 +02:00
7ceb9cd4cb #182 Changed filename from the remote filename to the format server-internalId.fileFormat 2024-05-26 22:51:46 +02:00
57f1e037ef Corrected check for if cover exists 2024-05-26 22:45:39 +02:00
6ca8d58e43 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-05-26 18:46:58 +02:00
e3211b95e2 #182 Remove covers that have no associated Manga 2024-05-26 18:46:40 +02:00
b5e9e03f64 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-05-26 18:34:57 +02:00
98bd8a983b Possible Fix #182 2024-05-26 18:34:45 +02:00
f4996659ef Fix loading file results in "null"-job and crashes. 2024-05-26 18:23:16 +02:00
e05684d5d1 Fix loading file results in "null"-job and crashes. 2024-05-26 18:22:51 +02:00
4a7d23c0d9 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-05-26 18:10:45 +02:00
1d44b6d9c6 Log added Jobs during Startup 2024-05-26 18:10:29 +02:00
811a183af2 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-04-27 19:09:22 +02:00
fb0755eb89 Use NeedlemanWunsch for string comparison on Mangasee.cs
Resolves #132
#167
2024-04-27 19:09:12 +02:00
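
The commit above swaps the string comparison on Mangasee for Needleman-Wunsch alignment. A compact sketch of the classic scoring recurrence, with assumed match/mismatch/gap weights (Tranga's actual parameters and normalisation are not shown here):

```csharp
using System;

public static class NeedlemanWunsch
{
    // Global alignment score between two strings; higher means more similar.
    public static int Score(string a, string b, int match = 1, int mismatch = -1, int gap = -1)
    {
        int[,] dp = new int[a.Length + 1, b.Length + 1];
        for (int i = 1; i <= a.Length; i++) dp[i, 0] = i * gap; // aligning against an empty string costs gaps
        for (int j = 1; j <= b.Length; j++) dp[0, j] = j * gap;

        for (int i = 1; i <= a.Length; i++)
        for (int j = 1; j <= b.Length; j++)
        {
            int diag = dp[i - 1, j - 1] + (a[i - 1] == b[j - 1] ? match : mismatch);
            dp[i, j] = Math.Max(diag, Math.Max(dp[i - 1, j] + gap, dp[i, j - 1] + gap));
        }
        return dp[a.Length, b.Length];
    }
}
```
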
2e8b896f3b Fix #178 wrong check on parsing variable aprilfoolsmode 2024-04-27 17:53:08 +02:00
4692cc297a Fix MangaDex linksNode is null 2024-04-26 00:48:55 +02:00
3d855020eb Export job files indented. 2024-04-25 21:32:48 +02:00
c6d0168d2f Fix #174 auth not being written to file for ntfy. 2024-04-25 21:29:05 +02:00
d52213002e Delete old jobfiles. 2024-04-25 21:24:29 +02:00
ec9290f41f Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge
# Conflicts:
#	Tranga/Jobs/UpdateMetadata.cs
2024-04-25 21:10:42 +02:00
6b91796e5a Update manga in DownloadNewChapters Jobs 2024-04-25 21:10:26 +02:00
9f9ea569d5 fix bug Manga.WithMetadata coverfilenameincache not being replaced. 2024-04-25 21:03:57 +02:00
4bd1150a0e fix bug Manga.WithMetadata coverfilenameincache not being replaced. 2024-04-25 21:03:44 +02:00
8b62e2c467 Possible fix #175 Export jobs when Manga-Metadata is updated. 2024-04-25 20:57:59 +02:00
7ec262a2e4 Possible fix #175 Export jobs when Manga-Metadata is updated. 2024-04-25 20:57:46 +02:00
d32d5976ee Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-04-25 20:46:32 +02:00
58cff6513a Possible fix #175 2024-04-25 20:46:26 +02:00
783f229a6a Add LibraryConnector.Test to see if requests can be made to endpoint. 2024-04-23 00:58:33 +02:00
aaf06da8e1 Merge branch 'refs/heads/cuttingedge-merge-ServerV2' into cuttingedge 2024-04-23 00:20:50 +02:00
51a26a3cba Fix https://github.com/C9Glax/tranga/issues/143
ImageCache could never find files, because they were not in the expected location.
2024-04-23 00:20:34 +02:00
762da4c859 Make cachedPublications private with getter-setter 2024-04-22 22:43:42 +02:00
daba940b45 Make cachePublications a dictionary with internalId as key. 2024-04-22 22:38:23 +02:00
79e61a62c7 Export Jobfiles after execution, update metadata in jobfiles 2024-04-22 22:29:22 +02:00
06fe98323a Fix crashing when comparing old Manga (missing websiteUrl) 2024-04-22 22:09:43 +02:00
5f820c53f5 Update websiteUrl on metadata-refresh https://github.com/C9Glax/tranga-website/issues/60 2024-04-22 22:03:09 +02:00
c69f1f6569 Addresses #170 Manganato authors and genres include "\r\n" 2024-04-22 04:45:49 +02:00
5bdbd9e2e4 Hack to resolve #60 Website-URL.
Field will have same name, just acquisition will be better.
2024-04-22 02:25:39 +02:00
f729c44f88 Merge branch 'refs/heads/master' into cuttingedge 2024-04-20 18:49:19 +02:00
f4966b0348 Docker Image build 2024-04-20 18:48:51 +02:00
df2fc4a036 Remove README CLI reference 2024-04-20 18:39:49 +02:00
0ab2ae03ce unionby instead of concat 2024-04-19 03:07:46 +02:00
95236daf41 Check if tags and authors are the same on Manga equals.
UpdateManga performs union/concat operation on alttitles, tags and authors
2024-04-19 03:00:31 +02:00
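
The two commits above ("unionby instead of concat" and the union/concat merge of alt-titles, tags and authors) rely on `Enumerable.UnionBy` (available since .NET 6) deduplicating by key while keeping entries from both sides. A small sketch under assumed types; the `Author` record and key choice are illustrative, not the real Manga model:

```csharp
using System.Collections.Generic;
using System.Linq;

public record Author(string Name);

public static class MangaMerge
{
    // Keeps every existing author and adds fetched ones not already present by name.
    public static List<Author> MergeAuthors(IEnumerable<Author> existing, IEnumerable<Author> fetched) =>
        existing.UnionBy(fetched, a => a.Name).ToList();
}
```
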
294ce01bc3 Set Manga.releaseStatus to new releaseStatus.
Fix #119
2024-04-19 02:37:17 +02:00
13565d1c7a Fixes #166 MangaDex crash on UpdateMetadata, needed to include cover_art in request 2024-04-19 02:21:20 +02:00
54b24ac37f Merge remote-tracking branch 'refs/remotes/db-2001/cuttingedge' into cuttingedge 2024-04-19 00:10:14 +02:00
c67e89f1dd null checks 2024-04-19 00:07:34 +02:00
Dity
4ba44d3ac3
Merge branch 'C9Glax:cuttingedge' into cuttingedge 2024-04-18 18:04:07 -04:00
8631cf6376
Merge pull request #161 from C9Glax/MangaDexRequestLimitChange
MangaDex request limit change
2024-04-18 23:54:44 +02:00
df4d547e2b Fix crash with old settings files 2024-04-18 23:52:52 +02:00
db-2001
006b71b496 Merge remote-tracking branch 'upstream/cuttingedge' into cuttingedge 2024-04-18 17:48:43 -04:00
5f03b0d89c Closes #154 2024-04-18 23:05:04 +02:00
6dc1ea0030 Merge branch 'refs/heads/master' into cuttingedge 2024-04-18 22:52:51 +02:00
ff08754610 Bump docker/setup-buildx-action@v3.3.0
Bump docker/build-push-action@v5.3.0
2024-04-18 22:52:38 +02:00
d1a6c0ad3d Set Chromium Start Timeout to 30 seconds.
Resolves #135 ?
2024-04-18 22:13:10 +02:00
0260868968
Merge pull request #163 from C9Glax/cuttingedge
Connector Bugs, AprilFools Mode
2024-04-18 21:29:40 +02:00
b1f72dcb81 Legacy RateLimit remove 2024-04-18 19:00:28 +02:00
b0f353819b Legacy RateLimit 2024-04-18 18:58:42 +02:00
8f8d019861 Streamlined MangaDex information retrieval 2024-04-18 18:56:34 +02:00
21a7392493 Resolves #160, Rated Manga on Mangadex. 2024-04-18 18:01:02 +02:00
db-2001
0d5db15f87 Merge remote-tracking branch 'upstream/cuttingedge' into cuttingedge 2024-04-16 21:51:58 -04:00
431fde0d76 Wrong April Fools check.
Resolves https://github.com/C9Glax/tranga/issues/159
2024-04-16 04:18:56 +02:00
e022bf3081 Merge branch 'cuttingedge' into dev 2024-04-15 15:02:52 +02:00
c25a4f69ec Cleanup 2024-04-15 14:51:01 +02:00
82bdb248b9 userAgent private set in settings 2024-04-15 14:50:44 +02:00
b27114eaad April Fools Mode
https://github.com/C9Glax/tranga/issues/155
2024-04-15 14:50:03 +02:00
Dity
051eb4a417
Merge pull request #158 from db-2001/cuttingedge
Reimplement Fix for Mangasee
2024-04-14 14:35:06 -04:00
db-2001
482704af2c Merge remote-tracking branch 'upstream/cuttingedge' into cuttingedge 2024-04-14 14:29:30 -04:00
dependabot[bot]
af4229920d
Bump docker/setup-buildx-action from 3.1.0 to 3.3.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 3.1.0 to 3.3.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v3.1.0...v3.3.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-04-09 05:32:25 +00:00
537ad3a5f8 https://github.com/C9Glax/tranga/issues/142
Cleanup old temporary Folders and files
2024-04-01 20:35:47 +02:00
6a8697fc3a Manga4Life fix bug that made it impossible for Manga to be loaded if they did not have a "Load more Chapters" button.
https://github.com/C9Glax/tranga/issues/149
Created a check if the button exists before trying to click it.
2024-04-01 20:12:25 +02:00
94582496ef Mangadex do not try downloading externally linked chapters, or chapters that have no pages.
https://github.com/C9Glax/tranga/issues/153
2024-04-01 20:00:02 +02:00
17ef5eae0f Fix MangaDex request for new Chapter. 2024-03-30 21:53:11 +01:00
db-2001
d5b6d4e8ee Fixes for https://github.com/C9Glax/tranga/issues/138 and bug fix for MDex 2024-03-29 23:59:16 -04:00
db-2001
05190bc9e2 Holy moly a fix for Mangasee 2024-03-26 18:16:41 -04:00
db-2001
d211dd2d01 Added check to prevent creation of empty chapter files 2024-03-18 22:32:26 -04:00
590547e407 Add Logline to print current logfilePath. 2024-03-05 02:55:10 +01:00
2ad04c5c46 Change LogFilePath to LogFolderPath
#139
2024-03-05 02:35:47 +01:00
189569ccdf dev image 2024-02-28 20:38:22 +01:00
2872eeea09
Merge pull request #134 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-3.1.0
Bump docker/setup-buildx-action from 2.10.0 to 3.1.0
2024-02-28 07:03:31 +01:00
dependabot[bot]
c0cfeaa35d
Bump docker/setup-buildx-action from 2.10.0 to 3.1.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.10.0 to 3.1.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v2.10.0...v3.1.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-02-28 06:02:59 +00:00
2fd780996c Dockerfile maddnesssss 2024-02-28 04:03:53 +01:00
b390bb8ea5 LogFilePath 2024-02-28 03:59:09 +01:00
847829e617 Corrected DockerFile Arguments 2024-02-28 03:56:24 +01:00
0f29da00de
Merge pull request #122 from C9Glax/tranga-website-41
Website Changes
2024-02-28 03:22:42 +01:00
9b2a6de841
Merge pull request #133 from C9Glax/cuttingedge
RateLimits, FileNames, Volume/Chapter Numbers
2024-02-28 02:49:48 +01:00
17a27c9922 Reset RequestLimits 2024-02-28 02:33:43 +01:00
6c9071b22b Reset UserAgent 2024-02-28 02:32:36 +01:00
abfe42b7c1 Reset UserAgent when Empty 2024-02-28 02:25:46 +01:00
72ae124418 Handle unauthorized kavita 2024-02-28 02:25:17 +01:00
bee6e7ba37 Export settings after updating rateLimits 2024-02-28 02:23:58 +01:00
8079ffc742 GlobalBase static is FileInUse 2024-02-28 02:17:48 +01:00
6d6e33491b Indented Json 2024-02-28 02:15:04 +01:00
a8697a14a3 GlobalBase static is FileInUse 2024-02-28 02:14:58 +01:00
e2adac937a Fix settings not being loaded from settingsfile 2024-02-28 02:13:18 +01:00
b4708c5d10 Encoding 850 issue for jsonconvert 2024-02-28 02:12:23 +01:00
597abde115 Fix wrong chapter (and volume) numbers for chapters 2024-02-27 22:04:14 +01:00
2a824bbb8d Correct "1" ChapterNumbers for Mangasee 2024-02-12 21:04:14 +01:00
9691eb0d08 Correct ChapterNumbers for Mangasee 2024-02-12 21:02:01 +01:00
4888e18fd2 Correct ChapterNumbers for Mangasee 2024-02-12 20:49:33 +01:00
0aa92a7913 Correct VolumeNumbers for Mangasee 2024-02-12 11:22:19 +01:00
db53e2156b API added POST
NotificationConnectors/Reset
LibraryConnectors/Reset
2024-02-11 20:44:27 +01:00
1cce0f204e API added POST
NotificationConnectors/Test
LibraryConnectors/Test
2024-02-11 20:41:55 +01:00
ce41c49a0e Merge branch 'master' into tranga-website-41 2024-02-11 01:11:41 +01:00
b8570e5eef Merge branch 'master' into cuttingedge 2024-02-11 01:11:34 +01:00
1f24a2349d Do not build latest/master on pull 2024-02-11 01:11:23 +01:00
ca95460218 https://github.com/C9Glax/tranga/pull/122
https://github.com/C9Glax/tranga-website/pull/41
LogFile
Enable LogFiles
2024-02-11 01:06:40 +01:00
e801cc4cbf #122 RateLimit GET
https://github.com/C9Glax/tranga-website/pull/41
2024-02-11 00:49:26 +01:00
2c4c8de8b5 Remove StyleSheet from TrangaSettings 2024-02-11 00:39:21 +01:00
0b4461265c #109 Rate Limits
Moved Config for RateLimits to TrangaSettings
Updated API: Settings/customRequestLimit
requestType in RequestType.cs
requestsPerMinute as int
2024-02-11 00:35:33 +01:00
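
The rate-limit rework above (per-`RequestType` limits configured as requests per minute in TrangaSettings, adjustable via `Settings/customRequestLimit`) boils down to spacing requests of each type. A simplified sketch with made-up `RequestType` values and a blocking wait, not the strategy used in the real code:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public enum RequestType { Default, MangaDexFeed, MangaDexImage, MangaCover } // illustrative values

public class RateLimiter
{
    private readonly Dictionary<RequestType, int> _requestsPerMinute;        // assumed > 0 per type
    private readonly Dictionary<RequestType, DateTime> _lastRequest = new();

    public RateLimiter(Dictionary<RequestType, int> requestsPerMinute)
        => _requestsPerMinute = requestsPerMinute;

    // Blocks until the next request of this type is allowed.
    public void WaitForSlot(RequestType type)
    {
        TimeSpan minInterval = TimeSpan.FromSeconds(60.0 / _requestsPerMinute[type]);
        if (_lastRequest.TryGetValue(type, out DateTime last))
        {
            TimeSpan wait = last + minInterval - DateTime.UtcNow;
            if (wait > TimeSpan.Zero)
                Thread.Sleep(wait);
        }
        _lastRequest[type] = DateTime.UtcNow;
    }
}
```
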
c008d55f26 #103 Regeeeeex 2024-02-08 11:05:44 +01:00
9b990aecea With a passion 2024-02-07 19:40:07 +01:00
299fa6afda I hate Regex 2024-02-07 19:37:35 +01:00
c03e927565 Fix Mangaworld #103 Plurals 2024-02-07 19:23:55 +01:00
bb6c553afa One more Regex... 2024-02-07 19:05:11 +01:00
33d78ed757 https://github.com/C9Glax/tranga/issues/111#issuecomment-1932447848 2024-02-07 18:18:33 +01:00
84272ddd1e https://github.com/C9Glax/tranga/issues/111#issuecomment-1932447848 2024-02-07 18:08:57 +01:00
2f0fbbd3cb #111 Fix renaming of chapters.
Fixed check if Chapter exists
2024-02-07 15:50:26 +01:00
5bc414fd59 #113 old formatting of fileNames 2024-02-07 15:34:20 +01:00
2eaeadb92c #113 whitespaces 2024-02-07 15:29:42 +01:00
d8df6eccb1 Mangasee fix cloudflare 520 2024-02-07 14:53:57 +01:00
db64b717eb Fix regex for parsing publicationId 2024-02-02 19:38:16 +01:00
1afe36a525 add todo 2024-02-02 18:46:09 +01:00
aa692f6978 #108 2024-02-02 18:45:12 +01:00
c706824222
Merge pull request #110 from C9Glax/cuttingedge
Update Master
2024-01-31 19:14:41 +01:00
3ca6245fc2 save Useragent as string and export settings after changing 2024-01-31 19:00:38 +01:00
2dd82aad13 https://datatracker.ietf.org/doc/html/rfc2616 2024-01-31 18:46:37 +01:00
3c4867a276 #105 2024-01-31 18:39:34 +01:00
bae157cdb4 Cleanup #90 2024-01-31 18:39:34 +01:00
3b818ff1af typo 2024-01-31 18:39:34 +01:00
5d12be2983 Fix crash when Request times out on ChromiumDownloadClient 2024-01-31 18:39:34 +01:00
31a4e693e0 Custom Request Limits #109 2024-01-31 18:39:34 +01:00
e49db9a4cb Change toplevel domain #103 2024-01-25 16:40:04 +01:00
54142e61fe Fix #103 2024-01-20 17:20:56 +01:00
cd5ca0e302 Fix #90 2024-01-20 16:44:22 +01:00
95da900213 Add url to Request-Error Output 2024-01-20 16:33:47 +01:00
b5be4e0dd8 Fixes #97 missing jobs.
Implemented Equals(obj) functions for Chapter, DownloadChapter and DownloadNewChapters to check if jobs already exist.
2024-01-11 20:19:04 +01:00
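
The duplicate-job fix above relies on value equality: two jobs count as "the same" when they describe the same work. A minimal sketch of such an `Equals`/`GetHashCode` override for a download job; the field names are illustrative, not the real `DownloadChapter` class:

```csharp
public class DownloadChapterJob
{
    public string ConnectorName { get; init; } = "";
    public string MangaInternalId { get; init; } = "";
    public string ChapterNumber { get; init; } = "";

    // Two jobs are duplicates when connector, manga and chapter all match.
    public override bool Equals(object? obj) =>
        obj is DownloadChapterJob other &&
        ConnectorName == other.ConnectorName &&
        MangaInternalId == other.MangaInternalId &&
        ChapterNumber == other.ChapterNumber;

    public override int GetHashCode() =>
        System.HashCode.Combine(ConnectorName, MangaInternalId, ChapterNumber);
}
```
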
0c135aa89e Fixes #97 because stupid 2024-01-06 17:12:36 +01:00
e11ee4dafe Fixes #98 VolumeNumber can not be null for comparison 2024-01-04 17:04:08 +01:00
05573f65f9 #96 Added single click to load all chapters. 2024-01-03 18:37:29 +01:00
d986c808e3 Chapter as Comparable 2024-01-03 18:37:12 +01:00
5df63b00c2 Moved Struct RequestResult to own file 2024-01-03 17:31:00 +01:00
903bb5af5e Resolves #97 Manga4Life Volume Numbers 2024-01-03 17:05:33 +01:00
cc8453d4a8 #85 included characters with accents, umlauts, and + 2023-12-24 16:52:24 +01:00
800d4c1ec1 Amend 29f6de2590
Fix #87, manga that return no chapters, crash when updating latest released chapter.
2023-12-24 16:43:49 +01:00
b4f97eefcf Fix comparisons 2023-12-24 16:34:54 +01:00
29f6de2590 Catch parsing error #93 to prevent crashes and restart loops 2023-12-24 16:27:20 +01:00
23e5c4a7b1 Fix #93 2023-12-24 16:20:06 +01:00
e15717cb04
Merge pull request #84 from arxae/mangakatana_input_string_not_correct_format
Fixed input string not being in correct format
2023-11-13 11:54:02 +01:00
Andy Maesen
b995fc568a Requested changes 2023-11-13 06:49:20 +01:00
442d949371 Fix #80 UpdateMetaData failing 2023-11-12 13:03:33 +01:00
263d0e6036 Fix #82 Tranga crashes when cover is missing from imageCache.
Retrying download of cover and copy
2023-11-12 12:39:32 +01:00
Andy Maesen
7c7d43021e Fixed input string not being in correct format 2023-11-12 05:38:06 +01:00
5cdc7d7207 Fix wrong jobtype 2023-11-05 16:14:23 +01:00
1bcbd1517f Addresses #81 2023-11-05 16:14:12 +01:00
b72da45ae9 Add GetMangaFromId for MangaWorld 2023-11-02 15:58:16 +01:00
01041e43ac Fix publicationId for MangaWorld 2023-11-02 15:58:04 +01:00
4c1a659f16 Add API: POST Jobs/UpdateMetadata 2023-11-02 15:48:46 +01:00
2e02f0b237 Exception message. 2023-11-02 15:48:31 +01:00
77f93d87f9 UpdateMetadata now finishes correctly. 2023-11-02 15:48:17 +01:00
45c0f19a9d Added override Manga.Equals 2023-11-02 15:48:03 +01:00
7c09deb143 Remove Manga.WebsiteUrl 2023-11-02 15:47:43 +01:00
449d406eab Add MangaConnector.GetMangaFromId 2023-11-02 15:47:16 +01:00
083ce238d8 Add UpdateMetadata Job to DownloadNewChapters 2023-11-02 15:20:34 +01:00
5f9ffb8aad Improved UpdateMetadata 2023-11-02 15:20:20 +01:00
92bc3d5aa8 Catch HttpRequestException in LibraryConnector 2023-11-02 15:19:56 +01:00
49ab8928b1 Add parameter JobBoss to Job.ExecuteTask (and Internal) 2023-11-02 15:19:36 +01:00
391efcb9bc Add Field jobType to Job 2023-11-02 15:18:41 +01:00
963ad375e8 Add Job UpdateMetadata --> untested! 2023-11-01 14:17:11 +01:00
0a5ded2036 Add field WebsiteUrl to Manga 2023-11-01 14:15:55 +01:00
4843c7f05c Overwrite SeriesInfo.json parameter in SaveSeriesInfoJson. 2023-11-01 14:04:35 +01:00
6adbda2359 #77 Added field releaseStatus to Manga 2023-11-01 13:59:21 +01:00
425cf7e0d6 Re-add forgotten seriesInfo.json to new downloads 2023-11-01 13:36:58 +01:00
8f5dd5aab5 #78 Manganato chapternumber parsing from url 2023-11-01 13:22:33 +01:00
733ae285f1 #76 debug 2023-10-31 16:46:41 +01:00
2e1c8ce34f #75 Reimplemented own search.
At the moment returns too many results, Levenshtein distance still too inefficient.
2023-10-31 15:47:39 +01:00
c965bc38d1 https://github.com/C9Glax/tranga-website/issues/19
Wrong regex for URLs with ports
2023-10-30 19:30:51 +01:00
37266ea095 https://github.com/C9Glax/tranga-website/issues/19
Add exception handling if host doesn't exist
2023-10-30 13:48:25 +01:00
8caac538c9 https://github.com/C9Glax/tranga-website/issues/19 Send a badrequest response if not a valid libraryconnector 2023-10-30 13:39:50 +01:00
7c7f711bb4 https://github.com/C9Glax/tranga-website/pull/17 2023-10-28 12:47:13 +02:00
d78897eb74 #74 untested 2023-10-27 14:09:34 +02:00
438c11af4f #73 api side, untested 2023-10-27 13:47:37 +02:00
38df54baff Exception handling on request failed HttpDownloadClient 2023-10-25 18:22:00 +02:00
98d187d133 Possible fix #72
Volume Numbers broke Regex
Now can also parse volume numbers!
2023-10-25 18:16:26 +02:00
5352cca058 Possible fix for #72
RegexMatching was off for last element sometimes on bato
2023-10-23 17:01:26 +02:00
3381909afd Fix #72 Chapternumber Parsing Bato 2023-10-21 15:44:37 +02:00
7219641859 #68 Because XML is sometimes broken, we parse from somewhere else
Also fixed the faulty url completion.
2023-10-20 15:01:55 +02:00
f63851d95d #68 JsonConverter 2023-10-20 14:50:26 +02:00
e72301d062 #68 and other chromium connectors: Wait for page to be fully loaded 2023-10-20 14:49:48 +02:00
2302e1009b Merge branch 'issue_70' into cuttingedge 2023-10-20 14:40:37 +02:00
40fea6cc7f Fix #70 invalid chapter numbers 2023-10-20 14:40:24 +02:00
5458c43f21 Merge branch 'timeout-bug' into cuttingedge 2023-10-19 13:00:03 +02:00
f78bec43d6 Fix an issue where a request-timeout would cause a restartloop. 2023-10-19 12:59:20 +02:00
88876fb8f4 #68 corrected url in GetChapters 2023-10-19 12:09:43 +02:00
c71aec8882 #68 Readme and Name 2023-10-19 12:08:49 +02:00
ddfba0d864 #68 MangaLife untested code, XML on site is broken 2023-10-19 12:06:03 +02:00
ca9c0b22c1 Merge pull request '#67 prevent crash if xml document does not exist' (!60) from cuttingedge into master
Reviewed-on: #60
2023-10-15 12:21:31 +02:00
6844d0a242 #67 prevent crash if xml document does not exist 2023-10-15 12:19:44 +02:00
fd9319de27 Merge pull request 'Fix #66 Mangasee search and parsing' (!59) from cuttingedge into master
Reviewed-on: #59
2023-10-14 13:07:46 +02:00
726be70af3 #66 Mangasee search sanitization 2023-10-14 12:59:35 +02:00
19c9ecb3e7 #66 Mangasee empty search breaks 2023-10-14 12:59:06 +02:00
f01a786e59 Merge pull request 'cuttingedge' (!58) from cuttingedge into master
Reviewed-on: #58
2023-10-12 20:48:15 +02:00
59f9bcc7d0 Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge 2023-10-12 20:47:25 +02:00
2796a2adb5 Merge branch 'master' into cuttingedge 2023-10-12 20:47:16 +02:00
e07b191293 Merge branch 'master' into cuttingedge 2023-10-12 20:46:43 +02:00
9bf650f5fc New Issue Template: New Connector 2023-10-12 20:45:56 +02:00
334795b263 Update readme to reflect new connectors 2023-10-10 22:58:05 +02:00
51a6f216af Remove extraneous covers from imageCache. 2023-10-10 22:51:24 +02:00
238a2775f4 Author formatting bato 2023-10-10 22:45:11 +02:00
fec970d7d6 #64 fix empty search 2023-10-10 22:43:34 +02:00
e642d50c47 #64 Bato
Comment: This website suuuucks to scrape. There is gonna be so many issues
2023-10-10 22:40:44 +02:00
fafcdac00a Fix file-extension on image download 2023-10-10 22:40:07 +02:00
1785aa28ea Change coverCacheFilenames to avoid conflicts and malformed filenames 2023-10-10 22:34:47 +02:00
f22c332cab Merge pull request 'cuttingedge' (!57) from cuttingedge into master
Reviewed-on: #57
2023-10-10 21:21:34 +02:00
b3bf523e1e Fix #63 Chapter numbering. 2023-10-09 15:28:37 +02:00
06b2e11164 Add Mangaworld to dict. 2023-10-09 15:15:42 +02:00
7972f07801 housekeeping 2023-10-04 22:09:33 +02:00
d89af7cc5b Fix multiple enumeration 2023-10-04 22:09:27 +02:00
31a0c6ffb2 Fix build warnings 2023-10-04 18:14:46 +02:00
668a3b3a96 MangaDex nullchecking in response 2023-10-04 18:14:12 +02:00
3938c61297 #62 https://github.com/C9Glax/tranga/issues/62#issuecomment-1747064431
Parsing, parsing, parsing
2023-10-04 17:45:13 +02:00
4f3bcd245d #62 fix one bug, create another 2023-10-04 15:44:06 +02:00
129c95f123 Set timeout on chromiumclient
#62
2023-10-04 11:20:14 +02:00
e2cdf27d40 https://github.com/C9Glax/tranga/issues/62#issuecomment-1746422154
#62
ChapterNumber Parsing on Manganato
2023-10-04 11:15:24 +02:00
4156365b18 Improved logic on QueueContainsJob and AddJobToQueue
Added some documentation
2023-10-04 09:38:40 +02:00
d3ccddd8db Fix multiple enumeration 2023-10-04 09:33:11 +02:00
13075a8704 Improved logic in LoadJobsList 2023-10-04 09:31:03 +02:00
e7d9f53a93 Prevent override of List-jobs in AddJobsQueue-method 2023-10-04 09:30:42 +02:00
dc6dfd4aa1 Renamed method ExportJob(s) to UpdateJobFiles 2023-10-04 09:30:08 +02:00
0fba09b1e8 Logic removed unnecessary call 2023-10-04 09:24:21 +02:00
f08b9e85ec Add log message for inactive jobs 2023-10-03 20:46:59 +02:00
95fcc73c74 Cancel Running Jobs if inactive for more than 5 minutes 2023-10-03 20:46:21 +02:00
73492d8102 #62 even more debug logging 2023-10-03 20:38:45 +02:00
c69dd22ecf #62 more debug-logging
Instead of assigning buffer, copy directly from result to filestream
2023-10-03 14:07:58 +02:00
17b6c523a2 Print results before downloading covers 2023-09-28 15:53:57 +02:00
6c3f7604fe Better Mangasee search 2023-09-28 15:53:40 +02:00
94f88f08e9
Update bug_report.yml
WHAT
2023-09-26 18:50:56 +02:00
47327524be body can not be empty? 2023-09-26 18:47:02 +02:00
3b96419739 will this work 2023-09-26 18:39:12 +02:00
b7c9b4e9b4
Update issue templates 2023-09-26 18:37:59 +02:00
13adb45444 File.extensions.matter 2023-09-26 18:32:44 +02:00
b8fbee578e Update readme 2023-09-26 18:30:52 +02:00
c1fb42b537 Update docker compose to latest 2023-09-26 18:29:49 +02:00
dcc12ec3ea Merge remote-tracking branch 'github/master' 2023-09-26 18:28:23 +02:00
8c554076b2 Merge branch 'cuttingedge' 2023-09-26 18:28:15 +02:00
a10fbdf3a5
Merge pull request #59 from C9Glax/C9Glax-patch-1
Update issue templates
2023-09-26 18:27:38 +02:00
f246209685 Changed to template 2023-09-26 18:26:42 +02:00
41c561bd1d
Update issue templates 2023-09-26 18:18:06 +02:00
fc7d5463c3 Fix #58
Mangaworld: Manga without volumes crash
2023-09-26 18:03:18 +02:00
3c2ce266f6 Changed (fixed?) queuelogic 2023-09-20 21:59:39 +02:00
306cb87d67 Fix Check for subjobs 2023-09-20 21:34:04 +02:00
23cda74487 Fix wrong domain regex 2023-09-20 21:33:53 +02:00
3ceee63dfc Only send notification on successful downloads 2023-09-20 14:40:03 +02:00
4e5a6fe97b Export Library and notification connectors on deletion
Added logging
2023-09-20 14:11:31 +02:00
b3b1971dad Startup notification 2023-09-20 13:58:10 +02:00
2699f35b62 housekeeping 2023-09-20 13:33:13 +02:00
7a14583d6a Moved Regex for baseUrl to Globalbase 2023-09-20 13:30:52 +02:00
660f6a1648 Logmessages for creation of library and notification Connector 2023-09-20 13:28:09 +02:00
482fcb7102 better logging for removing files 2023-09-19 23:24:39 +02:00
b6cdb07e3f Remove filewrites 2023-09-19 23:15:18 +02:00
0875e7ee12 Remove log clutter and filewrites 2023-09-19 23:07:26 +02:00
cb6482ebae Add logmessage on startup for next job 2023-09-19 20:04:25 +02:00
87ea077281 Remove log clutter and filewrites 2023-09-19 20:02:56 +02:00
c1aa4cf6b5 Fix bug with exportjobslist not exporting updated jobs 2023-09-19 19:59:51 +02:00
f5b6b1785f small improvements 2023-09-19 19:57:35 +02:00
2553a150d1 Add log to see wait time 2023-09-19 19:54:26 +02:00
b149d377dc Add log to see wait time 2023-09-19 19:54:00 +02:00
0209159c5c Add log to see wait time 2023-09-19 19:50:39 +02:00
e31820eb00 Export Jobs list when finished. 2023-09-19 19:49:42 +02:00
c4d69c27a4 copy cover 2023-09-19 19:43:58 +02:00
3ee53b7436 copy cover 2023-09-19 19:43:39 +02:00
64ec0963e1 copy cover 2023-09-19 19:42:50 +02:00
27c4ed719c Cancel failed jobs 2023-09-19 19:33:43 +02:00
4f4b0cb3a8 LibraryConnector baseUrl regex 2023-09-19 19:22:49 +02:00
48d312da0b File Permissions 2023-09-19 19:21:37 +02:00
1fe4b75ac7 Folder permissions 2023-09-19 19:04:55 +02:00
c580fafc62 Added user tranga to container and set permissions 2023-09-19 19:00:00 +02:00
58040ecb10 Order of returned API Jobs/MonitorJobs And Jobs/Waiting 2023-09-19 18:06:08 +02:00
2960a9b8f0 Merge branch 'cuttingedge'
# Conflicts:
#	Tranga/Connectors/Mangasee.cs
2023-09-19 16:59:58 +02:00
f52bb8eb89 Get Readme ready for migration to master 2023-09-19 16:54:17 +02:00
ae0dc548ae Changed working directory on linux to /usr/share/tranga-api
Updated docker-compose to include settings-volume
2023-09-19 16:47:49 +02:00
051b85d08b Added contentType to response for images and logs 2023-09-19 16:43:08 +02:00
d89ca0a2ef Changed Jobs ToString 2023-09-19 16:30:55 +02:00
f1f640c1f6 Mangaworld fix volume and chapter numbers 2023-09-19 16:30:44 +02:00
9319aa7d1f Fix Mangaworld empty search-result crash 2023-09-19 16:24:07 +02:00
656e62628e Fix Mangaworld search 2023-09-19 16:23:52 +02:00
ba27adf255 Show startmessage and log settings 2023-09-19 16:08:00 +02:00
88ca75e883 Use lock statement instead of variable to lock logmessages 2023-09-19 15:59:52 +02:00
67c23b357f Add console-output to Dockerfile 2023-09-14 14:55:45 +02:00
4a5271e2a7 Added Italian tags to series.json 2023-09-13 23:33:12 +02:00
fec5ad664c Fix possible nullreference 2023-09-13 23:02:36 +02:00
3cea5fb431 #50 Added Mangaworld.bz connector 2023-09-13 23:00:52 +02:00
7fa44fba54 Fix filename for coverimage if url contains parameters 2023-09-13 23:00:27 +02:00
d6b5a29fdc Fix Manganato kaguya-bug: volumenumber, chapternumber, chaptername match 2023-09-13 21:47:50 +02:00
a4a49d40f0 API GET LogMessages new optional parameter count 2023-09-13 14:40:23 +02:00
28fa85f05c #50 Added parameter translatedLanguage POST Jobs/DownloadNewChapters
POST Jobs/MonitorManga
2023-09-13 14:20:10 +02:00
1066e1ca2e #50 translated-language support (if connector supports it)
API GET Manga/Chapters new parameter "translatedLanguage"
2023-09-13 14:09:47 +02:00
39307f4313 Changed jobs.json into a directory with one file per job
#48
2023-09-09 19:15:20 +02:00
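
Storing one file per job (instead of a single jobs.json) makes adding and removing jobs a matter of writing or deleting one small file. A hedged sketch of such an export with `System.Text.Json`; the `Job` shape and the id-based file naming are assumptions, not Tranga's actual serialization:

```csharp
using System.IO;
using System.Text.Json;

public record Job(string Id, string JobType, string MangaInternalId); // illustrative shape

public static class JobFiles
{
    // Writes one <jobId>.json file into the jobs folder.
    public static void Export(string jobsFolder, Job job)
    {
        Directory.CreateDirectory(jobsFolder);
        string path = Path.Combine(jobsFolder, $"{job.Id}.json");
        File.WriteAllText(path,
            JsonSerializer.Serialize(job, new JsonSerializerOptions { WriteIndented = true }));
    }
}
```
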
a316ee3d48 Changed id creation for Jobs to be more descriptive 2023-09-09 19:14:47 +02:00
569622099d DownloadClient and MangaConnector improvements
DownloadClient is now abstract for HttpDownloadClient and ChromiumDownloadClient
The chromium client will exit the headless browser (on clean exit of the program).
The field "name" of MangaConnector is no longer abstract, instead set through constructor.
2023-09-08 23:27:09 +02:00
017701867d Fixed logic on API GET Jobs/Progress 2023-09-08 19:58:44 +02:00
c3d62bd337 Added ProgressToken timeRemaining 2023-09-08 19:58:29 +02:00
dc9e9e705c Fix FileLogger filePath 2023-09-08 19:28:44 +02:00
9eee6683fa Add API GET Ping 2023-09-08 16:31:38 +02:00
1265c7a072 Added API: GET Manga 2023-09-05 20:26:31 +02:00
c601541249 Added API: GET LogMessages and LogFile
resolves #10
2023-09-05 20:02:24 +02:00
ae1184320f Added API: customFolderName to Jobs/MonitorManga and Jobs/DownloadNewChapters
resolves #30
2023-09-05 19:51:18 +02:00
384e4c4f43 Added parameter "ignoreBelowChapterNum" tp API: Jobs/MonitorManga and Jobs/DownloadNewChapters 2023-09-05 19:44:14 +02:00
76a2b2498a Added numberFormatDecimalPoint to GlobalBase 2023-09-05 19:42:46 +02:00
2ab21b15cf
Merge pull request #47 from C9Glax/dependabot/github_actions/actions/checkout-4
Bump actions/checkout from 3 to 4
2023-09-05 19:33:20 +02:00
dependabot[bot]
7acdf7a19b
Bump actions/checkout from 3 to 4
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-09-05 17:31:28 +00:00
af8716fcb1 Possible fix for #20 2023-09-05 19:28:43 +02:00
5f2c66b729 Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge 2023-09-03 18:37:03 +02:00
e030f02431 Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge
# Conflicts:
#	README.md
2023-09-03 18:36:57 +02:00
bdeb75f4e4 Merge remote-tracking branch 'origin/cuttingedge' into cuttingedge
# Conflicts:
#	README.md
2023-09-03 18:36:05 +02:00
4ce114986d Updated Readme to reflect the separation of tranga (the api) and the website. 2023-09-03 18:35:51 +02:00
8035bf3fcd Updated Readme to reflect the separation of tranga (the api) and the website. 2023-09-03 18:34:17 +02:00
85bf3ec7e8 Fix MangaSee publicationId 2023-09-02 23:14:31 +02:00
0f17615b10 Fix FileInUse 2023-09-02 23:14:16 +02:00
0c8145803e Possibly related to #20 2023-09-02 22:49:00 +02:00
b2e0c3db97 docker-compose to cuttingedge 2023-09-02 22:43:09 +02:00
ca283fcfff Fix Dockerfile, copy CLI 2023-09-02 22:39:54 +02:00
1d55070daf Merge branch '41_-_trash_everything' into cuttingedge
# Conflicts:
#	Tranga/MangaConnectors/DownloadClient.cs
2023-09-02 22:33:29 +02:00
32fd75bdae Add Manga to cached on parsing 2023-09-02 22:12:49 +02:00
99ad702163 Fixed MangaDex GetMangaFromUrl Regex-Group and resultobject 2023-09-02 22:12:34 +02:00
6e3a9c2a78 Added Lock to MemoryLogger 2023-09-02 21:53:09 +02:00
ad1d4dfe23 Fixed naming errors containing Manga
Added GetMangaFromUrl(url) to Mangaconnector
2023-09-02 21:52:48 +02:00
14ba71005f CheckJobs combined cancelled and completed checks,
added standby check
2023-09-02 16:16:00 +02:00
22c4c0eb2c Fixed GetJobsLike, for empty publication, but existing chapter 2023-09-02 16:15:06 +02:00
44f8d369c3 Added AddJobs to JobBoss 2023-09-02 16:14:36 +02:00
c0e6da144e Changed Job.ExecuteNow to ExecutionEnqueue
Instead of replacing progressToken, change Increments based on completed increments
2023-09-02 16:14:21 +02:00
51a1ae72ca Added parentJobId for deserialization
When creating Jobs with null as recurrence time, set it to zero
Job.NextExecution() removed the recurrence check
2023-09-02 16:12:10 +02:00
79bbc92467 Added lastExecution time on jobs.json parse 2023-09-02 15:05:15 +02:00
ae5be31c89 Fixed Jobs/StartNow 2023-09-02 14:49:31 +02:00
eebe25a378 Added check if jobQueue is empty 2023-09-02 14:46:38 +02:00
0f3da4ec81 Added check to read/write jobs.json if file is in use
Write jobs.json on change
2023-09-02 14:46:13 +02:00
0b77dc1172 Added ProgressToken state Cancelled 2023-09-02 14:45:46 +02:00
37cf47bc17 Reduced CheckJobs timer to 100ms 2023-09-02 14:45:02 +02:00
4cce2e04cb Renamed Job.Reset to ResetProgress 2023-09-02 14:13:30 +02:00
5465ac4e5c Removed DELETE Jobs/DownloadChapter and Jobs/MonitorManga. Can both be reached with DELETE Jobs (jobId)
Added POST Jobs/Cancel
CancelJob and RemoveJob cancels/removes subJobs
2023-09-02 14:13:15 +02:00
dd4d5a81ee Fix JobId variable in API requests 2023-09-02 14:11:44 +02:00
a05e1914e3 Log output changes 2023-09-02 14:11:11 +02:00
ed79ee5d0f Add Manga from Jobs to cachedManga 2023-09-01 23:41:50 +02:00
28e05e549d Added import and export for Jobs
Renamed tasksFilePath -> jobsFilePath and changed to jobs.json
2023-09-01 23:37:50 +02:00
eaab7c5235 Fixed jobs not starting at all 2023-09-01 23:08:31 +02:00
0552b3db82 Fix crash on null Logmessage 2023-09-01 22:53:38 +02:00
c813e1854d Do not add duplicate jobs 2023-09-01 22:39:22 +02:00
32036df057 Added API call to retrieve cover with internalId.
No need to mount imageCache over multiple containers.
2023-09-01 21:40:56 +02:00
394829ee36 Revert "Download Covers only when Downloading Chapters"
This reverts commit e663163d

Covers might be important
2023-09-01 21:17:46 +02:00
2a389f1ede Changed default download and working directories.
ExportSettings() now creates the folder
2023-08-31 17:07:54 +02:00
3167f6c3e6 Changed default log-folder path, and log-encoding to utf8 2023-08-31 17:07:17 +02:00
89c5f4b820 Added API-call GET Jobs/MonitorJobs 2023-08-31 16:40:08 +02:00
1c1169e5ce Renamed Managers to Connectors 2023-08-31 16:39:39 +02:00
d5d34c5381 Changed return-values of API: NotificationConnectors/Types and LibraryConnectors/Types 2023-08-31 15:52:47 +02:00
c0efbb22cc Fixed JsonParsing of NotificationConnector and LibraryConnector with GlobalBase 2023-08-31 15:41:02 +02:00
9f30e52713 Added new API-Calls:
POST: Jobs/StartNow
DELETE: Jobs
2023-08-31 13:12:03 +02:00
1fd36c91d6 Renamed Publication.cs to Manga.cs
Renamed Request-Paths "Tasks" to "Jobs"
2023-08-31 12:16:02 +02:00
e663163de8 Download Covers only when Downloading Chapters 2023-08-31 12:14:03 +02:00
4827b90c3d
Merge pull request #45 from C9Glax/dependabot/github_actions/docker/setup-buildx-action-2.10.0
Bump docker/setup-buildx-action from 2.9.1 to 2.10.0
2023-08-29 19:09:40 +02:00
e274c864f9 CLI: Add Status Code to output 2023-08-29 14:11:46 +02:00
f4bc182954 CLI: Prompt directly for HttpMethod, ignore input when exiting log 2023-08-29 14:09:35 +02:00
3365be219c Logger: LogMessage time 2023-08-29 14:08:57 +02:00
10708b3abd Add CLI with basic functionality. 2023-08-29 14:00:55 +02:00
c1e939f1e3 Server correct shutdown/force shutdown 2023-08-29 12:40:10 +02:00
21d53dabec TrangaSettings corrected logic for loading settingsfile, and overwriting settings 2023-08-29 12:39:48 +02:00
a9417dbba6 Trangasettings fix infinite loop on load 2023-08-29 12:39:21 +02:00
dependabot[bot]
4ca7b107eb
Bump docker/setup-buildx-action from 2.9.1 to 2.10.0
Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.9.1 to 2.10.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](https://github.com/docker/setup-buildx-action/compare/v2.9.1...v2.10.0)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-08-29 05:58:02 +00:00
61024bcee9 UserDictionary and variable readonly 2023-08-27 01:22:39 +02:00
ea1b8749a6 Removed unnecessary check 2023-08-27 01:22:21 +02:00
2fcab1f1b1 More Logging 2023-08-27 01:22:08 +02:00
bbd716383a Added ToString overrides 2023-08-27 01:21:23 +02:00
6e1a0ab06c Corrected order of constructor (GlobalBase clone) 2023-08-27 01:15:02 +02:00
181942153b Fixed some variables 2023-08-27 01:05:32 +02:00
fe04af4a2b Added most API-functions 2023-08-27 01:01:39 +02:00
4240a1eb6a Added methods to search for jobs, and remove multiple jobs. 2023-08-27 01:01:20 +02:00
32349c1ddf Added methods to Update Downloadlocation and WorkingDirectory 2023-08-27 01:00:42 +02:00
a94d3d6b40 Added method to delete Library/Notification-Connector 2023-08-27 01:00:13 +02:00
f916cda0f1 Corrected order of constructor (GlobalBase clone) 2023-08-27 00:59:54 +02:00
a8f0f1af15 More API Requests 2023-08-26 02:43:24 +02:00
0cf3a95f58 cachedPublications 2023-08-26 02:42:57 +02:00
a89a526fda Default language GetChapters: en 2023-08-26 02:42:31 +02:00
4d1e43e7b3 Job: add Id 2023-08-26 02:40:24 +02:00
4f9749d09e Fix bug with MangaDex, Useragent 2023-08-26 01:51:16 +02:00
7614f9aad3 Add User Agent to MangaConnectors 2023-08-26 01:50:31 +02:00
97c0e42512 Handle first requests, add parameter parser 2023-08-26 01:47:36 +02:00
565bc0775d Add Connectors to Tranga 2023-08-26 01:47:15 +02:00
e6a3fa2899 public GetPublications 2023-08-26 01:46:36 +02:00
2d82279d98 Added startup args, and first http-requesthandler 2023-08-24 13:35:07 +02:00
c5559a4ceb Save api-Portnumber in settings 2023-08-24 13:34:43 +02:00
2572a537ab Job Inherits from GlobalBase 2023-08-24 13:34:23 +02:00
58db049496 Merged MonitorJobs and CheckJobs in JobBoss 2023-08-24 13:34:09 +02:00
8f309fcfd7 Library- and NotificationConnectors in GlobalBase 2023-08-24 13:33:33 +02:00
11461051f3 Fixed missing filelogger crash 2023-08-24 12:13:34 +02:00
a4aa571870 Added Jobs and ProgressToken 2023-08-04 14:51:40 +02:00
e4086a8892 Rename TBaseObject -> GlobalBase
Remove Notification and Library Connectors from GlobalBase
2023-08-01 18:24:19 +02:00
c45e4ddf90 Rename Connectors -> MangaConnectors 2023-08-01 18:22:24 +02:00
675effd317 Trash everything and rewrite everything from scratch 2023-08-01 18:21:29 +02:00
a4f67c9ab4 Merge pull request 'Fixes for MangaKatana' (!53) from cuttingedge into master
Reviewed-on: #53
2023-07-31 23:09:24 +02:00
2538a29788 MangaKatana fix search result characters 2023-07-31 23:05:29 +02:00
81d5802092 MangaKatana fix bug where empty result in search would crash program 2023-07-31 23:03:46 +02:00
436edfde66 Fix issue where closed connection crashes api 2023-07-31 22:58:41 +02:00
00c1cd56b8 Merge pull request '#31 #40' (!52) from cuttingedge into master
Reviewed-on: #52
2023-07-31 22:50:22 +02:00
a63154b581 Fix startup issue where version would be null on new installs 2023-07-31 22:47:35 +02:00
53fe7ee983 Possible fix for #31
chapter regex
2023-07-31 22:47:14 +02:00
6fb4098c16 Merge pull request 'Missing logger, breaking version in settings.json' (!51) from cuttingedge into master
Reviewed-on: #51
2023-07-31 02:14:06 +02:00
7a024e8733 Add logger to CommonObjects on deserialization 2023-07-31 02:11:53 +02:00
835e239be5 Cleanup 2023-07-31 02:07:39 +02:00
df8538c3b4 Merge pull request 'version' (!50) from cuttingedge into master
Reviewed-on: #50
2023-07-31 01:59:42 +02:00
f832fe0de3 version 2023-07-31 01:58:00 +02:00
ebdb38bd57 Merge pull request 'Moving away from API/CLI model, combined into single executable.' (!49) from cuttingedge into master
Reviewed-on: #49
2023-07-31 01:53:50 +02:00
e3201a9b99 Ignore Logger 2023-07-31 01:50:26 +02:00
eb50b84266 Converters 2023-07-31 01:48:40 +02:00
b3d778ff56 accessibility 2023-07-31 01:45:55 +02:00
00861c406a added logging 2023-07-31 01:42:15 +02:00
01c8784bab wrong array 2023-07-31 01:30:32 +02:00
3aa299e48a deserialization of enum 2023-07-31 01:28:32 +02:00
d1ce244135 New Migration to new commonObjects 2023-07-31 01:26:38 +02:00
c91754614b weird env 2023-07-31 00:58:22 +02:00
70b1ae4812 isLinux 2023-07-31 00:52:27 +02:00
336e08aebf If not running cli add back console output 2023-07-31 00:46:14 +02:00
18134cdf01 If not running cli add back console output 2023-07-31 00:43:57 +02:00
5b89cbd042 Only run TaskMode on Windows 2023-07-31 00:41:25 +02:00
74aca86b62 Wrong entrypoint 2023-07-31 00:36:56 +02:00
e5abaa4549 Wrong entrypoint 2023-07-31 00:35:11 +02:00
eb0eb71e86 wrong dockerfile 2023-07-31 00:33:57 +02:00
4e73b0a4cf wrong dockerfile 2023-07-31 00:32:42 +02:00
140074208f Merged API and CLI into one. 2023-07-31 00:31:19 +02:00
fa19d3da14 Fix missing file on loading settings/commonobjects 2023-07-31 00:01:18 +02:00
3d6657b483 Moved libraryManagers, notificationManagers and logger to commonObjects class. 2023-07-30 23:31:25 +02:00
f9b5e05974
Merge pull request #39 from C9Glax/cuttingedge
Move Namespaces, move logger to TrangaSettings, move downloadClient to separate File, remove deprecated calls
2023-07-30 17:34:06 +02:00
ad4027779f Remove Deprecated CreateUpdateLibraryTask 2023-07-30 17:29:30 +02:00
98ec0b837f Remove Enter input from settings, instead update all settings on click of "Update" Button.
resolves #38
2023-07-30 17:27:47 +02:00
1afa3df316 Cleanup build warnings, ReSharper, Dictionary 2023-07-30 17:25:04 +02:00
d83aa1ef5b deprecated 2023-07-30 17:11:11 +02:00
b610ec734e Chapter readonly struct 2023-07-30 17:09:39 +02:00
abf587377c API: Changed uninstantiated class Program to static 2023-07-30 17:09:30 +02:00
437349bd27 TrangaSettings changed set directive 2023-07-30 17:09:10 +02:00
000539d6a6 Moved logger to Trangasettings 2023-07-30 17:08:43 +02:00
b4bef25a22 Moved downloadclient to separate file 2023-07-30 17:04:43 +02:00
579e400a5d Moved class to appropriate namespaces 2023-07-30 17:01:54 +02:00
8af2b12fc0 Moved class to appropriate namespaces 2023-07-30 16:26:29 +02:00
bad4330330 introduce branch cuttingedge 2023-07-30 16:21:04 +02:00
42596752d3 FIX: null Publications in tasks 2023-07-29 18:55:06 +02:00
16238c590b Remove UpdateLibrariesTask 2023-07-29 18:20:41 +02:00
9f38dc3b6a Revert "Remove UpdateLibrariesTask"
This reverts commit de14ff0b75.
2023-07-29 18:18:02 +02:00
485637d99a Added Min-Chapter-Number to API 2023-07-28 10:47:36 +02:00
de14ff0b75 Remove UpdateLibrariesTask 2023-07-28 10:41:20 +02:00
f947c37bd6 Change website context to revert location to / instead of /Website 2023-07-28 10:30:54 +02:00
77eec0f696 Fix wrong deserialization 2023-07-21 00:32:18 +02:00
18323f9f51 remove debug 2023-07-21 00:22:41 +02:00
2cd2b6842d arch armv7 fails to build 2023-07-21 00:20:13 +02:00
09f815903f arch arm64 fails to build 2023-07-21 00:18:08 +02:00
c108478039 context 2 2023-07-21 00:15:35 +02:00
74289e43b7 context 2023-07-21 00:14:07 +02:00
2779f9ba09 Merge remote-tracking branch 'origin/master' 2023-07-21 00:12:23 +02:00
59a8e556f0 wrong build path 2023-07-21 00:12:09 +02:00
074b137b5c Merge pull request 'dev' (!48) from dev into master
Reviewed-on: #48
2023-07-21 00:10:33 +02:00
3cb2540794 debugging 2023-07-21 00:09:59 +02:00
02c9934896 change context back to API 2023-07-21 00:09:51 +02:00
b2e1c95bca Merge remote-tracking branch 'origin/master' 2023-07-21 00:07:44 +02:00
8c9e3ea6b6 Merge pull request 'split into two actions, dont always build tranga-base' (!47) from dev into master
Reviewed-on: #47
2023-07-21 00:07:16 +02:00
db441607ad Merge branch 'master' into dev 2023-07-21 00:04:24 +02:00
91c56783dc restore absolute path 2023-07-21 00:03:32 +02:00
2c288eeeea Don't rebuild tranga-base every time. 2023-07-20 23:54:30 +02:00
57a1ea91fc Merge pull request 'dev' (!46) from dev into master
Reviewed-on: #46
2023-07-20 23:50:15 +02:00
06138a3927 Workflow change context 2023-07-20 23:49:33 +02:00
84b053e672 Merge remote-tracking branch 'origin/dev' 2023-07-20 23:44:17 +02:00
0fe0cbc4ad
Merge pull request #34 from C9Glax/dev
Unsupported arch
2023-07-20 23:42:04 +02:00
62e6ce8363 remove unsupported platforms 2023-07-20 23:38:10 +02:00
a4f3ec6580
Merge pull request #33 from schklom/master
Automatic build of Docker images for many platforms (ARM too)
2023-07-20 23:25:10 +02:00
schklom
8b4e996b7e
Create dependabot.yml 2023-07-20 23:10:46 +02:00
schklom
964540d30f
Create docker-image.yml 2023-07-20 23:10:15 +02:00
fa69f4488f Removed UpdateLibraryTask (deprecated).
Libraries will be updated when new Chapters are downloaded.
Added Migrator, for future file-changes
2023-07-20 18:15:14 +02:00
42c2876188 Mangakatana chapter num fix 2023-07-16 20:22:33 +02:00
715244ff1b Mangasee more logging 2023-07-16 18:15:28 +02:00
2333cd9095 Mangasee more bad words 2023-07-16 18:15:11 +02:00
c8225db4fe #30 #31 2023-07-16 17:47:00 +02:00
6741ca096b Startup Message 2023-07-16 17:38:42 +02:00
a897a7b3a2 Better Logger.
Includes a formatted Console-Log
2023-07-16 17:33:15 +02:00
0f8932e712 Fixed missing logger for notificationManagers on deserialization 2023-07-09 21:38:49 +02:00
78023ef0fd resolves #21 lunasea 2023-07-09 21:35:15 +02:00
Andy Maesen
d171f34e4e
Update README.md 2023-07-07 14:23:33 +02:00
Andy Maesen
aa0dc4fa35 Fixes single result redirect 2023-07-06 02:09:56 +02:00
25f48592c0 Added more badwords to filter out when searching mangasee, resolves #26 2023-07-04 22:44:01 +02:00
398ac304d2 Update Komga/Kavita immediately after new chapter is Downloaded 2023-07-03 00:01:08 +02:00
58a62f8272 Mangasee search all title-fields. 2023-07-02 23:54:02 +02:00
86752c9a7e Order of task Execution by due-time 2023-07-02 23:10:16 +02:00
f9a7828d02 Moved notification back to DownloadChapterTask
temp: Don't wait for childTasks to finish before finishing the parent task
2023-07-02 23:06:24 +02:00
c97ff69148 Fix for new publications: Add to collection 2023-07-02 22:46:01 +02:00
1735bbcf8a Fix wrong query from allTasks to runningTasks 2023-06-30 00:23:00 +02:00
9ae8ca65df resolves #25 characters encoding mistake 2023-06-29 21:09:42 +02:00
00599cd24e Infinite loop on unavailable chapters 2023-06-28 23:00:24 +02:00
6d5618a1f7 Infinite loop on unavailable chapters 2023-06-28 22:46:22 +02:00
a1202a875d Moved success-state to TaskManager 2023-06-28 22:43:46 +02:00
98946b4aa3 Fixed null chapterNumber on mangadex 2023-06-28 22:43:24 +02:00
41b6bb77b6 Moved GetPublicationsFromConnector to connector.
Moved GetNewChaptersList to Connector.
Removed knownPublications file
Renamed chapterCollection to collection and only contains Publications
2023-06-28 22:43:03 +02:00
e70a14ca56 Only send notifications if more than 0 new chapters 2023-06-28 19:23:06 +02:00
b099da1156 Chapter fix RegexMatching on chapter number 2023-06-28 00:13:23 +02:00
01d1f922c2 MangaDex chapterNumber non-nullable 2023-06-28 00:13:09 +02:00
47a80d67a8 TrangaTask Success-State and child task deletion 2023-06-27 23:55:13 +02:00
16e3549455 Export Data after deleting task 2023-06-27 23:54:44 +02:00
be8c6b50ba Notification moved to TrangaTask 2023-06-27 23:37:13 +02:00
a38fcf50ca nullable types removed 2023-06-27 23:25:35 +02:00
82f6c7b3fe Moved GetArchiveFilePath, CheckChapterIsDownloaded and GetComicInfoXmlString to Chapter.cs 2023-06-27 23:22:23 +02:00
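GetComicInfoXmlString presumably produces the ComicInfo.xml metadata that goes into each .cbz; a generic sketch using the common ComicInfo fields (Series, Title, Number), not necessarily the exact set Tranga writes:

using System;
using System.Xml.Linq;

XElement comicInfo = new("ComicInfo",
    new XElement("Series", "Some Series"),   // placeholder values
    new XElement("Title", "Chapter 12"),
    new XElement("Number", "12"));

// This string would be written into the chapter archive as ComicInfo.xml
Console.WriteLine(comicInfo.ToString());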
5586d2c104 Connector CheckChapterIsDownloaded more Regex 2023-06-27 23:14:22 +02:00
62dc9fee2a GetComicInfoXmlString: protected -> internal 2023-06-27 23:09:09 +02:00
ac96fca6dc Chapter illegalstring regex 2023-06-27 23:08:29 +02:00
25a6ceff10 Remove sortNumber-field from Chapter
API: Change Tasks/Progress chapterSortNumber to ChapterNumber
2023-06-27 23:06:37 +02:00
b3e1d39d0f Rename Connector.SearchChapters -> SelectChapters
Added "a(ll)"-option to SelectChapters
2023-06-27 23:02:55 +02:00
2833b7f22a Remove Legacy support for "DownloadNewChapters" 2023-06-27 22:59:33 +02:00
cbdd305b69 TaskManager AddTask make better use of GetTasksMatching and GetTasksMatching easier usage 2023-06-27 22:59:23 +02:00
b88890817e TaskManager _runningDownloadChapterTasks -> _runningTasks for all TrangaTasks 2023-06-27 22:58:40 +02:00
f66ab7d40b Connector use TrangaSettings instead of own values for imageCache and downloadLocation 2023-06-27 22:57:44 +02:00
4cb3694cd5 Re-add task timeout 2023-06-27 22:23:53 +02:00
a05d4c8bd9 Merge remote-tracking branch 'origin/master' 2023-06-27 22:23:23 +02:00
22f87a74b2 Re-add task timeout 2023-06-27 22:23:19 +02:00
ba57282879 Re-add task timeout 2023-06-27 22:19:06 +02:00
9ccba6fba6 Fix CheckChapterIsDownloaded "Directory does not exist" exception returning 0 chapters 2023-06-25 23:56:22 +02:00
4f01c1166f Fix taskIds being changed during requests, no workaround this time 2023-06-25 23:56:00 +02:00
0a51e7ad3d Fix taskIds being changed during requests 2023-06-25 23:26:36 +02:00
e541b922dc
Merge pull request #24 from arxae/master
Added MangaKatana connector
2023-06-25 21:38:18 +02:00
604abd5f9a Fix bug where ChildTasks hung parentTasks 2023-06-24 21:00:26 +02:00
7b311eae75 Will break: CheckChapterIsDownloaded 2023-06-24 20:46:35 +02:00
Andy Maesen
d4eb72cd99 Required changes 2023-06-23 22:14:27 +02:00
b515215f4b Fix taskIds being changed during requests 2023-06-22 23:09:59 +02:00
a16686dfbf Fix wrong taskNames 2023-06-22 22:52:26 +02:00
Andy Maesen
4275703941 Added MangaKatana connector 2023-06-22 14:22:21 +02:00
c3342984ea Server fixed bug where ?& in request url caused variables to not parse 2023-06-21 18:04:41 +02:00
ed4bdb5b33 TrangaSettings export after change 2023-06-21 18:04:12 +02:00
0f0902c932 LunaSea changed to id device/id or user/id instead of full url 2023-06-21 18:03:48 +02:00
6508055b43 API Fix closed response socket 2023-06-21 17:42:56 +02:00
abc66511d8 Fixed progress tracking this time for realsies. resolves #5 2023-06-21 17:30:31 +02:00
9ed36c47b5 Fixed taskId on init deserialization 2023-06-21 17:29:48 +02:00
fd1b2a8470 API Fix closed response socket 2023-06-21 17:29:20 +02:00
8058749ab5 Website fix wrong task on deletion 2023-06-21 16:53:56 +02:00
8737617e5f Fix deletion of successful child tasks 2023-06-21 16:53:41 +02:00
7e4f43f1e2 API fix CORS preflight 2023-06-21 16:53:07 +02:00
12b1b2afd6 Server fix interfaces on windows 2023-06-21 16:52:57 +02:00
0f9ac60fcd closes #11 readme update 2023-06-21 16:17:40 +02:00
8c87f2948c README updated screenshots 2023-06-21 16:08:36 +02:00
e0fb817256 Changed glax/tranga-base to latest 2023-06-20 23:26:49 +02:00
cdd2d94ba1 Wrote my own Http-Server.
ASP-NET can **** my **** and *** :)
2023-06-20 23:15:56 +02:00
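For reference, a hand-rolled HTTP server in .NET usually boils down to HttpListener; a bare-bones sketch of that shape (the port is an assumption, and this is not the project's actual Server class):

using System;
using System.Net;
using System.Text;

HttpListener listener = new();
listener.Prefixes.Add("http://localhost:6531/"); // assumed port; prefixes must end with '/'
listener.Start();

while (true)
{
    HttpListenerContext context = listener.GetContext(); // blocks until a request arrives
    byte[] body = Encoding.UTF8.GetBytes("{\"status\":\"ok\"}");
    context.Response.ContentType = "application/json";
    context.Response.StatusCode = (int)HttpStatusCode.OK;
    context.Response.OutputStream.Write(body, 0, body.Length);
    context.Response.Close(); // flush and finish the response
}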
d5b7645cd2 "Thread-safe" message adding.. 2023-06-20 23:15:22 +02:00
9af5c1603e Using HttpStatusCode to signify Task-Success
When DownloadChapterTask returns notfound, do not retry.
2023-06-20 15:46:54 +02:00
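The retry rule described above amounts to something like this (illustrative only):

using System;
using System.Net;

Console.WriteLine(ShouldRetry(HttpStatusCode.NotFound)); // False: do not re-queue the DownloadChapterTask

static bool ShouldRetry(HttpStatusCode result) => result switch
{
    HttpStatusCode.OK => false,       // success, nothing to retry
    HttpStatusCode.NotFound => false, // the chapter is gone, retrying will not help
    _ => true                         // everything else is treated as transient
};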
1035939309 Fix overflow 2023-06-20 15:18:58 +02:00
3b542c04f6 ReSharper cleanup,
Remove unnecessary using directives
2023-06-20 14:59:08 +02:00
a809b7c285 Added timeout to Connector DownloadClient 2023-06-20 14:58:02 +02:00
e883277400 Renamed DownloadNewChaptersTask to MonitorPublicationTask
Added TrangaTask.Clone() method
Rewrote TrangaTask.progress for the billionth+1 time.
Removed Increment and DecrementProgress methods
Removed TrangaTask.ReplaceFailedChildTask method
Changed return type of TrangaTask.ExecuteTask to bool, signifying success.
Added Failed Execution state to TrangaTask
Replaced taskManager failed-task logic
Removed TaskManager bulky AddTask and DeleteTask methods
Removed TaskManager bulky Constructor
2023-06-20 14:57:44 +02:00
23dfdc0933 Connector DownloadChapter, DownloadImage, DownloadChapterImages returns successState.
RequestResult replace HttpStatusCode with success-status boolean.
DownloadChapterTask: Only send Notification when Chapter download successful
2023-06-19 22:45:33 +02:00
edc24fff5b Moved notification to DownloadChapterTask, sends when parentTask exists. 2023-06-19 22:34:34 +02:00
6cdccdf66b Fix infinite loop of DownloadNewChaptersTask 2023-06-19 22:32:32 +02:00
a4c9168551 Selector for task-sanitizer 2023-06-19 17:17:47 +02:00
821a1b7c3a Unique IDs for TrangaTask now based on Random-generator 2023-06-19 17:17:24 +02:00
b2b4256972 Startup message api 2023-06-19 16:46:12 +02:00
d2f46e4637 #21 Deserialization of LunaSea Object 2023-06-19 11:27:07 +02:00
303fc293ba Fixed Bug on AddTask where no new UpdateLibraryTask would be added 2023-06-15 22:32:55 +02:00
36c145da26 Gotify change to normal priority 2023-06-15 21:24:01 +02:00
c822c74f42 website fix taskSelectOutput overflow issue 2023-06-15 21:16:56 +02:00
dda4054d34 API: Fix nullable bug on Getchapters 2023-06-15 21:15:44 +02:00
5b2546fdbc removed unnecessary log 2023-06-15 19:07:25 +02:00
c11e3993ea Added success message to NotificationManager 2023-06-15 19:06:53 +02:00
02a382a99a Website: Added connector NotificationManager LunaSea
Added Update Method for TrangaSettings for LunaSea
#21
2023-06-15 18:57:50 +02:00
c6c8f5cdf6 TrangaSettings nullable library and notificationManagers will initialize a new Hashset 2023-06-15 18:50:50 +02:00
84842aed3c Added connector NotificationManager LunaSea 2023-06-15 18:50:19 +02:00
d9ced11cd1 Website: Added gotify config 2023-06-15 18:38:47 +02:00
25c90782dc Moved UpdateSettings to TrangaSettings
Added NotificationManager
Added Gotify
Added Notification on MonitorTask download new chapters
2023-06-15 18:25:32 +02:00
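Sending a Gotify notification from .NET is a single POST against the server's /message endpoint; a sketch with placeholder server URL and token (the exact payload Tranga builds is not shown in this log):

using System.Net.Http;
using System.Net.Http.Json;

HttpClient client = new();
// Gotify accepts the application token as the X-Gotify-Key header (or a ?token= query parameter)
client.DefaultRequestHeaders.Add("X-Gotify-Key", "<app-token>");

var payload = new
{
    title = "New chapters downloaded",            // illustrative values
    message = "MonitorTask found 3 new chapters",
    priority = 5
};
await client.PostAsJsonAsync("https://gotify.example.com/message", payload);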
e789c429cd TaskManager when deleting task also remove from parent. 2023-06-15 18:24:19 +02:00
93de471836 Added TrangaTask.RemoveChildTask 2023-06-15 18:22:59 +02:00
8b58e7dd13 Website: On Download Chapters only show chapters that have not yet been downloaded
API: Added new variables to /Publications/GetChapters: onlyNew and onlyExisting. API will return only new, only existing or all chapters depending on variables.
#19
2023-06-15 17:14:20 +02:00
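A sketch of calling the extended endpoint; the base URL and the connectorName/internalId parameter names are assumptions, only onlyNew/onlyExisting come from the commit itself:

using System.Net.Http;

HttpClient client = new();
string baseUrl = "http://<api-host>:<port>"; // placeholder: wherever the API is listening

// onlyNew=true -> only chapters not yet downloaded; onlyExisting=true -> only already-downloaded ones
string url = $"{baseUrl}/Publications/GetChapters?connectorName=MangaDex&internalId=<publication-id>&onlyNew=true";
string chaptersJson = await client.GetStringAsync(url);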
b571bfa43d Moved GetNewChaptersList to taskManager and added GetExistingChaptersList 2023-06-15 17:07:32 +02:00
088d1c4647 Derived Constructor 2023-06-15 17:06:41 +02:00
f280c01802 Browser Version for both windows and linux 2023-06-15 16:30:07 +02:00
1be10b310d Fix Regex bug on download volumes 2023-06-11 19:17:03 +02:00
a0469f3145 Cancel DownloadChapter-Task on removal 2023-06-11 19:16:05 +02:00
fcd81f03b3 resolves #17 no cover image 2023-06-11 19:05:08 +02:00
76604d84d8 Better way of handling progress, and childProgress. 2023-06-11 18:24:26 +02:00
af822febbe fixed nullable warning 2023-06-11 18:01:04 +02:00
8e207c3119 Better way of handling progress, and childProgress. 2023-06-11 17:27:33 +02:00
b6f8c8aab5 TaskType check 2023-06-11 17:05:24 +02:00
36f7cbd3e9 Better way of handling progress, and childProgress.
More reliable taskFinishTime
2023-06-11 17:04:33 +02:00
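One straightforward way to derive a parent task's progress from its children, in the spirit of these commits (illustrative, not the actual TrangaTask logic):

using System;
using System.Collections.Generic;
using System.Linq;

List<double> childProgress = new() { 1.0, 0.5, 0.0 }; // each child reports 0.0 .. 1.0

// Parent progress as the mean of its children; a parent without children counts as complete here.
double parentProgress = childProgress.Count == 0 ? 1.0 : childProgress.Average();

// Rough remaining-time estimate from elapsed time and progress
DateTime taskStart = DateTime.Now.AddMinutes(-10); // placeholder start time
TimeSpan elapsed = DateTime.Now - taskStart;
TimeSpan remaining = parentProgress > 0
    ? elapsed * ((1 - parentProgress) / parentProgress)
    : TimeSpan.MaxValue;
Console.WriteLine($"{parentProgress:P0} done, ~{remaining:hh\\:mm\\:ss} remaining");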
3b2643d949 Website show remaining time instead of percentage 2023-06-11 16:38:12 +02:00
9fd8bf1741 website uses taskId 2023-06-10 16:00:41 +02:00
d5c9c5ba96 Redid progress calculation on DownloadNewChaptersTask and DownloadChapterTask 2023-06-10 16:00:16 +02:00
c8e27921ab Added taskId to trangaTask and parentTaskId to DownloadChapterTask as unique identifier to attach ChildTasks to ParentTask on deserialization. 2023-06-10 15:59:42 +02:00
6eaba07801 Changed progress type from float to double 2023-06-10 15:58:11 +02:00
41929e0c72 DownloadChapterTask sets execution of parentTask 2023-06-10 15:04:37 +02:00
4fcaca1a6e Multiple authors resolves #7 2023-06-10 14:45:04 +02:00
0e3c7f32d7 Added CancellationToken to TrangaTask #14 2023-06-10 14:34:30 +02:00
1c94625840 Added CancellationToken to TrangaTask #14 2023-06-10 14:27:09 +02:00
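The CancellationToken addition follows the standard .NET cooperative-cancellation pattern; a self-contained sketch of the idea (not the actual TrangaTask code):

using System;
using System.Threading;
using System.Threading.Tasks;

CancellationTokenSource cts = new();

Task download = Task.Run(async () =>
{
    for (int page = 0; page < 100; page++)
    {
        cts.Token.ThrowIfCancellationRequested(); // the task checks the token between pages
        await Task.Delay(100);                    // stand-in for downloading one image
    }
}, cts.Token);

cts.CancelAfter(TimeSpan.FromSeconds(1)); // e.g. the task gets removed, so cancel it

try { await download; }
catch (OperationCanceledException) { Console.WriteLine("Task cancelled."); }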
32f89f9dce Multiple authors resolves #7 2023-06-10 14:05:23 +02:00
234735a562 Order of tasks closes #15
Also API /Queue/Get orders in order of nextExecution
2023-06-10 00:45:55 +02:00
8b916eb854 invalid Ids 2023-06-10 00:23:23 +02:00
29e1790c93 website tasks-width now max 95vw 2023-06-10 00:10:16 +02:00
ac4c799a74 Better indication if tasks have started. 2023-06-10 00:07:41 +02:00
7c62883c37 invalid id 2023-06-10 00:02:51 +02:00
02018253bf wrong nesting ... 2023-06-10 00:01:38 +02:00
2aec884009 Moved update interval for task-progress to own interval, progress gets continually updated. 2023-06-09 23:58:04 +02:00
b3321ff030 unnecessary log 2023-06-09 23:48:39 +02:00
16c1094875 Replaced Task-Progress-Tracking Window with more fancy one 2023-06-09 23:46:10 +02:00
5763d50409 #14 temporary workaround for disposing tasks 2023-06-09 23:45:53 +02:00
ad43297358 API: Updated /Tasks/GetProgress to return progress of specific task (by sortNumber) 2023-06-09 23:43:57 +02:00
b17800e0ef Decrement progress of parenttask when childtask fails 2023-06-09 23:43:19 +02:00
89c80d2997 Fixed bug where tasks would instantly failed when launched #14 2023-06-09 23:42:54 +02:00
6485b8744f API: Updated /Tasks/GetProgress to return progress of specific task (by sortNumber) 2023-06-09 23:42:18 +02:00
a3a96b6b55 Added DecrementProgress function to TrangaTask 2023-06-09 23:38:28 +02:00
5bce3c6fdd Website: Monitor task creation styling 2023-06-09 22:15:29 +02:00
5fa0c98d05 Documentation how to create tasks #11 2023-06-09 11:26:51 +02:00
b166013770 resolves #13 Website: Clear previous results 2023-06-09 11:12:43 +02:00
02fe849046 Better downloadChapter selection 2023-06-09 11:06:18 +02:00
d42393c83a Website + API ability to download specific volumes 2023-06-08 19:53:05 +02:00
c685bd622f Website:
New task-Creation dialog
Redesigned Settings dialog
2023-06-08 19:25:28 +02:00
dc83cc2194 Fixed Range on CLI downloadchaptertask creation 2023-06-08 19:25:03 +02:00
7784f2024e API changes:
/Tranga/GetAvailableControllers => /Controllers/Get
/Tranga/GetKnownPublications =>/Publications/GetKnown
/Tranga/GetPublicationsFromConnector => /Publications/GetFromConnector
/Tasks/GetTaskTypes => /Tasks/GetTypes
/Tasks/GetTaskProgress => /Tasks/GetProgress
/Tasks/Create is now split in 3:
    /Tasks/CreateMonitorTask
    /Tasks/CreateUpdateLibraryTask
    /Tasks/CreateDownloadChaptersTask
2023-06-08 19:24:46 +02:00
4895079887 Remove DownloadChapterTask from _runningDownloadChapterTasks after completion 2023-06-07 15:01:24 +02:00
ab1ddc6dc8 Less cluttered log 2023-06-07 00:31:27 +02:00
87eade10cf #40 task timeout criteria 2023-06-07 00:27:53 +02:00
1f3ac41b30 removed unnecessary cast 2023-06-07 00:24:58 +02:00
6a304bb330 #40 task timeout 2023-06-07 00:24:27 +02:00
b0642d1251 removed unnecessary check 2023-06-06 22:11:57 +02:00
63b5139e93 Split error message for better logging 2023-06-06 22:11:38 +02:00
e938784388 Created own base image for tranga-api (to stop apt always updating) 2023-06-06 22:11:26 +02:00
c436389426 Renamed wrong variable names: publicationId -> internalId 2023-06-06 21:57:10 +02:00
5099e25f3f Fixed wrong comparison on add new task 2023-06-06 21:56:51 +02:00
98 changed files with 6251 additions and 4480 deletions


@ -22,4 +22,6 @@
**/secrets.dev.yaml
**/values.dev.yaml
LICENSE
README.md
README.md
Manga
settings

.github/ISSUE_TEMPLATE/bug_report.yml

@ -0,0 +1,21 @@
name: Bug Report
description: File a bug report
title: "[It broke]: "
labels: ["bug"]
body:
- type: textarea
attributes:
label: What is broken?
description: What happened? How did we get here?
placeholder: The place where you tell me what you expected to happen, and what happened instead.
validations:
required: true
- type: textarea
attributes:
label: Log-output
description: The output of `docker logs tranga-api`
render: C#
- type: textarea
attributes:
label: Additional stuff
description: Screenshots, anything you think might help


@ -0,0 +1,23 @@
name: New Connector Request
description: Request a new site to be added
title: "[New Connector]: "
labels: ["New Connector"]
body:
- type: input
attributes:
label: Website-Link
placeholder: https://
validations:
required: true
- type: checkboxes
attributes:
label: Is the Website free to access?
description: We can't support pay-to-use sites.
options:
- label: The Website is freely accessible.
required: true
- type: textarea
attributes:
label: Anything else?
validations:
required: false

.github/dependabot.yml

@ -0,0 +1,7 @@
version: 2
updates:
# Maintain dependencies for GitHub Actions
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "daily"


@ -0,0 +1,45 @@
name: Docker Image CI
on:
push:
branches: [ "cuttingedge" ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.7.1
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.9.0
with:
context: ./
file: ./Dockerfile
#platforms: linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6
platforms: linux/amd64,linux/arm64
pull: true
push: true
tags: |
glax/tranga-api:cuttingedge

.github/workflows/docker-image-dev.yml

@ -0,0 +1,45 @@
name: Docker Image CI
on:
push:
branches: [ "dev" ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.7.1
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.9.0
with:
context: ./
file: ./Dockerfile
#platforms: linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6
platforms: linux/amd64,linux/arm64
pull: true
push: true
tags: |
glax/tranga-api:dev


@ -0,0 +1,45 @@
name: Docker Image CI
on:
push:
branches: [ "master" ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.7.1
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.9.0
with:
context: ./
file: ./Dockerfile
#platforms: linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6
platforms: linux/amd64,linux/arm64
pull: true
push: true
tags: |
glax/tranga-api:latest


@ -0,0 +1,45 @@
name: Docker Image CI
on:
push:
branches: [ "Server-V2" ]
workflow_dispatch:
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
# https://github.com/docker/setup-qemu-action#usage
- name: Set up QEMU
uses: docker/setup-qemu-action@v3.2.0
# https://github.com/marketplace/actions/docker-setup-buildx
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3.7.1
# https://github.com/docker/login-action#docker-hub
- name: Login to Docker Hub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
# https://github.com/docker/build-push-action#multi-platform-image
- name: Build and push API
uses: docker/build-push-action@v6.9.0
with:
context: ./
file: ./Dockerfile
#platforms: linux/amd64,linux/arm64,linux/riscv64,linux/ppc64le,linux/s390x,linux/386,linux/mips64le,linux/mips64,linux/arm/v7,linux/arm/v6
platforms: linux/amd64,linux/arm64
pull: true
push: true
tags: |
glax/tranga-api:Server-V2

.gitignore

@ -18,4 +18,8 @@ riderModule.iml
/dataSources.local.xml
/.idea
cover.jpg
cover.png
cover.png
/.vscode
/Manga
/settings
*.DotSettings.user


@ -2,17 +2,14 @@
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net7.0</TargetFramework>
<RootNamespace>Tranga_CLI</RootNamespace>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
<LangVersion>12</LangVersion>
</PropertyGroup>
<ItemGroup>
<Content Include="..\.dockerignore">
<Link>.dockerignore</Link>
</Content>
<PackageReference Include="Spectre.Console.Cli" Version="0.47.1-preview.0.11" />
</ItemGroup>
<ItemGroup>

CLI/Program.cs

@ -0,0 +1,157 @@
using System.ComponentModel;
using System.Diagnostics.CodeAnalysis;
using Logging;
using Spectre.Console;
using Spectre.Console.Cli;
using Tranga;
var app = new CommandApp<TrangaCli>();
return app.Run(args);
internal sealed class TrangaCli : Command<TrangaCli.Settings>
{
public sealed class Settings : CommandSettings
{
[Description("Directory to which downloaded Manga are saved")]
[CommandOption("-d|--downloadLocation")]
[DefaultValue(null)]
public string? downloadLocation { get; init; }
[Description("Directory in which application-data is saved")]
[CommandOption("-w|--workingDirectory")]
[DefaultValue(null)]
public string? workingDirectory { get; init; }
[Description("Enables the file-logger")]
[CommandOption("-f")]
[DefaultValue(null)]
public bool? fileLogger { get; init; }
[Description("Path to save logfile to")]
[CommandOption("-l|--fPath")]
[DefaultValue(null)]
public string? fileLoggerPath { get; init; }
[Description("Port on which to run API on")]
[CommandOption("-p|--port")]
[DefaultValue(null)]
public int? apiPort { get; init; }
}
public override int Execute([NotNull] CommandContext context, [NotNull] Settings settings)
{
List<Logger.LoggerType> enabledLoggers = new();
if(settings.fileLogger is true)
enabledLoggers.Add(Logger.LoggerType.FileLogger);
string? logFolderPath = settings.fileLoggerPath ?? "";
Logger logger = new(enabledLoggers.ToArray(), Console.Out, Console.OutputEncoding, logFolderPath);
if(settings.workingDirectory is not null)
TrangaSettings.LoadFromWorkingDirectory(settings.workingDirectory);
else
TrangaSettings.CreateOrUpdate();
if(settings.downloadLocation is not null)
TrangaSettings.CreateOrUpdate(downloadDirectory: settings.downloadLocation);
Tranga.Tranga? api = null;
Thread trangaApi = new Thread(() =>
{
api = new(logger);
});
trangaApi.Start();
HttpClient client = new();
bool exit = false;
while (!exit)
{
string menuSelect = AnsiConsole.Prompt(
new SelectionPrompt<string>()
.Title("Menu")
.PageSize(10)
.MoreChoicesText("Up/Down")
.AddChoices(new[]
{
"CustomRequest",
"Log",
"Exit"
}));
switch (menuSelect)
{
case "CustomRequest":
HttpMethod requestMethod = AnsiConsole.Prompt(
new SelectionPrompt<HttpMethod>()
.Title("Request Type")
.AddChoices(new[]
{
HttpMethod.Get,
HttpMethod.Delete,
HttpMethod.Post
}));
string requestPath = AnsiConsole.Prompt(
new TextPrompt<string>("Request Path:"));
List<ValueTuple<string, string>> parameters = new();
while (AnsiConsole.Confirm("Add Parameter?"))
{
string name = AnsiConsole.Ask<string>("Parameter Name:");
string value = AnsiConsole.Ask<string>("Parameter Value:");
parameters.Add(new ValueTuple<string, string>(name, value));
}
string requestString = $"http://localhost:{TrangaSettings.apiPortNumber}/{requestPath}";
if (parameters.Any())
{
requestString += "?";
foreach (ValueTuple<string, string> parameter in parameters)
requestString += $"{parameter.Item1}={parameter.Item2}&";
}
HttpRequestMessage request = new (requestMethod, requestString);
AnsiConsole.WriteLine($"Request: {request.Method} {request.RequestUri}");
HttpResponseMessage response;
if (AnsiConsole.Confirm("Send Request?"))
response = client.Send(request);
else break;
AnsiConsole.WriteLine($"Response: {(int)response.StatusCode} {response.StatusCode}");
AnsiConsole.WriteLine(response.Content.ReadAsStringAsync().Result);
break;
case "Log":
List<string> lines = logger.Tail(10).ToList();
Rows rows = new Rows(lines.Select(line => new Text(line)));
AnsiConsole.Live(rows).Start(context =>
{
bool running = true;
while (running)
{
string[] newLines = logger.GetNewLines();
if (newLines.Length > 0)
{
lines.AddRange(newLines);
rows = new Rows(lines.Select(line => new Text(line)));
context.UpdateTarget(rows);
}
Thread.Sleep(100);
if (AnsiConsole.Console.Input.IsKeyAvailable())
{
AnsiConsole.Console.Input.ReadKey(true); //Do not process input
running = false;
}
}
});
break;
case "Exit":
exit = true;
break;
}
}
if (api is not null)
api.keepRunning = false;
return 0;
}
}


@ -1,14 +1,42 @@
# syntax=docker/dockerfile:1
ARG DOTNET=8.0
FROM mcr.microsoft.com/dotnet/sdk:7.0 as build-env
WORKDIR /src
COPY . /src/
RUN dotnet restore Tranga-API/Tranga-API.csproj
RUN dotnet publish -c Release -o /publish
FROM mcr.microsoft.com/dotnet/aspnet:7.0 as runtime
FROM --platform=$TARGETPLATFORM mcr.microsoft.com/dotnet/runtime:$DOTNET AS base
WORKDIR /publish
COPY --from=build-env /publish .
EXPOSE 80
RUN apt-get update && apt-get install -y libx11-6 libx11-xcb1 libatk1.0-0 libgtk-3-0 libcups2 libdrm2 libxkbcommon0 libxcomposite1 libxdamage1 libxrandr2 libgbm1 libpango-1.0-0 libcairo2 libasound2 libxshmfence1 libnss3
ENTRYPOINT ["dotnet", "/publish/Tranga-API.dll"]
ENV PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium
RUN apt-get update \
&& apt-get install -y libx11-6 libx11-xcb1 libatk1.0-0 libgtk-3-0 libcups2 libdrm2 libxkbcommon0 libxcomposite1 libxdamage1 libxrandr2 libgbm1 libpango-1.0-0 libcairo2 libasound2 libxshmfence1 libnss3 chromium \
&& apt-get autopurge -y \
&& apt-get autoclean -y
FROM --platform=$BUILDPLATFORM mcr.microsoft.com/dotnet/sdk:$DOTNET AS build-env
WORKDIR /src
COPY Tranga.sln /src
COPY CLI/CLI.csproj /src/CLI/CLI.csproj
COPY Logging/Logging.csproj /src/Logging/Logging.csproj
COPY Tranga/Tranga.csproj /src/Tranga/Tranga.csproj
RUN dotnet restore /src/Tranga.sln
COPY . /src/
RUN dotnet publish -c Release --property:OutputPath=/publish -maxcpucount:1
FROM --platform=$TARGETPLATFORM base AS runtime
EXPOSE 6531
ARG UNAME=tranga
ARG UID=1000
ARG GID=1000
RUN groupadd -g $GID -o $UNAME \
&& useradd -m -u $UID -g $GID -o -s /bin/bash $UNAME \
&& mkdir /usr/share/tranga-api \
&& mkdir /Manga \
&& chown 1000:1000 /usr/share/tranga-api \
&& chown 1000:1000 /Manga
USER $UNAME
WORKDIR /publish
COPY --chown=1000:1000 --from=build-env /publish .
USER 0
ENTRYPOINT ["dotnet", "/publish/Tranga.dll"]
CMD ["-f", "-c", "-l", "/usr/share/tranga-api/logs"]


@ -1,32 +1,32 @@
using System.Text;
using System.Text.Json.Serialization;
namespace Logging;
public class FileLogger : LoggerBase
{
private string logFilePath { get; }
internal string logFilePath { get; }
private const int MaxNumberOfLogFiles = 5;
public FileLogger(string logFilePath, TextWriter? stdOut, Encoding? encoding = null) : base (stdOut, encoding)
public FileLogger(string logFilePath, Encoding? encoding = null) : base (encoding)
{
this.logFilePath = logFilePath;
DirectoryInfo dir = Directory.CreateDirectory(new FileInfo(logFilePath).DirectoryName!);
//Remove oldest logfile if more than MaxNumberOfLogFiles
string parentFolderPath = Path.GetDirectoryName(logFilePath)!;
for (int fileCount = new DirectoryInfo(parentFolderPath).EnumerateFiles().Count(); fileCount > MaxNumberOfLogFiles - 1; fileCount--) //-1 because we create own logfile later
File.Delete(new DirectoryInfo(parentFolderPath).EnumerateFiles().MinBy(file => file.LastWriteTime)!.FullName);
for (int fileCount = dir.EnumerateFiles().Count(); fileCount > MaxNumberOfLogFiles - 1; fileCount--) //-1 because we create own logfile later
File.Delete(dir.EnumerateFiles().MinBy(file => file.LastWriteTime)!.FullName);
}
protected override void Write(LogMessage logMessage)
{
try
{
File.AppendAllText(logFilePath, logMessage.ToString());
File.AppendAllText(logFilePath, logMessage.formattedMessage);
}
catch (Exception e)
catch (Exception)
{
stdOut?.WriteLine(e);
// ignored
}
}
}


@ -4,14 +4,14 @@ namespace Logging;
public class FormattedConsoleLogger : LoggerBase
{
public FormattedConsoleLogger(TextWriter? stdOut, Encoding? encoding = null) : base(stdOut, encoding)
private readonly TextWriter _stdOut;
public FormattedConsoleLogger(TextWriter stdOut, Encoding? encoding = null) : base(encoding)
{
this._stdOut = stdOut;
}
protected override void Write(LogMessage message)
{
//Nothing to do yet
this._stdOut.Write(message.formattedMessage);
}
}

Logging/LogMessage.cs

@ -0,0 +1,23 @@
namespace Logging;
public readonly struct LogMessage
{
public DateTime logTime { get; }
public string caller { get; }
public string value { get; }
public string formattedMessage => ToString();
public LogMessage(DateTime messageTime, string caller, string value)
{
this.logTime = messageTime;
this.caller = caller;
this.value = value;
}
public override string ToString()
{
string dateTimeString = $"{logTime.ToShortDateString()} {logTime.ToLongTimeString()}.{logTime.Millisecond,-3}";
string name = caller.Split(new char[] { '.', '+' }).Last();
return $"[{dateTimeString}] {name.Substring(0, name.Length >= 13 ? 13 : name.Length),13} | {value}";
}
}


@ -1,10 +1,14 @@
using System.Net.Mime;
using System.Runtime.InteropServices;
using System.Text;
namespace Logging;
public class Logger : TextWriter
{
private static readonly string LogDirectoryPath = RuntimeInformation.IsOSPlatform(OSPlatform.Linux)
? "/var/log/tranga-api"
: Path.Join(Directory.GetCurrentDirectory(), "logs");
public string? logFilePath => _fileLogger?.logFilePath;
public override Encoding Encoding { get; }
public enum LoggerType
{
@ -12,24 +16,34 @@ public class Logger : TextWriter
ConsoleLogger
}
private FileLogger? _fileLogger;
private FormattedConsoleLogger? _formattedConsoleLogger;
private MemoryLogger _memoryLogger;
private TextWriter? stdOut;
private readonly FileLogger? _fileLogger;
private readonly FormattedConsoleLogger? _formattedConsoleLogger;
private readonly MemoryLogger _memoryLogger;
public Logger(LoggerType[] enabledLoggers, TextWriter? stdOut, Encoding? encoding, string? logFilePath)
public Logger(LoggerType[] enabledLoggers, TextWriter? stdOut, Encoding? encoding, string? logFolderPath)
{
this.Encoding = encoding ?? Encoding.ASCII;
this.stdOut = stdOut ?? null;
if (enabledLoggers.Contains(LoggerType.FileLogger) && logFilePath is not null)
_fileLogger = new FileLogger(logFilePath, null, encoding);
else
this.Encoding = encoding ?? Encoding.UTF8;
DateTime now = DateTime.Now;
if(enabledLoggers.Contains(LoggerType.FileLogger) && (logFolderPath is null || logFolderPath == ""))
{
_fileLogger = null;
throw new ArgumentException($"logFilePath can not be null for LoggerType {LoggerType.FileLogger}");
string filePath = Path.Join(LogDirectoryPath,
$"{now.ToShortDateString()}_{now.Hour}-{now.Minute}-{now.Second}.log");
_fileLogger = new FileLogger(filePath, encoding);
}else if (enabledLoggers.Contains(LoggerType.FileLogger) && logFolderPath is not null)
_fileLogger = new FileLogger(Path.Join(logFolderPath, $"{now.ToShortDateString()}_{now.Hour}-{now.Minute}-{now.Second}.log") , encoding);
if (enabledLoggers.Contains(LoggerType.ConsoleLogger) && stdOut is not null)
{
_formattedConsoleLogger = new FormattedConsoleLogger(stdOut, encoding);
}
_formattedConsoleLogger = enabledLoggers.Contains(LoggerType.ConsoleLogger) ? new FormattedConsoleLogger(null, encoding) : null;
_memoryLogger = new MemoryLogger(null, encoding);
else if (enabledLoggers.Contains(LoggerType.ConsoleLogger) && stdOut is null)
{
_formattedConsoleLogger = null;
throw new ArgumentException($"stdOut can not be null for LoggerType {LoggerType.ConsoleLogger}");
}
_memoryLogger = new MemoryLogger(encoding);
WriteLine(GetType().ToString(), $"Logfile: {logFilePath}");
}
public void WriteLine(string caller, string? value)
@ -46,9 +60,7 @@ public class Logger : TextWriter
_fileLogger?.Write(caller, value);
_formattedConsoleLogger?.Write(caller, value);
_memoryLogger.Write(caller, value);
stdOut?.Write(value);
}
public string[] Tail(uint? lines)
@ -60,4 +72,9 @@ public class Logger : TextWriter
{
return _memoryLogger.GetNewLines();
}
public string[] GetLog()
{
return _memoryLogger.GetLogMessages();
}
}


@ -5,21 +5,10 @@ namespace Logging;
public abstract class LoggerBase : TextWriter
{
public override Encoding Encoding { get; }
protected TextWriter? stdOut { get; }
public LoggerBase(TextWriter? stdOut, Encoding? encoding = null)
public LoggerBase(Encoding? encoding = null)
{
this.Encoding = encoding ?? Encoding.ASCII;
this.stdOut = stdOut;
}
public void WriteLine(string caller, string? value)
{
value = value is null ? Environment.NewLine : string.Join(value, Environment.NewLine);
LogMessage message = new LogMessage(DateTime.Now, caller, value);
Write(message);
}
public void Write(string caller, string? value)
@ -27,32 +16,10 @@ public abstract class LoggerBase : TextWriter
if (value is null)
return;
LogMessage message = new LogMessage(DateTime.Now, caller, value);
stdOut?.Write(message.ToString());
LogMessage message = new (DateTime.Now, caller, value);
Write(message);
}
protected abstract void Write(LogMessage message);
public class LogMessage
{
public DateTime logTime { get; }
public string caller { get; }
public string value { get; }
public LogMessage(DateTime now, string caller, string value)
{
this.logTime = now;
this.caller = caller;
this.value = value;
}
public override string ToString()
{
string dateTimeString = $"{logTime.ToShortDateString()} {logTime.ToLongTimeString()}";
return $"[{dateTimeString}] {caller.Split(new char[]{'.','+'}).Last(),15} | {value}";
}
}
}


@ -1,9 +1,10 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0</TargetFramework>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>12</LangVersion>
</PropertyGroup>
</Project>


@ -7,17 +7,20 @@ public class MemoryLogger : LoggerBase
private readonly SortedList<DateTime, LogMessage> _logMessages = new();
private int _lastLogMessageIndex = 0;
public MemoryLogger(TextWriter? stdOut, Encoding? encoding = null) : base(stdOut, encoding)
public MemoryLogger(Encoding? encoding = null) : base(encoding)
{
}
protected override void Write(LogMessage value)
{
_logMessages.Add(value.logTime, value);
lock (_logMessages)
{
_logMessages.Add(DateTime.Now, value);
}
}
public string[] GetLogMessage()
public string[] GetLogMessages()
{
return Tail(Convert.ToUInt32(_logMessages.Count));
}
@ -34,7 +37,10 @@ public class MemoryLogger : LoggerBase
for (int retIndex = 0; retIndex < ret.Length; retIndex++)
{
ret[retIndex] = _logMessages.GetValueAtIndex(_logMessages.Count - retLength + retIndex).ToString();
lock (_logMessages)
{
ret[retIndex] = _logMessages.GetValueAtIndex(_logMessages.Count - retLength + retIndex).ToString();
}
}
_lastLogMessageIndex = _logMessages.Count - 1;
@ -44,14 +50,25 @@ public class MemoryLogger : LoggerBase
public string[] GetNewLines()
{
int logMessageCount = _logMessages.Count;
string[] ret = new string[logMessageCount - _lastLogMessageIndex];
List<string> ret = new();
for (int retIndex = 0; retIndex < ret.Length; retIndex++)
int retIndex = 0;
for (; retIndex < logMessageCount - _lastLogMessageIndex; retIndex++)
{
ret[retIndex] = _logMessages.GetValueAtIndex(_lastLogMessageIndex + retIndex).ToString();
try
{
lock(_logMessages)
{
ret.Add(_logMessages.GetValueAtIndex(_lastLogMessageIndex + retIndex).ToString());
}
}
catch (NullReferenceException)//Called when LogMessage has not finished writing
{
break;
}
}
_lastLogMessageIndex = logMessageCount;
return ret;
_lastLogMessageIndex = _lastLogMessageIndex + retIndex;
return ret.ToArray();
}
}

README.md

@ -1,12 +1,3 @@
<!-- PROJECT SHIELDS -->
<!--
*** I'm using markdown "reference style" links for readability.
*** Reference links are enclosed in brackets [ ] instead of parentheses ( ).
*** See the bottom of this document for the declaration of the reference variables
*** for contributors-url, forks-url, etc. This is an optional, concise syntax you may use.
*** https://www.markdownguide.org/basic-syntax/#reference-style-links
-->
<!-- PROJECT LOGO -->
<br />
<div align="center">
@ -16,10 +7,11 @@
<p align="center">
Automatic Manga and Metadata downloader
</p>
<p align="center">
This is the API for <a href="https://github.com/C9Glax/tranga-website">Tranga-Website</a>
</p>
</div>
<!-- TABLE OF CONTENTS -->
<details>
<summary>Table of Contents</summary>
@ -30,12 +22,10 @@
<li><a href="#built-with">Built With</a></li>
</ul>
</li>
<li>
<a href="#screenshots">Screenshots</a>
</li>
<li>
<a href="#getting-started">Getting Started</a>
<ul>
<li><a href="#prerequisites">Usage</a></li>
<li><a href="#prerequisites">Prerequisites</a></li>
</ul>
</li>
@ -51,19 +41,42 @@
<!-- ABOUT THE PROJECT -->
## About The Project
Tranga can download Chapters and Metadata from Scanlation sites such as
Tranga can download Chapters and Metadata from "Scanlation" sites such as
- [MangaDex.org](https://mangadex.org/)
- [Manganato.com](https://manganato.com/)
- [Mangasee](https://mangasee123.com/)
- [MangaDex.org](https://mangadex.org/) (Multilingual)
- [Manganato.com](https://manganato.com/) (en)
- [Mangasee.com](https://mangasee123.com/) (en)
- [MangaKatana.com](https://mangakatana.com) (en)
- [Mangaworld.bz](https://www.mangaworld.bz/) (it)
- [Bato.to](https://bato.to/v3x) (en)
- [Manga4Life](https://manga4life.com) (en)
- [ManhuaPlus](https://manhuaplus.org/) (en)
- [MangaHere](https://www.mangahere.cc/) (en) (Their covers aren't scrapeable.)
- ❓ Open an [issue](https://github.com/C9Glax/tranga/issues/new?assignees=&labels=New+Connector&projects=&template=new_connector.yml&title=%5BNew+Connector%5D%3A+)
and trigger a library-scan with [Komga](https://komga.org/) and [Kavita](https://www.kavitareader.com/).
Notifications can be sent to your devices using [Gotify](https://gotify.net/), [LunaSea](https://www.lunasea.app/) or [Ntfy](https://ntfy.sh/).
### What this does and doesn't do
Tranga (this git-repo) will open a port (standard 6531) and listen for requests to add Jobs to Monitor and/or download specific Manga.
The configuration is all done through HTTP-Requests.
_**For a web-frontend use [tranga-website](https://github.com/C9Glax/tranga-website).**_
This project downloads the images for a Manga from the specified Scanlation-Website and packages them with some metadata - from that same website - in a .cbz-archive (per chapter).
It does this on an interval, and checks for any Chapters (.cbz-Archive) not already existing in your specified Download-Location. (If you rename or move files, it will download those again)
Tranga can (if configured) trigger a scan in Komga or Kavita, however the directory in which the Manga reside has to be available to both Tranga and Komga/Kavita.
The project doesn't manage metadata, and doesn't curate, change or enhance any information that isn't available on the selected Scanlation-Site.
It will blindly use whatever it scrapes (yes, this is a glorified Web-scraper).
and automatically start updates in [Komga](https://komga.org/) and [Kavita](https://www.kavitareader.com/) to import them.
### Inspiration:
Because [Kaizoku](https://github.com/oae/kaizoku) was relying on [mangal](https://github.com/metafates/mangal) and mangal
hasn't received bugfixes for it's issues with Titles not showing up, or throwing errors because of illegal characters,
there were no alternatives for automatic downloads. However [Kaizoku](https://github.com/oae/kaizoku) certainly had a great Web-UI.
hasn't received bugfixes for its issues with Titles not showing up, or throwing errors because of illegal characters,
there were no alternatives for automatic downloads. However, [Kaizoku](https://github.com/oae/kaizoku) certainly had a great Web-UI.
That is why I wanted to create my own project, in a language I understand, and that I am able to maintain myself.
@ -75,51 +88,41 @@ That is why I wanted to create my own project, in a language I understand, and t
- Newtonsoft.JSON
- [PuppeteerSharp](https://www.puppeteersharp.com/)
- [Html Agility Pack (HAP)](https://html-agility-pack.net/)
- Love <3 Blåhaj 🦈
- [Soenneker.Utils.String.NeedlemanWunsch](https://github.com/soenneker/soenneker.utils.string.needlemanwunsch)
- 💙 Blåhaj 🦈
<p align="right">(<a href="#readme-top">back to top</a>)</p>
## Star History
## Screenshots
![image](screenshots/overview.png)
![image](screenshots/addtask.png)
| ![image](screenshots/settings.png) | ![image](screenshots/publication-description.png) |
|-----------------------------------:|:-------------------------------------------------:|
<p align="right">(<a href="#readme-top">back to top</a>)</p>
<a href="https://star-history.com/#c9glax/tranga&Date">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=c9glax/tranga&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=c9glax/tranga&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=c9glax/tranga&type=Date" />
</picture>
</a>
<!-- GETTING STARTED -->
## Getting Started
There are two release types:
- CLI
- Docker
### CLI
Head over to [releases](https://git.bernloehr.eu/glax/Tranga/releases) and download. The CLI will guide you through setup.
### Docker
Download [docker-compose.yaml](https://git.bernloehr.eu/glax/Tranga/src/branch/master/docker-compose.yaml) and configure to your needs.
Download [docker-compose.yaml](https://git.bernloehr.eu/glax/Tranga/src/branch/master/docker-compose.yaml) and configure to your needs.
Mount `/Manga` to wherever you want your chapters (`.cbz`-Archives) downloaded (where Komga/Kavita can access them).
The `docker-compose` also includes [tranga-website](https://github.com/C9Glax/tranga-website) as frontend. For its configuration refer to the repo README.
Wherever you are mounting `/usr/share/Tranga-API` you also need to mount that same path + `/imageCache` in the webserver container.
For compatibility do not execute the compose as root (which you should not do anyway...) but as a user that can
access the folder.
### Prerequisites
[.NET-Core 7.0 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/7.0)
#### To Build
[.NET-Core 8.0 SDK](https://dotnet.microsoft.com/en-us/download/dotnet/8.0)
#### To Run
[.NET-Core 8.0 Runtime](https://dotnet.microsoft.com/en-us/download/dotnet/8.0); scroll down a bit, it should be the second item on the right.
<!-- ROADMAP -->
## Roadmap
- [ ] Docker ARM support
- [ ] ?
See the [open issues](https://git.bernloehr.eu/glax/Tranga/issues) for a full list of proposed features (and known issues).
See the [open issues](https://github.com/C9Glax/tranga/issues) for a full list of proposed features (and known issues).
<p align="right">(<a href="#readme-top">back to top</a>)</p>


@ -1,185 +0,0 @@
using System.Runtime.InteropServices;
using Logging;
using Tranga;
string applicationFolderPath = Path.Join(Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData), "Tranga-API");
string downloadFolderPath = RuntimeInformation.IsOSPlatform(OSPlatform.Linux) ? "/Manga" : Path.Join(applicationFolderPath, "Manga");
string logsFolderPath = RuntimeInformation.IsOSPlatform(OSPlatform.Linux) ? "/var/logs/Tranga" : Path.Join(applicationFolderPath, "logs");
string logFilePath = Path.Join(logsFolderPath, $"log-{DateTime.Now:dd-M-yyyy-HH-mm-ss}.txt");
string settingsFilePath = Path.Join(applicationFolderPath, "settings.json");
Directory.CreateDirectory(logsFolderPath);
Logger logger = new(new[] { Logger.LoggerType.FileLogger, Logger.LoggerType.ConsoleLogger }, Console.Out, Console.Out.Encoding, logFilePath);
logger.WriteLine("Tranga", "Loading settings.");
TrangaSettings settings;
if (File.Exists(settingsFilePath))
settings = TrangaSettings.LoadSettings(settingsFilePath, logger);
else
settings = new TrangaSettings(downloadFolderPath, applicationFolderPath, new HashSet<LibraryManager>());
Directory.CreateDirectory(settings.workingDirectory);
Directory.CreateDirectory(settings.downloadLocation);
Directory.CreateDirectory(settings.coverImageCache);
logger.WriteLine("Tranga",$"Application-Folder: {settings.workingDirectory}");
logger.WriteLine("Tranga",$"Settings-File-Path: {settings.settingsFilePath}");
logger.WriteLine("Tranga",$"Download-Folder-Path: {settings.downloadLocation}");
logger.WriteLine("Tranga",$"Logfile-Path: {logFilePath}");
logger.WriteLine("Tranga",$"Image-Cache-Path: {settings.coverImageCache}");
logger.WriteLine("Tranga", "Loading Taskmanager.");
TaskManager taskManager = new (settings, logger);
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
builder.Services.AddControllers().AddNewtonsoftJson();
string corsHeader = "Tranga";
builder.Services.AddCors(options =>
{
options.AddPolicy(name: corsHeader,
policy =>
{
policy.AllowAnyOrigin();
policy.WithMethods("GET", "POST", "DELETE");
});
});
var app = builder.Build();
app.UseSwagger();
app.UseSwaggerUI();
app.UseCors(corsHeader);
app.MapGet("/Tranga/GetAvailableControllers", () => taskManager.GetAvailableConnectors().Keys.ToArray());
app.MapGet("/Tranga/GetKnownPublications", () => taskManager.GetAllPublications());
app.MapGet("/Tranga/GetPublicationsFromConnector", (string connectorName, string title) =>
{
Connector? connector = taskManager.GetAvailableConnectors().FirstOrDefault(con => con.Key == connectorName).Value;
if (connector is null)
return Array.Empty<Publication>();
if(title.Length < 4)
return Array.Empty<Publication>();
return taskManager.GetPublicationsFromConnector(connector, title);
});
app.MapGet("/Tasks/GetTaskTypes", () => Enum.GetNames(typeof(TrangaTask.Task)));
app.MapPost("/Tasks/Create", (string taskType, string? connectorName, string? publicationId, string reoccurrenceTime, string? language) =>
{
TrangaTask.Task task = Enum.Parse<TrangaTask.Task>(taskType);
taskManager.AddTask(task, connectorName, publicationId, TimeSpan.Parse(reoccurrenceTime), language??"");
});
app.MapDelete("/Tasks/Delete", (string taskType, string? connectorName, string? publicationId) =>
{
TrangaTask.Task task = Enum.Parse<TrangaTask.Task>(taskType);
taskManager.DeleteTask(task, connectorName, publicationId);
});
app.MapGet("/Tasks/Get", (string taskType, string? connectorName, string? searchString) =>
{
try
{
TrangaTask.Task task = Enum.Parse<TrangaTask.Task>(taskType);
return taskManager.GetTasksMatching(task, connectorName:connectorName, searchString:searchString);
}
catch (ArgumentException)
{
return Array.Empty<TrangaTask>();
}
});
app.MapGet("/Tasks/GetTaskProgress", (string taskType, string? connectorName, string? publicationId) =>
{
try
{
TrangaTask.Task pTask = Enum.Parse<TrangaTask.Task>(taskType);
TrangaTask? task = taskManager
.GetTasksMatching(pTask, connectorName: connectorName, internalId: publicationId)?.FirstOrDefault();
if (task is null)
return -1f;
return task.progress;
}
catch (ArgumentException)
{
return -1f;
}
});
app.MapPost("/Tasks/Start", (string taskType, string? connectorName, string? internalId) =>
{
try
{
TrangaTask.Task pTask = Enum.Parse<TrangaTask.Task>(taskType);
TrangaTask? task = taskManager
.GetTasksMatching(pTask, connectorName: connectorName, internalId: internalId)?.FirstOrDefault();
if (task is null)
return;
taskManager.ExecuteTaskNow(task);
}
catch (ArgumentException)
{
return;
}
});
app.MapGet("/Tasks/GetRunningTasks",
() => taskManager.GetAllTasks().Where(task => task.state is TrangaTask.ExecutionState.Running));
app.MapGet("/Queue/GetList",
() => taskManager.GetAllTasks().Where(task => task.state is TrangaTask.ExecutionState.Enqueued));
app.MapPost("/Queue/Enqueue", (string taskType, string? connectorName, string? publicationId) =>
{
try
{
TrangaTask.Task pTask = Enum.Parse<TrangaTask.Task>(taskType);
TrangaTask? task = taskManager
.GetTasksMatching(pTask, connectorName: connectorName, internalId: publicationId)?.FirstOrDefault();
if (task is null)
return;
taskManager.AddTaskToQueue(task);
}
catch (ArgumentException)
{
return;
}
});
app.MapDelete("/Queue/Dequeue", (string taskType, string? connectorName, string? publicationId) =>
{
try
{
TrangaTask.Task pTask = Enum.Parse<TrangaTask.Task>(taskType);
TrangaTask? task = taskManager
.GetTasksMatching(pTask, connectorName: connectorName, internalId: publicationId)?.FirstOrDefault();
if (task is null)
return;
taskManager.RemoveTaskFromQueue(task);
}
catch (ArgumentException)
{
return;
}
});
app.MapGet("/Settings/Get", () => taskManager.settings);
app.MapPost("/Settings/Update",
(string? downloadLocation, string? komgaUrl, string? komgaAuth, string? kavitaUrl, string? kavitaUsername, string? kavitaPassword) =>
taskManager.UpdateSettings(downloadLocation, komgaUrl, komgaAuth, kavitaUrl, kavitaUsername, kavitaPassword));
app.Run();

View File

@ -1,28 +0,0 @@
{
"iisSettings": {
"windowsAuthentication": false,
"anonymousAuthentication": true,
"iisExpress": {
"applicationUrl": "http://localhost:1716",
"sslPort": 44391
}
},
"profiles": {
"http": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "http://localhost:5177"
},
"https": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "https://localhost:7036;http://localhost:5177"
},
"IIS Express": {
"commandName": "IISExpress",
"launchBrowser": true
}
}
}

View File

@ -1,28 +0,0 @@
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net7.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<RootNamespace>Tranga_API</RootNamespace>
<DockerDefaultTargetOS>Linux</DockerDefaultTargetOS>
</PropertyGroup>
<ItemGroup>
<Content Include="..\.dockerignore">
<Link>.dockerignore</Link>
</Content>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Logging\Logging.csproj" />
<ProjectReference Include="..\Tranga\Tranga.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.AspNetCore.Mvc.NewtonsoftJson" Version="7.0.5" />
<PackageReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Design" Version="7.0.6" />
<PackageReference Include="Swashbuckle.AspNetCore" Version="6.5.0" />
</ItemGroup>
</Project>

View File

@ -1,8 +0,0 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
}
}

View File

@ -1,9 +0,0 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.AspNetCore": "Warning"
}
},
"AllowedHosts": "*"
}

View File

@ -1,622 +0,0 @@
using System.Globalization;
using Logging;
using Tranga;
using Tranga.LibraryManagers;
using Tranga.TrangaTasks;
namespace Tranga_CLI;
/*
* This is written with pure hatred for readability.
* At some point do this properly.
* Read at own risk.
*/
public static class Tranga_Cli
{
public static void Main(string[] args)
{
string applicationFolderPath = Path.Join(Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData), "Tranga");
string logsFolderPath = Path.Join(applicationFolderPath, "logs");
string logFilePath = Path.Join(logsFolderPath, $"log-{DateTime.Now:dd-M-yyyy-HH-mm-ss}.txt");
string settingsFilePath = Path.Join(applicationFolderPath, "settings.json");
Directory.CreateDirectory(applicationFolderPath);
Directory.CreateDirectory(logsFolderPath);
Console.WriteLine($"Logfile-Path: {logFilePath}");
Console.WriteLine($"Settings-File-Path: {settingsFilePath}");
Logger logger = new(new[] { Logger.LoggerType.FileLogger }, null, Console.Out.Encoding, logFilePath);
logger.WriteLine("Tranga_CLI", "Loading Taskmanager.");
TrangaSettings settings = File.Exists(settingsFilePath) ? TrangaSettings.LoadSettings(settingsFilePath, logger) : new TrangaSettings(Directory.GetCurrentDirectory(), applicationFolderPath, new HashSet<LibraryManager>());
logger.WriteLine("Tranga_CLI", "User Input");
Console.WriteLine($"Output folder path [{settings.downloadLocation}]:");
string? tmpPath = Console.ReadLine();
while(tmpPath is null)
tmpPath = Console.ReadLine();
if (tmpPath.Length > 0)
settings.downloadLocation = tmpPath;
Console.WriteLine($"Komga BaseURL [{settings.libraryManagers.FirstOrDefault(lm => lm.GetType() == typeof(Komga))?.baseUrl}]:");
string? tmpUrlKomga = Console.ReadLine();
while (tmpUrlKomga is null)
tmpUrlKomga = Console.ReadLine();
if (tmpUrlKomga.Length > 0)
{
Console.WriteLine("Username:");
string? tmpKomgaUser = Console.ReadLine();
while (tmpKomgaUser is null || tmpKomgaUser.Length < 1)
tmpKomgaUser = Console.ReadLine();
Console.WriteLine("Password:");
string tmpKomgaPass = string.Empty;
ConsoleKey key;
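// Read the password key-by-key so it can be masked with '*' and edited with backspace.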
do
{
var keyInfo = Console.ReadKey(intercept: true);
key = keyInfo.Key;
if (key == ConsoleKey.Backspace && tmpKomgaPass.Length > 0)
{
Console.Write("\b \b");
tmpKomgaPass = tmpKomgaPass[0..^1];
}
else if (!char.IsControl(keyInfo.KeyChar))
{
Console.Write("*");
tmpKomgaPass += keyInfo.KeyChar;
}
} while (key != ConsoleKey.Enter);
settings.libraryManagers.RemoveWhere(lm => lm.GetType() == typeof(Komga));
settings.libraryManagers.Add(new Komga(tmpUrlKomga, tmpKomgaUser, tmpKomgaPass, logger));
}
Console.WriteLine($"Kavita BaseURL [{settings.libraryManagers.FirstOrDefault(lm => lm.GetType() == typeof(Kavita))?.baseUrl}]:");
string? tmpUrlKavita = Console.ReadLine();
while (tmpUrlKavita is null)
tmpUrlKavita = Console.ReadLine();
if (tmpUrlKavita.Length > 0)
{
Console.WriteLine("Username:");
string? tmpKavitaUser = Console.ReadLine();
while (tmpKavitaUser is null || tmpKavitaUser.Length < 1)
tmpKavitaUser = Console.ReadLine();
Console.WriteLine("Password:");
string tmpKavitaPass = string.Empty;
ConsoleKey key;
do
{
var keyInfo = Console.ReadKey(intercept: true);
key = keyInfo.Key;
if (key == ConsoleKey.Backspace && tmpKavitaPass.Length > 0)
{
Console.Write("\b \b");
tmpKavitaPass = tmpKavitaPass[0..^1];
}
else if (!char.IsControl(keyInfo.KeyChar))
{
Console.Write("*");
tmpKavitaPass += keyInfo.KeyChar;
}
} while (key != ConsoleKey.Enter);
settings.libraryManagers.RemoveWhere(lm => lm.GetType() == typeof(Kavita));
settings.libraryManagers.Add(new Kavita(tmpUrlKavita, tmpKavitaUser, tmpKavitaPass, logger));
}
logger.WriteLine("Tranga_CLI", "Loaded.");
TaskMode(settings, logger);
}
private static void TaskMode(TrangaSettings settings, Logger logger)
{
TaskManager taskManager = new (settings, logger);
ConsoleKey selection = ConsoleKey.EraseEndOfFile;
PrintMenu(taskManager, taskManager.settings.downloadLocation);
while (selection != ConsoleKey.Q)
{
int taskCount = taskManager.GetAllTasks().Length;
int taskRunningCount = taskManager.GetAllTasks().Count(task => task.state == TrangaTask.ExecutionState.Running);
int taskEnqueuedCount =
taskManager.GetAllTasks().Count(task => task.state == TrangaTask.ExecutionState.Enqueued);
Console.SetCursorPosition(0,1);
Console.WriteLine($"Tasks (Running/Queue/Total)): {taskRunningCount}/{taskEnqueuedCount}/{taskCount}");
if (Console.KeyAvailable)
{
selection = Console.ReadKey().Key;
switch (selection)
{
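// Each key maps to an entry of the menu printed by PrintMenu; Q (checked in the loop condition) exits.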
case ConsoleKey.L:
while (!Console.KeyAvailable)
{
PrintTasks(taskManager.GetAllTasks(), logger);
Console.WriteLine("Press any key.");
Thread.Sleep(500);
}
Console.ReadKey();
break;
case ConsoleKey.C:
CreateTask(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
case ConsoleKey.D:
DeleteTask(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
case ConsoleKey.E:
ExecuteTaskNow(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
case ConsoleKey.S:
SearchTasks(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
case ConsoleKey.R:
while (!Console.KeyAvailable)
{
PrintTasks(
taskManager.GetAllTasks().Where(eTask => eTask.state == TrangaTask.ExecutionState.Running)
.ToArray(), logger);
Console.WriteLine("Press any key.");
Thread.Sleep(500);
}
Console.ReadKey();
break;
case ConsoleKey.K:
while (!Console.KeyAvailable)
{
PrintTasks(
taskManager.GetAllTasks()
.Where(qTask => qTask.state is TrangaTask.ExecutionState.Enqueued)
.ToArray(), logger);
Console.WriteLine("Press any key.");
Thread.Sleep(500);
}
Console.ReadKey();
break;
case ConsoleKey.F:
TailLog(logger);
Console.ReadKey();
break;
case ConsoleKey.G:
RemoveTaskFromQueue(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
case ConsoleKey.B:
AddTaskToQueue(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
case ConsoleKey.M:
AddMangaTaskToQueue(taskManager, logger);
Console.WriteLine("Press any key.");
Console.ReadKey();
break;
}
PrintMenu(taskManager, taskManager.settings.downloadLocation);
}
Thread.Sleep(200);
}
logger.WriteLine("Tranga_CLI", "Exiting.");
Console.Clear();
Console.WriteLine("Exiting.");
if (taskManager.GetAllTasks().Any(task => task.state == TrangaTask.ExecutionState.Running))
{
Console.WriteLine("Force quit (Even with running tasks?) y/N");
selection = Console.ReadKey().Key;
while(selection != ConsoleKey.Y && selection != ConsoleKey.N)
selection = Console.ReadKey().Key;
taskManager.Shutdown(selection == ConsoleKey.Y);
}else
// ReSharper disable once RedundantArgumentDefaultValue Better readability
taskManager.Shutdown(false);
}
private static void PrintMenu(TaskManager taskManager, string folderPath)
{
int taskCount = taskManager.GetAllTasks().Length;
int taskRunningCount = taskManager.GetAllTasks().Count(task => task.state == TrangaTask.ExecutionState.Running);
int taskEnqueuedCount =
taskManager.GetAllTasks().Count(task => task.state == TrangaTask.ExecutionState.Enqueued);
Console.Clear();
Console.WriteLine($"Download Folder: {folderPath}");
Console.WriteLine($"Tasks (Running/Queue/Total)): {taskRunningCount}/{taskEnqueuedCount}/{taskCount}");
Console.WriteLine();
Console.WriteLine($"{"C: Create Task",-30}{"L: List tasks",-30}{"B: Enqueue Task", -30}");
Console.WriteLine($"{"D: Delete Task",-30}{"S: Search Tasks", -30}{"K: List Task Queue", -30}");
Console.WriteLine($"{"E: Execute Task now",-30}{"R: List Running Tasks", -30}{"G: Remove Task from Queue", -30}");
Console.WriteLine($"{"M: New Download Manga Task",-30}{"", -30}{"", -30}");
Console.WriteLine($"{"",-30}{"F: Show Log",-30}{"Q: Exit",-30}");
}
private static void PrintTasks(TrangaTask[] tasks, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Printing Tasks");
int taskCount = tasks.Length;
int taskRunningCount = tasks.Count(task => task.state == TrangaTask.ExecutionState.Running);
int taskEnqueuedCount = tasks.Count(task => task.state == TrangaTask.ExecutionState.Enqueued);
Console.Clear();
int tIndex = 0;
Console.WriteLine($"Tasks (Running/Queue/Total): {taskRunningCount}/{taskEnqueuedCount}/{taskCount}");
string header =
$"{"",-5}{"Task",-20} | {"Last Executed",-20} | {"Reoccurrence",-12} | {"State",-10} | {"Progress",-9} | {"Finished",-20} | {"Remaining",-12} | {"Connector",-15} | Publication/Manga ";
Console.WriteLine(header);
Console.WriteLine(new string('-', header.Length));
foreach (TrangaTask trangaTask in tasks)
{
string[] taskSplit = trangaTask.ToString().Split(", ");
Console.WriteLine($"{tIndex++:000}: {taskSplit[0],-20} | {taskSplit[1],-20} | {taskSplit[2],-12} | {taskSplit[3],-10} | {taskSplit[4],-9} | {taskSplit[5],-20} | {taskSplit[6][..12],-12} | {(taskSplit.Length > 7 ? taskSplit[7] : ""),-15} | {(taskSplit.Length > 8 ? taskSplit[8] : "")} {(taskSplit.Length > 9 ? taskSplit[9] : "")} {(taskSplit.Length > 10 ? taskSplit[10] : "")}");
}
}
private static TrangaTask[] SelectTasks(TrangaTask[] tasks, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Select task");
if (tasks.Length < 1)
{
Console.Clear();
Console.WriteLine("There are no available Tasks.");
logger.WriteLine("Tranga_CLI", "No available Tasks.");
return Array.Empty<TrangaTask>();
}
PrintTasks(tasks, logger);
logger.WriteLine("Tranga_CLI", "Selecting Task to Remove (from queue)");
Console.WriteLine("Enter q to abort");
Console.WriteLine($"Select Task(s) (0-{tasks.Length - 1}):");
string? selectedTask = Console.ReadLine();
while(selectedTask is null || selectedTask.Length < 1)
selectedTask = Console.ReadLine();
if (selectedTask.Length == 1 && selectedTask.ToLower() == "q")
{
Console.Clear();
Console.WriteLine("aborted.");
logger.WriteLine("Tranga_CLI", "aborted");
return Array.Empty<TrangaTask>();
}
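// A dash selects a range: e.g. "2-5" returns tasks 2 through 4 (the end index is exclusive).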
if (selectedTask.Contains('-'))
{
int start = Convert.ToInt32(selectedTask.Split('-')[0]);
int end = Convert.ToInt32(selectedTask.Split('-')[1]);
return tasks[start..end];
}
else
{
int selectedTaskIndex = Convert.ToInt32(selectedTask);
return new[] { tasks[selectedTaskIndex] };
}
}
private static void AddMangaTaskToQueue(TaskManager taskManager, Logger logger)
{
Console.Clear();
logger.WriteLine("Tranga_CLI", "Menu: Add Manga Download to queue");
Connector? connector = SelectConnector(taskManager.GetAvailableConnectors().Values.ToArray(), logger);
if (connector is null)
return;
Publication? publication = SelectPublication(taskManager, connector, logger);
if (publication is null)
return;
TimeSpan reoccurrence = SelectReoccurrence(logger);
logger.WriteLine("Tranga_CLI", "Sending Task to TaskManager");
TrangaTask? newTask = taskManager.AddTask(TrangaTask.Task.DownloadNewChapters, connector.name, publication.Value.publicationId, reoccurrence, "en");
Console.WriteLine(newTask);
}
private static void AddTaskToQueue(TaskManager taskManager, Logger logger)
{
Console.Clear();
logger.WriteLine("Tranga_CLI", "Menu: Add Task to queue");
TrangaTask[] tasks = taskManager.GetAllTasks().Where(rTask =>
rTask.state is not TrangaTask.ExecutionState.Enqueued and not TrangaTask.ExecutionState.Running).ToArray();
TrangaTask[] selectedTasks = SelectTasks(tasks, logger);
logger.WriteLine("Tranga_CLI", $"Sending {selectedTasks.Length} Tasks to TaskManager");
foreach(TrangaTask task in selectedTasks)
taskManager.AddTaskToQueue(task);
}
private static void RemoveTaskFromQueue(TaskManager taskManager, Logger logger)
{
Console.Clear();
logger.WriteLine("Tranga_CLI", "Menu: Remove Task from queue");
TrangaTask[] tasks = taskManager.GetAllTasks().Where(rTask => rTask.state is TrangaTask.ExecutionState.Enqueued).ToArray();
TrangaTask[] selectedTasks = SelectTasks(tasks, logger);
logger.WriteLine("Tranga_CLI", $"Sending {selectedTasks.Length} Tasks to TaskManager");
foreach(TrangaTask task in selectedTasks)
taskManager.RemoveTaskFromQueue(task);
}
private static void TailLog(Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Show Log-lines");
Console.Clear();
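// Print the last 20 log lines, then keep streaming newly written lines until a key is pressed.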
string[] lines = logger.Tail(20);
foreach (string message in lines)
Console.Write(message);
while (!Console.KeyAvailable)
{
string[] newLines = logger.GetNewLines();
foreach(string message in newLines)
Console.Write(message);
Thread.Sleep(40);
}
}
private static void CreateTask(TaskManager taskManager, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Creating Task");
TrangaTask.Task? tmpTask = SelectTaskType(logger);
if (tmpTask is null)
return;
TrangaTask.Task task = (TrangaTask.Task)tmpTask;
Connector? connector = null;
if (task != TrangaTask.Task.UpdateLibraries)
{
connector = SelectConnector(taskManager.GetAvailableConnectors().Values.ToArray(), logger);
if (connector is null)
return;
}
Publication? publication = null;
if (task != TrangaTask.Task.UpdateLibraries)
{
publication = SelectPublication(taskManager, connector!, logger);
if (publication is null)
return;
}
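// DownloadNewChapters becomes a single recurring task; DownloadChapter creates one task per selected chapter.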
if (task is TrangaTask.Task.DownloadNewChapters)
{
TimeSpan reoccurrence = SelectReoccurrence(logger);
logger.WriteLine("Tranga_CLI", "Sending Task to TaskManager");
TrangaTask newTask = new DownloadNewChaptersTask(TrangaTask.Task.DownloadNewChapters, connector!.name, (Publication)publication!, reoccurrence, "en");
taskManager.AddTask(newTask);
Console.WriteLine(newTask);
}else if (task is TrangaTask.Task.DownloadChapter)
{
foreach (Chapter chapter in SelectChapters(connector!, (Publication)publication!, logger))
{
TrangaTask newTask = new DownloadChapterTask(TrangaTask.Task.DownloadChapter, connector!.name,
(Publication)publication!, chapter, "en");
taskManager.AddTask(newTask);
Console.WriteLine(newTask);
}
}
}
private static void ExecuteTaskNow(TaskManager taskManager, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Executing Task");
TrangaTask[] tasks = taskManager.GetAllTasks().Where(nTask => nTask.state is not TrangaTask.ExecutionState.Running).ToArray();
TrangaTask[] selectedTasks = SelectTasks(tasks, logger);
logger.WriteLine("Tranga_CLI", $"Sending {selectedTasks.Length} Tasks to TaskManager");
foreach(TrangaTask task in selectedTasks)
taskManager.ExecuteTaskNow(task);
}
private static void DeleteTask(TaskManager taskManager, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Delete Task");
TrangaTask[] tasks = taskManager.GetAllTasks();
TrangaTask[] selectedTasks = SelectTasks(tasks, logger);
logger.WriteLine("Tranga_CLI", $"Sending {selectedTasks.Length} Tasks to TaskManager");
foreach(TrangaTask task in selectedTasks)
taskManager.DeleteTask(task);
}
private static TrangaTask.Task? SelectTaskType(Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Select TaskType");
Console.Clear();
string[] taskNames = Enum.GetNames<TrangaTask.Task>();
int tIndex = 0;
Console.WriteLine("Available Tasks:");
foreach (string taskName in taskNames)
Console.WriteLine($"{tIndex++}: {taskName}");
Console.WriteLine("Enter q to abort");
Console.WriteLine($"Select Task (0-{taskNames.Length - 1}):");
string? selectedTask = Console.ReadLine();
while(selectedTask is null || selectedTask.Length < 1)
selectedTask = Console.ReadLine();
if (selectedTask.Length == 1 && selectedTask.ToLower() == "q")
{
Console.Clear();
Console.WriteLine("aborted.");
logger.WriteLine("Tranga_CLI", "aborted.");
return null;
}
try
{
int selectedTaskIndex = Convert.ToInt32(selectedTask);
string selectedTaskName = taskNames[selectedTaskIndex];
return Enum.Parse<TrangaTask.Task>(selectedTaskName);
}
catch (Exception e)
{
Console.WriteLine($"Exception: {e.Message}");
logger.WriteLine("Tranga_CLI", e.Message);
}
return null;
}
private static TimeSpan SelectReoccurrence(Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Select Reoccurrence");
Console.WriteLine("Select reoccurrence Timer (Format hh:mm:ss):");
return TimeSpan.Parse(Console.ReadLine()!, new CultureInfo("en-US"));
}
private static Chapter[] SelectChapters(Connector connector, Publication publication, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Select Chapters");
Chapter[] availableChapters = connector.GetChapters(publication, "en");
int cIndex = 0;
Console.WriteLine("Chapters:");
foreach(Chapter chapter in availableChapters)
Console.WriteLine($"{cIndex++}: Vol.{chapter.volumeNumber} Ch.{chapter.chapterNumber} - {chapter.name}");
Console.WriteLine("Enter q to abort");
Console.WriteLine($"Select Chapter(s):");
string? selectedChapters = Console.ReadLine();
while(selectedChapters is null || selectedChapters.Length < 1)
selectedChapters = Console.ReadLine();
if (selectedChapters.Length == 1 && selectedChapters.ToLower() == "q")
{
Console.Clear();
Console.WriteLine("aborted.");
logger.WriteLine("Tranga_CLI", "aborted.");
return Array.Empty<Chapter>();
}
if (selectedChapters.Contains('-'))
{
int start = Convert.ToInt32(selectedChapters.Split('-')[0]);
int end = Convert.ToInt32(selectedChapters.Split('-')[1]);
return availableChapters[start..end];
}
else
return new Chapter[] { availableChapters[Convert.ToInt32(selectedChapters)] };
}
private static Connector? SelectConnector(Connector[] connectors, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Select Connector");
Console.Clear();
int cIndex = 0;
Console.WriteLine("Connectors:");
foreach (Connector connector in connectors)
Console.WriteLine($"{cIndex++}: {connector.name}");
Console.WriteLine("Enter q to abort");
Console.WriteLine($"Select Connector (0-{connectors.Length - 1}):");
string? selectedConnector = Console.ReadLine();
while(selectedConnector is null || selectedConnector.Length < 1)
selectedConnector = Console.ReadLine();
if (selectedConnector.Length == 1 && selectedConnector.ToLower() == "q")
{
Console.Clear();
Console.WriteLine("aborted.");
logger.WriteLine("Tranga_CLI", "aborted.");
return null;
}
try
{
int selectedConnectorIndex = Convert.ToInt32(selectedConnector);
return connectors[selectedConnectorIndex];
}
catch (Exception e)
{
Console.WriteLine($"Exception: {e.Message}");
logger.WriteLine("Tranga_CLI", e.Message);
}
return null;
}
private static Publication? SelectPublication(TaskManager taskManager, Connector connector, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Select Publication");
Console.Clear();
Console.WriteLine($"Connector: {connector.name}");
Console.WriteLine("Publication search query (leave empty for all):");
string? query = Console.ReadLine();
Publication[] publications = taskManager.GetPublicationsFromConnector(connector, query ?? "");
if (publications.Length < 1)
{
logger.WriteLine("Tranga_CLI", "No publications returned");
Console.WriteLine($"No publications for query '{query}' returned;");
return null;
}
int pIndex = 0;
Console.WriteLine("Publications:");
foreach(Publication publication in publications)
Console.WriteLine($"{pIndex++}: {publication.sortName}");
Console.WriteLine("Enter q to abort");
Console.WriteLine($"Select publication to Download (0-{publications.Length - 1}):");
string? selectedPublication = Console.ReadLine();
while(selectedPublication is null || selectedPublication.Length < 1)
selectedPublication = Console.ReadLine();
if (selectedPublication.Length == 1 && selectedPublication.ToLower() == "q")
{
Console.Clear();
Console.WriteLine("aborted.");
logger.WriteLine("Tranga_CLI", "aborted.");
return null;
}
try
{
int selectedPublicationIndex = Convert.ToInt32(selectedPublication);
return publications[selectedPublicationIndex];
}
catch (Exception e)
{
Console.WriteLine($"Exception: {e.Message}");
logger.WriteLine("Tranga_CLI", e.Message);
}
return null;
}
private static void SearchTasks(TaskManager taskManager, Logger logger)
{
logger.WriteLine("Tranga_CLI", "Menu: Search task");
Console.Clear();
Console.WriteLine("Enter search query:");
string? query = Console.ReadLine();
while (query is null || query.Length < 4)
query = Console.ReadLine();
PrintTasks(taskManager.GetAllTasks().Where(qTask =>
qTask.ToString().Contains(query, StringComparison.OrdinalIgnoreCase)).ToArray(), logger);
}
}

View File

@ -2,11 +2,9 @@
Microsoft Visual Studio Solution File, Format Version 12.00
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Tranga", ".\Tranga\Tranga.csproj", "{545E81B9-D96B-4C8F-A97F-2C02414DE566}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Tranga-CLI", "Tranga-CLI\Tranga-CLI.csproj", "{4899E3B2-B259-479A-B43E-042D043E9501}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Logging", "Logging\Logging.csproj", "{415BE889-BB7D-426F-976F-8D977876A462}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Tranga-API", "Tranga-API\Tranga-API.csproj", "{48F4E495-75BC-4402-8E03-DEC5B79D7E83}"
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "CLI", "CLI\CLI.csproj", "{4324C816-F9D2-468F-8ED6-397FE2F0DCB3}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
@ -18,17 +16,13 @@ Global
{545E81B9-D96B-4C8F-A97F-2C02414DE566}.Debug|Any CPU.Build.0 = Debug|Any CPU
{545E81B9-D96B-4C8F-A97F-2C02414DE566}.Release|Any CPU.ActiveCfg = Release|Any CPU
{545E81B9-D96B-4C8F-A97F-2C02414DE566}.Release|Any CPU.Build.0 = Release|Any CPU
{4899E3B2-B259-479A-B43E-042D043E9501}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{4899E3B2-B259-479A-B43E-042D043E9501}.Debug|Any CPU.Build.0 = Debug|Any CPU
{4899E3B2-B259-479A-B43E-042D043E9501}.Release|Any CPU.ActiveCfg = Release|Any CPU
{4899E3B2-B259-479A-B43E-042D043E9501}.Release|Any CPU.Build.0 = Release|Any CPU
{415BE889-BB7D-426F-976F-8D977876A462}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{415BE889-BB7D-426F-976F-8D977876A462}.Debug|Any CPU.Build.0 = Debug|Any CPU
{415BE889-BB7D-426F-976F-8D977876A462}.Release|Any CPU.ActiveCfg = Release|Any CPU
{415BE889-BB7D-426F-976F-8D977876A462}.Release|Any CPU.Build.0 = Release|Any CPU
{48F4E495-75BC-4402-8E03-DEC5B79D7E83}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{48F4E495-75BC-4402-8E03-DEC5B79D7E83}.Debug|Any CPU.Build.0 = Debug|Any CPU
{48F4E495-75BC-4402-8E03-DEC5B79D7E83}.Release|Any CPU.ActiveCfg = Release|Any CPU
{48F4E495-75BC-4402-8E03-DEC5B79D7E83}.Release|Any CPU.Build.0 = Release|Any CPU
{4324C816-F9D2-468F-8ED6-397FE2F0DCB3}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{4324C816-F9D2-468F-8ED6-397FE2F0DCB3}.Debug|Any CPU.Build.0 = Debug|Any CPU
{4324C816-F9D2-468F-8ED6-397FE2F0DCB3}.Release|Any CPU.ActiveCfg = Release|Any CPU
{4324C816-F9D2-468F-8ED6-397FE2F0DCB3}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
EndGlobal

View File

@ -1,6 +1,14 @@
<wpf:ResourceDictionary xml:space="preserve" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:s="clr-namespace:System;assembly=mscorlib" xmlns:ss="urn:shemas-jetbrains-com:settings-storage-xaml" xmlns:wpf="http://schemas.microsoft.com/winfx/2006/xaml/presentation">
<s:Boolean x:Key="/Default/UserDictionary/Words/=altnames/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=authorsartists/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Gotify/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=jjob/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Komga/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=lunasea/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=mangakatana/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Manganato/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Mangasee/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Mangaworld/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Ntfy/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Taskmanager/@EntryIndexedValue">True</s:Boolean>
<s:Boolean x:Key="/Default/UserDictionary/Words/=Tranga/@EntryIndexedValue">True</s:Boolean></wpf:ResourceDictionary>

View File

@ -1,5 +1,5 @@
using System.Globalization;
using System.Text.RegularExpressions;
using System.Text.RegularExpressions;
using System.Xml.Linq;
namespace Tranga;
@ -7,34 +7,128 @@ namespace Tranga;
/// Has to be Part of a publication
/// Includes the Chapter-Name, -VolumeNumber, -ChapterNumber, the location of the chapter on the internet and the saveName of the local file.
/// </summary>
public struct Chapter
public readonly struct Chapter : IComparable
{
// ReSharper disable once MemberCanBePrivate.Global
public Manga parentManga { get; }
public string? name { get; }
public string? volumeNumber { get; }
public string? chapterNumber { get; }
public string volumeNumber { get; }
public string chapterNumber { get; }
public string url { get; }
// ReSharper disable once MemberCanBePrivate.Global
public string fileName { get; }
public string sortNumber { get; }
private static readonly Regex LegalCharacters = new Regex(@"([A-z]*[0-9]* *\.*-*,*\]*\[*'*\'*\)*\(*~*!*)*");
public Chapter(string? name, string? volumeNumber, string? chapterNumber, string url)
private static readonly Regex LegalCharacters = new (@"([A-z]*[0-9]* *\.*-*,*\]*\[*'*\'*\)*\(*~*!*)*");
private static readonly Regex IllegalStrings = new(@"(Vol(ume)?|Ch(apter)?)\.?", RegexOptions.IgnoreCase);
private static readonly Regex Digits = new(@"[0-9\.]*");
public Chapter(Manga parentManga, string? name, string? volumeNumber, string chapterNumber, string url)
{
this.parentManga = parentManga;
this.name = name;
this.volumeNumber = volumeNumber;
this.chapterNumber = chapterNumber;
this.volumeNumber = volumeNumber is not null ? string.Concat(Digits.Matches(volumeNumber).Select(x => x.Value)) : "0";
this.chapterNumber = string.Concat(Digits.Matches(chapterNumber).Select(x => x.Value));
this.url = url;
NumberFormatInfo nfi = new NumberFormatInfo()
{
NumberDecimalSeparator = "."
};
sortNumber = decimal.Round(Convert.ToDecimal(this.volumeNumber ?? "1") * Convert.ToDecimal(this.chapterNumber, nfi), 1)
.ToString(nfi);
string chapterVolNumStr;
if (volumeNumber is not null && volumeNumber.Length > 0)
chapterVolNumStr = $"Vol.{volumeNumber} Ch.{chapterNumber}";
else
chapterVolNumStr = $"Ch.{chapterNumber}";
string chapterName = string.Concat(LegalCharacters.Matches(name ?? ""));
string volStr = this.volumeNumber is not null ? $"Vol.{this.volumeNumber} " : "";
string chNumberStr = this.chapterNumber is not null ? $"Ch.{chapterNumber} " : "";
string chNameStr = chapterName.Length > 0 ? $"- {chapterName}" : "";
chNameStr = chNameStr.Replace("Volume", "").Replace("volume", "");
this.fileName = $"{volStr}{chNumberStr}{chNameStr}";
if (name is not null && name.Length > 0)
{
string chapterName = IllegalStrings.Replace(string.Concat(LegalCharacters.Matches(name)), "");
this.fileName = $"{chapterVolNumStr} - {chapterName}";
}
else
this.fileName = chapterVolNumStr;
}
public override string ToString()
{
return $"Chapter {parentManga.sortName} {parentManga.internalId} {chapterNumber} {name}";
}
public override bool Equals(object? obj)
{
if (obj is not Chapter)
return false;
return CompareTo(obj) == 0;
}
public int CompareTo(object? obj)
{
if(obj is not Chapter otherChapter)
throw new ArgumentException($"{obj} can not be compared to {this}");
if (float.TryParse(volumeNumber, GlobalBase.numberFormatDecimalPoint, out float volumeNumberFloat) &&
float.TryParse(chapterNumber, GlobalBase.numberFormatDecimalPoint, out float chapterNumberFloat) &&
float.TryParse(otherChapter.volumeNumber, GlobalBase.numberFormatDecimalPoint,
out float otherVolumeNumberFloat) &&
float.TryParse(otherChapter.chapterNumber, GlobalBase.numberFormatDecimalPoint,
out float otherChapterNumberFloat))
{
return volumeNumberFloat.CompareTo(otherVolumeNumberFloat) switch
{
<0 => -1,
>0 => 1,
_ => chapterNumberFloat.CompareTo(otherChapterNumberFloat)
};
}
else throw new FormatException($"Value could not be parsed");
}
/// <summary>
/// Checks if a chapter-archive is already present
/// </summary>
/// <returns>true if chapter is present</returns>
internal bool CheckChapterIsDownloaded()
{
string mangaDirectory = Path.Join(TrangaSettings.downloadLocation, parentManga.folderName);
if (!Directory.Exists(mangaDirectory))
return false;
FileInfo[] archives = new DirectoryInfo(mangaDirectory).GetFiles("*.cbz");
Regex volChRex = new(@"(?:Vol(?:ume)?\.([0-9]+)\D*)?Ch(?:apter)?\.([0-9]+(?:\.[0-9]+)*)");
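// Matches an optional "Vol.<number>" (or "Volume.") prefix followed by "Ch.<number>" (or "Chapter.") in existing archive names; group 2 captures the chapter number.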
Chapter t = this;
string correctPath = GetArchiveFilePath();
FileInfo? archive = archives.FirstOrDefault(archive =>
{
Match m = volChRex.Match(archive.Name);
/*Uncommenting this section will only allow *Version without Volume number* -> *Version with Volume number* but not the other way
if (m.Groups[1].Success)
return m.Groups[1].Value == t.volumeNumber && m.Groups[2].Value == t.chapterNumber;
else*/
return m.Groups[2].Value == t.chapterNumber;
});
if(archive is not null && archive.FullName != correctPath)
archive.MoveTo(correctPath, true);
return (archive is not null);
}
/// <summary>
/// Creates full file path of chapter-archive
/// </summary>
/// <returns>Filepath</returns>
internal string GetArchiveFilePath()
{
return Path.Join(TrangaSettings.downloadLocation, parentManga.folderName, $"{parentManga.folderName} - {this.fileName}.cbz");
}
/// <summary>
/// Creates a string containing XML of publication and chapter.
/// See ComicInfo.xml
/// </summary>
/// <returns>XML-string</returns>
internal string GetComicInfoXmlString()
{
XElement comicInfo = new XElement("ComicInfo",
new XElement("Tags", string.Join(',', parentManga.tags)),
new XElement("LanguageISO", parentManga.originalLanguage),
new XElement("Title", this.name),
new XElement("Writer", string.Join(',', parentManga.authors)),
new XElement("Volume", this.volumeNumber),
new XElement("Number", this.chapterNumber)
);
return comicInfo.ToString();
}
}

View File

@ -1,298 +0,0 @@
using System.IO.Compression;
using System.Net;
using System.Runtime.InteropServices;
using System.Text.RegularExpressions;
using System.Xml.Linq;
using Logging;
using Tranga.TrangaTasks;
using static System.IO.UnixFileMode;
namespace Tranga;
/// <summary>
/// Base-Class for all Connectors
/// Provides some methods to be used by all Connectors, as well as a DownloadClient
/// </summary>
public abstract class Connector
{
internal string downloadLocation { get; } //Location of local files
protected DownloadClient downloadClient { get; init; }
protected readonly Logger? logger;
protected readonly string imageCachePath;
protected Connector(string downloadLocation, string imageCachePath, Logger? logger)
{
this.downloadLocation = downloadLocation;
this.logger = logger;
this.downloadClient = new DownloadClient(new Dictionary<byte, int>()
{
//RequestTypes for RateLimits
}, logger);
this.imageCachePath = imageCachePath;
if (!Directory.Exists(imageCachePath))
Directory.CreateDirectory(this.imageCachePath);
}
public abstract string name { get; } //Name of the Connector (e.g. Website)
/// <summary>
/// Returns all Publications matching the given string.
/// If the string is empty or null, returns all Publications of the Connector
/// </summary>
/// <param name="publicationTitle">Search-Query</param>
/// <returns>Publications matching the query</returns>
public abstract Publication[] GetPublications(string publicationTitle = "");
/// <summary>
/// Returns all Chapters of the publication in the provided language.
/// If the language is empty or null, returns all Chapters in all Languages.
/// </summary>
/// <param name="publication">Publication to get Chapters for</param>
/// <param name="language">Language of the Chapters</param>
/// <returns>Array of Chapters matching Publication and Language</returns>
public abstract Chapter[] GetChapters(Publication publication, string language = "");
/// <summary>
/// Retrieves the Chapter (+Images) from the website.
/// Should later call DownloadChapterImages to retrieve the individual Images of the Chapter and create a .cbz archive.
/// </summary>
/// <param name="publication">Publication that contains Chapter</param>
/// <param name="chapter">Chapter with Images to retrieve</param>
/// <param name="parentTask">Will be used for progress-tracking</param>
public abstract void DownloadChapter(Publication publication, Chapter chapter, DownloadChapterTask parentTask);
/// <summary>
/// Copies the already downloaded cover from cache to downloadLocation
/// </summary>
/// <param name="publication">Publication to retrieve Cover for</param>
/// <param name="settings">TrangaSettings</param>
public void CopyCoverFromCacheToDownloadLocation(Publication publication, TrangaSettings settings)
{
logger?.WriteLine(this.GetType().ToString(), $"Cloning cover {publication.sortName} {publication.internalId}");
//Check if Publication already has a Folder and cover
string publicationFolder = publication.CreatePublicationFolder(downloadLocation);
DirectoryInfo dirInfo = new (publicationFolder);
if (dirInfo.EnumerateFiles().Any(info => info.Name.Contains("cover.")))
{
logger?.WriteLine(this.GetType().ToString(), $"Cover exists {publication.sortName}");
return;
}
string fileInCache = Path.Join(settings.coverImageCache, publication.coverFileNameInCache);
string newFilePath = Path.Join(publicationFolder, $"cover.{Path.GetFileName(fileInCache).Split('.')[^1]}" );
logger?.WriteLine(this.GetType().ToString(), $"Cloning cover {fileInCache} -> {newFilePath}");
File.Copy(fileInCache, newFilePath, true);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(newFilePath, GroupRead | GroupWrite | OtherRead | OtherWrite | UserRead | UserWrite);
}
/// <summary>
/// Creates a string containing XML of publication and chapter.
/// See ComicInfo.xml
/// </summary>
/// <returns>XML-string</returns>
protected static string GetComicInfoXmlString(Publication publication, Chapter chapter, Logger? logger)
{
logger?.WriteLine("Connector", $"Creating ComicInfo.Xml for {publication.sortName} {publication.internalId} {chapter.volumeNumber}-{chapter.chapterNumber}");
XElement comicInfo = new XElement("ComicInfo",
new XElement("Tags", string.Join(',',publication.tags)),
new XElement("LanguageISO", publication.originalLanguage),
new XElement("Title", chapter.name),
new XElement("Writer", publication.author),
new XElement("Volume", chapter.volumeNumber),
new XElement("Number", chapter.chapterNumber)
);
return comicInfo.ToString();
}
/// <summary>
/// Checks if a chapter-archive is already present
/// </summary>
/// <returns>true if chapter is present</returns>
public bool CheckChapterIsDownloaded(Publication publication, Chapter chapter)
{
Regex legalCharacters = new Regex(@"([A-z]*[0-9]* *\.*-*,*\]*\[*'*\'*\)*\(*~*!*)*");
string oldFilePath = Path.Join(downloadLocation, publication.folderName, $"{string.Concat(legalCharacters.Matches(chapter.name ?? ""))} - V{chapter.volumeNumber}C{chapter.chapterNumber} - {chapter.sortNumber}.cbz");
string oldFilePath2 = Path.Join(downloadLocation, publication.folderName, $"{string.Concat(legalCharacters.Matches(chapter.name ?? ""))} - VC{chapter.chapterNumber} - {chapter.chapterNumber}.cbz");
string newFilePath = GetArchiveFilePath(publication, chapter);
if (File.Exists(oldFilePath))
File.Move(oldFilePath, newFilePath);
else if (File.Exists(oldFilePath2))
File.Move(oldFilePath2, newFilePath);
return File.Exists(newFilePath);
}
/// <summary>
/// Creates full file path of chapter-archive
/// </summary>
/// <returns>Filepath</returns>
protected string GetArchiveFilePath(Publication publication, Chapter chapter)
{
return Path.Join(downloadLocation, publication.folderName, $"{publication.folderName} - {chapter.fileName}.cbz");
}
/// <summary>
/// Downloads Image from URL and saves it to the given path(incl. fileName)
/// </summary>
/// <param name="imageUrl"></param>
/// <param name="fullPath"></param>
/// <param name="requestType">RequestType for Rate-Limit</param>
/// <param name="referrer">referrer used in html request header</param>
private void DownloadImage(string imageUrl, string fullPath, byte requestType, string? referrer = null)
{
DownloadClient.RequestResult requestResult = downloadClient.MakeRequest(imageUrl, requestType, referrer);
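// Only write the file when the request returned a content stream; otherwise just log the failure.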
if (requestResult.result != Stream.Null)
{
byte[] buffer = new byte[requestResult.result.Length];
requestResult.result.ReadExactly(buffer, 0, buffer.Length);
File.WriteAllBytes(fullPath, buffer);
}else
logger?.WriteLine(this.GetType().ToString(), "No Stream-Content in result.");
}
/// <summary>
/// Downloads all Images from URLs, Compresses to zip(cbz) and saves.
/// </summary>
/// <param name="imageUrls">List of URLs to download Images from</param>
/// <param name="saveArchiveFilePath">Full path to save archive to (without file ending .cbz)</param>
/// <param name="parentTask">Used for progress tracking</param>
/// <param name="comicInfoPath">Path of the generate Chapter ComicInfo.xml, if it was generated</param>
/// <param name="requestType">RequestType for RateLimits</param>
/// <param name="referrer">Used in http request header</param>
protected void DownloadChapterImages(string[] imageUrls, string saveArchiveFilePath, byte requestType, DownloadChapterTask parentTask, string? comicInfoPath = null, string? referrer = null)
{
logger?.WriteLine("Connector", $"Downloading Images for {saveArchiveFilePath}");
//Check if Publication Directory already exists
string directoryPath = Path.GetDirectoryName(saveArchiveFilePath)!;
if (!Directory.Exists(directoryPath))
Directory.CreateDirectory(directoryPath);
if (File.Exists(saveArchiveFilePath)) //Don't download twice.
return;
//Create a temporary folder to store images
string tempFolder = Directory.CreateTempSubdirectory().FullName;
int chapter = 0;
//Download all Images to temporary Folder
foreach (string imageUrl in imageUrls)
{
string[] split = imageUrl.Split('.');
string extension = split[^1];
logger?.WriteLine("Connector", $"Downloading Image {chapter + 1:000}/{imageUrls.Length:000} {parentTask.publication.sortName} {parentTask.publication.internalId} Vol.{parentTask.chapter.volumeNumber} Ch.{parentTask.chapter.chapterNumber} {parentTask.progress:P2}");
DownloadImage(imageUrl, Path.Join(tempFolder, $"{chapter++}.{extension}"), requestType, referrer);
parentTask.IncrementProgress(1f / imageUrls.Length);
}
if(comicInfoPath is not null)
File.Copy(comicInfoPath, Path.Join(tempFolder, "ComicInfo.xml"));
logger?.WriteLine("Connector", $"Creating archive {saveArchiveFilePath}");
//ZIP-it and ship-it
ZipFile.CreateFromDirectory(tempFolder, saveArchiveFilePath);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(saveArchiveFilePath, GroupRead | GroupWrite | OtherRead | OtherWrite | UserRead | UserWrite);
Directory.Delete(tempFolder, true); //Cleanup
}
protected string SaveCoverImageToCache(string url, byte requestType)
{
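// The last URL segment is used as the cache file name; skip the download if that file is already cached.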
string[] split = url.Split('/');
string filename = split[^1];
string saveImagePath = Path.Join(imageCachePath, filename);
if (File.Exists(saveImagePath))
return filename;
DownloadClient.RequestResult coverResult = downloadClient.MakeRequest(url, requestType);
using MemoryStream ms = new();
coverResult.result.CopyTo(ms);
File.WriteAllBytes(saveImagePath, ms.ToArray());
logger?.WriteLine(this.GetType().ToString(), $"Saving image to {saveImagePath}");
return filename;
}
protected class DownloadClient
{
private static readonly HttpClient Client = new();
private readonly Dictionary<byte, DateTime> _lastExecutedRateLimit;
private readonly Dictionary<byte, TimeSpan> _rateLimit;
private Logger? logger;
/// <summary>
/// Creates a httpClient
/// </summary>
/// <param name="rateLimitRequestsPerMinute">Rate limits for requests. byte is RequestType, int maximum requests per minute for RequestType</param>
/// <param name="logger"></param>
public DownloadClient(Dictionary<byte, int> rateLimitRequestsPerMinute, Logger? logger)
{
this.logger = logger;
_lastExecutedRateLimit = new();
_rateLimit = new();
foreach(KeyValuePair<byte, int> limit in rateLimitRequestsPerMinute)
_rateLimit.Add(limit.Key, TimeSpan.FromMinutes(1).Divide(limit.Value));
}
/// <summary>
/// Request Webpage
/// </summary>
/// <param name="url"></param>
/// <param name="requestType">For RateLimits: Same Endpoints use same type</param>
/// <param name="referrer">Used in http request header</param>
/// <returns>RequestResult with StatusCode and Stream of received data</returns>
public RequestResult MakeRequest(string url, byte requestType, string? referrer = null)
{
if (_rateLimit.TryGetValue(requestType, out TimeSpan value))
_lastExecutedRateLimit.TryAdd(requestType, DateTime.Now.Subtract(value));
else
{
logger?.WriteLine(this.GetType().ToString(), "RequestType not configured for rate-limit.");
return new RequestResult(HttpStatusCode.NotAcceptable, Stream.Null);
}
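// Sleep just long enough that consecutive requests of this type stay within the configured requests-per-minute limit.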
TimeSpan rateLimitTimeout = _rateLimit[requestType]
.Subtract(DateTime.Now.Subtract(_lastExecutedRateLimit[requestType]));
if(rateLimitTimeout > TimeSpan.Zero)
Thread.Sleep(rateLimitTimeout);
HttpResponseMessage? response = null;
while (response is null)
{
try
{
HttpRequestMessage requestMessage = new(HttpMethod.Get, url);
if(referrer is not null)
requestMessage.Headers.Referrer = new Uri(referrer);
_lastExecutedRateLimit[requestType] = DateTime.Now;
response = Client.Send(requestMessage);
}
catch (HttpRequestException e)
{
logger?.WriteLine(this.GetType().ToString(), e.Message);
logger?.WriteLine(this.GetType().ToString(), $"Waiting {_rateLimit[requestType] * 2}... Retrying.");
Thread.Sleep(_rateLimit[requestType] * 2);
}
}
Stream resultString = response.IsSuccessStatusCode ? response.Content.ReadAsStream() : Stream.Null;
if (!response.IsSuccessStatusCode)
logger?.WriteLine(this.GetType().ToString(), $"Request-Error {response.StatusCode}: {response.ReasonPhrase}");
return new RequestResult(response.StatusCode, resultString);
}
public struct RequestResult
{
public HttpStatusCode statusCode { get; }
public Stream result { get; }
public RequestResult(HttpStatusCode statusCode, Stream result)
{
this.statusCode = statusCode;
this.result = result;
}
}
}
}

View File

@ -1,277 +0,0 @@
using System.Globalization;
using System.Net;
using System.Text.Json;
using System.Text.Json.Nodes;
using Logging;
using Tranga.TrangaTasks;
namespace Tranga.Connectors;
public class MangaDex : Connector
{
public override string name { get; }
private enum RequestType : byte
{
Manga,
Feed,
AtHomeServer,
CoverUrl,
Author,
}
public MangaDex(string downloadLocation, string imageCachePath, Logger? logger) : base(downloadLocation, imageCachePath, logger)
{
name = "MangaDex";
this.downloadClient = new DownloadClient(new Dictionary<byte, int>()
{
{(byte)RequestType.Manga, 250},
{(byte)RequestType.Feed, 250},
{(byte)RequestType.AtHomeServer, 40},
{(byte)RequestType.CoverUrl, 250},
{(byte)RequestType.Author, 250}
}, logger);
}
public override Publication[] GetPublications(string publicationTitle = "")
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Publications (title={publicationTitle})");
const int limit = 100; //How many values we want returned at once
int offset = 0; //"Page"
int total = int.MaxValue; //How many total results are there, is updated on first request
HashSet<Publication> publications = new();
int loadedPublicationData = 0;
while (offset < total) //As long as we haven't requested all "Pages"
{
//Request next Page
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(
$"https://api.mangadex.org/manga?limit={limit}&title={publicationTitle}&offset={offset}", (byte)RequestType.Manga);
if (requestResult.statusCode != HttpStatusCode.OK)
break;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
offset += limit;
if (result is null)
break;
total = result["total"]!.GetValue<int>(); //Update the total number of Publications
JsonArray mangaInResult = result["data"]!.AsArray(); //Manga-data-Array
//Loop each Manga and extract information from JSON
foreach (JsonNode? mangeNode in mangaInResult)
{
logger?.WriteLine(this.GetType().ToString(), $"Getting publication data. {++loadedPublicationData}/{total}");
JsonObject manga = (JsonObject)mangeNode!;
JsonObject attributes = manga["attributes"]!.AsObject();
string publicationId = manga["id"]!.GetValue<string>();
string title = attributes["title"]!.AsObject().ContainsKey("en") && attributes["title"]!["en"] is not null
? attributes["title"]!["en"]!.GetValue<string>()
: attributes["title"]![((IDictionary<string, JsonNode?>)attributes["title"]!.AsObject()).Keys.First()]!.GetValue<string>();
string? description = attributes["description"]!.AsObject().ContainsKey("en") && attributes["description"]!["en"] is not null
? attributes["description"]!["en"]!.GetValue<string?>()
: null;
JsonArray altTitlesObject = attributes["altTitles"]!.AsArray();
Dictionary<string, string> altTitlesDict = new();
foreach (JsonNode? altTitleNode in altTitlesObject)
{
JsonObject altTitleObject = (JsonObject)altTitleNode!;
string key = ((IDictionary<string, JsonNode?>)altTitleObject).Keys.ToArray()[0];
altTitlesDict.TryAdd(key, altTitleObject[key]!.GetValue<string>());
}
JsonArray tagsObject = attributes["tags"]!.AsArray();
HashSet<string> tags = new();
foreach (JsonNode? tagNode in tagsObject)
{
JsonObject tagObject = (JsonObject)tagNode!;
if(tagObject["attributes"]!["name"]!.AsObject().ContainsKey("en"))
tags.Add(tagObject["attributes"]!["name"]!["en"]!.GetValue<string>());
}
string? posterId = null;
string? authorId = null;
if (manga.ContainsKey("relationships") && manga["relationships"] is not null)
{
JsonArray relationships = manga["relationships"]!.AsArray();
posterId = relationships.FirstOrDefault(relationship => relationship!["type"]!.GetValue<string>() == "cover_art")!["id"]!.GetValue<string>();
authorId = relationships.FirstOrDefault(relationship => relationship!["type"]!.GetValue<string>() == "author")!["id"]!.GetValue<string>();
}
string? coverUrl = GetCoverUrl(publicationId, posterId);
string? coverCacheName = null;
if (coverUrl is not null)
coverCacheName = SaveCoverImageToCache(coverUrl, (byte)RequestType.AtHomeServer);
string? author = GetAuthor(authorId);
Dictionary<string, string> linksDict = new();
if (attributes.ContainsKey("links") && attributes["links"] is not null)
{
JsonObject linksObject = attributes["links"]!.AsObject();
foreach (string key in ((IDictionary<string, JsonNode?>)linksObject).Keys)
{
linksDict.Add(key, linksObject[key]!.GetValue<string>());
}
}
int? year = attributes.ContainsKey("year") && attributes["year"] is not null
? attributes["year"]!.GetValue<int?>()
: null;
string? originalLanguage = attributes.ContainsKey("originalLanguage") && attributes["originalLanguage"] is not null
? attributes["originalLanguage"]!.GetValue<string?>()
: null;
string status = attributes["status"]!.GetValue<string>();
Publication pub = new (
title,
author,
description,
altTitlesDict,
tags.ToArray(),
coverUrl,
coverCacheName,
linksDict,
year,
originalLanguage,
status,
publicationId
);
publications.Add(pub); //Add Publication (Manga) to result
}
}
logger?.WriteLine(this.GetType().ToString(), $"Done getting publications (title={publicationTitle})");
return publications.ToArray();
}
public override Chapter[] GetChapters(Publication publication, string language = "")
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Chapters for {publication.sortName} {publication.internalId} (language={language})");
const int limit = 100; //How many values we want returned at once
int offset = 0; //"Page"
int total = int.MaxValue; //How many total results are there, is updated on first request
List<Chapter> chapters = new();
//As long as we haven't requested all "Pages"
while (offset < total)
{
//Request next "Page"
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(
$"https://api.mangadex.org/manga/{publication.publicationId}/feed?limit={limit}&offset={offset}&translatedLanguage%5B%5D={language}", (byte)RequestType.Feed);
if (requestResult.statusCode != HttpStatusCode.OK)
break;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
offset += limit;
if (result is null)
break;
total = result["total"]!.GetValue<int>();
JsonArray chaptersInResult = result["data"]!.AsArray();
//Loop through all Chapters in result and extract information from JSON
foreach (JsonNode? jsonNode in chaptersInResult)
{
JsonObject chapter = (JsonObject)jsonNode!;
JsonObject attributes = chapter["attributes"]!.AsObject();
string chapterId = chapter["id"]!.GetValue<string>();
string? title = attributes.ContainsKey("title") && attributes["title"] is not null
? attributes["title"]!.GetValue<string>()
: null;
string? volume = attributes.ContainsKey("volume") && attributes["volume"] is not null
? attributes["volume"]!.GetValue<string>()
: null;
string? chapterNum = attributes.ContainsKey("chapter") && attributes["chapter"] is not null
? attributes["chapter"]!.GetValue<string>()
: null;
chapters.Add(new Chapter(title, volume, chapterNum, chapterId));
}
}
//Return Chapters ordered by Chapter-Number
NumberFormatInfo chapterNumberFormatInfo = new()
{
NumberDecimalSeparator = "."
};
logger?.WriteLine(this.GetType().ToString(), $"Done getting Chapters for {publication.internalId}");
return chapters.OrderBy(chapter => Convert.ToSingle(chapter.chapterNumber, chapterNumberFormatInfo)).ToArray();
}
public override void DownloadChapter(Publication publication, Chapter chapter, DownloadChapterTask parentTask)
{
logger?.WriteLine(this.GetType().ToString(), $"Downloading Chapter-Info {publication.sortName} {publication.internalId} {chapter.volumeNumber}-{chapter.chapterNumber}");
//Request URLs for Chapter-Images
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest($"https://api.mangadex.org/at-home/server/{chapter.url}?forcePort443=false'", (byte)RequestType.AtHomeServer);
if (requestResult.statusCode != HttpStatusCode.OK)
return;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
if (result is null)
return;
string baseUrl = result["baseUrl"]!.GetValue<string>();
string hash = result["chapter"]!["hash"]!.GetValue<string>();
JsonArray imageFileNames = result["chapter"]!["data"]!.AsArray();
//Loop through all imageNames and construct urls (imageUrl)
HashSet<string> imageUrls = new();
foreach (JsonNode? image in imageFileNames)
imageUrls.Add($"{baseUrl}/data/{hash}/{image!.GetValue<string>()}");
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, GetComicInfoXmlString(publication, chapter, logger));
//Download Chapter-Images
DownloadChapterImages(imageUrls.ToArray(), GetArchiveFilePath(publication, chapter), (byte)RequestType.AtHomeServer, parentTask, comicInfoPath);
}
private string? GetCoverUrl(string publicationId, string? posterId)
{
logger?.WriteLine(this.GetType().ToString(), $"Getting CoverUrl for {publicationId}");
if (posterId is null)
{
logger?.WriteLine(this.GetType().ToString(), $"No posterId, aborting");
return null;
}
//Request information where to download Cover
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest($"https://api.mangadex.org/cover/{posterId}", (byte)RequestType.CoverUrl);
if (requestResult.statusCode != HttpStatusCode.OK)
return null;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
if (result is null)
return null;
string fileName = result["data"]!["attributes"]!["fileName"]!.GetValue<string>();
string coverUrl = $"https://uploads.mangadex.org/covers/{publicationId}/{fileName}";
logger?.WriteLine(this.GetType().ToString(), $"Got Cover-Url for {publicationId} -> {coverUrl}");
return coverUrl;
}
private string? GetAuthor(string? authorId)
{
if (authorId is null)
return null;
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest($"https://api.mangadex.org/author/{authorId}", (byte)RequestType.Author);
if (requestResult.statusCode != HttpStatusCode.OK)
return null;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
if (result is null)
return null;
string author = result["data"]!["attributes"]!["name"]!.GetValue<string>();
logger?.WriteLine(this.GetType().ToString(), $"Got author {authorId} -> {author}");
return author;
}
}

View File

@ -1,196 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Logging;
using Tranga.TrangaTasks;
namespace Tranga.Connectors;
public class Manganato : Connector
{
public override string name { get; }
public Manganato(string downloadLocation, string imageCachePath, Logger? logger) : base(downloadLocation, imageCachePath, logger)
{
this.name = "Manganato";
this.downloadClient = new DownloadClient(new Dictionary<byte, int>()
{
{(byte)1, 60}
}, logger);
}
public override Publication[] GetPublications(string publicationTitle = "")
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Publications (title={publicationTitle})");
string sanitizedTitle = string.Concat(Regex.Matches(publicationTitle, "[A-z]* *")).ToLower().Replace(' ', '_');
string requestUrl = $"https://manganato.com/search/story/{sanitizedTitle}";
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, (byte)1);
if (requestResult.statusCode != HttpStatusCode.OK)
return Array.Empty<Publication>();
return ParsePublicationsFromHtml(requestResult.result);
}
private Publication[] ParsePublicationsFromHtml(Stream html)
{
StreamReader reader = new (html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new ();
document.LoadHtml(htmlString);
IEnumerable<HtmlNode> searchResults = document.DocumentNode.Descendants("div").Where(n => n.HasClass("search-story-item"));
List<string> urls = new();
foreach (HtmlNode mangaResult in searchResults)
{
urls.Add(mangaResult.Descendants("a").First(n => n.HasClass("item-title")).GetAttributes()
.First(a => a.Name == "href").Value);
}
HashSet<Publication> ret = new();
foreach (string url in urls)
{
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(url, (byte)1);
if (requestResult.statusCode != HttpStatusCode.OK)
return Array.Empty<Publication>();
ret.Add(ParseSinglePublicationFromHtml(requestResult.result, url.Split('/')[^1]));
}
return ret.ToArray();
}
private Publication ParseSinglePublicationFromHtml(Stream html, string publicationId)
{
StreamReader reader = new (html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new ();
document.LoadHtml(htmlString);
string status = "";
Dictionary<string, string> altTitles = new();
Dictionary<string, string>? links = null;
HashSet<string> tags = new();
string? author = null, originalLanguage = null;
int? year = DateTime.Now.Year;
HtmlNode infoNode = document.DocumentNode.Descendants("div").First(d => d.HasClass("story-info-right"));
string sortName = infoNode.Descendants("h1").First().InnerText;
HtmlNode infoTable = infoNode.Descendants().First(d => d.Name == "table");
foreach (HtmlNode row in infoTable.Descendants("tr"))
{
string key = row.SelectNodes("td").First().InnerText.ToLower();
string value = row.SelectNodes("td").Last().InnerText;
string keySanitized = string.Concat(Regex.Matches(key, "[a-z]"));
switch (keySanitized)
{
case "alternative":
string[] alts = value.Split(" ; ");
for(int i = 0; i < alts.Length; i++)
altTitles.Add(i.ToString(), alts[i]);
break;
case "authors":
author = value;
break;
case "status":
status = value;
break;
case "genres":
string[] genres = value.Split(" - ");
tags = genres.ToHashSet();
break;
default: break;
}
}
string posterUrl = document.DocumentNode.Descendants("span").First(s => s.HasClass("info-image")).Descendants("img").First()
.GetAttributes().First(a => a.Name == "src").Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, 1);
string description = document.DocumentNode.Descendants("div").First(d => d.HasClass("panel-story-info-description"))
.InnerText.Replace("Description :", "");
while (description.StartsWith('\n'))
description = description.Substring(1);
string yearString = document.DocumentNode.Descendants("li").Last(li => li.HasClass("a-h")).Descendants("span")
.First(s => s.HasClass("chapter-time")).InnerText;
year = Convert.ToInt32(yearString.Split(',')[^1]) + 2000;
return new Publication(sortName, author, description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, status, publicationId);
}
public override Chapter[] GetChapters(Publication publication, string language = "")
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Chapters for {publication.sortName} {publication.internalId} (language={language})");
string requestUrl = $"https://chapmanganato.com/{publication.publicationId}";
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, (byte)1);
if (requestResult.statusCode != HttpStatusCode.OK)
return Array.Empty<Chapter>();
return ParseChaptersFromHtml(requestResult.result);
}
private Chapter[] ParseChaptersFromHtml(Stream html)
{
StreamReader reader = new (html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new ();
document.LoadHtml(htmlString);
List<Chapter> ret = new();
HtmlNode chapterList = document.DocumentNode.Descendants("ul").First(l => l.HasClass("row-content-chapter"));
foreach (HtmlNode chapterInfo in chapterList.Descendants("li"))
{
string fullString = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name")).InnerText;
string? volumeNumber = fullString.Contains("Vol.") ? fullString.Replace("Vol.", "").Split(' ')[0] : null;
string? chapterNumber = fullString.Split(':')[0].Split("Chapter ")[1].Replace('-','.');
string chapterName = string.Concat(fullString.Split(':')[1..]);
string url = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name"))
.GetAttributeValue("href", "");
ret.Add(new Chapter(chapterName, volumeNumber, chapterNumber, url));
}
ret.Reverse();
return ret.ToArray();
}
public override void DownloadChapter(Publication publication, Chapter chapter, DownloadChapterTask parentTask)
{
logger?.WriteLine(this.GetType().ToString(), $"Downloading Chapter-Info {publication.sortName} {publication.internalId} {chapter.volumeNumber}-{chapter.chapterNumber}");
string requestUrl = chapter.url;
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, (byte)1);
if (requestResult.statusCode != HttpStatusCode.OK)
return;
string[] imageUrls = ParseImageUrlsFromHtml(requestResult.result);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, GetComicInfoXmlString(publication, chapter, logger));
DownloadChapterImages(imageUrls, GetArchiveFilePath(publication, chapter), (byte)1, parentTask, comicInfoPath, "https://chapmanganato.com/");
}
private string[] ParseImageUrlsFromHtml(Stream html)
{
StreamReader reader = new (html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new ();
document.LoadHtml(htmlString);
List<string> ret = new();
HtmlNode imageContainer =
document.DocumentNode.Descendants("div").First(i => i.HasClass("container-chapter-reader"));
foreach(HtmlNode imageNode in imageContainer.Descendants("img"))
ret.Add(imageNode.GetAttributeValue("src", ""));
return ret.ToArray();
}
}


@ -1,235 +0,0 @@
using System.Net;
using System.Text.RegularExpressions;
using System.Xml.Linq;
using HtmlAgilityPack;
using Logging;
using Newtonsoft.Json;
using PuppeteerSharp;
using Tranga.TrangaTasks;
namespace Tranga.Connectors;
public class Mangasee : Connector
{
public override string name { get; }
private IBrowser? _browser = null;
private const string ChromiumVersion = "1153303";
public Mangasee(string downloadLocation, string imageCachePath, Logger? logger) : base(downloadLocation,
imageCachePath, logger)
{
this.name = "Mangasee";
this.downloadClient = new DownloadClient(new Dictionary<byte, int>()
{
{ (byte)1, 60 }
}, logger);
Task d = new Task(DownloadBrowser);
d.Start();
}
private async void DownloadBrowser()
{
BrowserFetcher browserFetcher = new BrowserFetcher();
foreach(string rev in browserFetcher.LocalRevisions().Where(rev => rev != ChromiumVersion))
browserFetcher.Remove(rev);
if (!browserFetcher.LocalRevisions().Contains(ChromiumVersion))
{
logger?.WriteLine(this.GetType().ToString(), "Downloading headless browser");
DateTime last = DateTime.Now.Subtract(TimeSpan.FromSeconds(5));
browserFetcher.DownloadProgressChanged += (sender, args) =>
{
double currentBytes = Convert.ToDouble(args.BytesReceived) / Convert.ToDouble(args.TotalBytesToReceive);
if (args.TotalBytesToReceive == args.BytesReceived)
{
logger?.WriteLine(this.GetType().ToString(), "Browser downloaded.");
}
else if (DateTime.Now > last.AddSeconds(5))
{
logger?.WriteLine(this.GetType().ToString(), $"Browser download progress: {currentBytes:P2}");
last = DateTime.Now;
}
};
if (!browserFetcher.CanDownloadAsync(ChromiumVersion).Result)
{
logger?.WriteLine(this.GetType().ToString(), $"Can't download browser version {ChromiumVersion}");
return;
}
await browserFetcher.DownloadAsync(ChromiumVersion);
}
logger?.WriteLine(this.GetType().ToString(), "Starting browser.");
this._browser = await Puppeteer.LaunchAsync(new LaunchOptions
{
Headless = true,
ExecutablePath = browserFetcher.GetExecutablePath(ChromiumVersion),
Args = new [] {
"--disable-gpu",
"--disable-dev-shm-usage",
"--disable-setuid-sandbox",
"--no-sandbox"}
});
}
public override Publication[] GetPublications(string publicationTitle = "")
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Publications (title={publicationTitle})");
string sanitizedTitle = string.Concat(Regex.Matches(publicationTitle, "[A-Za-z]* *")).ToLower().Replace(' ', '+');
string requestUrl = $"https://mangasee123.com/_search.php";
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, (byte)1);
if (requestResult.statusCode != HttpStatusCode.OK)
return Array.Empty<Publication>();
return ParsePublicationsFromHtml(requestResult.result, publicationTitle);
}
private Publication[] ParsePublicationsFromHtml(Stream html, string publicationTitle)
{
string jsonString = new StreamReader(html).ReadToEnd();
List<SearchResultItem> result = JsonConvert.DeserializeObject<List<SearchResultItem>>(jsonString)!;
Dictionary<SearchResultItem, int> queryFiltered = new();
foreach (SearchResultItem resultItem in result)
{
foreach (string term in publicationTitle.Split(' '))
if (resultItem.i.Contains(term, StringComparison.CurrentCultureIgnoreCase))
if (!queryFiltered.TryAdd(resultItem, 0))
queryFiltered[resultItem]++;
}
queryFiltered = queryFiltered.Where(item => item.Value >= publicationTitle.Split(' ').Length - 1)
.ToDictionary(item => item.Key, item => item.Value);
HashSet<Publication> ret = new();
List<SearchResultItem> orderedFiltered =
queryFiltered.OrderBy(item => item.Value).ToDictionary(item => item.Key, item => item.Value).Keys.ToList();
foreach (SearchResultItem orderedItem in orderedFiltered)
{
DownloadClient.RequestResult requestResult =
downloadClient.MakeRequest($"https://mangasee123.com/manga/{orderedItem.i}", (byte)1);
if (requestResult.statusCode != HttpStatusCode.OK)
return Array.Empty<Publication>();
ret.Add(ParseSinglePublicationFromHtml(requestResult.result, orderedItem.s, orderedItem.i, orderedItem.a));
}
return ret.ToArray();
}
private Publication ParseSinglePublicationFromHtml(Stream html, string sortName, string publicationId, string[] a)
{
StreamReader reader = new (html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new ();
document.LoadHtml(htmlString);
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
HtmlNode posterNode =
document.DocumentNode.Descendants("img").First(img => img.HasClass("img-fluid") && img.HasClass("bottom-5"));
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, 1);
HtmlNode attributes = document.DocumentNode.Descendants("div")
.First(div => div.HasClass("col-md-9") && div.HasClass("col-sm-8") && div.HasClass("top-5"))
.Descendants("ul").First();
HtmlNode[] authorsNodes = attributes.Descendants("li")
.First(node => node.InnerText.Contains("author(s):", StringComparison.CurrentCultureIgnoreCase))
.Descendants("a").ToArray();
string[] authors = new string[authorsNodes.Length];
for (int j = 0; j < authors.Length; j++)
authors[j] = authorsNodes[j].InnerText;
string author = string.Join(" - ", authors);
HtmlNode[] genreNodes = attributes.Descendants("li")
.First(node => node.InnerText.Contains("genre(s):", StringComparison.CurrentCultureIgnoreCase))
.Descendants("a").ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = attributes.Descendants("li")
.First(node => node.InnerText.Contains("released:", StringComparison.CurrentCultureIgnoreCase))
.Descendants("a").First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = attributes.Descendants("li")
.First(node => node.InnerText.Contains("status:", StringComparison.CurrentCultureIgnoreCase))
.Descendants("a").ToArray();
foreach(HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
HtmlNode descriptionNode = attributes.Descendants("li").First(node => node.InnerText.Contains("description:", StringComparison.CurrentCultureIgnoreCase)).Descendants("div").First();
string description = descriptionNode.InnerText;
int i = 0;
foreach(string at in a)
altTitles.Add((i++).ToString(), at);
return new Publication(sortName, author, description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, status, publicationId);
}
// ReSharper disable once ClassNeverInstantiated.Local Will be instantiated during deserialization
private class SearchResultItem
{
#pragma warning disable CS8618 //Will always be set
public string i { get; set; }
public string s { get; set; }
public string[] a { get; set; }
#pragma warning restore CS8618
}
public override Chapter[] GetChapters(Publication publication, string language = "")
{
XDocument doc = XDocument.Load($"https://mangasee123.com/rss/{publication.publicationId}.xml");
XElement[] chapterItems = doc.Descendants("item").ToArray();
List<Chapter> ret = new();
foreach (XElement chapter in chapterItems)
{
string? volumeNumber = "1";
string chapterName = chapter.Descendants("title").First().Value;
string chapterNumber = Regex.Matches(chapterName, "[0-9]+")[^1].ToString();
string url = chapter.Descendants("link").First().Value;
url = url.Replace(Regex.Matches(url,"(-page-[0-9])")[0].ToString(),"");
ret.Add(new Chapter("", volumeNumber, chapterNumber, url));
}
ret.Reverse();
return ret.ToArray();
}
public override void DownloadChapter(Publication publication, Chapter chapter, DownloadChapterTask parentTask)
{
while (this._browser is null)
{
logger?.WriteLine(this.GetType().ToString(), "Waiting for headless browser to download...");
Thread.Sleep(1000);
}
logger?.WriteLine(this.GetType().ToString(), $"Downloading Chapter-Info {publication.sortName} {publication.internalId} {chapter.volumeNumber}-{chapter.chapterNumber}");
IPage page = _browser.NewPageAsync().Result;
IResponse response = page.GoToAsync(chapter.url).Result;
if (response.Ok)
{
HtmlDocument document = new ();
document.LoadHtml(page.GetContentAsync().Result);
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, GetComicInfoXmlString(publication, chapter, logger));
DownloadChapterImages(urls.ToArray(), GetArchiveFilePath(publication, chapter), (byte)1, parentTask, comicInfoPath);
}
}
}

Tranga/GlobalBase.cs

@ -0,0 +1,143 @@
using System.Globalization;
using System.Text.RegularExpressions;
using Logging;
using Newtonsoft.Json;
using Tranga.LibraryConnectors;
using Tranga.NotificationConnectors;
namespace Tranga;
public abstract class GlobalBase
{
[JsonIgnore]
public Logger? logger { get; init; }
protected HashSet<NotificationConnector> notificationConnectors { get; init; }
protected HashSet<LibraryConnector> libraryConnectors { get; init; }
private Dictionary<string, Manga> cachedPublications { get; init; }
public static readonly NumberFormatInfo numberFormatDecimalPoint = new (){ NumberDecimalSeparator = "." };
protected static readonly Regex baseUrlRex = new(@"https?:\/\/[0-9A-z\.-]+(:[0-9]+)?");
protected GlobalBase(GlobalBase clone)
{
this.logger = clone.logger;
this.notificationConnectors = clone.notificationConnectors;
this.libraryConnectors = clone.libraryConnectors;
this.cachedPublications = clone.cachedPublications;
}
protected GlobalBase(Logger? logger)
{
this.logger = logger;
this.notificationConnectors = TrangaSettings.LoadNotificationConnectors(this);
this.libraryConnectors = TrangaSettings.LoadLibraryConnectors(this);
this.cachedPublications = new();
}
protected void AddMangaToCache(Manga manga)
{
if (!this.cachedPublications.TryAdd(manga.internalId, manga))
{
Log($"Overwriting Manga {manga.internalId}");
this.cachedPublications[manga.internalId] = manga;
}
}
protected Manga? GetCachedManga(string internalId)
{
return cachedPublications.TryGetValue(internalId, out Manga manga) switch
{
true => manga,
_ => null
};
}
protected IEnumerable<Manga> GetAllCachedManga()
{
return cachedPublications.Values;
}
protected void Log(string message)
{
logger?.WriteLine(this.GetType().Name, message);
}
protected void Log(string fStr, params object?[] replace)
{
Log(string.Format(fStr, replace));
}
protected void SendNotifications(string title, string text, bool buffer = false)
{
foreach (NotificationConnector nc in notificationConnectors)
nc.SendNotification(title, text, buffer);
}
protected void AddNotificationConnector(NotificationConnector notificationConnector)
{
Log($"Adding {notificationConnector}");
notificationConnectors.RemoveWhere(nc => nc.notificationConnectorType == notificationConnector.notificationConnectorType);
notificationConnectors.Add(notificationConnector);
while(IsFileInUse(TrangaSettings.notificationConnectorsFilePath))
Thread.Sleep(100);
Log("Exporting notificationConnectors");
File.WriteAllText(TrangaSettings.notificationConnectorsFilePath, JsonConvert.SerializeObject(notificationConnectors));
}
protected void DeleteNotificationConnector(NotificationConnector.NotificationConnectorType notificationConnectorType)
{
Log($"Removing {notificationConnectorType}");
notificationConnectors.RemoveWhere(nc => nc.notificationConnectorType == notificationConnectorType);
while(IsFileInUse(TrangaSettings.notificationConnectorsFilePath))
Thread.Sleep(100);
Log("Exporting notificationConnectors");
File.WriteAllText(TrangaSettings.notificationConnectorsFilePath, JsonConvert.SerializeObject(notificationConnectors));
}
protected void UpdateLibraries()
{
foreach(LibraryConnector lc in libraryConnectors)
lc.UpdateLibrary();
}
protected void AddLibraryConnector(LibraryConnector libraryConnector)
{
Log($"Adding {libraryConnector}");
libraryConnectors.RemoveWhere(lc => lc.libraryType == libraryConnector.libraryType);
libraryConnectors.Add(libraryConnector);
while(IsFileInUse(TrangaSettings.libraryConnectorsFilePath))
Thread.Sleep(100);
Log("Exporting libraryConnectors");
File.WriteAllText(TrangaSettings.libraryConnectorsFilePath, JsonConvert.SerializeObject(libraryConnectors, Formatting.Indented));
}
protected void DeleteLibraryConnector(LibraryConnector.LibraryType libraryType)
{
Log($"Removing {libraryType}");
libraryConnectors.RemoveWhere(lc => lc.libraryType == libraryType);
while(IsFileInUse(TrangaSettings.libraryConnectorsFilePath))
Thread.Sleep(100);
Log("Exporting libraryConnectors");
File.WriteAllText(TrangaSettings.libraryConnectorsFilePath, JsonConvert.SerializeObject(libraryConnectors, Formatting.Indented));
}
protected bool IsFileInUse(string filePath) => IsFileInUse(filePath, this.logger);
public static bool IsFileInUse(string filePath, Logger? logger)
{
if (!File.Exists(filePath))
return false;
try
{
using FileStream stream = new (filePath, FileMode.Open, FileAccess.Read, FileShare.None);
stream.Close();
return false;
}
catch (IOException)
{
logger?.WriteLine($"File is in use {filePath}");
return true;
}
}
}
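The export methods above all share the same wait-then-write pattern around IsFileInUse. A minimal fragment of that pattern, reusing the public static overload; the path is a placeholder, not a real Tranga settings file:

// Wait until no other handle holds the file, then overwrite it.
string path = "some-settings.json"; // placeholder path
while (Tranga.GlobalBase.IsFileInUse(path, null))
    Thread.Sleep(100);
File.WriteAllText(path, "{ }");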


@ -0,0 +1,54 @@
using System.Net;
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class DownloadChapter : Job
{
public Chapter chapter { get; init; }
public DownloadChapter(GlobalBase clone, MangaConnector connector, Chapter chapter, DateTime lastExecution, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, connector, lastExecution, parentJobId: parentJobId)
{
this.chapter = chapter;
}
public DownloadChapter(GlobalBase clone, MangaConnector connector, Chapter chapter, string? parentJobId = null) : base(clone, JobType.DownloadChapterJob, connector, parentJobId: parentJobId)
{
this.chapter = chapter;
}
protected override string GetId()
{
return $"{GetType()}-{chapter.parentManga.internalId}-{chapter.chapterNumber}";
}
public override string ToString()
{
return $"{id} Chapter: {chapter}";
}
protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
{
Task downloadTask = new(delegate
{
mangaConnector.CopyCoverFromCacheToDownloadLocation(chapter.parentManga);
HttpStatusCode success = mangaConnector.DownloadChapter(chapter, this.progressToken);
chapter.parentManga.UpdateLatestDownloadedChapter(chapter);
if (success == HttpStatusCode.OK)
{
UpdateLibraries();
SendNotifications("Chapter downloaded", $"{chapter.parentManga.sortName} - {chapter.chapterNumber}", true);
}
});
downloadTask.Start();
return Array.Empty<Job>();
}
public override bool Equals(object? obj)
{
if (obj is not DownloadChapter otherJob)
return false;
return otherJob.mangaConnector == this.mangaConnector &&
otherJob.chapter.Equals(this.chapter);
}
}


@ -0,0 +1,59 @@
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class DownloadNewChapters : Job
{
public Manga manga { get; set; }
public string translatedLanguage { get; init; }
public DownloadNewChapters(GlobalBase clone, MangaConnector connector, Manga manga, DateTime lastExecution,
bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base(clone, JobType.DownloadNewChaptersJob, connector, lastExecution, recurring,
recurrence, parentJobId)
{
this.manga = manga;
this.translatedLanguage = translatedLanguage;
}
public DownloadNewChapters(GlobalBase clone, MangaConnector connector, Manga manga, bool recurring = false, TimeSpan? recurrence = null, string? parentJobId = null, string translatedLanguage = "en") : base (clone, JobType.DownloadNewChaptersJob, connector, recurring, recurrence, parentJobId)
{
this.manga = manga;
this.translatedLanguage = translatedLanguage;
}
protected override string GetId()
{
return $"{GetType()}-{manga.internalId}";
}
public override string ToString()
{
return $"{id} Manga: {manga}";
}
protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
{
manga.SaveSeriesInfoJson();
Chapter[] chapters = mangaConnector.GetNewChapters(manga, this.translatedLanguage);
this.progressToken.increments = chapters.Length;
List<Job> jobs = new();
mangaConnector.CopyCoverFromCacheToDownloadLocation(manga);
foreach (Chapter chapter in chapters)
{
DownloadChapter downloadChapterJob = new(this, this.mangaConnector, chapter, parentJobId: this.id);
jobs.Add(downloadChapterJob);
}
UpdateMetadata updateMetadataJob = new(this, this.mangaConnector, this.manga, parentJobId: this.id);
jobs.Add(updateMetadataJob);
progressToken.Complete();
return jobs;
}
public override bool Equals(object? obj)
{
if (obj is not DownloadNewChapters otherJob)
return false;
return otherJob.mangaConnector == this.mangaConnector &&
otherJob.manga.Equals(this.manga);
}
}
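A hypothetical wiring sketch: scheduling a recurring new-chapter check through a JobBoss. Here `globalBase`, `connector`, `manga` and `jobBoss` are assumed to exist already, and the three-hour interval is arbitrary.

// Schedule a recurring "check for new chapters" job for one manga.
DownloadNewChapters newChapterJob = new(
    globalBase,           // any GlobalBase-derived context (assumed to exist)
    connector,            // the MangaConnector the manga was found on
    manga,                // the Manga to monitor
    recurring: true,
    recurrence: TimeSpan.FromHours(3),
    translatedLanguage: "en");
jobBoss.AddJob(newChapterJob); // persists the job and schedules it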

Tranga/Jobs/Job.cs

@ -0,0 +1,98 @@
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public abstract class Job : GlobalBase
{
public MangaConnector mangaConnector { get; init; }
public ProgressToken progressToken { get; private set; }
public bool recurring { get; init; }
public TimeSpan? recurrenceTime { get; set; }
public DateTime? lastExecution { get; private set; }
public DateTime nextExecution => NextExecution();
public string id => GetId();
internal IEnumerable<Job>? subJobs { get; private set; }
public string? parentJobId { get; init; }
public enum JobType : byte { DownloadChapterJob, DownloadNewChaptersJob, UpdateMetaDataJob }
public JobType jobType;
internal Job(GlobalBase clone, JobType jobType, MangaConnector connector, bool recurring = false, TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
{
this.jobType = jobType;
this.mangaConnector = connector;
this.progressToken = new ProgressToken(0);
this.recurring = recurring;
if (recurring && recurrenceTime is null)
throw new ArgumentException("If recurrence is set to true, a recurrence time has to be provided.");
else if(recurring && recurrenceTime is not null)
this.lastExecution = DateTime.Now.Subtract((TimeSpan)recurrenceTime);
this.recurrenceTime = recurrenceTime ?? TimeSpan.Zero;
this.parentJobId = parentJobId;
}
internal Job(GlobalBase clone, JobType jobType, MangaConnector connector, DateTime lastExecution, bool recurring = false,
TimeSpan? recurrenceTime = null, string? parentJobId = null) : base(clone)
{
this.jobType = jobType;
this.mangaConnector = connector;
this.progressToken = new ProgressToken(0);
this.recurring = recurring;
if (recurring && recurrenceTime is null)
throw new ArgumentException("If recurrence is set to true, a recurrence time has to be provided.");
this.lastExecution = lastExecution;
this.recurrenceTime = recurrenceTime ?? TimeSpan.Zero;
this.parentJobId = parentJobId;
}
protected abstract string GetId();
public void AddSubJob(Job job)
{
subJobs ??= new List<Job>();
subJobs = subJobs.Append(job);
}
private DateTime NextExecution()
{
if(recurrenceTime.HasValue && lastExecution.HasValue)
return lastExecution.Value.Add(recurrenceTime.Value);
if(recurrenceTime.HasValue && !lastExecution.HasValue)
return DateTime.Now;
return DateTime.MaxValue;
}
public void ResetProgress()
{
this.progressToken.increments -= progressToken.incrementsCompleted;
this.lastExecution = DateTime.Now;
this.progressToken.Waiting();
}
public void ExecutionEnqueue()
{
this.progressToken.increments -= progressToken.incrementsCompleted;
this.progressToken.Standby();
}
public void Cancel()
{
Log($"Cancelling {this}");
this.progressToken.cancellationRequested = true;
this.progressToken.Cancel();
this.lastExecution = DateTime.Now;
if(subJobs is not null)
foreach(Job subJob in subJobs)
subJob.Cancel();
}
public IEnumerable<Job> ExecuteReturnSubTasks(JobBoss jobBoss)
{
progressToken.Start();
subJobs = ExecuteReturnSubTasksInternal(jobBoss);
lastExecution = DateTime.Now;
return subJobs;
}
protected abstract IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss);
}
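The scheduling arithmetic above is worth spelling out: a recurring job created without a lastExecution is backdated by one recurrence interval, so its nextExecution is roughly "now" and it becomes due on the next scheduler pass. A small illustration of that calculation:

// Mirrors Job's recurring constructor and NextExecution().
TimeSpan recurrence = TimeSpan.FromHours(3);
DateTime lastExecution = DateTime.Now.Subtract(recurrence); // backdated on construction
DateTime nextExecution = lastExecution.Add(recurrence);     // ~= DateTime.Now, i.e. due immediately
Console.WriteLine(nextExecution - DateTime.Now);            // close to 00:00:00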

Tranga/Jobs/JobBoss.cs

@ -0,0 +1,279 @@
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class JobBoss : GlobalBase
{
public HashSet<Job> jobs { get; init; }
private Dictionary<MangaConnector, Queue<Job>> mangaConnectorJobQueue { get; init; }
public JobBoss(GlobalBase clone, HashSet<MangaConnector> connectors) : base(clone)
{
this.jobs = new();
LoadJobsList(connectors);
this.mangaConnectorJobQueue = new();
Log($"Next job in {jobs.MinBy(job => job.nextExecution)?.nextExecution.Subtract(DateTime.Now)} {jobs.MinBy(job => job.nextExecution)?.id}");
}
public void AddJob(Job job)
{
if (ContainsJobLike(job))
{
Log($"Already Contains Job {job}");
}
else
{
Log($"Added {job}");
this.jobs.Add(job);
UpdateJobFile(job);
}
}
public void AddJobs(IEnumerable<Job> jobsToAdd)
{
foreach (Job job in jobsToAdd)
AddJob(job);
}
/// <summary>
/// Checks whether any existing job has the same content as the provided job.
/// Comparison is by value (Equals), not by reference.
/// </summary>
public bool ContainsJobLike(Job job)
{
return this.jobs.Any(existingJob => existingJob.Equals(job));
}
public void RemoveJob(Job job)
{
Log($"Removing {job}");
job.Cancel();
this.jobs.Remove(job);
if(job.subJobs is not null && job.subJobs.Any())
RemoveJobs(job.subJobs);
UpdateJobFile(job);
}
public void RemoveJobs(IEnumerable<Job?> jobsToRemove)
{
List<Job?> toRemove = jobsToRemove.ToList(); //Prevent multiple enumeration
Log($"Removing {toRemove.Count()} jobs.");
foreach (Job? job in toRemove)
if(job is not null)
RemoveJob(job);
}
public IEnumerable<Job> GetJobsLike(string? connectorName = null, string? internalId = null, string? chapterNumber = null)
{
IEnumerable<Job> ret = this.jobs;
if (connectorName is not null)
ret = ret.Where(job => job.mangaConnector.name == connectorName);
if (internalId is not null && chapterNumber is not null)
ret = ret.Where(jjob =>
{
if (jjob is not DownloadChapter job)
return false;
return job.chapter.parentManga.internalId == internalId &&
job.chapter.chapterNumber == chapterNumber;
});
else if (internalId is not null)
ret = ret.Where(jjob =>
{
if (jjob is not DownloadNewChapters job)
return false;
return job.manga.internalId == internalId;
});
return ret;
}
public IEnumerable<Job> GetJobsLike(MangaConnector? mangaConnector = null, Manga? publication = null,
Chapter? chapter = null)
{
if (chapter is not null)
return GetJobsLike(mangaConnector?.name, chapter.Value.parentManga.internalId, chapter.Value.chapterNumber);
else
return GetJobsLike(mangaConnector?.name, publication?.internalId);
}
public Job? GetJobById(string jobId)
{
if (this.jobs.FirstOrDefault(jjob => jjob.id == jobId) is { } job)
return job;
return null;
}
public bool TryGetJobById(string jobId, out Job? job)
{
if (this.jobs.FirstOrDefault(jjob => jjob.id == jobId) is { } ret)
{
job = ret;
return true;
}
job = null;
return false;
}
private bool QueueContainsJob(Job job)
{
if (mangaConnectorJobQueue.TryAdd(job.mangaConnector, new Queue<Job>()))//If we can add the queue, there is certainly no job in it
return true;
return mangaConnectorJobQueue[job.mangaConnector].Contains(job);
}
public void AddJobToQueue(Job job)
{
Log($"Adding Job to Queue. {job}");
if(!QueueContainsJob(job))
mangaConnectorJobQueue[job.mangaConnector].Enqueue(job);
job.ExecutionEnqueue();
}
private void AddJobsToQueue(IEnumerable<Job> newJobs)
{
foreach(Job job in newJobs)
AddJobToQueue(job);
}
private void LoadJobsList(HashSet<MangaConnector> connectors)
{
if (!Directory.Exists(TrangaSettings.jobsFolderPath)) //No jobs to load
{
Directory.CreateDirectory(TrangaSettings.jobsFolderPath);
return;
}
Regex idRex = new (@"(.*)\.json");
//Load json-job-files
foreach (FileInfo file in new DirectoryInfo(TrangaSettings.jobsFolderPath).EnumerateFiles().Where(fileInfo => idRex.IsMatch(fileInfo.Name)))
{
Log($"Adding {file.Name}");
Job? job = JsonConvert.DeserializeObject<Job>(File.ReadAllText(file.FullName),
new JobJsonConverter(this, new MangaConnectorJsonConverter(this, connectors)));
if (job is null)
{
string newName = file.FullName + ".failed";
Log($"Failed loading file {file.Name}.\nMoving to {newName}");
File.Move(file.FullName, newName);
}
else
{
Log($"Adding Job {job}");
this.jobs.Add(job);
UpdateJobFile(job, file.Name);
}
}
//Connect jobs to parent-jobs and add Publications to cache
foreach (Job job in this.jobs)
{
Log($"Loading Job {job}");
Job? parentJob = this.jobs.FirstOrDefault(jjob => jjob.id == job.parentJobId);
if (parentJob is not null)
{
parentJob.AddSubJob(job);
Log($"Parent Job {parentJob}");
}
if (job is DownloadNewChapters dncJob)
AddMangaToCache(dncJob.manga);
}
string[] coverFiles = Directory.GetFiles(TrangaSettings.coverImageCache);
foreach(string fileName in coverFiles.Where(fileName => !GetAllCachedManga().Any(manga => manga.coverFileNameInCache == fileName)))
File.Delete(fileName);
}
internal void UpdateJobFile(Job job, string? oldFile = null)
{
string newJobFilePath = Path.Join(TrangaSettings.jobsFolderPath, $"{job.id}.json");
string oldFilePath = Path.Join(TrangaSettings.jobsFolderPath, oldFile??$"{job.id}.json");
//Delete old file
if (File.Exists(oldFilePath))
{
Log($"Deleting Job-file {oldFilePath}");
try
{
while(IsFileInUse(oldFilePath))
Thread.Sleep(10);
File.Delete(oldFilePath);
}
catch (Exception e)
{
Log(e.ToString());
}
}
//Export job (in new file) if it is still in our jobs list
if (GetJobById(job.id) is not null)
{
Log($"Exporting Job {newJobFilePath}");
string jobStr = JsonConvert.SerializeObject(job, Formatting.Indented);
while(IsFileInUse(newJobFilePath))
Thread.Sleep(10);
File.WriteAllText(newJobFilePath, jobStr);
}
}
private void UpdateAllJobFiles()
{
Log("Exporting Jobs");
foreach (Job job in this.jobs)
UpdateJobFile(job);
//Remove files with jobs not in this.jobs-list
Regex idRex = new (@"(.*)\.json");
foreach (FileInfo file in new DirectoryInfo(TrangaSettings.jobsFolderPath).EnumerateFiles())
{
if (idRex.IsMatch(file.Name))
{
string id = idRex.Match(file.Name).Groups[1].Value;
if (!this.jobs.Any(job => job.id == id))
{
try
{
file.Delete();
}
catch (Exception e)
{
Log(e.ToString());
}
}
}
}
}
public void CheckJobs()
{
AddJobsToQueue(jobs.Where(job => job.progressToken.state == ProgressToken.State.Waiting && job.nextExecution < DateTime.Now && !QueueContainsJob(job)).OrderBy(job => job.nextExecution));
foreach (Queue<Job> jobQueue in mangaConnectorJobQueue.Values)
{
if(jobQueue.Count < 1)
continue;
Job queueHead = jobQueue.Peek();
if (queueHead.progressToken.state is ProgressToken.State.Complete or ProgressToken.State.Cancelled)
{
if(!queueHead.recurring)
RemoveJob(queueHead);
else
queueHead.ResetProgress();
jobQueue.Dequeue();
Log($"Next job in {jobs.MinBy(job => job.nextExecution)?.nextExecution.Subtract(DateTime.Now)} {jobs.MinBy(job => job.nextExecution)?.id}");
}else if (queueHead.progressToken.state is ProgressToken.State.Standby)
{
Job eJob = jobQueue.Peek();
Job[] subJobs = eJob.ExecuteReturnSubTasks(this).ToArray();
UpdateJobFile(eJob);
AddJobs(subJobs);
AddJobsToQueue(subJobs);
}else if (queueHead.progressToken.state is ProgressToken.State.Running && DateTime.Now.Subtract(queueHead.progressToken.lastUpdate) > TimeSpan.FromMinutes(5))
{
Log($"{queueHead} inactive for more than 5 minutes. Cancelling.");
queueHead.Cancel();
}
}
}
}
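CheckJobs is designed to be polled: it moves due jobs into per-connector queues, executes each queue head, and reschedules or removes finished jobs. The driving loop is not part of this diff; a hypothetical minimal driver, with `globalBase` and `connectors` assumed to exist, could look like this:

// Hypothetical polling loop around JobBoss.CheckJobs().
JobBoss jobBoss = new(globalBase, connectors); // loads persisted jobs from the jobs folder
while (true)
{
    jobBoss.CheckJobs();  // enqueue due jobs, advance queue heads, prune finished ones
    Thread.Sleep(1000);   // polling interval chosen arbitrarily here
}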


@ -0,0 +1,84 @@
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class JobJsonConverter : JsonConverter
{
private GlobalBase _clone;
private MangaConnectorJsonConverter _mangaConnectorJsonConverter;
internal JobJsonConverter(GlobalBase clone, MangaConnectorJsonConverter mangaConnectorJsonConverter)
{
this._clone = clone;
this._mangaConnectorJsonConverter = mangaConnectorJsonConverter;
}
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(Job));
}
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
if (jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.UpdateMetaDataJob)
{
return new UpdateMetadata(this._clone,
jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
{
Converters =
{
this._mangaConnectorJsonConverter
}
}))!,
jo.GetValue("manga")!.ToObject<Manga>(),
jo.GetValue("parentJobId")!.Value<string?>());
}else if ((jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.DownloadNewChaptersJob) || jo.ContainsKey("translatedLanguage"))//TODO change to jobType
{
DateTime lastExecution = jo.GetValue("lastExecution") is {} le
? le.ToObject<DateTime>()
: DateTime.UnixEpoch; //TODO do null checks on all variables
return new DownloadNewChapters(this._clone,
jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
{
Converters =
{
this._mangaConnectorJsonConverter
}
}))!,
jo.GetValue("manga")!.ToObject<Manga>(),
lastExecution,
jo.GetValue("recurring")!.Value<bool>(),
jo.GetValue("recurrenceTime")!.ToObject<TimeSpan?>(),
jo.GetValue("parentJobId")!.Value<string?>());
}else if ((jo.ContainsKey("jobType") && jo["jobType"]!.Value<byte>() == (byte)Job.JobType.DownloadChapterJob) || jo.ContainsKey("chapter"))//TODO change to jobType
{
return new DownloadChapter(this._clone,
jo.GetValue("mangaConnector")!.ToObject<MangaConnector>(JsonSerializer.Create(new JsonSerializerSettings()
{
Converters =
{
this._mangaConnectorJsonConverter
}
}))!,
jo.GetValue("chapter")!.ToObject<Chapter>(),
DateTime.UnixEpoch,
jo.GetValue("parentJobId")!.Value<string?>());
}
throw new Exception();
}
public override bool CanWrite => false;
/// <summary>
/// Don't call this
/// </summary>
public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
{
throw new Exception("Dont call this");
}
}
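Because CanWrite is false, serialization goes through the default serializer; only deserialization needs this converter, paired with a MangaConnectorJsonConverter so the embedded connector can be resolved. A usage sketch, with `someJob`, `globalBase` and `connectors` assumed to exist:

// Writing: the converter is not consulted (CanWrite == false).
string json = JsonConvert.SerializeObject(someJob, Formatting.Indented);

// Reading: the converter inspects "jobType" and builds the concrete Job subtype.
Job? roundTripped = JsonConvert.DeserializeObject<Job>(json,
    new JobJsonConverter(globalBase, new MangaConnectorJsonConverter(globalBase, connectors)));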


@ -0,0 +1,78 @@
namespace Tranga.Jobs;
public class ProgressToken
{
public bool cancellationRequested { get; set; }
public int increments { get; set; }
public int incrementsCompleted { get; set; }
public float progress => GetProgress();
public DateTime lastUpdate { get; private set; }
public DateTime executionStarted { get; private set; }
public TimeSpan timeRemaining => GetTimeRemaining();
public enum State { Running, Complete, Standby, Cancelled, Waiting }
public State state { get; private set; }
public ProgressToken(int increments)
{
this.cancellationRequested = false;
this.increments = increments;
this.incrementsCompleted = 0;
this.state = State.Waiting;
this.executionStarted = DateTime.UnixEpoch;
this.lastUpdate = DateTime.UnixEpoch;
}
private float GetProgress()
{
if(increments > 0 && incrementsCompleted > 0)
return incrementsCompleted / (float)increments;
return 0;
}
private TimeSpan GetTimeRemaining()
{
if (increments > 0 && incrementsCompleted > 0)
return DateTime.Now.Subtract(this.executionStarted).Divide(incrementsCompleted).Multiply(increments - incrementsCompleted);
return TimeSpan.MaxValue;
}
public void Increment()
{
this.lastUpdate = DateTime.Now;
this.incrementsCompleted++;
if (incrementsCompleted > increments)
state = State.Complete;
}
public void Standby()
{
this.lastUpdate = DateTime.Now;
state = State.Standby;
}
public void Start()
{
this.lastUpdate = DateTime.Now;
state = State.Running;
this.executionStarted = DateTime.Now;
}
public void Complete()
{
this.lastUpdate = DateTime.Now;
state = State.Complete;
}
public void Cancel()
{
this.lastUpdate = DateTime.Now;
state = State.Cancelled;
}
public void Waiting()
{
this.lastUpdate = DateTime.Now;
state = State.Waiting;
}
}
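A quick worked example of the progress and time-remaining arithmetic above: with 10 increments and 3 completed, progress is 0.3 and the estimated remaining time is the elapsed time divided by 3 and multiplied by the 7 outstanding increments.

// Worked example of ProgressToken's arithmetic.
ProgressToken token = new(increments: 10);
token.Start();                          // marks executionStarted, state = Running
for (int i = 0; i < 3; i++)
    token.Increment();                  // 3 of 10 done
Console.WriteLine(token.progress);      // 0.3
Console.WriteLine(token.timeRemaining); // elapsed / 3 * 7 (rough, especially early on)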


@ -0,0 +1,76 @@
using Tranga.MangaConnectors;
namespace Tranga.Jobs;
public class UpdateMetadata : Job
{
public Manga manga { get; set; }
public UpdateMetadata(GlobalBase clone, MangaConnector connector, Manga manga, string? parentJobId = null) : base(clone, JobType.UpdateMetaDataJob, connector, parentJobId: parentJobId)
{
this.manga = manga;
}
protected override string GetId()
{
return $"{GetType()}-{manga.internalId}";
}
public override string ToString()
{
return $"{id} Manga: {manga}";
}
protected override IEnumerable<Job> ExecuteReturnSubTasksInternal(JobBoss jobBoss)
{
//Retrieve new Metadata
Manga? possibleUpdatedManga = mangaConnector.GetMangaFromId(manga.publicationId);
if (possibleUpdatedManga is { } updatedManga)
{
if (updatedManga.Equals(this.manga)) //Check if anything changed
{
this.progressToken.Complete();
return Array.Empty<Job>();
}
this.manga = manga.WithMetadata(updatedManga);
this.manga.SaveSeriesInfoJson(true);
this.mangaConnector.CopyCoverFromCacheToDownloadLocation(manga);
foreach (Job job in jobBoss.GetJobsLike(publication: this.manga))
{
string oldFile;
if (job is DownloadNewChapters dc)
{
oldFile = dc.id;
dc.manga = this.manga;
}
else if (job is UpdateMetadata um)
{
oldFile = um.id;
um.manga = this.manga;
}
else
continue;
jobBoss.UpdateJobFile(job, oldFile);
}
this.progressToken.Complete();
}
else
{
Log($"Could not find Manga {manga}");
this.progressToken.Cancel();
return Array.Empty<Job>();
}
return Array.Empty<Job>();
}
public override bool Equals(object? obj)
{
if (obj is not UpdateMetadata otherJob)
return false;
return otherJob.mangaConnector == this.mangaConnector &&
otherJob.manga.Equals(this.manga);
}
}


@ -0,0 +1,126 @@
using System.Text.Json.Nodes;
using Logging;
using Newtonsoft.Json;
using JsonSerializer = System.Text.Json.JsonSerializer;
namespace Tranga.LibraryConnectors;
public class Kavita : LibraryConnector
{
public Kavita(GlobalBase clone, string baseUrl, string username, string password) :
base(clone, baseUrl, GetToken(baseUrl, username, password, clone.logger), LibraryType.Kavita)
{
}
[JsonConstructor]
public Kavita(GlobalBase clone, string baseUrl, string auth) : base(clone, baseUrl, auth, LibraryType.Kavita)
{
}
public override string ToString()
{
return $"Kavita {baseUrl}";
}
private static string GetToken(string baseUrl, string username, string password, Logger? logger = null)
{
HttpClient client = new()
{
DefaultRequestHeaders =
{
{ "Accept", "application/json" }
}
};
HttpRequestMessage requestMessage = new ()
{
Method = HttpMethod.Post,
RequestUri = new Uri($"{baseUrl}/api/Account/login"),
Content = new StringContent($"{{\"username\":\"{username}\",\"password\":\"{password}\"}}", System.Text.Encoding.UTF8, "application/json")
};
try
{
HttpResponseMessage response = client.Send(requestMessage);
logger?.WriteLine($"Kavita | GetToken {requestMessage.RequestUri} -> {response.StatusCode}");
if (response.IsSuccessStatusCode)
{
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(response.Content.ReadAsStream());
if (result is not null)
return result["token"]!.GetValue<string>();
}
else
{
logger?.WriteLine($"Kavita | {response.Content}");
}
}
catch (HttpRequestException e)
{
logger?.WriteLine($"Kavita | Unable to retrieve token:\n\r{e}");
}
logger?.WriteLine("Kavita | Did not receive token.");
return "";
}
protected override void UpdateLibraryInternal()
{
Log("Updating libraries.");
foreach (KavitaLibrary lib in GetLibraries())
NetClient.MakePost($"{baseUrl}/api/Library/scan?libraryId={lib.id}", "Bearer", auth, logger);
}
internal override bool Test()
{
foreach (KavitaLibrary lib in GetLibraries())
if (NetClient.MakePost($"{baseUrl}/api/Library/scan?libraryId={lib.id}", "Bearer", auth, logger))
return true;
return false;
}
/// <summary>
/// Fetches all libraries available to the user
/// </summary>
/// <returns>Array of KavitaLibrary</returns>
private IEnumerable<KavitaLibrary> GetLibraries()
{
Log("Getting libraries.");
Stream data = NetClient.MakeRequest($"{baseUrl}/api/Library/libraries", "Bearer", auth, logger);
if (data == Stream.Null)
{
Log("No libraries returned");
return Array.Empty<KavitaLibrary>();
}
JsonArray? result = JsonSerializer.Deserialize<JsonArray>(data);
if (result is null)
{
Log("No libraries returned");
return Array.Empty<KavitaLibrary>();
}
List<KavitaLibrary> ret = new();
foreach (JsonNode? jsonNode in result)
{
JsonObject? jObject = (JsonObject?)jsonNode;
if(jObject is null)
continue;
int libraryId = jObject!["id"]!.GetValue<int>();
string libraryName = jObject["name"]!.GetValue<string>();
ret.Add(new KavitaLibrary(libraryId, libraryName));
}
return ret;
}
private struct KavitaLibrary
{
public int id { get; }
// ReSharper disable once UnusedAutoPropertyAccessor.Local
public string name { get; }
public KavitaLibrary(int id, string name)
{
this.id = id;
this.name = name;
}
}
}
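A hypothetical setup sketch; the URL and credentials below are placeholders and `globalBase` is assumed to exist. The username/password constructor calls GetToken once to obtain a JWT, and UpdateLibrary then hits /api/Library/scan for every library with "Bearer <token>", either immediately or buffered depending on TrangaSettings.bufferLibraryUpdates.

// Placeholder URL and credentials.
Kavita kavita = new(globalBase, "http://kavita.local:5000", "admin", "hunter2");
kavita.UpdateLibrary(); // triggers (or buffers) a scan of every Kavita library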


@ -1,50 +1,62 @@
using System.Text.Json.Nodes;
using Logging;
using Newtonsoft.Json;
using JsonSerializer = System.Text.Json.JsonSerializer;
namespace Tranga.LibraryManagers;
namespace Tranga.LibraryConnectors;
/// <summary>
/// Provides connectivity to Komga-API
/// Can fetch and update libraries
/// </summary>
public class Komga : LibraryManager
public class Komga : LibraryConnector
{
public Komga(string baseUrl, string username, string password, Logger? logger)
: base(baseUrl, Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes($"{username}:{password}")), logger, LibraryType.Komga)
public Komga(GlobalBase clone, string baseUrl, string username, string password)
: base(clone, baseUrl, Convert.ToBase64String(System.Text.Encoding.ASCII.GetBytes($"{username}:{password}")), LibraryType.Komga)
{
}
[JsonConstructor]
public Komga(string baseUrl, string auth, Logger? logger) : base(baseUrl, auth, logger, LibraryType.Komga)
public Komga(GlobalBase clone, string baseUrl, string auth) : base(clone, baseUrl, auth, LibraryType.Komga)
{
}
public override void UpdateLibrary()
public override string ToString()
{
logger?.WriteLine(this.GetType().ToString(), $"Updating Libraries");
return $"Komga {baseUrl}";
}
protected override void UpdateLibraryInternal()
{
Log("Updating libraries.");
foreach (KomgaLibrary lib in GetLibraries())
NetClient.MakePost($"{baseUrl}/api/v1/libraries/{lib.id}/scan", "Basic", auth, logger);
}
internal override bool Test()
{
foreach (KomgaLibrary lib in GetLibraries())
if (NetClient.MakePost($"{baseUrl}/api/v1/libraries/{lib.id}/scan", "Basic", auth, logger))
return true;
return false;
}
/// <summary>
/// Fetches all libraries available to the user
/// </summary>
/// <returns>Array of KomgaLibraries</returns>
private IEnumerable<KomgaLibrary> GetLibraries()
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Libraries");
Log("Getting Libraries");
Stream data = NetClient.MakeRequest($"{baseUrl}/api/v1/libraries", "Basic", auth, logger);
if (data == Stream.Null)
{
logger?.WriteLine(this.GetType().ToString(), $"No libraries returned");
Log("No libraries returned");
return Array.Empty<KomgaLibrary>();
}
JsonArray? result = JsonSerializer.Deserialize<JsonArray>(data);
if (result is null)
{
logger?.WriteLine(this.GetType().ToString(), $"No libraries returned");
Log("No libraries returned");
return Array.Empty<KomgaLibrary>();
}
@ -54,7 +66,7 @@ public class Komga : LibraryManager
{
var jObject = (JsonObject?)jsonNode;
string libraryId = jObject!["id"]!.GetValue<string>();
string libraryName = jObject!["name"]!.GetValue<string>();
string libraryName = jObject["name"]!.GetValue<string>();
ret.Add(new KomgaLibrary(libraryId, libraryName));
}
@ -64,6 +76,7 @@ public class Komga : LibraryManager
private struct KomgaLibrary
{
public string id { get; }
// ReSharper disable once UnusedAutoPropertyAccessor.Local
public string name { get; }
public KomgaLibrary(string id, string name)


@ -0,0 +1,144 @@
using System.Net;
using System.Net.Http.Headers;
using Logging;
namespace Tranga.LibraryConnectors;
public abstract class LibraryConnector : GlobalBase
{
public enum LibraryType : byte
{
Komga = 0,
Kavita = 1
}
// ReSharper disable once UnusedAutoPropertyAccessor.Global
public LibraryType libraryType { get; }
public string baseUrl { get; }
// ReSharper disable once MemberCanBeProtected.Global
public string auth { get; } //Base64 encoded, if you use your password everywhere, you have problems
private DateTime? _updateLibraryRequested = null;
private readonly Thread? _libraryBufferThread = null;
private const int NoChangeTimeout = 2, BiggestInterval = 20;
protected LibraryConnector(GlobalBase clone, string baseUrl, string auth, LibraryType libraryType) : base(clone)
{
Log($"Creating libraryConnector {Enum.GetName(libraryType)}");
if (!baseUrlRex.IsMatch(baseUrl))
throw new ArgumentException("Base url does not match pattern");
if(auth == "")
throw new ArgumentNullException(nameof(auth), "Auth can not be empty");
this.baseUrl = baseUrlRex.Match(baseUrl).Value;
this.auth = auth;
this.libraryType = libraryType;
if (TrangaSettings.bufferLibraryUpdates)
{
_libraryBufferThread = new(CheckLibraryBuffer);
_libraryBufferThread.Start();
}
}
private void CheckLibraryBuffer()
{
while (true)
{
if (_updateLibraryRequested is not null && DateTime.Now.Subtract((DateTime)_updateLibraryRequested) > TimeSpan.FromMinutes(NoChangeTimeout)) //If no updates have been requested for NoChangeTimeout minutes, update library
{
UpdateLibraryInternal();
_updateLibraryRequested = null;
}
Thread.Sleep(100);
}
}
public void UpdateLibrary()
{
_updateLibraryRequested ??= DateTime.Now;
if (!TrangaSettings.bufferLibraryUpdates)
{
UpdateLibraryInternal();
return;
}else if (_updateLibraryRequested is not null &&
DateTime.Now.Subtract((DateTime)_updateLibraryRequested) > TimeSpan.FromMinutes(BiggestInterval)) //If the last update has been more than BiggestInterval minutes ago, update library
{
UpdateLibraryInternal();
_updateLibraryRequested = null;
}
else if(_updateLibraryRequested is not null)
{
Log($"Buffering Library Updates (Updates in latest {((DateTime)_updateLibraryRequested).Add(TimeSpan.FromMinutes(BiggestInterval)).Subtract(DateTime.Now)} or {((DateTime)_updateLibraryRequested).Add(TimeSpan.FromMinutes(NoChangeTimeout)).Subtract(DateTime.Now)})");
}
}
protected abstract void UpdateLibraryInternal();
internal abstract bool Test();
protected static class NetClient
{
public static Stream MakeRequest(string url, string authScheme, string auth, Logger? logger)
{
HttpClient client = new();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(authScheme, auth);
HttpRequestMessage requestMessage = new ()
{
Method = HttpMethod.Get,
RequestUri = new Uri(url)
};
try
{
HttpResponseMessage response = client.Send(requestMessage);
logger?.WriteLine("LibraryManager.NetClient",
$"GET {url} -> {(int)response.StatusCode}: {response.ReasonPhrase}");
if (response.StatusCode is HttpStatusCode.Unauthorized &&
response.RequestMessage!.RequestUri!.AbsoluteUri != url)
return MakeRequest(response.RequestMessage!.RequestUri!.AbsoluteUri, authScheme, auth, logger);
else if (response.IsSuccessStatusCode)
return response.Content.ReadAsStream();
else
return Stream.Null;
}
catch (Exception e)
{
switch (e)
{
case HttpRequestException:
logger?.WriteLine("LibraryManager.NetClient", $"Failed to make Request:\n\r{e}\n\rContinuing.");
break;
default:
throw;
}
return Stream.Null;
}
}
public static bool MakePost(string url, string authScheme, string auth, Logger? logger)
{
HttpClient client = new()
{
DefaultRequestHeaders =
{
{ "Accept", "application/json" },
{ "Authorization", new AuthenticationHeaderValue(authScheme, auth).ToString() }
}
};
HttpRequestMessage requestMessage = new ()
{
Method = HttpMethod.Post,
RequestUri = new Uri(url)
};
HttpResponseMessage response = client.Send(requestMessage);
logger?.WriteLine("LibraryManager.NetClient", $"POST {url} -> {(int)response.StatusCode}: {response.ReasonPhrase}");
if(response.StatusCode is HttpStatusCode.Unauthorized && response.RequestMessage!.RequestUri!.AbsoluteUri != url)
return MakePost(response.RequestMessage!.RequestUri!.AbsoluteUri, authScheme, auth, logger);
else if (response.IsSuccessStatusCode)
return true;
else
return false;
}
}
}


@ -0,0 +1,45 @@
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
namespace Tranga.LibraryConnectors;
public class LibraryManagerJsonConverter : JsonConverter
{
private readonly GlobalBase _clone;
internal LibraryManagerJsonConverter(GlobalBase clone)
{
this._clone = clone;
}
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(LibraryConnector));
}
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
if (jo["libraryType"]!.Value<byte>() == (byte)LibraryConnector.LibraryType.Komga)
return new Komga(this._clone,
jo.GetValue("baseUrl")!.Value<string>()!,
jo.GetValue("auth")!.Value<string>()!);
if (jo["libraryType"]!.Value<byte>() == (byte)LibraryConnector.LibraryType.Kavita)
return new Kavita(this._clone,
jo.GetValue("baseUrl")!.Value<string>()!,
jo.GetValue("auth")!.Value<string>()!);
throw new Exception();
}
public override bool CanWrite => false;
/// <summary>
/// Don't call this
/// </summary>
public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
{
throw new Exception("Dont call this");
}
}
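A sketch of how persisted connectors can be re-hydrated with this converter (from inside the Tranga assembly, since the constructor is internal); `globalBase` is assumed to exist, and the settings path comes from TrangaSettings as used by GlobalBase above:

// Re-hydrate library connectors persisted by GlobalBase.AddLibraryConnector.
LibraryConnector[] loaded = JsonConvert.DeserializeObject<LibraryConnector[]>(
    File.ReadAllText(TrangaSettings.libraryConnectorsFilePath),
    new LibraryManagerJsonConverter(globalBase)) ?? Array.Empty<LibraryConnector>();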


@ -1,121 +0,0 @@
using System.Net;
using System.Net.Http.Headers;
using Logging;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Tranga.LibraryManagers;
namespace Tranga;
public abstract class LibraryManager
{
public enum LibraryType : byte
{
Komga = 0,
Kavita = 1
}
public LibraryType libraryType { get; }
public string baseUrl { get; }
public string auth { get; } //Base64 encoded, if you use your password everywhere, you have problems
protected Logger? logger;
/// <param name="baseUrl">Base-URL of Komga instance, no trailing slashes(/)</param>
/// <param name="auth">Base64 string of username and password (username):(password)</param>
/// <param name="logger"></param>
protected LibraryManager(string baseUrl, string auth, Logger? logger, LibraryType libraryType)
{
this.baseUrl = baseUrl;
this.auth = auth;
this.logger = logger;
this.libraryType = libraryType;
}
public abstract void UpdateLibrary();
public void AddLogger(Logger newLogger)
{
this.logger = newLogger;
}
protected static class NetClient
{
public static Stream MakeRequest(string url, string authScheme, string auth, Logger? logger)
{
HttpClient client = new();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(authScheme, auth);
HttpRequestMessage requestMessage = new ()
{
Method = HttpMethod.Get,
RequestUri = new Uri(url)
};
logger?.WriteLine("LibraryManager", $"GET {url}");
HttpResponseMessage response = client.Send(requestMessage);
logger?.WriteLine("LibraryManager", $"{(int)response.StatusCode} {response.StatusCode}: {response.ReasonPhrase}");
if(response.StatusCode is HttpStatusCode.Unauthorized && response.RequestMessage!.RequestUri!.AbsoluteUri != url)
return MakeRequest(response.RequestMessage!.RequestUri!.AbsoluteUri, authScheme, auth, logger);
else if (response.IsSuccessStatusCode)
return response.Content.ReadAsStream();
else
return Stream.Null;
}
public static bool MakePost(string url, string authScheme, string auth, Logger? logger)
{
HttpClient client = new()
{
DefaultRequestHeaders =
{
{ "Accept", "application/json" },
{ "Authorization", new AuthenticationHeaderValue(authScheme, auth).ToString() }
}
};
HttpRequestMessage requestMessage = new ()
{
Method = HttpMethod.Post,
RequestUri = new Uri(url)
};
logger?.WriteLine("LibraryManager", $"POST {url}");
HttpResponseMessage response = client.Send(requestMessage);
logger?.WriteLine("LibraryManager", $"{(int)response.StatusCode} {response.StatusCode}: {response.ReasonPhrase}");
if(response.StatusCode is HttpStatusCode.Unauthorized && response.RequestMessage!.RequestUri!.AbsoluteUri != url)
return MakePost(response.RequestMessage!.RequestUri!.AbsoluteUri, authScheme, auth, logger);
else if (response.IsSuccessStatusCode)
return true;
else
return false;
}
}
public class LibraryManagerJsonConverter : JsonConverter
{
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(LibraryManager));
}
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
if (jo["libraryType"]!.Value<Int64>() == (Int64)LibraryType.Komga)
return jo.ToObject<Komga>(serializer)!;
if (jo["libraryType"]!.Value<Int64>() == (Int64)LibraryType.Kavita)
return jo.ToObject<Kavita>(serializer)!;
throw new Exception();
}
public override bool CanWrite => false;
/// <summary>
/// Don't call this
/// </summary>
public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
{
throw new Exception("Dont call this");
}
}
}


@ -1,94 +0,0 @@
using System.Text.Json.Nodes;
using Logging;
using Newtonsoft.Json;
using JsonSerializer = System.Text.Json.JsonSerializer;
namespace Tranga.LibraryManagers;
public class Kavita : LibraryManager
{
public Kavita(string baseUrl, string username, string password, Logger? logger) : base(baseUrl, GetToken(baseUrl, username, password), logger, LibraryType.Kavita)
{
}
[JsonConstructor]
public Kavita(string baseUrl, string auth, Logger? logger) : base(baseUrl, auth, logger, LibraryType.Kavita)
{
}
private static string GetToken(string baseUrl, string username, string password)
{
HttpClient client = new()
{
DefaultRequestHeaders =
{
{ "Accept", "application/json" }
}
};
HttpRequestMessage requestMessage = new ()
{
Method = HttpMethod.Post,
RequestUri = new Uri($"{baseUrl}/api/Account/login"),
Content = new StringContent($"{{\"username\":\"{username}\",\"password\":\"{password}\"}}", System.Text.Encoding.UTF8, "application/json")
};
HttpResponseMessage response = client.Send(requestMessage);
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(response.Content.ReadAsStream());
if (result is not null)
return result!["token"]!.GetValue<string>();
else return "";
}
public override void UpdateLibrary()
{
logger?.WriteLine(this.GetType().ToString(), $"Updating Libraries");
foreach (KavitaLibrary lib in GetLibraries())
NetClient.MakePost($"{baseUrl}/api/Library/scan?libraryId={lib.id}", "Bearer", auth, logger);
}
/// <summary>
/// Fetches all libraries available to the user
/// </summary>
/// <returns>Array of KavitaLibrary</returns>
private IEnumerable<KavitaLibrary> GetLibraries()
{
logger?.WriteLine(this.GetType().ToString(), $"Getting Libraries");
Stream data = NetClient.MakeRequest($"{baseUrl}/api/Library", "Bearer", auth, logger);
if (data == Stream.Null)
{
logger?.WriteLine(this.GetType().ToString(), $"No libraries returned");
return Array.Empty<KavitaLibrary>();
}
JsonArray? result = JsonSerializer.Deserialize<JsonArray>(data);
if (result is null)
{
logger?.WriteLine(this.GetType().ToString(), $"No libraries returned");
return Array.Empty<KavitaLibrary>();
}
HashSet<KavitaLibrary> ret = new();
foreach (JsonNode? jsonNode in result)
{
var jObject = (JsonObject?)jsonNode;
int libraryId = jObject!["id"]!.GetValue<int>();
string libraryName = jObject!["name"]!.GetValue<string>();
ret.Add(new KavitaLibrary(libraryId, libraryName));
}
return ret;
}
private struct KavitaLibrary
{
public int id { get; }
public string name { get; }
public KavitaLibrary(int id, string name)
{
this.id = id;
this.name = name;
}
}
}
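For reference, a minimal usage sketch of the Kavita connector above, assuming the Tranga.LibraryManagers namespace is in scope; the URL and credentials are placeholders:

// Placeholder values; passing null for the logger simply disables logging.
Kavita kavita = new("http://localhost:5001", "admin", "password", null);
// The constructor fetched a token via /api/Account/login; this lists libraries
// via /api/Library and triggers a scan for each of them.
kavita.UpdateLibrary();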

Tranga/Manga.cs Normal file

@ -0,0 +1,222 @@
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using System.Web;
using Newtonsoft.Json;
using static System.IO.UnixFileMode;
namespace Tranga;
/// <summary>
/// Contains information on a Publication (Manga)
/// </summary>
public struct Manga
{
public string sortName { get; private set; }
public List<string> authors { get; private set; }
// ReSharper disable once UnusedAutoPropertyAccessor.Global
public Dictionary<string,string> altTitles { get; private set; }
// ReSharper disable once MemberCanBePrivate.Global
public string? description { get; private set; }
public string[] tags { get; private set; }
// ReSharper disable once UnusedAutoPropertyAccessor.Global
public string? coverUrl { get; private set; }
public string? coverFileNameInCache { get; private set; }
// ReSharper disable once UnusedAutoPropertyAccessor.Global
public Dictionary<string,string> links { get; }
// ReSharper disable once MemberCanBePrivate.Global
public int? year { get; private set; }
public string? originalLanguage { get; }
// ReSharper disable twice MemberCanBePrivate.Global
public string status { get; private set; }
public ReleaseStatusByte releaseStatus { get; private set; }
public enum ReleaseStatusByte : byte
{
Continuing = 0,
Completed = 1,
OnHiatus = 2,
Cancelled = 3,
Unreleased = 4
};
public string folderName { get; private set; }
public string publicationId { get; }
public string internalId { get; }
public float ignoreChaptersBelow { get; set; }
public float latestChapterDownloaded { get; set; }
public float latestChapterAvailable { get; set; }
public string? websiteUrl { get; private set; }
private static readonly Regex LegalCharacters = new (@"[A-Za-zÀ-ÖØ-öø-ÿ0-9 \.\-,'\'\)\(~!\+]*");
[JsonConstructor]
public Manga(string sortName, List<string> authors, string? description, Dictionary<string,string> altTitles, string[] tags, string? coverUrl, string? coverFileNameInCache, Dictionary<string,string>? links, int? year, string? originalLanguage, string publicationId, ReleaseStatusByte releaseStatus, string? websiteUrl = null, string? folderName = null, float? ignoreChaptersBelow = 0)
{
this.sortName = HttpUtility.HtmlDecode(sortName);
this.authors = authors.Select(HttpUtility.HtmlDecode).ToList()!;
this.description = HttpUtility.HtmlDecode(description);
this.altTitles = altTitles.ToDictionary(a => HttpUtility.HtmlDecode(a.Key), a => HttpUtility.HtmlDecode(a.Value));
this.tags = tags.Select(HttpUtility.HtmlDecode).ToArray()!;
this.coverFileNameInCache = coverFileNameInCache;
this.coverUrl = coverUrl;
this.links = links ?? new Dictionary<string, string>();
this.year = year;
this.originalLanguage = originalLanguage;
this.publicationId = publicationId;
this.folderName = folderName ?? string.Concat(LegalCharacters.Matches(HttpUtility.HtmlDecode(sortName)));
while (this.folderName.EndsWith('.'))
this.folderName = this.folderName.Substring(0, this.folderName.Length - 1);
string onlyLowerLetters = string.Concat(this.sortName.ToLower().Where(Char.IsLetter));
this.internalId = DateTime.Now.Ticks.ToString();
this.ignoreChaptersBelow = ignoreChaptersBelow ?? 0f;
this.latestChapterDownloaded = 0;
this.latestChapterAvailable = 0;
this.releaseStatus = releaseStatus;
this.status = Enum.GetName(releaseStatus) ?? "";
this.websiteUrl = websiteUrl;
}
public Manga WithMetadata(Manga newManga)
{
return this with
{
sortName = newManga.sortName,
description = newManga.description,
coverUrl = newManga.coverUrl,
authors = authors.Union(newManga.authors).ToList(),
altTitles = altTitles.UnionBy(newManga.altTitles, kv => kv.Key).ToDictionary(x => x.Key, x => x.Value),
tags = tags.Union(newManga.tags).ToArray(),
status = newManga.status,
releaseStatus = newManga.releaseStatus,
websiteUrl = newManga.websiteUrl,
year = newManga.year,
coverFileNameInCache = newManga.coverFileNameInCache
};
}
public override bool Equals(object? obj)
{
if (obj is not Manga compareManga)
return false;
return this.description == compareManga.description &&
this.year == compareManga.year &&
this.status == compareManga.status &&
this.releaseStatus == compareManga.releaseStatus &&
this.sortName == compareManga.sortName &&
this.latestChapterAvailable.Equals(compareManga.latestChapterAvailable) &&
this.authors.All(a => compareManga.authors.Contains(a)) &&
(this.coverFileNameInCache??"").Equals(compareManga.coverFileNameInCache) &&
(this.websiteUrl??"").Equals(compareManga.websiteUrl) &&
this.tags.All(t => compareManga.tags.Contains(t));
}
public override string ToString()
{
return $"Publication {sortName} {internalId}";
}
public string CreatePublicationFolder(string downloadDirectory)
{
string publicationFolder = Path.Join(downloadDirectory, this.folderName);
if(!Directory.Exists(publicationFolder))
Directory.CreateDirectory(publicationFolder);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(publicationFolder, GroupRead | GroupWrite | GroupExecute | OtherRead | OtherWrite | OtherExecute | UserRead | UserWrite | UserExecute);
return publicationFolder;
}
public void MovePublicationFolder(string downloadDirectory, string newFolderName)
{
string oldPath = Path.Join(downloadDirectory, this.folderName);
this.folderName = newFolderName;//Create new Path with the new folderName
string newPath = CreatePublicationFolder(downloadDirectory);
if (Directory.Exists(oldPath))
{
if (Directory.Exists(newPath)) //Move/Overwrite old Files, Delete old Directory
{
IEnumerable<string> newPathFileNames = new DirectoryInfo(newPath).GetFiles().Select(fi => fi.Name);
foreach(FileInfo fileInfo in new DirectoryInfo(oldPath).GetFiles().Where(fi => newPathFileNames.Contains(fi.Name) == false))
File.Move(fileInfo.FullName, Path.Join(newPath, fileInfo.Name), true);
Directory.Delete(oldPath);
}else
Directory.Move(oldPath, newPath);
}
}
public void UpdateLatestDownloadedChapter(Chapter chapter)//TODO check files if chapters are all downloaded
{
float chapterNumber = Convert.ToSingle(chapter.chapterNumber, GlobalBase.numberFormatDecimalPoint);
latestChapterDownloaded = latestChapterDownloaded < chapterNumber ? chapterNumber : latestChapterDownloaded;
}
public void SaveSeriesInfoJson(bool overwrite = false)
{
string publicationFolder = CreatePublicationFolder(TrangaSettings.downloadLocation);
string seriesInfoPath = Path.Join(publicationFolder, "series.json");
if(overwrite || !File.Exists(seriesInfoPath))
File.WriteAllText(seriesInfoPath,this.GetSeriesInfoJson());
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(seriesInfoPath, GroupRead | GroupWrite | OtherRead | OtherWrite | UserRead | UserWrite);
}
/// <returns>Serialized JSON String for series.json</returns>
private string GetSeriesInfoJson()
{
SeriesInfo si = new (new Metadata(this));
return System.Text.Json.JsonSerializer.Serialize(si);
}
//Only for series.json
private struct SeriesInfo
{
// ReSharper disable once UnusedAutoPropertyAccessor.Local -- required for JSON serialization
[JsonRequired]public Metadata metadata { get; }
public SeriesInfo(Metadata metadata) => this.metadata = metadata;
}
//Only for series.json; every field is required (non-null) by the readers consuming it
private struct Metadata
{
// ReSharper disable UnusedAutoPropertyAccessor.Local -- all fields are required for serialization
[JsonRequired] public string type { get; }
[JsonRequired] public string publisher { get; }
// ReSharper disable twice IdentifierTypo
[JsonRequired] public int comicid { get; }
[JsonRequired] public string booktype { get; }
// ReSharper disable InconsistentNaming -- the series.json schema expects this one property capitalized
[JsonRequired] public string ComicImage { get; }
[JsonRequired] public int total_issues { get; }
[JsonRequired] public string publication_run { get; }
[JsonRequired]public string name { get; }
[JsonRequired]public string year { get; }
[JsonRequired]public string status { get; }
[JsonRequired]public string description_text { get; }
public Metadata(Manga manga) : this(manga.sortName, manga.year.ToString() ?? string.Empty, manga.releaseStatus, manga.description ?? "")
{
}
public Metadata(string name, string year, ReleaseStatusByte status, string description_text)
{
this.name = name;
this.year = year;
this.status = status switch
{
ReleaseStatusByte.Continuing => "Continuing",
ReleaseStatusByte.Completed => "Ended",
_ => Enum.GetName(status) ?? "Ended"
};
this.description_text = description_text;
//Hard-coded placeholder values; without them Komga fails to parse series.json
type = "Manga";
publisher = "";
comicid = 0;
booktype = "";
ComicImage = "";
total_issues = 0;
publication_run = "";
}
}
}
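To illustrate the constructor and the series.json mapping above, a small sketch with made-up values, assuming the Tranga namespace is in scope and that TrangaSettings.downloadLocation points to a writable directory:

// All values are illustrative.
Manga manga = new(
    sortName: "Example Title",
    authors: new List<string> { "Example Author" },
    description: "A short description.",
    altTitles: new Dictionary<string, string>(),
    tags: new[] { "Action" },
    coverUrl: null,
    coverFileNameInCache: null,
    links: null,
    year: 2020,
    originalLanguage: "ja",
    publicationId: "example-id",
    releaseStatus: Manga.ReleaseStatusByte.Completed);
manga.SaveSeriesInfoJson(); // the generated series.json reports status "Ended", as mapped in Metadata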


@ -0,0 +1,225 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Bato : MangaConnector
{
public Bato(GlobalBase clone) : base(clone, "Bato", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join(' ', Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
string requestUrl = $"https://bato.to/v3x-search?word={sanitizedTitle}&lang=en";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<Manga>();
}
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://bato.to/title/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult = downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return null;
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return null;
}
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, url.Split('/')[^1], url);
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNode mangaList = document.DocumentNode.SelectSingleNode("//div[@data-hk='0-0-2']");
if (!mangaList.ChildNodes.Any(node => node.Name == "div"))
return Array.Empty<Manga>();
List<string> urls = mangaList.ChildNodes
.Select(node => $"https://bato.to{node.Descendants("div").First().FirstChild.GetAttributeValue("href", "")}").ToList();
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
HtmlNode infoNode = document.DocumentNode.SelectSingleNode("/html/body/div/main/div[1]/div[2]");
string sortName = infoNode.Descendants("h3").First().InnerText;
string description = document.DocumentNode
.SelectSingleNode("//div[contains(concat(' ',normalize-space(@class),' '),'prose')]").InnerText;
string[] altTitlesList = infoNode.ChildNodes[1].ChildNodes[2].InnerText.Split('/');
int i = 0;
Dictionary<string, string> altTitles = altTitlesList.ToDictionary(s => i++.ToString(), s => s);
string posterUrl = document.DocumentNode.SelectNodes("//img")
.First(child => child.GetAttributeValue("data-hk", "") == "0-1-0").GetAttributeValue("src", "").Replace("&amp;", "&");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
List<HtmlNode> genreNodes = document.DocumentNode.SelectSingleNode("//b[text()='Genres:']/..").SelectNodes("span").ToList();
string[] tags = genreNodes.Select(node => node.FirstChild.InnerText).ToArray();
List<HtmlNode> authorsNodes = infoNode.ChildNodes[1].ChildNodes[3].Descendants("a").ToList();
List<string> authors = authorsNodes.Select(node => node.InnerText.Replace("amp;", "")).ToList();
HtmlNode? originalLanguageNode = document.DocumentNode.SelectSingleNode("//span[text()='Tr From']/..");
string originalLanguage = originalLanguageNode is not null ? originalLanguageNode.LastChild.InnerText : "";
if (!int.TryParse(
document.DocumentNode.SelectSingleNode("//span[text()='Original Publication:']/..").LastChild.InnerText.Split('-')[0],
out int year))
year = DateTime.Now.Year;
string status = document.DocumentNode.SelectSingleNode("//span[text()='Original Publication:']/..")
.ChildNodes[2].InnerText;
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
switch (status.ToLower())
{
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
case "completed": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "pending": releaseStatus = Manga.ReleaseStatusByte.Unreleased; break;
}
Manga manga = new (sortName, authors, description, altTitles, tags, posterUrl, coverFileNameInCache, new Dictionary<string, string>(),
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://bato.to/title/{manga.publicationId}";
// Request kept to verify that the page exists before parsing chapters
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
//Return Chapters ordered by Chapter-Number
List<Chapter> chapters = ParseChaptersFromHtml(manga, requestUrl);
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, string mangaUrl)
{
RequestResult result = downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
Log("Failed to load site");
return new List<Chapter>();
}
List<Chapter> ret = new();
HtmlNode chapterList =
result.htmlDocument.DocumentNode.SelectSingleNode("/html/body/div/main/div[3]/astro-island/div/div[2]/div/div/astro-slot");
Regex numberRex = new(@"\/title\/.+\/[0-9]+(-vol_([0-9]+))?-ch_([0-9\.]+)");
foreach (HtmlNode chapterInfo in chapterList.SelectNodes("div"))
{
HtmlNode infoNode = chapterInfo.FirstChild.FirstChild;
string chapterUrl = infoNode.GetAttributeValue("href", "");
Match match = numberRex.Match(chapterUrl);
string? volumeNumber = match.Groups[2].Success ? match.Groups[2].Value : null;
string chapterNumber = match.Groups[3].Value;
string chapterName = chapterNumber;
string url = $"https://bato.to{chapterUrl}?load=2";
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
return ret;
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = chapter.url;
// Request kept to verify that the page exists before downloading
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://bato.to/", progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)
{
RequestResult requestResult =
downloadClient.MakeRequest(mangaUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
return Array.Empty<string>();
}
if (requestResult.htmlDocument is null)
{
Log($"Failed to retrieve site");
return Array.Empty<string>();
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode images = document.DocumentNode.SelectNodes("//astro-island").First(node =>
node.GetAttributeValue("component-url", "").Contains("/_astro/ImageList."));
string weirdString = images.OuterHtml;
string weirdString2 = Regex.Match(weirdString, @"props=\""(.*)}\""").Groups[1].Value;
string[] urls = Regex.Matches(weirdString2, @"(https:\/\/[A-z\-0-9\.\?\&\;\=\/]+)\\")
.Select(match => match.Groups[1].Value.Replace("&amp;", "&")).ToArray();
return urls;
}
}
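The chapter list is parsed with the volume/chapter regex above; a standalone illustration on a made-up chapter path (the path only mirrors the pattern itself and is not a real Bato URL):

using System.Text.RegularExpressions;

Regex numberRex = new(@"\/title\/.+\/[0-9]+(-vol_([0-9]+))?-ch_([0-9\.]+)");
Match match = numberRex.Match("/title/12345-example-manga/67890-vol_2-ch_13.5");
Console.WriteLine(match.Groups[2].Value); // "2"    (optional volume)
Console.WriteLine(match.Groups[3].Value); // "13.5" (chapter)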


@ -0,0 +1,86 @@
using System.Net;
using System.Text;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using PuppeteerSharp;
namespace Tranga.MangaConnectors;
internal class ChromiumDownloadClient : DownloadClient
{
private static readonly IBrowser Browser = StartBrowser().Result;
private const int StartTimeoutMs = 10000;
private readonly HttpDownloadClient _httpDownloadClient;
private static async Task<IBrowser> StartBrowser()
{
return await Puppeteer.LaunchAsync(new LaunchOptions
{
Headless = true,
Args = new [] {
"--disable-gpu",
"--disable-dev-shm-usage",
"--disable-setuid-sandbox",
"--no-sandbox"},
Timeout = StartTimeoutMs
});
}
public ChromiumDownloadClient(GlobalBase clone) : base(clone)
{
_httpDownloadClient = new(this);
}
private readonly Regex _imageUrlRex = new(@"https?:\/\/.*\.(?:p?jpe?g|gif|a?png|bmp|avif|webp)(\?.*)?");
internal override RequestResult MakeRequestInternal(string url, string? referrer = null, string? clickButton = null)
{
return _imageUrlRex.IsMatch(url)
? _httpDownloadClient.MakeRequestInternal(url, referrer)
: MakeRequestBrowser(url, referrer, clickButton);
}
private RequestResult MakeRequestBrowser(string url, string? referrer = null, string? clickButton = null)
{
IPage page = Browser.NewPageAsync().Result;
page.DefaultTimeout = 10000;
IResponse response;
try
{
response = page.GoToAsync(url, WaitUntilNavigation.Networkidle0).Result;
Log("Page loaded.");
}
catch (Exception e)
{
Log($"Could not load Page:\n{e.Message}");
page.CloseAsync();
return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null);
}
Stream stream = Stream.Null;
HtmlDocument? document = null;
if (response.Headers.TryGetValue("Content-Type", out string? content))
{
if (content.Contains("text/html"))
{
if (clickButton is not null && page.QuerySelectorAsync(clickButton).Result is not null)
page.ClickAsync(clickButton).Wait();
string htmlString = page.GetContentAsync().Result;
stream = new MemoryStream(Encoding.Default.GetBytes(htmlString));
document = new ();
document.LoadHtml(htmlString);
}else if (content.Contains("image"))
{
stream = new MemoryStream(response.BufferAsync().Result);
}
}
else
{
page.CloseAsync();
return new RequestResult(HttpStatusCode.InternalServerError, null, Stream.Null);
}
page.CloseAsync();
return new RequestResult(response.Status, document, stream, false, "");
}
}
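A standalone check of the routing regex above: direct image links skip Chromium and are fetched with the plain HTTP client (both URLs are placeholders):

using System.Text.RegularExpressions;

Regex imageUrlRex = new(@"https?:\/\/.*\.(?:p?jpe?g|gif|a?png|bmp|avif|webp)(\?.*)?");
Console.WriteLine(imageUrlRex.IsMatch("https://example.com/covers/volume1.jpg")); // True  -> HttpDownloadClient
Console.WriteLine(imageUrlRex.IsMatch("https://example.com/title/some-manga"));   // False -> headless Chromium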


@ -0,0 +1,44 @@
using System.Net;
using HtmlAgilityPack;
namespace Tranga.MangaConnectors;
internal abstract class DownloadClient : GlobalBase
{
private readonly Dictionary<RequestType, DateTime> _lastExecutedRateLimit;
protected DownloadClient(GlobalBase clone) : base(clone)
{
this._lastExecutedRateLimit = new();
}
public RequestResult MakeRequest(string url, RequestType requestType, string? referrer = null, string? clickButton = null)
{
if (!TrangaSettings.requestLimits.ContainsKey(requestType))
{
Log("RequestType not configured for rate-limit.");
return new RequestResult(HttpStatusCode.NotAcceptable, null, Stream.Null);
}
int rateLimit = TrangaSettings.userAgent == TrangaSettings.DefaultUserAgent
? TrangaSettings.DefaultRequestLimits[requestType]
: TrangaSettings.requestLimits[requestType];
TimeSpan timeBetweenRequests = TimeSpan.FromMinutes(1).Divide(rateLimit);
_lastExecutedRateLimit.TryAdd(requestType, DateTime.Now.Subtract(timeBetweenRequests));
TimeSpan rateLimitTimeout = timeBetweenRequests.Subtract(DateTime.Now.Subtract(_lastExecutedRateLimit[requestType]));
if (rateLimitTimeout > TimeSpan.Zero)
{
Log($"Waiting {rateLimitTimeout.TotalSeconds} seconds");
Thread.Sleep(rateLimitTimeout);
}
RequestResult result = MakeRequestInternal(url, referrer, clickButton);
_lastExecutedRateLimit[requestType] = DateTime.Now;
return result;
}
internal abstract RequestResult MakeRequestInternal(string url, string? referrer = null, string? clickButton = null);
}
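A worked example of the request spacing computed above, using an illustrative limit of 60 requests per minute:

int rateLimit = 60; // illustrative requests-per-minute value
TimeSpan timeBetweenRequests = TimeSpan.FromMinutes(1).Divide(rateLimit);
Console.WriteLine(timeBetweenRequests.TotalSeconds); // 1 -> at most one request per second for that RequestType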


@ -0,0 +1,75 @@
using System.Net;
using System.Net.Http.Headers;
using HtmlAgilityPack;
namespace Tranga.MangaConnectors;
internal class HttpDownloadClient : DownloadClient
{
private static readonly HttpClient Client = new()
{
Timeout = TimeSpan.FromSeconds(10)
};
public HttpDownloadClient(GlobalBase clone) : base(clone)
{
Client.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent", TrangaSettings.userAgent);
}
internal override RequestResult MakeRequestInternal(string url, string? referrer = null, string? clickButton = null)
{
if(clickButton is not null)
Log("Can not click button on static site.");
HttpResponseMessage? response = null;
while (response is null)
{
HttpRequestMessage requestMessage = new(HttpMethod.Get, url);
if (referrer is not null)
requestMessage.Headers.Referrer = new Uri(referrer);
//Log($"Requesting {requestType} {url}");
try
{
response = Client.Send(requestMessage);
}
catch (Exception e)
{
switch (e)
{
case TaskCanceledException:
Log($"Request timed out {url}.\n\r{e}");
return new RequestResult(HttpStatusCode.RequestTimeout, null, Stream.Null);
case HttpRequestException:
Log($"Request failed {url}\n\r{e}");
return new RequestResult(HttpStatusCode.BadRequest, null, Stream.Null);
}
}
}
if (!response.IsSuccessStatusCode)
{
Log($"Request-Error {response.StatusCode}: {url}");
return new RequestResult(response.StatusCode, null, Stream.Null);
}
Stream stream = response.Content.ReadAsStream();
HtmlDocument? document = null;
if (response.Content.Headers.ContentType?.MediaType == "text/html")
{
StreamReader reader = new (stream);
document = new ();
document.LoadHtml(reader.ReadToEnd());
stream.Position = 0;
}
// Request has been redirected to another page. For example, it redirects directly to the results when there is only 1 result
if (response.RequestMessage is not null && response.RequestMessage.RequestUri is not null)
{
return new RequestResult(response.StatusCode, document, stream, true,
response.RequestMessage.RequestUri.AbsoluteUri);
}
return new RequestResult(response.StatusCode, document, stream);
}
}
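The redirect bookkeeping above relies on HttpResponseMessage.RequestMessage.RequestUri holding the final URI after automatic redirects; a standalone sketch of that behaviour (the URL is a placeholder):

using System.Net.Http;

// Placeholder URL; the default handler follows redirects automatically.
using HttpClient client = new();
string requestedUrl = "https://example.com/search?q=title";
HttpResponseMessage response = await client.GetAsync(requestedUrl);
bool hasBeenRedirected = response.RequestMessage?.RequestUri?.AbsoluteUri != requestedUrl;
Console.WriteLine(hasBeenRedirected);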


@ -0,0 +1,307 @@
using System.IO.Compression;
using System.Net;
using System.Runtime.InteropServices;
using System.Text.RegularExpressions;
using Tranga.Jobs;
using static System.IO.UnixFileMode;
namespace Tranga.MangaConnectors;
/// <summary>
/// Base-Class for all Connectors
/// Provides some methods to be used by all Connectors, as well as a DownloadClient
/// </summary>
public abstract class MangaConnector : GlobalBase
{
internal DownloadClient downloadClient { get; init; } = null!;
public string[] SupportedLanguages;
protected MangaConnector(GlobalBase clone, string name, string[] supportedLanguages) : base(clone)
{
this.name = name;
this.SupportedLanguages = supportedLanguages;
Directory.CreateDirectory(TrangaSettings.coverImageCache);
}
public string name { get; } //Name of the Connector (e.g. Website)
/// <summary>
/// Returns all Publications with the given string.
/// If the string is empty or null, returns all Publication of the Connector
/// </summary>
/// <param name="publicationTitle">Search-Query</param>
/// <returns>Publications matching the query</returns>
public abstract Manga[] GetManga(string publicationTitle = "");
public abstract Manga? GetMangaFromUrl(string url);
public abstract Manga? GetMangaFromId(string publicationId);
/// <summary>
/// Returns all Chapters of the publication in the provided language.
/// If the language is empty or null, returns all Chapters in all Languages.
/// </summary>
/// <param name="manga">Publication to get Chapters for</param>
/// <param name="language">Language of the Chapters</param>
/// <returns>Array of Chapters matching Publication and Language</returns>
public abstract Chapter[] GetChapters(Manga manga, string language="en");
/// <summary>
/// Updates the available Chapters of a Publication
/// </summary>
/// <param name="manga">Publication to check</param>
/// <param name="language">Language to receive chapters for</param>
/// <returns>List of Chapters that were previously not in collection</returns>
public Chapter[] GetNewChapters(Manga manga, string language = "en")
{
Log($"Getting new Chapters for {manga}");
Chapter[] allChapters = this.GetChapters(manga, language);
if (allChapters.Length < 1)
return Array.Empty<Chapter>();
Log($"Checking for duplicates {manga}");
List<Chapter> newChaptersList = allChapters.Where(nChapter => float.TryParse(nChapter.chapterNumber, numberFormatDecimalPoint, out float chapterNumber)
&& chapterNumber > manga.ignoreChaptersBelow
&& !nChapter.CheckChapterIsDownloaded()).ToList();
Log($"{newChaptersList.Count} new chapters. {manga}");
try
{
Chapter latestChapterAvailable =
allChapters.Max();
manga.latestChapterAvailable =
Convert.ToSingle(latestChapterAvailable.chapterNumber, numberFormatDecimalPoint);
}
catch (Exception e)
{
Log(e.ToString());
Log($"Failed getting new Chapters for {manga}");
}
return newChaptersList.ToArray();
}
public Chapter[] SelectChapters(Manga manga, string searchTerm, string? language = null)
{
Chapter[] availableChapters = this.GetChapters(manga, language??"en");
Regex volumeRegex = new ("((v(ol)*(olume)*){1} *([0-9]+(-[0-9]+)?){1})", RegexOptions.IgnoreCase);
Regex chapterRegex = new ("((c(h)*(hapter)*){1} *([0-9]+(-[0-9]+)?){1})", RegexOptions.IgnoreCase);
Regex singleResultRegex = new("([0-9]+)", RegexOptions.IgnoreCase);
Regex rangeResultRegex = new("([0-9]+(-[0-9]+))", RegexOptions.IgnoreCase);
Regex allRegex = new("a(ll)?", RegexOptions.IgnoreCase);
if (volumeRegex.IsMatch(searchTerm) && chapterRegex.IsMatch(searchTerm))
{
string volume = singleResultRegex.Match(volumeRegex.Match(searchTerm).Value).Value;
string chapter = singleResultRegex.Match(chapterRegex.Match(searchTerm).Value).Value;
return availableChapters.Where(aCh => aCh.volumeNumber is not null &&
aCh.volumeNumber.Equals(volume, StringComparison.InvariantCultureIgnoreCase) &&
aCh.chapterNumber.Equals(chapter, StringComparison.InvariantCultureIgnoreCase))
.ToArray();
}
else if (volumeRegex.IsMatch(searchTerm))
{
string volume = volumeRegex.Match(searchTerm).Value;
if (rangeResultRegex.IsMatch(volume))
{
string range = rangeResultRegex.Match(volume).Value;
int start = Convert.ToInt32(range.Split('-')[0]);
int end = Convert.ToInt32(range.Split('-')[1]);
return availableChapters.Where(aCh => aCh.volumeNumber is not null &&
Convert.ToInt32(aCh.volumeNumber) >= start &&
Convert.ToInt32(aCh.volumeNumber) <= end).ToArray();
}
else if (singleResultRegex.IsMatch(volume))
{
string volumeNumber = singleResultRegex.Match(volume).Value;
return availableChapters.Where(aCh =>
aCh.volumeNumber is not null &&
aCh.volumeNumber.Equals(volumeNumber, StringComparison.InvariantCultureIgnoreCase)).ToArray();
}
}
else if (chapterRegex.IsMatch(searchTerm))
{
string chapter = chapterRegex.Match(searchTerm).Value;
if (rangeResultRegex.IsMatch(chapter))
{
string range = rangeResultRegex.Match(chapter).Value;
int start = Convert.ToInt32(range.Split('-')[0]);
int end = Convert.ToInt32(range.Split('-')[1]);
return availableChapters.Where(aCh => Convert.ToInt32(aCh.chapterNumber) >= start &&
Convert.ToInt32(aCh.chapterNumber) <= end).ToArray();
}
else if (singleResultRegex.IsMatch(chapter))
{
string chapterNumber = singleResultRegex.Match(chapter).Value;
return availableChapters.Where(aCh =>
aCh.chapterNumber.Equals(chapterNumber, StringComparison.InvariantCultureIgnoreCase)).ToArray();
}
}
else
{
if (rangeResultRegex.IsMatch(searchTerm))
{
int start = Convert.ToInt32(searchTerm.Split('-')[0]);
int end = Convert.ToInt32(searchTerm.Split('-')[1]);
return availableChapters[start..(end + 1)];
}
else if(singleResultRegex.IsMatch(searchTerm))
return new [] { availableChapters[Convert.ToInt32(searchTerm)] };
else if (allRegex.IsMatch(searchTerm))
return availableChapters;
}
return Array.Empty<Chapter>();
}
public abstract HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null);
/// <summary>
/// Copies the already downloaded cover from cache to downloadLocation
/// </summary>
/// <param name="manga">Publication to retrieve Cover for</param>
/// <param name="retries">Number of times to retry to copy the cover (or download it first)</param>
public void CopyCoverFromCacheToDownloadLocation(Manga manga, int? retries = 1)
{
Log($"Copy cover {manga}");
//Check if Publication already has a Folder and cover
string publicationFolder = manga.CreatePublicationFolder(TrangaSettings.downloadLocation);
DirectoryInfo dirInfo = new (publicationFolder);
if (dirInfo.EnumerateFiles().Any(info => info.Name.Contains("cover", StringComparison.InvariantCultureIgnoreCase)))
{
Log($"Cover exists {manga}");
return;
}
string? fileInCache = manga.coverFileNameInCache;
if (fileInCache is null || !File.Exists(fileInCache))
{
Log($"Cloning cover failed: File missing {fileInCache}.");
if (retries > 0 && manga.coverUrl is not null)
{
Log($"Trying {retries} more times");
SaveCoverImageToCache(manga.coverUrl, manga.internalId, 0);
CopyCoverFromCacheToDownloadLocation(manga, --retries);
}
return;
}
string newFilePath = Path.Join(publicationFolder, $"cover.{Path.GetFileName(fileInCache).Split('.')[^1]}" );
Log($"Cloning cover {fileInCache} -> {newFilePath}");
File.Copy(fileInCache, newFilePath, true);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(newFilePath, GroupRead | GroupWrite | UserRead | UserWrite);
}
/// <summary>
/// Downloads Image from URL and saves it to the given path(incl. fileName)
/// </summary>
/// <param name="imageUrl"></param>
/// <param name="fullPath"></param>
/// <param name="requestType">RequestType for Rate-Limit</param>
/// <param name="referrer">referrer used in html request header</param>
private HttpStatusCode DownloadImage(string imageUrl, string fullPath, RequestType requestType, string? referrer = null)
{
RequestResult requestResult = downloadClient.MakeRequest(imageUrl, requestType, referrer);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return requestResult.statusCode;
if (requestResult.result == Stream.Null)
return HttpStatusCode.NotFound;
FileStream fs = new (fullPath, FileMode.Create);
requestResult.result.CopyTo(fs);
fs.Close();
return requestResult.statusCode;
}
protected HttpStatusCode DownloadChapterImages(string[] imageUrls, string saveArchiveFilePath, RequestType requestType, string? comicInfoPath = null, string? referrer = null, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
return HttpStatusCode.RequestTimeout;
Log($"Downloading Images for {saveArchiveFilePath}");
if (progressToken is not null)
progressToken.increments += imageUrls.Length;
//Check if Publication Directory already exists
string directoryPath = Path.GetDirectoryName(saveArchiveFilePath)!;
if (!Directory.Exists(directoryPath))
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
Directory.CreateDirectory(directoryPath,
UserRead | UserWrite | UserExecute | GroupRead | GroupWrite | GroupExecute );
else
Directory.CreateDirectory(directoryPath);
if (File.Exists(saveArchiveFilePath)) //Don't download twice.
{
progressToken?.Complete();
return HttpStatusCode.Created;
}
//Create a temporary folder to store images
string tempFolder = Directory.CreateTempSubdirectory("trangatemp").FullName;
int chapter = 0;
//Download all Images to temporary Folder
if (imageUrls.Length == 0)
{
Log("No images found");
//No archive has been created at this point, so there is no file whose permissions could be set
Directory.Delete(tempFolder, true);
progressToken?.Complete();
return HttpStatusCode.NoContent;
}
foreach (string imageUrl in imageUrls)
{
string extension = imageUrl.Split('.')[^1].Split('?')[0];
Log($"Downloading image {chapter + 1:000}/{imageUrls.Length:000}"); //TODO
HttpStatusCode status = DownloadImage(imageUrl, Path.Join(tempFolder, $"{chapter++}.{extension}"), requestType, referrer);
Log($"{saveArchiveFilePath} {chapter + 1:000}/{imageUrls.Length:000} {status}");
if ((int)status < 200 || (int)status >= 300)
{
progressToken?.Complete();
return status;
}
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Complete();
return HttpStatusCode.RequestTimeout;
}
progressToken?.Increment();
}
if(comicInfoPath is not null){
File.Copy(comicInfoPath, Path.Join(tempFolder, "ComicInfo.xml"));
File.Delete(comicInfoPath); //Delete tmp-file
}
Log($"Creating archive {saveArchiveFilePath}");
//ZIP-it and ship-it
ZipFile.CreateFromDirectory(tempFolder, saveArchiveFilePath);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(saveArchiveFilePath, UserRead | UserWrite | UserExecute | GroupRead | GroupWrite | GroupExecute);
Directory.Delete(tempFolder, true); //Cleanup
progressToken?.Complete();
return HttpStatusCode.OK;
}
protected string SaveCoverImageToCache(string url, string mangaInternalId, RequestType requestType)
{
Regex urlRex = new (@"https?:\/\/((?:[a-zA-Z0-9-]+\.)+[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+))");
//https?:\/\/[a-zA-Z0-9-]+\.([a-zA-Z0-9-]+\.[a-zA-Z0-9]+)\/(?:.+\/)*(.+\.([a-zA-Z]+)) for only second level domains
Match match = urlRex.Match(url);
string filename = $"{match.Groups[1].Value}-{mangaInternalId}.{match.Groups[3].Value}";
string saveImagePath = Path.Join(TrangaSettings.coverImageCache, filename);
if (File.Exists(saveImagePath))
return saveImagePath;
RequestResult coverResult = downloadClient.MakeRequest(url, requestType);
using MemoryStream ms = new();
coverResult.result.CopyTo(ms);
Directory.CreateDirectory(TrangaSettings.coverImageCache);
File.WriteAllBytes(saveImagePath, ms.ToArray());
Log($"Saving cover to {saveImagePath}");
return saveImagePath;
}
}
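To illustrate which search terms SelectChapters above accepts, a standalone check of its volume/chapter patterns (the terms are made up):

using System.Text.RegularExpressions;

Regex volumeRegex = new("((v(ol)*(olume)*){1} *([0-9]+(-[0-9]+)?){1})", RegexOptions.IgnoreCase);
Regex chapterRegex = new("((c(h)*(hapter)*){1} *([0-9]+(-[0-9]+)?){1})", RegexOptions.IgnoreCase);
Console.WriteLine(volumeRegex.IsMatch("vol 3 ch 12") && chapterRegex.IsMatch("vol 3 ch 12")); // True  -> single volume + chapter
Console.WriteLine(chapterRegex.IsMatch("ch 10-20"));                                          // True  -> chapter range
Console.WriteLine(volumeRegex.IsMatch("all") || chapterRegex.IsMatch("all"));                 // False -> handled by the "all" branch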


@ -0,0 +1,54 @@
using System.Data;
using System.Diagnostics;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
namespace Tranga.MangaConnectors;
public class MangaConnectorJsonConverter : JsonConverter
{
private GlobalBase _clone;
private readonly HashSet<MangaConnector> _connectors;
internal MangaConnectorJsonConverter(GlobalBase clone, HashSet<MangaConnector> connectors)
{
this._clone = clone;
this._connectors = connectors;
}
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(MangaConnector));
}
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
string? connectorName = jo.Value<string>("name");
if (connectorName is null)
throw new ConstraintException("Name can not be null.");
return connectorName switch
{
"MangaDex" => this._connectors.First(c => c is MangaDex),
"Manganato" => this._connectors.First(c => c is Manganato),
"MangaKatana" => this._connectors.First(c => c is MangaKatana),
"Mangasee" => this._connectors.First(c => c is Mangasee),
"Mangaworld" => this._connectors.First(c => c is Mangaworld),
"Bato" => this._connectors.First(c => c is Bato),
"Manga4Life" => this._connectors.First(c => c is MangaLife),
"ManhuaPlus" => this._connectors.First(c => c is ManhuaPlus),
"MangaHere" => this._connectors.First(c => c is MangaHere),
_ => throw new UnreachableException($"Could not find Connector with name {connectorName}")
};
}
public override bool CanWrite => false;
/// <summary>
/// Not supported for writing; CanWrite is false, so serialization never calls this.
/// </summary>
public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
{
throw new Exception("Dont call this");
}
}


@ -0,0 +1,298 @@
using System.Net;
using System.Text.Json.Nodes;
using System.Text.RegularExpressions;
using Tranga.Jobs;
using JsonSerializer = System.Text.Json.JsonSerializer;
namespace Tranga.MangaConnectors;
public class MangaDex : MangaConnector
{
//https://api.mangadex.org/docs/3-enumerations/#language-codes--localization
//https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes
//https://gist.github.com/Josantonius/b455e315bc7f790d14b136d61d9ae469
public MangaDex(GlobalBase clone) : base(clone, "MangaDex", ["en","pt","pt-br","it","de","ru","aa","ab","ae","af","ak","am","an","ar-ae","ar-bh","ar-dz","ar-eg","ar-iq","ar-jo","ar-kw","ar-lb","ar-ly","ar-ma","ar-om","ar-qa","ar-sa","ar-sy","ar-tn","ar-ye","ar","as","av","ay","az","ba","be","bg","bh","bi","bm","bn","bo","br","bs","ca","ce","ch","co","cr","cs","cu","cv","cy","da","de-at","de-ch","de-de","de-li","de-lu","div","dv","dz","ee","el","en-au","en-bz","en-ca","en-cb","en-gb","en-ie","en-jm","en-nz","en-ph","en-tt","en-us","en-za","en-zw","eo","es-ar","es-bo","es-cl","es-co","es-cr","es-do","es-ec","es-es","es-gt","es-hn","es-la","es-mx","es-ni","es-pa","es-pe","es-pr","es-py","es-sv","es-us","es-uy","es-ve","es","et","eu","fa","ff","fi","fj","fo","fr-be","fr-ca","fr-ch","fr-fr","fr-lu","fr-mc","fr","fy","ga","gd","gl","gn","gu","gv","ha","he","hi","ho","hr-ba","hr-hr","hr","ht","hu","hy","hz","ia","id","ie","ig","ii","ik","in","io","is","it-ch","it-it","iu","iw","ja","ja-ro","ji","jv","jw","ka","kg","ki","kj","kk","kl","km","kn","ko","ko-ro","kr","ks","ku","kv","kw","ky","kz","la","lb","lg","li","ln","lo","ls","lt","lu","lv","mg","mh","mi","mk","ml","mn","mo","mr","ms-bn","ms-my","ms","mt","my","na","nb","nd","ne","ng","nl-be","nl-nl","nl","nn","no","nr","ns","nv","ny","oc","oj","om","or","os","pa","pi","pl","ps","pt-pt","qu-bo","qu-ec","qu-pe","qu","rm","rn","ro","rw","sa","sb","sc","sd","se-fi","se-no","se-se","se","sg","sh","si","sk","sl","sm","sn","so","sq","sr-ba","sr-sp","sr","ss","st","su","sv-fi","sv-se","sv","sw","sx","syr","ta","te","tg","th","ti","tk","tl","tn","to","tr","ts","tt","tw","ty","ug","uk","ur","us","uz","ve","vi","vo","wa","wo","xh","yi","yo","za","zh-cn","zh-hk","zh-mo","zh-ro","zh-sg","zh-tw","zh","zu"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term={publicationTitle}");
const int limit = 100; //How many values we want returned at once
int offset = 0; //"Page"
int total = int.MaxValue; //How many total results are there, is updated on first request
HashSet<Manga> retManga = new();
int loadedPublicationData = 0;
List<JsonNode> results = new();
//Request all search-results
while (offset < total) //As long as we haven't requested all "Pages"
{
//Request next Page
RequestResult requestResult = downloadClient.MakeRequest(
$"https://api.mangadex.org/manga?limit={limit}&title={publicationTitle}&offset={offset}" +
$"&contentRating%5B%5D=safe&contentRating%5B%5D=suggestive&contentRating%5B%5D=erotica" +
$"&contentRating%5B%5D=pornographic" +
$"&includes%5B%5D=manga&includes%5B%5D=cover_art&includes%5B%5D=author" +
$"&includes%5B%5D=artist&includes%5B%5D=tag", RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
break;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
offset += limit;
if (result is null)
break;
if(result.ContainsKey("total"))
total = result["total"]!.GetValue<int>(); //Update the total number of Publications
else continue;
if (result.ContainsKey("data"))
results.AddRange(result["data"]!.AsArray()!);//Manga-data-Array
}
foreach (JsonNode mangaNode in results)
{
Log($"Getting publication data. {++loadedPublicationData}/{total}");
if(MangaFromJsonObject(mangaNode.AsObject()) is { } manga)
retManga.Add(manga); //Add Publication (Manga) to result
}
Log($"Retrieved {retManga.Count} publications. Term={publicationTitle}");
return retManga.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
RequestResult requestResult =
downloadClient.MakeRequest($"https://api.mangadex.org/manga/{publicationId}?includes%5B%5D=manga&includes%5B%5D=cover_art&includes%5B%5D=author&includes%5B%5D=artist&includes%5B%5D=tag", RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return null;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
if(result is not null)
return MangaFromJsonObject(result["data"]!.AsObject());
return null;
}
public override Manga? GetMangaFromUrl(string url)
{
Regex idRex = new (@"https:\/\/mangadex.org\/title\/([A-z0-9-]*)\/.*");
string id = idRex.Match(url).Groups[1].Value;
Log($"Got id {id} from {url}");
return GetMangaFromId(id);
}
private Manga? MangaFromJsonObject(JsonObject manga)
{
if (!manga.TryGetPropertyValue("id", out JsonNode? idNode))
return null;
string publicationId = idNode!.GetValue<string>();
if (!manga.TryGetPropertyValue("attributes", out JsonNode? attributesNode))
return null;
JsonObject attributes = attributesNode!.AsObject();
if (!attributes.TryGetPropertyValue("title", out JsonNode? titleNode))
return null;
string title = titleNode!.AsObject().ContainsKey("en") switch
{
true => titleNode.AsObject()["en"]!.GetValue<string>(),
false => titleNode.AsObject().First().Value!.GetValue<string>()
};
Dictionary<string, string> altTitlesDict = new();
if (attributes.TryGetPropertyValue("altTitles", out JsonNode? altTitlesNode))
{
foreach (JsonNode? altTitleNode in altTitlesNode!.AsArray())
{
JsonObject altTitleNodeObject = altTitleNode!.AsObject();
altTitlesDict.TryAdd(altTitleNodeObject.First().Key, altTitleNodeObject.First().Value!.GetValue<string>());
}
}
if (!attributes.TryGetPropertyValue("description", out JsonNode? descriptionNode))
return null;
string description = descriptionNode!.AsObject().ContainsKey("en") switch
{
true => descriptionNode.AsObject()["en"]!.GetValue<string>(),
false => descriptionNode.AsObject().FirstOrDefault().Value?.GetValue<string>() ?? ""
};
Dictionary<string, string> linksDict = new();
if (attributes.TryGetPropertyValue("links", out JsonNode? linksNode) && linksNode is not null)
foreach (KeyValuePair<string, JsonNode?> linkKv in linksNode!.AsObject())
linksDict.TryAdd(linkKv.Key, linkKv.Value.GetValue<string>());
string? originalLanguage =
attributes.TryGetPropertyValue("originalLanguage", out JsonNode? originalLanguageNode) switch
{
true => originalLanguageNode?.GetValue<string>(),
false => null
};
Manga.ReleaseStatusByte status = Manga.ReleaseStatusByte.Unreleased;
if (attributes.TryGetPropertyValue("status", out JsonNode? statusNode))
{
status = statusNode?.GetValue<string>().ToLower() switch
{
"ongoing" => Manga.ReleaseStatusByte.Continuing,
"completed" => Manga.ReleaseStatusByte.Completed,
"hiatus" => Manga.ReleaseStatusByte.OnHiatus,
"cancelled" => Manga.ReleaseStatusByte.Cancelled,
_ => Manga.ReleaseStatusByte.Unreleased
};
}
int? year = attributes.TryGetPropertyValue("year", out JsonNode? yearNode) switch
{
true => yearNode?.GetValue<int>(),
false => null
};
HashSet<string> tags = new(128);
if (attributes.TryGetPropertyValue("tags", out JsonNode? tagsNode))
foreach (JsonNode? tagNode in tagsNode!.AsArray())
tags.Add(tagNode!["attributes"]!["name"]!["en"]!.GetValue<string>());
if (!manga.TryGetPropertyValue("relationships", out JsonNode? relationshipsNode))
return null;
JsonNode? coverNode = relationshipsNode!.AsArray()
.FirstOrDefault(rel => rel!["type"]!.GetValue<string>().Equals("cover_art"));
if (coverNode is null)
return null;
string fileName = coverNode["attributes"]!["fileName"]!.GetValue<string>();
string coverUrl = $"https://uploads.mangadex.org/covers/{publicationId}/{fileName}";
string coverCacheName = SaveCoverImageToCache(coverUrl, publicationId, RequestType.MangaCover);
List<string> authors = new();
JsonNode?[] authorNodes = relationshipsNode.AsArray()
.Where(rel => rel!["type"]!.GetValue<string>().Equals("author") || rel!["type"]!.GetValue<string>().Equals("artist")).ToArray();
foreach (JsonNode? authorNode in authorNodes)
{
string authorName = authorNode!["attributes"]!["name"]!.GetValue<string>();
if(!authors.Contains(authorName))
authors.Add(authorName);
}
Manga pub = new(
title,
authors,
description,
altTitlesDict,
tags.ToArray(),
coverUrl,
coverCacheName,
linksDict,
year,
originalLanguage,
publicationId,
status,
websiteUrl: $"https://mangadex.org/title/{publicationId}"
);
AddMangaToCache(pub);
return pub;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
const int limit = 100; //How many values we want returned at once
int offset = 0; //"Page"
int total = int.MaxValue; //How many total results are there, is updated on first request
List<Chapter> chapters = new();
//As long as we haven't requested all "Pages"
while (offset < total)
{
//Request next "Page"
RequestResult requestResult =
downloadClient.MakeRequest(
$"https://api.mangadex.org/manga/{manga.publicationId}/feed?limit={limit}&offset={offset}&translatedLanguage%5B%5D={language}&contentRating%5B%5D=safe&contentRating%5B%5D=suggestive&contentRating%5B%5D=erotica&contentRating%5B%5D=pornographic", RequestType.MangaDexFeed);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
break;
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
offset += limit;
if (result is null)
break;
total = result["total"]!.GetValue<int>();
JsonArray chaptersInResult = result["data"]!.AsArray();
//Loop through all Chapters in result and extract information from JSON
foreach (JsonNode? jsonNode in chaptersInResult)
{
JsonObject chapter = (JsonObject)jsonNode!;
JsonObject attributes = chapter["attributes"]!.AsObject();
string chapterId = chapter["id"]!.GetValue<string>();
string? title = attributes.ContainsKey("title") && attributes["title"] is not null
? attributes["title"]!.GetValue<string>()
: null;
string? volume = attributes.ContainsKey("volume") && attributes["volume"] is not null
? attributes["volume"]!.GetValue<string>()
: null;
string chapterNum = attributes.ContainsKey("chapter") && attributes["chapter"] is not null
? attributes["chapter"]!.GetValue<string>()
: "null";
if (attributes.ContainsKey("pages") && attributes["pages"] is not null &&
attributes["pages"]!.GetValue<int>() < 1)
{
Log($"Skipping {chapterId} Vol.{volume} Ch.{chapterNum} {title} because it has no pages or is externally linked.");
continue;
}
if(chapterNum is not "null" && !chapters.Any(chp => chp.volumeNumber.Equals(volume) && chp.chapterNumber.Equals(chapterNum)))
chapters.Add(new Chapter(manga, title, volume, chapterNum, chapterId));
}
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
//Request URLs for Chapter-Images
RequestResult requestResult =
downloadClient.MakeRequest($"https://api.mangadex.org/at-home/server/{chapter.url}?forcePort443=false", RequestType.MangaDexImage);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
JsonObject? result = JsonSerializer.Deserialize<JsonObject>(requestResult.result);
if (result is null)
{
progressToken?.Cancel();
return HttpStatusCode.NoContent;
}
string baseUrl = result["baseUrl"]!.GetValue<string>();
string hash = result["chapter"]!["hash"]!.GetValue<string>();
JsonArray imageFileNames = result["chapter"]!["data"]!.AsArray();
//Loop through all imageNames and construct urls (imageUrl)
HashSet<string> imageUrls = new();
foreach (JsonNode? image in imageFileNames)
imageUrls.Add($"{baseUrl}/data/{hash}/{image!.GetValue<string>()}");
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
//Download Chapter-Images
return DownloadChapterImages(imageUrls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
}
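A worked example of the paging loop above: with limit = 100 and a total of 250 reported by the API (an illustrative value), three pages are requested before the loop stops:

const int limit = 100;
int total = 250; // illustrative; the real value comes from the first API response
for (int offset = 0; offset < total; offset += limit)
    Console.WriteLine($"requesting offset={offset}"); // offsets 0, 100 and 200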


@ -0,0 +1,203 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class MangaHere : MangaConnector
{
public MangaHere(GlobalBase clone) : base(clone, "MangaHere", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join('+', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower();
string requestUrl = $"https://www.mangahere.cc/search?title={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300 || requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
if (document.DocumentNode.SelectNodes("//div[contains(concat(' ',normalize-space(@class),' '),' container ')]").Any(node => node.ChildNodes.Any(cNode => cNode.HasClass("search-keywords"))))
return Array.Empty<Manga>();
List<string> urls = document.DocumentNode
.SelectNodes("//a[contains(@href, '/manga/') and not(contains(@href, '.html'))]")
.Select(thumb => $"https://www.mangahere.cc{thumb.GetAttributeValue("href", "")}").Distinct().ToList();
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://www.mangahere.cc/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult =
downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300 || requestResult.htmlDocument is null)
return null;
Regex idRex = new (@"https:\/\/www\.mangahere\.[a-z]{0,63}\/manga\/([0-9A-z\-]+).*");
string id = idRex.Match(url).Groups[1].Value;
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, id, url);
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
//We don't fetch the real poster because of same-origin restrictions. Original lookup, kept for reference:
//HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//img[contains(concat(' ',normalize-space(@class),' '),' detail-info-cover-img ')]");
string posterUrl = "http://static.mangahere.cc/v20230914/mangahere/images/nopicture.jpg";
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//span[contains(concat(' ',normalize-space(@class),' '),' detail-info-right-title-font ')]");
string sortName = titleNode.InnerText;
List<string> authors = document.DocumentNode
.SelectNodes("//p[contains(concat(' ',normalize-space(@class),' '),' detail-info-right-say ')]/a")
.Select(node => node.InnerText)
.ToList();
HashSet<string> tags = document.DocumentNode
.SelectNodes("//p[contains(concat(' ',normalize-space(@class),' '),' detail-info-right-tag-list ')]/a")
.Select(node => node.InnerText)
.ToHashSet();
status = document.DocumentNode.SelectSingleNode("//span[contains(concat(' ',normalize-space(@class),' '),' detail-info-right-title-tip ')]").InnerText;
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectSingleNode("//p[contains(concat(' ',normalize-space(@class),' '),' fullcontent ')]");
string description = descriptionNode.InnerText;
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
null, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://www.mangahere.cc/manga/{manga.publicationId}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300 || requestResult.htmlDocument is null)
return Array.Empty<Chapter>();
List<string> urls = requestResult.htmlDocument.DocumentNode.SelectNodes("//div[@id='list-2']/ul//li//a[contains(@href, '/manga/')]")
.Select(node => node.GetAttributeValue("href", "")).ToList();
Regex chapterRex = new(@".*\/manga\/[a-zA-Z0-9\-\._\~\!\$\&\'\(\)\*\+\,\;\=\:\@]+\/v([0-9(TBD)]+)\/c([0-9\.]+)\/.*");
List<Chapter> chapters = new();
foreach (string url in urls)
{
Match rexMatch = chapterRex.Match(url);
string volumeNumber = rexMatch.Groups[1].Value == "TBD" ? "0" : rexMatch.Groups[1].Value;
string chapterNumber = rexMatch.Groups[2].Value;
string fullUrl = $"https://www.mangahere.cc{url}";
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
List<string> imageUrls = new();
int downloaded = 1;
int images = 1;
string url = string.Join('/', chapter.url.Split('/')[..^1]);
do
{
RequestResult requestResult =
downloadClient.MakeRequest($"{url}/{downloaded}.html", RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.InternalServerError;
}
imageUrls.AddRange(ParseImageUrlsFromHtml(requestResult.htmlDocument));
images = requestResult.htmlDocument.DocumentNode
.SelectNodes("//a[contains(@href, '/manga/')]")
.MaxBy(node => node.GetAttributeValue("data-page", 0))!.GetAttributeValue("data-page", 0);
logger?.WriteLine($"MangaHere speciality: Get Image-url {downloaded}/{images}");
if (progressToken is not null)
{
progressToken.increments = images * 2;//we also have to download the images later
progressToken.Increment();
}
} while (downloaded++ <= images);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
if (progressToken is not null)
progressToken.increments = images;//reset to the normal length; DownloadChapterImages increases it again by the number of image URLs
return DownloadChapterImages(imageUrls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(HtmlDocument document)
{
return document.DocumentNode
.SelectNodes("//img[contains(concat(' ',normalize-space(@class),' '),' reader-main-img ')]")
.Select(node =>
{
string url = node.GetAttributeValue("src", "");
return url.StartsWith("//") ? $"https:{url}" : url;
})
.ToArray();
}
}

View File

@ -0,0 +1,241 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class MangaKatana : MangaConnector
{
public MangaKatana(GlobalBase clone) : base(clone, "MangaKatana", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join("%20", Regex.Matches(publicationTitle, "[A-z]*").Where(m => m.Value.Length > 0)).ToLower();
string requestUrl = $"https://mangakatana.com/?search={sanitizedTitle}&search_by=book_name";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
// ReSharper disable once MergeIntoPattern
// If a single result is found, the user will be redirected to the results directly instead of a result page
if(requestResult.hasBeenRedirected
&& requestResult.redirectedToUrl is not null
&& requestResult.redirectedToUrl.Contains("mangakatana.com/manga"))
{
return new [] { ParseSinglePublicationFromHtml(requestResult.result, requestResult.redirectedToUrl.Split('/')[^1], requestResult.redirectedToUrl) };
}
Manga[] publications = ParsePublicationsFromHtml(requestResult.result);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://mangakatana.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult =
downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return null;
return ParseSinglePublicationFromHtml(requestResult.result, url.Split('/')[^1], url);
}
private Manga[] ParsePublicationsFromHtml(Stream html)
{
StreamReader reader = new(html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new();
document.LoadHtml(htmlString);
IEnumerable<HtmlNode> searchResults = document.DocumentNode.SelectNodes("//*[@id='book_list']/div");
if (searchResults is null || !searchResults.Any())
return Array.Empty<Manga>();
List<string> urls = new();
foreach (HtmlNode mangaResult in searchResults)
{
urls.Add(mangaResult.Descendants("a").First().GetAttributes()
.First(a => a.Name == "href").Value);
}
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private Manga ParseSinglePublicationFromHtml(Stream html, string publicationId, string websiteUrl)
{
StreamReader reader = new(html);
string htmlString = reader.ReadToEnd();
HtmlDocument document = new();
document.LoadHtml(htmlString);
Dictionary<string, string> altTitles = new();
Dictionary<string, string>? links = null;
HashSet<string> tags = new();
string[] authors = Array.Empty<string>();
string originalLanguage = "";
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode infoNode = document.DocumentNode.SelectSingleNode("//*[@id='single_book']");
string sortName = infoNode.Descendants("h1").First(n => n.HasClass("heading")).InnerText;
HtmlNode infoTable = infoNode.SelectSingleNode("//*[@id='single_book']/div[2]/div/ul");
foreach (HtmlNode row in infoTable.Descendants("li"))
{
string key = row.SelectNodes("div").First().InnerText.ToLower();
string value = row.SelectNodes("div").Last().InnerText;
string keySanitized = string.Concat(Regex.Matches(key, "[a-z]"));
switch (keySanitized)
{
case "altnames":
string[] alts = value.Split(" ; ");
for (int i = 0; i < alts.Length; i++)
altTitles.Add(i.ToString(), alts[i]);
break;
case "authorsartists":
authors = value.Split(',');
break;
case "status":
switch (value.ToLower())
{
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
case "completed": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
}
break;
case "genres":
tags = row.SelectNodes("div").Last().Descendants("a").Select(a => a.InnerText).ToHashSet();
break;
}
}
string posterUrl = document.DocumentNode.SelectSingleNode("//*[@id='single_book']/div[1]/div").Descendants("img").First()
.GetAttributes().First(a => a.Name == "src").Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
string description = document.DocumentNode.SelectSingleNode("//*[@id='single_book']/div[3]/p").InnerText;
while (description.StartsWith('\n'))
description = description.Substring(1);
int year = DateTime.Now.Year;
string yearString = infoTable.Descendants("div").First(d => d.HasClass("updateAt"))
.InnerText.Split('-')[^1];
if(yearString.Contains("ago") == false)
{
year = Convert.ToInt32(yearString);
}
Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://mangakatana.com/manga/{manga.publicationId}";
// Leaving this in to verify that the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
//Return Chapters ordered by Chapter-Number
List<Chapter> chapters = ParseChaptersFromHtml(manga, requestUrl);
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, string mangaUrl)
{
// Using HtmlWeb here includes the chapter list, since it is loaded with js
HtmlWeb web = new();
HtmlDocument document = web.Load(mangaUrl);
List<Chapter> ret = new();
HtmlNode chapterList = document.DocumentNode.SelectSingleNode("//div[contains(@class, 'chapters')]/table/tbody");
Regex volumeRex = new(@"[0-9a-z\-\.]+\/[0-9a-z\-]*v([0-9\.]+)");
Regex chapterNumRex = new(@"[0-9a-z\-\.]+\/[0-9a-z\-]*c([0-9\.]+)");
Regex chapterNameRex = new(@"Chapter [0-9\.]+:? (.*)");
foreach (HtmlNode chapterInfo in chapterList.Descendants("tr"))
{
string fullString = chapterInfo.Descendants("a").First().InnerText;
string url = chapterInfo.Descendants("a").First()
.GetAttributeValue("href", "");
string? volumeNumber = volumeRex.IsMatch(url) ? volumeRex.Match(url).Groups[1].Value : null;
string chapterNumber = chapterNumRex.Match(url).Groups[1].Value;
string chapterName = chapterNameRex.Match(fullString).Groups[1].Value;
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
return ret;
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = chapter.url;
// Leaving this in to check if the page exists
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestUrl);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://mangakatana.com/", progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(string mangaUrl)
{
HtmlWeb web = new();
HtmlDocument document = web.Load(mangaUrl);
// Images are loaded dynamically, but the urls are present in a piece of js code on the page
string js = document.DocumentNode.SelectSingleNode("//script[contains(., 'data-src')]").InnerText
.Replace("\r", "")
.Replace("\n", "")
.Replace("\t", "");
// ReSharper disable once StringLiteralTypo
string regexPat = @"(var thzq=\[')(.*)(,];function)";
var group = Regex.Matches(js, regexPat).First().Groups[2].Value.Replace("'", "");
var urls = group.Split(',');
return urls;
}
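// Illustrative sketch (not part of the original connector): what the regex in
// ParseImageUrlsFromHtml extracts. The script fragment below is hypothetical and only mirrors
// the expected shape of the inline js on a chapter page.
private static string[] ExtractImageUrlsExample()
{
string js = "var thzq=['https://example.org/1.jpg','https://example.org/2.jpg',];function dummy(){}";
string regexPat = @"(var thzq=\[')(.*)(,];function)";
string group = Regex.Matches(js, regexPat).First().Groups[2].Value.Replace("'", "");
return group.Split(','); // ["https://example.org/1.jpg", "https://example.org/2.jpg"]
}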
}

View File

@ -0,0 +1,199 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class MangaLife : MangaConnector
{
public MangaLife(GlobalBase clone) : base(clone, "Manga4Life", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = WebUtility.UrlEncode(publicationTitle);
string requestUrl = $"https://manga4life.com/search/?name={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://manga4life.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/(www\.)?manga4life.com\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[2].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if(requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
HtmlNode resultsNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']/div[last()]/div[1]/div");
if (resultsNode.Descendants("div").Count() == 1 && resultsNode.Descendants("div").First().HasClass("NoResults"))
{
Log("No results.");
return Array.Empty<Manga>();
}
Log($"{resultsNode.SelectNodes("div").Count} items.");
HashSet<Manga> ret = new();
foreach (HtmlNode resultNode in resultsNode.SelectNodes("div"))
{
string url = resultNode.Descendants().First(d => d.HasClass("SeriesName")).GetAttributeValue("href", "");
Manga? manga = GetMangaFromUrl($"https://manga4life.com{url}");
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//img");
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//h1");
string sortName = titleNode.InnerText;
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Author(s):']/..").Descendants("a")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Genre(s):']/..").Descendants("a")
.ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Released:']/..").Descendants("a")
.First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Status:']/..").Descendants("a")
.ToArray();
foreach (HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Description:']/..")
.Descendants("div").First();
string description = descriptionNode.InnerText;
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links, year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
RequestResult result = downloadClient.MakeRequest($"https://manga4life.com/manga/{manga.publicationId}", RequestType.Default, clickButton:"[class*='ShowAllChapters']");
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
return Array.Empty<Chapter>();
}
HtmlNodeCollection chapterNodes = result.htmlDocument.DocumentNode.SelectNodes(
"//a[contains(concat(' ',normalize-space(@class),' '),' ChapterLink ')]");
string[] urls = chapterNodes.Select(node => node.GetAttributeValue("href", "")).ToArray();
Regex urlRex = new (@"-chapter-([0-9\\.]+)(-index-([0-9\\.]+))?");
List<Chapter> chapters = new();
foreach (string url in urls)
{
Match rexMatch = urlRex.Match(url);
string volumeNumber = "1";
if (rexMatch.Groups[3].Value.Length > 0)
volumeNumber = rexMatch.Groups[3].Value;
string chapterNumber = rexMatch.Groups[1].Value;
string fullUrl = $"https://manga4life.com{url}";
fullUrl = fullUrl.Replace(Regex.Match(url,"(-page-[0-9])").Value,"");
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
}

View File

@ -0,0 +1,234 @@
using System.Globalization;
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Manganato : MangaConnector
{
public Manganato(GlobalBase clone) : base(clone, "Manganato", ["en"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join('_', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower();
string requestUrl = $"https://manganato.com/search/story/{sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
List<HtmlNode> searchResults = document.DocumentNode.Descendants("div").Where(n => n.HasClass("search-story-item")).ToList();
Log($"{searchResults.Count} items.");
List<string> urls = new();
foreach (HtmlNode mangaResult in searchResults)
{
urls.Add(mangaResult.Descendants("a").First(n => n.HasClass("item-title")).GetAttributes()
.First(a => a.Name == "href").Value);
}
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://chapmanganato.com/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult =
downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return null;
if (requestResult.htmlDocument is null)
return null;
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, url.Split('/')[^1], url);
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
Dictionary<string, string> altTitles = new();
Dictionary<string, string>? links = null;
HashSet<string> tags = new();
string[] authors = Array.Empty<string>();
string originalLanguage = "";
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode infoNode = document.DocumentNode.Descendants("div").First(d => d.HasClass("story-info-right"));
string sortName = infoNode.Descendants("h1").First().InnerText;
HtmlNode infoTable = infoNode.Descendants().First(d => d.Name == "table");
foreach (HtmlNode row in infoTable.Descendants("tr"))
{
string key = row.SelectNodes("td").First().InnerText.ToLower();
string value = row.SelectNodes("td").Last().InnerText;
string keySanitized = string.Concat(Regex.Matches(key, "[a-z]"));
switch (keySanitized)
{
case "alternative":
string[] alts = value.Split(" ; ");
for(int i = 0; i < alts.Length; i++)
altTitles.Add(i.ToString(), alts[i]);
break;
case "authors":
authors = value.Split('-');
for (int i = 0; i < authors.Length; i++)
authors[i] = authors[i].Replace("\r\n", "");
break;
case "status":
switch (value.ToLower())
{
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
case "completed": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
}
break;
case "genres":
string[] genres = value.Split(" - ");
for (int i = 0; i < genres.Length; i++)
genres[i] = genres[i].Replace("\r\n", "");
tags = genres.ToHashSet();
break;
}
}
string posterUrl = document.DocumentNode.Descendants("span").First(s => s.HasClass("info-image")).Descendants("img").First()
.GetAttributes().First(a => a.Name == "src").Value;
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
string description = document.DocumentNode.Descendants("div").First(d => d.HasClass("panel-story-info-description"))
.InnerText.Replace("Description :", "");
while (description.StartsWith('\n'))
description = description.Substring(1);
string pattern = "MMM dd,yyyy HH:mm";
HtmlNode oldestChapter = document.DocumentNode
.SelectNodes("//span[contains(concat(' ',normalize-space(@class),' '),' chapter-time ')]").MinBy(
node => DateTime.ParseExact(node.GetAttributeValue("title", "Dec 31 2400, 23:59"), pattern,
CultureInfo.InvariantCulture))!;
int year = DateTime.ParseExact(oldestChapter.GetAttributeValue("title", "Dec 31 2400, 23:59"), pattern,
CultureInfo.InvariantCulture).Year;
Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://chapmanganato.com/{manga.publicationId}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
//Return Chapters ordered by Chapter-Number
if (requestResult.htmlDocument is null)
return Array.Empty<Chapter>();
List<Chapter> chapters = ParseChaptersFromHtml(manga, requestResult.htmlDocument);
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, HtmlDocument document)
{
List<Chapter> ret = new();
HtmlNode chapterList = document.DocumentNode.Descendants("ul").First(l => l.HasClass("row-content-chapter"));
Regex volRex = new(@"Vol\.([0-9]+).*");
Regex chapterRex = new(@"https:\/\/chapmanganato.[A-z]+\/manga-[A-z0-9]+\/chapter-([0-9\.]+)");
Regex nameRex = new(@"Chapter ([0-9]+(\.[0-9]+)*){1}:? (.*)");
foreach (HtmlNode chapterInfo in chapterList.Descendants("li"))
{
string fullString = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name")).InnerText;
string url = chapterInfo.Descendants("a").First(d => d.HasClass("chapter-name"))
.GetAttributeValue("href", "");
string? volumeNumber = volRex.IsMatch(fullString) ? volRex.Match(fullString).Groups[1].Value : null;
string chapterNumber = chapterRex.Match(url).Groups[1].Value;
string chapterName = nameRex.Match(fullString).Groups[3].Value;
ret.Add(new Chapter(manga, chapterName, volumeNumber, chapterNumber, url));
}
ret.Reverse();
return ret;
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = chapter.url;
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.InternalServerError;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestResult.htmlDocument);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://chapmanganato.com/", progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(HtmlDocument document)
{
List<string> ret = new();
HtmlNode imageContainer =
document.DocumentNode.Descendants("div").First(i => i.HasClass("container-chapter-reader"));
foreach(HtmlNode imageNode in imageContainer.Descendants("img"))
ret.Add(imageNode.GetAttributeValue("src", ""));
return ret.ToArray();
}
}

View File

@ -0,0 +1,230 @@
using System.Data;
using System.Net;
using System.Text.RegularExpressions;
using System.Xml.Linq;
using HtmlAgilityPack;
using Newtonsoft.Json;
using Soenneker.Utils.String.NeedlemanWunsch;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Mangasee : MangaConnector
{
public Mangasee(GlobalBase clone) : base(clone, "Mangasee", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
private struct SearchResult
{
public string i { get; set; }
public string s { get; set; }
public string[] a { get; set; }
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string requestUrl = "https://mangasee123.com/_search.php";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
Log($"Failed to retrieve search: {requestResult.statusCode}");
return Array.Empty<Manga>();
}
try
{
SearchResult[] searchResults = JsonConvert.DeserializeObject<SearchResult[]>(requestResult.htmlDocument!.DocumentNode.InnerText) ??
throw new NoNullAllowedException();
SearchResult[] filteredResults = FilteredResults(publicationTitle, searchResults);
Log($"Total available manga: {searchResults.Length} Filtered down to: {filteredResults.Length}");
string[] urls = filteredResults.Select(result => $"https://mangasee123.com/manga/{result.i}").ToArray();
List<Manga> searchResultManga = new();
foreach (string url in urls)
{
Manga? newManga = GetMangaFromUrl(url);
if(newManga is { } manga)
searchResultManga.Add(manga);
}
Log($"Retrieved {searchResultManga.Count} publications. Term=\"{publicationTitle}\"");
return searchResultManga.ToArray();
}
catch (NoNullAllowedException)
{
Log("Failed to retrieve search");
return Array.Empty<Manga>();
}
}
private readonly string[] _filterWords = {"a", "the", "of", "as", "to", "no", "for", "on", "with", "be", "and", "in", "wa", "at", "be", "ni"};
private string ToFilteredString(string input) => string.Join(' ', input.ToLower().Split(' ').Where(word => _filterWords.Contains(word) == false));
private SearchResult[] FilteredResults(string publicationTitle, SearchResult[] unfilteredSearchResults)
{
Dictionary<SearchResult, int> similarity = new();
foreach (SearchResult sr in unfilteredSearchResults)
{
List<int> scores = new();
string filteredPublicationString = ToFilteredString(publicationTitle);
string filteredSString = ToFilteredString(sr.s);
scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredSString, filteredPublicationString));
foreach (string srA in sr.a)
{
string filteredAString = ToFilteredString(srA);
scores.Add(NeedlemanWunschStringUtil.CalculateSimilarity(filteredAString, filteredPublicationString));
}
similarity.Add(sr, scores.Sum() / scores.Count);
}
List<SearchResult> ret = similarity.OrderBy(s => s.Value).Take(10).Select(s => s.Key).ToList();
return ret.ToArray();
}
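// Illustrative sketch (not part of the original connector): how ToFilteredString normalizes a
// title before the Needleman-Wunsch comparison above. Both sides are lower-cased and the filler
// words in _filterWords are dropped, so the hypothetical query below reduces to the same token
// sequence as "rising shield hero".
private string FilteredTitleExample()
{
return ToFilteredString("The Rising of the Shield Hero"); // "rising shield hero"
}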
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://mangasee123.com/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/mangasee123.com\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[1].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if((int)requestResult.statusCode < 300 && (int)requestResult.statusCode >= 200 && requestResult.htmlDocument is not null)
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//img");
string posterUrl = posterNode.GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//div[@class='BoxBody']//div[@class='row']//h1");
string sortName = titleNode.InnerText;
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Author(s):']/..").Descendants("a")
.ToArray();
List<string> authors = new();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Genre(s):']/..").Descendants("a")
.ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText);
HtmlNode yearNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Released:']/..").Descendants("a")
.First();
int year = Convert.ToInt32(yearNode.InnerText);
HtmlNode[] statusNodes = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Status:']/..").Descendants("a")
.ToArray();
foreach (HtmlNode statusNode in statusNodes)
if (statusNode.InnerText.Contains("publish", StringComparison.CurrentCultureIgnoreCase))
status = statusNode.InnerText.Split(' ')[0];
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectNodes("//div[@class='BoxBody']//div[@class='row']//span[text()='Description:']/..")
.Descendants("div").First();
string description = descriptionNode.InnerText;
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
try
{
XDocument doc = XDocument.Load($"https://mangasee123.com/rss/{manga.publicationId}.xml");
XElement[] chapterItems = doc.Descendants("item").ToArray();
List<Chapter> chapters = new();
Regex chVolRex = new(@".*chapter-([0-9\.]+)(?:-index-([0-9\.]+))?.*");
foreach (XElement chapter in chapterItems)
{
string url = chapter.Descendants("link").First().Value;
Match m = chVolRex.Match(url);
string? volumeNumber = m.Groups[2].Success ? m.Groups[2].Value : "1";
string chapterNumber = m.Groups[1].Value;
string chapterUrl = Regex.Replace(url, @"-page-[0-9]+(\.html)", ".html");
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, chapterUrl));
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
catch (HttpRequestException e)
{
Log($"Failed to load https://mangasee123.com/rss/{manga.publicationId}.xml \n\r{e}");
return Array.Empty<Chapter>();
}
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode gallery = document.DocumentNode.Descendants("div").First(div => div.HasClass("ImageGallery"));
HtmlNode[] images = gallery.Descendants("img").Where(img => img.HasClass("img-fluid")).ToArray();
List<string> urls = new();
foreach(HtmlNode galleryImage in images)
urls.Add(galleryImage.GetAttributeValue("src", ""));
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
}

View File

@ -0,0 +1,227 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class Mangaworld: MangaConnector
{
public Mangaworld(GlobalBase clone) : base(clone, "Mangaworld", ["it"])
{
this.downloadClient = new HttpDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join(' ', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower();
string requestUrl = $"https://www.mangaworld.ac/archive?keyword={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
if (!document.DocumentNode.SelectSingleNode("//div[@class='comics-grid']").ChildNodes
.Any(node => node.HasClass("entry")))
return Array.Empty<Manga>();
List<string> urls = document.DocumentNode
.SelectNodes(
"//div[@class='comics-grid']//div[@class='entry']//a[contains(concat(' ',normalize-space(@class),' '),'thumb')]")
.Select(thumb => thumb.GetAttributeValue("href", "")).ToList();
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://www.mangaworld.ac/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
RequestResult requestResult =
downloadClient.MakeRequest(url, RequestType.MangaInfo);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return null;
if (requestResult.htmlDocument is null)
return null;
Regex idRex = new (@"https:\/\/www\.mangaworld\.[a-z]{0,63}\/manga\/([0-9]+\/[0-9A-z\-]+).*");
string id = idRex.Match(url).Groups[1].Value;
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, id, url);
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
Dictionary<string, string> altTitles = new();
Dictionary<string, string>? links = null;
string originalLanguage = "";
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode infoNode = document.DocumentNode.Descendants("div").First(d => d.HasClass("info"));
string sortName = infoNode.Descendants("h1").First().InnerText;
HtmlNode metadata = infoNode.Descendants().First(d => d.HasClass("meta-data"));
HtmlNode altTitlesNode = metadata.SelectSingleNode("//span[text()='Titoli alternativi: ' or text()='Titolo alternativo: ']/..").ChildNodes[1];
string[] alts = altTitlesNode.InnerText.Split(", ");
for(int i = 0; i < alts.Length; i++)
altTitles.Add(i.ToString(), alts[i]);
HtmlNode genresNode =
metadata.SelectSingleNode("//span[text()='Generi: ' or text()='Genero: ']/..");
HashSet<string> tags = genresNode.SelectNodes("a").Select(node => node.InnerText).ToHashSet();
HtmlNode authorsNode =
metadata.SelectSingleNode("//span[text()='Autore: ' or text()='Autori: ']/..");
string[] authors = authorsNode.SelectNodes("a").Select(node => node.InnerText).ToArray();
string status = metadata.SelectSingleNode("//span[text()='Stato: ']/..").SelectNodes("a").First().InnerText;
// ReSharper disable 5 times StringLiteralTypo
switch (status.ToLower())
{
case "cancellato": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "in pausa": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "droppato": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "finito": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "in corso": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
string posterUrl = document.DocumentNode.SelectSingleNode("//img[@class='rounded']").GetAttributeValue("src", "");
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId.Replace('/', '-'), RequestType.MangaCover);
string description = document.DocumentNode.SelectSingleNode("//div[@id='noidungm']").InnerText;
string yearString = metadata.SelectSingleNode("//span[text()='Anno di uscita: ']/..").SelectNodes("a").First().InnerText;
int year = Convert.ToInt32(yearString);
Manga manga = new (sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl, coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
string requestUrl = $"https://www.mangaworld.ac/manga/{manga.publicationId}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Chapter>();
//Return Chapters ordered by Chapter-Number
if (requestResult.htmlDocument is null)
return Array.Empty<Chapter>();
List<Chapter> chapters = ParseChaptersFromHtml(manga, requestResult.htmlDocument);
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
private List<Chapter> ParseChaptersFromHtml(Manga manga, HtmlDocument document)
{
List<Chapter> ret = new();
HtmlNode chaptersWrapper =
document.DocumentNode.SelectSingleNode(
"//div[contains(concat(' ',normalize-space(@class),' '),'chapters-wrapper')]");
if (chaptersWrapper.Descendants("div").Any(descendant => descendant.HasClass("volume-element")))
{
foreach (HtmlNode volNode in document.DocumentNode.SelectNodes("//div[contains(concat(' ',normalize-space(@class),' '),'volume-element')]"))
{
string volume = Regex.Match(volNode.SelectNodes("div").First(node => node.HasClass("volume")).SelectSingleNode("p").InnerText,
@"[Vv]olume ([0-9]+).*").Groups[1].Value;
foreach (HtmlNode chNode in volNode.SelectNodes("div").First(node => node.HasClass("volume-chapters")).SelectNodes("div"))
{
string number = Regex.Match(chNode.SelectSingleNode("a").SelectSingleNode("span").InnerText,
@"[Cc]apitolo ([0-9]+).*").Groups[1].Value;
string url = chNode.SelectSingleNode("a").GetAttributeValue("href", "");
ret.Add(new Chapter(manga, null, volume, number, url));
}
}
}
else
{
foreach (HtmlNode chNode in chaptersWrapper.SelectNodes("div").Where(node => node.HasClass("chapter")))
{
string number = Regex.Match(chNode.SelectSingleNode("a").SelectSingleNode("span").InnerText,
@"[Cc]apitolo ([0-9]+).*").Groups[1].Value;
string url = chNode.SelectSingleNode("a").GetAttributeValue("href", "");
ret.Add(new Chapter(manga, null, null, number, url));
}
}
ret.Reverse();
return ret;
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
string requestUrl = $"{chapter.url}?style=list";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
{
progressToken?.Cancel();
return requestResult.statusCode;
}
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.InternalServerError;
}
string[] imageUrls = ParseImageUrlsFromHtml(requestResult.htmlDocument);
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(imageUrls, chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, "https://www.mangaworld.bz/", progressToken:progressToken);
}
private string[] ParseImageUrlsFromHtml(HtmlDocument document)
{
List<string> ret = new();
HtmlNode imageContainer =
document.DocumentNode.SelectSingleNode("//div[@id='page']");
foreach(HtmlNode imageNode in imageContainer.Descendants("img"))
ret.Add(imageNode.GetAttributeValue("src", ""));
return ret.ToArray();
}
}

View File

@ -0,0 +1,198 @@
using System.Net;
using System.Text.RegularExpressions;
using HtmlAgilityPack;
using Tranga.Jobs;
namespace Tranga.MangaConnectors;
public class ManhuaPlus : MangaConnector
{
public ManhuaPlus(GlobalBase clone) : base(clone, "ManhuaPlus", ["en"])
{
this.downloadClient = new ChromiumDownloadClient(clone);
}
public override Manga[] GetManga(string publicationTitle = "")
{
Log($"Searching Publications. Term=\"{publicationTitle}\"");
string sanitizedTitle = string.Join(' ', Regex.Matches(publicationTitle, "[A-z]*").Where(str => str.Length > 0)).ToLower();
string requestUrl = $"https://manhuaplus.org/search?keyword={sanitizedTitle}";
RequestResult requestResult =
downloadClient.MakeRequest(requestUrl, RequestType.Default);
if ((int)requestResult.statusCode < 200 || (int)requestResult.statusCode >= 300)
return Array.Empty<Manga>();
if (requestResult.htmlDocument is null)
return Array.Empty<Manga>();
Manga[] publications = ParsePublicationsFromHtml(requestResult.htmlDocument);
Log($"Retrieved {publications.Length} publications. Term=\"{publicationTitle}\"");
return publications;
}
private Manga[] ParsePublicationsFromHtml(HtmlDocument document)
{
if (document.DocumentNode.SelectSingleNode("//h1/../..").ChildNodes//I already want to not.
.Any(node => node.InnerText.Contains("No manga found")))
return Array.Empty<Manga>();
List<string> urls = document.DocumentNode
.SelectNodes("//h1/../..//a[contains(@href, 'https://manhuaplus.org/manga/') and contains(concat(' ',normalize-space(@class),' '),' clamp ') and not(contains(@href, '/chapter'))]")
.Select(mangaNode => mangaNode.GetAttributeValue("href", "")).ToList();
logger?.WriteLine($"Got {urls.Count} urls.");
HashSet<Manga> ret = new();
foreach (string url in urls)
{
Manga? manga = GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
}
return ret.ToArray();
}
public override Manga? GetMangaFromId(string publicationId)
{
return GetMangaFromUrl($"https://manhuaplus.org/manga/{publicationId}");
}
public override Manga? GetMangaFromUrl(string url)
{
Regex publicationIdRex = new(@"https:\/\/manhuaplus.org\/manga\/(.*)(\/.*)*");
string publicationId = publicationIdRex.Match(url).Groups[1].Value;
RequestResult requestResult = this.downloadClient.MakeRequest(url, RequestType.MangaInfo);
if((int)requestResult.statusCode < 300 && (int)requestResult.statusCode >= 200 && requestResult.htmlDocument is not null && requestResult.redirectedToUrl != "https://manhuaplus.org/home") //When the manga doesn't exist, the site redirects to home
return ParseSinglePublicationFromHtml(requestResult.htmlDocument, publicationId, url);
return null;
}
private Manga ParseSinglePublicationFromHtml(HtmlDocument document, string publicationId, string websiteUrl)
{
string originalLanguage = "", status = "";
Dictionary<string, string> altTitles = new(), links = new();
HashSet<string> tags = new();
Manga.ReleaseStatusByte releaseStatus = Manga.ReleaseStatusByte.Unreleased;
HtmlNode posterNode = document.DocumentNode.SelectSingleNode("/html/body/main/div/div/div[2]/div[1]/figure/a/img");//BRUH
Regex posterRex = new(@".*(\/uploads/covers/[a-zA-Z0-9\-\._\~\!\$\&\'\(\)\*\+\,\;\=\:\@]+).*");
string posterUrl = $"https://manhuaplus.org/{posterRex.Match(posterNode.GetAttributeValue("src", "")).Groups[1].Value}";
string coverFileNameInCache = SaveCoverImageToCache(posterUrl, publicationId, RequestType.MangaCover);
HtmlNode titleNode = document.DocumentNode.SelectSingleNode("//h1");
string sortName = titleNode.InnerText.Replace("\n", "");
List<string> authors = new();
try
{
HtmlNode[] authorsNodes = document.DocumentNode
.SelectNodes("//a[contains(@href, 'https://manhuaplus.org/authors/')]")
.ToArray();
foreach (HtmlNode authorNode in authorsNodes)
authors.Add(authorNode.InnerText);
}
catch (ArgumentNullException e)
{
Log("No authors found.");
}
try
{
HtmlNode[] genreNodes = document.DocumentNode
.SelectNodes("//a[contains(@href, 'https://manhuaplus.org/genres/')]").ToArray();
foreach (HtmlNode genreNode in genreNodes)
tags.Add(genreNode.InnerText.Replace("\n", ""));
}
catch (ArgumentNullException e)
{
Log("No genres found");
}
string yearNodeStr = document.DocumentNode
.SelectSingleNode("//aside//i[contains(concat(' ',normalize-space(@class),' '),' fa-clock ')]/../span").InnerText.Replace("\n", "");
int year = int.Parse(yearNodeStr.Split(' ')[0].Split('/')[^1]);
status = document.DocumentNode.SelectSingleNode("//aside//i[contains(concat(' ',normalize-space(@class),' '),' fa-rss ')]/../span").InnerText.Replace("\n", "");
switch (status.ToLower())
{
case "cancelled": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "hiatus": releaseStatus = Manga.ReleaseStatusByte.OnHiatus; break;
case "discontinued": releaseStatus = Manga.ReleaseStatusByte.Cancelled; break;
case "complete": releaseStatus = Manga.ReleaseStatusByte.Completed; break;
case "ongoing": releaseStatus = Manga.ReleaseStatusByte.Continuing; break;
}
HtmlNode descriptionNode = document.DocumentNode
.SelectSingleNode("//div[@id='syn-target']");
string description = descriptionNode.InnerText;
Manga manga = new(sortName, authors.ToList(), description, altTitles, tags.ToArray(), posterUrl,
coverFileNameInCache, links,
year, originalLanguage, publicationId, releaseStatus, websiteUrl: websiteUrl);
AddMangaToCache(manga);
return manga;
}
public override Chapter[] GetChapters(Manga manga, string language="en")
{
Log($"Getting chapters {manga}");
RequestResult result = downloadClient.MakeRequest($"https://manhuaplus.org/manga/{manga.publicationId}", RequestType.Default);
if ((int)result.statusCode < 200 || (int)result.statusCode >= 300 || result.htmlDocument is null)
{
return Array.Empty<Chapter>();
}
HtmlNodeCollection chapterNodes = result.htmlDocument.DocumentNode.SelectNodes("//li[contains(concat(' ',normalize-space(@class),' '),' chapter ')]//a");
string[] urls = chapterNodes.Select(node => node.GetAttributeValue("href", "")).ToArray();
Regex urlRex = new (@".*\/chapter-([0-9\-]+).*");
List<Chapter> chapters = new();
foreach (string url in urls)
{
Match rexMatch = urlRex.Match(url);
string volumeNumber = "1";
string chapterNumber = rexMatch.Groups[1].Value;
string fullUrl = url;
chapters.Add(new Chapter(manga, "", volumeNumber, chapterNumber, fullUrl));
}
//Return Chapters ordered by Chapter-Number
Log($"Got {chapters.Count} chapters. {manga}");
return chapters.Order().ToArray();
}
public override HttpStatusCode DownloadChapter(Chapter chapter, ProgressToken? progressToken = null)
{
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Manga chapterParentManga = chapter.parentManga;
if (progressToken?.cancellationRequested ?? false)
{
progressToken.Cancel();
return HttpStatusCode.RequestTimeout;
}
Log($"Retrieving chapter-info {chapter} {chapterParentManga}");
RequestResult requestResult = this.downloadClient.MakeRequest(chapter.url, RequestType.Default);
if (requestResult.htmlDocument is null)
{
progressToken?.Cancel();
return HttpStatusCode.RequestTimeout;
}
HtmlDocument document = requestResult.htmlDocument;
HtmlNode[] images = document.DocumentNode.SelectNodes("//a[contains(concat(' ',normalize-space(@class),' '),' readImg ')]/img").ToArray();
List<string> urls = images.Select(node => node.GetAttributeValue("src", "")).ToList();
string comicInfoPath = Path.GetTempFileName();
File.WriteAllText(comicInfoPath, chapter.GetComicInfoXmlString());
return DownloadChapterImages(urls.ToArray(), chapter.GetArchiveFilePath(), RequestType.MangaImage, comicInfoPath, progressToken:progressToken);
}
}

View File

@ -0,0 +1,27 @@
using System.Net;
using HtmlAgilityPack;
namespace Tranga.MangaConnectors;
public struct RequestResult
{
public HttpStatusCode statusCode { get; }
public Stream result { get; }
public bool hasBeenRedirected { get; }
public string? redirectedToUrl { get; }
public HtmlDocument? htmlDocument { get; }
public RequestResult(HttpStatusCode statusCode, HtmlDocument? htmlDocument, Stream result)
{
this.statusCode = statusCode;
this.htmlDocument = htmlDocument;
this.result = result;
}
public RequestResult(HttpStatusCode statusCode, HtmlDocument? htmlDocument, Stream result, bool hasBeenRedirected, string redirectedTo)
: this(statusCode, htmlDocument, result)
{
this.hasBeenRedirected = hasBeenRedirected;
redirectedToUrl = redirectedTo;
}
}
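// Illustrative sketch (not part of the original source): the success check every connector above
// applies to a RequestResult before parsing it. Connectors that need parsed HTML additionally
// verify that htmlDocument is not null. The helper class is hypothetical; the connectors inline
// the comparison instead.
public static class RequestResultExample
{
public static bool IsSuccess(RequestResult result) =>
(int)result.statusCode >= 200 && (int)result.statusCode < 300;
}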

View File

@ -0,0 +1,11 @@
namespace Tranga.MangaConnectors;
public enum RequestType : byte
{
Default = 0,
MangaDexFeed = 1,
MangaImage = 2,
MangaCover = 3,
MangaDexImage = 5,
MangaInfo = 6
}

View File

@ -0,0 +1,58 @@
using System.Text;
using Newtonsoft.Json;
namespace Tranga.NotificationConnectors;
public class Gotify : NotificationConnector
{
public string endpoint { get; }
// ReSharper disable once MemberCanBePrivate.Global
public string appToken { get; }
private readonly HttpClient _client = new();
[JsonConstructor]
public Gotify(GlobalBase clone, string endpoint, string appToken) : base(clone, NotificationConnectorType.Gotify)
{
if (!baseUrlRex.IsMatch(endpoint))
throw new ArgumentException("endpoint does not match pattern");
this.endpoint = baseUrlRex.Match(endpoint).Value;
this.appToken = appToken;
}
public override string ToString()
{
return $"Gotify {endpoint}";
}
protected override void SendNotificationInternal(string title, string notificationText)
{
Log($"Sending notification: {title} - {notificationText}");
MessageData message = new(title, notificationText);
HttpRequestMessage request = new(HttpMethod.Post, $"{endpoint}/message");
request.Headers.Add("X-Gotify-Key", this.appToken);
request.Content = new StringContent(JsonConvert.SerializeObject(message, Formatting.None), Encoding.UTF8, "application/json");
HttpResponseMessage response = _client.Send(request);
if (!response.IsSuccessStatusCode)
{
StreamReader sr = new (response.Content.ReadAsStream());
Log($"{response.StatusCode}: {sr.ReadToEnd()}");
}
}
private class MessageData
{
// ReSharper disable four times UnusedAutoPropertyAccessor.Local
public string message { get; }
public long priority { get; }
public string title { get; }
public Dictionary<string, object> extras { get; }
public MessageData(string title, string message)
{
this.title = title;
this.message = message;
this.extras = new();
this.priority = 4;
}
}
}
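// Illustrative sketch (not part of the original source): the request SendNotificationInternal
// above produces. Endpoint, token and texts are hypothetical.
//
//   POST https://gotify.example.org/message
//   X-Gotify-Key: <appToken>
//   Content-Type: application/json
//
//   {"message":"3 new chapters downloaded","priority":4,"title":"Tranga","extras":{}}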

View File

@ -0,0 +1,49 @@
using System.Text;
using Newtonsoft.Json;
namespace Tranga.NotificationConnectors;
public class LunaSea : NotificationConnector
{
// ReSharper disable once MemberCanBePrivate.Global
public string id { get; init; }
private readonly HttpClient _client = new();
[JsonConstructor]
public LunaSea(GlobalBase clone, string id) : base(clone, NotificationConnectorType.LunaSea)
{
this.id = id;
}
public override string ToString()
{
return $"LunaSea {id}";
}
protected override void SendNotificationInternal(string title, string notificationText)
{
Log($"Sending notification: {title} - {notificationText}");
MessageData message = new(title, notificationText);
HttpRequestMessage request = new(HttpMethod.Post, $"https://notify.lunasea.app/v1/custom/{id}");
request.Content = new StringContent(JsonConvert.SerializeObject(message, Formatting.None), Encoding.UTF8, "application/json");
HttpResponseMessage response = _client.Send(request);
if (!response.IsSuccessStatusCode)
{
StreamReader sr = new (response.Content.ReadAsStream());
Log($"{response.StatusCode}: {sr.ReadToEnd()}");
}
}
private class MessageData
{
// ReSharper disable twice UnusedAutoPropertyAccessor.Local
public string title { get; }
public string body { get; }
public MessageData(string title, string body)
{
this.title = title;
this.body = body;
}
}
}

View File

@ -0,0 +1,74 @@
namespace Tranga.NotificationConnectors;
public abstract class NotificationConnector : GlobalBase
{
public readonly NotificationConnectorType notificationConnectorType;
private DateTime? _notificationRequested = null;
private readonly Thread? _notificationBufferThread = null;
private const int NoChangeTimeout = 3, BiggestInterval = 30;
private List<KeyValuePair<string, string>> _notifications = new();
protected NotificationConnector(GlobalBase clone, NotificationConnectorType notificationConnectorType) : base(clone)
{
Log($"Creating notificationConnector {Enum.GetName(notificationConnectorType)}");
this.notificationConnectorType = notificationConnectorType;
if (TrangaSettings.bufferNotifications)
{
_notificationBufferThread = new(CheckNotificationBuffer);
_notificationBufferThread.Start();
}
}
private void CheckNotificationBuffer()
{
while (true)
{
if (_notificationRequested is not null && DateTime.Now.Subtract((DateTime)_notificationRequested) > TimeSpan.FromMinutes(NoChangeTimeout)) //If no new notifications have been requested for NoChangeTimeout minutes, send the buffered notifications
{
string[] uniqueTitles = _notifications.DistinctBy(n => n.Key).Select(n => n.Key).ToArray();
Log($"Notification Buffer sending! Notifications: {string.Join(", ", uniqueTitles)}");
foreach (string ut in uniqueTitles)
{
string[] texts = _notifications.Where(n => n.Key == ut).Select(n => n.Value).ToArray();
SendNotificationInternal($"{ut} ({texts.Length})", string.Join('\n', texts));
}
_notificationRequested = null;
_notifications.Clear();
}
Thread.Sleep(100);
}
}
public enum NotificationConnectorType : byte { Gotify = 0, LunaSea = 1, Ntfy = 2 }
public void SendNotification(string title, string notificationText, bool buffer = false)
{
_notificationRequested ??= DateTime.Now;
if (!TrangaSettings.bufferNotifications || !buffer)
{
SendNotificationInternal(title, notificationText);
return;
}
_notifications.Add(new(title, notificationText));
if (_notificationRequested is not null &&
DateTime.Now.Subtract((DateTime)_notificationRequested) > TimeSpan.FromMinutes(BiggestInterval)) //If the first buffered notification is more than BiggestInterval minutes old, send everything now
{
string[] uniqueTitles = _notifications.DistinctBy(n => n.Key).Select(n => n.Key).ToArray();
foreach (string ut in uniqueTitles)
{
string[] texts = _notifications.Where(n => n.Key == ut).Select(n => n.Value).ToArray();
SendNotificationInternal(ut, string.Join('\n', texts));
}
_notificationRequested = null;
_notifications.Clear();
}
else if(_notificationRequested is not null)
{
Log($"Buffering Notifications (Updates in latest {((DateTime)_notificationRequested).Add(TimeSpan.FromMinutes(BiggestInterval)).Subtract(DateTime.Now)} or {((DateTime)_notificationRequested).Add(TimeSpan.FromMinutes(NoChangeTimeout)).Subtract(DateTime.Now)})");
}
}
protected abstract void SendNotificationInternal(string title, string notificationText);
}
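// Illustrative sketch (not part of the original source): how the buffering above behaves when
// TrangaSettings.bufferNotifications is enabled. "connector" stands for any concrete
// NotificationConnector; titles and texts are hypothetical.
//
//   connector.SendNotification("Tranga", "Chapter 12 downloaded", buffer: true);
//   connector.SendNotification("Tranga", "Chapter 13 downloaded", buffer: true);
//
// Nothing is sent immediately. The buffer thread flushes both as one "Tranga (2)" notification
// once NoChangeTimeout (3) minutes have passed since the first buffered one; BiggestInterval (30)
// minutes is the hard upper bound checked whenever a further notification arrives.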

View File

@ -0,0 +1,46 @@
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
namespace Tranga.NotificationConnectors;
public class NotificationManagerJsonConverter : JsonConverter
{
private GlobalBase _clone;
public NotificationManagerJsonConverter(GlobalBase clone)
{
this._clone = clone;
}
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(NotificationConnector));
}
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue,
JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
switch (jo["notificationConnectorType"]!.Value<byte>())
{
case (byte)NotificationConnector.NotificationConnectorType.Gotify:
return new Gotify(this._clone, jo.GetValue("endpoint")!.Value<string>()!, jo.GetValue("appToken")!.Value<string>()!);
case (byte)NotificationConnector.NotificationConnectorType.LunaSea:
return new LunaSea(this._clone, jo.GetValue("id")!.Value<string>()!);
case (byte)NotificationConnector.NotificationConnectorType.Ntfy:
return new Ntfy(this._clone, jo.GetValue("endpoint")!.Value<string>()!, jo.GetValue("topic")!.Value<string>()!, jo.GetValue("auth")!.Value<string>()!);
}
throw new Exception("Unknown NotificationConnectorType");
}
public override bool CanWrite => false;
/// <summary>
/// Don't call this
/// </summary>
public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
{
throw new Exception("Dont call this");
}
}

View File

@@ -0,0 +1,87 @@
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
namespace Tranga.NotificationConnectors;
public class Ntfy : NotificationConnector
{
// ReSharper disable twice MemberCanBePrivate.Global
public string endpoint { get; init; }
public string auth { get; init; }
public string topic { get; init; }
private readonly HttpClient _client = new();
[JsonConstructor]
public Ntfy(GlobalBase clone, string endpoint, string topic, string auth) : base(clone, NotificationConnectorType.Ntfy)
{
this.endpoint = endpoint;
this.topic = topic;
this.auth = auth;
}
public Ntfy(GlobalBase clone, string endpoint, string username, string password, string? topic = null) :
this(clone, EndpointAndTopicFromUrl(endpoint)[0], topic??EndpointAndTopicFromUrl(endpoint)[1], AuthFromUsernamePassword(username, password))
{
}
private static string AuthFromUsernamePassword(string username, string password)
{
string authHeader = "Basic " + Convert.ToBase64String(Encoding.UTF8.GetBytes($"{username}:{password}"));
string authParam = Convert.ToBase64String(Encoding.UTF8.GetBytes(authHeader)).Replace("=","");
return authParam;
}
private static string[] EndpointAndTopicFromUrl(string url)
{
string[] ret = new string[2];
if (!baseUrlRex.IsMatch(url))
throw new ArgumentException("url does not match pattern");
Regex rootUriRex = new(@"(https?:\/\/[a-zA-Z0-9-\.]+\.[a-zA-Z0-9]+)(?:\/([a-zA-Z0-9-\.]+))?.*");
Match match = rootUriRex.Match(url);
if(!match.Success)
throw new ArgumentException($"Error getting URI from provided endpoint-URI: {url}");
ret[0] = match.Groups[1].Value;
ret[1] = match.Groups[2].Success && match.Groups[2].Value.Length > 0 ? match.Groups[2].Value : "tranga";
return ret;
}
public override string ToString()
{
return $"Ntfy {endpoint} {topic}";
}
protected override void SendNotificationInternal(string title, string notificationText)
{
Log($"Sending notification: {title} - {notificationText}");
MessageData message = new(title, topic, notificationText);
HttpRequestMessage request = new(HttpMethod.Post, $"{this.endpoint}?auth={this.auth}");
request.Content = new StringContent(JsonConvert.SerializeObject(message, Formatting.None), Encoding.UTF8, "application/json");
HttpResponseMessage response = _client.Send(request);
if (!response.IsSuccessStatusCode)
{
StreamReader sr = new (response.Content.ReadAsStream());
Log($"{response.StatusCode}: {sr.ReadToEnd()}");
}
}
private class MessageData
{
// ReSharper disable UnusedAutoPropertyAccessor.Local
public string topic { get; }
public string title { get; }
public string message { get; }
public int priority { get; }
public MessageData(string title, string topic, string message)
{
this.topic = topic;
this.title = title;
this.message = message;
this.priority = 3;
}
}
}
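
The single-URL constructor above splits the given ntfy URL into server root and topic (defaulting to "tranga" when the URL has no path), and AuthFromUsernamePassword base64-encodes the full Basic-auth header into the value that is later appended as the auth query parameter. A standalone sketch of the URL split, reusing the same regex on hypothetical server URLs:

using System;
using System.Text.RegularExpressions;

// Mirrors EndpointAndTopicFromUrl above: group 1 is the server root, optional group 2 the topic.
Regex rootUriRex = new(@"(https?:\/\/[a-zA-Z0-9-\.]+\.[a-zA-Z0-9]+)(?:\/([a-zA-Z0-9-\.]+))?.*");

foreach (string url in new[] { "https://ntfy.example.com/manga", "https://ntfy.example.com" }) // hypothetical servers
{
    Match match = rootUriRex.Match(url);
    string endpoint = match.Groups[1].Value;
    string topic = match.Groups[2].Success && match.Groups[2].Value.Length > 0 ? match.Groups[2].Value : "tranga";
    Console.WriteLine($"{url} -> endpoint={endpoint}, topic={topic}");
}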

View File

@@ -1,128 +0,0 @@
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using static System.IO.UnixFileMode;
namespace Tranga;
/// <summary>
/// Contains information on a Publication (Manga)
/// </summary>
public readonly struct Publication
{
public string sortName { get; }
public string? author { get; }
public Dictionary<string,string> altTitles { get; }
// ReSharper disable trice MemberCanBePrivate.Global, trust
public string? description { get; }
public string[] tags { get; }
public string? posterUrl { get; }
public string? coverFileNameInCache { get; }
public Dictionary<string,string> links { get; }
public int? year { get; }
public string? originalLanguage { get; }
public string status { get; }
public string folderName { get; }
public string publicationId { get; }
public string internalId { get; }
private static readonly Regex LegalCharacters = new Regex(@"[A-Z]*[a-z]*[0-9]* *\.*-*,*'*\'*\)*\(*~*!*");
public Publication(string sortName, string? author, string? description, Dictionary<string,string> altTitles, string[] tags, string? posterUrl, string? coverFileNameInCache, Dictionary<string,string>? links, int? year, string? originalLanguage, string status, string publicationId)
{
this.sortName = sortName;
this.author = author;
this.description = description;
this.altTitles = altTitles;
this.tags = tags;
this.coverFileNameInCache = coverFileNameInCache;
this.posterUrl = posterUrl;
this.links = links ?? new Dictionary<string, string>();
this.year = year;
this.originalLanguage = originalLanguage;
this.status = status;
this.publicationId = publicationId;
this.folderName = string.Concat(LegalCharacters.Matches(sortName));
while (this.folderName.EndsWith('.'))
this.folderName = this.folderName.Substring(0, this.folderName.Length - 1);
string onlyLowerLetters = string.Concat(this.sortName.ToLower().Where(Char.IsLetter));
this.internalId = Convert.ToBase64String(Encoding.ASCII.GetBytes($"{onlyLowerLetters}{this.year}"));
}
public string CreatePublicationFolder(string downloadDirectory)
{
string publicationFolder = Path.Join(downloadDirectory, this.folderName);
if(!Directory.Exists(publicationFolder))
Directory.CreateDirectory(publicationFolder);
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(publicationFolder, GroupRead | GroupWrite | GroupExecute | OtherRead | OtherWrite | OtherExecute | UserRead | UserWrite | UserExecute);
return publicationFolder;
}
public void SaveSeriesInfoJson(string downloadDirectory)
{
string publicationFolder = CreatePublicationFolder(downloadDirectory);
string seriesInfoPath = Path.Join(publicationFolder, "series.json");
if(!File.Exists(seriesInfoPath))
File.WriteAllText(seriesInfoPath,this.GetSeriesInfoJson());
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
File.SetUnixFileMode(seriesInfoPath, GroupRead | GroupWrite | OtherRead | OtherWrite | UserRead | UserWrite);
}
/// <returns>Serialized JSON String for series.json</returns>
public string GetSeriesInfoJson()
{
SeriesInfo si = new (new Metadata(this.sortName, this.year.ToString() ?? string.Empty, this.status, this.description ?? ""));
return System.Text.Json.JsonSerializer.Serialize(si);
}
//Only for series.json
private struct SeriesInfo
{
// ReSharper disable once UnusedAutoPropertyAccessor.Local we need it, trust
[JsonRequired]public Metadata metadata { get; }
public SeriesInfo(Metadata metadata) => this.metadata = metadata;
}
//Only for series.json what an abomination, why are all the fields not-null????
private struct Metadata
{
// ReSharper disable UnusedAutoPropertyAccessor.Local we need them all, trust me
[JsonRequired] public string type { get; }
[JsonRequired] public string publisher { get; }
// ReSharper disable twice IdentifierTypo
[JsonRequired] public int comicid { get; }
[JsonRequired] public string booktype { get; }
// ReSharper disable InconsistentNaming This one property is capitalized. Why?
[JsonRequired] public string ComicImage { get; }
[JsonRequired] public int total_issues { get; }
[JsonRequired] public string publication_run { get; }
[JsonRequired]public string name { get; }
[JsonRequired]public string year { get; }
[JsonRequired]public string status { get; }
[JsonRequired]public string description_text { get; }
public Metadata(string name, string year, string status, string description_text)
{
this.name = name;
this.year = year;
if(status.ToLower() == "ongoing" || status.ToLower() == "hiatus")
this.status = "Continuing";
else if (status.ToLower() == "completed" || status.ToLower() == "cancelled" || status.ToLower() == "discontinued")
this.status = "Ended";
else
this.status = status;
this.description_text = description_text;
//kill it with fire, but otherwise Komga will not parse
type = "Manga";
publisher = "";
comicid = 0;
booktype = "";
ComicImage = "";
total_issues = 0;
publication_run = "";
}
}
}

763
Tranga/Server.cs Normal file
View File

@@ -0,0 +1,763 @@
using System.Net;
using System.Runtime.InteropServices;
using System.Text;
using System.Text.RegularExpressions;
using Newtonsoft.Json;
using Tranga.Jobs;
using Tranga.LibraryConnectors;
using Tranga.MangaConnectors;
using Tranga.NotificationConnectors;
namespace Tranga;
public class Server : GlobalBase
{
private readonly HttpListener _listener = new ();
private readonly Tranga _parent;
public Server(Tranga parent) : base(parent)
{
this._parent = parent;
if(RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
this._listener.Prefixes.Add($"http://*:{TrangaSettings.apiPortNumber}/");
else
this._listener.Prefixes.Add($"http://localhost:{TrangaSettings.apiPortNumber}/");
Thread listenThread = new (Listen);
listenThread.Start();
Thread watchThread = new(WatchRunning);
watchThread.Start();
}
private void WatchRunning()
{
while(_parent.keepRunning)
Thread.Sleep(1000);
this._listener.Close();
}
private void Listen()
{
this._listener.Start();
foreach(string prefix in this._listener.Prefixes)
Log($"Listening on {prefix}");
while (this._listener.IsListening && _parent.keepRunning)
{
try
{
HttpListenerContext context = this._listener.GetContext();
//Log($"{context.Request.HttpMethod} {context.Request.Url} {context.Request.UserAgent}");
Task t = new(() =>
{
HandleRequest(context);
});
t.Start();
}
catch (HttpListenerException)
{
}
}
}
private void HandleRequest(HttpListenerContext context)
{
HttpListenerRequest request = context.Request;
HttpListenerResponse response = context.Response;
if (request.Url!.LocalPath.Contains("favicon"))
{
SendResponse(HttpStatusCode.NoContent, response);
return;
}
switch (request.HttpMethod)
{
case "GET":
HandleGet(request, response);
break;
case "POST":
HandlePost(request, response);
break;
case "DELETE":
HandleDelete(request, response);
break;
case "OPTIONS":
SendResponse(HttpStatusCode.OK, context.Response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private Dictionary<string, string> GetRequestVariables(string query)
{
Dictionary<string, string> ret = new();
Regex queryRex = new (@"\?{1}&?([A-z0-9-=]+=[A-z0-9-=]+)+(&[A-z0-9-=]+=[A-z0-9-=]+)*");
if (!queryRex.IsMatch(query))
return ret;
query = query.Substring(1);
foreach (string keyValuePair in query.Split('&').Where(str => str.Length >= 3))
{
string var = keyValuePair.Split('=')[0];
string val = Regex.Replace(keyValuePair.Substring(var.Length + 1), "%20", " ");
val = Regex.Replace(val, "%[0-9]{2}", "");
ret.Add(var, val);
}
return ret;
}
private void HandleGet(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, jobId, internalId;
MangaConnector? connector;
Manga? manga;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Connectors":
SendResponse(HttpStatusCode.OK, response, _parent.GetConnectors().Select(con => con.name).ToArray());
break;
case "Manga/Cover":
if (!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetPublicationById(internalId, out manga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
string filePath = manga?.coverFileNameInCache ?? "";
if (File.Exists(filePath))
{
FileStream coverStream = new(filePath, FileMode.Open);
SendResponse(HttpStatusCode.OK, response, coverStream);
}
else
{
SendResponse(HttpStatusCode.NotFound, response);
}
break;
case "Manga/FromConnector":
requestVariables.TryGetValue("title", out string? title);
requestVariables.TryGetValue("url", out string? url);
if (!requestVariables.TryGetValue("connector", out connectorName) ||
!_parent.TryGetConnector(connectorName, out connector) ||
(title is null && url is null))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (url is not null)
{
HashSet<Manga> ret = new();
manga = connector!.GetMangaFromUrl(url);
if (manga is not null)
ret.Add((Manga)manga);
SendResponse(HttpStatusCode.OK, response, ret);
}else
SendResponse(HttpStatusCode.OK, response, connector!.GetManga(title!));
break;
case "Manga/Chapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetConnector(connectorName, out connector) ||
!_parent.TryGetPublicationById(internalId, out manga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
requestVariables.TryGetValue("translatedLanguage", out string? translatedLanguage);
SendResponse(HttpStatusCode.OK, response, connector!.GetChapters((Manga)manga!, translatedLanguage??"en"));
break;
case "Jobs":
if (!requestVariables.TryGetValue("jobId", out jobId))
{
if(!_parent.jobBoss.jobs.Any(jjob => jjob.id == jobId))
SendResponse(HttpStatusCode.BadRequest, response);
else
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.First(jjob => jjob.id == jobId));
break;
}
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs);
break;
case "Jobs/Progress":
if (requestVariables.TryGetValue("jobId", out jobId))
{
if(!_parent.jobBoss.jobs.Any(jjob => jjob.id == jobId))
SendResponse(HttpStatusCode.BadRequest, response);
else
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.First(jjob => jjob.id == jobId).progressToken);
break;
}
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Select(jjob => jjob.progressToken));
break;
case "Jobs/Running":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob.progressToken.state is ProgressToken.State.Running));
break;
case "Jobs/Waiting":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob.progressToken.state is ProgressToken.State.Standby).OrderBy(jjob => jjob.nextExecution));
break;
case "Jobs/MonitorJobs":
SendResponse(HttpStatusCode.OK, response, _parent.jobBoss.jobs.Where(jjob => jjob is DownloadNewChapters).OrderBy(jjob => ((DownloadNewChapters)jjob).manga.sortName));
break;
case "Settings":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.AsJObject());
break;
case "Settings/userAgent":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.userAgent);
break;
case "Settings/customRequestLimit":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.requestLimits);
break;
case "Settings/AprilFoolsMode":
SendResponse(HttpStatusCode.OK, response, TrangaSettings.aprilFoolsMode);
break;
case "NotificationConnectors":
SendResponse(HttpStatusCode.OK, response, notificationConnectors);
break;
case "NotificationConnectors/Types":
SendResponse(HttpStatusCode.OK, response,
Enum.GetValues<NotificationConnector.NotificationConnectorType>().Select(nc => new KeyValuePair<byte, string?>((byte)nc, Enum.GetName(nc))));
break;
case "LibraryConnectors":
SendResponse(HttpStatusCode.OK, response, libraryConnectors);
break;
case "LibraryConnectors/Types":
SendResponse(HttpStatusCode.OK, response,
Enum.GetValues<LibraryConnector.LibraryType>().Select(lc => new KeyValuePair<byte, string?>((byte)lc, Enum.GetName(lc))));
break;
case "Ping":
SendResponse(HttpStatusCode.OK, response, "Pong");
break;
case "LogMessages":
if (logger is null || !File.Exists(logger?.logFilePath))
{
SendResponse(HttpStatusCode.NotFound, response);
break;
}
if (requestVariables.TryGetValue("count", out string? count))
{
try
{
uint messageCount = uint.Parse(count);
SendResponse(HttpStatusCode.OK, response, logger.Tail(messageCount));
}
catch (FormatException f)
{
SendResponse(HttpStatusCode.InternalServerError, response, f);
}
}else
SendResponse(HttpStatusCode.OK, response, logger.GetLog());
break;
case "LogFile":
if (logger is null || !File.Exists(logger?.logFilePath))
{
SendResponse(HttpStatusCode.NotFound, response);
break;
}
string logDir = new FileInfo(logger.logFilePath).DirectoryName!;
string tmpFilePath = Path.Join(logDir, "Tranga.log");
File.Copy(logger.logFilePath, tmpFilePath);
SendResponse(HttpStatusCode.OK, response, new FileStream(tmpFilePath, FileMode.Open));
File.Delete(tmpFilePath);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void HandlePost(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, internalId, jobId, chapterNumStr, customFolderName, translatedLanguage, notificationConnectorStr, libraryConnectorStr;
MangaConnector? connector;
Manga? tmpManga;
Manga manga;
Job? job;
NotificationConnector.NotificationConnectorType notificationConnectorType;
LibraryConnector.LibraryType libraryConnectorType;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Manga":
if(!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetPublicationById(internalId, out tmpManga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
SendResponse(HttpStatusCode.OK, response, manga);
break;
case "Jobs/MonitorManga":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!requestVariables.TryGetValue("interval", out string? intervalStr) ||
!_parent.TryGetConnector(connectorName, out connector)||
!_parent.TryGetPublicationById(internalId, out tmpManga) ||
!TimeSpan.TryParse(intervalStr, out TimeSpan interval))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
if (requestVariables.TryGetValue("ignoreBelowChapterNum", out chapterNumStr))
{
if (!float.TryParse(chapterNumStr, numberFormatDecimalPoint, out float chapterNum))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga.ignoreChaptersBelow = chapterNum;
}
if (requestVariables.TryGetValue("customFolderName", out customFolderName))
manga.MovePublicationFolder(TrangaSettings.downloadLocation, customFolderName);
requestVariables.TryGetValue("translatedLanguage", out translatedLanguage);
_parent.jobBoss.AddJob(new DownloadNewChapters(this, connector!, manga, true, interval, translatedLanguage: translatedLanguage??"en"));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/DownloadNewChapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
!_parent.TryGetConnector(connectorName, out connector)||
!_parent.TryGetPublicationById(internalId, out tmpManga))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga = (Manga)tmpManga!;
if (requestVariables.TryGetValue("ignoreBelowChapterNum", out chapterNumStr))
{
if (!float.TryParse(chapterNumStr, numberFormatDecimalPoint, out float chapterNum))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
manga.ignoreChaptersBelow = chapterNum;
}
if (requestVariables.TryGetValue("customFolderName", out customFolderName))
manga.MovePublicationFolder(TrangaSettings.downloadLocation, customFolderName);
requestVariables.TryGetValue("translatedLanguage", out translatedLanguage);
_parent.jobBoss.AddJob(new DownloadNewChapters(this, connector!, manga, false, translatedLanguage: translatedLanguage??"en"));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/UpdateMetadata":
if (!requestVariables.TryGetValue("internalId", out internalId))
{
foreach (Job pJob in _parent.jobBoss.jobs.Where(possibleDncJob =>
possibleDncJob.jobType is Job.JobType.DownloadNewChaptersJob).ToArray())//ToArray to avoid modifying the collection while adding new jobs
{
DownloadNewChapters dncJob = pJob as DownloadNewChapters ??
throw new Exception("Has to be DownloadNewChapters Job");
_parent.jobBoss.AddJob(new UpdateMetadata(this, dncJob.mangaConnector, dncJob.manga));
}
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
Job[] possibleDncJobs = _parent.jobBoss.GetJobsLike(internalId: internalId).ToArray();
switch (possibleDncJobs.Length)
{
case <1: SendResponse(HttpStatusCode.BadRequest, response, "Could not find matching release"); break;
case >1: SendResponse(HttpStatusCode.BadRequest, response, "Multiple releases??"); break;
default:
DownloadNewChapters dncJob = possibleDncJobs[0] as DownloadNewChapters ??
throw new Exception("Has to be DownloadNewChapters Job");
_parent.jobBoss.AddJob(new UpdateMetadata(this, dncJob.mangaConnector, dncJob.manga));
SendResponse(HttpStatusCode.Accepted, response);
break;
}
}
break;
case "Jobs/StartNow":
if (!requestVariables.TryGetValue("jobId", out jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
_parent.jobBoss.AddJobToQueue(job!);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/Cancel":
if (!requestVariables.TryGetValue("jobId", out jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
job!.Cancel();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/UpdateDownloadLocation":
if (!requestVariables.TryGetValue("downloadLocation", out string? downloadLocation) ||
!requestVariables.TryGetValue("moveFiles", out string? moveFilesStr) ||
!bool.TryParse(moveFilesStr, out bool moveFiles))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateDownloadLocation(downloadLocation, moveFiles);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/AprilFoolsMode":
if (!requestVariables.TryGetValue("enabled", out string? aprilFoolsModeEnabledStr) ||
!bool.TryParse(aprilFoolsModeEnabledStr, out bool aprilFoolsModeEnabled))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateAprilFoolsMode(aprilFoolsModeEnabled);
SendResponse(HttpStatusCode.Accepted, response);
break;
/*case "Settings/UpdateWorkingDirectory":
if (!requestVariables.TryGetValue("workingDirectory", out string? workingDirectory))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
settings.UpdateWorkingDirectory(workingDirectory);
SendResponse(HttpStatusCode.Accepted, response);
break;*/
case "Settings/userAgent":
if(!requestVariables.TryGetValue("userAgent", out string? customUserAgent))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateUserAgent(customUserAgent);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/userAgent/Reset":
TrangaSettings.UpdateUserAgent(null);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/customRequestLimit":
if (!requestVariables.TryGetValue("requestType", out string? requestTypeStr) ||
!requestVariables.TryGetValue("requestsPerMinute", out string? requestsPerMinuteStr) ||
!Enum.TryParse(requestTypeStr, out RequestType requestType) ||
!int.TryParse(requestsPerMinuteStr, out int requestsPerMinute))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
TrangaSettings.UpdateRateLimit(requestType, requestsPerMinute);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Settings/customRequestLimit/Reset":
TrangaSettings.ResetRateLimits();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors/Update":
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Gotify)
{
if (!requestVariables.TryGetValue("gotifyUrl", out string? gotifyUrl) ||
!requestVariables.TryGetValue("gotifyAppToken", out string? gotifyAppToken))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new Gotify(this, gotifyUrl, gotifyAppToken));
SendResponse(HttpStatusCode.Accepted, response);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.LunaSea)
{
if (!requestVariables.TryGetValue("lunaseaWebhook", out string? lunaseaWebhook))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new LunaSea(this, lunaseaWebhook));
SendResponse(HttpStatusCode.Accepted, response);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Ntfy)
{
if (!requestVariables.TryGetValue("ntfyUrl", out string? ntfyUrl) ||
!requestVariables.TryGetValue("ntfyUser", out string? ntfyUser)||
!requestVariables.TryGetValue("ntfyPass", out string? ntfyPass))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddNotificationConnector(new Ntfy(this, ntfyUrl, ntfyUser, ntfyPass, null));
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
}
break;
case "NotificationConnectors/Test":
NotificationConnector notificationConnector;
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Gotify)
{
if (!requestVariables.TryGetValue("gotifyUrl", out string? gotifyUrl) ||
!requestVariables.TryGetValue("gotifyAppToken", out string? gotifyAppToken))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new Gotify(this, gotifyUrl, gotifyAppToken);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.LunaSea)
{
if (!requestVariables.TryGetValue("lunaseaWebhook", out string? lunaseaWebhook))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new LunaSea(this, lunaseaWebhook);
}else if (notificationConnectorType is NotificationConnector.NotificationConnectorType.Ntfy)
{
if (!requestVariables.TryGetValue("ntfyUrl", out string? ntfyUrl) ||
!requestVariables.TryGetValue("ntfyUser", out string? ntfyUser)||
!requestVariables.TryGetValue("ntfyPass", out string? ntfyPass))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector = new Ntfy(this, ntfyUrl, ntfyUser, ntfyPass, null);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
notificationConnector.SendNotification("Tranga Test", "This is Test-Notification.");
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors/Reset":
if (!requestVariables.TryGetValue("notificationConnector", out notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteNotificationConnector(notificationConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors/Update":
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (libraryConnectorType is LibraryConnector.LibraryType.Kavita)
{
if (!requestVariables.TryGetValue("kavitaUrl", out string? kavitaUrl) ||
!requestVariables.TryGetValue("kavitaUsername", out string? kavitaUsername) ||
!requestVariables.TryGetValue("kavitaPassword", out string? kavitaPassword))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddLibraryConnector(new Kavita(this, kavitaUrl, kavitaUsername, kavitaPassword));
SendResponse(HttpStatusCode.Accepted, response);
}else if (libraryConnectorType is LibraryConnector.LibraryType.Komga)
{
if (!requestVariables.TryGetValue("komgaUrl", out string? komgaUrl) ||
!requestVariables.TryGetValue("komgaAuth", out string? komgaAuth))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
AddLibraryConnector(new Komga(this, komgaUrl, komgaAuth));
SendResponse(HttpStatusCode.Accepted, response);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
}
break;
case "LibraryConnectors/Test":
LibraryConnector libraryConnector;
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
if (libraryConnectorType is LibraryConnector.LibraryType.Kavita)
{
if (!requestVariables.TryGetValue("kavitaUrl", out string? kavitaUrl) ||
!requestVariables.TryGetValue("kavitaUsername", out string? kavitaUsername) ||
!requestVariables.TryGetValue("kavitaPassword", out string? kavitaPassword))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector = new Kavita(this, kavitaUrl, kavitaUsername, kavitaPassword);
}else if (libraryConnectorType is LibraryConnector.LibraryType.Komga)
{
if (!requestVariables.TryGetValue("komgaUrl", out string? komgaUrl) ||
!requestVariables.TryGetValue("komgaAuth", out string? komgaAuth))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector = new Komga(this, komgaUrl, komgaAuth);
}
else
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
libraryConnector.UpdateLibrary();
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors/Reset":
if (!requestVariables.TryGetValue("libraryConnector", out libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr, out libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteLibraryConnector(libraryConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void HandleDelete(HttpListenerRequest request, HttpListenerResponse response)
{
Dictionary<string, string> requestVariables = GetRequestVariables(request.Url!.Query);
string? connectorName, internalId;
MangaConnector connector;
Manga manga;
string path = Regex.Match(request.Url!.LocalPath, @"[A-z0-9]+(\/[A-z0-9]+)*").Value;
switch (path)
{
case "Jobs":
if (!requestVariables.TryGetValue("jobId", out string? jobId) ||
!_parent.jobBoss.TryGetJobById(jobId, out Job? job))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
_parent.jobBoss.RemoveJob(job!);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "Jobs/DownloadNewChapters":
if(!requestVariables.TryGetValue("connector", out connectorName) ||
!requestVariables.TryGetValue("internalId", out internalId) ||
_parent.GetConnector(connectorName) is null ||
_parent.GetPublicationById(internalId) is null)
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
connector = _parent.GetConnector(connectorName)!;
manga = (Manga)_parent.GetPublicationById(internalId)!;
_parent.jobBoss.RemoveJobs(_parent.jobBoss.GetJobsLike(connector, manga));
SendResponse(HttpStatusCode.Accepted, response);
break;
case "NotificationConnectors":
if (!requestVariables.TryGetValue("notificationConnector", out string? notificationConnectorStr) ||
!Enum.TryParse(notificationConnectorStr, out NotificationConnector.NotificationConnectorType notificationConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteNotificationConnector(notificationConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
case "LibraryConnectors":
if (!requestVariables.TryGetValue("libraryConnectors", out string? libraryConnectorStr) ||
!Enum.TryParse(libraryConnectorStr,
out LibraryConnector.LibraryType libraryConnectorType))
{
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
DeleteLibraryConnector(libraryConnectorType);
SendResponse(HttpStatusCode.Accepted, response);
break;
default:
SendResponse(HttpStatusCode.BadRequest, response);
break;
}
}
private void SendResponse(HttpStatusCode statusCode, HttpListenerResponse response, object? content = null)
{
//Log($"Response: {statusCode} {content}");
response.StatusCode = (int)statusCode;
response.AddHeader("Access-Control-Allow-Headers", "Content-Type, Accept, X-Requested-With");
response.AddHeader("Access-Control-Allow-Methods", "GET, POST, DELETE");
response.AddHeader("Access-Control-Max-Age", "1728000");
response.AppendHeader("Access-Control-Allow-Origin", "*");
try
{
if (content is not Stream)
{
response.ContentType = "application/json";
response.AddHeader("Cache-Control", "no-store");
response.OutputStream.Write(content is not null
? Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(content))
: Array.Empty<byte>());
response.OutputStream.Close();
}
else if (content is FileStream stream)
{
string contentType = stream.Name.Split('.')[^1];
response.AddHeader("Cache-Control", "max-age=600");
switch (contentType.ToLower())
{
case "gif":
response.ContentType = "image/gif";
break;
case "png":
response.ContentType = "image/png";
break;
case "jpg":
case "jpeg":
response.ContentType = "image/jpeg";
break;
case "log":
response.ContentType = "text/plain";
break;
}
stream.CopyTo(response.OutputStream);
response.OutputStream.Close();
stream.Close();
}
}
catch (Exception e)
{
Log(e.ToString());
}
}
}
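
Because every handler above reads its arguments from query parameters (via GetRequestVariables), the API can be exercised with any HTTP client. A minimal client sketch, assuming a local server on the default port 6531; the connector name and internalId are illustrative:

using System;
using System.Net.Http;

// Hypothetical client for the endpoints handled above; 6531 is TrangaSettings.apiPortNumber's default.
HttpClient client = new() { BaseAddress = new Uri("http://localhost:6531/") };

// GET /Ping -> "Pong"
Console.WriteLine(await client.GetStringAsync("Ping"));

// GET /Connectors -> JSON array of available connector names
Console.WriteLine(await client.GetStringAsync("Connectors"));

// POST /Jobs/MonitorManga with query parameters; internalId here is a placeholder
HttpResponseMessage resp = await client.PostAsync(
    "Jobs/MonitorManga?connector=MangaDex&internalId=someInternalId&interval=03:00:00", null);
Console.WriteLine(resp.StatusCode); // Accepted on success, BadRequest if a parameter is missing or unknown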

View File

@@ -1,439 +0,0 @@
using Logging;
using Newtonsoft.Json;
using Tranga.Connectors;
using Tranga.LibraryManagers;
using Tranga.TrangaTasks;
namespace Tranga;
/// <summary>
/// Manages all TrangaTasks.
/// Provides a Threaded environment to execute Tasks, and still manage the Task-Collection
/// </summary>
public class TaskManager
{
public Dictionary<Publication, List<Chapter>> chapterCollection = new();
private HashSet<TrangaTask> _allTasks = new HashSet<TrangaTask>();
private bool _continueRunning = true;
private readonly Connector[] _connectors;
public TrangaSettings settings { get; }
private Logger? logger { get; }
/// <param name="downloadFolderPath">Local path to save data (Manga) to</param>
/// <param name="workingDirectory">Path to the working directory</param>
/// <param name="imageCachePath">Path to the cover-image cache</param>
/// <param name="libraryManagers"></param>
/// <param name="logger"></param>
public TaskManager(string downloadFolderPath, string workingDirectory, string imageCachePath, HashSet<LibraryManager> libraryManagers, Logger? logger = null)
{
this.logger = logger;
this.settings = new TrangaSettings(downloadFolderPath, workingDirectory, libraryManagers);
ExportDataAndSettings();
this._connectors = new Connector[]
{
new MangaDex(downloadFolderPath, imageCachePath, logger),
new Manganato(downloadFolderPath, imageCachePath, logger),
new Mangasee(downloadFolderPath, imageCachePath, logger)
};
Thread taskChecker = new(TaskCheckerThread);
taskChecker.Start();
}
public void UpdateSettings(string? downloadLocation, string? komgaUrl, string? komgaAuth, string? kavitaUrl, string? kavitaUsername, string? kavitaPassword)
{
if (komgaUrl is not null && komgaAuth is not null && komgaUrl.Length > 0 && komgaAuth.Length > 0)
{
settings.libraryManagers.RemoveWhere(lm => lm.GetType() == typeof(Komga));
settings.libraryManagers.Add(new Komga(komgaUrl, komgaAuth, logger));
}
if (kavitaUrl is not null && kavitaUsername is not null && kavitaPassword is not null && kavitaUrl.Length > 0 && kavitaUsername.Length > 0 && kavitaPassword.Length > 0)
{
settings.libraryManagers.RemoveWhere(lm => lm.GetType() == typeof(Kavita));
settings.libraryManagers.Add(new Kavita(kavitaUrl, kavitaUsername, kavitaPassword, logger));
}
if (downloadLocation is not null && downloadLocation.Length > 0)
settings.downloadLocation = downloadLocation;
ExportDataAndSettings();
}
public TaskManager(TrangaSettings settings, Logger? logger = null)
{
this.logger = logger;
this._connectors = new Connector[]
{
new MangaDex(settings.downloadLocation, settings.coverImageCache, logger),
new Manganato(settings.downloadLocation, settings.coverImageCache, logger),
new Mangasee(settings.downloadLocation, settings.coverImageCache, logger)
};
this.settings = settings;
ImportData();
ExportDataAndSettings();
Thread taskChecker = new(TaskCheckerThread);
taskChecker.Start();
}
/// <summary>
/// Runs continuously until shutdown.
/// Checks if tasks have to be executed (time elapsed)
/// </summary>
private void TaskCheckerThread()
{
logger?.WriteLine(this.GetType().ToString(), "Starting TaskCheckerThread.");
int allTasksWaitingLength = _allTasks.Count(task => task.state is TrangaTask.ExecutionState.Waiting);
while (_continueRunning)
{
TrangaTask[] tmp = _allTasks.Where(taskQuery =>
taskQuery.nextExecution < DateTime.Now &&
taskQuery.state is TrangaTask.ExecutionState.Waiting or TrangaTask.ExecutionState.Enqueued).ToArray();
foreach (TrangaTask task in tmp)
{
task.state = TrangaTask.ExecutionState.Enqueued;
switch (task.task)
{
case TrangaTask.Task.DownloadNewChapters:
if (!_allTasks.Any(taskQuery => taskQuery.task == TrangaTask.Task.DownloadNewChapters &&
taskQuery.state is TrangaTask.ExecutionState.Running &&
((DownloadNewChaptersTask)taskQuery).connectorName == ((DownloadNewChaptersTask)task).connectorName))
{
ExecuteTaskNow(task);
}
break;
case TrangaTask.Task.DownloadChapter:
if (!_allTasks.Any(taskQuery =>
taskQuery.task == TrangaTask.Task.DownloadChapter &&
taskQuery.state is TrangaTask.ExecutionState.Running &&
((DownloadChapterTask)taskQuery).connectorName ==
((DownloadChapterTask)task).connectorName))
{
ExecuteTaskNow(task);
}
break;
case TrangaTask.Task.UpdateLibraries:
ExecuteTaskNow(task);
break;
}
}
if(allTasksWaitingLength != _allTasks.Count(task => task.state is TrangaTask.ExecutionState.Waiting))
ExportDataAndSettings();
allTasksWaitingLength = _allTasks.Count(task => task.state is TrangaTask.ExecutionState.Waiting);
Thread.Sleep(1000);
}
}
/// <summary>
/// Forces the execution of a given task
/// </summary>
/// <param name="task">Task to execute</param>
public void ExecuteTaskNow(TrangaTask task)
{
task.state = TrangaTask.ExecutionState.Running;
Task t = new(() =>
{
task.Execute(this, this.logger);
});
t.Start();
}
public void AddTask(TrangaTask newTask)
{
logger?.WriteLine(this.GetType().ToString(), $"Adding new Task {newTask}");
switch (newTask.task)
{
case TrangaTask.Task.UpdateLibraries:
//Only one UpdateKomgaLibrary Task
logger?.WriteLine(this.GetType().ToString(), $"Removing old {newTask.task}-Task.");
_allTasks.RemoveWhere(trangaTask => trangaTask.task is TrangaTask.Task.UpdateLibraries);
break;
case TrangaTask.Task.DownloadNewChapters:
IEnumerable<TrangaTask> matchingdnc =
_allTasks.Where(mTask => mTask.GetType() == typeof(DownloadNewChaptersTask));
if (matchingdnc.All(mTask =>
((DownloadNewChaptersTask)mTask).publication.internalId != ((DownloadNewChaptersTask)newTask).publication.publicationId &&
((DownloadNewChaptersTask)mTask).connectorName != ((DownloadNewChaptersTask)newTask).connectorName))
_allTasks.Add(newTask);
else
logger?.WriteLine(this.GetType().ToString(), $"Task already exists {newTask}");
break;
case TrangaTask.Task.DownloadChapter:
IEnumerable<TrangaTask> matchingdc =
_allTasks.Where(mTask => mTask.GetType() == typeof(DownloadChapterTask));
if (!matchingdc.Any(mTask =>
((DownloadChapterTask)mTask).publication.internalId == ((DownloadChapterTask)newTask).publication.internalId &&
((DownloadChapterTask)mTask).connectorName == ((DownloadChapterTask)newTask).connectorName &&
((DownloadChapterTask)mTask).chapter.sortNumber == ((DownloadChapterTask)newTask).chapter.sortNumber))
_allTasks.Add(newTask);
else
logger?.WriteLine(this.GetType().ToString(), $"Task already exists {newTask}");
break;
}
ExportDataAndSettings();
}
public void DeleteTask(TrangaTask removeTask)
{
logger?.WriteLine(this.GetType().ToString(), $"Removing Task {removeTask}");
_allTasks.Remove(removeTask);
}
public TrangaTask? AddTask(TrangaTask.Task taskType, string? connectorName, string? publicationId,
TimeSpan reoccurrenceTime, string? language = "en")
{
TrangaTask? newTask = null;
switch (taskType)
{
case TrangaTask.Task.UpdateLibraries:
newTask = new UpdateLibrariesTask(taskType, reoccurrenceTime);
break;
case TrangaTask.Task.DownloadNewChapters:
if(connectorName is null || publicationId is null || language is null)
logger?.WriteLine(this.GetType().ToString(), $"Values connectorName, publicationName and language can not be null.");
GetConnector(connectorName); //Check if connectorName is valid
Publication publication = GetAllPublications().First(pub => pub.internalId == publicationId);
newTask = new DownloadNewChaptersTask(taskType, connectorName!, publication, reoccurrenceTime, language!);
break;
}
if(newTask is not null)
AddTask(newTask);
return newTask;
}
/// <summary>
/// Removes Task from task-collection
/// </summary>
/// <param name="task">TrangaTask.Task type</param>
/// <param name="connectorName">Name of Connector that was used</param>
/// <param name="publicationId">Publication that was used</param>
public void DeleteTask(TrangaTask.Task task, string? connectorName, string? publicationId)
{
logger?.WriteLine(this.GetType().ToString(), $"Removing Task {task} {publicationId}");
switch (task)
{
case TrangaTask.Task.UpdateLibraries:
//Only one UpdateKomgaLibrary Task
logger?.WriteLine(this.GetType().ToString(), $"Removing old {task}-Task.");
_allTasks.RemoveWhere(trangaTask => trangaTask.task is TrangaTask.Task.UpdateLibraries);
break;
case TrangaTask.Task.DownloadNewChapters:
if (connectorName is null || publicationId is null)
logger?.WriteLine(this.GetType().ToString(), "connectorName and publication can not be null");
else
{
_allTasks.RemoveWhere(mTask =>
mTask.GetType() == typeof(DownloadNewChaptersTask) &&
((DownloadNewChaptersTask)mTask).publication.internalId == publicationId &&
((DownloadNewChaptersTask)mTask).connectorName == connectorName!);
_allTasks.RemoveWhere(mTask =>
mTask.GetType() == typeof(DownloadChapterTask) &&
((DownloadChapterTask)mTask).publication.internalId == publicationId &&
((DownloadChapterTask)mTask).connectorName == connectorName!);
}
break;
}
ExportDataAndSettings();
}
public IEnumerable<TrangaTask> GetTasksMatching(TrangaTask.Task taskType, string? connectorName = null, string? searchString = null, string? internalId = null)
{
switch (taskType)
{
case TrangaTask.Task.UpdateLibraries:
return _allTasks.Where(tTask => tTask.task == TrangaTask.Task.UpdateLibraries);
case TrangaTask.Task.DownloadNewChapters:
if(connectorName is null)
return _allTasks.Where(tTask => tTask.task == taskType);
GetConnector(connectorName);//Name check
IEnumerable<TrangaTask> matchingdnc = _allTasks.Where(tTask => tTask.GetType() == typeof(DownloadNewChaptersTask));
if (searchString is not null)
{
return matchingdnc.Where(mTask =>
((DownloadNewChaptersTask)mTask).connectorName == connectorName &&
((DownloadNewChaptersTask)mTask).ToString().Contains(searchString, StringComparison.InvariantCultureIgnoreCase));
}
else if (internalId is not null)
{
return matchingdnc.Where(mTask =>
((DownloadNewChaptersTask)mTask).connectorName == connectorName &&
((DownloadNewChaptersTask)mTask).publication.internalId == internalId);
}
else
return _allTasks.Where(tTask =>
tTask.GetType() == typeof(DownloadNewChaptersTask) &&
((DownloadNewChaptersTask)tTask).connectorName == connectorName);
case TrangaTask.Task.DownloadChapter:
if(connectorName is null)
return _allTasks.Where(tTask => tTask.task == taskType);
GetConnector(connectorName);//Name check
IEnumerable<TrangaTask> matchingdc = _allTasks.Where(tTask => tTask.GetType() == typeof(DownloadChapterTask));
if (searchString is not null)
{
return matchingdc.Where(mTask =>
((DownloadChapterTask)mTask).connectorName == connectorName &&
((DownloadChapterTask)mTask).ToString().Contains(searchString, StringComparison.InvariantCultureIgnoreCase));
}
else if (internalId is not null)
{
return matchingdc.Where(mTask =>
((DownloadChapterTask)mTask).connectorName == connectorName &&
((DownloadChapterTask)mTask).publication.publicationId == internalId);
}
else
return _allTasks.Where(tTask =>
tTask.GetType() == typeof(DownloadChapterTask) &&
((DownloadChapterTask)tTask).connectorName == connectorName);
default:
return Array.Empty<TrangaTask>();
}
}
/// <summary>
/// Removes a Task from the queue
/// </summary>
/// <param name="task"></param>
public void RemoveTaskFromQueue(TrangaTask task)
{
task.lastExecuted = DateTime.Now;
task.state = TrangaTask.ExecutionState.Waiting;
}
/// <summary>
/// Sets last execution time to start of time
/// Let taskManager handle enqueuing
/// </summary>
/// <param name="task"></param>
public void AddTaskToQueue(TrangaTask task)
{
task.lastExecuted = DateTime.UnixEpoch;
}
/// <returns>All available Connectors</returns>
public Dictionary<string, Connector> GetAvailableConnectors()
{
return this._connectors.ToDictionary(connector => connector.name, connector => connector);
}
/// <returns>All TrangaTasks in task-collection</returns>
public TrangaTask[] GetAllTasks()
{
TrangaTask[] ret = new TrangaTask[_allTasks.Count];
_allTasks.CopyTo(ret);
return ret;
}
public Publication[] GetPublicationsFromConnector(Connector connector, string? title = null)
{
Publication[] ret = connector.GetPublications(title ?? "");
foreach (Publication publication in ret)
{
if(chapterCollection.All(pub => pub.Key.internalId != publication.internalId))
this.chapterCollection.TryAdd(publication, new List<Chapter>());
}
return ret;
}
/// <returns>All added Publications</returns>
public Publication[] GetAllPublications()
{
return this.chapterCollection.Keys.ToArray();
}
/// <summary>
/// Return Connector with given Name
/// </summary>
/// <param name="connectorName">Connector-name (exact)</param>
/// <exception cref="Exception">If Connector is not available</exception>
public Connector GetConnector(string? connectorName)
{
if(connectorName is null)
throw new Exception($"connectorName can not be null");
Connector? ret = this._connectors.FirstOrDefault(connector => connector.name == connectorName);
if (ret is null)
throw new Exception($"Connector {connectorName} is not an available Connector.");
return ret;
}
/// <summary>
/// Shuts down the taskManager.
/// </summary>
/// <param name="force">If force is true, tasks are aborted.</param>
public void Shutdown(bool force = false)
{
logger?.WriteLine(this.GetType().ToString(), $"Shutting down (forced={force})");
_continueRunning = false;
ExportDataAndSettings();
if(force)
Environment.Exit(_allTasks.Count(task => task.state is TrangaTask.ExecutionState.Enqueued or TrangaTask.ExecutionState.Running));
//Wait for tasks to finish
while(_allTasks.Any(task => task.state is TrangaTask.ExecutionState.Running or TrangaTask.ExecutionState.Enqueued))
Thread.Sleep(10);
logger?.WriteLine(this.GetType().ToString(), "Tasks finished. Bye!");
Environment.Exit(0);
}
private void ImportData()
{
logger?.WriteLine(this.GetType().ToString(), "Importing Data");
string buffer;
if (File.Exists(settings.tasksFilePath))
{
logger?.WriteLine(this.GetType().ToString(), $"Importing tasks from {settings.tasksFilePath}");
buffer = File.ReadAllText(settings.tasksFilePath);
this._allTasks = JsonConvert.DeserializeObject<HashSet<TrangaTask>>(buffer, new JsonSerializerSettings() { Converters = { new TrangaTask.TrangaTaskJsonConverter() } })!;
}
if (File.Exists(settings.knownPublicationsPath))
{
logger?.WriteLine(this.GetType().ToString(), $"Importing known publications from {settings.knownPublicationsPath}");
buffer = File.ReadAllText(settings.knownPublicationsPath);
Publication[] publications = JsonConvert.DeserializeObject<Publication[]>(buffer)!;
foreach (Publication publication in publications)
this.chapterCollection.TryAdd(publication, new List<Chapter>());
}
}
/// <summary>
/// Exports data (settings, tasks) to file
/// </summary>
private void ExportDataAndSettings()
{
logger?.WriteLine(this.GetType().ToString(), $"Exporting settings to {settings.settingsFilePath}");
while(IsFileInUse(settings.settingsFilePath))
Thread.Sleep(50);
File.WriteAllText(settings.settingsFilePath, JsonConvert.SerializeObject(settings));
logger?.WriteLine(this.GetType().ToString(), $"Exporting tasks to {settings.tasksFilePath}");
while(IsFileInUse(settings.tasksFilePath))
Thread.Sleep(50);
File.WriteAllText(settings.tasksFilePath, JsonConvert.SerializeObject(this._allTasks));
logger?.WriteLine(this.GetType().ToString(), $"Exporting known publications to {settings.knownPublicationsPath}");
while(IsFileInUse(settings.knownPublicationsPath))
Thread.Sleep(50);
File.WriteAllText(settings.knownPublicationsPath, JsonConvert.SerializeObject(this.chapterCollection.Keys.ToArray()));
}
private bool IsFileInUse(string path)
{
if (!File.Exists(path))
return false;
try
{
using FileStream stream = new (path, FileMode.Open, FileAccess.Read, FileShare.None);
stream.Close();
}
catch (IOException)
{
return true;
}
return false;
}
}

92
Tranga/Tranga.cs Normal file
View File

@@ -0,0 +1,92 @@
using Logging;
using Tranga.Jobs;
using Tranga.MangaConnectors;
namespace Tranga;
public partial class Tranga : GlobalBase
{
public bool keepRunning;
public JobBoss jobBoss;
private Server _server;
private HashSet<MangaConnector> _connectors;
public Tranga(Logger? logger) : base(logger)
{
Log("\n\n _______ \n|_ _|.----..---.-..-----..-----..---.-.\n | | | _|| _ || || _ || _ |\n |___| |__| |___._||__|__||___ ||___._|\n |_____| \n\n");
keepRunning = true;
_connectors = new HashSet<MangaConnector>()
{
new Manganato(this),
new Mangasee(this),
new MangaDex(this),
new MangaKatana(this),
new Mangaworld(this),
new Bato(this),
new MangaLife(this),
new ManhuaPlus(this),
new MangaHere(this),
};
foreach(DirectoryInfo dir in new DirectoryInfo(Path.GetTempPath()).GetDirectories("trangatemp"))//Cleanup old temp folders
dir.Delete();
jobBoss = new(this, this._connectors);
StartJobBoss();
this._server = new Server(this);
string[] emojis = { "(•‿•)", "(づ \u25d5‿\u25d5 )づ", "( \u02d8\u25bd\u02d8)っ\u2668", "=\uff3e\u25cf \u22cf \u25cf\uff3e=", "(ΦωΦ)", "(\u272a\u3268\u272a)", "( ノ・o・ )ノ", "(〜^\u2207^ )〜", "~(\u2267ω\u2266)~","૮ \u00b4• ﻌ \u00b4• ა", "(\u02c3ᆺ\u02c2)", "(=\ud83d\udf66 \u0f1d \ud83d\udf66=)"};
SendNotifications("Tranga Started", emojis[Random.Shared.Next(0,emojis.Length-1)]);
Log(TrangaSettings.AsJObject().ToString());
}
public MangaConnector? GetConnector(string name)
{
foreach(MangaConnector mc in _connectors)
if (mc.name.Equals(name, StringComparison.InvariantCultureIgnoreCase))
return mc;
return null;
}
public bool TryGetConnector(string name, out MangaConnector? connector)
{
connector = GetConnector(name);
return connector is not null;
}
public IEnumerable<MangaConnector> GetConnectors()
{
return _connectors;
}
public Manga? GetPublicationById(string internalId) => GetCachedManga(internalId);
public bool TryGetPublicationById(string internalId, out Manga? manga)
{
manga = GetPublicationById(internalId);
return manga is not null;
}
private void StartJobBoss()
{
Thread t = new (() =>
{
while (keepRunning)
{
if(!TrangaSettings.aprilFoolsMode || !IsAprilFirst())
jobBoss.CheckJobs();
else
Log("April Fools Mode in Effect");
Thread.Sleep(100);
}
});
t.Start();
}
private bool IsAprilFirst()
{
//UTC 01 Apr +-12hrs
DateTime start = new DateTime(DateTime.Now.Year, 03, 31, 12, 0, 0, DateTimeKind.Utc);
DateTime end = new DateTime(DateTime.Now.Year, 04, 02, 12, 0, 0, DateTimeKind.Utc);
if (DateTime.UtcNow > start && DateTime.UtcNow < end)
return true;
return false;
}
}

View File

@@ -1,19 +1,30 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net7.0</TargetFramework>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<OutputType>Exe</OutputType>
<LangVersion>12</LangVersion>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="GlaxArguments" Version="1.1.0" />
<PackageReference Include="HtmlAgilityPack" Version="1.11.46" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
<PackageReference Include="PuppeteerSharp" Version="10.0.0" />
<PackageReference Include="Soenneker.Utils.String.NeedlemanWunsch" Version="2.1.301" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Logging\Logging.csproj" />
</ItemGroup>
<ItemGroup>
<Content Include="..\.dockerignore">
<Link>.dockerignore</Link>
<DependentUpon>Dockerfile</DependentUpon>
</Content>
</ItemGroup>
</Project>

51
Tranga/TrangaArgs.cs Normal file
View File

@@ -0,0 +1,51 @@
using Logging;
using GlaxArguments;
namespace Tranga;
public partial class Tranga : GlobalBase
{
public static void Main(string[] args)
{
Argument downloadLocation = new (new[] { "-d", "--downloadLocation" }, 1, "Directory to which downloaded Manga are saved");
Argument workingDirectory = new (new[] { "-w", "--workingDirectory" }, 1, "Directory in which application-data is saved");
Argument consoleLogger = new (new []{"-c", "--consoleLogger"}, 0, "Enables the consoleLogger");
Argument fileLogger = new (new []{"-f", "--fileLogger"}, 0, "Enables the fileLogger");
Argument fPath = new (new []{"-l", "--fPath"}, 1, "Log Folder Path");
Argument[] arguments = new[]
{
downloadLocation,
workingDirectory,
consoleLogger,
fileLogger,
fPath
};
ArgumentFetcher fetcher = new (arguments);
Dictionary<Argument, string[]> fetched = fetcher.Fetch(args);
string? directoryPath = fetched.TryGetValue(fPath, out string[]? path) ? path[0] : null;
if (directoryPath is not null && !Directory.Exists(directoryPath))
Directory.CreateDirectory(directoryPath);
List<Logger.LoggerType> enabledLoggers = new();
if(fetched.ContainsKey(consoleLogger))
enabledLoggers.Add(Logger.LoggerType.ConsoleLogger);
if (fetched.ContainsKey(fileLogger))
enabledLoggers.Add(Logger.LoggerType.FileLogger);
Logger logger = new(enabledLoggers.ToArray(), Console.Out, Console.OutputEncoding, directoryPath);
bool dlp = fetched.TryGetValue(downloadLocation, out string[]? downloadLocationPath);
bool wdp = fetched.TryGetValue(workingDirectory, out string[]? workingDirectoryPath);
if (wdp)
TrangaSettings.LoadFromWorkingDirectory(workingDirectoryPath![0]);
else
TrangaSettings.CreateOrUpdate();
if(dlp)
TrangaSettings.CreateOrUpdate(downloadDirectory: downloadLocationPath![0]);
Tranga _ = new (logger);
}
}
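
Putting the arguments together, a start-up with both loggers enabled might look like the following (binary name and paths are illustrative): dotnet Tranga.dll -d /Manga -w /usr/share/tranga-api -c -f -l /var/log/tranga. The -w value is applied first so existing settings in that working directory are loaded, and -d then overrides the download directory.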

View File

@@ -1,39 +1,195 @@
using Logging;
using System.Runtime.InteropServices;
using Newtonsoft.Json;
using Tranga.LibraryManagers;
using Newtonsoft.Json.Linq;
using Tranga.LibraryConnectors;
using Tranga.MangaConnectors;
using Tranga.NotificationConnectors;
using static System.IO.UnixFileMode;
namespace Tranga;
public class TrangaSettings
public static class TrangaSettings
{
public string downloadLocation { get; set; }
public string workingDirectory { get; set; }
[JsonIgnore]public string settingsFilePath => Path.Join(workingDirectory, "settings.json");
[JsonIgnore]public string tasksFilePath => Path.Join(workingDirectory, "tasks.json");
[JsonIgnore]public string knownPublicationsPath => Path.Join(workingDirectory, "knownPublications.json");
[JsonIgnore] public string coverImageCache => Path.Join(workingDirectory, "imageCache");
public HashSet<LibraryManager> libraryManagers { get; }
public TrangaSettings(string downloadLocation, string workingDirectory, HashSet<LibraryManager> libraryManagers)
[JsonIgnore] internal static readonly string DefaultUserAgent = $"Tranga ({Enum.GetName(Environment.OSVersion.Platform)}; {(Environment.Is64BitOperatingSystem ? "x64" : "")}) / 1.0";
public static string downloadLocation { get; private set; } = (RuntimeInformation.IsOSPlatform(OSPlatform.Linux) ? "/Manga" : Path.Join(Directory.GetCurrentDirectory(), "Downloads"));
public static string workingDirectory { get; private set; } = Path.Join(RuntimeInformation.IsOSPlatform(OSPlatform.Linux) ? "/usr/share" : Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "tranga-api");
public static int apiPortNumber { get; private set; } = 6531;
public static string userAgent { get; private set; } = DefaultUserAgent;
public static bool bufferLibraryUpdates { get; private set; } = false;
public static bool bufferNotifications { get; private set; } = false;
[JsonIgnore] public static string settingsFilePath => Path.Join(workingDirectory, "settings.json");
[JsonIgnore] public static string libraryConnectorsFilePath => Path.Join(workingDirectory, "libraryConnectors.json");
[JsonIgnore] public static string notificationConnectorsFilePath => Path.Join(workingDirectory, "notificationConnectors.json");
[JsonIgnore] public static string jobsFolderPath => Path.Join(workingDirectory, "jobs");
[JsonIgnore] public static string coverImageCache => Path.Join(workingDirectory, "imageCache");
public static ushort? version { get; } = 2;
public static bool aprilFoolsMode { get; private set; } = true;
[JsonIgnore]internal static readonly Dictionary<RequestType, int> DefaultRequestLimits = new ()
{
if (downloadLocation.Length < 1 || workingDirectory.Length < 1)
throw new ArgumentException("Download-location and working-directory paths can not be empty!");
this.workingDirectory = workingDirectory;
this.downloadLocation = downloadLocation;
this.libraryManagers = libraryManagers;
{RequestType.MangaInfo, 250},
{RequestType.MangaDexFeed, 250},
{RequestType.MangaDexImage, 40},
{RequestType.MangaImage, 60},
{RequestType.MangaCover, 250},
{RequestType.Default, 60}
};
public static Dictionary<RequestType, int> requestLimits { get; set; } = DefaultRequestLimits;
public static void LoadFromWorkingDirectory(string directory)
{
TrangaSettings.workingDirectory = directory;
if(File.Exists(settingsFilePath))
Deserialize(File.ReadAllText(settingsFilePath));
else return;
Directory.CreateDirectory(downloadLocation);
Directory.CreateDirectory(workingDirectory);
ExportSettings();
}
public static TrangaSettings LoadSettings(string importFilePath, Logger? logger)
public static void CreateOrUpdate(string? downloadDirectory = null, string? pWorkingDirectory = null, int? pApiPortNumber = null, string? pUserAgent = null, bool? pAprilFoolsMode = null, bool? pBufferLibraryUpdates = null, bool? pBufferNotifications = null)
{
if (!File.Exists(importFilePath))
return new TrangaSettings(Path.Join(Directory.GetCurrentDirectory(), "Downloads"), Directory.GetCurrentDirectory(), new HashSet<LibraryManager>());
if(pWorkingDirectory is null && File.Exists(settingsFilePath))
LoadFromWorkingDirectory(workingDirectory);
downloadLocation = downloadDirectory ?? downloadLocation;
workingDirectory = pWorkingDirectory ?? workingDirectory;
apiPortNumber = pApiPortNumber ?? apiPortNumber;
userAgent = pUserAgent ?? userAgent;
aprilFoolsMode = pAprilFoolsMode ?? aprilFoolsMode;
bufferLibraryUpdates = pBufferLibraryUpdates ?? bufferLibraryUpdates;
bufferNotifications = pBufferNotifications ?? bufferNotifications;
Directory.CreateDirectory(downloadLocation);
Directory.CreateDirectory(workingDirectory);
ExportSettings();
}
string toRead = File.ReadAllText(importFilePath);
TrangaSettings settings = JsonConvert.DeserializeObject<TrangaSettings>(toRead, new JsonSerializerSettings() { Converters = { new LibraryManager.LibraryManagerJsonConverter()} })!;
if(logger is not null)
foreach(LibraryManager lm in settings.libraryManagers)
lm.AddLogger(logger);
public static HashSet<LibraryConnector> LoadLibraryConnectors(GlobalBase clone)
{
if (!File.Exists(libraryConnectorsFilePath))
return new HashSet<LibraryConnector>();
return JsonConvert.DeserializeObject<HashSet<LibraryConnector>>(File.ReadAllText(libraryConnectorsFilePath),
new JsonSerializerSettings()
{
Converters =
{
new LibraryManagerJsonConverter(clone)
}
})!;
}
return settings;
public static HashSet<NotificationConnector> LoadNotificationConnectors(GlobalBase clone)
{
if (!File.Exists(notificationConnectorsFilePath))
return new HashSet<NotificationConnector>();
return JsonConvert.DeserializeObject<HashSet<NotificationConnector>>(File.ReadAllText(notificationConnectorsFilePath),
new JsonSerializerSettings()
{
Converters =
{
new NotificationManagerJsonConverter(clone)
}
})!;
}
public static void UpdateAprilFoolsMode(bool enabled)
{
aprilFoolsMode = enabled;
ExportSettings();
}
public static void UpdateDownloadLocation(string newPath, bool moveFiles = true)
{
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
Directory.CreateDirectory(newPath,
GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
else
Directory.CreateDirectory(newPath);
if (moveFiles && Directory.Exists(downloadLocation))
Directory.Move(downloadLocation, newPath);
downloadLocation = newPath;
ExportSettings();
}
public static void UpdateWorkingDirectory(string newPath)
{
if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
Directory.CreateDirectory(newPath,
GroupRead | GroupWrite | None | OtherRead | OtherWrite | UserRead | UserWrite);
else
Directory.CreateDirectory(newPath);
Directory.Move(workingDirectory, newPath);
workingDirectory = newPath;
ExportSettings();
}
public static void UpdateUserAgent(string? customUserAgent)
{
userAgent = customUserAgent ?? DefaultUserAgent;
ExportSettings();
}
public static void UpdateRateLimit(RequestType requestType, int newLimit)
{
requestLimits[requestType] = newLimit;
ExportSettings();
}
public static void ResetRateLimits()
{
requestLimits = DefaultRequestLimits;
ExportSettings();
}
public static void ExportSettings()
{
if (File.Exists(settingsFilePath))
{
while(GlobalBase.IsFileInUse(settingsFilePath, null))
Thread.Sleep(100);
}
else
Directory.CreateDirectory(new FileInfo(settingsFilePath).DirectoryName!);
File.WriteAllText(settingsFilePath, Serialize());
}
public static JObject AsJObject()
{
JObject jobj = new JObject();
jobj.Add("downloadLocation", JToken.FromObject(downloadLocation));
jobj.Add("workingDirectory", JToken.FromObject(workingDirectory));
jobj.Add("apiPortNumber", JToken.FromObject(apiPortNumber));
jobj.Add("userAgent", JToken.FromObject(userAgent));
jobj.Add("aprilFoolsMode", JToken.FromObject(aprilFoolsMode));
jobj.Add("version", JToken.FromObject(version));
jobj.Add("requestLimits", JToken.FromObject(requestLimits));
jobj.Add("bufferLibraryUpdates", JToken.FromObject(bufferLibraryUpdates));
jobj.Add("bufferNotifications", JToken.FromObject(bufferNotifications));
return jobj;
}
public static string Serialize() => AsJObject().ToString();
public static void Deserialize(string serialized)
{
JObject jobj = JObject.Parse(serialized);
if (jobj.TryGetValue("downloadLocation", out JToken? dl))
downloadLocation = dl.Value<string>()!;
if (jobj.TryGetValue("workingDirectory", out JToken? wd))
workingDirectory = wd.Value<string>()!;
if (jobj.TryGetValue("apiPortNumber", out JToken? apn))
apiPortNumber = apn.Value<int>();
if (jobj.TryGetValue("userAgent", out JToken? ua))
userAgent = ua.Value<string>()!;
if (jobj.TryGetValue("aprilFoolsMode", out JToken? afm))
aprilFoolsMode = afm.Value<bool>()!;
if (jobj.TryGetValue("requestLimits", out JToken? rl))
requestLimits = rl.ToObject<Dictionary<RequestType, int>>()!;
if (jobj.TryGetValue("bufferLibraryUpdates", out JToken? blu))
bufferLibraryUpdates = blu.Value<bool>()!;
if (jobj.TryGetValue("bufferNotifications", out JToken? bn))
bufferNotifications = bn.Value<bool>()!;
}
}
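A minimal usage sketch (not part of the diff above) of how the new static TrangaSettings can be round-tripped through its Serialize/Deserialize helpers; the backup path is illustrative only:

// Hypothetical round-trip, assuming `using System.IO;` and the Tranga namespace are in scope.
// Serialize() wraps AsJObject(); Deserialize() only overwrites the keys present in the JSON.
string backupPath = "/tmp/tranga-settings-backup.json"; // illustrative path, not used by Tranga itself
File.WriteAllText(backupPath, TrangaSettings.Serialize());
TrangaSettings.Deserialize(File.ReadAllText(backupPath));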

View File

@ -1,131 +0,0 @@
using System.Text.Json.Serialization;
using Logging;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Tranga.TrangaTasks;
using JsonConverter = Newtonsoft.Json.JsonConverter;
namespace Tranga;
/// <summary>
/// Stores information on Task, when implementing new Tasks also update the serializer
/// </summary>
[JsonDerivedType(typeof(DownloadNewChaptersTask), 2)]
[JsonDerivedType(typeof(UpdateLibrariesTask), 3)]
[JsonDerivedType(typeof(DownloadChapterTask), 4)]
public abstract class TrangaTask
{
// ReSharper disable once CommentTypo ...Tell me why!
// ReSharper disable once MemberCanBePrivate.Global I want it thaaat way
public TimeSpan reoccurrence { get; }
public DateTime lastExecuted { get; set; }
public Task task { get; }
[Newtonsoft.Json.JsonIgnore]public ExecutionState state { get; set; }
[Newtonsoft.Json.JsonIgnore]public float progress { get; protected set; }
[Newtonsoft.Json.JsonIgnore]public DateTime nextExecution => lastExecuted.Add(reoccurrence);
[Newtonsoft.Json.JsonIgnore]public DateTime executionStarted { get; protected set; }
[Newtonsoft.Json.JsonIgnore]
public DateTime executionApproximatelyFinished => this.progress != 0
? this.executionStarted.Add(DateTime.Now.Subtract(this.executionStarted) / this.progress)
: DateTime.MaxValue;
[Newtonsoft.Json.JsonIgnore]
public TimeSpan executionApproximatelyRemaining => this.executionApproximatelyFinished.Subtract(DateTime.Now);
public enum ExecutionState
{
Waiting,
Enqueued,
Running
};
protected TrangaTask(Task task, TimeSpan reoccurrence)
{
this.reoccurrence = reoccurrence;
this.lastExecuted = DateTime.Now.Subtract(reoccurrence);
this.task = task;
this.progress = 0f;
this.executionStarted = DateTime.Now;
}
public float IncrementProgress(float amount)
{
this.progress += amount;
return this.progress;
}
/// <summary>
/// BL for concrete Tasks
/// </summary>
/// <param name="taskManager"></param>
/// <param name="logger"></param>
protected abstract void ExecuteTask(TaskManager taskManager, Logger? logger);
/// <summary>
/// Execute the task
/// </summary>
/// <param name="taskManager">Should be the parent taskManager</param>
/// <param name="logger"></param>
public void Execute(TaskManager taskManager, Logger? logger)
{
logger?.WriteLine(this.GetType().ToString(), $"Executing Task {this}");
this.state = ExecutionState.Running;
this.executionStarted = DateTime.Now;
ExecuteTask(taskManager, logger);
this.lastExecuted = DateTime.Now;
this.state = ExecutionState.Waiting;
logger?.WriteLine(this.GetType().ToString(), $"Finished Executing Task {this}");
}
/// <returns>True if elapsed time since last execution is greater than set interval</returns>
public bool ShouldExecute()
{
return nextExecution < DateTime.Now && state is ExecutionState.Waiting;
}
public enum Task : byte
{
DownloadNewChapters = 2,
UpdateLibraries = 3,
DownloadChapter = 4
}
public override string ToString()
{
return $"{task}, {lastExecuted}, {reoccurrence}, {state}, {progress:P2}, {executionApproximatelyFinished}, {executionApproximatelyRemaining}";
}
public class TrangaTaskJsonConverter : JsonConverter
{
public override bool CanConvert(Type objectType)
{
return (objectType == typeof(TrangaTask));
}
public override object ReadJson(JsonReader reader, Type objectType, object? existingValue, JsonSerializer serializer)
{
JObject jo = JObject.Load(reader);
if (jo["task"]!.Value<Int64>() == (Int64)Task.DownloadNewChapters)
return jo.ToObject<DownloadNewChaptersTask>(serializer)!;
if (jo["task"]!.Value<Int64>() == (Int64)Task.UpdateLibraries)
return jo.ToObject<UpdateLibrariesTask>(serializer)!;
if (jo["task"]!.Value<Int64>() == (Int64)Task.DownloadChapter)
return jo.ToObject<DownloadChapterTask>(serializer)!;
throw new Exception();
}
public override bool CanWrite => false;
/// <summary>
/// Don't call this
/// </summary>
public override void WriteJson(JsonWriter writer, object? value, JsonSerializer serializer)
{
throw new Exception("Dont call this");
}
}
}
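For orientation only (not part of the removed file), a polling loop over these task classes might have looked like the sketch below; `allTasks`, `taskManager`, and `logger` are assumed to exist elsewhere:

// Hypothetical scheduling loop: run every task whose interval has elapsed.
// ShouldExecute() is true once nextExecution has passed and the task is in the Waiting state.
foreach (TrangaTask task in allTasks)          // allTasks: assumed task collection
    if (task.ShouldExecute())
        task.Execute(taskManager, logger);     // sets state and lastExecuted; concrete tasks update progress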

View File

@ -1,42 +0,0 @@
using Logging;
using Newtonsoft.Json;
namespace Tranga.TrangaTasks;
public class DownloadChapterTask : TrangaTask
{
public string connectorName { get; }
public Publication publication { get; }
public string language { get; }
public Chapter chapter { get; }
[JsonIgnore]private DownloadNewChaptersTask? parentTask { get; init; }
public DownloadChapterTask(Task task, string connectorName, Publication publication, Chapter chapter, string language = "en", DownloadNewChaptersTask? parentTask = null) : base(task, TimeSpan.Zero)
{
this.chapter = chapter;
this.connectorName = connectorName;
this.publication = publication;
this.language = language;
this.parentTask = parentTask;
}
protected override void ExecuteTask(TaskManager taskManager, Logger? logger)
{
Publication pub = (Publication)this.publication!;
Connector connector = taskManager.GetConnector(this.connectorName);
connector.DownloadChapter(pub, this.chapter, this);
taskManager.DeleteTask(this);
}
public new float IncrementProgress(float amount)
{
this.progress += amount;
parentTask?.IncrementProgress(amount);
return this.progress;
}
public override string ToString()
{
return $"{base.ToString()}, {connectorName}, {publication.sortName} {publication.internalId}, Vol.{chapter.volumeNumber} Ch.{chapter.chapterNumber}";
}
}

View File

@ -1,68 +0,0 @@
using Logging;
using Newtonsoft.Json;
namespace Tranga.TrangaTasks;
public class DownloadNewChaptersTask : TrangaTask
{
public string connectorName { get; }
public Publication publication { get; }
public string language { get; }
[JsonIgnore]private int childTaskAmount { get; set; }
public DownloadNewChaptersTask(Task task, string connectorName, Publication publication, TimeSpan reoccurrence, string language = "en") : base(task, reoccurrence)
{
this.connectorName = connectorName;
this.publication = publication;
this.language = language;
childTaskAmount = 0;
}
public new float IncrementProgress(float amount)
{
this.progress += amount / this.childTaskAmount;
return this.progress;
}
protected override void ExecuteTask(TaskManager taskManager, Logger? logger)
{
Publication pub = publication!;
Connector connector = taskManager.GetConnector(this.connectorName);
//Check if Publication already has a Folder
pub.CreatePublicationFolder(taskManager.settings.downloadLocation);
List<Chapter> newChapters = GetNewChaptersList(connector, pub, language!, ref taskManager.chapterCollection);
this.childTaskAmount = newChapters.Count;
connector.CopyCoverFromCacheToDownloadLocation(pub, taskManager.settings);
pub.SaveSeriesInfoJson(connector.downloadLocation);
foreach (Chapter newChapter in newChapters)
taskManager.AddTask(new DownloadChapterTask(Task.DownloadChapter, this.connectorName!, pub, newChapter, this.language, this));
}
/// <summary>
/// Updates the available Chapters of a Publication
/// </summary>
/// <param name="connector">Connector to use</param>
/// <param name="publication">Publication to check</param>
/// <param name="language">Language to receive chapters for</param>
/// <param name="chapterCollection"></param>
/// <returns>List of Chapters that were previously not in collection</returns>
private static List<Chapter> GetNewChaptersList(Connector connector, Publication publication, string language, ref Dictionary<Publication, List<Chapter>> chapterCollection)
{
List<Chapter> newChaptersList = new();
chapterCollection.TryAdd(publication, newChaptersList); //To ensure publication is actually in collection
Chapter[] newChapters = connector.GetChapters(publication, language);
newChaptersList = newChapters.Where(nChapter => !connector.CheckChapterIsDownloaded(publication, nChapter)).ToList();
return newChaptersList;
}
public override string ToString()
{
return $"{base.ToString()}, {connectorName}, {publication.sortName} {publication.internalId}";
}
}

View File

@ -1,17 +0,0 @@
using Logging;
namespace Tranga.TrangaTasks;
public class UpdateLibrariesTask : TrangaTask
{
public UpdateLibrariesTask(Task task, TimeSpan reoccurrence) : base(task, reoccurrence)
{
}
protected override void ExecuteTask(TaskManager taskManager, Logger? logger)
{
foreach(LibraryManager lm in taskManager.settings.libraryManagers)
lm.UpdateLibrary();
this.progress = 1f;
}
}

View File

@ -1,4 +0,0 @@
FROM nginx:alpine3.17-slim
COPY . /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

View File

@ -1,135 +0,0 @@
let apiUri = `http://${window.location.host.split(':')[0]}:6531`
if(getCookie("apiUri") != ""){
apiUri = getCookie("apiUri");
}
function getCookie(cname) {
let name = cname + "=";
let decodedCookie = decodeURIComponent(document.cookie);
let ca = decodedCookie.split(';');
for(let i = 0; i < ca.length; i++) {
let c = ca[i];
while (c.charAt(0) == ' ') {
c = c.substring(1);
}
if (c.indexOf(name) == 0) {
return c.substring(name.length, c.length);
}
}
return "";
}
async function GetData(uri){
let request = await fetch(uri, {
method: 'GET',
headers: {
'Accept': 'application/json'
}
});
let json = await request.json();
return json;
}
function PostData(uri){
fetch(uri, {
method: 'POST'
});
}
function DeleteData(uri){
fetch(uri, {
method: 'DELETE'
});
}
async function GetAvailableControllers(){
var uri = apiUri + "/Tranga/GetAvailableControllers";
let json = await GetData(uri);
return json;
}
async function GetPublication(connectorName, title){
var uri = apiUri + `/Tranga/GetPublicationsFromConnector?connectorName=${connectorName}&title=${title}`;
let json = await GetData(uri);
return json;
}
async function GetKnownPublications(){
var uri = apiUri + "/Tranga/GetKnownPublications";
let json = await GetData(uri);
return json;
}
async function GetTaskTypes(){
var uri = apiUri + "/Tasks/GetTaskTypes";
let json = await GetData(uri);
return json;
}
async function GetRunningTasks(){
var uri = apiUri + "/Tasks/GetRunningTasks";
let json = await GetData(uri);
return json;
}
async function GetDownloadTasks(){
var uri = apiUri + "/Tasks/Get?taskType=DownloadNewChapters";
let json = await GetData(uri);
return json;
}
async function GetSettings(){
var uri = apiUri + "/Settings/Get";
let json = await GetData(uri);
return json;
}
async function GetKomgaTask(){
var uri = apiUri + "/Tasks/Get?taskType=UpdateLibraries";
let json = await GetData(uri);
return json;
}
function CreateTask(taskType, reoccurrence, connectorName, publicationId, language){
var uri = apiUri + `/Tasks/Create?taskType=${taskType}&connectorName=${connectorName}&publicationId=${publicationId}&reoccurrenceTime=${reoccurrence}&language=${language}`;
PostData(uri);
}
function StartTask(taskType, connectorName, internalId){
var uri = apiUri + `/Tasks/Start?taskType=${taskType}&connectorName=${connectorName}&internalId=${internalId}`;
PostData(uri);
}
function EnqueueTask(taskType, connectorName, publicationId){
var uri = apiUri + `/Queue/Enqueue?taskType=${taskType}&connectorName=${connectorName}&publicationId=${publicationId}`;
PostData(uri);
}
function UpdateSettings(downloadLocation, komgaUrl, komgaAuth, kavitaUrl, kavitaUser, kavitaPass){
var uri = apiUri + "/Settings/Update?"
if(downloadLocation != ""){
uri += "&downloadLocation="+downloadLocation;
}
if(komgaUrl != "" && komgaAuth != ""){
uri += `&komgaUrl=${komgaUrl}&komgaAuth=${komgaAuth}`;
}
if(kavitaUrl != "" && kavitaUser != "" && kavitaPass != ""){
uri += `&kavitaUrl=${kavitaUrl}&kavitaUsername=${kavitaUser}&kavitaPassword=${kavitaPass}`;
}
PostData(uri);
}
function DeleteTask(taskType, connectorName, publicationId){
var uri = apiUri + `/Tasks/Delete?taskType=${taskType}&connectorName=${connectorName}&publicationId=${publicationId}`;
DeleteData(uri);
}
function DequeueTask(taskType, connectorName, publicationId){
var uri = apiUri + `/Queue/Dequeue?taskType=${taskType}&connectorName=${connectorName}&publicationId=${publicationId}`;
DeleteData(uri);
}
async function GetQueue(){
var uri = apiUri + "/Queue/GetList";
let json = await GetData(uri);
return json;
}

Binary file not shown (before: 66 KiB).

View File

@ -1,134 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Tranga</title>
<link rel="stylesheet" href="style.css">
<link rel="icon" type="image/x-icon" href="favicon.ico">
</head>
<body>
<wrapper>
<topbar>
<titlebox>
<img alt="website image is Blahaj" src="media/blahaj.png">
<span>Tranga</span>
</titlebox>
<spacer></spacer>
<searchdiv>
<label for="searchbox"></label><input id="searchbox" placeholder="Filter" type="text">
</searchdiv>
<img id="settingscog" src="media/settings-cogwheel.svg" height="100%" alt="settingscog">
</topbar>
<viewport>
<content>
<div id="addPublication">
<p>+</p>
</div>
<publication>
<img src="media/cover.jpg">
<publication-information>
<connector-name class="pill">MangaDex</connector-name>
<publication-name>Tensei Pandemic</publication-name>
</publication-information>
</publication>
</content>
<popup id="addTaskPopup">
<blur-background id="blurBackgroundTaskPopup"></blur-background>
<addtask-window>
<window-titlebar>
<p>Add Task</p>
<img id="closePopupImg" src="media/close-x.svg" alt="Close">
</window-titlebar>
<window-content>
<addtask-settings>
<addtask-setting><label for="selectReccurrence">Recurrence</label><input id="selectReccurrence" type="time" value="01:00:00" step="3600"></addtask-setting>
<addtask-setting><label for="connectors">Connector</label>
<select id="connectors">
<option value=""></option>
</select>
</addtask-setting>
<addtask-setting><label for="searchPublicationQuery">Search Title</label><input id="searchPublicationQuery" type="text"></addtask-setting>
<input type="submit" value="Search" onclick="NewSearch();">
</addtask-settings>
<div id="taskSelectOutput"></div>
</window-content>
</addtask-window>
</popup>
<popup id="publicationViewerPopup">
<blur-background id="blurBackgroundPublicationPopup"></blur-background>
<publication-viewer>
<img id="pubviewcover" src="media/cover.jpg" alt="cover">
<publication-information>
<publication-name id="publicationViewerName">Tensei Pandemic</publication-name>
<publication-tags id="publicationViewerTags"></publication-tags>
<publication-author id="publicationViewerAuthor">Imamura Hinata</publication-author>
<publication-description id="publicationViewerDescription">Imamura Hinata is a high school boy with a cute appearance.
Since his trauma with the first love, he wanted to be more manly than anybody else. But one day he woke up to something different…
The total opposite of his ideal male body!
Pandemic love comedy!
</publication-description>
<publication-interactions>
<publication-starttask>Start Task ▶️</publication-starttask>
<publication-delete>Delete Task ❌</publication-delete>
<publication-add>Add Task </publication-add>
</publication-interactions>
</publication-information>
</publication-viewer>
</popup>
<popup id="settingsPopup">
<blur-background id="blurBackgroundSettingsPopup"></blur-background>
<settings>
<span style="font-weight: bold; text-align: center; font-size: 16pt;">Settings</span>
<div>
<p class="title">Download Location:</p>
<span id="downloadLocation"></span>
</div>
<div>
<p class="title">API-URI</p>
<label for="settingApiUri"></label><input placeholder="https://" type="text" id="settingApiUri">
</div>
<komga-settings>
<span class="title">Komga</span>
<div>Configured: <span id="komgaConfigured">✅❌</span></div>
<label for="komgaUrl"></label><input placeholder="URL" id="komgaUrl" type="text">
<label for="komgaUsername"></label><input placeholder="Username" id="komgaUsername" type="text">
<label for="komgaPassword"></label><input placeholder="Password" id="komgaPassword" type="password">
</komga-settings>
<kavita-settings>
<span class="title">Kavita</span>
<div>Configured: <span id="kavitaConfigured">✅❌</span></div>
<label for="kavitaUrl"></label><input placeholder="URL" id="kavitaUrl" type="text">
<label for="kavitaUsername"></label><input placeholder="Username" id="kavitaUsername" type="text">
<label for="kavitaPassword"></label><input placeholder="Password" id="kavitaPassword" type="password">
</kavita-settings>
<div>
<label for="libraryUpdateTime" style="margin-right: 5px;">Update Time</label><input id="libraryUpdateTime" type="time" value="00:01:00" step="10">
<input type="submit" value="Update" onclick="UpdateLibrarySettings()">
</div>
</settings>
</popup>
</viewport>
<footer>
<div>
<img src="media/running.svg" alt="running"><div id="tasksRunningTag">0</div>
</div>
<div>
<img src="media/queue.svg" alt="queue"><div id="tasksQueuedTag">0</div>
</div>
<div>
<img src="media/tasks.svg" alt="queue"><div id="totalTasksTag">0</div>
</div>
<p id="madeWith">Made with Blåhaj 🦈</p>
</footer>
</wrapper>
<footer-tag-popup>
<footer-tag-content>
<footer-tag-task-name>Test</footer-tag-task-name>
</footer-tag-content>
</footer-tag-popup>
<script src="apiConnector.js"></script>
<script src="interaction.js"></script>
</body>
</html>

View File

@ -1,446 +0,0 @@
let publications = [];
let tasks = [];
let toEditId;
const searchBox = document.querySelector("#searchbox");
const searchPublicationQuery = document.querySelector("#searchPublicationQuery");
const selectPublication = document.querySelector("#taskSelectOutput");
const connectorSelect = document.querySelector("#connectors");
const settingsPopup = document.querySelector("#settingsPopup");
const settingsCog = document.querySelector("#settingscog");
const selectRecurrence = document.querySelector("#selectReccurrence");
const tasksContent = document.querySelector("content");
const addTaskPopup = document.querySelector("#addTaskPopup");
const publicationViewerPopup = document.querySelector("#publicationViewerPopup");
const publicationViewerWindow = document.querySelector("publication-viewer");
const publicationViewerDescription = document.querySelector("#publicationViewerDescription");
const publicationViewerName = document.querySelector("#publicationViewerName");
const publicationViewerTags = document.querySelector("#publicationViewerTags");
const publicationViewerAuthor = document.querySelector("#publicationViewerAuthor");
const pubviewcover = document.querySelector("#pubviewcover");
const publicationDelete = document.querySelector("publication-delete");
const publicationAdd = document.querySelector("publication-add");
const publicationTaskStart = document.querySelector("publication-starttask");
const closetaskpopup = document.querySelector("#closePopupImg");
const settingDownloadLocation = document.querySelector("#downloadLocation");
const settingKomgaUrl = document.querySelector("#komgaUrl");
const settingKomgaUser = document.querySelector("#komgaUsername");
const settingKomgaPass = document.querySelector("#komgaPassword");
const settingKavitaUrl = document.querySelector("#kavitaUrl");
const settingKavitaUser = document.querySelector("#kavitaUsername");
const settingKavitaPass = document.querySelector("#kavitaPassword");
const libraryUpdateTime = document.querySelector("#libraryUpdateTime");
const settingKomgaConfigured = document.querySelector("#komgaConfigured");
const settingKavitaConfigured = document.querySelector("#kavitaConfigured");
const settingApiUri = document.querySelector("#settingApiUri");
const tagTasksRunning = document.querySelector("#tasksRunningTag");
const tagTasksQueued = document.querySelector("#tasksQueuedTag");
const tagTasksTotal = document.querySelector("#totalTasksTag");
const tagTaskPopup = document.querySelector("footer-tag-popup");
const tagTasksPopupContent = document.querySelector("footer-tag-content");
searchbox.addEventListener("keyup", (event) => FilterResults());
settingsCog.addEventListener("click", () => OpenSettings());
document.querySelector("#blurBackgroundSettingsPopup").addEventListener("click", () => HideSettings());
closetaskpopup.addEventListener("click", () => HideAddTaskPopup());
document.querySelector("#blurBackgroundTaskPopup").addEventListener("click", () => HideAddTaskPopup());
document.querySelector("#blurBackgroundPublicationPopup").addEventListener("click", () => HidePublicationPopup());
publicationDelete.addEventListener("click", () => DeleteTaskClick());
publicationAdd.addEventListener("click", () => AddTaskClick());
publicationTaskStart.addEventListener("click", () => StartTaskClick());
settingApiUri.addEventListener("keypress", (event) => {
if(event.key === "Enter"){
apiUri = settingApiUri.value;
setTimeout(() => GetSettingsClick(), 100);
document.cookie = `apiUri=${apiUri};`;
}
});
searchPublicationQuery.addEventListener("keypress", (event) => {
if(event.key === "Enter"){
NewSearch();
}
});
tagTasksRunning.addEventListener("mouseover", (event) => ShowRunningTasks(event));
tagTasksRunning.addEventListener("mouseout", () => CloseTasksPopup());
tagTasksQueued.addEventListener("mouseover", (event) => ShowQueuedTasks(event));
tagTasksQueued.addEventListener("mouseout", () => CloseTasksPopup());
tagTasksTotal.addEventListener("mouseover", (event) => ShowAllTasks(event));
tagTasksTotal.addEventListener("mouseout", () => CloseTasksPopup());
let availableConnectors;
GetAvailableControllers()
.then(json => availableConnectors = json)
.then(json =>
json.forEach(connector => {
var option = document.createElement('option');
option.value = connector;
option.innerText = connector;
connectorSelect.appendChild(option);
})
);
function NewSearch(){
//Disable inputs
selectRecurrence.disabled = true;
connectorSelect.disabled = true;
searchPublicationQuery.disabled = true;
//Waitcursor
document.body.style.cursor = "wait";
selectRecurrence.style.cursor = "wait";
connectorSelect.style.cursor = "wait";
searchPublicationQuery.style.cursor = "wait";
//Empty previous results
selectPublication.replaceChildren();
GetPublication(connectorSelect.value, searchPublicationQuery.value)
.then(json =>
json.forEach(publication => {
var option = CreatePublication(publication, connectorSelect.value);
option.addEventListener("click", (mouseEvent) => {
ShowPublicationViewerWindow(publication.internalId, mouseEvent, true);
});
selectPublication.appendChild(option);
}
))
.then(() => {
//Re-enable inputs
selectRecurrence.disabled = false;
connectorSelect.disabled = false;
searchPublicationQuery.disabled = false;
//Cursor
document.body.style.cursor = "initial";
selectRecurrence.style.cursor = "initial";
connectorSelect.style.cursor = "initial";
searchPublicationQuery.style.cursor = "initial";
});
}
//Returns a new "Publication" Item to display in the tasks section
function CreatePublication(publication, connector){
var publicationElement = document.createElement('publication');
publicationElement.setAttribute("id", publication.internalId);
var img = document.createElement('img');
img.src = `imageCache/${publication.coverFileNameInCache}`;
publicationElement.appendChild(img);
var info = document.createElement('publication-information');
var connectorName = document.createElement('connector-name');
connectorName.innerText = connector;
connectorName.className = "pill";
info.appendChild(connectorName);
var publicationName = document.createElement('publication-name');
publicationName.innerText = publication.sortName;
info.appendChild(publicationName);
publicationElement.appendChild(info);
if(publications.filter(pub => pub.internalId === publication.internalId) < 1)
publications.push(publication);
return publicationElement;
}
function DeleteTaskClick(){
taskToDelete = tasks.filter(tTask => tTask.publication.internalId === toEditId)[0];
DeleteTask("DownloadNewChapters", taskToDelete.connectorName, toEditId);
HidePublicationPopup();
}
function AddTaskClick(){
CreateTask("DownloadNewChapters", selectRecurrence.value, connectorSelect.value, toEditId, "en")
HideAddTaskPopup();
HidePublicationPopup();
}
function StartTaskClick(){
var toEditTask = tasks.filter(task => task.publication.internalId == toEditId)[0];
StartTask("DownloadNewChapters", toEditTask.connectorName, toEditId);
HidePublicationPopup();
}
function ResetContent(){
//Delete everything
tasksContent.replaceChildren();
//Add "Add new Task" Button
var add = document.createElement("div");
add.setAttribute("id", "addPublication")
var plus = document.createElement("p");
plus.innerText = "+";
add.appendChild(plus);
add.addEventListener("click", () => ShowNewTaskWindow());
tasksContent.appendChild(add);
}
function ShowPublicationViewerWindow(publicationId, event, add){
//Show popup
publicationViewerPopup.style.display = "block";
//Set position to mouse-position
if(event.clientY < window.innerHeight - publicationViewerWindow.offsetHeight)
publicationViewerWindow.style.top = `${event.clientY}px`;
else
publicationViewerWindow.style.top = `${event.clientY - publicationViewerWindow.offsetHeight}px`;
if(event.clientX < window.innerWidth - publicationViewerWindow.offsetWidth)
publicationViewerWindow.style.left = `${event.clientX}px`;
else
publicationViewerWindow.style.left = `${event.clientX - publicationViewerWindow.offsetWidth}px`;
//Edit information inside the window
var publication = publications.filter(pub => pub.internalId === publicationId)[0];
publicationViewerName.innerText = publication.sortName;
publicationViewerTags.innerText = publication.tags.join(", ");
publicationViewerDescription.innerText = publication.description;
publicationViewerAuthor.innerText = publication.author;
pubviewcover.src = `imageCache/${publication.coverFileNameInCache}`;
toEditId = publicationId;
//Check what action should be listed
if(add){
publicationAdd.style.display = "initial";
publicationDelete.style.display = "none";
publicationTaskStart.style.display = "none";
}
else{
publicationAdd.style.display = "none";
publicationDelete.style.display = "initial";
publicationTaskStart.style.display = "initial";
}
}
function HidePublicationPopup(){
publicationViewerPopup.style.display = "none";
}
function ShowNewTaskWindow(){
selectPublication.replaceChildren();
addTaskPopup.style.display = "block";
}
function HideAddTaskPopup(){
addTaskPopup.style.display = "none";
}
const fadeIn = [
{ opacity: "0" },
{ opacity: "1" }
];
const fadeInTiming = {
duration: 50,
iterations: 1,
fill: "forwards"
}
function OpenSettings(){
GetSettingsClick();
settingsPopup.style.display = "flex";
}
function HideSettings(){
settingsPopup.style.display = "none";
}
function GetSettingsClick(){
settingApiUri.value = "";
settingKomgaUrl.value = "";
settingKomgaUser.value = "";
settingKomgaPass.value = "";
settingKavitaUrl.value = "";
settingKavitaUser.value = "";
settingKavitaPass.value = "";
settingKomgaConfigured.innerText = "❌";
settingKavitaConfigured.innerText = "❌";
settingApiUri.placeholder = apiUri;
GetSettings().then(json => {
settingDownloadLocation.innerText = json.downloadLocation;
json.libraryManagers.forEach(lm => {
if(lm.libraryType == 0){
settingKomgaUrl.placeholder = lm.baseUrl;
settingKomgaUser.placeholder = "User";
settingKomgaPass.placeholder = "***";
settingKomgaConfigured.innerText = "✅";
} else if(lm.libraryType == 1){
settingKavitaUrl.placeholder = lm.baseUrl;
settingKavitaUser.placeholder = "User";
settingKavitaPass.placeholder = "***";
settingKavitaConfigured.innerText = "✅";
}
});
});
GetKomgaTask().then(json => {
if(json.length > 0)
libraryUpdateTime.value = json[0].reoccurrence;
});
}
function UpdateLibrarySettings(){
if(settingKomgaUser.value != "" && settingKomgaPass != ""){
var auth = utf8_to_b64(`${settingKomgaUser.value}:${settingKomgaPass.value}`);
console.log(auth);
if(settingKomgaUrl.value != "")
UpdateSettings("", settingKomgaUrl.value, auth, "", "");
else
UpdateSettings("", settingKomgaUrl.placeholder, auth, "", "");
}
if(settingKavitaUrl.value != "" && settingKavitaUser.value != "" && settingKavitaPass.value != ""){
UpdateSettings("", "", "", settingKavitaUrl.value, settingKavitaUser.value, settingKavitaPass.value);
}
CreateTask("UpdateLibraries", libraryUpdateTime.value, "","","");
setTimeout(() => GetSettingsClick(), 200);
}
function utf8_to_b64( str ) {
return window.btoa(unescape(encodeURIComponent( str )));
}
function ShowRunningTasks(event){
GetRunningTasks()
.then(json => {
tagTasksPopupContent.replaceChildren();
json.forEach(task => {
if(task.publication != null){
var taskname = document.createElement("footer-tag-task-name");
if(task.task == 2)
taskname.innerText = `${task.publication.sortName} - ${task.progress.toLocaleString(undefined,{style: 'percent', minimumFractionDigits:2})}`;
else if(task.task == 4)
taskname.innerText = `${task.publication.sortName} Vol.${task.chapter.volumeNumber} Ch.${task.chapter.chapterNumber} - ${task.progress.toLocaleString(undefined,{style: 'percent', minimumFractionDigits:2})}`;
tagTasksPopupContent.appendChild(taskname);
}
});
if(tagTasksPopupContent.children.length > 0){
tagTaskPopup.style.display = "block";
tagTaskPopup.style.left = `${tagTasksRunning.offsetLeft - 20}px`;
}
});
}
function ShowQueuedTasks(event){
GetQueue()
.then(json => {
tagTasksPopupContent.replaceChildren();
json.forEach(task => {
var taskname = document.createElement("footer-tag-task-name");
if(task.task == 2)
taskname.innerText = `${task.publication.sortName}`;
else if(task.task == 4)
taskname.innerText = `${task.publication.sortName} Vol.${task.chapter.volumeNumber} Ch.${task.chapter.chapterNumber}`;
tagTasksPopupContent.appendChild(taskname);
});
if(json.length > 0){
tagTaskPopup.style.display = "block";
tagTaskPopup.style.left = `${tagTasksQueued.offsetLeft- 20}px`;
}
});
}
function ShowAllTasks(event){
GetDownloadTasks()
.then(json => {
tagTasksPopupContent.replaceChildren();
json.forEach(task => {
var taskname = document.createElement("footer-tag-task-name");
taskname.innerText = task.publication.sortName;
tagTasksPopupContent.appendChild(taskname);
});
if(json.length > 0){
tagTaskPopup.style.display = "block";
tagTaskPopup.style.left = `${tagTasksTotal.offsetLeft - 20}px`;
}
});
}
function CloseTasksPopup(){
tagTaskPopup.style.display = "none";
}
function FilterResults(){
if(searchBox.value.length > 0){
tasksContent.childNodes.forEach(publication => {
publication.childNodes.forEach(item => {
if(item.nodeName.toLowerCase() == "publication-information"){
item.childNodes.forEach(information => {
if(information.nodeName.toLowerCase() == "publication-name"){
if(!information.textContent.toLowerCase().includes(searchBox.value.toLowerCase())){
publication.style.display = "none";
}else{
publication.style.display = "initial";
}
}
});
}
});
});
}else{
tasksContent.childNodes.forEach(publication => publication.style.display = "initial");
}
}
//Resets the tasks shown
ResetContent();
//Get Tasks and show them
GetDownloadTasks()
.then(json => json.forEach(task => {
var publication = CreatePublication(task.publication, task.connectorName);
publication.addEventListener("click", (event) => ShowPublicationViewerWindow(task.publication.internalId, event, false));
tasksContent.appendChild(publication);
tasks.push(task);
}));
GetRunningTasks()
.then(json => {
tagTasksRunning.innerText = json.length;
});
GetDownloadTasks()
.then(json => {
tagTasksTotal.innerText = json.length;
});
GetQueue()
.then(json => {
tagTasksQueued.innerText = json.length;
})
setInterval(() => {
//Tasks from API
var cTasks = [];
GetDownloadTasks()
.then(json => json.forEach(task => cTasks.push(task)))
.then(() => {
//Only update view if tasks-amount has changed
if(tasks.length != cTasks.length) {
//Resets the tasks shown
ResetContent();
//Add all currenttasks to view
cTasks.forEach(task => {
var publication = CreatePublication(task.publication, task.connectorName);
publication.addEventListener("click", (event) => ShowPublicationViewerWindow(task.publication.internalId, event, false));
tasksContent.appendChild(publication);
})
tasks = cTasks;
}
}
);
GetRunningTasks()
.then(json => {
tagTasksRunning.innerText = json.length;
});
GetDownloadTasks()
.then(json => {
tagTasksTotal.innerText = json.length;
});
GetQueue()
.then(json => {
tagTasksQueued.innerText = json.length;
})
}, 1000);

Binary file not shown (before: 124 KiB).

View File

@ -1,4 +0,0 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
<path fill-rule="evenodd" clip-rule="evenodd" d="M5.29289 5.29289C5.68342 4.90237 6.31658 4.90237 6.70711 5.29289L12 10.5858L17.2929 5.29289C17.6834 4.90237 18.3166 4.90237 18.7071 5.29289C19.0976 5.68342 19.0976 6.31658 18.7071 6.70711L13.4142 12L18.7071 17.2929C19.0976 17.6834 19.0976 18.3166 18.7071 18.7071C18.3166 19.0976 17.6834 19.0976 17.2929 18.7071L12 13.4142L6.70711 18.7071C6.31658 19.0976 5.68342 19.0976 5.29289 18.7071C4.90237 18.3166 4.90237 17.6834 5.29289 17.2929L10.5858 12L5.29289 6.70711C4.90237 6.31658 4.90237 5.68342 5.29289 5.29289Z" fill="#0F1729"/>
</svg>

View File

@ -1,7 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 16 16" xmlns="http://www.w3.org/2000/svg" fill="none">
<g fill="#000000">
<path d="M2.23 2.674a.75.75 0 00-.96 1.152L3.578 5.75 1.27 7.674a.75.75 0 00.96 1.152l3-2.5a.75.75 0 000-1.152l-3-2.5zM8.25 5a.75.75 0 000 1.5h6a.75.75 0 000-1.5h-6zM5.5 9.25a.75.75 0 01.75-.75h8a.75.75 0 010 1.5h-8a.75.75 0 01-.75-.75zM6.25 12a.75.75 0 000 1.5h8a.75.75 0 000-1.5h-8z"/>
</g>
</svg>

View File

@ -1,53 +0,0 @@
<?xml version="1.0" encoding="iso-8859-1"?>
<!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg fill="#000000" version="1.1" id="Capa_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
width="800px" height="800px" viewBox="0 0 235.504 235.504"
xml:space="preserve">
<g>
<g>
<path d="M195.209,81.456l-49.227-0.15c0.737-0.886,1.351-1.868,2.284-2.583c3.282-2.497,3.911-7.166,1.427-10.438
c-2.501-3.266-7.161-3.919-10.443-1.423c-4.873,3.715-8.388,8.704-10.255,14.389l-22.191-0.064
c-9.508,0-19.588,7.398-22.938,16.851l-16.877,47.479c-1.775,5.013-1.338,9.966,1.207,13.568
c2.412,3.427,6.384,5.318,11.187,5.358l45.126,0.136c-1.509,5.186-4.701,9.622-9.352,12.424
c-4.891,2.957-10.636,3.814-16.172,2.444c-3.994-0.998-8.031,1.442-9.027,5.418c-0.99,4.012,1.445,8.035,5.432,9.032
c2.927,0.738,5.879,1.091,8.808,1.091c6.516,0,12.93-1.788,18.645-5.23c8.312-5.013,14.172-12.979,16.484-22.409
c0.232-0.905,0.232-1.823,0.124-2.713l28.296,0.092h0.049c2.925,0,5.854-0.89,8.684-2.147c0.2,0.493,0.32,1.014,0.661,1.471
c3.335,4.677,4.629,10.343,3.688,15.993c-0.95,5.627-4.028,10.536-8.688,13.862c-3.351,2.376-4.14,7.037-1.755,10.379
c1.466,2.04,3.751,3.122,6.062,3.122c1.491,0,3.006-0.429,4.312-1.367c7.919-5.61,13.16-13.966,14.771-23.52
c1.603-9.565-0.613-19.203-6.28-27.122c-0.48-0.693-1.134-1.19-1.779-1.659c1.318-1.831,2.501-3.763,3.238-5.854l16.863-47.464
c1.795-5.018,1.351-9.969-1.194-13.58C203.954,83.387,200.015,81.47,195.209,81.456z M201.979,98.405l-16.868,47.464
c-0.981,2.757-2.941,5.214-5.213,7.329c-0.337,0.16-0.706,0.229-1.026,0.465c-0.673,0.485-1.182,1.122-1.639,1.747
c-2.962,1.996-6.288,3.339-9.434,3.339v2.989l-0.044-2.989l-33.194-0.101c-0.232-0.076-0.424-0.261-0.661-0.324
c-1.435-0.353-2.805-0.145-4.095,0.309l-29.768-0.101l1.192-3.358c0.549-1.547-0.269-3.25-1.813-3.795
c-1.521-0.553-3.25,0.24-3.799,1.804l-1.899,5.334l-14.318-0.044c-2.805,0-5.063-0.998-6.336-2.813
c-1.437-2.032-1.603-4.921-0.463-8.144l16.877-47.478c2.48-6.979,10.417-12.868,17.356-12.868l12.217,0.038l-1.963,5.536
c-0.555,1.549,0.262,3.25,1.805,3.797c0.331,0.12,0.661,0.174,0.998,0.174c1.227,0,2.372-0.768,2.793-1.986l2.497-7.019
c0.064-0.164-0.048-0.322-0.016-0.487h2.512c-0.905,7.758,1.163,15.42,5.947,21.638c5.903,7.687,14.852,11.726,23.873,11.726
c6.371,0,12.771-2.001,18.186-6.129c3.266-2.488,3.911-7.167,1.426-10.441c-2.508-3.267-7.161-3.901-10.455-1.415
c-6.612,5.056-16.146,3.775-21.223-2.809c-2.445-3.194-3.487-7.133-2.958-11.117c0.061-0.503,0.353-0.916,0.481-1.402
l52.216,0.156c2.806,0,5.054,1.004,6.324,2.811C202.928,92.241,203.105,95.223,201.979,98.405z"/>
<path d="M107.997,127.194c-1.531-0.553-3.248,0.244-3.799,1.791l-4.302,12.099c-0.551,1.543,0.265,3.242,1.813,3.795
c0.331,0.116,0.659,0.16,0.998,0.16c1.214,0,2.372-0.765,2.801-1.976l4.294-12.099
C110.369,129.446,109.551,127.728,107.997,127.194z"/>
<path d="M116.6,103.014c-1.529-0.541-3.25,0.252-3.805,1.805l-4.298,12.088c-0.547,1.547,0.261,3.252,1.799,3.799
c0.329,0.12,0.659,0.172,1,0.172c1.222,0,2.368-0.769,2.809-1.983l4.294-12.09C118.955,105.268,118.139,103.555,116.6,103.014z"/>
<path d="M232.527,90.428l-14.896-0.038l0,0c-1.639,0-2.974,1.327-2.997,2.976c0,1.639,1.342,2.981,2.981,2.989l14.896,0.042l0,0
c1.643,0,2.978-1.331,2.993-2.979C235.504,91.763,234.17,90.436,232.527,90.428z"/>
<path d="M220.333,80.436c0.629,0,1.242-0.188,1.771-0.583l11.994-8.83c1.326-0.974,1.611-2.842,0.645-4.168
c-0.965-1.327-2.845-1.611-4.163-0.637l-11.998,8.833c-1.323,0.974-1.607,2.841-0.642,4.167
C218.513,80.003,219.418,80.436,220.333,80.436z"/>
<path d="M209.152,56.279c-1.547-0.549-3.25,0.269-3.787,1.805l-4.997,14.036c-0.537,1.547,0.26,3.252,1.803,3.807
c0.337,0.12,0.674,0.172,0.994,0.172c1.242,0,2.385-0.757,2.821-1.986l4.985-14.036C211.516,58.541,210.695,56.846,209.152,56.279
z"/>
<path d="M17.587,100.894h55.208c1.641,0,2.976-1.343,2.976-2.981c0-1.641-1.334-2.988-2.976-2.988H17.587
c-1.641,0-2.988,1.338-2.988,2.988C14.599,99.559,15.946,100.894,17.587,100.894z"/>
<path d="M68.471,119.328c0-1.641-1.345-2.987-2.986-2.987H10.283c-1.639,0-2.981,1.338-2.981,2.987
c0,1.639,1.342,2.974,2.981,2.974h55.202C67.119,122.301,68.471,120.967,68.471,119.328z"/>
<path d="M58.188,137.758H2.974c-1.641,0-2.974,1.335-2.974,2.989c0,1.64,1.333,2.974,2.974,2.974h55.214
c1.639,0,2.981-1.334,2.981-2.974C61.162,139.093,59.827,137.758,58.188,137.758z"/>
<path d="M169.611,28.097c11.821,0,21.403,9.584,21.403,21.41c0,11.82-9.582,21.408-21.403,21.408
c-11.822,0-21.412-9.588-21.412-21.408C148.199,37.681,157.789,28.097,169.611,28.097z"/>
</g>
</g>
</svg>

View File

@ -1,21 +0,0 @@
<?xml version="1.0" encoding="iso-8859-1"?>
<!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<svg fill="#000000" version="1.1" id="Capa_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
width="800px" height="800px" viewBox="0 0 93.5 93.5" xml:space="preserve">
<g>
<g>
<path d="M93.5,40.899c0-2.453-1.995-4.447-4.448-4.447H81.98c-0.74-2.545-1.756-5.001-3.035-7.331l4.998-5
c0.826-0.827,1.303-1.973,1.303-3.146c0-1.19-0.462-2.306-1.303-3.146L75.67,9.555c-1.613-1.615-4.673-1.618-6.29,0l-5,5
c-2.327-1.28-4.786-2.296-7.332-3.037v-7.07C57.048,1.995,55.053,0,52.602,0H40.899c-2.453,0-4.447,1.995-4.447,4.448v7.071
c-2.546,0.741-5.005,1.757-7.333,3.037l-5-5c-1.68-1.679-4.609-1.679-6.288,0L9.555,17.83c-1.734,1.734-1.734,4.555,0,6.289
l4.999,5c-1.279,2.33-2.295,4.788-3.036,7.333h-7.07C1.995,36.452,0,38.447,0,40.899V52.6c0,2.453,1.995,4.447,4.448,4.447h7.071
c0.74,2.545,1.757,5.003,3.036,7.332l-4.998,4.999c-0.827,0.827-1.303,1.974-1.303,3.146c0,1.189,0.462,2.307,1.302,3.146
l8.274,8.273c1.614,1.615,4.674,1.619,6.29,0l5-5c2.328,1.279,4.786,2.297,7.333,3.037v7.071c0,2.453,1.995,4.448,4.447,4.448
h11.702c2.453,0,4.446-1.995,4.446-4.448V81.98c2.546-0.74,5.005-1.756,7.332-3.037l5,5c1.681,1.68,4.608,1.68,6.288,0
l8.275-8.273c1.734-1.734,1.734-4.555,0-6.289l-4.998-5.001c1.279-2.329,2.295-4.787,3.035-7.332h7.071
c2.453,0,4.448-1.995,4.448-4.446V40.899z M62.947,46.75c0,8.932-7.266,16.197-16.197,16.197c-8.931,0-16.197-7.266-16.197-16.197
c0-8.931,7.266-16.197,16.197-16.197C55.682,30.553,62.947,37.819,62.947,46.75z"/>
</g>
</g>
</svg>

View File

@ -1,10 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg fill="#000000" height="800px" width="800px" version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
viewBox="0 0 24 24" enable-background="new 0 0 24 24" xml:space="preserve">
<g id="task">
<path d="M4,23.4l-3.7-3.7l1.4-1.4L4,20.6l4.3-4.3l1.4,1.4L4,23.4z M24,21H12v-2h12V21z M4,15.4l-3.7-3.7l1.4-1.4L4,12.6l4.3-4.3
l1.4,1.4L4,15.4z M24,13H12v-2h12V13z M4,7.4L0.3,3.7l1.4-1.4L4,4.6l4.3-4.3l1.4,1.4L4,7.4z M24,5H12V3h12V5z"/>
</g>
</svg>

View File

@ -1,518 +0,0 @@
:root{
--background-color: #030304;
--second-background-color: #fff;
--primary-color: #f5a9b8;
--secondary-color: #5bcefa;
--accent-color: #fff;
--topbar-height: 60px;
box-sizing: border-box;
}
body{
padding: 0;
margin: 0;
height: 100vh;
background-color: var(--background-color);
font-family: "Inter", sans-serif;
overflow-x: hidden;
}
wrapper {
display: flex;
flex-flow: column;
flex-wrap: nowrap;
height: 100vh;
}
background-placeholder{
background-color: var(--second-background-color);
opacity: 1;
position: absolute;
width: 100%;
height: 100%;
border-radius: 0 0 5px 0;
z-index: -1;
}
topbar {
display: flex;
align-items: center;
height: var(--topbar-height);
background-color: var(--secondary-color);
z-index: 100;
box-shadow: 0 0 20px black;
}
titlebox {
position: relative;
display: flex;
margin: 0 0 0 40px;
height: 100%;
align-items:center;
justify-content:center;
}
titlebox span{
cursor: default;
font-size: 24pt;
font-weight: bold;
background: linear-gradient(150deg, var(--primary-color), var(--accent-color));
-webkit-background-clip: text;
-webkit-text-fill-color: transparent;
margin-left: 20px;
}
titlebox img {
height: 100%;
margin-right: 10px;
cursor: grab;
}
spacer{
flex-grow: 1;
}
searchdiv{
display: block;
margin: 0 10px 0 0;
}
#searchbox {
padding: 3px 10px;
border: 0;
border-radius: 4px;
font-size: 14pt;
width: 250px;
}
#settingscog {
cursor: pointer;
margin: 0px 30px;
height: 50%;
filter: invert(100%) sepia(0%) saturate(7465%) hue-rotate(115deg) brightness(116%) contrast(101%);
}
viewport {
position: relative;
display: flex;
flex-flow: row;
flex-wrap: nowrap;
flex-grow: 1;
height: 100%;
overflow-y: scroll;
}
footer {
display: flex;
flex-direction: row;
flex-wrap: nowrap;
width: 100%;
height: 40px;
align-items: center;
justify-content: center;
background-color: var(--primary-color);
align-content: center;
}
footer > div {
height: 100%;
margin: 0 30px;
display: flex;
flex-direction: row;
flex-wrap: nowrap;
align-items: center;
cursor: pointer;
}
footer > div > *{
height: 40%;
margin: 0 5px;
}
#madeWith {
flex-grow: 1;
text-align: right;
margin-right: 20px;
cursor: url("media/blahaj.png"), grab;
}
content {
position: relative;
flex-grow: 1;
border-radius: 5px;
display: flex;
flex-direction: row;
flex-wrap: wrap;
justify-content: start;
align-content: start;
}
settings {
width: 50%;
background-color: var(--accent-color);
display: flex;
flex-direction: column;
z-index: 10;
position: absolute;
left: 25%;
top: 100px;
border-radius: 5px;
padding: 10px 0;
}
#settingsPopup{
z-index: 10;
}
settings > * {
margin: 0 20%;
}
settings input {
margin: 3px 0;
padding: 3px;
border-radius: 3px;
border: 1px solid rgba(0,0,0,0.2);
width: 100%;
}
settings .title {
font-weight: bolder;
font-size: 14pt;
margin: 15px 0 2px 0;
}
komga-settings {
margin-top: 20px;
display: flex;
flex-direction: column;
flex-wrap: nowrap;
}
#addPublication {
cursor: pointer;
background-color: var(--secondary-color);
width: 180px;
height: 300px;
border-radius: 5px;
margin: 10px 10px;
padding: 15px 20px;
position: relative;
}
#addPublication p{
width: 100%;
text-align: center;
font-size: 150pt;
vertical-align: middle;
line-height: 300px;
margin: 0;
color: var(--accent-color);
}
.pill {
flex-grow: 0;
height: 14pt;
font-size: 12pt;
border-radius: 9pt;
background-color: var(--primary-color);
padding: 2pt 17px;
color: black;
}
publication{
cursor: pointer;
background-color: var(--secondary-color);
width: 180px;
height: 300px;
border-radius: 5px;
margin: 10px 10px;
padding: 15px 20px;
position: relative;
}
publication::after{
content: '';
position: absolute;
left: 0; top: 0;
border-radius: 5px;
width: 100%; height: 100%;
background: linear-gradient(rgba(0,0,0,0.8), rgba(0, 0, 0, 0.7),rgba(0, 0, 0, 0.2));
}
publication-information {
display: flex;
flex-direction: column;
justify-content: start;
}
publication-information * {
z-index: 1;
color: var(--accent-color);
}
connector-name{
width: fit-content;
margin: 10px 0;
}
publication-name{
width: fit-content;
font-size: 16pt;
font-weight: bold;
}
publication img {
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
object-fit: cover;
z-index: 0;
border-radius: 5px;
}
popup{
display: none;
width: 100%;
min-height: 100%;
top: 0;
left: 0;
position: fixed;
z-index: 2;
}
blur-background {
width: 100%;
height: 100%;
position: absolute;
left: 0;
background-color: black;
opacity: 0.5;
}
addtask-window {
display: flex;
flex-direction: column;
flex-wrap: nowrap;
position: absolute;
left: 12.5%;
top: 15%;
width: 75%;
min-height: 70%;
max-height: 80%;
padding: 0;
background-color: var(--accent-color);
border-radius: 5px;
}
window-titlebar {
width: 100%;
height: 60px;
background-color: var(--primary-color);
border-radius: 5px 5px 0 0;
color: var(--accent-color);
display: flex block;
flex-direction: row;
justify-content: space-between;
align-items: center;
}
window-titlebar p {
margin: 0 30px;
font-size: 14pt;
font-weight: bolder;
letter-spacing: 1px;
}
window-titlebar #closePopupImg {
height: 70%;
cursor: pointer;
margin-right: 20px;
filter: invert(100%) sepia(0%) saturate(100%) hue-rotate(115deg) brightness(116%) contrast(101%);
}
window-content {
display: flex;
flex-direction: column;
padding: 20px 5%;
overflow-x: scroll;
}
addtask-settings{
display: flex;
justify-content: center;
align-items: center;
}
addtask-settings select, addtask-settings input{
padding: 5px;
font-size: 10pt;
border: 1px solid rgba(0,0,0,0.2);
border-radius: 3px;
background-color: transparent;
margin: 10px 0;
width: 150px;
}
addtask-settings label {
font-weight: bolder;
margin: 0 5px;
}
addtask-settings addtask-setting{
margin: 0 15px;
}
#taskSelectOutput{
display: flex;
flex-direction: row;
flex-wrap: wrap;
justify-content: start;
align-content: start;
}
#publicationViewerPopup{
z-index: 5;
}
publication-viewer{
display: block;
width: 450px;
position: absolute;
top: 200px;
left: 400px;
background-color: var(--accent-color);
border-radius: 5px;
overflow: hidden;
padding: 15px;
}
publication-viewer::after{
content: '';
position: absolute;
left: 0; top: 0;
border-radius: 5px;
width: 100%;
height: 100%;
background: rgba(0,0,0,0.8);
backdrop-filter: blur(3px);
}
publication-viewer img {
position: absolute;
left: 0;
top: 0;
height: 100%;
width: 100%;
object-fit: cover;
border-radius: 5px;
z-index: 0;
}
publication-viewer publication-information > * {
margin: 5px 0;
}
publication-viewer publication-information publication-name {
width: initial;
overflow-x: scroll;
white-space: nowrap;
scrollbar-width: none;
}
publication-viewer publication-information publication-tags::before {
content: "Tags";
display: block;
font-weight: bolder;
}
publication-viewer publication-information publication-tags {
overflow-x: scroll;
white-space: nowrap;
scrollbar-width: none;
}
publication-viewer publication-information publication-author::before {
content: "Author: ";
font-weight: bolder;
}
publication-viewer publication-information publication-description::before {
content: "Description";
display: block;
font-weight: bolder;
}
publication-viewer publication-information publication-description {
font-size: 12pt;
margin: 5px 0;
height: 145px;
overflow-x: scroll;
}
publication-viewer publication-information publication-interactions {
display: flex;
flex-direction: row;
justify-content: end;
align-items: start;
width: 100%;
}
publication-viewer publication-information publication-interactions > * {
margin: 0 10px;
font-size: 16pt;
cursor: pointer;
}
publication-viewer publication-information publication-interactions publication-starttask {
color: var(--secondary-color);
}
publication-viewer publication-information publication-interactions publication-delete {
color: red;
}
publication-viewer publication-information publication-interactions publication-add {
color: limegreen;
}
footer-tag-popup {
display: none;
padding: 2px 4px;
position: fixed;
bottom: 58px;
left: 20px;
background-color: var(--second-background-color);
z-index: 8;
border-radius: 5px;
max-height: 400px;
}
footer-tag-content{
position: relative;
max-height: 400px;
display: flex;
flex-direction: column;
flex-wrap: nowrap;
overflow-y: scroll;
}
footer-tag-content > * {
margin: 2px 5px;
}
footer-tag-popup::before{
content: "";
width: 0;
height: 0;
position: absolute;
border-right: 10px solid var(--second-background-color);
border-left: 10px solid transparent;
border-top: 10px solid var(--second-background-color);
border-bottom: 10px solid transparent;
left: 0px;
bottom: -17px;
border-radius: 0 0 0 5px;
}

docker-compose.local.yaml (new file, 21 lines)
View File

@ -0,0 +1,21 @@
version: '3'
services:
tranga-api:
build:
dockerfile: Dockerfile
context: .
container_name: tranga-api
volumes:
- ./Manga:/Manga
- ./settings:/usr/share/tranga-api
ports:
- "6531:6531"
restart: unless-stopped
tranga-website:
image: glax/tranga-website:latest
container_name: tranga-website
ports:
- "9555:80"
depends_on:
- tranga-api
restart: unless-stopped
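Assuming a standard Docker Compose v2 installation, this local compose file can typically be brought up with "docker compose -f docker-compose.local.yaml up -d --build", which builds the tranga-api image from the local Dockerfile and pulls the published tranga-website image.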

View File

@ -4,18 +4,16 @@ services:
image: glax/tranga-api:latest
container_name: tranga-api
volumes:
- ./tranga:/usr/share/Tranga-API #1 when replacing ./tranga replace #2 with same value
- ./Manga:/Manga
- ./settings:/usr/share/tranga-api
ports:
- 6531:80
- "6531:6531"
restart: unless-stopped
tranga-website:
image: glax/tranga-website:latest
container_name: tranga-website
volumes:
- ./tranga/imageCache:/usr/share/nginx/html/imageCache:ro #2 when replacing Point to same value as #1/imageCache
ports:
- 9555:80
- "9555:80"
depends_on:
- tranga-api
restart: unless-stopped

Binary file not shown (before: 1.0 MiB).

Binary file not shown (before: 2.6 MiB).

Binary file not shown (before: 2.2 MiB).

Binary file not shown (before: 1.7 MiB).