Compare commits

...

19 Commits

Author SHA1 Message Date
Oystein Kristoffer Tveit 3c4f6ccf8c
models/BookcaseItem: make shelf non-nullable 2024-07-31 22:36:12 +02:00
Oystein Kristoffer Tveit 8161e5e92a
cli: make devscripts subcommand more accessible 2024-07-28 14:32:13 +02:00
Oystein Kristoffer Tveit 4e356f122a
models/migrations: redo initial migration 2024-07-28 14:12:45 +02:00
Oystein Kristoffer Tveit f62011d6f7
pyproject.toml: add missing deps 2024-07-28 14:09:40 +02:00
Oystein Kristoffer Tveit 2242b3ce74
README: move issues from TODO list to gitea issue tracker 2024-07-27 23:14:27 +02:00
Adrian Gunnar Lauterer b2432e782e
services/metadata_fetchers: init
Co-authored-by: Oystein Kristoffer Tveit <oysteikt@pvv.ntnu.no>
2024-07-27 22:24:34 +02:00
Oystein Kristoffer Tveit ec448c9f57
add more test data 2024-05-17 22:35:18 +02:00
Oystein Kristoffer Tveit 80fafbe3df
cli/prompt_utils: clear some bugs 2024-05-17 22:35:18 +02:00
Oystein Kristoffer Tveit fa180ca354
cli/main: add command to show borrowed/queued items, move `list_bookcases` to advanced 2024-05-17 22:35:17 +02:00
Oystein Kristoffer Tveit b3f80888d5
cli/search: clear some bugs 2024-05-17 22:35:17 +02:00
Oystein Kristoffer Tveit 77175cbb3a
cli/bookcase_item: add command to extend borrowing 2024-05-17 22:35:17 +02:00
Oystein Kristoffer Tveit 36ddc59253
flake.nix: add support for more architectures + misc 2024-04-29 00:16:21 +02:00
Oystein Kristoffer Tveit d999d6710c
README: add further notes on future work with online sources 2024-01-14 03:58:22 +01:00
Oystein Kristoffer Tveit d678b1f525
cli: implement slabbedasker 2024-01-14 03:55:36 +01:00
Oystein Kristoffer Tveit 1e3a24f575
fix csv test data spacing issue 2024-01-14 03:54:59 +01:00
Oystein Kristoffer Tveit 832c95198d
flake.nix: add devShell 2024-01-14 03:41:58 +01:00
Oystein Kristoffer Tveit 03f221a807
misc: small formatting and error checking improvements 2024-01-14 03:41:40 +01:00
Oystein Kristoffer Tveit 369180ff85
deadline-daemon: implement remaining pieces 2024-01-14 03:40:27 +01:00
Oystein Kristoffer Tveit 1550c1f2e3
Add scripts to seed data for testing + misc 2024-01-14 03:39:16 +01:00
33 changed files with 1411 additions and 222 deletions


@@ -42,44 +42,4 @@ Unless provided through the `--config` flag, the program will automatically look for
- `~/.config/worblehat/config.toml`
- `/var/lib/worblehat/config.toml`
Run `poetry run worblehat --help` for more info
## TODO List
### Setting up a database with all of PVVs books
- [ ] Create postgres database
- [ ] Model all bookshelves
- [ ] Scan in all books
### CLI version of the program (this is currently being worked on)
- [X] Ability to pull data from online sources with ISBN
- [X] Ability to create and update bookcases
- [X] Ability to create and update bookcase shelves
- [X] Ability to create and update bookcase items
- [X] Ability to borrow and deliver items
- [ ] Ability to borrow and deliver multiple items at a time
- [X] Ability to enter the queue for borrowing an item
- [ ] Ability to extend a borrowing, only if no one is behind you in the queue
- [ ] Ability to list borrowed items which are overdue
- [~] Ability to search for items
- [ ] Ability to print PVV-specific labels for items missing a label, or which for any other reason needs a custom one
- [X] Ascii art of monkey with fingers in eyes
### Deadline daemon
- [X] Ability to be notified when deadlines are due
- [ ] Ability to be notified when books are available
- [ ] Ability to have expiring queue positions automatically expire
### Web version of the program
- [ ] Ability for PVV members to search for books through the PVV website
## Points of discussion
- Should this project run in a separate tty instance on Dibbler's interface, or should they share the tty with some kind of switching ability?
After some discussion with other PVV members, we came up with an idea where we run the programs in separate ttys, and use a set of large mechanical switches connected to a QMK-flashed microcontroller to switch between them.
- Workaround for not being able to represent items with the same ISBN but different owners: if you are absolutely adamant about placing your item at PVV while still owning it, even though PVV already owns a copy of this item, please print out a new label with a "PVV-ISBN" for it.
Run `poetry run worblehat --help` for more info


@@ -33,5 +33,8 @@ from = 'worblehat@pvv.ntnu.no'
subject_prefix = '[Worblehat]'
[deadline_daemon]
warn_days_before_borrow_deadline = [ "5", "1" ]
warn_days_before_expiring_queue_position_deadline = [ "3", "1" ]
enabled = true
dryrun = false
warn_days_before_borrowing_deadline = [ 5, 1 ]
days_before_queue_position_expires = 14
warn_days_before_expiring_queue_position_deadline = [ 3, 1 ]
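The new keys suggest a simple warning schedule: a notification fires when the number of days remaining until a deadline matches one of the configured offsets. A minimal Python sketch of that logic (the `should_warn` helper and the example dates are hypothetical, not taken from the codebase):

```python
from datetime import date

def should_warn(today: date, deadline: date, warn_days: list[int]) -> bool:
    """True when today is exactly one of the configured offsets before the deadline,
    e.g. warn_days_before_borrowing_deadline = [5, 1]."""
    return (deadline - today).days in warn_days

deadline = date(2024, 8, 10)
print(should_warn(date(2024, 8, 5), deadline, [5, 1]))  # 5 days before -> True
print(should_warn(date(2024, 8, 7), deadline, [5, 1]))  # 3 days before -> False
```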


@@ -1,33 +1,29 @@
isbn, note, bookcase, shelf
9780486809038, emily riehl, arbeidsrom_smal, 5
9781568811307, winning ways, arbeidsrom_smal, 5
9780486458731, cardano, arbeidsrom_smal, 5
9780486462394, alg topo, arbeidsrom_smal, 5
9780582447585, formulae, arbeidsrom_smal, 5
9780486466668, theory of numbers, arbeidsrom_smal, 5
9780486462431, conv surf, arbeidsrom_smal, 5
9780486449685, math fun and earnest, arbeidsrom_smal, 5
9780486417103, lin prog, arbeidsrom_smal, 5
9780130892393, complex analysis, arbeidsrom_smal, 5
9781292024967, abstract alg, arbeidsrom_smal, 5
9780471728979, kreyzig, arbeidsrom_smal, 5
9781847762399, calc 1, arbeidsrom_smal, 5
9781847762399, calc 1 again, arbeidsrom_smal, 5
9781787267763, calc 1 nome, arbeidsrom_smal, 5
9781787267770, calc 2 nome, arbeidsrom_smal, 5
9780199208258, non lin ode, arbeidsrom_smal, 5
9788251915953, tabeller, arbeidsrom_smal, 5
9788251915953, taeller 2, arbeidsrom_smal, 5
9788251915953, tabeller 3, arbeidsrom_smal, 5
9788251915953, tabeller 4, arbeidsrom_smal, 5
9780750304009, fractals and chaos, arbeidsrom_smal, 5
9788241902116, geometri, arbeidsrom_smal, 5
9781620402788, simpsons, arbeidsrom_smal, 5
9781846683459, math curiosities, arbeidsrom_smal, 5
9789810245344, fuzzy logic, arbeidsrom_smal, 5
9781429224048, vect calc, arbeidsrom_smal, 5
9780122407611, gambling, arbeidsrom_smal, 5
9788278220054, rottman slitt, arbeidsrom_smal, 5
9780321748232, prob and stat, arbeidsrom_smal, 5
9780387954752, stats with r, arbeidsrom_smal, 5
9781568814421, maths by exp, arbeidsrom_smal, 5
isbn,note,bookcase,shelf
9780486809038,emily riehl,arbeidsrom_smal,5
9781568811307,winning ways,arbeidsrom_smal,5
9780486458731,cardano,arbeidsrom_smal,5
9780486462394,alg topo,arbeidsrom_smal,5
9780582447585,formulae,arbeidsrom_smal,5
9780486466668,theory of numbers,arbeidsrom_smal,5
9780486462431,conv surf,arbeidsrom_smal,5
9780486449685,math fun and earnest,arbeidsrom_smal,5
9780486417103,lin prog,arbeidsrom_smal,5
9780130892393,complex analysis,arbeidsrom_smal,5
9781292024967,abstract alg,arbeidsrom_smal,5
9780471728979,kreyzig,arbeidsrom_smal,5
9781847762399,calc 1,arbeidsrom_smal,5
9781787267763,calc 1 nome,arbeidsrom_smal,5
9781787267770,calc 2 nome,arbeidsrom_smal,5
9780199208258,non lin ode,arbeidsrom_smal,5
9788251915953,tabeller,arbeidsrom_smal,5
9780750304009,fractals and chaos,arbeidsrom_smal,5
9788241902116,geometri,arbeidsrom_smal,5
9781620402788,simpsons,arbeidsrom_smal,5
9781846683459,math curiosities,arbeidsrom_smal,5
9789810245344,fuzzy logic,arbeidsrom_smal,5
9781429224048,vect calc,arbeidsrom_smal,5
9780122407611,gambling,arbeidsrom_smal,5
9788278220054,rottman slitt,arbeidsrom_smal,5
9780321748232,prob and stat,arbeidsrom_smal,5
9780387954752,stats with r,arbeidsrom_smal,5
9781568814421,maths by exp,arbeidsrom_smal,5
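The change above tightens the header and rows from `isbn, note, bookcase, shelf` to `isbn,note,bookcase,shelf`; with the padded form, Python's `csv` module keeps the leading space in every field after the first by default. A small illustrative sketch (not project code) showing the cleaned format parsing as expected:

```python
import csv
import io

# A slice of the seed data in the tightened format.
data = """isbn,note,bookcase,shelf
9780486809038,emily riehl,arbeidsrom_smal,5
9781568811307,winning ways,arbeidsrom_smal,5
"""

rows = list(csv.DictReader(io.StringIO(data)))
print(rows[0]["note"])   # emily riehl
print(rows[1]["shelf"])  # 5
```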



@@ -2,16 +2,16 @@
"nodes": {
"nixpkgs": {
"locked": {
"lastModified": 1683014792,
"narHash": "sha256-6Va9iVtmmsw4raBc3QKvQT2KT/NGRWlvUlJj46zN8B8=",
"lastModified": 1714272655,
"narHash": "sha256-3/ghIWCve93ngkx5eNPdHIKJP/pMzSr5Wc4rNKE1wOc=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "1a411f23ba299db155a5b45d5e145b85a7aafc42",
"rev": "12430e43bd9b81a6b4e79e64f87c624ade701eaf",
"type": "github"
},
"original": {
"id": "nixpkgs",
"ref": "nixos-unstable",
"ref": "nixos-23.11",
"type": "indirect"
}
},


@@ -1,23 +1,42 @@
{
# inputs.nixpkgs.url = "nixpkgs/nixos-22.11";
inputs.nixpkgs.url = "nixpkgs/nixos-unstable";
inputs.nixpkgs.url = "nixpkgs/nixos-23.11";
outputs = { self, nixpkgs }: let
system = "x86_64-linux";
pkgs = nixpkgs.legacyPackages.${system};
inherit (pkgs) lib;
systems = [
"x86_64-linux"
"aarch64-linux"
"x86_64-darwin"
"aarch64-darwin"
];
forAllSystems = f: nixpkgs.lib.genAttrs systems (system: let
pkgs = nixpkgs.legacyPackages.${system};
in f system pkgs);
in {
apps.${system} = let
app = program: {
apps = forAllSystems (system: pkgs: let
mkApp = program: {
type = "app";
inherit program;
};
in {
default = self.apps.${system}.worblehat;
worblehat = app "${self.packages.${system}.worblehat}/bin/worblehat";
};
default = mkApp self.packages.${system}.default;
});
packages.${system} = {
devShells = forAllSystems (_: pkgs: {
default = pkgs.mkShell {
packages = with pkgs; [
poetry
sqlite
];
shellHook = ''
poetry install
poetry shell && exit
'';
};
});
overlays.default = final: prev: self.packages.${final.system};
packages = forAllSystems (system: pkgs: {
default = self.packages.${system}.worblehat;
worblehat = with pkgs.python3Packages; buildPythonPackage {
pname = "worblehat";
@@ -38,6 +57,6 @@
sqlalchemy
];
};
};
});
};
}
}
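The `forAllSystems` helper added above is `nixpkgs.lib.genAttrs` applied to a fixed list of systems: it turns a per-system function into an attribute set keyed by system name. A rough Python analogue of the pattern, purely illustrative (the names and values are hypothetical):

```python
systems = ["x86_64-linux", "aarch64-linux", "x86_64-darwin", "aarch64-darwin"]

def for_all_systems(f):
    # Like nixpkgs.lib.genAttrs: build {name: f(name)} for each system.
    return {system: f(system) for system in systems}

packages = for_all_systems(lambda system: {"worblehat": f"built for {system}"})
print(packages["aarch64-darwin"]["worblehat"])  # built for aarch64-darwin
```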

poetry.lock generated

@@ -1,10 +1,9 @@
# This file is automatically @generated by Poetry 1.4.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.7.1 and should not be changed by hand.
[[package]]
name = "alembic"
version = "1.10.4"
description = "A database migration tool for SQLAlchemy."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -20,11 +19,31 @@ typing-extensions = ">=4"
[package.extras]
tz = ["python-dateutil"]
[[package]]
name = "beautifulsoup4"
version = "4.12.3"
description = "Screen-scraping library"
optional = false
python-versions = ">=3.6.0"
files = [
{file = "beautifulsoup4-4.12.3-py3-none-any.whl", hash = "sha256:b80878c9f40111313e55da8ba20bdba06d8fa3969fc68304167741bbf9e082ed"},
{file = "beautifulsoup4-4.12.3.tar.gz", hash = "sha256:74e3d1928edc070d21748185c46e3fb33490f22f52a3addee9aee0f4f7781051"},
]
[package.dependencies]
soupsieve = ">1.2"
[package.extras]
cchardet = ["cchardet"]
chardet = ["chardet"]
charset-normalizer = ["charset-normalizer"]
html5lib = ["html5lib"]
lxml = ["lxml"]
[[package]]
name = "blinker"
version = "1.6.2"
description = "Fast, simple object-to-object and broadcast signaling"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -32,11 +51,134 @@ files = [
{file = "blinker-1.6.2.tar.gz", hash = "sha256:4afd3de66ef3a9f8067559fb7a1cbe555c17dcbe15971b05d1b625c3e7abe213"},
]
[[package]]
name = "bs4"
version = "0.0.2"
description = "Dummy package for Beautiful Soup (beautifulsoup4)"
optional = false
python-versions = "*"
files = [
{file = "bs4-0.0.2-py2.py3-none-any.whl", hash = "sha256:abf8742c0805ef7f662dce4b51cca104cffe52b835238afc169142ab9b3fbccc"},
{file = "bs4-0.0.2.tar.gz", hash = "sha256:a48685c58f50fe127722417bae83fe6badf500d54b55f7e39ffe43b798653925"},
]
[package.dependencies]
beautifulsoup4 = "*"
[[package]]
name = "certifi"
version = "2024.7.4"
description = "Python package for providing Mozilla's CA Bundle."
optional = false
python-versions = ">=3.6"
files = [
{file = "certifi-2024.7.4-py3-none-any.whl", hash = "sha256:c198e21b1289c2ab85ee4e67bb4b4ef3ead0892059901a8d5b622f24a1101e90"},
{file = "certifi-2024.7.4.tar.gz", hash = "sha256:5a1e7645bc0ec61a09e26c36f6106dd4cf40c6db3a1fb6352b0244e7fb057c7b"},
]
[[package]]
name = "charset-normalizer"
version = "3.3.2"
description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
optional = false
python-versions = ">=3.7.0"
files = [
{file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
{file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
{file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
{file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
{file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
{file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
{file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
{file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
{file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
{file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
{file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
{file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
{file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
{file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
{file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
{file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
{file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
{file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
{file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
{file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
{file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
{file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
{file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
{file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
{file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
{file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
{file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
{file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
]
[[package]]
name = "click"
version = "8.1.3"
description = "Composable command line interface toolkit"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -51,7 +193,6 @@ colorama = {version = "*", markers = "platform_system == \"Windows\""}
name = "colorama"
version = "0.4.6"
description = "Cross-platform colored terminal text."
category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [
@@ -63,7 +204,6 @@ files = [
name = "flask"
version = "2.3.2"
description = "A simple framework for building complex web applications."
category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -86,7 +226,6 @@ dotenv = ["python-dotenv"]
name = "flask-admin"
version = "1.6.1"
description = "Simple and extensible admin interface framework for Flask"
category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -106,7 +245,6 @@ azure = ["azure-storage-blob"]
name = "flask-sqlalchemy"
version = "3.0.3"
description = "Add SQLAlchemy support to your Flask application."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -122,7 +260,6 @@ SQLAlchemy = ">=1.4.18"
name = "greenlet"
version = "2.0.2"
description = "Lightweight in-process concurrent programming"
category = "main"
optional = false
python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*"
files = [
@@ -131,6 +268,7 @@ files = [
{file = "greenlet-2.0.2-cp27-cp27m-win32.whl", hash = "sha256:6c3acb79b0bfd4fe733dff8bc62695283b57949ebcca05ae5c129eb606ff2d74"},
{file = "greenlet-2.0.2-cp27-cp27m-win_amd64.whl", hash = "sha256:283737e0da3f08bd637b5ad058507e578dd462db259f7f6e4c5c365ba4ee9343"},
{file = "greenlet-2.0.2-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:d27ec7509b9c18b6d73f2f5ede2622441de812e7b1a80bbd446cb0633bd3d5ae"},
{file = "greenlet-2.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d967650d3f56af314b72df7089d96cda1083a7fc2da05b375d2bc48c82ab3f3c"},
{file = "greenlet-2.0.2-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:30bcf80dda7f15ac77ba5af2b961bdd9dbc77fd4ac6105cee85b0d0a5fcf74df"},
{file = "greenlet-2.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26fbfce90728d82bc9e6c38ea4d038cba20b7faf8a0ca53a9c07b67318d46088"},
{file = "greenlet-2.0.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9190f09060ea4debddd24665d6804b995a9c122ef5917ab26e1566dcc712ceeb"},
@@ -139,6 +277,7 @@ files = [
{file = "greenlet-2.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:76ae285c8104046b3a7f06b42f29c7b73f77683df18c49ab5af7983994c2dd91"},
{file = "greenlet-2.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:2d4686f195e32d36b4d7cf2d166857dbd0ee9f3d20ae349b6bf8afc8485b3645"},
{file = "greenlet-2.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:c4302695ad8027363e96311df24ee28978162cdcdd2006476c43970b384a244c"},
{file = "greenlet-2.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d4606a527e30548153be1a9f155f4e283d109ffba663a15856089fb55f933e47"},
{file = "greenlet-2.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c48f54ef8e05f04d6eff74b8233f6063cb1ed960243eacc474ee73a2ea8573ca"},
{file = "greenlet-2.0.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1846f1b999e78e13837c93c778dcfc3365902cfb8d1bdb7dd73ead37059f0d0"},
{file = "greenlet-2.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a06ad5312349fec0ab944664b01d26f8d1f05009566339ac6f63f56589bc1a2"},
@@ -168,6 +307,7 @@ files = [
{file = "greenlet-2.0.2-cp37-cp37m-win32.whl", hash = "sha256:3f6ea9bd35eb450837a3d80e77b517ea5bc56b4647f5502cd28de13675ee12f7"},
{file = "greenlet-2.0.2-cp37-cp37m-win_amd64.whl", hash = "sha256:7492e2b7bd7c9b9916388d9df23fa49d9b88ac0640db0a5b4ecc2b653bf451e3"},
{file = "greenlet-2.0.2-cp38-cp38-macosx_10_15_x86_64.whl", hash = "sha256:b864ba53912b6c3ab6bcb2beb19f19edd01a6bfcbdfe1f37ddd1778abfe75a30"},
{file = "greenlet-2.0.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:1087300cf9700bbf455b1b97e24db18f2f77b55302a68272c56209d5587c12d1"},
{file = "greenlet-2.0.2-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:ba2956617f1c42598a308a84c6cf021a90ff3862eddafd20c3333d50f0edb45b"},
{file = "greenlet-2.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fc3a569657468b6f3fb60587e48356fe512c1754ca05a564f11366ac9e306526"},
{file = "greenlet-2.0.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8eab883b3b2a38cc1e050819ef06a7e6344d4a990d24d45bc6f2cf959045a45b"},
@@ -176,6 +316,7 @@ files = [
{file = "greenlet-2.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:b0ef99cdbe2b682b9ccbb964743a6aca37905fda5e0452e5ee239b1654d37f2a"},
{file = "greenlet-2.0.2-cp38-cp38-win32.whl", hash = "sha256:b80f600eddddce72320dbbc8e3784d16bd3fb7b517e82476d8da921f27d4b249"},
{file = "greenlet-2.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:4d2e11331fc0c02b6e84b0d28ece3a36e0548ee1a1ce9ddde03752d9b79bba40"},
{file = "greenlet-2.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8512a0c38cfd4e66a858ddd1b17705587900dd760c6003998e9472b77b56d417"},
{file = "greenlet-2.0.2-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:88d9ab96491d38a5ab7c56dd7a3cc37d83336ecc564e4e8816dbed12e5aaefc8"},
{file = "greenlet-2.0.2-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:561091a7be172ab497a3527602d467e2b3fbe75f9e783d8b8ce403fa414f71a6"},
{file = "greenlet-2.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:971ce5e14dc5e73715755d0ca2975ac88cfdaefcaab078a284fea6cfabf866df"},
@@ -192,11 +333,21 @@ files = [
docs = ["Sphinx", "docutils (<0.18)"]
test = ["objgraph", "psutil"]
[[package]]
name = "idna"
version = "3.7"
description = "Internationalized Domain Names in Applications (IDNA)"
optional = false
python-versions = ">=3.5"
files = [
{file = "idna-3.7-py3-none-any.whl", hash = "sha256:82fee1fc78add43492d3a1898bfa6d8a904cc97d8427f683ed8e798d07761aa0"},
{file = "idna-3.7.tar.gz", hash = "sha256:028ff3aadf0609c1fd278d8ea3089299412a7a8b9bd005dd08b9f8285bcb5cfc"},
]
[[package]]
name = "isbnlib"
version = "3.10.14"
description = "Extract, clean, transform, hyphenate and metadata for ISBNs (International Standard Book Number)."
category = "main"
optional = false
python-versions = "*"
files = [
@@ -208,7 +359,6 @@ files = [
name = "itsdangerous"
version = "2.1.2"
description = "Safely pass data to untrusted environments and back."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -220,7 +370,6 @@ files = [
name = "jinja2"
version = "3.1.2"
description = "A very fast and expressive template engine."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -238,7 +387,6 @@ i18n = ["Babel (>=2.7)"]
name = "mako"
version = "1.2.4"
description = "A super-fast templating language that borrows the best ideas from the existing templating languages."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -258,7 +406,6 @@ testing = ["pytest"]
name = "markupsafe"
version = "2.1.2"
description = "Safely add untrusted strings to HTML/XML markup."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -318,7 +465,6 @@ files = [
name = "pastel"
version = "0.2.1"
description = "Bring colors to your terminal."
category = "dev"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
files = [
@@ -330,7 +476,6 @@ files = [
name = "poethepoet"
version = "0.20.0"
description = "A task runner that works well with poetry."
category = "dev"
optional = false
python-versions = ">=3.8"
files = [
@@ -349,7 +494,6 @@ poetry-plugin = ["poetry (>=1.0,<2.0)"]
name = "psycopg2-binary"
version = "2.9.6"
description = "psycopg2 - Python-PostgreSQL Database Adapter"
category = "main"
optional = false
python-versions = ">=3.6"
files = [
@@ -417,11 +561,42 @@ files = [
{file = "psycopg2_binary-2.9.6-cp39-cp39-win_amd64.whl", hash = "sha256:f6a88f384335bb27812293fdb11ac6aee2ca3f51d3c7820fe03de0a304ab6249"},
]
[[package]]
name = "requests"
version = "2.32.3"
description = "Python HTTP for Humans."
optional = false
python-versions = ">=3.8"
files = [
{file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"},
{file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"},
]
[package.dependencies]
certifi = ">=2017.4.17"
charset-normalizer = ">=2,<4"
idna = ">=2.5,<4"
urllib3 = ">=1.21.1,<3"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
[[package]]
name = "soupsieve"
version = "2.5"
description = "A modern CSS selector implementation for Beautiful Soup."
optional = false
python-versions = ">=3.8"
files = [
{file = "soupsieve-2.5-py3-none-any.whl", hash = "sha256:eaa337ff55a1579b6549dc679565eac1e3d000563bcb1c8ab0d0fefbc0c2cdc7"},
{file = "soupsieve-2.5.tar.gz", hash = "sha256:5663d5a7b3bfaeee0bc4372e7fc48f9cff4940b3eec54a6451cc5299f1097690"},
]
[[package]]
name = "sqlalchemy"
version = "2.0.12"
description = "Database Abstraction Library"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -499,7 +674,6 @@ sqlcipher = ["sqlcipher3-binary"]
name = "tomli"
version = "2.0.1"
description = "A lil' TOML parser"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
@@ -511,7 +685,6 @@ files = [
name = "typing-extensions"
version = "4.5.0"
description = "Backported and Experimental Type Hints for Python 3.7+"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -519,11 +692,27 @@ files = [
{file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
]
[[package]]
name = "urllib3"
version = "2.2.2"
description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false
python-versions = ">=3.8"
files = [
{file = "urllib3-2.2.2-py3-none-any.whl", hash = "sha256:a448b2f64d686155468037e1ace9f2d2199776e17f0a46610480d311f73e3472"},
{file = "urllib3-2.2.2.tar.gz", hash = "sha256:dd505485549a7a552833da5e6063639d0d177c04f23bc3864e41e5dc5f612168"},
]
[package.extras]
brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
h2 = ["h2 (>=4,<5)"]
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
zstd = ["zstandard (>=0.18.0)"]
[[package]]
name = "werkzeug"
version = "2.3.3"
description = "The comprehensive WSGI web application library."
category = "main"
optional = false
python-versions = ">=3.8"
files = [
@@ -541,7 +730,6 @@ watchdog = ["watchdog (>=2.3)"]
name = "wtforms"
version = "3.0.1"
description = "Form validation and rendering for Python web development."
category = "main"
optional = false
python-versions = ">=3.7"
files = [
@@ -558,4 +746,4 @@ email = ["email-validator"]
[metadata]
lock-version = "2.0"
python-versions = "^3.11"
content-hash = "123df985006374b7c4ede4587a2facef89306039e35af84ddc9c516eecd46c89"
content-hash = "4498f324fc36a557e7b63b693fea8a11388b127e86dcc17ab67e087dce3a9ec3"


@@ -16,6 +16,8 @@ isbnlib = "^3.10.14"
python = "^3.11"
sqlalchemy = "^2.0.8"
psycopg2-binary = "^2.9.6"
requests = "^2.32.3"
bs4 = "^0.0.2"
[tool.poetry.group.dev.dependencies]
werkzeug = "^2.3.3"
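The new `requests` and `bs4` dependencies back the metadata fetchers introduced in this series (`services/metadata_fetchers: init`). As a rough illustration of that kind of fetcher, here is a stdlib-only sketch against Open Library's public JSON endpoint — the endpoint, field names, and function names are assumptions, not the project's actual fetcher code:

```python
import json
from urllib.request import urlopen

# Public JSON record per ISBN; the real fetchers may use other sources.
OPENLIBRARY_URL = 'https://openlibrary.org/isbn/{isbn}.json'

def parse_metadata(payload: dict) -> dict:
    """Pick out the fields a fetcher would care about from a raw record."""
    return {
        'title': payload.get('title'),
        'publish_date': payload.get('publish_date'),
        'number_of_pages': payload.get('number_of_pages'),
    }

def fetch_metadata(isbn: str) -> dict:
    """Fetch and parse metadata for one ISBN (network access required)."""
    with urlopen(OPENLIBRARY_URL.format(isbn=isbn)) as response:
        return parse_metadata(json.load(response))
```

Keeping the parsing separate from the HTTP call makes the interesting part testable without the network.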


@@ -1,8 +1,8 @@
from textwrap import dedent
from sqlalchemy import (
event,
select,
event,
select,
)
from sqlalchemy.orm import Session
@@ -61,26 +61,6 @@ class WorblehatCli(NumberedCmd):
exit(0)
def do_list_bookcases(self, _: str):
bookcase_shelfs = self.sql_session.scalars(
select(BookcaseShelf)
.join(Bookcase)
.order_by(
Bookcase.name,
BookcaseShelf.column,
BookcaseShelf.row,
)
).all()
bookcase_uid = None
for shelf in bookcase_shelfs:
if shelf.bookcase.uid != bookcase_uid:
print(shelf.bookcase.short_str())
bookcase_uid = shelf.bookcase.uid
print(f' {shelf.short_str()} - {sum(i.amount for i in shelf.items)} items')
def do_show_bookcase(self, arg: str):
bookcase_selector = InteractiveItemSelector(
cls = Bookcase,
@@ -95,6 +75,35 @@ class WorblehatCli(NumberedCmd):
print(f' {item.name} - {item.amount} copies')
def do_show_borrowed_queued(self, _: str):
borrowed_items = self.sql_session.scalars(
select(BookcaseItemBorrowing)
.where(BookcaseItemBorrowing.delivered.is_(None))
.order_by(BookcaseItemBorrowing.end_time),
).all()
if len(borrowed_items) == 0:
print('No borrowed items found.')
else:
print('Borrowed items:')
for item in borrowed_items:
print(f'- {item.username} - {item.item.name} - to be delivered by {item.end_time.strftime("%Y-%m-%d")}')
print()
queued_items = self.sql_session.scalars(
select(BookcaseItemBorrowingQueue)
.order_by(BookcaseItemBorrowingQueue.entered_queue_time),
).all()
if len(queued_items) == 0:
print('No queued items found.')
else:
print('Users in queue:')
for item in queued_items:
print(f'- {item.username} - {item.item.name} - entered queue at {item.entered_queue_time.strftime("%Y-%m-%d")}')
def _create_bookcase_item(self, isbn: str):
bookcase_item = create_bookcase_item_from_isbn(isbn, self.sql_session)
if bookcase_item is None:
@@ -149,6 +158,8 @@ class WorblehatCli(NumberedCmd):
if (existing_item := self.sql_session.scalars(
select(BookcaseItem)
.where(BookcaseItem.isbn == isbn)
.join(BookcaseItemBorrowing)
.join(BookcaseItemBorrowingQueue)
).one_or_none()) is not None:
print(f'\nFound existing item for isbn "{isbn}"')
BookcaseItemCli(
@@ -171,6 +182,28 @@ class WorblehatCli(NumberedCmd):
).cmdloop()
def do_show_slabbedasker(self, _: str):
slubberter = self.sql_session.scalars(
select(BookcaseItemBorrowing)
.join(BookcaseItem)
.where(
BookcaseItemBorrowing.end_time < datetime.now(),
BookcaseItemBorrowing.delivered.is_(None),
)
.order_by(
BookcaseItemBorrowing.end_time,
),
).all()
if len(slubberter) == 0:
print('No slubberts found. Life is good.')
return
for slubbert in slubberter:
print('Slubberter:')
print(f'- {slubbert.username} - {slubbert.item.name} - {slubbert.end_time.strftime("%Y-%m-%d")}')
def do_advanced(self, _: str):
AdvancedOptionsCli(self.sql_session).cmdloop()
@@ -191,10 +224,10 @@ class WorblehatCli(NumberedCmd):
def do_exit(self, _: str):
if self.sql_session_dirty:
if prompt_yes_no('Would you like to save your changes?'):
self.sql_session.commit()
else:
self.sql_session.rollback()
if prompt_yes_no('Would you like to save your changes?'):
self.sql_session.commit()
else:
self.sql_session.rollback()
exit(0)
@@ -204,26 +237,30 @@ class WorblehatCli(NumberedCmd):
'doc': 'Choose / Add item with its ISBN',
},
1: {
'f': do_list_bookcases,
'doc': 'List all bookcases',
},
2: {
'f': do_search,
'doc': 'Search',
},
3: {
2: {
'f': do_show_bookcase,
'doc': 'Show a bookcase, and its items',
},
3: {
'f': do_show_borrowed_queued,
'doc': 'Show borrowed/queued items',
},
4: {
'f': do_show_slabbedasker,
'doc': 'Show slabbedasker',
},
5: {
'f': do_save,
'doc': 'Save changes',
},
5: {
6: {
'f': do_abort,
'doc': 'Abort changes',
},
6: {
7: {
'f': do_advanced,
'doc': 'Advanced options',
},
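The numbered `funcs` tables above drive the `NumberedCmd` menus. The base class itself is not part of this diff, but the dispatch pattern it implies can be sketched as a toy stand-in (class and method names here are assumptions, not the project's actual implementation):

```python
class NumberedMenu:
    """Toy stand-in for NumberedCmd: a {number: {'f': ..., 'doc': ...}} table
    is rendered as a numbered menu and used to dispatch input to handlers."""

    def __init__(self, funcs):
        self.funcs = funcs

    def menu(self) -> str:
        # Render one "<number>) <doc>" line per registered handler
        return '\n'.join(f"{i}) {entry['doc']}" for i, entry in sorted(self.funcs.items()))

    def dispatch(self, line: str):
        try:
            entry = self.funcs[int(line.strip())]
        except (ValueError, KeyError):
            return None  # not a number, or no handler registered for it
        return entry['f'](line)
```

This keeps the menu ordering and handler lookup in one table, which is why renumbering entries (as several commits in this series do) only touches the dict keys.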


@@ -58,6 +58,7 @@ class InteractiveItemSelector(Cmd):
self.execute_selection = execute_selection
self.complete_selection = complete_selection
self.default_item = default
self.result = None
if default is not None:
self.prompt = f'Select {cls.__name__} [{default.name}]> '
@@ -197,6 +198,7 @@ class NumberedItemSelector(NumberedCmd):
super().__init__()
self.items = items
self.stringify = stringify
self.result = None
self.funcs = {
i: {
'f': self._select_item,


@@ -91,6 +91,26 @@ class AdvancedOptionsCli(NumberedCmd):
self.sql_session.flush()
def do_list_bookcases(self, _: str):
bookcase_shelfs = self.sql_session.scalars(
select(BookcaseShelf)
.join(Bookcase)
.order_by(
Bookcase.name,
BookcaseShelf.column,
BookcaseShelf.row,
)
).all()
bookcase_uid = None
for shelf in bookcase_shelfs:
if shelf.bookcase.uid != bookcase_uid:
print(shelf.bookcase.short_str())
bookcase_uid = shelf.bookcase.uid
print(f' {shelf.short_str()} - {sum(i.amount for i in shelf.items)} items')
def do_done(self, _: str):
return True
@@ -104,6 +124,10 @@
'f': do_add_bookcase_shelf,
'doc': 'Add bookcase shelf',
},
3: {
'f': do_list_bookcases,
'doc': 'List all bookcases',
},
9: {
'f': do_done,
'doc': 'Done',


@@ -1,4 +1,4 @@
from datetime import datetime
from datetime import datetime, timedelta
from textwrap import dedent
from sqlalchemy import select
@@ -7,6 +7,7 @@ from sqlalchemy.orm import Session
from worblehat.cli.prompt_utils import (
InteractiveItemSelector,
NumberedCmd,
NumberedItemSelector,
format_date,
prompt_yes_no,
)
@@ -22,6 +23,7 @@ from worblehat.services.bookcase_item import (
create_bookcase_item_from_isbn,
is_valid_isbn,
)
from worblehat.services.config import Config
from .bookcase_shelf_selector import select_bookcase_shelf
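`is_valid_isbn` is imported from `worblehat.services.bookcase_item` but its body is not shown in this diff. For reference, ISBN-13 validity is a weighted mod-10 checksum; a standalone sketch of that check (not necessarily the project's actual implementation, which may also handle ISBN-10):

```python
def is_valid_isbn13(isbn: str) -> bool:
    """Check the ISBN-13 checksum: digits are weighted 1,3,1,3,... and the
    weighted sum must be divisible by 10."""
    digits = isbn.replace('-', '').replace(' ', '')
    if len(digits) != 13 or not digits.isdigit():
        return False
    checksum = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return checksum % 10 == 0
```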
@@ -174,6 +176,50 @@ class BookcaseItemCli(NumberedCmd):
print(f'Successfully delivered the item for {borrowing.username}')
def do_extend_borrowing(self, _: str):
borrowings = self.sql_session.scalars(
select(BookcaseItemBorrowing)
.join(BookcaseItem, BookcaseItem.uid == BookcaseItemBorrowing.fk_bookcase_item_uid)
.where(BookcaseItem.isbn == self.bookcase_item.isbn)
.order_by(BookcaseItemBorrowing.username)
).all()
if len(borrowings) == 0:
print('No one seems to have borrowed this item')
return
borrowing_queue = self.sql_session.scalars(
select(BookcaseItemBorrowingQueue)
.where(
BookcaseItemBorrowingQueue.item == self.bookcase_item,
BookcaseItemBorrowingQueue.item_became_available_time.is_(None),
)
.order_by(BookcaseItemBorrowingQueue.entered_queue_time)
).all()
if len(borrowing_queue) != 0:
print('Sorry, you cannot extend the borrowing because there are people waiting in the queue')
print('Borrowing queue:')
for i, b in enumerate(borrowing_queue):
print(f' {i + 1}) {b.username}')
return
print('Who are you?')
selector = NumberedItemSelector(
items = list(borrowings),
stringify = lambda b: f'{b.username} - Until {format_date(b.end_time)}',
)
selector.cmdloop()
if selector.result is None:
return
borrowing = selector.result
borrowing.end_time = datetime.now() + timedelta(days=int(Config['deadline_daemon.days_before_queue_position_expires']))
self.sql_session.flush()
print(f'Successfully extended the borrowing for {borrowing.username} until {format_date(borrowing.end_time)}')
def do_done(self, _: str):
return True
@@ -188,10 +234,14 @@
'doc': 'Deliver',
},
3: {
'f': do_extend_borrowing,
'doc': 'Extend borrowing',
},
4: {
'f': do_edit,
'doc': 'Edit',
},
4: {
5: {
'f': do_update_data,
'doc': 'Pull updated data from online databases',
},


@@ -13,6 +13,7 @@ class SearchCli(NumberedCmd):
def __init__(self, sql_session: Session):
super().__init__()
self.sql_session = sql_session
self.result = None
def do_search_all(self, _: str):
@@ -57,11 +58,11 @@ class SearchCli(NumberedCmd):
elif len(author) == 1:
selected_author = author[0]
print('Found author:')
print(f" {selected_author.name} ({sum(item.amount for item in selected_author.books)} items)")
print(f" {selected_author.name} ({sum(item.amount for item in selected_author.items)} items)")
else:
selector = NumberedItemSelector(
items = author,
stringify = lambda author: f"{author.name} ({sum(item.amount for item in author.books)} items)",
stringify = lambda author: f"{author.name} ({sum(item.amount for item in author.items)} items)",
)
selector.cmdloop()
if selector.result is None:
@@ -69,7 +70,7 @@ class SearchCli(NumberedCmd):
selected_author = selector.result
selector = NumberedItemSelector(
items = selected_author.books,
items = list(selected_author.items),
stringify = lambda item: f"{item.name} ({item.isbn})",
)
selector.cmdloop()


@@ -11,14 +11,25 @@ from worblehat.models import (
DeadlineDaemonLastRunDatetime,
BookcaseItemBorrowingQueue,
)
from worblehat.services.email import send_email
class DeadlineDaemon:
def __init__(self, sql_session: Session):
if not Config['deadline_daemon.enabled']:
return
self.sql_session = sql_session
self.last_run = self.sql_session.scalars(
select(DeadlineDaemonLastRunDatetime),
).one()
).one_or_none()
if self.last_run is None:
logging.info('No previous run found, assuming this is the first run')
self.last_run = DeadlineDaemonLastRunDatetime(time=datetime.now())
self.sql_session.add(self.last_run)
self.sql_session.commit()
self.last_run_datetime = self.last_run.time
self.current_run_datetime = datetime.now()
@@ -26,6 +37,12 @@
def run(self):
logging.info('Deadline daemon started')
if not Config['deadline_daemon.enabled']:
logging.warning('Deadline daemon disabled, exiting')
return
if Config['deadline_daemon.dryrun']:
logging.warning('Running in dryrun mode')
self.send_close_deadline_reminder_mails()
self.send_overdue_mails()
@@ -36,6 +53,97 @@
self.last_run.time = self.current_run_datetime
self.sql_session.commit()
###################
# EMAIL TEMPLATES #
###################
def _send_close_deadline_mail(self, borrowing: BookcaseItemBorrowing):
logging.info(f'Sending close deadline mail to {borrowing.username}@pvv.ntnu.no.')
send_email(
f'{borrowing.username}@pvv.ntnu.no',
'Reminder - Your borrowing deadline is approaching',
dedent(f'''
Your borrowing deadline for the following item is approaching:
{borrowing.item.name}
Please return the item by {borrowing.end_time.strftime("%a %b %d, %Y")}
''',
).strip(),
)
def _send_overdue_mail(self, borrowing: BookcaseItemBorrowing):
logging.info(f'Sending overdue mail to {borrowing.username}@pvv.ntnu.no for {borrowing.item.isbn} - {borrowing.end_time.strftime("%a %b %d, %Y")}')
send_email(
f'{borrowing.username}@pvv.ntnu.no',
'Your deadline has passed',
dedent(f'''
Your delivery deadline for the following item has passed:
{borrowing.item.name}
Please return the item as soon as possible.
''',
).strip(),
)
def _send_newly_available_mail(self, queue_item: BookcaseItemBorrowingQueue):
logging.info(f'Sending newly available mail to {queue_item.username}')
days_before_queue_expires = Config['deadline_daemon.days_before_queue_position_expires']
# TODO: calculate and format the date of when the queue position expires in the mail.
send_email(
f'{queue_item.username}@pvv.ntnu.no',
'An item you have queued for is now available',
dedent(f'''
The following item is now available for you to borrow:
{queue_item.item.name}
Please pick up the item within {days_before_queue_expires} days.
''',
).strip(),
)
def _send_expiring_queue_position_mail(self, queue_position: BookcaseItemBorrowingQueue, day: int):
logging.info(f'Sending queue position expiry reminder to {queue_position.username}@pvv.ntnu.no.')
send_email(
f'{queue_position.username}@pvv.ntnu.no',
'Reminder - Your queue position expiry deadline is approaching',
dedent(f'''
Your queue position expiry deadline for the following item is approaching:
{queue_position.item.name}
Please borrow the item by {(queue_position.item_became_available_time + timedelta(days=day)).strftime("%a %b %d, %Y")}
''',
).strip(),
)
def _send_queue_position_expired_mail(self, queue_position: BookcaseItemBorrowingQueue):
send_email(
f'{queue_position.username}@pvv.ntnu.no',
'Your queue position has expired',
dedent(f'''
Your queue position for the following item has expired:
{queue_position.item.name}
You can queue for the item again at any time, but you will be placed at the back of the queue.
There are currently {len(queue_position.item.borrowing_queue)} users in the queue.
''',
).strip(),
)
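The templates above all go through `send_email` from `worblehat.services.email`, which is not part of this diff. A minimal sketch of what such a helper might look like using the stdlib (the sender address and SMTP host are assumptions; building the message is split out so it can be tested without an MTA):

```python
import smtplib
from email.message import EmailMessage

def build_email(to_addr: str, subject: str, body: str,
                from_addr: str = 'worblehat@pvv.ntnu.no') -> EmailMessage:
    """Assemble a plain-text message; pure function, easy to unit test."""
    msg = EmailMessage()
    msg['From'] = from_addr
    msg['To'] = to_addr
    msg['Subject'] = subject
    msg.set_content(body)
    return msg

def send_email(to_addr: str, subject: str, body: str, host: str = 'localhost') -> None:
    """Deliver via a local MTA (hypothetical setup, not the project's actual code)."""
    with smtplib.SMTP(host) as smtp:
        smtp.send_message(build_email(to_addr, subject, body))
```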
##################
# EMAIL ROUTINES #
##################
def _sql_subtract_date(self, x: datetime, y: timedelta):
if self.sql_session.bind.dialect.name == 'sqlite':
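The body of `_sql_subtract_date` is cut off by the hunk boundary; the dialect check suggests it works around SQLite storing datetimes as text with no native interval arithmetic. A rough standalone illustration of the idea, rendering plain SQL strings rather than the project's actual SQLAlchemy expressions:

```python
from datetime import timedelta

def subtract_date_sql(dialect_name: str, column: str, delta: timedelta) -> str:
    """Render 'column - delta' per dialect: SQLite needs the datetime() SQL
    function, while e.g. PostgreSQL can subtract an interval directly."""
    if dialect_name == 'sqlite':
        return f"datetime({column}, '-{int(delta.total_seconds())} seconds')"
    return f"{column} - interval '{delta.days} days'"
```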
@@ -48,10 +156,10 @@
def send_close_deadline_reminder_mails(self):
logging.info('Sending mails about items with a closing deadline')
logging.info('Sending mails for items with a closing deadline')
# TODO: This should be int-parsed and validated before the daemon started
days = [int(d) for d in Config['deadline_daemon.warn_days_before_borrow_deadline']]
days = [int(d) for d in Config['deadline_daemon.warn_days_before_borrowing_deadline']]
for day in days:
borrowings_to_remind = self.sql_session.scalars(
@@ -69,23 +177,11 @@
),
).all()
for borrowing in borrowings_to_remind:
logging.info(f' Sending close deadline mail to {borrowing.username}@pvv.ntnu.no. {day} days left')
send_email(
f'{borrowing.username}@pvv.ntnu.no',
'Reminder - Your borrowing deadline is approaching',
dedent(f'''
Your borrowing deadline for the following item is approaching:
{borrowing.item.name}
Please return the item by {borrowing.end_time.strftime("%a %b %d, %Y")}
''',
).strip(),
)
self._send_close_deadline_mail(borrowing)
def send_overdue_mails(self):
logging.info('Sending mails about overdue items')
logging.info('Sending mails for overdue items')
to_remind = self.sql_session.scalars(
select(BookcaseItemBorrowing)
@@ -96,19 +192,7 @@
).all()
for borrowing in to_remind:
logging.info(f' Sending overdue mail to {borrowing.username}@pvv.ntnu.no for {borrowing.item.isbn} - {borrowing.end_time.strftime("%a %b %d, %Y")}')
send_email(
f'{borrowing.username}@pvv.ntnu.no',
'Your deadline has passed',
dedent(f'''
Your delivery deadline for the following item has passed:
{borrowing.item.name}
Please return the item as soon as possible.
''',
).strip(),
)
self._send_overdue_mail(borrowing)
def send_newly_available_mails(self):
@@ -133,15 +217,75 @@
).all()
for queue_item in newly_available:
logging.info(f'Sending newly available mail to {queue_item.username}')
logging.warning('Not implemented')
logging.info(f'Adding user {queue_item.username} to queue for {queue_item.item.name}')
queue_item.item_became_available_time = self.current_run_datetime
self.sql_session.commit()
self._send_newly_available_mail(queue_item)
def send_expiring_queue_position_mails(self):
logging.info('Sending mails about queue positions which are expiring soon')
logging.warning('Not implemented')
days = [int(d) for d in Config['deadline_daemon.warn_days_before_expiring_queue_position_deadline']]
for day in days:
queue_positions_to_remind = self.sql_session.scalars(
select(BookcaseItemBorrowingQueue)
.join(
BookcaseItemBorrowing,
BookcaseItemBorrowing.fk_bookcase_item_uid == BookcaseItemBorrowingQueue.fk_bookcase_item_uid,
)
.where(
self._sql_subtract_date(
BookcaseItemBorrowingQueue.item_became_available_time + timedelta(days=day),
timedelta(days=day),
)
.between(
self.last_run_datetime,
self.current_run_datetime,
),
),
).all()
for queue_position in queue_positions_to_remind:
self._send_expiring_queue_position_mail(queue_position, day)
def auto_expire_queue_positions(self):
logging.info('Expiring queue positions which are too old')
logging.warning('Not implemented')
queue_position_expiry_days = int(Config['deadline_daemon.days_before_queue_position_expires'])
overdue_queue_positions = self.sql_session.scalars(
select(BookcaseItemBorrowingQueue)
.where(
BookcaseItemBorrowingQueue.item_became_available_time + timedelta(days=queue_position_expiry_days) < self.current_run_datetime,
BookcaseItemBorrowingQueue.expired.is_(False),
),
).all()
for queue_position in overdue_queue_positions:
logging.info(f'Expiring queue position for {queue_position.username} for item {queue_position.item.name}')
queue_position.expired = True
next_queue_position = self.sql_session.scalars(
select(BookcaseItemBorrowingQueue)
.where(
BookcaseItemBorrowingQueue.fk_bookcase_item_uid == queue_position.fk_bookcase_item_uid,
BookcaseItemBorrowingQueue.item_became_available_time.is_(None),
)
.order_by(BookcaseItemBorrowingQueue.entered_queue_time)
.limit(1),
).one_or_none()
self._send_queue_position_expired_mail(queue_position)
if next_queue_position is not None:
next_queue_position.item_became_available_time = self.current_run_datetime
logging.info(f'Next user in queue for item {next_queue_position.item.name} is {next_queue_position.username}')
self._send_newly_available_mail(next_queue_position)
self.sql_session.commit()


@@ -0,0 +1,117 @@
from datetime import datetime, timedelta
from worblehat.models import (
BookcaseItem,
BookcaseItemBorrowing,
BookcaseItemBorrowingQueue,
DeadlineDaemonLastRunDatetime,
)
from worblehat.services.config import Config
from .seed_test_data import main as seed_test_data_main
def clear_db(sql_session):
sql_session.query(BookcaseItemBorrowingQueue).delete()
sql_session.query(BookcaseItemBorrowing).delete()
sql_session.query(DeadlineDaemonLastRunDatetime).delete()
sql_session.commit()
# NOTE: feel free to change this function to suit your needs
# it's just a quick and dirty way to get some data into the database
# for testing the deadline daemon - oysteikt 2024
def main(sql_session):
borrow_warning_days = [timedelta(days=int(d)) for d in Config['deadline_daemon.warn_days_before_borrowing_deadline']]
queue_warning_days = [timedelta(days=int(d)) for d in Config['deadline_daemon.warn_days_before_expiring_queue_position_deadline']]
queue_expire_days = int(Config['deadline_daemon.days_before_queue_position_expires'])
clear_db(sql_session)
seed_test_data_main(sql_session)
books = sql_session.query(BookcaseItem).all()
last_run_datetime = datetime.now() - timedelta(days=16)
last_run = DeadlineDaemonLastRunDatetime(last_run_datetime)
sql_session.add(last_run)
# Create at least one item that is borrowed and not supposed to be returned yet
borrowing = BookcaseItemBorrowing(
item=books[0],
username='test_borrower_still_borrowing',
)
borrowing.start_time = last_run_datetime - timedelta(days=1)
borrowing.end_time = datetime.now() - timedelta(days=6)
sql_session.add(borrowing)
# Create at least one item that is borrowed and is supposed to be returned soon
borrowing = BookcaseItemBorrowing(
item=books[1],
username='test_borrower_return_soon',
)
borrowing.start_time = last_run_datetime - timedelta(days=1)
borrowing.end_time = datetime.now() - timedelta(days=2)
sql_session.add(borrowing)
# Create at least one item that is borrowed and is overdue
borrowing = BookcaseItemBorrowing(
item=books[2],
username='test_borrower_overdue',
)
borrowing.start_time = datetime.now() - timedelta(days=1)
borrowing.end_time = datetime.now() + timedelta(days=1)
sql_session.add(borrowing)
# Create at least one item that is in the queue and is not supposed to be borrowed yet
queue_item = BookcaseItemBorrowingQueue(
item=books[3],
username='test_queue_user_still_waiting',
)
queue_item.entered_queue_time = last_run_datetime - timedelta(days=1)
borrowing = BookcaseItemBorrowing(
item=books[3],
username='test_borrower_return_soon',
)
borrowing.start_time = last_run_datetime - timedelta(days=1)
borrowing.end_time = datetime.now() - timedelta(days=2)
sql_session.add(queue_item)
sql_session.add(borrowing)
# Create at least three items that are in the queue, two of which were just returned
for i in range(3):
queue_item = BookcaseItemBorrowingQueue(
item=books[4 + i],
username=f'test_queue_user_{i}',
)
sql_session.add(queue_item)
for i in range(3):
borrowing = BookcaseItemBorrowing(
item=books[4 + i],
username=f'test_borrower_returned_{i}',
)
borrowing.start_time = last_run_datetime - timedelta(days=2)
borrowing.end_time = datetime.now() + timedelta(days=1)
if i != 2:
borrowing.delivered = datetime.now() - timedelta(days=1)
sql_session.add(borrowing)
# Create at least one item that has been in the queue for so long that the queue position should expire
queue_item = BookcaseItemBorrowingQueue(
item=books[7],
username='test_queue_user_expired',
)
queue_item.entered_queue_time = datetime.now() - timedelta(days=15)
# Create at least one item that has been in the queue for so long that the queue position should expire,
# but the queue person has already been notified
queue_item = BookcaseItemBorrowingQueue(
item=books[8],
username='test_queue_user_expired_notified',
)
queue_item.entered_queue_time = datetime.now() - timedelta(days=15)
sql_session.commit()


@@ -0,0 +1,87 @@
import csv
from pathlib import Path
from datetime import datetime, timedelta
from worblehat.models import (
Bookcase,
BookcaseItem,
BookcaseShelf,
MediaType,
Language,
)
from worblehat.services.config import Config
CSV_FILE = Path(__file__).parent.parent.parent / 'data' / 'arbeidsrom_smal_hylle_5.csv'
def clear_db(sql_session):
sql_session.query(BookcaseItem).delete()
sql_session.query(BookcaseShelf).delete()
sql_session.query(Bookcase).delete()
sql_session.query(MediaType).delete()
sql_session.query(Language).delete()
sql_session.commit()
def main(sql_session):
clear_db(sql_session)
media_type = MediaType(
name='Book',
description='A book',
)
sql_session.add(media_type)
language = Language(
name='Norwegian',
iso639_1_code='no',
)
sql_session.add(language)
seed_case = Bookcase(
name='seed_case',
description='test bookcase with test data',
)
sql_session.add(seed_case)
seed_shelf_1 = BookcaseShelf(
row=1,
column=1,
bookcase=seed_case,
description='test shelf with test data 1',
)
seed_shelf_2 = BookcaseShelf(
row=2,
column=1,
bookcase=seed_case,
description='test shelf with test data 2',
)
sql_session.add(seed_shelf_1)
sql_session.add(seed_shelf_2)
bookcase_items = []
with open(CSV_FILE) as csv_file:
csv_reader = csv.reader(csv_file, delimiter=',')
next(csv_reader)
for row in csv_reader:
item = BookcaseItem(
isbn=row[0],
name=row[1],
)
item.media_type = media_type
item.language = language
bookcase_items.append(item)
half = len(bookcase_items) // 2
first_half = bookcase_items[:half]
second_half = bookcase_items[half:]
for item in first_half:
seed_shelf_1.items.add(item)
for item in second_half:
seed_shelf_2.items.add(item)
sql_session.add_all(bookcase_items)
sql_session.commit()
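The seed script above assumes a CSV file with a header row and the ISBN and title in the first two columns. A tiny self-contained illustration of that expected shape (the sample values are invented, not from the project's `arbeidsrom_smal_hylle_5.csv`):

```python
import csv
import io

SAMPLE_CSV = """isbn,title
9780306406157,Example Book One
9783161484100,Example Book Two
"""

reader = csv.reader(io.StringIO(SAMPLE_CSV), delimiter=',')
next(reader)  # skip the header row, as the seed script does
rows = [(row[0], row[1]) for row in reader]
```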


@@ -7,6 +7,7 @@ from sqlalchemy.orm import Session
from .services import (
Config,
arg_parser,
devscripts_arg_parser,
)
from .deadline_daemon import DeadlineDaemon
@@ -60,6 +61,19 @@ def main():
WorblehatCli.run_with_safe_exit_wrapper(sql_session)
exit(0)
if args.command == 'devscripts':
sql_session = _connect_to_database(echo=Config['logging.debug_sql'])
if args.script == 'seed-content-for-deadline-daemon':
from .devscripts.seed_content_for_deadline_daemon import main
main(sql_session)
elif args.script == 'seed-test-data':
from .devscripts.seed_test_data import main
main(sql_session)
else:
print(devscripts_arg_parser.format_help())
exit(1)
exit(0)
if args.command == 'flask-dev':
flask_dev_main()
exit(0)
@@ -70,4 +84,5 @@ def main():
flask_prod_main()
exit(0)
print(arg_parser.format_help())
print(arg_parser.format_help())
exit(1)


@@ -2,10 +2,11 @@ from __future__ import annotations
from typing import TYPE_CHECKING
from sqlalchemy import (
ForeignKey,
Integer,
SmallInteger,
String,
ForeignKey,
Text,
)
from sqlalchemy.orm import (
Mapped,
@@ -31,13 +32,16 @@ if TYPE_CHECKING:
from .Language import Language
from .MediaType import MediaType
class BookcaseItem(Base, UidMixin, UniqueNameMixin):
from worblehat.flaskapp.database import db
class BookcaseItem(Base, UidMixin):
isbn: Mapped[int] = mapped_column(String, unique=True, index=True)
name: Mapped[str] = mapped_column(Text, index=True)
owner: Mapped[str] = mapped_column(String, default='PVV')
amount: Mapped[int] = mapped_column(SmallInteger, default=1)
fk_media_type_uid: Mapped[int] = mapped_column(ForeignKey('MediaType.uid'))
fk_bookcase_shelf_uid: Mapped[int | None] = mapped_column(ForeignKey('BookcaseShelf.uid'))
fk_bookcase_shelf_uid: Mapped[int] = mapped_column(ForeignKey('BookcaseShelf.uid'))
fk_language_uid: Mapped[int | None] = mapped_column(ForeignKey('Language.uid'))
media_type: Mapped[MediaType] = relationship(back_populates='items')
@@ -63,4 +67,13 @@ class BookcaseItem(Base, UidMixin, UniqueNameMixin):
):
self.name = name
self.isbn = isbn
self.owner = owner
self.owner = owner
@classmethod
def get_by_isbn(cls, isbn: str, sql_session: Session = db.session) -> Self | None:
"""
NOTE:
This method defaults to using the flask_sqlalchemy session.
It will not work outside of a request context, unless another session is provided.
"""
return sql_session.query(cls).where(cls.isbn == isbn).one_or_none()


@@ -21,7 +21,8 @@ if TYPE_CHECKING:
class BookcaseItemBorrowingQueue(Base, UidMixin):
username: Mapped[str] = mapped_column(String)
entered_queue_time = mapped_column(DateTime, default=datetime.now())
entered_queue_time: Mapped[datetime] = mapped_column(DateTime, default=datetime.now())
item_became_available_time: Mapped[datetime | None] = mapped_column(DateTime)
expired = mapped_column(Boolean, default=False)
fk_bookcase_item_uid: Mapped[int] = mapped_column(ForeignKey('BookcaseItem.uid'), index=True)


@@ -20,4 +20,8 @@ class DeadlineDaemonLastRunDatetime(Base):
),
)
uid: Mapped[bool] = mapped_column(Boolean, primary_key=True, default=True)
time: Mapped[datetime] = mapped_column(DateTime, default=datetime.now())
def __init__(self, time: datetime | None = None):
if time is not None:
self.time = time


@@ -1,8 +1,8 @@
"""initial_migration
Revision ID: d51c7172d2f2
Revision ID: 7dfbf8a8dec8
Revises:
Create Date: 2023-05-06 17:46:39.230122
Create Date: 2024-07-31 21:07:13.434012
"""
from alembic import op
@@ -10,7 +10,7 @@ import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'd51c7172d2f2'
revision = '7dfbf8a8dec8'
down_revision = None
branch_labels = None
depends_on = None
@@ -44,6 +44,12 @@ def upgrade() -> None:
with op.batch_alter_table('Category', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_Category_name'), ['name'], unique=True)
op.create_table('DeadlineDaemonLastRunDatetime',
sa.Column('uid', sa.Boolean(), nullable=False),
sa.Column('time', sa.DateTime(), nullable=False),
sa.CheckConstraint('uid = true', name=op.f('ck_DeadlineDaemonLastRunDatetime_`single_row_only`')),
sa.PrimaryKeyConstraint('uid', name=op.f('pk_DeadlineDaemonLastRunDatetime'))
)
op.create_table('Language',
sa.Column('iso639_1_code', sa.String(length=2), nullable=False),
sa.Column('uid', sa.Integer(), nullable=False),
@@ -75,13 +81,13 @@ def upgrade() -> None:
)
op.create_table('BookcaseItem',
sa.Column('isbn', sa.String(), nullable=False),
sa.Column('name', sa.Text(), nullable=False),
sa.Column('owner', sa.String(), nullable=False),
sa.Column('amount', sa.SmallInteger(), nullable=False),
sa.Column('fk_media_type_uid', sa.Integer(), nullable=False),
sa.Column('fk_bookcase_shelf_uid', sa.Integer(), nullable=True),
sa.Column('fk_bookcase_shelf_uid', sa.Integer(), nullable=False),
sa.Column('fk_language_uid', sa.Integer(), nullable=True),
sa.Column('uid', sa.Integer(), nullable=False),
sa.Column('name', sa.Text(), nullable=False),
sa.ForeignKeyConstraint(['fk_bookcase_shelf_uid'], ['BookcaseShelf.uid'], name=op.f('fk_BookcaseItem_fk_bookcase_shelf_uid_BookcaseShelf')),
sa.ForeignKeyConstraint(['fk_language_uid'], ['Language.uid'], name=op.f('fk_BookcaseItem_fk_language_uid_Language')),
sa.ForeignKeyConstraint(['fk_media_type_uid'], ['MediaType.uid'], name=op.f('fk_BookcaseItem_fk_media_type_uid_MediaType')),
@@ -89,13 +95,13 @@ def upgrade() -> None:
)
with op.batch_alter_table('BookcaseItem', schema=None) as batch_op:
batch_op.create_index(batch_op.f('ix_BookcaseItem_isbn'), ['isbn'], unique=True)
batch_op.create_index(batch_op.f('ix_BookcaseItem_name'), ['name'], unique=True)
batch_op.create_index(batch_op.f('ix_BookcaseItem_name'), ['name'], unique=False)
op.create_table('BookcaseItemBorrowing',
sa.Column('username', sa.String(), nullable=False),
sa.Column('start_time', sa.DateTime(), nullable=False),
sa.Column('end_time', sa.DateTime(), nullable=False),
sa.Column('delivered', sa.Boolean(), nullable=False),
sa.Column('delivered', sa.DateTime(), nullable=True),
sa.Column('fk_bookcase_item_uid', sa.Integer(), nullable=False),
sa.Column('uid', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['fk_bookcase_item_uid'], ['BookcaseItem.uid'], name=op.f('fk_BookcaseItemBorrowing_fk_bookcase_item_uid_BookcaseItem')),
@@ -106,8 +112,9 @@ def upgrade() -> None:
op.create_table('BookcaseItemBorrowingQueue',
sa.Column('username', sa.String(), nullable=False),
sa.Column('entered_queue_time', sa.DateTime(), nullable=True),
sa.Column('should_notify_user', sa.Boolean(), nullable=True),
sa.Column('entered_queue_time', sa.DateTime(), nullable=False),
sa.Column('item_became_available_time', sa.DateTime(), nullable=True),
sa.Column('expired', sa.Boolean(), nullable=True),
sa.Column('fk_bookcase_item_uid', sa.Integer(), nullable=False),
sa.Column('uid', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['fk_bookcase_item_uid'], ['BookcaseItem.uid'], name=op.f('fk_BookcaseItemBorrowingQueue_fk_bookcase_item_uid_BookcaseItem')),
@@ -160,6 +167,7 @@ def downgrade() -> None:
batch_op.drop_index(batch_op.f('ix_Language_iso639_1_code'))
op.drop_table('Language')
op.drop_table('DeadlineDaemonLastRunDatetime')
with op.batch_alter_table('Category', schema=None) as batch_op:
batch_op.drop_index(batch_op.f('ix_Category_name'))


@@ -1,4 +1,7 @@
from .argument_parser import arg_parser
from .argument_parser import (
arg_parser,
devscripts_arg_parser,
)
from .bookcase_item import (
create_bookcase_item_from_isbn,
is_valid_isbn,


@@ -31,6 +31,19 @@ subparsers.add_parser(
help = 'Start the web interface in production mode',
)
devscripts_arg_parser = subparsers.add_parser('devscripts', help='Run development scripts')
devscripts_subparsers = devscripts_arg_parser.add_subparsers(dest='script')
devscripts_subparsers.add_parser(
'seed-test-data',
help = 'Seed test data in the database',
)
devscripts_subparsers.add_parser(
'seed-content-for-deadline-daemon',
help = 'Seed data tailored for testing the deadline daemon into the database',
)
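The nested-subparser wiring above gives each level its own `dest` attribute on the parsed namespace, which is what the dispatch in `main()` checks. A minimal self-contained sketch of the same argparse pattern (the `prog` name is an assumption for illustration):

```python
import argparse

# Minimal sketch of the nested-subparser pattern used for the devscripts
# subcommand; 'worblehat' as prog name is a hypothetical stand-in.
arg_parser = argparse.ArgumentParser(prog='worblehat')
subparsers = arg_parser.add_subparsers(dest='command')

devscripts_arg_parser = subparsers.add_parser('devscripts', help='Run development scripts')
devscripts_subparsers = devscripts_arg_parser.add_subparsers(dest='script')
devscripts_subparsers.add_parser('seed-test-data', help='Seed test data in the database')

# Each nested 'dest' ends up as its own attribute on the parsed namespace.
args = arg_parser.parse_args(['devscripts', 'seed-test-data'])
print(args.command, args.script)
```

With no script given, `args.script` is simply `None`, which is why the dispatch can fall through to printing `devscripts_arg_parser.format_help()`.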
arg_parser.add_argument(
'-V',
'--version',


@@ -3,6 +3,7 @@ import isbnlib
from sqlalchemy import select
from sqlalchemy.orm import Session
from .metadata_fetchers import fetch_metadata_from_multiple_sources
from ..models import (
Author,
BookcaseItem,
@@ -25,20 +26,32 @@ def is_valid_isbn(isbn: str) -> bool:
def create_bookcase_item_from_isbn(isbn: str, sql_session: Session) -> BookcaseItem | None:
metadata = isbnlib.meta(isbn, 'openl')
if len(metadata.keys()) == 0:
"""
This function fetches metadata for the given ISBN and creates a BookcaseItem from it.
It does so using a database connection to connect it to the correct authors and language
through the sql ORM.
If no metadata is found, None is returned.
Please not that the returned BookcaseItem will likely not be fully populated with the required
data, such as the book's location in the library, and the owner of the book, etc.
"""
metadata = fetch_metadata_from_multiple_sources(isbn)
if len(metadata) == 0:
return None
metadata = metadata[0]
bookcase_item = BookcaseItem(
name = metadata.get('Title'),
name = metadata.title,
isbn = int(isbn.replace('-', '')),
)
if len(authors := metadata.get('Authors')) > 0:
if len(authors := metadata.authors) > 0:
for author in authors:
bookcase_item.authors.add(Author(author))
if (language := metadata.get('Language')):
if (language := metadata.language):
bookcase_item.language = sql_session.scalars(
select(Language)
.where(Language.iso639_1_code == language)


@@ -24,6 +24,9 @@ class Config:
]
def __class_getitem__(cls, name: str) -> Any:
if cls._config is None:
raise RuntimeError('Configuration not loaded, call Config.load_configuration() first.')
__config = cls._config
for attr in name.split('.'):
__config = __config.get(attr)


@@ -1,6 +1,7 @@
from pathlib import Path
import smtplib
from textwrap import indent
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
@@ -8,16 +9,16 @@ from .config import Config
def send_email(to: str, subject: str, body: str):
if Config['smtp.enabled']:
msg = MIMEMultipart()
msg['From'] = Config['smtp.from']
msg['To'] = to
if Config['smtp.subject_prefix']:
msg['Subject'] = f"{Config['smtp.subject_prefix']} {subject}"
else:
msg['Subject'] = subject
msg.attach(MIMEText(body, 'plain'))
msg = MIMEMultipart()
msg['From'] = Config['smtp.from']
msg['To'] = to
if Config['smtp.subject_prefix']:
msg['Subject'] = f"{Config['smtp.subject_prefix']} {subject}"
else:
msg['Subject'] = subject
msg.attach(MIMEText(body, 'plain'))
if Config['smtp.enabled'] and not Config['deadline_daemon.dryrun']:
try:
with smtplib.SMTP(Config['smtp.host'], Config['smtp.port']) as server:
server.starttls()
@@ -31,6 +32,4 @@ def send_email(to: str, subject: str, body: str):
print(err)
else:
print('Debug: Email sending is disabled, so the following email was not sent:')
print(f' To: {to}')
print(f' Subject: {subject}')
print(f' Body: {body}')
print(indent(msg.as_string(), ' '))
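The refactor above builds the MIME message unconditionally so the dryrun branch can print exactly what would have been sent. A self-contained sketch of that construction, with hypothetical addresses and config values inlined (the real code reads them from Config):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from textwrap import indent

# Hypothetical values standing in for Config['smtp.subject_prefix'] etc.
subject_prefix = '[worblehat]'
subject = 'Deadline approaching'

msg = MIMEMultipart()
msg['From'] = 'worblehat@example.org'
msg['To'] = 'user@example.org'
msg['Subject'] = f'{subject_prefix} {subject}' if subject_prefix else subject
msg.attach(MIMEText('Your loan is due tomorrow.', 'plain'))

# In dryrun mode the full message is printed instead of sent.
print(indent(msg.as_string(), '  '))
```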


@@ -0,0 +1,62 @@
from dataclasses import dataclass
from typing import Set
# TODO: Add more languages
LANGUAGES: set[str] = set([
"no",
"en",
"de",
"fr",
"es",
"it",
"sv",
"da",
"fi",
"ru",
"zh",
"ja",
"ko",
])
@dataclass
class BookMetadata:
"""A class representing metadata for a book."""
isbn: str
title: str
# The source of the metadata provider
source: str
authors: Set[str]
language: str | None
publish_date: str | None
num_pages: int | None
subjects: Set[str]
def to_dict(self) -> dict[str, object]:
return {
'isbn': self.isbn,
'title': self.title,
'source': self.source,
'authors': set() if self.authors is None else self.authors,
'language': self.language,
'publish_date': self.publish_date,
'num_pages': self.num_pages,
'subjects': set() if self.subjects is None else self.subjects
}
def validate(self) -> None:
if not self.isbn:
raise ValueError('Missing ISBN')
if not self.title:
raise ValueError('Missing title')
if not self.source:
raise ValueError('Missing source')
if not self.authors:
raise ValueError('Missing authors')
if self.language is not None and self.language not in LANGUAGES:
raise ValueError(f'Invalid language: {self.language}. Consider adding it to the LANGUAGES set if you think this is a mistake.')
if self.num_pages is not None and self.num_pages < 0:
raise ValueError(f'Invalid number of pages: {self.num_pages}')


@@ -0,0 +1,20 @@
# Base fetcher interface.
from abc import ABC, abstractmethod
from .BookMetadata import BookMetadata
class BookMetadataFetcher(ABC):
"""
A base class for metadata fetchers.
"""
@classmethod
@abstractmethod
def metadata_source_id(cls) -> str:
"""Returns a unique identifier for the metadata source, to identify where the metadata came from."""
pass
@classmethod
@abstractmethod
def fetch_metadata(cls, isbn: str) -> BookMetadata | None:
"""Tries to fetch metadata for the given ISBN."""
pass


@@ -0,0 +1,51 @@
"""
A BookMetadataFetcher for the Google Books API.
"""
import requests
from worblehat.services.metadata_fetchers.BookMetadataFetcher import BookMetadataFetcher
from worblehat.services.metadata_fetchers.BookMetadata import BookMetadata
class GoogleBooksFetcher(BookMetadataFetcher):
@classmethod
def metadata_source_id(cls) -> str:
return "google_books"
@classmethod
def fetch_metadata(cls, isbn: str) -> BookMetadata | None:
try:
jsonInput = requests.get(
"https://www.googleapis.com/books/v1/volumes",
params = {"q": f"isbn:{isbn}"},
).json()
data = jsonInput.get("items")[0].get("volumeInfo")
authors = set(data.get("authors") or [])
title = data.get("title")
# The Google Books volumeInfo schema uses camelCase field names.
publishDate = data.get("publishedDate")
numberOfPages = data.get("pageCount")
if numberOfPages:
numberOfPages = int(numberOfPages)
subjects = set(data.get("categories") or [])
language = data.get("language")
except Exception:
return None
return BookMetadata(
isbn = isbn,
title = title,
source = cls.metadata_source_id(),
authors = authors,
language = language,
publish_date = publishDate,
num_pages = numberOfPages,
subjects = subjects,
)
if __name__ == '__main__':
book_data = GoogleBooksFetcher.fetch_metadata('0132624788')
book_data.validate()
print(book_data)


@@ -0,0 +1,61 @@
"""
A BookMetadataFetcher for the Open Library API.
"""
import requests
from worblehat.services.metadata_fetchers.BookMetadataFetcher import BookMetadataFetcher
from worblehat.services.metadata_fetchers.BookMetadata import BookMetadata
LANGUAGE_MAP = {
"Norwegian": "no",
}
class OpenLibraryFetcher(BookMetadataFetcher):
@classmethod
def metadata_source_id(cls) -> str:
return "open_library"
@classmethod
def fetch_metadata(cls, isbn: str) -> BookMetadata | None:
try:
jsonInput = requests.get(f"https://openlibrary.org/isbn/{isbn}.json").json()
author_keys = jsonInput.get("authors") or []
author_names = set()
for author_key in author_keys:
key = author_key.get('key')
author_name = requests.get(f"https://openlibrary.org/{key}.json").json().get("name")
author_names.add(author_name)
title = jsonInput.get("title")
publishDate = jsonInput.get("publish_date")
numberOfPages = jsonInput.get("number_of_pages")
if numberOfPages:
numberOfPages = int(numberOfPages)
language_key = jsonInput.get("languages")[0].get("key")
language = requests.get(f"https://openlibrary.org/{language_key}.json").json().get("identifiers").get("iso_639_1")[0]
subjects = set(jsonInput.get("subjects") or [])
except Exception:
return None
return BookMetadata(
isbn = isbn,
title = title,
source = cls.metadata_source_id(),
authors = author_names,
language = language,
publish_date = publishDate,
num_pages = numberOfPages,
subjects = subjects,
)
if __name__ == '__main__':
book_data = OpenLibraryFetcher.fetch_metadata('9788205530751')
book_data.validate()
print(book_data)


@@ -0,0 +1,109 @@
"""
A BookMetadataFetcher that webscrapes https://outland.no/
"""
from bs4 import BeautifulSoup
import requests
from worblehat.services.metadata_fetchers.BookMetadataFetcher import BookMetadataFetcher
from worblehat.services.metadata_fetchers.BookMetadata import BookMetadata
LANGUAGE_MAP = {
"Norsk": "no",
"Engelsk": "en",
"Tysk": "de",
"Fransk": "fr",
"Spansk": "es",
"Italiensk": "it",
"Svensk": "sv",
"Dansk": "da",
"Finsk": "fi",
"Russisk": "ru",
"Kinesisk": "zh",
"Japansk": "ja",
"Koreansk": "ko",
}
class OutlandScraperFetcher(BookMetadataFetcher):
@classmethod
def metadata_source_id(cls) -> str:
return "outland_scraper"
@classmethod
def fetch_metadata(cls, isbn: str) -> BookMetadata | None:
try:
# Find the link to the product page
response = requests.get(f"https://outland.no/{isbn}")
soup = BeautifulSoup(response.content, "html.parser")
soup = soup.find_all("a", class_="product-item-link")
href = soup[0].get("href")
# Find the metadata on the product page
response = requests.get(href)
soup = BeautifulSoup(response.content, "html.parser")
data = soup.find_all("td", class_="col data")
# Collect the metadata
title = soup.find_all("span", class_="base")[0].text
releaseDate = soup.find_all("span", class_="release-date")[0].text.strip()
releaseDate = releaseDate[-4:] # only keep year
bookData = {
"Title": title,
"PublishDate": releaseDate,
"Authors": None,
"NumberOfPages": None,
"Genre": None,
"Language": None,
"Subjects": None,
}
dataKeyMap = {
"Authors": "Forfattere",
"NumberOfPages": "Antall Sider",
"Genre": "Sjanger",
"Language": "Språk",
"Subjects": "Serie"
}
for value in data:
for key in dataKeyMap:
if dataKeyMap[key].lower() in str(value).lower():
bookData[key] = value.text
break
if bookData["Language"] is not None:
bookData["Language"] = LANGUAGE_MAP.get(bookData["Language"])
if bookData["Authors"] is not None:
bookData["Authors"] = set(bookData["Authors"].split(", "))
if bookData["Subjects"] is not None:
bookData["Subjects"] = set(bookData["Subjects"].split(", "))
if bookData["NumberOfPages"] is not None:
bookData["NumberOfPages"] = int(bookData["NumberOfPages"])
except Exception:
return None
return BookMetadata(
isbn = isbn,
title = bookData.get('Title'),
source = cls.metadata_source_id(),
authors = bookData.get('Authors'),
language = bookData.get('Language'),
publish_date = bookData.get('PublishDate'),
num_pages = bookData.get('NumberOfPages'),
subjects = bookData.get('Subjects'),
)
if __name__ == '__main__':
book_data = OutlandScraperFetcher.fetch_metadata('9781947808225')
book_data.validate()
print(book_data)


@@ -0,0 +1 @@
from .book_metadata_fetcher import fetch_metadata_from_multiple_sources


@@ -0,0 +1,80 @@
"""
This module contains the fetch_metadata_from_multiple_sources() function, which fetches book metadata from multiple sources in parallel threads and returns the valid results, ordered by source priority.
"""
from concurrent.futures import ThreadPoolExecutor
from worblehat.services.metadata_fetchers.BookMetadata import BookMetadata
from worblehat.services.metadata_fetchers.BookMetadataFetcher import BookMetadataFetcher
from worblehat.services.metadata_fetchers.GoogleBooksFetcher import GoogleBooksFetcher
from worblehat.services.metadata_fetchers.OpenLibraryFetcher import OpenLibraryFetcher
from worblehat.services.metadata_fetchers.OutlandScraperFetcher import OutlandScraperFetcher
# The order of these fetchers determines the priority of the sources.
# The first fetcher in the list has the highest priority.
FETCHERS: list[type[BookMetadataFetcher]] = [
OpenLibraryFetcher,
GoogleBooksFetcher,
OutlandScraperFetcher,
]
FETCHER_SOURCE_IDS: list[str] = [fetcher.metadata_source_id() for fetcher in FETCHERS]
def sort_metadata_by_priority(metadata: list[BookMetadata]) -> list[BookMetadata]:
"""
Sorts the given metadata by the priority of the sources.
The order of the metadata is the same as the order of the sources in the FETCHERS list.
"""
# Note that this function is O(n^2) but the number of fetchers is small so it's fine.
return sorted(metadata, key=lambda m: FETCHER_SOURCE_IDS.index(m.source))
def fetch_metadata_from_multiple_sources(isbn: str, strict=False) -> list[BookMetadata]:
"""
Returns a list of metadata fetched from multiple sources.
Sources that do not have metadata for the given ISBN will be ignored.
There is no guarantee that there will be any metadata.
The results are always ordered in the same way as the fetchers are listed in the FETCHERS list.
"""
isbn = isbn.replace('-', '').replace('_', '').strip().lower()
if len(isbn) not in (10, 13) or not isbn.isnumeric():
raise ValueError('Invalid ISBN')
results: list[BookMetadata] = []
with ThreadPoolExecutor() as executor:
futures = [executor.submit(fetcher.fetch_metadata, isbn) for fetcher in FETCHERS]
for future in futures:
result = future.result()
if result is not None:
results.append(result)
# Filter out invalid results without mutating the list being iterated.
valid_results: list[BookMetadata] = []
for result in results:
try:
result.validate()
valid_results.append(result)
except ValueError as e:
if strict:
raise
print(f'Invalid metadata: {e}')
return sort_metadata_by_priority(valid_results)
if __name__ == '__main__':
from pprint import pprint
isbn = '0132624788'
metadata = fetch_metadata_from_multiple_sources(isbn)
pprint(metadata)
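The fan-out-and-rank strategy above can be illustrated with dummy in-process fetchers standing in for the real network-backed ones (no network access, hypothetical source ids):

```python
from concurrent.futures import ThreadPoolExecutor

# Priority list: earlier sources outrank later ones, mirroring FETCHERS above.
FETCHER_PRIORITY = ['open_library', 'google_books']

def open_library(isbn):
    return {'isbn': isbn, 'source': 'open_library'}

def google_books(isbn):
    return {'isbn': isbn, 'source': 'google_books'}

def fetch_all(isbn):
    with ThreadPoolExecutor() as executor:
        futures = [executor.submit(f, isbn) for f in (google_books, open_library)]
        results = [f.result() for f in futures if f.result() is not None]
    # Rank by source priority, regardless of which thread finished first.
    return sorted(results, key=lambda m: FETCHER_PRIORITY.index(m['source']))

results = fetch_all('9780201896831')
print([m['source'] for m in results])  # ['open_library', 'google_books']
```

Because the threads only affect completion order, sorting by the fixed priority list is what makes the output deterministic.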


@@ -1,4 +1,5 @@
import csv
from datetime import datetime, timedelta
from pathlib import Path
from sqlalchemy.orm import Session
@@ -6,7 +7,11 @@ from sqlalchemy.orm import Session
from worblehat.flaskapp.database import db
from ..models import (
Author,
Bookcase,
BookcaseItem,
BookcaseItemBorrowing,
BookcaseItemBorrowingQueue,
BookcaseShelf,
Language,
MediaType,
@@ -117,6 +122,100 @@ def seed_data(sql_session: Session = db.session):
BookcaseShelf(row=0, column=4, bookcase=bookcases[4], description="Religion"),
]
authors = [
Author(name="Donald E. Knuth"),
Author(name="J.K. Rowling"),
Author(name="J.R.R. Tolkien"),
Author(name="George R.R. Martin"),
Author(name="Stephen King"),
Author(name="Agatha Christie"),
]
book1 = BookcaseItem(
name = "The Art of Computer Programming",
isbn = "9780201896831",
)
book1.authors.add(authors[0])
book1.media_type = media_types[0]
book1.shelf = shelfs[59]
book2 = BookcaseItem(
name = "Harry Potter and the Philosopher's Stone",
isbn = "9780747532743",
)
book2.authors.add(authors[1])
book2.media_type = media_types[0]
book2.shelf = shelfs[-1]
book_owned_by_other_user = BookcaseItem(
name = "Book owned by other user",
isbn = "9780747532744",
)
book_owned_by_other_user.owner = "other_user"
book_owned_by_other_user.authors.add(authors[4])
book_owned_by_other_user.media_type = media_types[0]
book_owned_by_other_user.shelf = shelfs[-2]
borrowed_book_more_available = BookcaseItem(
name = "Borrowed book with more available",
isbn = "9780747532745",
)
borrowed_book_more_available.authors.add(authors[5])
borrowed_book_more_available.media_type = media_types[0]
borrowed_book_more_available.shelf = shelfs[-3]
borrowed_book_more_available.amount = 2
borrowed_book_no_more_available = BookcaseItem(
name = "Borrowed book with no more available",
isbn = "9780747532746",
)
borrowed_book_no_more_available.authors.add(authors[5])
borrowed_book_no_more_available.media_type = media_types[0]
borrowed_book_no_more_available.shelf = shelfs[-3]
borrowed_book_people_in_queue = BookcaseItem(
name = "Borrowed book with people in queue",
isbn = "9780747532747",
)
borrowed_book_people_in_queue.authors.add(authors[5])
borrowed_book_people_in_queue.media_type = media_types[0]
borrowed_book_people_in_queue.shelf = shelfs[-3]
borrowed_book_by_slabbedask = BookcaseItem(
name = "Borrowed book by slabbedask",
isbn = "9780747532748",
)
borrowed_book_by_slabbedask.authors.add(authors[5])
borrowed_book_by_slabbedask.media_type = media_types[0]
borrowed_book_by_slabbedask.shelf = shelfs[-3]
books = [
book1,
book2,
book_owned_by_other_user,
borrowed_book_more_available,
borrowed_book_no_more_available,
borrowed_book_people_in_queue,
]
slabbedask_borrowing = BookcaseItemBorrowing(
username="slabbedask",
item=borrowed_book_more_available,
)
slabbedask_borrowing.end_time = datetime.now() - timedelta(days=1)
borrowings = [
BookcaseItemBorrowing(username="user", item=borrowed_book_more_available),
BookcaseItemBorrowing(username="user", item=borrowed_book_no_more_available),
BookcaseItemBorrowing(username="user", item=borrowed_book_people_in_queue),
slabbedask_borrowing,
]
queue = [
BookcaseItemBorrowingQueue(username="user", item=borrowed_book_people_in_queue),
]
with open(Path(__file__).parent.parent.parent / 'data' / 'iso639_1.csv') as f:
reader = csv.reader(f)
languages = [Language(name, code) for (code, name) in reader]
@@ -125,5 +224,9 @@ def seed_data(sql_session: Session = db.session):
sql_session.add_all(bookcases)
sql_session.add_all(shelfs)
sql_session.add_all(languages)
sql_session.add_all(authors)
sql_session.add_all(books)
sql_session.add_all(borrowings)
sql_session.add_all(queue)
sql_session.commit()
print("Added test media types, bookcases and shelfs.")