73 Commits

Author SHA1 Message Date
2207001136 fixup! WIP 2025-12-09 21:30:19 +09:00
fead6257c7 fixup! WIP 2025-12-09 21:30:15 +09:00
2a9ace4263 fixup! .gitea/workflows: init test pipeline 2025-12-09 20:39:21 +09:00
60fa6529ee fixup! WIP 2025-12-09 20:39:01 +09:00
2e66a9a4b0 fixup! WIP 2025-12-09 18:53:14 +09:00
a087d3bede fixup! WIP 2025-12-09 18:43:42 +09:00
45bb31aba0 fixup! WIP 2025-12-09 18:32:04 +09:00
f15c748558 fixup! WIP 2025-12-09 18:31:10 +09:00
d220342d56 fixup! WIP 2025-12-09 17:44:18 +09:00
108e17edb8 fixup! WIP: docs/economy 2025-12-09 17:40:41 +09:00
12028cee22 fixup! WIP: docs/economy 2025-12-09 17:03:47 +09:00
1515eb6dff fixup! WIP 2025-12-09 17:02:28 +09:00
7199cbf34a WIP: docs/economy 2025-12-09 17:02:22 +09:00
722f0cae93 fixup! WIP 2025-12-09 15:51:54 +09:00
16be0f0fbe fixup! WIP 2025-12-09 15:47:14 +09:00
cec91d923c fixup! WIP 2025-12-09 15:30:16 +09:00
0504cc1a1e fixup! WIP 2025-12-09 15:17:52 +09:00
e7453d0fdd fixup! WIP 2025-12-09 14:43:45 +09:00
c6ecb6fae9 fixup! WIP 2025-12-09 13:25:34 +09:00
aaa5a6c556 fixup! WIP 2025-12-09 13:00:16 +09:00
6a83a41f28 fixup! WIP 2025-12-09 12:55:24 +09:00
aa4e8dbee5 fixup! WIP 2025-12-09 12:49:38 +09:00
f39e649b3d fixup! WIP 2025-12-09 11:56:06 +09:00
0a2fc799dd fixup! WIP 2025-12-09 04:25:05 +09:00
7d498f9bf1 fixup! WIP 2025-12-08 21:55:25 +09:00
f1b15357f9 fixup! WIP 2025-12-08 21:43:27 +09:00
de896901bb models/Base: add comment about __repr__ impl 2025-12-08 21:43:26 +09:00
15d1763405 fixup! WIP 2025-12-08 21:43:23 +09:00
683981d9dc fixup! .gitea/workflows: init test pipeline 2025-12-08 21:30:07 +09:00
4289d63ac9 fixup! .gitea/workflows: init test pipeline 2025-12-08 21:28:01 +09:00
ce3e65357b fixup! .gitea/workflows: init test pipeline 2025-12-08 21:16:04 +09:00
928ab2a98a fixup! WIP 2025-12-08 21:13:49 +09:00
0b59d469dd fixup! WIP 2025-12-08 21:04:49 +09:00
24c5a9af38 fixup! .gitea/workflows: init test pipeline 2025-12-08 20:33:03 +09:00
21ccf78401 fixup! .gitea/workflows: init test pipeline 2025-12-08 20:29:10 +09:00
d5b481d97a fixup! .gitea/workflows: init test pipeline 2025-12-08 20:27:32 +09:00
cac1b5be20 fixup! .gitea/workflows: init test pipeline 2025-12-08 20:24:47 +09:00
ad1fcfe98d fixup! .gitea/workflows: init test pipeline 2025-12-08 20:20:35 +09:00
cc7b40ab7e fixup! .gitea/workflows: init test pipeline 2025-12-08 20:19:09 +09:00
d35ffd04cc fixup! .gitea/workflows: init test pipeline 2025-12-08 20:17:53 +09:00
d39f1f8a92 pyproject.toml: psycopg2 -> psycopg2-binary 2025-12-08 20:07:37 +09:00
0e3bed9bf5 uv.lock: update 2025-12-08 20:07:37 +09:00
3a1fc58a68 .gitea/workflows: init test pipeline 2025-12-08 20:07:37 +09:00
1ec7c79378 fixup! WIP 2025-12-08 19:50:22 +09:00
bc43d4948c README: add overview of project structure 2025-12-08 19:45:37 +09:00
7e5345c7fb fixup! WIP 2025-12-08 19:45:22 +09:00
50867db928 fixup! WIP 2025-12-08 18:26:49 +09:00
5252cb611f fixup! WIP 2025-12-08 18:26:49 +09:00
5f510ee5d8 fixup! WIP 2025-12-08 18:26:48 +09:00
f8829a6c7b fixup! WIP 2025-12-08 18:26:48 +09:00
885e989659 fixup! WIP 2025-12-08 18:26:48 +09:00
5c0b2b5229 WIP 2025-12-08 18:26:48 +09:00
9f2d8229fd .gitignore: add pytest-cov data 2025-12-08 18:26:47 +09:00
8807d7278a {nix,pyproject.toml}: add pytest, pytest-cov 2025-12-08 18:26:46 +09:00
634716956e uv.lock: init 2025-12-08 18:26:08 +09:00
fb81eef26f flake.lock: bump 2025-12-08 18:25:59 +09:00
e9d30b63a5 flake.lock: bump 2025-06-07 15:13:32 +02:00
0844843e59 remove all the image related things from dibbler service 2025-05-17 20:05:35 +02:00
70677f7f79 db: handle database.url_file 2025-05-17 19:19:10 +02:00
4a4f0e6947 module.nix: config -> settings 2025-05-05 14:52:28 +02:00
a4d10ad0c7 Merge pull request 'Seed test data' (#16) from seed_test into master (Reviewed-on: #16, Reviewed-by: Oystein Kristoffer Tveit <oysteikt@pvv.ntnu.no>) 2025-03-30 21:45:37 +02:00
a654baba11 ruff format 2025-03-30 21:44:37 +02:00
e69d04dcd0 mock script, and mock data 2025-03-29 22:48:30 +01:00
b2a6384f31 add back uv, a project manager 2025-03-29 22:46:30 +01:00
4f89765070 ignore build artifacts from hatchling 2025-03-29 22:42:36 +01:00
914e5b4e50 remove __pycache__ from repo tracking 2025-03-29 22:37:11 +01:00
de20bad7dd remove conf.py 2025-03-19 18:47:23 +01:00
4bab5e7e21 treewide: fix brother-ql usage 2025-03-19 18:47:16 +01:00
b85a6535fe shell.nix: add python with all packages 2025-03-19 18:14:42 +01:00
22a09b4177 README: add more information 2025-03-19 18:06:40 +01:00
c39b15d1a8 .envrc: init 2025-03-19 17:50:48 +01:00
122ac2ab18 treewide: update everything nix 2025-03-19 17:50:14 +01:00
28228beccd pyproject.toml: remove invalid license
This license field was added by accident, without the consent of the earlier contributors. It is not valid.
2025-03-17 21:03:55 +01:00
76 changed files with 6512 additions and 772 deletions

.envrc

@@ -1,2 +1 @@
# devenv needs to know the path to the current working directory to create and manage mutable state
use flake . --no-pure-eval
use flake


@@ -0,0 +1,77 @@
name: Run tests
on:
  pull_request:
  push:
  workflow_dispatch:
    inputs:
      debug_sqlalchemy:
        description: "Print SQL statements executed by SQLAlchemy during tests"
        type: boolean
        default: false

jobs:
  run-tests:
    runs-on: debian-latest
    steps:
      - uses: actions/checkout@v6

      - name: Install uv
        uses: astral-sh/setup-uv@v7

      - name: Install dependencies
        run: uv sync --locked --group test

      - name: Run tests
        continue-on-error: true
        run: |
          set -euo pipefail
          set -x

          PYTEST_ARGS=(
            -vv
            --cov=dibbler.lib
            --cov=dibbler.models
            --cov=dibbler.queries
            --cov-report=html
            --cov-branch
            --self-contained-html
            --html=./test-report/index.html
          )

          if [ "${{ inputs.debug_sqlalchemy }}" == "true" ]; then
            PYTEST_ARGS+=(
              --echo
            )
          fi

          uv run -- pytest "${PYTEST_ARGS[@]}"

      - name: Generate badge
        run: uv run -- coverage-badge -o htmlcov/badge.svg

      - name: Upload test report
        uses: https://git.pvv.ntnu.no/Projects/rsync-action@v1
        with:
          source: ./test-report/
          target: ${{ gitea.ref_name }}/test-report/
          username: gitea-web
          ssh-key: ${{ secrets.WEB_SYNC_SSH_KEY }}
          host: pages.pvv.ntnu.no
          known-hosts: "pages.pvv.ntnu.no ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIH2QjfFB+city1SYqltkVqWACfo1j37k+oQQfj13mtgg"

      - name: Upload coverage report
        uses: https://git.pvv.ntnu.no/Projects/rsync-action@v1
        with:
          source: ./htmlcov/
          target: ${{ gitea.ref_name }}/coverage/
          username: gitea-web
          ssh-key: ${{ secrets.WEB_SYNC_SSH_KEY }}
          host: pages.pvv.ntnu.no
          known-hosts: "pages.pvv.ntnu.no ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIH2QjfFB+city1SYqltkVqWACfo1j37k+oQQfj13mtgg"

      - name: Check failure
        if: failure()
        run: |
          echo "Tests failed"
          exit 1
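The conditional argument assembly in the "Run tests" step can be mirrored in plain Python for local use. `build_pytest_args` is a hypothetical helper, not part of the repository; it only restates what the workflow's `PYTEST_ARGS` array does:

```python
# Hypothetical local helper (not part of the repo) mirroring the
# PYTEST_ARGS assembly in the workflow step above.
def build_pytest_args(debug_sqlalchemy: bool = False) -> list[str]:
    args = [
        "-vv",
        "--cov=dibbler.lib",
        "--cov=dibbler.models",
        "--cov=dibbler.queries",
        "--cov-report=html",
        "--cov-branch",
        "--self-contained-html",
        "--html=./test-report/index.html",
    ]
    if debug_sqlalchemy:
        # mirrors the workflow_dispatch input that toggles SQLAlchemy echo
        args.append("--echo")
    return args
```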

.gitignore

@@ -1,12 +1,13 @@
result
result-*
.venv
.direnv
.devenv
**/__pycache__
dibbler.egg-info
dist
config.ini
test.db
.ruff_cache
.coverage
htmlcov
test-report


@@ -1,48 +1,85 @@
[![Coverage](https://pages.pvv.ntnu.no/Projects/dibbler/event-sourcing/coverage/badge.svg)](https://pages.pvv.ntnu.no/Projects/dibbler/event-sourcing/coverage)
[![Test report](https://img.shields.io/badge/status-grab_the_latest_test_report-blue)](https://pages.pvv.ntnu.no/Projects/dibbler/event-sourcing/test-report)

# Dibbler

EDB system for PVVVV

## What is this?

Dibbler is a system made by PVVers for PVVers to share and trade both groceries and candy.
It is designed for an old-fashioned VT terminal and is meant to be easy both to use and to hack on.
The program is written in Python and uses an SQL database to store its data.

The communal tuck shop is set up so that people buy in products and receive dibbler credit, which they can then spend on other products. There is no form of authentication, so the whole system is based on trust.

It is recommended to connect a barcode reader to the system to make it easier both to add and to buy products.

## Getting started

Install Python, then create and activate a venv. Install the dependencies with `pip install`.
You can then run the program with

```console
python -m dibbler -c example-config.ini create-db
python -m dibbler -c example-config.ini loop
```

## Project structure

Here is an overview of the project structure and what the different directories and files do.

### `dibbler/models`

This directory contains the database models. With few exceptions, each file in this directory is one model.
We currently use modern declarative SQLAlchemy syntax to define the models (see [SQLAlchemy - Declarative Mapping Styles](https://docs.sqlalchemy.org/en/20/orm/declarative_styles.html)).

Take care not to put too much logic in the models; they should preferably just define the data. Constructors, helper functions and static validation are encouraged, but avoid dynamic validation against the database - that belongs in `dibbler/queries`.

### `dibbler/queries`

This directory contains the database queries. These have become fairly complex over time, since we do not store state but instead derive it from a running log of transactions. A fair amount of validation also happens here: static validation of arguments as well as dynamic validation against the database.

### `dibbler/menus`

The menu definitions for the terminal interface live here. The menus handle user interaction and navigation.

### `dibbler/lib`

Here you will find helper functions and utilities used across the project. Things that did not fit anywhere else.

### `dibbler/subcommands`

The entry points for the command line interface live here. Regular users will normally not see this, since the dibbler terminal is locked to running the terminal interface in an eternal loop. It is useful for passing along extra configuration, or for maintenance tasks and testing tools.

### `tests`

Unit tests for the project, using `pytest` as the test runner. We mainly test database queries and model logic here, since "correctness" of the terminal interface is hard to define and test automatically.

## Nix

### How to run

    nix run github:Programvareverkstedet/dibbler

### How to develop

    python -m venv .venv
    source .venv/bin/activate
    pip install -e .
    cp example-config.ini config.ini
    dibbler -c config.ini create-db
    dibbler -c config.ini loop

or, if you tolerate nix and postgres:

    direnv allow # or just `nix develop`
    devenv up
    dibbler create-db
    dibbler loop

### Building a new image

To build an image you need a builder that can handle building for the architecture you are targeting.
(Or pray to the gods that cross compilation works)

The flake exposes a module that automatically logs in a user which runs dibbler, and sets up a minimalistic environment.

Before building the image you should copy `example-config.ini` locally and edit it to contain your settings. **NB: It will end up in the nix store, so don't put anything in it that you wouldn't want your mother to hear.**

You can also change which config file is used directly in the package or in the module.
See the example of how skrot is set up in `flake.nix` and `nix/skrott.nix`.

### Building the image for skrot

Skrot has an image defined in flake.nix:

1. edit `example-config.ini`
2. `nix build .#images.skrot`
3. ???
4. non-profit
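The event-sourced design described under `dibbler/queries` (no stored state; everything derived from a running log of transactions) can be sketched in a few lines. The tuple layout and field names here are made up for the example and do not match the real models:

```python
# Illustrative sketch of deriving a user's credit from a transaction log.
# Each entry: (type, user, amount) -- invented layout, not the real schema.
log = [
    ("ADJUST_BALANCE", "alice", +100),  # alice puts 100 kr in the box
    ("BUY_PRODUCT", "alice", -30),      # alice buys a product for 30 kr
    ("TRANSFER", "alice", -20),         # alice transfers 20 kr to bob
    ("TRANSFER", "bob", +20),
]

def credit(user: str) -> int:
    # State is never stored; it is recomputed from the full log on demand.
    return sum(amount for _type, who, amount in log if who == user)
```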


@@ -1,4 +0,0 @@
{ pkgs ? import <nixos-unstable> { } }:
{
dibbler = pkgs.callPackage ./nix/dibbler.nix { };
}


@@ -1,12 +1,19 @@
import os
from pathlib import Path

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from dibbler.conf import config

engine = create_engine(
    os.environ.get("DIBBLER_DATABASE_URL")
    or config.get("database", "url")
)

database_url: str | None = None
if (url := config.get("database", "url")) is not None:
    database_url = url
elif (url_file := config.get("database", "url_file")) is not None:
    with Path(url_file).open() as file:
        database_url = file.read().strip()

assert database_url is not None, "Database URL must be specified in config"

engine = create_engine(database_url)

Session = sessionmaker(bind=engine)
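The resolution order introduced above (an explicit `url` wins, otherwise `url_file` names a file whose contents are the URL, e.g. a secret kept out of the nix store) can be sketched standalone. `settings` here is a plain dict standing in for the real `config` object:

```python
from pathlib import Path

# Sketch of the URL resolution order from the diff above.
# `settings` is a plain dict standing in for the real config object.
def resolve_database_url(settings: dict) -> str:
    database_url: str | None = None
    if (url := settings.get("url")) is not None:
        database_url = url
    elif (url_file := settings.get("url_file")) is not None:
        # url_file points at a file containing only the URL
        database_url = Path(url_file).read_text().strip()
    assert database_url is not None, "Database URL must be specified in config"
    return database_url
```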

0
dibbler/lib/__init__.py Normal file
View File


@@ -2,7 +2,7 @@ import os

from PIL import ImageFont
from barcode.writer import ImageWriter, mm2px
from brother_ql.devicedependent import label_type_specs
from brother_ql.labels import ALL_LABELS


def px2mm(px, dpi=300):
@@ -12,14 +12,15 @@ def px2mm(px, dpi=300):
class BrotherLabelWriter(ImageWriter):
    def __init__(self, typ="62", max_height=350, rot=False, text=None):
        super(BrotherLabelWriter, self).__init__()
        assert typ in label_type_specs
        # next() needs a generator and a default; a list argument would raise TypeError
        label = next((l for l in ALL_LABELS if l.identifier == typ), None)
        assert label is not None
        self.rot = rot
        if self.rot:
            self._h, self._w = label_type_specs[typ]["dots_printable"]
            self._h, self._w = label.dots_printable
            if self._w == 0 or self._w > max_height:
                self._w = min(max_height, self._h / 2)
        else:
            self._w, self._h = label_type_specs[typ]["dots_printable"]
            self._w, self._h = label.dots_printable
            if self._h == 0 or self._h > max_height:
                self._h = min(max_height, self._w / 2)
        self._xo = 0.0


@@ -1,79 +1,7 @@
import pwd
import subprocess
import os
import pwd
import signal

from sqlalchemy import or_, and_

from ..models import User, Product


def search_user(string, session, ignorethisflag=None):
    string = string.lower()
    exact_match = (
        session.query(User)
        .filter(or_(User.name == string, User.card == string, User.rfid == string))
        .first()
    )
    if exact_match:
        return exact_match
    user_list = (
        session.query(User)
        .filter(
            or_(
                User.name.ilike(f"%{string}%"),
                User.card.ilike(f"%{string}%"),
                User.rfid.ilike(f"%{string}%"),
            )
        )
        .all()
    )
    return user_list


def search_product(string, session, find_hidden_products=True):
    if find_hidden_products:
        exact_match = (
            session.query(Product)
            .filter(or_(Product.bar_code == string, Product.name == string))
            .first()
        )
    else:
        exact_match = (
            session.query(Product)
            .filter(
                or_(
                    Product.bar_code == string,
                    and_(Product.name == string, Product.hidden is False),
                )
            )
            .first()
        )
    if exact_match:
        return exact_match
    if find_hidden_products:
        product_list = (
            session.query(Product)
            .filter(
                or_(
                    Product.bar_code.ilike(f"%{string}%"),
                    Product.name.ilike(f"%{string}%"),
                )
            )
            .all()
        )
    else:
        product_list = (
            session.query(Product)
            .filter(
                or_(
                    Product.bar_code.ilike(f"%{string}%"),
                    and_(Product.name.ilike(f"%{string}%"), Product.hidden is False),
                )
            )
            .all()
        )
    return product_list
import subprocess


def system_user_exists(username):


@@ -2,9 +2,10 @@ import os
import datetime

import barcode
from brother_ql import BrotherQLRaster, create_label
from brother_ql.brother_ql_create import create_label
from brother_ql.raster import BrotherQLRaster
from brother_ql.backends import backend_factory
from brother_ql.devicedependent import label_type_specs
from brother_ql.labels import ALL_LABELS
from PIL import Image, ImageDraw, ImageFont

from .barcode_helpers import BrotherLabelWriter
@@ -17,10 +18,11 @@ def print_name_label(
    label_type="62",
    printer_type="QL-700",
):
    label = next(l for l in ALL_LABELS if l.identifier == label_type)
    if not rotate:
        width, height = label_type_specs[label_type]["dots_printable"]
        width, height = label.dots_printable
    else:
        height, width = label_type_specs[label_type]["dots_printable"]
        height, width = label.dots_printable
    font_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "ChopinScript.ttf")
    fs = 2000


@@ -0,0 +1,129 @@
from dibbler.models import Transaction, TransactionType
from dibbler.models.Transaction import EXPECTED_FIELDS


def render_transaction_log(transaction_log: list[Transaction]) -> str:
    """
    Renders a transaction log as a pretty, human-readable string.
    """
    aggregated_log = _aggregate_joint_transactions(transaction_log)

    lines = []
    for i, transaction in enumerate(aggregated_log):
        if isinstance(transaction, list):
            inner_lines = []
            is_last = i == len(aggregated_log) - 1
            lines.append(_render_transaction(transaction[0], is_last))
            for j, sub_transaction in enumerate(transaction[1:]):
                is_last_inner = j == len(transaction) - 2
                line = _render_transaction(sub_transaction, is_last_inner)
                inner_lines.append(line)
            indented_inner_lines = _indent_lines(inner_lines, is_last=is_last)
            lines.extend(indented_inner_lines)
        else:
            is_last = i == len(aggregated_log) - 1
            line = _render_transaction(transaction, is_last)
            lines.append(line)

    return "\n".join(lines)


def _aggregate_joint_transactions(
    transactions: list[Transaction],
) -> list[Transaction | list[Transaction]]:
    aggregated: list[Transaction | list[Transaction]] = []
    i = 0
    while i < len(transactions):
        current = transactions[i]
        # The aggregation is running backwards, so it will hit JOINT transactions first
        if current.type_ == TransactionType.JOINT:
            joint_transactions = [current]
            j = i
            # Collect the JOINT_BUY_PRODUCT entries belonging to this JOINT
            # transaction, without indexing past the end of the list.
            while j + 1 < len(transactions):
                next_transaction = transactions[j + 1]
                if next_transaction.type_ == TransactionType.JOINT_BUY_PRODUCT:
                    joint_transactions.append(next_transaction)
                    j += 1
                else:
                    break
            aggregated.append(joint_transactions)
            i = j + 1  # Skip processed transactions
        else:
            aggregated.append(current)
            i += 1
    return aggregated


def _indent_lines(lines: list[str], is_last: bool = False) -> list[str]:
    indented_lines = []
    for line in lines:
        if is_last:
            indented_lines.append("   " + line)
        else:
            indented_lines.append("│  " + line)
    return indented_lines


def _render_transaction(transaction: Transaction, is_last: bool) -> str:
    match transaction.type_:
        case TransactionType.ADD_PRODUCT:
            line = f"ADD_PRODUCT({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.ADD_PRODUCT]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.BUY_PRODUCT:
            line = f"BUY_PRODUCT({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.BUY_PRODUCT]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.ADJUST_STOCK:
            line = f"ADJUST_STOCK({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.ADJUST_STOCK]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.ADJUST_PENALTY:
            line = f"ADJUST_PENALTY({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.ADJUST_PENALTY]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.ADJUST_INTEREST:
            line = f"ADJUST_INTEREST({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.ADJUST_INTEREST]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.ADJUST_BALANCE:
            line = f"ADJUST_BALANCE({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.ADJUST_BALANCE]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.JOINT:
            line = f"JOINT({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.JOINT]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case TransactionType.JOINT_BUY_PRODUCT:
            line = f"JOINT_BUY_PRODUCT({transaction.id}, {transaction.user.name}"
            for field in EXPECTED_FIELDS[TransactionType.JOINT_BUY_PRODUCT]:
                value = getattr(transaction, field)
                line += f", {field}={value}"
            line += ")"
        case _:
            line = (
                f"UNKNOWN[{transaction.type_}](id={transaction.id}, user_id={transaction.user_id})"
            )
    return "└─ " + line if is_last else "├─ " + line


@@ -76,12 +76,8 @@ class Database:
    personDatoVerdi = defaultdict(list)  # dict->array
    personUkedagVerdi = defaultdict(list)
    # for global
    personPosTransactions = (
        {}
    )  # personPosTransactions[trygvrad] == 100  # trygvrad has put 100kr in the box
    personNegTransactions = (
        {}
    )  # personNegTransactions[trygvrad] == 70  # trygvrad has taken 70kr from the box
    personPosTransactions = {}  # personPosTransactions[trygvrad] == 100  # trygvrad has put 100kr in the box
    personNegTransactions = {}  # personNegTransactions[trygvrad] == 70  # trygvrad has taken 70kr from the box
    globalVareAntall = {}  # globalVareAntall[Oreo] == 3
    globalVareVerdi = {}  # globalVareVerdi[Oreo] == 30  # [kr]
    globalPersonAntall = {}  # globalPersonAntall[trygvrad] == 3


@@ -1,6 +1,5 @@
import argparse
import sys
import os
from pathlib import Path

from dibbler.conf import config
@@ -10,9 +9,9 @@ parser.add_argument(
    "-c",
    "--config",
    help="Path to the config file",
    type=str,
    required=False,
    default=os.environ.get("DIBBLER_CONFIG_FILE", None)
    type=Path,
    metavar="FILE",
    default="config.ini",
)

subparsers = parser.add_subparsers(
@@ -23,12 +22,12 @@ subparsers = parser.add_subparsers(
subparsers.add_parser("loop", help="Run the dibbler loop")
subparsers.add_parser("create-db", help="Create the database")
subparsers.add_parser("slabbedasker", help="Find out who is slabbedasker")
subparsers.add_parser("seed-data", help="Fill with mock data")
subparsers.add_parser("transaction-log", help="Print transaction log")


def main():
    args = parser.parse_args()

    if args.config is None:
        print("ERROR: no config was provided", file=sys.stderr)

    config.read(args.config)

    if args.subcommand == "loop":
@@ -46,6 +45,16 @@ def main():
        slabbedasker.main()
    elif args.subcommand == "seed-data":
        import dibbler.subcommands.seed_test_data as seed_test_data

        seed_test_data.main()
    elif args.subcommand == "transaction-log":
        import dibbler.subcommands.transaction_log as transaction_log

        transaction_log.main()


if __name__ == "__main__":
    main()


@@ -180,7 +180,7 @@ When finished, write an empty line to confirm the purchase.\n"""
    print(f"User {t.user.name}'s credit is now {t.user.credit:d} kr")
    if t.user.credit < config.getint("limits", "low_credit_warning_limit"):
        print(
            f'USER {t.user.name} HAS LOWER CREDIT THAN {config.getint("limits", "low_credit_warning_limit"):d},',
            f"USER {t.user.name} HAS LOWER CREDIT THAN {config.getint('limits', 'low_credit_warning_limit'):d},",
            "AND SHOULD CONSIDER PUTTING SOME MONEY IN THE BOX.",
        )


@@ -10,12 +10,16 @@ from sqlalchemy.orm.collections import (
)


def _pascal_case_to_snake_case(name: str) -> str:
    return "".join(["_" + i.lower() if i.isupper() else i for i in name]).lstrip("_")


class Base(DeclarativeBase):
    metadata = MetaData(
        naming_convention={
            "ix": "ix_%(column_0_label)s",
            "ix": "ix_%(table_name)s_%(column_0_label)s",
            "uq": "uq_%(table_name)s_%(column_0_name)s",
            "ck": "ck_%(table_name)s_`%(constraint_name)s`",
            "ck": "ck_%(table_name)s_%(constraint_name)s",
            "fk": "fk_%(table_name)s_%(column_0_name)s_%(referred_table_name)s",
            "pk": "pk_%(table_name)s",
        }
@@ -23,8 +27,12 @@ class Base(DeclarativeBase):
    @declared_attr.directive
    def __tablename__(cls) -> str:
        return cls.__name__
        return _pascal_case_to_snake_case(cls.__name__)

    # NOTE: This is the default implementation of __repr__ for all tables,
    # but it is preferable to override it for each table to get a nicer
    # looking representation. This trades a bit of messiness for a complete
    # output of all relevant fields.
    def __repr__(self) -> str:
        columns = ", ".join(
            f"{k}={repr(v)}"


@@ -1,5 +1,6 @@
from __future__ import annotations

from typing import TYPE_CHECKING
from typing import Self

from sqlalchemy import (
    Boolean,
@@ -9,39 +10,44 @@ from sqlalchemy import (
from sqlalchemy.orm import (
    Mapped,
    mapped_column,
    relationship,
)

from .Base import Base

if TYPE_CHECKING:
    from .PurchaseEntry import PurchaseEntry
    from .UserProducts import UserProducts


class Product(Base):
    __tablename__ = "products"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    """Internal database ID"""

    product_id: Mapped[int] = mapped_column(Integer, primary_key=True)
    bar_code: Mapped[str] = mapped_column(String(13))
    name: Mapped[str] = mapped_column(String(45))
    price: Mapped[int] = mapped_column(Integer)
    stock: Mapped[int] = mapped_column(Integer)
    hidden: Mapped[bool] = mapped_column(Boolean, nullable=False, default=False)

    bar_code: Mapped[str] = mapped_column(String(13), unique=True)
    """
    The bar code of the product.

    purchases: Mapped[set[PurchaseEntry]] = relationship(back_populates="product")
    users: Mapped[set[UserProducts]] = relationship(back_populates="product")

    This is a unique identifier for the product, typically a 13-digit
    EAN-13 code.
    """

    bar_code_re = r"[0-9]+"
    name_re = r".+"
    name_length = 45

    name: Mapped[str] = mapped_column(String(45), unique=True)
    """
    The name of the product.

    def __init__(self, bar_code, name, price, stock=0, hidden=False):
        self.name = name

    Please don't write fanfics here, this is not a place for that.
    """

    hidden: Mapped[bool] = mapped_column(Boolean, default=False)
    """
    Whether the product is hidden from the user interface.

    Hidden products are not shown in the product list, but can still be
    used in transactions.
    """

    def __init__(
        self: Self,
        bar_code: str,
        name: str,
        hidden: bool = False,
    ) -> None:
        self.bar_code = bar_code
        self.price = price
        self.stock = stock
        self.name = name
        self.hidden = hidden

    def __str__(self):
        return self.name


@@ -0,0 +1,16 @@
from datetime import datetime

from sqlalchemy import Integer, DateTime
from sqlalchemy.orm import Mapped, mapped_column

from dibbler.models import Base


class ProductCache(Base):
    product_id: Mapped[int] = mapped_column(Integer, primary_key=True)
    price: Mapped[int] = mapped_column(Integer)
    price_timestamp: Mapped[datetime] = mapped_column(DateTime)
    stock: Mapped[int] = mapped_column(Integer)
    stock_timestamp: Mapped[datetime] = mapped_column(DateTime)


@@ -1,70 +0,0 @@
from __future__ import annotations

from typing import TYPE_CHECKING
from datetime import datetime
import math

from sqlalchemy import (
    DateTime,
    Integer,
)
from sqlalchemy.orm import (
    Mapped,
    mapped_column,
    relationship,
)

from .Base import Base
from .Transaction import Transaction

if TYPE_CHECKING:
    from .PurchaseEntry import PurchaseEntry


class Purchase(Base):
    __tablename__ = "purchases"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    time: Mapped[datetime] = mapped_column(DateTime)
    price: Mapped[int] = mapped_column(Integer)

    transactions: Mapped[set[Transaction]] = relationship(
        back_populates="purchase", order_by="Transaction.user_name"
    )
    entries: Mapped[set[PurchaseEntry]] = relationship(back_populates="purchase")

    def __init__(self):
        pass

    def is_complete(self):
        return len(self.transactions) > 0 and len(self.entries) > 0

    def price_per_transaction(self, round_up=True):
        if round_up:
            return int(math.ceil(float(self.price) / len(self.transactions)))
        else:
            return int(math.floor(float(self.price) / len(self.transactions)))

    def set_price(self, round_up=True):
        self.price = 0
        for entry in self.entries:
            self.price += entry.amount * entry.product.price
        if len(self.transactions) > 0:
            for t in self.transactions:
                t.amount = self.price_per_transaction(round_up=round_up)

    def perform_purchase(self, ignore_penalty=False, round_up=True):
        self.time = datetime.datetime.now()
        self.set_price(round_up=round_up)
        for t in self.transactions:
            t.perform_transaction(ignore_penalty=ignore_penalty)
        for entry in self.entries:
            entry.product.stock -= entry.amount

    def perform_soft_purchase(self, price, round_up=True):
        self.time = datetime.datetime.now()
        self.price = price
        for t in self.transactions:
            t.amount = self.price_per_transaction(round_up=round_up)
        for t in self.transactions:
            t.perform_transaction()


@@ -1,37 +0,0 @@
from __future__ import annotations

from typing import TYPE_CHECKING

from sqlalchemy import (
    Integer,
    ForeignKey,
)
from sqlalchemy.orm import (
    Mapped,
    mapped_column,
    relationship,
)

from .Base import Base

if TYPE_CHECKING:
    from .Product import Product
    from .Purchase import Purchase


class PurchaseEntry(Base):
    __tablename__ = "purchase_entries"

    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    amount: Mapped[int] = mapped_column(Integer)

    product_id: Mapped[int] = mapped_column(ForeignKey("products.product_id"))
    purchase_id: Mapped[int] = mapped_column(ForeignKey("purchases.id"))

    product: Mapped[Product] = relationship(lazy="joined")
    purchase: Mapped[Purchase] = relationship(lazy="joined")

    def __init__(self, purchase, product, amount):
        self.product = product
        self.product_bar_code = product.bar_code
        self.purchase = purchase
        self.amount = amount


@@ -1,52 +1,575 @@
from __future__ import annotations

from typing import TYPE_CHECKING
from datetime import datetime
from typing import TYPE_CHECKING, Self

from sqlalchemy import (
    CheckConstraint,
    DateTime,
    ForeignKey,
    Integer,
    String,
    Text,
    and_,
    column,
    func,
    or_,
)
from sqlalchemy.orm import (
    Mapped,
    mapped_column,
    relationship,
)
from sqlalchemy.orm.collections import (
    InstrumentedDict,
    InstrumentedList,
    InstrumentedSet,
)
from sqlalchemy.sql.schema import Index

from .Base import Base
from .TransactionType import TransactionType, TransactionTypeSQL

if TYPE_CHECKING:
    from .Product import Product
    from .User import User
    from .Purchase import Purchase

# TODO: rename to *_PERCENT
# NOTE: these only matter when there are no adjustments made in the database.
DEFAULT_INTEREST_RATE_PERCENTAGE = 100
DEFAULT_PENALTY_THRESHOLD = -100
DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE = 200

# TODO: allow for joint transactions?
# dibbler allows joint transactions (e.g. buying more than one product at once, several people buying the same product, etc.)
# instead of having the software split the transactions up, making them hard to reconnect,
# maybe we should add some sort of joint transaction id field to allow multiple transactions to be grouped together?

_DYNAMIC_FIELDS: set[str] = {
    "amount",
    "interest_rate_percent",
    "joint_transaction_id",
    "penalty_multiplier_percent",
    "penalty_threshold",
    "per_product",
    "product_count",
    "product_id",
    "transfer_user_id",
}

EXPECTED_FIELDS: dict[TransactionType, set[str]] = {
    TransactionType.ADD_PRODUCT: {"amount", "per_product", "product_count", "product_id"},
    TransactionType.ADJUST_BALANCE: {"amount"},
    TransactionType.ADJUST_INTEREST: {"interest_rate_percent"},
    TransactionType.ADJUST_PENALTY: {"penalty_multiplier_percent", "penalty_threshold"},
    TransactionType.ADJUST_STOCK: {"product_count", "product_id"},
    TransactionType.BUY_PRODUCT: {"product_count", "product_id"},
    TransactionType.JOINT: {"product_count", "product_id"},
    TransactionType.JOINT_BUY_PRODUCT: {"joint_transaction_id"},
    TransactionType.THROW_PRODUCT: {"product_count", "product_id"},
    TransactionType.TRANSFER: {"amount", "transfer_user_id"},
}

assert all(x <= _DYNAMIC_FIELDS for x in EXPECTED_FIELDS.values()), (
    "All expected fields must be part of _DYNAMIC_FIELDS."
)


def _transaction_type_field_constraints(
    transaction_type: TransactionType,
    expected_fields: set[str],
) -> CheckConstraint:
    unexpected_fields = _DYNAMIC_FIELDS - expected_fields
    return CheckConstraint(
        or_(
            column("type") != transaction_type.value,
            and_(
                *[column(field).is_not(None) for field in expected_fields],
                *[column(field).is_(None) for field in unexpected_fields],
            ),
        ),
        name=f"trx_type_{transaction_type.value}_expected_fields",
    )


class Transaction(Base):
    __tablename__ = "transactions"
    __table_args__ = (
        *[
            _transaction_type_field_constraints(transaction_type, expected_fields)
            for transaction_type, expected_fields in EXPECTED_FIELDS.items()
        ],
        CheckConstraint(
            or_(
                column("type") != TransactionType.TRANSFER.value,
                column("user_id") != column("transfer_user_id"),
            ),
            name="trx_type_transfer_no_self_transfers",
        ),
        CheckConstraint(
            func.coalesce(column("product_count"), 1) != 0,
            name="trx_product_count_non_zero",
        ),
        CheckConstraint(
            func.coalesce(column("penalty_multiplier_percent"), 100) >= 100,
name="trx_penalty_multiplier_percent_min_100",
),
CheckConstraint(
func.coalesce(column("interest_rate_percent"), 0) >= 0,
name="trx_interest_rate_percent_non_negative",
),
CheckConstraint(
func.coalesce(column("amount"), 1) != 0,
name="trx_amount_non_zero",
),
CheckConstraint(
func.coalesce(column("per_product"), 1) > 0,
name="trx_per_product_positive",
),
CheckConstraint(
func.coalesce(column("penalty_threshold"), 0) <= 0,
name="trx_penalty_threshold_max_0",
),
CheckConstraint(
or_(
column("joint_transaction_id").is_(None),
column("joint_transaction_id") != column("id"),
),
name="trx_joint_transaction_id_not_self",
),
# Speed up product count calculation
Index("product_user_time", "product_id", "user_id", "time"),
# Speed up product owner calculation
Index("user_product_time", "user_id", "product_id", "time"),
# Speed up user transaction list / credit calculation
Index("user_time", "user_id", "time"),
)
id: Mapped[int] = mapped_column(Integer, primary_key=True)
"""
A unique identifier for the transaction.
Not used for anything other than identifying the transaction in the database.
"""
time: Mapped[datetime] = mapped_column(DateTime)
"""
The time when the transaction took place.
This is used to order transactions chronologically, and to calculate
all kinds of state.
"""
message: Mapped[str | None] = mapped_column(Text, nullable=True)
"""
A message that can be set by the user to describe the reason
behind the transaction (or potentially a place to write some fan fiction).
This is not used for any calculations, but can be useful for debugging.
"""
type_: Mapped[TransactionType] = mapped_column(TransactionTypeSQL, name="type")
"""
Which type of transaction this is.
The type determines which fields are expected to be set.
"""
amount: Mapped[int | None] = mapped_column(Integer)
"""
This field means different things depending on the transaction type:
- `ADD_PRODUCT`: The real amount spent on the products.
- `ADJUST_BALANCE`: The amount of credit to add or subtract from the user's balance.
- `TRANSFER`: The amount of balance to transfer to another user.
"""
per_product: Mapped[int | None] = mapped_column(Integer)
"""
When adding products, this is the price of a single product.
Note that this is distinct from the total amount of the transaction,
because this gets rounded up to the nearest integer, while the total amount
that the user paid in the store would be stored in the `amount` field.
"""
user_id: Mapped[int] = mapped_column(ForeignKey("users.id"))
"""The user who performs the transaction. See `user` for more details."""
user: Mapped[User] = relationship(
lazy="joined",
foreign_keys=[user_id],
)
"""
The user who performs the transaction.
For some transaction types, like `TRANSFER` and `ADD_PRODUCT`, this is a
functional field with "real world consequences" for price calculations.
For others, like `ADJUST_PENALTY` and `ADJUST_STOCK`, this is just a record of who
performed the transaction, and does not affect any state calculations.
In the case of `JOINT` transactions, this is the user who initiated the joint transaction.
"""
joint_transaction_id: Mapped[int | None] = mapped_column(ForeignKey("transactions.id"))
"""
An optional ID to group multiple transactions together as part of a joint transaction.
This is used for `JOINT` and `JOINT_BUY_PRODUCT` transactions, where multiple users
are involved in a single transaction.
"""
joint_transaction: Mapped[Transaction | None] = relationship(
lazy="joined",
foreign_keys=[joint_transaction_id],
)
"""
The joint transaction that this transaction is part of, if any.
"""
# Receiving user when moving credit from one user to another
transfer_user_id: Mapped[int | None] = mapped_column(ForeignKey("users.id"))
"""The user who receives money in a `TRANSFER` transaction."""
transfer_user: Mapped[User | None] = relationship(
lazy="joined",
foreign_keys=[transfer_user_id],
)
"""The user who receives money in a `TRANSFER` transaction."""
# The product that is either being added or bought
product_id: Mapped[int | None] = mapped_column(ForeignKey("products.id"))
"""The product being added or bought."""
product: Mapped[Product | None] = relationship(lazy="joined")
"""The product being added or bought."""
# The amount of products being added or bought
product_count: Mapped[int | None] = mapped_column(Integer)
"""
The amount of products being added or bought.
This is always relative to the existing stock.
- `ADD_PRODUCT` increases the stock by this amount.
- `BUY_PRODUCT` decreases the stock by this amount.
- `ADJUST_STOCK` increases or decreases the stock by this amount,
depending on whether the amount is positive or negative.
"""
penalty_threshold: Mapped[int | None] = mapped_column(Integer, nullable=True)
"""
On `ADJUST_PENALTY` transactions, this is the threshold in krs for when the user
should start getting penalized for low credit.
See also `penalty_multiplier`.
"""
penalty_multiplier_percent: Mapped[int | None] = mapped_column(Integer, nullable=True)
"""
On `ADJUST_PENALTY` transactions, this is the multiplier for the amount of
money the user has to pay when they have too low credit.
The multiplier is a percentage, so `100` means the user has to pay the full
price of the product, `200` means they have to pay double, etc.
See also `penalty_threshold`.
"""
# TODO: this should be inferred
# Assuming this is a BUY_PRODUCT transaction, was the user penalized for having
# too low credit in this transaction?
# is_penalized: Mapped[Boolean] = mapped_column(Boolean, default=False)
interest_rate_percent: Mapped[int | None] = mapped_column(Integer, nullable=True)
"""
On `ADJUST_INTEREST` transactions, this is the interest rate in percent
applied to product prices.
The rate is a percentage, so `100` means the user pays the full price of
the product, `200` means they pay double, etc.
"""
def __init__(
self: Self,
type_: TransactionType,
user_id: int,
amount: int | None = None,
interest_rate_percent: int | None = None,
joint_transaction_id: int | None = None,
message: str | None = None,
penalty_multiplier_percent: int | None = None,
penalty_threshold: int | None = None,
per_product: int | None = None,
product_count: int | None = None,
product_id: int | None = None,
time: datetime | None = None,
transfer_user_id: int | None = None,
) -> None:
"""
Please do not call this constructor directly, use the factory methods instead.
"""
if time is None:
time = datetime.now()
self.amount = amount
self.interest_rate_percent = interest_rate_percent
self.joint_transaction_id = joint_transaction_id
self.message = message
self.penalty_multiplier_percent = penalty_multiplier_percent
self.penalty_threshold = penalty_threshold
self.per_product = per_product
self.product_count = product_count
self.product_id = product_id
self.time = time
self.transfer_user_id = transfer_user_id
self.type_ = type_
self.user_id = user_id
self._validate_by_transaction_type()
def _validate_by_transaction_type(self: Self) -> None:
"""
Validates the transaction's fields based on its type.
Raises `ValueError` if the transaction is invalid.
"""
# TODO: do we allow free products?
if self.amount == 0:
raise ValueError("Amount must not be zero.")
for field in EXPECTED_FIELDS[self.type_]:
if getattr(self, field) is None:
raise ValueError(f"{field} must not be None for {self.type_.value} transactions.")
for field in _DYNAMIC_FIELDS - EXPECTED_FIELDS[self.type_]:
if getattr(self, field) is not None:
raise ValueError(f"{field} must be None for {self.type_.value} transactions.")
if self.per_product is not None and self.per_product <= 0:
raise ValueError("per_product must be greater than zero.")
if (
self.per_product is not None
and self.product_count is not None
and self.amount is not None
and self.amount > self.per_product * self.product_count
):
raise ValueError(
"The real amount of the transaction must not exceed the total value of the products."
)
# TODO: improve printing further
def __repr__(self) -> str:
sort_order = [
"id",
"time",
]
columns = ", ".join(
f"{k}={repr(v)}"
for k, v in sorted(
self.__dict__.items(),
key=lambda item: chr(sort_order.index(item[0]))
if item[0] in sort_order
else item[0],
)
if not any(
[
k == "type_",
(k == "message" and v is None),
k.startswith("_"),
# Ensure that we don't try to print out the entire list of
# relationships, which could create an infinite loop
isinstance(v, Base),
isinstance(v, InstrumentedList),
isinstance(v, InstrumentedSet),
isinstance(v, InstrumentedDict),
*[k in (_DYNAMIC_FIELDS - EXPECTED_FIELDS[self.type_])],
]
)
)
return f"{self.type_.upper()}({columns})"
###################
# FACTORY METHODS #
###################
@classmethod
def adjust_balance(
cls: type[Self],
amount: int,
user_id: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.ADJUST_BALANCE,
amount=amount,
user_id=user_id,
message=message,
)
@classmethod
def adjust_interest(
cls: type[Self],
interest_rate_percent: int,
user_id: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.ADJUST_INTEREST,
interest_rate_percent=interest_rate_percent,
user_id=user_id,
message=message,
)
@classmethod
def adjust_penalty(
cls: type[Self],
penalty_multiplier_percent: int,
penalty_threshold: int,
user_id: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.ADJUST_PENALTY,
penalty_multiplier_percent=penalty_multiplier_percent,
penalty_threshold=penalty_threshold,
user_id=user_id,
message=message,
)
@classmethod
def adjust_stock(
cls: type[Self],
user_id: int,
product_id: int,
product_count: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.ADJUST_STOCK,
user_id=user_id,
product_id=product_id,
product_count=product_count,
message=message,
)
@classmethod
def add_product(
cls: type[Self],
amount: int,
user_id: int,
product_id: int,
per_product: int,
product_count: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.ADD_PRODUCT,
amount=amount,
user_id=user_id,
product_id=product_id,
per_product=per_product,
product_count=product_count,
message=message,
)
@classmethod
def buy_product(
cls: type[Self],
user_id: int,
product_id: int,
product_count: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.BUY_PRODUCT,
user_id=user_id,
product_id=product_id,
product_count=product_count,
message=message,
)
@classmethod
def joint(
cls: type[Self],
user_id: int,
product_id: int,
product_count: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.JOINT,
user_id=user_id,
product_id=product_id,
product_count=product_count,
message=message,
)
@classmethod
def joint_buy_product(
cls: type[Self],
joint_transaction_id: int,
user_id: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.JOINT_BUY_PRODUCT,
joint_transaction_id=joint_transaction_id,
user_id=user_id,
message=message,
)
@classmethod
def transfer(
cls: type[Self],
amount: int,
user_id: int,
transfer_user_id: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.TRANSFER,
amount=amount,
user_id=user_id,
transfer_user_id=transfer_user_id,
message=message,
)
@classmethod
def throw_product(
cls: type[Self],
user_id: int,
product_id: int,
product_count: int,
time: datetime | None = None,
message: str | None = None,
) -> Self:
return cls(
time=time,
type_=TransactionType.THROW_PRODUCT,
user_id=user_id,
product_id=product_id,
product_count=product_count,
message=message,
)
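The `EXPECTED_FIELDS`-driven validation performed by `_validate_by_transaction_type` can be sketched standalone. This is a minimal sketch using hypothetical plain dicts and a trimmed-down field table, not the real ORM model:

```python
EXPECTED_FIELDS = {
    "transfer": {"amount", "transfer_user_id"},
    "buy_product": {"product_count", "product_id"},
}
DYNAMIC_FIELDS = {"amount", "transfer_user_id", "product_count", "product_id"}

def validate(trx: dict) -> None:
    expected = EXPECTED_FIELDS[trx["type"]]
    # Every field expected for this type must be set ...
    for field in expected:
        if trx.get(field) is None:
            raise ValueError(f"{field} must not be None for {trx['type']} transactions.")
    # ... and every other dynamic field must be unset.
    for field in DYNAMIC_FIELDS - expected:
        if trx.get(field) is not None:
            raise ValueError(f"{field} must be None for {trx['type']} transactions.")
```

The database-level `CheckConstraint`s generated by `_transaction_type_field_constraints` enforce the same invariant, so a row that slips past application code is still rejected by the database.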


@@ -0,0 +1,29 @@
from enum import StrEnum, auto
from sqlalchemy import Enum as SQLEnum
class TransactionType(StrEnum):
"""
Enum for transaction types.
"""
ADD_PRODUCT = auto()
ADJUST_BALANCE = auto()
ADJUST_INTEREST = auto()
ADJUST_PENALTY = auto()
ADJUST_STOCK = auto()
BUY_PRODUCT = auto()
JOINT = auto()
JOINT_BUY_PRODUCT = auto()
THROW_PRODUCT = auto()
TRANSFER = auto()
TransactionTypeSQL = SQLEnum(
TransactionType,
native_enum=True,
create_constraint=True,
validate_strings=True,
values_callable=lambda x: [i.value for i in x],
)


@@ -1,5 +1,6 @@
from __future__ import annotations
from typing import TYPE_CHECKING
from typing import Self
from sqlalchemy import (
Integer,
@@ -8,42 +9,35 @@ from sqlalchemy import (
from sqlalchemy.orm import (
Mapped,
mapped_column,
relationship,
)
from .Base import Base
if TYPE_CHECKING:
from .UserProducts import UserProducts
from .Transaction import Transaction
class User(Base):
__tablename__ = "users"
id: Mapped[int] = mapped_column(Integer, primary_key=True)
"""Internal database ID"""
name: Mapped[str] = mapped_column(String(20), unique=True)
"""The PVV username of the user."""
card: Mapped[str | None] = mapped_column(String(20))
"""The NTNU card number of the user."""
rfid: Mapped[str | None] = mapped_column(String(20))
"""The RFID tag of the user (if they have any, rare these days)."""
products: Mapped[set[UserProducts]] = relationship(back_populates="user")
transactions: Mapped[set[Transaction]] = relationship(back_populates="user")
name_re = r"[a-z]+"
card_re = r"(([Nn][Tt][Nn][Uu])?[0-9]+)?"
rfid_re = r"[0-9a-fA-F]*"
def __init__(self: Self, name: str, card: str | None = None, rfid: str | None = None) -> None:
self.name = name
if card == "":
card = None
self.card = card
if rfid == "":
rfid = None
self.rfid = rfid
# def __str__(self):
# return self.name
# def is_anonymous(self):
# return self.card == "11122233"


@@ -0,0 +1,14 @@
from datetime import datetime
from sqlalchemy import Integer, DateTime
from sqlalchemy.orm import Mapped, mapped_column
from dibbler.models import Base
# More like user balance cash money flow, amirite?
class UserBalanceCache(Base):
__tablename__ = "user_balance_cache"  # NOTE: name assumed; a declarative model needs a __tablename__
user_id: Mapped[int] = mapped_column(Integer, primary_key=True)
balance: Mapped[int] = mapped_column(Integer)
timestamp: Mapped[datetime] = mapped_column(DateTime)


@@ -1,31 +0,0 @@
from __future__ import annotations
from typing import TYPE_CHECKING
from sqlalchemy import (
Integer,
ForeignKey,
)
from sqlalchemy.orm import (
Mapped,
mapped_column,
relationship,
)
from .Base import Base
if TYPE_CHECKING:
from .User import User
from .Product import Product
class UserProducts(Base):
__tablename__ = "user_products"
user_name: Mapped[str] = mapped_column(ForeignKey("users.name"), primary_key=True)
product_id: Mapped[int] = mapped_column(ForeignKey("products.product_id"), primary_key=True)
count: Mapped[int] = mapped_column(Integer)
sign: Mapped[int] = mapped_column(Integer)
user: Mapped[User] = relationship()
product: Mapped[Product] = relationship()


@@ -1,17 +1,13 @@
__all__ = [
'Base',
'Product',
'Purchase',
'PurchaseEntry',
'Transaction',
'User',
'UserProducts',
"Base",
"Product",
"Transaction",
"TransactionType",
"User",
]
from .Base import Base
from .Product import Product
from .Purchase import Purchase
from .PurchaseEntry import PurchaseEntry
from .Transaction import Transaction
from .TransactionType import TransactionType
from .User import User
from .UserProducts import UserProducts


@@ -0,0 +1,37 @@
__all__ = [
# "add_product",
# "add_user",
"adjust_interest",
"adjust_penalty",
"current_interest",
"current_penalty",
"joint_buy_product",
"product_owners",
"product_owners_log",
"product_price",
"product_price_log",
"product_stock",
# "products_owned_by_user",
"search_product",
"search_user",
"transaction_log",
"user_balance",
"user_balance_log",
]
# from .add_product import add_product
# from .add_user import add_user
from .adjust_interest import adjust_interest
from .adjust_penalty import adjust_penalty
from .current_interest import current_interest
from .current_penalty import current_penalty
from .joint_buy_product import joint_buy_product
from .product_owners import product_owners, product_owners_log
from .product_price import product_price, product_price_log
from .product_stock import product_stock
# from .products_owned_by_user import products_owned_by_user
from .search_product import search_product
from .search_user import search_user
from .transaction_log import transaction_log
from .user_balance import user_balance, user_balance_log


@@ -0,0 +1 @@
# TODO: implement me


@@ -0,0 +1 @@
# TODO: implement me


@@ -0,0 +1,25 @@
from sqlalchemy.orm import Session
from dibbler.models import Transaction
# TODO: this type of transaction should be password protected.
# the password can be set as a string literal in the config file.
def adjust_interest(
sql_session: Session,
user_id: int,
new_interest: int,
message: str | None = None,
) -> None:
if new_interest < 0:
raise ValueError("Interest rate cannot be negative")
transaction = Transaction.adjust_interest(
user_id=user_id,
interest_rate_percent=new_interest,
message=message,
)
sql_session.add(transaction)
sql_session.commit()


@@ -0,0 +1,38 @@
from sqlalchemy.orm import Session
from dibbler.models import Transaction
from dibbler.queries.current_penalty import current_penalty
# TODO: this type of transaction should be password protected.
# the password can be set as a string literal in the config file.
def adjust_penalty(
sql_session: Session,
user_id: int,
new_penalty: int | None = None,
new_penalty_multiplier: int | None = None,
message: str | None = None,
) -> None:
if new_penalty is None and new_penalty_multiplier is None:
raise ValueError("At least one of new_penalty or new_penalty_multiplier must be provided")
if new_penalty_multiplier is not None and new_penalty_multiplier < 100:
raise ValueError("Penalty multiplier cannot be less than 100%")
if new_penalty is None or new_penalty_multiplier is None:
existing_penalty, existing_penalty_multiplier = current_penalty(sql_session)
if new_penalty is None:
new_penalty = existing_penalty
if new_penalty_multiplier is None:
new_penalty_multiplier = existing_penalty_multiplier
transaction = Transaction.adjust_penalty(
user_id=user_id,
penalty_threshold=new_penalty,
penalty_multiplier_percent=new_penalty_multiplier,
message=message,
)
sql_session.add(transaction)
sql_session.commit()


@@ -0,0 +1,21 @@
from sqlalchemy import select
from sqlalchemy.orm import Session
from dibbler.models import Transaction, TransactionType
from dibbler.models.Transaction import DEFAULT_INTEREST_RATE_PERCENTAGE
def current_interest(sql_session: Session) -> int:
result = sql_session.scalars(
select(Transaction)
.where(Transaction.type_ == TransactionType.ADJUST_INTEREST)
.order_by(Transaction.time.desc())
.limit(1)
).one_or_none()
if result is None:
return DEFAULT_INTEREST_RATE_PERCENTAGE
elif result.interest_rate_percent is None:
return DEFAULT_INTEREST_RATE_PERCENTAGE
else:
return result.interest_rate_percent


@@ -0,0 +1,25 @@
from sqlalchemy import select
from sqlalchemy.orm import Session
from dibbler.models import Transaction, TransactionType
from dibbler.models.Transaction import (
DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE,
DEFAULT_PENALTY_THRESHOLD,
)
def current_penalty(sql_session: Session) -> tuple[int, int]:
result = sql_session.scalars(
select(Transaction)
.where(Transaction.type_ == TransactionType.ADJUST_PENALTY)
.order_by(Transaction.time.desc())
.limit(1)
).one_or_none()
if result is None:
return DEFAULT_PENALTY_THRESHOLD, DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE
assert result.penalty_threshold is not None, "Penalty threshold must be set"
assert result.penalty_multiplier_percent is not None, "Penalty multiplier percent must be set"
return result.penalty_threshold, result.penalty_multiplier_percent
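How the two penalty settings interact when buying is sketched below. This is an assumption based on the docstrings (the multiplier applies only while the balance is below the threshold, rounding up to whole kroner); `apply_penalty` is a hypothetical helper, not part of the codebase:

```python
import math

def apply_penalty(price: int, balance: int, threshold: int, multiplier_percent: int) -> int:
    # Below the threshold, the price is scaled by the multiplier (in percent);
    # ceil keeps prices in whole kroner.
    if balance < threshold:
        return math.ceil(price * multiplier_percent / 100)
    return price

print(apply_penalty(10, -150, -100, 200))  # 20
print(apply_penalty(10, -50, -100, 200))   # 10
```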


@@ -0,0 +1,50 @@
from datetime import datetime
from sqlalchemy.orm import Session
from dibbler.models import (
Product,
Transaction,
User,
)
def joint_buy_product(
sql_session: Session,
product: Product,
product_count: int,
instigator: User,
users: list[User],
time: datetime | None = None,
message: str | None = None,
) -> None:
"""
Create buy product transactions for multiple users at once.
"""
if instigator not in users:
raise ValueError("Instigator must be in the list of users buying the product.")
if product_count <= 0:
raise ValueError("Product count must be positive.")
joint_transaction = Transaction.joint(
user_id=instigator.id,
product_id=product.id,
product_count=product_count,
time=time,
message=message,
)
sql_session.add(joint_transaction)
sql_session.flush() # Ensure joint_transaction gets an ID
for user in users:
buy_transaction = Transaction.joint_buy_product(
user_id=user.id,
joint_transaction_id=joint_transaction.id,
time=time,
message=message,
)
sql_session.add(buy_transaction)
sql_session.commit()
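The record layout produced above (one `JOINT` parent, one `JOINT_BUY_PRODUCT` child per participant) can be sketched with plain dicts; these are hypothetical stand-ins, not the ORM objects, and `parent_id` stands in for the ID the database assigns on `flush()`:

```python
def build_joint_records(instigator_id, user_ids, product_id, product_count, parent_id=1):
    if instigator_id not in user_ids:
        raise ValueError("Instigator must be in the list of users buying the product.")
    parent = {
        "id": parent_id,
        "type": "joint",
        "user_id": instigator_id,
        "product_id": product_id,
        "product_count": product_count,
    }
    # One child transaction per participating user, all pointing at the parent
    children = [
        {"type": "joint_buy_product", "user_id": uid, "joint_transaction_id": parent_id}
        for uid in user_ids
    ]
    return parent, children

parent, children = build_joint_records(1, [1, 2, 3], product_id=7, product_count=3)
print(len(children))  # 3
```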


@@ -0,0 +1,276 @@
from datetime import datetime
from dataclasses import dataclass
from sqlalchemy import (
CTE,
and_,
asc,
case,
func,
literal,
select,
)
from sqlalchemy.orm import Session
from dibbler.models import (
Product,
Transaction,
TransactionType,
User,
)
from dibbler.queries.product_stock import _product_stock_query
def _product_owners_query(
product_id: int,
use_cache: bool = True,
until: datetime | None = None,
cte_name: str = "rec_cte",
) -> CTE:
"""
The inner query for inferring the owners of a given product.
"""
if use_cache:
print("WARNING: Using cache for users owning product query is not implemented yet.")
product_stock = _product_stock_query(
product_id=product_id,
use_cache=use_cache,
until=until,
)
# Subset of transactions that we'll want to iterate over.
trx_subset = (
select(
func.row_number().over(order_by=asc(Transaction.time)).label("i"),
Transaction.time,
Transaction.id,
Transaction.type_,
Transaction.user_id,
Transaction.product_count,
)
.where(
Transaction.type_.in_(
[
TransactionType.ADD_PRODUCT,
TransactionType.BUY_PRODUCT,
TransactionType.ADJUST_STOCK,
TransactionType.JOINT,
TransactionType.THROW_PRODUCT,
]
),
Transaction.product_id == product_id,
literal(True) if until is None else Transaction.time <= until,
)
.order_by(Transaction.time.desc())
.subquery()
)
initial_element = select(
literal(0).label("i"),
literal(0).label("time"),
literal(None).label("transaction_id"),
literal(None).label("user_id"),
literal(0).label("product_count"),
product_stock.scalar_subquery().label("products_left_to_account_for"),
)
recursive_cte = initial_element.cte(name=cte_name, recursive=True)
recursive_elements = (
select(
trx_subset.c.i,
trx_subset.c.time,
trx_subset.c.id.label("transaction_id"),
# Who added the product (if any)
case(
# Someone adds the product -> they own it
(
trx_subset.c.type_ == TransactionType.ADD_PRODUCT,
trx_subset.c.user_id,
),
else_=None,
).label("user_id"),
# How many products did they add (if any)
case(
# Someone adds the product -> they added a certain amount of products
(trx_subset.c.type_ == TransactionType.ADD_PRODUCT, trx_subset.c.product_count),
# Stock got adjusted upwards -> consider those products as added by nobody
(
(trx_subset.c.type_ == TransactionType.ADJUST_STOCK)
& (trx_subset.c.product_count > 0),
trx_subset.c.product_count,
),
else_=0,
).label("product_count"),
# How many products left to account for
case(
# Someone adds the product -> increase the number of products left to account for
(
trx_subset.c.type_ == TransactionType.ADD_PRODUCT,
recursive_cte.c.products_left_to_account_for - trx_subset.c.product_count,
),
# Someone buys/joins/throws the product -> decrease the number of products left to account for
(
trx_subset.c.type_.in_(
[
TransactionType.BUY_PRODUCT,
TransactionType.JOINT,
TransactionType.THROW_PRODUCT,
]
),
recursive_cte.c.products_left_to_account_for - trx_subset.c.product_count,
),
# Someone adjusts the stock ->
# If adjusted upwards -> products owned by nobody, decrease products left to account for
# If adjusted downwards -> products taken away from owners, decrease products left to account for
(
(trx_subset.c.type_ == TransactionType.ADJUST_STOCK)
& (trx_subset.c.product_count > 0),
recursive_cte.c.products_left_to_account_for - trx_subset.c.product_count,
),
(
(trx_subset.c.type_ == TransactionType.ADJUST_STOCK)
& (trx_subset.c.product_count < 0),
recursive_cte.c.products_left_to_account_for + trx_subset.c.product_count,
),
else_=recursive_cte.c.products_left_to_account_for,
).label("products_left_to_account_for"),
)
.select_from(trx_subset)
.where(
and_(
trx_subset.c.i == recursive_cte.c.i + 1,
recursive_cte.c.products_left_to_account_for > 0,
)
)
)
return recursive_cte.union_all(recursive_elements)
@dataclass
class ProductOwnersLogEntry:
transaction: Transaction
user: User | None
products_left_to_account_for: int
def product_owners_log(
sql_session: Session,
product: Product,
use_cache: bool = True,
until: Transaction | None = None,
) -> list[ProductOwnersLogEntry]:
"""
Returns a log of the product ownership calculation for the given product.
If 'until' is given, only transactions up to that time are considered.
"""
recursive_cte = _product_owners_query(
product_id=product.id,
use_cache=use_cache,
until=until.time if until else None,
)
result = sql_session.execute(
select(
Transaction,
User,
recursive_cte.c.products_left_to_account_for,
)
.select_from(recursive_cte)
.join(
Transaction,
onclause=Transaction.id == recursive_cte.c.transaction_id,
)
.join(
User,
onclause=User.id == recursive_cte.c.user_id,
isouter=True,
)
.order_by(recursive_cte.c.i.desc())
).all()
if result is None:
# If there are no transactions for this product, the query should return an empty list, not None.
raise RuntimeError(
f"Something went wrong while calculating the owner log for product {product.name} (ID: {product.id})."
)
return [
ProductOwnersLogEntry(
transaction=row[0],
user=row[1],
products_left_to_account_for=row[2],
)
for row in result
]
def product_owners(
sql_session: Session,
product: Product,
use_cache: bool = True,
until: datetime | None = None,
) -> list[User | None]:
"""
Returns an ordered list of users owning the given product.
If 'until' is given, only transactions up to that time are considered.
"""
recursive_cte = _product_owners_query(
product_id=product.id,
use_cache=use_cache,
until=until,
)
db_result = sql_session.execute(
select(
recursive_cte.c.products_left_to_account_for,
recursive_cte.c.product_count,
User,
)
.join(User, User.id == recursive_cte.c.user_id, isouter=True)
.order_by(recursive_cte.c.i.desc())
).all()
result: list[User | None] = []
none_count = 0
# We are moving backwards through history, but this is the order in which we want to return the list.
# There are three cases:
# - The user is not None -> add the user product_count times.
# - The user is None and product_count is not 0 -> add None product_count times.
# - The user is None and product_count is 0 -> nothing left to attribute for this row.
for products_left_to_account_for, product_count, user in db_result:
if user is not None:
if products_left_to_account_for < 0:
result.extend([user] * (product_count + products_left_to_account_for))
else:
result.extend([user] * product_count)
elif product_count != 0:
if products_left_to_account_for < 0:
none_count += product_count + products_left_to_account_for
else:
none_count += product_count
else:
pass
result.extend([None] * none_count)
# NOTE: if the last row exceeds the remaining product count, it may need truncating:
# result.extend([user] * min(user_count, products_left_to_account_for))
return list(result)
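The expansion step in the loop above boils down to: each row contributes its owner `product_count` times, truncated when `products_left_to_account_for` has gone negative (i.e. the row over-shoots the current stock). A pure-Python rendition of that truncation rule for a single row, simplified so that the `None`-owner bookkeeping takes the same path:

```python
def expand_row(products_left_to_account_for: int, product_count: int, owner):
    # A negative "left to account for" means only part of this row's
    # products still count toward the current stock.
    if products_left_to_account_for < 0:
        n = product_count + products_left_to_account_for
    else:
        n = product_count
    return [owner] * max(n, 0)

print(expand_row(2, 3, "alice"))   # ['alice', 'alice', 'alice']
print(expand_row(-1, 3, "alice"))  # ['alice', 'alice']
```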


@@ -0,0 +1,249 @@
import math
from dataclasses import dataclass
from datetime import datetime
from sqlalchemy import (
ColumnElement,
Integer,
SQLColumnExpression,
asc,
case,
cast,
func,
literal,
select,
)
from sqlalchemy.orm import Session
from dibbler.models import (
Product,
Transaction,
TransactionType,
)
from dibbler.models.Transaction import DEFAULT_INTEREST_RATE_PERCENTAGE
def _product_price_query(
product_id: int | ColumnElement[int],
use_cache: bool = True,
until: datetime | SQLColumnExpression[datetime] | None = None,
until_including: bool = True,
cte_name: str = "rec_cte",
):
"""
The inner query for calculating the product price.
"""
if use_cache:
print("WARNING: Using cache for product price query is not implemented yet.")
initial_element = select(
literal(0).label("i"),
literal(0).label("time"),
literal(None).label("transaction_id"),
literal(0).label("price"),
literal(0).label("product_count"),
)
recursive_cte = initial_element.cte(name=cte_name, recursive=True)
# Subset of transactions that we'll want to iterate over.
trx_subset = (
select(
func.row_number().over(order_by=asc(Transaction.time)).label("i"),
Transaction.id,
Transaction.time,
Transaction.type_,
Transaction.product_count,
Transaction.per_product,
)
.where(
Transaction.type_.in_(
[
TransactionType.BUY_PRODUCT,
TransactionType.ADD_PRODUCT,
TransactionType.ADJUST_STOCK,
TransactionType.JOINT,
]
),
Transaction.product_id == product_id,
case(
(literal(until_including), Transaction.time <= until),
else_=Transaction.time < until,
)
if until is not None
else literal(True),
)
.order_by(Transaction.time.asc())
.alias("trx_subset")
)
recursive_elements = (
select(
trx_subset.c.i,
trx_subset.c.time,
trx_subset.c.id.label("transaction_id"),
case(
# Someone buys the product -> price remains the same.
(trx_subset.c.type_ == TransactionType.BUY_PRODUCT, recursive_cte.c.price),
# Someone adds the product -> price is recalculated based on
# product count, previous price, and new price.
(
trx_subset.c.type_ == TransactionType.ADD_PRODUCT,
cast(
func.ceil(
(
recursive_cte.c.price * func.max(recursive_cte.c.product_count, 0)
+ trx_subset.c.per_product * trx_subset.c.product_count
)
/ (
# The running product count can be negative if the accounting is bad.
# This ensures that we never end up with negative prices or zero divisions
# and other disastrous phenomena.
func.max(recursive_cte.c.product_count, 0)
+ trx_subset.c.product_count
)
),
Integer,
),
),
# Someone adjusts the stock -> price remains the same.
(trx_subset.c.type_ == TransactionType.ADJUST_STOCK, recursive_cte.c.price),
# Should never happen
else_=recursive_cte.c.price,
).label("price"),
case(
# Someone buys the product -> product count is reduced.
(
trx_subset.c.type_ == TransactionType.BUY_PRODUCT,
recursive_cte.c.product_count - trx_subset.c.product_count,
),
(
trx_subset.c.type_ == TransactionType.JOINT,
recursive_cte.c.product_count - trx_subset.c.product_count,
),
# Someone adds the product -> product count is increased.
(
trx_subset.c.type_ == TransactionType.ADD_PRODUCT,
recursive_cte.c.product_count + trx_subset.c.product_count,
),
# Someone adjusts the stock -> product count is adjusted.
(
trx_subset.c.type_ == TransactionType.ADJUST_STOCK,
recursive_cte.c.product_count + trx_subset.c.product_count,
),
# Should never happen
else_=recursive_cte.c.product_count,
).label("product_count"),
)
.select_from(trx_subset)
.where(trx_subset.c.i == recursive_cte.c.i + 1)
)
return recursive_cte.union_all(recursive_elements)
# TODO: create a function for the log that pretty prints the log entries
# for debugging purposes
@dataclass
class ProductPriceLogEntry:
transaction: Transaction
price: int
product_count: int
def product_price_log(
sql_session: Session,
product: Product,
use_cache: bool = True,
until: Transaction | None = None,
) -> list[ProductPriceLogEntry]:
"""
Calculates the price of a product and returns a log of the price changes.
"""
recursive_cte = _product_price_query(
product.id,
use_cache=use_cache,
until=until.time if until else None,
)
result = sql_session.execute(
select(
Transaction,
recursive_cte.c.price,
recursive_cte.c.product_count,
)
.select_from(recursive_cte)
.join(
Transaction,
onclause=Transaction.id == recursive_cte.c.transaction_id,
)
.order_by(recursive_cte.c.i.asc())
).all()
if result is None:
# If there are no transactions for this product, the query should return an empty list, not None.
raise RuntimeError(
f"Something went wrong while calculating the price log for product {product.name} (ID: {product.id})."
)
return [
ProductPriceLogEntry(
transaction=row[0],
price=row.price,
product_count=row.product_count,
)
for row in result
]
def product_price(
sql_session: Session,
product: Product,
use_cache: bool = True,
until: Transaction | None = None,
include_interest: bool = False,
) -> int:
"""
Calculates the price of a product.
"""
recursive_cte = _product_price_query(
product.id,
use_cache=use_cache,
until=until.time if until else None,
)
# TODO: optionally verify subresults:
# - product_count should never be negative (but this happens sometimes, so just a warning)
# - price should never be negative
result = sql_session.scalars(
select(recursive_cte.c.price).order_by(recursive_cte.c.i.desc()).limit(1)
).one_or_none()
if result is None:
# If there are no transactions for this product, the query should return 0, not None.
raise RuntimeError(
f"Something went wrong while calculating the price for product {product.name} (ID: {product.id})."
)
if include_interest:
interest_rate = (
sql_session.scalar(
select(Transaction.interest_rate_percent)
.where(
Transaction.type_ == TransactionType.ADJUST_INTEREST,
literal(True) if until is None else Transaction.time <= until.time,
)
.order_by(Transaction.time.desc())
.limit(1)
)
or DEFAULT_INTEREST_RATE_PERCENTAGE
)
result = math.ceil(result * interest_rate / 100)
return result


@@ -0,0 +1,94 @@
from datetime import datetime
from sqlalchemy import (
Select,
case,
func,
literal,
select,
)
from sqlalchemy.orm import Session
from dibbler.models import (
Product,
Transaction,
TransactionType,
)
def _product_stock_query(
product_id: int,
use_cache: bool = True,
until: datetime | None = None,
) -> Select:
"""
The inner query for calculating the product stock.
"""
if use_cache:
print("WARNING: Using cache for product stock query is not implemented yet.")
query = select(
func.sum(
case(
(
Transaction.type_ == TransactionType.ADD_PRODUCT,
Transaction.product_count,
),
(
Transaction.type_ == TransactionType.ADJUST_STOCK,
Transaction.product_count,
),
(
Transaction.type_ == TransactionType.BUY_PRODUCT,
-Transaction.product_count,
),
(
Transaction.type_ == TransactionType.JOINT,
-Transaction.product_count,
),
(
Transaction.type_ == TransactionType.THROW_PRODUCT,
-Transaction.product_count,
),
else_=0,
)
)
).where(
Transaction.type_.in_(
[
TransactionType.ADD_PRODUCT,
TransactionType.ADJUST_STOCK,
TransactionType.BUY_PRODUCT,
TransactionType.JOINT,
TransactionType.THROW_PRODUCT,
]
),
Transaction.product_id == product_id,
Transaction.time <= until if until is not None else literal(True),
)
return query
def product_stock(
sql_session: Session,
product: Product,
use_cache: bool = True,
until: datetime | None = None,
) -> int:
"""
Returns the number of products in stock.
If 'until' is given, only transactions up to that time are considered.
"""
query = _product_stock_query(
product_id=product.id,
use_cache=use_cache,
until=until,
)
result = sql_session.scalars(query).one_or_none()
return result or 0


@@ -0,0 +1,10 @@
# This absolutely needs a cache, else we can't stop recursing until we know all owners for all products...
#
# Since we know that the non-owned products will not become re-owned by the user by other means,
# we can just check for ownership on the products that have an ADD_PRODUCT transaction for the user
# between now and the cached time.
#
# However, the opposite way is more difficult. The cache will store which products are owned by which users,
# but we still need to check if the user passes out of ownership for the item, without needing to check past
# the cache time. Maybe we also need to store the queue number(s) per user/product combo in the cache? What if
# a user has products in multiple places in the queue, interleaved with other users?


@@ -0,0 +1,39 @@
from sqlalchemy import and_, literal, not_, or_, select
from sqlalchemy.orm import Session
from dibbler.models import Product
def search_product(
string: str,
sql_session: Session,
find_hidden_products=False,
) -> Product | list[Product]:
exact_match = sql_session.scalars(
select(Product).where(
or_(
Product.bar_code == string,
and_(
Product.name == string,
literal(True) if find_hidden_products else not_(Product.hidden),
),
)
)
).first()
if exact_match:
return exact_match
product_list = sql_session.scalars(
select(Product).where(
or_(
Product.bar_code.ilike(f"%{string}%"),
and_(
Product.name.ilike(f"%{string}%"),
literal(True) if find_hidden_products else not_(Product.hidden),
),
)
)
).all()
return list(product_list)


@@ -0,0 +1,36 @@
from sqlalchemy import or_, select
from sqlalchemy.orm import Session
from dibbler.models import User
def search_user(
string: str,
sql_session: Session,
) -> User | list[User]:
string = string.lower()
exact_match = sql_session.scalars(
select(User).where(
or_(
User.name == string,
User.card == string,
User.rfid == string,
)
)
).first()
if exact_match:
return exact_match
user_list = sql_session.scalars(
select(User).where(
or_(
User.name.ilike(f"%{string}%"),
User.card.ilike(f"%{string}%"),
User.rfid.ilike(f"%{string}%"),
)
)
).all()
return list(user_list)


@@ -0,0 +1,84 @@
from sqlalchemy import select
from sqlalchemy.orm import Session
from dibbler.models import (
Product,
Transaction,
TransactionType,
User,
)
# TODO: should this include full joint transactions that involve a user?
# TODO: should this involve throw-away transactions that affect a user?
def transaction_log(
sql_session: Session,
user: User | None = None,
product: Product | None = None,
exclusive_after: bool = False,
after_time=None,
after_transaction_id: int | None = None,
exclusive_before: bool = False,
before_time=None,
before_transaction_id: int | None = None,
transaction_type: list[TransactionType] | None = None,
negate_transaction_type_filter: bool = False,
limit: int | None = None,
) -> list[Transaction]:
"""
Retrieve the transaction log, optionally filtered.
Only one of `user` or `product` may be specified.
Only one of `after_time` or `after_transaction_id` may be specified.
Only one of `before_time` or `before_transaction_id` may be specified.
The before and after filters are inclusive by default.
"""
if not (user is None or product is None):
raise ValueError("Cannot filter by both user and product.")
if not (after_time is None or after_transaction_id is None):
raise ValueError("Cannot filter by both after_time and after_transaction_id.")
if not (before_time is None or before_transaction_id is None):
raise ValueError("Cannot filter by both before_time and before_transaction_id.")
query = select(Transaction)
if user is not None:
query = query.where(Transaction.user_id == user.id)
if product is not None:
query = query.where(Transaction.product_id == product.id)
if after_time is not None:
if exclusive_after:
query = query.where(Transaction.time > after_time)
else:
query = query.where(Transaction.time >= after_time)
if after_transaction_id is not None:
if exclusive_after:
query = query.where(Transaction.id > after_transaction_id)
else:
query = query.where(Transaction.id >= after_transaction_id)
if before_time is not None:
if exclusive_before:
query = query.where(Transaction.time < before_time)
else:
query = query.where(Transaction.time <= before_time)
if before_transaction_id is not None:
if exclusive_before:
query = query.where(Transaction.id < before_transaction_id)
else:
query = query.where(Transaction.id <= before_transaction_id)
if transaction_type is not None:
if negate_transaction_type_filter:
query = query.where(~Transaction.type_.in_(transaction_type))
else:
query = query.where(Transaction.type_.in_(transaction_type))
query = query.order_by(Transaction.time.asc(), Transaction.id.asc())
if limit is not None:
query = query.limit(limit)
result = sql_session.scalars(query).all()
return list(result)


@@ -0,0 +1,334 @@
from dataclasses import dataclass
from datetime import datetime
from sqlalchemy import (
CTE,
Float,
Integer,
and_,
asc,
case,
cast,
column,
func,
literal,
or_,
select,
)
from sqlalchemy.orm import Session
from dibbler.models import (
Transaction,
TransactionType,
User,
)
from dibbler.models.Transaction import (
DEFAULT_INTEREST_RATE_PERCENTAGE,
DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE,
DEFAULT_PENALTY_THRESHOLD,
)
from dibbler.queries.product_price import _product_price_query
def _user_balance_query(
user_id: int,
use_cache: bool = True,
until: datetime | None = None,
until_including: bool = True,
cte_name: str = "rec_cte",
) -> CTE:
"""
The inner query for calculating the user's balance.
"""
if use_cache:
print("WARNING: Using cache for user balance query is not implemented yet.")
initial_element = select(
literal(0).label("i"),
literal(0).label("time"),
literal(None).label("transaction_id"),
literal(0).label("balance"),
literal(DEFAULT_INTEREST_RATE_PERCENTAGE).label("interest_rate_percent"),
literal(DEFAULT_PENALTY_THRESHOLD).label("penalty_threshold"),
literal(DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE).label("penalty_multiplier_percent"),
)
recursive_cte = initial_element.cte(name=cte_name, recursive=True)
# Subset of transactions that we'll want to iterate over.
trx_subset = (
select(
func.row_number().over(order_by=asc(Transaction.time)).label("i"),
Transaction.amount,
Transaction.id,
Transaction.interest_rate_percent,
Transaction.penalty_multiplier_percent,
Transaction.penalty_threshold,
Transaction.product_count,
Transaction.product_id,
Transaction.time,
Transaction.transfer_user_id,
Transaction.type_,
)
.where(
or_(
and_(
Transaction.user_id == user_id,
Transaction.type_.in_(
[
TransactionType.ADD_PRODUCT,
TransactionType.ADJUST_BALANCE,
TransactionType.BUY_PRODUCT,
TransactionType.TRANSFER,
# TODO: join this with the JOINT transactions, and determine
# how much the current user paid for the product.
TransactionType.JOINT_BUY_PRODUCT,
]
),
),
and_(
Transaction.type_ == TransactionType.TRANSFER,
Transaction.transfer_user_id == user_id,
),
Transaction.type_.in_(
[
TransactionType.THROW_PRODUCT,
TransactionType.ADJUST_INTEREST,
TransactionType.ADJUST_PENALTY,
]
),
),
case(
(literal(until_including), Transaction.time <= until),
else_=Transaction.time < until,
)
if until is not None
else literal(True),
)
.order_by(Transaction.time.asc())
.alias("trx_subset")
)
recursive_elements = (
select(
trx_subset.c.i,
trx_subset.c.time,
trx_subset.c.id.label("transaction_id"),
case(
# Adjusts balance -> balance gets adjusted
(
trx_subset.c.type_ == TransactionType.ADJUST_BALANCE,
recursive_cte.c.balance + trx_subset.c.amount,
),
# Adds a product -> balance increases
(
trx_subset.c.type_ == TransactionType.ADD_PRODUCT,
recursive_cte.c.balance + trx_subset.c.amount,
),
# Buys a product -> balance decreases
(
trx_subset.c.type_ == TransactionType.BUY_PRODUCT,
recursive_cte.c.balance
- (
trx_subset.c.product_count
# Price of a single product, accounted for penalties and interest.
* cast(
func.ceil(
# TODO: This can get quite expensive real quick, so we should do some caching of the
# product prices somehow.
# Base price
(
# FIXME: this always returns 0 for some reason...
select(cast(column("price"), Float))
.select_from(
_product_price_query(
trx_subset.c.product_id,
use_cache=use_cache,
until=trx_subset.c.time,
until_including=False,
cte_name="product_price_cte",
)
)
.order_by(column("i").desc())
.limit(1)
).scalar_subquery()
# TODO: should interest be applied before or after the penalty multiplier?
# at the moment of writing, applying it after sounds right, but maybe ask someone?
# Interest
* (cast(recursive_cte.c.interest_rate_percent, Float) / 100)
# TODO: these should be added together, not multiplied, see specification
# Penalty
* case(
(
recursive_cte.c.balance < recursive_cte.c.penalty_threshold,
(
cast(recursive_cte.c.penalty_multiplier_percent, Float)
/ 100
),
),
else_=1.0,
)
),
Integer,
)
),
),
# Transfers money to self -> balance increases
(
and_(
trx_subset.c.type_ == TransactionType.TRANSFER,
trx_subset.c.transfer_user_id == user_id,
),
recursive_cte.c.balance + trx_subset.c.amount,
),
# Transfers money from self -> balance decreases
(
and_(
trx_subset.c.type_ == TransactionType.TRANSFER,
trx_subset.c.transfer_user_id != user_id,
),
recursive_cte.c.balance - trx_subset.c.amount,
),
# Throws a product -> if the user is considered to have bought it, balance increases
# TODO:
# (
# trx_subset.c.type_ == TransactionType.THROW_PRODUCT,
# recursive_cte.c.balance + trx_subset.c.amount,
# ),
# Interest adjustment -> balance stays the same
# Penalty adjustment -> balance stays the same
else_=recursive_cte.c.balance,
).label("balance"),
case(
(
trx_subset.c.type_ == TransactionType.ADJUST_INTEREST,
trx_subset.c.interest_rate_percent,
),
else_=recursive_cte.c.interest_rate_percent,
).label("interest_rate_percent"),
case(
(
trx_subset.c.type_ == TransactionType.ADJUST_PENALTY,
trx_subset.c.penalty_threshold,
),
else_=recursive_cte.c.penalty_threshold,
).label("penalty_threshold"),
case(
(
trx_subset.c.type_ == TransactionType.ADJUST_PENALTY,
trx_subset.c.penalty_multiplier_percent,
),
else_=recursive_cte.c.penalty_multiplier_percent,
).label("penalty_multiplier_percent"),
)
.select_from(trx_subset)
.where(trx_subset.c.i == recursive_cte.c.i + 1)
)
return recursive_cte.union_all(recursive_elements)
# TODO: create a function for the log that pretty prints the log entries
# for debugging purposes
@dataclass
class UserBalanceLogEntry:
transaction: Transaction
balance: int
interest_rate_percent: int
penalty_threshold: int
penalty_multiplier_percent: int
def is_penalized(self) -> bool:
"""
Returns whether this exact transaction is penalized.
"""
return False
# return self.transaction.type_ == TransactionType.BUY_PRODUCT and prev?
def user_balance_log(
sql_session: Session,
user: User,
use_cache: bool = True,
until: Transaction | None = None,
) -> list[UserBalanceLogEntry]:
"""
Returns a log of the user's balance over time, including interest and penalty adjustments.
If 'until' is given, only transactions up to that time are considered.
"""
recursive_cte = _user_balance_query(
user.id,
use_cache=use_cache,
until=until.time if until else None,
)
result = sql_session.execute(
select(
Transaction,
recursive_cte.c.balance,
recursive_cte.c.interest_rate_percent,
recursive_cte.c.penalty_threshold,
recursive_cte.c.penalty_multiplier_percent,
)
.select_from(recursive_cte)
.join(
Transaction,
onclause=Transaction.id == recursive_cte.c.transaction_id,
)
.order_by(recursive_cte.c.i.asc())
).all()
if result is None:
# If there are no transactions for this user, the query should return 0, not None.
raise RuntimeError(
f"Something went wrong while calculating the balance for user {user.name} (ID: {user.id})."
)
return [
UserBalanceLogEntry(
transaction=row[0],
balance=row.balance,
interest_rate_percent=row.interest_rate_percent,
penalty_threshold=row.penalty_threshold,
penalty_multiplier_percent=row.penalty_multiplier_percent,
)
for row in result
]
def user_balance(
sql_session: Session,
user: User,
use_cache: bool = True,
until: Transaction | None = None,
) -> int:
"""
Calculates the balance of a user.
If 'until' is given, only transactions up to that time are considered.
"""
recursive_cte = _user_balance_query(
user.id,
use_cache=use_cache,
until=until.time if until else None,
)
result = sql_session.scalar(
select(recursive_cte.c.balance).order_by(recursive_cte.c.i.desc()).limit(1)
)
if result is None:
# If there are no transactions for this user, the query should return 0, not None.
raise RuntimeError(
f"Something went wrong while calculating the balance for user {user.name} (ID: {user.id})."
)
return result


@@ -0,0 +1 @@
# TODO: implement me


@@ -0,0 +1,116 @@
from datetime import datetime
from pathlib import Path
from dibbler.db import Session
from dibbler.models import Product, Transaction, User
from dibbler.queries import joint_buy_product
JSON_FILE = Path(__file__).parent.parent.parent / "mock_data.json"
# TODO: integrate this as a part of create-db, either asking interactively
# whether to seed test data, or by using command line arguments for
# automating the answer.
def clear_db(sql_session):
sql_session.query(Product).delete()
sql_session.query(User).delete()
sql_session.commit()
def main():
# TODO: There is some leftover json data in the mock_data.json file.
# It should be dealt with before merging this PR, either by removing
# it or using it here.
sql_session = Session()
clear_db(sql_session)
# Add users
user1 = User("Test User 1")
user2 = User("Test User 2")
user3 = User("Test User 3")
sql_session.add(user1)
sql_session.add(user2)
sql_session.add(user3)
sql_session.commit()
# Add products
product1 = Product("1234567890123", "Test Product 1")
product2 = Product("9876543210987", "Test Product 2")
sql_session.add(product1)
sql_session.add(product2)
sql_session.commit()
# Add transactions
transactions = [
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 0),
amount=100,
user_id=user1.id,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 1),
amount=50,
user_id=user2.id,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 2),
amount=-50,
user_id=user1.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
amount=27 * 2,
per_product=27,
product_count=2,
user_id=user1.id,
product_id=product1.id,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 1),
product_count=1,
user_id=user2.id,
product_id=product1.id,
),
]
sql_session.add_all(transactions)
sql_session.flush()
joint_buy_product(
sql_session,
time=datetime(2023, 10, 1, 12, 0, 2),
instigator=user1,
product_count=1,
users=[user1, user2, user3],
product=product2,
)
joint_buy_product(
sql_session,
time=datetime(2023, 10, 1, 13, 0, 2),
instigator=user3,
product_count=2,
users=[user2, user3],
product=product2,
)
transactions = [
Transaction.buy_product(
time=datetime(2023, 10, 2, 14, 0, 0),
product_count=1,
user_id=user1.id,
product_id=product1.id,
),
Transaction.buy_product(
time=datetime(2023, 10, 2, 14, 0, 1),
product_count=1,
user_id=user2.id,
product_id=product2.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()


@@ -0,0 +1,16 @@
from dibbler.db import Session
from dibbler.queries import transaction_log
from dibbler.lib.render_transaction_log import render_transaction_log
def main() -> None:
sql_session = Session()
result = transaction_log(sql_session)
rendered = render_transaction_log(result)
print(rendered)
if __name__ == "__main__":
main()

docs/economics.md Normal file

@@ -0,0 +1,208 @@
# Economics
This document provides an overview of how dibbler derives stock counts, product prices, and user balances from its running event log.
It is a semi-formal specification of how dibbler's economy is intended to work, useful for both users and developers who want to understand the underlying mechanics.
## Some general notes
- All calculations involving money are done in whole numbers (integers). There are no fractional krs.
- All rounding is done by rounding up to the nearest integer, in favor of the system economy - not the users.
- The system allows negative stock counts, but acts a bit weirdly and potentially unfairly when that happens.
The system should generally warn you about this, and recommend recounting the stock whenever it happens.
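The rounding rule can be illustrated with `math.ceil` (numbers here are illustrative):

```python
import math

# 3 products added for a total of 100 kr: 100 / 3 = 33.33...,
# which rounds up to 34 kr per product, in favor of the system.
price_per_product = math.ceil(100 / 3)
assert price_per_product == 34
```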
## Adding products - product stock and product price
This section covers what happens to the stock count and price of a product when a user adds more of that product to the system.
### When the product count is `0` before adding.
When the product count is `0`, adding more of that product sets the product count to the amount added, and the product price is set to the total value of the products added divided by the number of products added, rounded up to the nearest integer.
```python
new_product_count = products_added
new_product_price = math.ceil(total_value_of_products_added / products_added)
```
### When the product count is greater than `0` before adding.
When the product count is greater than `0`, adding more of that product increases the product count by the amount added, and the product price will be recalculated as the total value of all existing products plus the total value of all newly added products, divided by the new total product count, rounded up to the nearest integer.
```python
new_product_count = product_count + products_added
new_product_price = math.ceil((total_value_of_existing_products + total_value_of_products_added) / new_product_count)
```
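A worked example of the recalculation above, with illustrative numbers: 2 products in stock at 30 kr each, then 3 more added for a total of 60 kr:

```python
import math

product_count = 2
product_price = 30  # current per-product price
products_added = 3
total_value_of_products_added = 60

# Total value of existing products plus the newly added value,
# divided by the new count, rounded up.
new_product_count = product_count + products_added
new_product_price = math.ceil(
    (product_price * product_count + total_value_of_products_added) / new_product_count
)
assert (new_product_count, new_product_price) == (5, 24)
```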
### When the product count is less than `0` before adding.
> [!NOTE]
> This situation can happen when the product count in the system does not accurately reflect the real-world stock of that product.
> This sometimes happens when people throw away products that have gone bad, or if someone buys something and forgets to actually take it from the shelf.
When the product count is less than `0`, adding more of that product increases the product count by the amount added. The product price will be recalculated with an assumption that the existing negative stock has a total value of `0`, plus the total value of all newly added products.
> [!WARNING]
> Note that this means that if you add products to a negative stock and the stock is still negative,
> the product price will be completely recalculated the next time someone adds the same product.
> There will also be a noticeable effect if the stock goes from negative to positive.
```python
new_product_count = product_count + products_added
new_product_price = math.ceil(((product_price * max(product_count, 0)) + total_value_of_products_added) / new_product_count)
```
### A note about adding `0` items
If a user attempts to add `0` items of a product, the system will not change the product count or price, and no transaction will be recorded.
## Buying products - product stock
### When the product count is positive and you buy less than or equal to the stock count
When the product count is positive and a user buys an amount less than or equal to the current stock count, the product stock count will be decreased by the amount bought.
```python
new_product_count = product_count - products_bought
```
### When the product count is positive or `0` and you buy more than there are in stock
When the product count is positive or `0` and a user buys an amount greater than the current stock count, the product stock count will be decreased by the amount bought, resulting in a negative stock count.
```python
new_product_count = product_count - products_bought
```
> [!NOTE]
> This should also yield a warning, recommending the user to adjust the stock count for the product in question.
### Buying from negative stock
When the product count is negative, buying more of that product will further decrease the product stock count.
```python
new_product_count = product_count - products_bought
```
> [!NOTE]
> This should also yield a warning, recommending the user to adjust the stock count for the product in question.
### Buying items with joint transactions.
The same rules as above apply in all three cases.
### Note about buying `0` items
If a user attempts to buy `0` items of a product, the system will not change the product count or price, and no transaction will be recorded.
## Interest and penalty
### What is interest, and why do we need it
We have had some issues with the economy going into the negative, most likely due to users throwing away products that have gone bad. When the economy goes negative, we end up in a situation where users have money but there aren't really any products to buy, because users have no incentive to add products back into the system to gain more balance.
To readjust the economy over time, there is an interest rate that increases the amount you pay for each product by a certain percentage. This percentage can be adjusted by administrators when they see that the economy needs fixing. By default, the interest rate is set to `0%`.
> [!NOTE]
> The interest rate cannot be set below `0%`.
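A minimal sketch of how interest affects the price a buyer pays, using this document's convention where the rate is a percentage added on top of the base price (function and parameter names are illustrative, not dibbler's actual API):

```python
import math

def price_with_interest(product_price: int, interest_rate_percent: int) -> int:
    # The surcharge is rounded up to whole krs, in favor of the system.
    return math.ceil(product_price * (100 + interest_rate_percent) / 100)

# With the default 0% interest rate the price is unchanged:
assert price_with_interest(27, 0) == 27
# With a 10% rate, 27 kr becomes ceil(29.7) = 30 kr:
assert price_with_interest(27, 10) == 30
```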
### What is penalty, and why do we need it
We currently allow users to go into negative balance when buying products. This is useful when you're having a great time at hacking night or similar, and don't want to be stopped by a low balance. However, to avoid users going too deep into negative balance, the cost of a product is multiplied by a penalty multiplier once the user's balance goes below a certain threshold. Both the penalty multiplier and the threshold can be adjusted by administrators. By default, the threshold is set to `-100` krs, and the penalty multiplier is set to `200%` (i.e. you pay double for products once your balance goes below `-100` krs).
The penalty starts counting as soon as your balance goes below the threshold, not when it is equal to the threshold.
> [!NOTE]
> You cannot set the penalty multiplier below `100%` (that would be a rebate, not a penalty),
> and you cannot set the penalty threshold above `0` krs (we do not punish people for having money).
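A minimal sketch of the penalty rule, using the default values above (names are illustrative; the real values come from the transaction log):

```python
import math

PENALTY_THRESHOLD = -100          # default threshold, in krs
PENALTY_MULTIPLIER_PERCENT = 200  # default multiplier, in percent

def price_with_penalty(product_price: int, user_balance: int) -> int:
    # The penalty only applies strictly below the threshold,
    # not when the balance is exactly equal to it.
    if user_balance < PENALTY_THRESHOLD:
        return math.ceil(product_price * PENALTY_MULTIPLIER_PERCENT / 100)
    return product_price

assert price_with_penalty(25, -100) == 25  # at the threshold: no penalty
assert price_with_penalty(25, -101) == 50  # below the threshold: doubled
```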
## Adding products - user balance
### When your existing balance is above the penalty threshold
You gain balance equal to the total value of the products you add.
Note that this might be separate from the per-product cost of the products after you add them, due to rounding and price recalculation.
```python
new_user_balance = user_balance + total_value_of_products_added
```
### When your existing balance is below the penalty threshold
This case is the same as above.
## Buying products - user balance
### When your existing balance is above the penalty threshold, and the purchase does not push you below the threshold
You pay the normal product price for the products you buy, plus any interest.
```python
new_user_balance = user_balance - math.ceil(products_bought * product_price * (1 + interest_rate))
```
Note that the system performs a transaction for every product kind, so if you buy multiple different products in one go, the rounding is done per product kind.
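For example (illustrative numbers), with a `10%` interest rate, buying two different product kinds in one go rounds each kind separately:

```python
import math

interest_rate_percent = 10  # illustrative

# 3 products at 9 kr: ceil(27 * 1.10) = ceil(29.7) = 30 kr
cost_a = math.ceil(3 * 9 * (100 + interest_rate_percent) / 100)
# 2 products at 7 kr: ceil(14 * 1.10) = ceil(15.4) = 16 kr
cost_b = math.ceil(2 * 7 * (100 + interest_rate_percent) / 100)
assert (cost_a, cost_b) == (30, 16)
```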
### When your balance is below the penalty threshold before buying
You pay the penalized product price for the products you buy, plus any interest.
The interest and penalty are calculated separately before they are added together, *not* multiplied together.
```python
penalty = (product_price * penalty_multiplier) - product_price
interest = product_price * interest_rate
new_user_balance = user_balance - math.ceil(products_bought * (product_price + penalty + interest))
```
### When your balance is above the penalty threshold before buying, but the purchase pushes you below the threshold
TODO:
```python
```
### Joint purchases, when all users are above the penalty threshold and stays above the threshold
TODO: how does rounding work here, does one user pay more than the other?
TODO: ordering the purchases in favor of the user.
When performing joint purchases (multiple users
### Joint purchases when one or more users are below the penalty threshold
TODO
### Joint purchases when one or more users will end up below the penalty threshold after the purchase
TODO
## Who owns a product
When throwing away products, it can be useful to know who added the products in the first place. Dibbler will look back at its event log to determine who added the products that are being thrown away, and pull the money from their balance in order for the economy to stay sane.
The algorithm is based on FIFO (first in, first out), meaning that the products that were added first are the ones that will be considered thrown away first. This might not always match reality (a recently bought product can be put on the shelf before an older one, which the system will then consider newer), but it is an overall reasonable approximation.
When adjusting the product count upwards manually, the system will consider the new products to not be owned by anyone. When adjusting the product count downwards manually, the system will let go of ownership of the products being removed according to the FIFO queue, without adjusting their balance.
If the stock count of a product goes negative, the system will consider that the products being bought are owned by no one, and will not adjust any balances. The system should warn about the negative stock count, and recommend recounting the stock. As mentioned above, the manual adjustment made when recounting the stock will not assign ownership to anyone.
Upon throwing away products (not manual adjustment), the system will pull money from the balances of the users who added the products being thrown away, according to the FIFO queue. In the case where the system decides that no one owns the products due to manual adjustments, the system will not pull any money from anyone's balance and lets the economy absorb the loss.
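The FIFO ownership queue described above can be sketched as follows (a simplification; the real implementation derives the queue from the transaction log, and all names here are illustrative):

```python
from collections import deque

def throw_away(queue: deque, count: int) -> dict[str, int]:
    """Pop products from the front of the FIFO queue and tally how many
    thrown-away products each owner should be charged for. Entries owned
    by no one (manual upward adjustments) are absorbed by the economy."""
    charged: dict[str, int] = {}
    while count > 0 and queue:
        owner, n = queue.popleft()
        taken = min(n, count)
        if owner is not None:
            charged[owner] = charged.get(owner, 0) + taken
        if n > taken:
            # Put the remainder back at the front of the queue.
            queue.appendleft((owner, n - taken))
        count -= taken
    return charged

# alice added 2 products, one product is unowned (manual adjustment), bob added 3.
queue = deque([("alice", 2), (None, 1), ("bob", 3)])
# Throwing away 4: alice is charged for 2, the unowned one is absorbed,
# and bob is charged for 1. One of bob's products remains in the queue.
assert throw_away(queue, 4) == {"alice": 2, "bob": 1}
assert list(queue) == [("bob", 2)]
```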
## Other actions
Transfers
Note about self-transfers
Balance adjustments
## Updating the economy specification
Keep old logic, database rows tagged with spec version.


@@ -1,20 +1,18 @@
[general]
; quit_allowed = false
; stop_allowed = false
quit_allowed = true ; not for prod
stop_allowed = true ; not for prod
quit_allowed = true
stop_allowed = false
show_tracebacks = true
input_encoding = 'utf8'
[database]
; url = postgresql://dibbler:hunter2@127.0.0.1/pvvvv
url = sqlite:///test.db ; devenv will override this to postgres using DIBBLER_DATABASE_URL
# url = "postgresql://robertem@127.0.0.1/pvvvv"
url = sqlite:///test.db
[limits]
low_credit_warning_limit = -100
user_recent_transaction_limit = 100
# See https://pypi.org/project/brother_ql_next/ for label types
# See https://pypi.org/project/brother_ql/ for label types
# Set rotate to False for endless labels
[printer]
label_type = "62"

flake.lock generated

@@ -1,93 +1,5 @@
{
"nodes": {
"cachix": {
"inputs": {
"devenv": [
"devenv"
],
"flake-compat": [
"devenv"
],
"git-hooks": [
"devenv"
],
"nixpkgs": "nixpkgs"
},
"locked": {
"lastModified": 1728672398,
"narHash": "sha256-KxuGSoVUFnQLB2ZcYODW7AVPAh9JqRlD5BrfsC/Q4qs=",
"owner": "cachix",
"repo": "cachix",
"rev": "aac51f698309fd0f381149214b7eee213c66ef0a",
"type": "github"
},
"original": {
"owner": "cachix",
"ref": "latest",
"repo": "cachix",
"type": "github"
}
},
"devenv": {
"inputs": {
"cachix": "cachix",
"flake-compat": "flake-compat",
"git-hooks": "git-hooks",
"nix": "nix",
"nixpkgs": "nixpkgs_3"
},
"locked": {
"lastModified": 1731619804,
"narHash": "sha256-wyxFaVooL8SzvQNpolpx32X+GoBPnCAg9E0i/Ekn3FU=",
"owner": "cachix",
"repo": "devenv",
"rev": "87edaaf1dddf17fe16eabab3c8edaf7cca2c3bc2",
"type": "github"
},
"original": {
"owner": "cachix",
"repo": "devenv",
"type": "github"
}
},
"flake-compat": {
"flake": false,
"locked": {
"lastModified": 1696426674,
"narHash": "sha256-kvjfFW7WAETZlt09AgDn1MrtKzP7t90Vf7vypd3OL1U=",
"owner": "edolstra",
"repo": "flake-compat",
"rev": "0f9255e01c2351cc7d116c072cb317785dd33b33",
"type": "github"
},
"original": {
"owner": "edolstra",
"repo": "flake-compat",
"type": "github"
}
},
"flake-parts": {
"inputs": {
"nixpkgs-lib": [
"devenv",
"nix",
"nixpkgs"
]
},
"locked": {
"lastModified": 1712014858,
"narHash": "sha256-sB4SWl2lX95bExY2gMFG5HIzvva5AVMJd4Igm+GpZNw=",
"owner": "hercules-ci",
"repo": "flake-parts",
"rev": "9126214d0a59633752a136528f5f3b9aa8565b7d",
"type": "github"
},
"original": {
"owner": "hercules-ci",
"repo": "flake-parts",
"type": "github"
}
},
"flake-utils": {
"inputs": {
"systems": "systems"
@@ -101,117 +13,17 @@
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"git-hooks": {
"inputs": {
"flake-compat": [
"devenv"
],
"gitignore": "gitignore",
"nixpkgs": [
"devenv",
"nixpkgs"
],
"nixpkgs-stable": [
"devenv"
]
},
"locked": {
"lastModified": 1730302582,
"narHash": "sha256-W1MIJpADXQCgosJZT8qBYLRuZls2KSiKdpnTVdKBuvU=",
"owner": "cachix",
"repo": "git-hooks.nix",
"rev": "af8a16fe5c264f5e9e18bcee2859b40a656876cf",
"type": "github"
},
"original": {
"owner": "cachix",
"repo": "git-hooks.nix",
"type": "github"
}
},
"gitignore": {
"inputs": {
"nixpkgs": [
"devenv",
"git-hooks",
"nixpkgs"
]
},
"locked": {
"lastModified": 1709087332,
"narHash": "sha256-HG2cCnktfHsKV0s4XW83gU3F57gaTljL9KNSuG6bnQs=",
"owner": "hercules-ci",
"repo": "gitignore.nix",
"rev": "637db329424fd7e46cf4185293b9cc8c88c95394",
"type": "github"
},
"original": {
"owner": "hercules-ci",
"repo": "gitignore.nix",
"type": "github"
}
},
"libgit2": {
"flake": false,
"locked": {
"lastModified": 1697646580,
"narHash": "sha256-oX4Z3S9WtJlwvj0uH9HlYcWv+x1hqp8mhXl7HsLu2f0=",
"owner": "libgit2",
"repo": "libgit2",
"rev": "45fd9ed7ae1a9b74b957ef4f337bc3c8b3df01b5",
"type": "github"
},
"original": {
"owner": "libgit2",
"repo": "libgit2",
"type": "github"
}
},
"nix": {
"inputs": {
"flake-compat": [
"devenv"
],
"flake-parts": "flake-parts",
"libgit2": "libgit2",
"nixpkgs": "nixpkgs_2",
"nixpkgs-23-11": [
"devenv"
],
"nixpkgs-regression": [
"devenv"
],
"pre-commit-hooks": [
"devenv"
]
},
"locked": {
"lastModified": 1727438425,
"narHash": "sha256-X8ES7I1cfNhR9oKp06F6ir4Np70WGZU5sfCOuNBEwMg=",
"owner": "domenkozar",
"repo": "nix",
"rev": "f6c5ae4c1b2e411e6b1e6a8181cc84363d6a7546",
"type": "github"
},
"original": {
"owner": "domenkozar",
"ref": "devenv-2.24",
"repo": "nix",
"type": "github"
"id": "flake-utils",
"type": "indirect"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1730531603,
"narHash": "sha256-Dqg6si5CqIzm87sp57j5nTaeBbWhHFaVyG7V6L8k3lY=",
"lastModified": 1764950072,
"narHash": "sha256-BmPWzogsG2GsXZtlT+MTcAWeDK5hkbGRZTeZNW42fwA=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "7ffd9ae656aec493492b44d0ddfb28e79a1ea25d",
"rev": "f61125a668a320878494449750330ca58b78c557",
"type": "github"
},
"original": {
@@ -221,59 +33,10 @@
"type": "github"
}
},
"nixpkgs_2": {
"locked": {
"lastModified": 1717432640,
"narHash": "sha256-+f9c4/ZX5MWDOuB1rKoWj+lBNm0z0rs4CK47HBLxy1o=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "88269ab3044128b7c2f4c7d68448b2fb50456870",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "release-24.05",
"repo": "nixpkgs",
"type": "github"
}
},
"nixpkgs_3": {
"locked": {
"lastModified": 1716977621,
"narHash": "sha256-Q1UQzYcMJH4RscmpTkjlgqQDX5yi1tZL0O345Ri6vXQ=",
"owner": "cachix",
"repo": "devenv-nixpkgs",
"rev": "4267e705586473d3e5c8d50299e71503f16a6fb6",
"type": "github"
},
"original": {
"owner": "cachix",
"ref": "rolling",
"repo": "devenv-nixpkgs",
"type": "github"
}
},
"nixpkgs_4": {
"locked": {
"lastModified": 1731611831,
"narHash": "sha256-R51rOqkWMfubBkZ9BY4Y1VaRoeqEBshlfQ8mMH5RjqI=",
"owner": "nixos",
"repo": "nixpkgs",
"rev": "cea28c811faadb50bee00d433bbf2fea845a43e4",
"type": "github"
},
"original": {
"owner": "nixos",
"ref": "nixos-unstable-small",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"devenv": "devenv",
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs_4"
"nixpkgs": "nixpkgs"
}
},
"systems": {

flake.nix

@@ -1,128 +1,65 @@
{
description = "Dibbler samspleisebod";
inputs = {
nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable-small";
flake-utils.url = "github:numtide/flake-utils";
devenv.url = "github:cachix/devenv";
};
inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
nixConfig = {
extra-trusted-public-keys = [
"devenv.cachix.org-1:w1cLUi8dv3hnoSPGAuibQv+f9TZLr6cv/Hm9XgU50cw="
outputs = { self, nixpkgs, flake-utils }: let
inherit (nixpkgs) lib;
systems = [
"x86_64-linux"
"aarch64-linux"
"x86_64-darwin"
"aarch64-darwin"
];
extra-substituters = [
"https://devenv.cachix.org"
];
};
outputs = { self, ... } @ inputs:
inputs.flake-utils.lib.eachDefaultSystem (system: let
pkgs = inputs.nixpkgs.legacyPackages.${system};
inherit (pkgs) lib;
in {
packages = {
forAllSystems = f: lib.genAttrs systems (system: let
pkgs = nixpkgs.legacyPackages.${system};
in f system pkgs);
in {
packages = forAllSystems (system: pkgs: {
default = self.packages.${system}.dibbler;
dibbler = pkgs.callPackage ./nix/dibbler.nix {
python3Packages = pkgs.python312Packages;
};
skrot = self.nixosConfigurations.skrot.config.system.build.sdImage;
});
dibbler = pkgs.python311Packages.callPackage ./nix/dibbler.nix { };
skrot-vm = self.nixosConfigurations.skrot.config.system.build.vm;
apps = forAllSystems (system: pkgs: {
default = self.apps.${system}.dibbler;
dibbler = flake-utils.lib.mkApp {
drv = self.packages.${system}.dibbler;
};
});
# devenv cruft
devenv-up = self.devShells.${system}.default.config.procfileScript;
devenv-test = self.devShells.${system}.default.config.test;
};
devShells = {
default = self.devShells.${system}.dibbler;
dibbler = inputs.devenv.lib.mkShell {
inherit inputs pkgs;
modules = [({ config, ... }: {
# https://devenv.sh/reference/options/
enterShell = ''
if [[ ! -f config.ini ]]; then
cp -v example-config.ini config.ini
fi
export REPO_ROOT=$(realpath .) # used by mkPythonEditablePackage
export DIBBLER_CONFIG_FILE=$(realpath config.ini)
export DIBBLER_DATABASE_URL=postgresql://dibbler:hunter2@/dibbler?host=${config.env.PGHOST}
'';
packages = [
/* self.packages.${system}.dibbler */
(pkgs.python311Packages.mkPythonEditablePackage {
inherit (self.packages.${system}.dibbler)
pname version
build-system dependencies;
scripts = (lib.importTOML ./pyproject.toml).project.scripts;
root = "$REPO_ROOT";
})
pkgs.python311Packages.black
pkgs.ruff
];
services.postgres = {
enable = true;
initialDatabases = [
{
name = "dibbler";
user = "dibbler";
pass = "hunter2";
}
];
};
})];
overlays = {
default = self.overlays.dibbler;
dibbler = final: prev: {
inherit (self.packages.${prev.system}) dibbler;
};
};
})
devShells = forAllSystems (system: pkgs: {
default = self.devShells.${system}.dibbler;
dibbler = pkgs.callPackage ./nix/shell.nix {
python = pkgs.python312;
};
});
//
{
# Note: using the module requires that you have applied the
# overlay first
# Note: using the module requires that you have applied the overlay first
nixosModules.default = import ./nix/module.nix;
images.skrot = self.nixosConfigurations.skrot.config.system.build.sdImage;
nixosConfigurations.skrot = inputs.nixpkgs.lib.nixosSystem {
nixosConfigurations.skrot = nixpkgs.lib.nixosSystem (rec {
system = "aarch64-linux";
pkgs = import nixpkgs {
inherit system;
overlays = [ self.overlays.dibbler ];
};
modules = [
(inputs.nixpkgs + "/nixos/modules/installer/sd-card/sd-image-aarch64.nix")
(nixpkgs + "/nixos/modules/installer/sd-card/sd-image-aarch64.nix")
self.nixosModules.default
({...}: {
system.stateVersion = "22.05";
networking = {
hostName = "skrot";
domain = "pvv.ntnu.no";
nameservers = [ "129.241.0.200" "129.241.0.201" ];
defaultGateway = "129.241.210.129";
interfaces.eth0 = {
useDHCP = false;
ipv4.addresses = [{
address = "129.241.210.235";
prefixLength = 25;
}];
};
};
# services.resolved.enable = true;
# systemd.network.enable = true;
# systemd.network.networks."30-network" = {
# matchConfig.Name = "*";
# DHCP = "no";
# address = [ "129.241.210.235/25" ];
# gateway = [ "129.241.210.129" ];
# };
})
./nix/skrott.nix
];
};
});
};
}

mock_data.json Normal file

@@ -0,0 +1,76 @@
{
"products": [
{
"product_id": 1,
"bar_code": "1234567890123",
"name": "Wireless Mouse",
"price": 2999,
"stock": 150,
"hidden": false
},
{
"product_id": 2,
"bar_code": "9876543210987",
"name": "Mechanical Keyboard",
"price": 5999,
"stock": 75,
"hidden": false
},
{
"product_id": 3,
"bar_code": "1112223334445",
"name": "Gaming Monitor",
"price": 19999,
"stock": 20,
"hidden": false
},
{
"product_id": 4,
"bar_code": "5556667778889",
"name": "USB-C Docking Station",
"price": 8999,
"stock": 50,
"hidden": true
},
{
"product_id": 5,
"bar_code": "4445556667771",
"name": "Noise Cancelling Headphones",
"price": 12999,
"stock": 30,
"hidden": true
}
],
"users": [
{
"name": "Albert",
"credit": 42069,
"card": "NTU12345678",
"rfid": "a1b2c3d4e5"
},
{
"name": "lorem",
"credit": 2000,
"card": "9876543210",
"rfid": "f6e7d8c9b0"
},
{
"name": "ibsum",
"credit": 1000,
"card": "11122233",
"rfid": ""
},
{
"name": "dave",
"credit": 7500,
"card": "NTU56789012",
"rfid": "1234abcd5678"
},
{
"name": "eve",
"credit": 3000,
"card": null,
"rfid": "deadbeef1234"
}
]
}


@@ -1,27 +1,31 @@
{ lib
, python3Packages
, fetchFromGitHub
, buildPythonApplication
, setuptools
, brother-ql
, matplotlib
, psycopg2
, python-barcode
, sqlalchemy
}:
buildPythonApplication {
python3Packages.buildPythonApplication {
pname = "dibbler";
version = "0.0.0";
pyproject = true;
version = "unstable";
src = lib.cleanSource ../.;
build-system = [ setuptools ];
dependencies = [
# we override pname to satisfy mkPythonEditablePackage
(brother-ql.overridePythonAttrs { pname = "brother-ql-next"; })
format = "pyproject";
# brother-ql is breaky breaky
# https://github.com/NixOS/nixpkgs/issues/285234
dontCheckRuntimeDeps = true;
pythonImportsCheck = [];
doCheck = true;
nativeCheckInputs = with python3Packages; [
pytest
pytestCheckHook
];
nativeBuildInputs = with python3Packages; [ setuptools ];
propagatedBuildInputs = with python3Packages; [
brother-ql
matplotlib
psycopg2
psycopg2-binary
python-barcode
sqlalchemy
];


@@ -1,16 +1,31 @@
{ config, pkgs, lib, ... }: let
cfg = config.services.dibbler;
format = pkgs.formats.ini { };
in {
options.services.dibbler = {
enable = lib.mkEnableOption "dibbler, the little kiosk computer";
package = lib.mkPackageOption pkgs "dibbler" { };
config = lib.mkOption {
default = ../conf.py;
settings = lib.mkOption {
description = "Configuration for dibbler";
default = { };
type = lib.types.submodule {
freeformType = format.type;
};
};
};
config = let
screen = "${pkgs.screen}/bin/screen";
in {
in lib.mkIf cfg.enable {
services.dibbler.settings = lib.pipe ../example-config.ini [
builtins.readFile
builtins.fromTOML
(lib.mapAttrsRecursive (_: lib.mkDefault))
];
boot = {
consoleLogLevel = 0;
enableContainers = false;
@@ -23,10 +38,7 @@ in {
group = "dibbler";
extraGroups = [ "lp" ];
isNormalUser = true;
shell = (
(pkgs.writeShellScriptBin "login-shell" "${screen} -x dibbler")
// {shellPath = "/bin/login-shell";}
);
shell = (pkgs.writeShellScriptBin "login-shell" "${screen} -x dibbler") // {shellPath = "/bin/login-shell";};
};
};
@@ -35,7 +47,9 @@ in {
wantedBy = [ "default.target" ];
serviceConfig = {
ExecStartPre = "-${screen} -X -S dibbler kill";
ExecStart = "${screen} -dmS dibbler -O -l ${cfg.package}/bin/dibbler --config ${cfg.config} loop";
ExecStart = let
config = format.generate "dibbler-config.ini" cfg.settings;
in "${screen} -dmS dibbler -O -l ${cfg.package}/bin/dibbler --config ${config} loop";
ExecStartPost = "${screen} -X -S dibbler width 42 80";
User = "dibbler";
Group = "dibbler";
@@ -58,30 +72,6 @@ in {
serviceConfig.Restart = "always"; # restart when session is closed
};
services = {
openssh = {
enable = true;
permitRootLogin = "yes";
};
getty.autologinUser = lib.mkForce "dibbler";
udisks2.enable = false;
};
networking.firewall.logRefusedConnections = false;
console.keyMap = "no";
programs.command-not-found.enable = false;
i18n.supportedLocales = [ "en_US.UTF-8/UTF-8" ];
environment.noXlibs = true;
documentation = {
info.enable = false;
man.enable = false;
};
security = {
polkit.enable = lib.mkForce false;
audit.enable = false;
};
services.getty.autologinUser = lib.mkForce "dibbler";
};
}

nix/shell.nix Normal file

@@ -0,0 +1,24 @@
{
mkShell,
python,
ruff,
uv,
}:
mkShell {
packages = [
ruff
uv
(python.withPackages (ps: with ps; [
brother-ql
matplotlib
psycopg2
python-barcode
sqlalchemy
pytest
pytest-cov
pytest-html
]))
];
}

nix/skrott.nix Normal file

@@ -0,0 +1,27 @@
{...}: {
system.stateVersion = "25.05";
services.dibbler.enable = true;
networking = {
hostName = "skrot";
domain = "pvv.ntnu.no";
nameservers = [ "129.241.0.200" "129.241.0.201" ];
defaultGateway = "129.241.210.129";
interfaces.eth0 = {
useDHCP = false;
ipv4.addresses = [{
address = "129.241.210.235";
prefixLength = 25;
}];
};
};
# services.resolved.enable = true;
# systemd.network.enable = true;
# systemd.network.networks."30-network" = {
# matchConfig.Name = "*";
# DHCP = "no";
# address = [ "129.241.210.235/25" ];
# gateway = [ "129.241.210.129" ];
# };
}


@@ -8,19 +8,26 @@ authors = []
description = "EDB-system for PVV"
readme = "README.md"
requires-python = ">=3.11"
license = {text = "BSD-3-Clause"}
classifiers = [
"Programming Language :: Python :: 3",
]
dependencies = [
"SQLAlchemy >= 2.0, <2.1",
"brother_ql_next",
"brother-ql",
"matplotlib",
"psycopg2 >= 2.8, <2.10",
"psycopg2-binary >= 2.8, <2.10",
"python-barcode",
]
dynamic = ["version"]
[dependency-groups]
test = [
"pytest",
"pytest-cov",
"coverage-badge>=1.1.2",
"pytest-html>=4.1.1",
]
[tool.setuptools.packages.find]
include = ["dibbler*"]

tests/__init__.py Normal file

tests/conftest.py Normal file

@@ -0,0 +1,36 @@
import pytest
from sqlalchemy import create_engine, event
from sqlalchemy.orm import Session
from dibbler.models import Base
def pytest_addoption(parser):
parser.addoption(
"--echo",
action="store_true",
help="Enable SQLAlchemy echo mode for debugging",
)
@pytest.fixture(scope="function")
def sql_session(request):
"""Create a new SQLAlchemy session for testing."""
echo = request.config.getoption("--echo")
engine = create_engine(
"sqlite:///:memory:",
echo=echo,
)
@event.listens_for(engine, "connect")
def set_sqlite_pragma(dbapi_connection, _connection_record):
cursor = dbapi_connection.cursor()
cursor.execute("PRAGMA foreign_keys=ON")
cursor.close()
Base.metadata.create_all(engine)
with Session(engine) as sql_session:
yield sql_session

tests/models/__init__.py Normal file

@@ -0,0 +1,32 @@
import pytest
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session
from dibbler.models import Product
def insert_test_data(sql_session: Session) -> Product:
product = Product("1234567890123", "Test Product")
sql_session.add(product)
sql_session.commit()
return product
def test_product_no_duplicate_barcodes(sql_session: Session):
product = insert_test_data(sql_session)
duplicate_product = Product(product.bar_code, "Hehe >:)")
sql_session.add(duplicate_product)
with pytest.raises(IntegrityError):
sql_session.commit()
def test_product_no_duplicate_names(sql_session: Session):
product = insert_test_data(sql_session)
duplicate_product = Product("1918238911928", product.name)
sql_session.add(duplicate_product)
with pytest.raises(IntegrityError):
sql_session.commit()


@@ -0,0 +1,175 @@
from datetime import datetime
import pytest
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session
from dibbler.models import Product, Transaction, User
from dibbler.queries import product_stock
def insert_test_data(sql_session: Session) -> tuple[User, Product]:
user = User("Test User")
product = Product("1234567890123", "Test Product")
sql_session.add(user)
sql_session.add(product)
sql_session.commit()
return user, product
def test_user_not_allowed_to_transfer_to_self(sql_session: Session) -> None:
user, _ = insert_test_data(sql_session)
transaction = Transaction.transfer(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
transfer_user_id=user.id,
amount=50,
)
sql_session.add(transaction)
with pytest.raises(IntegrityError):
sql_session.commit()
def test_product_foreign_key_constraint(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transaction = Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
)
sql_session.add(transaction)
sql_session.commit()
# Attempt to add a transaction with a non-existent product
invalid_transaction = Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
user_id=user.id,
product_id=9999, # Non-existent product ID
amount=27,
per_product=27,
product_count=1,
)
sql_session.add(invalid_transaction)
with pytest.raises(IntegrityError):
sql_session.commit()
def test_user_foreign_key_constraint(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transaction = Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
)
sql_session.add(transaction)
sql_session.commit()
# Attempt to add a transaction with a non-existent user
invalid_transaction = Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
user_id=9999, # Non-existent user ID
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
)
sql_session.add(invalid_transaction)
with pytest.raises(IntegrityError):
sql_session.commit()
def test_transaction_buy_product_more_than_stock(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 13, 0, 0),
product_count=10,
user_id=user.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
assert product_stock(sql_session, product) == 1 - 10
def test_transaction_buy_product_dont_allow_no_add_product_transactions(
sql_session: Session,
) -> None:
user, product = insert_test_data(sql_session)
transaction = Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 0),
product_count=1,
user_id=user.id,
product_id=product.id,
)
sql_session.add(transaction)
with pytest.raises(ValueError):
sql_session.commit()
def test_transaction_add_product_deny_amount_over_per_product_times_product_count(
sql_session: Session,
) -> None:
user, product = insert_test_data(sql_session)
with pytest.raises(ValueError):
_transaction = Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27 * 2 + 1, # Invalid amount
per_product=27,
product_count=2,
)
def test_transaction_add_product_allow_amount_under_per_product_times_product_count(
sql_session: Session,
) -> None:
user, product = insert_test_data(sql_session)
transaction = Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27 * 2 - 1, # Valid amount
per_product=27,
product_count=2,
)
sql_session.add(transaction)
sql_session.commit()

tests/models/test_user.py Normal file

@@ -0,0 +1,25 @@
from datetime import datetime
import pytest
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session
from dibbler.models import Product, Transaction, User
def insert_test_data(sql_session: Session) -> User:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
return user
def test_ensure_no_duplicate_user_names(sql_session: Session):
user = insert_test_data(sql_session)
user2 = User(user.name)
sql_session.add(user2)
with pytest.raises(IntegrityError):
sql_session.commit()

@@ -0,0 +1,71 @@
import pytest
from datetime import datetime
from sqlalchemy.orm import Session
from dibbler.models import Transaction, User
from dibbler.queries import adjust_interest, current_interest
def test_adjust_interest_no_history(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
adjust_interest(
sql_session,
user_id=user.id,
new_interest=3,
message="Setting initial interest rate",
)
sql_session.commit()
current_interest_rate = current_interest(sql_session)
assert current_interest_rate == 3
def test_adjust_interest_existing_history(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
transactions = [
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 9, 0, 0),
user_id=user.id,
interest_rate_percent=5,
message="Initial interest rate",
),
]
sql_session.add_all(transactions)
sql_session.commit()
current_interest_rate = current_interest(sql_session)
assert current_interest_rate == 5
adjust_interest(
sql_session,
user_id=user.id,
new_interest=2,
message="Adjusting interest rate",
)
sql_session.commit()
current_interest_rate = current_interest(sql_session)
assert current_interest_rate == 2
def test_adjust_interest_negative_failure(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
with pytest.raises(ValueError, match="Interest rate cannot be negative"):
adjust_interest(
sql_session,
user_id=user.id,
new_interest=-1,
message="Attempting to set negative interest rate",
)


@@ -0,0 +1,157 @@
from datetime import datetime
import pytest
from sqlalchemy.orm import Session
from dibbler.models import Transaction, User
from dibbler.models.Transaction import (
DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE,
DEFAULT_PENALTY_THRESHOLD,
)
from dibbler.queries import adjust_penalty, current_penalty
def test_adjust_penalty_no_history(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty=-200,
message="Setting initial interest rate",
)
sql_session.commit()
(penalty, multiplier) = current_penalty(sql_session)
assert penalty == -200
assert multiplier == DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE
def test_adjust_penalty_multiplier_no_history(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty_multiplier=125,
message="Setting initial interest rate",
)
sql_session.commit()
(penalty, multiplier) = current_penalty(sql_session)
assert penalty == DEFAULT_PENALTY_THRESHOLD
assert multiplier == 125
def test_adjust_penalty_multiplier_less_than_100_fail(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty_multiplier=100,
message="Setting initial interest rate",
)
sql_session.commit()
(_, multiplier) = current_penalty(sql_session)
assert multiplier == 100
with pytest.raises(ValueError, match="Penalty multiplier cannot be less than 100%"):
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty_multiplier=99,
message="Setting initial interest rate",
)
def test_adjust_penalty_existing_history(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
transactions = [
Transaction.adjust_penalty(
time=datetime(2024, 1, 1, 10, 0, 0),
user_id=user.id,
penalty_threshold=-150,
penalty_multiplier_percent=110,
message="Initial penalty settings",
),
]
sql_session.add_all(transactions)
sql_session.commit()
(penalty, _) = current_penalty(sql_session)
assert penalty == -150
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty=-250,
message="Adjusting penalty threshold",
)
sql_session.commit()
(penalty, _) = current_penalty(sql_session)
assert penalty == -250
def test_adjust_penalty_multiplier_existing_history(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
transactions = [
Transaction.adjust_penalty(
time=datetime(2024, 1, 1, 10, 0, 0),
user_id=user.id,
penalty_threshold=-150,
penalty_multiplier_percent=110,
message="Initial penalty settings",
),
]
sql_session.add_all(transactions)
sql_session.commit()
(_, multiplier) = current_penalty(sql_session)
assert multiplier == 110
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty_multiplier=130,
message="Adjusting penalty multiplier",
)
sql_session.commit()
(_, multiplier) = current_penalty(sql_session)
assert multiplier == 130
def test_adjust_penalty_and_multiplier(sql_session: Session) -> None:
user = User("Test User")
sql_session.add(user)
sql_session.commit()
adjust_penalty(
sql_session,
user_id=user.id,
new_penalty=-300,
new_penalty_multiplier=150,
message="Setting both penalty and multiplier",
)
sql_session.commit()
(penalty, multiplier) = current_penalty(sql_session)
assert penalty == -300
assert multiplier == 150


@@ -0,0 +1,35 @@
from datetime import datetime
from sqlalchemy.orm import Session
from dibbler.models.Transaction import DEFAULT_INTEREST_RATE_PERCENTAGE
from dibbler.models import Transaction, User
from dibbler.queries import current_interest
def test_current_interest_no_history(sql_session: Session) -> None:
assert current_interest(sql_session) == DEFAULT_INTEREST_RATE_PERCENTAGE
def test_current_interest_with_history(sql_session: Session) -> None:
user = User("Admin User")
sql_session.add(user)
sql_session.commit()
transactions = [
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 10, 0, 0),
interest_rate_percent=5,
user_id=user.id,
),
Transaction.adjust_interest(
time=datetime(2023, 11, 1, 10, 0, 0),
interest_rate_percent=7,
user_id=user.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
assert current_interest(sql_session) == 7


@@ -0,0 +1,42 @@
from datetime import datetime
from sqlalchemy.orm import Session
from dibbler.models import Transaction, User
from dibbler.models.Transaction import (
DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE,
DEFAULT_PENALTY_THRESHOLD,
)
from dibbler.queries import current_penalty
def test_current_penalty_no_history(sql_session: Session) -> None:
assert current_penalty(sql_session) == (
DEFAULT_PENALTY_THRESHOLD,
DEFAULT_PENALTY_MULTIPLIER_PERCENTAGE,
)
def test_current_penalty_with_history(sql_session: Session) -> None:
user = User("Admin User")
sql_session.add(user)
sql_session.commit()
transactions = [
Transaction.adjust_penalty(
time=datetime(2023, 10, 1, 10, 0, 0),
penalty_threshold=-200,
penalty_multiplier_percent=150,
user_id=user.id,
),
Transaction.adjust_penalty(
time=datetime(2023, 10, 2, 10, 0, 0),
penalty_threshold=-300,
penalty_multiplier_percent=200,
user_id=user.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
assert current_penalty(sql_session) == (-300, 200)


@@ -0,0 +1,9 @@
from sqlalchemy.orm import Session
def test_joint_buy_product_missing_product(sql_session: Session) -> None: ...
def test_joint_buy_product_missing_user(sql_session: Session) -> None: ...
def test_joint_buy_product_out_of_stock(sql_session: Session) -> None: ...
def test_joint_buy_product(sql_session: Session) -> None: ...
def test_joint_buy_product_duplicate_user(sql_session: Session) -> None: ...
def test_joint_buy_product_non_involved_instigator(sql_session: Session) -> None: ...


@@ -0,0 +1,277 @@
from pprint import pprint
from sqlalchemy.orm import Session
from dibbler.models import Product, User
from dibbler.models.Transaction import Transaction
from dibbler.queries import product_owners, product_owners_log
def insert_test_data(sql_session: Session) -> tuple[Product, User]:
user = User("testuser")
product = Product("1234567890123", "Test Product")
sql_session.add(user)
sql_session.add(product)
sql_session.commit()
return product, user
def test_product_owners_no_transactions(sql_session: Session) -> None:
product, _ = insert_test_data(sql_session)
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == []
def test_product_owners_add_products(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=30,
per_product=10,
product_count=3,
)
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user, user, user]
def test_product_owners_add_and_buy_products(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=30,
per_product=10,
product_count=3,
),
Transaction.buy_product(
user_id=user.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user, user]
def test_product_owners_add_and_throw_products(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=40,
per_product=10,
product_count=4,
),
Transaction.throw_product(
user_id=user.id,
product_id=product.id,
product_count=2,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user, user]
def test_product_owners_multiple_users(sql_session: Session) -> None:
product, user1 = insert_test_data(sql_session)
user2 = User("testuser2")
sql_session.add(user2)
sql_session.commit()
transactions = [
Transaction.add_product(
user_id=user1.id,
product_id=product.id,
amount=20,
per_product=10,
product_count=2,
),
Transaction.add_product(
user_id=user2.id,
product_id=product.id,
amount=30,
per_product=10,
product_count=3,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user2, user2, user2, user1, user1]
def test_product_owners_adjust_stock_down(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=50,
per_product=10,
product_count=5,
),
Transaction.adjust_stock(
user_id=user.id,
product_id=product.id,
product_count=-2,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user, user, user]
def test_product_owners_adjust_stock_up(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=20,
per_product=10,
product_count=2,
),
Transaction.adjust_stock(
user_id=user.id,
product_id=product.id,
product_count=3,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user, user, None, None, None]
def test_product_owners_negative_stock(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=10,
per_product=10,
product_count=1,
),
Transaction.buy_product(
user_id=user.id,
product_id=product.id,
product_count=2,
),
]
sql_session.add_all(transactions)
sql_session.commit()
owners = product_owners(sql_session, product)
assert owners == []
def test_product_owners_add_products_from_negative_stock(sql_session: Session) -> None:
product, user = insert_test_data(sql_session)
transactions = [
Transaction.buy_product(
user_id=user.id,
product_id=product.id,
product_count=2,
),
Transaction.add_product(
user_id=user.id,
product_id=product.id,
amount=30,
per_product=10,
product_count=3,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user]
def test_product_owners_interleaved_users(sql_session: Session) -> None:
product, user1 = insert_test_data(sql_session)
user2 = User("testuser2")
sql_session.add(user2)
sql_session.commit()
transactions = [
Transaction.add_product(
user_id=user1.id,
product_id=product.id,
amount=20,
per_product=10,
product_count=2,
),
Transaction.add_product(
user_id=user2.id,
product_id=product.id,
amount=30,
per_product=10,
product_count=3,
),
Transaction.buy_product(
user_id=user1.id,
product_id=product.id,
product_count=1,
),
Transaction.add_product(
user_id=user1.id,
product_id=product.id,
amount=10,
per_product=10,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_owners_log(sql_session, product))
owners = product_owners(sql_session, product)
assert owners == [user1, user2, user2, user1, user1]


@@ -0,0 +1,418 @@
import math
from datetime import datetime
from pprint import pprint
from sqlalchemy.orm import Session
from dibbler.models import Product, Transaction, User
from dibbler.queries import product_price, product_price_log, joint_buy_product
# TODO: see if we can use pytest_runtest_makereport to print the "product_price_log"s
# only on failures instead of inlining it in every test function
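The TODO above could plausibly be handled with a small `conftest.py` hook; the following is a sketch only, assuming a hypothetical `_price_log` attribute that a fixture would stash on the test item (neither the hook file nor that attribute exist in this repository yet):

```python
# conftest.py -- sketch: attach the captured price log to the test report
# so it is printed only for failing tests, not inlined in every test.
import pytest


@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    outcome = yield
    report = outcome.get_result()
    # Only annotate the "call" phase of failed tests.
    if report.when == "call" and report.failed:
        # `_price_log` is a hypothetical attribute a fixture would set on the item.
        log = getattr(item, "_price_log", None)
        if log is not None:
            # Extra sections are rendered by pytest in the failure summary.
            report.sections.append(("product_price_log", repr(log)))
```

With this in place, the `pprint(product_price_log(...))` calls inside each test body could be replaced by a fixture that records the log on the item.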
def insert_test_data(sql_session: Session) -> tuple[User, Product]:
user = User("Test User")
product = Product("1234567890123", "Test Product")
sql_session.add(user)
sql_session.add(product)
sql_session.commit()
return user, product
def test_product_price_no_transactions(sql_session: Session) -> None:
_, product = insert_test_data(sql_session)
pprint(product_price_log(sql_session, product))
assert product_price(sql_session, product) == 0
def test_product_price_basic_history(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
amount=27 * 2 - 1,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
assert product_price(sql_session, product) == 27
def test_product_price_sold_out(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
amount=27 * 2 - 1,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 1),
product_count=2,
user_id=user.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
assert product_price(sql_session, product) == 27
def test_product_price_interest(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 12, 0, 0),
interest_rate_percent=110,
user_id=user.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
amount=27 * 2 - 1,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
product_price_ = product_price(sql_session, product)
product_price_interest = product_price(sql_session, product, include_interest=True)
assert product_price_ == 27
assert product_price_interest == math.ceil(27 * 1.1)
def test_product_price_changing_interest(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 12, 0, 0),
interest_rate_percent=110,
user_id=user.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
amount=27 * 2 - 1,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 12, 0, 2),
interest_rate_percent=120,
user_id=user.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
product_price_interest = product_price(sql_session, product, include_interest=True)
assert product_price_interest == math.ceil(27 * 1.2)
def test_product_price_old_transaction(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
amount=27 * 2,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
# Price should be 27
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 2),
amount=38 * 3,
per_product=38,
product_count=3,
user_id=user.id,
product_id=product.id,
),
        # Price should be averaged upwards
]
sql_session.add_all(transactions)
sql_session.commit()
until_transaction = transactions[0]
pprint(
product_price_log(
sql_session,
product,
until=until_transaction,
)
)
product_price_ = product_price(
sql_session,
product,
until=until_transaction,
)
assert product_price_ == 27
# Price goes up and gets rounded up to the next integer
def test_product_price_round_up_from_below(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
amount=27 * 2,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
# Price should be 27
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 2),
amount=38 * 3,
per_product=38,
product_count=3,
user_id=user.id,
product_id=product.id,
),
        # Price should be averaged upwards
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
product_price_ = product_price(sql_session, product)
assert product_price_ == math.ceil((27 * 2 + 38 * 3) / (2 + 3))
# Price goes down and gets rounded up to the next integer
def test_product_price_round_up_from_above(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
amount=27 * 2,
per_product=27,
product_count=2,
user_id=user.id,
product_id=product.id,
),
# Price should be 27
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 2),
amount=20 * 3,
per_product=20,
product_count=3,
user_id=user.id,
product_id=product.id,
),
        # Price should be averaged downwards
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
product_price_ = product_price(sql_session, product)
assert product_price_ == math.ceil((27 * 2 + 20 * 3) / (2 + 3))
def test_product_price_with_negative_stock_single_addition(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 0),
amount=1,
per_product=10,
product_count=1,
user_id=user.id,
product_id=product.id,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 13, 0, 1),
product_count=10,
user_id=user.id,
product_id=product.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 2),
amount=22,
per_product=22,
product_count=1,
user_id=user.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
    # Stock went negative, so the price should fall back to the last added product's price
product1_price = product_price(sql_session, product)
assert product1_price == 22
# TODO: what happens when stock is still negative and yet new products are added?
def test_product_price_with_negative_stock_multiple_additions(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 0),
amount=1,
per_product=10,
product_count=1,
user_id=user.id,
product_id=product.id,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 13, 0, 1),
product_count=10,
user_id=user.id,
product_id=product.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 2),
amount=22,
per_product=22,
product_count=1,
user_id=user.id,
product_id=product.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 3),
amount=29,
per_product=29,
product_count=2,
user_id=user.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
    # Stock went negative, so the price should be the ceiled average of the products added afterwards
product1_price = product_price(sql_session, product)
assert product1_price == math.ceil((22 + 29 * 2) / (1 + 2))
def test_product_price_joint_transactions(sql_session: Session) -> None:
user1, product = insert_test_data(sql_session)
user2 = User("Test User 2")
sql_session.add(user2)
sql_session.commit()
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
amount=30 * 3,
per_product=30,
product_count=3,
user_id=user1.id,
product_id=product.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 1),
amount=20 * 2,
per_product=20,
product_count=2,
user_id=user2.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
product_price_ = product_price(sql_session, product)
assert product_price_ == math.ceil((30 * 3 + 20 * 2) / (3 + 2))
joint_buy_product(
sql_session,
time=datetime(2023, 10, 1, 12, 0, 2),
instigator=user1,
users=[user1, user2],
product=product,
product_count=2,
)
pprint(product_price_log(sql_session, product))
old_product_price = product_price_
product_price_ = product_price(sql_session, product)
assert product_price_ == old_product_price, (
"Joint buy transactions should not affect product price"
)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 3),
amount=25 * 4,
per_product=25,
product_count=4,
user_id=user1.id,
product_id=product.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(product_price_log(sql_session, product))
product_price_ = product_price(sql_session, product)
# Expected state:
# Added products:
# Count: 3 + 2 = 5, Price: (30 * 3 + 20 * 2) / 5 = 26
# Joint bought products:
# Count: 5 - 2 = 3, Price: n/a (should not affect price)
# Added products:
# Count: 3 + 4 = 7, Price: (26 * 3 + 25 * 4) / (3 + 4) = 25.57 -> 26
assert product_price_ == math.ceil((26 * 3 + 25 * 4) / (3 + 4))


@@ -0,0 +1,182 @@
from datetime import datetime
from sqlalchemy import select
from sqlalchemy.orm import Session
from dibbler.models import Product, Transaction, User
from dibbler.queries import product_stock, joint_buy_product
def insert_test_data(sql_session: Session) -> None:
user1 = User("Test User 1")
sql_session.add(user1)
sql_session.commit()
def test_product_stock_basic_history(sql_session: Session) -> None:
insert_test_data(sql_session)
user1 = sql_session.scalars(select(User).where(User.name == "Test User 1")).one()
product = Product("1234567890123", "Test Product")
sql_session.add(product)
sql_session.commit()
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
amount=10,
per_product=10,
user_id=user1.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
assert product_stock(sql_session, product) == 1
def test_product_stock_complex_history(sql_session: Session) -> None:
insert_test_data(sql_session)
user1 = sql_session.scalars(select(User).where(User.name == "Test User 1")).one()
product = Product("1234567890123", "Test Product")
sql_session.add(product)
sql_session.commit()
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 0),
amount=27 * 2,
per_product=27,
user_id=user1.id,
product_id=product.id,
product_count=2,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 13, 0, 1),
user_id=user1.id,
product_id=product.id,
product_count=3,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 13, 0, 2),
amount=50 * 4,
per_product=50,
user_id=user1.id,
product_id=product.id,
product_count=4,
),
Transaction.adjust_stock(
time=datetime(2023, 10, 1, 15, 0, 0),
user_id=user1.id,
product_id=product.id,
product_count=3,
),
Transaction.adjust_stock(
time=datetime(2023, 10, 1, 15, 0, 1),
user_id=user1.id,
product_id=product.id,
product_count=-2,
),
]
sql_session.add_all(transactions)
sql_session.commit()
assert product_stock(sql_session, product) == 2 - 3 + 4 + 3 - 2
def test_product_stock_no_transactions(sql_session: Session) -> None:
insert_test_data(sql_session)
product = Product("1234567890123", "Test Product")
sql_session.add(product)
sql_session.commit()
assert product_stock(sql_session, product) == 0
def test_negative_product_stock(sql_session: Session) -> None:
insert_test_data(sql_session)
user1 = sql_session.scalars(select(User).where(User.name == "Test User 1")).one()
product = Product("1234567890123", "Test Product")
sql_session.add(product)
sql_session.commit()
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 14, 0, 0),
amount=50,
per_product=50,
user_id=user1.id,
product_id=product.id,
product_count=1,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 14, 0, 1),
user_id=user1.id,
product_id=product.id,
product_count=2,
),
Transaction.adjust_stock(
time=datetime(2023, 10, 1, 16, 0, 0),
user_id=user1.id,
product_id=product.id,
product_count=-1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
    # Stock should be negative: 1 added, 2 bought, then adjusted down by 1
assert product_stock(sql_session, product) == 1 - 2 - 1
def test_product_stock_joint_transaction(sql_session: Session) -> None:
insert_test_data(sql_session)
user1 = sql_session.scalars(select(User).where(User.name == "Test User 1")).one()
user2 = User("Test User 2")
sql_session.add(user2)
sql_session.commit()
product = Product("1234567890123", "Test Product")
sql_session.add(product)
sql_session.commit()
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 17, 0, 0),
amount=100,
per_product=100,
user_id=user1.id,
product_id=product.id,
product_count=5,
),
]
sql_session.add_all(transactions)
sql_session.commit()
joint_buy_product(
sql_session,
time=datetime(2023, 10, 1, 17, 0, 1),
instigator=user1,
users=[user1, user2],
product=product,
product_count=3,
)
assert product_stock(sql_session, product) == 5 - 3
def test_product_stock_throw_away(sql_session: Session) -> None: ...


@@ -0,0 +1,88 @@
from sqlalchemy.orm import Session
from dibbler.models import Product
from dibbler.queries import search_product
def insert_test_data(sql_session: Session) -> list[Product]:
products = [
Product("1234567890123", "Test Product A"),
Product("2345678901234", "Test Product B"),
Product("3456789012345", "Another Product"),
Product("4567890123456", "Hidden Product", hidden=True),
]
sql_session.add_all(products)
sql_session.commit()
return products
def test_search_product_no_products(sql_session: Session) -> None:
result = search_product("Nonexistent Product", sql_session)
assert isinstance(result, list)
assert len(result) == 0
def test_search_product_name_exact_match(sql_session: Session) -> None:
insert_test_data(sql_session)
result = search_product("Test Product A", sql_session)
assert isinstance(result, Product)
assert result.bar_code == "1234567890123"
def test_search_product_name_partial_match(sql_session: Session) -> None:
insert_test_data(sql_session)
result = search_product("Test Product", sql_session)
assert isinstance(result, list)
assert len(result) == 2
names = {product.name for product in result}
assert names == {"Test Product A", "Test Product B"}
def test_search_product_name_no_match(sql_session: Session) -> None:
insert_test_data(sql_session)
result = search_product("Nonexistent", sql_session)
assert isinstance(result, list)
assert len(result) == 0
def test_search_product_barcode_exact_match(sql_session: Session) -> None:
products = insert_test_data(sql_session)
product = products[1] # Test Product B
result = search_product(product.bar_code, sql_session)
assert isinstance(result, Product)
assert result.name == product.name
# Should not be able to find hidden products
def test_search_product_hidden_products(sql_session: Session) -> None:
insert_test_data(sql_session)
result = search_product("Hidden Product", sql_session)
assert isinstance(result, list)
assert len(result) == 0
# Should be able to find hidden products if specified
def test_search_product_find_hidden_products(sql_session: Session) -> None:
insert_test_data(sql_session)
result = search_product("Hidden Product", sql_session, find_hidden_products=True)
assert isinstance(result, Product)
assert result.name == "Hidden Product"
# Hidden products should still be findable by barcode even when find_hidden_products is not set
def test_search_product_hidden_products_by_barcode(sql_session: Session) -> None:
products = insert_test_data(sql_session)
hidden_product = products[3] # Hidden Product
result = search_product(hidden_product.bar_code, sql_session)
assert isinstance(result, Product)
assert result.name == "Hidden Product"


@@ -0,0 +1,78 @@
from sqlalchemy.orm import Session
from dibbler.models import User
from dibbler.queries import search_user
USERS = [
    ("alice", 123),
    ("bob", 125),
    ("charlie", 126),
    ("david", 127),
    ("eve", 128),
    ("evey", 129),
    ("evy", 130),
    ("-symbol-man", 131),
    ("user_123", 132),
]
def setup_users(sql_session: Session) -> None:
    for username, rfid in USERS:
        sql_session.add(User(name=username, rfid=str(rfid)))
    sql_session.commit()
def test_search_user_exact_match(sql_session: Session) -> None:
setup_users(sql_session)
user = search_user("alice", sql_session)
assert user is not None
assert isinstance(user, User)
assert user.name == "alice"
user = search_user("125", sql_session)
assert user is not None
assert isinstance(user, User)
assert user.name == "bob"
def test_search_user_partial_match(sql_session: Session) -> None:
setup_users(sql_session)
users = search_user("ev", sql_session)
assert isinstance(users, list)
assert len(users) == 3
names = {user.name for user in users}
assert names == {"eve", "evey", "evy"}
users = search_user("user", sql_session)
assert isinstance(users, list)
assert len(users) == 1
assert users[0].name == "user_123"
def test_search_user_no_match(sql_session: Session) -> None:
setup_users(sql_session)
result = search_user("nonexistent", sql_session)
assert isinstance(result, list)
assert len(result) == 0
def test_search_user_special_characters(sql_session: Session) -> None:
setup_users(sql_session)
user = search_user("-symbol-man", sql_session)
assert user is not None
assert isinstance(user, User)
assert user.name == "-symbol-man"
def test_search_by_rfid(sql_session: Session) -> None:
setup_users(sql_session)
user = search_user("130", sql_session)
assert user is not None
assert isinstance(user, User)
assert user.name == "evy"


@@ -0,0 +1,571 @@
from datetime import datetime, timedelta
import pytest
from sqlalchemy.orm import Session
from dibbler.models import (
Product,
Transaction,
TransactionType,
User,
)
from dibbler.queries import transaction_log
def insert_test_data(sql_session: Session) -> tuple[User, User, Product, Product]:
user1 = User("Test User 1")
user2 = User("Test User 2")
product1 = Product("1234567890123", "Test Product 1")
product2 = Product("9876543210987", "Test Product 2")
sql_session.add_all([user1, user2, product1, product2])
sql_session.commit()
return user1, user2, product1, product2
def insert_default_test_transactions(
sql_session: Session,
user1: User,
user2: User,
product1: Product,
product2: Product,
) -> list[Transaction]:
transactions = [
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 0),
amount=100,
user_id=user1.id,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 1),
amount=50,
user_id=user2.id,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 2),
amount=-50,
user_id=user1.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 0),
amount=27 * 2,
per_product=27,
product_count=2,
user_id=user1.id,
product_id=product1.id,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 1),
product_count=1,
user_id=user2.id,
product_id=product2.id,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 12, 0, 2),
amount=15 * 1,
per_product=15,
product_count=1,
user_id=user2.id,
product_id=product2.id,
),
Transaction.transfer(
time=datetime(2023, 10, 1, 14, 0, 0),
amount=30,
user_id=user1.id,
transfer_user_id=user2.id,
),
]
sql_session.add_all(transactions)
sql_session.commit()
return transactions
def test_user_transactions_no_transactions(sql_session: Session) -> None:
insert_test_data(sql_session)
transactions = transaction_log(sql_session)
assert len(transactions) == 0
def test_transaction_log_filtered_by_user(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
insert_default_test_transactions(sql_session, user, user2, product, product2)
assert len(transaction_log(sql_session, user=user)) == 4
assert len(transaction_log(sql_session, user=user2)) == 3
def test_transaction_log_filtered_by_product(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
insert_default_test_transactions(sql_session, user, user2, product, product2)
assert len(transaction_log(sql_session, product=product)) == 1
assert len(transaction_log(sql_session, product=product2)) == 2
def test_transaction_log_after_datetime(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
after_time=transactions[2].time,
)
)
== len(transactions) - 2
)
def test_transaction_log_after_datetime_no_transactions(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
after_time=transactions[-1].time + timedelta(seconds=1),
)
)
== 0
)
def test_transaction_log_after_datetime_exclusive(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
after_time=transactions[2].time,
exclusive_after=True,
)
)
== len(transactions) - 3
)
def test_transaction_log_after_transaction_id(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
first_transaction = transactions[0]
assert len(
transaction_log(
sql_session,
after_transaction_id=first_transaction.id,
)
) == len(transactions)
def test_transaction_log_after_transaction_id_one_transaction(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
last_transaction = transactions[-1]
assert (
len(
transaction_log(
sql_session,
after_transaction_id=last_transaction.id,
)
)
== 1
)
def test_transaction_log_after_transaction_id_exclusive(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
third_transaction = transactions[2]
assert (
len(
transaction_log(
sql_session,
after_transaction_id=third_transaction.id,
exclusive_after=True,
)
)
== len(transactions) - 3
)
def test_transaction_log_before_datetime(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
before_time=transactions[-3].time,
)
)
== len(transactions) - 2
)
def test_transaction_log_before_datetime_no_transactions(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
before_time=transactions[0].time - timedelta(seconds=1),
)
)
== 0
)
def test_transaction_log_before_datetime_exclusive(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
before_time=transactions[-3].time,
exclusive_before=True,
)
)
== len(transactions) - 3
)
def test_transaction_log_before_transaction_id(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
    third_last_transaction = transactions[-3]
    assert (
        len(
            transaction_log(
                sql_session,
                before_transaction_id=third_last_transaction.id,
            )
        )
        == len(transactions) - 2
    )
def test_transaction_log_before_transaction_id_one_transaction(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
first_transaction = transactions[0]
assert (
len(
transaction_log(
sql_session,
before_transaction_id=first_transaction.id,
)
)
== 1
)
def test_transaction_log_before_transaction_id_exclusive(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
    third_last_transaction = transactions[-3]
    assert (
        len(
            transaction_log(
                sql_session,
                before_transaction_id=third_last_transaction.id,
                exclusive_before=True,
            )
        )
        == len(transactions) - 3
    )
)
def test_transaction_log_before_after_datetime_combined(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
fifth_transaction = transactions[4]
assert (
len(
transaction_log(
sql_session,
after_time=second_transaction.time,
before_time=fifth_transaction.time,
)
)
== 4
)
def test_transaction_log_before_after_transaction_id_combined(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
fifth_transaction = transactions[4]
assert (
len(
transaction_log(
sql_session,
after_transaction_id=second_transaction.id,
before_transaction_id=fifth_transaction.id,
)
)
== 4
)
def test_transaction_log_before_date_after_transaction_id(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
fifth_transaction = transactions[4]
assert (
len(
transaction_log(
sql_session,
before_time=fifth_transaction.time,
after_transaction_id=second_transaction.id,
)
)
== 4
)
def test_transaction_log_before_transaction_id_after_date(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
fifth_transaction = transactions[4]
assert (
len(
transaction_log(
sql_session,
before_transaction_id=fifth_transaction.id,
after_time=second_transaction.time,
)
)
== 4
)
def test_transaction_log_after_product_and_user_not_allowed(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
insert_default_test_transactions(sql_session, user, user2, product, product2)
with pytest.raises(ValueError):
transaction_log(
sql_session,
user=user,
product=product,
after_time=datetime(2023, 10, 1, 11, 0, 0),
)
def test_transaction_log_after_datetime_and_transaction_id_not_allowed(
sql_session: Session,
) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
insert_default_test_transactions(sql_session, user, user2, product, product2)
with pytest.raises(ValueError):
transaction_log(
sql_session,
user=user,
after_time=datetime(2023, 10, 1, 11, 0, 0),
after_transaction_id=1,
)
def test_transaction_log_limit(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert len(transaction_log(sql_session, limit=3)) == 3
assert len(transaction_log(sql_session, limit=len(transactions) + 3)) == len(transactions)
def test_transaction_log_filtered_by_transaction_type(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
transaction_type=[TransactionType.ADJUST_BALANCE],
)
)
== 3
)
assert (
len(
transaction_log(
sql_session,
transaction_type=[TransactionType.ADD_PRODUCT],
)
)
== 2
)
assert (
len(
transaction_log(
sql_session,
transaction_type=[TransactionType.BUY_PRODUCT, TransactionType.ADD_PRODUCT],
)
)
== 3
)
def test_transaction_log_filtered_by_transaction_type_negated(sql_session: Session) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
assert (
len(
transaction_log(
sql_session,
transaction_type=[TransactionType.ADJUST_BALANCE],
negate_transaction_type_filter=True,
)
)
== len(transactions) - 3
)
assert (
len(
transaction_log(
sql_session,
transaction_type=[TransactionType.ADD_PRODUCT],
negate_transaction_type_filter=True,
)
)
== len(transactions) - 2
)
assert (
len(
transaction_log(
sql_session,
transaction_type=[TransactionType.BUY_PRODUCT, TransactionType.ADD_PRODUCT],
negate_transaction_type_filter=True,
)
)
== len(transactions) - 3
)
def test_transaction_log_combined_filter_user_datetime_transaction_type_limit(
sql_session: Session,
) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
sixth_transaction = transactions[5]
result = transaction_log(
sql_session,
user=user,
after_time=second_transaction.time,
before_time=sixth_transaction.time,
transaction_type=[TransactionType.ADJUST_BALANCE, TransactionType.ADD_PRODUCT],
limit=2,
)
assert len(result) == 2
def test_transaction_log_combined_filter_user_transaction_id_transaction_type_limit(
sql_session: Session,
) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
sixth_transaction = transactions[5]
result = transaction_log(
sql_session,
user=user,
after_transaction_id=second_transaction.id,
before_transaction_id=sixth_transaction.id,
transaction_type=[TransactionType.ADJUST_BALANCE, TransactionType.ADD_PRODUCT],
limit=2,
)
assert len(result) == 2
def test_transaction_log_combined_filter_product_datetime_transaction_type_limit(
sql_session: Session,
) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
sixth_transaction = transactions[5]
result = transaction_log(
sql_session,
product=product2,
after_time=second_transaction.time,
before_time=sixth_transaction.time,
transaction_type=[TransactionType.BUY_PRODUCT, TransactionType.ADD_PRODUCT],
limit=2,
)
assert len(result) == 2
def test_transaction_log_combined_filter_product_transaction_id_transaction_type_limit(
sql_session: Session,
) -> None:
user, user2, product, product2 = insert_test_data(sql_session)
transactions = insert_default_test_transactions(sql_session, user, user2, product, product2)
second_transaction = transactions[1]
sixth_transaction = transactions[5]
result = transaction_log(
sql_session,
product=product2,
after_transaction_id=second_transaction.id,
before_transaction_id=sixth_transaction.id,
transaction_type=[TransactionType.BUY_PRODUCT, TransactionType.ADD_PRODUCT],
limit=2,
)
assert len(result) == 2
# NOTE: see the corresponding TODOs above the function definition
def test_transaction_log_filtered_by_user_joint_transactions(sql_session: Session) -> None: ...
def test_transaction_log_filtered_by_user_throw_away_transactions(sql_session: Session) -> None: ...

import math
from datetime import datetime
from pprint import pprint
from sqlalchemy.orm import Session
from dibbler.models import Product, Transaction, User
from dibbler.queries import user_balance, user_balance_log
# TODO: see if we can use pytest_runtest_makereport to print the "user_balance_log"s
# only on failures instead of inlining it in every test function
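The TODO above could be addressed with a small `conftest.py` hook. A minimal sketch, untested against this suite; the `_balance_log` attribute name is hypothetical and tests would need to stash their log on the node themselves:

```python
# Hypothetical conftest.py sketch: print a captured balance log only when
# the test that stored it fails, instead of pprint-ing in every test body.
import pytest

@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    outcome = yield
    report = outcome.get_result()
    if report.when == "call" and report.failed:
        # Tests would stash their log on the item, e.g. via the `request`
        # fixture: request.node._balance_log = user_balance_log(...)
        log = getattr(item, "_balance_log", None)
        if log is not None:
            print("user_balance_log:", log)
```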
def insert_test_data(sql_session: Session) -> tuple[User, Product]:
user = User("Test User")
product = Product("1234567890123", "Test Product")
sql_session.add(user)
sql_session.add(product)
sql_session.commit()
return user, product
def test_user_balance_no_transactions(sql_session: Session) -> None:
user, _ = insert_test_data(sql_session)
pprint(user_balance_log(sql_session, user))
balance = user_balance(sql_session, user)
assert balance == 0
def test_user_balance_basic_history(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user.id,
amount=100,
),
Transaction.add_product(
time=datetime(2023, 10, 1, 10, 0, 1),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user))
balance = user_balance(sql_session, user)
assert balance == 100 + 27
def test_user_balance_with_transfers(sql_session: Session) -> None:
user1, product = insert_test_data(sql_session)
user2 = User("Test User 2")
sql_session.add(user2)
sql_session.commit()
transactions = [
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user1.id,
amount=100,
),
Transaction.transfer(
time=datetime(2023, 10, 1, 10, 0, 1),
user_id=user1.id,
transfer_user_id=user2.id,
amount=50,
),
Transaction.transfer(
time=datetime(2023, 10, 1, 10, 0, 2),
user_id=user2.id,
transfer_user_id=user1.id,
amount=30,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user1))
user1_balance = user_balance(sql_session, user1)
assert user1_balance == 100 - 50 + 30
pprint(user_balance_log(sql_session, user2))
user2_balance = user_balance(sql_session, user2)
assert user2_balance == 50 - 30
def test_user_balance_complex_history(sql_session: Session) -> None:
raise NotImplementedError("This test is not implemented yet.")
def test_user_balance_penalty(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 11, 0, 0),
user_id=user.id,
amount=-200,
),
# Penalized, pays 2x the price (default penalty)
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user))
assert user_balance(sql_session, user) == 27 - 200 - (27 * 2)
def test_user_balance_changing_penalty(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 11, 0, 0),
user_id=user.id,
amount=-200,
),
# Penalized, pays 2x the price (default penalty)
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
Transaction.adjust_penalty(
time=datetime(2023, 10, 1, 13, 0, 0),
user_id=user.id,
penalty_multiplier_percent=300,
penalty_threshold=-100,
),
# Penalized, pays 3x the price
Transaction.buy_product(
time=datetime(2023, 10, 1, 14, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user))
assert user_balance(sql_session, user) == 27 - 200 - (27 * 2) - (27 * 3)
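The penalty arithmetic these assertions rely on can be reproduced standalone. A minimal sketch, assuming the multiplier applies whenever the balance is below the threshold and prices round up to whole units; the helper name and the 200%/0 defaults are inferred from the "pays 2x the price (default penalty)" comments, not taken from dibbler:

```python
import math

def price_with_penalty(price: int, balance: int,
                       penalty_multiplier_percent: int = 200,
                       penalty_threshold: int = 0) -> int:
    # Below the threshold the buyer pays the multiplied price, rounded up.
    if balance < penalty_threshold:
        return math.ceil(price * penalty_multiplier_percent / 100)
    return price

print(price_with_penalty(27, -173))       # 54: default 2x penalty
print(price_with_penalty(27, -173, 300))  # 81: 3x after adjust_penalty
```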
def test_user_balance_interest(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
),
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 11, 0, 0),
user_id=user.id,
interest_rate_percent=110,
),
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user))
assert user_balance(sql_session, user) == 27 - math.ceil(27 * 1.1)
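The interest rounding in the assertion above can be checked in isolation. A minimal sketch (the helper name is hypothetical): the price is scaled by the interest rate and rounded up to a whole unit, matching `math.ceil(27 * 1.1) == 30`:

```python
import math

def price_with_interest(price: int, interest_rate_percent: int) -> int:
    # Scale the price by the interest rate, then round up to a whole unit.
    return math.ceil(price * interest_rate_percent / 100)

print(price_with_interest(27, 110))  # 30
print(price_with_interest(27, 120))  # 33
```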
def test_user_balance_changing_interest(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27 * 3,
per_product=27,
product_count=3,
),
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 11, 0, 0),
user_id=user.id,
interest_rate_percent=110,
),
# Pays 1.1x the price
Transaction.buy_product(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 13, 0, 0),
user_id=user.id,
interest_rate_percent=120,
),
# Pays 1.2x the price
Transaction.buy_product(
time=datetime(2023, 10, 1, 14, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user))
assert user_balance(sql_session, user) == 27 * 3 - math.ceil(27 * 1.1) - math.ceil(27 * 1.2)
def test_user_balance_penalty_interest_combined(sql_session: Session) -> None:
user, product = insert_test_data(sql_session)
transactions = [
Transaction.add_product(
time=datetime(2023, 10, 1, 10, 0, 0),
user_id=user.id,
product_id=product.id,
amount=27,
per_product=27,
product_count=1,
),
Transaction.adjust_interest(
time=datetime(2023, 10, 1, 11, 0, 0),
user_id=user.id,
interest_rate_percent=110,
),
Transaction.adjust_balance(
time=datetime(2023, 10, 1, 12, 0, 0),
user_id=user.id,
amount=-200,
),
# Penalized, pays 2x the price (default penalty)
# Pays 1.1x the price
Transaction.buy_product(
time=datetime(2023, 10, 1, 13, 0, 0),
user_id=user.id,
product_id=product.id,
product_count=1,
),
]
sql_session.add_all(transactions)
sql_session.commit()
pprint(user_balance_log(sql_session, user))
assert user_balance(sql_session, user) == (27 - 200 - math.ceil(27 * 2 * 1.1))
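The combined assertion implies that penalty and interest both scale the price before a single final round-up, i.e. `math.ceil(27 * 2 * 1.1) == 60`. A sketch of that composition (the helper name is hypothetical):

```python
import math

def purchase_cost(price: int,
                  interest_rate_percent: int = 100,
                  penalty_multiplier_percent: int = 100) -> int:
    # Apply both multipliers, then round up once at the end.
    scaled = price * penalty_multiplier_percent / 100 * interest_rate_percent / 100
    return math.ceil(scaled)

print(purchase_cost(27, interest_rate_percent=110, penalty_multiplier_percent=200))  # 60
```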
def test_user_balance_joint_transactions(sql_session: Session) -> None:
    pass
def test_user_balance_joint_transactions_interest(sql_session: Session) -> None:
    pass
def test_user_balance_joint_transactions_changing_interest(sql_session: Session) -> None:
    pass
def test_user_balance_joint_transactions_penalty(sql_session: Session) -> None:
    pass
def test_user_balance_joint_transactions_changing_penalty(sql_session: Session) -> None:
    pass
def test_user_balance_joint_transactions_penalty_interest_combined(sql_session: Session) -> None:
    pass
def test_user_balance_throw_away_products(sql_session: Session) -> None:
    pass
