# Compare commits

Comparing `word-regro...sqlite-icu` (9 commits):

- 7bacfc39a8
- c74a5f5cb6
- 4fbed59143
- 61ac226fc3
- ede57a7a00
- 2ad1e038f1
- f40825de65
- 5aa068eaec
- 170c3a853e
## README.md (24 lines changed)
```diff
@@ -18,26 +18,4 @@ Note that while the license for the code is MIT, the data has various licenses.
 | **Tanos JLPT levels:** | https://www.tanos.co.uk/jlpt/ |
 | **Kangxi Radicals:** | https://ctext.org/kangxi-zidian |
 
-## Implementation details
-
-### Word search
-
-The word search procedure is currently split into 3 parts:
-
-1. **Entry ID query**:
-
-   Use a complex query with various scoring factors to try to get list of
-   database ids pointing at dictionary entries, sorted by how likely we think this
-   word is the word that the caller is looking for. The output here is a `List<int>`
-
-2. **Data Query**:
-
-   Takes the entry id list from the last search, and performs all queries needed to retrieve
-   all the dictionary data for those IDs. The result is a struct with a bunch of flattened lists
-   with data for all the dictionary entries. These lists are sorted by the order that the ids
-   were provided.
-
-3. **Regrouping**:
-
-   Takes the flattened data, and regroups the items into structs with a more "hierarchical" structure.
-   All data tagged with the same ID will end up in the same struct. Returns a list of these structs.
+See [docs/overview.md](./docs/overview.md) for notes and implementation details.
```
## docs/lemmatizer.md (new file, 13 lines)
# Lemmatizer

The lemmatizer is still quite experimental, but it will play a more important role in the project in the future.

It is a manual implementation of a [Finite State Transducer](https://en.wikipedia.org/wiki/Morphological_dictionary#Finite_State_Transducers) for morphological parsing. The FST is used to recursively remove affixes from a word until it (hopefully) deconjugates into its dictionary form. The resulting deconjugation tree is then combined with queries against the dictionary data to determine whether the deconjugation leads to a real, known word.

Each rule is a separate static object declared in `lib/util/lemmatizer/rules`.

There is a CLI subcommand for testing the tool interactively; you can run:

```bash
dart run jadb lemmatize -w '食べさせられない'
```
## docs/overview.md (new file, 27 lines)
# Overview

This is the documentation for `jadb`. Since I'm currently the only one working on it, the documentation is more or less just notes to myself, to ensure I remember how and why I implemented certain features in a certain way a few months down the road. It is not comprehensive, formal documentation for downstream use, neither for developers nor for end-users.

- [Word Search](./word-search.md)
- [Lemmatizer](./lemmatizer.md)

## Project structure

- `lib/_data_ingestion` contains all the code for reading data sources, transforming them, and compiling them into an SQLite database. This is for the most part isolated from the rest of the codebase, and should not be depended on by any code used for querying the database.
- `lib/cli` contains code for CLI tooling (e.g. argument parsing, subcommand handling, etc.).
- `lib/const_data` contains database data that is small enough to warrant being hardcoded as Dart constants.
- `lib/models` contains all the code for representing the database schema as Dart classes, and for converting between those classes and the actual database.
- `lib/search` contains all the code for searching the database.
- `lib/util/lemmatizer` contains the code for lemmatization, which will be used by the search code in the future.
- `migrations` contains raw SQL files for creating the database schema.

## SQLite naming conventions

> [!WARNING]
> None of these conventions are actually enforced yet; this will be fixed at some point.

- Indices are prefixed with `IDX__`
- Crossref tables are prefixed with `XREF__`
- Trigger names are prefixed with `TRG__`
- Views are prefixed with `VW__`
- All data sources should have a `<datasource>_Version` table, which contains a single row with the version of the data source used to generate the database.
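As an illustration of the conventions above, a migration following them could declare objects like these. All names here are hypothetical and do not exist in the actual schema:

```sql
-- Hypothetical objects illustrating the prefixes above; none of these
-- exist in the real schema.
CREATE TABLE "XREF__Entry_Tag" (
    "entryId" INTEGER NOT NULL,
    "tagId" INTEGER NOT NULL,
    PRIMARY KEY ("entryId", "tagId")
) WITHOUT ROWID;

CREATE INDEX "IDX__XREF__Entry_Tag_byTagId" ON "XREF__Entry_Tag"("tagId");

CREATE VIEW "VW__TaggedEntries" AS
    SELECT "entryId", "tagId" FROM "XREF__Entry_Tag";
```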
## docs/word-search.md (new file, 21 lines)
# Word search

The word search procedure is currently split into 3 parts:

1. **Entry ID query**:

   Uses a complex query with various scoring factors to try to get a list of
   database ids pointing at dictionary entries, sorted by how likely we think each
   entry is the word that the caller is looking for. The output here is a `List<int>`.

2. **Data Query**:

   Takes the entry id list from the previous step, and performs all queries needed to retrieve
   all the dictionary data for those IDs. The result is a struct with a bunch of flattened lists
   with data for all the dictionary entries. These lists are sorted in the order that the ids
   were provided.

3. **Regrouping**:

   Takes the flattened data, and regroups the items into structs with a more "hierarchical" structure.
   All data tagged with the same ID ends up in the same struct. Returns a list of these structs.
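The regrouping step can be sketched as follows. This is a minimal sketch in Python rather than Dart, and the flattened lists and field names are hypothetical, not the project's actual data:

```python
from collections import defaultdict

# Hypothetical flattened results as they might come back from the data-query
# step: (entry_id, value) pairs. Field names are illustrative only.
flat_readings = [(1, "たべる"), (2, "のむ"), (1, "くう")]
flat_glosses = [(1, "to eat"), (2, "to drink")]

def regroup(entry_ids, *flat_lists):
    """Regroup flattened (id, value) lists into one dict per entry id,
    preserving the order of entry_ids from the entry-id query."""
    grouped = {eid: defaultdict(list) for eid in entry_ids}
    for name, rows in flat_lists:
        for eid, value in rows:
            grouped[eid][name].append(value)
    return [dict(grouped[eid]) | {"entryId": eid} for eid in entry_ids]

# Entry ids arrive pre-sorted by relevance; the output keeps that order.
results = regroup([2, 1], ("readings", flat_readings), ("glosses", flat_glosses))
```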
## flake.lock (generated, 12 lines changed)
```diff
@@ -3,7 +3,7 @@
     "jmdict-src": {
       "flake": false,
       "locked": {
-        "narHash": "sha256-lh46uougUzBrRhhwa7cOb32j5Jt9/RjBUhlVjwVzsII=",
+        "narHash": "sha256-eOc3a/AYNRFF3w6lWhyf0Sh92xeXS7+9Qvn0tvvH6Ys=",
         "type": "file",
         "url": "http://ftp.edrdg.org/pub/Nihongo/JMdict_e.gz"
       },
@@ -15,7 +15,7 @@
     "jmdict-with-examples-src": {
       "flake": false,
       "locked": {
-        "narHash": "sha256-5oS2xDyetbuSM6ax3LUjYA3N60x+D3Hg41HEXGFMqLQ=",
+        "narHash": "sha256-nx+WMkscWvA/XImKM7NESYVmICwSgXWOO1KPXasHY94=",
         "type": "file",
         "url": "http://ftp.edrdg.org/pub/Nihongo/JMdict_e_examp.gz"
       },
@@ -27,7 +27,7 @@
     "kanjidic2-src": {
       "flake": false,
       "locked": {
-        "narHash": "sha256-orSeQqSxhn9TtX3anYtbiMEm7nFkuomGnIKoVIUR2CM=",
+        "narHash": "sha256-2T/cAS/kZmVMURStgHVhz524+J9+v5onKs8eEYf2fY0=",
         "type": "file",
         "url": "https://www.edrdg.org/kanjidic/kanjidic2.xml.gz"
       },
@@ -38,11 +38,11 @@
     },
     "nixpkgs": {
       "locked": {
-        "lastModified": 1771848320,
-        "narHash": "sha256-0MAd+0mun3K/Ns8JATeHT1sX28faLII5hVLq0L3BdZU=",
+        "lastModified": 1774386573,
+        "narHash": "sha256-4hAV26quOxdC6iyG7kYaZcM3VOskcPUrdCQd/nx8obc=",
         "owner": "NixOS",
         "repo": "nixpkgs",
-        "rev": "2fc6539b481e1d2569f25f8799236694180c0993",
+        "rev": "46db2e09e1d3f113a13c0d7b81e2f221c63b8ce9",
         "type": "github"
       },
       "original": {
```
## flake.nix (47 lines changed)
```diff
@@ -43,7 +43,12 @@
       "armv7l-linux"
     ];
 
-    forAllSystems = f: lib.genAttrs systems (system: f system nixpkgs.legacyPackages.${system});
+    forAllSystems = f: lib.genAttrs systems (system: let
+      pkgs = import nixpkgs {
+        inherit system;
+        overlays = [ self.overlays.sqlite-icu-ext ];
+      };
+    in f system pkgs);
   in {
     apps = forAllSystems (system: pkgs: {
       default = {
@@ -77,15 +82,12 @@
 
     devShells = forAllSystems (system: pkgs: {
       default = pkgs.mkShell {
-        buildInputs = with pkgs; [
+        packages = with pkgs; [
           dart
           gnumake
           lcov
-          sqlite-analyzer
-          sqlite-interactive
-          sqlite-web
-          # sqlint
-          sqlfluff
+          sqldiff
+          sqlite-interactive-icu-ext
         ];
         env = {
           LIBSQLITE_PATH = "${pkgs.sqlite.out}/lib/libsqlite3.so";
@@ -93,8 +95,34 @@
           LD_LIBRARY_PATH = lib.makeLibraryPath [ pkgs.sqlite ];
         };
       };
+
+      sqlite-debugging = pkgs.mkShell {
+        packages = with pkgs; [
+          sqlite-interactive-icu-ext
+          sqlite-analyzer
+          sqlite-web
+          sqlint
+          sqlfluff
+        ];
+      };
     });
+
+    overlays.sqlite-icu-ext = final: prev: let
+      overrideArgs = prev': {
+        configureFlags = prev'.configureFlags ++ [
+          "--with-icu-config=${lib.getExe' prev.icu.dev "icu-config"}"
+          "--enable-icu-collations"
+        ];
+
+        buildInputs = prev'.buildInputs ++ [
+          prev.icu
+        ];
+      };
+    in {
+      sqlite-icu-ext = prev.sqlite.overrideAttrs overrideArgs;
+      sqlite-interactive-icu-ext = prev.sqlite-interactive.overrideAttrs overrideArgs;
+    };
 
     packages = let
       edrdgMetadata = {
        license = [{
@@ -128,6 +156,8 @@
         ln -s ${src} $out
       '';
 
+      inherit (pkgs) sqlite-icu-ext sqlite-interactive-icu-ext;
+
       jmdict = pkgs.callPackage ./nix/jmdict.nix {
         inherit jmdict-src jmdict-with-examples-src edrdgMetadata;
       };
@@ -142,17 +172,20 @@
 
       database-tool = pkgs.callPackage ./nix/database_tool.nix {
         inherit src;
+        sqlite = pkgs.sqlite-icu-ext;
       };
 
       database = pkgs.callPackage ./nix/database.nix {
         inherit (self.packages.${system}) database-tool jmdict radkfile kanjidic2;
         inherit src;
+        sqlite = pkgs.sqlite-icu-ext;
       };
 
       database-wal = pkgs.callPackage ./nix/database.nix {
         inherit (self.packages.${system}) database-tool jmdict radkfile kanjidic2;
         inherit src;
         wal = true;
+        sqlite = pkgs.sqlite-icu-ext;
       };
 
       docs = pkgs.callPackage ./nix/docs.nix {
```
```diff
@@ -5,13 +5,23 @@ import 'package:jadb/_data_ingestion/jmdict/objects.dart';
 import 'package:jadb/table_names/jmdict.dart';
 import 'package:sqflite_common/sqlite_api.dart';
 
+/// A wrapper for the result of resolving an xref, which includes the resolved entry and a flag
+/// indicating whether the xref was ambiguous (i.e. could refer to multiple entries).
 class ResolvedXref {
   Entry entry;
   bool ambiguous;
+  int? senseOrderNum;
 
-  ResolvedXref(this.entry, this.ambiguous);
+  ResolvedXref(this.entry, this.ambiguous, this.senseOrderNum);
 }
 
+/// Resolves an xref (pair of kanji, optionally reading, and optionally sense number) to a specific
+/// JMdict entry, if possible.
+///
+/// If the xref is ambiguous (i.e. it could refer to multiple entries), the
+/// first entry is returned, and the returned value is marked as ambiguous.
+///
+/// If the xref cannot be resolved to any entry at all, an exception is thrown.
 ResolvedXref resolveXref(
   SplayTreeMap<String, Set<Entry>> entriesByKanji,
   SplayTreeMap<String, Set<Entry>> entriesByReading,
@@ -65,9 +75,10 @@ ResolvedXref resolveXref(
       'kanjiRef: ${xref.kanjiRef}, readingRef: ${xref.readingRef}, '
       'senseOrderNum: ${xref.senseOrderNum}',
     );
-    return ResolvedXref(candidateEntries.first, true);
+
+    return ResolvedXref(candidateEntries.first, true, xref.senseOrderNum);
   } else {
-    return ResolvedXref(candidateEntries.first, false);
+    return ResolvedXref(candidateEntries.first, false, xref.senseOrderNum);
   }
 }
@@ -152,14 +163,14 @@ Future<void> seedJMDictData(List<Entry> entries, Database db) async {
         b.insert(JMdictTableNames.senseRestrictedToKanji, {
           'entryId': e.entryId,
           'senseId': s.senseId,
-          'kanji': rk,
+          'kanjiOrderNum': e.kanji.indexWhere((k) => k.reading == rk) + 1,
         });
       }
       for (final rr in s.restrictedToReading) {
         b.insert(JMdictTableNames.senseRestrictedToReading, {
           'entryId': e.entryId,
           'senseId': s.senseId,
-          'reading': rr,
+          'readingOrderNum': e.readings.indexWhere((r) => r.reading == rr) + 1,
         });
       }
       for (final ls in s.languageSource) {
@@ -181,24 +192,17 @@ Future<void> seedJMDictData(List<Entry> entries, Database db) async {
 
   print(' [JMdict] Building xref trees');
   final SplayTreeMap<String, Set<Entry>> entriesByKanji = SplayTreeMap();
+  final SplayTreeMap<String, Set<Entry>> entriesByReading = SplayTreeMap();
+
   for (final entry in entries) {
     for (final kanji in entry.kanji) {
-      if (entriesByKanji.containsKey(kanji.reading)) {
-        entriesByKanji.update(kanji.reading, (list) => list..add(entry));
-      } else {
-        entriesByKanji.putIfAbsent(kanji.reading, () => {entry});
-      }
+      entriesByKanji.putIfAbsent(kanji.reading, () => {});
+      entriesByKanji.update(kanji.reading, (set) => set..add(entry));
     }
-  }
-  final SplayTreeMap<String, Set<Entry>> entriesByReading = SplayTreeMap();
-  for (final entry in entries) {
     for (final reading in entry.readings) {
-      if (entriesByReading.containsKey(reading.reading)) {
-        entriesByReading.update(reading.reading, (list) => list..add(entry));
-      } else {
-        entriesByReading.putIfAbsent(reading.reading, () => {entry});
-      }
+      entriesByReading.putIfAbsent(reading.reading, () => {});
+      entriesByReading.update(reading.reading, (set) => set..add(entry));
     }
   }
 
@@ -207,6 +211,7 @@ Future<void> seedJMDictData(List<Entry> entries, Database db) async {
 
   for (final e in entries) {
     for (final s in e.senses) {
+      final seenSeeAlsoXrefs = <int>{};
       for (final xref in s.seeAlso) {
         final resolvedEntry = resolveXref(
           entriesByKanji,
@@ -214,16 +219,24 @@ Future<void> seedJMDictData(List<Entry> entries, Database db) async {
           xref,
         );
+
+        if (seenSeeAlsoXrefs.contains(resolvedEntry.entry.entryId)) {
+          print(
+            'WARNING: Skipping duplicate seeAlso xref from sense ${s.senseId} to entry ${resolvedEntry.entry.entryId}\n'
+            ' (kanjiRef: ${xref.kanjiRef}, readingRef: ${xref.readingRef}, senseOrderNum: ${xref.senseOrderNum})',
+          );
+          continue;
+        }
+        seenSeeAlsoXrefs.add(resolvedEntry.entry.entryId);
+
         b.insert(JMdictTableNames.senseSeeAlso, {
           'senseId': s.senseId,
           'xrefEntryId': resolvedEntry.entry.entryId,
-          'seeAlsoKanji': xref.kanjiRef,
-          'seeAlsoReading': xref.readingRef,
-          'seeAlsoSense': xref.senseOrderNum,
+          'xrefSenseOrderNum': resolvedEntry.senseOrderNum,
           'ambiguous': resolvedEntry.ambiguous,
         });
       }
+
+      final seenAntonymXrefs = <int>{};
       for (final ant in s.antonyms) {
         final resolvedEntry = resolveXref(
           entriesByKanji,
@@ -231,12 +244,18 @@ Future<void> seedJMDictData(List<Entry> entries, Database db) async {
           ant,
         );
+
+        if (seenAntonymXrefs.contains(resolvedEntry.entry.entryId)) {
+          print(
+            'WARNING: Skipping duplicate antonym xref from sense ${s.senseId} to entry ${resolvedEntry.entry.entryId}\n'
+            ' (kanjiRef: ${ant.kanjiRef}, readingRef: ${ant.readingRef}, senseOrderNum: ${ant.senseOrderNum})',
+          );
+          continue;
+        }
+        seenAntonymXrefs.add(resolvedEntry.entry.entryId);
+
         b.insert(JMdictTableNames.senseAntonyms, {
           'senseId': s.senseId,
           'xrefEntryId': resolvedEntry.entry.entryId,
-          'antonymKanji': ant.kanjiRef,
-          'antonymReading': ant.readingRef,
-          'antonymSense': ant.senseOrderNum,
           'ambiguous': resolvedEntry.ambiguous,
         });
       }
```
```diff
@@ -27,6 +27,7 @@ Future<Database> openLocalDb({
           await db.execute('PRAGMA journal_mode = WAL');
         }
         await db.execute('PRAGMA foreign_keys = ON');
+        await db.execute("SELECT icu_load_collation('ja_JP', 'japanese')");
       },
       readOnly: !readWrite,
     ),
```
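The added `icu_load_collation` call registers a locale-aware collation on the connection, which the schema's `COLLATE japanese` columns then rely on. The effect of a connection-registered collation can be illustrated with Python's `sqlite3` module; the toy collation below is a stand-in for the ICU one, not the real thing:

```python
import sqlite3

def shortest_first(a: str, b: str) -> int:
    # Toy collation: order by length, then code-point order.
    # Must return a negative, zero, or positive int, like a comparator.
    return (len(a) - len(b)) or ((a > b) - (a < b))

conn = sqlite3.connect(":memory:")
# Register the collation on this connection; SQL can now say COLLATE shortest_first.
conn.create_collation("shortest_first", shortest_first)

conn.execute("CREATE TABLE words (reading TEXT COLLATE shortest_first)")
conn.executemany("INSERT INTO words VALUES (?)", [("たべる",), ("のむ",), ("くう",)])
rows = [r for (r,) in conn.execute("SELECT reading FROM words ORDER BY reading")]
```

As with `icu_load_collation`, the registration is per-connection, which is why the real code runs it in the open callback every time the database is opened.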
```diff
@@ -10,11 +10,12 @@ Future<List<JLPTRankedWord>> parseJLPTRankedWords(
 ) async {
   final List<JLPTRankedWord> result = [];
 
-  final codec = CsvCodec(
+  final codec = Csv(
     fieldDelimiter: ',',
     lineDelimiter: '\n',
     quoteMode: QuoteMode.strings,
     escapeCharacter: '\\',
+    parseHeaders: false,
   );
 
   for (final entry in files.entries) {
@@ -29,7 +30,6 @@ Future<List<JLPTRankedWord>> parseJLPTRankedWords(
         .openRead()
         .transform(utf8.decoder)
         .transform(codec.decoder)
-        .flatten()
         .map((row) {
           if (row.length != 3) {
             throw Exception('Invalid line in $jlptLevel: $row');
```
```diff
@@ -1,5 +1,6 @@
 import 'package:jadb/models/common/jlpt_level.dart';
 import 'package:jadb/models/jmdict/jmdict_kanji_info.dart';
+import 'package:jadb/models/jmdict/jmdict_misc.dart';
 import 'package:jadb/models/jmdict/jmdict_reading_info.dart';
 import 'package:jadb/models/word_search/word_search_match_span.dart';
 import 'package:jadb/models/word_search/word_search_ruby.dart';
@@ -45,6 +46,13 @@ class WordSearchResult {
   /// the original searchword.
   List<WordSearchMatchSpan>? matchSpans;
 
+  /// Whether the first item in [japanese] contains kanji that likely is rare.
+  bool get hasUnusualKanji =>
+      (japanese.first.furigana != null &&
+          kanjiInfo[japanese.first.base] == JMdictKanjiInfo.rK) ||
+      senses.where((sense) => sense.misc.contains(JMdictMisc.onlyKana)).length >
+          (senses.length / 2);
+
   /// All contents of [japanese], transliterated to romaji
   List<String> get romaji => japanese
       .map((word) => transliterateKanaToLatin(word.furigana ?? word.base))
```
```diff
@@ -1,3 +1,5 @@
+SELECT icu_load_collation('ja_JP', 'japanese');
+
 CREATE TABLE "JMdict_Version" (
     "version" VARCHAR(10) PRIMARY KEY NOT NULL,
     "date" DATE NOT NULL,
@@ -55,13 +57,13 @@ CREATE TABLE "JMdict_KanjiElement" (
     "elementId" INTEGER PRIMARY KEY,
     "entryId" INTEGER NOT NULL REFERENCES "JMdict_Entry"("entryId"),
     "orderNum" INTEGER NOT NULL,
-    "reading" TEXT NOT NULL,
+    "reading" TEXT NOT NULL COLLATE japanese,
     "news" INTEGER CHECK ("news" BETWEEN 1 AND 2),
     "ichi" INTEGER CHECK ("ichi" BETWEEN 1 AND 2),
     "spec" INTEGER CHECK ("spec" BETWEEN 1 AND 2),
     "gai" INTEGER CHECK ("gai" BETWEEN 1 AND 2),
     "nf" INTEGER CHECK ("nf" BETWEEN 1 AND 48),
-    UNIQUE("entryId", "reading"),
+    -- UNIQUE("entryId", "reading"),
     UNIQUE("entryId", "orderNum")
 ) WITHOUT ROWID;
 
@@ -80,14 +82,14 @@ CREATE TABLE "JMdict_ReadingElement" (
     "elementId" INTEGER PRIMARY KEY,
     "entryId" INTEGER NOT NULL REFERENCES "JMdict_Entry"("entryId"),
     "orderNum" INTEGER NOT NULL,
-    "reading" TEXT NOT NULL,
+    "reading" TEXT NOT NULL COLLATE japanese,
     "readingDoesNotMatchKanji" BOOLEAN NOT NULL DEFAULT FALSE,
     "news" INTEGER CHECK ("news" BETWEEN 1 AND 2),
     "ichi" INTEGER CHECK ("ichi" BETWEEN 1 AND 2),
     "spec" INTEGER CHECK ("spec" BETWEEN 1 AND 2),
     "gai" INTEGER CHECK ("gai" BETWEEN 1 AND 2),
     "nf" INTEGER CHECK ("nf" BETWEEN 1 AND 48),
-    UNIQUE("entryId", "reading"),
+    -- UNIQUE("entryId", "reading"),
     UNIQUE("entryId", "orderNum")
 ) WITHOUT ROWID;
 
@@ -120,17 +122,17 @@ CREATE INDEX "JMdict_Sense_byEntryId_byOrderNum" ON "JMdict_Sense"("entryId", "o
 CREATE TABLE "JMdict_SenseRestrictedToKanji" (
     "entryId" INTEGER NOT NULL,
     "senseId" INTEGER NOT NULL REFERENCES "JMdict_Sense"("senseId"),
-    "kanji" TEXT NOT NULL,
-    FOREIGN KEY ("entryId", "kanji") REFERENCES "JMdict_KanjiElement"("entryId", "reading"),
-    PRIMARY KEY ("entryId", "senseId", "kanji")
+    "kanjiOrderNum" INTEGER NOT NULL CHECK ("kanjiOrderNum" > 0),
+    FOREIGN KEY ("entryId", "kanjiOrderNum") REFERENCES "JMdict_KanjiElement"("entryId", "orderNum"),
+    PRIMARY KEY ("entryId", "senseId", "kanjiOrderNum")
 ) WITHOUT ROWID;
 
 CREATE TABLE "JMdict_SenseRestrictedToReading" (
     "entryId" INTEGER NOT NULL,
     "senseId" INTEGER NOT NULL REFERENCES "JMdict_Sense"("senseId"),
-    "reading" TEXT NOT NULL,
-    FOREIGN KEY ("entryId", "reading") REFERENCES "JMdict_ReadingElement"("entryId", "reading"),
-    PRIMARY KEY ("entryId", "senseId", "reading")
+    "readingOrderNum" INTEGER NOT NULL CHECK ("readingOrderNum" > 0),
+    FOREIGN KEY ("entryId", "readingOrderNum") REFERENCES "JMdict_ReadingElement"("entryId", "orderNum"),
+    PRIMARY KEY ("entryId", "senseId", "readingOrderNum")
 ) WITHOUT ROWID;
 
 -- In order to add xrefs, you will need to have added the entry to xref to.
@@ -145,32 +147,23 @@ CREATE TABLE "JMdict_SenseRestrictedToReading" (
 
 CREATE TABLE "JMdict_SenseSeeAlso" (
     "senseId" INTEGER NOT NULL REFERENCES "JMdict_Sense"("senseId"),
-    "xrefEntryId" INTEGER NOT NULL,
-    "seeAlsoReading" TEXT,
-    "seeAlsoKanji" TEXT,
-    "seeAlsoSense" INTEGER,
+    "xrefEntryId" INTEGER NOT NULL REFERENCES "JMdict_Entry"("entryId"),
+    -- Sometimes the cross reference is to a specific sense
+    "xrefSenseOrderNum" INTEGER,
     -- For some entries, the cross reference is ambiguous. This means that while the ingestion
     -- has determined some xrefEntryId, it is not guaranteed to be the correct one.
     "ambiguous" BOOLEAN NOT NULL DEFAULT FALSE,
-    FOREIGN KEY ("xrefEntryId", "seeAlsoKanji") REFERENCES "JMdict_KanjiElement"("entryId", "reading"),
-    FOREIGN KEY ("xrefEntryId", "seeAlsoReading") REFERENCES "JMdict_ReadingElement"("entryId", "reading"),
-    FOREIGN KEY ("xrefEntryId", "seeAlsoSense") REFERENCES "JMdict_Sense"("entryId", "orderNum"),
-    UNIQUE("senseId", "xrefEntryId", "seeAlsoReading", "seeAlsoKanji", "seeAlsoSense")
+    FOREIGN KEY ("xrefEntryId", "xrefSenseOrderNum") REFERENCES "JMdict_Sense"("entryId", "orderNum"),
+    UNIQUE("senseId", "xrefEntryId", "xrefSenseOrderNum")
 );
 
 CREATE TABLE "JMdict_SenseAntonym" (
     "senseId" INTEGER NOT NULL REFERENCES "JMdict_Sense"("senseId"),
-    "xrefEntryId" INTEGER NOT NULL,
-    "antonymReading" TEXT,
-    "antonymKanji" TEXT,
-    "antonymSense" INTEGER,
+    "xrefEntryId" INTEGER NOT NULL REFERENCES "JMdict_Entry"("entryId"),
     -- For some entries, the cross reference is ambiguous. This means that while the ingestion
     -- has determined some xrefEntryId, it is not guaranteed to be the correct one.
     "ambiguous" BOOLEAN NOT NULL DEFAULT FALSE,
-    FOREIGN KEY ("xrefEntryId", "antonymKanji") REFERENCES "JMdict_KanjiElement"("entryId", "reading"),
-    FOREIGN KEY ("xrefEntryId", "antonymReading") REFERENCES "JMdict_ReadingElement"("entryId", "reading"),
-    FOREIGN KEY ("xrefEntryId", "antonymSense") REFERENCES "JMdict_Sense"("entryId", "orderNum"),
-    UNIQUE("senseId", "xrefEntryId", "antonymReading", "antonymKanji", "antonymSense")
+    UNIQUE("senseId", "xrefEntryId")
 );
 
 -- These cross references are going to be mostly accessed from a sense
```
```diff
@@ -1,6 +1,7 @@
 {
   src,
   buildDartApplication,
+  sqlite,
 }:
 buildDartApplication {
   pname = "jadb-database-tool";
@@ -9,6 +10,9 @@ buildDartApplication {

   dartEntryPoints."bin/jadb" = "bin/jadb.dart";

+  # NOTE: here we are overriding the implicitly added runtimeDependency from the package fixup in pub2nix.
+  runtimeDependencies = [ sqlite ];
+
   # NOTE: the default dart hooks are using `dart compile`, which is not able to call the
   # new dart build hooks required to use package:sqlite3 >= 3.0.0. So we override
   # these phases to use `dart build` instead.
```
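Read together, the two hunks above thread a new `sqlite` input through the package and declare it as a runtime dependency, so the compiled binary can locate the system SQLite library. A sketch of the resulting call site under those assumptions; `inherit src;` and the omitted attributes are guesses at lines outside the hunks shown:

```nix
{ src, buildDartApplication, sqlite }:

buildDartApplication {
  pname = "jadb-database-tool";
  inherit src;

  dartEntryPoints."bin/jadb" = "bin/jadb.dart";

  # Override the implicitly added runtimeDependency from the pub2nix
  # package fixup so the binary is patched to find the system SQLite.
  runtimeDependencies = [ sqlite ];

  # ... build-phase overrides elided in this excerpt ...
}
```

The accompanying comments explain the second half of the change: `package:sqlite3` >= 3.0.0 relies on Dart's native build hooks, which `dart compile` cannot invoke, so the default build phases are switched to `dart build`.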
pubspec.lock (48 changed lines)

```diff
@@ -5,18 +5,18 @@ packages:
     dependency: transitive
     description:
       name: _fe_analyzer_shared
-      sha256: "3b19a47f6ea7c2632760777c78174f47f6aec1e05f0cd611380d4593b8af1dbc"
+      sha256: "8d718c5c58904f9937290fd5dbf2d6a0e02456867706bfb6cd7b81d394e738d5"
       url: "https://pub.dev"
     source: hosted
-    version: "96.0.0"
+    version: "98.0.0"
   analyzer:
     dependency: transitive
     description:
       name: analyzer
-      sha256: "0c516bc4ad36a1a75759e54d5047cb9d15cded4459df01aa35a0b5ec7db2c2a0"
+      sha256: "6141ad5d092d1e1d13929c0504658bbeccc1703505830d7c26e859908f5efc88"
       url: "https://pub.dev"
     source: hosted
-    version: "10.2.0"
+    version: "12.0.0"
   args:
     dependency: "direct main"
     description:
@@ -29,10 +29,10 @@ packages:
     dependency: transitive
     description:
       name: async
-      sha256: "758e6d74e971c3e5aceb4110bfd6698efc7f501675bcfe0c775459a8140750eb"
+      sha256: e2eb0491ba5ddb6177742d2da23904574082139b07c1e33b8503b9f46f3e1a37
       url: "https://pub.dev"
     source: hosted
-    version: "2.13.0"
+    version: "2.13.1"
   benchmark_harness:
     dependency: "direct dev"
     description:
@@ -101,10 +101,10 @@ packages:
     dependency: "direct main"
     description:
       name: csv
-      sha256: bef2950f7a753eb82f894a2eabc3072e73cf21c17096296a5a992797e50b1d0d
+      sha256: "2e0a52fb729f2faacd19c9c0c954ff450bba37aa8ab999410309e2342e7013a2"
       url: "https://pub.dev"
     source: hosted
-    version: "7.1.0"
+    version: "8.0.0"
   equatable:
     dependency: "direct main"
     description:
@@ -149,10 +149,10 @@ packages:
     dependency: transitive
     description:
       name: hooks
-      sha256: "7a08a0d684cb3b8fb604b78455d5d352f502b68079f7b80b831c62220ab0a4f6"
+      sha256: e79ed1e8e1929bc6ecb6ec85f0cb519c887aa5b423705ded0d0f2d9226def388
       url: "https://pub.dev"
     source: hosted
-    version: "1.0.1"
+    version: "1.0.2"
   http_multi_server:
     dependency: transitive
     description:
@@ -197,18 +197,18 @@ packages:
     dependency: transitive
     description:
       name: matcher
-      sha256: "12956d0ad8390bbcc63ca2e1469c0619946ccb52809807067a7020d57e647aa6"
+      sha256: dc0b7dc7651697ea4ff3e69ef44b0407ea32c487a39fff6a4004fa585e901861
       url: "https://pub.dev"
     source: hosted
-    version: "0.12.18"
+    version: "0.12.19"
   meta:
     dependency: transitive
     description:
       name: meta
-      sha256: "9f29b9bcc8ee287b1a31e0d01be0eae99a930dbffdaecf04b3f3d82a969f296f"
+      sha256: df0c643f44ad098eb37988027a8e2b2b5a031fd3977f06bbfd3a76637e8df739
       url: "https://pub.dev"
     source: hosted
-    version: "1.18.1"
+    version: "1.18.2"
   mime:
     dependency: transitive
     description:
@@ -221,10 +221,10 @@ packages:
     dependency: transitive
     description:
       name: native_toolchain_c
-      sha256: "89e83885ba09da5fdf2cdacc8002a712ca238c28b7f717910b34bcd27b0d03ac"
+      sha256: "6ba77bb18063eebe9de401f5e6437e95e1438af0a87a3a39084fbd37c90df572"
       url: "https://pub.dev"
     source: hosted
-    version: "0.17.4"
+    version: "0.17.6"
   node_preamble:
     dependency: transitive
     description:
@@ -349,10 +349,10 @@ packages:
     dependency: "direct main"
     description:
       name: sqlite3
-      sha256: b7cf6b37667f6a921281797d2499ffc60fb878b161058d422064f0ddc78f6aa6
+      sha256: caa693ad15a587a2b4fde093b728131a1827903872171089dedb16f7665d3a91
       url: "https://pub.dev"
     source: hosted
-    version: "3.1.6"
+    version: "3.2.0"
   stack_trace:
     dependency: transitive
     description:
@@ -397,26 +397,26 @@ packages:
     dependency: "direct dev"
     description:
       name: test
-      sha256: "54c516bbb7cee2754d327ad4fca637f78abfc3cbcc5ace83b3eda117e42cd71a"
+      sha256: "8d9ceddbab833f180fbefed08afa76d7c03513dfdba87ffcec2718b02bbcbf20"
       url: "https://pub.dev"
     source: hosted
-    version: "1.29.0"
+    version: "1.31.0"
   test_api:
     dependency: transitive
     description:
       name: test_api
-      sha256: "93167629bfc610f71560ab9312acdda4959de4df6fac7492c89ff0d3886f6636"
+      sha256: "949a932224383300f01be9221c39180316445ecb8e7547f70a41a35bf421fb9e"
       url: "https://pub.dev"
     source: hosted
-    version: "0.7.9"
+    version: "0.7.11"
   test_core:
     dependency: transitive
     description:
       name: test_core
-      sha256: "394f07d21f0f2255ec9e3989f21e54d3c7dc0e6e9dbce160e5a9c1a6be0e2943"
+      sha256: "1991d4cfe85d5043241acac92962c3977c8d2f2add1ee73130c7b286417d1d34"
       url: "https://pub.dev"
     source: hosted
-    version: "0.6.15"
+    version: "0.6.17"
   typed_data:
     dependency: transitive
     description:
```
```diff
@@ -9,7 +9,7 @@ environment:
 dependencies:
   args: ^2.7.0
   collection: ^1.19.0
-  csv: ^7.1.0
+  csv: ^8.0.0
   equatable: ^2.0.0
   path: ^1.9.1
   sqflite_common: ^2.5.0
```