Compare commits

No commits in common. "main" and "v0.3.2" have entirely different histories.
main ... v0.3.2

7 changed files with 168 additions and 429 deletions

@@ -1,5 +0,0 @@
target/
**/*.rs.bk
node_modules/
Dockerfile
docker-compose.yml

.gitignore

@@ -18,5 +18,4 @@ playwright/.cache/
.idea/
# Ignore database file
compareware.db
.qodo
compareware.db

@@ -1,7 +1,6 @@
# [CompareWare](https://compareware.org/)
# CompareWare
CompareWare is an open-source platform for comparing tools (software, hardware, etc.) with structured, crowdsourced data.
It combines Rust's **Leptos** for a modern, reactive frontend and **rusqlite** for data storage.
CompareWare is an open-source platform for comparing tools (software, hardware, etc.) with structured, crowdsourced data. It combines **Leptos** for a modern, reactive frontend and **Nostr** for decentralized data storage.
## **Features**
- **Item Management**: Add, view, and manage items with metadata and key-value tags.
@@ -26,23 +25,23 @@ It combines Rust's **Leptos** for a modern, reactive frontend and **rusqlite** f
```bash
cargo leptos serve
```
3. Open your browser at [localhost:3004](http://localhost:3004)
3. Open your browser at [http://localhost:3000](http://localhost:3000)
## **Database Schema**
### Key Concepts
- **PK (Primary Key)**: Unique identifier for table records (🔑)
- **FK (Foreign Key)**: Reference linking related tables (➡️)
- **Core (core properties)**: name and description.
### Tables Overview
### **Tables Overview**
### Core Tables
| Table | Columns (PK/FK) | Description | Example Data |
|-------|------------------|-------------|--------------|
| **urls** | `id` (PK), `url`, `created_at` | Stores comparison URLs | `1, "/laptops", 2024-03-01` |
| **items** | `id` (PK), `url_id` (FK), `wikidata_id` | Comparison items | `"item1", 1, "Q214276"` |
| **properties** | `id` (PK), `name` | All available properties (including core) | `1.0, "name"`<br>`2.0, "description"`<br>`3.0, "screen_size"` |
| **item_properties** | `item_id` (PK/FK), `property_id` (PK/FK), `value` | All property values including name/description | `"item1", 1.0, "MacBook Pro"`<br>`"item1", 2.0, "16-inch laptop"`<br>`"item1", 3.0, "16 inches"` |
| **selected_properties** | `url_id` (PK/FK), `property_id` (PK/FK) | Active properties per URL (excludes core) | `1, 3.0` |
| **items** | `id` (PK), `url_id` (FK), `name`, `description`, `wikidata_id` | Comparison items | `"item1", 1, "MacBook Pro", "16-inch", "Q214276"` |
| **properties** | `id` (PK), `name`, `global_usage_count` | Available properties | `25, "screen_size", 150` |
| **item_properties** | `item_id` (PK/FK), `property_id` (PK/FK), `value` | Item-specific values | `"item1", 25, "16 inches"` |
| **selected_properties** | `url_id` (PK/FK), `property_id` (PK/FK) | Active properties per URL | `1, 25` |
### Data Flow
```mermaid
@@ -60,45 +59,6 @@ flowchart LR
properties -->|property_id| item_properties
```
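For illustration only (this example is not part of either version of the README), here is a minimal rusqlite sketch of how the junction table ties items to properties, following the v0.3.2 side of the tables above; the in-memory connection and the literal values are assumptions taken from the example data column:
```rust
// Minimal sketch of the schema above (v0.3.2 side); values are illustrative.
use rusqlite::{params, Connection, Result};

fn main() -> Result<()> {
    let conn = Connection::open_in_memory()?;
    conn.execute_batch(
        "CREATE TABLE urls (id INTEGER PRIMARY KEY, url TEXT NOT NULL);
         CREATE TABLE items (
             id TEXT PRIMARY KEY, url_id INTEGER NOT NULL,
             name TEXT NOT NULL, description TEXT, wikidata_id TEXT,
             FOREIGN KEY (url_id) REFERENCES urls(id));
         CREATE TABLE properties (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
         CREATE TABLE item_properties (
             item_id TEXT NOT NULL, property_id INTEGER NOT NULL, value TEXT NOT NULL,
             PRIMARY KEY (item_id, property_id),
             FOREIGN KEY (item_id) REFERENCES items(id),
             FOREIGN KEY (property_id) REFERENCES properties(id));",
    )?;
    conn.execute("INSERT INTO urls (id, url) VALUES (1, '/laptops')", [])?;
    conn.execute(
        "INSERT INTO items (id, url_id, name, description, wikidata_id)
         VALUES ('item1', 1, 'MacBook Pro', '16-inch', 'Q214276')",
        [],
    )?;
    conn.execute("INSERT INTO properties (id, name) VALUES (25, 'screen_size')", [])?;
    conn.execute(
        "INSERT INTO item_properties (item_id, property_id, value)
         VALUES ('item1', 25, '16 inches')",
        [],
    )?;

    // Join along the arrows in the flowchart: items -> item_properties -> properties.
    let mut stmt = conn.prepare(
        "SELECT i.name, p.name, ip.value
         FROM items i
         JOIN item_properties ip ON ip.item_id = i.id
         JOIN properties p ON p.id = ip.property_id
         WHERE i.url_id = ?1",
    )?;
    let rows = stmt.query_map(params![1], |row| {
        Ok((
            row.get::<_, String>(0)?,
            row.get::<_, String>(1)?,
            row.get::<_, String>(2)?,
        ))
    })?;
    for row in rows {
        let (item, prop, value) = row?;
        println!("{item}: {prop} = {value}"); // MacBook Pro: screen_size = 16 inches
    }
    Ok(())
}
```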
### Properties data flow
```mermaid
sequenceDiagram
participant User
participant App as Application
participant Wikidata
User->>App: Enters search
App->>Wikidata: fetch_wikidata_suggestions()
Wikidata-->>App: Return suggestions
App->>User: Show suggestions
User->>App: Selects item
App->>Wikidata: fetch_item_properties()
Wikidata-->>App: Return properties (IDs + values)
App->>Wikidata: fetch_property_labels()
Wikidata-->>App: Return labels
App->>App: Combine labels + properties
App->>User: Show labeled properties
```
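As a rough sketch of the fetch_item_properties() step in the diagram above (not the component's actual code, which appears later in this diff), one way to query the Wikidata SPARQL endpoint could look like the following; the use of reqwest, tokio, and serde_json is an assumption, and the query mirrors the v0.3.2 version shown further down:
```rust
// Sketch only: assumes reqwest (with its "json" feature), tokio, and serde_json.
use serde_json::Value;

#[tokio::main]
async fn main() -> Result<(), reqwest::Error> {
    let wikidata_id = "Q214276"; // example item from the schema table above
    let sparql = format!(
        "SELECT ?propLabel ?value ?valueLabel WHERE {{
           wd:{wikidata_id} ?prop ?statement.
           ?statement ?ps ?value.
           ?property wikibase:claim ?prop.
           ?property wikibase:statementProperty ?ps.
           SERVICE wikibase:label {{ bd:serviceParam wikibase:language \"en\". }}
         }}"
    );

    let data: Value = reqwest::Client::new()
        .get("https://query.wikidata.org/sparql")
        .query(&[("query", sparql.as_str()), ("format", "json")])
        .header("User-Agent", "compareware-example/0.1") // WDQS expects a descriptive UA
        .send()
        .await?
        .json()
        .await?;

    // Collect propLabel -> valueLabel pairs, as the sequence diagram describes.
    if let Some(bindings) = data["results"]["bindings"].as_array() {
        for b in bindings {
            let prop = b["propLabel"]["value"].as_str().unwrap_or_default();
            let value = b["valueLabel"]["value"].as_str().unwrap_or_default();
            println!("{prop} = {value}");
        }
    }
    Ok(())
}
```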
## **Docker Deployment**
### **Prerequisites**
- Docker installed on your system
- Docker Compose (usually included with Docker Desktop)
### **Running with Docker**
1. Clone the repository:
```bash
git clone https://forge.ftt.gmbh/ryanmwangi/Compware.git
cd Compware
```
2. Start the container:
```bash
docker-compose up -d
```
3. Access the application at: [http://localhost:3004](http://localhost:3004)
### **Collaboration**
We welcome contributions! Here's how you can help:

@@ -1,11 +0,0 @@
services:
app:
build: .
ports:
- "3000:3000"
volumes:
- ./compareware.db:/app/compareware.db
environment:
- RUST_LOG=info
- LEPTOS_ENV=production
restart: unless-stopped

@@ -1,56 +0,0 @@
# Build stage
FROM rust:1.83.0-slim-bullseye as builder
# Install essential build tools
RUN apt-get update && \
apt-get install -y \
libsqlite3-dev \
build-essential \
clang \
libssl-dev \
pkg-config \
curl \
cmake \
protobuf-compiler \
&& rm -rf /var/lib/apt/lists/*
# Install Rust toolchain
RUN rustup component add rust-src
# Install cargo-leptos & wasm-bindgen-cli
RUN cargo install cargo-leptos --version 0.2.24 --locked
RUN cargo install wasm-bindgen-cli --version 0.2.99 --locked
# Build application
WORKDIR /app
COPY . .
# Explicitly set WASM target
RUN rustup target add wasm32-unknown-unknown
# Build project
ENV LEPTOS_OUTPUT_NAME="compareware"
# Build with release profile
RUN cargo leptos build --release
# Runtime stage
FROM debian:bullseye-slim
# Install runtime dependencies in Debian
RUN apt-get update && \
apt-get install -y \
libssl-dev \
libsqlite3-0 \
ca-certificates \
&& rm -rf /var/lib/apt/lists/*
# Copy build artifacts
COPY --from=builder /app/target/release/compareware /app/
COPY --from=builder /app/target/site /app/site
COPY assets /app/assets
# Configure container, expose port and set entrypoint
WORKDIR /app
EXPOSE 3000
ENV LEPTOS_SITE_ADDR=0.0.0.0:3000
ENV LEPTOS_SITE_ROOT="site"
CMD ["./compareware"]

@@ -141,8 +141,6 @@ pub fn ItemsList(
// Signal to store the fetched property labels
let (property_labels, set_property_labels) = create_signal(HashMap::<String, String>::new());
// State to manage property cache
let (property_cache, set_property_cache) = create_signal(HashMap::<String, HashMap<String, String>>::new());
#[cfg(feature = "ssr")]
fn get_current_url() -> String {
use leptos::use_context;
@@ -383,31 +381,15 @@ pub fn ItemsList(
};
//function to fetch properties
async fn fetch_item_properties(
wikidata_id: &str,
set_property_labels: WriteSignal<HashMap<String, String>>,
property_cache: ReadSignal<HashMap<String, HashMap<String, String>>>,
set_property_cache: WriteSignal<HashMap<String, HashMap<String, String>>>,
property_labels: ReadSignal<HashMap<String, String>>,
) -> HashMap<String, String> {
// Check cache first
if let Some(cached) = property_cache.get().get(wikidata_id) {
return cached.clone();
}
async fn fetch_item_properties(wikidata_id: &str) -> HashMap<String, String> {
let sparql_query = format!(
r#"
SELECT ?prop ?propLabel ?value ?valueLabel WHERE {{
SELECT ?propLabel ?value ?valueLabel WHERE {{
wd:{} ?prop ?statement.
?statement ?ps ?value.
?property wikibase:claim ?prop.
?property wikibase:statementProperty ?ps.
SERVICE wikibase:label {{
bd:serviceParam wikibase:language "en".
?prop rdfs:label ?propLabel.
?value rdfs:label ?valueLabel.
}}
SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
}}
"#,
wikidata_id
@@ -426,66 +408,17 @@ pub fn ItemsList(
Ok(response) => {
if let Ok(data) = response.json::<serde_json::Value>().await {
let mut result = HashMap::new();
let mut prop_ids = Vec::new();
// First pass: collect unique property IDs
if let Some(bindings) = data["results"]["bindings"].as_array() {
for binding in bindings {
if let Some(prop) = binding["propLabel"]["value"].as_str() {
let prop_id = prop.replace("http://www.wikidata.org/prop/", "");
if !prop_ids.contains(&prop_id) {
prop_ids.push(prop_id.clone());
}
}
let prop_label = binding["propLabel"]["value"].as_str().unwrap_or("").to_string();
let prop_label = prop_label.replace("http://www.wikidata.org/prop/", "");
let value_label = binding["valueLabel"]["value"].as_str().unwrap_or("").to_string();
result.insert(prop_label, value_label);
log!("result: {:?}", result);
}
}
// Batch fetch missing labels
let existing_labels = property_labels.get();
let missing_ids: Vec<String> = prop_ids
.iter()
.filter(|id| !existing_labels.contains_key(*id))
.cloned()
.collect();
if !missing_ids.is_empty() {
let new_labels = fetch_property_labels(missing_ids).await;
set_property_labels.update(|labels| {
labels.extend(new_labels.clone());
});
}
// Second pass: build results
if let Some(bindings) = data["results"]["bindings"].as_array() {
for binding in bindings {
let prop_label = binding["propLabel"]["value"].as_str().unwrap_or_default();
let value = binding["valueLabel"]["value"]
.as_str()
.or_else(|| binding["value"]["value"].as_str())
.unwrap_or_default();
if let Some(prop_uri) = binding["prop"]["value"].as_str() {
let prop_id = prop_uri.split('/').last().unwrap_or_default().to_string();
result.insert(
prop_id.clone(),
value.to_string()
);
// Update labels if missing
set_property_labels.update(|labels| {
labels.entry(prop_id.clone())
.or_insert(prop_label.to_string());
});
}
}
}
// Update cache
set_property_cache.update(|cache| {
cache.insert(wikidata_id.to_string(), result.clone());
});
result
} else {
HashMap::new()
}
@@ -582,28 +515,11 @@ pub fn ItemsList(
let add_property = {
let current_url = Rc::clone(&current_url);
let set_items = set_items.clone();
let set_property_labels = set_property_labels.clone();
let property_cache = property_cache.clone();
let set_property_cache = set_property_cache.clone();
Arc::new(move |property: String| {
// Normalize the property ID
let normalized_property = property.replace("http://www.wikidata.org/prop/", "");
let normalized_property_clone = normalized_property.clone();
// Check if label already exists
if !property_labels.get().contains_key(&normalized_property) {
spawn_local({
let normalized_property = normalized_property.clone();
let set_property_labels = set_property_labels.clone();
async move {
let labels = fetch_property_labels(vec![normalized_property.clone()]).await;
set_property_labels.update(|map| {
map.extend(labels);
});
}
});
}
// Check if property is already selected
if !selected_properties.get().contains_key(&normalized_property) && !normalized_property.is_empty() {
// Add property to selected properties
@@ -665,45 +581,43 @@ pub fn ItemsList(
}
});
// Use the property label from the property_labels signal
let property_label = property_labels.get().get(&normalized_property).cloned().unwrap_or_else(|| normalized_property.clone());
log!("Added property with label: {}", property_label);
// Fetch the property label
let property_id = normalized_property.clone();
spawn_local(async move {
let labels = fetch_property_labels(vec![property_id.clone()]).await;
log!("Fetched labels: {:?}", labels);
set_property_labels.update(|labels_map| {
for (key, value) in labels {
log!("Inserting label: {} -> {}", key, value);
labels_map.insert(key, value);
}
});
});
}
});
// Fetch the relevant value for each item and populate the corresponding cells
set_items.update(|items| {
for item in items {
// Initialize property with empty string if it doesn't exist
item.custom_properties.entry(normalized_property.clone())
.or_insert_with(|| "".to_string());
// Only fetch properties if Wikidata ID exists
if let Some(wikidata_id) = &item.wikidata_id {
let wikidata_id = wikidata_id.clone();
let set_items = set_items.clone();
let set_fetched_properties = set_fetched_properties.clone();
let property_clone = normalized_property.clone();
let set_property_labels = set_property_labels.clone();
let property_clone = property.clone();
spawn_local(async move {
let properties = fetch_item_properties(
&wikidata_id,
set_property_labels.clone(),
property_cache.clone(),
set_property_cache.clone(),
property_labels.clone()
).await;
// Update the specific property for this item
let properties = fetch_item_properties(&wikidata_id).await;
// Update fetched properties and property labels
set_fetched_properties.update(|fp| {
fp.insert(wikidata_id.clone(), properties.clone());
});
set_property_labels.update(|pl| {
for (key, value) in properties.iter() {
pl.entry(key.clone()).or_insert_with(|| value.clone());
}
});
if let Some(value) = properties.get(&property_clone) {
set_items.update(|items| {
if let Some(item) = items.iter_mut()
.find(|i| i.wikidata_id.as_ref() == Some(&wikidata_id))
{
item.custom_properties.insert(
property_clone.clone(),
value.clone()
);
if let Some(item) = items.iter_mut().find(|item| item.wikidata_id.as_ref().unwrap() == &wikidata_id) {
item.custom_properties.insert(property_clone.clone(), value.clone());
}
});
}
@@ -732,7 +646,7 @@ pub fn ItemsList(
if let Some(wikidata_id) = &item.wikidata_id {
let wikidata_id = wikidata_id.clone();
spawn_local(async move {
let properties = fetch_item_properties(&wikidata_id, set_property_labels.clone(), property_cache.clone(), set_property_cache.clone(), property_labels.clone()).await;
let properties = fetch_item_properties(&wikidata_id).await;
log!("Fetched properties for index {}: {:?}", index, properties);
});
}
@@ -881,7 +795,7 @@ pub fn ItemsList(
// Fetch additional properties from Wikidata
let wikidata_id = id.clone();
spawn_local(async move {
let properties = fetch_item_properties(&wikidata_id, set_property_labels.clone(), property_cache.clone(), set_property_cache.clone(), property_labels.clone()).await;
let properties = fetch_item_properties(&wikidata_id).await;
// log!("Fetched properties for Wikidata ID {}: {:?}", wikidata_id, properties);
// Populate the custom properties for the new item
@@ -1009,21 +923,17 @@ pub fn ItemsList(
</table>
<div style="margin-bottom: 20px;">
<input type="text" id="new-property" placeholder="Add New Property" list="properties" on:keydown=move |event| {
if event.key() == "Enter" {
if event.key() == "Enter"{
let input_element = event.target().unwrap().dyn_into::<web_sys::HtmlInputElement>().unwrap();
let input_value = input_element.value();
// Extract property ID from "Label (P123)" format
let property_id = input_value
.split(" (")
.last()
.and_then(|s| s.strip_suffix(')'))
.unwrap_or(&input_value)
.to_string();
if !property_id.is_empty() {
// Add the property using the extracted ID
add_property(property_id);
let property = input_element.value();
if !property.is_empty() {
// Extract the coded name from the selected value
let coded_name = property.split(" - ").next().unwrap_or(&property).to_string();
// Add the property using the coded name
add_property(coded_name);
// Clear the input field
input_element.set_value("");
}
}
@@ -1031,11 +941,10 @@ pub fn ItemsList(
<datalist id="properties">
{move || {
let property_labels = property_labels.get().clone();
property_labels.into_iter().map(|(property_id, label)| {
property_labels.into_iter().map(|(property, label)| {
let property_clone = property.clone();
view! {
<option value={format!("{} ({})", label, property_id)}>
{ format!("{} ({})", label, property_id) }
</option>
<option value={property}>{ format!("{} - {}", property_clone, label) }</option>
}
}).collect::<Vec<_>>()
}}
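Side note on the two datalist formats shown in this file's diff: the main side renders options as "Label (P123)" and parses the ID back out of the parentheses, while the v0.3.2 side shows "P123 - Label" and splits on " - ". A standalone sketch of both parsers (hypothetical helper names, not code from either version):
```rust
/// main side: pull "P123" out of an option rendered as "Label (P123)".
fn id_from_label_format(input: &str) -> String {
    input
        .split(" (")
        .last()
        .and_then(|s| s.strip_suffix(')'))
        .unwrap_or(input)
        .to_string()
}

/// v0.3.2 side: take the coded name from an option rendered as "P123 - Label".
fn id_from_coded_format(input: &str) -> String {
    input.split(" - ").next().unwrap_or(input).to_string()
}

fn main() {
    assert_eq!(id_from_label_format("Label (P123)"), "P123");
    assert_eq!(id_from_coded_format("P123 - Label"), "P123");
}
```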

src/db.rs

@@ -8,7 +8,7 @@ mod db_impl {
use std::collections::{HashMap, HashSet};
use std::sync::Arc;
use tokio::sync::Mutex;
use uuid::Uuid;
#[cfg(test)]
mod tests {
use super::*;
@@ -258,35 +258,18 @@ mod db_impl {
"CREATE TABLE IF NOT EXISTS items (
id TEXT PRIMARY KEY,
url_id INTEGER NOT NULL,
name TEXT NOT NULL,
description TEXT,
wikidata_id TEXT,
item_order INTEGER NOT NULL DEFAULT 0,
FOREIGN KEY (url_id) REFERENCES urls(id) ON DELETE CASCADE
);
INSERT OR IGNORE INTO properties (name) VALUES
('name'),
('description');",
);",
)
.map_err(|e| {
eprintln!("Failed creating items table: {}", e);
e
})?;
// Check if the global_item_id column exists
let mut stmt = conn.prepare("PRAGMA table_info(items);")?;
let columns: Vec<String> = stmt
.query_map([], |row| row.get(1))? // Column 1 contains the column names
.collect::<Result<_, _>>()?;
if !columns.contains(&"global_item_id".to_string()) {
conn.execute_batch(
"ALTER TABLE items ADD COLUMN global_item_id TEXT;"
)
.map_err(|e| {
eprintln!("Failed adding global_item_id to items table: {}", e);
e
})?;
}
// 4. Table for selected properties
conn.execute_batch(
"CREATE TABLE IF NOT EXISTS selected_properties (
@@ -305,11 +288,11 @@ mod db_impl {
// 5. Junction table for custom properties
conn.execute_batch(
"CREATE TABLE IF NOT EXISTS item_properties (
global_item_id TEXT NOT NULL,
item_id TEXT NOT NULL,
property_id INTEGER NOT NULL,
value TEXT NOT NULL,
PRIMARY KEY (global_item_id, property_id),
FOREIGN KEY (global_item_id) REFERENCES items(global_item_id) ON DELETE CASCADE,
PRIMARY KEY (item_id, property_id),
FOREIGN KEY (item_id) REFERENCES items(id) ON DELETE CASCADE,
FOREIGN KEY (property_id) REFERENCES properties(id) ON DELETE CASCADE
);",
)
@@ -317,23 +300,6 @@ mod db_impl {
eprintln!("Failed creating item_properties table: {}", e);
e
})?;
// 6. Junction table for deleted properties
conn.execute_batch(
"CREATE TABLE IF NOT EXISTS deleted_properties (
url_id INTEGER NOT NULL,
global_item_id TEXT NOT NULL,
property_id INTEGER NOT NULL,
PRIMARY KEY (url_id, global_item_id, property_id),
FOREIGN KEY (url_id) REFERENCES urls(id) ON DELETE CASCADE,
FOREIGN KEY (global_item_id) REFERENCES items(global_item_id) ON DELETE CASCADE,
FOREIGN KEY (property_id) REFERENCES properties(id) ON DELETE CASCADE
);",
).map_err(|e| {
eprintln!("Failed creating item_properties table: {}", e);
e
})?;
Ok(())
}
@@ -415,57 +381,62 @@ mod db_impl {
"WITH ordered_items AS (
SELECT
i.id,
i.name,
i.description,
i.wikidata_id,
i.item_order,
i.global_item_id
i.item_order
FROM items i
WHERE i.url_id = ?
ORDER BY i.item_order ASC
)
SELECT
SELECT
oi.id,
oi.name,
oi.description,
oi.wikidata_id,
name_ip.value AS name,
desc_ip.value AS description,
json_group_object(p.name, ip.value) as custom_properties
p.name AS prop_name,
ip.value
FROM ordered_items oi
LEFT JOIN item_properties ip
ON oi.global_item_id = ip.global_item_id
AND ip.property_id NOT IN (
SELECT property_id
FROM deleted_properties
WHERE url_id = ? AND global_item_id = oi.global_item_id
)
LEFT JOIN properties p
ON ip.property_id = p.id
LEFT JOIN item_properties name_ip
ON oi.global_item_id = name_ip.global_item_id
AND name_ip.property_id = (SELECT id FROM properties WHERE name = 'name')
LEFT JOIN item_properties desc_ip
ON oi.global_item_id = desc_ip.global_item_id
AND desc_ip.property_id = (SELECT id FROM properties WHERE name = 'description')
GROUP BY oi.id
LEFT JOIN item_properties ip ON oi.id = ip.item_id
LEFT JOIN properties p ON ip.property_id = p.id
ORDER BY oi.item_order ASC"
)?;
// Change from HashMap to Vec to preserve order
let rows = stmt.query_map([url_id, url_id], |row| {
let custom_props_json: String = row.get(4)?;
let custom_properties: HashMap<String, String> = serde_json::from_str(&custom_props_json)
.unwrap_or_default();
Ok(Item {
id: row.get(0)?,
name: row.get::<_, Option<String>>(2)?.unwrap_or_default(), // Handle NULL values for name
description: row.get::<_, Option<String>>(3)?.unwrap_or_default(), // Handle NULL values for description
wikidata_id: row.get(1)?,
custom_properties,
})
let mut items: Vec<Item> = Vec::new();
let mut current_id: Option<String> = None;
let rows = stmt.query_map([url_id], |row| {
Ok((
row.get::<_, String>(0)?, // id
row.get::<_, String>(1)?, // name
row.get::<_, String>(2)?, // description
row.get::<_, Option<String>>(3)?, // wikidata_id
row.get::<_, Option<String>>(4)?, // prop_name
row.get::<_, Option<String>>(5)?, // value
))
})?;
let mut items = Vec::new();
for row in rows {
items.push(row?);
let (id, name, desc, wd_id, prop, val) = row?;
if current_id.as_ref() != Some(&id) {
// New item - push to vector
items.push(Item {
id: id.clone(),
name,
description: desc,
wikidata_id: wd_id,
custom_properties: HashMap::new(),
});
current_id = Some(id);
}
if let (Some(p), Some(v)) = (prop, val) {
if let Some(last_item) = items.last_mut() {
last_item.custom_properties.insert(p, v);
}
}
}
Ok(items)
@@ -523,96 +494,85 @@ mod db_impl {
Err(e) => return Err(e.into()),
};
// 4. Item insertion
let max_order: i32 = tx.query_row(
"SELECT COALESCE(MAX(item_order), 0) FROM items WHERE url_id = ?",
[url_id],
|row| row.get(0),
)?;
let global_item_id = match tx.query_row(
"SELECT ip.global_item_id
FROM item_properties ip
JOIN properties p ON ip.property_id = p.id
WHERE p.name = 'name' AND ip.value = ? LIMIT 1",
[&item.name],
|row| row.get::<_, String>(0),
) {
Ok(id) => id, // Reuse existing global_item_id
Err(rusqlite::Error::QueryReturnedNoRows) => {
let new_id = Uuid::new_v4().to_string(); // Generate a new global_item_id
new_id
}
Err(e) => return Err(e.into()),
};
// 4. Item insertion
log!("[DB] Upserting item");
tx.execute(
"INSERT INTO items (id, url_id, wikidata_id, item_order, global_item_id)
VALUES (?, ?, ?, ?, ?)
"INSERT INTO items (id, url_id, name, description, wikidata_id, item_order)
VALUES (?, ?, ?, ?, ?, ?)
ON CONFLICT(id) DO UPDATE SET
url_id = excluded.url_id,
name = excluded.name,
description = excluded.description,
wikidata_id = excluded.wikidata_id,
global_item_id = excluded.global_item_id",
item_order = excluded.item_order",
rusqlite::params![
&item.id,
url_id,
&item.name,
&item.description,
&item.wikidata_id,
max_order + 1,
&global_item_id
max_order + 1
],
)?;
log!("[DB] Item upserted successfully");
// property handling
let core_properties = vec![
("name", &item.name),
("description", &item.description)
];
for (prop, value) in core_properties.into_iter().chain(
item.custom_properties.iter().map(|(k, v)| (k.as_str(), v))
) {
let prop_id = self.get_or_create_property(&mut tx, prop).await?;
tx.execute(
"INSERT INTO item_properties (global_item_id, property_id, value)
VALUES (?, ?, ?)
ON CONFLICT(global_item_id, property_id) DO UPDATE SET
value = excluded.value",
rusqlite::params![&global_item_id, prop_id, value],
)?;
}
// Property synchronization
// Property handling with enhanced logging
log!("[DB] Synchronizing properties for item {}", item.id);
let existing_props = {
// Prepare statement and collect existing properties
let mut stmt = tx.prepare(
"SELECT p.name, ip.value
FROM item_properties ip
JOIN properties p ON ip.property_id = p.id
WHERE ip.global_item_id = ?",
WHERE ip.item_id = ?",
)?;
let mapped_rows = stmt.query_map([&item.id], |row| {
Ok((row.get::<_, String>(0)?, row.get::<_, String>(1)?))
})?;
mapped_rows.collect::<Result<HashMap<String, String>, _>>()?
};
// Include core properties in current_props check
let mut current_props: HashSet<&str> = item.custom_properties.keys()
.map(|s| s.as_str())
.collect();
current_props.insert("name");
current_props.insert("description");
// Cleanup with core property protection
for (prop, value) in &item.custom_properties {
// Update existing or insert new
let prop_id = self.get_or_create_property(&mut tx, prop).await?;
if let Some(existing_value) = existing_props.get(prop) {
if existing_value != value {
log!(
"[DB] Updating property {} from '{}' to '{}'",
prop,
existing_value,
value
);
tx.execute(
"UPDATE item_properties
SET value = ?
WHERE item_id = ?
AND property_id = (SELECT id FROM properties WHERE name = ?)",
rusqlite::params![value, &item.id, prop],
)?;
}
} else {
log!("[DB] Adding new property {}", prop);
tx.execute(
"INSERT INTO item_properties (item_id, property_id, value)
VALUES (?, ?, ?)",
rusqlite::params![&item.id, prop_id, value],
)?;
}
}
// Remove deleted properties
let current_props: HashSet<&str> =
item.custom_properties.keys().map(|s| s.as_str()).collect();
for (existing_prop, _) in existing_props {
if !current_props.contains(existing_prop.as_str())
&& !["name", "description"].contains(&existing_prop.as_str())
{
if !current_props.contains(existing_prop.as_str()) {
log!("[DB] Removing deleted property {}", existing_prop);
tx.execute(
"DELETE FROM item_properties
@@ -641,11 +601,6 @@ mod db_impl {
"DELETE FROM items WHERE id = ? AND url_id = ?",
[item_id, &url_id.to_string()],
)?;
tx.execute(
"DELETE FROM item_properties WHERE global_item_id = ?",
[item_id],
)?;
tx.commit()?;
Ok(())
@@ -655,35 +610,23 @@ mod db_impl {
pub async fn delete_property_by_url(&self, url: &str, property: &str) -> Result<(), Error> {
let mut conn = self.conn.lock().await;
let tx = conn.transaction()?;
// Get URL ID
let url_id: i64 =
tx.query_row("SELECT id FROM urls WHERE url = ?", [url], |row| row.get(0))?;
// Get property ID
let property_id: i64 = tx.query_row(
"SELECT id FROM properties WHERE name = ?",
[property],
|row| row.get(0),
// Delete property from all items in this URL
tx.execute(
"DELETE FROM item_properties
WHERE property_id IN (
SELECT id FROM properties WHERE name = ?
)
AND item_id IN (
SELECT id FROM items WHERE url_id = ?
)",
[property, &url_id.to_string()],
)?;
// Get all global_item_ids for this URL
{
let mut stmt = tx.prepare("SELECT global_item_id FROM items WHERE url_id = ?")?;
let global_item_ids: Vec<String> = stmt
.query_map([url_id], |row| row.get(0))?
.collect::<Result<_, _>>()?;
// Insert into deleted_properties for each global_item_id
for global_item_id in global_item_ids {
tx.execute(
"INSERT OR IGNORE INTO deleted_properties (url_id, global_item_id, property_id)
VALUES (?, ?, ?)",
rusqlite::params![url_id, global_item_id, property_id],
)?;
}
}
tx.commit()?;
Ok(())
}