Compare commits

...

49 Commits

Author SHA1 Message Date
6b6e0d4dec chore: try ffmpeg master branch
Some checks failed
continuous-integration/drone/push Build is failing
2025-06-11 15:14:53 +01:00
2576f75bc4 chore: cache docker build 2025-06-11 14:48:46 +01:00
ec271f4109 fix: install protobuf-compiler
All checks were successful
continuous-integration/drone/push Build is passing
2025-06-11 14:45:10 +01:00
21cc1ed714 fix: storage calculation
Some checks failed
continuous-integration/drone/push Build is failing
refactor: improve UI
2025-06-11 14:42:03 +01:00
dd6b35380b feat: new UI
chore: update readme
fix: upload
2025-06-11 12:52:04 +01:00
c4a519afb4 Add UI updates: admin reports button, quota display, and payment flow (#23)
Some checks failed
continuous-integration/drone/push Build is failing
* Initial plan for issue

* Implement UI updates: admin reports button, quota display, and payment flow

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

* Final implementation complete - all UI updates successfully implemented

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>
2025-06-11 10:19:37 +01:00
fe263e9a46 refactor: cleanup AI slop 2025-06-11 10:03:03 +01:00
d3711ff52c Implement Admin Reporting UI with backend and frontend support (#21)
Some checks failed
continuous-integration/drone/push Build is failing
* Initial plan for issue

* Implement Admin Reporting UI with backend and frontend

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

* Implement reviewed flag for reports instead of deletion and revert upload.tsx changes

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

* Remove legacy files migration logic from upload UI

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

* Delete ui_src/package-lock.json

* Delete ui_src/yarn.lock

* Restore yarn.lock file to original state

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>
Co-authored-by: Kieran <kieran@harkin.me>
2025-06-10 16:34:09 +01:00
fc080b5cd0 [WIP] Reporting (#19)
* Initial plan for issue

* Add database migration and report structures

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>
2025-06-10 15:28:44 +01:00
0554f1220f feat: return nip94 as tag array in blossom (#9)
Some checks failed
continuous-integration/drone Build is failing
2025-06-10 11:39:41 +01:00
ca2d23508b Implement payments system with default free allowance and quota enforcement (#17)
* Initial plan for issue

* Implement complete payments system with quota enforcement

- Add default free allowance configuration (100MB)
- Implement quota checking before uploads in both blossom and nip96 routes
- Add comprehensive quota checking functions in database module
- Enhance admin API to show quota information
- Add payment processing infrastructure
- Include all necessary database migrations

Users now get 100MB free storage + any valid paid storage.
Uploads are rejected when quota would be exceeded.

* Move free_quota_bytes to PaymentConfig and restore mime_type parameter

Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: v0l <1172179+v0l@users.noreply.github.com>
2025-06-10 11:37:17 +01:00
71cb34eaee feat: include codecs tag in mime 2025-05-31 23:00:33 +01:00
e3e2986294 chore: add link to profile 2025-05-27 16:15:10 +01:00
afb33085b7 feat: nip98 timestamp window
closes https://github.com/v0l/route96/issues/12
2025-03-31 11:49:25 +01:00
f569b94d19 fix: remove void_cat references from grafana dashboard 2025-03-31 10:23:17 +01:00
915624b2d7 fix: build variants 2025-03-31 10:17:55 +01:00
6ba1f2ae9c chore: bump ffmpeg-rs-raw 2025-03-31 10:00:52 +01:00
f18a14e980 chore: cargo update
chore: switch to env_logger
closes https://github.com/v0l/route96/issues/11
2025-03-31 09:51:36 +01:00
2c42e19f42 chore: remove gitea docker image 2025-03-19 16:21:08 +00:00
b9d920ad49 chore: remove build trigger 2025-03-19 16:20:37 +00:00
7eb6b7221c fix: range response missing 1 byte 2025-02-27 17:11:46 +00:00
8dc2544b15 fix: invalid range to last byte 2025-02-27 16:21:34 +00:00
57050567c4 fix: invalid range for size 2025-02-27 16:16:37 +00:00
1a5388fc1c fix: disable push to git.v0l.io docker 2025-02-27 16:07:53 +00:00
6ccdb0fdc3 fix: range header response 2025-02-27 16:01:03 +00:00
6998f0ffac feat: r96util probe media before importing 2025-02-12 14:52:06 +00:00
317b0708e0 feat: r96util parallel 2025-02-12 14:24:13 +00:00
069aa30d52 feat: r96util progress 2025-02-12 13:50:59 +00:00
4dad339c09 feat: r96util import 2025-02-12 12:11:58 +00:00
f5b206dad3 fix: walkdir 2025-02-12 11:23:31 +00:00
b6bd190252 feat: r96util 2025-02-10 20:48:40 +00:00
3b4bb866ab fix: uploaded timestamp blossom 2025-02-07 14:36:01 +00:00
c885a71295 feat: return thumbnail url in meta 2025-02-07 09:50:46 +00:00
e1fca9a604 feat: filter list by mime 2025-02-06 22:33:26 +00:00
16a14de5d6 fix: dont patch video metadata for image files (always empty) 2025-02-04 13:30:22 +00:00
314d0c68af feat: accept void-cat uuid 2025-02-04 13:09:02 +00:00
5530f39779 fix: head void_cat_redirect 2025-01-30 22:25:56 +00:00
4f40efa99c fix: negative duration 2025-01-27 22:57:16 +00:00
ceca1904d7 fix: bump ffmpeg-rs-raw 2025-01-27 22:51:22 +00:00
2172c8557a fix: bump ffmpeg-rs-raw 2025-01-27 22:27:32 +00:00
f3989ba244 fix: log probe error 2025-01-27 22:11:58 +00:00
9f78c1a54f feat: backfill media metadata 2025-01-27 21:48:57 +00:00
201a3aaa49 fix: method tag for media upload 2025-01-27 21:22:06 +00:00
3ba5e7bc4c feat: video duration / bitrate 2025-01-27 21:19:11 +00:00
5fbe40faae feat: improve file list 2025-01-27 11:15:26 +00:00
0d8686a850 fix: thumbnail gen single frame 2025-01-27 10:14:24 +00:00
71f6f47a00 fix: release db connection 2025-01-25 23:52:42 +00:00
0bd531a21d feat: ui render image thumbs 2025-01-25 23:28:01 +00:00
6763e53d41 feat: generate thumbnails 2025-01-25 23:22:39 +00:00
48 changed files with 179073 additions and 1457 deletions


@ -5,25 +5,26 @@ metadata:
namespace: git
concurrency:
limit: 1
trigger:
branch:
- main
event:
- push
volumes:
- name: cache
claim:
name: storage2
steps:
- name: build
image: docker
privileged: true
volumes:
- name: cache
path: /cache
environment:
TOKEN:
from_secret: gitea
TOKEN_DOCKER:
from_secret: docker_hub
commands:
- dockerd &
- docker login -u kieran -p $TOKEN git.v0l.io
- dockerd --data-root /cache/dockerd &
- docker login -u voidic -p $TOKEN_DOCKER
- docker buildx build --push -t git.v0l.io/kieran/route96:latest -t voidic/route96:latest .
- docker buildx build --push -t voidic/route96:latest .
- kill $(cat /var/run/docker.pid)
---
kind: pipeline
@ -36,18 +37,24 @@ trigger:
- tag
metadata:
namespace: git
volumes:
- name: cache
claim:
name: storage2
steps:
- name: build
image: docker
privileged: true
volumes:
- name: cache
path: /cache
environment:
TOKEN:
from_secret: gitea
TOKEN_DOCKER:
from_secret: docker_hub
commands:
- dockerd &
- docker login -u kieran -p $TOKEN git.v0l.io
- dockerd --data-root /cache/dockerd &
- docker login -u voidic -p $TOKEN_DOCKER
- docker buildx build --push -t git.v0l.io/kieran/route96:$DRONE_TAG -t voidic/route96:$DRONE_TAG .
- docker buildx build --push -t voidic/route96:$DRONE_TAG .
- kill $(cat /var/run/docker.pid)

Cargo.lock (generated; 1533 changed lines)

File diff suppressed because it is too large.


@ -3,14 +3,6 @@ name = "route96"
version = "0.4.0"
edition = "2021"
[[bin]]
name = "void_cat_migrate"
required-features = ["bin-void-cat-migrate"]
[[bin]]
name = "void_cat_forced_migrate"
required-features = ["bin-void-cat-force-migrate"]
[[bin]]
name = "route96"
path = "src/bin/main.rs"
@ -19,22 +11,18 @@ path = "src/bin/main.rs"
name = "route96"
[features]
default = ["nip96", "blossom", "analytics", "ranges", "react-ui"]
default = ["nip96", "blossom", "analytics", "react-ui", "payments"]
media-compression = ["dep:ffmpeg-rs-raw", "dep:libc"]
labels = ["nip96", "dep:candle-core", "dep:candle-nn", "dep:candle-transformers"]
labels = ["media-compression", "dep:candle-core", "dep:candle-nn", "dep:candle-transformers"]
nip96 = ["media-compression"]
blossom = []
bin-void-cat-migrate = ["dep:sqlx-postgres"]
bin-void-cat-force-migrate = ["dep:regex", "dep:nostr-cursor"]
torrent-v2 = []
analytics = []
void-cat-redirects = ["dep:sqlx-postgres"]
ranges = ["dep:http-range-header"]
react-ui = []
payments = ["dep:fedimint-tonic-lnd"]
[dependencies]
log = "0.4.21"
nostr = "0.37.0"
nostr = "0.39.0"
pretty_env_logger = "0.5.0"
rocket = { version = "0.5.1", features = ["json"] }
tokio = { version = "1.37.0", features = ["rt", "rt-multi-thread", "macros"] }
@ -45,21 +33,19 @@ uuid = { version = "1.8.0", features = ["v4", "serde"] }
anyhow = "^1.0.82"
sha2 = "0.10.8"
sqlx = { version = "0.8.1", features = ["mysql", "runtime-tokio", "chrono", "uuid"] }
config = { version = "0.14.0", features = ["yaml"] }
config = { version = "0.15.7", features = ["yaml"] }
chrono = { version = "0.4.38", features = ["serde"] }
serde_with = { version = "3.8.1", features = ["hex"] }
reqwest = { version = "0.12.8", features = ["stream"] }
reqwest = { version = "0.12.8", features = ["stream", "http2"] }
clap = { version = "4.5.18", features = ["derive"] }
mime2ext = "0.1.53"
infer = "0.16.0"
infer = "0.19.0"
tokio-util = { version = "0.7.13", features = ["io", "io-util"] }
http-range-header = { version = "0.4.2" }
base58 = "0.2.0"
libc = { version = "0.2.153", optional = true }
ffmpeg-rs-raw = { git = "https://git.v0l.io/Kieran/ffmpeg-rs-raw.git", rev = "de2050cec07a095bace38d3ccf9c4c4f9b03b217", optional = true }
ffmpeg-rs-raw = { git = "https://git.v0l.io/Kieran/ffmpeg-rs-raw.git", rev = "928ab9664ff47c1b0bd8313ebc73d13b1ab43fc5", optional = true }
candle-core = { git = "https://git.v0l.io/huggingface/candle.git", tag = "0.8.1", optional = true }
candle-nn = { git = "https://git.v0l.io/huggingface/candle.git", tag = "0.8.1", optional = true }
candle-transformers = { git = "https://git.v0l.io/huggingface/candle.git", tag = "0.8.1", optional = true }
sqlx-postgres = { version = "0.8.2", optional = true, features = ["chrono", "uuid"] }
http-range-header = { version = "0.4.2", optional = true }
nostr-cursor = { git = "https://git.v0l.io/Kieran/nostr_backup_proc.git", branch = "main", optional = true }
regex = { version = "1.11.1", optional = true }
fedimint-tonic-lnd = { version = "0.2.0", optional = true, default-features = false, features = ["invoicesrpc", "lightningrpc"] }


@ -15,9 +15,10 @@ RUN apt update && \
libwebp-dev \
libvpx-dev \
nasm \
libclang-dev && \
libclang-dev \
protobuf-compiler && \
rm -rf /var/lib/apt/lists/*
RUN git clone --single-branch --branch release/7.1 https://git.v0l.io/ffmpeg/FFmpeg.git && \
RUN git clone --single-branch --branch master https://git.v0l.io/ffmpeg/FFmpeg.git && \
cd FFmpeg && \
./configure \
--prefix=${FFMPEG_DIR} \

README.md (153 changed lines)

@ -1,26 +1,143 @@
# route96
# Route96
Image hosting service
Decentralized blob storage server with Nostr integration, supporting multiple protocols and advanced media processing capabilities.
## Features
## Core Features
- [NIP-96 Support](https://github.com/nostr-protocol/nips/blob/master/96.md)
- [Blossom Support](https://github.com/hzrd149/blossom/blob/master/buds/01.md)
- [BUD-01](https://github.com/hzrd149/blossom/blob/master/buds/01.md)
- [BUD-02](https://github.com/hzrd149/blossom/blob/master/buds/02.md)
- [BUD-04](https://github.com/hzrd149/blossom/blob/master/buds/04.md)
- [BUD-05](https://github.com/hzrd149/blossom/blob/master/buds/05.md)
- [BUD-06](https://github.com/hzrd149/blossom/blob/master/buds/06.md)
- [BUD-08](https://github.com/hzrd149/blossom/blob/master/buds/08.md)
- Image compression to WebP
- Blurhash calculation
- AI image labeling ([ViT224](https://huggingface.co/google/vit-base-patch16-224))
- Plausible analytics
### Protocol Support
- **[NIP-96](https://github.com/nostr-protocol/nips/blob/master/96.md)** - Nostr file storage with media processing
- **[Blossom Protocol](https://github.com/hzrd149/blossom)** - Complete BUD specification compliance:
- [BUD-01](https://github.com/hzrd149/blossom/blob/master/buds/01.md) - Blob retrieval (GET/HEAD)
- [BUD-02](https://github.com/hzrd149/blossom/blob/master/buds/02.md) - Upload, delete, list operations
- [BUD-04](https://github.com/hzrd149/blossom/blob/master/buds/04.md) - Blob mirroring from remote servers
- [BUD-05](https://github.com/hzrd149/blossom/blob/master/buds/05.md) - Media optimization endpoints
- [BUD-06](https://github.com/hzrd149/blossom/blob/master/buds/06.md) - Upload requirement validation
- [BUD-08](https://github.com/hzrd149/blossom/blob/master/buds/08.md) - NIP-94 metadata support
- [BUD-09](https://github.com/hzrd149/blossom/blob/master/buds/09.md) - Content reporting system
## Planned
### Media Processing
- **Image & Video Compression** - Automatic WebP conversion and optimization
- **Thumbnail Generation** - Auto-generated thumbnails for images and videos
- **Blurhash Calculation** - Progressive image loading with blur previews
- **AI Content Labeling** - Automated tagging using [ViT-224](https://huggingface.co/google/vit-base-patch16-224) model
- **Media Metadata** - Automatic extraction of dimensions, duration, bitrate
- **Range Request Support** - RFC 7233 compliant partial content delivery
- Torrent seed V2
- Payment system
### Security & Administration
- **Nostr Authentication** - Cryptographic identity with kind 24242 events
- **Whitelist Support** - Restrict uploads to approved public keys
- **Quota Management** - Per-user storage limits with payment integration
- **Content Reporting** - Community-driven moderation via NIP-56 reports
- **Admin Dashboard** - Web interface for content and user management
- **CORS Support** - Full cross-origin resource sharing compliance
### Payment System
- **Lightning Network** - Bitcoin payments via LND integration
- **Fiat Tracking** - Multi-currency support (USD/EUR/GBP/JPY/etc.)
- **Flexible Billing** - Usage-based pricing (storage, egress, time-based)
- **Free Quotas** - Configurable free tier for new users
### Analytics & Monitoring
- **Plausible Integration** - Privacy-focused usage analytics
- **Comprehensive Logging** - Detailed operation tracking
- **Health Monitoring** - Service status and performance metrics
## API Endpoints
### Blossom Protocol
- `GET /<sha256>` - Retrieve blob by hash
- `HEAD /<sha256>` - Check blob existence
- `PUT /upload` - Upload new blob
- `DELETE /<sha256>` - Delete owned blob
- `GET /list/<pubkey>` - List user's blobs
- `PUT /mirror` - Mirror blob from remote URL
- `PUT /media` - Upload with media optimization
- `HEAD /upload` - Validate upload requirements
- `PUT /report` - Submit content reports
### NIP-96 Protocol
- `GET /.well-known/nostr/nip96.json` - Server information
- `POST /nip96` - File upload with Nostr auth
- `DELETE /nip96/<sha256>` - Delete with Nostr auth
### Admin Interface
- `GET /admin/*` - Web dashboard for content management
- Admin API endpoints for reports and user management
## Configuration
Route96 uses YAML configuration. See [config.yaml](config.yaml) for a complete example:
```yaml
listen: "127.0.0.1:8000"
database: "mysql://user:pass@localhost:3306/route96"
storage_dir: "./data"
max_upload_bytes: 104857600 # 100MB
public_url: "https://your-domain.com"
# Optional: Restrict to specific pubkeys
whitelist: ["pubkey1", "pubkey2"]
# Optional: Payment system
payments:
free_quota_bytes: 104857600
cost:
currency: "BTC"
amount: 0.00000100
unit: "GBSpace"
interval:
month: 1
```
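The quota model described in this README (a free allowance plus any valid paid storage) can be sketched in plain Rust. This is an illustrative model only: `UserQuota`, its fields, and `can_upload` are assumed names, not types from the route96 codebase.

```rust
// Illustrative sketch only: names are assumptions, not route96's actual API.
struct UserQuota {
    free_quota_bytes: u64, // payments.free_quota_bytes from config
    paid_bytes: u64,       // storage covered by valid payments
    used_bytes: u64,       // total size of the user's existing uploads
}

impl UserQuota {
    /// An upload is accepted only if it still fits inside free + paid quota.
    fn can_upload(&self, upload_bytes: u64) -> bool {
        self.used_bytes + upload_bytes <= self.free_quota_bytes + self.paid_bytes
    }
}

fn main() {
    let q = UserQuota {
        free_quota_bytes: 104_857_600, // 100MB free tier
        paid_bytes: 0,
        used_bytes: 100_000_000,
    };
    assert!(q.can_upload(4_000_000));   // fits inside the free tier
    assert!(!q.can_upload(10_000_000)); // would exceed it, so rejected
}
```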
## Quick Start Examples
### Upload a file (Blossom)
```bash
# Create authorization event (kind 24242)
auth_event='{"kind":24242,"tags":[["t","upload"],["expiration","1234567890"]],"content":"Upload file"}'
auth_b64=$(echo "$auth_event" | base64 -w 0)
curl -X PUT http://localhost:8000/upload \
-H "Authorization: Nostr $auth_b64" \
-H "Content-Type: image/jpeg" \
--data-binary @image.jpg
```
### Retrieve a file
```bash
curl http://localhost:8000/abc123def456...789
```
### List user's files
```bash
curl http://localhost:8000/list/user_pubkey_hex
```
## Feature Flags
Route96 supports optional features that can be enabled at compile time:
- `nip96` (default) - NIP-96 protocol support
- `blossom` (default) - Blossom protocol support
- `media-compression` - WebP conversion and thumbnails
- `labels` - AI-powered content labeling
- `payments` (default) - Lightning payment integration
- `analytics` (default) - Plausible analytics
- `react-ui` (default) - Web dashboard interface
```bash
# Build with specific features
cargo build --features "blossom,payments,media-compression"
```
## Requirements
- **Rust** 1.70+
- **MySQL/MariaDB** - Database storage
- **FFmpeg libraries** - Media processing (optional)
- **Node.js** - UI building (optional)
See [docs/debian.md](docs/debian.md) for detailed installation instructions.
## Running


@ -13,19 +13,39 @@ max_upload_bytes: 5e+9
# Public facing url
public_url: "http://localhost:8000"
# Whitelisted pubkeys, leave out to disable
# (Optional) Whitelisted pubkeys, leave out to disable
# whitelist: ["63fe6318dc58583cfe16810f86dd09e18bfd76aabc24a0081ce2856f330504ed"]
# Path for ViT(224) image model (https://huggingface.co/google/vit-base-patch16-224)
vit_model:
model: "/home/kieran/Downloads/falcon_nsfw.safetensors"
config: "/home/kieran/Downloads/falcon_nsfw.json"
# (Optional) Path for ViT(224) image model (https://huggingface.co/google/vit-base-patch16-224)
# vit_model:
# model: "falcon_nsfw.safetensors"
# config: "falcon_nsfw.json"
# Analytics support
# (Optional) Analytics support
# plausible_url: "https://plausible.com/"
# Support legacy void
# void_cat_database: "postgres://postgres:postgres@localhost:41911/void"
# (Optional) Legacy file path for void.cat uploads
# void_cat_files: "/my/void.cat/data"
# Legacy file path for void.cat uploads
# void_cat_files: "/my/void.cat/data"
# (Optional) Payment system config
payments:
# (Optional) Free quota in bytes for users without payments (default: 100MB)
free_quota_bytes: 104857600
# (Optional) Fiat currency used to track exchange rate along with invoices
# If [cost] is using a fiat currency, exchange rates will always be stored
# in that currency, so this config is not needed
fiat: "USD"
# LND node config
lnd:
endpoint: "https://127.0.0.1:10001"
tls: "/home/kieran/.polar/networks/1/volumes/lnd/alice/tls.cert"
macaroon: "/home/kieran/.polar/networks/1/volumes/lnd/alice/data/chain/bitcoin/regtest/admin.macaroon"
# Cost per unit (BTC/USD/EUR/AUD/CAD/JPY/GBP)
cost:
currency: "BTC"
amount: 0.00000100
# Unit metric used to calculate quote (GBSpace, GBEgress)
unit: "GBSpace"
# Billing interval (day / month / year)
interval:
month: 1
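Reading the `cost` block above, a storage quote is presumably amount × units × intervals; the sketch below assumes that simple multiplication (the `quote_btc` helper is hypothetical, and the real pricing code may round or convert differently):

```rust
/// Hypothetical quote calculation: cost per unit per interval, e.g.
/// 0.00000100 BTC per GB of storage (`GBSpace`) per monthly interval.
fn quote_btc(amount_per_unit_interval: f64, gigabytes: f64, intervals: f64) -> f64 {
    amount_per_unit_interval * gigabytes * intervals
}

fn main() {
    // 5 GB for 3 months at the example rate from the config above
    let total = quote_btc(0.000001, 5.0, 3.0);
    assert!((total - 0.000015).abs() < 1e-12);
}
```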


@ -18,13 +18,13 @@
"editable": true,
"fiscalYearStartMonth": 0,
"graphTooltip": 0,
"id": 15,
"id": 2,
"links": [],
"panels": [
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -93,18 +93,18 @@
}
]
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "void_cat",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
"rawQuery": true,
"rawSql": "SELECT \n hex(void_cat.users.pubkey) as pubkey, \n count(void_cat.user_uploads.file) as uploads, \n sum(void_cat.uploads.size) as size\nFROM void_cat.users, void_cat.user_uploads, void_cat.uploads\nwhere void_cat.users.id = void_cat.user_uploads.user_id\nand void_cat.user_uploads.file = void_cat.uploads.id\ngroup by void_cat.users.pubkey",
"rawSql": "SELECT \n hex(users.pubkey) as pubkey, \n count(user_uploads.file) as uploads, \n sum(uploads.size) as size\nFROM users, user_uploads, uploads\nwhere users.id = user_uploads.user_id\nand user_uploads.file = uploads.id\ngroup by users.pubkey",
"refId": "A",
"sql": {
"columns": [
@ -137,7 +137,7 @@
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -174,6 +174,7 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": [
"lastNotNull"
@ -185,18 +186,18 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "mysql",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
"rawQuery": true,
"rawSql": "select \n sum(uploads.size) as size\nfrom void_cat.uploads",
"rawSql": "select \n sum(uploads.size) as size\nfrom uploads",
"refId": "A",
"sql": {
"columns": [
@ -223,7 +224,7 @@
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -260,6 +261,7 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": [
"lastNotNull"
@ -271,18 +273,18 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "mysql",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
"rawQuery": true,
"rawSql": "select \n count(users.pubkey) as users\nfrom void_cat.users",
"rawSql": "select \n count(users.pubkey) as users\nfrom users",
"refId": "A",
"sql": {
"columns": [
@ -309,7 +311,7 @@
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -346,6 +348,7 @@
"graphMode": "area",
"justifyMode": "auto",
"orientation": "auto",
"percentChangeColorMode": "standard",
"reduceOptions": {
"calcs": [
"lastNotNull"
@ -357,18 +360,18 @@
"textMode": "auto",
"wideLayout": true
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "mysql",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
"rawQuery": true,
"rawSql": "select \n count(uploads.id) as files\nfrom void_cat.uploads",
"rawSql": "select \n count(uploads.id) as files\nfrom uploads",
"refId": "A",
"sql": {
"columns": [
@ -395,7 +398,7 @@
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -445,18 +448,18 @@
},
"showHeader": true
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "mysql",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
"rawQuery": true,
"rawSql": "select \n hex(uploads.id) as sha256,\n hex(users.pubkey) as uploader,\n uploads.name,\n sys.format_bytes(uploads.size) as size,\n uploads.mime_type,\n uploads.created,\n uploads.width,\n uploads.height\nfrom void_cat.uploads, void_cat.user_uploads, void_cat.users\nwhere uploads.id = user_uploads.file\nand users.id = user_uploads.user_id\norder by uploads.created desc\nlimit 50",
"rawSql": "select \n hex(uploads.id) as sha256,\n hex(users.pubkey) as uploader,\n uploads.name,\n sys.format_bytes(uploads.size) as size,\n uploads.mime_type,\n uploads.created,\n uploads.width,\n uploads.height\nfrom uploads, user_uploads, users\nwhere uploads.id = user_uploads.file\nand users.id = user_uploads.user_id\norder by uploads.created desc\nlimit 50",
"refId": "A",
"sql": {
"columns": [
@ -483,7 +486,7 @@
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -533,18 +536,18 @@
},
"showHeader": true
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "mysql",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
"rawQuery": true,
"rawSql": "select \n hex(uploads.id) as sha256,\n hex(users.pubkey) as uploader,\n uploads.name,\n sys.format_bytes(uploads.size) as size,\n uploads.mime_type,\n uploads.created\nfrom void_cat.uploads, void_cat.user_uploads, void_cat.users\nwhere uploads.id = user_uploads.file\nand users.id = user_uploads.user_id\norder by uploads.size desc\nlimit 50",
"rawSql": "select \n hex(uploads.id) as sha256,\n hex(users.pubkey) as uploader,\n uploads.name,\n sys.format_bytes(uploads.size) as size,\n uploads.mime_type,\n uploads.created\nfrom uploads, user_uploads, users\nwhere uploads.id = user_uploads.file\nand users.id = user_uploads.user_id\norder by uploads.size desc\nlimit 50",
"refId": "A",
"sql": {
"columns": [
@ -571,7 +574,7 @@
{
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"fieldConfig": {
"defaults": {
@ -621,13 +624,13 @@
},
"showHeader": true
},
"pluginVersion": "10.4.2",
"pluginVersion": "11.5.2",
"targets": [
{
"dataset": "mysql",
"datasource": {
"type": "mysql",
"uid": "cdnhzi5uxm8zkb"
"uid": "behhij20nn4zka"
},
"editorMode": "code",
"format": "table",
@ -657,7 +660,9 @@
"type": "table"
}
],
"schemaVersion": 39,
"preload": false,
"refresh": "",
"schemaVersion": 40,
"tags": [],
"templating": {
"list": []
@ -670,6 +675,6 @@
"timezone": "browser",
"title": "route96",
"uid": "ddni0rqotyltse",
"version": 12,
"version": 3,
"weekStart": ""
}


@ -0,0 +1,4 @@
-- Add migration script here
alter table uploads
add column duration float,
add column bitrate integer unsigned;


@ -0,0 +1,22 @@
-- Add migration script here
alter table users
add column paid_until timestamp,
add column paid_size bigint unsigned not null;
create table payments
(
payment_hash binary(32) not null primary key,
user_id integer unsigned not null,
created timestamp default current_timestamp,
amount integer unsigned not null,
is_paid bit(1) not null default 0,
days_value integer unsigned not null,
size_value bigint unsigned not null,
settle_index integer unsigned,
rate float,
constraint fk_payments_user_id
foreign key (user_id) references users (id)
on delete cascade
on update restrict
);


@ -0,0 +1,28 @@
-- Create reports table for file reporting functionality
create table reports
(
id integer unsigned not null auto_increment primary key,
file_id binary(32) not null,
reporter_id integer unsigned not null,
event_json text not null,
created timestamp default current_timestamp,
constraint fk_reports_file
foreign key (file_id) references uploads (id)
on delete cascade
on update restrict,
constraint fk_reports_reporter
foreign key (reporter_id) references users (id)
on delete cascade
on update restrict
);
-- Unique index to prevent duplicate reports from same user for same file
create unique index ix_reports_file_reporter on reports (file_id, reporter_id);
-- Index for efficient lookups by file
create index ix_reports_file_id on reports (file_id);
-- Index for efficient lookups by reporter
create index ix_reports_reporter_id on reports (reporter_id);


@ -0,0 +1,5 @@
-- Add reviewed flag to reports table
alter table reports add column reviewed boolean not null default false;
-- Index for efficient filtering of non-reviewed reports
create index ix_reports_reviewed on reports (reviewed);


@ -1,7 +1,7 @@
use crate::analytics::Analytics;
use crate::settings::Settings;
use anyhow::Error;
use log::{info, warn};
use log::{debug, warn};
use nostr::serde_json;
use reqwest::ClientBuilder;
use rocket::Request;
@ -61,7 +61,7 @@ impl PlausibleAnalytics {
.send()
.await
{
Ok(_v) => info!("Sent {:?}", msg),
Ok(_v) => debug!("Sent {:?}", msg),
Err(e) => warn!("Failed to track: {}", e),
}
}


@ -33,10 +33,12 @@ impl<'r> FromRequest<'r> for Nip98Auth {
if event.kind != Kind::HttpAuth {
return Outcome::Error((Status::new(401), "Wrong event kind"));
}
if event.created_at > Timestamp::now() {
if (event.created_at.as_u64() as i64 -
Timestamp::now().as_u64() as i64).abs() >= 60
{
return Outcome::Error((
Status::new(401),
"Created timestamp is in the future",
"Created timestamp is out of range",
));
}
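The new check above accepts timestamps within a window around the server clock instead of only rejecting future ones. A standalone sketch of the same arithmetic, with the 60-second tolerance from the diff (`within_window` is a hypothetical helper, not part of the codebase):

```rust
/// Returns true when `created_at` is within `tolerance` seconds of `now`,
/// in either direction. Casting to i64 before subtracting avoids u64
/// underflow when `created_at` lies slightly in the future.
fn within_window(created_at: u64, now: u64, tolerance: i64) -> bool {
    (created_at as i64 - now as i64).abs() < tolerance
}

fn main() {
    let now = 1_700_000_000u64;
    assert!(within_window(now - 30, now, 60));  // slightly old: accepted
    assert!(within_window(now + 30, now, 60));  // small clock skew: accepted
    assert!(!within_window(now - 60, now, 60)); // exactly 60s out: rejected
}
```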


@ -0,0 +1,102 @@
use crate::db::{Database, FileUpload};
use crate::filesystem::FileStore;
use crate::processing::probe_file;
use anyhow::Result;
use log::{error, info, warn};
use tokio::sync::broadcast::Receiver;
pub struct MediaMetadata {
db: Database,
fs: FileStore,
}
impl MediaMetadata {
pub fn new(db: Database, fs: FileStore) -> Self {
Self { db, fs }
}
pub async fn process(&mut self, mut shutdown: Receiver<()>) -> Result<()> {
let to_migrate = self.db.get_missing_media_metadata().await?;
info!("{} files are missing metadata", to_migrate.len());
for file in to_migrate {
if shutdown.try_recv().is_ok() {
break;
}
// probe file and update metadata
let path = self.fs.get(&file.id);
match probe_file(&path) {
Ok(data) => {
let bv = data.best_video();
let duration = if data.duration < 0.0 {
None
} else {
Some(data.duration)
};
let bitrate = if data.bitrate == 0 {
None
} else {
Some(data.bitrate as u32)
};
info!(
"Updating metadata: id={}, dim={}x{}, dur={}, br={}",
hex::encode(&file.id),
bv.map(|v| v.width).unwrap_or(0),
bv.map(|v| v.height).unwrap_or(0),
duration.unwrap_or(0.0),
bitrate.unwrap_or(0)
);
if let Err(e) = self
.db
.update_metadata(
&file.id,
bv.map(|v| v.width as u32),
bv.map(|v| v.height as u32),
duration,
bitrate,
)
.await
{
error!("Failed to update metadata: {}", e);
}
}
Err(e) => {
warn!("Skipping missing file: {}, {}", hex::encode(&file.id), e);
}
}
}
Ok(())
}
}
impl Database {
pub async fn get_missing_media_metadata(&mut self) -> Result<Vec<FileUpload>> {
let results: Vec<FileUpload> = sqlx::query_as("select * from uploads where \
(mime_type like 'image/%' and (width is null or height is null)) or \
(mime_type like 'video/%' and (width is null or height is null or bitrate is null or duration is null))")
.fetch_all(&self.pool)
.await?;
Ok(results)
}
pub async fn update_metadata(
&mut self,
id: &Vec<u8>,
width: Option<u32>,
height: Option<u32>,
duration: Option<f32>,
bitrate: Option<u32>,
) -> Result<()> {
sqlx::query("update uploads set width=?, height=?, duration=?, bitrate=? where id=?")
.bind(width)
.bind(height)
.bind(duration)
.bind(bitrate)
.bind(id)
.execute(&self.pool)
.await?;
Ok(())
}
}

src/background/mod.rs (new file, 54 lines)

@ -0,0 +1,54 @@
use crate::db::Database;
use crate::filesystem::FileStore;
use log::{error, info, warn};
use tokio::sync::broadcast;
use tokio::task::JoinHandle;
#[cfg(feature = "media-compression")]
mod media_metadata;
#[cfg(feature = "payments")]
mod payments;
pub fn start_background_tasks(
db: Database,
file_store: FileStore,
shutdown_rx: broadcast::Receiver<()>,
#[cfg(feature = "payments")] client: Option<fedimint_tonic_lnd::Client>,
) -> Vec<JoinHandle<()>> {
let mut ret = vec![];
#[cfg(feature = "media-compression")]
{
let db = db.clone();
let rx = shutdown_rx.resubscribe();
ret.push(tokio::spawn(async move {
info!("Starting MediaMetadata background task");
let mut m = media_metadata::MediaMetadata::new(db, file_store.clone());
if let Err(e) = m.process(rx).await {
error!("MediaMetadata failed: {}", e);
} else {
info!("MediaMetadata background task completed");
}
}));
}
#[cfg(feature = "payments")]
{
if let Some(client) = client {
let db = db.clone();
let rx = shutdown_rx.resubscribe();
ret.push(tokio::spawn(async move {
info!("Starting PaymentsHandler background task");
let mut m = payments::PaymentsHandler::new(client, db);
if let Err(e) = m.process(rx).await {
error!("PaymentsHandler failed: {}", e);
} else {
info!("PaymentsHandler background task completed");
}
}));
} else {
warn!("Not starting PaymentsHandler, configuration missing")
}
}
ret
}


@ -0,0 +1,71 @@
use crate::db::Database;
use anyhow::Result;
use fedimint_tonic_lnd::lnrpc::invoice::InvoiceState;
use fedimint_tonic_lnd::lnrpc::InvoiceSubscription;
use fedimint_tonic_lnd::Client;
use log::{error, info};
use rocket::futures::StreamExt;
use sqlx::Row;
use tokio::sync::broadcast;
pub struct PaymentsHandler {
client: Client,
database: Database,
}
impl PaymentsHandler {
pub fn new(client: Client, database: Database) -> Self {
PaymentsHandler { client, database }
}
pub async fn process(&mut self, mut rx: broadcast::Receiver<()>) -> Result<()> {
let start_idx = self.database.get_last_settle_index().await?;
let mut invoices = self
.client
.lightning()
.subscribe_invoices(InvoiceSubscription {
add_index: 0,
settle_index: start_idx,
})
.await?;
info!("Starting invoice subscription from {}", start_idx);
let invoices = invoices.get_mut();
loop {
tokio::select! {
Ok(_) = rx.recv() => {
break;
}
Some(Ok(msg)) = invoices.next() => {
if msg.state == InvoiceState::Settled as i32 {
if let Ok(Some(mut p)) = self.database.get_payment(&msg.r_hash).await {
p.settle_index = Some(msg.settle_index);
p.is_paid = true;
match self.database.complete_payment(&p).await {
Ok(()) => info!(
"Successfully completed payment: {}",
hex::encode(&msg.r_hash)
),
Err(e) => error!("Failed to complete payment: {}", e),
}
}
}
}
}
}
Ok(())
}
}
impl Database {
async fn get_last_settle_index(&self) -> Result<u64> {
Ok(
sqlx::query("select max(settle_index) from payments where is_paid = true")
.fetch_one(&self.pool)
.await?
.try_get(0)
.unwrap_or(0),
)
}
}


@ -3,6 +3,8 @@ use std::net::{IpAddr, SocketAddr};
use anyhow::Error;
use clap::Parser;
use config::Config;
#[cfg(feature = "payments")]
use fedimint_tonic_lnd::lnrpc::GetInfoRequest;
use log::{error, info};
use rocket::config::Ident;
use rocket::data::{ByteUnit, Limits};
@ -12,12 +14,14 @@ use rocket::shield::Shield;
use route96::analytics::plausible::PlausibleAnalytics;
#[cfg(feature = "analytics")]
use route96::analytics::AnalyticsFairing;
use route96::background::start_background_tasks;
use route96::cors::CORS;
use route96::db::Database;
use route96::filesystem::FileStore;
use route96::routes;
use route96::routes::{get_blob, head_blob, root};
use route96::settings::Settings;
use tokio::sync::broadcast;
#[derive(Parser, Debug)]
#[command(version, about)]
@ -63,15 +67,20 @@ async fn main() -> Result<(), Error> {
.limit("form", upload_limit);
config.ident = Ident::try_new("route96").unwrap();
let fs = FileStore::new(settings.clone());
let mut rocket = rocket::Rocket::custom(config)
.manage(FileStore::new(settings.clone()))
.manage(fs.clone())
.manage(settings.clone())
.manage(db.clone())
.attach(CORS)
.attach(Shield::new()) // empty shield: disables the default security headers
.mount(
"/",
routes![root, get_blob, head_blob, routes::void_cat_redirect],
routes![
root,
get_blob,
head_blob
],
)
.mount("/admin", routes::admin_routes());
@ -89,10 +98,51 @@ async fn main() -> Result<(), Error> {
{
rocket = rocket.mount("/", routes::nip96_routes());
}
#[cfg(feature = "media-compression")]
{
rocket = rocket.mount("/", routes![routes::get_blob_thumb]);
}
#[cfg(feature = "payments")]
let lnd = {
if let Some(lnd) = settings.payments.as_ref().map(|p| &p.lnd) {
let lnd = fedimint_tonic_lnd::connect(
lnd.endpoint.clone(),
lnd.tls.clone(),
lnd.macaroon.clone(),
)
.await?;
let info = {
let mut lnd = lnd.clone();
lnd.lightning().get_info(GetInfoRequest::default()).await?
};
info!(
"LND connected: {} v{}",
info.get_ref().alias,
info.get_ref().version
);
rocket = rocket
.manage(lnd.clone())
.mount("/", routes::payment::routes());
Some(lnd)
} else {
None
}
};
let (shutdown_tx, shutdown_rx) = broadcast::channel(1);
let jh = start_background_tasks(db, fs, shutdown_rx, lnd);
if let Err(e) = rocket.launch().await {
error!("Rocket error {}", e);
return Err(Error::from(e));
}
shutdown_tx
.send(())
.expect("Failed to send shutdown signal");
for j in jh {
j.await?;
}
Ok(())
}


@ -1,70 +0,0 @@
use clap::Parser;
use log::{info, warn};
use nostr::serde_json;
use nostr_cursor::cursor::NostrCursor;
use regex::Regex;
use rocket::futures::StreamExt;
use std::collections::{HashMap, HashSet};
use std::path::PathBuf;
use tokio::fs::File;
use tokio::io::AsyncWriteExt;
use uuid::Uuid;
#[derive(Parser, Debug)]
#[command(version, about)]
struct ProgramArgs {
/// Directory pointing to archives to scan
#[arg(short, long)]
pub archive: PathBuf,
/// Output path .csv
#[arg(short, long)]
pub output: PathBuf,
}
#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
pretty_env_logger::init();
let args: ProgramArgs = ProgramArgs::parse();
let mut report: HashMap<String, HashSet<String>> = HashMap::new();
let mut binding = NostrCursor::new(args.archive);
let mut cursor = Box::pin(binding.walk());
let matcher = Regex::new(r"void\.cat/d/(\w+)")?;
while let Some(Ok(e)) = cursor.next().await {
if e.content.contains("void.cat") {
let links = matcher.captures_iter(&e.content).collect::<Vec<_>>();
for link in links {
let g = link.get(1).unwrap().as_str();
let base58 = if let Ok(b) = nostr::bitcoin::base58::decode(g) {
b
} else {
warn!("Invalid base58 id {}", g);
continue;
};
let _uuid = if let Ok(u) = Uuid::from_slice_le(base58.as_slice()) {
u
} else {
warn!("Invalid uuid {}", g);
continue;
};
info!("Got link: {} => {}", g, e.pubkey);
if let Some(ur) = report.get_mut(&e.pubkey) {
ur.insert(g.to_string());
} else {
report.insert(e.pubkey.clone(), HashSet::from([g.to_string()]));
}
}
}
}
let json = serde_json::to_string(&report)?;
File::create(args.output)
.await?
.write_all(json.as_bytes())
.await?;
Ok(())
}


@ -1,145 +0,0 @@
use anyhow::Error;
use clap::Parser;
use config::Config;
use log::{info, warn};
use nostr::bitcoin::base58;
use route96::db::{Database, FileUpload};
use route96::filesystem::FileStore;
use route96::settings::Settings;
use route96::void_db::VoidCatDb;
use route96::void_file::VoidFile;
use std::path::PathBuf;
use tokio::io::{AsyncWriteExt, BufWriter};
#[derive(Debug, Clone, clap::ValueEnum)]
enum ArgOperation {
Migrate,
ExportNginxRedirects,
}
#[derive(Parser, Debug)]
#[command(version, about)]
struct Args {
/// Database connection string for void.cat DB
#[arg(long)]
pub database: String,
/// Path to filestore on void.cat
#[arg(long)]
pub data_path: String,
#[arg(long)]
pub operation: ArgOperation,
}
#[tokio::main]
async fn main() -> Result<(), Error> {
pretty_env_logger::init();
let builder = Config::builder()
.add_source(config::File::with_name("config.yaml"))
.add_source(config::Environment::with_prefix("APP"))
.build()?;
let settings: Settings = builder.try_deserialize()?;
let db = Database::new(&settings.database).await?;
let fs = FileStore::new(settings.clone());
let args: Args = Args::parse();
let db_void = VoidCatDb::connect(&args.database).await?;
match args.operation {
ArgOperation::Migrate => {
let mut page = 0;
loop {
let files = db_void.list_files(page).await?;
if files.is_empty() {
break;
}
for f in files {
if let Err(e) = migrate_file(&f, &db, &fs, &args).await {
warn!("Failed to migrate file: {}, {}", &f.id, e);
}
}
page += 1;
}
}
ArgOperation::ExportNginxRedirects => {
let path: PathBuf = args.data_path.parse()?;
let conf_path = &path.join("nginx.conf");
info!("Writing redirects to {}", conf_path.to_str().unwrap());
let mut fout = BufWriter::new(tokio::fs::File::create(conf_path).await?);
let mut page = 0;
loop {
let files = db_void.list_files(page).await?;
if files.is_empty() {
break;
}
for f in files {
let legacy_id = base58::encode(f.id.to_bytes_le().as_slice());
let redirect = format!("location ^\\/d\\/{}(?:\\.\\w+)?$ {{\n\treturn 301 https://nostr.download/{};\n}}\n", &legacy_id, &f.digest);
fout.write_all(redirect.as_bytes()).await?;
}
page += 1;
}
}
}
Ok(())
}
async fn migrate_file(
f: &VoidFile,
db: &Database,
fs: &FileStore,
args: &Args,
) -> Result<(), Error> {
let pubkey_vec = hex::decode(&f.email)?;
let id_vec = hex::decode(&f.digest)?;
// copy file
let src_path = PathBuf::new()
.join(&args.data_path)
.join(VoidFile::map_to_path(&f.id));
let dst_path = fs.map_path(&id_vec);
if src_path.exists() && !dst_path.exists() {
info!(
"Copying file: {} from {} => {}",
&f.id,
src_path.to_str().unwrap(),
dst_path.to_str().unwrap()
);
tokio::fs::create_dir_all(dst_path.parent().unwrap()).await?;
tokio::fs::copy(src_path, dst_path).await?;
} else if dst_path.exists() {
info!("File already exists {}, continuing...", &f.id);
} else {
anyhow::bail!("Source file not found {}", src_path.to_str().unwrap());
}
let uid = db.upsert_user(&pubkey_vec).await?;
info!("Mapped user {} => {}", &f.email, uid);
let md: Option<Vec<&str>> = f.media_dimensions.as_ref().map(|s| s.split("x").collect());
let fu = FileUpload {
id: id_vec,
name: f.name.clone(),
size: f.size as u64,
mime_type: f.mime_type.clone(),
created: f.uploaded,
width: match &md {
Some(s) => Some(s[0].parse::<u32>()?),
None => None,
},
height: match &md {
Some(s) => Some(s[1].parse::<u32>()?),
None => None,
},
blur_hash: None,
alt: f.description.clone(),
};
db.add_file(&fu, uid).await?;
Ok(())
}

src/db.rs

@ -25,6 +25,10 @@ pub struct FileUpload {
pub blur_hash: Option<String>,
/// Alt text of the media
pub alt: Option<String>,
/// Duration of media in seconds
pub duration: Option<f32>,
/// Average bitrate in bits/s
pub bitrate: Option<u32>,
#[sqlx(skip)]
#[cfg(feature = "labels")]
@ -43,6 +47,8 @@ impl From<&NewFileResult> for FileUpload {
height: value.height,
blur_hash: value.blur_hash.clone(),
alt: None,
duration: value.duration,
bitrate: value.bitrate,
#[cfg(feature = "labels")]
labels: value.labels.clone(),
}
@ -55,6 +61,10 @@ pub struct User {
pub pubkey: Vec<u8>,
pub created: DateTime<Utc>,
pub is_admin: bool,
#[cfg(feature = "payments")]
pub paid_until: Option<DateTime<Utc>>,
#[cfg(feature = "payments")]
pub paid_size: u64,
}
#[cfg(feature = "labels")]
@ -84,6 +94,31 @@ pub struct UserStats {
pub total_size: u64,
}
#[cfg(feature = "payments")]
#[derive(Clone, FromRow, Serialize)]
pub struct Payment {
pub payment_hash: Vec<u8>,
pub user_id: u64,
pub created: DateTime<Utc>,
pub amount: u64,
pub is_paid: bool,
pub days_value: u64,
pub size_value: u64,
pub settle_index: Option<u64>,
pub rate: Option<f32>,
}
#[derive(Clone, FromRow, Serialize)]
pub struct Report {
pub id: u64,
#[serde(with = "hex")]
pub file_id: Vec<u8>,
pub reporter_id: u64,
pub event_json: String,
pub created: DateTime<Utc>,
pub reviewed: bool,
}
#[derive(Clone)]
pub struct Database {
pub(crate) pool: sqlx::pool::Pool<sqlx::mysql::MySql>,
@ -104,14 +139,23 @@ impl Database {
.bind(pubkey)
.fetch_optional(&self.pool)
.await?;
match res {
let user_id = match res {
None => sqlx::query("select id from users where pubkey = ?")
.bind(pubkey)
.fetch_one(&self.pool)
.await?
.try_get(0),
Some(res) => res.try_get(0),
.try_get(0)?,
Some(res) => res.try_get(0)?,
};
// Make the first user (ID 1) an admin
if user_id == 1 {
sqlx::query("update users set is_admin = 1 where id = 1")
.execute(&self.pool)
.await?;
}
Ok(user_id)
}
pub async fn get_user(&self, pubkey: &Vec<u8>) -> Result<User, Error> {
@ -121,6 +165,13 @@ impl Database {
.await
}
pub async fn get_user_by_id(&self, user_id: u64) -> Result<User, Error> {
sqlx::query_as("select * from users where id = ?")
.bind(user_id)
.fetch_one(&self.pool)
.await
}
pub async fn get_user_stats(&self, id: u64) -> Result<UserStats, Error> {
sqlx::query_as(
"select cast(count(user_uploads.file) as unsigned integer) as file_count, \
@ -145,7 +196,7 @@ impl Database {
pub async fn add_file(&self, file: &FileUpload, user_id: u64) -> Result<(), Error> {
let mut tx = self.pool.begin().await?;
let q = sqlx::query("insert ignore into \
uploads(id,name,size,mime_type,blur_hash,width,height,alt,created) values(?,?,?,?,?,?,?,?,?)")
uploads(id,name,size,mime_type,blur_hash,width,height,alt,created,duration,bitrate) values(?,?,?,?,?,?,?,?,?,?,?)")
.bind(&file.id)
.bind(&file.name)
.bind(file.size)
@ -154,7 +205,9 @@ impl Database {
.bind(file.width)
.bind(file.height)
.bind(&file.alt)
.bind(file.created);
.bind(file.created)
.bind(file.duration)
.bind(file.bitrate);
tx.execute(q).await?;
let q2 = sqlx::query("insert ignore into user_uploads(file,user_id) values(?,?)")
@ -262,3 +315,168 @@ impl Database {
Ok((results, count))
}
}
#[cfg(feature = "payments")]
impl Database {
pub async fn insert_payment(&self, payment: &Payment) -> Result<(), Error> {
sqlx::query("insert into payments(payment_hash,user_id,amount,days_value,size_value,rate) values(?,?,?,?,?,?)")
.bind(&payment.payment_hash)
.bind(payment.user_id)
.bind(payment.amount)
.bind(payment.days_value)
.bind(payment.size_value)
.bind(payment.rate)
.execute(&self.pool)
.await?;
Ok(())
}
pub async fn get_payment(&self, payment_hash: &Vec<u8>) -> Result<Option<Payment>, Error> {
sqlx::query_as("select * from payments where payment_hash = ?")
.bind(payment_hash)
.fetch_optional(&self.pool)
.await
}
pub async fn get_user_payments(&self, uid: u64) -> Result<Vec<Payment>, Error> {
sqlx::query_as("select * from payments where user_id = ?")
.bind(uid)
.fetch_all(&self.pool)
.await
}
pub async fn complete_payment(&self, payment: &Payment) -> Result<(), Error> {
let mut tx = self.pool.begin().await?;
sqlx::query("update payments set is_paid = true, settle_index = ? where payment_hash = ?")
.bind(payment.settle_index)
.bind(&payment.payment_hash)
.execute(&mut *tx)
.await?;
// Calculate time extension based on fractional quota value
// If user upgrades from 5GB to 10GB, their remaining time gets halved
// If user pays for 1GB on a 5GB plan, they get 1/5 of the normal time
let current_user = self.get_user_by_id(payment.user_id).await?;
if let Some(paid_until) = current_user.paid_until {
if paid_until > chrono::Utc::now() {
// User has active subscription - calculate fractional time extension
let time_fraction = if current_user.paid_size > 0 {
payment.size_value as f64 / current_user.paid_size as f64
} else {
1.0 // If no existing quota, treat as 100%
};
let adjusted_days = (payment.days_value as f64 * time_fraction) as u64;
// Extend subscription time and upgrade quota if larger
let new_quota_size = std::cmp::max(current_user.paid_size, payment.size_value);
sqlx::query("update users set paid_until = TIMESTAMPADD(DAY, ?, paid_until), paid_size = ? where id = ?")
.bind(adjusted_days)
.bind(new_quota_size)
.bind(payment.user_id)
.execute(&mut *tx)
.await?;
} else {
// Expired subscription - set new quota and time
sqlx::query("update users set paid_until = TIMESTAMPADD(DAY, ?, current_timestamp), paid_size = ? where id = ?")
.bind(payment.days_value)
.bind(payment.size_value)
.bind(payment.user_id)
.execute(&mut *tx)
.await?;
}
} else {
// No existing subscription - set new quota
sqlx::query("update users set paid_until = TIMESTAMPADD(DAY, ?, current_timestamp), paid_size = ? where id = ?")
.bind(payment.days_value)
.bind(payment.size_value)
.bind(payment.user_id)
.execute(&mut *tx)
.await?;
}
tx.commit().await?;
Ok(())
}
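The comments inside `complete_payment` describe the fractional time-extension rule in prose; the scaling itself is a small pure calculation. The following is a minimal standalone sketch of that rule (function name and values are hypothetical, not part of the codebase):

```rust
// Scale the purchased days by the ratio of the newly paid size to the
// user's current paid quota, mirroring the logic in complete_payment.
fn adjusted_days(days_value: u64, size_value: u64, current_paid_size: u64) -> u64 {
    let time_fraction = if current_paid_size > 0 {
        size_value as f64 / current_paid_size as f64
    } else {
        1.0 // no existing quota: the full day value applies
    };
    (days_value as f64 * time_fraction) as u64
}

fn main() {
    // Paying for 1 GB on an active 5 GB plan yields 1/5 of the days.
    assert_eq!(adjusted_days(30, 1, 5), 6);
    // Paying for 10 GB on a 5 GB plan scales the new payment's days by 2.
    assert_eq!(adjusted_days(30, 10, 5), 60);
    // No existing quota is treated as 100%.
    assert_eq!(adjusted_days(30, 7, 0), 30);
}
```

Note the cast to `u64` truncates, so a payment smaller than the fraction needed for one day contributes zero days.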
/// Check if user has sufficient quota for an upload
pub async fn check_user_quota(&self, pubkey: &Vec<u8>, upload_size: u64, free_quota_bytes: u64) -> Result<bool, Error> {
// Get or create user
let user_id = self.upsert_user(pubkey).await?;
// Get user's current storage usage
let user_stats = self.get_user_stats(user_id).await.unwrap_or(UserStats {
file_count: 0,
total_size: 0
});
// Get user's paid quota
let user = self.get_user(pubkey).await?;
let (paid_size, paid_until) = (user.paid_size, user.paid_until);
// Calculate total available quota
let mut available_quota = free_quota_bytes;
// Add paid quota if still valid
if let Some(paid_until) = paid_until {
if paid_until > chrono::Utc::now() {
available_quota += paid_size;
}
}
// Check if upload would exceed quota
Ok(user_stats.total_size + upload_size <= available_quota)
}
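Stripped of the database lookups, `check_user_quota` reduces to one comparison: current usage plus the new upload must fit within the free quota plus any unexpired paid quota. A pure sketch of that arithmetic (hypothetical helper, `paid` is `Some` only while the subscription is unexpired):

```rust
// Core quota comparison from check_user_quota, with the DB calls removed.
fn quota_ok(used: u64, upload: u64, free_quota: u64, paid: Option<u64>) -> bool {
    let available = free_quota + paid.unwrap_or(0);
    used + upload <= available
}

fn main() {
    const MIB: u64 = 1024 * 1024;
    const GIB: u64 = 1024 * MIB;
    // Exactly filling the 100 MiB free tier is allowed...
    assert!(quota_ok(90 * MIB, 10 * MIB, 100 * MIB, None));
    // ...but one byte over would not be; 20 MiB more certainly is not.
    assert!(!quota_ok(90 * MIB, 20 * MIB, 100 * MIB, None));
    // An active 5 GiB paid quota is added on top of the free tier.
    assert!(quota_ok(2 * GIB, 500 * MIB, 100 * MIB, Some(5 * GIB)));
}
```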
/// Add a new report to the database
pub async fn add_report(&self, file_id: &[u8], reporter_id: u64, event_json: &str) -> Result<(), Error> {
sqlx::query("insert into reports (file_id, reporter_id, event_json) values (?, ?, ?)")
.bind(file_id)
.bind(reporter_id)
.bind(event_json)
.execute(&self.pool)
.await?;
Ok(())
}
/// List reports with pagination for admin view
pub async fn list_reports(&self, offset: u32, limit: u32) -> Result<(Vec<Report>, i64), Error> {
let reports: Vec<Report> = sqlx::query_as(
"select id, file_id, reporter_id, event_json, created, reviewed from reports where reviewed = false order by created desc limit ? offset ?"
)
.bind(limit)
.bind(offset)
.fetch_all(&self.pool)
.await?;
let count: i64 = sqlx::query("select count(id) from reports where reviewed = false")
.fetch_one(&self.pool)
.await?
.try_get(0)?;
Ok((reports, count))
}
/// Get reports for a specific file
pub async fn get_file_reports(&self, file_id: &[u8]) -> Result<Vec<Report>, Error> {
sqlx::query_as(
"select id, file_id, reporter_id, event_json, created, reviewed from reports where file_id = ? order by created desc"
)
.bind(file_id)
.fetch_all(&self.pool)
.await
}
/// Mark a report as reviewed (used for acknowledging)
pub async fn mark_report_reviewed(&self, report_id: u64) -> Result<(), Error> {
sqlx::query("update reports set reviewed = true where id = ?")
.bind(report_id)
.execute(&self.pool)
.await?;
Ok(())
}
}


@ -1,5 +1,6 @@
#[cfg(feature = "labels")]
use crate::db::FileLabel;
#[cfg(feature = "labels")]
use crate::processing::labeling::label_frame;
#[cfg(feature = "media-compression")]
@ -9,11 +10,12 @@ use anyhow::Error;
use anyhow::Result;
#[cfg(feature = "media-compression")]
use ffmpeg_rs_raw::DemuxerInfo;
use ffmpeg_rs_raw::StreamInfo;
#[cfg(feature = "media-compression")]
use rocket::form::validate::Contains;
use serde::Serialize;
use sha2::{Digest, Sha256};
use std::path::PathBuf;
use std::path::{Path, PathBuf};
use tokio::fs::File;
use tokio::io::{AsyncRead, AsyncReadExt};
use uuid::Uuid;
@ -36,10 +38,13 @@ pub struct NewFileResult {
pub width: Option<u32>,
pub height: Option<u32>,
pub blur_hash: Option<String>,
pub duration: Option<f32>,
pub bitrate: Option<u32>,
#[cfg(feature = "labels")]
pub labels: Vec<FileLabel>,
}
#[derive(Clone)]
pub struct FileStore {
settings: Settings,
}
@ -57,7 +62,7 @@ impl FileStore {
/// Store a new file
pub async fn put<'r, S>(
&self,
stream: S,
path: S,
mime_type: &str,
compress: bool,
) -> Result<FileSystemResult>
@ -65,7 +70,7 @@ impl FileStore {
S: AsyncRead + Unpin + 'r,
{
// store file in temp path and hash the file
let (temp_file, size, hash) = self.store_hash_temp_file(stream).await?;
let (temp_file, size, hash) = self.store_hash_temp_file(path).await?;
let dst_path = self.map_path(&hash);
// check if file hash already exists
@ -74,7 +79,7 @@ impl FileStore {
return Ok(FileSystemResult::AlreadyExists(hash));
}
let mut res = if compress {
let mut res = if compress && crate::can_compress(mime_type) {
#[cfg(feature = "media-compression")]
{
let res = match self.compress_file(&temp_file, mime_type).await {
@ -92,20 +97,30 @@ impl FileStore {
anyhow::bail!("Compression not supported!");
}
} else {
let (width, height, mime_type) = {
let (width, height, mime_type, duration, bitrate) = {
#[cfg(feature = "media-compression")]
{
let probe = probe_file(&temp_file).ok();
let v_stream = probe.as_ref().and_then(|p| p.best_video());
let mime = Self::hack_mime_type(mime_type, &probe, &temp_file);
let mime = Self::hack_mime_type(mime_type, &probe, &v_stream, &temp_file);
(
v_stream.map(|v| v.width as u32),
v_stream.map(|v| v.height as u32),
mime,
probe
.as_ref()
.map(|p| if p.duration < 0. { 0.0 } else { p.duration }),
probe.as_ref().map(|p| p.bitrate as u32),
)
}
#[cfg(not(feature = "media-compression"))]
(None, None, Self::infer_mime_type(mime_type, &temp_file))
(
None,
None,
Self::infer_mime_type(mime_type, &temp_file),
None,
None,
)
};
NewFileResult {
path: temp_file,
@ -115,6 +130,8 @@ impl FileStore {
width,
height,
blur_hash: None,
duration,
bitrate,
}
};
@ -136,18 +153,50 @@ impl FileStore {
#[cfg(feature = "media-compression")]
/// Try to replace the mime-type when unknown using ffmpeg probe result
fn hack_mime_type(mime_type: &str, p: &Option<DemuxerInfo>, out_path: &PathBuf) -> String {
fn hack_mime_type(
mime_type: &str,
p: &Option<DemuxerInfo>,
stream: &Option<&StreamInfo>,
out_path: &PathBuf,
) -> String {
if let Some(p) = p {
if p.format.contains("mp4") {
return "video/mp4".to_string();
let mime = if p.format.contains("mp4") {
Some("video/mp4")
} else if p.format.contains("webp") {
return "image/webp".to_string();
Some("image/webp")
} else if p.format.contains("jpeg") {
return "image/jpeg".to_string();
Some("image/jpeg")
} else if p.format.contains("png") {
return "image/png".to_string();
Some("image/png")
} else if p.format.contains("gif") {
return "image/gif".to_string();
Some("image/gif")
} else {
None
};
let codec = if let Some(s) = stream {
match s.codec {
27 => Some("avc1".to_owned()), //AV_CODEC_ID_H264
173 => Some("hvc1".to_owned()), //AV_CODEC_ID_HEVC
86016 => Some("mp4a.40.33".to_string()), //AV_CODEC_ID_MP2
86017 => Some("mp4a.40.34".to_string()), //AV_CODEC_ID_MP3
86018 => Some("mp4a.40.2".to_string()), //AV_CODEC_ID_AAC
86019 => Some("ac-3".to_string()), //AV_CODEC_ID_AC3
86056 => Some("ec-3".to_string()), //AV_CODEC_ID_EAC3
_ => None,
}
} else {
None
};
if let Some(m) = mime {
return format!(
"{}{}",
m,
if let Some(c) = codec {
format!("; codecs=\"{}\"", c)
} else {
"".to_owned()
}
);
}
}
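The final step of `hack_mime_type` joins the container-derived base MIME with an optional RFC 6381 `codecs` parameter. A minimal sketch of that string assembly (hypothetical standalone function):

```rust
// Build a content type like `video/mp4; codecs="avc1"` from a base MIME
// and an optional codec string, as hack_mime_type does.
fn content_type(mime: &str, codec: Option<&str>) -> String {
    match codec {
        Some(c) => format!("{}; codecs=\"{}\"", mime, c),
        None => mime.to_string(),
    }
}

fn main() {
    assert_eq!(
        content_type("video/mp4", Some("avc1")),
        "video/mp4; codecs=\"avc1\""
    );
    assert_eq!(content_type("image/png", None), "image/png");
}
```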
@ -164,7 +213,8 @@ impl FileStore {
}
}
async fn compress_file(&self, input: &PathBuf, mime_type: &str) -> Result<NewFileResult> {
#[cfg(feature = "media-compression")]
async fn compress_file(&self, input: &Path, mime_type: &str) -> Result<NewFileResult> {
let compressed_result = compress_file(input, mime_type, &self.temp_dir())?;
#[cfg(feature = "labels")]
let labels = if let Some(mp) = &self.settings.vit_model {
@ -194,6 +244,8 @@ impl FileStore {
height: Some(compressed_result.height as u32),
blur_hash: None,
mime_type: compressed_result.mime_type,
duration: Some(compressed_result.duration),
bitrate: Some(compressed_result.bitrate),
#[cfg(feature = "labels")]
labels,
})
@ -214,7 +266,7 @@ impl FileStore {
Ok((out_path, n, hash))
}
async fn hash_file(p: &PathBuf) -> Result<Vec<u8>, Error> {
pub async fn hash_file(p: &Path) -> Result<Vec<u8>, Error> {
let mut file = File::open(p).await?;
let mut hasher = Sha256::new();
let mut buf = [0; 4096];
@ -229,7 +281,7 @@ impl FileStore {
Ok(res.to_vec())
}
pub fn map_path(&self, id: &Vec<u8>) -> PathBuf {
fn map_path(&self, id: &Vec<u8>) -> PathBuf {
let id = hex::encode(id);
self.storage_dir().join(&id[0..2]).join(&id[2..4]).join(id)
}


@ -1,13 +1,18 @@
#[cfg(feature = "analytics")]
pub mod analytics;
pub mod auth;
pub mod background;
pub mod cors;
pub mod db;
pub mod filesystem;
#[cfg(feature = "payments")]
pub mod payments;
#[cfg(feature = "media-compression")]
pub mod processing;
pub mod routes;
pub mod settings;
#[cfg(any(feature = "void-cat-redirects", feature = "bin-void-cat-migrate"))]
pub mod void_db;
pub mod void_file;
pub fn can_compress(mime_type: &str) -> bool {
mime_type.starts_with("image/")
}

src/payments.rs

@ -0,0 +1,53 @@
use serde::{Deserialize, Serialize};
use std::fmt::{Display, Formatter};
#[cfg(feature = "payments")]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PaymentAmount {
pub currency: Currency,
pub amount: f32,
}
#[cfg(feature = "payments")]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum Currency {
BTC,
USD,
EUR,
GBP,
JPY,
CAD,
AUD,
}
#[cfg(feature = "payments")]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum PaymentUnit {
GBSpace,
GBEgress,
}
impl PaymentUnit {
/// Get the total size from a number of units
pub fn to_size(&self, units: f32) -> u64 {
(1024f32 * 1024f32 * 1024f32 * units) as u64
}
}
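`to_size` interprets `units` as fractional GiB (1024³ bytes). A quick standalone sketch of the conversion:

```rust
// Convert a fractional number of GiB units to bytes, as PaymentUnit::to_size does.
fn to_size(units: f32) -> u64 {
    (1024f32 * 1024f32 * 1024f32 * units) as u64
}

fn main() {
    assert_eq!(to_size(1.0), 1_073_741_824); // 1 GiB
    assert_eq!(to_size(0.5), 536_870_912); // half a GiB
}
```

Powers of two up to 2^30 are exactly representable in `f32`, so these particular values round-trip exactly; arbitrary fractions may truncate.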
impl Display for PaymentUnit {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
match self {
PaymentUnit::GBSpace => write!(f, "GB Space"),
PaymentUnit::GBEgress => write!(f, "GB Egress"),
}
}
}
#[cfg(feature = "payments")]
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum PaymentInterval {
Day(u16),
Month(u16),
Year(u16),
}


@ -1,8 +1,9 @@
use std::path::{Path, PathBuf};
use anyhow::{bail, Error, Result};
use ffmpeg_rs_raw::ffmpeg_sys_the_third::AVPixelFormat::AV_PIX_FMT_YUV420P;
use ffmpeg_rs_raw::{Demuxer, DemuxerInfo, Encoder, StreamType, Transcoder};
use ffmpeg_rs_raw::ffmpeg_sys_the_third::{av_frame_free, av_packet_free, AVFrame};
use ffmpeg_rs_raw::{Decoder, Demuxer, DemuxerInfo, Encoder, Scaler, StreamType, Transcoder};
use std::path::{Path, PathBuf};
use std::ptr;
use uuid::Uuid;
#[cfg(feature = "labels")]
@ -21,7 +22,7 @@ impl WebpProcessor {
Self
}
pub fn process_file(
pub fn compress(
&mut self,
input: &Path,
mime_type: &str,
@ -65,9 +66,65 @@ impl WebpProcessor {
mime_type: "image/webp".to_string(),
width: image_stream.width,
height: image_stream.height,
duration: if probe.duration < 0. {
0.
} else {
probe.duration
},
bitrate: probe.bitrate as u32,
})
}
}
pub fn thumbnail(&mut self, input: &Path, out_path: &Path) -> Result<()> {
use ffmpeg_rs_raw::ffmpeg_sys_the_third::AVCodecID::AV_CODEC_ID_WEBP;
unsafe {
let mut input = Demuxer::new(input.to_str().unwrap())?;
let probe = input.probe_input()?;
let image_stream = probe
.streams
.iter()
.find(|c| c.stream_type == StreamType::Video)
.ok_or(Error::msg("No video stream found, can't generate thumbnail"))?;
let w = 512u16;
let scale = w as f32 / image_stream.width as f32;
let h = (image_stream.height as f32 * scale) as u16;
let enc = Encoder::new(AV_CODEC_ID_WEBP)?
.with_height(h as i32)
.with_width(w as i32)
.with_pix_fmt(AV_PIX_FMT_YUV420P)
.with_framerate(1.0)?
.open(None)?;
let mut sws = Scaler::new();
let mut decoder = Decoder::new();
decoder.setup_decoder(image_stream, None)?;
while let Ok((mut pkt, _stream)) = input.get_packet() {
let mut frame_save: *mut AVFrame = ptr::null_mut();
for (mut frame, _stream) in decoder.decode_pkt(pkt)? {
if frame_save.is_null() {
frame_save = sws.process_frame(frame, w, h, AV_PIX_FMT_YUV420P)?;
}
av_frame_free(&mut frame);
}
av_packet_free(&mut pkt);
if !frame_save.is_null() {
enc.save_picture(frame_save, out_path.to_str().unwrap())?;
av_frame_free(&mut frame_save);
return Ok(());
}
}
Ok(())
}
}
}
pub struct NewFileProcessorResult {
@ -75,29 +132,27 @@ pub struct NewFileProcessorResult {
pub mime_type: String,
pub width: usize,
pub height: usize,
pub duration: f32,
pub bitrate: u32,
}
pub fn can_compress(mime_type: &str) -> bool {
mime_type.starts_with("image/")
}
pub fn compress_file(
stream: &Path,
path: &Path,
mime_type: &str,
out_dir: &Path,
) -> Result<NewFileProcessorResult, Error> {
if !can_compress(mime_type) {
if !crate::can_compress(mime_type) {
bail!("MIME type not supported");
}
if mime_type.starts_with("image/") {
let mut proc = WebpProcessor::new();
return proc.process_file(stream, mime_type, out_dir);
return proc.compress(path, mime_type, out_dir);
}
bail!("No media processor")
}
pub fn probe_file(stream: &Path) -> Result<DemuxerInfo> {
let mut demuxer = Demuxer::new(stream.to_str().unwrap())?;
pub fn probe_file(path: &Path) -> Result<DemuxerInfo> {
let mut demuxer = Demuxer::new(path.to_str().unwrap())?;
unsafe { demuxer.probe_input() }
}


@ -1,14 +1,14 @@
use crate::auth::nip98::Nip98Auth;
use crate::db::{Database, FileUpload};
use crate::db::{Database, FileUpload, User, Report};
use crate::routes::{Nip94Event, PagedResult};
use crate::settings::Settings;
use rocket::serde::json::Json;
use rocket::serde::Serialize;
use rocket::{routes, Responder, Route, State};
use sqlx::{Error, Row};
use sqlx::{Error, QueryBuilder, Row};
pub fn admin_routes() -> Vec<Route> {
routes![admin_list_files, admin_get_self]
routes![admin_list_files, admin_get_self, admin_list_reports, admin_acknowledge_report]
}
#[derive(Serialize, Default)]
@ -53,10 +53,25 @@ pub struct SelfUser {
pub is_admin: bool,
pub file_count: u64,
pub total_size: u64,
#[cfg(feature = "payments")]
pub paid_until: u64,
#[cfg(feature = "payments")]
pub quota: u64,
#[cfg(feature = "payments")]
pub free_quota: u64,
#[cfg(feature = "payments")]
pub total_available_quota: u64,
}
#[derive(Serialize)]
pub struct AdminNip94File {
#[serde(flatten)]
pub inner: Nip94Event,
pub uploader: Vec<String>,
}
#[rocket::get("/self")]
async fn admin_get_self(auth: Nip98Auth, db: &State<Database>) -> AdminResponse<SelfUser> {
async fn admin_get_self(auth: Nip98Auth, db: &State<Database>, settings: &State<Settings>) -> AdminResponse<SelfUser> {
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
match db.get_user(&pubkey_vec).await {
Ok(user) => {
@ -66,24 +81,55 @@ async fn admin_get_self(auth: Nip98Auth, db: &State<Database>) -> AdminResponse<
return AdminResponse::error(&format!("Failed to load user stats: {}", e))
}
};
#[cfg(feature = "payments")]
let (free_quota, total_available_quota) = {
let free_quota = settings.payments.as_ref()
.and_then(|p| p.free_quota_bytes)
.unwrap_or(104857600); // default free quota: 100 MiB
let mut total_available = free_quota;
// Add paid quota if still valid
if let Some(paid_until) = &user.paid_until {
if *paid_until > chrono::Utc::now() {
total_available += user.paid_size;
}
}
(free_quota, total_available)
};
AdminResponse::success(SelfUser {
is_admin: user.is_admin,
file_count: s.file_count,
total_size: s.total_size,
#[cfg(feature = "payments")]
paid_until: if let Some(u) = &user.paid_until {
u.timestamp() as u64
} else {
0
},
#[cfg(feature = "payments")]
quota: user.paid_size,
#[cfg(feature = "payments")]
free_quota,
#[cfg(feature = "payments")]
total_available_quota,
})
}
Err(_) => AdminResponse::error("User not found"),
}
}
#[rocket::get("/files?<page>&<count>")]
#[rocket::get("/files?<page>&<count>&<mime_type>")]
async fn admin_list_files(
auth: Nip98Auth,
page: u32,
count: u32,
mime_type: Option<String>,
db: &State<Database>,
settings: &State<Settings>,
) -> AdminResponse<PagedResult<Nip94Event>> {
) -> AdminResponse<PagedResult<AdminNip94File>> {
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
let server_count = count.clamp(1, 5_000);
@ -95,40 +141,107 @@ async fn admin_list_files(
if !user.is_admin {
return AdminResponse::error("User is not an admin");
}
match db.list_all_files(page * server_count, server_count).await {
match db
.list_all_files(page * server_count, server_count, mime_type)
.await
{
Ok((files, count)) => AdminResponse::success(PagedResult {
count: files.len() as u32,
page,
total: count as u32,
files: files
.iter()
.map(|f| Nip94Event::from_upload(settings, f))
.into_iter()
.map(|f| AdminNip94File {
inner: Nip94Event::from_upload(settings, &f.0),
uploader: f.1.into_iter().map(|u| hex::encode(&u.pubkey)).collect(),
})
.collect(),
}),
Err(e) => AdminResponse::error(&format!("Could not list files: {}", e)),
}
}
#[rocket::get("/reports?<page>&<count>")]
async fn admin_list_reports(
auth: Nip98Auth,
page: u32,
count: u32,
db: &State<Database>,
) -> AdminResponse<PagedResult<Report>> {
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
let server_count = count.clamp(1, 5_000);
let user = match db.get_user(&pubkey_vec).await {
Ok(user) => user,
Err(_) => return AdminResponse::error("User not found"),
};
if !user.is_admin {
return AdminResponse::error("User is not an admin");
}
match db.list_reports(page * server_count, server_count).await {
Ok((reports, total_count)) => AdminResponse::success(PagedResult {
count: reports.len() as u32,
page,
total: total_count as u32,
files: reports,
}),
Err(e) => AdminResponse::error(&format!("Could not list reports: {}", e)),
}
}
#[rocket::delete("/reports/<report_id>")]
async fn admin_acknowledge_report(
auth: Nip98Auth,
report_id: u64,
db: &State<Database>,
) -> AdminResponse<()> {
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
let user = match db.get_user(&pubkey_vec).await {
Ok(user) => user,
Err(_) => return AdminResponse::error("User not found"),
};
if !user.is_admin {
return AdminResponse::error("User is not an admin");
}
match db.mark_report_reviewed(report_id).await {
Ok(()) => AdminResponse::success(()),
Err(e) => AdminResponse::error(&format!("Could not acknowledge report: {}", e)),
}
}
impl Database {
pub async fn list_all_files(
&self,
offset: u32,
limit: u32,
) -> Result<(Vec<FileUpload>, i64), Error> {
let results: Vec<FileUpload> = sqlx::query_as(
"select u.* \
from uploads u \
order by u.created desc \
limit ? offset ?",
)
.bind(limit)
.bind(offset)
.fetch_all(&self.pool)
.await?;
mime_type: Option<String>,
) -> Result<(Vec<(FileUpload, Vec<User>)>, i64), Error> {
let mut q = QueryBuilder::new("select u.* from uploads u ");
if let Some(m) = mime_type {
q.push("where u.mime_type = ");
q.push_bind(m);
}
q.push(" order by u.created desc limit ");
q.push_bind(limit);
q.push(" offset ");
q.push_bind(offset);
let results: Vec<FileUpload> = q.build_query_as().fetch_all(&self.pool).await?;
let count: i64 = sqlx::query("select count(u.id) from uploads u")
.fetch_one(&self.pool)
.await?
.try_get(0)?;
Ok((results, count))
let mut res = Vec::with_capacity(results.len());
for upload in results.into_iter() {
let upd = self.get_file_owners(&upload.id).await?;
res.push((upload, upd));
}
Ok((res, count))
}
}
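The admin listing endpoints above derive their SQL window from `page * count`, with `count` clamped to `1..=5_000`. A dependency-free sketch of that pagination arithmetic (the `saturating_mul` overflow guard is an addition of this sketch, not present in the handlers themselves):

```rust
// Pagination window as used by admin_list_files / admin_list_reports:
// the requested count is clamped, then offset = page * limit.
fn page_window(page: u32, requested_count: u32) -> (u32, u32) {
    let limit = requested_count.clamp(1, 5_000);
    // saturating_mul guards against overflow on absurd page numbers
    let offset = page.saturating_mul(limit);
    (offset, limit)
}

fn main() {
    assert_eq!(page_window(0, 100), (0, 100));
    assert_eq!(page_window(3, 50), (150, 50));
    assert_eq!(page_window(1, 0), (1, 1)); // count clamped up to 1
    assert_eq!(page_window(1, 10_000), (5_000, 5_000)); // clamped down
    println!("pagination ok");
}
```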



@ -5,7 +5,7 @@ use crate::routes::{delete_file, Nip94Event};
use crate::settings::Settings;
use log::error;
use nostr::prelude::hex;
use nostr::{Alphabet, SingleLetterTag, TagKind};
use nostr::{Alphabet, JsonUtil, SingleLetterTag, TagKind};
use rocket::data::ByteUnit;
use rocket::futures::StreamExt;
use rocket::http::{Header, Status};
@ -13,7 +13,6 @@ use rocket::response::Responder;
use rocket::serde::json::Json;
use rocket::{routes, Data, Request, Response, Route, State};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use tokio::io::AsyncRead;
use tokio_util::io::StreamReader;
@ -25,9 +24,9 @@ pub struct BlobDescriptor {
pub size: u64,
#[serde(rename = "type", skip_serializing_if = "Option::is_none")]
pub mime_type: Option<String>,
pub created: u64,
pub uploaded: u64,
#[serde(rename = "nip94", skip_serializing_if = "Option::is_none")]
pub nip94: Option<HashMap<String, String>>,
pub nip94: Option<Vec<Vec<String>>>,
}
impl BlobDescriptor {
@ -45,14 +44,8 @@ impl BlobDescriptor {
sha256: id_hex,
size: value.size,
mime_type: Some(value.mime_type.clone()),
created: value.created.timestamp() as u64,
nip94: Some(
Nip94Event::from_upload(settings, value)
.tags
.iter()
.map(|r| (r[0].clone(), r[1].clone()))
.collect(),
),
uploaded: value.created.timestamp() as u64,
nip94: Some(Nip94Event::from_upload(settings, value).tags),
}
}
}
@ -71,13 +64,21 @@ pub fn blossom_routes() -> Vec<Route> {
upload_head,
upload_media,
head_media,
mirror
mirror,
report_file
]
}
#[cfg(not(feature = "media-compression"))]
pub fn blossom_routes() -> Vec<Route> {
routes![delete_blob, upload, list_files, upload_head, mirror]
routes![
delete_blob,
upload,
list_files,
upload_head,
mirror,
report_file
]
}
/// Generic holder response, mostly for errors
@ -250,12 +251,13 @@ async fn mirror(
process_stream(
StreamReader::new(rsp.bytes_stream().map(|result| {
result.map_err(|err| std::io::Error::new(std::io::ErrorKind::Other, err))
result.map_err(std::io::Error::other)
})),
&mime_type,
&None,
&pubkey,
false,
0, // No size info for mirror
fs,
db,
settings,
@ -344,17 +346,17 @@ async fn process_upload(
None
}
});
let size = auth.event.tags.iter().find_map(|t| {
let size_tag = auth.event.tags.iter().find_map(|t| {
if t.kind() == TagKind::Size {
t.content().and_then(|v| v.parse::<u64>().ok())
} else {
None
}
});
if let Some(z) = size {
if z > settings.max_upload_bytes {
return BlossomResponse::error("File too large");
}
let size = size_tag.or(auth.x_content_length).unwrap_or(0);
if size > 0 && size > settings.max_upload_bytes {
return BlossomResponse::error("File too large");
}
// check whitelist
@ -362,6 +364,28 @@ async fn process_upload(
return e;
}
// check quota
#[cfg(feature = "payments")]
{
let free_quota = settings
.payments
.as_ref()
.and_then(|p| p.free_quota_bytes)
.unwrap_or(104857600); // Default to 100MB
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
if size > 0 {
match db
.check_user_quota(&pubkey_vec, size, free_quota)
.await
{
Ok(false) => return BlossomResponse::error("Upload would exceed quota"),
Err(_) => return BlossomResponse::error("Failed to check quota"),
Ok(true) => {} // Quota check passed
}
}
}
process_stream(
data.open(ByteUnit::Byte(settings.max_upload_bytes)),
&auth
@ -370,6 +394,7 @@ async fn process_upload(
&name,
&auth.event.pubkey.to_bytes().to_vec(),
compress,
size,
fs,
db,
settings,
@ -383,6 +408,7 @@ async fn process_stream<'p, S>(
name: &Option<&str>,
pubkey: &Vec<u8>,
compress: bool,
size: u64,
fs: &State<FileStore>,
db: &State<Database>,
settings: &State<Settings>,
@ -404,7 +430,7 @@ where
_ => return BlossomResponse::error("File not found"),
},
Err(e) => {
error!("{}", e.to_string());
error!("{}", e);
return BlossomResponse::error(format!("Error saving file (disk): {}", e));
}
};
@ -415,10 +441,103 @@ where
return BlossomResponse::error(format!("Failed to save file (db): {}", e));
}
};
// Post-upload quota check if we didn't have size information before upload
#[cfg(feature = "payments")]
if size == 0 {
let free_quota = settings
.payments
.as_ref()
.and_then(|p| p.free_quota_bytes)
.unwrap_or(104857600); // Default to 100MB
match db.check_user_quota(pubkey, upload.size, free_quota).await {
Ok(false) => {
// Clean up the uploaded file if quota exceeded
if let Err(e) = tokio::fs::remove_file(fs.get(&upload.id)).await {
log::warn!("Failed to cleanup quota-exceeding file: {}", e);
}
return BlossomResponse::error("Upload would exceed quota");
}
Err(_) => {
// Clean up on quota check error
if let Err(e) = tokio::fs::remove_file(fs.get(&upload.id)).await {
log::warn!("Failed to cleanup file after quota check error: {}", e);
}
return BlossomResponse::error("Failed to check quota");
}
Ok(true) => {} // Quota check passed
}
}
if let Err(e) = db.add_file(&upload, user_id).await {
error!("{}", e.to_string());
error!("{}", e);
BlossomResponse::error(format!("Error saving file (db): {}", e))
} else {
BlossomResponse::BlobDescriptor(Json(BlobDescriptor::from_upload(settings, &upload)))
}
}
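Both quota gates above (pre-upload when a size hint exists, post-upload otherwise) reduce to comparing current usage plus the incoming size against the allowed total. A minimal sketch of that arithmetic, with hypothetical names; the real check (`Database::check_user_quota`) reads usage and paid quota from SQL:

```rust
// Sketch of the quota comparison assumed by the upload paths above.
// All values are bytes; `paid_quota` is whatever the user purchased.
fn quota_allows(used: u64, incoming: u64, free_quota: u64, paid_quota: u64) -> bool {
    // saturating_add avoids overflow on absurd inputs
    let allowed = free_quota.saturating_add(paid_quota);
    used.saturating_add(incoming) <= allowed
}

fn main() {
    let free = 104_857_600u64; // 100 MiB default, as in the handlers above
    assert!(quota_allows(0, 1024, free, 0));
    assert!(!quota_allows(free, 1, free, 0)); // at the cap, one more byte fails
    assert!(quota_allows(free, 1, free, 1024)); // paid top-up extends the cap
    println!("quota checks ok");
}
```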
#[rocket::put("/report", data = "<data>", format = "json")]
async fn report_file(
auth: BlossomAuth,
db: &State<Database>,
settings: &State<Settings>,
data: Json<nostr::Event>,
) -> BlossomResponse {
// Check if the request has the correct method tag
if !check_method(&auth.event, "report") {
return BlossomResponse::error("Invalid request method tag");
}
// Check whitelist
if let Some(e) = check_whitelist(&auth, settings) {
return e;
}
// Extract file SHA256 from the "x" tag in the report event
let file_sha256 = if let Some(x_tag) = data.tags.iter().find_map(|t| {
if t.kind() == TagKind::SingleLetter(SingleLetterTag::lowercase(Alphabet::X)) {
t.content()
} else {
None
}
}) {
match hex::decode(x_tag) {
Ok(hash) => hash,
Err(_) => return BlossomResponse::error("Invalid file hash in x tag"),
}
} else {
return BlossomResponse::error("Missing file hash in x tag");
};
// Verify the reported file exists
match db.get_file(&file_sha256).await {
Ok(Some(_)) => {} // File exists, continue
Ok(None) => return BlossomResponse::error("File not found"),
Err(e) => return BlossomResponse::error(format!("Failed to check file: {}", e)),
}
// Get or create the reporter user
let reporter_id = match db.upsert_user(&auth.event.pubkey.to_bytes().to_vec()).await {
Ok(user_id) => user_id,
Err(e) => return BlossomResponse::error(format!("Failed to get user: {}", e)),
};
// Store the report (the database will handle duplicate prevention via unique index)
match db
.add_report(&file_sha256, reporter_id, &data.as_json())
.await
{
Ok(()) => BlossomResponse::Generic(BlossomGenericResponse {
status: Status::Ok,
message: Some("Report submitted successfully".to_string()),
}),
Err(e) => {
if e.to_string().contains("Duplicate entry") {
BlossomResponse::error("You have already reported this file")
} else {
BlossomResponse::error(format!("Failed to submit report: {}", e))
}
}
}
}
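`report_file` validates the `x` tag by hex-decoding it into the reported file's SHA-256. A self-contained sketch of that validation (the handler uses nostr's `hex` helper; the explicit 64-character length check here is an extra guard this sketch adds, mirroring the 32-byte check done elsewhere, e.g. in `get_blob_thumb`):

```rust
// Minimal hex decode + length check for an "x" tag value.
fn decode_sha256(s: &str) -> Option<Vec<u8>> {
    if s.len() != 64 {
        return None; // 32 bytes => 64 hex characters
    }
    let mut out = Vec::with_capacity(32);
    for pair in s.as_bytes().chunks(2) {
        let hi = (pair[0] as char).to_digit(16)?;
        let lo = (pair[1] as char).to_digit(16)?;
        out.push((hi * 16 + lo) as u8);
    }
    Some(out)
}

fn main() {
    assert_eq!(decode_sha256(&"00".repeat(32)).map(|v| v.len()), Some(32));
    assert!(decode_sha256("zz").is_none()); // wrong length
    assert!(decode_sha256(&"g".repeat(64)).is_none()); // non-hex characters
    println!("hash validation ok");
}
```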



@ -1,21 +1,25 @@
use crate::db::{Database, FileUpload};
use crate::filesystem::FileStore;
#[cfg(feature = "media-compression")]
use crate::processing::WebpProcessor;
pub use crate::routes::admin::admin_routes;
#[cfg(feature = "blossom")]
pub use crate::routes::blossom::blossom_routes;
#[cfg(feature = "nip96")]
pub use crate::routes::nip96::nip96_routes;
use crate::settings::Settings;
use crate::void_file::VoidFile;
use anyhow::Error;
use http_range_header::{parse_range_header, EndPosition, StartPosition};
use log::{debug, warn};
use anyhow::{Error, Result};
use http_range_header::{
parse_range_header, EndPosition, StartPosition, SyntacticallyCorrectRange,
};
use log::warn;
use nostr::Event;
use rocket::fs::NamedFile;
use rocket::http::{ContentType, Header, Status};
use rocket::response::Responder;
use rocket::serde::Serialize;
use rocket::{Request, Response, State};
use std::env::temp_dir;
use std::io::SeekFrom;
use std::ops::Range;
use std::pin::{pin, Pin};
@ -24,12 +28,13 @@ use std::task::{Context, Poll};
use tokio::fs::File;
use tokio::io::{AsyncRead, AsyncSeek, ReadBuf};
mod admin;
#[cfg(feature = "blossom")]
mod blossom;
#[cfg(feature = "nip96")]
mod nip96;
mod admin;
#[cfg(feature = "payments")]
pub mod payment;
pub struct FilePayload {
pub file: File,
@ -56,28 +61,40 @@ struct PagedResult<T> {
impl Nip94Event {
pub fn from_upload(settings: &Settings, upload: &FileUpload) -> Self {
let hex_id = hex::encode(&upload.id);
let ext = if upload.mime_type != "application/octet-stream" {
mime2ext::mime2ext(&upload.mime_type)
} else {
None
};
let mut tags = vec![
vec![
"url".to_string(),
format!(
"{}/{}{}",
&settings.public_url,
&hex_id,
mime2ext::mime2ext(&upload.mime_type)
.map(|m| format!(".{m}"))
.unwrap_or("".to_string())
),
format!("{}/{}.{}", &settings.public_url, &hex_id, ext.unwrap_or("")),
],
vec!["x".to_string(), hex_id],
vec!["x".to_string(), hex_id.clone()],
vec!["m".to_string(), upload.mime_type.clone()],
vec!["size".to_string(), upload.size.to_string()],
];
if upload.mime_type.starts_with("image/") || upload.mime_type.starts_with("video/") {
tags.push(vec![
"thumb".to_string(),
format!("{}/thumb/{}.webp", &settings.public_url, &hex_id),
]);
}
if let Some(bh) = &upload.blur_hash {
tags.push(vec!["blurhash".to_string(), bh.clone()]);
}
if let (Some(w), Some(h)) = (upload.width, upload.height) {
tags.push(vec!["dim".to_string(), format!("{}x{}", w, h)])
}
if let Some(d) = &upload.duration {
tags.push(vec!["duration".to_string(), d.to_string()]);
}
if let Some(b) = &upload.bitrate {
tags.push(vec!["bitrate".to_string(), b.to_string()]);
}
#[cfg(feature = "labels")]
for l in &upload.labels {
let val = if l.label.contains(',') {
@ -104,18 +121,47 @@ struct RangeBody {
range_end: u64,
current_offset: u64,
poll_complete: bool,
file_size: u64,
}
const MAX_UNBOUNDED_RANGE: u64 = 1024 * 1024;
impl RangeBody {
pub fn new(file: File, range: Range<u64>) -> Self {
pub fn new(file: File, file_size: u64, range: Range<u64>) -> Self {
Self {
file,
file_size,
range_start: range.start,
range_end: range.end,
current_offset: 0,
poll_complete: false,
}
}
pub fn get_range(file_size: u64, header: &SyntacticallyCorrectRange) -> Range<u64> {
let range_start = match header.start {
StartPosition::Index(i) => i,
StartPosition::FromLast(i) => file_size.saturating_sub(i),
};
let range_end = match header.end {
EndPosition::Index(i) => i,
EndPosition::LastByte => (file_size - 1).min(range_start + MAX_UNBOUNDED_RANGE),
};
range_start..range_end
}
pub fn get_headers(&self) -> Vec<Header<'static>> {
let r_len = (self.range_end - self.range_start) + 1;
vec![
Header::new("content-length", r_len.to_string()),
Header::new(
"content-range",
format!(
"bytes {}-{}/{}",
self.range_start, self.range_end, self.file_size
),
),
]
}
}
impl AsyncRead for RangeBody {
@ -125,7 +171,7 @@ impl AsyncRead for RangeBody {
buf: &mut ReadBuf<'_>,
) -> Poll<std::io::Result<()>> {
let range_start = self.range_start + self.current_offset;
let range_len = self.range_end - range_start;
let range_len = self.range_end.saturating_sub(range_start) + 1;
let bytes_to_read = buf.remaining().min(range_len as usize) as u64;
if bytes_to_read == 0 {
@ -170,10 +216,13 @@ impl AsyncRead for RangeBody {
impl<'r> Responder<'r, 'static> for FilePayload {
fn respond_to(self, request: &'r Request<'_>) -> rocket::response::Result<'static> {
let mut response = Response::new();
response.set_header(Header::new("cache-control", "max-age=31536000, immutable"));
// handle ranges
#[cfg(feature = "ranges")]
{
// only use range response for files > 1MiB
if self.info.size < MAX_UNBOUNDED_RANGE {
response.set_sized_body(None, self.file);
} else {
response.set_header(Header::new("accept-ranges", "bytes"));
if let Some(r) = request.headers().get("range").next() {
if let Ok(ranges) = parse_range_header(r) {
@ -181,39 +230,22 @@ impl<'r> Responder<'r, 'static> for FilePayload {
warn!("Multipart ranges are not supported, fallback to non-range request");
response.set_streamed_body(self.file);
} else {
const MAX_UNBOUNDED_RANGE: u64 = 1024 * 1024;
let single_range = ranges.ranges.first().unwrap();
let range_start = match single_range.start {
StartPosition::Index(i) => i,
StartPosition::FromLast(i) => self.info.size - i,
};
let range_end = match single_range.end {
EndPosition::Index(i) => i,
EndPosition::LastByte => {
(range_start + MAX_UNBOUNDED_RANGE).min(self.info.size)
}
};
let r_len = range_end - range_start;
let r_body = RangeBody::new(self.file, range_start..range_end);
let range = RangeBody::get_range(self.info.size, single_range);
let r_body = RangeBody::new(self.file, self.info.size, range.clone());
response.set_status(Status::PartialContent);
response.set_header(Header::new("content-length", r_len.to_string()));
response.set_header(Header::new(
"content-range",
format!("bytes {}-{}/{}", range_start, range_end - 1, self.info.size),
));
let headers = r_body.get_headers();
for h in headers {
response.set_header(h);
}
response.set_streamed_body(Box::pin(r_body));
}
}
} else {
response.set_streamed_body(self.file);
response.set_sized_body(None, self.file);
}
}
#[cfg(not(feature = "ranges"))]
{
response.set_streamed_body(self.file);
response.set_header(Header::new("content-length", self.info.size.to_string()));
}
if let Ok(ct) = ContentType::from_str(&self.info.mime_type) {
response.set_header(ct);
@ -352,26 +384,91 @@ pub async fn head_blob(sha256: &str, fs: &State<FileStore>) -> Status {
}
}
/// Legacy URL redirect for void.cat uploads
#[rocket::get("/d/<id>")]
pub async fn void_cat_redirect(id: &str, settings: &State<Settings>) -> Option<NamedFile> {
let id = if id.contains(".") {
id.split('.').next().unwrap()
/// Generate thumbnail for image / video
#[cfg(feature = "media-compression")]
#[rocket::get("/thumb/<sha256>")]
pub async fn get_blob_thumb(
sha256: &str,
fs: &State<FileStore>,
db: &State<Database>,
) -> Result<FilePayload, Status> {
let sha256 = if sha256.contains(".") {
sha256.split('.').next().unwrap()
} else {
id
sha256
};
if let Some(base) = &settings.void_cat_files {
let uuid =
uuid::Uuid::from_slice_le(nostr::bitcoin::base58::decode(id).unwrap().as_slice())
.unwrap();
let f = base.join(VoidFile::map_to_path(&uuid));
debug!("Legacy file map: {} => {}", id, f.display());
if let Ok(f) = NamedFile::open(f).await {
Some(f)
} else {
None
}
let id = if let Ok(i) = hex::decode(sha256) {
i
} else {
None
return Err(Status::NotFound);
};
if id.len() != 32 {
return Err(Status::NotFound);
}
let info = if let Ok(Some(info)) = db.get_file(&id).await {
info
} else {
return Err(Status::NotFound);
};
if !(info.mime_type.starts_with("image/") || info.mime_type.starts_with("video/")) {
return Err(Status::NotFound);
}
let file_path = fs.get(&id);
let mut thumb_file = temp_dir().join(format!("thumb_{}", sha256));
thumb_file.set_extension("webp");
if !thumb_file.exists() {
let mut p = WebpProcessor::new();
if p.thumbnail(&file_path, &thumb_file).is_err() {
return Err(Status::InternalServerError);
}
};
if let Ok(f) = File::open(&thumb_file).await {
Ok(FilePayload {
file: f,
info: FileUpload {
size: thumb_file.metadata().unwrap().len(),
mime_type: "image/webp".to_string(),
..info
},
})
} else {
Err(Status::NotFound)
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_ranges() -> Result<()> {
let size = 16482469;
let req = parse_range_header("bytes=0-1023")?;
let r = RangeBody::get_range(size, req.ranges.first().unwrap());
assert_eq!(r.start, 0);
assert_eq!(r.end, 1023);
let req = parse_range_header("bytes=16482467-")?;
let r = RangeBody::get_range(size, req.ranges.first().unwrap());
assert_eq!(r.start, 16482467);
assert_eq!(r.end, 16482468);
let req = parse_range_header("bytes=-10")?;
let r = RangeBody::get_range(size, req.ranges.first().unwrap());
assert_eq!(r.start, 16482459);
assert_eq!(r.end, 16482468);
let req = parse_range_header("bytes=-16482470")?;
let r = RangeBody::get_range(size, req.ranges.first().unwrap());
assert_eq!(r.start, 0);
assert_eq!(r.end, MAX_UNBOUNDED_RANGE);
Ok(())
}
}
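The tests above exercise `RangeBody::get_range`, which resolves a parsed `Range` header into inclusive byte offsets and clamps open-ended ranges to `MAX_UNBOUNDED_RANGE`. A dependency-free sketch with local stand-ins for the `http_range_header` types (this sketch also uses `saturating_sub` to guard the empty-file edge case, which the code above does not):

```rust
// Stand-ins for http_range_header's StartPosition / EndPosition.
enum Start { Index(u64), FromLast(u64) }
enum End { Index(u64), LastByte }

const MAX_UNBOUNDED_RANGE: u64 = 1024 * 1024;

// Mirrors RangeBody::get_range: inclusive start..end byte offsets.
fn get_range(file_size: u64, start: Start, end: End) -> std::ops::Range<u64> {
    let range_start = match start {
        Start::Index(i) => i,
        Start::FromLast(i) => file_size.saturating_sub(i),
    };
    let range_end = match end {
        End::Index(i) => i,
        // open-ended: serve at most MAX_UNBOUNDED_RANGE bytes
        End::LastByte => file_size.saturating_sub(1).min(range_start + MAX_UNBOUNDED_RANGE),
    };
    range_start..range_end
}

fn main() {
    let size = 16_482_469u64;
    // "bytes=0-1023"
    assert_eq!(get_range(size, Start::Index(0), End::Index(1023)), 0..1023);
    // "bytes=16482467-": open-ended near the end of the file
    assert_eq!(
        get_range(size, Start::Index(16_482_467), End::LastByte),
        16_482_467..16_482_468
    );
    // "bytes=-10": the last ten bytes
    assert_eq!(
        get_range(size, Start::FromLast(10), End::LastByte),
        16_482_459..16_482_468
    );
    println!("range checks ok");
}
```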



@ -174,12 +174,8 @@ async fn upload(
settings: &State<Settings>,
form: Form<Nip96Form<'_>>,
) -> Nip96Response {
if let Some(size) = auth.content_length {
if size > settings.max_upload_bytes {
return Nip96Response::error("File too large");
}
}
if form.size > settings.max_upload_bytes {
let upload_size = auth.content_length.or(Some(form.size)).unwrap_or(0);
if upload_size > 0 && upload_size > settings.max_upload_bytes {
return Nip96Response::error("File too large");
}
let file = match form.file.open().await {
@ -193,7 +189,8 @@ async fn upload(
}
// account for upload speeds as slow as 1MB/s (8 Mbps)
let mbs = form.size / 1.megabytes().as_u64();
let size_for_timing = if upload_size > 0 { upload_size } else { form.size };
let mbs = size_for_timing / 1.megabytes().as_u64();
let max_time = 60.max(mbs);
if auth.event.created_at < Timestamp::now().sub(Duration::from_secs(max_time)) {
return Nip96Response::error("Auth event timestamp out of range");
@ -207,12 +204,38 @@ async fn upload(
}
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
// check quota
#[cfg(feature = "payments")]
{
let free_quota = settings.payments.as_ref()
.and_then(|p| p.free_quota_bytes)
.unwrap_or(104857600); // Default to 100MB
if upload_size > 0 {
match db.check_user_quota(&pubkey_vec, upload_size, free_quota).await {
Ok(false) => return Nip96Response::error("Upload would exceed quota"),
Err(_) => return Nip96Response::error("Failed to check quota"),
Ok(true) => {} // Quota check passed
}
}
}
let upload = match fs
.put(file, content_type, !form.no_transform.unwrap_or(false))
.await
{
Ok(FileSystemResult::NewFile(blob)) => {
let mut upload: FileUpload = (&blob).into();
// Validate file size after upload if no pre-upload size was available
if upload_size == 0 && upload.size > settings.max_upload_bytes {
// Clean up the uploaded file
if let Err(e) = tokio::fs::remove_file(fs.get(&upload.id)).await {
log::warn!("Failed to cleanup oversized file: {}", e);
}
return Nip96Response::error("File too large");
}
upload.name = form.caption.map(|cap| cap.to_string());
upload.alt = form.alt.as_ref().map(|s| s.to_string());
upload
@ -222,7 +245,7 @@ async fn upload(
_ => return Nip96Response::error("File not found"),
},
Err(e) => {
error!("{}", e.to_string());
error!("{}", e);
return Nip96Response::error(&format!("Could not save file: {}", e));
}
};
@ -232,8 +255,34 @@ async fn upload(
Err(e) => return Nip96Response::error(&format!("Could not save user: {}", e)),
};
// Post-upload quota check if we didn't have size information before upload
#[cfg(feature = "payments")]
if upload_size == 0 {
let free_quota = settings.payments.as_ref()
.and_then(|p| p.free_quota_bytes)
.unwrap_or(104857600); // Default to 100MB
match db.check_user_quota(&pubkey_vec, upload.size, free_quota).await {
Ok(false) => {
// Clean up the uploaded file if quota exceeded
if let Err(e) = tokio::fs::remove_file(fs.get(&upload.id)).await {
log::warn!("Failed to cleanup quota-exceeding file: {}", e);
}
return Nip96Response::error("Upload would exceed quota");
}
Err(_) => {
// Clean up on quota check error
if let Err(e) = tokio::fs::remove_file(fs.get(&upload.id)).await {
log::warn!("Failed to cleanup file after quota check error: {}", e);
}
return Nip96Response::error("Failed to check quota");
}
Ok(true) => {} // Quota check passed
}
}
if let Err(e) = db.add_file(&upload, user_id).await {
error!("{}", e.to_string());
error!("{}", e);
return Nip96Response::error(&format!("Could not save file (db): {}", e));
}
Nip96Response::UploadResult(Json(Nip96UploadResult::from_upload(settings, &upload)))

src/routes/payment.rs Normal file

@ -0,0 +1,131 @@
use crate::auth::nip98::Nip98Auth;
use crate::db::{Database, Payment};
use crate::payments::{Currency, PaymentAmount, PaymentInterval, PaymentUnit};
use crate::settings::Settings;
use chrono::{Months, Utc};
use fedimint_tonic_lnd::lnrpc::Invoice;
use fedimint_tonic_lnd::Client;
use log::{error, info};
use rocket::serde::json::Json;
use rocket::{routes, Route, State};
use serde::{Deserialize, Serialize};
use std::ops::{Add, Deref};
pub fn routes() -> Vec<Route> {
routes![get_payment, req_payment]
}
#[derive(Deserialize, Serialize)]
struct PaymentInfo {
/// Billing quota metric
pub unit: PaymentUnit,
/// Billing interval for the units (GB/mo, GB egress/day, etc.)
pub interval: PaymentInterval,
/// Value amount of payment
pub cost: PaymentAmount,
}
#[derive(Deserialize, Serialize)]
struct PaymentRequest {
/// Number of units requested to make payment
pub units: f32,
/// Quantity of orders to make
pub quantity: u16,
}
#[derive(Deserialize, Serialize)]
struct PaymentResponse {
pub pr: String,
}
#[rocket::get("/payment")]
async fn get_payment(settings: &State<Settings>) -> Option<Json<PaymentInfo>> {
settings.payments.as_ref().map(|p| {
Json::from(PaymentInfo {
unit: p.unit.clone(),
interval: p.interval.clone(),
cost: p.cost.clone(),
})
})
}
#[rocket::post("/payment", data = "<req>", format = "json")]
async fn req_payment(
auth: Nip98Auth,
db: &State<Database>,
settings: &State<Settings>,
lnd: &State<Client>,
req: Json<PaymentRequest>,
) -> Result<Json<PaymentResponse>, String> {
let cfg = if let Some(p) = &settings.payments {
p
} else {
return Err("Payment not enabled, missing configuration option(s)".to_string());
};
let btc_amount = match cfg.cost.currency {
Currency::BTC => cfg.cost.amount,
_ => return Err("Currency not supported".to_string()),
};
let amount = btc_amount * req.units * req.quantity as f32;
let pubkey_vec = auth.event.pubkey.to_bytes().to_vec();
let uid = db
.upsert_user(&pubkey_vec)
.await
.map_err(|_| "Failed to get user account".to_string())?;
let mut lnd = lnd.deref().clone();
let c = lnd.lightning();
let msat = (amount * 1e11f32) as u64;
let memo = format!(
"{}x {} {} for {}",
req.quantity, req.units, cfg.unit, auth.event.pubkey
);
info!("Requesting {} msats: {}", msat, memo);
let invoice = c
.add_invoice(Invoice {
value_msat: msat as i64,
memo,
..Default::default()
})
.await
.map_err(|e| e.message().to_string())?;
let days_value = match cfg.interval {
PaymentInterval::Day(d) => d as u64,
PaymentInterval::Month(m) => {
let now = Utc::now();
(now.add(Months::new(m as u32)) - now).num_days() as u64
}
PaymentInterval::Year(y) => {
let now = Utc::now();
(now.add(Months::new(12 * y as u32)) - now).num_days() as u64
}
};
let record = Payment {
payment_hash: invoice.get_ref().r_hash.clone(),
user_id: uid,
created: Default::default(),
amount: msat,
is_paid: false,
days_value,
size_value: cfg.unit.to_size(req.units),
settle_index: None,
rate: None,
};
if let Err(e) = db.insert_payment(&record).await {
error!("Failed to insert payment: {}", e);
return Err("Failed to insert payment".to_string());
}
Ok(Json(PaymentResponse {
pr: invoice.get_ref().payment_request.clone(),
}))
}
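`req_payment` prices the invoice as `cost × units × quantity` BTC and converts to millisatoshis with `(amount * 1e11) as u64` (1 BTC = 10^8 sats = 10^11 msats). A sketch of that conversion; note it uses `f64` and `round()` where the handler uses `f32` and truncation, which can lose precision on larger amounts:

```rust
// 1 BTC = 100_000_000 sats = 100_000_000_000 msats.
fn btc_to_msat(amount_btc: f64) -> u64 {
    (amount_btc * 1e11).round() as u64
}

// Mirrors the pricing in req_payment: cost per unit × units × quantity.
fn invoice_msats(cost_btc_per_unit: f64, units: f64, quantity: u64) -> u64 {
    btc_to_msat(cost_btc_per_unit * units * quantity as f64)
}

fn main() {
    // e.g. a hypothetical 0.00001 BTC per GB/month, 2 GB for 3 months
    assert_eq!(btc_to_msat(0.00001), 1_000_000); // 1000 sats in msats
    assert_eq!(invoice_msats(0.00001, 2.0, 3), 6_000_000);
    println!("msat conversion ok");
}
```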


@ -1,3 +1,5 @@
#[cfg(feature = "payments")]
use crate::payments::{Currency, PaymentAmount, PaymentInterval, PaymentUnit};
use serde::{Deserialize, Serialize};
use std::path::PathBuf;
@ -30,11 +32,12 @@ pub struct Settings {
/// Analytics tracking
pub plausible_url: Option<String>,
#[cfg(feature = "void-cat-redirects")]
pub void_cat_database: Option<String>,
/// Path to void.cat uploads (files-v2)
pub void_cat_files: Option<PathBuf>,
#[cfg(feature = "payments")]
/// Payment options for paid storage
pub payments: Option<PaymentConfig>,
}
#[derive(Debug, Clone, Serialize, Deserialize)]
@ -42,3 +45,33 @@ pub struct VitModelConfig {
pub model: PathBuf,
pub config: PathBuf,
}
#[cfg(feature = "payments")]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PaymentConfig {
/// LND connection details
pub lnd: LndConfig,
/// Pricing per unit
pub cost: PaymentAmount,
/// What metric to bill payments on
pub unit: PaymentUnit,
/// Billing interval time per unit
pub interval: PaymentInterval,
/// Fiat base currency to store exchange rates along with invoice
pub fiat: Option<Currency>,
/// Free quota in bytes for users without payments (default: 100MB)
pub free_quota_bytes: Option<u64>,
}
#[cfg(feature = "payments")]
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct LndConfig {
pub endpoint: String,
pub tls: PathBuf,
pub macaroon: PathBuf,
}


@ -16,12 +16,14 @@
"@snort/system-react": "^1.5.1",
"classnames": "^2.5.1",
"react": "^18.3.1",
"react-dom": "^18.3.1"
"react-dom": "^18.3.1",
"react-router-dom": "^7.6.2"
},
"devDependencies": {
"@eslint/js": "^9.9.0",
"@types/react": "^18.3.3",
"@types/react-dom": "^18.3.0",
"@types/react-router-dom": "^5.3.3",
"@vitejs/plugin-react": "^4.3.1",
"autoprefixer": "^10.4.20",
"eslint": "^9.9.0",


@ -1,12 +1,23 @@
import { BrowserRouter as Router, Routes, Route } from "react-router-dom";
import Header from "./views/header";
import Upload from "./views/upload";
import Admin from "./views/admin";
function App() {
return (
<div className="flex flex-col gap-4 mx-auto mt-4 max-w-[1920px] px-10">
<Header />
<Upload />
</div>
<Router>
<div className="min-h-screen bg-gray-900">
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
<Header />
<main className="py-8">
<Routes>
<Route path="/" element={<Upload />} />
<Route path="/admin" element={<Admin />} />
</Routes>
</main>
</div>
</div>
</Router>
);
}


@ -22,12 +22,12 @@ export default function Button({
}
return (
<button
className={`py-2 px-4 rounded-md border-0 text-sm font-semibold bg-neutral-700 hover:bg-neutral-600 ${className} ${props.disabled || loading ? "opacity-50" : ""}`}
className={`${className} ${props.disabled || loading ? "opacity-50 cursor-not-allowed" : ""}`}
onClick={doClick}
{...props}
disabled={loading || (props.disabled ?? false)}
>
{children}
{loading ? "..." : children}
</button>
);
}


@ -0,0 +1,170 @@
import { useState, useEffect } from "react";
import Button from "./button";
import {
PaymentInfo,
PaymentRequest,
Route96,
AdminSelf,
} from "../upload/admin";
interface PaymentFlowProps {
route96: Route96;
onPaymentRequested?: (paymentRequest: string) => void;
userInfo?: AdminSelf;
}
export default function PaymentFlow({
route96,
onPaymentRequested,
userInfo,
}: PaymentFlowProps) {
const [paymentInfo, setPaymentInfo] = useState<PaymentInfo | null>(null);
const [gigabytes, setGigabytes] = useState<number>(1);
const [months, setMonths] = useState<number>(1);
const [paymentRequest, setPaymentRequest] = useState<string>("");
const [error, setError] = useState<string>("");
const [loading, setLoading] = useState(false);
useEffect(() => {
if (paymentInfo === null) {
loadPaymentInfo();
}
}, [paymentInfo]);
// Set default gigabytes to user's current quota
useEffect(() => {
if (userInfo?.quota && userInfo.quota > 0) {
// Convert from bytes to GiB (1024^3 bytes)
const currentQuotaGB = Math.round(userInfo.quota / (1024 * 1024 * 1024));
if (currentQuotaGB > 0) {
setGigabytes(currentQuotaGB);
}
}
}, [userInfo]);
async function loadPaymentInfo() {
try {
const info = await route96.getPaymentInfo();
setPaymentInfo(info);
} catch (e) {
if (e instanceof Error) {
setError(e.message);
} else {
setError("Failed to load payment info");
}
}
}
async function requestPayment() {
if (!paymentInfo) return;
setLoading(true);
setError("");
try {
const request: PaymentRequest = { units: gigabytes, quantity: months };
const response = await route96.requestPayment(request);
setPaymentRequest(response.pr);
onPaymentRequested?.(response.pr);
} catch (e) {
if (e instanceof Error) {
setError(e.message);
} else {
setError("Failed to request payment");
}
} finally {
setLoading(false);
}
}
if (error && !paymentInfo) {
return <div className="text-red-400">Payment not available: {error}</div>;
}
if (!paymentInfo) {
return <div className="text-gray-400">Loading payment info...</div>;
}
const totalCostBTC = paymentInfo.cost.amount * gigabytes * months;
const totalCostSats = Math.round(totalCostBTC * 100000000); // Convert BTC to sats
function formatStorageUnit(unit: string): string {
if (
unit.toLowerCase().includes("gbspace") ||
unit.toLowerCase().includes("gb")
) {
return "GB";
}
return unit;
}
return (
<div className="card">
<h3 className="text-lg font-bold mb-4">Top Up Account</h3>
<div className="space-y-4 mb-6">
<div className="text-center">
<div className="text-2xl font-bold text-gray-100 mb-2">
{gigabytes} {formatStorageUnit(paymentInfo.unit)} for {months} month
{months > 1 ? "s" : ""}
</div>
<div className="text-lg text-blue-400 font-semibold">
{totalCostSats.toLocaleString()} sats
</div>
</div>
<div className="grid grid-cols-2 gap-4">
<div>
<label className="block text-sm font-medium mb-2 text-gray-300">
Storage ({formatStorageUnit(paymentInfo.unit)})
</label>
<input
type="number"
min="1"
step="1"
value={gigabytes}
onChange={(e) => setGigabytes(parseInt(e.target.value) || 1)}
className="input w-full text-center text-lg"
/>
</div>
<div>
<label className="block text-sm font-medium mb-2 text-gray-300">
Duration (months)
</label>
<input
type="number"
min="1"
step="1"
value={months}
onChange={(e) => setMonths(parseInt(e.target.value) || 1)}
className="input w-full text-center text-lg"
/>
</div>
</div>
</div>
<Button
onClick={requestPayment}
disabled={loading || gigabytes <= 0 || months <= 0}
className="btn-primary w-full mb-4"
>
{loading ? "Processing..." : "Generate Payment Request"}
</Button>
{error && <div className="text-red-400 text-sm mb-4">{error}</div>}
{paymentRequest && (
<div className="bg-gray-800 p-4 rounded-lg border border-gray-700">
<div className="text-sm font-medium mb-2">Lightning Invoice:</div>
<div className="font-mono text-xs break-all bg-gray-900 p-2 rounded">
{paymentRequest}
</div>
<div className="text-xs text-gray-400 mt-2">
Copy this invoice to your Lightning wallet to complete payment
</div>
</div>
)}
</div>
);
}


@ -1,20 +1,47 @@
import { hexToBech32 } from "@snort/shared";
import { NostrLink } from "@snort/system";
import { useUserProfile } from "@snort/system-react";
import { useMemo } from "react";
export default function Profile({ link }: { link: NostrLink }) {
const profile = useUserProfile(link.id);
export default function Profile({
link,
size,
showName,
}: {
link: NostrLink;
size?: number;
showName?: boolean;
}) {
const linkId = useMemo(() => link.id, [link.id]);
const profile = useUserProfile(linkId);
const s = size ?? 40;
return (
<div className="flex gap-2 items-center">
<a
className="flex gap-2 items-center"
href={`https://snort.social/${link.encode()}`}
target="_blank"
>
<img
src={profile?.picture}
className="rounded-full w-12 h-12 object-fit object-center"
src={
profile?.picture ||
`https://nostr.api.v0l.io/api/v1/avatar/cyberpunks/${link.id}`
}
alt={profile?.display_name || profile?.name || "User avatar"}
width={s}
height={s}
className="rounded-full object-fit object-center"
onError={(e) => {
const target = e.target as HTMLImageElement;
target.src = `https://nostr.api.v0l.io/api/v1/avatar/cyberpunks/${link.id}`;
}}
/>
<div>
{profile?.display_name ??
profile?.name ??
hexToBech32("npub", link.id).slice(0, 12)}
</div>
</div>
{(showName ?? true) && (
<div>
{profile?.display_name ??
profile?.name ??
hexToBech32("npub", link.id).slice(0, 12)}
</div>
)}
</a>
);
}


@ -1,10 +1,16 @@
import { EventPublisher, Nip7Signer } from "@snort/system";
import { useMemo } from "react";
import useLogin from "./login";
export default function usePublisher() {
const login = useLogin();
switch (login?.type) {
case "nip7":
return new EventPublisher(new Nip7Signer(), login.pubkey);
}
return useMemo(() => {
switch (login?.type) {
case "nip7":
return new EventPublisher(new Nip7Signer(), login.pubkey);
default:
return undefined;
}
}, [login?.type, login?.pubkey]);
}

View File

@@ -4,9 +4,49 @@
html,
body {
@apply bg-black text-white;
@apply bg-gray-900 text-gray-100 font-sans;
}
[data-theme="light"] {
@apply bg-gray-50 text-gray-900;
}
hr {
@apply border-neutral-500
}
@apply border-gray-700;
}
[data-theme="light"] hr {
@apply border-gray-200;
}
.card {
@apply bg-gray-800 rounded-lg shadow-sm border border-gray-700 p-6;
}
[data-theme="light"] .card {
@apply bg-white border-gray-200;
}
.btn-primary {
@apply bg-blue-600 hover:bg-blue-700 text-white px-4 py-2 rounded-lg font-medium transition-colors duration-200;
}
.btn-secondary {
@apply bg-gray-700 hover:bg-gray-600 text-gray-200 px-4 py-2 rounded-lg font-medium transition-colors duration-200;
}
[data-theme="light"] .btn-secondary {
@apply bg-gray-100 hover:bg-gray-200 text-gray-700;
}
.btn-danger {
@apply bg-red-600 hover:bg-red-700 text-white px-3 py-1.5 rounded-md text-sm font-medium transition-colors duration-200;
}
.input {
@apply border border-gray-600 bg-gray-700 text-gray-100 rounded-lg px-3 py-2 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-transparent;
}
[data-theme="light"] .input {
@apply border-gray-300 bg-white text-gray-900;
}

View File

@@ -13,6 +13,11 @@ class LoginStore extends ExternalStore<LoginSession | undefined> {
this.notifyChange();
}
logout() {
this.#session = undefined;
this.notifyChange();
}
takeSnapshot(): LoginSession | undefined {
return this.#session ? { ...this.#session } : undefined;
}

File diff suppressed because one or more lines are too long

View File

@@ -2,7 +2,44 @@ import { base64 } from "@scure/base";
import { throwIfOffline } from "@snort/shared";
import { EventKind, EventPublisher, NostrEvent } from "@snort/system";
export interface AdminSelf { is_admin: boolean, file_count: number, total_size: number }
export interface AdminSelf {
is_admin: boolean;
file_count: number;
total_size: number;
paid_until?: number;
quota?: number;
free_quota?: number;
total_available_quota?: number;
}
export interface Report {
id: number;
file_id: string;
reporter_id: number;
event_json: string;
created: string;
reviewed: boolean;
}
export interface PaymentInfo {
unit: string;
interval: {
[key: string]: number;
};
cost: {
currency: string;
amount: number;
};
}
export interface PaymentRequest {
units: number;
quantity: number;
}
export interface PaymentResponse {
pr: string;
}
export class Route96 {
constructor(
@@ -14,14 +51,13 @@ export class Route96 {
async getSelf() {
const rsp = await this.#req("admin/self", "GET");
const data =
await this.#handleResponse<AdminResponse<AdminSelf>>(rsp);
const data = await this.#handleResponse<AdminResponse<AdminSelf>>(rsp);
return data;
}
async listFiles(page = 0, count = 10) {
async listFiles(page = 0, count = 10, mime: string | undefined) {
const rsp = await this.#req(
`admin/files?page=${page}&count=${count}`,
`admin/files?page=${page}&count=${count}${mime ? `&mime_type=${encodeURIComponent(mime)}` : ""}`,
"GET",
);
const data = await this.#handleResponse<AdminResponseFileList>(rsp);
@@ -32,6 +68,55 @@ export class Route96 {
};
}
async listReports(page = 0, count = 10) {
const rsp = await this.#req(
`admin/reports?page=${page}&count=${count}`,
"GET",
);
const data = await this.#handleResponse<AdminResponseReportList>(rsp);
return {
...data,
...data.data,
files: data.data.files,
};
}
async acknowledgeReport(reportId: number) {
const rsp = await this.#req(`admin/reports/${reportId}`, "DELETE");
const data = await this.#handleResponse<AdminResponse<void>>(rsp);
return data;
}
async getPaymentInfo() {
const rsp = await this.#req("payment", "GET");
if (rsp.ok) {
return (await rsp.json()) as PaymentInfo;
} else {
const text = await rsp.text();
try {
const obj = JSON.parse(text) as AdminResponseBase;
throw new Error(obj.message);
} catch {
throw new Error(`Payment info failed: ${text}`);
}
}
}
async requestPayment(request: PaymentRequest) {
const rsp = await this.#req("payment", "POST", JSON.stringify(request));
if (rsp.ok) {
return (await rsp.json()) as PaymentResponse;
} else {
const text = await rsp.text();
try {
const obj = JSON.parse(text) as AdminResponseBase;
throw new Error(obj.message);
} catch {
throw new Error(`Payment request failed: ${text}`);
}
}
}
async #handleResponse<T extends AdminResponseBase>(rsp: Response) {
if (rsp.ok) {
return (await rsp.json()) as T;
@@ -61,13 +146,19 @@ export class Route96 {
};
const u = `${this.url}${path}`;
const headers: Record<string, string> = {
accept: "application/json",
authorization: await auth(u, method),
};
if (body && method !== "GET") {
headers["content-type"] = "application/json";
}
return await fetch(u, {
method,
body,
headers: {
accept: "application/json",
authorization: await auth(u, method),
},
headers,
});
}
}
@@ -87,3 +178,10 @@ export type AdminResponseFileList = AdminResponse<{
count: number;
files: Array<NostrEvent>;
}>;
export type AdminResponseReportList = AdminResponse<{
total: number;
page: number;
count: number;
files: Array<Report>;
}>;

View File

@@ -18,7 +18,17 @@ export class Blossom {
this.url = new URL(this.url).toString();
}
async upload(file: File) {
async #handleError(rsp: Response) {
const reason = rsp.headers.get("X-Reason") || rsp.headers.get("x-reason");
if (reason) {
throw new Error(reason);
} else {
const text = await rsp.text();
throw new Error(text);
}
}
async upload(file: File): Promise<BlobDescriptor> {
const hash = await window.crypto.subtle.digest(
"SHA-256",
await file.arrayBuffer(),
@@ -29,56 +39,63 @@ export class Blossom {
if (rsp.ok) {
return (await rsp.json()) as BlobDescriptor;
} else {
const text = await rsp.text();
throw new Error(text);
await this.#handleError(rsp);
throw new Error("Should not reach here");
}
}
async media(file: File) {
async media(file: File): Promise<BlobDescriptor> {
const hash = await window.crypto.subtle.digest(
"SHA-256",
await file.arrayBuffer(),
);
const tags = [["x", bytesToString("hex", new Uint8Array(hash))]];
const rsp = await this.#req("media", "PUT", "upload", file, tags);
const rsp = await this.#req("media", "PUT", "media", file, tags);
if (rsp.ok) {
return (await rsp.json()) as BlobDescriptor;
} else {
const text = await rsp.text();
throw new Error(text);
await this.#handleError(rsp);
throw new Error("Should not reach here");
}
}
async mirror(url: string) {
const rsp = await this.#req("mirror", "PUT", "mirror", JSON.stringify({ url }), undefined, {
"content-type": "application/json"
});
async mirror(url: string): Promise<BlobDescriptor> {
const rsp = await this.#req(
"mirror",
"PUT",
"mirror",
JSON.stringify({ url }),
undefined,
{
"content-type": "application/json",
},
);
if (rsp.ok) {
return (await rsp.json()) as BlobDescriptor;
} else {
const text = await rsp.text();
throw new Error(text);
await this.#handleError(rsp);
throw new Error("Should not reach here");
}
}
async list(pk: string) {
async list(pk: string): Promise<Array<BlobDescriptor>> {
const rsp = await this.#req(`list/${pk}`, "GET", "list");
if (rsp.ok) {
return (await rsp.json()) as Array<BlobDescriptor>;
} else {
const text = await rsp.text();
throw new Error(text);
await this.#handleError(rsp);
throw new Error("Should not reach here");
}
}
async delete(id: string) {
async delete(id: string): Promise<void> {
const tags = [["x", id]];
const rsp = await this.#req(id, "DELETE", "delete", undefined, tags);
if (!rsp.ok) {
const text = await rsp.text();
throw new Error(text);
await this.#handleError(rsp);
throw new Error("Should not reach here");
}
}

ui_src/src/views/admin.tsx Normal file (248 lines)
View File

@@ -0,0 +1,248 @@
import { useEffect, useState, useCallback } from "react";
import { Navigate } from "react-router-dom";
import Button from "../components/button";
import FileList from "./files";
import ReportList from "./reports";
import { Blossom } from "../upload/blossom";
import useLogin from "../hooks/login";
import usePublisher from "../hooks/publisher";
import { Nip96FileList } from "../upload/nip96";
import { AdminSelf, Route96, Report } from "../upload/admin";
export default function Admin() {
const [self, setSelf] = useState<AdminSelf>();
const [error, setError] = useState<string>();
const [adminListedFiles, setAdminListedFiles] = useState<Nip96FileList>();
const [reports, setReports] = useState<Report[]>();
const [reportPages, setReportPages] = useState<number>();
const [reportPage, setReportPage] = useState(0);
const [adminListedPage, setAdminListedPage] = useState(0);
const [mimeFilter, setMimeFilter] = useState<string>();
const [loading, setLoading] = useState(true);
const login = useLogin();
const pub = usePublisher();
const url =
import.meta.env.VITE_API_URL || `${location.protocol}//${location.host}`;
const listAllUploads = useCallback(
async (n: number) => {
if (!pub) return;
try {
setError(undefined);
const uploader = new Route96(url, pub);
const result = await uploader.listFiles(n, 50, mimeFilter);
setAdminListedFiles(result);
} catch (e) {
if (e instanceof Error) {
setError(e.message.length > 0 ? e.message : "Upload failed");
} else if (typeof e === "string") {
setError(e);
} else {
setError("List files failed");
}
}
},
[pub, url, mimeFilter],
);
const listReports = useCallback(
async (n: number) => {
if (!pub) return;
try {
setError(undefined);
const route96 = new Route96(url, pub);
const result = await route96.listReports(n, 10);
setReports(result.files);
setReportPages(Math.ceil(result.total / result.count));
} catch (e) {
if (e instanceof Error) {
setError(e.message.length > 0 ? e.message : "List reports failed");
} else if (typeof e === "string") {
setError(e);
} else {
setError("List reports failed");
}
}
},
[pub, url],
);
async function acknowledgeReport(reportId: number) {
if (!pub) return;
try {
setError(undefined);
const route96 = new Route96(url, pub);
await route96.acknowledgeReport(reportId);
await listReports(reportPage);
} catch (e) {
if (e instanceof Error) {
setError(
e.message.length > 0 ? e.message : "Acknowledge report failed",
);
} else if (typeof e === "string") {
setError(e);
} else {
setError("Acknowledge report failed");
}
}
}
async function deleteFile(id: string) {
if (!pub) return;
try {
setError(undefined);
const uploader = new Blossom(url, pub);
await uploader.delete(id);
} catch (e) {
if (e instanceof Error) {
setError(e.message.length > 0 ? e.message : "Upload failed");
} else if (typeof e === "string") {
setError(e);
} else {
setError("List files failed");
}
}
}
useEffect(() => {
if (pub && !self) {
const r96 = new Route96(url, pub);
r96
.getSelf()
.then((v) => {
setSelf(v.data);
setLoading(false);
})
.catch(() => {
setLoading(false);
});
}
}, [pub, self, url]);
useEffect(() => {
if (pub && self?.is_admin && !adminListedFiles) {
listAllUploads(adminListedPage);
}
}, [adminListedPage, pub, self?.is_admin, listAllUploads, adminListedFiles]);
useEffect(() => {
if (pub && self?.is_admin && !reports) {
listReports(reportPage);
}
}, [reportPage, pub, self?.is_admin, listReports, reports]);
if (loading) {
return (
<div className="flex justify-center items-center h-64">
<div className="text-lg text-gray-400">Loading...</div>
</div>
);
}
if (!login) {
return (
<div className="card max-w-md mx-auto text-center">
<h2 className="text-xl font-semibold mb-4">Authentication Required</h2>
<p className="text-gray-400">
Please log in to access the admin panel.
</p>
</div>
);
}
if (!self?.is_admin) {
return <Navigate to="/" replace />;
}
return (
<div className="space-y-8">
<div className="flex items-center justify-between">
<h1 className="text-3xl font-bold text-gray-100">Admin Panel</h1>
</div>
{error && (
<div className="bg-red-900/20 border border-red-800 text-red-400 px-4 py-3 rounded-lg">
{error}
</div>
)}
<div className="grid gap-8 lg:grid-cols-2">
<div className="card">
<h2 className="text-xl font-semibold mb-6">File Management</h2>
<div className="space-y-4">
<div>
<label className="block text-sm font-medium text-gray-300 mb-2">
Filter by MIME type
</label>
<select
className="input w-full"
value={mimeFilter || ""}
onChange={(e) => setMimeFilter(e.target.value || undefined)}
>
<option value="">All Files</option>
<option value="image/webp">WebP Images</option>
<option value="image/jpeg">JPEG Images</option>
<option value="image/jpg">JPG Images</option>
<option value="image/png">PNG Images</option>
<option value="image/gif">GIF Images</option>
<option value="video/mp4">MP4 Videos</option>
<option value="video/mov">MOV Videos</option>
</select>
</div>
<Button
onClick={() => listAllUploads(0)}
className="btn-primary w-full"
>
Load All Files
</Button>
</div>
</div>
<div className="card">
<h2 className="text-xl font-semibold mb-6">Reports Management</h2>
<Button onClick={() => listReports(0)} className="btn-primary w-full">
Load Reports
</Button>
</div>
</div>
{adminListedFiles && (
<div className="card">
<h2 className="text-xl font-semibold mb-6">All Files</h2>
<FileList
files={adminListedFiles.files}
pages={Math.ceil(adminListedFiles.total / adminListedFiles.count)}
page={adminListedFiles.page}
onPage={(x) => setAdminListedPage(x)}
onDelete={async (x) => {
await deleteFile(x);
await listAllUploads(adminListedPage);
}}
/>
</div>
)}
{reports && (
<div className="card">
<h2 className="text-xl font-semibold mb-6">Reports</h2>
<ReportList
reports={reports}
pages={reportPages}
page={reportPage}
onPage={(x) => setReportPage(x)}
onAcknowledge={acknowledgeReport}
onDeleteFile={async (fileId) => {
await deleteFile(fileId);
await listReports(reportPage);
}}
/>
</div>
)}
</div>
);
}

View File

@@ -1,7 +1,8 @@
import { NostrEvent } from "@snort/system";
import { NostrEvent, NostrLink } from "@snort/system";
import { useState } from "react";
import { FormatBytes } from "../const";
import classNames from "classnames";
import Profile from "../components/profile";
interface FileInfo {
id: string;
@@ -9,6 +10,7 @@ interface FileInfo {
name?: string;
type?: string;
size?: number;
uploader?: Array<string>;
}
export default function FileList({
@@ -26,19 +28,21 @@ export default function FileList({
}) {
const [viewType, setViewType] = useState<"grid" | "list">("grid");
if (files.length === 0) {
return <b>No Files</b>;
return <b className="text-gray-400">No Files</b>;
}
function renderInner(f: FileInfo) {
if (f.type?.startsWith("image/") || !f.type) {
if (
f.type?.startsWith("image/") ||
f.type?.startsWith("video/") ||
!f.type
) {
return (
<img src={f.url} className="w-full h-full object-contain object-center" loading="lazy" />
);
} else if (f.type?.startsWith("video/")) {
return (
<div className="w-full h-full flex items-center justify-center">
Video
</div>
<img
src={f.url.replace(`/${f.id}`, `/thumb/${f.id}`)}
className="w-full h-full object-contain object-center"
loading="lazy"
/>
);
}
}
@@ -54,6 +58,7 @@ export default function FileList({
name: f.content,
type: f.tags.find((a) => a[0] === "m")?.at(1),
size: Number(f.tags.find((a) => a[0] === "size")?.at(1)),
uploader: "uploader" in f ? (f.uploader as Array<string>) : undefined,
};
} else {
return {
@@ -72,17 +77,22 @@ export default function FileList({
for (let x = start; x < n; x++) {
ret.push(
<div
<button
key={x}
onClick={() => onPage?.(x)}
className={classNames("bg-neutral-700 hover:bg-neutral-600 min-w-8 text-center cursor-pointer font-bold",
className={classNames(
"px-3 py-2 text-sm font-medium border transition-colors",
{
"rounded-l-md": x === start,
"rounded-r-md": (x + 1) === n,
"bg-neutral-400": page === x,
})}
"rounded-r-md": x + 1 === n,
"bg-blue-600 text-white border-blue-600": page === x,
"bg-white text-gray-700 border-gray-300 hover:bg-gray-50":
page !== x,
},
)}
>
{x + 1}
</div>,
</button>,
);
}
@@ -91,35 +101,57 @@ export default function FileList({
function showGrid() {
return (
<div className="grid gap-2 grid-cols-4 lg:grid-cols-6 xl:grid-cols-8 2xl:grid-cols-10">
<div className="grid gap-4 grid-cols-2 sm:grid-cols-3 md:grid-cols-4 lg:grid-cols-6 xl:grid-cols-8">
{files.map((a) => {
const info = getInfo(a);
return (
<div
key={info.id}
className="relative rounded-md aspect-square overflow-hidden bg-neutral-900"
className="group relative rounded-lg aspect-square overflow-hidden bg-gray-100 border border-gray-200 hover:shadow-md transition-shadow"
>
<div className="absolute flex flex-col items-center justify-center w-full h-full text-wrap text-sm break-all text-center opacity-0 hover:opacity-100 hover:bg-black/80">
<div>
{(info.name?.length ?? 0) === 0 ? "Untitled" : info.name}
<div className="absolute inset-0 flex flex-col items-center justify-center p-2 text-xs text-center opacity-0 group-hover:opacity-100 bg-black/75 text-white transition-opacity">
<div className="font-medium mb-1">
{(info.name?.length ?? 0) === 0
? "Untitled"
: info.name!.length > 20
? `${info.name?.substring(0, 10)}...${info.name?.substring(info.name.length - 10)}`
: info.name}
</div>
<div>
<div className="text-gray-300 mb-1">
{info.size && !isNaN(info.size)
? FormatBytes(info.size, 2)
: ""}
</div>
<div className="text-gray-300 mb-2">{info.type}</div>
<div className="flex gap-2">
<a href={info.url} className="underline" target="_blank">
Link
<a
href={info.url}
className="bg-blue-600 hover:bg-blue-700 px-2 py-1 rounded text-xs"
target="_blank"
>
View
</a>
{onDelete && <a href="#" onClick={e => {
e.preventDefault();
onDelete?.(info.id)
}} className="underline">
Delete
</a>}
{onDelete && (
<button
onClick={(e) => {
e.preventDefault();
onDelete?.(info.id);
}}
className="bg-red-600 hover:bg-red-700 px-2 py-1 rounded text-xs"
>
Delete
</button>
)}
</div>
{info.uploader &&
info.uploader.map((a, idx) => (
<Profile
key={idx}
link={NostrLink.publicKey(a)}
size={20}
/>
))}
</div>
{renderInner(info)}
</div>
@@ -131,82 +163,131 @@ export default function FileList({
function showList() {
return (
<table className="table-auto text-sm">
<thead>
<tr>
<th className="border border-neutral-400 bg-neutral-500 py-1 px-2">
Name
</th>
<th className="border border-neutral-400 bg-neutral-500 py-1 px-2">
Type
</th>
<th className="border border-neutral-400 bg-neutral-500 py-1 px-2">
Size
</th>
<th className="border border-neutral-400 bg-neutral-500 py-1 px-2">
Actions
</th>
</tr>
</thead>
<tbody>
{files.map((a) => {
const info = getInfo(a);
return (
<tr key={info.id}>
<td className="border border-neutral-500 py-1 px-2 break-all">
{(info.name?.length ?? 0) === 0 ? "<Untitled>" : info.name}
</td>
<td className="border border-neutral-500 py-1 px-2 break-all">
{info.type}
</td>
<td className="border border-neutral-500 py-1 px-2">
{info.size && !isNaN(info.size)
? FormatBytes(info.size, 2)
: ""}
</td>
<td className="border border-neutral-500 py-1 px-2">
<div className="flex gap-2">
<a href={info.url} className="underline" target="_blank">
Link
</a>
{onDelete && <a href="#" onClick={e => {
e.preventDefault();
onDelete?.(info.id)
}} className="underline">
Delete
</a>}
</div>
</td>
</tr>
);
})}
</tbody>
</table>
<div className="overflow-x-auto">
<table className="min-w-full bg-white border border-gray-200 rounded-lg">
<thead className="bg-gray-50">
<tr>
<th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider border-b border-gray-200">
Preview
</th>
<th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider border-b border-gray-200">
Name
</th>
<th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider border-b border-gray-200">
Type
</th>
<th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider border-b border-gray-200">
Size
</th>
{files.some((i) => "uploader" in i) && (
<th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider border-b border-gray-200">
Uploader
</th>
)}
<th className="px-4 py-3 text-left text-xs font-medium text-gray-500 uppercase tracking-wider border-b border-gray-200">
Actions
</th>
</tr>
</thead>
<tbody className="divide-y divide-gray-200">
{files.map((a) => {
const info = getInfo(a);
return (
<tr key={info.id} className="hover:bg-gray-50">
<td className="px-4 py-3 w-16">
<div className="w-12 h-12 bg-gray-100 rounded overflow-hidden">
{renderInner(info)}
</div>
</td>
<td className="px-4 py-3 text-sm text-gray-900 break-all max-w-xs">
{(info.name?.length ?? 0) === 0 ? "<Untitled>" : info.name}
</td>
<td className="px-4 py-3 text-sm text-gray-500">
{info.type}
</td>
<td className="px-4 py-3 text-sm text-gray-500">
{info.size && !isNaN(info.size)
? FormatBytes(info.size, 2)
: ""}
</td>
{info.uploader && (
<td className="px-4 py-3">
{info.uploader.map((a, idx) => (
<Profile
key={idx}
link={NostrLink.publicKey(a)}
size={20}
/>
))}
</td>
)}
<td className="px-4 py-3">
<div className="flex gap-2">
<a
href={info.url}
className="bg-blue-600 hover:bg-blue-700 text-white px-3 py-1 rounded text-xs"
target="_blank"
>
View
</a>
{onDelete && (
<button
onClick={(e) => {
e.preventDefault();
onDelete?.(info.id);
}}
className="bg-red-600 hover:bg-red-700 text-white px-3 py-1 rounded text-xs"
>
Delete
</button>
)}
</div>
</td>
</tr>
);
})}
</tbody>
</table>
</div>
);
}
return (
<>
<div className="flex">
<div
onClick={() => setViewType("grid")}
className={`bg-neutral-700 hover:bg-neutral-600 min-w-20 text-center cursor-pointer font-bold rounded-l-md ${viewType === "grid" ? "bg-neutral-500" : ""}`}
>
Grid
</div>
<div
onClick={() => setViewType("list")}
className={`bg-neutral-700 hover:bg-neutral-600 min-w-20 text-center cursor-pointer font-bold rounded-r-md ${viewType === "list" ? "bg-neutral-500" : ""}`}
>
List
<div className="space-y-4">
<div className="flex justify-between items-center">
<div className="flex rounded-lg border border-gray-300 overflow-hidden">
<button
onClick={() => setViewType("grid")}
className={`px-4 py-2 text-sm font-medium transition-colors ${
viewType === "grid"
? "bg-blue-600 text-white"
: "bg-white text-gray-700 hover:bg-gray-50"
}`}
>
Grid
</button>
<button
onClick={() => setViewType("list")}
className={`px-4 py-2 text-sm font-medium transition-colors border-l border-gray-300 ${
viewType === "list"
? "bg-blue-600 text-white"
: "bg-white text-gray-700 hover:bg-gray-50"
}`}
>
List
</button>
</div>
</div>
{viewType === "grid" ? showGrid() : showList()}
{pages !== undefined && (
<>
<div className="flex flex-wrap">{pageButtons(page ?? 0, pages)}</div>
</>
{pages !== undefined && pages > 1 && (
<div className="flex justify-center">
<div className="flex rounded-lg border border-gray-300 overflow-hidden">
{pageButtons(page ?? 0, pages)}
</div>
</div>
)}
</>
</div>
);
}

View File

@@ -1,11 +1,22 @@
import { Nip7Signer, NostrLink } from "@snort/system";
import { Link, useLocation } from "react-router-dom";
import { useEffect, useState } from "react";
import Button from "../components/button";
import Profile from "../components/profile";
import useLogin from "../hooks/login";
import usePublisher from "../hooks/publisher";
import { Login } from "../login";
import { AdminSelf, Route96 } from "../upload/admin";
export default function Header() {
const login = useLogin();
const pub = usePublisher();
const location = useLocation();
const [self, setSelf] = useState<AdminSelf>();
const url =
import.meta.env.VITE_API_URL ||
`${window.location.protocol}//${window.location.host}`;
async function tryLogin() {
try {
@@ -19,14 +30,72 @@ export default function Header() {
//ignore
}
}
useEffect(() => {
if (pub && !self) {
const r96 = new Route96(url, pub);
r96
.getSelf()
.then((v) => setSelf(v.data))
.catch(() => {});
}
}, [pub, self, url]);
return (
<div className="flex justify-between items-center">
<div className="text-xl font-bold">route96</div>
{login ? (
<Profile link={NostrLink.publicKey(login.pubkey)} />
) : (
<Button onClick={tryLogin}>Login</Button>
)}
</div>
<header className="border-b border-gray-700 bg-gray-800 w-full">
<div className="px-4 sm:px-6 lg:px-8 flex justify-between items-center py-4">
<div className="flex items-center space-x-8">
<Link to="/">
<div className="text-2xl font-bold text-gray-100 hover:text-blue-400 transition-colors">
route96
</div>
</Link>
<nav className="flex space-x-6">
<Link
to="/"
className={`text-sm font-medium transition-colors ${
location.pathname === "/"
? "text-blue-400 border-b-2 border-blue-400 pb-1"
: "text-gray-300 hover:text-gray-100"
}`}
>
Upload
</Link>
{self?.is_admin && (
<Link
to="/admin"
className={`text-sm font-medium transition-colors ${
location.pathname === "/admin"
? "text-blue-400 border-b-2 border-blue-400 pb-1"
: "text-gray-300 hover:text-gray-100"
}`}
>
Admin
</Link>
)}
</nav>
</div>
<div className="flex items-center space-x-4">
{login ? (
<div className="flex items-center space-x-3">
<Profile link={NostrLink.publicKey(login.pubkey)} />
<Button
onClick={() => Login.logout()}
className="btn-secondary text-sm"
>
Logout
</Button>
</div>
) : (
<Button onClick={tryLogin} className="btn-primary">
Login
</Button>
)}
</div>
</div>
</header>
);
}

View File

@@ -0,0 +1,158 @@
import { NostrLink } from "@snort/system";
import classNames from "classnames";
import Profile from "../components/profile";
import { Report } from "../upload/admin";
export default function ReportList({
reports,
pages,
page,
onPage,
onAcknowledge,
onDeleteFile,
}: {
reports: Array<Report>;
pages?: number;
page?: number;
onPage?: (n: number) => void;
onAcknowledge?: (reportId: number) => void;
onDeleteFile?: (fileId: string) => void;
}) {
if (reports.length === 0) {
return <b>No Reports</b>;
}
function pageButtons(page: number, n: number) {
const ret = [];
const start = 0;
for (let x = start; x < n; x++) {
ret.push(
<div
key={x}
onClick={() => onPage?.(x)}
className={classNames(
"bg-neutral-700 hover:bg-neutral-600 min-w-8 text-center cursor-pointer font-bold",
{
"rounded-l-md": x === start,
"rounded-r-md": x + 1 === n,
"bg-neutral-400": page === x,
},
)}
>
{x + 1}
</div>,
);
}
return ret;
}
function getReporterPubkey(eventJson: string): string | null {
try {
const event = JSON.parse(eventJson);
return event.pubkey;
} catch {
return null;
}
}
function getReportReason(eventJson: string): string {
try {
const event = JSON.parse(eventJson);
return event.content || "No reason provided";
} catch {
return "Invalid event data";
}
}
function formatDate(dateString: string): string {
return new Date(dateString).toLocaleString();
}
return (
<>
<table className="w-full border-collapse border border-neutral-500">
<thead>
<tr className="bg-neutral-700">
<th className="border border-neutral-500 py-2 px-4 text-left">
Report ID
</th>
<th className="border border-neutral-500 py-2 px-4 text-left">
File ID
</th>
<th className="border border-neutral-500 py-2 px-4 text-left">
Reporter
</th>
<th className="border border-neutral-500 py-2 px-4 text-left">
Reason
</th>
<th className="border border-neutral-500 py-2 px-4 text-left">
Created
</th>
<th className="border border-neutral-500 py-2 px-4 text-left">
Actions
</th>
</tr>
</thead>
<tbody>
{reports.map((report) => {
const reporterPubkey = getReporterPubkey(report.event_json);
const reason = getReportReason(report.event_json);
return (
<tr key={report.id} className="hover:bg-neutral-700">
<td className="border border-neutral-500 py-2 px-4">
{report.id}
</td>
<td className="border border-neutral-500 py-2 px-4 font-mono text-sm">
{report.file_id.substring(0, 12)}...
</td>
<td className="border border-neutral-500 py-2 px-4">
{reporterPubkey ? (
<Profile
link={NostrLink.publicKey(reporterPubkey)}
size={20}
/>
) : (
"Unknown"
)}
</td>
<td className="border border-neutral-500 py-2 px-4 max-w-xs truncate">
{reason}
</td>
<td className="border border-neutral-500 py-2 px-4">
{formatDate(report.created)}
</td>
<td className="border border-neutral-500 py-2 px-4">
<div className="flex gap-2">
<button
onClick={() => onAcknowledge?.(report.id)}
className="bg-blue-600 hover:bg-blue-700 px-2 py-1 rounded text-sm"
>
Acknowledge
</button>
<button
onClick={() => onDeleteFile?.(report.file_id)}
className="bg-red-600 hover:bg-red-700 px-2 py-1 rounded text-sm"
>
Delete File
</button>
</div>
</td>
</tr>
);
})}
</tbody>
</table>
{pages !== undefined && (
<>
<div className="flex justify-center mt-4">
<div className="flex gap-1">{pageButtons(page ?? 0, pages)}</div>
</div>
</>
)}
</>
);
}

View File

@@ -1,6 +1,7 @@
import { useEffect, useState } from "react";
import { useEffect, useState, useCallback } from "react";
import Button from "../components/button";
import FileList from "./files";
import PaymentFlow from "../components/payment";
import { openFile } from "../upload";
import { Blossom } from "../upload/blossom";
import useLogin from "../hooks/login";
@@ -8,29 +9,23 @@ import usePublisher from "../hooks/publisher";
import { Nip96, Nip96FileList } from "../upload/nip96";
import { AdminSelf, Route96 } from "../upload/admin";
import { FormatBytes } from "../const";
import Report from "../report.json";
export default function Upload() {
const [type, setType] = useState<"blossom" | "nip96">("blossom");
const [noCompress, setNoCompress] = useState(false);
const [showLegacy, setShowLegacy] = useState(false);
const [toUpload, setToUpload] = useState<File>();
const [self, setSelf] = useState<AdminSelf>();
const [error, setError] = useState<string>();
const [bulkPrgress, setBulkProgress] = useState<number>();
const [results, setResults] = useState<Array<object>>([]);
const [listedFiles, setListedFiles] = useState<Nip96FileList>();
const [adminListedFiles, setAdminListedFiles] = useState<Nip96FileList>();
const [listedPage, setListedPage] = useState(0);
const [adminListedPage, setAdminListedPage] = useState(0);
const [showPaymentFlow, setShowPaymentFlow] = useState(false);
const login = useLogin();
const pub = usePublisher();
const legacyFiles = Report as Record<string, Array<string>>;
const myLegacyFiles = login ? (legacyFiles[login.pubkey] ?? []) : [];
const url = import.meta.env.VITE_API_URL || `${location.protocol}//${location.host}`;
const url =
import.meta.env.VITE_API_URL || `${location.protocol}//${location.host}`;
async function doUpload() {
if (!pub) return;
if (!toUpload) return;
@@ -38,7 +33,9 @@ export default function Upload() {
setError(undefined);
if (type === "blossom") {
const uploader = new Blossom(url, pub);
const result = noCompress ? await uploader.upload(toUpload) : await uploader.media(toUpload);
const result = noCompress
? await uploader.upload(toUpload)
: await uploader.media(toUpload);
setResults((s) => [...s, result]);
}
if (type === "nip96") {
@@ -49,7 +46,7 @@ export default function Upload() {
}
} catch (e) {
if (e instanceof Error) {
setError(e.message.length > 0 ? e.message : "Upload failed");
setError(e.message || "Upload failed - no error details provided");
} else if (typeof e === "string") {
setError(e);
} else {
@@ -58,42 +55,29 @@ export default function Upload() {
}
}
const listUploads = useCallback(
async (n: number) => {
if (!pub) return;
try {
setError(undefined);
const uploader = new Nip96(url, pub);
await uploader.loadInfo();
const result = await uploader.listFiles(n, 50);
setListedFiles(result);
} catch (e) {
if (e instanceof Error) {
setError(
e.message || "List files failed - no error details provided",
);
} else if (typeof e === "string") {
setError(e);
} else {
setError("List files failed");
}
}
},
[pub, url],
);
async function deleteFile(id: string) {
if (!pub) return;
@@ -103,161 +87,412 @@ export default function Upload() {
await uploader.delete(id);
} catch (e) {
if (e instanceof Error) {
setError(e.message || "Delete failed - no error details provided");
} else if (typeof e === "string") {
setError(e);
} else {
setError("Delete failed");
}
}
}
useEffect(() => {
if (pub && !listedFiles) {
listUploads(listedPage);
}
}, [listedPage, pub, listUploads, listedFiles]);
useEffect(() => {
if (pub && !self) {
const r96 = new Route96(url, pub);
r96.getSelf().then((v) => setSelf(v.data));
}
}, [pub, self, url]);
if (!login) {
return (
<div className="card max-w-2xl mx-auto text-center">
<h2 className="text-2xl font-semibold mb-4 text-gray-100">
Welcome to {window.location.hostname}
</h2>
<p className="text-gray-400 mb-6">
Please log in to start uploading files to your storage.
</p>
</div>
);
}
return (
<div className="max-w-4xl mx-auto space-y-8">
{error && (
<div className="bg-red-900/20 border border-red-800 text-red-400 px-4 py-3 rounded-lg">
{error}
</div>
)}
<div className="card">
<h2 className="text-xl font-semibold mb-6">Upload Settings</h2>
<div className="space-y-6">
<div>
<label className="block text-sm font-medium text-gray-300 mb-3">
Upload Method
</label>
<div className="flex gap-6">
<label className="flex items-center cursor-pointer">
<input
type="radio"
checked={type === "blossom"}
onChange={() => setType("blossom")}
className="mr-2"
/>
<span className="text-sm font-medium text-gray-300">
Blossom
</span>
</label>
<label className="flex items-center cursor-pointer">
<input
type="radio"
checked={type === "nip96"}
onChange={() => setType("nip96")}
className="mr-2"
/>
<span className="text-sm font-medium text-gray-300">
NIP-96
</span>
</label>
</div>
</div>
<div>
<label className="flex items-center cursor-pointer">
<input
type="checkbox"
checked={noCompress}
onChange={(e) => setNoCompress(e.target.checked)}
className="mr-2"
/>
<span className="text-sm font-medium text-gray-300">
Disable Compression
</span>
</label>
</div>
{toUpload && (
<div className="border-2 border-dashed border-gray-600 rounded-lg p-4">
<FileList files={[toUpload]} />
</div>
)}
<div className="flex gap-4">
<Button
onClick={async () => {
const f = await openFile();
setToUpload(f);
}}
className="btn-secondary flex-1"
>
Choose File
</Button>
<Button
onClick={doUpload}
disabled={!toUpload}
className="btn-primary flex-1"
>
Upload
</Button>
</div>
</div>
</div>
{self && (
<div className="card max-w-2xl mx-auto">
<h3 className="text-lg font-semibold mb-4">Storage Quota</h3>
<div className="space-y-4">
{self.total_available_quota && self.total_available_quota > 0 && (
<>
{/* File Count */}
<div className="flex justify-between text-sm">
<span>Files:</span>
<span className="font-medium">
{self.file_count.toLocaleString()}
</span>
</div>
{/* Progress Bar */}
<div className="space-y-2">
<div className="flex justify-between text-sm">
<span>Used:</span>
<span className="font-medium">
{FormatBytes(self.total_size)} of{" "}
{FormatBytes(self.total_available_quota)}
</span>
</div>
<div className="w-full bg-gray-700 rounded-full h-2.5">
<div
className={`h-2.5 rounded-full transition-all duration-300 ${
self.total_size / self.total_available_quota > 0.8
? "bg-red-500"
: self.total_size / self.total_available_quota > 0.6
? "bg-yellow-500"
: "bg-green-500"
}`}
style={{
width: `${Math.min(100, (self.total_size / self.total_available_quota) * 100)}%`,
}}
></div>
</div>
<div className="flex justify-between text-xs text-gray-400">
<span>
{(
(self.total_size / self.total_available_quota) *
100
).toFixed(1)}
% used
</span>
<span
className={`${
self.total_size / self.total_available_quota > 0.8
? "text-red-400"
: self.total_size / self.total_available_quota > 0.6
? "text-yellow-400"
: "text-green-400"
}`}
>
{FormatBytes(
Math.max(
0,
self.total_available_quota - self.total_size,
),
)}{" "}
remaining
</span>
</div>
</div>
{/* Quota Breakdown */}
<div className="space-y-2 pt-2 border-t border-gray-700">
{self.free_quota && self.free_quota > 0 && (
<div className="flex justify-between text-sm">
<span>Free Quota:</span>
<span className="font-medium">
{FormatBytes(self.free_quota)}
</span>
</div>
)}
{(self.quota ?? 0) > 0 && (
<div className="flex justify-between text-sm">
<span>Paid Quota:</span>
<span className="font-medium">
{FormatBytes(self.quota!)}
</span>
</div>
)}
{(self.paid_until ?? 0) > 0 && (
<div className="flex justify-between text-sm">
<span>Expires:</span>
<div className="text-right">
<div className="font-medium">
{new Date(
self.paid_until! * 1000,
).toLocaleDateString()}
</div>
<div className="text-xs text-gray-400">
{(() => {
const now = Date.now() / 1000;
const daysLeft = Math.max(
0,
Math.ceil(
(self.paid_until! - now) / (24 * 60 * 60),
),
);
return daysLeft > 0
? `${daysLeft} days left`
: "Expired";
})()}
</div>
</div>
</div>
)}
</div>
</>
)}
{(!self.total_available_quota ||
self.total_available_quota === 0) && (
<div className="text-center py-4 text-gray-400">
<p>No quota information available</p>
<p className="text-sm">
Contact administrator for storage access
</p>
</div>
)}
</div>
<Button
onClick={() => setShowPaymentFlow(!showPaymentFlow)}
className="btn-primary w-full mt-4"
>
{showPaymentFlow ? "Hide" : "Show"} Payment Options
</Button>
</div>
)}
{showPaymentFlow && pub && (
<div className="card">
<PaymentFlow
route96={new Route96(url, pub)}
onPaymentRequested={(pr) => {
console.log("Payment requested:", pr);
}}
userInfo={self}
/>
</div>
)}
<div className="card">
<div className="flex justify-between items-center mb-6">
<h2 className="text-xl font-semibold">Your Files</h2>
{!listedFiles && (
<Button onClick={() => listUploads(0)} className="btn-primary">
Load Files
</Button>
)}
</div>
{listedFiles && (
<FileList
files={listedFiles.files}
pages={Math.ceil(listedFiles.total / listedFiles.count)}
page={listedFiles.page}
onPage={(x) => setListedPage(x)}
onDelete={async (x) => {
await deleteFile(x);
await listUploads(listedPage);
}}
/>
)}
</div>
{results.length > 0 && (
<div className="card">
<h3 className="text-lg font-semibold mb-4">Upload Results</h3>
<div className="space-y-4">
{results.map((result: any, index) => (
<div
key={index}
className="bg-gray-800 border border-gray-700 rounded-lg p-4"
>
<div className="flex items-start justify-between mb-3">
<div className="flex-1">
<h4 className="font-medium text-green-400 mb-1">
Upload Successful
</h4>
<p className="text-sm text-gray-400">
{new Date(
(result.uploaded || Date.now() / 1000) * 1000,
).toLocaleString()}
</p>
</div>
<div className="text-right">
<span className="text-xs bg-blue-900/50 text-blue-300 px-2 py-1 rounded">
{result.type || "Unknown type"}
</span>
</div>
</div>
<div className="grid grid-cols-1 md:grid-cols-2 gap-4 mb-4">
<div>
<p className="text-sm text-gray-400">File Size</p>
<p className="font-medium">
{FormatBytes(result.size || 0)}
</p>
</div>
{result.nip94?.find((tag: any[]) => tag[0] === "dim") && (
<div>
<p className="text-sm text-gray-400">Dimensions</p>
<p className="font-medium">
{
result.nip94.find(
(tag: any[]) => tag[0] === "dim",
)?.[1]
}
</p>
</div>
)}
</div>
<div className="space-y-2">
<div>
<p className="text-sm text-gray-400 mb-1">File URL</p>
<div className="flex items-center gap-2">
<code className="text-xs bg-gray-900 text-green-400 px-2 py-1 rounded flex-1 overflow-hidden">
{result.url}
</code>
<button
onClick={() =>
navigator.clipboard.writeText(result.url)
}
className="text-xs bg-blue-600 hover:bg-blue-700 text-white px-2 py-1 rounded transition-colors"
title="Copy URL"
>
Copy
</button>
</div>
</div>
{result.nip94?.find((tag: any[]) => tag[0] === "thumb") && (
<div>
<p className="text-sm text-gray-400 mb-1">
Thumbnail URL
</p>
<div className="flex items-center gap-2">
<code className="text-xs bg-gray-900 text-blue-400 px-2 py-1 rounded flex-1 overflow-hidden">
{
result.nip94.find(
(tag: any[]) => tag[0] === "thumb",
)?.[1]
}
</code>
<button
onClick={() =>
navigator.clipboard.writeText(
result.nip94.find(
(tag: any[]) => tag[0] === "thumb",
)?.[1],
)
}
className="text-xs bg-blue-600 hover:bg-blue-700 text-white px-2 py-1 rounded transition-colors"
title="Copy Thumbnail URL"
>
Copy
</button>
</div>
</div>
)}
<div>
<p className="text-sm text-gray-400 mb-1">
File Hash (SHA256)
</p>
<code className="text-xs bg-gray-900 text-gray-400 px-2 py-1 rounded block overflow-hidden">
{result.sha256}
</code>
</div>
</div>
<details className="mt-4">
<summary className="text-sm text-gray-400 cursor-pointer hover:text-gray-300">
Show raw JSON data
</summary>
<pre className="text-xs bg-gray-900 text-gray-300 p-3 rounded mt-2 overflow-auto">
{JSON.stringify(result, undefined, 2)}
</pre>
</details>
</div>
))}
</div>
</div>
)}
</div>
);
}
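The catch blocks in doUpload, listUploads, and deleteFile repeat the same narrowing of an unknown error into a display string. A minimal sketch of that pattern as a shared helper (`errorMessage` is a hypothetical name, not part of this codebase):

```typescript
// Hypothetical helper collapsing the repeated catch-block narrowing:
// Error -> message (or fallback when the message is empty),
// non-empty string -> itself, anything else -> fallback.
function errorMessage(e: unknown, fallback: string): string {
  if (e instanceof Error) return e.message || fallback;
  if (typeof e === "string" && e.length > 0) return e;
  return fallback;
}
```

Each catch body would then reduce to `setError(errorMessage(e, "Upload failed"))`, keeping the per-action fallback text in one place.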


@@ -1 +1 @@
{"root":["./src/App.tsx","./src/const.ts","./src/login.ts","./src/main.tsx","./src/vite-env.d.ts","./src/components/button.tsx","./src/components/profile.tsx","./src/hooks/login.ts","./src/hooks/publisher.ts","./src/upload/admin.ts","./src/upload/blossom.ts","./src/upload/index.ts","./src/upload/nip96.ts","./src/views/files.tsx","./src/views/header.tsx","./src/views/upload.tsx"],"version":"5.6.2"}
{"root":["./src/App.tsx","./src/const.ts","./src/login.ts","./src/main.tsx","./src/vite-env.d.ts","./src/components/button.tsx","./src/components/payment.tsx","./src/components/profile.tsx","./src/hooks/login.ts","./src/hooks/publisher.ts","./src/upload/admin.ts","./src/upload/blossom.ts","./src/upload/index.ts","./src/upload/nip96.ts","./src/views/admin.tsx","./src/views/files.tsx","./src/views/header.tsx","./src/views/reports.tsx","./src/views/upload.tsx"],"version":"5.6.2"}


@@ -993,6 +993,13 @@ __metadata:
languageName: node
linkType: hard
"@types/history@npm:^4.7.11":
version: 4.7.11
resolution: "@types/history@npm:4.7.11"
checksum: 10c0/3facf37c2493d1f92b2e93a22cac7ea70b06351c2ab9aaceaa3c56aa6099fb63516f6c4ec1616deb5c56b4093c026a043ea2d3373e6c0644d55710364d02c934
languageName: node
linkType: hard
"@types/json-schema@npm:^7.0.15":
version: 7.0.15
resolution: "@types/json-schema@npm:7.0.15"
@@ -1016,6 +1023,27 @@ __metadata:
languageName: node
linkType: hard
"@types/react-router-dom@npm:^5.3.3":
version: 5.3.3
resolution: "@types/react-router-dom@npm:5.3.3"
dependencies:
"@types/history": "npm:^4.7.11"
"@types/react": "npm:*"
"@types/react-router": "npm:*"
checksum: 10c0/a9231a16afb9ed5142678147eafec9d48582809295754fb60946e29fcd3757a4c7a3180fa94b45763e4c7f6e3f02379e2fcb8dd986db479dcab40eff5fc62a91
languageName: node
linkType: hard
"@types/react-router@npm:*":
version: 5.1.20
resolution: "@types/react-router@npm:5.1.20"
dependencies:
"@types/history": "npm:^4.7.11"
"@types/react": "npm:*"
checksum: 10c0/1f7eee61981d2f807fa01a34a0ef98ebc0774023832b6611a69c7f28fdff01de5a38cabf399f32e376bf8099dcb7afaf724775bea9d38870224492bea4cb5737
languageName: node
linkType: hard
"@types/react@npm:*, @types/react@npm:^18.3.3":
version: 18.3.8
resolution: "@types/react@npm:18.3.8"
@@ -1522,6 +1550,13 @@ __metadata:
languageName: node
linkType: hard
"cookie@npm:^1.0.1":
version: 1.0.2
resolution: "cookie@npm:1.0.2"
checksum: 10c0/fd25fe79e8fbcfcaf6aa61cd081c55d144eeeba755206c058682257cb38c4bd6795c6620de3f064c740695bb65b7949ebb1db7a95e4636efb8357a335ad3f54b
languageName: node
linkType: hard
"cross-spawn@npm:^7.0.0, cross-spawn@npm:^7.0.2":
version: 7.0.3
resolution: "cross-spawn@npm:7.0.3"
@@ -3187,6 +3222,34 @@ __metadata:
languageName: node
linkType: hard
"react-router-dom@npm:^7.6.2":
version: 7.6.2
resolution: "react-router-dom@npm:7.6.2"
dependencies:
react-router: "npm:7.6.2"
peerDependencies:
react: ">=18"
react-dom: ">=18"
checksum: 10c0/9a8370333b5c1ada5ed76a2c30a90ca5a5a8e6c8565165f147fb42b150f2b258b9e73935fe4945c459d770841abdfaf99c28f7e13da93b1f49b28e6a8e87aadb
languageName: node
linkType: hard
"react-router@npm:7.6.2":
version: 7.6.2
resolution: "react-router@npm:7.6.2"
dependencies:
cookie: "npm:^1.0.1"
set-cookie-parser: "npm:^2.6.0"
peerDependencies:
react: ">=18"
react-dom: ">=18"
peerDependenciesMeta:
react-dom:
optional: true
checksum: 10c0/c8ef65f2a378f38e3cba900d67fa2b80a41c1c3925102875ee07c12faa01ea40991cb3fbefaf3ff6914e724c755732e3d7dec2b1bdef09e0fddd00fccc85a06a
languageName: node
linkType: hard
"react@npm:^18.2.0, react@npm:^18.3.1":
version: 18.3.1
resolution: "react@npm:18.3.1"
@@ -3367,6 +3430,13 @@ __metadata:
languageName: node
linkType: hard
"set-cookie-parser@npm:^2.6.0":
version: 2.7.1
resolution: "set-cookie-parser@npm:2.7.1"
checksum: 10c0/060c198c4c92547ac15988256f445eae523f57f2ceefeccf52d30d75dedf6bff22b9c26f756bd44e8e560d44ff4ab2130b178bd2e52ef5571bf7be3bd7632d9a
languageName: node
linkType: hard
"shebang-command@npm:^2.0.0":
version: 2.0.0
resolution: "shebang-command@npm:2.0.0"
@@ -3726,6 +3796,7 @@ __metadata:
"@snort/system-react": "npm:^1.5.1"
"@types/react": "npm:^18.3.3"
"@types/react-dom": "npm:^18.3.0"
"@types/react-router-dom": "npm:^5.3.3"
"@vitejs/plugin-react": "npm:^4.3.1"
autoprefixer: "npm:^10.4.20"
classnames: "npm:^2.5.1"
@@ -3737,6 +3808,7 @@ __metadata:
prettier: "npm:^3.3.3"
react: "npm:^18.3.1"
react-dom: "npm:^18.3.1"
react-router-dom: "npm:^7.6.2"
tailwindcss: "npm:^3.4.13"
typescript: "npm:^5.5.3"
typescript-eslint: "npm:^8.0.1"
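The storage quota card in upload.tsx computes the used percentage, remaining bytes, and a 60%/80% colour band inline in JSX. The same arithmetic as a standalone sketch (the `quotaView` name and return shape are illustrative assumptions, not an API from this repo):

```typescript
// Illustrative sketch of the quota card's math: percentage capped at 100,
// remaining bytes clamped at 0, and green/yellow/red thresholds at 60%/80%.
type QuotaView = {
  percentUsed: number;
  remaining: number;
  band: "green" | "yellow" | "red";
};

function quotaView(totalSize: number, totalQuota: number): QuotaView {
  const ratio = totalSize / totalQuota;
  return {
    percentUsed: Math.min(100, ratio * 100),
    remaining: Math.max(0, totalQuota - totalSize),
    band: ratio > 0.8 ? "red" : ratio > 0.6 ? "yellow" : "green",
  };
}
```

Extracting this keeps the JSX free of repeated `self.total_size / self.total_available_quota` expressions and makes the thresholds testable.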