initialize backlog

188  .cursorrules  Normal file
@@ -0,0 +1,188 @@
# === BACKLOG.MD GUIDELINES START ===

# Instructions for the usage of Backlog.md CLI Tool

## 1. Source of Truth

- Tasks live under **`backlog/tasks/`** (drafts under **`backlog/drafts/`**).
- Every implementation decision starts with reading the corresponding Markdown task file.
- Project documentation is in **`backlog/docs/`**.
- Project decisions are in **`backlog/decisions/`**.

## 2. Defining Tasks

### **Title**

Use a clear, brief title that summarizes the task.

### **Description**: (The **"why"**)

Provide a concise summary of the task's purpose and goal. Do not add implementation details here; it
should explain the purpose and context of the task. Code snippets should be avoided.

### **Acceptance Criteria**: (The **"what"**)

List specific, measurable outcomes that define what it means to reach the goal from the description. Use checkboxes (`- [ ]`) for tracking.
When defining `## Acceptance Criteria` for a task, focus on **outcomes, behaviors, and verifiable requirements** rather
than step-by-step implementation details.
Acceptance Criteria (AC) define *what* conditions must be met for the task to be considered complete.
They should be testable and confirm that the core purpose of the task is achieved.

**Key Principles for Good ACs:**

- **Outcome-Oriented:** Focus on the result, not the method.
- **Testable/Verifiable:** Each criterion should be something that can be objectively tested or verified.
- **Clear and Concise:** Use unambiguous language.
- **Complete:** Collectively, the ACs should cover the scope of the task.
- **User-Focused (where applicable):** Frame ACs from the perspective of the end-user or the system's external behavior.

- *Good Example:* "- [ ] User can successfully log in with valid credentials."
- *Good Example:* "- [ ] System processes 1000 requests per second without errors."
- *Bad Example (Implementation Step):* "- [ ] Add a new function `handleLogin()` in `auth.ts`."

### Task file

Once a task is created, it is stored in the `backlog/tasks/` directory as a Markdown file with the format
`task-<id> - <title>.md` (e.g. `task-42 - Add GraphQL resolver.md`).

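As an illustration, the filename convention above can be generated and parsed with a few lines of Python. This is a sketch only; the helper names are hypothetical and not part of Backlog.md:

```python
import re

# Hypothetical helpers illustrating the `task-<id> - <title>.md` convention.
TASK_FILE_RE = re.compile(r"^task-(?P<id>\d+) - (?P<title>.+)\.md$")

def task_filename(task_id: int, title: str) -> str:
    """Build a task filename from an id and title."""
    return f"task-{task_id} - {title}.md"

def parse_task_filename(name: str) -> tuple[int, str]:
    """Extract (id, title) from a task filename, or raise ValueError."""
    m = TASK_FILE_RE.match(name)
    if m is None:
        raise ValueError(f"not a task file: {name!r}")
    return int(m.group("id")), m.group("title")
```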
### Additional task requirements

- Tasks must be **atomic** and **testable**. If a task is too large, break it down into smaller subtasks.
  Each task should represent a single unit of work that can be completed in a single PR.

- **Never** reference tasks that are to be done in the future or that are not yet created. You can only reference
  previous tasks (id < current task id).

- When creating multiple tasks, ensure they are **independent** and do not depend on future tasks.
  Example of incorrect task splitting: task 1: "Add API endpoint for user data", task 2: "Define the user model and DB schema".
  Example of correct task splitting: task 1: "Add system for handling API requests", task 2: "Add user model and DB schema", task 3: "Add API endpoint for user data".

## 3. Recommended Task Anatomy

```markdown
# task-42 - Add GraphQL resolver

## Description (the why)

Short, imperative explanation of the goal of the task and why it is needed.

## Acceptance Criteria (the what)

- [ ] Resolver returns correct data for happy path
- [ ] Error response matches REST
- [ ] P95 latency ≤ 50 ms under 100 RPS

## Implementation Plan (the how)

1. Research existing GraphQL resolver patterns
2. Implement basic resolver with error handling
3. Add performance monitoring
4. Write unit and integration tests
5. Benchmark performance under load

## Implementation Notes (only added after working on the task)

- Approach taken
- Features implemented or modified
- Technical decisions and trade-offs
- Modified or added files
```

## 4. Implementing Tasks

Mandatory sections for every task:

- **Implementation Plan**: (The **"how"**) Outline the steps to achieve the task. Because implementation details may
  change after the task is created, **the implementation plan must be added only after moving the task to In Progress**
  and before starting to work on it.
- **Implementation Notes**: Document your approach, decisions, challenges, and any deviations from the plan. This
  section is added after you are done working on the task. It should summarize what you did and why. Keep it
  concise but informative.

**IMPORTANT**: Do not implement anything that deviates from the **Acceptance Criteria**. If you need to
implement something that is not in the AC, update the AC first and then implement it, or create a new task for it.

## 5. Typical Workflow

```bash
# 1 Identify work
backlog task list -s "To Do" --plain

# 2 Read details & documentation
backlog task 42 --plain
# Also read all documentation files in the `backlog/docs/` directory
# and all decision files in the `backlog/decisions/` directory.

# 3 Start work: assign yourself & move column
backlog task edit 42 -a @{yourself} -s "In Progress"

# 4 Add implementation plan before starting
backlog task edit 42 --plan "1. Analyze current implementation\n2. Identify bottlenecks\n3. Refactor in phases"

# 5 Break work down if needed by creating subtasks or additional tasks
backlog task create "Refactor DB layer" -p 42 -a @{yourself} -d "Description" --ac "Tests pass,Performance improved"

# 6 Complete and mark Done
backlog task edit 42 -s Done --notes "Implemented GraphQL resolver with error handling and performance monitoring"
```

## 6. Final Steps Before Marking a Task as Done

Always ensure you have:

1. ✅ Marked all acceptance criteria as completed (changed `- [ ]` to `- [x]`)
2. ✅ Added an `## Implementation Notes` section documenting your approach
3. ✅ Run all tests and linting checks
4. ✅ Updated relevant documentation

## 7. Definition of Done (DoD)

A task is **Done** only when **ALL** of the following are complete:

1. **Acceptance criteria**: the checklist in the task file is fully checked (all `- [ ]` changed to `- [x]`).
2. **Implementation plan**: the plan was followed, or deviations were documented in the Implementation Notes.
3. **Automated tests**: unit and integration tests cover the new logic.
4. **Static analysis**: linter & formatter succeed.
5. **Documentation**:
   - All relevant docs updated (any relevant README file, backlog/docs, backlog/decisions, etc.).
   - The task file **MUST** have an `## Implementation Notes` section added, summarizing:
     - Approach taken
     - Features implemented or modified
     - Technical decisions and trade-offs
     - Modified or added files
6. **Review**: self-review your code.
7. **Task hygiene**: status set to **Done** via the CLI (`backlog task edit <id> -s Done`).
8. **No regressions**: performance, security, and license checks are green.

⚠️ **IMPORTANT**: Never mark a task as Done without completing ALL items above.

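Checking off acceptance criteria is a plain-text edit; a minimal sketch of the `- [ ]` → `- [x]` flip looks like this (hypothetical helper, not Backlog.md's own code):

```python
import re

# Hypothetical helper: mark every unchecked criterion in a task file as done.
UNCHECKED = re.compile(r"^(\s*)- \[ \]", flags=re.MULTILINE)

def check_all_criteria(markdown: str) -> str:
    """Flip every `- [ ]` checkbox to `- [x]`, preserving indentation."""
    return UNCHECKED.sub(r"\1- [x]", markdown)
```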
## 8. Handy CLI Commands

| Purpose          | Command                                                                 |
|------------------|-------------------------------------------------------------------------|
| Create task      | `backlog task create "Add OAuth"`                                       |
| Create with desc | `backlog task create "Feature" -d "Enables users to use this feature"`  |
| Create with AC   | `backlog task create "Feature" --ac "Must work,Must be tested"`         |
| Create with deps | `backlog task create "Feature" --dep task-1,task-2`                     |
| Create sub task  | `backlog task create -p 14 "Add Google auth"`                           |
| List tasks       | `backlog task list --plain`                                             |
| View detail      | `backlog task 7 --plain`                                                |
| Edit             | `backlog task edit 7 -a @{yourself} -l auth,backend`                    |
| Add plan         | `backlog task edit 7 --plan "Implementation approach"`                  |
| Add AC           | `backlog task edit 7 --ac "New criterion,Another one"`                  |
| Add deps         | `backlog task edit 7 --dep task-1,task-2`                               |
| Add notes        | `backlog task edit 7 --notes "We added this and that feature because"`  |
| Mark as done     | `backlog task edit 7 -s "Done"`                                         |
| Archive          | `backlog task archive 7`                                                |
| Draft flow       | `backlog draft create "Spike GraphQL"` → `backlog draft promote 3.1`    |
| Demote to draft  | `backlog task demote <task-id>`                                         |

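Note that `--ac` takes a comma-separated list: conceptually, each entry becomes one checkbox line in the task file. A sketch of that expansion (hypothetical helper, not the CLI's actual implementation):

```python
def ac_flag_to_checkboxes(ac_value: str) -> list[str]:
    """Expand a comma-separated --ac value into `- [ ]` checkbox lines."""
    return [f"- [ ] {item.strip()}" for item in ac_value.split(",") if item.strip()]
```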
## 9. Tips for AI Agents

- **Always use the `--plain` flag** when listing or viewing tasks to get AI-friendly text output instead of the
  Backlog.md interactive UI.
- When users ask to create a task, they mean creating one with the Backlog.md CLI tool.

# === BACKLOG.MD GUIDELINES END ===

@@ -1,4 +1,4 @@
-FROM ruby:3.2.0 AS native-gems
+FROM ruby:3.2 AS native-gems
 RUN rm -f /etc/apt/apt.conf.d/docker-clean; \
     echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache
 RUN \
@@ -101,3 +101,6 @@ RUN curl --proto '=https' --tlsv1.2 -sSf https://just.systems/install.sh | bash
 RUN su vscode -c "source /usr/local/share/nvm/nvm.sh && nvm install 18 && nvm use 18 && npm install -g yarn" 2>&1
 ENV PATH /usr/local/share/nvm/current/bin:$PATH
+
+# install `backlog` tool
+RUN su vscode -c "npm i -g backlog.md"

@@ -1 +0,0 @@
-source "$HOME/.cargo/env.fish"

1  .gitignore  vendored
@@ -62,3 +62,4 @@ yarn-debug.log*
 .yarn-integrity
 .DS_Store
 *.export
+.aider*

12  backlog/config.yml  Normal file
@@ -0,0 +1,12 @@
project_name: "redux-scraper"
default_status: "To Do"
statuses: ["To Do", "In Progress", "Done"]
labels: []
milestones: []
date_format: yyyy-mm-dd
max_column_width: 20
backlog_directory: "backlog"
auto_open_browser: true
default_port: 6420
remote_operations: true
auto_commit: false

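For scripts that need a value out of this config without pulling in a YAML dependency, the flat `key: value` layout above is simple enough for a line-based read. A sketch, assuming the file stays flat (a real YAML parser is safer for anything nested):

```python
def read_flat_config(text: str) -> dict[str, str]:
    """Parse flat `key: value` lines of a simple YAML file (no nesting).

    Surrounding quotes are stripped; lists stay as raw strings.
    """
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or ":" not in line:
            continue
        key, _, value = line.partition(":")
        config[key.strip()] = value.strip().strip('"')
    return config
```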
@@ -0,0 +1,20 @@
---
id: task-1
title: Add bookmarking feature for posts across different domains
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Enable users to bookmark posts from different domains (FA, E621, Inkbunny) for later viewing and organization

## Acceptance Criteria

- [ ] Users can bookmark posts from any domain
- [ ] Bookmarks are persisted across sessions
- [ ] Users can view and manage their bookmarks
- [ ] Bookmarks can be organized or tagged

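Every task file in this commit shares the same front matter schema; tooling that generates such files could render it with a helper like the following (a sketch, not Backlog.md's own code — the function name is hypothetical):

```python
def render_front_matter(task_id: str, title: str, created_date: str,
                        status: str = "To Do") -> str:
    """Render the minimal YAML front matter shared by the task files."""
    return "\n".join([
        "---",
        f"id: {task_id}",
        f"title: {title}",
        f"status: {status}",
        "assignee: []",
        f"created_date: '{created_date}'",
        "labels: []",
        "dependencies: []",
        "---",
    ])
```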
21  backlog/tasks/task-10 - Unify-HTTP-client-configs.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-10
title: Unify HTTP client configs
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Unify HTTP client configurations for all domains so the same job type can be used for different domains

## Acceptance Criteria

- [ ] HTTP client configs are unified across domains
- [ ] Same job type can work with different domains
- [ ] Configuration is centralized and maintainable
- [ ] Existing functionality is preserved

@@ -0,0 +1,20 @@
---
id: task-11
title: Extract external_url_for_view to module
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Put the abstract external_url_for_view method in a module for reuse across different classes

## Acceptance Criteria

- [ ] Method is extracted to a reusable module
- [ ] Classes can include the module
- [ ] Functionality is preserved
- [ ] Code duplication is reduced

@@ -0,0 +1,20 @@
---
id: task-12
title: Backfill descriptions on inkbunny posts
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Backfill missing descriptions on existing inkbunny posts to ensure all posts have complete metadata

## Acceptance Criteria

- [ ] All inkbunny posts have descriptions
- [ ] Backfill process is efficient
- [ ] No data loss occurs
- [ ] Process can be monitored and resumed if needed

@@ -0,0 +1,21 @@
---
id: task-13
title: Store deep update json on inkbunny posts
status: To Do
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Store deep update JSON data on inkbunny posts to preserve all metadata from the API response

## Acceptance Criteria

- [ ] Deep update JSON is stored for inkbunny posts
- [ ] All metadata is preserved
- [ ] Storage is efficient
- [ ] Data can be accessed when needed

@@ -0,0 +1,20 @@
---
id: task-14
title: Fix Good Job runner exception handling
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Fix the manual Good Job runner to properly indicate when a job throws an exception by checking the return value of #perform

## Acceptance Criteria

- [ ] Job runner indicates when exceptions occur
- [ ] Return value of #perform is checked
- [ ] Exception handling is robust
- [ ] Error reporting is clear

@@ -0,0 +1,20 @@
---
id: task-15
title: Optimize FA user favs job incremental scanning
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Improve FA user favs job to stop in incremental mode when all posts on a page are already known favs to avoid false positives

## Acceptance Criteria

- [ ] Job stops when all posts are known favs
- [ ] Incremental scanning is optimized
- [ ] False positives are avoided
- [ ] Performance is improved

@@ -0,0 +1,20 @@
---
id: task-16
title: Add followers and following to FA user page
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Add followers and following information to the FA user show page to display social connections

## Acceptance Criteria

- [ ] Followers count is displayed
- [ ] Following count is displayed
- [ ] Social connections are visible
- [ ] Data is accurate and up-to-date

20  backlog/tasks/task-17 - Parse-BBCode-in-post-descriptions.md  Normal file
@@ -0,0 +1,20 @@
---
id: task-17
title: Parse BBCode in post descriptions
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Implement BBCode parsing for post descriptions to properly format content with BBCode markup

## Acceptance Criteria

- [ ] BBCode is parsed correctly
- [ ] Formatted content is displayed properly
- [ ] Common BBCode tags are supported
- [ ] Parsing is secure and safe

21  backlog/tasks/task-18 - Show-tags-on-FA-and-IB-posts.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-18
title: Show tags on FA and IB posts
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Display tags on FA posts and Inkbunny posts to help users understand content categorization

## Acceptance Criteria

- [ ] Tags are displayed on FA posts
- [ ] Tags are displayed on Inkbunny posts
- [ ] Tag display is consistent
- [ ] Tags are clickable and functional

21  backlog/tasks/task-19 - Implement-SoFurry-scraper.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-19
title: Implement SoFurry scraper
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Implement a scraper for SoFurry platform to collect posts and user data

## Acceptance Criteria

- [ ] SoFurry scraper is implemented
- [ ] Posts can be scraped
- [ ] User data can be scraped
- [ ] Data is stored consistently
- [ ] Rate limiting is respected

@@ -0,0 +1,23 @@
---
id: task-2
title: Add search feature for FA and E621 content
status: To Do
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Implement search functionality to search through FA descriptions, tags, E621 descriptions, and tags to help users find specific content

## Acceptance Criteria

- [ ] Users can search FA post descriptions
- [ ] Users can search FA post tags
- [ ] Users can search E621 post descriptions
- [ ] Users can search E621 post tags
- [ ] Search results are relevant and properly ranked
- [ ] Search is performant with large datasets

20  backlog/tasks/task-20 - Create-unified-static-file-job.md  Normal file
@@ -0,0 +1,20 @@
---
id: task-20
title: Create unified static file job
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Create a unified static file job that can handle file downloads across different domains

## Acceptance Criteria

- [ ] Unified static file job is created
- [ ] Job works across different domains
- [ ] File downloads are handled consistently
- [ ] Error handling is robust

21  backlog/tasks/task-21 - Create-unified-avatar-file-job.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-21
title: Create unified avatar file job
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Create a unified avatar file job that can handle avatar downloads across different domains

## Acceptance Criteria

- [ ] Unified avatar file job is created
- [ ] Job works across different domains
- [ ] Avatar downloads are handled consistently
- [ ] Error handling is robust

21  backlog/tasks/task-22 - Add-ko-fi-domain-icon.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-22
title: Add ko-fi domain icon
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Add a domain icon for ko-fi platform to improve visual identification

## Acceptance Criteria

- [ ] Ko-fi domain icon is added
- [ ] Icon is visually appropriate
- [ ] Icon is displayed correctly
- [ ] Icon follows design standards

21  backlog/tasks/task-23 - Add-tumblr-domain-icon.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-23
title: Add tumblr domain icon
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Add a domain icon for tumblr platform to improve visual identification

## Acceptance Criteria

- [ ] Tumblr domain icon is added
- [ ] Icon is visually appropriate
- [ ] Icon is displayed correctly
- [ ] Icon follows design standards

@@ -0,0 +1,21 @@
---
id: task-24
title: Implement PCA visualization for user factors
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Implement PCA on user factors table to display a 2D plot showing user relationships and clustering

## Acceptance Criteria

- [ ] PCA analysis is implemented
- [ ] 2D plot is generated
- [ ] User relationships are visualized
- [ ] Clustering is visible
- [ ] Interactive visualization is provided

@@ -0,0 +1,20 @@
---
id: task-25
title: Use description links for post re-scanning
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Use links found in post descriptions to indicate when a post should be re-scanned, such as for comic next/prev links

## Acceptance Criteria

- [ ] Description links are analyzed
- [ ] Re-scanning logic is implemented
- [ ] Comic navigation links are detected
- [ ] Re-scanning is triggered appropriately

21  backlog/tasks/task-26 - Fix-IDs-with-dots-in-URLs.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-26
title: Fix IDs with dots in URLs
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Fix handling of IDs that contain dots in URLs, such as https://refurrer.com/users/fa@jakke.

## Acceptance Criteria

- [ ] IDs with dots are handled correctly
- [ ] URL routing works properly
- [ ] Edge cases are covered
- [ ] No breaking changes for existing URLs

21  backlog/tasks/task-27 - Add-rich-inline-E621-links.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-27
title: Add rich inline E621 links
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Add rich inline links to E621 posts, such as https://refurrer.com/posts/fa@60070060 with preview information

## Acceptance Criteria

- [ ] Rich inline links are displayed
- [ ] E621 posts show preview information
- [ ] Links are visually enhanced
- [ ] Preview data is accurate

@@ -0,0 +1,20 @@
---
id: task-28
title: Find and enqueue FA posts with favs but no scan
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Find FA posts that have favs recorded but no scan or file data, and enqueue scan jobs for them

## Acceptance Criteria

- [ ] FA posts with favs but no scan are identified
- [ ] Scan jobs are enqueued appropriately
- [ ] No duplicate jobs are created
- [ ] Progress can be monitored

@@ -0,0 +1,21 @@
---
id: task-29
title: Create GlobalState for FA browse page tracking
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Create GlobalState entries for tracking the last FA ID on browse page and implement periodic scans from newest to stored ID

## Acceptance Criteria

- [ ] GlobalState entries are created
- [ ] Last FA ID is tracked
- [ ] Periodic scanning is implemented
- [ ] Scan from newest to stored ID works
- [ ] State is maintained correctly

@@ -0,0 +1,20 @@
---
id: task-3
title: Standardize embeddings tables schema
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Unify all embeddings tables to use the same consistent schema with item_id and embedding columns

## Acceptance Criteria

- [ ] All embeddings tables use consistent schema
- [ ] Schema includes item_id and embedding columns
- [ ] Data migration is successful
- [ ] No data loss during standardization

@@ -0,0 +1,20 @@
---
id: task-30
title: Create GlobalState for backfill job management
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Create GlobalState entries for long running backfill jobs that can automatically restart them if they fail

## Acceptance Criteria

- [ ] GlobalState entries for backfill jobs are created
- [ ] Jobs can be automatically restarted
- [ ] Failure detection is implemented
- [ ] Job state is properly managed

@@ -0,0 +1,21 @@
---
id: task-31
title: Add HTTP request/response logging flag
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Add a flag to pass to jobs to log HTTP requests and responses to a directory, with HTTP mock helper to read from that directory

## Acceptance Criteria

- [ ] Flag for HTTP logging is implemented
- [ ] HTTP requests are logged to directory
- [ ] HTTP responses are logged to directory
- [ ] Mock helper can read from directory
- [ ] Logging is optional and configurable

@@ -0,0 +1,20 @@
---
id: task-32
title: Fix IP address for Cloudflare proxied requests
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Fix incorrect IP address detection for requests that are proxied through Cloudflare

## Acceptance Criteria

- [ ] IP address detection is correct for Cloudflare proxied requests
- [ ] Real client IP is extracted properly
- [ ] Cloudflare headers are handled correctly
- [ ] Edge cases are covered

@@ -0,0 +1,21 @@
---
id: task-33
title: Add SOCKS5 proxy for additional workers
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Implement SOCKS5 proxy support for additional workers to distribute load and avoid rate limiting

## Acceptance Criteria

- [ ] SOCKS5 proxy support is implemented
- [ ] Additional workers can use proxy
- [ ] Load is distributed effectively
- [ ] Rate limiting is avoided
- [ ] Proxy configuration is flexible

21  backlog/tasks/task-34 - Implement-backup-FA-scraper.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-34
title: Implement backup FA scraper
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Implement backup FA scraper using foxbot and the onion address g6jy5jkx466lrqojcngbnksugrcfxsl562bzuikrka5rv7srgguqbjid.onion

## Acceptance Criteria

- [ ] Backup FA scraper is implemented
- [ ] Foxbot integration works
- [ ] Onion address is used correctly
- [ ] Backup scraper can function independently
- [ ] Failover mechanism is in place

21  backlog/tasks/task-4 - Implement-Bluesky-scraper.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-4
title: Implement Bluesky scraper
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Create a scraper for Bluesky social media platform to collect posts and user data

## Acceptance Criteria

- [ ] Scraper can fetch Bluesky posts
- [ ] Scraper can fetch user profiles
- [ ] Data is stored in consistent format
- [ ] Error handling is implemented
- [ ] Rate limiting is respected

20  backlog/tasks/task-5 - Auto-enqueue-FA-user-profile-scans.md  Normal file
@@ -0,0 +1,20 @@
---
id: task-5
title: Auto-enqueue FA user profile scans
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Automatically enqueue incremental scan jobs for FA users to keep their profiles up to date

## Acceptance Criteria

- [ ] Jobs are automatically enqueued for FA users
- [ ] Incremental scanning is implemented
- [ ] System can handle multiple concurrent scans
- [ ] Job scheduling is efficient and doesn't overload the system

@@ -0,0 +1,20 @@
---
id: task-6
title: Fix FA posts with font size adjustment prefix
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Fix parsing of FA posts that start with 'Font size adjustment: smallerlarger' to extract the actual content

## Acceptance Criteria

- [ ] FA posts with font size prefix are correctly parsed
- [ ] Original content is extracted without the prefix
- [ ] Existing data is cleaned up
- [ ] New posts are handled correctly

21  backlog/tasks/task-7 - Convert-logger-prefix-to-tagged.md  Normal file
@@ -0,0 +1,21 @@
---
id: task-7
title: Convert logger prefix to tagged
status: Done
assignee: []
created_date: '2025-07-08'
updated_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Convert logger .prefix=... calls to use the .tagged(...) method for better logging structure

## Acceptance Criteria

- [ ] All logger .prefix calls are replaced with .tagged
- [ ] Logging functionality is preserved
- [ ] Log format is improved
- [ ] Code is more maintainable

20  backlog/tasks/task-8 - Convert-state-strings-to-enums.md  Normal file
@@ -0,0 +1,20 @@
---
id: task-8
title: Convert state strings to enums
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Convert all state: string attributes to use ActiveRecord enums for better type safety and validation

## Acceptance Criteria

- [ ] All state string attributes are converted to enums
- [ ] Enum values are properly defined
- [ ] Database migration is successful
- [ ] Existing data is preserved

20  backlog/tasks/task-9 - Create-belongs_to_log_entry-macro.md  Normal file
@@ -0,0 +1,20 @@
---
id: task-9
title: Create belongs_to_log_entry macro
status: To Do
assignee: []
created_date: '2025-07-08'
labels: []
dependencies: []
---

## Description

Create a macro for ActiveRecord models to easily establish belongs_to relationships with log entries

## Acceptance Criteria

- [ ] Macro is defined and functional
- [ ] Models can use the macro to establish log entry relationships
- [ ] Documentation is provided
- [ ] Existing log entry relationships work correctly
