Yet another budget tracking tool. Successor of flow. The main features are:
- Web UI to manage data!
- Import data from any CSV list of operations provided by your banks.
budget provides a uniform way of storing this data
- Find and mark transfers between your accounts within a budget to exclude them from "incomes" and "withdrawals"
- Tag operations the way you need with powerful tagging criteria
- Build calendar-like aggregations of your operations to track your expenses month-to-month
- Configure the authentication service with Yandex OAuth. The service currently supports only the Yandex OAuth provider.
- Host budget web services (client bundle and server-side app). Deployment options are described below.
- Configure budget settings (import options, tagging and transfer criteria)
- Configure aggregation rules (logbook criteria)
Starting from this point you're ready to use the service:
- Import new operations
- Handle possible duplicates and transfers
- Build and explore aggregated expenses
The application consists of two main components that need to be hosted:
- Web Server (`NVs.Budget.Hosts.Web.Server`): .NET 8.0 ASP.NET Core API server
- Web Client (`NVs.Budget.Hosts.Web.Client`): Angular application served as static files
- PostgreSQL 17+ database
- .NET 8.0 SDK (for building)
- Node.js 20+ and npm (for building client)
- Docker (optional, for containerized deployment)
The server requires the following configuration (via environment variables or appsettings.json):
- Connection Strings:
  - `ConnectionStrings:IdentityContext` - PostgreSQL connection string for identity/authentication data
  - `ConnectionStrings:BudgetContext` - PostgreSQL connection string for budget data
- Authentication:
  - `Auth:Yandex:ClientId` - Yandex OAuth client ID
  - `Auth:Yandex:ClientSecret` - Yandex OAuth client secret
- Frontend:
  - `FrontendUrl` - URL where the client application is hosted
  - `AllowedOrigins` - Semicolon-separated list of allowed CORS origins
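For reference, a minimal `appsettings.json` sketch using the keys above could look like this (all values are placeholders; adjust hosts, ports, and secrets to your environment):

```json
{
  "ConnectionStrings": {
    "IdentityContext": "Host=localhost;Port=5432;Database=budget_identity;Username=budget;Password=<password>",
    "BudgetContext": "Host=localhost;Port=5432;Database=budget;Username=budget;Password=<password>"
  },
  "Auth": {
    "Yandex": {
      "ClientId": "<yandex-oauth-client-id>",
      "ClientSecret": "<yandex-oauth-client-secret>"
    }
  },
  "FrontendUrl": "https://localhost:8081",
  "AllowedOrigins": "https://localhost:8081;http://localhost:8080"
}
```

When configuring via environment variables instead, replace `:` with `__` in the key names (e.g., `ConnectionStrings__BudgetContext`).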
Both server and client have Dockerfiles for containerized deployment:
- Server: Exposes ports 7237 (HTTPS) and 5153 (HTTP)
- Client: Exposes ports 8080 (HTTP) and 8081 (HTTPS)
Build and run:
# Build server
docker build -f src/Hosts/NVs.Budget.Hosts.Web.Server/Dockerfile -t budget-server .
# Build client
docker build -f src/Hosts/NVs.Budget.Hosts.Web.Client/Dockerfile -t budget-client .
# Run with docker-compose (create your own compose file)

For local development, see DEVELOPMENT.md for detailed setup instructions.
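If you go the docker-compose route, a minimal sketch could look like the one below. Service names, the single shared database, and the plain-HTTP setup are illustrative assumptions; add volumes, HTTPS, and real secrets for anything beyond local experiments.

```yaml
services:
  db:
    image: postgres:17
    environment:
      POSTGRES_USER: budget
      POSTGRES_PASSWORD: change-me
      POSTGRES_DB: budget

  server:
    image: budget-server
    depends_on: [db]
    environment:
      # Both contexts point at the same database here for brevity;
      # identity and budget data can live in separate databases.
      ConnectionStrings__IdentityContext: "Host=db;Database=budget;Username=budget;Password=change-me"
      ConnectionStrings__BudgetContext: "Host=db;Database=budget;Username=budget;Password=change-me"
      Auth__Yandex__ClientId: "<client-id>"
      Auth__Yandex__ClientSecret: "<client-secret>"
      FrontendUrl: "http://localhost:8080"
      AllowedOrigins: "http://localhost:8080"
    ports:
      - "5153:5153"

  client:
    image: budget-client
    depends_on: [server]
    ports:
      - "8080:8080"
```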
After deployment, run database migrations:
GET /admin/patch-db

This endpoint applies all pending migrations to both identity and budget databases.
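For example, assuming the server listens on its default HTTP port and your user is allowed to call admin endpoints:

```bash
# Apply pending migrations after (re)deployment
curl http://localhost:5153/admin/patch-db
```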
Each budget has three types of settings that control how operations are processed:
Configure how CSV files from your banks are parsed. Each setting is associated with a file pattern (regex) and includes:
- Culture: Locale for parsing numbers and dates (e.g., `en-US`, `ru-RU`)
- Encoding: Text encoding of the CSV file (e.g., `UTF-8`, `Windows-1251`)
- DateTime Kind: How to interpret date/time values (`Local`, `Utc`, `Unspecified`)
- Field Mappings: Maps CSV column indices/names to operation properties:
  - `Amount` - Transaction amount
  - `Currency` - Currency code
  - `Timestamp` - Transaction date/time
  - `Description` - Transaction description
- Attribute Mappings: Maps CSV columns to custom operation attributes, such as MCC codes or anything else present in your raw data
- Validation Rules: Rules to validate and filter CSV rows:
  - Condition: `Equals` or `NotEquals`
  - Field: Column to check
  - Value: Expected value
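For illustration only (the delimiter, column order, and values below are hypothetical and not tied to any particular bank), a row parsed with the `ru-RU` culture, columns 0-3 mapped to `Timestamp`, `Amount`, `Currency`, and `Description`, and column 4 mapped to an `MCC` attribute could look like this:

```csv
02.11.2024 12:30;-450,00;RUB;GROCERY STORE;5411
```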
Define rules that automatically assign tags to operations based on conditions. Each criterion consists of:
- Tag Expression: Expression that computes the tag name from operation properties.
  - Example: `o => o.Description.Contains("Grocery") ? "Food" : "Other"`
- Condition: Boolean expression that determines when to apply the tag
  - Example: `o => o.Amount.Amount < 0` (only for expenses)
Tagging criteria are evaluated during import and update operations. Operations can have multiple tags.
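To show how the two parts fit together, here is a hedged sketch; the `Operation` and `Money` types below only mirror the property names used in the examples and are not the project's actual model:

```csharp
using System;
using System.Linq.Expressions;

// Illustrative operation shape for this sketch only
public record Money(decimal Amount, string CurrencyCode);
public record Operation(string Description, Money Amount, DateTime Timestamp);

public static class TaggingCriterionSketch
{
    // Condition: apply the tag only to expenses
    public static readonly Expression<Func<Operation, bool>> Condition =
        o => o.Amount.Amount < 0;

    // Tag expression: compute the tag name from the description
    public static readonly Expression<Func<Operation, string>> Tag =
        o => o.Description.Contains("Grocery") ? "Food" : "Other";
}
```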
Define rules to automatically detect transfers between accounts. Each criterion includes:
- Criterion Expression: Binary predicate that matches a source (withdraw) and sink (income) operation
  - Example: `(source, sink) => source.Amount.Amount == -sink.Amount.Amount && source.Timestamp.Date == sink.Timestamp.Date`
- Accuracy: Confidence level of the match
  - `Exact` (100%) - High confidence, exact match
  - `Likely` (70%) - Probable match, may require review
- Comment: Description for the transfer
When a transfer is detected, both operations are tagged with Transfer, Source, or Sink tags and excluded from income/expense calculations.
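A similarly hedged sketch of a complete criterion, reusing the illustrative `Operation` and `Money` records from the tagging sketch above (the `Accuracy` enum here is likewise illustrative, not the project's actual API):

```csharp
public enum Accuracy { Exact, Likely }

public static class TransferCriterionSketch
{
    // Binary predicate: same absolute amount, booked on the same day
    public static readonly Expression<Func<Operation, Operation, bool>> Criterion =
        (source, sink) => source.Amount.Amount == -sink.Amount.Amount
                       && source.Timestamp.Date == sink.Timestamp.Date;

    public const Accuracy MatchAccuracy = Accuracy.Likely; // 70%: probable, may need review
    public const string Comment = "Same-day transfer between own accounts";
}
```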
Logbook criteria define how operations are grouped and aggregated for expense tracking. Criteria form a hierarchical structure where each level can have subcriteria for further grouping.
- Tag-Based Criterion
  - Matches operations by tags
  - Types:
    - `Including`: Operation must have ALL specified tags
    - `OneOf`: Operation must have AT LEAST ONE of the specified tags
    - `Excluding`: Operation must NOT have any of the specified tags
  - Example: Group all operations tagged with "Food" or "Restaurant"
- Predicate-Based Criterion
  - Matches operations using a boolean expression
  - Example: `o => o.Amount.Amount < 0 && o.Timestamp.Month == DateTime.Now.Month`
  - Useful for complex filtering logic
- Substitution-Based Criterion
  - Groups operations by a computed value
  - The substitution expression returns a string that becomes the group name
  - Example: `o => o.Timestamp.ToString("yyyy-MM")` groups by month
  - Automatically creates subcriteria for each unique value
- Universal Criterion
  - Matches all operations
  - Can be used as a root criterion or with subcriteria for grouping
  - When used with subcriteria, operations are distributed to matching subcriteria
Criteria can be nested to create multi-level groupings:
Universal (all operations)
├── Tag-Based: "Food" (food-related expenses)
│ ├── Substitution: Month (group by month)
│ └── Tag-Based: "Restaurant" (restaurant expenses)
└── Tag-Based: "Transport" (transportation)
└── Substitution: Month
This structure allows you to build calendar-like views where operations are grouped by category and time period.
Import operations from CSV files exported by your banks. The import process uses the file reading settings configured for your budget to parse the CSV format.
Before importing, ensure you have:
- Configured file reading settings for your budget (see Budget settings)
- A CSV file exported from your bank
- Navigate to the import page for your budget
- Select a CSV file to import
- Optional: Specify file pattern - If you have multiple reading settings, provide a regex pattern to match the correct one (e.g., `.*sberbank.*\.csv`)
- Optional: Set transfer confidence level - Choose the minimum accuracy for automatic transfer detection:
  - `Exact` (100%) - Only detect transfers with high confidence
  - `Likely` (70%) - Also detect probable transfers (may require review)
- Start the import - The system will:
- Parse the CSV file using the matching reading settings
- Register new operations
- Apply tagging criteria automatically
- Detect transfers based on your transfer criteria
- Detect duplicate operations
After import, you'll receive a summary with:
- Registered Operations: New operations successfully imported
- Duplicates: Groups of operations that appear to be duplicates (same amount, timestamp, and description)
- Errors: Any parsing errors or validation failures
- Success Messages: Information about transfers detected, tags applied, etc.
The system automatically detects potential duplicates based on:
- Same amount (absolute value)
- Same timestamp (within a small time window)
- Same description
Duplicate groups are shown in the import results. You can:
- Review each duplicate group
- Keep the operations if they're legitimate (not actual duplicates)
- Delete duplicates manually after import
If transfer detection criteria are configured, the system will automatically:
- Match withdraw operations with income operations
- Tag matched operations as `Transfer`, `Source`, or `Sink`
- Exclude transfers from income/expense calculations
The transfer confidence level you select determines which transfers are automatically detected. You can review and adjust transfers later if needed.
Edit operations individually or in bulk to correct data, add tags, or update attributes.
- Find the operation in the operations list
- Click the edit button (✏️) to enter edit mode
- Modify the fields:
- Description: Transaction description
- Amount: Transaction amount (positive for income, negative for expenses)
- Currency: Currency code (e.g., USD, EUR, RUB)
- Tags: Add, remove, or modify tags
- Attributes: Add, remove, or modify custom attributes (key-value pairs)
- Save changes (💾) or Cancel (✕)
- Description (`string`): Free-form text describing the transaction
- Amount (`decimal`): Transaction amount with decimal precision
- Currency Code (`string`): ISO currency code (3 letters, e.g., USD, EUR)
- Timestamp (`DateTime`): Date and time of the transaction
- Tags (`string[]`): Array of tag names
- Attributes (`Dictionary<string, object>`): Custom key-value pairs for additional metadata
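For illustration only, an operation created from a grocery purchase could carry values like these; the JSON below simply mirrors the field list above and is not the exact payload format of the API:

```json
{
  "description": "GROCERY STORE",
  "amount": -450.00,
  "currencyCode": "RUB",
  "timestamp": "2024-11-02T12:30:00+03:00",
  "tags": ["Food"],
  "attributes": { "MCC": "5411" }
}
```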
You can update multiple operations at once by sending a batch update request. The update process will:
- Apply changes to all specified operations
- Re-evaluate tagging criteria (based on tagging mode)
- Re-detect transfers (if transfer confidence level is specified)
- Update operation versions for optimistic concurrency
When updating operations, you can choose how tags are handled:
- Append: Add new tags from tagging criteria without removing existing tags
- Replace: Replace all tags with those from tagging criteria
- None: Don't apply tagging criteria (keep existing tags)
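For example, if an operation already has the tag `Vacation` and the tagging criteria would produce `Food`, then `Append` yields `Vacation, Food`, `Replace` yields only `Food`, and `None` keeps just `Vacation`.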
When updating operations, you can optionally re-run transfer detection:
- Specify a transfer confidence level (`Exact` or `Likely`)
- The system will re-evaluate all operations for potential transfers
- Previously detected transfers may be updated or removed if criteria no longer match
Each operation displays:
- Operation ID: Unique identifier
- Budget ID: The budget this operation belongs to
- Version: Version number for optimistic concurrency control
You can expand an operation to view these details.
Operations can be deleted individually:
- Click the delete button (🗑️) on the operation
- Confirm the deletion
Note: Deleted operations are permanently removed and cannot be recovered. Consider exporting your data before bulk deletions.